Android: getHolder().setFixedSize(1920,1080) causes issue with screen resolution - java

I'm developing a game. I want my screen to always be 1920x1080.
getHolder().setFixedSize(1920,1080);
Calling setFixedSize on my SurfaceView achieves this. It also creates a problem, however: the screen is still at WQHD resolution instead of FHD, so touch coordinates are reported in a larger range than the canvas is drawn at. Clicking on buttons therefore no longer works unless I press where they would be at the smaller resolution.
Is there a way to set Android's resolution to 1920x1080? Maybe by setting the layout to that size? I also tried backing my canvas with a bitmap of the right dimensions, but that doesn't seem to change anything at all.
Bitmap bitmap = Bitmap.createBitmap(1920, 1080, Bitmap.Config.ARGB_8888);
Canvas c = new Canvas(bitmap);
An alternative might be to map the click position from the larger screen resolution down to the smaller one, I suppose, but that seems like a suboptimal solution.

Here is what solved it:
public static float normalize(float value, float min, float max) {
    return Math.abs((value - min) / (max - min));
}
@Override
public boolean onTouchEvent(MotionEvent event) {
    // map the touch position from the physical resolution (here 2560x1440)
    // down to the fixed 1920x1080 surface resolution
    float xPercent = normalize(event.getX(), 0, actualScreenwidth);
    Log.d("mouse_P_xPercent", Float.toString(xPercent));
    float yPercent = normalize(event.getY(), 0, actualScreenheight);
    Log.d("mouse_P_yPercent", Float.toString(yPercent));
    mouseCurrentPositionX = (int) (screenwidth * xPercent);
    mouseCurrentPositionY = (int) (screenheight * yPercent);
    return true;
}
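A simpler variant of the same idea, since the SurfaceView still spans the full physical screen, is to scale the raw event coordinates by the ratio of the fixed surface size to the view size. A minimal sketch, assuming the fixed 1920x1080 size from the question and a hypothetical handleClick method:
@Override
public boolean onTouchEvent(MotionEvent event) {
    // getWidth()/getHeight() return the SurfaceView's size in physical pixels,
    // while the canvas is rendered at the fixed 1920x1080 resolution
    float scaledX = event.getX() * 1920f / getWidth();
    float scaledY = event.getY() * 1080f / getHeight();
    handleClick(scaledX, scaledY); // hypothetical game-side click handler
    return true;
}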

Related

Android Button different position on other devices (using setX)

I want to move several buttons at a specific moment while the app is running, rearranging them in positions different from their original ones. I used the setX() method for this purpose, to move each button to the place I want. I know that this method takes pixels as input (and pixels depend on the density of the device), so I took the density of the device and multiplied it by a certain number (the position in density-independent pixels), so that the output is that position in pixels for each device. I thought that would give me the same button position on all devices, but it doesn't work: the buttons appear displaced on different devices. This is the method I used to convert density-independent pixels to the corresponding pixels for each device:
public void Converter_Dp_to_Px(){
pxX = (int) (dpX * Resources.getSystem().getDisplayMetrics().density); //Pixels in X direction
pxY = (int) (dpY * Resources.getSystem().getDisplayMetrics().density); //Pixels in Y direction
}
Now I set values for dpX and dpY, convert them into pixels for each device, and place the button in that position with setX() and setY() methods:
dpX = 254;
Converter_Dp_to_Px();
dpY = 477;
Converter_Dp_to_Px();
button1.setX(pxX);
button1.setY(pxY);
I also tried using percentages instead of absolute positions, as follows:
int maxX = Resources.getSystem().getDisplayMetrics().widthPixels;  // screen width in pixels
int maxY = Resources.getSystem().getDisplayMetrics().heightPixels; // screen height in pixels
mov_percenX = 0.37f;
mov_percenY = 0.63f;
button1.setX(button1.getX() + maxX * mov_percenX);
button1.setY(button1.getY() + maxY * mov_percenY);
But it doesn't work either. I hope you can help me; thanks in advance.
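This question has no answer in the thread, but one common approach is to position the view as a fraction of its parent layout's measured size, after the layout pass has happened, rather than as a fraction of the raw screen size. A rough sketch, not the original poster's code; parentLayout and the fractions are assumptions:
// run after layout so getWidth()/getHeight() return real values
parentLayout.post(new Runnable() {
    @Override
    public void run() {
        button1.setX(parentLayout.getWidth() * 0.37f);
        button1.setY(parentLayout.getHeight() * 0.63f);
    }
});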

How to prevent texture bleeding in a tilemap in LibGDX

I know there are quite a few questions (and answers) on this topic, but they all have different solutions, and none of them seems to work in my case.
I'm developing a small test project with libGDX, in which I tried to add a simple tilemap. I created the tilemap using Tiled, which seems to work quite well, except for the texture bleeding that sometimes causes black lines (the background color) to appear between the tiles.
What I've tried so far:
I read several SO questions, tutorials and forum posts, and tried almost all of the suggested solutions, but I just don't seem to get this working. Most of the answers said that I would need padding between the tiles, but this doesn't seem to fix it. I also tried loading the tilemap with different parameters (e.g. to use the Nearest filter when loading) and rounding the camera's position to prevent rounding problems, but this even made it worse.
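For reference, the "different parameters" attempt would look roughly like this; a sketch only, using the filter fields that TmxMapLoader.Parameters exposes (it did not fix the bleeding in my case):
AssetManager manager = new AssetManager();
manager.setLoader(TiledMap.class, new TmxMapLoader(new InternalFileHandleResolver()));
TmxMapLoader.Parameters params = new TmxMapLoader.Parameters();
// Nearest filtering avoids sampling neighbouring texels, one of the usual
// suggestions against tile bleeding
params.textureMinFilter = Texture.TextureFilter.Nearest;
params.textureMagFilter = Texture.TextureFilter.Nearest;
params.generateMipMaps = false;
manager.load("map/map.tmx", TiledMap.class, params);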
My current setup:
You can find the whole project on GitHub. The branch is called 'tile_map_scaling'
At the moment I'm using a tileset that is made of this tile-picture:
It has two pixels of space between every tile, to use as padding and margin.
My Tiled tileset settings look like this:
I use two pixels of margin and spacing, to (try to) prevent the bleeding here.
Most of the time it is rendered just fine, but still sometimes there are these lines between the tiles like in this picture (sometimes they seem to appear only on a part of the map):
I'm currently loading the tile map into the asset manager without any parameters:
public void load() {
AssetManager manager = new AssetManager();
manager.setLoader(TiledMap.class, new TmxMapLoader(new InternalFileHandleResolver()));
manager.setErrorListener(this);
manager.load("map/map.tmx", TiledMap.class, new AssetLoaderParameters());
}
... and use it like this:
public class GameScreen {
public static final float WORLD_TO_SCREEN = 4.0f;
public static final float SCENE_WIDTH = 1280f;
public static final float SCENE_HEIGHT = 720f;
//...
private Viewport viewport;
private OrthographicCamera camera;
private TiledMap map;
private OrthogonalTiledMapRenderer renderer;
public GameScreen() {
camera = new OrthographicCamera();
viewport = new FitViewport(SCENE_WIDTH, SCENE_HEIGHT, camera);
map = assetManager.get("map/map.tmx");
renderer = new OrthogonalTiledMapRenderer(map);
}
@Override
public void render(float delta) {
//clear the screen (with a black screen)
Gdx.gl.glClearColor(0, 0, 0, 1);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
moveCamera(delta);
renderer.setView(camera);
renderer.render();
//... draw the player, some debug graphics, a hud, ...
moveCameraToPlayer();
}
private void moveCamera(float delta) {
if (Gdx.input.isKeyPressed(Keys.LEFT)) {
camera.position.x -= CAMERA_SPEED * delta;
}
else if (Gdx.input.isKeyPressed(Keys.RIGHT)) {
camera.position.x += CAMERA_SPEED * delta;
}
// ...
//update the camera to re-calculate the matrices
camera.update();
}
private void moveCameraToPlayer() {
Vector2 dwarfPosition = dwarf.getPosition();
//movement in positive X and Y direction
float deltaX = camera.position.x - dwarfPosition.x;
float deltaY = camera.position.y - dwarfPosition.y;
float movementXPos = deltaX - MOVEMENT_RANGE_X;
float movementYPos = deltaY - MOVEMENT_RANGE_Y;
//movement in negative X and Y direction
deltaX = dwarfPosition.x - camera.position.x;
deltaY = dwarfPosition.y - camera.position.y;
float movementXNeg = deltaX - MOVEMENT_RANGE_X;
float movementYNeg = deltaY - MOVEMENT_RANGE_Y;
camera.position.x -= Math.max(movementXPos, 0);
camera.position.y -= Math.max(movementYPos, 0);
camera.position.x += Math.max(movementXNeg, 0);
camera.position.y += Math.max(movementYNeg, 0);
camera.update();
}
// ... some other methods ...
}
The question:
I am using padding in the tileset and have also tried different loading parameters and rounding the camera position, but I still have this texture bleeding problem in my tilemap.
What am I missing? Or what am I doing wrong?
Any help on this would be great.
You need to pad the edges of your tiles in your tilesheet.
It looks like you've tried to do this, but the padding is transparent; it needs to be the color of the pixel it is padding.
So if you have an image like this (where each letter is a pixel and the tile size is one):
AB
CB
then padding it should look something like this (a dot marks an empty pixel):
.A..B.
AAABBB
.A..B.
.C..B.
CCCBBB
.C..B.
The pixel being padded must be padded with a pixel of the same color.
(I'll try to create a pull request with a fix for your git repo as well.)
As a little addition to bornander's answer, I created some Python scripts that generate a tileset texture with the correct edge padding (as bornander explained in his answer) from a texture that has no padding yet.
Just in case anyone can make use of it, it can be found on GitHub:
https://github.com/tfassbender/libGdxImageTools
There is also an npm package that can extrude the tiles. It was built for the Phaser JS game library, but you could still use it: https://github.com/sporadic-labs/tile-extruder
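The core idea behind those tools can also be sketched directly with libGDX's Pixmap API; a rough sketch under assumed file names, tile size and padding, not the linked scripts:
// Copies each tile into a larger sheet, repeating its edge pixels into the
// surrounding padding so that filtering never samples a neighbouring tile.
Pixmap source = new Pixmap(Gdx.files.internal("tiles.png")); // unpadded tileset (assumed name)
int tileSize = 16, pad = 2;                                  // assumed tile size and padding
int cols = source.getWidth() / tileSize, rows = source.getHeight() / tileSize;
Pixmap padded = new Pixmap(cols * (tileSize + 2 * pad), rows * (tileSize + 2 * pad), Pixmap.Format.RGBA8888);
for (int ty = 0; ty < rows; ty++) {
    for (int tx = 0; tx < cols; tx++) {
        for (int y = -pad; y < tileSize + pad; y++) {
            for (int x = -pad; x < tileSize + pad; x++) {
                // clamp to the tile's own edge so the padding repeats the border color
                int sx = tx * tileSize + Math.max(0, Math.min(tileSize - 1, x));
                int sy = ty * tileSize + Math.max(0, Math.min(tileSize - 1, y));
                int dx = tx * (tileSize + 2 * pad) + pad + x;
                int dy = ty * (tileSize + 2 * pad) + pad + y;
                padded.drawPixel(dx, dy, source.getPixel(sx, sy));
            }
        }
    }
}
PixmapIO.writePNG(Gdx.files.local("tiles_padded.png"), padded);
source.dispose();
padded.dispose();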

Custom actor for BitmapFont (libgdx)

I've spent several frustrating hours trying to implement (what I thought would be) a simple FontActor class.
The idea is to just draw text at a specific position using a provided BitmapFont. That much, I've managed to accomplish. However, I'm struggling to compute my actor's width/height based on the rendered text.
(Using a FitViewport for testing)
open class FontActor<T : BitmapFont>(val font: T, var text: CharSequence = "") : GameActor() {
val layout = Pools.obtain(GlyphLayout::class.java)!!
companion object {
val identity4 = Matrix4().idt()
val distanceFieldShader: ShaderProgram = DistanceFieldFont.createDistanceFieldShader()
}
override fun draw(batch: Batch?, parentAlpha: Float) {
if (batch == null) return
batch.end()
// grab ui camera and backup current projection
val uiCamera = Game.context.inject<OrthographicCamera>()
val prevTransform = batch.transformMatrix
val prevProjection = batch.projectionMatrix
batch.transformMatrix = identity4
batch.projectionMatrix = uiCamera.combined
if (font is DistanceFieldFont) batch.shader = distanceFieldShader
// the actor has pos = x,y in local coords, but we need UI coords
// start by getting group -> stage coords (world)
val coords = Vector3(localToStageCoordinates(Vector2(0f, 0f)), 0f)
// world coordinate destination -> screen coords
stage.viewport.project(coords)
// screen coords -> font camera world coords
uiCamera.unproject(coords,
stage.viewport.screenX.toFloat(),
stage.viewport.screenY.toFloat(),
stage.viewport.screenWidth.toFloat(),
stage.viewport.screenHeight.toFloat())
// adjust position by cap height so that bottom left of text aligns with x, y
coords.y = uiCamera.viewportHeight - coords.y + font.capHeight
/// TODO: use BitmapFontCache to prevent this call on every frame and allow for offline bounds calculation
batch.begin()
layout.setText(font, text)
font.draw(batch, layout, coords.x, coords.y)
batch.end()
// viewport screen coordinates -> world coordinates
setSize((layout.width / stage.viewport.screenWidth) * stage.width,
(layout.height / stage.viewport.screenHeight) * stage.height)
// restore camera
if (font is DistanceFieldFont) batch.shader = null
batch.projectionMatrix = prevProjection
batch.transformMatrix = prevTransform
batch.begin()
}
}
And in my parent Screen class implementation, I rescale my fonts on every window resize so that they don't become "smooshed" or stretched:
override fun resize(width: Int, height: Int) {
stage.viewport.update(width, height)
context.inject<OrthographicCamera>().setToOrtho(false, width.toFloat(), height.toFloat())
// rescale fonts
scaleX = width.toFloat() / Config.screenWidth
scaleY = height.toFloat() / Config.screenHeight
val scale = minOf(scaleX, scaleY)
gdxArrayOf<BitmapFont>().apply {
Game.assets.getAll(BitmapFont::class.java, this)
forEach { it.data.setScale(scale) }
}
gdxArrayOf<DistanceFieldFont>().apply {
Game.assets.getAll(DistanceFieldFont::class.java, this)
forEach { it.data.setScale(scale) }
}
}
This works and looks great until you resize your window.
After a resize, the fonts look fine and automatically adjust with the relative size of the window, but the FontActor has the wrong size, because my call to setSize is wrong.
Initial window:
After making window horizontally larger:
For example, if I then scale my window horizontally (which has no effect on the world size, because I'm using a FitViewport), the font looks correct, just as intended. However, the layout.width value coming back from the draw() changes, even though the text size hasn't changed on-screen. After investigation, I realized this is due to my use of setScale, but simply dividing the width by the x-scaling factor doesn't correct the error. And again, if I remove my setScale calls, the numbers make sense, but the font is now squished!
Another strategy I tried was converting the width/height into screen coordinates, then using the relevant project/unproject methods to get the width and height in world coordinates. This suffers from the same issue shown in the images.
How can I fix my math?
Or, is there a smarter/easier way to implement all of this? (No, I don't want Label, I just want a text actor.)
One problem was my scaling code.
The fix was to change the camera update as follows:
context.inject<OrthographicCamera>().setToOrtho(false, stage.viewport.screenWidth.toFloat(), stage.viewport.screenHeight.toFloat())
This makes my text camera match the world viewport camera. I was using the entire screen for my calculations, hence the stretching.
My scaleX/Y calculations were wrong for the same reason. After correcting both of those miscalculations, I have a nicely scaling FontActor with correct bounds in world coordinates.

make x y work the same on all devices android studio

I just began to develop an app with Java, and I only have some experience in C. In my Activity.java (in Android Studio) I have things like the following, just to give some examples:
meteorite1.setX(meteoritePlacementX(meteorite1.getX()));
meteorite1.setY(-2000);
gnome.setX(330);
gnome.setY(800);
meteorite2.setX(meteoritePlacementX(meteorite2.getX()));
meteorite2.setY(meteoritePlacementY(meteorite1.getY()));
meteorite3.setX(meteoritePlacementX(meteorite3.getX()));
meteorite3.setY(meteoritePlacementY(meteorite2.getY()));
meteorite4.setX(meteoritePlacementX(meteorite4.getX()));
meteorite4.setY(meteoritePlacementY(meteorite3.getY()));
meteorite5.setX(meteoritePlacementX(meteorite5.getX()));
meteorite5.setY(meteoritePlacementY(meteorite4.getY()));
meteoritedestruction1.setX(0);
meteoritedestruction1.setY(-2000);
meteoritedestruction2.setX(0);
meteoritedestruction2.setY(-2000);
meteoritedestruction3.setX(0);
meteoritedestruction3.setY(-2000);
meteoritedestruction4.setX(0);
meteoritedestruction4.setY(-2000);
meteoritedestruction5.setX(0);
meteoritedestruction5.setY(-2000);
star1.setX(300);
star2.setX(150);
star3.setX(50);
star4.setX(500);
star5.setX(600);
star6.setX(350);
star7.setX(80);
star8.setX(450);
tinystar1.setX(302);
tinystar2.setX(240);
tinystar3.setX(57);
tinystar4.setX(660);
tinystar5.setX(400);
star1.setY(300);
star2.setY(-300);
star3.setY(-100);
star4.setY(100);
star5.setY(300);
star6.setY(500);
star7.setY(700);
star8.setY(900);
tinystar1.setY(300);
tinystar2.setY(-400);
tinystar3.setY(-200);
tinystar4.setY(150);
tinystar5.setY(30);
and
public float meteoritePlacementX(float X){
float MeteoriteNewX = 0f;
int random = (int )(Math.random() * 480 - 50);
MeteoriteNewX = random;
return MeteoriteNewX;
}
This works fine, but only on my phone (720 x 1280 pixels, ~294 ppi pixel density), on which I tested my code. Now I have published my app, but on other devices the layout is totally out of sync (which makes sense to me now, because x and y are different for every screen). Buttons and pictures work fine, but moving objects like
meteorite1.setY(meteorite1.getY() + 20);
where I use x and y, are broken on other devices. I use a RelativeLayout.
So, long story short: is there a way to make x and y relative to the screen? Otherwise I need to change the whole code.
In general, using placement based on hard-coded pixel values is not good practice. Not only does it break across devices, but think about what you would have to do when 2K+ phones come out: you would need an entire refactor. Look at this question and the answer by Guillaume Perrot: you can get the maximum and minimum pixel values of the user's phone and use those instead of the hard-coded 480 - 50 and your star placement calls.
For the movement do
DisplayMetrics displayMetrics = new DisplayMetrics();
WindowManager wm = (WindowManager)getApplicationContext().getSystemService(Context.WINDOW_SERVICE); // the results will be higher than using the activity context object or the getWindowManager() shortcut
wm.getDefaultDisplay().getMetrics(displayMetrics);
int maxWidth = displayMetrics.widthPixels;
//Make this percentage whatever you want
float movementPercentage = 0.02f;
//Moves the object by 2 percent of the screen width along the y axis
meteorite1.setY(meteorite1.getY() + maxWidth * movementPercentage);
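Applying the same idea to the hard-coded positions from the question, a sketch might look like this (the 720x1280 reference size comes from the phone mentioned above; the chosen views and fractions are just examples):
DisplayMetrics dm = getResources().getDisplayMetrics();
int screenWidth = dm.widthPixels;
int screenHeight = dm.heightPixels;
// express the original positions (tuned on a 720x1280 screen) as fractions,
// then scale them to whatever screen the app is running on
gnome.setX(screenWidth * (330f / 720f));
gnome.setY(screenHeight * (800f / 1280f));
star1.setX(screenWidth * (300f / 720f));
star1.setY(screenHeight * (300f / 1280f));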

LibGDX BitmapFont won't stop shaking

I have a BitmapFont that is displaying a player's score as he moves across the screen at a constant rate. Because the player is always moving, I have to recalculate at what position I draw the font every frame. I use this code.
scoreFont.setScale(4f, 4f);
scoreFont.draw(batch, "" + scoreToShow, playerGhost.pos.x + 100f, 600f);
playerGhost.render(batch);
The problem? The font won't stop shaking. It's only a couple of pixels worth of vibration, but it's slightly noticeable. It's more noticeable when I run it on my tablet.
Is this a known bug?
How can I get it to stop shaking?
Call scoreFont.setUseIntegerPositions(false); so it won't round the font's position to the nearest integer. You will also probably want to set the font's min filtering to Linear or MipmapLinearNearest, and its mag filtering to Linear.
The reason for the default behavior is that the default configuration is for text that is pixel perfect, for a viewport set with units equal to the size of a pixel. If your viewport had dimensions exactly the same as the screen's pixel dimensions, this configuration would help keep text from looking slightly blurry.
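In code, that suggestion would look roughly like this; a sketch that assumes the scoreFont variable from the question and a font texture without mipmaps:
scoreFont.setUseIntegerPositions(false); // don't snap glyph positions to whole pixels
// smooth the scaled glyphs instead of nearest-neighbour sampling
scoreFont.getRegion().getTexture().setFilter(Texture.TextureFilter.Linear, Texture.TextureFilter.Linear);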
It could actually be the fact that you're scaling your font.
I had this problem and it's quite complex to understand (and also to fix).
Basically, when you scale fonts, BitmapFont changes the values inside the BitmapFontData by dividing/multiplying. If you do a lot of scaling, with a lot of different values (or an unlucky combination of values), it can introduce rounding errors which can cause flickering around the edges of the font.
The solution I implemented in the end was to write a FontHolder which stores all of the original BitmapFontData values. I then reset the font data to those original values at the beginning of every frame (i.e. at the start of the render() method).
Here's the code...
package com.bigcustard.blurp.core;
import com.badlogic.gdx.graphics.g2d.*;
public class FontHolder {
private BitmapFont font;
private final float lineHeight;
private final float spaceWidth;
private final float xHeight;
private final float capHeight;
private final float ascent;
private final float descent;
private final float down;
private final float scaleX;
private final float scaleY;
public FontHolder(BitmapFont font) {
this.font = font;
BitmapFont.BitmapFontData data = font.getData();
this.lineHeight = data.lineHeight;
this.spaceWidth = data.spaceWidth;
this.xHeight = data.xHeight;
this.capHeight = data.capHeight;
this.ascent = data.ascent;
this.descent = data.descent;
this.down = data.down;
this.scaleX = data.scaleX;
this.scaleY = data.scaleY;
}
// Call this at start of each frame.
public void reset() {
BitmapFont.BitmapFontData data = font.getData();
data.lineHeight = this.lineHeight;
data.spaceWidth = this.spaceWidth;
data.xHeight = this.xHeight;
data.capHeight = this.capHeight;
data.ascent = this.ascent;
data.descent = this.descent;
data.down = this.down;
data.scaleX = this.scaleX;
data.scaleY = this.scaleY;
}
public BitmapFont getFont() {
return font;
}
}
I'm not wild about this, as it's slightly hacky, but it's a necessary evil, and will completely and properly solve the issue.
The correct way to handle this would be to use two different cameras, and two different spriteBatches, one for the game itself and one for the UI.
You call the update() method on both cameras, and use spriteBatch.setProjectionMatrix(camera.combined); on each batch to render them at the same time each frame.
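A minimal sketch of that setup; the camera and batch names are assumptions, and the draw calls reuse names from the question:
// one camera/batch pair for the world, one for the fixed UI/text layer
gameCamera.update();
uiCamera.update();
gameBatch.setProjectionMatrix(gameCamera.combined);
gameBatch.begin();
playerGhost.render(gameBatch); // world-space drawing
gameBatch.end();
uiBatch.setProjectionMatrix(uiCamera.combined);
uiBatch.begin();
scoreFont.draw(uiBatch, "" + scoreToShow, 100f, 600f); // screen-space text, no per-frame world offset
uiBatch.end();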
