LibGDX BitmapFont won't stop shaking - java

I have a BitmapFont that is displaying a player's score as he moves across the screen at a constant rate. Because the player is always moving, I have to recalculate the position at which I draw the font every frame. I use this code:
scoreFont.setScale(4f, 4f);
scoreFont.draw(batch, "" + scoreToShow, playerGhost.pos.x + 100f, 600f);
playerGhost.render(batch);
The problem? The font won't stop shaking. It's only a couple of pixels worth of vibration, but it's slightly noticeable. It's more noticeable when I run it on my tablet.
Is this a known bug?
How can I get it to stop shaking?

Call scoreFont.setUseIntegerPositions(false); so it won't round the font's position to the nearest integer. You will also probably want to set the font's min filter to Linear or MipMapLinearNearest, and its mag filter to Linear.
The reason for the default behavior is that the default configuration assumes pixel-perfect text, i.e. a viewport whose units are exactly one pixel in size. If your viewport's dimensions matched the screen's pixel dimensions exactly, this configuration would keep text from looking slightly blurry.
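A minimal sketch of both calls (scoreFont is the question's font; getRegion().getTexture() assumes the font uses a single texture page):
scoreFont.setUseIntegerPositions(false);
// Linear filtering smooths the scaled glyphs; the default Nearest filtering
// snaps to texels and looks jittery when the text is not pixel perfect.
scoreFont.getRegion().getTexture().setFilter(
        Texture.TextureFilter.Linear,
        Texture.TextureFilter.Linear);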

It could actually be the fact that you're scaling your font.
I had this problem and it's quite complex to understand (and also to fix).
Basically, when you scale fonts, BitmapFont changes the values inside the BitmapFontData by dividing/multiplying. If you do a lot of scaling, with a lot of different values (or an unlucky combination of values), it can introduce rounding errors which can cause flickering around the edges of the font.
The solution I implemented in the end was to write a FontHolder which stores all of the original BitmapFontData values. I then reset the font data to those original values at the beginning of every frame (i.e. at the start of the render() method).
Here's the code...
package com.bigcustard.blurp.core;

import com.badlogic.gdx.graphics.g2d.*;

public class FontHolder {

    private BitmapFont font;
    private final float lineHeight;
    private final float spaceWidth;
    private final float xHeight;
    private final float capHeight;
    private final float ascent;
    private final float descent;
    private final float down;
    private final float scaleX;
    private final float scaleY;

    public FontHolder(BitmapFont font) {
        this.font = font;
        BitmapFont.BitmapFontData data = font.getData();
        this.lineHeight = data.lineHeight;
        this.spaceWidth = data.spaceWidth;
        this.xHeight = data.xHeight;
        this.capHeight = data.capHeight;
        this.ascent = data.ascent;
        this.descent = data.descent;
        this.down = data.down;
        this.scaleX = data.scaleX;
        this.scaleY = data.scaleY;
    }

    // Call this at the start of each frame to restore the original metrics.
    public void reset() {
        BitmapFont.BitmapFontData data = font.getData();
        data.lineHeight = this.lineHeight;
        data.spaceWidth = this.spaceWidth;
        data.xHeight = this.xHeight;
        data.capHeight = this.capHeight;
        data.ascent = this.ascent;
        data.descent = this.descent;
        data.down = this.down;
        data.scaleX = this.scaleX;
        data.scaleY = this.scaleY;
    }

    public BitmapFont getFont() {
        return font;
    }
}
I'm not wild about this, as it's slightly hacky, but it's a necessary evil, and will completely and properly solve the issue.
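For completeness, a rough usage sketch (assuming the render loop from the question; the holder field and its wiring are illustrative, not part of the original answer):
private FontHolder scoreFontHolder; // created once, wrapping the shared score font

public void render(float delta) {
    // Restore the original, unscaled font metrics before this frame scales them again.
    scoreFontHolder.reset();
    BitmapFont scoreFont = scoreFontHolder.getFont();

    batch.begin();
    scoreFont.setScale(4f, 4f);
    scoreFont.draw(batch, "" + scoreToShow, playerGhost.pos.x + 100f, 600f);
    playerGhost.render(batch);
    batch.end();
}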

The correct way to handle this would be to use two different cameras and two different SpriteBatches: one pair for the game itself and one for the UI.
Call update() on both cameras, and call spriteBatch.setProjectionMatrix(camera.combined); on each batch so that both are rendered with the correct projection each frame.
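A bare-bones sketch of that setup (camera sizes and field wiring are illustrative):
OrthographicCamera gameCamera = new OrthographicCamera(1280f, 720f); // follows the player
OrthographicCamera uiCamera = new OrthographicCamera(1280f, 720f);   // fixed, for the score/HUD
SpriteBatch gameBatch = new SpriteBatch();
SpriteBatch uiBatch = new SpriteBatch();

public void render(float delta) {
    gameCamera.position.x = playerGhost.pos.x; // track the player
    gameCamera.update();
    uiCamera.update(); // never moves, so UI text cannot jitter against it

    gameBatch.setProjectionMatrix(gameCamera.combined);
    gameBatch.begin();
    playerGhost.render(gameBatch);
    gameBatch.end();

    uiBatch.setProjectionMatrix(uiCamera.combined);
    uiBatch.begin();
    scoreFont.draw(uiBatch, "" + scoreToShow, 100f, 600f);
    uiBatch.end();
}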

Related

Android: getHolder().setFixedSize(1920,1080) causes Issue with screen Resolution

I'm developing a game. I want my screen to always be 1920x1080.
getHolder().setFixedSize(1920,1080);
Calling setFixedSize on my SurfaceView achieves this. It also creates a problem, however: the screen is still at WQHD resolution instead of FHD, so touch coordinates are larger than the canvas appears visually, and clicking on buttons no longer works unless I press where they would actually be at the smaller resolution.
Is there a way to set Android's resolution to 1920x1080? Maybe by setting the layout to that? I also tried backing my canvas with a bitmap that has the right dimensions, but that doesn't seem to change anything at all.
Bitmap bitmap = Bitmap.createBitmap(10,100,null);
Canvas c = new Canvas(bitmap);
An alternative way to solve this might be to calculate the corresponding mouseClick Position from the bigger screen to the smaller, I suppose, but that seems like the suboptimal solution.
Here is what solved it:
public static float normalize(float value, float min, float max) {
    return Math.abs((value - min) / (max - min));
}

public boolean onTouchEvent(MotionEvent event) {
    int differenceInWidth = actualScreenwidth - screenwidth; // 2560 - 1920
    float xPercent = normalize(event.getX(), 0, actualScreenwidth);
    Log.d("mouse_P_xPercent", Float.toString(xPercent));
    float yPercent = normalize(event.getY(), 0, actualScreenheight);
    Log.d("mouse_P_yPercent", Float.toString(yPercent));
    // Map the touch position from the real resolution down to the fixed 1920x1080 surface.
    mouseCurrentPositionX = (int) (screenwidth * xPercent);
    mouseCurrentPositionY = (int) (screenheight * yPercent);
    return true;
}

How to prevent texture bleeding in a tilemap in LibGDX

I know there are quite some questions (and answers) on this topic, but they all have different solutions, and none of them seems to be working in my case.
I'm developing a small test project with libGDX, in which I tried to add a simple tilemap. I created the tilemap using Tiled, which seems to be working quite well, except for the texture bleeding, which sometimes causes black lines (the background color) to appear between the tiles.
What I've tried so far:
I read several SO questions, tutorials and forum posts, and tried almost all of the proposed solutions, but I just don't seem to get this working. Most of the answers said that I would need padding between the tiles, but this doesn't seem to fix it. I also tried loading the tilemap with different parameters (e.g. using the Nearest filter when loading) and rounding the camera's position to prevent rounding problems, but that even made it worse.
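For reference, one of the parameter-based loading attempts looked roughly like this (just a sketch; the fields come from TmxMapLoader.Parameters and the filter values varied between attempts):
TmxMapLoader.Parameters params = new TmxMapLoader.Parameters();
params.textureMinFilter = Texture.TextureFilter.Nearest;
params.textureMagFilter = Texture.TextureFilter.Nearest;
manager.load("map/map.tmx", TiledMap.class, params);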
My current setup:
You can find the whole project on GitHub. The branch is called 'tile_map_scaling'
At the moment I'm using a tileset that is made of this tile-picture:
It has two pixels of space between every tile, to use as padding and margin.
My Tiled tileset settings look like this:
I use two pixels of margin and spacing, to (try to) prevent the bleeding here.
Most of the time it is rendered just fine, but still sometimes there are these lines between the tiles like in this picture (sometimes they seem to appear only on a part of the map):
I'm currently loading the tile map into the asset manager without any parameters:
public void load() {
    AssetManager manager = new AssetManager();
    manager.setLoader(TiledMap.class, new TmxMapLoader(new InternalFileHandleResolver()));
    manager.setErrorListener(this);
    manager.load("map/map.tmx", TiledMap.class, new AssetLoaderParameters());
}
... and use it like this:
public class GameScreen {

    public static final float WORLD_TO_SCREEN = 4.0f;
    public static final float SCENE_WIDTH = 1280f;
    public static final float SCENE_HEIGHT = 720f;
    //...

    private Viewport viewport;
    private OrthographicCamera camera;
    private TiledMap map;
    private OrthogonalTiledMapRenderer renderer;

    public GameScreen() {
        camera = new OrthographicCamera();
        viewport = new FitViewport(SCENE_WIDTH, SCENE_HEIGHT, camera);
        map = assetManager.get("map/map.tmx");
        renderer = new OrthogonalTiledMapRenderer(map);
    }

    @Override
    public void render(float delta) {
        //clear the screen (with a black screen)
        Gdx.gl.glClearColor(0, 0, 0, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);

        moveCamera(delta);
        renderer.setView(camera);
        renderer.render();
        //... draw the player, some debug graphics, a hud, ...
        moveCameraToPlayer();
    }

    private void moveCamera(float delta) {
        if (Gdx.input.isKeyPressed(Keys.LEFT)) {
            camera.position.x -= CAMERA_SPEED * delta;
        }
        else if (Gdx.input.isKeyPressed(Keys.RIGHT)) {
            camera.position.x += CAMERA_SPEED * delta;
        }
        // ...
        //update the camera to re-calculate the matrices
        camera.update();
    }

    private void moveCameraToPlayer() {
        Vector2 dwarfPosition = dwarf.getPosition();

        //movement in positive X and Y direction
        float deltaX = camera.position.x - dwarfPosition.x;
        float deltaY = camera.position.y - dwarfPosition.y;
        float movementXPos = deltaX - MOVEMENT_RANGE_X;
        float movementYPos = deltaY - MOVEMENT_RANGE_Y;

        //movement in negative X and Y direction
        deltaX = dwarfPosition.x - camera.position.x;
        deltaY = dwarfPosition.y - camera.position.y;
        float movementXNeg = deltaX - MOVEMENT_RANGE_X;
        float movementYNeg = deltaY - MOVEMENT_RANGE_Y;

        camera.position.x -= Math.max(movementXPos, 0);
        camera.position.y -= Math.max(movementYPos, 0);
        camera.position.x += Math.max(movementXNeg, 0);
        camera.position.y += Math.max(movementYNeg, 0);

        camera.update();
    }

    // ... some other methods ...
}
The question:
I am using padding on the tilemap and I also tried different loading parameters and rounding the camera position, but I still have this texture bleeding problem in my tilemap.
What am I missing? Or what am I doing wrong?
Any help on this would be great.
You need to pad the edges of your tiles in your tilesheet.
It looks like you've tried to do this, but the padding is transparent; it needs to be the color of the pixel it is padding.
So if you have an image like this (where each letter is a pixel and the tile size is one):
AB
CC
then padding it should look something like this
 A  B
AAABBB
 A  B
 C  C
CCCCCC
 C  C
The pixel being padded must be padded with a pixel of the same color.
(I'll try to create a pull request with a fix for your git repo as well.)
As a little addition to bornander's answer, I created some Python scripts that generate a tileset texture with the correct edge padding (as bornander explained in his answer) from a texture that has no padding yet.
Just in case anyone can make use of it, it can be found on GitHub:
https://github.com/tfassbender/libGdxImageTools
There is also an npm package that can extrude the tiles. It was built for the Phaser JS game library, but you could still use it. https://github.com/sporadic-labs/tile-extruder
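If you would rather stay in Java, the same one-pixel extrusion idea can be sketched with libGDX's Pixmap API (a rough illustration, not the linked tools; the file names and tile size are assumptions):
Pixmap source = new Pixmap(Gdx.files.internal("tileset.png")); // unpadded tileset (assumed name)
int tileSize = 16;                                             // assumed tile size in pixels
int cols = source.getWidth() / tileSize;
int rows = source.getHeight() / tileSize;
int pad = 1; // extrude each tile by one pixel on every side
Pixmap padded = new Pixmap(cols * (tileSize + 2 * pad), rows * (tileSize + 2 * pad), source.getFormat());

for (int ty = 0; ty < rows; ty++) {
    for (int tx = 0; tx < cols; tx++) {
        int srcX = tx * tileSize;
        int srcY = ty * tileSize;
        int dstX = tx * (tileSize + 2 * pad) + pad;
        int dstY = ty * (tileSize + 2 * pad) + pad;
        // Copy the tile itself.
        padded.drawPixmap(source, srcX, srcY, tileSize, tileSize, dstX, dstY, tileSize, tileSize);
        // Extrude the four edges by repeating the outermost row/column of pixels.
        padded.drawPixmap(source, srcX, srcY, tileSize, 1, dstX, dstY - pad, tileSize, pad);                     // top
        padded.drawPixmap(source, srcX, srcY + tileSize - 1, tileSize, 1, dstX, dstY + tileSize, tileSize, pad); // bottom
        padded.drawPixmap(source, srcX, srcY, 1, tileSize, dstX - pad, dstY, pad, tileSize);                     // left
        padded.drawPixmap(source, srcX + tileSize - 1, srcY, 1, tileSize, dstX + tileSize, dstY, pad, tileSize); // right
    }
}
// In Tiled, set the tileset margin to pad and the spacing to 2 * pad for this layout.
PixmapIO.writePNG(Gdx.files.local("tileset_padded.png"), padded);
source.dispose();
padded.dispose();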

Is cropping and rescaling background a good solution for java games?

I'm very new to Java and I'm currently developing a 2D game.
I'm trying to use Java Swing for the graphics and I have a problem doing so:
Displaying the background, which is a fairly high-definition image (currently 2000x2000 but it will grow bigger for higher definition), for a map of 50 units in width and height.
The problem is that I don't want to display the whole map but only a fixed number of cells of it in width and height (here I chose 20). I first tried to rescale the image wider than the screen to make the 20 cells fit perfectly (a bit like a zoom on the area we want) and then draw it with an offset related to the player's position (the player is always displayed in the center of the screen).
But when I try to use bigger images, I get a Java heap space exception.
So I was thinking of getting a cropped version of the image and then rescaling it to the screen's dimensions, to get a smaller rescaled image. I'm not getting exceptions any more, but I have significant performance issues, with a drop of 30 fps.
I was also thinking about precomputing and storing all possible cropped images, but I wanted to know whether that is a common approach or not.
To sum up, I was wondering what the best ways are to display a background in games.
Since 2D and even 3D games have maps far larger than mine, I think I must be missing something: I don't get how to display high-resolution sprites while keeping a decent frame rate.
Thank you for your time.
edit:
To give a bit of context: the map is a big maze and the player should only be able to see a local view of it. And because I want a rather detailed background, I have to be able to display large images.
Here is a reduced view of my code sample :
public class Background implements Drawable, Anchor {

    private final String name;
    private final Image image;
    private final Behavior behavior; // describes the map size in cells (50x50 here)
    private final int width;

    public Background(String name, Behavior behavior) {
        this.name = name;
        this.behavior = behavior;
        BufferedImage image = FileSystem.readBufferedImage(GraphicType.BACKGROUND, name);
        //image is a 2000x2000 image
        this.image = image.getScaledInstance(
                (int) (behavior.width()
                        * (Game.frame.getWidth() / (double) Settings.NB_CELLS_ON_SCREEN_WIDTH)),
                -1,
                0);
        //result in a 19200x19200 image
        this.width = (int) (behavior.width()
                * (Game.frame.getWidth() / (double) Settings.NB_CELLS_ON_SCREEN_WIDTH));
    }

    @Override
    public void draw(Graphics g) {
        g.drawImage(
                image,
                -8640,
                -9060,
                null);
    }
}
The GraphicPosition class computes the position on screen; its constructor takes the following arguments: an anchor object, an xOffset and a yOffset.
I now draw the background using drawImage(Image img, int dx1, int dy1, int dx2, int dy2, int sx1, int sy1, int sx2, int sy2, ImageObserver observer).
As matt suggested, this causes a much smaller fps drop because I'm no longer creating a huge temporary image, and the nice thing about this solution is that the difference in frame rate with bigger images seems to be small.
Here is the code I used, which worked for me:
public class Foreground implements Drawable {

    private final String name;
    private final BufferedImage image;
    private final Behavior behavior;

    public Foreground(String name, Behavior behavior) {
        this.name = name;
        this.image = FileSystem.readBufferedImage(GraphicType.FOREGROUND, name);
        this.behavior = behavior;
    }

    @Override
    public void draw(Graphics g) {
        /* The width, in image pixels, of the number of cells displayed on screen
           (in my code, the width of 20 cells). */
        int widthOnImageForNbCellsDisplayed =
                (image.getWidth() / behavior.width()) * Settings.NB_CELLS_ON_SCREEN_WIDTH;

        /* posXInImage is the position of the top left corner of the
           area of the image we want to draw. */
        int posXInImage = (int) (
                Game.player.posX()
                * image.getWidth()
                / (double) behavior.width()
                - widthOnImageForNbCellsDisplayed / 2
        );
        int posYInImage = (int) (
                Game.player.posY()
                * image.getHeight()
                / (double) behavior.height()
                - widthOnImageForNbCellsDisplayed / 2
        );

        /* Here I struggled with the fact that the second pair of
           coordinates is absolute, not relative to the first pair,
           which is what I was originally thinking. */
        g.drawImage(image,
                0,
                -(Game.frame.getWidth() - Game.frame.getHeight()) / 2,
                Game.frame.getWidth(),
                Game.frame.getWidth() - (Game.frame.getWidth() - Game.frame.getHeight()) / 2,
                posXInImage,
                posYInImage,
                posXInImage + widthOnImageForNbCellsDisplayed,
                posYInImage + widthOnImageForNbCellsDisplayed,
                null);
    }
}

LibGDX text not centered in smaller viewport

I'm working on a LibGDX game which uses a smaller viewport.
public static float BOX_SCALE = 10;
public static final float VIRTUAL_WIDTH = (int) (320 / BOX_SCALE);
public static final float VIRTUAL_HEIGHT = (int) (480 / BOX_SCALE);
float viewportHeight = MyConstants.Screen.VIRTUAL_HEIGHT;
float viewportWidth = MyConstants.Screen.VIRTUAL_HEIGHT * Gdx.graphics.getWidth() / Gdx.graphics.getHeight();
For example, my viewport can have a size of (32, 48). I use Scene2D for rendering. For some reason, whenever I create a TextButton the text is never centered. This is the BitmapFont used for the button:
FreeTypeFontParameter fontParam = new FreeTypeFontParameter();
fontParam.size = 14;
FreeTypeFontGenerator generator2 = new FreeTypeFontGenerator(Gdx.files.internal("data/font.ttf"));
labelFont = generator2.generateFont(fontParam);
labelFont.setScale(1f / BOX_SCALE);
labelFont.setColor(Color.BLACK);
If I set BOX_SCALE to 1 then the TextButton behaves normally, but I need it for simulating the Box2D world. I guess I could create separate labels for each button and position them manually, but I can't figure out why this is happening. I'm also interested in whether there is a cleaner solution.
By default, font positions are rounded off to the nearest world unit. This is based on the assumption that your font will render pixel perfect. In your case, you don't want a pixel-perfect font, so call:
labelFont.setUseIntegerPositions(false);
Also, in your fontParam you should set it to use mipmaps, and set the minFilter to MipmapLinearNearest and the magFilter to Linear. That'll make it look better, since by default the filtering is set to Nearest/Nearest which looks bad if you aren't rendering pixel perfect.
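Putting both suggestions together, a short sketch (the field names come from FreeTypeFontGenerator.FreeTypeFontParameter; everything else reuses the question's variables):
FreeTypeFontParameter fontParam = new FreeTypeFontParameter();
fontParam.size = 14;
fontParam.genMipMaps = true;
fontParam.minFilter = Texture.TextureFilter.MipMapLinearNearest;
fontParam.magFilter = Texture.TextureFilter.Linear;
labelFont = generator2.generateFont(fontParam);
labelFont.setUseIntegerPositions(false);
labelFont.setScale(1f / BOX_SCALE);
labelFont.setColor(Color.BLACK);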

Figure out width of a String in a certain Font

Is there any way to figure out how many pixels wide a certain String in a certain Font is?
In my Activity, dynamic Strings are put on a Button. Sometimes the String is too long and gets split across two lines, which makes the Button look ugly. However, as I'm not using a monospaced font, the individual character widths vary. So it doesn't help to write something like
String test = "someString";
if (test.length() > /*someValue*/) {
    // decrement the font size
}
because an "mmmmmmmm" is wider than "iiiiiiii".
Alternatively, is there a way in Android to fit a certain String on a single line, so the system "scales" the Font size automatically?
EDIT:
Since the answer from wsanville was really nice, here's my code for setting the font size dynamically:
private void setupButton() {
    Button button = new Button(this); // assumes this runs inside an Activity (a Context is required)
    button.setText(getButtonText()); // getButtonText() is a custom method which returns a certain String
    Paint paint = button.getPaint();
    float t = 0;
    if (paint.measureText(button.getText().toString()) > 323.0) { // 323.0 is the max width fitting in the button
        t = getAppropriateTextSize(button);
        button.setTextSize(t);
    }
}

private float getAppropriateTextSize(Button button) {
    Paint paint = button.getPaint();
    float textSize = paint.getTextSize();
    while (paint.measureText(button.getText().toString()) > 323.0) {
        textSize -= 0.25;
        button.setTextSize(textSize);
    }
    return textSize;
}
You should be able to use Paint.setTypeface() and then Paint.measureText(). You'll find other methods on the Paint class like setTextSize() to help too.
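A minimal sketch of that measurement (the typeface, size, threshold and string here are placeholders):
Paint paint = new Paint();
paint.setTypeface(Typeface.create("sans-serif", Typeface.NORMAL)); // or the Button's own typeface
paint.setTextSize(48f); // in pixels; use the same size the Button will draw with
float textWidthPx = paint.measureText("someString");
if (textWidthPx > 323.0f) {
    // the text would overflow; reduce the size or shorten the string
}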
Your followup question about scaling text was addressed in this question.
