How to remove gaps between tiled textures? - java

I'm using LibGDX to make a platformer. I'm using square tiles for the platforms, but when they are drawn, some of them have gaps between them. When I zoom in/out or move the camera around, the gaps change position.
More details:
Tiles are 32x32 and I have tried both 32x32 and 64x64.
Tiles are lined up 32 pixels apart (e.g. first tile would be x=0 y=0, second x=32 y=0, and so on in both x and y directions).
The gaps are not texture artifacts as I have checked this.
I use the TexturePacker with padding.
My best guess is that it is a problem when converting the textures to screen coordinates, but I have no idea how to fix this and I couldn't find any solution. I have checked and double-checked my precision with tile sizes and lining them up.
Has anyone had the same problem or know how to fix it?

I got it fixed by setting the duplicatePadding field of the TexturePacker.Settings class to true.
Example code:
import com.badlogic.gdx.tools.texturepacker.TexturePacker;
import com.badlogic.gdx.tools.texturepacker.TexturePacker.Settings;
Settings settings = new Settings();
settings.maxWidth = 1024;
settings.maxHeight = 1024;
settings.duplicatePadding = true; // repeat each region's edge pixels into the padding
TexturePacker.process(settings, "source", "destination", "name");
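Once the atlas is generated, you load it as usual. A minimal usage sketch, assuming the pack file ends up reachable as "name.atlas" and with "tileRegion" as a placeholder region name:
TextureAtlas atlas = new TextureAtlas(Gdx.files.internal("name.atlas")); // actual path depends on your "destination" folder
TextureRegion tile = atlas.findRegion("tileRegion"); // placeholder region name
batch.draw(tile, x, y); // regions drawn from the atlas now carry duplicated edge pixels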

Well, I'm here to save your day!
The solution is called "Edge padding". Now if you are working with tilesets, I can assure you that this will work.
Personally I'm using Tiled, which allows me to adjust margin and spacing in my tilesets. The only downside is that you'll have to use GIMP with this plugin: http://registry.gimp.org/node/26044
This plugin will let you apply edge padding to your tileset and voila! No more ugly artifacts.

(Images from the original answer: "Bleeding" and "Gaps".)
The short answer is that it may be your filter, which likely needs to be set to NEAREST.
You might also want to check out the working tutorials for LibGDX.
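For example, a quick sketch (not from the original answer; "tilesetTexture" and "level.tmx" are placeholder names):
tilesetTexture.setFilter(Texture.TextureFilter.Nearest, Texture.TextureFilter.Nearest);
Or, when loading a Tiled map, you can request Nearest filtering up front:
TmxMapLoader.Parameters params = new TmxMapLoader.Parameters();
params.textureMinFilter = Texture.TextureFilter.Nearest;
params.textureMagFilter = Texture.TextureFilter.Nearest;
TiledMap map = new TmxMapLoader().load("level.tmx", params);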

It's called "texture bleeding". You need to add padding to your tiles so that when sampling bleeds past a tile's edge, it picks up the correct (duplicated) pixel data instead of a neighbouring tile or empty space.

I know it's a bit late to answer on this post, but when I was looking for a solution I came here.
However, I found a much easier way to get rid of the flickering and the gaps that appear randomly between the tiles.
I simply added a cast to the very long decimals I got for the player's position:
camera.position.set((int)robot.position.x, (int)robot.position.y, 0);
That made the player move very jerkily, so to make him move smoothly again I added a cast in the render call as well:
batcher.draw(robotSprite, (int)robot.position.x, (int)robot.position.y, 16, 16);
Et voilà! Working perfectly for me.

I'll post my solution here, along with the things I tried in LibGDX for this problem.
--
T1. Repack the original spritesheet (downloaded from somewhere, with no atlas file) with a padding of 2.
A1. This is practically impossible when the spritesheet has no atlas; even if you find a slicer/splitter tool, you end up with a bunch of images that need to be repacked properly for the TiledMap (.tmx).
A1 (updated). The script provided by #Nine Magics is the best way to do this! (I use this as my final solution.)
--
T2. Use the TiledMapPacker provided by the LibGDX nightlies or gdx-tools. The batch command should be:
java -classpath "gdx.jar";"gdx-natives.jar";"gdx-backend-lwjgl.jar";"gdx-backend-lwjgl-natives.jar";"gdx-tiled-preprocessor.jar";"extensions/gdx-tools/gdx-tools.jar" com.badlogic.gdx.tiledmappacker.TiledMapPacker "PathToYourProject\android\assets\RawMap" "PathToYourProject\android\assets\Map" --strip-unused
A2. The output .tmx may not be readable by Tiled if you use complex folder paths to categorize your .png files, and the output file can also fail to load with AtlasTmxMapLoader.
--
T3. Camera position correction: snap the camera position to integers, like the code from #Julian or #strangecat in "libgdx tiledmap flicker with Nearest filtering".
A3. I use this solution with no problems; I'll also post my code, which differs slightly from theirs:
// snap the camera to whole pixels in world units (PPM_X = pixels per meter)
float cameraX = (int)(mainCamera.position.x * Game.PPM_X) / Game.PPM_X;
float cameraY = (int)(mainCamera.position.y * Game.PPM_X) / Game.PPM_X;
float cameraZ = mainCamera.position.z;
mainCamera.position.set(cameraX, cameraY, cameraZ);
Also load the map with TmxMapLoader.Parameters:
TmxMapLoader.Parameters params = new TmxMapLoader.Parameters();
params.textureMinFilter = Texture.TextureFilter.Linear;
params.textureMagFilter = Texture.TextureFilter.Nearest;
params.generateMipMaps = true;
assetManager.load(TILED_MAP_SETS.FIRST_MAP, TiledMap.class, params);
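Once loading has finished, the map is retrieved from the manager as usual; a minimal sketch, assuming a standard AssetManager workflow:
assetManager.finishLoading(); // or keep calling assetManager.update() from a loading screen
TiledMap tiledMap = assetManager.get(TILED_MAP_SETS.FIRST_MAP, TiledMap.class);
OrthogonalTiledMapRenderer mapRenderer = new OrthogonalTiledMapRenderer(tiledMap);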
If you use PPM and want to move pixel by pixel, you can use this integer correction in your game; if not, you can simply cast the position to an integer.
I almost wasted a whole day solving this; I hope this investigation helps other game developers :)
Edit (2018/04/21)
I found out that Unity has the same problem, but I haven't tested whether LibGDX enables 2x anti-aliasing by default. As in Unity, turning anti-aliasing off might fix the issue in LibGDX.

Related

libgdx- padding didn't solve black lines on tilemap

I read all the million posts about the problem with black lines showing on screen when rendering tilemaps in LibGDX. All the solutions talk about adding padding to the tile sets. So I did it, but it didn't solve the problem.
I think maybe I didn't understand it well, so I will post the process I'm following here, and I will be glad if someone can point out my mistake, if any:
This is an image of my initial tile set:
This is an image of my tile set after adding padding using the GIMP plugin suggested in this post:
Then in Tiled, I add this tileset and set Margin to 1px and Spacing to 2px, as described in this post.
Now as I understand the problem should be solved, but still when I run the program I get:
I can add code if you want, but there is really nothing special about the way I render my map: I just load it with TmxMapLoader and render it with MapRenderer (using a camera as well, of course, otherwise the problem wouldn't appear).
I know this thread is very old, but it took me two hours to figure out the solution myself. So in case somebody has the same problem in the future:
I noticed that only some tiles behaved like this - for example, a grass tile would, a water tile wouldn't. Even if they were used on the exact same position in the map.
For me the problem was - apparently - that my PNG tile sheet's height was not a power of 2. A few days before, I added a line to an existing tilesheet and changed the height from 1024px to 1056px. After a lot of experimenting, I found out that after removing this line again, the black stripes would disappear.
Load your maps like this:
TmxMapLoader.Parameters params = new TmxMapLoader.Parameters();
params.generateMipMaps = true;
TmxMapLoader mapLoader = new TmxMapLoader();
TiledMap map = mapLoader.load("pathToMap", params);
I had the same problem. I fixed it by changing the size of my spritesheet to a power of two. For example:
640*640 --> wrong
512*512 --> right

Anti Aliasing based on colors (not textures)

I was searching for an anti-aliasing algorithm for my OpenGL program (so I searched for a good shader). The thing is, all the shaders want to do something with textures, but I don't use textures, only colors. I looked at FXAA most of the time, so is there an anti-aliasing algorithm that works with just colors? The game this is for looks blocky like Minecraft, but it only uses colors and cubes of different sizes.
I hope someone can help me.
Greetings
Anti-aliasing has nothing specifically to do with either textures or colors.
Proper anti-aliasing is about sample rate, which while highly technical can be thought of as doing extra work to make a better educated guess at some value that cannot be directly looked up (e.g. a pixel that is only partially covered by a triangle).
Multisample Anti-Aliasing (MSAA) will work nicely for you, it will only anti-alias polygon edges and does nothing for texture aliasing on the interior of a polygon. Since you are not using textures you do not need to worry about aliasing inside a polygon.
Incidentally, FXAA is not proper anti-aliasing. FXAA is basically a shader-based edge detection and blur image processing filter. FXAA will blur any part of the scene with sharp edges, whether it is a polygon edge or an edge due to a mapped texture. It indiscriminately blurs anything it thinks is an aliased edge and gets this wrong often, resulting in blurry textures.
To use MSAA, you need:
1. A framebuffer with at least 2 samples
2. Multisample rasterization enabled
Satisfying (1) is going to depend on what you used to create your window (in this case LWJGL). Most frameworks let you select the sample count as one of the parameters at the time of creation.
Framebuffer Objects can also be used to do this without messing with your window's parameters, but they are more complicated than need be for this discussion.
(2) is as simple as calling glEnable(GL_MULTISAMPLE).
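Putting both steps together for a plain LWJGL 2 window might look like the following minimal sketch (window size, sample count, and the render loop are illustrative assumptions, not part of the original answer):
import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.opengl.GL11;
import org.lwjgl.opengl.GL13;
import org.lwjgl.opengl.PixelFormat;

public class MsaaWindow {
    public static void main(String[] args) throws LWJGLException {
        // (1) Request a multisampled default framebuffer when creating the window.
        Display.setDisplayMode(new DisplayMode(800, 600));
        Display.create(new PixelFormat().withSamples(4)); // 4x MSAA; may throw if unsupported

        // (2) Enable multisample rasterization.
        GL11.glEnable(GL13.GL_MULTISAMPLE);

        while (!Display.isCloseRequested()) {
            GL11.glClear(GL11.GL_COLOR_BUFFER_BIT);
            // ... draw your colored cubes here ...
            Display.update();
        }
        Display.destroy();
    }
}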

Why do I have lines going across my libgdx game using Tiled?

I'm using LibGdx and Tiled and when moving around the screen, there are both horizontal and vertical lines appearing on the game. I can post any code you need, if necessary. How do I get these lines to stop?
Here's a gfycat gif of the lines:
http://gfycat.com/FastUnnaturalAmericanwirehair
Edit:
Here's a small bitbucket repository, as small as I could get it that has the same glitch in it:
https://bitbucket.org/Chemical_Studios/example-of-line-glitch/src/8eeb153ec02236d836763072611bd7aa55d38495/minimalExample/src/com/weebly/chemicalstudios/minEx/?at=master
This is because you need to add padding to your tiles.
This is a pretty common problem and you are not the first to encounter it. Basically, due to rounding errors when scaling and panning around, sometimes you will render the area "between" two tiles, which results in nothing being rendered there, so the black background colour shows through.
You basically need to use some tools to add the padding to your tileset. In this forum thread I explained how to do it.
There is also one more question regarding this topic on Stack Overflow here.
When you have rounding errors you can always force the number to snap to the grid you want. In my case that looked like this:
gameCam.position.x = (float) Math.round(player.b2body.getPosition().x * 100f) / 100f;
I used a pixels-per-meter constant of 100f throughout the game to scale everything, so rounding to two decimal places snaps the camera to whole pixels.

libgdx texture filters and mipmap

When I try to use mipmap filtering in LibGDX, none of the images appear.
I'm new to LibGDX, and I have a simple 2D scene with three rotating, scaled circles. In order to anti-alias them, I wanted to use linear filtering. For advice, I looked at this article, which said that, for heavily scaled images, a mipmap can be used to improve speed or quality.
The first surprise was that, even though all of my images were scaled down, I would only see linear filtering if the magFilter was linear. In other words:
This code will show a linear filter for minified images:
parentTexture.setFilter(TextureFilter.Nearest, TextureFilter.Linear);
while this code will not:
parentTexture.setFilter(TextureFilter.Linear, TextureFilter.Nearest);
which seems opposite to the libGDX function:
void com.badlogic.gdx.graphics.Texture.setFilter(TextureFilter minFilter, TextureFilter magFilter)
This would not bother me, except that it indicates that either libgdx is wrong (unlikely), the article is wrong (unlikely), or I don't understand texture filters. The latter seems especially likely when I try mipmap filters.
This code causes nothing to display
parentTexture.setFilter(TextureFilter.MipMapLinearLinear, TextureFilter.Linear);
This code displays, but with nearest filtering
parentTexture.setFilter(TextureFilter.Linear, TextureFilter.MipMapLinearLinear);
Any explanation of where I'm wrong would be greatly appreciated. I have searched elsewhere, but texture filters in libGDX is pretty specific, so aside from the article, I haven't found much to help.
I had this same problem, and the fix turned out to be insanely simple. When you create a Texture, you need to specify that it uses mipmaps.
All you have to do is pass a second parameter to the Texture constructor like this:
Texture myTexture = new Texture(Gdx.files.internal("myImage.png"), true);
You can view all the Texture class constructors in the API docs here: http://libgdx.badlogicgames.com/nightlies/docs/api/com/badlogic/gdx/graphics/Texture.html
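For completeness, a small sketch combining that constructor with a mipmap minification filter (file name reused from the example above):
Texture myTexture = new Texture(Gdx.files.internal("myImage.png"), true); // true = generate mipmaps
myTexture.setFilter(Texture.TextureFilter.MipMapLinearLinear, Texture.TextureFilter.Linear);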
Just like Mitesh said in his answer, the mipmap filters don't work because you are not telling LibGDX to generate mipmaps.
If you are using the asset manager, the code will be something like this:
TextureParameter param = new TextureParameter();
param.genMipMaps = true; // enabling mipmaps
manager.load("path/to/texfile.png", Texture.class, param);
Texture tex = manager.get("path/to/texfile.png", Texture.class);
tex.setFilter(TextureFilter.MipMap, TextureFilter.Nearest);
There can be multiple issues with your image:
It should be a power of 2; if you are using an image with a size like 354x420, it won't work. In this case you need to take an image of 512x512 or some other power of 2.
When you want to enable mipmap filtering, you need to enable it using the boolean genMipMaps, which tells LibGDX whether to generate mipmaps.
Try using the same minFilter and magFilter. I had a similar problem, and if I set both to TextureFilter.Linear, or both to a MipMap filter, the problem is solved.
Hope this helps.

LibGDX FrameBuffer scaling

I'm working on a painting application using the LibGDX framework, and I am using their FrameBuffer class to merge what the user draws onto a solid texture, which is what they see as their drawing. That aspect is working just fine, however, the area the user can draw on isn't always going to be the same size, and I am having trouble getting it to display properly on resolutions other than that of the entire window.
I have tested this very extensively, and what seems to be happening is the FrameBuffer is creating the texture at the same resolution as the window itself, and then simply stretching or shrinking it to fit the actual area it is meant to be in, which is a very unpleasant effect for any drawing larger or smaller than the window.
I have verified, at every single step of my process, that I am never doing any of this stretching myself, and that everything is being drawn how and where it should, with the right dimensions and locations. I've also looked into the FrameBuffer class itself to try and find the answer, but strangely found nothing there either; given all of the testing I've done, though, it seems to be the only possible place this issue could be coming from.
I am simply completely out of ideas, having spent a considerable amount of time trying to troubleshoot this problem.
Thank you so much Synthetik for finding the core issue. Here is the proper way to fix the situation you allude to. (I think!)
The way to make the FrameBuffer produce a correctly scaled texture with the right aspect ratio, regardless of the actual device window size, is to set the projection matrix to the size required, like so:
SpriteBatch batch = new SpriteBatch();
Matrix4 matrix = new Matrix4();
matrix.setToOrtho2D(0, 0, 480,800); // here is the actual size you want
batch.setProjectionMatrix(matrix);
I believe I've solved my problem, and I will give a very brief overview of what the problem is.
Basically, the cause of this issue lies within the SpriteBatch class. Specifically, assuming I am not using an outdated version of the class, the problem lies on line 181, where the projection matrix is set. The line:
projectionMatrix.setToOrtho2D(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
This causes everything that is drawn to, essentially, be drawn at the scale of the window/screen and then stretched to fit where it needs to go afterwards. I am not sure if there is a more "proper" way to handle this, but I simply created another method within the SpriteBatch class that allows me to call this method again with my own dimensions, and I call that when necessary. Note that it isn't required on every draw, only once, or any time the dimensions change.
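For reference, a hedged sketch of the same idea without modifying SpriteBatch, just setting the projection matrix before drawing into the FrameBuffer (canvasWidth/canvasHeight are placeholder names for your drawing-area size):
int canvasWidth = 480, canvasHeight = 800; // placeholder drawing-area size
FrameBuffer fbo = new FrameBuffer(Pixmap.Format.RGBA8888, canvasWidth, canvasHeight, false);
Matrix4 fboProjection = new Matrix4();
fboProjection.setToOrtho2D(0, 0, canvasWidth, canvasHeight);

fbo.begin();
batch.setProjectionMatrix(fboProjection); // draw in canvas coordinates, not window coordinates
batch.begin();
// ... draw the user's strokes here ...
batch.end();
fbo.end();
// fbo.getColorBufferTexture() now holds the merged drawing at the intended resolution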
