I am attempting to apply texture coordinates from a pre-generated PNG file to a 3D world of quads, loaded into Java with LWJGL's slick-util extension.
The texture file is 192x96 pixels and properly formatted; it is composed of a 6x3 grid of 32x32 tiles.
The 3D quads are 1.5f wide and long, and they are spaced apart properly.
I am having trouble getting the right texture coordinates. When I use 0.0f to 0.333333f as the y coordinates, I get slightly more than the top tile's height. However, if I use 0.0f to 0.25f, I get exactly one third of the texture, which is my tile's height. I have yet to find a magic number for the x coordinates, but maybe someone could explain why 0.25, which should correspond to only 24 of my 96 pixels, lands exactly on my 32-pixel tile boundary, or what I'm doing wrong? I suspect it could be a clash between my quad size and my textures.
The tops of the cubes use the texture coordinates (0.0f, 0.0f), (0.0f, 0.333333f), (0.166666f, 0.333333f), (0.166666f, 0.0f), applied moving anticlockwise from the top left to the top right. Again, the main texture file is made of 32x32 tiles arranged into a 192x96 image (96 is the height).
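In code, the quad tops presumably look something like this (a legacy immediate-mode reconstruction for illustration, not the asker's actual code; x, y, z mark the top-left corner of a 1.5f-wide quad):

GL11.glBegin(GL11.GL_QUADS);
// anticlockwise from the top left, using the coordinates above
GL11.glTexCoord2f(0.0f, 0.0f);           GL11.glVertex3f(x, y, z);
GL11.glTexCoord2f(0.0f, 0.333333f);      GL11.glVertex3f(x, y, z + 1.5f);
GL11.glTexCoord2f(0.166666f, 0.333333f); GL11.glVertex3f(x + 1.5f, y, z + 1.5f);
GL11.glTexCoord2f(0.166666f, 0.0f);      GL11.glVertex3f(x + 1.5f, y, z);
GL11.glEnd();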
Notice that I placed a white line at the top of one of the tiles to see its border, a black line at its bottom, and then a white line for the top of the next tile below it. The texture 'bleeds' too far down. The other textures have their own, even stranger, coordinates, as you can see.
Arranging the texture coordinates on the assumption that the top of the image is 1.0 rather than the bottom produces odd squares with a rectangular hole in the center where the quads should be.
I am using TEX_ENV GL_MODULATE.
Texture sizes are usually powers of two. I suspect something resized your 192x96 texture to 256x128 or 256x256. If it was padded to 256x128, that would actually explain the y values you found: one 32-pixel tile is 32/128 = 0.25 of the padded height, which is exactly the coordinate that gave you a full tile. I think that if you resize your texture to 256x256 (increase the size, don't scale!) and calculate your texture coordinates based on that, your problem will go away.
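To make the padding idea concrete, here is a minimal sketch of computing tile coordinates against the padded size rather than the authored 192x96; the helper name tileUV and the row/column layout are illustrative, and it assumes padding to 256x128:

// UVs must be computed against the padded texture size (256x128), not 192x96
static float[] tileUV(int col, int row) {
    final float TILE = 32f;
    final float TEX_W = 256f; // padded width:  192 -> 256
    final float TEX_H = 128f; // padded height:  96 -> 128
    float u0 = col * TILE / TEX_W;       // left
    float v0 = row * TILE / TEX_H;       // top
    float u1 = (col + 1) * TILE / TEX_W; // right:  0.125f for col 0, not 0.166666f
    float v1 = (row + 1) * TILE / TEX_H; // bottom: 0.25f for row 0, matching the question
    return new float[] { u0, v0, u1, v1 };
}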
I don't know about Java, but with my image atlases in Objective-C and OpenGL ES I need to make the texture regions slightly smaller than the images they refer to when selecting them from the atlas.
Have you left a sufficient gap between the texture images to prevent 'bleeding'?
I want to develop a simple 2D side-scrolling game using libGDX.
My world contains many different 64x64-pixel blocks that are drawn by a SpriteBatch using a camera to fit the screen. My 640x640 px resource file contains all these images. The block textures are positioned at (0, 0), (0, 64), (64, 0), and so on in my resource file.
When my app launches, I load the texture and create many different TextureRegions:
// Load the 640x640 atlas and carve out the first 64x64 block region
texture = new Texture(Gdx.files.internal("texture.png"));
block = new TextureRegion(texture, 0, 0, 64, 64);
block.flip(false, true); // flip vertically to match the y-down camera setup
// continue with the other blocks
Now, when I render my world, everything seems fine at first. But some blocks (about 10% of them) are drawn as if the TextureRegion's rectangle were positioned wrong: they draw the bottommost pixel row of the block above them (in the resource texture) as their topmost pixel row. Most of the blocks are rendered correctly, and I have checked multiple times that I entered the correct positions.
The odd thing is that when I launch the game on my computer instead of my Android device, the textures are drawn correctly!
When searching for solutions, many people refer to the texture filter, but neither Linear nor Nearest works for me. :(
Hopefully I was able to explain the problem in an accessible way, and you have some ideas how to fix it (i.e. how to draw only the texture region that I want to draw)!
EDIT: The bug only appears at certain positions. When I draw two blocks with the same texture at different positions, one of them is drawn correctly and the other is not. I don't get it.
You should always leave empty space between your images when packing them into one texture, because with FILTER_LINEAR (which I think is the default), every pixel is sampled from the four nearest pixels. If your images have no padding of empty pixels, then for every edge pixel the sampler will pick up pixels from the neighboring image.
So three options to solve your issue:
Manually add space between the images in your texture file
Stop using FILTER_LINEAR (but you will get ugly results if you are not drawing at the native image dimensions, e.g. when scaling the image)
Use the libGDX TexturePacker; it has built-in functionality to do just that when you pack your images (see the sketch below)
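For the third option, a minimal sketch of what the packing step could look like, assuming the gdx-tools TexturePacker; the directory names and pack-file name here are placeholders:

import com.badlogic.gdx.tools.texturepacker.TexturePacker;

public class PackTextures {
    public static void main(String[] args) {
        TexturePacker.Settings settings = new TexturePacker.Settings();
        settings.paddingX = 2;            // empty pixels between packed images
        settings.paddingY = 2;
        settings.duplicatePadding = true; // copy edge pixels into the padding to hide bleeding
        TexturePacker.process(settings, "raw-assets", "assets", "game");
    }
}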
Having a weird issue.
I have a Texture with four frames of a sprite Animation. Each frame is loaded as a TextureRegion.
Most of the time the animation plays without any issues, but occasionally it will draw too much of the Texture in one frame.
Here's an example of what I mean:
As you can see, the UFO has a red bar on the left side of it. That red bar is part of a frame outside the TextureRegion bounds stated in my code. (The red frame is just there to make it easier for me to measure, since there is transparency at all the corners.)
Here's the Texture:
In the above sprite sheet, the red frame for the slide at the top has the bounds 0, 0, 202, 71. The TextureRegion for that frame of the animation is 1, 1, 200, 69; at no point should any of the red frame be displayed, as far as I can tell.
I realise that as a workaround I could just make the frame transparent now that I have the measurements I need, but I'd like to keep the red frame in case I need to take the measurements again later, or replace the sprite images, and so on. Really, a workaround is just a band-aid, whereas I'm hoping to find a proper solution that addresses the root of the issue. The fact that it draws incorrectly seems to indicate a larger problem than this particular case alone (e.g., in a densely packed Texture it might draw pixels from a different sprite frame, a different sprite, or even a menu image).
Oh, and one last note in case it's helpful: when the SpriteBatch displays the image, it applies a rotation based on the movement of the UFO (tilting to the left when moving left, etc.). The glitchy red bars sometimes show up on the top, right, bottom, or left at random (though most of the time they don't show up at all), but they only seem to appear when the UFO's rotation is zero. (Again, I realise I could check whether the rotation is 0 and then call SpriteBatch.draw() without the rotation argument, but that too would be treating the symptom rather than addressing the root of the problem.)
Any thoughts from the learned masters?
Your frames of animation need padding around them to account for rounding error. Put two pixels of transparent padding all around each image. If you use TexturePacker to combine the images into your file, it will add the two pixels of padding automatically by default.
If you name your four images with an underscore and frame-number suffix, like myAnimation_0.png, myAnimation_1.png, myAnimation_2.png, and myAnimation_3.png, then when you load your TextureAtlas it lets you get the animation frames very easily:
Array<TextureAtlas.AtlasRegion> myAnimationFrames = textureAtlas.findRegions("myAnimation");
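From there, a minimal sketch of building the animation, assuming a libGDX version where Animation is generic; the 0.1f frame duration is an arbitrary illustrative value:

// AtlasRegion extends TextureRegion, so the found regions can drive an Animation directly
Animation<TextureRegion> myAnimation =
        new Animation<>(0.1f, myAnimationFrames, Animation.PlayMode.LOOP);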
What is the easiest way to give a 2D texture in OpenGL (LWJGL) some kind of "thickness"? Of course, I could somehow find the border of the texture and add quads, oriented by the normal of the quad that the texture is drawn on, in the color of the adjacent texture pixel. But there has to be an easier way to do it.
Minecraft uses LWJGL as well, and it has the (new) 3D items that spin on the ground without causing as much of a performance hit as if they were modeled from dozens of polygons. Likewise, when you hold an item in your hand, there is that kind of "stretched" texture with depth, which also works with high-resolution textures.
Does anyone know how that is done?
A 2D texture is always infinitely thin. If you want actual thickness (visible when you look at it edge-on), you need geometry. In Minecraft things look blocky because they have been modeled that way.
If you look at the surface from an angle and ignore the edges, you can use parallax mapping to "fake" some depth in the texture. Or you can use a depth map and a combination of tessellation and vertex shaders to implement displacement mapping, which generates geometry from the texture.
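To make the "you need geometry" point concrete, here is a minimal legacy immediate-mode sketch (LWJGL 2 style, GL11) that fakes a slab by drawing the same textured quad twice, offset along its normal; drawThickQuad and its parameters are invented for illustration, and the edge quads the question mentions are omitted for brevity:

// Illustrative only: render a textured quad as a thin slab
static void drawThickQuad(float w, float h, float thickness) {
    float z = thickness / 2f;
    GL11.glBegin(GL11.GL_QUADS);
    // front face
    GL11.glTexCoord2f(0, 0); GL11.glVertex3f(0, 0,  z);
    GL11.glTexCoord2f(1, 0); GL11.glVertex3f(w, 0,  z);
    GL11.glTexCoord2f(1, 1); GL11.glVertex3f(w, h,  z);
    GL11.glTexCoord2f(0, 1); GL11.glVertex3f(0, h,  z);
    // back face, wound the other way so it faces backwards
    GL11.glTexCoord2f(0, 1); GL11.glVertex3f(0, h, -z);
    GL11.glTexCoord2f(1, 1); GL11.glVertex3f(w, h, -z);
    GL11.glTexCoord2f(1, 0); GL11.glVertex3f(w, 0, -z);
    GL11.glTexCoord2f(0, 0); GL11.glVertex3f(0, 0, -z);
    GL11.glEnd();
}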
I have been reading "Learning Libgdx Game Development". I tried the snippet below:
// First the camera object is created with viewport of 5 X 5.
OrthographicCamera camera = new OrthographicCamera(5, 5);
I have a texture with dimensions of 32 by 32 pixels, and I form a sprite out of it:
Sprite spr = new Sprite(texture);
// I set the size of spr as follows
spr.setSize(1, 1);
According to the book, the dimensions above are meters, not pixels.
What I don't understand is how the mapping from meters to pixels happens on the screen. When I draw the sprite, its size is not even half a meter, let alone 1.
Also, the underlying texture is 32 x 32 pixels; when I resize it, the size of my sprites changes as well.
Then, what are the units of spr.setPosition(x, y)? Are they meters or pixels?
The library uses pixels for dimensions like texture size, and meters for in-game units.
setPosition will move an object in game units. When you move an object X game units, the number of pixels it moves on screen depends on the camera's projection matrix, among other settings.
If you think about it, it wouldn't make sense to move in pixels: if camera A is zoomed in more than camera B, moving X pixels in the view of each camera would correspond to two different world-space distances.
Edit: Sorry, I made some assumptions about your understanding above, partially misunderstood the question, and frankly used misleading wording. The key is that the convention of meters for units is not built-in; it's one that you enforce yourself, because a one-pixel-to-one-meter ratio in Box2D wouldn't make sense. My wording implied that setPosition internally cares about meters, but you should be doing the scaling yourself. The ratio I most often see in libGDX is 30 pixels = 1 meter.
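A minimal sketch of that convention, assuming a hand-rolled PPM (pixels-per-meter) constant; the value 30 and the names below are illustrative, not part of the library:

static final float PPM = 30f; // pixels per meter, a convention you choose yourself

// Viewport measured in world units ("meters"), derived from the screen size
OrthographicCamera camera = new OrthographicCamera(
        Gdx.graphics.getWidth() / PPM,
        Gdx.graphics.getHeight() / PPM);

// A 32x32 px texture drawn as roughly one meter on a side
Sprite spr = new Sprite(texture);
spr.setSize(32f / PPM, 32f / PPM);
spr.setPosition(2f, 3f); // world units; they are "meters" only by this convention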
While working on projectiles, I thought it would be a good idea to rotate the sprite as well, to make it look nicer.
I am currently using a one-dimensional array of pixels, and the sprite's width and height can and will vary, which makes it a bit more difficult for me to figure out how to do this correctly.
I will be honest and say it straight out: I have absolutely no idea how to do this. I have done a few searches to try to find something, and there were some things out there, but the best I found was this:
DreamInCode ~ Rotating a 1-dimensional Array of Pixels
This method works fine, but only for square sprites. I would also like to apply it to non-square (rectangular) sprites. How could I set it up so that rectangular sprites can be rotated?
Currently I'm attempting to make a laser, and it would look much better if it didn't only travel along a vertical or horizontal axis.
You need to recalculate the coordinates of your image's points (take a look here). You have to multiply every point (x, y) of your sprite by the rotation matrix to get the new point (x', y') in the rotated space.
You can assume that the bottom left (or another corner, depending on your coordinate system's orientation) of your sprite is at (x, y) = (0, 0).
You should recalculate the colors too: if you have a pure red pixel surrounded by blue pixels at (x, y) = (10, 5), after the rotation it can move to, for example, (x, y) = (8.33, 7.1), which is not a real pixel position, because pixels don't have floating-point coordinates. So the pixel at the real position (x, y) = (8, 7) will no longer be pure red, but red with a small percentage of blue. But one thing at a time.
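Here is a minimal sketch of that idea for rectangular sprites, using inverse mapping (for each destination pixel, rotate back and sample the source) with nearest-neighbour sampling, so no colour blending yet; the method name, the ARGB int[] layout, and the outSize parameter are assumptions for illustration:

// Rotate a one-dimensional ARGB pixel array of size w x h by 'angle' radians.
// The destination is sized to the rotated bounding box; outSize receives {dw, dh}.
static int[] rotate(int[] src, int w, int h, double angle, int[] outSize) {
    double cos = Math.cos(angle), sin = Math.sin(angle);
    int dw = (int) Math.ceil(Math.abs(w * cos) + Math.abs(h * sin));
    int dh = (int) Math.ceil(Math.abs(w * sin) + Math.abs(h * cos));
    double cx = w / 2.0, cy = h / 2.0, dcx = dw / 2.0, dcy = dh / 2.0;
    int[] dst = new int[dw * dh]; // stays 0 (fully transparent) where nothing maps
    for (int dy = 0; dy < dh; dy++) {
        for (int dx = 0; dx < dw; dx++) {
            // rotate the destination pixel back into source space, around the centers
            double sx =  cos * (dx - dcx) + sin * (dy - dcy) + cx;
            double sy = -sin * (dx - dcx) + cos * (dy - dcy) + cy;
            int ix = (int) Math.round(sx), iy = (int) Math.round(sy);
            if (ix >= 0 && ix < w && iy >= 0 && iy < h) {
                dst[dy * dw + dx] = src[iy * w + ix];
            }
        }
    }
    outSize[0] = dw;
    outSize[1] = dh;
    return dst;
}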
It's easier than you think: you only have to copy each original rectangular sprite, centered, into a bigger square one with a transparent background. PNG files support transparency, so I think you can use them for this.