Rendering and Cropping/Stretching an Image Using Slick/OpenGL (using getSubImage) - java

I'm trying to recreate a shadow effect on some 2D sprites in a project using Slick. To do this, I'm recolouring a copy of the sprite and stretching it using Slick OpenGL using this method:
public static void getStretched(Shape shape, Image image) {
    TextureImpl.bindNone();
    image.getTexture().bind();
    SGL GL = Renderer.get();
    GL.glEnable(SGL.GL_TEXTURE_2D);
    GL.glBegin(SGL.GL_QUADS);
    // top left
    GL.glTexCoord2f(0f, 0f);
    GL.glVertex2f(shape.getPoints()[0], shape.getPoints()[1]);
    // top right
    GL.glTexCoord2f(0.5f, 0f);
    GL.glVertex2f(shape.getPoints()[2], shape.getPoints()[3]);
    // bottom right
    GL.glTexCoord2f(1f, 1f);
    GL.glVertex2f(shape.getPoints()[4], shape.getPoints()[5]);
    // bottom left
    GL.glTexCoord2f(0.5f, 1f);
    GL.glVertex2f(shape.getPoints()[6], shape.getPoints()[7]);
    GL.glEnd();
    GL.glDisable(SGL.GL_TEXTURE_2D);
    TextureImpl.bindNone();
}
This gives almost the desired effect, except that the image is cropped a bit.
The cropping becomes more extreme at higher distortions.
I'm new to using OpenGL, so some help regarding how to fix this would be great.
Furthermore, if I feed an image into the method that was obtained using getSubImage, OpenGL renders the original image rather than the sub-image.
I'm unsure why this happens, as the sprite itself is taken from a spritesheet using getSubImage and renders without problems.
Help would be greatly appreciated!

I'm recolouring a copy of the sprite and stretching it
The issue is that you stretch the texture coordinates, but the region covered by the quad primitive stays the same. If the shadow exceeds the region covered by the quad, it is cropped.
You have to "stretch" the vertex coordinates rather than the texture coordinates. Define a rhombic geometry for the shadow and wrap the texture on it:
float distortionX = ...;
GL.glEnable(SGL.GL_TEXTURE_2D);
GL.glBegin(SGL.GL_QUADS);
// top left
GL.glTexCoord2f(0f, 0f);
GL.glVertex2f(shape.getPoints()[0] + distortionX, shape.getPoints()[1]);
// top right
GL.glTexCoord2f(1f, 0f);
GL.glVertex2f(shape.getPoints()[2] + distortionX, shape.getPoints()[3]);
// bottom right
GL.glTexCoord2f(1f, 1f);
GL.glVertex2f(shape.getPoints()[4], shape.getPoints()[5]);
// bottom left
GL.glTexCoord2f(0f, 1f);
GL.glVertex2f(shape.getPoints()[6], shape.getPoints()[7]);
GL.glEnd();
[...] if I feed an image into the method that was obtained using getSubImage, [...] the sprite itself is taken from a spritesheet [...]
The sprite covers just a small rectangular region of the entire texture. This region is defined by the texture coordinates. You have to use the same texture coordinates as when you draw the sprite itself.
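In Slick, a sub-image created with getSubImage already knows this region: the Image class exposes getTextureOffsetX()/getTextureOffsetY() and getTextureWidth()/getTextureHeight(), whose values you can pass to glTexCoord2f instead of the fixed 0f/1f corners. The arithmetic behind those getters is just pixel position divided by texture size; a minimal sketch (helper name and numbers illustrative, not Slick API):

```java
// Sketch: compute the normalized texture coordinates of a sub-image so the
// shadow quad samples the same region of the sheet as the sprite itself.
public class SubImageCoords {
    // Returns {u0, v0, u1, v1} for a pixel rectangle inside a texture.
    public static float[] subImageTexCoords(int x, int y, int w, int h,
                                            int texW, int texH) {
        return new float[] {
            (float) x / texW,       // left   (u0)
            (float) y / texH,       // top    (v0)
            (float) (x + w) / texW, // right  (u1)
            (float) (y + h) / texH  // bottom (v1)
        };
    }

    public static void main(String[] args) {
        // e.g. a 32x32 sprite at pixel (64, 0) inside a 256x128 sheet
        float[] uv = subImageTexCoords(64, 0, 32, 32, 256, 128);
        System.out.println(uv[0] + " " + uv[1] + " " + uv[2] + " " + uv[3]);
        // prints: 0.25 0.0 0.375 0.25
    }
}
```

In getStretched, you would then use uv[0]/uv[1] and uv[2]/uv[3] at the quad corners in place of 0f and 1f.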

Related

OpenGL not drawing textured and non-textured quads simultaneously

I'm trying to draw a textured quad, then on top of that a black rectangle with no texture. I use glEnable(GL_TEXTURE_2D) before I draw the textured quad and glDisable(GL_TEXTURE_2D) before I draw the non-textured quad, but my textured quad only appears for a split second; after that I'm left with my clear color filling the entire screen and only the non-textured quad showing up. Why is this happening?
P.S.: I call glClear(GL_COLOR_BUFFER_BIT) before doing any render work, not between the render methods, so I don't think that's the problem.
Also, I can only see the non-textured quad even if I draw the textured quad on top of it.
I solved my problem. It turns out I didn't set a color before drawing the textured quad, so whenever I changed the color for the other quad, that color was applied to the textured quad as well. Setting the color before binding the texture fixed it.
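The sticky-state behavior behind this can be modeled outside OpenGL: in immediate mode the current color persists across draw calls and frames, and a textured draw is modulated by it, so black × texture = black. A toy sketch of that idea (not real GL, all names invented):

```java
// Toy model: immediate-mode color is global sticky state. A color set for one
// draw call still applies to everything drawn afterwards, including the next
// frame's textured quad, whose texels are multiplied by the current color.
public class StickyColorState {
    static float[] currentColor = {1f, 1f, 1f}; // GL's default color is white

    static void glColor3f(float r, float g, float b) {
        currentColor = new float[] {r, g, b};
    }

    // Texture modulated by black yields black, i.e. an "invisible" quad.
    static String drawTexturedQuad() {
        boolean black = currentColor[0] == 0f && currentColor[1] == 0f
                && currentColor[2] == 0f;
        return black ? "invisible (texture modulated to black)"
                     : "textured quad visible";
    }

    public static void main(String[] args) {
        System.out.println(drawTexturedQuad()); // frame 1: fine
        glColor3f(0f, 0f, 0f);                  // black untextured rectangle
        System.out.println(drawTexturedQuad()); // frame 2: quad disappears
        glColor3f(1f, 1f, 1f);                  // the fix: reset before texturing
        System.out.println(drawTexturedQuad()); // visible again
    }
}
```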

Difficulty drawing a background sprite using LWJGL

I'm trying to render a background image for a new game I'm creating. To do this, I thought I'd just create a simple quad and draw it first so that it stretched over the background of my game. The problem is that the quad doesn't draw at its correct size and appears at the completely wrong place on the screen. I am using LWJGL with the slick-util library for loading textures.
background = TextureHandler.getTexture("background", "png");
This is the line of code which basically gets my background texture using a class that I wrote using slick-util. I then bind the texture to a quad and draw it using glBegin() and glEnd() like this:
// Draw the background.
background.bind();
glBegin(GL_QUADS);
{
    glTexCoord2d(0.0, 0.0);
    glVertex2d(0, 0);
    glTexCoord2d(1.0, 0.0);
    glVertex2d(Game.WIDTH, 0);
    glTexCoord2d(1.0, 1.0);
    glVertex2d(Game.WIDTH, Game.HEIGHT);
    glTexCoord2d(0.0, 1.0);
    glVertex2d(0, Game.HEIGHT);
}
glEnd();
You'd expect this block to draw the quad so that it covered the entire screen, but it actually doesn't do this. It draws it in the middle of the screen, like so:
http://imgur.com/Xw9Xs9Z
The large, multicolored sprite that takes up the larger portion of the screen is my background, but it isn't taking up the full space like I want it to.
A few things I've tried:
Checking, double-checking, and triple-checking to make sure that the sprite's size and the window's size are identical
Resizing the sprite so that it is both larger and smaller than my target size. Nothing seems to change when I do this.
Positioning the sprite at different intervals or messing with the parameters of the glTexCoord2d() and glVertex2d(). This is just messy, and looks unnatural.
Why won't this background sprite draw at its correct size?
If you have not created your own orthographic projection (i.e. using glOrtho()), then your vertex coordinates need to range from -1 to +1. Right now only the part of your quad that falls inside that range (the square from (0,0) to (1,1)) is visible, which gives you this result.
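To keep using pixel coordinates, you can set up glOrtho(0, Game.WIDTH, Game.HEIGHT, 0, -1, 1) once before drawing. The mapping it applies can be sketched in plain Java (the 800x600 window size here is an assumption):

```java
// Sketch: how glOrtho(0, w, h, 0, -1, 1) maps pixel coordinates into the
// normalized device coordinates (-1..1) that the default projection expects.
public class OrthoMapping {
    static float toNdcX(float x, float w) { return 2f * x / w - 1f; }
    static float toNdcY(float y, float h) { return 1f - 2f * y / h; } // y is flipped

    public static void main(String[] args) {
        float w = 800f, h = 600f;
        // The pixel-space corners map to the full NDC square:
        System.out.println(toNdcX(0f, w) + ", " + toNdcY(0f, h)); // -1.0, 1.0
        System.out.println(toNdcX(w, w) + ", " + toNdcY(h, h));   // 1.0, -1.0
        // Without glOrtho, glVertex2d(Game.WIDTH, ...) is interpreted as NDC
        // directly, so x = 800 lies far outside the visible -1..1 range.
    }
}
```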

Android OpenGL ES 1: Filling display with a quad

I'm creating an app that only runs in landscape mode. I'm trying to create a background using a textured quad, although I'm not going to worry about texturing yet. I've been trying to simply draw a quad that fills the screen from drawOverlay(GL10 gl) with GL_DEPTH_TEST disabled, but whenever I do, the quad does not completely fill the screen and I can see bars of the glClearColor at the top and bottom.
Since I couldn't draw it using the modelview matrix I was using for all the other objects, I tried drawing it using gluOrtho2D and glOrthof, but neither worked. I don't really understand how the near and far clipping planes work with orthographic drawing. Whenever I tried glOrtho2D or glOrthof, the quad wasn't drawn at all (although the rest of the scene still rendered).
Here is my attempt at drawing using an orthographic matrix
private void drawOverlay(GL10 gl) {
    gl.glDisable(GL10.GL_DEPTH_TEST);
    gl.glMatrixMode(GL10.GL_PROJECTION);
    gl.glPushMatrix();
    gl.glLoadIdentity();
    GLU.gluOrtho2D(gl, 0f, 1f, 1f, 0f);
    gl.glMatrixMode(GL10.GL_MODELVIEW);
    gl.glLoadIdentity();
    background.draw(gl, 1.0f, 1.0f, 1.0f, 1.0f);
    gl.glEnable(GL10.GL_DEPTH_TEST);
    gl.glMatrixMode(GL10.GL_PROJECTION);
    gl.glPopMatrix();
    gl.glMatrixMode(GL10.GL_MODELVIEW);
}
I call that function from the beginning of onDrawFrame - before anything else is drawn:
public void onDrawFrame(GL10 gl) {
    drawOverlay(gl);
    gl.glLoadIdentity();
    //...
}
Here is how "background" is created:
background = new ShapePostcard(1f, 1f);
I'm fairly certain I'm not going to be able to get the quad to cover the screen using the normal modelview matrix, but basically all I was doing was drawing "background" in onDrawFrame before everything else with depth testing disabled.
Thanks for any support!
The easiest way to draw a quad that fills the screen is to set both the projection and modelview matrices to the identity, and then draw a mesh with coordinates from (-1,-1) to (1,1). Is that what you were drawing when you saw the borders?
I mean,
(x, y, width, height) = (-1, -1, 2, 2)
Note that OpenGL ES does not support quads, so you have to draw two triangles:
http://www.songho.ca/opengl/gl_vertexarray.html
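Since GL_QUADS is unavailable in OpenGL ES, the quad has to be submitted either as two triangles (GL_TRIANGLES) or as a four-vertex GL_TRIANGLE_STRIP. A sketch of the index orderings, assuming the corners are numbered TL=0, TR=1, BR=2, BL=3:

```java
// Sketch: splitting a quad (corners TL=0, TR=1, BR=2, BL=3) into primitives
// that OpenGL ES can draw. The orderings below are one common convention.
public class QuadToTriangles {
    // Six indices for GL_TRIANGLES: (TL, TR, BR) and (TL, BR, BL).
    static int[] asTriangles() { return new int[] {0, 1, 2, 0, 2, 3}; }

    // Four vertices for GL_TRIANGLE_STRIP: BL, TL, BR, TR gives the two
    // triangles (BL, TL, BR) and (TL, BR, TR), covering the whole quad.
    static int[] asTriangleStrip() { return new int[] {3, 0, 2, 1}; }

    public static void main(String[] args) {
        System.out.println(java.util.Arrays.toString(asTriangles()));     // [0, 1, 2, 0, 2, 3]
        System.out.println(java.util.Arrays.toString(asTriangleStrip())); // [3, 0, 2, 1]
    }
}
```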

How would I read a sprite sheet in LWJGL?

I currently use LWJGL Textures to draw images on the screen. I would like to read textures from a sprite sheet. I am using slick-util's TextureLoader class to load the textures.
I draw an LWJGL shape and bind a Texture onto it. For example, this is how I draw an image:
Texture texture = ResourceManager.loadTexture("Images/Tests/test.png");
GL11.glBegin(GL11.GL_QUADS);
{
    GL11.glTexCoord2f(0, 0);
    GL11.glVertex2f(0, 0);
    GL11.glTexCoord2f(0, texture.getHeight());
    GL11.glVertex2f(0, height);
    GL11.glTexCoord2f(texture.getWidth(), texture.getHeight());
    GL11.glVertex2f(width, height);
    GL11.glTexCoord2f(texture.getWidth(), 0);
    GL11.glVertex2f(width, 0);
}
GL11.glEnd();
I think one way would be to load the whole sprite sheet as the texture and give glTexCoord2f a sprite offset;
for example, one call would look like this:
GL11.glTexCoord2f(0+spriteXOffset, texture.getHeight()-spriteYOffset);
But I would really like to know if there is a simpler way, maybe extracting Textures from a single texture, for example like they do here:
Reading images from a sprite sheet Java
Just with Texture objects instead of BufferedImage.
Thank you for the help!
Texture coordinates for GL_TEXTURE_2D, used internally by the Slick texture loader, require normalized texture coordinates. That is, the coordinates range from 0.0 to 1.0. So (0,0) is the top-left corner of the texture, and (1,1) is the bottom-right corner. Assuming that you have your sprite coordinates in pixel coordinates at hand, you then have to divide the x coordinate by the texture width and the y coordinate by the texture height, resulting in normalized texture coordinates. You would then supply these coordinates to OpenGL using glTexCoord.
glTexCoord2f(spriteX / textureWidth, spriteY / textureHeight);
glVertex2f(coordinateX, coordinateY);
// Note the parentheses: add the sprite width before dividing.
glTexCoord2f((spriteX + spriteWidth) / textureWidth, spriteY / textureHeight);
glVertex2f(coordinateX2, coordinateY);
// Et cetera
There is, however, also an easier way of doing this. Take a look at this video (I created it), to see how you can use the pixel coordinates for textures instead of normalized ones.
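For a sheet of equal-sized cells, the normalization can be wrapped in a small helper; a sketch (the helper name and grid layout are assumptions, not part of Slick or LWJGL):

```java
// Sketch: normalized texture coordinates for cell (col, row) of a sprite
// sheet composed of equal-sized cells.
public class SpriteSheetCoords {
    // Returns {u0, v0, u1, v1} for the requested cell.
    public static float[] cell(int col, int row, int cellW, int cellH,
                               int sheetW, int sheetH) {
        float u0 = (float) (col * cellW) / sheetW;
        float v0 = (float) (row * cellH) / sheetH;
        return new float[] {u0, v0,
                            u0 + (float) cellW / sheetW,
                            v0 + (float) cellH / sheetH};
    }

    public static void main(String[] args) {
        // Cell (2, 1) of a 128x64 sheet with 32x32 cells
        float[] uv = cell(2, 1, 32, 32, 128, 64);
        System.out.println(uv[0] + " " + uv[1] + " " + uv[2] + " " + uv[3]);
        // prints: 0.5 0.5 0.75 1.0 — pass these to glTexCoord2f at the corners
    }
}
```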

Java OpenGL only blue Textures

I have a little problem with textures and OpenGL. I made a small .obj loader (with texture loading), but everything is drawn blue. For example:
I load a texture and bind it with GL11.glBindTexture(GL11.GL_TEXTURE_2D, textureId).
If I do:
glColor3f(1f, 1f, 1f);
glBegin(GL_QUADS);
glVertex3f(50f, 0, -50f);
glVertex3f(-50f, 0, -50f);
glVertex3f(-50f, 0, 50f);
glVertex3f(50f, 0, 50f);
glEnd();
GL11.glBindTexture(GL11.GL_TEXTURE_2D, textureId);
it draws a white quad ... but if I do:
GL11.glBindTexture(GL11.GL_TEXTURE_2D, textureId);
glColor3f(1f, 1f, 1f);
glBegin(GL_QUADS);
glVertex3f(50f, 0, -50f);
glVertex3f(-50f, 0, -50f);
glVertex3f(-50f, 0, 50f);
glVertex3f(50f, 0, 50f);
glEnd();
it draws a blue quad, and everything else is blue too.
Does anybody know a solution?
There is no texture because you didn't specify texture coordinates using glTexCoord2f.
The colors are wrong, probably due to incorrect parameters to glTexImage.
Everything else is blue because you are using the same texture for everything. Bind a different texture, or use the default texture 0.
The problem is that you are not calling glTexCoord2f for each vertex. Supply a texture coordinate with every vertex; without them, the texture is not mapped across the quad.
