I'm trying to draw a textured quad, and then on top of it a black rectangle that has no texture. I call glEnable(GL_TEXTURE_2D) before drawing the textured quad and glDisable(GL_TEXTURE_2D) before drawing the non-textured quad, but my textured quad only appears for a split second; after that I'm left with my clear color filling the entire screen and only the non-textured quad showing up. Why is this happening?
P.S.: I call glClear(GL_COLOR_BUFFER_BIT) once before doing any render work, not in between the render methods, so I don't think that's the problem.
Also, I can only see the non-textured quad even if I draw the textured quad on top of it.
I solved my problem. It turns out I wasn't setting a color before drawing the textured quad, so whenever I changed the color for the other quad, that color was applied to the textured quad as well. Setting the color before binding the texture fixed it.
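For reference, a minimal sketch of that ordering with LWJGL immediate-mode calls (this assumes static GL11 imports; texture, drawTexturedQuad() and drawBlackRectangle() are placeholders for your own objects and draw code):
// Reset the current color to opaque white before the textured quad,
// otherwise the last glColor call (the black one) tints the texture.
glColor3f(1f, 1f, 1f);
glEnable(GL_TEXTURE_2D);
texture.bind();
drawTexturedQuad();

// Untextured black rectangle on top.
glDisable(GL_TEXTURE_2D);
glColor3f(0f, 0f, 0f);
drawBlackRectangle();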
I'm trying to recreate a shadow effect on some 2D sprites in a project using Slick. To do this, I'm recolouring a copy of the sprite and stretching it using Slick OpenGL using this method:
public static void getStretched(Shape shape, Image image) {
TextureImpl.bindNone();
image.getTexture().bind();
SGL GL = Renderer.get();
GL.glEnable(SGL.GL_TEXTURE_2D);
GL.glBegin(SGL.GL_QUADS);
//topleft
GL.glTexCoord2f(0f, 0f);
GL.glVertex2f(shape.getPoints()[0], shape.getPoints()[1]);
//topright
GL.glTexCoord2f(0.5f, 0f);
GL.glVertex2f(shape.getPoints()[2], shape.getPoints()[3]);
//bottom right
GL.glTexCoord2f(1f, 1f);
GL.glVertex2f(shape.getPoints()[4], shape.getPoints()[5]);
//bottom left
GL.glTexCoord2f(0.5f, 1f);
GL.glVertex2f(shape.getPoints()[6], shape.getPoints()[7]);
GL.glEnd();
GL.glDisable(SGL.GL_TEXTURE_2D);
TextureImpl.bindNone();
}
This gives almost the desired effect, aside from the fact that the image is cropped a bit.
This becomes more extreme for higher distortions.
I'm new to using OpenGL, so some help regarding how to fix this would be great.
Furthermore, if I feed the method an image that was obtained using getSubImage, OpenGL renders the original image rather than the sub-image.
I'm unsure as to why this happens, as the sprite itself is taken from a spritesheet using getSubImage and has no problem rendering.
Help would be greatly appreciated!
I'm recolouring a copy of the sprite and stretching it
The issue is that you stretch the texture coordinates, but the region covered by the quad primitive stays the same. If the shadow exceeds the region covered by the quad, then it is cropped.
You have to "stretch" the vertex coordinates rather than the texture coordinates. Define a rhombic geometry for the shadow and wrap the texture on it:
float distortionX = ...;
GL.glEnable(SGL.GL_TEXTURE_2D);
GL.glBegin(SGL.GL_QUADS);
//topleft
GL.glTexCoord2f(0f, 0f);
GL.glVertex2f(shape.getPoints()[0] + distortionX, shape.getPoints()[1]);
//topright
GL.glTexCoord2f(1f, 0f);
GL.glVertex2f(shape.getPoints()[2] + distortionX, shape.getPoints()[3]);
//bottom right
GL.glTexCoord2f(1f, 1f);
GL.glVertex2f(shape.getPoints()[4], shape.getPoints()[5]);
//bottom left
GL.glTexCoord2f(0f, 1f);
GL.glVertex2f(shape.getPoints()[6], shape.getPoints()[7]);
GL.glEnd();
[...] if I feed an image into the method that was obtained using getSubImage, [...] the sprite itself is taken from a spritesheet [...]
The sprite covers just a small rectangular region of the entire texture. This region is defined by the texture coordinates. You have to use the same texture coordinates as when you draw the sprite itself.
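If I recall Slick's Image API correctly, it exposes that region through getTextureOffsetX(), getTextureOffsetY(), getTextureWidth() and getTextureHeight(). A sketch of the shadow quad using those values instead of the hard-coded 0..1 range (distortionX is the same placeholder as above) might look like this:
// Texture-coordinate region of the (sub-)image inside its texture.
// For a full image this is 0..1; for a getSubImage result it is smaller.
float tx = image.getTextureOffsetX();
float ty = image.getTextureOffsetY();
float tw = image.getTextureWidth();
float th = image.getTextureHeight();

GL.glBegin(SGL.GL_QUADS);
//top left (shifted by the distortion)
GL.glTexCoord2f(tx, ty);
GL.glVertex2f(shape.getPoints()[0] + distortionX, shape.getPoints()[1]);
//top right
GL.glTexCoord2f(tx + tw, ty);
GL.glVertex2f(shape.getPoints()[2] + distortionX, shape.getPoints()[3]);
//bottom right
GL.glTexCoord2f(tx + tw, ty + th);
GL.glVertex2f(shape.getPoints()[4], shape.getPoints()[5]);
//bottom left
GL.glTexCoord2f(tx, ty + th);
GL.glVertex2f(shape.getPoints()[6], shape.getPoints()[7]);
GL.glEnd();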
I'm trying to render a background image for a new game I'm creating. To do this, I thought I'd just create a simple quad and draw it first so that it stretches over the background of my game. The problem is that the quad doesn't draw at its correct size and appears in completely the wrong place on the screen. I am using LWJGL with the slick-util library added for loading textures.
background = TextureHandler.getTexture("background", "png");
This is the line of code which basically gets my background texture using a class that I wrote using slick-util. I then bind the texture to a quad and draw it using glBegin() and glEnd() like this:
// Draw the background.
background.bind();
glBegin(GL_QUADS);
{
glTexCoord2d(0.0, 0.0);
glVertex2d(0, 0);
glTexCoord2d(1.0, 0.0);
glVertex2d(Game.WIDTH, 0);
glTexCoord2d(1.0, 1.0);
glVertex2d(Game.WIDTH, Game.HEIGHT);
glTexCoord2d(0.0, 1.0);
glVertex2d(0, Game.HEIGHT);
}
glEnd();
You'd expect this block to draw the quad so that it covered the entire screen, but it actually doesn't do this. It draws it in the middle of the screen, like so:
http://imgur.com/Xw9Xs9Z
The large, multicolored sprite that takes up the larger portion of the screen is my background, but it isn't taking up the full space like I want it to.
A few things I've tried:
Checking, double-checking, and triple-checking to make sure that the sprite's size and the window's size are identical
Resizing the sprite so that it is both larger and smaller than my target size. Nothing seems to change when I do this.
Positioning the sprite at different intervals or messing with the parameters of the glTexCoord2d() and glVertex2d(). This is just messy, and looks unnatural.
Why won't this background sprite draw at its correct size?
If you have not created your own orthographic projection (e.g. with glOrtho()), then your vertex coordinates need to range from -1 to +1. Right now your quad spans 0 to Game.WIDTH/Game.HEIGHT, so only the small part of it that falls inside that -1 to +1 range is visible, which gives you this result.
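A minimal sketch of setting up such a pixel-based projection with LWJGL before any drawing (this assumes the same static GL11 imports as the code in the question):
// Set up a 2D projection where one unit equals one pixel,
// with (0, 0) in the top-left corner and y growing downwards.
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, Game.WIDTH, Game.HEIGHT, 0, -1, 1);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();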
How would I go about having both textured quads and untextured quads rendered? The issue is that the untextured quads do not show if a textured quad is drawn after them, and vice versa.
I'm using LWJGL.
Use glEnable(GL_TEXTURE_2D) before your textured quads, and glDisable(GL_TEXTURE_2D) before your untextured quads.
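A minimal sketch of that ordering in LWJGL immediate mode (myTexture and the draw methods are placeholders for your own code):
// Textured pass: enable texturing, bind, draw.
glEnable(GL_TEXTURE_2D);
myTexture.bind();
drawTexturedQuads();

// Untextured pass: disable texturing first, then draw.
glDisable(GL_TEXTURE_2D);
drawUntexturedQuads();
As in the first question above, also make sure the current color is white before the textured pass, or it will tint the texture.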
I have a little problem with textures and OpenGL. I made a small .obj loader (with texture loading), but everything is drawn blue. Example:
I load a texture and bind it with GL11.glBindTexture(GL11.GL_TEXTURE_2D, textureId).
If I do:
glColor3f(1f,1f,1f);
glBegin(GL_QUADS);
glVertex3f(50f,0,-50);
glVertex3f(-50f,0,-50f);
glVertex3f(-50f,0,50f);
glVertex3f(50f,0,50f);
glEnd();
GL11.glBindTexture(GL11.GL_TEXTURE_2D, textureId);
it draws a white quad ... but if I do:
GL11.glBindTexture(GL11.GL_TEXTURE_2D, textureId);
glColor3f(1f,1f,1f);
glBegin(GL_QUADS);
glVertex3f(50f,0,-50);
glVertex3f(-50f,0,-50f);
glVertex3f(-50f,0,50f);
glVertex3f(50f,0,50f);
glEnd();
it draws a blue quad and everything else is blue too.
Maybe somebody knows a solution?
There is no texture because you didn't specify texture coordinates using glTexCoord2f.
The colors are wrong probably because of incorrect parameters to glTexImage; a blue tint often means the red and blue channels are swapped (e.g. the pixel data is BGR but was uploaded as GL_RGB).
Everything else is blue because you are using the same texture for everything. Bind a different texture, or bind the default texture 0 where you don't want texturing.
This is the problem: you are not calling glTexCoord2f for each vertex. Supply a texture coordinate with every vertex; without them the texture is not mapped across the quad.
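A sketch of the quad from the question with per-vertex texture coordinates added, mapping the whole texture once across the quad (the particular 0/1 assignment just sets the orientation, adjust to taste):
GL11.glBindTexture(GL11.GL_TEXTURE_2D, textureId);
glColor3f(1f, 1f, 1f);
glBegin(GL_QUADS);
// Without an explicit glTexCoord2f, every vertex uses the same default
// texture coordinate, so the whole quad shows a single texel's color.
glTexCoord2f(1f, 0f); glVertex3f(50f, 0f, -50f);
glTexCoord2f(0f, 0f); glVertex3f(-50f, 0f, -50f);
glTexCoord2f(0f, 1f); glVertex3f(-50f, 0f, 50f);
glTexCoord2f(1f, 1f); glVertex3f(50f, 0f, 50f);
glEnd();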
I am using Ardor3D for a 3D application in Java. I can draw a quad to the screen with a texture mapped to it. Part of the texture image is transparent, and the quad background shows through there.
How do you make the quad itself transparent, so the rendered scene will show through?
If your texture format supports an alpha channel (png, tga, dds, etc. but not jpg) then you just also need a BlendState. Something like:
BlendState blend = new BlendState();
blend.setBlendEnabled(true);
myQuad.setRenderState(blend);
You may also want to place your quad into the Transparent render queue if your quad is partially transparent (alpha values between 0% and 100%) to get the correct sorting:
myQuad.getSceneHints().setRenderBucketType(RenderBucketType.Transparent);
There are other, non-alpha based ways to blend, but the above is usually what you want.
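Putting the two pieces together, a minimal sketch might look like the following (the import paths are from memory and may differ slightly between Ardor3D versions; myQuad is the quad from the question):
import com.ardor3d.renderer.queue.RenderBucketType;  // assumed package, check your version
import com.ardor3d.renderer.state.BlendState;        // assumed package, check your version

// Enable alpha blending on the quad so transparent texels of the
// texture let the rest of the scene show through.
final BlendState blend = new BlendState();
blend.setBlendEnabled(true);
myQuad.setRenderState(blend);

// Put the quad in the transparent bucket so it is sorted and drawn
// after the opaque geometry.
myQuad.getSceneHints().setRenderBucketType(RenderBucketType.Transparent);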