I have a little problem with textures and OpenGL. I made a small .obj loader (with texture loading), but everything is drawn blue. Example:
I load a texture and bind it with GL11.glBindTexture(GL11.GL_TEXTURE_2D, textureId).
If I do:
glColor3f(1f,1f,1f);
glBegin(GL_QUADS);
glVertex3f(50f,0,-50);
glVertex3f(-50f,0,-50f);
glVertex3f(-50f,0,50f);
glVertex3f(50f,0,50f);
glEnd();
GL11.glBindTexture(GL11.GL_TEXTURE_2D, textureId);
it draws a white quad ... but if I do:
GL11.glBindTexture(GL11.GL_TEXTURE_2D, textureId);
glColor3f(1f,1f,1f);
glBegin(GL_QUADS);
glVertex3f(50f,0,-50);
glVertex3f(-50f,0,-50f);
glVertex3f(-50f,0,50f);
glVertex3f(50f,0,50f);
glEnd();
it draws a blue quad and everything else is blue too.
Maybe somebody knows a solution?
There is no texture because you didn't specify texture coordinates using glTexCoord2f.
The colors are probably wrong due to incorrect parameters to glTexImage.
Everything else is blue because you are using the same texture for everything. Bind a different texture, or bind the default texture 0 when you want untextured drawing.
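For example, a blue tint like this is typical when .bmp/BGR pixel data is uploaded as if it were RGB, so red and blue end up swapped. Here is a sketch of an upload call with the format spelled out explicitly (width, height and pixelBuffer stand in for whatever your loader produces):
GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGBA,
                  width, height, 0,
                  GL12.GL_BGR,               // must match the actual layout of pixelBuffer
                  GL11.GL_UNSIGNED_BYTE, pixelBuffer);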
This is the problem: you are not calling glTexCoord2f for each vertex. Try adding a texture coordinate before every vertex; without them the texture cannot be mapped across the quad, so it will not show up correctly.
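A minimal sketch of the quad with per-vertex texture coordinates (assuming textureId is a valid, fully uploaded 2D texture and GL_TEXTURE_2D is enabled):
GL11.glBindTexture(GL11.GL_TEXTURE_2D, textureId);
glColor3f(1f, 1f, 1f);                    // white, so the texture is not tinted
glBegin(GL_QUADS);
glTexCoord2f(1f, 0f); glVertex3f( 50f, 0f, -50f);
glTexCoord2f(0f, 0f); glVertex3f(-50f, 0f, -50f);
glTexCoord2f(0f, 1f); glVertex3f(-50f, 0f,  50f);
glTexCoord2f(1f, 1f); glVertex3f( 50f, 0f,  50f);
glEnd();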
I'm trying to recreate a shadow effect on some 2D sprites in a project using Slick. To do this, I'm recolouring a copy of the sprite and stretching it using Slick OpenGL using this method:
public static void getStretched(Shape shape, Image image) {
    TextureImpl.bindNone();
    image.getTexture().bind();
    SGL GL = Renderer.get();
    GL.glEnable(SGL.GL_TEXTURE_2D);
    GL.glBegin(SGL.GL_QUADS);
    //top left
    GL.glTexCoord2f(0f, 0f);
    GL.glVertex2f(shape.getPoints()[0], shape.getPoints()[1]);
    //top right
    GL.glTexCoord2f(0.5f, 0f);
    GL.glVertex2f(shape.getPoints()[2], shape.getPoints()[3]);
    //bottom right
    GL.glTexCoord2f(1f, 1f);
    GL.glVertex2f(shape.getPoints()[4], shape.getPoints()[5]);
    //bottom left
    GL.glTexCoord2f(0.5f, 1f);
    GL.glVertex2f(shape.getPoints()[6], shape.getPoints()[7]);
    GL.glEnd();
    GL.glDisable(SGL.GL_TEXTURE_2D);
    TextureImpl.bindNone();
}
This gives almost the desired effect, aside from the fact that the image is cropped a bit.
The cropping becomes more extreme for higher distortions.
I'm new to using OpenGL, so some help regarding how to fix this would be great.
Furthermore, if I feed the method an image that was obtained using getSubImage, OpenGL renders the original image rather than the sub-image.
I'm unsure as to why this happens, as the sprite itself is taken from a spritesheet using getSubImage and has no problem rendering.
Help would be greatly appreciated!
I'm recolouring a copy of the sprite and stretching it
The issue is that you stretch the texture coordinates, but the region covered by the sprite stays the same. If the shadow exceeds the region covered by the quad primitive, it is cropped.
You have to "stretch" the vertex coordinates rather than the texture coordinates. Define a rhombic geometry for the shadow and wrap the texture on it:
float distortionX = ...;
GL.glEnable(SGL.GL_TEXTURE_2D);
GL.glBegin(SGL.GL_QUADS);
//topleft
GL.glTexCoord2f(0f, 0f);
GL.glVertex2f(shape.getPoints()[0] + distortionX, shape.getPoints()[1]);
//topright
GL.glTexCoord2f(1f, 0f);
GL.glVertex2f(shape.getPoints()[2] + distortionX, shape.getPoints()[3]);
//bottom right
GL.glTexCoord2f(1f, 1f);
GL.glVertex2f(shape.getPoints()[4], shape.getPoints()[5]);
//bottom left
GL.glTexCoord2f(0f, 1f);
GL.glVertex2f(shape.getPoints()[6], shape.getPoints()[7]);
GL.glEnd();
[...] if I feed an image into the method that was obtained using getSubImage, [...] the sprite itself is taken from a spritesheet [...]
The sprite covers just a small rectangular region of the entire texture. This region is defined by the texture coordinates. You have to use the same texture coordinates as when you draw the sprite itself.
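With Slick you should not have to compute those by hand: if I recall the API correctly, the Image already knows which part of the sheet it covers. A sketch of the shadow quad using the sub-image's own coordinates (check these method names against your Slick version):
float tx = image.getTextureOffsetX();   // left edge of the sub-image within the sheet (normalized)
float ty = image.getTextureOffsetY();   // top edge (normalized)
float tw = image.getTextureWidth();     // normalized width of the sub-image
float th = image.getTextureHeight();    // normalized height
// top left
GL.glTexCoord2f(tx, ty);
GL.glVertex2f(shape.getPoints()[0] + distortionX, shape.getPoints()[1]);
// top right
GL.glTexCoord2f(tx + tw, ty);
GL.glVertex2f(shape.getPoints()[2] + distortionX, shape.getPoints()[3]);
// bottom right
GL.glTexCoord2f(tx + tw, ty + th);
GL.glVertex2f(shape.getPoints()[4], shape.getPoints()[5]);
// bottom left
GL.glTexCoord2f(tx, ty + th);
GL.glVertex2f(shape.getPoints()[6], shape.getPoints()[7]);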
I'm trying to draw a textured quad, and then on top of that a black rectangle that has no texture. I call glEnable(GL_TEXTURE_2D) before I draw the textured quad and glDisable(GL_TEXTURE_2D) before I draw the non-textured quad, but my textured quad only appears for a split second; after that I'm left with my clear color filling the entire screen and only the non-textured quad showing up. Why is this happening?
P.S.: I call glClear(GL_COLOR_BUFFER_BIT) before doing any render work, not in between the render methods, so that shouldn't be the problem, I think.
Also, I can only see the non-textured quad even if I draw the textured quad on top of it.
I solved my problem. It turns out I didn't set a color before drawing the textured quad, so whenever I changed the color of the other quad, that color applied to the textured quad as well. Setting the color before binding the texture fixed it.
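In other words, something along these lines (a sketch of the ordering, not my exact code; the two draw calls are placeholders):
// Reset the current color before the textured quad so it is not tinted by
// whatever color the untextured quad used.
glColor3f(1f, 1f, 1f);
glEnable(GL_TEXTURE_2D);
texture.bind();
drawTexturedQuad();       // placeholder for the textured draw calls
glDisable(GL_TEXTURE_2D);
glColor3f(0f, 0f, 0f);    // now the black color only affects the untextured quad
drawBlackRectangle();     // placeholder for the untextured draw calls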
I need someone to help me with this. I can't really figure out how the alpha channel or blending works. The image is in .bmp format and it loads perfectly.
But I want a specific color to be transparent; in Paint that color is R255, G0, B255 (magenta).
I've been searching for similar topics, but everything I found or tried just seems to mess stuff up, like giving me a completely black screen or giving everything a magenta tint...
Everything else works just fine with my code... Should I maybe switch to .png? Could that solve the issue? Are there any pros or cons to using png versus bmp?
// Initialization Code OpenGL
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, HEIGHT, WIDTH, 0, 1, -1);
glMatrixMode(GL_MODELVIEW);
glEnable(GL_TEXTURE_2D);
glClearColor(1.0f, 0.0f, 1.0f, 0.0f);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
while (!Display.isCloseRequested()) {
    // Render
    glClear(GL_COLOR_BUFFER_BIT);
    this.drawMapToScreen();
    this.drawCreatureToScreen();
    Display.update();
    Display.sync(60);
}
Display.destroy();
}
private void drawCreatureToScreen() {
    Texture tex3 = SpriteLoader.loadTexture("player");
    Color.magenta.bind();
    tex3.bind();
    glBegin(GL_QUADS);
    glTexCoord2f(0, 0);
    glVertex2i((HEIGHT-32)/2, (WIDTH-32)/2);
    glTexCoord2f(1, 0);
    glVertex2i((HEIGHT-32)/2 + 32, (WIDTH-32)/2);
    glTexCoord2f(1, 1);
    glVertex2i((HEIGHT-32)/2 + 32, (WIDTH-32)/2 + 32);
    glTexCoord2f(0, 1);
    glVertex2i((HEIGHT-32)/2, (WIDTH-32)/2 + 32);
    glEnd();
}
What you are looking to do is this: whenever a pixel's color is 1.0f, 0.0f, 1.0f (in the eyes of OpenGL: full red, no green, full blue), have the alpha (transparency) channel be zero, so that the pixel is drawn completely transparent. Unfortunately, there is no way to do this in OpenGL unless you use shaders, and believe me, using shaders is far messier than the solution I propose: giving the image a real alpha channel in your image editor.
What this means: it doesn't depend on which format you use; .png and .bmp are both fine as long as the file actually carries an alpha channel (in practice, .png handles this far more reliably). Alpha is transparency. In OpenGL you'll be dealing with floats a lot, so I'll use floats here. Alpha is the fourth color channel: we have Red, Green, Blue, and then Alpha. Alpha controls transparency by being multiplied with the other channels: if alpha is 0.0f, that color is fully transparent; if alpha is 1.0f, it is fully opaque.
To sum up: in your editor, make sure that the area you want transparent really is transparent. For alpha blending, which still must be enabled for any transparency whatsoever, the generally preferred blend mode is as follows:
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
Now, blending is a technique for making objects appear transparent, among other things. In your case, you'll want to use the blend function above; use glDisable(GL_BLEND) to turn blending off again once you no longer need it. The blend function above mixes the color already in the framebuffer with the color being rendered, so that an object rendered on top of (and after) another one appears transparent relative to the object underneath it.
TL;DR:
Make sure the area you want transparent is transparent in your image editor. OpenGL cannot make specific colors have alpha values of 0.0f unless you use a needlessly complex shader.
Use the blending setup glEnable(GL_BLEND); glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); to enable blending. Make sure the top object is rendered after the bottom object.
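Applied to the render loop from your question, that ordering would look roughly like this (a sketch; it assumes drawMapToScreen draws only opaque geometry):
glClear(GL_COLOR_BUFFER_BIT);
this.drawMapToScreen();                 // opaque background first
glEnable(GL_BLEND);                     // blending only for the transparent sprite
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
this.drawCreatureToScreen();            // transparent texels now show the map behind
glDisable(GL_BLEND);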
So I am developing a small Pong replica simply for some practice with LWJGL. Since there is no easy way to write customizable text in LWJGL, I am using textures for the start button, other buttons, and so on. However, when I drew the texture, it turned out to be discolored on my purple background. If I change the background to white, there is no discoloration. Help?
Also, my start button texture is something like 50x22, but I put it on a 64x64 image because Slick can only load resolutions that are a power of two. I adjusted the rectangle being drawn so that it is not warped, and the rest of the image is transparent, so it shouldn't be visible once I sort out the above problem. Are there any alternatives to my method?
This is where I initialize my OpenGL stuff:
public static void setCamera()
{
    glClear(GL_COLOR_BUFFER_BIT);
    glEnable(GL_TEXTURE_2D);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, width, 0, height, -1, 1);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
}
And this is where I draw the texture:
public void draw()
{
    logic();
    glPushMatrix();
    glTranslated(x, y, 0);
    texture.bind();
    glBegin(GL_QUADS);
    glTexCoord2d(0, 1); glVertex2d(0, -196.875);
    glTexCoord2d(1, 1); glVertex2d(width + 18.75, -196.875);
    glTexCoord2d(1, 0); glVertex2d(width + 18.75, height);
    glTexCoord2d(0, 0); glVertex2d(0, height);
    glEnd();
    glPopMatrix();
}
Thanks :)
As discussed in the comments, your initial problem was that you had neglected to reset the "current color" before drawing your texture. GL is a glorified state machine; it will continue to use the color you set for every other draw operation, so setting glColor3d (...) when you drew your background also affects your foreground image.
Adding the following before drawing your textured quad will fix this problem:
glColor3f (1.0f, 1.0f, 1.0f);
However, you have brought up a new issue in your comments related to blending. This question boils down to a lack of a blending function. By default when you draw something in OpenGL it will merely overwrite anything else in the framebuffer.
What you need for transparency to work is to enable blending and use this blend function:
glBlendFunc (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
This effectively does the following:
NewColor = (NewColor.RGB * NewColor.A) + OldColor.RGB * (1.0 - NewColor.A)
With this, the parts of your texture that have an alpha value != 1.0 will be a mix of whatever was already in the framebuffer and what you just drew.
Remember to enable blending before you draw your transparent objects:
glEnable (GL_BLEND);
and disable it when you do not need it:
glDisable (GL_BLEND);
Lastly, you should be aware that the order you draw translucent objects in OpenGL is pretty important. Opaque objects (such as your background) need to be drawn first. In general, you need to draw things from back-to-front in order for alpha blending to function correctly. This is the opposite order you would ideally want for opaque geometry (since hardware can skip shading for obstructed objects if you draw objects in front of them first).
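Putting the color reset and the blending together, the draw() method from your question would look roughly like this (a sketch; the vertex values are kept from the original, and the blend function could also be set once at init time):
public void draw()
{
    logic();
    glPushMatrix();
    glTranslated(x, y, 0);
    glColor3f(1.0f, 1.0f, 1.0f);   // undo any color set while drawing the background
    glEnable(GL_BLEND);            // let the transparent parts of the 64x64 image blend away
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    texture.bind();
    glBegin(GL_QUADS);
    glTexCoord2d(0, 1); glVertex2d(0, -196.875);
    glTexCoord2d(1, 1); glVertex2d(width + 18.75, -196.875);
    glTexCoord2d(1, 0); glVertex2d(width + 18.75, height);
    glTexCoord2d(0, 0); glVertex2d(0, height);
    glEnd();
    glDisable(GL_BLEND);
    glPopMatrix();
}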
I currently use LWJGL Textures to draw images on the screen. I would like to read Textures from a sprite sheet. I am using Slick's TextureLoader class to load the textures.
I draw an LWJGL Shape and bind a Texture onto it.
For example, this is how I draw an image:
Texture texture = ResourceManager.loadTexture("Images/Tests/test.png");
GL11.glBegin(GL11.GL_QUADS);
{
    GL11.glTexCoord2f(0, 0);
    GL11.glVertex2f(0, 0);
    GL11.glTexCoord2f(0, texture.getHeight());
    GL11.glVertex2f(0, height);
    GL11.glTexCoord2f(texture.getWidth(), texture.getHeight());
    GL11.glVertex2f(width, height);
    GL11.glTexCoord2f(texture.getWidth(), 0);
    GL11.glVertex2f(width, 0);
}
GL11.glEnd();
I think there is a way where, when calling glTexCoord2f, I could give it a sprite offset and load the whole sprite sheet as the texture instead.
For example, one call would look like this:
GL11.glTexCoord2f(0+spriteXOffset, texture.getHeight()-spriteYOffset);
But I would really like to know if there is a simpler way, for example extracting Textures from a single texture like they do here:
Reading images from a sprite sheet Java
Just with Texture objects instead of BufferedImage.
Thank you for the help!
GL_TEXTURE_2D, which the Slick texture loader uses internally, expects normalized texture coordinates. That is, the coordinates range from 0.0 to 1.0, so (0, 0) is the top-left corner of the texture and (1, 1) is the bottom-right corner (with the usual image-origin convention). Assuming you have the sprite's pixel coordinates at hand, divide the x coordinate by the texture width and the y coordinate by the texture height to get normalized texture coordinates, and supply those to OpenGL with glTexCoord.
// Cast to float (or keep the sizes as floats) so the division is not truncated:
glTexCoord2f((float) spriteX / textureWidth, (float) spriteY / textureHeight);
glVertex2f(coordinateX, coordinateY);
glTexCoord2f((float) (spriteX + spriteWidth) / textureWidth, (float) spriteY / textureHeight);
glVertex2f(coordinateX2, coordinateY);
// Et cetera
There is, however, also an easier way of doing this. Take a look at this video (I created it) to see how you can use pixel coordinates for textures instead of normalized ones.
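If you want to stick with Slick Texture objects, a small helper along these lines should also work. The method name and parameters are made up for illustration; it assumes the whole sheet is loaded into one Texture and that getTextureWidth()/getTextureHeight() report the full texture size in pixels (check your Slick version):
public static void drawSprite(Texture sheet,
                              float spriteX, float spriteY,   // pixel position of the sprite in the sheet
                              float spriteW, float spriteH,   // pixel size of the sprite
                              float x, float y) {             // where to draw on screen
    float texW = sheet.getTextureWidth();    // full sheet size in pixels
    float texH = sheet.getTextureHeight();
    float u0 = spriteX / texW, v0 = spriteY / texH;
    float u1 = (spriteX + spriteW) / texW, v1 = (spriteY + spriteH) / texH;
    sheet.bind();
    GL11.glBegin(GL11.GL_QUADS);
    GL11.glTexCoord2f(u0, v0); GL11.glVertex2f(x, y);
    GL11.glTexCoord2f(u0, v1); GL11.glVertex2f(x, y + spriteH);
    GL11.glTexCoord2f(u1, v1); GL11.glVertex2f(x + spriteW, y + spriteH);
    GL11.glTexCoord2f(u1, v0); GL11.glVertex2f(x + spriteW, y);
    GL11.glEnd();
}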