OpenGL: 2D overlay is white over 3D scene - Java

I'm trying to make a copy of Minecraft in Java using OpenGL (LWJGL). The problem I'm facing is that everything in my 2D overlay (the aiming cross in the middle, menus, etc.) comes out white. The 3D part of the game works great: every cube has a texture on each side.
But when I try to draw the overlay, as I said, every texture is white, though I can still see its shape (because it has transparent areas). I'll add a picture of it.
(This is supposed to be the inventory)
As you can see, the overlay is completely white. It should look like this:
I've been searching the web for hours and can't seem to find a solution. It's driving me crazy. I've already looked for instructions on how to create a 2D overlay over a 3D scene, but they don't help either. So I thought I'd give Stack Overflow a try.
Hopefully someone can help me?
Thanks for reading my question and for the (hopefully coming) answers!
Martijn
Here is the code:
Initialising OpenGL
public void initOpenGL() throws IOException
{
    // init OpenGL
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, 800, 600, 0, 1, 300);
    glMatrixMode(GL_MODELVIEW);

    float color = 0.9f;
    glClearColor(color, color, color, color);

    glEnable(GL_TEXTURE_2D);
    glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);
    glShadeModel(GL_FLAT);
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LEQUAL);
    glEnable(GL_LINE_SMOOTH);
    glEnable(GL_CULL_FACE);

    glEnable(GL_FOG);
    glFog(GL_FOG_COLOR, MineCraft.wrapDirect(color, color, color, 1.0f));
    glFogi(GL_FOG_MODE, GL_LINEAR);
    glFogf(GL_FOG_START, _configuration.getViewingDistance() * 0.8f);
    glFogf(GL_FOG_END, _configuration.getViewingDistance());
    glFogi(NVFogDistance.GL_FOG_DISTANCE_MODE_NV, NVFogDistance.GL_EYE_RADIAL_NV);
    glHint(GL_FOG_HINT, GL_NICEST);

    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
}
Configuring the matrices for drawing the overlay (out of inspiration, I literally copied all the OpenGL calls for this method from BlockMania, another open-source Minecraft clone, which works great):
public void renderOverlay()
{
    glMatrixMode(GL_PROJECTION);
    glPushMatrix();
    glLoadIdentity();
    GLU.gluOrtho2D(0, conf.getWidth(), conf.getHeight(), 0);
    glMatrixMode(GL_MODELVIEW);
    glEnable(GL_COLOR_MATERIAL);
    glPushMatrix();
    glLoadIdentity();

    glDisable(GL_CULL_FACE);
    glDisable(GL_DEPTH_TEST);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    /** RENDER **/
    if (_activatedInventory != null)
    {
        _activatedInventory.renderInventory();
    }

    glDisable(GL_BLEND);
    glEnable(GL_DEPTH_TEST);
    glEnable(GL_CULL_FACE);

    glPopMatrix();
    glMatrixMode(GL_PROJECTION);
    glPopMatrix();
    glMatrixMode(GL_MODELVIEW);
}
Drawing the texture itself:
public void renderInventory()
{
    Configuration conf = Game.getInstance().getConfiguration();
    glTranslatef(conf.getWidth() / 2.0f, conf.getHeight() / 2.0f, 0.0f);

    glEnable(GL_TEXTURE_2D);
    Texture tex = TextureStorage.getTexture("gui.inventory");
    tex.bind(); // org.newdawn.slick (same library for my whole program, so this works)

    float hw = 170; // half width
    float hh = 163; // half height

    Vector2f _texPosUpLeft = new Vector2f(3, 0);
    Vector2f _texPosDownRight = new Vector2f(_texPosUpLeft.x + hw, _texPosUpLeft.y + hh);
    _texPosUpLeft.x /= tex.getTextureWidth();
    _texPosUpLeft.y /= tex.getTextureHeight();
    _texPosDownRight.x /= tex.getTextureWidth();
    _texPosDownRight.y /= tex.getTextureHeight();

    glColor3f(1, 1, 1); // Changing this has no effect

    glBegin(GL_QUADS);
    glTexCoord2f(_texPosUpLeft.x, _texPosUpLeft.y);
    glVertex2f(-hw, -hh);
    glTexCoord2f(_texPosDownRight.x, _texPosUpLeft.y);
    glVertex2f(hw, -hh);
    glTexCoord2f(_texPosDownRight.x, _texPosDownRight.y);
    glVertex2f(hw, hh);
    glTexCoord2f(_texPosUpLeft.x, _texPosDownRight.y);
    glVertex2f(-hw, hh);
    glEnd();
}
(The texture pack I'm using is CUBISM1.00)

I found it!!
It was the fog. For one reason or another, it looks like OpenGL considers the overlay out of sight and gives it the color of the fog. So disabling the fog before rendering the overlay solved it:
glDisable(GL_FOG);
/* Render overlay here */
glEnable(GL_FOG);
If there are still people reading this: is this caused by matrix abuse, or is this behaviour normal?
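This is most likely normal fixed-function behaviour rather than matrix abuse: fog is applied per fragment based on eye-space distance, regardless of which projection matrix is active. With GL_FOG_DISTANCE_MODE_NV set to GL_EYE_RADIAL_NV and an identity modelview matrix, the overlay's eye-space coordinates are simply its pixel coordinates, which presumably land hundreds of units from the eye and therefore past GL_FOG_END, so every overlay fragment gets the full fog color (the 0.9 grey set in initOpenGL). As an alternative to pairing glDisable/glEnable by hand, the attribute stack can save and restore all enable flags around the overlay pass. A minimal sketch, assuming a hypothetical renderOverlayContents() standing in for the actual drawing calls:

glPushAttrib(GL_ENABLE_BIT);   // saves GL_FOG, GL_DEPTH_TEST, GL_CULL_FACE, ...
glDisable(GL_FOG);
glDisable(GL_DEPTH_TEST);
glDisable(GL_CULL_FACE);

renderOverlayContents();       // hypothetical: whatever draws the 2D overlay

glPopAttrib();                 // restores the previous enable state in one call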

Related

OpenGL GlOrtho no primitives appearing

I am drawing on a texture and then rendering this texture to the screen using GL_QUADS. There is no problem drawing the texture on screen, but when drawing to the texture, the only operation that has any effect is glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT) - this sets the whole texture to the colour I want. It is, however, impossible to draw any primitives.
The following code should absolutely cover the whole texture with black, right?
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, texRenderer.framebufferID); // switch to the texture framebuffer
glClearColor(1.0f,0.5f,0.0f,1.0f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glDisable(GL_DEPTH_TEST);
glDisable(GL_SCISSOR_TEST);
glDisable(GL_CULL_FACE);
glMatrixMode(GL_PROJECTION);
glPushMatrix();
glLoadIdentity();
glOrtho(0, texRenderer.pixelsWidth, 0, texRenderer.pixelsHeight, 1, -1);
// switch to modelview matrix before rendering objects
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glColor4f(0,0,0,1);
glBegin(GL_QUADS);
glVertex2f(-100, -100);
glVertex2f(-100, 100);
glVertex2f(100, 100);
glVertex2f(100, -100);
glEnd();
glMatrixMode(GL_PROJECTION);
glPopMatrix();
glMatrixMode(GL_MODELVIEW);
//RenderScene(sim, stateMachine, viewSettings, commandMgr, font, texRenderer.pixelsWidth, texRenderer.pixelsHeight, isMainView, drawText);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0); // switch back to normal framebuffer
Instead it is orange.
pixelsWidth and pixelsHeight are both equal to 64. I've made sure the viewport is set up correctly. Also glGetError() returns 0.
What have I missed?
It took me 6 hours, but the problem is solved. I was wrong: the viewport was not set properly. Before rendering to a texture, be sure to add the following line:
glViewport(0, 0, textureWidth, textureHeight);
And also unbind any currently bound textures using
glBindTexture(GL_TEXTURE_2D, 0);
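Putting both fixes together, the start of the render-to-texture pass might look like this (a sketch reusing the texRenderer fields from the question; windowWidth/windowHeight are assumed names for the on-screen size):

glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, texRenderer.framebufferID);
glBindTexture(GL_TEXTURE_2D, 0); // never sample from the texture being rendered to
glViewport(0, 0, texRenderer.pixelsWidth, texRenderer.pixelsHeight); // match the texture, not the window

glClearColor(1.0f, 0.5f, 0.0f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
// ... draw primitives as in the question ...

glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
glViewport(0, 0, windowWidth, windowHeight); // restore the on-screen viewport (assumed names)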

Problems with OpenGL Rotation and Scale

I just started trying out some OpenGL with Java. I downloaded LWJGL and the Slick-Util library. Now I'm trying to paint some images on the screen, which is working quite well. But I have two big problems:
When I rotate my image by about 45°, you can see bits of the same image at the corners, as if it were a sprite sheet with the same image repeated, and those copies get rotated too.
How do I scale my image? It's pretty small, and the glScale() function scales the image itself, but not the space where it's drawn. So if the image has a size of 16x16 pixels and I scale it up, I just see part of the scaled image in those 16x16 pixels.
Here's my code for the OpenGL:
public class Widget {
    String name;
    int angle;
    public Texture image_texture;
    public String image_path;
    public int image_ID;
    public int cord_x = 0;
    public int cord_y = 0;
    static LinkedList<Widget> llwidget = new LinkedList<Widget>();

    public Widget(String path) {
        llwidget.add(this);
        image_path = path;
        try {
            image_texture = TextureLoader.getTexture("PNG", ResourceLoader.getResourceAsStream(image_path), GL_NEAREST);
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
        image_ID = image_texture.getTextureID();

        glEnable(GL_TEXTURE_2D);
        glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
        glViewport(0, 0, Display.getWidth(), Display.getHeight());
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glOrtho(0, Display.getWidth(), Display.getHeight(), 0, 1, -1);
        glMatrixMode(GL_MODELVIEW);
    }

    void render() {
        glMatrixMode(GL_TEXTURE);
        glLoadIdentity();
        glTranslatef(0.5f, 0.5f, 0.0f);
        //! Graphics bug
        glRotatef(angle, 0.0f, 0.0f, 1.0f);
        //
        glTranslatef(-0.5f, -0.5f, 0.0f);
        //! Doesn't work
        glScalef(2f, 2f, 2f);
        //
        glMatrixMode(GL_MODELVIEW);

        Color.white.bind();
        image_texture.bind();
        glBegin(GL_QUADS);
        glTexCoord2f(0, 0);
        glVertex2f(cord_x, Display.getHeight() - cord_y);
        glTexCoord2f(1, 0);
        glVertex2f(cord_x + image_texture.getTextureWidth(), Display.getHeight() - cord_y);
        glTexCoord2f(1, 1);
        glVertex2f(cord_x + image_texture.getTextureWidth(), Display.getHeight() - cord_y + image_texture.getTextureHeight());
        glTexCoord2f(0, 1);
        glVertex2f(cord_x, Display.getHeight() - cord_y + image_texture.getTextureHeight());
        glEnd();
    }
}
Question 1:
Calling glScalef(2f, 2f, 2f) in GL_TEXTURE mode results in a zoom of the texture inside a quad whose size is determined by your glVertex2f calls. This can lead to unwanted artifacts at the edges of the quad.
Question 2:
The main problem here is that the program is not in a state that transforms your quad coordinates when glScalef(2f, 2f, 2f) is called. After calling glMatrixMode(GL_TEXTURE), all following matrix operations affect the texture matrix. To get a zoom effect on the quad you draw with glVertex2f, you need to switch to GL_MODELVIEW mode before calling glScalef.
Alternative:
With glVertex2f you determine the coordinates at which the quad is drawn. The parameters of glVertex2f are the coordinates of the vertices, so altering those parameters would also do the job.
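For the scaling problem, a minimal sketch of what render() could look like with the rotation and scale moved to the modelview matrix (assuming the same Widget fields as above, and that the constructor's glOrtho puts the origin at the top-left so cord_x/cord_y can be used directly):

void render() {
    // Keep the texture matrix at identity; transform the quad instead.
    glMatrixMode(GL_TEXTURE);
    glLoadIdentity();
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();

    int w = image_texture.getTextureWidth();
    int h = image_texture.getTextureHeight();

    // Rotate and scale around the quad's own center.
    glTranslatef(cord_x + w / 2f, cord_y + h / 2f, 0f);
    glRotatef(angle, 0f, 0f, 1f);
    glScalef(2f, 2f, 1f); // now scales the quad, not the texture inside it
    glTranslatef(-w / 2f, -h / 2f, 0f);

    Color.white.bind();
    image_texture.bind();
    glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2f(0, 0);
    glTexCoord2f(1, 0); glVertex2f(w, 0);
    glTexCoord2f(1, 1); glVertex2f(w, h);
    glTexCoord2f(0, 1); glVertex2f(0, h);
    glEnd();

    glPopMatrix();
}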

2D overlay over 3D scene not working

I am attempting to create a 2D overlay over a 3D scene. I have tried all the solutions I can find on GameDev and Stack Overflow, but none of them seem to work!
My current code:
static void ready3D()
{
    glViewport(0, 0, Display.getWidth(), Display.getHeight());
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    GLU.gluPerspective(45, (float) Display.getWidth() / Display.getHeight(), 0.1f, 5000.0f);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glDepthFunc(GL_LEQUAL);
    glEnable(GL_DEPTH_TEST);
}

static void ready2D()
{
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    GLU.gluOrtho2D(0.0f, Display.getWidth(), Display.getHeight(), 0.0f);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glTranslatef(0.375f, 0.375f, 0.0f);
    glDisable(GL_DEPTH_TEST);
}
And,
glPushMatrix();
// Overlay start - this is in my render method BTW.
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
ready2D();
glBegin(GL_QUADS);
glColor3f(1, 1, 1);
glVertex2f(0, 0); // bottom-left
glVertex2f(1, 1); // top-right
glVertex2f(1, 0); // bottom-right
glVertex2f(0, 1); // top-left
glEnd();
glPopMatrix();
ready3D();
ready3D();
However, the 2D box I am trying to draw does not appear! Obviously I eventually hope to have complicated objects/icons on an overlay, but first things first.
The 3D world still draws totally fine.
Is anyone able to tell me what I am doing wrong?
The order of the vertices in the 2D quad looks a bit suspicious to me. If I'm reading it rightly, it's going: bottom-left, top-right, bottom-right, top-left, which results in a sort of cross rather than a quad.
Try this:
glVertex2f(0, 0); // bottom-left
glVertex2f(0, 1); // top-left
glVertex2f(1, 1); // top-right
glVertex2f(1, 0); // bottom-right
Note the clockwise ordering which I believe OpenGL is expecting by default for front-facing quads.
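Two hedged footnotes to that (neither is from the original thread): winding only matters while GL_CULL_FACE is enabled, and OpenGL's actual default is to treat counter-clockwise window-space winding as front-facing (GL_CCW); the flipped ortho projection is what makes the order above come out front-facing. Also, since gluOrtho2D(0, width, height, 0) makes one unit equal one pixel, a quad from (0, 0) to (1, 1) covers a single pixel and is easy to miss. A quick way to rule both out:

glDisable(GL_CULL_FACE); // winding can no longer hide the quad
glBegin(GL_QUADS);
glColor3f(1, 1, 1);
glVertex2f(0, 0);        // top-left in the flipped ortho
glVertex2f(0, 100);      // 100 px instead of 1 px
glVertex2f(100, 100);
glVertex2f(100, 0);
glEnd();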

My Engine flickers when I try to render 2D and 3D

I am working with LWJGL to make a game. It's very basic. Before even implementing any sort of GPU rendering or fancy model loaders, I wanted to make sure I could at least render 2D and 3D at the same time; my game has a GUI while you walk around. Or at least, it's supposed to. Here is my initialization code; the flickering does not happen when I only render 3D.
public void clearGL() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
}

public void init3D() {
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective((float) 68, Engine.size[0] / Engine.size[1], 0.3f, 1000);
    glMatrixMode(GL_MODELVIEW);
    glEnable(GL_TEXTURE_2D);
    glShadeModel(GL_SMOOTH);
    glClearColor(0.0f, 0.0f, 0.0f, 0.5f);
    glClearDepth(1.0f);
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LEQUAL);
}

public void render3D(Camera c) {
    init3D();
    clearGL();
    // Do translations here
    glTranslatef(0f, -5f, 0f);
    glColor3f(0, 1, 0);
    glBegin(GL_QUADS);
    glVertex3f(-50f, 0f, -50f);
    glColor3f(0, 0, 1);
    glVertex3f(50f, 0f, -50f);
    glColor3f(1, 0, 0);
    glVertex3f(50f, 0f, 50f);
    glColor3f(0, 1, 1);
    glVertex3f(-50f, 0f, 50f);
    glEnd();
}

public void init2D() {
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, Engine.size[0], 0, Engine.size[1], -1, 1);
    glMatrixMode(GL_MODELVIEW);
    glEnable(GL_TEXTURE_2D);
    glLoadIdentity();
}

public void render2D() {
    init2D();
    glBegin(GL_QUADS);
    glVertex2f(0f, 50f);
    glVertex2f(50f, 50f);
    glVertex2f(50f, 0f);
    glVertex2f(0f, 0f);
    glPopMatrix();
}
I can tell it's rendering at all because I am drawing a quad to represent the floor in JBullet. For some reason it is above the camera's head, but when I translate the camera up towards it, it gets further away, which is why I translated the camera to -5. That's another problem, for another day.
You should really consider disabling the depth test when you "switch" from 3D to 2D if you are going to draw at Z=0 (middle of your depth range). Half of the visible space in your 3D scene will potentially obstruct your 2D drawing if you do not do this. Alternatively, you could replace your glVertex2f (...) calls with glVertex3f (x,y, -1.0) to bring everything in 2D to the very front of the depth range.
But the really weird thing about all of this is the end of your render2D (...) function: you never call glEnd (...), and you pop a matrix that you never appear to have pushed. That's two sources of mismatched weirdness; either one of them could be causing your problem.
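Putting those points together, a corrected render2D() might look like this sketch (illustrative, not the asker's final code): depth testing is disabled so the 3D scene cannot obstruct the overlay, the missing glEnd is added, and the unmatched glPopMatrix is dropped:

public void render2D() {
    init2D();
    glDisable(GL_DEPTH_TEST); // the 3D pass left depth testing enabled
    glBegin(GL_QUADS);
    glVertex2f(0f, 50f);
    glVertex2f(50f, 50f);
    glVertex2f(50f, 0f);
    glVertex2f(0f, 0f);
    glEnd();                  // was missing in the original
    // no glPopMatrix() here: nothing was pushed in this method
}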

Android - OpenGL - Emulator vs Actual Device

I am writing a game which uses OpenGL ES. I have created my renderer class and have a sample of my game working on the emulator; however, none of the textures display on an actual device. I have read that the most common cause for this is the need for textures to have power-of-two dimensions, but I have tried drawing a square (128x128) with a texture of the same size mapped to it, and this only shows on the emulator. Further to that, my actual game will be using rectangles, so I'm unsure how I can map square textures to rectangles.
This is my code so far (the game is 2D, so I'm using ortho mode):
EDIT: I have updated my code; it is now correctly binding textures and using textures of size 128x128, but I'm still only seeing textures on the emulator.
public void onSurfaceCreated(GL10 gl, EGLConfig config)
{
    byteBuffer = ByteBuffer.allocateDirect(shape.length * 4);
    byteBuffer.order(ByteOrder.nativeOrder());
    vertexBuffer = byteBuffer.asFloatBuffer();
    vertexBuffer.put(cardshape);
    vertexBuffer.position(0);

    byteBuffer = ByteBuffer.allocateDirect(shape.length * 4);
    byteBuffer.order(ByteOrder.nativeOrder());
    textureBuffer = byteBuffer.asFloatBuffer();
    textureBuffer.put(textureshape);
    textureBuffer.position(0);

    // Set the background color to black ( rgba ).
    gl.glClearColor(0.0f, 0.0f, 0.0f, 0.5f);
    // Enable smooth shading (default, not really needed).
    gl.glShadeModel(GL10.GL_SMOOTH);
    // Depth buffer setup.
    gl.glClearDepthf(1.0f);
    // Enable depth testing.
    gl.glEnable(GL10.GL_DEPTH_TEST);
    // The type of depth testing to do.
    gl.glDepthFunc(GL10.GL_LEQUAL);
    // Really nice perspective calculations.
    gl.glHint(GL10.GL_PERSPECTIVE_CORRECTION_HINT, GL10.GL_NICEST);
    gl.glEnable(GL10.GL_TEXTURE_2D);
    loadGLTexture(gl);
}

public void onDrawFrame(GL10 gl) {
    gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
    gl.glDisable(GL10.GL_DEPTH_TEST);

    gl.glMatrixMode(GL10.GL_PROJECTION); // Select projection matrix
    gl.glPushMatrix();                   // Push the matrix
    gl.glLoadIdentity();                 // Reset the matrix
    gl.glOrthof(0f, 480f, 0f, 800f, -1f, 1f);
    gl.glMatrixMode(GL10.GL_MODELVIEW);  // Select modelview matrix
    gl.glPushMatrix();                   // Push the matrix
    gl.glLoadIdentity();                 // Reset the matrix

    gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
    gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);

    gl.glLoadIdentity();
    gl.glTranslatef(card.x, card.y, 0.0f);
    gl.glBindTexture(GL10.GL_TEXTURE_2D, card.texture[0]); // activates texture to be used now
    gl.glVertexPointer(2, GL10.GL_FLOAT, 0, vertexBuffer);
    gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer);
    gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, 4);

    gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
    gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
}

public void onSurfaceChanged(GL10 gl, int width, int height) {
    // Set the current view port to the new size.
    gl.glViewport(0, 0, width, height);
    // Select the projection matrix.
    gl.glMatrixMode(GL10.GL_PROJECTION);
    // Reset the projection matrix.
    gl.glLoadIdentity();
    // Calculate the aspect ratio of the window.
    GLU.gluPerspective(gl, 45.0f, (float) width / (float) height, 0.1f, 100.0f);
    // Select the modelview matrix.
    gl.glMatrixMode(GL10.GL_MODELVIEW);
    // Reset the modelview matrix.
    gl.glLoadIdentity();
}

public int[] texture = new int[1];

public void loadGLTexture(GL10 gl) {
    // loading texture
    Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), R.drawable.image);
    // generate one texture pointer
    gl.glGenTextures(0, texture, 0); // adds texture id to texture array
    // ...and bind it to our array
    gl.glBindTexture(GL10.GL_TEXTURE_2D, texture[0]); // activates texture to be used now
    // create nearest filtered texture
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
    // Use Android GLUtils to specify a two-dimensional texture image from our bitmap
    GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bitmap, 0);
    // Clean up
    bitmap.recycle();
}
Is there anything I have done wrong? Or something I haven't done? It works perfectly fine in the emulator, so I could only assume it was the power-of-two issue, but like I said, I tried that using a 128x128 texture on a square and it didn't show. Any help would be appreciated.
EDIT: I have also tried setting minSdkVersion to 3, loading the bitmap via an input stream (bitmap = BitmapFactory.decodeStream(is)), setting BitmapFactory.Options.inScaled to false, and putting the images in the nodpi folder and then trying them in the raw folder. Any other ideas?
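One detail in loadGLTexture() worth checking (a hedged observation; it is not raised in the answers below): the first argument to glGenTextures is the number of texture names to generate, so passing 0 allocates none and texture[0] stays 0. Emulators often tolerate drawing with an unallocated texture name, while real devices typically do not. The call the comment above it describes would be:

gl.glGenTextures(1, texture, 0); // generate one texture name into texture[0]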
I'm actually looking for the solution to a similar problem right now. I think I might have a temporary fix for you, however.
The problem appears to be that on the emulator the orthographic view is flipped. To solve this, in my app we added an option in preferences to manually flip the view if nothing draws. Here's the snippet that handles this:
if (!flipped)
{
    glOrthof(0, screenWidth, screenHeight, 0, -1, 1); // Device
}
else
{
    glOrthof(0, screenWidth, 0, -screenHeight, -1, 1); // Emulator
}
Hope this helps! If anybody has a more general solution, I'd be happy to hear it!
I didn't look at your code, but I have been down that road before. Developing in OpenGL is a real pain in the ass. If you are not obligated to use OpenGL, then use a graphics engine. Unity is a great one, and it's free. Your game would also work on Android, iOS, and other platforms. Study your choices carefully. Good luck.
