OpenGL ES Android - Texture not loaded (rendered black)

As in the title, I'm using OpenGL ES (2) on Android (7.0).
Apparently, though, I have some problems when trying to load a texture from the assets folder.
This is the code I'm using:
public static int loadTexture(final AssetManager assetManager, final String img)
{
    final int[] textureHandle = new int[1];
    glGenTextures(1, textureHandle, 0);
    if (textureHandle[0] == 0)
        throw new RuntimeException("Error loading texture");

    final BitmapFactory.Options options = new BitmapFactory.Options();
    options.inScaled = false;

    try
    {
        final Bitmap bitmap = BitmapFactory.decodeStream(assetManager.open(img));
        glBindTexture(GL_TEXTURE_2D, textureHandle[0]);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        GLUtils.texImage2D(GL_TEXTURE_2D, 0, bitmap, 0);
        bitmap.recycle();
    }
    catch (IOException e)
    {
        e.printStackTrace();
    }

    return textureHandle[0];
}
This is called from onSurfaceCreated, since I know I need a running OpenGL context.
Most of the time this works and the texture shows up correctly, but a considerable number of other times the model is rendered completely black.
No exception is thrown.
I know the problem depends on neither the 3D model nor the texture, because I've tried others. The only thing I can do when this happens is restart my app a couple of times.
I've tried googling around, but with no results. I know I could try to implement another loading function, but first I'd like to understand why this one isn't OK and where it's failing.
If you think this could depend on the code I'm using to render the scene, here it is (no VAOs because I need to support older devices and OpenGL ES 2):
glBindBuffer(GL_ARRAY_BUFFER, model.getCoordsVBO());
glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);
glEnableVertexAttribArray(0);

glBindBuffer(GL_ARRAY_BUFFER, model.getTextureVBO());
glVertexAttribPointer(1, 2, GL_FLOAT, false, 0, 0);
glEnableVertexAttribArray(1);

glBindBuffer(GL_ARRAY_BUFFER, model.getNormalsVBO());
glVertexAttribPointer(2, 3, GL_FLOAT, false, 0, 0);
glEnableVertexAttribArray(2);

glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, entity.getTexture());
loadFloat(loc_textureSampler, 0);

if (model.getIndicesBuffer() != null)
{
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, model.getIndicesVBO());
    glDrawElements(GL_TRIANGLES, model.getVertexCount(), GL_UNSIGNED_INT, 0);
}
else
    glDrawArrays(GL_TRIANGLES, 0, model.getVertexCount());

glDisableVertexAttribArray(0);
glDisableVertexAttribArray(1);
glDisableVertexAttribArray(2);
glBindBuffer(GL_ARRAY_BUFFER, 0);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);

What I didn't say (because I thought it was irrelevant) is that I had a little more code at the top of the function...
if(textures.containsKey(img))
    return textures.get(img);
and at the bottom
textures.put(img, textureHandle[0]);
I thought it was clever to keep track of the textures, so I could call this load function any time I needed a texture, and if it had already been loaded the function would return the existing handle without loading it again.
This would work in a desktop application, but I hadn't considered that on Android the OpenGL context can be destroyed when the activity is paused or restarted. The texture objects die with the context, but my static HashMap survived, so on the next run the function happily returned handles to textures that no longer existed, and the model was drawn with an incomplete texture: black.
Removing that HashMap solved the problem, demonstrating that the actual reading and loading code had nothing to do with the black rendering.
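If you still want the cache, a minimal sketch (assuming textures is a static HashMap<String, Integer> and loadTexture is the method above; names are illustrative) is to invalidate it whenever a fresh context comes up, since every texture object from the old context is gone at that point:
// Hypothetical cache-aware variant: clear stale handles when a new
// EGL context is created, then reload on demand as before.
private static final Map<String, Integer> textures = new HashMap<>();

@Override
public void onSurfaceCreated(GL10 unused, EGLConfig config) {
    textures.clear(); // old texture IDs are invalid in the new context
    // ... recreate shaders/buffers and reload textures here ...
}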

Related

Correct Use of glVertexAttribPointer?

I recently decided to start learning OpenGL and got myself a book about OpenGL Core 3.3. The book is generally about C++.
So, after looking around for a bit, I found a library in a language I was better at which provided almost the same functionality: LWJGL.
I followed the book's steps and translated the C++ syntax into Java syntax, which worked until it got to actually drawing something.
There, the JVM just kept crashing, no matter what I changed about the code. After doing some debugging, I found out that the JVM crashed when I called either glVertexAttribPointer or glDrawArrays.
I am very new to OpenGL, and I assume this question must sound very stupid to someone more experienced, but: what do I need to change about this code?
float[] vertices = {
    -0.5f, -0.5f, -0.0f,
     0.5f,  0.5f,  0.0f,
     0.0f,  0.5f,  0.0f
};
FloatBuffer b = BufferUtils.createFloatBuffer(9);
b.put(vertices);

int VBO = glGenBuffers();
int VAO = glGenVertexArrays();
log.info("VBO: " + VBO + " VAO: " + VAO);

// bind the Vertex Array Object first, then bind and set vertex buffer(s),
// and then configure vertex attribute(s).
glBindVertexArray(VAO);
glVertexAttribPointer(0, 3, GL_FLOAT, false, 12, 0);
glEnableVertexAttribArray(0);
glBindBuffer(GL_ARRAY_BUFFER, VBO);
glBufferData(GL_ARRAY_BUFFER, vertices, GL_STATIC_DRAW);
glBindVertexArray(0);
// Run the rendering loop until the user has attempted to close
// the window or has pressed the ESCAPE key.
while (!glfwWindowShouldClose(window))
{
    // render
    glClearColor(0.2f, 0.3f, 0.3f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    // draw our first triangle
    glUseProgram(shaderProgram);
    glBindVertexArray(VAO); // with a single VAO there's no need to bind it every time, but it keeps things organized
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glBindVertexArray(0); // no need to unbind it every time

    // glfw: swap buffers and poll IO events (keys pressed/released, mouse moved etc.)
    glfwSwapBuffers(window);
    glfwPollEvents();
}
I would be very thankful for any help I can get. If you need more info or need to see more of my code, please let me know. Thanks in advance.
You have to bind the vertex buffer object to the target GL_ARRAY_BUFFER before specifying the vertex attribute:
glBindBuffer(GL_ARRAY_BUFFER, VBO);
glVertexAttribPointer(0, 3, GL_FLOAT, false, 12, 0);
When glVertexAttribPointer is called, the buffer object currently bound to the GL_ARRAY_BUFFER target is associated with the attribute (index), and a reference to the buffer object is stored in the state vector of the Vertex Array Object.
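Put together, a sketch of the corrected setup order, reusing the names from the question:
glBindVertexArray(VAO);

glBindBuffer(GL_ARRAY_BUFFER, VBO);                      // bind the VBO first...
glBufferData(GL_ARRAY_BUFFER, vertices, GL_STATIC_DRAW); // ...upload the data...

glVertexAttribPointer(0, 3, GL_FLOAT, false, 12, 0);     // ...so this call captures the bound VBO
glEnableVertexAttribArray(0);

glBindVertexArray(0);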

OpenGL does not render my Mesh

Using LWJGL I tried to render a simple Mesh on screen, but OpenGL decided to do nothing instead. :(
So I have a Mesh class which creates a VBO. I can add some vertices, which are then supposed to be drawn on screen.
public class Mesh {

    private int vbo;
    private int size = 0;

    public Mesh() {
        vbo = glGenBuffers();
    }

    public void addVertices(Vertex[] vertices) {
        size = vertices.length;
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, Util.createFlippedBuffer(vertices), GL_STATIC_DRAW);
    }

    public void draw() {
        glEnableVertexAttribArray(0);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glVertexAttribPointer(0, 3, GL_FLOAT, false, Vertex.SIZE * 4, 0);
        glDrawArrays(GL_TRIANGLES, 0, size);
        glDisableVertexAttribArray(0);
    }
}
Here is how I add vertices to my mesh:
mesh = new Mesh();
Vertex[] vertices = new Vertex[] {
    new Vertex(new Vector3f(-1, -1, 0)),
    new Vertex(new Vector3f(-1,  1, 0)),
    new Vertex(new Vector3f( 0,  1, 0))
};
mesh.addVertices(vertices);
I am pretty sure I added them in the correct (clockwise) order.
And my OpenGL setup:
glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
glFrontFace(GL_CW);
glCullFace(GL_BACK);
glEnable(GL_CULL_FACE);
glEnable(GL_DEPTH_TEST);
Calling glGetError() returns no error (0).
EDIT:
Well, I found out that Macs are a little weird when it comes to OpenGL. I needed to use a VAO along with the VBO. Now it works fine. Thanks anyway!
I don't see anywhere that you're specifying a shader for output, a color, or a vertex array. Depending on which profile you're using, you need one or more of these.
I would suggest checking / setting the following (see the sketch after this list):
- Disable face culling to ensure that, regardless of the winding, you see something.
- If you're requesting a core profile, you'll need a shader and quite possibly a vertex array object.
- If you're instead using a compatibility profile, you should call glColor3f(1, 1, 1) before your draw call to ensure you're not drawing a black triangle.
- Are you clearing the color and depth framebuffers before you render?
You might also not be drawing that object within the viewing frustum; call glGetError as well to make sure you aren't making any mistakes.
It is also important to understand the difference between fixed-pipeline and programmable-pipeline OpenGL. If you are using a version with a programmable pipeline, you will need to write shaders; otherwise you will need to set the modelview and projection matrices.
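For reference, the asker's eventual fix (a VAO alongside the VBO, which core profiles, and OpenGL on macOS in particular, require) could look roughly like this; a minimal sketch reusing the names from the Mesh class above:
// In the constructor: wrap the attribute setup in a VAO so the
// VBO binding and pointer are captured as VAO state.
vao = glGenVertexArrays();
glBindVertexArray(vao);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glVertexAttribPointer(0, 3, GL_FLOAT, false, Vertex.SIZE * 4, 0);
glEnableVertexAttribArray(0);
glBindVertexArray(0);

// draw() then shrinks to binding the VAO and drawing.
glBindVertexArray(vao);
glDrawArrays(GL_TRIANGLES, 0, size);
glBindVertexArray(0);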

For loop for rendering OpenGL VBO

I have managed to get a cube rendered in OpenGL using a VBO. My next goal is to create a for loop to render multiple cubes. I'm stuck on this part though: do I put this code
GL11.glEnableClientState(GL11.GL_VERTEX_ARRAY);
ARBVertexBufferObject.glBindBufferARB(ARBVertexBufferObject.GL_ARRAY_BUFFER_ARB, vertexBufferID);
GL11.glVertexPointer(3, GL11.GL_FLOAT, 0, 0);
GL11.glDrawArrays(GL11.GL_QUADS, 0, 24);
GL11.glDisableClientState(GL11.GL_VERTEX_ARRAY);
into a for loop? Wouldn't I have to use some sort of glPopMatrix command along with a translate function? I barely understand how to create one cube in a VBO, so sorry if it's obvious what's wrong.
You can do it the following way:
GL11.glEnableClientState(GL11.GL_VERTEX_ARRAY);
ARBVertexBufferObject.glBindBufferARB(ARBVertexBufferObject.GL_ARRAY_BUFFER_ARB, vertexBufferID);
GL11.glVertexPointer(3, GL11.GL_FLOAT, 0, 0);
for (int i = 0; i < cubeCount; i++) {
    GL11.glPushMatrix();
    // do translation/rotation for cube no. i
    GL11.glDrawArrays(GL11.GL_QUADS, 0, 24);
    GL11.glPopMatrix();
}
GL11.glDisableClientState(GL11.GL_VERTEX_ARRAY);
Please note that the glPushMatrix/glPopMatrix approach is deprecated in newer OpenGL versions. For you it should work, because you are using GL11.
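To make the commented line concrete, the per-cube transform might look like this; the spacing and layout here are made up for the example:
// Hypothetical placement: lay the cubes out along the x axis, 2.5 units apart.
// Any per-cube translation/rotation goes between push and pop.
for (int i = 0; i < cubeCount; i++) {
    GL11.glPushMatrix();
    GL11.glTranslatef(i * 2.5f, 0.0f, 0.0f);
    GL11.glDrawArrays(GL11.GL_QUADS, 0, 24);
    GL11.glPopMatrix();
}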

Effective clean at opengl animation in android?

My Renderer looks like this:
public void onDrawFrame(GL10 gl) {
    if (draw) {
        gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[i]);
        gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
        gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
        gl.glFrontFace(GL10.GL_CW);
        gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBuffer);
        gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer);
        gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, vertices.length / 3);
        gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
        gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);

        Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), textures[i]);
        gl.glGenTextures(1, textures, 0);
        gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[i]);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
        GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bitmap, 0);
        bitmap.recycle();
    } else {
        gl.glDeleteTextures(textures.length, textures, 0);
        gl.glFlush();
    }
}
Is this all I can do to clean up after the animation is over? I use this renderer all the time, and after a few minutes I always run out of memory.
"I use this renderer all the time and after a few minutes I always get out of memory."
You're not supposed to create all the textures anew every frame. Create them once and reuse them. If you want to update the picture while keeping the format and dimensions, use glTexSubImage2D on an existing texture object.
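A minimal sketch of that split, assuming the frames keep the same size and format, and assuming the drawable resource ids live in a separate hypothetical frameIds array (the code above reuses textures[] for both resource ids and texture names):
// Create the texture object once, e.g. in onSurfaceCreated, not per frame.
gl.glGenTextures(1, textures, 0);
gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]);
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
Bitmap first = BitmapFactory.decodeResource(context.getResources(), frameIds[0]);
GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, first, 0);
first.recycle();

// Per frame, only replace the pixels of the existing texture object.
gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]);
Bitmap frame = BitmapFactory.decodeResource(context.getResources(), frameIds[i]);
GLUtils.texSubImage2D(GL10.GL_TEXTURE_2D, 0, 0, 0, frame); // same size/format as the original
frame.recycle();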

Android - OpenGL - Emulator vs Actual Device

I am writing a game which uses OpenGL ES. I have created my renderer class and have a sample of my game working on the emulator; however, none of the textures display on an actual device. I have read that the most common cause for this is textures not having power-of-two dimensions, but I have tried drawing a square (128x128) with a texture of the same size mapped to it, and this only shows on the emulator. Further to that, my actual game will be using rectangles, so I'm unsure how I can map square textures to rectangles.
This is my code so far (the game is 2D, so I'm using ortho mode):
EDIT: I have updated my code; it is now correctly binding textures and using textures of size 128x128, but I'm still only seeing textures on the emulator.
public void onSurfaceCreated(GL10 gl, EGLConfig config)
{
    byteBuffer = ByteBuffer.allocateDirect(shape.length * 4);
    byteBuffer.order(ByteOrder.nativeOrder());
    vertexBuffer = byteBuffer.asFloatBuffer();
    vertexBuffer.put(cardshape);
    vertexBuffer.position(0);

    byteBuffer = ByteBuffer.allocateDirect(shape.length * 4);
    byteBuffer.order(ByteOrder.nativeOrder());
    textureBuffer = byteBuffer.asFloatBuffer();
    textureBuffer.put(textureshape);
    textureBuffer.position(0);

    // Set the background color to black (rgba).
    gl.glClearColor(0.0f, 0.0f, 0.0f, 0.5f);
    // Enable smooth shading (the default, not really needed).
    gl.glShadeModel(GL10.GL_SMOOTH);
    // Depth buffer setup.
    gl.glClearDepthf(1.0f);
    // Enable depth testing.
    gl.glEnable(GL10.GL_DEPTH_TEST);
    // The type of depth testing to do.
    gl.glDepthFunc(GL10.GL_LEQUAL);
    // Really nice perspective calculations.
    gl.glHint(GL10.GL_PERSPECTIVE_CORRECTION_HINT, GL10.GL_NICEST);

    gl.glEnable(GL10.GL_TEXTURE_2D);
    loadGLTexture(gl);
}
public void onDrawFrame(GL10 gl) {
    gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
    gl.glDisable(GL10.GL_DEPTH_TEST);

    gl.glMatrixMode(GL10.GL_PROJECTION); // select projection matrix
    gl.glPushMatrix();                   // push the matrix
    gl.glLoadIdentity();                 // reset the matrix
    gl.glOrthof(0f, 480f, 0f, 800f, -1f, 1f);

    gl.glMatrixMode(GL10.GL_MODELVIEW);  // select modelview matrix
    gl.glPushMatrix();                   // push the matrix
    gl.glLoadIdentity();                 // reset the matrix

    gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
    gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);

    gl.glLoadIdentity();
    gl.glTranslatef(card.x, card.y, 0.0f);
    gl.glBindTexture(GL10.GL_TEXTURE_2D, card.texture[0]); // activates the texture to be used now
    gl.glVertexPointer(2, GL10.GL_FLOAT, 0, vertexBuffer);
    gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer);
    gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, 4);

    gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
    gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
}
public void onSurfaceChanged(GL10 gl, int width, int height) {
    // Set the current viewport to the new size.
    gl.glViewport(0, 0, width, height);
    // Select the projection matrix.
    gl.glMatrixMode(GL10.GL_PROJECTION);
    // Reset the projection matrix.
    gl.glLoadIdentity();
    // Calculate the aspect ratio of the window.
    GLU.gluPerspective(gl, 45.0f, (float) width / (float) height, 0.1f, 100.0f);
    // Select the modelview matrix.
    gl.glMatrixMode(GL10.GL_MODELVIEW);
    // Reset the modelview matrix.
    gl.glLoadIdentity();
}
public int[] texture = new int[1];

public void loadGLTexture(GL10 gl) {
    // Load the texture bitmap.
    Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), R.drawable.image);
    // Generate one texture pointer...
    gl.glGenTextures(0, texture, 0); // adds the texture id to the texture array
    // ...and bind it to our array.
    gl.glBindTexture(GL10.GL_TEXTURE_2D, texture[0]); // activates the texture to be used now
    // Create a nearest-filtered texture.
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
    // Use Android's GLUtils to specify a two-dimensional texture image from our bitmap.
    GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bitmap, 0);
    // Clean up.
    bitmap.recycle();
}
Is there anything I have done wrong, or something I haven't done? It works perfectly fine in the emulator, so I could only assume it was the power-of-two issue, but as I said, I tried that using a 128x128 texture on a square and it didn't show. Any help would be appreciated.
EDIT: I have also tried setting minSdkVersion to 3, loading the bitmap via an input stream (bitmap = BitmapFactory.decodeStream(is)), setting BitmapFactory.Options.inScaled to false, putting the images in the nodpi folder, and then trying them in the raw folder. Any other ideas?
I'm actually looking for the solution to a similar problem right now, but I think I might have a temporary fix for you.
The problem appears to be that on the emulator the orthographic view is flipped. To work around this, we added an option in my app's preferences to manually flip the view if nothing draws. Here's the snippet that handles this:
if (!flipped)
{
    glOrthof(0, screenWidth, screenHeight, 0, -1, 1); // device
}
else
{
    glOrthof(0, screenWidth, 0, -screenHeight, -1, 1); // emulator
}
Hope this helps! If anybody has a more general solution, I'd be happy to hear it!
I didn't look at your code, but I have been down that road before. Developing in OpenGL is a real pain in the ass. If you are not obligated to use OpenGL, use a graphics engine instead. Unity is a great one, and it's free; your game would also work on Android, iOS and other platforms. Study your choices carefully. Good luck.
