I can't figure out how to increase the size of the image. I know how to rotate it, but how would I adjust the size? I would also like to know how to prevent the bitmap from looking squished: right now, when I load it on screen, the sides look like they are being squished.
@Override
public void onDrawFrame(GL10 gl) {
// clear Screen and Depth Buffer
gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
// Reset the Modelview Matrix
gl.glLoadIdentity();
// Drawing
gl.glTranslatef(0.0f, 0.0f, -5.0f); // move 5 units INTO the screen
gl.glMatrixMode(GL10.GL_MODELVIEW);
gl.glRotatef(mAngle, 0, 1, 0);
gl.glRotatef(mAngle*0.25f, 1, 0, 0);
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glRotatef(mAngle*2.0f, 0, 1, 1);
gl.glTranslatef(0.5f, 0.5f, 0.5f);
mAngle += 1.2f;
bitmap_image.draw(gl);
}
In another class I'm loading the bitmap with:
private FloatBuffer vertexBuffer; // buffer holding the vertices
private float vertices[] = {
-1.0f, -1.0f, 0.0f, // V1 - bottom left
-1.0f, 1.0f, 0.0f, // V2 - top left
1.0f, -1.0f, 0.0f, // V3 - bottom right
1.0f, 1.0f, 0.0f // V4 - top right
};
private FloatBuffer textureBuffer; // buffer holding the texture coordinates
private float texture[] = {
// Mapping coordinates for the vertices
0.0f, 1.0f, // top left (V2)
0.0f, 0.0f, // bottom left (V1)
1.0f, 1.0f, // top right (V4)
1.0f, 0.0f // bottom right (V3)
};
/** The texture pointer */
private int[] textures = new int[1];
public ImageLoader() {
// a float has 4 bytes so we allocate for each coordinate 4 bytes
ByteBuffer byteBuffer = ByteBuffer.allocateDirect(vertices.length * 4);
byteBuffer.order(ByteOrder.nativeOrder());
// allocates the memory from the byte buffer
vertexBuffer = byteBuffer.asFloatBuffer();
// fill the vertexBuffer with the vertices
vertexBuffer.put(vertices);
// set the cursor position to the beginning of the buffer
vertexBuffer.position(0);
byteBuffer = ByteBuffer.allocateDirect(texture.length * 4);
byteBuffer.order(ByteOrder.nativeOrder());
textureBuffer = byteBuffer.asFloatBuffer();
textureBuffer.put(texture);
textureBuffer.position(0);
}
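A minimal sketch of one common way to handle both points, assuming a renderer set up like the one above: scale the quad with glScalef just before drawing it, and build the projection from the screen's aspect ratio in onSurfaceChanged so a square quad stays square. The 2x scale factor and the frustum values are only illustrative.
public void onSurfaceChanged(GL10 gl, int width, int height) {
    gl.glViewport(0, 0, width, height);
    float aspect = (float) width / height;
    gl.glMatrixMode(GL10.GL_PROJECTION);
    gl.glLoadIdentity();
    // widen the frustum horizontally by the aspect ratio so the quad is not stretched
    gl.glFrustumf(-aspect, aspect, -1.0f, 1.0f, 3.0f, 50.0f);
    gl.glMatrixMode(GL10.GL_MODELVIEW);
    gl.glLoadIdentity();
}
// in onDrawFrame, after the translate/rotate calls and before drawing:
gl.glScalef(2.0f, 2.0f, 1.0f); // draw the quad at twice its size
bitmap_image.draw(gl);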
Any ideas why this is not producing a smooth circle?
public void draw(ShapeRenderer sRenderer) {
sRenderer.begin(ShapeType.Filled);
sRenderer.setColor(1.0f, 0.0f, 0.0f, 0.0f);
sRenderer.identity();
sRenderer.translate(1.0f, 1.0f, 0);
sRenderer.rotate(0.0f, 0.0f, 1.0f, (float) Math.toDegrees(getBody().getAngle()));
sRenderer.circle(0.0f, 0.0f, 1.0f);
sRenderer.end();
}
circle() takes another argument for setting the number of segments manually. You have it using the automatic estimate, and since that estimate works in pixels rather than world units, it assumed a 1-pixel-radius circle and gave it only a handful of segments.
public void draw(ShapeRenderer sRenderer) {
sRenderer.begin(ShapeType.Filled);
sRenderer.setColor(1.0f, 0.0f, 0.0f, 0.0f);
sRenderer.identity();
sRenderer.translate(1.0f, 1.0f, 0);
sRenderer.rotate(0.0f, 0.0f, 1.0f, (float) Math.toDegrees(getBody().getAngle()));
sRenderer.circle(0.0f, 0.0f, 1.0f, 100);
sRenderer.end();
}
That should get you somewhere, 100 is just a number I threw in, tune it to your needs.
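If you'd rather not hard-code the count, you can derive it from the circle's size on screen. As far as I know, the radius-only overload estimates roughly 6 * cbrt(radius) segments and effectively treats the radius as pixels, so feeding the same kind of estimate the radius in pixels gives a smoother result. A sketch, where pixelsPerUnit is assumed to come from your own camera/viewport setup:
// convert the world-unit radius to pixels before estimating the segment count
float radiusInPixels = 1.0f * pixelsPerUnit;
int segments = Math.max(12, (int) (6 * Math.cbrt(radiusInPixels)));
sRenderer.circle(0.0f, 0.0f, 1.0f, segments);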
I cannot seem to render a Texture to my square. I have gotten my program to render the blank square with color. Any help is greatly appreciated.
I've redesigned my code to the following and believe the problem lies in how I'm setting up my vertex coordinates and texture coordinates. I also get a libc "Fatal signal 11" (SIGSEGV) at my glDrawArrays call.
Here are the Vertex and Texture Coordinates:
private final FloatBuffer vertexBuffer;
private final FloatBuffer textureBuffer;
static final int COORDS_PER_VERTEX = 3;
static float positionCoords[] = { // in counterclockwise order:
-1.0f, 1.0f, 1.0f,
-1.0f, -1.0f, 1.0f,
1.0f, 1.0f, 1.0f,
-1.0f, -1.0f, 1.0f,
1.0f, -1.0f, 1.0f,
1.0f, 1.0f, 1.0f,
};
static final int COORDS_PER_TEXTURE = 2;
static float textureCoords[] = {
0.0f, 0.0f,
0.0f, 1.0f,
1.0f, 0.0f,
0.0f, 1.0f,
1.0f, 1.0f,
1.0f, 0.0f,
};
Here's my draw function in my square class:
public void draw(float[] mvpMatrix)
{
int MVPMatrixHandle = GLES20.glGetUniformLocation(shader.getProgram(), "u_MVPMatrix");
int textureHandler = GLES20.glGetUniformLocation(shader.getProgram(), "u_s_texture");
int positionHandler = GLES20.glGetAttribLocation(shader.getProgram(), "a_position");
int texCoordHandler = GLES20.glGetAttribLocation(shader.getProgram(), "a_texCoord");
Log.d(TAG, "Setting up GLProgram Handlers");
GlRenderer.checkGlError("Setup GLProgram Handlers");
GLES20.glEnableVertexAttribArray(positionHandler);
GLES20.glEnableVertexAttribArray(texCoordHandler);
GlRenderer.checkGlError("EnableVertexAttribArrays");
GLES20.glVertexAttribPointer(positionHandler, COORDS_PER_VERTEX,
GLES20.GL_FLOAT, false,
vertexStride, vertexBuffer);
GLES20.glVertexAttribPointer(textureHandler, COORDS_PER_TEXTURE,
GLES20.GL_FLOAT, false,
textureStride, textureBuffer);
GlRenderer.checkGlError("VertexAttribPointers (Position, Texture)");
GLES20.glUniformMatrix4fv(MVPMatrixHandle, 1, false, mvpMatrix, 0);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureID);
GLES20.glUniform1i(textureHandler, 0);
GlRenderer.checkGlError("Binding Texture");
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, vertexCount);
GlRenderer.checkGlError("Draw Arrays");
GLES20.glDisableVertexAttribArray(positionHandler);
GLES20.glDisableVertexAttribArray(texCoordHandler);
GlRenderer.checkGlError("DisableVertexAttribArrays");
}
Your SetupGLPositionHandle function looks wrong to me. Why disable the PositionHandle attribute at the end of the function?
The attribute must be enabled at the time glDrawArrays is called.
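In other words, the usual per-draw ordering is: enable the attribute, set its pointer, issue the draw call, and only then disable it. A minimal sketch with illustrative names:
GLES20.glEnableVertexAttribArray(positionHandle);
GLES20.glVertexAttribPointer(positionHandle, 3, GLES20.GL_FLOAT, false, 0, vertexBuffer);
// the attribute is still enabled here, so the draw call can read it
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, vertexCount);
// only disable after the draw has been issued
GLES20.glDisableVertexAttribArray(positionHandle);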
My main problem in the above code was that I was using textureHandler instead of texCoordHandler in the glVertexAttribPointer call for the texture coordinates.
The code should look like this:
GLES20.glVertexAttribPointer(texCoordHandler, COORDS_PER_TEXTURE,
GLES20.GL_FLOAT, false,
textureStride, textureBuffer);
Since this problem arose I've rewritten my code yet again: adding indices (a draw order), combining my vertex and texture arrays into a single vertices array, and referencing one vertexBuffer that contains both position and texture coordinates instead of a separate textureBuffer plus a vertexBuffer that only holds position coordinates.
I've also changed the GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, vertexCount); to:
GLES20.glDrawElements(GLES20.GL_TRIANGLES, indices.length,
GLES20.GL_UNSIGNED_SHORT, indexBuffer);
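For reference, here is a sketch of how an interleaved buffer like that is usually fed to GLES20: the stride covers one whole vertex (3 position floats plus 2 texture floats), and the texture attribute starts 3 floats into each vertex. The handle and buffer names are placeholders for your own fields.
// 5 floats per interleaved vertex (x, y, z, u, v), 4 bytes each
final int stride = (3 + 2) * 4;

vertexBuffer.position(0); // positions start at the beginning of each vertex
GLES20.glVertexAttribPointer(positionHandler, 3, GLES20.GL_FLOAT, false, stride, vertexBuffer);
GLES20.glEnableVertexAttribArray(positionHandler);

vertexBuffer.position(3); // texture coordinates start after x, y, z
GLES20.glVertexAttribPointer(texCoordHandler, 2, GLES20.GL_FLOAT, false, stride, vertexBuffer);
GLES20.glEnableVertexAttribArray(texCoordHandler);

GLES20.glDrawElements(GLES20.GL_TRIANGLES, indices.length, GLES20.GL_UNSIGNED_SHORT, indexBuffer);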
This probably has something to do with my transformations, but right now I can't figure it out and it is driving me insane. I have wrapped the draw code so that I can easily define new triangles. However, when I put it into a function, it just shows a grey screen. The function code is as follows:
public void Draw(float[] mViewMatrix, float[] mModelMatrix, float[] mProjectionMatrix, int mPositionHandle, int mColorHandle, int mMVPMatrixHandle)
{
long time = SystemClock.uptimeMillis() % 10000L;
float angleInDegrees = (360.0f / 10000.0f) * ((int) time);
Matrix.setIdentityM(mModelMatrix, 0);
Matrix.rotateM(mModelMatrix, 0, angleInDegrees, 0.0f, 0.0f, 1.0f);
aBuffer = ByteBuffer.allocateDirect(verts.length * mBytesPerFloat)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
//aBuffer.position(mPositionOffset);
GLES20.glVertexAttribPointer(mPositionHandle, mPositionDataSize, GLES20.GL_FLOAT, false,
mStrideBytes, aBuffer);
GLES20.glEnableVertexAttribArray(mPositionHandle);
// Pass in the color information
aBuffer.position(mColorOffset);
GLES20.glVertexAttribPointer(mColorHandle, mColorDataSize, GLES20.GL_FLOAT, false,
mStrideBytes, aBuffer);
GLES20.glEnableVertexAttribArray(mColorHandle);
// This multiplies the view matrix by the model matrix, and stores the result in the MVP matrix
// (which currently contains model * view).
Matrix.multiplyMM(mMVPMatrix, 0, mViewMatrix, 0, mModelMatrix, 0);
// This multiplies the modelview matrix by the projection matrix, and stores the result in the MVP matrix
// (which now contains model * view * projection).
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mMVPMatrix, 0);
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mMVPMatrix, 0);
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, 3);
}
The code which IS working is:
public void onDrawFrame(GL10 glUnused)
{
GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
// Do a complete rotation every 10 seconds.
long time = SystemClock.uptimeMillis() % 10000L;
float angleInDegrees = (360.0f / 10000.0f) * ((int) time);
// Draw the triangle facing straight on.
Matrix.setIdentityM(mModelMatrix, 0);
Matrix.rotateM(mModelMatrix, 0, angleInDegrees, 0.0f, 0.0f, 1.0f);
drawTriangle(mTriangle1Vertices);
// Draw one translated a bit down and rotated to be flat on the ground.
Matrix.setIdentityM(mModelMatrix, 0);
Matrix.translateM(mModelMatrix, 0, 0.0f, -1.0f, 0.0f);
Matrix.rotateM(mModelMatrix, 0, 90.0f, 1.0f, 0.0f, 0.0f);
Matrix.rotateM(mModelMatrix, 0, angleInDegrees, 0.0f, 0.0f, 1.0f);
drawTriangle(mTriangle2Vertices);
// Draw one translated a bit to the right and rotated to be facing to the left.
Matrix.setIdentityM(mModelMatrix, 0);
Matrix.translateM(mModelMatrix, 0, 1.0f, 0.0f, 0.0f);
Matrix.rotateM(mModelMatrix, 0, 90.0f, 0.0f, 1.0f, 0.0f);
Matrix.rotateM(mModelMatrix, 0, angleInDegrees, 0.0f, 0.0f, 1.0f);
drawTriangle(mTriangle3Vertices);
/*
for (int x = 0; x < staticHolder.objectList.size(); x++)
{
staticHolder.objectList.get(x).Draw(mViewMatrix, mModelMatrix, mProjectionMatrix, mPositionHandle, mColorHandle, mMVPMatrixHandle);
}
*/
}
/**
 * Draws a triangle from the given vertex data.
 *
 * @param aTriangleBuffer The buffer containing the vertex data.
 */
private void drawTriangle(final FloatBuffer aTriangleBuffer)
{
// Pass in the position information
aTriangleBuffer.position(mPositionOffset);
GLES20.glVertexAttribPointer(mPositionHandle, mPositionDataSize, GLES20.GL_FLOAT, false,
mStrideBytes, aTriangleBuffer);
GLES20.glEnableVertexAttribArray(mPositionHandle);
// Pass in the color information
aTriangleBuffer.position(mColorOffset);
GLES20.glVertexAttribPointer(mColorHandle, mColorDataSize, GLES20.GL_FLOAT, false,
mStrideBytes, aTriangleBuffer);
GLES20.glEnableVertexAttribArray(mColorHandle);
// This multiplies the view matrix by the model matrix, and stores the result in the MVP matrix
// (which currently contains model * view).
Matrix.multiplyMM(mMVPMatrix, 0, mViewMatrix, 0, mModelMatrix, 0);
// This multiplies the modelview matrix by the projection matrix, and stores the result in the MVP matrix
// (which now contains model * view * projection).
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mMVPMatrix, 0);
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mMVPMatrix, 0);
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, 3);
}
I am passing in the same variables and the final variables used here are initialized the same. There is some other work that happens in the function for encapsulation. Any idea why it is refusing to render in the function?
The following code loads the objects in the list:
final float[] triangle1VerticesData = {
// X, Y, Z,
// R, G, B, A
-0.5f, -0.25f, 0.0f,
1.0f, 0.0f, 0.0f, 1.0f,
0.5f, -0.25f, 0.0f,
0.0f, 0.0f, 1.0f, 1.0f,
0.0f, 0.559016994f, 0.0f,
0.0f, 1.0f, 0.0f, 1.0f};
final float[] triangle2VerticesData = {
// X, Y, Z,
// R, G, B, A
-0.5f, -0.25f, 0.0f,
1.0f, 1.0f, 0.0f, 1.0f,
0.5f, -0.25f, 0.0f,
0.0f, 1.0f, 1.0f, 1.0f,
0.0f, 0.559016994f, 0.0f,
1.0f, 0.0f, 1.0f, 1.0f};
// This triangle is white, gray, and black.
final float[] triangle3VerticesData = {
// X, Y, Z,
// R, G, B, A
-0.5f, -0.25f, 0.0f,
1.0f, 1.0f, 1.0f, 1.0f,
0.5f, -0.25f, 0.0f,
0.5f, 0.5f, 0.5f, 1.0f,
0.0f, 0.559016994f, 0.0f,
0.0f, 0.0f, 0.0f, 1.0f};
staticHolder.objectList.add(new Triangle(triangle1VerticesData));
staticHolder.objectList.add(new Triangle(triangle2VerticesData));
staticHolder.objectList.add(new Triangle(triangle3VerticesData));
The receiving class is:
public class Triangle extends shape
{
public Triangle(float[] data)
{
verts = data;
}
}
After the following bit of code:
aBuffer = ByteBuffer.allocateDirect(verts.length * mBytesPerFloat).order(ByteOrder.nativeOrder()).asFloatBuffer();
You must put the vertices into the buffer (otherwise, it's blank!):
aBuffer.put(verts);
The reason this isn't in the code that works is that those three vertex buffers are pre-allocated and the vertices are put into them at initialization. They are simply passed to the method each time, so they don't have to be put() in again.
On that note, you will want to avoid allocations in your Draw method: it runs every frame, and repeated allocation can slow rendering and trigger garbage-collection pauses. Allocate aBuffer once, and put new vertices into it each time.
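A sketch of that allocate-once pattern, reusing the verts and mBytesPerFloat fields from above: allocate in the constructor, then refill the buffer inside Draw without creating anything new.
// allocate once, e.g. in the constructor
aBuffer = ByteBuffer.allocateDirect(verts.length * mBytesPerFloat)
        .order(ByteOrder.nativeOrder()).asFloatBuffer();

// then, each time Draw runs, refill the existing buffer
aBuffer.clear();
aBuffer.put(verts);
aBuffer.position(0);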
I'm creating an application for Android using OpenGL ES.
I created a rectangle using the following vertices.
private float vertices[] = {
-1.0f, 0.5f, 0.0f, // 0, Top Left
-1.0f, -0.5f, 0.0f, // 1, Bottom Left
1.0f, -0.5f, 0.0f, // 2, Bottom Right
1.0f, 0.5f, 0.0f, // 3, Top Right
};
private short[] indices = { 0, 1, 2, 0, 2, 3 };
How do I find the location in pixels of this rectangle?
It depends on your viewport, projection, and model-view matrices. The clip-space position of a vertex is calculated with a formula like projectionMatrix * modelviewMatrix * vertex; dividing by the resulting w component and mapping through the viewport then gives you pixel coordinates (see the sketch after the links below).
You can find some useful explanations here:
http://robertokoci.com/world-view-projection-matrix-unveiled/
http://db-in.com/blog/2011/04/cameras-on-opengl-es-2-x/
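As a sketch, android.opengl.GLU.gluProject does that math for you: given the current model-view and projection matrices plus the viewport, it returns the window (pixel) coordinates of a vertex. The matrix arrays and screen dimensions here are assumed to be whatever you already use to set up your scene.
float[] win = new float[3];
int[] viewport = {0, 0, screenWidth, screenHeight};

// project the first vertex (top left of the rectangle) into window coordinates
GLU.gluProject(vertices[0], vertices[1], vertices[2],
        modelViewMatrix, 0,
        projectionMatrix, 0,
        viewport, 0,
        win, 0);

float pixelX = win[0];
float pixelY = screenHeight - win[1]; // gluProject's Y origin is the bottom of the screen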