I'm using OpenGL with LWJGL and my own very small framework for trivial tasks. I'm following the book OpenGL SuperBible: Comprehensive Tutorial and Reference (6th Edition).
I will list the most important parts of my program:
public class GameController extends Controller {
private Program test1Program;
private int vaoId;
@Override
protected void init() {
glViewport(0, 0, 800, 600);
test1Program = new Program(
new VertexShader("src/shaders/test.vert.glsl").create(),
new ControlShader("src/shaders/test.cont.glsl").create(),
new EvaluationShader("src/shaders/test.eval.glsl").create(),
new FragmentShader("src/shaders/test.frag.glsl").create()
).create();
vaoId = glGenVertexArrays();
glBindVertexArray(vaoId);
}
@Override
protected void draw(double msDelta) {
glClearColor((float)Math.sin(currentTime / 1000f) * 0.5f + 0.5f, (float)Math.cos(currentTime / 1000f) * 0.5f + 0.5f, 0.0f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT);
test1Program.use();
glVertexAttrib4f(0, (float)Math.sin(currentTime / 1000f) * 0.5f, (float)Math.cos(currentTime / 1000f) * 0.5f, 0.0f, 0.0f);
glVertexAttrib4f(1, (float)Math.sin(currentTime / 1000f * 2f) * 0.5f + 0.5f, (float)Math.cos(currentTime / 1000f * 2f) * 0.5f + 0.5f, 0.0f, 1.0f);
glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
glDrawArrays(GL_TRIANGLES, 0, 3);
}
@Override
protected void shutdown() {
test1Program.delete();
glDeleteVertexArrays(vaoId);
}
public static void main(String[] args) {
Controller controller = new GameController();
controller.start();
}
}
My custom VertexShader, ControlShader, EvaluationShader and FragmentShader classes are working: if shader code does not compile or link correctly, an exception is thrown, so I would notice it. I have thereby verified that those classes and Program are working correctly.
The error (Exception in thread "main" org.lwjgl.opengl.OpenGLException: Invalid operation (1282)) gets thrown at the glDrawArrays call.
test.vert.glsl:
#version 440 core
layout(location = 0) in vec4 offset;
layout(location = 1) in vec4 color;
out VS_OUT {
vec4 color;
} vs_out;
void main() {
const vec4 vertices[3] = vec4[3](
vec4(0.25, -0.25, 0.5, 1.0),
vec4(-0.25, -0.25, 0.5, 1.0),
vec4(0.25, 0.25, 0.5, 1.0)
);
gl_Position = vertices[gl_VertexID] + offset;
vs_out.color = color;
}
test.cont.glsl:
#version 440 core
layout(vertices = 3) out;
void main() {
if (gl_InvocationID == 0) {
gl_TessLevelInner[0] = 5.0;
gl_TessLevelOuter[0] = 5.0;
gl_TessLevelOuter[1] = 5.0;
gl_TessLevelOuter[2] = 5.0;
}
gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;
}
test.eval.glsl:
#version 440 core
layout(triangles, equal_spacing, cw) in;
void main() {
gl_Position = (gl_TessCoord.x * gl_in[0].gl_Position + gl_TessCoord.y * gl_in[1].gl_Position + gl_TessCoord.z * gl_in[2].gl_Position);
}
test.frag.glsl:
#version 440 core
in VS_OUT {
vec4 color;
} fs_in;
out vec4 color;
void main() {
color = fs_in.color;
}
I have triple checked all my code and crosschecked with the book, but have no clue why it is not working. I would appreciate any help and am around to provide additional information if necessary.
I found the answer just now:
glDrawArrays(GL_TRIANGLES, 0, 3);
needs to be:
glDrawArrays(GL_PATCHES, 0, 3);
Some more effort from OpenGL to show what was wrong would have been appreciated.
Also, nowhere in the book is it (explicitly) mentioned that I need to use GL_PATCHES; I only figured it out by looking at the source code of the book's compilable examples.
After a more thorough investigation of your shaders, I suspect part of the problem is that there is no vertex attribute 1 (layout(location = 1) in vec4 color) in your program after all of the stages are linked. It is probably not your entire problem, but as it stands right now your GLSL program will not behave the way you want.
You have to pass the data fed into the vertex shader through the tessellation stages to get it into the fragment shader stage. If you do not, the GLSL compiler/linker determines that the vertex attribute is never used when the program executes and does not assign it a location. This behavior applies to uniforms as well.
Have a look at another answer I wrote on SO for an explanation on how to do this.
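As a sketch of what that forwarding looks like (illustrative only: the tcs_in/tcs_out/tes_in/tes_out instance names are my own, not from the book), each tessellation stage re-declares the interface block and copies the color along:

```glsl
// Tessellation control shader: per-vertex pass-through of the color.
layout(vertices = 3) out;
in VS_OUT { vec4 color; } tcs_in[];
out VS_OUT { vec4 color; } tcs_out[];
void main() {
    // tessellation levels set as before (omitted here)
    tcs_out[gl_InvocationID].color = tcs_in[gl_InvocationID].color;
    gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;
}

// Tessellation evaluation shader: interpolate the patch corners' colors
// with the same barycentric weights already used for gl_Position.
layout(triangles, equal_spacing, cw) in;
in VS_OUT { vec4 color; } tes_in[];
out VS_OUT { vec4 color; } tes_out;
void main() {
    tes_out.color = gl_TessCoord.x * tes_in[0].color
                  + gl_TessCoord.y * tes_in[1].color
                  + gl_TessCoord.z * tes_in[2].color;
    // gl_Position computed as in test.eval.glsl
}
```

With something like this in place, the fragment shader's fs_in.color input actually receives data, and the color attribute at location 1 is no longer optimized away.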
Related
I'm having trouble moving my entities in an OpenGL context:
when I place an entity, the position seems correct, but as soon as the entity starts to move, everything goes wrong and collisions don't work. I'm new to OpenGL, and I suspect my world matrix or model matrix is wrong.
Here's the code of the vertex shader:
#version 330 core
layout (location=0) in vec3 position;
out vec3 extColor;
uniform mat4 projectionMatrix;
uniform mat4 modelMatrix;
uniform vec3 inColor;
void main()
{
gl_Position = projectionMatrix * modelMatrix * vec4(position, 1.0);
extColor = inColor;
}
Here is the class that computes most of the Matrix:
public class Transformations {
private Matrix4f projectionMatrix;
private Matrix4f modelMatrix;
public Transformations() {
projectionMatrix = new Matrix4f();
modelMatrix = new Matrix4f();
}
public final Matrix4f getOrthoMatrix(float width, float height, float zNear, float zFar) {
projectionMatrix.identity();
projectionMatrix.ortho(0.0f, width, 0.0f, height, zNear, zFar);
return projectionMatrix;
}
public Matrix4f getModelMatrix(Vector3f offset, float angleZ, float scale) {
modelMatrix.identity().translation(offset).rotate(angleZ, 0, 0, 1).scale(scale); // rotate about the Z axis
return modelMatrix;
}
}
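As a plain-Java sanity check of the composition order used above (this is an illustrative sketch, not the JOML API): translation(offset).rotate(angleZ, …).scale(scale) builds T · R · S, so a point is scaled first, then rotated about the Z axis, then translated:

```java
public class TransformSketch {
    // Applies translate * rotateZ * scale to a 2D point, mirroring
    // modelMatrix.identity().translation(offset).rotate(angleZ, 0, 0, 1).scale(scale)
    static float[] apply(float tx, float ty, float angleZ, float scale, float px, float py) {
        // scale first (the rightmost factor acts on the point first)
        float sx = px * scale, sy = py * scale;
        // then rotate about the Z axis
        float c = (float) Math.cos(angleZ), s = (float) Math.sin(angleZ);
        float rx = c * sx - s * sy, ry = s * sx + c * sy;
        // then translate
        return new float[] { rx + tx, ry + ty };
    }

    public static void main(String[] args) {
        // rotating (1, 0) by 90 degrees about Z lands on (0, 1), then shift by (10, 0)
        float[] p = apply(10f, 0f, (float) Math.PI / 2f, 1f, 1f, 0f);
        System.out.println(p[0] + ", " + p[1]);
    }
}
```

Note that a 2D rotation must use the Z unit vector (0, 0, 1) as its axis; a zero axis has no defined rotation.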
Here's the test for collisions:
public boolean isIn(Pos p) {
return (p.getX() >= this.pos.getX() &&
p.getX() <= this.pos.getX() + DIMENSION)
&& (p.getY() >= this.pos.getY() &&
p.getY() <= this.pos.getY() + DIMENSION);
}
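The containment check itself is a standard point-in-AABB test. A self-contained version (with DIMENSION and the Pos class replaced by plain floats, since those aren't shown) behaves like this:

```java
public class AabbCheck {
    // Point-in-square test equivalent to isIn(): the square has its
    // lower-left corner at (x, y) and side length `dimension`.
    static boolean contains(float x, float y, float dimension, float px, float py) {
        return px >= x && px <= x + dimension
            && py >= y && py <= y + dimension;
    }

    public static void main(String[] args) {
        System.out.println(contains(0f, 0f, 10f, 5f, 5f));   // true: inside
        System.out.println(contains(0f, 0f, 10f, 10f, 0f));  // true: on the edge
        System.out.println(contains(0f, 0f, 10f, 11f, 5f));  // false: outside
    }
}
```

This only gives correct results when this.pos is the entity's actual world position, which is exactly why baking world coordinates into the VBO (as described in the fix below the code) broke the collisions.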
Also, there's a link to the github project: https://github.com/ShiroUsagi-san/opengl-engine.
I'm really new to OpenGL 3, so I may have made some really big mistakes.
I'm also running i3 as my WM; I don't really know if that could lead to this kind of issue.
I fixed the issue after thinking about how OpenGL and VBOs work: I was building each entity's mesh at its world position, so I had to change the line
Mesh fourmiMesh = MeshBuilder.buildRect(this.position.getX(), this.position.getY(), 10, 10);
to
Mesh fourmiMesh = MeshBuilder.buildRect(0, 0, 10, 10);
I had confused the vertex positions stored in a VBO with the positions in my world.
I hope this misunderstanding helps other people understand.
Introduction to the problem:
I'm working on a game engine using the LWJGL library following this tutorial. However, I'm trying to make it so that there is a real division between the main engine and the game itself. I've therefore complicated the project a whole lot and I think this is causing some problems as the ProjectionMatrix doesn't work as explained in the video.
What am I doing:
Creating the ProjectionMatrix:
In order to create a ProjectionMatrix, I wrote a method which creates it for me:
public static Matrix4f createProjectionMatrix(float aspectRatio, float fov, float nearPlane, float farPlane) {
float y_scale = (float) ((1f / Math.tan(Math.toRadians(fov / 2f))) * aspectRatio);
float x_scale = y_scale / aspectRatio;
float frustum_length = nearPlane - farPlane;
Matrix4f projectionMatrix = new Matrix4f();
projectionMatrix.m00 = x_scale;
projectionMatrix.m11 = y_scale;
projectionMatrix.m22 = -((farPlane + nearPlane) / frustum_length);
projectionMatrix.m23 = -1;
projectionMatrix.m32 = -((2 * nearPlane * farPlane) / frustum_length);
projectionMatrix.m33 = 0;
return projectionMatrix;
}
I create the ProjectionMatrix with the following values:
aspectRatio = width/height = 640/480 = 1.33333
fov = 100
nearPlane = -0.5
farPlane = 100
This results in the following values for my ProjectionMatrix:
0.83909965 0.0 0.0 0.0
0.0 0.83909965 0.0 0.0
0.0 0.0 0.9990005 -0.9995003
0.0 0.0 -1.0 0.0
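One detail worth noticing in this output: m00 equals m11, which with the formulas above only happens when aspectRatio is exactly 1. If DisplayManager.getWidth() and getHeight() both return int (an assumption; their code isn't shown), the division in setup() truncates 640/480 to 1 before any float conversion. A minimal sketch of the pitfall:

```java
public class AspectRatioCheck {
    public static void main(String[] args) {
        int width = 640, height = 480;
        // integer division truncates before the widening to float happens
        float truncated = width / height;           // 640 / 480 == 1, then widened
        float intended  = (float) width / height;   // widen first, then divide
        System.out.println(truncated); // 1.0
        System.out.println(intended);  // 1.3333334
        // with fov = 100 and aspectRatio accidentally equal to 1,
        // x_scale == y_scale == 1 / tan(50 degrees), matching the printed 0.83909965
        float yScale = (float) ((1f / Math.tan(Math.toRadians(100f / 2f))) * truncated);
        System.out.println(yScale); // ~0.8391, equal to m00 and m11 above
    }
}
```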
Using the ProjectionMatrix:
In order to use the ProjectionMatrix I created the following shaders:
vertex.vs:
#version 150
in vec3 position;
in vec2 textureCoordinates;
out vec2 passTextureCoordinates;
uniform mat4 transformationMatrix;
uniform mat4 projectionMatrix;
uniform int useProjectionMatrix;
void main(void){
if (useProjectionMatrix == 1) {
gl_Position = projectionMatrix * transformationMatrix * vec4(position,1.0);
} else {
gl_Position = transformationMatrix * vec4(position,1.0);
}
passTextureCoordinates = textureCoordinates;
}
fragment.fs:
#version 150
in vec2 passTextureCoordinates;
out vec4 out_Color;
uniform sampler2D textureSampler;
void main(void){
out_Color = texture(textureSampler,passTextureCoordinates);
}
Finally in order to render the entity I've created the following renderer class:
public class TexturedEntityRenderer extends AbstractEntityRenderer{
private float aspectRatio;
private float fov;
private float nearPlane;
private float farPlane;
public void prepare() {
GL11.glClearColor(0,0,0,1);
GL11.glClear(GL11.GL_COLOR_BUFFER_BIT);
}
public void render (AbstractEntity entity, AbstractShader shader) {
if(shader instanceof TexturedEntityShader) {
if(entity.getModel() instanceof TexturedModel) {
TexturedModel model = (TexturedModel)entity.getModel();
GL30.glBindVertexArray(model.getVaoID());
GL20.glEnableVertexAttribArray(0);
GL20.glEnableVertexAttribArray(1);
Matrix4f transformationMatrix = MatrixMaths.createTransformationMatrix(entity.getPosition(), entity.getRx(), entity.getRy(), entity.getRz(), entity.getScale());
((TexturedEntityShader)shader).loadTransformationMatrix(transformationMatrix);
GL13.glActiveTexture(GL13.GL_TEXTURE0);
GL11.glBindTexture(GL11.GL_TEXTURE_2D, ((TexturedModel)entity.getModel()).getTexture().getTextureID());
GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, model.getVaoID());
GL11.glDrawElements(GL11.GL_TRIANGLES, model.getVertexCount(), GL11.GL_UNSIGNED_INT, 0);
GL20.glDisableVertexAttribArray(0);
GL20.glDisableVertexAttribArray(1);
GL30.glBindVertexArray(0);
} else {
ExceptionThrower.throwException(new ModelInvalidException());
}
} else {
ExceptionThrower.throwException(new ShaderIncompatableException(shader.toString()));
}
}
public void setup(AbstractShader shader) {
nearPlane = Float.parseFloat(OptionHandler.getProperty(GraphicOptions.WINDOWNEARPLANE_KEY, OptionHandler.GRAPHIC_OPTION_ID));
farPlane = Float.parseFloat(OptionHandler.getProperty(GraphicOptions.WINDOWFARPLANE_KEY, OptionHandler.GRAPHIC_OPTION_ID));
aspectRatio = DisplayManager.getWidth() / DisplayManager.getHeight();
fov = Float.parseFloat(OptionHandler.getProperty(GraphicOptions.WINDOWFOV_KEY, OptionHandler.GRAPHIC_OPTION_ID));
((TexturedEntityShader)shader).loadProjectionMatrix(MatrixMaths.createProjectionMatrix(aspectRatio, fov, nearPlane, farPlane));
((TexturedEntityShader)shader).loadUseProjectionMatrix();
}
}
The OptionHandler.getProperty() function used in setup() returns the property for a given key (like the fov or nearPlane value) from a text file. (I've checked that this works by printing all loaded options.) Also, the DisplayManager.getWidth() and DisplayManager.getHeight() functions, obviously, obtain the width and height for calculating the aspectRatio variable.
Updating the entity:
Last but not least, I'm updating the entity using a class called EntityModifier which looks like this:
public class EntityModifier {
private Vector3f dposition;
private float drx;
private float dry;
private float drz;
private float dscale;
public EntityModifier(Vector3f dposition, float drx, float dry, float drz, float dscale) {
this.dposition = dposition;
this.drx = drx;
this.dry = dry;
this.drz = drz;
this.dscale = dscale;
}
public Vector3f getDposition() {
return dposition;
}
public float getDrx() {
return drx;
}
public float getDry() {
return dry;
}
public float getDrz() {
return drz;
}
public float getDscale() {
return dscale;
}
@Override
public String toString() {
return "BasicEntityModifier [dposition=" + dposition + ", drx=" + drx + ", dry=" + dry + ", drz=" + drz + ", dscale=" + dscale + "]";
}
}
Each entity I create has one of these classes, and I can call an update method which adds the values to the entity's transformation:
public void update() {
increasePosition(modifier.getDposition().getX(),modifier.getDposition().getY(),modifier.getDposition().getZ());
increaseRotation(modifier.getDrx(), modifier.getDry(), modifier.getDrz());
increaseScale(modifier.getDscale());
}
private void increasePosition(float dx, float dy, float dz) {
position.x += dx;
position.y += dy;
position.z += dz;
}
private void increaseRotation(float drx, float dry, float drz) {
rx += drx;
ry += dry;
rz += drz;
}
private void increaseScale(float dscale) {
scale += dscale;
}
The problem:
I'm able to change the x and y position of the entity normally, but whenever I change the z position using an EntityModifier, the entity loads and then disappears from the screen. It stays visible for about 60 frames before disappearing, and changing dz's value doesn't seem to affect the speed at which it disappears in any way (it does, see EDIT 2). Also, the entity doesn't show the scaling effect seen in the tutorial here (same link but with timestamp).
Changing the dz value to 0 stops the entity from disappearing.
What is going on here? How can I fix this?
EDIT:
It was pointed out in the comments that the nearPlane value should be positive, so I changed it to 0.5, but I still get the same results. I also changed float frustum_length = nearPlane - farPlane; to float frustum_length = farPlane - nearPlane;, which was also suggested there (this did not solve the problem either).
EDIT 2:
After some more investigation I found a few interesting things:
1. Changing the speed at which the z value changes does affect how long it takes for the entity to disappear. After finding this out, I timed a few different dz values (with dz being the change in z per frame) and got this:
`for dz = -0.002 -> frames before dissapear: 515 frames.`
`for dz = -0.001 -> frames before dissapear: 1024 frames.`
`for dz = 0.02 -> frames before dissapear: 63 frames.`
If we take reaction time into account (I made the program output the total amount of rendered frames on closure and closed it as fast as possible when the entity disappeared), we can calculate the values of z at which the entity disappears.
-0.002 * 515 ≈ -1
-0.001 * 1024 ≈ -1
0.02 * 63 ≈ 1
This probably has to do with the way the coordinate system works in OpenGL, but it still doesn't explain why the entity isn't becoming smaller as it does in the tutorial mentioned above.
2. Removing the code which adds the ProjectionMatrix to the renderer class does not change the behavior. This means the error is elsewhere.
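The disappearance at z ≈ ±1 is consistent with OpenGL's clip volume: a vertex is kept only while |x|, |y| and |z| are at most |w| in clip space, and without a projection matrix w stays at the 1.0 supplied in vec4(position, 1.0), so the visible z range is exactly [-1, 1]. A perspective projection instead feeds z-dependent values into w, which is also what produces the "zoom" effect. A tiny sketch of the clip test (my own framing, not LWJGL code):

```java
public class ClipSketch {
    // The clip-volume condition OpenGL applies per vertex in clip space:
    // the vertex is inside while -|w| <= z <= |w| (same for x and y).
    static boolean insideClipVolume(float z, float w) {
        return Math.abs(z) <= Math.abs(w);
    }

    public static void main(String[] args) {
        // without a projection matrix, w stays 1.0
        System.out.println(insideClipVolume(0.9f, 1f));   // true: still visible
        System.out.println(insideClipVolume(-1.03f, 1f)); // false: clipped, entity vanishes
        // after ~515 frames of dz = -0.002, z has crossed -1 and the entity is clipped
        System.out.println(insideClipVolume(-0.002f * 515, 1f)); // false
    }
}
```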
New Problem:
I now think there is no problem with the ProjectionMatrix (or at least no problem causing this behavior); rather, the problem is the entity's position passing 1 or -1 on the z axis. However, this still doesn't explain why there is no "zoom effect". Therefore I don't think restricting the z movement to between -1 and 1 will solve the problem; in fact, I think it will work against us, as the entity should not be rendered anyway once it is totally "zoomed" out or in.
What can cause this problem if it isn't the ProjectionMatrix?
EDIT 3:
Someone on reddit pointed out that the following classes might also be of intrest for solving the problem:
AbstractShader: contains basic shader functionality common for all shader classes.
TexturedEntityShader: used to render a texturedEntity (shown above)
DisplayManager: class which handles rendering.
EDIT 4:
After some more discussion about this problem on reddit, we found an issue and were able to fix it: the value for useProjectionMatrix was never loaded, because the shader was stopped when I tried to load it. Changing the loadUseProjectionMatrix() method to:
public void loadUseProjectionMatrix() {
super.start();
super.loadBoolean(useProjectionMatrixLocation, useProjectionMatrix);
System.out.println("loaded useProjectionMatrix: " + useProjectionMatrix + "\n\n");
super.stop();
}
seems to partially solve the problem: the projectionMatrix can now be used inside the shader (before, it was never used, because useProjectionMatrix was always 0; we had not loaded a value for it).
However, this did not fix the entire problem, as I think there is still an issue with the projectionMatrix. The entity does not render at all when using the projectionMatrix, but renders fine without it. I've tried hardcoding the values of the projectionMatrix using the following shader:
#version 150
in vec3 position;
in vec2 textureCoordinates;
out vec2 passTextureCoordinates;
uniform mat4 transformationMatrix;
uniform mat4 projectionMatrix;
uniform int useProjectionMatrix;
mat4 testMat;
void main(void){
testMat[0] = vec4(0.83909965, 0, 0, 0);
testMat[1] = vec4(0, 0.83909965, 0, 0);
testMat[2] = vec4(0, 0, 0.9990005, -0.9995003);
testMat[3] = vec4(0, 0, -1, 0);
if (true) {
gl_Position = testMat * transformationMatrix * vec4(position,1.0);
} else {
gl_Position = transformationMatrix * vec4(position,1.0);
}
passTextureCoordinates = textureCoordinates;
}
However, that does not seem to work. Are these values OK?
For whoever wants to see them, here are the 2 posts I created on reddit about this problem: post 1, post 2.
Before showing my code I want to explain the situation a little. I am making a FlappyBird clone just for practice, using LWJGL 2. Right now I am able to create a textured quad which can move ONLY in the x and y directions and rotate around all axes. I was about to set up the projectionMatrix so I can also have 3D movement and a working Z axis.
I followed a tutorial on YouTube, doing the exact same things, but it somehow does not work for me at all.
When trying to move the object without the projectionMatrix, it vanishes as soon as Z > 1 or Z < -1 for some reason, although nothing should happen. As soon as I add the projectionMatrix inside the vertex shader, it vanishes for every coordinate I give it to be rendered at... it just disappears into the void.
Here is all the relevant code:
Model model = loader.loadToVAO(vertices, textureCoords, indices);
ModelTexture texture = new ModelTexture(loader.loadTexture("Flappy2"));
TexturedModel texturedModel = new TexturedModel(model, texture);
Entity entity = new Entity(texturedModel, new Vector3f(0,0,0),0,0,0,1 );
float y = 0;
float x = 0;
//Main loop which renders stuff
while(!Display.isCloseRequested()) {
while(Keyboard.next()) {
if (Keyboard.getEventKey() == Keyboard.KEY_SPACE) {
if (Keyboard.getEventKeyState()) {
y = 0.1f;
}
}
}
y -= 0.005f;
entity.increasePosition(x, y, 0);
entity.increaseRotation(0, 0,0 );
renderer.prepare();
shader.start();
renderer.render(entity, shader);
shader.stop();
DisplayManager.updateDisplay();
}
shader.cleanUp();
loader.cleanUp();
DisplayManager.closeDisplay();
}
This is the main loop, nothing special.
#version 400 core
in vec3 position;
in vec2 textureCoords;
out vec2 pass_textureCoords;
out vec3 color;
uniform mat4 transformationMatrix;
uniform mat4 projectionMatrix;
void main(void){
gl_Position = projectionMatrix * transformationMatrix * vec4(position,1.0);
pass_textureCoords = textureCoords;
color = vec3(position.x +0.5,0.0,position.y+0.5);
}
That was the vertex Shader.
private void createProjectionMatrix() {
float aspectRatio = (float) Display.getDisplayMode().getWidth() / (float) Display.getDisplayMode().getHeight();
float y_scale = (float)(1f / Math.tan(Math.toRadians(FOV / 2f))) * aspectRatio;
float x_scale = y_scale / aspectRatio;
float frustum_length = FAR_PLANE - NEAR_PLANE;
projectionMatrix = new Matrix4f();
projectionMatrix.m00 = x_scale;
projectionMatrix.m11 = y_scale;
projectionMatrix.m22 = -((FAR_PLANE + NEAR_PLANE) / frustum_length);
projectionMatrix.m23 = -1;
projectionMatrix.m32 = -((2 * NEAR_PLANE * FAR_PLANE) / frustum_length);
projectionMatrix.m33 = 0;
}
Here I set up the projectionMatrix in the Rendering class.
As I said, most of it is copied from a YouTube tutorial, as I am new to LWJGL 2. So if it works for him, why does it not work for me?
I tried copying the entire tutorial code instead of just typing it myself, and that somehow fixed my problem.
I probably had switched variable names somewhere, or made similar small errors, which prevented the projection matrix from working.
No need to comment anymore :) Ignore this post
I recently started coding with OpenGL ES 2.0 and ran into a (for me) quite challenging problem. This is my first try at streaming VBOs dynamically (at least that's what I think I am doing). My application should draw 2 triangles in a specific color, but instead they are just black. I think it's possible that I mixed up some GL commands, but I can't find the problem.
Here's a snippet from my GLRenderer class:
@Override
public void onSurfaceCreated(GL10 unused, EGLConfig config) {
/* Set the background clear color */
GLES20.glClearColor(0.8f, 0.6f, 0.4f, 1.0f);
{...}
viewMatrix = camera.getMatrix();
/* Create and compile shaders */
int vertexShaderHandle = loadShader(vertexShader, GLES20.GL_VERTEX_SHADER);
int fragmentShaderHandle = loadShader(fragmentShader, GLES20.GL_FRAGMENT_SHADER);
/* Create and link program */
programHandle = loadProgram(vertexShaderHandle, fragmentShaderHandle);
/* Set references for drawing input */
mVPMatrixHandle = GLES20.glGetUniformLocation(programHandle, "uMVPMatrix");
positionHandle = GLES20.glGetAttribLocation(programHandle, "vPosition");
colorHandle = GLES20.glGetUniformLocation(programHandle, "vColor");
checkGlError("glGetUniformLocation");
/* Create 2 Triangles for testing purposes. */
final float[] triangle1Vertex = { 0.0f, 0.5f, 0.0f,
-0.5f, -0.5f, 0.0f,
0.5f, -0.5f, 0.0f,};
final float[] triangle2Vertex = { -1.0f, 1.0f, 0.0f,
-1.0f, -0.5f, 0.0f,
-0.5f, 1.0f, 0.0f};
/* Color */
final float[] color = { 0.63671875f, 0.76953125f, 0.22265625f, 0.0f};
/* Init triangles */
Triangle triangle1 = new Triangle(triangle1Vertex, color);
Triangle triangle2 = new Triangle(triangle2Vertex, color);
/* Add triangles to be drawn */
TriangleCollection.add(triangle1);
TriangleCollection.add(triangle2);
/* Create buffer objects in GPU, 2 buffers are needed */
final int buffers[] = new int[2];
GLES20.glGenBuffers(2, buffers, 0); //Generate GPUSide Buffers
/* Allocate GPU memory space for vertices */
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, buffers[0]);
GLES20.glBufferData(
GLES20.GL_ARRAY_BUFFER,
TriangleCollection.MAX_NUMBER_OF_VERTICES * TriangleCollection.BYTES_PER_FLOAT,
TriangleCollection.publishVerticesBuffer(),
GLES20.GL_STREAM_DRAW);
checkGlError("glBufferData");
/* Allocate GPU memory space for color data */
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, buffers[1]);
GLES20.glBufferData(
GLES20.GL_ARRAY_BUFFER,
TriangleCollection.NUMBER_OF_COLOR_ELEMENTS * TriangleCollection.BYTES_PER_FLOAT,
TriangleCollection.publishColorBuffer(),
GLES20.GL_STREAM_DRAW);
checkGlError("glBufferData");
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, 0);
/* Reference the GPU Buffers */
triangleVerticesIdx = buffers[0];
triangleColorsIdx = buffers[1];
GLES20.glFlush();
startTime = System.nanoTime();
}
@Override
public void onDrawFrame(GL10 unused) {
FloatBuffer vertices = TriangleCollection.publishVerticesBuffer();
FloatBuffer colors = TriangleCollection.publishColorBuffer();
/* Upload triangle data */
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, triangleVerticesIdx);
GLES20.glBufferSubData(GLES20.GL_ARRAY_BUFFER, 0, vertices.capacity() * Triangle.BYTES_PER_FLOAT, vertices);
checkGlError("glBufferSubData");
/* Upload color data */
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, triangleColorsIdx);
GLES20.glBufferSubData(GLES20.GL_ARRAY_BUFFER, 0, colors.capacity() * Triangle.BYTES_PER_FLOAT, colors);
checkGlError("glBufferSubData");
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, 0);
/* Clear Screen */
GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
GLES20.glUseProgram(programHandle);
/*Matrix calculations */
Matrix.setIdentityM(modelMatrix, 0);
Matrix.multiplyMM(mvpMatrix, 0, projectionMatrix, 0, viewMatrix, 0);
GLES20.glUniformMatrix4fv(mVPMatrixHandle, 1, false, mvpMatrix, 0);
checkGlError("glUniformMatrix4fv");
/* Pass the position information */
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, triangleVerticesIdx);
checkGlError("glBindBuffer");
GLES20.glEnableVertexAttribArray(positionHandle);
checkGlError("glEnableVertexAttribArray");
GLES20.glVertexAttribPointer(positionHandle, Triangle.COORDINATES_PER_VERTEX, GLES20.GL_FLOAT, false, 0, 0);
checkGlError("glVertexAttribPointer");
/* Pass the color information */
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, triangleColorsIdx);
checkGlError("glBindBuffer");
GLES20.glEnableVertexAttribArray(colorHandle);
checkGlError("glEnableVertexAttribArray");
GLES20.glVertexAttribPointer(colorHandle, TriangleCollection.COLOR_SIZE_FLOAT, GLES20.GL_FLOAT, false, 0, 0);
checkGlError("glVertexAttribPointer");
/* Clear currently bound buffer */
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, 0);
checkGlError("glBindBuffer");
//Draw
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, TriangleCollection.MAX_NUMBER_OF_VERTICES);
checkGlError("glDrawArrays");
}
This code runs without errors, and I have already checked the FloatBuffers in debug mode; they contain the required information.
I would also appreciate any feedback on the general concept of my drawing/rendering pipeline. I'm not sure whether this is a good solution, but at least I get 30 FPS at 8000 triangles on my Nexus 5.
Edit 1
After some testing I got the following results:
1. According to the log I'm using EGL 1.4. I do not intend to use OpenGL ES 3.0 for now (provided that this is possible).
2. Replacing the vColor element of the fragment shader with a constant value works. The triangles are red:
final String fragmentShader =
"precision mediump float;" +
"uniform vec4 vColor;" +
"void main() {" +
" gl_FragColor = vec4(1.0,0.0,0.0,1.0);" +
"}";
When using the normal non-static fragment shader, removing this part of the code changes absolutely nothing:
/* Pass the color information */
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, triangleColorsIdx);
checkGlError("glBindBuffer");
GLES20.glEnableVertexAttribArray(colorHandle);
checkGlError("glEnableVertexAttribArray");
GLES20.glVertexAttribPointer(colorHandle, TriangleCollection.COLOR_SIZE_FLOAT, GLES20.GL_FLOAT, false, 0, 0);
checkGlError("glVertexAttribPointer");
3. Removing colorHandle = GLES20.glGetUniformLocation(programHandle, "vColor"); from onSurfaceCreated() works as usual; no triangle is drawn.
Edit 2
I still can't find my mistake. While using glGetUniformLocation worked for one triangle, it doesn't work for many. I stripped my project down to a simple test application so I can show the complete code:
public class MainActivity extends Activity {
private MySurfaceView mySurfaceView;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
/* Create SurfaceView and add it to Activity */
mySurfaceView = new MySurfaceView(this);
setContentView(mySurfaceView);
}
@Override
protected void onPause() {
super.onPause();
mySurfaceView.onPause();
}
@Override
protected void onResume() {
super.onResume();
mySurfaceView.onResume();
}
}
public class MySurfaceView extends GLSurfaceView {
private final GLRenderer renderer;
/**
* Creates the SurfaceView
* @param context Application context
*/
public MySurfaceView(Context context) {
super(context);
setEGLConfigChooser(8, 8, 8, 8, 16, 0);
/* OpenGl Version GLES 2.0 min */
setEGLContextClientVersion(2);
/* Add Renderer for drawing */
renderer = new GLRenderer();
setRenderer(renderer);
}
}
public class GLRenderer implements GLSurfaceView.Renderer {
/* Frame Counter */
private int nbFrame = 0;
private long startTime;
/* Vertex Shader */
final String vertexShader =
// This matrix member variable provides a hook to manipulate
// the coordinates of the objects that use this vertex shader
"uniform mat4 uMVPMatrix;" +
"attribute vec4 vPosition;" +
"void main() {" +
// the matrix must be included as a modifier of gl_Position
// Note that the uMVPMatrix factor *must be first* in order
// for the matrix multiplication product to be correct.
" gl_Position = uMVPMatrix * vPosition;" +
"}";
/* Fragment Shader*/
final String fragmentShader =
"precision mediump float;" +
"uniform vec4 vColor;" +
"void main() {" +
" gl_FragColor = vColor;" +
"}";
/* Reference for the program */
private int programHandle;
/* References to pass data into shader */
private int mVPMatrixHandle, positionHandle, colorHandle;
/* Projection matrix, used for projection 3D scene to 2D viewport. */
private float[] projectionMatrix = new float[16];
/* Model matrix used for moving Models around */
private float[] modelMatrix = new float[16];
/* Combined Matrix */
private float[] mvpMatrix = new float[16];
/* Matrix of the camera position and perspective */
private float[] viewMatrix;
/* Reference to the buffer of the triangle vertices in the GPU DDR */
private int triangleVerticesIdx;
/* Reference to the buffer of the triangle colors in the GPU DDR */
private int triangleColorsIdx;
/**
* Load shader
*/
static int loadShader(final String shader, int type) {
int shaderHandle = GLES20.glCreateShader(type);
if (shaderHandle != 0) {
GLES20.glShaderSource(shaderHandle, shader);
checkGlError("glShaderSource");
GLES20.glCompileShader(shaderHandle);
checkGlError("glCompileShader");
final int[] compileStatus = new int[1];
GLES20.glGetShaderiv(shaderHandle, GLES20.GL_COMPILE_STATUS, compileStatus, 0);
if (compileStatus[0] == 0) {
GLES20.glDeleteShader(shaderHandle);
shaderHandle = 0;
}
}
if (shaderHandle == 0) {
throw new RuntimeException("Error while creating shader");
}
return shaderHandle;
}
/**
* Loads an OpenGL ES 2.0 program with a vertex and a fragment shader.
* @param vertexShader handle of the compiled vertex shader
* @param fragmentShader handle of the compiled fragment shader
* @return handle of the linked program
*/
public static int loadProgram(int vertexShader, int fragmentShader) {
int programHandle;
/* Load program */
programHandle = GLES20.glCreateProgram();
if (programHandle != 0) {
/* Bind shaders to program */
GLES20.glAttachShader(programHandle, vertexShader);
checkGlError("glAttachShader");
GLES20.glAttachShader(programHandle, fragmentShader);
checkGlError("glAttachShader");
/* Bind Attributes */
GLES20.glBindAttribLocation(programHandle, 0, "vPosition");
checkGlError("glBindAttribLocation");
GLES20.glBindAttribLocation(programHandle, 1, "vColor");
checkGlError("glBindAttribLocation");
/* Link shaders */
GLES20.glLinkProgram(programHandle);
/* Get link status... */
final int[] linkStatus = new int[1];
GLES20.glGetProgramiv(programHandle, GLES20.GL_LINK_STATUS, linkStatus, 0);
if (linkStatus[0] == 0) {
GLES20.glDeleteProgram(programHandle);
programHandle = 0;
}
}
if (programHandle == 0) {
throw new RuntimeException("Error creating program.");
}
return programHandle;
}
@Override
public void onSurfaceCreated(GL10 unused, EGLConfig config) {
/* Set the background clear color */
GLES20.glClearColor(0.8f, 0.6f, 0.4f, 1.0f);
/* Create Camera and define values -> calculate Matrix */
Camera camera = new Camera();
camera.setPosition(0.0f, 0.0f, 1.5f);
camera.setPerspective(0.0f, 0.0f, -5.0f);
camera.setUpVector(0.0f, 1.0f, 0.0f);
camera.setMatrix();
viewMatrix = camera.getMatrix();
/* Create and compile shaders */
int vertexShaderHandle = loadShader(vertexShader, GLES20.GL_VERTEX_SHADER);
int fragmentShaderHandle = loadShader(fragmentShader, GLES20.GL_FRAGMENT_SHADER);
/* Create and link program */
programHandle = loadProgram(vertexShaderHandle, fragmentShaderHandle);
/* Set references for drawing input */
mVPMatrixHandle = GLES20.glGetUniformLocation(programHandle, "uMVPMatrix");
positionHandle = GLES20.glGetAttribLocation(programHandle, "vPosition");
colorHandle = GLES20.glGetUniformLocation(programHandle, "vColor");
checkGlError("glGetUniformLocation");
/* Create 2 Triangles for testing purposes. */
final float[] triangle1Vertex = { 0.0f, 0.5f, 0.0f,
-0.5f, -0.5f, 0.0f,
0.5f, -0.5f, 0.0f,};
/* Color */
final float[] color = { 0.0f, 0.76953125f, 0.22265625f, 1.0f,
0.0f, 0.76953125f, 0.22265625f, 1.0f,
0.0f, 0.76953125f, 0.22265625f, 1.0f};
/* Create Vertex Buffer */
ByteBuffer bb = ByteBuffer.allocateDirect(triangle1Vertex.length*4);
bb.order(ByteOrder.nativeOrder());
FloatBuffer vert1 = bb.asFloatBuffer();
vert1.put(triangle1Vertex);
vert1.position(0);
/* Create Color Buffer */
ByteBuffer bb1 = ByteBuffer.allocateDirect(color.length*4);
bb1.order(ByteOrder.nativeOrder());
FloatBuffer color1 = bb1.asFloatBuffer();
color1.put(color);
color1.position(0);
/* Create buffer objects in GPU, 2 buffers are needed */
final int buffers[] = new int[2];
GLES20.glGenBuffers(2, buffers, 0); //Generate GPUSide Buffers
/* Allocate GPU memory space for vertices */
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, buffers[0]);
GLES20.glBufferData(
GLES20.GL_ARRAY_BUFFER,
1*9*4,// 9 floats for triangle and 4 bytes per float
vert1,
GLES20.GL_STATIC_DRAW);
checkGlError("glBufferData");
/* Allocate GPU memory space for color data */
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, buffers[1]);
GLES20.glBufferData(
GLES20.GL_ARRAY_BUFFER,
1*3*4*4, // 3 vertices, 4 floats per color, 4 bytes per float
color1,
GLES20.GL_STATIC_DRAW);
checkGlError("glBufferData");
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, 0);
/* Reference the GPU Buffers */
triangleVerticesIdx = buffers[0];
triangleColorsIdx = buffers[1];
GLES20.glFlush();
startTime = System.nanoTime();
}
/**
* Not needed. Device must be in landscape mode all the time.
*
* @param unused -
*/
@Override
public void onSurfaceChanged(GL10 unused, int width, int height) {
/* Define Viewport */
GLES20.glViewport(0, 0, width, height);
/* Create perspective projection */
final float ratio = (float) width / height;
final float left = -ratio;
final float right = ratio;
final float bottom = -1.0f;
final float top = 1.0f;
final float near = 1.0f;
final float far = 10.0f;
Matrix.frustumM(projectionMatrix, 0, left, right, bottom, top, near, far);
}
#Override
public void onDrawFrame(GL10 unused) {
/* Measure FPS */
nbFrame++;
if(System.nanoTime()-startTime >= 1000000000) {
Log.d("FPS", Integer.toString(nbFrame));
nbFrame = 0;
startTime = System.nanoTime();
}
/* Clear Screen */
GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
GLES20.glUseProgram(programHandle);
/*Matrix calculations */
Matrix.setIdentityM(modelMatrix, 0);
Matrix.multiplyMM(mvpMatrix, 0, projectionMatrix, 0, viewMatrix, 0);
GLES20.glUniformMatrix4fv(mVPMatrixHandle, 1, false, mvpMatrix, 0);
checkGlError("glUniformMatrix4fv");
/* Pass the position information */
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, triangleVerticesIdx);
checkGlError("glBindBuffer");
GLES20.glEnableVertexAttribArray(positionHandle);
checkGlError("glEnableVertexAttribArray");
GLES20.glVertexAttribPointer(positionHandle, 3, GLES20.GL_FLOAT, false, 0, 0);
checkGlError("glVertexAttribPointer");
/* Pass the color information */
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, triangleColorsIdx);
checkGlError("glBindBuffer");
GLES20.glEnableVertexAttribArray(colorHandle);
checkGlError("glEnableVertexAttribArray");
GLES20.glVertexAttribPointer(colorHandle, 4, GLES20.GL_FLOAT, false, 0, 0);
checkGlError("glVertexAttribPointer");
/* Clear currently bound buffer */
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, 0);
checkGlError("glBindBuffer");
//Draw
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, 1*9);
checkGlError("glDrawArrays");
}
/**
* Utility method for debugging OpenGL calls. Provide the name of the call
* just after making it:
*
* <pre>
* mColorHandle = GLES20.glGetUniformLocation(mProgram, "vColor");
* MyGLRenderer.checkGlError("glGetUniformLocation");</pre>
*
* If the operation is not successful, the check throws an error.
*
* @param glOperation - Name of the OpenGL call to check.
*/
public static void checkGlError(String glOperation) {
int error;
while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
Log.e("OPEN_GL", glOperation + ": glError " + error);
throw new RuntimeException(glOperation + ": glError " + error);
}
}
}
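The direct-buffer preparation in onSurfaceCreated is plain JVM code and can be verified in isolation: OpenGL ES needs direct, native-order buffers, glBufferData sizes are in bytes, and the buffer must be rewound before upload. A minimal sketch (the helper name is illustrative; the sizes match the 3-vertex triangle above):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class BufferSetupSketch {
    // glBufferData takes byte counts, so float counts must be scaled by 4.
    static final int BYTES_PER_FLOAT = 4;

    // Mirrors the pattern in onSurfaceCreated: direct allocation, native byte
    // order, and a rewind so GL reads from the first element.
    static FloatBuffer toDirectFloatBuffer(float[] data) {
        FloatBuffer fb = ByteBuffer.allocateDirect(data.length * BYTES_PER_FLOAT)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        fb.put(data);
        fb.position(0);
        return fb;
    }

    public static void main(String[] args) {
        float[] triangle = { 0.0f,  0.5f, 0.0f,
                            -0.5f, -0.5f, 0.0f,
                             0.5f, -0.5f, 0.0f };
        FloatBuffer vert = toDirectFloatBuffer(triangle);
        // 9 floats * 4 bytes = 36 bytes, the size that would go to glBufferData
        System.out.println(vert.capacity() * BYTES_PER_FLOAT); // 36
        System.out.println(vert.position()); // 0
    }
}
```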
Solution
Finally I was able to solve the problem (with the help of your clues). For everyone with similar problems: check your shaders, and don't just copy & paste them like I did. This also helped me a lot. Here are my now working shaders:
final String vertexShader =
"uniform mat4 uMVPMatrix; \n"
+ "attribute vec4 aPosition; \n"
+ "attribute vec4 aColor; \n"
+ "varying vec4 vColor; \n"
+ "void main() \n"
+ "{ \n"
+ " vColor = aColor; \n"
+ " gl_Position = uMVPMatrix * aPosition; \n"
+ "} \n";
/* Fragment Shader*/
final String fragmentShader =
"precision mediump float;" +
"varying vec4 vColor;" +
"void main() {" +
" gl_FragColor = vColor;" +
"}";
I'd have to see your fragment shader to know for certain, but from here it looks like you're setting the alpha component to 0 in your color array, which means your colors won't show up. Set the alpha component to 1.
You need to check that your fragment shader is compiling correctly. According to the GLSL ES specification (section 3.3, page 9), shaders should begin with a `#version` directive indicating which language version you're using. Unless you're building for OpenGL ES 1.0 (which seems unlikely given your liberal use of vertex buffer objects), that directive should be present in your shader code.
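One way to guard against a missing `#version` line is to patch the source on the Java side before it ever reaches glCompileShader. This sketch uses a hypothetical helper that is not part of the original code:

```java
public class ShaderVersionCheck {

    // Hypothetical guard: ensure GLSL source starts with a #version directive,
    // prepending a caller-supplied default (e.g. "100" for GLSL ES 1.00) if not.
    static String ensureVersionDirective(String source, String defaultVersion) {
        if (source.trim().startsWith("#version")) {
            return source;
        }
        return "#version " + defaultVersion + "\n" + source;
    }

    public static void main(String[] args) {
        String patched = ensureVersionDirective("void main() { }", "100");
        System.out.println(patched.startsWith("#version 100")); // true
        // Sources that already declare a version are left untouched.
        String kept = ensureVersionDirective("#version 300 es\nvoid main() { }", "100");
        System.out.println(kept.startsWith("#version 300 es")); // true
    }
}
```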
I am trying to make a game/demo with Java/LWJGL and I am having trouble with the first-person camera:
I use the WASD keys to move the eye vector around, but strange things are happening:
When I move the eye away from or towards me, the right (respectively left) part of the view gets blocked from my view.
When I move the eye to my right or left, the top (respectively bottom) part of the view gets blocked from my view.
I have implemented mouse movement (to look around) with success, the relevant pieces of code now:
(Note: The Matrix4f class works in column-major order)
Matrix4f viewMatrix = new Matrix4f();
private void checkKeys() {
if (isKeyCurrentlyDown(Keyboard.KEY_W)) {
eye.updateTranslate(1.0f, 0.0f, 0.0f);
updateView();
}
else if (isKeyCurrentlyDown(Keyboard.KEY_S)) {
eye.updateTranslate(-1.0f, 0.0f, 0.0f);
updateView();
}
else if (isKeyCurrentlyDown(Keyboard.KEY_A)) {
eye.updateTranslate(0.0f, -1.0f, 0.0f);
updateView();
}
else if (isKeyCurrentlyDown(Keyboard.KEY_D)) {
eye.updateTranslate(0.0f, 1.0f, 0.0f);
updateView();
}
}
private void updateView() {
System.out.println("eye = " + eye);
viewMatrix.identity().viewFPS(eye, roll, yaw, pitch);
System.out.println("viewMatrix = " + viewMatrix);
Uniforms.setUniformMatrix4(UNIFORM_VIEW_MATRIX, false, viewMatrix);
}
@Override
protected void render(final double msDelta) {
super.render(msDelta);
glClearColor(0.0f, 0.25f, 0.0f, 1.0f);
glClearDepthf(1f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
program.use();
for (int i = 0; i < 24; i++) {
float fVar = i + currentTime / 1000f * 0.3f;
modelviewMatrix.identity()
.translate(0.0f, 0.0f, -8.0f)
.rotate(currentTime / 1000f * 45.0f, 0.0f, 1.0f, 0.0f)
.rotate(currentTime / 1000f * 21.0f, 1.0f, 0.0f, 0.0f)
.translate(
(float)Math.sin(2.1f * fVar) * 2.0f,
(float)Math.cos(1.7f * fVar) * 2.0f,
(float)Math.sin(1.3f * fVar) * (float)Math.cos(1.5f * fVar) * 2.0f
);
Uniforms.setUniformMatrix4(UNIFORM_MODEL_MATRIX, false, modelviewMatrix.writeToFloatBuffer(modelViewMatrixBuffer));
program.drawElements(GL_TRIANGLES, 36, GL_UNSIGNED_INT, 0);
}
}
#version 440 core
layout(location = 0) in vec4 position;
out VS_OUT {
vec4 color;
} vs_out;
layout(location = 0) uniform mat4 model_matrix;
layout(location = 1) uniform mat4 view_matrix;
layout(location = 2) uniform mat4 proj_matrix;
void main() {
gl_Position = proj_matrix * view_matrix * model_matrix * position;
vs_out.color = position * 2.0 + vec4(0.5, 0.5, 0.5, 0.0);
}
public enum UniformLocation {
UNIFORM_MODEL_MATRIX(0),
UNIFORM_VIEW_MATRIX(1),
UNIFORM_PROJECTION_MATRIX(2)
;
private final int location;
private UniformLocation(final int location) {
this.location = location;
}
public int getLocation() {
return this.location;
}
}
//Column-major order
public Matrix4f viewFPS(final Vector3f eye, final float rollAngle, final float yawAngle, final float pitchAngle) {
//roll = rolling your head, Q&E
//yaw = looking left/right, mouseY
//pitch = looking up/down, mouseX
float sinRoll = (float)Math.sin(Math.toRadians(rollAngle));
float cosRoll = (float)Math.cos(Math.toRadians(rollAngle));
float sinYaw = (float)Math.sin(Math.toRadians(yawAngle));
float cosYaw = (float)Math.cos(Math.toRadians(yawAngle));
float sinPitch = (float)Math.sin(Math.toRadians(pitchAngle));
float cosPitch = (float)Math.cos(Math.toRadians(pitchAngle));
Vector3f xAxis = new Vector3f(
cosYaw * cosPitch,
sinYaw * cosPitch,
-sinPitch
);
Vector3f yAxis = new Vector3f(
cosYaw * sinPitch * sinRoll - sinYaw * cosRoll,
sinYaw * sinPitch * sinRoll + cosYaw * cosRoll,
cosPitch * sinRoll
);
Vector3f zAxis = new Vector3f(
cosYaw * sinPitch * cosRoll + sinYaw * sinRoll,
sinYaw * sinPitch * cosRoll - cosYaw * sinRoll,
cosPitch * cosRoll
);
return multiply(
xAxis.getX(), xAxis.getY(), xAxis.getZ(), -xAxis.dot(eye), //X column
yAxis.getX(), yAxis.getY(), yAxis.getZ(), -yAxis.dot(eye), //Y column
zAxis.getX(), zAxis.getY(), zAxis.getZ(), -zAxis.dot(eye), //Z column
0.0f, 0.0f, 0.0f, 1.0f //W column
);
}
I really have no clue why the camera is behaving so strangely, or what this behaviour even means.
How can parts of the picture suddenly stop being visible?
Update: it might have to do with the viewFPS() method, as the translations there look slightly odd; could someone confirm?
I noticed that in viewFPS you combine the rotation and translation in one matrix multiplication.
I am not familiar with the way you build the matrix, but I can tell you what works for me:
Create 3 matrices, one each for the X, Y and Z rotation values.
Multiply those matrices in the order you want (I use Z, then Y, then X).
Multiply the result with your translation matrix (i.e. the position of your eye).
The resulting matrix is your view matrix.
This might be less efficient than the method you are using, but it is better to get it working first and then optimize from there. Splitting up the rotation axes also gives you the benefit of being able to debug one axis at a time.
The rotation matrices I use are the standard axis-rotation matrices, where θ (theta), φ (phi) and ψ (psi) are the rotations around the X, Y and Z axes (http://www.fastgraph.com/makegames/3drotation/).
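The recipe above can be sketched in plain Java with column-major float[16] matrices. Every class and method name here is illustrative, not the Matrix4f API from the question, and angles are taken in radians:

```java
public class ViewMatrixSketch {
    // Column-major float[16]: element (row, col) lives at m[col * 4 + row].

    static float[] identity() {
        float[] m = new float[16];
        m[0] = m[5] = m[10] = m[15] = 1f;
        return m;
    }

    static float[] rotationX(float a) {
        float[] m = identity();
        float c = (float) Math.cos(a), s = (float) Math.sin(a);
        m[5] = c;  m[6] = s;   // Y column
        m[9] = -s; m[10] = c;  // Z column
        return m;
    }

    static float[] rotationY(float a) {
        float[] m = identity();
        float c = (float) Math.cos(a), s = (float) Math.sin(a);
        m[0] = c;  m[2] = -s;
        m[8] = s;  m[10] = c;
        return m;
    }

    static float[] rotationZ(float a) {
        float[] m = identity();
        float c = (float) Math.cos(a), s = (float) Math.sin(a);
        m[0] = c;  m[1] = s;
        m[4] = -s; m[5] = c;
        return m;
    }

    static float[] translation(float x, float y, float z) {
        float[] m = identity();
        m[12] = x; m[13] = y; m[14] = z;
        return m;
    }

    // result = a * b, both column-major
    static float[] multiply(float[] a, float[] b) {
        float[] r = new float[16];
        for (int col = 0; col < 4; col++) {
            for (int row = 0; row < 4; row++) {
                float sum = 0f;
                for (int k = 0; k < 4; k++) {
                    sum += a[k * 4 + row] * b[col * 4 + k];
                }
                r[col * 4 + row] = sum;
            }
        }
        return r;
    }

    // View matrix per the answer's recipe: rotate (Z, then Y, then X),
    // then translate the world by -eye.
    static float[] view(float pitch, float yaw, float roll,
                        float ex, float ey, float ez) {
        float[] rot = multiply(multiply(rotationZ(roll), rotationY(yaw)),
                               rotationX(pitch));
        return multiply(rot, translation(-ex, -ey, -ez));
    }
}
```

With all rotation angles at zero this reduces to a pure translation by -eye, which makes a convenient first check when debugging one axis at a time.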