I just started using libGDX and want to render some 2D shapes using a Mesh and a custom ShaderProgram. I'm experienced in OpenGL, but I don't see my mistake here; maybe someone can help me.
The shader is very basic. Vertex:
attribute vec2 v;
uniform mat4 o;
void main(){
    gl_Position = vec4(o*vec3(v, 1.0), 1.0);
}
fragment:
#ifdef GL_ES
precision mediumhp float;
#endif
void main(){
    gl_FragColor = vec4(1, 1, 1, 1);
}
The mesh (quad 100x100px):
Mesh mesh = new Mesh(true, 4, 6, new VertexAttribute(Usage.Position, 2, "v"));
mesh.setVertices(new float[]{0, 0,
100, 0,
0, 100,
100, 100});
mesh.setIndices(new short[]{0, 1, 3, 0, 3, 2});
The render stage:
Matrix4 o = new Matrix4(); // also tried OrthographicCamera and SpriteBatch.getProjectionMatrix() here...
o.setToOrtho2D(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
shader.begin();
shader.setUniformMatrix(shader.getUniformLocation("o"), o);
mesh.render(shader, GL20.GL_TRIANGLES);
shader.end();
And that's it. I get no output at all, just a black screen.
Of course I clear the screen and everything; a SpriteBatch (which I also use for other purposes) works just fine. But I don't get how this is done in libGDX or what's wrong here...
I'm currently working on a project with the Cardboard SDK, and I'm rather stuck right now.
I want to display a cross in the center of the sight, like in an FPS, and keep it in the center of the sight when the user moves their head.
I know that in this code:
public void onNewFrame(HeadTransform headTransform) {
    float[] headView = new float[16];
    headTransform.getHeadView(headView, 0);
}
the headView param will contain the transformation matrix (rotation + translation) of the head (thanks to this SO question: Android VR Toolkit - HeadTransform getHeadView matrix representation).
I tried to do this :
private float[] mHeadView = new float[16];

public void onNewFrame(HeadTransform headTransform) {
    headTransform.getHeadView(mHeadView, 0);
}
public void onDrawEye(Eye eye) {
    float[] mvpMatrix = new float[16];
    float[] modelMatrix = new float[16];
    float[] mvMatrix = new float[16];
    float[] camera = new float[16];
    Matrix.setLookAtM(camera, 0, 0, 0, -2, 0, 0, -1, 0, 1, 0);
    Matrix.multiplyMM(modelMatrix, 0, mHeadView, 0, camera, 0);
    Matrix.multiplyMM(mvMatrix, 0, eye.getEyeView(), 0, modelMatrix, 0);
    Matrix.multiplyMM(mvpMatrix, 0, eye.getEyePerspective(0.1f, 100f), 0, mvMatrix, 0);
    // Pass the mvpMatrix and vertices buffer to the vertex shader.
}
And here is my vertex shader :
uniform mat4 uMatrix;
attribute vec4 vPosition;
attribute vec4 vColors;
varying vec4 color;
void main() {
    color = vColors;
    gl_Position = uMatrix * vPosition;
}
But the cross is still anchored to its initial position and doesn't follow the head.
Am I missing something?
How can I make my cross follow my head and stay in the center of the sight?
Thanks in advance for your answers :)
(PS : I don't want to use Unity because this project must only use the Java SDK).
tl;dr: For a head-locked crosshair, skip the multiplication by the mHeadView matrix.
If you want a head-locked object, you need to define it in head space, not in world space. Your current code defines the crosshair in world space. The mHeadView transform maps from world space to current head space, accounting for the current head rotation. You don't need to multiply by it; that's only required for world-locked objects.
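Concretely, here is a sketch of the asker's onDrawEye with the mHeadView multiply removed. Method and field names are taken from the question; note the perspective getter appears as getPerspective(zNear, zFar) in some SDK versions rather than getEyePerspective, so adjust to whichever your version provides:

```java
public void onDrawEye(Eye eye) {
    float[] mvpMatrix = new float[16];
    float[] mvMatrix = new float[16];
    float[] camera = new float[16];

    // The crosshair is defined directly in head space: looking from the
    // origin toward a point 2 units straight ahead.
    Matrix.setLookAtM(camera, 0, 0, 0, 0, 0, 0, -2, 0, 1, 0);

    // Note: no multiplication by mHeadView here -- that transform is only
    // needed for world-locked geometry.
    Matrix.multiplyMM(mvMatrix, 0, eye.getEyeView(), 0, camera, 0);
    Matrix.multiplyMM(mvpMatrix, 0, eye.getPerspective(0.1f, 100f), 0, mvMatrix, 0);

    // Pass mvpMatrix and the crosshair vertices to the vertex shader.
}
```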
I'm trying to write a script to display basic 3D objects/polygon triangles using JOGL 2 with OpenGL 3.3. However, when it compiles I receive no errors, but I get a blank window where the object should appear. So my question is: is there anything specific I'm missing that would make the object appear? My code is as follows...
public void init(GL3 gl)
{
    gl.glGenVertexArrays(1, IntBuffer.wrap(temp));
    //create vertex buffers
    int vao = temp[0];
    gl.glBindVertexArray(vao);
    gl.glGenBuffers(1, IntBuffer.wrap(temp));
    int[] temp2 = new int[]{1, 1};
    gl.glGenBuffers(2, IntBuffer.wrap(temp2));
    vbo = temp2[0];
    ebo = temp2[1];

    //creates vertex array
    float vertices[] = {
        -0.5f,  0.5f, 0.0f, //1,0,0, // Top-left
         0.5f,  0.5f, 0.0f, //0,1,0, // Top-right
         0.5f, -0.5f, 0.0f, //0,0,1, // Bottom-right
        -0.5f, -0.5f, 0.0f  //1,1,0  // Bottom-left
    };
    gl.glBindBuffer(GL.GL_ARRAY_BUFFER, vbo);
    gl.glBufferData(GL.GL_ARRAY_BUFFER, vertices.length * 4,
            FloatBuffer.wrap(vertices), GL.GL_STATIC_DRAW);

    //creates element array
    int elements[] = {
        0, 1, 2,
        2, 3, 0
    };
    gl.glBindBuffer(GL.GL_ELEMENT_ARRAY_BUFFER, ebo);
    gl.glBufferData(GL.GL_ELEMENT_ARRAY_BUFFER, elements.length * 4,
            IntBuffer.wrap(elements), GL.GL_STATIC_DRAW);

    gl.glVertexAttribPointer(0, 3, GL.GL_FLOAT, false, 3 * 4, 0 * 4);
    gl.glEnableVertexAttribArray(0);
}
public void draw(GL3 gl)
{
    gl.glBindVertexArray(vao);
    gl.glDrawElements(GL.GL_TRIANGLES, 2, GL.GL_UNSIGNED_INT, 0);
}
As for where my shaders are initiated, it's in a different class, which is as follows...
//Matrix4 view = new Matrix4(MatrixFactory.perspective(scene.camera.getHeightAngle(),scene.camera.getAspectRatio(),scene.camera.getPosition());
projection = MatrixFactory.perspective(scene.camera.getHeightAngle(), scene.camera.getAspectRatio(), 0.01f, 100f);
view = MatrixFactory.lookInDirection(scene.camera.getPosition(), scene.camera.getDirection(), scene.camera.getUp());
try {
    shader = new Shader(new File("shaders/Transform.vert"), new File("shaders/Transform.frag"));
    shader.compile(gl);
    shader.enable(gl);
    shader.setUniform("projection", projection, gl);
    shader.setUniform("view", view, gl);
}
catch (Exception e) {
    System.out.println("message " + e.getMessage());
}

for (Shape s : scene.shapes) {
    s.init(gl);
}
And finally, my shader files
#version 330

out vec4 fragColour;
//in vec3 outColour;

void main() {
    fragColour = vec4(1, 0, 0, 1);
}
#version 330

uniform mat4 projection;
uniform mat4 view;

layout(location=0) in vec3 pos;
//layout(location=2) in vec2 texCoord;
//layout(location=1) in vec3 colours;

out vec2 fragTex;
out vec3 outColour;

vec4 newPos;

void main() {
    newPos = vec4(pos, 1.0);
    gl_Position = projection * view * newPos;
    //fragTex = texCoord;
    //outColour = colours;
}
I am unsure where I am going wrong, whether it is the shader files or the actual code itself...
I am not experienced in JOGL; I am used to C++ GL. However, there are several problems. First, as Reto Koradi stated, you are using the same value for vao, vbo and ebo. It should be something like:
gl.glGenVertexArrays(1, IntBuffer.wrap(tempV));
int vao = tempV[0];
gl.glGenBuffers(2, IntBuffer.wrap(tempB));
int vbo = tempB[0];
int ebo = tempB[1];
Lastly, your draw seems a bit problematic; you seem to skip a step: bind the array you want to draw, then draw. Note also that glDrawElements takes the number of indices, which is 6 here, not 2.
gl.glBindVertexArray(vao);
gl.glDrawElements(GL.GL_TRIANGLES, 6, GL.GL_UNSIGNED_INT, 0);
I hope these help.
OK, after many frustrating hours, someone helped me with the solution. The issue wasn't needing separate buffer arrays, but rather not reading the generated id out of temp after each call, meaning I needed to do:
gl.glGenVertexArrays(1, IntBuffer.wrap(temp));
//create vertice buffers
vao = temp[0];
gl.glGenBuffers(1, IntBuffer.wrap(temp));
vbo = temp[0];
gl.glGenBuffers(1, IntBuffer.wrap(temp));
ebo = temp[0];
which is similar to Hakes' suggestion; however, I didn't need a separate temp array, I just needed to read the id out after each call. One other thing I needed to do was also put
gl.glBindVertexArray(vao);
in init as well as in draw.
(edit)
I'm actually not too sure gl.glBindVertexArray(vao); needs to be in the draw method.
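Putting the fixes together, here is a consolidated sketch of init(), using the vao/vbo/ebo fields and the vertices/elements arrays from the question (assumed to be defined as above):

```java
// Consolidated init() sketch (JOGL GL3): distinct ids for vao/vbo/ebo,
// and the VAO bound while the attribute pointer is set up so that the
// vertex layout and element binding are recorded into it.
public void init(GL3 gl) {
    int[] temp = new int[1];

    gl.glGenVertexArrays(1, IntBuffer.wrap(temp));
    vao = temp[0];
    gl.glBindVertexArray(vao);

    gl.glGenBuffers(1, IntBuffer.wrap(temp));
    vbo = temp[0];
    gl.glBindBuffer(GL.GL_ARRAY_BUFFER, vbo);
    gl.glBufferData(GL.GL_ARRAY_BUFFER, vertices.length * 4,
            FloatBuffer.wrap(vertices), GL.GL_STATIC_DRAW);

    gl.glGenBuffers(1, IntBuffer.wrap(temp));
    ebo = temp[0];
    gl.glBindBuffer(GL.GL_ELEMENT_ARRAY_BUFFER, ebo);
    gl.glBufferData(GL.GL_ELEMENT_ARRAY_BUFFER, elements.length * 4,
            IntBuffer.wrap(elements), GL.GL_STATIC_DRAW);

    gl.glVertexAttribPointer(0, 3, GL.GL_FLOAT, false, 3 * 4, 0);
    gl.glEnableVertexAttribArray(0);
}
```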
Using LWJGL I tried to render a simple Mesh on screen, but OpenGL decided to do nothing instead. :(
So I have a mesh class which creates a VBO. I can add some vertices, which are then supposed to be drawn on screen.
public class Mesh {
    private int vbo;
    private int size = 0;

    public Mesh() {
        vbo = glGenBuffers();
    }

    public void addVertices(Vertex[] vertices) {
        size = vertices.length;
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, Util.createFlippedBuffer(vertices), GL_STATIC_DRAW);
    }

    public void draw() {
        glEnableVertexAttribArray(0);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glVertexAttribPointer(0, 3, GL_FLOAT, false, Vertex.SIZE * 4, 0);
        glDrawArrays(GL_TRIANGLES, 0, size);
        glDisableVertexAttribArray(0);
    }
}
Here is how I add vertices to my mesh:
mesh = new Mesh();
Vertex[] vertices = new Vertex[] { new Vertex(new Vector3f(-1, -1, 0)),
new Vertex(new Vector3f(-1, 1, 0)),
new Vertex(new Vector3f(0, 1, 0)) };
mesh.addVertices(vertices);
I am pretty sure I added them in the correct (clockwise) order.
And my OpenGL setup:
glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
glFrontFace(GL_CW);
glCullFace(GL_BACK);
glEnable(GL_CULL_FACE);
glEnable(GL_DEPTH_TEST);
Calling glGetError() returns no error (0).
EDIT:
Well, I found out that Macs are a little weird when it comes to OpenGL. I needed to use a VAO along with the VBO. Now it works fine. Thanks anyway!
I don't see anywhere that you're specifying a shader, a color, or a vertex array. Depending on which profile you're using, you need to be doing one or more of these.
I would suggest checking / setting the following:
- Disable face culling to ensure that, regardless of the winding, you should see something.
- If you're requesting a core profile, you'll need a shader and quite possibly a vertex array object.
- If you're instead using a compatibility profile, you should call glColor3f(1, 1, 1) in your draw call to ensure you're not drawing a black triangle.
- Are you clearing the color and depth framebuffers before you render?
You might not be drawing that object within the viewing frustum; also call glGetError to make sure you aren't making any mistakes.
It is also important to understand the difference between fixed-pipeline and programmable-pipeline OpenGL. If you are using a version with a programmable pipeline you will need to write shaders; otherwise you will need to set the modelview and projection matrices.
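Following up on the core-profile point (and the asker's own edit about the VAO): on a core profile, a vertex array object must be created and bound before the attribute state is set. A minimal sketch against the question's Mesh class, assuming LWJGL's static GL imports and the Vertex/size members from the question:

```java
// In Mesh(): create a VAO alongside the VBO. Core profiles (macOS in
// particular only offers core contexts for GL 3.2+) refuse to draw
// without one bound.
int vao = glGenVertexArrays();
glBindVertexArray(vao);
int vbo = glGenBuffers();

// In draw(): bind the VAO first, then set up the attribute and draw.
glBindVertexArray(vao);
glEnableVertexAttribArray(0);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glVertexAttribPointer(0, 3, GL_FLOAT, false, Vertex.SIZE * 4, 0);
glDrawArrays(GL_TRIANGLES, 0, size);
```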
I'm trying to learn GLSL and I want to send a float attribute to my vertex shader. For now, this float is used simply to set the brightness (I multiply in_Color by in_Light and assign the result to pass_Color). The brightness always seems to equal 1.0 in the vertex shader regardless of what I try to pass (0.1 in the code below). I've tried googling and experimenting quite a bit, but I just don't get what's wrong. Position, texture coordinates and color all seem to work fine.
Here is how I'm binding the attribute locations, and a bit more of the code.
this.vsId = Shader.loadShader(pVertexFilePath, GL20.GL_VERTEX_SHADER);
this.fsId = Shader.loadShader(pFragmentFilePath, GL20.GL_FRAGMENT_SHADER);
this.pId = GL20.glCreateProgram();
GL20.glAttachShader(this.pId, this.vsId);
GL20.glAttachShader(this.pId, this.fsId);
GL20.glBindAttribLocation(this.pId, 0, "in_Position");
GL20.glBindAttribLocation(this.pId, 1, "in_TextureCoord");
GL20.glBindAttribLocation(this.pId, 2, "in_Color");
GL20.glBindAttribLocation(this.pId, 3, "in_Light");
GL20.glLinkProgram(this.pId);
this.normalMatrixId = GL20.glGetUniformLocation(this.pId, "normalMatrix");
this.projectionModelViewMatrixId = GL20.glGetUniformLocation(this.pId, "projModelViewMatrix");
This is how I'm setting the position of each attribute
FloatBuffer verticesFloatBuffer = BufferUtils.createFloatBuffer(verticesCount * 36);
for (VertexData vert : vertices) {verticesFloatBuffer.put(vert.getElements());}
vertices.clear();
verticesFloatBuffer.flip();
GL30.glBindVertexArray(vaoId);
GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vboId);
GL15.glBufferData(GL15.GL_ARRAY_BUFFER, verticesFloatBuffer, GL15.GL_STATIC_DRAW);
GL20.glVertexAttribPointer(0, 3, GL11.GL_FLOAT, false, 36, 0);
GL20.glVertexAttribPointer(1, 2, GL11.GL_FLOAT, false, 36, 12);
GL20.glVertexAttribPointer(2, 3, GL11.GL_FLOAT, false, 36, 20);
GL20.glVertexAttribPointer(3, 1, GL11.GL_FLOAT, false, 36, 32);
GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, 0);
This is the getElements method used above (I have manually input color and brightness for testing purposes). Note that the last value, which should be in_Light, is 0.1f but always turns out to be 1.f in the vertex shader.
public float[] getElements() {
    return new float[]{this.x, this.y, this.z, this.s, this.t, 1.f, 0.f, 1.f, 0.1f};
}
Vertex Shader:
#version 430 core
uniform mat4 projModelViewMatrix;
uniform mat3 normalMatrix;
in vec3 in_Position;
in vec2 in_TextureCoord;
in vec3 in_Color;
in float in_Light;
out vec4 pass_Color;
out vec2 pass_TextureCoord;
void main(void) {
    gl_Position = projModelViewMatrix * vec4(in_Position, 1.0);
    pass_TextureCoord = in_TextureCoord;
    pass_Color = vec4(in_Color * in_Light, 1.0);
}
Fragment Shader (just in case someone wants to see it):
#version 430 core
uniform sampler2D texture_diffuse;
in vec4 pass_Color;
in vec2 pass_TextureCoord;
out vec4 out_Color;
void main(void) {
    out_Color = texture2D(texture_diffuse, pass_TextureCoord) * pass_Color;
}
All other attributes that I pass to the vertex shader work fine.
EDIT:
Just as a note, I've tried changing the shader to specify locations explicitly, e.g.:
layout (location = 0) in vec3 in_Position;
I've also used glGetAttribLocation, which gives me the same attribute locations (0, 1, 2, 3), e.g.:
GL20.glGetAttribLocation(this.pId, "in_Position");
I've also added an if statement to the shader to check the value of in_Light and it always equals one.
EDIT2:
Now I've changed the color attribute to a vec4 and passed the light value in place of the alpha, which works fine. Based on other trials, as well as this, it's almost as if I can't have more than 3 attributes for some reason.
GL20.glLinkProgram(this.pId);
GL20.glBindAttribLocation(this.pId, 0, "in_Position");
GL20.glBindAttribLocation(this.pId, 1, "in_TextureCoord");
GL20.glBindAttribLocation(this.pId, 2, "in_Color");
GL20.glBindAttribLocation(this.pId, 3, "in_Light");
The glBindAttribLocation calls need to come before you link the program. Attribute locations are fixed at link time, so bindings made after linking have no effect until the next link, and OpenGL will have assigned the locations arbitrarily.
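In other words, the fix is just to reorder the calls from the snippet above:

```java
// Bind the attribute locations first, then link; glBindAttribLocation
// only takes effect at the next glLinkProgram.
GL20.glBindAttribLocation(this.pId, 0, "in_Position");
GL20.glBindAttribLocation(this.pId, 1, "in_TextureCoord");
GL20.glBindAttribLocation(this.pId, 2, "in_Color");
GL20.glBindAttribLocation(this.pId, 3, "in_Light");
GL20.glLinkProgram(this.pId);
```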
Yup, a simple noob mistake in addition to the one Nicol Bolas mentioned (I wonder how many more I have ;)).
I needed to add glEnableVertexAttribArray before setting the position of each attribute with glVertexAttribPointer. It seems the first three attributes were automatically enabled in my case, though I imagine this could change on a per-GPU basis?
FloatBuffer verticesFloatBuffer = BufferUtils.createFloatBuffer(verticesCount * 36);
for (VertexData vert : vertices) {verticesFloatBuffer.put(vert.getElements());}
vertices.clear();
verticesFloatBuffer.flip();
GL30.glBindVertexArray(vaoId);
GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vboId);
GL15.glBufferData(GL15.GL_ARRAY_BUFFER, verticesFloatBuffer, GL15.GL_STATIC_DRAW);
GL20.glEnableVertexAttribArray(0);
GL20.glEnableVertexAttribArray(1);
GL20.glEnableVertexAttribArray(2);
GL20.glEnableVertexAttribArray(3);
GL20.glVertexAttribPointer(0, 3, GL11.GL_FLOAT, false, 36, 0);
GL20.glVertexAttribPointer(1, 2, GL11.GL_FLOAT, false, 36, 12);
GL20.glVertexAttribPointer(2, 3, GL11.GL_FLOAT, false, 36, 20);
GL20.glVertexAttribPointer(3, 1, GL11.GL_FLOAT, false, 36, 32);
GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, 0);
Thanks to everyone that replied!
I'm following along with the OpenGL tutorial found here. I'm on chapter 2 right now and it's going over the advantages of using glArrayElement to render objects. Currently, my code is as follows:
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_COLOR_ARRAY);
double vertices[] = {100, 200, 0, 200, 100, 0, 100, 100, 0};
double colors[] = {1, .5, .8, .3, .5, .8, .3, .5, .8};
DoubleBuffer vertexBuffer = BufferUtils.createDoubleBuffer(9).put(vertices);
DoubleBuffer colorBuffer = BufferUtils.createDoubleBuffer(9).put(colors);
glVertexPointer(3, 0, vertexBuffer);
glColorPointer(3, 0, colorBuffer);
while (!Display.isCloseRequested()) {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glBegin(GL_TRIANGLES);
    glArrayElement(0);
    glArrayElement(1);
    glArrayElement(2);
    glVertex3d(300, 200, 0);
    glVertex3d(400, 100, 0);
    glVertex3d(300, 100, 0);
    glEnd();

    //Display.sync(60);
    Display.update();
}
The second triangle, defined explicitly by the calls to glVertex3d, renders fine. But the first triangle does not render at all. Am I making a simple mistake?
While scouring for more sample code, I came across a snippet that said you had to "flip each buffer." Adding
vertexBuffer.flip();
colorBuffer.flip();
Right before the while loop solved my problem!