Why isn't this a square? LWJGL - java

I have a basic LWJGL window set up and I am trying to draw a square using glBegin(GL_QUADS). I call my Square class with Square square = new Square(25, 25, 25) to draw the square... but it comes out as a rectangle. The first two parameters are the starting coordinates and the last 25 is the side length, as seen below. What am I doing wrong that produces a rectangle?
public Square(float x,float y,float sl) {
GL11.glColor3f(0.5F, 0.0F, 0.7F);
glBegin(GL11.GL_QUADS);
glVertex2f(x, y);
glVertex2f(x, y+sl);
glVertex2f(x+sl, y+sl);
glVertex2f(x+sl, y);
glEnd();
}
My Viewport code
glMatrixMode(GL_PROJECTION);
glLoadIdentity(); // Resets any previous projection matrices
glOrtho(0, 640, 0, 480, 1, -1);
glMatrixMode(GL_MODELVIEW);

Using glOrtho(0, 640, 0, 480, 1, -1); sets up a fixed 640x480 coordinate system. That means the rendered output is more than likely going to be stretched if your window is not the same size as that coordinate system (or at least the same aspect ratio).
Consider the following comparison:
If your viewport is the same size as your window, then it should remain square. I'm using JOGL, but in my resize function, I reshape my viewport to be the new size of my window.
glcanvas.addGLEventListener(new GLEventListener() {
@Override
public void reshape(GLAutoDrawable glautodrawable, int x, int y, int width, int height) {
GL2 gl = glautodrawable.getGL().getGL2();
gl.glMatrixMode(GL2.GL_PROJECTION);
gl.glLoadIdentity(); // Resets any previous projection matrices
gl.glOrtho(0, width, 0, height, 1, -1);
gl.glMatrixMode(GL2.GL_MODELVIEW);
}
// ... other GLEventListener methods
});
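The stretching can also be checked with plain arithmetic: under glOrtho(0, 640, 0, 480, ...) a 25-unit square covers 25/640 of the width and 25/480 of the height, so its on-screen pixel size differs per axis whenever the window's aspect ratio is not 640:480. A minimal sketch (the 800x480 window size and the class name SquareStretch are assumptions for illustration):

```java
public class SquareStretch {
    public static void main(String[] args) {
        float side = 25f;                    // requested square side, in ortho units
        float orthoW = 640f, orthoH = 480f;  // glOrtho(0, 640, 0, 480, ...)
        int windowW = 800, windowH = 480;    // assumed window size, not 4:3

        // How many actual pixels the "square" covers along each axis
        float pixelsX = side * windowW / orthoW; // 31.25
        float pixelsY = side * windowH / orthoH; // 25.0

        System.out.println(pixelsX + " x " + pixelsY); // 31.25 x 25.0
    }
}
```

The square only stays square when the two per-axis scales match, which is exactly what resizing the projection alongside the window guarantees.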

To draw a square around the point (x | y) you can calculate the four points that represent the corners of your square.
First you'll need your width to height ratio
float ratio = (float) width / (float) height;
I will use a defaultSize for the length of the shortest path from the midpoint to any of the sides.
Then you can calculate four values like so:
float a = x + defaultSize;
float b = ratio * (y + defaultSize);
float c = x - defaultSize;
float d = ratio * (y - defaultSize);
with which you can represent all four corners to draw your square with. Since GL_QUADS is deprecated, I'll use GL_TRIANGLES.
glBegin(GL_TRIANGLES);
glColor3f(red, green, blue);
// upper left triangle
glVertex2f(a, b);
glVertex2f(c, b);
glVertex2f(c, d);
// lower right triangle
glVertex2f(a, b);
glVertex2f(c, d);
glVertex2f(a, d);
glEnd();
I don't know if this is the most performant or idiomatic way to do this since I just started exploring LWJGL.

Related

Center a rectangle object on the screen using glOrthof in Android with OpenGL ES 2 (Java)

I'm working through the examples in the book OpenGL ES 2 for Android.
I did the first example, drawing a rectangle of base 9 and height 14, using the array below to define the coordinates:
private float[] tableVerticesWithTriangles = {
//Triangle
0f, 0f,
9f, 14f,
0f, 14f,
//Triangle 2
0f, 0f,
9f, 0f,
9f, 14f
};
The rectangle is appearing as in the example, the white rectangle in the top right corner:
The code I'm working on is in the repository https://github.com/quimperval/opengles-android-rectangle
Now, in the book the author centers the rectangle by modifying the coordinates of the rectangle. However, as far as I know, OpenGL can take care of that with a projection matrix, so I modified the vertex shader to use one:
attribute vec4 a_Position;
attribute mat4 u_Projection;
void main(){
gl_Position = u_Projection * a_Position;
}
And in the CRenderer class I added the variables below:
private static final String U_PROJECTION = "u_Projection";
int projectionMatrixLocation;
float[] projectionMatrix = new float[16];
And in the onSurfaceChanged method I added the logic for considering the aspectRatio
@Override
public void onSurfaceChanged(GL10 gl10, int width, int height) {
glViewport(0, 0, width, height);
// Calculate the projection matrix
float aspectRatio = width > height ?
(float) width / (float) height :
(float) height / (float) width;
if (width > height) {
// Landscape
glOrthof(-aspectRatio, aspectRatio, -1f, 1f, -1f, 1f);
} else {
// Portrait or square
glOrthof(-1f, 1f, -aspectRatio, aspectRatio, -1f, 1f);
}
projectionMatrixLocation = glGetUniformLocation(program, "u_Projection");
glUniformMatrix4fv(projectionMatrixLocation, 1, false, projectionMatrix, 0);
}
In the onDrawFrame I didn't do changes.
When I compile and install the application in the emulator, It crashes, with the error:
2022-12-31 14:45:23.971 10499-10521/com.perval.myapplication A/libc: Fatal signal 11 (SIGSEGV), code 1, fault addr 0x0 in tid 10521 (GLThread 411)
Am I missing some operation?
I expect to center the rectangle (of any dimensions) in the screen of the device.
I believe the answer is that you are:
Not declaring the matrix as a uniform
Not checking your response from glGetUniformLocation()
As you are accessing a uniform with glGetUniformLocation, you should declare it in your shader like so:
uniform mat4 u_Projection;
I managed to find another way to achieve the result I need, by using the code below in the onSurfaceChanged method:
@Override
public void onSurfaceChanged(GL10 gl10, int width, int height) {
glViewport(0,0, width, height);
/* orthoM parameters:
 * float[] m - the destination array
 * int mOffset - the offset into m at which the result is written
 * float left - the minimum range of the x-axis
 * float right - the maximum range of the x-axis
 * float bottom - the minimum range of the y-axis
 * float top - the maximum range of the y-axis
 * float near - the minimum range of the z-axis
 * float far - the maximum range of the z-axis
 */
float boundingBoxWidth = 300;
float boundingBoxHeight = 300;
float aspectRatio;
if(width>height){
//Landscape
aspectRatio = (float) width / (float) height;
orthoM(projectionMatrix, 0, -aspectRatio*boundingBoxHeight, aspectRatio*boundingBoxHeight, -boundingBoxHeight, boundingBoxHeight, -1f, 1f);
} else {
//Portrait or square
aspectRatio = (float) height / (float) width;
orthoM(projectionMatrix, 0, -boundingBoxWidth, boundingBoxWidth, -boundingBoxWidth*aspectRatio, boundingBoxWidth*aspectRatio, -1f, 1f);
}
}
That way I got the behaviour I wanted: an object placed in the center of the screen, even though the object has vertex coordinates outside the surface extents (-1,-1) to (1,1).
The key is to know the width and height of the bounding box of the object I want to draw; then it is just a matter of scaling the left/right or bottom/top values by the aspectRatio variable, based on the orientation of the screen.
I placed the code in the repository:
https://github.com/quimperval/opengl-es-android-draw-object
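The scaling logic can be isolated as plain math, independent of OpenGL (computeOrthoBounds is an illustrative name; it mirrors the two branches of onSurfaceChanged above):

```java
public class OrthoBounds {
    // Returns {left, right, bottom, top} for a bounding box of the given size,
    // stretched along the window's longer axis by the aspect ratio.
    static float[] computeOrthoBounds(int width, int height, float boxW, float boxH) {
        if (width > height) { // landscape: widen the x range
            float aspectRatio = (float) width / (float) height;
            return new float[] { -aspectRatio * boxH, aspectRatio * boxH, -boxH, boxH };
        } else {              // portrait or square: heighten the y range
            float aspectRatio = (float) height / (float) width;
            return new float[] { -boxW, boxW, -boxW * aspectRatio, boxW * aspectRatio };
        }
    }

    public static void main(String[] args) {
        // A 1600x800 landscape surface with a 300x300 bounding box
        float[] b = computeOrthoBounds(1600, 800, 300, 300);
        System.out.println(b[0] + " " + b[1] + " " + b[2] + " " + b[3]);
        // -600.0 600.0 -300.0 300.0
    }
}
```

Because the x range is twice as wide as the y range on a 2:1 surface, a 300x300 box still renders square.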

Java Game: Changing the brightness or transparency of a image

Hello Everyone,
I am working on a 2D tile game and I want to add lighting, so that it is darker at night and brighter during the day. If I can't change the brightness, I could have it fade from a brighter texture to a darker one. Here is the image drawing function I have:
public static void drawRectTexRot(Texture tex, float x, float y, float width, float height, float angle) {
tex.bind();
glTranslatef(x + width / 2, y + height / 2, 0);
glRotatef(angle, 0, 0, 1);
glTranslatef(- width / 2, - height / 2, 0);
glBegin(GL_QUADS);
glTexCoord2f(0,0);
glVertex2f(0,0);
glTexCoord2f(1,0);
glVertex2f(width,0);
glTexCoord2f(1,1);
glVertex2f(width, height);
glTexCoord2f(0, 1);
glVertex2f(0, height);
glEnd();
glLoadIdentity();
}
Any help would be great. :)
-Ekrcoaster
In Java you can change the transparency of a Graphics2D object with:
float opacity = 0.5f;
g.setComposite(AlphaComposite.getInstance(AlphaComposite.SRC_OVER, opacity));
A value of 0 is fully transparent and 1 is fully opaque. In this case whatever you draw next will be half transparent.
To change the brightness of an Image, use RescaleOp. Example:
RescaleOp rescaleOp = new RescaleOp(1.3f, 0, null);
rescaleOp.filter(image, image);
The first argument of the RescaleOp constructor determines the brightness scale. A value of 1f keeps the same brightness, 0.75f makes the image 25% darker, and 1.3f makes it 30% brighter.
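As a quick, self-contained check of that behaviour (BrightnessDemo is just an illustrative name; the 1x1 image stands in for a game texture):

```java
import java.awt.image.BufferedImage;
import java.awt.image.RescaleOp;

public class BrightnessDemo {
    public static void main(String[] args) {
        // A 1x1 image holding a mid-grey pixel (R = G = B = 100)
        BufferedImage image = new BufferedImage(1, 1, BufferedImage.TYPE_INT_RGB);
        image.setRGB(0, 0, (100 << 16) | (100 << 8) | 100);

        // Scale every colour band by 2.0; results are clamped to 255
        RescaleOp rescaleOp = new RescaleOp(2.0f, 0, null);
        rescaleOp.filter(image, image);

        int red = (image.getRGB(0, 0) >> 16) & 0xFF;
        System.out.println(red); // 200
    }
}
```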

Pixels rendering white (OpenGL)

I'm trying to get non-shader per-pixel lighting in the background of my window. It's supposed to render completely white in the top-left corner and completely black in the bottom-left corner, but instead it's white with black edges. For some reason, all pixels except the far right and far bottom edge are completely white (the edges are black). Why isn't it rendering properly?
glViewport(0, 0, Display.getWidth(), Display.getHeight());
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, 2000, 2000, 0, 1, -1);
glMatrixMode(GL_MODELVIEW);
if (lighting != 0)
glDeleteLists(lighting, 1);
lighting = glGenLists(1);
glNewList(lighting, GL_COMPILE);
glDisable(GL_TEXTURE_2D);
glBegin(GL_POINTS);
for (int x = 1; x <= 2000; x++)
{
for (int y = 1; y <= 2000; y++)
{
double dist = new Point(x, y).distance(new Point(0, 0));
double brightness = 1 - (1 / 2000 * dist); // I tried just "1 / 2000 * dist" instead, but that just renders black everywhere
glColor3d(brightness, brightness, brightness);
glVertex2f(x, y);
}
}
glEnd();
glEndList();
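A likely culprit in the code above is integer division: in 1 - (1 / 2000 * dist), the sub-expression 1 / 2000 is evaluated in int arithmetic and yields 0, so brightness is always 1.0 (white) no matter what dist is. Writing the constants as floating-point literals changes the result (BrightnessMath is an illustrative name; dist = 2000.0 stands in for a point 2000 pixels from the origin):

```java
public class BrightnessMath {
    public static void main(String[] args) {
        double dist = 2000.0; // a point 2000 pixels from the origin

        // Integer division: 1 / 2000 is 0, so the gradient term vanishes
        double broken = 1 - (1 / 2000 * dist);

        // Double division keeps the gradient
        double fixed = 1 - (1.0 / 2000.0 * dist);

        System.out.println(broken); // 1.0 (always white)
        System.out.println(fixed);  // 0.0 (black at distance 2000)
    }
}
```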

Java - GluProject How to deal with rotation

I'm trying to use gluProject with my game. I've found a way to use it but I don't know how to deal with rotation.
Here is the code I made:
public static Vector2 getScreenCoordinates(float x, float y, float z, int height, int width)
{
FloatBuffer screen_coords = GLAllocation.createDirectFloatBuffer(4);
IntBuffer viewport = GLAllocation.createDirectIntBuffer(16);
FloatBuffer modelview = GLAllocation.createDirectFloatBuffer(16);
FloatBuffer projection = GLAllocation.createDirectFloatBuffer(16);
GL11.glGetFloat(2982, modelview);   // 2982 = GL11.GL_MODELVIEW_MATRIX
GL11.glGetFloat(2983, projection);  // 2983 = GL11.GL_PROJECTION_MATRIX
GL11.glGetInteger(2978, viewport);  // 2978 = GL11.GL_VIEWPORT
boolean result = GLU.gluProject(x, y, z, modelview, projection, viewport, screen_coords);
if (result)
{
float screen_x = screen_coords.get(0);
float screen_y = screen_coords.get(1);
float screen_z = screen_coords.get(2);
screen_y -= height / 2;
screen_y = -screen_y;
screen_y += height / 2;
return new Vector2(screen_x, screen_y);
}
else
{
System.out.printf("Failed to convert 3D coords to 2D screen coords");
return null;
}
}
So x, y and z are coords in my 3D map. height and width are my window size.
How can I modify it to deal with rotation (yaw and pitch)?
Thanks.
Here is my new code:
public Vector2 worldToScreen(double x, double y, double z, int yaw, int pitch)
{
// Initialize the result buffer
FloatBuffer screen_coords = GLAllocation.createDirectFloatBuffer(4);
// Init the OpenGL buffers
IntBuffer viewport = GLAllocation.createDirectIntBuffer(16);
FloatBuffer modelview = GLAllocation.createDirectFloatBuffer(16);
FloatBuffer projection = GLAllocation.createDirectFloatBuffer(16);
// Add the rotation
GL11.glRotatef(yaw, 0, 0, 1);
GL11.glRotatef(pitch, 1, 0, 0);
// Get the OpenGL data
GL11.glGetFloat(2982, modelview);   // 2982 = GL11.GL_MODELVIEW_MATRIX
GL11.glGetFloat(2983, projection);  // 2983 = GL11.GL_PROJECTION_MATRIX
GL11.glGetInteger(2978, viewport);  // 2978 = GL11.GL_VIEWPORT
// Remove the rotation
GL11.glRotatef(-yaw, 0, 0, 1);
GL11.glRotatef(-pitch, 1, 0, 0);
// Calculate the screen position
boolean result = GLU.gluProject((float) x, (float) y, (float) z, modelview, projection, viewport, screen_coords);
if (result)
{
if ( (screen_coords.get(0) < -1.0f) || (screen_coords.get(0) > 1.0f) || (screen_coords.get(1) < -1.0f) || (screen_coords.get(1) > 1.0f) )
return null;
int window_half_width = getWidth() / 2;
int window_half_height = getHeight() / 2;
int screen_x = window_half_width + (int) (window_half_width * -screen_coords.get(0));
int screen_y = window_half_height + (int) (window_half_height * screen_coords.get(1));
System.out.printf("(Full Screen / No bounds) [" +x+ ", " +y+ ", " +z+ "] gives [" +screen_coords.get(0)+ ", " +screen_coords.get(1)+ "]");
System.out.printf("(Every Time) [" +x+ ", " +y+ ", " +z+ "] gives [" +screen_x+ ", " +screen_y+ "]");
return new Vector2(screen_x, screen_y);
}
else
{
System.out.printf("Failed to convert 3D coords to 2D screen coords");
return null;
}
}
Is that correct?
Yaw is a rotation about the y axis (pointing upwards), and pitch is a rotation about the x axis.
GL11.glRotatef(pitch, 1, 0, 0);
GL11.glRotatef(yaw, 0, 1, 0);
It may also be faster to push/pop the projection matrix, and you should be in the correct matrix mode. (Both your model-view matrix and your projection matrix have to be correct.)
GL11.glPushMatrix();
// Camera rotations
GL11.glPopMatrix();
But it's questionable why you need to undo the camera transformations. You should probably just apply them once per frame or viewport: render your scene with that view, then get the projected coordinate, and then change to your 2D rendering mode.
Update
I'd prefer it if you used roll, pitch and yaw of the camera to mean what they actually mean in OpenGL's coordinate system - with respect to the camera. The point is, as long as you have the camera model-view and projection matrices setup correctly, you can project the point. You must already be setting up the camera somewhere. You will be able to see if this is correct, because you can SEE the results. Apply that transformation and you will know it's correct.
Is this correct? It might be? You have to apply all of the transformations. The exact same transformations that give you the camera view you see in game. This includes all translations and rotations. You will know if you are right, because you should already be applying these transformations in-game.
However, what you need to think about is: what is the camera projection matrix beforehand? If you don't know, call glLoadIdentity() to clear it, and then call your camera transformation logic. You need an accurate model-view and projection matrix - the exact same camera setup you are using in-game. If your model-view or projection matrix is in the wrong state, it just won't work.
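To build intuition for what gluProject computes once the matrices are in the right state, here is a minimal pure-Java sketch of the rotation-free orthographic case (the 640x480 numbers and the class name ProjectSketch are assumptions for illustration):

```java
public class ProjectSketch {
    public static void main(String[] args) {
        double x = 320, y = 240;                             // world point
        double left = 0, right = 640, bottom = 0, top = 480; // glOrtho bounds
        int winW = 640, winH = 480;                          // viewport size

        // Orthographic projection to normalized device coordinates (-1..1)
        double ndcX = 2 * (x - left) / (right - left) - 1;
        double ndcY = 2 * (y - bottom) / (top - bottom) - 1;

        // Viewport transform from NDC to window coordinates
        double screenX = (ndcX + 1) / 2 * winW;
        double screenY = (ndcY + 1) / 2 * winH;

        System.out.println(screenX + ", " + screenY); // 320.0, 240.0
    }
}
```

With a rotated camera the only difference is that (x, y, z) is first multiplied by the model-view matrix; the NDC and viewport steps are unchanged, which is why the matrices must describe the exact in-game camera.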

How to approximate an ellipse to fill a given rectangle, using bezier curves?

I tried the code below, which draws a good approximation of a circle if the rectangle's width is the same as its height; but it doesn't draw a great oval, the "corners" are very pointed. Any suggestions?
float width = rect.width();
float height = rect.height();
float centerX = rect.width() / 2;
float centerY = rect.height() / 2;
float diameter = Math.min(width, height);
float length = (float) (0.5522847498 * diameter/2);
path.moveTo(0, centerY);
path.cubicTo(0, centerY - length, centerX - length, 0, centerX, 0);
path.cubicTo(centerX + length, 0, width, centerY - length, width, centerY);
path.cubicTo(width, centerY + length, centerX + length, height, centerX, height);
path.cubicTo(centerX - length, height, 0, centerY + length, 0, centerY);
You should scale length according to which axis it's on, so that the distance from each arc endpoint to its adjacent control points is a fixed fraction of the axis you're moving parallel to at that point.
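Following that suggestion, a sketch with a per-axis control offset, using java.awt.geom.Path2D as a stand-in for Android's Path (same cubicTo argument order; EllipsePath and ellipseIn are illustrative names):

```java
import java.awt.geom.Path2D;
import java.awt.geom.Rectangle2D;

public class EllipsePath {
    // Standard magic constant for approximating a circle with cubic Beziers
    static final double KAPPA = 0.5522847498;

    static Path2D.Double ellipseIn(double width, double height) {
        double cx = width / 2, cy = height / 2;
        // Scale the control-point offset per axis instead of using min(width, height)
        double lx = KAPPA * width / 2;   // offset along the x-axis
        double ly = KAPPA * height / 2;  // offset along the y-axis
        Path2D.Double path = new Path2D.Double();
        path.moveTo(0, cy);
        path.cubicTo(0, cy - ly, cx - lx, 0, cx, 0);               // left to top
        path.cubicTo(cx + lx, 0, width, cy - ly, width, cy);       // top to right
        path.cubicTo(width, cy + ly, cx + lx, height, cx, height); // right to bottom
        path.cubicTo(cx - lx, height, 0, cy + ly, 0, cy);          // bottom to left
        path.closePath();
        return path;
    }

    public static void main(String[] args) {
        // The approximated ellipse should fill the whole 300x100 rectangle
        Rectangle2D b = ellipseIn(300, 100).getBounds2D();
        System.out.println(b.getWidth() + " x " + b.getHeight());
    }
}
```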
If it's a true rectangle, with right angles, then it should be easy. The major and minor axes of the ellipse equal the lengths of the sides of the rectangle, and the center of the ellipse is located at the intersection of the rectangle's diagonals.
I don't know how to express it as Bezier splines off the top of my head, but the classic equation for an ellipse should be easy enough to write, as long as you transform to the appropriate coordinate system first (e.g. x-axis along the major axis of the rectangle/ellipse, y-axis along the minor axis of the rectangle/ellipse).
