Ok, so I am currently working on a 2D game in Java with LWJGL. I have a fairly solid understanding of Java, know the basics of how games work, and know my way around LWJGL/OpenGL, but I am having a really weird issue with rendering textures. I determined that one of my draw methods is the culprit:
public static void texturedTriangleInverted(float x, float y, float width, float height, Texture texture) {
    GL11.glPushMatrix();
    {
        GL11.glTranslatef(x, y, 0);
        texture.bind();
        GL11.glBegin(GL11.GL_TRIANGLES);
        {
            GL11.glVertex2f(width / 2, 0);
            GL11.glTexCoord2f(width / 2, 0);
            GL11.glVertex2f(0, height);
            GL11.glTexCoord2f(0, 1);
            GL11.glVertex2f(width, height);
            GL11.glTexCoord2f(1, 1);
        }
        GL11.glEnd();
    }
    GL11.glPopMatrix();
}
What happens is that if I render anything with this method, the next thing rendered looks like it was literally compressed on one end and stretched on the other, even if that next thing is not rendered with this method. I am almost positive it has to do with the arguments I passed into GL11.glTexCoord2f(float x, float y), but I can't figure out how to fix it.
Here is my OpenGL initialization code:
private void initGL() {
    GL11.glMatrixMode(GL11.GL_PROJECTION);
    GL11.glLoadIdentity();
    GL11.glOrtho(0, Strings.DISPLAY_WIDTH, 0, Strings.DISPLAY_HEIGHT, -1, 1);
    GL11.glMatrixMode(GL11.GL_MODELVIEW);
    GL11.glEnable(GL11.GL_TEXTURE_2D);
    GL11.glEnable(GL11.GL_TEXTURE_BINDING_2D);
    GL11.glViewport(0, 0, Strings.DISPLAY_WIDTH, Strings.DISPLAY_HEIGHT);
    GL11.glClearColor(0, 0, 1, 0);
    GL11.glDisable(GL11.GL_DEPTH_TEST);
}
My game loop code:
private void gameLoop() {
    while (!Display.isCloseRequested()) {
        GL11.glClear(GL11.GL_COLOR_BUFFER_BIT);
        GL11.glLoadIdentity();
        GL11.glTranslatef(Strings.transX, Strings.transY, 0);
        this.level.update();
        this.level.render();
        Display.update();
        Display.sync(Strings.FPS_CAP);
    }
    Keyboard.destroy();
    Display.destroy();
}
My texture loading code (note: I used Slick to load my textures):
public static final Texture loadTexture(String location) {
    try {
        if (textureExists(location)) {
            return TextureLoader.getTexture("png", new BufferedInputStream(new FileInputStream(new File(location))), false);
        } else {
            System.err.println("texture does not exist");
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
    return null;
}
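The textureExists helper isn't shown in the question; a plausible stand-in (purely an assumption on my part, a simple file-existence check) would be:

```java
import java.io.File;

public class TextureFiles {
    // Hypothetical helper matching the name used above:
    // true if the texture path points at a regular file on disk.
    static boolean textureExists(String location) {
        return new File(location).isFile();
    }

    public static void main(String[] args) {
        System.out.println(textureExists("definitely/missing.png")); // false when no such file exists
    }
}
```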
You need to specify your glTexCoord before the glVertex it refers to, not after. This is the same as with glColor and glNormal: glVertex uses the last attribute values that were set. With the order reversed, every vertex is paired with the texture coordinate meant for the previous one, and the last coordinate you set leaks into whatever is drawn next. Also note that texture coordinates are normalized to the 0..1 range, so the first call should pass 0.5f rather than width / 2.
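To make the ordering concrete without needing a GL context, here is a small self-contained sketch: RecordingGL is a made-up stand-in for GL11 that only records call order, and the triangle method shows the corrected sequence, attribute first, then the vertex that consumes it, with normalized texture coordinates (0.5f instead of width / 2).

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for GL11 that records call order (illustrative only, not real LWJGL).
class RecordingGL {
    final List<String> calls = new ArrayList<>();
    void texCoord2f(float u, float v) { calls.add("texCoord"); }
    void vertex2f(float x, float y) { calls.add("vertex"); }
}

public class TexCoordOrder {
    // Corrected ordering: each texture coordinate is issued
    // immediately BEFORE the vertex it belongs to.
    static void texturedTriangleInverted(RecordingGL gl, float width, float height) {
        gl.texCoord2f(0.5f, 0f); gl.vertex2f(width / 2, 0);
        gl.texCoord2f(0f, 1f);   gl.vertex2f(0, height);
        gl.texCoord2f(1f, 1f);   gl.vertex2f(width, height);
    }

    public static void main(String[] args) {
        RecordingGL gl = new RecordingGL();
        texturedTriangleInverted(gl, 64f, 64f);
        // Verify every vertex is preceded by its texture coordinate.
        for (int i = 0; i < gl.calls.size(); i += 2) {
            if (!gl.calls.get(i).equals("texCoord") || !gl.calls.get(i + 1).equals("vertex")) {
                throw new AssertionError("texCoord must come before each vertex");
            }
        }
        System.out.println("call order ok");
    }
}
```

The same three pairs, written with real GL11 calls, are the drop-in fix for the method in the question.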
I've looked for hours, but all I could find were two Stack Overflow questions, one of which had a single answer, by the author, that didn't really give a proper explanation.
I was working on a game with LWJGL 2.9 and have now decided to switch to LibGDX for its simplicity. I had working code that allowed me to move through space, but that code can't be converted 1:1 to LibGDX, and I can't find any example, explanation, or tutorial covering this, not even in LibGDX's wiki.
This is the code that I used for LWJGL:
public Vectorf calculateMovement(MovementDirection direction, float newYaw, float walkSpeed, double deltaTime) {
    Vectorf position = new Vectorf(0f, 0f, 0f); // Just a Vector3f
    float amount = walkSpeed * (float) deltaTime;
    switch (direction) {
        case FORWARD:
        case BACKWARD: {
            float dx = (float) (direction.getXAdd() * Math.sin(Math.toRadians(newYaw))) * amount;
            float dz = (float) -(direction.getZAdd() * Math.cos(Math.toRadians(newYaw))) * amount;
            position.add(dx, direction.getYAdd() * amount, dz);
            break;
        }
        case LEFT:
        case RIGHT: {
            float dx = amount * (float) Math.sin(Math.toRadians(newYaw + 90)) * direction.getXAdd();
            float dz = amount * (float) Math.cos(Math.toRadians(newYaw + 90)) * direction.getZAdd();
            position.add(dx, direction.getYAdd() * amount, dz);
            break;
        }
        default: {
            position.add(direction.getXAdd() * amount, direction.getYAdd() * amount, direction.getZAdd() * amount);
        }
    }
    return position;
}
public enum MovementDirection {
    FORWARD(1, 1),
    BACKWARD(-1, -1),
    LEFT(-1, 1),
    RIGHT(1, -1),
    UP(1),
    DOWN(-1);

    private final int xAdd;
    private final int yAdd;
    private final int zAdd;

    MovementDirection(int yAdd) {
        this(0, yAdd, 0);
    }

    MovementDirection(int xAdd, int zAdd) {
        this(xAdd, 0, zAdd);
    }

    MovementDirection(int xAdd, int yAdd, int zAdd) {
        this.xAdd = xAdd;
        this.yAdd = yAdd;
        this.zAdd = zAdd;
    }

    public float getXAdd() {
        return xAdd;
    }

    public float getYAdd() {
        return yAdd;
    }

    public float getZAdd() {
        return zAdd;
    }
}
I then updated the view matrix before rendering each frame by calling the following method:
public static Matrix4f createViewMatrix(Camera camera) {
    Matrix4f matrix = new Matrix4f();
    matrix.setIdentity();
    Matrix4f.rotate((float) Math.toRadians(camera.getYaw()), new Vector3f(1, 0, 0), matrix, matrix);
    Matrix4f.rotate((float) Math.toRadians(camera.getPitch()), new Vector3f(0, 1, 0), matrix, matrix);
    Matrix4f.rotate((float) Math.toRadians(camera.getDelta()), new Vector3f(0, 0, 1), matrix, matrix);
    Matrix4f.translate(camera.getNegativePosition(), matrix, matrix);
    return matrix;
}
LibGDX, though, uses matrices in a totally different way, and I have no idea how to configure it (or how to properly update the view matrix, for that matter). Here's what I have:
public static Matrix4 createViewMatrix(Camera camera) {
    Matrix4 matrix = new Matrix4();
    matrix.setToLookAt(camera.direction, camera.up);
    return matrix;
}
I'm not even sure how I would use that; I'm completely lost. I can draw a cube on the screen and move my mouse around to look at it, but as soon as I try to move (W, A, S, D) the cube disappears (I'm guessing it has to do with the view matrix practically not being applied).
I know it's a long read and the code may even be bad, but I could already find little to nothing for LWJGL, let alone LibGDX.
I finally got it: I was drawing before moving my camera, and that, apparently, was the issue.
My render loop previously looked like this:
@Override
public void render() {
    // Other code
    Gdx.gl.glEnable(GL20.GL_DEPTH_TEST);
    Gdx.gl.glViewport(0, 0, Gdx.graphics.getBackBufferWidth(), Gdx.graphics.getBackBufferHeight());
    Gdx.gl.glClearColor(0, 0, 1, 1);
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);

    // Handle camera input
    camera.update();

    modelBatch.render(camera);
    Gdx.gl.glDisable(GL20.GL_DEPTH_TEST);
}
I simply moved the camera input handling and camera update above the "render" code:
@Override
public void render() {
    // Other code

    // Handle camera input
    camera.update();

    Gdx.gl.glEnable(GL20.GL_DEPTH_TEST);
    Gdx.gl.glViewport(0, 0, Gdx.graphics.getBackBufferWidth(), Gdx.graphics.getBackBufferHeight());
    Gdx.gl.glClearColor(0, 0, 1, 1);
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);
    modelBatch.render(camera);
    Gdx.gl.glDisable(GL20.GL_DEPTH_TEST);
}
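For anyone porting the movement code itself: the yaw trigonometry in the original calculateMovement is engine-independent, so it can be checked in isolation before wiring it to LibGDX's camera. A minimal pure-Java sketch (class and method names are mine, not from either engine; only the FORWARD and strafe cases are shown):

```java
public class YawMovement {
    // Forward displacement on the XZ plane for a given yaw in degrees,
    // matching the sin/-cos pattern of the FORWARD case above.
    static float[] forwardDelta(float yawDegrees, float amount) {
        float dx = (float) Math.sin(Math.toRadians(yawDegrees)) * amount;
        float dz = (float) -Math.cos(Math.toRadians(yawDegrees)) * amount;
        return new float[] { dx, dz };
    }

    // Strafing is the same math with the yaw offset by 90 degrees,
    // as in the LEFT/RIGHT case above.
    static float[] strafeDelta(float yawDegrees, float amount) {
        return forwardDelta(yawDegrees + 90f, amount);
    }

    public static void main(String[] args) {
        float[] f = forwardDelta(0f, 2f); // facing -Z: roughly (0, -2)
        System.out.printf("forward: (%.3f, %.3f)%n", f[0], f[1]);
        float[] s = strafeDelta(0f, 2f); // right of -Z is +X: roughly (2, 0)
        System.out.printf("strafe:  (%.3f, %.3f)%n", s[0], s[1]);
    }
}
```

Once the deltas check out, applying them to a LibGDX camera is just adding them to camera.position before camera.update().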
Okay, so this thing I've been working on has this class, Construct:
package od.methods;

import org.lwjgl.opengl.GL11;

public class Construct {
    public static void box(int x, int y, int width, int height) {
        GL11.glBegin(GL11.GL_QUADS);
        GL11.glVertex2i(x, y);                  // Top left
        GL11.glVertex2i(x + width, y);          // Top right
        GL11.glVertex2i(x + width, y + height); // Bottom right
        GL11.glVertex2i(x, y - height);         // Bottom left
        GL11.glEnd();
    }
}
So you'd think that would construct a box. It doesn't; the only time it shows anything is if you call it with the values
box(0, 0, width, height);
And when you do that, no matter the width or height, it makes the box a quarter of the screen. If you pass any value other than 0 for x and y, nothing comes up at all.
I don't understand what I'm doing wrong.
Okay, I feel really dumb right now. I should've known to put in this little bit of init code:
GL11.glMatrixMode(GL11.GL_PROJECTION);
GL11.glLoadIdentity();
GL11.glOrtho(0, 640, 480, 0, -1, 1);
GL11.glMatrixMode(GL11.GL_MODELVIEW);
GL11.glEnable(GL11.GL_TEXTURE_2D);
This probably sounds weird, but somehow my program just ignores the green color.
First, I init GL like this (I don't know, maybe it matters):
private static void initgl() {
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, 1280, 720, 0, 1, -1);
    glMatrixMode(GL_MODELVIEW);
    glEnable(GL_TEXTURE_2D);
}
I'm drawing a square like this:
public static void gameLoop() {
    while (!Display.isCloseRequested()) {
        glClear(GL_COLOR_BUFFER_BIT);
        DrawStuff.square(10, 10, 100, 100, new byte[]{127, 127, 127});
        Display.update();
        Display.sync(60);
    }
}
and in the square method I have:
public static void square(int x, int y, int w, int h, byte[] rgb) {
    glColor3b(rgb[0], rgb[1], rgb[2]);
    glBegin(GL_QUADS);
    glVertex2i(x, y);
    glVertex2i(x + w, y);
    glVertex2i(x + w, y + h);
    glVertex2i(x, y + h);
    glEnd();
}
When I run the program, I see this:
By the way, the color picker says it's #C808C7, not even the #FF00FF I expected.
What happened to the green color? Why are the colors off?
Ok, this answer helped me: How do color and textures work together?
That wasn't exactly my problem, but what it describes still needs to be done.
What was messing with my colors is glEnable(GL_TEXTURE_2D). To get both solid colors and textures, I have to disable it each time I want to draw something with a solid color, then enable it again:
public static void square(int x, int y, int w, int h, byte[] rgb) {
    glDisable(GL_TEXTURE_2D);
    glColor3b(rgb[0], rgb[1], rgb[2]);
    glBegin(GL_QUADS);
    glVertex2i(x, y);
    glVertex2i(x + w, y);
    glVertex2i(x + w, y + h);
    glVertex2i(x, y + h);
    glEnd();
    glEnable(GL_TEXTURE_2D);
}
Depending on your case, future reader, you may want to keep it disabled in the first place, only enabling it if you need to add textures.
Voilà:
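One more pitfall worth noting in this snippet: glColor3b takes signed bytes, so the usable range is -128..127 and a value like 200 cannot be passed directly; if you think in 0..255 channels, glColor3ub is the safer call. The mapping itself is easy to check in plain Java (a sketch, assuming the classic GL2-era normalization formula f = (2c + 1) / 255 for signed bytes):

```java
public class ColorMapping {
    // Signed-byte color channel to float, using the pre-GL4.2
    // normalization rule f = (2c + 1) / (2^8 - 1).
    static float signedByteToFloat(byte c) {
        return (2f * c + 1f) / 255f;
    }

    public static void main(String[] args) {
        System.out.println(signedByteToFloat((byte) 127)); // 1.0, so 127 is full intensity, not mid-gray
        System.out.println(signedByteToFloat((byte) 0));   // ~0.004, near black
    }
}
```

So (127, 127, 127) passed to glColor3b is actually white; the off colors seen on screen came from that white being modulated by the bound texture, which is exactly what the glDisable(GL_TEXTURE_2D) fix removes.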
I'm trying to make a game in Java with OpenGL (JOGL), but I have a problem with textures.
When I draw a quad with a texture, the only thing I see is the image in blue scale.
The following is my code:
Texture grass;

// in public void init() of the base class that implements GLEventListener
try {
    grass = TextureIO.newTexture(new File("src/com/jeroendonners/main/grass.png"), false);
} catch (Exception e) {
    e.printStackTrace();
}
To render a quad with this texture I use the following code:
int x = 0;
int y = 0;
gl.glColor3f(1f, 1f, 1f);
this.grass.bind(gl);
gl.glBegin(gl.GL_QUADS);
gl.glTexCoord2d(0, 0);
gl.glVertex3f(0, 0, 0);
gl.glTexCoord2d(1, 0);
gl.glVertex3f(1, 0, 0);
gl.glTexCoord2d(1, 1);
gl.glVertex3f(1, 1, 0);
gl.glTexCoord2d(0, 1);
gl.glVertex3f(0, 1, 0);
gl.glEnd();
I have read here that I may have to use GL_BGR instead of the default GL_RGB, but since that question initializes its textures in a different way, I don't know how to apply that here.
One note: I am using an old version of JOGL, 1.0 I think, because a course at school used that version.
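If the old JOGL build really is interpreting the pixels with red and blue swapped, one workaround that doesn't depend on any particular TextureIO overload is to swap the channels in the image before creating the texture. A minimal channel-swap sketch on packed ARGB ints (pure Java; applying it per pixel via BufferedImage.getRGB/setRGB before TextureIO.newTexture is the assumed usage, not a confirmed JOGL 1.0 recipe):

```java
public class ChannelSwap {
    // Swap the red and blue channels of a packed 0xAARRGGBB pixel.
    static int swapRedBlue(int argb) {
        int a = argb & 0xFF000000;      // keep alpha
        int r = (argb >> 16) & 0xFF;
        int g = (argb >> 8) & 0xFF;
        int b = argb & 0xFF;
        return a | (b << 16) | (g << 8) | r;
    }

    public static void main(String[] args) {
        // Pure red becomes pure blue once the channels are swapped.
        System.out.printf("%08X%n", swapRedBlue(0xFFFF0000)); // FF0000FF
    }
}
```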
I have recently been learning LWJGL, since I've been working with Java for some time now. There isn't much LWJGL tutorial or reference material, so I just follow OpenGL tutorials; since LWJGL is essentially a Java binding for OpenGL, they're basically the same, though I always have to tweak things a bit. I wrote this code (basically all by myself), but when I run it, it only displays a single tile, when it should display 16 tiles in all! Why is this?
package testandothertutorials;

import static org.lwjgl.opengl.GL11.*;

import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;

import org.lwjgl.LWJGLException;
import org.lwjgl.input.Mouse;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.newdawn.slick.opengl.Texture;
import org.newdawn.slick.opengl.TextureLoader;

public class TileMapTest {
    int tilemap[][] = {
        { 0, 1, 1, 0 },
        { 0, 1, 1, 0 },
        { 0, 1, 1, 0 },
        { 1, 0, 0, 1 }
    };

    int TILE_SIZE = 32;
    int WORLD_SIZE = 4;

    Texture stone_texture, dirt_texture;

    public TileMapTest() {
        try {
            Display.setDisplayMode(new DisplayMode(640, 480));
            Display.setTitle("Game");
            Display.create();
        } catch (LWJGLException e) {
        }

        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glOrtho(0, 640, 480, 0, 1, -1);
        glMatrixMode(GL_MODELVIEW);
        glEnable(GL_TEXTURE_2D);

        // Load the stone and dirt textures before the render loop
        try {
            stone_texture = TextureLoader.getTexture("PNG", new FileInputStream(new File("C://Users//Gannon//Desktop//Java//workspace//Test Game//res//stone.png")));
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }

        try {
            dirt_texture = TextureLoader.getTexture("PNG", new FileInputStream(new File("C://Users//Gannon//Desktop//Java//workspace//Test Game//res//dirt.png")));
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }

        while (!Display.isCloseRequested()) {
            glClear(GL_COLOR_BUFFER_BIT);
            drawTiles();
            Display.update();
            Display.sync(60);
        }

        Display.destroy();
    }

    public void drawTiles() {
        for (int x = 0; x < WORLD_SIZE; x++) {
            for (int y = 0; y < WORLD_SIZE; y++) {
                if (tilemap[x][y] == 0) { // If the selected tile equals 0, bind the stone texture
                    stone_texture.bind();
                } else if (tilemap[x][y] == 1) { // If the selected tile equals 1, bind the dirt texture
                    dirt_texture.bind();
                }
                glPushMatrix();
                glTranslatef(x, y, 0);
                glBegin(GL_QUADS);
                glTexCoord2f(0, 0);
                glVertex2f(0, 0);
                glTexCoord2f(1, 0);
                glVertex2f(32, 0);
                glTexCoord2f(1, 1);
                glVertex2f(32, 32);
                glTexCoord2f(0, 1);
                glVertex2f(0, 32);
                glEnd();
                glPopMatrix();
            }
        }
    }

    public static void main(String args[]) {
        new TileMapTest();
    }
}
Try using glPushMatrix() and glPopMatrix(); currently your GL_QUADS are relative to the last one drawn, so the positioning drifts further apart as x and y grow:
glPushMatrix();
glTranslatef(x, y, 0);
glBegin(GL_QUADS);
glTexCoord2f(0, 0);
glVertex2f(0, 0);
glTexCoord2f(1, 0);
glVertex2f(32, 0);
glTexCoord2f(1, 1);
glVertex2f(32, 32);
glTexCoord2f(0, 1);
glVertex2f(0, 32);
glEnd();
glPopMatrix();
This puts each quad back in world-space coordinates instead of being relative to the last drawn quad; I had the same problem once.
Also, loading the identity matrix should only be done at the start of each new frame. And load both textures once, outside the loop, choosing between them inside it: loading from disk for every tile is a real waste, make use of RAM.
Your problem occurs because you're not translating the tiles far enough, so you end up rendering all the tiles on top of each other!
Currently you're doing glTranslatef(x, y, 0); remember that your tiles are 32 pixels wide and tall, but the range you translate over is only 0 to 4 (since you only render 16 tiles), so you need to change your translation.
This is how you should translate.
glTranslatef(x * TILE_SIZE, y * TILE_SIZE, 0);
So the rendering part ends up looking like this:
glPushMatrix();
glTranslatef(x * TILE_SIZE, y * TILE_SIZE, 0);
glBegin(GL_QUADS);
glTexCoord2f(0, 0);
glVertex2f(0, 0);
glTexCoord2f(1, 0);
glVertex2f(32, 0);
glTexCoord2f(1, 1);
glVertex2f(32, 32);
glTexCoord2f(0, 1);
glVertex2f(0, 32);
glEnd();
glPopMatrix();
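The offset arithmetic is easy to sanity-check without a GL context; a small pure-Java sketch of the tile-to-pixel mapping described above (TILE_SIZE = 32 as in the question, names otherwise mine):

```java
public class TileOffsets {
    static final int TILE_SIZE = 32;

    // Pixel-space origin of the tile at grid position (x, y),
    // matching glTranslatef(x * TILE_SIZE, y * TILE_SIZE, 0).
    static int[] tileOrigin(int x, int y) {
        return new int[] { x * TILE_SIZE, y * TILE_SIZE };
    }

    public static void main(String[] args) {
        // Without the multiplication, tiles 0..3 would start at pixels
        // 0..3 and all 16 quads would overlap almost completely.
        int[] origin = tileOrigin(3, 2);
        System.out.println(origin[0] + ", " + origin[1]); // 96, 64
    }
}
```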