Java Game: Changing the brightness or transparency of an image - java

Hello Everyone,
I am working on a 2D tile game and I want to add lighting so that it is darker at night and brighter during the day. If I can't change the brightness directly, I could fade from a brighter texture to a darker one. Here is the image-drawing function I have:
public static void drawRectTexRot(Texture tex, float x, float y, float width, float height, float angle) {
    tex.bind();
    glTranslatef(x + width / 2, y + height / 2, 0);
    glRotatef(angle, 0, 0, 1);
    glTranslatef(-width / 2, -height / 2, 0);
    glBegin(GL_QUADS);
    glTexCoord2f(0, 0);
    glVertex2f(0, 0);
    glTexCoord2f(1, 0);
    glVertex2f(width, 0);
    glTexCoord2f(1, 1);
    glVertex2f(width, height);
    glTexCoord2f(0, 1);
    glVertex2f(0, height);
    glEnd();
    glLoadIdentity();
}
Any help would be great. :)
-Ekrcoaster

In Java you can change the transparency of a Graphics2D object with this method:
float opacity = 0.5f;
g.setComposite(AlphaComposite.getInstance(AlphaComposite.SRC_OVER, opacity));
A value of 0 is fully transparent and 1 is fully opaque. In this case, whatever you draw next will be half transparent.
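As a sketch of how this could drive a simple day/night fade, the snippet below darkens an image by drawing a half-opaque black overlay with SRC_OVER compositing. The class and method names here are mine, not from the question:

```java
import java.awt.AlphaComposite;
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class FadeDemo {
    // Draws the given colour over the whole image at the given opacity
    // (0 = invisible overlay, 1 = fully opaque overlay).
    static void drawOverlay(BufferedImage img, Color overlay, float opacity) {
        Graphics2D g = img.createGraphics();
        g.setComposite(AlphaComposite.getInstance(AlphaComposite.SRC_OVER, opacity));
        g.setColor(overlay);
        g.fillRect(0, 0, img.getWidth(), img.getHeight());
        g.dispose();
    }

    public static void main(String[] args) {
        BufferedImage img = new BufferedImage(2, 2, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = img.createGraphics();
        g.setColor(Color.WHITE);
        g.fillRect(0, 0, 2, 2);
        g.dispose();

        // A half-opaque black overlay darkens the scene -- a simple "night" effect.
        drawOverlay(img, Color.BLACK, 0.5f);
        System.out.println(new Color(img.getRGB(0, 0)).getRed()); // roughly half of 255
    }
}
```

In a game loop you would vary the opacity over the course of the in-game day instead of hard-coding 0.5f.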
To change the brightness of an Image, use RescaleOp. Example:
RescaleOp rescaleOp = new RescaleOp(1.3f, 0, null);
rescaleOp.filter(image, image);
The first argument of the RescaleOp constructor determines the brightness scale. A value of 1f leaves the brightness unchanged, 0.75f makes the image 25% darker, and 1.3f makes it 30% brighter.
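A minimal self-contained version of the RescaleOp approach, wrapped in a hypothetical helper of my own naming, could look like this:

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.awt.image.RescaleOp;

public class BrightnessDemo {
    // Returns a brightness-adjusted copy of src; factor 1.0 leaves it unchanged.
    static BufferedImage adjustBrightness(BufferedImage src, float factor) {
        RescaleOp op = new RescaleOp(factor, 0, null);
        return op.filter(src, null); // null destination -> a new image is created
    }

    public static void main(String[] args) {
        BufferedImage img = new BufferedImage(4, 4, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = img.createGraphics();
        g.setColor(new Color(100, 100, 100));
        g.fillRect(0, 0, 4, 4);
        g.dispose();

        BufferedImage brighter = adjustBrightness(img, 1.3f);
        // Each channel is scaled by ~1.3, so 100 becomes roughly 130.
        System.out.println(new Color(brighter.getRGB(0, 0)).getRed());
    }
}
```

Note that filter(image, image) as in the answer works in place, while passing null as the destination keeps the original untouched, which is usually what you want when fading between day and night.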

Related

Is there a way to clear the colour from the memory of LWJGL?

I am creating a 2D game where I currently draw a quad filling the entire screen to give a blue background, then draw a stone texture on top of it. The issue is that when the program draws the stone, it gets a blue tint because it is still using the colour from the sky quad. Is there a way to clear the colour from the memory of OpenGL / GLFW / LWJGL? I am using Java 1.8 and LWJGL 3.2.3.
Here is the code:
private static void renderAir() {
    glfwMakeContextCurrent(InnocentDream.win);
    glBegin(GL_QUADS);
    glColor4f(0, 204 / 255f, 1, 1);
    glVertex2f(-256, 128);
    glColor4f(0, 204 / 255f, 1, 1);
    glVertex2f(256, 128);
    glColor4f(0, 108 / 255f, 250 / 255f, 1);
    glVertex2f(256, -128);
    glColor4f(0, 108 / 255f, 250 / 255f, 1);
    glVertex2f(-256, -128);
    glEnd();
}
I have the GLFW and GL11 classes as static imports.
public void renderTileAtPos(float x, float y) {
    float TILE_WIDTH = Tile.TILE_WIDTH / 2f;
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, texture.getTextureID());
    glBegin(GL_QUADS);
    glTexCoord2f(1, 0);
    glVertex2f(-TILE_WIDTH + x, TILE_WIDTH + y);
    glTexCoord2f(0, 0);
    glVertex2f(TILE_WIDTH + x, TILE_WIDTH + y);
    glTexCoord2f(0, 1);
    glVertex2f(TILE_WIDTH + x, -TILE_WIDTH + y);
    glTexCoord2f(1, 1);
    glVertex2f(-TILE_WIDTH + x, -TILE_WIDTH + y);
    glEnd();
    glDisable(GL_TEXTURE_2D);
}
And here is the tile rendering.
glColor4f(1,1,1,1);
Throw that call in either after your original sky code or just before you render the stone.
If you use glColor() at all, it's good practice to get in the habit of always calling it before drawing anything, or to call it only once at initialisation and never again.
There is no "remove the colour I just set" call; the current colour is simply one of the pieces of state that OpenGL hangs on to.

Horizontal tiling background

I'm coding for a game and want the background to repeat itself.
xOffset = (int) (camera.getX() % WIDTH);
g.drawImage(bgInv, xOffset - WIDTH, 0, WIDTH, HEIGHT, null);
g.translate(xOffset, 0);
g.drawImage(bg, 0, 0, WIDTH, HEIGHT, null);
g.translate(-xOffset, 0);
g.drawImage(bgInv, xOffset + WIDTH, 0, WIDTH, HEIGHT, null);
The first drawImage draws when the camera's X is negative
and the third when the camera's X is positive.
bg is the normal background
bgInv is the background inverted
The problem is that when I'm moving and xOffset goes from WIDTH to 0, there appears to be a "wrap" (logging xOffset to the console confirms it).
I know this is because I'm using modulo to get xOffset, but I haven't figured out a better way...
Thanks in advance
If I understood correctly, what you want to repeat is a 2 * WIDTH by HEIGHT image, where the left half is the background image and the right half is the same image horizontally inverted.
So what you can do is the following:
xOffset = (int) (camera.getX() % (2 * WIDTH));
// draw the background image at x = xOffset - 2 * WIDTH
g.drawImage(bg, xOffset - 2 * WIDTH, 0, WIDTH, HEIGHT, null);
g.drawImage(bgInv, xOffset - WIDTH, 0, WIDTH, HEIGHT, null);
// draw the background image at x = xOffset
g.drawImage(bg, xOffset, 0, WIDTH, HEIGHT, null);
g.drawImage(bgInv, xOffset + WIDTH, 0, WIDTH, HEIGHT, null);
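One detail worth knowing here: Java's % operator returns a negative result when the left operand is negative, so camera.getX() % (2 * WIDTH) can itself jump sign as the camera crosses zero. Math.floorMod keeps the offset in [0, 2 * WIDTH) regardless. A small sketch (the WIDTH value is hypothetical):

```java
public class TileOffset {
    static final int WIDTH = 800; // hypothetical background width in pixels

    // Wraps a camera x-position into the range [0, 2 * WIDTH), even when
    // cameraX is negative (where Java's % would return a negative value).
    static int wrapOffset(int cameraX) {
        return Math.floorMod(cameraX, 2 * WIDTH);
    }

    public static void main(String[] args) {
        System.out.println(wrapOffset(100));  // 100
        System.out.println(wrapOffset(-100)); // 1500
        System.out.println(wrapOffset(1700)); // 100
    }
}
```

With floorMod the offset decreases smoothly as the camera moves left past zero, instead of flipping sign.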

Why isn't this a square? LWJGL

I have a basic LWJGL window set up and I am trying to draw a square using glBegin(GL_QUADS). I call my Square class with Square square = new Square(25, 25, 25): the first two parameters are the starting coordinates and the last is the side length, as seen below. But the result is a rectangle. What am I doing wrong?
public Square(float x, float y, float sl) {
    GL11.glColor3f(0.5F, 0.0F, 0.7F);
    glBegin(GL11.GL_QUADS);
    glVertex2f(x, y);
    glVertex2f(x, y + sl);
    glVertex2f(x + sl, y + sl);
    glVertex2f(x + sl, y);
    glEnd();
}
My Viewport code
glMatrixMode(GL_PROJECTION);
glLoadIdentity(); // Resets any previous projection matrices
glOrtho(0, 640, 0, 480, 1, -1);
glMatrixMode(GL_MODELVIEW);
Using glOrtho(0, 640, 0, 480, 1, -1); constructs a non-square projection. That means the rendered output will most likely be skewed if your window is not the same size as your projection (or at least the same aspect ratio).
If your viewport is the same size as your window, then it should remain square. I'm using JOGL, but in my resize function, I reshape my viewport to be the new size of my window.
glcanvas.addGLEventListener(new GLEventListener() {
    @Override
    public void reshape(GLAutoDrawable glautodrawable, int x, int y, int width, int height) {
        GL2 gl = glautodrawable.getGL().getGL2();
        gl.glMatrixMode(GL2.GL_PROJECTION);
        gl.glLoadIdentity(); // Resets any previous projection matrices
        gl.glOrtho(0, width, 0, height, 1, -1);
        gl.glMatrixMode(GL2.GL_MODELVIEW);
    }

    // ... other methods
});
To draw a square around the point (x, y) you can calculate the four points that represent the corners of your square.
First you'll need your width-to-height ratio:
float ratio = width / height;
I will use defaultSize for the distance from the midpoint to the nearest side (half the side length).
Then you can calculate four values like so:
float a = x + defaultSize;
float b = ratio * (y + defaultSize);
float c = x - defaultSize;
float d = ratio * (y - defaultSize);
with which you can represent all four corners of your square. Since GL_QUADS is deprecated, I'll use GL_TRIANGLES.
glBegin(GL_TRIANGLES);
glColor3f(red, green, blue);
// upper left triangle
glVertex2f(a, b);
glVertex2f(c, b);
glVertex2f(c, d);
// lower right triangle
glVertex2f(a, b);
glVertex2f(c, d);
glVertex2f(a, d);
glEnd();
I don't know if this is the most performant or idiomatic way to do this since I just started exploring LWJGL.
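The corner arithmetic from the answer above can be checked in isolation with a small standalone helper; the 800x600 window size here is a hypothetical example:

```java
public class SquareCorners {
    // Corner values for a square centred on (x, y), compensating for a
    // non-square viewport by scaling the y-coordinates by width / height,
    // following the scheme described in the answer above.
    static float[] corners(float x, float y, float defaultSize, float width, float height) {
        float ratio = width / height;
        float a = x + defaultSize;           // right edge
        float b = ratio * (y + defaultSize); // top edge
        float c = x - defaultSize;           // left edge
        float d = ratio * (y - defaultSize); // bottom edge
        return new float[] { a, b, c, d };
    }

    public static void main(String[] args) {
        // Square of half-size 0.1 centred on the origin, 800x600 window.
        float[] k = corners(0f, 0f, 0.1f, 800f, 600f);
        System.out.printf("a=%.4f b=%.4f c=%.4f d=%.4f%n", k[0], k[1], k[2], k[3]);
    }
}
```

For a square centred on the origin, a and c stay symmetric about x while b and d are stretched by the ratio, which is what cancels the skew of the non-square projection.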

OpenGL 1.1 Changing Color changes previous color?

I am trying to render two 2D rectangles one after the other; I have the height and the width of the two rectangles together. But when I set the colour for the second quad, the first quad seems to change colour as well.
I have tried using glPopMatrix along with glPushMatrix, but that makes no difference. I have also tried resetting the colour with glColor4f(1, 1, 1, 1).
Here is my code:
protected void renderComponent(Frame component) {
    Rectangle area = new Rectangle(component.getArea());
    int fontHeight = theme.getFontRenderer().FONT_HEIGHT;
    int titleHeight = 25;
    translateComponent(component, false);
    glEnable(GL_BLEND);
    glDisable(GL_CULL_FACE);
    glDisable(GL_TEXTURE_2D);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    if (component.isMinimized()) {
    }
    glBegin(GL_QUADS);
    {
        RenderUtil.setColor(titleColor);
        glVertex2d(0, 0);
        glVertex2d(area.width, 0);
        glVertex2d(area.width, titleHeight);
        glVertex2d(0, titleHeight);
    }
    glEnd();
    glBegin(GL_QUADS);
    {
        RenderUtil.setColor(component.getBackgroundColor());
        glVertex2d(0, 0);
        glVertex2d(area.width, 0);
        glVertex2d(area.width, area.height + titleHeight);
        glVertex2d(0, area.height + titleHeight);
    }
    glEnd();
    glEnable(GL_TEXTURE_2D);
    theme.getFontRenderer().func_175063_a(component.getTitle(), getCenteredX(area.width, component.getTitle()), 6, RenderUtil.toRGBA(component.getForegroundColor()));
    glEnable(GL_CULL_FACE);
    glDisable(GL_BLEND);
}
And my util setColor method:
public static void setColor(Color c) {
    glColor4f(c.getRed() / 255f, c.getGreen() / 255f, c.getBlue() / 255f, c.getAlpha() / 255f);
}
You appear to be drawing the second rectangle over top of the first, which makes it look as though you've changed the colour of the first.
Use the coordinates below for the second rectangle instead:
glVertex2d(0, titleHeight);
glVertex2d(area.width, titleHeight);
glVertex2d(area.width, area.height + titleHeight);
glVertex2d(0, area.height + titleHeight);
This will place the second rectangle below the first, and give it a height of area.height.

Java - OpenGL set offset for center (0,0) coordinates

I am doing something with the JOGL libraries (forced to) and I can't figure out how to offset the centre (0, 0) coordinates. I would like to offset them to the bottom of my viewport, in the method
public void reshape(GLAutoDrawable drawable, int x, int y, int width, int height)
but I can't find any way to translate the int height into meaningful offset float coordinates.
edit:
gl.glViewport(0, 0, width, height);
gl.glMatrixMode(GL_PROJECTION);
gl.glLoadIdentity();
glu.gluPerspective(45.0, width / (float) height, 0.1, 100.0);
gl.glMatrixMode(GL_MODELVIEW);
gl.glLoadIdentity();
Solved using
glu.gluLookAt(0, 0, 1, 0, 0.42, 0, 0, 1, 0);
Didn't realize the offset wasn't relative.
