Rendering polygons with a VBO seems to lose some vertices - java

I'm trying to render a set of polygons; I have a set of points and I'm not doing any triangulation.
If I render my VBO in GL_LINE_LOOP mode, the lines connect the right vertices, but when I try to render filled polygons from the same buffer using GL_POLYGON, I get wrong vertices; it's as if some points just disappear.
I tried disabling OpenGL polygon smoothing, but the result is the same.
Any tips?
This image shows the lines and the filled polygon, which I expected to be the same.

GL_POLYGON is only for convex, coplanar polygons.
Make sure the points in your VBO form one.
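To check whether your outline actually is convex before handing it to GL_POLYGON, a quick CPU-side test can help. The sketch below is a minimal, standalone helper (the class name and the flat x/y layout are my own assumptions, not from the question's code), assuming 2D points or points already projected onto the polygon's plane: it walks consecutive edges and looks for a sign flip in the cross product.

```java
// A minimal convexity check for a flat array of 2D vertices: x0, y0, x1, y1, ...
// If the sign of the cross product flips between consecutive edges, the polygon
// is concave and GL_POLYGON will not fill it correctly.
public final class ConvexityCheck {

    public static boolean isConvex(float[] xy) {
        int n = xy.length / 2;
        if (n < 4) return true;               // a triangle is always convex
        boolean positive = false, negative = false;
        for (int i = 0; i < n; i++) {
            float ax = xy[2 * i],               ay = xy[2 * i + 1];
            float bx = xy[2 * ((i + 1) % n)],   by = xy[2 * ((i + 1) % n) + 1];
            float cx = xy[2 * ((i + 2) % n)],   cy = xy[2 * ((i + 2) % n) + 1];
            // z component of the cross product of edge AB and edge BC
            float cross = (bx - ax) * (cy - by) - (by - ay) * (cx - bx);
            if (cross > 0) positive = true;
            if (cross < 0) negative = true;
            if (positive && negative) return false; // turn direction flipped: concave
        }
        return true;
    }
}
```

If the test reports a concave outline, you'll need to triangulate it (for example with the GLU tessellator or a simple ear-clipping pass) or split it into convex pieces before filling; GL_LINE_LOOP happily draws either, which is why the wireframe looks right.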

Related

Rendering a triangular face in 3d space

Let's say I have a triangular face in 3D space, I have the 3D coordinates of each vertex of this triangle, and I also have other information about the triangle (angles, lengths of sides, etc.). In Java, given the viewing screen and its information, how can I draw that face to an image without using libraries like LWJGL, assuming I can properly project any 3D point to the 2D image, accounting for perspective?
Would the best course of action just be to run a loop that draws each point on the plane to a point on the image (i.e. setting the corresponding pixel), which would most likely set the same pixel multiple times? If I did this, what would be the best way to identify each point of an oblique triangle, or a triangle that doesn't line up nicely with the axes?
tl;dr: I have a triangular face in 3d space, a "camera" looking at the face, and an image in which I can set each pixel. Using no GL libraries, what's the best way to project and draw that face onto the image?
Projection:
I won't detail this, as you seem to already know it.
Drawing a line:
You can look at the Bresenham algorithm if you want to start with the basics (it's hardwired in recent graphics cards).
Filling:
You can fill between the left and right borders of the triangle while running Bresenham down both edges (or you could use a flood-fill algorithm starting, say, at the projection of the triangle's center); see the sketch below.
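A minimal sketch of the "fill between the left and right borders" idea, assuming the triangle has already been projected to integer screen coordinates and that a BufferedImage is the render target. It interpolates the edges with plain floats rather than an explicit Bresenham loop, which is simpler but follows the same span-filling approach.

```java
import java.awt.image.BufferedImage;

// Scanline-fills a single 2D triangle given as parallel arrays xs[3], ys[3].
public final class TriangleFill {

    public static void fill(BufferedImage img, int[] xs, int[] ys, int rgb) {
        int minY = Math.max(0, Math.min(ys[0], Math.min(ys[1], ys[2])));
        int maxY = Math.min(img.getHeight() - 1, Math.max(ys[0], Math.max(ys[1], ys[2])));
        for (int y = minY; y <= maxY; y++) {
            float left = Float.POSITIVE_INFINITY, right = Float.NEGATIVE_INFINITY;
            // Intersect the scanline with each of the three edges.
            for (int i = 0; i < 3; i++) {
                int j = (i + 1) % 3;
                int y0 = ys[i], y1 = ys[j];
                if ((y < y0) == (y < y1)) continue;      // edge does not cross this scanline
                float t = (y - y0) / (float) (y1 - y0);  // interpolation factor along the edge
                float x = xs[i] + t * (xs[j] - xs[i]);
                left = Math.min(left, x);
                right = Math.max(right, x);
            }
            int xStart = Math.max(0, (int) Math.ceil(left));
            int xEnd = Math.min(img.getWidth() - 1, (int) Math.floor(right));
            for (int x = xStart; x <= xEnd; x++) {
                img.setRGB(x, y, rgb);                   // fill the span between the borders
            }
        }
    }
}
```

Filling spans this way also gives you a natural place to interpolate depth or texture coordinates later, which a flood fill does not.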
Your best bet is to check out the g.fillPolygon() method in Java. It lets you draw polygons with as many sides as you need, and there's also g.drawPolygon() if you don't want them filled. Then you can do some simple maths for the points: each point is basically its x and y, except that if the polygon is further away the points move closer to the polygon's center, and if it is closer they move further away from the center.
A second idea could be to use some sort of array to store pixels, then research line-drawing algorithms, draw the lines, put all the line data into another array, and apply some sort of flood fill. Then, while it's in that array, you could manipulate the pixels further if you wanted textures or other effects.
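For completeness, here is a hedged sketch of the fillPolygon() route: project the three 3D vertices yourself (project3D() below is a throwaway placeholder for whatever perspective projection you already have) and let java.awt do the rasterisation.

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public final class FillPolygonDemo {

    /** Placeholder: replace with your own perspective projection. */
    static int[] project3D(double x, double y, double z) {
        double scale = 200.0 / (z + 5.0);                // toy perspective divide
        return new int[] { (int) (320 + x * scale), (int) (240 - y * scale) };
    }

    public static void main(String[] args) {
        BufferedImage img = new BufferedImage(640, 480, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = img.createGraphics();

        double[][] tri = { { -1, 0, 2 }, { 1, 0, 2 }, { 0, 1.5, 3 } }; // 3D vertices
        int[] xs = new int[3], ys = new int[3];
        for (int i = 0; i < 3; i++) {
            int[] p = project3D(tri[i][0], tri[i][1], tri[i][2]);
            xs[i] = p[0];
            ys[i] = p[1];
        }

        g.setColor(Color.RED);
        g.fillPolygon(xs, ys, 3);    // filled face
        g.setColor(Color.WHITE);
        g.drawPolygon(xs, ys, 3);    // outline on top
        g.dispose();
    }
}
```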

Thickness of OpenGL 2D Textures

What is the easiest way to give a 2D texture in OpenGL (LWJGL) some kind of "thickness"? Of course I could somehow find the border of the texture and add quads, oriented by the normal of the quad the texture is drawn on, in the color of the adjacent texture pixel, but there has to be an easier way to do it.
Minecraft uses LWJGL as well, and it has the (new) 3D items that spin on the ground and don't cause as much of a performance hit as they would if they were built out of dozens of polygons. Also, when you hold an item in your hand, there is that kind of "stretched" texture with depth, which also works with high-resolution textures.
Does anyone know how that is done?
A 2D texture is always infinitely thin. If you want actual thickness (visible when you look onto its edge), you need geometry. In Minecraft things look blocky because they've been modeled that way.
If you look at the surface from an angle and ignore the edges, you can use parallax mapping to "fake" some depth in the texture. Or you can use a depth map and a combination of tessellation and vertex shaders to implement displacement mapping, which generates geometry from the texture.
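If you do go the geometry route, one way (an assumption for illustration, not necessarily what Minecraft does) is to walk the texture's alpha channel and emit a thin side quad wherever an opaque pixel borders a transparent one. A rough sketch, with one pixel treated as one unit and the thickness along z:

```java
import java.awt.image.BufferedImage;
import java.util.ArrayList;
import java.util.List;

// Builds side quads for the exposed edges of opaque pixels. The vertex format
// here (x, y, z triples, four corners per quad) is an illustrative assumption.
public final class TextureExtruder {

    public static List<float[]> sideQuads(BufferedImage tex, float thickness) {
        List<float[]> quads = new ArrayList<>();
        for (int y = 0; y < tex.getHeight(); y++) {
            for (int x = 0; x < tex.getWidth(); x++) {
                if (!opaque(tex, x, y)) continue;
                // Emit a side quad for every exposed edge of this pixel.
                if (!opaque(tex, x - 1, y)) quads.add(quad(x,     y,     x,     y + 1, thickness)); // left
                if (!opaque(tex, x + 1, y)) quads.add(quad(x + 1, y,     x + 1, y + 1, thickness)); // right
                if (!opaque(tex, x, y - 1)) quads.add(quad(x,     y,     x + 1, y,     thickness)); // top
                if (!opaque(tex, x, y + 1)) quads.add(quad(x,     y + 1, x + 1, y + 1, thickness)); // bottom
            }
        }
        return quads;
    }

    private static boolean opaque(BufferedImage tex, int x, int y) {
        if (x < 0 || y < 0 || x >= tex.getWidth() || y >= tex.getHeight()) return false;
        return ((tex.getRGB(x, y) >>> 24) & 0xFF) > 0;   // any alpha counts as solid
    }

    /** Four corners of one side quad: front edge at z = 0, back edge at z = -thickness. */
    private static float[] quad(int x0, int y0, int x1, int y1, float t) {
        return new float[] { x0, y0, 0,  x1, y1, 0,  x1, y1, -t,  x0, y0, -t };
    }
}
```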

Rotating a Sprite in Java

While working on Projectiles I thought that it would be a good idea to rotate the sprite as well, to make it look nicer.
I am currently using a 1-dimensional array, and the sprite's width and height can and will vary, which makes it a bit more difficult for me to figure out how to do this correctly.
I will be honest and say it straight out: I have absolutely no idea how to do this. I have done a few searches to try to find something, and there were some things out there, but the best I found was this:
DreamInCode ~ Rotating a 1-dimensional Array of Pixels
This method works fine, but only for square sprites. I would also like to apply it to non-square (rectangular) sprites. How could I set it up so that rectangular sprites can be rotated?
Currently, I'm attempting to make a laser, and it would look much better if it didn't only go along a vertical or horizontal axis.
You need to recalculate the coordinates of your image's points (take a look here). You have to multiply every point (x, y) of your sprite by the rotation matrix to get the new point (x', y').
You can assume that the bottom left (or the top left, depending on your coordinate system's orientation) of your sprite is at (x, y) = (0, 0).
You should recalculate the colors too: if a pure red pixel surrounded by blue pixels sits at (x, y) = (10, 5), after the rotation it can land at, say, (x, y) = (8.33, 7.1), which isn't a real pixel position, since pixels don't have fractional coordinates. So the pixel at the real position (x, y) = (8, 7) will no longer be pure red, but red with a small percentage of blue. But one thing at a time.
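A minimal sketch of that idea for a 1D pixel array of arbitrary width and height (my own helper, not the code from the DreamInCode thread): it uses inverse mapping, i.e. for every destination pixel it rotates backwards into the source and copies the nearest source pixel, so rectangular sprites work and no destination pixel is left unset. Color blending is skipped (nearest neighbour).

```java
public final class SpriteRotation {

    /** Rotates the sprite by 'angle' radians around its centre; the destination has the same size. */
    public static int[] rotate(int[] src, int width, int height, double angle) {
        int[] dst = new int[src.length];
        double cos = Math.cos(angle), sin = Math.sin(angle);
        double cx = width / 2.0, cy = height / 2.0;
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                // Rotate the destination pixel back into source space (inverse rotation).
                double dx = x - cx, dy = y - cy;
                int sx = (int) Math.round( dx * cos + dy * sin + cx);
                int sy = (int) Math.round(-dx * sin + dy * cos + cy);
                if (sx >= 0 && sx < width && sy >= 0 && sy < height) {
                    dst[y * width + x] = src[sy * width + sx];
                }
                // Pixels that rotate in from outside the source stay 0 (transparent).
            }
        }
        return dst;
    }
}
```

Note that the corners of a rotated rectangle can fall outside a same-sized destination buffer; the padding trick in the next answer avoids that by giving the sprite a square canvas large enough for any rotation.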
It's easier than you think: you only have to copy the original rectangular sprites, centered, into bigger square ones with a transparent background. .png files support transparency, and I think you can use them here.

How to get vertices of rotated Mesh in Libgdx without render?

First I would like to know if it's possible to rotate a Mesh without drawing it.
If it is possible then how could I get the new vertices of the rotated mesh?
I need this to verify whether a certain Mesh is still inside a rectangle after rotation; I only want to draw it if it is still inside.
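One possible approach (a hedged sketch, assuming libGDX's Mesh/Matrix4 API and that the position attribute sits at offset 0 of each vertex) is to read the vertices back, transform them on the CPU, and test the transformed positions against the rectangle, all without drawing:

```java
import com.badlogic.gdx.graphics.Mesh;
import com.badlogic.gdx.math.Matrix4;
import com.badlogic.gdx.math.Vector3;

public final class MeshRotation {

    /** Returns the mesh's vertex positions after applying 'rotation'; the mesh itself is untouched. */
    public static Vector3[] rotatedPositions(Mesh mesh, Matrix4 rotation) {
        int floatsPerVertex = mesh.getVertexSize() / 4;       // getVertexSize() is in bytes
        float[] verts = new float[mesh.getNumVertices() * floatsPerVertex];
        mesh.getVertices(verts);

        Vector3[] out = new Vector3[mesh.getNumVertices()];
        for (int i = 0; i < out.length; i++) {
            int o = i * floatsPerVertex;                      // position assumed at offset 0
            out[i] = new Vector3(verts[o], verts[o + 1], verts[o + 2]).mul(rotation);
        }
        return out;
    }
}
```

You could then build the rotation with something like `new Matrix4().setToRotation(Vector3.Z, degrees)` and check each returned Vector3 against the rectangle's bounds before deciding whether to draw.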

Strange issues with texture mapping

I am attempting to use texture coordinates from a pre-generated PNG file on a 3d world of quads loaded into Java with LWJGL's slick-util extension.
The texture file is 192x96 pixels and properly formatted. It is composed of a 6x3 grid of 32x32 tiles.
The 3d quads are 1.5f wide and long. They are spaced apart properly.
I am having trouble getting the right texture coordinates. When I use 0.0f to 0.333333f as the y coordinates, I get slightly more than the top tile's height displayed. However, if I use 0.0f to 0.25f, I get exactly one third, which is my tile's height. I have yet to find a magic number for the x coordinates, but maybe someone could explain to me why 1/4 of 96 is 24 according to the texture coordinates, or what I'm doing wrong? I suspect it could be a clash between my quad size and the texture.
The tops of the cubes use the texture coordinates (0.0f, 0.0f), (0.0f, 0.333333f), (0.166666f, 0.333333f), (0.166666f, 0.0f), applied moving anticlockwise from the top left to the top right. Again, the main texture file is made of 32x32 tiles arranged into a 192x96 image (96 being the height).
Notice I placed a white line at the top of one of the tiles to see its border, and a black line at its bottom, then a white line for the top of the next tile below it. The texture 'bleeds' too far down. The other textures have their own, even stranger, coordinates, as you can see.
Arranging texture coordinates with the assumption the top of the image is 1.0 rather than the bottom produces odd squares with a rectangular hole in the center where quads should be.
I am using TEX_ENV GL_MODULATE.
Texture sizes are usually a power of 2. I suspect something resized your 192x96 texture to 256x128 or 256x256. This doesn't really explain the values you found, however... But I think that if you resize your texture to 256x256 (increase the canvas size, don't scale!) and calculate your texture coordinates based on that, your problem will go away.
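A minimal sketch of that suggestion, assuming the 192x96 atlas is padded (not scaled) into the top-left corner of a 256x256 image and UVs are computed against the padded size; the names here are illustrative, not taken from the question's code:

```java
public final class TileUV {

    static final int ATLAS_SIZE = 256;   // padded power-of-two size
    static final int TILE = 32;          // tile size in pixels

    /** Returns {u0, v0, u1, v1} for the tile at (column, row), counted from the top-left. */
    public static float[] uv(int column, int row) {
        float u0 = (column * TILE) / (float) ATLAS_SIZE;
        float v0 = (row * TILE) / (float) ATLAS_SIZE;
        float u1 = ((column + 1) * TILE) / (float) ATLAS_SIZE;
        float v1 = ((row + 1) * TILE) / (float) ATLAS_SIZE;
        return new float[] { u0, v0, u1, v1 };
    }
}
```

With this layout each tile spans 32/256 = 0.125 in both u and v, rather than the 1/6 and 1/3 you would expect from the unpadded 192x96 image.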
I don't know about Java, but with my image atlases in Objective-C and OpenGL ES you need to make the texture regions slightly smaller than what you refer to when selecting them from the atlas.
Have you left a sufficient gap between the texture images to prevent 'bleeding'?
