Checking depth/z when rendering triangular faces in 3d space - java

My question can be simplified to the following: If a 3d triangle is being projected and rendered to a 2d viewing plane, how can the z value of each pixel being rendered be calculated in order to be stored to a buffer?
I currently have a working Java program that renders 3d triangles to the 2d view as solid colors, and the camera can be moved, rotated, etc. with no problem, working exactly as one would expect. But if I render two overlapping triangles, the one closer to the camera doesn't always obscure the farther one as expected. A Z buffer seems like the best remedy for this: store the z value of each pixel I render to the screen, and whenever another pixel is about to be rendered to the same coordinate, compare its z value against the stored one to decide which to keep. The issue I'm now facing is as follows:
How do I determine the z value of each pixel I render? I've thought about it, and there seem to be a few possibilities. One option is to find the equation of the plane (ax + by + cz + d = 0) on which the face lies, then interpolate for each pixel in the triangle being rendered (e.g. a pixel halfway across the 2d rendered triangle in x maps to halfway across the 3d triangle in x, same for y, then solve the plane's equation for z), though I'm not certain this would work. The other option I thought of is to iterate through the points of the 3d triangle at some fixed step and render each point individually, using the z of that point (which I'd probably also have to find through the plane's equation).
Again, I'm currently mainly considering interpolation, so the pseudo-code would look like this (if I have the plane's equation as "ax + by + cz + d = 0"):
xrange = (pixel.x - 2dtriangle.minX)/(2dtriangle.maxX - 2dtriangle.minX)
yrange = (pixel.y - 2dtriangle.minY)/(2dtriangle.maxY - 2dtriangle.minY)
x3d = (3dtriangle.maxX - 3dtriangle.minX) * xrange + 3dtriangle.minX
y3d = (3dtriangle.maxY - 3dtriangle.minY) * yrange + 3dtriangle.minY
z = (-d - a*x3d - b*y3d)/c
Where pixel.x is the x value of the pixel being rendered, 2dtriangle.minX and 2dtriangle.maxX are the minimum and maximum x values of the triangle being rendered (i.e. of its bounding box) after having been projected onto the 2d view, and its min/max Y variables are the same, but for its y. 3dtriangle.minX and 3dtriangle.maxX are the minimum and maximum x values of the 3d triangle before having been projected onto the 2d view, a, b, c, and d are the coefficients of the equation of the plane on which the 3d triangle lies, and z is the corresponding z value of the pixel being rendered.
Will that method work? If there's any ambiguity please let me know in the comments before closing the question! Thank you.
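For reference, here's the plane-equation step as a minimal Java sketch: deriving a, b, c, d from the triangle's three 3d vertices via a cross product, then solving for z. Vec3 and Plane are just illustrative names, not from my actual code.

// Minimal sketch: derive the coefficients of ax + by + cz + d = 0
// from the triangle's three 3d vertices.
final class Vec3 {
    final double x, y, z;
    Vec3(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
}

final class Plane {
    final double a, b, c, d;

    Plane(Vec3 p0, Vec3 p1, Vec3 p2) {
        // Two edge vectors of the triangle...
        double ux = p1.x - p0.x, uy = p1.y - p0.y, uz = p1.z - p0.z;
        double vx = p2.x - p0.x, vy = p2.y - p0.y, vz = p2.z - p0.z;
        // ...whose cross product is the plane normal (a, b, c).
        a = uy * vz - uz * vy;
        b = uz * vx - ux * vz;
        c = ux * vy - uy * vx;
        // Any vertex on the plane fixes d.
        d = -(a * p0.x + b * p0.y + c * p0.z);
    }

    // Solve the plane equation for z; undefined when c == 0,
    // i.e. when the triangle is viewed exactly edge-on.
    double zAt(double x, double y) {
        return (-d - a * x - b * y) / c;
    }
}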

The best solution would be to calculate the depth for each vertex of the triangle. Then we can get the depth of each pixel the same way we get the colors when rendering a triangle with Gouraud shading. Doing that while rendering makes the depth check easy.
Say we have a triangle with a known depth at each vertex, and we start to draw scanlines from the top to the bottom. We calculate the depth slopes from the top vertex to each of the others, and add the correct amount of depth every time we move to the next line... And so on.
You didn't provide your rendering method, so I can't say anything specific to it, but you should take a look at some tutorials on Gouraud shading. With some simple modifications you should be able to use the same technique for depth values.
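For illustration, a minimal Java sketch of the scanline part, assuming the vertices have already been projected to screen space with a depth value attached; the edge walking (finding xL/zL and xR/zR per line) is assumed to happen in the caller, and DepthRaster and its fields are made-up names.

// Minimal sketch: step depth along each scanline exactly as Gouraud
// shading steps color.
final class DepthRaster {
    final double[][] depth;   // every entry starts at +infinity (= far away)
    final int[][] frame;

    DepthRaster(int w, int h) {
        depth = new double[h][w];
        frame = new int[h][w];
        for (double[] row : depth) java.util.Arrays.fill(row, Double.POSITIVE_INFINITY);
    }

    // Fill one scanline between two edge crossings, interpolating depth.
    void scanline(int y, double xL, double zL, double xR, double zR, int color) {
        if (xL > xR) { double t = xL; xL = xR; xR = t; t = zL; zL = zR; zR = t; }
        double dz = (xR > xL) ? (zR - zL) / (xR - xL) : 0.0;  // depth slope per pixel
        double z = zL + dz * (Math.ceil(xL) - xL);
        for (int x = (int) Math.ceil(xL); x <= (int) Math.floor(xR); x++, z += dz) {
            if (z < depth[y][x]) {   // new pixel is closer: keep it
                depth[y][x] = z;
                frame[y][x] = color;
            }
        }
    }
}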
Well, hopefully this helps!

Related

Determining IF & WHERE a line intersects with a 2D plane (in 3D space)

The following problem I'm working on is for one of my favorite pastimes: game development.
Problem:
We're in 3D space. I'm trying to determine whether a line between two vectors in said space passes through a circle; the latter consists of: a center vector, a radius, and yaw & pitch.
In order to determine that, my aim is to convert the circle to a plane, which can either be infinite or just have the diameter of the circle for all its sides.
Should the line between the two vectors in fact pass through that plane, I am left with the simple task of determining whether that intersection point is within the radius of the circle, in which case I can return either true or false.
What's already working:
I have my circles set up and the general framework is there. The circles are appearing/rendered in the 3D space exactly as specified, great!
What was already tried:
I copied some GitHub gist code and tried to make it work for my purposes. It kinda worked, sometimes at least. Unfortunately, due to the way the code was written, I had no idea what it was doing and just scrapped all of it.
I researched the topic a lot, too. But since I don't really understand the language people use when talking about line/plane intersections, I could have read the answer without recognizing it as such.
Question:
I'm stuck at line intersections. I have no idea where to go or how this works logically! So, where do I go from here, and how can one comprehend all of this?
Note:
I did tag this issue as "java", but I'm not looking for spoon-fed code. It's a logical issue I'm trying to get past. If it's explained well enough, I will make the code work through trial and error!
Say your circle is a circle in the XY plane with its centre at (0,0,0) and radius 1. How would you solve that?
You would check the values of X and Y where the line has Z equal to zero. X squared plus Y squared would be less than 1 (the radius squared) if the line passes through the circle.
In other words, you could transform the 3D coordinates to a simpler reference frame. So I think you need to learn transformation of 3D coordinates, which is really not too hard to do. You need to rotate the 3D space around until the centre vector only has a Z component and yaw and pitch are zero, and then offset the coordinates so the circle centre is at (0, 0, 0). Then apply the same transformation to the line. You could lastly scale by the radius, but to be honest that is not so important, since the circle math is easy.
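A minimal Java sketch of that transform-first approach, assuming yaw is a rotation about the vertical (Y) axis and pitch about the sideways (X) axis - your conventions may differ, and all names here are mine, not from the question.

// In the circle's local frame the circle lies in the XY plane at the origin,
// so the test reduces to the simple case above.
static boolean segmentHitsCircle(double[] p1, double[] p2,
                                 double[] center, double radius,
                                 double yaw, double pitch) {
    double[] a = toLocal(p1, center, yaw, pitch);
    double[] b = toLocal(p2, center, yaw, pitch);
    // The circle's plane is z = 0 locally. Find the parameter t where the
    // segment crosses it: a.z + t*(b.z - a.z) = 0.
    double dz = b[2] - a[2];
    if (dz == 0.0) return false;              // segment parallel to the plane
    double t = -a[2] / dz;
    if (t < 0.0 || t > 1.0) return false;     // crossing lies outside the segment
    double x = a[0] + t * (b[0] - a[0]);
    double y = a[1] + t * (b[1] - a[1]);
    return x * x + y * y <= radius * radius;  // inside the circle?
}

// Translate so the center is the origin, then rotate yaw and pitch back.
static double[] toLocal(double[] p, double[] c, double yaw, double pitch) {
    double x = p[0] - c[0], y = p[1] - c[1], z = p[2] - c[2];
    // Undo yaw (rotation about the Y axis)...
    double cy = Math.cos(-yaw), sy = Math.sin(-yaw);
    double x1 = cy * x + sy * z, z1 = -sy * x + cy * z;
    // ...then undo pitch (rotation about the X axis).
    double cp = Math.cos(-pitch), sp = Math.sin(-pitch);
    double y1 = cp * y - sp * z1, z2 = sp * y + cp * z1;
    return new double[] { x1, y1, z2 };
}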

Calculate meter offset based on decimal degree position

I have a position given in decimal degrees (x.xxxxxxxx and y.yyyyyyyy). I need to draw a rectangle around it. The center of the rectangle matches the position. The dimensions of the rectangle are given in meters, and it has a rotation ranging from 0-360 degrees.
Question
How can I calculate the four corners of the rectangle and return the result as four decimal degree values? Like arrayOf<LatLon> getRectangle(LatLon position, int rectWidthCm, int rectLengthCm, double rectRotation).
Example
I have a position given in LatLon format with two values: latitude and longitude. We will assume this location is precise.
The main task is to draw a rectangle based on this position in a Google Maps chart. The rectangle can have any dimensions, but let's use these in this example: width = 0.9 meters and length = 1.2 meters. Any heading may also be given, so let's use this heading: 45. 0 is north, going clockwise (east = 90, south = 180 and west = 270). When the rectangle is pointing north, its length runs in the north/south direction. Finally, the rectangle's center should be equal to the given position.
Note: The project setup is an Android application with Kotlin support and a Google Maps chart. I am interested in a modern approach to this problem. Regarding precision, any loss should be within centimeters at most.
I understand that you are looking for a function geo_rect(x,y,w,h,a) with the following parameters
x is the longitude according to WGS84
y is the latitude
w is the width of the rectangle in meters
h is the height of the rectangle in meters
a is the angle by which the rectangle is turned from w being horizontal (meaning pointing exactly West to East). I suggest allowing only values within the open interval (-90°,90°), as this makes the math easier to understand.
Your function getRectangle(LatLon position, int rectWidthCm, int rectLengthCm, double rectRotation) delivers all the required information; you just need a small wrapper function which determines w, h, and a from rectWidthCm, rectLengthCm and rectRotation, with the latter being within [0°,360°).
The function geo_rect() will return an arrayOf<LatLon> of length four, namely the coordinates of all four corners, starting at the top left and then going clockwise. We will refer to the points as P_NW, P_NE, P_SE, and P_SW respectively.
Assumptions
In order to keep things mathematically feasible, we make some assumptions
We assume, as an approximation, that the rectangle is planar, which is okay if w ~ h << r, with r = 6378 km being the radius of the Earth.
We further assume that the Earth is an ideal sphere rather than an ellipsoid, or something even bumpier. For an accessible article on that issue, see e.g. Zachary C. Eilon's blog.
Basic structure of the algorithm
The algorithm could be structured as follows:
Determine the distance d from (x,y) to all four corner points. Because of our first assumption we can use simple Euclidean geometry rather than intricate spherical geometry. Pythagoras holds: d^2 = (w/2)^2 + (h/2)^2.
We also need the four bearings, e.g. b_NW for the angle between the vector pointing to the North Pole and the vector pointing from (x,y) to point P_NW.
Given the information (x, y, d, b_NW, b_NE, b_SW, b_SE) from the previous steps, we can now follow Get lat/long given current point, distance and bearing to calculate the positions of all four points. This is the mathematically hard part, for which I suggest using a well-established and tested library (see the sketch after this list).
Last but not least, let us double-check whether the calculation went well by evaluating great-circle distances between some or all pairs of points. For instance, d(P_NE,P_NW) should be approximately w, and d(P_NW,P_SW) approximately h. Don't be surprised if there is actually a difference - these errors are due to the assumptions we made. Normal GPS under usual conditions won't let you determine your position to the centimeter anyhow; you would need DGPS for that.
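For illustration, here is a small Java sketch of steps 1-3 on a spherical Earth, using the question's heading convention (0 = north, clockwise) rather than the a defined above. The destination-point formula is the standard great-circle one referenced in step 3; all names and the radius constant are mine.

// Sketch: corners of a w x h rectangle centered at (latDeg, lonDeg),
// rotated by heading aDeg. Angles in degrees, w and h in meters.
// Returns {lat, lon} pairs in the order NE, SE, SW, NW.
static double[][] geoRect(double latDeg, double lonDeg, double w, double h, double aDeg) {
    final double R = 6378137.0;                           // Earth radius, meters
    double d = Math.sqrt(w * w + h * h) / 2.0;            // center-to-corner distance
    double c = Math.toDegrees(Math.atan2(w / 2, h / 2));  // corner offset from heading
    double[] bearings = { aDeg + c, aDeg + 180 - c, aDeg + 180 + c, aDeg - c };
    double lat1 = Math.toRadians(latDeg), lon1 = Math.toRadians(lonDeg);
    double delta = d / R;                                 // angular distance
    double[][] corners = new double[4][2];
    for (int i = 0; i < 4; i++) {
        double theta = Math.toRadians(bearings[i]);
        // Standard "destination point given distance and bearing" formula.
        double lat2 = Math.asin(Math.sin(lat1) * Math.cos(delta)
                    + Math.cos(lat1) * Math.sin(delta) * Math.cos(theta));
        double lon2 = lon1 + Math.atan2(
                    Math.sin(theta) * Math.sin(delta) * Math.cos(lat1),
                    Math.cos(delta) - Math.sin(lat1) * Math.sin(lat2));
        corners[i] = new double[] { Math.toDegrees(lat2), Math.toDegrees(lon2) };
    }
    return corners;
}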
Further reading
At https://www.movable-type.co.uk/scripts/latlong-vectors.html you can experiment online to determine a destination point along a great-circle given the distance and bearing from a start point (in our case: the center of the rectangle).
An old, but amazingly well-documented and well-tested toolkit for geo-applications in general is the https://www.generic-mapping-tools.org/ - you might want to look at the command gmtvector.
If you are looking for java implementations, I found e.g.
https://introcs.cs.princeton.edu/java/12types/GreatCircle.java.html - one of many implementations for calculating great-circle distances
Need a standalone Java library for performing spatial calculations on lat/lon data
Calculate point based on distance and direction

can't implement the movement of 2d mechanism

I'm implementing a Chebyshev walking mechanism, like this one.
And I've got a problem: the edges of the mechanism don't move the way they're meant to.
For now I have a GUI with some controls using Java 8 Swing; it draws the mechanism, but the movement is a problem, as I said.
Here is my GitHub repo and the class with the problematic method, DFS_movement().
So I want this mechanism to move like the actual one, with constant edge lengths and all this stuff.
Maybe you need the formulas, i.e. the equations for the position (x,y) of the end that moves (almost) along a straight line, with respect to the rotation angle a (which describes the circular motion of the "first" bar)? Here the origin of the coordinate system is at the point of rotation of the first bar, and the rotation angle a is the angle between the first bar and the horizontal x-axis. If that is the case, the equations are:
x = 2*A - 2*A*sqrt( (5 + cos(a))/(5 - 4*cos(a)) )*sin(a)
y = 2*A*sqrt( (5 + cos(a))/(5 - 4*cos(a)) )*(2 - cos(a))
A is the length of the first bar, the one that rotates around its fixed end, attached to the origin of the coordinate system. The distance between the origin and the other fixed point of the linkage is 2A.
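Transcribed into Java (taking the formulas above at face value, with a in radians), so they can be plugged into the Swing drawing code:

// End-point position for crank angle a; A is the length of the
// rotating bar, as defined above.
static double[] chebyshevEndpoint(double A, double a) {
    double k = Math.sqrt((5 + Math.cos(a)) / (5 - 4 * Math.cos(a)));
    double x = 2 * A - 2 * A * k * Math.sin(a);
    double y = 2 * A * k * (2 - Math.cos(a));
    return new double[] { x, y };
}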

How to determine the distance from an obstacle without knowing its location

I am writing code where I have a world filled with various obstacles (of rectangular shape). My robot, which is a circle, originates at a random place inside the world. I assume that it has a range sensor on its head and want to get the distance to the nearest obstacle/boundary wall in its straight line of view.
I am using a random orientation between 0 and 360 degrees to orient the robot, and use the sin and cos of the orientation to move the robot along that orientation. But how can I get the distance to an obstacle or the boundary wall along this orientation? It should tell me the distance of the first object the robot encounters in its vision, at any angle from 0 to 360.
Could you give me a hint about the logic for approaching this issue?
Thanks
Assuming you know the angle, the robot's position and the position of all the obstacles, you could have a function like this:
if the angle is less than 90 or greater than 270, you increment the x coordinate by 1; otherwise you decrement it by 1
you make a for loop from the current x coordinate until the edge of the world (I don't know how you have the world implemented), scanning for any obstacles at position (x, x*tan(angle)), incrementing or decrementing in accordance with the step above
when you run across the first obstacle, return sqrt(x^2 + (x*tan(angle))^2) - that's just the Pythagorean theorem
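A simple variant of this stepping idea in Java marches along the ray with cos/sin directly, which also sidesteps the vertical-angle problem with tan; isObstacleAt and the parameters are placeholders for however the world is stored.

// Sketch: march along the ray in small steps from the robot's position
// and return the distance to the first blocked point.
static double distanceToObstacle(double rx, double ry, double angleDeg,
                                 double step, double maxRange) {
    double dx = Math.cos(Math.toRadians(angleDeg));
    double dy = Math.sin(Math.toRadians(angleDeg));
    for (double t = 0; t <= maxRange; t += step) {
        if (isObstacleAt(rx + t * dx, ry + t * dy)) {
            return t;        // distance traveled along the ray
        }
    }
    return maxRange;         // nothing hit within sensor range
}

static boolean isObstacleAt(double x, double y) {
    return false;            // placeholder: test against your rectangles/walls
}

Smaller steps give more accurate distances at the cost of more iterations.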
Here's what I think you could do.
In real game development, they use a lot of optimization tricks, often settling for approximations for better performance.
Also note that there are a lot of libraries out there for game development that could probably simplify what you want a lot.
But anyway, here's what I'd do:
identify the objects you'd pass through if you go straight forward.
identify the nearest one in the list of objects you just made.
1:
A)
make a formula for your position/angle in the form y = mx + b
[y = tan(angle)*x + (positionY - tan(angle)*positionX)]
B)
for each object, divide the object into multiple line segments (2 points each).
check if the segment crosses the line made by the formula in point A
(if one endpoint's y is smaller and the other's is greater than the formula's y at the same x, it's crossing)
do the same thing for your world boundaries
2: This part is trickier (to program).
Next, you have to find the positions where your robot's orientation formula intersects with all the lines you previously identified.
For each line, you must again turn the line into a y = mx + b.
Let's say we have:
y=3x+5 and
y=5x+1
3x+5 = 5x+1
3x-5x = 1-5
-2x = -4
x = 2
Then you replace x with 2 in either formula, you'll get the intersection point:
y = 3(2)+5 = 11
y = 5(2)+1 = 11
So these two lines intersect on point (2, 11)
Next you have to see if that point is in the domain of your robot's path formula.
Since your robot is looking in a single direction, and the formula we made in point 1.A is infinite in both directions, you must ensure the intersection you found is not behind your robot (unless it moves backward...).
I guess you can keep it simple: look at the sign of cos(angle), then at the position of the intersection point; if the point is to the left of your robot and cos(angle) is negative, it's fine.
Finally,
Once you've found ALL the intersection points, you can find the nearest one by using the Pythagorean theorem: sqrt((x1-x2)^2 + (y1-y2)^2).
Also, note that this won't work for angles of 90 and 270 degrees, since tan(90) doesn't exist.
In that case, just check whether the two endpoints of the segment lie on either side of your robot and the intersection is in the right direction; if so, you pass through it.
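If it helps, here is a sketch of the ray-versus-segment test in Java using vectors instead of y = mx + b, so vertical angles (90 and 270 degrees) need no special case and the "behind the robot" check becomes t >= 0; the names are mine.

// Sketch: distance along the ray (origin ox,oy, direction angle in
// radians) to the segment (x1,y1)-(x2,y2), or -1 on a miss.
static double rayHitsSegment(double ox, double oy, double angle,
                             double x1, double y1, double x2, double y2) {
    double dx = Math.cos(angle), dy = Math.sin(angle);   // unit ray direction
    double sx = x2 - x1, sy = y2 - y1;                   // segment direction
    double denom = dx * sy - dy * sx;                    // 2d cross product
    if (denom == 0) return -1;                           // parallel: no crossing
    // Solve origin + t*dir == segStart + u*segDir for t and u.
    double t = ((x1 - ox) * sy - (y1 - oy) * sx) / denom;
    double u = ((x1 - ox) * dy - (y1 - oy) * dx) / denom;
    if (t < 0 || u < 0 || u > 1) return -1;  // behind the robot, or off the segment
    return t;                                // dir is unit length, so t is the distance
}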
Again, there's a lot of room for optimization.

algorithm for visualizing gravity distortion (2D)

I'm working on an Android game and would like to implement a 2D grid to visualize the effects of gravity on the playing field. I'd like to distort the grid based on various objects on my playing field. The effect I'm looking for is similar to the following from the Processing library:
Except that my grid will be simpler- 2D, and viewed strictly from the top, as if looking down at the playfield.
Can someone point me to an algorithm for drawing such a grid?
The one idea that I came up with was to draw the lines as if they were "particles": start at one end of the screen and draw each line in multiple segments, treating each segment as a particle and calculating the effect of gravity at each segment's location.
The application is intended to run on Android.
Thanks
I would draw each line as separate segments, as you mentioned. If the grid is sparse, this might be fastest.
If you are viewing the grid from above, you need to calculate x and y coordinate displacements. The easiest way would be to actually do the displacement along the z axis and then fake perspective with x_result = x/z and y_result = y/z. Set z = 1 and make sure to vary it only relatively slightly (+-0.1, for instance).
Your z should be proportional to the sum of 1/(distance to each sphere)^2. This simulates how gravity works - it tapers off with the square of the distance. Great news: the square of the distance is just delta_x^2 + delta_y^2, so you save yourself the square root calculation, which is faster.
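A rough sketch of that in Java: for each grid vertex, sum the 1/d^2 contributions into a z near 1, clamp to the +-0.1 band, then divide x and y by z. The masses array layout and the constants are my own assumptions.

// Sketch: displace one grid vertex. Each mass is {mx, my, strength}.
static float[] displace(float x, float y, float[][] masses) {
    double z = 1.0;
    for (float[] m : masses) {
        double ddx = x - m[0], ddy = y - m[1];
        double d2 = ddx * ddx + ddy * ddy + 1e-6;   // squared distance, no sqrt needed
        z -= m[2] / d2;                             // gravity pull grows as 1/d^2
    }
    z = Math.max(0.9, Math.min(1.1, z));            // keep z within the +-0.1 band
    return new float[] { (float) (x / z), (float) (y / z) };  // fake perspective
}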
