Java method Point rot90()

I am trying to figure out how to implement the following method in Java:
**Point rot90()** Query for a new Cartesian Point equivalent to this Cartesian point rotated by 90 degrees.
I have no idea how to go about creating this method. However, I believe that taking the point (x, y) and outputting the new point (y, x*-1) is equivalent to rotating by 90 degrees: the old y coordinate becomes the new x coordinate, and the new y coordinate is the old x coordinate multiplied by negative 1. Any thoughts on how to set up this method would be greatly appreciated. Thanks.
this is what I have so far
public Point rot90() {
    Point rotated = new Point();
    rotated.yCoord = xCoord * -1;
    rotated.xCoord = yCoord;
    return rotated;
}
I know this doesn't work, as pointed out when I try to compile it. Any suggestions?

Your method needs an argument.
public Point rot90(Point p) {
    Point rotated = new Point();
    rotated.yCoord = -p.xCoord;
    rotated.xCoord = p.yCoord;
    return rotated;
}
If your Point class has a constructor that can take coordinates then you can make it shorter.
public Point rot90(Point p) {
    return new Point(p.yCoord, -p.xCoord);
}
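Alternatively, if the assignment really calls for a no-argument instance method on Point itself (as the signature Point rot90() suggests), a minimal sketch, assuming Point has xCoord and yCoord fields and a two-argument constructor, would be:
public Point rot90() {
    // (x, y) -> (y, -x), matching the convention used in the question;
    // see the next answer for a discussion of the rotation's sign.
    return new Point(yCoord, -xCoord);
}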

What exactly is the problem? Code not working or the result not as expected?
In the complex plane interpretation of Euclidean geometry, rotating (x,y) by 90° is, by definition of the complex unit, the multiplication of x+i·y by i. Since
i·(x+i·y) = -y+i·x,
the rotated point has the coordinates (-y,x), and your code implements the rotation by 270°, i.e. -90°.
In general, rotation by an angle a amounts to the multiplication
(cos(a)+i·sin(a)) · (x+i·y)
Note that the screen coordinate system is a mirror image of the Cartesian plane: the y axis points down, so the angle orientation is reversed.
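Translated into code, that general formula is the standard 2D rotation. A minimal Java sketch, assuming a Point class with double coordinate fields and a two-argument constructor:
// Rotate (x, y) by angle a (radians), counter-clockwise in standard Cartesian coordinates:
//   x' = x*cos(a) - y*sin(a)
//   y' = x*sin(a) + y*cos(a)
public static Point rotate(Point p, double a) {
    double cos = Math.cos(a);
    double sin = Math.sin(a);
    return new Point(p.xCoord * cos - p.yCoord * sin,
                     p.xCoord * sin + p.yCoord * cos);
}
Passing a = Math.PI / 2 reproduces the (-y, x) result above.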

How to draw a circle on a map according to its real radius in kilometers (Using UnfoldingMaps and Processing in Java)

Earthquake threat circle on the map
I am using UnfoldingMaps to display earthquake information on the map.
I plan to show the threat circle on the map.
A circle is drawn given its radius and center position in pixels. How to get the radius is the problem I ran into.
Suppose I have the threat circle radius R in kilometers and the center marker A.
I want to create a marker B on the circle so that I can use the screen distance between A and B as the screen radius.
I decided to create B with the same longitude as A but a different latitude, converting R into a delta in latitude.
But after drawing the circle I found it is not the right one, since the red triangle should be inside the circle according to their distance.
The main difficulty is exactly how to calculate the screen radius from kilometers.
public void calcThreatCircleOnScreen(UnfoldingMap map) {
    float radius = 0;
    // Convert the radius in km to degrees of latitude:
    // deltaLat = R / (2 * pi * 6371 km) * 360
    float deltaLat = (float) (threatCircle() / 6371 / 2 / 3.1415927 * 360);
    Location centerLocation = this.getLocation();
    Location upperLocation = new Location(centerLocation);
    upperLocation.setLat(centerLocation.getLat() + deltaLat);
    SimplePointMarker upperMarker = new SimplePointMarker(upperLocation);
    ScreenPosition center = this.getScreenPosition(map);
    ScreenPosition upper = upperMarker.getScreenPosition(map);
    radius = Math.abs(upper.y - center.y);
    setThreatCircleOnScreen(radius);
}
This is going to depend on two things: the zoom level of the map, and the projection you're using.
You need to unproject kilometers to pixels, and you can probably figure out how to do that using google and the Unfolding API.
For example, I found a MercatorProjection class that contains a constructor that takes a zoom level, and methods for projecting and unprojecting points between world coordinates and pixel coordinates.
That's just a starting point, since I'm not sure what units those methods are taking, but hopefully this is a direction for you to take your googling and experimenting.
I'd recommend trying to get something working and posting an MCVE if you get stuck. Good luck.
Now I have the answer for this question. I hope it will be helpful for others.
Earthquake threat circle on the map
My early solution for calculating the radius in pixels from km is correct. I think it is a simple and powerful idea (independent of the projection API).
The only problem is that I should use the diameter rather than the radius when drawing the circle, i.e. draw with d = 2r like this:
float d = 2 * threatCircleRadius();
pg.noFill();
pg.ellipse(x, y, d, d);
I found another cleaner solution like below by consulting the author of UnfoldingMaps. (https://github.com/tillnagel/unfolding/issues/124)
My early solution first converts the distance to a delta in latitude, then creates a new location by changing the latitude.
The new solution uses the API GeoUtils.getDestinationLocation(sourceLocation, compassBearingDegree, distanceKm) to get the new location directly!
In addition, I don't need to create a new marker to find its screen position.
public void calcThreatCircleOnScreen(UnfoldingMap map) {
    float radius = 0;
    Location centerLocation = this.getLocation();
    Location upperLocation = GeoUtils.getDestinationLocation(centerLocation, 0, threatCircle());
    //SimplePointMarker upperMarker = new SimplePointMarker(upperLocation);
    ScreenPosition center = map.getScreenPosition(centerLocation);
    ScreenPosition upper = map.getScreenPosition(upperLocation);
    radius = PApplet.dist(center.x, center.y, upper.x, upper.y);
    setThreatCircleOnScreen(radius);
}
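For the drawing itself, the screen radius computed above still has to be doubled, since Processing's ellipse() takes a width and height (i.e. the diameter) in the default CENTER mode. A short sketch, where getThreatCircleOnScreen() is an assumed getter paired with setThreatCircleOnScreen(), and pg, x, y come from the marker's draw method:
float r = getThreatCircleOnScreen(); // screen radius in pixels
pg.noFill();
pg.ellipse(x, y, 2 * r, 2 * r);      // ellipse() expects the diameter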

Raytracer warps cube based on camera angle?

Essentially, there is some strange warping of the 3D cube being rendered by my raytracer, which gets worse as the camera moves up, even if the cube stays in the same location on the screen.
The code is at http://pastebin.com/HucgjRtx
Here is a picture of the output:
http://postimg.org/image/5rnfrlkej/
EDIT: Problem resolved; I was just calculating the angles for the vectors wrong. The best method I have found is creating a vector from your FOV (Z), the current pixel X, and the current pixel Y, then normalizing that vector.
It looks like you're calculating rays to cast based on Euler angles instead of the usual projection.
Typically a "3D" camera is modeled such that the camera is at a point with rays projecting through a grid spaced some distance from it... which is, incidentally, exactly like looking at a monitor placed some distance from your face and projecting a ray through each pixel of the monitor.
The calculations are conceptually simple in the fixed case, e.g.
double pixelSpacing = 0.005;
double screenDistance = 0.7;
for (int yIndex = -100; yIndex <= 100; yIndex++) {
    for (int xIndex = -100; xIndex <= 100; xIndex++) {
        Vector3 ray = new Vector3(
            xIndex * pixelSpacing,
            yIndex * pixelSpacing,
            screenDistance
        );
        ray = ray.normalize();
        // And 'ray' is now a vector with our ray direction
    }
}
You can use one of the usual techniques (e.g. 4x4 matrix multiplication) if you want to rotate this field of view.
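For instance, here is a minimal sketch of rotating each generated ray direction around the Y axis with a plain rotation matrix, written out component-wise (the same assumed Vector3 class with public x, y, z fields as above):
// Rotate a direction vector around the Y axis by 'yaw' radians, using the
// standard rotation matrix:
//   [  cos 0 sin ]
//   [   0  1  0  ]
//   [ -sin 0 cos ]
static Vector3 rotateAroundY(Vector3 v, double yaw) {
    double cos = Math.cos(yaw);
    double sin = Math.sin(yaw);
    return new Vector3(
         v.x * cos + v.z * sin,
         v.y,
        -v.x * sin + v.z * cos
    );
}
Applying one rigid rotation like this to every ray keeps the shape of the ray grid intact, which is what avoids the warping.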

Writing a Raytracer, and perspective viewing system not displaying properly?

In a ray tracer I'm writing, I've just tried to implement a proper perspective viewing system; however, something seems to be wrong with it and I can't figure out what is happening.
For example, with a scene of three spheres (green, red and blue at (-150.0, 0.0, 0.0), (0.0, 0.0, 0.0) and (150.0, 0.0, 0.0) respectively) and a camera at (0.0, 0.0, -600.0) pointed at the centre sphere, I get this image:
Which seems about right.
If I move the camera to (0.0, 600.0,-600.0), still pointed at the centre sphere, I would expect to get a similar image since I haven't moved left or right. However this is what is rendered:
Which doesn't make any sense to me.
This is my perspective projector class:
public class PerspectiveProjector extends Projector {

    public Point3D camera;
    public Point3D sceneLocation;
    public double distance;
    public Vector3D u, v, w;

    public PerspectiveProjector(Point3D camera, Point3D sceneLocation, double FOV) {
        this.camera = new Point3D(camera);
        this.sceneLocation = new Point3D(sceneLocation);
        this.distance = RayTracer.world.viewPlane.height / 2 / Math.tan(Math.toRadians(FOV));
        uvw();
    }

    private void uvw() {
        w = camera.subtractVector(sceneLocation);
        w.normalise();
        //prob
        u = new Vector3D(0.00424, 1.0, 0.00764).cross(w);
        u.normalise();
        v = w.cross(u);
        v.normalise();
    }

    public Ray createRay(Point2D point) {
        Ray ray = new Ray(new Point3D(camera), u.multiply(point.x).add(v.multiply(point.y).subtract(w.multiply(distance))));
        ray.direction.normalise();
        return ray;
    }
}
If you would like to see any more of my code please let me know.
I don't really understand what you're doing to get your u value, so here's how I would solve that problem.
In order for the X coordinate of the image to be horizontal, the u vector needs to lie in the XY plane, so its z coordinate must be zero. To keep it perpendicular to the camera direction, which in your code is w, their dot product must be zero. A vector like this must have the coordinates:
u = Vector3D(w.y,-w.x,0) or
u = Vector3D(-w.y,w.x,0)
This works because the dot product of the two vectors is algebraically zero:
(x,y,z)·(y,-x,0) = xy - yx + 0z = 0; therefore, (y,-x,0) is a vector in the xy plane perpendicular to (x,y,z).
Replace that one line and tell me if it works.
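For reference, a minimal sketch of the uvw() method from the question with that one line replaced, assuming the same Vector3D constructor and its normalise()/cross() methods:
private void uvw() {
    w = camera.subtractVector(sceneLocation);
    w.normalise();
    // Suggested replacement: u lies in the XY plane (z = 0), and its dot product
    // with w is w.x*w.y - w.y*w.x = 0, so it is perpendicular to w.
    // (If w.x and w.y are both zero this is the zero vector, so that case would
    // need a fallback axis.)
    u = new Vector3D(w.y, -w.x, 0);
    u.normalise();
    v = w.cross(u);
    v.normalise();
}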

How to rotate points without moving them?

I need to rotate a triangle so that it lies on a plane given by a normal n and a constant d.
I have the normal n1 of the plane that the two triangles lie in. Now I need to rotate the right red triangle so that it results in the orange one.
The points of the triangles and the normals are stored as 3-dimensional vectors.
Until now, I did the following:
Get the normalized rotation quaternion (rotQuat) between n1 and n2.
Multiply every point of the triangle by the quaternion. To do this, I convert the point to a quaternion (point.x, point.y, point.z, 0) and do the multiplication as follows: resultQuat = rotQuat * point * conjugate(rotQuat). I then apply the x, y and z of the result to the point.
This is how I get the rotation between two vectors:
public static Quaternion getRotationBetweenTwoVector3f(Vector3f vec1, Vector3f vec2)
{
    Vector3f cross = Vector3f.cross(vec1, vec2);
    float w = (float) (java.lang.Math.sqrt(java.lang.Math.pow(vec1.getLength(), 2) * java.lang.Math.pow(vec2.getLength(), 2)) + Vector3f.dot(vec1, vec2));
    Quaternion returnQuat = new Quaternion(cross.x, cross.y, cross.z, w);
    returnQuat.normalize();
    return returnQuat;
}
The problem is that the triangle has the correct orientation after the rotation, but it also changes position. I need a rotation that rotates the triangle so that it is still connected to the two points of the left red triangle (like the orange one).
How is this possible?
Your problem is that rotation matrices/quaternions rotate points around an axis that passes through the origin. To rotate around a point other than the origin, you need to translate the triangle points to the origin (just subtract the rotation point from the triangle points), then multiply by the quaternion, and then translate back.
So the algorithm becomes (see the sketch below):
translatedPoints[i] = triPoints[i] - rotationPoint;
rotate translatedPoints using the quaternion;
translate translatedPoints back by adding the rotation point value.
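A minimal sketch of that pivot-based rotation in Java, assuming the Vector3f class from the question has public x, y, z fields and a three-argument constructor; rotateByQuaternion() below is a hypothetical stand-in for the conjugation step (rotQuat * p * conjugate(rotQuat)) the question already performs:
// Rotate the triangle's points around 'rotationPoint' instead of around the origin.
static Vector3f[] rotateAroundPoint(Vector3f[] triPoints, Vector3f rotationPoint, Quaternion rotQuat) {
    Vector3f[] result = new Vector3f[triPoints.length];
    for (int i = 0; i < triPoints.length; i++) {
        // 1. Translate so the pivot sits at the origin.
        Vector3f p = new Vector3f(
            triPoints[i].x - rotationPoint.x,
            triPoints[i].y - rotationPoint.y,
            triPoints[i].z - rotationPoint.z);
        // 2. Rotate (stand-in for your rotQuat * p * conjugate(rotQuat) step).
        Vector3f rotated = rotateByQuaternion(p, rotQuat);
        // 3. Translate back by adding the pivot.
        result[i] = new Vector3f(
            rotated.x + rotationPoint.x,
            rotated.y + rotationPoint.y,
            rotated.z + rotationPoint.z);
    }
    return result;
}
A natural pivot here is one of the two shared edge points (or the midpoint of the shared edge), since that is the part of the triangle that must stay in place.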

Getting bullet X to Y movement ratio from 2 points

I'm making pretty simple game. You have a sprite onscreen with a gun, and he shoots a bullet in the direction the mouse is pointing. The method I'm using to do this is to find the X to Y ratio based on 2 points (the center of the sprite, and the mouse position). The X to Y ratio is essentially "for every time the X changes by 1, the Y changes by __".
This is my method so far:
public static Vector2f getSimplifiedSlope(Vector2f v1, Vector2f v2) {
    float x = v2.x - v1.x;
    float y = v2.y - v1.y;
    // find the reciprocal of X
    float invert = 1.0f / x;
    x *= invert;
    y *= invert;
    return new Vector2f(x, y);
}
This Vector2f is then passed to the bullet, which moves that amount each frame.
But it isn't working. When my mouse is directly above or below the sprite, the bullets move very fast. When the mouse is to the right of the sprite, they move very slowly. And if the mouse is on the left side, the bullets shoot out the right side all the same.
When I remove the invert variable from the mix, it seems to work fine. So here are my 2 questions:
Am I way off-track, and there's a simpler, cleaner, more widely used, etc. way to do this?
If I'm on the right track, how do I "normalize" the vector so that it stays the same regardless of how far away the mouse is from the sprite?
Thanks in advance.
Use vectors to your advantage. I don't know if Java's Vector2f class has this method, but here's how I'd do it:
return (v2 - v1).normalize(); // `v2` is the mouse pos and `v1` is the sprite pos
To normalize a vector, just divide it (i.e. each component) by the magnitude of the entire vector:
Vector2f result = new Vector2f(v2.x - v1.x, v2.y - v1.y);
float length = (float) Math.sqrt(result.x * result.x + result.y * result.y);
return new Vector2f(result.x / length, result.y / length);
The result is a unit vector (its magnitude is 1). To adjust the speed, just scale the vector.
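For example, a short sketch of using the normalized direction each frame; bulletSpeed, bullet.x and bullet.y are assumed names here, and getDirection() stands in for the normalization shown above:
// Computed once, when the bullet is fired:
Vector2f direction = getDirection(spriteCenter, mousePos); // unit vector toward the mouse

// Applied every frame:
float bulletSpeed = 8.0f;              // pixels per frame, tune to taste
bullet.x += direction.x * bulletSpeed;
bullet.y += direction.y * bulletSpeed;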
Yes for both questions:
to find what you call the ratio, you can use the arctan function, which gives the angle of the vector that goes from the first object to the second object
to normalize it: since you are now starting from an angle, you don't need to do anything; you can directly use polar coordinates
The code is rather simple:
float magnitude = 3.0f; // your max speed
float angle = (float) Math.atan2(y, x);
Vector2D vector = new Vector2D(
    (float) (magnitude * Math.cos(angle)),
    (float) (magnitude * Math.sin(angle)));
