3D rotation ratios via mouse placement on screen - Java

I am making a 3D game in which the player can rotate their viewpoint with the mouse to look around the environment. At first I mapped vertical and horizontal mouse movement to x and y rotation, with z rotation on a separate control, but after playing the game I realised it did not rotate correctly. NOTE: I have a global 3x1 matrix which represents the player's angle. At 0,0,0 it seems to work correctly: up or down is a pure x-axis rotation and right or left is a pure y-axis rotation. But if I move my camera diagonally, for example, then left no longer corresponds directly to a y-axis rotation.
Visually, on a unit circle, the player's viewpoint would no longer travel the full circumference; it would travel in a circle smaller than the circumference. This is the current code (xRateOfRot and yRateOfRot are the ratios of how far the cursor is from the centre in each direction, between -1 and 1):
private static void changeRotation() {
    // Add each cursor ratio, scaled by ROTATION_SPEED, onto the player's
    // current 3x1 angle matrix.
    angle.set(Matrix.add(angle.matrix, new double[][] {
            { ROTATION_SPEED * camera.xRateOfRot() },
            { ROTATION_SPEED * camera.yRateOfRot() },
            { ROTATION_SPEED * camera.zRateOfRot() } }));
}
I have looked at this source http://paulbourke.net/geometry/rotate/ and understand how to rotate about an arbitrary axis, which I could do, but I am not sure how to turn that into the ratios of x, y and z change for looking in a specific direction. For example, at 0,0,0 the ratio for looking up would be x:1, y:0, z:0, but at another angle the ratios would be different, since looking up no longer means a pure x rotation. Any information would be appreciated, thanks!
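For reference, the arbitrary-axis rotation the Paul Bourke page describes can be written directly from Rodrigues' rotation formula. Below is a minimal plain-Java sketch (rotateAboutAxis is my name, not part of the question's Matrix class):

// Rodrigues' rotation formula: rotate vector v about unit axis k by theta.
// v' = v cos(theta) + (k x v) sin(theta) + k (k . v)(1 - cos(theta))
private static double[] rotateAboutAxis(double[] v, double[] k, double theta) {
    double c = Math.cos(theta), s = Math.sin(theta);
    double kDotV = k[0] * v[0] + k[1] * v[1] + k[2] * v[2];
    double[] kCrossV = {
            k[1] * v[2] - k[2] * v[1],
            k[2] * v[0] - k[0] * v[2],
            k[0] * v[1] - k[1] * v[0]
    };
    return new double[] {
            v[0] * c + kCrossV[0] * s + k[0] * kDotV * (1 - c),
            v[1] * c + kCrossV[1] * s + k[1] * kDotV * (1 - c),
            v[2] * c + kCrossV[2] * s + k[2] * kDotV * (1 - c)
    };
}

One way to make "up" always mean pitch, regardless of where the camera points, is to stop accumulating x/y/z angles altogether: keep the camera's forward, up and right vectors, and each frame rotate them about the current up axis by ROTATION_SPEED * xRateOfRot() and about the current right axis by ROTATION_SPEED * yRateOfRot(). The ratios then never need to be computed explicitly.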

Related

Problem with rotation around the center OpenGL

I have a problem with rotation around the center of an object.
I have a plane and I have found its center x,y; the z translation is used for moving.
I want to rotate the plane around its own center, not around 0,0,0, but it keeps rotating around the origin.
Can you give me some advice on how to get it to work?
glRotatef(moveLeftRight, 1,0,0);
glTranslatef(xT,yT, thrustP);
xT and yT are the center coordinates, thrustP is the Z translation (the movement), and moveLeftRight is the rotation angle.
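A common cause of this is call order: in fixed-function OpenGL the transform issued closest to the draw call is applied to the vertices first, so rotating before translating (in code order) makes the object orbit the world origin instead of spinning in place. A sketch of the usual fix, using LWJGL-style GL11 bindings (drawPlane() is a hypothetical placeholder for the actual geometry):

import static org.lwjgl.opengl.GL11.*;

private void drawPlaneAroundItsCenter(float xT, float yT, float thrustP, float moveLeftRight) {
    glPushMatrix();
    glTranslatef(xT, yT, thrustP);        // issued first, applied last: place the plane
    glRotatef(moveLeftRight, 1f, 0f, 0f); // issued last, applied first: spin about the local origin
    drawPlane();                          // hypothetical placeholder
    glPopMatrix();
}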

How to map 3D coordinates to 2D plane

I'm writing an OpenGL application in which there is a rectangle placed on a 3D object. The 3D object can move around and rotate, and the rectangle follows these movements.
What I would like to do is to point with the mouse towards the rectangle so that a dot would appear in that point on the rectangle, and I want the point to follow it as the 3D object that "holds" the rectangle moves around.
I know how to find the intersection on a plane, and I know how to find the world coordinates of the contact point. What I need is a way to convert the world coordinates to the 2D local coordinate system of the rectangle.
For example, suppose I have the plane positioned like this in the 3D world, with a given orientation (sorry, I can't really draw properly):
The black point in the center is the origin of the plane, while the blue point is the one I would like to find. The numbers near the points are their world coordinates; in this example the Z axis "comes out" of the screen.
I would like to map the coordinates of the blue point in the local coordinate system of the plane, like this:
I know that somehow this shouldn't be hard, but I can't find a way at all. Any hints would be appreciated!
You probably already use a transformation matrix to render the rectangle. If you have the 3D position of the point, just transform it with the inverse transformation matrix and you get a 3D position in the rectangle's space. If the local system is aligned with the rectangle, one of the coordinates should always be zero and you can drop it.
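If the rectangle's transform is rigid (a rotation R plus a translation t, no scaling), the inverse needs no general matrix inversion: it is just Rᵀ(p − t). A minimal plain-Java sketch, assuming R is stored row-major:

// Map a world-space point p into the rectangle's local frame.
// R is the rectangle's 3x3 rotation (row-major), t its world position.
private static double[] worldToLocal(double[][] R, double[] t, double[] p) {
    double[] d = { p[0] - t[0], p[1] - t[1], p[2] - t[2] };
    // Multiply by R transpose: each row of R transpose is a column of R.
    return new double[] {
            R[0][0] * d[0] + R[1][0] * d[1] + R[2][0] * d[2],
            R[0][1] * d[0] + R[1][1] * d[1] + R[2][1] * d[2],
            R[0][2] * d[0] + R[1][2] * d[1] + R[2][2] * d[2]
    };
}

As the answer notes, if the rectangle lies in its local x-y plane, the returned z component will be (near) zero for any contact point on the rectangle and can be dropped.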
I found what I needed in this post here.
Basically the solution for me is to project the point onto the x and y axis of the local coordinate system of the rectangle.
In pseudo code it's like this:
// Compute the world-space direction from the rectangle's origin towards its x axis and normalise it; same thing for the y axis
var xDir = Norm(mRectangle.LocalToWorld([1, 0, 0]) - mRectangle.LocalToWorld([0, 0, 0]));
var yDir = Norm(mRectangle.LocalToWorld([0, 1, 0]) - mRectangle.LocalToWorld([0, 0, 0]));
// Project the vector from the rectangle's origin to the contact point onto each axis with a dot product
mLocalPosition.x = Dot(xDir, (contactPoint - mRectangle.LocalToWorld([0, 0, 0])));
mLocalPosition.y = Dot(yDir, (contactPoint - mRectangle.LocalToWorld([0, 0, 0])));
// The point's position can now be set by converting the local coordinates back to world coordinates
mPoint.SetPosition(mRectangle.LocalToWorld([mLocalPosition.x, mLocalPosition.y, 0]));
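Here is a plain-Java transcription of that pseudocode, as a sketch; the three world-space inputs would come from whatever equivalent of LocalToWorld your engine provides:

private static double[] sub(double[] a, double[] b) {
    return new double[] { a[0] - b[0], a[1] - b[1], a[2] - b[2] };
}

private static double dot(double[] a, double[] b) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

private static double[] normalize(double[] a) {
    double len = Math.sqrt(dot(a, a));
    return new double[] { a[0] / len, a[1] / len, a[2] / len };
}

// origin = LocalToWorld([0,0,0]), xAxis = LocalToWorld([1,0,0]),
// yAxis = LocalToWorld([0,1,0]); returns the contact point's local (x, y).
private static double[] contactToLocal(double[] contact, double[] origin,
                                       double[] xAxis, double[] yAxis) {
    double[] xDir = normalize(sub(xAxis, origin));
    double[] yDir = normalize(sub(yAxis, origin));
    double[] rel = sub(contact, origin);
    return new double[] { dot(xDir, rel), dot(yDir, rel) };
}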

Calculating visible radius in Google Maps on Android

I'm trying to figure out how to calculate the largest radius of the screen's visible region.
(Before you say "Hey! Rectangles don't have a radius!": I understand that the screen is rectangular (if the camera is positioned at 90 degrees and not tilted). I'm talking about the radius of an imaginary circle that wraps the screen, or, equivalently, the distance from the center of the screen to one of the corners of the rectangle.)
So, I understand that in a normal situation, when the screen is not tilted, I could take the distance between VisibleRegion.latLngBounds.southwest and the center of the screen to find the radius, but when the screen is tilted, the visible region becomes a trapezoid.
Now, if the visible region is an isosceles trapezoid, then I could take VisibleRegion.farLeft, for example (whose distance from the center would equal farRight's), and calculate the distance this way:
VisibleRegion vr = map.getProjection().getVisibleRegion();
Location center = new Location("center");
center.setLatitude(screenCenter.latitude);
center.setLongitude(screenCenter.longitude);
Location farVisiblePoint = new Location("farPoint");
farVisiblePoint.setLatitude(vr.farLeft.latitude);
farVisiblePoint.setLongitude(vr.farLeft.longitude);
float radius = center.distanceTo(farVisiblePoint);
My question is:
Is there any situation with the map where the calculation above will be wrong? Can the trapezoid be non-isosceles?
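Whether the trapezoid can ever be non-isosceles depends on the camera model, but a defensive variant that costs almost nothing is to take the maximum centre-to-corner distance over all four VisibleRegion corners instead of assuming symmetry. A sketch along the lines of the code above (toLocation is a helper of mine, not part of the Maps API):

// Small helper (not part of the Maps API): wrap a LatLng in a Location
// so that Location.distanceTo can be used.
private static Location toLocation(LatLng point) {
    Location location = new Location("corner");
    location.setLatitude(point.latitude);
    location.setLongitude(point.longitude);
    return location;
}

private float maxVisibleRadius(GoogleMap map, LatLng screenCenter) {
    VisibleRegion vr = map.getProjection().getVisibleRegion();
    Location center = toLocation(screenCenter);
    float radius = 0f;
    for (LatLng corner : new LatLng[] { vr.farLeft, vr.farRight, vr.nearLeft, vr.nearRight }) {
        radius = Math.max(radius, center.distanceTo(toLocation(corner)));
    }
    return radius;
}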

Position Vector relative to origin

I have a little tech game I am messing around with, and I can't figure out the formula to position one object relative to another object's origin.
So I have a Spaceship and a Cannon. I have the game setup to use units, so 1 unit = 16 pixels (pixel art).
Basically my cannon should be placed 0.5625 units on the X and 0 on the Y relative to the origin of the Spaceship, which is located at 0, 0 (bottom left corner).
The cannon is independent of the angle of the spaceship; it can aim in different directions rather than being fixed to point the way the spaceship is facing.
I have it constantly following the cursor, which works fine. Now when I rotate the Spaceship, obviously the origin of the Spaceship is changing in world coordinates, so my formula to place the cannon is all messed up, like so:
protected Vector2 weaponMount = new Vector2();
weaponMount.set(getBody().getPosition().x + 0.5625f,
        getBody().getPosition().y);
Obviously if I position the ship at a 90° angle, X is going to be different and the cannon would be waaaayyy off the ship. Here is a screenshot example of what I mean:
What would be the formula for this? I have tried using cos/sin but that does not work.
Any ideas?
weaponMount.set(0.5625f,0).setAngle(SpaceshipAngle).add(getBody().getPosition());
Where SpaceshipAngle is the angle of your Spaceship.
The origin of the spaceship is the point around which the spaceship (its Texture) will rotate and scale. The position, instead, is always the lower-left corner of the Texture and does not depend on the rotation.
Your problem is that your offset does not depend on the rotation of your spaceship.
To take care of this rotation you should store a Vector2 offset, which describes your weapon's offset (in your case it is a Vector2(0.5625f, 0)).
Next, store a float angle describing your spaceship's rotation.
Then you can rotate the offset using: offset.setAngle(rotation).
The last thing is to set the weapons position. The code for this did not change so much:
weaponMount.set(getBody().getPosition().x + offset.x,
        getBody().getPosition().y + offset.y);
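Putting the answer together as a small libGDX-style sketch (Vector2.setAngle interprets its argument in degrees; the 0.5625f offset is the one from the question):

import com.badlogic.gdx.math.Vector2;

class CannonMount {
    // Weapon offset in ship-local units, from the question.
    private final Vector2 offset = new Vector2(0.5625f, 0f);
    private final Vector2 weaponMount = new Vector2();

    // Rotate the local offset by the ship's angle (in degrees),
    // then translate it to the ship's world position.
    Vector2 update(Vector2 shipPosition, float shipAngleDegrees) {
        return weaponMount.set(offset).setAngle(shipAngleDegrees).add(shipPosition);
    }
}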

'adding' two angles

Alright, so I have two angles. One is the joystick's angle, and the other is the camera-to-player angle (the camera's angle). Now I want it so that when I press up on the joystick, the player moves away from the camera. How would I do this? And is there an easy way to do it in Java or Ardor3d?
edit: Here is the code showing how I get my angles.
float camDegree = (float) Math.toDegrees(Math.atan2(
        _canvas.getCanvasRenderer().getCamera().getLocation().getXf()
                - colladaNode.getTranslation().getXf(),
        _canvas.getCanvasRenderer().getCamera().getLocation().getYf()
                - colladaNode.getTranslation().getYf()));
player.angle = (float) Math.toDegrees(Math.atan2(padX, padY));
Quaternion camQ = new Quaternion().fromAngleAxis(camDegree, Vector3.UNIT_Y);
I have to say that I don't really understand your question, but it seems to be about how to implement camera-relative control using a joystick.
The most important piece of advice I can give you is that it's better not to compute angles, but to work directly with vectors.
Suppose that the camera is looking in the direction v (in some types of game this vector will be pointing directly at the player, but not all types of game, and not always):
Typically you don't care about the vertical component of this vector, so remove it to get the horizontal component, which I'll call y for reasons that will become apparent later:
y = v − (v · up) up
where up is a unit vector pointing vertically upwards.
We can find the horizontal vector that's perpendicular to y using the cross product (and remembering the right hand rule):
x = v × up
Now you can see that y is a vector in the plane pointing forwards (away from the camera), and x is a vector in the plane pointing right (sideways with respect to the camera). If you normalise these vectors:
x̂ = x / |x|
ŷ = y / |y|
then you can use x̂ and ŷ as the coordinate basis for camera-relative motion of the player. If your joystick readings are Jx and Jy, then move the player by
s (Jx x̂ + Jy ŷ)
where s is an appropriate scalar value proportional to the player's speed.
(Notice that no angles were computed at any point in this answer!)
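For completeness, a direct Java transcription of the recipe above, sketched with plain arrays rather than Ardor3d's vector types:

// Returns the camera-relative movement s * (Jx * xHat + Jy * yHat), given the
// camera direction v, the world up vector, joystick readings jx/jy and speed s.
private static double[] cameraRelativeMove(double[] v, double[] up,
                                           double jx, double jy, double s) {
    // y = v - (v . up) up : horizontal forward component of the view direction
    double vDotUp = v[0] * up[0] + v[1] * up[1] + v[2] * up[2];
    double[] y = { v[0] - vDotUp * up[0], v[1] - vDotUp * up[1], v[2] - vDotUp * up[2] };
    // x = v x up : horizontal vector pointing right (right-hand rule)
    double[] x = {
            v[1] * up[2] - v[2] * up[1],
            v[2] * up[0] - v[0] * up[2],
            v[0] * up[1] - v[1] * up[0]
    };
    double xLen = Math.sqrt(x[0] * x[0] + x[1] * x[1] + x[2] * x[2]);
    double yLen = Math.sqrt(y[0] * y[0] + y[1] * y[1] + y[2] * y[2]);
    return new double[] {
            s * (jx * x[0] / xLen + jy * y[0] / yLen),
            s * (jx * x[1] / xLen + jy * y[1] / yLen),
            s * (jx * x[2] / xLen + jy * y[2] / yLen)
    };
}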
