Java 2D exploded pie chart segments

I'm trying to add some distance (e.g. 10 px) between a segment (arc) of the pie chart and its center, without success. Here's what I've tried so far:
int value = 20; // example
double arcAngle = (value * 360 / 100);
double angle = 360 - (arcAngle / 2); // direction to add the distance to (center of arc)
double newX = pieCenterX + Math.cos(angle * Math.PI / 180.0) * 10;
double newY = pieCenterY + Math.sin(angle * Math.PI / 180.0) * 10;
// then drawing the arc with new x and y
g.fill(new Arc2D.Double(newX, newY, bounds.getWidth(), bounds.getHeight(), startAngle, arcAngle, Arc2D.PIE));
Ideally I should end up with something like this:
I don't know much about how to approach this, so my code was taken from examples I found elsewhere.

Usually the zero angle points along the OX direction (to the right), so you have to correct by 90 degrees (if your coordinate system is counterclockwise):
double angle = 90 + 360 - (arcAngle / 2);
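Putting the correction together: the slice should be pushed outward along the bisector of its own arc, startAngle + arcAngle / 2, in Arc2D's convention. A minimal sketch (assumptions: angles in degrees, counterclockwise from the positive x axis, screen y growing downward; `explodeOffset` is a hypothetical helper name):

```java
public class ExplodedPie {
    // Offset (dx, dy) that pushes a slice outward along its bisector.
    // Arc2D angles are in degrees, counterclockwise from the positive
    // x axis; screen y grows downward, hence the negated sine.
    public static double[] explodeOffset(double startAngle, double arcAngle,
                                         double distance) {
        double mid = Math.toRadians(startAngle + arcAngle / 2.0);
        return new double[] { Math.cos(mid) * distance,
                              -Math.sin(mid) * distance };
    }
}
```

The slice would then be drawn with its whole bounding box shifted, e.g. `new Arc2D.Double(bounds.getX() + off[0], bounds.getY() + off[1], bounds.getWidth(), bounds.getHeight(), startAngle, arcAngle, Arc2D.PIE)`; shifting the bounds keeps the slice the same size.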

Related

How can I draw a projection line from the camera to a 3D area?

I need a line that is part of the user interface but always points to a specific place in 3D space.
To do this, I tried:
double camY, camX;
camX = cameraX * -1;
if (camX > 90)
    camX = 180 - camX;
if (camX < -90)
    camX = -180 - camX;
camY = cameraY;
double camY2 = camY;
if (camY > 90)
    camY2 = 180 - camY;
if (camY < -90)
    camY2 = -180 - camY;
double x1;
double y1;
double x2 = x * (90.0 - Math.abs(camY)) / 90.0,
       y2 = y * (90.0 - Math.abs(camY)) / 90.0;
if (vertical) {
    x1 = x2 + (y * (camY2 / 90.0) * ((90 - camX) / 90));
    y1 = (y2 * ((90 - camX) / 90)) - (x * (camY2 / 90.0));
} else {
    x1 = x2 + (y * (camY2 / 90.0));
    y1 = y2 - (x * (camY2 / 90.0));
    y1 = y1 * (camX / 90.0);
}
GL11.glVertex2d(x1, y1);
GL11.glVertex2d(toX, toY);
GL11.glVertex2d(toX, toY);
GL11.glVertex2d(max, toY);
Where `x` and `y` are the coordinates of a point in 3D space, `cameraX` and `cameraY` are the camera's rotation angles, and `toX` and `toY` are the destination point on the camera plane (user interface).
All of this code runs before the camera transform (
GL11.glOrtho(-max, max, -1, 1, 10, -10);
GL11.glRotatef(cameraX, 1f, 0f, 0f);
GL11.glRotatef(cameraY, 0f, 1f, 0f);
) and before GL11.glMatrixMode(GL11.GL_MODELVIEW);, so it ignores the Z coordinate.
Initially I had only the last 4 lines of code, but then, when the camera was rotated, the entire line moved behind it, so I added the rest of the calculations.
This partially solves my problem.
Initial position:
Camera rotation on the y axis:
Small line deviations are already visible.
Camera rotation on the x and y axis:
As you can see in the screenshots, the red line still shifts when the camera rotates.
How can I make it always stay at the point I need in space (the center of the red circle in the screenshots)?
You should:
1. Project the 3D point where you want the line to end to the screen manually. For that:
   - Get the model-view and projection matrices with glGetFloatv, using GL_MODELVIEW_MATRIX and GL_PROJECTION_MATRIX.
   - Multiply your 3D point by both matrices, perform the perspective division, and convert the result to your viewport coordinates.
2. Draw a 2D line from the UI location where you want it to begin to the projected 2D location where you want it to end.
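Those steps can be sketched in plain Java (a sketch only: `mul` and `project` are hypothetical helper names; the matrices are assumed to be the float[16] column-major arrays that glGetFloatv returns, and the viewport mapping follows what gluProject does):

```java
public class ManualProject {
    // m is a 4x4 matrix in OpenGL column-major order (as glGetFloatv
    // returns it); computes m * v for a homogeneous vector v.
    static float[] mul(float[] m, float[] v) {
        float[] r = new float[4];
        for (int row = 0; row < 4; row++) {
            r[row] = m[row]     * v[0] + m[row + 4]  * v[1]
                   + m[row + 8] * v[2] + m[row + 12] * v[3];
        }
        return r;
    }

    // Projects a world-space point to window coordinates: modelview,
    // projection, perspective division, then NDC-to-viewport mapping.
    public static float[] project(float x, float y, float z,
                                  float[] modelview, float[] projection,
                                  int vpX, int vpY, int vpW, int vpH) {
        float[] clip = mul(projection, mul(modelview, new float[] { x, y, z, 1f }));
        float invW = 1f / clip[3];           // perspective division
        float ndcX = clip[0] * invW;
        float ndcY = clip[1] * invW;
        return new float[] {                 // NDC [-1, 1] -> pixels
            vpX + (ndcX + 1f) * 0.5f * vpW,
            vpY + (ndcY + 1f) * 0.5f * vpH
        };
    }
}
```

The 2D line can then be drawn in the same ortho pass as the rest of the UI, from (toX, toY) to the projected point.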

Unable to properly calculate a normal/direction vector from pitch and yaw

My pitches and yaws are messed up.
I have the pitch and yaw of the pad, but the beam's pitch and yaw come out wrong.
How do I calculate the normal vector from the pad's pitch and yaw? I've tried a pile of math from Stack Overflow, but it has all failed so far.
What I first tried was adding 90 to the pad's pitch, but the yaw stayed wrong:
And this is what happens when I use the pad's pitch and yaw as-is to calculate the direction vector:
What I tried next was to split the beam's pitch and yaw from the pad's pitch and yaw and calculate them separately. That mostly worked, but the yaw was still completely wrong.
What I used to calculate the direction vector from the beam's yaw and pitch is a util Minecraft uses to do the same for mobs:
public static Vec3d getVectorForRotation3d(float pitch, float yaw) {
    float f = MathHelper.cos(-yaw * 0.017453292F - (float) Math.PI);
    float f1 = MathHelper.sin(-yaw * 0.017453292F - (float) Math.PI);
    float f2 = -MathHelper.cos(-pitch * 0.017453292F);
    float f3 = MathHelper.sin(-pitch * 0.017453292F);
    return new Vec3d((double) (f1 * f2), (double) f3, (double) (f * f2));
}
But that failed too, so lastly I tried the following using the pad's pitch:
double pitch = ((te.getPadPitch() + 90) * Math.PI) / 180;
double yaw = ((te.getPadYaw() + 90) * Math.PI) / 180;
double x = Math.sin(pitch) * Math.cos(yaw);
double y = Math.sin(pitch) * Math.sin(yaw);
double z = Math.cos(pitch);
Vec3d lookvec = new Vec3d(x, y, z);
And this worked perfectly for the yaw but failed for the pitch.
The pitch and yaw are both calculated the same way the player's head rotation is.
The pad's pitch and yaw are 100% correct when I use them on the pad itself, but they mess up when I use them on the beam. Both use GL functions.
Although the pitch and yaw don't respect the player's head's orientation system, it works for the pad.
For example, this is the yaw of the mirror in this pic, and it's perfect for its current value:
And the pad is rotated like this:
GlStateManager.rotate(te.getPadYaw(), 0, 0, 1);
GlStateManager.rotate(te.getPadPitch(), 1, 0, 0);
And the line is drawn as such:
public static void drawConnection(BlockPos pos1, BlockPos pos2, Color color) {
    GlStateManager.pushMatrix();
    GL11.glLineWidth(1);
    GlStateManager.disableTexture2D();
    GlStateManager.color(color.getRed(), color.getGreen(), color.getBlue(), 0.7f);
    GlStateManager.translate(0.5, 0.7, 0.5);
    VertexBuffer vb = Tessellator.getInstance().getBuffer();
    vb.begin(GL11.GL_LINES, DefaultVertexFormats.POSITION);
    vb.pos(pos2.getX() - pos1.getX(), pos2.getY() - pos1.getY(), pos2.getZ() - pos1.getZ()).endVertex();
    vb.pos(0, 0, 0).endVertex();
    Tessellator.getInstance().draw();
    GlStateManager.enableTexture2D();
    GlStateManager.popMatrix();
}
I'm getting pos1 and pos2 like so (this is my most recent attempt):
double pitch = ((te.getPadPitch() + 90) * Math.PI) / 180;
double yaw = ((te.getPadYaw() + 90) * Math.PI) / 180;
double x = Math.sin(pitch) * Math.cos(yaw);
double y = Math.sin(pitch) * Math.sin(yaw);
double z = Math.cos(pitch);
Vec3d lookvec = new Vec3d(x, y, z);
Vec3d centervec = new Vec3d(te.getPos().getX() + 0.5, te.getPos().getY() + 0.8, te.getPos().getZ() + 0.5);
Vec3d startvec = centervec.add(lookvec);
Vec3d end = startvec.add(new Vec3d(lookvec.xCoord * 30, lookvec.yCoord * 30, lookvec.zCoord * 30));
RayTraceResult result = te.getWorld().rayTraceBlocks(startvec, end, true, false, true);
Utils.drawConnection(te.getPos(), result.getBlockPos(), Color.RED);
How do I properly calculate the normal vector (a vector perpendicular to the pad) from the pad's pitch and yaw?
I'm at a loss at this point because I've tried nearly everything I found on Google with no luck.
EDIT: I've been told that splitting the beam's pitch and yaw from the pad's pitch and yaw shouldn't be necessary, and I agree, but I just can't get it to work otherwise. Why is the beam's drawing math different from the pad's?
I'm not sure if this is your problem, but for me MC returned some messed-up yaw values when I was dealing with them. This is how I fixed that:
if (Yaw < -180.0) Yaw += 360;
else if (Yaw > 180) Yaw -= 360;
Minecraft yaw rotations are a bit weird. They are not clamped between 0 and 360. For example, if you rotate your head to the right for a very long time, you can end up with rotations that are over 1000 degrees. When clamped between 0 and 360 degrees, 1000 degrees becomes 280 degrees. If the remainder is negative, you should add 360 to bring it back into range. This is an example of clamping an angle between 0 and 360:
public static float clamp(float angle) {
    angle = angle % 360; // remainder when dividing angle by 360
    if (angle < 0) {
        angle += 360; // negative remainders wrap back into [0, 360)
    }
    return angle;
}
If you set your angles with this, things will always be between 0 and 360. Also, this is a snippet directly from MCP source code where rotation yaw/pitch was translated into a rotation vector:
/**
 * Creates a Vec3 using the pitch and yaw of the entities rotation.
 */
protected final Vec3 getVectorForRotation(float pitch, float yaw)
{
    float f = MathHelper.cos(-yaw * 0.017453292F - (float) Math.PI);
    float f1 = MathHelper.sin(-yaw * 0.017453292F - (float) Math.PI);
    float f2 = -MathHelper.cos(-pitch * 0.017453292F);
    float f3 = MathHelper.sin(-pitch * 0.017453292F);
    return new Vec3(f1 * f2, f3, f * f2);
}
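On the EDIT's question of why the beam math differs from the pad math: the pad is positioned with rotate(yaw, 0, 0, 1) followed by rotate(pitch, 1, 0, 0), so a consistent beam direction can come from applying those same two rotations to the pad's rest normal, rather than re-deriving spherical coordinates. A hedged sketch, not from the answers above: it assumes the unrotated pad's normal is (0, 1, 0), and `padNormal` is a hypothetical helper; adjust the rest normal if yours differs.

```java
public class PadNormal {
    // Applies the pad's own draw rotations -- Rz(yaw) after Rx(pitch),
    // matching rotate(yaw, 0,0,1) then rotate(pitch, 1,0,0) -- to the
    // assumed rest normal (0, 1, 0).
    public static double[] padNormal(double pitchDeg, double yawDeg) {
        double p = Math.toRadians(pitchDeg), yw = Math.toRadians(yawDeg);
        // Rx(pitch) applied to (0, 1, 0):
        double x = 0, y = Math.cos(p), z = Math.sin(p);
        // Rz(yaw) applied to the result:
        return new double[] { x * Math.cos(yw) - y * Math.sin(yw),
                              x * Math.sin(yw) + y * Math.cos(yw),
                              z };
    }
}
```

The resulting vector can replace lookvec in the ray-trace snippet, so the beam and the pad always agree on orientation.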

Why isn't my method to rotate a point around another point working?

I have a method in my android app that looks like this:
// get point after rotation
public static PointF getRotatedPoint(PointF pt, PointF center, float degrees)
{
    double angleInRadians = degrees * (Math.PI / 180);
    pt.x = (float) (Math.cos(angleInRadians) * (pt.x - center.x) - Math.sin(angleInRadians) * (pt.y - center.y) + center.x);
    pt.y = (float) (Math.sin(angleInRadians) * (pt.x - center.x) + Math.cos(angleInRadians) * (pt.y - center.y) + center.y);
    return pt;
}
I have a rectangle that I rotate by 45 degrees. I can touch any point on the rotated rectangle and it gives me the touched point. I want to get the coordinates of that point as if the rectangle weren't rotated, so I pass -45 as the degrees argument. Here is how I call it:
getRotatedPoint(touchedPoint, centerOfRectangle,-45);
When I draw the point on the rectangle before it gets rotated, it gives me a result close to the position I touched on the rotated rectangle, but off by a fairly large amount.
Here is a picture to explain my problem:
I think this might be a problem with my math so any answers are greatly appreciated.
You are mixing initial and final values in the calculations. You re-assign pt.x:
pt.x = (float) (Math.cos(angleInRadians) * (pt.x-center.x) - Math.sin(angleInRadians) * (pt.y-center.y) + center.x);
which doesn't immediately pose any problems. But the calculation for pt.y relies on the original value of pt.x, not the rotated value:
pt.y = (float) (Math.sin(angleInRadians) * (pt.x-center.x) + Math.cos(angleInRadians) * (pt.y-center.y) + center.y);
So just use temporary variables to hold the initial values:
public static PointF getRotatedPoint(PointF pt, PointF center, float degrees)
{
    double x0 = pt.x;
    double y0 = pt.y;
    double angleInRadians = degrees * (Math.PI / 180);
    pt.x = (float) (Math.cos(angleInRadians) * (x0 - center.x) - Math.sin(angleInRadians) * (y0 - center.y) + center.x);
    pt.y = (float) (Math.sin(angleInRadians) * (x0 - center.x) + Math.cos(angleInRadians) * (y0 - center.y) + center.y);
    return pt;
}
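A quick way to sanity-check the fix: rotating a point by 45 degrees and then by -45 degrees should give back the original point. A sketch with a minimal PointF stand-in so it runs outside Android:

```java
public class RotateDemo {
    // Minimal stand-in for android.graphics.PointF.
    public static class PointF {
        public float x, y;
        public PointF(float x, float y) { this.x = x; this.y = y; }
    }

    // The corrected method: the original coordinates are captured
    // before either one is overwritten.
    public static PointF getRotatedPoint(PointF pt, PointF center, float degrees) {
        double x0 = pt.x, y0 = pt.y;
        double a = degrees * (Math.PI / 180);
        pt.x = (float) (Math.cos(a) * (x0 - center.x) - Math.sin(a) * (y0 - center.y) + center.x);
        pt.y = (float) (Math.sin(a) * (x0 - center.x) + Math.cos(a) * (y0 - center.y) + center.y);
        return pt;
    }
}
```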

Circle Arc Equation - Understanding speed?

I have a circle being drawn at a certain position. I can move it just fine with the speed set to 10f, but when it starts to circle it becomes extremely fast. It's obviously not moving in units/second, and I'm not sure what's going on. I thought that angleSpeed needed to be in radians or something; that slowed it down, but it's still not right.
Here's the Circle Arc Equation I'm basing off of:
s = r * theta
Here are the functions I'm using:
private void moveOut(double deltaTime)
{
    SetPosition(x += direction * speed * deltaTime, y, 0);
    if (x - (direction * GetWidth() / 2f) >= centerX + radius + GetWidth() / 2f)
    {
        //onOutside = true;
    }
    Log.d(TAG, "moving out");
}

private void circleCenter(double deltaTime)
{
    float angleSpeed = (float) (radius * (speed * Math.PI / 180) * deltaTime);
    currentAngle += angleSpeed;
    if (currentAngle >= 2 * Math.PI)
    {
        currentAngle = (float) (2 * Math.PI - currentAngle);
    }
    SetPosition(centerX + radius * FloatMath.cos(currentAngle), centerY + radius * FloatMath.sin(currentAngle), 0);
}
Your angleSpeed formula looks wrong.
I'd work it out by first asking: what distance do I travel in that time? The answer, as you already know, is speed * deltaTime. Now that you have a distance, you can work out the angle using the arc formula, arclength = radius * angle, so angle = arclength / radius.
Put these two together to get
angle = speed * deltaTime / radius
This will be in radians, of course.
Essentially this boils down to the fact that you were multiplying by the radius instead of dividing by it (looking at it in terms of units would have helped spot this, but that is outside the scope of a programming forum).
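Applied to circleCenter, the corrected step might look like this (a sketch: `advanceAngle` is a hypothetical helper, and speed is assumed to be in units per second):

```java
public class OrbitStep {
    // arclength = radius * angle, so the angle covered in deltaTime
    // seconds at `speed` units/second is speed * deltaTime / radius.
    public static double advanceAngle(double currentAngle, double speed,
                                      double radius, double deltaTime) {
        double angle = currentAngle + speed * deltaTime / radius;
        return angle % (2 * Math.PI); // wrap around, don't reflect
    }
}
```

Note that the original wrap-around, 2 * Math.PI - currentAngle, reflects the angle backwards; a plain modulo keeps the motion going the same way.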

rotate a matrix using different angles

I searched on the internet and saw lots of posts about how to rotate a matrix or an image by 90 or 180 degrees, but how can I rotate a matrix by 12 degrees, or 162 degrees?
From:
To:
This image is rotated with ~35 degrees.
As you can see, my matrix is the horse image, the circle is the rotation path, and the big rectangle is the new matrix created after the rotation.
How can I achieve this? Thanks!
PS: This does not work
int angle=35*Math.PI/180;
int x1 = (int)(x * cos(angle)) - (y * sin(angle));
int y1 = (int)(y * cos(angle)) + (x * sin(angle));
Maybe your code would work if you saved the x value before using it to compute y.
The angle should be in radians, not degrees: 35 * PI / 180.
You shouldn't compute with integers, since cos and sin return values in [-1, 1]; use floating point.
double angle = 35 * Math.PI / 180;
int x1 = (int) Math.round(x * Math.cos(angle) - y * Math.sin(angle));
int y1 = (int) Math.round(y * Math.cos(angle) + x * Math.sin(angle));
Note: casting is ugly.
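For the rotated-matrix question itself, mapping each source pixel forward like the PS snippet leaves holes in the destination; the usual fix is inverse mapping: for each destination cell, rotate back by -angle and sample the source. A hedged sketch (assumptions: nearest-neighbour sampling, and the result keeps the source's dimensions rather than enlarging to the bounding rectangle shown in the figure; `rotate` is a hypothetical helper):

```java
public class RotateMatrix {
    // Rotates src about its centre by `degrees` using inverse mapping:
    // each destination cell is filled by rotating its coordinates back
    // and sampling the nearest source cell, so the result has no holes.
    // Cells that map outside src stay 0.
    public static int[][] rotate(int[][] src, double degrees) {
        int h = src.length, w = src[0].length;
        double a = Math.toRadians(degrees);
        double cx = (w - 1) / 2.0, cy = (h - 1) / 2.0;
        int[][] dst = new int[h][w];
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                // inverse rotation of the destination coordinate
                double sx =  Math.cos(a) * (x - cx) + Math.sin(a) * (y - cy) + cx;
                double sy = -Math.sin(a) * (x - cx) + Math.cos(a) * (y - cy) + cy;
                int ix = (int) Math.round(sx);
                int iy = (int) Math.round(sy);
                if (ix >= 0 && ix < w && iy >= 0 && iy < h) {
                    dst[y][x] = src[iy][ix];
                }
            }
        }
        return dst;
    }
}
```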
