LWJGL: Get mouse coordinates on a given plane (Java)

So far, I have the code to get the mouse's position in the world, but I need to get it on a given plane (z = 0). How can I go about doing this? Here is my picking code:
static public Vector3f getMousePositionIn3dCoords(int mouseX, int mouseY) {
    viewport.clear();
    modelview.clear();
    projection.clear();
    winZ.clear();
    position.clear();
    float winX, winY;
    GL11.glGetFloat(GL11.GL_MODELVIEW_MATRIX, modelview);
    GL11.glGetFloat(GL11.GL_PROJECTION_MATRIX, projection);
    GL11.glGetInteger(GL11.GL_VIEWPORT, viewport);
    winX = (float) mouseX;
    winY = (float) mouseY;
    // Read the depth buffer under the cursor, so the unprojected point
    // lands on whatever geometry the mouse happens to be over.
    GL11.glReadPixels(mouseX, mouseY, 1, 1, GL11.GL_DEPTH_COMPONENT, GL11.GL_FLOAT, winZ);
    float zz = winZ.get();
    GLU.gluUnProject(winX, winY, zz, modelview, projection, viewport, position);
    return new Vector3f(position.get(0), position.get(1), position.get(2));
}

I've solved it. For anyone interested, here is my modified code for getting the x and y in the world when z = 0:
static public Vector3f getMouseOnPlaneZ(int mouseX, int mouseY) {
    viewport.clear();
    modelview.clear();
    projection.clear();
    winZ.clear();
    position.clear();
    float winX, winY;
    GL11.glGetFloat(GL11.GL_MODELVIEW_MATRIX, modelview);
    GL11.glGetFloat(GL11.GL_PROJECTION_MATRIX, projection);
    GL11.glGetInteger(GL11.GL_VIEWPORT, viewport);
    winX = (float) mouseX;
    winY = (float) mouseY;
    // Unproject the cursor at the near plane (winZ = 0) and the far plane
    // (winZ = 1); together the two points define the pick ray.
    GLU.gluUnProject(winX, winY, 0.0f, modelview, projection, viewport, position);
    GLU.gluUnProject(winX, winY, 1.0f, modelview, projection, viewport, position1);
    float zeropoint, zeroperc;
    double posXt, posYt, posZt;
    posXt = position.get(0) - position1.get(0);
    posYt = position.get(1) - position1.get(1);
    posZt = position.get(2) - position1.get(2);
    // If both ends of the ray lie on the same side of z = 0, there is no intersection.
    if ((position.get(2) < 0.0 && position1.get(2) < 0.0) || (position.get(2) > 0.0 && position1.get(2) > 0.0))
        return null;
    zeropoint = 0.0f - (float) position.get(2);
    // Find the fraction of the way along the ray at which z = 0
    zeroperc = zeropoint / (float) posZt;
    return new Vector3f(
            (float) position.get(0) + (float) (posXt * zeroperc),
            (float) position.get(1) + (float) (posYt * zeroperc),
            (float) position.get(2) + (float) (posZt * zeroperc));
}
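
The same two unprojected points define the whole pick ray, so the intersection generalizes to any plane z = planeZ with the same interpolation. A minimal sketch, assuming the same static buffers as above (the planeZ parameter and method name are mine, not from the original code):

// Sketch: intersect the pick ray with an arbitrary plane z = planeZ.
// Assumes position/position1 hold the near- and far-plane points from
// the two gluUnProject calls above.
static Vector3f intersectPlaneZ(float planeZ) {
    float nearZ = position.get(2);
    float farZ = position1.get(2);
    // No intersection if both ends of the ray are on the same side of the plane.
    if ((nearZ - planeZ) * (farZ - planeZ) > 0.0f)
        return null;
    float t = (planeZ - nearZ) / (farZ - nearZ); // fraction of the way from near to far
    return new Vector3f(
            position.get(0) + t * (position1.get(0) - position.get(0)),
            position.get(1) + t * (position1.get(1) - position.get(1)),
            planeZ);
}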

Related

How do I rotate objects using mouse input in LibGDX?

I am implementing 3D modeling with LibGDX, and I want to manually rotate objects with the mouse, but I just can't find any tutorials or examples that get it right.
EDIT: Originally I only asked this question regarding rotating models, but I've discovered the same problem exists when rotating the camera.
The source code of a fully functional demo is available on github.com.
Here's a snapshot of rotating the view and a model (image omitted).
I want the object to rotate in the direction it is dragged by the mouse, no matter which direction it happens to be orientated at the time. As it is now, when I first drag the mouse to the right, the object rotates to the right about the screen Y axis as expected; but then when I drag the mouse upward I want the object to rotate upward about the screen X axis, but instead it spins to the left about the screen Z axis. Think of it like a floating ball in a bowl of water - whichever way you swipe at it, it rotates in that direction.
It seems to me that the mouse movement is transforming the objects directly in their local coordinate system; but instead I think it needs to transform the axis of rotation itself from the Screen Coordinate System into the Object Coordinate System before applying it to the object. I don't know, but it may be even more complicated than that.
I would really appreciate any insight or help to resolve this; I'm running out of hair to pull out... Thanks in advance.
LibGDX is built on OpenGL, so OpenGL terminology helps explain what LibGDX does behind the scenes. (WebGL is another technology that implements OpenGL; WebGL uses JavaScript, while LibGDX uses Java.) Once we know how OpenGL works, drawing objects and rotating objects should be fun, depending of course on what we are drawing. OpenGL is well documented, and OpenGL itself always works the same. The first question should be: what are we drawing, and what are the objectives of the project? So, you want to draw a cube and rotate it. Cool. Once we can draw one cube and rotate it, we can add more objects to the scene. Strategically, you can divide your project into parts:
Draw your object.
Rotate it.
Add more objects.
Rotate them.
We are done.
If you want to rotate the view as well, you can use the same process as above with some modifications. For example:
Draw the view.
Draw objects inside the view.
Rotate the objects.
Rotate the view.
On the other hand you can just use the camera and move it around the scene.
Done.
To make things worse, LibGDX code can extend many different classes, and the programmer has to implement all of their abstract methods. Your code might look different, or some functions may behave differently, depending on which class you extend or implement in your project, and the documentation for those classes is patchy. Each and every abstract class comes with its own abstract methods. The programmer should also release any resources allocated by LibGDX using the dispose() method. With just a few changes your code should work as expected.
For example:
//
package com.mygdx.game;
import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.ApplicationListener;
import com.badlogic.gdx.Input.Buttons;
import com.badlogic.gdx.Input.Keys;
import com.badlogic.gdx.InputProcessor;
//import etc...
public class Space extends ApplicationAdapter implements ApplicationListener, InputProcessor {
SpriteBatch batch;
BitmapFont font;
float backgroundColor[];
Cube cubes[];
ModelBatch modelBatch;
int selectedCube;
PerspectiveCamera camera;
Environment environment;
int xCubes = 27;
int touchedButton;
int lastX, lastY;
float Theta, Phi, dX, dY;
Vector3 screenAOR;
float screenAng;
float point[];
int side[];
int front[];
float w;
float h;
Model viewM;
ModelInstance viewMin;
Vector3 position;
ColorAttribute attrib;
Vector3 cubePositions[];
boolean drag;
@Override
public void create () {
drag = false;
Theta = 0;
Phi = 0;
batch = new SpriteBatch();
font = new BitmapFont();
font.setColor(Color.FOREST);
w = Gdx.graphics.getWidth();
h = Gdx.graphics.getHeight();
modelBatch = new ModelBatch();
screenAOR = new Vector3();
camera = new PerspectiveCamera(67f, 3f, 2f);
camera.position.set(10f, -10f, 70f);
camera.lookAt(Vector3.Zero);
camera.up.set(Vector3.Y);
camera.near = 1f;
camera.far = 500f;
camera.update();
backgroundColor = new float[]{.9f, .9f, .7f};
environment = new Environment();
environment.set(new ColorAttribute( ColorAttribute.AmbientLight, .6f, .6f, .6f, 1f));
environment.add(new DirectionalLight().set(.8f, .8f, .8f, 50f, 50f, 50f));
environment.add(new DirectionalLight().set(.5f, .5f, .5f, -50f, -50f, 50f));
spaceModel();
Gdx.input.setInputProcessor(this);
//Gdx.graphics.requestRendering();
}
@Override
public void render () {
Gdx.gl.glClearColor(1, 0, 0, 1);
//Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
//Gdx.gl.glClearColor(backgroundColor[0], backgroundColor[1], backgroundColor[2], 1f);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);
Gdx.gl.glEnable(GL20.GL_DEPTH_TEST);
Gdx.gl.glEnable(GL20.GL_CULL_FACE);
batch.begin();
modelBatch.begin(camera);
modelBatch.render(viewMin, environment);
for(int i = 0; i < cubes.length; i++){
modelBatch.render(cubes[i].modelInstance, environment);
}
font.draw(batch, "Space pro...", 10, 100);
batch.end();
modelBatch.end();
}
@Override
public void dispose () {
batch.dispose();
modelBatch.dispose();
font.dispose();
viewM.dispose();
}
/*///////////////////////////////////
//Implements all abstract methods.
//
*/////////////////////////////////
@Override
public boolean touchDragged(int screenX, int screenY, int pointer) {
//lastX -= screenX;
//lastY -= screenY;
//float aspRatio = w/h;
//float angle = 40.0f;
moveModel(screenX, screenY);
// distance of mouse movement
//screenAng = (float) Math.sqrt( ((lastX * lastX) + (lastY * lastY)) );
//screenAng = (float) Math.tan((angle * 0.5)* (Math.PI/180));
// direction vector of the AOR
//screenAOR.set((lastY/screenAng), (lastX/screenAng), 0f );
//screenAOR.set(projection(angle,aspRatio,h,w));
//public Vector3 set(float x, float y, float z)
screenAOR.set(dX, dY, 0f);
if ( touchedButton == 0 ){
//public Matrix4 rotate(Vector3 axis, float degrees)
//cubes[ selectedCube ].modelInstance.transform.rotate( screenAOR, screenAng );
//public Matrix4 rotate(float axisX, float axisY, float axisZ, float degrees)
cubes[ selectedCube ].modelInstance.transform.rotate(dX, dY, 0f, Theta);
cubes[ selectedCube ].modelInstance.transform.rotate(dX, dY, 0f, Phi);
}
else{
//public void rotateAround(Vector3 point, Vector3 axis, float angle)
//camera.rotateAround( Vector3.Zero, screenAOR, (-screenAng/5.5f) );
//public void rotate(float angle, float axisX, float axisY, float axisZ)
//camera.rotate(Theta, dX, dY, 0f);
//camera.rotate(Phi, dX, dY, 0f);
//camera.rotateAround(position, screenAOR, Theta);
camera.rotateAround(Vector3.Zero, screenAOR, Theta);
camera.update();
//camera.rotateAround(position, screenAOR, Phi);
camera.rotateAround(Vector3.Zero, screenAOR, Phi);
camera.update();
viewMin.transform.rotate(dX, dY, 0f, Theta);
viewMin.transform.rotate(dX, dY, 0f, Phi);
}
//Gdx.graphics.requestRendering();
//Gdx.app.log("touchDragged:", screenAng+" : "+screenAOR+" : "+touchedButton);
return true;
}
@Override
public boolean touchDown(int screenX, int screenY, int pointer, int button) {
drag = true;
if(button == Buttons.LEFT){
touchedButton = 0;
}
else{
touchedButton = button;
}
Gdx.app.log("touchDown:", button+" : "+screenX+" : "+screenY+" : "+pointer);
return true;
}
@Override
public boolean keyDown(int i) {
float move = 1.0f;
float pX = w/10;
float pY = h/10;
if(i == Keys.LEFT){
pX -= move;
//public void rotate(float angle, float axisX, float axisY, float axisZ)
//camera.rotate(-45, pX, 0f, 0f);
camera.rotate(-45, 0f, pY, 0f);
//camera.update();
//public void translate(float x, float y, float z)
//camera.translate(move, 0f, 0f);
}
if(i == Keys.RIGHT){
pX += move;
//camera.rotate(45, pX, 0f, 0f);
camera.rotate(45, 0f, pY, 0f);
//camera.update();
//camera.translate(-move, 0f, 0f);
}
if(i == Keys.DOWN){
pY -= move;
//camera.rotate(-45, 0f, pY, 0f);
//camera.rotate(-45, pX, 0f, 0f);
camera.rotate(45, pX, 0f, 0f);
//camera.update();
//camera.translate(0f, 0f, move);
//camera.update();
//camera.translate(0f, move, 0f);
}
if(i == Keys.UP){
pY += move;
//camera.rotate(45, 0f, pY, 0f);
//camera.rotate(45, pX, 0f, 0f);
camera.rotate(-45, pX, 0f, 0f);
//camera.update();
//camera.translate(0f, 0f, -move);
//camera.update();
//camera.translate(0f, -move, 0f);
}
camera.update();
Gdx.app.log("KeyDown: ", pX+" : "+pY+" : "+i);
return true;
}
@Override
public boolean keyUp(int i) {
Gdx.app.log("KeyUp: ",i+" : ");
return false;
}
@Override
public boolean keyTyped(char c) {
//Gdx.app.log("KeyTyped: ",c+" : ");
return false;
}
@Override
public boolean touchUp(int i, int i1, int i2, int i3) {
drag = false;
Gdx.app.log("touchUp: ",i+" : "+i1+" : "+i2+" : "+i3);
return true;
}
@Override
public boolean mouseMoved(int i, int i1) {
if(!drag)
{
dX *= 0.96;
dY *= 0.96;
Theta += dX;
Phi += dY;
return false;
}
Gdx.app.log("mouseMoved: ", i+" : "+i1);
return false;
}
@Override
public boolean scrolled(int i) {
return false;
}
public void moveModel(int x2, int y2){
dX = (float) ((x2 - lastX)*2*(Math.PI/w));
dY = (float) ((y2 - lastY)*2*(Math.PI/h));
Theta += dX;
Phi += dY;
lastX = x2;
lastY = y2;
}
public void spaceModel(){
xCubes = 27;
selectedCube = 14;
ModelBuilder modelB = new ModelBuilder();
attrib = new ColorAttribute(1,Color.VIOLET);
Material m = new Material();
m.set(attrib);
//public Model createXYZCoordinates(float axisLength, Material material, long attributes)
viewM = modelB.createXYZCoordinates(w, m , 1);
cubePositions = new Vector3[xCubes];
for(int i = 0; i < xCubes; i++){
cubePositions[i] = new Vector3((i/9), ((i%9)/3), (i%3)).scl(20f).add(-20f, -20f, -20f);
}
cubes = new Cube[xCubes];
for(int i = 0; i < xCubes; i++){
cubes[i] = new Cube(cubePositions[i], (i == selectedCube));
}
viewMin = new ModelInstance(viewM);
position = cubePositions[0];
viewMin.transform.setTranslation(position);
Gdx.app.log("viewModel: ", w+" : "+h+" : "+w/h);
}
float[] projection(float angle, float a, float z1, float z2){
float ang = (float) Math.tan((angle * 0.5)* (Math.PI/180));
float[] proj = {
(float)0.5/ang, 0, 0, 0,
0, (float)0.5*a/ang, 0, 0,
0, 0, -(z2+z1)/(z2-z1), -1,
0, 0, (-2*z2*z1)/(z2-z1), 0
};
return proj;
}
}
/*////////////////////
//Draw cubes.
//
*/////////////////
class Cube{
Vector3 position;
Model model;
ModelInstance modelInstance;
Cube(Vector3 cubePosition, boolean selected) {
position = cubePosition;
compose(selected);
}
//etc...
}
When you are rotating the camera and rotating the objects, the direction can change or sometimes be reversed, depending on what angle the object and the camera are at, at that point in time. It's like looking into a rear-view mirror while facing the opposite direction. So the user's position and orientation in the scene are also important.
When you are looking at an object spinning "<---" around a circle, it will logically appear to change direction "--->" when it reaches the far side "<--->", i.e. from right-to-left to left-to-right, even though the user is still pressing the same button. The same logic follows when you press the other buttons. Different sequences of rotations can produce different images as well. The other option, which is time consuming, is translate(rotate(scale(geometry))): eventually the entire image becomes a single whole composed of its various parts. This process might help you debug your code and figure out what caused the errors. With a little bit of maths, things can't get any worse. The science behind the object you are rotating is also important: when you watch the object rotate, the beauty is in the observer's eye, e.g. are you looking at the front side or the back side? Finally, you have to use maths to get your model to behave the way you want it to behave.
Enjoy.
I asked the same question on one of the other sister forums, and got an answer that I was able to implement.
See the discussion here.
Here is the change to the code that made the whole thing work correctly:
@Override public boolean touchDragged( int screenX, int screenY, int pointer )
{
lastX -= screenX;
lastY -= screenY;
// distance of mouse movement
screenAng = Vector3.len( lastX, lastY, 0f );
// direction vector of the AOR
screenAOR.set( lastY/screenAng, lastX/screenAng, 0f );
if ( touchedButton == 0 )
{ // rotate the part
// transform the screen AOR to a model rotation
Matrix4 camT, camR, camRi, modT, modR;
camT = new Matrix4();
camR = new Matrix4();
modT = new Matrix4();
modR = new Matrix4();
decompose( camera.view, camT, camR );
camRi = camR.cpy().inv();
decompose( cubes[ selectedCube ].modelInstance.transform, modT, modR );
tempMat.idt()
.mul( modT )
.mul( camRi )
.rotate( screenAOR, -screenAng )
.mul( camR )
.mul( modR );
cubes[ selectedCube ].modelInstance.transform.set( tempMat );
}
else if ( touchedButton == 1 )
{ // rotate the camera
// transform the AOR from screen CS to camera CS
// get the camera transformation matrix
tempMat.set( camera.view );
tempMat.translate( camera.position );
tempMat.inv();
// transform the screen AOR to a world AOR
worldAOR = transform( tempMat, screenAOR, worldAOR ).nor();
// apply the rotation of the angle about the world AOR to the camera
camera.rotateAround( Vector3.Zero, worldAOR, screenAng/5.5f );
camera.update();
}
lastX = screenX;
lastY = screenY;
Gdx.graphics.requestRendering();
return true;
}
Vector3 transform( Matrix4 mat, Vector3 from, Vector3 to )
{
// transform a vector according to a transformation matrix
to.x = from.dot( mat.val[ Matrix4.M00 ], mat.val[ Matrix4.M01 ],
mat.val[ Matrix4.M02 ] )+mat.val[ Matrix4.M03 ];
to.y = from.dot( mat.val[ Matrix4.M10 ], mat.val[ Matrix4.M11 ],
mat.val[ Matrix4.M12 ] )+mat.val[ Matrix4.M13 ];
to.z = from.dot( mat.val[ Matrix4.M20 ], mat.val[ Matrix4.M21 ],
mat.val[ Matrix4.M22 ] )+mat.val[ Matrix4.M23 ];
return to;
}
void decompose( Matrix4 m, Matrix4 t, Matrix4 r )
{
Matrix4 I4 = new Matrix4(); // Identity
for ( int i = 0; i < 4; i++ )
{
for ( int j = 0; j < 4; j++ )
{
if (i == 3 || j == 3)
{
r.val[ i*4+j ] = I4.val[ i*4+j ];
t.val[ i*4+j ] = m.val[ i*4+j ];
}
else
{
r.val[ i*4+j ] = m.val[ i*4+j ];
t.val[ i*4+j ] = I4.val[ i*4+j ];
}
}
}
}
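One detail: the snippet uses tempMat and worldAOR without showing their declarations. They are presumably fields along these lines (my guess at the types, based on how they are used):

// Scratch objects referenced by touchDragged() above (assumed declarations).
Matrix4 tempMat = new Matrix4();  // reused each drag event to avoid allocation
Vector3 worldAOR = new Vector3(); // axis of rotation transformed into world space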

Kinect & Processing: Passing skeleton hand data to mouse position

I've been working on this a while and feel so close! Should be easy, but I'm still new to this.
The skeleton hand data is being passed in as joints[KinectPV2.JointType_HandLeft] and can be accessed through joint.getX() and joint.getY(). I want to pass this data into the update function to replace mouseX and mouseY. I'm guessing I have to create global variables to access it within the update function or maybe I have to pass the skeleton data as parameters into the update function? How can I replace the mouse position data with the hand position?
import KinectPV2.*;
KinectPV2 kinect;
private class MyFluidData implements DwFluid2D.FluidData{
// update() is called during the fluid-simulation update step.
@Override
public void update(DwFluid2D fluid) {
float px, py, vx, vy, radius, vscale, temperature;
radius = 15;
vscale = 10;
px = width/2;
py = 50;
vx = 1 * +vscale;
vy = 1 * vscale;
radius = 40;
temperature = 1f;
fluid.addDensity(px, py, radius, 0.2f, 0.3f, 0.5f, 1.0f);
fluid.addTemperature(px, py, radius, temperature);
particles.spawn(fluid, px, py, radius, 100);
boolean mouse_input = mousePressed;
// add impulse: density + velocity, particles
if(mouse_input && mouseButton == LEFT){
radius = 15;
vscale = 15;
px = mouseX;
py = height-mouseY;
vx = (mouseX - pmouseX) * +vscale;
vy = (mouseY - pmouseY) * -vscale;
fluid.addDensity (px, py, radius, 0.25f, 0.0f, 0.1f, 1.0f);
fluid.addVelocity(px, py, radius, vx, vy);
particles.spawn(fluid, px, py, radius*2, 300);
}
// add impulse: density + temperature, particles
if(mouse_input && mouseButton == CENTER){
radius = 15;
vscale = 15;
px = mouseX;
py = height-mouseY;
temperature = 2f;
fluid.addDensity(px, py, radius, 0.25f, 0.0f, 0.1f, 1.0f);
fluid.addTemperature(px, py, radius, temperature);
particles.spawn(fluid, px, py, radius, 100);
}
// particles
if(mouse_input && mouseButton == RIGHT){
px = mouseX;
py = height - 1 - mouseY; // invert
radius = 50;
particles.spawn(fluid, px, py, radius, 300);
}
}
}
int viewport_w = 1280;
int viewport_h = 720;
int viewport_x = 230;
int viewport_y = 0;
int gui_w = 200;
int gui_x = 20;
int gui_y = 20;
int fluidgrid_scale = 3;
DwFluid2D fluid;
// render targets
PGraphics2D pg_fluid;
//texture-buffer, for adding obstacles
PGraphics2D pg_obstacles;
// custom particle system
MyParticleSystem particles;
// some state variables for the GUI/display
int BACKGROUND_COLOR = 0;
boolean UPDATE_FLUID = true;
boolean DISPLAY_FLUID_TEXTURES = false;
boolean DISPLAY_FLUID_VECTORS = false;
int DISPLAY_fluid_texture_mode = 0;
boolean DISPLAY_PARTICLES = true;
public void settings() {
size(viewport_w, viewport_h, P2D);
smooth(4);
}
public void setup() {
surface.setLocation(viewport_x, viewport_y);
// main library context
DwPixelFlow context = new DwPixelFlow(this);
context.print();
context.printGL();
// fluid simulation
fluid = new DwFluid2D(context, viewport_w, viewport_h, fluidgrid_scale);
// set some simulation parameters
fluid.param.dissipation_density = 0.999f;
fluid.param.dissipation_velocity = 0.99f;
fluid.param.dissipation_temperature = 0.80f;
fluid.param.vorticity = 0.10f;
fluid.param.timestep = 0.25f;
fluid.param.gridscale = 8f;
// interface for adding data to the fluid simulation
MyFluidData cb_fluid_data = new MyFluidData();
fluid.addCallback_FluiData(cb_fluid_data);
// pgraphics for fluid
pg_fluid = (PGraphics2D) createGraphics(viewport_w, viewport_h, P2D);
pg_fluid.smooth(4);
pg_fluid.beginDraw();
pg_fluid.background(BACKGROUND_COLOR);
pg_fluid.endDraw();
// pgraphics for obstacles
pg_obstacles = (PGraphics2D) createGraphics(viewport_w, viewport_h, P2D);
pg_obstacles.smooth(4);
pg_obstacles.beginDraw();
pg_obstacles.clear();
float radius;
radius = 200;
pg_obstacles.stroke(64);
pg_obstacles.strokeWeight(1);
pg_obstacles.fill(0);
pg_obstacles.rect(1*width/2f, 1*height/4f, radius, radius/2, 10);
pg_obstacles.stroke(64);
pg_obstacles.strokeWeight(1);
pg_obstacles.fill(0);
pg_obstacles.rect(1*width/3.5f, 1*height/2.5f, radius, radius/2, 10);
//// border-obstacle
//pg_obstacles.strokeWeight(20);
//pg_obstacles.stroke(64);
//pg_obstacles.noFill();
//pg_obstacles.rect(0, 0, pg_obstacles.width, pg_obstacles.height);
pg_obstacles.endDraw();
fluid.addObstacles(pg_obstacles);
// custom particle object
particles = new MyParticleSystem(context, 1024 * 1024);
kinect = new KinectPV2(this);
//Enables depth and Body tracking (mask image)
kinect.enableDepthMaskImg(true);
kinect.enableSkeletonDepthMap(true);
kinect.init();
background(0);
frameRate(60);
}
public void draw() {
PImage imgC = kinect.getDepthMaskImage();
image(imgC, 0, 0, 320, 240);
//get the skeletons as an Arraylist of KSkeletons
ArrayList<KSkeleton> skeletonArray = kinect.getSkeletonDepthMap();
//individual joints
for (int i = 0; i < skeletonArray.size(); i++) {
KSkeleton skeleton = (KSkeleton) skeletonArray.get(i);
//if the skeleton is being tracked, compute the skeleton joints
if (skeleton.isTracked()) {
KJoint[] joints = skeleton.getJoints();
color col = skeleton.getIndexColor();
fill(col);
stroke(col);
drawHandState(joints[KinectPV2.JointType_HandRight]);
drawHandState(joints[KinectPV2.JointType_HandLeft]);
}
}
// update simulation
if(UPDATE_FLUID){
fluid.addObstacles(pg_obstacles);
fluid.update();
particles.update(fluid);
}
// clear render target
pg_fluid.beginDraw();
pg_fluid.background(BACKGROUND_COLOR);
pg_fluid.endDraw();
// render fluid stuff
if(DISPLAY_FLUID_TEXTURES){
// render: density (0), temperature (1), pressure (2), velocity (3)
fluid.renderFluidTextures(pg_fluid, DISPLAY_fluid_texture_mode);
}
if(DISPLAY_FLUID_VECTORS){
// render: velocity vector field
fluid.renderFluidVectors(pg_fluid, 10);
}
if( DISPLAY_PARTICLES){
// render: particles; 0 ... points, 1 ...sprite texture, 2 ... dynamic points
particles.render(pg_fluid, BACKGROUND_COLOR);
}
// display
image(pg_fluid , 320, 0);
image(pg_obstacles, 320, 0);
// display number of particles as text
//String txt_num_particles = String.format("Particles %,d", particles.ALIVE_PARTICLES);
//fill(0, 0, 0, 220);
//noStroke();
//rect(10, height-10, 160, -30);
//fill(255,128,0);
//text(txt_num_particles, 20, height-20);
// info
//String txt_fps = String.format(getClass().getName()+ " [size %d/%d] [frame %d] [fps %6.2f]", fluid.fluid_w, fluid.fluid_h, fluid.simulation_step, frameRate);
//surface.setTitle(txt_fps);
}
//draw an ellipse depending on the hand state
void drawHandState(KJoint joint) {
noStroke();
handState(joint.getState());
//println(joint.getState());
pushMatrix();
translate(joint.getX(), joint.getY(), joint.getZ());
//println(joint.getX(), joint.getY(), joint.getZ());
ellipse(joint.getX(), joint.getY(), 70, 70);
popMatrix();
}
/*
Different hand state
KinectPV2.HandState_Open
KinectPV2.HandState_Closed
KinectPV2.HandState_Lasso
KinectPV2.HandState_NotTracked
*/
//Depending on the hand state change the color
void handState(int handState) {
switch(handState) {
case KinectPV2.HandState_Open:
fill(0, 255, 0);
break;
case KinectPV2.HandState_Closed:
fill(255, 0, 0);
break;
case KinectPV2.HandState_Lasso:
fill(0, 0, 255);
break;
case KinectPV2.HandState_NotTracked:
fill(100, 100, 100);
break;
}
}
I'm guessing I have to create global variables to access it within the update function or maybe I have to pass the skeleton data as parameters into the update function?
What happened when you tried those approaches?
Either approach sounds fine. You could store the variables in a sketch-level variable, set those variables from the kinect code, then use those variables in your drawing code. Or you could pass the variables as a parameter to the drawing code. Either should work fine. I'd probably go for the first approach because it sounds easier to me, but that's just my personal preference.
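For illustration, the first approach could look something like this in the sketch from the question (handX, handY, handTracked, and updateHand are names I've made up; this is an untested sketch, not the KinectPV2 API itself):

// Sketch-level variables: written by the Kinect code in draw(),
// read inside MyFluidData.update() in place of mouseX/mouseY.
float handX, handY;
boolean handTracked = false;

// Called from draw() once a tracked skeleton's joints are available.
void updateHand(KJoint[] joints) {
  KJoint hand = joints[KinectPV2.JointType_HandLeft];
  handX = hand.getX();
  handY = hand.getY();
  handTracked = true;
}

// Then, inside MyFluidData.update(), something like:
// if (handTracked) {
//   px = handX;
//   py = height - handY; // the fluid simulation's y-axis is inverted
//   fluid.addDensity(px, py, radius, 0.25f, 0.0f, 0.1f, 1.0f);
//   particles.spawn(fluid, px, py, radius * 2, 300);
// }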
I suggest working in smaller chunks. Create a separate program that ignores the kinect for now. Create a hard-coded sketch-level variable that holds the same type of information you'd get from the kinect. Then write drawing code that uses that hard-coded variable to draw the frame. Get that working perfectly before you try adding the kinect code back in.
Then if you get stuck on a specific step, you can post a MCVE and we can go from there. Good luck.

Zooming in map with OpenGL-ES2 Android

I have created a pinch zoom with a scale detector which in turn calls the following renderer.
This uses the projection matrix to do the zoom and then scales the eye per the zoom when panning.
public class vboCustomGLRenderer implements GLSurfaceView.Renderer {
// Store the model matrix. This matrix is used to move models from object space (where each model can be thought
// of being located at the center of the universe) to world space.
private float[] mModelMatrix = new float[16];
// Store the view matrix. This can be thought of as our camera. This matrix transforms world space to eye space;
// it positions things relative to our eye.
private float[] mViewMatrix = new float[16];
// Store the projection matrix. This is used to project the scene onto a 2D viewport.
private float[] mProjectionMatrix = new float[16];
// Allocate storage for the final combined matrix. This will be passed into the shader program.
private float[] mMVPMatrix = new float[16];
// This will be used to pass in the transformation matrix.
private int mMVPMatrixHandle;
// This will be used to pass in model position information.
private int mPositionHandle;
// This will be used to pass in model color information.
private int mColorUniformLocation;
// How many bytes per float.
private final int mBytesPerFloat = 4;
// Offset of the position data.
private final int mPositionOffset = 0;
// Size of the position data in elements.
private final int mPositionDataSize = 3;
// How many elements per vertex for double values.
private final int mPositionFloatStrideBytes = mPositionDataSize * mBytesPerFloat;
// Position the eye behind the origin.
public double eyeX = default_settings.mbrMinX + ((default_settings.mbrMaxX - default_settings.mbrMinX)/2);
public double eyeY = default_settings.mbrMinY + ((default_settings.mbrMaxY - default_settings.mbrMinY)/2);
// Position the eye behind the origin.
//final float eyeZ = 1.5f;
public float eyeZ = 1.5f;
// We are looking toward the distance
public double lookX = eyeX;
public double lookY = eyeY;
public float lookZ = 0.0f;
// Set our up vector. This is where our head would be pointing were we holding the camera.
public float upX = 0.0f;
public float upY = 1.0f;
public float upZ = 0.0f;
public double mScaleFactor = 1;
public double mScrnVsMapScaleFactor = 0;
public vboCustomGLRenderer() {}
public void setEye(double x, double y){
eyeX -= (x / screen_vs_map_horz_ratio);
lookX = eyeX;
eyeY += (y / screen_vs_map_vert_ratio);
lookY = eyeY;
// Set the camera position (View matrix)
Matrix.setLookAtM(mViewMatrix, 0, (float)eyeX, (float)eyeY, eyeZ, (float)lookX, (float)lookY, lookZ, upX, upY, upZ);
}
public void setScaleFactor(float scaleFactor, float gdx, float gdy){
mScaleFactor *= scaleFactor;
mRight = mRight / scaleFactor;
mLeft = -mRight;
mTop = mTop / scaleFactor;
mBottom = -mTop;
//Need to calculate the shift in the eye when zooming on a particular spot.
//So get the distance between the zoom point and eye point, figure out the
//new eye point by getting the factor of this distance.
double eyeXShift = (((mWidth / 2) - gdx) - (((mWidth / 2) - gdx) / scaleFactor));
double eyeYShift = (((mHeight / 2) - gdy) - (((mHeight / 2) - gdy) / scaleFactor));
screen_vs_map_horz_ratio = (mWidth/(mRight-mLeft));
screen_vs_map_vert_ratio = (mHeight/(mTop-mBottom));
eyeX -= (eyeXShift / screen_vs_map_horz_ratio);
lookX = eyeX;
eyeY += (eyeYShift / screen_vs_map_vert_ratio);
lookY = eyeY;
// Set the scale (Projection matrix)
Matrix.frustumM(mProjectionMatrix, 0, (float)mLeft, (float)mRight, (float)mBottom, (float)mTop, near, far);
}
@Override
public void onSurfaceCreated(GL10 unused, EGLConfig config) {
// Set the background frame color
//White
GLES20.glClearColor(1.0f, 1.0f, 1.0f, 1.0f);
// Set the view matrix. This matrix can be said to represent the camera position.
// NOTE: In OpenGL 1, a ModelView matrix is used, which is a combination of a model and
// view matrix. In OpenGL 2, we can keep track of these matrices separately if we choose.
Matrix.setLookAtM(mViewMatrix, 0, (float)eyeX, (float)eyeY, eyeZ, (float)lookX, (float)lookY, lookZ, upX, upY, upZ);
final String vertexShader =
"uniform mat4 u_MVPMatrix; \n" // A constant representing the combined model/view/projection matrix.
+ "attribute vec4 a_Position; \n" // Per-vertex position information we will pass in.
+ "attribute vec4 a_Color; \n" // Per-vertex color information we will pass in.
+ "varying vec4 v_Color; \n" // This will be passed into the fragment shader.
+ "void main() \n" // The entry point for our vertex shader.
+ "{ \n"
+ " v_Color = a_Color; \n" // Pass the color through to the fragment shader.
// It will be interpolated across the triangle.
+ " gl_Position = u_MVPMatrix \n" // gl_Position is a special variable used to store the final position.
+ " * a_Position; \n" // Multiply the vertex by the matrix to get the final point in
+ "} \n"; // normalized screen coordinates.
final String fragmentShader =
"precision mediump float; \n" // Set the default precision to medium. We don't need as high of a
// precision in the fragment shader.
+ "uniform vec4 u_Color; \n" // This is the color from the vertex shader interpolated across the
// triangle per fragment.
+ "void main() \n" // The entry point for our fragment shader.
+ "{ \n"
+ " gl_FragColor = u_Color; \n" // Pass the color directly through the pipeline.
+ "} \n";
// Load in the vertex shader.
int vertexShaderHandle = GLES20.glCreateShader(GLES20.GL_VERTEX_SHADER);
if (vertexShaderHandle != 0)
{
// Pass in the shader source.
GLES20.glShaderSource(vertexShaderHandle, vertexShader);
// Compile the shader.
GLES20.glCompileShader(vertexShaderHandle);
// Get the compilation status.
final int[] compileStatus = new int[1];
GLES20.glGetShaderiv(vertexShaderHandle, GLES20.GL_COMPILE_STATUS, compileStatus, 0);
// If the compilation failed, delete the shader.
if (compileStatus[0] == 0)
{
GLES20.glDeleteShader(vertexShaderHandle);
vertexShaderHandle = 0;
}
}
if (vertexShaderHandle == 0)
{
throw new RuntimeException("Error creating vertex shader.");
}
// Load in the fragment shader shader.
int fragmentShaderHandle = GLES20.glCreateShader(GLES20.GL_FRAGMENT_SHADER);
if (fragmentShaderHandle != 0)
{
// Pass in the shader source.
GLES20.glShaderSource(fragmentShaderHandle, fragmentShader);
// Compile the shader.
GLES20.glCompileShader(fragmentShaderHandle);
// Get the compilation status.
final int[] compileStatus = new int[1];
GLES20.glGetShaderiv(fragmentShaderHandle, GLES20.GL_COMPILE_STATUS, compileStatus, 0);
// If the compilation failed, delete the shader.
if (compileStatus[0] == 0)
{
GLES20.glDeleteShader(fragmentShaderHandle);
fragmentShaderHandle = 0;
}
}
if (fragmentShaderHandle == 0)
{
throw new RuntimeException("Error creating fragment shader.");
}
// Create a program object and store the handle to it.
int programHandle = GLES20.glCreateProgram();
if (programHandle != 0)
{
// Bind the vertex shader to the program.
GLES20.glAttachShader(programHandle, vertexShaderHandle);
// Bind the fragment shader to the program.
GLES20.glAttachShader(programHandle, fragmentShaderHandle);
// Bind attributes
GLES20.glBindAttribLocation(programHandle, 0, "a_Position");
GLES20.glBindAttribLocation(programHandle, 1, "a_Color");
// Link the two shaders together into a program.
GLES20.glLinkProgram(programHandle);
// Get the link status.
final int[] linkStatus = new int[1];
GLES20.glGetProgramiv(programHandle, GLES20.GL_LINK_STATUS, linkStatus, 0);
// If the link failed, delete the program.
if (linkStatus[0] == 0)
{
GLES20.glDeleteProgram(programHandle);
programHandle = 0;
}
}
if (programHandle == 0)
{
throw new RuntimeException("Error creating program.");
}
// Set program handles. These will later be used to pass in values to the program.
mMVPMatrixHandle = GLES20.glGetUniformLocation(programHandle, "u_MVPMatrix");
mPositionHandle = GLES20.glGetAttribLocation(programHandle, "a_Position");
mColorUniformLocation = GLES20.glGetUniformLocation(programHandle, "u_Color");
// Tell OpenGL to use this program when rendering.
GLES20.glUseProgram(programHandle);
}
static double mWidth = 0;
static double mHeight = 0;
static double mLeft = 0;
static double mRight = 0;
static double mTop = 0;
static double mBottom = 0;
static double mRatio = 0;
double screen_width_height_ratio;
double screen_height_width_ratio;
final float near = 1.5f;
final float far = 10.0f;
double screen_vs_map_horz_ratio = 0;
double screen_vs_map_vert_ratio = 0;
@Override
public void onSurfaceChanged(GL10 unused, int width, int height) {
// Adjust the viewport based on geometry changes,
// such as screen rotation
// Set the OpenGL viewport to the same size as the surface.
GLES20.glViewport(0, 0, width, height);
screen_width_height_ratio = (double) width / height;
screen_height_width_ratio = (double) height / width;
//Initialize
if (mRatio == 0){
mWidth = (double) width;
mHeight = (double) height;
//map height to width ratio
double map_extents_width = default_settings.mbrMaxX - default_settings.mbrMinX;
double map_extents_height = default_settings.mbrMaxY - default_settings.mbrMinY;
double map_width_height_ratio = map_extents_width/map_extents_height;
if (screen_width_height_ratio > map_width_height_ratio){
mRight = (screen_width_height_ratio * map_extents_height)/2;
mLeft = -mRight;
mTop = map_extents_height/2;
mBottom = -mTop;
}
else{
mRight = map_extents_width/2;
mLeft = -mRight;
mTop = (screen_height_width_ratio * map_extents_width)/2;
mBottom = -mTop;
}
mRatio = screen_width_height_ratio;
}
if (screen_width_height_ratio != mRatio){
final double wRatio = width/mWidth;
final double oldWidth = mRight - mLeft;
final double newWidth = wRatio * oldWidth;
final double widthDiff = (newWidth - oldWidth)/2;
mLeft = mLeft - widthDiff;
mRight = mRight + widthDiff;
final double hRatio = height/mHeight;
final double oldHeight = mTop - mBottom;
final double newHeight = hRatio * oldHeight;
final double heightDiff = (newHeight - oldHeight)/2;
mBottom = mBottom - heightDiff;
mTop = mTop + heightDiff;
mWidth = (double) width;
mHeight = (double) height;
mRatio = screen_width_height_ratio;
}
screen_vs_map_horz_ratio = (mWidth/(mRight-mLeft));
screen_vs_map_vert_ratio = (mHeight/(mTop-mBottom));
Matrix.frustumM(mProjectionMatrix, 0, (float)mLeft, (float)mRight, (float)mBottom, (float)mTop, near, far);
}
ListIterator<mapLayer> orgNonAssetCatLayersList_it;
ListIterator<FloatBuffer> mapLayerObjectList_it;
ListIterator<Byte> mapLayerObjectTypeList_it;
mapLayer MapLayer;
@Override
public void onDrawFrame(GL10 unused) {
GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
drawPreset();
orgNonAssetCatLayersList_it = default_settings.orgNonAssetCatMappableLayers.listIterator();
while (orgNonAssetCatLayersList_it.hasNext()) {
MapLayer = orgNonAssetCatLayersList_it.next();
if (MapLayer.BatchedPointVBO != null){
}
if (MapLayer.BatchedLineVBO != null){
drawLineString(MapLayer.BatchedLineVBO, MapLayer.lineStringObjColor);
}
if (MapLayer.BatchedPolygonVBO != null){
drawPolygon(MapLayer.BatchedPolygonVBO, MapLayer.polygonObjColor);
}
}
}
private void drawPreset()
{
Matrix.setIdentityM(mModelMatrix, 0);
// This multiplies the view matrix by the model matrix, and stores the result in the MVP matrix
// (which currently contains model * view).
Matrix.multiplyMM(mMVPMatrix, 0, mViewMatrix, 0, mModelMatrix, 0);
// This multiplies the modelview matrix by the projection matrix, and stores the result in the MVP matrix
// (which now contains model * view * projection).
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mMVPMatrix, 0);
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mMVPMatrix, 0);
}
private void drawLineString(final FloatBuffer geometryBuffer, final float[] colorArray)
{
// Pass in the position information
geometryBuffer.position(mPositionOffset);
GLES20.glVertexAttribPointer(mPositionHandle, mPositionDataSize, GLES20.GL_FLOAT, false, mPositionFloatStrideBytes, geometryBuffer);
GLES20.glEnableVertexAttribArray(mPositionHandle);
GLES20.glUniform4f(mColorUniformLocation, colorArray[0], colorArray[1], colorArray[2], 1f);
GLES20.glLineWidth(2.0f);
GLES20.glDrawArrays(GLES20.GL_LINES, 0, geometryBuffer.capacity()/mPositionDataSize);
}
private void drawPolygon(final FloatBuffer geometryBuffer, final float[] colorArray)
{
// Pass in the position information
geometryBuffer.position(mPositionOffset);
GLES20.glVertexAttribPointer(mPositionHandle, mPositionDataSize, GLES20.GL_FLOAT, false, mPositionFloatStrideBytes, geometryBuffer);
GLES20.glEnableVertexAttribArray(mPositionHandle);
GLES20.glUniform4f(mColorUniformLocation, colorArray[0], colorArray[1], colorArray[2], 1f);
GLES20.glLineWidth(1.0f);
GLES20.glDrawArrays(GLES20.GL_LINES, 0, geometryBuffer.capacity()/mPositionDataSize);
}
}
This works very well until it gets to a certain zoom level, then the panning starts jumping. After testing, I found this was because the floating-point value of the eye could not cope with such a small shift in position. I keep my x and y eye values in doubles so the shifting positions keep being calculated, and convert them to floats when calling setLookAtM().
So I need to change the way the zoom works. Instead of zooming with the projection, I was thinking of scaling the model larger or smaller.
The setScaleFactor() function in my code will change, by removing the projection and eye shifting.
There is a Matrix.scaleM(m, offset, x, y, z) function, but I am unsure how or where to use it.
Could use some suggestions on how to accomplish this.
[Edit] 24/7/2013
I tried altering setScaleFactor() like so:
public void setScaleFactor(float scaleFactor, float gdx, float gdy){
mScaleFactor *= scaleFactor;
}
and in drawPreset()
private void drawPreset()
{
Matrix.setIdentityM(mModelMatrix, 0);
//*****Added scaleM
Matrix.scaleM(mModelMatrix, 0, (float)mScaleFactor, (float)mScaleFactor, 1.0f);
// This multiplies the view matrix by the model matrix, and stores the result in the MVP matrix
// (which currently contains model * view).
Matrix.multiplyMM(mMVPMatrix, 0, mViewMatrix, 0, mModelMatrix, 0);
// This multiplies the modelview matrix by the projection matrix, and stores the result in the MVP matrix
// (which now contains model * view * projection).
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mMVPMatrix, 0);
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mMVPMatrix, 0);
}
Now as soon as you zoom, the image disappears from the screen. Actually, I found it off to the right-hand side; I could still pan over to it.
I'm still not sure what I should be scaling to zoom: the model, the view, or the model-view?
I have found that taking the center of your model back to the origin (0,0) lets you extend your zoom capabilities. My x-coordinate data was between 152.6 and 152.7, so I took it back to the origin with an offset of 152.65, which needs to be applied to the data before loading it into the FloatBuffer. The width of the data then becomes 0.1, or 0.05 on each side, allowing for more precision in the trailing digits of the values.
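
A sketch of that recentering, applied while filling the vertex buffer (vertices and geometryBuffer are illustrative names; default_settings holds the map extents already used elsewhere in the renderer):

// Shift x-coordinates so the map's center sits at the origin; the values
// become small (roughly +/-0.05 here) and keep their float precision.
float offsetX = (float) (default_settings.mbrMinX
        + (default_settings.mbrMaxX - default_settings.mbrMinX) / 2.0);
for (int i = 0; i < vertices.length; i += 3) {
    vertices[i] -= offsetX; // x is the first element of each x,y,z triple
}
geometryBuffer.put(vertices).position(0);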

Two classes instantiated but only one is being used

Hi, I'm creating a water scene and have a class called drawWater. The class takes in an equation to alter its appearance. When I try to create
drawWater water = new drawWater();
drawWater water2 = new drawWater();
They both seem to be created with the correct values, but when I draw them onto the screen only one is shown, even though their values are different.
I don't know where I'm going wrong. It doesn't seem to be a JOGL problem, but rather the way I'm setting up the class?
Can anyone see where I went wrong?
This is the main class:
package waterAttempt41;
import Common.GLDisplay;
public class Lesson27 {
public static void main(String[] args) {
GLDisplay neheGLDisplay = GLDisplay.createGLDisplay("Current water attempt");
Renderer renderer = new Renderer();
//InputHandler inputHandler = new InputHandler(renderer, neheGLDisplay);
neheGLDisplay.addGLEventListener(renderer);
//neheGLDisplay.addKeyListener(inputHandler);
neheGLDisplay.start();
}
}
This is the renderer class:
package waterAttempt41;
import Common.TextureReader;
import java.io.IOException;
import java.util.logging.Logger;
import javax.media.opengl.GL;
import javax.media.opengl.GLAutoDrawable;
import javax.media.opengl.GLEventListener;
import javax.media.opengl.glu.GLU;
class Renderer implements GLEventListener {
private static final Logger logger = Logger.getLogger(Renderer.class.getName());
drawWater water;
drawWater water2;
private int[] textures = new int[3]; // Storage For 3 Textures
private boolean aDown = false;
private boolean up = false;
private GLU glu = new GLU();
public void init(GLAutoDrawable drawable) {
GL gl = drawable.getGL();
try {
loadGLTextures(drawable);
} catch (IOException e) {
logger.info("Exception loading textures or Objects");
System.out.println("Couldn't load model/Texture");
throw new RuntimeException(e);
}
gl.glEnable(GL.GL_DEPTH_TEST);
gl.glShadeModel(GL.GL_SMOOTH);
gl.glLightModeli(GL.GL_LIGHT_MODEL_TWO_SIDE, GL.GL_TRUE);
gl.glCullFace(GL.GL_BACK); // Set Culling Face To Back Face
gl.glEnable(GL.GL_CULL_FACE); // Enable Culling
gl.glClearColor(0.0f, 0.0f, 0.0f, 1.0f); // Set Clear Color (Greenish Color)
float spot_ambient[] = {0.2f, 0.2f, 0.2f, 1.0f};
float spot_diffuse[] = {10.2f, 10.2f, 10.2f, 10.0f};
float spot_specular[] = {10.2f, 10.2f, 10.2f, 10.0f};
gl.glLightfv(GL.GL_LIGHT0, GL.GL_AMBIENT, spot_ambient, 1);
gl.glLightfv(GL.GL_LIGHT0, GL.GL_DIFFUSE, spot_diffuse, 1);
gl.glLightfv(GL.GL_LIGHT0, GL.GL_SPECULAR, spot_specular, 1);
gl.glEnable(GL.GL_LIGHTING);
gl.glEnable(GL.GL_LIGHT0);
water = new drawWater();
water.setup(gl, "a*sin( y *(x-b) )");
water.setFuncVar('a', 1.103);
water.setFuncVar('b', 1.103);
water.setMax(5);
water2 = new drawWater();
water2.setup(gl, "a*sin( y *(x-b) )");
water2.setFuncVar('a', 0.05);
water2.setFuncVar('b', 10.0);
water2.setMax(25);
}
public void display(GLAutoDrawable drawable) {
GL gl = drawable.getGL();
// Clear Color Buffer, Depth Buffer, Stencil Buffer
gl.glClear(GL.GL_COLOR_BUFFER_BIT | GL.GL_DEPTH_BUFFER_BIT | GL.GL_STENCIL_BUFFER_BIT);
logger.info("\nWater: a = " + water.getFuncVar('a') + " b =" + water.getFuncVar('b') + "\n"
+ "Water2: a = " + water2.getFuncVar('a') + " b =" + water2.getFuncVar('b')
+ "\nWater: = " + water.getEqu() + " \nWater2 =" + water2.getEqu()
+ "\nWater: max = " + water.getMax() + " Water2: max = " + water2.getMax());
gl.glPushMatrix();
gl.glTranslatef(0.0f, -20.50f, 0.0f);
gl.glTranslatef(0.0f, 0.0f, -31.0f);
gl.glRotatef(90, 0.0f, 0.0f, 1.0f); // Rotate By -yrot On Y Axis
gl.glRotatef(90, 0.0f, 1.0f, 0.0f);
water.draw(gl);
gl.glTranslatef(0.0f, -20.0f, 0.0f);
water2.draw(gl);
gl.glPopMatrix();
gl.glPushMatrix();
gl.glTranslatef(0.0f, 0.0f, -20.0f); // Zoom Into The Screen 20 Units
gl.glEnable(GL.GL_TEXTURE_2D); // Enable Texture Mapping ( NEW )
drawRoom(gl); // Draw The Room
gl.glPopMatrix();
gl.glFlush(); // Flush The OpenGL Pipeline
}
private void drawRoom(GL gl) { // Draw The Room (Box)
gl.glBegin(GL.GL_QUADS); // Begin Drawing Quads
/*
// Floor
gl.glNormal3f(0.0f, 1.0f, 0.0f); // Normal Pointing Up
gl.glVertex3f(-20.0f, -20.0f, -40.0f); // Back Left
gl.glVertex3f(-20.0f, -20.0f, 40.0f); // Front Left
gl.glVertex3f(20.0f, -20.0f, 40.0f); // Front Right
gl.glVertex3f(20.0f, -20.0f, -40.0f); // Back Right
// Ceiling
gl.glNormal3f(0.0f, -1.0f, 0.0f); // Normal Point Down
gl.glVertex3f(-10.0f, 10.0f, 20.0f); // Front Left
gl.glVertex3f(-10.0f, 10.0f, -20.0f); // Back Left
gl.glVertex3f(10.0f, 10.0f, -20.0f); // Back Right
gl.glVertex3f(10.0f, 10.0f, 20.0f); // Front Right
// Back Wall
gl.glNormal3f(0.0f, 0.0f, -1.0f); // Normal Pointing Towards Viewer
gl.glVertex3f(20.0f, 20.0f, 30.0f); // Top Right
gl.glVertex3f(20.0f, -20.0f, 30.0f); // Bottom Right
gl.glVertex3f(-20.0f, -20.0f, 30.0f); // Bottom Left
gl.glVertex3f(-20.0f, 20.0f, 30.0f); // Top Left
// Left Wall
gl.glNormal3f(1.0f, 0.0f, 0.0f); // Normal Pointing Right
gl.glVertex3f(-20.0f, 20.0f, 30.0f); // Top Front
gl.glVertex3f(-20.0f, -20.0f, 30.0f); // Bottom Front
gl.glVertex3f(-20.0f, -20.0f, -30.0f); // Bottom Back
gl.glVertex3f(-20.0f, 20.0f, -30.0f); // Top Back
// Right Wall
gl.glNormal3f(-1.0f, 0.0f, 0.0f); // Normal Pointing Left
gl.glVertex3f(20.0f, 20.0f, -30.0f); // Top Back
gl.glVertex3f(20.0f, -20.0f, -30.0f); // Bottom Back
gl.glVertex3f(20.0f, -20.0f, 30.0f); // Bottom Front
gl.glVertex3f(20.0f, 20.0f, 30.0f); // Top Front
*/
// Front Wall
gl.glNormal3f(0.0f, 0.0f, 1.0f); // Normal Pointing Away From Viewer
gl.glTexCoord2f(1, 1);
gl.glVertex3f(-20.0f, 20.0f, -30.0f); // Top Left
gl.glTexCoord2f(1, 0);
gl.glVertex3f(-20.0f, -20.0f, -30.0f); // Bottom Left
gl.glTexCoord2f(0, 0);
gl.glVertex3f(20.0f, -20.0f, -30.0f); // Bottom Right
gl.glTexCoord2f(0, 1);
gl.glVertex3f(20.0f, 20.0f, -30.0f); // Top Right
gl.glEnd(); // Done Drawing Quads
}
public void reshape(GLAutoDrawable drawable, int xstart, int ystart, int width, int height) {
GL gl = drawable.getGL();
height = (height == 0) ? 1 : height;
gl.glViewport(0, 0, width, height);
gl.glMatrixMode(GL.GL_PROJECTION);
gl.glLoadIdentity();
gl.glRotatef(90, 0.0f, 0.0f, 1.0f);
glu.gluPerspective(60, (float) width / height, 1, 1000);
glu.gluLookAt(1.0f, 0.0f, 25.0f,
0.0f, 0.0f, 0.0f,
0.0f, 0.0f, 1.0f);
gl.glMatrixMode(GL.GL_MODELVIEW);
gl.glLoadIdentity();
}
public void displayChanged(GLAutoDrawable drawable, boolean modeChanged, boolean deviceChanged) {
}
private void loadGLTextures(GLAutoDrawable gldrawable) throws IOException {
TextureReader.Texture texture = null;
texture = TextureReader.readTexture("data/images/042.bmp");
GL gl = gldrawable.getGL();
//Create Nearest Filtered Texture
gl.glGenTextures(1, textures, 0);
gl.glBindTexture(GL.GL_TEXTURE_2D, textures[0]);
gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MAG_FILTER, GL.GL_LINEAR);
gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MIN_FILTER, GL.GL_LINEAR);
gl.glTexImage2D(GL.GL_TEXTURE_2D,
0,
3,
texture.getWidth(),
texture.getHeight(),
0,
GL.GL_RGB,
GL.GL_UNSIGNED_BYTE,
texture.getPixels());
}
}
The drawWater Class:
import com.sun.opengl.util.BufferUtil;
import javax.media.opengl.GL;
import java.nio.FloatBuffer;
import java.nio.IntBuffer;
/**
*
* @author Shane
*/
public class drawWater {
private Expr2 func; // The function that is being drawn.
private String functionInput;
private boolean version_1_5; // True if OpenGL 1.5 is available; set in setup().
private boolean dataIsValid; // Set to false whenever data needs to be recomputed;
// this is checked in draw() before drawing.
private int max;
/* Buffers to hold the points and normals for the surface. */
private FloatBuffer vBuf = BufferUtil.newFloatBuffer(201 * 201 * 3);
private FloatBuffer nBuf = BufferUtil.newFloatBuffer(201 * 201 * 3);
/* Buffers to hold the indices for drawing the surface and lines with glDrawElements*/
private IntBuffer surfaceIndexBuffer = BufferUtil.newIntBuffer(200 * 201 * 2);
private IntBuffer xLineIndexBuffer = BufferUtil.newIntBuffer(21 * 201);
private IntBuffer yLineIndexBuffer = BufferUtil.newIntBuffer(21 * 201);
/* VBO ID numbers for holding the data when OpenGL version is 1.5 or higher */
private int vertexVBO, normalVBO; // VBO IDs for surface data.
private int xLineVBO, yLineVBO, surfaceVBO; // VBO IDs for index data.
public drawWater() {
}
public void setup(GL gl, String equ) {
version_1_5 = gl.isExtensionAvailable("GL_VERSION_1_5");
if (gl.isExtensionAvailable("GL_VERSION_1_3")) {
gl.glEnable(GL.GL_MULTISAMPLE);
}
this.makeElementBuffers(); // Generate lists of indices for glDrawElements. This data never changes.
if (version_1_5) {
// Generate VBOs for the data, and fill the ones that are for index data with
// data from Java nio buffers. The VBOs for index data won't change again and
// so use GL.GL_STATIC_DRAW.
int[] ids = new int[5];
gl.glGenBuffers(5, ids, 0);
this.vertexVBO = ids[0];
this.normalVBO = ids[1];
this.xLineVBO = ids[2];
this.yLineVBO = ids[3];
this.surfaceVBO = ids[4];
gl.glBindBuffer(GL.GL_ARRAY_BUFFER, vertexVBO);
gl.glVertexPointer(3, GL.GL_FLOAT, 0, 0);
gl.glBindBuffer(GL.GL_ARRAY_BUFFER, normalVBO);
gl.glNormalPointer(GL.GL_FLOAT, 0, 0);
gl.glBindBuffer(GL.GL_ARRAY_BUFFER, 0);
gl.glBindBuffer(GL.GL_ELEMENT_ARRAY_BUFFER, surfaceVBO);
gl.glBufferData(GL.GL_ELEMENT_ARRAY_BUFFER, 4 * 2 * 200 * 201, surfaceIndexBuffer, GL.GL_STATIC_DRAW);
gl.glBindBuffer(GL.GL_ELEMENT_ARRAY_BUFFER, xLineVBO);
gl.glBufferData(GL.GL_ELEMENT_ARRAY_BUFFER, 4 * 21 * 201, xLineIndexBuffer, GL.GL_STATIC_DRAW);
gl.glBindBuffer(GL.GL_ELEMENT_ARRAY_BUFFER, yLineVBO);
gl.glBufferData(GL.GL_ELEMENT_ARRAY_BUFFER, 4 * 21 * 201, yLineIndexBuffer, GL.GL_STATIC_DRAW);
gl.glBindBuffer(GL.GL_ELEMENT_ARRAY_BUFFER, 0);
} else {
gl.glVertexPointer(3, GL.GL_FLOAT, 0, vBuf);
gl.glNormalPointer(GL.GL_FLOAT, 0, nBuf);
}
this.functionInput = equ;
this.func = new Expr2(equ);
this.dataIsValid = false; // Force recomputation of data with new graph definition.
}
public void draw(GL gl) {
if (func != null) {
gl.glMaterialfv(GL.GL_FRONT, GL.GL_AMBIENT_AND_DIFFUSE, new float[]{0.7f, 10.7f, 1}, 0);
gl.glMaterialfv(GL.GL_BACK, GL.GL_AMBIENT_AND_DIFFUSE, new float[]{10.8f, 0.8f, 0.5f}, 0);
if (!dataIsValid) {
this.computeSurfaceData();
if (version_1_5) {
// Set up VBOs for surface points and normals. Since these change
// pretty regularly, use GL.GL_DYNAMIC_DRAW.
gl.glBindBuffer(GL.GL_ARRAY_BUFFER, vertexVBO);
gl.glBufferData(GL.GL_ARRAY_BUFFER, 4 * 3 * 201 * 201, vBuf, GL.GL_DYNAMIC_DRAW);
gl.glBindBuffer(GL.GL_ARRAY_BUFFER, normalVBO);
gl.glBufferData(GL.GL_ARRAY_BUFFER, 4 * 3 * 201 * 201, nBuf, GL.GL_DYNAMIC_DRAW);
gl.glBindBuffer(GL.GL_ARRAY_BUFFER, 0);
}
}
gl.glEnableClientState(GL.GL_VERTEX_ARRAY);
gl.glEnableClientState(GL.GL_NORMAL_ARRAY);
this.drawSurface(gl); // Just draw the surface.
gl.glPolygonOffset(1, 1);
gl.glEnable(GL.GL_POLYGON_OFFSET_FILL);
this.drawSurface(gl);
gl.glDisable(GL.GL_POLYGON_OFFSET_FILL);
gl.glDisable(GL.GL_LIGHTING);
gl.glColor3f(0, 0, 0);
gl.glDisableClientState(GL.GL_NORMAL_ARRAY);
gl.glEnable(GL.GL_LIGHTING);
}
gl.glDisableClientState(GL.GL_VERTEX_ARRAY);
gl.glDisableClientState(GL.GL_NORMAL_ARRAY);
}
public void setIsValid(boolean bool) {
this.dataIsValid = bool;
}
public void setMax(int max) {
this.max = max;
}
public int getMax() {
return this.max;
}
public double getFuncVar(char var) {
return this.func.getVariable(var);
}
public void setFuncVar(char var, double value) {
this.func.setVariable(var, value);
}
public void setFunc(String func) {
this.func.parse(func);
}
public String getEqu() {
return this.functionInput;
}
private void makeElementBuffers() {
for (int i = 0; i < 201; i += 10) { // indices for drawing lines in x-direction
for (int j = 0; j < 201; j++) {
this.xLineIndexBuffer.put(201 * i + j);
}
}
for (int i = 0; i < 201; i += 10) { // indices for drawing lines in y-direction
for (int j = 0; j < 201; j++) {
this.yLineIndexBuffer.put(201 * j + i);
}
}
for (int i = 0; i < 200; i++) { // indices for drawing surface with GL_TRIANGLE_STRIPs
for (int j = 0; j < 201; j++) {
this.surfaceIndexBuffer.put(201 * (i + 1) + j);
this.surfaceIndexBuffer.put(201 * i + j);
}
}
this.xLineIndexBuffer.rewind();
this.yLineIndexBuffer.rewind();
this.surfaceIndexBuffer.rewind();
}
private void computeSurfaceData() {
double xmin = -5;
double xmax = 5;
double ymin = -5;
double ymax = 5;
double xRes = 200;
double yRes = 200;
float[] surfaceData = new float[301 * 3];
float[] normalData = new float[301 * 3];
double dx = (xmax - xmin) / xRes;
double dy = (ymax - ymin) / yRes;
for (int i = 0; i <= xRes; i++) {
int v = 0;
int n = 0;
double y1 = ymin + dy * i;
for (int j = 0; j <= yRes; j++) {
double x = xmin + dx * j;
this.func.setVariable('x', x);
this.func.setVariable('y', y1);
double z1 = this.func.value();
float[] normal1 = computeUnitNormal(x, y1);
surfaceData[v++] = (float) x;
surfaceData[v++] = (float) y1;
surfaceData[v++] = (float) z1;
normalData[n++] = normal1[0];
normalData[n++] = normal1[1];
normalData[n++] = normal1[2];
}
this.vBuf.put(surfaceData, 0, 201 * 3);
this.nBuf.put(normalData, 0, 201 * 3);
}
this.vBuf.rewind();
this.nBuf.rewind();
this.dataIsValid = true;
}
/**
* Draw the surface as a series of triangle strips.
*/
private void drawSurface(GL gl) {
if (version_1_5) {
gl.glBindBuffer(GL.GL_ELEMENT_ARRAY_BUFFER, surfaceVBO);
for (int i = 0; i < 200; i++) {
gl.glDrawElements(GL.GL_TRIANGLE_STRIP, 402, GL.GL_UNSIGNED_INT, 402 * i * 4);
}
gl.glBindBuffer(GL.GL_ELEMENT_ARRAY_BUFFER, 0);
} else {
for (int i = 0; i < 200; i++) {
this.surfaceIndexBuffer.position(402 * i);
gl.glDrawElements(GL.GL_TRIANGLE_STRIP, 402, GL.GL_UNSIGNED_INT, surfaceIndexBuffer);
}
}
}
/**
* Compute a unit normal to the graph of z = func(x,y).
* This is only an approximation, using nearby points instead
* of exact derivatives.
*/
private float[] computeUnitNormal(double x, double y) {
double epsilon = 0.00001;
this.func.setVariable('x', x);
this.func.setVariable('y', y);
double z = this.func.value();
this.func.setVariable('x', x + epsilon);
double z1 = func.value();
this.func.setVariable('x', x);
this.func.setVariable('y', y + epsilon);
double z2 = this.func.value();
// normal is (epsilon,0,z1-z) X (0,epsilon,z2-z)
double a = -epsilon * (z1 - z);
double b = -epsilon * (z2 - z);
double c = epsilon * epsilon;
double length = Math.sqrt(a * a + b * b + c * c);
if (Double.isNaN(length) || Double.isInfinite(length)) {
return new float[]{0, 0, 1};
} else {
return new float[]{(float) (a / length), (float) (b / length), (float) (c / length)};
}
}
}
Any help would be a appreciated. This is part of my final year project in college. So any help would be great.
The calls to glVertexPointer/glNormalPointer are not done per draw, but per setup.
So, when you create the 2 drawWater objects, the last one leaves its vertex and normal data bound to the GL, which all the calls to glDrawElements will use.
You need to modify the code so that glVertexPointer/glNormalPointer (along with the glBindBuffer calls that accompany them) are done per draw.
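
In outline, drawWater.draw() would start by (re)binding this instance's own data, something like this for the 1.5 path (a sketch based on the setup() code above, not a tested change):

public void draw(GL gl) {
    if (version_1_5) {
        // Point the fixed-function arrays at THIS object's VBOs each frame,
        // so each drawWater instance renders its own vertices and normals.
        gl.glBindBuffer(GL.GL_ARRAY_BUFFER, vertexVBO);
        gl.glVertexPointer(3, GL.GL_FLOAT, 0, 0);
        gl.glBindBuffer(GL.GL_ARRAY_BUFFER, normalVBO);
        gl.glNormalPointer(GL.GL_FLOAT, 0, 0);
        gl.glBindBuffer(GL.GL_ARRAY_BUFFER, 0);
    } else {
        gl.glVertexPointer(3, GL.GL_FLOAT, 0, vBuf);
        gl.glNormalPointer(GL.GL_FLOAT, 0, nBuf);
    }
    // ...rest of the existing draw() body unchanged...
}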

How to draw a directed arrow line in Java?

I want to draw a directed arrow line through Java.
At present, I am using java.awt.Line2D.Double class to draw a line
g2.setStroke(new BasicStroke(2.0f, BasicStroke.CAP_BUTT, BasicStroke.JOIN_BEVEL)); // g2 is an instance of Graphics2D
g2.draw(new Line2D.Double(x1,y1,x2,y2));
But only the line appears and no directed arrow appears. BasicStroke.JOIN_BEVEL is used to draw a directed arrow. It is applied when two line segments meet.
The line I am drawing meets the border of a rectangle but no directed arrow is drawn. Only a simple line is drawn.
Is there anything I am missing?
Although Pete's post is awesomely comprehensive, I'm using this method to draw a very simple line with a little triangle at its end.
// create an AffineTransform
// and a triangle centered on (0,0) and pointing downward
// somewhere outside Swing's paint loop
AffineTransform tx = new AffineTransform();
Line2D.Double line = new Line2D.Double(0,0,100,100);
Polygon arrowHead = new Polygon();
arrowHead.addPoint( 0,5);
arrowHead.addPoint( -5, -5);
arrowHead.addPoint( 5,-5);
// [...]
private void drawArrowHead(Graphics2D g2d) {
tx.setToIdentity();
double angle = Math.atan2(line.y2-line.y1, line.x2-line.x1);
tx.translate(line.x2, line.y2);
tx.rotate((angle-Math.PI/2d));
Graphics2D g = (Graphics2D) g2d.create();
g.setTransform(tx);
g.fill(arrowHead);
g.dispose();
}
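For completeness, a plausible way to call it from a Swing paint loop (assuming the fields above live in a JComponent subclass; an untested sketch):

@Override
protected void paintComponent(Graphics g) {
    super.paintComponent(g);
    Graphics2D g2d = (Graphics2D) g;
    g2d.setRenderingHint(RenderingHints.KEY_ANTIALIASING,
                         RenderingHints.VALUE_ANTIALIAS_ON);
    g2d.draw(line);     // the stalk
    drawArrowHead(g2d); // the triangle, rotated to the line's angle at (x2, y2)
}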
The bevel is drawn between segments in a polyline if they meet at certain angles. It has no bearing if you are drawing a line which happens to be drawn near some other pixels of a certain colour: once you've drawn the rectangle, the Graphics object doesn't know about the rectangle; it (in effect) only holds the pixels (or rather, the image or OS window holds the pixels).
To draw a simple arrow, draw a line for the stalk as you're doing, then a polyline for the vee. Nicer-looking arrows have curved sides and are filled.
You probably don't want to use bevel for the arrow head, as bevels are flat; instead, use the mitre option:
import java.awt.*;
import java.awt.geom.*;
import javax.swing.*;
public class BevelArrows
{
public static void main ( String...args )
{
SwingUtilities.invokeLater ( new Runnable () {
BevelArrows arrows = new BevelArrows();
@Override
public void run () {
JFrame frame = new JFrame ( "Bevel Arrows" );
frame.add ( new JPanel() {
public void paintComponent ( Graphics g ) {
arrows.draw ( ( Graphics2D ) g, getWidth(), getHeight() );
}
}
, BorderLayout.CENTER );
frame.setSize ( 800, 400 );
frame.setDefaultCloseOperation ( JFrame.EXIT_ON_CLOSE );
frame.setVisible ( true );
}
} );
}
interface Arrow {
void draw ( Graphics2D g );
}
Arrow[] arrows = { new LineArrow(), new CurvedArrow() };
void draw ( Graphics2D g, int width, int height )
{
g.setRenderingHint ( RenderingHints.KEY_ANTIALIASING, RenderingHints.VALUE_ANTIALIAS_ON );
g.setColor ( Color.WHITE );
g.fillRect ( 0, 0, width, height );
for ( Arrow arrow : arrows ) {
g.setColor ( Color.ORANGE );
g.fillRect ( 350, 20, 20, 280 );
g.setStroke ( new BasicStroke ( 20.0f, BasicStroke.CAP_BUTT, BasicStroke.JOIN_BEVEL ) );
g.translate ( 0, 60 );
arrow.draw ( g );
g.setStroke ( new BasicStroke ( 20.0f, BasicStroke.CAP_BUTT, BasicStroke.JOIN_MITER ) );
g.translate ( 0, 100 );
arrow.draw ( g );
g.setStroke ( new BasicStroke ( 20.0f, BasicStroke.CAP_BUTT, BasicStroke.JOIN_ROUND ) );
g.translate ( 0, 100 );
arrow.draw ( g );
g.translate ( 400, -260 );
}
}
static class LineArrow implements Arrow
{
public void draw ( Graphics2D g )
{
// where the control point for the intersection of the V needs calculating
// by projecting where the ends meet
float arrowRatio = 0.5f;
float arrowLength = 80.0f;
BasicStroke stroke = ( BasicStroke ) g.getStroke();
float endX = 350.0f;
float veeX;
switch ( stroke.getLineJoin() ) {
case BasicStroke.JOIN_BEVEL:
// IIRC, bevel varies system to system, this is approximate
veeX = endX - stroke.getLineWidth() * 0.25f;
break;
default:
case BasicStroke.JOIN_MITER:
veeX = endX - stroke.getLineWidth() * 0.5f / arrowRatio;
break;
case BasicStroke.JOIN_ROUND:
veeX = endX - stroke.getLineWidth() * 0.5f;
break;
}
// vee
Path2D.Float path = new Path2D.Float();
path.moveTo ( veeX - arrowLength, -arrowRatio*arrowLength );
path.lineTo ( veeX, 0.0f );
path.lineTo ( veeX - arrowLength, arrowRatio*arrowLength );
g.setColor ( Color.BLUE );
g.draw ( path );
// stem for exposition only
g.setColor ( Color.YELLOW );
g.draw ( new Line2D.Float ( 50.0f, 0.0f, veeX, 0.0f ) );
// in practice, move stem back a bit as rounding errors
// can make it poke through the sides of the Vee
g.setColor ( Color.RED );
g.draw ( new Line2D.Float ( 50.0f, 0.0f, veeX - stroke.getLineWidth() * 0.25f, 0.0f ) );
}
}
static class CurvedArrow implements Arrow
{
// to draw a nice curved arrow, fill a V shape rather than stroking it with lines
public void draw ( Graphics2D g )
{
// as we're filling rather than stroking, control point is at the apex,
float arrowRatio = 0.5f;
float arrowLength = 80.0f;
BasicStroke stroke = ( BasicStroke ) g.getStroke();
float endX = 350.0f;
float veeX = endX - stroke.getLineWidth() * 0.5f / arrowRatio;
// vee
Path2D.Float path = new Path2D.Float();
float waisting = 0.5f;
float waistX = endX - arrowLength * 0.5f;
float waistY = arrowRatio * arrowLength * 0.5f * waisting;
float arrowWidth = arrowRatio * arrowLength;
path.moveTo ( veeX - arrowLength, -arrowWidth );
path.quadTo ( waistX, -waistY, endX, 0.0f );
path.quadTo ( waistX, waistY, veeX - arrowLength, arrowWidth );
// end of arrow is pinched in
path.lineTo ( veeX - arrowLength * 0.75f, 0.0f );
path.lineTo ( veeX - arrowLength, -arrowWidth );
g.setColor ( Color.BLUE );
g.fill ( path );
// move stem back a bit
g.setColor ( Color.RED );
g.draw ( new Line2D.Float ( 50.0f, 0.0f, veeX - arrowLength * 0.5f, 0.0f ) );
}
}
}
This is my approach, using plain math only:
/**
* Draw an arrow line between two points.
* @param g the graphics component.
* @param x1 x-position of first point.
* @param y1 y-position of first point.
* @param x2 x-position of second point.
* @param y2 y-position of second point.
* @param d the width of the arrow.
* @param h the height of the arrow.
*/
private void drawArrowLine(Graphics g, int x1, int y1, int x2, int y2, int d, int h) {
int dx = x2 - x1, dy = y2 - y1;
double D = Math.sqrt(dx*dx + dy*dy);
double xm = D - d, xn = xm, ym = h, yn = -h, x;
double sin = dy / D, cos = dx / D;
x = xm*cos - ym*sin + x1;
ym = xm*sin + ym*cos + y1;
xm = x;
x = xn*cos - yn*sin + x1;
yn = xn*sin + yn*cos + y1;
xn = x;
int[] xpoints = {x2, (int) xm, (int) xn};
int[] ypoints = {y2, (int) ym, (int) yn};
g.drawLine(x1, y1, x2, y2);
g.fillPolygon(xpoints, ypoints, 3);
}
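For example, called from a hypothetical paintComponent (coordinates and head dimensions are arbitrary):
@Override
protected void paintComponent(Graphics g) {
super.paintComponent(g);
// arrow from (40, 40) to (200, 120); d=8 and h=5 size the head
drawArrowLine(g, 40, 40, 200, 120, 8, 5);
}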
In the past, I've written the following method to create an arrow shape, which I can then fill with ((Graphics2D) g).fill(shape);
public static Shape createArrowShape(Point fromPt, Point toPt) {
Polygon arrowPolygon = new Polygon();
arrowPolygon.addPoint(-6,1);
arrowPolygon.addPoint(3,1);
arrowPolygon.addPoint(3,3);
arrowPolygon.addPoint(6,0);
arrowPolygon.addPoint(3,-3);
arrowPolygon.addPoint(3,-1);
arrowPolygon.addPoint(-6,-1);
Point midPoint = midpoint(fromPt, toPt);
double rotate = Math.atan2(toPt.y - fromPt.y, toPt.x - fromPt.x);
AffineTransform transform = new AffineTransform();
transform.translate(midPoint.x, midPoint.y);
double ptDistance = fromPt.distance(toPt);
double scale = ptDistance / 12.0; // 12 because it's the length of the arrow polygon.
transform.scale(scale, scale);
transform.rotate(rotate);
return transform.createTransformedShape(arrowPolygon);
}
private static Point midpoint(Point p1, Point p2) {
return new Point((int)((p1.x + p2.x)/2.0),
(int)((p1.y + p2.y)/2.0));
}
In case you want a non-programmatic arrow (i.e. for text purposes), a fast way is to use HTML markup for the arrow: just put the HTML inside the component's .setText() method. On Java 1.8u202 this works fine for me.
myLabel.setText("<html><body>←</body></html>");
The ← character gives a left-pointing arrow; HTML codes for the other arrow directions are listed on This Website.
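A plain Unicode escape achieves the same without any HTML; this is standard Java string syntax rather than anything library-specific:
myLabel.setText("\u2190"); // LEFTWARDS ARROW (U+2190), renders as ←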
void drawArrow(Graphics g1, double x1, double y1, double x2, double y2 ) {
Graphics2D ga = (Graphics2D) g1.create();
ga.drawLine((int)x1, (int)y1, (int)x2, (int)y2);
double l = Math.sqrt(Math.pow((x2 - x1), 2) + Math.pow((y2 - y1), 2)); // line length
double d = l / 10; // arrowhead distance from the end of the line; use your own value if preferred
double newX = ((x2 + (((x1 - x2) / (l) * d)))); // x of the arrowhead position, d back from the end of the line
double newY = ((y2 + (((y1 - y2) / (l) * d)))); // y of the arrowhead position, d back from the end of the line
double dx = x2 - x1, dy = y2 - y1;
double angle = (Math.atan2(dy, dx)); // angle (radians) between our line and the x-axis, counter-clockwise
angle = (-1) * Math.toDegrees(angle); // convert to degrees and flip the sign to clockwise, for easier reasoning
if (angle < 0) {
angle = 360 + angle; // convert negative degrees to a positive degree
}
angle = (-1) * angle; // flip back to counter-clockwise
angle = Math.toRadians(angle); // convert degrees back to radians
// (this degree round-trip nets out to rotating by the original atan2 angle)
AffineTransform at = new AffineTransform();
at.translate(newX, newY); // move the origin to the arrowhead position on the line
at.rotate(angle);
ga.transform(at);
Polygon arrowHead = new Polygon();
arrowHead.addPoint(5, 0);
arrowHead.addPoint(-5, 5);
arrowHead.addPoint(-2, -0);
arrowHead.addPoint(-5, -5);
ga.fill(arrowHead);
ga.drawPolygon(arrowHead);
}
This is the code from Vicente Reig's great answer, simplified a little and packaged as a nice utility class.
import java.awt.*;
import java.awt.geom.AffineTransform;
public class Arrow
{
private final Polygon arrowHead = new Polygon ();
/**
* Create an arrow.
*
* @see https://stackoverflow.com/questions/2027613/how-to-draw-a-directed-arrow-line-in-java
*
* @param size Size of the arrow to draw.
*/
public Arrow (int size)
{
// create a triangle centered on (0,0) and pointing right
arrowHead.addPoint (size, 0);
arrowHead.addPoint (-size, -size);
arrowHead.addPoint (-size, size);
//arrowHead.addPoint (0, 0); // Another style
}
/**
* Draw the arrow at the end of a line segment. Drawing the line segment must be done by the caller, using whatever
* stroke and color is required.
*/
public void drawArrowHead (Graphics2D g, double x0, double y0, double x1, double y1)
{
final AffineTransform tx = AffineTransform.getTranslateInstance (x1, y1);
tx.rotate (Math.atan2 (y1 - y0, x1 - x0));
g.fill (tx.createTransformedShape (arrowHead));
}
}
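A minimal usage sketch (coordinates, stroke, and size are illustrative):
Graphics2D g2 = (Graphics2D) g;
g2.setStroke(new BasicStroke(2f));
g2.drawLine(20, 20, 150, 80); // the segment itself
new Arrow(10).drawArrowHead(g2, 20, 20, 150, 80); // head at (150, 80), aimed along the segment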
