I'm making a libGDX game where the user can create a distance joint and a revolute joint on their own. What I do is: whenever two bodies have been touched, they are both added to an ArrayList, and then when a button is touched a joint is created between them. The problem is that the joints are always anchored at the bodies' centers, so I was wondering if there is a way to get the location on each body where it was touched, and then set those locations as anchorPointA and anchorPointB.
The first idea that comes to mind is a gesture listener, for example GestureDetector.GestureAdapter. You implement the touchDown method, where you get the x, y touch positions. To see whether a body was touched, you might use the Vector2.dst() method, but don't forget to unproject the screen coordinates to world coordinates first if needed.
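A minimal sketch of that idea is below, assuming an OrthographicCamera called camera and a collection of candidate bodies called bodies; TOUCH_RADIUS is a made-up tolerance you would tune to your world scale. The stored world point could later be passed to DistanceJointDef.initialize() (or converted with body.getLocalPoint()) so the anchors are no longer at the centers.

import com.badlogic.gdx.graphics.OrthographicCamera;
import com.badlogic.gdx.input.GestureDetector;
import com.badlogic.gdx.math.Vector3;
import com.badlogic.gdx.physics.box2d.Body;

public class BodyTouchListener extends GestureDetector.GestureAdapter {
    private static final float TOUCH_RADIUS = 0.5f; // world units, tune to your scale
    private final OrthographicCamera camera;
    private final Iterable<Body> bodies;

    public BodyTouchListener(OrthographicCamera camera, Iterable<Body> bodies) {
        this.camera = camera;
        this.bodies = bodies;
    }

    @Override
    public boolean touchDown(float x, float y, int pointer, int button) {
        // Convert screen coordinates to world coordinates.
        Vector3 world = camera.unproject(new Vector3(x, y, 0));
        for (Body body : bodies) {
            if (body.getPosition().dst(world.x, world.y) < TOUCH_RADIUS) {
                // Remember both the body and the exact world point that was touched,
                // so it can be used later as the joint anchor instead of the center.
            }
        }
        return false;
    }
}

You would register it with Gdx.input.setInputProcessor(new GestureDetector(new BodyTouchListener(camera, bodies)));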
Another idea might be to add an InputListener to your actor (which is connected to your body), but I haven't tried it yet.
Hi, I have successfully rendered the Utah Teapot in OpenGL ES 2.0. Currently I am trying to implement touch events so that whenever I touch the teapot it will explode.
My question is: where should I implement the touch event, in the Renderer class or in the GLSurfaceView? And how do I make the teapot explode? I am new to Android, so any suggestion is highly appreciated. Thank you in advance.
About the touch event:
You would need to show the system/architecture you currently have. Generally you would have a separate class that controls the scene. In your case you may only have a single object that contains the teapot and all the values needed to draw it, move it, explode it, and so on. This class should be initialized and owned by the surface view or its parent; in both cases the surface view has access to the scene. If the renderer is responsible for drawing, then the surface view is the owner of the renderer and would call something similar to this.myRenderer.drawScene(this.myScene); if the renderer controls the draw initialization itself, then the surface view must forward it access to the scene, e.g. this.myRenderer.setScene(this.myScene). Either way, both classes end up with access to the scene, and with it to the teapot object.
To handle the touch event, check what the nearest place is where you can intercept it. If the surface view can intercept these calls, implement it there; if not, the owner of the surface view certainly can, in which case the owner would call something like this.mySurfaceView.handleTouchEvent(touch).
Now the surface view can optionally do some checks to see if the pot was hit and begin the explosion procedure. This might be as simple as calling a method on the teapot, e.g. this.myScene.teapotObject.explode().
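A rough sketch of that ownership chain is below; Scene, Teapot, MyRenderer and the method names are assumptions about your setup, not a fixed API.

class Teapot {
    void explode() { /* kick off the explosion animation here */ }
}

class Scene {
    final Teapot teapotObject = new Teapot();
}

class MyRenderer implements android.opengl.GLSurfaceView.Renderer {
    private Scene scene;
    void setScene(Scene scene) { this.scene = scene; }
    public void onSurfaceCreated(javax.microedition.khronos.opengles.GL10 gl,
                                 javax.microedition.khronos.egl.EGLConfig config) { }
    public void onSurfaceChanged(javax.microedition.khronos.opengles.GL10 gl, int w, int h) { }
    public void onDrawFrame(javax.microedition.khronos.opengles.GL10 gl) { /* draw this.scene */ }
}

class MyGLSurfaceView extends android.opengl.GLSurfaceView {
    private final Scene myScene = new Scene();      // owned by the surface view

    MyGLSurfaceView(android.content.Context context) {
        super(context);
        setEGLContextClientVersion(2);
        MyRenderer renderer = new MyRenderer();
        renderer.setScene(myScene);                 // renderer and view now share the scene
        setRenderer(renderer);
    }

    @Override
    public boolean onTouchEvent(android.view.MotionEvent event) {
        if (event.getAction() == android.view.MotionEvent.ACTION_DOWN) {
            // Optionally hit-test event.getX()/getY() against the teapot first.
            myScene.teapotObject.explode();
        }
        return true;
    }
}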
About the explosion itself:
There are many ways to explode an object, and generally none of them are easy. The minimum would most likely be a system where your vertex buffer is split into smaller chunks, and each chunk is then displaced with an animation while exploding.
Even creating the animation might be a hard procedure. One way is to create an interpolation where you have an animation start time startTimeStamp, its duration, an object start position startPosition and an object end position endPosition. Then the current position is retrieved by currentPosition = startPosition + (endPosition - startPosition)*(currentTimeStamp - startTimeStamp)/duration. Another way is to implement physics where an object moves on every frame. In this case you define the object's speed and then on every frame you call teapot.chunk.move(1.0/60.0), which does this.position += this.speed*dt. You can then add gravity to manipulate the speed, or collisions...
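A compact sketch of both approaches for a single chunk, in one dimension; the Chunk class and its field names are only illustrative.

class Chunk {
    // Interpolation approach.
    float startPosition, endPosition;
    long startTimeStamp;            // ms
    float duration;                 // ms

    float interpolatedPosition(long currentTimeStamp) {
        float t = (currentTimeStamp - startTimeStamp) / duration;  // 0..1 over the animation
        if (t > 1f) t = 1f;
        return startPosition + (endPosition - startPosition) * t;
    }

    // Physics approach: call once per frame, e.g. move(1f / 60f).
    float position, speed;
    static final float GRAVITY = -9.81f;

    void move(float dt) {
        speed += GRAVITY * dt;      // gravity manipulates the speed
        position += speed * dt;     // speed manipulates the position
    }
}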
I'm using Java to create a simple game in 2D.
However, I would like to get the DX of the mouse so that I can move a certain object to a different place with my mouse in my game.
When I was learning a bit of LWJGL, there was a method called
Mouse.getDX()/~.getDY()
It returns the movement on the x/y axis since the last time it was called.
But I'm not sure how to get such a value without using other libraries like LWJGL. I only know how to get the mouse position using the MouseListener interface. Or is there anything I've done wrong? Thanks if you can answer :)
See this StackOverflow: Get Mouse Position
It explains how to get the mouse position and links to the Java API for more details.
Or, use this tutorial to write your own listener, storing the last known position to obtain the delta: https://docs.oracle.com/javase/tutorial/uiswing/events/mousemotionlistener.html
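A small sketch of the "store the last known position" approach with a plain MouseMotionListener; the class and field names are placeholders.

import java.awt.event.MouseEvent;
import java.awt.event.MouseMotionListener;

public class MouseDeltaTracker implements MouseMotionListener {
    private int lastX, lastY;
    private int dx, dy;            // movement since the previous event

    @Override
    public void mouseMoved(MouseEvent e) {
        dx = e.getX() - lastX;
        dy = e.getY() - lastY;
        lastX = e.getX();
        lastY = e.getY();
    }

    @Override
    public void mouseDragged(MouseEvent e) {
        mouseMoved(e);             // treat drags the same way
    }

    public int getDX() { return dx; }
    public int getDY() { return dy; }
}

Register it with yourComponent.addMouseMotionListener(new MouseDeltaTracker()). Unlike LWJGL's Mouse.getDX(), the delta here is per mouse event; if you want "since last call" semantics, reset dx and dy to zero after reading them.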
I have various bodies rotated at some angle with the help of Box2D in libGDX. What I want is to destroy a body when I click on it, but the problem is that I am not able to get the area of the body so that I can check whether my touch point lies inside it or not. I tried using an Actor and its hit() method, but it works only if I don't rotate the body; as far as I know, once I rotate the body its bounds are not rotated. So how can I check a touch event on a Body?
Thanks in advance.
See the touchDown handler in the libGDX Box2DTest. They use World.QueryAABB (AABB is "Axis-Aligned Bounding Box") to query which objects intersect a small bounding box around the touch point, and then use the query callback to verify that the actual touch point intersects the object in question.
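A hedged sketch of that pattern; camera and world are assumed to already exist, and the 0.1f half-size of the query box is arbitrary.

import com.badlogic.gdx.graphics.OrthographicCamera;
import com.badlogic.gdx.math.Vector3;
import com.badlogic.gdx.physics.box2d.Body;
import com.badlogic.gdx.physics.box2d.Fixture;
import com.badlogic.gdx.physics.box2d.QueryCallback;
import com.badlogic.gdx.physics.box2d.World;

public class TouchDestroyer {
    private final World world;
    private final OrthographicCamera camera;
    private Body hitBody;

    public TouchDestroyer(World world, OrthographicCamera camera) {
        this.world = world;
        this.camera = camera;
    }

    public boolean touchDown(int screenX, int screenY) {
        final Vector3 point = camera.unproject(new Vector3(screenX, screenY, 0));
        hitBody = null;

        QueryCallback callback = new QueryCallback() {
            @Override
            public boolean reportFixture(Fixture fixture) {
                // testPoint respects the body's rotation, unlike an Actor's rectangular bounds.
                if (fixture.testPoint(point.x, point.y)) {
                    hitBody = fixture.getBody();
                    return false;   // stop the query, we found a hit
                }
                return true;        // keep looking
            }
        };

        // Query a tiny box around the touch point.
        world.QueryAABB(callback, point.x - 0.1f, point.y - 0.1f,
                                  point.x + 0.1f, point.y + 0.1f);

        if (hitBody != null) {
            world.destroyBody(hitBody);  // in real code, defer this until after world.step()
            return true;
        }
        return false;
    }
}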
I'm implementing a physics game powered by AndEngine with Box2D.
Suppose there is a ball falling vertically from above. The ball collides with another object and changes its direction. Now, after the collision, the ball should spin/rotate in the air, right? So I wanted to know whether I need to do the calculation myself (and how?) using the setAngularVelocity function, or whether Box2D can do it automatically.
I hope I expressed myself correctly. Thanks for the help.
No... you don't need to do any calculations. When you create the physics connector for your body like this:
public PhysicsConnector(final IShape pShape, final Body pBody, final boolean pUdatePosition, final boolean pUpdateRotation) {
If you set pUpdateRotation to true, you will see your body rotating; if you don't want to see that, set it to false. But this only disables updating the sprite on the screen: the Body is still kept in the physics world, so it will still be rotating, the rotation just isn't visible.
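For reference, typical registration looks like the sketch below; sprite, body and physicsWorld are assumed to already exist in your scene setup, and exact package names depend on your AndEngine version.

// pUpdatePosition = true: the sprite follows the body's position.
// pUpdateRotation = true: the sprite follows the body's rotation.
physicsWorld.registerPhysicsConnector(
        new PhysicsConnector(sprite, body, true, true));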
For anyone else reading this, another reason your sprite might not be rotating is if the friction of the fixture/fixtures is set to 0. (think sliding on ice).
I am facing a strange issue and I'm not sure why it is happening.
I have a Java-based Activity which has a LinearLayout. This LinearLayout contains two GLSurfaceViews. All the associated methods of GLSurfaceView, like OnDraw, SurfaceChanged, etc., forward the call down to the JNI layer. Inside the JNI layer I am drawing a cube using OpenGL ES. I have also created a touch listener and associated it with the first GLSurfaceView. Once I get a touch event I forward the call to the JNI layer and randomly rotate the first cube.
The problem is that when I rotate my first cube, both cubes rotate by exactly the same angle. I have debugged this issue for the last four hours and I am pretty sure there is nothing wrong with my logic. But for some unknown reason, when I make a change in one GLSurfaceView the other cube changes automatically.
Any ideas? Similar issues? Guess?
Update
I am using the same context (i.e. my Activity) for both GLSurfaceViews. Basically I have a class in C++ which draws a cube through OpenGL ES. I am successfully creating two cubes and displaying them simultaneously. Both cubes have different textures on them, which I pass via the Java layer. My C++ class has a method which randomly rotates the cube. The problem is that if I call the method of one cube to rotate it, the other automatically rotates by the same angle, no matter what I do.
Without your code, I'd guess you are initializing your GLSurfaceViews using the same context. When sharing a context, changing one will change the other because they share the same GL10 instance in the Renderer. I don't program on Android, but in general you'd use multiple "viewports" to display different things.
Say that your first GLSurfaceView is on the left half of your screen, and the second is on the right half. One idea is to check which side the x, y coordinates of the MotionEvent belong to, and then pass the rotation and translation accordingly.
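A sketch of that check; the half-screen split and the rotate methods are assumptions about your layout and JNI interface.

import android.view.MotionEvent;
import android.view.View;

public class SplitTouchRouter implements View.OnTouchListener {
    private final int screenWidth;

    public SplitTouchRouter(int screenWidth) {
        this.screenWidth = screenWidth;
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getRawX() < screenWidth / 2f) {
            rotateFirstCube();     // touch landed on the left half
        } else {
            rotateSecondCube();    // touch landed on the right half
        }
        return true;
    }

    // Hypothetical hooks that would forward to your JNI layer.
    private void rotateFirstCube()  { /* nativeRotateCube(0); */ }
    private void rotateSecondCube() { /* nativeRotateCube(1); */ }
}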
Issue solved, there was a logical mistake in my code. Sorry for the inconvenience.