I'm implementing a physics game powered by AndEngine with Box2D.
Suppose there is a ball falling vertically from above.
The ball collides with another object and changes its direction.
Now, after the collision, the ball should spin/rotate in the air, right?
So I wanted to know whether I need to do the calculation myself (and how?) using the setAngularVelocity function,
or whether Box2D can do that automatically.
I hope I expressed myself correctly.
Thanks for the help.
No, you don't need to do any calculations. When you create the physics connector for your body, like this:
public PhysicsConnector(final IShape pShape, final Body pBody, final boolean pUpdatePosition, final boolean pUpdateRotation) {
if you set pUpdateRotation to true, you will see your body rotating. If you don't want to see it, set it to false. Note that this only disables updating the sprite's rotation on screen; the body still exists in the physics simulation, so it keeps rotating, the rotation just isn't visible.
For anyone else reading this, another reason your sprite might not be rotating is that the friction of the fixture(s) is set to 0 (think sliding on ice).
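To make both points concrete, here is a rough sketch of how the ball could be set up so that the spin is simulated and drawn, with non-zero friction on the fixture. Names like physicsWorld and ballSprite are placeholders and the fixture values are just examples:

// Density, elasticity and a non-zero friction, so collisions can impart spin.
final FixtureDef ballFixtureDef = PhysicsFactory.createFixtureDef(1.0f, 0.5f, 0.3f);

// Dynamic circle body for the ball sprite.
final Body ballBody = PhysicsFactory.createCircleBody(
        physicsWorld, ballSprite, BodyType.DynamicBody, ballFixtureDef);

// pUpdatePosition = true and pUpdateRotation = true, so the sprite follows
// both the position and the rotation that Box2D computes for the body.
physicsWorld.registerPhysicsConnector(
        new PhysicsConnector(ballSprite, ballBody, true, true));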
I'm currently developing an Air Hockey game in Java, using libgdx.
The point of the game is to move the handle using touch or mouse, depending on the platform, and use it to hit the puck so that it moves around.
I've searched around for a bit but couldn't find a suitable solution. I know there are methods like gesture listeners, touchDragged, setTransform, applyForce, etc.
Right now, in order to test it, I'm using setLinearVelocity, so the body gains velocity towards the mouse click. This way, when the handle touches the puck it does exactly what I want: the contact listener works and the puck moves in the correct direction. But obviously I don't want to use this method to move the handle.
So how can I move it using touch while making sure that the handle still hits the puck?
You can set the body position using setTransform. In order to move it with the mouse/finger you can implement the touchDragged method of InputAdapter, like this:
Gdx.input.setInputProcessor(new InputAdapter() {
    @Override
    public boolean touchDragged(int screenX, int screenY, int pointer) {
        // Screen y grows downwards, so flip it to match the world's y axis.
        body.setTransform(screenX, Gdx.graphics.getHeight() - screenY, 0);
        return true;
    }
});
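One caveat worth adding: Box2D positions are in world units, so if your camera does not map 1:1 to screen pixels you would typically unproject the touch first. A rough sketch, assuming an OrthographicCamera named camera, would replace the body of touchDragged with:

// Convert screen coordinates to world coordinates before moving the body.
Vector3 worldTouch = new Vector3(screenX, screenY, 0);
camera.unproject(worldTouch);
body.setTransform(worldTouch.x, worldTouch.y, 0);
return true;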
Also, I think you should define your stick as a kinematic body:
Like static bodies, they do not react to forces, but like dynamic
bodies, they do have the ability to move. Kinematic bodies are great
for things where you, the programmer, want to be in full control of a
body's motion, such as a moving platform in a platform game.
Out of curiosity I made a test myself, with the stick as a kinematic body and the ball dynamic. As expected, the stick is not affected by any forces.
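For reference, a minimal sketch of how the handle might be defined as a kinematic body (the world, radius and starting position are assumed):

// The handle is kinematic: it ignores forces, but it can be moved via
// setTransform/setLinearVelocity and it still pushes dynamic bodies like the puck.
BodyDef handleDef = new BodyDef();
handleDef.type = BodyDef.BodyType.KinematicBody;
handleDef.position.set(startX, startY);
Body handleBody = world.createBody(handleDef);

CircleShape shape = new CircleShape();
shape.setRadius(handleRadius);
handleBody.createFixture(shape, 1.0f); // density is not used for kinematic bodies
shape.dispose();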
I'm making a libgdx game where the user can create a distance joint and a revolute joint on their own. Whenever two bodies have been touched they are both added to an ArrayList, and when a button is touched a joint is created between them. The problem is that the joints are always anchored at the centers of the bodies, so I was wondering if there is a way to get the location on each body where it was touched, and then use those locations as anchorPointA and anchorPointB.
The first idea I have is a gesture listener, see for example GestureDetector.GestureAdapter. You then implement the touchDown method, where you get the x, y touch positions. To see whether a body is touched, you might use the Vector2.dst() method, but don't forget to unproject the coordinates if you need to.
Another idea might be to add an InputListener to your actor (which is connected to your body), but I haven't tried it yet.
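For the joint itself, here is a rough sketch of how the touched points could become the anchors, assuming you already have the two bodies and the two touch positions (touchA, touchB) unprojected into world coordinates:

// initialize() takes world-space anchor points and converts them into local
// anchors on each body, so the joint attaches where the bodies were touched.
DistanceJointDef jointDef = new DistanceJointDef();
jointDef.initialize(bodyA, bodyB, touchA, touchB);
jointDef.collideConnected = true;
Joint joint = world.createJoint(jointDef);

// For the revolute joint, RevoluteJointDef.initialize(bodyA, bodyB, anchor)
// takes a single world-space anchor point instead.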
I am developing my first libgdx 3D game. So far I can move around in a maze-like (hardcoded) world and collision detection works. I also have some enemies with working A* pathfinding.
I also loaded my first (pretty ugly) Blender model, using fbx-conv to get a .g3db file. For some reason the model lies on the floor instead of standing. Maybe I had some wrong settings when I exported it as .fbx.
To fix that I tried to rotate() it around the z axis by 90 degrees by calling:
modelInstance.transform.rotate(Vector3.Z, 90)
in the show() method of my Screen, after loading the Model and instantiating my ModelInstance (at a given position). For some reason it did not rotate. Then I put the rotate call in render(delta), thinking that it would now rotate 90 degrees every render loop, but instead it was standing still, like it should.
Okay, but now I want the modelInstance to rotate to face where it is actually looking, meaning it should rotate depending on my enemy's Vector3 direction.
I am already setting its position with modelInstance.transform.setToTranslation(enemie.getPosition()), which works perfectly. So I thought I could also use modelInstance.transform.setToRotation(Vector3 v1, Vector3 v2), with v1 = enemie.getPosition() and v2 = enemie.getPosition().add(enemie.getDirection()). Note that the position vector is not used directly, as it would change its values inside the add() method.
Doing this, I don't see the object anymore, which means its position is wrong too.
Why is this happening?
And how can I rotate my modelInstance using the direction vector?
Thanks a lot.
I solved this with Xoppa's help. The problem was:
I used setToTranslation to move my Model to a given position, but this also resets the rotation.
I misunderstood the setToRotation(Vector3, Vector3) method.
So the solution was to do the setToTranslation first, and then use setToRotation(Vector3 direction, Vector3 face), where direction is the direction in which my Model is looking and face is the face that should look in this direction, in my case Vector3.X.
Hope it helps someone else.
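For reference, a minimal sketch of one way the two calls can be combined so that neither resets the other (enemyPosition and enemyDirection are assumed to be the enemy's world position and normalized facing direction):

// setToRotation(...) resets the matrix to a pure rotation that maps the model's
// local X axis onto the facing direction; setTranslation(...), as opposed to
// setToTranslation(...), then only sets the translation part and keeps the rotation.
modelInstance.transform
        .setToRotation(Vector3.X, enemyDirection)
        .setTranslation(enemyPosition);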
Worst case scenario, you could modify the transformation matrix directly using:
modelInstance.transform.rotate()
I have various bodies rotated at some angle with the help of Box2D in libGDX. What I want is to destroy a body when I click on it, but the problem is that I am not able to get the area of the body, so I can't check whether my touch point lies inside it or not. I tried using an Actor and its hit() method, but that only works if I don't rotate it; as far as I know, once I rotate the body its bounds are not rotated. So, how can I detect a touch event on a Body?
Thanks in advance.
See the touchDown handler in the libgdx Box2DTest. It uses World.QueryAABB (AABB is "Axis-Aligned Bounding Box") to query which fixtures intersect a small bounding box around the touch point, and then uses the query callback to verify that the actual touch point lies inside the fixture in question.
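Roughly, the pattern looks like this (a sketch assuming worldX and worldY are the touch coordinates already unprojected into Box2D world coordinates):

final Vector2 testPoint = new Vector2(worldX, worldY);
final Body[] hitBody = { null };

QueryCallback callback = new QueryCallback() {
    @Override
    public boolean reportFixture(Fixture fixture) {
        // testPoint() does the exact, rotation-aware point-in-fixture test.
        if (fixture.testPoint(testPoint.x, testPoint.y)) {
            hitBody[0] = fixture.getBody();
            return false; // found a hit, stop the query
        }
        return true; // keep checking other fixtures
    }
};

// Query a tiny box around the touch point; the callback does the exact test.
world.QueryAABB(callback, testPoint.x - 0.1f, testPoint.y - 0.1f,
        testPoint.x + 0.1f, testPoint.y + 0.1f);

if (hitBody[0] != null) {
    world.destroyBody(hitBody[0]); // do this outside world.step()
}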
I've just started with Box2D and have come across a strange problem.
I have a simple function to constrain an object's position to within a predefined area.
I do this by getting the body's world position, checking it against the predefined area's bounding box values, and applying a force to the body to keep it inside.
if (bodyWorldPos.x >= worldWidth)
    body.setLinearVelocity(...);
This works fine.
However, if the body collides with another body, this simple method stops working.
The body's world position, retrieved like this:
body.getWorldPoint(body.getPosition())
returns wrong values.
Is this a bug in Box2D for LibGDX or am I doing something wrong?
The function getWorldPoint converts a point from local coordinates (relative to the body's own origin and rotation) to world coordinates (relative to 0,0 in the world). getPosition() already returns the body's origin in world coordinates, so passing it into getWorldPoint applies the body's transform a second time; once the body has been rotated by a collision, that double transformation produces the wrong values you are seeing.
I think for this purpose you can just use getPosition() on its own.
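To illustrate the difference, a small sketch (the local offset and velocity are just example values):

// Already in world coordinates: the body's origin.
Vector2 worldPos = body.getPosition();

// getWorldPoint() is meant for points given relative to the body, e.g. a corner
// or attachment point; it applies the body's current rotation and translation.
Vector2 worldOffsetPoint = body.getWorldPoint(new Vector2(0.5f, 0f));

// So the bounds check from the question only needs getPosition():
if (worldPos.x >= worldWidth) {
    body.setLinearVelocity(-1f, body.getLinearVelocity().y); // push it back inside
}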