I'm trying to get the magnetic field values on each axis using my mobile phone, but the results I get from the magnetometer depend on the phone's reference frame (see the phone-axis figure below). So every time I rotate or move the phone, I get different values.
I tried to fix a reference so I can get constant values. For the moment, I can get the three angles: yaw, pitch and roll. I'm convinced that by computing a suitable rotation matrix from those angles I can fix the issue. Have you had a similar problem?
[figure: phone axes]
Thank you and have a good day :)
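(For reference, a minimal sketch of the rotation-matrix idea mentioned above. The class name is just for illustration; it assumes the listener is registered for TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD.)

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Sketch: rotate the magnetometer reading from the phone's frame into the
// world frame (x ~ east, y ~ magnetic north, z ~ up), so the values no
// longer change when the phone is merely rotated.
public class WorldFrameMagnetometer implements SensorEventListener {
    private final float[] gravity = new float[3];      // last accelerometer reading
    private final float[] geomagnetic = new float[3];  // last magnetometer reading
    private final float[] rotation = new float[9];
    public final float[] worldField = new float[3];

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            System.arraycopy(event.values, 0, gravity, 0, 3);
        } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
            System.arraycopy(event.values, 0, geomagnetic, 0, 3);
        }
        if (SensorManager.getRotationMatrix(rotation, null, gravity, geomagnetic)) {
            // worldField = R * geomagnetic: the same physical field, expressed
            // in the world frame instead of the phone frame.
            for (int row = 0; row < 3; row++) {
                worldField[row] = rotation[3 * row]     * geomagnetic[0]
                                + rotation[3 * row + 1] * geomagnetic[1]
                                + rotation[3 * row + 2] * geomagnetic[2];
            }
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}

This does essentially what a yaw/pitch/roll rotation matrix would do, but lets SensorManager build the matrix for you.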
I am currently working on an ARCore app and want to place objects (e.g. arrows) in the AR scene that start at the current location of the user and point to geographic north.
My idea was to place the objects on the X axis, rotate them around the Y axis, and finally translate them so their origin is at the user's position. I planned to first get the device orientation relative to north (which I am able to do) and also get the orientation relative to the ARCore world coordinate system. I'm now struggling with the latter. I think I need the angle around the Y axis between the X axis and the current direction of view. I figured out that it should somehow be possible to get this via the Pose:
arFragment.getArSceneView().getArFrame().getCamera().getDisplayOrientedPose();
I don't have any clue what to do with the pose. The values from qx(), qy(), qz(), getXAxis(), getYAxis() and getZAxis() don't seem to be what I expected: as I watched them change while moving the device (after converting them to degrees), they ranged roughly from -55 to 55.
TL;DR:
How to rotate objects in ARCore to point to a given geographical direction?
I found this solution but don't know what the 'dHelper' is supposed to be:
Placing an object with a given compass bearing in ARCore
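(For reference, a sketch of one way to read a heading out of that pose. It rests on two assumptions: the ARCore world frame is Y-up, and the camera looks down its local -Z axis; the quaternion components themselves are not angles, so converting qx()/qy()/qz() to degrees directly won't give a heading.)

import com.google.ar.core.Frame;
import com.google.ar.core.Pose;

// Sketch: angle about the world Y axis between the world X axis and the
// horizontal projection of the camera's viewing direction.
static float cameraYawDegrees(Frame frame) {
    Pose pose = frame.getCamera().getDisplayOrientedPose();
    float[] zAxis = pose.getZAxis();   // camera's local +Z expressed in world coordinates
    float viewDirX = -zAxis[0];        // viewing direction is along -Z
    float viewDirZ = -zAxis[2];
    // Right-handed, Y up: positive rotation about +Y turns +X toward -Z.
    return (float) Math.toDegrees(Math.atan2(-viewDirZ, viewDirX));
}

Combined with the compass azimuth, this tells you how the ARCore world axes relate to north, so an arrow placed along the world axes can be rotated by the difference. Signs and offsets depend on conventions, so treat this as a starting point to verify against real measurements.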
To put an arrow on the camera view that points you in the right direction, you do not need ARCore at all; a simple rotation of an ImageView is enough (see the sketch below).
If you want to rotate a 3D object according to the bearing, you need to get the actual position and the target position, compute the bearing between them, and rotate the object by the resulting number of degrees.
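A minimal sketch of that 2D approach (method and variable names are just for illustration; it assumes you already have the current and target Location objects and the device's azimuth from the orientation sensors):

import android.location.Location;
import android.widget.ImageView;

// Sketch: rotate an arrow ImageView so it points from the current position
// toward the target, compensating for which way the phone itself is facing.
static void pointArrowAt(ImageView arrow, Location current, Location target,
                         float deviceAzimuthDegrees) {
    float bearing = current.bearingTo(target);        // degrees east of true north
    float rotation = bearing - deviceAzimuthDegrees;  // bearing relative to phone heading
    arrow.setRotation(rotation);
}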
Sorry if the title is hard to understand, I don't know how else to phrase it.
Accelerometer data on Android is relative to the screen's orientation and tilt (TYPE_ACCELEROMETER). I am trying to make an app in which the accelerometer data stays constant regardless of how the user is holding the device. I think I'd have to calculate the angle of tilt from the gravity component in the X/Y/Z values and somehow go from there, as well as use the compass to lock the X axis to north (or any other direction). Anyone have some ideas/tips?
Thanks!
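(For reference, a minimal sketch of one way to do this with the rotation-vector and linear-acceleration sensors; the class name is illustrative and the listener is assumed to be registered for both sensor types.)

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Sketch: express acceleration in a world frame (x ~ east, y ~ north, z ~ up)
// so it no longer depends on how the device is held. Uses the rotation-vector
// sensor for orientation and the linear-acceleration sensor (gravity removed).
public class WorldFrameAccel implements SensorEventListener {
    private final float[] rotationMatrix = new float[9];
    public final float[] worldAccel = new float[3];

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
            SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        } else if (event.sensor.getType() == Sensor.TYPE_LINEAR_ACCELERATION) {
            // worldAccel = R * deviceAccel: same acceleration, world coordinates.
            for (int row = 0; row < 3; row++) {
                worldAccel[row] = rotationMatrix[3 * row]     * event.values[0]
                                + rotationMatrix[3 * row + 1] * event.values[1]
                                + rotationMatrix[3 * row + 2] * event.values[2];
            }
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}

Using TYPE_LINEAR_ACCELERATION also removes gravity, so what remains is the device's own acceleration expressed in a frame tied to north/east/up rather than to the screen.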
I am creating an Android game that uses the accelerometer for X-axis and Y-axis movement in landscape orientation. The X axis works as expected, but the Y axis is causing an issue.
For the Y axis to work properly, its reading needs to start at 0 (the phone lying perfectly flat, screen up). This is an issue because I don't expect users to hunch over their phones to play properly.
I attempted to correct this by taking the initial Y-axis reading and subtracting it from the following readings, but if the user begins the game at -10 (phone screen directly facing them), the phone will not register any further tilting back.
Does anyone know a better way to handle this situation? Thank you all for the help thus far!
It's possible. Check out Allowing both landscape options and accelerometer with libgdx
and a fix for accelerometer reading
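(For reference, a minimal sketch of the baseline-calibration idea from the question, done in angle space rather than on the raw axis value; the class and method names are just for illustration.)

// Sketch: calibrate to whatever angle the player starts at, then report tilt
// as an angle offset. Pitch is derived from the gravity direction with atan2,
// so it keeps changing smoothly even when the screen starts out facing the player.
public class TiltCalibrator {
    private float baselinePitchDeg = Float.NaN;

    /** values = event.values from TYPE_ACCELEROMETER (gravity-dominated). */
    public float tiltDegrees(float[] values) {
        float pitchDeg = (float) Math.toDegrees(
                Math.atan2(values[1], values[2]));  // rotation about the X axis
        if (Float.isNaN(baselinePitchDeg)) {
            baselinePitchDeg = pitchDeg;            // first reading = "neutral" posture
        }
        return pitchDeg - baselinePitchDeg;         // tilt relative to starting posture
    }
}

Working with an angle avoids the problem where a single raw axis value stops changing as the tilt approaches vertical.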
Dear programmers/scripters/engineers/other people,
The problem:
I'm currently developing an augmented reality application for an Android 3.2 tablet and having some issues getting an accurate compass reading. I need to know exactly which direction the tablet is facing (its rotation about the vertical z axis), measured from north. It doesn't matter whether it's in degrees or radians.
What I currently have tried:
I used the magnetometer and accelerometer to calculate the angle. This has one big disadvantage: if you rotate 90 degrees, the sensors measure a larger or a smaller angle, even when I'm in an open field far away from metal or any magnetic objects. Even applying the declination doesn't solve it.
Using the gyroscope would be an option. I have tried measuring the rotation speed and accumulating it into a variable to track the exact view direction. There seems to be a factor that causes distortion, though: I found that fast rotations distort the direction measurement. The gyro's drift wasn't all that troublesome; the application checks the other sensors for any movement, and if none is detected, the gyro's rotation change isn't taken into account.
The rotation vector works okay, but it has issues similar to the gyroscope. If you move slowly and then stop suddenly, it drifts away for a few seconds. Another problem is that it becomes inaccurate with quick rotations, depending on the speed and how many turns you've made. (You don't want to know how my co-workers look at me when I'm swinging the tablet in all directions...)
The orientation sensor: not much to say about it. It is deprecated for some reason, so I won't use it. A lot of examples on the internet use this sensor, and it's probably the same thing as the magnetometer/accelerometer combination anyway.
So I'm currently out of ideas. Could you help me brainstorm a solution?
Thanks in advance, yours sincerely, Roland
EDIT 1:
I am willing to provide the code I have tried.
I'm summing up our comments:
It's clear from this video that the sensors on phones are not very accurate to begin with. Also interesting to read is this.
It is important that the user calibrates the sensors by doing a figure-8 motion while holding the phone flat. An app can programmatically check whether such a calibration is necessary and notify the user; see this question for details.
To eliminate jitter, the values obtained from the sensors need to be filtered by a low-pass filter of some kind. This has also been discussed on Stack Overflow (a sketch follows after this list).
The orientation obtained is relative to magnetic north, not true north. To obtain true north one must use GeomagneticField.
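To make the last two points concrete, here is a minimal sketch (names are illustrative; it assumes you already have a raw azimuth in degrees from getOrientation or the rotation-vector sensor, plus a recent Location fix):

import android.hardware.GeomagneticField;
import android.location.Location;

// Sketch: smooth the raw azimuth with a simple low-pass filter and convert
// magnetic north to true north using the local declination.
public class HeadingFilter {
    private static final float ALPHA = 0.15f;   // smaller = smoother but slower to react
    private float filteredAzimuth = 0f;

    /** rawAzimuthDeg: magnetic heading from the sensors, in degrees. */
    public float update(float rawAzimuthDeg, Location location) {
        // Exponential low-pass filter to suppress jitter.
        filteredAzimuth += ALPHA * (rawAzimuthDeg - filteredAzimuth);

        GeomagneticField geoField = new GeomagneticField(
                (float) location.getLatitude(),
                (float) location.getLongitude(),
                (float) location.getAltitude(),
                System.currentTimeMillis());
        // Declination is the angle from magnetic north to true north (east = positive).
        return (filteredAzimuth + geoField.getDeclination() + 360f) % 360f;
    }
}

Note that a plain low-pass filter misbehaves at the 0/360 degree wrap-around; in a real app, filter the sine and cosine of the azimuth (or the rotation vector itself) instead.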
I'm working on an Android application which should calculate the device's movement in 6 directions. I think I can use acceleration as "x = a·t²", but a is not constant, and that is the problem. How can I calculate the total movement?
The accelerometer gives you three components (x, y, z). They are acceleration measurements, so it is harder to know what the position of the device is. But remember that acceleration is related to position through integration:
a(t) = a[x]
v(t) = a[x]·t + c
x(t) = (1/2)·a[x]·t² + c·t + d
The problem is that you can't know c (the initial velocity) or d (the initial position), because taking the derivative drops those constants out; the accelerometer alone can't recover them. You can compensate by assuming starting values (usually zero) and carrying forward the velocity and position computed for the previous sample, so after a few samples you can start to estimate position. A sketch follows below.
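In discrete form that just means updating a running velocity and position on every sample, both starting at an assumed zero. A naive sketch (dt is the time between samples in seconds, e.g. from event.timestamp deltas):

// Sketch: naive discrete double integration of one acceleration axis.
// c and d from the formulas above become the running velocity and position,
// assumed to start at zero.
public class NaiveIntegrator {
    private float velocity = 0f;  // "c": unknown, assumed 0 at start
    private float position = 0f;  // "d": unknown, assumed 0 at start

    public float step(float accel, float dt) {
        velocity += accel * dt;    // v(t+dt) = v(t) + a*dt
        position += velocity * dt; // x(t+dt) = x(t) + v*dt
        return position;
    }
}

As the other answer notes further down, the error of this double integration grows very quickly, so it is only usable over short intervals.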
There is a significant amount of information about how to interpret the data from the sensors, like figuring out where gravity is for orientation, and subtracting out gravity to get linear acceleration:
http://developer.android.com/reference/android/hardware/SensorEvent.html
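The SensorEvent documentation linked above describes essentially this; a sketch of isolating gravity with a low-pass filter and subtracting it to get linear acceleration:

// Sketch (same idea as the SensorEvent docs linked above): low-pass filter the
// accelerometer to estimate gravity, then subtract it to get linear acceleration.
public class GravityFilter {
    private static final float ALPHA = 0.8f;        // closer to 1 = slower gravity estimate
    private final float[] gravity = new float[3];
    public final float[] linearAccel = new float[3];

    /** values = event.values from TYPE_ACCELEROMETER. */
    public void update(float[] values) {
        for (int i = 0; i < 3; i++) {
            gravity[i] = ALPHA * gravity[i] + (1 - ALPHA) * values[i];  // isolate gravity
            linearAccel[i] = values[i] - gravity[i];                    // remove gravity
        }
    }
}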
Here is a way to come up with position using an accelerometer along with an algorithm for finding position in detail:
https://www.nxp.com/docs/en/application-note/AN3397.pdf
It is true, you get position by integrating the linear acceleration twice. But the error is horrible. It is useless in practice.
Here is an explanation why (Google Tech Talk) at 23:20. I highly recommend this video.
It is not the accelerometer noise that causes the problem but the gyro white noise; see subsection 6.2.3, Propagation of Errors. (By the way, you will need the gyroscope too.)
A similar question is Distance moved by Accelerometer.