Android Speed based on accelerometer values - java

I need to obtain the velocity of an Android device based on its accelerometer values. I wrote code that reads the accelerometer values and then computes the velocity using the formula:
v = v0 + a*t (as a vector calculation, per axis)
My problem is that the velocity only ever increases and never decreases. I think the problem is that the device never reports a negative acceleration.
Can you help me with this?
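For reference, the integration I'm doing is essentially the following (a simplified sketch of my approach; the class name is illustrative and the listener registration is omitted):

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Sketch of the naive approach: integrate raw accelerometer samples
// into a per-axis velocity estimate (v = v0 + a*t).
public class VelocityEstimator implements SensorEventListener {
    private final float[] velocity = new float[3]; // m/s per axis
    private long lastTimestampNs = 0;

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (lastTimestampNs != 0) {
            float dt = (event.timestamp - lastTimestampNs) * 1e-9f; // ns -> s
            for (int i = 0; i < 3; i++) {
                velocity[i] += event.values[i] * dt;
            }
        }
        lastTimestampNs = event.timestamp;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}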

Obtaining velocity from the accelerometer may not be feasible (let alone reliable), because at constant speed there is no acceleration (other than gravity). You might be better off obtaining GPS location fixes with their associated timestamps and computing velocity as distance over time.
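For example, a minimal sketch of the GPS approach (class and method names are illustrative; note that Location.getSpeed() can often give you this directly when the provider supplies it):

import android.location.Location;

// Sketch: speed = distance / time between consecutive GPS fixes.
public class GpsSpeed {
    private Location lastFix;

    /** Returns speed in m/s, or -1 if this is the first fix. */
    public float onNewFix(Location fix) {
        float speed = -1f;
        if (lastFix != null) {
            float meters = lastFix.distanceTo(fix);
            float seconds = (fix.getTime() - lastFix.getTime()) / 1000f;
            if (seconds > 0) speed = meters / seconds;
        }
        lastFix = fix;
        return speed;
    }
}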

Are you subtracting out the force of gravity? The device is always accelerating -- even if it is just sitting on your desk, it is accelerating at 9.8 m/s^2 away from the center of the Earth.
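If you stay with the accelerometer, one common way to subtract gravity is the low-pass/high-pass split described in the Android SensorEvent documentation; here is a sketch (the 0.8 filter constant is an assumption to tune). On devices that support it, Sensor.TYPE_LINEAR_ACCELERATION does this for you:

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Sketch: isolate the slowly-varying gravity component with a low-pass
// filter and subtract it, leaving the linear acceleration.
public class GravityFilter implements SensorEventListener {
    private final float[] gravity = new float[3];
    private final float[] linear = new float[3];

    @Override
    public void onSensorChanged(SensorEvent event) {
        final float alpha = 0.8f; // assumed smoothing constant
        for (int i = 0; i < 3; i++) {
            gravity[i] = alpha * gravity[i] + (1 - alpha) * event.values[i];
            linear[i] = event.values[i] - gravity[i];
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}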

You can use a combination of the accelerometer and the digital compass, in phones that have them, to determine a speed and direction as mentioned in this post.
If all you need to do is determine whether the person is walking, all you need is the accelerometer: just process its output for footsteps.
There are plenty of tutorials on the web for detecting footsteps with an accelerometer.
There's an app note here: http://www.analog.com/library/analogDialogue/archives/41-03/pedometer.html that gives a decent mathematical background and an example algorithm. It's of course up to you to extract the math and rewrite it for Android (the example code is written in C). I don't currently know of an open-source Android library with a footstep-detection algorithm.
If you implement something, I'd like to see the code; don't forget to post back your results.
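To give an idea, a minimal threshold-based step detector might look like the following. The threshold and refractory window are assumptions you'd have to tune per device; the app note's algorithm is more sophisticated:

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Sketch: watch the acceleration magnitude for peaks above a threshold,
// with a refractory period so one step isn't counted twice.
public class StepCounter implements SensorEventListener {
    private static final float THRESHOLD = 11.0f;              // m/s^2, above ~9.8 gravity
    private static final long MIN_STEP_GAP_NS = 300_000_000L;  // 300 ms between steps
    private long lastStepNs = 0;
    private int steps = 0;

    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0], y = event.values[1], z = event.values[2];
        float magnitude = (float) Math.sqrt(x * x + y * y + z * z);
        if (magnitude > THRESHOLD && event.timestamp - lastStepNs > MIN_STEP_GAP_NS) {
            steps++;
            lastStepNs = event.timestamp;
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }

    public int getSteps() { return steps; }
}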


GPS too inaccurate

I am new to Android development and I am trying to create a lap-counting app.
To do this, I construct a line segment for the finish line and look for intersections with other segments (each formed by the two latest coordinates).
My problem is that my GPS coordinates have an accuracy of 3.0 m (radius) at best, and this is not enough.
Do you have any idea how to improve my accuracy? Or is there a smarter way to count laps?
Help and advice are greatly appreciated. Thank you!
OK, this comment became too long, so I'm posting it as an answer, although it is more in the form of a suggestion.
Without going into much detail: you can use activity recognition to determine whether the user is walking, running, or driving. Using this information, keep hold of the 'good quality' location updates, and use a combination of reported accuracy and an assumed maximum velocity for the detected activity to reject poor-quality location updates (see the sketch below).
This idea could be extended to perform dead reckoning in areas where the location updates are too inaccurate.
I have added Activity Recognition to the library https://github.com/mcharmas/Android-ReactiveLocation, which greatly reduces the code needed to get it up and running.
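A rough sketch of the filtering idea (the accuracy cutoff and per-activity speed limits are assumptions, not values from any API):

import android.location.Location;
import com.google.android.gms.location.DetectedActivity;

// Sketch: reject fixes whose implied speed exceeds a plausible maximum
// for the currently detected activity, or whose reported accuracy is poor.
public class LocationFilter {
    private Location lastGood;

    public boolean accept(Location fix, int activityType) {
        float maxSpeed = maxSpeedFor(activityType); // m/s
        if (lastGood != null) {
            float dt = (fix.getTime() - lastGood.getTime()) / 1000f;
            if (dt > 0 && lastGood.distanceTo(fix) / dt > maxSpeed) {
                return false; // implied speed is implausible; treat as noise
            }
        }
        if (fix.getAccuracy() > 10f) return false; // assumed cutoff, meters
        lastGood = fix;
        return true;
    }

    private float maxSpeedFor(int activityType) {
        switch (activityType) {
            case DetectedActivity.WALKING:    return 3f;
            case DetectedActivity.RUNNING:    return 8f;
            case DetectedActivity.IN_VEHICLE: return 70f;
            default:                          return 10f;
        }
    }
}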

Does the getFocusDistances() camera API function actually work on the Nexus 5? Or on any other device?

I would like to determine the distance of an object from my Nexus 5 camera, preferably without using an object like a coin for scale. I figured the Camera.Parameters getFocusDistances function would work for this.
I attempted to do this via something like the following in my takePicture() jpeg callback:
Parameters params = camera.getParameters();
float[] focusDistances = new float[3]; // the API takes a float[], indexed NEAR/OPTIMAL/FAR
params.getFocusDistances(focusDistances);
I tried running this a few times with objects of different distances from the camera, though each time, focusDistances[FOCUS_DISTANCE_NEAR_INDEX], focusDistances[FOCUS_DISTANCE_OPTIMAL_INDEX], and focusDistances[FOCUS_DISTANCE_FAR_INDEX] all contained the value positive infinity.
It's possible I'm doing something wrong, in which case please let me know if there is a specific way in which this will work on the Nexus 5. However, the Android API specifically states that you can call getParameters() (and then getFocusDistances()) at any time to get the latest focus distances, so I think this should work. One thing I haven't tried yet is doing the above in an onAutoFocus handler, but I don't see why that should matter.
I did some research to try to see what was going on, and I found several questions regarding this sort of behavior from getFocusDistances(); typically the answer, if there was one, was that the function is not supported by the Android API and/or the hardware manufacturer. A lot of these discussions were from several years ago, and despite the questionable feelings they give me about getFocusDistances(), I've still seen this function suggested for getting the focus distance, so I figure it must work on SOME device for SOME Android API version.
Does anybody know if getFocusDistances() works on any particular version of Android on the Nexus 5? If not, does anybody know ANY device it does work on?
EDIT:
Since posting, I have tried obtaining the focus distances in the onAutoFocus handler, as well as testing a bit more extensively with objects at various distances. The results have been consistent: positive infinity is always returned for all 3 focus distances (NEAR, OPTIMAL, and FAR). I even tried this with a Nexus 7, where getFocusDistances() always returns the constant values (0.95, 1.9, and infinity), so apparently it isn't really implemented on that device either.
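For reference, the onAutoFocus variant I tried looks roughly like this (a sketch, assuming the android.hardware.Camera instance from the snippet above and android.util.Log):

camera.autoFocus(new Camera.AutoFocusCallback() {
    @Override
    public void onAutoFocus(boolean success, Camera cam) {
        float[] distances = new float[3];
        cam.getParameters().getFocusDistances(distances);
        Log.d("Focus", "optimal focus distance (m): "
                + distances[Camera.Parameters.FOCUS_DISTANCE_OPTIMAL_INDEX]);
    }
});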
Therefore, I really have two questions:
Is there any way to get somewhat accurate focus distances using the Android Camera API with the Nexus 5? I'm even wondering if there is a custom Android version where getFocusDistances() is actually implemented, since otherwise I may attempt to implement it myself, depending on what I find when examining the API code.
Are there any Android devices that are known to implement getFocusDistances() in a somewhat accurate manner?
First of all, it's very difficult to measure object distance from one single shot/view. You'll find many research papers that have tried to employ vision-based techniques to judge object distance; I can refer you to one such paper, which tried to implement a positioning system working solely on a mobile camera plus sensors. You'd quickly realize how non-trivial it is to measure object distance from a single camera view. They finally used a vision technique called "structure from motion" to calculate the distance (from multiple photos taken from multiple angles).
Even traditional apps like SmartDistance and SmartMeasure need to use geometric tricks to measure distance; none of them can rely on camera parameters alone. Sorry for the long introduction. I have done a project of this sort before, and I'm telling you all this based on that experience.
To answer your query: I haven't yet found any Android device that returns realistic values for the focus distances. They are returned either as constant values or sometimes as 0 and infinity. I found one report that it worked on the Galaxy Nexus, but only for object distances within 30 cm; it doesn't work for anything farther away. The bottom line is that you cannot rely on this camera API function, which is heavily dependent on the device drivers, and phone cameras are not well known for their lens/sensor quality. It would be very difficult to make any optics-based formula work on mobile-phone cameras. I would suggest you instead go for sensor-based geometric tricks.

Is there a way of obtaining the longitude and latitude of Wi-Fi access points for triangulation purposes?

I'm trying to obtain the longitude and latitude positions of the existing access points within an indoor environment using Java (Eclipse). I understand that these are needed to complete the triangulation method. I have spoken to the IT team and they're unable to provide me with these readings. However, I'm wondering if there's another way to do this?
You will need to work with signal strength. You cannot determine latitude & longitude directly from Wi-Fi. It appears that a few iPhone apps leverage triangulation of Wi-Fi signals.
For outdoor positioning, though, the default is GPS.
See this question - Wifi Triangulation
Specifically this answer
If you knew the locations of the access points to within the tolerances required for your application and had a good way to accurately measure the distance between the APs and you had a way to account for signal attenuation between your measurement device and the APs then you could do a little bit of math to solve for where you are.
Java certainly has the mathematical functions you'd need to calculate your location. However, there are a ton of other variables that you would need to account for while triangulating your position from just Wi-Fi access points.
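To illustrate the math, here is a sketch of 2D trilateration, assuming you already have the AP positions in a local x/y frame (meters) and distance estimates r1..r3. Linearizing the three circle equations gives a 2x2 linear system:

// Sketch: solve (x - xi)^2 + (y - yi)^2 = ri^2 for i = 1..3 by subtracting
// circle 1's equation from circles 2 and 3, which removes the x^2/y^2 terms.
public class Trilateration {
    /** Returns {x, y}; assumes the three distance estimates are consistent. */
    public static double[] solve(double[] p1, double r1,
                                 double[] p2, double r2,
                                 double[] p3, double r3) {
        double a = 2 * (p2[0] - p1[0]), b = 2 * (p2[1] - p1[1]);
        double c = r1 * r1 - r2 * r2 - p1[0] * p1[0] + p2[0] * p2[0]
                 - p1[1] * p1[1] + p2[1] * p2[1];
        double d = 2 * (p3[0] - p2[0]), e = 2 * (p3[1] - p2[1]);
        double f = r2 * r2 - r3 * r3 - p2[0] * p2[0] + p3[0] * p3[0]
                 - p2[1] * p2[1] + p3[1] * p3[1];
        double det = a * e - b * d; // Cramer's rule for the 2x2 system
        double x = (c * e - b * f) / det;
        double y = (a * f - c * d) / det;
        return new double[] { x, y };
    }
}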
Wi-Fi transmission range is roughly a radius around each access point, and you first have to discover the available networks.
You cannot know the exact point where you are, but you can know the area you are in.
If you add the latitude & longitude of each access point, and know the reach of each one, you can play with sin, cos & tan to arrive at an approximate position.
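For the "reach" part, a common (rough) way to turn signal strength into a distance estimate is the log-distance path-loss model; a sketch, where the 1 m reference RSSI and the path-loss exponent are assumptions you must calibrate per AP and per building (indoors the exponent is often around 2.7 to 4):

import android.net.wifi.ScanResult;

// Sketch: log-distance path loss, d = 10 ^ ((rssiAtOneMeter - rssi) / (10 * n)).
public class RssiDistance {
    public static double metersFrom(ScanResult result,
                                    double rssiAtOneMeter, double pathLossExponent) {
        // ScanResult.level is the received signal strength in dBm.
        return Math.pow(10.0,
                (rssiAtOneMeter - result.level) / (10.0 * pathLossExponent));
    }
}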

Mechanical safe-cracking using Android audio and orientation sensors

I'm writing an app that helps locksmiths with safe manipulation, mainly by creating the charts they need on the fly. When trying to gain entry to a safe via manipulation, a detailed analysis of the "wheel pack" is required, accomplished by charting the relative depth of a "fence" over a "gate". If all gates are lined up on the pack, the fence's lever will drop into the cam gate and the lock will open. If anyone's interested in a much more detailed explanation, you can find an awesome treatment by Matt Blaze here (starting around section 3.3): www.crypto.com/papers/safelocks.pdf
Making the charts is important, and all it really requires is accurately measuring two places on the dial over and over again, recording the dial distance between two sounds. So, say the "drop-in" point is between 10 and 15: a sound event might occur at 11.5 and 14.5, or the next time around it might occur at 12 and 15. The locksmith charts the distance between these numbers and looks for, say, a narrowing of the numbers, or maybe just the lowest place on the chart.
I'm using an old-school RadioShack telephone pickup (a suction-cup mic) plugged into my Android's headphone jack to listen for the sound events. To precisely measure where on the dial the events occur, I've simply mounted the phone to the dial with Velcro and use SensorManager to figure out the distance between the sounds, based on how far the phone has rotated from one audio spike event to the next. That works fine, but I'd like to do it without mounting the phone to the dial. A couple of companies used to accomplish this by having a webcam look at the dial itself, but that seems much less accurate than just mounting the phone, since I can get more precision using fractional degrees of rotation.
Once you enter the drop-in location, you always hit one sound, then back up the dial to the next sound. So I was thinking I could simply listen for the first sound event and then, once the dial reverses direction, measure to the next sound event; but this would require the dial to move at a constant rate, which won't happen in real life. I guess I could use a Dynamixel or servo to move the dial for the locksmith, but again, that's not a good solution. So my question is whether any of you smart folks can think of a way that these related rates (distance between sound events and change of dial position) can be quantified without mounting the phone to the dial.
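For context, the mounted-phone measurement is essentially the following (a simplified sketch; the audio spike detection itself is just an amplitude threshold on the mic input and is omitted here):

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Sketch: track the phone's azimuth from the rotation-vector sensor; when the
// audio thread reports a spike, record the dial angle. The chart value is the
// angular distance between consecutive spikes.
public class DialTracker implements SensorEventListener {
    private final float[] rotation = new float[9];
    private final float[] orientation = new float[3];
    private volatile float currentAzimuthDeg;
    private Float lastSpikeAzimuth; // null until the first spike

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
            SensorManager.getRotationMatrixFromVector(rotation, event.values);
            SensorManager.getOrientation(rotation, orientation);
            currentAzimuthDeg = (float) Math.toDegrees(orientation[0]);
        }
    }

    /** Call when the audio thread detects a spike; returns degrees turned. */
    public float onAudioSpike() {
        float degrees = 0f;
        if (lastSpikeAzimuth != null) {
            degrees = Math.abs(currentAzimuthDeg - lastSpikeAzimuth);
            if (degrees > 180f) degrees = 360f - degrees; // shortest arc
        }
        lastSpikeAzimuth = currentAzimuthDeg;
        return degrees;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}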
Not a full answer, but a few ideas I can throw at you:
From a control-systems engineering point of view, the way to accurately measure a rotation without direct access to the shaft is with an encoder.
So maybe you can use Android accessory mode to read an encoder (through an Arduino) that you mount against the dial (probably by placing a rubber disc on the encoder shaft and touching it against the safe dial).
This whole effort would be just to avoid sticking the whole phone to the dial, while gaining some extra precision.
A different approach could be to attach something to the dial that generates a click noise you could filter out of the audio to distinguish it from the safe's own sounds. But that would definitely lower your precision.

How to calibrate the orientation sensor in Android?

I'm writing an app for Google Android 2.1 that needs to know which direction (N/W/S/E) the device (an HTC Hero) is facing. The sensor and its listener are working great, but the values I get from the sensor are totally crappy; e.g., it tells me I'm facing north when the device is actually facing south-west or so...
This seems to be a known problem with android devices. The "solutions" I found on the web look like this:
shake the device around
move the device like an eight
tap on the device's back
These tricks are thought to trigger the sensor's recalibration. And the thing with "moving it around" does work for me... but that's not very professional, I guess...
So, how do I trigger the recalibration of the orientation sensor from the SDK? I need the sensor to be properly calibrated without any fancy stuff that would make users of this app look like complete idiots while they "manually" recalibrate their phones...
Is there any way to do this "right"?
EDIT:
Or: is there any way to determine PROGRAMMATICALLY if the device is correctly calibrated or not? As a fallback option, so to speak... then I could warn the user that the device needs "manual" recalibration.
I don't believe there is a way to know programmatically whether your compass sensor is calibrated correctly unless you use a secondary data source like GPS. If you can use GPS, then while the user is moving you can compare the GPS movement with the compass heading and correct for the difference. Remember that local magnetic fields can screw up the compass readings, and the device has no idea whether you are out in the middle of a forest or next to a transformer.
With these micro devices there is always a bit of skew you'll have to deal with. If you check the values from the accelerometer as well, you'll see that at rest it doesn't always return 9.8 m/s^2 (or at least not consistently across devices).
In your help text you may just need to tell the user to rotate/twist their phone in a figure eight to reset the compass.
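A sketch of the GPS cross-check (the speed gate and divergence threshold are assumptions to tune):

import android.location.Location;

// Sketch: while the user is moving, compare the GPS course over ground with
// the compass azimuth; a persistent large disagreement suggests the
// magnetometer needs recalibration.
public class CompassSanityCheck {
    private static final float MIN_SPEED = 1.5f;      // m/s; assumed walking pace
    private static final float MAX_DIVERGENCE = 30f;  // degrees; an assumption

    /** Returns true if the compass disagrees badly with the GPS track. */
    public static boolean looksMiscalibrated(Location fix, float compassAzimuthDeg) {
        if (!fix.hasBearing() || fix.getSpeed() < MIN_SPEED) {
            return false; // no reliable GPS course to compare against
        }
        float diff = Math.abs(fix.getBearing() - compassAzimuthDeg);
        if (diff > 180f) diff = 360f - diff; // wrap to the shortest angle
        return diff > MAX_DIVERGENCE;
    }
}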
I assume you are referring to the magnetometer inside the Hero.
Calibrating it is a tough one, and will/should always require user interaction for a reliable calibration. There are separate strategies to deal with that. You could ask users to hold their device facing north and then recalibrate. If the users don't know where north is, you can ask them to point the device towards the sun, and based on location and time you can calculate where that is.
Leaving calibration aside, I would guess that your problem is that the readings you get from the sensor are inaccurate. Of course calibration is a prerequisite for accurate readings, but there are also other factors in play.
It is common practice to complement the data from one sensor with the data from a different sensor to increase accuracy. You could use GPS to determine a heading while the user is moving. If they're moving slowly, however, this is inaccurate as well. You could integrate the data reported by the accelerometer to guess at orientation changes (not the absolute orientation), but honestly a gyroscope would be more suitable in this case.
Systems that work like this are sometimes called Inertial Navigation Systems (INS), because given a fixed starting point in space they can determine their subsequent relative position and orientation accurately without further external data. Using a Kalman filter is common practice to recalibrate the system from time to time when an absolute position (e.g. retrieved via GPS) is available.
Although it is unrealistic to implement a full-fledged INS, you can certainly draw a few ideas from how they work to make your orientation readings more accurate.
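To make the fusion idea concrete, here is a sketch of a simple complementary filter: trust the gyroscope for short-term changes and slowly pull the estimate toward the (noisy, possibly biased) compass heading. The 0.02 correction weight and the gyro sign convention are assumptions:

// Sketch: integrate the gyroscope z-rate and correct toward the compass.
// Assumes the device is held flat, so rotation about z is a heading change.
public class HeadingFilter {
    private float headingDeg;
    private long lastTimestampNs;

    public float update(float gyroZRadPerSec, float compassHeadingDeg, long timestampNs) {
        if (lastTimestampNs == 0) {
            headingDeg = compassHeadingDeg; // initialize from the compass
        } else {
            float dt = (timestampNs - lastTimestampNs) * 1e-9f; // ns -> s
            float predicted = headingDeg + (float) Math.toDegrees(gyroZRadPerSec) * dt;
            float error = compassHeadingDeg - predicted;
            error -= 360f * Math.round(error / 360f); // wrap to [-180, 180]
            headingDeg = predicted + 0.02f * error;   // gentle pull toward compass
        }
        lastTimestampNs = timestampNs;
        return ((headingDeg % 360f) + 360f) % 360f;   // normalize to [0, 360)
    }
}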
