Variation in hardware sensor sampling rates on Android (Nexus 6P) - java

I'm developing an Android app for a research project, and I'm reading data from several sensors, like the accelerometer, gyroscope, barometer, etc.
I have 4 Nexus 6P devices, all with the newest factory image and freshly set up, with no apps installed other than the standard pre-installed ones.
The problem is that one of the phones is constantly lagging behind. For example, I record the accelerometer for half an hour at 105 Hz (the maximum possible rate for this accelerometer is 400 Hz), just to make sure I get at least about the number of samples I would expect at 100 Hz, and the results are the following:
Sampling for half an hour at 100 Hz -> 180000 samples (30 × 60 × 100)
Sampling for half an hour at 105 Hz -> 189000 samples (30 × 60 × 105)
(This is just an example for the accelerometer, but the same holds for every other sensor on each device: devices 1, 3 and 4 get about the same good results on the other sensors, while device 2 gets the same bad results on all other sensors.)
Device 1: 180000 Samples
Device 2: 177273 Samples <- the phone that is lagging behind
Device 3: 181800 Samples
Device 4: 179412 Samples
So the problem is device number 2, where I'm missing almost 3000 samples (I know this is complaining at a high level), and my guess is that the cause is probably hardware related. A performance issue I can probably rule out, since it does not matter how many sensors I'm reading, and reading them at 400 Hz works as expected (if wanted, I can also provide the samples for this). I also tried setting the sampling rate to 400 Hz, i.e. the fastest, and then keeping readings according to the timestamp, which led to the same result.
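To make the timestamp approach concrete, the decimation looked roughly like the following sketch (simplified, not the exact code; recordSample is a placeholder for my storage logic):
private long lastKeptTimestampNs = 0L;
private static final long PERIOD_NS = 10_000_000L; // 10 ms -> 100 Hz

@Override
public void onSensorChanged(SensorEvent event) {
    // Listener registered at SENSOR_DELAY_FASTEST;
    // SensorEvent.timestamp is in nanoseconds
    if (event.timestamp - lastKeptTimestampNs < PERIOD_NS) {
        return; // too soon after the last kept sample, drop it
    }
    lastKeptTimestampNs = event.timestamp;
    recordSample(event); // placeholder: store the sample
}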
Just in case, here is how I register the sensor listener:
protected void onCreate(Bundle savedInstanceState) {
    sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
    unaccDataSensor = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER_UNCALIBRATED);
}
....
private void Start() {
    // third argument is the sampling period in microseconds (10000 us ≈ 100 Hz)
    sensorManager.registerListener(unaccDataListener, unaccDataSensor, 10000);
}
What I want is to get at least about the number of samples I would expect: above that is no problem, and just slightly below is also acceptable.
If anyone has an idea what else I can try, or what can cause the problem, I would be really thankful.
This is my first post, so if anything is missing or badly explained, I'm sorry, and I'll try my best to fix it.

I work with Android sensors a lot, and I can tell you the hardware is of variable quality. I usually use a filter if I need the results to be consistent across phones:
// Filter to remove readings that come too often.
// TS is the current event timestamp and LAST_TS_ACC the last accepted one;
// the constant 100 assumes millisecond-scale timestamps (SensorEvent.timestamp
// itself is in nanoseconds), so match it to whatever unit you store.
if (TS < LAST_TS_ACC + 100) {
    //Log.d(TAG, "onSensorChanged: skipping");
    return;
}
However, this means you can only set the phones to match the lowest common denominator. If it helps, I find that anything above 25 Hz is overkill for most applications, even medical ones.
It also helps to make sure any file writes you do happen off the main thread, and in batches, as writing to a file is an expensive operation:
// accelBuffer is created once (e.g. when recording starts), then each
// sensor event is appended to it in onSensorChanged:
accelBuffer.append(LAST_TS_ACC + "," + event.values[0] + "," + event.values[1] + "," + event.values[2] + "\n");

// Once the buffer is large enough, flush it to a file on a background
// thread so the UI/sensor thread is not blocked
if ((accelBuffer.length() > 500000) && !writingAccelToFile) {
    writingAccelToFile = true;
    AccelFile = new File(path2 + "/Acc/" + LAST_TS_ACC + "_Service.txt");
    Log.d(TAG, "onSensorChanged: accel file created at: " + AccelFile.getPath());
    File parent = AccelFile.getParentFile();
    if (!parent.exists() && !parent.mkdirs()) {
        throw new IllegalStateException("Couldn't create directory: " + parent);
    }
    new Thread(new Runnable() {
        @Override
        public void run() {
            writeStringBuilderToFile(AccelFile, accelBuffer);
            accelBuffer.setLength(0);
            writingAccelToFile = false;
        }
    }).start();
}
Doing all of the above has got me reasonably good results, but it will never be perfect due to differences in the hardware.
Good luck!

Related

Get distance by road between two LatLng points in Android

I am creating a blood bank app in which I show the user his current position and the different donors available near him on a map. When the user clicks the blood request button, I show him a list of the donors available nearby. On that list, next to each donor's name, I want to show that donor's distance from the user's current location. Right now I am getting the straight-line distance, which always comes out about 56 km less than the actual distance. For that I am doing this:
donarLat = profiles.getLatitude();
donarLong = profiles.getLongitude();
String distance = "";
if (currentLat != null && currentLong != null && donarLat != null && donarLong != null) {
    origin = new LatLng(currentLat, currentLong);
    dest = new LatLng(donarLat, donarLong);
    float[] result = new float[1];
    // Location.distanceBetween(currentLat, currentLong, donarLat, donarLong, result);
    distance = String.valueOf(SphericalUtil.computeDistanceBetween(origin, dest));
    System.out.println("d9" + profiles.getName() + " : " + distance);
}
I have also computed the distance using Location, as you can see in the commented line, but both give the straight-line distance, while I want the distance by road. I have seen a lot of answers about this on StackOverflow, but they were answered at least 6 years ago, and when I tried them, the app sometimes crashed and sometimes nothing happened. I assume that for distance by road I have to use the Google Directions API, but I don't understand how to use it. I tried the API in Postman, but first it gave me an error telling me to enable the Directions API, and after doing that it asked me to enable a billing method. I am attaching the photo of Postman. I will be really thankful if someone shows me how to use the API properly to get the exact distance by road.
Google APIs are not free now.
Some APIs are free for a trial period but charge after that period ends, and they require an API key with billing info even for the trial.
In your case, your API key is not valid.
Create an API key with billing info from this link and make sure you can use it for a trial period; otherwise you may get charged.
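For reference, a road-distance request against the Directions API could look like the sketch below (assuming a valid, billing-enabled key; the parsing path follows the documented JSON response, where routes[0].legs[0].distance.value is the distance in meters). On Android this must run off the main thread.
import org.json.JSONObject;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Scanner;

public class RoadDistance {
    // Returns the driving distance in meters between two points,
    // using the Google Directions API (requires a billing-enabled key)
    static long roadDistanceMeters(double fromLat, double fromLng,
                                   double toLat, double toLng,
                                   String apiKey) throws Exception {
        String url = "https://maps.googleapis.com/maps/api/directions/json"
                + "?origin=" + fromLat + "," + fromLng
                + "&destination=" + toLat + "," + toLng
                + "&key=" + apiKey;
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        try (InputStream in = conn.getInputStream();
             Scanner s = new Scanner(in).useDelimiter("\\A")) {
            JSONObject json = new JSONObject(s.next());
            // routes[0].legs[0].distance.value is the road distance in meters
            return json.getJSONArray("routes").getJSONObject(0)
                       .getJSONArray("legs").getJSONObject(0)
                       .getJSONObject("distance").getLong("value");
        }
    }
}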

Adding EnvironmentalReverb and PresetReverb to MediaPlayer (wav and m4a formats) but the reverb doesn't apply

I am working on a simple music app in Android, and I have tried adding EnvironmentalReverb and PresetReverb to a MediaPlayer (wav and m4a formats), but the reverb doesn't apply: there is no change when the audio plays. I have checked whether my device supports reverb using the code below, and it does. I have looked at similar questions on StackOverflow, but there isn't an answer that works.
final AudioEffect.Descriptor[] effects = AudioEffect.queryEffects();
// Determine available/supported effects
for (final AudioEffect.Descriptor effect : effects) {
    Log.d("Effects", effect.name.toString() + ", type: " + effect.type.toString());
}
The code used for EnvironmentalReverb and PresetReverb is below.
First try:
// priority 1, audio session 0 (session 0 attaches it as an auxiliary effect)
EnvironmentalReverb eReverb = new EnvironmentalReverb(1, 0);
eReverb.setReverbDelay(85);
eReverb.setEnabled(true);
mMediaPlayer.attachAuxEffect(eReverb.getId());
mMediaPlayer.setAuxEffectSendLevel(1.0f);
Second try:
PresetReverb mReverb = new PresetReverb(1, 0);
mReverb.setPreset(PresetReverb.PRESET_LARGEROOM);
mReverb.setEnabled(true);
mMediaPlayer.attachAuxEffect(mReverb.getId());
mMediaPlayer.setAuxEffectSendLevel(1.0f);
Both calls to setEnabled(true) return 0 (SUCCESS), but neither has any effect on the audio. Can someone please point me in the right direction? I am not sure what is wrong with the implementation.
Answering my own question so it can be helpful for someone else.
I wasn't able to get the PresetReverb to work. The EnvironmentalReverb, however, was working; to confirm it, I had to add seek bars for the room level and reverb level so I could alter them in real time.
EnvironmentalReverb eReverb = new EnvironmentalReverb(0,0);
mMediaPlayer.attachAuxEffect(eReverb.getId());
mMediaPlayer.setAuxEffectSendLevel(1.0f);
I enabled the reverb on click of a button and then used seek bars to change the room level and reverb level.
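For illustration, the seek-bar wiring could look like this (a sketch; eReverb must be a field, the seek bars are assumed to range 0..100, and the value ranges come from the EnvironmentalReverb documentation: room level -9000..0 mB, reverb level -9000..2000 mB):
roomLevelSeekBar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
    @Override
    public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
        // Map progress 0..100 onto -9000..0 millibels
        eReverb.setRoomLevel((short) (-9000 + 90 * progress));
    }
    @Override public void onStartTrackingTouch(SeekBar seekBar) {}
    @Override public void onStopTrackingTouch(SeekBar seekBar) {}
});

reverbLevelSeekBar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
    @Override
    public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
        // Map progress 0..100 onto -9000..2000 millibels
        eReverb.setReverbLevel((short) (-9000 + 110 * progress));
    }
    @Override public void onStartTrackingTouch(SeekBar seekBar) {}
    @Override public void onStopTrackingTouch(SeekBar seekBar) {}
});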

Detecting device presence in a pocket

My application needs to know whether the phone is in a pocket or in the hand; based on that, a few parameters specific to the individual are set before moving on to the next tasks.
I have read various blogs and the SensorManager page on the Android developer site, but none helped me out. The only related link I found on Stack Overflow is this, with no solution, though one comment on that question suggests using the Awareness API. I am going through it; my understanding is that the user's activity is the context for finding this (I may be wrong). If someone has worked on this, or is doing R&D on it, please share your observations; they may help me find a way forward.
Is there any way to find out whether the phone is in a pocket or not? If yes, can somebody tell me how to do that?
Any guidance/links to the concepts are helpful.
Thanks.
I implemented this in my project. I take readings from the light sensor, accelerometer and proximity sensor. Keep in mind that it only approximately detects whether the device is in a pocket.
Getting the current parameters from the sensors (accelerometer, proximity and light sensors):
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        g = event.values.clone();
        // Normalize the gravity vector and derive the inclination angle
        double norm_Of_g = Math.sqrt(g[0] * g[0] + g[1] * g[1] + g[2] * g[2]);
        g[0] = (float) (g[0] / norm_Of_g);
        g[1] = (float) (g[1] / norm_Of_g);
        g[2] = (float) (g[2] / norm_Of_g);
        inclination = (int) Math.round(Math.toDegrees(Math.acos(g[2])));
        accReading.setText("XYZ: " + round(g[0]) + ", " + round(g[1]) + ", " + round(g[2]) + " inc: " + inclination);
    }
    if (event.sensor.getType() == Sensor.TYPE_PROXIMITY) {
        proximityReading.setText("Proximity Sensor Reading: " + String.valueOf(event.values[0]));
        rp = event.values[0];
    }
    if (event.sensor.getType() == Sensor.TYPE_LIGHT) {
        lightReading.setText("LIGHT: " + event.values[0]);
        rl = event.values[0];
    }
    if ((rp != -1) && (rl != -1) && (inclination != -1)) {
        main.detect(rp, rl, g, inclination);
    }
}
Then based on this data I decide whether or not the device is in a pocket:
public void detect(float prox, float light, float g[], int inc) {
    // Covered proximity, low light, screen facing the body and a
    // near-vertical inclination (75..100 degrees) suggest a pocket
    if ((prox < 1) && (light < 2) && (g[1] < -0.6) && ((inc > 75) && (inc < 100))) {
        pocket = 1;
        // IN POCKET
    }
    if ((prox >= 1) && (light >= 2) && (g[1] >= -0.7)) {
        if (pocket == 1) {
            playSound();
            pocket = 0;
        }
        // OUT OF POCKET
    }
}
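For completeness, the three listeners can be registered along these lines (a sketch; it assumes the activity implements SensorEventListener and uses the fields from the snippets above, and the layout name is a placeholder):
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main); // placeholder layout
    SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
    // One listener for all three sensors feeding detect()
    sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
            SensorManager.SENSOR_DELAY_NORMAL);
    sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_PROXIMITY),
            SensorManager.SENSOR_DELAY_NORMAL);
    sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_LIGHT),
            SensorManager.SENSOR_DELAY_NORMAL);
}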
Keep in mind that it's not fully accurate.
Code: https://github.com/IvanLudvig/PocketSword
Blog post: https://ivanludvig.github.io/blog/2019/06/21/detecting-device-in-a-pocket-android.html
The only way we can come somewhat near a solution is by using the two sensors below; the Google Awareness API won't solve the problem, as it has an entirely different purpose.
Light sensor (environment sensor)
Proximity sensor (position sensor)
The Android platform provides four sensors that let you monitor various environmental properties. You can use these sensors to monitor:
relative ambient humidity
illuminance
ambient pressure
ambient temperature
All four environment sensors are hardware-based and are available only if a device manufacturer has built them into a device. With the exception of the light sensor, which most device manufacturers use to control screen brightness, environment sensors are not always available on devices. Because of this, it's particularly important that you verify at run time whether an environment sensor exists before you attempt to acquire data from it.
The light sensor can be used to measure light intensity. For example, many phones have an auto-brightness function; it works off the light sensor and adjusts the screen brightness to the ambient light intensity.
There are several units for measuring light intensity, such as lux, candela and lumen.
Given this, there will be a considerable difference in light intensity between your phone being in a pocket and outside it.
However, the same drop happens when you operate the phone in a dark room, or anywhere the light intensity is quite low, so distinguishing such cases is the real challenge. You can use the other environment sensors in combination with the light sensor to get an effective outcome, but I assume a fully accurate solution is dicey; a rough sketch of this kind of thresholding follows.
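(A minimal illustration only; the lux and proximity cutoffs are assumptions to tune per device, not documented constants:)
// Inside an activity that implements SensorEventListener and has
// registered for Sensor.TYPE_LIGHT and Sensor.TYPE_PROXIMITY
private static final float DARK_LUX_THRESHOLD = 10f;  // assumed cutoff
private static final float COVERED_PROXIMITY = 1f;    // "near" reading

private float lastLux = Float.MAX_VALUE;
private float lastProximity = Float.MAX_VALUE;

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_LIGHT) {
        lastLux = event.values[0];
    } else if (event.sensor.getType() == Sensor.TYPE_PROXIMITY) {
        lastProximity = event.values[0];
    }
    // Dark AND covered is a stronger pocket hint than darkness alone,
    // which also occurs in a dark room
    boolean probablyInPocket =
            lastLux < DARK_LUX_THRESHOLD && lastProximity < COVERED_PROXIMITY;
    Log.d("PocketCheck", "probablyInPocket=" + probablyInPocket);
}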
To study more about these sensors, kindly refer to the following links:
https://developer.android.com/guide/topics/sensors/sensors_environment.html
https://developer.android.com/guide/topics/sensors/sensors_position.html
The Google Awareness API won't work for this case, as it provides an entirely different kind of solution.
It provides two APIs:
Fence API
Snapshot API
You can use the Snapshot API to get information about the user's current environment. Using the Snapshot API, you can access a variety of context signals:
Detected user activity, such as walking or driving.
Nearby beacons that you have registered.
Headphone state (plugged in or not)
Location, including latitude and longitude.
Place where the user is currently located.
Weather conditions in the user's current location.
Using the Fence API, you can define fences based on context signals such as:
The user's current location (lat/lng).
The user's current activity (walking, driving, etc.).
Device-specific conditions, such as whether the headphones are plugged in.
Proximity to nearby beacons.
For a cross-platform solution, you can now use the NumberEight SDK for this task.
It performs a wide variety of context-recognition tasks on both iOS and Android, including:
Real-time physical activity detection
Device position detection (i.e. presence in pocket)
Motion detection
Reachability
Local weather
It can also record user context for reports and analysis via the online portal.
How to detect whether a phone is in a pocket:
For example, to subscribe to device-position updates in Kotlin, you would do:
val ne = NumberEight()

ne.onDevicePositionUpdated { glimpse ->
    if (glimpse.mostProbable.state == State.InPocket) {
        Log.d("MyApp", "Phone is in a pocket!")
    }
}
or in Java:
NumberEight ne = new NumberEight();
ne.onDevicePositionUpdated(
    new NumberEight.SubscriptionCallback<NEDevicePosition>() {
        @Override
        public void onUpdated(@NonNull Glimpse<NEDevicePosition> glimpse) {
            if (glimpse.mostProbable.state == State.InPocket) {
                Log.d("MyApp", "Phone is in a pocket!");
            }
        }
    });
Here are some iOS and Android example projects.
Disclosure: I'm one of the developers.

DJI Phantom 3 Custom Mission App, Delay Between Mission Steps

I am developing an app in Android Studio which pilots the DJI Phantom 3 drone in a certain pattern, taking pictures at certain waypoints. I loaded the DJI sample code into Android Studio, entered an app key in the AndroidManifest.xml file, and modified the "CustomMissionView" code in the "MissionManager" directory to program the drone to fly in a specified pattern. However, when I run this project on the DJI simulator, there is a delay between each of the "steps" of the custom mission; sometimes the drone is idle and hovers for a few seconds without doing anything. I want to know if there is any way to minimize the delay between steps of the custom mission without setting the flight speed. I suspect it has something to do with DJICommonCallbacks.DJICompletionCallback(), but I am not sure. I am a newbie to Android Studio, so any advice would be helpful.
Here is some of the code inside the protected DJIMission method in the "CustomMissionView" Java file:
LinkedList<DJIMissionStep> steps = new LinkedList<DJIMissionStep>();

// Step 1: take off from the ground
steps.add(new DJITakeoffStep(new DJICommonCallbacks.DJICompletionCallback() {
    public void onResult(DJIError error) {
        Utils.setResultToToast(mContext, "Takeoff step: " + (error == null ? "Success" : error.getDescription()));
    }
}));

// Step 2: reset the gimbal to the desired angle
steps.add(new DJIGimbalAttitudeStep(
        DJIGimbalRotateAngleMode.AbsoluteAngle,
        new DJIGimbalAngleRotation(true, -30f, DJIGimbalRotateDirection.Clockwise),
        null,
        null,
        new DJICommonCallbacks.DJICompletionCallback() {
            public void onResult(DJIError error) {
                Utils.setResultToToast(mContext, "Set gimbal attitude step: " + (error == null ? "Success" : error.getDescription()));
            }
        }));

// Step 3: go to 3 meters from the home point
steps.add(new DJIGoToStep(mHomeLatitude, mHomeLongitude, 3, new DJICommonCallbacks.DJICompletionCallback() {
    public void onResult(DJIError error) {
        Utils.setResultToToast(mContext, "Goto step: " + (error == null ? "Success" : error.getDescription()));
    }
}));
The pause between each step is due to how DJI set up custom missions. When you prepare a custom mission, no mission information is sent to the aircraft itself; the mission is built on the device running the app. During execution, one step is sent to the aircraft at a time, and only when that step has completed successfully is the next step sent. This is what causes the pause between steps. If the signal from the remote control to the aircraft becomes weak, the mission can even fail by timing out.
Waypoint missions do not have this pause, because the entire mission is uploaded to the aircraft when it is prepared.
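By way of contrast, a waypoint mission is built up front and uploaded whole. A rough sketch only; the class and field names are assumptions based on the same SDK generation as the code above, so check them against your SDK version:
// Hypothetical sketch: the whole route lives on the aircraft after
// preparation, so there is no per-step round trip to cause pauses
DJIWaypointMission mission = new DJIWaypointMission();
mission.autoFlightSpeed = 5f; // m/s; assumed field name

// Altitudes in meters; constructor signature assumed
mission.addWaypoint(new DJIWaypoint(mHomeLatitude, mHomeLongitude, 3f));
mission.addWaypoint(new DJIWaypoint(mHomeLatitude + 0.0001, mHomeLongitude, 3f));

// Prepare and start through the mission manager as in the sample code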

How to use the accelerometer to detect vibration above or equal to 2 Gs?

I'm building an Android application which is something like a car "black box" that records the car's journey.
The problem I face is how to integrate the accelerometer so it detects movement (probably >= 2 Gs) when an accident occurs; that should trigger the video recording to stop and save the file to an archive folder, so the file is not lost as a result of the accident. Does anyone know how to do the above? I rather urgently need help here, please! I've read the Android developer documentation on the accelerometer and it's not helping my situation: first, I'm rather bad at physics; second, I'm new to Android/Java and this is my first attempt at working with the accelerometer. Any simple solution? Thanks in advance :)
This is part of the video recording section, but how am I going to incorporate the accelerometer for "auto-archiving" purposes?
A couple of points:
The Bosch BMA150 used in many smartphones has 2 g set as its maximum measurable acceleration, so you might never see readings above 2 g.
With SENSOR_DELAY_FASTEST you can take readings about every 20 milliseconds on an HTC Desire. However, since the phone runs a multi-tasking operating system, you cannot guarantee this timing (delays of a couple of seconds can occur when the operating system is busy).
Hence a smartphone is currently not really suitable for this application. If Android allowed smarter use of accelerometers in the future, this could change: if onSensorChanged accepted a threshold parameter, accelerations exceeding that threshold could be buffered in the accelerometer chip's memory and read out when appropriate.
Put your startRecording() method in the Activity below; it is called when acceleration exceeds 2 G, and you can change this by changing the value of CRASH_G_MULTIPLIER:
public class YourActivity extends Activity {

    private double G = 9.81;
    private double CRASH_G_MULTIPLIER = 2;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        startDropListening();
    }

    private void startRecording() {
        // your code that does the recording here
    }

    private void startDropListening() {
        SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
        sm.registerListener(
                new SensorEventListener() {
                    @Override
                    public void onAccuracyChanged(Sensor arg0, int arg1) {}

                    @Override
                    public void onSensorChanged(SensorEvent arg0) {
                        // Magnitude of the acceleration vector (includes gravity)
                        double accel = Math.sqrt(
                                Math.pow(arg0.values[0], 2) +
                                Math.pow(arg0.values[1], 2) +
                                Math.pow(arg0.values[2], 2));
                        if (accel > G * CRASH_G_MULTIPLIER) {
                            startRecording();
                        }
                    }
                },
                sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_NORMAL);
    }
}
There are some things you should be aware of:
As mentioned by others, the multiplier of 2 needs to be increased quite substantially, but you can fine-tune it yourself by experimentation.
You will want to acquire a wake lock. To create one, do this:
PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
wl = pm.newWakeLock(PowerManager.FULL_WAKE_LOCK, "ATag");
wl.acquire();
And the following when you are finished:
wl.release();
(You may want to change FULL_WAKE_LOCK to a lower-priority wake lock. You will also need to add the appropriate wake-lock permission to your manifest file.)
You may wish to increase the sample rate, though this will drain the battery significantly. There are several values you can use; in the code above, replace:
SensorManager.SENSOR_DELAY_NORMAL
with any of these:
SensorManager.SENSOR_DELAY_UI
SensorManager.SENSOR_DELAY_GAME
SensorManager.SENSOR_DELAY_FASTEST
