I know how to get the current GPS location of a mobile phone.
I also know how to save the GPS location to the photo when taking it (camera option on a Samsung Galaxy S2).
But how can I get the GPS location of that photo later?
When I open the photo on a computer, I can see the GPS location data, but I have no idea how to extract it later in Android. Could someone point me in the right direction?
To make the question clearer:
How can I get the GPS location of a photo that has already been taken?
Thanks already,
Bigflow
josnidhin made this answer possible, so be sure to give him credit too :)
Here we go:
import android.media.ExifInterface;
ExifInterface exif = new ExifInterface(filePath);
String lat = ExifInterface.TAG_GPS_LATITUDE;
String lat_data = exif.getAttribute(lat);
After that, lat_data will be something like: 51/1,58/1,32/1
This is the same as 51° 58′ 32″, i.e. degrees, minutes, seconds (typing these raw numbers into Google Maps will give a wrong result).
To get decimal GPS coordinates from this you need to do some math, here it comes:
Calculate the total number of seconds:
58′32″ = (58 * 60 + 32) = 3512 seconds.
The fractional part is the total number of seconds divided by 3600:
3512 / 3600 ≈ 0.975556
Add the fractional degrees to the whole degrees to produce the final result:
51 + 0.975556 = 51.975556
If it is a south latitude or west longitude coordinate, negate the result (it isn't this time).
Answer: 51.975556
The same procedure applies to TAG_GPS_LONGITUDE.
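For completeness, here is a minimal sketch of that conversion in Java. The parsing of the "d/1,m/1,s/1" rational strings and the helper names (exifDmsToDecimal, parseRational) are my own illustration, not part of the original answer; note that ExifInterface also offers getLatLong(float[]), which can do this conversion for you.
// Sketch: convert an EXIF rational DMS string like "51/1,58/1,32/1" plus its
// reference ("N"/"S" or "E"/"W") into decimal degrees. Helper names are hypothetical.
public static double exifDmsToDecimal(String dms, String ref) {
    String[] parts = dms.split(",");              // ["51/1", "58/1", "32/1"]
    double degrees = parseRational(parts[0]);
    double minutes = parseRational(parts[1]);
    double seconds = parseRational(parts[2]);
    double result = degrees + (minutes * 60 + seconds) / 3600.0;
    // South latitudes and west longitudes are negative.
    if ("S".equalsIgnoreCase(ref) || "W".equalsIgnoreCase(ref)) {
        result = -result;
    }
    return result;
}

private static double parseRational(String rational) {
    String[] nd = rational.split("/");            // "58/1" -> 58 / 1
    return Double.parseDouble(nd[0]) / Double.parseDouble(nd[1]);
}
Usage would then be something like:
double latitude = exifDmsToDecimal(
        exif.getAttribute(ExifInterface.TAG_GPS_LATITUDE),
        exif.getAttribute(ExifInterface.TAG_GPS_LATITUDE_REF));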
I think the geotag is in the EXIF data of the photo taken. Find a suitable EXIF data reader which will help you extract the data you want.
I am developing an Android app and I am trying to check whether the user's current location matches one of the addresses that I hold in a CSV file.
For example:
An address from the csv is a String like "Via dei Monti Tiburtini 385, Roma 00157 "
First I get the full current address with the following code:
Geocoder geocoder = new Geocoder(MainActivity.this, locale);
List<Address> addresses = null;
try {
    addresses = geocoder.getFromLocation(location.getLatitude(), location.getLongitude(), 1);
} catch (IOException e) {
    e.printStackTrace();
}
String address = addresses.get(0).getAddressLine(0);
I thought about simply checking whether the strings are equal, but of course that is not a good solution, because the same address can be written in different ways.
For example:
csv file -> "Via dei Monti Tiburtini 385, Roma 00157 "
user's current loc -> "Via dei Monti Tiburtini 385, Roma 00157, ITALY "
Note: my addresses are not in English characters (both the user's current location and the CSV addresses), but for now I would like to find a solution at least for English.
I tried to find something on the web, but I could not find anything similar to this.
I would appreciate any help, thank you.
Checking whether GPS coordinates are exactly equal might be too restrictive, I agree with you. What you could do instead is check whether they are within a certain range, maybe a 0.01% margin around what's in your DB. You can test it on Google Maps and see how much tolerance you think your application can accept.
This page has some details about the numerical values for latitude and longitude:
https://en.wikipedia.org/wiki/Geographic_coordinate_system
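As a rough sketch of that range check (my own illustration, not part of the original answer), assuming you geocode the CSV address with the same Geocoder from the question and compare it to the user's current Location, you could treat the two as a match below some tolerance in metres:
try {
    // Resolve the CSV address string to coordinates; csvAddress and the 200 m
    // tolerance are assumptions for illustration.
    List<Address> csvMatches = geocoder.getFromLocationName(csvAddress, 1);
    if (csvMatches != null && !csvMatches.isEmpty()) {
        Address csvLocation = csvMatches.get(0);
        float[] results = new float[1];
        Location.distanceBetween(
                location.getLatitude(), location.getLongitude(),
                csvLocation.getLatitude(), csvLocation.getLongitude(),
                results);
        boolean sameAddress = results[0] < 200; // distance in metres
    }
} catch (IOException e) {
    e.printStackTrace();
}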
From a user-address perspective, in my experience with shipping applications there is the concept of address standardization, which tries to make addresses conform to a common format so that different systems consistently resolve the same location.
This helps with how people enter their address, for example "apartment #3", "app #3" or "app 3", which could all result in different address strings. There are libraries from UPS and others that provide this.
https://www.ups.com/us/en/services/shipping/connectship.page
This is if you are starting from an address.
If you start from a GPS location, you could use web APIs, for example Google Maps, to find the address of a specific GPS location (reverse geocoding).
https://www.google.com/maps/#44.4257258,-88.1063132
and this:
https://support.google.com/maps/answer/18539?hl=en&co=GENIE.Platform%3DDesktop
I am creating a blood bank app in which I show the user their current position and the different donors available nearby on a map. When the user clicks the blood request button, I show a list of the donors available nearby. In that list, next to each donor's name, I want to show the distance between that donor and the user's current location. Right now I am getting the straight-line distance, which always comes out about 56 km less than the actual road distance. For that I am doing this:
donarLat = profiles.getLatitude();
donarLong = profiles.getLongitude();
String distance = "";
if (currentLat != null && currentLong != null && donarLat != null && donarLong != null) {
    origin = new LatLng(currentLat, currentLong);
    dest = new LatLng(donarLat, donarLong);
    float[] result = new float[1];
    // Location.distanceBetween(currentLat, currentLong, donarLat, donarLong, result);
    distance = String.valueOf(SphericalUtil.computeDistanceBetween(origin, dest));
    System.out.println("d9" + profiles.getName() + " : " + distance);
}
I have also computed the distance using Location (see the commented line in the code), but both approaches give the straight-line distance, whereas I want the distance by road. I have seen a lot of answers about this on Stack Overflow, but they were posted at least six years ago, and when I tried them the app either crashed or nothing happened. I assume that for the distance by road I have to use the Google Directions API, but I don't understand how to use it. I tried the API in Postman; first it gave me an error telling me to enable the Directions API, and after doing that it asked me to enable a billing method. I am attaching a photo of the Postman request. I would be really thankful if someone could show me how to use the API properly to get the exact distance by road.
The Google Maps Platform APIs are not free any more.
Some APIs are free for a trial period but charge you after that period ends, and they require an API key with billing info even to use the trial.
In your case the error means your API key is not valid for the Directions API.
Create an API key with billing info from this link and make sure you stay within the trial period, otherwise you may get charged.
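Once the key is set up, a minimal sketch of fetching the road distance from the Directions API in plain Java might look like the following. The URL format and the routes[0].legs[0].distance.value field (metres) come from the public Directions API documentation; the HttpURLConnection/org.json handling, the API_KEY placeholder and the simplified error handling are my own illustration, and the request must run off the main thread.
// Sketch: ask the Directions API for the driving route between the user and a donor.
// currentLat/currentLong/donarLat/donarLong come from the question; API_KEY is a placeholder.
String url = "https://maps.googleapis.com/maps/api/directions/json"
        + "?origin=" + currentLat + "," + currentLong
        + "&destination=" + donarLat + "," + donarLong
        + "&key=" + API_KEY;
try {
    HttpURLConnection connection = (HttpURLConnection) new URL(url).openConnection();
    BufferedReader reader = new BufferedReader(new InputStreamReader(connection.getInputStream()));
    StringBuilder body = new StringBuilder();
    String line;
    while ((line = reader.readLine()) != null) {
        body.append(line);
    }
    reader.close();

    JSONObject json = new JSONObject(body.toString());
    if ("OK".equals(json.getString("status"))) {
        JSONObject leg = json.getJSONArray("routes").getJSONObject(0)
                .getJSONArray("legs").getJSONObject(0);
        long roadDistanceMeters = leg.getJSONObject("distance").getLong("value"); // metres
        String roadDistanceText = leg.getJSONObject("distance").getString("text"); // e.g. "58 km"
    }
} catch (IOException | JSONException e) {
    e.printStackTrace();
}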
I'm developing an Android app for a research project, and I'm reading several sensors such as the accelerometer, gyroscope, barometer, etc.
I have 4 Nexus 6P devices, all with the newest factory image and freshly set up, with no apps installed other than the pre-installed standard ones.
The problem is that one of the phones is constantly lagging behind. For example, I record the accelerometer for half an hour at 105 Hz (the maximum possible rate for the accelerometer is 400 Hz), just to make sure I get at least roughly the number of samples I would expect at 100 Hz, and the results are the following:
Sampling for half an hour at 100 Hz -> 180000 samples
Sampling for half an hour at 105 Hz -> 189000 samples
(This is just an example for the accelerometer, but the pattern is the same for every other sensor on each device: devices 1, 3 and 4 get about the same good results for the other sensors, while device 2 gets the same bad results on all its other sensors.)
Device 1: 180000 Samples
Device 2: 177273 Samples <- the phone that is lagging behind
Device 3: 181800 Samples
Device 4: 179412 Samples
So the problem is device number 2, where I'm missing almost 3000 samples (I know this is crying at a high level), and my guess is that the problem is probably hardware related. I can probably rule out a performance issue, since it does not matter how many sensors I'm reading, and reading them at 400 Hz works as expected (if wanted, I can also provide the samples for that). I also tried setting the sampling rate to 400 Hz, i.e. the fastest, and then selecting readings according to their timestamps, which led to the same result.
So, just in case, here is how I register the sensor listener:
protected void onCreate(Bundle savedInstanceState) {
    sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
    unaccDataSensor = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER_UNCALIBRATED);
}
....
private void Start() {
    // 10000 us sampling period = 100 Hz
    sensorManager.registerListener(unaccDataListener, unaccDataSensor, 10000);
}
What I want is to get at least roughly the number of samples I would expect; more than that is no problem, and slightly fewer is also acceptable.
If anyone has an idea what else I could try or what might cause the problem, I would be really thankful.
This is my first post, so if anything is missing or badly explained, I'm sorry and I'll do my best to fix it.
I work with Android sensors a lot, and I can tell you the hardware is of variable quality. I usually use a filter if I need the results to be consistent across phones:
// Filter to remove readings that come too often
// (TS is the current event timestamp, LAST_TS_ACC the timestamp of the last accepted reading)
if (TS < LAST_TS_ACC + 100) {
    //Log.d(TAG, "onSensorChanged: skipping");
    return;
}
However, this means you can only set the phones to match the lowest common denominator. If it helps, I find that anything above 25 Hz is overkill for most applications, even medical ones.
It also helps to make sure any file writes are done off the main thread, and in batches, as writing to a file is an expensive operation.
accelBuffer = new StringBuilder();
accelBuffer.append(LAST_TS_ACC + "," + event.values[0] + "," + event.values[1] + "," + event.values[2] + "\n");

if ((accelBuffer.length() > 500000) && !writingAccelToFile) {
    writingAccelToFile = true;
    AccelFile = new File(path2 + "/Acc/" + LAST_TS_ACC + "_Service.txt");
    Log.d(TAG, "onSensorChanged: accel file created at: " + AccelFile.getPath());
    File parent = AccelFile.getParentFile();
    if (!parent.exists() && !parent.mkdirs()) {
        throw new IllegalStateException("Couldn't create directory: " + parent);
    }
    // Write off the UI thread
    new Thread(new Runnable() {
        @Override
        public void run() {
            writeStringBuilderToFile(AccelFile, accelBuffer);
            accelBuffer.setLength(0);
            writingAccelToFile = false;
        }
    }).start();
}
Doing all of the above has got me reasonably good results, but it will never be perfect due to differences in the hardware.
Good luck!
My application needs to know whether the phone is in a pocket or in the hand; based on that, a few user-specific parameters are set before moving on to the next tasks.
I have read various blogs and also the SensorManager page on the Android developer site, but none of them helped me out. The only related link I found on Stack Overflow is this one, with no solution, though one comment on that question suggests using the Awareness API. I am going through it; my understanding is that the user activity is the context used to detect this (I may be wrong). If anyone has worked on this, or is doing R&D on it, please share your observations; they may help me find a way forward.
Is there any way to find out whether the phone is in a pocket? If yes, can somebody tell me how to do that?
Any guidance/links to the concepts are helpful.
Thanks.
I implemented this in my project. I take readings from the light sensor, the accelerometer and the proximity sensor. Keep in mind that it only approximately detects whether the device is in a pocket.
Getting the current parameters from the sensors (accelerometer, proximity and light sensors):
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        g = new float[3];
        g = event.values.clone();
        double norm_Of_g = Math.sqrt(g[0] * g[0] + g[1] * g[1] + g[2] * g[2]);
        g[0] = (float) (g[0] / norm_Of_g);
        g[1] = (float) (g[1] / norm_Of_g);
        g[2] = (float) (g[2] / norm_Of_g);
        inclination = (int) Math.round(Math.toDegrees(Math.acos(g[2])));
        accReading.setText("XYZ: " + round(g[0]) + ", " + round(g[1]) + ", " + round(g[2]) + " inc: " + inclination);
    }
    if (event.sensor.getType() == Sensor.TYPE_PROXIMITY) {
        proximityReading.setText("Proximity Sensor Reading:" + String.valueOf(event.values[0]));
        rp = event.values[0];
    }
    if (event.sensor.getType() == Sensor.TYPE_LIGHT) {
        lightReading.setText("LIGHT: " + event.values[0]);
        rl = event.values[0];
    }
    if ((rp != -1) && (rl != -1) && (inclination != -1)) {
        main.detect(rp, rl, g, inclination);
    }
}
Then based on this data I decide whether or not the device is in a pocket:
public void detect(float prox, float light, float g[], int inc) {
    // In pocket: proximity covered, very little light, phone roughly vertical (inclination between 75 and 100 degrees)
    if ((prox < 1) && (light < 2) && (g[1] < -0.6) && ((inc > 75) && (inc < 100))) {
        pocket = 1;
        //IN POCKET
    }
    if ((prox >= 1) && (light >= 2) && (g[1] >= -0.7)) {
        if (pocket == 1) {
            playSound();
            pocket = 0;
        }
        //OUT OF POCKET
    }
}
Keep in mind that it's not fully accurate.
Code: https://github.com/IvanLudvig/PocketSword
Blog post: https://ivanludvig.github.io/blog/2019/06/21/detecting-device-in-a-pocket-android.html
The Google Awareness API won't solve the problem, as it has an entirely different purpose. The only way we can come somewhat near to a solution is by using:
Light sensor (environment sensor)
Proximity sensor (position sensor)
The Android platform provides four sensors that let you monitor various environmental properties. You can use these sensors to monitor
relative ambient humidity
illuminance
ambient pressure
ambient temperature
All four environment sensors are hardware-based and are available only if a device manufacturer has built them into a device. With the exception of the light sensor, which most device manufacturers use to control screen brightness, environment sensors are not always available on devices. Because of this, it's particularly important that you verify at run time whether an environment sensor exists before you attempt to acquire data from it.
The light sensor can be used to measure light intensity. For example, many mobile phones have an auto-brightness mode; this function uses the light sensor to adjust screen brightness according to the ambient light intensity.
There are several units for measuring light intensity, such as lux, candela and lumen.
Given this, there will be a considerable difference in light intensity when the phone is in a pocket compared to outside it.
However, the same readings occur when you are using the phone in a dark room, or anywhere the light intensity is quite low; distinguishing between such cases is the real challenge. You can use the other environment sensors in combination with the light sensor to reach a more reliable outcome, but I assume a fully accurate solution is unlikely.
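As a minimal sketch of the runtime check mentioned above (my own illustration, using the standard SensorManager API), you would verify that the light sensor exists before registering for its readings:
SensorManager sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
Sensor lightSensor = sensorManager.getDefaultSensor(Sensor.TYPE_LIGHT);

if (lightSensor != null) {
    sensorManager.registerListener(new SensorEventListener() {
        @Override
        public void onSensorChanged(SensorEvent event) {
            float lux = event.values[0]; // ambient light level in lux
            // Combine with proximity and accelerometer readings to guess "in pocket"
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }, lightSensor, SensorManager.SENSOR_DELAY_NORMAL);
} else {
    // No light sensor on this device; fall back to other signals
}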
To study more about these sensors kindly refer to following links
https://developer.android.com/guide/topics/sensors/sensors_environment.html
https://developer.android.com/guide/topics/sensors/sensors_position.html
The Google Awareness API won't work for this case, as it provides an entirely different kind of solution.
It provides two APIs:
Fence API
Snapshot API
You can use the Snapshot API to get information about the user's current environment. Using the Snapshot API, you can access a variety of context signals (a small sketch follows the two lists below):
Detected user activity, such as walking or driving.
Nearby beacons that you have registered.
Headphone state (plugged in or not)
Location, including latitude and longitude.
Place where the user is currently located.
Weather conditions in the user's current location.
Using the Fence API, you can define fences based on context signals such as:
The user's current location (lat/lng)
The user's current activity (walking, driving, etc.).
Device-specific conditions, such as whether the headphones are plugged in.
Proximity to nearby beacons.
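For illustration only (this does not solve the in-pocket question), a minimal Snapshot API call to read the detected activity could look roughly like the sketch below, based on the public com.google.android.gms.awareness client; it assumes the activity recognition permission has been granted.
Awareness.getSnapshotClient(this).getDetectedActivity()
        .addOnSuccessListener(response -> {
            ActivityRecognitionResult result = response.getActivityRecognitionResult();
            DetectedActivity probable = result.getMostProbableActivity();
            Log.d("Awareness", "Activity: " + probable.getType()
                    + ", confidence: " + probable.getConfidence());
        })
        .addOnFailureListener(e -> Log.e("Awareness", "Could not get activity", e));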
For a cross-platform solution, you can now use the NumberEight SDK for this task.
It performs a wide variety of context recognition tasks on both iOS and Android including:
Real-time physical activity detection
Device position detection (i.e. presence in pocket)
Motion detection
Reachability
Local weather
It can also record user context for reports and analysis via the online portal.
How to detect whether a phone is in a pocket:
For example, to get device position updates in Kotlin, you would do:
val ne = NumberEight()
ne.onDevicePositionUpdated { glimpse ->
if (glimpse.mostProbable.state == State.InPocket) {
Log.d("MyApp", "Phone is in a pocket!")
}
}
or in Java:
NumberEight ne = new NumberEight();
ne.onDevicePositionUpdated(
    new NumberEight.SubscriptionCallback<NEDevicePosition>() {
        @Override
        public void onUpdated(@NonNull Glimpse<NEDevicePosition> glimpse) {
            if (glimpse.mostProbable.state == State.InPocket) {
                Log.d("MyApp", "Phone is in a pocket!");
            }
        }
    });
Here are some iOS and Android example projects.
Disclosure: I'm one of the developers.
I am a beginner in android app development and was trying to build a simple MCQ quiz app.
What I did was make a two-dimensional array and store the questions, the possible answers and the correct solution in it.
A sample from the table can be seen in this image:
So I named my array database. The code for creating this array, called database[][], is below:
database = new String[][]{
{"Before marking the finishing line on a running track, a groundsman measures out its 100 m length. Which instrument is the most appropriate for this purpose?",
"measuring tape","metre rule","30 cm ruler","micrometer", "A"},
{"A car of mass 1500 kg travels along a horizontal road. It accelerates steadily from 10 m / s to 25 m / s in 5.0 s. What is the force needed to produce this acceleration?",
"300N","500N","4500N","D.7500N", "C"},
{"A lorry of mass 10 000 kg takes 5000 kg of sand to the top of a hill 50 m high, unloads the sand and then returns to the bottom of the hill. The gravitational field strength is 10 N / kg. What is the overall gain in potential energy?",
"250 000 J","750 000 J","2 500 000 J","D.7 500 000J", "C"},
{"A liquid-in-glass thermometer contains mercury. Which physical property of the mercury varies with temperature, enabling the thermometer to operate?",
"mass","melting point","resistance","volume", "D"},
{"Thermal energy of 12 000 J is supplied to a 2.0 kg mass of copper. The specific heat capacity of copper is 400 J / (kg °C). What is the rise in temperature?",
"15 Degree C","30 Degree C","60 Degree C","100 Degree C", "A"},
};
So each row is basically a new question with its own set of possible answers.
As for the interface, there is a textview that shows the question. There are four buttons which show each of the answers. You click on a button to answer. Then the next question is showed.
textviewQuestion.setText(database[x][0]); // the question text is at index 0
buttonA.setText("A. " + database[x][1]);
buttonB.setText("B. " + database[x][2]);
buttonC.setText("C. " + database[x][3]);
buttonD.setText("D. " + database[x][4]);
Now my question is: if I want to make this app more versatile, is it better to implement the questions some other way? Maybe I could store the questions in a text file? Or should I use SQLite? I don't know the pros and cons, and basically what the limitations are.
I have the questions in PDF format, so can I somehow link those questions directly from the PDF to the app?
Also, I want to be able to ask questions that include images. How can I achieve that? Please help me out by pointing me to some good resources. Thanks a lot!
The following code helps with opening a PDF in your app:
File pdfFile = new File(path);
if (pdfFile.exists()) {
    Uri pdfUri = Uri.fromFile(pdfFile);
    Intent pdfIntent = new Intent(Intent.ACTION_VIEW);
    pdfIntent.setDataAndType(pdfUri, "application/pdf");
    pdfIntent.setFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP);
    try {
        startActivity(pdfIntent);
    } catch (ActivityNotFoundException e) {
        // No app installed that can open a PDF
        Toast.makeText(uractivity.this, "No application found to open PDF", Toast.LENGTH_LONG).show();
    }
}