I'm sure most of you have used an Android phone to take a picture. Whenever the user moves the phone and then holds it steady, the camera refocuses automatically. I'm having a hard time replicating this in my app: the autoFocus() method is called only once, when the application launches. I have been searching for a solution for the past three days, and while reading the Google documentation I stumbled upon the sensor method calls (such as when the user tilts the device forwards or backwards). I could use that API to achieve what I need, but it sounds too dirty and too complicated. I'm sure there's another way around it.
All the examples I have found on the internet only focus when the user presses the screen or a button. I have also gone through several questions on SO hoping to find what I am looking for, but I was unsuccessful. I have seen this question, and the focus-mode String it uses is not supported on my phone; for some reason the only focus modes I can use are fixed and auto.
I was hoping someone here would shed some light on the subject because I am at a loss.
Thank you very much for your time.
Since API 14 you can set this parameter:
http://developer.android.com/reference/android/hardware/Camera.Parameters.html#FOCUS_MODE_CONTINUOUS_PICTURE
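For example, a minimal sketch using the legacy android.hardware.Camera API, assuming camera is an already-opened Camera instance with a running preview (not every device reports this mode as supported, so check first):

Camera.Parameters params = camera.getParameters();
// Only switch modes if the device actually supports continuous picture focus.
if (params.getSupportedFocusModes()
        .contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE)) {
    params.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
    camera.setParameters(params);
}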
Yes, camera.autoFocus(callback) is a one-time function. You will need to call it in a loop to autofocus continuously. Preferably you would add motion detection via the accelerometer or compass to detect when the camera is moved.
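A rough sketch of that loop, assuming it lives in an Activity holding the open camera; the Handler and the two-second delay are illustrative choices, not requirements:

private final Handler handler = new Handler();

private final Camera.AutoFocusCallback focusLoop = new Camera.AutoFocusCallback() {
    @Override
    public void onAutoFocus(boolean success, final Camera camera) {
        // Schedule the next focus pass once the current one completes.
        handler.postDelayed(new Runnable() {
            @Override
            public void run() {
                camera.autoFocus(focusLoop);
            }
        }, 2000);
    }
};

// Kick off the loop once the preview is running:
camera.autoFocus(focusLoop);

// Stop it with handler.removeCallbacksAndMessages(null) and camera.cancelAutoFocus().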
Related
I am using a phone with three back cameras (Realme GT Pro 2) and I want to access the one with the widest FOV. I am currently trying to implement this with the Multi-camera API, but it's fairly confusing and I haven't been able to find a solution. Can anyone give me some tips on how to access a specific back camera and display its stream?
I can get the physical camera IDs with CameraCharacteristics.getPhysicalCameraIds(), but how can I use them to open the correct (physical) camera? I also know that some manufacturers haven't yet implemented/allowed this access.
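For reference, this is the rough, untested sketch I have pieced together so far from the Camera2 documentation (API 28+). It estimates each physical camera's horizontal FOV as 2 * atan(sensorWidth / (2 * focalLength)), keeps the widest back-facing one, and binds the stream to it; previewSurface is a placeholder for my own Surface, and error handling is omitted:

String logicalId = null;
String widestPhysicalId = null;
double widestFov = 0;
// manager is a CameraManager; getCameraIdList() throws CameraAccessException.
for (String id : manager.getCameraIdList()) {
    CameraCharacteristics chars = manager.getCameraCharacteristics(id);
    Integer facing = chars.get(CameraCharacteristics.LENS_FACING);
    if (facing == null || facing != CameraCharacteristics.LENS_FACING_BACK) continue;
    for (String physicalId : chars.getPhysicalCameraIds()) {
        CameraCharacteristics phys = manager.getCameraCharacteristics(physicalId);
        float focalLength = phys.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS)[0];
        SizeF sensorSize = phys.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE);
        double fov = 2 * Math.atan(sensorSize.getWidth() / (2 * focalLength));
        if (fov > widestFov) {
            widestFov = fov;
            logicalId = id;
            widestPhysicalId = physicalId;
        }
    }
}
// Open logicalId as usual, then bind the preview stream to the physical camera
// when building the capture session:
OutputConfiguration config = new OutputConfiguration(previewSurface); // previewSurface: placeholder
config.setPhysicalCameraId(widestPhysicalId);
// Pass config inside a SessionConfiguration to cameraDevice.createCaptureSession(...).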
I'm learning about Android development. Let's say I want to listen to Spotify music in the background while simultaneously listening to a spoken-word podcast through some other podcast app. I've tried creating a SoundPool.Builder object and changing maxStreams to 2 when I hit a ToggleButton. However, when I run the app it makes no difference: either Spotify has the focus or the podcast app has focus.
Should I be using the AudioManager class instead, so that I can eventually control the volume of each stream independently? Also, would the phone have to be rooted to change maxStreams to 2?
I think you should check this example: MixingAudioInputStream.java
It's an example taken from here.
Check these out and try mixing both streams into a single stream yourself; trying to code new things is the best way to learn.
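The heart of that example is just summing the samples of both inputs. A minimal sketch of that core idea for two 16-bit PCM buffers, assuming they share the same sample rate and channel count:

// Mix two 16-bit PCM buffers by summing the samples and clamping
// to the short range so overflow does not wrap around as distortion.
static short[] mix(short[] a, short[] b) {
    short[] out = new short[Math.min(a.length, b.length)];
    for (int i = 0; i < out.length; i++) {
        int sum = a[i] + b[i];
        if (sum > Short.MAX_VALUE) sum = Short.MAX_VALUE;
        if (sum < Short.MIN_VALUE) sum = Short.MIN_VALUE;
        out[i] = (short) sum;
    }
    return out;
}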
I would like to determine the distance of an object from my Nexus 5 camera, preferably without using an object like a coin for scale. I figured the Camera.Parameters getFocusDistances() function would work for this.
I attempted to do this with something like the following in my takePicture() JPEG callback:
Camera.Parameters params = camera.getParameters();
// getFocusDistances() fills a three-element array: near, optimal, far (meters).
float[] focusDistances = new float[3];
params.getFocusDistances(focusDistances);
I tried running this a few times with objects at different distances from the camera, but each time focusDistances[FOCUS_DISTANCE_NEAR_INDEX], focusDistances[FOCUS_DISTANCE_OPTIMAL_INDEX], and focusDistances[FOCUS_DISTANCE_FAR_INDEX] all contained positive infinity.
It's possible I'm doing something wrong, in which case please let me know if there is a specific way in which this will work on the Nexus 5. However, the Android API specifically states that you can call getParameters() (and then getFocusDistances()) at any time to get the latest focus distances, so I think this should work. One thing I haven't tried yet is doing the above in an onAutoFocus handler, but I don't see why that should matter.
I did some research to see what was going on and found several questions about this sort of behavior from getFocusDistances(); typically the answer, if there was one, was that the function is not supported by the Android API and/or the hardware manufacturer. A lot of these discussions were from several years ago, and despite the doubts they raise about getFocusDistances(), I've still seen the function suggested for getting the focus distance, so I figure it must work on SOME device for SOME Android API version.
Does anybody know if getFocusDistances() works for any particular version of Android on the Nexus 5? If not, does anybody know ANY device it does work on?
EDIT:
Since posting, I have tried obtaining the focus distances in the onAutoFocus handler, as well as testing more extensively with objects at various distances. The results have been consistent: positive infinity is always returned for all three focus distances (NEAR, OPTIMAL, and FAR). I even tried this with a Nexus 7, and getFocusDistances() always returns the same constant values (0.95, 1.9, and infinity), so apparently getFocusDistances() isn't properly implemented on that device either.
Therefore, I really have two questions:
Is there any way to get somewhat accurate focus distances using the Android Camera API on the Nexus 5? I'm even wondering if there is a custom Android build where getFocusDistances() is actually implemented, since otherwise I may attempt to implement it myself, depending on what I find when examining the API code.
Are there any Android devices that are known to implement getFocusDistances() in a somewhat accurate manner?
First of all, it's very difficult to measure object distance from a single shot/view. You will find many research papers that try to employ vision-based techniques to judge object distance; I can refer you to one such paper. Its authors tried to implement a positioning system working solely from a mobile camera plus sensors, and it shows how non-trivial it is to measure object distance from a single camera view. They ultimately used a vision technique called "structure from motion" to calculate distance from multiple photos taken from multiple angles.
Even traditional apps like SmartDistance and SmartMeasure need geometric tricks to measure distance; none of them relies on camera parameters alone. Sorry for the long introduction; I have done a project of this sort before, and I am telling you all this from experience.
To answer your query: I haven't yet found any Android device that returns realistic focus-distance values. They are either reported as constant values, or sometimes as 0 and infinity. I found one report of it working on a Galaxy Nexus, but only for object distances within 30 cm; beyond that it fails. The bottom line is that you cannot rely on this camera API function, as it is heavily dependent on the device drivers. Phone cameras are also not known for their lens/sensor quality, so it would be very difficult to build an optics-based formula for them. I would suggest you instead go for sensor-based geometric tricks, like the sketch below.
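To illustrate the kind of geometric trick I mean, here is a minimal, untested sketch: hold the phone at a known height, aim the camera at the point where the object meets the ground, and compute distance = height * tan(tilt) using the rotation-vector sensor. CAMERA_HEIGHT_M is an assumed constant you would have to calibrate, and the sign handling depends on how the device is held:

public class DistanceEstimator implements SensorEventListener {
    private static final float CAMERA_HEIGHT_M = 1.5f; // assumption: phone at eye level
    private final float[] rotationMatrix = new float[9];
    private final float[] orientation = new float[3];
    private volatile double distanceMeters;

    // Register this listener for Sensor.TYPE_ROTATION_VECTOR events.
    @Override
    public void onSensorChanged(SensorEvent event) {
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        SensorManager.getOrientation(rotationMatrix, orientation);
        // orientation[1] is pitch in radians; negating it approximates the
        // back camera's tilt away from straight down (device-dependent).
        double tilt = -orientation[1];
        if (tilt > 0 && tilt < Math.PI / 2) {
            distanceMeters = CAMERA_HEIGHT_M * Math.tan(tilt);
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}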
I was wondering if there is a way to detect whether the user's device is being "obstructed" by a building or roof of some sort. I'm developing a very precise location-based app, and it's KEY that my users get alerted if something is wrong with their GPS or if a physical object is getting in the way.
EDIT: The app I've created strictly takes snapshots; it's not something that runs constantly. Just a quick snapshot.
Not directly. You can try calling LocationManager.getGpsStatus() every so often, iterating over the list of satellites, and looking for a jump in signal-to-noise ratio since the last reading. Getting a working algorithm is going to take a good amount of work and testing on a variety of devices with different GPS chips.
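A sketch of the sampling step using the (since-deprecated) GpsStatus API; averaging the SNR and comparing successive readings are my own illustrative choices, and any thresholds would need per-device tuning:

LocationManager lm = (LocationManager) getSystemService(Context.LOCATION_SERVICE);
GpsStatus status = lm.getGpsStatus(null); // requires ACCESS_FINE_LOCATION
float totalSnr = 0;
int usedInFix = 0;
for (GpsSatellite sat : status.getSatellites()) {
    if (sat.usedInFix()) {
        totalSnr += sat.getSnr();
        usedInFix++;
    }
}
float avgSnr = usedInFix > 0 ? totalSnr / usedInFix : 0;
// Compare avgSnr with the previous sample: a sudden large drop, or very few
// satellites used in the fix, hints that the sky view may be obstructed.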
I'm writing an app for Android 2.1 that needs to know which direction (N/W/S/E) the device (an HTC Hero) is facing. The sensor and its listener are working great, but the values I get from the sensor are way off: for example, it tells me I'm facing north when the device is actually facing southwest.
This seems to be a known problem with Android devices. The "solutions" I found on the web look like this:
shake the device around
move the device like an eight
tap on the device's back
These moves are supposed to trigger the sensor's recalibration, and the "moving around" trick does work for me... but that's not very professional, I guess...
So: how do I trigger recalibration of the orientation sensor from the SDK? I need the sensor to be properly calibrated without any fancy moves that would make users of this app look like complete idiots while they "manually" recalibrate their phones...
Is there any way to do this "right"?
EDIT:
Or: is there any way to determine PROGRAMMATICALLY whether the device is correctly calibrated? As a fallback option, so to speak... then I could at least warn the user that the device needs "manual" recalibration.
I don't believe there is a way to know programmatically whether your compass sensor is calibrated correctly unless you use a secondary data source like GPS. If you can use GPS, then while the user is moving you can compare the GPS movement with the compass heading and correct accordingly. Remember that local magnetic fields can throw off the compass readings, and the device has no idea whether you are out in the middle of a forest or next to a transformer.
With these micro devices there is always a bit of skew you'll have to deal with. If you check the accelerometer values as well, you'll see that at rest they don't always return 9.8 m/s^2 (or at least not consistently between devices).
In your help text you may just need to tell the user to rotate/twist their phone in a figure eight to reset the compass.
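A sketch of the GPS cross-check mentioned above; the 2 m/s speed gate and the 30-degree threshold are guesses you would need to tune:

// Compare the GPS track with the compass azimuth while the user is moving.
void checkCompassAgainstGps(Location location, float compassAzimuthDegrees) {
    if (!location.hasBearing() || location.getSpeed() < 2.0f) {
        return; // GPS bearing is unreliable when stationary or slow
    }
    float diff = Math.abs(location.getBearing() - compassAzimuthDegrees);
    if (diff > 180f) diff = 360f - diff; // shortest angular distance
    if (diff > 30f) {
        // Persistent disagreement: warn the user to recalibrate the compass.
    }
}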
I assume you are referring to the magnetometer inside the Hero.
Calibrating it is tough and will/should always require user interaction for a reliable calibration. There are different strategies to deal with that. You could ask users to hold their device facing north and then recalibrate. If the users don't know where north is, you can ask them to point the device towards the sun; based on location and time, you can calculate where that is.
Leaving calibration aside, I would guess that your problem is that the readings you get from the sensor are inaccurate. Calibration is of course a prerequisite for accurate readings, but there are other factors at play as well.
It is common practice to complement data from one sensor with data from another sensor to increase accuracy. You could use GPS to determine a heading while the user is moving, though this is inaccurate when they move slowly. You could also integrate the accelerometer readings to estimate orientation changes (not the absolute orientation), although a gyroscope is better suited to that, as in the sketch below.
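A sketch of that idea with a gyroscope, assuming the device stays roughly flat so that rotation around the z-axis approximates heading change; gyro drift accumulates, so this only bridges short gaps between trusted compass or GPS fixes:

// Part of a SensorEventListener registered for Sensor.TYPE_GYROSCOPE.
private long lastTimestampNs = 0;
private float headingOffsetRad = 0;

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_GYROSCOPE) return;
    if (lastTimestampNs != 0) {
        float dt = (event.timestamp - lastTimestampNs) * 1.0e-9f; // ns -> s
        headingOffsetRad += event.values[2] * dt; // angular rate around z, rad/s
    }
    lastTimestampNs = event.timestamp;
}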
Systems that work like this are sometimes called Inertial Navigation Systems (INS) because, given a fixed point in space, they can determine their subsequent relative position and orientation accurately without further external data. Using a Kalman filter is common practice to recalibrate the system from time to time when an absolute position (e.g. retrieved via GPS) becomes available.
Although it is unrealistic to implement a full-fledged INS, you can certainly draw a few ideas from how they work to make your orientation readings more accurate.