I have almost finished writing a color detector app for Android.
However, I am having a problem with very close shots: both my Nexus One and Desire are unable to auto-focus properly at such short distances. I have tried pretty much all of the camera parameters to no avail. Is it possible to let the user focus the lens manually?
I tried implementing the zoom trick (move the phone back, zoom in, and then auto-focus); while it works in the native camera app, I am unable to get it working in my app.
All advice appreciated. :)
Basically, you cannot focus on items that are close by (macro mode) using the Nexus One or the Desire, or for that matter most phones. I have yet to find a phone that can, though I am not discounting that some may exist :-)
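For what it's worth, this is roughly how you would ask for macro focus with the old android.hardware.Camera API; treat it as a sketch, since on the Nexus One and Desire macro is generally not reported as supported and the check below will simply skip it:
import android.hardware.Camera;
import java.util.List;
// Sketch: request macro focus only if the hardware advertises it.
Camera camera = Camera.open();
Camera.Parameters params = camera.getParameters();
List<String> modes = params.getSupportedFocusModes();
if (modes != null && modes.contains(Camera.Parameters.FOCUS_MODE_MACRO)) {
    params.setFocusMode(Camera.Parameters.FOCUS_MODE_MACRO);
    camera.setParameters(params);
    camera.autoFocus(new Camera.AutoFocusCallback() {
        @Override
        public void onAutoFocus(boolean success, Camera cam) {
            // success is false when the lens cannot resolve a subject this close
        }
    });
}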
I'm writing a highly specialized application for the Kyocera E6820 that requires the device's wide-angle camera and the flashlight to be on simultaneously.
I'm using the Android Camera2 API. The problem is that the wide-angle camera doesn't support flash, so if I try to turn on the flashlight and the wide-angle camera at the same time by doing the following:
mPreviewRequestBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_TORCH);
mPreviewRequest = mPreviewRequestBuilder.build();
mCaptureSession.setRepeatingRequest(mPreviewRequest, mCaptureCallback, mBackgroundHandler);
I get an error that says:
W/LegacyRequestMapper: mapAeAndFlashMode - Ignore flash.mode == TORCH;camera does not support it
I tried to get around this by turning on the flash on the regular rear-facing camera (which does support flash) without opening it, by doing the following:
manager.openCamera(wideAngleCameraID, mStateCallback, mBackgroundHandler);
manager.setTorchMode(rearFacingCameraID, true);
However, it didn't work, and I got this message:
W/System.err: android.hardware.camera2.CameraAccessException: The system-wide limit for number of open cameras has been reached, and more camera devices cannot be opened until previous instances are closed.
I can get the normal rear-facing camera and the flashlight to go on simultaneously, but no such luck with the wide-angle one.
Does anyone know a potential way around this problem? I've already looked through this post but didn't find anything useful. Since this is such a specialized application it is not out of the question to do something like rooting the phone or talking directly to the device's LED drivers, but I would need to know where to start. Any direction or help would be appreciated.
For anyone viewing, I found the answer after looking at this forum discussion. For me, the brightness value was located in the
/sys/class/leds/led:torch_0
directory. After rooting the phone, I was able to manipulate the value by invoking an adb command using the process given here.
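If you want to do the same thing from inside the app instead of over adb, you can shell out to su and write the value directly. This is only a sketch: the exact node (I am assuming /sys/class/leds/led:torch_0/brightness) and the accepted value range differ between devices, and it requires root.
import java.io.DataOutputStream;
import java.io.IOException;
// Sketch: write the torch brightness sysfs node as root.
void setTorchBrightness(int value) throws IOException, InterruptedException {
    Process su = Runtime.getRuntime().exec("su");
    DataOutputStream os = new DataOutputStream(su.getOutputStream());
    os.writeBytes("echo " + value + " > /sys/class/leds/led:torch_0/brightness\n");
    os.writeBytes("exit\n");
    os.flush();
    su.waitFor(); // 0 means the shell ran the commands; write 0 to turn the torch back off
}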
I have seen new Android devices coming out that show things (a clock, etc.) once the display "turns off". That feature is called Always-On Display, and since my Samsung Galaxy S6 edge already has an AMOLED screen and a night clock, I believe there is a way to make it show something while the display is "turned off". Is there a way, using Java on Android, to display something once the display is meant to turn off? (Just a normal GUI; I could do the rest myself.)
In other words, can you tell your app to show something when the screen turns off that is still somehow visible, without root permissions?
That would be useful, thanks in advance.
UPDATE:
I found some apps on the Google Play Store that seem to do what I want (not quite as specialized, though):
https://play.google.com/store/apps/details?id=com.thsoft.glance&hl=de
https://play.google.com/store/apps/details?id=com.orthur.always_on_display&hl=de
So it is definitely possible; I just need to know how.
If you are not using root, then you can only use the Android APIs. Here, for example, is the reference for the display: http://developer.android.com/reference/android/view/Display.html
I don't see anything there for the Ambient Display mode or Always-On.
Samsung also provides APIs for the features of its phones here: http://developer.samsung.com/galaxy There I can find the Look API, which has something close to what you want, but for the Edge screen.
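Without root or a vendor API, the Play Store apps you linked generally fake the effect: they listen for the screen turning off and then show their own dark, heavily dimmed activity. A minimal sketch of that approach (ACTION_SCREEN_OFF has to be registered in code, not in the manifest, and GlanceClockActivity is a hypothetical activity you would write yourself):
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
// Sketch: when the screen goes off, bring up our own black "always-on" clock activity.
BroadcastReceiver screenOffReceiver = new BroadcastReceiver() {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (Intent.ACTION_SCREEN_OFF.equals(intent.getAction())) {
            Intent clock = new Intent(context, GlanceClockActivity.class); // hypothetical
            clock.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
            context.startActivity(clock);
        }
    }
};
registerReceiver(screenOffReceiver, new IntentFilter(Intent.ACTION_SCREEN_OFF));
The activity would then keep the screen on at minimal brightness via its window attributes, which is why such apps use noticeably more battery than a true Always-On Display.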
I am currently developing an Android application for visually impaired people to detect and scan barcodes. I am going to use ZXing (scanning via intent) for decoding the barcode.
The problem now is that I need to use the camera to detect the presence of a barcode around the scanning region and alert the user via vibration or a speech synthesizer. I am kind of stuck in this area; I have tried borrowing books from the library and searching Google for edge-detection examples, but to no avail :(
Can any kind soul please give me some advice in this area?
Thanks in advance!
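For reference, the intent-based ZXing integration I am planning looks roughly like this (it assumes the ZXing Barcode Scanner app is installed to handle the SCAN action):
// Sketch: delegate the actual decoding to the ZXing Barcode Scanner app.
Intent intent = new Intent("com.google.zxing.client.android.SCAN");
intent.putExtra("SCAN_MODE", "ONE_D_MODE"); // restrict to 1D retail barcodes
startActivityForResult(intent, 0);

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == 0 && resultCode == RESULT_OK) {
        String contents = data.getStringExtra("SCAN_RESULT");
        // announce contents via TextToSpeech or vibration here
    }
}
The open part is everything before that: detecting that a barcode is even in view and steering the user toward it.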
There are other scanners on the market that use the full picture to detect barcodes. I know that NeoReader does not have this kind of scanning region.
It may be easier to find an API for a different scanner than to detect barcodes yourself (don't forget to focus...). Even if you find a way to detect codes, you still have to guide visually impaired users to move the phone so that the code ends up inside the scanning area.
I want to understand how the Android OS figures out which home screen the user is currently viewing and renders the appropriate icons and widgets on that screen based on the user's left or right swipes on the device's touch screen.
The OS must save the state of each screen, along with IDs or something referencing the objects placed on it, in order to restore that state each time the screen becomes visible.
From my research I understand that the Android OS treats all of the 7-8 home screens on a device as one single host.
My question might seem vague, but the reason I am asking is that it seems reasonable for app widgets on Android devices to update not only when the phone is awake but also only when the widget itself is visible. I know that Google has declined this enhancement request from many others, but I don't think that is good enough. Link here.
That is why I am trying to understand and implement this for my app using whatever Android knows about the state of the home screens.
Any help or insight is much appreciated. Also, experts out there, let me know whether you think this can even be implemented for one-off apps at all.
Well, as the link you posted clearly states, there's no way to know.
Also, if you consider the fact that "Home" is just an application like all the others, it makes even less sense to have a unified API for that. A lot of people use Launcher Pro or similar applications, which would probably not implement it.
People have been experiencing a problem with my Android app. Apparently, on phones such as the Droid X and the Cliq running Cupcake, the main menu does not show the letters on the buttons, and in some cases certain sections force close. My buttons do have picture backgrounds, if that could be the cause, and the app is set for version 1.6. I am not sure whether either has anything to do with it. If anyone has experienced this or has any ideas, I would appreciate some help. Thanks a lot.
It should work on every phone if you read this post carefully
http://developer.android.com/guide/practices/screens_support.html
The recommended list from the "Best practices" section is:
Use wrap_content, fill_parent, or the dp unit (instead of px), when specifying dimensions in an XML layout file
Do not use AbsoluteLayout
Do not use hard-coded pixel values in your code (see the sketch after this list)
Use density and/or resolution specific resources
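For instance, when you need a pixel size at runtime, convert it from dp instead of hard-coding a pixel number. A small sketch (myButton is just a placeholder view):
import android.util.TypedValue;
// Sketch: 48dp converted to pixels for the current screen density, instead of a hard-coded px value.
int paddingPx = (int) TypedValue.applyDimension(
        TypedValue.COMPLEX_UNIT_DIP, 48, getResources().getDisplayMetrics());
myButton.setPadding(paddingPx, paddingPx, paddingPx, paddingPx);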
Every Android developer should read that post to support multiple screens.
Hope this helps