I would like to build a simple camera intent to take a short video. However, on the Nexus 6P, after the user records the video they are presented with three controls. The large blue centre control works as intended, dismissing the camera, but the two smaller controls to its left and right crash the camera without producing any debug messages that would help figure out a fix. Is there a way to hide or disable these controls, as they are not needed for my specifications? I have attached two screenshots below illustrating the issue; any help would go a long way, thanks.
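For context, the launching code itself is minimal; the confirmation screen (and its three controls) belongs to the device's camera app, and as far as I know there is no intent extra to hide or disable those controls. A sketch of the launch, where `REQUEST_VIDEO_CAPTURE` is an arbitrary request code I chose for this example:

```java
// Sketch: launch the device's camera app to record a short video.
static final int REQUEST_VIDEO_CAPTURE = 1;

private void dispatchTakeVideoIntent() {
    Intent takeVideoIntent = new Intent(MediaStore.ACTION_VIDEO_CAPTURE);
    // Optional hints; how (and whether) the camera app honours them
    // varies by device:
    takeVideoIntent.putExtra(MediaStore.EXTRA_DURATION_LIMIT, 30); // seconds
    takeVideoIntent.putExtra(MediaStore.EXTRA_VIDEO_QUALITY, 1);   // 0 = low, 1 = high
    if (takeVideoIntent.resolveActivity(getPackageManager()) != null) {
        startActivityForResult(takeVideoIntent, REQUEST_VIDEO_CAPTURE);
    }
}
```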
I'm trying to figure out when a device is getting ready to switch off its screen.
For example, I have a Samsung Galaxy S22 - when I press the power button the phone screen is turned off. I can capture this using a BroadcastReceiver which is ok.
However, before the screen is turned off, the screen actually fades to black. I am building an app that needs to seamlessly show itself for 5 seconds before the screen is turned off. I don't want it to first fade to black, then switch the screen back on and then show my app - which is currently what is happening. Is this possible? I'm just looking for a better user experience. Any other suggestions are welcome.
I have tried to google around and I'm not able to find anything.
EDIT: It's also worth noting that I have an AccessibilityService running, so my application may not have an active Activity on the screen. The Activity is launched once the screen goes off, but I want it to launch before the screen actually goes off.
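As far as I can tell there is no public broadcast fired before the fade-out begins; ACTION_SCREEN_OFF arrives only once the screen is already off. A minimal sketch of the receiver, plus reading the user's screen-off timeout, which is one possible (assumed, untested on all devices) workaround: track the last user activity yourself and show your UI a few seconds before the timeout elapses.

```java
// Assumes this runs inside a Service or Activity Context.
// ACTION_SCREEN_OFF must be registered at runtime; it is not delivered
// to manifest-declared receivers.
BroadcastReceiver screenReceiver = new BroadcastReceiver() {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (Intent.ACTION_SCREEN_OFF.equals(intent.getAction())) {
            // By the time this fires, the screen is already off.
        }
    }
};
registerReceiver(screenReceiver, new IntentFilter(Intent.ACTION_SCREEN_OFF));

// The inactivity timeout (ms) after which the screen turns off; an app
// could schedule its own "about to sleep" logic a few seconds earlier.
int timeoutMs = Settings.System.getInt(getContentResolver(),
        Settings.System.SCREEN_OFF_TIMEOUT, 30000);
```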
I am working on an android app as explained by the title.
I have figured out the background part, but I have found no information regarding the second part.
The main goal is to be able to change the facial features (in my case, the eyes) of any pictures displayed on the main screen.
I am familiar with facial-feature swapping given a root image, from which the features are extracted and then swapped onto another input image, but I am searching for a way to swap those features from the root image onto any face that appears on the main screen.
I am still in the theory phase regarding the second part.
I have considered manipulating cached photos, but this won't be a complete solution.
I have considered changing the features of the faces detected on the main screen by drawing the features from the root image over them, but that won't do, because it would be a simple drawing on the screen without affecting the underlying image.
I have no other ideas and can't think of any other way. It seems that it is impossible?
Any ideas?
Thanks in advance.
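For the detection half of the problem (locating the eyes), a sketch using the legacy android.media.FaceDetector on a bitmap the app controls. Note this only finds the features; rewriting pixels under another app's window is exactly the part the platform does not allow without something like root, which matches the poster's suspicion. `source` is a hypothetical bitmap the app already has (e.g. a screenshot it is permitted to take):

```java
// FaceDetector requires an RGB_565 bitmap.
Bitmap rgb565 = source.copy(Bitmap.Config.RGB_565, false);
int maxFaces = 5;
FaceDetector detector =
        new FaceDetector(rgb565.getWidth(), rgb565.getHeight(), maxFaces);
FaceDetector.Face[] faces = new FaceDetector.Face[maxFaces];
int found = detector.findFaces(rgb565, faces);

for (int i = 0; i < found; i++) {
    PointF mid = new PointF();
    faces[i].getMidPoint(mid);               // midpoint between the eyes
    float eyeDistance = faces[i].eyesDistance();
    // (mid, eyeDistance) give a rough region where eye pixels could be
    // replaced -- but only in a bitmap your own app owns and renders.
}
```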
I am creating a quick application in which there is a floating icon on the home screen; the user drags and drops that icon onto an app, and an alert window is then shown.
I know how to create the floating icon using some maths and WindowManager, but I don't know how to get the name of the app on which the floating icon is dropped. I have thought of a way to achieve this, but don't know how to implement it:
Get the position of the touch when the ACTION_UP event occurs, and then work out which app it is by comparing that position with the icon's position.
But there is a problem with this approach: I don't know how to get the positions of the launcher icons on the home screen.
Please help me achieve this task: how can I get the position of a launcher icon (like Facebook or Google Play) on the home screen? It would also be very helpful if you could suggest other ways of doing this.
I know this can be done, because the winners of the TechCrunch hackathon made the same application. A short video of it can be found here, in case you want to look:
http://techcrunch.com/2015/09/20/disrupt-sf-2015-hackathon-winners/
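The first half of the idea (knowing where the icon was dropped) is straightforward. A sketch of a touch listener on an overlay view, where `overlayView` is assumed to be a View already attached through WindowManager.addView(...) with overlay layout params:

```java
overlayView.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_MOVE:
                // Update the overlay's WindowManager.LayoutParams x/y
                // here to make the icon follow the finger.
                return true;
            case MotionEvent.ACTION_UP:
                // Screen coordinates of the drop point.
                float dropX = event.getRawX();
                float dropY = event.getRawY();
                return true;
        }
        return false;
    }
});
```

Mapping that drop point to a launcher icon is the genuinely hard part.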
I don't know how to get the positions of the launcher icons on the home screen.
That is not possible in general, outside of perhaps the accessibility APIs.
Please bear in mind that there are > 1 billion Android devices in use. Those span thousands of device models, representing hundreds or thousands of home screen implementations. Users can also install third-party home screen implementations (e.g., from the Play Store) and use those.
There is no requirement that a home screen have "launching icons" that meet your expectations. I could write a home screen whose app launcher consisted of text hyperlinks, for example. Or, I could write a home screen that is designed to be used by an external keyboard, where launching apps is triggered by keypresses rather than icons.
There is no requirement that a home screen have some sort of API that, given some X/Y coordinate on some arbitrary piece of that home screen, would tell you an app that is represented by something visual at that X/Y coordinate.
You are welcome to try using the accessibility APIs to find details of a widget at the desired X/Y coordinate on the current screen. However, from there, you would have to make guesses as to whether or not that is a launcher icon and, if so, what app it would represent. This approach is likely to be unreliable, except for specific scenarios that you have hard-coded. Hard-coding is what the team you cited appears to have done, based on the prose on the TechCrunch site.
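A sketch of that accessibility-based guesswork, assuming a running AccessibilityService: walk the active window's node tree and return the deepest node whose screen bounds contain the drop point. Whether that node's text or content description actually identifies an app is, as noted, unreliable guesswork.

```java
// Inside an AccessibilityService subclass. Returns the deepest node
// whose on-screen bounds contain (x, y), or null if none does.
private AccessibilityNodeInfo findNodeAt(AccessibilityNodeInfo node,
                                         int x, int y) {
    if (node == null) return null;
    Rect bounds = new Rect();
    node.getBoundsInScreen(bounds);
    if (!bounds.contains(x, y)) return null;
    for (int i = 0; i < node.getChildCount(); i++) {
        AccessibilityNodeInfo hit = findNodeAt(node.getChild(i), x, y);
        if (hit != null) return hit;
    }
    return node;
}

// Usage sketch:
// AccessibilityNodeInfo hit = findNodeAt(getRootInActiveWindow(), x, y);
// if (hit != null) { CharSequence label = hit.getText(); /* a guess */ }
```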
This is a frustrating problem, because I can't easily debug my app for every different form factor. Two people have emailed me and said that 'the app crashes on tilt'. I've had one of the users with problems email me the output from the logcat when they run my app.
Here is the output. I'm a bit new to Android development, so this is very cryptic to me and I'm not sure what it relates to in my actual code.
If it's helpful I can post any of my classes, or any information that would be useful.
EDIT: I could be wrong but my guess is that the output referring to the accelerometer starts around line 432 on the pastebin.
They might be using tablets on which the default screen orientation is landscape and the origin is what would appear to be in the top-right corner of a phone.
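If that is the cause, one common fix is to interpret the accelerometer axes relative to the current display rotation rather than assuming a portrait-default device. A sketch with a hypothetical helper, `remapToScreenAxes`, whose rotation argument matches the values of the `Surface.ROTATION_*` constants (0 through 3) as returned by `Display.getRotation()`:

```java
public final class SensorAxes {
    // Remap raw accelerometer (x, y) into screen-relative axes, given
    // the display rotation: 0 = ROTATION_0, 1 = ROTATION_90,
    // 2 = ROTATION_180, 3 = ROTATION_270.
    public static float[] remapToScreenAxes(float x, float y, int rotation) {
        switch (rotation) {
            case 1:  return new float[] { -y,  x };  // ROTATION_90
            case 2:  return new float[] { -x, -y };  // ROTATION_180
            case 3:  return new float[] {  y, -x };  // ROTATION_270
            default: return new float[] {  x,  y };  // ROTATION_0
        }
    }
}
```

In a SensorEventListener you would then call `SensorAxes.remapToScreenAxes(event.values[0], event.values[1], getWindowManager().getDefaultDisplay().getRotation())` before comparing against any tilt thresholds.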
I have almost finished writing a color detector app for Android.
However, I am having a problem with very close shots. Both my Nexus One and Desire are unable to properly auto-focus at such distances. I have tried pretty much all the parameters to no avail. Is it possible to let the user focus the lens manually?
I tried implementing the zoom trick (move the phone back, zoom in, and then auto-focus), and while it works in the native camera app, I am unable to get it working in my app.
All advice appreciated. :)
Basically, you cannot focus on items that are close by (macro mode) using the Nexus One or Desire, or for that matter most phones. I have yet to find any phone that can, though I am not discounting that there may be some :-)
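For completeness, a sketch using the android.hardware.Camera API (the one current for these devices) to check whether the hardware even advertises a macro focus mode before attempting a close-up auto-focus; on devices that don't list it, close-up focus simply isn't available through this API:

```java
// Assumes the CAMERA permission is held and no other app owns the camera.
Camera camera = Camera.open();
Camera.Parameters params = camera.getParameters();
List<String> modes = params.getSupportedFocusModes();
if (modes != null && modes.contains(Camera.Parameters.FOCUS_MODE_MACRO)) {
    params.setFocusMode(Camera.Parameters.FOCUS_MODE_MACRO);
    camera.setParameters(params);
    camera.autoFocus(new Camera.AutoFocusCallback() {
        @Override
        public void onAutoFocus(boolean success, Camera cam) {
            // success == false usually means the lens could not resolve
            // the subject at that distance.
        }
    });
} else {
    // No macro mode advertised: close-up focus is unsupported here.
}
```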