I want to make an app that will act as a mouse for a computer. How should I start? Please suggest.
I just want to mimic mouse movement using the Android screen.
Start by using your real computer mouse and observe its behavior.
Then mimic that behavior in your application.
I have a transparent Android app that monitors for an event, then displays a message and toggles the background between transparent and translucent red. The app itself is working exactly as I hoped. The problem is that only the topmost app (my app) is active and updating its view. The app displays over a third-party app that runs a live video stream, and I want that stream to continue updating while my transparent app is on top. Since I have no control over the other app and it doesn't support a multi-window approach, I am looking for a different approach or workaround to accomplish the same thing.
I have read about people trying to use a system popup based on a service. Does anyone have experience solving a similar problem, and has anyone found a good approach for designing the app, or a workaround?
Try googling the Activity lifecycle. When another app becomes the top application (say, your transparent app), the previously running application goes to onPause until the user brings it back to the top again.
The short answer is: what you want is not possible.
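To illustrate the lifecycle point, here is a minimal sketch (the activity and tag names are made up; what the covered app actually does depends on how it handles these callbacks):

    import android.app.Activity;
    import android.util.Log;

    // Minimal sketch: the activity that gets covered sees these callbacks.
    public class StreamActivity extends Activity {
        private static final String TAG = "StreamActivity";

        @Override
        protected void onPause() {
            super.onPause();
            // Fired as soon as another activity takes the foreground
            // (e.g. the transparent overlay); most apps stop updating here.
            Log.d(TAG, "onPause");
        }

        @Override
        protected void onResume() {
            super.onResume();
            // Fired when this activity returns to the top.
            Log.d(TAG, "onResume");
        }
    }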
I have made a lightweight Java web server serving HTML files and static content (built with the pure-Java library NanoHTTPD). I have successfully made a JavaFX launcher window with a single button; clicking the button runs the server in the background and opens the localhost URL in the Android/PC browser (I was unsuccessful in making an iOS version using JavaFX).
I am thinking of using libGDX as the "launcher window" because of its iOS support and its access to mobile-specific hardware like SMS/GPS, which JavaFX doesn't have.
I am targeting iOS/Android/PC. I'd like to ask libGDX developers: how feasible is this, given my target platforms?
Yes, you can.
There's a catch, though: you may have to build the UI the game-dev way.
I've made a game or two using it, and the learning curve was rather steep at first while I got a hold of how it worked. For example, you'll have to provide the images for a button, the background, and also the pressed state of the button, and so on. But your app is not a game, so you won't have to worry that much.
Once you've learned how to place things on the screen, there is not much to worry about, because the API provides everything you need to carry on from there; see the sketch below.
I also found enough resources/tutorials online to build a game from the ground up, so you definitely can.
And there's very little to worry about regarding your multi-platform requirement.
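For what it's worth, here is a minimal sketch of such a launcher using libGDX's Scene2D UI. The skin file name, port, and server hook are placeholders; a real skin (e.g. the stock uiskin.json from the libGDX test assets) supplies the button images mentioned above:

    import com.badlogic.gdx.ApplicationAdapter;
    import com.badlogic.gdx.Gdx;
    import com.badlogic.gdx.graphics.GL20;
    import com.badlogic.gdx.scenes.scene2d.Actor;
    import com.badlogic.gdx.scenes.scene2d.Stage;
    import com.badlogic.gdx.scenes.scene2d.ui.Skin;
    import com.badlogic.gdx.scenes.scene2d.ui.Table;
    import com.badlogic.gdx.scenes.scene2d.ui.TextButton;
    import com.badlogic.gdx.scenes.scene2d.utils.ChangeListener;
    import com.badlogic.gdx.utils.viewport.ScreenViewport;

    public class LauncherApp extends ApplicationAdapter {
        private Stage stage;
        private Skin skin;

        @Override
        public void create() {
            stage = new Stage(new ScreenViewport());
            Gdx.input.setInputProcessor(stage);

            // The skin bundles the button/background images discussed above.
            skin = new Skin(Gdx.files.internal("uiskin.json"));

            TextButton startButton = new TextButton("Start server", skin);
            startButton.addListener(new ChangeListener() {
                @Override
                public void changed(ChangeEvent event, Actor actor) {
                    // startServer(); // hypothetical hook to launch NanoHTTPD
                    Gdx.net.openURI("http://localhost:8080"); // placeholder port
                }
            });

            Table root = new Table();
            root.setFillParent(true);
            root.add(startButton);
            stage.addActor(root);
        }

        @Override
        public void render() {
            Gdx.gl.glClearColor(0, 0, 0, 1);
            Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
            stage.act(Gdx.graphics.getDeltaTime());
            stage.draw();
        }

        @Override
        public void dispose() {
            stage.dispose();
            skin.dispose();
        }
    }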
I am currently working on a health-based project focused on a person's smartphone usage. For that, I want to monitor his touch-screen usage pattern, i.e. how long he has been pressing his hands on the screen.
When this usage crosses a threshold value, a warning will be generated for him.
I need to know whether I will be able to track a person's screen usage in the background (i.e. while he uses all the other apps on his phone).
If so, which functions would help me?
Android does not support this, for privacy and security reasons.
You are welcome to intercept all screen touches, to find out when the user touches the screen. However, the screen touches will then not be available to the underlying apps, and so the user will not be able to use the device.
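As a sketch of why this blocks the device (the class name is made up; the overlay type and permission are the standard ones for drawing over other apps):

    import android.content.Context;
    import android.graphics.PixelFormat;
    import android.view.Gravity;
    import android.view.MotionEvent;
    import android.view.View;
    import android.view.WindowManager;

    // Sketch: a full-screen overlay that receives every touch. Because
    // onTouchEvent returns true, nothing underneath ever sees the touch,
    // which is exactly what makes the device unusable. Requires the
    // SYSTEM_ALERT_WINDOW ("display over other apps") permission.
    public class TouchBlockerOverlay {
        public static void attach(Context context) {
            WindowManager wm = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
            View overlay = new View(context) {
                @Override
                public boolean onTouchEvent(MotionEvent event) {
                    long pressMs = event.getEventTime() - event.getDownTime();
                    // ...record press duration here...
                    return true; // consume; underlying apps get nothing
                }
            };
            WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                    WindowManager.LayoutParams.MATCH_PARENT,
                    WindowManager.LayoutParams.MATCH_PARENT,
                    WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY, // API 26+
                    0, // no flags: the window is touchable and consumes events
                    PixelFormat.TRANSLUCENT);
            params.gravity = Gravity.TOP | Gravity.START;
            wm.addView(overlay, params);
        }
    }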
You are welcome to look at implementing an AccessibilityService, to be able to find out about user input in all apps. However, last I checked, this does not allow you to find out "how long he has been pressing his hands on the screen".
You are welcome to create a custom Android ROM that bakes in your desired monitoring, then deploy that custom ROM on whatever devices you choose to support. With your own build of Android, you can do pretty much whatever you want, but then you are not making a simple app.
On a rooted device, you can probably run something with superuser privileges to track all user inputs, but I do not have the details about how to do that.
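For reference, a bare-bones sketch of the AccessibilityService route (the service name is made up; it must be declared in the manifest and enabled by the user under Settings > Accessibility, and the touch-interaction event types may only be delivered with touch exploration enabled, which itself changes how the user interacts with the device):

    import android.accessibilityservice.AccessibilityService;
    import android.util.Log;
    import android.view.accessibility.AccessibilityEvent;

    public class UsageMonitorService extends AccessibilityService {
        private static final String TAG = "UsageMonitor";

        @Override
        public void onAccessibilityEvent(AccessibilityEvent event) {
            // High-level events only (clicks, focus, window changes),
            // not raw press durations.
            Log.d(TAG, "event=" + AccessibilityEvent.eventTypeToString(event.getEventType())
                    + " time=" + event.getEventTime());
        }

        @Override
        public void onInterrupt() {
            // Required override; nothing to clean up in this sketch.
        }
    }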
So far, similar questions have been asked here as well. Most commonly, users suggest defining a window layout that always stays on the screen and capturing touch events with the onTouch(View v, MotionEvent event) callback. But it seems that is no longer possible (after Android 5.0).
One thing I can suggest: if the device is rooted, you can execute the getevent -lt command as root and read its output with a BufferedReader. That way you can capture the touch events.
I am creating a quick application in which there is a floating icon on the home screen, and the user has to drag and drop that icon onto an app; it then shows an alert window.
I know how to create the floating icon using some math and WindowManager, but I don't know how to get the name of the app on which the floating icon is dropped. I have thought of a way to achieve this, but don't know how to implement it:
by getting the position of the touch when the ACTION_UP event occurs, and then determining which app it is by comparing that position with the app's position.
But there is a problem with this approach: I don't know how to get the positions of the launcher icons on the home screen.
Please help me achieve this task, and tell me how I can get the position of a launcher icon (like Facebook or Google Play) on the home screen. It would also be very helpful if you could suggest other ways of doing this.
I know this can be done, because the winners of the TechCrunch hackathon made the same application. A short video of it can be found here, in case you want to look: http://techcrunch.com/2015/09/20/disrupt-sf-2015-hackathon-winners/
I don't know how to get the positions of the launcher icons on the home screen.
That is not possible in general, outside of perhaps the accessibility APIs.
Please bear in mind that there are > 1 billion Android devices in use. Those span thousands of device models, representing hundreds or thousands of home screen implementations. Users can also install third-party home screen implementations (e.g., from the Play Store) and use those.
There is no requirement that a home screen have "launching icons" that meet your expectations. I could write a home screen whose app launcher consisted of text hyperlinks, for example. Or, I could write a home screen that is designed to be used by an external keyboard, where launching apps is triggered by keypresses rather than icons.
There is no requirement that a home screen have some sort of API that, given some X/Y coordinate on some arbitrary piece of that home screen, would tell you an app that is represented by something visual at that X/Y coordinate.
You are welcome to try using the accessibility APIs to find details of a widget at the desired X/Y coordinate on the current screen. However, from there, you would have to make guesses as to whether or not that is a launcher icon and, if so, what app it would represent. This approach is likely to be unreliable, except for specific scenarios that you have hard-coded. Hard-coding is what the team you cited appears to have done, based on the prose on the TechCrunch site.
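If you do experiment with the accessibility route, the coordinate lookup itself might look something like this sketch (it runs inside an AccessibilityService; whether the matched node is really a launcher icon, and for which app, remains guesswork):

    import android.graphics.Rect;
    import android.view.accessibility.AccessibilityNodeInfo;

    // Sketch: walk the active window's node tree and return the deepest
    // node whose on-screen bounds contain (x, y). Pass in the result of
    // getRootInActiveWindow() from your AccessibilityService.
    public final class NodeFinder {
        public static AccessibilityNodeInfo findNodeAt(AccessibilityNodeInfo node, int x, int y) {
            if (node == null) {
                return null;
            }
            Rect bounds = new Rect();
            node.getBoundsInScreen(bounds);
            if (!bounds.contains(x, y)) {
                return null;
            }
            for (int i = 0; i < node.getChildCount(); i++) {
                AccessibilityNodeInfo hit = findNodeAt(node.getChild(i), x, y);
                if (hit != null) {
                    return hit;
                }
            }
            return node; // no child contains the point, so this node is the match
        }
    }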
I'm working on a program that controls an Android device (via a USB cable and the Android SDK toolset).
The program is written in Java and targets Windows platforms.
The idea is that I get a screenshot of the device's screen every few seconds and show it in a Frame, and then I can simulate taps and swipes on the device's screen using the PC mouse.
So far everything is working as expected.
The thing is, I now want to enable simulating multitouch events on the device, and I was thinking: why can't I receive such events from a multitouch input device, such as the touch-pad that every modern laptop comes with?
How can I do it? Is it hardware-dependent? Platform-dependent? Would I need to implement some sort of interface with drivers for every touch-pad I want to support, or is there a way to do it simply, like writing a regular mouse listener?
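For context, the single-touch part described as already working maps naturally onto adb's input command, roughly like this (it assumes adb is on the PATH; note that input tap/swipe are single-pointer, which is precisely why multitouch needs something lower-level):

    import java.io.IOException;

    // Sketch: drive single-pointer taps and swipes through adb from the PC.
    public class AdbInput {
        public static void tap(int x, int y) throws IOException, InterruptedException {
            run("adb", "shell", "input", "tap",
                    String.valueOf(x), String.valueOf(y));
        }

        public static void swipe(int x1, int y1, int x2, int y2, int durationMs)
                throws IOException, InterruptedException {
            run("adb", "shell", "input", "swipe",
                    String.valueOf(x1), String.valueOf(y1),
                    String.valueOf(x2), String.valueOf(y2),
                    String.valueOf(durationMs));
        }

        private static void run(String... cmd) throws IOException, InterruptedException {
            new ProcessBuilder(cmd).inheritIO().start().waitFor();
        }
    }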