I'm using Push Notifications in my application and have noticed that once around 50 notifications have arrived, the notification drawer stops displaying any new notification that arrives.
This situation should be rare within my application, but it's still possible, and I'm trying to find more information about it (since I can't find a mention of a limit anywhere) and the best way to deal with it (so far I'm thinking of using SQLite to store the overflow notifications and showing them when the user dismisses a notification).
Please do not show "50+ notifications". Please show one Notification, updated perhaps to reflect 50+ events (a sketch follows the list below).
This is covered:
in the design guidelines (see "Stack your notifications")
in the API guide
in the Android Wear documentation (as people with Android Wear devices will get severely pissed at you for flooding their watch or whatever with 50+ notifications)
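For illustration, a minimal sketch of that approach, assuming an androidx project with a notification channel named "events" created elsewhere (the class and variable names are mine, not from any of the guides above):

```java
import android.app.NotificationManager;
import android.content.Context;
import androidx.core.app.NotificationCompat;
import java.util.List;

public class EventNotifier {
    // Reusing the same ID makes notify() update the existing
    // notification instead of posting a new one.
    private static final int STACK_ID = 1;

    public static void showStacked(Context ctx, List<String> events) {
        NotificationCompat.InboxStyle style = new NotificationCompat.InboxStyle()
                .setSummaryText(events.size() + " new events");
        // Show only the most recent few lines; the count conveys the rest.
        int from = Math.max(0, events.size() - 5);
        for (String event : events.subList(from, events.size())) {
            style.addLine(event);
        }
        NotificationCompat.Builder builder =
                new NotificationCompat.Builder(ctx, "events")
                        .setSmallIcon(android.R.drawable.ic_dialog_info)
                        .setContentTitle("New activity")
                        .setContentText(events.size() + " new events")
                        .setNumber(events.size()) // count badge shown by the system
                        .setStyle(style);
        NotificationManager nm =
                (NotificationManager) ctx.getSystemService(Context.NOTIFICATION_SERVICE);
        nm.notify(STACK_ID, builder.build());
    }
}
```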
I want to write an Android app that analyses chats. For each messenger and each contact, I want to get a list of all messages sent and received (since installing my app). The key thing here is that I want the list of messages to be in order. My app should support as many messengers as possible. What would be the best way to do this?
I researched and found a few ways to access messengers' messages:
Using accessibility services to read the screen content of other apps that is annotated with accessibility labels (related: How to read window content (using accessibilityService) and evoking UI using draw over other app permission in Android?)
caveats:
the messenger needs to have accessibility implemented correctly
I have to filter which text blobs are relevant
Using NotificationListenerService to get the content of notifications for received messages (see the sketch after this list)
caveats:
only gets content of messages for which there was a notification
apart from their timestamps, the messages come with no ordering information
Taking screenshots and doing optical character recognition (OCR)
caveats:
probably insane
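For reference, a minimal sketch of the NotificationListenerService approach mentioned above (the storeMessage() helper is hypothetical; the service also needs the BIND_NOTIFICATION_LISTENER_SERVICE permission in the manifest, and the user must grant notification access in system settings):

```java
import android.app.Notification;
import android.os.Bundle;
import android.service.notification.NotificationListenerService;
import android.service.notification.StatusBarNotification;

public class ChatNotificationListener extends NotificationListenerService {
    @Override
    public void onNotificationPosted(StatusBarNotification sbn) {
        Bundle extras = sbn.getNotification().extras;
        CharSequence sender = extras.getCharSequence(Notification.EXTRA_TITLE);
        CharSequence text = extras.getCharSequence(Notification.EXTRA_TEXT);
        long postTime = sbn.getPostTime(); // the only ordering signal available here

        // storeMessage() is a hypothetical persistence helper.
        // storeMessage(sbn.getPackageName(), sender, text, postTime);
    }
}
```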
When I get the messages, I still have to order them. For this, I think the accessibility services are my best bet, as I could use the date information shown with most messages, as well as their position on the screen.
However, there is still an edge case. As I understand it, accessibility services can only see the content currently on the screen. The user might also use another device to chat while the device with the app installed is powered off. Thus, once the device with the app installed is powered on again, the app might not be able to see the old messages written while the device was powered off (until the user scrolls up again).
Are my assumptions correct? Do you know of a better way for my app to read/sort the messages?
The Android TV (ATV) app I'm working on has voice control capabilities. Basically, when the user presses the microphone button on a remote controller, the key event (identified by KeyEvent.KEYCODE_SEARCH) is handled by the app, speech recognition starts (using android.speech.SpeechRecognizer), the results (parsed speech) are obtained and parsed further by the app logic (e.g. showing the user search results or performing some in-app action).
Everything had been working as intended and described above until, quite recently, Google Assistant (GA) was introduced to ATV platforms (the first one being the Nvidia Shield box). Now, when the remote control unit (RCU) mic button is pressed, the GA overlay appears and the mic key event doesn't even reach the app.
For the last few days I've done some extensive research (documentation, internet, forums, stackoverflow etc.) and experimented with some potential workarounds, but nothing's worked so far and I haven't been able to find any definite information on the topic (probably due to the ATV+GA combination being rather new on the scene, and the ATV ecosystem not being as large as the Android one).
The best hint I have so far is what's been done with the Spotify app for Android TV. When it runs on an ATV device without GA, it basically behaves as I described above; but when GA is present, the GA overlay appears, receives the parsed speech, and shows the search results, with results from Spotify in the first row. So the Spotify app is integrated with GA, and this integration replaces the in-app voice control mechanism. This suggests that either there is no way to ignore or disable GA inside your app in order to receive the mic key event and proceed with voice control as usual, or at least that this is now the preferred way of handling voice commands. It also shows that there are ATV apps that approach voice control the way I described, so maybe someone here has already encountered a similar problem.
My question(s):
is it possible to prevent Google Assistant from taking over RCU mic button signal?
is it ok to do so? (by "not ok" I would mean - are there any official guidelines that discourage such behavior - or at least are there valid reasons not to do so?)
if so, can it be done?
if not, is there a resource documenting how to integrate with GA (the way Spotify for ATV app does)?
Starting with your last question:
if not, is there a resource documenting how to integrate with GA (the way Spotify for ATV app does)?
I wrote about how to integrate on the Android Developers Blog. Spotify has onboarded their content catalog to Google's services, which is why the Google Assistant is able to work so well. You can achieve similar results if you make your app searchable (covered in the blog post).
is it possible to prevent Google Assistant from taking over RCU mic button signal?
No, not at this time. The Google Assistant is a system app that takes control over the mic to give a uniform experience across all apps.
is it ok to do so? (by "not ok" I would mean - are there any official guidelines that discourage such behavior - or at least are there valid reasons not to do so?)
if so, can it be done?
You can still have an in-app search experience. There is an example in the leanback sample: you will need to set a listener on a BrowseFragment and implement a SearchFragment. We know it can be confusing to have in-app search and Google Assistant search competing, but we are working on how to improve this.
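For illustration, a minimal sketch of that wiring, assuming a hypothetical SearchActivity that hosts your SearchFragment (shown with the androidx leanback classes; older projects would use the android.support.v17.leanback equivalents):

```java
import android.content.Intent;
import android.os.Bundle;
import android.view.View;
import androidx.leanback.app.BrowseFragment;

public class MainBrowseFragment extends BrowseFragment {
    @Override
    public void onActivityCreated(Bundle savedInstanceState) {
        super.onActivityCreated(savedInstanceState);
        // Shows the search affordance and routes clicks to in-app search.
        setOnSearchClickedListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                // SearchActivity is a hypothetical Activity hosting a SearchFragment.
                startActivity(new Intent(getActivity(), SearchActivity.class));
            }
        });
    }
}
```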
I know this can be done with apps like Tasker and Automate, but I want to learn how to do it myself. What I want to do:
Create an app that always runs in the background;
Read sensor data (in this case, from the ambient light sensor);
Lock the screen.
I have Android Studio set up and ready to build and compile an app, but I have no idea how to do this or where to start. Thanks for any help.
The questions you ask cover a couple of very wide topics, and it is difficult to provide a tailor-made answer to them. The best anyone can do is provide some pointers, which I will attempt to do.
To create an app that works in the background on Android, you basically create an Android service and keep it running in the background. Make sure it's a sticky service, which means the service will be restarted automatically if it is killed for some reason. If you want the app to always run in the background (not a good idea, though), you can listen for the boot-completed broadcast and start your service from the broadcast receiver; a sketch of both pieces follows after these links. The links below cover the related topics:
https://developer.android.com/reference/android/content/BroadcastReceiver.html
https://developer.android.com/reference/android/content/Intent.html#ACTION_BOOT_COMPLETED
https://developer.android.com/guide/components/services.html
https://developer.android.com/reference/android/app/Service.html#START_STICKY
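As promised, a minimal sketch of the two pieces (class names are mine; note that the receiver needs the RECEIVE_BOOT_COMPLETED permission, and starting services straight from the background is restricted on newer Android versions):

```java
import android.app.Service;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.os.IBinder;

// A started service that asks the system to restart it if killed.
public class SensorService extends Service {
    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        // TODO: register sensor listeners here (see the next section).
        return START_STICKY; // recreate the service if the system kills it
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null; // started service only, no binding
    }
}

// Starts the service once the device has finished booting.
class BootReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (Intent.ACTION_BOOT_COMPLETED.equals(intent.getAction())) {
            context.startService(new Intent(context, SensorService.class));
        }
    }
}
```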
Reading ambient light sensor data is pretty straightforward; follow the link below. Although all the samples shown there run in the foreground, that is not mandatory: you can register the sensor data change listener from your background service just as well. Listening for sensor data requires creating an instance of the SensorManager framework service class and registering a change listener for a particular sensor (in your case, light).
Source page - https://developer.android.com/guide/topics/sensors/sensors_overview.html
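A minimal sketch of that registration (class name mine; it works the same whether called from an Activity or a Service):

```java
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class LightMonitor implements SensorEventListener {
    private final SensorManager sensorManager;

    public LightMonitor(Context context) {
        sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
    }

    public void start() {
        Sensor light = sensorManager.getDefaultSensor(Sensor.TYPE_LIGHT);
        if (light != null) { // some devices have no light sensor
            sensorManager.registerListener(this, light, SensorManager.SENSOR_DELAY_NORMAL);
        }
    }

    public void stop() {
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float lux = event.values[0]; // ambient light level in lux
        // React here, e.g. lock the screen below some threshold.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this use case.
    }
}
```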
I have no personal experience with locking/unlocking the keyguard programmatically, but a few links that provide some sample code are worth a look:
How to Lock/Unlock screen programmatically?
Android screen lock/ unlock programmatically
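From what I can tell, the usual route is DevicePolicyManager; a minimal sketch, assuming your app has already been granted device administrator rights with the force-lock policy:

```java
import android.app.admin.DevicePolicyManager;
import android.content.Context;

public class ScreenLocker {
    public static void lock(Context ctx) {
        DevicePolicyManager dpm =
                (DevicePolicyManager) ctx.getSystemService(Context.DEVICE_POLICY_SERVICE);
        // Throws SecurityException if the app is not an active device admin.
        dpm.lockNow();
    }
}
```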
I am using LocationManager's requestLocationUpdates() to get the user's coordinates when the app opens, and it takes longer to get the coordinates than in other apps. Getting a location always takes a while, but my app keeps loading and waiting for a fix far longer than other apps do.
Here is a screenshot of my notification panel when I launch my application:
Notice it says Finding Location... when I open my app, because my app calls LocationManager's requestLocationUpdates() as soon as the app is opened.
Here's the meat of the question:
Even while this notification is showing, if I navigate to an app like Google Maps, it is able to pinpoint my exact location in a matter of seconds. Is that because Google Maps uses getLastLocation?
How come other apps are able to fetch location much faster than mine?
So there are two ways of getting location: fine (GPS) and coarse (network). Google Play Services provides a third method, but it uses a combination of those two. GPS takes a long time; the receiver literally has to acquire signals from several (at least four) of the two dozen or so GPS satellites before it can compute its first fix. Network is fast; it just needs one network request (or none, if the data is cached).
A lot of apps will use both simultaneously, using the network to get a fast answer and then improving on it with GPS when that data arrives. I know Maps does this. This gets you a fast, inaccurate answer first and lets you improve the accuracy later.
You can use getLastKnownLocation to go even quicker, but you have to be able to deal with it returning null. Generally the network is fast enough. A sketch of the combined pattern follows.
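A minimal sketch of the pattern, assuming location permissions are already granted (class and method names are mine):

```java
import android.content.Context;
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;

public class QuickLocation {
    public static void request(Context ctx, LocationListener listener) {
        LocationManager lm =
                (LocationManager) ctx.getSystemService(Context.LOCATION_SERVICE);

        // Instant but possibly stale or null: the last fix the provider cached.
        Location last = lm.getLastKnownLocation(LocationManager.NETWORK_PROVIDER);
        if (last != null) {
            listener.onLocationChanged(last);
        }

        // Fast, coarse fix from the network provider...
        lm.requestLocationUpdates(LocationManager.NETWORK_PROVIDER, 0, 0, listener);
        // ...refined later by GPS once it acquires enough satellites.
        lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 0, 0, listener);
    }
}
```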
I am developing an Android app that allows sports team coaches to update the attendance for events like training/matches. A feature I would like to add would be to display a notification on the device to remind them that they need to update the attendance for the event when it has started.
I have been reading online a bit, and it seems that push is the preferred method for data that is changing. But because I know the start times of the events, would it be better to create a background service using something like the following?
https://stackoverflow.com/a/9933130/2039505
I basically want the user to receive a vibrating notification that, when tapped, opens the event's attendance screen. Hopefully someone will have some insight into which option is best!
Since all you need is a notification on a timer, the AlarmManager would be the best way to go.
If you used Push Notifications (GCM), that would require server-side code and a way to store the device IDs to send the notifications to.
Overkill if you ask me.
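For illustration, a minimal sketch of scheduling the reminder (NotificationReceiver is a hypothetical BroadcastReceiver that builds the vibrating notification and points its content intent at the attendance screen; on newer Android versions the PendingIntent also needs a mutability flag):

```java
import android.app.AlarmManager;
import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;

public class AttendanceReminder {
    public static void schedule(Context ctx, int eventId, long eventStartMillis) {
        // NotificationReceiver is a hypothetical BroadcastReceiver that posts
        // the notification (with vibration, and a content intent that opens
        // the event's attendance screen).
        Intent intent = new Intent(ctx, NotificationReceiver.class)
                .putExtra("eventId", eventId);
        PendingIntent pi = PendingIntent.getBroadcast(
                ctx, eventId, intent, PendingIntent.FLAG_UPDATE_CURRENT);

        AlarmManager am = (AlarmManager) ctx.getSystemService(Context.ALARM_SERVICE);
        // Fires at the event's start time, waking the device if needed.
        am.set(AlarmManager.RTC_WAKEUP, eventStartMillis, pi);
    }
}
```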
Here are links to the official documentation and example code:
Official documentation
Vogella's example on services