I am working on developing an application for mobile and wearable to collect sensor data from both devices at the same time. When I press 'Start collecting data', the mobile sends a message to the wearable to start its sensor service and begin collecting data, and it simultaneously starts collecting data from the mobile's own sensors. Similarly, when I press 'Stop', both devices stop collecting. I am sending every value of the wearable's sensor data back to the mobile using a DataItem, to be saved later in the mobile's storage.
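For context, the trigger message is sent roughly like this (a simplified sketch: the path constant, the enclosing MainActivity, and the empty payload are illustrative, not my exact code):

// Sketch only: START_PATH and MainActivity are illustrative names
private static final String START_PATH = "/start-sensor-service";

private void sendStartMessage() {
    Wearable.getNodeClient(this).getConnectedNodes()
            .addOnSuccessListener(new OnSuccessListener<List<Node>>() {
                @Override
                public void onSuccess(List<Node> nodes) {
                    // Tell every connected node to start its sensor service
                    for (Node node : nodes) {
                        Wearable.getMessageClient(MainActivity.this)
                                .sendMessage(node.getId(), START_PATH, new byte[0]);
                    }
                }
            });
}

On the wearable side, the sensor callback and the DataItem send look like this: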
@Override
public void onSensorChanged(SensorEvent event) {
    int sensorType = event.sensor.getType();
    if (sensorType == Sensor.TYPE_ACCELEROMETER) {
        float[] values = event.values;
        Log.d(TAG, "onSensorChanged: Changed");
        sendSensorData(values);
    }
}
private void sendSensorData(float[] values) {
    PutDataMapRequest putDataMapRequest = PutDataMapRequest.create(NEW_VALUE);
    putDataMapRequest.getDataMap().putFloatArray(KEY, values);
    // The timestamp makes each DataItem unique so every update propagates
    putDataMapRequest.getDataMap().putLong("Time", System.currentTimeMillis());
    PutDataRequest putDataRequest = putDataMapRequest.asPutDataRequest().setUrgent();
    Task<DataItem> dataItemTask = Wearable.getDataClient(this).putDataItem(putDataRequest);
    dataItemTask.addOnSuccessListener(new OnSuccessListener<DataItem>() {
        @Override
        public void onSuccess(DataItem dataItem) {
            Log.d(TAG, "onSuccess: " + dataItem);
        }
    });
}
I am using onDataChanged in the mobile package to listen for data changes from the wearable. I am putting System.currentTimeMillis() into each DataMap in the wearable package so that every DataItem is unique, which ensures a continuous stream of sensor data back to the mobile (identical DataItems would not trigger onDataChanged).
This is the code on the receiving side, i.e. the handheld.
@Override
public void onDataChanged(DataEventBuffer dataEventBuffer) {
    for (DataEvent event : dataEventBuffer) {
        if (event.getType() == DataEvent.TYPE_CHANGED) {
            DataItem dataItem = event.getDataItem();
            Uri uri = dataItem.getUri();
            String path = uri.getPath();
            if (path.equals(NEW_VALUE)) {
                DataMap dataMap = DataMapItem.fromDataItem(dataItem).getDataMap();
                getSensorData(dataMap);
                //Log.d(TAG, "onDataChanged: " + dataMap);
            }
        }
    }
    super.onDataChanged(dataEventBuffer);
}
The problem is that when I stop collecting and compare the two data records, the wearable records received by the mobile are far fewer in number, which I understand is due to the communication time between the mobile and the wearable causing delay. I do understand the record counts will never match exactly even in an ideal scenario. Is there any way I can minimize the delay of the sensor data between the two devices?
The problem you are facing is that the sample rate of the sensors inside your phone is different from the sample rate of the wearable device. For instance, the accelerometer in your phone may run at 400 Hz, while the one in your wearable runs at only 100 Hz or less, so the wearable simply produces fewer records in the same time window.
A while ago I compiled a small list showing the maximum sensor sample rate of smartphones. As you can see, almost every model has a different sample rate:
https://docs.google.com/spreadsheets/d/1vZEryeslHOq-pl_C-scoS4us21goxg8oyqviUgDGq2k/edit?usp=sharing
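If you want to verify this on your own devices, a quick sketch (listener setup omitted) is to query the sensor's fastest supported delay and request the same fixed sampling period on both the phone and the wearable:

SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);

// Fastest supported delay between events, in microseconds
Log.d(TAG, "min delay: " + accel.getMinDelay() + " us");

// Request the same fixed period on both devices (20,000 us = 50 Hz) so the
// record counts become comparable; the hardware may still round this value
sm.registerListener(listener, accel, 20000);

Requesting a common, lower rate on both sides also reduces the number of DataItems you push through the Data Layer per second.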
I am developing an Android app (running on Android 6). I want the app to send a notification to the user when it is near a BLE device (a device that I have at home). So I scan continuously through a service running in the background. It works well while the phone screen is on, but a few seconds after the screen turns off the application can no longer find the BLE device (the scan is still running, but there is no callback).
if (enable) {
    if (mScanning) return;
    // Stops scanning after a pre-defined scan period.
    handler.postDelayed(new Runnable() {
        @Override
        public void run() {
            if (!mScanning) return;
            try {
                mScanning = false;
                mBluetoothLeScanner.stopScan(mScanCallback);
                Log.i(TAG_LOG, "Stop scanning after pre-defined scan period");
            } catch (Exception e) {
                Log.e(TAG_LOG, "mBluetoothLeScanner.stopScan Exception:=>" + e.getMessage());
            }
        }
    }, SCAN_PERIOD);
    mScanning = true;
    mBluetoothLeScanner.startScan(filters, settings, mScanCallback);
    Log.i(TAG_LOG, "Start scanning ....");
}
private ScanCallback mScanCallback = new ScanCallback() {
    // When a BLE advertisement has been found
    @Override
    public void onScanResult(int callbackType, ScanResult result) {
        super.onScanResult(callbackType, result);
        Log.i(TAG_LOG, "Name: " + result.getDevice().getName() + ". Address: " + result.getDevice().getAddress() + ". Rssi: " + result.getRssi());
        //scanDevices(false);
        if (result.getDevice().getName() != null && result.getDevice().getName().equals(deviceName)) {
            mDeviceAdress = result.getDevice().getAddress();
            mDevice = mBluetoothAdapter.getRemoteDevice(mDeviceAdress);
            Log.i(TAG_LOG, "Device found");
            scanDevices(false);
        }
    }
};
You can't make this work. Scanning is a very expensive operation that Android won't allow in the background. Instead, make an attempt to connect to the device. I had success doing this in a WorkManager job running every 15 minutes; battery drain was negligible and it was pretty reliable. Note that a connection status of 0x85 usually means the device is out of range, and 0x80 means a different device is already connected to it (or the phone is already connected to too many devices). The full error list is at https://android.googlesource.com/platform/external/bluetooth/bluedroid/+/master/stack/include/gatt_api.h#27
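A rough sketch of that approach (the class name, device address, and notification hook are placeholders; a real Worker would also need to wait for the async callback before returning):

public class BleProximityWorker extends Worker {
    // Illustrative placeholder; use your device's actual MAC address
    private static final String DEVICE_ADDRESS = "00:11:22:33:44:55";

    public BleProximityWorker(Context context, WorkerParameters params) {
        super(context, params);
    }

    @Override
    public Result doWork() {
        BluetoothDevice device = BluetoothAdapter.getDefaultAdapter()
                .getRemoteDevice(DEVICE_ADDRESS);
        // A successful connection implies the device is in range;
        // status 0x85 in the callback usually means it is not
        device.connectGatt(getApplicationContext(), false, new BluetoothGattCallback() {
            @Override
            public void onConnectionStateChange(BluetoothGatt gatt, int status, int newState) {
                if (newState == BluetoothProfile.STATE_CONNECTED) {
                    // In range: post the notification here
                }
                gatt.close();
            }
        });
        // A real Worker should block until the callback fires (e.g. with a
        // CountDownLatch) instead of returning immediately like this sketch
        return Result.success();
    }
}

// Scheduling it every 15 minutes (the minimum period WorkManager allows):
PeriodicWorkRequest request = new PeriodicWorkRequest.Builder(
        BleProximityWorker.class, 15, TimeUnit.MINUTES).build();
WorkManager.getInstance(context).enqueue(request);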
Unable to send a Firebase FCM push notification to an Android device.
When I send a test message from the Firebase console, it arrives on the device, so the Android-side code is working fine.
For the backend Java server, I am using the Firebase Admin SDK.
public void sendNotification2(String notification, String title, String to) throws FirebaseMessagingException {
    String registrationToken = to;
    Message message = Message.builder().putData("score", "850").putData("time", "2:45")
            .setToken(registrationToken).build();
    String response = FirebaseMessaging.getInstance().send(message);
    System.out.println("Successfully sent message: " + response);
}
Code outputs: Successfully sent message: projects/myappname/messages/0:1590590793114288%9cceaf10deffd7abc
But the device doesn't receive the notification.
It is important to clarify whether your app is in the background or the foreground when the message is sent from the backend; in FCM this matters.
In this case, you are sending a data message, not a notification message.
The difference is explained in the docs:
https://firebase.google.com/docs/cloud-messaging/android/receive
That means you should receive this message in the overridden onMessageReceived(...) method both when the app is in the foreground and when it is in the background.
(You can verify this with a debug log statement.)
If you use data messages, you can define custom key-value pairs and implement the notification handling yourself, whether the app is in the foreground or the background. With the Firebase console, in contrast, you send notification messages: they are processed automatically when your app is in the background, and the overridden onMessageReceived(...) is triggered only in the foreground.
Here is an example of building a notification message (with different priorities, which should not confuse you):
private Message createFCMMessageWithNormalPriority(String notificationMessage) {
    return Message.builder().setNotification(createFCMNotification(notificationMessage)).setToken(registrationToken)
            .setAndroidConfig(createAndroidConfigWithNormalMessagePriority()).build();
}

private Message createFCMMessageWithHighPriority(String notificationMessage) {
    return Message.builder().setNotification(createFCMNotification(notificationMessage)).setToken(registrationToken)
            .setAndroidConfig(createAndroidConfigWithHighMessagePriority()).build();
}

private AndroidConfig createAndroidConfigWithHighMessagePriority() {
    return createAndroidConfigWithPriority(Priority.HIGH);
}

private AndroidConfig createAndroidConfigWithNormalMessagePriority() {
    return createAndroidConfigWithPriority(Priority.NORMAL);
}

private AndroidConfig createAndroidConfigWithPriority(Priority priority) {
    return AndroidConfig.builder().setPriority(priority).build();
}

private Notification createFCMNotification(String notificationMessage) {
    return Notification.builder().setBody(notificationMessage).setTitle(NOTIFICATION_CONTENT_TITLE).build();
}
Forgive me if this question was already asked; I couldn't find an answer for my case.
I have an Android app with a voice & video call feature, built with WebRTC.
I was able to make both voice and video calls work perfectly inside an Activity, but now I want to keep the call running while the user exits the CallActivity and goes back to the ChatActivity (to send a file/link/photo, for example).
I managed to make the voice call run perfectly inside a background Service, but the video call won't work as expected: the remote video is not displayed even though the audio from the video track is playing.
Here is my background Service code:
@Override
public void onAddStream(MediaStream mediaStream) {
    if (mediaStream.videoTracks.size() > Constants.ONE || mediaStream.audioTracks.size() > Constants.ONE) {
        return;
    }
    // Check for a video track; if present, this is a video call
    if (!isAudioCall && mediaStream.videoTracks.size() > Constants.ZERO) {
        remoteVideoTrack = mediaStream.videoTracks.get(Constants.ZERO);
        CallActivityNew.remoteVideoTrack = remoteVideoTrack;
        try {
            localAudioTrack.setEnabled(true);
            // Now ask the UI to display the video track
            sendOrderToActivity(Constants.START_REMOTE_VIDEO, null);
        } catch (Exception ignored) {}
    } else if (mediaStream.audioTracks.size() > Constants.ZERO) {
        // This is a voice call; only audio tracks are available
        remoteAudioTrack = mediaStream.audioTracks.get(Constants.ZERO);
        try {
            localAudioTrack.setEnabled(true);
            remoteAudioTrack.setEnabled(true);
        } catch (Exception ignored) {}
    }
}
And below is my CallActivity code:
case Constants.START_REMOTE_VIDEO: {
    if (remoteVideoView == null) {
        remoteVideoView = findViewById(R.id.remote_gl_surface_view);
    }
    remoteVideoView.init(eglBaseContext, null);
    remoteVideoView.setEnableHardwareScaler(true);
    remoteVideoView.setMirror(true);
    remoteVideoView.setScalingType(RendererCommon.ScalingType.SCALE_ASPECT_FIT);
    remoteVideoView.setZOrderMediaOverlay(true);
    // Attach the video track to the surface view in order to display it
    remoteVideoTrack.addSink(remoteVideoView);
    // Enable the remote video track after a short delay
    Handler handler = new Handler();
    handler.postDelayed(new Runnable() {
        @Override
        public void run() {
            remoteVideoTrack.setEnabled(true);
        }
    }, Constants.TIME_THREE_HUNDRED_MILLIS);
    setSpeakerphoneOn(false);
    break;
}
I am sending orders from the Service to the Activity, and the Constants.START_REMOTE_VIDEO case does run after receiving the order from the Service.
I don't see where the problem is: why am I only hearing sound while the remote video never starts displaying?
Thank you in advance for helping.
After testing for long hours, I found that my code works just fine; I had simply forgotten to change the view's visibility from GONE to VISIBLE.
Yeah, that was the solution, I swear xD
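For anyone hitting the same issue, the fix amounts to something like this when attaching the sink:

// The renderer was still GONE in the layout, so nothing was drawn
remoteVideoView.setVisibility(View.VISIBLE);
remoteVideoTrack.addSink(remoteVideoView);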
So here's my problem. I am making an app where users upload an image which can then be retrieved and viewed by other users. The user is supposed to be able to swipe through these pictures, similar to the way it works in the Facebook app, but in random order. The problem I'm encountering is that the images download very slowly: about 10 seconds per image, which is far too slow when you want to flip through images every 2-3 seconds or so. Currently I use the AWS Transfer Utility to download and upload files to my S3 bucket. An example of my code is below:
TransferObserver observer = transferUtility.download(
        "mybucket", /* The bucket to download from */
        image,      /* The key for the object to download */
        iFile       /* The file to download the object to */
);
observer.setTransferListener(new TransferListener() {
    @Override
    public void onProgressChanged(int id, long bytesCurrent, long bytesTotal) {
        // update progress bar
        //progressBar.setProgress(bytesCurrent);
        Log.i(TAG, "progress changed");
    }

    @Override
    public void onStateChanged(int id, TransferState state) {
        Bitmap myBitmap = BitmapFactory.decodeFile(iFile.getAbsolutePath());
        //Bitmap fin = rotateBitmap(myBitmap, 90);
        //portraitView.setImageBitmap(myBitmap);
        bmp = myBitmap;
        update();
    }

    @Override
    public void onError(int id, Exception ex) {
        Log.e("ERROR", ex.getMessage(), ex);
        Log.i("ERROR", "189");
        Log.i("ERROR", "image is:" + image);
        Log.i("ERROR", "iFile is:" + iFile);
        update();
    }
});
This gets the image file, stores it in iFile, then converts it to a bitmap and displays it on the screen. But it takes 10 seconds or so for onStateChanged() to run.
I have tried this with multiple buckets in multiple US regions (some very close to me, as well as US Standard). I've also tried activating Transfer Acceleration on my buckets, but the best download improvement I've gotten is about 20%, which still isn't going to cut it. Is there any way I can make this run faster? Is it even supposed to run faster?
Should I just use a different place to store my images?
I'm trying to send the contents of a DataMap from an Android device to a wearable. It works fine while my app is in the foreground, but once I lock the mobile device it gets stuck at pendingResult.await() and the wearable doesn't receive any data, whereas it normally would if I kept the app open.
public void send(final DataMap dataMap) {
    new Thread(new Runnable() {
        @Override
        public void run() {
            PutDataMapRequest putDMR = PutDataMapRequest.create(WEARABLE_DATA_PATH);
            putDMR.getDataMap().putAll(dataMap);
            PutDataRequest request = putDMR.asPutDataRequest();
            PendingResult<DataApi.DataItemResult> pendingResult = Wearable.DataApi.putDataItem(googleClient, request);
            DataApi.DataItemResult result = pendingResult.await();
            if (result.getStatus().isSuccess()) {
                Log.d("qwe", "Data item set: " + result.getDataItem().getUri());
            }
        }
    }).start();
}
This method is in a class which extends WearableListenerService, and I have added the XML for the service in the AndroidManifest as well. Am I doing something completely wrong or missing something?
Thanks
Try to check the Google API client status for each send.
Use blockingConnect when the Google API client is not connected.
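Something along these lines inside the background thread, before calling putDataItem (the timeout value is just an example):

if (!googleClient.isConnected()) {
    // Safe to block here because we are already off the main thread
    ConnectionResult connectionResult = googleClient.blockingConnect(30, TimeUnit.SECONDS);
    if (!connectionResult.isSuccess()) {
        Log.e("qwe", "Failed to connect: " + connectionResult.getErrorCode());
        return;
    }
}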
Found out I was doing googleClient.disconnect() in my main activity's onStop(), which caused the hang, since googleClient was no longer connected once my app was in the background.