Forgive me if this question was already asked; I couldn't find an answer for my case.
So, I have an Android app with a Voice & Video call feature. I used WebRTC for this.
I was able to make both voice and video calls work perfectly inside an Activity, but now I want to keep the call running while the user exits the CallActivity and goes back to the ChatActivity (to send a file/link/photo, for example).
I managed to make the voice call run perfectly inside a background Service, but the video call won't work as expected.
The remote video won't be displayed even though the audio from the video track is playing.
Here is my background Service code:
@Override
public void onAddStream(MediaStream mediaStream) {
    // Ignore streams with more than one video or audio track
    if (mediaStream.videoTracks.size() > Constants.ONE || mediaStream.audioTracks.size() > Constants.ONE) {
        return;
    }
    // Check for a video track; its presence means this is a video call
    if (!isAudioCall && mediaStream.videoTracks.size() > Constants.ZERO) {
        remoteVideoTrack = mediaStream.videoTracks.get(Constants.ZERO);
        CallActivityNew.remoteVideoTrack = remoteVideoTrack;
        try {
            localAudioTrack.setEnabled(true);
            // Now ask the UI to display the video track
            sendOrderToActivity(Constants.START_REMOTE_VIDEO, null);
        } catch (Exception ignored) {}
    } else if (mediaStream.audioTracks.size() > Constants.ZERO) {
        // This is a voice call; only audio tracks are available
        remoteAudioTrack = mediaStream.audioTracks.get(Constants.ZERO);
        try {
            localAudioTrack.setEnabled(true);
            remoteAudioTrack.setEnabled(true);
        } catch (Exception ignored) {}
    }
}
And below is my CallActivity code:
case Constants.START_REMOTE_VIDEO: {
    if (remoteVideoView == null) {
        remoteVideoView = findViewById(R.id.remote_gl_surface_view);
    }
    remoteVideoView.init(eglBaseContext, null);
    remoteVideoView.setEnableHardwareScaler(true);
    remoteVideoView.setMirror(true);
    remoteVideoView.setScalingType(RendererCommon.ScalingType.SCALE_ASPECT_FIT);
    remoteVideoView.setZOrderMediaOverlay(true);
    // Attach the video track to the SurfaceView in order to display it
    remoteVideoTrack.addSink(remoteVideoView);
    Handler handler = new Handler();
    handler.postDelayed(new Runnable() {
        @Override
        public void run() {
            // Enable the remote video track after a short delay
            remoteVideoTrack.setEnabled(true);
        }
    }, Constants.TIME_THREE_HUNDRED_MILLIS);
    setSpeakerphoneOn(false);
    break;
}
I am sending orders from the Service to the Activity; the "case Constants.START_REMOTE_VIDEO" branch runs after receiving the order from the Service.
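For context, sendOrderToActivity is not shown above; here is a minimal hypothetical sketch, assuming a LocalBroadcastManager-based channel (the intent action and extra names are illustrative only):
private void sendOrderToActivity(int order, Bundle extras) {
    // Hypothetical implementation: broadcast the order locally; the Activity
    // registers a BroadcastReceiver for this action and switches on "order"
    Intent intent = new Intent("call_service_order");
    intent.putExtra("order", order);
    if (extras != null) {
        intent.putExtras(extras);
    }
    LocalBroadcastManager.getInstance(this).sendBroadcast(intent);
}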
I don't see where the problem is. Why am I only hearing sound while the remote video won't display?
Thank you in advance for helping.
After testing for long hours, I found that my code works just fine; I had simply forgotten to change the view's visibility from GONE to VISIBLE.
Yeah, that was the solution, I swear xD
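In other words, the missing piece was a single line in the START_REMOTE_VIDEO case, something like this (assuming remoteVideoView was left as GONE in the layout):
// Make the renderer visible; attaching the sink alone won't do this
remoteVideoView.setVisibility(View.VISIBLE);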
Related
I have recently added a system volume controller to my app and I have overlooked casting.
The app detects a volume button click using an accessibility service, intercepts the system volume panel by broadcasting the close-system-dialogs intent, and pops up my overlay panel, allowing the user to control audio directly from the panel (alarm, music & ring).
I have already added checks that stop this when the user is in a call or the screen is off.
Is there a way to determine if the Android device is currently casting video or audio?
I have dug through several APIs and they all seem to point to methods within the context of the app, nothing system-wide.
The solution was to create a MediaSessionManager instance, check for active controllers, and then get the PlaybackType.
MediaController mediaController = null;
boolean isCasting = false;
MediaSessionManager mediaSessionManager = (MediaSessionManager) getSystemService(MEDIA_SESSION_SERVICE);
assert mediaSessionManager != null;
List<MediaController> sessions = mediaSessionManager.getActiveSessions(
        new ComponentName(this, NotificationListener.class));
for (MediaController controller : sessions) {
    try {
        // PLAYBACK_TYPE_REMOTE means the session plays on a remote target (casting)
        isCasting = Objects.requireNonNull(controller.getPlaybackInfo()).getPlaybackType()
                == MediaController.PlaybackInfo.PLAYBACK_TYPE_REMOTE;
    } catch (Exception e) {
        e.printStackTrace();
    }
    if (isCasting) {
        mediaController = controller;
        break;
    }
}
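Note that getActiveSessions requires notification access: the user must have enabled the NotificationListener component passed in (via the notification access settings), otherwise the call throws a SecurityException.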
I'm trying to play music in my app, and while the media player is loading I want to allow the user to keep using the app. But the app gets stuck for a few seconds every time I start loading the media player, and only when the media player has finished loading does the app return to normal and start working again. Sometimes it's not just freezing: the OS also shows a popup that prompts the user to quit the app.
I couldn't find any solution on Google or YouTube. Does anyone know what's wrong with my code?
final Handler handler = new Handler();
Runnable runnable = new Runnable() {
    @Override
    public void run() {
        try {
            String STREAM_URL = #####; // here I put the URL of the song
            mediaPlayer = new MediaPlayer();
            mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
            try {
                mediaPlayer.setDataSource(STREAM_URL);
                mediaPlayer.prepare();
            } catch (IOException e) {
                e.printStackTrace();
            }
        } catch (NullPointerException e) {
            Log.d(TAG, "run: NullPointerException = " + e.getMessage());
            FirebaseCrash.log("Tag = " + TAG + "run: NullPointerException = " + e.getMessage());
        }
    }
};
handler.post(runnable);
Even though you are creating a Handler, the creation of the MediaPlayer still happens on the main UI thread. You should call prepareAsync or use an AsyncTask or some other means to avoid calling prepare on the main thread.
From the documentation for Handler:
When you create a new Handler, it is bound to the thread / message queue of the thread that is creating it
If you are streaming music from the network, preparing the media for playback is especially going to take a while. One option may be to call prepareAsync instead of calling prepare. In that case, you should set the OnPreparedListener, and in that callback call start.
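For example, here is a minimal sketch of that approach, reusing the names from the question's snippet (STREAM_URL stands in for the song URL):
mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        // Runs once buffering/preparation is done; safe to start playback here
        mp.start();
    }
});
try {
    mediaPlayer.setDataSource(STREAM_URL);
    // Returns immediately and prepares in the background, so the UI stays responsive
    mediaPlayer.prepareAsync();
} catch (IOException e) {
    e.printStackTrace();
}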
Basically, what I am trying to do is change the CONTROL_AE_MODE with a button click in the app. The user can use auto flash (ON_AUTO_FLASH), turn it on (ON_ALWAYS_FLASH), or turn it off (CONTROL_AE_MODE_OFF).
In this example: https://github.com/googlesamples/android-Camera2Basic/blob/master/Application/src/main/java/com/example/android/camera2basic/Camera2BasicFragment.java
Line 818, they set the flash once:
// Use the same AE and AF modes as the preview.
captureBuilder.set(CaptureRequest.CONTROL_AF_MODE,
        CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
setAutoFlash(captureBuilder);

// Orientation
int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));

CameraCaptureSession.CaptureCallback CaptureCallback
        = new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                   @NonNull CaptureRequest request,
                                   @NonNull TotalCaptureResult result) {
        showToast("Saved: " + mFile);
        Log.d(TAG, mFile.toString());
        unlockFocus();
    }
};

mCaptureSession.stopRepeating();
mCaptureSession.capture(captureBuilder.build(), CaptureCallback, null);
And then builds the CaptureSession at line 840.
Is there a way to change the CONTROL_AE_MODE after the preview is made?
I have tried remaking the session, which kinda worked:
if (flashMode == CameraView.CAMERA_FLASH_ON) {
    Log.e("CAMERA 2", "FLASH ON");
    mPreviewCaptureRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE,
            CaptureRequest.CONTROL_AE_MODE_ON_ALWAYS_FLASH);
} else if (flashMode == CameraView.CAMERA_FLASH_OFF) {
    Log.e("CAMERA 2", "FLASH OFF");
    mPreviewCaptureRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE,
            CaptureRequest.CONTROL_AE_MODE_OFF);
} else if (flashMode == CameraView.CAMERA_FLASH_AUTO) {
    Log.e("CAMERA 2", "FLASH AUTO");
    mPreviewCaptureRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE,
            CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
}
mFlashMode = flashMode;
if (mCameraCaptureSession != null) {
    mCameraCaptureSession.close();
    mCameraCaptureSession = null;
}
createCameraPreviewSession();
For some reason, CONTROL_AE_MODE_OFF would turn the whole preview black.
I tried looking in the docs for methods to update the mode but haven't found anything.
Any tutorials or docs are much appreciated.
As mentioned by @cyborg86pl, when switching flash modes you should not switch CONTROL_AE_MODE. Instead, you can switch between FLASH_MODEs. Here is a working example for my case:
when (currentFlashState) {
    FlashState.AUTO -> {
        previewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE,
                CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH)
    }
    FlashState.ON -> {
        previewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE,
                CaptureRequest.CONTROL_AE_MODE_ON)
        previewRequestBuilder.set(CaptureRequest.FLASH_MODE,
                CameraMetadata.FLASH_MODE_TORCH)
    }
    FlashState.OFF -> {
        previewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE,
                CaptureRequest.CONTROL_AE_MODE_ON)
        previewRequestBuilder.set(CaptureRequest.FLASH_MODE,
                CaptureRequest.FLASH_MODE_OFF)
    }
}
previewRequest = previewRequestBuilder.build()
captureSession.setRepeatingRequest(previewRequest, captureCallback, backgroundHandler)
I don't know why your preview turns black, but you don't need to close the capture session manually. From the .close() method's docs:
Using createCaptureSession(List<Surface>, CameraCaptureSession.StateCallback, Handler) directly without closing is the recommended approach for quickly switching to a new session, since unchanged target outputs can be reused more efficiently.
So you can reuse the existing CaptureRequest.Builder, set your changed value, build a new PreviewRequest, and set it as the new repeating request on the existing session, like this:
try {
    // Change some capture settings
    mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
    // Build a new request (we can't just edit the existing one, as it is immutable)
    mPreviewRequest = mPreviewRequestBuilder.build();
    // Set our changed request as the new repeating request
    mCaptureSession.setRepeatingRequest(mPreviewRequest, mCaptureCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
    e.printStackTrace();
}
It will be much faster (almost no visible freeze of the preview).
What you want is to disable the flash, not auto-exposure (AE), so you want CONTROL_AE_MODE_ON rather than CONTROL_AE_MODE_OFF.
As mentioned in the documentation:
CONTROL_AE_MODE_ON
The camera device's autoexposure routine is active, with no flash control.
I'm trying to send the content of a DataMap from an Android device to a wearable. It works fine while the app is in the foreground, but once I lock the mobile device it gets stuck at pendingResult.await() and the wearable doesn't receive any data, whereas it normally would if I keep the app open.
public void send(final DataMap dataMap) {
    new Thread(new Runnable() {
        @Override
        public void run() {
            PutDataMapRequest putDMR = PutDataMapRequest.create(WEARABLE_DATA_PATH);
            putDMR.getDataMap().putAll(dataMap);
            // Build the request from the PutDataMapRequest before sending it
            PutDataRequest request = putDMR.asPutDataRequest();
            PendingResult<DataApi.DataItemResult> pendingResult =
                    Wearable.DataApi.putDataItem(googleClient, request);
            DataApi.DataItemResult result = pendingResult.await();
            if (result.getStatus().isSuccess()) {
                Log.d("qwe", "Data item set: " + result.getDataItem().getUri());
            }
        }
    }).start();
}
This method is in a class which extends WearableListenerService, and I have also added the XML for the service in the AndroidManifest. Am I doing something completely wrong or missing something?
Thanks
Try checking the GoogleApiClient connection status before each send.
Use blockingConnect when the client is not connected.
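A minimal sketch of that check, assuming it runs inside the worker thread from the question (blockingConnect must not be called on the main thread; the 30-second timeout is illustrative):
if (!googleClient.isConnected()) {
    // Block until connected or the timeout elapses
    ConnectionResult connectionResult = googleClient.blockingConnect(30, TimeUnit.SECONDS);
    if (!connectionResult.isSuccess()) {
        Log.e("qwe", "Failed to connect GoogleApiClient: " + connectionResult.getErrorCode());
        return;
    }
}
// The client is connected; it is now safe to call Wearable.DataApi.putDataItem(...)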
Found out I was doing googleClient.disconnect() in my main activity's onStop(), which caused the hang: googleClient was no longer connected once my app was in the background.
I'm quite new to Android and I have searched about this for quite a while. I would like to build an application that is something like a decibel meter: in real time it shows the sound level, with something indicating whether the room is noisy or quiet.
I don't have any idea at all how to do this. Could anyone explain the basics of a microphone sound-level application? If possible, maybe provide some code?
Thanks!
You can use MediaRecorder.getMaxAmplitude(). First, make sure the RECORD_AUDIO permission is granted:
if (ActivityCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
        != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(this,
            new String[]{Manifest.permission.RECORD_AUDIO}, RECORD_AUDIO);
}
Set up and start the MediaRecorder (the output goes to /dev/null, since only the amplitude is needed, not a recording):
mRecorder = new MediaRecorder();
mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
// Discard the recording itself; we only poll the amplitude
mRecorder.setOutputFile("/dev/null");
try {
    mRecorder.prepare();
} catch (IllegalStateException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
mRecorder.start();
Start monitoring from a Runnable, which begins measuring and schedules the polling task (mSensor here is the helper object exposing start() and getAmplitude()):
private Runnable mSleepTask = new Runnable() {
    public void run() {
        //Log.i("Noise", "runnable mSleepTask");
        mSensor.start();
        if (!mWakeLock.isHeld()) {
            mWakeLock.acquire();
        }
        // Noise monitoring starts here:
        // Runnable (mPollTask) will execute after POLL_INTERVAL
        mHandler.postDelayed(mPollTask, POLL_INTERVAL);
    }
};
Create a Runnable to check the noise level periodically:
private Runnable mPollTask = new Runnable() {
    public void run() {
        double amp = mSensor.getAmplitude();
        //Log.i("Noise", "runnable mPollTask");
        // Runnable (mPollTask) will execute again after POLL_INTERVAL
        mHandler.postDelayed(mPollTask, POLL_INTERVAL);
    }
};
Convert the amplitude to decibels using the following formula:
public double getAmplitudeDb() {
    // 0 dB corresponds to an amplitude of 2700.0 with this reference value
    return 20 * Math.log10(mRecorder.getMaxAmplitude() / 2700.0);
}
Monitor the voice level and alert when the noise exceeds the threshold:
// Create runnable thread to monitor voice
private Runnable mPollTask = new Runnable() {
    public void run() {
        double amp = mSensor.getAmplitude();
        //Log.i("Noise", "runnable mPollTask");
        updateDisplay("Monitoring Voice...", amp);
        if (amp > mThreshold) {
            callForHelp(amp);
            //Log.i("Noise", "==== onCreate ===");
        }
        // Runnable (mPollTask) will execute again after POLL_INTERVAL
        mHandler.postDelayed(mPollTask, POLL_INTERVAL);
    }
};
This question has been addressed generally for Java, and the required classes are available in Android.
The basic idea is to sample the data line for the microphone and calculate the level from the returned buffer, as in the sketch after the link below.
How to calculate the level/amplitude/db of audio signal in java?
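A minimal sketch of that idea on Android, assuming the RECORD_AUDIO permission is granted: read raw PCM samples with AudioRecord and compute the RMS level of one buffer.
int sampleRate = 44100;
int bufferSize = AudioRecord.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
short[] buffer = new short[bufferSize];
recorder.startRecording();
int read = recorder.read(buffer, 0, buffer.length);
// Root-mean-square of the samples in this buffer
double sum = 0;
for (int i = 0; i < read; i++) {
    sum += (double) buffer[i] * buffer[i];
}
double rms = Math.sqrt(sum / read);
// Level in dB relative to 16-bit full scale (0 dBFS = 32768)
double db = 20 * Math.log10(rms / 32768.0);
recorder.stop();
recorder.release();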
You can also have a look at the Visualizer class, which does FFT frequency analysis; however, the microphone permissions may not be consistent across various devices, and you may have to connect it to the Equalizer class to access the mic. A rough sketch follows the link below.
https://developer.android.com/reference/android/media/audiofx/Visualizer.html
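A hedged sketch of the Visualizer approach, assuming audio session 0 (the global output mix); RECORD_AUDIO, and on some devices MODIFY_AUDIO_SETTINGS, must be granted:
Visualizer visualizer = new Visualizer(0); // session 0 = global output mix
visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
visualizer.setDataCaptureListener(new Visualizer.OnDataCaptureListener() {
    @Override
    public void onWaveFormDataCapture(Visualizer v, byte[] waveform, int samplingRate) {
        // waveform holds 8-bit unsigned PCM samples centered at 128
    }

    @Override
    public void onFftDataCapture(Visualizer v, byte[] fft, int samplingRate) {
        // fft holds the frequency spectrum of the captured frame
    }
}, Visualizer.getMaxCaptureRate() / 2, true, true);
visualizer.setEnabled(true);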
There's a great app in the marketplace called Audalyzer
http://code.google.com/p/moonblink/wiki/Audalyzer
Also check this discussion:
Android: sample microphone without recording to get live amplitude/level?