Android Service stops recording audio when app is killed - java

My problem changed a bit, please take a look at the EDIT 2 below
I am learning how to work with Services and audio recording on Android.
I want to create an app, that will do nothing except starting a service: after requesting permissions (RECORD_AUDIO, INTERNET) the app only calls startService() from the MainActivity.
The service will then record audio and stream it to given IP address. I took my inspiration for the audio streaming server and client from the answers to this question. I am testing the app on Android 6.0 in the Android Studio Emulator so far.
This is the onStartCommand() method of my service. I create a new thread where I wait until the user has clicked "Allow" in the permissions request dialog. When I receive the permissions I initialize the AudioRecord and then I read from it in a while-loop and send the data away.
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
Thread streamThread = new Thread(new Runnable() {
@Override
public void run() {
waitForPermissions();
try {
DatagramSocket socket = new DatagramSocket();
byte[] buffer = new byte[minBufSize];
DatagramPacket packet;
final InetAddress destination = InetAddress.getByName(address);
recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate, channelConfig, audioFormat, minBufSize * 10);
recorder.startRecording();
while (status == true) {
minBufSize = recorder.read(buffer, 0, buffer.length);
packet = new DatagramPacket(buffer, buffer.length, destination, port);
socket.send(packet);
}
} catch (...) {...}
}
});
streamThread.start();
return START_STICKY;
}
In onDestroy() I set the flag status to false so that the loop in the thread created in onStartCommand() terminates. Then I release the recorder so that it can be initialized and used again.
@Override
public void onDestroy(){
status = false;
recorder.stop();
recorder.release();
super.onDestroy();
}
This app works only partially:
When I launch the listening server on the computer and the app in the emulator, I can hear the audio normally.
When I minimize the app (tap Home button), the audio stream is still OK.
My problem is that when I kill the app (swipe it right in the running apps screen), the audio stops. In the Logcat I can see that the service has been restarted and runs as usual (initializes the AudioRecord and everything, no exceptions thrown), only I can't hear anything. When I run the app again (and therefore call the startService() again), I see an exception in the Logcat telling me that the AudioRecord start failed with an error code -38 (meaning that the microphone is in use by the previous instance of the service).
What am I doing wrong here? Why is the service running, but audio not streaming when I kill the app?
Many thanks for your answers.
EDIT:
I know this approach won't work on newer Android versions. This app is just for private purposes and will be run on one of my old phones, either with Android 5.1.1 or 6.0.
After I read the article in this answer I moved what was previously in onDestroy to onTaskRemoved. However, nothing changed.
The Service is apparently still sending some packets after it is restarted. I looked at them in Wireshark and the packets are still leaving the device and the Server is still receiving them. The payload of the packets is nonzero, unique for every packet.
EDIT 2:
I changed the buffer size to AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat) * 10 in the Android app and AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat) * 20 in the Server application. Now when I swipe the app, the Service restarts and then I can hear some audio. But the sound is very chunky after the restart and I don't know why. It also gets chunky when I uninstall the app, keep the server running and then install and run the app again.
So the problem is probably on the Server side. What could be the problem with the server? Am I working correctly with the SourceDataLine? The Server code looks like this (taken from here):
...
static int sampleRate = 8000;
static int bufferLen = minBufSize * 20;
public static void main(String args[]) throws Exception {
DatagramSocket serverSocket = new DatagramSocket(port);
byte[] receiveData = new byte[bufferLen];
format = new AudioFormat(sampleRate, 16, 1, true, false);
dataLineInfo = new DataLine.Info(SourceDataLine.class, format);
sourceDataLine = (SourceDataLine) AudioSystem.getLine(dataLineInfo);
sourceDataLine.open(format);
sourceDataLine.start();
FloatControl volumeControl = (FloatControl) sourceDataLine.getControl(FloatControl.Type.MASTER_GAIN);
volumeControl.setValue(volumeControl.getMaximum());
while (status == true) {
DatagramPacket receivePacket = new DatagramPacket(receiveData, receiveData.length);
serverSocket.receive(receivePacket);
toSpeaker(receivePacket.getData(), receivePacket.getLength());
}
sourceDataLine.drain();
sourceDataLine.close();
}
public static void toSpeaker(byte soundbytes[], int length) {
try {
sourceDataLine.write(soundbytes, 0, length);
} catch (...) {...}
}
...
The sample rate of the server is identical to the Android app. Format is AudioFormat.ENCODING_PCM_16BIT.
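One thing worth double-checking on the server side (a guess, not something the question confirms): a received DatagramPacket can be shorter than the receive buffer, so only the first getLength() bytes are valid audio; writing the whole buffer to the SourceDataLine would replay stale bytes and could sound chunky. A minimal plain-Java loopback sketch (UdpLengthDemo is a hypothetical name):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class UdpLengthDemo {
    public static void main(String[] args) throws Exception {
        try (DatagramSocket receiver = new DatagramSocket(0);
             DatagramSocket sender = new DatagramSocket()) {
            byte[] payload = "hello".getBytes("US-ASCII"); // 5 bytes
            InetAddress loopback = InetAddress.getLoopbackAddress();
            sender.send(new DatagramPacket(payload, payload.length,
                    loopback, receiver.getLocalPort()));

            byte[] buffer = new byte[1024]; // deliberately larger than the payload
            DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
            receiver.receive(packet);

            // Only packet.getLength() bytes are valid; the rest of the
            // buffer is stale data and would play back as noise.
            System.out.println(packet.getLength());
            System.out.println(new String(packet.getData(), 0,
                    packet.getLength(), "US-ASCII"));
        }
    }
}
```

The server above does already pass receivePacket.getLength() to toSpeaker(), so this only matters if the full buffer length is used anywhere else in the pipeline.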

Read this article: What exactly happens to running services when you swipe an android app from recent app list?
In your case the service stops working; on recreation (because of START_STICKY) it seems you have a problem where the recording fails, though I don't know what it is. To improve your service, use an IntentService or a foreground service to solve the problem.
BTW, if you want to use your app on Android 8 or higher, you will probably have to use a foreground service only, since an IntentService will be killed by the system after several minutes if the app is in the background. Read this: Background Execution Limits
So after testing all the solutions, you might agree with me on using a foreground service: Android Foreground Service Example

Related

Android get frame from an RTSP Stream of an IP Camera(PTZ)

I am developing an app that plays an RTSP stream from a camera using VLC, but I have a situation where I need to get a frame from the stream without playing it. Kindly suggest the proper way of doing this, and recommend the best library or SDK to achieve it.
I have used MediaMetadataRetriever, but so far no luck. I have also used FFmpegMediaMetadataRetriever, but the app crashes.
D/apitrace: apitrace: warning: caught signal 6
D/apitrace: call flush from exceptionCallback
A/libc: Fatal signal 6 (SIGABRT), code -1 (SI_QUEUE) in tid 5035 (pool-3-thread-1), pid 4832
I have used the code below.
ExecutorService executor = Executors.newSingleThreadExecutor();
Handler handler = new Handler(Looper.getMainLooper());
executor.execute(()->{
Bitmap bitmap;
try {
if (mmr == null){
mmr = new FFmpegMediaMetadataRetriever();
}
mmr.setDataSource(url);
bitmap = mmr.getFrameAtTime();
Log.d("BITMAP_IS_DONE",bitmap.toString());
mmr.release();
handler.post(()->{
thumbnailArrays[1].setImageBitmap(bitmap);
});
} catch (IllegalArgumentException ie) {
ie.printStackTrace();
}
});

Scanning for Bluetooth LE devices doesn't work when the phone is in doze mode. Callback method isn't called in doze mode

I am developing an Android app (the app runs on Android 6): I want the app to send a notification to the user when it is near a BLE device (a device that I have at home). So I scan continuously, through a service running in the background. It works well when the phone screen is on; but when the screen turns off, a few seconds later the application can no longer find the BLE device (the scan is still running, but there is no callback).
if (enable) {
if (mScanning) return;
// Stops scanning after a pre-defined scan period.
handler.postDelayed(new Runnable() {
@Override
public void run() {
if (!mScanning) return;
try {
mScanning = false;
mBluetoothLeScanner.stopScan(mScanCallback);
Log.i(TAG_LOG, "Stop scanning after pre-defined scan period");
} catch (Exception e){Log.e(TAG_LOG,"mBluetoothLeScanner.stopScan Exception:=>"+e.getMessage());}
}
}, SCAN_PERIOD);
mScanning = true;
mBluetoothLeScanner.startScan(filters, settings, mScanCallback);
Log.i(TAG_LOG, "Start scanning ....");
}
private ScanCallback mScanCallback = new ScanCallback() {
//When a BLE advertisement has been found
@Override
public void onScanResult(int callbackType, ScanResult result) {
super.onScanResult(callbackType, result);
Log.i(TAG_LOG, "Name: "+result.getDevice().getName()+". Adresse: "+result.getDevice().getAddress()+". Rssi: "+result.getRssi());
//scanDevices(false);
if(result.getDevice().getName() != null && result.getDevice().getName().toString().equals(deviceName)){
mDeviceAdress = result.getDevice().getAddress();
mDevice = mBluetoothAdapter.getRemoteDevice(mDeviceAdress);
Log.i(TAG_LOG, "Device found");
scanDevices(false);
}
}
You can't make this work. Scanning is a very expensive operation that Android won't allow in the background. Instead, make an attempt to connect to the device. I had success doing this in a WorkManager job, running every 15 minutes. Battery drain was negligible and it was pretty reliable. Note that a connection state 0x85 usually represents the device being out of range, and 0x80 means a different device is already connected to it (or the phone is already connected to too many different devices). Full error list is at https://android.googlesource.com/platform/external/bluetooth/bluedroid/+/master/stack/include/gatt_api.h#27

Android webRTC video call inside a Background Service

Forgive me if this question was already asked, I couldn't find an answer for my case.
So, I have an Android app with Voice & Video call feature. I used webRTC for this.
I was able to make both Voice and Video calls work perfectly inside an Activity, but now I want to keep the call running while the user exits the CallActivity and goes back to the ChatActivity (to send a file/link/photo, for example).
I managed to make the Voice call run perfectly inside a Background Service, but the video call won't work as expected.
The remote video won't be displayed even though the audio from the video track is playing.
Here is my Background Service code:
@Override
public void onAddStream(MediaStream mediaStream) {
if (mediaStream.videoTracks.size() > Constants.ONE || mediaStream.audioTracks.size() > Constants.ONE) {
return;
}
//check for video track, means this is a video call
if (!isAudioCall && mediaStream.videoTracks.size() > Constants.ZERO) {
remoteVideoTrack = mediaStream.videoTracks.get(Constants.ZERO);
CallActivityNew.remoteVideoTrack = remoteVideoTrack;
try {
localAudioTrack.setEnabled(true);
//Now ask the UI to display the video track
sendOrderToActivity(Constants.START_REMOTE_VIDEO, null);
} catch (Exception ignored) {}
} else if (mediaStream.audioTracks.size() > Constants.ZERO) {
//Means this is a Voice call, only audio tracks available
remoteAudioTrack = mediaStream.audioTracks.get(Constants.ZERO);
try {
localAudioTrack.setEnabled(true);
remoteAudioTrack.setEnabled(true);
} catch (Exception ignored) {}
}
}
And below is my CallActivity code:
case Constants.START_REMOTE_VIDEO: {
if (remoteVideoView == null) {
remoteVideoView = findViewById(R.id.remote_gl_surface_view);
}
remoteVideoView.init(eglBaseContext, null);
remoteVideoView.setEnableHardwareScaler(true);
remoteVideoView.setMirror(true);
remoteVideoView.setScalingType(RendererCommon.ScalingType.SCALE_ASPECT_FIT);
remoteVideoView.setZOrderMediaOverlay(true);
//Apply video track to the Surface View in order to display it
remoteVideoTrack.addSink(remoteVideoView);
//now enable local video track
Handler handler = new Handler();
handler.postDelayed(new Runnable() {
@Override
public void run() {
//now enable local video track
remoteVideoTrack.setEnabled(true);
}
}, Constants.TIME_THREE_HUNDRED_MILLIS);
setSpeakerphoneOn(false);
break;
}
I am sending orders from the Service to the Activity; the "case Constants.START_REMOTE_VIDEO" block runs after receiving the order from the Service.
I don't see where the problem is. Why am I only hearing sound while the remote video won't start displaying?
Thank you in advance for helping.
After testing for long hours, I found that my code works just fine; I had just forgotten to change the view visibility from "GONE" to "VISIBLE".
Yeah, that was the solution, I swear xD

Android/Java Thread sync: while(true){} causing blocking

I'm trying to better understand the behavior of threads in my Android app. For some reason, when I use while(true) in one of my worker threads, code within that thread's run method that exists sequentially BEFORE the while(true) loop never executes. To be clear, I'm not sure whether the code (Toast messages) actually isn't executing, or whether the way thread synchronization is handled by the Android OS is causing my Toast messages not to display. This behavior appears to be some sort of blocking, but I can't figure out why it happens.
My app uses 3 threads: the UI thread (the default/main thread in an Android app), a thread to infinitely read data from the device's USB port during runtime, and a thread to process this data via messages from the USB-read thread. The problem seems to occur in my UsbController class. When I comment out my infinite while loop, all of the Toast messages before the start of the loop display just fine. When I don't comment out my while(true), NO TOAST MESSAGES EVER DISPLAY! I'm pretty confused by this; I think I'm misunderstanding something fundamental about thread handling in the Android OS. Even if a while loop were to cause blocking, which I don't think it does since it resides in a worker thread, why wouldn't the Toast messages that occur before the while loop be triggered? Is this a synchronization issue? Am I misusing Android's Handler-Looper system?
Code below. Note: I've included the relevant portion of the main activity and the entirety of the UsbController class. My implementation of this class relies heavily on the USB-to-serial library found here: mik3y/usb-serial-for-android. I don't think it's necessary, but I've included the class that contains my third thread, SensorDataBuffer, which receives messages from the UsbController thread.
UsbController.java
public class UsbController extends Thread{
...
@Override
public void run() {
android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_DEFAULT); //sets thread to default queuing priority
Looper.prepare();
Toast.makeText(mContext.getApplicationContext(), "Hello from UsbController's run method!", Toast.LENGTH_SHORT).show();
// **********************USB otg*******************************
//Obtain permission to use Android device's USB intent
PendingIntent mPermissionIntent;
mPermissionIntent = PendingIntent.getBroadcast(mContext, 0, new Intent(ACTION_USB_PERMISSION), 0);
// Find all available drivers from attached devices.
ProbeTable customTable = new ProbeTable();
customTable.addProduct(0x03EB, 0x2044, CdcAcmSerialDriver.class);
UsbManager manager = (UsbManager) mContext.getSystemService(Context.USB_SERVICE);
UsbSerialProber prober = new UsbSerialProber(customTable);
List<UsbSerialDriver> availableDrivers = prober.findAllDrivers(manager);
if (availableDrivers.isEmpty()) {
Toast.makeText(mContext.getApplicationContext(), "No available USB drivers found",Toast.LENGTH_SHORT).show(); // Toast message for debugging
}
else { // open connection to first avail. driver
UsbSerialDriver driver = availableDrivers.get(0);
Toast.makeText(mContext.getApplicationContext(), "Driver found",Toast.LENGTH_SHORT).show(); // Toast message for debugging
UsbDeviceConnection connection = manager.openDevice(driver.getDevice());
Toast.makeText(mContext.getApplicationContext(), "Device Driver Opened",Toast.LENGTH_SHORT).show(); // Toast message for debugging
if (connection == null) { // You probably need to call UsbManager.requestPermission(driver.getDevice(), ..)
Toast.makeText(mContext.getApplicationContext(),"Connection to device not allowed, need permissions",Toast.LENGTH_LONG).show();
manager.requestPermission(driver.getDevice(),mPermissionIntent); //conn test
if (manager.hasPermission(driver.getDevice())==true){
Toast.makeText(mContext.getApplicationContext(),"Permissions granted",Toast.LENGTH_SHORT).show();
}
}
else { // Read some data! Most have just one port (port 0).
List<UsbSerialPort> myPortList = driver.getPorts();
UsbSerialPort port = myPortList.get(0);
Toast.makeText(mContext.getApplicationContext(),"USB OTG Connection Established",Toast.LENGTH_SHORT).show();
try {
port.open(connection);
port.setParameters(9600, 8, UsbSerialPort.STOPBITS_1, UsbSerialPort.PARITY_NONE); // sets baud rate,databits, stopbits, & parity
port.setDTR(true); //necessary to make Arduino Micro begin running it's program
Toast.makeText(mContext.getApplicationContext(),"port opened, parameters set, DTR set",Toast.LENGTH_SHORT).show();
byte buffer[] = new byte[16];
String incompPacket = "";
Toast.makeText(mContext.getApplicationContext(), "hi again!", Toast.LENGTH_LONG).show();
while (true){ //continuous loop to read data
numBytesRead = port.read(buffer, 100);
arduinoData = new String(buffer, "US-ASCII");
String raw = arduinoData.substring(0, numBytesRead);
if (numBytesRead > 0) {
...
}
}
} catch (IOException e) {
Toast.makeText(mContext, e.getMessage(), Toast.LENGTH_SHORT).show();
}
}
}
Looper.loop();
}
}
MainActivity.java
...
@Override
protected void onCreate(Bundle savedInstanceState) {
//Multi-threading
//Create thread to handle incoming data from USB Controller thread
SensorDataBuffer pressureDataBuffer = new SensorDataBuffer(MainActivity.this);
Thread bufferThread = new Thread(pressureDataBuffer);
bufferThread.start();
//Create USB Serial Worker thread which will continuously receive data
UsbController serialDataLink = new UsbController(PlayFrets.this);
Thread sensorMonitorThread = new Thread(serialDataLink);
sensorMonitorThread.start();
//Toast.makeText(this, "USB Controller thread started", Toast.LENGTH_SHORT).show();
//Build GUI
super.onCreate(savedInstanceState);
requestWindowFeature(Window.FEATURE_NO_TITLE); //Removes action bar from display
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN); //Removes status bar from display
//Create AsyncTask to load the note files. A splash screen will be displayed while task is executing
new AsyncTask_NoteFileLoader(this).execute();
}
...
SensorDataBuffer.java
public class SensorDataBuffer extends Thread{
//Handler subclass which accepts messages one by one in
//the main activity's FIFO message queue called a "Looper"
//The worker thread, sensorMonitor, runs UsbController in parallel
//with the UI thread and continuously formats and sends pressure sensor
//values read from the microcontroller to the Handler which updates the
//corresponding pressure state logic variables in the UI thread.
public void run(){
android.os.Process.setThreadPriority(Process.THREAD_PRIORITY_URGENT_AUDIO); //TODO:priority was previously more favorable, test this to ensure UI doesn't lag
Looper.prepare(); //create MessageQue to receive messages from USB Controller thread
UsbController.setHandler(bufferHandler);
bufferHandler = new Handler(Looper.myLooper()) {
//do stuff
};
Looper.loop();
}
}
How about using HandlerThreads, Handlers and Runnables instead? It makes your code a lot cleaner and easier to maintain.
In your onCreate() just create a couple of them:
HandlerThread usbThread = new HandlerThread("USBController");
usbThread.start();
usbHandler = new Handler(usbThread.getLooper());
HandlerThread sensorThread = new HandlerThread("SensorDataBuffer");
sensorThread.start();
sensorHandler = new Handler(sensorThread.getLooper());
Then you create your Runnables and post them to the Handlers
usbHandler.post(new Runnable(){
@Override
public void run(){
//....
numBytesRead = port.read(buffer, 100);
if (numBytesRead > 0) {
sensorHandler.post(new Runnable(){
@Override
public void run(){ /* doSomething */ }
});
}
//....
if (isStillRunning)
usbHandler.post(this);
}
});
You can let the Runnable post itself and it will run forever. From within it, you can post Runnables to other Handlers (like the main thread's Handler) to show your Toasts.
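For readers without the Android SDK at hand, the self-reposting pattern the answer describes can be sketched with a plain-Java single-thread executor (SelfPostingLoop, the iteration count, and the latch are hypothetical; on Android you would use handler.post(this) exactly as shown above):

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class SelfPostingLoop {
    public static void main(String[] args) throws Exception {
        ExecutorService worker = Executors.newSingleThreadExecutor();
        AtomicInteger iterations = new AtomicInteger();
        CountDownLatch done = new CountDownLatch(1);

        Runnable task = new Runnable() {
            @Override
            public void run() {
                // one "read" iteration; other queued tasks can run in between
                if (iterations.incrementAndGet() < 5) {
                    worker.execute(this); // repost, like handler.post(this)
                } else {
                    done.countDown();     // like clearing isStillRunning
                }
            }
        };
        worker.execute(task);
        done.await(5, TimeUnit.SECONDS);
        worker.shutdown();
        System.out.println(iterations.get());
    }
}
```

Because each iteration is a separate task, the queue never blocks: anything else posted to the same thread gets its turn between reposts, which is exactly why the Toasts would get a chance to run.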

Errors when recording sound in Android

I have made an app that records sound and analyses it for frequency. This process is repeated a couple of times every second and thus uses threading.
This does work most of the time, but for some reason in the logcat I get these messages repeated after the first analysis.
Rarely (but sometimes) when I test, the app records no sound. So I'm thinking it has something to do with this error.
01-23 13:52:03.414: E/AudioRecord(3647): Could not get audio input for record source 1
01-23 13:52:03.424: E/AudioRecord-JNI(3647): Error creating AudioRecord instance: initialization check failed.
01-23 13:52:03.424: E/AudioRecord-Java(3647): [ android.media.AudioRecord ] Error code -20 when initializing native AudioRecord object.
The code is below; does anyone have any idea where I'm going wrong? Am I not killing the AudioRecord object correctly? The code has been modified for ease of reading:
public class recorderThread extends AsyncTask<Sprite, Void, Integer> {
short[] audioData;
int bufferSize;
@Override
protected Integer doInBackground(Sprite... ball) {
boolean recorded = false;
int sampleRate = 8192;
AudioRecord recorder = instatiateRecorder(sampleRate);
while (!recorded) { //loop until recording is running
if (recorder.getState()==android.media.AudioRecord.STATE_INITIALIZED) // check to see if the recorder has initialized yet.
{
if (recorder.getRecordingState()==android.media.AudioRecord.RECORDSTATE_STOPPED)
recorder.startRecording();
//check to see if the Recorder has stopped or is not recording, and make it record.
else {
//read the PCM audio data into the audioData array
//get frequency
//checks if correct frequency, assigns number
int correctNo = correctNumber(frequency, note);
checkIfMultipleNotes(correctNo, max_index, frequency, sampleRate, magnitude, note);
if (audioDataIsNotEmpty())
recorded = true;
return correctNo;
}
}
else
{
recorded = false;
recorder = instatiateRecorder(sampleRate);
}
}
if (recorder.getState()==android.media.AudioRecord.RECORDSTATE_RECORDING)
{
killRecorder(recorder);
}
return 1;
}
private void killRecorder(AudioRecord recorder) {
recorder.stop(); //stop the recorder before ending the thread
recorder.release(); //release the recorders resources
recorder=null; //set the recorder to be garbage collected
}
@Override
protected void onPostExecute(Integer result) {
ballComp.hitCorrectNote = result;
}
private AudioRecord instatiateRecorder(int sampleRate) {
bufferSize= AudioRecord.getMinBufferSize(sampleRate,AudioFormat.CHANNEL_CONFIGURATION_MONO,
AudioFormat.ENCODING_PCM_16BIT)*2;
//get the buffer size to use with this audio record
AudioRecord recorder = new AudioRecord (AudioSource.MIC,sampleRate,AudioFormat.CHANNEL_CONFIGURATION_MONO,
AudioFormat.ENCODING_PCM_16BIT,bufferSize);
//instantiate the AudioRecorder
audioData = new short [bufferSize];
//short array that pcm data is put into.
return recorder;
}
}
As your log says "Could not get audio input for record source 1", the Android device could not find any hardware for recording the sound.
So if you are testing the app from the emulator, make sure that you have successfully attached a microphone for recording the sound; and if you are debugging or running it on a device, make sure that the mic is on to record the sound.
Hope it will help you.
Or
If the above does not solve your issue, then use the code below to record the sound, as it works perfectly for me.
Code:
record.setOnClickListener(new View.OnClickListener()
{
boolean mStartRecording = true;
public void onClick(View v)
{
if (mStartRecording==true)
{
//startRecording();
haveStartRecord=true;
String recordWord = wordValue.getText().toString();
String file = Environment.getExternalStorageDirectory().getAbsolutePath();
file = file+"/"+recordWord+".3gp";
System.out.println("Recording Start");
//record.setText("Stop recording");
record.setBackgroundDrawable(getResources().getDrawable( R.drawable.rec_on));
mRecorder = new MediaRecorder();
mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
mRecorder.setOutputFile(file);
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
// mRecorder.setAudioChannels(1);
// mRecorder.setAudioSamplingRate(8000);
try
{
mRecorder.prepare();
}
catch (IOException e)
{
Log.e(LOG_TAG, "prepare() failed");
}
mRecorder.start();
}
else
{
//stopRecording();
System.out.println("Recording Stop");
record.setBackgroundDrawable(getResources().getDrawable( R.drawable.rec_off));
mRecorder.stop();
mRecorder.release();
mRecorder = null;
haveFinishRecord=true;
}
mStartRecording = !mStartRecording;
}
});
Hope this answer help you.
Enjoy. :)
What stops you from having two RecorderThreads running at the same time? Show the code that instantiates one of these objects, executes it, and of course waits for any previous RecorderThread to finish first.
If the answer is that nothing stops two RecorderThreads from running at the same time, then your use of 'static' will obviously be a problem: a second thread will cause the first AudioRecord to be leaked while open. IMHO it's a good idea to try to avoid using static data.
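The leak described here can be illustrated without the Android SDK (FakeRecorder, StaticLeakDemo, and the open counter are hypothetical stand-ins for AudioRecord): a static field holds only one reference, so a second instantiation silently drops the first, still-open recorder.

```java
import java.util.concurrent.atomic.AtomicInteger;

public class StaticLeakDemo {
    static final AtomicInteger openCount = new AtomicInteger();

    // Stand-in for AudioRecord: "opens" a resource on construction.
    static class FakeRecorder {
        boolean released;
        FakeRecorder() { openCount.incrementAndGet(); }
        void release() {
            if (!released) { released = true; openCount.decrementAndGet(); }
        }
    }

    // The bug: one static slot shared by every thread/instance.
    static FakeRecorder recorder;

    public static void main(String[] args) {
        recorder = new FakeRecorder(); // first thread opens the mic
        recorder = new FakeRecorder(); // second overwrites the reference
        recorder.release();            // only the second is ever released
        System.out.println(openCount.get()); // 1 open recorder leaked
    }
}
```

With a real AudioRecord, that leaked first instance is exactly what keeps the microphone busy and makes the next initialization fail.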
I had the same problem. And I solved it by adding
"<uses-permission android:name="android.permission.RECORD_AUDIO"></uses-permission>"
to the "AndroidManifest.xml"
