I'm trying to implement an app that listens to microphone input (specifically, breathing) and presents data based on it. I'm using the Android class AudioRecord, and when trying to instantiate AudioRecord I get three errors:
AudioRecord: AudioFlinger could not create record track, status: -1
AudioRecord-JNI: Error creating AudioRecord instance: initialization check failed with status -1.
android.media.AudioRecord: Error code -20 when initializing native AudioRecord object.
I found this excellent thread: AudioRecord object not initializing
I have borrowed the code from the accepted answer, which tries all sample rates, audio formats and channel configurations, but it didn't help; I get the above errors for every combination. I have also added calls to AudioRecord.release() in several places, as suggested in one of the answers in that thread, but it made no difference.
This is my code:
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.util.Log;
public class SoundMeter {
private AudioRecord ar = null;
private int minSize;
private static int[] mSampleRates = new int[] { 8000, 11025, 22050, 32000, 44100 };
public boolean start() {
ar = findAudioRecord();
if(ar != null){
ar.startRecording();
return true;
}
else{
Log.e("SoundMeter", "ERROR, could not create audio recorder");
return false;
}
}
public void stop() {
if (ar != null) {
ar.stop();
ar.release();
}
}
public double getAmplitude() {
short[] buffer = new short[minSize];
ar.read(buffer, 0, minSize);
int max = 0;
for (short s : buffer)
{
if (Math.abs(s) > max)
{
max = Math.abs(s);
}
}
return max;
}
public AudioRecord findAudioRecord() {
for (int rate : mSampleRates) {
for (short audioFormat : new short[] { AudioFormat.ENCODING_PCM_8BIT, AudioFormat.ENCODING_PCM_16BIT, AudioFormat.ENCODING_PCM_FLOAT }) {
for (short channelConfig : new short[] { AudioFormat.CHANNEL_IN_MONO, AudioFormat.CHANNEL_IN_STEREO }) {
try {
Log.d("SoundMeter", "Attempting rate " + rate + "Hz, bits: " + audioFormat + ", channel: " + channelConfig);
int bufferSize = AudioRecord.getMinBufferSize(rate, channelConfig, audioFormat);
if (bufferSize != AudioRecord.ERROR_BAD_VALUE) {
// check if we can instantiate and have a success
Log.d("SoundMeter", "Found a supported bufferSize, attempting to instantiate");
AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.DEFAULT, rate, channelConfig, audioFormat, bufferSize);
if (recorder.getState() == AudioRecord.STATE_INITIALIZED){
minSize = bufferSize;
return recorder;
}
else
recorder.release();
}
} catch (Exception e) {
Log.e("SoundMeter", rate + " Exception, keep trying.", e);
}
}
}
}
return null;
}
}
I have also added the
<uses-permission android:name="android.permission.RECORD_AUDIO"/>
tag to my manifest file, as a child of the manifest tag and a sibling of the application tag, according to one of the other answers in the thread mentioned above. I have rebuilt the project after adding this tag.
These are the solutions I find when googling the problem, but unfortunately none of them works for me.
I am debugging on my Nexus 5 phone (not an emulator). These errors appear upon calling the constructor of AudioRecord. I have rebooted my phone several times to try to release the microphone, to no avail. The project targets Android 4.4, and my phone is currently running Android 6.0.1.
Would highly appreciate some tips on what else I can try, what I could have missed. Thank you!
I found the answer myself. It had to do with permissions.
The problem was that I am running API version 23 (Android 6.0.1) on my phone, which no longer relies solely on the manifest file to handle permissions. From API 23, dangerous permissions are granted at run time instead. I added a method that requests the permission at run time, and once I had allowed it on my phone, it worked.
private void requestRecordAudioPermission() {
//check API version, do nothing if API version < 23!
int currentapiVersion = android.os.Build.VERSION.SDK_INT;
if (currentapiVersion >= android.os.Build.VERSION_CODES.M){
if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
// Should we show an explanation?
if (ActivityCompat.shouldShowRequestPermissionRationale(this, Manifest.permission.RECORD_AUDIO)) {
// Show an explanation to the user *asynchronously* -- don't block
// this thread waiting for the user's response! After the user
// sees the explanation, try again to request the permission.
} else {
// No explanation needed, we can request the permission.
ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.RECORD_AUDIO}, 1);
}
}
}
}
@Override
public void onRequestPermissionsResult(int requestCode, String permissions[], int[] grantResults) {
switch (requestCode) {
case 1: {
// If request is cancelled, the result arrays are empty.
if (grantResults.length > 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
// permission was granted, yay! Do the
// audio-related task you need to do.
Log.d("Activity", "Granted!");
} else {
// permission denied, boo! Disable the
// functionality that depends on this permission.
Log.d("Activity", "Denied!");
finish();
}
return;
}
// other 'case' lines to check for other
// permissions this app might request
}
}
I then call requestRecordAudioPermission() from the onCreate() method in my main activity before creating the AudioRecord.
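For reference, the wiring looks roughly like this (a sketch, not my exact code; the soundMeter field and layout name are assumptions):
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main); // layout name assumed

    // Ask for RECORD_AUDIO before any AudioRecord is created. The dialog is
    // asynchronous, so on the very first run the recorder is only started
    // once the permission has actually been granted.
    requestRecordAudioPermission();

    if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
            == PackageManager.PERMISSION_GRANTED) {
        soundMeter = new SoundMeter();
        soundMeter.start();
    }
}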
Related
I have an app where I can connect an external biometric reader to the tablet. Everything works fine, but only on API 28.
When I switch to API 30, which is required by the Play Store, the app stops working at the point where it should ask for permission; the application then closes. How can I grant the permission correctly?
This is the error displayed:
/DBG ( 4805): UsbEnumDevices(): fn: 0x71,ctx: 0x17e2
F/example.palchi( 4805): java_vm_ext.cc:579] JNI DETECTED ERROR IN APPLICATION: JNI GetArrayLength called with pending exception java.lang.SecurityException: User has not given 10569/com.example.palchik permission to access device /dev/bus/usb/001/002
The method I use is this:
if (call.method.equals("checkOrRequestPermissions")) {
try {
PendingIntent mPermissionIntent;
mPermissionIntent = PendingIntent.getBroadcast(appContext, 0, new Intent(ACTION_USB_PERMISSION), 0);
boolean checkResult = DPFPDDUsbHost.DPFPDDUsbCheckAndRequestPermissions(appContext, mPermissionIntent, reader.GetDescription().name);
result.success(checkResult);
} catch (DPFPDDUsbException e) {
result.error("2", "USB exception: " + e.getMessage(), "");
return;
}
}
And I do it like this in Flutter:
static Future<bool> checkOrRequestPermissions(BuildContext context) async {
return await platform
.invokeMethod('checkOrRequestPermissions')
.catchError((e) {
if (e is PlatformException) {
if (e.code == "0") {
alertaController.setTitulo(
"error");
leitorInexistente(context);
}else if(e.code == "1"){
alertaController.setTitulo(
"error");
leitorInexistente(context);
}
} else {
print('ERROR $e');
}
});
}
Thanks if anyone can help me!
I am trying to upload an image selected from the gallery to my Spring Boot server, but when my service tries to post the image I get permission denied for the file path. I have added these permissions to my AndroidManifest:
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<!-- permission below just in case, should not be needed I believe -->
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
I then request the permission at run time in order to select the image, and then I want to place it in an inflated view where the user can provide more details about the image, before adding it to a report which I will later post.
Since I ran into this permission trouble, I also ask for the permission again when I try to submit the Report object containing the image Uris.
But still I get this error:
Caused by: java.io.FileNotFoundException: /storage/emulated/0/DCIM/Camera/IMG_20200206_120434.jpg (Permission denied)
Every hit I find for this error on Google points to someone who doesn't request the permission at run time, but I believe I even do it one time too many.
These are some related snippets of my code:
else if (view.getId() == R.id.stubNewBreedingReportSelectImageButt) {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
if (checkSelfPermission(Manifest.permission.READ_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED) {
requestPermissions(new String[] {Manifest.permission.READ_EXTERNAL_STORAGE}, 1);
} else {
getPhotoFromPhone(); // this starts the intent to pick an image
}
}
}
else if (view.getId() == R.id.stubNewBreedingReportSubmitButt) {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
if (checkSelfPermission(Manifest.permission.READ_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED) {
requestPermissions(new String[] {Manifest.permission.READ_EXTERNAL_STORAGE, Manifest.permission.WRITE_EXTERNAL_STORAGE}, 2);
} else {
submitNewBreedingReport();
}
}
}
This is from my onClick(View view) method. The first one works, since I am allowed to pick an image from the gallery. The second one should probably not need to check the permissions at all, based on every example I have found of projects uploading images on Android.
In my onActivityResult(int requestCode, int resultCode, Intent data) method I inflate this "add image details view". I also store the selected Uri as a private Uri selectedImg in the activity for future use. This all seems to work pretty much fine.
Then, when I submit the image (in the submitNewReport() method), I use an ExecutorService (the Java class) to start a new async thread for the upload. In this Callable<> I get an instance of Spring's RestTemplate and try to post the image, but when my restTemplate tries to fetch the file from my Uri, I get the permission denied.
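For reference, the submission itself looks roughly like this (a simplified sketch; the executor field, the buildImageParams() helper and the imageService field are placeholders, not my exact code):
private final ExecutorService executor = Executors.newSingleThreadExecutor();

private void submitNewBreedingReport() {
    final Map<String, Object> imgParams = buildImageParams(); // hypothetical helper
    Future<Gallery> upload = executor.submit(new Callable<Gallery>() {
        @Override
        public Gallery call() {
            // imageService and selectedImg are fields of the activity
            return imageService.uploadPictureWithInfo(selectedImg, imgParams, getApplicationContext());
        }
    });
    // upload.get() would block the UI thread, so the result is handled later
}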
This is the upload method in my app's ImageService:
public Gallery uploadPictureWithInfo(Uri uri, Map<String,Object> imgParams, Context context) {
if (uri.getPath() != null) {
File resourceFile = new File(getPathFromUri(uri,context));
//if (resourceFile.exists()) {
Gallery saved = null;
Map<String,Object> params = new HashMap<>();
params.put(PARAM_FILE, new FileSystemResource(resourceFile));
if (imgParams.get(PARAM_GALLERY_ID) != null || (long) imgParams.get(PARAM_GALLERY_ID) > (long) 0) {
params.put(PARAM_GALLERY_ID, imgParams.get(PARAM_GALLERY_ID));
if (imgParams.get(PARAM_DESCRIPTION) != null) {
params.put(PARAM_DESCRIPTION, imgParams.get(PARAM_DESCRIPTION));
}
if (imgParams.get(PARAM_PHOTOGRAPH) != null) {
params.put(PARAM_PHOTOGRAPH, imgParams.get(PARAM_PHOTOGRAPH));
}
if (imgParams.get(PARAM_USER_ID) != null && (long) imgParams.get(PARAM_USER_ID) > 0) {
params.put(PARAM_USER_ID, imgParams.get(PARAM_USER_ID));
}
HttpEntity requestEntity = new HttpEntity<>(params, AquaDbConfig.getImageHeaders());
ResponseEntity<Gallery> responseEntity =
restTemplate.exchange(AquaDbConfig.getApiUrl() + "/images/uploadImgWithInfo", HttpMethod.POST, requestEntity, Gallery.class);
if (responseEntity.hasBody()) {
saved = responseEntity.getBody();
}
return saved;
}
//}
}
return null;
}
public static String getPathFromUri(Uri uri, Context context) {
String[] filePath = { MediaStore.Images.Media.DATA };
Cursor c = context.getContentResolver().query(uri,filePath, null, null, null);
c.moveToFirst();
int columnIndex = c.getColumnIndex(filePath[0]);
String picturePath = c.getString(columnIndex);
c.close();
return picturePath;
}
I commented out the resourceFile.exists() check to get past that test, since otherwise it won't generate a stack trace.
So my question is: HOW do I get to read the image file when I POST it to the server? I read a little about the FileProvider class, but it seems to be meant for sending files through Intents to new Activities or other apps. It doesn't seem intended for my case, because I never leave my Activity except to pick the image from the gallery. The different steps of creating this ReportedBreeding object are handled by inflated ViewStubs, not new activities. Also, the Uri I use doesn't refer to any directory created by my app, but rather to the user's image gallery (external storage).
I also tried declaring my ImageService as a Service in the Android manifest, even though I'm not sure it's the same kind of service. I then added this permission to it, but it made no difference:
<service
android:name=".service.MyImageFactory"
android:permission="android.permission.READ_EXTERNAL_STORAGE">
</service>
If you know how to get the permission all the way to this RestTemplate POST method (which no one else seems to need in the examples I have reviewed), or how I can work around this problem, please share! I'm starting to get a little frustrated and stuck. The question to me is: why does Android require yet another permission check, and how do I satisfy it or work around it in my uploadPictureWithInfo(..) method?
Try requesting the WRITE_EXTERNAL_STORAGE permission before calling getPhotoFromPhone().
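Based on the onClick() code in the question, that would look something like this (a sketch, not tested):
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
    if (checkSelfPermission(Manifest.permission.READ_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED
            || checkSelfPermission(Manifest.permission.WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED) {
        // request both storage permissions before opening the gallery picker
        requestPermissions(new String[] {
                Manifest.permission.READ_EXTERNAL_STORAGE,
                Manifest.permission.WRITE_EXTERNAL_STORAGE}, 1);
    } else {
        getPhotoFromPhone(); // only pick the image once both permissions are granted
    }
}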
On Android 10 this may be a scoped-storage permission issue; there are two ways to handle it for now. The first is to add android:requestLegacyExternalStorage="true" to the application tag in the manifest.
The other one is to use openFileDescriptor:
val parcelFileDescriptor = context.contentResolver.openFileDescriptor(fileUri, "r", null)
val inputStream = FileInputStream(parcelFileDescriptor.fileDescriptor)
To name the copied file, you can read the display name from the ContentResolver:
fun ContentResolver.getFileName(fileUri: Uri): String {
var name = ""
val returnCursor = this.query(fileUri, null, null, null, null)
if (returnCursor != null) {
val nameIndex = returnCursor.getColumnIndex(OpenableColumns.DISPLAY_NAME)
returnCursor.moveToFirst()
name = returnCursor.getString(nameIndex)
returnCursor.close()
}
return name
}
Putting it together, copy the content into a file in the app's cache directory and work with that file:
val parcelFileDescriptor = context.contentResolver.openFileDescriptor(fileUri, "r", null)
parcelFileDescriptor?.let {
    val inputStream = FileInputStream(parcelFileDescriptor.fileDescriptor)
    val file = File(context.cacheDir, context.contentResolver.getFileName(fileUri))
    val outputStream = FileOutputStream(file)
    IOUtils.copy(inputStream, outputStream)
}
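For a Java project like the one in the question, the same idea could look roughly like this (a sketch under the assumption that you keep the question's uploadPictureWithInfo(); the point is to copy the picked content into the app's own cache instead of resolving a raw /storage path):
// Sketch: copy the picked Uri into app-private cache, then upload that File.
private File copyUriToCache(Uri uri, Context context) throws IOException {
    ContentResolver resolver = context.getContentResolver();
    String name = "upload.jpg"; // fallback if DISPLAY_NAME cannot be read
    Cursor c = resolver.query(uri, null, null, null, null);
    if (c != null) {
        if (c.moveToFirst()) {
            int idx = c.getColumnIndex(OpenableColumns.DISPLAY_NAME);
            if (idx >= 0) {
                name = c.getString(idx);
            }
        }
        c.close();
    }
    File out = new File(context.getCacheDir(), name);
    InputStream in = resolver.openInputStream(uri);
    if (in == null) {
        throw new IOException("Could not open input stream for " + uri);
    }
    try (InputStream input = in; OutputStream os = new FileOutputStream(out)) {
        byte[] buf = new byte[8192];
        int read;
        while ((read = input.read(buf)) != -1) {
            os.write(buf, 0, read);
        }
    }
    return out;
}
The returned File can then be wrapped in the FileSystemResource inside uploadPictureWithInfo() instead of the path obtained from MediaStore.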
I have a small piece of functionality: switching the torch on and keeping it on until the user switches it off from my app or the app exits. I am using:
params = camera.getParameters();
if (params.getFlashMode().equals(Parameters.FLASH_MODE_TORCH)) {
isFlashOn = true;
return;
}
params.setFlashMode(Parameters.FLASH_MODE_TORCH);
camera.setParameters(params);
camera.startPreview();
And to switch off :
if (params.getFlashMode().equals(Parameters.FLASH_MODE_OFF)) {
isFlashOn = false;
return;
}
params.setFlashMode(Parameters.FLASH_MODE_OFF);
camera.setParameters(params);
camera.stopPreview();
But I notice that this is not very robust. It works fine the first time, but after that (especially on my API level 22 phone) it might not work. I was thinking of switching to android.hardware.camera2, as suggested here.
I plan to use if (android.os.Build.VERSION.SDK_INT > 20) and a strategy (a base interface implemented by two classes, one using the old API and one the new camera2 API).
Is this safe on all devices? I saw that we can do it for Android classes, but is it okay for our own classes too? Or are there devices which scan all our code and reject it if it refers to an API they do not know about?
I do not want to make two APKs, as it's a small piece of functionality.
I make sure the flash is available like this. It has not been tested on all devices, but in a Genymotion emulator the app did not crash.
public boolean torchInit() {
try {
PackageManager pm = app.getPackageManager();
// First check if device supports flashlight
boolean hasFlash = pm.hasSystemFeature(PackageManager.FEATURE_CAMERA_FLASH);
if (!hasFlash) {
Toast.makeText(app, "Flash not found or can\'t get hold of it. No torch", Toast.LENGTH_LONG).show();
return false;
}
camera = Camera.open();
Camera.Parameters params = camera.getParameters();
Log.i(LogId, "camera params flash: " + params.getFlashMode());
return true;
} catch (Throwable e) {
Log.e(LogId, "cameraFlashSetup " + e, e);
Toast.makeText(app, "Flash error, no torch. Description : " + e, Toast.LENGTH_LONG).show();
camera = null;
}
return false;
}
The flash interface used to switch between the two classes is:
public abstract class TorchInterface {
protected AppCompatActivity app;
public void init(AppCompatActivity ap){
app = ap;
}
public abstract boolean torchInit();
public boolean torchReInit(){
return torchInit();//if re init is not different than init
}
public abstract boolean torchOn();
public abstract boolean torchOff();
}
Update: the new code worked, but only if I used:
mBuilder = camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
Instead of:
mBuilder = camera.createCaptureRequest(CameraDevice.TEMPLATE_MANUAL);
But then the flash switches on as soon as the app initializes. I was going to chuck it, then realised that in my Camera2Impl I can do:
public boolean torchInit() {
//do nothing on usual init that app calls on create
return true;
}
And instead do the init when the torch is switched on (flash on):
public boolean torchOn() {
//if not initialized yet, try the first 3 times
if(mBuilder == null || mSession == null){
if(firstFewTimesTorchOn > 0){
firstFewTimesTorchOn--;
torchInit2();
try{
Thread.sleep(150);
}catch(Exception e){}
if(mBuilder == null || mSession == null) {
return false;
}
}
}
try {
mBuilder.set(CaptureRequest.FLASH_MODE, CameraMetadata.FLASH_MODE_TORCH);//and etc
Android devices do not "scan" code; the compiler does. Therefore, I don't see any issue with your idea. On the contrary, using the Strategy pattern is way better than if-else checks all over the code.
Something along these lines should work:
if (Build.VERSION.SDK_INT > Build.VERSION_CODES.LOLLIPOP) {
mFlashlightStrategy = new PostLollipopStrategy();
} else {
mFlashlightStrategy = new PreLollipopStrategy();
}
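A PostLollipopStrategy could, for instance, be built on CameraManager.setTorchMode() instead of a full camera2 capture session. Note that setTorchMode() needs API 23, so the check above would become >= Build.VERSION_CODES.M for this particular sketch; class and field names below are illustrative, not taken from your code:
// Rough sketch of a torch strategy based on CameraManager.setTorchMode() (API 23+).
public class PostLollipopStrategy extends TorchInterface {

    private CameraManager cameraManager;
    private String torchCameraId;

    @Override
    public boolean torchInit() {
        // assumes init(activity) was called first so that 'app' is set
        try {
            cameraManager = (CameraManager) app.getSystemService(Context.CAMERA_SERVICE);
            for (String id : cameraManager.getCameraIdList()) {
                CameraCharacteristics cc = cameraManager.getCameraCharacteristics(id);
                if (Boolean.TRUE.equals(cc.get(CameraCharacteristics.FLASH_INFO_AVAILABLE))) {
                    torchCameraId = id; // remember the first camera with a flash unit
                    return true;
                }
            }
        } catch (CameraAccessException e) {
            Log.e("Torch", "torchInit failed", e);
        }
        return false;
    }

    @Override
    public boolean torchOn() {
        return setTorch(true);
    }

    @Override
    public boolean torchOff() {
        return setTorch(false);
    }

    private boolean setTorch(boolean on) {
        if (torchCameraId == null) {
            return false;
        }
        try {
            cameraManager.setTorchMode(torchCameraId, on);
            return true;
        } catch (CameraAccessException e) {
            Log.e("Torch", "setTorchMode failed", e);
            return false;
        }
    }
}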
Is this safe on all devices?
Why don't you add a check for whether a flash is available or not:
context.getPackageManager().hasSystemFeature(PackageManager.FEATURE_CAMERA_FLASH);
which will return true if a flash is available and false if not. You can put the rest of your code in the true branch.
I am trying to check whether my mobile device is dual SIM, whether SIM1 is ready and whether SIM2 is ready. I am done with this using Java reflection. Now I want to find out whether SIM1 is roaming and whether SIM2 is roaming, and, if it is dual SIM, which SIM is set as the default. Is this possible with the help of Java reflection?
You can do something like this:
public int getDefaultSimmm(Context context) {
Object tm = context.getSystemService(Context.TELEPHONY_SERVICE);
Method method_getDefaultSim;
int defaultSimm = -1;
try {
method_getDefaultSim = tm.getClass().getDeclaredMethod("getDefaultSim");
method_getDefaultSim.setAccessible(true);
defaultSimm = (Integer) method_getDefaultSim.invoke(tm);
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
Method method_getSmsDefaultSim;
int smsDefaultSim = -1;
try {
method_getSmsDefaultSim = tm.getClass().getDeclaredMethod("getSmsDefaultSim");
smsDefaultSim = (Integer) method_getSmsDefaultSim.invoke(tm);
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
return smsDefaultSim;
}
Starting from API 22 (Lollipop MR1), Android has officially added the SubscriptionManager class, which gives the developer all the information related to SIM cards and their services.
Documentation for SubscriptionManager
However, support for retrieving the defaults for calls, SMS and mobile data was only added in API 24.
If you set your minimum SDK version to 24, you can use the getDefaultSmsSubscriptionId() method to get the SMS default set by the user:
SubscriptionManager manager = (SubscriptionManager) context.getSystemService(Context.TELEPHONY_SUBSCRIPTION_SERVICE);
int defaultSmsId = SubscriptionManager.getDefaultSmsSubscriptionId(); // static method, API 24+
SubscriptionInfo info = manager.getActiveSubscriptionInfo(defaultSmsId);
Note: the above call requires the READ_PHONE_STATE permission. Make sure you add it to your manifest file.
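On API 23+ the permission also has to be granted at run time before the call will return real data. A minimal sketch (the activity reference and request code are assumptions):
// Make sure READ_PHONE_STATE is granted before querying SubscriptionManager.
if (ContextCompat.checkSelfPermission(context, Manifest.permission.READ_PHONE_STATE)
        != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(activity,
            new String[]{Manifest.permission.READ_PHONE_STATE}, 42); // request code assumed
} else {
    SubscriptionManager manager =
            (SubscriptionManager) context.getSystemService(Context.TELEPHONY_SUBSCRIPTION_SERVICE);
    int defaultSmsId = SubscriptionManager.getDefaultSmsSubscriptionId();
    SubscriptionInfo info = manager.getActiveSubscriptionInfo(defaultSmsId);
}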
A very late answer, but I recently developed an application with the above requirement.
Below is the latest way to get it done.
/**
* @return true if any SIM is selected as the default for calls; false if no default SIM is
* selected or the permission for reading the SIM state is not granted
*/
boolean isDefaultSimSetForCall() {
if (ContextCompat.checkSelfPermission(context, Manifest.permission.READ_PHONE_STATE) != PackageManager.PERMISSION_GRANTED) {
Log.d(Utils.getTag(), "Read Phone state permission Disabled");
genericCallbacks.onPermissionAccessError(Constants.PermissionErrorCodes.READ_PHONE_STATE_ACCESS);
return false;
} else {
PhoneAccountHandle defaultPhoneAccount = telecomManager.getDefaultOutgoingPhoneAccount(Uri.fromParts("tel", "text", null).getScheme());
if (defaultPhoneAccount != null) {
Log.d(Utils.getTag(), "DefaultOutgoingPhoneAccount: " + defaultPhoneAccount.getId());
return true;
}
}
return false;
}
From the received PhoneAccountHandle, we can get the necessary fields
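For example (a sketch; getPhoneAccount() exists from API 23 and may need additional phone permissions on newer API levels):
// Resolve the PhoneAccount behind the handle to read its details.
PhoneAccount account = telecomManager.getPhoneAccount(defaultPhoneAccount);
if (account != null) {
    CharSequence label = account.getLabel();   // e.g. the SIM / carrier label
    Uri address = account.getAddress();        // e.g. the line's tel: URI
    Log.d(Utils.getTag(), "Default call account: " + label + " / " + address);
}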
I have made an app that records sound and analyses it for frequency. This process is repeated a couple of times every second and thus uses threading.
This does work most of the time, but for some reason in the logcat I get these messages repeated after the first analysis.
Rarely (but sometimes) when I test, the app records no sound. So I'm thinking it has something to do with this error.
01-23 13:52:03.414: E/AudioRecord(3647): Could not get audio input for record source 1
01-23 13:52:03.424: E/AudioRecord-JNI(3647): Error creating AudioRecord instance: initialization check failed.
01-23 13:52:03.424: E/AudioRecord-Java(3647): [ android.media.AudioRecord ] Error code -20 when initializing native AudioRecord object.
The code is below; does anyone have any idea where I'm going wrong? Am I not killing the AudioRecord object correctly? The code has been modified for ease of reading:
public class recorderThread extends AsyncTask<Sprite, Void, Integer> {
short[] audioData;
int bufferSize;
@Override
protected Integer doInBackground(Sprite... ball) {
boolean recorded = false;
int sampleRate = 8192;
AudioRecord recorder = instatiateRecorder(sampleRate);
while (!recorded) { //loop until recording is running
if (recorder.getState()==android.media.AudioRecord.STATE_INITIALIZED) // check to see if the recorder has initialized yet.
{
if (recorder.getRecordingState()==android.media.AudioRecord.RECORDSTATE_STOPPED)
recorder.startRecording();
//check to see if the Recorder has stopped or is not recording, and make it record.
else {
//read the PCM audio data into the audioData array
//get frequency
//checks if correct frequency, assigns number
int correctNo = correctNumber(frequency, note);
checkIfMultipleNotes(correctNo, max_index, frequency, sampleRate, magnitude, note);
if (audioDataIsNotEmpty())
recorded = true;
return correctNo;
}
}
else
{
recorded = false;
recorder = instatiateRecorder(sampleRate);
}
}
if (recorder.getState()==android.media.AudioRecord.RECORDSTATE_RECORDING)
{
killRecorder(recorder);
}
return 1;
}
private void killRecorder(AudioRecord recorder) {
recorder.stop(); //stop the recorder before ending the thread
recorder.release(); //release the recorders resources
recorder=null; //set the recorder to be garbage collected
}
@Override
protected void onPostExecute(Integer result) {
ballComp.hitCorrectNote = result;
}
private AudioRecord instatiateRecorder(int sampleRate) {
bufferSize= AudioRecord.getMinBufferSize(sampleRate,AudioFormat.CHANNEL_CONFIGURATION_MONO,
AudioFormat.ENCODING_PCM_16BIT)*2;
//get the buffer size to use with this audio record
AudioRecord recorder = new AudioRecord (AudioSource.MIC,sampleRate,AudioFormat.CHANNEL_CONFIGURATION_MONO,
AudioFormat.ENCODING_PCM_16BIT,bufferSize);
//instantiate the AudioRecorder
audioData = new short [bufferSize];
//short array that pcm data is put into.
return recorder;
}
}
As your log says "Could not get audio input for record source 1", it means the Android device could not get hold of any hardware for recording the sound.
So if you are testing the app on an emulator, make sure you have successfully attached a microphone for recording, and if you are debugging or running it on a real device, be sure that the mic is free and able to record the sound.
Hope it will help you.
Or
If the above does not solve your issue, then use the code below to record the sound, as it works perfectly for me.
Code:
record.setOnClickListener(new View.OnClickListener()
{
boolean mStartRecording = true;
public void onClick(View v)
{
if (mStartRecording==true)
{
//startRecording();
haveStartRecord=true;
String recordWord = wordValue.getText().toString();
String file = Environment.getExternalStorageDirectory().getAbsolutePath();
file = file+"/"+recordWord+".3gp";
System.out.println("Recording Start");
//record.setText("Stop recording");
record.setBackgroundDrawable(getResources().getDrawable( R.drawable.rec_on));
mRecorder = new MediaRecorder();
mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
mRecorder.setOutputFile(file);
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
// mRecorder.setAudioChannels(1);
// mRecorder.setAudioSamplingRate(8000);
try
{
mRecorder.prepare();
}
catch (IOException e)
{
Log.e(LOG_TAG, "prepare() failed");
}
mRecorder.start();
}
else
{
//stopRecording();
System.out.println("Recording Stop");
record.setBackgroundDrawable(getResources().getDrawable( R.drawable.rec_off));
mRecorder.stop();
mRecorder.release();
mRecorder = null;
haveFinishRecord=true;
}
mStartRecording = !mStartRecording;
}
});
Hope this answer helps you.
Enjoy. :)
What stops you having two RecorderThreads running at the same time? Show the code that instantiates one of these objects, executes it, and of course waits for any previous RecorderThread to finish first.
If the answer is that nothing stops two RecorderThreads running at the same time, then your use of 'static' will obviously be a problem... a second thread will cause the first AudioRecord to be leaked while open. IMHO it's a good idea to try to avoid using static data.
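If nothing does, one way to guard against it on the caller's side is to only start a new task once the previous one has finished, along these lines (a sketch using the question's recorderThread class; the field and method names are assumed):
private recorderThread currentTask; // only ever one recording task alive

void analyseNextSample(Sprite ball) {
    if (currentTask != null && currentTask.getStatus() != AsyncTask.Status.FINISHED) {
        return; // previous recording still running; skip this round
    }
    currentTask = new recorderThread(); // AsyncTask instances are single-use
    currentTask.execute(ball);
}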
I had the same problem, and I solved it by adding
<uses-permission android:name="android.permission.RECORD_AUDIO"></uses-permission>
to the AndroidManifest.xml.