Errors when recording sound in Android - java

I have made an app that records sound and analyses it for frequency. This process is repeated a couple of times every second and therefore uses threading.
It works most of the time, but for some reason I get the messages below repeated in logcat after the first analysis.
Rarely (but sometimes) when I test, the app records no sound, so I suspect it has something to do with this error.
01-23 13:52:03.414: E/AudioRecord(3647): Could not get audio input for record source 1
01-23 13:52:03.424: E/AudioRecord-JNI(3647): Error creating AudioRecord instance: initialization check failed.
01-23 13:52:03.424: E/AudioRecord-Java(3647): [ android.media.AudioRecord ] Error code -20 when initializing native AudioRecord object.
The code is below; does anyone have any idea where I'm going wrong? Am I not killing the AudioRecord object correctly? The code has been modified for ease of reading:
public class recorderThread extends AsyncTask<Sprite, Void, Integer> {

    short[] audioData;
    int bufferSize;

    @Override
    protected Integer doInBackground(Sprite... ball) {
        boolean recorded = false;
        int sampleRate = 8192;
        AudioRecord recorder = instantiateRecorder(sampleRate);

        while (!recorded) { // loop until recording has run
            if (recorder.getState() == android.media.AudioRecord.STATE_INITIALIZED) {
                // the recorder has initialized
                if (recorder.getRecordingState() == android.media.AudioRecord.RECORDSTATE_STOPPED) {
                    // the recorder has stopped or is not recording, so make it record
                    recorder.startRecording();
                } else {
                    // read the PCM audio data into the audioData array
                    // get frequency
                    // check if it is the correct frequency, assign a number
                    int correctNo = correctNumber(frequency, note);
                    checkIfMultipleNotes(correctNo, max_index, frequency, sampleRate, magnitude, note);
                    if (audioDataIsNotEmpty())
                        recorded = true;
                    return correctNo;
                }
            } else {
                recorded = false;
                recorder = instantiateRecorder(sampleRate);
            }
        }

        if (recorder.getRecordingState() == android.media.AudioRecord.RECORDSTATE_RECORDING) {
            killRecorder(recorder);
        }
        return 1;
    }

    private void killRecorder(AudioRecord recorder) {
        recorder.stop();    // stop the recorder before ending the thread
        recorder.release(); // release the recorder's resources
        recorder = null;    // let the recorder be garbage collected
    }

    @Override
    protected void onPostExecute(Integer result) {
        ballComp.hitCorrectNote = result;
    }

    private AudioRecord instantiateRecorder(int sampleRate) {
        // get the buffer size to use with this AudioRecord
        bufferSize = AudioRecord.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT) * 2;
        // instantiate the AudioRecord
        AudioRecord recorder = new AudioRecord(AudioSource.MIC, sampleRate,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT, bufferSize);
        // short array that the PCM data is read into
        audioData = new short[bufferSize];
        return recorder;
    }
}

As your log says "Could not get audio input for record source 1", the Android device could not find any hardware for recording the sound.
So if you are testing the app on the emulator, make sure a microphone is actually attached while recording; if you are running or debugging it on a device, make sure the mic is available to record the sound.
Hope it will help you.
Or, if the above does not solve your issue, use the code below to record the sound, as it works perfectly for me.
Code:
record.setOnClickListener(new View.OnClickListener() {
    boolean mStartRecording = true;

    public void onClick(View v) {
        if (mStartRecording) {
            //startRecording();
            haveStartRecord = true;
            String recordWord = wordValue.getText().toString();
            String file = Environment.getExternalStorageDirectory().getAbsolutePath();
            file = file + "/" + recordWord + ".3gp";
            System.out.println("Recording Start");
            //record.setText("Stop recording");
            record.setBackgroundDrawable(getResources().getDrawable(R.drawable.rec_on));

            mRecorder = new MediaRecorder();
            mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
            mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
            mRecorder.setOutputFile(file);
            mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
            // mRecorder.setAudioChannels(1);
            // mRecorder.setAudioSamplingRate(8000);
            try {
                mRecorder.prepare();
            } catch (IOException e) {
                Log.e(LOG_TAG, "prepare() failed");
            }
            mRecorder.start();
        } else {
            //stopRecording();
            System.out.println("Recording Stop");
            record.setBackgroundDrawable(getResources().getDrawable(R.drawable.rec_off));
            mRecorder.stop();
            mRecorder.release();
            mRecorder = null;
            haveFinishRecord = true;
        }
        mStartRecording = !mStartRecording;
    }
});
Hope this answer helps you.
Enjoy. :)

What stops you from having two RecorderThreads running at the same time? Show the code that instantiates one of these objects, executes it, and of course waits for any previous RecorderThread to finish first.
If the answer is that nothing stops two RecorderThreads from running at the same time, then your use of 'static' will obviously be a problem: a second thread will cause the first AudioRecord to be leaked while it is still open. IMHO it's a good idea to avoid static data.
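If nothing does, something along these lines in whatever code launches the analysis would guarantee a single task at a time. This is just a sketch; recorderTask and startAnalysis are assumed names, not code from the question:
private recorderThread recorderTask;

private void startAnalysis(Sprite ball) {
    // only start a new recorderThread once the previous one has finished
    if (recorderTask != null && recorderTask.getStatus() != AsyncTask.Status.FINISHED) {
        return; // a previous recording/analysis is still running
    }
    recorderTask = new recorderThread();
    recorderTask.execute(ball);
}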

I had the same problem. And I solved it by adding
"<uses-permission android:name="android.permission.RECORD_AUDIO"></uses-permission>"
to the "AndroidManifest.xml"

Related

Android webRTC video call inside a Background Service

Forgive me if this question was already asked; I couldn't find an answer for my case.
I have an Android app with a voice & video call feature, built with WebRTC.
I was able to make both voice and video calls work perfectly inside an Activity, but now I want to keep the call running while the user exits the CallActivity and goes back to the ChatActivity (to send a file/link/photo, for example).
I managed to make the voice call run perfectly inside a background Service, but the video call won't work as expected:
the remote video is not displayed even though the audio from the video track is playing.
Here is my background Service code:
@Override
public void onAddStream(MediaStream mediaStream) {
    if (mediaStream.videoTracks.size() > Constants.ONE || mediaStream.audioTracks.size() > Constants.ONE) {
        return;
    }
    // check for a video track; if present, this is a video call
    if (!isAudioCall && mediaStream.videoTracks.size() > Constants.ZERO) {
        remoteVideoTrack = mediaStream.videoTracks.get(Constants.ZERO);
        CallActivityNew.remoteVideoTrack = remoteVideoTrack;
        try {
            localAudioTrack.setEnabled(true);
            // now ask the UI to display the video track
            sendOrderToActivity(Constants.START_REMOTE_VIDEO, null);
        } catch (Exception ignored) {}
    } else if (mediaStream.audioTracks.size() > Constants.ZERO) {
        // this is a voice call, only audio tracks are available
        remoteAudioTrack = mediaStream.audioTracks.get(Constants.ZERO);
        try {
            localAudioTrack.setEnabled(true);
            remoteAudioTrack.setEnabled(true);
        } catch (Exception ignored) {}
    }
}
And below is my CallActivity code:
case Constants.START_REMOTE_VIDEO: {
    if (remoteVideoView == null) {
        remoteVideoView = findViewById(R.id.remote_gl_surface_view);
    }
    remoteVideoView.init(eglBaseContext, null);
    remoteVideoView.setEnableHardwareScaler(true);
    remoteVideoView.setMirror(true);
    remoteVideoView.setScalingType(RendererCommon.ScalingType.SCALE_ASPECT_FIT);
    remoteVideoView.setZOrderMediaOverlay(true);
    // apply the video track to the SurfaceView in order to display it
    remoteVideoTrack.addSink(remoteVideoView);
    // enable the remote video track after a short delay
    Handler handler = new Handler();
    handler.postDelayed(new Runnable() {
        @Override
        public void run() {
            remoteVideoTrack.setEnabled(true);
        }
    }, Constants.TIME_THREE_HUNDRED_MILLIS);
    setSpeakerphoneOn(false);
    break;
}
I am sending orders from the Service to the Activity; the Constants.START_REMOTE_VIDEO case runs after receiving the order from the Service.
I don't see where the problem is: why am I only hearing sound while the remote video won't start displaying?
Thank you in advance for helping.
After testing for long hours, I found that my code works just fine; I had simply forgotten to change the view visibility from GONE to VISIBLE.
Yeah, that was the solution, I swear xD
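In other words, something like this in the START_REMOTE_VIDEO case (a sketch; exactly where you toggle visibility depends on your layout):
// the view was GONE, so the track had nowhere to render
remoteVideoView.setVisibility(View.VISIBLE);
remoteVideoTrack.addSink(remoteVideoView);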

How to get streaming bitrate with MediaPlayer?

I'm developing a video streaming app for Android with MediaPlayer. The problem is that I need to show the current bitrate, but I haven't found any valid suggestions on how to get it.
Here is how I'm setting the video URL to play:
mediaPlayer = new MediaPlayer();
try {
    mediaPlayer.setDataSource(VIDEO_PATH);
    mediaPlayer.prepare();
    mediaPlayer.start();
} catch (IOException e) {
    e.printStackTrace();
}
I don't know if the only way to get this working is to use ExoPlayer (which I've read may be possible).
Any suggestions?
Thanks!
Apparently you cannot do this with MediaPlayer, but you can use MediaMetadataRetriever, which has been available since API level 10, i.e., quite a while ago.
int getBitRate(String url) {
    final MediaMetadataRetriever mmr = new MediaMetadataRetriever();
    try {
        mmr.setDataSource(url, Collections.EMPTY_MAP);
        return Integer.parseInt(mmr.extractMetadata(MediaMetadataRetriever.METADATA_KEY_BITRATE));
    } catch (NumberFormatException e) {
        return 0;
    } finally {
        mmr.release();
    }
}
The disadvantage is that this makes an extra HTTP request to fetch the metadata (only an extra round trip if you are streaming from a URI; if you are reading from a file descriptor it could be more costly). Hopefully no big deal.
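A possible way to call it (just a sketch: the fetch may hit the network, so keep it off the main thread; bitrateView is an assumed TextView, and VIDEO_PATH is the same URL the question passes to MediaPlayer):
new Thread(new Runnable() {
    @Override
    public void run() {
        final int bitRate = getBitRate(VIDEO_PATH);
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                bitrateView.setText(bitRate + " bps");
            }
        });
    }
}).start();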

Android AudioRecord won't initialize

I'm trying to implement an app that listens to microphone input (specifically, breathing), and presents data based on it. I'm using the Android class AudioRecord, and when trying to instantiate AudioRecord I get three errors.
AudioRecord: AudioFlinger could not create record track, status: -1
AudioRecord-JNI: Error creating AudioRecord instance: initialization check failed with status -1.
android.media.AudioRecord: Error code -20 when initializing native AudioRecord object.
I found this excellent thread: AudioRecord object not initializing
I have borrowed the code from the accepted answer, which tries all sample rates, audio formats and channel configurations, but it didn't help; I get the above errors for all settings. I have also added a call to AudioRecord.release() in several places according to one of the answers in the thread, but it made no difference.
This is my code:
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.util.Log;

public class SoundMeter {
    private AudioRecord ar = null;
    private int minSize;

    private static int[] mSampleRates = new int[] { 8000, 11025, 22050, 32000, 44100 };

    public boolean start() {
        ar = findAudioRecord();
        if (ar != null) {
            ar.startRecording();
            return true;
        } else {
            Log.e("SoundMeter", "ERROR, could not create audio recorder");
            return false;
        }
    }

    public void stop() {
        if (ar != null) {
            ar.stop();
            ar.release();
        }
    }

    public double getAmplitude() {
        short[] buffer = new short[minSize];
        ar.read(buffer, 0, minSize);
        int max = 0;
        for (short s : buffer) {
            if (Math.abs(s) > max) {
                max = Math.abs(s);
            }
        }
        return max;
    }

    public AudioRecord findAudioRecord() {
        for (int rate : mSampleRates) {
            for (short audioFormat : new short[] { AudioFormat.ENCODING_PCM_8BIT, AudioFormat.ENCODING_PCM_16BIT, AudioFormat.ENCODING_PCM_FLOAT }) {
                for (short channelConfig : new short[] { AudioFormat.CHANNEL_IN_MONO, AudioFormat.CHANNEL_IN_STEREO }) {
                    try {
                        Log.d("SoundMeter", "Attempting rate " + rate + "Hz, bits: " + audioFormat + ", channel: " + channelConfig);
                        int bufferSize = AudioRecord.getMinBufferSize(rate, channelConfig, audioFormat);
                        if (bufferSize != AudioRecord.ERROR_BAD_VALUE) {
                            // check if we can instantiate and have success
                            Log.d("SoundMeter", "Found a supported bufferSize, attempting to instantiate");
                            AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.DEFAULT, rate, channelConfig, audioFormat, bufferSize);
                            if (recorder.getState() == AudioRecord.STATE_INITIALIZED) {
                                minSize = bufferSize;
                                return recorder;
                            } else {
                                recorder.release();
                            }
                        }
                    } catch (Exception e) {
                        Log.e("SoundMeter", rate + " Exception, keep trying.", e);
                    }
                }
            }
        }
        return null;
    }
}
I have also added the
<uses-permission android:name="android.permission.RECORD_AUDIO"/>
tag to my manifest file, as a child of the manifest tag and a sibling of the application tag, according to one of the other answers in the thread mentioned above. I have rebuilt the project after adding this tag.
These are the solutions I found when googling the problem, but unfortunately they don't seem to work for me.
I am debugging on my Nexus 5 phone (not an emulator). These errors appear upon calling the constructor of AudioRecord. I have rebooted my phone several times to try to release the microphone, to no avail. The project is based on Android 4.4, and my phone is currently running Android 6.0.1.
Would highly appreciate some tips on what else I can try, what I could have missed. Thank you!
I found the answer myself. It had to do with permissions.
The problem was that I am running API version 23 (Android 6.0.1) on my phone, which no longer uses only the manifest file to handle permissions. From version 23, permissions are granted at run time instead. I added a method that requests the permission at run time, and once I had allowed it on my phone, it worked.
private void requestRecordAudioPermission() {
    // check the API version; do nothing if API version < 23!
    int currentapiVersion = android.os.Build.VERSION.SDK_INT;
    if (currentapiVersion > android.os.Build.VERSION_CODES.LOLLIPOP) {
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
            // Should we show an explanation?
            if (ActivityCompat.shouldShowRequestPermissionRationale(this, Manifest.permission.RECORD_AUDIO)) {
                // Show an explanation to the user *asynchronously* -- don't block
                // this thread waiting for the user's response! After the user
                // sees the explanation, try again to request the permission.
            } else {
                // No explanation needed, we can request the permission.
                ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.RECORD_AUDIO}, 1);
            }
        }
    }
}

@Override
public void onRequestPermissionsResult(int requestCode, String permissions[], int[] grantResults) {
    switch (requestCode) {
        case 1: {
            // If the request is cancelled, the result arrays are empty.
            if (grantResults.length > 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
                // permission was granted, yay! Do the
                // audio-related task you need to do.
                Log.d("Activity", "Granted!");
            } else {
                // permission denied, boo! Disable the
                // functionality that depends on this permission.
                Log.d("Activity", "Denied!");
                finish();
            }
            return;
        }
        // other 'case' lines to check for other
        // permissions this app might request
    }
}
I then call requestRecordAudioPermission() from the onCreate() method in my main activity before creating the AudioRecord.
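Roughly like this (a sketch; the layout name is just a placeholder):
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main); // placeholder layout

    requestRecordAudioPermission(); // ask before touching the microphone
    // Create the AudioRecord only after the permission has actually been granted,
    // e.g. from onRequestPermissionsResult() or after checkSelfPermission() succeeds.
}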

Get the Microphone sound level (Decibel level) in Android

I'm quite new to Android and I have searched about this for quite a while. I would like to build an application that is something like a decibel meter: in real time it shows the sound level. If there is a lot of noise in the room, something will indicate that; if it's quiet, something will indicate that too.
I don't have any idea at all how to do this. Could anyone explain the basics of a microphone sound-level application? If possible, maybe provide some code?
Thanks!
You can use MediaRecorder.getMaxAmplitude().
if (ActivityCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.RECORD_AUDIO},
            RECORD_AUDIO);
}
Get the noise level using the MediaRecorder,
mRecorder = new MediaRecorder();
mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mRecorder.setOutputFile("/dev/null");
try {
    mRecorder.prepare();
} catch (IllegalStateException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
mRecorder.start();
Start the MediaRecorder,
private Runnable mSleepTask = new Runnable() {
    public void run() {
        //Log.i("Noise", "runnable mSleepTask");
        mSensor.start();
        if (!mWakeLock.isHeld()) {
            mWakeLock.acquire();
        }
        // noise monitoring starts here
        // the Runnable (mPollTask) will execute after POLL_INTERVAL
        mHandler.postDelayed(mPollTask, POLL_INTERVAL);
    }
};
Create Runnable Thread to check the noise level frequently,
private Runnable mPollTask = new Runnable() {
    public void run() {
        double amp = mSensor.getAmplitude();
        //Log.i("Noise", "runnable mPollTask");
        // the Runnable (mPollTask) will execute again after POLL_INTERVAL
        mHandler.postDelayed(mPollTask, POLL_INTERVAL);
    }
};
Convert the amplitude to decibels using the following formula,
return 20 * Math.log10(mRecorder.getMaxAmplitude() / 2700.0);
Monitor the sound level and alert on louder noise:
// Create a runnable thread to monitor the sound level
private Runnable mPollTask = new Runnable() {
    public void run() {
        double amp = mSensor.getAmplitude();
        //Log.i("Noise", "runnable mPollTask");
        updateDisplay("Monitoring Voice...", amp);
        if (amp > mThreshold) {
            callForHelp(amp);
            //Log.i("Noise", "==== onCreate ===");
        }
        // the Runnable (mPollTask) will execute again after POLL_INTERVAL
        mHandler.postDelayed(mPollTask, POLL_INTERVAL);
    }
};
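The snippets above call into an mSensor object that isn't shown; it is essentially a thin wrapper around the MediaRecorder set up earlier. A minimal sketch of what such a helper could look like (the class name and the 2700.0 reference value are assumptions, not a fixed API):
import android.media.MediaRecorder;
import java.io.IOException;

public class SoundSensor {
    private MediaRecorder mRecorder;

    public void start() {
        if (mRecorder != null) return; // already running
        mRecorder = new MediaRecorder();
        mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        mRecorder.setOutputFile("/dev/null"); // we only want the amplitude, not a file
        try {
            mRecorder.prepare();
            mRecorder.start();
        } catch (IOException | IllegalStateException e) {
            e.printStackTrace();
        }
    }

    public void stop() {
        if (mRecorder != null) {
            mRecorder.stop();
            mRecorder.release();
            mRecorder = null;
        }
    }

    // Raw amplitude since the last call (0..32767); the first call after start() returns 0.
    public double getAmplitude() {
        return mRecorder != null ? mRecorder.getMaxAmplitude() : 0;
    }

    // Rough dB estimate relative to the assumed reference amplitude of 2700
    // (returns -Infinity while no sample has been captured yet).
    public double getAmplitudeDb() {
        return 20 * Math.log10(getAmplitude() / 2700.0);
    }
}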
This question has been addressed generally for Java, and the required classes are available in Android.
The basic idea is to sample the data line for the microphone, and calculate the level from the returned buffer.
How to calculate the level/amplitude/db of audio signal in java?
You can also have a look at the Visualizer class, which does FFT frequency analysis; however, microphone permissions may not be consistent across devices. You may also have to connect it to the Equalizer class to access the mic.
https://developer.android.com/reference/android/media/audiofx/Visualizer.html
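If you do try the Visualizer route, the setup looks roughly like this. A sketch only: session 0 is the global output mix, it requires RECORD_AUDIO (and on many devices MODIFY_AUDIO_SETTINGS), and it captures what is being played rather than raw microphone input:
Visualizer visualizer = new Visualizer(0); // 0 = output mix
visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
visualizer.setDataCaptureListener(new Visualizer.OnDataCaptureListener() {
    @Override
    public void onWaveFormDataCapture(Visualizer v, byte[] waveform, int samplingRate) {
        // compute an RMS level from the 8-bit waveform samples here
    }

    @Override
    public void onFftDataCapture(Visualizer v, byte[] fft, int samplingRate) {
        // FFT magnitudes, useful for frequency analysis
    }
}, Visualizer.getMaxCaptureRate() / 2, true, true);
visualizer.setEnabled(true);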
There's a great app in the marketplace called Audalyzer
http://code.google.com/p/moonblink/wiki/Audalyzer
Also check this discussion:
Android: sample microphone without recording to get live amplitude/level?

How can I intercept the audio stream on an android device?

Let's suppose we have the following scenario: something is playing on an Android device (an mp3, for example, but it could be anything that uses the audio part of an Android device). From an (Android) application, I would like to intercept the audio stream to analyze it, record it, etc. From this application (let's call it "the analyzer") I don't want to start an mp3 or anything; all I want is access to Android's audio stream.
Any advice is appreciated; it could be a Java or C++ solution.
http://developer.android.com/reference/android/media/MediaRecorder.html
import android.media.MediaRecorder;
import android.os.Environment;

import java.io.File;
import java.io.IOException;

public class AudioRecorder {

    final MediaRecorder recorder = new MediaRecorder();
    final String path;

    /**
     * Creates a new audio recording at the given path (relative to the root of
     * the SD card).
     */
    public AudioRecorder(String path) {
        this.path = sanitizePath(path);
    }

    private String sanitizePath(String path) {
        if (!path.startsWith("/")) {
            path = "/" + path;
        }
        if (!path.contains(".")) {
            path += ".3gp";
        }
        return Environment.getExternalStorageDirectory().getAbsolutePath() + path;
    }

    /**
     * Starts a new recording.
     */
    public void start() throws IOException {
        String state = android.os.Environment.getExternalStorageState();
        if (!state.equals(android.os.Environment.MEDIA_MOUNTED)) {
            throw new IOException("SD Card is not mounted. It is " + state + ".");
        }
        // make sure the directory we plan to store the recording in exists
        File directory = new File(path).getParentFile();
        if (!directory.exists() && !directory.mkdirs()) {
            throw new IOException("Path to file could not be created.");
        }
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        recorder.setOutputFile(path);
        recorder.prepare();
        recorder.start();
    }

    /**
     * Stops a recording that has been previously started.
     */
    public void stop() throws IOException {
        recorder.stop();
        recorder.release();
    }
}
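Hypothetical usage of the class above (the path is just an example, and you still need the RECORD_AUDIO permission plus, on older Android versions, WRITE_EXTERNAL_STORAGE):
AudioRecorder recorder = new AudioRecorder("/recordings/demo");
try {
    recorder.start();
    // ... record for a while ...
    recorder.stop();
} catch (IOException e) {
    e.printStackTrace();
}
Note that this records from the microphone rather than intercepting another app's playback; for the latter, see the next answer.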
Consider using the AudioPlaybackCapture API, introduced in Android 10, if you want to capture the audio stream of a particular app.
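A rough sketch of what that looks like (API 29+; mediaProjection must come from a MediaProjectionManager consent flow, which is not shown, and the capturing app needs the RECORD_AUDIO permission):
AudioPlaybackCaptureConfiguration config =
        new AudioPlaybackCaptureConfiguration.Builder(mediaProjection)
                .addMatchingUsage(AudioAttributes.USAGE_MEDIA)
                .build();

AudioRecord record = new AudioRecord.Builder()
        .setAudioFormat(new AudioFormat.Builder()
                .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
                .setSampleRate(44100)
                .setChannelMask(AudioFormat.CHANNEL_IN_STEREO)
                .build())
        .setAudioPlaybackCaptureConfig(config)
        .build();
// record.startRecording() and record.read(...) then work as with any other AudioRecord,
// but the samples come from other apps' playback instead of the microphone.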
