I'm trying to show a live stream from an ACTI IP camera on an Android phone. The camera is set to stream H.264 Baseline at 320x240 resolution. An error keeps popping up after prepareAsync() is called:
W/IMediaDeathNotifier: media server died
W/AudioSystem: AudioFlinger server died!
E/MediaPlayer: Error (100,0)
E/MediaPlayer: error (100, 0)
I checked the URL format and tried removing the "?". I also tried the stream URL in VLC and it worked. I'm running this on Android Jelly Bean. The code is below:
final static String RTSP_URL = "rtsp://192.168.34.52:7070?/";
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_vo_ipphone);
mySurfaceView = (SurfaceView) findViewById(R.id.surface);
Log.i(TAG, "prepare surface holder");
_surfaceHolder = mySurfaceView.getHolder();
_surfaceHolder.addCallback(this);
_surfaceHolder.setFixedSize(320, 240);
}
@Override
public void onPrepared(MediaPlayer mediaPlayer) {
Log.i(TAG, "Prepared!");
_mediaPlayer.start();
}
@Override
public void surfaceCreated(SurfaceHolder surfaceHolder) {
Log.i(TAG, "Surface created");
_mediaPlayer = new MediaPlayer();
_mediaPlayer.setOnErrorListener(new MediaPlayer.OnErrorListener() {
public boolean onError(MediaPlayer mp, int what, int extra) {
_mediaPlayer.release();
//create another MediaPlayer, preferably on another thread
return false;
}
});
_mediaPlayer.setDisplay(_surfaceHolder);
Context context = getApplicationContext();
Map<String, String> headers = getRtspHeaders();
Uri source = Uri.parse(RTSP_URL);
try {
// Specify the IP camera's URL and auth headers.
Log.i(TAG, "Set data source");
_mediaPlayer.setDataSource(context, source, headers);
//_mediaPlayer.setDataSource(this, source);
// Begin the process of setting up a video stream.
Log.i(TAG, "set on prepared listener");
_mediaPlayer.setOnPreparedListener(this);
Log.i(TAG, "prepare async");
_mediaPlayer.prepareAsync();
}
catch (Exception e) {
e.printStackTrace();
}
}
@Override
public void surfaceChanged(SurfaceHolder surfaceHolder, int i, int i1, int i2) {
}
@Override
public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
_mediaPlayer.release();
}
It looks like all the lines are fine, but the error may come from a wrong URL; the URL should look like this:
final static String RTSP_URL = "rtsp://192.168.34.52:7070/";
Try this URL without the '?'.
Also check your device's internet connection and permissions.
Add this permission to AndroidManifest.xml if it is not already there:
<uses-permission android:name="android.permission.INTERNET" />
I hope it works.
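If you also want to rule out connectivity in code, a minimal check could look like this (just a sketch; it assumes you also declare ACCESS_NETWORK_STATE, and the helper name is only an illustration):
// Requires <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
private boolean isNetworkAvailable(Context context) {
    ConnectivityManager cm =
            (ConnectivityManager) context.getSystemService(Context.CONNECTIVITY_SERVICE);
    NetworkInfo activeNetwork = cm.getActiveNetworkInfo();
    return activeNetwork != null && activeNetwork.isConnected();
}
Call it before prepareAsync() and skip the prepare if it returns false.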
Related
Hello, I want to detect a Bluetooth iBeacon, but it does not work well. How can I solve it?
I am testing on a real phone, not an emulator.
2020-06-13 22:42:01.729 26437-26728/com.example.beacon D/BluetoothAdapter: STATE_ON
2020-06-13 22:42:01.729 26437-26728/com.example.beacon D/BluetoothLeScanner: could not find callback wrapper
2020-06-13 22:42:08.353 26437-26728/com.example.beacon D/BluetoothAdapter: STATE_ON
2020-06-13 22:42:08.353 26437-26728/com.example.beacon D/BluetoothLeScanner: could not find callback wrapper
These log lines just keep repeating and no beacons are ever detected.
public class MainActivity extends AppCompatActivity implements BeaconConsumer {
protected static final String TAG = "MonitoringActivity";
private BeaconManager beaconManager;
PermissionListener permissionlistener = new PermissionListener() {
@Override
public void onPermissionGranted() {
Toast.makeText(MainActivity.this, "Permission Granted", Toast.LENGTH_SHORT).show();
}
@Override
public void onPermissionDenied(List<String> deniedPermissions) {
Toast.makeText(MainActivity.this, "Permission Denied\n" + deniedPermissions.toString(), Toast.LENGTH_SHORT).show();
}
};
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
beaconManager = BeaconManager.getInstanceForApplication(this);
// To detect proprietary beacons, you must add a line like below corresponding to your beacon
// type. Do a web search for "setBeaconLayout" to get the proper expression.
beaconManager.getBeaconParsers().add(new BeaconParser().
setBeaconLayout("m:2-3=0215,i:4-19,i:20-21,i:22-23,p:24-24,d:25-25"));
beaconManager.bind(this);
}
@Override
protected void onDestroy() {
super.onDestroy();
beaconManager.unbind(this);
}
@Override
public void onBeaconServiceConnect() {
beaconManager.removeAllMonitorNotifiers();
beaconManager.addMonitorNotifier(new MonitorNotifier() {
@Override
public void didEnterRegion(Region region) {
Log.i(TAG, "I just saw an beacon for the first time!");
}
@Override
public void didExitRegion(Region region) {
Log.i(TAG, "I no longer see an beacon");
}
@Override
public void didDetermineStateForRegion(int state, Region region) {
Log.i(TAG, "I have just switched from seeing/not seeing beacons: "+state);
}
});
try {
beaconManager.startMonitoringBeaconsInRegion(new Region("E2C56DB5-DFFB-48D2-B060-D0F5A71096E0", null, null, null));
} catch (RemoteException e) { }
}
}
Make sure you have declared the FINE_LOCATION permission in your AndroidManifest.xml and that you go through the steps to obtain that permission from the user at runtime.
Once you have completed those steps, if you still see these messages, read here: https://stackoverflow.com/a/42821272/1461050
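For reference, a minimal sketch of requesting that permission at runtime (this assumes androidx.core is on the classpath; the request code 1234 is arbitrary):
// Manifest: <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
// Call this from onCreate() before beaconManager.bind(this).
private void requestLocationPermissionIfNeeded() {
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.ACCESS_FINE_LOCATION)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this,
                new String[]{Manifest.permission.ACCESS_FINE_LOCATION},
                1234); // handle the result in onRequestPermissionsResult()
    }
}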
I am using the Gradle dependency implementation 'org.webrtc:google-webrtc:1.0.30039'
This is the error:
E/rtc: Fatal error in:
gen/sdk/android/generated_metrics_jni/../../../../../../../../usr/local/google/home/sakal/code/webrtc-aar-release/src/sdk/android/src/jni/jni_generator_helper.h,
line 94 last system error: 0 Check failed: !env->ExceptionCheck()
A/libc: Fatal signal 6 (SIGABRT), code -6 in tid 11556 (network_thread
), pid 11515 (est.applicatoin)
onSignalingChange: HAVE_LOCAL_OFFER is the last log line before the exception.
An SDP is created, but ICE trickling is not happening, i.e. OnIceGathering or OnIceCandidate is not called at all.
Java Activity Code:
public class MainActivity extends AppCompatActivity {
private static final String TAG = "MainActivity";
DataChannel mainDataChannel;
PeerConnection mainPeerConnection;
PeerConnectionFactory factory;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
initializePeerConnectionFactory();
initializeMyPeerConnection(); // Connection Initialization.
startConnection(); //Getting the offer
}
private void initializePeerConnectionFactory() {
PeerConnectionFactory.InitializationOptions initializationOptions =
PeerConnectionFactory.InitializationOptions.builder(this)
.setEnableInternalTracer(true)
.createInitializationOptions();
PeerConnectionFactory.initialize(initializationOptions);
PeerConnectionFactory.Options options = new PeerConnectionFactory.Options();
factory = PeerConnectionFactory.builder()
.setOptions(options)
.createPeerConnectionFactory();
}
private void startConnection() {
Log.d(TAG, "startConnection: Starting Connection...");
//CreateOffer fires the request to get ICE candidates and finish the SDP. We can listen to all these events on the corresponding observers.
mainPeerConnection.createOffer(new SimpleSdpObserver() {
@Override
public void onCreateSuccess(SessionDescription sessionDescription) {
Log.d(TAG, "onCreateSuccess: " + sessionDescription.description);
mainPeerConnection.setLocalDescription(new SimpleSdpObserver(), sessionDescription);
}
@Override
public void onCreateFailure(String s) {
Log.e(TAG, "onCreateFailure: FAILED:" + s);
}
}, new MediaConstraints());
Log.d(TAG, "startConnection: Start Connection end");
}
private void initializeMyPeerConnection() {
Log.d(TAG, "initializeMyPeerConnection: Starting Initialization...");
mainPeerConnection = createPeerConnection(factory);
mainDataChannel = mainPeerConnection.createDataChannel("sendDataChannel", new DataChannel.Init());//Setting the data channel.
mainDataChannel.registerObserver(new DataChannel.Observer() {
@Override
public void onBufferedAmountChange(long l) {
}
@Override
public void onStateChange() {
//Data channel state change
Log.d(TAG, "onStateChange: " + mainDataChannel.state().toString());
}
@Override
public void onMessage(DataChannel.Buffer buffer) {
Toast.makeText(MainActivity.this, "Got the message!", Toast.LENGTH_SHORT).show();
}
});
Log.d(TAG, "initializeMyPeerConnection: Finished Initializing.");
}
private PeerConnection createPeerConnection(PeerConnectionFactory factory) {
List<PeerConnection.IceServer> iceServers = new LinkedList<>();
iceServers.add(PeerConnection.IceServer.builder("stun:stun.l.google.com:19302").createIceServer());
PeerConnection.RTCConfiguration rtcConfiguration = new PeerConnection.RTCConfiguration(iceServers);
PeerConnection.Observer pcObserver = new MyPeerConnectionObserver(TAG, mainPeerConnection);
return factory.createPeerConnection(rtcConfiguration, pcObserver);
}
}
I was facing the same error and solved it by adding this line to the gradle.properties file:
android.enableDexingArtifactTransform.desugaring=false
Try to add this permission to your manifest file:
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
My photo-taking algorithm works perfectly the first time, but if I call the method a second time, I get java.lang.RuntimeException: Fail to connect to camera service on Camera.open().
takePhoto(this, 0);//back camera.
takePhoto(this, 1);//selfie. No matter what, the second line crashes. Even if I switch the two lines.
Here is the method that only works the first time:
private void takePhoto(final Context context, final int frontorback) {
Log.v("myTag", "In takePhoto()");
final SurfaceView preview = new SurfaceView(context);
SurfaceHolder holder = preview.getHolder();
// deprecated setting, but required on Android versions prior to 3.0
holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
holder.addCallback(new SurfaceHolder.Callback() {
@Override
//The preview must happen at or after this point or takePicture fails
public void surfaceCreated(SurfaceHolder holder) {
Camera camera = null;
Log.v("myTag", "Surface created ");
try {
camera = Camera.open(frontorback); //** THIS IS WHERE IT CRASHES THE SECOND TIME **
Log.v("myTag", "Opened camera");
try {
camera.setPreviewDisplay(holder);
} catch (IOException e) {
Log.v("myTag", "Can't assign preview to Surfaceview holder" + e.toString());
}
try {
camera.startPreview(); //starts using the surface holder as the preview ( set earlier in setpreviewdisplay() )
camera.autoFocus(new Camera.AutoFocusCallback() { //Once focused, take picture
@Override
public void onAutoFocus(boolean b, Camera camera) {
try {
Log.v("myTag", "Started focusing");
camera.takePicture(null, null, mPictureCallback);
Log.v("myTag", "Took picture!");
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
});
} catch (Exception e) {
Log.v("myTag", "Can't start camera preview " + e.toString());
if (camera != null)
camera.release();
throw new RuntimeException(e);
}
}catch(Exception e){
Log.v("myTag", "can't open camera " +e.toString());
e.printStackTrace();
}
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
}
});
addPreviewToSurfaceView(); //Irrelevant here
}
//CALLBACK WHERE I RELEASE:
android.hardware.Camera.PictureCallback mPictureCallback = new android.hardware.Camera.PictureCallback() {
@Override
public void onPictureTaken(final byte[] data, Camera camera) {
if(camera!=null){
camera.stopPreview();
camera.setPreviewCallback(null);
camera.release();
camera = null;
}
downloadPicture(data);
sendPicture(data);
Log.v("myTag","Picture downloaded and sent");
}
};
It's odd, because takePhoto(context, int) only works the first time no matter what. Even if I switch the second parameter, only the first takePhoto() works.
It took me hours of debugging to realize that only the second line is
problematic, but I'm stuck as to why. Any feedback is much
appreciated!
EDIT:
Even after removing the code in my onPictureTaken callback, the problem persists. I suspect the camera may need some time before it can be reopened, but I can't sleep the thread since I'm performing UI interactions on it. This bug is like a puzzle right now!
You cannot call takePhoto() twice in a row, because the call takes a long time (and two callbacks) to complete. You should start the second call only after the first picture is finished. Here is an example, based on your code:
private void takePhoto(final Context context, final int frontorback) {
...
android.hardware.Camera.PictureCallback mPictureCallback = new android.hardware.Camera.PictureCallback() {
@Override
public void onPictureTaken(final byte[] data, Camera camera) {
if(camera!=null){
camera.stopPreview();
camera.setPreviewCallback(null);
camera.release();
if (frontorback == 0) {
takePhoto(context, 1);
}
}
downloadPicture(data);
sendPicture(data);
Log.v("myTag","Picture downloaded and sent");
}
};
This will start the first photo and start the second photo only when the first is complete.
Here might be the problem: after you take a photo, the picture-taken callback gets called.
if(camera!=null){
camera.stopPreview();
camera.setPreviewCallback(null);
camera.release();
camera = null;
}
The camera has been released at that point, so the second call won't work. You have to either leave the camera open or initialize it again before taking the second photo; a rough sketch of the first option is below.
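This sketch assumes you only need more shots from the same camera (switching between front and back still requires releasing and reopening, as the other answer shows):
android.hardware.Camera.PictureCallback mPictureCallback = new android.hardware.Camera.PictureCallback() {
    @Override
    public void onPictureTaken(final byte[] data, Camera camera) {
        if (camera != null) {
            // takePicture() stops the preview, so restart it instead of releasing the camera
            camera.startPreview();
        }
        downloadPicture(data);
        sendPicture(data);
        // camera.release(); // release only when you are completely done taking photos
    }
};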
Does anyone know how I can add a progress bar that tells me how much of the file has been uploaded? With the code below I have managed to pick a file from my phone and send it, but the problem is that when I upload a huge file I keep waiting without knowing when it will finish, which is why I need a progress bar that shows how much has been transferred to the server. This is the code I have; I have tried different implementations but to no avail.
public class MainActivity extends AppCompatActivity {
private Button fileUploadBtn;
protected static String IPADDRESS = "http://10.42.0.1/hugefile/save_file.php";
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
fileUploadBtn = (Button) findViewById(R.id.btnFileUpload);
pickFile();
}
private void pickFile() {
fileUploadBtn.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
new MaterialFilePicker()
.withActivity(MainActivity.this)
.withRequestCode(10).start();
}
});
}
ProgressDialog progress;
@Override
protected void onActivityResult(int requestCode, int resultCode, final Intent data) {
progress = new ProgressDialog(MainActivity.this);
progress.setTitle("Uploading File(s)");
progress.setMessage("Please wait...");
progress.show();
if (requestCode == 10 && resultCode == RESULT_OK) {
Thread t = new Thread(new Runnable() {
@Override
public void run() {
File f = new File(data.getStringExtra(FilePickerActivity.RESULT_FILE_PATH));
String content_type = getMimeType(f.getPath());
String file_path = f.getAbsolutePath();
OkHttpClient client = new OkHttpClient();
RequestBody file_body = RequestBody.create(MediaType.parse(content_type), f);
RequestBody request_body = new MultipartBody.Builder()
.setType(MultipartBody.FORM)
.addFormDataPart("type", content_type)
.addFormDataPart("uploaded_file", file_path.substring(file_path.lastIndexOf("/") + 1), file_body).build();
Request request = new Request.Builder()
.url(IPADDRESS)
.post(request_body)
.build();
try {
Response response = client.newCall(request).execute();
Log.d("Server response", response.body().string());
if (!response.isSuccessful()) {
throw new IOException("Error : " + response);
}
progress.dismiss();
} catch (IOException e) {
e.printStackTrace();
}
}
});
t.start();
}
}
private String getMimeType(String path) {
String extension = MimeTypeMap.getFileExtensionFromUrl(path);
return MimeTypeMap.getSingleton().getMimeTypeFromExtension(extension);
}
}
I think your whole solution needs some corrections. Blocking the user with a dialog is bad practice, and your upload progress can be lost on a configuration change. A raw new Thread is also too crude nowadays. So, first of all, I advise you to move the upload code into a Service or IntentService. You can show progress and status in a notification and then alert the user with a dialog or snackbar, etc. Secondly, there is no built-in way to monitor upload progress; generally the best approach is to implement a custom RequestBody that notifies a listener as bytes are uploaded. Please refer to Tracking progress of multipart file upload using OkHttp.
Then you can use broadcasts or an event bus to publish the progress.
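A minimal sketch of such a counting RequestBody (the class and listener names are invented for illustration, not part of OkHttp; imports needed: okhttp3.MediaType, okhttp3.RequestBody, okio.Buffer, okio.BufferedSink, okio.ForwardingSink, okio.Okio, java.io.IOException):
public class ProgressRequestBody extends RequestBody {
    public interface ProgressListener {
        void onProgress(long bytesWritten, long contentLength);
    }

    private final RequestBody delegate;
    private final ProgressListener listener;

    public ProgressRequestBody(RequestBody delegate, ProgressListener listener) {
        this.delegate = delegate;
        this.listener = listener;
    }

    @Override
    public MediaType contentType() {
        return delegate.contentType();
    }

    @Override
    public long contentLength() throws IOException {
        return delegate.contentLength();
    }

    @Override
    public void writeTo(BufferedSink sink) throws IOException {
        // Wrap the sink so every write is counted before being forwarded.
        ForwardingSink countingSink = new ForwardingSink(sink) {
            private long written = 0;

            @Override
            public void write(Buffer source, long byteCount) throws IOException {
                super.write(source, byteCount);
                written += byteCount;
                listener.onProgress(written, contentLength());
            }
        };
        BufferedSink bufferedSink = Okio.buffer(countingSink);
        delegate.writeTo(bufferedSink);
        bufferedSink.flush(); // make sure buffered bytes reach the wrapped sink
    }
}
In your code you would wrap request_body, e.g. new ProgressRequestBody(request_body, listener), and have the listener update a notification (or post to the main thread) using bytesWritten and contentLength.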
I managed to solve it with a few improvements; here is the link to the solution for anyone who encounters the same problem:
https://github.com/MakuSimz/Android-Multipart-Upload-with-Progress
I was using the following code to play a sound. Everything worked fine before ICS, but on ICS and higher versions no sound is heard. There is no error, just no sound.
EDIT: Note that the following code is triggered by a broadcast receiver. The broadcast receiver invokes an AsyncTask, and the method below is called from its post-execute step.
What could the error possibly be?
public static void playSound(final Context context, final int volume,
Uri uri, final int stream, int maxTime, int tickTime) {
//stopPlaying();
/*
if (stream < 0 || stream > 100) {
throw new IllegalArgumentException(
"volume must be between 0 and 100 .Current volume "
+ volume);
}*/
final AudioManager mAudioManager = (AudioManager) context
.getSystemService(Context.AUDIO_SERVICE);
int deviceLocalVolume = getDeviceVolume(volume,
mAudioManager.getStreamMaxVolume(stream));
Log.d(TAG,
"device max volume = "
+ mAudioManager.getStreamMaxVolume(stream)
+ " for streamType " + stream);
Log.d(TAG, "playing sound " + uri.toString()
+ " with device local volume " + deviceLocalVolume);
final int oldVolume = mAudioManager.getStreamVolume(stream);
// set the volume to what we want it to be. In this case it's max volume
// for the alarm stream.
Log.d(Constants.APP_TAG, "setting device local volume to " + deviceLocalVolume);
mAudioManager.setStreamVolume(stream, deviceLocalVolume,
AudioManager.FLAG_REMOVE_SOUND_AND_VIBRATE);
final MediaPlayer mediaPlayer = new MediaPlayer();
golbalMMediaPlayer = mediaPlayer;
try {
final OnPreparedListener OnPreparedListener = new OnPreparedListener() {
@Override
public void onPrepared(final MediaPlayer mp) {
Log.d(TAG, "onMediaPlayercompletion listener");
mp.start();
countDownTimer.start();
}
};
mediaPlayer.setDataSource(context.getApplicationContext(), uri);
mediaPlayer.setAudioStreamType(stream);
mediaPlayer.setLooping(false);
mediaPlayer.setOnPreparedListener(OnPreparedListener);
mediaPlayer.setOnCompletionListener(new OnCompletionListener() {
@Override
public void onCompletion(MediaPlayer mp) {
Log.d(Constants.APP_TAG, "Entered onCompletion listener of mediaplayer");
mAudioManager.setStreamVolume(stream, oldVolume,
AudioManager.FLAG_REMOVE_SOUND_AND_VIBRATE);
try{
if(mediaPlayer != null && mediaPlayer.isPlaying()){
mediaPlayer.release();
}
}catch(Exception ex){
Log.e(Constants.APP_TAG, "error on oncompletion listener" ,ex);
}
}
});
CountDownTimer timer = new CountDownTimer(maxTime*1000, tickTime*1000) {
@Override
public void onTick(long millisUntilFinished) {
Log.d(TAG, "tick while playing sound ");
}
@Override
public void onFinish() {
Log.d(TAG, "timer finished");
stopPlaying();
}
};
countDownTimer = timer;
mediaPlayer.prepareAsync();
} catch (Exception e) {
Log.e(TAG, "problem while playing sound", e);
} finally {
}
}
LOGS:
07-01 00:00:00.030: D/beephourly(9500): device max volume = 7 for streamType 5
07-01 00:00:00.030: D/beephourly(9500): playing sound content://media/internal/audio/media/166 with device local volume 7
07-01 00:00:00.030: D/beephourly(9500): setting device local volume to 7
07-01 00:00:00.080: D/beephourly(9500): vibrating with pattern = [J#428bae20
07-01 00:00:00.090: D/beephourly(9500): will show normal notification
07-01 00:00:00.100: D/beephourly(9500): notification is enabled
07-01 00:00:00.100: D/usersettings(9500): hr = 0
07-01 00:00:00.110: D/beephourly(9500): onMediaPlayercompletion listener
07-01 00:00:00.451: D/beephourly(9500): tick while playing sound
07-01 00:00:20.460: D/beephourly(9500): timer finished
07-01 00:00:20.460: D/beephourly(9500): got request to stop playing
07-01 00:00:20.460: D/beephourly(9500): cancelling countdowntimer
07-01 00:00:20.460: D/beephourly(9500): releasing mediaplayer now
Try this:
Playing sound
public class PlaySound extends Activity implements OnTouchListener {
private SoundPool soundPool;
private int soundID;
boolean loaded = false;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
View view = findViewById(R.id.textView1);
view.setOnTouchListener(this);
// Set the hardware buttons to control the music
this.setVolumeControlStream(AudioManager.STREAM_MUSIC);
// Load the sound
soundPool = new SoundPool(10, AudioManager.STREAM_MUSIC, 0);
soundPool.setOnLoadCompleteListener(new OnLoadCompleteListener() {
@Override
public void onLoadComplete(SoundPool soundPool, int sampleId,
int status) {
loaded = true;
}
});
soundID = soundPool.load(this, R.raw.sound1, 1);
}
@Override
public boolean onTouch(View v, MotionEvent event) {
if (event.getAction() == MotionEvent.ACTION_DOWN) {
// Getting the user sound settings
AudioManager audioManager = (AudioManager) getSystemService(AUDIO_SERVICE);
float actualVolume = (float) audioManager
.getStreamVolume(AudioManager.STREAM_MUSIC);
float maxVolume = (float) audioManager
.getStreamMaxVolume(AudioManager.STREAM_MUSIC);
float volume = actualVolume / maxVolume;
// Is the sound loaded already?
if (loaded) {
soundPool.play(soundID, volume, volume, 1, 0, 1f);
Log.e("Test", "Played sound");
}
}
return false;
}
}
Layout file:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:orientation="vertical" >
<TextView
android:id="#+id/textView1"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:text="Click on the screen to start playing" >
</TextView>
</LinearLayout>
Source link : http://www.vogella.com/tutorials/AndroidMedia/article.html#sound
Sometimes the MediaPlayer object has to be declared as a class-level variable, or it will be garbage-collected by the Dalvik heap:
public final MediaPlayer mediaPlayer = new MediaPlayer();
private MediaPlayer mPlayer;
....
SoundPool sp = new SoundPool(5, AudioManager.STREAM_MUSIC, 0);
int iTmp = sp.load(getBaseContext(), R.raw.windows_8_notify, 1);
sp.play(iTmp, 1, 1, 0, 0, 1);
mPlayer = MediaPlayer.create(getBaseContext(), R.raw.windows_8_notify);
mPlayer.start();
mPlayer.setLooping(true); }
First, put the first line where all your private fields are, before onCreate(); then start the music inside onCreate(). Just make sure to change "windows_8_notify" to the name of the sound you want.
I would wrap the call in a try/catch for IllegalStateException, run it through the debugger, and see what you get.
Things to try:
- Set boolean isPlaying = mp.isPlaying(); and check its value.
- Try mp.reset() before starting and see if it works.
- Implement MediaPlayer.OnErrorListener and register it with the media player, then see what error you get; a sketch is below. This might be helpful.
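A minimal sketch of registering the error listener (mediaPlayer and TAG refer to the variables in the question's playSound() method):
mediaPlayer.setOnErrorListener(new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        // Log both codes so you can look them up (e.g. MEDIA_ERROR_UNKNOWN = 1).
        Log.e(TAG, "MediaPlayer error: what=" + what + " extra=" + extra);
        return true; // true means "handled"; onCompletion will not be called
    }
});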
LOGS
...streamType 5
StreamType 5 means STREAM_NOTIFICATION.
(Called from notification?)
It should be STREAM_MUSIC (3)
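If that is the cause, a quick test (assuming you can change the stream type the caller passes to playSound()) is to force the music stream:
// Use the music stream instead of the notification stream (5) seen in the logs.
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC); // STREAM_MUSIC == 3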
To check that it's not an ICS/device-specific problem:
- place a sound file (sound_01.ogg or sound_01.mp3) under the res/raw/ folder
- place buttons named start_button and stop_button in main_layout
and try this.
(I've checked this code on API 10 and API 19 emulators and the sounds play.)
import android.app.Activity;
import android.media.AudioManager;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.view.View;
import android.view.View.OnClickListener;
public class MainActivity extends Activity
// implements MediaPlayer.OnPreparedListener
{
private MediaPlayer mediaPlayer;
private boolean isPrepared;
private boolean isPlaying;
private View start_button;
private View stop_button;
@Override
protected void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
setContentView(R.layout.main_layout);
Init();
}
@Override
protected void onResume()
{
super.onResume();
Load();
}
@Override
protected void onPause()
{
super.onPause();
Unload();
}
private void Init()
{
setVolumeControlStream(AudioManager.STREAM_MUSIC);
start_button = findViewById(R.id.start_button);
stop_button = findViewById(R.id.stop_button);
start_button.setOnClickListener(new OnClickListener()
{
@Override
public void onClick(View v)
{
Play();
}
});
stop_button.setOnClickListener(new OnClickListener()
{
@Override
public void onClick(View v)
{
Stop();
}
});
}
private void Load()
{
Unload();
// load from resource (res/raw/xx.ogg or .mp3)
// It's better to use thread
mediaPlayer = MediaPlayer.create(MainActivity.this, R.raw.sound_01); // On success, prepare() will already have been called
// mediaPlayer.setOnPreparedListener(this); // cannot set this listener (MediaPlayer.create does not return before prepared)
isPrepared = true;
}
private void Unload()
{
isPrepared = false;
if (null != mediaPlayer)
{
mediaPlayer.release();
mediaPlayer = null;
}
}
// @Override
// public void onPrepared(MediaPlayer mp)
// {
// isPrepared = true;
// }
private void Play()
{
// If you got "start called in state xx" error, no sound will be heard.
// To reset this error, call reset(), setDataSource() and prepare()
// (for resources: call release() and create())
if (!isPrepared)
{
return;
}
mediaPlayer.start();
isPlaying = true;
}
private void Stop()
{
// Do not omit this check
// or you will get "called in wrong state" errors
// like "pause called in state 8"
// and error (-38, 0)
if (!isPlaying)
{
return;
}
isPlaying = false;
mediaPlayer.pause();
mediaPlayer.seekTo(0);
}
}
If it's ICS/device specific, these links may help. (A little old...)
after small sound is played, no sound will be heard
Issue 35861: Low Volume sound cut out - ICS Galaxy Note
audio focus bug
Issue 1908: No Audio with Android 4.0.4 ICS Galaxy Tab 10.1
device specific problem
No sound during calls (samsung galaxy s3 problem)
You might have a problem if you are using other AsyncTasks or the SerialExecutor in another task elsewhere in your program (and you may not even know it, if you are using third-party SDKs).
See the post here:
https://code.google.com/p/android/issues/detail?id=20941
I'm suggesting this because your sound "tick" isn't working either. So it isn't necessarily a matter of the AudioPlayer executing with an incorrect setting; rather, some other task appears to be blocking it until that task stops, and it is probably a task that runs concurrently with when you expect to hear the sound.
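One way to rule that out is to run the task on the thread pool instead of the default serial executor (myTask here is a placeholder for whatever AsyncTask you use to start the sound):
// AsyncTasks on the default executor run one at a time and can queue up behind each other.
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.HONEYCOMB) {
    myTask.executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR);
} else {
    myTask.execute(); // pre-Honeycomb AsyncTasks already ran on a thread pool
}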