I'm trying to play an RTSP URL in my Android app, using the following code:
String url = "rtsp://mobilestr1.livestream.com/livestreamiphone/nyc";
Uri uri = Uri.parse(url);
System.out.println("URL="+url);
startActivity(new Intent(Intent.ACTION_VIEW, uri));
However, an alert dialog pops up after a few seconds saying "Unable to play video".
I have tried several RTSP URLs and none of them work. What am I doing wrong?
Thanks
That stream is H.264 (MPEG-4 AVC, Part 10), which doesn't work on most Android devices.
This page has a list of what does work, but essentially you need an MPEG-4 Baseline stream:
http://developer.android.com/guide/appendix/media-formats.html#recommendations
If you open the stream in VLC and go to Window > Media Information > Codec Details, you can verify this information as well.
Try the code below. It works for me. Don't forget to add the INTERNET permission to your manifest.
private void rtspStream(String rtspUrl) {
    mVideoView.setVideoURI(Uri.parse(rtspUrl));
    mVideoView.requestFocus();
    mVideoView.start();
}
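In case it helps, here is a minimal sketch of how the VideoView above could be wired up in an Activity. The layout name, view id and stream URL are placeholders rather than part of the original answer, so adapt them to your project:

import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.VideoView;

public class RtspPlayerActivity extends Activity {

    private VideoView mVideoView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Placeholder layout containing a VideoView with the id "videoView".
        setContentView(R.layout.activity_rtsp_player);
        mVideoView = (VideoView) findViewById(R.id.videoView);

        // Requires <uses-permission android:name="android.permission.INTERNET"/> in the manifest.
        rtspStream("rtsp://your-server/your-stream"); // placeholder URL
    }

    private void rtspStream(String rtspUrl) {
        mVideoView.setVideoURI(Uri.parse(rtspUrl));
        mVideoView.requestFocus();
        mVideoView.start();
    }
}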
My problem is playing videos and streaming over the network. If I know the location of a video file and I want to play it, I just do:
String Url = "https://www.w3schools.com/html/mov_bbb.mp4";
VideoView videoView;

protected void onCreate(Bundle savedInstanceState) {
    ...
    videoView = (VideoView) findViewById(R.id.loadVideo);
    videoView.setVideoURI(Uri.parse(this.Url));
    videoView.start();
}
But if the video is a stream, for example one sent from my computer, it does not play. I know the address (example: 127.0.0.1:8080 - yes, it's a random example, I know this is localhost and can't work, but I can't use the real address of the video, sorry) and I can play it from Windows via VLC with no trouble, yet in the app the video is not played and the code ends up in the catch block. So the same syntax cannot be used for video streaming via an HTTP address.
I looked everywhere on the internet, but I did not find any working solution, and many of the plugins I've seen are now obsolete and/or unusable with Android Studio. I'm not a professional Android programmer, but I've been asked to do this. Do you know how I can do it?
I forgot: the intention is to create a live stream.
I had a problem with my VideoView.
When I try to play a video from a specific URL on my API 27 emulator, Android shows me a message dialog:
Can't play this video
This is what I get in Logcat:
source returned error -1010, 0 retries left
initFromDataSource, source has no track!
Failed to init from data source!
MediaPlayerNative: error (1, -2147483648)
MediaPlayer: Error (1,-2147483648)
This is the code where I use my VideoView:
mVideoView = findViewById(R.id.videoView);
mMediaController = new MediaController(this);
mVideoView.setVideoPath("https://clips.vorwaerts-gmbh.de/VfE_html5.mp4");
mVideoView.requestFocus();
initListeners();
The initListeners method:
mVideoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        // Attach the controller and start playback once the video is prepared.
        mVideoView.setMediaController(mMediaController);
        mVideoView.setBackground(null);
        mMediaController.setAnchorView(mVideoView);
        mMediaController.show();
        mVideoView.start();
    }
});
mVideoView.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
    @Override
    public void onCompletion(MediaPlayer mp) {
        // Restore the preview image when playback finishes.
        mVideoView.setBackground(getDrawable(R.drawable.webinar_photo_preview));
    }
});
I tested my VideoView feature on API 23 and API 24 and everything seemed fine.
Can somebody tell me what I'm doing wrong?
EDIT: I have now found that the error also appears on API 24.
The problem was in VideoView: setVideoPath only handled videos with a small file size (1 - 2 MB), and if the size is bigger MediaPlayer crashes with MEDIA_ERROR_SYSTEM (-2147483648), a low-level system error, as described in the documentation. That's why I started using ExoPlayer.
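For anyone going the same route, here is a minimal ExoPlayer sketch. It assumes a recent ExoPlayer 2.x (the MediaItem API, roughly 2.12+) and a layout containing a com.google.android.exoplayer2.ui.PlayerView with the placeholder id "player_view"; the layout name is also a placeholder:

import android.app.Activity;
import android.os.Bundle;

import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.SimpleExoPlayer;
import com.google.android.exoplayer2.ui.PlayerView;

public class ExoPlayerActivity extends Activity {

    private SimpleExoPlayer player;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_exo_player); // placeholder layout

        PlayerView playerView = findViewById(R.id.player_view);
        player = new SimpleExoPlayer.Builder(this).build();
        playerView.setPlayer(player);

        // Same test URL as in the question.
        player.setMediaItem(MediaItem.fromUri("https://clips.vorwaerts-gmbh.de/VfE_html5.mp4"));
        player.prepare();
        player.play();
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        player.release(); // free codec and player resources
    }
}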
It looks like your code is working fine; maybe the problem is that you are not using the cookies that are sent to the browser at the time of the request.
In simple terms, this video is not meant for direct access via code.
If you still want to try, follow the steps below.
Make an HTTP request to the link (https://clips.vorwaerts-gmbh.de/VfE_html5.mp4) and store the received cookies.
Use the received cookies with your next request when you want to play the video.
Note - If you want to use cookies to play the video, it can be done via ExoPlayer (https://github.com/google/ExoPlayer), as in the sketch below.
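A rough sketch of what that could look like; the cookie header value is whatever you captured from your own HTTP request, and the code assumes a recent ExoPlayer 2.x where DefaultHttpDataSource.Factory and the MediaItem API are available:

import java.util.Collections;

import android.content.Context;

import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.SimpleExoPlayer;
import com.google.android.exoplayer2.source.ProgressiveMediaSource;
import com.google.android.exoplayer2.upstream.DefaultHttpDataSource;

public final class CookieStreamHelper {

    public static SimpleExoPlayer playWithCookie(Context context, String videoUrl, String cookieHeader) {
        // HTTP data source factory that attaches the stored cookie to every request.
        DefaultHttpDataSource.Factory httpFactory = new DefaultHttpDataSource.Factory()
                .setDefaultRequestProperties(Collections.singletonMap("Cookie", cookieHeader));

        ProgressiveMediaSource mediaSource = new ProgressiveMediaSource.Factory(httpFactory)
                .createMediaSource(MediaItem.fromUri(videoUrl));

        SimpleExoPlayer player = new SimpleExoPlayer.Builder(context).build();
        player.setMediaSource(mediaSource);
        player.prepare();
        player.play();
        return player;
    }
}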
I'm trying to access an RTSP video stream from an IP camera using OpenCV and Java. I can access the stream using VLC player with the following format: rtsp://192.168.1.10:554/rtsp_live0, but when I try to use OpenCV the video stream always seems to be closed.
The code I'm using... (simplified)
VideoCapture capture = new VideoCapture();
capture.open("rtsp://192.168.1.10:554/rtsp_live0");
while (!capture.isOpened())
    System.out.print("Not opened :( \r");
I have a Mustcam H806P and found the stream URI from this website: http://www.ispyconnect.com/man.aspx?n=ipcamera
What am I doing wrong?
I'm reporting Alexander Smorkalov's answer from answers.opencv.org:
OpenCV uses the ffmpeg library for video I/O. Try to get the video stream with the console ffmpeg tool. The address must be the same.
See also here: OpenCV - how to capture rtsp video stream
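If it helps, here is a minimal sketch of reading frames from that RTSP URL with the OpenCV Java bindings. It assumes OpenCV 3.x/4.x built with FFmpeg support; how the native library is loaded depends on your setup:

import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.videoio.VideoCapture;

public class RtspGrabber {

    public static void main(String[] args) {
        // Load the OpenCV native library (adjust to how your project loads it).
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);

        // Open the stream in the constructor instead of busy-waiting on isOpened().
        VideoCapture capture = new VideoCapture("rtsp://192.168.1.10:554/rtsp_live0");
        if (!capture.isOpened()) {
            System.err.println("Could not open the RTSP stream (check FFmpeg support in your OpenCV build).");
            return;
        }

        Mat frame = new Mat();
        // Read a handful of frames as a smoke test.
        for (int i = 0; i < 100 && capture.read(frame); i++) {
            System.out.println("Frame " + i + ": " + frame.width() + "x" + frame.height());
        }
        capture.release();
    }
}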
I have succeeded in recording video through MediaRecorder to the SD card, but I want to send this video to a server without writing it to the SD card.
I searched and found that ParcelFileDescriptor is the way to send video to a TCP socket, but I don't know how to receive it on the server side. Please explain.
Here is my client-side code:
socket = new Socket("hostname", portnumber);
ParcelFileDescriptor pfd =ParcelFileDescriptor.fromSocket(socket);
recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setOutputFile(pfd.getFileDescriptor());
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
mPreview = new Preview(VideoRecorder.this,recorder);
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
setContentView(mPreview);
I want to receive it on the server side and play it, to create a real-time video transfer,
knowing that:
"The MediaRecorder records either in 3GPP or in MP4 format. This file format consists of atoms, where each atom starts with its size. There are different kinds of atoms in a file, mdat atoms store the actual raw frames of the encoded video and audio. In the Cupcake version Android starts writing out an mdat atom with the encoded frames, but it has to leave the size of the atom empty for obvious reasons. When writing to a seekable file descriptor, it can simply fill in the blanks after the recording, but of course socket file descriptors are not seekable. So the received stream will have to be fixed up after the recording is finished, or the raw video / audio frames have to be processed by the server".
I want server-side code (for an Android handset or a PC, either way).
If there is another way, please help me.
Thanks
In order to stream from Android or a PC you need a server and an implementation of a protocol over which the stream is carried. There are several of them, like HLS, RTSP, etc. (more: http://en.wikipedia.org/wiki/Streaming_media). It is not a trivial problem, and there are only a very few successful streaming services from Android.
You can check how to implement a streaming service on Android here: https://github.com/fyhertz/libstreaming
The library is broken for Android 5, but works for 4.4.x.
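If you just want to see the bytes arrive while experimenting, here is a minimal sketch of a plain Java TCP receiver for the client code above. The port is a placeholder, and, as the quote in the question explains, the resulting 3GPP/MP4 file still needs its atom sizes fixed up (or the raw frames post-processed) before it is playable:

import java.io.FileOutputStream;
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class VideoReceiveServer {

    public static void main(String[] args) throws Exception {
        int port = 5000; // placeholder: must match the port the Android client connects to
        try (ServerSocket serverSocket = new ServerSocket(port);
             Socket client = serverSocket.accept();            // wait for the phone to connect
             InputStream in = client.getInputStream();
             FileOutputStream out = new FileOutputStream("received.3gp")) {

            byte[] buffer = new byte[8192];
            int read;
            // Dump everything the MediaRecorder writes into the socket to a file.
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
        // Note: the file is NOT directly playable; the atom sizes MediaRecorder could not
        // write to the non-seekable socket have to be repaired first.
    }
}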
I am developing an application where I have a file URI as well as a Bluetooth device address. I need to send the file to the given Bluetooth device, but the device-picker screen should not be shown; it should directly start sending to the device.
Obviously, the ACTION_SEND intent is not an option here, as it will show the chooser dialog. The main intention of the application is to bypass the chooser dialog and enable the user to send the selected file to the selected device directly.
So I was trying the following solution suggested on Stack Overflow:
BluetoothDevice device;
String filePath = Environment.getExternalStorageDirectory().toString() + "/file.jpg";
ContentValues values = new ContentValues();
values.put(BluetoothShare.URI, Uri.fromFile(new File(filePath)).toString());
values.put(BluetoothShare.DESTINATION, device.getAddress());
values.put(BluetoothShare.DIRECTION, BluetoothShare.DIRECTION_OUTBOUND);
Long ts = System.currentTimeMillis();
values.put(BluetoothShare.TIMESTAMP, ts);
Uri contentUri = getContentResolver().insert(BluetoothShare.CONTENT_URI, values);
But unfortunately, it is not working. After getContentResolver().insert, no action is taken. Needless to say, I have tried various permissions and other things, but to no effect.
So, people who have used this code, please share your suggestions. Any help in meeting this requirement will be much appreciated.
Does it produce any exception? Please share your Logcat output.
By the way, make sure you have the following permission in AndroidManifest.xml:
<uses-permission android:name="android.permission.BLUETOOTH"/>
If you are doing device discovery, add this to the manifest file too:
<uses-permission android:name="android.permission.BLUETOOTH_ADMIN"/>
I also ran into this problem, and I can share some evidence that helps narrow it down. After trying this code, check the outbound transfer queue of your device (to open it, send a file manually to a device and tap the notification icon). There you can see that the device has tried to send objects and that those transfers failed. Tap one of the failure messages and you will see a pop-up with no file path. I think the problem is that, although we set the file path via the URI in the ContentValues, it is not picked up during the transfer. Something still has to be done about this.