I'm trying to access an RTSP video stream from an IP camera using OpenCV and Java. I can access the stream in VLC player using the following format: rtsp://192.168.1.10:554/rtsp_live0, but when I try to use OpenCV the video stream always seems to be closed.
The code I'm using... (simplified)
VideoCapture capture = new VideoCapture();
capture.open("rtsp://192.168.1.10:554/rtsp_live0");
while (!capture.isOpened()) {
    System.out.print("Not opened :( \r");
}
I have a Mustcam H806P and found the stream URI from this website: http://www.ispyconnect.com/man.aspx?n=ipcamera
What am I doing wrong?
I'm reposting Alexander Smorkalov's answer from answers.opencv.org:
OpenCV uses the ffmpeg library for video I/O. Try to get the video stream with the console ffmpeg tool. The address must be the same.
See also: OpenCV - how to capture rtsp video stream
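For reference, a minimal end-to-end sketch of that check (a sketch assuming the OpenCV 3.x Java bindings built with FFmpeg support; the class name is illustrative):

import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.videoio.VideoCapture;

public class RtspCheck {
    public static void main(String[] args) {
        // The native library must be loaded before any other OpenCV call.
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
        VideoCapture capture = new VideoCapture("rtsp://192.168.1.10:554/rtsp_live0");
        if (!capture.isOpened()) {
            System.out.println("Stream not opened; verify the URL with the console ffmpeg tool first.");
            return;
        }
        Mat frame = new Mat();
        if (capture.read(frame)) {
            System.out.println("Got a frame: " + frame.width() + "x" + frame.height());
        }
        capture.release();
    }
}

If this fails while the console ffmpeg tool plays the same URL, the OpenCV build is likely missing FFmpeg support.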
My problem concerns playing videos and streams over the net. If I know the location of a video file and want to play it, I just do:
String url = "https://www.w3schools.com/html/mov_bbb.mp4";
VideoView videoView;

protected void onCreate(Bundle savedInstanceState) {
    ...
    videoView = (VideoView) findViewById(R.id.loadVideo);
    videoView.setVideoURI(Uri.parse(this.url));
    videoView.start();
}
But if the video is a stream, for example one sent from my computer, it does not play and the code ends up in the catch block. I know the address (example: 127.0.0.1:8080; yes, that's a made-up example, I know localhost can't work, but I can't share the real address of the video, sorry) and I can play it from Windows via VLC without problems. So the same syntax cannot be used for video streamed via an http address.
I have looked everywhere on the internet, but I have not found a working solution, and many of the plugins I have seen are now obsolete and/or unusable with Android Studio. I'm not a professional Android programmer, but I've been asked to do this. Do you know how I can do it?
I forgot to mention: the intention is to create a live stream.
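Not a full answer, but one way to see why playback ends up in the catch block: VideoView exposes the underlying MediaPlayer error codes through an error listener. A sketch reusing the videoView from the snippet above (needs imports for android.media.MediaPlayer and android.util.Log):

videoView.setOnErrorListener(new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        // Log the framework error code instead of swallowing it.
        Log.e("Stream", "MediaPlayer error: what=" + what + " extra=" + extra);
        return true; // suppress the default "Can't play this video" dialog
    }
});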
I usually don't like to post questions because I would rather figure things out myself, but I am ready to pull my hair out with this one. I am trying to interface with a Sony IP camera using Java. One of the products of the company I work for uses a Sony IP camera (IPela EP550), and I have been tasked with writing the new interface. I can connect to the stream using the VLC ActiveX embedded control, but I can't manipulate the PTZ of the camera from Java. If I type "http://xxx.xxx.xxx.xxx/command/ptzf.cgi?Move=left,0" in a web browser, the camera moves, but I have tried every bit of code I can find with Google to get it to move, with no success. The last thing I tried (because a page on Oracle said all I should have to do is open the connection):
URL url1 = new URL("http://xxx.xxx.xxx.xxx/command/ptzf.cgi?Move=left,0&t="+new Date().getTime());
HttpURLConnection con = (HttpURLConnection)url1.openConnection();
Any help will be appreciated. Thank you.
Joe
Check whether the camera requires a login.
Type the URL in the browser, capture the HTTP request headers, and put the header data into your code!
I figured out how to do this and am posting the solution in case anybody is looking to fix a similar problem. I took the basic idea in this Dr. Dobb's article and used it to get movement from the camera. I don't yet know why I can't get the camera to respond with URLConnection and HttpURLConnection, but using a Socket and a PrintWriter to print the GET request directly to the socket works.
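A minimal sketch of that Socket/PrintWriter approach (the host is a placeholder, and the camera may additionally require login headers, as the comments above suggest):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;

public class PtzMove {
    public static void main(String[] args) throws Exception {
        String host = "192.168.1.20"; // placeholder camera address
        try (Socket socket = new Socket(host, 80);
             PrintWriter out = new PrintWriter(socket.getOutputStream())) {
            // Write the raw HTTP GET request for the PTZ command.
            out.print("GET /command/ptzf.cgi?Move=left,0 HTTP/1.1\r\n");
            out.print("Host: " + host + "\r\n");
            out.print("Connection: close\r\n\r\n");
            out.flush();
            // Read the camera's status line to confirm the command was accepted.
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(socket.getInputStream()));
            System.out.println(in.readLine());
        }
    }
}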
I'm trying to play an RTSP url in my android app, using the following code:
String url = "rtsp://mobilestr1.livestream.com/livestreamiphone/nyc";
Uri uri = Uri.parse(url);
System.out.println("URL="+url);
startActivity(new Intent(Intent.ACTION_VIEW, uri));
However an alert dialog pops up after a few seconds saying "Unable to play video".
I have tried several RTSP urls and none of them work. What am I doing wrong?
Thanks
That stream is H.264 (MPEG-4 AVC, Part 10), which doesn't work on most Android devices.
This page has a list of what does work, but essentially you need an MPEG-4 Baseline stream:
http://developer.android.com/guide/appendix/media-formats.html#recommendations
If you open the stream in VLC, you can verify this info under Window > Media Information > Codec Details.
Try the code below; it works for me. Don't forget to add the internet permission to your manifest.
private void rtspStream(String rtspUrl) {
    mVideoView.setVideoURI(Uri.parse(rtspUrl));
    mVideoView.requestFocus();
    mVideoView.start();
}
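For completeness, a sketch of how this might be wired up in an Activity (the layout and view id are assumptions):

// AndroidManifest.xml must contain:
// <uses-permission android:name="android.permission.INTERNET" />

private VideoView mVideoView;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main); // layout name is an assumption
    mVideoView = (VideoView) findViewById(R.id.videoView); // id is an assumption
    rtspStream("rtsp://mobilestr1.livestream.com/livestreamiphone/nyc");
}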
I have succeeded in recording video through MediaRecorder to the SD card, but I want to send this video to a server without writing to the SD card. I searched and found that ParcelFileDescriptor is the way to send the video to a TCP socket, but I don't know how to receive it on the server side; please explain. Here is my client-side code:
// Stream the recorder's output straight into a TCP socket
socket = new Socket("hostname", portnumber);
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);
recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
// The socket's file descriptor stands in for an output file
recorder.setOutputFile(pfd.getFileDescriptor());
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
mPreview = new Preview(VideoRecorder.this, recorder);
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
setContentView(mPreview);
I want to receive it on the server side and play it, to create a real-time video transfer, knowing that:
"The MediaRecorder records either in 3GPP or in MP4 format. This file format consists of atoms, where each atom starts with its size. There are different kinds of atoms in a file, mdat atoms store the actual raw frames of the encoded video and audio. In the Cupcake version Android starts writing out an mdat atom with the encoded frames, but it has to leave the size of the atom empty for obvious reasons. When writing to a seekable file descriptor, it can simply fill in the blanks after the recording, but of course socket file descriptors are not seekable. So the received stream will have to be fixed up after the recording is finished, or the raw video / audio frames have to be processed by the server".
I want server-side code (the server may be an Android handset or a PC). If there is another way, please help me.
Thanks
In order to stream from Android or a PC, you need a server and an implementation of a protocol over which the stream is carried. There are several such protocols, like HLS, RTSP, etc. (more: http://en.wikipedia.org/wiki/Streaming_media). It is not a trivial problem, and there are only very few successful streaming services from Android.
You can check how to implement a streaming service on Android here: https://github.com/fyhertz/libstreaming
The library is broken for Android 5, but works for 4.4.x
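That said, for a first experiment the server side can be as simple as accepting the TCP connection and dumping the incoming 3GPP bytes to a file, to be fixed up afterwards as the quoted explanation above describes (a sketch; the port and file name are assumptions):

import java.io.FileOutputStream;
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class VideoReceiver {
    public static void main(String[] args) throws Exception {
        // Listen on the port the Android client connects to (placeholder value).
        try (ServerSocket server = new ServerSocket(8080);
             Socket client = server.accept();
             InputStream in = client.getInputStream();
             FileOutputStream out = new FileOutputStream("received.3gp")) {
            byte[] buffer = new byte[8192];
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n);
            }
        }
        // As quoted above, the file is not directly playable: the mdat atom's
        // size field is left empty when recording to a non-seekable descriptor,
        // so the stream must be fixed up (or parsed frame by frame) afterwards.
    }
}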
I am a recent beginner to the Android SDK, and the overall goal of this project is to create an app very similar to Ustream's or Qik's (yeah, I know, not the best idea for a beginner). I need to stream live audio and video to the web. There will be a video server, most likely Wowza, handling the encoding of the videos to the proper format.
From what I have found so far, I need to use Android's MediaRecorder with the camera as the source and direct the output to the server. That makes sense to me, but I do not know exactly how to go about doing it. Can anyone give me a push in the right direction? I have browsed through an example at http://ipcamera-for-android.googlecode.com/svn/trunk, but it appears to be far more complicated than necessary for what I need to do, and I have been unable to get it working in Eclipse to test it anyway.
Doing so is not simple but possible.
The MediaRecorder API assumes that the output is a random-access file, meaning it can seek back and forth while writing the mp4 (or other) file container.
As you can see in the ipcamera-for-android, the output file is directed to a socket which is not random access.
This fact makes it hard to parse the outgoing stream, since the MediaRecorder API will "write" some data like fps and sps/pps (for h264) only when the recording is done. The API will try to seek back to the beginning of the stream (where the file header lives), but it will fail since the stream is sent to a socket and not to a file.
The ipcamera-for-android source is a good reference. If I recall correctly, before streaming it records a video to a file, opens the header, and takes what it needs from there; then it starts recording to the socket and uses the data it took from the header to parse the stream.
You will also need some basic understanding of parsing mp4 (or whatever file container you want to use) in order to capture the frames.
You can do that either on the device or on the server side.
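To give a flavor of what that parsing involves: an MP4/3GPP file is a sequence of atoms, each starting with a 4-byte big-endian size followed by a 4-byte type tag. A hedged sketch that lists the top-level atoms of a recorded file (the class name is illustrative):

import java.io.DataInputStream;
import java.io.EOFException;
import java.io.FileInputStream;

public class AtomLister {
    public static void main(String[] args) throws Exception {
        try (DataInputStream in = new DataInputStream(new FileInputStream(args[0]))) {
            while (true) {
                long size = in.readInt() & 0xFFFFFFFFL; // unsigned 32-bit atom size
                byte[] type = new byte[4];
                in.readFully(type);                     // 4-character atom type, e.g. "mdat"
                System.out.println(new String(type, "US-ASCII") + " (" + size + " bytes)");
                if (size < 8) break; // 0 = "to end of file", 1 = 64-bit size; stop for simplicity
                in.skipBytes((int) (size - 8)); // skip the atom body to reach the next header
            }
        } catch (EOFException done) {
            // reached end of file
        }
    }
}

Running this on a normal recording versus a socket-captured stream makes the problem visible: in the latter, the mdat size field is empty, exactly as described above.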
Here is a good start for writing the stream to a socket:
Tutorial
I hope this was helpful. There is no good tutorial for parsing and decoding the outgoing streams, since it is not so simple... but again, it is possible with some effort.
Take a look also here to see how to direct the output stream to a stream that can be sent to the server:
MediaRecorder Question
SipDroid does exactly what you need.
It involves a hack to circumvent a limitation of the MediaRecorder class, which requires a file descriptor. It saves the MediaRecorder video stream to a local socket (used as a kind of pipe), then re-reads (in the same application but in another thread) from this socket on the other end, creates RTP packets out of the received data, and finally broadcasts the RTP packets to the network (you can use broadcast or unicast mode here, as you wish).
Basically it boils down to the following (simplified code):
// Create a MediaRecorder
MediaRecorder mr = new MediaRecorder();
// (Initialize mr as usual)
// Create a LocalServerSocket
LocalServerSocket lss = new LocalServerSocket("foobar");
// Connect both ends of this socket
LocalSocket sender = lss.accept();
LocalSocket receiver = new LocalSocket();
receiver.connect(new LocalSocketAddress("foobar"));
// Set the output of the MediaRecorder to the sender socket file descriptor
mr.setOutputFile(sender.getFileDescriptor());
// Start the video recording:
mr.start();
// Launch a background thread that will loop,
// reading from the receiver socket,
// and creating an RTP packet out of the read data.
RtpSocket rtpSocket = new RtpSocket();
InputStream in = receiver.getInputStream();
byte[] buffer = new byte[1500]; // read buffer (size is illustrative)
while (true) {
    in.read(buffer, ...);
    // Here some data manipulation on the received buffer ...
    RtpPacket rtp = new RtpPacket(buffer, ...);
    rtpSocket.send(rtp);
}
The implementation of RtpPacket and RtpSocket classes (rather simple), and the exact code which manipulate the video stream content can be found in the SipDroid project (especially VideoCamera.java).