Video streaming in Android by ParcelFileDescriptor - Java

I have succeeded in recording video through MediaRecorder to the SD card,
but I want to send this video to a server without writing it to the SD card.
From my research it appears that ParcelFileDescriptor is the way to send
video to a TCP socket,
but I don't know how to receive it on the server side. Please explain.
Here is my client side code:
// Open a TCP connection to the server and wrap it in a ParcelFileDescriptor
socket = new Socket("hostname", portnumber);
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);

// Record from the camera straight into the socket's file descriptor
recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setOutputFile(pfd.getFileDescriptor());
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);

mPreview = new Preview(VideoRecorder.this, recorder);
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
setContentView(mPreview);
I want to receive it on the server side and play it, to create a real-time video transfer, knowing that:
"The MediaRecorder records either in 3GPP or in MP4 format. This file format consists of atoms, where each atom starts with its size. There are different kinds of atoms in a file, mdat atoms store the actual raw frames of the encoded video and audio. In the Cupcake version Android starts writing out an mdat atom with the encoded frames, but it has to leave the size of the atom empty for obvious reasons. When writing to a seekable file descriptor, it can simply fill in the blanks after the recording, but of course socket file descriptors are not seekable. So the received stream will have to be fixed up after the recording is finished, or the raw video / audio frames have to be processed by the server".
I want server side code (the server may be an Android handset or a PC).
If there is another way, please help me.
Thanks

In order to stream from Android or a PC you need a server and an implementation of the protocol over which the stream is carried. There are several such protocols, like HLS, RTSP, etc. (more: http://en.wikipedia.org/wiki/Streaming_media). It is not a trivial problem, and there are only a few successful streaming services for Android.
You can check how to implement a streaming service on Android here: https://github.com/fyhertz/libstreaming
The library is broken on Android 5, but works on 4.4.x.
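If all you need is to capture what the client code in the question sends, the receiving side can be a plain TCP server that dumps the bytes to disk. Here is a minimal sketch (the port and output file name are placeholders); note that, as the quote in the question explains, the resulting 3GP file is not directly playable and has to be fixed up afterwards, since MediaRecorder cannot seek back on a socket to fill in the atom sizes:
import java.io.FileOutputStream;
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class VideoReceiver {
    public static void main(String[] args) throws Exception {
        // Listen on the port the Android client connects to (placeholder)
        try (ServerSocket server = new ServerSocket(8080);
             Socket client = server.accept();
             InputStream in = client.getInputStream();
             FileOutputStream out = new FileOutputStream("received.3gp")) {
            byte[] buffer = new byte[8192];
            int read;
            // Copy the raw MediaRecorder output to disk until the client disconnects
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
        // "received.3gp" is NOT directly playable: the mdat atom size and the
        // moov atom are missing and must be repaired before playback.
    }
}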

Related

DJI Mavic Pro: Receiving corrupted video when using remote controller

I am currently developing an Android application using your SDK. This application is supposed to connect to a Mavic Pro drone in order to receive its video stream and other relevant data.
This application works well when the phone is connected to the drone via WiFi; however, the video stream is corrupted when the phone is connected via the remote controller.
To receive the video frames, I use the following code:
// Callback fired when receiving a new frame of 'size' bytes
VideoFeeder.getInstance().getPrimaryVideoFeed().setCallback((bytes, size) -> {
    if (codecManager != null) {
        // Shows the video in a "SurfaceTexture" on the phone
        codecManager.sendDataToDecoder(bytes, size, UsbAccessoryService.VideoStreamSource.Camera.getIndex());
        DroneVideoFrame videoFrame = new DroneVideoFrame(bytes, size, getVideoWidth(), getVideoHeight());
    }
});
As mentioned above, this snippet works perfectly when connecting to the drone via WiFi. To be exact, each frame contains ~2000 bytes of data, and the video is 1280x720 at 24 fps. The resulting video quality is perfect.
However, when using the remote controller, the data I get is completely different. While the "size" variable tells me that the received frame weighs ~2000 bytes, the frame itself (contained in the variable named "bytes") weighs more than 30 kilobytes. Moreover, this 30 kB frame seems to be corrupted, as it mostly contains what I recognize as buffer padding (a long sequence of 0's).
Also, the functions "getVideoWidth" and "getVideoHeight" return "9px" and "16px" respectively, which is obviously wrong. Those functions return correct values when using the drone's WiFi.
What I have tried:
Update the firmware
Update the DJI Go 4 application
Truncate the buffer bytes (sequences of '0') seen in the frame. This results in a video full of artifacts, as seen in the following image
System information:
Drone: DJI Mavic Pro, firmware up to date as of 09 July 2018
Phone: Panasonic FZ-N1 "Toughpad" - Android version: 6.0.1
Would you have any idea what causes this corruption?
You have to rule out the possible problems one by one:
(1) Can you try downgrading the firmware by one version? It could be a firmware issue. DJI is known to have this sort of problem, and the latest firmware is not necessarily the safest. For consumer product reviews you can refer here: https://forum.dji.com/thread-120739-1-1.html.
If you are a DJI partner, you can call them to confirm the firmware. Just today we had a firmware issue with the M200 and the PSDK; we messaged them and they quickly replied. We had to change the firmware to enable gimbal power control for the DJI PSDK.
(2) Change the RF channel, say from 15 to 18, to make sure the problem is not caused by frequency hopping or intentional jamming from other people.
(3) If changing the firmware version and the RF channel doesn't help, try to borrow a second set and run the same code, to rule out a hardware issue, e.g. a broken RF link.
(4) If you borrowed a drone and they all have the same problem, then there could be a bug somewhere in your code.
That's all I can think of so far. I'll add more if I remember something else.

This code works only for video, not for streaming

My problem is playing videos streamed over the net. If I know the location of a video file and I want to play it, I just do:
String Url = "https://www.w3schools.com/html/mov_bbb.mp4";
VideoView videoView;

protected void onCreate(Bundle savedInstanceState) {
    ...
    videoView = (VideoView) findViewById(R.id.loadVideo);
    videoView.setVideoURI(Uri.parse(this.Url));
    videoView.start();
}
But if the video is a stream, for example sent from my computer, it is not played and the code ends up in the catch block. I know the address (example: 127.0.0.1:8080; yes, I know this is a random localhost example and can't work as-is, but I can't share the video's real address, sorry) and I can play it from Windows via VLC without problems, so the same syntax cannot be used for video streamed via an http address.
I looked everywhere on the internet, but I did not find any working solution, and many of the plugins I found are now obsolete and/or unusable with Android Studio. I'm not a professional Android programmer, but I've been asked to do this. Do you know how I can do it?
I forgot: the intention is to create a live stream
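As a first diagnostic step (a minimal sketch; the address is the placeholder from the question), you can attach an error listener to the VideoView before starting playback, so you can see which error MediaPlayer actually reports. Android's built-in MediaPlayer only supports certain streaming protocols (for example HLS, RTSP and progressive HTTP), so a stream that VLC plays fine is not necessarily playable here:
videoView = (VideoView) findViewById(R.id.loadVideo);
// Log the error codes instead of letting playback fail silently
videoView.setOnErrorListener((MediaPlayer mp, int what, int extra) -> {
    // extra == MediaPlayer.MEDIA_ERROR_UNSUPPORTED usually means the
    // container/protocol is not supported by the framework player
    Log.e("Player", "what=" + what + " extra=" + extra);
    return true; // suppress the generic "Can't play this video" dialog
});
videoView.setVideoURI(Uri.parse("http://127.0.0.1:8080")); // placeholder address
videoView.start();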

Kurento group call example: Can we record each individual user's MediaPipeline separately?

In the Kurento group call example, is it possible to record each individual user's MediaPipeline separately?
Yes, you can record each user's WebRTC media element separately. A couple of suggestions:
Make sure you record in WEBM format, so you avoid transcoding
Start recording once the media is flowing. You can do this by listening to the MediaStateChanged event, or checking the status of media in the WebRTC element for that participant.
Consider recording your files in external storage, such as S3, to prevent running out of space
You'll have to connect the recorder to the outgoingMedia element, located in the UserSession. You can add the recorder initialisation in the constructor, and the listener for the MediaStateChangedEvent (similar to the IceCandidateListener), so you start recording once media starts to flow between the client and the media server.
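A minimal sketch of that wiring, assuming the pipeline, name and outgoingMedia fields of the group call example's UserSession (the recording URI is a placeholder):
// Inside UserSession's constructor, after outgoingMedia is created
RecorderEndpoint recorder = new RecorderEndpoint.Builder(pipeline, "file:///tmp/" + name + ".webm")
        .withMediaProfile(MediaProfileSpecType.WEBM) // WEBM avoids transcoding
        .build();
outgoingMedia.connect(recorder);

// Start recording only once media is actually flowing
outgoingMedia.addMediaStateChangedListener(event -> {
    if (event.getNewState() == MediaState.CONNECTED) {
        recorder.record();
    }
});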

OpenCV IP Camera RTSP stream

I'm trying to access an RTSP video stream from an IP camera using OpenCV and Java. I can access the stream using VLC player with the following format: rtsp://192.168.1.10:554/rtsp_live0, but when I try to use OpenCV the video stream always seems to be closed.
The code I'm using... (simplified)
VideoCapture capture = new VideoCapture();
capture.open("rtsp://192.168.1.10:554/rtsp_live0");
while (!capture.isOpened()) {
    System.out.print("Not opened :( \r");
}
I have a Mustcam H806P and found the stream URI from this website: http://www.ispyconnect.com/man.aspx?n=ipcamera
What am I doing wrong?
I'm reposting Alexander Smorkalov's answer from answers.opencv.org:
OpenCV uses ffmpeg library for video I/O. Try to get video stream with console ffmpeg tool. The address must be the same.
See also here OpenCV - how to capture rtsp video stream
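A sketch along those lines in Java (assuming OpenCV 3.4+ bindings built with FFmpeg support; the URI is the one from the question): request the FFmpeg backend explicitly, and retry the open instead of spinning on isOpened(), which never re-attempts the connection:
// Ask OpenCV for the FFmpeg backend explicitly
VideoCapture capture = new VideoCapture("rtsp://192.168.1.10:554/rtsp_live0", Videoio.CAP_FFMPEG);
Mat frame = new Mat();

// Retry the open a few times; the original loop never re-attempts it
for (int attempt = 0; attempt < 5 && !capture.isOpened(); attempt++) {
    try { Thread.sleep(1000); } catch (InterruptedException ignored) {}
    capture.open("rtsp://192.168.1.10:554/rtsp_live0");
}

if (capture.isOpened() && capture.read(frame)) {
    System.out.println("Got a " + frame.width() + "x" + frame.height() + " frame");
}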

Streaming audio and video from Android to PC/web.

I am a recent beginner to the Android SDK, and the overall goal of this project is to create an app very similar to Ustream's or Qik's (yeah, I know, not the best idea for a beginner). I need to stream live audio and video to the web. There will be a video server, most likely using Wowza, handling the encoding of the videos to the proper format.
From what I have found so far, I need to use Android's MediaRecorder with the camera as the source and direct the output to the server. That makes sense to me, but I do not know exactly how to go about doing it. Can anyone give me a push in the right direction? I have browsed through an example at "http://ipcamera-for-android.googlecode.com/svn/trunk", but it appears to be far more complicated than necessary for what I need to do, and I have been unable to get it working in Eclipse to test it anyway.
Doing so is not simple, but it is possible.
The MediaRecorder API assumes that the output is a random access file, meaning it can seek back and forth while writing the mp4 (or other) file container.
As you can see in ipcamera-for-android, the output file is directed to a socket, which is not random access.
This fact makes it hard to parse the outgoing stream, since the MediaRecorder API will "write" some data like fps, sps/pps (on h264) and so on only when the recording is done. The API will try to seek back to the beginning of the stream (where the file header exists), but it will fail since the stream is sent to a socket and not to a file.
ipcamera-for-android is a good reference here: if I recall correctly, before streaming it records a video to a file, opens the header and takes what it needs from there; then it starts recording to the socket and uses the data it took from the header in order to parse the stream.
You will also need some basic understanding of parsing mp4 (or whatever file container you'd want to use) in order to capture the frames.
You can do that either on the device or on the server side; a sketch of the box structure follows.
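To make "parsing mp4" concrete: the container is a sequence of boxes (atoms), each starting with a 4-byte big-endian size followed by a 4-byte type. A minimal sketch that walks the top-level boxes of a finished (seekable) file, using plain Java I/O (the file name is a placeholder):
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.FileInputStream;

public class BoxWalker {
    public static void main(String[] args) throws Exception {
        // Walk the top-level boxes of an mp4/3gp file and print size and type
        try (DataInputStream in = new DataInputStream(new FileInputStream("video.3gp"))) {
            while (true) {
                long size = in.readInt() & 0xFFFFFFFFL; // 4-byte big-endian box size
                byte[] type = new byte[4];
                in.readFully(type);                     // 4-char type: "ftyp", "moov", "mdat", ...
                System.out.println(new String(type, "US-ASCII") + " : " + size + " bytes");
                // Sketch only: size == 1 (64-bit size follows) and size == 0
                // ("box extends to end of file") are not handled here.
                in.skipBytes((int) (size - 8));
            }
        } catch (EOFException eof) {
            // Reached end of file
        }
    }
}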
Here is a good start for writing the stream to a socket:
Tutorial
I hope it was helpful, there is no good tutorial for parsing and decoding the outgoing streams since it is not so simple...but again, it is possible with some effort.
Take a look also here to see how to direct the output stream to a stream that can be sent to the server:
MediaRecorder Question
SipDroid does exactly what you need.
It involves a hack to circumvent the limitation of the MediaRecorder class, which requires a file descriptor. It saves the MediaRecorder video stream to a local socket (used as a kind of pipe), then re-reads (in the same application but on another thread) from the other end of this socket, creates RTP packets out of the received data, and finally sends the RTP packets to the network (you can use broadcast or unicast mode here, as you wish).
Basically it boils down to the following (simplified code):
// Create a MediaRecorder
MediaRecorder mr = new MediaRecorder();
// (Initialize mr as usual)
// Create a LocalServerSocket
LocalServerSocket lss = new LocalServerSocket("foobar");
// Connect both end of this socket
LocalSocket sender = lss.accept();
LocalSocket receiver = new LocalSocket();
receiver.connect(new LocalSocketAddress("foobar"));
// Set the output of the MediaRecorder to the sender socket file descriptor
mr.setOutputFile(sender.getFileDescriptor());
// Start the video recording:
mr.start();
// Launch a background thread that will loop,
// reading from the receiver socket,
// and creating an RTP packet out of the read data.
RtpSocket rtpSocket = new RtpSocket();
InputStream in = receiver.getInputStream();
byte[] buffer = new byte[1024];
while (true) {
    in.read(buffer, ...);
    // Here some data manipulation on the received buffer ...
    RtpPacket rtp = new RtpPacket(buffer, ...);
    rtpSocket.send(rtp);
}
The implementation of the RtpPacket and RtpSocket classes (rather simple), and the exact code which manipulates the video stream content, can be found in the SipDroid project (especially in VideoCamera.java).
