Over a WebSocket I'm receiving chunks of a video file that I want ExoPlayer to play immediately. I'm trying to implement video streaming the way it works on a web page with MediaSource and appendBuffer, where each new chunk updates the player, and I want something similar on Android.
Here is where I receive the chunks:
@Override
public void onMessage(WebSocket webSocket, ByteString bytes) {
    // `bytes` is the next chunk of the video file (bytes.hex() shows its contents)
    // I want to update ExoPlayer here, or start it if this is the first chunk
}
I've been looking for a way to do this but haven't found a solution, so I'd really appreciate some help.
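One possible approach, as a minimal sketch: implement a custom ExoPlayer DataSource that blocks on a queue of chunks, and enqueue each WebSocket message into it. This assumes ExoPlayer 2.x; WebSocketDataSource and enqueue are illustrative names, not ExoPlayer API, and just as with MediaSource on the web, the container must be streamable (e.g. fragmented MP4).

import android.net.Uri;
import androidx.annotation.Nullable;
import com.google.android.exoplayer2.C;
import com.google.android.exoplayer2.upstream.DataSource;
import com.google.android.exoplayer2.upstream.DataSpec;
import com.google.android.exoplayer2.upstream.TransferListener;
import java.io.IOException;
import java.util.concurrent.LinkedBlockingQueue;

// Sketch: an ExoPlayer DataSource fed by WebSocket chunks.
public class WebSocketDataSource implements DataSource {

    private final LinkedBlockingQueue<byte[]> chunks = new LinkedBlockingQueue<>();
    private byte[] current;   // chunk currently being drained
    private int position;     // read offset within `current`
    private Uri uri;

    // Call this from onMessage() with each received chunk.
    public void enqueue(byte[] chunk) {
        chunks.offer(chunk);
    }

    @Override
    public void addTransferListener(TransferListener transferListener) {
        // No transfer reporting in this sketch.
    }

    @Override
    public long open(DataSpec dataSpec) {
        uri = dataSpec.uri;
        return C.LENGTH_UNSET; // total length is unknown for a live stream
    }

    @Override
    public int read(byte[] buffer, int offset, int readLength) throws IOException {
        try {
            if (current == null || position == current.length) {
                current = chunks.take(); // block until the next chunk arrives
                position = 0;
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new IOException(e);
        }
        int toCopy = Math.min(readLength, current.length - position);
        System.arraycopy(current, position, buffer, offset, toCopy);
        position += toCopy;
        return toCopy;
    }

    @Nullable
    @Override
    public Uri getUri() {
        return uri;
    }

    @Override
    public void close() {
        // Nothing to release in this sketch.
    }
}

In onMessage you would then call dataSource.enqueue(bytes.toByteArray()), and wire the source into the player with a DataSource.Factory returning this instance, e.g. new ProgressiveMediaSource.Factory(() -> dataSource).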
Related
I want to know how to receive about 100 bytes of data on my Android device. I send them from my desktop program:
// The desktop program uses the jSerialComm library
public void sendVoucherInformationToPhone(String data) {
    this.targetDevice.openPort();
    this.targetDevice.writeBytes(data.getBytes(), data.getBytes().length);
    this.targetDevice.closePort();
}
I can detect the device itself, but as I said, I have no idea how to receive the bytes on the other side, so is there any help you can offer me?
(I did read the docs on the official Android website but didn't understand them, so any example code or even pseudocode would be very helpful.)
Thank you
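For reference, here is a minimal sketch of the receiving side, assuming the phone is attached in USB host mode and the app uses the open-source usb-serial-for-android library (https://github.com/mik3y/usb-serial-for-android); the port parameters, buffer size, and timeout below are illustrative, and USB permission handling is omitted:

// Find the first attached serial device and open its first port
UsbManager manager = (UsbManager) getSystemService(Context.USB_SERVICE);
List<UsbSerialDriver> drivers = UsbSerialProber.getDefaultProber().findAllDrivers(manager);
if (drivers.isEmpty()) return; // no supported serial device attached

UsbSerialDriver driver = drivers.get(0);
// openDevice() returns null until the user grants USB permission
UsbDeviceConnection connection = manager.openDevice(driver.getDevice());

UsbSerialPort port = driver.getPorts().get(0);
port.open(connection);
port.setParameters(9600, 8, UsbSerialPort.STOPBITS_1, UsbSerialPort.PARITY_NONE);

byte[] buffer = new byte[128];     // you expect roughly 100 bytes per message
int len = port.read(buffer, 2000); // blocks for up to 2000 ms
String received = new String(buffer, 0, len, StandardCharsets.UTF_8);
port.close();

The same pattern applies to any transport that hands you a stream of bytes: read into a buffer until you have the number of bytes you expect, then decode them.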
My problem is playing videos streamed over the network. If I know the location of a video file and want to play it, I just do:
String Url = "https://www.w3schools.com/html/mov_bbb.mp4";
VideoView videoView;

protected void onCreate(Bundle savedInstanceState) {
    ...
    videoView = (VideoView) findViewById(R.id.loadVideo);
    videoView.setVideoURI(Uri.parse(this.Url));
    videoView.start();
}
But if the video is a stream, for example one sent from my computer, the video does not play and the code ends up in the catch block. I know the address (example: 127.0.0.1:8080; yes, it's a made-up example, I know localhost can't work here, but I can't share the real address of the video, sorry) and I can play it fine from Windows via VLC, so the same syntax apparently cannot be used for video streamed over an HTTP address.
I looked everywhere on the internet, but I did not find a working solution, and many of the plugins I saw are now obsolete and/or unusable with Android Studio. I'm not a professional Android programmer, but I've been asked to do this. Do you know how I can do it?
I forgot to mention: the intention is to create a live stream.
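A sketch of one common setup, with stated assumptions: have the PC publish the capture as an HLS stream (VLC can do this) and play it with ExoPlayer rather than VideoView, since ExoPlayer handles live HLS. The URL below is purely illustrative, and the code assumes ExoPlayer 2.12+ with the full exoplayer artifact (which includes HLS support):

SimpleExoPlayer player = new SimpleExoPlayer.Builder(this).build();
playerView.setPlayer(player); // a PlayerView declared in the layout

// Hypothetical address: wherever your PC serves the HLS playlist
MediaItem item = MediaItem.fromUri("http://192.168.1.10:8080/live/stream.m3u8");
player.setMediaItem(item);
player.prepare();
player.play();

ExoPlayer infers the stream type from the .m3u8 extension here; for a plain progressive HTTP URL the same player code applies with a different URI.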
I had a problem with my VideoView.
When I try to play a video from a specific URL on my API 27 emulator, Android shows me a message dialog:
Can't play this video
This is what I get in Logcat:
source returned error -1010, 0 retries left
initFromDataSource, source has no track!
Failed to init from data source!
MediaPlayerNative: error (1, -2147483648)
MediaPlayer: Error (1,-2147483648)
Here's the code where I use my VideoView:
mVideoView = findViewById(R.id.videoView);
mMediaController = new MediaController(this);
mVideoView.setVideoPath("https://clips.vorwaerts-gmbh.de/VfE_html5.mp4");
mVideoView.requestFocus();
initListeners();
The initListeners method:
mVideoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mVideoView.setMediaController(mMediaController);
        mVideoView.setBackground(null);
        mMediaController.setAnchorView(mVideoView);
        mMediaController.show();
        mVideoView.start();
    }
});
mVideoView.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
    @Override
    public void onCompletion(MediaPlayer mp) {
        mVideoView.setBackground(getDrawable(R.drawable.webinar_photo_preview));
    }
});
I tested my VideoView feature on API 23 and API 24 and everything worked fine.
Can somebody tell me what I'm doing wrong?
EDIT: I've now found that the error also appears on API 24.
The problem was in VideoView: setVideoPath only worked for videos with a small file size (1–2 MB), and for anything bigger MediaPlayer failed with MEDIA_ERROR_SYSTEM (-2147483648), a low-level system error, as described in the documentation. That's why I started using ExoPlayer.
It looks like your code is working fine; maybe the problem is that you are not using the cookies that are sent to the browser at the time of the request.
In simple terms, this video is not meant for direct access from code.
If you still want to try, follow the steps below.
Make an HTTP request to the link (https://clips.vorwaerts-gmbh.de/VfE_html5.mp4) and store the received cookies.
Use the received cookies with your next request when you want to play the video.
Note: if you want to use cookies to play the video, it can be done via ExoPlayer (https://github.com/google/ExoPlayer), for example as sketched below.
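A hedged sketch of attaching a stored cookie to ExoPlayer's HTTP requests, assuming ExoPlayer 2.13+ APIs; storedCookie is a hypothetical variable holding whatever Set-Cookie value your first request returned:

// Send the stored cookie with every HTTP request the player makes
Map<String, String> headers = new HashMap<>();
headers.put("Cookie", storedCookie); // hypothetical: captured from the first request

DefaultHttpDataSource.Factory httpFactory = new DefaultHttpDataSource.Factory()
        .setDefaultRequestProperties(headers);

MediaSource source = new ProgressiveMediaSource.Factory(httpFactory)
        .createMediaSource(MediaItem.fromUri("https://clips.vorwaerts-gmbh.de/VfE_html5.mp4"));

SimpleExoPlayer player = new SimpleExoPlayer.Builder(context).build();
player.setMediaSource(source);
player.prepare();
player.play();

Alternatively, installing a java.net.CookieManager via CookieHandler.setDefault(...) before playback lets the platform's HTTP stack handle cookies automatically.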
I have succeeded in recording video through MediaRecorder to the SD card, but I want to send the video to a server without writing to the SD card. I searched and found that ParcelFileDescriptor is the way to send the video to a TCP socket, but I don't know how to receive it on the server side; please explain that.
Here is my client-side code:
socket = new Socket("hostname", portnumber);
// Wrap the socket so MediaRecorder can write directly to it
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);
recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setOutputFile(pfd.getFileDescriptor());
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
mPreview = new Preview(VideoRecorder.this, recorder);
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
setContentView(mPreview);
I want to receive it on the server side and play it, to create a real-time video transfer, knowing that:
"The MediaRecorder records either in 3GPP or in MP4 format. This file format consists of atoms, where each atom starts with its size. There are different kinds of atoms in a file, mdat atoms store the actual raw frames of the encoded video and audio. In the Cupcake version Android starts writing out an mdat atom with the encoded frames, but it has to leave the size of the atom empty for obvious reasons. When writing to a seekable file descriptor, it can simply fill in the blanks after the recording, but of course socket file descriptors are not seekable. So the received stream will have to be fixed up after the recording is finished, or the raw video / audio frames have to be processed by the server".
I want server-side code (the server may be an Android handset or a PC).
If there is another way, please help me.
Thanks
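For the receiving end, a minimal sketch in plain Java: accept one TCP connection and dump the incoming bytes to a file (the port and file name are illustrative). As the quoted passage explains, the captured 3GPP/MP4 stream is not directly playable; the atom sizes have to be fixed up afterwards, or the raw frames parsed, so this only stores the bytes:

import java.io.FileOutputStream;
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class VideoReceiver {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(8080); // illustrative port
             Socket client = server.accept();              // wait for the phone
             InputStream in = client.getInputStream();
             FileOutputStream out = new FileOutputStream("received.3gp")) {
            byte[] buffer = new byte[8192];
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n); // dump everything the phone sends
            }
        }
    }
}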
In order to stream from Android or a PC you need to implement a protocol over which the stream is carried, plus a server. There are several such protocols, like HLS, RTSP, etc. (more at http://en.wikipedia.org/wiki/Streaming_media). It is not a trivial problem, and there are only a few successful streaming services from Android.
You can check how to implement a streaming service on Android here: https://github.com/fyhertz/libstreaming
The library is broken on Android 5, but works on 4.4.x.
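For orientation, a sketch of the libstreaming usage pattern, based on that project's README (the names follow the library, the destination address is illustrative, and the API may differ between versions):

// Configure a session that streams H.264 video and AAC audio over RTP
Session session = SessionBuilder.getInstance()
        .setContext(getApplicationContext())
        .setSurfaceView(mSurfaceView)              // camera preview surface
        .setAudioEncoder(SessionBuilder.AUDIO_AAC)
        .setVideoEncoder(SessionBuilder.VIDEO_H264)
        .build();

session.setDestination("192.168.1.10"); // illustrative receiver address
session.start();                        // begin streaming RTP to the destination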
I am a recent beginner to the Android SDK, and the overall goal of this project is to create an app very similar to Ustream's or Qik's (yeah, I know, not the best idea for a beginner). I need to stream live audio and video to the web. There will be a video server, most likely using Wowza, handling the encoding of the videos to the proper format.
From what I have found so far, I need to use Android's MediaRecorder with the camera as the source and direct the output to the server. That makes sense to me, but I do not know exactly how to go about doing it. Can anyone give me a push in the right direction? I have browsed through an example at "http://ipcamera-for-android.googlecode.com/svn/trunk", but it appears to be far more complicated than necessary for what I need, and I have been unable to get it working in Eclipse to test it anyway.
Doing so is not simple, but it is possible.
The MediaRecorder API assumes that the output is a random-access file, meaning it can seek back and forth while writing the MP4 (or other) file container.
As you can see in ipcamera-for-android, the output is directed to a socket, which is not random access.
This makes it hard to parse the outgoing stream, since the MediaRecorder API will "write" some data, like fps and SPS/PPS (for H.264), only when the recording is done. The API will try to seek back to the beginning of the stream (where the file header lives), but it will fail since the stream is sent to a socket and not to a file.
ipcamera-for-android is a good reference: if I recall correctly, before streaming it records a video to a file, opens the header and takes what it needs from there; then it starts recording to the socket and uses the data it took from the header to parse the stream.
You will also need a basic understanding of parsing MP4 (or whichever file container you want to use) in order to capture the frames.
You can do that either on the device or on the server side; a sketch of the basic idea follows.
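As a starting point, a minimal sketch of walking the top-level MP4/3GPP atoms ("boxes") in a stream: each atom starts with a 4-byte big-endian size followed by a 4-byte type. This only lists the atoms; extracting frames from mdat additionally requires the codec configuration stored in the moov atom, and a robust parser must also handle the extended-size case (size == 1) and partial skips, which are omitted here.

import java.io.DataInputStream;
import java.io.EOFException;
import java.io.InputStream;

public class AtomWalker {
    public static void walk(InputStream stream) throws Exception {
        DataInputStream in = new DataInputStream(stream);
        try {
            while (true) {
                long size = in.readInt() & 0xFFFFFFFFL; // 4-byte big-endian size
                byte[] type = new byte[4];
                in.readFully(type);                     // 4-byte type, e.g. "ftyp", "moov", "mdat"
                System.out.println(new String(type, "US-ASCII") + " (" + size + " bytes)");
                in.skipBytes((int) (size - 8));         // skip the body (size includes the 8-byte header)
            }
        } catch (EOFException endOfStream) {
            // Reached the end of the stream.
        }
    }
}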
Here is a good start for writing the stream to a socket:
Tutorial
I hope this was helpful. There is no good tutorial for parsing and decoding the outgoing streams, since it is not so simple... but again, it is possible with some effort.
Also take a look here to see how to direct the output to a stream that can be sent to the server:
MediaRecorder Question
SipDroid does exactly what you need.
It involves a hack to circumvent the limitation of the MediaRecorder class, which requires a file descriptor. It saves the MediaRecorder video stream to a local socket (used as a kind of pipe), then re-reads (in the same application but on another thread) from the other end of this socket, creates RTP packets out of the received data, and finally broadcasts the RTP packets to the network (you can use broadcast or unicast mode here, as you wish).
Basically it boils down to the following (simplified code):
// Create a MediaRecorder
MediaRecorder mr = new MediaRecorder();
// (Initialize mr as usual)
// Create a LocalServerSocket
LocalServerSocket lss = new LocalServerSocket("foobar");
// Connect both ends of this socket
// (connect before accept(), otherwise accept() blocks forever on one thread)
LocalSocket receiver = new LocalSocket();
receiver.connect(new LocalSocketAddress("foobar"));
LocalSocket sender = lss.accept();
// Set the output of the MediaRecorder to the sender socket's file descriptor
mr.setOutputFile(sender.getFileDescriptor());
// Start the video recording:
mr.start();
// Launch a background thread that will loop,
// reading from the receiver socket,
// and creating an RTP packet out of the read data.
RtpSocket rtpSocket = new RtpSocket();
InputStream in = receiver.getInputStream();
byte[] buffer = new byte[1024]; // buffer size is illustrative
while (true) {
    in.read(buffer, ...);
    // Here some data manipulation on the received buffer ...
    RtpPacket rtp = new RtpPacket(buffer, ...);
    rtpSocket.send(rtp);
}
The implementation of the RtpPacket and RtpSocket classes (rather simple) and the exact code that manipulates the video stream content can be found in the SipDroid project (especially VideoCamera.java).