Streaming audio and video from Android to PC/web. - java

I am a recent beginner to the Android SDK, and the overall goal of this project is to create an app very similar to Ustream's or Qik's (yeah, I know, not the best idea for a beginner). I need to stream live audio and video to the web. There will be a video server, most likely using Wowza, handling the encoding of the videos to the proper format.
From what I have found so far, I need to use Android's MediaRecorder with the camera as the source and direct the output to the server. That makes sense to me, but I do not know exactly how to go about doing it. Can anyone give me a push in the right direction? I have browsed through an example at "http://ipcamera-for-android.googlecode.com/svn/trunk", but it appears to be far more complicated than necessary for what I need to do, and I have been unable to get it working in Eclipse to test it anyway.

Doing so is not simple, but it is possible.
The MediaRecorder API assumes that the output is a random-access file, meaning it can seek back and forth while writing the mp4 (or other) file container.
As you can see in ipcamera-for-android, the output is directed to a socket, which is not random access.
This makes it hard to parse the outgoing stream, since the MediaRecorder API will "write" some data like fps and sps/pps (for H.264) only when the recording is done. The API will try to seek back to the beginning of the stream (where the file header lives), but it will fail since the stream is sent to a socket and not to a file.
ipcamera-for-android is a good reference here: if I recall correctly, before streaming it records a video to a file, opens the header and takes what it needs from there; then it starts recording to the socket and uses the data it took from the header in order to parse the stream.
You will also need some basic understanding in parsing mp4 (or other file container you'd want to use) in order to capture the frames.
You can do that either on the device or on the server side.
Here is a good start for writing the stream to a socket:
Tutorial
I hope this was helpful. There is no good tutorial for parsing and decoding the outgoing stream, since it is not so simple... but again, it is possible with some effort.
Take a look also here to see how to direct the output stream to a stream that can be sent to the server:
MediaRecorder Question

SipDroid does exactly what you need.
It involves a hack to circumvent the limitation of the MediaRecorder class, which requires a file descriptor. It saves the MediaRecorder video stream to a local socket (used as a kind of pipe), then re-reads (in the same application, but from another thread) from the other end of this socket, creates RTP packets out of the received data, and finally sends the RTP packets to the network (you can use broadcast or unicast mode here, as you wish).
Basically it boils down to the following (simplified code):
// Create a MediaRecorder
MediaRecorder mr = new MediaRecorder();
// (Initialize mr as usual)
// Create a LocalServerSocket
LocalServerSocket lss = new LocalServerSocket("foobar");
// Connect both ends of this socket
// (connect before accept, otherwise accept() blocks forever)
LocalSocket receiver = new LocalSocket();
receiver.connect(new LocalSocketAddress("foobar"));
LocalSocket sender = lss.accept();
// Set the output of the MediaRecorder to the sender socket's file descriptor
mr.setOutputFile(sender.getFileDescriptor());
// Start the video recording:
mr.start();
// Launch a background thread that will loop,
// reading from the receiver socket,
// and creating an RTP packet out of the read data.
RtpSocket rtpSocket = new RtpSocket();
InputStream in = receiver.getInputStream();
byte[] buffer = new byte[4096];
while (true) {
    int read = in.read(buffer);
    // Here some data manipulation on the received buffer ...
    RtpPacket rtp = new RtpPacket(buffer, read);
    rtpSocket.send(rtp);
}
The implementation of the RtpPacket and RtpSocket classes (rather simple), and the exact code that manipulates the video stream content, can be found in the SipDroid project (especially VideoCamera.java).
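The RtpPacket and RtpSocket names above come from SipDroid. As a rough, self-contained sketch of what building an RTP packet involves (a simplified version of the RFC 3550 fixed header, not SipDroid's actual implementation):

```java
import java.nio.ByteBuffer;

// Simplified sketch of an RTP packet builder (RFC 3550 fixed 12-byte header).
// Field layout follows the RTP spec; this is NOT SipDroid's code.
public class RtpPacketSketch {
    public static byte[] build(int payloadType, int seq, long timestamp,
                               long ssrc, byte[] payload, int payloadLen) {
        ByteBuffer buf = ByteBuffer.allocate(12 + payloadLen);
        buf.put((byte) 0x80);                 // V=2, P=0, X=0, CC=0
        buf.put((byte) (payloadType & 0x7F)); // M=0, payload type
        buf.putShort((short) seq);            // sequence number
        buf.putInt((int) timestamp);          // timestamp
        buf.putInt((int) ssrc);               // SSRC identifier
        buf.put(payload, 0, payloadLen);      // payload bytes
        return buf.array();
    }

    public static void main(String[] args) {
        byte[] payload = {1, 2, 3};
        byte[] pkt = build(96, 1, 1000L, 0x12345678L, payload, payload.length);
        System.out.println(pkt.length);    // 15 (12-byte header + 3-byte payload)
        System.out.println(pkt[0] & 0xFF); // 128 (version 2 in the top bits)
    }
}
```

The real SipDroid classes also handle sequence-number rollover and marker bits; this only shows the header layout that the background thread above fills in for each chunk of data.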

Related

How to continuously read an online/web file in java?

I am making a simple text-only instant messenger in Java. The way it currently works is that all the messages are put into an online text file by sending a request to a PHP file (I know this is bad for security, but this is just to learn how to write web-connected apps in Java).
Currently, I am continually fetching the entire contents of the messages.txt file and placing them into my JTextPane like this:
while (true) {
    URL url = new URL("{path to text file}");
    InputStream in = url.openStream();
    Scanner s = new Scanner(in).useDelimiter("\\A");
    String conversation = s.hasNext() ? s.next() : "";
    textPane.setText(conversation);
}
But when the conversation becomes long enough, it lags as it is fetching 100kb+ files constantly from a web server.
What I want is to read only the changes to the file, so that it doesn't lag and max out my internet connection by constantly requesting enormous plain text files. I also don't want to just make it run every 2 seconds; it's an instant messenger, so no delays.
How would I go about fetching only the changes to the file and adding them to the text pane?
You could roll the messages over into another file once the current one crosses a certain size threshold. This ensures that you operate on a smaller file.
Instead of contacting the URL directly, you could place a server-side script in front that reads the file for you.
The client could then provide that script with the last message identifier it has seen, and the server could respond with only the messages added after that id, plus the new last message id.
With this approach the server-side script doesn't even need to read the whole file itself; if the message id encodes the line number, it can skip straight to the required line in the log file.
Of course this approach is far from real-world scenarios, but it's fine for learning, IMO.
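The server-side idea can be sketched in plain Java (standing in for the PHP script), assuming a hypothetical message format of one `id:text` message per line with strictly increasing ids:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Collectors;

// Sketch: return only the messages added after the client's last-seen id.
// Assumes one message per line, formatted "id:text", with increasing ids.
public class MessageTail {
    public static List<String> messagesAfter(Path log, int lastSeenId) throws IOException {
        return Files.readAllLines(log).stream()
                .filter(line -> {
                    int id = Integer.parseInt(line.substring(0, line.indexOf(':')));
                    return id > lastSeenId;
                })
                .collect(Collectors.toList());
    }
}
```

For example, with a log containing `1:hi`, `2:hello`, `3:bye` and a last-seen id of 1, only the last two messages are returned, so the client downloads a few bytes instead of the whole conversation.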

AWS Java SDK - using ProgressListener with TransferManager

I am looking at usage example provided in AWS SDK docs for TransferManager, in particular for the following code:
TransferManager tx = new TransferManager(
        credentialProviderChain.getCredentials());
Upload myUpload = tx.upload(myBucket, myFile.getName(), myFile);

// Transfers also allow you to set a ProgressListener to receive
// asynchronous notifications about your transfer's progress.
myUpload.addProgressListener(myProgressListener);
and I am wondering whether we don't have a race condition here. AFAIU, TransferManager works asynchronously: it may start uploading the file straight away after the Upload object is created, even before we add the listener. So if we use the snippet as provided in the docs, it seems possible that we won't receive all notifications. I've looked briefly into addProgressListener and I don't see that past events are replayed when a new listener is attached. Am I wrong? Am I missing something?
If you need to get ALL events, this can be achieved by using a different upload method that takes a ProgressListener as a parameter. Of course, using this method requires encapsulating your bucket name, key, and file into an instance of PutObjectRequest.
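The underlying principle can be sketched without the AWS SDK: if events may start firing as soon as the asynchronous work begins, the listener must be registered before the work starts, because events fired with no listener attached are simply lost. A minimal stand-in (the AsyncTask class and its event names are invented for illustration):

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.CountDownLatch;
import java.util.function.Consumer;

// Sketch of why a listener must be registered before async work starts:
// events fired before addListener() is called go unheard.
public class AsyncTask {
    private final List<Consumer<String>> listeners = new CopyOnWriteArrayList<>();

    public void addListener(Consumer<String> l) { listeners.add(l); }

    // Fire events on a background thread, as TransferManager does.
    public Thread start(CountDownLatch done) {
        Thread t = new Thread(() -> {
            for (String event : new String[]{"STARTED", "PROGRESS", "COMPLETED"}) {
                listeners.forEach(l -> l.accept(event)); // lost if no listener yet
            }
            done.countDown();
        });
        t.start();
        return t;
    }
}
```

In the AWS case the same effect is achieved by passing the listener into the request before the transfer begins (e.g. via the PutObjectRequest), rather than attaching it to the already-running Upload.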

Java outputstream, trigger file download before all data retrieved from db

I'm trying to wrap my head around Java output/input streams, closing and flushing. I have a situation where I want to create a file using Apache POI with data from a server. I would like the file to start downloading as soon as I retrieve the first record from the DB (i.e. show at the bottom of the browser that the file has started to download).
public void createExcelFile(final HttpServletResponse response,
                            final Long vendorId) {
    try {
        // setup response types...
        final XSSFWorkbook xssfWorkbook = new XSSFWorkbook();
        final XSSFSheet sheet = xssfWorkbook.createSheet("sheets1");
        // create file with data
        writeExcelOutputData(sheet, xssfWorkbook);
        xssfWorkbook.write(response.getOutputStream());
        xssfWorkbook.close();
    }
    catch (final Exception e) {
        LOGGER.error("Boom");
    }
}
The above code performs the file download no problem, but this could be a big file. I spend around 20-30s fetching the data, and only after that does the download begin, which is no good...
Can I achieve what I need, or what's the best approach? Thanks in advance.
Cheers :)
Reasons could be the following:
Maybe there is a read/write timeout on your HTTP server; if the process is lengthy or bandwidth is low, the connection will be closed by the server.
Make sure the process (the Excel work) completes; there may be an error/exception during the work.
The solution of Jorge looks very promising: the user makes one request for the file, the server does the work in the background, and then the user either checks on the work's progress and downloads the file once it is ready, or the server informs the user by email, web notification, etc.
You could also keep the generated file on the server in a temp file for a while; if the connection is interrupted, the server can respond with the remaining part of the file (omitting the bytes already sent, like a normal resumed download).
Keeping a connection alive to do lengthy work is not very logical.
Again, if the file is ready really fast and the download still gets interrupted, it could be because of a read/write timeout on the server, or a very bad network.
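If you do want the download to begin before all rows are fetched, the response has to be written incrementally: send the headers first, then write and flush each chunk as it is produced. XLSX is awkward to stream this way (the workbook is written out at the end), but with a streamable format such as CSV the idea is simple. A sketch with a plain OutputStream standing in for response.getOutputStream():

```java
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import java.util.List;

// Sketch: stream rows to the client as they are produced, flushing after
// each one, so the browser starts the download immediately.
public class CsvStreamer {
    public static void writeRows(OutputStream out, List<String[]> rows) throws IOException {
        for (String[] row : rows) {
            String line = String.join(",", row) + "\n";
            out.write(line.getBytes(StandardCharsets.UTF_8));
            out.flush(); // push this chunk to the client now, don't buffer everything
        }
    }
}
```

In a real servlet you would also set Content-Type and Content-Disposition headers before the first write; once the first flushed chunk reaches the browser, the download indicator appears even though the DB query is still running.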

Kurento group call example : Can we record each individual user's mediapipeline separately?

In Kurento Group call example is it possible to record each individual user's mediapipeline separately?
Yes, you can record each user's WebRTC media element separately. A couple of suggestions:
Make sure you record in WEBM format, so you avoid transcoding
Start recording once the media is flowing. You can do this by listening to the MediaStateChanged event, or checking the status of media in the WebRTC element for that participant.
Consider recording your files in external storage, such as S3, to prevent running out of space
You'll have to connect the recorder to the outgoingMedia element, located in the UserSession. You can add the recorder initialisation in the constructor, and add the listener for the MediaStateChanged event similarly to the IceCandidateListener, so that recording starts once media flows between the client and the media server.

Video streaming in android by parcelFileDescriptor

I have succeeded in recording video through MediaRecorder to the SD card,
but I want to send the video to a server without writing to the SD card.
From my searching, ParcelFileDescriptor seems to be the way to send
video to a TCP socket, but I don't know how to receive it on the server side; please explain.
Here is my client-side code:
socket = new Socket("hostname", portnumber);
ParcelFileDescriptor pfd =ParcelFileDescriptor.fromSocket(socket);
recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setOutputFile(pfd.getFileDescriptor());
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
mPreview = new Preview(VideoRecorder.this,recorder);
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
setContentView(mPreview);
I want to receive it on the server side and play it, to create a real-time video transfer,
knowing that
"The MediaRecorder records either in 3GPP or in MP4 format. This file format consists of atoms, where each atom starts with its size. There are different kinds of atoms in a file, mdat atoms store the actual raw frames of the encoded video and audio. In the Cupcake version Android starts writing out an mdat atom with the encoded frames, but it has to leave the size of the atom empty for obvious reasons. When writing to a seekable file descriptor, it can simply fill in the blanks after the recording, but of course socket file descriptors are not seekable. So the received stream will have to be fixed up after the recording is finished, or the raw video / audio frames have to be processed by the server".
I want server-side code (the server may be an Android handset or a PC).
If there is another way, please help me.
Thanks
In order to stream from Android or a PC, you need to implement a protocol over which the stream is carried, and a server. There are several such protocols, like HLS, RTSP, etc. (more: http://en.wikipedia.org/wiki/Streaming_media). It is not a trivial problem, and there are only very few successful streaming services on Android.
You can check how to implement a streaming service on Android here: https://github.com/fyhertz/libstreaming
The library is broken for Android 5, but works for 4.4.x
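As a starting point for the server side the question asks about, here is a minimal sketch (plain Java, port number arbitrary) of a TCP server that accepts one connection and saves the incoming stream to a file. As the quote above explains, the saved 3GPP file still needs its mdat atom size fixed up afterwards before it is playable:

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;

// Sketch: receive a raw MediaRecorder stream over TCP and save it to a file.
// The saved file is NOT directly playable: the mdat atom size must be fixed
// up afterwards, or the raw frames parsed out of the stream.
public class StreamReceiver {
    public static long receive(int port, String outPath) throws IOException {
        long total = 0;
        try (ServerSocket server = new ServerSocket(port);
             Socket client = server.accept();
             InputStream in = client.getInputStream();
             FileOutputStream out = new FileOutputStream(outPath)) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
                total += read;
            }
        }
        return total;
    }
}
```

This only captures the bytes; for actual live playback you would need to packetize and serve them over a streaming protocol (RTSP, HLS, ...), which is what libstreaming implements on the client side.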
