I use my Android application to stream video from the phone camera to my PC server, where I need to save it to a file on the HDD. The file is created and the stream is saved successfully, but the resulting file cannot be played with any video player (GOM, KMP, Windows Media Player, VLC, etc.): no picture, no sound, only playback errors.
I tested my Android application on the phone itself and can say that in that case the captured video is stored successfully on the phone's SD card and, after transferring it to the PC, plays without errors, so my recording code is correct.
In the end, I realized that the problem is in the video container: the data is streamed from the phone in MP4 format and stored in *.mp4 files on the PC, and a file produced this way may be invalid for playback with video players. Can anyone suggest how to correctly save streaming video to a file?
Here is my code that processes and stores the stream data (error handling omitted for simplicity):
// getOutputMediaFile() returns a new File object
DataInputStream in = new DataInputStream(server.getInputStream());
FileOutputStream videoFile = new FileOutputStream(getOutputMediaFile());
int len;
byte[] buffer = new byte[8192];
while ((len = in.read(buffer)) != -1) {
    videoFile.write(buffer, 0, len);
}
videoFile.close();
server.close();
I would also appreciate it if someone could point out the possible pitfalls in saving media streams.
Thank you; I hope you can help!
Alex.
UPD:
To record video locally to phone storage I use:
// targetFile is a File object representing a file on the phone's SD card
myMediaRecorder.setOutputFile(targetFile);
And to stream it to the PC (error handling omitted for simplicity):
ParcelFileDescriptor pfd = null;
Socket socket = null;
String hostname = "my IP";
int port = 8081;
socket = new Socket(InetAddress.getByName(hostname), port);
pfd = ParcelFileDescriptor.fromSocket(socket);
myMediaRecorder.setOutputFile(pfd.getFileDescriptor());
Posting this comment to flag the question as answered: the stream needs to be properly encoded and sent over RTMP; my simple socket solution is invalid (the MP4 muxer finalizes the file by seeking back to patch its header, which is impossible on a socket), and the question is not correct in this sense. Related: How to encode h.264 live stream to RTP packet with Java
I am quite new to Stack Overflow; if my question is inappropriately asked or confusing, please let me know. Thank you!
I am working on an audio streaming project in which the clients are allowed to upload their MP3 files to the server. The server will store them in a playlist and stream the songs back to all the clients.
Here is my code for the client to upload the mp3:
public static void sendPackets() {
    System.out.println("Sending test file...");
    try {
        while (active) {
            //The song that needs to be uploaded
            File file = new File("Sorrow.mp3");
            FileInputStream fis = new FileInputStream(file);
            byte[] byteStream = new byte[(int) file.length()];
            //Read the whole mp3 into the byte array
            fis.read(byteStream);
            fis.close();
            InetAddress destination = InetAddress.getByName("localhost");
            DatagramPacket sendingAirMail = new DatagramPacket(byteStream, byteStream.length, destination, 50010); // 50010 is the listening port
            serverSocket.send(sendingAirMail); // sending the entire byte stream via UDP
            // serverSocket is a DatagramSocket
            break;
        }
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
The problem lies in this line:
serverSocket.send(sendingAirMail);
as it gives me this error:
java.net.SocketException: The message is larger than the maximum supported by the underlying transport: Datagram send failed
at java.base/java.net.DualStackPlainDatagramSocketImpl.socketSend(Native Method)
at java.base/java.net.DualStackPlainDatagramSocketImpl.send(DualStackPlainDatagramSocketImpl.java:136)
at java.base/java.net.DatagramSocket.send(DatagramSocket.java:695)
at client.sendPackets(client.java:116)
at client$2.run(client.java:67)
After searching on Google, I learned that this is because UDP has a size limit on each packet delivered, so I'd like to know how to split the data into UDP packets properly in this case. I know TCP would be better here, but I think I need to learn how to split packets anyway, because I need to stream the byte arrays back from the server using UDP. Any help will be appreciated!
I can post my server and other client information if needed.
The thing is that you need to cut your file into several pieces and deliver them separately. With UDP, you also need a few extra things to make sure the file arrives complete and correct. Here are some suggestions:
First, since you are cutting the file, you need to put a sequence number (seq) in a header. You may also need some extra information, such as the total size of the file, a timestamp, and so on.
struct msg {
    int seq;
    int total_seq;
    int size;
    void *data;
};
Then, it is better to build a send buffer and a receive buffer; check each time whether the buffer is empty, and if not, send/receive it.
After receiving some pieces, you need to reassemble them using the seq flag. When some seq gets lost, you need retransmission, so you need a retransmission design here.
In short, you need at least the following things:
a user-defined header in front of the data
logic to cut the file into pieces and rebuild it
retransmission (Go-Back-N, FEC, or both)
Hope that can help you.
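As a rough Java illustration of the cutting step (the header layout mirrors the struct above; the chunk size is an assumption, and retransmission is left out):
import java.io.*;
import java.net.*;
import java.nio.ByteBuffer;
import java.nio.file.Files;

public class UdpFileSender {
    // Keep payloads comfortably below the usual MTU to avoid IP fragmentation.
    private static final int CHUNK_SIZE = 1400;
    private static final int HEADER_SIZE = 12; // seq + total_seq + size, 4 bytes each

    public static void send(File file, InetAddress dest, int port) throws IOException {
        byte[] data = Files.readAllBytes(file.toPath());
        int totalSeq = (data.length + CHUNK_SIZE - 1) / CHUNK_SIZE;
        try (DatagramSocket socket = new DatagramSocket()) {
            for (int seq = 0; seq < totalSeq; seq++) {
                int offset = seq * CHUNK_SIZE;
                int size = Math.min(CHUNK_SIZE, data.length - offset);
                // Header first (seq, total_seq, size), then the payload slice.
                ByteBuffer packet = ByteBuffer.allocate(HEADER_SIZE + size);
                packet.putInt(seq).putInt(totalSeq).putInt(size);
                packet.put(data, offset, size);
                socket.send(new DatagramPacket(packet.array(), packet.position(), dest, port));
            }
        }
    }
}
The receiver reverses this with ByteBuffer.wrap(packet.getData()): read seq, total_seq, and size, then copy each payload into its slot at seq * CHUNK_SIZE, retransmitting whatever never arrives.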
To overcome Chromecast's restriction on streaming from HTTPS servers with self-signed certificates (in my case the Subsonic music server), I'm using an instance of the NanoHTTPD server already running as part of my Android app. The idea is to stream from the Subsonic server (SSL) and connect that stream to a new stream (non-SSL) for the NanoHTTPD.Response back to Chromecast. I have the InputStream working from the Subsonic server (it plays through the MediaPlayer), but I don't know how to re-serve it unencrypted for the following call: new NanoHTTPD.Response(NanoHTTPD.Response.Status.OK, "audio/mpeg", newStream); So, in a nutshell, how do I convert an HTTPS-encrypted audio stream to a decrypted audio stream on the fly?
OK, I managed to get all of this working: a Subsonic server using HTTPS, accessible from Android and Chromecast using the server's self-signed certificate. If HTTPS is on, then both Android and Chromecast use a NanoHTTPD proxy server running on the Android client to stream to the Android MediaPlayer and the HTML5 audio element respectively. The serve() override of the NanoHTTPD server running on Android contains the following code:
int filesize = ...; // obtained from the Subsonic server for the track to be played
// establish the https connection using the self-signed certificate
// placed in the Android assets folder (code not shown here)
HttpURLConnection con = _getConnection(subsonic_url, "subsonic.cer");
// Establish the InputStream from the Subsonic server and
// the Piped Streams for re-serving the unencrypted data
// back to the requestor
final InputStream is = con.getInputStream();
PipedInputStream sink = new PipedInputStream();
final PipedOutputStream source = new PipedOutputStream(sink);
// On a separate thread, read from Subsonic and write to the pipe
Thread t = new Thread(new Runnable() {
    public void run() {
        try {
            byte[] b = new byte[1024];
            int len;
            while ((len = is.read(b, 0, 1024)) != -1) {
                source.write(b, 0, len);
            }
            source.flush();
            // Close the write side so the reader sees end-of-stream.
            source.close();
        } catch (IOException e) {
            // The client disconnected or the pipe broke; nothing to do.
        }
    }
});
t.start();
sleep(200); // just to let the PipedOutputStream start up
// Return the PipedInputStream to the requestor.
// Important to have the filesize argument
return new NanoHTTPD.Response(NanoHTTPD.Response.Status.OK,
"audio/mpeg", sink, filesize);
I found that streaming FLAC files with MP3 transcoding enabled gave me the FLAC filesize but, of course, the MP3 stream. This proved difficult to handle for the HTML5 audio element, so I reverted to adding &format=raw to the Subsonic API call for the stream. So, regardless of the user's configuration, over HTTPS I stream the raw format, and it all seems to be working well so far in testing.
I want to write a "simple" Java server-type application to stream videos to different clients. My first step would be an again "simple" Android app containing a VideoView and a MediaPlayer set up for video streaming (more information on Android SDK - MediaPlayer), though later I might add a desktop Java application too.
What I'm not sure about is how I would actually do the streaming on the server. I have already written a little HTTP server processing HTTP GET requests from a client over TCP.
There I write/"stream" the files back using this code:
FileInputStream fs = new FileInputStream(f);
final byte[] buffer = new byte[1024];
int count = 0;
//add the header information to the response
while ((count = fs.read(buffer)) >= 0) {
    os.write(buffer, 0, count);
}
os.flush();
fs.close();
os.close();
os being the OutputStream of the response I get through the TCP socket, and f being the requested file.
This seems to send the file almost completely at once, though, and not, as I want, stream it in chunks.
So my questions are:
Do I have to change my code to actually stream the video, or is it already correct this way?
If I wanted to use UDP instead of TCP, would I then just put the buffer byte arrays read from the FileInputStream directly into DatagramPackets, and would the MediaPlayer know what to do with them?
PS: I know there are several questions on here about streaming in Java, but none of them actually cover the server side; they mainly cover the (in this case Android) client side.
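For reference, one common way to get chunk-by-chunk behaviour over TCP is simply to pace the writes to the media's bitrate. A minimal sketch reusing fs and os from the code above (the bitrate constant is an illustrative assumption):
int bytesPerSecond = 500_000; // assumed; a real server would derive this from the file
byte[] buffer = new byte[8192];
int count;
long start = System.currentTimeMillis();
long sent = 0;
while ((count = fs.read(buffer)) >= 0) {
    os.write(buffer, 0, count);
    sent += count;
    // Sleep whenever we are ahead of the target rate.
    long targetMillis = sent * 1000 / bytesPerSecond;
    long aheadBy = targetMillis - (System.currentTimeMillis() - start);
    if (aheadBy > 0) {
        try { Thread.sleep(aheadBy); } catch (InterruptedException e) { break; }
    }
}
Note that an HTTP client like MediaPlayer will happily buffer a fast transfer anyway, so pacing mainly saves bandwidth when the client seeks or aborts.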
I want android.media.MediaRecorder to record audio not into a file, but into some variable, for example a char[] or byte[] or some other data buffer structure. I want to send it to a remote server via Wi-Fi. Can android.media.MediaRecorder provide this functionality?
What you can do here is utilize the ParcelFileDescriptor class.
//make a pipe containing a read and a write parcelfd
ParcelFileDescriptor[] fdPair = ParcelFileDescriptor.createPipe();
//get a handle to your read and write fd objects.
ParcelFileDescriptor readFD = fdPair[0];
ParcelFileDescriptor writeFD = fdPair[1];
//next set your mediaRecorder instance to output to the write side of this pipe.
mediaRecorder.setOutputFile(writeFD.getFileDescriptor());
//next create an input stream to read from the read side of the pipe.
FileInputStream reader = new FileInputStream(readFD.getFileDescriptor());
//now to fill up a buffer with data, we just do a simple read
byte[] buffer = new byte[4096];//or w/e buffer size you want
//fill up your buffer with data from the stream
reader.read(buffer);// may want to do this in a separate thread
And now you have a buffer full of audio data.
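Since read() blocks, the fill-up loop above usually lives on its own thread, as the comment hints. A minimal sketch reusing reader from the snippet (what you do with each chunk is up to you):
new Thread(new Runnable() {
    public void run() {
        byte[] chunk = new byte[4096];
        int len;
        try {
            // Drain the pipe continuously so the recorder never stalls.
            while ((len = reader.read(chunk)) != -1) {
                // hand the first 'len' bytes of 'chunk' to your network code here
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}).start();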
Alternatively, you may want to write data directly to a socket from the recorder. This can also be achieved with the ParcelFileDescriptor class.
//create a socket connection to another device
Socket socket = new Socket("123.123.123.123",65535);//or w/e socket address you are using
//wrap the socket with a parcel so you can get at its underlying File descriptor
ParcelFileDescriptor socketWrapper = ParcelFileDescriptor.fromSocket(socket);
//set your mediaRecorder instance to write to this file descriptor
mediaRecorder.setOutputFile(socketWrapper.getFileDescriptor());
Now, any time your media recorder has data to write, it will automatically be written over the socket.
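For completeness, the receiving end of that socket can be very simple; a sketch (the port and output filename are assumptions):
import java.io.*;
import java.net.*;

public class RecorderReceiver {
    public static void main(String[] args) throws IOException {
        try (ServerSocket server = new ServerSocket(65535);
             Socket client = server.accept();
             InputStream in = client.getInputStream();
             FileOutputStream out = new FileOutputStream("recording.raw")) {
            byte[] buffer = new byte[4096];
            int len;
            // Read until the recorder side closes the socket.
            while ((len = in.read(buffer)) != -1) {
                out.write(buffer, 0, len);
            }
        }
    }
}
One caveat, as the first question on this page discovered: MP4 output is finalized by seeking back in the file, which a socket cannot do, so choose a streamable output format or post-process the captured data before playing it.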
I am attempting to transfer files (MP3s about six megabytes in size) between two PCs using SPP over Bluetooth (in Java, with the BlueCove API). I can get the file transfer working fine in one direction (for instance, one file from the client to the server), but when I attempt to send any data in the opposite direction during the same session (i.e., send a file from the server to the client), the program freezes and will not advance.
For example, if I simply:
StreamConnection conn;
OutputStream outputStream;
outputStream = conn.openOutputStream();
....
outputStream.write(data); //Data here is an MP3 file converted to byte array
outputStream.flush();
The transfer works fine. But if I try:
StreamConnection conn;
OutputStream outputStream;
InputStream inputStream;
ByteArrayOutputStream out = new ByteArrayOutputStream();
outputStream = conn.openOutputStream();
inputStream = conn.openInputStream();
....
outputStream.write(data);
outputStream.flush();
int receiveData;
while ((receiveData = inputStream.read()) != -1) {
    out.write(receiveData);
}
Both the client and the server freeze, and will not advance. I can see that the file transfer is actually happening at some point, because if I kill the client, the server will still write the file to the hard drive, with no issues. I can try to respond with another file, or with just an integer, and it still will not work.
Anyone have any ideas what the problem is? I know OBEX is commonly used for file transfers over Bluetooth, but it seemed overkill for what I needed to do. Am I going to have to use OBEX for this functionality?
It could be as simple as both programs being stuck in blocking receive calls, each waiting for the other end to say something. Try adding a ton of log statements so you can see what "state" each program is in (i.e., a running commentary such as "trying to receive", "got xxx data", "trying to reply", etc.), or set up debugging, wait until it gets stuck, then stop one of them and single-step it.
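For example, a couple of print statements around each blocking call make the stall point obvious:
System.out.println("server: waiting for reply...");
int replyByte = inputStream.read(); // if the line above prints but the one below never does, you are stuck here
System.out.println("server: got " + replyByte);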
You can certainly use SPP to transfer files between your applications (assuming you are sending and receiving at both ends using your application). From the code snippet it is difficult to tell what is wrong with your program.
I am guessing that you will have to close the stream as an indication to the other side that you are done sending the data. Note that even though you write the whole file in one chunk, the SPP/Bluetooth protocol layers might fragment it and the other end could receive it in fragments, so you need some protocol to indicate transfer completion.
It is hard to say without looking at the client-side code, but my guess, if the two are running the same code (i.e., both writing first and then reading), is that the outputStream needs to be closed before the reading occurs (otherwise, both will be waiting for the other to close their side in order to get out of the read loop, since read() only returns -1 when the other side closes).
If the stream should not be closed, then the condition to stop reading cannot be waiting for -1 (so either change the protocol to transmit the file size first, or use some other mechanism).
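For example, a minimal length-prefixed exchange, assuming data holds the file bytes as in the question:
// Sender: announce the payload length, then send the payload itself.
DataOutputStream dataOut = new DataOutputStream(conn.openOutputStream());
dataOut.writeInt(data.length);
dataOut.write(data);
dataOut.flush();

// Receiver: read exactly the announced number of bytes, then reply freely.
DataInputStream dataIn = new DataInputStream(conn.openInputStream());
byte[] received = new byte[dataIn.readInt()];
dataIn.readFully(received);
This way both sides can keep their streams open for the reply instead of using end-of-stream as the delimiter.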
Why did you decide to use ByteArrayOutputStream? Try the following code:
try {
    byte[] buf = new byte[1024];
    int n;
    outputStream = conn.openOutputStream();
    inputStream = conn.openInputStream();
    try {
        while ((n = inputStream.read(buf, 0, 1024)) > -1) {
            outputStream.write(buf, 0, n);
        }
    } finally {
        outputStream.close();
        inputStream.close();
        log.debug("Closed streams!");
    }
} catch (Exception e) {
    log.error(e);
    e.printStackTrace();
}
And to get the recorded bytes back out of the ByteArrayOutputStream, you could do something like this:
byte[] currentMP3Bytes = out.toByteArray();
ByteArrayInputStream byteArrayInputStream = new ByteArrayInputStream(currentMP3Bytes);