Recording audio not to a file on Android - Java

I want android.media.MediaRecorder to record audio not into a file, but into some variable, for example a char[] or byte[] or some other data buffer structure, so that I can send it to a remote server over Wi-Fi. Can android.media.MediaRecorder provide this functionality?

What you can do here is utilize the ParcelFileDescriptor class.
// Make a pipe containing a read and a write ParcelFileDescriptor.
ParcelFileDescriptor[] fdPair = ParcelFileDescriptor.createPipe();
// Get a handle to the read and write fd objects.
ParcelFileDescriptor readFD = fdPair[0];
ParcelFileDescriptor writeFD = fdPair[1];
// Next, set your MediaRecorder instance to output to the write side of this pipe.
mediaRecorder.setOutputFile(writeFD.getFileDescriptor());
// Next, create an input stream to read from the read side of the pipe.
FileInputStream reader = new FileInputStream(readFD.getFileDescriptor());
// Now, to fill up a buffer with data, just do a simple read.
byte[] buffer = new byte[4096]; // or whatever buffer size you want
// Fill up the buffer with data from the stream.
reader.read(buffer); // you may want to do this in a separate thread
And now you have a buffer full of audio data.
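If you go the pipe route, a minimal sketch of doing that read on a background thread might look like the following; sendToServer is a hypothetical placeholder for however you push each chunk over Wi-Fi:
// Drain the recorder's output from the read side of the pipe on a background thread.
new Thread(new Runnable() {
    @Override
    public void run() {
        byte[] chunk = new byte[4096];
        int bytesRead;
        try {
            // read() blocks until the recorder writes more data; -1 means the write side was closed.
            while ((bytesRead = reader.read(chunk)) != -1) {
                sendToServer(chunk, bytesRead); // placeholder: send this chunk to your server
            }
        } catch (IOException e) {
            // handle or log the error as appropriate
        }
    }
}).start();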
Alternatively, you may want to write data directly to a socket from the recorder. This can also be achieved with the ParcelFileDescriptor class.
// Create a socket connection to another device.
Socket socket = new Socket("123.123.123.123", 65535); // or whatever socket address you are using
// Wrap the socket with a ParcelFileDescriptor so you can get at its underlying file descriptor.
ParcelFileDescriptor socketWrapper = ParcelFileDescriptor.fromSocket(socket);
// Set your MediaRecorder instance to write to this file descriptor.
mediaRecorder.setOutputFile(socketWrapper.getFileDescriptor());
Now, any time your MediaRecorder has data to write, it will write it over the socket automatically.
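For completeness, a minimal sketch of the full recorder setup around that call; the audio source, output format, and encoder below are just example choices, not requirements:
MediaRecorder mediaRecorder = new MediaRecorder();
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);         // record from the microphone
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP); // example container format
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);    // example audio encoder
mediaRecorder.setOutputFile(socketWrapper.getFileDescriptor());      // write into the socket's file descriptor
mediaRecorder.prepare();
mediaRecorder.start();
// ... record for as long as needed, then:
mediaRecorder.stop();
mediaRecorder.release();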

Related

Stream Video from a self-written Server to Android App

I want to write a "simple" Java Server-type application to stream videos to different clients. My first step would be an again "simple" Android App containing a VideoView and a MediaPlayer set to Video Streaming (More Information on Android SDK - MediaPlayer) though later i might add desktop java application too.
What i'm not sure is how i would actually do the streaming on the server. I already wrote a little Http Server processing HTTP GET requests from a client over TCP.
There i write/"stream" the files back using this coding:
FileInputStream fs = new FileInputStream(f);
final byte[] buffer = new byte[1024];
int count = 0;
// add the header information to the response
while ((count = fs.read(buffer)) >= 0) {
    os.write(buffer, 0, count);
}
os.flush();
fs.close();
os.close();
os being the OutputStream of the response I get through the TCP socket and f being the requested file.
This seems to send the file almost completely at once, though, and not, as I want, to stream it "in chunks".
So my questions are:
What do I have to change in my code to actually stream the video, or is it already correct this way?
If I wanted to use UDP instead of TCP, would I then just put the buffer byte arrays read from the FileInputStream directly into a DatagramPacket, and would the MediaPlayer know what to do with it?
PS: I know there are several questions on here about streaming in Java, but none of them actually cover the server side; they mainly cover the (in this case Android) client side.
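One way to illustrate the "in chunks" idea is to pace the same copy loop so each buffer is flushed and followed by a short pause; this is only a rough sketch with arbitrary values (64 KB chunks, a 500 ms delay), not a recommendation for a real server, which would pace based on the media bitrate:
FileInputStream fs = new FileInputStream(f);
byte[] buffer = new byte[64 * 1024];
int count;
while ((count = fs.read(buffer)) >= 0) {
    os.write(buffer, 0, count);
    os.flush(); // push this chunk to the client now
    try {
        Thread.sleep(500); // crude pacing between chunks
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
        break;
    }
}
fs.close();
os.close();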

Sending serialized objects and byte[] from the same stream

I have a client-server application; they communicate through an ObjectOutputStream and an ObjectInputStream. I send serialized objects from one to the other, but now I want to send files as well. If I put the byte[] of a file inside a serialized object it can be transmitted, but the object stays cached in the ObjectOutputStream and ObjectInputStream, and after a few sends, if the file is big enough, I get an out-of-memory exception. If I send it like:
File file = new File("C:\\a.txt");
FileInputStream fis = new FileInputStream(file);
BufferedInputStream bis = new BufferedInputStream(fis);
byte[] buffer = new byte[1024 * 1024 * 10];
int n = -1;
// oos is the existing ObjectOutputStream
while ((n = bis.read(buffer)) != -1) {
    oos.write(buffer, 0, n);
}
and read it:
while ((fromServer = ois.read()) != -1) {
    // consume the bytes read from the stream
}
it works well.
My question is: do I have to implement a system to know whether to call writeObject/readObject or just write/read? Do I have to get rid of the serialized communication, or do I have to create separate streams for reading and writing?
You have to define a protocol which doesn't have any ambiguity, and stick to this protocol on the client and server.
For example, if you send a stream of bytes, and then an object, you have no way, at the receiving side, to know when the stream of bytes ends, and when the object begins.
A protocol to solve this problem might be:
send an int (4 bytes) which is the size N of the file
send N bytes
send an object
The receiving side can then read the size (N), then read N bytes from the stream, then read the object.
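A minimal sketch of that protocol, assuming oos/ois are the question's existing ObjectOutputStream/ObjectInputStream, and that file and someObject are placeholders for the file and the serialized object being sent:
// Sender: length prefix, then the raw file bytes, then the object.
oos.writeInt((int) file.length()); // N (this sketch assumes the file fits in an int)
BufferedInputStream bis = new BufferedInputStream(new FileInputStream(file));
byte[] buffer = new byte[8192];
int n;
while ((n = bis.read(buffer)) != -1) {
    oos.write(buffer, 0, n); // the N raw bytes
}
bis.close();
oos.writeObject(someObject); // then the serialized object
oos.flush();
oos.reset(); // clears the stream's back-reference cache, which is what grows between sends

// Receiver: read exactly N bytes, then the object.
int size = ois.readInt();
byte[] fileBytes = new byte[size];
ois.readFully(fileBytes); // blocks until all N bytes have arrived
Object obj = ois.readObject();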

Save video stream from Socket to File

I use my Android application to stream video from the phone camera to my PC server, and I need to save it into a file on the HDD. The file is created and the stream is successfully saved, but the resulting file cannot be played with any video player (GOM, KMP, Windows Media Player, VLC, etc.) - no picture, no sound, only playback errors.
I tested my Android application on the phone and can say that in that case the captured video is successfully stored on the phone's SD card and, after transferring it to the PC, plays without errors, so my recording code is correct.
In the end, I realized that the problem is in the video container: the data is streamed from the phone in MP4 format and stored in *.mp4 files on the PC, and in this case the file may be invalid for playback with video players. Can anyone suggest how to correctly save streaming video to a file?
Here is my code that processes and stores the stream data (without error handling, to simplify):
// getOutputMediaFile() returns a new File object
DataInputStream in = new DataInputStream(server.getInputStream());
FileOutputStream videoFile = new FileOutputStream(getOutputMediaFile());
int len;
byte[] buffer = new byte[8192];
while ((len = in.read(buffer)) != -1) {
    videoFile.write(buffer, 0, len);
}
videoFile.close();
server.close();
Also, I would appreciate it if someone could talk about the possible "pitfalls" in saving media streams.
Thank you, I hope for your help!
Alex.
UPD:
To record video locally to phone storage I use:
//targetFile - new File object, represents a file on phone SD card
myMediaRecorder.setOutputFile(targetFile);
And for streaming it to the PC (without error handling, to simplify):
ParcelFileDescriptor pfd = null;
Socket socket = null;
String hostname = "my IP";
int port = 8081;
socket = new Socket(InetAddress.getByName(hostname), port);
pfd = ParcelFileDescriptor.fromSocket(socket);
myMediaRecorder.setOutputFile(pfd.getFileDescriptor());
Making this comment to flag the question as answered: the video needs to be properly encoded and sent over a real streaming protocol (RTMP/RTP); my simple socket solution is invalid, and the question is not correct in this sense. Related: How to encode h.264 live stream to RTP packet with Java

File transfer through a Socket in Java

I'm making a network file transfer system for transferring any kind of file over a network in Java. The file could also be of any size. Therefore I've used UTF-8 (writeUTF/readUTF) for the task.
I'm providing the code I've written, but the problem is that sometimes the file gets transferred as it is, with no problem at all, and sometimes a few KB of data are just skipped at the receiving end, which prevents the mp3/video/image file from being opened correctly. I think the problem is with buffering: I'm not creating any buffer, which, right now, I think might be of some use to me.
I would really appreciate it if anyone could provide some help with the problem, so that the file gets transferred fully.
Client side: File Sender
Socket clientsocket = new Socket(host, 6789); // host contains the IP address of the remote server
DataOutputStream outtoserver = new DataOutputStream(clientsocket.getOutputStream());
try
{
    int r = 0;
    FileInputStream fromFile1 = new FileInputStream(path); // "path" is the path of the file being sent
    while (r != -1)
    {
        r = fromFile1.read();
        outtoserver.writeUTF(r + "");
    }
}
catch (Exception e)
{
    System.out.println(e.toString());
}
clientsocket.close();
Server side: File Receiver
ServerSocket welcome = new ServerSocket(6789);
Socket conn = welcome.accept();
try
{
    String r1 = new String();
    int r = 0;
    FileOutputStream toFile1 = new FileOutputStream(path); // "path" is the path of the file being received
    BufferedOutputStream toFile = new BufferedOutputStream(toFile1);
    DataInputStream recv = new DataInputStream(conn.getInputStream());
    while (r != -1)
    {
        r1 = recv.readUTF();
        r = Integer.parseInt(r1);
        toFile.write(r);
    }
}
catch (Exception e)
{
    System.out.println(e.toString());
}
I don't understand why you are encoding binary data as text.
Plain sockets can send and receive streams of bytes without any problems. So, just read the file as bytes using a FileInputStream and write the bytes to the socket as-is.
(For the record, what you are doing is probably sending 3 to 5 bytes for each byte of the input file. And you are reading the input file one byte at a time without any buffering. These mistakes and others you have made are likely to have a significant impact on file transfer speed. The way to get performance is to simply read and write arrays of bytes using a buffer size of at least 1K bytes.)
I'm not sure of this, but I suspect that the reason that you are losing some data is that you are not flushing or closing outtoserver before you close the socket on the sending end.
FOLLOW UP
I also noticed that you are not flushing / closing toFile on the receiver end, and that could result in you losing data at the end of the file.
The first problem I see is that you're using DataInputStream and DataOutputStream. These are for reading/writing primitive Java types (int, long, etc.); you don't need them for plain binary data.
Another problem is that you're not flushing your file output stream - this could be causing the lost bytes.
An explicit flush might help the situation.
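Putting the two answers together, a minimal sketch of a raw byte transfer, assuming the same host and path variables as in the question:
// Client side: read the file as raw bytes and write them to the socket.
Socket clientsocket = new Socket(host, 6789);
OutputStream outtoserver = new BufferedOutputStream(clientsocket.getOutputStream());
FileInputStream fromFile = new FileInputStream(path);
byte[] sendBuffer = new byte[8192];
int n;
while ((n = fromFile.read(sendBuffer)) != -1) {
    outtoserver.write(sendBuffer, 0, n); // raw bytes, no text encoding
}
outtoserver.flush(); // push any buffered bytes before closing
fromFile.close();
clientsocket.close();

// Server side: read raw bytes from the socket and write them to the file.
ServerSocket welcome = new ServerSocket(6789);
Socket conn = welcome.accept();
InputStream recv = conn.getInputStream();
OutputStream toFile = new BufferedOutputStream(new FileOutputStream(path));
byte[] recvBuffer = new byte[8192];
int count;
while ((count = recv.read(recvBuffer)) != -1) {
    toFile.write(recvBuffer, 0, count);
}
toFile.flush(); // make sure the last buffered bytes reach the file
toFile.close();
conn.close();
welcome.close();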

Write a stream into a file with NIO and the Channel system

I have an InputStream and I want to write it to a file.
I saw NIO and the FileChannel, which has the methods "transferTo" and "transferFrom", and I know how to create the WritableByteChannel, but I don't know how to turn my InputStream into a ReadableByteChannel.
Thanks.
Have a look at the Channels.newChannel(java.io.InputStream) method.
newChannel
public static ReadableByteChannel newChannel(InputStream in)
Constructs a channel that reads bytes from the given stream.
The resulting channel will not be buffered; it will simply redirect its I/O operations to the given stream. Closing the channel will in turn cause the stream to be closed.
Parameters: in - the stream from which bytes are to be read
Returns: a new readable byte channel
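For example, a minimal sketch of a helper that does the copy; writeStreamToFile is just a hypothetical name, and the loop is needed because transferFrom may copy fewer bytes than requested:
// Copy an InputStream into a File using NIO channels.
static void writeStreamToFile(InputStream in, File target) throws IOException {
    try (ReadableByteChannel source = Channels.newChannel(in);
         FileChannel destination = new FileOutputStream(target).getChannel()) {
        long position = 0;
        long transferred;
        // transferFrom returns 0 once the stream has no more bytes.
        while ((transferred = destination.transferFrom(source, position, 8192)) > 0) {
            position += transferred;
        }
    }
}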
