Audio streaming via TCP socket on Android - java

I am streaming mic input from a C server via a socket. I know the stream works, because it does with a C client, and I am getting the right values on my Android client.
I am streaming a 1024-element float array. One float is 4 bytes, so I get an incoming stream of 4096 bytes per frame. I extract the floats from these bytes, and I know they are the ones I sent, so that part should work.
Now I want to get that stream directly to the phone's speakers using AudioTrack. I tried feeding in the bytes I received directly: just noise. I tried casting them back into a byte array: still the same. I tried casting the floats to shorts (because AudioTrack takes bytes or shorts). With that I could get something that might have been my mic input (knocking), but very scratchy and extremely laggy. I would understand a lag between frames, but I can't even get one clear sound.
I can, however, clearly output a sine tone that I produce locally and put into that short array.
Now I wonder if there are issues in my code that any of you can see, because I don't see them.
What I am doing is: I put 4 bytes in a byte array and get the float out of it. As soon as I have one frame in my float array (I control that with a boolean; not nice, but it should work), I put it into my short array and let AudioTrack play it. This double conversion might be slow, but I do it because it's the closest I have come to playing the actual input.
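The conversion I describe boils down to something like this sketch (assuming the stream carries little-endian IEEE 754 floats in [-1, 1]; the class and method names are just illustrative):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class FloatFrameDecoder {
    // Convert one frame of little-endian floats (4 bytes each) to 16-bit PCM samples.
    public static short[] decodeFrame(byte[] frame) {
        ByteBuffer bb = ByteBuffer.wrap(frame).order(ByteOrder.LITTLE_ENDIAN);
        short[] pcm = new short[frame.length / 4];
        for (int i = 0; i < pcm.length; i++) {
            float f = bb.getFloat();                 // read the next 4 bytes as a float
            pcm[i] = (short) (f * Short.MAX_VALUE);  // scale [-1, 1] to the 16-bit range
        }
        return pcm;
    }
}
```

The resulting short[] would then be handed to AudioTrack.write().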
Edit:
I checked the endianness by comparing the floats; they have the proper values between -1 and 1 and are the same ones I send. Since I don't change the endianness when converting to float, I don't get why forwarding a 4096-byte array to AudioTrack directly doesn't work either. There might be something wrong with the multithreading, but I don't see what it could be.
Edit 2: I discovered a minor problem - I reset j at 1023. But that missing float should not have been the problem. What I changed other than that was to put the method that takes the stream from the socket into its own thread instead of calling it in an AsyncTask. That made it work; I am now able to recognize the mic sounds. Still, the quality is very poor - might there be a reason for that in the code? I also get a delay of about 10 seconds, of which only about half a second is caused by WLAN, so I wonder if it might be the code's fault. Any further thoughts are appreciated.
Edit 3: I played around with the code and implemented a few of greenapps' ideas from the comments. With the new thread structure I was facing the problem of not getting any sound at all. I don't get how that is even possible, so I switched back. Other things I tried to make the threads more lightweight didn't have any effect. I still get a delay and very poor quality (I can identify knocks, but I can't understand voices). I figured something might be wrong with my conversions, so I put the bytes I receive from the socket directly into AudioTrack - nothing but ugly pulsing static noise. Now I am even more confused, since this exact stream still works with the C client. I will report back if I find a solution, but any help is welcome.
Edit 4: I should add that I can play mic input from another Android app where I send that input directly as bytes (I exclude the float conversion and put the bytes I receive straight into AudioTrack in my player code).
It also occurred to me that the float array streamed by the C server comes from a 64-bit machine while the phone is 32-bit. Could that be a problem somehow, even though I am just streaming floats as 4 bytes?
Or, another thought of mine: the underlying number format of the bytes I receive is float. What format does AudioTrack expect? Even when passing in plain bytes - would I need to convert that float to an int and back to bytes, or something?
new code:
public class PCMSocket {
AudioTrack audioTrack;
boolean doStop = false;
int musicLength = 4096;
byte[] music;
Socket socket;
short[] buffer = new short[4096];
float[] fmusic = new float[1024];
WriteToAudio writeThread;
ReadFromSocket readThread;
public PCMSocket()
{
}
public void start()
{
doStop = false;
readThread = new ReadFromSocket();
readThread.start();
}
public class ReadFromSocket extends Thread
{
public void run()
{
doStop=true;
InetSocketAddress address = new InetSocketAddress("xxx.xxx.xxx.x", 8000);
socket = new Socket();
int timeout = 6000;
try {
socket.connect(address, timeout);
} catch (IOException e2) {
e2.printStackTrace();
}
musicLength = 1024;
InputStream is = null;
try {
is = socket.getInputStream();
} catch (IOException e) {
e.printStackTrace();
}
BufferedInputStream bis = new BufferedInputStream(is);
DataInputStream dis = new DataInputStream(bis);
try{
int minSize =AudioTrack.getMinBufferSize( 44100, AudioFormat.CHANNEL_CONFIGURATION_STEREO, AudioFormat.ENCODING_PCM_16BIT );
audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
AudioFormat.CHANNEL_OUT_STEREO,
AudioFormat.ENCODING_PCM_16BIT, minSize,
AudioTrack.MODE_STREAM);
audioTrack.play();
} catch (Throwable t)
{
t.printStackTrace();
doStop = true;
}
writeThread = new WriteToAudio();
readThread.start();
int i = 0;
int j=0;
try {
if(dis.available()>0)Log.d("PCMSocket", "receiving");
music = new byte[4];
while (dis.available() > 0)
{
music[i]=0;
music[i] = dis.readByte();
if(i==3)
{
int asInt = 0;
asInt = ((music[0] & 0xFF) << 0)
| ((music[1] & 0xFF) << 8)
| ((music[2] & 0xFF) << 16)
| ((music[3] & 0xFF) << 24);
float asFloat = 0;
asFloat = Float.intBitsToFloat(asInt);
fmusic[j]=asFloat;
}
i++;
j++;
if(i==4)
{
music = new byte[4];
i=0;
}
if(j==1024)
{
j=0;
if(doStop)doStop=false;
}
}
} catch (IOException e) {
e.printStackTrace();
}
try {
dis.close();
} catch (IOException e) {
e.printStackTrace();
}
}
};
public class WriteToAudio extends Thread
{
public void run()
{
while(true){
while(!doStop)
{
try{
writeSamples(fmusic);
}catch(Exception e)
{
e.printStackTrace();
}
doStop = true;
}
}
}
};
public void writeSamples(float[] samples)
{
fillBuffer( samples );
audioTrack.write( buffer, 0, samples.length );
}
private void fillBuffer( float[] samples )
{
if( buffer.length < samples.length )
buffer = new short[samples.length];
for( int i = 0; i < samples.length; i++ )
{
buffer[i] = (short)(samples[i] * Short.MAX_VALUE);
}
}
}
old code:
public class PCMSocket {
AudioTrack audioTrack;
WriteToAudio thread;
boolean doStop = false;
int musicLength = 4096;
byte[] music;
Socket socket;
short[] buffer = new short[4096];
float[] fmusic = new float[1024];
public PCMSocket()
{
}
public void start()
{
doStop = false;
new GetStream().executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR);
}
private class GetStream extends AsyncTask<Void, Void, Void> {
@Override
protected Void doInBackground(Void... values) {
PCMSocket.this.getSocket();
return null;
}
@Override
protected void onPreExecute() {
}
@Override
protected void onPostExecute(Void result)
{
return;
}
@Override
protected void onProgressUpdate(Void... values) {
}
}
private void getSocket()
{
doStop=true;
InetSocketAddress address = new InetSocketAddress("xxx.xxx.xxx.x", 8000);
socket = new Socket();
int timeout = 6000;
try {
socket.connect(address, timeout);
} catch (IOException e2) {
e2.printStackTrace();
}
musicLength = 1024;
InputStream is = null;
try {
is = socket.getInputStream();
} catch (IOException e) {
e.printStackTrace();
}
BufferedInputStream bis = new BufferedInputStream(is);
DataInputStream dis = new DataInputStream(bis);
try{
int minSize =AudioTrack.getMinBufferSize( 44100, AudioFormat.CHANNEL_CONFIGURATION_STEREO, AudioFormat.ENCODING_PCM_16BIT );
audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
AudioFormat.CHANNEL_OUT_STEREO,
AudioFormat.ENCODING_PCM_16BIT, minSize,
AudioTrack.MODE_STREAM);
audioTrack.play();
} catch (Throwable t)
{
t.printStackTrace();
doStop = true;
}
thread = new WriteToAudio();
thread.start();
int i = 0;
int j=0;
try {
if(dis.available()>0)Log.d("PCMSocket", "receiving");
music = new byte[4];
while (dis.available() > 0)
{
music[i]=0;
music[i] = dis.readByte();
if(i==3)
{
int asInt = 0;
asInt = ((music[0] & 0xFF) << 0)
| ((music[1] & 0xFF) << 8)
| ((music[2] & 0xFF) << 16)
| ((music[3] & 0xFF) << 24);
float asFloat = 0;
asFloat = Float.intBitsToFloat(asInt);
fmusic[j]=asFloat;
}
i++;
j++;
if(i==4)
{
music = new byte[4];
i=0;
}
if(j==1023)
{
j=0;
if(doStop)doStop=false;
}
}
} catch (IOException e) {
e.printStackTrace();
}
try {
dis.close();
} catch (IOException e) {
e.printStackTrace();
}
}
public class WriteToAudio extends Thread
{
public void run()
{
while(true){
while(!doStop)
{
try{
writeSamples(fmusic);
}catch(Exception e)
{
e.printStackTrace();
}
doStop = true;
}
}
}
};
public void writeSamples(float[] samples)
{
fillBuffer( samples );
audioTrack.write( buffer, 0, samples.length );
}
private void fillBuffer( float[] samples )
{
if( buffer.length < samples.length )
buffer = new short[samples.length*4];
for( int i = 0; i < samples.length; i++ )
{
buffer[i] = (short)(samples[i] * Short.MAX_VALUE);
}
}
}

Sooo... I just solved this only hours after I desperately put a bounty on it, but that's worth it.
I decided to start over. For the design with threads etc. I took some help from this awesome project; it helped me a lot. Now I use only one thread. It seems like the main point was the conversion stuff, but I am not too sure; it may also have been the multithreading. I don't know what kind of bytes the byte[] constructor of AudioTrack expects, but certainly no float bytes. So I knew I needed to use the short[] constructor. What I did was:
-put the bytes in a byte[]
-take 4 of them and convert them to a float in a loop
-take each float and convert it to a short
Since I already did that before, I am not too sure what the problem was. But now it works.
I hope this can help someone who goes through the same pain as me. Big thanks to all of you who participated and commented.
Edit: I just thought about the changes and figured that my using CHANNEL_CONFIGURATION_STEREO instead of MONO earlier contributed a lot to the stuttering. So you might want to try that one first if you encounter this problem. Still, for me it was only part of the solution; changing just that didn't help.
static final int frequency = 44100;
static final int channelConfiguration = AudioFormat.CHANNEL_CONFIGURATION_MONO;
static final int audioEncoding = AudioFormat.ENCODING_PCM_16BIT;
boolean isPlaying;
int playBufSize;
Socket socket;
AudioTrack audioTrack;
playBufSize=AudioTrack.getMinBufferSize(frequency, channelConfiguration, audioEncoding);
audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, frequency, channelConfiguration, audioEncoding, playBufSize, AudioTrack.MODE_STREAM);
new Thread() {
byte[] buffer = new byte[4096];
public void run() {
try {
socket = new Socket(ip, port);
}
catch (Exception e) {
e.printStackTrace();
}
audioTrack.play();
isPlaying = true;
while (isPlaying) {
int readSize = 0;
try { readSize = socket.getInputStream().read(buffer); }
catch (Exception e) {
e.printStackTrace();
}
short[] sbuffer = new short[1024];
for(int i = 0; i < buffer.length; i++)
{
int asInt = 0;
asInt = ((buffer[i] & 0xFF) << 0)
| ((buffer[i+1] & 0xFF) << 8)
| ((buffer[i+2] & 0xFF) << 16)
| ((buffer[i+3] & 0xFF) << 24);
float asFloat = 0;
asFloat = Float.intBitsToFloat(asInt);
int k=0;
try{k = i/4;}catch(Exception e){}
sbuffer[k] = (short)(asFloat * Short.MAX_VALUE);
i=i+3;
}
audioTrack.write(sbuffer, 0, sbuffer.length);
}
audioTrack.stop();
try { socket.close(); }
catch (Exception e) { e.printStackTrace(); }
}
}.start();

Get rid of all, all, the available() tests. Just let your code block in the following read() statement(s). You don't have anything better to do anyway, and you're just burning potentially valuable CPU cycles by even trying to avoid the block.
EDIT To be specific:
try {
socket.connect(address, timeout);
} catch (IOException e2) {
e2.printStackTrace();
}
Poor practice to catch this exception and allow the following code to continue as though it hadn't happened. The exception should be allowed to propagate to the caller.
try {
is = socket.getInputStream();
} catch (IOException e) {
e.printStackTrace();
}
Ditto.
try {
if(dis.available()>0)Log.d("PCMSocket", "receiving");
Remove. You're receiving anyway.
music = new byte[4];
while (dis.available() > 0)
Pointless. Remove. The following reads will block.
{
music[i]=0;
Pointless. Remove.
music[i] = dis.readByte();
if(i==3)
{
int asInt = 0;
asInt = ((music[0] & 0xFF) << 0)
| ((music[1] & 0xFF) << 8)
| ((music[2] & 0xFF) << 16)
| ((music[3] & 0xFF) << 24);
This is all pointless. Replace it all with int asInt = dis.readInt();.
float asFloat = 0;
asFloat = Float.intBitsToFloat(asInt);
Given that the original conversion to short was via floatValue * Short.MAX_VALUE, this conversion should be asFloat = (float)asInt/Short.MAX_VALUE.
if(i==4)
If i was 3 before it will be 4 now, so this test is also pointless.
music = new byte[4];
You don't need to reallocate music. Remove.
} catch (IOException e) {
e.printStackTrace();
}
See above. Pointless. The exception should be allowed to propagate to the caller.
try {
dis.close();
} catch (IOException e) {
e.printStackTrace();
}
All this should be in a finally block.
}
};
while(true){
while(!doStop)
You don't need both these loops.
try{
writeSamples(fmusic);
}catch(Exception e)
{
e.printStackTrace();
}
See above. Pointless. The exception should in this case terminate the loop, as any IOException writing to a socket is fatal to the connection.
if( buffer.length < samples.length )
buffer = new short[samples.length];
Why isn't buffer already the right size? Alternatively, what if buffer.length > samples.length?
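Putting these points together, the receive loop might be reduced to a sketch like the following (names are illustrative; note that DataInputStream.readInt() reads big-endian, so if the sender emits little-endian floats the bytes would need swapping, as shown):

```java
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;

public class FrameReader {
    // Read one frame of floats with plain blocking reads, letting IOExceptions
    // propagate to the caller instead of swallowing them. Returns the number of
    // floats actually read (less than fmusic.length only if the stream ended).
    public static int readFrame(DataInputStream dis, float[] fmusic) throws IOException {
        int filled = 0;
        try {
            while (filled < fmusic.length) {
                int bits = dis.readInt();            // blocks until 4 bytes arrive (big-endian)
                bits = Integer.reverseBytes(bits);   // only needed if the sender is little-endian
                fmusic[filled++] = Float.intBitsToFloat(bits);
            }
        } catch (EOFException eof) {
            // end of stream mid-frame; fall through with a partial frame
        }
        return filled;
    }
}
```

Closing the stream would go in a finally block in the caller, per the notes above.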

Related

AudioTrack only playing noise instead of recorded voice

I want to play recorded voice using AudioTrack, but it's making noise. I tried different techniques but was unable to solve this issue.
I changed:
the frequency rate, the audio channel configuration, and the audio format encoding.
public class PlayAudio extends AsyncTask<Void, Integer, Void> {
PlayAudio playTask;
String path = Environment.getExternalStorageDirectory().getAbsolutePath() + "/MyFolder/";
String myfile = path + "filename" + ".wav";
File recordingFile = new File(myfile);
boolean isRecording = false,isPlaying = false;
int frequency = 44100 ,channelConfiguration = AudioFormat.CHANNEL_CONFIGURATION_MONO;
int audioEncoding = AudioFormat.ENCODING_PCM_16BIT;
@Override
protected Void doInBackground(Void... params) {
isPlaying = true;
int bufferSize = AudioTrack.getMinBufferSize(frequency,channelConfiguration,audioEncoding);
short[] audiodata = new short[bufferSize / 4];
try {
DataInputStream dis = new DataInputStream(new BufferedInputStream(new FileInputStream(recordingFile)));
AudioTrack audioTrack = new AudioTrack(
AudioManager.STREAM_MUSIC, frequency,
channelConfiguration, audioEncoding, bufferSize,
AudioTrack.MODE_STREAM);
audioTrack.play();
while (isPlaying && dis.available() > 0) {
int i = 0;
while (dis.available() > 0 && i < audiodata.length) {
audiodata[i] = dis.readShort();
i++;
}
audioTrack.write(audiodata, 0, audiodata.length);
}
dis.close();
// startPlaybackButton.setEnabled(false);
// stopPlaybackButton.setEnabled(true);
} catch (Throwable t) {
Log.e("AudioTrack", "Playback Failed");
}
return null;
}
}
I don't know if this is the whole problem, but part of your problem is that you're treating the WAV file as if all of it is audio data. In fact, there is a fair amount of metadata in there. See http://soundfile.sapp.org/doc/WaveFormat/ for more information.
The safest thing to do is to parse the file until you find the data chunk, then read the data chunk, and then stop (because often there's metadata that comes after the data chunk too).
Here's some rough code to give you the idea.
try {
byte[] buffer = new byte[1024];
// First find the data chunk
byte[] bytes = new byte[4];
// Read first 4 bytes.
// (Should be RIFF descriptor.)
// Assume it's ok
is.read(bytes);
// First subchunk will always be at byte 12.
// (There is no other dependable constant.)
is.skip(8);
for (;;) {
// Read each chunk descriptor.
if (is.read(bytes) < 0) {
break;
}
String desc = new String(bytes, "US-ASCII");
// Read chunk length.
if (is.read(bytes) < 0) {
break;
}
int dataLength = (
(bytes[0] & 0xFF) |
((bytes[1] & 0xFF) << 8) |
((bytes[2] & 0xFF) << 16) |
((bytes[3] & 0xFF) << 24));
long length = getUnsignedInt(dataLength);
if (desc.equals("data")){
// Read 'length' bytes
...
public static long getUnsignedInt(int x) {
return x & 0x00000000ffffffffL;
}

MediaRecorder record audio in a loop

I'm developing a sound recognition system. I'm using a TensorFlow model developed in Python to convert MFCC values to labels. I'm using the MediaRecorder class to record the audio, and I'm doing it in a loop so I can constantly capture microphone audio and then get the label from the model. Here is the recording loop:
temp = 0;
while (true) {
audioPath = getApplicationContext().getFilesDir().getAbsolutePath();
audioPath += "/Recording" + temp + ".3gp";
audioFile = new File(audioPath);
mediaRecorder = new MediaRecorder();
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mediaRecorder.setOutputFile(audioPath);
try {
mediaRecorder.prepare();
} catch (IOException e) {
e.printStackTrace();
}
mediaRecorder.start();
sleep(2000);
if (!isRunning) {
mediaRecorder.stop();
return;
}
try {
int amplitude = mediaRecorder.getMaxAmplitude();
Log.d("volume", Integer.toString(amplitude));
//finished = false;
avgVolumeTask task = new avgVolumeTask();
task.execute(amplitude);
} catch (Exception e) {
Log.d("Exception in startMediaRecorder()", e.toString());
}
mediaRecorder.stop();
mediaRecorder.release();
soundRecognition task2 = new soundRecognition();
task2.execute();
audioFile.delete();
temp++;
}
This is the soundRecognition method:
private class soundRecognition extends AsyncTask<Integer, Integer, Long> {
@Override
protected Long doInBackground(Integer... level) {
float[] mfccValues = null;
Interpreter tflite = null;
float[][] labelProbArray = null;
try {
mfccValues = computeMFCC();
labelList = loadLabelList();
labelProbArray = new float[1][labelList.size()];
tflite = new Interpreter(loadModel());
} catch (IOException e) {
e.printStackTrace();
} catch (UnsupportedAudioFileException e) {
e.printStackTrace();
}
tflite.run(mfccValues, labelProbArray);
for (int i = 0; i < labelProbArray[0].length; i++) {
float value = labelProbArray[0][i];
//if (i == 1f){
//Log.d("Output at " + Integer.toString(i) + ": ", Float.toString(value));
//doAlert(i);
//}
}
return null;
}
}
The computeMFCC method is this:
public float[] computeMFCC() throws IOException, UnsupportedAudioFileException {
FileInputStream in2 = new FileInputStream(audioPath);
int i;
// InputStream to byte array
byte[] buf = IOUtils.toByteArray(in2);
in2.close();
i = Integer.MAX_VALUE;
// byte array to short array
short[] shortArr = new short[buf.length / 2];
ByteBuffer.wrap(buf).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().get(shortArr);
int count = 0;
while (count <= shortArr.length) { // Still have data to process.
for (int n = 0; n < nSubframePerBuf; n++) { // Process audio signal in ArrayList and shift by one subframe each time
int k = 0;
for (i = (n * frameShift); i < (n + 1) * frameShift; i++) {
subx[k] = shortArr[i];
k++;
}
subframeList.add(subx); // Add the current subframe to the subframe list. Later, a number of
}
count++;
}
// Need at least nSubframePerMfccFrame to get one analysis frame
x = extractOneFrameFromList(nSubframePerMfccFrame);
MFCC mfcc = new MFCC(samplePerFrm, 16000, numMfcc);
double[] mfccVals = mfcc.doMFCC(x);
float[] floatArray = new float[mfccVals.length];
for (i = 0 ; i < mfccVals.length; i++)
{
floatArray[i] = (float) mfccVals[i];
}
return floatArray;
}
And the doMFCC method is from a downloaded java file here:
https://github.com/enmwmak/ScreamDetector/blob/master/src/edu/polyu/mfcc/MFCC.java
The issue I'm having is that after a few iterations, I run into the problem that the file doesn't get created, and I then get a null error when passing the results from the input stream to the TensorFlow model.
Possible Issues
One reason could be where the file is stored. I've been trying to send the file to local storage because I was worried that not all devices would have external storage.
Another reason could be that I'm not calling the sound recognition in the right spot. I wait until well after the MediaRecorder is stopped to make sure that the file is written with the mic audio, but when I review the contents of the FileInputStream, it appears not to be working, and in each loop the file is always the same.
Any help would be much appreciated.
It may be tricky to have a sleep(2000) inside the while loop.
It may be better to check the elapsed milliseconds and only continue once 2000 ms have elapsed.
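A hypothetical sketch of that polling approach, with short sleeps so a stop request is noticed promptly (the helper name is illustrative):

```java
public class RecordingTimer {
    // Wait out the recording window by polling elapsed time instead of one long sleep,
    // so a stop request can interrupt promptly. Returns true if the full window elapsed.
    public static boolean waitWindow(long windowMs, java.util.function.BooleanSupplier keepRunning) {
        long start = System.currentTimeMillis();
        while (System.currentTimeMillis() - start < windowMs) {
            if (!keepRunning.getAsBoolean()) {
                return false;            // caller asked us to stop early
            }
            try {
                Thread.sleep(50);        // short naps keep the loop responsive
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return false;
            }
        }
        return true;
    }
}
```

In the recording loop above, `sleep(2000)` would be replaced by `waitWindow(2000, () -> isRunning)`.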

Replace nio sockets to exec'ed software for binary protocol

I have to make an abstraction in my software - replace direct non-blocking NIO sockets (client/server) with a software abstraction.
For example, instead of connecting via TCP, the client would exec openssl s_client -connect xxx.xxx.xxx.xxx. I have written a little demo, and it even works. Sometimes :(
The first trouble is that a Process's streams can't be used with a Selector, so I can't replace the SocketChannel with any other type of channel, and I have to read/write without any chance of avoiding blocking.
The second is that the protocol is a duplex binary file-transfer protocol (binkp), so the process's buffered streams are unusable. I've tried to work around that by converting the in/out data to Base64, and it works - but also only sometimes.
I can't understand why it sometimes works and sometimes doesn't. I put a piece of test code below. The first word is the frame's length, but the first bit is ignored. Please tell me your guesses. Thanks.
public class BufferedSocketBase64 {
static class InToOut implements Runnable {
InputStream is;
OutputStream os;
boolean direction; //
public InToOut(InputStream is, OutputStream os, boolean direction) {
super();
this.is = is;
this.os = os;
this.direction = direction;
}
@Override
public void run() {
System.out.println(Thread.currentThread().getId() + " start "
+ ((direction) ? "encode from to" : "decode from to"));
boolean eof = false;
while (true) {
if (direction) {
// encode to base64 data
try {
int[] head = new int[2];
for (int i = 0; i < 2; i++) {
head[i] = is.read();
}
int len = (head[0] & 0xff << 8 | head[1] & 0xff) & 0x7FFF;
byte[] buf = new byte[len + 2];
buf[0] = (byte) (head[0] & 0xff);
buf[1] = (byte) (head[1] & 0xff);
for (int i = 2; i < len; i++) {
buf[i] = (byte) (is.read() & 0xff);
}
System.out.println(Thread.currentThread()
.getId() + " << " + new String(buf));
if (len > 0) {
String send = Base64Util.encode(buf, len);
send += "\n";
os.write(send.getBytes());
os.flush();
}
} catch (IOException e) {
eof = true;
}
} else { // decode from base64
try {
StringBuilder sb = new StringBuilder(1024);
byte c = 0x0a;
do {
c = (byte) is.read();
if (c >= 0 && c != 0x0a) {
sb.append(new String(new byte[] { c }));
}
} while (c != 0x0a && c >= 0);
if (sb.length() != 0) {
try {
byte[] buf = Base64Util.decode(sb.toString());
System.out.println(Thread.currentThread()
.getId() + " >> " + buf.length);
os.write(buf);
os.flush();
} catch (StringIndexOutOfBoundsException e) {
System.out
.println(Thread.currentThread().getId()
+ " error on " + sb.toString());
}
}
} catch (IOException e) {
eof = true;
}
}
if (eof) {
System.out.println(Thread.currentThread().getId() + " EOF");
break;
}
}
try {
is.close();
os.close();
} catch (IOException e) {
}
}
}
public static void main(String[] args) throws Exception {
Process proc2 = Runtime.getRuntime().exec("nc -l -p 2020");
Process proc1 = Runtime.getRuntime().exec("nc 127.0.0.1 2020");
Socket sock1 = new Socket();
sock1.connect(new InetSocketAddress("127.0.0.1", 24554), 30);
Socket sock2 = new Socket();
sock2.connect(new InetSocketAddress("127.0.0.1", 24557), 30);
new Thread(new InToOut(sock1.getInputStream(), proc1.getOutputStream(),
true)).start();
new Thread(new InToOut(proc1.getInputStream(), sock1.getOutputStream(),
false)).start();
new Thread(new InToOut(sock2.getInputStream(), proc2.getOutputStream(),
true)).start();
new Thread(new InToOut(proc2.getInputStream(), sock2.getOutputStream(),
false)).start();
}
UPDATED:
I've found the right way. I use synchronized queues for each stream and synchronized threads to fill or drain those queues. All the threads mutually block each other. And it works! :)
Sorry for the bother.

Streaming chunks of audio (mp3) using Java & JSP for real-time playback through ServletOutputStream

I am trying to play back audio and keep it continuous and free from skips or blank spots. I first receive bytes in chunks and convert them to MP3 to be streamed via the ServletOutputStream. I only start playing once enough bytes have been collected by the consumer, in an attempt to maintain a constant flow of audio. As you can see, I have hard-coded this buffer size, but I would like it to work for any size of audio bytes. I was wondering if anyone had come across a similar problem and had any advice?
Thanks in advance. Any help would be greatly appreciated.
public class Consumer extends Thread {
private MonitorClass consBuf;
private InputStream mp3InputStream = null;
private OutputStream OutputStream = null;
public Consumer (MonitorClass buf, OutputStream servlet)
{
consBuf = buf;
OutputStream = servlet;
}
public void run()
{
byte[] data;
byte[] tempbuf;
int byteSize = 60720; //This should be dynamic
int byteIncrement = byteSize;
int dataPlayed = 0;
int start = 0;
int buffer = 0;
boolean delay = true;
AudioFormat generatedTTSAudioFormat = getGeneratedAudioFormat();
try
{
while(true)
{
try
{
data = consBuf.get(); //gets data from producer using a shared monitor class
if(data.length >= byteSize) //Buffer size hit, start playing
{
if(delay) //help with buffering
{
System.out.println("Pre-delay...");
consBuf.preDelay();
delay = false;
}
tempbuf = new byte[byteIncrement];
arraySwap(data, tempbuf, start, byteSize);
System.out.println("Section to play: " + start + ", " + byteSize);
mp3InputStream = FishUtils.convertToMP3( new ByteArrayInputStream(tempbuf), generatedTTSAudioFormat);
copyStream(mp3InputStream, OutputStream);
System.out.println("Data played: " + byteSize);
System.out.println("Data collected: " + consBuf.getDownloadedBytes() );
dataPlayed = byteSize;
start = byteSize;
byteSize += byteIncrement;
}
if( consBuf.getIsComplete() )
{
if (consBuf.checkAllPlayed(dataPlayed) > 0)
{
System.out.println("Producer finished, play remaining section...");
//mp3InputStream = convertToMP3(new ByteArrayInputStream(tempbuf), generatedTTSAudioFormat);
//copyStream(mp3InputStream, OutputStream);
}
System.out.println("Complete!");
break;
}
}
catch (Exception e)
{
System.out.println(e);
return;
}
}
}
finally
{
if (null != mp3InputStream)
{
try
{
mp3InputStream.skip(Long.MAX_VALUE);
}
catch (Exception e)
{}
}
closeStream(mp3InputStream);
closeStream(OutputStream);
}
}
}

Performance in reading and writing files: what's best? Serialization vs. java.nio

I have an object with 1 int and 4 doubles.
I compared the performance of writing 5 million of these objects to a file using serialization and using a FileChannel.
For serialization, I used the following methods to read and write the file:
public void print() throws IOException, ClassNotFoundException{
ObjectInputStream input = new ObjectInputStream(new FileInputStream(this.filePath) );
try {
while(true) {
this.sb = (Sbit) input.readObject();
//System.out.println(this.sb.toString());
}
}
catch ( EOFException eofException ) {
return;
}
catch (IOException ioException) {
System.exit( 1 );
}
finally {
if( input != null )
input.close();
}
}
public void build() throws IOException {
ObjectOutputStream output = new ObjectOutputStream( new FileOutputStream(this.filePath) );
try {
Random random = new Random();
for (int i = 0; i<5000000; i++) {
this.sb = new Sbit();
this.sb.setKey(i);
this.sb.setXMin( random.nextDouble() );
this.sb.setXMax( random.nextDouble() );
this.sb.setYMin( random.nextDouble() );
this.sb.setYMax( random.nextDouble() );
output.writeObject(this.sb);
}
}
catch (IOException ioException) {
System.exit( 1 );
}
finally {
try {
if( output != null)
output.close();
}
catch ( Exception exception ) {
exception.printStackTrace();
System.exit(1);
}
}
}
Using java.nio, it was:
public void print() throws IOException {
FileChannel file = new RandomAccessFile(this.filePath, "rw").getChannel();
ByteBuffer[] buffers = new ByteBuffer[5];
buffers[0] = ByteBuffer.allocate(4); // 4 bytes to int
buffers[1] = ByteBuffer.allocate(8); // 8 bytes to double
buffers[2] = ByteBuffer.allocate(8);
buffers[3] = ByteBuffer.allocate(8);
buffers[4] = ByteBuffer.allocate(8);
while (true) {
if(file.read(buffers[0]) == -1 ) // Read the int,
break; // if its EOF exit the loop
buffers[0].flip();
this.sb = new Sbit();
this.sb.setKey(buffers[0].getInt());
if(file.read(buffers[1]) == -1) { // Read the int primary value
assert false; // Should not get here!
break; // Exit loop on EOF
}
buffers[1].flip();
this.sb.setXMin( buffers[1].getDouble() );
if(file.read(buffers[2]) == -1) {
assert false;
break;
}
buffers[2].flip();
this.sb.setXMax( buffers[2].getDouble() );
if(file.read(buffers[3]) == -1) {
assert false;
break;
}
buffers[3].flip();
this.sb.setYMin( buffers[3].getDouble() );
if(file.read(buffers[4]) == -1) {
assert false;
break;
}
buffers[4].flip();
this.sb.setYMax( buffers[4].getDouble() );
for(int i = 0; i < 5; i++)
buffers[i].clear();
}
}
public void build() throws IOException {
FileChannel file = new RandomAccessFile(this.filePath, "rw").getChannel();
Random random = new Random();
for (int i = 0; i<5000000; i++) {
this.sb = new Sbit();
this.sb.setKey(i);
this.sb.setXMin( random.nextDouble() );
this.sb.setXMax( random.nextDouble() );
this.sb.setYMin( random.nextDouble() );
this.sb.setYMax( random.nextDouble() );
ByteBuffer[] buffers = new ByteBuffer[5];
buffers[0] = ByteBuffer.allocate(4); // 4 bytes to into
buffers[1] = ByteBuffer.allocate(8); // 8 bytes to double
buffers[2] = ByteBuffer.allocate(8);
buffers[3] = ByteBuffer.allocate(8);
buffers[4] = ByteBuffer.allocate(8);
buffers[0].putInt(this.sb.getKey()).flip();
buffers[1].putDouble(this.sb.getXMin()).flip();
buffers[2].putDouble(this.sb.getXMax()).flip();
buffers[3].putDouble(this.sb.getYMin()).flip();
buffers[4].putDouble(this.sb.getYMax()).flip();
try {
file.write(buffers);
}
catch (IOException e) {
e.printStackTrace(System.err);
System.exit(1);
}
for(int x = 0; x < 5; x++)
buffers[x].clear();
}
}
I have read a lot about java.nio and tried to use it precisely because it is supposed to have better performance. But that's not what happened in my case.
Writing the file gave the following results (java.nio):
file size: 175 MB
time in milliseconds: 57638
Using serialization:
file size: 200 MB
time in milliseconds: 34504
Reading the file back gave the following (java.nio):
time in milliseconds: 78172
Using serialization:
time in milliseconds: 35288
Am I doing something wrong with java.nio? I would like to write the same binary files as before. Is there another way to write files efficiently? Is serializing the objects actually the best way?
Thank you.
You are creating 25,000,000 ByteBuffer objects, with each ByteBuffer being at most 8 bytes. That's very inefficient.
Create just one ByteBuffer of 36 bytes (4 for the int plus 4 × 8 for the doubles) outside the loop (before the for statement).
Inside the loop you can use the same ByteBuffer as follows:
buffer.clear();
buffer.putInt(this.sb.getKey());
buffer.putDouble(this.sb.getXMin());
buffer.putDouble(this.sb.getXMax());
buffer.putDouble(this.sb.getYMin());
buffer.putDouble(this.sb.getYMax());
buffer.flip();
try
{
file.write(buffer);
}
catch (IOException ex)
{
ex.printStackTrace();
//etc...
}
buffer.flip();
Try it out and let us know if you see any improvements.
Instead of using multiple ByteBuffers, declare a single byte buffer that is large enough to hold all of the data you want to put into it. Then put data into it just like you are now. When done, flip the buffer and write it out. When you are ready to read it back in, read the data from disk into the byte buffer, flip it, and then read the data out using getInt/getDouble.
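For example, reading the records back with one reused buffer might look like this sketch (the field order matches the write code above; the class and method names are illustrative):

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.ReadableByteChannel;

public class RecordReader {
    static final int RECORD_SIZE = 4 + 4 * 8;  // one int key + four doubles = 36 bytes

    // Fill the reused buffer with one full record, then read the fields out of it.
    // Returns false once the channel reaches end of file.
    public static boolean readRecord(ReadableByteChannel ch, ByteBuffer buf, double[] out)
            throws IOException {
        buf.clear();
        while (buf.hasRemaining()) {
            if (ch.read(buf) == -1) {
                return false;              // EOF; no complete record left
            }
        }
        buf.flip();
        out[0] = buf.getInt();             // key
        out[1] = buf.getDouble();          // xMin
        out[2] = buf.getDouble();          // xMax
        out[3] = buf.getDouble();          // yMin
        out[4] = buf.getDouble();          // yMax
        return true;
    }
}
```

Allocating the 36-byte buffer once and calling readRecord in a loop avoids the per-record allocations of the original code.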
I haven't tried serializing this myself, but I have achieved good results with Kryo. It is a lot faster than standard Java serialization.
