Convert an audio stream to PCM - java

I'm trying to get an audio stream from a text-to-speech interface (MaryTTS) and stream it in a SIP RTP session (using Peers).
Peers wants a SoundSource to stream audio, which is an interface defined as
public interface SoundSource {
    byte[] readData();
}
and MaryTTS synthesises a String to an AudioInputStream. I tried to simply read the stream and buffer it out to Peers by implementing SoundSource, along the lines of
MaryInterface tts = new LocalMaryInterface();
AudioInputStream audio = tts.generateAudio("This is a test.");
SoundSource soundSource = new SoundSource() {
    @Override
    public byte[] readData() {
        try {
            byte[] buffer = new byte[1024];
            audio.read(buffer);
            return buffer;
        } catch (IOException e) {
            return null;
        }
    }
};
// issue call with soundSource using Peers
The phone rings, and I hear a slow, low, noisy sound instead of the synthesised speech. I guess it could be something to do with the audio format the SIP RTP session expects, since the Peers documentation states
The sound source must be raw audio with the following format: linear PCM 8kHz, 16 bits signed, mono-channel, little endian.
How can I convert/read the AudioInputStream to satisfy these requirements?

One way I know of is this; given the systems that you are using, I don't know whether it will work in your setup:
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
try {
    byte[] data = new byte[1024];
    while (true) {
        int k = audioInputStream.read(data, 0, data.length);
        if (k < 0) break;
        outputStream.write(data, 0, k);
    }
    // linear PCM, 8 kHz, 16-bit signed, mono, little endian
    AudioFormat af = new AudioFormat(8000f, 16, 1, true, false);
    byte[] audioData = outputStream.toByteArray();
    InputStream byteArrayInputStream = new ByteArrayInputStream(audioData);
    AudioInputStream audioInputStream2 =
            new AudioInputStream(byteArrayInputStream, af, audioData.length / af.getFrameSize());
    outputStream.close();
} catch (Exception ex) {
    ex.printStackTrace();
}
There is also
AudioSystem.getAudioInputStream(AudioFormat targetFormat, AudioInputStream sourceStream)
which you can use with the above parameters.
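For example, here is a minimal sketch (not part of the original answer) of wiring that conversion call into the SoundSource from the question. It assumes the default Java Sound converters on your JVM can resample MaryTTS's output (typically 16 kHz mono PCM) down to 8 kHz; if they cannot, getAudioInputStream throws an IllegalArgumentException and you would need to resample another way:

import java.io.IOException;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;

// ...
AudioInputStream audio = tts.generateAudio("This is a test.");

// Target format required by Peers: linear PCM, 8 kHz, 16-bit signed, mono, little endian.
AudioFormat target = new AudioFormat(8000f, 16, 1, true, false);
final AudioInputStream converted = AudioSystem.getAudioInputStream(target, audio);

SoundSource soundSource = new SoundSource() {
    @Override
    public byte[] readData() {
        try {
            byte[] buffer = new byte[1024];
            int n = converted.read(buffer);
            if (n < 0) {
                return null; // end of stream
            }
            // Return only the bytes actually read, not the whole (possibly stale) buffer.
            byte[] out = new byte[n];
            System.arraycopy(buffer, 0, out, 0, n);
            return out;
        } catch (IOException e) {
            return null;
        }
    }
};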

Related

Java realtime audio FFT

My aim is to take in two channels of audio from a live source, perform some FFT analysis and display realtime graphs of the data.
So far I have researched and have gotten to the point where I can create a TargetDataLine from my audio interface with two channels of audio at a specified AudioFormat. I have created a buffer for this stream of bytes; however, I would like to treat each audio channel independently. Do I need to split the stream as it writes to the buffer, and have two buffers? Or do I need to split the buffer into different arrays to process?
final AudioFormat format = getFormat();
DataLine.Info info = new DataLine.Info(TargetDataLine.class, format);
TargetDataLine line = (TargetDataLine) m.getLine(info);
line.open(format);
line.start();
System.out.println("Line Started");
Thread captureThread = new Thread() {
    int bufferSize = (int) (format.getSampleRate() * format.getFrameRate() * format.getChannels());
    byte buffer[] = new byte[bufferSize / 5];
    out = new ByteArrayOutputStream();
    while (running) {
        int numBytesRead = line.read(buffer, 0, buffer.length);
        while (numBytesRead > 0) {
            arraytoProcess = buffer;
            Thread fftThread;
            fftThread = new Thread() {
                public void fftrun() {
                    try {
                        fftCalc();
                    } catch (ParseException ex) {
                        Logger.getLogger(Main.class.getName()).log(Level.SEVERE, null, ex);
                    }
                };
            };
            while (numBytesRead == buffer.length) {
                fftThread.start();
            }
        }
I am sure I have gone far wrong, but any pointers would help. When I run this at the moment, I am aware that it takes longer to complete the 'fftThread' than it takes for each pass of the buffer, so I get an IllegalThreadStateException (it is, however, currently getting all bytes from both stereo channels passed to this thread). I have tried the good old search engines, but things aren't overly clear on how to deal with accessing multiple channels of a TargetDataLine.
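No answer is recorded here, but to illustrate the channel-splitting part of the question: with interleaved stereo PCM you keep one buffer from the TargetDataLine and de-interleave it into two per-channel arrays before running the FFT. A minimal sketch, assuming 16-bit signed little-endian samples with frames laid out left, right, left, right:

// De-interleave a buffer of 16-bit signed little-endian stereo PCM into
// separate left/right sample arrays (scaled to -1.0..1.0), ready for an FFT.
static double[][] splitChannels(byte[] buffer, int numBytesRead) {
    int frames = numBytesRead / 4;              // 2 bytes per sample * 2 channels
    double[] left = new double[frames];
    double[] right = new double[frames];
    for (int i = 0; i < frames; i++) {
        int base = i * 4;
        short l = (short) ((buffer[base + 1] << 8) | (buffer[base] & 0xFF));
        short r = (short) ((buffer[base + 3] << 8) | (buffer[base + 2] & 0xFF));
        left[i]  = l / 32768.0;
        right[i] = r / 32768.0;
    }
    return new double[][] { left, right };
}

With this approach you keep a single capture buffer and hand the two returned arrays to the FFT code, rather than splitting the stream into two separate buffers while reading.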

java playing sounds in order

Hi, I have the following Java programme that plays some sounds. I want to play the sounds in order: for example, after sound1 ends I want to play sound2 and then sound3. The following is my Java code and the function for playing a sound.
private void playsound(String file)
{
    try {
        crit = AudioSystem.getClip();
        AudioInputStream inputStream1 = AudioSystem.getAudioInputStream(this.getClass().getResource(file));
        crit.open(inputStream1);
        //if(!crit.isOpen())
        {
            crit.start();
        }
    } catch (Exception ex) {
        System.out.println(ex.getMessage());
    }
}
and I call it as follows:
playsound("/sounds/filesound1.au");
playsound("/sounds/filesound2.au");
playsound("/sounds/filesound3.au");
The programme is playing the sounds in parallel, which I don't want. I want to play them in order.
Regards
I got the following code from somewhere that I can't remember right now, but it plays the files consecutively:
public static void play(ArrayList<String> files) {
    byte[] buffer = new byte[4096];
    for (String filePath : files) {
        File file = new File(filePath);
        try {
            AudioInputStream is = AudioSystem.getAudioInputStream(file);
            AudioFormat format = is.getFormat();
            SourceDataLine line = AudioSystem.getSourceDataLine(format);
            line.open(format);
            line.start();
            while (is.available() > 0) {
                int len = is.read(buffer);
                line.write(buffer, 0, len);
            }
            line.drain();
            line.close();
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }
}
The reason this plays the files consecutively and not all at the same time is that write blocks until the requested amount of data has been written. This applies even if the requested amount of data to write is greater than the data line's buffer size.
Make sure to include drain() from the code above; drain() waits for the line's buffer to empty before the line is close()d.

Java play 2 byte buffers at the same time with SourceDataLine

I'm trying to write 2 different buffers (buffer A and buffer B) multithreaded with SourceDataLine to play the sounds at the same time, but it keeps switching between buffer A and buffer B. Do I need to merge the buffers together before writing them to my SourceDataLine, or is there a way to play them synchronised?
class PlayThread extends Thread {
    byte[] buffer = new byte[2 * 1024];

    @Override
    public void run() {
        try {
            while (true) {
                DatagramPacket receive = new DatagramPacket(buffer, buffer.length);
                mDatagramSocket.receive(receive);
                mSourceDataLine.write(receive.getData(), 0, receive.getData().length);
                System.out.println("Received!");
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
I have 2 PlayThread instances with a different incoming buffer. Below is the function where the SourceDataLine is initialized.
private void init() {
    try {
        DataLine.Info sourceDataLineInfo = new DataLine.Info(SourceDataLine.class, audioFormat);
        DataLine.Info targetDataLineInfo = new DataLine.Info(TargetDataLine.class, audioFormat);
        Mixer.Info[] mixerInfo = AudioSystem.getMixerInfo();
        Mixer mixer = AudioSystem.getMixer(mixerInfo[3]);
        mSourceDataLine = (SourceDataLine) AudioSystem.getLine(sourceDataLineInfo);
        mTargetDataLine = (TargetDataLine) mixer.getLine(targetDataLineInfo);
        mSourceDataLine.open(audioFormat, 2 * 1024);
        mSourceDataLine.start();
        mTargetDataLine.open(audioFormat, 2 * 1024);
        mTargetDataLine.start();
    } catch (LineUnavailableException ex) {
        ex.printStackTrace();
    }
}
Thank you.
You absolutely do have to merge them. Imagine writing numbers to a file from two threads:
123456...
123456...
might become
11234235656...
Which is what's happening to you.
Another issue is that you need to buffer your data as it comes in from the network, or you will likely drop it. You need at least two threads for this to work: one for reading and one for playing. However, in your case, you will probably have better luck with one reader thread for each input packet stream. (See my talk slides: http://blog.bjornroche.com/2011/11/slides-from-fundamentals-of-audio.html; I specifically have a slide about streaming from HTTP which is also relevant here.)
So, instead of multiple PlayThreads, make multiple ReaderThreads, which wait for data and then write it to a buffer of some sort (PipedInputStream and PipedOutputStream work well in Java). Then you need another thread to read the data from the buffers and write the COMBINED data to the line.
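As a rough sketch of that layout (not the original answer's code), a reader thread per packet stream could look something like this, reusing the 2 KB packets from the question:

import java.io.IOException;
import java.io.PipedOutputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;

// One reader thread per incoming packet stream: it only receives datagrams
// and pushes the raw bytes into a pipe for the mixer/playback thread.
class ReaderThread extends Thread {
    private final DatagramSocket socket;
    private final PipedOutputStream pipeOut;

    ReaderThread(DatagramSocket socket, PipedOutputStream pipeOut) {
        this.socket = socket;
        this.pipeOut = pipeOut;
    }

    @Override
    public void run() {
        byte[] buffer = new byte[2 * 1024];
        try {
            while (true) {
                DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
                socket.receive(packet);
                pipeOut.write(packet.getData(), packet.getOffset(), packet.getLength());
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

A separate mixer/playback thread then reads equal-sized blocks from the two PipedInputStreams connected to these pipes, combines them (see the averaging sketch below), and writes the result to the single SourceDataLine.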
This leaves your original question of how to combine the data. The answer is that there's no single answer, but usually the easiest correct way is to average the data on a sample-by-sample basis. However, exactly how you do so depends on your data format, which your code doesn't include. Assuming it's big-endian 16-bit integer, you need to convert the incoming raw data to shorts, average the shorts, and convert the averaged short back to bytes.
The byte to short conversion is most easily accomplished using DataInputStream and DataOutputStream.
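As a rough illustration (not from the original answer), here is a minimal sketch of that averaging approach, assuming 16-bit signed big-endian samples as mentioned above and two equal-length input buffers:

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

// Mix two buffers of 16-bit signed big-endian PCM by averaging them
// sample-by-sample. Averaging (rather than summing) avoids overflow
// and clipping, at the cost of halving each source's level.
static byte[] mix(byte[] a, byte[] b) throws IOException {
    DataInputStream inA = new DataInputStream(new ByteArrayInputStream(a));
    DataInputStream inB = new DataInputStream(new ByteArrayInputStream(b));
    ByteArrayOutputStream bytes = new ByteArrayOutputStream();
    DataOutputStream out = new DataOutputStream(bytes);
    int samples = Math.min(a.length, b.length) / 2;   // 2 bytes per sample
    for (int i = 0; i < samples; i++) {
        short sa = inA.readShort();   // readShort/writeShort are big-endian
        short sb = inB.readShort();
        out.writeShort((sa + sb) / 2);
    }
    return bytes.toByteArray();
}

The playback thread can then write the returned array to the single SourceDataLine. For little-endian data you would swap the byte order before and after the arithmetic instead of relying on DataInputStream's big-endian readShort.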

How can I convert a byte array to a bitmap and stream the images through a web server?

The project I am working on captures a frame from a security camera with the help of an Arduino UNO and a Video Experimenter shield. I then send the frame as byte arrays through a serial port. I would like to ask: how could I, with Java, convert these byte arrays back to an image, and stream this image (or even make these images into a video and then stream it) through a web server?
The code I am stuck with is this:
// Handle an event on the serial port. Read the data and save the image.
public synchronized void serialEvent(SerialPortEvent oEvent) {
    if (oEvent.getEventType() == SerialPortEvent.DATA_AVAILABLE) {
        try {
            System.out.println("Got it!");
            int available = input.available();
            byte[] chunk = new byte[available];
            input.read(chunk, 0, available);
            InputStream in = new ByteArrayInputStream(chunk);
            BufferedImage image = ImageIO.read(in);
            ImageIO.write(image, "BMP", new File("/home/zuss/images/image.BMP"));
        } catch (Exception e) {
            System.err.println(e.toString());
        }
    }
}
That prints java.lang.IllegalArgumentException: image == null! to my terminal window continuously, as long as the Arduino is sending data to the serial port.
Your code should look something like this:
InputStream in = new ByteArrayInputStream(chunk);
OutputStream out = new FileOutputStream(new File("/home/zuss/images/image.BMP"));
byte[] buffer = new byte[4096];
int read = in.read(buffer);
while (read >= 0) {
    out.write(buffer, 0, read);
    read = in.read(buffer);
}
out.close();

Making sound play in Java Program

Even though the sun.audio API says that .wav is a supported format, apparently the one that I had must not have been. A .aiff file is now working, but not in this way; I found a better way that's a little more complicated, though.
String strFilename = "C:\\Documents and Settings\\gkehoe\\Network\\GIM\\Explode.aiff";
File soundFile = new File(strFilename);
AudioInputStream audioInputStream = null;
try
{
    audioInputStream = AudioSystem.getAudioInputStream(soundFile);
}
catch (Exception e)
{
    e.printStackTrace();
}
AudioFormat audioFormat = audioInputStream.getFormat();
SourceDataLine line = null;
DataLine.Info info = new DataLine.Info(SourceDataLine.class, audioFormat);
try
{
    line = (SourceDataLine) AudioSystem.getLine(info);
    /*
      The line is there, but it is not yet ready to
      receive audio data. We have to open the line.
    */
    line.open(audioFormat);
}
catch (LineUnavailableException e)
{
    e.printStackTrace();
    System.exit(1);
}
catch (Exception e)
{
    e.printStackTrace();
    System.exit(1);
}
line.start();
int nBytesRead = 0;
byte[] abData = new byte[EXTERNAL_BUFFER_SIZE];
while (nBytesRead != -1)
{
    try
    {
        nBytesRead = audioInputStream.read(abData, 0, abData.length);
    }
    catch (IOException e)
    {
        e.printStackTrace();
    }
    if (nBytesRead >= 0)
    {
        int nBytesWritten = line.write(abData, 0, nBytesRead);
    }
}
line.drain();
/*
  All data are played. We can close the shop.
*/
line.close();
According to the source code, it is not recognized as a supported file format.
Wav files are supported, but there are many variables, and some of them are not supported.
For example, you might get an unrecognized format exception if the wav is encoded at 48000 instead of 44100, or at 24 or 32 bits instead of 16 bit encoding.
What exact error did you get?
What are the specs (properties) of the wav file?
It is possible to convert from one wav to a compatible wav using a tool such as Audacity. A format that I use for wav files has the following properties:
16-bit encoding
little endian
44100 sample rate
stereo
I didn't really look closely at the code example itself. I like this playback example.
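If you want to check those properties programmatically rather than in an audio editor, a small sketch like the following (the path is a placeholder) prints the format Java Sound detects, or throws an UnsupportedAudioFileException if it cannot parse the file at all:

import java.io.File;
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;

public class PrintWavFormat {
    public static void main(String[] args) throws Exception {
        File file = new File("C:\\path\\to\\sound.wav"); // placeholder path
        AudioFileFormat fileFormat = AudioSystem.getAudioFileFormat(file);
        AudioFormat format = fileFormat.getFormat();
        // e.g. "PCM_SIGNED 44100.0 Hz, 16 bit, stereo, 4 bytes/frame, little-endian"
        System.out.println(format);
        System.out.println("Encoding:    " + format.getEncoding());
        System.out.println("Sample rate: " + format.getSampleRate());
        System.out.println("Bits/sample: " + format.getSampleSizeInBits());
        System.out.println("Channels:    " + format.getChannels());
    }
}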
