Even though the sun.audio API says that .wav is a supported format, apparently the one I had was not. An .aiff file is now working, but not through that API; I found a better way, though it's a little more complicated.
String strFilename = "C:\\Documents and Settings\\gkehoe\\Network\\GIM\\Explode.aiff";
File soundFile = new File(strFilename);

// Buffer size for shuttling data from the stream to the line
// (this constant was missing from the original snippet).
final int EXTERNAL_BUFFER_SIZE = 128000;

AudioInputStream audioInputStream = null;
try
{
    audioInputStream = AudioSystem.getAudioInputStream(soundFile);
}
catch (Exception e)
{
    e.printStackTrace();
}

AudioFormat audioFormat = audioInputStream.getFormat();
SourceDataLine line = null;
DataLine.Info info = new DataLine.Info(SourceDataLine.class, audioFormat);
try
{
    line = (SourceDataLine) AudioSystem.getLine(info);
    /*
      The line is there, but it is not yet ready to
      receive audio data. We have to open the line.
    */
    line.open(audioFormat);
}
catch (LineUnavailableException e)
{
    e.printStackTrace();
    System.exit(1);
}
catch (Exception e)
{
    e.printStackTrace();
    System.exit(1);
}

line.start();

int nBytesRead = 0;
byte[] abData = new byte[EXTERNAL_BUFFER_SIZE];
while (nBytesRead != -1)
{
    try
    {
        nBytesRead = audioInputStream.read(abData, 0, abData.length);
    }
    catch (IOException e)
    {
        e.printStackTrace();
    }
    if (nBytesRead >= 0)
    {
        // write() blocks until the data has been handed to the mixer.
        line.write(abData, 0, nBytesRead);
    }
}

line.drain();
/*
  All data has been played; we can close the line.
*/
line.close();
According to the source code, it is not recognized as a supported file format.
Wav files are supported, but there are many variables, and some variants are not supported.
For example, you might get an unrecognized-format exception if the wav is encoded at 48000 Hz instead of 44100 Hz, or at 24 or 32 bits instead of 16-bit encoding.
What exact error did you get?
What are the specs (properties) of the wav file?
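A quick way to check is to print the file's format (a small sketch using AudioSystem.getAudioFileFormat, which reads the header without decoding the audio; the class name is mine):

import java.io.File;
import javax.sound.sampled.AudioSystem;

public class FormatProbe {
    public static void main(String[] args) throws Exception {
        // Prints sample rate, bits per sample, channels, endianness, encoding.
        System.out.println(AudioSystem.getAudioFileFormat(new File(args[0])).getFormat());
    }
}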
It is possible to convert one wav into a compatible wav using a tool such as Audacity. A format that I use for wav files has the following properties (a code sketch follows the list):
16-bit encoding
little endian
44100 sample rate
stereo
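In code, that target corresponds to the following AudioFormat (a minimal sketch; the signed flag is my assumption, since 16-bit wav data is conventionally signed PCM):

// 44100 Hz, 16-bit, stereo, signed PCM, little endian
AudioFormat cdQuality = new AudioFormat(44100f, 16, 2, true, false);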
I didn't look closely at the code example itself, but I like this playback example.
Related
I am trying to take two .wav files, convert them to the same audio format, and then concatenate them. I need only the concatenated file and want to delete the others. The problem is that I cannot delete them, because the AudioInputStreams are not closed. After debugging I discovered that the streams are not closed after this method executes:
private static void convertFilesToSameAudioFormat(String fileName1, String fileName2) {
    try (AudioInputStream clip = AudioSystem.getAudioInputStream(new File(fileName2));
         AudioInputStream clip1 = AudioSystem.getAudioInputStream(new File(fileName1));
         AudioInputStream clip2 = AudioSystem.getAudioInputStream(clip1.getFormat(), clip)) {
        AudioSystem.write(clip2,
                AudioFileFormat.Type.WAVE,
                new File("temp.wav"));
    } catch (Exception e) {
        e.printStackTrace();
    }
}
So after this method executes, the files behind clip, clip1, and clip2 cannot be deleted, because they are still held open by the program.
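One workaround to try (my sketch, not from the original thread): open the underlying FileInputStreams yourself and include them in the try-with-resources, so the OS-level file handles are closed deterministically before you call File.delete(). The BufferedInputStream wrappers are needed because getAudioInputStream(InputStream) requires mark/reset support:

private static void convertFilesToSameAudioFormat(String fileName1, String fileName2) {
    try (InputStream in1 = new BufferedInputStream(new FileInputStream(fileName1));
         InputStream in2 = new BufferedInputStream(new FileInputStream(fileName2));
         AudioInputStream clip1 = AudioSystem.getAudioInputStream(in1);
         AudioInputStream clip = AudioSystem.getAudioInputStream(in2);
         AudioInputStream clip2 = AudioSystem.getAudioInputStream(clip1.getFormat(), clip)) {
        AudioSystem.write(clip2, AudioFileFormat.Type.WAVE, new File("temp.wav"));
    } catch (Exception e) {
        e.printStackTrace();
    }
}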
I'm trying to get an audio stream from a text-to-speech interface (MaryTTS) and stream it in a SIP RTP session (using Peers).
Peers wants a SoundSource to stream audio, which is an interface defined as
public interface SoundSource {
    byte[] readData();
}
and MaryTTS synthesises a String to an AudioInputStream. I tried simply reading the stream and buffering it out to Peers by implementing SoundSource, along the lines of
MaryInterface tts = new LocalMaryInterface();
AudioInputStream audio = tts.generateAudio("This is a test.");
SoundSource soundSource = new SoundSource() {
    @Override
    public byte[] readData() {
        try {
            byte[] buffer = new byte[1024];
            audio.read(buffer);
            return buffer;
        } catch (IOException e) {
            return null;
        }
    }
};
// issue call with soundSource using Peers
// issue call with soundSource using Peers
The phone rings, and I hear a slow, low, noisy sound instead of the synthesised speech. I guess it could be related to the audio format the SIP RTP session expects, since the Peers documentation states:
The sound source must be raw audio with the following format: linear PCM 8kHz, 16 bits signed, mono-channel, little endian.
How can I convert/read the AudioInputStream to satisfy these requirements?
One way I know is this; given the systems you are using, I don't know if it will work for you:
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
try {
    byte[] data = new byte[1024];
    int k;
    while (true) {
        k = audioInputStream.read(data, 0, data.length);
        if (k < 0) break;
        outputStream.write(data, 0, k);
    }
    // linear PCM 8 kHz, 16-bit, mono, signed, little endian
    AudioFormat af = new AudioFormat(8000f, 16, 1, true, false);
    byte[] audioData = outputStream.toByteArray();
    InputStream byteArrayInputStream = new ByteArrayInputStream(audioData);
    AudioInputStream audioInputStream2 = new AudioInputStream(
            byteArrayInputStream, af, audioData.length / af.getFrameSize());
    outputStream.close();
}
catch (Exception ex) { ex.printStackTrace(); }
There is also
AudioSystem.getAudioInputStream(AudioFormat targetFormat, AudioInputStream sourceStream)
which you can use with the above parameters.
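For example, a minimal sketch (audio here is the MaryTTS stream from the question; whether the JVM's installed codecs can convert to 8 kHz mono in one step is system dependent):

// Target: linear PCM 8 kHz, 16-bit signed, mono, little endian
AudioFormat target = new AudioFormat(8000f, 16, 1, true, false);
AudioInputStream converted = AudioSystem.getAudioInputStream(target, audio);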
Hi, I have the following Java program that plays some sounds. I want to play the sounds in order: for example, after sound1 ends I want to play sound2 and then sound3. Below is my Java code and the function that plays a sound.
private void playsound(String file)
{
    try {
        crit = AudioSystem.getClip();
        AudioInputStream inputStream1 = AudioSystem.getAudioInputStream(this.getClass().getResource(file));
        crit.open(inputStream1);
        //if(!crit.isOpen())
        {
            crit.start();
        }
    } catch (Exception ex) {
        System.out.println(ex.getMessage());
    }
}
and calling it as following
playsound("/sounds/filesound1.au");
playsound("/sounds/filesound2.au");
playsound("/sounds/filesound3.au");
The program is playing the sounds in parallel, which I don't want; I want to play them in order.
Regards
I got the following code from somewhere that I can't remember right now, but it plays the clips consecutively:
public static void play(ArrayList<String> files) {
    byte[] buffer = new byte[4096];
    for (String filePath : files) {
        File file = new File(filePath);
        try {
            AudioInputStream is = AudioSystem.getAudioInputStream(file);
            AudioFormat format = is.getFormat();
            SourceDataLine line = AudioSystem.getSourceDataLine(format);
            line.open(format);
            line.start();
            while (is.available() > 0) {
                int len = is.read(buffer);
                line.write(buffer, 0, len); // blocks until the data is written
            }
            line.drain(); // wait for the buffer to empty before closing
            line.close();
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }
}
The reason this plays the files consecutively and not all at the same time is that write() blocks until the requested amount of data has been written. This applies even if the requested amount is greater than the data line's buffer size.
Make sure to include drain() from the code above: drain() waits for the buffer to empty before close() is called.
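A usage example matching the question (hypothetical paths; note that play() takes file paths on disk, whereas the question loaded classpath resources):

ArrayList<String> files = new ArrayList<>(Arrays.asList(
        "sounds/filesound1.au", "sounds/filesound2.au", "sounds/filesound3.au"));
play(files);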
Using Java, is it possible to capture the speaker output? This output is not being generated by my program but rather by other running applications. Can this be done with Java, or will I need to resort to C/C++?
I had a Java based app that used Java Sound to tap into the sound flowing through the system, in order to make a trace of it. It worked well on my own (Windows based) machine, but failed completely on some others.
It was determined that getting it working on those machines would take nothing short of an audio loop-back, in either software or hardware (e.g. connecting a lead from the speaker 'out' jack to the microphone 'in' jack).
Since all I really wanted to do was plot the trace for music, and I had figured out how to play the target format (MP3) in Java, it became unnecessary to pursue the loop-back option further.
(I also heard that Java Sound on Mac was horribly broken, but I never looked closely into it.)
Java is not the best tool for dealing with the OS. If you need or want to use it for this task, you will probably end up using the Java Native Interface (JNI) to link against libraries compiled in other languages (probably C/C++).
Take an AUX cable, connect one end to the headphone jack and the other end to the microphone jack, and run this code:
https://www.codejava.net/coding/capture-and-record-sound-into-wav-file-with-java-sound-api
import javax.sound.sampled.*;
import java.io.*;

public class JavaSoundRecorder {
    // record duration, in milliseconds
    static final long RECORD_TIME = 60000; // 1 minute

    // path of the wav file
    File wavFile = new File("E:/Test/RecordAudio.wav");

    // format of audio file
    AudioFileFormat.Type fileType = AudioFileFormat.Type.WAVE;

    // the line from which audio data is captured
    TargetDataLine line;

    /**
     * Defines an audio format
     */
    AudioFormat getAudioFormat() {
        float sampleRate = 16000;
        int sampleSizeInBits = 8;
        int channels = 2;
        boolean signed = true;
        boolean bigEndian = true;
        AudioFormat format = new AudioFormat(sampleRate, sampleSizeInBits,
                channels, signed, bigEndian);
        return format;
    }

    /**
     * Captures the sound and records it into a WAV file
     */
    void start() {
        try {
            AudioFormat format = getAudioFormat();
            DataLine.Info info = new DataLine.Info(TargetDataLine.class, format);

            // checks if system supports the data line
            if (!AudioSystem.isLineSupported(info)) {
                System.out.println("Line not supported");
                System.exit(0);
            }
            line = (TargetDataLine) AudioSystem.getLine(info);
            line.open(format);
            line.start(); // start capturing

            System.out.println("Start capturing...");

            AudioInputStream ais = new AudioInputStream(line);

            System.out.println("Start recording...");

            // start recording; blocks until the line is stopped and closed
            AudioSystem.write(ais, fileType, wavFile);
        } catch (LineUnavailableException ex) {
            ex.printStackTrace();
        } catch (IOException ioe) {
            ioe.printStackTrace();
        }
    }

    /**
     * Closes the target data line to finish capturing and recording
     */
    void finish() {
        line.stop();
        line.close();
        System.out.println("Finished");
    }

    /**
     * Entry to run the program
     */
    public static void main(String[] args) {
        final JavaSoundRecorder recorder = new JavaSoundRecorder();

        // creates a new thread that waits for a specified amount
        // of time before stopping
        Thread stopper = new Thread(new Runnable() {
            public void run() {
                try {
                    Thread.sleep(RECORD_TIME);
                } catch (InterruptedException ex) {
                    ex.printStackTrace();
                }
                recorder.finish();
            }
        });

        stopper.start();

        // start recording
        recorder.start();
    }
}
I have a Java application whose UI relies heavily on audio. On Windows and OS X, everything works fine; on Linux, however, the application requires exclusive access to the sound device, a LineUnavailableException is thrown and no sound is heard. I'm using Kubuntu 9.10.
This means that no other application can play audio while the program is running, or even be holding the audio device when the program starts. This is naturally unacceptable.
Here is the code I'm using to play audio:
AudioInputStream audioInputStream = AudioSystem.getAudioInputStream(file);
Clip clip = AudioSystem.getClip();
clip.open(audioInputStream);
clip.start();
this.wait((clip.getMicrosecondLength() / 1000) + 100);
clip.stop();
Am I doing something wrong? Is using Java to play audio in Linux a lost cause?
I fear that audio on Linux is a lost cause in itself. But in this case, it really is a known Java bug. You should try to figure out which sound architecture you are using. I think the default for Ubuntu is PulseAudio/ALSA; I'm not sure about Kubuntu, though.
There is a known workaround (I never tried it myself though).
It's also possible that some other application you're running is using the sound card exclusively, so make sure to test with different applications (i.e. applications that play nicely with others).
I was able to play audio on GNU/Linux (Ubuntu 10.10) using the OpenJDK with some tweaks. I believe the LineUnavailableException was caused by a bug in PulseAudio that was fixed in 10.10.
I needed to specify the format (something not needed on Windows).
AudioInputStream audioIn = AudioSystem.getAudioInputStream(in);
// needed for working on GNU/Linux (OpenJDK) {
AudioFormat format = audioIn.getFormat();
DataLine.Info info = new DataLine.Info(Clip.class, format);
Clip clip = (Clip) AudioSystem.getLine(info);
// }
// on Windows, {
//Clip clip = AudioSystem.getClip();
// }
Be aware that, at least on this setup, the call to Clip.getMicrosecondLength() returned milliseconds, despite what the name suggests.
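As an alternative to sleeping for a computed duration, one sketch (my addition, not from the thread) is to block on a java.util.concurrent.CountDownLatch until the Clip fires its STOP event:

CountDownLatch done = new CountDownLatch(1);
clip.addLineListener(event -> {
    if (event.getType() == LineEvent.Type.STOP) {
        done.countDown(); // playback finished
    }
});
clip.start();
done.await(); // throws InterruptedException; blocks until STOP fires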
Java Sound is terrible for high-precision or low-latency tasks, and almost totally dysfunctional on Linux. Abandon ship now before you sink more time into it.
After Java Sound I tried OpenAL which wasn't great on Linux either.
Currently I'm using FMOD which is unfortunately closed-source.
The open source way to go would probably be PortAudio. Try talking to the SIP Communicator devs.
I also tried RtAudio but found it had bugs with its ALSA implementation.
Send an mplayer command through a shell. It's the easiest solution.
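A minimal sketch of that approach (assuming mplayer is installed and on the PATH; the file path is illustrative):

// Launch mplayer as an external process and wait for playback to finish.
Process p = new ProcessBuilder("mplayer", "sound.wav").inheritIO().start();
p.waitFor(); // throws InterruptedException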
I got this code from somewhere on the internet. The sound plays most of the time, but occasionally it doesn't come up.
import java.io.*;
import javax.sound.sampled.*;

public class Sound2
{
    public static void main(String name[])
    {
        playSound("somesound.wav");
    }

    public static void playSound(String filename)
    {
        int BUFFER_SIZE = 128000;
        //File soundFile = null;
        AudioInputStream audioStream = null;
        AudioFormat audioFormat = null;
        SourceDataLine sourceLine = null;

        try
        {
            audioStream = AudioSystem.getAudioInputStream(
                    new BufferedInputStream(new FileInputStream(filename)));
        }
        catch (Exception e)
        {
            e.printStackTrace();
            System.exit(1);
        }

        audioFormat = audioStream.getFormat();
        DataLine.Info info = new DataLine.Info(SourceDataLine.class, audioFormat);

        try
        {
            sourceLine = (SourceDataLine) AudioSystem.getLine(info);
            sourceLine.open(audioFormat);
        }
        catch (LineUnavailableException e)
        {
            e.printStackTrace();
            System.exit(1);
        }
        catch (Exception e)
        {
            e.printStackTrace();
            System.exit(1);
        }

        sourceLine.start();

        int nBytesRead = 0;
        byte[] abData = new byte[BUFFER_SIZE];
        while (nBytesRead != -1)
        {
            try
            {
                nBytesRead = audioStream.read(abData, 0, abData.length);
            }
            catch (IOException e)
            {
                e.printStackTrace();
            }
            if (nBytesRead >= 0)
            {
                sourceLine.write(abData, 0, nBytesRead);
            }
        }

        sourceLine.drain();
        sourceLine.close();
    }
}
}