I'm trying to load a .wav file in my EnviromentAudio object, but all I get is an UnsupportedAudioFileException and I don't know why. The file is a .wav, and I've tried re-encoding it as unsigned 8-bit, as signed 16-bit, at a 44100 Hz sample rate, as GSM, as A-law... long story short, I've tried a lot of encodings, as many people suggested, but none of them worked. I'm probably missing something, so I want to ask what I'm doing wrong.
EDIT:
As pointed out, I should have specified some things. First, for context: I am using Java 8 to create a little PC game for a project, which must use only the basic components of Java. That said, I'm using the ClassLoader because the project folder is a bit of a mess: it does not follow the usual convention and I have to keep it that way. It's structured like this:
-src
  -app
    -audio
      EnviromentAudio.java // Class that needs to load Soundtrack.wav
-res
  -audio
    Soundtrack.wav // Audio to be loaded
I know that a getResource path should supposedly always start with a /, but if I add that slash, every attempt to get a resource results in an NPE. That's probably caused by the folder layout; also, the resources folder is set as a source folder, so I'm not even sure the slash is needed, since I've already used getResource to get other files without problems.
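For reference, here is roughly how I understand the two lookup styles, as a small sketch; the resource paths are only guesses based on the layout above (assuming res is the source folder whose contents end up at the classpath root), so treat them as assumptions:
import java.net.URL;
public class ResourceLookupCheck {
    public static void main(String[] args) {
        // ClassLoader paths are resolved against the classpath root and must NOT start with "/";
        // resource names always use '/' as the separator, regardless of the OS.
        URL viaLoader = ResourceLookupCheck.class.getClassLoader()
                .getResource("audio/Soundtrack.wav");
        // Class paths are resolved relative to the class's package unless they start with "/",
        // in which case they also start from the classpath root.
        URL viaClass = ResourceLookupCheck.class.getResource("/audio/Soundtrack.wav");
        System.out.println("via ClassLoader: " + viaLoader); // null means "not found"
        System.out.println("via Class:       " + viaClass);
    }
}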
In this case getResource works fine, i.e. it retrieves the file, but AudioSystem throws the exception. I've tried to isolate the parts involved, but the only problem seems to be here. I'm adding the AudioManager class, the Audio class that EnviromentAudio extends, and the whole EnviromentAudio class, in the hope that it helps with understanding. I also provided a main in the AudioManager class, which should be enough to reproduce the error.
Audio class:
package application.core.audio;
import java.util.ArrayList;
import javax.sound.sampled.Clip;
import javax.sound.sampled.FloatControl;
import javax.sound.sampled.LineUnavailableException;
import javax.swing.JOptionPane;
public abstract class Audio
{
protected static final String AUDIOERROR="Error in loading audio. "
+ "Execution Failed, please, restart the game. ";
protected static final String AUDIOERRORTITLE="Audio loading error";
protected ArrayList<Clip> multimedia;
protected Clip currentAudio;
protected FloatControl gainControl;
public Audio() {
multimedia=new ArrayList<Clip>();
currentAudio=null;
}
protected abstract void getResources();
public void playAudio(int index) {
try
{
currentAudio=multimedia.get(index);
gainControl=(FloatControl) currentAudio.getControl(
FloatControl.Type.MASTER_GAIN);
currentAudio.open();
} catch (LineUnavailableException e)
{
e.printStackTrace();
JOptionPane.showMessageDialog(null, AUDIOERROR,
AUDIOERRORTITLE, JOptionPane.ERROR_MESSAGE);
}
currentAudio.start();
}
public void loopAudio(int index) {
currentAudio=multimedia.get(index);
// gainControl=(FloatControl) currentAudio.getControl(
// FloatControl.Type.MASTER_GAIN);
// currentAudio.open();
// currentAudio.start();
currentAudio.loop(Clip.LOOP_CONTINUOUSLY);
}
public void repeatAudio(int index, int times) {
try
{
currentAudio=multimedia.get(index);
gainControl=(FloatControl) currentAudio.getControl(
FloatControl.Type.MASTER_GAIN);
currentAudio.open();
} catch (LineUnavailableException e)
{
e.printStackTrace();
JOptionPane.showMessageDialog(null, AUDIOERROR,
AUDIOERRORTITLE, JOptionPane.ERROR_MESSAGE);
}
currentAudio.loop(times);
}
public void stopAudio(int index) {
multimedia.get(index).stop();
multimedia.get(index).close();
}
public void setVolume(float volume) {
float range=gainControl.getMaximum()-gainControl.getMinimum();
float gain=(range-volume)+gainControl.getMinimum();
gainControl.setValue(gain);
}
public boolean currentAudioIsOpen() {return currentAudio.isOpen();}
public void openCurrentAudio() {
if (!currentAudio.isOpen())
try
{
currentAudio.open();
} catch (LineUnavailableException e)
{
e.printStackTrace();
JOptionPane.showMessageDialog(null, AUDIOERROR,
AUDIOERRORTITLE, JOptionPane.ERROR_MESSAGE);
}
}
public void openAndPlayCurrentAudio() {
if (!currentAudio.isOpen())
openCurrentAudio();
currentAudio.start();
}
public void playCurrentAudio() {currentAudio.start();}
public void loopCurrentAudio() {currentAudio.loop(Clip.LOOP_CONTINUOUSLY);}
public void repeatCurrentAudio(int times) {currentAudio.loop(times);}
public void stopCurrentAudio() {currentAudio.stop();}
public void stopAndCloseCurrentAudio() {
currentAudio.stop();
currentAudio.close();
}
}
This is my EnviromentAudio class, which produces the exception:
package application.core.audio;
import java.io.File;
import java.io.IOException;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.FloatControl;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.UnsupportedAudioFileException;
public class EnviromentAudio extends Audio
{
public static final int SOUNDTRACK=0;
public EnviromentAudio()
{
super();
getResources();
this.gainControl=(FloatControl) currentAudio.getControl(FloatControl.Type.MASTER_GAIN);
}
@Override
protected void getResources()
{
try
{
ClassLoader loader=EnviromentAudio.class.getClassLoader();
multimedia.add(AudioSystem.getClip());
multimedia.get(SOUNDTRACK).open(AudioSystem.getAudioInputStream( // here the exception is thrown (on getAudioInputStream)
loader.getResourceAsStream("resources"+File.separator+"audio"+File.separator+
"soundtrack"+File.separator+"igpeSoundtrack.wav")));
currentAudio=multimedia.get(SOUNDTRACK);
} catch (LineUnavailableException e)
{
e.printStackTrace();
} catch (IOException | UnsupportedAudioFileException e1)
{
e1.printStackTrace();
}
}
}
AudioManager class:
package application.core.audio;
public class AudioManager
{
private static AudioManager instance=null;
private EnviromentAudio soundtrack;
private PlayerAudio playerAudio;
private AudioManager() {
soundtrack=new EnviromentAudio();
// playerAudio=new PlayerAudio();
soundtrack.loopAudio(EnviromentAudio.SOUNDTRACK);
}
public static AudioManager getInstance() {
if (instance==null)
instance=new AudioManager();
return instance;
}
public Audio getSoundtrack() {return soundtrack;}
public Audio getPlayerSounds() {return playerAudio;}
public void setVolume(float volume) {
soundtrack.setVolume(volume);
playerAudio.setVolume(volume);
}
public float getVolume() {return soundtrack.gainControl.getValue();}
public static void main(String[] args)
{
AudioManager a=AudioManager.getInstance();
}
}
And here is the error:
javax.sound.sampled.UnsupportedAudioFileException: Stream of unsupported format
at java.desktop/javax.sound.sampled.AudioSystem.getAudioInputStream(AudioSystem.java:1020)
at application.core.audio.EnviromentAudio.getResources(EnviromentAudio.java:29)
at application.core.audio.EnviromentAudio.<init>(EnviromentAudio.java:18)
at application.core.audio.AudioManager.<init>(AudioManager.java:11)
at application.core.audio.AudioManager.getInstance(AudioManager.java:19)
at application.MainApplication.audioInitialize(MainApplication.java:44)
at application.MainApplication.main(MainApplication.java:25)
This is more to help with troubleshooting than a solution (expanding on Andrew Thompson's suggestion of making an MRE). Are you using a particular framework, or is it something of your own making? For a second I thought it might be Android (due to the presence of AudioManager).
Following is a more minimal example for play testing your .wav file. Put the wav file in the same folder as this class. Does your .wav file play when using this?
import java.io.IOException;
import java.net.URL;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Clip;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.UnsupportedAudioFileException;
public class BasicClipExample {
public static void main(String[] args) {
try {
BasicClipExample.run();
} catch (UnsupportedAudioFileException | IOException
| LineUnavailableException | InterruptedException e) {
e.printStackTrace();
}
}
private static void run() throws UnsupportedAudioFileException,
IOException, LineUnavailableException, InterruptedException
{
String filename = "yourSound.wav";
URL url = BasicClipExample.class.getResource(filename);
AudioInputStream ais = AudioSystem.getAudioInputStream(url);
DataLine.Info info = new DataLine.Info(Clip.class, ais.getFormat());
Clip clip = (Clip) AudioSystem.getLine(info);
clip.open(ais);
clip.start();
Thread.sleep(6000); // plays up to 6 seconds of sound before exiting
clip.close();
}
}
If it works, then something is odd about your framing code. From here you can progressively check if things like the file separator logic are working. You can also add some lines to print out the AudioFormat if the file loads.
Another way I sometimes inspect files is to load them into Audacity, which is free. Info about the file format is pretty easy to inspect with that tool. If I had to wager, and the issue IS the .wav format, I'm guessing that the file is recorded at a higher quality level than Java is set to work with, e.g., 48000 fps (which Java may support) or 96000 fps, or 24- or 32-bit encoding rather than 16-bit.
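If you'd rather dump the format from Java than install anything, a quick throwaway check along these lines should do it (the file path is just a placeholder; point it at your actual .wav). Note that if the header itself is the problem, this call throws the same UnsupportedAudioFileException, which is also informative:
import java.io.File;
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioSystem;
public class FormatCheck {
    public static void main(String[] args) throws Exception {
        File file = new File("igpeSoundtrack.wav"); // placeholder path
        AudioFileFormat fileFormat = AudioSystem.getAudioFileFormat(file);
        System.out.println("File type: " + fileFormat.getType());
        // AudioFormat.toString() includes encoding, sample rate, bits per sample and channels
        System.out.println("Format:    " + fileFormat.getFormat());
    }
}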
Related
I wrote this audio class, which is used by my sound player class. It works correctly with no errors, but it causes my game to hiccup or jerk when the sound is used in quick succession (i.e. rapid jumping). I thought that by loading it once and rewinding it each time (with the mark and reset methods) the sound would stay in RAM, but that doesn't seem to be happening. I am open to any changes or a total rewrite; I just need it to work. I've been stuck on audio for months.
package com.bigdirty1985.squaresthatmove.sound;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import sun.audio.AudioPlayer;
import sun.audio.AudioStream;
public class Sound {
@SuppressWarnings("restriction")
private AudioStream audioStream;
private File file;
@SuppressWarnings("restriction")
public Sound(File file) {
this.file = file;
try {
this.audioStream = new AudioStream(new FileInputStream(this.file));
this.audioStream.mark(100000000);
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
@SuppressWarnings("restriction")
public void play() {
AudioPlayer.player.stop(this.audioStream);
AudioPlayer.player.start(this.audioStream);
try {
this.audioStream.reset();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
@SuppressWarnings("restriction")
public void stop() {
AudioPlayer.player.stop(this.audioStream);
}
}
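For reference, the "load once, rewind before each play" idea can also be sketched with the standard javax.sound.sampled Clip API; this is only an illustration of the approach, not a tested drop-in replacement for the class above:
import java.io.File;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Clip;
public class PreloadedSound {
    private final Clip clip;
    // The whole file is read into the Clip once, so the audio data stays in memory.
    public PreloadedSound(File file) throws Exception {
        AudioInputStream ais = AudioSystem.getAudioInputStream(file);
        clip = AudioSystem.getClip();
        clip.open(ais);
    }
    // Rewind and play; calling this repeatedly does not reload the file.
    public void play() {
        if (clip.isRunning()) {
            clip.stop();
        }
        clip.setFramePosition(0);
        clip.start();
    }
    public void close() {
        clip.close();
    }
}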
I have the following class, which stores all the sounds I'm going to use for a game I'm making. However, when I try to test the JAR file to see if it works properly, I get the following error from the command prompt:
Exception in thread "main" java.lang.NullPointerException
at com.sun.media.sound.StandardMidiFileReader.getSequence(Unknown Source)
at javax.sound.midi.MidiSystem.getSequence(Unknown Source)
at com.sun.media.sound.SoftMidiAudioFileReader.getAudioInputStream(Unknown Source)
at javax.sound.sampled.AudioSystem.getAudioInputStream(Unknown Source)
at Assets.AudioStreaming.loadMusicAssets(AudioStreaming.java:27)
at CrossyMain.MainMenu.<init>(MainMenu.java:64)
at CrossyMain.Principal.main(Principal.java:18)
This is the class where I'm storing all the audio assets:
package Assets;
import java.io.IOException;
import javax.sound.sampled.*;
public class AudioStreaming {
private static Clip mainMenu,BGM,coinP,gameOver,trainAlert,birdGO;
private static AudioInputStream in_mainMenu,in_BGM,in_coinP,in_gameOver,in_trainAlert,in_birdGO;
public static void loadMusicAssets(){
try{
/*Background*/
BGM = AudioSystem.getClip();
in_BGM = AudioSystem.getAudioInputStream(AudioStreaming.class.getResource("/sound/background/NeonW.wav"));
BGM.open(in_BGM);
coinP = AudioSystem.getClip();
in_coinP = AudioSystem.getAudioInputStream(AudioStreaming.class.getResource("/sound/entityFx/coinPickup.wav"));
coinP.open(in_coinP);
gameOver = AudioSystem.getClip();
in_gameOver = AudioSystem.getAudioInputStream(AudioStreaming.class.getResource("/sound/entityFx/gameOver.wav"));
gameOver.open(in_gameOver);
trainAlert = AudioSystem.getClip();
in_trainAlert = AudioSystem.getAudioInputStream(AudioStreaming.class.getResource("/sound/entityFx/trainWarning.wav"));
trainAlert.open(in_trainAlert);
birdGO = AudioSystem.getClip();
in_birdGO = AudioSystem.getAudioInputStream(AudioStreaming.class.getResource("/sound/entityFx/eaglePickup.wav"));
birdGO.open(in_birdGO);
mainMenu = AudioSystem.getClip();
in_mainMenu = AudioSystem.getAudioInputStream(AudioStreaming.class.getResource("/sound/background/NeonValley.wav"));
mainMenu.open(in_mainMenu);
}catch(LineUnavailableException | UnsupportedAudioFileException | IOException e){
e.printStackTrace();
}
}
public static void playBGM(){
BGM.setFramePosition(0);
BGM.start();
}
public static void stopBGM(){
BGM.stop();
BGM.close();
}
public static void playMainMenu(){
mainMenu.setFramePosition(0);
mainMenu.start();
}
public static void stopMainMenu(){
mainMenu.stop();
mainMenu.close();
}
public static void playTrainWarning(){
trainAlert.setFramePosition(0);
trainAlert.start();
}
public static void stopTrainWarning(){
trainAlert.stop();
trainAlert.close();
}
public static void playBirdGO(){
birdGO.setFramePosition(0);
birdGO.start();
}
public static void playCoinP(){
coinP.setFramePosition(0);
coinP.start();
}
public static void playGameOver(){
gameOver.setFramePosition(0);
gameOver.start();
}
}
It works perfectly within NetBeans with no errors whatsoever, and the JAR file builds successfully, but I noticed this error from the command prompt because the JAR didn't want to open just by double-clicking it. Has anyone experienced this error? If so, how did you manage to load your sound assets from a JAR file? Any tips/advice would be highly appreciated. Thanks for your time!
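One thing worth checking first: a trace like that is what you can get when getResource(...) returns null once the classes run from the jar (for example because of a case or packaging mismatch in the path). A small guard makes the failure readable; openStream below is a hypothetical helper for the AudioStreaming class above, not something it already has:
// Hypothetical helper for AudioStreaming: fail with a clear message instead of a
// deep NullPointerException when the resource is not actually packaged in the jar.
private static AudioInputStream openStream(String path)
        throws UnsupportedAudioFileException, IOException {
    java.net.URL url = AudioStreaming.class.getResource(path);
    if (url == null) {
        throw new IOException("Resource not found on classpath: " + path);
    }
    return AudioSystem.getAudioInputStream(url);
}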
(I'm attempting to make my previous question more generic in the hopes of a solution.)
I am using the JLayer library and a sample.mp3 file. I would like to play AND decode the file at the same time.
However, I want them to be synchronized - if a part of the song is decoded, it is also played. Nothing is decoded before it is played and vice versa (to a reasonable degree, of course).
Here is how a song is played and decoded, respectively:
Player p = new Player(mp3stream);
p.play();
Decoder d = new Decoder();
Bitstream bs = new Bitstream(mp3stream);
SampleBuffer s = (SampleBuffer) d.decodeFrame(bs.readFrame(), bs);
// ... for processing the SampleBuffer but irrelevant for the question
I currently use:
InputStream mp3stream = new FileInputStream("sample.mp3");
but this uses the whole song at once, so I am unable to synchronize. Is there a way to break sample.mp3 into pieces that can be handled by both processes? If I had small enough pieces, I could feed a piece to both processes, wait until both finished, then grab the next small piece and repeat until I ran out of pieces.
Note: I have tried using ByteArrayInputStream with no success - but perhaps my methodology is incorrect when using it.
I hope I get this right:
You have a single input file.
You want two different input streams to be synchronized, in the sense that they must make the same progress through the stream.
This is an interesting question. I came up with the following sketch (it compiles, but I didn't execute it, so you may want to do a little testing first).
Create a wrapper object "StreamSynchronizer" that controls access to the underlying input. Only a single byte is read at a time, and the next byte is not read until all derived streams have read the current one.
Derive any number of "SynchronizedStream" instances from this; they delegate the "read" back to the StreamSynchronizer.
package de.mit.stackoverflow;
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
public class StreamSynchronizer {
final private InputStream inputStream;
private List<SynchronizedStream> activeStreams = new ArrayList<SynchronizedStream>();
private int lastByte;
private Set<SynchronizedStream> waitingStreams = new HashSet<SynchronizedStream>();
private Object lock = new Object();
public StreamSynchronizer(InputStream is) throws IOException {
super();
this.inputStream = is;
lastByte = getInputStream().read();
}
public void close(SynchronizedStream stream) {
activeStreams.remove(stream);
}
public SynchronizedStream createStream() {
SynchronizedStream stream = new SynchronizedStream(this);
activeStreams.add(stream);
return stream;
}
public InputStream getInputStream() {
return inputStream;
}
public int read(SynchronizedStream stream) throws IOException {
synchronized (lock) {
while (waitingStreams.contains(stream)) {
if (waitingStreams.size() == activeStreams.size()) {
waitingStreams.clear();
lastByte = getInputStream().read();
lock.notifyAll();
} else {
try {
lock.wait();
} catch (InterruptedException e) {
throw new IOException(e);
}
}
}
waitingStreams.add(stream);
return lastByte;
}
}
}
package de.mit.stackoverflow;
import java.io.IOException;
import java.io.InputStream;
public class SynchronizedStream extends InputStream {
final private StreamSynchronizer synchronizer;
protected SynchronizedStream(StreamSynchronizer synchronizer) {
this.synchronizer = synchronizer;
}
@Override
public void close() throws IOException {
getSynchronizer().close(this);
}
public StreamSynchronizer getSynchronizer() {
return synchronizer;
}
@Override
public int read() throws IOException {
return getSynchronizer().read(this);
}
}
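Usage would then look something like the sketch below; the byte-consuming loops are only placeholders for the Player and the Bitstream/Decoder loops from your code:
package de.mit.stackoverflow;
import java.io.FileInputStream;
import java.io.InputStream;
public class SynchronizerDemo {
    public static void main(String[] args) throws Exception {
        StreamSynchronizer sync =
                new StreamSynchronizer(new FileInputStream("sample.mp3"));
        final InputStream forPlayer = sync.createStream();
        final InputStream forDecoder = sync.createStream();
        // Each consumer runs in its own thread. Neither stream can get more than one
        // byte ahead of the other, because StreamSynchronizer only advances once every
        // active stream has read the current byte.
        new Thread(new Runnable() {
            public void run() { consume(forPlayer); } // stand-in for new Player(forPlayer).play()
        }, "player").start();
        new Thread(new Runnable() {
            public void run() { consume(forDecoder); } // stand-in for the Bitstream/Decoder loop
        }, "decoder").start();
    }
    private static void consume(InputStream in) {
        try {
            while (in.read() != -1) {
                // hand the byte to the real consumer here
            }
            in.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}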
I have a problem while recording audio. I created a servlet and modified the Java Sound API demo code to some extent, and I can finally record audio. The problem is that when I play the audio back, the total time of the stored audio shows as 645.45 or something like that, even though I recorded for only a couple of minutes. Another problem is that the audio is saved in the Eclipse directory instead of the project directory.
This is the servlet code.
package com;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.Clip;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.SourceDataLine;
import javax.sound.sampled.TargetDataLine;
public class SoundRecorder extends HttpServlet {
private static final long serialVersionUID = 1L;
static protected boolean running;
static ByteArrayOutputStream out;
double fileName = Math.random();
//strFilename = nowLong.toString();
public SoundRecorder() {
System.out.println("Filename will be..." + fileName + ".wav");
}
public void init() {
}
public void destroy() {
}
public void doGet(HttpServletRequest request, HttpServletResponse response)
throws ServletException, IOException {
System.out.println("call received..");
String method = request.getParameter("method");
System.out.println(method);
if("record".equalsIgnoreCase(method)) {
captureAudio(true);
}
else if("stop".equalsIgnoreCase(method)) {
captureAudio(false);
}
else if("play".equalsIgnoreCase(method)) {
System.out.println("yet to write");
playAudio();
}
}
public void doPost(HttpServletRequest request, HttpServletResponse response)
throws ServletException, IOException {
System.out.println("call received..");
String method = request.getParameter("method");
System.out.println(method);
doGet(request, response);
}
private void captureAudio(boolean capturing) {
File outputFile = new File(fileName + ".wav");
AudioFormat audioFormat = new AudioFormat(AudioFormat.Encoding.PCM_SIGNED,44100.0F, 16, 2, 4, 44100.0F, false);
DataLine.Info info = new DataLine.Info(TargetDataLine.class, audioFormat);
TargetDataLine targetDataLine = null;
try
{
targetDataLine = (TargetDataLine) AudioSystem.getLine(info);
targetDataLine.open(audioFormat);
}
catch (LineUnavailableException e)
{
System.out.println("unable to get a recording line");
e.printStackTrace();
System.exit(1);
}
AudioFileFormat.Type targetType = AudioFileFormat.Type.WAVE;
final Recorder recorder = new Recorder(targetDataLine,targetType,outputFile);
System.out.println("Recording...");
if(capturing){
recorder.start();
}
else {
recorder.stopRecording();
}
}
private void playAudio() {
try {
File file = new File(fileName + ".wav");
AudioInputStream stream = AudioSystem.getAudioInputStream(file);
AudioFormat format = stream.getFormat();
DataLine.Info info = new DataLine.Info(Clip.class, stream.getFormat());
Clip clip = (Clip) AudioSystem.getLine(info);
clip.open(stream);
clip.start();
} catch (Exception e) {
System.err.println("Line unavailable: " + e);
System.exit(-4);
}
}
}
And this is the recorder class
public class Recorder extends Thread {
private TargetDataLine m_line;
private AudioFileFormat.Type m_targetType;
private AudioInputStream m_audioInputStream;
private File m_outputFile;
public Recorder(TargetDataLine line,
AudioFileFormat.Type targetType,
File file)
{
m_line = line;
m_audioInputStream = new AudioInputStream(line);
m_targetType = targetType;
m_outputFile = file;
}
/** Starts the recording.
To accomplish this, (i) the line is started and (ii) the
thread is started.
*/
public void start()
{
m_line.start();
super.start();
}
/** Stops the recording.
*/
public void stopRecording()
{
m_line.stop();
m_line.close();
}
/** Main working method.
*/
public void run()
{
try
{
AudioSystem.write(
m_audioInputStream,
m_targetType,
m_outputFile);
}
catch (IOException e)
{
e.printStackTrace();
}
}
private static void closeProgram()
{
System.out.println("Program closing.....");
System.exit(1);
}
private static void out(String strMessage)
{
System.out.println(strMessage);
}
}
When developing with servlets, you need to realize that there's only one servlet instance throughout the whole webapp's lifetime, from startup until shutdown. So, the HTTP requests from all visitors, all sessions, all browser windows/tabs, etc will all share the same servlet instance. Also, when you make a variable static, it will be shared among all instances of the same class (which is not really relevant here since there's only one servlet instance anyway).
In other words, those variables which you've declared in the servlet are not threadsafe:
static protected boolean running;
static ByteArrayOutputStream out;
double fileName = Math.random();
There's only one of each, and they are used by all visitors simultaneously. For the first two variables, which are continuously modified, this will lead to major thread-safety problems, and for the third variable it means that all visitors record to the very same file. You need to declare them inside the doGet() block. You'd like to store the recording in the session under a unique request-based token as key, and then pass that key to the subsequent requests.
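A rough sketch of that idea (the token parameter, the attribute name and the createRecorder() factory are made up for illustration; you'd also need the javax.servlet.http.HttpSession import):
public void doGet(HttpServletRequest request, HttpServletResponse response)
        throws ServletException, IOException {
    // Local variables instead of (static) fields: every request gets its own copy.
    String method = request.getParameter("method");
    String token = request.getParameter("token"); // unique per recording, supplied by the client
    HttpSession session = request.getSession();
    String attribute = "recorder-" + token;
    if ("record".equalsIgnoreCase(method)) {
        Recorder recorder = createRecorder(token); // hypothetical factory building the TargetDataLine, file, etc.
        session.setAttribute(attribute, recorder);
        recorder.start();
    } else if ("stop".equalsIgnoreCase(method)) {
        Recorder recorder = (Recorder) session.getAttribute(attribute);
        if (recorder != null) {
            recorder.stopRecording();
            session.removeAttribute(attribute);
        }
    }
}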
As to the problem of the file being saved in an unexpected location: when you use relative paths in java.io.File in a servlet, they are resolved relative to the directory from which the webserver was started. If you start it from inside Eclipse, the file is saved in the Eclipse directory. You'd like to use an absolute path in java.io.File instead. If your intent is to save it in the public webcontent (where your JSPs and the /WEB-INF folder are located), then you need ServletContext#getRealPath() to convert a web path to an absolute disk path.
String relativeWebPath = "filename.ext";
String absoluteDiskPath = getServletContext().getRealPath(relativeWebPath);
File file = new File(absoluteDiskPath);
There's however another problem with this: all files will get erased whenever you redeploy the webapp. If you want a bit more permanent storage, then you'd like to store it outside the web project. E.g. C:/path/to/recordings.
File file = new File("C:/path/to/recordings/filename.ext");
I am trying to run a program using FreeTTS. I am able to compile the program, however I am not able to use the kevin or mbrola voices. I get the following output message at the end:
System property "mbrola.base" is undefined. Will not use MBROLA voices.
LINE UNAVAILABLE: Format is pcm_signed 16000.0 Hz 16 bits 1 channel big endian
import javax.speech.*;
import javax.speech.synthesis.*;
import java.util.*;
class freetts {
public static void main(String[] args) {
try{
Calendar calendar = new GregorianCalendar();
String sayTime = "It is " + calendar.get(Calendar.HOUR) + " " + calendar.get(Calendar.MINUTE) + " " + (calendar.get(Calendar.AM_PM)==0 ? "AM":"PM");
Synthesizer synth = Central.createSynthesizer(null);
synth.allocate();
synth.resume();
synth.speakPlainText(sayTime, null);
synth.waitEngineState(Synthesizer.QUEUE_EMPTY);
synth.deallocate();
}
catch(Exception e){
e.printStackTrace();
}
}
}
It seems that "To enable FreeTTS support for MBROLA, merely copy mbrola/mbrola.jar to lib/mbrola.jar. Then, whenever you run any FreeTTS application, specify the "mbrola.base" directory as a system property:
java -Dmbrola.base=/home/jim/mbrola -jar bin/FreeTTSHelloWorld.jar mbrola_us1"
I found this at:
http://freetts.sourceforge.net/mbrola/README.html
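If you'd rather not pass the -D flag on the command line, the same property can also be set from code before the synthesizer is created; the path below is only a placeholder for wherever you unpacked MBROLA:
// Must run before Central.createSynthesizer(...); replace the path with your MBROLA directory.
System.setProperty("mbrola.base", "/home/jim/mbrola");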
http://workorhobby.blogspot.com/2011/02/java-audio-freetts-line-unavailable.html
A big thanks to the author.
A program based on FreeTTS, the free text-to-speech engine for Java, was getting occasional errors
"LINE UNAVAILABLE: Format is ..."
It turns out there is no Java exception or other mechanism to detect this error, which occurs inside the FreeTTS library. All you get is the message on System.out, so there is no good way to react programmatically.
Workaround: Configure the FreeTTS audio player to attempt accessing the audio device more than once until it succeeds. In this example, a short delay of 0.1 seconds is used to not miss an opportunity to grab the audio device; we keep trying for 30 seconds:
System.setProperty("com.sun.speech.freetts.audio.AudioPlayer.openFailDelayMs", "100");
System.setProperty("com.sun.speech.freetts.audio.AudioPlayer.totalOpenFailDelayMs", "30000");
If the audio device is permanently used by another program, there is of course no way to get access. Under Linux, this command will display the ID of the process that is currently holding the audio device, so you can then try to get rid of the offending program:
/sbin/fuser /dev/dsp
The second message has nothing to do with MBROLA; it is caused by a horrendous Java Linux sound bug that is still not fixed.
Check the third post here:
https://forums.oracle.com/forums/thread.jspa?threadID=2206163
That is happening because FreeTTS "trusts" the SourceDataLine instead of applying the workaround from that post. The bug is in the JDK, but it can be worked around by finding where in FreeTTS this happens, inserting the workaround and recompiling.
Here is a testcase
package util.speech;
import java.util.Iterator;
import java.util.Locale;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.Mixer;
import javax.sound.sampled.SourceDataLine;
import org.junit.After;
import org.junit.AfterClass;
import org.junit.Assume;
import org.junit.Before;
import org.junit.BeforeClass;
import org.junit.Test;
import static org.junit.Assert.*;
public class VoiceTest {
public VoiceTest() {
}
@BeforeClass
public static void setUpClass() throws Exception {
}
@AfterClass
public static void tearDownClass() throws Exception {
}
@Before
public void setUp() {
}
@After
public void tearDown() {
}
@Test
public void testDataLineAvailableAndBuggyInJDK() throws LineUnavailableException {
boolean failedSimpleGetLine = false;
AudioFormat format = new AudioFormat(44100, 16, 2, true, false);
SourceDataLine line = null;
DataLine.Info info = new DataLine.Info(SourceDataLine.class, format);
try {
line = (SourceDataLine) AudioSystem.getLine(info);
} catch (LineUnavailableException e) {
//ok, at least it says so
throw e;
}
try {
//if this fails the jdk is very buggy, since it just told us
//the line was available
line.open(format);
} catch (LineUnavailableException e) {
failedSimpleGetLine = true;
} finally {
if (line.isOpen()) {
line.close();
}
}
//now if this is true, test if it's possible to get a valid sourcedataline
//or the only bug is that acquiring a sourcedataline doesn't throw a lineunavailable
//exception before open
Assume.assumeTrue(failedSimpleGetLine);
line = getSourceDataLine(format);
if (line == null) {
return;
}
try {
line.open(format);
} catch (LineUnavailableException e) {
//ok then it is consistent, and there is only one bug
fail("Line Unavailable after being adquired");
} finally {
if (line.isOpen()) {
line.close();
}
}
fail("line available after first test not managing to adquire it");
}
private SourceDataLine getSourceDataLine(AudioFormat format) {
try {
DataLine.Info info = new DataLine.Info(SourceDataLine.class, format);
for (Mixer.Info mi : AudioSystem.getMixerInfo()) {
SourceDataLine dataline = null;
try {
Mixer mixer = AudioSystem.getMixer(mi);
dataline = (SourceDataLine) mixer.getLine(info);
dataline.open(format);
dataline.start();
return dataline;
} catch (Exception e) {
}
if (dataline != null) {
try {
dataline.close();
} catch (Exception e) {
}
}
}
} catch (Exception e) {
}
return null;
}
}
I know I am posting this a little late, but it may help someone. I tried with both kevin and mbrola, and it worked for me. Please find the code below.
package com.mani.texttospeech;
import java.beans.PropertyVetoException;
import java.util.Locale;
import javax.speech.AudioException;
import javax.speech.Central;
import javax.speech.EngineException;
import javax.speech.EngineStateError;
import javax.speech.synthesis.Synthesizer;
import javax.speech.synthesis.SynthesizerModeDesc;
import javax.speech.synthesis.Voice;
/**
*
* @author Manindar
*/
public class SpeechUtils {
SynthesizerModeDesc desc;
Synthesizer synthesizer;
Voice voice;
public void init(String voiceName) throws EngineException, AudioException, EngineStateError, PropertyVetoException {
if (desc == null) {
System.setProperty("freetts.voices", "com.sun.speech.freetts.en.us.cmu_us_kal.KevinVoiceDirectory");
desc = new SynthesizerModeDesc(Locale.US);
Central.registerEngineCentral("com.sun.speech.freetts.jsapi.FreeTTSEngineCentral");
synthesizer = Central.createSynthesizer(desc);
synthesizer.allocate();
synthesizer.resume();
SynthesizerModeDesc smd = (SynthesizerModeDesc) synthesizer.getEngineModeDesc();
Voice[] voices = smd.getVoices();
for (Voice voice1 : voices) {
if (voice1.getName().equals(voiceName)) {
voice = voice1;
break;
}
}
synthesizer.getSynthesizerProperties().setVoice(voice);
}
}
public void terminate() throws EngineException, EngineStateError {
synthesizer.deallocate();
}
public void doSpeak(String speakText) throws EngineException, AudioException, IllegalArgumentException, InterruptedException {
synthesizer.speakPlainText(speakText, null);
synthesizer.waitEngineState(Synthesizer.QUEUE_EMPTY);
}
public static void main(String[] args) throws Exception {
SpeechUtils su = new SpeechUtils();
su.init("kevin16");
// su.init("kevin");
// su.init("mbrola_us1");
// su.init("mbrola_us2");
// su.init("mbrola_us3");
// high quality
su.doSpeak("Hi this is Manindar. Welcome to audio world.");
su.terminate();
}
}
And add the below dependencies to your pom.xml file.
<dependencies>
<dependency>
<groupId>net.sf.sociaal</groupId>
<artifactId>freetts</artifactId>
<version>1.2.2</version>
</dependency>
</dependencies>
Hope this will be helpful.