Java: Append zeros (silence) to a sound file

I want to implement a metronome app in Java that plays beats in complicated rhythm patterns. There are many kinds of beats (drums and other percussion instruments), which is why relying on timers and threads may not give precise or optimal performance. My plan is to first generate the sound for each instrument by inserting silence intervals, and then mix the instruments together (not sure this is the best solution, but anyway). My problem now is adding the silence intervals to each beat.
public AudioStream create(String file, String rhythm) {
    InputStream in = null;
    AudioStream audioStream = null;
    try {
        // 'path' is a field holding the directory of the sound files
        in = new FileInputStream(path + file);
        audioStream = new AudioStream(in);
    } catch (FileNotFoundException e1) {
        e1.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
    // return the sample (may be null if loading failed)
    return audioStream;
}
This function should create and return the sample for one instrument. So how can I append milliseconds of silence to the file and then return the final sample?
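For reference, one way to append silence with javax.sound.sampled (a sketch, not from the original post: it assumes PCM-encoded WAV input, and the file name beat.wav is just a placeholder). The idea is to concatenate the original frames with a block of zero-valued frames and wrap the result back into an AudioInputStream:

import javax.sound.sampled.*;
import java.io.*;

public class SilenceAppender {

    // Returns a new stream consisting of 'source' followed by 'millis' ms of silence.
    // Assumes a PCM source; zero bytes are silence for signed PCM, 0x80 for unsigned 8-bit.
    public static AudioInputStream appendSilence(AudioInputStream source, int millis) {
        AudioFormat format = source.getFormat();
        long silenceFrames = (long) (format.getFrameRate() * millis / 1000.0);
        byte[] silence = new byte[(int) (silenceFrames * format.getFrameSize())];
        if (format.getEncoding() == AudioFormat.Encoding.PCM_UNSIGNED) {
            java.util.Arrays.fill(silence, (byte) 0x80);
        }
        AudioInputStream silenceStream = new AudioInputStream(
                new ByteArrayInputStream(silence), format, silenceFrames);
        // SequenceInputStream concatenates the raw audio bytes; note that
        // getFrameLength() may be unspecified (-1) for some streams.
        InputStream combined = new SequenceInputStream(source, silenceStream);
        return new AudioInputStream(combined, format, source.getFrameLength() + silenceFrames);
    }

    public static void main(String[] args) throws Exception {
        AudioInputStream in = AudioSystem.getAudioInputStream(new File("beat.wav")); // placeholder file
        AudioInputStream padded = appendSilence(in, 500); // add half a second of silence
        AudioSystem.write(padded, AudioFileFormat.Type.WAVE, new File("beat-padded.wav"));
    }
}

Writing the padded stream out with AudioSystem.write also gives you a per-instrument sample that can later be mixed with the others.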

Related

Java audio crackling

When I play audio in my Java desktop application, the sound begins to crackle and fuzz out. I don't know why; any suggestions? I am working on a Pokemon fan game.
static AudioInputStream audio = null;
public static boolean change = false;
static Clip clip = null;

public static void music() {
    try {
        change = false;
        if (!Main.choosegame) {
            if (!Main.startup) {
                if (Movement.POKEMONBATTLE) {
                    audio = AudioSystem.getAudioInputStream(new File("Res/music/pokemon battle.wav"));
                } else {
                    audio = AudioSystem.getAudioInputStream(new File("Res/music/route.wav"));
                }
            } else {
                audio = AudioSystem.getAudioInputStream(new File("Res/music/Oak's Speech.wav"));
            }
        } else {
            audio = AudioSystem.getAudioInputStream(new File("Res/music/Title Screen.wav"));
        }
        clip = AudioSystem.getClip();
        clip.open(audio);
        clip.start();
        try {
            Thread.sleep(100);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        while (clip.isActive() && Main.Running && !change) {
        }
        clip.stop();
        audio.close();
        Thread.sleep(100);
    } catch (UnsupportedAudioFileException uae) {
        System.out.println(uae);
    } catch (IOException ioe) {
        System.out.println(ioe);
    } catch (LineUnavailableException lua) {
        System.out.println(lua);
    } catch (InterruptedException e) {
        e.printStackTrace();
    } catch (OutOfMemoryError e12) {
        clip.stop();
        change = true;
        try {
            audio.close();
        } catch (IOException e1) {
            e1.printStackTrace();
        }
        System.out.println("OUT OF MEMORY IN MUSIC");
        try {
            Thread.sleep(10);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}

@Override
public void run() {
    while (Main.Running) {
        music();
        try {
            Thread.sleep(100);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}
Curious stuff. Given that you have found a solution, maybe I shouldn't be adding my two cents. But a few things seem puzzling and not quite matching the audio world as I know it.
Usually crackle and distortion are the result of PCM data points exceeding their bounds. For example, if you have wav files with data that ranges from -32768 to 32767 (16-bit encoding represented via signed shorts), and the values go outside of that range, then distortion of various sorts can occur.
This might occur in your case if more than one wav file is played at a time, and the wavs are already at a very high volume. When their data is summed together for simultaneous playback, the 16-bit range could be exceeded.
If the addition of pauses has the main effect of preventing the wavs from playing at the same time, this could thus also lessen the amount of distortion.
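To illustrate the range issue (a sketch, not code from the question): when two signed 16-bit samples are summed for mixing, the result has to be clamped to the short range, otherwise it wraps around and distorts.

// Mix two signed 16-bit samples; clamp to avoid wrap-around distortion.
static short mix(short a, short b) {
    int sum = a + b; // promote to int so the sum itself cannot overflow
    if (sum > Short.MAX_VALUE) sum = Short.MAX_VALUE;
    if (sum < Short.MIN_VALUE) sum = Short.MIN_VALUE;
    return (short) sum;
}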
There are some situations where it takes an audio thread a bit of time to finish and respond to a state change. But I can't think of any where crackle or fuzz would be the result. (But that doesn't mean there are no such situations.)
Simply bypassing a number of samples via skip() should (theoretically) only help if the same crackle and fuzz are in the original wav files and you are skipping past the distorted section. However, this should result in a click if it starts from an already audible volume level.
By the way, you would probably do better to run the files as SourceDataLines than as Clips. Clips are only meant for situations where you are going to replay the sounds many times and can afford to hold the data in memory. As coded, every time you play a sound, you are first loading the entire sound into memory, and then playing it. A Clip does not play until all the data has been loaded into memory. With a SourceDataLine, the playback code reads data as it plays, consuming much less memory.
If you can afford the memory, load the Clip only once into its own variable. After playing a Clip, one can set its cursor back to the start of the Clip and later replay the data without having to reload from the file (as you are continually doing).
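A minimal sketch of that load-once, rewind-and-replay pattern (exception handling omitted; the path is taken from the question's code):

// Load once, e.g. at startup.
Clip battleClip = AudioSystem.getClip();
try (AudioInputStream in = AudioSystem.getAudioInputStream(
        new File("Res/music/pokemon battle.wav"))) {
    battleClip.open(in);   // the Clip now holds the data in memory
}

// Whenever the track should play again:
battleClip.stop();              // in case it is still running
battleClip.setFramePosition(0); // rewind instead of reloading from disk
battleClip.start();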
Is the crackling always at the beginning? If so, I found some code that skips the first bytes to avoid that:
// Skip some bytes at the beginning to prevent crackling noise.
audio.skip(30);
Source: http://veritas.eecs.berkeley.edu/apcsa-ret/program/projects/lesson13/Sound/SampleRateConverter.java

java playing sounds in order

Hi, I have the following Java program that plays some sounds. I want to play the sounds in order: for example, after sound1 ends I want to play sound2 and then sound3. Below is my Java code and the function that plays a sound.
private void playsound(String file) {
    try {
        crit = AudioSystem.getClip();
        AudioInputStream inputStream1 = AudioSystem.getAudioInputStream(this.getClass().getResource(file));
        crit.open(inputStream1);
        //if(!crit.isOpen())
        {
            crit.start();
        }
    } catch (Exception ex) {
        System.out.println(ex.getMessage());
    }
}
and calling it as following
playsound("/sounds/filesound1.au");
playsound("/sounds/filesound2.au");
playsound("/sounds/filesound3.au");
The program is playing the sounds in parallel, which I don't want. I want to play them in order.
Regards
I got the following code from somewhere that I can't remember right now, but it plays the files sequentially:
public static void play(ArrayList<String> files) {
    byte[] buffer = new byte[4096];
    for (String filePath : files) {
        File file = new File(filePath);
        try {
            AudioInputStream is = AudioSystem.getAudioInputStream(file);
            AudioFormat format = is.getFormat();
            SourceDataLine line = AudioSystem.getSourceDataLine(format);
            line.open(format);
            line.start();
            while (is.available() > 0) {
                int len = is.read(buffer);
                line.write(buffer, 0, len);
            }
            line.drain();
            line.close();
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }
}
The reason this plays the files sequentially rather than all at the same time is that write() blocks until the requested amount of data has been written, even when that amount is greater than the data line's buffer size.
Make sure to keep the drain() call from the code above: drain() waits for the line's buffer to empty before the line is close()d.

How to properly close file-to-string input stream? (IOUtils FileUtils)

I have two file-to-string processes in my app (one actually deals with an asset file).
If I repeat either of these processes a few times on the same file, I get OutOfMemoryErrors.
I suspect it might be because I'm not closing the streams properly and therefore maybe causing multiple streams to be created, and this is perhaps causing my app to run out of memory.
Here is the code of the two processes:
My asset-file-to-string process.
As you can see, I have something in place to close the stream, but I don't know if it's written properly.
try {
    myVeryLargeString = IOUtils.toString(getAssets().open(myAssetsFilePath), "UTF-8");
    IOUtils.closeQuietly(getAssets().open(myAssetsFilePath));
} catch (IOException e) {
    e.printStackTrace();
} catch (OutOfMemoryError e) {
    Log.e(TAG, "Ran out of memory 01");
}
My file-to-string process.
I have no idea how to close this stream (if there is even a stream to close at all).
myFile01 = new File(myFilePath);
try {
    myVeryLargeString = FileUtils.readFileToString(myFile01, "UTF-8");
} catch (IOException e) {
    e.printStackTrace();
} catch (OutOfMemoryError e) {
    Log.e(TAG, "Ran out of memory 02");
}
It's difficult to say what may be causing the OutOfMemoryError, but the closing should look like this:
InputStream is = getAssets().open(myAssetsFilePath);
try {
    myVeryLargeString = IOUtils.toString(is, "UTF-8");
} finally {
    IOUtils.closeQuietly(is);
}
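On Java 7 and later the same cleanup can be written with try-with-resources; a minimal sketch assuming the same getAssets() and IOUtils calls as above:

try (InputStream is = getAssets().open(myAssetsFilePath)) {
    myVeryLargeString = IOUtils.toString(is, "UTF-8");
} catch (IOException e) {
    e.printStackTrace();
}

Either way, the stream is opened exactly once and closed exactly once.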

Playing wavs in Java

So, I'm working on a project for class wherein we have to have a game with background music. I'm trying to play a .wav file as background music, but since I can't use a Clip (a full music file is too long for one), I have to play it through an audio stream.
In my first implementation, the game would hang until the song finished, so I moved playback into its own thread to try to alleviate that. Currently, the game plays very slowly while the song plays. I'm not sure what I need to do to make this thread play nicely with my animator thread, because we were never formally taught threads. Below is my background music player class; please tell me what I've done wrong that makes it hog all the system resources.
public class BGMusicPlayer implements Runnable {
    File file;
    AudioInputStream in;
    SourceDataLine line;
    int frameSize;
    byte[] buffer = new byte[32 * 1024];
    Thread player;
    boolean playing = false;
    boolean fileNotOver = true;

    public BGMusicPlayer(File inputFile) {
        try {
            file = inputFile;
            in = AudioSystem.getAudioInputStream(inputFile);
            AudioFormat format = in.getFormat();
            frameSize = format.getFrameSize();
            DataLine.Info info = new DataLine.Info(SourceDataLine.class, format);
            line = (SourceDataLine) AudioSystem.getLine(info);
            line.open();
            player = new Thread(this);
            player.start();
        } catch (Exception e) {
            System.out.println("That is not a valid file. No music for you.");
        }
    }

    public void run() {
        int readPoint = 0;
        int bytesRead = 0;
        player.setPriority(Thread.MIN_PRIORITY);
        while (fileNotOver) {
            if (playing) {
                try {
                    bytesRead = in.read(buffer, readPoint, buffer.length - readPoint);
                } catch (IOException e) {
                    // TODO Auto-generated catch block
                    e.printStackTrace();
                }
                if (bytesRead == -1) {
                    fileNotOver = false;
                    break;
                }
                int leftover = bytesRead % frameSize;
                // send to line
                line.write(buffer, readPoint, bytesRead - leftover);
                // save the leftover bytes
                System.arraycopy(buffer, bytesRead, buffer, 0, leftover);
                readPoint = leftover;
                try {
                    Thread.sleep(20);
                } catch (InterruptedException e) {
                    // TODO Auto-generated catch block
                    e.printStackTrace();
                }
            }
        }
    }

    public void start() {
        playing = true;
        if (!player.isAlive())
            player.start();
        line.start();
    }

    public void stop() {
        playing = false;
        line.stop();
    }
}
You are pretty close, but there are a couple of unusual things that maybe are contributing to the performance problem.
First off, if you are just playing back a .wav, there shouldn't really be a need for any "readPoint" other than 0, and there shouldn't be a need for the "leftover" computation. When you do the write, it should simply be the same number of bytes that were read in (the return value of the read() method).
I'm also unclear on why you are doing the arraycopy. Can you lose that?
Setting the thread to low priority and putting in a sleep: I guess you were hoping those would slow down the audio processing to allow more of your game to run? I've never seen this done before, and it is really unusual if it is truly needed. I really recommend getting rid of these as well, as in the sketch below.
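Putting those suggestions together, the run() loop could shrink to something like this (a sketch that reuses the class's in, line and buffer fields and assumes each read() returns whole frames, which is typical for file-backed streams; starting and stopping the line then takes the place of the playing flag):

public void run() {
    int bytesRead;
    try {
        while ((bytesRead = in.read(buffer, 0, buffer.length)) != -1) {
            // write() blocks until the line accepts the data, so the loop paces
            // itself; if the line is stopped, write() blocks once the line's
            // internal buffer fills, which effectively pauses playback.
            line.write(buffer, 0, bytesRead);
        }
        line.drain();
    } catch (IOException e) {
        e.printStackTrace();
    }
}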
I'm curious where your audio file is coming from. You're not streaming it over the web, are you?
By the way, the way you get your input from a File and place it into an InputStream very likely won't work with Java 7. A lot of folks are reporting a bug with that. It turns out it is more correct and efficient to generate a URL from the File, and then get the AudioInputStream using the URL as the argument rather than the File. The error that can come up is a "Mark/Reset" error. (A search on that will show it's come up a number of times here.)
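For example, in the constructor above, the stream could be obtained via a URL instead (a sketch; toURL() can throw a checked MalformedURLException):

// Build a URL from the File and open the AudioInputStream from that.
in = AudioSystem.getAudioInputStream(inputFile.toURI().toURL());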

Recording multiplexed Audio/Video to a file using JMF

I have a project that uses JMF and records both the web camera and audio inputs for a short time (a few seconds to a couple of minutes), then writes the results to a file.
The problem with my project is that this file is never produced properly, and cannot be played back.
While I've found numerous examples of multiplexed transmission of audio and video over RTP, and of converting an input file from one format to another, I haven't yet seen a working example that captures audio and video and writes them to a file.
Does anyone have an example of functioning code to do this?
I've found the reason why I was not able to generate a file from two separate capture devices under JMF, and it relates to ordering of the start commands. In particular, things like Processors will take a datasource, or merging datasource, assign and synchronize the time base(s) and start/stop the sources for you, so the extra work I was trying to do starting the datasources manually is utterly redundant, and throws a wrench in the works.
This was a lot of painful trial and error, and I would suggest you read every line of code, understand the sequencing, and understand what has been included, and what has been left out and why before trying to implement this yourself. JMF is quite the bear if you're not careful.
Oh, and remember to catch exceptions. I had to omit that code due to length restrictions.
Here's my final solution:
public void doRecordingDemo() {
    // Get the default media capture device for audio and video
    DataSource[] sources = new DataSource[2];
    sources[0] = Manager.createDataSource(audioDevice.getLocator());
    sources[1] = Manager.createDataSource(videoDevice.getLocator());

    // Merge the audio and video streams
    DataSource source = Manager.createMergingDataSource(sources);

    // Create a processor to convert from raw format to a file format
    // Notice that we are NOT starting the datasources, but letting the
    // processor take care of this for us.
    Processor processor = Manager.createProcessor(source);

    // Need a configured processor for this next step
    processor.configure();
    waitForState(processor, Processor.Configured);

    // Modify this to suit your needs, but pay attention to what formats can go in what containers
    processor.setContentDescriptor(new FileTypeDescriptor(FileTypeDescriptor.QUICKTIME));

    // Use the processor to convert the audio and video into reasonable formats and sizes
    // There are probably better ways to do this, but you should NOT make any assumptions
    // about what formats are supported, and instead use a generic method of checking the
    // available formats and sizes. You have been warned!
    for (TrackControl control : processor.getTrackControls()) {
        if (control.getFormat() instanceof VideoFormat || control.getFormat() instanceof AudioFormat) {
            if (control.getFormat() instanceof AudioFormat) {
                // In general, this is safe for audio, but do not make assumptions for video.
                // Things get a little wonky for video because of how complex the options are.
                control.setFormat(new AudioFormat(AudioFormat.GSM));
            }
            if (control.getFormat() instanceof VideoFormat) {
                VideoFormat desiredVideoFormat = null;
                Dimension targetDimension = new Dimension(352, 288);
                // Search sequentially through this array of formats
                VideoFormat[] desiredFormats = new VideoFormat[] {
                        new H263Format(), new JPEGFormat(), new RGBFormat(), new YUVFormat()};
                for (VideoFormat checkFormat : desiredFormats) {
                    // Search the video formats looking for a match.
                    List<VideoFormat> candidates = new LinkedList<VideoFormat>();
                    for (Format format : control.getSupportedFormats()) {
                        if (format.isSameEncoding(checkFormat)) {
                            candidates.add((VideoFormat) format);
                        }
                    }
                    if (!candidates.isEmpty()) {
                        // Get the first candidate for now since we have at least a format match
                        desiredVideoFormat = candidates.get(0);
                        for (VideoFormat format : candidates) {
                            if (targetDimension.equals(format.getSize())) {
                                // Found exactly what we're looking for
                                desiredVideoFormat = format;
                                break;
                            }
                        }
                    }
                    if (desiredVideoFormat != null) {
                        // If we found a match, stop searching formats
                        break;
                    }
                }
                if (desiredVideoFormat != null) {
                    // It's entirely possible (but not likely) that we got here without a format
                    // selected, so this null check is unfortunately necessary.
                    control.setFormat(desiredVideoFormat);
                }
            }
            control.setEnabled(true);
            System.out.println("Enabled track: " + control + " (" + control.getFormat() + ")");
        }
    }

    // To get the output from a processor, we need it to be realized.
    processor.realize();
    waitForState(processor, Processor.Realized);

    // Get the data output so we can output it to a file.
    DataSource dataOutput = processor.getDataOutput();

    // Create a file to receive the media
    File answerFile = new File("recording.mov");
    MediaLocator dest = new MediaLocator(answerFile.toURI().toURL());

    // Create a data sink to write to the disk
    DataSink answerSink = Manager.createDataSink(dataOutput, dest);

    // Start the processor spinning
    processor.start();

    // Open the file
    answerSink.open();

    // Start writing data
    answerSink.start();

    // SUCCESS! We are now recording
    Thread.sleep(10000); // Wait for 10 seconds so we record 10 seconds of video

    try {
        // Stop the processor. This will also stop and close the datasources
        processor.stop();
        processor.close();
        try {
            // Let the buffer run dry. Event Listeners never seem to get called,
            // so this seems to be the most effective way.
            Thread.sleep(1000);
        } catch (InterruptedException ex) {
            Logger.getLogger(getClass().getName()).log(Level.SEVERE, null, ex);
        }
        try {
            // Stop recording to the file.
            answerSink.stop();
        } catch (IOException ex) {
            Logger.getLogger(getClass().getName()).log(Level.SEVERE, null, ex);
        }
    } finally {
        try {
            // Whatever else we do, close the file if we can to avoid leaking.
            answerSink.close();
        } catch (Exception ex) {
            Logger.getLogger(getClass().getName()).log(Level.SEVERE, null, ex);
        }
        try {
            // Deallocate the native processor resources.
            processor.deallocate();
        } catch (Exception ex) {
            Logger.getLogger(getClass().getName()).log(Level.SEVERE, null, ex);
        }
    }
}
// My little utility function to wait for a given state.
private void waitForState(Player player, int state) {
    // Fast abort
    if (player.getState() == state) {
        return;
    }
    long startTime = new Date().getTime();
    long timeout = 10 * 1000;
    final Object waitListener = new Object();
    ControllerListener cl = new ControllerListener() {
        @Override
        public void controllerUpdate(ControllerEvent ce) {
            synchronized (waitListener) {
                waitListener.notifyAll();
            }
        }
    };
    try {
        player.addControllerListener(cl);
        // Make sure we wake up every 500ms to check for timeouts and in case we miss a signal
        synchronized (waitListener) {
            while (player.getState() != state && new Date().getTime() - startTime < timeout) {
                try {
                    waitListener.wait(500);
                } catch (InterruptedException ex) {
                    Logger.getLogger(getClass().getName()).log(Level.SEVERE, null, ex);
                }
            }
        }
    } finally {
        // No matter what else happens, we want to remove this
        player.removeControllerListener(cl);
    }
}
