So I started working on a video game and I want to create a custom sound format.
Path file = Paths.get("C:", "Users", "Mariobro85", "Desktop", "test.wav");

public void playFile() throws InterruptedException {
    try {
        File f = new File(file.toString());
        URL url = f.toURI().toURL();
        AudioInputStream audioInputStream = AudioSystem.getAudioInputStream(url);
        Clip clip = AudioSystem.getClip();
        clip.open(audioInputStream);
        clip.start();
        // The clip plays on a background thread, so sleeping keeps the program alive while it plays.
        Thread.sleep(15000);
    }
    catch (UnsupportedAudioFileException uafE) {
        System.out.println("ERROR: Audio data of the file " + file + " is not in wave format.");
    }
    catch (IOException ioE) {
        System.out.println("ERROR: Audio data of the file " + file + " is corrupted.");
    }
    catch (LineUnavailableException luE) {
        System.out.println(luE);
    }
}
But this only plays standard .wav files.
The custom format contains WAV audio data too, but the problem is that the audio data in my file starts at 0x60 instead of 0x0, and there is also extra information stored for loops, volume, etc. Because of that, it always throws an UnsupportedAudioFileException.
Is there any way to tell the AudioInputStream to jump to a specific address, or is that not possible with the standard Java libraries?
If you want to edit the audio data before it is parsed by the default audio parser, you can do it as follows:
File file = new File("path/to/file");
byte[] bytes = Files.readAllBytes(file.toPath());
// edit or trim bytes as you wish.
ByteArrayInputStream inputStream = new ByteArrayInputStream(bytes); // you can even give an offset in this constructor
AudioInputStream audioInputStream = AudioSystem.getAudioInputStream(inputStream);
Clip clip = AudioSystem.getClip();
clip.open(audioInputStream);
clip.start();
Thread.sleep(xAmount);
If your format is truly custom, I recommend sticking with one of the standardised approaches and simply adding your own header or footer to the byte array; there is no need to remake what has already been done. Or, to make things simpler, you can ship a separate file that you also load, containing the extra data (loops, volume, etc.) you want to use alongside the audio.
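For the concrete case from the question, a minimal sketch could look like this, assuming a complete standard RIFF/WAVE file is embedded at offset 0x60 as described above (the path and offset are taken from the question):

byte[] bytes = Files.readAllBytes(Paths.get("C:", "Users", "Mariobro85", "Desktop", "test.wav"));
int offset = 0x60; // size of the custom header described in the question
// ByteArrayInputStream supports mark/reset, so AudioSystem can parse it directly.
ByteArrayInputStream trimmed = new ByteArrayInputStream(bytes, offset, bytes.length - offset);
AudioInputStream audioIn = AudioSystem.getAudioInputStream(trimmed);
Clip clip = AudioSystem.getClip();
clip.open(audioIn);
clip.start();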
Related
Ok, this is going to sound odd, but hear me out. I've been using Mendix to build an app that manipulates WAV files. I love how easy Mendix makes the front end. I have a FileDocument with a WAV file in it. I use the custom Java below to read the WAV file as an InputStream, then convert it to an AudioInputStream. I then concatenate the sound to itself: basically a drum sound gets duplicated and repeated so it sounds like two drum sounds, one after another.
Then I save it two ways. (1) Back to a file on my computer: this actual WAV file is just fine, and when I open and play it I hear the back-to-back drum sounds. (2) Back to the FileDocument in Mendix, which requires an InputStream, so I feed it the AudioInputStream instead. The size of the file appears to be exactly the same, but if I try to play this file I get an error about the file not being formatted properly. Does anyone know if there is a way to convert an AudioInputStream BACK to an InputStream? If I load the WAV file from the first method into an InputStream and save it to the FileDocument, it works fine, but this is an extra step that eats up resources and forces me to save the file elsewhere, which I don't want to do.
In other words: if an AudioInputStream is an InputStream, then I'm not sure why feeding Mendix an InputStream works, but not an AudioInputStream.
InputStream is;
InputStream is2;
InputStream iscopy;
InputStream is2copy;

TestFile myobject = this.Parameter.get(a);
// Four independent streams of the same FileDocument content, because each stream can only be read once.
is = Core.getFileDocumentContent(getContext(), myobject.getMendixObject());
is2 = Core.getFileDocumentContent(getContext(), myobject.getMendixObject());
iscopy = Core.getFileDocumentContent(getContext(), myobject.getMendixObject());
is2copy = Core.getFileDocumentContent(getContext(), myobject.getMendixObject());

AudioInputStream clip1 = AudioSystem.getAudioInputStream(new BufferedInputStream(is));
AudioInputStream clip2 = AudioSystem.getAudioInputStream(new BufferedInputStream(is2));
AudioInputStream clip1copy = AudioSystem.getAudioInputStream(new BufferedInputStream(iscopy));
AudioInputStream clip2copy = AudioSystem.getAudioInputStream(new BufferedInputStream(is2copy));

// Concatenate the clip with itself (same data twice, back to back).
AudioInputStream appendedFiles = new AudioInputStream(
        new SequenceInputStream(clip1, clip2),
        clip1.getFormat(),
        clip1.getFrameLength() + clip2.getFrameLength());
AudioInputStream appendedFiles2 = new AudioInputStream(
        new SequenceInputStream(clip1copy, clip2copy),
        clip1copy.getFormat(),
        clip1copy.getFrameLength() + clip2copy.getFrameLength());

// (1) Writing to disk produces a valid WAV file.
AudioSystem.write(appendedFiles, AudioFileFormat.Type.WAVE, new File("C:\\test\\wavAppended.wav"));

// (2) Storing the raw AudioInputStream into the FileDocument.
IMendixObject newobject = Core.instantiate(getContext(), "MusicBack.TestFile");
Core.storeFileDocumentContent(getContext(), newobject, "TEST.WAV", appendedFiles2);
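A note on why the raw AudioInputStream looks wrong when stored: AudioSystem.getAudioInputStream strips the WAV header and exposes only the audio frames, so storing that stream as-is produces a file without a valid RIFF/WAVE header. One way to hand Mendix an InputStream that is still a complete WAV file is to let AudioSystem.write produce the bytes in memory first. A rough sketch, assuming the same appendedFiles2 stream and Mendix calls as above:

// Rough sketch: write the WAV (header included) into memory, then wrap it as a plain InputStream.
ByteArrayOutputStream wavBytes = new ByteArrayOutputStream();
AudioSystem.write(appendedFiles2, AudioFileFormat.Type.WAVE, wavBytes);
InputStream wavStream = new ByteArrayInputStream(wavBytes.toByteArray());
Core.storeFileDocumentContent(getContext(), newobject, "TEST.WAV", wavStream);

Writing to an OutputStream works here because appendedFiles2 was constructed with an explicit frame length.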
I know this question about sound has been asked a few times now, and believe me, I have looked through the existing answers to try and work out my own. I had sound working in Java before, on a different project, without any difficulties, using this:
public void playWelcome() {
    try {
        InputStream inputStream = getClass().getResourceAsStream("Start.wav");
        // sun.audio classes (internal API, not part of the public JDK)
        AudioStream audioStream = new AudioStream(inputStream);
        AudioPlayer.player.start(audioStream);
    } catch (IOException ioException) {
        ioException.printStackTrace();
        System.out.println("Unable to find WAV/Start file");
    }
}
Obviously, that approach is not working here, as explained, so I have tried to use this instead:
public void uploadDownload() {
    try {
        AudioInputStream audioInputStream = AudioSystem.getAudioInputStream(
                new File("/DaWord/src/resources/upload.wav").getAbsoluteFile());
        Clip clip = AudioSystem.getClip();
        clip.open(audioInputStream);
        clip.start();
    } catch (Exception ex) {
        System.out.println("Error with playing sound.");
        ex.printStackTrace();
    }
}
NOTE: I have tried the file without the slashes too. e.g. 'my file.wav'
Can someone help me out a bit here?
UPDATE:
So I have managed to get it going.. sort of:
File soundFile = new File("/Users/myname/DaWord/src/testing.wav");
AudioInputStream audioIn = AudioSystem.getAudioInputStream(soundFile);
// Get a sound clip resource.
Clip clip = AudioSystem.getClip();
// Open audio clip and load samples from the audio input stream.
clip.open(audioIn);
clip.start();
But does anyone know how to do this without providing the absolute path? I have tried tacking an absolute-path call onto the end of the File, such as: File file = new File("sound.wav").getAbsoluteFile();
As explained here, these are the ways to obtain an AudioInputStream piped from a file, a URL, or a classpath resource:
// from a wave File
File soundFile = new File("eatfood.wav");
AudioInputStream audioIn = AudioSystem.getAudioInputStream(soundFile);
// from a URL
URL url = new URL("http://www.zzz.com/eatfood.wav");
AudioInputStream audioIn = AudioSystem.getAudioInputStream(url);
// can read from a disk file and also a file contained inside a JAR (used for distribution)
// recommended
URL url = this.getClass().getClassLoader().getResource("eatfood.wav");
AudioInputStream audioIn = AudioSystem.getAudioInputStream(url);
This worked for the time being, with the full path:
File soundFile = new File("/Users/mrBerns/NetBeansProjects/DaWord/src/ching.wav");
AudioInputStream audioIn = AudioSystem.getAudioInputStream(soundFile);
// Get a sound clip resource.
Clip clip = AudioSystem.getClip();
// Open audio clip and load samples from the audio input stream.
clip.open(audioIn);
clip.start();
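To avoid hard-coding the absolute path, the classpath approach listed above can also be applied to this file. A minimal sketch, assuming ching.wav gets copied from src into the build output so that it ends up at the root of the classpath:

// Resolves the file relative to the classpath instead of the file system.
URL url = getClass().getResource("/ching.wav");
AudioInputStream audioIn = AudioSystem.getAudioInputStream(url);
Clip clip = AudioSystem.getClip();
clip.open(audioIn);
clip.start();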
In my game I have some very long OGG files (about 8 to 20 MB), and some machines aren't able to read them directly into memory. I read that some games use a stream-and-play method. Is there any library or code example to load and play OGG files (with LWJGL) in real time?
Thanks for help :)
Do you specifically need to play OGG files? If not, there are plenty of online converters that can convert them to MP3, WAV, etc.
Also, do you specifically need to play it with LWJGL? This is very possible with plain Java, like so:
static String randomName = "TreasureQuest";
public static Clip clip = null;

public static void playSound(String name) throws Exception {
    if (clip != null && clip.isOpen()) clip.close();
    AudioInputStream audioInputStream =
            AudioSystem.getAudioInputStream(new File("music/" + name + ".wav").getAbsoluteFile());
    clip = AudioSystem.getClip();
    clip.open(audioInputStream);
    FloatControl gainControl = (FloatControl) clip.getControl(FloatControl.Type.MASTER_GAIN);
    gainControl.setValue(0f);
    System.out.println(clip.getFrameLength() + " | " + clip.getFramePosition());
    clip.start();
}
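A small note on the gain control above: MASTER_GAIN is expressed in decibels, so setValue(0f) leaves the volume unchanged. A negative value such as gainControl.setValue(-10f); would make playback quieter (that value is just an illustration, not something from the original code).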
Personally, I use this for my LWJGL game, and it works perfectly fine.
If you must play an OGG file, and you specifically must play it with LWJGL, I suggest you use OpenAL. You can find the documentation for playing OGG files here.
@Joehot200
So I have two music engines, Java Clip and LWJGL, so it doesn't matter which one I will be using :)
I have code very similar to yours (but it includes OGG decompression) and the loading time is still very long. I want to read my sound file and play what I have read at the same time (like YouTube). Here is my piece of code:
public static Clip DecodeOgg(String filename)
{
try
{
File file = new File(filename);
// Get AudioInputStream from given file.
AudioInputStream in= AudioSystem.getAudioInputStream(file);
AudioInputStream din = null;
if (in != null)
{
AudioFormat baseFormat = in.getFormat();
AudioFormat decodedFormat = new AudioFormat(
AudioFormat.Encoding.PCM_SIGNED,
baseFormat.getSampleRate(),
16,
baseFormat.getChannels(),
baseFormat.getChannels() * 2,
baseFormat.getSampleRate(),
false);
// Get AudioInputStream that will be decoded by underlying VorbisSPI
din = AudioSystem.getAudioInputStream(decodedFormat, in);
Clip clip = AudioSystem.getClip();
clip.open(din);
return clip;
}
}
catch (Exception e)
{
e.printStackTrace();
}
return null;
}
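If the slow part is that Clip loads the entire decoded clip into memory before playback starts, a streaming approach may help: feed the decoded stream to a SourceDataLine in small chunks so playback starts almost immediately. A minimal sketch, assuming din is the decoded PCM AudioInputStream obtained exactly as in DecodeOgg above:

public static void streamAudio(AudioInputStream din) throws LineUnavailableException, IOException {
    AudioFormat format = din.getFormat();
    SourceDataLine line = AudioSystem.getSourceDataLine(format);
    line.open(format);
    line.start();
    byte[] buffer = new byte[4096];
    int read;
    // Decode and play chunk by chunk instead of loading everything up front.
    while ((read = din.read(buffer, 0, buffer.length)) != -1) {
        line.write(buffer, 0, read);
    }
    line.drain();
    line.close();
    din.close();
}

Since the loop blocks until the stream ends, it would normally run on its own thread in a game.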
Just like the title says: I'm wondering if it's possible to play a song in the background, and if so, what the code for that looks like. I've googled endlessly to no avail. Any help would be appreciated.
Try using the Clip class:
http://docs.oracle.com/javase/7/docs/api/javax/sound/sampled/Clip.html
I've used this code for many projects, and it always works:
try {
    // Open an audio input stream.
    URL url = this.getClass().getClassLoader().getResource("path/fileName");
    AudioInputStream audioIn = AudioSystem.getAudioInputStream(url);
    // Get a sound clip resource ('clip' is assumed to be a Clip field declared elsewhere in the class).
    clip = AudioSystem.getClip();
    // Open the audio clip and load samples from the audio input stream.
    clip.open(audioIn);
    // Start playback; the clip plays on its own thread, so it does not block the rest of the program.
    clip.start();
} catch (Exception e) {
    System.out.println(e);
}
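For music that should keep playing in the background, the clip can also be looped instead of started once; for example, clip.loop(Clip.LOOP_CONTINUOUSLY); restarts it automatically until it is stopped or closed.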
Given an InputStream called in which contains audio data in a compressed format (such as MP3 or OGG), I wish to create a byte array containing a WAV conversion of the input data. Unfortunately, if you try to do this, JavaSound hands you the following error:
java.io.IOException: stream length not specified
I managed to get it to work by writing the wav to a temporary file, then reading it back in, as shown below:
AudioInputStream source = AudioSystem.getAudioInputStream(new BufferedInputStream(in, 1024));
AudioInputStream pcm = AudioSystem.getAudioInputStream(AudioFormat.Encoding.PCM_SIGNED, source);
AudioInputStream ulaw = AudioSystem.getAudioInputStream(AudioFormat.Encoding.ULAW, pcm);
File tempFile = File.createTempFile("wav", "tmp");
AudioSystem.write(ulaw, AudioFileFormat.Type.WAVE, tempFile);
// The fileToByteArray() method reads the file
// into a byte array; omitted for brevity
byte[] bytes = fileToByteArray(tempFile);
tempFile.delete();
return bytes;
This is obviously less desirable. Is there a better way?
The problem is that most AudioFileWriters need to know the total length in advance when writing to an OutputStream. Because you can't provide this, it always fails. Unfortunately, the default Java Sound API implementation doesn't offer any alternative.
But you can try using the AudioOutputStream architecture from the Tritonus plugins (Tritonus is an open source implementation of the Java sound API): http://tritonus.org/plugins.html
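If pulling in Tritonus is not an option, another workaround that avoids the temp file is to buffer the converted audio in memory first, so that the frame length is known before writing; the WAVE writer only refuses an OutputStream when the stream length is unspecified. A rough sketch, assuming the ulaw stream from the question and that the whole recording fits in memory:

ByteArrayOutputStream pcmBuffer = new ByteArrayOutputStream();
byte[] chunk = new byte[4096];
int n;
while ((n = ulaw.read(chunk)) != -1) {
    pcmBuffer.write(chunk, 0, n);
}
byte[] audioBytes = pcmBuffer.toByteArray();
AudioFormat format = ulaw.getFormat();
// The frame length is now known, so writing to an OutputStream no longer fails.
AudioInputStream buffered = new AudioInputStream(
        new ByteArrayInputStream(audioBytes), format, audioBytes.length / format.getFrameSize());
ByteArrayOutputStream wavOut = new ByteArrayOutputStream();
AudioSystem.write(buffered, AudioFileFormat.Type.WAVE, wavOut);
byte[] bytes = wavOut.toByteArray();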
I notice this one was asked a very long time ago. In case anyone new (using Java 7 and above) finds this thread, note there is a newer, simpler way of doing it via the Files.readAllBytes API. See:
How to convert .wav file into byte array?
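For reference, reading a WAV file that already exists on disk is a one-liner with that API (the path here is just a placeholder): byte[] data = Files.readAllBytes(Paths.get("sound.wav"));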
Too late, I know, but I needed this myself, so here are my two cents on the topic.
public void uploadFiles(String fileName, byte[] bFile) throws IOException, UnsupportedAudioFileException
{
    String uploadedFileLocation = "c:\\";
    // Wrap the incoming bytes so AudioSystem can parse them.
    InputStream b_in = new ByteArrayInputStream(bFile);
    AudioInputStream source = AudioSystem.getAudioInputStream(new BufferedInputStream(b_in));
    // Convert to signed PCM before writing the WAV file.
    AudioInputStream pcm = AudioSystem.getAudioInputStream(AudioFormat.Encoding.PCM_SIGNED, source);
    File newFile = new File(uploadedFileLocation + fileName);
    AudioSystem.write(pcm, Type.WAVE, newFile);
    source.close();
    pcm.close();
}
The issue is easy to solve if you prepare a class that creates the correct header for you. In my example (Example how to read audio input in wav), the data goes into a buffer; after that I create the header and end up with a WAV file in the buffer. No additional libraries are needed, just copy the code from my example.
Here is an example of how to use the class that creates the correct header in the buffer array:
public void run() {
    try {
        writer = new NewWaveWriter(44100);
        byte[] buffer = new byte[256];
        int res = 0;
        while ((res = m_audioInputStream.read(buffer)) > 0) {
            writer.write(buffer, 0, res);
        }
    } catch (IOException e) {
        System.out.println("Error: " + e.getMessage());
    }
}

public byte[] getResult() throws IOException {
    return writer.getByteBuffer();
}
And the NewWaveWriter class itself you can find under my link.
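The NewWaveWriter class is behind the link above and not reproduced here, but for orientation, this is roughly what building a canonical 44-byte PCM WAV header looks like (a sketch only, assuming 16-bit little-endian PCM and a data size that is known up front):

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.charset.StandardCharsets;

// Sketch of a 44-byte PCM WAV header; not the NewWaveWriter implementation itself.
static byte[] wavHeader(int sampleRate, int channels, int dataLength) {
    int bitsPerSample = 16;
    int byteRate = sampleRate * channels * bitsPerSample / 8;
    int blockAlign = channels * bitsPerSample / 8;
    ByteBuffer b = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
    b.put("RIFF".getBytes(StandardCharsets.US_ASCII));
    b.putInt(36 + dataLength);                       // size of everything after this field
    b.put("WAVE".getBytes(StandardCharsets.US_ASCII));
    b.put("fmt ".getBytes(StandardCharsets.US_ASCII));
    b.putInt(16);                                    // fmt chunk size for plain PCM
    b.putShort((short) 1);                           // audio format 1 = PCM
    b.putShort((short) channels);
    b.putInt(sampleRate);
    b.putInt(byteRate);
    b.putShort((short) blockAlign);
    b.putShort((short) bitsPerSample);
    b.put("data".getBytes(StandardCharsets.US_ASCII));
    b.putInt(dataLength);                            // number of raw audio bytes that follow
    return b.array();
}

Prepending this header to the raw sample bytes yields a playable WAV file in memory.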
This is very simple...
// 'exportFileName' and 'mainFormat' (the AudioFormat of the raw data) are assumed to be defined elsewhere.
File f = new File(exportFileName + ".tmp");
File f2 = new File(exportFileName);
long l = f.length();
FileInputStream fi = new FileInputStream(f);
// l/4 is the frame count; the divisor 4 assumes 4-byte frames (16-bit stereo),
// so in general use l / mainFormat.getFrameSize() instead.
AudioInputStream ai = new AudioInputStream(fi, mainFormat, l / 4);
AudioSystem.write(ai, Type.WAVE, f2);
fi.close();
f.delete();
The .tmp file is a raw audio file; the result is a WAV file with a header.