Java sound works on JRE 6 but not JRE 7

I have been tearing my hair out trying to solve this problem. I have this method in a calculator I've been working on:
public void error_sound() throws UnsupportedAudioFileException, IOException, LineUnavailableException {
    AudioInputStream AIS = AudioSystem.getAudioInputStream(calculator.class.getResourceAsStream("/resources/Error.wav"));
    AudioFormat format = AIS.getFormat();
    SourceDataLine playbackLine = AudioSystem.getSourceDataLine(format);
    playbackLine.open(format);
    playbackLine.start();

    int bytesRead = 0;
    byte[] buffer = new byte[128000];
    while (bytesRead != -1) {
        bytesRead = AIS.read(buffer, 0, buffer.length);
        if (bytesRead >= 0) {
            playbackLine.write(buffer, 0, bytesRead);
        }
    }
    playbackLine.drain();
    playbackLine.close();
}
This code works on JRE 6 but not on JRE 7. If anyone could suggest a way to make the above work on JRE 7, I'd be eternally grateful.
It appears that Sun dropped the "Java Sound Audio Engine" in JRE 1.7, and this is the only thing I can put it down to.

"It appears that Sun dropped the "Java Sound Audio Engine" in JRE 1.7, and this is the only thing I can put it down to."
Nope, that would have been noticed by a lot of people, including me. Your comments indicate there is a problem seeking in the resource input stream. That can be caused either by different audio systems or by different implementations of getAudioInputStream().
You could just try wrapping the resource stream into a BufferedInputStream:
InputStream raw = calculator.class.getResourceAsStream("/resources/Error.wav");
InputStream bis = new BufferedInputStream(raw, 20000);
AudioInputStream ais = AudioSystem.getAudioInputStream(bis);
(That's based on the idea that BufferedInputStream supports mark/reset.)
You should really add some proper error handling to the code (check whether the resource is actually there, etc.) and proper error logging/reporting. It really helps in the long run if problems are reported clearly.
EDIT: Re-reading your problem description, it's clear you are running the code from Eclipse, and on the other computer you run it from a jar file. The problem is that your code doesn't cope with the latter. Wrapping the stream in a BufferedInputStream should fix that (you may need to increase the buffer size, though).
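The effect of the wrapper can be checked in isolation. The sketch below builds a tiny WAV in memory (so it needs no external resource file) and wraps it the same way; only the 20000-byte buffer size comes from the snippet above, the rest is illustrative:

```java
import java.io.*;
import javax.sound.sampled.*;

public class MarkResetDemo {
    public static void main(String[] args) throws Exception {
        // Build a tiny WAV in memory so the demo needs no external file:
        // 8 kHz, 16-bit, mono, half a second of silence.
        AudioFormat fmt = new AudioFormat(8000f, 16, 1, true, false);
        byte[] pcm = new byte[8000];
        AudioInputStream src = new AudioInputStream(
                new ByteArrayInputStream(pcm), fmt, pcm.length / fmt.getFrameSize());
        ByteArrayOutputStream wav = new ByteArrayOutputStream();
        AudioSystem.write(src, AudioFileFormat.Type.WAVE, wav);

        // getResourceAsStream() on a jar entry may hand back a stream
        // without mark/reset support; a BufferedInputStream always has it.
        InputStream raw = new ByteArrayInputStream(wav.toByteArray());
        InputStream bis = new BufferedInputStream(raw, 20000);
        System.out.println(bis.markSupported()); // true

        // getAudioInputStream() can now seek around in the header safely.
        AudioInputStream ais = AudioSystem.getAudioInputStream(bis);
        System.out.println(ais.getFormat().getSampleRate()); // 8000.0
    }
}
```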
EDIT 2: Try this for repeating the sound:
public void error_sound() throws UnsupportedAudioFileException, IOException, LineUnavailableException {
    AudioInputStream AIS = ...
    AudioFormat format = ...
    SourceDataLine playbackLine = ...
    playbackLine.open(format);
    playbackLine.start();

    int repeats = 5;
    while (true) {
        // play loop
        int bytesRead = 0;
        byte[] buffer = new byte[128000];
        while (bytesRead != -1) {
            bytesRead = AIS.read(buffer, 0, buffer.length);
            if (bytesRead >= 0) {
                playbackLine.write(buffer, 0, bytesRead);
            }
        }
        --repeats;
        if (repeats <= 0) {
            // done, stop playing
            break;
        } else {
            // repeat one more time; reset the audio stream
            AIS = ...
        }
    }
    playbackLine.drain();
    playbackLine.close();
}
The only complicated thing is that you need the audio stream to get the format and that you also need to create it anew in every loop iteration to read it from the beginning. Everything else stays the same.

Related

How do I convert data read from a WAV file to an array of signed 16-bit raw audio data in java?

I have no idea how to do this. I have read the answers to several similar questions and some websites that probably had the answer somewhere, but either I could not understand them or they were not what I am trying to do. It is also possible that some did have the answer, but I could not focus well enough to interpret it. I want a method that converts the data from a WAV file to signed 16-bit raw audio data and puts it into a short[]. I would prefer short, minimalistic, easy-to-understand answers, because I would have less difficulty focusing on those.
Edit: Some have said this might be a duplicate of stackoverflow.com/questions/5210147/reading-wav-file-in-java. I do not understand that question or its answers well enough to even say whether mine is different, or why, or how to change my question so it is not confused with that one.
Another edit: I have attempted using Phil Freihofner's answer, but when testing it by attempting to play back the audio, I just heard a lot of clicks. I am not sure if I implemented it correctly. Here is the method that reads the file:
static void loadAudioDataTest(String filepath) {
    int totalFramesRead = 0;
    File fileIn = new File(filepath);
    try {
        AudioInputStream audioInputStream = AudioSystem.getAudioInputStream(fileIn);
        int bytesPerFrame = audioInputStream.getFormat().getFrameSize();
        if (bytesPerFrame == AudioSystem.NOT_SPECIFIED) {
            bytesPerFrame = 1;
        }
        int numBytes = 1024 * bytesPerFrame;
        byte[] audioBytes = new byte[numBytes];
        audioArray = new short[numBytes / 2];
        try {
            int numBytesRead = 0;
            int numFramesRead = 0;
            while ((numBytesRead = audioInputStream.read(audioBytes)) != -1) {
                numFramesRead = numBytesRead / bytesPerFrame;
                totalFramesRead += numFramesRead;
            }
            for (int a = 0; a < audioArray.length; a++) {
                audioArray[a] = (short) ((audioBytes[a * 2] & 0xff) | (audioBytes[a * 2 + 1] << 8));
            }
        } catch (Exception ex) {
            // Handle the error...
        }
    } catch (Exception e) {
        // Handle the error...
    }
}
This bit plays the sound and is inside an actionPerformed(ActionEvent) method that is repeatedly activated by a timer, in case the issue is there:
byte[] buf = new byte[2];
AudioFormat af = new AudioFormat(44100, 16, 1, true, false);
SourceDataLine sdl;
try {
    sdl = AudioSystem.getSourceDataLine(af);
    sdl.open();
    sdl.start();
    buf[1] = (byte) (audioArray[t % audioArray.length] & 0xFF);
    buf[0] = (byte) (audioArray[t % audioArray.length] >> 8);
    sdl.write(buf, 0, 2);
    sdl.drain();
    sdl.stop();
} catch (LineUnavailableException e1) {
    e1.printStackTrace();
}
t++;
The core Java class commonly used for loading data into a byte array is AudioInputStream (javax.sound.sampled.AudioInputStream). An example of its use, with explanation, can be found in the Oracle tutorial Using Files and Format Converters. The sample code is in the section titled "Reading Sound Files". Note the point in the innermost while loop with the following line: // Here, do something useful with the audio data. At that point, you would load the data into your array.
Taking two bytes and converting them to a short has been answered several times but I don't have the links handy. It's easier to just post some code I have used.
audioArray[i] = ( buffer[bufferIdx] & 0xff )
| ( buffer[bufferIdx + 1] << 8 ) ;
... where audioArray could be a short[]. (In my code I use float[] and do another step to scale the values to range from -1 to 1.)
This is a slightly modified snippet from the library AudioCue on GitHub, quoting lines 391-393.
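Putting those two lines into a complete, runnable shape (the class name and the test values below are mine, not from AudioCue):

```java
public class ByteToShortDemo {
    // Combine little-endian byte pairs into signed 16-bit samples.
    // This matches the asker's format: new AudioFormat(44100, 16, 1, true, false)
    // is signed and little-endian (low byte first).
    static short[] toShorts(byte[] buffer) {
        short[] audioArray = new short[buffer.length / 2];
        for (int i = 0; i < audioArray.length; i++) {
            audioArray[i] = (short) ((buffer[2 * i] & 0xff)
                                   | (buffer[2 * i + 1] << 8));
        }
        return audioArray;
    }

    public static void main(String[] args) {
        byte[] bytes = { (byte) 0xFF, 0x7F,   // 0x7FFF = 32767 (max sample)
                         0x00, (byte) 0x80,   // 0x8000 = -32768 (min sample)
                         0x01, 0x00 };        // 1
        for (short s : toShorts(bytes)) {
            System.out.println(s);
        }
    }
}
```

The `& 0xff` on the low byte is the important part: without it, sign extension of the low byte would corrupt the combined value.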

Java audio - trim an audio file down to a specified length

I am trying to create a small Java program to cut an audio file down to a specified length. Currently I have the following code:
import java.util.*;
import java.io.*;
import javax.sound.sampled.*;

public class cuttest_3 {
    public static void main(String[] args) {
        int totalFramesRead = 0;
        File fileIn = new File("output1.wav");
        // somePathName is a pre-existing string whose value was
        // based on a user selection.
        try {
            AudioInputStream audioInputStream = AudioSystem.getAudioInputStream(fileIn);
            int bytesPerFrame = audioInputStream.getFormat().getFrameSize();
            if (bytesPerFrame == AudioSystem.NOT_SPECIFIED) {
                // some audio formats may have unspecified frame size;
                // in that case we may read any amount of bytes
                bytesPerFrame = 1;
            }
            // Set a buffer size of 5512 frames - semiquavers at 120bpm
            int numBytes = 5512 * bytesPerFrame;
            byte[] audioBytes = new byte[numBytes];
            try {
                int numBytesRead = 0;
                int numFramesRead = 0;
                // Try to read numBytes bytes from the file.
                while ((numBytesRead = audioInputStream.read(audioBytes)) != -1) {
                    // Calculate the number of frames actually read.
                    numFramesRead = numBytesRead / bytesPerFrame;
                    totalFramesRead += numFramesRead;
                    // Here, output a trimmed audio file
                    AudioInputStream cutFile =
                        new AudioInputStream(audioBytes);
                    AudioSystem.write(cutFile,
                        AudioFileFormat.Type.WAVE,
                        new File("cut_output1.wav"));
                }
            } catch (Exception ex) {
                // Handle the error...
            }
        } catch (Exception e) {
            // Handle the error...
        }
    }
}
On attempting compilation, the following error is returned:
cuttest_3.java:50: error: incompatible types: byte[] cannot be converted to TargetDataLine
new AudioInputStream(audioBytes);
I am not very familiar with AudioInputStream handling in Java, so can anyone suggest a way I can convert the data to achieve the output? Many thanks.
You have to tell the AudioInputStream how to decipher the bytes you pass in, as specified by Matt in the answer here. This documentation indicates what each of the parameters means.
A stream of bytes does not mean anything until you indicate to the system playing the sound how many channels there are, the bit resolution per sample, the samples per second, etc.
Since .wav files follow a well-understood format and, I think, carry data at the front of the file defining various parameters of the audio track, the AudioInputStream can correctly decipher the first file you pass in.
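That is why the compiler rejects `new AudioInputStream(audioBytes)`: there is no byte-array constructor, and the stream constructor needs an InputStream plus an AudioFormat and a frame count. A minimal sketch, with the format values assumed for illustration (16-bit mono at 44.1 kHz):

```java
import java.io.ByteArrayInputStream;
import javax.sound.sampled.*;

public class BytesToStreamDemo {
    public static void main(String[] args) {
        // Format assumed for illustration; in the real program, reuse
        // audioInputStream.getFormat() so the trimmed file matches the source.
        AudioFormat format = new AudioFormat(44100f, 16, 1, true, false);
        byte[] audioBytes = new byte[5512 * format.getFrameSize()];

        // Wrap the raw bytes with the format and the number of frames they hold.
        AudioInputStream cutFile = new AudioInputStream(
                new ByteArrayInputStream(audioBytes),
                format,
                audioBytes.length / format.getFrameSize());
        System.out.println(cutFile.getFrameLength()); // 5512
    }
}
```

With a stream built this way, the `AudioSystem.write(cutFile, AudioFileFormat.Type.WAVE, new File("cut_output1.wav"))` call from the question should compile.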

openPrefetchingReadChannel is not working in Google Cloud Storage Client API

I am trying to fetch an object from a bucket using openPrefetchingReadChannel of GcsInputChannel. As the Google developer tutorial says:
GcsInputChannel readChannel = gcsService.openPrefetchingReadChannel(fileName, 0, 1024 * 1024);
Prefetching provides a major performance advantage for most applications, because it allows processing part of the file while more data is being downloaded in the background in parallel. The third parameter is the size of the prefetch buffer, set in the example to 1 megabyte.
Well this is not happening for me. Please have a look at my snippet:
GcsInputChannel readChannel = gcsService.openPrefetchingReadChannel(fileName, 0, 1024);
copy(Channels.newInputStream(readChannel), resp.getOutputStream());
private void copy(InputStream input, OutputStream output) throws IOException {
    try {
        byte[] buffer = new byte[BUFFER_SIZE];
        int bytesRead = input.read(buffer);
        while (bytesRead != -1) {
            output.write(buffer, 0, bytesRead);
            bytesRead = input.read(buffer);
        }
    } finally {
        input.close();
        output.close();
    }
}
Ref: https://code.google.com/p/appengine-gcs-client/source/browse/trunk/java/example/src/com/google/appengine/demos/GcsExampleServlet.java
The above code should deliver 1 KB of data from the uploaded object, but it returns the object's whole contents, i.e. 8.4 KB.
I am not sure what is happening. I need your help, guys.
The third argument to openPrefetchingReadChannel is not the maximum size to read (a limit). It is the internal buffer size used for prefetching. In your case you may want to track how much you have read and keep writing until you reach the desired limit.

How to speed up download using Java I/O

I'm quite new to Java I/O, having never used it before, and have written this to download a .mp4 file from www.kissanime.com.
The download is very, very slow at the moment (approximately 70-100 kB/s) and I was wondering how I could speed it up. I don't really understand byte buffering, so any help with that would be appreciated. That may be my problem, I'm not sure.
Here's my code:
protected static boolean downloadFile(URL source, File dest) {
    try {
        URLConnection urlConn = source.openConnection();
        urlConn.setConnectTimeout(1000);
        urlConn.setReadTimeout(5000);
        InputStream in = urlConn.getInputStream();
        FileOutputStream out = new FileOutputStream(dest);
        BufferedOutputStream bout = new BufferedOutputStream(out);
        int fileSize = urlConn.getContentLength();
        byte[] b = new byte[65536];
        int bytesDownloaded = 0, len;
        while ((len = in.read(b)) != -1 && bytesDownloaded < fileSize) {
            bout.write(b, 0, len);
            bytesDownloaded += len;
            // System.out.println((double) bytesDownloaded / 1000000.0 + "mb/" + (double) fileSize / 1000000.0 + "mb");
        }
        bout.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return true;
}
Thanks. Any further information will be provided upon request.
I can't find any questions on here related to downloading media files, and I'm sorry if this is deemed to be a duplicate.
Try using IOUtils.toByteArray. It takes an InputStream and returns an array with all the bytes. In my opinion it's generally a good idea to check the common utility packages like apache-commons and guava to see whether what you're trying to do has already been done.
If you want to save the file from an InputStream, use this method from apache-commons:
FileUtils.copyInputStreamToFile()
public static void copyInputStreamToFile(InputStream source, File destination) throws IOException
Copies bytes from an InputStream source to a file destination. The directories up to destination will be created if they don't already exist. destination will be overwritten if it already exists. The source stream is closed.
Always handle file- and IO-related work with a library if one is available. There are also other utility methods you can explore:
IOUtils
FileUtils
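If pulling in commons-io isn't an option, `java.nio.file.Files.copy` in the standard library does the same job; a minimal sketch, with an in-memory stream standing in for `urlConn.getInputStream()`:

```java
import java.io.*;
import java.nio.file.*;

public class SaveStreamDemo {
    public static void main(String[] args) throws IOException {
        // Stand-in for urlConn.getInputStream(); any InputStream works.
        InputStream source = new ByteArrayInputStream("hello".getBytes());
        Path dest = Files.createTempFile("download", ".bin");

        // Files.copy streams to disk for you, much like
        // FileUtils.copyInputStreamToFile, with no manual buffer loop.
        Files.copy(source, dest, StandardCopyOption.REPLACE_EXISTING);
        System.out.println(Files.size(dest)); // 5
    }
}
```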
Turns out that it was the vast number of redirects from the link that caused the download speed to be throttled. Thanks everyone who answered.

Using AudioInputStream with a ProgressMonitorInputStream

Revised/Summary:
I'm using a plugin to decode an MP3 audio file, and I'd like to show a ProgressMonitor to give the user feedback. The logic of constructing an AudioInputStream that decodes the MP3-format audio file is as follows:
readAudioFile(File pAudioFile) throws UnsupportedAudioFileException, IOException {
    AudioInputStream nativeFormatStream = AudioSystem.getAudioInputStream(pAudioFile);
    AudioInputStream desiredFormatStream = AudioSystem.getAudioInputStream(AUDIO_OUTPUT_FORMAT, nativeFormatStream);
    int bufferLength = 4096;
    byte[] rawAudioBuffer = new byte[bufferLength];
    int bytesRead = desiredFormatStream.read(rawAudioBuffer, 0, bufferLength);
    ...
}
The first attempt was to wrap the audio file with a ProgressMonitorInputStream, then get the AudioInputStream from that:
readAudioFile(File pAudioFile) throws UnsupportedAudioFileException, IOException {
    ProgressMonitorInputStream monitorStream = new ProgressMonitorInputStream(COMP, "Decoding", new FileInputStream(pAudioFile));
    AudioInputStream nativeFormatStream = AudioSystem.getAudioInputStream(monitorStream);
    AudioInputStream desiredFormatStream = AudioSystem.getAudioInputStream(AUDIO_OUTPUT_FORMAT, nativeFormatStream);
    int bufferLength = 4096;
    byte[] rawAudioBuffer = new byte[bufferLength];
    int bytesRead = desiredFormatStream.read(rawAudioBuffer, 0, bufferLength);
    ...
}
While it builds, upon execution I get the following when constructing the AudioInputStream from the ProgressMonitorInputStream:
java.io.IOException: mark/reset not supported
Comments below confirm that the AudioInputStream requires the InputStream it wraps to support the mark() and reset() methods, which apparently ProgressMonitorInputStream does not.
Another suggestion below is to wrap the ProgressMonitorInputStream with a BufferedInputStream (which does support mark/reset). So then I have:
readAudioFile(File pAudioFile) throws UnsupportedAudioFileException, IOException {
    ProgressMonitorInputStream monitorStream = new ProgressMonitorInputStream(COMP, "Decoding", new FileInputStream(pAudioFile));
    AudioInputStream nativeFormatStream = AudioSystem.getAudioInputStream(new BufferedInputStream(monitorStream));
    AudioInputStream desiredFormatStream = AudioSystem.getAudioInputStream(AUDIO_OUTPUT_FORMAT, nativeFormatStream);
    int bufferLength = 4096;
    byte[] rawAudioBuffer = new byte[bufferLength];
    int bytesRead = desiredFormatStream.read(rawAudioBuffer, 0, bufferLength);
    ...
}
Now this builds and executes without error. However, the ProgressMonitor never appears, despite aggressive settings of setMillisToPopup(10) and setMillisToDecideToPopup(10). My theory is that the time to actually read the undecoded audio into memory is still under 10 ms; the time is really spent decoding that raw audio after it is read from disk. So the next step is to wrap the undecoded AudioInputStream with the ProgressMonitorInputStream before constructing the decoding AudioInputStream:
readAudioFile(File pAudioFile) throws UnsupportedAudioFileException, IOException {
    AudioInputStream nativeFormatStream = AudioSystem.getAudioInputStream(pAudioFile);
    AudioInputStream desiredFormatStream = AudioSystem.getAudioInputStream(AUDIO_OUTPUT_FORMAT, new BufferedInputStream(new ProgressMonitorInputStream(COMP, "Decoding", nativeFormatStream)));
    int bufferLength = 4096;
    byte[] rawAudioBuffer = new byte[bufferLength];
    int bytesRead = desiredFormatStream.read(rawAudioBuffer, 0, bufferLength);
    ...
}
I seem to be kicking the can down the road but not making progress. Is there any workaround for this problem? Is there an alternative way to provide a ProgressMonitor for the decoding process? My (unsatisfying) fallback is displaying a busy cursor. Any suggestions for other ways to accomplish the goal of providing visual feedback to the user, with at least an estimate of the time remaining to complete the decoding?
