We use the following code in our application to try to read the duration of an MP3 file:
import java.io.File;
import javax.sound.sampled.*;
final File file = new File(filename);
AudioInputStream audioInputStream = AudioSystem.getAudioInputStream(file);
While trying to get an AudioInputStream out of the file, an UnsupportedAudioFileException is thrown. I used the same code in a JUnit test in another project, where the exception does not occur and an AudioInputStream is returned instead.
As I debugged into the method getAudioInputStream(file), I found out that the exception is thrown because getAudioFileReaders() returned an empty provider list. That is not the case in the other project that contains my JUnit test.
So I have two questions:
1. Why is the provider list empty?
2. Do I need to configure something in order to get at least one provider?
Assuming you are trying to open an .mp3 file, you need a Java SPI (service provider) that adds support for the audio format.
You can take a look at the MP3SPI library from Javazoom, which lets you open MP3 files. They have VorbisSPI for Ogg format support as well.
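For the original goal of reading the duration: once MP3SPI (and its dependencies, JLayer and Tritonus-share) is on the classpath, the provider list is no longer empty, and the SPI exposes the track duration as a file-format property. A minimal sketch, assuming those jars are present:

import java.io.File;
import java.util.Map;
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioSystem;

public class Mp3Duration {
    public static void main(String[] args) throws Exception {
        File file = new File(args[0]);
        // With MP3SPI on the classpath, AudioSystem discovers the MP3 reader via SPI.
        AudioFileFormat fileFormat = AudioSystem.getAudioFileFormat(file);
        Map<String, Object> properties = fileFormat.properties();
        // MP3SPI reports the duration in microseconds under the "duration" key.
        Long durationMicros = (Long) properties.get("duration");
        if (durationMicros != null) {
            System.out.println("Duration: " + durationMicros / 1000000.0 + " s");
        }
    }
}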
I'd like to point out that you cannot play .mp3 files with the pure Java API alone.
The AudioSystem class does not support MP3 natively.
You can check the javadoc for all the available formats.
So I wonder how it's possible that you are able to run it on another computer, if Java itself doesn't allow it. Presumably that other project has an MP3 service provider somewhere on its classpath.
Related
I am trying to read an MP3 file through the javax.sound.sampled.AudioSystem class, but I am getting an UnsupportedAudioFileException.
My code trying to read the audio file looks like:
AudioInputStream audioInputStream =
AudioSystem.getAudioInputStream(file);
I am getting the following exception:
javax.sound.sampled.UnsupportedAudioFileException: could not get audio input stream from input file
Does the AudioSystem class not support the MP3 format? If not, what formats does it support? Or am I making some mistake here?
No, it doesn't support MP3 (hence the UnsupportedAudioFileException). The supported formats are quite basic (WAV and that sort of thing), so for any advanced codecs you'll need separate libraries.
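There is no direct API for listing which formats the installed readers can parse, but you can at least query the file types the runtime can write, which gives a rough picture of the built-in support (typically WAVE, AU and AIFF). A quick check, using only the standard API:

import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioSystem;

public class ListAudioTypes {
    public static void main(String[] args) {
        // Lists the file types the current runtime's writers support (a rough
        // indicator of the built-in codecs; reader support is registered
        // separately via SPI providers).
        for (AudioFileFormat.Type type : AudioSystem.getAudioFileTypes()) {
            System.out.println(type + " (." + type.getExtension() + ")");
        }
    }
}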
I'm fairly new to Java and Android and I'm trying to make an Android plugin for Unity using Eclipse.
The goal is to let the plugin play audio files, effectively bypassing the Unity engine's audio playback, since a mobile build of the Unity project introduces lag when triggering audio playback.
I've seen it implemented on youtube: https://www.youtube.com/watch?v=NyceOWbr7T4
I figure you can send an abstract file from Unity to the plugin; the question is, how can you turn the file into a SoundPool object (or resource?) so that it can be played?
*The audio file would be in the Unity project, but outside of the jar file.
What you can do is put the audio file in the StreamingAssets folder of your project, and then use the MediaPlayer class to access and play the file.
However, two things I would point out.
a. Usually, using MediaPlayer will require you to run on the UI thread. This means that the game will be paused. You'll need to see if there's a workaround (a minimal usage sketch follows this list).
b. Unity's audio solution is not the best, granted. But there are plenty of plugins available on the Asset Store that are pretty amazing.
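To make the MediaPlayer route concrete, here is a minimal sketch. The class name and path are hypothetical, and it assumes the file has already been copied out of the APK to ordinary storage, since on Android the StreamingAssets folder is packed inside the APK:

import android.media.MediaPlayer;
import java.io.IOException;

public final class ClipPlayer {
    // Plays an audio file from an absolute filesystem path.
    public static MediaPlayer play(String path) throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource(path);
        player.prepare();   // synchronous; prefer prepareAsync() when on the UI thread
        player.start();
        player.setOnCompletionListener(MediaPlayer::release); // free the player when done
        return player;
    }
}

A call would look like ClipPlayer.play("/data/data/com.example.app/files/clip.wav"), matching the copy-out approach described in the next answer.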
So, I still wasn't able to forward the actual file (WAV) and then convert it to a SoundPool resource. What I did was something different:
I put the audio files into the StreamingAssets folder so that when the project is packaged, I still have access to these raw files (although on Android, they're compressed into the jar file). (Just like Venkat at Axiom Studios said; I wish I had seen that answer sooner, as I spent some time discovering what StreamingAssets were.)
Using WWW, I was able to retrieve the raw files from the jar file, via "Application.streamingAssetsPath".
I then wrote these files into a new directory that can be easily accessed by the plugin, using File.WriteAllBytes. I used the "Application.persistentDataPath" directory.
On the plugin side, SoundPool can load the sound from a provided file path, which is the destination directory plus the file name.
With this process, I was able to accomplish what I was going for, and the improvement in performance is drastic. I'm pretty happy with it.
*Note: This, however, makes copies of the files from inside the package into the install folder (/data/data/com.example.app/files, for example). You can check whether the files already exist before writing, so you can leave them there for future use, or just delete them when the app closes as cleanup.
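On the plugin side, the load step described above could look roughly like this; the class name and path are hypothetical, and SoundPool.Builder assumes API level 21+:

import android.media.AudioAttributes;
import android.media.SoundPool;

public final class PluginAudio {
    private final SoundPool pool = new SoundPool.Builder()
            .setMaxStreams(4)
            .setAudioAttributes(new AudioAttributes.Builder()
                    .setUsage(AudioAttributes.USAGE_GAME)
                    .build())
            .build();

    // Loads a file that Unity copied to Application.persistentDataPath and
    // plays it as soon as the asynchronous load completes.
    public void loadAndPlay(String absolutePath) {
        pool.setOnLoadCompleteListener((sp, sampleId, status) -> {
            if (status == 0) {                       // 0 means success
                sp.play(sampleId, 1f, 1f, 1, 0, 1f); // L/R volume, priority, no loop, normal rate
            }
        });
        pool.load(absolutePath, 1); // load() returns immediately
    }
}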
I just made a Native Audio plugin for the Asset Store, which makes native calls to the fastest native playback method on both iOS and Android from one central interface. https://www.assetstore.unity3d.com/#!/content/107067 (It uses AudioTrack instead of SoundPool, though, since from my tests it is faster.)
Both platforms receive an audio file via Unity's special StreamingAssets folder. On the Java side, here's how you can refer to the file.
In Unity: Assets/StreamingAssets/Assist.wav
In Java:
import android.content.Context;
import android.content.res.AssetFileDescriptor;
import android.content.res.AssetManager;
import android.util.Log;
import com.unity3d.player.UnityPlayer;
import java.io.IOException;
import java.io.InputStream;

Context context = UnityPlayer.currentActivity;
AssetManager assetManager = context.getAssets();
try {
    AssetFileDescriptor descriptor = assetManager.openFd(audioPath);
    byte[] music = new byte[(int) descriptor.getLength()];
    descriptor.close();
    InputStream is = assetManager.open(audioPath);
    // read() may return fewer bytes than requested; loop until the buffer is full
    int offset = 0;
    while (offset < music.length) {
        int read = is.read(music, offset, music.length - offset);
        if (read < 0) break;
        offset += read;
    }
    is.close();
    // Do whatever you want with "music"
} catch (IOException e) {
    Log.e("NativeAudio", "Error reading audio file " + audioPath, e);
    return -1;
}
You now have a byte[] array of the audio. audioPath here would be just "Assist.wav". I guess you could feed that to SoundPool to instantiate it.
I am trying to add a feature to some audio processing software I have written.
My software already captures sound from a microphone input, processes it in real time, and sends the result to a speaker output. (This is already a threaded application.) I've been using javax.sound.sampled.* and working with WAV data (transforming it to and from numerical samples to do the processing).
I would like to add a feature to save both the raw input and the transformed output of a session to WAV files. But the signature for creating a new WAV file (e.g., WavFile.newWavFile(...)) seems to want to know in advance how many frames of data it is going to receive. Since these are live sessions of indeterminate length, I have no way of knowing this information beforehand.
Am I missing something? Is there some way around this, other than a hack like saving files of data or samples, and then post-processing it?
Most audio file writers need to know the full file size before writing to an output stream. There's an open-source project called Tritonus, an implementation of the Java Sound API, which has an AudioOutputStream plugin you could try.
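One workaround using only the standard API, for what it's worth: AudioSystem.write accepts an AudioInputStream of unspecified length when the target is a File, because the JDK's WAVE writer can seek back and patch the header once the stream ends. A minimal sketch, assuming the capture line is stopped and closed from another thread:

import java.io.File;
import java.io.IOException;
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.TargetDataLine;

public final class LiveWavRecorder {
    // Blocks while recording; the frame count never needs to be known up front.
    public static void record(TargetDataLine line, File out) throws IOException {
        AudioInputStream in = new AudioInputStream(line); // length = NOT_SPECIFIED
        AudioSystem.write(in, AudioFileFormat.Type.WAVE, out);
    }
}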
Right now I'm working on an archive-browsing application that lets users navigate through archive contents, extract the archives, and preview the files inside them. I'm using the java.util.zip API. To preview a file, I currently extract it to a temporary location and open it as a regular file. As you may understand, that's not a good approach, since it can't preview files when there isn't enough space for a temporary extraction. Is there a working solution for passing a ZipInputStream to an Activity to open it as a file? Is there another workaround for this problem? Thanks in advance.
In principle, you could create a ContentProvider that serves up the ZipInputStream.
In this sample project I demonstrate how to create a ContentProvider supporting openFile() that uses a pipe created by ParcelFileDescriptor.createPipe() to serve up a file. createPipe() returns a pair (a two-element array) of ParcelFileDescriptors representing the ends of the pipe. You write to the second element of the array via an OutputStream; openFile() returns the first element, which Android passes to the calling process. The caller then uses openInputStream() to read what you transfer through the pipe.
In my case, I am sending an asset on which I get an InputStream via AssetManager. In your case, you would use your ZipInputStream.
Note that my sample project assumes it is being run on a device that has a PDF viewer, since it is serving a PDF out of assets and trying to open it via startActivity().
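The core of that approach looks roughly like the following, inside your ContentProvider subclass (the other required overrides are omitted, and getZipEntryStream() is a hypothetical helper that returns the ZipInputStream positioned at the requested entry):

import android.net.Uri;
import android.os.ParcelFileDescriptor;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

@Override
public ParcelFileDescriptor openFile(Uri uri, String mode) throws FileNotFoundException {
    try {
        final ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
        final InputStream in = getZipEntryStream(uri); // hypothetical helper
        new Thread(() -> {
            // Pump the zip entry into the write end; the reader sees EOF when we close.
            try (OutputStream out =
                         new ParcelFileDescriptor.AutoCloseOutputStream(pipe[1])) {
                byte[] buf = new byte[8192];
                int len;
                while ((len = in.read(buf)) >= 0) {
                    out.write(buf, 0, len);
                }
            } catch (IOException ignored) {
                // the caller will see a truncated stream
            }
        }).start();
        return pipe[0]; // read end, handed to the calling process
    } catch (IOException e) {
        throw new FileNotFoundException("Could not open pipe for " + uri);
    }
}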
I am using Java to write a media application.
Given a file, how can I tell whether it is an audio file or a video file?
By the way, I use vlcj library.
In Java 7 you will be able to use java.nio.file.Files.probeContentType to do this.
In the meantime, there are a number of other options for doing this kind of thing.
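For example, the Java 7 check could look like this (note that probeContentType may return null when the type cannot be determined):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class MediaTypeProbe {
    public static void main(String[] args) throws IOException {
        Path path = Paths.get(args[0]);
        // Returns a MIME type such as "audio/mpeg" or "video/mp4", or null.
        String mime = Files.probeContentType(path);
        if (mime == null) {
            System.out.println("Unknown type");
        } else if (mime.startsWith("audio/")) {
            System.out.println("Audio file");
        } else if (mime.startsWith("video/")) {
            System.out.println("Video file");
        } else {
            System.out.println("Other: " + mime);
        }
    }
}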