I'm fairly new to Java and Android and I'm trying to make an Android plugin for Unity using Eclipse.
The goal is to let the plugin play audio files, effectively bypassing the Unity engine's audio playback, since a mobile build of the Unity project introduces lag when calling audio playback.
I've seen it implemented on YouTube: https://www.youtube.com/watch?v=NyceOWbr7T4
I figure you can send an abstract file from Unity to the plugin; the question is, how can you interpret the file as a SoundPool object (or resource?) so that it can be played?
*The audio file would be in the Unity project, but outside of the jar file.
What you can do is place the audio file in the StreamingAssets folder of your project, and then use the MediaPlayer class to access and play the file.
However, two things I would point out:
a. Usually, using MediaPlayer will require you to run on the UI thread, which means the game will be paused. You'll need to see if there's a workaround.
b. Unity's audio solution is not the best, granted. But there are plenty of plugins available on the Asset Store that are pretty amazing.
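As a minimal sketch of that approach (assuming the file has already been copied out of the APK to a readable path, since MediaPlayer cannot read from compressed APK assets directly; the path below is hypothetical):

```java
// Minimal sketch: play a file Unity copied to a readable location.
// The path is hypothetical; substitute your own.
MediaPlayer player = new MediaPlayer();
try {
    player.setDataSource("/data/data/com.example.app/files/Assist.wav");
    player.prepare();            // synchronous; use prepareAsync() to avoid blocking
    player.start();
    player.setOnCompletionListener(mp -> mp.release()); // free resources when done
} catch (IOException e) {
    e.printStackTrace();
}
```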
So, I still wasn't able to forward the actual file (WAV) and then convert it to a SoundPool. What I did was something different:
I put the audio files into the StreamingAssets folder so that when the project is packaged, I still have access to these raw files (although on Android, they're compressed into jar files). (Just like Venkat at Axiom Studios said; I wish I had seen that answer sooner, though, since I spent some time discovering what StreamingAssets were.)
Using WWW, I was able to retrieve the raw files from the jar files, via "Application.streamingAssetsPath".
I then wrote these files into a directory the plugin can easily access, using File.WriteAllBytes. I used the "Application.persistentDataPath" directory.
On the plugin side, SoundPool can load the resource from a provided path, which is the destination directory plus the file name.
With this process, I was able to accomplish what I was going for, and the improvement in performance is drastic. I'm pretty happy with it.
*Note: This, however, makes copies of the files from inside the package into the install folder (/data/data/com.example.app/files, for example). You can add a check before writing to see whether the files already exist, so you can leave them there for future use, or simply delete them when the app closes as cleanup.
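On the Java side, the load-from-path step can then be sketched like this (assuming API level 21+ for the builder; the file name and stream count are illustrative):

```java
// Sketch: load a file Unity wrote to Application.persistentDataPath, then play it.
SoundPool soundPool = new SoundPool.Builder().setMaxStreams(8).build();
int soundId = soundPool.load("/data/data/com.example.app/files/shot.wav", 1);
soundPool.setOnLoadCompleteListener((pool, id, status) -> {
    if (status == 0) {                   // 0 means the sample decoded successfully
        pool.play(id, 1f, 1f, 1, 0, 1f); // left/right volume, priority, loop, rate
    }
});
```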
I just made a Native Audio plugin on the Asset Store, which makes a native call to both iOS's and Android's fastest native playback path from one central interface. https://www.assetstore.unity3d.com/#!/content/107067 (It uses AudioTrack instead of SoundPool, though, since from my tests it is faster.)
Both platforms receive an audio file via Unity's StreamingAssets special folder. On the Java side, here's how you can refer to the file.
In Unity : Assets/StreamingAssets/Assist.wav
In Java :
// Inside a method that returns an int status, e.g. int loadAudio(String audioPath)
Context context = UnityPlayer.currentActivity;
AssetManager assetManager = context.getAssets();
try {
    AssetFileDescriptor descriptor = assetManager.openFd(audioPath);
    byte[] music = new byte[(int) descriptor.getLength()];
    InputStream is = assetManager.open(audioPath);
    is.read(music);
    is.close();
    // Do whatever you want with "music"
} catch (IOException e) {
    Log.e("NativeAudio", "Error reading audio file " + audioPath, e);
    return -1;
}
You now have a byte[] array of the audio. audioPath here would be just "Assist.wav". I guess you could feed that to SoundPool to instantiate it.
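For SoundPool specifically, you don't actually need the byte[]: SoundPool.load() accepts an AssetFileDescriptor directly. A sketch, reusing the descriptor from the snippet above (the builder requires API level 21+):

```java
SoundPool soundPool = new SoundPool.Builder().setMaxStreams(4).build();
int soundId = soundPool.load(descriptor, 1); // descriptor from assetManager.openFd(audioPath)
soundPool.setOnLoadCompleteListener((pool, id, status) -> {
    if (status == 0) pool.play(id, 1f, 1f, 1, 0, 1f);
});
```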
Related
My call to the sound looks like this, and I want to be able to change the volume of the sound.
EasySound soundOne = new EasySound("sound.wav");
soundOne.play();
If the library doesn't provide dynamic volume control, you can try using a FloatControl as described in Processing Audio with Controls. Using this method, I've only had success with the MASTER_GAIN type. The problem with this control is that it affects all playing audio at the same time. I haven't found the other types listed to be consistently implemented across different systems.
For this to work, you might run into difficulties figuring out which port the library uses. The tutorial I linked has an earlier chapter, Accessing Audio System Resources, which deals with inspecting your audio system.
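As a sketch of the MASTER_GAIN approach (class and method names here are mine, not from the tutorial; note that the control takes decibels, so a linear 0..1 volume has to be converted):

```java
import javax.sound.sampled.Clip;
import javax.sound.sampled.FloatControl;

// Sketch of MASTER_GAIN volume control on a Clip (javax.sound.sampled).
// MASTER_GAIN is in decibels, so a linear volume factor must be converted first.
public class VolumeDemo {

    // Convert a linear volume factor (e.g. 0.5 = half amplitude) to decibels.
    public static float linearToDb(float linear) {
        return 20f * (float) Math.log10(linear);
    }

    public static void setVolume(Clip clip, float linear) {
        FloatControl gain = (FloatControl) clip.getControl(FloatControl.Type.MASTER_GAIN);
        // Clamp to the range the mixer actually supports.
        float dB = Math.max(gain.getMinimum(), Math.min(gain.getMaximum(), linearToDb(linear)));
        gain.setValue(dB);
    }

    public static void main(String[] args) {
        System.out.println(linearToDb(1.0f)); // full volume is 0 dB
        System.out.println(linearToDb(0.5f)); // half amplitude is about -6 dB
    }
}
```

Remember that, as noted above, this gain applies to the whole line, so every sound playing through it is affected at once.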
There is another relatively simple audio library that has implemented dynamic volume: AudioCue. Disclaimer: I am the author. It's Maven-based. If you don't know how to use a resource by linking via a Maven pom file, one option is to copy the five classes/interfaces directly into your program, editing the package line to match the location where you place them. I'm currently working out how best to set up GitHub "Releases" with corresponding jar files, and haven't provided a downloadable jar as of yet. The jar can be generated if you know Maven basics and have forked and cloned the project.
Most of the solutions tell me to use the File class, but I am planning to use audio stored in the Java project. If I make an .exe file, would that still work when I'm using the File class?
If you are using JavaFX, there is direct support for MP3. I just discovered this page, and haven't tried using it yet. I've always used javax.sound.sampled.SourceDataLine for audio output, and added libraries as needed to deal with the compression format. There are some useful libraries on github: https://github.com/pdudits/soundlibs. Of these, I've only used the jorbis library for ogg/vorbis encoded wav files. But the mp3 decoders here have been around for a long time, have been used in countless projects, and should work.
As far as packaging audio resources goes, a key thing to remember is that file systems don't "see into" jar files. So a File address is basically useless once the resource is packed into a jar. But a URL can specify a file that is inside a jar. The usual practice is to have a "resource" folder in the project that can be specified by a relative address, and to load the resource using its URL. The URL can be obtained using the .getResource method of Class.
For example, if you have a class named "AudioHandler" in your project in a package location "com.dory.mymediaplayer", with a subfolder "/res" containing an mp3 file "audiocue01.mp3", the line to obtain the URL for the mp3 file would be as follows:
URL url = AudioHandler.class.getResource("res/audiocue01.mp3");
However, depending on the needs of the library used for decoding the mp3, you might need to use the .getResourceAsStream method. The getResourceAsStream method returns an InputStream instead of a URL.
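A small self-contained illustration of both methods (the helper names are mine; the example looks up a class-file resource that is guaranteed to exist, since your own res layout will differ):

```java
import java.io.InputStream;
import java.net.URL;

// Sketch: locating a bundled resource relative to a class.
// Works the same whether the resource sits on disk or inside a jar.
public class ResourceDemo {

    // For libraries that want a URL (e.g. for random access).
    public static URL locate(Class<?> anchor, String relativePath) {
        return anchor.getResource(relativePath);
    }

    // For libraries that want a stream (e.g. many mp3 decoders).
    public static InputStream open(Class<?> anchor, String relativePath) {
        return anchor.getResourceAsStream(relativePath);
    }

    public static void main(String[] args) {
        // "Object.class" always exists next to java.lang.Object, so this is non-null:
        System.out.println(locate(Object.class, "Object.class") != null);
        // In your project it would look like:
        // URL url = ResourceDemo.locate(AudioHandler.class, "res/audiocue01.mp3");
    }
}
```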
I want to find a way to save captured pictures and recorded sound in custom file type(s), then open them in my app. I don't want other apps to be able to open my files (for example, the gallery app shouldn't open my pictures).
Is there any way to encode and decode my files in my app, for example by writing a string at the end of the files?
Thanks
There is no need for custom file types. If you save your files in Internal Storage, no other application can access them. You can also save files in the External Storage that are private, by calling getExternalFilesDir().
Details are here: http://developer.android.com/guide/topics/data/data-storage.html#filesInternal
Use a file extension no one else uses, then use this answer to open all such files with your app.
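A sketch of the internal-storage route from the first answer (file name and variables are illustrative; `context` is your Activity or Application):

```java
// Sketch: write and read a private file in internal storage.
// Files written with MODE_PRIVATE are not visible to other apps or the gallery.
FileOutputStream out = context.openFileOutput("capture_01.img", Context.MODE_PRIVATE);
out.write(imageBytes);
out.close();

FileInputStream in = context.openFileInput("capture_01.img");
// ... read it back inside your own app
in.close();
```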
Is there any way to load a sound samples from memory using SoundPool.load method?
Unfortunately, all the methods provided by SoundPool take arguments that refer to real files.
The problem is that I want to load sounds from a zip file on the SD card, and extracting the zip (like in this solution) is not an option.
Furthermore, there is a solution for loading from uncompressed zip files, but my files are compressed (and will be password-protected).
So is there any way to have java.io.FileDescriptor that represents a virtual file, so I can implement some abstract class placing my own streams?
Best regards.
I got the final answer on this question.
This is a missing feature on the Android platform; the Android media playback framework didn't support it for a very long time. Google noticed this too, so Android 6.0 (API level 23) introduced android.media.MediaDataSource, which can be used as a pure in-memory byte-array data source. Before API level 23, we still need to copy the memory data to a temporary file in the file system.
The following URL provides some more clues on this issue; its explanation applies to both audio and video:
how to play video from byte array in media player
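On API level 23+, the MediaDataSource route can be sketched like this (it works with MediaPlayer.setDataSource(); SoundPool itself still takes files or descriptors):

```java
// Sketch: serve a byte[] (e.g. decompressed from the zip) straight to MediaPlayer.
public class ByteArrayDataSource extends android.media.MediaDataSource {
    private final byte[] data;

    public ByteArrayDataSource(byte[] data) { this.data = data; }

    @Override
    public int readAt(long position, byte[] buffer, int offset, int size) {
        if (position >= data.length) return -1; // signal EOF
        int toCopy = (int) Math.min(size, data.length - position);
        System.arraycopy(data, (int) position, buffer, offset, toCopy);
        return toCopy;
    }

    @Override
    public long getSize() { return data.length; }

    @Override
    public void close() { }
}

// Usage: mediaPlayer.setDataSource(new ByteArrayDataSource(bytes));
```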
I wanted to suggest you use MemoryFile, but after checking it, I found that MemoryFile has no getFileDescriptor() method, which means we can't use it as a parameter to SoundPool.load().
But I found this post:
what is the use of MemoryFile in android
One guy implemented MemoryFileUtil.getFileDescriptor(memFile); he posted his code there.
If his code really works, it means we can load a sound sample from memory using SoundPool.load().
The only remaining problem is writing our memory data into that memory file.
The following site shows how to write memory data (from an SQLite query result) into a MemoryFile:
http://www.programcreek.com/java-api-examples/index.php?api=android.os.MemoryFile
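Writing the bytes into the MemoryFile itself is the easy part; only the reflection-based getFileDescriptor() helper from the linked post is fragile. A sketch (untested, and the SoundPool call is hypothetical until that helper is confirmed to work):

```java
// Sketch: copy sound data into an ashmem-backed MemoryFile.
MemoryFile memFile = new MemoryFile("sound_sample", soundBytes.length);
memFile.writeBytes(soundBytes, 0, 0, soundBytes.length);
// FileDescriptor fd = MemoryFileUtil.getFileDescriptor(memFile); // reflection helper from the post
// soundPool.load(fd, 0, soundBytes.length, 1);                   // if the helper really works
```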
I'll give you updates after my test.
We have a Java web application where users can upload all kinds of files, including any kind of video file. Now we want to allow them to stream the video files they own, so I need to verify that they are the owner and then stream the video. Also possibly stream a preview.
Do I need to convert these video files before streaming and where should I look to get started?
The best video playback/encoding library I have ever seen is ffmpeg. It plays everything you throw at it. (It is used by MPlayer.) It is written in C, but I found some Java wrappers:
FFMPEG-Java: A Java wrapper around ffmpeg using JNA.
jffmpeg: This one integrates to JMF.