Hi guys, I am trying to build a "4-track" recording app on Android.
I'm looking for a library or set of classes I can use to record audio and mix four channels of audio down to a two-channel "mixdown".
Ideally it would be similar to the javax.sound.sampled library.
Low latency is also important.
I am new to Android development and have only worked in web dev for a year (C#, jQuery, SQL, VB).
You will likely have to build your own implementation on top of AudioTrack. This will give you the most control; a rough mixdown sketch follows the references below.
Ref:
http://developer.android.com/reference/android/media/AudioTrack.html
Android: Mixing multiple AudioTrack instances?
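To give an idea of what that looks like, here is a minimal sketch of a summing mixdown written to an AudioTrack. It assumes all four tracks are already recorded as 16-bit PCM mono buffers at 44.1 kHz and simply sums them with a clamp; panning, per-track gain and real-time streaming are left out:

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    public class FourTrackMixer {
        private static final int SAMPLE_RATE = 44100;

        // tracks: four mono 16-bit PCM buffers of equal length
        public void playMixdown(short[][] tracks) {
            int minBuf = AudioTrack.getMinBufferSize(SAMPLE_RATE,
                    AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
            AudioTrack out = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLE_RATE,
                    AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
                    minBuf, AudioTrack.MODE_STREAM);
            out.play();

            int frames = tracks[0].length;
            short[] stereo = new short[frames * 2];
            for (int i = 0; i < frames; i++) {
                int sum = 0;
                for (short[] track : tracks) {
                    sum += track[i];
                }
                // naive summing mix; clamp to avoid wrap-around distortion
                int sample = Math.max(Short.MIN_VALUE, Math.min(Short.MAX_VALUE, sum));
                stereo[2 * i] = (short) sample;     // left
                stereo[2 * i + 1] = (short) sample; // right
            }
            out.write(stereo, 0, stereo.length);
            out.stop();
            out.release();
        }
    }

For recording you would pair this with AudioRecord, and for low latency you would work in small buffers instead of writing the whole mixdown at once.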
I'm using Java to read and play some real-time audio streams, such as the voice feed from a radio station.
I have a real-time web address like this one, and it can be played in a web browser.
How can I play it using the Java language?
Thanks.
MP3SPI is a Java Service Provider Interface that adds MP3 (MPEG 1/2/2.5 Layer 1/2/3) audio format support for Java Platform. It supports streaming, ID3v2 frames, Equalizer etc. It is based on JLayer and Tritonus Java libraries.
You can use the MP3SPI library for Java Sound, and its documentation is here.
Library reference
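For example, with the MP3SPI, JLayer and Tritonus jars on the classpath, a stream can be decoded and played through a SourceDataLine roughly like this (the URL below is just a placeholder for your station's address):

    import java.net.URL;
    import javax.sound.sampled.*;

    public class StreamPlayer {
        public static void main(String[] args) throws Exception {
            // Placeholder stream URL; replace with the real station address
            URL url = new URL("http://example.com/radio-stream.mp3");

            // With MP3SPI on the classpath, AudioSystem can open MP3 streams
            AudioInputStream mp3In = AudioSystem.getAudioInputStream(url);
            AudioFormat base = mp3In.getFormat();
            AudioFormat decoded = new AudioFormat(AudioFormat.Encoding.PCM_SIGNED,
                    base.getSampleRate(), 16, base.getChannels(),
                    base.getChannels() * 2, base.getSampleRate(), false);
            AudioInputStream pcmIn = AudioSystem.getAudioInputStream(decoded, mp3In);

            SourceDataLine line = AudioSystem.getSourceDataLine(decoded);
            line.open(decoded);
            line.start();

            byte[] buf = new byte[4096];
            int n;
            while ((n = pcmIn.read(buf, 0, buf.length)) != -1) {
                line.write(buf, 0, n);
            }
            line.drain();
            line.close();
        }
    }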
I am trying to stream a video in my app using the native Android player, but when I try to stream a link with the "rtmpe://" scheme it doesn't work. Can anyone guide me on how to play this? So far I have only seen solutions for iOS. I want to play this without using external apps.
Check the supported media formats for Android here. Not all protocols are supported on Android.
I am trying to run Example One from https://github.com/fyhertz/libstreaming-examples
It uses libstreaming-4.0.
I have forced it to use encodeWithMediaCodecMethod2(). This method uses the createInputSurface() method introduced in Android 4.3. This reduced the latency from 3 seconds to 1 second.
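For reference, this is roughly the surface-input encoder setup that path relies on (a simplified sketch, not libstreaming's actual code; the resolution and bitrate here are placeholders):

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.view.Surface;

    public class SurfaceEncoderSketch {
        public Surface createEncoderSurface() throws Exception {
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
            format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
            format.setInteger(MediaFormat.KEY_BIT_RATE, 500000);
            format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

            MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
            encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            // The camera preview is rendered into this surface; frames go straight to the encoder
            Surface inputSurface = encoder.createInputSurface();
            encoder.start();
            return inputSurface;
        }
    }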
I am creating a video chat application (like Skype) and I need the video latency to be much lower than this.
I don't know where to go from here really.
Could anyone offer suggestions on how to get the latency down? Different libraries? Techniques? Maybe the NDK? I have done loads of research but have had very little luck :(
Please help
Thanks
There are a few open-source projects:
doubango
FFmpeg (you will need JavaCV, the Java wrapper for the C/C++ SDK)
There is also IMSDroid (an open-source 3GPP IMS client for Android based on doubango) and FFmpeg's streaming guide about latency; a rough JavaCV capture sketch follows below.
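If you go the FFmpeg/JavaCV route, a receive-side sketch might look like the following. The URL is a placeholder, and the options shown are common FFmpeg flags for reducing buffering rather than a guaranteed recipe:

    import org.bytedeco.javacv.FFmpegFrameGrabber;
    import org.bytedeco.javacv.Frame;

    public class LowLatencyGrabber {
        public static void main(String[] args) throws Exception {
            // Placeholder stream URL
            FFmpegFrameGrabber grabber = new FFmpegFrameGrabber("rtsp://example.com/live");

            // FFmpeg options commonly used to reduce buffering and decoder delay
            grabber.setOption("fflags", "nobuffer");
            grabber.setOption("flags", "low_delay");
            grabber.setOption("rtsp_transport", "tcp");

            grabber.start();
            Frame frame;
            while ((frame = grabber.grab()) != null) {
                // hand the frame to your renderer / encoder here
            }
            grabber.stop();
            grabber.release();
        }
    }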
I am now working on RTMP video streaming on Android. Please give some examples of an RTMP client for video publishing on Android.
I would suggest JavaCV with FFmpeg.
I've tried different things, and this was the only one that worked.
This example will help you a lot, but you'll need to make some changes and update the libs.
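As a starting point, a publish-side sketch with FFmpegFrameRecorder could look like this (the ingest URL is a placeholder, and the avcodec import path differs between JavaCV versions):

    import org.bytedeco.ffmpeg.global.avcodec;
    import org.bytedeco.javacv.FFmpegFrameRecorder;
    import org.bytedeco.javacv.Frame;

    public class RtmpPublisher {
        private final FFmpegFrameRecorder recorder;

        public RtmpPublisher(int width, int height) throws Exception {
            // Placeholder ingest URL; replace with your RTMP server's publish endpoint
            recorder = new FFmpegFrameRecorder("rtmp://example.com/live/streamKey", width, height);
            recorder.setFormat("flv");                        // RTMP carries FLV
            recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
            recorder.setFrameRate(25);
            recorder.start();
        }

        public void publish(Frame cameraFrame) throws Exception {
            recorder.record(cameraFrame);
        }

        public void close() throws Exception {
            recorder.stop();
            recorder.release();
        }
    }

You would feed it frames converted from the camera preview callback (for example via an AndroidFrameConverter).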
If authentication is not required, you can use flazr.
Try this library: http://code.google.com/p/android-rtmp-client/
This is an RTMP client library ported from Red5 that can be used on Android as well as other Java platforms. Compared to Red5, this library has minimal dependencies.
You should use Yasea or LibRestreaming.
Yasea can publish live video to your RTMP server (needs API 16+).
On some devices, such as those with MTK chips, you may only get 8-14 fps with Yasea.
I also recommend LibRestreaming if your target API is 18 or higher.
You can get 20-30 fps with LibRestreaming.
https://github.com/begeekmyfriend/yasea
https://github.com/lakeinchina/librestreaming
I don't suggest JavaCV for publishing live video, because it adds 10-15 MB to the APK and only achieves a low frame rate.
Could anyone tell me where I can find an API for wavetable synthesis on Android?
Or maybe I can use one of the C++ wavetable synth libraries on Android through JNI?
MediaPlayer and JetPlayer do not fit this task.
I need to play MIDI, but with good soundbanks, and I need to be able to change soundbanks at the user's request.
AFAIR there's no API for this in the SDK. If you want to play MIDI on Android, use MediaPlayer to do the playback (it does support MIDI files).
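For plain playback (no soundbank switching), a minimal MediaPlayer sketch looks like this, assuming a MIDI file bundled as res/raw/song.mid (a hypothetical resource name):

    import android.content.Context;
    import android.media.MediaPlayer;

    public class MidiPlayback {
        private MediaPlayer player;

        // R is the app's generated resource class; R.raw.song is the bundled MIDI file
        public void start(Context context) {
            player = MediaPlayer.create(context, R.raw.song);
            player.start();
        }

        public void stop() {
            if (player != null) {
                player.release();
                player = null;
            }
        }
    }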