Does anyone know of any video encoding/decoding libraries written entirely in java?
Bonus points if it works on Android.
I'm trying to write a video decoding application for Android where I have access to the frame-level decoding functions (which are absent in the Android MediaPlayer API).
jcodec is a pure Java implementation of audio and video codecs and containers. Currently, it "only" decodes H.264 (MPEG-4 AVC) and MJPEG. More information here.
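For illustration, grabbing a single decoded frame with jcodec looks roughly like the sketch below. The class and method names (FrameGrab.getFrameFromFile, AWTUtil.toBufferedImage) follow the examples in jcodec's README for the 0.2.x line and should be treated as assumptions; check them against the version you actually use. On Android you would convert the Picture to a Bitmap instead of a BufferedImage.

```java
import java.awt.image.BufferedImage;
import java.io.File;

import org.jcodec.api.FrameGrab;
import org.jcodec.common.model.Picture;
import org.jcodec.scale.AWTUtil;

public class GrabFrame {
    public static void main(String[] args) throws Exception {
        // Decode frame number 42 of an H.264/MP4 file, entirely in Java.
        Picture picture = FrameGrab.getFrameFromFile(new File("video.mp4"), 42);

        // Convert the decoded frame for display or further processing.
        BufferedImage image = AWTUtil.toBufferedImage(picture);
        System.out.println("Decoded " + image.getWidth() + "x" + image.getHeight());
    }
}
```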
Maybe you want to look at this :
http://www.alphaworks.ibm.com/tech/tk4mpeg4 (looks fairly old though)
But (depending on your platform), I believe you will have a hard time getting a real-time decoder purely in Java...
Related
I'm writing a video recording program, and it's going quite well. I can record mic as well as video from the screen. However, I would also like to be able to obtain sounds from another Java program and then sync them with the video. Basically, record the audio as it is played by the other program.
Is there a way to accomplish this? I'm pretty new with sound, and have read a bit up on it. I think I need to set up a mixer, but I'm not sure if I can actually obtain sound from another Java program that way.
This is not possible with Java Sound; not because of any particular problem with Java Sound, but because not all of the audio APIs that Java builds on support this feature (Core Audio on the Mac, for example, and ASIO on Windows; not sure about ALSA on Linux, but I don't think it supports this either).
If you are on Windows and want to write JNI/JNA code, you can use PortAudio, which supports this on one of the audio APIs (sorry, I can't recall which one).
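For completeness, the Java Sound capture side looks like the sketch below; it records from whatever the OS exposes as a recording device, so capturing another program's output only works if the platform happens to provide a loopback device (e.g. a "Stereo Mix" input on some Windows drivers), which is exactly the limitation described above.

```java
import javax.sound.sampled.*;

public class CaptureSketch {
    public static void main(String[] args) throws Exception {
        // 44.1 kHz, 16-bit, stereo, signed, little-endian PCM.
        AudioFormat format = new AudioFormat(44100f, 16, 2, true, false);
        DataLine.Info info = new DataLine.Info(TargetDataLine.class, format);

        // List the available mixers; a loopback device would only appear
        // here if the underlying native audio API exposes one.
        for (Mixer.Info mi : AudioSystem.getMixerInfo()) {
            System.out.println(mi.getName() + " - " + mi.getDescription());
        }

        // Open the default capture line and read some raw PCM.
        TargetDataLine line = (TargetDataLine) AudioSystem.getLine(info);
        line.open(format);
        line.start();
        byte[] buffer = new byte[4096];
        int read = line.read(buffer, 0, buffer.length); // blocking read
        System.out.println("Captured " + read + " bytes");
        line.close();
    }
}
```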
I am doing a project in which I have to transform audio data (most probably in MP3, WAV or WMA format) into a waveform, and also get the FFT and pitch for it, along with the time in milliseconds at which the pitch changes.
I am just confused about which of these APIs is better. What are the limitations of each?
JMF is ancient, clunky, and basically unmaintained.
JavaFX may or may not support what you need, but at least it's on Oracle's radar for future development.
You may want to check out FMJ, which is basically an open source replacement for JMF after Sun dropped the ball on maintaining JMF:
http://fmj-sf.net/
I haven't used it, but it does seem to have quite a few users and recently committed code, which is a good sign.
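Whichever API you settle on, plain Java Sound can already take you from a WAV file to raw PCM samples ready for an FFT or pitch tracker; note that out of the box it does not decode MP3 or WMA, so those formats need an extra decoder (for MP3 there are third-party Java Sound SPI plugins). A minimal sketch, assuming 16-bit little-endian PCM:

```java
import java.io.File;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;

public class WavToPcm {
    public static void main(String[] args) throws Exception {
        AudioInputStream in = AudioSystem.getAudioInputStream(new File("input.wav"));
        AudioFormat fmt = in.getFormat();

        // Convert 16-bit little-endian samples to doubles in [-1, 1].
        byte[] bytes = in.readAllBytes();
        int n = bytes.length / 2;
        double[] samples = new double[n];
        for (int i = 0; i < n; i++) {
            int lo = bytes[2 * i] & 0xff;
            int hi = bytes[2 * i + 1];          // sign-carrying high byte
            samples[i] = ((hi << 8) | lo) / 32768.0;
        }

        // Sample index i corresponds to (i / channels) / sampleRate * 1000 ms,
        // which gives the millisecond timestamps the question asks about.
        System.out.println(fmt + " -> " + n + " samples");
    }
}
```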
My application takes a long time to prepare and buffer an audio stream. I have read this question: Why does it take so long for Android's MediaPlayer to prepare some live streams for playback? However, it just says people have experienced this issue; it does not say how to fix the problem.
I am experiencing this in all versions of Android, tested from 2.2 - 4.1.2.
The streams are at a suitable bit-rate for mobile and 3G connections. The same stream takes less than a second to start buffering in the equivalent iOS app.
Is there a way to specify the amount of time that should be buffered? I know that the Tune In radio application offers this feature ( https://play.google.com/store/apps/details?id=tunein.player ).
Thanks.
Edit: I've tested again and found that it only happens on devices running Gingerbread and above (>= 2.3). I know that Android changed the underlying framework from OpenCore to StageFright. So how can I optimise the media framework? It just seems wrong that the old HTC Wildfire can prepare, stream and play literally 10x faster than the brand new HTC One X and Nexus 7.
I have struggled with this question for months. Finally I found the solution.
The real problem is in the implementation of the MediaPlayer class, particularly in the way MediaPlayer buffers the data. This is why the solution is to do your own buffering: save the stream to a temp file and feed that to MediaPlayer.
This tutorial and source code explain exactly how. http://androidstreamingtut.blogspot.nl/2012/08/custom-progressive-audio-streaming-with.html
By adapting this code, it is easy to create a much better streaming player.
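Independent of that tutorial's exact code, the gist of the approach is: download ahead into a local file yourself, then hand the file to MediaPlayer once enough data has arrived. A stripped-down sketch (error handling omitted; the URL and file names are placeholders):

```java
import java.io.*;
import java.net.URL;
import android.media.MediaPlayer;

public class BufferedStreamPlayer {
    // Download the remote stream into a temp file and start playback from it.
    public void play(String streamUrl, File cacheDir) throws Exception {
        File temp = File.createTempFile("media", ".dat", cacheDir);

        try (InputStream in = new URL(streamUrl).openStream();
             OutputStream out = new FileOutputStream(temp)) {
            byte[] buf = new byte[8192];
            int n;
            // In a real player this loop runs on a background thread and
            // playback starts once a threshold of data has been written.
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }

        MediaPlayer player = new MediaPlayer();
        player.setDataSource(temp.getAbsolutePath());
        player.prepare();   // fast: the data is now local
        player.start();
    }
}
```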
Google Developers really screwed up here.
EDIT: This answer is rather old. Nowadays I would recommend not using MediaPlayer and using ExoPlayer instead. It is extensible, stable and can play many different types of media. You can find it here: https://github.com/google/ExoPlayer/
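For reference, basic playback with a recent ExoPlayer 2.x release looks roughly like this; the API has changed considerably between versions, so treat the exact names as version-dependent (this follows the 2.12+ MediaItem style, and the stream URL is a placeholder):

```java
import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.SimpleExoPlayer;

// Somewhere with access to an Android Context:
SimpleExoPlayer player = new SimpleExoPlayer.Builder(context).build();
player.setMediaItem(MediaItem.fromUri("https://example.com/stream.mp3"));
player.prepare();
player.play();
```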
There really isn't much you can do since the Android MediaPlayer class doesn't provide access to lower level settings such as buffer size. The only alternative would be to make your own player using AudioTrack and a library like FFmpeg to do the decoding.
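If you do go that route, the AudioTrack side is the easy part once you have decoded PCM; the hard part is the decoder itself. A minimal streaming-mode sketch, assuming your decoder hands you 44.1 kHz 16-bit stereo PCM:

```java
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

public class PcmPlayer {
    private final AudioTrack track;

    public PcmPlayer() {
        int minBuf = AudioTrack.getMinBufferSize(
                44100, AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
        // Streaming mode: we push decoded PCM chunks as they arrive.
        track = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
                minBuf * 4, AudioTrack.MODE_STREAM);
        track.play();
    }

    // Call this with each chunk of PCM produced by your decoder (e.g. FFmpeg via JNI).
    public void writePcm(byte[] pcm, int length) {
        track.write(pcm, 0, length); // blocks until the data is queued
    }
}
```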
The one thing I'd recommend is to play around with encoding. For instance, for MP4s, ensure that the MOOV Atom is located at the beginning of the file (there are enough questions on S/O regarding how to do this with ffmpeg, etc). With MP3s, you can look at different codecs or bitrates for instance.
You can, for instance, try a number of audio files you find online, and if you see one that doesn't take a long time to buffer, try to encode your files in the same way.
I am interested in doing some music analysis on the Android platform. To do this, I want to parse an arbitrary MP3 into PCM data. I've looked around and there doesn't seem to be an easy way to do this. One solution I've tried is using jLayer. This works, but it is incredibly slow, decoding the song in the same time it takes to play it.
I know there exists an MP3 decoder on Android, Google says so itself under supported media types. Does anyone know how to use the Android decoder to decode an MP3 without actually playing it? All I want to do is divert the bits away from the DAC and store them in some buffer instead.
Alternatively, has anyone had any success using the NDK and something like MAD? Are the performance gains that good?
You can compile and use the LAME decoder. It works fine on ARM, and since it's C/C++, performance would be better.
The hardware decoders on most ARM platforms are geared towards playback, so they would not give you perfect PCM. E.g., to avoid jitter they would skip some data if the load is high.
Instructions on compiling LAME are:
Lame MP3 Encoder compile for Android
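As an aside for readers on newer Android versions: since API 16 the platform's own decoders are exposed through MediaCodec and MediaExtractor, which does exactly what the question asks, decoding an MP3 to PCM without playing it. A condensed synchronous decode loop (this uses the API 21+ getInputBuffer/getOutputBuffer forms):

```java
import java.nio.ByteBuffer;
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;

public class Mp3ToPcm {
    public static void decode(String path) throws Exception {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path);
        MediaFormat format = extractor.getTrackFormat(0);
        extractor.selectTrack(0);

        MediaCodec codec = MediaCodec.createDecoderByType(
                format.getString(MediaFormat.KEY_MIME)); // "audio/mpeg" for MP3
        codec.configure(format, null, null, 0);
        codec.start();

        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean inputDone = false, outputDone = false;
        while (!outputDone) {
            if (!inputDone) {
                int in = codec.dequeueInputBuffer(10_000);
                if (in >= 0) {
                    ByteBuffer buf = codec.getInputBuffer(in);
                    int size = extractor.readSampleData(buf, 0);
                    if (size < 0) { // no more compressed data
                        codec.queueInputBuffer(in, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        inputDone = true;
                    } else {
                        codec.queueInputBuffer(in, 0, size, extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
            }
            int out = codec.dequeueOutputBuffer(info, 10_000);
            if (out >= 0) {
                ByteBuffer buf = codec.getOutputBuffer(out);
                byte[] pcm = new byte[info.size];
                buf.get(pcm); // raw PCM: store it instead of sending it to the DAC
                codec.releaseOutputBuffer(out, false);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    outputDone = true;
                }
            }
        }
        codec.stop();
        codec.release();
        extractor.release();
    }
}
```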
I have an audio file in .3gp format on my Android device which I wish to upload to YouTube. I know that YouTube is a video upload site and that I need to convert this sound file to video.
I just want an image to display all the time the audio is playing.
Google tells me there are a number of tools that can help me. But I want to do this via Java code from my Android device.
Please help.
Thanks.
There are tools such as FFmpeg available for free that allow you to, essentially, mix and convert heterogeneous streams. That is, you can add a bitmap to a video, create a video from slide shows and then add sound, etc. (See a related question I asked here.)
These programs can be executed from within Java applications by making Runtime.exec(..) calls.
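For example, combining a still image with an audio track into a video is a single ffmpeg invocation; from Java it would look roughly like the sketch below (the file names are placeholders, and note that Android devices do not ship an ffmpeg binary, so you would need to bundle one with your app):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class FfmpegRunner {
    public static void main(String[] args) throws Exception {
        // -loop 1 repeats the image; -shortest stops when the audio ends.
        String[] cmd = {
            "ffmpeg", "-loop", "1", "-i", "cover.jpg",
            "-i", "audio.3gp", "-shortest", "out.mp4"
        };
        Process p = Runtime.getRuntime().exec(cmd);

        // Drain ffmpeg's output so the process doesn't block on a full pipe.
        BufferedReader err = new BufferedReader(new InputStreamReader(p.getErrorStream()));
        String line;
        while ((line = err.readLine()) != null) {
            System.out.println(line);
        }
        System.out.println("ffmpeg exited with " + p.waitFor());
    }
}
```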
Sun has an example of stitching multiple JPEGs together into a movie; you can find it here. You should be able to take this example (it's fairly robust) and add what you need to it.
I recommend looking into the Java Media Framework (FAQ)
You can find a vast collection of sample applets/code at the Sun Solutions page. You can find the API on this page. I do hope this is compatible with the Android platform, as I haven't had any personal experience developing for it. But it might be a good place to start.