I have now downloaded Xuggler and played around with it a little. Some examples work, others don't. There is no example of a live video stream with a server and client in the demo files. Has anyone already established a Xuggler video live stream and can tell me how to do it, or even post the server and client source code?
You may be better off with something like ffmpeg via javacv, like this fellow: http://www.walking-productions.com/notslop/2013/01/16/android-live-streaming-courtesy-of-javacv-and-ffmpeg/
Xuggler is no longer actively developed, and while it was good at the time, it hasn't kept up with ffmpeg API updates.
Can you suggest how to stream live video (recorded from a webcam and microphone, in sync) over a socket in Java? I tried to search but only found ways to stream images over a socket, not live video (captured from a webcam) along with audio (from the mic).
This is a really broad question, so it's difficult to provide a satisfactory answer. I can point you to some resources though.
Firstly, a popular protocol for this kind of application is the following:
https://en.wikipedia.org/wiki/Real_Time_Streaming_Protocol
Here is a link to a promising paper going in-depth on how this protocol can be implemented using Java:
https://www.researchgate.net/publication/228974152_A_Java-based_adaptive_media_streaming_on-demand_platform
Of course, you can also choose to use a pre-existing API (there are plenty of options). Additionally, here are some StackOverflow posts that also touch on this subject:
Live video streaming using Java?
Real-time audio streaming using HTTP - choosing the protocol and Java implementation
Good luck :)
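As a starting point, the transport part of such a system can be as simple as length-prefixed frames over a TCP socket. Here is a minimal sketch of that idea (the capture side is omitted entirely; grabbing and encoding webcam/mic data is a separate problem, and the class and method names here are just illustrative):

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class FrameSocketDemo {
    // Writes one frame as a 4-byte big-endian length followed by the payload.
    static void sendFrame(DataOutputStream out, byte[] frame) throws Exception {
        out.writeInt(frame.length);
        out.write(frame);
        out.flush();
    }

    // Reads one length-prefixed frame back off the stream.
    static byte[] receiveFrame(DataInputStream in) throws Exception {
        int length = in.readInt();
        byte[] frame = new byte[length];
        in.readFully(frame);
        return frame;
    }

    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(0)) {
            Thread sender = new Thread(() -> {
                try (Socket s = new Socket("localhost", server.getLocalPort());
                     DataOutputStream out = new DataOutputStream(s.getOutputStream())) {
                    // In a real app this would be an encoded video frame.
                    sendFrame(out, "frame-0".getBytes("UTF-8"));
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            });
            sender.start();
            try (Socket client = server.accept();
                 DataInputStream in = new DataInputStream(client.getInputStream())) {
                byte[] frame = receiveFrame(in);
                System.out.println(new String(frame, "UTF-8"));
            }
            sender.join();
        }
    }
}
```

In practice you would send encoded frames and keep audio and video timestamps aligned, which is exactly what protocols like RTSP/RTP standardize; hand-rolled framing like this is only useful for prototyping.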
I am now working on RTMP video streaming on Android. Please give some examples of an RTMP client for video publishing on Android.
I would suggest javacv with ffmpeg.
I've tried different things and this was the only one that worked.
This example will help you a lot, but you'll need to make some changes and update the libs.
If authentication is not required you can use flazr.
Try this library: http://code.google.com/p/android-rtmp-client/
This is an RTMP client library ported from Red5 that can be used both on Android and on other Java platforms. Compared to Red5, this library has minimal dependencies.
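For orientation, the first thing any RTMP client does on connect is the C0/C1 handshake from the RTMP specification: a single version byte, then a fixed 1536-byte block. A minimal sketch of building those bytes (illustration of the wire format only, not tied to this library's API):

```java
import java.io.ByteArrayOutputStream;
import java.security.SecureRandom;

public class RtmpHandshake {
    // Builds the C0+C1 bytes an RTMP client sends first:
    // C0 is a single version byte (3); C1 is a 1536-byte block made of a
    // 4-byte timestamp, 4 zero bytes, and 1528 bytes of arbitrary filler.
    static byte[] buildC0C1(int timestampMs) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        out.write(0x03);                        // C0: RTMP version 3
        out.write((timestampMs >>> 24) & 0xff); // C1 timestamp, big-endian
        out.write((timestampMs >>> 16) & 0xff);
        out.write((timestampMs >>> 8) & 0xff);
        out.write(timestampMs & 0xff);
        out.write(0); out.write(0); out.write(0); out.write(0); // 4 zero bytes
        byte[] random = new byte[1528];
        new SecureRandom().nextBytes(random);   // filler the spec leaves arbitrary
        out.write(random, 0, random.length);
        return out.toByteArray();
    }

    public static void main(String[] args) {
        byte[] c0c1 = buildC0C1(0);
        System.out.println(c0c1.length); // 1 + 1536 = 1537
    }
}
```

After C0/C1, the server replies with S0/S1/S2 and the client echoes C2; only then does chunked messaging (connect, createStream, publish) begin, which is the part these libraries handle for you.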
You should use Yasea or LibRestreaming.
Yasea can publish live video to your rtmp server (needs API 16+)
-On some devices, such as those with MTK chips, you may only get 8-14 fps with Yasea.
I also recommend LibRestreaming, if your target API is 18 or higher.
-You can get 20-30 fps with LibRestreaming.
https://github.com/begeekmyfriend/yasea
https://github.com/lakeinchina/librestreaming
I don't suggest javacv for publishing live video, because it adds 10-15 MB to the APK and runs at a low frame rate.
I have the following problem: in my application I am supposed to connect to a VNC server, grab the incoming images, and both save them as a video file (this part works) and publish them as an RTMP stream to a Red5 server.
Most of the video encoding part is based on Xuggler's screen recording tutorial app ( http://wiki.xuggle.com/MediaTool_Introduction#How_To_Take_Snapshots_Of_Your_Desktop ).
However, being a beginner in Java, I can't seem to get the live streaming part to work. I know I am supposed to use IContainer, but even after reading this: How to transmit live video from within a Java application? and several other posts, I don't really know how to adapt that to my application. Any help would be greatly appreciated, thanks!
I have an audio file in .3gp format on my Android device which I wish
to upload to YouTube. I know that YouTube is a video upload site and
that I need to convert this sound file to video.
I just want an image to display for the entire time the audio is playing.
Google tells me there are a number of tools that can help me. But I want
to do this via Java code on my Android device.
Please help.
Thanks.
There are tools such as FFMPEG available for free that allow you to, essentially, mix and convert heterogeneous streams. That is, you can add a bitmap to a video, create video from slide shows, and then add sound, etc. (See a related question I asked here).
These programs can be executed from within java applications by making Runtime.exec(..) calls.
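A minimal sketch of that approach: build an ffmpeg command line and run it with ProcessBuilder (a cleaner alternative to raw Runtime.exec). The ffmpeg binary path, file names, and the exact argument set are assumptions here; for a still image plus audio you would typically loop the image and stop when the audio ends:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.Arrays;
import java.util.List;

public class FfmpegRunner {
    // Builds an example ffmpeg command line; file names are placeholders.
    static List<String> buildCommand(String image, String audio, String output) {
        return Arrays.asList(
                "ffmpeg",
                "-loop", "1",   // repeat the still image for the clip's duration
                "-i", image,    // the cover image
                "-i", audio,    // the .3gp audio track
                "-shortest",    // stop encoding when the audio ends
                output);
    }

    // Runs the command, merging stderr into stdout so ffmpeg's log is visible.
    static int run(List<String> command) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(command);
        pb.redirectErrorStream(true);
        Process p = pb.start();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line);
            }
        }
        return p.waitFor();
    }

    public static void main(String[] args) throws Exception {
        // Print the command instead of executing it, since ffmpeg may not be installed.
        System.out.println(String.join(" ",
                buildCommand("cover.png", "audio.3gp", "out.mp4")));
    }
}
```

Note that stock Android does not ship an ffmpeg binary, so on a device you would need to bundle one (or use a wrapper library) rather than rely on it being on the PATH.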
Sun has an example for stitching multiple JPEGs together into a movie; you can find it here. You should be able to take this example (it's fairly robust) and add what you need to it.
I recommend looking into the Java Media Framework (FAQ)
You can find a vast collection of sample applets/code at the Sun Solutions page, and the API on this page. I do hope this is compatible with the Android platform, as I haven't had any personal experience developing for it. But it might be a good place to start.
Where to get streaming (live) video and audio from camera example for Android?
Suppose I want to create a live video streaming service app, so I'll have some cool server at the back end, and I know how to do that part. Suppose I have a standalone app for PCs and now want to move on to mobile devices. So I want to see a sample app grabbing audio and video streams from a phone, synchronizing them, encoding them somehow, and sending the LIVE stream to a server. I need any open-source sample that does this or something like it. Where can I get one?
Ole, have you been able to find any good examples of video or audio broadcasting yet? The best that I have found so far is the SIPDroid project (www.sipdroid.org). I haven't had a chance to review it in depth, but it looks promising.
Here are some projects that may be what you want:
Ip Camera
http://code.google.com/p/ipcamera-for-android
SipDroid
http://code.google.com/p/sipdroid/source/browse/trunk/src/org/sipdroid/sipua/ui/VideoCamera.java
You can get the code using SVN or another client.
Still, to me, both projects have issues. If you get one working well, please tell me.