I find it hard to find conclusive information on this. I have a dedicated server in a datacenter running Debian 5.0. I have an iPhone/iPad app with a Java EE (GlassFish 2.1) backend, and I am in the process of adding video to the app. This includes live streaming, and since the videos are longer than 10 minutes, I need HTTP Live Streaming.
What is the best open-source/free solution to implement? This is only a pilot project, so I don't want to subscribe to any paid service. I have currently nothing in place yet for the live streaming, so am flexible to adapt any system (server or client side).
I came across:
Darwin (but I am not sure that project is still alive, as there is not a lot of info)
Red5 (but I cannot find conclusive information on whether it allows an easy implementation of HTTP Live Streaming)
FFMPEG
Regarding the videos, I would ideally like to upload a 720p version to the server (for iPad) and then convert it automatically (either on the fly when requested, or prepared when the file is uploaded) to the formats required for iPhone/iPod touch and low bandwidth. For live streaming I would like to be able to deliver the content within about 30 seconds of it streaming into the server.
I am not envisaging high demand (i.e. many simultaneous requests), and if it does occur (e.g. for a live event) it will be on one stream, which HTTP Live Streaming should handle well since the stream only needs to be encoded and segmented once.
In the (not so near) future, Android will probably be supported by the app as well.
Any hints/tutorial/suggestions/advice would be really appreciated.
Wowza is pretty good for live streaming to iOS (as well as Flash).
It isn't free, though.
The latest development version of VLC supports HTTP Live Streaming.
You'll have to build it from source, as this was added to the git repository fairly recently.
http://wiki.videolan.org/Documentation:Streaming_HowTo/Streaming_for_the_iPhone
I am now using the Xuggler framework, which is Java based. It seems to do exactly the job I am looking for, although no built-in segmenter is available. Instead, I am now trying to write one myself that integrates exactly with my system.
Refer to Apple's HTTP Live Streaming documentation and best practices.
https://developer.apple.com/streaming/
This should be a good point to get started.
What is the source of the live video? The iPhone only supports playback of H.264 Baseline Profile Level 3 or MPEG-4 video with AAC audio. The iPhone itself encodes video to these specs, but most other encoders don't (including many Android phones). If your video is not encoded to this spec, you'll first have to transcode it. FFmpeg (with libx264) will do this nicely.

Then you'll need to generate the dynamic .m3u8 playlist file. Wowza will do this for you out of the box, and will accept an RTMP stream from FFmpeg (but it is not free). I don't believe that Red5 supports Apple HTTP streaming. There are free servers that claim to, but I've never used them. Take a look at http://erlyvideo.org/.

Otherwise, you can do it yourself fairly simply. FFmpeg will output an MPEG-TS stream. All that the playlist generator needs to do, then, is cut this into 188-byte-aligned chunks and return a playlist containing the last n of them. You can even use an HTTP byte-offset module to make the playlist reference a single file. Read Apple's HTTP streaming docs at https://developer.apple.com/streaming/
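The do-it-yourself playlist step could look something like this in Java. This is a minimal sketch; the class and method names are mine, and it assumes a segmenter has already produced numbered .ts files:

```java
import java.util.Arrays;
import java.util.List;

public class HlsPlaylist {
    // Build a sliding-window live playlist from the last `window` segments.
    // `targetDuration` is the segment length in seconds used when encoding.
    static String livePlaylist(List<String> segments, int window, int targetDuration) {
        int first = Math.max(0, segments.size() - window);
        StringBuilder sb = new StringBuilder();
        sb.append("#EXTM3U\n");
        sb.append("#EXT-X-VERSION:3\n");
        sb.append("#EXT-X-TARGETDURATION:").append(targetDuration).append('\n');
        // The media sequence must advance as old segments drop off the window.
        sb.append("#EXT-X-MEDIA-SEQUENCE:").append(first).append('\n');
        for (String seg : segments.subList(first, segments.size())) {
            sb.append("#EXTINF:").append(targetDuration).append(".0,\n");
            sb.append(seg).append('\n');
        }
        // No #EXT-X-ENDLIST for a live stream: the client keeps polling.
        return sb.toString();
    }
}
```

The client re-fetches this playlist every few seconds, so regenerating it per request (or on each new segment) is all the "streaming server" has to do.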
Related
I have to build a video streaming Java program as a university project, but I don't know how to start!
I have to implement both the main-server side and the sub-server side; the client side is going to be VLC.
So I need help with these points:
In the main server I must split the videos into, say, 10 KB parts. How do I do that correctly?
How to stream a video from a sub-server to the client properly?
Note: I prefer to use MP4 videos, but I'm allowed to use whatever I want.
Thanks
You need to decide whether you're building a true live stream (typically Apple HLS or MPEG DASH), or just a pseudo-live stream. Some formats, such as MP4, can be streamed when formatted correctly (see how to do that here).
In the main server I must split the videos to say 10KB parts how to do that correctly?
Sounds like you want to convert an MP4 into an MPEG-TS stream. Have a look at https://github.com/taktik/mpegts-streamer. Another option is to run FFmpeg.
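If you go the FFmpeg route, invoking its HLS muxer from Java is mostly a matter of assembling the command line. A sketch (the flag values are typical choices, not requirements, and the paths are illustrative):

```java
import java.util.Arrays;
import java.util.List;

public class Mp4ToTs {
    // Build an ffmpeg command that remuxes an MP4 into MPEG-TS segments
    // plus an HLS playlist, using ffmpeg's hls muxer.
    static List<String> hlsCommand(String input, String playlist) {
        return Arrays.asList(
            "ffmpeg", "-i", input,
            "-codec", "copy",        // remux only; add libx264 options to transcode
            "-f", "hls",
            "-hls_time", "10",       // ~10-second segments
            "-hls_list_size", "5",   // sliding window of 5 segments
            playlist);
    }
}
```

You would then launch it with something like `new ProcessBuilder(Mp4ToTs.hlsCommand("in.mp4", "out.m3u8")).inheritIO().start()` and serve the resulting files over plain HTTP.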
How to stream a video from a sub-server to the client properly?
Multi-source synchronization is a non-trivial matter when it comes to live streams. Depending on your implementation:
Pseudo-live stream with MP4: make sure your streaming API supports seeking and restarting. When a client reconnects to another endpoint, it may send HTTP headers to indicate where to continue (not sure if VLC supports this)
True live stream: keep track of the chunks that were served to each client. A message topic or ElastiCache sounds reasonable for that. When a client connects to a sub-server for the first time, analyze its subscription or query ElastiCache to determine the best chunk.
You may look at the Ant Media Server project, which is open source.
Complete example with stream-m:
https://github.com/vbence/stream-m/blob/master/README.md
It contains examples of capture and transmission.
I want to implement something like the following:
Reading the video stream from an IP camera using RTSP (which is done)
Processing the image with OpenCV (which is done)
Sending the image to the browser to display (which is the problem)
For the third part, I want to send the images as a video stream using the RTSP protocol.
Note: the language used on the server side is Java (OpenCV is also in Java), and the server is Tomcat.
And if someone thinks that using RTSP is not the best approach, what is the best way to achieve this functionality? RTSP is designed specifically for video streams, so I thought it would be the better choice.
The solution you select will most likely depend on the type of system you are building.
If you have a small number of users who will be happy to add a plugin to their browser, then an RTSP-based approach as you have outlined may work. An example plugin for Chrome you could experiment with is:
https://github.com/VideoExpertsGroup/VXG.Chrome-RTSP-Player
The more usual solution is to convert the RTSP stream into an HTTP stream using a proxy or streaming server. Again, the scale and size of your system will probably dictate what you want to do here.
A 'Rolls Royce' solution, which would let you address as many users as possible with as good quality as possible (based on the current landscape; video keeps changing...), might be:
video stream encoded with h.264
transcoded into multiple bitrate versions to support ABR (adaptive bitrate streaming)
packaged into fragmented mp4 and streamed with both HLS and MPEG DASH (for maximum device coverage)
ABR essentially allows the client device or player to download the video in chunks, e.g. 10-second chunks, and select the next chunk from the bitrate most appropriate to the current network conditions.
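For context, the variants in an ABR setup are tied together by an HLS master playlist along these lines (the bandwidths, resolutions, and paths here are purely illustrative):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1280x720
high/index.m3u8
```

The player fetches this once, then switches between the variant playlists chunk by chunk as network conditions change.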
There is an example using GStreamer, an open source streaming server, with HLS streaming in this answer: https://stackoverflow.com/a/35002374/334402
I do realise your question mentioned Tomcat and Java, but if you want this level of service then I would really hesitate to build it from scratch yourself, as there is a lot of specialist detail in a streaming server.
I know the topic is not an easy one, but I am looking for a Java class to send an HLS stream from the server to the client.
I have files being generated, more and more of them:
out.m3u8
out0.ts
out1.ts
out2.ts
out3.ts
out4.ts
out5.ts
out6.ts
This is generated using ffmpeg from an original source:
ffmpeg -i http://sourceurl.com:9981/stream/channel/1232131 out.m3u8
I can play it using VLC.
Somehow, I need to stream this live to the clients.
At this point, I do not really care about different bitrates; I just want live streaming to work in mobile and desktop browsers.
I found this class:
https://github.com/Red5/red5-hls-plugin/blob/master/plugin/src/main/java/org/red5/stream/http/servlet/PlayList.java
Which might be doing something like that.
I have pulled in hls.js into my application in hopes of using it for desktops.
HLS should, however, work on iOS devices without hls.js right now.
How should one serve HLS content from the server? It's very difficult to find any good and simple example of how to do that.
Does anyone know the steps needed to do that?
I've looked into Wowza and Red5 a little bit, but I am unsure what they can provide for me at this stage, and they seem overly complicated to set up just to serve some files. But please explain to me why that's not the case.
The H in HLS stands for HTTP. The point of streaming technologies such as HLS, DASH, HDS, Smooth Streaming, etc. is that no special server is necessary: just plain HTTP. You can use something like nginx, or any HTTP server class/library available for Java or any other language.
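As a sketch of how little is needed, here is a bare-bones HLS file server using the JDK's built-in HttpServer. The port and directory layout are arbitrary; any servlet or nginx config that sets these two MIME types does the same job:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.file.Files;
import java.nio.file.Path;

public class HlsFileServer {
    // HLS only needs the right MIME types on top of plain HTTP file serving.
    static String contentTypeFor(String name) {
        if (name.endsWith(".m3u8")) return "application/vnd.apple.mpegurl";
        if (name.endsWith(".ts"))   return "video/mp2t";
        return "application/octet-stream";
    }

    static void serve(Path root, int port) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/", exchange -> {
            Path file = root.resolve(exchange.getRequestURI().getPath().substring(1));
            if (!Files.isRegularFile(file)) {
                exchange.sendResponseHeaders(404, -1); // -1 means no body
                exchange.close();
                return;
            }
            byte[] body = Files.readAllBytes(file);
            exchange.getResponseHeaders().set("Content-Type", contentTypeFor(file.toString()));
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
    }
}
```

Point ffmpeg's output directory at `root`, call `HlsFileServer.serve(root, 8080)`, and hls.js (or an iOS device natively) can play `http://host:8080/out.m3u8`.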
I would like to know how to upload a file to Amazon S3 with 'Pause and Resume' support? (Via a web browser).
Are there any sample web applications available? Any programming language / framework is fine.
Thanks in advance.
That one is a big one. I have been looking for a clean answer to this for a very long time. I even built some things, but it always comes down to using a medium for your application. The best solution I have found is this one. It is a very simple idea, and the great part is that it only uses a small amount of Flash.
S3 Heroku Flash Uploader
Github Source Code
However, the downside is that you're not going to be able to upload anything over 512 MB successfully; there is some sort of caching fallout after that point, where it loses track or something. The only other solution I can think of is to build a Java application that would handle the uploading to the server. At least then you have a more stable connection and don't have to worry about browser problems.
You should build it using the multipart upload API. Here's the link for Java:
http://docs.amazonwebservices.com/AmazonS3/2006-03-01/dev/mpListPartsJavaAPI.html
The idea would be to initiate a multipart upload, start uploading parts (whose size would be based on the client's transfer rate), and whenever the user pauses the upload, stop uploading parts. You won't have byte-by-byte pause granularity, but I suspect the user would not notice that.
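The pause/resume bookkeeping on top of the multipart API boils down to remembering the upload ID and which parts have completed; the part arithmetic itself is simple. A sketch (5 MB is S3's minimum size for every part except the last):

```java
public class PartMath {
    static final long PART_SIZE = 5L * 1024 * 1024; // S3's minimum part size

    // Number of parts needed for a file of `fileSize` bytes.
    static int partCount(long fileSize) {
        return (int) ((fileSize + PART_SIZE - 1) / PART_SIZE);
    }

    // Size in bytes of 1-based part number `part` (the last part may be short).
    static long partSize(long fileSize, int part) {
        long offset = (part - 1) * PART_SIZE;
        return Math.min(PART_SIZE, fileSize - offset);
    }
}
```

On pause, persist the upload ID plus the ETag of each completed part; on resume, list the parts already on S3 and continue from the first missing part number.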
I implemented the following app. Github Link:
https://github.com/interpegasus/condo_example
It is based on the sample app and gem from:
http://cotag.github.com/Condominios/
All credit to https://github.com/cotag/ for a great gem and work.
FEATURES:
Pause / Resume support ~ 5MB chunks
Large File Upload
Progress Bar
No Java Applet / No Flash
Registration system via devise
EvaporateJS is a pretty good plugin for this specific task.
Have a look here:
https://github.com/TTLabs/EvaporateJS
You'll need a client something like this: https://github.com/23/resumable.js
And a server that:
Writes the chunks somewhere (locally or to S3)
Uploads the fully-assembled file to S3.
You are not going to be able to do it straight from the browser to S3.
Update: S3 supports CORS now. http://aws.amazon.com/about-aws/whats-new/2012/08/31/amazon-s3-announces-cross-origin-resource-sharing-CORS-support/
Have you tried the S3Fox Firefox plugin?
It has both a front-end and a back-end implementation with S3.
https://medium.com/@selvakumar.ponnusamy/resumable-file-upload-with-s3-ce039cbc8865
This has been discussed here before. Using Java, I have developed my web services on Tomcat for a media library. I want to add functionality for streaming media while dynamically transcoding it as appropriate for mobile clients. There are a few questions I am pondering over:
How exactly do I stream the files (both audio and video)? I am coming across many streaming servers, but I want this done in my own code within Tomcat. Do I need to install one more server, i.e. the streaming server, and then redirect streaming requests to it from Tomcat?
Is it really a good idea to transcode dynamically? Static transcoding means we have to replicate the same file in N formats, which consumes space and which I don't want. So is there a way out?
Is it possible to stream the data "as it is transcoded"? That is, I don't want to start streaming only when the transcoding has finished (as that introduces latency); rather, I want to stream the transcoded bytes as they are produced. I apologize if this is an absurd requirement; I have no experience with either transcoding or streaming.
Are other alternatives like FFmpeg, Xuggler, and the other technologies mentioned here a better approach for getting the job done?
I don't want to use any proprietary / cost-based alternative to achieve this goal, and I also want this to work in production environments. Hope to get some help here...
Thanks a lot!
Red5 is another possible solution. It's open source and is essentially Tomcat with some added features. I don't know how far back in time the split from the Tomcat codebase occurred, but the basics are all there (and the source, so you can patch what's missing).
Xuggler is a library 'front end' for FFmpeg and plays nicely with Red5. If you intend to do lots of transcoding, you'll probably run into this code along the way.
Between these two projects you can change A/V format and stream various media.
Unless you really need to roll your own, I'd recommend an OSS project with good community support.
For your questions:
1.) This is the standard space vs. performance tradeoff. You see the same thing in generating hash tables and other computationally expensive operations. If space is a bigger issue than processor time, then dynamic transcoding is your only way out.
2.) Yes, you can stream during the transcode process. VLC http://www.videolan.org/vlc/ does this.
3.) I'd really look into VLC if I were you.
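On point 2 (streaming during the transcode), you can also do this in your own Tomcat code: start the transcoder writing to stdout and copy its output to the servlet response as it is produced. A minimal sketch of the copy loop (the ffmpeg wiring around it is an assumption, not part of your setup):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class TranscodePipe {
    // Copy bytes as they arrive, so the client starts receiving data
    // before the transcode has finished. Returns the total bytes copied.
    static long pipe(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[64 * 1024];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            out.flush(); // push each chunk to the client immediately
            total += n;
        }
        return total;
    }
}
```

In a servlet you would launch something like `ffmpeg -i in.mp4 -f mpegts pipe:1` with ProcessBuilder and call `pipe(process.getInputStream(), response.getOutputStream())`; the latency then comes only from the encoder's buffering, not from waiting for the whole file.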