Java, serve HLS live video streams

I know the topic is not an easy one, but I am looking for a Java class to send an HLS stream from the server to the client.
I have files being generated continuously, with more segments appearing over time:
out.m3u8
out0.ts
out1.ts
out2.ts
out3.ts
out4.ts
out5.ts
out6.ts
This is generated using ffmpeg from an original source:
ffmpeg -i http://sourceurl.com:9981/stream/channel/1232131 out.m3u8
I can play it using VLC.
Somehow, I need to stream this live to the clients.
At this point, I do not really care about different bit rates; I just want live streaming to work in both mobile and desktop browsers.
I found this class:
https://github.com/Red5/red5-hls-plugin/blob/master/plugin/src/main/java/org/red5/stream/http/servlet/PlayList.java
which might be doing something like that.
I have pulled in hls.js into my application in hopes of using it for desktops.
HLS should, however, already work on iOS devices without hls.js.
How should one serve HLS content from the server? It's very difficult to find a good, simple example of how to do that.
Does anyone know the steps needed to do that?
I've looked into Wowza and Red5 a little, but I'm unsure what they can provide for me at this stage, and they seem overly complicated to set up just to serve some files. But please explain to me why that's not the case.

The H in HLS stands for HTTP. The point of streaming technologies such as HLS, DASH, HDS, Smooth Streaming, etc., is that no special server is necessary; plain HTTP is enough. You can use something like nginx, or any HTTP server class/library available for Java or any other language.
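As a minimal sketch of that idea in Java (a servlet rather than nginx; the directory path and URL mapping here are assumptions, not anything from the question): serve the ffmpeg output as static files with the right MIME types, and make sure the live playlist is not cached.

import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical servlet serving the ffmpeg output directory over plain HTTP.
@WebServlet("/hls/*")
public class HlsServlet extends HttpServlet {

    // Assumed location of the ffmpeg-generated out.m3u8 / out0.ts / out1.ts ... files.
    private static final Path HLS_DIR = Paths.get("/var/hls");

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String path = req.getPathInfo(); // e.g. "/out.m3u8" or "/out3.ts"
        if (path == null || path.length() < 2) {
            resp.sendError(HttpServletResponse.SC_NOT_FOUND);
            return;
        }
        Path file = HLS_DIR.resolve(path.substring(1)).normalize();
        if (!file.startsWith(HLS_DIR) || !Files.exists(file)) {
            resp.sendError(HttpServletResponse.SC_NOT_FOUND);
            return;
        }
        if (path.endsWith(".m3u8")) {
            resp.setContentType("application/vnd.apple.mpegurl");
            resp.setHeader("Cache-Control", "no-cache"); // the live playlist changes constantly
        } else if (path.endsWith(".ts")) {
            resp.setContentType("video/mp2t");
        }
        resp.setContentLengthLong(Files.size(file));
        try (OutputStream out = resp.getOutputStream()) {
            Files.copy(file, out);
        }
    }
}

iOS players and hls.js then just point at /hls/out.m3u8; the server never needs to know it is serving video.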

Related

Java video streaming project

I have to build a video streaming Java program as a university project, but I don't know how to start!
I have to build both the main-server side and the sub-server side; the client side is going to be VLC.
So I need help with these points:
In the main server I must split the videos into, say, 10 KB parts. How do I do that correctly?
How do I stream a video from a sub-server to the client properly?
Note: I prefer to use MP4 videos, but I'm allowed to use whatever I want.
Thanks
You need to decide whether you're building a true live stream (typically Apple HLS or MPEG DASH), or just a pseudo-live stream. Some formats, such as MP4, can be streamed when formatted correctly (see how to do that here).
In the main server I must split the videos into, say, 10 KB parts. How do I do that correctly?
Sounds like you want to convert an MP4 into MPEG-TS. Have a look at https://github.com/taktik/mpegts-streamer. Another option is to run ffmpeg.
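For example, a rough ffmpeg invocation (file names are placeholders) that remuxes an MP4 into fixed-duration TS segments plus an HLS playlist:

ffmpeg -i input.mp4 -c copy -f hls -hls_time 10 -hls_list_size 0 out.m3u8

Note the question asks for 10 KB parts; HLS segmenting is by duration rather than by size, so this cuts on time (and keyframes) instead.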
How do I stream a video from a sub-server to the client properly?
Multi-source synchronization is a non-trivial matter when it comes to live streams. Depending on your implementation:
Pseudo-live stream with MP4: make sure your streaming API supports seeking and restarting. When a client reconnects to another endpoint, it may send HTTP headers (such as Range) to indicate where to continue (not sure if VLC supports this).
True live stream: keep track of the chunks that were served to each client. A message topic or ElastiCache sounds reasonable for that. When a client connects to a sub-server for the first time, analyze its subscription or query ElastiCache to determine the best chunk to resume from (see the sketch below).
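As a minimal illustration of the chunk-tracking idea, here is an in-memory stand-in for the topic/ElastiCache lookup (all names are hypothetical):

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

// Hypothetical tracker: remembers the last segment index served to each client,
// so a client reconnecting to another sub-server can resume at the right chunk.
public class ChunkTracker {
    private final ConcurrentMap<String, Integer> lastServed = new ConcurrentHashMap<>();

    // Record that a segment was delivered to this client.
    public void markServed(String clientId, int segmentIndex) {
        lastServed.merge(clientId, segmentIndex, Math::max);
    }

    // Where a (re)connecting client should continue; liveEdge is the newest segment.
    public int nextSegment(String clientId, int liveEdge) {
        Integer last = lastServed.get(clientId);
        return last == null ? liveEdge : Math.min(last + 1, liveEdge);
    }
}

In a multi-server setup this map would live in the shared cache rather than in process memory.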
You may look at the Ant Media Server project, which is open source.
Complete example with stream-m:
https://github.com/vbence/stream-m/blob/master/README.md
It contains examples of capture and transmission.

How can I send a video stream to the client side (browser) using RTSP?

I want to implement something like the following:
Reading the video stream from an IP camera using RTSP (which is done)
Processing the image with OpenCV (which is done)
Sending the image to the browser to display (which is the problem)
For the third part, I want to send the images as a video stream using the RTSP protocol.
Note: the language used on the server side is Java (OpenCV is also used from Java), and the server is Tomcat.
If someone thinks that using RTSP is not the best approach, what is the best way to achieve this functionality? RTSP is designed for video streaming, so I thought it would be the better choice.
The solution you select will most likely depend on the type of system you are building.
If you have a small number of users who will be happy to add a plugin to their browser, then an RTSP-based approach as you have outlined may work. An example Chrome plugin you could experiment with is:
https://github.com/VideoExpertsGroup/VXG.Chrome-RTSP-Player
The more usual solution is to convert the RTSP stream into an HTTP stream using a proxy or streaming server; again, the scale and size of your system will probably dictate what you want to do here. For instance, ffmpeg can do this conversion, as sketched below.
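A rough sketch of that conversion with ffmpeg (the camera URL and output names are placeholders): pull the RTSP feed and repackage it as HLS, keeping a short rolling window of segments:

ffmpeg -rtsp_transport tcp -i rtsp://camera.example/stream -c:v copy -c:a aac -f hls -hls_time 4 -hls_list_size 5 -hls_flags delete_segments live.m3u8

Any plain HTTP server (including Tomcat) can then serve live.m3u8 and the .ts segments to the browser.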
A 'Rolls Royce' solution which will allow you to address as many users as possible with as good quality as possible (based on the current landscape; video keeps changing...) might be:
video stream encoded with h.264
transcoded into multiple Bit rate versions to support ABR
packaged into fragmented mp4 and streamed with both HLS and MPEG DASH (for maximum device coverage)
ABR essentially allows the client device or player to download the video in chunks, e.g. 10-second chunks, and to select the next chunk from the bit rate most appropriate to the current network conditions.
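For illustration, an HLS master playlist advertising several variants (all bandwidths, resolutions, and paths below are made up) looks roughly like:

#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1400000,RESOLUTION=842x480
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1280x720
high/index.m3u8

The player re-evaluates throughput after each chunk and switches variants by requesting its next segment from a different media playlist.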
There is an example using GStreamer, an open source streaming server, and hls streaming in this answer here: https://stackoverflow.com/a/35002374/334402
I do realise your question mentioned Tomcat and Java, but if you want this level of service then I would really hesitate to build it from scratch yourself, as there is a lot of specialist detail in a streaming server.

HTTP Live Streaming on a Linux server

I find it hard to find conclusive information on this. I have a dedicated server in a datacenter with Debian 5.0. I have an iPhone/iPad app which uses a Java EE (GlassFish 2.1) backend, and I am in the process of adding video to the app. Since this includes live streaming and videos longer than 10 minutes, I need HTTP Live Streaming.
What is the best open-source/free solution to implement? This is only a pilot project, so I don't want to subscribe to any paid service. I have currently nothing in place yet for the live streaming, so am flexible to adapt any system (server or client side).
I came across:
Darwin Streaming Server (but I am not sure that project is alive, as there is not a lot of info)
Red5 (but I cannot find anything conclusive on whether it would allow an easy implementation of HTTP Live Streaming)
FFmpeg
Regarding the videos, I would ideally like to upload a 720p version to the server (for iPad) and then convert automatically (either on the fly when requested, or ahead of time when the file is uploaded) to the formats required for iPhone/iPod touch and low bandwidth. For live streaming, I would like to be able to deliver the content within about 30 seconds of it arriving at the server.
I am not envisaging high demand (i.e. many simultaneous requests); if it does occur (e.g. for a live event) it would be on a single stream, which HTTP Live Streaming should handle well since the content only needs to be encoded and segmented once.
In the (not so near) future, Android will probably be made part of the app as well.
Any hints/tutorial/suggestions/advice would be really appreciated.
Wowza is pretty good for live streaming to iOS (as well as Flash).
It isn't free, though.
The latest development version of VLC supports HTTP Live Streaming.
You'll have to build it from source, as this was added to the git repository not long ago.
http://wiki.videolan.org/Documentation:Streaming_HowTo/Streaming_for_the_iPhone
I am now using the Xuggler framework, which is Java-based. It seems to do exactly the job I am looking for, although no built-in segmenter etc. is available. Instead, I am now trying to write one myself that integrates exactly with my system.
Refer to Apple's HTTP Live Streaming documentation and best practices.
https://developer.apple.com/streaming/
This should be a good point to get started.
What is the source of the live video? The iPhone only supports playback of H.264 baseline profile level 3 or MPEG-4 video with AAC audio. The iPhone itself encodes video to these specs, but most other encoders don't (including many Android phones). If your video is not encoded to this spec, you'll first have to transcode. FFmpeg (with libx264) will do this nicely.
Then you'll need to generate the dynamic .m3u8 playlist file. Wowza will do this for you out of the box, and will accept an RTMP stream from FFmpeg (but it is not free). I don't believe that Red5 supports Apple HTTP streaming. There are free servers that claim to, but I've never used them. Take a look at http://erlyvideo.org/.
Otherwise, you can do it yourself fairly simply. FFmpeg will output an MPEG-TS stream. All that the playlist generator needs to do, then, is cut this into 188-byte-aligned chunks (188 bytes is the MPEG-TS packet size) and return a playlist containing the last n of them. You can even use an HTTP byte-offset module to make the playlist reference a single file. Read Apple's HTTP streaming docs at https://developer.apple.com/streaming/
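A minimal sketch of such a playlist generator in Java (the segment naming, window size, and target duration are assumptions; it just emits a sliding-window live playlist over the newest n segments):

// Hypothetical generator for a sliding-window HLS live playlist.
public class LivePlaylist {

    // newestIndex is the index of the most recent .ts segment on disk,
    // windowSize how many segments the playlist should advertise,
    // targetDuration the segment length in seconds used when encoding.
    public static String build(int newestIndex, int windowSize, int targetDuration) {
        int first = Math.max(0, newestIndex - windowSize + 1);
        StringBuilder sb = new StringBuilder();
        sb.append("#EXTM3U\n");
        sb.append("#EXT-X-VERSION:3\n");
        sb.append("#EXT-X-TARGETDURATION:").append(targetDuration).append('\n');
        // MEDIA-SEQUENCE must advance as old segments fall out of the window.
        sb.append("#EXT-X-MEDIA-SEQUENCE:").append(first).append('\n');
        for (int i = first; i <= newestIndex; i++) {
            sb.append("#EXTINF:").append(targetDuration).append(".0,\n");
            sb.append("out").append(i).append(".ts\n");
        }
        // No #EXT-X-ENDLIST for a live stream: the client keeps re-polling the playlist.
        return sb.toString();
    }
}

Serve the result of build(...) at the .m3u8 URL on every request and the player will follow the live edge by itself.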

How to stream and transcode media files using Java (on Tomcat)?

This has been discussed before here. Using Java, I have developed my web services on Tomcat for a media library. I want to add functionality for streaming media while dynamically transcoding it as appropriate for mobile clients. There are a few questions I am pondering:
How exactly do I stream the files (both audio and video)? I am coming across many streaming servers, but I want this done in my own code on Tomcat itself. Do I need to install one more server, i.e. a streaming server, and then redirect streaming requests to it from Tomcat?
Is it really a good idea to transcode dynamically? Static transcoding means replicating the same file in N formats, which is space-consuming and something I don't want. So is there a way out?
Is it possible to stream the data "as it is transcoded"? That is, I don't want to start streaming only when the transcoding has finished (as that introduces latency); rather, I want to stream the transcoded bytes as they are produced. I apologize if this is an absurd requirement; I have no experience of either transcoding or streaming.
Alternatives like ffmpeg, Xuggler, and the other technologies mentioned here: are they a better approach for getting the job done?
I don't want to use any proprietary/paid alternative to achieve this goal, and I also want this to work in production environments. Hope to get some help here...
Thanks a lot!
Red5 is another possible solution. It's open source and is essentially Tomcat with some added features. I don't know how far back in time the split from the Tomcat codebase occurred, but the basics are all there (and the source, so you can patch what's missing).
Xuggler is a library 'front end' for ffmpeg and plays nicely with Red5. If you intend to do lots of transcoding, you'll probably run into this code along the way.
Between these two projects you can change A/V format and stream various media.
Unless you really need to roll your own, I'd recommend an OSS project with good community support.
For your questions:
1.) This is the standard space vs. performance tradeoff. You see the same thing in generating hash tables and other computationally expensive operations. If space is a larger issue than processor time, then dynamic transcoding is your only way out.
2.) Yes, you can stream during the transcode process. VLC http://www.videolan.org/vlc/ does this.
3.) I'd really look into VLC if I were you.
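For example, a hedged VLC command line (the input file, codec settings, and port are placeholders, not anything from the question above) that transcodes and serves over HTTP in one step:

vlc input.avi --sout '#transcode{vcodec=h264,vb=800,acodec=mp4a,ab=128}:std{access=http,mux=ts,dst=:8080/stream}'

Clients can then open http://server:8080/stream while the transcode is still running.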

What is the best way to develop a screen-sharing/presenting app in Java or C++ using RTP?

I am trying to build a server-side HTML-rendering-based browser, much like Skyfire. I am evaluating the technologies I would need to build this. When a user types www.yahoo.com on the client box, a session is started on the server; I then grab screenshots and send them in an RTP stream back to the client. For the RTP part, I started using JMF: http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/ScreenGrabber.html I found that when I got the RTP stream back on the client, it was extremely slow, and sometimes I would get a lot of artifacts in the video stream. Is there a better way to do this than JMF?
Well, if I understand your problem correctly, you need an efficient way to do the RTP streaming. A really good and efficient library for streaming in C++ is live555. To encode your images you can use ffmpeg, and possibly its C++/Java/JMF binding FOBS. This way you can have an efficient streaming server.
It isn't clear which feature of Skyfire you want to implement.
If you're streaming a succession of screenshots and finding it slow, then compress the data you're sending. Presumably one screenshot is only very slightly different from the previous one: to minimize bandwidth, you should only transmit the delta between consecutive screenshots.
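A minimal sketch of that delta idea in Java (purely illustrative; a real implementation would encode dirty rectangles and compress them rather than listing raw changed pixels):

import java.awt.image.BufferedImage;
import java.util.ArrayList;
import java.util.List;

// Hypothetical screenshot differ: collects the pixels that changed between
// two frames, assuming both frames have the same dimensions, so only the
// delta needs to be transmitted.
public class FrameDelta {

    public record ChangedPixel(int x, int y, int rgb) {}

    public static List<ChangedPixel> diff(BufferedImage prev, BufferedImage curr) {
        List<ChangedPixel> changes = new ArrayList<>();
        for (int y = 0; y < curr.getHeight(); y++) {
            for (int x = 0; x < curr.getWidth(); x++) {
                int rgb = curr.getRGB(x, y);
                if (rgb != prev.getRGB(x, y)) {
                    changes.add(new ChangedPixel(x, y, rgb));
                }
            }
        }
        return changes; // an empty list means the frame is unchanged
    }
}

This is essentially what VNC's encodings do far more efficiently, which is why the next answer points there.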
Look at VNC/VNCViewer. There is even a viewer applet, and IIRC there was a question here on SO about whether it can be done in a Java desktop program (as opposed to an applet in the browser).
