I have created an HTTP streaming server, but the clients cannot play all video formats. My question is whether there is a way, using Xuggler on the server, to transcode the video to a specific format and stream it directly, on the fly.
I mean, without having to wait for the transcoding to finish before starting the HTTP streaming; for example, a loop that repeatedly gets a number of transcoded bytes and writes them to the socket.
Yes, but...
I'd not recommend taking this approach. Encoding video is generally very CPU intensive. The generally accepted approach to this problem is to transcode the video file offline and store it on the streaming server. Yes, that means a couple of different media files with the same video, but it scales much better. Most (all?) successful streaming servers do it this way.
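That said, if you do decide to transcode on the fly, a minimal sketch of the Xuggler transcoding loop could look roughly like this. The file names are placeholders; writing to a client socket instead of a file would additionally require a custom IURLProtocolHandler, which Xuggler supports:

    import com.xuggle.mediatool.IMediaReader;
    import com.xuggle.mediatool.IMediaWriter;
    import com.xuggle.mediatool.ToolFactory;

    public class OnTheFlyTranscoder {
        public static void main(String[] args) {
            // Open the source; Xuggler picks the demuxer/decoders automatically.
            IMediaReader reader = ToolFactory.makeReader("input.avi");
            // Output container/codecs are inferred from the file name.
            IMediaWriter writer = ToolFactory.makeWriter("output.flv", reader);
            reader.addListener(writer);
            // Each readPacket() call decodes and re-encodes a little more of
            // the stream, so transcoded bytes appear incrementally rather
            // than all at once when the job finishes.
            while (reader.readPacket() == null) {
                // keep pumping until end of stream
            }
        }
    }

Each iteration of that loop makes more encoded output available, which is what would let a server start serving bytes before the whole file is done.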
Related
I must make a video streaming Java program as a university project, but I don't know how to start!
I must do both the main-server side and the sub-server side; the client side is going to be VLC.
So I need help with these points:
In the main server I must split the videos into, say, 10 KB parts. How do I do that correctly?
How do I stream a video from a sub-server to the client properly?
Note: I prefer to use MP4 videos, but I'm allowed to use whatever I want.
Thanks
You need to decide whether you're building a true live stream (typically Apple HLS or MPEG DASH), or just a pseudo-live stream. Some formats, such as MP4, can be streamed when formatted correctly (see how to do that here).
In the main server I must split the videos into, say, 10 KB parts. How do I do that correctly?
Sounds like you want to convert an MP4 into an MPEG-TS. Have a look at https://github.com/taktik/mpegts-streamer. Another option is to run ffmpeg; see the sketch below.
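For instance, a minimal sketch of driving ffmpeg from Java to cut an MP4 into MPEG-TS segments plus an HLS playlist; the segment duration and file names are placeholder choices, and ffmpeg is assumed to be on the PATH:

    import java.io.IOException;

    public class Segmenter {
        public static void segmentToTs(String input)
                throws IOException, InterruptedException {
            // Re-mux an MP4 into ~10-second MPEG-TS segments plus a playlist.
            Process p = new ProcessBuilder(
                    "ffmpeg", "-i", input,
                    "-c", "copy",            // no re-encode, just re-mux
                    "-f", "hls",
                    "-hls_time", "10",       // target segment duration in seconds
                    "-hls_list_size", "0",   // keep every segment in the playlist
                    "playlist.m3u8")
                .inheritIO()
                .start();
            p.waitFor();
        }
    }

Note that HLS/DASH tooling splits by duration rather than by byte count; fixed 10 KB parts are unusual for video because segments should start at decodable boundaries (keyframes).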
How do I stream a video from a sub-server to the client properly?
Multi-source synchronization is a non-trivial matter when it comes to live streams. Depending on your implementation:
Pseudo-live stream with MP4: make sure your streaming API supports seeking and restarting. When a client reconnects to another endpoint, it may send HTTP Range headers to indicate where to continue (I am not sure whether VLC supports this); a sketch of range handling follows after this list.
True live stream: keep track of the chunks that were served to each client. A message topic or ElastiCache sounds reasonable for that. When a client connects to a sub-server for the first time, analyze the subscription or query ElastiCache to determine the best chunk to resume from.
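A minimal sketch of the byte-range handling mentioned above, written as a servlet; the video path is a placeholder and real code needs validation and error handling:

    import java.io.*;
    import javax.servlet.http.*;

    public class Mp4Servlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            File video = new File("/videos/sample.mp4"); // placeholder path
            long length = video.length();
            long start = 0, end = length - 1;

            String range = req.getHeader("Range"); // e.g. "bytes=1048576-"
            if (range != null && range.startsWith("bytes=")) {
                String[] parts = range.substring(6).split("-");
                start = Long.parseLong(parts[0]);
                if (parts.length > 1 && !parts[1].isEmpty()) {
                    end = Long.parseLong(parts[1]);
                }
                resp.setStatus(HttpServletResponse.SC_PARTIAL_CONTENT);
                resp.setHeader("Content-Range",
                        "bytes " + start + "-" + end + "/" + length);
            }
            resp.setHeader("Accept-Ranges", "bytes");
            resp.setContentType("video/mp4");
            resp.setContentLengthLong(end - start + 1);

            // Stream just the requested byte window to the client.
            try (RandomAccessFile in = new RandomAccessFile(video, "r");
                 OutputStream out = resp.getOutputStream()) {
                in.seek(start);
                byte[] buf = new byte[8192];
                long remaining = end - start + 1;
                int n;
                while (remaining > 0 && (n = in.read(buf, 0,
                        (int) Math.min(buf.length, remaining))) != -1) {
                    out.write(buf, 0, n);
                    remaining -= n;
                }
            }
        }
    }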
You may look at the Ant Media Server project, which is open source.
A complete example with stream-m:
https://github.com/vbence/stream-m/blob/master/README.md
The README contains examples of capture and transmission.
I want to implement something like the following:
Reading the video stream from an IP camera using RTSP (which is done)
Processing the image by OpenCV (which is done)
Sending the image to the browser to display (which is the problem)
For the third part, I want to send the images as a video stream using the RTSP protocol.
Note: the language used on the server side is Java (OpenCV is also used from Java), and the server is Tomcat.
And if someone thinks that using RTSP is not the best approach, what is the best way to achieve this functionality? RTSP is designed for video streaming, so I thought it would be the better choice.
The solution you select will most likely depend on the type of system you are building.
If you have a small number of users who will be happy to add a plugin to their browser, then an RTSP-based approach as you have outlined may work. An example plugin for Chrome you could experiment with is:
https://github.com/VideoExpertsGroup/VXG.Chrome-RTSP-Player
The more usual solution is to convert the RTSP stream into an HTTP stream using a proxy or streaming server; again, the scale and size of your system will probably dictate what you want to do here.
A 'Rolls Royce' solution, which will allow you to address as many users as possible with as good quality as possible (based on the current landscape; video keeps changing...), might be:
video stream encoded with H.264
transcoded into multiple bit-rate versions to support ABR
packaged into fragmented MP4 and streamed with both HLS and MPEG-DASH (for maximum device coverage)
ABR essentially allows the client device or player to download the video in chunks, e.g. 10-second chunks, and select the next chunk from the bit rate most appropriate to the current network conditions.
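The selection rule itself is simple; a hedged sketch of the usual heuristic (bitrates and the safety factor are illustrative values, not from any particular player):

    public class AbrSelector {
        // Available variant bitrates in bits/sec (illustrative values).
        static final long[] RENDITIONS = {400_000, 1_200_000, 3_000_000, 6_000_000};

        // Pick the highest rendition that fits the measured throughput,
        // leaving headroom for network variance; fall back to the lowest.
        static long pick(long measuredThroughputBps) {
            long chosen = RENDITIONS[0];
            for (long r : RENDITIONS) {
                if (r <= measuredThroughputBps * 0.8) {
                    chosen = r;
                }
            }
            return chosen;
        }
    }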
There is an example using GStreamer, an open source multimedia framework, together with HLS streaming in this answer: https://stackoverflow.com/a/35002374/334402
I do realise your question mentioned Tomcat and Java, but if you do want this level of service then I would really hesitate to build it from scratch yourself, as there is a lot of specialist detail in a streaming server.
We're starting a new project where we should be able to modify video on the fly: add text, insert an image.
The project is planned to be written in Java.
I'm a newbie in video processing, so I want to understand.
What is the best and fastest solution to modify video and return it to the client (web browser)?
I can easily do it using ffmpeg, but MP4 is not streamable while it is still being written: I would have to write it to a file first and then return it to the client, which is not fast.
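A rough sketch of the file-free approach I have in mind, assuming ffmpeg is on the PATH: fragmented MP4 does not need a seekable output, so ffmpeg can write it to stdout and the bytes can be copied straight to the HTTP response (input name and filter settings are placeholders):

    import java.io.*;

    public class OverlayStreamer {
        public static void stream(String input, OutputStream httpOut)
                throws IOException {
            Process ffmpeg = new ProcessBuilder(
                    "ffmpeg", "-i", input,
                    // Burn a text overlay into the video (placeholder filter).
                    "-vf", "drawtext=text='Hello':x=10:y=10:fontcolor=white",
                    // Fragmented MP4 is streamable: no trailing moov atom needed.
                    "-movflags", "frag_keyframe+empty_moov",
                    "-f", "mp4", "pipe:1") // write the result to stdout
                .redirectError(ProcessBuilder.Redirect.DISCARD) // don't let stderr block
                .start();
            try (InputStream in = ffmpeg.getInputStream()) {
                in.transferTo(httpOut); // forward encoded bytes as they arrive
            }
        }
    }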
Maybe my question is very abstract, but I have to start somewhere :)
Thanks
I'm not sure why you mention Java, because it sounds to me as if you need to mix a stream of live video (from a camera) with alpha-channel graphics etc. and then distribute it in real time (like a TV station does?). Correct me if I misunderstand.
I suppose it all depends on the quality of the video you need to distribute. Using a digital source like a webcam is not very exciting, although most modern products can do that too. It would be better to use a more professional camera with a decent lens that can send a quality stream to a software-controlled video mixer and encoder (input card). Blackmagic makes cheap mixers for this kind of thing (the ATEM is cheap compared to other products in its class). Wirecast from Telestream is a very cheap alternative for doing the whole thing without having to buy additional encoders.
You then need to send a stream to a distribution server; the details depend on what you choose to buy (e.g. you can use rather expensive H.264 or even JPEG 2000 encoders and decoders with the ATEM). If you produce a cheaper RTMP stream with Wirecast or similar, you can handle the distribution with ffmpeg's ffserver, and there are other services and CDNs you can hook into if you use RTMP and want somebody to help you.
I find it hard to find conclusive information on this. I have a dedicated server in a datacenter running Debian 5.0. I have an iPhone/iPad app that uses a Java EE (Glassfish 2.1) backend, and I am in the process of adding video to the app. This includes live streaming, and since videos are longer than 10 minutes, I need HTTP Live Streaming.
What is the best open-source/free solution to implement? This is only a pilot project, so I don't want to subscribe to any paid service. I currently have nothing in place for the live streaming, so I am flexible to adopt any system (server or client side).
I came across:
Darwin Streaming Server (but I am not sure that project is still alive, as there is not a lot of info)
Red5 (but I cannot find conclusive information on whether it would allow an easy implementation of HTTP Live Streaming)
FFmpeg
Regarding the videos, I would ideally like to upload a 720p version to the server (for iPad) and then convert it automatically (either on the fly when requested, or prepared when the file is uploaded) to the formats required for iPhone/iPod touch and low bandwidth. For live streaming I would like to be able to serve the content within about 30 seconds of it streaming into the server.
I am not envisaging high demand (i.e. many simultaneous requests; and if there were, e.g. for a live event, they would be on one stream, which HTTP Live Streaming should handle well since it only needs encoding and segmenting once).
In the (not so near) future, Android will probably be supported by the app as well.
Any hints/tutorial/suggestions/advice would be really appreciated.
Wowza is pretty good for live streaming to iOS (as well as Flash).
It isn't free, though.
The latest development version of VLC supports HTTP Live Streaming.
You'll have to build from source, as this was added to the git repository only recently.
http://wiki.videolan.org/Documentation:Streaming_HowTo/Streaming_for_the_iPhone
I am now using the Xuggler framework, which is Java based. It seems to do exactly the job I am looking for, although no built-in segmenter etc. is available. Instead, I am now trying to write one myself that at the same time integrates exactly with my system.
Refer to Apple's HTTP Live Streaming documentation and best practices:
https://developer.apple.com/streaming/
This should be a good place to get started.
What is the source of the live video? The iPhone only supports playback of H.264 baseline profile level 3 or MPEG-4 video with AAC audio. The iPhone itself encodes video to these specs, but most other encoders don't (including many Android phones). If your video is not encoded to this spec, you'll first have to transcode; FFmpeg (with libx264) will do this nicely.

Then you'll need to generate the dynamic .m3u8 playlist file. Wowza will do this for you out of the box, and will accept an RTMP stream from FFmpeg (but it is not free). I don't believe that Red5 supports Apple HTTP streaming. There are free servers that claim to, but I've never used them; take a look at http://erlyvideo.org/.

Otherwise, you can do it yourself fairly simply. FFmpeg will output an MPEG-TS stream. All that the playlist generator needs to do, then, is cut this into 188-byte-aligned chunks and return a playlist containing the last n. You can even use an HTTP byte-offset module to make the playlist reference a single file. Read Apple's HTTP streaming docs at https://developer.apple.com/streaming/
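A hedged sketch of that do-it-yourself segmenter: read an MPEG-TS stream (e.g. piped in from FFmpeg), cut it on 188-byte packet boundaries into fixed-size segments, and rewrite a playlist holding the last n. Segment size, durations and file names are illustrative placeholders, and a trailing partial segment is simply dropped:

    import java.io.*;
    import java.util.*;

    public class TsSegmenter {
        static final int TS_PACKET = 188;            // MPEG-TS packet size
        static final int PACKETS_PER_SEGMENT = 5000; // ~940 KB per segment (placeholder)
        static final int WINDOW = 3;                 // playlist keeps the last n segments

        public static void main(String[] args) throws IOException {
            Deque<String> window = new ArrayDeque<>();
            byte[] packet = new byte[TS_PACKET];
            int seq = 0;
            try (DataInputStream in = new DataInputStream(System.in)) {
                while (true) {
                    String name = "segment" + seq + ".ts";
                    try (OutputStream out =
                            new BufferedOutputStream(new FileOutputStream(name))) {
                        for (int i = 0; i < PACKETS_PER_SEGMENT; i++) {
                            in.readFully(packet); // stays 188-byte aligned
                            out.write(packet);
                        }
                    } catch (EOFException eof) {
                        break; // end of input; drop the partial segment
                    }
                    window.addLast(name);
                    if (window.size() > WINDOW) {
                        window.removeFirst();
                    }
                    writePlaylist(window, seq);
                    seq++;
                }
            }
        }

        static void writePlaylist(Deque<String> segments, int seq) throws IOException {
            try (PrintWriter pw = new PrintWriter("playlist.m3u8")) {
                pw.println("#EXTM3U");
                pw.println("#EXT-X-VERSION:3");
                pw.println("#EXT-X-TARGETDURATION:10");
                pw.println("#EXT-X-MEDIA-SEQUENCE:" + (seq - segments.size() + 1));
                for (String s : segments) {
                    pw.println("#EXTINF:10.0,"); // nominal duration (placeholder)
                    pw.println(s);
                }
            }
        }
    }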
I am trying to build a server-side HTML-rendering browser, much like Skyfire, and I am evaluating the technologies I would need to build it. When a user types in www.yahoo.com on the client box, a session is started on the server; then I grab screenshots and send them in an RTP stream back to the client. For the RTP bit, I started using JMF: http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/ScreenGrabber.html. I found that when I got the RTP stream back on the client, it was extremely slow, and sometimes I would get a lot of artifacts in the video stream. Is there a better way to do this than JMF?
Well, if I understand your problem correctly, you need an efficient way to do the RTP streaming. A really good and efficient library for streaming in C++ is live555. To encode your images you can use ffmpeg and possibly its C++/Java/JMF binding, FOBS. This way you can have an efficient streaming server.
It isn't clear which feature of Skyfire you want to implement.
If you're streaming a succession of screenshots and finding it slow, then compress the data you're sending. Presumably each screenshot differs only very slightly from the previous one: to minimize bandwidth, you should transmit only the delta between consecutive screenshots.
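For example, a minimal sketch of computing per-tile deltas between consecutive screenshots; the tile size and the Tile holder are illustrative, and a real protocol would also compress each tile before sending:

    import java.awt.image.BufferedImage;
    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    public class ScreenDelta {
        static final int TILE = 64; // tile edge in pixels (placeholder)

        // A changed rectangle of the screen, ready to be encoded and sent.
        record Tile(int x, int y, BufferedImage pixels) {}

        static List<Tile> delta(BufferedImage prev, BufferedImage curr) {
            List<Tile> changed = new ArrayList<>();
            for (int y = 0; y < curr.getHeight(); y += TILE) {
                for (int x = 0; x < curr.getWidth(); x += TILE) {
                    int w = Math.min(TILE, curr.getWidth() - x);
                    int h = Math.min(TILE, curr.getHeight() - y);
                    if (!sameRegion(prev, curr, x, y, w, h)) {
                        changed.add(new Tile(x, y, curr.getSubimage(x, y, w, h)));
                    }
                }
            }
            return changed; // transmit only these tiles to the client
        }

        static boolean sameRegion(BufferedImage a, BufferedImage b,
                                  int x, int y, int w, int h) {
            int[] pa = a.getRGB(x, y, w, h, null, 0, w);
            int[] pb = b.getRGB(x, y, w, h, null, 0, w);
            return Arrays.equals(pa, pb);
        }
    }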
Look at VNC/VNCViewer. There is even a viewer applet, and IIRC there was a question here on SO about whether it can be done in a Java desktop program (as opposed to in an applet in the browser).