Real-time video rendering - Java

We're starting a new project where we should be able to modify video on the fly - add text, insert images.
The project is planned to be written in Java.
I'm a newbie in video processing, so I want to understand:
What is the best and fastest solution to modify video and return it to the client (a web browser)?
I can do it easily using ffmpeg, but MP4 is not streamable, so I would have to write it to a file first and then return it to the client - that's not fast.
Maybe my question is very abstract, but I have to start somewhere :)
Thanks

I'm not sure why you mention Java, because it sounds to me as if you need to mix a stream of live video (from a camera) with alpha-channel graphics etc. and then distribute it in real time (like a TV station does?). Correct me if I misunderstand.
I suppose it all depends on the quality of the video you need to distribute; using a digital source like a webcam is not very exciting - although most modern products can do that too. It would be better to use a more professional camera with a decent lens that can send a quality stream to a software-controlled video mixer & encoder (input card). Blackmagic make cheap mixers for this kind of thing (ATEM is cheap compared to other products in its class). Wirecast from Telestream is a very cheap alternative solution for doing the whole thing without having to buy additional encoders.
You need to send a stream to a distribution server. What that looks like depends on what you choose to buy (e.g. you can use rather expensive H.264 or even JPEG2000 encoders and decoders with ATEM). If you do a cheaper RTMP stream with Wirecast or the like, you can handle the distribution with FFmpeg's ffserver - and there are other services and CDNs you can hook into if you use RTMP and want somebody to help you.
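On the original concern that MP4 must be written to a file before it can be returned: FFmpeg can produce fragmented MP4 and write it straight to a pipe, which a servlet can forward to the browser as the bytes arrive. A minimal sketch, assuming ffmpeg is on the server's PATH; the input file and the drawtext overlay are placeholders for whatever on-the-fly modification is needed:

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;

    /**
     * Sketch: pipe a fragmented (streamable) MP4 from ffmpeg to a client
     * without writing an intermediate file. Input path and filter
     * arguments are hypothetical placeholders.
     */
    public class FragmentedMp4Pipe {

        public static void stream(OutputStream clientOut) throws IOException, InterruptedException {
            ProcessBuilder pb = new ProcessBuilder(
                    "ffmpeg",
                    "-i", "input.mp4",                         // source video (placeholder)
                    "-vf", "drawtext=text='Hello':x=10:y=10",  // example on-the-fly text overlay
                    "-c:v", "libx264", "-c:a", "aac",
                    // fragmented MP4 is playable before the file is complete
                    "-movflags", "frag_keyframe+empty_moov",
                    "-f", "mp4", "pipe:1");                    // write to stdout
            Process ffmpeg = pb.start();

            try (InputStream in = ffmpeg.getInputStream()) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    clientOut.write(buf, 0, n); // forward bytes as they are produced
                }
            }
            ffmpeg.waitFor();
        }
    }

From a servlet you would call stream(response.getOutputStream()) after setting Content-Type: video/mp4; note that the drawtext filter requires an FFmpeg build with libfreetype.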

Related

How can I send a video stream to the client side (browser) using RTSP?

I want to implement something like the below:
1. Reading the video stream from an IP camera using RTSP (which is done)
2. Processing the image with OpenCV (which is done)
3. Sending the image to the browser to display (which is the problem)
For the third part, I want to send the images as a video stream using the RTSP protocol.
Note: the language used on the server side is Java (OpenCV is also used from Java), and the server is Tomcat.
And if someone thinks that using RTSP is not the best way to implement this, then what is the best way to achieve this functionality? Since RTSP is designed for video streaming, I think it may be the better choice.
The solution you select will most likely depend on the type of system you are building.
If you have a small number of users who will be happy to add a plugin to their browser, then an RTSP-based approach as you have outlined may work - an example plugin for Chrome you could experiment with is:
https://github.com/VideoExpertsGroup/VXG.Chrome-RTSP-Player
The more usual solution is to convert the RTSP stream into an HTTP stream using a proxy or streaming server - again, the scale and size of your system will probably dictate what you want to do here.
A 'Rolls Royce' solution, which will allow you to address as many users as possible with as good quality as possible (based on the current landscape - video keeps changing...), might be:
video stream encoded with h.264
transcoded into multiple Bit rate versions to support ABR
packaged into fragmented mp4 and streamed with both HLS and MPEG DASH (for maximum device coverage)
ABR essentially allows the client device or player to download the video in chunks, e.g. 10-second chunks, and to select each next chunk from the bit rate most appropriate to the current network conditions.
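To make that selection step concrete, here is a rough Java sketch of the decision an ABR player makes for each chunk; the bitrate ladder and the safety margin are invented for illustration:

    /**
     * Illustrative sketch of ABR chunk selection: pick the highest rendition
     * whose bit rate fits under the measured throughput, with a safety
     * margin. The ladder values and the margin are made-up examples.
     */
    public class AbrSelector {

        // Available renditions in bits per second (hypothetical ladder)
        private static final long[] LADDER_BPS = {400_000, 1_200_000, 3_000_000, 6_000_000};

        /** @param measuredBps smoothed download throughput of recent chunks */
        public static long pickNextBitrate(long measuredBps) {
            long budget = (long) (measuredBps * 0.8); // keep a 20% safety margin
            long choice = LADDER_BPS[0];              // always fall back to the lowest
            for (long rate : LADDER_BPS) {
                if (rate <= budget) {
                    choice = rate;
                }
            }
            return choice;
        }
    }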
There is an example using GStreamer, an open source multimedia framework, and HLS streaming in this answer here: https://stackoverflow.com/a/35002374/334402
I do realise your question mentioned Tomcat and Java, but if you do want this level of service then I would really hesitate to build it from scratch yourself, as there is a lot of specialist detail in a streaming server.

HTTP Live Streaming on a Linux server

I find it hard to find conclusive information on this. I have a dedicated server in a datacenter running Debian 5.0. I have an iPhone/iPad app that uses a Java EE (Glassfish 2.1) backend, and I am in the process of implementing video in the app. This includes live streaming, and since videos are longer than 10 minutes I need HTTP Live Streaming.
What is the best open-source/free solution to implement this? This is only a pilot project, so I don't want to subscribe to any paid service. I currently have nothing in place for the live streaming, so I am flexible to adopt any system (server or client side).
I came across:
Darwin Streaming Server (but I am not sure the project is still alive, as there is not a lot of info)
Red5 (but I cannot find conclusive information on whether it would allow an easy implementation of HTTP Live Streaming)
FFmpeg
Regarding the videos, I would ideally like to upload a 720p version to the server (for iPad) and then convert automatically (either on the fly when requested, or prepared when the file is uploaded) to the formats required for iPhone/iPod touch and low bandwidth. For live streaming I would like to be able to serve the content within about 30 seconds of it streaming into the server.
I am not envisaging high demand (i.e. many simultaneous requests); if it does occur (e.g. for a live event) it would be on one stream, which HTTP Live Streaming should be able to deal with, since the content only needs encoding and segmenting once.
In the (not so near) future, Android will probably be made part of the app as well.
Any hints/tutorial/suggestions/advice would be really appreciated.
Wowza is pretty good for live streaming to iOS (as well as Flash). It isn't free, though.
The latest development version of VLC supports HTTP Live Streaming.
You'll have to build it from source, as this was added to the git repository not long ago.
http://wiki.videolan.org/Documentation:Streaming_HowTo/Streaming_for_the_iPhone
I am now using the Xuggler framework, which is Java based. It seems to do exactly the job I am looking for, although no built-in segmenter etc. is available. Instead, I am now trying to write one myself that integrates neatly with my system.
Refer to Apple's HTTP Live Streaming documentation and best practices:
https://developer.apple.com/streaming/
This should be a good starting point.
What is the source of the live video? The iPhone only supports playback of H.264 Baseline Profile level 3 or MPEG-4 video with AAC audio. The iPhone itself encodes video to these specs, but most other encoders don't (including many Android phones). If your video is not encoded to this spec, you'll first have to transcode; FFmpeg (with libx264) will do this nicely.
Then you'll need to generate the dynamic .m3u8 playlist file. Wowza will do this for you out of the box, and will accept an RTMP stream from FFmpeg (but is not free). I don't believe that Red5 supports Apple HTTP streaming. There are free servers that claim to, but I've never used them. Take a look at http://erlyvideo.org/.
Otherwise, you can do it yourself fairly simply. FFmpeg will output an MPEG-TS stream. All that the playlist generator then needs to do is cut this into 188-byte-aligned chunks and return a playlist containing the last n. You can even use an HTTP byte-offset module to make the playlist reference a single file. Read Apple's HTTP streaming docs at https://developer.apple.com/streaming/
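For the do-it-yourself route described above, a hedged sketch of the playlist side in Java, assuming a segmenter has already produced files named segment0.ts, segment1.ts, ... (the naming, window size and durations are invented for illustration):

    /**
     * Sketch: build a sliding-window HLS playlist over the last few
     * segments. Segment naming, duration and window size are hypothetical.
     */
    public class LivePlaylist {

        private static final int WINDOW = 3;        // keep the last 3 segments
        private static final int SEGMENT_SECS = 10; // nominal segment duration

        /** @param newestIndex index of the most recently finished segment */
        public static String build(int newestIndex) {
            int first = Math.max(0, newestIndex - WINDOW + 1);
            StringBuilder sb = new StringBuilder();
            sb.append("#EXTM3U\n");
            sb.append("#EXT-X-VERSION:3\n");
            sb.append("#EXT-X-TARGETDURATION:").append(SEGMENT_SECS).append('\n');
            // the media sequence must advance as old segments drop off
            sb.append("#EXT-X-MEDIA-SEQUENCE:").append(first).append('\n');
            for (int i = first; i <= newestIndex; i++) {
                sb.append("#EXTINF:").append(SEGMENT_SECS).append(".0,\n");
                sb.append("segment").append(i).append(".ts\n");
            }
            return sb.toString(); // serve as application/vnd.apple.mpegurl
        }
    }

The server regenerates this playlist each time a segment completes, and clients simply re-poll it.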

How to stream and transcode media files using Java (on Tomcat)?

This has been discussed before here. Using Java, I have developed my web services on Tomcat for a media library. I want to add functionality for streaming media while dynamically transcoding it as appropriate for mobile clients. There are a few questions I am pondering:
1. How exactly do I stream the files (both audio and video)? I am coming across many streaming servers, but I want something done in my own code on Tomcat itself. Do I need to install one more server, i.e. the streaming server, and then redirect streaming requests to that server from Tomcat?
2. Is it really a good idea to transcode dynamically? Static transcoding means we have to replicate the same file in N formats - something that is space-consuming and that I don't want. So is there a way out?
3. Is it possible to stream the data "as it is transcoded"? That is, I don't want to start streaming only when the transcoding has finished (as that introduces latency) - rather, I want to stream the transcoded bytes as they are produced. I apologize if this is an absurd requirement; I have no experience of either transcoding or streaming.
4. Are alternatives like FFmpeg, Xuggler and the other technologies mentioned here a better approach for getting the job done?
I don't want to use any proprietary / paid alternative to achieve this goal, and I also want this to work in production environments. Hope to get some help here...
Thanks a lot!
Red5 is another possible solution. It's open source and is essentially Tomcat with some added features. I don't know how far back in time the split from the Tomcat codebase occurred, but the basics are all there (and the source - so you can patch what's missing).
Xuggler is a library 'front end' for FFmpeg and plays nicely with Red5. If you intend to do lots of transcoding, you'll probably run into this code along the way.
Between these two projects you can change A/V format and stream various media.
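As a rough illustration of what Xuggler's mediatool API makes this look like (the file names are placeholders, and exact codec support depends on the bundled FFmpeg build):

    import com.xuggle.mediatool.IMediaReader;
    import com.xuggle.mediatool.IMediaWriter;
    import com.xuggle.mediatool.ToolFactory;

    /**
     * Sketch: transcode one container/codec to another with Xuggler's
     * mediatool API. Input and output names are placeholders.
     */
    public class SimpleTranscode {
        public static void main(String[] args) {
            // the reader decodes the source...
            IMediaReader reader = ToolFactory.makeReader("input.flv");
            // ...the writer re-encodes, guessing output formats from the extension
            IMediaWriter writer = ToolFactory.makeWriter("output.mp4", reader);
            reader.addListener(writer);
            // pump packets until the end of the stream
            while (reader.readPacket() == null) {
            }
        }
    }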
Unless you really need to roll your own, I'd recommend an OSS project with good community support.
For your questions:
1.) This is the standard space vs. performance tradeoff. You see the same thing in generating hash tables and other computationally expensive operations. If space is a larger issue than processor time, then dynamic transcoding is your only way out.
2.) Yes, you can stream during the transcode process. VLC http://www.videolan.org/vlc/ does this.
3.) I'd really look into VLC if I were you.

Take input from an IP webcam, manipulate & pump it out

I want to take the input from a standard, off-the-shelf, IP-based webcam - which has yet to be decided, so the API is not yet clear - manipulate it a little and then pump it back out so that others can view my manipulated image.
Given that this is a little vague, which technologies can you recommend?
I am thinking of using an Android slate to save costs, so it's probably Java coding. So, how best to get an image stream (plus audio), modify the video stream, and send the modified video plus the unmodified audio?
I might also add file transfer & IM chat into the mix ...
FOSS solutions highly welcomed
Most IP cameras produce an RTP/RTSP stream encoded with MJPEG, MPEG-4 or H.264.
You would need to write an RTP/RTSP client and then a decoder for the particular stream, then manipulate the images, re-encode the stream and serve it over some standard protocol (again, probably RTP/RTSP).
That is not something Android devices are powerful enough to do. Also, there are no pure Java libraries that can do this.
What you should use is Xuggler. If you need to serve streams to Flash and/or iPhone you should add Wowza or Red5.
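To sketch what the Xuggler route looks like for the decode/manipulate/re-encode loop (the camera URL, output name, frame size and overlay are all hypothetical placeholders, and RTSP input support depends on the underlying FFmpeg build):

    import java.awt.Color;
    import java.awt.Graphics2D;
    import java.awt.image.BufferedImage;
    import java.util.concurrent.TimeUnit;

    import com.xuggle.mediatool.IMediaReader;
    import com.xuggle.mediatool.IMediaWriter;
    import com.xuggle.mediatool.MediaListenerAdapter;
    import com.xuggle.mediatool.ToolFactory;
    import com.xuggle.mediatool.event.IVideoPictureEvent;
    import com.xuggle.xuggler.ICodec;

    /** Sketch: decode frames, draw an overlay, re-encode them. */
    public class OverlayPipeline {

        public static void main(String[] args) {
            // output writer; stream index, id, codec and frame size are examples
            final IMediaWriter writer = ToolFactory.makeWriter("annotated.mp4");
            writer.addVideoStream(0, 0, ICodec.ID.CODEC_ID_H264, 640, 480);

            IMediaReader reader = ToolFactory.makeReader("rtsp://camera.local/stream");
            reader.setBufferedImageTypeToGenerate(BufferedImage.TYPE_3BYTE_BGR);

            reader.addListener(new MediaListenerAdapter() {
                @Override
                public void onVideoPicture(IVideoPictureEvent event) {
                    BufferedImage frame = event.getImage();
                    Graphics2D g = frame.createGraphics();
                    g.setColor(Color.RED);
                    g.drawString("LIVE", 10, 20); // example manipulation
                    g.dispose();
                    // re-encode the modified frame with its original timestamp
                    writer.encodeVideo(0, frame,
                            event.getPicture().getTimeStamp(), TimeUnit.MICROSECONDS);
                }
            });

            while (reader.readPacket() == null) {
                // keep pumping packets until the stream ends
            }
            writer.close();
        }
    }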

What is the best way to develop a screensharing/presenting app in java or c++ using RTP?

I am trying to build a browser based on server-side HTML rendering, much like Skyfire. I am evaluating the technologies I would need to build this. When a user types in www.yahoo.com on the client box, a session is started on the server; I then grab screenshots and send them in an RTP stream back to the client. To do the RTP bit, I started using JMF: http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/ScreenGrabber.html I found that when I got the RTP stream back on the client, it was extremely slow and I would sometimes get a lot of artifacts in the video stream. Is there a better way to do this than to use JMF?
Well, if I understand your problem correctly, you need an efficient way to do the RTP streaming. A really good and efficient library for streaming in C++ is live555. To encode your images you can use ffmpeg and possibly its C++/Java/JMF binding FOBS. This way you can have an efficient streaming server.
It isn't clear which feature of Skyfire you want to implement.
If you're streaming a succession of screenshots and finding it slow, then compress the data you're sending. Presumably each screenshot is only very slightly different from the previous one: to minimize bandwidth, you should only transmit the delta between consecutive screenshots.
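A minimal sketch of that delta idea in Java, comparing consecutive screenshots and returning the bounding box of changed pixels (a per-pixel getRGB comparison is shown for clarity; a real implementation would compare raster rows in bulk):

    import java.awt.Rectangle;
    import java.awt.image.BufferedImage;

    /**
     * Sketch: find the dirty rectangle between two consecutive screenshots
     * so only the changed region needs to be encoded and sent.
     */
    public class ScreenDelta {

        /** @return bounding box of changed pixels, or null if the frames are identical */
        public static Rectangle dirtyRect(BufferedImage prev, BufferedImage curr) {
            int minX = Integer.MAX_VALUE, minY = Integer.MAX_VALUE;
            int maxX = -1, maxY = -1;
            for (int y = 0; y < curr.getHeight(); y++) {
                for (int x = 0; x < curr.getWidth(); x++) {
                    if (prev.getRGB(x, y) != curr.getRGB(x, y)) {
                        if (x < minX) minX = x;
                        if (y < minY) minY = y;
                        if (x > maxX) maxX = x;
                        if (y > maxY) maxY = y;
                    }
                }
            }
            if (maxX < 0) return null; // nothing changed
            return new Rectangle(minX, minY, maxX - minX + 1, maxY - minY + 1);
        }
    }

The sender then transmits only the sub-image for that rectangle plus its coordinates, and the receiver patches it onto its copy of the previous frame.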
Look at VNC/VNCViewer. There is even a viewer applet, and IIRC there was a question here on SO about whether it can be done in a Java desktop program (as opposed to an applet in the browser).
