Take input from IP webcam, manipulate & pump it out - Java

I want to take the input from a standard, off-the-shelf, IP-based webcam (the exact model has yet to be decided, so the API is not yet clear), manipulate it a little, and then pump it back out so that others can view my manipulated image.
Given that this is a little vague, which technologies can you recommend?
I am thinking of using an Android tablet to save costs, so it's probably Java coding. So, how best to get an image stream (plus audio), modify the video, and send the modified video plus unmodified audio?
I might also add file transfer & IM chat into the mix ...
FOSS solutions are highly welcome.

Most IP cameras produce an RTP/RTSP stream encoded as MJPEG, MPEG-4 or H.264.
You would need to write an RTP/RTSP client, then a decoder for the particular stream, manipulate the images, re-encode the stream, and serve it over some standard protocol (again, probably RTP/RTSP).
That is not something Android devices are powerful enough to do, and there are no pure-Java libraries that can do it.
What you should use is Xuggler. If you need to serve streams to Flash and/or iPhone you should add Wowza or Red5.
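Whichever decoding library you end up with (Xuggler, for example, hands decoded video frames over as BufferedImage objects), the manipulation step itself is plain java.awt. A minimal sketch of that step, stamping a text overlay onto a decoded frame; the class name and overlay text are illustrative, not part of any particular library's API:

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class FrameStamper {
    // Draws a text overlay onto a decoded video frame, in place.
    public static BufferedImage stamp(BufferedImage frame, String text) {
        Graphics2D g = frame.createGraphics();
        try {
            g.setColor(Color.WHITE);
            g.drawString(text, 10, frame.getHeight() - 10);
        } finally {
            g.dispose();
        }
        return frame;
    }
}
```

You would call this per frame between the decode and re-encode steps, e.g. `FrameStamper.stamp(frame, "CAM-1 " + System.currentTimeMillis())`.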

Related

Java video streaming project

I must make a video streaming Java program as a university project; however, I don't know how to start!
I must do both the main-server side and the sub-server side; the client side is going to be VLC.
So I need help with these points:
In the main server I must split the videos into, say, 10 KB parts. How do I do that correctly?
How do I stream a video from a sub-server to the client properly?
Note: I prefer to use MP4 videos, but I'm allowed to use whatever I want.
Thanks
You need to decide whether you're building a true live stream (typically Apple HLS or MPEG DASH), or just a pseudo-live stream. Some formats, such as MP4, can be streamed when formatted correctly (see how to do that here).
In the main server I must split the videos to say 10KB parts how to do that correctly?
Sounds like you want to convert an MP4 into MPEG-TS. Have a look at https://github.com/taktik/mpegts-streamer. Another option is to run ffmpeg.
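If the assignment literally means fixed-size byte chunks, plain java.nio will do, though note that raw 10 KB slices of an MP4 are not independently playable; for actual playback you want container-aware segments (the mpegts-streamer or ffmpeg route above). A sketch of the naive byte splitter, with illustrative file naming:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class ChunkSplitter {
    // Splits a file into fixed-size parts named <input>.part0, <input>.part1, ...
    // The last part may be shorter.
    public static List<Path> split(Path input, int chunkBytes) throws IOException {
        List<Path> parts = new ArrayList<>();
        byte[] buf = new byte[chunkBytes];
        try (InputStream in = Files.newInputStream(input)) {
            int n, index = 0;
            while ((n = readFully(in, buf)) > 0) {
                Path part = Paths.get(input.toString() + ".part" + index++);
                try (OutputStream out = Files.newOutputStream(part)) {
                    out.write(buf, 0, n);
                }
                parts.add(part);
            }
        }
        return parts;
    }

    // Fills buf as far as possible; returns the byte count read (0 at EOF).
    private static int readFully(InputStream in, byte[] buf) throws IOException {
        int total = 0;
        while (total < buf.length) {
            int n = in.read(buf, total, buf.length - total);
            if (n < 0) break;
            total += n;
        }
        return total;
    }
}
```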
How to stream a video from a sub-server to the client properly?
Multi-source synchronization is a non-trivial matter when it comes to live streams. Depending on your implementation:
Pseudo-live stream with MP4: make sure your streaming API supports seeking and restarting. When a client reconnects to another endpoint, it may send HTTP headers to indicate where to continue (I'm not sure whether VLC supports this).
True live stream: keep track of the chunks that were served to each client. A message topic or ElastiCache sounds reasonable for that. When a client connects to a sub-server for the first time, analyze its subscription or query ElastiCache to determine the best chunk to serve.
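For the true-live case, the per-client bookkeeping can be prototyped as a simple in-memory map from client ID to the last segment index served; in production you would back this with something shared like the topic or ElastiCache mentioned above. A toy sketch (the names are illustrative):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class SegmentTracker {
    private final Map<String, Integer> lastServed = new ConcurrentHashMap<>();

    // Returns the next segment index for this client: the newest live
    // segment on first contact, then sequential segments on every
    // subsequent request, regardless of which sub-server handles it.
    public int nextSegment(String clientId, int newestLiveSegment) {
        return lastServed.merge(clientId, newestLiveSegment,
                (previous, ignored) -> previous + 1);
    }
}
```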
You may also look at the Ant Media Server project, which is open source.
A complete example with stream-m:
https://github.com/vbence/stream-m/blob/master/README.md
The README contains examples of capture and transmission.

How can I send video stream to client side (Browser) by using RTSP

I want to implement something like the following:
Reading the video stream from an IP camera by using RTSP (which is done)
Processing the image with OpenCV (which is done)
Sending the image to the browser to display (which is the problem)
For the third part, I want to send the images as a video stream by using the RTSP protocol.
Note: The language used on the server side is Java (OpenCV is also in Java), and the server is Tomcat.
And if someone thinks that RTSP is not the best way to implement this, what is the best way to achieve this functionality? RTSP is designed specifically for video streams, so I thought it would be the better choice.
The solution you select will most likely depend on the type of system you are building.
If you have a small number of users who will be happy to add a plugin to their browser, then an RTSP-based approach as you have outlined may work. An example plugin for Chrome you could experiment with is:
https://github.com/VideoExpertsGroup/VXG.Chrome-RTSP-Player
The more usual solution is to convert the RTSP stream into an HTTP stream using a proxy or streaming server - again, the scale and size of your system will probably dictate what you want to do here.
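As an illustration of the proxy idea: if a plugin and a full streaming server are both overkill, one low-tech bridge that works in every browser is Motion JPEG over plain HTTP (multipart/x-mixed-replace), which an img tag renders directly. A minimal sketch using the JDK's built-in HttpServer; the nextFrame() placeholder stands in for your OpenCV pipeline:

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class MjpegServer {
    private static final String BOUNDARY = "frame";

    public static HttpServer start(int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/stream", MjpegServer::handle);
        server.start();
        return server;
    }

    private static void handle(HttpExchange ex) throws IOException {
        ex.getResponseHeaders().set("Content-Type",
                "multipart/x-mixed-replace; boundary=" + BOUNDARY);
        ex.sendResponseHeaders(200, 0); // length 0 = stream until closed
        try (OutputStream out = ex.getResponseBody()) {
            while (true) {
                byte[] jpeg = nextFrame();
                out.write(("--" + BOUNDARY + "\r\n"
                        + "Content-Type: image/jpeg\r\n"
                        + "Content-Length: " + jpeg.length + "\r\n\r\n")
                        .getBytes(StandardCharsets.US_ASCII));
                out.write(jpeg);
                out.write("\r\n".getBytes(StandardCharsets.US_ASCII));
                out.flush();
            }
        } catch (IOException clientGone) {
            // client disconnected; the exchange is closed by try-with-resources
        }
    }

    // Placeholder for the OpenCV pipeline: return the latest processed frame
    // encoded as JPEG. Here: just a dummy JPEG start/end marker pair.
    private static byte[] nextFrame() {
        return new byte[] { (byte) 0xFF, (byte) 0xD8, (byte) 0xFF, (byte) 0xD9 };
    }
}
```

A browser then displays the stream with nothing more than `<img src="http://host:port/stream">`. Note MJPEG carries no audio and compresses far worse than H.264, so it only suits low-scale, low-latency preview use.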
A 'Rolls Royce' solution, which will allow you to address as many users as possible with as good quality as possible (based on the current landscape - video keeps changing...), might be:
video stream encoded with h.264
transcoded into multiple Bit rate versions to support ABR
packaged into fragmented mp4 and streamed with both HLS and MPEG DASH (for maximum device coverage)
ABR essentially allows the client device or player to download the video in chunks, e.g. 10-second chunks, and to select the next chunk from the bit rate most appropriate to the current network conditions.
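The rendition choice at the heart of ABR can be sketched in a few lines: pick the highest bitrate that fits the measured throughput, falling back to the lowest rung when nothing fits. The bitrate ladder below is illustrative:

```java
import java.util.Arrays;

public class AbrSelector {
    // Returns the highest rendition bitrate (kbps) not exceeding the
    // measured throughput; falls back to the lowest rendition otherwise.
    public static int pick(int[] ladderKbps, int measuredKbps) {
        int[] ladder = ladderKbps.clone();
        Arrays.sort(ladder);
        int choice = ladder[0];
        for (int rate : ladder) {
            if (rate <= measuredKbps) choice = rate;
        }
        return choice;
    }
}
```

Real players layer buffer-occupancy heuristics on top of this, but the throughput-matching step is the core of it.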
There is an example using GStreamer, an open source streaming server, and hls streaming in this answer here: https://stackoverflow.com/a/35002374/334402
I do realise your question mentioned Tomcat and Java, but if you do want this level of service then I would really hesitate to build it from scratch yourself, as there is a lot of specialist detail in a streaming server.

Java, serve HLS live video streams

I know the topic is not an easy one, but I am looking for a Java class to send an HLS stream from the server to the client.
I have files being generated, more and more of them over time:
out.m3u8
out0.ts
out1.ts
out2.ts
out3.ts
out4.ts
out5.ts
out6.ts
This is generated using ffmpeg from an original source:
ffmpeg -i http://sourceurl.com:9981/stream/channel/1232131 out.m3u8
I can play it using VLC.
Somehow, I need to stream this live to the clients.
At this point, I do not really care about different bit rates, i just want live streaming to work, in mobile browsers and on desktop browsers.
I found this class:
https://github.com/Red5/red5-hls-plugin/blob/master/plugin/src/main/java/org/red5/stream/http/servlet/PlayList.java
Which might be doing something like that.
I have pulled in hls.js into my application in hopes of using it for desktops.
HLS should, however, work on iOS devices without hls.js right now.
How should one serve HLS content from the server? It's very difficult to find any good and simple example of how to do that.
Does anyone know the steps needed to do that?
I've looked into Wowza and Red5 just a little bit, but I'm unsure what they can provide for me at this stage, and they seem overly complicated to set up just to serve some files. But please explain to me why that's not the case.
The H in HLS stands for HTTP. The point of streaming tech such as HLS, DASH, HDS, Smooth Streaming, etc. is that no special server is necessary: just plain HTTP. You can use something like nginx, or any HTTP server class/library available for Java or any other language.
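In Java terms that can be as small as the JDK's built-in HttpServer pointed at the directory ffmpeg writes into; the only HLS-specific details are the MIME types and not letting clients cache the playlist. A sketch (a real server would also want Range and CORS support):

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.file.Files;
import java.nio.file.Path;

public class HlsFileServer {
    public static HttpServer start(Path dir, int port) throws IOException {
        Path root = dir.toAbsolutePath().normalize();
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/", ex -> serve(root, ex));
        server.start();
        return server;
    }

    private static void serve(Path root, HttpExchange ex) throws IOException {
        // Map the URL path onto the segment directory, refusing escapes.
        Path file = root.resolve(ex.getRequestURI().getPath().substring(1)).normalize();
        if (!file.startsWith(root) || !Files.isRegularFile(file)) {
            ex.sendResponseHeaders(404, -1);
            return;
        }
        String name = file.getFileName().toString();
        String type = name.endsWith(".m3u8") ? "application/vnd.apple.mpegurl"
                    : name.endsWith(".ts")   ? "video/mp2t"
                    : "application/octet-stream";
        ex.getResponseHeaders().set("Content-Type", type);
        if (name.endsWith(".m3u8")) {
            // The playlist changes as new segments land; clients must not cache it.
            ex.getResponseHeaders().set("Cache-Control", "no-cache");
        }
        byte[] body = Files.readAllBytes(file);
        ex.sendResponseHeaders(200, body.length);
        try (OutputStream out = ex.getResponseBody()) {
            out.write(body);
        }
    }
}
```

Point hls.js (or Safari directly) at http://host:port/out.m3u8 while ffmpeg keeps rewriting the playlist and appending segments.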

Real-time video render

We're starting a new project where we should be able to modify video on the fly: add text, insert an image.
The project is planned to be written in Java.
I'm a newbie in video processing, so I want to understand:
What is the best and fastest solution to modify video and return it to the client (web browser)?
I can do it easily using ffmpeg, but MP4 is not streamable, and I would have to write it to a file first and then return it to the client - that's not fast.
Maybe my question is very abstract, but I have to start somewhere :)
Thanks
I'm not sure why you mention java, because it sounds to me as if you need to mix a stream of live video (from a camera) with alpha channel graphics etc. and then distribute it in realtime (like a TV station does?). Correct me if I misunderstand.
I suppose it all depends on the quality of the video you need to distribute; using a digital source like a webcam is not very exciting, although most modern products can do that too. It would be better to use a more professional camera with decent optics that can send a quality stream to a software-controlled video mixer & encoder (input card). Blackmagic do cheap mixers for this kind of thing (ATEM is cheap compared to other products in its class). Wirecast from Telestream is a very cheap alternative solution for doing the whole thing without having to buy additional encoders.
You need to send a stream to a distribution server; the details depend on what you choose to buy (e.g. you can use rather expensive H.264 or even JPEG 2000 encoders and decoders with ATEM). If you do a cheaper RTMP stream with Wirecast or similar, you can handle the distribution with an ffmpeg server, and there are other services and CDNs you can hook into if you use RTMP and want somebody to help you.

What is the best way to develop a screensharing/presenting app in java or c++ using RTP?

I am trying to build a server-side HTML-rendering based browser, much like Skyfire. I am evaluating the technologies that I would need to build this. When a user types in www.yahoo.com on the client box, a session is started on the server; then I grab screenshots and send them in an RTP stream back to the client. To do the RTP bit, I started using JMF: http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/ScreenGrabber.html I found that when I got the RTP stream back on the client, it was extremely slow, and sometimes I would get a lot of artifacts in the video stream. Is there a better way to do this than to use JMF?
Well, if I understand your problem correctly, you need an efficient way to do the RTP streaming. A really good and efficient library for streaming in C++ is live555. To encode your images you can use ffmpeg, and possibly its C++/Java/JMF binding FOBS. This way you can have an efficient streaming server.
It isn't clear which feature of Skyfire you want to implement.
If you're streaming a succession of screenshots and finding it slow, then compress the data you're sending. Presumably one screenshot is only very slightly different from the previous one: to minimize bandwidth, you should only transmit the delta between each screenshot and the next.
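A first cut at that delta idea: compare consecutive screenshots and send only the bounding box of what changed, plus its coordinates. A minimal sketch with BufferedImage; real systems (VNC included) use tile-based encodings that do better than a single rectangle:

```java
import java.awt.Rectangle;
import java.awt.image.BufferedImage;

public class ScreenDelta {
    // Returns the bounding box of pixels that differ between two same-size
    // frames, or null when they are identical. Only this region (plus its
    // coordinates) needs to be compressed and transmitted.
    public static Rectangle changedRegion(BufferedImage prev, BufferedImage next) {
        int minX = Integer.MAX_VALUE, minY = Integer.MAX_VALUE;
        int maxX = -1, maxY = -1;
        for (int y = 0; y < prev.getHeight(); y++) {
            for (int x = 0; x < prev.getWidth(); x++) {
                if (prev.getRGB(x, y) != next.getRGB(x, y)) {
                    minX = Math.min(minX, x); minY = Math.min(minY, y);
                    maxX = Math.max(maxX, x); maxY = Math.max(maxY, y);
                }
            }
        }
        if (maxX < 0) return null; // frames identical, send nothing
        return new Rectangle(minX, minY, maxX - minX + 1, maxY - minY + 1);
    }
}
```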
Look at VNC/VNCViewer. There is even a viewer applet, and IIRC there was a question here on SO about whether it can be done in a Java desktop program (as opposed to an applet in the browser).
