I have to stream some videos from a database. I planned to use the Java Media Framework, because I get Blob objects from the result of a query, but apparently JMF cannot play videos from streams, only from URLs. Does anyone know how to solve this problem? If JMF cannot solve it, can someone recommend a different framework?
You could register a protocol handler and then use that special URL within JMF. Registering protocol handlers is a bit involved in my view, but there are many examples on the internet.
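Registering a handler boils down to two parts: implementing a javax.media.protocol.DataSource (for a pull source, typically extending PullDataSource) in a package that follows JMF's <prefix>.media.protocol.<protocol>.DataSource naming convention, and adding your prefix to the PackageManager. Here is a minimal sketch of the registration part; the com.example prefix and the blob protocol are assumed names, not anything JMF ships with:

```java
import java.util.Vector;
import javax.media.PackageManager;

public class BlobProtocolRegistration {

    @SuppressWarnings("unchecked")
    public static void register() {
        // JMF builds the DataSource class name from the URL protocol, e.g. a
        // "blob://videos/42" URL would resolve to
        // com.example.media.protocol.blob.DataSource (names are examples).
        Vector prefixes = PackageManager.getProtocolPrefixList();
        if (!prefixes.contains("com.example")) {
            prefixes.add("com.example");
            PackageManager.setProtocolPrefixList(prefixes);
            PackageManager.commitProtocolPrefixList(); // persist for later JVM runs
        }
    }
}
```

Your custom DataSource would then read the Blob's binary stream from the database and serve it to JMF through its PullSourceStream implementation.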
I am using a simple VideoView which loads an online video from a URL.
A few videos work, but not all of them. Does Android only support certain types of videos? If so, is there a way to support all formats?
I think that, as a developer, you can really only rely on support for formats in Google's stock codec and file type list, which is here:
https://developer.android.com/guide/topics/media/media-formats.html
It's not all that extensive and, in practice, device vendors usually add extra capabilities. If you're writing something to run on multiple devices, you'd probably have to assume the worst for maximum compatibility.
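If you want to check at runtime what a particular device actually supports (API 21+), you can query MediaCodecList. This is a small sketch; the helper name and the example MIME types/resolutions are mine, not from the docs:

```java
import android.media.MediaCodecList;
import android.media.MediaFormat;

public final class CodecSupport {

    private CodecSupport() { }

    // Returns true if the current device advertises a decoder for the given
    // MIME type at the given resolution (requires API level 21+).
    public static boolean canDecode(String mimeType, int width, int height) {
        MediaFormat format = MediaFormat.createVideoFormat(mimeType, width, height);
        MediaCodecList codecs = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        return codecs.findDecoderForFormat(format) != null;
    }
}

// e.g. CodecSupport.canDecode("video/avc", 1280, 720)
//      CodecSupport.canDecode("video/hevc", 1920, 1080)
```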
In my Java Application, I need to create an RTMP (or RTSP) livestream server and feed raw RGB image data to it.
I'd like to use the Red5 Server project, as it is available via Maven and the Apache 2.0 license fits my needs.
However, I haven't found any introduction tutorials on how to start and feed a livestream server from within a standalone Java Application. I have already added the dependency to my project, and can access all of Red5's classes.
Can you point me to any resources that help me get started? The task I'm trying to accomplish seems pretty basic to me.
With Red5 you have the control to handle the "input" any way you want, from a servlet that accepts octet-streams to pretty much anything else you can think up. So if your input is RGB data, implement something that accepts your byte arrays (like a servlet) and then convert them into one of the Flash-supported video codecs, such as H.264; you can use ffmpeg or jcodec to do this. Lastly, you'll want to package the now-encoded data into FLV format. That's more difficult, but there are non-Red5 examples of how to do this on the net; google for it. Once you have it in FLV format, create a broadcast stream and dispatch the VideoData to it (see the sketch below). Sounds simple, right? It's not, but if you are proficient in Java and/or C/C++, you should be fine.
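To give a rough idea of the last step only, here is a hedged sketch that wraps already-encoded FLV video tag bytes in a Red5 VideoData event. How you create and register the broadcast stream itself (e.g. through IProviderService) depends on your Red5 setup and is assumed here rather than shown; the class and method names below for the wrapper are my own:

```java
import org.apache.mina.core.buffer.IoBuffer;
import org.red5.server.net.rtmp.event.VideoData;

public final class VideoDataFactory {

    private VideoDataFactory() { }

    // flvVideoTagBody: first byte is frame type + codec id, followed by the
    // encoded (e.g. H.264) payload, exactly as it would appear in an FLV video tag.
    public static VideoData toVideoData(byte[] flvVideoTagBody, int timestampMs) {
        IoBuffer buffer = IoBuffer.allocate(flvVideoTagBody.length);
        buffer.put(flvVideoTagBody);
        buffer.flip();

        VideoData video = new VideoData(buffer);
        video.setTimestamp(timestampMs);
        return video;  // dispatch this event to your registered broadcast stream
    }
}
```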
I need to implement video conferencing on an intranet using the JMF API, but I don't know where to start. I have knowledge of JMF, but I need to know how to actually implement this.
I also need to know how RTP sessions are created and used while video is being streamed.
Use Jitsi; they have about the best library I've found for RTP streaming. Note that they use FMJ in place of JMF.
Their site: https://jitsi.org/
Their code: https://github.com/jitsi
Check out libjitsi if you just want the libraries.
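For the RTP-session part of your question, the classic JMF/FMJ pattern is to build a Processor whose output is RAW_RTP and hand its data source to an RTPManager. This is a minimal sketch only; the capture locator, addresses, and ports are example values, and the asynchronous state transitions are elided:

```java
import java.net.InetAddress;
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.Processor;
import javax.media.protocol.ContentDescriptor;
import javax.media.protocol.DataSource;
import javax.media.rtp.RTPManager;
import javax.media.rtp.SendStream;
import javax.media.rtp.SessionAddress;

public class RtpSendSketch {

    public static void main(String[] args) throws Exception {
        // Capture device locator is platform-specific; "vfw://0" is just an example.
        Processor processor = Manager.createProcessor(new MediaLocator("vfw://0"));
        processor.configure();
        // ... wait for the Configured state, set track formats to an RTP video format ...
        processor.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));
        processor.realize();
        // ... wait for the Realized state ...
        DataSource rtpOutput = processor.getDataOutput();

        // One RTPManager per media type (audio/video); ports are examples.
        RTPManager rtpManager = RTPManager.newInstance();
        SessionAddress local = new SessionAddress(InetAddress.getLocalHost(), 5000);
        SessionAddress remote = new SessionAddress(InetAddress.getByName("192.168.1.20"), 5002);
        rtpManager.initialize(local);
        rtpManager.addTarget(remote);

        SendStream stream = rtpManager.createSendStream(rtpOutput, 0);
        stream.start();
        processor.start();
    }
}
```

For conferencing you would run a receive side as well (a ReceiveStreamListener on the RTPManager) and repeat the setup per participant.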
I have some videos stored on Google Cloud Storage in MP4 format, and I need to extract a thumbnail from them.
I looked at a number of solutions, and it looks like they don't work with App Engine.
For example: http://www.xuggle.com/xuggler/
Is there a library I could use to do this on Google App Engine in Java?
I finally went with Google Compute Engine and ffmpeg; the implementation is in progress...
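For anyone following the same route, the Compute Engine side can be as simple as shelling out to ffmpeg. A rough sketch; file names and the seek offset are example values:

```java
public class FfmpegThumbnail {

    public static void main(String[] args) throws Exception {
        // Grab a single frame one second into the video that was downloaded
        // from Cloud Storage, and write it out as a PNG thumbnail.
        Process ffmpeg = new ProcessBuilder(
                "ffmpeg", "-y",
                "-ss", "00:00:01",
                "-i", "video.mp4",
                "-vframes", "1",
                "thumbnail.png")
            .inheritIO()
            .start();

        int exitCode = ffmpeg.waitFor();
        if (exitCode != 0) {
            throw new IllegalStateException("ffmpeg exited with code " + exitCode);
        }
    }
}
```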
This question looks related to Is there a Java API for mp4 files?
Xuggler says their project is a mix of Java and native code. Native code will not run on Java App Engine.
Search for a Java MP4 video decoder and look for decoders that only use Java code. Perhaps this one will work: http://jcodec.org/guide/movstitch.html.
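jcodec is pure Java, so in principle it fits the App Engine constraint. A minimal sketch of grabbing a frame with its FrameGrab API; the file names and frame index are examples, and note that AWTUtil/ImageIO and local file writes may still be restricted in your App Engine environment, in which case you would write the bytes to Cloud Storage instead:

```java
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

import org.jcodec.api.FrameGrab;
import org.jcodec.common.model.Picture;
import org.jcodec.scale.AWTUtil;

public class JcodecThumbnail {

    public static void main(String[] args) throws Exception {
        // Decode frame 42 of the MP4 and save it as a PNG thumbnail.
        Picture picture = FrameGrab.getFrameFromFile(new File("video.mp4"), 42);
        BufferedImage image = AWTUtil.toBufferedImage(picture);
        ImageIO.write(image, "png", new File("thumbnail.png"));
    }
}
```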
OK, after some searching, I've found the following: Video website on google application engine
It sounds like a similar problem to yours. The only difference is that the other thread asks for a Python solution; however, the answers seem fairly language-independent, so I think it is worth a read.
For documentation/inspirational purposes, it might be worth describing your solution in this thread once you've settled on it.
I'm planning to build a WebDAV client for Android and I'm not sure which library is best to use. Basically, I would like to allow the user to pause and resume an upload request to a WebDAV server. Is there any way to do that?
I've looked around and apparently I've got the choice between these libraries:
Sardine
JackRabbit
Jakarta Slide
Sardine looks like the most stable solution for now. I've seen nothing about pausing and resuming a request though.
I'm the original author of Sardine. I created it because Slide and Jackrabbit were too difficult to use and not well supported. The intention of Sardine was always to be run on servers, not mobile devices. Also, the design has always been around making a request and getting a response. Nothing about pausing. I have also heard that there may be issues getting it working on Android (dependencies).
That said, I hate to say it, but I think you are pretty much on your own. If you'd like to join the Sardine project to contribute patches to it to make it compatible with Android and support features like pausing, I'd love to have you (assuming you write good code. hehe).
As far as I remember, Sardine is not fully compatible with Android (some missing dependencies).
As for your question, I don't think pause/resume is part of a parsing library (which is basically what Sardine is). That behavior belongs more with HttpClient.
However, I think even that is too low an abstraction level for it.
The best solution would be to implement such a mechanism by hand. It's not pretty, and it involves AIO sockets.
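To make the "by hand" idea concrete, here is a hedged sketch of one way to pause and resume the writer side of a plain WebDAV PUT: stream the file in chunks and block the uploading thread between chunks while a flag is set. The URL handling and authentication are omitted, and true resume after a dropped connection would additionally require ranged/partial PUTs, which not every WebDAV server supports:

```java
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class PausableUpload {

    private volatile boolean paused;

    public synchronized void pause()  { paused = true; }
    public synchronized void resume() { paused = false; notifyAll(); }

    public void upload(String fileName, String targetUrl) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(targetUrl).openConnection();
        conn.setRequestMethod("PUT");
        conn.setDoOutput(true);
        conn.setChunkedStreamingMode(8192);

        try (InputStream in = new FileInputStream(fileName);
             OutputStream out = conn.getOutputStream()) {
            byte[] chunk = new byte[8192];
            int read;
            while ((read = in.read(chunk)) != -1) {
                synchronized (this) {
                    while (paused) {
                        wait();  // block the writer until resume() is called
                    }
                }
                out.write(chunk, 0, read);
            }
        }

        int status = conn.getResponseCode();  // 201/204 usually indicate success for a WebDAV PUT
        conn.disconnect();
        if (status / 100 != 2) {
            throw new IllegalStateException("Upload failed with HTTP " + status);
        }
    }
}
```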