Android: playing video data from a custom network stream? - java

Can Android MediaPlayer only work with file sources? I would like to play media (video) from a network stream, but the stream comes in over a non-standard protocol, so I have to somehow feed Android MediaPlayer the raw data.
Is there any way to do that? I found a few web pages suggesting using a temporary file for the buffered media data, but I would like to minimize I/O usage as much as I can, so I'm looking for an API-only solution if there is one. How about JNI? It looks like permissions are going to be an issue with that as well.

Can Android MediaPlayer only work with file sources?
No, it handles HTTP and RTSP streams as well.
I would like to play media (video) from a network stream, but the stream comes in over a non-standard protocol, so I have to somehow feed Android MediaPlayer the raw data.
That will be difficult. If this were audio, you could use AudioTrack, but there is no video equivalent for this.
One answer is to create a server-side proxy that converts your non-HTTP, non-RTSP stream into an HTTP or RTSP stream, so the existing Android streaming support works.
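To make the proxy idea concrete, here is a minimal sketch in plain Java of an HTTP wrapper around the custom stream; the same pattern works server-side or as a localhost proxy that MediaPlayer points at. openCustomStream() is a hypothetical placeholder for your own protocol handling, and the MP4 content type is an assumption about the payload.
```java
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

// Minimal sketch: expose a custom-protocol stream over plain HTTP so MediaPlayer
// can play it via http://host:8080/. A real proxy should at least honour Range
// requests so that seeking works.
public class StreamProxy {

    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(8080)) {
            while (true) {
                try (Socket client = server.accept()) {
                    serve(client);
                } catch (Exception e) {
                    e.printStackTrace(); // keep the proxy alive if one client fails
                }
            }
        }
    }

    private static void serve(Socket client) throws Exception {
        // Consume and ignore the HTTP request headers in this sketch.
        BufferedReader in = new BufferedReader(new InputStreamReader(client.getInputStream()));
        String line;
        while ((line = in.readLine()) != null && !line.isEmpty()) { /* skip */ }

        OutputStream out = client.getOutputStream();
        out.write(("HTTP/1.1 200 OK\r\n"
                + "Content-Type: video/mp4\r\n"   // assumption: the payload is MP4
                + "Connection: close\r\n"
                + "\r\n").getBytes("US-ASCII"));

        // Pump the media bytes from the custom source to the HTTP client.
        try (InputStream media = openCustomStream()) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = media.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
        out.flush();
    }

    // Hypothetical: replace with whatever pulls bytes off your custom protocol.
    private static InputStream openCustomStream() throws Exception {
        throw new UnsupportedOperationException("plug in your custom protocol here");
    }
}
```
MediaPlayer would then be given the proxy's URL (for example http://127.0.0.1:8080/) instead of the custom protocol.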

Basically, Android supports HTTP and RTSP video playback for network video.
This link may help you: Click Here
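For reference, pointing the stock MediaPlayer at an HTTP or RTSP URL looks roughly like this; the URL below is a hypothetical placeholder.
```java
import android.media.MediaPlayer;
import android.view.SurfaceHolder;

// Minimal sketch: play an HTTP or RTSP network stream with the stock MediaPlayer.
public class NetworkPlayback {
    private static final String STREAM_URL = "rtsp://example.com/live/stream1"; // placeholder

    public static MediaPlayer start(SurfaceHolder display) throws Exception {
        MediaPlayer player = new MediaPlayer();
        player.setDisplay(display);                       // surface the video is rendered onto
        player.setDataSource(STREAM_URL);                 // HTTP and RTSP URLs are both accepted
        player.setOnPreparedListener(MediaPlayer::start); // start once buffering is ready
        player.prepareAsync();                            // prepare off the UI thread
        return player;
    }
}
```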

Related

Broadcast from phone camera to RTMP Nginx Server

I have managed to start broadcasting to RTMP from the OBS software.
For RTMP with Nginx, I used this tutorial. For playback on Android I used https://github.com/josemmo/libvlc-android
To stream to RTMP I used the OBS software: https://obsproject.com
Everything works well so far.
My question is: can I broadcast from my Android phone without using the OBS software, i.e. use my phone camera to broadcast directly to the RTMP server?
I want similar functionality to DroidCam: https://play.google.com/store/apps/details?id=com.dev47apps.droidcam&hl=en&gl=US
I would prefer Ionic/React, but if there is no other choice, I could try native Android with Java.
Update: I found this Android library https://github.com/pedroSG94/rtmp-rtsp-stream-client-java
and did manage to record from the camera, but I'm still missing how to add the stream key.
I did manage it with this library:
https://github.com/begeekmyfriend/yasea
It is quite easy.
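Regarding the stream key mentioned in the update: with RTMP servers (Nginx-rtmp, YouTube, etc.) the key is normally just appended as the last path segment of the publish URL. Below is a rough sketch using the pedroSG94 library; the class and method names are from memory, so treat them as assumptions and check against the library's own examples, and the server/key values are hypothetical.
```java
import android.view.SurfaceView;
import com.pedro.rtplibrary.rtmp.RtmpCamera1;
// Note: the ConnectCheckerRtmp callback interface lives in different packages
// depending on the library version; check the project's samples for the right one.
import com.pedro.rtmp.utils.ConnectCheckerRtmp;

// Rough sketch: stream the phone camera to an RTMP server. The "stream key" is
// simply the last path segment of the publish URL.
public class PhoneBroadcast {
    // Hypothetical server and key; replace with your own values.
    private static final String RTMP_URL = "rtmp://my.server.example/live/MY_STREAM_KEY";

    public static RtmpCamera1 startBroadcast(SurfaceView preview, ConnectCheckerRtmp checker) {
        RtmpCamera1 camera = new RtmpCamera1(preview, checker);
        // prepareAudio()/prepareVideo() negotiate encoders with default settings.
        if (camera.prepareAudio() && camera.prepareVideo()) {
            camera.startStream(RTMP_URL);
        }
        return camera;
    }
}
```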

How to stream live video along with audio, in sync, via Java over a socket?

Can you suggest how to stream live video (recorded from a webcam and microphone, in sync) over a socket in Java? I tried searching but only found ways to stream images over a socket, not live video (captured from a webcam) along with audio (from the mic).
This is a really broad question, so it's difficult to provide a satisfactory answer. I can point you to some resources though.
Firstly, a popular protocol for this kind of application is the following:
https://en.wikipedia.org/wiki/Real_Time_Streaming_Protocol
Here is a link to a promising paper going in-depth on how this protocol can be implemented using Java:
https://www.researchgate.net/publication/228974152_A_Java-based_adaptive_media_streaming_on-demand_platform
Of course, you can also choose to use a pre-existing API (there are plenty of options). Additionally, here are some StackOverflow posts that also touch upon this subject:
Live video streaming using Java?
Real-time audio streaming using HTTP - choosing the protocol and Java implementation
Good luck :)
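To make the socket idea a bit more concrete, here is a minimal sketch (not the linked paper's or RTSP's approach) of one common pattern: each media packet is sent length-prefixed with a capture timestamp, and the receiver uses the timestamps to keep audio and video aligned. The capture side is a hypothetical placeholder; in practice the frames and audio chunks would come from a capture library such as JavaCV or Sarxos webcam-capture.
```java
import java.io.DataOutputStream;
import java.net.Socket;

// Minimal sketch of a length-prefixed, timestamped packet protocol over TCP.
// Sender side only; the receiver reads the same header and uses the timestamps
// to schedule playback so audio and video stay in sync.
public class MediaSocketSender {
    private static final byte VIDEO = 0, AUDIO = 1;
    private final DataOutputStream out;

    public MediaSocketSender(String host, int port) throws Exception {
        out = new DataOutputStream(new Socket(host, port).getOutputStream());
    }

    // payload: an already-encoded video frame or audio chunk. Encoding is out of
    // scope here; raw frames over TCP are usually too large to be practical.
    public synchronized void send(byte type, byte[] payload) throws Exception {
        out.writeByte(type);                            // packet type: VIDEO or AUDIO
        out.writeLong(System.nanoTime() / 1_000_000L);  // capture timestamp in milliseconds
        out.writeInt(payload.length);                   // length prefix
        out.write(payload);
        out.flush();
    }

    // Hypothetical usage: a capture loop would call send(VIDEO, frameBytes) and
    // send(AUDIO, audioBytes) as data arrives from the webcam and microphone.
}
```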

Codename One: How to stream live video to YouTube Live

I'm developing an app in which I need to stream video captured from the smartphone's camera (on iPhones and Android phones) directly to YouTube Live.
I looked into Codename One's Capture.captureVideo(ActionListener response) method, which must wait for the video to be stopped and the file to be saved before the ActionListener is called. Obviously, this can't work here, because the video has to be streamed to an output stream (to a URL given by the YouTube Live API) on a continuous basis. Is there any other way to accomplish this? (What about an unofficial API, e.g. a method to override to get the input stream from the camera?) If not, would Codename One consider providing this feature in a future version, as the market trend seems to be moving toward live video streaming apps?
If it cannot be done with Codename One's API, then the only way is to write native code for Android and iOS. I've read the article on integrating native APIs, which uses the Freshdesk API as an example, so any pointers on how to integrate the YouTube API for the purpose of streaming live video?
https://developers.google.com/youtube/v3/live/getting-started
https://developers.google.com/youtube/v3/live/libraries
https://developers.google.com/api-client-library/java/apis/youtube/v3
https://developers.google.com/youtube/v3/live/code_samples/
I don't see a REST API within the list of APIs, although there is a JavaScript API which you might use to implement this. Alternatively, you can use something like what was done with the Freshdesk API. You will need to embed the native view for the live broadcast; you can look at the implementation of Google Maps to see how we embedded that native widget.
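As a sketch of the native-interface route the answer describes: only the NativeInterface/NativeLookup mechanism below is Codename One's, while the interface name and methods are hypothetical. The per-platform implementations would do the camera capture and push to the RTMP ingestion URL obtained from the YouTube Live API.
```java
import com.codename1.system.NativeInterface;
import com.codename1.system.NativeLookup;

// Hypothetical native interface: Codename One generates Android/iOS stubs for it,
// and the platform-specific implementations handle camera capture and the upload
// to YouTube's ingestion URL.
public interface LiveStreamNative extends NativeInterface {
    void startBroadcast(String rtmpIngestionUrl);
    void stopBroadcast();
}

// Calling side (shared Codename One code):
class BroadcastController {
    void go(String ingestionUrl) {
        LiveStreamNative ls = NativeLookup.create(LiveStreamNative.class);
        if (ls != null && ls.isSupported()) { // isSupported() comes from NativeInterface
            ls.startBroadcast(ingestionUrl);
        }
    }
}
```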

What should I use to display a streaming URL in my Android app?

I want to display a live stream in my Android application (from an IP camera in this case) using the URL of the stream.
The stream is currently shown on a web page, so basically what I want to do is take it from the web page and display it straight in my application.
Should I use Picasso, WebView, or ImageView in this case? Which one is more efficient, and why? Or should I use another tool?
Thanks very much for reading, I'm kind of lost with this.
Picasso and ImageView can't deal with video streaming. You should use VideoView, but it is very limited in the media formats it supports. Try it first (a minimal sketch follows below), and if it doesn't work, try:
Vitamio: easy to use, but has a large delay for RTMP and RTSP streams.
ffmpeg: a good choice for RTMP and RTSP streams, but you need to find an ffmpeg port for Android, for example AndroidFFmpeg; you can find more. Personally I used sodaplayer for RTMP streams, and it has very little delay compared to Vitamio or a web plugin.
GStreamer: a great library that I used a lot; a little hard to use, but once you have some experience with it, it can be very helpful.
I used these three libraries, and the choice depends on what kind of stream it is; each has its good and bad sides, depending on the stream format.
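Here is the VideoView sketch mentioned above. The URL is a hypothetical placeholder; whether it plays depends on the camera's protocol and codec being among those Android supports out of the box (HTTP, RTSP).
```java
import android.net.Uri;
import android.widget.MediaController;
import android.widget.VideoView;

// Minimal sketch: point a VideoView at a network stream URL.
public class IpCameraPlayer {
    private static final String STREAM_URL = "rtsp://192.168.1.10/live.sdp"; // placeholder

    public static void attach(VideoView videoView) {
        videoView.setVideoURI(Uri.parse(STREAM_URL));
        videoView.setMediaController(new MediaController(videoView.getContext())); // play/pause UI
        videoView.setOnPreparedListener(mp -> mp.start()); // start once buffered
        videoView.requestFocus();
    }
}
```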

Where can I get an example app for streaming (live) video and audio from the camera on Android?
Where can I get an example of streaming (live) video and audio from the camera on Android?
Suppose I want to create a live video streaming service app, so I'll have some cool server at the back end, and I know how to do that part. Suppose I have a standalone app for PCs, and now I want to move on to mobile devices. So I want to see a sample app that grabs the audio and video streams from the phone, synchronizes them, encodes them somehow, and sends the LIVE stream to a server. I need any open-source sample that does this or something like it. Where can I find one?
Ole, have you been able to find any good examples of video or audio broadcasting yet? The best I have found so far is the SIPDroid project (www.sipdroid.org). I haven't had a chance to review it in depth, but it looks promising.
Here are some projects that you may want:
IP Camera
http://code.google.com/p/ipcamera-for-android
SipDroid
http://code.google.com/p/sipdroid/source/browse/trunk/src/org/sipdroid/sipua/ui/VideoCamera.java
You can get the code using SVN or other clients.
To me, though, both projects still have issues. If you get one working well, please tell me.
