I am working on an Angular and Java project, where I need to show a live feed from my camera (an IP camera) on a web page developed in Angular. I heard GStreamer can help me with this, but I have not found any helpful link for it.
I am wondering whether GStreamer is the only way to show the camera's live stream, or whether there is anything else I can do.
Any help would be greatly appreciated.
As this will be a demo, I have very little knowledge of this area; please ask if anything is unclear.
UPDATE:
I need to prepare a demo where I can enter my camera's IP and it will show live on my web page developed in Angular.
Just a few pointers before your question gets closed...
I found this
And this
I think you can stream the camera with a GStreamer RTSP server implementation and open the result in the web browser, if I understand it properly.
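One caveat: browsers cannot play RTSP directly, so in practice you would repackage the camera's stream into something browser-friendly such as HLS. A minimal, untested sketch of that idea using the gst1-java-core bindings (an assumption on my part; the camera URL and output paths are placeholders):

```java
import org.freedesktop.gstreamer.Gst;
import org.freedesktop.gstreamer.Pipeline;

// Sketch only: pull the camera's RTSP feed and repackage it as HLS
// segments that any web server can serve to the Angular page.
public class CameraRelay {
    public static void main(String[] args) {
        Gst.init("camera-relay", args);
        Pipeline pipe = (Pipeline) Gst.parseLaunch(
                "rtspsrc location=rtsp://user:pass@CAMERA_IP:554/stream1"
                + " ! rtph264depay ! h264parse ! mpegtsmux"
                + " ! hlssink location=/var/www/cam/segment%05d.ts"
                + " playlist-location=/var/www/cam/stream.m3u8");
        pipe.play();
        Gst.main(); // block and keep the pipeline running
    }
}
```

The Angular page can then point an HLS-capable player (for example hls.js) at stream.m3u8.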
HTH
Could someone explain to me how I can include an RTSP stream (from my IP camera, which uses username and password authentication) in my Java-based application?
Are there any free libraries that could do the trick?
At this point I've done a lot of research, and the idea I've come up with is to use a JavaFX WebView to open a web page generated by FFmpeg running on an Ubuntu server. The only problem is that I don't really know how to tell FFmpeg to take my RTSP link and give me some kind of HLS link to put in the JavaFX WebView.
If you know any simple way to show an RTSP stream in a Java frame, please share it with me.
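For what it's worth, one way to bridge that FFmpeg gap is to launch FFmpeg from the Java side and let it repackage the RTSP input as an HLS playlist that the WebView can load. The flags below are standard FFmpeg options, but the RTSP URL and output path are placeholders:

```java
import java.io.IOException;

// Sketch only: launch FFmpeg so it repackages the camera's RTSP input
// as an HLS playlist that a web page (or JavaFX WebView) can load.
public class RtspToHls {
    public static void main(String[] args) throws IOException {
        String rtspUrl = "rtsp://user:password@192.168.1.10:554/stream1";
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg",
                "-rtsp_transport", "tcp",        // TCP tends to be more reliable
                "-i", rtspUrl,                   // input: the camera stream
                "-c:v", "copy",                  // repackage without re-encoding
                "-f", "hls",
                "-hls_time", "2",                // 2-second segments
                "-hls_list_size", "5",           // short rolling playlist (live)
                "-hls_flags", "delete_segments",
                "/var/www/html/cam/stream.m3u8");
        pb.inheritIO();                          // surface FFmpeg's log output
        pb.start();
    }
}
```

The WebView would then open a page whose player points at the generated stream.m3u8.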
I am trying to develop an AR app to help visually impaired people improve their access to a computer.
I am investigating how AR can help HCI for visual disabilities, so the application uses WebRTC to get the computer's desktop and magnify it in the AR environment using Sceneform.
I have successfully used the Sceneform example https://github.com/google-ar/sceneform-android-sdk/tree/master/samples/chromakeyvideo, but I have no idea how to render the WebRTC stream directly to an ExternalTexture. -> https://github.com/judicapo/MUITSS-ARBE/tree/master/SampleApps/ARCK
I have already tried some Stack Overflow answers, but have not found the clue.
Thank you all for your replies; I hope someone has an idea.
Honestly, I have not worked with this, but I can think of a way to approach it.
Instead of chroma-key rendering onto a texture, why don't you try a ViewRenderable? With it, you can attach any Android View to a node. You just need to place a VideoView and do your WebRTC magic; a rough sketch follows the links below. Let me know if this works.
https://developers.google.com/ar/reference/java/sceneform/reference/com/google/ar/sceneform/rendering/ViewRenderable
https://developers.google.com/ar/develop/java/sceneform/create-renderables
AugmentedImage example - https://proandroiddev.com/arcore-sceneform-simple-video-playback-3fe2f909bfbc
ViewRenderable example - https://github.com/Hariofspades/ARExperiments/blob/master/AugmentedImages/app/src/main/java/com/hariofspades/augmentedimages/common/AugmentedImageNode.java
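Building on that suggestion, here is a rough, untested sketch. It assumes `context`, `anchorNode`, `eglBaseContext`, and `remoteVideoTrack` already exist in your WebRTC/Sceneform setup, and that R.layout.webrtc_view is a hypothetical layout whose root is an org.webrtc.SurfaceViewRenderer (used here instead of a VideoView, since WebRTC delivers remote frames to a VideoSink):

```java
// Attach an Android View carrying the WebRTC stream to a Sceneform node,
// avoiding the ExternalTexture path entirely.
ViewRenderable.builder()
        .setView(context, R.layout.webrtc_view)
        .build()
        .thenAccept(renderable -> {
            SurfaceViewRenderer remoteView =
                    (SurfaceViewRenderer) renderable.getView();
            remoteView.init(eglBaseContext, null); // initialize WebRTC rendering
            remoteVideoTrack.addSink(remoteView);  // feed remote frames to the view
            Node node = new Node();
            node.setParent(anchorNode);            // anchor from a hit test
            node.setRenderable(renderable);
        });
```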
I wish to show live video in a Java web application. I found that Xuggler (a Java library) is capable of doing that, but I couldn't find code showing how to implement it.
I have done some tutorials with Xuggler, like capturing frames from a video, but I am not very familiar with it. Please help.
Any help is appreciated.
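For reference, the canonical Xuggler mediatool pattern for decoding and displaying a stream looks roughly like the sketch below. The URL is a placeholder, and note this opens a desktop (Swing) window, not a web page; for a web app you would still need to transcode and serve the stream separately:

```java
import com.xuggle.mediatool.IMediaReader;
import com.xuggle.mediatool.IMediaViewer;
import com.xuggle.mediatool.ToolFactory;

// Decode a stream with Xuggler and display it in a viewer window.
public class XugglerPlayer {
    public static void main(String[] args) {
        IMediaReader reader = ToolFactory.makeReader("rtsp://CAMERA_IP/stream");
        reader.addListener(ToolFactory.makeViewer(IMediaViewer.Mode.VIDEO_ONLY));
        while (reader.readPacket() == null) {
            // keep pulling packets until end of stream or an error
        }
    }
}
```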
My local church hosts a weekly show and uploads the videos to YouTube, and I've been asked to develop an app for them. So far all is well; however, I have to implement a feature that will let me stream the show's videos in the app.
I honestly don't have much of a clue how to go about it (I'm still new to Android programming)! I've looked around here and seen that a few people have already attempted this, but unlike what most people have done or tried to do, the videos are released on a set day (not live streamed). For example, the 10/02/13 video will be released on 10/02/13.
This link gives me an idea of how to stream a video on Android, but it shows that I have to explicitly put in the link for the video.
Is there a way to do the same thing, but instead of adding the link myself, have the app retrieve the videos from a server?
Thank you in advance for your help!
Is there a way to get the link, but instead of adding the link myself, have the app retrieve the videos from a server?
To achieve this you need to create a web service that lets the mobile app talk to the web server. When the mobile app sends a request, the web service returns a response containing the link to the video, and you can then stream that video.
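A minimal sketch of that request, assuming a hypothetical endpoint that returns JSON like {"videoUrl": "https://..."} (both the endpoint and the field name are made up for illustration):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import org.json.JSONObject;

// Fetch the latest video link from the web service. On Android, run
// this off the UI thread (e.g. in a background thread or AsyncTask).
public class VideoLinkClient {
    public static String fetchLatestVideoUrl() throws Exception {
        URL url = new URL("https://example.com/api/latest-video"); // placeholder
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            StringBuilder body = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) {
                body.append(line);
            }
            // Assumed response shape: {"videoUrl": "https://..."}
            return new JSONObject(body.toString()).getString("videoUrl");
        } finally {
            conn.disconnect();
        }
    }
}
```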
To stream the video, read here.
Inside the API Demos you have one example:
API DEMOS >> Media >> MediaPlayer >> Play Streaming Video
You need to download the demos using the SDK Manager; then, inside Eclipse, create a new project from an existing sample, read the code, and copy it into your app.
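The core of that sample boils down to something like the following sketch, where videoUrl would be the link returned by the web service and the SurfaceHolder comes from a SurfaceView in your layout:

```java
import android.media.MediaPlayer;
import android.view.SurfaceHolder;
import java.io.IOException;

// Stream an http(s) video URL with MediaPlayer onto a SurfaceView.
void playStream(String videoUrl, SurfaceHolder holder) throws IOException {
    MediaPlayer player = new MediaPlayer();
    player.setDataSource(videoUrl);               // the stream URL
    player.setDisplay(holder);                    // render onto the surface
    player.setOnPreparedListener(mp -> mp.start());
    player.prepareAsync();                        // prepare without blocking UI
}
```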
Where can I get an example of streaming live video and audio from the camera for Android?
Suppose I want to create a live video streaming service app, so I'll have some cool server at the back end, and I know how to do that part. Suppose I have a standalone app for PCs, and now I want to move on to mobile devices. I want to see a sample app grabbing audio and video streams from the phone, synchronizing them, encoding them somehow, and sending the live stream to a server. I need any open-source sample that does this or something like it. Where can I get one?
Ole, have you been able to find any good examples of video or audio broadcasting yet? The best I have found so far is the SIPDroid project (www.sipdroid.org). I haven't had a chance to review it in depth, but it looks promising.
Here are some projects that do what you want:
IP Camera
http://code.google.com/p/ipcamera-for-android
SipDroid
http://code.google.com/p/sipdroid/source/browse/trunk/src/org/sipdroid/sipua/ui/VideoCamera.java
You can get the code using SVN or another client.
To me, though, both projects still have issues. If you get one working well, please tell me.