In all the tutorials I have encountered so far, the code looks like someVideoTrack.addRenderer(new VideoRenderer(someSurfaceViewRenderer));
Yet in the latest version of WebRTC for Android, the VideoRenderer constructor looks like this:
public VideoRenderer(VideoRenderer.Callbacks callbacks);
So there is no SurfaceViewRenderer parameter in sight.
Can anyone explain how to connect a SurfaceViewRenderer to a VideoTrack?
Thanks in advance.
I'm not knowledgeable enough to explain why this works, but I got my application working by replacing
someVideoTrack.addRenderer(new VideoRenderer(someSurfaceViewRenderer));
with
someVideoTrack.addSink(someSurfaceViewRenderer);
I managed to find my answer in this thread:
Local Video Renderer in Android WebRTC
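For context: in recent WebRTC releases, the VideoRenderer class was removed in favor of the VideoSink interface, and SurfaceViewRenderer implements VideoSink, which is why addSink takes it directly. A minimal sketch of the full wiring, assuming a recent libwebrtc build; the view id and method name here are placeholders:

    import org.webrtc.EglBase;
    import org.webrtc.SurfaceViewRenderer;
    import org.webrtc.VideoTrack;

    // Inside your Activity, once the remote VideoTrack is available.
    void attachRemoteTrack(VideoTrack remoteVideoTrack) {
        // Shared EGL context; typically the same one handed to the
        // PeerConnectionFactory's encoder/decoder factories.
        EglBase eglBase = EglBase.create();

        // The SurfaceViewRenderer from the layout; it implements VideoSink.
        SurfaceViewRenderer remoteView = findViewById(R.id.remote_video_view);
        remoteView.init(eglBase.getEglBaseContext(), null);

        // Frames from the track are now delivered to the view.
        remoteVideoTrack.addSink(remoteView);
    }

    // On teardown: remoteVideoTrack.removeSink(remoteView); remoteView.release();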
Related
I want to create a face recognition mobile app. I built the interface in FlutterFlow and found face recognition code written in Java, but I don't know how to connect the two. I have never coded in Java before and have never used Android Studio. I hope you can help me. Thanks!
I tried to find an answer everywhere, but I didn't find what I needed.
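For reference, the usual bridge between a Flutter/FlutterFlow UI and native Java code is a MethodChannel. A minimal sketch, assuming a standard FlutterActivity host; the channel name, method name, and the commented-out recognizer call are hypothetical placeholders:

    import io.flutter.embedding.android.FlutterActivity;
    import io.flutter.embedding.engine.FlutterEngine;
    import io.flutter.plugin.common.MethodChannel;

    public class MainActivity extends FlutterActivity {
        private static final String CHANNEL = "app/face_recognition"; // hypothetical name

        @Override
        public void configureFlutterEngine(FlutterEngine flutterEngine) {
            super.configureFlutterEngine(flutterEngine);
            new MethodChannel(flutterEngine.getDartExecutor().getBinaryMessenger(), CHANNEL)
                .setMethodCallHandler((call, result) -> {
                    if (call.method.equals("recognize")) {
                        String imagePath = call.argument("path");
                        // Invoke the Java face recognition code here, e.g.
                        // result.success(FaceRecognizer.recognize(imagePath)); // hypothetical
                        result.success(null);
                    } else {
                        result.notImplemented();
                    }
                });
        }
    }

On the Dart side, MethodChannel('app/face_recognition').invokeMethod('recognize', {'path': imagePath}) triggers the handler.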
I am trying to develop an AR app to help visually impaired people improve their access to a computer.
I am investigating how AR can help HCI for visual disabilities, so the application uses WebRTC to capture the computer desktop and magnify it in the AR environment using Sceneform.
I have successfully used the Sceneform example https://github.com/google-ar/sceneform-android-sdk/tree/master/samples/chromakeyvideo, but I have no idea how to render the WebRTC stream directly to an ExternalTexture. My project so far: https://github.com/judicapo/MUITSS-ARBE/tree/master/SampleApps/ARCK
I have already tried some Stack Overflow answers but have not found a solution.
Thank you all for your replies; I hope someone has an idea.
Honestly, I have not worked with this, but I can think of one way to approach it.
Instead of chroma-key rendering onto a texture, why don't you try a ViewRenderable? With it, you can attach any Android View to a node. You just need to place a VideoView there and do your WebRTC magic (a rough sketch follows the links below). Let me know if this works.
https://developers.google.com/ar/reference/java/sceneform/reference/com/google/ar/sceneform/rendering/ViewRenderable
https://developers.google.com/ar/develop/java/sceneform/create-renderables
AugmentedImage example - https://proandroiddev.com/arcore-sceneform-simple-video-playback-3fe2f909bfbc
ViewRenderable example - https://github.com/Hariofspades/ARExperiments/blob/master/AugmentedImages/app/src/main/java/com/hariofspades/augmentedimages/common/AugmentedImageNode.java
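A minimal sketch of that idea, assuming an existing AnchorNode; R.layout.video_view is a placeholder layout whose root contains the view WebRTC renders into:

    import android.content.Context;
    import com.google.ar.sceneform.AnchorNode;
    import com.google.ar.sceneform.Node;
    import com.google.ar.sceneform.rendering.ViewRenderable;

    void attachVideoView(Context context, AnchorNode anchorNode) {
        // Build a renderable backed by an ordinary Android View.
        ViewRenderable.builder()
            .setView(context, R.layout.video_view) // placeholder layout with a VideoView
            .build()
            .thenAccept(renderable -> {
                Node videoNode = new Node();
                videoNode.setParent(anchorNode);
                videoNode.setRenderable(renderable);
                // Look up the VideoView via renderable.getView() and
                // feed it from WebRTC here.
            });
    }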
I need some help connecting the Clarifai API to my Android app. It's for a uni project where we test different image recognition software. All it needs to do is take a picture on the phone and then run recognition on the pictures (which will be buildings).
I have no idea how to do this, which is why I'm throwing this Hail Mary in the hope that someone will help me. I've put what I have in a Drive folder here: https://drive.google.com/drive/folders/1LH79C0JtpBBpAMdKqNjKfYfj3KTvLgAh?usp=sharing
How can I solve this?
To use the Clarifai API on Android, you should use the Clarifai API Java client. Alternatively, if you'd rather not install the library, you can use REST and execute HTTP requests with JSON payloads directly. See the developer's guide for examples.
Note: There's also an Android SDK in works, which is beta at the time of this writing. See here on how to join the beta program.
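If you go the REST route, here is a minimal sketch of a predict request, assuming the v2 endpoint and payload shape from the developer's guide at the time of writing; the API key, model ID, and image URL are placeholders, and on Android this must run off the main thread:

    import java.io.IOException;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;
    import java.util.Scanner;

    // Run on a background thread, never the UI thread.
    String predictBuildings(String imageUrl) throws IOException {
        String apiKey = "YOUR_API_KEY";               // placeholder
        String modelId = "general-image-recognition"; // placeholder model ID
        String body = "{\"inputs\":[{\"data\":{\"image\":{\"url\":\""
                + imageUrl + "\"}}}]}";

        URL url = new URL("https://api.clarifai.com/v2/models/" + modelId + "/outputs");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Authorization", "Key " + apiKey);
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream os = conn.getOutputStream()) {
            os.write(body.getBytes(StandardCharsets.UTF_8));
        }
        // The response JSON lists predicted concepts with confidence scores.
        try (Scanner s = new Scanner(conn.getInputStream(), "UTF-8").useDelimiter("\\A")) {
            return s.hasNext() ? s.next() : "";
        }
    }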
I am having problems creating a Cordova plugin to establish communication between a native application and a web view.
The Facebook Messenger SDK has been released for Android and iOS, which I can use to share content and so on. The hardest part is probably going to be triggering these methods from JavaScript (a sketch of the plugin bridge follows below).
Has anyone done this, or can anyone help me get it done?
I would be really grateful.
Is there an easier way to communicate without using Cordova at all? For example, listening for URL schemes or something?
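For reference, the standard Cordova plugin bridge looks roughly like the sketch below; MessengerPlugin and the share action are hypothetical names, and the actual Messenger SDK call is left as a comment rather than guessed at:

    import org.apache.cordova.CallbackContext;
    import org.apache.cordova.CordovaPlugin;
    import org.json.JSONArray;
    import org.json.JSONException;

    public class MessengerPlugin extends CordovaPlugin {
        @Override
        public boolean execute(String action, JSONArray args, CallbackContext callbackContext)
                throws JSONException {
            if ("share".equals(action)) {
                String contentUri = args.getString(0);
                // Call the Facebook Messenger SDK share method here,
                // using cordova.getActivity() as the host activity.
                callbackContext.success("shared: " + contentUri);
                return true;
            }
            return false; // unknown action
        }
    }

From JavaScript, after declaring the plugin in plugin.xml, the call would be cordova.exec(onSuccess, onError, 'MessengerPlugin', 'share', [uri]);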
I'm trying to find the right approach for creating an Android OpenGL live wallpaper, i.e. a way to convert an app written to use GLSurfaceView into a live wallpaper. Surprisingly, there appears to be nothing in the official Android documentation about this, and it's not obvious what to do.
I've found a few discussions about this elsewhere, most of which end up linking to the following code written by an Android developer:
http://www.rbgrn.net/content/354-glsurfaceview-adapted-3d-live-wallpapers
However, the comments on the page suggest there are problems with the code (memory leaks, lock ups). Does anyone know of any alternatives? If I upload a wallpaper to the market, I'd obviously like to avoid complaints caused by buggy code.
I found an open source example that may help you.
http://code.google.com/p/android-deep-wallpaper/
Also, it looks like the GL ES calls need to be made from a dedicated thread rather than the main thread when using OpenGL in a wallpaper. A rough skeleton of that setup:
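The skeleton below is only a sketch that assumes the EGL context setup is done by hand; the class names are placeholders.

    import android.service.wallpaper.WallpaperService;
    import android.view.SurfaceHolder;

    public class GLWallpaper extends WallpaperService {
        @Override
        public Engine onCreateEngine() {
            return new GLEngine();
        }

        private class GLEngine extends Engine {
            private Thread renderThread;
            private volatile boolean running;

            @Override
            public void onSurfaceCreated(SurfaceHolder holder) {
                super.onSurfaceCreated(holder);
                running = true;
                // GL ES work happens on this dedicated thread, not the main thread.
                renderThread = new Thread(() -> {
                    // Create an EGL context bound to holder.getSurface() here,
                    // then loop: draw a frame and swap buffers while 'running'.
                });
                renderThread.start();
            }

            @Override
            public void onSurfaceDestroyed(SurfaceHolder holder) {
                running = false;
                // Join the render thread and tear down EGL here.
                super.onSurfaceDestroyed(holder);
            }
        }
    }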
hope this helps :D
There is a library for building OpenGL live wallpapers for Android called GLWallpaperService. You can find GLWallpaperService on GitHub. It includes the code you linked to on rbgrn.net, with a few bug fixes. There are some alternate implementations available as well. Good luck.