I managed to start broadcasting to RTMP from OBS.
For RTMP with Nginx, I used this tutorial. For playback on Android I used https://github.com/josemmo/libvlc-android
To stream to RTMP I used OBS: https://obsproject.com
Everything works well so far.
My question: can I broadcast from my Android phone without using OBS, i.e. use my phone's camera to stream directly to the RTMP server?
Something with functionality similar to DroidCam: https://play.google.com/store/apps/details?id=com.dev47apps.droidcam&hl=en&gl=US
I'd prefer Ionic React, but if there is no other choice I could try native Android with Java.
Update: I found the Android library https://github.com/pedroSG94/rtmp-rtsp-stream-client-java and managed to capture from the camera, but I'm still missing how to add the stream key.
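In case it helps others stuck at the same point: with most RTMP publishing setups (an nginx-rtmp server included), the stream key is not a separate API call. It is simply the last path segment of the publish URL you hand to the library's start-stream method. A minimal sketch of building that URL (the host, application name, and key below are placeholder values, not anything from a real server):

```java
public class RtmpUrl {
    // Builds the full RTMP publish URL. The "stream key" is just the
    // final path segment: rtmp://<server>/<app>/<streamKey>
    public static String build(String server, String app, String streamKey) {
        return "rtmp://" + server + "/" + app + "/" + streamKey;
    }
}
```

You would then pass the resulting string as the single URL argument to the library's start-stream call, e.g. something like `startStream(RtmpUrl.build("192.168.1.10:1935", "live", "mySecretKey"))`.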
I managed to do it with this library:
https://github.com/begeekmyfriend/yasea
It is quite easy.
Related
I need to send live video from a DJI camera to an RTMP server.
I have tried to find information on how to do this in the DJI SDK documentation, but so far I cannot find an answer.
I would appreciate it if you could tell me how to do it.
Sorry for my English.
For now, no. But reportedly the next version of the DJI Mobile SDK (after v4.8.1) will provide APIs and samples for implementing live video streaming, including streaming video to an RTMP server.
As all users of Windows Phone 10 know, Bluetooth connectivity on this system is cut down to the bare minimum for the end user, which makes it a horror for programmers to build anything. When connecting to a device, the system automatically looks for the "functions" that device offers, e.g. audio. My questions are:
1. How do I define such a function on an external Android device? The main goal is to control all types of music playback from it (e.g. Groove Music).
2. Where can I find a list of such functions, for future app features?
To get ahead of answers: I have already tried the easier approaches, which came to nothing due to UWP's limits on controlling other apps' playback.
The main target of the project is an Android smartwatch app in Xamarin C# (or Java; I can "translate" between the two with ease) plus a C# UWP app for Windows Phone, to handle the most common tasks: as mentioned, music playback control and a notification receiver.
Based on your description, you want to develop an Android app that sends Bluetooth commands to control the music player on a Windows Phone device; whether that works depends on your Android device's Bluetooth stack.
Controlling a music player over Bluetooth requires the AVRCP profile. I have checked the Android Bluetooth APIs and did not find an AVRCP profile, so it may be hard to send AVRCP commands at the application level.
Also, Android is open source, and many device manufacturers customize their own Android builds. It is possible that a device exposes its Bluetooth module as a serial device; in that case the official Bluetooth APIs are not suitable, and you would need that module's own protocol.
I recently started working with Bluetooth devices in Java, with the goal of building a program that can stream music from a smartphone app (e.g. Spotify, Google Play Music, VLC for Android, ...) to a PC/laptop.
For communication with Bluetooth devices I use BlueCove.
Pairing works fine, but I can't find a way to create an A2DP or HSP client. According to a thread I found on Google Code, this is because BlueCove (and JSR-82 implementations in general) doesn't support audio streaming.
Am I out of luck with Java for this project? Or is there another way to stream music from a smartphone to a PC/laptop via Bluetooth?
Basically, I am asking how to reverse the Android Beam flow: sending a message from an NFC-capable Android phone to Eclipse on my PC via an NFC reader. I have no idea where to start; I can find no documentation online on where to get started, otherwise I would use that instead of posting my question here.
Android Beam uses the SNEP protocol over an LLCP communication link between two Android NFC devices. So you need an LLCP implementation and a SNEP layer on top of it. A project that provides this (and more) is nfcpy.
Can Android MediaPlayer only work with file sources? I would like to play media (video) from a network stream, but the stream comes in a non-standard protocol, so I have to somehow feed MediaPlayer with just the data.
Is there any way to do that? I found a few web pages suggesting a temporary file for the buffered media data, but I would like to minimize I/O usage as much as I can, so I'm looking for an API-only solution if there is one. What about JNI? But it looks like permissions are going to be an issue there as well.
Can Android MediaPlayer only work with file sources?
No, it handles HTTP and RTSP streams as well.
I would like to play media (video) from a network stream, but the stream comes in a non-standard protocol, so I have to somehow feed Android MediaPlayer with just the data.
That will be difficult. If this were audio, you could use AudioTrack, but there is no video equivalent for this.
One answer is to create a server-side proxy that converts your non-HTTP, non-RTSP stream into an HTTP or RTSP stream, so the existing Android streaming support works.
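For illustration, the core of such a proxy is just a byte pump: read from the custom-protocol source (after stripping whatever framing it uses) and write into the HTTP or RTSP response. A minimal, protocol-agnostic sketch; the socket wiring around it is assumed, and `StreamRelay` is a hypothetical name:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamRelay {
    // Copies bytes from `in` to `out` until the source ends.
    // In a real proxy, `in` would be the de-framed custom-protocol
    // stream and `out` the HTTP response body sent to MediaPlayer.
    public static long pump(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        out.flush();
        return total;
    }
}
```

The real work is in the de-framing step and in emitting valid HTTP headers (or an RTSP session) before the payload, but once that is done, MediaPlayer sees an ordinary network stream.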
Basically, Android supports HTTP and RTSP playback for network video.