The scenario:
My friends and I are developing a product with a Bluetooth receiver that connects to a speaker system. Currently we can pair with this unit and stream music over a Bluetooth connection. I'd like to develop an app that I can use to control different settings on our unit while the music is streaming. My application needs to be able to do this no matter which app is streaming the music over Bluetooth (native media player, Spotify, Pandora, etc.). I'm thinking this is potentially a permissions issue as opposed to a bandwidth issue...
I need to know:
Is it possible to send control data and an audio stream concurrently to the same Bluetooth receiver? If so, could someone please point me toward a good strategy that will help me accomplish this (e.g. what protocol to use for the control data)? If it is not possible, can someone recommend a better way to control the target device?
I wasn't able to find what I was looking for in the Android Developer docs.
Based on my experience working with Bluetooth, the best method would be to send the control packets in the same packet stream as the audio packets, appending a unique ID so the speaker can recognize and act on the control packets.
Ideally, though, you would want a connection stream on the speaker built just for control, so that your app connects to that and sends the control data, which the speaker then interprets.
Pseudo code for the speaker (JSR-82 style SPP server) would look something like
String connectionURL = "btspp://localhost:" + uuid + ";name=control";
StreamConnectionNotifier notifier =
        (StreamConnectionNotifier) Connector.open(connectionURL);
while (true) {
    // waits on this line until something tries to connect to it
    StreamConnection connection = notifier.acceptAndOpen();
    /* Process control information read from connection.openInputStream() */
}
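On the Android side, a minimal sketch of the matching client might look like the following, assuming the speaker advertises that control service over RFCOMM/SPP (the speakerMacAddress, CONTROL_UUID, and CMD_VOLUME_UP names below are illustrative placeholders, not part of any real API):
// Connect to the speaker's "control" SPP service; A2DP keeps streaming the audio separately.
BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
BluetoothDevice speaker = adapter.getRemoteDevice(speakerMacAddress);
BluetoothSocket control =
        speaker.createRfcommSocketToServiceRecord(UUID.fromString(CONTROL_UUID));
control.connect();                        // blocking call; run this off the main thread
OutputStream out = control.getOutputStream();
out.write(new byte[] { CMD_VOLUME_UP });  // illustrative one-byte command the speaker would parse
out.flush();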
As for pausing music from a third-party app, the easiest way would be to have your app request audio focus to pause and release (abandon) it to resume. Alternatively, you could have a control packet sent that would cancel the Bluetooth audio stream, but that could become quite complicated to resume.
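A minimal sketch of that audio-focus approach, assuming the pre-API-26 AudioManager calls and an available Context (the listener here is a no-op; a real app would react to focus changes):
// Taking transient audio focus makes other media apps pause; abandoning it lets them resume.
AudioManager audioManager =
        (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
AudioManager.OnAudioFocusChangeListener listener = focusChange -> { /* no-op */ };

int result = audioManager.requestAudioFocus(
        listener,
        AudioManager.STREAM_MUSIC,
        AudioManager.AUDIOFOCUS_GAIN_TRANSIENT);  // other apps receive AUDIOFOCUS_LOSS_TRANSIENT

// ...later, to let the streaming app resume:
audioManager.abandonAudioFocus(listener);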
I am working on a little project with a very short amount of time and I'm totally new to developing for Android. I'm experienced in Java but don't know anything about Android!
The project essentially consists of using an Android phone as a "data collection probe". My plan is to have an Arduino connected via USB sending sensor data at regular intervals, and to ship that data off to a server in packets as fast as possible when the network connection is regained. I also want to capture video with the camera and have it sent on a similar basis: video is streamed into a file, and when the network connection is regained the file is emptied out and its contents sent off. Not sure how to implement this video part; any help would be much appreciated.
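A minimal sketch of the buffer-locally-and-flush-when-online part, assuming plain HttpURLConnection uploads (the class name, buffer file, and server URL are placeholders; the video file could be flushed the same way):
import android.content.Context;
import android.net.ConnectivityManager;
import android.net.NetworkInfo;

import java.io.*;
import java.net.HttpURLConnection;
import java.net.URL;

public class ProbeUploader {
    // Appends a reading to the local buffer file; called on every sensor sample.
    public static void buffer(File bufferFile, byte[] reading) throws IOException {
        try (OutputStream out = new FileOutputStream(bufferFile, /* append = */ true)) {
            out.write(reading);
        }
    }

    // Sends the buffered data and truncates the file once connectivity is back.
    public static void flushIfOnline(Context context, File bufferFile, URL server)
            throws IOException {
        ConnectivityManager cm =
                (ConnectivityManager) context.getSystemService(Context.CONNECTIVITY_SERVICE);
        NetworkInfo net = cm.getActiveNetworkInfo();
        if (net == null || !net.isConnected()) {
            return; // still offline; keep buffering
        }
        HttpURLConnection conn = (HttpURLConnection) server.openConnection();
        conn.setDoOutput(true);
        try (InputStream in = new FileInputStream(bufferFile);
             OutputStream out = conn.getOutputStream()) {
            byte[] chunk = new byte[8192];
            for (int n; (n = in.read(chunk)) != -1; ) {
                out.write(chunk, 0, n);
            }
        }
        if (conn.getResponseCode() == HttpURLConnection.HTTP_OK) {
            new FileOutputStream(bufferFile).close(); // truncate after a successful upload
        }
        conn.disconnect();
    }
}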
The situation:
I'm developing an Android app for some hardware that has a BlueGiga WT12 Bluetooth modem. The hardware device sends 56-byte packets at around 240 Hz. I'm testing on a Samsung S5 and S8. A fully functional app has already been created for iOS and PC, so we know the hardware device works.
I use a separate thread to read in the data and then dispatch it to the main thread.
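For reference, a minimal sketch of the kind of reader thread I mean, assuming a standard android.bluetooth.BluetoothSocket SPP connection (the socket variable and the handlePackets callback are placeholders):
Thread reader = new Thread(() -> {
    try (InputStream in = socket.getInputStream()) {
        byte[] buffer = new byte[4096];           // read in large chunks, not 56 bytes at a time
        Handler main = new Handler(Looper.getMainLooper());
        while (!Thread.currentThread().isInterrupted()) {
            int n = in.read(buffer);              // blocks until data arrives
            if (n < 0) break;
            byte[] copy = Arrays.copyOf(buffer, n);
            main.post(() -> handlePackets(copy)); // hand the bytes to the main thread
        }
    } catch (IOException e) {
        // connection dropped
    }
});
reader.start();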
Issue:
The issue I'm having is that when I send the command to the device telling it to start streaming, it starts to stream, but very shortly afterwards I start receiving packets at a very slow rate (10-60 Hz).
After some examination I realized that the device was experiencing a buffer overflow.
After talking to our hardware guy, he said the only real thing that could cause that is something on the phone side not reading fast enough, so the hardware device stops sending more packets because it thinks the phone can't receive any more, and then the buffer overflows on the hardware device.
The WT12 has flow control enabled, so maybe this is an issue with Android not giving a clear-to-send signal to the WT12. But to my knowledge we don't have access to any of the flow control settings.
What I've tried:
My first line of attack was to simply remove any code that I thought was slowing down the reads, but that seemed to have no effect.
I also tried basically every Bluetooth serial terminal app I could get my hands on, all with the same result.
So then I questioned whether it was some weird problem with the hardware device, but after using PC-based (Bluetooth) serial terminals I had no issues at all on the PC.
The hardware device can also operate over USB instead of Bluetooth, so I tried reading the data from it exactly the same way as over the Bluetooth connection, but over USB. Using the USB serial connection I had no issues at all.
My thoughts:
So this leads me to believe the problem must be with the Bluetooth modem on the phone side. I was thinking maybe it was a flow control issue, as we have flow control enabled on the WT12. Maybe Android isn't sending a clear-to-send signal?
The problem is that, to my knowledge, flow control is implemented in the Bluetooth stack and we as developers have no control over it, if I'm not mistaken?
Other than flow control, I don't really have much idea what could be causing the hardware device to stop sending me data.
I've just now been experiencing the same issues with flow control. I then saw that, in the case of the RN4678 BT module, it's best to disable flow control in the MCU firmware and pull the CTS pin low on the module. That worked for us.
Hello, I'm trying to build a client-server audio streaming app. The server has to be written in C and communicate with the client using TCP sockets. The client will be written in JavaFX (actually TornadoFX, but that doesn't relate to the problem).
My doubts concern the streaming part. The server has several audio files. I want to be able to start, stop, rewind and skip to another song from the client app. Java has, AFAIK, two classes to play audio - Clip and SourceDataLine. I don't think sending the whole song before playing is the way to go (Clip, large sound files), so I guess I should stick with SourceDataLine. My simple guess is that I should send a fairly small portion of audio every time the client comes close to finishing the last portion it received (e.g. I send 30 seconds, and when the client is on the 25th second it sends a request for more). Is that right? Is that doable with the previously mentioned technologies?
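A minimal sketch of the client side with SourceDataLine, assuming the server simply pushes raw PCM over the socket (the audio format and port below are placeholders; note that SourceDataLine.write blocks when its internal buffer is full, which already paces how fast data is pulled in):
import javax.sound.sampled.*;
import java.io.InputStream;
import java.net.Socket;

public class PcmStreamClient {
    public static void main(String[] args) throws Exception {
        // Assumed format: 44.1 kHz, 16-bit, stereo, little-endian signed PCM.
        AudioFormat format = new AudioFormat(44100f, 16, 2, true, false);
        try (Socket socket = new Socket("localhost", 5000);
             SourceDataLine line = AudioSystem.getSourceDataLine(format)) {
            line.open(format);
            line.start();
            InputStream in = socket.getInputStream();
            byte[] chunk = new byte[4096];
            int n;
            while ((n = in.read(chunk)) != -1) {
                line.write(chunk, 0, n); // blocks while the playback buffer is full
            }
            line.drain();
        }
    }
}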
Maybe something like ffmpeg could be useful here
https://github.com/avTranscoder/avTranscoder
But again, I have to communicate over TCP sockets only.
I've read some articles about DTMF in Android. I guess that it's not possible to make an automatic phone call that streams a .wav file (or another format) and detects the input of the other person (the receiver of the call).
I want to make an app that calls me/someone else and waits for input to do some action.
Is this scenario possible with VoIP on Android? Any ideas?
Thanks
You could use a cloud-based telephony service like Tropo or Prophecy to achieve this requirement. These services allow you to make outbound calls and listen for either DTMF or voice input. They can also stream a .wav file for the person on the phone to listen to. You can trigger these calls from any application by just making an HTTP request to the service's API, so it could be an Android app that initiates the call. This solution is independent of the phone device (any mobile, VoIP phone, or landline) and it works on VoIP as well as the PSTN.
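A minimal sketch of that trigger, assuming a generic REST-style endpoint (the URL, token, and JSON field below are placeholders, not Tropo's or Prophecy's actual API; the provider's docs define the real endpoint and parameters):
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class CallTrigger {
    public static void startCall(String numberToDial) throws Exception {
        // Placeholder endpoint and token.
        URL endpoint = new URL("https://api.example-telephony.com/sessions?token=YOUR_TOKEN");
        HttpURLConnection conn = (HttpURLConnection) endpoint.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/json");
        String body = "{\"numberToDial\":\"" + numberToDial + "\"}";
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("Provider responded with HTTP " + conn.getResponseCode());
        conn.disconnect();
    }
}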
I am looking to make a remote control car with a camera which can be controlled via an Android phone. This requires a lot of things, but the only thing I am confused about is how to get the camera's image to the phone. I can handle the code and such, I know networking well, and I've done server-client programming many times, but what I really need to know is how I can hook up a camera to some sort of WiFi-enabled motherboard on the RC car. The RC car should be able to communicate over WiFi, and I also want a motherboard that would let the WiFi data coming in from the client (the Android phone) control the motor speeds and so on. Where do I start? I'm really confused about how data being sent over a socket to my RC car will control the motors, and also how camera data from the car will be sent to the phone.
Basically, I need a way to have control over the motors and also a way for the car to send me video. How do I do this?
You have to stream the camera's input over the network. To achieve this you could use GStreamer on both the car (streaming server) and the Android phone (client). With GStreamer you set the camera as your input stream and an RTSP sink as the renderer, so your client app will be able to connect to the live streaming server on your car.
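A minimal sketch of the phone side described above, assuming the car exposes an RTSP URL for the camera and a separate TCP port for motor commands (the address, port, and command format are placeholders):
import android.net.Uri;
import android.widget.VideoView;

import java.io.IOException;
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class CarClient {
    // Point a VideoView at the GStreamer RTSP server running on the car.
    public static void showVideo(VideoView view) {
        view.setVideoURI(Uri.parse("rtsp://192.168.1.50:8554/camera")); // placeholder address
        view.start();
    }

    // Motor control goes over an ordinary TCP socket, separate from the video stream.
    public static void sendMotorCommand(Socket controlSocket, int leftSpeed, int rightSpeed)
            throws IOException {
        OutputStream out = controlSocket.getOutputStream();
        out.write(("L" + leftSpeed + " R" + rightSpeed + "\n").getBytes(StandardCharsets.UTF_8));
        out.flush();
    }
}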