VideoView delays when starting to play a Multicast stream (just sometimes) - java

I have a simple Android application that contains a WebView to load my web-based application and a VideoView to play video (over HTTP) and a multicast stream (live TV over UDP).
The web application, the video server and the live TV server all run on a local server (Ubuntu 12.04), and my Android application is connected to them locally over an Ethernet cable (not via the Internet).
My Android device is an STB running Android 6.
The application is supposed to start playing video or live TV as soon as a button is clicked. The issue is that JUST SOMETIMES, when I turn the STB on (and immediately start the application), there is a delay of about 30 seconds before live TV starts. This never happens for video.
A sample of Live TV stream is udp://239.0.0.1:1234
A sample of Video url is http://192.168.200.235/test.mp4
// This is a part of my main function that plays the UDP stream
PlayerActivity.videoView.stopPlayback();
final Uri video = Uri.parse("udp://" + url.replaceAll("\\s+", ""));
PlayerActivity.getInstance().runOnUiThread(new Runnable() {
    @Override
    public void run() {
        PlayerActivity.videoView.setVisibility(View.GONE);
        PlayerActivity.videoView.setVisibility(View.VISIBLE);
        // Register the error listener before starting playback so early failures are reported
        PlayerActivity.videoView.setOnErrorListener(new MediaPlayer.OnErrorListener() {
            @Override
            public boolean onError(MediaPlayer mp, int what, int extra) {
                Log.e("ERROR LOG FOR UDP STREAM", ":( I don't get any error here!");
                return false;
            }
        });
        PlayerActivity.videoView.setVideoURI(video);
        PlayerActivity.videoView.start();
    }
});
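To narrow down where the occasional 30-second delay happens, here is a hedged diagnostic sketch (it reuses the question's PlayerActivity.videoView field; setOnInfoListener exists since API 17, so it is available on an Android 6 STB):

// Diagnostic sketch: log when the stream becomes prepared and when buffering starts/ends.
PlayerActivity.videoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        Log.d("UDP STREAM", "prepared, playback can start");
    }
});
PlayerActivity.videoView.setOnInfoListener(new MediaPlayer.OnInfoListener() {
    @Override
    public boolean onInfo(MediaPlayer mp, int what, int extra) {
        if (what == MediaPlayer.MEDIA_INFO_BUFFERING_START) {
            Log.d("UDP STREAM", "buffering started");
        } else if (what == MediaPlayer.MEDIA_INFO_BUFFERING_END) {
            Log.d("UDP STREAM", "buffering ended");
        }
        return false;
    }
});

Comparing the timestamps of these log lines against the moment start() is called shows whether the delay is spent waiting for the first multicast packets or buffering them.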

Although Android can play a UDP stream, it is not really built for this. It is better to use HLS or RTMP for live stream playback; for video files you can also use HLS.
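If the server can expose the channels as HLS instead of raw UDP multicast, playback with the same VideoView is straightforward. A minimal sketch, assuming a hypothetical .m3u8 playlist URL on the existing server (not taken from the question):

// Sketch: playing an HLS playlist with the question's VideoView.
// The playlist URL is a placeholder; use whatever the streaming server actually exposes.
Uri hlsUri = Uri.parse("http://192.168.200.235/live/channel1.m3u8");
PlayerActivity.videoView.setVideoURI(hlsUri);
PlayerActivity.videoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        // Start only once the first segments are ready, so the UI doesn't appear frozen
        PlayerActivity.videoView.start();
    }
});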

Related

BLE Scan not working in Background with Scanfilters in android pie?

I am using a BLE scan with ScanFilters to detect beacons. It works fine in the foreground and in the background up to Oreo, but on Android Pie it is not able to deliver the pending broadcast in the background.
ScanSettings settings = new ScanSettings.Builder()
        .setScanMode(ScanSettings.SCAN_MODE_LOW_POWER)
        .build();
final List<ScanFilter> scanFilters = new ArrayList<>();
scanFilters.add(getScanFilter());

final BluetoothManager bluetoothManager =
        (BluetoothManager) getSystemService(Context.BLUETOOTH_SERVICE);
BluetoothAdapter bluetoothAdapter = bluetoothManager.getAdapter();

Intent intent = new Intent(this.getApplicationContext(), MyBroadcastReceiver.class);
intent.putExtra("o-scan", true);
PendingIntent pendingIntent = PendingIntent.getBroadcast(
        this.getApplicationContext(), 0, intent, PendingIntent.FLAG_UPDATE_CURRENT);
bluetoothAdapter.getBluetoothLeScanner().startScan(scanFilters, settings, pendingIntent);
public class MyBroadcastReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        int bleCallbackType = intent.getIntExtra(BluetoothLeScanner.EXTRA_CALLBACK_TYPE, -1);
        if (bleCallbackType != -1) {
            Log.d(TAG, "Passive background scan callback type: " + bleCallbackType);
            ArrayList<ScanResult> scanResults = intent.getParcelableArrayListExtra(
                    BluetoothLeScanner.EXTRA_LIST_SCAN_RESULT);
            // Do something with your ScanResult list here.
            // These contain the data of your matching BLE advertising packets.
        }
    }
}
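The getScanFilter() helper is not shown in the question. A hedged sketch of what such a helper might look like, assuming the beacons advertise a known service UUID (the UUID below is only a placeholder):

// Hypothetical helper: build a ScanFilter that matches a known beacon service UUID.
// The UUID is a placeholder, not taken from the question.
private ScanFilter getScanFilter() {
    return new ScanFilter.Builder()
            .setServiceUuid(ParcelUuid.fromString("0000feaa-0000-1000-8000-00805f9b34fb"))
            .build();
}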
Android 9 introduces several behavior changes, such as limiting background apps' access to device sensors and Wi-Fi scans.
These changes affect all apps running on Android 9, regardless of target SDK version.
Android 9: Limited access to sensors in background:
Android 9 limits the ability for background apps to access user input and sensor data. If your app is running in the background on a device running Android 9, the system applies the following restrictions to your app:
Sensors that use the continuous reporting mode, such as accelerometers and gyroscopes, don't receive events.
Sensors that use the on-change or one-shot reporting modes don't receive events.
Solution:
If your app needs to detect sensor events on devices running Android 9 while the app is in the background, use a foreground service.
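A minimal sketch of that approach. The service class name, notification channel id, and notification text below are assumptions, not part of the original answer; the point is only that the scan is started from a service promoted to the foreground, which exempts it from the Android 9 background restrictions:

// Minimal foreground-service sketch (names and notification content are placeholders).
public class BeaconScanService extends Service {
    @Override
    public void onCreate() {
        super.onCreate();
        // A foreground service must show a notification, and on API 26+ it needs a channel
        NotificationChannel channel = new NotificationChannel(
                "beacon_scan", "Beacon scanning", NotificationManager.IMPORTANCE_LOW);
        getSystemService(NotificationManager.class).createNotificationChannel(channel);
        Notification notification = new Notification.Builder(this, "beacon_scan")
                .setContentTitle("Scanning for beacons")
                .setSmallIcon(android.R.drawable.stat_notify_sync)
                .build();
        startForeground(1, notification);
        // Start the BLE scan here, e.g. the PendingIntent-based startScan() from the question
    }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        return START_STICKY;
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}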
I built an example test Android app against Oreo (API 26), using the code above (slightly modified) to detect beacons. I am using a Pixel 3 XL (with Pie).
I think the hard part here is knowing for sure whether the code in onReceive() in MyBroadcastReceiver is actually being run upon detection of a beacon when the device is running on battery only (disconnected from Android Studio and Logcat over USB).
Using Volley (com.android.volley) to submit an HTTP request to a local HTTP server, I was able to demonstrate that it works as documented, i.e. I receive the HTTP request when beacons are detected. However, Volley only sends these requests when Android is awake, or when it periodically wakes up and connects to the network, which in my simple tests was about every 15 minutes (plus some variation); but I did get all the beacon ScanResults on my HTTP server, just delayed by up to 15 minutes. I was even able to remove the app from the list of running apps (you know, swiping up to remove the app) and still see that onReceive() in MyBroadcastReceiver was receiving BLE ScanResults.
How do you know that onReceive() in MyBroadcastReceiver is being killed? I am very interested to know how you know this.
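A hedged sketch of that verification trick: fire a small Volley request from the receiver, so each callback leaves a trace in the local server's access log even with no USB/Logcat connection. The server URL below is a placeholder, not from the answer:

// Sketch: report each background scan callback to a local HTTP server via Volley.
// The URL is a placeholder for whatever local server is listening.
public void onReceive(Context context, Intent intent) {
    int bleCallbackType = intent.getIntExtra(BluetoothLeScanner.EXTRA_CALLBACK_TYPE, -1);
    if (bleCallbackType != -1) {
        RequestQueue queue = Volley.newRequestQueue(context);
        String url = "http://192.168.1.10:8080/beacon-hit?type=" + bleCallbackType;
        queue.add(new StringRequest(Request.Method.GET, url,
                response -> Log.d("BeaconReport", "server acknowledged: " + response),
                error -> Log.e("BeaconReport", "report failed", error)));
    }
}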

This code works only for video, not for streaming

My problem is playing videos and streams over the network. If I know the location of a video file and I want to play it, I just do:
String Url = "https://www.w3schools.com/html/mov_bbb.mp4";
VideoView videoView;

protected void onCreate(Bundle savedInstanceState) {
    ...
    videoView = (VideoView) findViewById(R.id.loadVideo);
    videoView.setVideoURI(Uri.parse(this.Url));
    videoView.start();
}
But if the video is a stream, for example one sent from my computer, whose address I know (example: 127.0.0.1:8080; yes, it's a random example, I know this is localhost and localhost can't work here, but I can't share the real address of the video, sorry) and which I can play fine from Windows via VLC, the video is not played and the code ends up in the catch block. So the same syntax cannot be used for video streamed over an http address.
I have looked everywhere on the internet, but I did not find any working solution, and many of the plugins I have seen are now obsolete and/or unusable with Android Studio. I'm not a professional Android programmer, but I've been asked to do this. Do you know how I can do it?
I forgot: the intention is to create a live stream
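One way to see why playback fails instead of landing silently in the catch block (a hedged diagnostic sketch; the address is the placeholder from the question): attach an error listener and log the MediaPlayer codes, which usually reveal whether VideoView simply does not support the stream's protocol or container.

// Diagnostic sketch: surface the MediaPlayer error codes instead of swallowing the failure.
videoView.setOnErrorListener(new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        Log.e("StreamPlayback", "what=" + what + " extra=" + extra);
        return true; // true = handled, suppresses the default error dialog
    }
});
videoView.setVideoURI(Uri.parse("http://127.0.0.1:8080"));
videoView.start();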

HTML5WebView fullscreen youtube video consumes back button click

In HTML5WebView.java (pastebin link)(source) I have:
@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {
    System.out.println("TAG - BACK PRESSED IN WEB VIEW");
    return super.onKeyDown(keyCode, event);
}
And in the activity that starts the web view I have:
@Override
public void onBackPressed() {
    System.out.println("TAG - BACK PRESSED IN WEB PLAYER ACTIVITY");
    super.onBackPressed();
}
Now when I'm playing an embedded YouTube video normally (not fullscreen), both methods are called when I press the back button. When I put the video into fullscreen mode (using the YouTube player's fullscreen button), neither method is called. My only guess is that the back button is being consumed by the web view to undo the fullscreen action (but even that doesn't work).
I am trying to get the back button to immediately kill the web view, even if there is a video in fullscreen mode.
See the problem in action here; I had to post it externally because the GIF is larger than 2 MB.
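For context on why the key events disappear: when the player goes fullscreen, the WebChromeClient hands the activity a separate custom View that takes focus. A hedged sketch of one common pattern (the mCustomView field and the layout handling are assumptions, not taken from HTML5WebView.java) for leaving fullscreen, or killing the activity, on back press:

// Sketch: track the fullscreen custom view given to WebChromeClient and
// release it (or finish the activity) when back is pressed.
private View mCustomView;
private WebChromeClient.CustomViewCallback mCustomViewCallback;

private final WebChromeClient chromeClient = new WebChromeClient() {
    @Override
    public void onShowCustomView(View view, CustomViewCallback callback) {
        mCustomView = view;
        mCustomViewCallback = callback;
        // add 'view' to the activity's layout here to show the fullscreen video
    }

    @Override
    public void onHideCustomView() {
        // remove the fullscreen view from the layout here
        mCustomView = null;
        mCustomViewCallback = null;
    }
};

@Override
public void onBackPressed() {
    if (mCustomView != null) {
        // Tell the WebView to leave fullscreen first...
        mCustomViewCallback.onCustomViewHidden();
        mCustomView = null;
    }
    // ...then finish the activity immediately, as the question wants.
    finish();
}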
If I were you, I would remove the embedded playback of YouTube videos, or ask Google whether it would be seen as abuse, because embedded playback of YouTube videos can be treated as a violation of this policy:
Your app violates our Device and Network Abuse policy by downloading,
monetizing, or otherwise accessing YouTube videos in violation of the
YouTube Terms of Service or YouTube API Terms of Service.
More Info: https://play.google.com/about/privacy-security/device-network-abuse/ https://www.youtube.com/static?template=terms
For example, your app contains: YouTube background play functionality
This is a violation of the YouTube Terms of Service.
That is what I got in my own experience: I also made an app where people could watch some embedded YouTube videos, but after some updates the app could no longer be updated because I got this error saying it is abuse. So I am warning you, that's all.

Android Speech Recognition freezes when another app is in the background

I have an app that uses Google voice speech recognition. It works perfectly, but the voice capture popup freezes sometimes (see snapshot). I have narrowed the problem down to the point where, as long as a commercial app called X is active in the background, the Google voice popup freezes. As soon as I close app X by swiping it away, my app's speech recognition works perfectly again.
This is the code I use to launch the speech recognition popup:
@Override
protected void onResume() {
    super.onResume();
    // speechIntent is a RecognizerIntent.ACTION_RECOGNIZE_SPEECH intent created elsewhere
    speechIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, app.getDemoLanguageCode());
    speechIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_PREFERENCE, app.getDemoLanguageCode());
    speechIntent.putExtra(RecognizerIntent.EXTRA_ONLY_RETURN_LANGUAGE_PREFERENCE, app.getDemoLanguageCode());
    speechIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
    speechIntent.putExtra(RecognizerIntent.EXTRA_MAX_RESULTS, 1);
}

static void openGoogleASR() {
    thisActivity.startActivityForResult(speechIntent, SPEECHRECON_CODE);
}
This is what the speech recognition popup looks like when it freezes while app X is active in the background:
Do you know how to properly initialize my speech recognition to make it robust against other misbehaving apps?

Video streaming in android by parcelFileDescriptor

I have succeeded in recording video through MediaRecorder to the SD card, but I want to send this video to a server without writing it to the SD card.
I searched and found that ParcelFileDescriptor is a way to send the video over a TCP socket, but I don't know how to receive it on the server side. Please explain.
Here is my client-side code:
here is my client side code
socket = new Socket("hostname", portnumber);
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);
recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setOutputFile(pfd.getFileDescriptor());
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);

mPreview = new Preview(VideoRecorder.this, recorder);
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
setContentView(mPreview);
I want to receive it on the server side and play it, to create a real-time video transfer, knowing that:
"The MediaRecorder records either in 3GPP or in MP4 format. This file format consists of atoms, where each atom starts with its size. There are different kinds of atoms in a file, mdat atoms store the actual raw frames of the encoded video and audio. In the Cupcake version Android starts writing out an mdat atom with the encoded frames, but it has to leave the size of the atom empty for obvious reasons. When writing to a seekable file descriptor, it can simply fill in the blanks after the recording, but of course socket file descriptors are not seekable. So the received stream will have to be fixed up after the recording is finished, or the raw video / audio frames have to be processed by the server".
I want server-side code (the server may be an Android handset or a PC).
If there is another way, please help me.
Thanks
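As a starting point for the server end of this socket approach, here is a hedged sketch in plain Java (the port number and output file name are assumptions): it simply accepts the TCP connection and dumps the incoming bytes to a file. As the quoted paragraph explains, the resulting 3GPP file is not directly playable; the atom sizes still have to be patched (or the raw frames re-muxed) once recording ends.

// Minimal server sketch: accept one connection and write the raw stream to a file.
// Port and output path are placeholders. The saved file still needs its
// atom sizes fixed up before it can be played.
import java.io.FileOutputStream;
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class VideoReceiver {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(5000);
             Socket client = server.accept();
             InputStream in = client.getInputStream();
             FileOutputStream out = new FileOutputStream("received.3gp")) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
    }
}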
In order to stream from Android or a PC, you need to implement a protocol over which the stream is carried, plus a server. There are several of them, like HLS, RTSP, etc. (more at http://en.wikipedia.org/wiki/Streaming_media). It is not a trivial problem, and there are only very few successful streaming services from Android.
You can check how to implement a streaming service on Android here: https://github.com/fyhertz/libstreaming
The library is broken on Android 5, but works on 4.4.x.
