I'm working on a project for a client where we're supposed to stream audio from a server to a device running Android Gingerbread. To retrieve the stream, the Android client makes a request for a variant playlist, and then a request for the media playlist itself (the one whose URIs point to the individual TS file chunks). Once that's done, the client app decrypts the chunks and sends them to the platform for playback.
The problem I'm having revolves around the security part. Our client (the company in question) uses a proprietary encryption scheme that serves the keys for decrypting the TS file chunks ahead of time through an HTTP request, instead of following the HLS spec and serving the key files through URIs listed in the index files themselves. As far as I can tell, the Android MediaPlayer framework can find these key files and generate/find the appropriate IVs for decryption only if the key file URIs are in the index files.
Unfortunately, what this all means is that I have to decrypt the file chunks myself, and I can't play the stream back without gaps between each segment. I currently make HTTP GET requests for each segment, download them to internal storage, decrypt them, and then play them back using the following code:
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);

    // Collect the previously downloaded and decrypted TS segments
    File dir = new File(TS_FILE_DIR);
    String[] files = dir.list();
    mTsFiles = new ArrayList<File>();
    for (String file : files) {
        String path = TS_FILE_DIR + file;
        mTsFiles.add(new File(path));
    }

    mMediaController = new MediaController(this);
    mVideoView = (VideoView) findViewById(R.id.video_view_1);
    mVideoView.setVideoPath(mTsFiles.get(0).getAbsolutePath());
    mVideoView.setMediaController(mMediaController);
    mVideoView.setOnPreparedListener(new OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            mp.setAudioStreamType(AudioManager.STREAM_MUSIC);
            mp.start();
        }
    });
    mVideoView.setOnCompletionListener(new OnCompletionListener() {
        @Override
        public void onCompletion(MediaPlayer mp) {
            mp.pause();
            mp.reset();
            // Advance to the next segment, if any
            // (size() - 1 guards against running past the last segment)
            if (mIndex < mTsFiles.size() - 1) {
                mIndex++;
                try {
                    mp.setDataSource(mTsFiles.get(mIndex).getAbsolutePath());
                    mp.prepareAsync();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }
    });
}
I've tried:
1) Using two MediaPlayers and switching between them (a sketch of my attempt is below), but it doesn't work at all
2) Staring at the ICS source code to get an idea of how this all works, but it's very complex and I'm not well versed in C++
Is there anything I missed?
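For reference, here is a simplified sketch of the dual-player handoff I attempted (field names and structure are reconstructed for illustration, not my exact code): while segment N plays on one player, segment N + 1 is prepared on a second player, and the completion callback swaps them.

private MediaPlayer mCurrent;
private MediaPlayer mNext;
private int mNextIndex;

private void startPlayback() throws IOException {
    mNextIndex = 0;
    mCurrent = prepare(mNextIndex++);
    mNext = prepare(mNextIndex++);
    mCurrent.start();
}

private MediaPlayer prepare(int index) throws IOException {
    if (index >= mTsFiles.size()) return null;
    MediaPlayer mp = new MediaPlayer();
    mp.setAudioStreamType(AudioManager.STREAM_MUSIC);
    mp.setDataSource(mTsFiles.get(index).getAbsolutePath());
    mp.prepare(); // done ahead of time so start() is near-immediate
    mp.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
        @Override
        public void onCompletion(MediaPlayer finished) {
            finished.release();
            if (mNext == null) return;
            mCurrent = mNext; // hand off to the pre-prepared player
            try {
                mNext = prepare(mNextIndex++);
            } catch (IOException e) {
                mNext = null;
            }
            mCurrent.start(); // a small but audible gap still remains here
        }
    });
    return mp;
}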
Forgive me if this question has already been asked; I couldn't find an answer for my case.
So, I have an Android app with a voice & video call feature, built on WebRTC.
I was able to get both voice and video calls working perfectly inside an Activity, but now I want to keep the call running while the user exits the CallActivity and goes back to the ChatActivity (to send a file/link/photo, for example).
I managed to make the voice call run perfectly inside a background Service, but the video call won't work as expected.
The remote video won't be displayed even though the audio from the video track is playing.
Here is my background Service code:
@Override
public void onAddStream(MediaStream mediaStream) {
    if (mediaStream.videoTracks.size() > Constants.ONE || mediaStream.audioTracks.size() > Constants.ONE) {
        return;
    }
    // Check for a video track; if present, this is a video call
    if (!isAudioCall && mediaStream.videoTracks.size() > Constants.ZERO) {
        remoteVideoTrack = mediaStream.videoTracks.get(Constants.ZERO);
        CallActivityNew.remoteVideoTrack = remoteVideoTrack;
        try {
            localAudioTrack.setEnabled(true);
            // Now ask the UI to display the video track
            sendOrderToActivity(Constants.START_REMOTE_VIDEO, null);
        } catch (Exception ignored) {}
    } else if (mediaStream.audioTracks.size() > Constants.ZERO) {
        // Only audio tracks available, so this is a voice call
        remoteAudioTrack = mediaStream.audioTracks.get(Constants.ZERO);
        try {
            localAudioTrack.setEnabled(true);
            remoteAudioTrack.setEnabled(true);
        } catch (Exception ignored) {}
    }
}
And below is my CallActivity code:
case Constants.START_REMOTE_VIDEO: {
    if (remoteVideoView == null) {
        remoteVideoView = findViewById(R.id.remote_gl_surface_view);
    }
    remoteVideoView.init(eglBaseContext, null);
    remoteVideoView.setEnableHardwareScaler(true);
    remoteVideoView.setMirror(true);
    remoteVideoView.setScalingType(RendererCommon.ScalingType.SCALE_ASPECT_FIT);
    remoteVideoView.setZOrderMediaOverlay(true);
    // Apply the video track to the SurfaceView in order to display it
    remoteVideoTrack.addSink(remoteVideoView);
    // Enable the remote video track shortly after attaching the sink
    Handler handler = new Handler();
    handler.postDelayed(new Runnable() {
        @Override
        public void run() {
            remoteVideoTrack.setEnabled(true);
        }
    }, Constants.TIME_THREE_HUNDRED_MILLIS);
    setSpeakerphoneOn(false);
    break;
}
I am sending orders from the Service to the Activity; the Constants.START_REMOTE_VIDEO case runs after receiving the order from the Service.
I don't see where the problem is. Why am I only hearing sound while the remote video never starts displaying?
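For context, sendOrderToActivity is just a small helper in my Service that forwards the order to the Activity; a simplified sketch of the idea (the action string and the Bundle parameter are made up, my real code differs):

// Simplified sketch: the Service broadcasts an order code and the Activity
// listens with a matching BroadcastReceiver, switching on the order
// (e.g. Constants.START_REMOTE_VIDEO) as shown above.
private void sendOrderToActivity(int order, Bundle extras) {
    Intent intent = new Intent("call.service.ORDER"); // hypothetical action name
    intent.putExtra("order", order);
    if (extras != null) {
        intent.putExtras(extras);
    }
    LocalBroadcastManager.getInstance(this).sendBroadcast(intent);
}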
Thank you in advance for helping.
After testing for long hours, I found that my code works just fine; I had simply forgotten to change the view's visibility from "GONE" to "VISIBLE".
Yeah, that was the solution, I swear xD
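Concretely, the missing line was making the remote view visible when the video starts, e.g. at the top of the START_REMOTE_VIDEO case:

// The fix: the SurfaceView was still GONE, so the frames had nowhere to render
remoteVideoView.setVisibility(View.VISIBLE);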
I'm trying to use VLC as an RTSP client. The RTSP server is based on the libstreaming library. I'm using the code provided by the first example:
// Sets the port of the RTSP server to 1234
Editor editor = PreferenceManager.getDefaultSharedPreferences(this).edit();
editor.putString(RtspServer.KEY_PORT, String.valueOf(1234));
editor.commit();

// Configures the SessionBuilder
SessionBuilder.getInstance()
        .setSurfaceView(mSurfaceView)
        .setPreviewOrientation(90)
        .setContext(getApplicationContext())
        .setAudioEncoder(SessionBuilder.AUDIO_NONE)
        .setVideoEncoder(SessionBuilder.VIDEO_H264);

// Starts the RTSP server
this.startService(new Intent(this, RtspServer.class));
The Android app starts; I try to access the stream using VLC (Open Network Stream) and this URL:
rtsp://192.168.43.250:1234
The device is connected to the same network (I can ping it), but nothing happens in the Android app, and VLC displays a "connection failed" window.
Any idea where the problem is? Maybe a bad URL, but I can't find any detailed example of this situation.
It throws a NullPointerException; check the logcat.
You have to provide the URL as rtsp://ip:1234?h264=200-20-320-240, where:
200 - buffer
20 - fps
320 - resolution width
240 - resolution height
I had similar problems. Here is my solution.
Make sure you have imported the library as a module through Android Studio
Give your manifest the permissions to use the resources needed
Use this code for your MainActivity:
public class MainActivity extends AppCompatActivity implements SurfaceHolder.Callback, RtspServer.CallbackListener, Session.Callback {

    private final static String TAG = "MainActivity";

    private SurfaceView mSurfaceView;
    private Session mSession;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
        setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        setContentView(R.layout.activity_main);

        mSurfaceView = (SurfaceView) findViewById(R.id.surface);

        // Sets the port of the RTSP server to 1234
        SharedPreferences.Editor editor = PreferenceManager.getDefaultSharedPreferences(this).edit();
        editor.putString(RtspServer.KEY_PORT, String.valueOf(1234));
        editor.commit();

        // Configures the SessionBuilder
        mSession = SessionBuilder.getInstance()
                .setCallback(this)
                .setSurfaceView((net.majorkernelpanic.streaming.gl.SurfaceView) mSurfaceView)
                .setPreviewOrientation(90)
                .setContext(getApplicationContext())
                .setAudioEncoder(SessionBuilder.AUDIO_AAC)
                .setAudioQuality(new AudioQuality(8000, 16000))
                .setVideoEncoder(SessionBuilder.VIDEO_H264)
                //.setVideoQuality(new VideoQuality(320, 240, 20, 500000))
                .build();

        mSurfaceView.getHolder().addCallback(this);
        ((net.majorkernelpanic.streaming.gl.SurfaceView) mSurfaceView).setAspectRatioMode(net.majorkernelpanic.streaming.gl.SurfaceView.ASPECT_RATIO_PREVIEW);

        // Starts the RTSP server
        this.startService(new Intent(this, RtspServer.class));
        Log.d("test", "1");
        mSession.startPreview(); // camera preview on the phone's surface
        mSession.start();
    }

    @Override
    public void onResume() {
        super.onResume();
        mSession.stopPreview();
    }

    @Override
    public void onDestroy() {
        super.onDestroy();
        mSession.release();
        mSurfaceView.getHolder().removeCallback(this);
    }

    //region ---------------------------------- required interface methods

    @Override
    public void onError(RtspServer server, Exception e, int error) {
        Log.e("Server", e.toString());
    }

    @Override
    public void onMessage(RtspServer server, int message) {
        Log.e("Server", "unknown message");
    }

    @Override
    public void onBitrateUpdate(long bitrate) {
    }

    @Override
    public void onSessionError(int reason, int streamType, Exception e) {
    }

    @Override
    public void onPreviewStarted() {
    }

    @Override
    public void onSessionConfigured() {
    }

    @Override
    public void onSessionStarted() {
    }

    @Override
    public void onSessionStopped() {
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
    }

    //endregion
}
Using VLC, open "Open Network Stream" and type:
rtsp://<ip of the Android device>:1234 <-- this port is hard-coded, so don't change it
When I worked with libstreaming & VLC, I spent a lot of time on the same problem. The solution for me was to use a different VLC version. For example, it worked for me on VLC 1.0.5, but many other versions didn't play the stream.
The first example is a "server" only in the library author's terminology: for him, the server is the side that accepts the stream, not the side that serves it.
You need to use example 2 or 3. Example 2 is good because all you need on the receiving side is a VLC player.
Find out your PC's IP address (cmd -> ipconfig); it's important to understand that the device must either have an external IP address or be on the same network.
Specify the PC's IP address you obtained in mEditText (it gets passed to mSession.setDestination()).
After launching the application, press the start button. The studio logs will print the contents for an SDP-format file (Log.d(TAG, mSession.getSessionDescription())).
Create a file, for example 1.sdp, and edit it so it contains the output of getSessionDescription() (remove extra spaces).
The huge drawback here is that we have to specify the IP of the receiver we want to send the stream to.
Now the main thing: in none of the examples can you simply enter rtsp://192.168.43.250:1234 and get the video!
In the Session class you'll find mOrigin = "127.0.0.1"; you can override it with SessionBuilder ... .setOrigin("192.xxx.xxx.xx").
Your logic may then suggest that you only need to find the port, and in SessionBuilder's Session build() you will indeed find video.setDestinationPorts(5006),
but this is not that port.
This library contains no RTSP server implementation in the sense of hosting an IP-camera-style stream on the device (I'm writing such a server myself for exactly that reason).
You can find a hint in the RtspClient class (note that for the library's author, the client is the side that sends the stream), where the author writes:
* RFC 2326.
* A basic and asynchronous RTSP client.
* RTSP client compatible with Wowza.
* It implements Digest Access Authentication according to RFC 2069.
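To make the flow above concrete, here is a rough sketch of the destination/SDP handoff (method names follow libstreaming; the IP address is a placeholder for the PC running VLC):

// Inside an Activity that implements Session.Callback:
private Session mSession;

private void startStreamingToPc() {
    mSession = SessionBuilder.getInstance()
            .setCallback(this)
            .setContext(getApplicationContext())
            .setAudioEncoder(SessionBuilder.AUDIO_NONE)
            .setVideoEncoder(SessionBuilder.VIDEO_H264)
            .build();
    mSession.setDestination("192.168.1.10"); // placeholder: the PC's IP from ipconfig
    mSession.configure(); // asynchronous; continues in onSessionConfigured()
}

@Override
public void onSessionConfigured() {
    // Paste this output into a file such as 1.sdp and open that file in VLC
    Log.d(TAG, mSession.getSessionDescription());
    mSession.start();
}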
I have some files in my Google Drive that others have shared with me, and I would like to be able to query them. Unfortunately, the MetadataBuffer's result count is 0.
This is the code:
private GoogleApiClient mGoogleApiClient;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main_activity);
    mGoogleApiClient = new GoogleApiClient.Builder(this)
            .addApi(com.google.android.gms.drive.Drive.API)
            .addScope(com.google.android.gms.drive.Drive.SCOPE_FILE)
            .addConnectionCallbacks(this)
            .addOnConnectionFailedListener(this)
            .build();
}
@Override
protected void onStart() {
    super.onStart();
    mGoogleApiClient.connect();
}

@Override
public void onConnected(Bundle arg0) {
    Query query = new Query.Builder().addFilter(Filters.sharedWithMe()).build();
    Drive.DriveApi.query(mGoogleApiClient, query).setResultCallback(metadataCallback);
}

private ResultCallback<DriveApi.MetadataBufferResult> metadataCallback =
        new ResultCallback<DriveApi.MetadataBufferResult>() {
            @Override
            public void onResult(DriveApi.MetadataBufferResult result) {
                if (!result.getStatus().isSuccess()) {
                    Log.d(TAG, "Problem while retrieving results");
                    return;
                }
                MetadataBuffer buffer = result.getMetadataBuffer();
                if (buffer != null) {
                    int cnt = buffer.getCount();
                    Log.d(TAG, "BUFFER COUNT: " + cnt);
                    for (int i = 0; i < cnt; i++) {
                        Metadata meta = buffer.get(i);
                        Log.d(TAG, meta.getTitle());
                    }
                    buffer.close();
                }
            }
        };
I am using the new Google Drive, and I also tried moving some shared files from the Incoming folder into the My Drive root. Still nothing.
I would appreciate any help and suggestions.
As also said in "unable to query the files 'shared with me' from google drive using Google drive api android", at the moment the Drive APIs only let you access data in Drive created by your app.
You can verify this yourself by changing
Query query = new Query.Builder().addFilter(Filters.sharedWithMe()).build();
into
Query query = new Query.Builder().build();
and you will see only one folder, named "App Data".
To access other files you need to use the Drive Java API, as explained in https://developers.google.com/drive/android/auth#connecting_and_authorizing_the_google_drive_android_api
Note: The Google Drive Android API currently only supports drive.file and drive.appfolder
authorization scopes. If your application requires additional permissions or features not
yet available in the Drive Android API, you must use the Google APIs Java Client.
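For illustration, a rough sketch of the same sharedWithMe query through the Google APIs Java Client (Drive v2) instead; the credential setup is abbreviated, and accountName and the application name are placeholders:

// The Java client supports full Drive scopes, unlike the Android API's
// drive.file scope, so sharedWithMe queries can return other apps' files.
GoogleAccountCredential credential = GoogleAccountCredential.usingOAuth2(
        this, Collections.singletonList(DriveScopes.DRIVE_READONLY));
credential.setSelectedAccountName(accountName); // e.g. from an account picker

Drive service = new Drive.Builder(
        AndroidHttp.newCompatibleTransport(), new GsonFactory(), credential)
        .setApplicationName("MyDriveApp") // placeholder
        .build();

// Network I/O: run this off the main thread (e.g. in an AsyncTask)
FileList files = service.files().list().setQ("sharedWithMe").execute();
for (com.google.api.services.drive.model.File f : files.getItems()) {
    Log.d(TAG, f.getTitle());
}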
I'm uploading files (images, videos, and audio) from the Android side to a PHP server. It works well for small files, but for large files it gives unexpected results: the file uploads, but on the Android side I get a timeout exception.
So please help me: how do I handle all the scenarios for file uploading? How can I use a multipart entity, and how can I set the timeout relative to different file sizes?
My code is:
File myFile = new File(filePath);
RequestParams params = new RequestParams();
try {
    params.put("uploaded_file", myFile);
} catch (FileNotFoundException e) {
}
URL = getResources().getString(com.newing.R.string.server_adderess) + URL + fileType + "&lessonID=" + LessonID;
client.post(URL, params, new AsyncHttpResponseHandler() {
    @Override
    public void onSuccess(String response) {
        Log.w("async", "success!!!!");
        UtilityFunctions.showNotification(getApplicationContext(),
                "File uploading completed.");
        stopSelf();
    }

    @Override
    public void onFailure(Throwable error) {
        Log.w("async", "failure!!!!");
        UtilityFunctions.showNotification(getApplicationContext(),
                "File couldn't upload. Please try later.");
        stopSelf();
    }
});
You can try Ion, a great library for asynchronous networking and image loading: https://github.com/koush/ion
A simple example of uploading a file:
Ion.with(getContext(), "https://koush.clockworkmod.com/test/echo")
.uploadProgressBar(uploadProgressBar)
.setMultipartFile("filename.zip", new File("/sdcard/filename.zip"))
.asJsonObject()
.setCallback(...)
Check the project site and wiki for more examples.
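If large files still time out, Ion also lets you raise the request timeout per call; a rough sketch (the size-based scaling is just my own heuristic, and the URL is the echo endpoint from above):

// Scale the timeout with the file size so large uploads aren't cut off early.
// A base of 30 s plus ~1 s per 100 KB is an arbitrary heuristic; tune to taste.
File myFile = new File("/sdcard/filename.zip");
int timeoutMillis = 30000 + (int) (myFile.length() / 100000) * 1000;

Ion.with(getContext())
        .load("https://koush.clockworkmod.com/test/echo")
        .setTimeout(timeoutMillis)
        .setMultipartFile("filename.zip", myFile)
        .asJsonObject()
        .setCallback(new FutureCallback<JsonObject>() {
            @Override
            public void onCompleted(Exception e, JsonObject result) {
                // e == null means the upload completed within the timeout
            }
        });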
I have an app in which the user may need to download up to 760 files, totaling around 350 MB. It is not possible to zip these files; they must be downloaded as loose files!
I'm currently using Android Asynchronous Http Client to download individual files, and an AsyncTask to run the entire process.
Here's an example of a DownloadThread object which handles downloading hundreds of files in the background:
public class DownloadThread extends AsyncTask<String, String, String> {

    ArrayList<String> list;
    AsyncHttpClient client;
    String[] allowedContentTypes = new String[] { "audio/mpeg" };
    BufferedOutputStream bos;
    FileOutputStream fos;

    @Override
    protected String doInBackground(String... params) {
        DownloadTask task;
        for (String file : list) {
            // the "list" variable has already been populated with hundreds of strings
            task = new DownloadTask(file);
            task.execute("");
            while (!task.isdone) {
                try {
                    Thread.sleep(10);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
        }
        return null;
    }

    class DownloadTask extends AsyncTask<String, String, String> {

        String character, filename;
        boolean isdone = false;

        public DownloadTask(String file) {
            // file = something like "Whale/sadwhale.mp3"
            character = file.split("/")[0];
            filename = file.split("/")[1];
        }

        @Override
        protected void onPreExecute() {
        }

        @Override
        protected void onPostExecute(String result) {
            if (!result.equals("Error")) {
                // Do something on success
            }
            isdone = true;
        }

        @Override
        protected String doInBackground(String... str) {
            client = new AsyncHttpClient();
            client.get("http://some-site.com/sounds/" + character + "/" + filename,
                    new BinaryHttpResponseHandler(allowedContentTypes) {
                        @Override
                        public void onSuccess(byte[] fileData) {
                            try {
                                // Make file/folder and create stream
                                File folder = new File(Environment.getExternalStorageDirectory()
                                        + CharSelect.directory + character);
                                folder.mkdirs();
                                File dest = new File(folder, filename);
                                fos = new FileOutputStream(dest);
                                bos = new BufferedOutputStream(fos);
                                // Transfer data to file
                                bos.write(fileData);
                                bos.flush();
                                bos.close();
                            } catch (Exception e) {
                                e.printStackTrace();
                            }
                        }
                    });
            return "Success";
        }
    }
}
DownloadThread runs in the background and also spawns hundreds of its own AsyncTasks. It waits until each task is done downloading, then continues the for loop to the next download.
This works, kind of. Some downloads don't finish properly or never start at all. Out of a list of 760 downloads, an average of 100 complete properly before it stalls; I then have to restart the process to get roughly another 100 downloads before it fails again. I have a feeling this is due to timing issues, as the Thread.sleep(10) line seems a little hackish.
Surely, calling hundreds of AsyncTasks from another AsyncTask is not the most efficient way to do this. How can I alter this code, or use a third-party solution, to fit this task?
Try the DownloadManager API. It should be what you need.
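For illustration, a minimal sketch of handing one file to DownloadManager (the URL and destination are placeholders); the system queues, persists, and retries downloads for you, so you can simply enqueue all 760 in a loop:

// Enqueue a single download; repeat for each entry in your list.
DownloadManager dm = (DownloadManager) getSystemService(Context.DOWNLOAD_SERVICE);
DownloadManager.Request request = new DownloadManager.Request(
        Uri.parse("http://some-site.com/sounds/Whale/sadwhale.mp3")); // placeholder
request.setTitle("sadwhale.mp3");
request.setDestinationInExternalPublicDir(Environment.DIRECTORY_MUSIC,
        "Whale/sadwhale.mp3");
long id = dm.enqueue(request); // keep the id to query status/progress later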
Here is the thing you need to keep in mind:
Computers have limited resources: network bandwidth, CPU, memory, disk, and so on.
Downloading one file at a time can never logically take longer than downloading all 760 simultaneously; total time is bounded by the available bandwidth either way.
However, by spawning a whole lot of background tasks/threads, you incur a lot of thread-thrashing overhead as each one is context-switched in and out. CPU time is consumed by the switching instead of by actually moving data onto and off of the network interface. In addition, each thread consumes its own memory, and may need to be created if it isn't part of a pool.
Basically, the reason your app isn't working reliably (or at all) is almost certainly that it runs out of CPU/disk-I/O/memory resources well before it finishes the downloads or fully utilizes the network.
Solution: find a library to do this, or make use of the Executor suite of classes with a limited pool of threads (then only download a few at a time), as in the sketch below.
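For example, a minimal sketch of the fixed-pool approach using java.util.concurrent and plain HttpURLConnection (pool size, URL, and destination layout are my choices, not a definitive implementation):

// Download the whole list with at most 3 concurrent transfers.
// A small fixed pool keeps CPU/memory overhead low while still
// overlapping network latency across a few connections.
ExecutorService pool = Executors.newFixedThreadPool(3);
for (final String file : list) { // e.g. "Whale/sadwhale.mp3"
    pool.execute(new Runnable() {
        @Override
        public void run() {
            HttpURLConnection conn = null;
            try {
                URL url = new URL("http://some-site.com/sounds/" + file);
                conn = (HttpURLConnection) url.openConnection();
                File dest = new File(Environment.getExternalStorageDirectory(), file);
                dest.getParentFile().mkdirs();
                InputStream in = new BufferedInputStream(conn.getInputStream());
                OutputStream out = new BufferedOutputStream(new FileOutputStream(dest));
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                }
                out.close();
                in.close();
            } catch (IOException e) {
                e.printStackTrace(); // a real app should record and retry failures
            } finally {
                if (conn != null) conn.disconnect();
            }
        }
    });
}
pool.shutdown(); // no new tasks; queued ones run to completion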
Here is some good evidence in the wild that what you're trying to do is not advised:
Google Play updates are all serialized
The Amazon MP3 downloader is totally serialized
The default scp client in Linux transfers files serially
Windows Update downloads serially
Getting the picture? Spawning all those threads is a recipe for problems in return for a merely perceived speed improvement.