Can we reset all the values held in MediaRecorder while recording video?
I've tried just calling mediaRecorder.reset() while recording video, but it doesn't work. I don't know whether this is possible or not. If it is, any references would be appreciated.
I've read this, and also the MediaRecorder documentation on Google Developers, but none of those references mention my issue.
EDIT:
What I want is to call mediaRecorder.reset() and then mediaRecorder.start() while a video is being recorded; the problem occurs when I do this. I need to produce chunks of video clips while recording the same video, and I need those processes to run in parallel. When I try to stop and restart the camera capture methods, many frames are missed, because handling the camera is fairly costly for the processor. I tried this and got errors saying the session configuration failed. Now I'm stuck here. Need help!
Thank you for your valuable time.
Edit in response to clarifications:
Ok, so you want to split the video file into multiple separate files.
You'll need to use the lower-level APIs (MediaCodec, MediaMuxer) to implement this yourself; the higher-level MediaRecorder does not support this without losing frames.
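A minimal sketch of the muxer-rotation part of that approach (the class name SegmentingMuxer, the segment length, and the output file naming are my own assumptions; it assumes you are already draining encoded buffers from a running MediaCodec encoder). The encoder never stops, and the output is cut into separate MP4 files at key-frame boundaries:

    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.media.MediaMuxer;
    import java.io.File;
    import java.io.IOException;
    import java.nio.ByteBuffer;

    // Hypothetical helper: rotates MediaMuxer instances so the encoder keeps
    // running while the recording is split into separate files.
    public class SegmentingMuxer {
        private static final long SEGMENT_DURATION_US = 5_000_000L; // 5 s per clip

        private final File outputDir;
        private final MediaFormat videoFormat; // from INFO_OUTPUT_FORMAT_CHANGED,
                                               // so each file gets the SPS/PPS data
        private MediaMuxer muxer;
        private int trackIndex;
        private long segmentStartUs = -1;
        private int segmentCount = 0;

        public SegmentingMuxer(File outputDir, MediaFormat videoFormat) throws IOException {
            this.outputDir = outputDir;
            this.videoFormat = videoFormat;
            startNewSegment();
        }

        private void startNewSegment() throws IOException {
            File out = new File(outputDir, "clip_" + (segmentCount++) + ".mp4");
            muxer = new MediaMuxer(out.getAbsolutePath(),
                    MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
            trackIndex = muxer.addTrack(videoFormat);
            muxer.start();
        }

        // Call this with each encoded buffer drained from the MediaCodec encoder.
        public void writeSample(ByteBuffer data, MediaCodec.BufferInfo info) throws IOException {
            if (segmentStartUs < 0) {
                segmentStartUs = info.presentationTimeUs;
            }
            boolean keyFrame = (info.flags & MediaCodec.BUFFER_FLAG_KEY_FRAME) != 0;
            // Only cut at a key frame; otherwise the next file would start undecodable.
            if (keyFrame && info.presentationTimeUs - segmentStartUs >= SEGMENT_DURATION_US) {
                muxer.stop();
                muxer.release();
                startNewSegment();
                segmentStartUs = info.presentationTimeUs;
            }
            muxer.writeSampleData(trackIndex, data, info);
        }

        public void release() {
            muxer.stop();
            muxer.release();
        }
    }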
Original:
So you're trying to pause the video recording temporarily.
Unfortunately, there's no support for this before API level 24, which added MediaRecorder.pause(). You can't call MediaRecorder.reset() mid-video and have it work.
All you can really do is record the full video and then post-process it to cut out the sections you don't want.
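For reference, on API 24+ the pause/resume calls are straightforward. A minimal sketch (the helper names are mine, and the MediaRecorder is assumed to be already configured and recording):

    import android.media.MediaRecorder;
    import android.os.Build;

    // Minimal sketch: pause/resume only exists from API 24 (Android N).
    void pauseIfSupported(MediaRecorder recorder) {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.N) {
            recorder.pause();  // suspends recording; the output file stays open
        }
    }

    void resumeIfSupported(MediaRecorder recorder) {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.N) {
            recorder.resume(); // continues recording into the same file
        }
    }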
Related
I am using the CameraX API approach to record videos and save them, as explained in the documentation: https://developer.android.com/training/camerax/video-capture
I tried ffmpeg-android-java, but it clearly needs to process the already-saved video rather than adding the watermark to the frames in real time.
Now, I want to add a watermark to all saved videos in an easy and inexpensive way, without needing to reprocess the video. Is that possible?
Also, if there is no such way, what is the best and fastest approach to process the video just to add a watermark?
My suggestion is to write player-like code that uses OpenGL ES to draw the video frames, so you can overlay your own arbitrary watermark. Alternatively, use a cross-compiled FFmpeg dynamic library (or static library) to add your own filters.
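If you take the FFmpeg route, the overlay filter is the usual way to burn in a watermark. A minimal sketch using the FFmpegKit wrapper (the library choice, helper name, and paths are my assumptions, not something from this answer; note that this re-encodes the video track):

    import com.arthenica.ffmpegkit.FFmpegKit;
    import com.arthenica.ffmpegkit.FFmpegSession;
    import com.arthenica.ffmpegkit.ReturnCode;

    // Overlays the watermark image 10 px from the top-left corner of the video.
    // The audio stream is copied through without re-encoding.
    void addWatermark(String inputPath, String watermarkPath, String outputPath) {
        FFmpegSession session = FFmpegKit.execute(
                "-i " + inputPath + " -i " + watermarkPath +
                " -filter_complex overlay=10:10 -codec:a copy " + outputPath);
        if (!ReturnCode.isSuccess(session.getReturnCode())) {
            // handle failure, e.g. inspect session.getFailStackTrace()
        }
    }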
I shared an example of cross-compiled FFmpeg on GitHub; it implements a player that can play video. Check it out here.
I hope this helps.
I've read somewhere that it still isn't possible to record audio while using the camera function on Android phones, but that source was somewhat outdated.
I've also read that this is possible on iPhone.
But I need this functionality on Android to create an app.
Can anybody say more about this?
Is there a way to achieve this in an Android application?
I don't see why not; they don't share the same hardware. Even if it weren't supported directly, you could easily fake it by recording video (which also records sound) and just taking the first still image of the video as your photo.
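Extracting that first still image is simple with MediaMetadataRetriever. A minimal sketch (the path parameter is a placeholder):

    import android.graphics.Bitmap;
    import android.media.MediaMetadataRetriever;

    // Grabs the frame closest to the start of the recorded video as a Bitmap.
    Bitmap firstFrame(String videoPath) {
        MediaMetadataRetriever retriever = new MediaMetadataRetriever();
        try {
            retriever.setDataSource(videoPath);
            return retriever.getFrameAtTime(0, MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
        } finally {
            try { retriever.release(); } catch (Exception ignored) { }
        }
    }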
I'm using the Android YouTube API and I was wondering if there is a way to control video buffering. I'm specifically interested in the possibility of pausing the buffering and resuming it later, with the idea of downloading the video gradually in chunks.
Any kind of help or workaround is highly appreciated.
You can look at the following methods available on a YouTubePlayer; a short usage sketch follows the list.
pause()
play()
seekToMillis(int milliSeconds)
seekRelativeMillis(int milliSeconds)
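For example, a minimal sketch (it assumes player is the YouTubePlayer you receive in the onInitializationSuccess callback):

    import com.google.android.youtube.player.YouTubePlayer;

    // Pause playback, jump back 5 seconds, then resume.
    void pauseAndRewind(YouTubePlayer player) {
        player.pause();
        player.seekRelativeMillis(-5000); // negative values seek backwards
        player.play();
    }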
The YouTube API has a method stopVideo() that you may be able to use. Unlike the pause() method, it will stop downloading the video. I'm not sure if you can resume the video afterwards, but at least it stops buffering.
For what it's worth, I didn't find a way to achieve this using the YouTube API, and I don't think there is one (correct me if I'm wrong).
I ended up abandoning the YouTube API completely and solving the problem in an ad hoc way: the app determines the actual location of the video file and downloads it progressively to a temporary file using standard means, which gives it full control over the connection. The data is read and fed to the media player while it is still being downloaded, providing a second level of local buffering.
My application takes a long time to prepare and buffer an audio stream. I have read this question, Why does it take so long for Android's MediaPlayer to prepare some live streams for playback?, but it only confirms that others have experienced the issue; it does not say how to improve the situation.
I am experiencing this on all versions of Android, tested from 2.2 to 4.1.2.
The streams use a bit rate suitable for mobile and 3G connections. The same stream takes less than a second to start buffering in the equivalent iOS app.
Is there a way to specify the amount of time that should be buffered? I know that the TuneIn Radio application offers this feature (https://play.google.com/store/apps/details?id=tunein.player).
Thanks.
Edit: I've tested again and found that it only happens on devices running Gingerbread and above (>= 2.3). I know that Android changed the underlying framework from OpenCore to StageFright. So how can I optimise the media framework? It just seems wrong that the old HTC Wildfire can prepare, stream, and play literally 10x faster than the brand-new HTC One X and Nexus 7.
I struggled with this question for months, and finally I found the solution.
The real problem is in the implementation of the MediaPlayer class, particularly in the way MediaPlayer buffers the data. This is why the solution is to do your own buffering: save the stream to a temp file and feed that to MediaPlayer.
This tutorial and source code explain exactly how. http://androidstreamingtut.blogspot.nl/2012/08/custom-progressive-audio-streaming-with.html
By adapting this code, it is easy to create a much better streaming player.
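A minimal sketch of that idea, assuming a plain HTTP progressive stream (the start threshold, buffer sizes, and lack of threading are simplifications; real code must download on a background thread and handle the player catching up with the still-growing file, as the linked tutorial does):

    import android.media.MediaPlayer;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Downloads the stream to a temp file; MediaPlayer plays from that file,
    // so how much is buffered ahead is entirely under your control.
    void bufferAndPlay(String streamUrl, File tempFile) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(streamUrl).openConnection();
        try (InputStream in = conn.getInputStream();
             FileOutputStream out = new FileOutputStream(tempFile)) {
            byte[] buf = new byte[16 * 1024];
            long written = 0;
            int n;
            MediaPlayer player = null;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
                out.flush();
                written += n;
                // Start playback as soon as a small prefix is on disk.
                if (player == null && written > 64 * 1024) {
                    player = new MediaPlayer();
                    FileInputStream fis = new FileInputStream(tempFile);
                    player.setDataSource(fis.getFD()); // FD avoids file-locking issues
                    fis.close();
                    player.prepare();
                    player.start();
                }
            }
        } finally {
            conn.disconnect();
        }
    }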
Google Developers really screwed up here.
EDIT: This answer is rather old. Nowadays I would recommend not using MediaPlayer and using ExoPlayer instead. It is extensible, stable, and can play many different types of media. You can find it here: https://github.com/google/ExoPlayer/
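Getting a stream playing with ExoPlayer takes only a few lines. A minimal sketch (assuming a recent ExoPlayer 2.x artifact; the URL is a placeholder, and DefaultLoadControl lets you tune buffer durations if you need to):

    import android.content.Context;
    import com.google.android.exoplayer2.ExoPlayer;
    import com.google.android.exoplayer2.MediaItem;

    // ExoPlayer buffers far more predictably than MediaPlayer, and its
    // LoadControl lets you configure how much is buffered before playback.
    ExoPlayer startStream(Context context, String url) {
        ExoPlayer player = new ExoPlayer.Builder(context).build();
        player.setMediaItem(MediaItem.fromUri(url));
        player.prepare();
        player.play();
        return player;
    }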
There really isn't much you can do, since the Android MediaPlayer class doesn't provide access to lower-level settings such as buffer size. The only alternative would be to make your own player using AudioTrack and a library like FFmpeg to do the decoding.
The one thing I'd recommend is to play around with the encoding. For instance, for MP4s, ensure that the MOOV atom is located at the beginning of the file (ffmpeg's -movflags +faststart does this; there are plenty of questions on Stack Overflow about it). With MP3s, you can try different codecs or bit rates.
You can, for instance, try a number of audio files you find online, and if you see one that doesn't take long to buffer, try to encode your files in the same way.
Where to get streaming (live) video and audio from camera example for Android?
Suppose I want to create a live video streaming service app, so I'll have some cool server at the back end, and I know how to do that part. Suppose I already have a standalone app for PCs, and now I want to move on to mobile devices. So I want to see a sample app that grabs audio and video streams from the phone, synchronizes them, encodes them somehow, and sends the LIVE stream to a server. I need any open-source sample that does this, or something like it. Where can I get one?
Ole, have you been able to find any good examples of video or audio broadcasting yet? The best I have found so far is the SIPDroid project (www.sipdroid.org). I haven't had a chance to review it in depth, but it looks promising.
Here are some projects that may be what you want:
IP Camera
http://code.google.com/p/ipcamera-for-android
SipDroid
http://code.google.com/p/sipdroid/source/browse/trunk/src/org/sipdroid/sipua/ui/VideoCamera.java
You can get the code using SVN or another client.
That said, to me both projects still have issues. If you get one of them working well, please tell me.