How to lock focus with the Camera2 API in a service without preview - java

I am building an app which uses the Camera2 API to take pictures. The thing is, I need the camera to take a picture without showing a preview. So far, I have managed to do it by dumping (and adapting) the code from an activity into a service, and it works like a charm, except for the fact that it is not focusing. In previous versions I had a state machine in charge of focusing on the preview by means of a separate CaptureRequest.Builder, but I can't make it work without creating a new CaptureRequest.Builder in the service.
I followed the approach from this Stack Overflow discussion: How to lock focus in camera2 api, android?, but I did not manage to make it work.
My code does the following:
First I create a camera session once the camera has been opened.
public void createCameraSession() {
try {
// Here, we create a CameraCaptureSession for camera preview.
cameraDevice.createCaptureSession(Arrays.asList(imageReader.getSurface()),
new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
// The camera is already closed
if (null == cameraDevice) {
return;
}
// When the session is ready, we start displaying the preview.
mCaptureSession = cameraCaptureSession;
camera2TakePicture();
}
@Override
public void onConfigureFailed(
@NonNull CameraCaptureSession cameraCaptureSession) {
}
}, null
);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
Then on that camera session I call my method "camera2TakePicture()":
protected void camera2TakePicture() {
if (null == cameraDevice) {
return;
}
try {
Surface readerSurface = imageReader.getSurface();
List<Surface> outputSurfaces = new ArrayList<Surface>(2);
outputSurfaces.add(readerSurface);
final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(readerSurface);
captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CameraMetadata.CONTROL_AF_MODE_AUTO);
captureBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_START);
//MeteringRectangle meteringRectangle = getAFRegion();
//captureBuilder.set(CaptureRequest.CONTROL_AF_REGIONS, new MeteringRectangle[] {meteringRectangle});
/**** TO BE USED ONCE SAMSUNG TABLETS HAVE BEEN REPLACED ****/
boolean samsungReplaced = false;
if(Boolean.parseBoolean(getPreferenceValue(this, "manualCamSettings"))) {
int exposureCompensation = Integer.parseInt(getPreferenceValue(this, "exposureCompensation"));
captureBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, exposureCompensation);
if(samsungReplaced) {
//Exposure
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CameraMetadata.CONTROL_AE_MODE_OFF);
Float shutterSpeed = 1 / Float.parseFloat(getPreferenceValue(this, "camSSpeed"));
Long exposureTimeInNanoSec = new Long(Math.round(shutterSpeed * Math.pow(10, 9)));
captureBuilder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, exposureTimeInNanoSec);
captureBuilder.set(CaptureRequest.SENSOR_FRAME_DURATION, 10 * exposureTimeInNanoSec);
//ISO
int ISO = Integer.parseInt(getPreferenceValue(this, "camISO"));
captureBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, ISO);
//Aperture
Float aperture = Float.parseFloat(getPreferenceValue(this, "camAperture"));
captureBuilder.set(CaptureRequest.LENS_APERTURE, aperture);
}
}
// Orientation
WindowManager window = (WindowManager) getSystemService(Context.WINDOW_SERVICE);
Display display = window.getDefaultDisplay();
int rotation = display.getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
CameraCaptureSession.CaptureCallback CaptureCallback
= new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
while(result.get(CaptureResult.CONTROL_AF_STATE) != CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED){
System.out.println("Not focused");
}
System.out.println("Focused");
}
};
mCaptureSession.stopRepeating();
mCaptureSession.abortCaptures();
mCaptureSession.capture(captureBuilder.build(), CaptureCallback, null);
captureBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_IDLE);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
As you can see, I set CONTROL_AF_MODE to AUTO, then start the AF_TRIGGER and launch the capture. I added a check in onCaptureCompleted(), but the AF_STATE never seems to reach FOCUSED_LOCKED. It stays on ACTIVE_SCAN.
What am I doing wrong?

In your code snippet, you've stopped the repeating request and issued one capture request for the still image, but just one.
Do you then go on to restart the repeating request? If you don't, there are no frames flowing through the camera, and AF cannot make progress.
So if you want to lock AF before you take a picture, you want to:
1. Set AF_TRIGGER to START for a single capture only.
2. Run preview until you get AF_STATE out of ACTIVE_SCAN.
3. Issue a single capture for the still image.
Being in the background or foreground doesn't really change any of this.
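A minimal sketch of that sequence, adapted to the question's no-preview setup (the repeating request targets the ImageReader surface); previewBuilder, afCallback, mCaptureSession and handler are assumed names, not part of the original code:
private void lockFocusThenCapture() throws CameraAccessException {
    previewBuilder.set(CaptureRequest.CONTROL_AF_MODE, CameraMetadata.CONTROL_AF_MODE_AUTO);
    // 1) Fire the AF trigger in a single capture, then clear it so it does not
    //    re-trigger on every subsequent frame.
    previewBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_START);
    mCaptureSession.capture(previewBuilder.build(), afCallback, handler);
    previewBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_IDLE);
    // 2) Keep frames flowing so AF can make progress.
    mCaptureSession.setRepeatingRequest(previewBuilder.build(), afCallback, handler);
    // 3) In afCallback.onCaptureCompleted(), read result.get(CaptureResult.CONTROL_AF_STATE);
    //    once it leaves ACTIVE_SCAN (FOCUSED_LOCKED or NOT_FOCUSED_LOCKED),
    //    call stopRepeating() and issue the TEMPLATE_STILL_CAPTURE request.
}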

Related

Media Player loading media again and again

I am making a chat application and have implemented a feature for sending audio messages. But there is one behavior here that I don't want: whenever my adapter gets updated, the media player starts loading again. This means that if someone is listening to an audio message and the user at the other end sends a message, the media player stops and loads again. Here is the code of my adapter.
final MediaPlayer mediaPlayer;
mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
handler = new Handler();
try {
mediaPlayer.setOnCompletionListener(mediaPlayer1 -> {
mediaPlayer1.stop();
binding.audioSeekbar.setProgress(0);
});
if (mediaPlayer.isPlaying()){
mediaPlayer.stop();
mediaPlayer.release();
}
mediaPlayer.setDataSource(finalUrlToLoad[1]);
mediaPlayer.setVolume(1f, 1f);
mediaPlayer.prepareAsync();
mediaPlayer.setOnPreparedListener(mediaPlayer1 -> {
int totalDuration = mediaPlayer1.getDuration();
binding.totalDurationAudio.setText(createTimeLabel(totalDuration));
binding.loadingAudio.setVisibility(GONE);
binding.playPauseAudio.setVisibility(VISIBLE);
});
} catch (IOException e) {e.printStackTrace();}
binding.playPauseAudio.setOnClickListener(view -> {
if (mediaPlayer.isPlaying()){
handler.removeCallbacks(runnable);
mediaPlayer.pause();
binding.playPauseAudio.setImageResource(R.drawable.pause_to_play);
Drawable drawable = binding.playPauseAudio.getDrawable();
if( drawable instanceof AnimatedVectorDrawable) {
AnimatedVectorDrawable animation = (AnimatedVectorDrawable) drawable;
animation.start();
}
}else {
mediaPlayer.seekTo(binding.audioSeekbar.getProgress());
mediaPlayer.start();
handler.post(runnable);
binding.playPauseAudio.setImageResource(R.drawable.play_to_pause);
Drawable drawable = binding.playPauseAudio.getDrawable();
if( drawable instanceof AnimatedVectorDrawable) {
AnimatedVectorDrawable animation = (AnimatedVectorDrawable) drawable;
animation.start();
}
}
});
runnable = () -> {
int totalTime = mediaPlayer.getDuration();
binding.audioSeekbar.setMax(totalTime);
int currentPosition = mediaPlayer.getCurrentPosition();
binding.audioSeekbar.setProgress(currentPosition);
binding.totalDurationAudio.setText(createTimeLabel(totalTime));
Log.d("time", String.valueOf(currentPosition));
handler.postDelayed(runnable,1000);
};
binding.audioSeekbar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
@Override
public void onProgressChanged(SeekBar seekBar, int i, boolean b) {
if (b){
mediaPlayer.seekTo(i);
seekBar.setProgress(i);
}
}
@Override
public void onStartTrackingTouch(SeekBar seekBar) {
}
@Override
public void onStopTrackingTouch(SeekBar seekBar) {
}
});
mediaPlayer.setOnBufferingUpdateListener((mediaPlayer1, i) -> binding.audioSeekbar.setSecondaryProgress(i));
Here finalUrlToLoad[1] is the URL for the audio.
What do I need to do to prevent it from loading again and again?
I will be really grateful to whoever answers this question.
Thanks😊.
It's hard to tell from this code, but I assume this is all set in your onBind event? If so, this means that every time the RecyclerView creates a new holder and binds it, the associated media will be prepped and loaded, and whichever holder was last bound "wins" (and is what the MediaPlayer will be loaded with). Since by default RecyclerView typically creates multiple holders up front, you are seeing your MediaPlayer being "loaded" multiple times.
You probably just don't want to do the initialization of each audio message in the onBind. Instead, use the onBind event to initialize state variables (duration, progress, etc.) to some default value, hide them, and bind the specific audio Uri. Then, when the user takes some action like tapping on the holder, you unhide an indeterminate progress bar while the initialization takes place; in the onPrepared() event, unhide the state information (duration, progress, seekbar, etc.), hide the indeterminate progress bar, and start the audio.
I assume you are also sending over the sound file as part of your messaging app (i.e. not storing it on the web somewhere in a central location?), and this file gets stored in an app-specific storage location? If so, you don't need to worry about persisting the permission to that URI, but if that isn't the case you will.
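A rough sketch of that approach, with hypothetical names (messages, holder.binding) mirroring the question's code; the MediaPlayer is only created and prepared when the user taps play:
@Override
public void onBindViewHolder(MessageHolder holder, int position) {
    final String audioUrl = messages.get(position).getAudioUrl(); // hypothetical model accessor
    // onBind only resets state and wires the click; no MediaPlayer work here.
    holder.binding.audioSeekbar.setProgress(0);
    holder.binding.loadingAudio.setVisibility(View.GONE);
    holder.binding.playPauseAudio.setOnClickListener(v -> {
        holder.binding.loadingAudio.setVisibility(View.VISIBLE);
        MediaPlayer mp = new MediaPlayer();
        try {
            mp.setDataSource(audioUrl);
            mp.setOnPreparedListener(prepared -> {
                holder.binding.loadingAudio.setVisibility(View.GONE);
                prepared.start();
            });
            mp.prepareAsync(); // load only on demand
        } catch (IOException e) {
            e.printStackTrace();
        }
    });
}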
First, extract the media player code into a singleton class, e.g. AudioManager.
Add a few methods, like setMediaUpdateListener to set a callback for seek/duration updates, and togglePlayPause to play or pause the audio.
Pass the message id (or any other unique identifier) to the audio manager when playing the audio.
Then, in the adapter's onBind method:
1. Compare whether the bound id and the currently playing id are the same, e.g. AudioManager.getInstance().isPlaying(messageId). If yes, set the seek-update listener on the audio manager class, and update the play/pause icon based on the AudioManager.isPlaying() method.
2. If the user plays another message by clicking its play button, call the AudioManager.play(message) method, in which we release the previous player and play the new one.
3. If the current message is not playing, reset the view to its non-playing state.
4. If auto-play is enabled, check whether the AudioManager is free; only then play the last message, otherwise ignore it.
In short: a class that manages the audio for you and stores all the state.
class AudioManager {
    private static AudioManager instance;
    private MediaPlayer mediaPlayer;
    private AudioListener audioListener;
    private Uri currentPlaying;

    public static AudioManager getInstance() {
        if (instance == null) {
            instance = new AudioManager();
        }
        return instance;
    }

    public void play(Uri dataUri) {
        if (mediaPlayer != null && dataUri.equals(currentPlaying)) {
            // Same message: just resume if paused.
            if (!mediaPlayer.isPlaying()) {
                mediaPlayer.start();
            }
            return;
        } else if (mediaPlayer != null) {
            // Different message: release the previous player first.
            mediaPlayer.stop();
            mediaPlayer.release();
        }
        mediaPlayer = new MediaPlayer();
        // Qualified, since this class is itself called AudioManager.
        mediaPlayer.setAudioStreamType(android.media.AudioManager.STREAM_MUSIC);
        currentPlaying = dataUri;
        try {
            mediaPlayer.setOnCompletionListener(mp -> {
                mp.stop();
                sendProgress(0);
            });
            mediaPlayer.setDataSource(dataUri.toString());
            mediaPlayer.setVolume(1f, 1f);
            mediaPlayer.setOnPreparedListener(mp -> {
                sendTotalDuration(mp.getDuration());
                mp.start();
            });
            mediaPlayer.prepareAsync();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public void pause() {
        // update the pause code.
    }

    public void sendProgress(int progress) {
        if (audioListener != null) {
            audioListener.onProgress(progress);
        }
    }

    public void sendTotalDuration(int duration) {
        if (audioListener != null) {
            audioListener.onTotalDuration(duration);
        }
    }

    public void setAudioListener(AudioListener audioListener) {
        this.audioListener = audioListener;
    }

    public interface AudioListener {
        void onProgress(int progress);
        void onTotalDuration(int duration);
        void onAudioPlayed();
        void onAudioPaused();
    }
}
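And a hypothetical onBind usage of that singleton; isPlaying(messageId) and the message accessors are the id-based helpers described above, not all shown in the class:
if (AudioManager.getInstance().isPlaying(message.getId())) {
    // This row is the one currently playing: attach the seek listener.
    AudioManager.getInstance().setAudioListener(seekListener);
    binding.playPauseAudio.setImageResource(R.drawable.play_to_pause);
} else {
    // Reset the row to its non-playing state.
    binding.audioSeekbar.setProgress(0);
    binding.playPauseAudio.setImageResource(R.drawable.pause_to_play);
}
binding.playPauseAudio.setOnClickListener(v ->
        AudioManager.getInstance().play(message.getAudioUri()));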

Camera2 ImageReader hangs after a while with "Failed to release buffer" message

I'm having a problem with android's camera2 API.
My end goal here is to have a byte array which I can edit using opencv, whilst displaying the preview to the user (e.g. an OCR with a preview).
I've created a capture request and added an ImageReader as a target. Then, in the OnImageAvailableListener, I'm getting the image, transforming it to a bitmap, and then displaying it on an ImageView (and rotating it).
My problem is that after a few seconds, the preview stalls (after gradually slowing down) and in the log I'm getting the following error: E/BufferItemConsumer: [ImageReader-1225x1057f100m2-18869-0] Failed to release buffer: Unknown error -1 (1)
As you can see in my code, I have already tried closing the img after getting my byte[] from it.
I've also tried clearing the buffer.
I've tried closing the ImageReader but that of course stopped me from getting any further images (throws an exception).
Can anyone please help me understand what I'm doing wrong? I've been scouring Google to no avail.
This is my OnImageAvailableListener, do let me know if you need more of my code to assist:
private final ImageReader.OnImageAvailableListener mOnImageAvailableListener
= new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image img = reader.acquireLatestImage();
final ImageView iv = findViewById(R.id.camPrev);
try{
if (img==null) throw new NullPointerException("null img");
ByteBuffer buffer = img.getPlanes()[0].getBuffer();
byte[] data = new byte[buffer.remaining()];
buffer.get(data);
final Bitmap b = BitmapFactory.decodeByteArray(data, 0, data.length);
runOnUiThread(new Runnable() {
@Override
public void run() {
iv.setImageBitmap(b);
iv.setRotation(90);
}
});
} catch (NullPointerException ex){
showToast("img is null");
}finally {
if(img!=null)
img.close();
}
}
};
Edit - adding cameraStateCallback
private CameraDevice.StateCallback mCameraDeviceStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(CameraDevice cameraDevice) {
mCameraDevice = cameraDevice;
showToast("Connected to camera!");
createCameraPreviewSession();
}
@Override
public void onDisconnected(CameraDevice cameraDevice) {
closeCamera();
}
@Override
public void onError(CameraDevice cameraDevice, int i) {
closeCamera();
}
};
private void closeCamera() {
if (mCameraDevice != null) {
mCameraDevice.close();
mCameraDevice = null;
}
}
You seem to have used setRepeatingRequest() with the JPEG format. This may not be fully supported on your device, and it also depends on the image resolution you chose. Normally, we use createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW) in these cases, and get the YUV or raw format from the ImageReader.
I would try choosing a low resolution for JPEG: maybe that will be enough to keep the ImageReader running.
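A minimal sketch of that setup, assuming fields like mCameraDevice and backgroundHandler exist: a YUV_420_888 ImageReader fed by a repeating TEMPLATE_PREVIEW request instead of repeated JPEG captures.
ImageReader reader = ImageReader.newInstance(1280, 720, ImageFormat.YUV_420_888, 2);
reader.setOnImageAvailableListener(r -> {
    Image img = r.acquireLatestImage();
    if (img != null) {
        // process the YUV planes here
        img.close();
    }
}, backgroundHandler);
final CaptureRequest.Builder builder =
        mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
builder.addTarget(reader.getSurface());
mCameraDevice.createCaptureSession(Arrays.asList(reader.getSurface()),
        new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(CameraCaptureSession session) {
                try {
                    session.setRepeatingRequest(builder.build(), null, backgroundHandler);
                } catch (CameraAccessException e) {
                    e.printStackTrace();
                }
            }
            @Override
            public void onConfigureFailed(CameraCaptureSession session) {}
        }, null);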

Camera2 Api autoflash not work

I need help with the Android Camera2 API automatic flash.
This solution works on one phone but not on another.
I've spent a few hours searching for solutions, but without success.
My takePhoto code:
pictureTaken = false;
if (null == cameraDevice) {
Log.e(TAG, "cameraDevice is null");
return;
}
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
int width = 1024;
int height = 768;
cv.setBackground(getResources().getDrawable(R.drawable.fotak_zeleny));
ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
List<Surface> outputSurfaces = new ArrayList<Surface>(2);
outputSurfaces.add(reader.getSurface());
outputSurfaces.add(new Surface(textureView.getSurfaceTexture()));
final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(reader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);
captureBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_START);
if (flashMode == FLASH_AUTO) {
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
} else if (flashMode == FLASH_ON) {
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_ALWAYS_FLASH);
} else if (flashMode == FLASH_OFF) {
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF);
}
// Orientation
int rotation = getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
final File file = new File(fileName);
if (file.exists()) {
file.delete();
}
etc....
My create camera preview code:
protected void createCameraPreview() {
try {
SurfaceTexture texture = textureView.getSurfaceTexture();
assert texture != null;
texture.setDefaultBufferSize(imageDimension.getWidth(), imageDimension.getHeight());
Surface surface = new Surface(texture);
captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_LOCK, false);
captureRequestBuilder.addTarget(surface);
cameraDevice.createCaptureSession(Arrays.asList(surface), new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
//The camera is already closed
if (null == cameraDevice) {
return;
}
// When the session is ready, we start displaying the preview.
cameraCaptureSessions = cameraCaptureSession;
updatePreview();
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
}
}, null);
} catch (CameraAccessException e) {
e.printStackTrace();
} catch (NullPointerException ex) {
ex.printStackTrace();
}
}
This solution works fine on my LG phone, but it does not work on an Alcatel.
I tried a lot of the ideas written here, without success.
Can you help me, please?
Big thanks
(Sorry for my English)
You should set your selected AE_MODE for the preview request as well, and update it whenever the user switches flash modes. In addition, you need to run the precapture sequence on any devices that are higher than LEGACY level.
Changing flash mode for just the single still capture request won't work correctly, since the phone won't have the opportunity to fire a preflash to properly calculate flash power.
Take a look at camera2basic for running the precapture sequence. It always sets AE mode to AE_MODE_AUTO_FLASH if possible, but the same code will work fine for the other flash modes (though you can skip the precapture sequence if flash is set to OFF, generally, as long as focus quality is OK).
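A hedged sketch of that precapture step, in the style of camera2basic; mPreviewRequestBuilder, mCaptureSession, mCaptureCallback and mBackgroundHandler are assumed to exist:
private void runPrecaptureSequence() {
    try {
        // Ask the AE algorithm to run its precapture metering (pre-flash) pass.
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER,
                CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);
        mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback,
                mBackgroundHandler);
        // mCaptureCallback should watch CaptureResult.CONTROL_AE_STATE and issue
        // the still capture once the state leaves PRECAPTURE (CONVERGED or
        // FLASH_REQUIRED).
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}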
If you Command-click on CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH, you will see this:
The flash may be fired during a precapture sequence
(triggered by {@link CaptureRequest#CONTROL_AE_PRECAPTURE_TRIGGER android.control.aePrecaptureTrigger}) and
may be fired for captures for which the
{@link CaptureRequest#CONTROL_CAPTURE_INTENT android.control.captureIntent} field is set to STILL_CAPTURE
That means you have to trigger the precapture sequence first, before capturing the picture.
You can look at Google's sample app for the detailed implementation: https://github.com/google/cameraview
I use the default camera application, because this camera API is different on different phones.

Xamarin Android - Splash Screen Doesn't Work On Resume

I followed this article and pieced together information with other articles to create a splash screen:
https://learn.microsoft.com/en-us/xamarin/android/user-interface/splash-screen
The splash screen works well when I start the app up by tapping on the app's icon. However, if the app is already running and I switch to it, the screen goes white for a few seconds while the app resumes. Why?
Here is my code:
[Activity(Label = "Hardfolio", Icon = "#drawable/icon", Theme = "#style/MyTheme.Splash", MainLauncher = true, NoHistory = true, ConfigurationChanges = ConfigChanges.ScreenSize | ConfigChanges.Orientation)]
public class SplashActivity : FormsAppCompatActivity
{
static readonly string TAG = "Hardfolio: " + typeof(SplashActivity).Name;
public override void OnCreate(Bundle savedInstanceState, PersistableBundle persistentState)
{
base.OnCreate(savedInstanceState, persistentState);
}
// Launches the startup task
protected override void OnResume()
{
base.OnResume();
var startupWork = new Task(AppStartup);
startupWork.Start();
}
// Simulates background work that happens behind the splash screen
async void AppStartup()
{
StartActivity(new Intent(Application.Context, typeof(MainActivity)));
}
}
[Activity(Label = "Hardfolio", Icon = "#drawable/icon", Theme = "#style/MainTheme", MainLauncher = false, ConfigurationChanges = ConfigChanges.ScreenSize | ConfigChanges.Orientation)]
[IntentFilter(new[] { UsbManager.ActionUsbDeviceAttached })]
[MetaData(UsbManager.ActionUsbDeviceAttached, Resource = "@xml/device_filter")]
public class MainActivity : Xamarin.Forms.Platform.Android.FormsAppCompatActivity
{
#region Fields
private AndroidHidDevice _TrezorHidDevice;
private UsbDeviceAttachedReceiver _UsbDeviceAttachedReceiver;
private UsbDeviceDetachedReceiver _UsbDeviceDetachedReceiver;
private object _ReceiverLock = new object();
#endregion
#region Overrides
protected override void OnCreate(Bundle bundle)
{
try
{
_TrezorHidDevice = new AndroidHidDevice(GetSystemService(UsbService) as UsbManager, ApplicationContext, 3000, 64, TrezorManager.TrezorVendorId, TrezorManager.TrezorProductId);
ServicePointManager.ServerCertificateValidationCallback += (o, certificate, chain, errors) => true;
TabLayoutResource = Resource.Layout.Tabbar;
ToolbarResource = Resource.Layout.Toolbar;
base.OnCreate(bundle);
Xamarin.Forms.Forms.Init(this, bundle);
RegisterReceiver();
var application = new App(new CrossPlatformUtilities(new IsolatedStoragePersister(), new AndroidRESTClientFactory()), _TrezorHidDevice, GetPin);
LoadApplication(application);
}
catch (Exception ex)
{
Logger.Log("Android crash", ex, nameof(Wallet.Droid));
Toast.MakeText(ApplicationContext, ex.ToString(), ToastLength.Long).Show();
}
}
private async Task<string> GetPin()
{
var taskCompletionSource = new TaskCompletionSource<string>();
RunOnUiThread(async () =>
{
var pin = await TrezorPinPad.GetPin();
taskCompletionSource.SetResult(pin);
});
return await taskCompletionSource.Task;
}
protected override void OnResume()
{
base.OnResume();
Logger.Log($"Resuming... Setting up Trezor listeners. _TrezorHidDevice is {(_TrezorHidDevice == null ? "null" : "not null")}", null, nameof(Wallet.Droid));
RegisterReceiver();
}
private void RegisterReceiver()
{
try
{
lock (_ReceiverLock)
{
if (_UsbDeviceAttachedReceiver != null)
{
UnregisterReceiver(_UsbDeviceAttachedReceiver);
_UsbDeviceAttachedReceiver.Dispose();
}
_UsbDeviceAttachedReceiver = new UsbDeviceAttachedReceiver(_TrezorHidDevice);
RegisterReceiver(_UsbDeviceAttachedReceiver, new IntentFilter(UsbManager.ActionUsbDeviceAttached));
if (_UsbDeviceDetachedReceiver != null)
{
UnregisterReceiver(_UsbDeviceDetachedReceiver);
_UsbDeviceDetachedReceiver.Dispose();
}
_UsbDeviceDetachedReceiver = new UsbDeviceDetachedReceiver(_TrezorHidDevice);
RegisterReceiver(_UsbDeviceDetachedReceiver, new IntentFilter(UsbManager.ActionUsbDeviceDetached));
}
}
catch (Exception ex)
{
Logger.Log($"Error registering Hid receivers", ex, nameof(Wallet.Droid));
}
}
#endregion
}
If you start your application from within another application, or using the ClearTask flag, or if your app is performing a cold start (it had been closed in the background), and perhaps in other ways as well, you will see a "preview" screen, which is the background of your current theme (similar to what you are already doing for your splash screen, which shows the theme background).
But if your "@style/MainTheme" has a plain white background, that is what you might see when re-entering your app.
Therefore you can consider using the "SetTheme" method in OnCreate. There is more about this in this link:
https://developer.android.com/topic/performance/vitals/launch-time
Hope it helps.

How to transform each YUV_420_888 frame to RGB efficiently (e.g., using renderscript yuv2rgb.rs) in Android camera2?

I am building an image processing program using Android camera2. Since the image format of each captured frame is YUV_420_888, I need to transform it to RGB efficiently for image processing. I googled and read a lot (especially the following two links), and finally found that RenderScript may be the solution. However, I don't know how to use the yuv2rgb script in my code.
http://werner-dittmann.blogspot.jp/2016/03/using-android-renderscript-to-convert.html
Convert android.media.Image (YUV_420_888) to Bitmap
Currently, I use the TextureView surface to show the preview, and use ImageReader to capture each YUV_420_888 frame in onImageAvailable function.
protected void createCameraPreview() {
try {
SurfaceTexture texture = textureView.getSurfaceTexture();
assert texture != null;
texture.setDefaultBufferSize(imageDimension.getWidth(), imageDimension.getHeight());
Surface surface = new Surface(texture);
Surface mImageSurface = mImageReader.getSurface();
captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureRequestBuilder.addTarget(surface);
List<Surface> surfaces = new ArrayList<>();
surfaces.add(surface);
surfaces.add(mImageSurface);
captureRequestBuilder.addTarget(mImageSurface);
// Note: the repeating request should be started in updatePreview() once the
// session is configured; calling setRepeatingRequest() here, before
// createCaptureSession() has produced a session, would fail.
cameraDevice.createCaptureSession(surfaces, new CameraCaptureSession.StateCallback(){
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
//The camera is already closed
if (null == cameraDevice) {
return;
}
// When the session is ready, we start displaying the preview.
cameraCaptureSessions = cameraCaptureSession;
updatePreview();
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
Toast.makeText(MainActivity.this, "Configuration change", Toast.LENGTH_SHORT).show();
}
}, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
private final ImageReader.OnImageAvailableListener mOnImageAvailableListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image img = reader.acquireNextImage(); // we got a YUV_420_888 frame here
// transform to RGB format here?
// image processing
}
};
How do I update my code to achieve this goal (e.g., using yuv2rgb.rs)? Thanks.
The camera2 sample application HdrViewfinder, which uses RenderScript to do some image processing, may be helpful for how to connect up the camera and RenderScript: https://github.com/googlesamples/android-HdrViewfinder
It doesn't do YUV->RGB conversion, IIRC, and I think yuv2rgb.rs may be intended for a different YUV colorspace than what the camera produces (due to backwards-compatibility concerns - it existed before camera2). But it gets you to the point where you can write your own RS script to apply to camera data.
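If a standard YUV-to-RGB conversion is enough, a sketch using the built-in ScriptIntrinsicYuvToRGB intrinsic (rather than a custom yuv2rgb.rs script) might look like this; it assumes the YUV_420_888 planes have already been repacked into an NV21 byte[] called nv21, with width and height taken from the Image:
RenderScript rs = RenderScript.create(context);
ScriptIntrinsicYuvToRGB yuvToRgb = ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs));

// Input allocation sized to the NV21 buffer, output sized to the RGBA frame.
Allocation in = Allocation.createSized(rs, Element.U8(rs), nv21.length);
Type rgbaType = new Type.Builder(rs, Element.RGBA_8888(rs))
        .setX(width).setY(height).create();
Allocation out = Allocation.createTyped(rs, rgbaType, Allocation.USAGE_SCRIPT);

in.copyFrom(nv21);
yuvToRgb.setInput(in);
yuvToRgb.forEach(out);

Bitmap bmp = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
out.copyTo(bmp); // bmp now holds the RGB frame for further processing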
