I need help with automatic flash using the Android Camera2 API.
This solution works on one phone but not on another.
I've spent a few hours searching for solutions, without success.
My takePhoto code:
pictureTaken = false;
if (null == cameraDevice) {
Log.e(TAG, "cameraDevice is null");
return;
}
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
int width = 1024;
int height = 768;
cv.setBackground(getResources().getDrawable(R.drawable.fotak_zeleny));
ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
List<Surface> outputSurfaces = new ArrayList<Surface>(2);
outputSurfaces.add(reader.getSurface());
outputSurfaces.add(new Surface(textureView.getSurfaceTexture()));
final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(reader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);
captureBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_START);
if (flashMode == FLASH_AUTO) {
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
} else if (flashMode == FLASH_ON) {
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_ALWAYS_FLASH);
} else if (flashMode == FLASH_OFF) {
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF);
}
// Orientation
int rotation = getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
final File file = new File(fileName);
if (file.exists()) {
file.delete();
}
etc....
My createCameraPreview code:
protected void createCameraPreview() {
try {
SurfaceTexture texture = textureView.getSurfaceTexture();
assert texture != null;
texture.setDefaultBufferSize(imageDimension.getWidth(), imageDimension.getHeight());
Surface surface = new Surface(texture);
captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_LOCK, false);
captureRequestBuilder.addTarget(surface);
cameraDevice.createCaptureSession(Arrays.asList(surface), new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
//The camera is already closed
if (null == cameraDevice) {
return;
}
// When the session is ready, we start displaying the preview.
cameraCaptureSessions = cameraCaptureSession;
updatePreview();
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
}
}, null);
} catch (CameraAccessException e) {
e.printStackTrace();
} catch (NullPointerException ex) {
ex.printStackTrace();
}
}
This solution works fine on my LG phone, but it does not work on the Alcatel.
I've tried a lot of the ideas posted here, unsuccessfully.
Can anyone help me, please?
Big thanks.
(Sorry for my English)
You should set your selected AE_MODE on the preview request as well, and update it whenever the user switches flash modes. In addition, you need to run the precapture sequence on any device above LEGACY hardware level.
Changing the flash mode for just the single still-capture request won't work correctly, since the phone won't have the opportunity to fire a preflash to calculate the flash power properly.
Take a look at camera2basic for running the precapture sequence. It always sets AE mode to AE_MODE_AUTO_FLASH if possible, but the same code will work fine for the other flash modes (though generally you can skip the precapture sequence when flash is set to OFF, as long as focus quality is OK).
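As a minimal sketch of that first point, reusing the question's flashMode field and the captureRequestBuilder / cameraCaptureSessions names from createCameraPreview() (applyFlashMode() and updateFlashMode() are hypothetical helpers, not Camera2 API):
// Hypothetical helper: applies the user's flash choice to any request builder.
// Note that AE_MODE_OFF disables auto-exposure entirely; for "flash off",
// keep AE on and turn off just the flash unit.
private void applyFlashMode(CaptureRequest.Builder builder) {
    if (flashMode == FLASH_AUTO) {
        builder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
    } else if (flashMode == FLASH_ON) {
        builder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_ALWAYS_FLASH);
    } else {
        builder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
        builder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_OFF);
    }
}

// Call this from createCameraPreview() and whenever the user toggles the mode,
// so the repeating preview request always carries the current AE mode.
private void updateFlashMode() throws CameraAccessException {
    applyFlashMode(captureRequestBuilder);
    cameraCaptureSessions.setRepeatingRequest(captureRequestBuilder.build(), null, null);
}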
If you Command-click on CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH, you will see this:
The flash may be fired during a precapture sequence
(triggered by {@link CaptureRequest#CONTROL_AE_PRECAPTURE_TRIGGER android.control.aePrecaptureTrigger}) and
may be fired for captures for which the
{@link CaptureRequest#CONTROL_CAPTURE_INTENT android.control.captureIntent} field is set to STILL_CAPTURE
That means you have to trigger the precapture sequence before capturing the picture.
You can look at Google's sample app for a detailed implementation: https://github.com/google/cameraview
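A rough sketch of the trigger itself, reusing the question's captureRequestBuilder and cameraCaptureSessions fields (mState and the state constants are invented here for illustration; camera2basic's fuller state machine additionally waits for AE to first enter PRECAPTURE, which this sketch skips):
// Illustrative only: fire the precapture trigger once, then watch the
// repeating preview results until AE settles, and only then take the photo.
private static final int STATE_PREVIEW = 0;
private static final int STATE_WAITING_PRECAPTURE = 1;
private int mState = STATE_PREVIEW;

private void triggerPrecapture() throws CameraAccessException {
    captureRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER,
            CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);
    mState = STATE_WAITING_PRECAPTURE;
    // One-shot request carrying the trigger; the preview keeps repeating.
    cameraCaptureSessions.capture(captureRequestBuilder.build(), mPreviewCallback, null);
    // Reset the trigger so it doesn't re-fire on later requests.
    captureRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER,
            CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_IDLE);
}

// Pass this callback to setRepeatingRequest() so every preview result is seen.
private final CameraCaptureSession.CaptureCallback mPreviewCallback =
        new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(@NonNull CameraCaptureSession session,
            @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
        if (mState != STATE_WAITING_PRECAPTURE) {
            return;
        }
        Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
        if (aeState == null
                || aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED
                || aeState == CaptureResult.CONTROL_AE_STATE_FLASH_REQUIRED) {
            mState = STATE_PREVIEW;
            takePhoto(); // the question's still-capture method
        }
    }
};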
I use the default camera application instead, because this camera API behaves differently on different phones.
Related
I am building an app which uses the Camera2 API to take pictures. The thing is, I need the camera to take a picture without needing a preview. So far I have managed to do it by dumping (and adapting) the code from an activity into a service, and it works like a charm, except for the fact that it is not focusing. In previous versions I had a state machine in charge of focusing on the preview by means of a separate CaptureRequest.Builder, but I can't make it work without creating a new CaptureRequest.Builder in the service.
I followed the Stack Overflow discussion How to lock focus in camera2 api, android?, but I did not manage to make it work.
My code does the following:
First I create a camera session once the camera has been opened.
public void createCameraSession() {
try {
// Here, we create a CameraCaptureSession for camera preview.
cameraDevice.createCaptureSession(Arrays.asList(imageReader.getSurface()),
new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
// The camera is already closed
if (null == cameraDevice) {
return;
}
// When the session is ready, we start displaying the preview.
mCaptureSession = cameraCaptureSession;
camera2TakePicture();
}
@Override
public void onConfigureFailed(
@NonNull CameraCaptureSession cameraCaptureSession) {
}
}, null
);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
Then on that camera session I call my method "camera2TakePicture()":
protected void camera2TakePicture() {
if (null == cameraDevice) {
return;
}
try {
Surface readerSurface = imageReader.getSurface();
List<Surface> outputSurfaces = new ArrayList<Surface>(2);
outputSurfaces.add(readerSurface);
final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(readerSurface);
captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CameraMetadata.CONTROL_AF_MODE_AUTO);
captureBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_START);
//MeteringRectangle meteringRectangle = getAFRegion();
//captureBuilder.set(CaptureRequest.CONTROL_AF_REGIONS, new MeteringRectangle[] {meteringRectangle});
/**** TO BE USED ONCE SAMSUNG TABLETS HAVE BEEN REPLACED ****/
boolean samsungReplaced = false;
if(Boolean.parseBoolean(getPreferenceValue(this, "manualCamSettings"))) {
int exposureCompensation = Integer.parseInt(getPreferenceValue(this, "exposureCompensation"));
captureBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, exposureCompensation);
if(samsungReplaced) {
//Exposure
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CameraMetadata.CONTROL_AE_MODE_OFF);
Float shutterSpeed = 1 / Float.parseFloat(getPreferenceValue(this, "camSSpeed"));
Long exposureTimeInNanoSec = new Long(Math.round(shutterSpeed * Math.pow(10, 9)));
captureBuilder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, exposureTimeInNanoSec);
captureBuilder.set(CaptureRequest.SENSOR_FRAME_DURATION, 10 * exposureTimeInNanoSec);
//ISO
int ISO = Integer.parseInt(getPreferenceValue(this, "camISO"));
captureBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, ISO);
//Aperture
Float aperture = Float.parseFloat(getPreferenceValue(this, "camAperture"));
captureBuilder.set(CaptureRequest.LENS_APERTURE, aperture);
}
}
// Orientation
WindowManager window = (WindowManager) getSystemService(Context.WINDOW_SERVICE);
Display display = window.getDefaultDisplay();
int rotation = display.getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
CameraCaptureSession.CaptureCallback CaptureCallback
= new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
while(result.get(CaptureResult.CONTROL_AF_STATE) != CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED){
System.out.println("Not focused");
}
System.out.println("Focused");
}
};
mCaptureSession.stopRepeating();
mCaptureSession.abortCaptures();
mCaptureSession.capture(captureBuilder.build(), CaptureCallback, null);
captureBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_IDLE);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
As you can see, I set CONTROL_AF_MODE to AUTO, then start the AF_TRIGGER and launch the capture. I added a check in onCaptureCompleted(), but the AF_STATE never seems to reach FOCUSED_LOCKED; it stays in ACTIVE_SCAN.
What am I doing wrong?
In your code snippet, you've stopped the repeating request and issued one capture request for the still image, but just one.
Do you then go on to restart the repeating request? If you don't, there are no frames flowing through the camera, and AF cannot make progress.
So if you want to lock AF before you take a picture, you want to:
1. Set AF_TRIGGER to START for a single capture only.
2. Run the preview until you get AF_STATE out of ACTIVE_SCAN.
3. Issue a single capture for the still image.
Being in the background or foreground doesn't really change any of this.
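Here is a rough sketch of those three steps under the question's names (cameraDevice, imageReader, mCaptureSession, camera2TakePicture()); since the service has no preview surface, the repeating "preview" frames also target the ImageReader, which is wasteful but keeps the sketch self-contained:
private void lockFocusThenCapture() throws CameraAccessException {
    final CaptureRequest.Builder previewBuilder =
            cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    previewBuilder.addTarget(imageReader.getSurface());
    previewBuilder.set(CaptureRequest.CONTROL_AF_MODE, CameraMetadata.CONTROL_AF_MODE_AUTO);

    // Step 1: AF_TRIGGER_START on a single capture only.
    previewBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_START);
    mCaptureSession.capture(previewBuilder.build(), null, null);

    // Step 2: keep frames flowing and watch AF_STATE in the results.
    previewBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_IDLE);
    mCaptureSession.setRepeatingRequest(previewBuilder.build(),
            new CameraCaptureSession.CaptureCallback() {
                private boolean captured = false;

                @Override
                public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                        @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
                    Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
                    if (captured || afState == null) {
                        return;
                    }
                    if (afState == CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED
                            || afState == CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED) {
                        captured = true;
                        try {
                            // Step 3: stop the preview and take the still picture,
                            // this time without setting AF_TRIGGER_START again.
                            session.stopRepeating();
                            camera2TakePicture();
                        } catch (CameraAccessException e) {
                            e.printStackTrace();
                        }
                    }
                }
            }, null);
}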
I'm new to Android.
I found previous questions, but they are quite old; I'm currently using API 23 or higher.
I'm interested in a way to obtain a picture from the camera without displaying a preview and without any touch or interaction from the user.
I used an intent to access a camera app, but it doesn't let me take a picture automatically the way I need.
It only lets me use the camera app:
Intent intentTakePic = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
if(intentTakePic.resolveActivity(getPackageManager()) != null){
startActivityForResult(intentTakePic, GET_THE_PICTURE);
}
In the future I will probably also need to record audio in the same way (without interaction).
Does anyone have a suggestion for me?
You need to use the Camera API to take pictures without opening another camera app: https://developer.android.com/guide/topics/media/camera
You'll basically be making a camera app.
// in the activity onCreate, but doesn't have to be there
// needs explicit permission
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
if (checkSelfPermission(Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
requestPermissions(new String[] {Manifest.permission.CAMERA}, 1);
}
}
final Camera camera = Camera.open();
CameraPreview cameraPreview = new CameraPreview(this, camera);
// preview is required. But you can just cover it up in the layout.
FrameLayout previewFL = findViewById(R.id.preview_layout);
previewFL.addView(cameraPreview);
camera.startPreview();
// take picture button
findViewById(R.id.take_picture_button).setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
camera.takePicture(null, null, new Camera.PictureCallback() {
@Override
public void onPictureTaken(byte[] data, Camera camera) {
// path of where you want to save it
File pictureFile = new File(getFilesDir() + "/images/pic0");
try {
FileOutputStream fos = new FileOutputStream(pictureFile);
fos.write(data);
fos.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
});
}
});
CameraPreview class
import android.content.Context;
import android.hardware.Camera;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import java.io.IOException;
public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {
private SurfaceHolder mHolder;
private Camera mCamera;
public CameraPreview(Context context, Camera camera) {
super(context);
mCamera = camera;
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
mHolder = getHolder();
mHolder.addCallback(this);
// deprecated setting, but required on Android versions prior to 3.0
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
public void surfaceCreated(SurfaceHolder holder) {
// The Surface has been created, now tell the camera where to draw the preview.
try {
mCamera.setPreviewDisplay(holder);
mCamera.startPreview();
} catch (IOException e) {
e.printStackTrace();
}
}
public void surfaceDestroyed(SurfaceHolder holder) {
// empty. Take care of releasing the Camera preview in your activity.
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
// If your preview can change or rotate, take care of those events here.
// Make sure to stop the preview before resizing or reformatting it.
if (mHolder.getSurface() == null){
// preview surface does not exist
return;
}
// stop preview before making changes
try {
mCamera.stopPreview();
} catch (Exception e){
// ignore: tried to stop a non-existent preview
}
// set preview size and make any resize, rotate or
// reformatting changes here
// start preview with new settings
try {
mCamera.setPreviewDisplay(mHolder);
mCamera.startPreview();
} catch (Exception e){
e.printStackTrace();
}
}
}
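One caveat worth adding to the snippet above (my note, not part of the original sample): the Camera instance is never released. Assuming camera is promoted from a local variable to an activity field, releasing it in onPause() keeps other apps, and your own next launch, from being locked out:
@Override
protected void onPause() {
    super.onPause();
    if (camera != null) {
        camera.stopPreview();
        camera.release();   // let other apps (and our next launch) open the camera
        camera = null;
    }
}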
I found these useful guides:
https://zatackcoder.com/android-camera-2-api-example-without-preview/
https://inducesmile.com/android/android-camera2-api-example-tutorial/
https://github.com/googlesamples/android-Camera2Basic
I'm having a problem with Android's camera2 API.
My end goal is to have a byte array which I can edit using OpenCV, while displaying the preview to the user (e.g. an OCR with a preview).
I've created a capture request and added an ImageReader as a target. Then, in the OnImageAvailableListener, I'm getting the image, transforming it into a bitmap, and displaying it in an ImageView (rotating it as well).
My problem is that after a few seconds the preview stalls (after gradually slowing down), and in the log I'm getting the following error: E/BufferItemConsumer: [ImageReader-1225x1057f100m2-18869-0] Failed to release buffer: Unknown error -1 (1)
As you can see in my code, I have already tried closing the img after getting my byte[] from it.
I've also tried clearing the buffer.
I've tried closing the ImageReader, but that of course stopped me from getting any further images (it throws an exception).
Can anyone please help me understand what I'm doing wrong? I've been scouring Google to no avail.
This is my OnImageAvailableListener; let me know if you need more of my code to assist:
private final ImageReader.OnImageAvailableListener mOnImageAvailableListener
= new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image img = reader.acquireLatestImage();
final ImageView iv = findViewById(R.id.camPrev);
try{
if (img==null) throw new NullPointerException("null img");
ByteBuffer buffer = img.getPlanes()[0].getBuffer();
byte[] data = new byte[buffer.remaining()];
buffer.get(data);
final Bitmap b = BitmapFactory.decodeByteArray(data, 0, data.length);
runOnUiThread(new Runnable() {
@Override
public void run() {
iv.setImageBitmap(b);
iv.setRotation(90);
}
});
} catch (NullPointerException ex){
showToast("img is null");
}finally {
if(img!=null)
img.close();
}
}
};
Edit - adding cameraStateCallback
private CameraDevice.StateCallback mCameraDeviceStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(CameraDevice cameraDevice) {
mCameraDevice = cameraDevice;
showToast("Connected to camera!");
createCameraPreviewSession();
}
@Override
public void onDisconnected(CameraDevice cameraDevice) {
closeCamera();
}
@Override
public void onError(CameraDevice cameraDevice, int i) {
closeCamera();
}
};
private void closeCamera() {
if (mCameraDevice != null) {
mCameraDevice.close();
mCameraDevice = null;
}
}
You seem to be using setRepeatingRequest() with JPEG format. This may not be fully supported on your device, and it also depends on the image resolution you choose. Normally we use createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW) in these cases and get YUV or raw format from the ImageReader.
I would try choosing a low resolution for JPEG: maybe that will be enough to keep the ImageReader running.
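For illustration, a minimal sketch of the YUV setup; mCameraDevice, mCaptureSession, mOnImageAvailableListener and mBackgroundHandler are assumed from your code, and 640x480 is a placeholder size:
// YUV ImageReader fed by a preview-template repeating request. The reader's
// Surface must also be in the output list passed to createCaptureSession().
ImageReader reader = ImageReader.newInstance(640, 480, ImageFormat.YUV_420_888, /*maxImages*/ 2);
reader.setOnImageAvailableListener(mOnImageAvailableListener, mBackgroundHandler);

CaptureRequest.Builder builder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
builder.addTarget(reader.getSurface());
// Add the on-screen preview Surface as a second target if you still want one.
mCaptureSession.setRepeatingRequest(builder.build(), null, mBackgroundHandler);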
I am building an image-processing program using Android camera2. Since the image format of each captured frame is YUV_420_888, I need to transform it to RGB efficiently for image processing. I googled and read a lot (especially the following two links), and finally found that RenderScript may be the solution. However, I don't know how to use the yuv2rgb script in my code.
http://werner-dittmann.blogspot.jp/2016/03/using-android-renderscript-to-convert.html
Convert android.media.Image (YUV_420_888) to Bitmap
Currently, I use a TextureView surface to show the preview, and use an ImageReader to capture each YUV_420_888 frame in the onImageAvailable callback.
protected void createCameraPreview() {
try {
SurfaceTexture texture = textureView.getSurfaceTexture();
assert texture != null;
texture.setDefaultBufferSize(imageDimension.getWidth(), imageDimension.getHeight());
Surface surface = new Surface(texture);
Surface mImageSurface = mImageReader.getSurface();
captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureRequestBuilder.addTarget(surface);
List<Surface> surfaces = new ArrayList<>();
surfaces.add(surface);
surfaces.add(mImageSurface);
captureRequestBuilder.addTarget(mImageSurface);
cameraCaptureSessions.setRepeatingRequest(captureRequestBuilder.build(), null, mBackgroundHandler);
cameraDevice.createCaptureSession(surfaces, new CameraCaptureSession.StateCallback(){
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
//The camera is already closed
if (null == cameraDevice) {
return;
}
// When the session is ready, we start displaying the preview.
cameraCaptureSessions = cameraCaptureSession;
updatePreview();
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
Toast.makeText(MainActivity.this, "Configuration change", Toast.LENGTH_SHORT).show();
}
}, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
private final ImageReader.OnImageAvailableListener mOnImageAvailableListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image img = null;
img = reader.acquireNextImage(); // we got YUV_420_888 frame here
// transform to RGB format here?
// image processing
}
};
How can I update my code to achieve this goal (e.g., using yuv2rgb.rs)? Thanks.
The camera2 sample application HdrViewfinder, which uses RenderScript to do some image processing, may be helpful for how to connect up the camera and RenderScript: https://github.com/googlesamples/android-HdrViewfinder
It doesn't do YUV->RGB conversion, IIRC, and I think yuv2rgb.rs may be intended for a different YUV colorspace than what the camera produces (due to backwards-compatibility concerns - it existed before camera2). But it gets you to the point where you can write your own RS script to apply to camera data.
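If you'd rather not write your own script, the ScriptIntrinsicYuvToRGB intrinsic from android.renderscript can be used after repacking the Image planes into NV21. The sketch below makes that assumption; yuv420ToNv21() and yuvToRgb() are helpers invented here, and in real code the script and allocations should be cached rather than recreated per frame:
// Sketch: repack the YUV_420_888 planes into NV21, honoring row/pixel strides.
private static byte[] yuv420ToNv21(Image image) {
    int w = image.getWidth(), h = image.getHeight();
    byte[] nv21 = new byte[w * h * 3 / 2];
    Image.Plane yPlane = image.getPlanes()[0];
    ByteBuffer yBuf = yPlane.getBuffer();
    int pos = 0;
    for (int row = 0; row < h; row++) {          // copy Y row by row
        yBuf.position(row * yPlane.getRowStride());
        yBuf.get(nv21, pos, w);
        pos += w;
    }
    ByteBuffer uBuf = image.getPlanes()[1].getBuffer();
    ByteBuffer vBuf = image.getPlanes()[2].getBuffer();
    int rowStride = image.getPlanes()[1].getRowStride();
    int pixStride = image.getPlanes()[1].getPixelStride();
    for (int row = 0; row < h / 2; row++) {      // NV21 = Y plane + interleaved VU
        for (int col = 0; col < w / 2; col++) {
            int i = row * rowStride + col * pixStride;
            nv21[pos++] = vBuf.get(i);
            nv21[pos++] = uBuf.get(i);
        }
    }
    return nv21;
}

// Sketch: NV21 -> ARGB bitmap via the RenderScript intrinsic. rs comes from
// RenderScript.create(context); cache the script and allocations per frame size.
private Bitmap yuvToRgb(Image image, RenderScript rs) {
    byte[] nv21 = yuv420ToNv21(image);
    ScriptIntrinsicYuvToRGB script = ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs));
    Allocation in = Allocation.createTyped(rs,
            new Type.Builder(rs, Element.U8(rs)).setX(nv21.length).create());
    Allocation out = Allocation.createTyped(rs,
            new Type.Builder(rs, Element.RGBA_8888(rs))
                    .setX(image.getWidth()).setY(image.getHeight()).create());
    in.copyFrom(nv21);
    script.setInput(in);
    script.forEach(out);
    Bitmap bmp = Bitmap.createBitmap(image.getWidth(), image.getHeight(),
            Bitmap.Config.ARGB_8888);
    out.copyTo(bmp);
    return bmp;
}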
I took the Google example for using ImageReader from here.
The code uses the Camera2 API and ImageReader such that querying the image runs in a different thread from previewing it.
As I want to target Android KitKat (API 20), I need to modify the code to use the older Camera API while keeping the ImageReader part as is.
Here is the part of original code that sets onImageAvailableListener:
/**
* THIS IS CALLED WHEN OPENING CAMERA
* Sets up member variables related to camera.
*
* @param width The width of available size for camera preview
* @param height The height of available size for camera preview
*/
private void setUpCameraOutputs(int width, int height) {
Activity activity = getActivity();
CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
try {
for (String cameraId : manager.getCameraIdList()) {
CameraCharacteristics characteristics
= manager.getCameraCharacteristics(cameraId);
// We don't use a front facing camera in this sample.
Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);
if (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT) {
continue;
}
StreamConfigurationMap map = characteristics.get(
CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
if (map == null) {
continue;
}
// For still image captures, we use the largest available size.
Size largest = Collections.max(
Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)),
new CompareSizesByArea());
mImageReader = ImageReader.newInstance(largest.getWidth(), largest.getHeight(),
ImageFormat.JPEG, /*maxImages*/2);
mImageReader.setOnImageAvailableListener(
mOnImageAvailableListener, mBackgroundHandler);
.
.
.
.
}
Now I am able to use the older Camera API, but I am lost on connecting it with the ImageReader: I don't know how I should set the OnImageAvailableListener so that I can access the image once it is delivered.
Here is my modification:
@Override
public void onActivityCreated(Bundle savedInstanceState) {
super.onActivityCreated(savedInstanceState);
mTextureView = (AutoFitTextureView) v.findViewById(R.id.texture);
mTextureView.setSurfaceTextureListener(new SurfaceTextureListener() {
@Override
public void onSurfaceTextureUpdated(SurfaceTexture surface) {
}
@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surface,
int width, int height) {
}
@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
return true;
}
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface,
int width, int height) {
mCamera = Camera.open();
try {
Camera.Parameters parameters = mCamera.getParameters();
if (getActivity().getResources().getConfiguration().orientation != Configuration.ORIENTATION_LANDSCAPE) {
// parameters.set("orientation", "portrait"); // For
// Android Version 2.2 and above
mCamera.setDisplayOrientation(90);
// For Android Version 2.0 and above
parameters.setRotation(90);
}
mCamera.setParameters(parameters);
mCamera.setPreviewTexture(surface);
} catch (IOException exception) {
mCamera.release();
}
mCamera.startPreview();
setUpCameraOutputs(width, height);
tfPreviewListener.initialize(getActivity().getAssets(), scoreView);
}
});
}
My question is: how should I add the ImageReader to the code above to make it work properly?
Thanks in advance.