I'm using libgdx to make an application and I need to use the camera, so I followed this tutorial. All of my camera feed is rotated 90 degrees, but the frames are being drawn as if they weren't. Unfortunately that means the preview is totally distorted and very hard to use.
I won't post my code here unless snippets are asked for, because I copy-pasted the code from the tutorial into my game. The only change I recall making was the following.
I changed the original surfaceCreated() method in CameraSurface.java
public void surfaceCreated( SurfaceHolder holder ) {
    // Once the surface is created, simply open a handle to the camera hardware.
    camera = Camera.open();
}
to open the front-facing camera (I'm using a Nexus 7, which only has a front camera...):
public void surfaceCreated( SurfaceHolder holder ) {
    // Once the surface is created, simply open a handle to the camera hardware.
    camera = openFrontFacingCamera();
}

@SuppressLint("NewApi")
private Camera openFrontFacingCamera() {
    int cameraCount = 0;
    Camera cam = null;
    Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
    cameraCount = Camera.getNumberOfCameras();
    for (int camIdx = 0; camIdx < cameraCount; camIdx++) {
        Camera.getCameraInfo(camIdx, cameraInfo);
        if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            try {
                cam = Camera.open(camIdx);
            } catch (RuntimeException e) {
                System.out.println("Failed to open.");
            }
        }
    }
    return cam;
}
Other than that change, the rest of the code is almost exactly the same (excluding minor variable renames and such).
You can use the ExifInterface class to read the orientation tag (TAG_ORIENTATION) associated with your image and rotate the image accordingly.
The code would look like this:
ExifInterface ei = new ExifInterface(imagePath);
int orientation = ei.getAttributeInt(ExifInterface.TAG_ORIENTATION,
        ExifInterface.ORIENTATION_NORMAL);
switch (orientation) {
    case ExifInterface.ORIENTATION_ROTATE_90:
        imageView.setRotation(90);
        break;
    ...
    default:
        break;
}
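If you need the pixels themselves rotated rather than just the view, a minimal sketch using Matrix (assuming imagePath points at a decodable JPEG; the angle would come from the EXIF orientation as above):

```java
// Sketch: rotate the decoded bitmap itself instead of rotating the ImageView.
Bitmap source = BitmapFactory.decodeFile(imagePath);
Matrix matrix = new Matrix();
matrix.postRotate(90);  // angle derived from the EXIF orientation tag
Bitmap rotated = Bitmap.createBitmap(source, 0, 0,
        source.getWidth(), source.getHeight(), matrix, true);
```

Note that this allocates a second bitmap, so for large photos you may want to downsample first with BitmapFactory.Options.inSampleSize.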
Upon diving into the Camera API, I found that all I have to do is call a nice little method, setDisplayOrientation(90), and it works perfectly now.
Revised code:
@SuppressLint("NewApi")
public void surfaceCreated( SurfaceHolder holder ) {
    // Once the surface is created, simply open a handle to the camera hardware.
    camera = openFrontFacingCamera();
    camera.setDisplayOrientation(90);
}

@SuppressLint("NewApi")
private Camera openFrontFacingCamera() {
    int cameraCount = 0;
    Camera cam = null;
    Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
    cameraCount = Camera.getNumberOfCameras();
    for (int camIdx = 0; camIdx < cameraCount; camIdx++) {
        Camera.getCameraInfo(camIdx, cameraInfo);
        if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            try {
                cam = Camera.open(camIdx);
            } catch (RuntimeException e) {
                System.out.println("Failed to open.");
            }
        }
    }
    return cam;
}
P.S. The only reason I'm ignoring the NewApi lint warning is that I know the exact device this app will run on, and the suppression is specific to that device... I would not recommend this unless you know the device's API level is high enough (setDisplayOrientation() itself only requires API 8).
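For reference, hardcoding 90 only works because the device stays in portrait. For the general case, the android.hardware.Camera documentation gives a computation based on the current display rotation and CameraInfo.orientation, which also compensates for front-camera mirroring. A sketch of that approach:

```java
import android.app.Activity;
import android.hardware.Camera;
import android.view.Surface;

public class CameraOrientationHelper {
    /**
     * Computes and applies the correct preview rotation for the given camera,
     * following the algorithm from the android.hardware.Camera docs.
     */
    public static void setCameraDisplayOrientation(Activity activity,
                                                   int cameraId, Camera camera) {
        Camera.CameraInfo info = new Camera.CameraInfo();
        Camera.getCameraInfo(cameraId, info);

        // Translate the current display rotation into degrees.
        int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
        int degrees = 0;
        switch (rotation) {
            case Surface.ROTATION_0:   degrees = 0;   break;
            case Surface.ROTATION_90:  degrees = 90;  break;
            case Surface.ROTATION_180: degrees = 180; break;
            case Surface.ROTATION_270: degrees = 270; break;
        }

        int result;
        if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            result = (info.orientation + degrees) % 360;
            result = (360 - result) % 360;  // compensate for the front camera's mirror
        } else {  // back-facing
            result = (info.orientation - degrees + 360) % 360;
        }
        camera.setDisplayOrientation(result);
    }
}
```

On a portrait-only Nexus 7 front camera this reduces to the hardcoded 90, but it keeps working if the orientation lock is ever removed.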
Related
I need help with automatic flash using the Android camera2 API.
My solution works on one phone but not on another.
I've spent a few hours searching for solutions, without success.
My takePhoto code:
pictureTaken = false;
if (null == cameraDevice) {
    Log.e(TAG, "cameraDevice is null");
    return;
}
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
    int width = 1024;
    int height = 768;
    cv.setBackground(getResources().getDrawable(R.drawable.fotak_zeleny));
    ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
    List<Surface> outputSurfaces = new ArrayList<Surface>(2);
    outputSurfaces.add(reader.getSurface());
    outputSurfaces.add(new Surface(textureView.getSurfaceTexture()));
    final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
    captureBuilder.addTarget(reader.getSurface());
    captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
    captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);
    captureBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_START);
    if (flashMode == FLASH_AUTO) {
        captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
    } else if (flashMode == FLASH_ON) {
        captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_ALWAYS_FLASH);
    } else if (flashMode == FLASH_OFF) {
        captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF);
    }
    // Orientation
    int rotation = getWindowManager().getDefaultDisplay().getRotation();
    captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
    final File file = new File(fileName);
    if (file.exists()) {
        file.delete();
    }
etc....
My createCameraPreview() code:
protected void createCameraPreview() {
    try {
        SurfaceTexture texture = textureView.getSurfaceTexture();
        assert texture != null;
        texture.setDefaultBufferSize(imageDimension.getWidth(), imageDimension.getHeight());
        Surface surface = new Surface(texture);
        captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        captureRequestBuilder.set(CaptureRequest.CONTROL_AE_LOCK, false);
        captureRequestBuilder.addTarget(surface);
        cameraDevice.createCaptureSession(Arrays.asList(surface), new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
                // The camera is already closed
                if (null == cameraDevice) {
                    return;
                }
                // When the session is ready, we start displaying the preview.
                cameraCaptureSessions = cameraCaptureSession;
                updatePreview();
            }

            @Override
            public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
            }
        }, null);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    } catch (NullPointerException ex) {
        ex.printStackTrace();
    }
}
This solution works fine on my LG phone, but not on my Alcatel.
I've tried a lot of the ideas posted here, without success.
Can anyone help me, please?
Many thanks.
(Sorry for my English.)
You should set your selected AE_MODE for the preview request as well, and update it whenever the user switches flash modes. In addition, you need to run the precapture sequence on any devices that are higher than LEGACY level.
Changing flash mode for just the single still capture request won't work correctly, since the phone won't have the opportunity to fire a preflash to properly calculate flash power.
Take a look at camera2basic for running the precapture sequence. It always sets AE mode to AE_MODE_AUTO_FLASH if possible, but the same code will work fine for the other flash modes (though you can skip the precapture sequence if flash is set to OFF, generally, as long as focus quality is OK).
If you Command-click on CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH, you will see this:
The flash may be fired during a precapture sequence
(triggered by {@link CaptureRequest#CONTROL_AE_PRECAPTURE_TRIGGER android.control.aePrecaptureTrigger}) and
may be fired for captures for which the
{@link CaptureRequest#CONTROL_CAPTURE_INTENT android.control.captureIntent} field is set to STILL_CAPTURE
That means you have to trigger the precapture sequence first before capturing the picture.
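A simplified sketch of firing that trigger before the still capture. The field and method names (previewRequestBuilder, captureSession, backgroundHandler, captureStillPicture()) are illustrative stand-ins for your own members; camera2basic does the same thing with a fuller state machine over the repeating preview results:

```java
// Sketch: run the AE precapture sequence, then take the still once AE settles.
private void runPrecaptureSequence() {
    try {
        // Ask the camera to run the precapture metering sequence
        // (this is where a preflash may fire to compute flash power).
        previewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER,
                CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);
        captureSession.capture(previewRequestBuilder.build(),
                new CameraCaptureSession.CaptureCallback() {
            @Override
            public void onCaptureCompleted(CameraCaptureSession session,
                                           CaptureRequest request,
                                           TotalCaptureResult result) {
                Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                // Proceed when AE has converged, or reports that flash is
                // required (and charged). A robust implementation keeps
                // watching repeating results until the PRECAPTURE state ends.
                if (aeState == null
                        || aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED
                        || aeState == CaptureResult.CONTROL_AE_STATE_FLASH_REQUIRED) {
                    captureStillPicture();  // your existing takePhoto logic
                }
            }
        }, backgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
```

The key point is that the AE_MODE set on the preview's repeating request must match the one on the still request, so the pipeline has already been metering with flash in mind when the trigger arrives.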
You can look at Google's sample app for a detailed implementation: https://github.com/google/cameraview
I use the default camera application instead, because this camera API behaves differently on different phones.
I took Google's example of using ImageReader from here.
The code uses the Camera2 API and an ImageReader so that querying the image runs in a different thread from previewing it.
Since I want to target Android KitKat (API 20), I need to modify the code to use the older Camera API while keeping the ImageReader part as is.
Here is the part of the original code that sets the onImageAvailableListener:
/**
 * THIS IS CALLED WHEN OPENING CAMERA
 * Sets up member variables related to camera.
 *
 * @param width  The width of available size for camera preview
 * @param height The height of available size for camera preview
 */
private void setUpCameraOutputs(int width, int height) {
    Activity activity = getActivity();
    CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
    try {
        for (String cameraId : manager.getCameraIdList()) {
            CameraCharacteristics characteristics
                    = manager.getCameraCharacteristics(cameraId);
            // We don't use a front facing camera in this sample.
            Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);
            if (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT) {
                continue;
            }
            StreamConfigurationMap map = characteristics.get(
                    CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            if (map == null) {
                continue;
            }
            // For still image captures, we use the largest available size.
            Size largest = Collections.max(
                    Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)),
                    new CompareSizesByArea());
            mImageReader = ImageReader.newInstance(largest.getWidth(), largest.getHeight(),
                    ImageFormat.JPEG, /*maxImages*/2);
            mImageReader.setOnImageAvailableListener(
                    mOnImageAvailableListener, mBackgroundHandler);
            ...
}
Now I am able to use the older Camera API, but I am lost in connecting it with the ImageReader: I don't know how to set the onImageAvailableListener so that I can access the image once it is delivered.
Here is my modification:
@Override
public void onActivityCreated(Bundle savedInstanceState) {
    super.onActivityCreated(savedInstanceState);
    mTextureView = (AutoFitTextureView) v.findViewById(R.id.texture);
    mTextureView.setSurfaceTextureListener(new SurfaceTextureListener() {
        @Override
        public void onSurfaceTextureUpdated(SurfaceTexture surface) {
        }

        @Override
        public void onSurfaceTextureSizeChanged(SurfaceTexture surface,
                                                int width, int height) {
        }

        @Override
        public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
            return true;
        }

        @Override
        public void onSurfaceTextureAvailable(SurfaceTexture surface,
                                              int width, int height) {
            mCamera = Camera.open();
            try {
                Camera.Parameters parameters = mCamera.getParameters();
                if (getActivity().getResources().getConfiguration().orientation != Configuration.ORIENTATION_LANDSCAPE) {
                    // parameters.set("orientation", "portrait"); // For
                    // Android Version 2.2 and above
                    mCamera.setDisplayOrientation(90);
                    // For Android Version 2.0 and above
                    parameters.setRotation(90);
                }
                mCamera.setParameters(parameters);
                mCamera.setPreviewTexture(surface);
            } catch (IOException exception) {
                mCamera.release();
            }
            mCamera.startPreview();
            setUpCameraOutputs(width, height);
            tfPreviewListener.initialize(getActivity().getAssets(), scoreView);
        }
    });
}
My question is: how should I add the ImageReader to the code above to make it work properly?
Thanks in advance.
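For context, the old Camera API cannot render into an ImageReader's Surface; its preview targets are limited to a SurfaceHolder or SurfaceTexture. The usual substitute is a Camera.PreviewCallback, which hands you raw NV21 frames on the looper thread that opened the camera, playing the role onImageAvailable() plays in the Camera2 version. A hedged sketch (thread and callback wiring are illustrative, and a preview texture must still be set and startPreview() called for frames to flow):

```java
// Sketch: receiving frames from the legacy Camera API without ImageReader.
// Open the camera from a HandlerThread so onPreviewFrame() does not run on
// the UI thread, mirroring ImageReader's background-handler behavior.
HandlerThread cameraThread = new HandlerThread("CameraBackground");
cameraThread.start();
new Handler(cameraThread.getLooper()).post(new Runnable() {
    @Override
    public void run() {
        Camera camera = Camera.open();
        camera.setPreviewCallback(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] data, Camera cam) {
                // 'data' is one NV21 frame at the current preview size;
                // process it here, analogously to onImageAvailable().
            }
        });
        // ...setPreviewTexture(...) and startPreview() as in the code above.
    }
});
```

Compared to ImageReader you lose JPEG delivery (frames arrive as NV21), so a YuvImage conversion is needed if you want compressed output.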
In my camera layout there are 2 buttons for adjusting the zoom parameter, one for increasing it and one for decreasing it. Each button has an OnClickListener. Before increasing/decreasing the zoom value I use the Camera.Parameters.isZoomSupported() function to check whether the device supports zooming. On my Sony Z1 it works perfectly, but on my other device (a Samsung Galaxy S), the function returns true, yet the device can't zoom.
My code piece:
public void onClick(View v) {
    if (isZoomSupported()) {
        Camera cam = application.getCamera();
        if (cam != null) {
            cam.stopPreview();
            Parameters par = cam.getParameters();
            int maxZoom = par.getMaxZoom();
            int zoomValue = par.getZoom();
            zoomValue += 1;
            if (zoomValue > maxZoom) {
                zoomValue = maxZoom;
            }
            par.setZoom(zoomValue);
            cam.setParameters(par);
            cam.startPreview();
        }
    } else {
        toastShortWithCancel(getString(R.string.zoom_not_supported));
    }
}
And my little isZoomSupported() function:
private boolean isZoomSupported() {
    Camera cam = application.getCamera();
    if (cam != null) {
        Parameters par = cam.getParameters();
        return par.isZoomSupported();
    }
    return false;
}
So, what is the problem with my zoom control? Are there any mistakes? My Samsung device runs Android 2.3.5, so I target API 8.
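One thing worth checking on devices where setZoom() appears to do nothing: some handsets only implement the smooth-zoom path. A hedged sketch handling both, assuming the camera handle is already open (method name is illustrative):

```java
// Sketch: prefer startSmoothZoom() where available, fall back to setZoom().
// Both calls exist since API 8, but device support for each varies.
void zoomTo(Camera camera, int targetZoom) {
    Camera.Parameters params = camera.getParameters();
    int clamped = Math.min(targetZoom, params.getMaxZoom());
    if (params.isSmoothZoomSupported()) {
        camera.startSmoothZoom(clamped);   // animates toward the target level
    } else if (params.isZoomSupported()) {
        params.setZoom(clamped);           // jumps directly to the target level
        camera.setParameters(params);
    }
}
```

Note also that stopPreview()/startPreview() around the zoom change is not required; zoom can be adjusted while the preview is running.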
I have the following Java method:
private Camera openFrontFacingCameraGingerbread() {
    int cameraCount = 0;
    Camera cam = null;
    Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
    cameraCount = Camera.getNumberOfCameras();
    for (int camIdx = 0; camIdx < cameraCount; camIdx++) {
        Camera.getCameraInfo(camIdx, cameraInfo);
        if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            try {
                cam = Camera.open(camIdx);
            } catch (RuntimeException e) {
                Toast.makeText(debug, "failed", Toast.LENGTH_LONG).show();
            }
        }
    }
    return cam;
}
What I'm looking to do is mirror the camera on the screen. That is, as soon as I open the app I would like the user to see himself, but I'm not able to accomplish this.
Here is the onCreate method of the same class:
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    //setContentView(R.layout.activity_main);
    openFrontFacingCameraGingerbread();
}
Can someone help me with this one? Thank you very much.
You need to include a SurfaceView in your layout, and use setPreviewDisplay() and startPreview(). This will draw the camera's input in your Activity.
You should check the developer guides, and also take a look at Android - Camera Preview
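A minimal sketch of that wiring, reusing the openFrontFacingCameraGingerbread() method from the question (the layout id R.id.surface_view is an illustrative assumption):

```java
import java.io.IOException;

import android.app.Activity;
import android.hardware.Camera;
import android.os.Bundle;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

public class PreviewActivity extends Activity implements SurfaceHolder.Callback {
    private Camera camera;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        SurfaceView surfaceView = (SurfaceView) findViewById(R.id.surface_view);
        surfaceView.getHolder().addCallback(this);  // wait for the surface first
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        camera = openFrontFacingCameraGingerbread();  // method from the question
        if (camera == null) return;
        try {
            camera.setPreviewDisplay(holder);  // route frames into the SurfaceView
            camera.startPreview();             // front previews are mirrored by default
        } catch (IOException e) {
            camera.release();
            camera = null;
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) { }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        if (camera != null) {
            camera.stopPreview();
            camera.release();
            camera = null;
        }
    }
}
```

The mirroring you want comes for free: Android mirrors front-camera previews horizontally so the user sees himself as in a mirror.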
New to Android programming here.
I have had a look around and found this to be a common issue, but I don't really see an easy fix... I am trying to run the following code on a Nexus 7 (I've tried both the AVD and a physical device) with no luck whatsoever. The problem seems to be:
camera.setPreviewDisplay(surfaceHolder);
But I could be wrong. Here is the current code:
public class MainActivity extends Activity implements SurfaceHolder.Callback {
    Camera camera;
    SurfaceView surfaceView;
    SurfaceHolder surfaceHolder;
    boolean previewing = false;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.cameralayout);
        getWindow().setFormat(PixelFormat.UNKNOWN);
        surfaceView = (SurfaceView) findViewById(R.id.surfaceview);
        surfaceHolder = surfaceView.getHolder();
        surfaceHolder.addCallback(this);
    }

    public void onClick() {
        // TODO Auto-generated method stub
        if (!previewing) {
            camera = Camera.open();
            if (camera != null) {
                try {
                    camera.setPreviewDisplay(surfaceHolder);
                    camera.startPreview();
                    previewing = true;
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }
Any ideas, folks? Thank you for your help!
From the Android documentation about Camera.open():
Creates a new Camera object to access the first back-facing camera on
the device. If the device does not have a back-facing camera, this
returns null.
It only gives you access to a back-facing camera.
I am trying to run the following code on a Nexus 7
Camera.open() returns null because the Nexus 7 doesn't have a back camera, only a front camera.
You could try this method:
public Camera getCamera() {
    for (int i = 0; i < Camera.getNumberOfCameras(); i++) {
        return Camera.open(i);  // open the first camera found, whichever way it faces
    }
    return null;  // no cameras on this device
}
To apply,
camera = getCamera();