Zoom control error on different devices - Java

In my camera layout there are two buttons for adjusting the zoom parameter, one for increasing and one for decreasing, each with an OnClickListener. Before increasing/decreasing the zoom value I call Camera.Parameters.isZoomSupported() to check whether the device supports zooming. On my Sony Z1 it works perfectly, but on my other device (a Samsung Galaxy S) the function returns true, yet the device can't zoom.
My code piece:
public void onClick(View v) {
    if (isZoomSupported()) {
        Camera cam = application.getCamera();
        if (cam != null) {
            cam.stopPreview();
            Parameters par = cam.getParameters();
            int maxZoom = par.getMaxZoom();
            int zoomValue = par.getZoom();
            zoomValue += 1;
            if (zoomValue > maxZoom) {
                zoomValue = maxZoom;
            }
            par.setZoom(zoomValue);
            cam.setParameters(par);
            cam.startPreview();
        }
    } else {
        toastShortWithCancel(getString(R.string.zoom_not_supported));
    }
}
And my little isZoomSupported() function:
private boolean isZoomSupported() {
    Camera cam = application.getCamera();
    if (cam != null) {
        Parameters par = cam.getParameters();
        return par.isZoomSupported();
    }
    return false;
}
So, what is the problem with my zoom control? Are there any mistakes? My Samsung device runs Android 2.3.5, so I target API 8.
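As a side note, some devices report isZoomSupported() as true while getMaxZoom() returns 0, so it can help to treat that case as unsupported and to clamp the value before applying it. Below is a minimal sketch of that logic, factored out of the Android classes so it can be tested in isolation; the class and method names are illustrative, not from the question:

```java
public class ZoomLogic {

    /** Treat zoom as usable only when the driver reports support AND a positive range. */
    static boolean zoomUsable(boolean isZoomSupported, int maxZoom) {
        return isZoomSupported && maxZoom > 0;
    }

    /** Next zoom value for the "+" button, clamped to the valid range [0, maxZoom]. */
    static int nextZoomIn(int currentZoom, int maxZoom) {
        return Math.max(0, Math.min(currentZoom + 1, maxZoom));
    }
}
```

Logging getMaxZoom() right after getParameters() on the Galaxy S would show whether that device advertises a real zoom range at all.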

Related

My app lags after adding images to the resources

I'm trying to add some images to my Android app. First of all, I want to add my background image, but when I do, the app starts working much slower: animations are not smooth and the app just lags. First I added it this way:
android:background="@mipmap/background"
What I tried next:
1) Making the background image in different resolutions for different screen sizes and putting each into the corresponding resource folder. My background image resolutions:
MDPI: 320x467 px
HDPI: 480x800 px
XHDPI: 640x960 px
XXHDPI: 960x1400 px
XXXHDPI: 1280x1920 px
That didn't work.
2) Removing all images from Android Studio to my backend and fetching them on request with an AsyncTask.
First, I choose the URL for the device's resolution by DPI:
URL url = null;
try {
    float density = getResources().getDisplayMetrics().density;
    url = new URL(PicturesApi.getUrlByDPI(density));
} catch (MalformedURLException e) {
    e.printStackTrace();
}
new SetImageBackground().execute(url);
The getUrlByDPI method:
public static String getUrlByDPI(float density) {
    if (density == 0.75f) {
        return "http://back_url/static/ldpi/background.png";
    } else if (density >= 1.0f && density < 1.5f) {
        return "http://back_url/static/mdpi/background.png";
    } else if (density == 1.5f) {
        return "http://back_url/static/hdpi/background.png";
    } else if (density > 1.5f && density <= 2.0f) {
        return "http://back_url/static/xhdpi/background.png";
    } else if (density > 2.0f && density <= 3.0f) {
        return "http://back_url/static/xxhdpi/background.png";
    } else {
        return "http://back_url/static/xxxhdpi/background.png";
    }
}
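Comparing float densities with == is fragile (a tvdpi device reports 1.33, for instance), and any density below 1.0 that isn't exactly 0.75 falls through to the xxxhdpi branch above. A gap-free threshold ladder avoids both problems; this is a sketch that reuses the question's bucket names and URL scheme:

```java
public class DpiBuckets {

    /** Map a display density to a DPI bucket using gap-free thresholds. */
    static String bucketFor(float density) {
        if (density <= 0.75f) return "ldpi";
        if (density <= 1.0f)  return "mdpi";
        if (density <= 1.5f)  return "hdpi";
        if (density <= 2.0f)  return "xhdpi";
        if (density <= 3.0f)  return "xxhdpi";
        return "xxxhdpi";
    }

    static String urlByDpi(float density) {
        return "http://back_url/static/" + bucketFor(density) + "/background.png";
    }
}
```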
The SetImageBackground class:
public class SetImageBackground extends AsyncTask<URL, Void, BitmapDrawable> {
    @Override
    protected BitmapDrawable doInBackground(URL... urls) {
        Bitmap bmp = null;
        try {
            bmp = BitmapFactory.decodeStream(urls[0].openConnection().getInputStream());
        } catch (IOException e) {
            e.printStackTrace();
        }
        return new BitmapDrawable(getResources(), bmp);
    }

    @Override
    protected void onPostExecute(BitmapDrawable bitdraw) {
        background = (CoordinatorLayout) findViewById(R.id.app_bar);
        background.setBackground(bitdraw);
    }
}
It works, but the lag remains.
Why could this happen, and what should I pay attention to (resolution, file format) when adding an image to the app? What is the correct way to do it? Maybe I'm doing something wrong?
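A likely cause of the lag is decoding a bitmap much larger than the view and letting the GPU scale it on every frame. The usual fix is to downsample at decode time with BitmapFactory.Options.inSampleSize. The sample-size computation itself is plain arithmetic; this is the power-of-two form the Android documentation recommends:

```java
public class SampleSize {

    /**
     * Largest power-of-two inSampleSize such that the decoded bitmap
     * is still at least reqWidth x reqHeight.
     */
    static int calculateInSampleSize(int rawWidth, int rawHeight,
                                     int reqWidth, int reqHeight) {
        int inSampleSize = 1;
        if (rawHeight > reqHeight || rawWidth > reqWidth) {
            int halfHeight = rawHeight / 2;
            int halfWidth = rawWidth / 2;
            // Double the sample size while the downsampled image still covers the target.
            while ((halfHeight / inSampleSize) >= reqHeight
                    && (halfWidth / inSampleSize) >= reqWidth) {
                inSampleSize *= 2;
            }
        }
        return inSampleSize;
    }
}
```

On the Android side you would first decode with inJustDecodeBounds = true to read the raw dimensions, compute the sample size for the target view, then decode again with options.inSampleSize set. For photographic backgrounds, JPEG or WebP also decodes to far less memory pressure than a large PNG.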

Camera2 API auto-flash not working

I need help with automatic flash in the Android camera2 API.
My solution works on one phone but not on another.
I've spent a few hours searching for solutions, without success.
My takePhoto code:
pictureTaken = false;
if (null == cameraDevice) {
    Log.e(TAG, "cameraDevice is null");
    return;
}
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
    int width = 1024;
    int height = 768;
    cv.setBackground(getResources().getDrawable(R.drawable.fotak_zeleny));
    ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
    List<Surface> outputSurfaces = new ArrayList<Surface>(2);
    outputSurfaces.add(reader.getSurface());
    outputSurfaces.add(new Surface(textureView.getSurfaceTexture()));
    final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
    captureBuilder.addTarget(reader.getSurface());
    captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
    captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);
    captureBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_START);
    if (flashMode == FLASH_AUTO) {
        captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
    } else if (flashMode == FLASH_ON) {
        captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_ALWAYS_FLASH);
    } else if (flashMode == FLASH_OFF) {
        captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF);
    }
    // Orientation
    int rotation = getWindowManager().getDefaultDisplay().getRotation();
    captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
    final File file = new File(fileName);
    if (file.exists()) {
        file.delete();
    }
    // etc.
My createCameraPreview code:
protected void createCameraPreview() {
    try {
        SurfaceTexture texture = textureView.getSurfaceTexture();
        assert texture != null;
        texture.setDefaultBufferSize(imageDimension.getWidth(), imageDimension.getHeight());
        Surface surface = new Surface(texture);
        captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        captureRequestBuilder.set(CaptureRequest.CONTROL_AE_LOCK, false);
        captureRequestBuilder.addTarget(surface);
        cameraDevice.createCaptureSession(Arrays.asList(surface), new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
                // The camera is already closed
                if (null == cameraDevice) {
                    return;
                }
                // When the session is ready, we start displaying the preview.
                cameraCaptureSessions = cameraCaptureSession;
                updatePreview();
            }

            @Override
            public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
            }
        }, null);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    } catch (NullPointerException ex) {
        ex.printStackTrace();
    }
}
This code works fine on my LG phone, but not on an Alcatel.
I have tried a lot of the ideas written here, without success.
Can anyone help me, please?
Many thanks.
(Sorry for my English.)
You should set your selected AE_MODE on the preview request as well, and update it whenever the user switches flash modes. In addition, you need to run the precapture sequence on any device above LEGACY hardware level.
Changing the flash mode for just the single still-capture request won't work correctly, since the phone won't have the opportunity to fire a preflash and properly calculate flash power.
Take a look at camera2basic for running the precapture sequence. It always sets the AE mode to AE_MODE_AUTO_FLASH if possible, but the same code works fine for the other flash modes (though you can generally skip the precapture sequence if flash is set to OFF, as long as focus quality is OK).
If you Command-click on CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH, you will see this:
The flash may be fired during a precapture sequence
(triggered by {@link CaptureRequest#CONTROL_AE_PRECAPTURE_TRIGGER android.control.aePrecaptureTrigger}) and
may be fired for captures for which the
{@link CaptureRequest#CONTROL_CAPTURE_INTENT android.control.captureIntent} field is set to STILL_CAPTURE
That means you have to trigger the precapture sequence before capturing the picture.
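Concretely, camera2basic drives this as a small state machine keyed on CONTROL_AE_STATE in the repeating-request results: issue CONTROL_AE_PRECAPTURE_TRIGGER_START, wait for AE to enter precapture (or report that flash is required), wait for it to leave precapture, then fire the still capture. A sketch of just the state transitions follows; the AE-state integers are inlined stand-ins for the CameraMetadata constants so the logic can be read and tested without the Android classes:

```java
public class PrecaptureStates {
    // Illustrative stand-ins for CameraMetadata.CONTROL_AE_STATE_* values.
    static final int AE_CONVERGED = 2;
    static final int AE_FLASH_REQUIRED = 4;
    static final int AE_PRECAPTURE = 5;

    // States of the capture flow after the precapture trigger is sent.
    static final int STATE_WAITING_PRECAPTURE = 0;
    static final int STATE_WAITING_NON_PRECAPTURE = 1;
    static final int STATE_READY_FOR_STILL = 2;

    /** Advance the capture state given the AE state from the latest CaptureResult. */
    static int advance(int state, int aeState) {
        switch (state) {
            case STATE_WAITING_PRECAPTURE:
                // Precapture has started, or AE already knows flash is required.
                if (aeState == AE_PRECAPTURE || aeState == AE_FLASH_REQUIRED) {
                    return STATE_WAITING_NON_PRECAPTURE;
                }
                return state;
            case STATE_WAITING_NON_PRECAPTURE:
                // Precapture finished; it is now safe to issue the still capture.
                if (aeState != AE_PRECAPTURE) {
                    return STATE_READY_FOR_STILL;
                }
                return state;
            default:
                return state;
        }
    }
}
```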
You can look at Google's sample app for a detailed implementation: https://github.com/google/cameraview
I use the default camera application instead, because this camera API behaves differently on different phones.

Using ImageReader with the older Camera API (to support API < 20)

I took the Google example for using ImageReader from here.
The code uses the camera2 API and ImageReader so that retrieving the image runs in a different thread from previewing it.
As I want to target Android KitKat (API 20), I need to modify the code to use the older Camera API while keeping the ImageReader part as is.
Here is the part of the original code that sets the OnImageAvailableListener:
/**
 * THIS IS CALLED WHEN OPENING CAMERA
 * Sets up member variables related to camera.
 *
 * @param width  The width of available size for camera preview
 * @param height The height of available size for camera preview
 */
private void setUpCameraOutputs(int width, int height) {
    Activity activity = getActivity();
    CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
    try {
        for (String cameraId : manager.getCameraIdList()) {
            CameraCharacteristics characteristics
                    = manager.getCameraCharacteristics(cameraId);
            // We don't use a front facing camera in this sample.
            Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);
            if (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT) {
                continue;
            }
            StreamConfigurationMap map = characteristics.get(
                    CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            if (map == null) {
                continue;
            }
            // For still image captures, we use the largest available size.
            Size largest = Collections.max(
                    Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)),
                    new CompareSizesByArea());
            mImageReader = ImageReader.newInstance(largest.getWidth(), largest.getHeight(),
                    ImageFormat.JPEG, /*maxImages*/ 2);
            mImageReader.setOnImageAvailableListener(
                    mOnImageAvailableListener, mBackgroundHandler);
            // ...
}
Now I am able to use the older Camera API, but I am lost on connecting it with the ImageReader: I don't know how to set the OnImageAvailableListener so that I can access the image once it is delivered.
Here is my modification:
@Override
public void onActivityCreated(Bundle savedInstanceState) {
    super.onActivityCreated(savedInstanceState);
    mTextureView = (AutoFitTextureView) v.findViewById(R.id.texture);
    mTextureView.setSurfaceTextureListener(new SurfaceTextureListener() {
        @Override
        public void onSurfaceTextureUpdated(SurfaceTexture surface) {
        }

        @Override
        public void onSurfaceTextureSizeChanged(SurfaceTexture surface,
                                                int width, int height) {
        }

        @Override
        public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
            return true;
        }

        @Override
        public void onSurfaceTextureAvailable(SurfaceTexture surface,
                                              int width, int height) {
            mCamera = Camera.open();
            try {
                Camera.Parameters parameters = mCamera.getParameters();
                if (getActivity().getResources().getConfiguration().orientation != Configuration.ORIENTATION_LANDSCAPE) {
                    // parameters.set("orientation", "portrait"); // For Android 2.2 and above
                    mCamera.setDisplayOrientation(90);
                    // For Android Version 2.0 and above
                    parameters.setRotation(90);
                }
                mCamera.setParameters(parameters);
                mCamera.setPreviewTexture(surface);
            } catch (IOException exception) {
                mCamera.release();
            }
            mCamera.startPreview();
            setUpCameraOutputs(width, height);
            tfPreviewListener.initialize(getActivity().getAssets(), scoreView);
        }
    });
}
My question is: how should I add the ImageReader to the code above to make it work properly?
Thanks in advance.
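For what it's worth, the legacy Camera API has no ImageReader integration: frames arrive through Camera.PreviewCallback (ideally setPreviewCallbackWithBuffer, which requires a pre-allocated buffer). The default preview format is NV21, which packs 12 bits per pixel, so the required buffer size is plain arithmetic; the helper name below is mine:

```java
public class PreviewBuffers {

    /**
     * Byte size of one NV21 preview frame: a full-resolution Y plane plus
     * interleaved V/U at quarter resolution = 1.5 bytes per pixel.
     */
    static int nv21BufferSize(int width, int height) {
        return width * height * 3 / 2;
    }
}
```

On the Android side you would call camera.addCallbackBuffer(new byte[nv21BufferSize(w, h)]) before startPreview(), then hand the filled byte[] from onPreviewFrame to your background thread. That callback plays the role that ImageReader's OnImageAvailableListener plays in the camera2 version.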

How to rotate the Android device camera preview (libGDX)

I'm using libGDX to make an application and I need to use the camera, so I followed this tutorial. All my camera feed is rotated 90 degrees, but it is drawn as if it weren't. Unfortunately, that means the preview is totally distorted and very hard to use.
I won't post my code here unless snippets are asked for, because I copy-pasted the code from the tutorial into my game. The only change I recall making was the following.
I changed the original surfaceCreated() method in CameraSurface.java
public void surfaceCreated(SurfaceHolder holder) {
    // Once the surface is created, simply open a handle to the camera hardware.
    camera = Camera.open();
}
to open the front-facing camera (I'm using a Nexus 7 that only has a front camera):
public void surfaceCreated(SurfaceHolder holder) {
    // Once the surface is created, simply open a handle to the camera hardware.
    camera = openFrontFacingCamera();
}

@SuppressLint("NewApi")
private Camera openFrontFacingCamera() {
    Camera cam = null;
    Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
    int cameraCount = Camera.getNumberOfCameras();
    for (int camIdx = 0; camIdx < cameraCount; camIdx++) {
        Camera.getCameraInfo(camIdx, cameraInfo);
        if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            try {
                cam = Camera.open(camIdx);
            } catch (RuntimeException e) {
                System.out.println("Failed to open.");
            }
        }
    }
    return cam;
}
Other than that change, the rest of the code is almost exactly the same (excluding minor variable changes and such).
You can use the ExifInterface class to read the orientation tag associated with your image and rotate the image accordingly.
The code would look like this:
ExifInterface ei = new ExifInterface(imagePath);
int orientation = ei.getAttributeInt(ExifInterface.TAG_ORIENTATION,
        ExifInterface.ORIENTATION_NORMAL);
switch (orientation) {
    case ExifInterface.ORIENTATION_ROTATE_90:
        imageView.setRotation(90);
        break;
    ...
    default:
        break;
}
Upon diving into the Camera API, I found that all I have to do is use a nice little method called setDisplayOrientation(90), and it works perfectly now.
Revised code:
@SuppressLint("NewApi")
public void surfaceCreated(SurfaceHolder holder) {
    // Once the surface is created, simply open a handle to the camera hardware.
    camera = openFrontFacingCamera();
    camera.setDisplayOrientation(90);
}

@SuppressLint("NewApi")
private Camera openFrontFacingCamera() {
    Camera cam = null;
    Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
    int cameraCount = Camera.getNumberOfCameras();
    for (int camIdx = 0; camIdx < cameraCount; camIdx++) {
        Camera.getCameraInfo(camIdx, cameraInfo);
        if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            try {
                cam = Camera.open(camIdx);
            } catch (RuntimeException e) {
                System.out.println("Failed to open.");
            }
        }
    }
    return cam;
}
P.S. The only reason I'm ignoring the NewApi lint warning is that I know the exact device this app will run on, and it is specific to that device. I would not recommend this unless you know the device's API level is high enough (the code above only requires API 9).
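Hardcoding setDisplayOrientation(90) is fine for a single known device. For the general case, the Camera.setDisplayOrientation() documentation gives a formula based on CameraInfo.orientation and the current display rotation, with extra mirror compensation for front cameras. The arithmetic itself, extracted from the Android classes as a sketch:

```java
public class PreviewOrientation {

    /**
     * Degrees to pass to Camera.setDisplayOrientation(), per the formula
     * in the Android documentation.
     *
     * @param cameraOrientation CameraInfo.orientation (0, 90, 180 or 270)
     * @param displayRotation   current display rotation in degrees (0, 90, 180 or 270)
     * @param frontFacing       true for CAMERA_FACING_FRONT
     */
    static int displayOrientation(int cameraOrientation, int displayRotation,
                                  boolean frontFacing) {
        if (frontFacing) {
            int result = (cameraOrientation + displayRotation) % 360;
            return (360 - result) % 360; // compensate for the front camera's mirroring
        }
        return (cameraOrientation - displayRotation + 360) % 360;
    }
}
```

For a front camera mounted at 270 degrees (common on tablets like the Nexus 7) in natural portrait orientation, this yields the 90 degrees used above.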

AndEngine LWP screen orientation issue

I'm really stuck on the screen orientation logic.
Here is my code:
@Override
public EngineOptions onCreateEngineOptions() {
    this.cameraWidth = getResources().getDisplayMetrics().widthPixels;
    this.cameraHeight = getResources().getDisplayMetrics().heightPixels;
    this.camera = CameraFactory.createPixelPerfectCamera(this, this.cameraWidth / 2.0F, this.cameraHeight / 2.0F);
    this.camera.setResizeOnSurfaceSizeChanged(true);
    this.dpi = getResources().getDisplayMetrics().densityDpi;
    Display display = ((WindowManager) getSystemService(WINDOW_SERVICE)).getDefaultDisplay();
    int rotation = display.getRotation();
    if (rotation == Surface.ROTATION_90 || rotation == Surface.ROTATION_270) {
        screenOrientation = ScreenOrientation.LANDSCAPE_SENSOR;
    } else {
        screenOrientation = ScreenOrientation.PORTRAIT_SENSOR;
    }
    EngineOptions engineOptions = new EngineOptions(true, screenOrientation, new FillResolutionPolicy(), this.camera);
    engineOptions.getAudioOptions().setNeedsSound(true);
    return engineOptions;
}

@Override
public void onSurfaceChanged(final GLState pGLState, final int pWidth, final int pHeight) {
    super.onSurfaceChanged(pGLState, pWidth, pHeight);
    Log.i(TAG, "onSurfaceChanged " + "w: " + this.camera.getSurfaceWidth() + " h: " + this.camera.getSurfaceHeight());
    this.cameraWidth = this.camera.getSurfaceWidth();
    this.cameraHeight = this.camera.getSurfaceHeight();
    this.camera.setCenter(this.cameraWidth / 2.0F, this.cameraHeight / 2.0F);
}
When I try my LWP on an AVD (3.7" FWVGA slider, 480x854), everything works fine, but only in LWP preview mode. When, for example, I press the "Set wallpaper" button from the landscape LWP preview, I get a half-black screen with my LWP shifted into the other half of the desktop.
I have also noticed that onCreateEngineOptions is not called when returning from preview mode to the desktop.
I do receive the onSurfaceChanged event correctly every time, and I have configured and can handle the screen-orientation-change event. But how do I apply it to my logic?
public BroadcastReceiver mBroadcastReceiver = new BroadcastReceiver() {
    @Override
    public void onReceive(Context context, Intent myIntent) {
        if (myIntent.getAction().equals(BROADCAST_CONFIGURATION_CHANGED)) {
            Log.d(TAG, "received->" + BROADCAST_CONFIGURATION_CHANGED);
            if (getResources().getConfiguration().orientation == Configuration.ORIENTATION_LANDSCAPE) {
                Log.i(TAG, "LANDSCAPE_SENSOR");
            } else {
                Log.i(TAG, "PORTRAIT_SENSOR");
            }
        }
    }
};
How do I correctly set up the LWP to handle both modes, portrait and landscape?
Thanks in advance!
I had a similar problem with a game. I fixed it with this line in each activity in the manifest file:
<activity
    ....
    android:configChanges="orientation"
    ... />
and by using these methods:
@Override
public void onResumeGame() {
    super.onResumeGame();
}

@Override
public void onPauseGame() {
    super.onPauseGame();
}
Hopefully this solves your problem. Best regards.
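One caveat with that manifest line: starting with API 13, rotating the device also reports a screenSize configuration change, so declaring only orientation still lets the activity be destroyed and recreated on rotation. A fuller entry (a config fragment, with the elided attributes left as in the answer above):

```xml
<activity
    ....
    android:configChanges="orientation|screenSize"
    ... />
```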
