I am using a TranslateAnimation to move an ImageView. This is my code:
TranslateAnimation set1 = new TranslateAnimation(-4, 10, -110, 0);
set1.setDuration(3000);
TranslateAnimation set2 = new TranslateAnimation(10, -3, 0, 115);
set2.setDuration(3000);
set2.setStartOffset(2200);
TranslateAnimation set3 = new TranslateAnimation(-3, -20, 0, -100);
set3.setDuration(3000);
set3.setStartOffset(4500);
TranslateAnimation set4 = new TranslateAnimation(0, 13, 0, -120);
set4.setDuration(3000);
set4.setStartOffset(6500);
AnimationSet animSet = new AnimationSet(false); // the set that the animations below are added to
animSet.addAnimation(set1);
animSet.addAnimation(set2);
animSet.addAnimation(set3);
animSet.addAnimation(set4);
animSet.setFillAfter(true);
After creating a set of animations, I apply them on the ImageView like this:
image = (ImageView)findViewById(R.id.img);
image.startAnimation(animSet);
Everything works fine, but I cannot pause the animation and resume it on a button click.
How can I do that?
I tried everything, but didn't succeed. Any idea how to do this?
Please help!
After searching for a while I found this link and checked whether it works for a TranslateAnimation as well; after some modification it works for your animation too.
See the modified code below:
public class TranslateAnim extends TranslateAnimation{
public TranslateAnim(float fromXDelta, float toXDelta, float fromYDelta,
float toYDelta) {
super(fromXDelta, toXDelta, fromYDelta, toYDelta);
// TODO Auto-generated constructor stub
}
private long mElapsedAtPause=0;
private boolean mPaused=false;
@Override
public boolean getTransformation(long currentTime, Transformation outTransformation) {
if(mPaused && mElapsedAtPause==0) {
mElapsedAtPause=currentTime-getStartTime();
}
if(mPaused)
setStartTime(currentTime-mElapsedAtPause);
return super.getTransformation(currentTime, outTransformation);
}
public void pause() {
mElapsedAtPause=0;
mPaused=true;
}
public void resume() {
mPaused=false;
}
}
I only changed the class name, the extended class name, and the constructor of this class.
You can use it like this:
TranslateAnim set1, set2, set3, set4; // objects of the TranslateAnim class
set1 = new TranslateAnim(-4, 10, -110, 0); // initialize all objects this way
animSet.addAnimation(set1); // add all animation objects to your animation set as you did before
animSet.setFillAfter(true);
After starting your animation, you only have to call the pause and resume methods.
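For example, pausing and resuming on a button click could be wired up like this (a minimal sketch; the button id btn_pause, the isPaused field, and keeping the four animations accessible from the click listener are assumptions, not part of the original code):
private boolean isPaused = false;

findViewById(R.id.btn_pause).setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        if (isPaused) {
            // resume every animation in the set
            set1.resume();
            set2.resume();
            set3.resume();
            set4.resume();
        } else {
            // pause every animation in the set
            set1.pause();
            set2.pause();
            set3.pause();
            set4.pause();
        }
        isPaused = !isPaused;
    }
});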
Thanks to Johan for sharing his code with us.
Hope this solves your problem. :)
You can also do it like this:
public class MyTranslateAnimation extends TranslateAnimation {
private long mTimePause, mTimeTotal;
private boolean mPause;
public MyTranslateAnimation(Context context, AttributeSet attrs) {
super(context, attrs);
}
@Override
public boolean getTransformation(long currentTime, Transformation outTransformation) {
updateTime(currentTime);
return super.getTransformation(mTimeTotal - mTimePause, outTransformation);
}
private void updateTime(long currentTime) {
long dt = currentTime - mTimeTotal;
mTimeTotal += dt;
if (mPause) {
mTimePause += dt;
}
}
public void pause() {
mPause = true;
}
public void resume() {
mPause = false;
}
}
To create the animation from XML, you can write your own version of AnimationUtils, like this:
public class MyAnimationUtils {
public static Animation loadAnimation(Context context, int id) throws Resources.NotFoundException {
XmlResourceParser parser = null;
try {
parser = context.getResources().getAnimation(id);
return createAnimationFromXml(context, parser);
} catch (XmlPullParserException ex) {
Resources.NotFoundException rnf = new Resources.NotFoundException("Can't load animation resource ID #0x" + Integer.toHexString(id));
rnf.initCause(ex);
throw rnf;
} catch (IOException ex) {
Resources.NotFoundException rnf = new Resources.NotFoundException("Can't load animation resource ID #0x" + Integer.toHexString(id));
rnf.initCause(ex);
throw rnf;
} finally {
if (parser != null) parser.close();
}
}
private static Animation createAnimationFromXml(Context c, XmlPullParser parser) throws XmlPullParserException, IOException {
return createAnimationFromXml(c, parser, null, Xml.asAttributeSet(parser));
}
private static Animation createAnimationFromXml(Context c, XmlPullParser parser, AnimationSet parent, AttributeSet attrs) throws XmlPullParserException, IOException {
Animation anim = null;
// Make sure we are on a start tag.
int type;
int depth = parser.getDepth();
while (((type=parser.next()) != XmlPullParser.END_TAG || parser.getDepth() > depth)
&& type != XmlPullParser.END_DOCUMENT) {
if (type != XmlPullParser.START_TAG) {
continue;
}
String name = parser.getName();
if (name.equals("set")) {
anim = new AnimationSet(c, attrs);
createAnimationFromXml(c, parser, (AnimationSet)anim, attrs);
} else if (name.equals("alpha")) {
anim = new AlphaAnimation(c, attrs);
} else if (name.equals("scale")) {
anim = new ScaleAnimation(c, attrs);
} else if (name.equals("rotate")) {
anim = new RotateAnimation(c, attrs);
} else if (name.equals("translate")) {
//anim = new TranslateAnimation(c, attrs);
anim = new MyTranslateAnimation(c, attrs); // only this line was changed; the rest was taken as-is
} else {
throw new RuntimeException("Unknown animation name: " + parser.getName());
}
if (parent != null) {
parent.addAnimation(anim);
}
}
return anim;
}
}
And then you build the animation like this:
MyTranslateAnimation cloud1 = (MyTranslateAnimation) MyAnimationUtils.loadAnimation(this, R.anim.main_cloud1);
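Pausing and resuming then works the same way as in the Java-only approach above; a minimal sketch, where cloudImage is an assumed ImageView that the animation is started on:
cloudImage.startAnimation(cloud1); // cloudImage is an assumed ImageView, not from the original code

// later, e.g. from button click handlers:
cloud1.pause();
// ...
cloud1.resume();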
Hope this helps. Enjoy!
Related
2020-04-22 16:14:49.759 1809-1809/? E/servicemanager: Could not find android.hardware.power.IPower/default in the VINTF manifest.
This keeps popping up. I am coding a camera where, when a button is clicked, an image is cropped.
Here is a custom view I am adding to a fragment.
public class DrawView extends View {
Point[] points = new Point[4];
/**
* point1 and point 3 are of same group and same as point 2 and point4
*/
int groupId = -1;
// array that holds the corner balls
private ArrayList<ColorBall> colorballs = new ArrayList<>();
private int mStrokeColor = Color.parseColor("#AADB1255");
private int mFillColor = Color.parseColor("#55DB1255");
private Rect mCropRect = new Rect();
// id of the ball that is being dragged
private int balID = 0;
Paint paint;
public DrawView(Context context) {
this(context, null);
}
public DrawView(Context context, AttributeSet attrs) {
this(context, attrs, -1);
}
public DrawView(Context context, AttributeSet attrs, int defStyle) {
super(context, attrs, defStyle);
init();
}
private void init() {
paint = new Paint();
setFocusable(true); // necessary for getting the touch events
}
private void initRectangle(int X, int Y) {
//initialize rectangle.
points[0] = new Point();
points[0].x = X - 200;
points[0].y = Y - 100;
points[1] = new Point();
points[1].x = X;
points[1].y = Y + 30;
points[2] = new Point();
points[2].x = X + 30;
points[2].y = Y + 30;
points[3] = new Point();
points[3].x = X + 30;
points[3].y = Y;
balID = 2;
groupId = 1;
// declare each ball with the ColorBall class
for (int i = 0; i < points.length; i++) {
colorballs.add(new ColorBall(getContext(), R.drawable.gray_circle, points[i], i));
}
}
// the method that draws the balls
@Override
protected void onDraw(Canvas canvas) {
if(points[3]==null) {
//point4 null when view first create
initRectangle(getWidth() / 2, getHeight() / 2);
}
int left, top, right, bottom;
left = points[0].x;
top = points[0].y;
right = points[0].x;
bottom = points[0].y;
for (int i = 1; i < points.length; i++) {
left = left > points[i].x ? points[i].x : left;
top = top > points[i].y ? points[i].y : top;
right = right < points[i].x ? points[i].x : right;
bottom = bottom < points[i].y ? points[i].y : bottom;
}
paint.setAntiAlias(true);
paint.setDither(true);
paint.setStrokeJoin(Paint.Join.ROUND);
paint.setStrokeWidth(5);
//draw stroke
paint.setStyle(Paint.Style.STROKE);
paint.setColor(mStrokeColor);
paint.setStrokeWidth(2);
mCropRect.left = left + colorballs.get(0).getWidthOfBall() / 2;
mCropRect.top = top + colorballs.get(0).getWidthOfBall() / 2;
mCropRect.right = right + colorballs.get(2).getWidthOfBall() / 2;
mCropRect.bottom = bottom + colorballs.get(3).getWidthOfBall() / 2;
canvas.drawRect(mCropRect, paint);
//fill the rectangle
paint.setStyle(Paint.Style.FILL);
paint.setColor(mFillColor);
paint.setStrokeWidth(0);
canvas.drawRect(mCropRect, paint);
// draw the balls on the canvas
paint.setColor(Color.RED);
paint.setTextSize(18);
paint.setStrokeWidth(0);
for (int i =0; i < colorballs.size(); i ++) {
ColorBall ball = colorballs.get(i);
canvas.drawBitmap(ball.getBitmap(), ball.getX(), ball.getY(),
paint);
canvas.drawText("" + (i+1), ball.getX(), ball.getY(), paint);
}
}
// events when touching the screen
public boolean onTouchEvent(MotionEvent event) {
int eventAction = event.getAction();
int X = (int) event.getX();
int Y = (int) event.getY();
switch (eventAction) {
case MotionEvent.ACTION_DOWN: // touch down so check if the finger is on
// a ball
if (points[0] == null) {
initRectangle(X, Y);
} else {
//resize rectangle
balID = -1;
groupId = -1;
for (int i = colorballs.size()-1; i>=0; i--) {
ColorBall ball = colorballs.get(i);
// check if inside the bounds of the ball (circle)
// get the center for the ball
int centerX = ball.getX() + ball.getWidthOfBall();
int centerY = ball.getY() + ball.getHeightOfBall();
paint.setColor(Color.CYAN);
// calculate the radius from the touch to the center of the
// ball
double radCircle = Math
.sqrt((double) (((centerX - X) * (centerX - X)) + (centerY - Y)
* (centerY - Y)));
if (radCircle < ball.getWidthOfBall()) {
balID = ball.getID();
if (balID == 1 || balID == 3) {
groupId = 2;
} else {
groupId = 1;
}
invalidate();
break;
}
invalidate();
}
}
break;
case MotionEvent.ACTION_MOVE: // touch drag with the ball
if (balID > -1) {
// move the balls the same as the finger
colorballs.get(balID).setX(X);
colorballs.get(balID).setY(Y);
paint.setColor(Color.CYAN);
if (groupId == 1) {
colorballs.get(1).setX(colorballs.get(0).getX());
colorballs.get(1).setY(colorballs.get(2).getY());
colorballs.get(3).setX(colorballs.get(2).getX());
colorballs.get(3).setY(colorballs.get(0).getY());
} else {
colorballs.get(0).setX(colorballs.get(1).getX());
colorballs.get(0).setY(colorballs.get(3).getY());
colorballs.get(2).setX(colorballs.get(3).getX());
colorballs.get(2).setY(colorballs.get(1).getY());
}
invalidate();
}
break;
case MotionEvent.ACTION_UP:
// touch drop - just do things here after dropping
break;
}
// redraw the canvas
invalidate();
return true;
}
public Drawable doTheCrop(Bitmap sourceBitmap) throws IOException {
//Bitmap sourceBitmap = null;
//Drawable backgroundDrawable = getBackground();
/*
if (backgroundDrawable instanceof BitmapDrawable) {
BitmapDrawable bitmapDrawable = (BitmapDrawable) backgroundDrawable;
if(bitmapDrawable.getBitmap() != null) {
sourceBitmap = bitmapDrawable.getBitmap();
}
}*/
//source bitmap was scaled, you should calculate the rate
float widthRate = ((float) sourceBitmap.getWidth()) / getWidth();
float heightRate = ((float) sourceBitmap.getHeight()) / getHeight();
//crop the source bitmap with rate value
int left = (int) (mCropRect.left * widthRate);
int top = (int) (mCropRect.top * heightRate);
int right = (int) (mCropRect.right * widthRate);
int bottom = (int) (mCropRect.bottom * heightRate);
Bitmap croppedBitmap = Bitmap.createBitmap(sourceBitmap, left, top, right - left, bottom - top);
Drawable drawable = new BitmapDrawable(getResources(), croppedBitmap);
return drawable;
/*
setContentView(R.layout.fragment_dashboard);
Button btn = (Button)findViewById(R.id.capture);
if (btn == null){
System.out.println("NULL");
}
try{
btn.setText("HI");
}
catch (Exception e){
}
//setBackground(drawable);*/
//savebitmap(croppedBitmap);
}
private File savebitmap(Bitmap bmp) throws IOException {
ByteArrayOutputStream bytes = new ByteArrayOutputStream();
bmp.compress(Bitmap.CompressFormat.JPEG, 60, bytes);
File f = new File(Environment.getExternalStorageDirectory()
+ "/" + "testimage.jpg");
Toast.makeText(getContext(), "YUP", Toast.LENGTH_LONG).show();
f.createNewFile();
FileOutputStream fo = new FileOutputStream(f);
fo.write(bytes.toByteArray());
fo.close();
return f;
}
public static class ColorBall {
Bitmap bitmap;
Context mContext;
Point point;
int id;
public ColorBall(Context context, int resourceId, Point point, int id) {
this.id = id;
bitmap = BitmapFactory.decodeResource(context.getResources(),
resourceId);
mContext = context;
this.point = point;
}
public int getWidthOfBall() {
return bitmap.getWidth();
}
public int getHeightOfBall() {
return bitmap.getHeight();
}
public Bitmap getBitmap() {
return bitmap;
}
public int getX() {
return point.x;
}
public int getY() {
return point.y;
}
public int getID() {
return id;
}
public void setX(int x) {
point.x = x;
}
public void setY(int y) {
point.y = y;
}
}
}
Here is the fragment that I have added the camera to; it is basically the main part of the application that I am working on.
public class DashboardFragment extends Fragment {
private DashboardViewModel dashboardViewModel;
//All my constants
private DrawView mDrawView;
private Drawable imgDraw;
private TextureView txtView;
private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
static{
ORIENTATIONS.append(Surface.ROTATION_0, 90);
ORIENTATIONS.append(Surface.ROTATION_90, 0);
ORIENTATIONS.append(Surface.ROTATION_180, 180);
ORIENTATIONS.append(Surface.ROTATION_270, 180);
}
private String cameraID;
private String pathway;
CameraDevice cameraDevice;
CameraCaptureSession cameraCaptureSession;
CaptureRequest captureRequest;
CaptureRequest.Builder captureRequestBuilder;
private Size imageDimensions;
private ImageReader imageReader;
private File file;
Handler mBackgroundHandler;
HandlerThread mBackgroundThread;
public View onCreateView(@NonNull LayoutInflater inflater,
ViewGroup container, Bundle savedInstanceState) {
dashboardViewModel =
ViewModelProviders.of(this).get(DashboardViewModel.class);
View root = inflater.inflate(R.layout.fragment_dashboard, container, false);
try{
txtView = (TextureView)root.findViewById(R.id.textureView);
txtView.setSurfaceTextureListener(textureListener);
mDrawView = root.findViewById(R.id.draw_view);
Button cap = (Button)root.findViewById(R.id.capture);
cap.setClickable(true);
cap.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
try {
Log.i("HOLA","HOLA");
takePicture();
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
});
}
catch (Exception e){
Log.i("HI",e.toString());
}
/*
txtView = (TextureView)root.findViewById(R.id.textureView);
txtView.setSurfaceTextureListener(textureListener);
mDrawView = root.findViewById(R.id.draw_view);
Button cap = (Button)root.findViewById(R.id.capture);
cap.setClickable(true);
cap.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
try {
Log.i("HOLA","HOLA");
takePicture();
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
});*/
return root;
}
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults){
if (requestCode == 101){
if (grantResults[0] == PackageManager.PERMISSION_DENIED){
Toast.makeText(getActivity().getApplicationContext(), "Permission is required", Toast.LENGTH_LONG).show();
}
}
}
TextureView.SurfaceTextureListener textureListener = new TextureView.SurfaceTextureListener() {
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int i, int i1) {
try {
openCamera();
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surfaceTexture, int i, int i1) {
}
@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) {
return false;
}
@Override
public void onSurfaceTextureUpdated(SurfaceTexture surfaceTexture) {
}
};
private final CameraDevice.StateCallback stateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(@NonNull CameraDevice camera) {
cameraDevice = camera;
try {
createCameraPreview();
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onDisconnected(@NonNull CameraDevice cameraDevice) {
cameraDevice.close();
}
@Override
public void onError(@NonNull CameraDevice cameraDevice, int i) {
cameraDevice.close();
cameraDevice = null;
}
};
private void createCameraPreview() throws CameraAccessException {
SurfaceTexture texture = txtView.getSurfaceTexture(); //?
texture.setDefaultBufferSize(imageDimensions.getWidth(), imageDimensions.getHeight());
Surface surface = new Surface(texture);
captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureRequestBuilder.addTarget(surface);
cameraDevice.createCaptureSession(Arrays.asList(surface), new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession session) {
if (cameraDevice == null){
return;
}
cameraCaptureSession = session;
try {
updatePreview();
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
Toast.makeText(getActivity().getApplicationContext(), "CONFIGURATION", Toast.LENGTH_LONG).show();
}
}, null);
}
private void updatePreview() throws CameraAccessException {
if (cameraDevice == null){
return;
}
captureRequestBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
cameraCaptureSession.setRepeatingRequest(captureRequestBuilder.build(), null, mBackgroundHandler);
}
private void openCamera() throws CameraAccessException {
CameraManager manager = (CameraManager) getActivity().getSystemService(Context.CAMERA_SERVICE);
cameraID = manager.getCameraIdList()[0];
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraID);
StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
imageDimensions = map.getOutputSizes(SurfaceTexture.class)[0];
if (ActivityCompat.checkSelfPermission(getActivity().getApplicationContext(), Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED
&& ActivityCompat.checkSelfPermission(getActivity().getApplicationContext(), Manifest.permission.WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED){
ActivityCompat.requestPermissions(getActivity(), new String[]{Manifest.permission.CAMERA, Manifest.permission.WRITE_EXTERNAL_STORAGE}, 101);
return;
}
manager.openCamera(cameraID, stateCallback, null);
}
private void takePicture() throws CameraAccessException {
if (cameraDevice == null) {
Log.i("NOt working", "hi");
return;
}
CameraManager manager = (CameraManager) getActivity().getSystemService(Context.CAMERA_SERVICE);
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraDevice.getId());
Size[] jpegSizes = null;
jpegSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.JPEG);
int width = 640;
int height = 480;
if (jpegSizes != null && jpegSizes.length > 0) {
width = jpegSizes[0].getWidth();
height = jpegSizes[0].getHeight();
}
final ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
List<Surface> outputSurfaces = new ArrayList<>(2);
outputSurfaces.add(reader.getSurface());
outputSurfaces.add(new Surface(txtView.getSurfaceTexture()));
final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(reader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
int rotation = getActivity().getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
Long tsLong = System.currentTimeMillis() / 1000;
String ts = tsLong.toString();
file = new File(Environment.getExternalStorageDirectory() + "/" + ts + ".jpg");
pathway = Environment.getExternalStorageDirectory() + "/" + ts + ".jpg";
//cameraDevice.close();
ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader imageReader) {
Image image = null;
//image = reader.acquireLatestImage();
image = reader.acquireNextImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
Bitmap bitmap = BitmapFactory.decodeByteArray(bytes , 0, bytes.length);
try {
Drawable back = mDrawView.doTheCrop(bitmap);
Button btn = (Button)getView().findViewById(R.id.capture);
btn.setBackground(back);
} catch (IOException e) {
e.printStackTrace();
}
/*
try {
save(bytes);
} catch (IOException e) {
e.printStackTrace();
} finally {
if (image != null){
image.close();
}
}*/
}
};
reader.setOnImageAvailableListener(readerListener, mBackgroundHandler);
final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback(){
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result){
super.onCaptureCompleted(session, request, result);
try {
createCameraPreview();
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
};
cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession session) {
try {
session.capture(captureBuilder.build(), captureListener, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
}
}, mBackgroundHandler);
}
private void save (byte[] bytes) throws IOException {
OutputStream outputStream = null;
outputStream = new FileOutputStream(file);
outputStream.write(bytes);
Toast.makeText(getActivity().getApplicationContext(),pathway,Toast.LENGTH_LONG).show();
outputStream.close();
imgDraw = Drawable.createFromPath(pathway);
//mDrawView.doTheCrop(imgDraw);
}
@Override
public void onResume(){
super.onResume();
startBackgroundThread();
if (txtView.isAvailable()){
try {
openCamera();
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
else{
txtView.setSurfaceTextureListener(textureListener);
}
}
private void startBackgroundThread() {
mBackgroundThread = new HandlerThread("Camera Background");
mBackgroundThread.start();
mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
}
protected void stopBackgroundThread() throws InterruptedException{
mBackgroundThread.quitSafely();
mBackgroundThread.join();
mBackgroundThread = null;
mBackgroundHandler = null;
}
@Override
public void onPause(){
try {
stopBackgroundThread();
} catch (InterruptedException e) {
e.printStackTrace();
}
super.onPause();
}
}
Here is the xml file for that fragment.
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout
xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".ui.dashboard.DashboardFragment">
<TextureView
android:id = "#+id/textureView"
android:layout_width="match_parent"
android:layout_height="match_parent"/>
<com.PeavlerDevelopment.OpinionMinion.ui.dashboard.DrawView
android:id="#+id/draw_view"
android:layout_width="match_parent"
android:layout_height="match_parent"/>
<Button
android:id="#+id/capture"
android:layout_width="100dp"
android:layout_height="200dp"
android:clickable="true"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintLeft_toLeftOf="parent"></Button>
</androidx.constraintlayout.widget.ConstraintLayout>
The problem seems to lie somewhere in the doTheCrop method of the DrawView class.
If there is anything else that would help make the problem more clear, let me know! I will gladly share the github repo with you.
Thank you.
As you can see in the Android design documentation, VINTF stands for Vendor Interface, and it is a manifest structure used to aggregate data from the device. That specific log means that your manifest is missing something like this:
<hal>
<name>android.hardware.power</name>
<transport>hwbinder</transport>
<version>1.1</version>
<interface>
<name>IPower</name>
<instance>default</instance>
</interface>
</hal>
which is basically hardware power information.
I think it is not related to what you are trying to do, but I would need more information than that log.
I am using the custom class below to outline text in my quotes application. It works fine on devices up to Android 9. On an Android 10 device it does not give me any error, but it also does not work.
My custom class for drawing the outline is below:
public class OutlineTextView extends androidx.appcompat.widget.AppCompatTextView {
private Field colorField;
private int textColor;
private int outlineColor;
public OutlineTextView(Context context) {
this(context, null);
}
public OutlineTextView(Context context, AttributeSet attrs) {
this(context, attrs, android.R.attr.textViewStyle);
}
public OutlineTextView(Context context, AttributeSet attrs, int defStyleAttr) {
super(context, attrs, defStyleAttr);
try {
colorField = TextView.class.getDeclaredField("mCurTextColor");
colorField.setAccessible(true);
textColor = getTextColors().getDefaultColor();
TypedArray a = context.obtainStyledAttributes(attrs, R.styleable.OutlineTextView);
outlineColor = a.getColor(R.styleable.OutlineTextView_outlineColor, Color.TRANSPARENT);
setOutlineStrokeWidth(a.getDimensionPixelSize(R.styleable.OutlineTextView_outlineWidth, 0));
a.recycle();
}
catch (NoSuchFieldException e) {
e.printStackTrace();
colorField = null;
}
}
@Override
public void setTextColor(int color) {
textColor = color;
super.setTextColor(color);
}
public void setOutlineColor(int color) {
invalidate();
outlineColor = color;
}
public void setOutlineWidth(float width) {
invalidate();
setOutlineStrokeWidth(width);
}
private void setOutlineStrokeWidth(float width) {
invalidate();
getPaint().setStrokeWidth(2 * width + 1);
}
@Override
protected void onDraw(Canvas canvas) {
if (colorField != null) {
// Outline
setColorField(outlineColor);
getPaint().setStyle(Paint.Style.STROKE);
super.onDraw(canvas);
// Reset for text
setColorField(textColor);
getPaint().setStyle(Paint.Style.FILL);
}
super.onDraw(canvas);
}
private void setColorField(int color) {
// We did the null check in onDraw()
try {
colorField.setInt(this, color);
}
catch (IllegalAccessException | IllegalArgumentException e) {
// Optionally catch Exception and remove print after testing
e.printStackTrace();
}
}
}
I am calling it from my RecyclerView like this
CardMainActivity.text_quotes.setOutlineColor(Color.parseColor(CardConstant.getcolorcodearray().get(getAdapterPosition())));
My Gradle build configuration is below:
compileSdkVersion 29
buildToolsVersion "29.0.2"
Let me know if someone can help me. Thanks!
I want to make an app that takes photos only of faces, but I get the whole image instead of the face when I save it to my storage file. How can I crop the face and save it to storage with the help of the Google Vision face detection API?
Specifically, how do I use a Frame in my code to get the list of faces, and how can I convert a face into a bitmap and save it to my storage?
main.xml
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout
xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent">
<LinearLayout
xmlns:android="http://schemas.android.com/apk/res/android"
android:id="#+id/topLayout"
android:orientation="vertical"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:keepScreenOn="true">
<com.google.android.gms.samples.vision.face.facetracker.ui.camera.CameraSourcePreview
android:id="#+id/preview"
android:layout_width="match_parent"
android:layout_height="match_parent">
<com.google.android.gms.samples.vision.face.facetracker.ui.camera.GraphicOverlay
android:id="#+id/faceOverlay"
android:layout_width="match_parent"
android:layout_height="match_parent" />
</com.google.android.gms.samples.vision.face.facetracker.ui.camera.CameraSourcePreview>
</LinearLayout>
<android.support.v7.widget.AppCompatButton
android:id="#+id/take_pic_btn"
android:layout_gravity="bottom"
android:gravity="center"
android:background="#color/green"
android:layout_margin="10dp"
android:text="Take Image"
android:textStyle="bold"
android:textSize="20sp"
android:textAllCaps="false"
android:layout_width="match_parent"
android:layout_height="wrap_content" />
</FrameLayout>
FaceTrackerActivity.java
public final class FaceTrackerActivity extends AppCompatActivity {
private static final String TAG = "FaceTracker";
private CameraSource mCameraSource = null;
private CameraSourcePreview mPreview;
private GraphicOverlay mGraphicOverlay;
private static final int RC_HANDLE_GMS = 9001;
// permission request codes need to be < 256
private static final int RC_HANDLE_CAMERA_PERM = 2;
private Button takePicButton;
FaceDetector detector;
@Override
public void onCreate(Bundle icicle) {
super.onCreate(icicle);
setContentView(R.layout.main);
mPreview = (CameraSourcePreview) findViewById(R.id.preview);
mGraphicOverlay = (GraphicOverlay) findViewById(R.id.faceOverlay);
takePicButton=(Button)findViewById(R.id.take_pic_btn);
int rc = ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA);
int gc = ActivityCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE);
if (rc == PackageManager.PERMISSION_GRANTED && gc == PackageManager.PERMISSION_GRANTED) {
createCameraSource();
} else {
requestCameraPermission();
}
takePicButton.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
Toast.makeText(getApplicationContext(),"dilip",Toast.LENGTH_LONG).show();
captureImage();
}
});
}
private void requestCameraPermission() {
Log.w(TAG, "Camera permission is not granted. Requesting permission");
final String[] permissions = new String[]{Manifest.permission.CAMERA,Manifest.permission.WRITE_EXTERNAL_STORAGE};
if (!ActivityCompat.shouldShowRequestPermissionRationale(this,
Manifest.permission.CAMERA) && !ActivityCompat.shouldShowRequestPermissionRationale(this,
Manifest.permission.WRITE_EXTERNAL_STORAGE) ) {
ActivityCompat.requestPermissions(this, permissions, RC_HANDLE_CAMERA_PERM);
return;
}
final Activity thisActivity = this;
View.OnClickListener listener = new View.OnClickListener() {
@Override
public void onClick(View view) {
ActivityCompat.requestPermissions(thisActivity, permissions,
RC_HANDLE_CAMERA_PERM);
}
};
Snackbar.make(mGraphicOverlay, R.string.permission_camera_rationale,
Snackbar.LENGTH_INDEFINITE)
.setAction(R.string.ok, listener)
.show();
}
private void createCameraSource() {
Context context = getApplicationContext();
/* FaceDetector detector = new FaceDetector.Builder(context)
.setClassificationType(FaceDetector.ALL_CLASSIFICATIONS)
.build();
detector.setProcessor(
new MultiProcessor.Builder<>(new GraphicFaceTrackerFactory())
.build());*/
detector= new FaceDetector.Builder(context)
.setClassificationType(FaceDetector.ALL_CLASSIFICATIONS)
.build();
MyFaceDetector myFaceDetector = new MyFaceDetector(detector);
detector.setProcessor(
new MultiProcessor.Builder<>(new GraphicFaceTrackerFactory())
.build());
mCameraSource = new CameraSource.Builder(context, myFaceDetector)
.build();
if (!detector.isOperational()) {
}
mCameraSource = new CameraSource.Builder(context, detector)
.setRequestedPreviewSize(640, 480)
.setFacing(CameraSource.CAMERA_FACING_FRONT)
.setRequestedFps(10.0f)
.build();
}
/**
* Restarts the camera.
*/
@Override
protected void onResume() {
super.onResume();
startCameraSource();
}
/**
* Stops the camera.
*/
@Override
protected void onPause() {
super.onPause();
mPreview.stop();
}
@Override
protected void onDestroy() {
super.onDestroy();
if (mCameraSource != null) {
mCameraSource.release();
}
}
/**
* Callback for the result from requesting permissions. This method
is invoked for every call on {@link #requestPermissions(String[], int)}.
* <p>
* <strong>Note:</strong> It is possible that the permissions request interaction
* with the user is interrupted. In this case you will receive empty permissions
* and results arrays which should be treated as a cancellation.
* </p>
*
@param requestCode The request code passed in {@link #requestPermissions(String[], int)}.
@param permissions The requested permissions. Never null.
@param grantResults The grant results for the corresponding permissions
which is either {@link PackageManager#PERMISSION_GRANTED}
or {@link PackageManager#PERMISSION_DENIED}. Never null.
@see #requestPermissions(String[], int)
*/
@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
if (requestCode != RC_HANDLE_CAMERA_PERM) {
Log.d(TAG, "Got unexpected permission result: " + requestCode);
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
return;
}
if (grantResults.length != 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED &&grantResults[1] == PackageManager.PERMISSION_GRANTED) {
Log.d(TAG, "Camera permission granted - initialize the camera source");
// we have permission, so create the camerasource
createCameraSource();
return;
}
Log.e(TAG, "Permission not granted: results len = " + grantResults.length +
" Result code = " + (grantResults.length > 0 ? grantResults[0] : "(empty)"));
DialogInterface.OnClickListener listener = new DialogInterface.OnClickListener() {
public void onClick(DialogInterface dialog, int id) {
finish();
}
};
AlertDialog.Builder builder = new AlertDialog.Builder(this);
builder.setTitle("Face Tracker sample")
.setMessage(R.string.no_camera_permission)
.setPositiveButton(R.string.ok, listener)
.show();
}
//==============================================================================================
// Camera Source Preview
//==============================================================================================
/**
* Starts or restarts the camera source, if it exists. If the camera source doesn't exist yet
* (e.g., because onResume was called before the camera source was created), this will be called
* again when the camera source is created.
*/
private void startCameraSource() {
// check that the device has play services available.
int code = GoogleApiAvailability.getInstance().isGooglePlayServicesAvailable(
getApplicationContext());
if (code != ConnectionResult.SUCCESS) {
Dialog dlg =
GoogleApiAvailability.getInstance().getErrorDialog(this, code, RC_HANDLE_GMS);
dlg.show();
}
if (mCameraSource != null) {
try {
mPreview.start(mCameraSource, mGraphicOverlay);
} catch (IOException e) {
Log.e(TAG, "Unable to start camera source.", e);
mCameraSource.release();
mCameraSource = null;
}
}
}
//==============================================================================================
// Graphic Face Tracker
//==============================================================================================
/**
* Factory for creating a face tracker to be associated with a new face. The multiprocessor
* uses this factory to create face trackers as needed -- one for each individual.
*/
private class GraphicFaceTrackerFactory implements MultiProcessor.Factory<Face> {
@Override
public Tracker<Face> create(Face face) {
return new GraphicFaceTracker(mGraphicOverlay);
}
}
/**
* Face tracker for each detected individual. This maintains a face graphic within the app's
* associated face overlay.
*/
private class GraphicFaceTracker extends Tracker<Face> {
private GraphicOverlay mOverlay;
private FaceGraphic mFaceGraphic;
GraphicFaceTracker(GraphicOverlay overlay) {
mOverlay = overlay;
mFaceGraphic = new FaceGraphic(overlay);
}
/**
* Start tracking the detected face instance within the face overlay.
*/
@Override
public void onNewItem(int faceId, Face item) {
mFaceGraphic.setId(faceId);
}
@Override
public void onUpdate(FaceDetector.Detections<Face> detectionResults, Face face) {
mOverlay.add(mFaceGraphic);
mFaceGraphic.updateFace(face);
}
/**
* Hide the graphic when the corresponding face was not detected. This can happen for
* intermediate frames temporarily (e.g., if the face was momentarily blocked from
* view).
*/
@Override
public void onMissing(FaceDetector.Detections<Face> detectionResults) {
mOverlay.remove(mFaceGraphic);
}
/**
* Called when the face is assumed to be gone for good. Remove the graphic annotation from
* the overlay.
*/
@Override
public void onDone() {
mOverlay.remove(mFaceGraphic);
}
}
private void captureImage() {
mPreview.setDrawingCacheEnabled(true);
final Bitmap drawingCache = mPreview.getDrawingCache();
mCameraSource.takePicture(null, new CameraSource.PictureCallback() {
@Override
public void onPictureTaken(byte[] bytes) {
int orientation = Exif.getOrientation(bytes);
Bitmap temp = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
Bitmap picture = rotateImage(temp,orientation);
Bitmap overlay = Bitmap.createBitmap(mGraphicOverlay.getWidth(),mGraphicOverlay.getHeight(),picture.getConfig());
Canvas canvas = new Canvas(overlay);
Matrix matrix = new Matrix();
matrix.setScale((float)overlay.getWidth()/(float)picture.getWidth(),(float)overlay.getHeight()/(float)picture.getHeight());
// mirror by inverting scale and translating
matrix.preScale(-1, 1);
matrix.postTranslate(canvas.getWidth(), 0);
Paint paint = new Paint();
canvas.drawBitmap(picture,matrix,paint);
canvas.drawBitmap(drawingCache,0,0,paint);
try {
String mainpath = getExternalStorageDirectory() + separator + "MaskIt" + separator + "images" + separator;
File basePath = new File(mainpath);
if (!basePath.exists())
Log.d("CAPTURE_BASE_PATH", basePath.mkdirs() ? "Success": "Failed");
String path = mainpath + "photo_" + getPhotoTime() + ".jpg";
File captureFile = new File(path);
captureFile.createNewFile();
if (!captureFile.exists())
Log.d("CAPTURE_FILE_PATH", captureFile.createNewFile() ? "Success": "Failed");
FileOutputStream stream = new FileOutputStream(captureFile);
overlay.compress(Bitmap.CompressFormat.PNG, 100, stream);
stream.flush();
stream.close();
picture.recycle();
drawingCache.recycle();
mPreview.setDrawingCacheEnabled(false);
} catch (IOException e) {
e.printStackTrace();
}
}
private String getPhotoTime() {
DateFormat dateFormatter = new SimpleDateFormat("yyyyMMdd hhmmss");
dateFormatter.setLenient(false);
Date today = new Date();
String s = dateFormatter.format(today);
return s;
}
});
}
private Bitmap rotateImage(Bitmap bm, int i) {
Matrix matrix = new Matrix();
switch (i) {
case ExifInterface.ORIENTATION_NORMAL:
return bm;
case ExifInterface.ORIENTATION_FLIP_HORIZONTAL:
matrix.setScale(-1, 1);
break;
case ExifInterface.ORIENTATION_ROTATE_180:
matrix.setRotate(180);
break;
case ExifInterface.ORIENTATION_FLIP_VERTICAL:
matrix.setRotate(180);
matrix.postScale(-1, 1);
break;
case ExifInterface.ORIENTATION_TRANSPOSE:
matrix.setRotate(90);
matrix.postScale(-1, 1);
break;
case ExifInterface.ORIENTATION_ROTATE_90:
matrix.setRotate(90);
break;
case ExifInterface.ORIENTATION_TRANSVERSE:
matrix.setRotate(-90);
matrix.postScale(-1, 1);
break;
case ExifInterface.ORIENTATION_ROTATE_270:
matrix.setRotate(-90);
break;
default:
return bm;
}
try {
Bitmap bmRotated = Bitmap.createBitmap(bm, 0, 0, bm.getWidth(), bm.getHeight(), matrix, true);
bm.recycle();
return bmRotated;
} catch (OutOfMemoryError e) {
e.printStackTrace();
return null;
}
}
}
CameraSourcePreview.java
package com.google.android.gms.samples.vision.face.facetracker.ui.camera;
import android.content.Context;
import android.content.res.Configuration;
import android.util.AttributeSet;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.ViewGroup;
import com.google.android.gms.common.images.Size;
import com.google.android.gms.vision.CameraSource;
import java.io.IOException;
public class CameraSourcePreview extends ViewGroup {
private static final String TAG = "CameraSourcePreview";
private Context mContext;
private SurfaceView mSurfaceView;
private boolean mStartRequested;
private boolean mSurfaceAvailable;
private CameraSource mCameraSource;
private GraphicOverlay mOverlay;
public CameraSourcePreview(Context context, AttributeSet attrs) {
super(context, attrs);
mContext = context;
mStartRequested = false;
mSurfaceAvailable = false;
mSurfaceView = new SurfaceView(context);
mSurfaceView.getHolder().addCallback(new SurfaceCallback());
addView(mSurfaceView);
}
public void start(CameraSource cameraSource) throws IOException {
if (cameraSource == null) {
stop();
}
mCameraSource = cameraSource;
if (mCameraSource != null) {
mStartRequested = true;
startIfReady();
}
}
public void start(CameraSource cameraSource, GraphicOverlay overlay) throws IOException {
mOverlay = overlay;
start(cameraSource);
}
public void stop() {
if (mCameraSource != null) {
mCameraSource.stop();
}
}
public void release() {
if (mCameraSource != null) {
mCameraSource.release();
mCameraSource = null;
}
}
private void startIfReady() throws IOException {
if (mStartRequested && mSurfaceAvailable) {
mCameraSource.start(mSurfaceView.getHolder());
if (mOverlay != null) {
Size size = mCameraSource.getPreviewSize();
int min = Math.min(size.getWidth(), size.getHeight());
int max = Math.max(size.getWidth(), size.getHeight());
if (isPortraitMode()) {
// Swap width and height sizes when in portrait, since it will be rotated by
// 90 degrees
mOverlay.setCameraInfo(min, max, mCameraSource.getCameraFacing());
} else {
mOverlay.setCameraInfo(max, min, mCameraSource.getCameraFacing());
}
mOverlay.clear();
}
mStartRequested = false;
}
}
private class SurfaceCallback implements SurfaceHolder.Callback {
@Override
public void surfaceCreated(SurfaceHolder surface) {
mSurfaceAvailable = true;
try {
startIfReady();
} catch (IOException e) {
Log.e(TAG, "Could not start camera source.", e);
}
}
@Override
public void surfaceDestroyed(SurfaceHolder surface) {
mSurfaceAvailable = false;
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
}
}
@Override
protected void onLayout(boolean changed, int left, int top, int right, int bottom) {
int previewWidth = 320;
int previewHeight = 240;
if (mCameraSource != null) {
Size size = mCameraSource.getPreviewSize();
if (size != null) {
previewWidth = size.getWidth();
previewHeight = size.getHeight();
}
}
// Swap width and height sizes when in portrait, since it will be rotated 90 degrees
if (isPortraitMode()) {
int tmp = previewWidth;
previewWidth = previewHeight;
previewHeight = tmp;
}
final int viewWidth = right - left;
final int viewHeight = bottom - top;
int childWidth;
int childHeight;
int childXOffset = 0;
int childYOffset = 0;
float widthRatio = (float) viewWidth / (float) previewWidth;
float heightRatio = (float) viewHeight / (float) previewHeight;
// To fill the view with the camera preview, while also preserving the correct aspect ratio,
// it is usually necessary to slightly oversize the child and to crop off portions along one
// of the dimensions. We scale up based on the dimension requiring the most correction, and
// compute a crop offset for the other dimension.
if (widthRatio > heightRatio) {
childWidth = viewWidth;
childHeight = (int) ((float) previewHeight * widthRatio);
childYOffset = (childHeight - viewHeight) / 2;
} else {
childWidth = (int) ((float) previewWidth * heightRatio);
childHeight = viewHeight;
childXOffset = (childWidth - viewWidth) / 2;
}
for (int i = 0; i < getChildCount(); ++i) {
// One dimension will be cropped. We shift child over or up by this offset and adjust
// the size to maintain the proper aspect ratio.
getChildAt(i).layout(
-1 * childXOffset, -1 * childYOffset,
childWidth - childXOffset, childHeight - childYOffset);
}
try {
startIfReady();
} catch (IOException e) {
Log.e(TAG, "Could not start camera source.", e);
}
}
private boolean isPortraitMode() {
int orientation = mContext.getResources().getConfiguration().orientation;
if (orientation == Configuration.ORIENTATION_LANDSCAPE) {
return false;
}
if (orientation == Configuration.ORIENTATION_PORTRAIT) {
return true;
}
Log.d(TAG, "isPortraitMode returning false by default");
return false;
}
}
GraphicOverlay.java
package com.google.android.gms.samples.vision.face.facetracker.ui.camera;
import android.content.Context;
import android.graphics.Canvas;
import android.util.AttributeSet;
import android.view.View;
import com.google.android.gms.vision.CameraSource;
import java.util.HashSet;
import java.util.Set;
public class GraphicOverlay extends View {
private final Object mLock = new Object();
private int mPreviewWidth;
private float mWidthScaleFactor = 1.0f;
private int mPreviewHeight;
private float mHeightScaleFactor = 1.0f;
private int mFacing = CameraSource.CAMERA_FACING_BACK;
private Set<Graphic> mGraphics = new HashSet<>();
public static abstract class Graphic {
private GraphicOverlay mOverlay;
public Graphic(GraphicOverlay overlay) {
mOverlay = overlay;
}
public abstract void draw(Canvas canvas);
public float scaleX(float horizontal) {
return horizontal * mOverlay.mWidthScaleFactor;
}
public float scaleY(float vertical) {
return vertical * mOverlay.mHeightScaleFactor;
}
public float translateX(float x) {
if (mOverlay.mFacing == CameraSource.CAMERA_FACING_FRONT) {
return mOverlay.getWidth() - scaleX(x);
} else {
return scaleX(x);
}
}
public float translateY(float y) {
return scaleY(y);
}
public void postInvalidate() {
mOverlay.postInvalidate();
}
}
public GraphicOverlay(Context context, AttributeSet attrs) {
super(context, attrs);
}
public void clear() {
synchronized (mLock) {
mGraphics.clear();
}
postInvalidate();
}
public void add(Graphic graphic) {
synchronized (mLock) {
mGraphics.add(graphic);
}
postInvalidate();
}
public void remove(Graphic graphic) {
synchronized (mLock) {
mGraphics.remove(graphic);
}
postInvalidate();
}
public void setCameraInfo(int previewWidth, int previewHeight, int facing) {
synchronized (mLock) {
mPreviewWidth = previewWidth;
mPreviewHeight = previewHeight;
mFacing = facing;
}
postInvalidate();
}
/**
* Draws the overlay with its associated graphic objects.
*/
@Override
protected void onDraw(Canvas canvas) {
super.onDraw(canvas);
synchronized (mLock) {
if ((mPreviewWidth != 0) && (mPreviewHeight != 0)) {
mWidthScaleFactor = (float) canvas.getWidth() / (float)mPreviewWidth;
mHeightScaleFactor = (float) canvas.getHeight() / (float) mPreviewHeight;
}
for (Graphic graphic : mGraphics) {
graphic.draw(canvas);
}
}
}
}
You should pass the captured image to the FaceDetector API and crop it afterwards.
fun getFace(context: Context, data: ByteArray): Bitmap? {
try {
val imageStream = ByteArrayInputStream(data)
var bitmap = BitmapFactory.decodeStream(imageStream)
if (bitmap.width > bitmap.height) {
val matrix = Matrix()
matrix.postRotate(270f)
if (bitmap.width > 1500) matrix.postScale(0.5f, 0.5f)
bitmap = Bitmap.createBitmap(bitmap, 0, 0, bitmap.width, bitmap.height, matrix, true)
}
val faceDetector = FaceDetector.Builder(context).setProminentFaceOnly(true).setTrackingEnabled(false).build()
val frame = Frame.Builder().setBitmap(bitmap).build()
val faces = faceDetector.detect(frame)
var results: Bitmap? = null
for (i in 0 until faces.size()) {
val thisFace = faces.valueAt(i)
val x = thisFace.position.x
val y = thisFace.position.y
val x2 = x / 4 + thisFace.width
val y2 = y / 4 + thisFace.height
results = Bitmap.createBitmap(bitmap, x.toInt(), y.toInt(), x2.toInt(), y2.toInt())
}
return results
} catch (e: Exception) {
Log.e("GET_FACE", e.message)
}
return null
}
Source
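If the rest of the project is in Java, as in the question, the function above can be called from the existing onPictureTaken callback. A rough sketch, assuming the Kotlin function is placed in a file named FaceUtils.kt (so Java sees it as FaceUtilsKt.getFace); the file name and the reuse of the question's path variable are assumptions:
// inside onPictureTaken(byte[] bytes), within the existing try/catch block:
Bitmap face = FaceUtilsKt.getFace(getApplicationContext(), bytes);
if (face != null) {
    // save only the cropped face instead of the full camera frame
    FileOutputStream stream = new FileOutputStream(path); // path built as in the original code
    face.compress(Bitmap.CompressFormat.JPEG, 100, stream);
    stream.flush();
    stream.close();
}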
I'm trying to change attributes on a custom LinearLayout class. I set the options for the class with:
MyBuilder option = new MyBuilder.Builder()
.image(...)
.setCardRadius(...)
.build());
Then I call it in MainActivity:
MyObject obj = (MyObject) findViewById(R.id.myObject);
obj.init(context, option);
But if I call obj.init(...) multiple times with different options, the builder still has the old values set, so the view cannot change its attributes correctly.
So my question is: can I reset the Builder or the LinearLayout before initializing a new object?
This is my LinearLayout.java:
public class MyObject extends LinearLayout{
CardView card;
ImageView image;
float cardRadiusAttr;
View rootView;
AttributeSet attributeSet;
public void init(final Context context, final MyBuilder option){
if(option != null)
{
/*
Get attribute from XML
*/
TypedArray ta = context.obtainStyledAttributes(attributeSet, R.styleable.Card, 0, 0);
try {
cardRadiusAttr = ta.getDimension(R.styleable.Card_c_cardRadius, option.getCardRadius());
} finally {
ta.recycle();
}
/*
Set up xml object.
*/
card = (CardView) findViewById(R.id.card);
image = (ImageView) findViewById(R.id.image);
card.setRadius(cardRadiusAttr);
/**
* Check if Option is set
*/
if (option.isImage() != null) {
//Set Image
}
}else{
Log.e("Initialization", "Option View not initialize!");
}
}
public MyObject(Context context, AttributeSet attrs) {
super(context, attrs);
/*
Inflater custom layout to view.
*/
LayoutInflater inflater = (LayoutInflater) context
.getSystemService(Context.LAYOUT_INFLATER_SERVICE);
rootView = inflater.inflate(R.layout.Card, this, true);
attributeSet = attrs;
}
@Override
protected void onFinishInflate() {
super.onFinishInflate();
}
}
This is MyBuilder.java
public class MyBuilder {
private int mImage;
private float mCardRadius = 4f;
private MyBuilder(Builder builder)
{
mImage = builder.mImage;
mCardRadius = builder.mCardRadius;
}
public static class Builder{
private int mImage;
private float mCardRadius = 4f;
public Builder setCardRadius(float radius)
{
if(radius <= 0)
{
Log.e("CardRadius", "Impossible to set Card Radius lower than 0! Please Check it");
}
else {
mCardRadius = radius;
}
return this;
}
public Builder image(int image) {
if(image == 0)
{
Log.e("Image", "Impossible to set Image to 0! Please Check it");
}
else {
mImage = image;
}
return this;
}
public MyBuilder build() {
return new MyBuilder(this);
}
}
public int getImage() {
return mImage;
}
public float getCardRadius() {
return mCardRadius;
}
}
I finally found the issue.
In the init method of MyObject you have to clean up the View after its previous use.
In this particular case, you first pass one set of options; based on them, the View adjusts the visibility of its controls (making button1, button2, etc. visible). But when you pass another set of options, you have to erase all the changes that were made before (i.e. hide button1, button2, etc.) and let the View adjust the visibility of its controls once again.
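A minimal sketch of what such a clean-up could look like for the MyObject class from the question (the default values used here are assumptions, not taken from the original code):
public void reset() {
    // restore defaults so a second init() call starts from a clean state
    if (card != null) {
        card.setRadius(4f);           // assumed default radius
    }
    if (image != null) {
        image.setImageDrawable(null); // clear any previously applied image
    }
}
Calling reset() at the top of init(...) before reading the new options should make repeated init(...) calls behave consistently.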
I have nearly 500+ images as ImageViews inside a HorizontalScrollView. If I select an image, I mark it as selected. If I then select any other image in the view, the previous one should be un-selected and the newly clicked image should become selected. How can I achieve this?
for (int i = 0; i < Home.arr_category_item_list.size(); i++) {
ImageView circleImageView = new ImageView(getActivity());
imageLoader.get(Home.arr_category_item_list.get(i).get(Variables.EST_CATEGORY_ITEM_IMAGE), ImageLoader.getImageListener(circleImageView, R.drawable.defaultimage, R.drawable.defaultimage));
circleImageView.setTag(Integer.parseInt(Home.arr_category_item_list.get(i).get(Variables.EST_CATEGORY_ITEM_ID)));
circleImageView.setLayoutParams(params);
lnr_category_item.addView(circleImageView);
}
Please check the attached image. At the bottom of the screen there is a row of ImageViews, and the user should only be able to select one image at a time.
Now, this is some really old code; I hope it still works.
Note: there may be a thing or two missing, but you can get the idea from this implementation.
import android.content.Context;
import android.util.AttributeSet;
import android.view.View;
import android.widget.ImageView;
import yourpackage.R;
//Created by Bojan Kseneman on 14.8.2013
public class CustomCheckBox extends ImageView {
private boolean isChecked;
private boolean isImageShown;
private boolean useCustomClickListener;
//private String android_xmlns = "http://schemas.android.com/apk/res/android";
private String app_xmlns;
private int checkboxOnResID;
private int checkboxOffResID;
private int checkboxDisabledOnResID;
private int checkboxDisabledOffResID;
//private int imageHeight;
//private int imageWidth;
public CustomCheckBox(Context context, AttributeSet attrs) {
super(context, attrs);
app_xmlns = new StringBuilder("http://schemas.android.com/apk/res/" + context.getPackageName()).toString();
init(attrs);
}
private void init(AttributeSet attrs) {
checkboxOnResID = attrs.getAttributeResourceValue(app_xmlns, "resourceChecked", R.drawable.round_checkbox_on);
checkboxOffResID = attrs.getAttributeResourceValue(app_xmlns, "resourceNotChecked", R.drawable.round_checkbox_off);
checkboxDisabledOnResID = attrs.getAttributeResourceValue(app_xmlns, "resourceDisabledOn", R.drawable.round_checkbox_off);
checkboxDisabledOffResID = attrs.getAttributeResourceValue(app_xmlns, "resourceDisabledOff", R.drawable.round_checkbox_off);
useCustomClickListener = attrs.getAttributeBooleanValue(app_xmlns, "customClickEvent", false);
if (useCustomClickListener)
this.setOnTouchListener(new CboxTouchListener());
else {
this.setOnTouchListener(new NormalClickListener());
}
if (!hasOnClickListener()) {
/**
* assign a new onClick listener so we get desired onClick sound
* effect (because we call it) this is opposite to how android
* behaves, where you don't hear the sound if there is not
* onClickListener assigned
*/
this.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
}
});
}
}
private boolean hasOnClickListener() {
try {
if (android.os.Build.VERSION.SDK_INT >= 14) {
//the information is inside ListenerInfo
java.lang.reflect.Field listenerInfoField = null;
listenerInfoField = Class.forName("android.view.View").getDeclaredField("mListenerInfo");
if (listenerInfoField != null)
listenerInfoField.setAccessible(true);
Object mOnClickListener = null;
mOnClickListener = listenerInfoField.get(this); //get from view object, in this case this is this
// get the field mOnClickListener, that holds the listener and cast it to a listener
java.lang.reflect.Field listenerField = null;
listenerField = Class.forName("android.view.View$ListenerInfo").getDeclaredField("mOnClickListener");
//View.OnClickListener myListener = (View.OnClickListener) listenerField.get(myLiObject);
return (listenerField.get(mOnClickListener) != null);
}
else {
//directly in View
java.lang.reflect.Field f = Class.forName("android.view.View").getDeclaredField("mOnClickListener");
return (f.get(this) != null);
}
}
catch (Exception e) {
return false;
}
}
// private void setScaledDownImage(int resID) {
// this.setImageBitmap(CommonMethods.decodeSampledBitmapFromResource(getContext(), resID, imageWidth, imageHeight));
// }
public boolean isChecked() {
return (isChecked && this.isEnabled());
}
public void setChecked(boolean isChecked) {
if (this.isEnabled())
setCheckedIgnoreEnabled(isChecked);
else
this.setEnabled(false);
}
public void setCheckedIgnoreEnabled(boolean isChecked) {
if ((this.isChecked != isChecked) || !isImageShown) {
this.isChecked = isChecked;
if (isChecked)
setImageResource(checkboxOnResID);
else
setImageResource(checkboxOffResID);
}
}
@Override
public void setEnabled(boolean enabled) {
super.setEnabled(enabled);
if (enabled)
setChecked(isChecked);
else {
int resID = isChecked ? checkboxDisabledOnResID : checkboxDisabledOffResID;
setImageResource(resID);
}
}
public void setCheckedAndEnabled(boolean isChecked, boolean isEnabled) {
setCheckedIgnoreEnabled(isChecked);
setEnabled(isEnabled);
}
public void toggle() {
setChecked(!isChecked);
}
public void toggleWithClick() {
toggle();
this.performClick();
}
public void toggleWithSilentClick() {
//v.playSoundEffect(android.view.SoundEffectConstants.CLICK);
boolean currentState = this.isSoundEffectsEnabled();
this.setSoundEffectsEnabled(false);
toggleWithClick();
this.setSoundEffectsEnabled(currentState);
}
private class CboxTouchListener extends ImageBoundClickListener {
@Override
public void doSomething() {
super.doSomething();
setChecked(!isChecked);
CustomCheckBox.this.performClick();
}
}
private class NormalClickListener extends MyOnClickListener {
@Override
public void doSomething() {
super.doSomething();
setChecked(!isChecked);
CustomCheckBox.this.performClick();
}
}
@Override
protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
super.onMeasure(widthMeasureSpec, heightMeasureSpec);
//this.imageWidth = MeasureSpec.getSize(widthMeasureSpec);
//this.imageHeight = MeasureSpec.getSize(heightMeasureSpec);
if (!isImageShown) {
setChecked(isChecked);
isImageShown = !isImageShown;
}
}
}
I used this click listener, since I was using some round images and I didn't want to trigger the click events when the user clicked on the transparent part of the image.
public abstract class ImageBoundClickListener implements android.view.View.OnTouchListener {
private String TAG = "ImageBoundsTouchListener";
private boolean shouldTriggerAction = false;
Rect allowedArea;
@Override
public boolean onTouch(View v, MotionEvent event) {
//lets us detect only clicks on the part where the actual image is, and not the whole ImageView area
switch (event.getAction()) {
case MotionEvent.ACTION_DOWN:
float[] eventXY = new float[] { event.getX(), event.getY() };
allowedArea = new Rect(v.getLeft(), v.getTop(), v.getRight(), v.getBottom());
android.graphics.Matrix invertMatrix = new android.graphics.Matrix();
ImageView iv = (ImageView) v;
iv.getImageMatrix().invert(invertMatrix);
invertMatrix.mapPoints(eventXY);
int x = (int) eventXY[0];
int y = (int) eventXY[1];
Drawable imgDrawable = iv.getDrawable();
Bitmap bitmap = ((BitmapDrawable) imgDrawable).getBitmap();
//Limit x,y within the bitmap
if (x < 0)
x = 0;
else if (x > (bitmap.getWidth() - 1))
x = bitmap.getWidth() - 1;
if (y < 0)
y = 0;
else if (y > bitmap.getHeight() - 1)
y = bitmap.getHeight() - 1;
int touchedRGB = bitmap.getPixel(x, y);
//is transparent?
shouldTriggerAction = (touchedRGB == 0) ? false : true;
return true;
case MotionEvent.ACTION_MOVE:
if (!allowedArea.contains(v.getLeft() + (int) event.getX(), v.getTop() + (int) event.getY()))
//the finger moved out of the allowed area
shouldTriggerAction = false;
return true;
case MotionEvent.ACTION_UP:
//finger is no longer on screen
if (shouldTriggerAction)
doSomething();
return true;
default:
return false;
}
}
public void doSomething() {
}
}
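To get the single-selection behaviour asked about in the question with this class, you can keep a reference to the currently checked item and un-check it whenever a different one is clicked. A minimal sketch (the field currentlySelected, the helper name makeSingleSelectable, and the use of CustomCheckBox items inflated from XML instead of plain ImageViews are assumptions, not part of the original code):
private CustomCheckBox currentlySelected;

// call this for every CustomCheckBox in the horizontal list
private void makeSingleSelectable(final CustomCheckBox item) {
    item.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {
            if (currentlySelected != null && currentlySelected != item) {
                currentlySelected.setChecked(false); // un-select the previously selected image
            }
            item.setChecked(true);                   // keep the clicked image selected
            currentlySelected = item;
        }
    });
}
Since the class above only reads its custom attributes in the (Context, AttributeSet) constructor, the items are assumed to be inflated from an XML layout rather than created with new.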