I'm making an Android application that serves as a house guard. For now it works like this: after pressing a button it takes a picture without an intent, compares it with the picture taken last time, and if there's a significant difference it saves it, sends an SMS and uploads the picture using FTP. I'm new to Java and Android...
@Nullable
@Override
public View onCreateView(@NonNull LayoutInflater inflater,
@Nullable ViewGroup container,
@Nullable Bundle savedInstanceState) {
[...]
//Take a picture
view.findViewById(R.id.capture_btn).setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
//Take picture using the camera without preview.
//...wanted to put a for loop here but it didn't work
takePicture();
}
});
return view;
}
// ----------------------------------------------------
protected void takePicture() {
if (mCameraPreview != null) {
if (mCameraPreview.isSafeToTakePictureInternal()) {
mCameraPreview.takePictureInternal();
}
} else {
throw new RuntimeException("Background camera not initialized. Call startCamera() to initialize the camera.");
}
}
// ----------------------------------------------------
@Override
public void onImageCapture(@NonNull File imageFile) {
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPreferredConfig = Bitmap.Config.RGB_565;
Bitmap bitmap1 = BitmapFactory.decodeFile(imageFile.getAbsolutePath(), options);
// here it compares the new picture with the previous one
[...]
// if the comparison detects a significant difference, raise the alarm
new FtpTask().execute(path);
sendSMS("767555444", "ALARM ALARM");
}
// ----------------------------------------------------
void takePictureInternal() {
safeToTakePicture = false;
if (mCamera != null) {
mCamera.takePicture(null, null, new Camera.PictureCallback() {
@Override
public void onPictureTaken(final byte[] bytes, Camera camera) {
new Thread(new Runnable() {
@Override
public void run() {
//Convert byte array to bitmap
Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
//Rotate the bitmap
Bitmap rotatedBitmap;
if (mCameraConfig.getImageRotation() != CameraRotation.ROTATION_0) {
rotatedBitmap = HiddenCameraUtils.rotateBitmap(bitmap, mCameraConfig.getImageRotation());
//noinspection UnusedAssignment
bitmap = null;
} else {
rotatedBitmap = bitmap;
}
//Save image to the file.
if (HiddenCameraUtils.saveImageFromFile(rotatedBitmap,
mCameraConfig.getImageFile(),
mCameraConfig.getImageFormat())) {
//Post image file to the main thread
new android.os.Handler(Looper.getMainLooper()).post(new Runnable() {
@Override
public void run() {
mCameraCallbacks.onImageCapture(mCameraConfig.getImageFile());
}
});
} else {
//Post error to the main thread
new android.os.Handler(Looper.getMainLooper()).post(new Runnable() {
@Override
public void run() {
mCameraCallbacks.onCameraError(CameraError.ERROR_IMAGE_WRITE_FAILED);
}
});
}
mCamera.startPreview();
safeToTakePicture = true;
}
}).start();
}
});
} else {
mCameraCallbacks.onCameraError(CameraError.ERROR_CAMERA_OPEN_FAILED);
safeToTakePicture = true;
}
}
What I want to achieve is that after the button press it starts a loop in which pictures are taken and compared again and again until I stop it by clicking the button. I don't want a new iteration to start while the previous one is still being processed (the pixel-by-pixel image comparison takes time). To be more precise, where exactly should I add a loop to make this work?
Sorry if there is too much or too little code. If I forgot to add something please tell me and I'll upload.
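For reference, the comparison step elided above ("here it compares the new picture with the previous one") is typically a pixel-by-pixel difference. The helper below is only an illustration of one possible version, not the original code; the method name and the threshold are made up, and it assumes both bitmaps have the same dimensions (it uses android.graphics.Color and Bitmap.getPixels()).

// Hypothetical helper: returns the fraction of pixels that differ noticeably.
private static double pixelDifference(Bitmap a, Bitmap b) {
    int width = Math.min(a.getWidth(), b.getWidth());
    int height = Math.min(a.getHeight(), b.getHeight());
    long changed = 0;
    int[] rowA = new int[width];
    int[] rowB = new int[width];
    for (int y = 0; y < height; y++) {
        a.getPixels(rowA, 0, width, 0, y, width, 1);
        b.getPixels(rowB, 0, width, 0, y, width, 1);
        for (int x = 0; x < width; x++) {
            int diff = Math.abs(Color.red(rowA[x]) - Color.red(rowB[x]))
                    + Math.abs(Color.green(rowA[x]) - Color.green(rowB[x]))
                    + Math.abs(Color.blue(rowA[x]) - Color.blue(rowB[x]));
            if (diff > 30) changed++; // per-pixel threshold, tune to taste
        }
    }
    return (double) changed / ((double) width * height);
}

A caller would then treat, say, pixelDifference(oldBitmap, newBitmap) > 0.05 as "a significant difference".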
It sounds like you would be better off with a camera capture session.
See this article - it should help you understand what you can do: https://medium.com/androiddevelopers/understanding-android-camera-capture-sessions-and-requests-4e54d9150295
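Whichever camera API you end up on, the non-overlapping loop described in the question does not need a for loop at all: you can chain the next capture from the end of the completion callback, guarded by a flag that the button toggles. A rough sketch reusing the method names from the question (the monitoring flag is an assumption, not existing code):

private volatile boolean monitoring = false;

//Button toggles monitoring and kicks off the first capture.
view.findViewById(R.id.capture_btn).setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View view) {
        monitoring = !monitoring;
        if (monitoring) {
            takePicture();
        }
    }
});

@Override
public void onImageCapture(@NonNull File imageFile) {
    // ...decode, compare, upload and send the SMS as before...
    // Only once this image is fully processed, schedule the next capture.
    if (monitoring) {
        takePicture();
    }
}

Because the next takePicture() is only called after onImageCapture() finishes, a new capture can never start while the previous comparison is still running.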
Related
When the ImagePager loads for the first time, Picasso sometimes calls onError, showing the .error drawable. If I press the back button and go back to the Activity that has the ImagePager, Picasso loads the picture correctly. If the ImagePager has two or more pictures and I swipe between them, they are sometimes loaded correctly without exiting and re-entering the ImagePager.
It correctly downloads other images from the web. The issue arises when I try to download from our company's hosted server.
I am using Picasso 'com.squareup.picasso:picasso:2.5.0'.
I also referred to the question below but it didn't help:
First time error loading picture with Picasso
Below is my MainActivity.java
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
configureToolbar(R.string.select_task);
init();
}
@Override
protected void init() {
//TODO: For QA Testing Purpose, Remove after Testing
mWidthField =findViewById(R.id.edt_txt_1);
mHeightField =findViewById(R.id.edt_txt_2);
mImage=findViewById(R.id.image_view_2);
mImageLoadButton=findViewById(R.id.image_load_button);
item=new Item();
item.setPrimaryImageURL("https://cdn.cnn.com/cnnnext/dam/assets/190119161516-01-trump-government-shutdown-0119-exlarge-169.jpg");
item.setUpc("0001111086751");
Log.d("ImageManager","Main Activity");
new ImageManager(getApplicationContext()).downloadImage(item.getPrimaryImageURL(),item.getUpc()+".jpeg",imageDownloadedCallBack);
mImageLoadButton.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
loadResizedImage();
}
});
}
//TODO: For QA Testing Purpose, Remove after Testing
ImageManager.ImageDownloadedCallBack imageDownloadedCallBack=new ImageManager.ImageDownloadedCallBack() {
@Override
public void imageDownloadComplete(final Bitmap bitmap, boolean status) {
runOnUiThread(new Runnable() {
@Override
public void run() {
Log.d("ImageManager","Image Download CallBack in Main Activity");
mImage.setImageBitmap(bitmap);
}
});
}
};
Below is my ImageManager.java
public class ImageManager {
private final Context mContext;
private int mWidth;
private int mHeight;
public ImageManager(Context mContext){
this.mContext=mContext;
}
public interface ImageDownloadedCallBack {
void imageDownloadComplete(Bitmap bitmap,boolean status);
}
private Target picassoImageTarget(Context context, final String imageDir, final String imageName,final ImageDownloadedCallBack imageDownloadedCallBack) {
ContextWrapper cw = new ContextWrapper(context);
final File directory = cw.getDir(imageDir, Context.MODE_PRIVATE); // path to /data/data/yourapp/app_imageDir
return new Target() {
@Override
public void onBitmapLoaded(final Bitmap bitmap, Picasso.LoadedFrom from) {
Thread thread=new Thread(new Runnable() {
@Override
public void run() {
final File myImageFile = new File(directory, imageName);
FileOutputStream fos = null;
try {
fos = new FileOutputStream(myImageFile);
bitmap.compress(Bitmap.CompressFormat.PNG, 100, fos);
Log.d("ImageManager","Image DownLoad CallBack");
imageDownloadedCallBack.imageDownloadComplete(bitmap,true);
} catch (IOException e) {
e.printStackTrace();
} finally {
try {
if (fos != null) {
fos.close();
}
} catch (IOException e) {
e.printStackTrace();
}
}
}
});
thread.start();
}
@Override
public void onBitmapFailed(Drawable errorDrawable) {
Log.d("ImageManager","Bitmap Failure");
}
@Override
public void onPrepareLoad(Drawable placeHolderDrawable) {
if (placeHolderDrawable != null) {}
}
};
}
public void downloadImage(String url, String id,ImageDownloadedCallBack imageDownloadedCallBack){
// this.imageDownloadedCallBack=imageDownloadedCallBack;
Log.d("ImageManager","Download Image function");
Picasso.with(mContext).load(url).into(picassoImageTarget(mContext,"imageDir", id ,imageDownloadedCallBack));
}
}
Any help would be appreciated.
I found the answer in an old post and implemented this code in MainActivity. Now it is working well.
final Target target = new Target() {...};
imageView.setTag(target);
Refer to the wrb-answer below for the above code:
wrb-answer
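For context, Picasso holds Targets only by weak reference, so an anonymous Target created inline inside a method call can be garbage-collected before onBitmapLoaded() runs; that matches the "fails on first load, works after going back" symptom. Keeping a strong reference, as the tag trick above does, prevents this. A minimal sketch of the same idea using the ImageManager factory from the question (reusing its names purely for illustration):

//Hold a strong reference to the Target so Picasso's weak reference stays alive.
Target target = picassoImageTarget(mContext, "imageDir", id, imageDownloadedCallBack);
imageView.setTag(target); // the view keeps the Target reachable
Picasso.with(mContext).load(url).into(target);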
I understand how to capture an image by launching the camera app with an intent; however, I would like to know how to do it in the following steps:
display a surfaceview showing the camera preview to the user
when the user presses capture, display the captured image on screen and hide the surfaceview with the camera preview (exactly the same behaviour as Snapchat).
Note: I do not want the camera app to be launched at any time during this process; I want it all to be done within my own app. The problem with my current code is that it launches the camera app when the capture button is pressed. Also, it does not display the taken photo properly; a white screen is shown instead. I currently have this code:
ANDROID ACTIVITY:
public class CameraScreen extends Activity {
private Camera mCamera = null;
private SessionManager session;
private String rand_img;
private ImageView preview_pic;
private CameraPreview mCameraView = null;
static final int CAM_REQUEST = 1;
private RandomString randomString = new RandomString(10);
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_camera_screen);
session = new SessionManager(getApplicationContext());
try{
mCamera = Camera.open();//you can use open(int) to use different cameras
} catch (Exception e){
Log.d("ERROR", "Failed to get camera: " + e.getMessage());
}
if(mCamera != null) {
mCameraView = new CameraPreview(this, mCamera);//create a SurfaceView to show camera data
FrameLayout camera_view = (FrameLayout)findViewById(R.id.camera_view);
camera_view.addView(mCameraView);//add the SurfaceView to the layout
}
//btn to close the application
Button imgClose = (Button)findViewById(R.id.imgClose);
imgClose.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
System.exit(0);
}
});
//btn to logout
Button logout = (Button)findViewById(R.id.imgOpen);
logout.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
session.logOut();
Intent a = new Intent(CameraScreen.this, MainActivity.class);
startActivity(a);
finish();
}
});
//CAPTURE BUTTON
Button snap = (Button) findViewById(R.id.snap);
snap.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
Intent capture = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
capture.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(getFile()));
startActivityForResult(capture,CAM_REQUEST);
}
});
}
@Override
protected void onPause() {
super.onPause();
if (mCamera != null) {
mCamera.setPreviewCallback(null);
mCameraView.getHolder().removeCallback(mCameraView);
mCamera.release();
}
}
@Override
public void onResume() {
super.onResume();
// Get the Camera instance as the activity achieves full user focus
if (mCamera == null) {
initializeCamera(); // Local method to handle camera initialization
}
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
String path = "sdcard/city_life_pic/" + rand_img;
preview_pic = (ImageView) findViewById(R.id.picturedisplay);
FrameLayout camera_view = (FrameLayout)findViewById(R.id.camera_view);
camera_view.setVisibility(View.GONE);
preview_pic.setVisibility(View.VISIBLE);
preview_pic.setImageDrawable(Drawable.createFromPath(path));
}
protected void initializeCamera(){
// Get an instance of Camera Object
try{
mCamera = Camera.open();//you can use open(int) to use different cameras
} catch (Exception e){
Log.d("ERROR", "Failed to get camera: " + e.getMessage());
}
if(mCamera != null) {
mCameraView = new CameraPreview(this, mCamera);//create a SurfaceView to show camera data
FrameLayout camera_view = (FrameLayout)findViewById(R.id.camera_view);
camera_view.addView(mCameraView);//add the SurfaceView to the layout
}
}
private File getFile() {
File folder = new File("sdcard/city_life_pic");
if (!folder.exists()) {
folder.mkdir();
}
rand_img = randomString.nextString() + ".jpg";
File image = new File(folder,rand_img);
return image;
}
}
CAMERA CLASS:
public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback{
private SurfaceHolder mHolder;
private Camera mCamera;
public CameraPreview(Context context, Camera camera){
super(context);
mCamera = camera;
mCamera.setDisplayOrientation(90);
//get the holder and set this class as the callback, so we can get camera data here
mHolder = getHolder();
mHolder.addCallback(this);
mHolder.setType(SurfaceHolder.SURFACE_TYPE_NORMAL);
}
@Override
public void surfaceCreated(SurfaceHolder surfaceHolder) {
try{
//when the surface is created, we can set the camera to draw images in this surfaceholder
mCamera.setPreviewDisplay(surfaceHolder);
mCamera.startPreview();
} catch (IOException e) {
Log.d("ERROR", "Camera error on surfaceCreated " + e.getMessage());
}
}
@Override
public void surfaceChanged(SurfaceHolder surfaceHolder, int i, int i2, int i3) {
//before changing the application orientation, you need to stop the preview, rotate and then start it again
if(mHolder.getSurface() == null)//check if the surface is ready to receive camera data
return;
try{
mCamera.stopPreview();
} catch (Exception e){
//ignore: this happens if you try to stop the preview when it isn't running
}
//now, recreate the camera preview
try{
mCamera.setPreviewDisplay(mHolder);
mCamera.startPreview();
} catch (IOException e) {
Log.d("ERROR", "Camera error on surfaceChanged " + e.getMessage());
}
}
@Override
public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
//our app has only one screen, so we'll release the camera when the surface is destroyed
//if you are using more screens, please move this code to your activity
mCamera.stopPreview();
mCamera.release();
}
}
You are opening the device's camera app by using this code in your capture button's click listener:
Intent capture = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
capture.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(getFile()));
startActivityForResult(capture, CAM_REQUEST);
Instead, to take a picture with your custom camera preview, use the Camera#takePicture method.
That would make your code
snap.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
mCamera.takePicture(....) //set parameters based on what you need
}
});
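For completeness, a sketch of what such a call could look like for the flow the question describes (hide the live preview, show the captured frame). The view ids are the ones from the question's layout; the callback body is an illustration rather than a drop-in fix:

snap.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        mCamera.takePicture(null, null, new Camera.PictureCallback() {
            @Override
            public void onPictureTaken(byte[] data, Camera camera) {
                //Decode the JPEG bytes delivered by the camera.
                Bitmap captured = BitmapFactory.decodeByteArray(data, 0, data.length);
                //Hide the live preview and show the still image instead.
                findViewById(R.id.camera_view).setVisibility(View.GONE);
                ImageView preview = (ImageView) findViewById(R.id.picturedisplay);
                preview.setVisibility(View.VISIBLE);
                preview.setImageBitmap(captured);
            }
        });
    }
});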
Alright, so I am working on an app that has to take pictures and send them to a server for processing. I need this for some image recognition that will eventually help control a robot. Basically I need to use the Android device as a webcam that sends pictures. I figured out the sockets part, but after fiddling with some code for a few days I ended up with this:
public class MainActivity extends AppCompatActivity {
public static final String dTag = "DBG";
@SuppressWarnings("deprecation")
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
Button trg = (Button)findViewById(R.id.trigger_btn);
trg.setOnClickListener(new Button.OnClickListener(){
@Override
public void onClick(View v)
{
SurfaceHolder mSurfaceHolder = new SurfaceHolder() {
@Override
public void addCallback(Callback callback) {
}
@Override
public void removeCallback(Callback callback) {
}
@Override
public boolean isCreating() {
return false;
}
@Override
public void setType(int type) {
}
@Override
public void setFixedSize(int width, int height) {
}
@Override
public void setSizeFromLayout() {
}
@Override
public void setFormat(int format) {
}
@Override
public void setKeepScreenOn(boolean screenOn) {
}
@Override
public Canvas lockCanvas() {
return null;
}
@Override
public Canvas lockCanvas(Rect dirty) {
return null;
}
@Override
public void unlockCanvasAndPost(Canvas canvas) {
}
@Override
public Rect getSurfaceFrame() {
return null;
}
@Override
public Surface getSurface() {
return null;
}
};
Camera.PictureCallback mCall = new Camera.PictureCallback()
{
@Override
public void onPictureTaken(byte[] data, Camera camera)
{
ByteArrayInputStream inputStream = new ByteArrayInputStream(data);
Bitmap bitmap = BitmapFactory.decodeStream(inputStream);
ImageView imW = (ImageView)findViewById(R.id.imView);
imW.setImageBitmap(bitmap);
Log.d(dTag, "" + data.length);
}
};
mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
Camera mCamera;
mCamera = Camera.open();
try {
mCamera.setPreviewDisplay();
mCamera.startPreview();
}
catch(IOException e){
Log.d(dTag, "Cam is null!");
}
mCamera.takePicture(null, null, mCall);
mCamera.stopPreview();
mCamera.release();
}
});
}
Now whenever I press the button I see this in the debug log: "D/Camera: app passed NULL surface". I assume this is because mSurfaceHolder isn't properly created. If anyone could point out what the problem is and how to solve it I would be grateful, since I don't have a very good understanding of Java and can't seem to find anything that works on the internet.
Several problems:
You have no SurfaceView, and the SurfaceHolder must be obtained from a SurfaceView; you cannot just create a new one for this purpose.
You did not pass any SurfaceHolder to mCamera.setPreviewDisplay(), so the system cannot decide where to display the preview.
Your method-local anonymous inner class is simply wrong.
Tutorial link: http://examples.javacodegeeks.com/android/core/ui/surfaceview/android-surfaceview-example/
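To make those points concrete, a rough sketch of the corrected wiring, set up once (e.g. in onCreate) rather than inside the click listener. The surface_view id is assumed to exist in the layout and mCamera is assumed to be promoted to a field; this is an illustration, not the original code:

//A real SurfaceView from the layout provides the SurfaceHolder.
SurfaceView surfaceView = (SurfaceView) findViewById(R.id.surface_view);
SurfaceHolder holder = surfaceView.getHolder();
holder.addCallback(new SurfaceHolder.Callback() {
    @Override
    public void surfaceCreated(SurfaceHolder h) {
        try {
            mCamera = Camera.open();
            mCamera.setPreviewDisplay(h); //tell the camera where to draw the preview
            mCamera.startPreview();
        } catch (IOException e) {
            Log.d(dTag, "setPreviewDisplay failed: " + e.getMessage());
        }
    }
    @Override
    public void surfaceChanged(SurfaceHolder h, int format, int width, int height) {
    }
    @Override
    public void surfaceDestroyed(SurfaceHolder h) {
        if (mCamera != null) {
            mCamera.stopPreview();
            mCamera.release();
            mCamera = null;
        }
    }
});
//Later, e.g. in the button's click listener, take the picture:
//mCamera.takePicture(null, null, mCall);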
Hello, I have the whole code, but I want to save the snaps automatically and release the camera back to the preview. I don't know how to do that :/ It's taking the snapshot but doesn't save it or release the camera.
Thanks for the help guys!!
package com.velcisribeiro.xcamera;
+imports
public class MainActivity extends Activity {
private Camera cameraObject;
private ShowCamera showCamera;
private ImageView pic;
public static Camera isCameraAvailiable(){
Camera object = null;
try {
object = Camera.open();
}
catch (Exception e){
}
return object;
}
private PictureCallback capturedIt = new PictureCallback() {
@Override
public void onPictureTaken(byte[] data, Camera camera) {
Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
if(bitmap==null){
Toast.makeText(getApplicationContext(), "not taken", Toast.LENGTH_SHORT).show();
}
else
{
Toast.makeText(getApplicationContext(), "taken", Toast.LENGTH_SHORT).show();
}
cameraObject.release();
}
};
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_camera);
cameraObject = isCameraAvailiable();
showCamera = new ShowCamera(this, cameraObject);
FrameLayout preview = (FrameLayout) findViewById(R.id.camera_preview);
preview.addView(showCamera);
}
public void snapIt(View view){
cameraObject.takePicture(null, null, capturedIt);
}
}
And the other one is:
public class ShowCamera extends SurfaceView implements SurfaceHolder.Callback {
private SurfaceHolder holdMe;
private Camera theCamera;
public ShowCamera(Context context,Camera camera) {
super(context);
theCamera = camera;
holdMe = getHolder();
holdMe.addCallback(this);
}
@Override
public void surfaceChanged(SurfaceHolder arg0, int arg1, int arg2, int arg3) {
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
try {
theCamera.setPreviewDisplay(holder);
theCamera.setDisplayOrientation(90);
theCamera.startPreview();
} catch (IOException e) {
}
}
@Override
public void surfaceDestroyed(SurfaceHolder arg0) {
}
}
When I was building my own camera implementation, I just used the code provided by the Zxing library. It works really well and you can easily modify it to do what you'd like:
https://github.com/zxing/zxing
You need to add the following two lines in the surfaceDestroyed callback to release the camera.
theCamera.stopPreview();
theCamera.release();
And to save the image, change the onPictureTaken callback:
public void onPictureTaken(byte[] data, Camera camera) {
Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
if(bitmap==null){
Toast.makeText(getApplicationContext(), "not taken", Toast.LENGTH_SHORT).show();
}
else
{
Toast.makeText(getApplicationContext(), "taken", Toast.LENGTH_SHORT).show();
}
//Add code to save image
cameraObject.release();
}
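As an illustration of the missing save step, inside onPictureTaken (in place of the //Add code to save image comment) something like the following would write the JPEG bytes to the app's external files directory and then resume the preview. The file naming and the startPreview() call are assumptions, not part of the original answer; takePicture() stops the preview, so startPreview() is needed if you want to keep shooting instead of releasing the camera:

File picture = new File(getExternalFilesDir(Environment.DIRECTORY_PICTURES),
        "snap_" + System.currentTimeMillis() + ".jpg");
FileOutputStream out = null;
try {
    out = new FileOutputStream(picture);
    out.write(data); //raw JPEG bytes from the callback
    out.flush();
} catch (IOException e) {
    Log.e("Camera", "Could not save picture", e);
} finally {
    if (out != null) {
        try { out.close(); } catch (IOException ignored) { }
    }
}
//takePicture() stops the preview; restart it to keep the live view running.
camera.startPreview();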
Also have a look at the following URL for a better understanding:
http://androidtrainningcenter.blogspot.in/2012/01/how-to-use-android-camera-to-take.html.
I've seen a lot of old questions about this; maybe by now there are some solutions.
I want to take a screenshot of the current frame of my VideoView. The VideoView shows a real-time video from an RTSP stream.
I tried to grab a bitmap, but it's always black:
public static Bitmap loadBitmapFromView(View v) {
Bitmap b = Bitmap.createBitmap(v.getLayoutParams().width , v.getLayoutParams().height, Bitmap.Config.ARGB_8888);
Canvas c = new Canvas(b);
v.layout(0, 0, v.getLayoutParams().width, v.getLayoutParams().height);
v.draw(c);
return b;
}
EDIT:
MediaMetadataRetriever does not work with a stream URL; it may work with a video file.
Using the lib at this link (a wrapper of MediaMetadataRetriever that enables RTSP protocol input) I can save a frame of the video, but there is a delay of about 10 seconds with respect to the real-time VideoView, because it must create a new connection to the streaming server.
I haven't tested ThumbnailUtils, but from the API docs I understand that the input can only be a file path.
A VideoView renders through a SurfaceView, whose content is composited outside the normal View drawing pass, which is why drawing the view to a Canvas gives a black bitmap. Use TextureView instead of VideoView: TextureView has a getBitmap() method. Here is an example of using a TextureView as a video view:
public class TextureVideoActivity extends Activity implements TextureView.SurfaceTextureListener {
private static final String FILE_NAME = "myVideo.mp4";
private static final String TAG = TextureVideoActivity.class.getName();
private MediaPlayer mMediaPlayer;
private TextureView mPreview;
@Override
public void onCreate(Bundle icicle) {
super.onCreate(icicle);
setContentView(R.layout.activity_texture_video);
mPreview = (TextureView) findViewById(R.id.textureView);
mPreview.setSurfaceTextureListener(this);
}
public Bitmap getBitmap(){
return mPreview.getBitmap();
}
public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height) {
Surface surface = new Surface(surfaceTexture);
try {
mMediaPlayer = new MediaPlayer();
mMediaPlayer
.setDataSource(this, Uri.parse(FILE_NAME));
mMediaPlayer.setSurface(surface);
mMediaPlayer.setLooping(true);
// don't forget to call MediaPlayer.prepareAsync() method when you use constructor for
// creating MediaPlayer
mMediaPlayer.prepareAsync();
// Play video when the media source is ready for playback.
mMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mediaPlayer) {
mediaPlayer.start();
}
});
} catch (IllegalArgumentException e) {
Log.d(TAG, e.getMessage());
} catch (SecurityException e) {
Log.d(TAG, e.getMessage());
} catch (IllegalStateException e) {
Log.d(TAG, e.getMessage());
} catch (IOException e) {
Log.d(TAG, e.getMessage());
}
}
@Override
protected void onDestroy() {
super.onDestroy();
if (mMediaPlayer != null) {
// Make sure we stop video and release resources when activity is destroyed.
mMediaPlayer.stop();
mMediaPlayer.release();
mMediaPlayer = null;
}
}
@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surfaceTexture, int i, int i2) {
}
@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) {
return false;
}
@Override
public void onSurfaceTextureUpdated(SurfaceTexture surfaceTexture) {
}
}
Playing video on TextureView
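For example, grabbing and saving the current frame could then look roughly like this (e.g. from onCreate of the activity above; the button id, file name and PNG format are assumptions for illustration):

findViewById(R.id.capture_frame_button).setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        //Current video frame taken straight from the TextureView.
        Bitmap frame = mPreview.getBitmap();
        if (frame == null) {
            return; //surface not available yet
        }
        File out = new File(getExternalFilesDir(Environment.DIRECTORY_PICTURES), "frame.png");
        FileOutputStream fos = null;
        try {
            fos = new FileOutputStream(out);
            frame.compress(Bitmap.CompressFormat.PNG, 100, fos);
        } catch (IOException e) {
            Log.d(TAG, "Could not save frame: " + e.getMessage());
        } finally {
            if (fos != null) {
                try { fos.close(); } catch (IOException ignored) { }
            }
        }
    }
});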