I am new to Android development and I am trying to use the camera with a SurfaceTexture. The onFrameAvailable() callback is never invoked. Please suggest a solution; the code is below.
What is missing here? I am not sure I have made the correct call to setOnFrameAvailableListener().
package com.example.cameratest;
import com.example.test.R;
import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;
import android.view.Menu;
import android.view.View;
import android.graphics.SurfaceTexture;
import android.graphics.SurfaceTexture.OnFrameAvailableListener;
import android.hardware.Camera;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import android.opengl.*;
import android.util.Log;
import android.view.Surface;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.util.concurrent.locks.ReentrantLock;
public class MainActivity extends Activity implements OnFrameAvailableListener {
private static final String TAG = "CameraToMpegTest";
private static final boolean VERBOSE = true; // lots of logging
private static final long DURATION_SEC = 8;
// camera state
private Camera mCamera;
private static SurfaceTexture mSurfaceTexture;
private int[] mGlTextures = null;
private Object mFrameSyncObject = new Object();
private boolean mFrameAvailable = false;
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
}
public void startCamera(View v) {
try {
this.initCamera(0);
this.StartCamera();
} catch (Throwable throwable) {
throwable.printStackTrace();
}
}
private void StartCamera() {
try {
mCamera.startPreview();
long startWhen = System.nanoTime();
long desiredEnd = startWhen + DURATION_SEC * 1000000000L;
int frameCount = 0;
while (System.nanoTime() < desiredEnd) {
// Feed any pending encoder output into the muxer.
awaitNewImage();
}
} finally {
// release everything we grabbed
releaseCamera();
}
}
/**
* Stops camera preview, and releases the camera to the system.
*/
private void releaseCamera() {
if (VERBOSE) Log.d(TAG, "releasing camera");
if (mCamera != null) {
mCamera.stopPreview();
mCamera.release();
mCamera = null;
}
}
private void initCamera(int cameraId) {
mCamera = Camera.open(cameraId);
if (mCamera == null) {
Log.d(TAG, "No front-facing camera found; opening default");
mCamera = Camera.open(); // opens first back-facing camera
}
if (mCamera == null) {
throw new RuntimeException("Unable to open camera");
}
Camera.Parameters parms = mCamera.getParameters();
parms.setPreviewSize(640, 480);
mGlTextures = new int[1];
GLES20.glGenTextures(1, mGlTextures, 0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mGlTextures[0]);
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER,
GLES20.GL_NEAREST);
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER,
GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S,
GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T,
GLES20.GL_CLAMP_TO_EDGE);
mSurfaceTexture = new SurfaceTexture(mGlTextures[0]);
try {
mCamera.setPreviewTexture(mSurfaceTexture);
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
mSurfaceTexture.setOnFrameAvailableListener(MainActivity.this);
}
public void awaitNewImage() {
final int TIMEOUT_MS = 4500;
synchronized (mFrameSyncObject) {
while (!mFrameAvailable) {
try {
// Wait for onFrameAvailable() to signal us. Use a timeout to avoid
// stalling the test if it doesn't arrive.
if (VERBOSE) Log.i(TAG, "Waiting for Frame in Thread");
mFrameSyncObject.wait(TIMEOUT_MS);
if (!mFrameAvailable) {
// TODO: if "spurious wakeup", continue while loop
throw new RuntimeException("Camera frame wait timed out");
}
} catch (InterruptedException ie) {
// shouldn't happen
throw new RuntimeException(ie);
}
}
mFrameAvailable = false;
}
}
@Override
public void onFrameAvailable(SurfaceTexture st) {
if (VERBOSE) Log.d(TAG, "new frame available");
synchronized (mFrameSyncObject) {
if (mFrameAvailable) {
throw new RuntimeException("mFrameAvailable already set, frame could be dropped");
}
mFrameAvailable = true;
mFrameSyncObject.notifyAll();
}
}
}
I think you have to call SurfaceTexture.updateTexImage() after your onFrameAvailable() callback fires, to tell the camera "I've used your last frame, give me another one".
@Override
public void onFrameAvailable(SurfaceTexture surfaceTexture) {
...
surfaceTexture.updateTexImage();
}
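For the code in the question, a minimal sketch of where that call could go (an assumption: updateTexImage() must run on the thread that owns the GL context the texture was created with):
while (System.nanoTime() < desiredEnd) {
    awaitNewImage();                  // blocks until onFrameAvailable() signals
    mSurfaceTexture.updateTexImage(); // latch the frame and free it for the camera
}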
I had the same problem; it turned out I had forgotten to call updateTexImage().
Use the method setOnFrameAvailableListener(@Nullable final OnFrameAvailableListener listener, @Nullable Handler handler) instead of setOnFrameAvailableListener(@Nullable OnFrameAvailableListener listener).
In your case, you can modify the code as:
HandlerThread frameUpdateThread = new HandlerThread("frameUpdateThread");
frameUpdateThread.start();
mSurfaceTexture.setOnFrameAvailableListener(MainActivity.this,
        new Handler(frameUpdateThread.getLooper()));
In my understanding, onFrameAvailable() should be handled on a separate thread: in the code above the main thread is blocked inside StartCamera()'s wait loop, so a listener attached to the main looper can never run. With a dedicated thread I do not face the issue; also make sure updateTexImage() is called after each frame is received.
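A minimal end-to-end sketch combining the two answers (assumptions: API 21+ for the two-argument overload, and the texture's GL context is current on the handler thread, since updateTexImage() must run on the thread that owns it):
HandlerThread frameThread = new HandlerThread("frameThread");
frameThread.start();
// callbacks now arrive on frameThread instead of the blocked main thread
mSurfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
    @Override
    public void onFrameAvailable(SurfaceTexture st) {
        st.updateTexImage(); // consume the frame so the next one can arrive
    }
}, new Handler(frameThread.getLooper()));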
Related
I am new to Android and I am trying to make a custom camera app. I can open a custom camera and take videos, but I have failed to modify the camera's raw data. I want to add a live filter to my camera, such as black & white. I used this method to change the data:
final Camera.PreviewCallback callback = new Camera.PreviewCallback() {
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
//Do your processing here... Use the byte[] called "data"
int pixelCount = width * height;
int[] out = new int[pixelCount];
for (int i = 0; i < pixelCount; ++i) {
int luminance = data[i] & 0xFF;
out[i] = Color.argb(0xFF, luminance, luminance, luminance);
data[i] = (byte) out[i];
}
}
};
mCamera.setPreviewCallback(callback);
But it didn't work, and I am also confused about when I should call the mCamera.setPreviewCallback(callback) method.
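For reference, a hedged sketch of a grayscale conversion that matches the actual preview layout (an assumption: NV21, the default preview format, stores a width*height luminance plane followed by interleaved chroma bytes; note that modifying the callback's buffer does not change what is drawn on screen, so the filtered frames still have to be rendered separately):
final Camera.PreviewCallback callback = new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        int ySize = width * height; // the Y (luminance) plane comes first
        for (int i = ySize; i < data.length; i++) {
            data[i] = (byte) 128;   // neutral chroma -> grayscale frame
        }
        // render or encode the filtered buffer here
    }
};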
For better understanding I am giving my entire code here.
MainActivity.java
package com.example.msi.magiccamera;
import android.content.Context;
import android.graphics.Color;
import android.hardware.Camera;
import android.media.CamcorderProfile;
import android.media.MediaRecorder;
import android.net.Uri;
import android.os.Environment;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.util.DisplayMetrics;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.widget.Button;
import android.widget.FrameLayout;
import android.widget.Toast;
import java.io.File;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;
import static android.content.ContentValues.TAG;
public class MainActivity extends AppCompatActivity {
private Camera mCamera;
private CameraPreview mPreview;
private MediaRecorder mediaRecorder;
public static final int MEDIA_TYPE_IMAGE = 1;
public static final int MEDIA_TYPE_VIDEO = 2;
private boolean isRecording = false;
private SurfaceHolder mholder;
private Button captureButton;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
// Create an instance of Camera
mCamera = getCameraInstance();
// Create our Preview view and set it as the content of our activity.
mPreview = new CameraPreview(this, mCamera);
FrameLayout preview = (FrameLayout) findViewById(R.id.camera_preview);
DisplayMetrics displayMetrics = new DisplayMetrics();
getWindowManager().getDefaultDisplay().getMetrics(displayMetrics);
final int height = displayMetrics.heightPixels;
final int width = displayMetrics.widthPixels;
preview.addView(mPreview);
mholder = mPreview.getHolder();
mPreview.surfaceCreated(mholder);
final Camera.PreviewCallback callback = new Camera.PreviewCallback() {
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
//Do your processing here... Use the byte[] called "data"
int pixelCount = width * height;
int[] out = new int[pixelCount];
for (int i = 0; i < pixelCount; ++i) {
int luminance = data[i] & 0xFF;
out[i] = Color.argb(0xFF, luminance, luminance, luminance);
data[i] = (byte) out[i];
}
}
};
//mCamera.setPreviewCallback(callback);
captureButton = findViewById(R.id.captureButton);
captureButton.setOnClickListener(
new View.OnClickListener() {
@Override
public void onClick(View v) {
if (isRecording) {
// stop recording and release camera
mediaRecorder.stop(); // stop the recording
releaseMediaRecorder(); // release the MediaRecorder object
mCamera.lock(); // take camera access back from MediaRecorder
// inform the user that recording has stopped
//setCaptureButtonText("Capture");
captureButton.setText("Capture");
isRecording = false;
} else {
// initialize video camera
if (prepareVideoRecorder()) {
// Camera is available and unlocked, MediaRecorder is prepared,
// now you can start recording
mediaRecorder.start();
//mCamera.setPreviewCallback(callback);
// inform the user that recording has started
//setCaptureButtonText("Stop");
captureButton.setText("Stop");
isRecording = true;
} else {
// prepare didn't work, release the camera
releaseMediaRecorder();
// inform user
}
}
}
private void setCaptureButtonText(String capture) {
}
}
);
}
/** A safe way to get an instance of the Camera object. */
public static android.hardware.Camera getCameraInstance(){
Camera c = null;
try {
c = Camera.open(); // attempt to get a Camera instance
}
catch (Exception e){
// Camera is not available (in use or does not exist)
}
return c; // returns null if camera is unavailable
}
private boolean prepareVideoRecorder(){
mCamera = getCameraInstance();
mediaRecorder = new MediaRecorder();
// Step 1: Unlock and set camera to MediaRecorder
mCamera.unlock();
mediaRecorder.setCamera(mCamera);
// Step 2: Set sources
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
// Step 3: Set a CamcorderProfile (requires API Level 8 or higher)
mediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
// Step 4: Set output file
mediaRecorder.setOutputFile(getOutputMediaFile(MEDIA_TYPE_VIDEO).toString());
// Step 5: Set the preview output
mediaRecorder.setPreviewDisplay(mPreview.getHolder().getSurface());
// Step 6: Prepare configured MediaRecorder
try {
mediaRecorder.prepare();
} catch (IllegalStateException e) {
Log.d(TAG, "IllegalStateException preparing MediaRecorder: " + e.getMessage());
releaseMediaRecorder();
return false;
} catch (IOException e) {
Log.d(TAG, "IOException preparing MediaRecorder: " + e.getMessage());
releaseMediaRecorder();
return false;
}
return true;
}
/** Create a file Uri for saving an image or video */
private static Uri getOutputMediaFileUri(int type){
return Uri.fromFile(getOutputMediaFile(type));
}
/** Create a File for saving an image or video */
private static File getOutputMediaFile(int type){
// To be safe, you should check that the SDCard is mounted
// using Environment.getExternalStorageState() before doing this.
File mediaStorageDir = new File(Environment.getExternalStoragePublicDirectory(
Environment.DIRECTORY_PICTURES), "MyCameraApp");
// This location works best if you want the created images to be shared
// between applications and persist after your app has been uninstalled.
// Create the storage directory if it does not exist
if (! mediaStorageDir.exists()){
if (! mediaStorageDir.mkdirs()){
Log.d("MyCameraApp", "failed to create directory");
return null;
}
}
// Create a media file name
String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
File mediaFile;
if (type == MEDIA_TYPE_IMAGE){
mediaFile = new File(mediaStorageDir.getPath() + File.separator +
"IMG_"+ timeStamp + ".jpg");
} else if(type == MEDIA_TYPE_VIDEO) {
mediaFile = new File(mediaStorageDir.getPath() + File.separator +
"VID_"+ timeStamp + ".mp4");
} else {
return null;
}
return mediaFile;
}
@Override
protected void onPause() {
super.onPause();
releaseMediaRecorder(); // if you are using MediaRecorder, release it first
releaseCamera(); // release the camera immediately on pause event
}
private void releaseMediaRecorder(){
if (mediaRecorder != null) {
mediaRecorder.reset(); // clear recorder configuration
mediaRecorder.release(); // release the recorder object
mediaRecorder = null;
mCamera.lock(); // lock camera for later use
}
}
private void releaseCamera(){
if (mCamera != null){
mCamera.release(); // release the camera for other applications
mCamera = null;
}
}
}
CameraPreview.java
package com.example.msi.magiccamera;
import android.content.Context;
import android.graphics.Color;
import android.hardware.Camera;
import android.util.DisplayMetrics;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.WindowManager;
import java.io.IOException;
import static android.content.ContentValues.TAG;
/** A basic Camera preview class */
public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {
private SurfaceHolder mHolder;
private Camera mCamera;
//New codes
@Override
public SurfaceHolder getHolder() {
return super.getHolder();
}
public CameraPreview(Context context, Camera camera) {
super(context);
mCamera = camera;
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
mHolder = getHolder();
mHolder.addCallback(this);
// deprecated setting, but required on Android versions prior to 3.0
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
public void surfaceCreated(SurfaceHolder holder) {
// The Surface has been created, now tell the camera where to draw the preview.
try {
mCamera.setPreviewDisplay(holder);
mCamera.startPreview();
} catch (IOException e) {
Log.d(TAG, "Error setting camera preview: " + e.getMessage());
} catch (Exception e) {
e.printStackTrace();
}
}
public void surfaceDestroyed(SurfaceHolder holder) {
// empty. Take care of releasing the Camera preview in your activity.
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
// If your preview can change or rotate, take care of those events here.
// Make sure to stop the preview before resizing or reformatting it.
if (mHolder.getSurface() == null){
// preview surface does not exist
return;
}
// stop preview before making changes
try {
mCamera.stopPreview();
} catch (Exception e){
// ignore: tried to stop a non-existent preview
}
// set preview size and make any resize, rotate or
// reformatting changes here
// start preview with new settings
try {
mCamera.setPreviewDisplay(mHolder);
mCamera.startPreview();
} catch (Exception e){
Log.d(TAG, "Error starting camera preview: " + e.getMessage());
}
}
}
I am simply building a video recorder application using Camera, but it crashes and gives me the error below:
Attempt to invoke virtual method
'android.view.SurfaceHolder android.view.SurfaceView.getHolder()' on a
null object reference
CaptureVideoActivity.java
import java.io.File;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;
import android.app.Activity;
import android.content.pm.ActivityInfo;
import android.hardware.Camera;
import android.hardware.camera2.CameraCharacteristics;
import android.media.CamcorderProfile;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.os.Environment;
import android.util.Log;
import android.view.Surface;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.ToggleButton;
import static android.provider.MediaStore.Files.FileColumns.MEDIA_TYPE_IMAGE;
import static android.provider.MediaStore.Files.FileColumns.MEDIA_TYPE_VIDEO;
public class CaptureVideoActivity extends Activity implements SurfaceHolder.Callback {
private MediaRecorder mMediaRecorder;
private Camera mCamera;
private SurfaceView mSurfaceView;
private SurfaceHolder mHolder;
private View mToggleButton;
private boolean mInitSuccesful;
CameraCharacteristics characteristics;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_capture_video);
// we shall take the video in landscape orientation
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
mSurfaceView = (SurfaceView) findViewById(R.id.surfaceView);
mHolder = mSurfaceView.getHolder();
mHolder.addCallback(this);
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
mToggleButton = (ToggleButton) findViewById(R.id.toggleRecordingButton);
mToggleButton.setOnClickListener(new View.OnClickListener() {
@Override
// toggle video recording
public void onClick(View v) {
if (((ToggleButton) v).isChecked()) {
mMediaRecorder.start();
try {
Thread.sleep(7 * 1000); // records for 7 seconds; remove this if you don't want a fixed duration
} catch (Exception e) {
e.printStackTrace();
}
//finish();
} else {
mMediaRecorder.stop();
mMediaRecorder.reset();
try {
initRecorder(mHolder.getSurface());
} catch (IOException e) {
e.printStackTrace();
}
}
}
});
}
/* Init the MediaRecorder, the order the methods are called is vital to
* its correct functioning */
private void initRecorder(Surface surface) throws IOException {
// It is very important to unlock the camera before doing setCamera
// or it will results in a black preview
if (mCamera == null) {
mCamera = Camera.open();
mCamera.unlock();
}
if (mMediaRecorder == null) mMediaRecorder = new MediaRecorder();
mMediaRecorder.setPreviewDisplay(surface);
mMediaRecorder.setCamera(mCamera);
File file = getOutputMediaFile(2);
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
// mMediaRecorder.setOutputFormat(8);
// mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mMediaRecorder.setOutputFile(file.getAbsolutePath());
mMediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_720P));
/*mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
mMediaRecorder.setVideoEncodingBitRate(512 * 1000);
mMediaRecorder.setVideoFrameRate(30);
mMediaRecorder.setVideoSize(640, 480);*/
int deviceorientation = getResources().getConfiguration().orientation;
mMediaRecorder.setOrientationHint(getJpegOrientation(characteristics, deviceorientation));
try {
mMediaRecorder.prepare();
} catch (IllegalStateException e) {
// This is thrown if the previous calls are not called with the
// proper order
e.printStackTrace();
}
mInitSuccesful = true;
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
try {
if (!mInitSuccesful)
initRecorder(mHolder.getSurface());
} catch (IOException e) {
e.printStackTrace();
}
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
shutdown();
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
}
private void shutdown() {
// Release MediaRecorder and especially the Camera as it's a shared
// object that can be used by other applications
mMediaRecorder.reset();
mMediaRecorder.release();
mCamera.release();
// once the objects have been released they can't be reused
mMediaRecorder = null;
mCamera = null;
}
private static File getOutputMediaFile(int type){
// To be safe, you should check that the SDCard is mounted
// using Environment.getExternalStorageState() before doing this.
File mediaStorageDir = new File(Environment.getExternalStoragePublicDirectory(
Environment.DIRECTORY_PICTURES), "Test Capture");
// This location works best if you want the created images to be shared
// between applications and persist after your app has been uninstalled.
// Create the storage directory if it does not exist
if (! mediaStorageDir.exists()){
if (! mediaStorageDir.mkdirs()){
Log.d("MyCameraApp", "failed to create directory");
return null;
}
}
// Create a media file name
String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
File mediaFile;
if (type == MEDIA_TYPE_IMAGE){
mediaFile = new File(mediaStorageDir.getPath() + File.separator +
"IMG_"+ timeStamp + ".jpg");
} else if(type == MEDIA_TYPE_VIDEO) {
mediaFile = new File(mediaStorageDir.getPath() + File.separator +
"VID_"+ timeStamp + ".mp4");
} else {
return null;
}
return mediaFile;
}
private int getJpegOrientation(CameraCharacteristics c, int deviceOrientation) {
if (deviceOrientation == android.view.OrientationEventListener.ORIENTATION_UNKNOWN)
return 0;
int sensorOrientation = c.get(CameraCharacteristics.SENSOR_ORIENTATION);
// Round device orientation to a multiple of 90
deviceOrientation = (deviceOrientation + 45) / 90 * 90;
// Reverse device orientation for front-facing cameras
boolean facingFront = c.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_FRONT;
if (facingFront) deviceOrientation = -deviceOrientation;
// Calculate desired JPEG orientation relative to camera orientation to make
// the image upright relative to the device orientation
return (sensorOrientation + deviceOrientation + 360) % 360;
}
}
activity_capture_video.xml
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:orientation="vertical"
android:layout_width="fill_parent"
android:layout_height="fill_parent">
<ToggleButton
android:id="#+id/toggleRecordingButton"
android:layout_width="fill_parent"
android:textOff="Start Recording"
android:textOn="Stop Recording"
android:layout_height="wrap_content"
/>
<FrameLayout
android:layout_width="fill_parent"
android:layout_height="fill_parent">
<SurfaceView android:id="#+id/surfaceView"
android:layout_width="fill_parent"
android:layout_height="fill_parent"></SurfaceView>
</FrameLayout>
</LinearLayout>
I have already read all of these in detail:
How to solve "android.view.SurfaceView.getHolder()' on a null object reference"?
'android.view.SurfaceHolder android.view.SurfaceView.getHolder()' on a null object reference in SurfaceView
surfaceView.getHolder not returning SurfaceHolder
but they were not useful for me.
I guess you were following some tutorial video or website; if not, please do so now.
Check the following things:
Check the import statements, mainly for SurfaceView, SurfaceHolder and Camera.
Check the element ids in the XML.
Check whether the XML is linked to the Java code by Ctrl+clicking on R.id.surfaceView.
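As a sketch of what the checklist is getting at (an assumption, since the failing layout is not shown: the id used in Java must exist in the layout passed to setContentView(), and setContentView() must run first):
setContentView(R.layout.activity_capture_video); // inflate the layout first
SurfaceView surfaceView = (SurfaceView) findViewById(R.id.surfaceView);
if (surfaceView == null) {
    // the id is missing from activity_capture_video.xml
    throw new IllegalStateException("surfaceView not found in layout");
}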
I'm trying to build an app which will periodically call the camera to take a picture and save it.
My problem is that unless I put the takePicture() call in onCreate(), the response to takePicture() (onPictureTaken) is never called.
I've broken the app down as simply as possible to illustrate.
A class to handle the camera is partially defined as:
public class CameraHandler {
private Camera mCamera;
public CameraHandler(){
    mCamera = Camera.open();
    try {
        // setPreviewTexture() declares IOException, so it must be handled
        mCamera.setPreviewTexture(new SurfaceTexture(10));
    } catch (IOException e) {
        e.printStackTrace();
    }
    mCamera.startPreview();
    mCamera.takePicture(null, null, mPicture);
}
Camera.PictureCallback mPicture = new Camera.PictureCallback() {
@Override
public void onPictureTaken(byte[] data, Camera camera) {
File pictureFile = getOutputMediaFile();
if (pictureFile == null) {
return;
}
try {
FileOutputStream fos = new FileOutputStream(pictureFile);
fos.write(data);
fos.close();
} catch (FileNotFoundException e) {
} catch (IOException e) { }
}
};
Then when I put the following code into MainActivity.java, 'onPictureTaken' IS called when CameraHandler is instantiated.
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
CameraHandler cameraHandler = new CameraHandler();
}
However, instantiating CameraHandler from a click event in MainActivity.java will NOT result in onPictureTaken being called in response to takePicture().
(This snippet is located in MainActivity.java.)
public void onClickListener(View view){
CameraHandler cameraHandler = new CameraHandler();
}
So why is this occurring, and how can I move the call to take a picture into the class where it belongs, rather than into the 'main' of the program?
All help is welcome.
I finally figured out how to make the phone take timed pics without any screen display.
There were two main issues I struggled with. First, I wanted to take the pics without displaying anything on screen. Along those lines, I found an example that used:
mCamera.setPreviewTexture(new SurfaceTexture(10));
and nothing showed on the screen with this preview method; it appears some sort of preview is required. Another approach is to set the preview to a 1-pixel view; online examples of this appeared to work, but I did not try it.
The second and bigger problem had to do with the onPictureTaken() method, which processes your picture after the takePicture() call.
It seemed that no matter what looping method I used, or where the takePicture() call was located, all of the onPictureTaken() callbacks were queued up and invoked one after another once the caller of takePicture() finished.
Although the picture data processed by onPictureTaken() arrived in the proper time sequence, having several hundred pics stored and waiting to be processed could cause problems, so a method was needed where one pic is processed and stored before the next pic is taken.
Along those lines, I stumbled upon AlarmManager and coupled it with the BroadcastReceiver and Future classes to solve the problem.
I set the AlarmManager to go off at a set time or frequency. The BroadcastReceiver captures this call and in turn calls a method which creates a thread where a Future object makes the call to take a picture.
The Future object is nice because it waits for the physical camera to take the picture (takePicture) and then process it (onPictureTaken). This all occurs in one thread, which then terminates, so there is no queue of pics to process and each picture sequence is handled separately.
The code is contained below. Note that some of the default overrides have been left out to save space. Also, the visible screen is basically a button which captures the click event; very basic.
MainActivity.java:
package myTest.com.test;
import android.app.Activity;
import android.app.AlarmManager;
import android.app.PendingIntent;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.location.LocationManager;
import android.os.Bundle;
import android.os.SystemClock;
import android.view.Menu;
import android.view.MenuItem;
import android.view.View;
import android.widget.TextView;
import android.widget.Toast;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
public class MainActivity extends Activity {
CameraHandler cameraHandler;
public BroadcastReceiver br;
public PendingIntent pi;
public AlarmManager am;
final static private long LOOPTIME = 20000;
private static final ExecutorService threadpool = Executors.newFixedThreadPool(3);
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
setup();
}
private void setup() {
try{
cameraHandler = new CameraHandler();
br = new BroadcastReceiver() {
@Override
public void onReceive(Context c, Intent i) {
//Toast.makeText(c, "Taking a pic!", Toast.LENGTH_LONG).show();
TryAThread();
}
};
registerReceiver(br, new IntentFilter("com.timedActivity.activity") );
pi = PendingIntent.getBroadcast(this, 0, new Intent("com.timedActivity.activity"), 0);
am = (AlarmManager)(this.getSystemService( Context.ALARM_SERVICE ));
}
catch (Exception e){ }
}
private void TryAThread() {
try{
CameraCaller cameraCaller = new CameraCaller(cameraHandler);
Future future = threadpool.submit(cameraCaller);
while (!future.isDone()) {
try {
Thread.sleep(5000);
} catch (Exception ex) { }
}
}
catch (Exception e){ }
}
@Override
protected void onDestroy() {
am.cancel(pi);
unregisterReceiver(br);
super.onDestroy();
}
public void onClickListener(View view){
try{
am.setRepeating(am.ELAPSED_REALTIME,SystemClock.elapsedRealtime(), LOOPTIME, pi);
}
catch (Exception e){ }
}
}
CameraCaller.java:
package myTest.com.test;
import java.util.concurrent.Callable;
public class CameraCaller implements Callable {
private CameraHandler cameraHandler;
public CameraCaller(CameraHandler ch){
cameraHandler = ch;
}
@Override
public Object call() throws Exception {
cameraHandler.takeAPic();
return true;
}
}
CameraHandler.java:
package myTest.com.test;
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.os.Environment;
import android.util.Log;
import junit.runner.Version;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;
public class CameraHandler implements Camera.PictureCallback{
private Camera mCamera;
public CameraHandler(){
}
public Boolean takeAPic(){
try{
if (mCamera == null){
mCamera = Camera.open();
mCamera.enableShutterSound(false);
try {
mCamera.setPreviewTexture(new SurfaceTexture(10));
}
catch (IOException e1) {Log.e(Version.id(), e1.getMessage());
}
}
mCamera.startPreview();
mCamera.takePicture(null, null, this);
}
catch (Exception ex){ }
return true;
}
@Override
public void onPictureTaken(byte[] data, Camera camera) {
File pictureFile = getOutputMediaFile();
if (pictureFile == null) {
return;
}
try {
FileOutputStream fos = new FileOutputStream(pictureFile);
fos.write(data);
fos.close();
} catch (FileNotFoundException e) {
} catch (IOException e) { }
try {
Thread.sleep(2000);
}catch (Exception ex){}
}
public static File getOutputMediaFile() {
File file = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES);
File mediaStorageDir = new File(file, "MyCameraApp");
if (!mediaStorageDir.exists()) {
if (!mediaStorageDir.mkdirs()) {
Log.d("MyCameraApp", "failed to create directory");
return null;
}
}
// Create a media file name
String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
File mediaFile;
mediaFile = new File(mediaStorageDir.getPath() + File.separator + "IMG_" + timeStamp + ".jpg");
return mediaFile;
}
}
I want to improve my frame rate. I am sending preview frames from an Android device using the preview callback with buffers and receiving them on the server side. Everything arrives fine, but the frame rate is terribly slow. I am also doing the YUV-to-JPEG conversion on Android; it works, but it makes things terribly slow. Can you please guide me?
Thanks and regards.
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.File;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.OutputStream;
import java.net.Socket;
import java.net.UnknownHostException;
import android.app.Activity;
import android.app.ActivityManager;
import android.app.ActivityManager.RunningServiceInfo;
import android.content.Context;
import android.content.Intent;
import android.content.pm.ActivityInfo;
import android.content.pm.PackageManager;
import android.content.res.Configuration;
import android.hardware.Camera;
import android.hardware.Camera.ErrorCallback;
import android.hardware.Camera.Size;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.hardware.Camera.PreviewCallback;
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.media.CamcorderProfile;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.os.Environment;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceHolder.Callback;
import android.view.SurfaceView;
import android.view.View;
import android.widget.Button;
import android.widget.EditText;
import android.widget.FrameLayout;
import android.widget.Toast;
import android.app.Service;
import android.graphics.ImageFormat;
import android.graphics.PixelFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;
import android.net.Uri;
import android.os.IBinder;
import android.provider.MediaStore;
import android.view.LayoutInflater;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.net.InetAddress;
import java.sql.Date;
import java.text.SimpleDateFormat;
import java.util.List;
import android.widget.TextView;
public class SdfJuliaActivity extends Activity implements SensorEventListener
{
private static String TAG = "SDFJulia";
/*
* WiFi Methods
* ------------
* Please do not delete any method, just adjust them if necessary.
*
* Problems:
* - if server is not available / running and app tries to connect,
* it crashes the emulator. Works fine on smartphone, though.
*
*/
protected static DataOutputStream os;
protected static Socket socket;
private static String address = "192.168.2.102:4444"; //also default IP address
private static String ipAddress;
private static int port;
//Connect to given ip address on given port
protected static boolean connect(String serverName, int port)
{
Log.d(TAG, "Connecting to " + serverName + ":" + port);
try
{
socket = new Socket(serverName,port);
os = new DataOutputStream(socket.getOutputStream());
}
catch (UnknownHostException e)
{
Log.d(TAG, "Unknown Host Exception while connecting: " + e);
return false;
}
catch (IOException e)
{
Log.d(TAG, "IO Exception while connecting: " + e);
return false;
}
if (!socket.isConnected()) { return false; }
return true;
}
//Close the current connection
protected static void disconnect()
{
if (!socket.isClosed())
{
try { socket.close(); }
catch (IOException e) {}
}
}
//Check if we are connected
protected static boolean connected()
{
//Check if socket is open
if (socket.isClosed())
{ return false; }
//Check if socket is actually connected to a remote host
if (!socket.isConnected())
{ return false; }
//We are connected!
return true;
}
//Send a single frame to the server
//YUV-Image
protected static void sendFrame(byte[] data)
{
//Check if a frame is actually available
if (data != null)
{
try
{
os.writeInt(data.length);
os.write(data, 0, data.length);
os.flush();
}
catch (IOException e) {};
}
}
//Send all sensor data to the server
protected static void sendData()
{
try
{
os.writeFloat(orientation_x);
os.writeFloat(orientation_y);
os.writeFloat(orientation_z);
os.writeDouble(gps_longitude);
os.writeDouble(gps_latitude);
os.writeDouble(gps_altitude);
}
catch (IOException e) {};
}
public static String getAddress()
{ return address; };
/*
* Sensor Stuff
* ------------
*/
SensorManager sensorManager = null;
private static float accelerometer_x;
private static float accelerometer_y;
private static float accelerometer_z;
private static float orientation_x;
private static float orientation_y;
private static float orientation_z;
//New sensor data available
public void onSensorChanged(SensorEvent event)
{
synchronized (this)
{
switch (event.sensor.getType())
{
case Sensor.TYPE_ACCELEROMETER:
accelerometer_x = event.values[0];
accelerometer_y = event.values[1];
accelerometer_z = event.values[2];
break;
case Sensor.TYPE_ORIENTATION:
orientation_x = event.values[0];
orientation_y = event.values[1];
orientation_z = event.values[2];
break;
}
}
}
//Sensor accuracy has changed (unused)
public void onAccuracyChanged(Sensor sensor, int accuracy) {}
private static double gps_latitude;
private static double gps_longitude;
private static double gps_altitude;
private static long gps_lastfix;
/*
* Camera Methods
* --------------
*
*/
private Camera mCamera;
SurfaceView mPreview;
private void startVideo()
{
SurfaceHolder videoCaptureViewHolder = null;
try { mCamera = Camera.open(); }
catch (Exception e) { Log.d(TAG,"Can not open camera. In use?"); };
mCamera.setErrorCallback
(
new ErrorCallback()
{ public void onError(int error, Camera camera) {} }
);
Camera.Parameters parameters = mCamera.getParameters();
final int previewWidth = parameters.getPreviewSize().width;
final int previewHeight = parameters.getPreviewSize().height;
parameters.setPreviewFrameRate(30);
mCamera.setParameters(parameters);
parameters = mCamera.getParameters();
Log.d(TAG,"Format: " + parameters.getPreviewFormat());
Log.d(TAG,"FPS: " + parameters.getPreviewFrameRate());
if (null != mPreview)
{ videoCaptureViewHolder = mPreview.getHolder(); }
try { mCamera.setPreviewDisplay(videoCaptureViewHolder); }
catch (Throwable t) {}
//Get Preview Size to set the data buffers to it
Size previewSize=mCamera.getParameters().getPreviewSize();
//Set the Buffer Size according to Preview Size
int dataBufferSize=(int)(previewSize.height*previewSize.width*
(ImageFormat.getBitsPerPixel(mCamera.getParameters().getPreviewFormat())/8.0));
mCamera.addCallbackBuffer(new byte[dataBufferSize]);
mCamera.addCallbackBuffer(new byte[dataBufferSize]);
mCamera.addCallbackBuffer(new byte[dataBufferSize]);
//This method is called for every preview frame
//we use it to transmit both the current preview frame as well as the sensor data
mCamera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback()
{
int fpsCount = 0;
boolean test = true;
public void onPreviewFrame(byte[] data, Camera camera)
{
//Check if connection is present; if not, try to reconnect
if (!connected())
{ connect(ipAddress, port); }
else
{
//Prediction step XXX
//Log.d(TAG, "Transmitting data.");
//Convert image to YUV
/*YuvImage image = new YuvImage(data,ImageFormat.NV21,previewWidth,previewHeight,null);
Rect rect = new Rect(0,0,previewWidth,previewHeight);
ByteArrayOutputStream oas = new ByteArrayOutputStream();
image.compressToJpeg(rect,100,oas);
byte[] imageJPG = oas.toByteArray();*/
//if (test)
//{ Log.d(TAG,"Length: " + imageJPG.length); test = false; };
//Send the current frame to the server
sendFrame(data);
//Send the corresponding sensor data to the server
sendData();
camera.addCallbackBuffer(data);
}
}
});
try { mCamera.startPreview(); }
catch (Throwable e)
{
mCamera.release();
mCamera = null;
return;
}
}
private void stopVideo()
{
if (mCamera == null)
{ return; }
try
{
mCamera.stopPreview();
mCamera.setPreviewDisplay(null);
mCamera.setPreviewCallback(null);
mCamera.release();
}
catch (IOException e)
{
e.printStackTrace();
return;
}
mCamera = null;
}
/* Input Validation
* ----------------
*/
//Check if user specified address is valid
private static boolean checkAddress(String userinput)
{
String [] part = userinput.split("\\:");
if (part.length != 2)
{ return false; }
if (!validIP(part[0]))
{ return false; }
int i = Integer.parseInt(part[1]);
if (i < 1 || i > 10000)
{ return false; }
return true;
}
//Check for valid IP address
private static boolean validIP (String userinput)
{
String [] part = userinput.split("\\.");
if (part.length != 4)
{
Log.d(TAG,"Invalid ip address length.");
return false;
}
for (String s : part)
{
int i = Integer.parseInt(s);
if (i < 0 || i > 255)
{
Log.d(TAG,"Invalid ip address values.");
return false;
}
}
return true;
}
/*
* Activity Creation
* -----------------
*/
@Override
public void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
//Always use portrait view
//Otherwise activity gets destroyed when orientation is changed --> network & video stream is lost
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
setContentView(R.layout.main);
socket = new Socket();
/*
* Sensors
* -------
* This registers our app to receive the latest sensor data.
*/
sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
sensorManager.registerListener(this, sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), sensorManager.SENSOR_DELAY_FASTEST);
sensorManager.registerListener(this, sensorManager.getDefaultSensor(Sensor.TYPE_ORIENTATION), sensorManager.SENSOR_DELAY_FASTEST);
/*
* GPS
* ---
* This registers our app to receive the latest GPS data.
*/
LocationManager locationManager = (LocationManager) this.getSystemService(Context.LOCATION_SERVICE);
LocationListener locationListener = new LocationListener()
{
public void onLocationChanged(Location location)
{
gps_latitude = location.getLatitude();
gps_longitude = location.getLongitude();
gps_altitude = location.getAltitude();
gps_lastfix = System.currentTimeMillis();
}
public void onStatusChanged(String provider, int status, Bundle extras) {}
public void onProviderEnabled(String provider) {}
public void onProviderDisabled(String provider) {}
};
locationManager.requestLocationUpdates(LocationManager.GPS_PROVIDER, 0, 0, locationListener);
final Button serviceButton = (Button) findViewById(R.id.btn_serviceButton);
final EditText ipText = (EditText) findViewById(R.id.text_ipAddress);
ipText.setText(address);
//CameraStuff here first
mPreview = (SurfaceView) findViewById(R.id.cameraView);
SurfaceHolder videoCaptureViewHolder = mPreview.getHolder();
videoCaptureViewHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
videoCaptureViewHolder.addCallback(new Callback()
{
public void surfaceDestroyed(SurfaceHolder holder) { stopVideo(); }
public void surfaceCreated(SurfaceHolder holder) {}
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {}
});
//Click Listener for Button
//starts & stops the service
serviceButton.setOnClickListener(new View.OnClickListener()
{
public void onClick(View v)
{
if (connected())
{
Toast.makeText(getApplicationContext(), "Connection terminated.", Toast.LENGTH_SHORT).show();
serviceButton.setText("Connect");
//Stop the app
stopVideo(); //Stop video & sending data
disconnect(); //Stop network services
}
else
{
address = ipText.getText().toString();
String [] part = ipText.getText().toString().split("\\:");
//Validate input
ipAddress = part[0];
port = Integer.parseInt(part[1]);
//int port = 4444;
if (!checkAddress(address))
{
//Invalid ip address specified
Toast.makeText(getApplicationContext(), "Invalid address specified.", Toast.LENGTH_SHORT).show();
Log.d(TAG,"Invalid address specified.");
}
else
{
if (!connect(ipAddress, port))
{
//Connection failed
Toast.makeText(getApplicationContext(), "Could not connect.", Toast.LENGTH_SHORT).show();
Log.d(TAG,"Could not connect.");
}
else
{
//Connection successful
Log.d(TAG,"Connected.");
Toast.makeText(getApplicationContext(), "Connection successful.", Toast.LENGTH_SHORT).show();
serviceButton.setText("Disconnect");
//Start video & sending data
startVideo();
}
}
}
}
});
}
//Don't do anything if configuration is changed (e.g. phone has been rotated)
@Override
public void onConfigurationChanged(Configuration newConfig) {}
}
You must call getSupportedPreviewFpsRange() on android.hardware.Camera, and you will get the list of FPS ranges the device supports.
Select a proper one and call setPreviewFpsRange().
All of that must be done before the startPreview() call.
You can refer to the SDK guide.
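A minimal sketch of that sequence, assuming the android.hardware.Camera API (per the docs the list is sorted ascending, so the last entry has the highest maximum):
Camera.Parameters params = mCamera.getParameters();
List<int[]> ranges = params.getSupportedPreviewFpsRange();
int[] best = ranges.get(ranges.size() - 1);
params.setPreviewFpsRange(best[Camera.Parameters.PREVIEW_FPS_MIN_INDEX],
        best[Camera.Parameters.PREVIEW_FPS_MAX_INDEX]);
mCamera.setParameters(params);
mCamera.startPreview(); // the range must be applied before this call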
I am trying to build a small photo app with a burst mode for the camera. The main idea is to shoot a picture every 0.3 sec and store the pictures in an array until the last picture is taken; the number of pictures to shoot should be configurable.
Currently the shortest interval at which I can shoot is one picture every 2 sec, and when the pictures are written to storage the application crashes.
Does someone know how to implement this? Here is my code:
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.graphics.PixelFormat;
import android.hardware.Camera;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.Window;
import android.view.WindowManager;
import android.view.View.OnClickListener;
public class CamaeaView extends Activity implements SurfaceHolder.Callback, OnClickListener {
static final int FOTO_MODE = 1;
private static final String TAG = "Test";
Camera mCamera;
boolean mPreviewRunning = false;
private Context mContext = this;
ArrayList<Object> listImage = null;
public static ArrayList<Object> List1[];
public void onCreate(Bundle icicle) {
super.onCreate(icicle);
Log.e(TAG, "onCreate");
#SuppressWarnings("unused")
Bundle extras = getIntent().getExtras();
getWindow().setFormat(PixelFormat.TRANSLUCENT);
requestWindowFeature(Window.FEATURE_NO_TITLE);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
WindowManager.LayoutParams.FLAG_FULLSCREEN);
setContentView(R.layout.main);
mSurfaceView = (SurfaceView) findViewById(R.id.surface_camera);
mSurfaceView.setOnClickListener(this);
mSurfaceHolder = mSurfaceView.getHolder();
mSurfaceHolder.addCallback(this);
mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
@Override
protected void onRestoreInstanceState(Bundle savedInstanceState) {
super.onRestoreInstanceState(savedInstanceState);
}
Camera.PictureCallback mPictureCallback = new Camera.PictureCallback() {
public void onPictureTaken(byte[] imageData, Camera c) {
Log.d("Start: ","Imagelist");
Intent mIntent = new Intent();
listImage.add(StoreByteImage(imageData));
SaveImage(listImage);
mCamera.startPreview();
setResult(FOTO_MODE, mIntent);
finish();
}
};
protected void onResume() {
Log.d(TAG, "onResume");
super.onResume();
}
protected void onSaveInstanceState(Bundle outState) {
super.onSaveInstanceState(outState);
}
protected void onStop() {
Log.e(TAG, "onStop");
super.onStop();
}
public void surfaceCreated(SurfaceHolder holder) {
Log.d(TAG, "surfaceCreated");
mCamera = Camera.open();
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
Log.d(TAG, "surfaceChanged");
// XXX stopPreview() will crash if preview is not running
if (mPreviewRunning) {
mCamera.stopPreview();
}
Camera.Parameters p = mCamera.getParameters();
p.setPreviewSize(w, h);
p.setPictureSize(1024, 768);
p.getSupportedPictureFormats();
p.setPictureFormat(PixelFormat.JPEG);
mCamera.setParameters(p);
try {
mCamera.setPreviewDisplay(holder);
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
mCamera.startPreview();
mPreviewRunning = true;
}
public void surfaceDestroyed(SurfaceHolder holder) {
Log.e(TAG, "surfaceDestroyed");
mCamera.stopPreview();
mPreviewRunning = false;
mCamera.release();
}
private SurfaceView mSurfaceView;
private SurfaceHolder mSurfaceHolder;
public void onClick(View arg0) {
int i = 0;
while (i < 4) {
mCamera.takePicture(null, mPictureCallback, mPictureCallback);
i = i + 1;
try {
Thread.sleep(3000);
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
public static Object StoreByteImage(byte[] imageData) {
Object obj = null;
try {
ByteArrayInputStream bis = new ByteArrayInputStream(imageData);
ObjectInputStream objImageData = new ObjectInputStream(bis);
obj = objImageData.readObject();
} catch (IOException ex) {
// TODO: Handle the exception
} catch (ClassNotFoundException ex) {
// TODO: Handle the exception
}
return obj;
}
public static Object createImagesArray(byte[] imageData) {
Object obj = null;
try {
ByteArrayInputStream bis = new ByteArrayInputStream(imageData);
ObjectInputStream objImageData = new ObjectInputStream(bis);
obj = objImageData.readObject();
} catch (IOException ex) {
// TODO: Handle the exception
} catch (ClassNotFoundException ex) {
// TODO: Handle the exception
}
return obj;
}
public static boolean SaveImage(List<Object> listImage) {
FileOutputStream outStream = null;
Log.d("ListSize: ", Integer.toString(listImage.size()));
Iterator<Object> itList = listImage.iterator();
byte[] imageBytes = null;
while (itList.hasNext()) {
ByteArrayOutputStream bos = new ByteArrayOutputStream();
try {
ObjectOutputStream oos = new ObjectOutputStream(bos);
oos.writeObject(listImage);
oos.flush();
oos.close();
bos.close();
imageBytes = bos.toByteArray();
} catch (IOException ex) {
// TODO: Handle the exception
}
try {
// Write to SD Card
outStream = new FileOutputStream(String.format(
"/sdcard/DCIM/%d.jpg", System.currentTimeMillis())); // <9>
outStream.write(imageBytes);
outStream.close();
} catch (FileNotFoundException e) { // <10>
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
}
}
Log.d(TAG, "onPictureTaken - jpeg");
return true;
}
}
Error from the comments:
ERROR/MemoryHeapBase(1382): error opening /dev/pmem_camera: No such file or directory
ERROR/QualcommCameraHardware(1382): failed to construct master heap for pmem pool /dev/pmem_camera
ERROR/QualcommCameraHardware(1382): initRaw X failed with pmem_camera, trying with pmem_adsp
AFAIK, you cannot take another picture until the first one is complete. Eliminate your loop and the Thread.sleep() call, and take the next picture from within mPictureCallback.
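A minimal sketch of that chaining (the shot counter is illustrative, not from the question's code):
private int mShotsLeft = 4;
Camera.PictureCallback mPictureCallback = new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] imageData, Camera c) {
        // persist imageData here (ideally off the main thread)
        if (--mShotsLeft > 0) {
            c.startPreview(); // preview stops after each capture
            c.takePicture(null, null, this);
        }
    }
};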
The main idea is to shoot a picture every 0,3sec and to store the pictures in an Array until the last picture is taken.
The "0,3sec" objective is faster than most devices can process an image, and you may not have enough heap space for your array of images. I suspect that you will have to write each image out to disk as it comes in via an AsyncTask, so you can free up the heap space while also not tying up the main application thread.
You can try capturing subsequent preview frames in preallocated byte arrays.
See the following methods:
http://developer.android.com/reference/android/hardware/Camera.html#addCallbackBuffer(byte[])
http://developer.android.com/reference/android/hardware/Camera.html#setPreviewCallbackWithBuffer(android.hardware.Camera.PreviewCallback)
...or http://developer.android.com/reference/android/hardware/Camera.html#setOneShotPreviewCallback(android.hardware.Camera.PreviewCallback) for better control over timing.
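A minimal sketch of the buffered approach (the buffer size is computed as the Camera docs describe; NV21 is assumed): frames arrive in preallocated arrays, so there is no per-frame allocation and you decide when to keep a frame.
Camera.Parameters p = mCamera.getParameters();
Camera.Size size = p.getPreviewSize();
int bufSize = size.width * size.height
        * ImageFormat.getBitsPerPixel(p.getPreviewFormat()) / 8;
mCamera.addCallbackBuffer(new byte[bufSize]);
mCamera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
    public void onPreviewFrame(byte[] data, Camera camera) {
        // copy or save the frame here, then recycle the buffer
        camera.addCallbackBuffer(data);
    }
});
mCamera.startPreview();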