I know I can set a boolean flag when opening the front camera, and if the flag is true it means the front camera is on.
But is there a way, using the Android API, to know which camera is open right now: front or back?
public int getFrontCameraId() {
CameraInfo ci = new CameraInfo();
for (int i = 0 ; i < Camera.getNumberOfCameras(); i++) {
Camera.getCameraInfo(i, ci);
if (ci.facing == CameraInfo.CAMERA_FACING_FRONT) return i;
}
return -1; // No front-facing camera found
}
The camera preview is inverted (upside down) when I open the front camera, so I have to add a check for which camera is open: if the front camera is opened then matrix = 270, otherwise matrix = 90.
public void onPreviewFrame(byte[] abyte0, Camera camera) {
    int[] rgbData = YuvUtils.decodeGreyscale(abyte0, mWidth, mHeight);
    editedBitmap.setPixels(rgbData, 0, widthPreview, 0, 0, widthPreview, heightPreview);
    finalBitmap = Bitmap.createBitmap(editedBitmap, 0, 0, widthPreview, heightPreview, matrix, true);
}
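For the rotation check, one possible sketch (not the exact code from the question; it assumes `matrix` is an android.graphics.Matrix field and `cameraId` is the index passed to Camera.open()) is to ask CameraInfo which way the opened camera faces and rotate accordingly:
// Hypothetical helper: picks the preview rotation based on which camera was opened.
private Matrix buildPreviewMatrix(int cameraId) {
    Camera.CameraInfo info = new Camera.CameraInfo();
    Camera.getCameraInfo(cameraId, info);
    Matrix matrix = new Matrix();
    if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
        matrix.postRotate(270); // front preview arrives upside down in this setup
    } else {
        matrix.postRotate(90);
    }
    return matrix;
}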
private boolean safeCameraOpen(int id) {
boolean qOpened = false;
try {
releaseCameraAndPreview();
mCamera = Camera.open(id);
qOpened = (mCamera != null);
} catch (Exception e) {
Log.e(getString(R.string.app_name), "failed to open Camera");
e.printStackTrace();
}
return qOpened;
}
private void releaseCameraAndPreview() {
mPreview.setCamera(null);
if (mCamera != null) {
mCamera.release();
mCamera = null;
}
}
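For completeness, a minimal sketch of how the two helpers above might be wired together; `isFrontCameraOpen` is a hypothetical flag, and treating camera 0 as the back camera is an assumption:
// Open the front camera if one exists, otherwise fall back to camera 0 (usually the back camera).
int frontId = getFrontCameraId();
if (frontId != -1 && safeCameraOpen(frontId)) {
    isFrontCameraOpen = true;   // hypothetical flag the activity keeps
} else if (safeCameraOpen(0)) {
    isFrontCameraOpen = false;
}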
Since API level 9, the camera framework supports multiple cameras. If you use the legacy API and call open() without an argument, you get the first rear-facing camera. Android devices can have multiple cameras, for example a back-facing camera for photography and a front-facing camera for video calls. Android 2.3 (API Level 9) and later allows you to check the number of cameras available on a device using the Camera.getNumberOfCameras() method.
To access the primary camera, use the Camera.open() method and be sure to catch any exceptions, as shown in the code below:
/** A safe way to get an instance of the Camera object. */
public static Camera getCameraInstance(){
Camera c = null;
try {
c = Camera.open(); // attempt to get a Camera instance
}
catch (Exception e){
// Camera is not available (in use or does not exist)
}
return c; // returns null if camera is unavailable
}
On devices running Android 2.3 (API Level 9) or higher, you can access specific cameras using Camera.open(int). The example code above will access the first, back-facing camera on a device with more than one camera.
In the new android.hardware.camera2 package you can query the CameraCharacteristics.LENS_FACING property, and since each CameraDevice publishes its id via CameraDevice.getId(), it's easy to get from the open device to its characteristics.
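A minimal camera2 sketch of that lookup (assuming `context` is your Activity and the device runs API 21+):
try {
    CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
    for (String id : manager.getCameraIdList()) {
        CameraCharacteristics chars = manager.getCameraCharacteristics(id);
        Integer facing = chars.get(CameraCharacteristics.LENS_FACING);
        if (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT) {
            // id names a front-facing camera; compare it with the open
            // CameraDevice.getId() to tell which camera is currently open
        }
    }
} catch (CameraAccessException e) {
    e.printStackTrace();
}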
In the older camera API, I think the only way is to keep track of the index you opened it with.
private int cameraId;
public void openFrontCamera(){
cameraId = getFrontCameraId();
if (cameraId != -1)
camera = Camera.open(cameraId); //try catch omitted for brevity
}
Then use cameraId later. This little snippet might be a better way of achieving what you are trying to do:
public void onOrientationChanged(int orientation) {
if (orientation == ORIENTATION_UNKNOWN) return;
android.hardware.Camera.CameraInfo info =
new android.hardware.Camera.CameraInfo();
android.hardware.Camera.getCameraInfo(cameraId, info);
orientation = (orientation + 45) / 90 * 90;
int rotation = 0;
if (info.facing == CameraInfo.CAMERA_FACING_FRONT) {
rotation = (info.orientation - orientation + 360) % 360;
} else { // back-facing camera
rotation = (info.orientation + orientation) % 360;
}
mParameters.setRotation(rotation);
}
If you have a custom camera Activity, you can try this approach:
boolean inPreview;
In your SurfaceView's surfaceChanged() method, set
inPreview = true;
and in your camera callback listener, set
inPreview = false;
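A rough sketch of that flag idea (the names are illustrative, not taken from the question's code; the class is assumed to implement SurfaceHolder.Callback):
private boolean inPreview = false;

@Override
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
    camera.startPreview();
    inPreview = true;          // the preview is now running
}

private void stopPreviewAndRelease() {
    if (inPreview) {
        camera.stopPreview();  // only stop what was actually started
        inPreview = false;
    }
    camera.release();
}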
I am creating an app in which the user sets a profile image and a cover image. When setting the profile image I want to open the front camera by default using an intent.
I am using
pictureIntent.putExtra("android.intent.extras.CAMERA_FACING", 1);
It's working on Sony, but when I tested on a Samsung Galaxy J4 it opened the back camera. I searched and found somewhere that for Samsung one should use the value 2; however, that's not working either.
I want to do this using an intent only.
Does anyone have an idea about it?
You can do it like below:
private Camera openFrontFacingCamera() {
int cameraCount = 0;
Camera cam = null;
Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
cameraCount = Camera.getNumberOfCameras();
for ( int camIdx = 0; camIdx < cameraCount; camIdx++ ) {
Camera.getCameraInfo( camIdx, cameraInfo );
if ( cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT ) {
try {
cam = Camera.open( camIdx );
} catch (RuntimeException e) {
Log.e(TAG, "Camera failed to open: " + e.getLocalizedMessage());
}
}
}
return cam;
}
And then use it in your app as follows:
public Camera getCameraInstance() {
    // note: not static here, since it delegates to the instance method above
    Camera c = null;
    try {
        c = openFrontFacingCamera();
    } catch (Exception e) {
        // Camera is not available (in use or does not exist)
    }
    return c;
}
I am using a WebRTC sample code to stream from my Android device to a webpage.
The sample code does not have a function to switch the camera. I tried to solve it but failed. The sample uses a VideoCapturerAndroid class, and all of the suggestions I found for switching the camera used a different type.
The main part of the sample looks like this:
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_video_chat);
ButterKnife.bind(this);
getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
Bundle extras = getIntent().getExtras();
if (extras == null || !extras.containsKey(Constants.USER_NAME)) {
Intent intent = new Intent(this, MainActivity.class);
startActivity(intent);
Toast.makeText(this, "Need to pass username to VideoChatActivity in intent extras (Constants.USER_NAME).", Toast.LENGTH_SHORT).show();
finish();
return;
}
this.username = extras.getString(Constants.USER_NAME, "");
this.mCallStatus = (TextView) findViewById(R.id.call_status);
// First, we initiate the PeerConnectionFactory with our application context and some options.
PeerConnectionFactory.initializeAndroidGlobals(
this, // Context
true, // Audio Enabled
true, // Video Enabled
true, // Hardware Acceleration Enabled
null); // Render EGL Context
pcFactory = new PeerConnectionFactory();
this.pnRTCClient = new PnRTCClient(Constants.PUB_KEY, Constants.SUB_KEY, this.username);
List<PeerConnection.IceServer> servers = getXirSysIceServers();
if (!servers.isEmpty()) {
this.pnRTCClient.setSignalParams(new de.kevingleason.pnwebrtc.PnSignalingParams());
}
backFacingCam = VideoCapturerAndroid.getNameOfBackFacingDevice();
frontFacingCam = VideoCapturerAndroid.getNameOfFrontFacingDevice();
// Creates a VideoCapturerAndroid instance for the device name
//VideoCapturer capturer = VideoCapturerAndroid.create(frontFacingCam);
capturer = VideoCapturerAndroid.create(facingCam);
// First create a Video Source, then we can make a Video Track
localVideoSource = pcFactory.createVideoSource(capturer, this.pnRTCClient.videoConstraints());
localVideoTrack = pcFactory.createVideoTrack(VIDEO_TRACK_ID, localVideoSource);
// First we create an AudioSource then we can create our AudioTrack
AudioSource audioSource = pcFactory.createAudioSource(this.pnRTCClient.audioConstraints());
AudioTrack localAudioTrack = pcFactory.createAudioTrack(AUDIO_TRACK_ID, audioSource);
// To create our VideoRenderer, we can use the included VideoRendererGui for simplicity
// First we need to set the GLSurfaceView that it should render to
this.videoView = (GLSurfaceView) findViewById(R.id.gl_surface);
// Then we set that view, and pass a Runnable to run once the surface is ready
VideoRendererGui.setView(videoView, null);
// Now that VideoRendererGui is ready, we can get our VideoRenderer.
// IN THIS ORDER. Effects which is on top or bottom
remoteRender = VideoRendererGui.create(0, 0, 100, 100, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, false);
localRender = VideoRendererGui.create(0, 0, 100, 100, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, true);
// We start out with an empty MediaStream object, created with help from our PeerConnectionFactory
// Note that LOCAL_MEDIA_STREAM_ID can be any string
MediaStream mediaStream = pcFactory.createLocalMediaStream(LOCAL_MEDIA_STREAM_ID);
// Now we can add our tracks.
mediaStream.addTrack(localVideoTrack);
mediaStream.addTrack(localAudioTrack);
// First attach the RTC Listener so that callback events will be triggered
this.pnRTCClient.attachRTCListener(new DemoRTCListener());
// Then attach your local media stream to the PnRTCClient.
// This will trigger the onLocalStream callback.
this.pnRTCClient.attachLocalMediaStream(mediaStream);
this.pnRTCClient.listenOn(username);
this.pnRTCClient.setMaxConnections(1);
....
}
Currently I am hardcoding which camera shall be used:
backFacingCam = VideoCapturerAndroid.getNameOfBackFacingDevice();
frontFacingCam = VideoCapturerAndroid.getNameOfFrontFacingDevice();
This is my button which shall switch the camera:
@OnClick(R.id.switchCameraBtn)
public void switchCameraBtn(View view) {
Log.e("Test", "switch camera button clicked");
this.mCallStatus = (TextView) findViewById(R.id.call_status);
}
I also tried to restart the activity and pass a parameter that tells it the other camera shall be used, but I would like to keep the stream fluent and not restart the activity.
You are using a very old implementation of WebRTC on Android. VideoRendererGui is removed from the new WebRTC library. I strongly suggest you always use the newest possible version of Google WebRTC from here, which at the time of writing is 1.0.22512:
compile 'org.webrtc:google-webrtc:1.0.22512'
You can check the Android implementation of the newest library on the official WebRTC Chromium project site here. Check out the other classes too.
With the new library, you should create a VideoCapturer in the following way.
private void createVideoCapturer() {
    // videoCapturer is a member field; switchCamera() below relies on it
    if (Camera2Enumerator.isSupported(this)) {
        videoCapturer = createCameraCapturer(new Camera2Enumerator(this));
    } else {
        videoCapturer = createCameraCapturer(new Camera1Enumerator(false));
    }
}
createCameraCapturer() method:
private VideoCapturer createCameraCapturer(CameraEnumerator enumerator) {
final String[] deviceNames = enumerator.getDeviceNames();
// First, try to find front facing camera
for (String deviceName : deviceNames) {
if (enumerator.isFrontFacing(deviceName)) {
VideoCapturer videoCapturer = enumerator.createCapturer(deviceName, null);
if (videoCapturer != null) {
return videoCapturer;
}
}
}
// Front facing camera not found, try something else
for (String deviceName : deviceNames) {
if (!enumerator.isFrontFacing(deviceName)) {
CameraVideoCapturer videoCapturer = enumerator.createCapturer(deviceName, null);
if (videoCapturer != null) {
return videoCapturer;
}
}
}
return null;
}
and call the switchCamera() method from your class whenever you want to switch between the front and back camera.
private void switchCamera() {
if (videoCapturer != null) {
if (videoCapturer instanceof CameraVideoCapturer) {
CameraVideoCapturer cameraVideoCapturer = (CameraVideoCapturer) videoCapturer;
cameraVideoCapturer.switchCamera(null);
} else {
// Will not switch camera, video capturer is not a camera
}
}
}
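If you also want to be notified when the switch has finished, switchCamera() accepts a CameraVideoCapturer.CameraSwitchHandler instead of null; a small sketch:
cameraVideoCapturer.switchCamera(new CameraVideoCapturer.CameraSwitchHandler() {
    @Override
    public void onCameraSwitchDone(boolean isFrontCamera) {
        // true if the capturer is now using the front camera
        Log.d("CameraSwitch", "Switched, front camera: " + isFrontCamera);
    }

    @Override
    public void onCameraSwitchError(String errorDescription) {
        Log.e("CameraSwitch", "Switch failed: " + errorDescription);
    }
});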
I use this code to prepare my MediaRecorder for recording video. After this I call the start() method, which doesn't crash; however, when I call the stop() method a crash occurs and a RuntimeException "stop failed" is raised. I also notice that the video file saved on the device is broken and is only 32B. I'm assuming I have an error somewhere when setting up the recorder in the method below. Note that I am trying to record from the SurfaceView live preview which is displayed on screen (like Snapchat), not from the native camera app.
private void initRecorder(Surface surface) throws IOException {
// It is very important to unlock the camera before doing setCamera
// or it will results in a black preview
if(mCamera == null) {
mCamera = Camera.open();
mCamera.unlock();
}
if(mMediaRecorder == null) mMediaRecorder = new MediaRecorder();
mMediaRecorder.setPreviewDisplay(surface);
mMediaRecorder.setCamera(mCamera);
mMediaRecorder.setOnErrorListener(new MediaRecorder.OnErrorListener() {
@Override
public void onError(MediaRecorder mr, int what, int extra) {
Toast.makeText(getApplicationContext(),
Integer.toString(what) + "_____" + Integer.toString(extra), Toast.LENGTH_LONG)
.show(); }
});
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
// mMediaRecorder.setOutputFormat(8);
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
mMediaRecorder.setVideoEncodingBitRate(512 * 1000);
mMediaRecorder.setVideoFrameRate(30);
mMediaRecorder.setVideoSize(640, 480);
mMediaRecorder.setOutputFile(getVideoFile());
try {
mMediaRecorder.prepare();
} catch (IllegalStateException e) {
// This is thrown if the previous calls are not called with the
// proper order
e.printStackTrace();
}
mInitSuccesful = true;
}
I'm trying to use the Camera2 API to stream camera data to a SurfaceView. I'm following this guide: Camera2 guide
I cannot get past step 5.
MainActivity.java::onCreate()
setContentView(R.layout.activity_main);
surfaceView = (SurfaceView)findViewById(R.id.surface);
manager = (CameraManager)getSystemService(Context.CAMERA_SERVICE);
MainActivity.java::onClick()
for (String id : manager.getCameraIdList()) {
CameraCharacteristics characteristics = manager.getCameraCharacteristics(id);
Integer direction = characteristics.get(CameraCharacteristics.LENS_FACING);
if (direction != null && direction == CameraCharacteristics.LENS_FACING_BACK) {
if (checkCallingOrSelfPermission("android.permission.CAMERA") == PackageManager.PERMISSION_GRANTED)
manager.openCamera(id, new StateCallback(), null);
break;
}
}
MainActivity.java.StateCallback::onOpened(CameraDevice camera)
List<Surface> surfaces = new LinkedList<>();
surfaces.add(surfaceView.getHolder().getSurface());
CaptureRequest.Builder builder = camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
builder.addTarget(surfaces.get(0));
camera.createCaptureSession(surfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
Log.i(TAG, "Configured");
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
Log.e(TAG, "Configured failed"); // Ends up in this function :(
}
}, null);
The program ends up in the onConfigureFailed() function. I don't know what the error could be, and I don't know how to check what it is.
My guess would be that I'm missing something in the CaptureRequest, but I have no idea what.
I'm running on a Samsung Galaxy S4.
Add this to onConfigured():
if (null == cameraDevice) {
Log.e(TAG, "updatePreview error, return");
return;
}
captureRequestBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
try {
cameraCaptureSessions.setRepeatingRequest(captureRequestBuilder.build(), null, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
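For context, a minimal sketch of the whole StateCallback this snippet belongs in; the field names (cameraDevice, captureRequestBuilder, cameraCaptureSessions, mBackgroundHandler) are assumptions carried over from the snippet above, not from the question's code:
camera.createCaptureSession(surfaces, new CameraCaptureSession.StateCallback() {
    @Override
    public void onConfigured(CameraCaptureSession session) {
        if (cameraDevice == null) return;   // the device was closed in the meantime
        cameraCaptureSessions = session;    // keep the session for later requests
        captureRequestBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
        try {
            // start streaming preview frames to the SurfaceView target
            session.setRepeatingRequest(captureRequestBuilder.build(), null, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onConfigureFailed(CameraCaptureSession session) {
        Log.e(TAG, "Configuration failed");
    }
}, null);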
Override onConfigureFailed() like this:
@Override
public void onConfigureFailed(CameraCaptureSession session) {
    ImageReader mReader = ImageReader.newInstance(640, 480, ImageFormat.JPEG, 1);
    takePicture();         // function to get the image
    createCameraPreview(); // function to set the camera preview on screen
}
Call the createCameraPreview() function to restart the camera; otherwise it will stay stuck.
You can change the ImageReader with new values:
ImageReader mReader = ImageReader.newInstance(640, 480, ImageFormat.JPEG, 1);
And call the takePicture() function again so that the user doesn't have to click again to capture an image.
I have been busting my head over this; it is the last thing I need to complete and the app is done.
Basically, I created a camera for my app and I need to switch from the back camera to the front camera on onClick()...
When I switch, I lose the preview... When I record, the screen is black but the video gets recorded... but no preview at all. Here is the code:
@Override
protected void onCreate(Bundle saved) {
super.onCreate(saved);
requestWindowFeature(Window.FEATURE_NO_TITLE);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
WindowManager.LayoutParams.FLAG_FULLSCREEN);
setContentView(R.layout.camera);
//some initializing code like checking flash, number of cameras...
preview = (FrameLayout) findViewById(R.id.camera_preview);
}
@Override
public void onResume(){
super.onResume();
if (Camera.getNumberOfCameras() < 2) {
a.id(R.id.camera_switch).clickable(false);
}
if(m!=null){
m.reset();
m.release();
m=null;
c.lock();
}
if (c != null) {
c.release();
c = null;
}
cam = "front";
Instance();
}
public void Instance(){
if(flash.equalsIgnoreCase("yes"))
a.id(R.id.camera_flash).clickable(true);
if(cam.equalsIgnoreCase("back")){
try{
m.reset();m=null;
c.stopPreview();
c.release();c.reconnect();
c = null;
}catch(Exception e){}
a.id(R.id.camera_flash).clickable(false);
Camera c = getCameraInstanceB(this);
parameters = c.getParameters();
parameters.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
c.setParameters(parameters);
cam = "front";
try {
c.setPreviewDisplay(mPreview.getHolder());
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}c.startPreview();
}else{
try{
c.release();
c = null;
}catch(Exception e){}
c = getCameraInstance(this);
parameters = c.getParameters();
cam = "back";
}
m = new MediaRecorder();
// Create our Preview view and set it as the content of our activity.
mPreview = new CameraPreview(this, c);
int orien =getResources().getConfiguration().orientation;
if(orien ==1){
parameters.setRotation(0); // set rotation to save the picture
c.setDisplayOrientation(90);
cam_rotation =90;
parameters.setPictureSize(640, 480);
PIC_ORIENTATION = "portrait";
Toast.makeText(this, PIC_ORIENTATION, Toast.LENGTH_SHORT).show();
}else{
parameters.setRotation(0); // set rotation to save the picture
c.setDisplayOrientation(0);
parameters.setPictureSize(640, 480);
PIC_ORIENTATION = "landscape";
cam_rotation=0;
Toast.makeText(this, PIC_ORIENTATION, Toast.LENGTH_SHORT).show();
}
c.setParameters(parameters);
m.setCamera(c);
preview.addView(mPreview);
}
Now the camera instances for back and front:
public static Camera getCameraInstance(Cam cam){
c = null;
try {
c = Camera.open(CameraInfo.CAMERA_FACING_BACK); // attempt to get a Camera instance
Camera.Parameters parameters = c.getParameters();
parameters.setRecordingHint(true);
parameters.setFocusMode(Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
c.setParameters(parameters);
}
catch (Exception e){
// Camera is not available (in use or does not exist)
e.printStackTrace();
text = "The camera is in use";
//---set the data to pass back---
data.putExtra("vid",text);
cam.setResult(RESULT_OK, data);
//---close the activity---
cam.finish();
}
return c; // returns null if camera is unavailable
}
public static Camera getCameraInstanceB(Cam cam){
c = null;
try {
c = Camera.open(CameraInfo.CAMERA_FACING_FRONT); // attempt to get a Camera instance
Camera.Parameters parameters = c.getParameters();
parameters.setRecordingHint(true);
c.setParameters(parameters);
}
catch (Exception e){
// Camera is not available (in use or does not exist)
e.printStackTrace();
text = "The camera is in use";
//---set the data to pass back---
data.putExtra("vid",text);
cam.setResult(RESULT_OK, data);
//---close the activity---
cam.finish();
}
return c; // returns null if camera is unavailable
}
On onResume() everything is fine, but when I switch there is no more preview.
After spending hours, I finally came up with a solution: basically, I just recreate the SurfaceView on each switch, as if it were onStart():
public void Instance(){
preview = (FrameLayout) findViewById(R.id.camera_preview);
//rest of the code here
Now it works like a charm... no more errors, even on onResume().
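The posted snippet is abbreviated; a sketch of the rest of the idea, assuming the fields (preview, mPreview, c) are the same ones used in the question's Instance() method:
// Drop the old preview view and attach a fresh one bound to the newly opened camera.
preview.removeAllViews();
mPreview = new CameraPreview(this, c);
preview.addView(mPreview);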