Android Studio: use camera in background - Java

I'm new to Android Studio and I'm trying to make the camera work in the background. I just want to get the camera feed, but when I open the app it automatically stops. The camera works fine when it's in the MainActivity, but when I move it into my camera service the app automatically stops as soon as I open it.
public class CameraService extends Service implements CameraBridgeViewBase.CvCameraViewListener2 {

    private static final String TAG = "MainActivity";
    private JavaCameraView mOpenCvCameraView;
    private Mat mRgba;

    public CameraService() {
    }

    @Override
    public IBinder onBind(Intent intent) {
        throw new UnsupportedOperationException("Not yet implemented");
    }

    private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
        @Override
        public void onManagerConnected(int status) {
            switch (status) {
                case LoaderCallbackInterface.SUCCESS: {
                    Log.i(TAG, "OpenCV loaded successfully");
                    mOpenCvCameraView.enableView();
                }
                break;
                default: {
                    super.onManagerConnected(status);
                }
                break;
            }
        }
    };

    @Override
    public void onCreate() {
        mOpenCvCameraView = (JavaCameraView) mOpenCvCameraView.findViewById(R.id.javacameraview);
        mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);
        mOpenCvCameraView.setCvCameraViewListener(this);
        mOpenCvCameraView.setCameraIndex(CameraBridgeViewBase.CAMERA_ID_FRONT);
        Toast.makeText(this, "The new Service was Created", Toast.LENGTH_LONG).show();
    }

    private void setContentView(int activity_main) {
    }

    @Override
    public void onStart(Intent intent, int startId) {
        Toast.makeText(this, " Service Started", Toast.LENGTH_LONG).show();
    }

    @Override
    public void onDestroy() {
        Toast.makeText(this, "Service Destroyed", Toast.LENGTH_LONG).show();
        super.onDestroy();
        if (mOpenCvCameraView != null)
            mOpenCvCameraView.disableView();
    }

    @Override
    public void onCameraViewStarted(int width, int height) {
        mRgba = new Mat(height, width, CvType.CV_8UC4);
    }

    @Override
    public void onCameraViewStopped() {
        mRgba.release();
    }

    @Override
    public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
        return inputFrame.rgba();
    }
}
Can someone give me some tips or pinpoint my problem?

Google's Android Pie update introduced a restriction that prevents apps running in the background from using the camera. This ensures that you don't have to live in fear of malicious apps (which remain active in the background while your screen is turned off) capturing potentially unwanted scenes of you or your loved ones without authorization.
There is also an error in your code: you are calling findViewById() on an object that has not been assigned yet. mOpenCvCameraView is still null when onCreate() runs, so the app crashes immediately.
If you are doing this for illegal purposes, please don't; that kind of problem is not welcomed by the Stack Overflow community.
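To illustrate that point, here is a rough sketch of how onCreate() could obtain the view without calling findViewById() on a null field. It assumes a layout resource (here called R.layout.camera_service) that contains the JavaCameraView; note that a view inflated in a Service still has to be attached to a window (for example via WindowManager) before it can actually deliver camera frames.
// Sketch only: inflate an assumed layout instead of calling findViewById()
// on mOpenCvCameraView while it is still null.
@Override
public void onCreate() {
    super.onCreate();
    LayoutInflater inflater = (LayoutInflater) getSystemService(Context.LAYOUT_INFLATER_SERVICE);
    View root = inflater.inflate(R.layout.camera_service, null); // assumed layout resource
    mOpenCvCameraView = (JavaCameraView) root.findViewById(R.id.javacameraview);
    mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);
    mOpenCvCameraView.setCvCameraViewListener(this);
    mOpenCvCameraView.setCameraIndex(CameraBridgeViewBase.CAMERA_ID_FRONT);
}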

Related

WearOS Off Body Event is not always triggered

I'm developing a simple app for Wear OS. It consists of the app itself, which just shows an image, and two services: one does NFC card emulation and the other simply listens for the off-body event and resets some values whenever the watch is taken off. This works well while I'm debugging, and also for some time when started normally.
However, after a few hours the watch can be taken off without my app getting the event. I suspect that the OS is killing the service and not restarting it, despite the START_STICKY flag. The watch is not paired with a phone and is not running any other apps.
This is the code of my service:
public class MySensorService extends Service
{
    private static final String TAG = "MySensorService";
    private SensorManager sensorManager;
    private Sensor mOffBody;

    public MySensorService()
    {
        Log.i(TAG, "Sensor service was created.");
    }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId)
    {
        Log.i(TAG, "Sensor service starting.");
        sensorManager = (SensorManager) getApplicationContext().getSystemService(SENSOR_SERVICE);
        mOffBody = sensorManager.getDefaultSensor(TYPE_LOW_LATENCY_OFFBODY_DETECT, true);
        if (mOffBody != null) {
            sensorManager.registerListener(mOffbodySensorListener, mOffBody,
                    SensorManager.SENSOR_DELAY_NORMAL);
        }
        Log.i(TAG, "Sensor service was started.");
        return START_STICKY;
    }

    private void onWatchRemoved()
    {
        Log.i(TAG, "Watch is not worn anymore");
        Intent intent = new Intent(this, MyAPDUService.class);
        intent.putExtra("UserID", 0);
        Log.d(TAG, "starting service");
        startService(intent);
    }

    private void onWatchAttached()
    {
        Log.i(TAG, "Watch is now worn");
    }

    private final SensorEventListener mOffbodySensorListener = new SensorEventListener()
    {
        @Override
        public void onSensorChanged(SensorEvent event)
        {
            if (event.values.length > 0)
            {
                if (event.values[0] == 0) // 0 = Watch is not worn; 1 = Watch is worn
                {
                    onWatchRemoved();
                }
                else
                {
                    onWatchAttached();
                }
            }
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int i)
        {
        }
    };

    @Override
    public void onDestroy()
    {
        super.onDestroy();
        Log.i(TAG, "Sensor Service destroyed");
    }
}
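Since the suspicion above is that the OS kills a plain background service regardless of START_STICKY, one common mitigation is to promote the service to a foreground service. Below is only a minimal sketch of that idea; the notification channel id "sensor_channel" and the icon resource are assumptions, not part of the original code.
// Sketch only: run the sensor service in the foreground so the OS is less
// likely to kill it. Assumes a notification channel "sensor_channel" has been
// created elsewhere (required on API 26+) and that an icon resource exists.
@Override
public int onStartCommand(Intent intent, int flags, int startId)
{
    Notification notification = new NotificationCompat.Builder(this, "sensor_channel")
            .setContentTitle("Off-body detection running")
            .setSmallIcon(R.drawable.ic_notification) // assumed drawable
            .build();
    startForeground(1, notification);

    // ...register the off-body sensor listener exactly as in the code above...
    return START_STICKY;
}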

Android Service TextToSpeech

Hello, I'm working on an app that reads out incoming messages using a service. It works fine, but changing the TextToSpeech speed and pitch from the service won't work. I used an Intent to pass the speed and pitch values from MainActivity, which works, but setting the speed and pitch on the TTS object has no effect; it stays at the same speed and pitch. I'm happy to receive any suggestions. Thanks.
Speaker.java
public class Speaker extends Service implements TextToSpeech.OnInitListener {

    private TextToSpeech tts;
    private boolean ready = false;
    private boolean allowed = false;
    float speed, pitch;

    public Speaker() {
    }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        speed = (float) intent.getExtras().get("Speed");
        pitch = (float) intent.getExtras().get("Pitch");
        Log.i("Speed", String.valueOf(speed));
        Log.i("Pitch", String.valueOf(pitch));
        return START_STICKY;
    }

    @Override
    public IBinder onBind(Intent intent) {
        // TODO: Return the communication channel to the service.
        throw new UnsupportedOperationException("Not yet implemented");
    }

    public Speaker(Context context) {
        tts = new TextToSpeech(context, this);
    }

    @Override
    public void onInit(int status) {
        if (status == TextToSpeech.SUCCESS) {
            tts.setLanguage(Locale.getDefault());
            tts.setPitch(pitch);
            tts.setSpeechRate(speed);
            ready = true;
        } else {
            ready = false;
        }
    }

    public void speak(String text) {
        tts.speak(text, TextToSpeech.QUEUE_ADD, null);
    }

    public void pause(int duration) {
        tts.playSilence(duration, TextToSpeech.QUEUE_ADD, null);
    }

    public void destroy() {
        tts.shutdown();
    }
}
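One likely reason the values have no effect is that onInit() runs when the TextToSpeech object is constructed, before onStartCommand() has read the extras, so the engine keeps whatever values were set at that point. A minimal sketch of re-applying the values once they actually arrive, assuming tts has already been created (for example in the service's onCreate()):
// Sketch: apply pitch and speed when the extras arrive, instead of relying
// only on onInit(), which may have run before these values were set.
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
    speed = intent.getFloatExtra("Speed", 1.0f);
    pitch = intent.getFloatExtra("Pitch", 1.0f);
    if (ready && tts != null) {
        tts.setPitch(pitch);
        tts.setSpeechRate(speed);
    }
    return START_STICKY;
}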

Implementing a separate Class into Activity

The problem
In cleaning up my code I want to move my Android camera methods to a separate class, in line with what I believe are best practices. After searching all day, I'm still struggling to figure out exactly how to do this. The main problem is that differences between implementation approaches, and the move from the Camera API to the Camera2 API, mean I can't replicate the solutions I find online. Please note that I'm quite a beginner in Java, so it is probably a very rookie mistake that I can't solve because of the variety of info on the web.
Current code
My main problem is that SurfaceTexture texture = surfaceView.getSurfaceTexture(); in startCamera() says Cannot resolve method 'getSurfaceTexture()', and that previewBuilder.addTarget(texture); complains that addTarget(android.view.Surface) in Builder cannot be applied to (android.graphics.SurfaceTexture).
public class CameraView extends TextureView implements TextureView.SurfaceTextureListener {

    private Size previewsize;
    private CameraDevice cameraDevice;
    private CaptureRequest.Builder previewBuilder;
    private CameraCaptureSession previewSession;
    private final Context context;
    public SurfaceView surfaceView;
    public TextureView textureView;

    public CameraView(Context context, AttributeSet attrs) {
        super(context, attrs);
        this.context = context;
    }

    public void onSurfaceTextureAvailable(SurfaceTexture texture, int width, int height) {
        // Once the surface is created, simply open a handle to the camera hardware.
        openCamera();
    }

    public void onSurfaceTextureUpdated(SurfaceTexture texture) {
    }

    public void onSurfaceTextureSizeChanged(SurfaceTexture texture, int width, int height) {
        try {
            //cameraDevice.setPreviewDisplay(holder);
            //cameraDevice.startPreview();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public boolean onSurfaceTextureDestroyed(SurfaceTexture texture) {
        // stopPreview();
        cameraDevice.close();
        return true;
    }

    public void openCamera() {
        CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        try {
            String cameraId = manager.getCameraIdList()[0];
            CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
            StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            previewsize = map.getOutputSizes(SurfaceTexture.class)[0];
            try {
                manager.openCamera(cameraId, stateCallback, null);
            } catch (SecurityException e) {
                e.printStackTrace();
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private CameraDevice.StateCallback stateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(CameraDevice camera) {
            cameraDevice = camera;
            startCamera();
        }

        @Override
        public void onClosed(CameraDevice camera) {
            // nothing
        }

        @Override
        public void onDisconnected(CameraDevice camera) {
        }

        @Override
        public void onError(CameraDevice camera, int error) {
        }
    };

    void startCamera() {
        if (cameraDevice == null || previewsize == null) {
            return;
        }
        SurfaceTexture texture = surfaceView.getSurfaceTexture();
        texture.setDefaultBufferSize(previewsize.getWidth(), previewsize.getHeight());
        Surface surface = new Surface(texture);
        try {
            // add all the standard stuff to the previewBuilder
            previewBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        } catch (Exception e) {}
        previewBuilder.addTarget(texture);
        try {
            cameraDevice.createCaptureSession(Arrays.asList(surface), new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(CameraCaptureSession session) {
                    previewSession = session;
                    getChangedPreview();
                }

                @Override
                public void onConfigureFailed(CameraCaptureSession session) {
                }
            }, null);
        } catch (Exception e) {}
    }

    void getChangedPreview() {
        previewBuilder.set(CaptureRequest.CONTROL_AE_MODE, CameraMetadata.CONTROL_AE_MODE_ON);
        HandlerThread thread = new HandlerThread("changed Preview");
        thread.start();
        Handler handler = new Handler(thread.getLooper());
        try {
            previewSession.setRepeatingRequest(previewBuilder.build(), null, handler);
        } catch (Exception e) {}
    }
}
The goal
To keep my code clean and understandable I would like to limit the MainActivity class to switching between views instead of having tons of methods in there. I would like to activate my camera view in my app by switching the following object from INVISIBLE to VISIBLE. Other suggestions are appreciated.
cameraView = (CameraView) findViewById(R.id.camera);
MainActivity.java would then look like:
public class MainActivity extends AppCompatActivity {

    private TextView mTextMessage;
    private CameraView cameraView;
    private MainSurfaceView mGLView;
    private TextureView textureView;

    private BottomNavigationView.OnNavigationItemSelectedListener mOnNavigationItemSelectedListener
            = new BottomNavigationView.OnNavigationItemSelectedListener() {
        @Override
        public boolean onNavigationItemSelected(@NonNull MenuItem item) {
            switch (item.getItemId()) {
                case R.id.navigation_home:
                    mTextMessage.setText(R.string.title_home);
                    return true;
                case R.id.navigation_dashboard:
                    mTextMessage.setText(R.string.title_dashboard);
                    return true;
                case R.id.navigation_notifications:
                    mTextMessage.setText(R.string.title_notifications);
                    return true;
            }
            return false;
        }
    };

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        mTextMessage = (TextView) findViewById(R.id.message);
        cameraView = (CameraView) findViewById(R.id.camera);
        mGLView = (MainSurfaceView) findViewById(R.id.glSurfaceView);
        BottomNavigationView navigation = (BottomNavigationView) findViewById(R.id.navigation);
        navigation.setOnNavigationItemSelectedListener(mOnNavigationItemSelectedListener);
    }
}
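As a rough sketch of what I have in mind (assuming the CameraView registers itself as its own SurfaceTextureListener), one of the navigation cases could then simply toggle the camera view's visibility:
// Sketch: switching to the camera by toggling visibility from a navigation case.
case R.id.navigation_dashboard:
    cameraView.setSurfaceTextureListener(cameraView);
    cameraView.setVisibility(View.VISIBLE);
    return true;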
Your help is appreciated!
SurfaceTexture texture = surfaceView.getSurfaceTexture(); in startCamera() says Cannot resolve method 'getSurfaceTexture()'
You call the method getSurfaceTexture() on surfaceView, and surfaceView is a SurfaceView. Let's take a look at the documentation:
https://developer.android.com/reference/android/view/SurfaceView.html
Apparently SurfaceView has no method called getSurfaceTexture(). However, searching Google for "getSurfaceTexture() Android" shows that the method belongs to the TextureView class. Your class CameraView has a field called textureView, so (I don't know exactly what you want to achieve) you can call the method on that field if you want. Additionally, your class CameraView is a TextureView itself (is that what you want?), so you could also just call getSurfaceTexture() to invoke it on the class itself.
previewBuilder.addTarget(texture); complains that addTarget(android.view.Surface) in Builder cannot be applied to (android.graphics.SurfaceTexture).
Let's have a look at the documentation again: https://developer.android.com/reference/android/hardware/camera2/CaptureRequest.Builder.html
Apparently CaptureRequest.Builder (the type of previewBuilder) has a method called addTarget(), but that method only accepts a Surface! You're passing a SurfaceTexture. You probably want to change texture to surface.
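A minimal sketch of those two changes taken together (assuming the intent is for CameraView itself, which is a TextureView, to provide the SurfaceTexture):
// Sketch of the two fixes described above, inside startCamera().
SurfaceTexture texture = getSurfaceTexture(); // CameraView extends TextureView
texture.setDefaultBufferSize(previewsize.getWidth(), previewsize.getHeight());
Surface surface = new Surface(texture);
// ...build previewBuilder as before...
previewBuilder.addTarget(surface); // addTarget() takes a Surface, not a SurfaceTexture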

Camera is black on Android 4.1.2 with Unity3D and Vuforia

The company I work for is building an Android application with Vuforia (version 5.x) and Unity3D (version 5.3.5f1 Personal) integration.
On newer Android devices the camera opens fine, but we're facing a problem on older devices like the Samsung Galaxy S2 (4.1.2). When the device opens the camera via Vuforia the screen is black, and if we try to take a picture the image is obviously black as well.
My activity only instantiates the Unity player and looks like the code below:
public class MainActivity extends AppCompatActivity {

    protected UnityPlayer mUnityPlayer;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        getWindow().setFormat(PixelFormat.RGBX_8888); // <--- This makes xperia play happy
        mUnityPlayer = new UnityPlayer(this);
        setContentView(mUnityPlayer);
        mUnityPlayer.requestFocus();
    }

    public void onImageSaved(final String path) {
        final Intent intent = new Intent(MainActivity.this, CameraPreviewActivity.class);
        final Bundle bundle = new Bundle();
        bundle.putString("IMAGE_PATH", path);
        intent.putExtras(bundle);
        startActivity(intent);
    }

    public void onImageSavedError(final String message) {
        UnityPlayer.currentActivity.runOnUiThread(new Runnable() {
            @Override
            public void run() {
                Log.e("Teste", message);
            }
        });
    }

    // Quit Unity
    @Override
    protected void onDestroy() {
        mUnityPlayer.quit();
        super.onDestroy();
    }

    // Pause Unity
    @Override
    protected void onPause() {
        super.onPause();
        mUnityPlayer.pause();
    }

    // Resume Unity
    @Override
    protected void onResume() {
        super.onResume();
        mUnityPlayer.resume();
    }

    // This ensures the layout will be correct.
    @Override
    public void onConfigurationChanged(Configuration newConfig) {
        super.onConfigurationChanged(newConfig);
        mUnityPlayer.configurationChanged(newConfig);
    }

    // Notify Unity of the focus change.
    @Override
    public void onWindowFocusChanged(boolean hasFocus) {
        super.onWindowFocusChanged(hasFocus);
        mUnityPlayer.windowFocusChanged(hasFocus);
    }

    // For some reason the multiple keyevent type is not supported by the ndk.
    // Force event injection by overriding dispatchKeyEvent().
    @Override
    public boolean dispatchKeyEvent(KeyEvent event) {
        if (event.getAction() == KeyEvent.ACTION_MULTIPLE)
            return mUnityPlayer.injectEvent(event);
        return super.dispatchKeyEvent(event);
    }

    // Pass any events not handled by (unfocused) views straight to UnityPlayer
    @Override public boolean onKeyUp(int keyCode, KeyEvent event) { return mUnityPlayer.injectEvent(event); }
    @Override public boolean onKeyDown(int keyCode, KeyEvent event) { return mUnityPlayer.injectEvent(event); }
    @Override public boolean onTouchEvent(MotionEvent event) { return mUnityPlayer.injectEvent(event); }
    /*API12*/ public boolean onGenericMotionEvent(MotionEvent event) { return mUnityPlayer.injectEvent(event); }

    @Override
    public void onBackPressed() {
        finish();
    }
}
Vuforia is responsible for opening the camera and tracking the image targets, while Unity3D renders an animated 3D image and is responsible for taking the picture.
In the Unity project we developed a button to take a picture and a function that saves the image to an Android directory. The script was written in C# and its implementation is as follows:
IEnumerator TakeSnapshot()
{
    yield return new WaitForEndOfFrame();
    Texture2D snap = new Texture2D(Screen.width, Screen.height);
    Camera camera = Camera.current;
    if (rotation != null)
    {
        camera.transform.rotation = rotation.Value;
    }
    if (position != null)
    {
        camera.transform.position = position.Value;
    }
    camera.Render();
    RenderTexture.active = camera.targetTexture;
    snap.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
    snap.Apply();
    RenderTexture.active = null;
    byte[] bytes = snap.EncodeToJPG();
    DestroyObject(snap);
    string path = Application.persistentDataPath + "/MyAppPath/";
    if (!System.IO.Directory.Exists(path))
    {
        System.IO.Directory.CreateDirectory(path);
    }
    string filename = fileName(Convert.ToInt32(snap.width), Convert.ToInt32(snap.height));
    path = path + filename;
    System.IO.File.WriteAllBytes(path, bytes);
    using (var actClass = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
    {
        var jo = actClass.GetStatic<AndroidJavaObject>("currentActivity");
        if (jo == null) {
            Debug.Log("jo is null");
        } else {
            jo.Call("onImageSaved", path);
        }
    }
}
The main goal of this post is to understand why the camera is black only on older devices, since it works fine on newer ones. I would also like to highlight that OpenGL may not be the problem, since I have already tested the application on older devices like the Samsung Galaxy S2 using sinogen 5.1.
The OpenGL version the Unity project is currently being exported with is OpenGL ES 2.
Thanks in advance.
Try this code:
public string deviceName;
WebCamTexture wct;

void Start()
{
    WebCamDevice[] devices = WebCamTexture.devices;
    deviceName = devices[0].name;
    wct = new WebCamTexture(deviceName, 400, 300, 12);
    GetComponent<Renderer>().material.mainTexture = wct;
    wct.Play();
}

Save Android CAMERA API without INTENT

Hello, I have the whole code, but I want to save the snaps automatically and release the camera back to the preview. I don't know how to do that :/ It takes the snapshot but neither saves it nor releases the camera.
Thanks for the help, guys!!
package com.velcisribeiro.xcamera;
+imports
public class MainActivity extends Activity {

    private Camera cameraObject;
    private ShowCamera showCamera;
    private ImageView pic;

    public static Camera isCameraAvailiable() {
        Camera object = null;
        try {
            object = Camera.open();
        } catch (Exception e) {
        }
        return object;
    }

    private PictureCallback capturedIt = new PictureCallback() {
        @Override
        public void onPictureTaken(byte[] data, Camera camera) {
            Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
            if (bitmap == null) {
                Toast.makeText(getApplicationContext(), "not taken", Toast.LENGTH_SHORT).show();
            } else {
                Toast.makeText(getApplicationContext(), "taken", Toast.LENGTH_SHORT).show();
            }
            cameraObject.release();
        }
    };

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_camera);
        cameraObject = isCameraAvailiable();
        showCamera = new ShowCamera(this, cameraObject);
        FrameLayout preview = (FrameLayout) findViewById(R.id.camera_preview);
        preview.addView(showCamera);
    }

    public void snapIt(View view) {
        cameraObject.takePicture(null, null, capturedIt);
    }
}
And the other one is:
public class ShowCamera extends SurfaceView implements SurfaceHolder.Callback {

    private SurfaceHolder holdMe;
    private Camera theCamera;

    public ShowCamera(Context context, Camera camera) {
        super(context);
        theCamera = camera;
        holdMe = getHolder();
        holdMe.addCallback(this);
    }

    @Override
    public void surfaceChanged(SurfaceHolder arg0, int arg1, int arg2, int arg3) {
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        try {
            theCamera.setPreviewDisplay(holder);
            theCamera.setDisplayOrientation(90);
            theCamera.startPreview();
        } catch (IOException e) {
        }
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder arg0) {
    }
}
When I was building my own camera implementation, I just used the code provided by the Zxing library. It works really well and you can easily modify it to do what you'd like:
https://github.com/zxing/zxing
You need to add the following two lines to the surfaceDestroyed callback to release the camera.
theCamera.stopPreview();
theCamera.release();
And to save the image, change the onPictureTaken callback:
public void onPictureTaken(byte[] data, Camera camera) {
    Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
    if (bitmap == null) {
        Toast.makeText(getApplicationContext(), "not taken", Toast.LENGTH_SHORT).show();
    } else {
        Toast.makeText(getApplicationContext(), "taken", Toast.LENGTH_SHORT).show();
    }
    // Add code to save image
    cameraObject.release();
}
Also have a look at the following URL for a better understanding:
http://androidtrainningcenter.blogspot.in/2012/01/how-to-use-android-camera-to-take.html
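For illustration, a minimal sketch of that save step, writing the raw JPEG bytes delivered to onPictureTaken() to a file; the directory and filename used here are assumptions, not taken from the original code:
// Sketch only: persist the JPEG bytes from onPictureTaken() to external storage.
File dir = new File(Environment.getExternalStorageDirectory(), "XCamera"); // assumed folder
if (!dir.exists()) {
    dir.mkdirs();
}
File picture = new File(dir, "snap_" + System.currentTimeMillis() + ".jpg");
try (FileOutputStream out = new FileOutputStream(picture)) {
    out.write(data); // 'data' is the byte[] parameter of onPictureTaken()
} catch (IOException e) {
    Log.e("MainActivity", "Could not save picture", e);
}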
