The purpose of the application is basically to compare what the camera is seeing with an image from the gallery. I don't know which control gives you access to the camera without opening the default system camera app.
I know this app will be useless on split-screen phones (camera on one side, gallery on the other), but it is intended for phones without this Nougat functionality (Marshmallow or Lollipop).
I've seen some apps that display the camera preview, like barcode readers and some quick photo editors.
You can use the Camera API. You will need to create your own SurfaceView to display a preview of what the camera is seeing. There are a lot of tutorials on the internet for this.
public class ImageSurfaceView extends SurfaceView implements SurfaceHolder.Callback {
    private Camera camera;
    private SurfaceHolder surfaceHolder;

    public ImageSurfaceView(Context context, Camera camera) {
        super(context);
        this.camera = camera;
        this.surfaceHolder = getHolder();
        this.surfaceHolder.addCallback(this);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        try {
            this.camera.setPreviewDisplay(holder);
            this.camera.startPreview();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        this.camera.stopPreview();
        this.camera.release();
    }
}
public class MainActivity extends Activity implements SensorEventListener {
    private Camera mCamera;
    private ImageSurfaceView cameraView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
    }

    @Override
    protected void onResume() {
        super.onResume();
        mCamera = getCameraInstance();
        mCamera.setDisplayOrientation(90);
        cameraView = new ImageSurfaceView(this, mCamera);
        // mainView, buttonView, senSensorManager and senRotation are assumed to be
        // set up elsewhere in the activity (layout container, overlay button, sensors).
        mainView.addView(cameraView);
        mainView.bringChildToFront(buttonView);
        senSensorManager.registerListener(this, senRotation, SensorManager.SENSOR_DELAY_GAME);
    }

    /** A safe way to get an instance of the Camera object. */
    public Camera getCameraInstance() {
        Camera c = null;
        try {
            c = Camera.open(); // attempt to get a Camera instance
            Camera.Parameters parameters = c.getParameters();
            parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
            c.setParameters(parameters);
        } catch (Exception e) {
            // Camera is not available (in use or does not exist)
        }
        return c; // returns null if camera is unavailable
    }
}
I'm trying to create a QR code scanner in a fragment, but the camera won't show in the SurfaceView and it just turns black.
Here's my Java class:
public class ScanFragment extends Fragment {
    SurfaceView surfaceView;
    CameraSource cameraSource;
    TextView textView;
    BarcodeDetector barcodeDetector;

    @Override
    public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
        final View view = inflater.inflate(R.layout.fragment_scan, container, false);
        surfaceView = (SurfaceView) view.findViewById(R.id.cameraPreview);
        textView = (TextView) view.findViewById(R.id.scanText);

        barcodeDetector = new BarcodeDetector.Builder(view.getContext().getApplicationContext())
                .setBarcodeFormats(Barcode.QR_CODE).build();
        cameraSource = new CameraSource.Builder(view.getContext().getApplicationContext(), barcodeDetector)
                .setRequestedPreviewSize(640, 480).build();

        surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
            @Override
            public void surfaceCreated(SurfaceHolder holder) {
                if (ActivityCompat.checkSelfPermission(getContext().getApplicationContext(), Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                    return;
                }
                try {
                    cameraSource.start(holder);
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }

            @Override
            public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
            }

            @Override
            public void surfaceDestroyed(SurfaceHolder holder) {
                cameraSource.stop();
            }
        });

        barcodeDetector.setProcessor(new Detector.Processor<Barcode>() {
            @Override
            public void release() {
            }

            @Override
            public void receiveDetections(Detector.Detections<Barcode> detections) {
                final SparseArray<Barcode> qrCodes = detections.getDetectedItems();
                if (qrCodes.size() != 0) {
                    textView.post(new Runnable() {
                        @Override
                        public void run() {
                            Vibrator vibrator = (Vibrator) getContext().getApplicationContext().getSystemService(Context.VIBRATOR_SERVICE);
                            vibrator.vibrate(1000);
                            textView.setText(qrCodes.valueAt(0).displayValue);
                        }
                    });
                }
            }
        });

        return view;
    }
}
I added the uses-permission entries in the Android manifest file. It compiles seamlessly in Android Studio, but when I run it on the phone the camera just turns black and there is no crash.
Does anyone know how to fix this?
From Android 6.0 (API 23) on, you need to request runtime permissions from the user. That is why your camera doesn't show anything: the permission is only declared in the AndroidManifest, but the user never agreed to let your application use the camera. You can find a good example of how to request runtime permissions here.
If you want to read more about this, there is also documentation on the Android developer site:
https://developer.android.com/distribute/best-practices/develop/runtime-permissions
https://developer.android.com/training/permissions/requesting
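Roughly, the request flow inside your ScanFragment could look like this, reusing your cameraSource and surfaceView fields (the request code and the startCameraIfAllowed() helper are just names I made up for this sketch):
// Illustrative only: ask for CAMERA at runtime before starting the CameraSource.
// REQUEST_CAMERA is an arbitrary request code chosen for this sketch.
private static final int REQUEST_CAMERA = 1001;

private void startCameraIfAllowed(SurfaceHolder holder) {
    if (ActivityCompat.checkSelfPermission(getContext(), Manifest.permission.CAMERA)
            != PackageManager.PERMISSION_GRANTED) {
        // Shows the system permission dialog; the result arrives in onRequestPermissionsResult.
        requestPermissions(new String[]{Manifest.permission.CAMERA}, REQUEST_CAMERA);
        return;
    }
    try {
        cameraSource.start(holder);
    } catch (IOException e) {
        e.printStackTrace();
    }
}

@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    if (requestCode == REQUEST_CAMERA
            && grantResults.length > 0
            && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
        // Permission granted: start the preview on the already-created holder.
        try {
            cameraSource.start(surfaceView.getHolder());
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
You would then call startCameraIfAllowed(holder) from surfaceCreated() instead of returning early when the permission is missing.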
Here is the full code of the app, whose UI freezes after a few seconds of running.
Is there something dangerous here?
Thank you!
public class FragmentOne extends Fragment {
    private Context _context;
    private View view;
    private BroadcastReceiver broadcastReceiver;

    public FragmentOne() {
    }

    @Override
    public View onCreateView(LayoutInflater inflater, ViewGroup container,
                             Bundle savedInstanceState) {
        view = inflater.inflate(R.layout.fragment_fragment_one, container, false);
        setup();
        return view;
    }

    @Override
    public void onAttach(Context context) {
        super.onAttach(context);
        _context = context;
    }

    private void setup() {
        broadcastReceiver = new BroadcastReceiver() {
            @Override
            public void onReceive(Context context, Intent i) {
                try {
                    DLocation dLocation = (DLocation) i.getExtras().get("coordinates");
                    if (dLocation != null) {
                        Log.d("First fragment", "Applying broadcast message parameters to the window controls");
                        TextView textLon = (TextView) view.findViewById(R.id.textLon);
                        textLon.setText(dLocation.Longitude);
                        TextView textLat = (TextView) view.findViewById(R.id.textLat);
                        textLat.setText(dLocation.Latitude);
                        TextView textTime = (TextView) view.findViewById(R.id.textTime);
                        textTime.setText(dLocation.TimeOfRequest);
                        TextView textErrors = (TextView) view.findViewById(R.id.textErrors);
                        textErrors.setText(dLocation.Errors);
                    }
                } catch (Exception ex) {
                    Toast.makeText(getActivity(), ex.getMessage(), Toast.LENGTH_LONG).show();
                }
            }
        };
        _context.registerReceiver(broadcastReceiver, new IntentFilter("location_update"));
    }

    @Override
    public void onResume() {
        super.onResume();
    }

    @Override
    public void onPause() {
        super.onPause();
    }

    @Override
    public void onDestroy() {
        super.onDestroy();
        if (broadcastReceiver != null) {
            _context.unregisterReceiver(broadcastReceiver);
        }
    }
}
Root Cause
I think you are using a third-party library to detect the location. The library is receiving GPS coordinates at a very high rate, and those coordinates are then delivered to your broadcast receiver. Your broadcast receiver does its work on the UI thread, so the app freezes because the UI thread is doing work at a very high rate.
Solution
The solution to your problem is a bound service. You can find code examples in the Android developer docs on Bound Services.
For use cases like a music player, where media is played on a background thread but the elapsed time is shown in the UI, a bound service can be useful. I hope this points you in the right direction.
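As a very rough sketch of the idea (LocationService, LocalBinder and getLastLocation() are names I made up, not from your project):
// Illustrative bound service: the location work stays here, off the UI thread,
// and bound clients only pull the most recent value when they want to update the UI.
public class LocationService extends Service {

    private final IBinder binder = new LocalBinder();
    private volatile DLocation lastLocation; // updated by your location callback

    public class LocalBinder extends Binder {
        public LocationService getService() {
            return LocationService.this;
        }
    }

    @Override
    public IBinder onBind(Intent intent) {
        return binder;
    }

    /** Called by the bound fragment, e.g. once a second, instead of on every broadcast. */
    public DLocation getLastLocation() {
        return lastLocation;
    }
}
The fragment would then bind in onStart() with bindService(), unbind in onStop(), and refresh its TextViews from getLastLocation() on a modest schedule rather than reacting to every incoming coordinate.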
I am trying to fade in a TextureView, but for some reason it's not animating. The video simply pops in with no fade at all, and I don't really know why, because after some research I found that a TextureView can normally be animated.
Here is my code; I hope you can give me a pointer in the right direction.
P.S. I have left out all irrelevant code that does not concern the TextureView and the animation.
public class MainActivity extends AppCompatActivity implements MediaPlayer.OnPreparedListener, MediaPlayer.OnCompletionListener
{
    private static final String TAG = MainActivity.class.getSimpleName();

    private MediaPlayer mediaPlayer1;
    private ArrayList<Uri> videoUris = new ArrayList<>();
    private int current_video_index = 0;
    private TextureView textureView;

    @Override
    protected void onCreate(Bundle savedInstanceState)
    {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        textureView = (TextureView) findViewById(R.id.textureView);

        mediaPlayer1 = new MediaPlayer();
        mediaPlayer1.setAudioStreamType(AudioManager.STREAM_MUSIC);
        mediaPlayer1.setOnPreparedListener(this);
        mediaPlayer1.setOnCompletionListener(this);

        initVideoUris();
        initNewVideo();
    }

    private void startVideo(final Uri uri)
    {
        textureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener()
        {
            @Override
            public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height)
            {
                try {
                    mediaPlayer1.setDataSource(MainActivity.this, uri);
                    mediaPlayer1.setSurface(new Surface(surface));
                    mediaPlayer1.setOnCompletionListener(MainActivity.this);
                    mediaPlayer1.setOnPreparedListener(MainActivity.this);
                    mediaPlayer1.setAudioStreamType(AudioManager.STREAM_MUSIC);
                    mediaPlayer1.prepareAsync();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }

            @Override
            public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height)
            {
            }

            @Override
            public boolean onSurfaceTextureDestroyed(SurfaceTexture surface)
            {
                return false;
            }

            @Override
            public void onSurfaceTextureUpdated(SurfaceTexture surface)
            {
            }
        });
    }

    @Override
    public void onPrepared(MediaPlayer mp)
    {
        mp.start();
        fadeInView(textureView);
    }

    @Override
    public void onCompletion(MediaPlayer mp)
    {
        if (!moreVideosAvailable())
        {
            finish();
        }
    }

    private boolean moreVideosAvailable()
    {
        return current_video_index < videoUris.size();
    }

    private void fadeInView(View view)
    {
        view.setAlpha(0f);
        view.setVisibility(View.VISIBLE);
        view.animate().alpha(1f).setDuration(2000).setListener(null).start();
    }
}
I solved it by making a videoPlayerFragment that uses a TextureView as the surface for displaying the video, and then simply animating the whole fragment instead of the TextureView.
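Roughly, inside that fragment (the fragment and its layout here are my own names, so adjust to your project):
// Inside a hypothetical VideoPlayerFragment: fade in the fragment's root view
// (which wraps the TextureView) once playback has started.
@Override
public void onPrepared(MediaPlayer mp)
{
    mp.start();
    View root = getView(); // the fragment's inflated layout
    if (root != null) {
        root.setAlpha(0f);
        root.setVisibility(View.VISIBLE);
        root.animate().alpha(1f).setDuration(2000).start();
    }
}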
I am trying to view a live stream from Motion on my Raspberry Pi through an Android app written in Android Studio. I have put together this code using MediaPlayer:
public class MainActivity extends ActionBarActivity implements SurfaceHolder.Callback, MediaPlayer.OnPreparedListener {
    private MediaPlayer mediaPlayer;
    private SurfaceHolder vidHolder;
    private SurfaceView vidSurface;
    String vidAddress = "http://www.example.com/";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        vidSurface = (SurfaceView) findViewById(R.id.surfView);
        vidHolder = vidSurface.getHolder();
        vidHolder.addCallback(this);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        try {
            mediaPlayer = new MediaPlayer();
            mediaPlayer.setDisplay(vidHolder);
            mediaPlayer.setDataSource(vidAddress);
            mediaPlayer.prepare();
            mediaPlayer.setOnPreparedListener(this);
            mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
    }

    @Override
    public void onPrepared(MediaPlayer mp) {
        mediaPlayer.start();
    }
}
But it does not work. If I put in the URL of a (non-live) streaming video, for example www.something.com/vid.mp4, it works. (Yes, I have added the internet permission to the manifest.) Could anyone help me?
Thanks!
I noticed you are using a web address without a file at the end; that might be the cause. Motion serves the file stream.mjpg, so by default the local address would be http://hostip:8081/stream.mjpg
Source: http://www.lavrsen.dk/foswiki/bin/view/Motion/MotionGuideBasicFeatures#Webcam_Server
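So with the code above, the data source would look something like this (the IP is just a placeholder for your Pi's address; 8081 is Motion's default webcam port):
// Placeholder IP; Motion serves the live stream at /stream.mjpg on port 8081 by default
String vidAddress = "http://192.168.1.10:8081/stream.mjpg";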
I am using Eclipse to write an Android app. I want my app to display a background image which is stretched to the size of the screen.
I have written the following code, but when I ran it in the emulator the app immediately exited. Could someone please help me understand the problem?
Here is my code...
public class Roller extends Activity {
    Display display = getWindowManager().getDefaultDisplay();
    int dwidth = display.getWidth();
    int dheight = display.getHeight();
    Bitmap background1 = BitmapFactory.decodeResource(getResources(), R.drawable.sunnybackground);
    Bitmap BSunny = Bitmap.createScaledBitmap(background1, dwidth, dheight, true);

    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        setContentView(new Panel(this));
    }

    class Panel extends View {
        public Panel(Context context) {
            super(context);
        }

        public void onDraw(Canvas canvas) {
            canvas.drawBitmap(BSunny, 0, 0, null);
        }
    }
}
Why not just something like this:
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    requestWindowFeature(Window.FEATURE_NO_TITLE);
    ImageView iv = new ImageView(this, null);
    iv.setBackgroundResource(R.drawable.sunnybackground);
    setContentView(iv);
}
Why don't you set the background image in the layout xml file?
Do you have to set it programmatically at runtime?
I think you should do it as shown below.
(It seems that you can't get the display info until the activity has been created.)
public class Roller extends Activity {
    Bitmap BSunny;

    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        Display display = getWindowManager().getDefaultDisplay();
        int dwidth = display.getWidth();
        int dheight = display.getHeight();
        Bitmap background1 = BitmapFactory.decodeResource(getResources(), R.drawable.sunnybackground);
        BSunny = Bitmap.createScaledBitmap(background1, dwidth, dheight, true);
        setContentView(new Panel(this));
    }

    class Panel extends View {
        public Panel(Context context) {
            super(context);
        }

        public void onDraw(Canvas canvas) {
            canvas.drawBitmap(BSunny, 0, 0, null);
        }
    }
}