I'm using ARCore in my Android project. It has one activity, MainActivity.java, and one fragment, CustomArFragment.java.
The problem is that the camera quality is very low. How can I increase the camera preview resolution? In the future I need to recognize text using the camera, and with such low quality it's difficult to recognize anything. I haven't been able to find a solution on Google; I've tried many things with no success. Can anyone please tell me how to increase the camera quality? I'm totally new to the AR and Android fields.
Thanks :)
MainActivity.java :
public class MainActivity extends AppCompatActivity {
private CustomArFragment arFragment;
private TextView textView;
private AugmentedImageDatabase aid;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
arFragment = (CustomArFragment) getSupportFragmentManager().findFragmentById(R.id.arFragment);
textView = findViewById(R.id.textView);
arFragment.getArSceneView().getScene().addOnUpdateListener(this::onUpdate);
findViewById(R.id.registeredBtn).setOnClickListener(v -> {
if(ActivityCompat.checkSelfPermission(this,
Manifest.permission.WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED){
ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE},1);
return;
}
registeredImage();
});
}
private void registeredImage() {
File file = new File(getExternalFilesDir(null) + "/db.imgdb");
try {
FileOutputStream outputStream = new FileOutputStream(file);
Bitmap bitmap = BitmapFactory.decodeResource(getResources(),R.drawable.earth);
aid.addImage("earth",bitmap);
aid.serialize(outputStream);
outputStream.close();
Toast.makeText(this, "image Registered", Toast.LENGTH_SHORT).show();
} catch (IOException e) {
e.printStackTrace();
}
}
private void onUpdate(FrameTime frameTime) {
Frame frame = arFragment.getArSceneView().getArFrame();
Collection<AugmentedImage> images = frame.getUpdatedTrackables(AugmentedImage.class);
for(AugmentedImage image : images){
if(image.getTrackingMethod() == AugmentedImage.TrackingMethod.FULL_TRACKING){
if(image.getName().equals("lion")){
textView.setText("LION is visible");
}
else if(image.getName().equals("download")){
textView.setText("download is visible");
}
else{
textView.setText("Nothing is visible till now");
}
Log.d("Value of textView : "," " + textView.getText());
}
Log.d("Value of textView1 : "," " + textView.getText());
}
}
public void loadDB(Session session, Config config){
//InputStream dbStream = getResources().openRawResource(R.raw.imagedb);
try {
File file = new File(getExternalFilesDir(null) + "/db.imgdb");
FileInputStream dbStream = new FileInputStream(file);
aid = AugmentedImageDatabase.deserialize(session, dbStream);
config.setAugmentedImageDatabase(aid);
session.configure(config);
Log.d("TotalImages"," : " + aid.getNumImages());
}catch (IOException e){
e.printStackTrace();
}
}
}
CustomArFragment.java:
public class CustomArFragment extends ArFragment {
@Override
protected Config getSessionConfiguration(Session session){
getPlaneDiscoveryController().setInstructionView(null);
Config config = new Config(session);
config.setUpdateMode(Config.UpdateMode.LATEST_CAMERA_IMAGE);
MainActivity activity = (MainActivity) getActivity();
activity.loadDB(session,config);
this.getArSceneView().setupSession(session);
return config;
}
}
I added some info regarding a similar topic in this other question. But to answer yours: check which CameraConfig is used by default (call getCameraConfig() on the Session). Then you can get the one that interests you (e.g. a high GPU texture size) through getSupportedCameraConfigs() and configure the Session with it through setCameraConfig().
Reading your question again: if you need to recognize text on the image in the future, you might want a higher-resolution CPU image rather than a larger GPU texture size. You can still use CameraConfig to check that and choose your favorite, but keep in mind that higher-resolution CPU images decrease performance quite a bit. You might want to check my answer in the question I mentioned earlier.
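For the OCR route specifically, once a CameraConfig with a bigger CPU image is set, the per-frame image can be grabbed in the update callback. A minimal sketch, assuming the onUpdate callback from your code and an ARCore version that has acquireCameraImage():

// Sketch: grab the CPU camera image from the current ARCore frame for OCR.
// acquireCameraImage() returns a YUV_420_888 android.media.Image and throws
// NotYetAvailableException for the first few frames after session start.
private void onUpdate(FrameTime frameTime) {
    Frame frame = arFragment.getArSceneView().getArFrame();
    if (frame == null) return;
    try (Image cameraImage = frame.acquireCameraImage()) {
        // Width/height here follow the CameraConfig's CPU image size.
        // Hand the YUV planes to your text-recognition pipeline.
    } catch (NotYetAvailableException e) {
        // Image not ready yet; try again on a later frame.
    }
}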
EDIT: Use this code snippet to check the CameraConfig being used and adapt it to your needs:
Session session = new Session(requireActivity());
// ...
Size selectedSize = new Size(0, 0);
CameraConfig selectedCameraConfig = null;
CameraConfigFilter filter = new CameraConfigFilter(session);
List<CameraConfig> cameraConfigsList = session.getSupportedCameraConfigs(filter);
for (CameraConfig currentCameraConfig : cameraConfigsList) {
Size cpuImageSize = currentCameraConfig.getImageSize();
Size gpuTextureSize = currentCameraConfig.getTextureSize();
Log.d("TAG", "CurrentCameraConfig CPU image size:" + cpuImageSize + " GPU texture size:" + gpuTextureSize);
// Adapt this check to your needs
if (gpuTextureSize.getWidth() > selectedSize.getWidth()) {
selectedSize = gpuTextureSize;
selectedCameraConfig = currentCameraConfig;
}
}
if (selectedCameraConfig != null) {
Log.d("TAG", "Selected CameraConfig CPU image size:" + selectedCameraConfig.getImageSize() + " GPU texture size:" + selectedCameraConfig.getTextureSize());
session.setCameraConfig(selectedCameraConfig);
}
// ...
// Don't forget to configure the session afterwards
session.configure(config);
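As a side note, the CameraConfigFilter in the snippet can also pre-narrow the candidate list before the loop; a small sketch, assuming an ARCore version where CameraConfig.TargetFps is available:

// Only consider configs that target 30 fps; setTargetFps takes an EnumSet.
CameraConfigFilter fpsFilter = new CameraConfigFilter(session)
        .setTargetFps(EnumSet.of(CameraConfig.TargetFps.TARGET_FPS_30));
List<CameraConfig> narrowedConfigs = session.getSupportedCameraConfigs(fpsFilter);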
Related
I have an app that writes to its local storage depending on user actions; said contents need to be forwarded to another app.
My approach:
create a worker thread with a file observer pointed at local storage
start the worker from the app's main activity
the worker thread creates and sends intents with the updated contents to the separate app
I'm not sure (maybe I need to open a separate question), but everything created in an activity gets destroyed when the activity is stopped, right? Meaning that workers and file observers added this way have the same life span as the activity they're defined in, right?
Code:
MainActivity.java:
public class MainActivity extends AppCompatActivity {
private static final String TAG = MainActivity.class.getSimpleName();
private static final String FILE_OBSERVER_WORK_NAME = "file_observer_work";
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
Log.i(TAG, "Creating file observer worker");
WorkManager workManager = WorkManager.getInstance(getApplication());
WorkContinuation continuation = workManager
.beginUniqueWork(FILE_OBSERVER_WORK_NAME,
ExistingWorkPolicy.REPLACE,
OneTimeWorkRequest.from(APIWorker.class));
Log.i(TAG, "Starting worker");
continuation.enqueue();
final Button button = findViewById(R.id.button2);
button.setOnClickListener(new View.OnClickListener() {
public void onClick(View v) {
Log.i(TAG, "Button clicked!");
String stuffToWriteToFile = getStuff();
String cwd = getApplicationInfo().dataDir;
String stuffFilePath= cwd + File.separator + "stuff.json";
PrintWriter stuffFile= null;
try {
stuffFile = new PrintWriter(stuffFilePath, "UTF-8");
stuffFile.println(stuffToWriteToFile);
stuffFile.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (UnsupportedEncodingException e) {
e.printStackTrace();
}
}
});
}
@Override
public void onResume(){
super.onResume();
// start worker here?
}
@Override
public void onStart() {
super.onStart();
// start worker here?
}
}
APIWorker.java:
public class APIWorker extends Worker {
public APIWorker(@NonNull Context context, @NonNull WorkerParameters workerParams) {
super(context, workerParams);
}
private static final String TAG = APIWorker.class.getSimpleName();
@NonNull
@Override
public Result doWork() {
Context applicationContext = getApplicationContext();
Log.d(TAG, "Observing stuff file");
// compute the watched directory before constructing the observer (cwd was undefined here)
String cwd = applicationContext.getApplicationInfo().dataDir;
FileObserver fileObserver = new FileObserver(cwd) {
@Override
public void onEvent(int event, @Nullable String path) {
if(event == FileObserver.CREATE ||
event == FileObserver.MODIFY) {
String stuffFilePath = cwd + File.separator + "stuff.json";
String fileContents;
File observedFile = new File(stuffFilePath);
long length = observedFile.length();
if (length < 1 || length > Integer.MAX_VALUE) {
fileContents = "";
Log.w(TAG, "Empty file: " + observedFile);
} else {
try (FileReader in = new FileReader(observedFile)) {
char[] content = new char[(int)length];
int numRead = in.read(content);
if (numRead != length) {
Log.e(TAG, "Incomplete read of " + observedFile +
". Read chars " + numRead + " of " + length);
}
fileContents = new String(content, 0, numRead);
Log.d(TAG, "Sending intent ");
String packageName = "com.cam.differentapp";
Intent sendIntent = applicationContext.getPackageManager().
getLaunchIntentForPackage(packageName);
if (sendIntent == null) {
// Bring user to the market or let them choose an app?
sendIntent = new Intent(Intent.ACTION_VIEW);
sendIntent.setData(Uri.parse("market://details?id=" + packageName));
}
// sendIntent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
sendIntent.setAction(Intent.ACTION_SEND);
sendIntent.putExtra(Intent.EXTRA_TEXT, fileContents);
sendIntent.setType("application/json");
applicationContext.startActivity(sendIntent);
Log.d(TAG, "Intent sent ");
}
catch (Exception ex) {
Log.e(TAG, "Failed to read file " + path, ex);
fileContents = "";
}
}
}
}
};
fileObserver.startWatching();
// doWork() must return a Result, not null.
return Result.success();
}
}
Looking at the docs:
https://developer.android.com/guide/components/activities/background-starts
there are restrictions as to when activities can be started from the background but also exceptions, namely:
The app has a visible window, such as an activity in the foreground.
meaning (I think?) that as long as the user interacts with the app (MainActivity) the background worker should run, correct? It's stopped if the activity is paused/destroyed, right?
Usually you would use a Service if you have background processing to do that doesn't need user interaction (display or user input). If your app is in the foreground, your Service can launch other activities using startActivity().
Your architecture seems very strange to me. You are using a Worker, which has a maximum 10-minute lifetime. You are starting the Worker, which then creates a FileObserver to detect creation/modification of files; it then reads the file and starts another Activity. This is a very complicated and roundabout way of doing things, and I have doubts that you can get it working reliably.
Your Activity is writing the data to the file system. It could just call a method (on a background thread) after it has written the file that forwards the data to the other app. This would be much more straightforward and have far fewer moving parts.
I don't know exactly how the lifecycle of the Activity affects Workers. I would assume they are not directly linked to the Activity and therefore would not stop when the Activity is paused or destroyed.
I also notice that you are writing to a file on the main (UI) thread (in your OnClickListener). This is not OK: file I/O can block, and you don't want to block the main (UI) thread, so do it on a background thread.
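To make that concrete, here is a rough sketch of the simpler architecture: the click handler hands the write to a single-thread executor and forwards the data right after the file is on disk. The target package is the one from your APIWorker; saveAndForward is a name I made up:

// Sketch only: write stuff.json off the UI thread, then forward the same
// data to the other app directly, with no Worker or FileObserver in between.
private final ExecutorService ioExecutor = Executors.newSingleThreadExecutor();

private void saveAndForward(final String stuffToWriteToFile) {
    ioExecutor.execute(() -> {
        File stuffFile = new File(getApplicationInfo().dataDir, "stuff.json");
        try (PrintWriter writer = new PrintWriter(stuffFile, "UTF-8")) {
            writer.println(stuffToWriteToFile);
        } catch (FileNotFoundException | UnsupportedEncodingException e) {
            Log.e(TAG, "Failed to write " + stuffFile, e);
            return;
        }
        // The file is written; now hand the contents over.
        Intent sendIntent = new Intent(Intent.ACTION_SEND);
        sendIntent.putExtra(Intent.EXTRA_TEXT, stuffToWriteToFile);
        sendIntent.setType("application/json");
        sendIntent.setPackage("com.cam.differentapp"); // package from your APIWorker
        startActivity(sendIntent);
    });
}

Remember to shut the executor down in onDestroy(). Note that ACTION_SEND normally goes through a chooser unless the receiving app declares a matching intent filter and you pin the package as above.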
So I basically have a List of URIs, each pointing to a .jpeg file, and I want to display this list like a GIF (not necessarily making an actual GIF, only displaying it).
After some research I found the AnimationDrawable class, converted each URI into a Drawable, and added each one as a frame to the AnimationDrawable.
This is my code:
AnimationDrawable ad = new AnimationDrawable();
Drawable[] dr = new Drawable[position+1];
ProgressItem pi;
try {
for (int i = 0; i <= position; i++) {
pi = progress.get(i);
try {
dr[i] = drawableFromUrl(pi.getImage());
} catch (IOException ios) {
Toast.makeText(activity, ios.getMessage(), Toast.LENGTH_SHORT).show();
}
}
}
catch(Exception ex)
{
Toast.makeText(activity, ex.getMessage(), Toast.LENGTH_SHORT).show();
}
Intent i = new Intent(activity,ProgressImage.class);
DataWraper.setItems(dr);
drawableFromUrl:
public Drawable drawableFromUrl(String url) throws IOException {
Bitmap x;
HttpURLConnection connection = (HttpURLConnection) new URL(url).openConnection();
MyDownloadTask mdt = new MyDownloadTask();
try{
mdt.execute(connection);
}
catch(Exception ex)
{
Toast.makeText(activity, ex.getMessage(), Toast.LENGTH_SHORT).show();
}
InputStream input = mdt.getInputStream();
x = BitmapFactory.decodeStream(input);
return new BitmapDrawable(activity.getResources(), x); // the no-Resources constructor is deprecated
}
The implementation part:
Glide.with(this)
.load(ad)
.into(progressImage);
When I try to Glide the AnimationDrawable into the ImageView, I get the following exception:
java.lang.IllegalArgumentException: Unknown type class android.graphics.drawable.AnimationDrawable.
This made me second-guess the way I'm trying to pull this off. Should it be this complicated?
If this is the right way, what am I doing wrong? Maybe there's another way of doing it? I'd love to get some details. Thanks in advance!
https://github.com/koral--/android-gif-drawable
Not sure if you're open to any 3rd-party libraries, but I used this one for GIFs before and it worked quite well for me.
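In case it helps, basic usage of that library looks roughly like this (coordinates and API recalled from its README, so double-check the current version); note it plays actual GIF files, so the JPEG frames would first need to be encoded into one:

// Gradle: implementation 'pl.droidsonroids.gif:android-gif-drawable:1.2.25' (version may be stale)
import pl.droidsonroids.gif.GifDrawable;

// Load an animated GIF from assets and show it in a plain ImageView.
GifDrawable gif = new GifDrawable(getAssets(), "animation.gif"); // throws IOException
imageView.setImageDrawable(gif);

Alternatively, Glide isn't needed to display an AnimationDrawable at all; setting it on the ImageView directly sidesteps the Unknown type error. A minimal sketch, assuming the dr array from the question and a made-up 100 ms frame duration:

// Build the animation from the decoded frames and attach it directly.
AnimationDrawable ad = new AnimationDrawable();
for (Drawable frame : dr) {
    if (frame != null) {
        ad.addFrame(frame, 100); // 100 ms per frame is an assumed duration
    }
}
ad.setOneShot(false);               // loop like a GIF
progressImage.setImageDrawable(ad); // no Glide involved
ad.start();                         // start after the drawable is attached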
I have this weird problem that seems to happen only on one of my tablets, which runs Android 6.
I have this chunk of code to add a photo taken with the camera to a RecyclerView:
1) I create a File object on the device (this is the photo)
2) I get the Uri from that file
3) I create an intent, passing in that Uri, as such:
Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
if (intent.resolveActivity(getActivity().getPackageManager()) != null)
// this is done in a fragment, everything else below is in the if statement
intent.putExtra(MediaStore.EXTRA_OUTPUT, uri);
startActivityForResult(intent, option_int);
I know MediaStore.EXTRA_OUTPUT does NOT return anything to onActivityResult, but instead writes to the uri you pass in.
now inside of onActivityResult I have
Log.i("PHOTO", "path--->" + uri.getPath());
So I want to mention: whether the program works or DOESN'T work, the uri ALWAYS has a path. One example of one of the paths is
/storage/emulated/0/data/20161212_175150797715155.jpg
so, to continue on in onActivityResult:
5) create a bitmap based on the uri path, to use later on
BitmapFactory.Options bitmap_options = new BitmapFactory.Options();
bitmap_options.inPreferredConfig = Bitmap.Config.ARGB_8888;
*************** problem here ****************
Bitmap temp = BitmapFactory.decodeFile(uri.getPath());
the temp bitmap comes back null SOME of the time; let me explain.
When I take the photo, the Asus Android 6 tablet shows two buttons: one to discard the photo and one to keep it. Here is the weird part: ON THAT screen, if I wait 5-15 seconds before pressing the button to keep the photo, the bitmap will NOT be null; but if I take a photo and immediately accept it, the bitmap is null.
As said before, either way, whether the bitmap comes out null or not, the uri always has a path before it is passed into the decode function (which is weird).
I have no clue whether it's the camera software, the physical camera hardware, or an Android 6 bug...
I also want to say I'm not sure this is the best code, but it has worked on 4 other Android devices (phones and tablets); it's just this ONE tablet, an Asus and the only one running Android 6. Everything else works fine.
EDIT:
I tried this as suggested
Bitmap TEMP = null;
try {
TEMP = BitmapFactory.decodeStream(getContext().getContentResolver().openInputStream(uri));
}
catch (FileNotFoundException e)
{
e.printStackTrace();
}
with no luck
EDIT 2:
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == 1 && resultCode == Activity.RESULT_OK) {
BitmapFactory.Options bitmap_options = new BitmapFactory.Options();
bitmap_options.inPreferredConfig = Bitmap.Config.ARGB_8888;
Bitmap TEMP = BitmapFactory.decodeFile(uri.getPath());
Bitmap bitmap_image = ThumbnailUtils.extractThumbnail(TEMP, THUMBNAIL_SIZE, THUMBNAIL_SIZE);
if (bitmap_image == null)
Log.i("PHOTO", "BITMAP THUMBNAIL NULL");
setAdapterBitmap(bitmap_image, uri.getPath(), 1);
}
}
SOLUTION:
I had this method to create a File object and turn it into a Uri:
private File createFileForImage() throws IOException {
String file_creation_time = new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.getDefault()).format(new Date());
String image_file_name = DIR_NAME + "_" + file_creation_time;
File storage_dir = new File(String.valueOf(Environment.getExternalStorageDirectory()), DIR_NAME);
if (!storage_dir.exists())
{
if(!storage_dir.mkdir())
{
return null;
}
}
return File.createTempFile(image_file_name, ".jpg", storage_dir);
}
then I used this after it was returned:
uri = Uri.fromFile(image_file);
but for some reason, although it worked, it had a slight delay that caused the bizarre behavior described in the original post.
Jan Kaufmann's suggestion seems to work, so I made some small modifications:
private Uri getOutputMediaUriFromFile()
{
File photo_storage_dir = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES), DIR_NAME);
if (!photo_storage_dir.exists())
{
if (!photo_storage_dir.mkdirs())
{
return null;
}
}
String file_creation_time = new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.getDefault()).format(new Date());
String image_file_name = "ipr" + "_" + file_creation_time;
File photo = new File(photo_storage_dir.getPath() + File.separator + image_file_name + ".jpg");
return Uri.fromFile(photo);
}
It does essentially the same thing, but for some reason it doesn't cause the issue I was having, at least not in my testing.
I am glad this works now, but can anyone explain why?
Make sure the image size is not too big. If needed, subsample the original image to save memory:
BitmapFactory.Options opt = new BitmapFactory.Options();
opt.inSampleSize = 8;
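The options only take effect when passed to the decode call, e.g.:

// Decode at roughly 1/8 of the original resolution to save memory.
BitmapFactory.Options opt = new BitmapFactory.Options();
opt.inSampleSize = 8;
Bitmap sampled = BitmapFactory.decodeFile(uri.getPath(), opt);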
Make sure you have permission for
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
Or try constructing your Uri like this:
Uri uri = getOutputMediaFileUri(MEDIA_TYPE_IMAGE, "myimage"); where getOutputMediaFileUri wraps the getOutputMediaFile helper below:
private File getOutputMediaFile(int type, String imgname) {
// External sdcard location
//public static String DIRECTORY_PICTURES = "Pictures";
File mediaStorageDir = new File(
Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES), IMAGE_DIRECTORY_NAME);
// Create the storage directory if it does not exist
if (!mediaStorageDir.exists()) {
if (!mediaStorageDir.mkdirs()) {
Log.d(IMAGE_DIRECTORY_NAME, "Oops! Failed create "
+ IMAGE_DIRECTORY_NAME + " directory");
return null;
}
}
File mediaFile;
if (type == MEDIA_TYPE_IMAGE) {
mediaFile = new File(mediaStorageDir.getPath() + File.separator + "IMG_" + imgname + ".jpg");
} else {
return null;
}
return mediaFile;
}
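The helper above returns a File, so presumably getOutputMediaFileUri is just a thin wrapper along these lines (my reconstruction, not the answerer's code):

// Assumed wrapper: convert the File from getOutputMediaFile into a Uri.
private Uri getOutputMediaFileUri(int type, String imgname) {
    File mediaFile = getOutputMediaFile(type, imgname);
    return mediaFile != null ? Uri.fromFile(mediaFile) : null;
}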
Try this, dude:
TEMP = MediaStore.Images.Media.getBitmap(context.getContentResolver(), YOUR_URL);
If this does not work, my guess is that you didn't create a temp file to save the image.
Or wait until the camera app has finished writing before reading the file, e.g. with a delayed handler:
new Handler().postDelayed(new Runnable() {
@Override
public void run() {
try {
// GET_FILE: read the image file here (placeholder from the original answer)
} catch (Exception ex) {
Log.i(TAG, "run: " + ex);
}
}
}, 15000);
I have been busting my head over this; it's the last thing I need to complete and the app is done.
Basically, I created a camera for my app and I need to switch from the back camera to the front camera in onClick()...
When I switch, I lose the preview... When I record, the screen is black but the video gets recorded... but no preview at all. Here is the code:
@Override
protected void onCreate(Bundle saved) {
super.onCreate(saved);
requestWindowFeature(Window.FEATURE_NO_TITLE);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
WindowManager.LayoutParams.FLAG_FULLSCREEN);
setContentView(R.layout.camera);
//some initializing code like checking flash, number of cameras...
preview = (FrameLayout) findViewById(R.id.camera_preview);
}
@Override
public void onResume(){
super.onResume();
if (Camera.getNumberOfCameras() < 2) {
a.id(R.id.camera_switch).clickable(false);
}
if(m!=null){
m.reset();
m.release();
m=null;
c.lock();
}
if (c != null) {
c.release();
c = null;
}
cam = "front";
Instance();
}
public void Instance(){
if(flash.equalsIgnoreCase("yes"))
a.id(R.id.camera_flash).clickable(true);
if(cam.equalsIgnoreCase("back")){
try{
m.reset();
m = null;
c.stopPreview();
c.release();
c.reconnect();
c = null;
}catch(Exception e){}
a.id(R.id.camera_flash).clickable(false);
Camera c = getCameraInstanceB(this);
parameters = c.getParameters();
parameters.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
c.setParameters(parameters);
cam = "front";
try {
c.setPreviewDisplay(mPreview.getHolder());
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
c.startPreview();
}else{
try{
c.release();
c = null;
}catch(Exception e){}
c = getCameraInstance(this);
parameters = c.getParameters();
cam = "back";
}
m = new MediaRecorder();
// Create our Preview view and set it as the content of our activity.
mPreview = new CameraPreview(this, c);
int orien =getResources().getConfiguration().orientation;
if(orien ==1){
parameters.setRotation(0); // set rotation to save the picture
c.setDisplayOrientation(90);
cam_rotation =90;
parameters.setPictureSize(640, 480);
PIC_ORIENTATION = "portrait";
Toast.makeText(this, PIC_ORIENTATION, Toast.LENGTH_SHORT).show();
}else{
parameters.setRotation(0); // set rotation to save the picture
c.setDisplayOrientation(0);
parameters.setPictureSize(640, 480);
PIC_ORIENTATION = "landscape";
cam_rotation=0;
Toast.makeText(this, PIC_ORIENTATION, Toast.LENGTH_SHORT).show();
}
c.setParameters(parameters);
m.setCamera(c);
preview.addView(mPreview);
}
now the camera instances for back and front
public static Camera getCameraInstance(Cam cam){
c = null;
try {
c = Camera.open(CameraInfo.CAMERA_FACING_BACK); // attempt to get a Camera instance
Camera.Parameters parameters = c.getParameters();
parameters.setRecordingHint(true);
parameters.setFocusMode(Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
c.setParameters(parameters);
}
catch (Exception e){
// Camera is not available (in use or does not exist)
e.printStackTrace();
text = "The camera is in use";
//---set the data to pass back---
data.putExtra("vid",text);
cam.setResult(RESULT_OK, data);
//---close the activity---
cam.finish();
}
return c; // returns null if camera is unavailable
}
public static Camera getCameraInstanceB(Cam cam){
c = null;
try {
c = Camera.open(CameraInfo.CAMERA_FACING_FRONT); // attempt to get a Camera instance
Camera.Parameters parameters = c.getParameters();
parameters.setRecordingHint(true);
c.setParameters(parameters);
}
catch (Exception e){
// Camera is not available (in use or does not exist)
e.printStackTrace();
text = "The camera is in use";
//---set the data to pass back---
data.putExtra("vid",text);
cam.setResult(RESULT_OK, data);
//---close the activity---
cam.finish();
}
return c; // returns null if camera is unavailable
}
In onResume() everything is fine, but when I switch... no more preview.
After spending hours, I finally came up with a solution: basically, I just recreate the SurfaceView on each switch, as if it were onStart():
public void Instance(){
preview = (FrameLayout) findViewById(R.id.camera_preview);
//rest of the code here
Now it works like a charm... no more errors, even on onResume().
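For anyone landing here, the essence of the fix is to tear down the old Camera and rebuild the preview surface on every switch. A condensed sketch against the same old android.hardware.Camera API (the c, preview and mPreview fields follow the question's code):

// Condensed sketch: release the current camera, open the requested one, and
// recreate the SurfaceView so the preview is rebuilt from scratch.
private void switchCamera(int facing) { // CameraInfo.CAMERA_FACING_FRONT or _BACK
    if (c != null) {
        c.stopPreview();
        c.release();
        c = null;
    }
    preview.removeAllViews();              // drop the stale SurfaceView
    c = Camera.open(facing);
    c.setDisplayOrientation(90);           // portrait; adjust per orientation
    mPreview = new CameraPreview(this, c); // its surfaceCreated() is expected to start the preview
    preview.addView(mPreview);
}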
I've been trying to build a feature in my app that consists of a Bluetooth RFID scanner; it's paired to my device and I have it working and all.
I can receive the text and log it in the console. When I launch the activity, everything goes fine: the stick reads the code and the text is appended into an EditText. But if I go back and enter the activity again, I can see the code in the log, but the text doesn't reach the EditText.
I tried a lot of different approaches, but nothing seems to work :/
Here's the code I have:
/**
* Called when the activity is first created.
*/
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.bluetooth);
mBluetoothAdapter = BluetoothAdapter.getDefaultAdapter();
if (mBluetoothAdapter == null) {
// null-check before using the adapter; the old order NPE'd on devices without Bluetooth
Toast.makeText(this, "Bluetooth is not available.", Toast.LENGTH_LONG).show();
finish();
return;
}
Set<BluetoothDevice> bondedSet = mBluetoothAdapter.getBondedDevices();
if (!mBluetoothAdapter.isEnabled()) {
Toast.makeText(this, "Please enable your BT and re-run this program.", Toast.LENGTH_LONG).show();
finish();
}
if (mBluetoothAdapter.isEnabled()) {
if(bondedSet.size() == 1){
for(BluetoothDevice device : bondedSet){
address = device.getAddress();
Log.d("bt:", address);
}
}
}
String address = "00:A0:96:2A:0A:1B";
out = (EditText) findViewById(R.id.output);
BluetoothDevice device = mBluetoothAdapter.getRemoteDevice(address);
Log.d(TAG, device.getName() + " connected");
myConnection = new ConnectThread(device);
myConnection.start();
}
private class ConnectThread extends Thread {
private final BluetoothSocket mySocket;
Message msg;
public ConnectThread(BluetoothDevice device) {
BluetoothSocket tmp = null;
try {
tmp = device.createRfcommSocketToServiceRecord(MY_UUID);
} catch (IOException e) {
Log.d(TAG, "CONNECTION IN THREAD DIDNT WORK");
}
mySocket = tmp;
}
Handler uiThreadHandler = new Handler() {
public void handleMessage(Message msg) {
out = (EditText) findViewById(R.id.output);
Object o = msg.obj;
out.append(o.toString().trim());
Log.d("handler", o.toString());
}
};
public void run() {
out = (EditText) findViewById(R.id.output);
Log.d(TAG, "STARTING TO CONNECT THE SOCKET");
setName("My Connection Thread");
InputStream inStream = null;
boolean run = false;
mBluetoothAdapter.cancelDiscovery();
try {
mySocket.connect();
run = true;
} catch (IOException e) {
Log.d(TAG, this.getName() + ": CONN DIDNT WORK, Try closing socket");
try {
mySocket.close();
Log.d(TAG, this.getName() + ": CLOSED SOCKET");
} catch (IOException e1) {
Log.d(TAG, this.getName() + ": COULD NOT CLOSE SOCKET", e1);
// Thread.destroy() was never implemented and just throws; removed.
}
run = false;
}
synchronized (BluetoothActivity.this) {
myConnection = null;
}
byte[] buffer = new byte[1024];
int bytes;
// handle Connection
try {
inStream = mySocket.getInputStream();
while (run) {
try {
bytes = inStream.read(buffer);
readMessage = new String(buffer, 0, bytes);
msg = uiThreadHandler.obtainMessage();
msg.obj = readMessage;
uiThreadHandler.sendMessage(msg);
Log.d(TAG, "Received: " + readMessage);
} catch (IOException e3) {
Log.d(TAG, "disconnected");
break; // leave the read loop once the socket drops, instead of spinning
}
}
} catch (IOException e) {
e.printStackTrace();
}
}
}
My guess is that this has something to do with the Thread itself. When you start your Activity for the first time, you also call .start() on the Thread, and that works fine.
The problem is when you leave your Activity and open it up again. In that case, one of onStop() or onPause() is called (depending on the situation), and onRestart() or onResume() will be called afterwards, respectively.
Here's the trick: throughout all of that, your Thread is still running. From the code you show, it is never stopped or paused, and keeps running the whole time. So basically my tip is that there's something you do within your Activity's onCreate() method that should also be undone in your onPause() and onStop() events, and my other tip is that it's somewhere within your ConnectThread(BluetoothDevice device) constructor.
To work out how to proceed, I'd first define both onStop() and onPause() within your Activity and see which one fires, log every attribute to see its value/state, and that way you'll be able to debug what is failing.
There's a diagram of the Activity lifecycle in the official docs.
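As a concrete starting point, one way to tie the thread to the lifecycle is a cancel() method that flips the loop flag and closes the socket. A sketch following the question's field names, where run would have to become a volatile field of ConnectThread instead of a local variable:

// In the Activity: stop the connection when it goes to the background.
@Override
protected void onPause() {
    super.onPause();
    if (myConnection != null) {
        myConnection.cancel(); // hypothetical method, sketched below
        myConnection = null;
    }
}

// In ConnectThread:
public void cancel() {
    run = false;          // makes the read loop exit
    try {
        mySocket.close(); // unblocks a blocking read() with an IOException
    } catch (IOException e) {
        Log.d(TAG, "Could not close socket", e);
    }
}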
The problem was solved; the code works and the TextView gets the input stream. The problem was that when I left the activity, the thread continued to run. After TONS of hours spent on this, I turned the TextView into a static var and it worked :)
If anyone reads this, I hope it helps.