I've been trying to send an image from one device to another using the BluetoothChat sample from Google. I can send string values from device to device, but I'm having problems sending images. I have two classes handling the receiving and sending of data.
For sending images, I decode the image path into a Bitmap, convert it to a byte[], and pass that to the utility class, the same as the BluetoothChat sample but with an increased buffer size (1024 by default, changed to 8192). The code in my BluetoothSend class that hands data to the utility class is this:
send.setOnClickListener(view -> {
    Bitmap bitmap = BitmapFactory.decodeFile(filePath);
    int rowBytes = bitmap.getRowBytes(); // bytes per row, not pixel width
    int height = bitmap.getHeight();
    int bmpSize = rowBytes * height;
    ByteBuffer byteBuffer = ByteBuffer.allocate(bmpSize);
    bitmap.copyPixelsToBuffer(byteBuffer);
    byte[] byteArray = byteBuffer.array();
    sendUtils.write(byteArray);
    bitmap.recycle();
});
This is the utility class that handles the sending and receiving of data:
/**
 * This thread runs during a connection with a remote device.
 * It handles all incoming and outgoing transmissions.
 */
private class ConnectedThread extends Thread {
    private final BluetoothSocket bluetoothsocket;
    private final InputStream inputStream;
    private final OutputStream outputStream;

    public ConnectedThread(BluetoothSocket socket) {
        bluetoothsocket = socket;
        InputStream tmpIn = null;
        OutputStream tmpOut = null;
        try {
            tmpIn = bluetoothsocket.getInputStream();
            tmpOut = bluetoothsocket.getOutputStream();
        } catch (IOException e) {
            Log.e("ConnectedThrd->Cons", "Socket not created.");
            e.printStackTrace();
        }
        inputStream = tmpIn;
        outputStream = tmpOut;
    }

    public void run() {
        byte[] buffer = new byte[8192]; // 1024 original
        int bytes;
        // Keep listening to the InputStream while connected
        while (true) {
            try {
                // Read from the InputStream
                try {
                    bytes = inputStream.read(buffer);
                    // Send the obtained bytes to the UI Activity
                    handler.obtainMessage(BluetoothSend.MESSAGE_READ, bytes, -1, buffer).sendToTarget();
                    handler.obtainMessage(BluetoothReceive.MESSAGE_READ, bytes, -1, buffer).sendToTarget();
                } catch (NullPointerException n) {
                    Log.e("ConnectedThrd->Run", n.getMessage());
                }
            } catch (IOException e) {
                Log.e("ConnectedThrd->Run", "Connection Lost.", e);
                e.printStackTrace();
                connectionLost();
                break;
            }
        }
    }

    public void write(byte[] buffer) {
        try {
            try {
                outputStream.write(buffer);
                handler.obtainMessage(BluetoothSend.MESSAGE_WRITE, -1, -1, buffer)
                        .sendToTarget();
                handler.obtainMessage(BluetoothReceive.MESSAGE_WRITE, -1, -1, buffer)
                        .sendToTarget();
            } catch (NullPointerException n) {
                Log.e("ConnectedThrd->Write", "Bluetooth Socket is null: " + n.getMessage());
            }
        } catch (IOException e) {
            Log.e("ConnectedThread->Write", "Empty write stream.");
        }
    }

    public void cancel() {
        try {
            bluetoothsocket.close();
        } catch (IOException e) {
            Log.e("ConnectedThread->Cancel", "Failed to close socket.");
        }
    }
}
Lastly, in the BluetoothReceive class, I receive the data using a handler. The code for the handler is as follows:
case MESSAGE_READ:
    // Read message from sender
    byte[] bufferRead = (byte[]) message.obj;
    // bitmap decodedByte is null
    Bitmap decodedByte = BitmapFactory.decodeByteArray(bufferRead, 0, bufferRead.length);
    int height = decodedByte.getHeight();
    int width = decodedByte.getWidth();
    Bitmap.Config config = Bitmap.Config.valueOf(decodedByte.getConfig().name());
    Bitmap bitmap_tmp = Bitmap.createBitmap(width, height, config);
    ByteBuffer buffer = ByteBuffer.wrap(bufferRead);
    bitmap_tmp.copyPixelsFromBuffer(buffer);
    fileView.setImageBitmap(bitmap_tmp);
    break;
I always get a null value when I try to decode the byte array received from the other device into a Bitmap so I can display it in an ImageView.
What am I doing wrong?
As you said, you can pass strings. You can convert the bitmap to a Base64 string and send it:
Bitmap bitmap = BitmapFactory.decodeFile("filePath");
ByteArrayOutputStream baos = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, baos);
byte[] bytes = baos.toByteArray();
String encodedImage = Base64.encodeToString(bytes, Base64.DEFAULT);
On the receiving side, reverse it (Base64 string back to image):
byte[] decodedString = Base64.decode(encodedImage, Base64.DEFAULT);
Bitmap decodedByte = BitmapFactory.decodeByteArray(decodedString, 0, decodedString.length);
I am trying to receive an image in an Android client socket from a C# server socket. The C# server works fine for a C# client, and the received image is exactly the same as the image sent. The problem is when I try to receive the image in the Android client socket: it shows only a white line at the top of the image. I think the Android app is not reading the whole byte array, even though I send and receive the size of the image before the image itself.
Here is my code:
class imageReceiver implements Runnable {
    Bitmap bitmap;
    Bitmap btmp;
    private InputStreamReader isr;
    private String mesg;
    BufferedReader br;
    MainActivity mn;
    DataInputStream dis;
    byte[] data;

    public imageReceiver(MainActivity mn) {
        this.mn = mn;
    }

    @Override
    public void run() {
        if (s.isConnected()) {
            try {
                final InputStream in = s.getInputStream();
                dis = new DataInputStream(in);
                byte[] readMsgLen = new byte[4];
                dis.read(readMsgLen, 0, 4);
                final int length = readMsgLen[0] << 24
                        | (readMsgLen[1] & 0xFF) << 16
                        | (readMsgLen[2] & 0xFF) << 8
                        | (readMsgLen[3] & 0xFF);
                data = new byte[length];
                dis.readFully(data, 0, data.length);
                bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
                mn.runOnUiThread(new Runnable() {
                    @Override
                    public void run() {
                        Toast.makeText(mn.getApplicationContext(),
                                "Image received of length: " + length,
                                Toast.LENGTH_SHORT).show();
                        if (bitmap == null) {
                            Toast.makeText(mn.getApplicationContext(),
                                    "Image is not received",
                                    Toast.LENGTH_SHORT).show();
                        } else {
                            Toast.makeText(mn.getApplicationContext(),
                                    "Image received",
                                    Toast.LENGTH_SHORT).show();
                            Picasso.get()
                                    .load(getImageUri(mn, bitmap))
                                    .fit()
                                    .into(mn.iv);
                        }
                    }
                });
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }

    public Uri getImageUri(Context inContext, Bitmap inImage) {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        inImage.compress(Bitmap.CompressFormat.PNG, 100, bytes);
        String path = MediaStore.Images.Media.insertImage(
                inContext.getContentResolver(),
                inImage, "Sssss", null);
        return Uri.parse(path);
    }
}
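One thing that stands out (an observation, not a verified fix): dis.read(readMsgLen, 0, 4) is allowed to return fewer than 4 bytes, and the decoded length also depends on the byte order the C# server uses when writing the header (BitConverter.GetBytes is little-endian on typical hardware). A more defensive version of the header read is sketched below, with the byte order called out as an assumption to check against the server code; s and dis are the names from the code above, and it uses java.nio.ByteBuffer and java.nio.ByteOrder.
    DataInputStream dis = new DataInputStream(s.getInputStream());
    byte[] readMsgLen = new byte[4];
    dis.readFully(readMsgLen, 0, 4);           // blocks until all 4 header bytes arrive
    int length = ByteBuffer.wrap(readMsgLen)
            .order(ByteOrder.LITTLE_ENDIAN)    // assumption: the server writes little-endian
            .getInt();
    byte[] data = new byte[length];
    dis.readFully(data, 0, data.length);       // as in the original code
    Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);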
I am developing an app that captures all images sent via WhatsApp through notifications, but the function apparently does not work on older versions of Android. Help me please!
public String getBase64(Notification notification) {
    Bundle bundle = notification.extras; // extras bundle of the notification (API 19+)
    if (bundle.containsKey(Notification.EXTRA_PICTURE)) {
        // Log.d("Has picture", "notification");
        String encoded = "";
        try {
            Bitmap bmp = (Bitmap) bundle.get(Notification.EXTRA_PICTURE);
            ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
            bmp.compress(Bitmap.CompressFormat.JPEG, 100, byteArrayOutputStream);
            byte[] byteArray = byteArrayOutputStream.toByteArray();
            encoded = Base64.encodeToString(byteArray, Base64.DEFAULT);
            // base64String = encoded;
        } catch (Exception e) {
            Log.d("error", e.getMessage());
        }
        return encoded;
    } else {
        // Log.d("key", "has no key");
        return "";
    }
}
You may want to use a NotificationListenerService. Note that Notification.EXTRA_PICTURE and the Notification.extras bundle only exist on API 19 and later, which would explain the failures on older Android versions.
@Override
public void onNotificationPosted(StatusBarNotification statusBarNotification) {
    String packageName = statusBarNotification.getPackageName();
    Bundle extras = statusBarNotification.getNotification().extras;
    try {
        PackageManager manager = getPackageManager();
        Resources resources = manager.getResourcesForApplication(packageName);
        int iconId = statusBarNotification.getNotification().icon; // small-icon resource id
        Drawable icon = resources.getDrawable(iconId);
    } catch (Exception e) {
        e.printStackTrace();
    }
    if (extras.containsKey(Notification.EXTRA_PICTURE)) {
        // Here you may get the image
        Bitmap bmp = (Bitmap) extras.get(Notification.EXTRA_PICTURE);
    }
}
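For completeness, a minimal sketch of the service class that would host the callback above; the class name is hypothetical. The service also has to be declared in the manifest with the android.permission.BIND_NOTIFICATION_LISTENER_SERVICE permission, and the user has to grant notification access in the system settings before onNotificationPosted is ever called.
    import android.app.Notification;
    import android.graphics.Bitmap;
    import android.os.Bundle;
    import android.service.notification.NotificationListenerService;
    import android.service.notification.StatusBarNotification;

    // Hypothetical class name; put the onNotificationPosted() override shown above here.
    public class PictureNotificationListener extends NotificationListenerService {
        @Override
        public void onNotificationPosted(StatusBarNotification sbn) {
            Bundle extras = sbn.getNotification().extras;
            if (extras != null && extras.containsKey(Notification.EXTRA_PICTURE)) {
                Bitmap bmp = (Bitmap) extras.get(Notification.EXTRA_PICTURE);
                // hand bmp to getBase64()-style encoding or store it as needed
            }
        }
    }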
I am trying to send any text or image/audio file over Bluetooth using an RFCOMM server socket. I used the following code:
send.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View view) {
        checkBTPermissions();
        // byte[] bytes = etSend.getText().toString().getBytes(Charset.defaultCharset());
        // mBluetoothConnection.write(bytes);
        file_permission();
        // byte[] bytes = etSend.getText().toString().getBytes(Charset.defaultCharset());
        File myfile = new File("/sdcard/bluetooth/tom.txt");
        byte[] bytes = new byte[(int) myfile.length()];
        Log.d(TAG, "file length() =" + (int) myfile.length());
        try {
            FileInputStream fis = new FileInputStream(myfile);
            BufferedInputStream bis = new BufferedInputStream(fis, (int) myfile.length());
            //bis.read(bytes,0,bytes.length);
            Log.d(TAG, "fis created");
            // FileInputStream
            mBluetoothConnection.write(bytes);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
});
Here I am able to send the contents of my text file and even able to receive them.
Receiver code:
public void run() {
    byte[] buffer = new byte[1024]; // buffer store for the stream
    int bytes; // bytes returned from read()
    // Keep listening to the InputStream until an exception occurs
    while (true) {
        // Read from the InputStream
        try {
            bytes = mmInStream.read(buffer);
            String incomingMessage = new String(buffer, 0, bytes);
            Log.d(TAG, "InputStream: " + incomingMessage);
        } catch (IOException e) {
            Log.e(TAG, "write: Error reading Input Stream. " + e.getMessage());
            break;
        }
    }
}
My question is: how can I send my file as a whole to the receiver, then receive it and save it? Among various tutorials I only found how to send text over Bluetooth. Please help.
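One common pattern, sketched below rather than taken from any specific tutorial: prefix the payload with its length, have the receiver read exactly that many bytes, and then write them to a file. The OutputStream/InputStream here would be the ones from your connected Bluetooth socket; the method names and target path are illustrative, and everything used is from java.io.
    // Sender: write a 4-byte length header, then the file bytes.
    void sendFile(OutputStream out, File file) throws IOException {
        byte[] bytes = new byte[(int) file.length()];
        try (DataInputStream in = new DataInputStream(new FileInputStream(file))) {
            in.readFully(bytes);           // read the whole file into memory
        }
        DataOutputStream dout = new DataOutputStream(out);
        dout.writeInt(bytes.length);       // length header
        dout.write(bytes);                 // payload
        dout.flush();
    }

    // Receiver: read the header, then exactly that many bytes, then save them.
    void receiveFile(InputStream in, File target) throws IOException {
        DataInputStream din = new DataInputStream(in);
        int length = din.readInt();        // must match the sender's writeInt
        byte[] bytes = new byte[length];
        din.readFully(bytes);              // blocks until the whole payload has arrived
        try (FileOutputStream fos = new FileOutputStream(target)) {
            fos.write(bytes);
        }
    }
With this framing, the receiving run() loop would call something like receiveFile once per incoming file instead of treating each read() as a complete message.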
I would like to implement an image-processing AsyncTask in Android. There is a condition: if the previous AsyncTask is still processing locally, the current task should be processed on the server.
I tried 4 images and put a Thread.sleep(1000) inside the local-processing section, expecting the first one to be processed locally and the others on the server. However, they are all processed locally. What am I doing wrong?
private class ProcessImageTask extends AsyncTask<ImageItem, Void, ImageItem> {

    @Override
    protected ImageItem doInBackground(ImageItem... params) {
        if (localProcessing == false) {
            // **************processing locally*****************
            localProcessing = true;
            try {
                Bitmap bm = BitmapFactory.decodeFile(params[0].getBitmap());
                Bitmap croppedBitmap = getBitmap(getApplicationContext(), INPUT_SIZE, bm);
                final List<Classifier.Recognition> results = classifier.recognizeImage(croppedBitmap);
                String resultStr = results.toString();
                String trimResult = resultStr.substring(resultStr.indexOf("[") + 1, resultStr.indexOf("]")).trim();
                String localId = params[0].getId();
                trimResult = trimResult.substring(0, trimResult.indexOf(")")) + " likely)";
                Bitmap thumbnail = getBitmap(getApplicationContext(), 50, bm);
                ImageItem tmp = new ImageItem(localId, imgToString(thumbnail), trimResult);
                Thread.sleep(1000);
                localProcessing = false;
                return tmp;
            } catch (IOException e) {
                e.printStackTrace();
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        } else {
            // ****************processing on server*************************
            try {
                String ip = "192.168.1.3";
                int port = 8195;
                Bitmap bm = BitmapFactory.decodeFile(params[0].getBitmap());
                Bitmap croppedBitmap = getBitmap(getApplicationContext(), INPUT_SIZE, bm);
                String encodedImage = "/ID-BEGIN/" + ID + "/ID-END" + imgToString(croppedBitmap);
                try {
                    // **********Send request to server*********
                    Socket socket = new Socket(ip, port);
                    DataInputStream dis = new DataInputStream(socket.getInputStream());
                    DataOutputStream dout = new DataOutputStream(socket.getOutputStream());
                    byte[] messageToServer = encodedImage.getBytes();
                    dout.writeInt(messageToServer.length);
                    dout.write(messageToServer);
                    // Receive response from server
                    int length = dis.readInt();
                    if (length > 0) {
                        byte[] message = new byte[length];
                        dis.readFully(message, 0, message.length);
                        String response = new String(message);
                        // Handler updateHandler.post(new updateUIThread(response));
                        Bitmap thumbnail = getBitmap(getApplicationContext(), 50, bm);
                        ImageItem tmp = new ImageItem(params[0].getId(), imgToString(thumbnail), extractServerMessage(response) + "##");
                        return tmp;
                    }
                    socket.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
        return null;
    }

    @Override
    protected void onPostExecute(ImageItem imageItem) {
        super.onPostExecute(imageItem);
    }
}
and I execute it in a for loop:
ImageItem it = pit.executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR, tmp).get();
Do I need to set the core pool size? Thanks a lot.
Your call to AsyncTask.get() waits for the task to finish before returning, so you're not actually running these in parallel, despite using the THREAD_POOL_EXECUTOR. You shouldn't call get here, but instead rely on onPostExecute to communicate results back to your program.
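A minimal sketch of what that looks like; items and adapter below are placeholders for whatever collection and UI component you actually use.
    // Start each task without blocking; results come back via onPostExecute.
    for (ImageItem tmp : items) {
        new ProcessImageTask().executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR, tmp);
    }

    // Inside ProcessImageTask, deliver the result on the UI thread:
    @Override
    protected void onPostExecute(ImageItem imageItem) {
        if (imageItem != null) {
            adapter.add(imageItem);   // hypothetical UI update
        }
    }
Note that once the tasks really do run concurrently, the localProcessing flag is read and written from several background threads at once, so it should be volatile, or better an AtomicBoolean updated with compareAndSet; otherwise multiple tasks can still take the local branch.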
I am trying to convert a video from the SD card into a byte array. I wrote the code below, but I am doing something wrong and I have no idea what.
This is my source:
public byte[] readBytes(Uri uri) throws IOException {
    InputStream inputStream = getContentResolver().openInputStream(uri);
    ByteArrayOutputStream byteBuffer = new ByteArrayOutputStream();
    int bufferSize = 1024;
    byte[] buffer = new byte[bufferSize];
    int len = 0;
    while ((len = inputStream.read(buffer)) != -1) {
        byteBuffer.write(buffer, 0, len);
    }
    return byteBuffer.toByteArray();
}
upload_btn.setOnClickListener(new OnClickListener() {
    @Override
    public void onClick(View v) {
        String fileName = "video.mp4";
        String completePath = Environment.getExternalStorageDirectory()
                + "/" + fileName;
        File file = new File(completePath);
        Uri imageUri = Uri.fromFile(file);
        try {
            byte[] bytes = readBytes(imageUri);
            Log.e("Byte Array is", String.valueOf(bytes));
            Toast.makeText(getApplicationContext(),
                    String.valueOf(bytes), Toast.LENGTH_LONG).show();
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
});
I run my app and the result is a small string, not a byte array.
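For what it's worth: String.valueOf(bytes) on a byte[] only prints the array's default toString(), something like [B@4f2b3c1a, so a short string in the log does not mean the conversion failed; the array is most likely filled correctly. A quick way to check is to log the length, or a small slice via java.util.Arrays:
    byte[] bytes = readBytes(imageUri);
    Log.e("Byte array length", String.valueOf(bytes.length));
    Log.e("First bytes", Arrays.toString(Arrays.copyOf(bytes, Math.min(16, bytes.length))));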