I am trying to send a file (a PNG, to be specific) over sockets from a Python server to an Android client. I know that my Python server is sending the data; I just can't figure out how to receive it on the Android side. Here is what the code looks like to receive the file.
String path = Environment.getExternalStorageDirectory().toString() + "/tmp/test.png";
try {
    socket = new Socket("192.168.1.129", 29877);
    is = socket.getInputStream();
    out = new FileOutputStream(path);
    byte[] temp = new byte[1024];
    for (int c = is.read(temp, 0, 1024); c > 0; c = is.read(temp, 0, 1024)) {
        out.write(temp, 0, c);
        Log.d("debug tag", out.toString());
    }
    Log.d("debug tag", temp.toString());
    Bitmap myBitmap = BitmapFactory.decodeByteArray(temp, 0, temp.length);
    imageView.setImageBitmap(myBitmap);
Thanks for any advice.
You are reading from the socket in 1 KB chunks and saving them into a file, but then you try to interpret only the last chunk as a bitmap. This doesn't work.
Either read your image back from the file after you have saved it, or buffer it all in memory.
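Buffering it all in memory might look like the sketch below: a ByteArrayOutputStream accumulates every chunk, and only the complete array is handed to the decoder. The Android-specific calls are shown in comments, since they only run on-device.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamBuffer {
    // Read an InputStream to the end and return all bytes as one array.
    public static byte[] readFully(InputStream is) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        byte[] temp = new byte[1024];
        int c;
        while ((c = is.read(temp)) > 0) {
            bos.write(temp, 0, c); // accumulate every chunk, not just the last one
        }
        return bos.toByteArray();
    }
}
// On Android, the complete array can then be decoded:
//   byte[] data = StreamBuffer.readFully(socket.getInputStream());
//   Bitmap bmp = BitmapFactory.decodeByteArray(data, 0, data.length);
//   imageView.setImageBitmap(bmp);
```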
How to upload a large(>4mb) file as an AppendBlob using Azure Storage Blob client library for Java?
I've successfully implemented BlockBlob uploading with large files, and it seems that the library internally handles the 4 MB(?) per-request limitation and chunks the file into multiple requests.
Yet it seems that the library is not capable of doing the same for AppendBlob, so how can this chunking be done manually? Basically, I think this requires chunking an InputStream into smaller batches...
Using Azure Java SDK 12.14.1
Inspired by the following SO answer (on doing the same in C#):
c-sharp-azure-appendblob-appendblock-adding-a-file-larger-than-the-4mb-limit
... I ended up doing it like this in Java:
AppendBlobRequestConditions appendBlobRequestConditions = new AppendBlobRequestConditions()
        .setLeaseId("myLeaseId");

try (InputStream input = new BufferedInputStream(new FileInputStream(file))) {
    byte[] buf = new byte[AppendBlobClient.MAX_APPEND_BLOCK_BYTES];
    int bytesRead;
    while ((bytesRead = input.read(buf)) > 0) {
        // The last read is usually shorter than the block size; trim the buffer.
        if (bytesRead != buf.length) {
            buf = Arrays.copyOf(buf, bytesRead);
        }
        try (InputStream byteStream = new ByteArrayInputStream(buf)) {
            appendBlobClient.appendBlockWithResponse(byteStream, bytesRead,
                    null, appendBlobRequestConditions, null, null);
        }
    }
}
Of course, you need to do a bit of setup before this, like making sure the AppendBlob exists, and creating it before trying to append any data if it doesn't.
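The chunking logic itself can be exercised without Azure at all. Below is a minimal sketch of the same loop in isolation; the SDK's MAX_APPEND_BLOCK_BYTES constant is replaced by an arbitrary block-size parameter so the behavior is easy to verify.

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class StreamChunker {
    // Split an InputStream into blocks of at most maxBlockBytes each.
    public static List<byte[]> chunk(InputStream in, int maxBlockBytes) throws IOException {
        List<byte[]> blocks = new ArrayList<>();
        byte[] buf = new byte[maxBlockBytes];
        int bytesRead;
        while ((bytesRead = in.read(buf)) > 0) {
            // Copy trims the final, partially filled buffer to its actual length.
            blocks.add(Arrays.copyOf(buf, bytesRead));
        }
        return blocks;
    }
}
```

In the real upload, each returned block would become one appendBlockWithResponse call.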
I am currently working on a project using a smartphone and a Raspberry Pi.
I have one problem: I cannot correctly send the .jpg from the phone.
I have successfully sent the size of the photo, which is the length of the byte array that should be sent afterwards.
Can anyone help me?
Part of Client (Android Studio - Java)
Socket socket = new Socket("192.168.2.122",1234);
File file = new File("/storage/emulated/0/poza/cam_image.jpg");
FileInputStream fis = new FileInputStream(file);
InputStream is=fis;
byte[] message=IOUtils.toByteArray(is);
DataOutputStream dOut = new DataOutputStream(socket.getOutputStream());
String zxc;
zxc = Integer.toString(message.length);
dOut.writeBytes(zxc);
dOut.write(message);
Part of Server (QT Creator - C++)
socket->waitForBytesWritten();
socket->waitForReadyRead(100);
char request[6];
socket->read(request,6);
request[6]=NULL;
int bs;
bs=atoi(request); //bs is the length which has the correct value
I have also tried to send the byte array in chunks, but I probably hadn't written it correctly as it didn't work.
Thanks in advance.
EDIT:
I have successfully managed to send it, thank you for all your help.
Part of Client(GOOD)
Socket socket = new Socket("192.168.2.122",1234);
File file = new File("/storage/emulated/0/poza/cam_image.jpg");
FileInputStream fis = new FileInputStream(file);
InputStream is=fis;
byte[] message=IOUtils.toByteArray(is);
DataOutputStream dOut = new DataOutputStream(socket.getOutputStream());
int leng = message.length;
byte [] length = ByteBuffer.allocate(4).putInt(message.length).array(); //int to bytes
System.out.println(message.length);
byte [] newLen = new byte[4]; // reverse to little-endian, since the server reads the int raw
for (int i=0; i<4; i++)
{
System.out.println(length[i]); //original bytes
newLen[3-i]=length[i]; //reversing order
}
dOut.write(newLen); //sending the size of image
dOut.flush();
dOut.write(message);//send image
Part of Server (GOOD)
QTcpSocket *socket = server->nextPendingConnection();
socket->waitForConnected();
qDebug()<<"connected";
char *sockData = new char[92160000]; //max bytes size for photo
int size = 0; //photo size
int bytes = 0; //bytes read at a time
qDebug()<<"waiting for bytes";
socket->waitForReadyRead();
socket->read((char*)&size,sizeof(int)); //reading the size of the photo
qDebug()<<"the size is just " <<size;
for(int i=0;i<size;i+=bytes){ //reading the rest of bytes
socket->waitForReadyRead();
bytes = socket->read(sockData+i,size-i);
if(bytes==-1){
printf("error");
break;
}
}
qDebug()<<"success in reading the image";
std::vector<char> data(sockData,sockData+size);
if(data.size()==0){
qDebug()<<"errorrrrr";
return;
}
Mat temporary = cv::imdecode(data,CV_LOAD_IMAGE_COLOR);
cv::imshow("sdfsd",temporary);
cv::waitKey(1000);
delete[] sockData; // memory deallocation (array form of delete)
After converting the image data to a byte array, why are you sending the array's length as a string? A string is variable-length, but you are not telling the receiver what the string length actually is, either by sending the actual string length before sending the string, or by placing a unique terminator after the string. As such, the receiver has no way to know reliably where the length string actually ends and the image data begins. Your receiver is just blindly reading 6 bytes (why 6? That will only work for images that are between 100000-999999 bytes in size), but is then replacing the 7th byte with a null terminator (thus trashing memory, as only space for 6 bytes was allocated), before then reading the actual image bytes.
You should be sending the image byte count as raw binary data, not as a variable-length string. Use dOut.writeInt(message.length) to send the array length as a 4-byte integer (in network byte order).
Socket socket = new Socket("192.168.2.122",1234);
File file = new File("/storage/emulated/0/poza/cam_image.jpg");
FileInputStream fis = new FileInputStream(file);
byte[] message = IOUtils.toByteArray(fis);
DataOutputStream dOut = new DataOutputStream(socket.getOutputStream());
dOut.writeInt(message.length);
dOut.write(message);
Then the receiver can read the first 4 bytes (swapping the bytes to host order if needed), and then interpret those bytes as an integer that specifies how many image bytes to read:
socket->waitForBytesWritten();
socket->waitForReadyRead(100);
int32_t bs; // <-- be sure to use a 4-byte data type
socket->read((char*)&bs, 4);
bs = ntohl(bs);
//bs is the image length to read
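If both ends were Java, the same length-prefixed framing could be verified in isolation with in-memory streams. This sketch assumes nothing beyond the standard library: DataOutputStream.writeInt always writes big-endian (network order), and DataInputStream.readFully loops internally until every payload byte has arrived, which is exactly what the manual for-loop on the Qt side is doing.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class Framing {
    // Sender side: 4-byte big-endian length prefix, then the payload.
    public static byte[] frame(byte[] message) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        DataOutputStream dOut = new DataOutputStream(bos);
        dOut.writeInt(message.length); // network byte order
        dOut.write(message);
        return bos.toByteArray();
    }

    // Receiver side: read the length, then exactly that many payload bytes.
    public static byte[] unframe(byte[] wire) throws IOException {
        DataInputStream dIn = new DataInputStream(new ByteArrayInputStream(wire));
        int len = dIn.readInt();
        byte[] message = new byte[len];
        dIn.readFully(message); // blocks until all len bytes have been read
        return message;
    }
}
```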
I have the following OutputStream, which saves a SQL BLOB it receives to a network drive. The boolean true opens the file in append mode, so files that are already there don't get overwritten.
So if there are 500 BLOBs in the buffer but 200 of them were already saved to the network drive before, only 300 new files will be added.
Question: how can I count the new files?
I want to write them to a logfile.
Thanks in advance!
OutputStream out = new FileOutputStream(path + "\\" + xy +".jpg", true);
byte[] buff = blob.getBytes(1, (int) blob.length());
out.write(buff);
out.close();
How can I receive an image file through REST APIs? There is the MULTIPART_FORM_DATA option, which looks like it will send files in parts, as in more than one request.
I want to receive images very fast on the server, around 2 images per second.
Simply read the image into a File and use the Response class to build the response.
Response.ok(new File("myimage.jpg"), "image/jpeg").build();
There are other variations of the same.
Read the image using the following:
URL url = new URL("http://localhost:8080/myimage/1");
URLConnection connection = url.openConnection();
InputStream input = connection.getInputStream();
byte[] buffer = new byte[1024];
int n = -1;
OutputStream fos = new FileOutputStream("Output.jpg");
while ((n = input.read(buffer)) != -1) {
    fos.write(buffer, 0, n);
}
fos.close();
You can use the Apache HTTP client to make it prettier.
My task is:
Clients connect to a ServerSocket and send files in any encoding they want (UTF-8, ISO-8859-5, CP1251, etc.).
When the server receives the file content, the script must insert it into MySQL.
As the encoding can differ, I need to save the file content as a byte array(?).
But I don't know how to get a byte array from Socket.getInputStream().
Please help with this.
Thanks in advance!
ByteArrayOutputStream bos = new ByteArrayOutputStream();
byte[] tmp = new byte[4096];
int ret = 0;
while ((ret = inputStream.read(tmp)) > 0) {
    bos.write(tmp, 0, ret);
}
byte[] myArray = bos.toByteArray();
Commons IO - http://commons.apache.org/io/
toByteArray(Reader input, String encoding) — gets the contents of a Reader as a byte[] using the specified character encoding; toByteArray(InputStream input) does the same for raw bytes.
http://commons.apache.org/io/api-release/org/apache/commons/io/IOUtils.html
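On Java 9 and later, the standard library can do this without any dependency: InputStream.readAllBytes() returns the raw stream content as a byte[], untouched by any character encoding (a sketch; decoding to text happens later, once the charset is known).

```java
import java.io.IOException;
import java.io.InputStream;

public class RawBytes {
    // Raw bytes are encoding-agnostic; decode only once the charset is known,
    // e.g. new String(myArray, "ISO-8859-5").
    public static byte[] slurp(InputStream in) throws IOException {
        return in.readAllBytes(); // Java 9+
    }
}
```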