Sending .jpg from Android to c++ using sockets - java

I am currently working on a project using a smartphone and a Raspberry Pi.
I have one problem: I cannot correctly send the .jpg from the phone.
I have successfully sent the size of the photo, which is the length of the byte array that is sent afterwards.
Can anyone help me?
Part of Client (Android Studio - Java)
Socket socket = new Socket("192.168.2.122", 1234);
File file = new File("/storage/emulated/0/poza/cam_image.jpg");
FileInputStream fis = new FileInputStream(file);
byte[] message = IOUtils.toByteArray(fis);
DataOutputStream dOut = new DataOutputStream(socket.getOutputStream());
String zxc = Integer.toString(message.length);
dOut.writeBytes(zxc);
dOut.write(message);
Part of Server (QT Creator - C++)
socket->waitForBytesWritten();
socket->waitForReadyRead(100);
char request[6];
socket->read(request, 6);
request[6] = NULL;
int bs = atoi(request); // bs is the length, which has the correct value
I have also tried to send the byte array in chunks, but I probably hadn't written it correctly as it didn't work.
Thanks in advance.
EDIT:
I have successfully managed to send it, thank you for all your help.
Part of Client(GOOD)
Socket socket = new Socket("192.168.2.122", 1234);
File file = new File("/storage/emulated/0/poza/cam_image.jpg");
FileInputStream fis = new FileInputStream(file);
byte[] message = IOUtils.toByteArray(fis);
DataOutputStream dOut = new DataOutputStream(socket.getOutputStream());
byte[] length = ByteBuffer.allocate(4).putInt(message.length).array(); // int to bytes (big endian)
System.out.println(message.length);
byte[] newLen = new byte[4]; // little endian, big endian stuff
for (int i = 0; i < 4; i++) {
    System.out.println(length[i]); // original bytes
    newLen[3 - i] = length[i];     // reversing order
}
dOut.write(newLen);  // sending the size of the image
dOut.flush();
dOut.write(message); // send image
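The manual byte-reversing loop above works, but ByteBuffer can produce little-endian output directly. A minimal sketch (the value 123456 is just an example) showing that order(ByteOrder.LITTLE_ENDIAN) yields the same bytes as reversing the big-endian array by hand:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class LenBytes {
    public static void main(String[] args) {
        int length = 123456;
        // Ask ByteBuffer for little-endian order instead of reversing by hand
        byte[] newLen = ByteBuffer.allocate(4)
                .order(ByteOrder.LITTLE_ENDIAN)
                .putInt(length)
                .array();
        // Compare against reversing a big-endian (default order) array
        byte[] big = ByteBuffer.allocate(4).putInt(length).array();
        boolean same = true;
        for (int i = 0; i < 4; i++) {
            if (newLen[3 - i] != big[i]) same = false;
        }
        System.out.println(same); // true
    }
}
```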
Part of Server (GOOD)
QTcpSocket *socket = server->nextPendingConnection();
socket->waitForConnected();
qDebug() << "connected";
char *sockData = new char[92160000]; // max byte size for a photo
int size = 0;  // photo size
int bytes = 0; // bytes read at a time
qDebug() << "waiting for bytes";
socket->waitForReadyRead();
socket->read((char*)&size, sizeof(int)); // reading the size of the photo
qDebug() << "the size is just " << size;
for (int i = 0; i < size; i += bytes) { // reading the rest of the bytes
    socket->waitForReadyRead();
    bytes = socket->read(sockData + i, size - i);
    if (bytes == -1) {
        printf("error");
        break;
    }
}
qDebug() << "success in reading the image";
std::vector<char> data(sockData, sockData + size);
if (data.size() == 0) {
    qDebug() << "errorrrrr";
    return;
}
Mat temporary = cv::imdecode(data, CV_LOAD_IMAGE_COLOR);
cv::imshow("sdfsd", temporary);
cv::waitKey(1000);
delete[] sockData; // memory deallocation (new[] must be paired with delete[])

After converting the image data to a byte array, why are you sending the array's length as a string? A string is variable-length, but you are not telling the receiver what the string length actually is, either by sending the string's length before the string itself or by placing a unique terminator after it. As such, the receiver has no reliable way to know where the length string ends and the image data begins. Your receiver is just blindly reading 6 bytes (why 6? That only works for images between 100,000 and 999,999 bytes in size), and is then writing a null terminator into the 7th byte (thus trashing memory, as only space for 6 bytes was allocated), before reading the actual image bytes.
You should be sending the image byte count as raw binary data, not as a variable-length string. Use dOut.writeInt(message.length) to send the array length as a 4-byte integer (in network byte order).
Socket socket = new Socket("192.168.2.122",1234);
File file = new File("/storage/emulated/0/poza/cam_image.jpg");
FileInputStream fis = new FileInputStream(file);
byte[] message = IOUtils.toByteArray(fis);
DataOutputStream dOut = new DataOutputStream(socket.getOutputStream());
dOut.writeInt(message.length);
dOut.write(message);
Then the receiver can read the first 4 bytes (swapping the bytes to host order if needed), and then interpret those bytes as an integer that specifies how many image bytes to read:
socket->waitForBytesWritten();
socket->waitForReadyRead(100);
int32_t bs; // <-- be sure to use a 4-byte data type
socket->read((char*)&bs, 4);
bs = ntohl(bs);
//bs is the image length to read

Related

Base64 Encoded to Decoded File Conversion Problem

I am processing very large files (> 2 GiB). Each input file is Base64 encoded, and I am outputting to new files after decoding. Depending on the buffer size (LARGE_BUF) and for a given input file, my input-to-output conversion either works fine, is missing one or more bytes, or throws an exception at the outputStream.write line (IllegalArgumentException: Last unit does not have enough bits). Here is the code snippet (I could not cut and paste, so it may not be perfect):
.
.
final int LARGE_BUF = 1024;
byte[] inBuf = new byte[LARGE_BUF];
try (InputStream inputStream = new FileInputStream(inFile);
     OutputStream outStream = new FileOutputStream(outFile)) {
    for (int len; (len = inputStream.read(inBuf)) > 0; ) {
        String out = new String(inBuf, 0, len);
        outStream.write(Base64.getMimeDecoder().decode(out.getBytes()));
    }
}
For instance, for my sample input file, if LARGE_BUF is 1024, output file is 4 bytes too small, if 2*1024, I get the exception mentioned above, if 7*1024, it works correctly. Grateful for any ideas. Thank you.
First, you are converting bytes into a String, then immediately back into bytes. So, remove the use of String entirely.
Second, base64 encoding turns each sequence of three bytes into four bytes, so when decoding, you need four bytes to properly decode three bytes of original data. It is not safe to create a new decoder for each arbitrarily read sequence of bytes, which may or may not have a length that is an exact multiple of four.
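A small self-contained demonstration of that 3-to-4 mapping (the string "hello world!" is just an example): 12 input bytes encode to 16 base64 bytes, and a prefix whose length is a multiple of four decodes cleanly:

```java
import java.util.Base64;

public class ChunkDemo {
    public static void main(String[] args) {
        byte[] encoded = Base64.getEncoder().encode("hello world!".getBytes());
        // 12 input bytes -> 16 encoded bytes, decodable in 4-byte groups
        System.out.println(encoded.length); // 16

        // Decoding an 8-byte prefix (a multiple of 4) recovers 6 bytes cleanly
        byte[] part = Base64.getDecoder().decode(new String(encoded, 0, 8));
        System.out.println(new String(part)); // "hello " (6 bytes)
    }
}
```

A chunk of, say, 1022 encoded bytes would leave a dangling 2-byte group behind, which is exactly the failure mode described in the question.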
Finally, Base64.Decoder has a wrap(InputStream) method which makes this considerably easier:
try (InputStream inputStream = Base64.getDecoder().wrap(
new BufferedInputStream(
Files.newInputStream(Paths.get(inFile))))) {
Files.copy(inputStream, Paths.get(outFile));
}

Java nio read chars and ints binary on the same line

I'm trying to get a grip on java.nio and got stuck reading from a binary file that was previously written as the single line "on on on748".
I use try-with-resources, so I'm sure the file and channel are OK.
I declared a ByteBuffer and allocated 12 bytes, that being the channel size.
Here the problem starts: on my byte array I can read the characters with a for-each loop and a char cast, but with a plain for loop I can't find any method to address the numbers.
I've tried a second buffer with .get(xx,8,2), but I don't know how to turn the resulting 2-byte array into an int.
try (FileInputStream file = new FileInputStream("data.dat");
     FileChannel channel = file.getChannel()) {
    ByteBuffer buffer = ByteBuffer.allocate(12);
    channel.read(buffer);
    byte[] xx = buffer.array();
    System.out.println(xx.length);
    for (byte z : xx) {
        System.out.println((char) z);
    }
    for (int i = 0; i < xx.length; i++) {
        if (i < 8)
            System.out.print((char) xx[i]);
        if (i >= 8)
            System.out.println((int) xx[i]);
    }
}
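Assuming the trailing number was originally written as a 2-byte short at offset 8 (which the .get(xx,8,2) attempt suggests), ByteBuffer's relative getters can address it directly, with no manual byte-to-int conversion. A minimal sketch that simulates the 12-byte file in memory:

```java
import java.nio.ByteBuffer;

public class ReadMixed {
    public static void main(String[] args) {
        // Simulate the file contents: 8 ASCII chars followed by a 2-byte
        // short (assumption: the number 748 was written with writeShort)
        ByteBuffer out = ByteBuffer.allocate(12);
        out.put("on on on".getBytes());
        out.putShort((short) 748);
        byte[] xx = out.array();

        // Reading back: wrap the array and use relative getters,
        // which advance the buffer position automatically
        ByteBuffer in = ByteBuffer.wrap(xx);
        byte[] text = new byte[8];
        in.get(text);                        // first 8 bytes are characters
        System.out.print(new String(text));  // "on on on"
        System.out.println(in.getShort());   // 748
    }
}
```

If the number was written as a 4-byte int instead, getInt() at the right position works the same way.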

How to read Serial data the same way in Processing as C#?

In C#, I use the SerialPort Read function as so:
byte[] buffer = new byte[100000];
int bytesRead = serial.Read(buffer, 0, 100000);
In Processing, I use readBytes as so:
byte[] buffer = new byte[100000];
int bytesRead = serial.readBytes(buffer);
In Processing, I'm getting the incorrect byte values when I loop over the buffer array from the readBytes function, but when I just use the regular read function I get the proper values, but I can't grab the data into a byte array. What am I doing wrong in the Processing version of the code that's leading me to get the wrong values in the buffer array?
I print out the data the same way in both versions:
for(int i=0; i<bytesRead; i++){
println(buffer[i]);
}
C# Correct Output:
Processing Incorrect Output:
Java bytes are signed, so any value over 127 wraps around to a negative number.
A quick solution is to apply
int anUnsignedByte = aSignedByte & 0xff;
to each of your bytes.
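A self-contained sketch of what that masking does (the value 200 is just an example):

```java
public class UnsignedDemo {
    public static void main(String[] args) {
        byte aSignedByte = (byte) 200;           // arrives as -56 in Java
        int anUnsignedByte = aSignedByte & 0xff; // masks it back to 0..255
        System.out.println(aSignedByte);         // -56
        System.out.println(anUnsignedByte);      // 200
    }
}
```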

Byte array lengths with Deflater and Inflater

How do we understand the defined length of the byte array?
For instance, in this example we define the length of the byte array to be 100.
What if the data that has to be written to the byte array were longer than 100 bytes?
The same goes for the result variable. I don't understand how these lengths work, or how to choose a proper byte array length when you don't know how big your data will be.
try {
    // Encode a String into bytes
    String inputString = "blahblahblah";
    byte[] input = inputString.getBytes("UTF-8");

    // Compress the bytes
    byte[] output = new byte[100];
    Deflater compresser = new Deflater();
    compresser.setInput(input);
    compresser.finish();
    int compressedDataLength = compresser.deflate(output);
    compresser.end();

    // Decompress the bytes
    Inflater decompresser = new Inflater();
    decompresser.setInput(output, 0, compressedDataLength);
    byte[] result = new byte[100];
    int resultLength = decompresser.inflate(result);
    decompresser.end();

    // Decode the bytes into a String
    String outputString = new String(result, 0, resultLength, "UTF-8");
} catch (java.io.UnsupportedEncodingException ex) {
    // handle
} catch (java.util.zip.DataFormatException ex) {
    // handle
}
And in this example, the byte array that is used here is actually called a buffer; how should we understand that?
Here, when you call compresser.deflate(output) you cannot know the size needed for output unless you know how this method works. But this is not a problem since output is meant as a buffer.
So you should call deflate multiple times and insert output in another object like an OutputStream, like this:
byte[] buffer = new byte[1024];
while (!deflater.finished()) {
    int count = deflater.deflate(buffer);
    outputStream.write(buffer, 0, count);
}
Same goes for inflating.
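Putting both loops together, a minimal round-trip sketch (class and variable names are my own, not from the question) that compresses into a growable stream and inflates with the same pattern, so neither side needs to guess a buffer size up front:

```java
import java.io.ByteArrayOutputStream;
import java.util.zip.DataFormatException;
import java.util.zip.Deflater;
import java.util.zip.Inflater;

public class RoundTrip {
    public static void main(String[] args) throws DataFormatException {
        byte[] input = "blahblahblah".repeat(100).getBytes();

        // Compress: drain the deflater 1 KiB at a time into a growable stream
        Deflater deflater = new Deflater();
        deflater.setInput(input);
        deflater.finish();
        ByteArrayOutputStream compressed = new ByteArrayOutputStream();
        byte[] buffer = new byte[1024];
        while (!deflater.finished()) {
            int count = deflater.deflate(buffer);
            compressed.write(buffer, 0, count);
        }
        deflater.end();

        // Decompress with the same pattern
        Inflater inflater = new Inflater();
        inflater.setInput(compressed.toByteArray());
        ByteArrayOutputStream restored = new ByteArrayOutputStream();
        while (!inflater.finished()) {
            int count = inflater.inflate(buffer);
            restored.write(buffer, 0, count);
        }
        inflater.end();

        System.out.println(restored.size() == input.length); // true
    }
}
```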
By allocating 100 bytes to the byte array, the JVM guarantees that a buffer large enough to hold 100 bytes (i.e. 8 bits each) is available to the caller. Any attempt to access the array beyond those 100 bytes results in an exception, e.g. an ArrayIndexOutOfBoundsException if you directly access array[100].
If the code is written as in your demo, the caller assumes the data length never exceeds 100.

Memory problems loading a file, plus converting into hex

I'm trying to make a file hexadecimal converter (input file -> output hex string of the file)
The code I came up with is
static String open2(String path) throws FileNotFoundException, IOException, OutOfMemoryError {
    System.out.println("BEGIN LOADING FILE");
    StringBuilder sb = new StringBuilder();
    //sb.ensureCapacity(2147483648);
    int size = 262144;
    FileInputStream f = new FileInputStream(path);
    FileChannel ch = f.getChannel();
    byte[] barray = new byte[size];
    ByteBuffer bb = ByteBuffer.wrap(barray);
    while (ch.read(bb) != -1) {
        //System.out.println(sb.capacity());
        sb.append(bytesToHex(barray));
        bb.clear();
    }
    System.out.println("FILE LOADED; BRING IT BACK");
    return sb.toString();
}
I am sure that "path" is a valid filename.
The problem is that with big files (>= 500 MB), the JVM throws an OutOfMemoryError: Java heap space at the StringBuilder.append call.
To create this code I followed some tips from http://nadeausoftware.com/articles/2008/02/java_tip_how_read_files_quickly but I hit a limit when I tried to force a capacity for the StringBuilder sb: 2147483648 is too big for an int.
If I want to use this code even with very big files (let's say up to 2 GB, if I really have to stop somewhere), what's the best way to output a hexadecimal string conversion of the file in terms of speed?
I'm now working on writing the converted string to a file. However, I'm having the problem of the partially empty buffer being written to the file after the EOF of the original one.
static String open3(String path) throws FileNotFoundException, IOException {
    System.out.println("BEGIN LOADING FILE (Hope this is the last change)");
    FileWriter fos = new FileWriter("HEXTMP");
    int size = 262144;
    FileInputStream f = new FileInputStream(path);
    FileChannel ch = f.getChannel();
    byte[] barray = new byte[size];
    ByteBuffer bb = ByteBuffer.wrap(barray);
    while (ch.read(bb) != -1) {
        fos.write(bytesToHex(barray)); // always converts the full buffer, even on a short read
        bb.clear();
    }
    System.out.println("FILE LOADED; BRING IT BACK");
    return "HEXTMP";
}
Obviously the HEXTMP file that gets created has a size that is a multiple of 256 KB, so if the source file is 257 KB, the output will be a 512 KB file with a LOT of "000000" at the end.
I know I just have to create a last byte array with the cut-down length.
(I used a FileWriter because I wanted to write the hex string; otherwise it would have just copied the file as-is.)
Why are you loading the complete file?
You can load a few bytes into a buffer from the input file, process the bytes in the buffer, then write the processed bytes to the output file. Continue until all bytes from the input file have been processed.
FileInputStream fis = new FileInputStream("in file");
FileOutputStream fos = new FileOutputStream("out");
byte[] buffer = new byte[8192];
while (true) {
    int count = fis.read(buffer);
    if (count == -1)
        break;
    byte[] processed = processBytesToConvert(buffer, count);
    fos.write(processed);
}
fis.close();
fos.close();
So just read a few bytes into the buffer, convert them to a hex string, get the bytes from the converted hex string, write those bytes back to the file, and continue with the next few input bytes.
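The processBytesToConvert helper is not defined in the answer; a minimal sketch of what it could look like, converting only the first count bytes so that a short final read does not append padding zeros (the exact failure described for HEXTMP above):

```java
public class HexChunk {
    // Hypothetical helper: convert only the first `count` bytes to hex
    static byte[] processBytesToConvert(byte[] buffer, int count) {
        StringBuilder sb = new StringBuilder(count * 2);
        for (int i = 0; i < count; i++) {
            sb.append(String.format("%02x", buffer[i]));
        }
        return sb.toString().getBytes();
    }

    public static void main(String[] args) {
        byte[] data = {0x0a, (byte) 0xff, 0x00};
        // Full buffer converted
        System.out.println(new String(processBytesToConvert(data, data.length))); // 0aff00
        // Short read: only the first 2 bytes are converted, no trailing zeros
        System.out.println(new String(processBytesToConvert(data, 2)));           // 0aff
    }
}
```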
The problem here is that you try to read the whole file and store it in memory.
You should use streams: read a chunk of your input file, convert it, and write it to the output file. That way your program scales, whatever the size of the input file.
The key is to read the file in chunks instead of reading all of it in one go. Depending on the use case you could vary the size of the chunk. For example, if you are building a hex viewer or editor, determine how much content is shown in the viewport and read only that much data from the file. If you are simply converting and dumping hex to another file, use any chunk size that is small enough to fit in memory but big enough for performance; this should be tunable over a few runs. Perhaps use file system NIO in Java 7 so that you can do all three tasks (reading, processing, and writing) concurrently. The link included in the question gives a good primer on reading files.
