Java nio read chars and ints binary on the same line

I'm trying to get a grip on java.nio and got stuck reading from a binary file that was previously written as the single line "on on on748".
I use try-with-resources, so I'm sure the file and channel are fine.
I declared a ByteBuffer and allocated 12 bytes, that being the channel size.
Here the problem starts: on my byte array I can read the characters with a for-each loop and a char cast, but with a plain for loop I can't seem to find any method to address the numbers.
I've tried a second buffer with .get(xx, 8, 2), but I don't know how to turn that 2-element byte[] into an int.
try (FileInputStream file = new FileInputStream("data.dat");
     FileChannel channel = file.getChannel()) {
    ByteBuffer buffer = ByteBuffer.allocate(12);
    channel.read(buffer);
    byte[] xx = buffer.array();
    System.out.println(xx.length);
    for (byte z : xx) {
        System.out.println((char) z);
    }
    for (int i = 0; i < xx.length; i++) {
        if (i < 8)
            System.out.print((char) xx[i]);
        if (i >= 8)
            System.out.println((int) xx[i]);
    }
}
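For reference, a sketch of one way to pull the trailing number out of such a buffer. Since "on on on748" suggests the digits were written as text, the tail can be decoded as a String and parsed; the offsets and the simulated buffer contents below are assumptions based on the question.

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class ReadMixed {
    public static void main(String[] args) {
        // Simulated buffer contents; real code would fill this from the channel.
        byte[] xx = "on on on748".getBytes(StandardCharsets.US_ASCII);

        // If the digits were written as text (as "on on on748" suggests),
        // decode the tail as a String and parse it:
        String tail = new String(xx, 8, xx.length - 8, StandardCharsets.US_ASCII);
        int asText = Integer.parseInt(tail.trim());
        System.out.println(asText); // prints 748

        // If instead a binary short had been written at offset 8,
        // ByteBuffer could read it directly:
        ByteBuffer bb = ByteBuffer.wrap(xx);
        // short asBinary = bb.getShort(8); // only valid for binary-written data
    }
}
```

The key point is that printing (int)xx[i] gives ASCII codes (55, 52, 56 for '7', '4', '8'), not the number itself; the bytes have to be either parsed as text or read back with the same binary layout they were written with.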

Related

How to read Serial data the same way in Processing as C#?

In C#, I use the SerialPort Read function as so:
byte[] buffer = new byte[100000];
int bytesRead = serial.Read(buffer, 0, 100000);
In Processing, I use readBytes as so:
byte[] buffer = new byte[100000];
int bytesRead = serial.readBytes(buffer);
In Processing, I'm getting incorrect byte values when I loop over the buffer array filled by readBytes. When I use the plain read function instead, I get the proper values, but then I can't capture the data into a byte array. What am I doing wrong in the Processing version of the code that leads to the wrong values in the buffer array?
I print out the data the same way in both versions:
for (int i = 0; i < bytesRead; i++) {
    println(buffer[i]);
}
C# Correct Output:
Processing Incorrect Output:
Java bytes are signed, so any value over 127 will overflow into the negative range.
A quick solution is to do
int anUnsignedByte = (int) aSignedByte & 0xff;
to each of your bytes.
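A minimal illustration of the wrap-around and the masking fix (the value 200 is an arbitrary example):

```java
public class SignedByteDemo {
    public static void main(String[] args) {
        byte aSignedByte = (byte) 200;       // 200 does not fit in a signed byte
        System.out.println(aSignedByte);     // prints -56: the value wrapped around

        int anUnsignedByte = aSignedByte & 0xff; // mask off the sign extension
        System.out.println(anUnsignedByte);  // prints 200: original value recovered
    }
}
```

The & 0xff works because widening the byte to an int sign-extends it; masking keeps only the low 8 bits.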

Sending .jpg from Android to c++ using sockets

I am currently working on a project using a smartphone and a Raspberry Pi.
I have one problem: I cannot correctly send the .jpg from the phone.
I have successfully sent the size of the photo, which is the length of the byte array that should be sent afterwards.
Can anyone help me?
Part of Client (Android Studio - Java)
Socket socket = new Socket("192.168.2.122",1234);
File file = new File("/storage/emulated/0/poza/cam_image.jpg");
FileInputStream fis = new FileInputStream(file);
InputStream is=fis;
byte[] message=IOUtils.toByteArray(is);
DataOutputStream dOut = new DataOutputStream(socket.getOutputStream());
String zxc;
zxc = Integer.toString(message.length);
dOut.writeBytes(zxc);
dOut.write(message);
Part of Server (QT Creator - C++)
socket->waitForBytesWriten();
socket->waitForReadyRead(100);
char request[6];
socket->read(request,6);
request[6]=NULL;
int bs;
bs=atoi(request); //bs is the length which has the correct value
I have also tried to send the byte array in chunks, but I probably hadn't written it correctly as it didn't work.
Thanks in advance.
EDIT:
I have successfully managed to send it, thank you for all your help.
Part of Client(GOOD)
Socket socket = new Socket("192.168.2.122",1234);
File file = new File("/storage/emulated/0/poza/cam_image.jpg");
FileInputStream fis = new FileInputStream(file);
InputStream is=fis;
byte[] message=IOUtils.toByteArray(is);
DataOutputStream dOut = new DataOutputStream(socket.getOutputStream());
int leng = message.length;
byte [] length = ByteBuffer.allocate(4).putInt(message.length).array(); //int to bytes
System.out.println(message.length);
byte [] newLen = new byte[4]; //little endian, big endian stuff
for (int i = 0; i < 4; i++)
{
    System.out.println(length[i]); // original bytes
    newLen[3 - i] = length[i];     // reversing order
}
dOut.write(newLen); //sending the size of image
dOut.flush();
dOut.write(message);//send image
Part of Server (GOOD)
QTcpSocket *socket = server->nextPendingConnection();
socket->waitForConnected();
qDebug()<<"connected";
char *sockData = new char[92160000]; //max bytes size for photo
int size = 0; //photo size
int bytes = 0; //bytes read at a time
qDebug()<<"waiting for bytes";
socket->waitForReadyRead();
socket->read((char*)&size,sizeof(int)); //reading the size of the photo
qDebug()<<"the size is just " <<size;
for (int i = 0; i < size; i += bytes) { // reading the rest of the bytes
    socket->waitForReadyRead();
    bytes = socket->read(sockData + i, size - i);
    if (bytes == -1) {
        printf("error");
        break;
    }
}
qDebug()<<"success in reading the image";
std::vector<char> data(sockData,sockData+size);
if(data.size()==0){
qDebug()<<"errorrrrr";
return;
}
Mat temporary = cv::imdecode(data,CV_LOAD_IMAGE_COLOR);
cv::imshow("sdfsd",temporary);
cv::waitKey(1000);
delete[] sockData; // memory deallocation (array delete to match new[])
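As an aside, the manual byte-reversal loop in the client above can be replaced by asking ByteBuffer for little-endian output directly (ByteBuffer defaults to big-endian); a sketch, with an arbitrary example length:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.Arrays;

public class LengthBytes {
    public static void main(String[] args) {
        int len = 123456; // example image length

        // Manual approach from the question: big-endian bytes, then reversed.
        byte[] big = ByteBuffer.allocate(4).putInt(len).array();
        byte[] reversed = new byte[4];
        for (int i = 0; i < 4; i++) {
            reversed[3 - i] = big[i];
        }

        // Equivalent: set the byte order on the buffer instead.
        byte[] little = ByteBuffer.allocate(4)
                                  .order(ByteOrder.LITTLE_ENDIAN)
                                  .putInt(len)
                                  .array();

        System.out.println(Arrays.equals(reversed, little)); // prints true
    }
}
```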
After converting the image data to a byte array, why are you sending the array's length as a string? A string is variable-length, but you are not telling the receiver what the string length actually is, either by sending the actual string length before sending the string, or by placing a unique terminator after the string. As such, the receiver has no way to know reliably where the length string actually ends and the image data begins. Your receiver is just blindly reading 6 bytes (why 6? That will only work for images that are between 100000-999999 bytes in size), but is then replacing the 7th byte with a null terminator (thus trashing memory, as only space for 6 bytes was allocated), before then reading the actual image bytes.
You should be sending the image byte count as raw binary data, not as a variable-length string. Use dOut.writeInt(message.length) to send the array length as a 4-byte integer (in network byte order).
Socket socket = new Socket("192.168.2.122",1234);
File file = new File("/storage/emulated/0/poza/cam_image.jpg");
FileInputStream fis = new FileInputStream(file);
byte[] message = IOUtils.toByteArray(fis);
DataOutputStream dOut = new DataOutputStream(socket.getOutputStream());
dOut.writeInt(message.length);
dOut.write(message);
Then the receiver can read the first 4 bytes (swapping the bytes to host order if needed), and then interpret those bytes as an integer that specifies how many image bytes to read:
socket->waitForBytesWritten();
socket->waitForReadyRead(100);
int32_t bs; // <-- be sure to use a 4-byte data type
socket->read((char*)&bs, 4);
bs = ntohl(bs);
// bs is the image length to read

Base64 encode file by chunks

I want to split a file into multiple chunks (in this case, trying lengths of 300) and Base64-encode it, since loading the entire file into memory throws a NegativeArraySizeException when encoding it. I tried using the following code:
int offset = 0;
bis = new BufferedInputStream(new FileInputStream(f));
while (offset + 300 <= f.length()) {
    byte[] temp = new byte[300];
    bis.skip(offset);
    bis.read(temp, 0, 300);
    offset += 300;
    System.out.println(Base64.encode(temp));
}
if (offset < f.length()) {
    byte[] temp = new byte[(int) f.length() - offset];
    bis.skip(offset);
    bis.read(temp, 0, temp.length);
    System.out.println(Base64.encode(temp));
}
At first it appears to be working, however, at one point it switches to just printing out "AAAAAAAAA" and fills up the entire console with it, and the new file is corrupted when decoded. What could be causing this error?
skip() "Skips over and discards n bytes of data from the input stream", and read() returns "the number of bytes read".
So you read some bytes, skip some bytes, read some more, skip again, eventually reaching EOF, at which point read() returns -1. But you ignore that and use the content of temp, which contains all zeros, which are then encoded to all A's.
Your code should be:
try (InputStream in = new BufferedInputStream(new FileInputStream(f))) {
    int len;
    byte[] temp = new byte[300];
    while ((len = in.read(temp)) > 0)
        System.out.println(Base64.encode(temp, 0, len));
}
This code reuses the single buffer allocated before the loop, so it will also cause much less garbage collection than your code.
If Base64.encode doesn't have a 3 parameter version, do this:
try (InputStream in = new BufferedInputStream(new FileInputStream(f))) {
    int len;
    byte[] temp = new byte[300];
    while ((len = in.read(temp)) > 0) {
        byte[] data;
        if (len == temp.length)
            data = temp;
        else {
            data = new byte[len];
            System.arraycopy(temp, 0, data, 0, len);
        }
        System.out.println(Base64.encode(data));
    }
}
Be sure to use a buffer size that is a multiple of 3 for encoding and a multiple of 4 for decoding when working with chunks of data.
300 fulfills both, so that is already fine here. This is just a note for those trying different buffer sizes.
Keep in mind that reading from a stream into a buffer can, in some circumstances, leave the buffer not fully filled even though the end of the stream has not yet been reached. This can happen, for example, when reading from a network stream and a timeout occurs.
You can handle that, but taking it into account would lead to much more complex code that would no longer be educational.
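For what it's worth, the standard java.util.Base64 API (Java 8+) can do the chunking for you: wrapping an OutputStream with the encoder buffers any leftover bytes between writes, so arbitrary chunk sizes work. A sketch:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.util.Base64;

public class ChunkedBase64 {
    public static String encodeInChunks(byte[] input, int chunkSize) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        // wrap() returns an OutputStream that Base64-encodes everything
        // written to it, regardless of the write sizes used by the caller.
        try (OutputStream b64 = Base64.getEncoder().wrap(sink)) {
            for (int off = 0; off < input.length; off += chunkSize) {
                int len = Math.min(chunkSize, input.length - off);
                b64.write(input, off, len);
            }
        }
        return sink.toString("US-ASCII");
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[1000];
        for (int i = 0; i < data.length; i++) data[i] = (byte) i;
        // Chunked encoding matches one-shot encoding, even for a chunk size
        // that is not a multiple of 3, because the stream buffers leftovers.
        String chunked = encodeInChunks(data, 7);
        String oneShot = Base64.getEncoder().encodeToString(data);
        System.out.println(chunked.equals(oneShot)); // prints true
    }
}
```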

Memory problems loading a file, plus converting into hex

I'm trying to make a file hexadecimal converter (input file -> output hex string of the file)
The code I came up with is
static String open2(String path) throws FileNotFoundException, IOException, OutOfMemoryError {
    System.out.println("BEGIN LOADING FILE");
    StringBuilder sb = new StringBuilder();
    //sb.ensureCapacity(2147483648);
    int size = 262144;
    FileInputStream f = new FileInputStream(path);
    FileChannel ch = f.getChannel();
    byte[] barray = new byte[size];
    ByteBuffer bb = ByteBuffer.wrap(barray);
    while (ch.read(bb) != -1) {
        //System.out.println(sb.capacity());
        sb.append(bytesToHex(barray));
        bb.clear();
    }
    System.out.println("FILE LOADED; BRING IT BACK");
    return sb.toString();
}
I am sure that "path" is a valid filename.
The problem is that with big files (>= 500 MB), the JVM throws an OutOfMemoryError: Java heap space on the StringBuilder.append.
To create this code I followed some tips from http://nadeausoftware.com/articles/2008/02/java_tip_how_read_files_quickly but I hit a snag when I tried to force a capacity for the StringBuilder sb: "2147483648 is too big for an int".
If I want to use this code even with very big files (let's say up to 2 GB, if I really have to stop somewhere), what's the best way to output a hexadecimal string conversion of the file in terms of speed?
I'm now working on copying the converted string into a file. However, I'm having a problem with "writing the empty buffer to the file" after the EOF of the original one.
static String open3(String path) throws FileNotFoundException, IOException {
    System.out.println("BEGIN LOADING FILE (Hope this is the last change)");
    FileWriter fos = new FileWriter("HEXTMP");
    int size = 262144;
    FileInputStream f = new FileInputStream(path);
    FileChannel ch = f.getChannel();
    byte[] barray = new byte[size];
    ByteBuffer bb = ByteBuffer.wrap(barray);
    while (ch.read(bb) != -1) {
        fos.write(bytesToHex(barray));
        bb.clear();
    }
    System.out.println("FILE LOADED; BRING IT BACK");
    return "HEXTMP";
}
Obviously the file HEXTMP created covers a size that is a multiple of 256 KB, but if the input file is 257 KB it will be treated as a 512 KB file, with LOTS of "000000" at the end.
I know I just have to create a final byte array with a truncated length.
(I used a FileWriter because I wanted to write the hex string; otherwise it would have just copied the file as-is.)
Why are you loading the complete file?
You can load a few bytes into a buffer from the input file, process the bytes in the buffer, then write the processed bytes to the output file. Continue until all bytes from the input file have been processed.
FileInputStream fis = new FileInputStream("in file");
FileOutputStream fos = new FileOutputStream("out");
byte[] buffer = new byte[8192];
while (true) {
    int count = fis.read(buffer);
    if (count == -1)
        break;
    byte[] processed = processBytesToConvert(buffer, count);
    fos.write(processed);
}
fis.close();
fos.close();
So just read a few bytes into the buffer, convert them to a hex string, get the bytes of the converted hex string, write those bytes back to the file, and continue with the next few input bytes.
The problem here is that you try to read the whole file and store it in memory.
You should use streams: read a few chunks of your input file, convert them, and write them to the output file. That way your program can scale, whatever the size of the input file is.
The key would be to read file in chunks instead of reading all of it in one go. Depending on its use you could vary size of the chunk. For example, if you are trying to make a hex viewer / editor determine how much content is being shown in the viewport and read only as much of data from file. Or if you are simply converting and dumping hex to another file use any chunk size that is small enough to fit in memory but big enough for performance. This should be tunable over some runs. Perhaps use filesystem NIO in Java 7 so that you can do all three tasks - reading, processing and writing - concurrently. The link included in question gives good primer on reading files.
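A minimal sketch of the chunked approach described above. The bytesToHex helper here is a plain implementation standing in for whatever the question's version does; note that it takes the count of bytes actually read, which avoids the trailing-zeros problem from the question.

```java
import java.io.*;

public class HexDump {
    // Stand-in for the question's bytesToHex: only the first `len` bytes.
    static String bytesToHex(byte[] bytes, int len) {
        StringBuilder sb = new StringBuilder(len * 2);
        for (int i = 0; i < len; i++) {
            sb.append(String.format("%02x", bytes[i]));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        try (InputStream in = new BufferedInputStream(new FileInputStream(args[0]));
             Writer out = new BufferedWriter(new FileWriter(args[1]))) {
            byte[] buffer = new byte[262144];
            int count;
            while ((count = in.read(buffer)) != -1) {
                // Passing `count` means a final partial buffer is converted
                // exactly, with no padding zeros.
                out.write(bytesToHex(buffer, count));
            }
        }
    }
}
```

Memory use stays bounded by the 256 KB buffer regardless of the input file size, so this works for files well past 2 GB.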

Java Read File Larger than 2 GB (Using Chunking)

I'm implementing a file transfer server, and I've run into an issue with sending a file larger than 2 GB over the network. The issue starts when I get the File I want to work with and try to read its contents into a byte[]. I have a for loop :
for (long i = 0; i < fileToSend.length(); i += PACKET_SIZE) {
    fileBytes = getBytesFromFile(fileToSend, i);
where getBytesFromFile() reads a PACKET_SIZE amount of bytes from fileToSend which is then sent to the client in the for loop. getBytesFromFile() uses i as an offset; however, the offset variable in FileInputStream.read() has to be an int. I'm sure there is a better way to read this file into the array, I just haven't found it yet.
I would prefer to not use NIO yet, although I will switch to using that in the future. Indulge my madness :-)
It doesn't look like you're reading data from the file properly. When reading data from a stream in Java, it's standard practice to read data into a buffer. The size of the buffer can be your packet size.
File fileToSend = //...
InputStream in = new FileInputStream(fileToSend);
OutputStream out = //...
byte[] buffer = new byte[PACKET_SIZE];
int read;
while ((read = in.read(buffer)) != -1) {
    out.write(buffer, 0, read);
}
in.close();
out.close();
Note that the size of the buffer array remains constant. But if the buffer cannot be filled (as when the end of the file is reached), the remaining elements of the array will still hold data from the previous packet, so you must ignore those elements (this is what passing read to out.write() in my sample does).
Also, realize that your handling of the variable i is not correct:
Iteration 0: i=0
Iteration 1: i=PACKET_SIZE
...
...
Iteration n: i=PACKET_SIZE*n
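If random access at a long offset is genuinely needed without NIO, RandomAccessFile.seek() takes a long (unlike the int-only offset parameter of FileInputStream.read, which is an offset into the array, not the file). A sketch; the packet size and test file contents are made up for the example:

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.RandomAccessFile;

public class ChunkReader {
    static final int PACKET_SIZE = 4;

    // Reads up to PACKET_SIZE bytes starting at `offset` (a long, so files
    // larger than 2 GB work); returns only the bytes actually read.
    static byte[] getBytesFromFile(File f, long offset) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile(f, "r")) {
            raf.seek(offset); // seek() accepts a long offset
            byte[] packet = new byte[PACKET_SIZE];
            int n = raf.read(packet);
            if (n < PACKET_SIZE) { // short read near EOF: trim the array
                byte[] trimmed = new byte[Math.max(n, 0)];
                System.arraycopy(packet, 0, trimmed, 0, trimmed.length);
                return trimmed;
            }
            return packet;
        }
    }

    public static void main(String[] args) throws IOException {
        File f = File.createTempFile("chunk", ".bin");
        f.deleteOnExit();
        try (FileOutputStream out = new FileOutputStream(f)) {
            out.write(new byte[] {1, 2, 3, 4, 5, 6});
        }
        byte[] second = getBytesFromFile(f, 4); // bytes at offsets 4..5
        System.out.println(second.length); // prints 2
    }
}
```

For a straight file-to-socket copy, though, the sequential buffer loop in the answer above remains the simpler and faster approach.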
