I am using Android Studio/Java 1.8 (client) and VS2010/.NET 4.0 (server).
This Java code is constantly sending invalid data:
long FileSize = 1131666;
byte [] fs = ByteBuffer.allocate(8).order(ByteOrder.LITTLE_ENDIAN).putLong(FileSize).array();
ByteArrayOutputStream baos = new ByteArrayOutputStream();
baos.write("FILE".getBytes());
baos.write(fs); //HERE
baos.write(GetHash().getHexStdstring().getBytes());
baos.write(dest_filename.getBytes());
OutputStream out = socket.getOutputStream();
out.write(baos.toByteArray());
out.flush();
This is what I get (viewed with wireshark):
46:49:4c:45: ("FILE")
3f:44:11:00:00:00:00:00: (1131583)
32:37:64:35:36:61:32:34:32:36:31:30:37:36:37:32:30:65:34:38:66:37:34:65:36:61:64:38:34:65:36:30:65:64:33:63:66:64:34:36:32:64:36:62:37:65:62:64:62:32:63:63:62:37:37:64:36:38:37:66:64:64:66:39:
5c:45:75:72:6f:70:65:2e:70:6e:67
And thus my C++ server app receives the value as 1131583, but no matter what value I use as FileSize, I always seem to get a 3f in there somewhere...
Another instance is when the FileSize is 22451663, I get 3f:3f:56:01:00:00:00:00 (or 22429503).
Any thoughts?
Weird thing is, if I translate fs back to a number or a string and Toast the value just before it's sent, it says it's correct.
46:49:4c:45: ("FILE")
00:00:00:00:00:11:44:3f: (1131583)
Cannot reproduce. Also, 00:00:00:00:00:11:44:3f: is not a little-endian representation of 1131583; it is a big-endian representation of 1131583. Read as a little-endian number it is a rather large value: 0x3F44110000000000, about 4.6 x 10^18.
What this code actually produces on the wire, as printed by
ByteArrayOutputStream baos = new ByteArrayOutputStream()
{
public void write(byte[] bytes) throws IOException
{
for (byte b : bytes)
{
System.out.print(Integer.toHexString(b & 0xff)+":");
}
System.out.println();
super.write(bytes);
}
};
is:
46:49:4c:45:
92:44:11:0:0:0:0:0:
// etc.
There is no problem with this code.
So, the ISO-8859-1 character for 0x3f is '?', and it seems that because I was sending my constructed ByteArrayOutputStream to my Send(String string) function, the conversion was changing a lot of bytes to 0x3f - such as 0x92, which has no printable ISO-8859-1 character.
I changed my code like so:
// in main code
Bytestring message = new Bytestring(baos.toByteArray());
Send(message);
// send function
public void Send(Bytestring string) throws IOException {
OutputStream out = socket.getOutputStream();
// other code for writing header
out.write(string.getBytes());
out.flush();
}
Bytestring is a string class I created that is backed by a byte array, instead of a String, which is backed by a (2-byte) char array.
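To illustrate what was happening, here is a minimal sketch (not my actual Send(String) code; the charset names are just for demonstration) of how round-tripping binary data through a String mangles it:
import java.nio.charset.StandardCharsets;

public class CharsetMangling {
    public static void main(String[] args) {
        byte[] raw = { (byte) 0x92, 0x44, 0x11, 0x00 };
        // 0x92 is not a valid UTF-8 sequence, so decoding replaces it with U+FFFD...
        String s = new String(raw, StandardCharsets.UTF_8);
        // ...and encoding to a charset that cannot represent U+FFFD substitutes '?' (0x3f)
        byte[] mangled = s.getBytes(StandardCharsets.US_ASCII);
        for (byte b : mangled) {
            System.out.printf("%02x:", b);
        }
        // prints 3f:44:11:00: - the same kind of corruption seen in Wireshark
    }
}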
Related
I'm trying to serialize objects between an NIO SocketChannel and a blocking IO Socket. Since I can't use Serializable/writeObject on NIO, I thought I'd write code to serialize an object into a ByteArrayOutputStream, then send the array length followed by the array.
Sender function is
public void writeObject(Object obj) throws IOException{
ByteArrayOutputStream serializedObj = new ByteArrayOutputStream();
ObjectOutputStream writer = new ObjectOutputStream(serializedObj);
writer.writeUnshared(obj);
ByteBuffer size = ByteBuffer.allocate(4).putInt(serializedObj.toByteArray().length);
this.getSocket().write(size);
this.getSocket().write(ByteBuffer.wrap(serializedObj.toByteArray()));
}
and receiver is:
public Object readObject(){
try {
//Read the total packet size
byte[] dimension = new byte[4];
int byteRead = 0;
while(byteRead < 4) {
byteRead += this.getInputStream().read(dimension, byteRead, 4 - byteRead);
}
int size = ByteBuffer.wrap(dimension).getInt(); /* (*) */
System.out.println(size);
byte[] object = new byte[size];
while(size > 0){
size -= this.getInputStream().read(object);
}
InputStream in = new ByteArrayInputStream(object, 0, object.length);
ObjectInputStream ois = new ObjectInputStream(in);
Object res = ois.readUnshared();
ois.close();
return res;
} catch (IOException | ClassNotFoundException e) {
return null;
}
}
The problem is that size (*) is always equal to -1393754107, when serializedObj.toByteArray().length in my test is 316.
I don't understand why the conversion doesn't work properly.
this.getSocket().write(size);
this.getSocket().write(ByteBuffer.wrap(serializedObj.toByteArray()));
If the result of getSocket() is a SocketChannel in non-blocking mode, the problem is here. You aren't checking the result of write(). In non-blocking mode it can write less than the number of bytes remaining in the ByteBuffer; indeed it can write zero bytes.
So you aren't writing all the data you think you're writing, so the other end gets out of sync: it reads the next length word as part of the data, and part of the following data as the next length word, and gets a wrong answer. I'm surprised it didn't barf earlier. In fact it probably did, but your deplorable practice of ignoring IOExceptions masked it. Don't do that. Log them.
So you need to loop until all the requested data has been written, and if any write() returns zero you need to select on OP_WRITE until it fires. This adds a considerable complication to your code: you have to return to the select loop while remembering that there is an outstanding ByteBuffer with data remaining to be written. And when OP_WRITE fires and the writes complete, you have to deregister interest in OP_WRITE, as it's only of interest after a write() has returned zero.
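Schematically, and purely as an illustrative sketch (the class, field, and method names here are invented; this is not a drop-in implementation), that pattern looks something like:
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.SelectionKey;
import java.nio.channels.SocketChannel;

class NonBlockingWriter {
    private ByteBuffer pendingBuffer; // data left over from a short write

    void writeFully(SelectionKey key, ByteBuffer buf) throws IOException {
        SocketChannel channel = (SocketChannel) key.channel();
        while (buf.hasRemaining()) {
            if (channel.write(buf) == 0) {
                // Socket send buffer full: park the leftovers and ask the
                // selector to wake us when the channel is writable again.
                pendingBuffer = buf;
                key.interestOps(key.interestOps() | SelectionKey.OP_WRITE);
                return;
            }
        }
    }

    // Call this from the select loop when OP_WRITE fires.
    void onWritable(SelectionKey key) throws IOException {
        SocketChannel channel = (SocketChannel) key.channel();
        while (pendingBuffer.hasRemaining()) {
            if (channel.write(pendingBuffer) == 0) {
                return; // still blocked; leave OP_WRITE registered
            }
        }
        pendingBuffer = null;
        // All written: deregister OP_WRITE; it's only of interest
        // after a write() has returned zero.
        key.interestOps(key.interestOps() & ~SelectionKey.OP_WRITE);
    }
}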
NB There is no casting in your code.
The problem was that write() always returned 0, because the buffer wasn't flipped before write().
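For completeness, a sketch of the corrected sender (this assumes, as in my test, that this.getSocket() returns a blocking SocketChannel, so the loop always drains the buffer):
public void writeObject(Object obj) throws IOException {
    ByteArrayOutputStream serializedObj = new ByteArrayOutputStream();
    ObjectOutputStream writer = new ObjectOutputStream(serializedObj);
    writer.writeUnshared(obj);
    writer.flush();
    byte[] payload = serializedObj.toByteArray();
    ByteBuffer buffer = ByteBuffer.allocate(4 + payload.length);
    buffer.putInt(payload.length);
    buffer.put(payload);
    buffer.flip(); // switch from filling to draining; without this, write() sees no remaining bytes
    while (buffer.hasRemaining()) {
        this.getSocket().write(buffer);
    }
}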
A piece of Java code residing on a server expects about 64 bytes of information from a piece of hardware, sent via TCP. The packet has a 10-byte header: the first byte is a protocol identifier, the next two bytes give the total number of bytes in the packet (including all the header bytes and checksum), and the last 7 bytes are a UID.
Server Code:
public void run () throws Exception
{
//Open a socket on localhost at port 11111
ServerSocket welcomeSocket = new ServerSocket(11111);
while(true) {
//Open and Accept on Socket
Socket connectionSocket = welcomeSocket.accept();
//Alt Method
DataInputStream dis = new DataInputStream(connectionSocket.getInputStream());
int len = dis.readInt();
byte[] data = new byte[len];
if (len > 0) {
dis.readFully(data);
}
System.out.println("Recv[HEX]: " + StringTools.toHexString(data));
}
}
The issue is my readInt() line: it consumes the first four bytes, but I need to determine the length from just the second and third bytes. How can this be achieved?
And secondly, is my StringTools.toHexString(data) call the correct way to dump the received buffer, which I know should be readable as a hex string?
Note: This question has its root here: Java TCP Socket Byte Heap Memory Issue
Only use DataInputStream if the other side is using DataOutputStream or its exact format. The integers, for example, may be encoded big-endian or little-endian - DataOutputStream uses big-endian notation, so if the other side uses a different encoding, you cannot use DataInputStream. Using InputStream.read() gives you more control if you need it.
Now, since the format of the message as you stated starts with one byte for the protocol identifier, you first need to read that as a byte (dis.readByte() or InputStream.read()) and either check that the protocol is what you expect or handle different protocols. Then you read the message length, and so on.
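If the hardware does send the length big-endian, DataInputStream can still do the job. A minimal sketch of that reading order (assuming an unsigned, big-endian two-byte length field):
DataInputStream dis = new DataInputStream(connectionSocket.getInputStream());
int protocol = dis.readUnsignedByte(); // byte 0: protocol identifier
int length = dis.readUnsignedShort(); // bytes 1-2: total packet length, big-endian
byte[] rest = new byte[length - 3]; // the remaining 7 UID bytes, payload and checksum
dis.readFully(rest); // loops until the whole packet has arrived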
You can use ByteBuffer to read the int encoded in the last two bytes:
import static java.lang.System.out;
import java.nio.ByteBuffer;
class B {
public static void main( String ... args ) {
// test value
int a = 1238098;
// convert it into an arrays of bytes
ByteBuffer b = ByteBuffer.allocate(4);
b.putInt(a);
byte [] r = b.array();
// read last two
int size = ByteBuffer.wrap(new byte[]{0x0,0x0, r[2], r[3]}).getInt();
// print it
out.println("Original: " + String.format("%32s%n" , Integer.toString(a,2)).replace(' ', '0'));
out.printf("Last two: %32s%n" , Integer.toString(size,2));
out.printf("Decimal : %d%n" , size );
}
}
Output:
Original: 00000000000100101110010001010010
Last two: 1110010001010010
Decimal : 58450
However, I would recommend following Jiri's answer about reading with InputStream.read() instead of DataInputStream.
Server code
if(success){
out.write("true".getBytes().length);
out.write("true".getBytes());
out.flush();
}
else{
out.write("false".getBytes().length);
out.write("false".getBytes());
out.flush();
}
Client Code
int size = inputStream.read();
byte[] buf = new byte[size];
inputStream.read(buf);
String ns = new String(buf);
Boolean.valueOf(ns);
Although the server sends the result, the client reads it wrong. What is the problem here, and how can I solve it? For example, the server sends the value true, but the client receives it as false.
You need to step through what you are doing exactly. Obviously the simplest way to send a boolean is as a single byte, like this.
out.write(success ? 1 : 0);
and to read this you would do
boolean success = in.read() != 0;
However, if you need to send a string, I would check what string you are reading and what the correct length is, because there are any number of reasons a binary protocol can fail, e.g. because the previous thing you read/wrote was incorrect.
Server and client are probably using different charsets.
Use an explicit one (and the same one) on both sides.
see http://docs.oracle.com/javase/6/docs/api/java/lang/String.html
public byte[] getBytes(String charsetName)
throws UnsupportedEncodingException
and
public String(byte[] bytes,
String charsetName)
throws UnsupportedEncodingException
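Combining the explicit charset with a read that waits for the full payload, a sketch of a safer client (the helper name readMessage is invented for illustration; UTF-8 is just one consistent choice):
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

static String readMessage(InputStream inputStream) throws IOException {
    DataInputStream in = new DataInputStream(inputStream);
    int size = in.readUnsignedByte(); // the length prefix written by out.write(length)
    byte[] buf = new byte[size];
    in.readFully(buf); // keeps reading until all 'size' bytes have arrived
    return new String(buf, "UTF-8"); // must match the charset used with getBytes(...)
}
On the server side, write the payload with "true".getBytes("UTF-8") so that both ends agree on the encoding.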
I created a client (in Java) and a server (in Qt/C++), but I have a data transfer problem (something wrong with the format, I think).
The server side code:
void Pirate::DateArrived()
{
QTcpSocket *socket = qobject_cast<QTcpSocket *>(sender());
QDataStream in (socket);
qDebug()<< socket->bytesAvailable();// here it gives me the number of chars I sent, in this example: 3
QString cmd ;
in >> cmd;
qDebug()<< cmd.size(); // here it always stays 0
qDebug() << cmd; // always ""
}
And the Java client code:
public void SendData(String data) throws IOException
{
OutputStream theOutput = socket.getOutputStream();
OutputStreamWriter out = new OutputStreamWriter(theOutput);
out.write("abc");
out.flush();
}
According to the docs, when you deserialize a QString, it is expected that the data will consist of the string length in bytes (quint32) followed by the data in UTF-16.
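So a Java sender compatible with that format could look like this sketch (assuming the Qt side keeps QDataStream's defaults: a big-endian stream, with the string data as UTF-16 big-endian):
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.OutputStream;

public void SendData(String data) throws IOException
{
    DataOutputStream out = new DataOutputStream(socket.getOutputStream());
    byte[] utf16 = data.getBytes("UTF-16BE");
    out.writeInt(utf16.length); // quint32 byte length; DataOutputStream writes big-endian, matching QDataStream's default
    out.write(utf16); // the string data itself
    out.flush();
}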
Now, here is the J2ME (mobile) code for sending the string:
String s="hai";
try{
String url = "btspp://001F81000250:1;authenticate=false;encrypt=false;master=false";
StreamConnection stream = null;
InputStream in;
OutputStream out;
stream = (StreamConnection) Connector.open(url);
out=stream.openOutputStream();
s=tf.getString(); // s is already declared above; re-declaring it would not compile
byte size=(byte) s.length();
out.write(size);
out.write(s.getBytes());
out.flush();
out.close();
stream.close();
}
catch(Exception e){
}
And now the J2SE code for receiving the string:
StreamConnectionNotifier notifier=null;
try{
String url = "btspp://localhost:"+new UUID("1101", true).toString()+";name=PCServerCOMM;authenticate=false";
System.out.println(LocalDevice.getLocalDevice().getBluetoothAddress()+"\nCreate server by uri: " + url);
notifier= (StreamConnectionNotifier) Connector.open(url);
while(true){
System.out.println("waiting....");
StreamConnection con = notifier.acceptAndOpen();
System.out.println("Got connection..");
InputStream is=con.openInputStream();
//byte b[]=new byte[40];
/*
while(is.available()>0){
System.out.print((char)is.read());
}*/
//is.read(b, 0, 40);
int size=is.read();
byte b[]=new byte[size];
is.read(b, 0, size);
File f=new File("d://test.xml");
FileOutputStream fo=new FileOutputStream(f);
fo.write(b,0,b.length);
fo.close();
con.close();
System.out.println(new String (b));
}
//printing(f);
} catch(Exception e){
JOptionPane.showConfirmDialog(new JFrame(), e.getMessage());
}
I tried this code for the data transfer, but it is not successful: when the string being sent is too long, there is a problem on the receiving side. How can I solve this?
Is there any other way to transfer the data in RMS to J2SE? If so, please help me.
The way you are writing and reading here, only strings up to 255 characters in length - and which additionally take exactly one byte per character in your default encoding - are transferred correctly.
On the writing side:
The statement byte size=(byte) s.length(); converts the length of the string to a byte, thus keeping only the lower 8 bits of the length. So only lengths up to 255 are written correctly.
Then you are converting the String to a byte array with s.getBytes() - this array could be longer (in bytes) than the original string is (in characters). This conversion uses the default encoding of your sending device.
On the reading side:
The statement int size=is.read(); reads the length written before; then you are creating a byte array of that size.
is.read(b, 0, size); reads some bytes into this array - it does not necessarily fill the complete array.
Then you are converting your byte array (which may not even be filled completely) to a string, using the default encoding of the receiving device.
So, we have:
All strings longer than 255 characters are written wrongly.
If sending and receiving side are using different encodings, you may get a wrong output.
If the sending side uses an encoding like UTF-8 where some characters take more than one byte, the string is cut off at the end (if such characters occur).
How to solve this:
If you can use a DataInputStream and DataOutputStream on both sides (I don't know anything about J2ME), use them there, with their readUTF and writeUTF methods. They solve all your problems (if your strings take at most 65535 bytes in the modified UTF-8 encoding used here).
If not:
make a decision on how long the strings can be, and encode your length with the right number of bytes. 4 bytes are enough for every Java String.
measure the length after converting to a byte[], not before.
use a loop for reading into the array, to be sure to capture the whole string.
for the getBytes() and new String(...), use the variants which take an explicit encoding name and give them the same encoding (I recommend "UTF-8").
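Putting those points together, a sketch of the manual approach (a 4-byte big-endian length prefix and UTF-8 are chosen here for illustration):
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

static void writeString(OutputStream out, String s) throws IOException {
    byte[] data = s.getBytes("UTF-8"); // measure the length AFTER encoding
    new DataOutputStream(out).writeInt(data.length); // 4 bytes are enough for any String
    out.write(data);
    out.flush();
}

static String readString(InputStream in) throws IOException {
    DataInputStream din = new DataInputStream(in);
    int size = din.readInt();
    byte[] data = new byte[size];
    din.readFully(data); // readFully loops until the whole array is filled
    return new String(data, "UTF-8"); // same explicit encoding on both sides
}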