I sent an integer from a C client to a Java server and it worked perfectly, but when I tried to do the same thing with a string I got an error.
This is the client code that sends the string:
char clientString[30];
printf("String to send : \n");
if (send(to_server_socket, clientString, sizeof(clientString), 0) != sizeof(clientString))
{
    printf("socket write failed");
    exit(-1);
}
And the Java code that reads it:
DataInputStream din = new DataInputStream(socket.getInputStream());
String clientString = din.readUTF();
System.out.println(clientString);
Error
java.io.EOFException
    at java.io.DataInputStream.readFully(DataInputStream.java:180)
    at java.io.DataInputStream.readUTF(DataInputStream.java:592)
    at java.io.DataInputStream.readUTF(DataInputStream.java:547)
    at ServiceRequest.run(ServiceRequest.java:43)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
    at java.util.concurrent.FutureTask.run(FutureTask.java:138)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
    at java.lang.Thread.run(Thread.java:680)
EDIT: I tried using din.readLine() and I no longer get the error, but if I type fffffff12 on the client I get fffffff12`?7E^Ê?h on the server.
You send all of the data in the clientString array, no matter how long the input really is. Terminate the string properly and only send e.g. strlen(clientString) bytes instead.
readUTF doesn't just read raw bytes from a socket. It first reads the length of the string (as a 16-bit big-endian integer) and then reads that many bytes of modified UTF-8. The problem is that what you send is not what readUTF requires to work successfully.
As Joachim Pileborg noted, you are also sending the entire 30 bytes of clientString (including any remaining bytes that were not explicitly set). You should send it like this instead:
send(to_server_socket, clientString, strlen(clientString), 0);
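As an illustration, here is a minimal server-side sketch that matches a plain newline-terminated payload instead of readUTF's length-prefixed framing. It assumes the C client appends '\n' to the string and sends only strlen(clientString) bytes; this replaces readUTF rather than reproducing its framing:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

// Sketch: read one newline-terminated ASCII line instead of a readUTF frame.
BufferedReader in = new BufferedReader(
        new InputStreamReader(socket.getInputStream(), StandardCharsets.US_ASCII));
String clientString = in.readLine(); // returns the line without the trailing '\n'
System.out.println(clientString);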
Maybe the problem is that on the client side you are writing an ASCII string while on the server side you are reading UTF; try reading the data as ASCII. Also, if possible, please mention the exception that occurred: this method can throw either an IOException or an EOFException.
Related
I am testing Java code that issues AT commands to the modem at the designated port. I was able to successfully make a socket connection to the modem's default gateway IP and AT command port, and to write AT commands to that socket (something like below):
Socket socket = new Socket(address, port);
socket.setKeepAlive(true);
...
String command = "AT\r\n";
...
DataOutputStream out = new DataOutputStream(socket.getOutputStream());
byte[] commandBytes = command.getBytes(StandardCharsets.US_ASCII);
out.write(commandBytes, 0, commandBytes.length);
out.flush();
And I try to read the response from the socket as below:
BufferedInputStream in = new BufferedInputStream(socket.getInputStream());
byte[] byteArray = new byte[1024];
int count = in.read(byteArray, 0, byteArray.length);
System.out.println("Response Received: " + new String(byteArray, StandardCharsets.US_ASCII));
The problem is, I am getting bad characters (like ??) as the response; I am expecting the output "OK". When I issue the same command from the command prompt of the same PC where I am running this code, I get the response in proper English. I am assuming (most likely true) that the modem might be implemented in a different language like C or C++, and that its byte range differs from the byte range of Java. But if this is the case, I am not sure how to fix it. Need help.
Things that I have already tried:
Printed the bytes as retrieved and found that they are negative byte values (like -1, -3, -5 etc.); see the sketch after this list
Verified the default charset of the modem, found it to be ISO-8859-1, and tried using the same in my code (both write and read); still getting similar bad characters
Tried reading as characters using the BufferedReader classes, but nothing is received as the response
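Since Java bytes are signed, a value like -1 is really the unsigned byte 0xFF, so dumping the raw response as unsigned hex before any charset decoding shows what the modem actually returns. A minimal sketch, assuming the same socket and input stream as above:

// Sketch: print each received byte as unsigned hex so negative Java
// byte values (e.g. -1 == 0xFF) can be inspected before decoding.
byte[] byteArray = new byte[1024];
int count = in.read(byteArray, 0, byteArray.length);
StringBuilder hex = new StringBuilder();
for (int i = 0; i < count; i++) {
    hex.append(String.format("%02X ", byteArray[i] & 0xFF));
}
System.out.println("Raw response: " + hex);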
My question in short: how do I read the binary (byte array) data received as a response from the modem for AT commands issued over a Java socket connection?
Any help in this regard is highly appreciated.
I'm working on implementing RCON for Minecraft, which uses the Valve RCON protocol. I've gotten my hands on the C source and tried to port it to Java; this is what I've done so far:
Creating the packet: http://pastebin.com/9AeiSQPD
Receiving the packet: http://pastebin.com/n6V1KnPa
Sending the packet: http://pastebin.com/rixhD15p
I'm sending the AUTH packet to the server and trying to receive a response, but the return value is null; also, trying to send a command throws:
Software caused connection abort: socket write error
What am I doing wrong?
I think there are two things wrong in your code.
1) First, as you can see here, the packet structure uses 4-byte (32-bit) blocks holding little-endian integers, which means the byte order is reversed compared to Java's default big-endian (see here).
2) Second, you didn't append the null block (an empty string or null character) at the end of the packet.
Solution:
1) Use the following (ByteBuffer and ByteOrder come from the standard java.nio package, available in Java 7):
writer.write(ByteBuffer.allocate(4).order(ByteOrder.LITTLE_ENDIAN)
.putInt(p.size).array());
writer.write(ByteBuffer.allocate(4).order(ByteOrder.LITTLE_ENDIAN)
.putInt(p.id).array());
writer.write(ByteBuffer.allocate(4).order(ByteOrder.LITTLE_ENDIAN)
.putInt(p.cmd).array());
Instead of:
writer.writeInt(p.size);
writer.writeInt(p.id);
writer.writeInt(p.cmd);
and:
ByteBuffer.wrap(<4_BLOCKS_BYTES>)
.order(ByteOrder.LITTLE_ENDIAN).getInt();
instead of:
reader.readInt();
where <4_BLOCKS_BYTES> is a byte array of size 4 read from the reader.
And if I can offer one piece of advice: it can be easier (I think) to build a single global buffer to send, i.e. a buffer containing size, id, type, data, and the empty block as bytes. Do the same when you read the response: read into a buffer while the DataInputStream has bytes available(), then parse it.
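A minimal sketch of that single-buffer approach, assuming hypothetical field names (id, cmd, payload) rather than the ones in the pastebin source:

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.charset.StandardCharsets;

// Sketch: serialize one RCON-style packet into a single little-endian buffer.
static byte[] buildPacket(int id, int cmd, String payload) {
    byte[] body = payload.getBytes(StandardCharsets.US_ASCII);
    int size = 4 + 4 + body.length + 2;      // id + cmd + body + two null bytes
    ByteBuffer buf = ByteBuffer.allocate(4 + size).order(ByteOrder.LITTLE_ENDIAN);
    buf.putInt(size);                        // size field excludes itself
    buf.putInt(id);
    buf.putInt(cmd);
    buf.put(body);
    buf.put((byte) 0);                       // null terminator for the body
    buf.put((byte) 0);                       // trailing empty string
    return buf.array();
}

The returned array can then be handed to writer.write(...) in one call.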
Good luck!
Java Doc links:
java.nio: docs.oracle.com/javase/7/docs/api/java/nio/package-frame.html
DataInputStream.available(): docs.oracle.com/javase/7/docs/api/java/io/FilterInputStream.html#available()
In my server-side code I need to be able to listen to a socket to exchange JSON 'packets' with a Java 7 test application on the same machine. The connection is made and a JSON string is constructed and written to the socket by the Java test application. It is received by the Dart server-side application and passed to a callback method, handleJson, which attempts to decode it. The process dies on 'JSON.decode'.
I think it dies because the string is prepended by the Java 'writeUTF' method with a short int that contains the number of bytes in the UTF-8 JSON (not counting the short itself), and the leading byte of that short is 0.
Is there a Dart method to handle this, in each direction, or must I write the code myself? (I had thought that JSON would pass easily between languages.)
The JSON string before writing to the socket in my Java test application:
{"target":"DOOR","command":"OPEN"} // 34 characters
A Java snippet:
// in a try-catch
Socket client = new Socket(serverName, port);
OutputStream outToServer = client.getOutputStream();
DataOutputStream out = new DataOutputStream(outToServer);
out.writeUTF(json);
client.close();
The Java documentation states that out.writeUTF converts the json string to (modified) UTF-8 and prepends the length as a short int containing the number of encoded bytes written.
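A small sketch to confirm this framing: the 34-character JSON string becomes 36 bytes on the wire (two length bytes, 0 and 34, then the UTF-8 text), which matches the string length of 36 reported on the Dart side below.

// in a try-catch
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;

// Sketch: inspect what writeUTF actually puts on the wire.
ByteArrayOutputStream bytes = new ByteArrayOutputStream();
DataOutputStream out = new DataOutputStream(bytes);
out.writeUTF("{\"target\":\"DOOR\",\"command\":\"OPEN\"}");
byte[] wire = bytes.toByteArray();
System.out.printf("first two bytes: %d %d, total %d bytes%n",
        wire[0], wire[1], wire.length); // prints: first two bytes: 0 34, total 36 bytes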
In main:
ServerSocket.bind('127.0.0.1', 4041).then((serverSocket) {
  print('connected');
  // prints: 'connected'
  serverSocket.listen((socket) {
    socket.transform(UTF8.decoder).listen(handleJson);
  });
});
handleJson method:
handleJson(String stringAsJson) {
  print('string length is ' + (stringAsJson.length).toString());
  // prints: 'string length is 36'
  print('received json $stringAsJson');
  // prints: 'received json '
  String json = JSON.decode(stringAsJson);
  // dies on decode
  print('Server Socket received: $json');
}
This will give you some trouble, since Socket is raw TCP, and TCP is streaming. That means the text (bytes) you send can be split and merged in any way the network may find suitable.
In your case, you need a way to mark the end of each JSON message. An example could be to accumulate all bytes received until the byte 0 is seen (0 is invalid in JSON text). Those bytes can then be decoded from UTF-8 and then parsed as JSON. Note that the peer needs to send this 0 byte between messages for this to work.
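On the Java side, a minimal sketch of that framing, sending raw UTF-8 plus a 0 delimiter byte instead of writeUTF (host and port as in the Dart code below):

// in a try-catch
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

// Sketch: frame each JSON message with a trailing 0 byte for the Dart side to split on.
try (Socket client = new Socket("127.0.0.1", 4041)) {
    OutputStream out = client.getOutputStream();
    out.write("{\"target\":\"DOOR\",\"command\":\"OPEN\"}".getBytes(StandardCharsets.UTF_8));
    out.write(0); // message delimiter, never appears inside JSON text
    out.flush();
}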
You could also consider using WebSockets as a way to send messages. After the initial HTTP handshake, it's actually just a raw TCP socket with some extra header information that makes it message oriented - exactly what you need. dart:io already includes a WebSocket implementation.
My Java server sends an Integer and a String (to a C client):
DataOutputStream dos = new DataOutputStream(socket.getOutputStream());
dos.writeInt(ClientNumber); //send the Integer
String randomString= getRandomValue(10,20);
dos.writeUTF(randomString); //send the String
String clientString = din.readLine();
The C code for the client that's reading them is:
if (recv(to_server_socket, &reply, sizeof(reply), MSG_WAITALL) != sizeof(reply))
{
    printf("socket read failed");
    exit(-1);
}
char buf[50];
int byte_count;
byte_count = recv(to_server_socket, buf, sizeof buf, 0);
printf("recv()'d %d bytes of data in buf\n", byte_count);
Up to this point, it works fine.
Now, I want to send another String to the Client. So, I tried just adding the line:
dos.writeUTF("blabla");
It's still working and when I tried to get the client to read it, I added:
byte_count2 = recv(to_server_socket, buf2, sizeof buf2, 0);
printf("recv()'d %d bytes of data in buf\n", byte_count2);
And it doesn't work: the client receives the number and the first string, but nothing more arrives and it never receives the "blabla" string. I'm not sure whether the problem is in the client or the server.
Can anyone explain to me what I'm doing wrong?
Try closing your dos (DataOutputStream) after every write. You may want to check first whether flush() helps.
You are mixing your protocols. I suggest you use either a binary or a text wire format; it's not clear which one you are trying to use.
I suggest the text wire format, as it is easier to work with in this case, i.e. don't use DataInputStream or DataOutputStream, as these are for binary formats.
Instead you can use BufferedReader for reading lines of text and PrintWriter for writing lines of text. You can test your server works by connecting to it with telnet i.e. if it doesn't work with telnet, it won't work with C.
Once this is working, get your C client to work as well. BTW, you shouldn't assume that one write translates to one read; you are writing a stream of data, not messages.
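A minimal sketch of the text wire format on the Java side, assuming newline-delimited messages (host and port are placeholders; the C side would then read until '\n' instead of assuming one recv() returns one message):

// in a try-catch
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.PrintWriter;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

// Sketch: newline-delimited text protocol; each message is one full line.
try (Socket socket = new Socket("localhost", 4444)) {
    PrintWriter out = new PrintWriter(
            new OutputStreamWriter(socket.getOutputStream(), StandardCharsets.US_ASCII), true);
    BufferedReader in = new BufferedReader(
            new InputStreamReader(socket.getInputStream(), StandardCharsets.US_ASCII));

    out.println("42");     // the integer, sent as a line of text
    out.println("blabla"); // each string is its own line
    String reply = in.readLine(); // one logical message per line
}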
I have a strange problem with receiving data from a socket.
On the client I'm using an AIR socket; on the server, Java with Netty.
I'm writing simple packets to the socket: int numPacket, int textLength, utf8String text, and reading them on the client:
//server
buffer.writeInt( packetId );
ChannelBuffer ch = ChannelBuffers.copiedBuffer( text, CharsetUtil.UTF_8);
buffer.writeInt( text.length() );
buffer.writeBytes(ch);
//client
packetId = socket.readInt()
packetLen = socket.readInt()
text = socket.readUtfBytes(packetLen)
Sometimes a packet is not received by the client even though the server sent it and tcpdump shows that the packet was sent. When the server then sends a new packet, the client reads the previous packet but not the new one, so it behaves like a queue that I don't want.
P.S. Sorry for my bad English -_-
It looks like the client may be waiting for some terminator byte (\n, \0, etc.) to know the end of the frame. I had a similar problem with Flash, because the client was expecting a null byte at the end of the transmission.
You could try adding the following sort of encoder as the last encoder in your pipeline and give it a try. The relevant code for handling the nul byte is shown below:
ChannelBuffer nulBuffer = ChannelBuffers.wrappedBuffer(new byte[] { 0 });
ChannelBuffer buffer = ChannelBuffers.wrappedBuffer((ChannelBuffer)msg,nulBuffer);
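For context, a hedged sketch of a complete encoder along those lines, assuming the Netty 3.x OneToOneEncoder API implied by the ChannelBuffers calls above:

import org.jboss.netty.buffer.ChannelBuffer;
import org.jboss.netty.buffer.ChannelBuffers;
import org.jboss.netty.channel.Channel;
import org.jboss.netty.channel.ChannelHandlerContext;
import org.jboss.netty.handler.codec.oneone.OneToOneEncoder;

// Sketch: appends a single nul byte to every outgoing buffer so a
// terminator-scanning client (e.g. Flash/AIR) can detect the frame end.
public class NulTerminatingEncoder extends OneToOneEncoder {
    @Override
    protected Object encode(ChannelHandlerContext ctx, Channel channel, Object msg)
            throws Exception {
        ChannelBuffer nulBuffer = ChannelBuffers.wrappedBuffer(new byte[] { 0 });
        return ChannelBuffers.wrappedBuffer((ChannelBuffer) msg, nulBuffer);
    }
}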
Also, try calling flush() on your buffer after each write, or once after all three.