My Java server sends an Integer and a String (to a C client):
DataOutputStream dos = new DataOutputStream(socket.getOutputStream());
dos.writeInt(ClientNumber); //send the Integer
String randomString= getRandomValue(10,20);
dos.writeUTF(randomString); //send the String
String clientString=din.readLine();
The C code for the client that's reading them is:
if( recv( to_server_socket, &reply, sizeof( reply ), MSG_WAITALL ) != sizeof( reply ) )
{
    printf( "socket read failed");
    exit( -1 );
}
char buf[50];
int byte_count;
byte_count = recv(to_server_socket, buf, sizeof buf, 0);
printf("recv()'d %d bytes of data in buf\n", byte_count)
Until here, it works fine.
Now, I want to send another String to the Client. So, I tried just adding the line:
dos.writeUTF("blabla");
It's still working and when I tried to get the client to read it, I added:
byte_count2 = recv(to_server_socket, buf2, sizeof buf2, 0);
printf("recv()'d %d bytes of data in buf\n", byte_count2);
And it doesn't work. The client receives the number and the first String, but then it doesn't send anything back and doesn't receive the "blabla" string. I'm not sure if the problem is in the client or the server.
Can anyone explain to me what I'm doing wrong?
Try closing your dos (DataOutputStream) after every write. You may want to check first whether flush() alone helps.
You are mixing your protocols. I suggest you use either a binary or a text wire format; it's not clear which one you are trying to use.
I suggest a text wire format, as it is easier to work with in this case, i.e. don't use DataInputStream or DataOutputStream, as these are for binary formats.
Instead you can use BufferedReader for reading lines of text and PrintWriter for writing lines of text. You can test that your server works by connecting to it with telnet, i.e. if it doesn't work with telnet, it won't work with C.
Once this is working, get your C client to work as well. BTW you shouldn't assume that one write translates to one read. You are writing a stream of data, not messages.
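For illustration, here is a rough sketch of what the server side could look like with a line-per-value text format (untested; clientNumber and getRandomValue come from the question, everything else is an assumption):
// One value per line, '\n'-terminated, so a C client can read each value with a
// newline-delimited recv loop (or fgets on a FILE* wrapped around the socket),
// and you can sanity-check the server with telnet.
PrintWriter out = new PrintWriter(
        new OutputStreamWriter(socket.getOutputStream(), StandardCharsets.US_ASCII), true); // autoflush on println
BufferedReader in = new BufferedReader(
        new InputStreamReader(socket.getInputStream(), StandardCharsets.US_ASCII));

out.println(clientNumber);            // the integer, sent as a line of text
out.println(getRandomValue(10, 20));  // the first string
out.println("blabla");                // the second string

String clientString = in.readLine();  // the client's reply, one line
On the C side, each value is then just the bytes up to the next '\n', which avoids having to parse writeUTF's binary length prefix.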
I'm sending a string over the socket I previously sent a file to, but the recipient reads it as part of the file itself. Is there a way to send a sort of EOF before sending the string?
To send the file I'm using
byte[] buffer = new byte[1024];
int count;
while ((count = fis.read(buffer)) >= 0) os.write(buffer, 0, count);
os.flush();
(and almost the same to receive it)
To send the string I'm using OutputStreamWriter
(Here is my code: hatebin)
I've also read here that I should send a SOH character, but which one should I send and how?
Thanks in advance.
No, there's no way to send an "EOF" and then send something afterwards.
If you don't want to open a new connection, there are basically two ways to solve this.
You can modify the client so it recognizes some special byte sequence as a "delimiter", and stops writing to the file when it reads the delimiter from the socket. In this case you need to have some strategy to deal with the possibility that the file actually contains the delimiter.
You can send the size of the file in bytes before sending the file, and modify the client so it counts the number of bytes it reads from the socket. When the client has read enough, it should stop writing to the file.
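A rough sketch of the second approach (a length prefix), assuming DataOutputStream/DataInputStream wrapped around the socket streams, with fis/os being the file input stream and socket output stream from the question and is/fos being the corresponding receiver-side streams (names made up for illustration):
// Sender: announce the file size first, then stream the bytes.
DataOutputStream dos = new DataOutputStream(os);
dos.writeLong(file.length());              // 8-byte big-endian length prefix
byte[] buffer = new byte[1024];
int count;
while ((count = fis.read(buffer)) >= 0) {
    dos.write(buffer, 0, count);
}
dos.flush();
// The connection is now free for the extra string (e.g. another length-prefixed
// message, or a writer), because the receiver knows where the file ends.

// Receiver: stop writing to the file after exactly 'size' bytes.
DataInputStream dis = new DataInputStream(is);
long remaining = dis.readLong();
byte[] buf = new byte[1024];
while (remaining > 0) {
    int n = dis.read(buf, 0, (int) Math.min(buf.length, remaining));
    if (n < 0) {
        throw new java.io.EOFException("connection closed in the middle of the file");
    }
    fos.write(buf, 0, n);
    remaining -= n;
}
// Anything read after this point belongs to the next message (the string).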
I am testing Java code that issues AT commands to the modem at the designated port. I was able to successfully make a socket connection to the modem's default gateway IP and AT command port and write AT commands to that socket (something like below):
Socket socket = new Socket(address, port);
socket.setKeepAlive(true);
...
String command = "AT\r\n";
...
DataOutputStream out = new DataOutputStream(socket.getOutputStream());
byte[] commandBytes = command.getBytes(StandardCharsets.US_ASCII);
out.write(commandBytes, 0, commandBytes.length);
out.flush();
And I try to read the response from the socket as below:
BufferedInputStream in = new BufferedInputStream(socket.getInputStream());
byte[] byteArray = new byte[1024];
int count = in.read(byteArray, 0, byteArray.length);
System.out.println("Response Received: " + new String(byteArray, StandardCharsets.US_ASCII));
The problem is, I am getting bad characters (like ??) as the response; I am expecting the output to be "OK". When I issue the same command from the command prompt of the same PC where I am running this code, I get the response in proper English. I am assuming (most likely true) that the modem might be implemented in a different language like C or C++, and that its byte range is different from the byte range of Java. But if this is the case, I am not sure how to fix it. Need help.
Things that I have already tried:
Printed the bytes as retrieved and found that they are negative byte values (like -1, -3, -5, etc.)
Verified the default charset of the modem, found it to be ISO-8859-1, and tried using the same in my code (for both write and read), but I am still getting similar bad characters
Tried reading characters using BufferedReader-style classes, but nothing is received as the response
My question, in short: how do I read the binary (byte array) data received as a response from the modem for AT commands issued over a Java socket connection?
Any help in this regard is highly appreciated.
My problem is that C sockets look to act differently than Java sockets. I have a C proxy and I tested it between a workload generator (oltp benchmark client written in Java) and the JDBC connector of the Postgres DB.
This works great and forwards data from one to the other, as it should. We need to make this proxy work in Java, so I used the plain ServerSocket and Socket classes from java.net, and I cannot make it work. Postgres returns an authentication error message, assuming that the client did not send the correct password.
Here is how the authentication at the JDBC protocol works:
-client sends a request to connect to a database, specifying the database name and the username
-server responds with a one-time challenge message (a 13-byte message with random content)
-client concatenates this message with the user's password and computes an MD5 hash
-server compares the hash received from the client with the hash it computes itself
[This procedure is performed in order to avoid replay attacks (if the client sent only the MD5 hash of its password, an attacker could replay this message, pretending to be the client)]
So I inspected the packets with tcpdump and they look correct! The size is exactly as it should be, so maybe the content is corrupted (??)
Sometimes, though, the DB server responds OK to the authentication (depending on the value of the challenge message)! And then the OLTP client sends a couple of queries, but it crashes after a while…
I guess that maybe it has to do with the encoding, so I tried the encoding that C uses (US-ASCII), but it's still the same.
I send the data using fixed-size character or byte arrays, both in C and in Java!
I really don't have any more ideas, as I tried so many cases...
What is your guess of what would be the problem?
Here is representative code that may help give you a clearer view:
byte [] msgBuf;
char [] msgBufChars;
while(fromInputReader.ready()){
    msgBuf = new byte[1024];
    msgBufChars = new char[1024];

    // read data from one party
    int read = fromInputReader.read(msgBufChars, 0, 1024);
    System.out.println("Read returned : " + read);

    for(int i=0; i<1024; i++)
        msgBuf[i] = (byte) msgBufChars[i];

    String messageRead = new String(msgBufChars);
    String messageToWrite = new String(msgBuf);
    System.out.println("message read : "+messageRead);
    System.out.println("message to write : "+new String(messageToWrite));

    // immediately write data to the other party (write the amount of data we read (the 'read' value))
    // there is no write method that takes a char [] as a parameter, so pass a byte []
    toDataOutputStream.write(msgBuf, 0, read);
    toDataOutputStream.flush();
}
There are a couple of message exchanges in the beginning and then Postgres responds with an authentication failure message.
Thanks for your time!
What is your guess of what would be the problem?
It is nothing to do with C versus Java sockets. It is everything to do with bad Java code.
I can see some problems:
You are using a Reader in what should be a binary stream. This is going to result in the data being converted from bytes (from the JDBC client) to characters and then back to bytes. Depending on the character set used by the reader, this is likely to be destructive.
You should use plain, unadorned1 input streams for both reading and writing, and you should read / write to / from a preallocated byte[].
This is terrible:
for(int i=0; i<1024; i++)
msgBuf[i] = (byte) msgBufChars[i];
If the characters you read are not in the range 0 ... 255 you are mangling them when you stuff them into msgBuf.
You are assuming that you actually got 1024 characters.
You are using the ready() method to decide when to stop reading stuff. This is almost certainly wrong. Read the javadoc for that method (and think about it) and you should understand why it is wrong. (Hint: what happens if the proxy can read faster than the client can deliver?)
You should use a while(true), and then break out of the loop if read tells you it has reached the end of stream; i.e. if it returns -1 ...
1 - Just use the stream objects that the Socket API provides. DataXxxStream is unnecessary because the read and write methods are simply call-throughs. I wouldn't even use BufferedXxxStream wrappers in this case, because you are already doing your own buffering using the byte array.
Here's how I'd write that code:
byte[] buffer = new byte[1024]; // or bigger
while (true) {
    int nosRead = inputStream.read(buffer);
    if (nosRead < 0) {
        break;
    }
    // Note that this is a bit dodgy, given that the data you are converting is
    // binary. However, if the purpose is to see what embedded character data
    // looks like, and if the proxy's charset matches the text charset used by
    // the client-side JDBC driver for encoding data, this should achieve that.
    System.out.println("Read returned : " + nosRead);
    System.out.println("message read : " + new String(buffer, 0, nosRead));
    outputStream.write(buffer, 0, nosRead);
    outputStream.flush();
}
C sockets look to act differently than Java sockets.
Impossible. Java sockets are just a very thin layer over C sockets. You're on the wrong track with this line of thinking.
byte [] msgBuf;
char [] msgBufChars;
Why are you reading chars when you want to write bytes? Don't use Readers unless you know that the input is text.
And don't call ready(). There are very few correct uses, and this isn't one of them. Just block.
I sent an integer from a C client to a Java server and it worked perfectly. But when I tried to do the same thing with a string, I got an error.
This is the client code to send the String:
char clientString[30];
printf("String to send : \n");
if( send( to_server_socket, &clientString, sizeof( clientString ), 0 ) != sizeof( clientString ) )
{
    printf( "socket write failed");
    exit( -1 );
}
And the Java code to read it:
DataInputStream din = new DataInputStream(socket.getInputStream());
String clientString=din.readUTF();
System.out.println(clientString);
Error
java.io.EOFException
    at java.io.DataInputStream.readFully(DataInputStream.java:180)
    at java.io.DataInputStream.readUTF(DataInputStream.java:592)
    at java.io.DataInputStream.readUTF(DataInputStream.java:547)
    at ServiceRequest.run(ServiceRequest.java:43)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
    at java.util.concurrent.FutureTask.run(FutureTask.java:138)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
    at java.lang.Thread.run(Thread.java:680)
EDIT: I tried using din.readLine(). I don't get the error anymore, but if I type fffffff12 on the client I get fffffff12`?7E^Ê?h on the server.
You send all of the data in the clientString array, no matter how long the input really is. Terminate the string properly and only send e.g. strlen(clientString) bytes instead.
readUTF doesn't just read bytes from a socket. It starts by reading the length of the string (as a 16-bit unsigned integer) and then reads that many bytes of string data. The problem is that what you send is not what is required for readUTF to work successfully.
As Joachim Pileborg noted, you are also sending the entire 30 bytes of clientString (including any remaining bytes that were not explicitly set). You should send it like this instead:
send(to_server_socket, clientString, strlen(clientString), 0);
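To make the wire format concrete, this is roughly what readUTF() expects to find on the socket (a sketch of the behaviour, not the actual JDK code):
// What din.readUTF() does, approximately:
int len = din.readUnsignedShort();   // 2-byte big-endian length prefix
byte[] utf = new byte[len];
din.readFully(utf);                  // exactly 'len' bytes of (modified) UTF-8
String s = new String(utf, "UTF-8");
So a raw send() of a fixed 30-byte array cannot be parsed by readUTF(): the C client would have to send a 2-byte length followed by exactly that many bytes, or the Java side should read raw bytes (or lines) instead of calling readUTF().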
Maybe the problem is that on the client side you are writing an ASCII string but on the server side you are reading UTF. Try reading the data as ASCII. Also, if possible, please mention the exception that occurred; this method can raise two exceptions: an IOException or an EOFException.
I have strange problem with receiving data from socket.
On the client I'm using an AIR socket. On the server, Java Netty.
I'm writing simple packets to the socket: int numPacket, int textLength, utf8String text, and reading them on the client.
//server
buffer.writeInt( packetId );
ChannelBuffer ch = ChannelBuffers.copiedBuffer( text, CharsetUtil.UTF_8);
buffer.writeInt( text.length() );
buffer.writeBytes(ch);
//client
packetId = socket.readInt()
packetLen = socket.readInt()
text = socket.readUtfBytes(packetLen)
Sometimes a packet is not received by the client, even though the server sent it and tcpdump shows that the packet was sent. If the server then sends a new packet, the client reads the previous packet but still doesn't receive the new one - it behaves like a queue that I don't want.
p.s sorry for bad english -_-
Looks like the client may be waiting for some byte (\n, \0, etc.) to know the end of the frame. I had a similar problem with Flash because the client was expecting a null byte at the end of the transmission.
You could try adding the following sort of encoder as the last encoder in your pipeline. The relevant code for appending the nul byte is shown below.
ChannelBuffer nulBuffer = ChannelBuffers.wrappedBuffer(new byte[] { 0 });
ChannelBuffer buffer = ChannelBuffers.wrappedBuffer((ChannelBuffer)msg,nulBuffer);
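For context, a rough sketch of how that could be wired up as the last encoder in a Netty 3.x pipeline (the class name and pipeline key are made up; adjust to your own setup):
import org.jboss.netty.buffer.ChannelBuffer;
import org.jboss.netty.buffer.ChannelBuffers;
import org.jboss.netty.channel.Channel;
import org.jboss.netty.channel.ChannelHandlerContext;
import org.jboss.netty.handler.codec.oneone.OneToOneEncoder;

// Appends a single nul (0x00) byte to every outgoing buffer so a Flash/AIR
// client that waits for a terminator can detect the end of each transmission.
public class NulTerminatingEncoder extends OneToOneEncoder {
    @Override
    protected Object encode(ChannelHandlerContext ctx, Channel channel, Object msg) {
        if (!(msg instanceof ChannelBuffer)) {
            return msg; // pass through anything that is not a raw buffer
        }
        ChannelBuffer nulBuffer = ChannelBuffers.wrappedBuffer(new byte[] { 0 });
        return ChannelBuffers.wrappedBuffer((ChannelBuffer) msg, nulBuffer);
    }
}

// registered last in the pipeline, e.g.:
// pipeline.addLast("nulAppender", new NulTerminatingEncoder());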
Try using flush() on your buffer after each write, or after all three.