I was wondering how to send an int from a Java application to a C application using sockets. I have different C programs communicating with each other, and the Java application can already retrieve data from the C application, but I can't work out how to send data to it.
The C application acts as a database: the Java application sends it a user id (a 4-digit number), and if the id exists the C application returns that record's details.
In Java I have tried using a PrintWriter and a DataOutputStream to send the data; the PrintWriter produces weird symbols on the C side and the DataOutputStream produces "prof_agent.so".
Any help would be appreciated as I don't have a good grasp of sockets at the moment.
You can use DataOutputStream.writeInt. By contract it writes the int in network byte order (big-endian).
On the C side, call recv or read to fill a 4-byte buffer, then use ntohl (network-to-host-long) to convert the value you've just read to your platform's int representation.
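For example, a minimal sketch of the Java sender under that approach (the host and port are placeholders; the C receiver is assumed to read exactly four bytes and then apply ntohl):
import java.io.DataOutputStream;
import java.io.IOException;
import java.net.Socket;

public class IntSender {
    public static void main(String[] args) throws IOException {
        // Placeholder address and port for the C server.
        try (Socket socket = new Socket("127.0.0.1", 5000);
             DataOutputStream out = new DataOutputStream(socket.getOutputStream())) {
            // writeInt sends exactly 4 bytes in big-endian (network) byte order,
            // so the C side can recv them into a 4-byte buffer and call ntohl.
            out.writeInt(1234);
            out.flush();
        }
    }
}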
You can send the textual representation. So the number 123 would be sent as 3 bytes '1' '2' '3'.
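For instance, a rough sketch of the textual approach, assuming the C side reads up to a newline and converts with atoi or strtol (address, port, and delimiter are illustrative):
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.PrintWriter;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class TextSender {
    public static void main(String[] args) throws IOException {
        try (Socket socket = new Socket("127.0.0.1", 5000);
             PrintWriter out = new PrintWriter(
                     new OutputStreamWriter(socket.getOutputStream(), StandardCharsets.US_ASCII), true)) {
            // Sends the characters '1' '2' '3' '4' followed by '\n' so the
            // receiver knows where the number ends.
            out.print(1234);
            out.print('\n');
            out.flush();
        }
    }
}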
It's a bit late, but let this answer stay here for others. Using UDP sockets:
Java code:
public void runJavaSocket() {
    System.out.println("Java Sockets Program has started.");
    int i = 0;
    try {
        DatagramSocket socket = new DatagramSocket();
        System.out.println("Sending the udp socket...");
        // Send the message "HI"
        socket.send(toDatagram("HI", InetAddress.getByName("127.0.0.1"), 3800));
        while (true) {
            System.out.println("Sending hi " + i);
            Thread.sleep(1000);
            socket.send(toDatagram("HI " + String.valueOf(i), InetAddress.getByName("127.0.0.1"), 3800));
            i++;
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}

public DatagramPacket toDatagram(String s, InetAddress destIA, int destPort) {
    // Deprecated since Java 1.1, but it works:
    byte[] buf = new byte[s.length() + 1];
    s.getBytes(0, s.length(), buf, 0);
    // The non-deprecated alternative:
    // byte[] buf = s.getBytes();
    return new DatagramPacket(buf, buf.length, destIA, destPort);
}
C# code:
string returnData;
byte[] receiveBytes;
//ConsoleKeyInfo cki = new ConsoleKeyInfo();
using (UdpClient udpClient = new UdpClient(new IPEndPoint(IPAddress.Parse("127.0.0.1"), 3800)))
{
    IPEndPoint remoteIpEndPoint = new IPEndPoint(IPAddress.Parse("127.0.0.1"), 3800);
    while (true)
    {
        receiveBytes = udpClient.Receive(ref remoteIpEndPoint);
        returnData = Encoding.ASCII.GetString(receiveBytes);
        Console.WriteLine(returnData);
    }
}
Try this:
Socket s = ...;
DataOutputStream out = null;
try {
    out = new DataOutputStream(s.getOutputStream());
    out.writeInt(123456);
} catch (IOException e) {
    // TODO Handle exception
} finally {
    if (out != null) {
        try {
            out.close();
        } catch (IOException e) {
            // TODO Handle exception
        }
    }
}
It would help if you could explain a little more what your problem is.
Related
Good morning. I have a quick question about the differences between a bytes object in Python (denoted b'') and how to replicate the same data in Java.
The project I am working on is personal work on an emulation server for a dead game, to improve my reversing skills. I have a working version of the project in Python, but I would like to switch over to Java, as I am better with the language and it includes many additional tools that are useful for a project like this.
I am using a ServerSocket to capture TCP data in the java project.
When data comes over the network from the Python project it looks a little something like this:
When I capture the same data over the java ServerSocket I get something like this:
My question is how can I reformat this ASCII text to get the proper data as seen in the python version of the software.
Currently I am able to get an output like this:
By converting the byte[] data from the ServerSocket as such
while (true) {
    try {
        Socket socket = serverSocket.accept();
        onConnection(socket);
        byte[] incomingData = new byte[0];
        byte[] temp = new byte[1024];
        int k = -1;
        // this is due to the client of said game not sending EOL (readLine() does not work here)
        while ((k = socket.getInputStream().read(temp, 0, temp.length)) > -1) {
            byte[] tbuff = new byte[incomingData.length + k];
            System.arraycopy(incomingData, 0, tbuff, 0, incomingData.length);
            System.arraycopy(temp, 0, tbuff, incomingData.length, k);
            incomingData = tbuff;
            receiveData(socket, incomingData); // <-- this is the important bit
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
public void receiveData(Socket socket, byte[] data) {
    int lenLo = (int) (data[0]);
    int lenHi = (int) (data[1]);
    int length = lenHi * 256 + lenLo;
    if (lenHi < 0) {
        System.out.println("Invalid Packet Length");
    }
    if (data.length != length) {
        System.out.println("Incomplete Packet Received");
    }
    try {
        String test = new String(data, "UTF-8");
        serverGUI.serverDebug(test); // shows the string in a JFrame (pic 2)
        serverGUI.debugByteArray(test.getBytes(StandardCharsets.UTF_8)); // shows the byte[] in a JFrame (pic 3 -- all bytes in this array are & 0xff before being printed)
    } catch (UnsupportedEncodingException e1) {
        // TODO Auto-generated catch block
        e1.printStackTrace();
    }
}
However, this has obviously not produced the desired outcome. Any advice or resources you can point me to would be appreciated.
Thanks in advance!
So I want to have a TCP connection between a Java client and a C++ server. Think of the client as an input device: the C++ server should receive JSON objects, parse them, and use them in a game.
The connection seems to be established successfully, but 1) there is an error ("parse error - unexpected ''") when I try to parse the JSON objects (I'm using nlohmann's json), and 2) when I don't even call doStuff and just print out the buffer, only weird characters are printed (e.g.).
I assume I messed up something in the sending/receiving of the data (this is the first time I have used C++), but I've lost two days and really can't figure it out!
In the Java client I have:
private void connect() {
    try {
        hostname = conn.getHostname();
        portnumber = conn.getPortNr();
        socket = new Socket(hostname, portnumber);
        out = new OutputStreamWriter(socket.getOutputStream());
        in = new BufferedReader(new InputStreamReader(socket.getInputStream()));
    } catch (Exception e) {
        e.printStackTrace();
        Log.e(debugString, e.getMessage());
    }
}

public void sendMessage(String json) {
    try {
        //connect();
        out.write(json.length());
        Log.d(debugString, String.valueOf(json.length()));
        out.flush();
        out.write(json);
        out.flush();
        Log.d(debugString, json);
        in.read();
        this.close();
    } catch (Exception e) {
        e.printStackTrace();
        Log.e(debugString, e.getMessage());
    }
}
And in the C++ server:
void Server::startConnection() {
    if (listen(s, 1) != 0) {
        perror("Error on listen");
        exit(EXIT_FAILURE);
    }
    listen(s, 1);
    clilen = sizeof(cli_addr);
    newsockfd = accept(s, (struct sockaddr *) &cli_addr, &clilen);
    if (newsockfd < 0) {
        close(newsockfd);
        perror("Server: ERROR on accept");
        exit(EXIT_FAILURE);
    }
    puts("Connection accepted");
    int numbytes;
    char buffer[MAXDATASIZE];
    while (1)
    {
        numbytes = recv(s, buffer, MAXDATASIZE - 1, 0);
        buffer[numbytes] = '\0';
        // Here's where the weird stuff happens
        //cout << buffer;
        //doStuff(numbytes, buffer);
        if (numbytes == 0)
        {
            cout << "Connection closed" << endl;
            break;
        }
    }
}

bool Server::sendData(char *msg) {
    int len = strlen(msg);
    int bytes_sent = send(s, msg, len, 0);
    if (bytes_sent == 0) {
        return false;
    } else {
        return true;
    }
}

void Server::doStuff(int numbytes, char *buf) {
    json jdata;
    try {
        jdata.clear();
        jdata = nlohmann::json::parse(buf);
        if (jdata["type"] == "life") {
            life = jdata["value"];
            puts("json parsed");
        }
    } catch (const std::exception& e) {
        cerr << "Unable to parse json: " << e.what() << std::endl;
    }
}
Since your char "buffer" is showing weird characters after recv() on the C++ server, the issue seems to be a character-encoding mismatch between the Java client and the C++ server. To verify, check the "numbytes" returned by recv() on the C++ server; it should be greater than the number of characters in the JSON string on the Java client.
You are writing the JSON length as a single character (OutputStreamWriter.write(int) writes the char with that code point), and you're never doing anything with it at the receiver. This is almost certainly a mistake anyway: you shouldn't need to send the length at all, because JSON is self-describing.
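One way to apply that on the Java side, sketched here under the assumption that the C++ server is changed to accumulate received bytes until it sees a newline before calling nlohmann::json::parse, is to drop the length and terminate each message with '\n':
public void sendMessage(String json) {
    try {
        out.write(json);   // send only the JSON text
        out.write('\n');   // newline marks the end of one message for the server
        out.flush();
        Log.d(debugString, json);
    } catch (Exception e) {
        e.printStackTrace();
        Log.e(debugString, e.getMessage());
    }
}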
I want an Android app and a Windows C++ Winsock program to communicate using TCP sockets. I can successfully send a string from Android to the C++ server, but I cannot send a string the other way around (from the C++ server to the Android client).
Here is the important C++ server part:
recvbuf = "Back At u \0";
cout << " \n " << recvbuf << "\n";
int iResult = send(ClientSocket, recvbuf, (int) strlen(recvbuf), 0);
if (iResult == SOCKET_ERROR) {
    wprintf(L"send failed with error: %d\n", WSAGetLastError());
    closesocket(ClientSocket);
    WSACleanup();
    return 1;
}
printf("Bytes Sent: %d\n", iResult);
And here is the Android client receiving part:
class TextRcv extends AsyncTask<Void, Void, String> {

    @Override
    protected String doInBackground(Void... params) {
        // To send a string
        Socket clientSocket = null;
        try {
            clientSocket = new Socket("192.168.1.5", 8889);
            DataOutputStream oos = new DataOutputStream(clientSocket.getOutputStream());
            oos.writeBytes(String.valueOf(mystr.length()));
            oos.flush();
            byte[] bufferout = mystr.getBytes();
            oos.write(bufferout, 0, bufferout.length);
            oos.close();
        } catch (IOException e) {
            e.printStackTrace();
        }

        // To receive a string
        String input = null;
        char[] buffin = new char[128];
        try {
            BufferedReader in = new BufferedReader(new InputStreamReader(clientSocket.getInputStream()));
            in.read(buffin, 0, 128);
            input = String.valueOf(buffin);
            clientSocket.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return input;
    }

    @Override
    protected void onPostExecute(String input) {
        super.onPostExecute(input);
        Toast toast = Toast.makeText(getApplicationContext(), input, Toast.LENGTH_LONG);
        toast.show();
    }
}
The C++ output says there is no error and that 11 bytes (the length of the recvbuf string) were sent. But on the Android side the 'input' string is always null.
Here is the C++ server output:
Start Receving
length of string recieved in bytes =14
AndroidID - Hello World...
Done
Back At u
Bytes Sent: 11
Press any key to continue . . .
String input =null;
At this point input is null.
char[] buffin=new char[128];
try {
    BufferedReader in = new BufferedReader(new InputStreamReader(clientSocket.getInputStream()));
    in.read(buffin, 0, 128);
    input = String.valueOf(buffin);
This code is not correct, but if it executes at all, input cannot possibly be null. The correct code is as follows:
int count = in.read(buffin);
if (count > 0)
{
    input = new String(buffin, 0, count);
}
Back to your code:
    clientSocket.close();
} catch (IOException e) {
    e.printStackTrace();
}
return input;
If input is still null at this point, there must have been an IOException which you haven't disclosed.
I hope to find some help with an old, annoying problem.
I have a TCP server program written in Java and a client program written in C#.
The packet protocol between the two simply consists of a 4-byte ASCII length header followed by the body data.
The problem is that the C# client gets a FormatException from failing to parse the length header. When I look into the error on the client side, the client is trying to parse the length from somewhere in the body rather than from the length header.
But apparently the server does not send broken packets.
Meanwhile, on the server, I see a broken pipe error whenever this problem happens.
Unfortunately the error does not always happen, and I have not been able to reproduce it, which makes it difficult to find the exact cause.
Please see the server-side code below.
public class SimplifiedServer {
    private Map<InetAddress, DataOutputStream> outMap;
    private Map<InetAddress, DataInputStream> inMap;

    protected void onAcceptNewClient(Socket client) {
        DataOutputStream out = null;
        DataInputStream in = null;
        try {
            out = new DataOutputStream(client.getOutputStream());
            in = new DataInputStream(client.getInputStream());
        } catch (IOException e) {
            e.printStackTrace();
        }
        outMap.put(client.getInetAddress(), out);
        inMap.put(client.getInetAddress(), in);
    }

    public void writeToAll(String packet) {
        outMap.forEach((key, out) -> {
            try {
                byte[] body = packet.getBytes("UTF-8");
                int len = body.length;
                if (len > 9999) {
                    throw new IllegalArgumentException("packet length is longer than 10000, this try will be neglected");
                }
                String lenStr = String.format("%04d%s", len, packet);
                byte[] obuf = lenStr.getBytes();
                synchronized (out) {
                    out.write(obuf);
                    out.flush();
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        });
    }

    public void listenClient(Socket client) {
        try {
            DataOutputStream out = outMap.get(client.getInetAddress());
            DataInputStream in = inMap.get(client.getInetAddress());
            while (true) {
                byte[] received = SimplePacketHandler.receiveLpControlerData(in);
                byte[] lenBytes = new byte[4];
                for (int i = 0; i < 4; i++) {
                    lenBytes[i] = in.readByte();
                }
                String lenString = new String(lenBytes);
                int length = Integer.parseInt(lenString);
                byte[] data = new byte[length];
                for (int i = 0; i < length; i++) {
                    data[i] = in.readByte();
                }
                if (data == null) {
                    System.out.println("NetWork error, closing socket :" + client.getInetAddress());
                    in.close();
                    out.close();
                    outMap.remove(client.getInetAddress());
                    inMap.remove(client.getInetAddress());
                    return;
                }
                doSomethingWithData(out, data);
            }
        } catch (NumberFormatException e) {
            e.printStackTrace();
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            try {
                System.out.println(client.getRemoteSocketAddress().toString() + " closing !!! ");
                // remove stream handlers from the map
                outMap.remove(client.getInetAddress());
                inMap.remove(client.getInetAddress());
                // close socket
                client.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}
And here is the client-side code:
public class ClientSide
{
    public TcpClient client;
    public String ip;
    public int port;
    public NetworkStream ns;
    public BinaryWriter writer;
    public BinaryReader reader;
    public Boolean isConnected = false;
    public System.Timers.Timer t;
    public String lastPacketSucceeded = String.Empty;

    public ClientSide(String ip, int port)
    {
        this.ip = ip;
        this.port = port;
        client = new TcpClient();
    }

    public bool connect()
    {
        try
        {
            client.Connect(ip, port);
        }
        catch (SocketException e)
        {
            Console.WriteLine(e.ToString());
            return false;
        }
        Console.WriteLine("Connection Established");
        reader = new BinaryReader(client.GetStream());
        writer = new BinaryWriter(client.GetStream());
        isConnected = true;
        return true;
    }

    public void startListen()
    {
        Thread t = new Thread(new ThreadStart(listen));
        t.Start();
    }

    public void listen()
    {
        byte[] buffer = new byte[4];
        while (true)
        {
            try
            {
                reader.Read(buffer, 0, 4);
                String len = Encoding.UTF8.GetString(buffer);
                int length = Int32.Parse(len);
                byte[] bodyBuf = new byte[length];
                reader.Read(bodyBuf, 0, length);
                String body = Encoding.UTF8.GetString(bodyBuf);
                doSomethingWithBody(body);
            }
            catch (FormatException e)
            {
                Console.WriteLine(e.Message);
            }
        }
    }

    public void writeToServer(String bodyStr)
    {
        byte[] body = Encoding.UTF8.GetBytes(bodyStr);
        int len = body.Length;
        if (len > 10000)
        {
            Console.WriteLine("Send Abort:" + bodyStr);
        }
        len = len + 10000;
        String lenStr = Convert.ToString(len);
        lenStr = lenStr.Substring(1);
        byte[] lengthHeader = Encoding.UTF8.GetBytes(lenStr);
        String fullPacket = lenStr + bodyStr;
        byte[] full = Encoding.UTF8.GetBytes(fullPacket);
        try
        {
            writer.Write(full);
        }
        catch (Exception)
        {
            reader.Close();
            writer.Close();
            client.Close();
            reader = null;
            writer = null;
            client = null;
            Console.WriteLine("Send Fail" + fullPacket);
        }
        Console.WriteLine("Send complete " + fullPacket);
    }
}
Since I cannot reproduce the problem, my guess is that it is a multithreading issue, but I have not found any further clues to fix it.
Please let me know if you need any more information.
Any help will be greatly appreciated; thanks in advance.
A broken pipe exception is caused by the connection being closed on the other side. Most likely the C# client has a bug that causes the FormatException, which in turn makes it close the connection and therefore produces the broken pipe on the server side. See "What is the meaning of Broken pipe Exception?".
Check the return value of this read:
byte[] bodyBuf = new byte[length];
reader.Read(bodyBuf, 0, length);
According to Microsoft documentation for BinaryReader.Read https://msdn.microsoft.com/en-us/library/ms143295%28v=vs.110%29.aspx
[The return value is ] The number of bytes read into buffer. This might be less than the number of bytes requested if that many bytes are not available, or it might be zero if the end of the stream is reached.
If it reads fewer than length bytes, then the next iteration will parse the length from data somewhere in the middle of the previous message.
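The general fix is to keep reading until the requested number of bytes has actually arrived. As a sketch of that idea in Java (the server side of this same protocol), DataInputStream.readFully already gives that guarantee; the C# client needs an equivalent loop that keeps calling BinaryReader.Read and advancing an offset until all count bytes are in. The helper below is hypothetical and not part of the original code:
import java.io.DataInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class FrameReader {
    // Reads one length-prefixed frame: a 4-character ASCII length, then the body.
    public static byte[] readFrame(DataInputStream in) throws IOException {
        byte[] lenBytes = new byte[4];
        in.readFully(lenBytes);        // blocks until all 4 header bytes have arrived
        int length = Integer.parseInt(new String(lenBytes, StandardCharsets.US_ASCII));
        byte[] body = new byte[length];
        in.readFully(body);            // blocks until the whole body has arrived
        return body;
    }
}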
These broken pipe exceptions happen when the client (browser) has closed the connection, but the server (your tag) continues to try to write to the stream.
This usually happens when someone clicks Back, Stop, etc. in the browser and it disconnects from the server before the request is finished. Sometimes, it can happen because, for example, the Content-Length header is incorrect (and the browser takes its value as true).
Usually, this is a non-event, and nothing to worry about. But if you are seeing them in your dev environment when you know you have not interrupted your browser, you might dig a bit more to find out why.
WLS server will try to filter these exceptions from the web container out of the log, since it is due to client (browser) action and we can't do anything about it. But the server doesn't catch all of them.
Reference: https://community.oracle.com/thread/806884
I would like to know how to keep the input stream of a socket open and reuse it until the application is closed.
What I do for now is create a thread in the main method. This thread is supposed to keep running for as long as the application runs. In this thread I read data from the server using the socket's input stream, but I'm only able to read what the server sends once. After that I think the thread is dead, or I can no longer read from the input stream. How can I keep the input stream reading what comes from the server?
Thanks.
int length = readInt(input);
byte[] msg = new byte[length];
input.read(msg);
ByteArrayInputStream bs = new ByteArrayInputStream(msg);
DataInputStream in = new DataInputStream(bs);
int cmd = readInt(in);
switch (cmd) {
    case 1: Msg msg = readMsg(cmd, msg);
}
I have put everything here, but in my code these things happen in different methods.
The readInt method:
public static int readInt(InputStream in) throws IOException {
    int byte1 = in.read();
    int byte2 = in.read();
    int byte3 = in.read();
    int byte4 = in.read();
    if (byte4 == -1) {
        throw new EOFException();
    }
    return (byte4 << 24)
            + ((byte3 << 24) >>> 8)
            + ((byte2 << 24) >>> 16)
            + ((byte1 << 24) >>> 24);
}
Used for little-endian conversion.
Your socket might well be blocking. If you encounter such a problem, one good way around it is to design your software around a polling approach rather than being interrupt-driven. Then again, the software design pattern should follow what you are trying to achieve.
Hope it helps! Cheers!
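For example, a rough sketch of a polling-style read loop using a socket read timeout (this assumes a connected Socket named socket, a volatile boolean running, and an import for java.net.SocketTimeoutException; the timeout value is illustrative):
// A read timeout lets the loop wake up periodically and re-check a flag
// instead of blocking forever inside read().
socket.setSoTimeout(500);   // milliseconds
InputStream input = socket.getInputStream();
while (running) {
    try {
        int b = input.read();
        if (b == -1) {
            break;              // the server closed the connection
        }
        // handle the byte here (or read a complete message)
    } catch (SocketTimeoutException e) {
        // no data within the timeout; loop again and re-check 'running'
    }
}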
You need to call input.read() in a loop such as this:
try {
    while (running) {
        int length = readInt(input);
        byte[] msg = new byte[length];
        input.read(msg);
        ByteArrayInputStream bs = new ByteArrayInputStream(msg);
        DataInputStream in = new DataInputStream(bs);
        int cmd = readInt(in);
        switch (cmd) {
            case 1: Msg msg = readMsg(cmd, msg);
        }
    }
} catch (IOException e) {
    // Handle error
}
Set running to false when your thread has finished whatever it needs to do. Remember that input.read() will block until the socket has received something. I hope this helps.