I have been trying to figure out how to implement a simple XML-RPC server and client with Apache XML-RPC (http://ws.apache.org/xmlrpc/), but I haven't been successful. I implemented a server and a client as specified in the Webserver section here: http://ws.apache.org/xmlrpc/server.html
The only thing I did differently was this "phm.addHandler("Calculator", org.apache.xmlrpc.demo.Calculator.class);" instead of this "phm.load(Thread.currentThread().getContextClassLoader(), "MyHandlers.properties");". When I start the server it starts properly, but I can't see the service when running netstat. Then when I send a request from the client it fails because it cannot find the class to call (i.e. the client doesn't really connect to the server). I have also observed that the client can start (without sending any requests) with no errors even when the server is not running.
Anyone have any ideas? I'm really lost.
Are you trying to run both the client and server on the same computer? That may be the reason netstat isn't showing anything (it cannot sniff packets on the loopback address).
If you're still interested I can post a working example.
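For reference, a minimal pair along the lines of the Apache XML-RPC 3.x webserver docs looks roughly like this (the port and the /xmlrpc path on the client URL are just what the docs use; adjust to taste):

    import java.net.URL;

    import org.apache.xmlrpc.client.XmlRpcClient;
    import org.apache.xmlrpc.client.XmlRpcClientConfigImpl;
    import org.apache.xmlrpc.server.PropertyHandlerMapping;
    import org.apache.xmlrpc.server.XmlRpcServer;
    import org.apache.xmlrpc.webserver.WebServer;

    public class CalculatorDemo {

        // Standalone server: registers the demo Calculator under the name "Calculator".
        public static void startServer() throws Exception {
            WebServer webServer = new WebServer(8080);
            XmlRpcServer xmlRpcServer = webServer.getXmlRpcServer();
            PropertyHandlerMapping phm = new PropertyHandlerMapping();
            phm.addHandler("Calculator", org.apache.xmlrpc.demo.Calculator.class);
            xmlRpcServer.setHandlerMapping(phm);
            webServer.start(); // blocks a listener thread on port 8080
        }

        // Client: calls Calculator.add on the server started above.
        public static void runClient() throws Exception {
            XmlRpcClientConfigImpl config = new XmlRpcClientConfigImpl();
            config.setServerURL(new URL("http://127.0.0.1:8080/xmlrpc"));
            XmlRpcClient client = new XmlRpcClient();
            client.setConfig(config);
            Object[] params = new Object[]{Integer.valueOf(2), Integer.valueOf(3)};
            Integer result = (Integer) client.execute("Calculator.add", params);
            System.out.println("2 + 3 = " + result);
        }

        public static void main(String[] args) throws Exception {
            startServer();
            runClient();
        }
    }

Once webServer.start() returns, netstat -an should show something listening on 8080; if it doesn't, the server never actually bound the port.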
Related
The problem:
I am seeing some strange behaviour from a Jetty server (REST over HTTPS) when some client connections are closed client-side before the server has had time to reply. Normally a web server/application server handles this without trouble, but in one specific case something breaks and the server stops replying.
I am trying to reproduce the issue programmatically and locally, by opening a client connection and closing it before the server has had time to reply, but I do not have much experience with this kind of situation; the clients I normally write are not expected to die immediately.
I do not care which language/application I have to use to replicate my case: it can be a Java program, a netcat command, telnet, dotnetcore... The only limit I have is that it should run on a Kubernetes pod, if possible.
I am trying to use Java to open a socket and then close it immediately, or to create an HTTP client and stop it right after sending a request, but with no luck so far.
At the same time I am looking at netcat, but I fear it is too low-level for a REST request.
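For the record, the kind of thing I am attempting looks roughly like this, with a plain socket and plain HTTP for simplicity (host, port and path are placeholders, and for HTTPS I would wrap the connection with an SSLSocketFactory):

    import java.io.OutputStream;
    import java.net.Socket;
    import java.nio.charset.StandardCharsets;

    public class AbortBeforeReply {
        public static void main(String[] args) throws Exception {
            // Placeholders: point these at the server under test.
            try (Socket socket = new Socket("localhost", 8080)) {
                // SO_LINGER with timeout 0 makes close() send an RST instead of a
                // normal FIN, which is the harshest way a client can disappear.
                socket.setSoLinger(true, 0);
                OutputStream out = socket.getOutputStream();
                out.write(("GET /some/resource HTTP/1.1\r\n"
                        + "Host: localhost\r\n"
                        + "\r\n").getBytes(StandardCharsets.US_ASCII));
                out.flush();
            } // closes immediately, before the server has had time to reply
        }
    }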
I'm really scratching my head around this one.
I'm running a Java server against a few test cases through a test tool and I'm getting a Java SocketTimeoutException. The test tool runs on Windows, while my server is deployed on Raspbian. The test tool is a black box, so I do not know its implementation.
From what I gathered, the test case creates a client that sends an HTTP address to my server, say http://192.168.1.100:8080, to which my server is supposed to connect and send data back. The port number is not static.
I verified that the port is opened by the system for each test. I then ran Wireshark on the test tool's machine and found that the SYN packet made it to the machine, but a SYN/ACK was never sent back.
I'm using Java's HttpURLConnection, but I've also tried a plain Java socket; both end the same way, with a socket timeout being thrown.
This problem does not come up if I run the server locally with the test tool. What makes it stranger is that it also happens with another code base, which is supposed to pass the test.
I don't know why it would not connect to the port.
More information: I'm working on an ONVIF device controller. ONVIF essentially provides a common interface between video streaming devices and client applications. The server needs to conform to the ONVIF specs and pass the ONVIF device test tool.
The problem above happens with test cases related to basic event handling, which follows the WS-BaseNotification spec. The server is given an endpoint it can use to notify the client. However, given that I'm getting a socket timeout, I don't think it is a schema parsing problem: the server throws the exception before any data is written to the output stream. The same problem happens with another code base which is supposed to pass the test tool, yet the test tool also reports that the notification was not delivered.
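For context, the notify call I am describing is essentially this shape (the endpoint is the one from the example above, and the payload is a placeholder, not the real WS-BaseNotification message):

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    public class NotifySketch {
        public static void main(String[] args) throws Exception {
            // The test tool supplies an address like this; the port changes per test.
            URL endpoint = new URL("http://192.168.1.100:8080/");
            HttpURLConnection conn = (HttpURLConnection) endpoint.openConnection();
            conn.setConnectTimeout(5000);
            conn.setReadTimeout(5000);
            conn.setDoOutput(true);
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "application/soap+xml");
            // The TCP connect happens lazily here; if no SYN/ACK ever comes back,
            // this is where the SocketTimeoutException is thrown.
            try (OutputStream out = conn.getOutputStream()) {
                out.write("<!-- notification payload -->".getBytes(StandardCharsets.UTF_8));
            }
            System.out.println("HTTP status: " + conn.getResponseCode());
        }
    }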
I have a CORBA application in Java working on my PC, with both the client and server running on the same machine. Now I want the server and client running on different PCs and establishing a connection between them, but I'm not sure how to do it. I tried looking for a solution online but no luck so far.
As long as you pass the object reference to your client, you can put your server on any machine. You could pass the IOR in stringified form (e.g. IOR:....) or as a corbaloc string.
For example see https://github.com/JacORB/JacORB/tree/master/demo/hello
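A rough client-side sketch, assuming the server's reference is published either as an IOR string or reachable via corbaloc (the host, port and object key below are placeholders):

    import org.omg.CORBA.ORB;

    public class RemoteClient {
        public static void main(String[] args) {
            ORB orb = ORB.init(args, null);
            // Either read a stringified IOR ("IOR:...") that the server wrote to a
            // shared file, or build a corbaloc URL pointing at the server's host/port.
            org.omg.CORBA.Object obj =
                    orb.string_to_object("corbaloc::192.168.1.50:2809/HelloService");
            // Then narrow to your IDL-generated interface, e.g.:
            // Hello hello = HelloHelper.narrow(obj);
        }
    }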
At the moment I have an Android client app which connects to my Java server through a Socket/ServerSocket pair. It sends and receives strings. The Java server is connected to a MySQL database (actually MariaDB) using the JDBC driver.
I succeeded in creating a jbossas application and uploading the Java server code to OpenShift, but I didn't find any detailed tutorial on how to connect to this newly uploaded server from my socket client (this one (RMI or socket connection to Java Program on OpenShift) gives some tips but I'm still stuck).
On top of this, how do I know that my server runs fine on OpenShift, and how do I manage the calls to the database after I connect it (I found this: $ rhc app create MyApp jbossas-7 and $ rhc cartridge add mysql-5.5 -a MyApp)? Will using org.mariadb.jdbc.Driver and java.sql still work?
Any small guide or tip is highly appreciated. I'm new to these things, so please don't be too harsh in the comments.
You can only make connections to your OpenShift server on the http/https or ws/wss ports. If you want to connect to your Java application from an Android device and pull data from it, I would suggest using a RESTful API, a servlet, or something similar.
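For instance, a bare-bones servlet on JBoss AS 7 could look something like this (the URL pattern and response text are placeholders; you would plug in the string/JDBC logic your ServerSocket handler already has):

    import java.io.IOException;
    import javax.servlet.annotation.WebServlet;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    @WebServlet("/api/data")
    public class DataServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
            resp.setContentType("text/plain");
            // Replace this with the lookup your socket handler used to do
            // (e.g. a query through org.mariadb.jdbc.Driver / java.sql).
            resp.getWriter().write("hello from OpenShift");
        }
    }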
I had a similar problem: my app server originally ran as a ServerSocket listener, and clients/devices connected to it directly via socket binding.
To deploy it on OpenShift, my initial solution was to change its host:port configuration by following the suggestion described in this link [Socket connection to Java Program on OpenShift]. It worked nicely insofar as my app server came up and ran successfully, but it did not play well with the port-forwarding approach for accepting remote requests.
So for the final solution, I modified the app server by wrapping my original code in a RESTful web service and deployed it that way.
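The wrapper itself can be as thin as a JAX-RS resource delegating to the original code, something along these lines (path and return value are placeholders, and you still need the usual javax.ws.rs.core.Application subclass or web.xml wiring to activate it):

    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.Produces;

    @Path("/status")
    public class StatusResource {
        @GET
        @Produces("text/plain")
        public String status() {
            // Delegate to the original application logic here.
            return "OK";
        }
    }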
My Node app is up and running (listening). It is in essence a bunch of REST API handlers.
I am sending HTTP requests from my test Java client using ordinary Java code (Apache HttpClient and its request classes to execute simple HTTP calls).
The thing is, the Java HTTP client instantly gets an IOException after sending the request to the app, saying "org.apache.http.conn.HttpHostConnectException: Connection to http://xxxxxxx.herokuapp.com:38084 refused".
Is there anything I missed in order to make HTTP calls from a Java client?
Thanks.
You are never supposed to connect to any port other than 80 or 443 when connecting to foo.herokuapp.com. Granted, your app listens on some port (given by the PORT environment variable), but you still need to connect to port 80 (for http).
You are not connecting directly to your dyno(s), but rather to a Heroku gateway. Heroku's gateway does the routing (from the gateway to your application) and the port forwarding for you.
I would recommend using curl to try to replicate the connection problem. If it works correctly with curl (or, for that matter, in your browser), then you know you have a Java client issue.
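If you want to stay in Java, the equivalent check with Apache HttpClient (4.3+ assumed here) is just a request with no explicit port, so it defaults to 80 and goes through Heroku's router:

    import org.apache.http.client.methods.CloseableHttpResponse;
    import org.apache.http.client.methods.HttpGet;
    import org.apache.http.impl.client.CloseableHttpClient;
    import org.apache.http.impl.client.HttpClients;

    public class HerokuCheck {
        public static void main(String[] args) throws Exception {
            // No ":38084" here: connect on the default http port 80 and let
            // Heroku's router forward to whatever $PORT your dyno binds.
            HttpGet get = new HttpGet("http://xxxxxxx.herokuapp.com/");
            try (CloseableHttpClient client = HttpClients.createDefault();
                 CloseableHttpResponse response = client.execute(get)) {
                System.out.println(response.getStatusLine());
            }
        }
    }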