Can an asynchronous web service be achieved with the Java Spring-WS framework, like what's explained here?
Basically, when a client sends a request to the server for the first time, the web service on the server will call back the client whenever it has information based on that request. That means the server may reply more than once to the client's initial request.
Suggested approach, based on my experience:
Asynchronous web services are generally implemented in the following model:
CLIENT SUBMITS REQUEST -> SERVER RETURNS 202 ACCEPTED RESPONSE (polling/job URL in the header) -> CLIENT KEEPS POLLING THE JOB URL -> SERVER RETURNS 200 OK FOR THE JOB URL ALONG WITH THE JOB RESPONSE IN THE BODY.
You may need to define a few response bodies for a job that is still in progress. When the client polls the server and the server is still processing the request, the body should contain an IN PROGRESS message in a form predefined for the client. Once the server has finished processing, the desired response should be available in the body.
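To make that concrete, here is a minimal sketch of the flow using Spring MVC; the paths, class names and the in-memory job store are my own illustrative assumptions, not something prescribed by Spring-WS:

```java
// Minimal sketch of the 202 Accepted / polling pattern (illustrative names only).
import java.net.URI;
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class JobController {

    private static final String IN_PROGRESS = "{\"status\":\"IN_PROGRESS\"}";

    // jobId -> result body (IN_PROGRESS until processing is finished)
    private final Map<String, String> jobs = new ConcurrentHashMap<>();
    private final ExecutorService executor = Executors.newFixedThreadPool(4);

    @PostMapping("/requests")
    public ResponseEntity<Void> submit(@RequestBody String request) {
        String jobId = UUID.randomUUID().toString();
        jobs.put(jobId, IN_PROGRESS);
        executor.submit(() -> jobs.put(jobId, process(request))); // long-running work
        // 202 ACCEPTED with the polling URL in the Location header
        return ResponseEntity.accepted().location(URI.create("/jobs/" + jobId)).build();
    }

    @GetMapping("/jobs/{id}")
    public ResponseEntity<String> poll(@PathVariable String id) {
        String body = jobs.get(id);
        if (body == null) {
            return ResponseEntity.notFound().build();   // unknown job id
        }
        return ResponseEntity.ok(body);                 // IN_PROGRESS or the final result
    }

    private String process(String request) {
        // placeholder for the real processing
        return "{\"status\":\"DONE\"}";
    }
}
```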
Hope it helps!
Related
I have a business scenario where a client makes a request to me, and some requests take a long time to process. By the time I return the response, the client has already treated the call as timed out. Is there any way in Java or Spring Boot for the server side to determine whether the client successfully received the response sent by the service?
I am writing a web application that makes intensive use of WebSockets (the standard JSR 356 implementation).
Through the WebSocket I exchange information.
A client sends a request (JSON) to the server, the server decodes the message and sends some info back (JSON).
How can I prevent a malicious client from flooding my server with requests? For example, I want to limit the number of requests to 10 every 5 seconds (by "request" I mean the JSON message the client sends in order to get the information).
Is there a built-in way of doing this, or do I have to write my own mechanism?
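As far as I know, the JSR 356 API itself has no built-in rate limiting, so this would typically be hand-rolled. A rough sketch of a per-session sliding window; the endpoint path, the 10-per-5-seconds limit and the close behaviour are assumptions, not anything from the spec:

```java
// Sketch of a per-session sliding-window limiter for a JSR 356 endpoint.
import java.io.IOException;
import java.util.ArrayDeque;
import java.util.Deque;

import javax.websocket.CloseReason;
import javax.websocket.OnMessage;
import javax.websocket.Session;
import javax.websocket.server.ServerEndpoint;

@ServerEndpoint("/info")
public class RateLimitedEndpoint {

    private static final int MAX_MESSAGES = 10;
    private static final long WINDOW_MILLIS = 5_000;

    // By default there is one endpoint instance per connection, so this deque is per client.
    private final Deque<Long> timestamps = new ArrayDeque<>();

    @OnMessage
    public void onMessage(String json, Session session) throws IOException {
        long now = System.currentTimeMillis();
        // Drop timestamps that have fallen out of the 5-second window.
        while (!timestamps.isEmpty() && now - timestamps.peekFirst() > WINDOW_MILLIS) {
            timestamps.pollFirst();
        }
        if (timestamps.size() >= MAX_MESSAGES) {
            // Either ignore the message or close the session; closing is shown here.
            session.close(new CloseReason(
                    CloseReason.CloseCodes.VIOLATED_POLICY, "rate limit exceeded"));
            return;
        }
        timestamps.addLast(now);
        session.getBasicRemote().sendText(handle(json));
    }

    private String handle(String json) {
        // placeholder: decode the JSON and build the reply
        return json;
    }
}
```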
I've configured my Jetty server with a bounded ThreadPool and a bounded queue size. What would be the behavior when this limit is actually hit? I.e., a user submits an HTTP request to the server, and Jetty is unable to get a thread or queue slot to fulfill the request.
I would have expected the server to respond to the client with a 500 error of some form. But from my local testing, it appears that the client doesn't get any response at all.
What does Jetty do in this case? Is there any way for me to configure Jetty to send a 500 response to the user when this occurs?
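For reference, the kind of setup being described might look like this with embedded Jetty 9; the pool and queue sizes here are arbitrary and only meant to make the question concrete:

```java
// Sketch of a bounded QueuedThreadPool backed by a bounded job queue.
import java.util.concurrent.BlockingQueue;

import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.ServerConnector;
import org.eclipse.jetty.util.BlockingArrayQueue;
import org.eclipse.jetty.util.thread.QueuedThreadPool;

public class BoundedJetty {
    public static void main(String[] args) throws Exception {
        // Queue starts at 10 slots, grows in steps of 10, capped at 50 queued jobs.
        BlockingQueue<Runnable> queue = new BlockingArrayQueue<>(10, 10, 50);
        // max 32 threads, min 8 threads, 60 s idle timeout, bounded queue.
        QueuedThreadPool threadPool = new QueuedThreadPool(32, 8, 60_000, queue);

        Server server = new Server(threadPool);
        ServerConnector connector = new ServerConnector(server);
        connector.setPort(8080);
        server.addConnector(connector);
        // ... set a handler as usual ...
        server.start();
        server.join();
    }
}
```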
We have a Java web service with document style over HTTP. Locally this service works smoothly and fast (~6 ms), but calling the service methods remotely takes over 200 ms.
One main reason for this delay is the following sequence:
1. the server first sends the response HTTP header,
2. the client replies with an ACK, and
3. the server then sends the response HTTP body.
The second step, where the client sends the ACK, costs the most time: almost the whole 200 ms. I would like to avoid this step and save that time.
So here is my question: is it possible to send the whole response in one packet? And how and where do I configure that?
Thanks for any advice.
I'm not fully understanding the question.
Why is the server sending the first message? Shouldn't the client be requesting the web service via HTTP initially?
From what I understand, SOAP requests are wrapped in an HTTP message. HTTP messages assume a TCP connection and require a response. This suggests that a client must respond when the server sends an HTTP header.
Basically, whatever one end sends to the other, the other end must reply. The ACK in your step 2 will always be present.
EDIT:
I think the reason for the difference in time between local and remote requests is simply the routing that happens on a real network versus on your local machine. It's not the number of steps in your SOAP request and response.
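For completeness: whether header and body go out in one TCP segment is ultimately decided by the container and the OS network stack, not by the application. One thing that can be tried at the servlet layer, though, is making the response buffer large enough that the container commits header and body in a single flush. A sketch, assuming a servlet-based deployment (class and method names are made up):

```java
// Hedged sketch: build the full payload first and size the response buffer so
// header and body are committed in one flush. Names are illustrative only.
import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class BufferedResponseServlet extends HttpServlet {
    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        byte[] body = buildSoapResponse(req);               // build the whole payload up front
        resp.setBufferSize(Math.max(resp.getBufferSize(), body.length));
        resp.setContentType("text/xml; charset=UTF-8");
        resp.setContentLength(body.length);                 // avoids chunked transfer encoding
        resp.getOutputStream().write(body);                 // single write, flushed at commit
    }

    private byte[] buildSoapResponse(HttpServletRequest req) {
        return "<soap:Envelope/>".getBytes();               // placeholder
    }
}
```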
I'm designing a REST API where some operations are propagated to AMQP queues. When a message is processed (successfully or with an error), the client must be notified. My first thought was to boot a lightweight embedded HTTP server when the client library is initialized, so the message-processing mechanism can also emit an HTTP request to it describing how the operation execution went. Any other or better ideas on how to implement this?
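If you do go down that callback route, the notifying half might look roughly like this with Spring AMQP, assuming the client supplies a callback URL when it submits the operation and a JSON message converter is configured; all names here are made up:

```java
// Sketch of the consumer that POSTs the outcome back to the client's callback URL.
import java.net.URI;

import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.stereotype.Component;
import org.springframework.web.client.RestTemplate;

@Component
public class OperationResultNotifier {

    private final RestTemplate rest = new RestTemplate();

    // Assumes each AMQP message carries the callback URL the client supplied
    // when it submitted the original REST operation.
    @RabbitListener(queues = "operation-results")
    public void onResult(OperationResult result) {
        // POST the outcome to the HTTP endpoint exposed by the client library.
        rest.postForLocation(URI.create(result.getCallbackUrl()), result);
    }

    public static class OperationResult {
        private String callbackUrl;
        private String status; // e.g. "SUCCESS" or "FAILED"

        public String getCallbackUrl() { return callbackUrl; }
        public void setCallbackUrl(String callbackUrl) { this.callbackUrl = callbackUrl; }
        public String getStatus() { return status; }
        public void setStatus(String status) { this.status = status; }
    }
}
```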