Asynchronous web service SOAP - java

I have an interface that I've exposed as a regular SOAP web service. In one method of the interface, the client sends a file to the server, then the server processes the file and returns a result file. Processing the file may take some time, so I think asynchronous invocation of this method is a better idea. I thought about the following flow:
The client invokes the asynchronous method and sends the file using an attachment (MTOM).
When the file is received by the server, a response is sent back to the client indicating that the file has been received and that it will be processed shortly.
Once the file is processed, a response is sent back to the client indicating it has been processed, and a result file is returned in the response, also as an attachment.
Is this possible using SOAP with CXF?
Thanks

You can use the callback approach of the asynchronous invocation model.
Callback approach - in this case, to invoke the remote operation, you call another special method that takes a reference to a callback object (of javax.xml.ws.AsyncHandler type) as one of its parameters. Whenever the response message arrives at the client, the CXF runtime calls back on the AsyncHandler object to give it the contents of the response message.
More information can be found in the following:
Apache CXF
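For illustration, here is a minimal sketch of that callback style, assuming wsdl2java generated an asynchronous variant of a (hypothetical) processFile operation; FileServicePort, processFileAsync and ProcessFileResponse are placeholder names for your generated classes, not real CXF types.

import java.util.concurrent.Future;
import javax.xml.ws.AsyncHandler;
import javax.xml.ws.Response;

public class FileServiceCallback implements AsyncHandler<ProcessFileResponse> {
    @Override
    public void handleResponse(Response<ProcessFileResponse> res) {
        try {
            // Invoked by the CXF runtime when the response message arrives;
            // the result file would be available here, e.g. as a DataHandler.
            ProcessFileResponse result = res.get();
            System.out.println("Server finished processing: " + result);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

// Client side: the call returns immediately and the handler fires later.
// Future<?> pending = port.processFileAsync(requestWithAttachment, new FileServiceCallback());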

If you use some tool like WSDL2Java for client generation, you can even choose to generate an asynchronous client.
It will generate a callback handler with empty methods for each of the service's operations and exceptions. You can then implement those methods to define what should happen when the response is received.
Remember that when an asynchronous call is made, a new thread is started.

Yes. Once you receive the file, you can return a request id to the client, start processing on the server side, and maintain the various processing states. The client can then come back at intervals and will receive the processing status, or the output once it is complete.
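A minimal sketch of what such a polling-style contract could look like with JAX-WS (all names here are hypothetical; MTOM would be enabled on the endpoint so the DataHandler parameters travel as attachments):

import javax.activation.DataHandler;
import javax.jws.WebMethod;
import javax.jws.WebService;

@WebService
public interface FileProcessingService {

    // Accepts the uploaded file and returns immediately with a request id.
    @WebMethod
    String submitFile(DataHandler file);

    // The client polls with the request id: e.g. "PENDING", "PROCESSING", "DONE".
    @WebMethod
    String getStatus(String requestId);

    // Once processing is done, the result file is returned as an attachment.
    @WebMethod
    DataHandler getResult(String requestId);
}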

Related

What Asynchronous connection really means in inter-dependent JAVA application

I am building an application that connects with others, too. The application needs to take a request from a websocket connection. Once I receive the request, I need to send it to another application for processing, and there are a few cases.
If the second application returns accepted, wait for a response from the third application (the third will send its response to the second, and the second will initiate a push).
If the second application returns anything other than accepted, return false to the request.
My confusion is whether, this way, I am handling the request synchronously or asynchronously.
In case #1 I have to wait some time to receive a response from another application, whereas in case #2 I can process the request immediately.
Sequence diagram for clarity of flow

RabbitMQ RPC tutorial query

I was going through the tutorial shared by RabbitMQ here.
I am assuming that the client code below
// loop until we dequeue the reply whose correlation id matches our request
while (true)
{
    var ea = (BasicDeliverEventArgs)consumer.Queue.Dequeue();
    if (ea.BasicProperties.CorrelationId == corrId)
    {
        return Encoding.UTF8.GetString(ea.Body);
    }
}
would receive all messages on the queue and would unnecessarily iterate through messages not intended for it. Is there any way to avoid this, i.e. can we modify the client so that it only receives the messages intended for it?
The basic work that I intend to achieve through RabbitMQ is a request-response pattern, where a request is received by a web service, which puts the data on a queue; the data object has a unique reference number. This is received by an asynchronous TCP client, which sends data over TCP/IP based on the message it received.
On receiving a reply over the asynchronous TCP/IP channel, it parses the data and responds back on the queue with the corresponding request reference number.
The RPC approach is well suited for this, but the client code shared above has this shortcoming; I would appreciate feedback on it.
Actually I didn't fully understand your aim, but when you create an RPC model you have to create a "reply queue"; this queue is bound only to the client.
That means you will receive back only that client's messages, not all messages.
Since the RabbitMQ RPC model is asynchronous, you can execute more than one request without waiting for the responses, and the replies may not arrive in the same order the requests were published.
The correlation id is necessary to map your client's requests to the replies, so there are no "unnecessary" messages.
Hope it helps.
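For reference, here is a minimal sketch of that pattern with the RabbitMQ Java client (4.x+ API; the host and the "rpc_queue" name are assumptions): the client declares its own exclusive reply queue, publishes with replyTo and correlationId set, and only ever consumes from its own reply queue.

import com.rabbitmq.client.*;
import java.nio.charset.StandardCharsets;
import java.util.UUID;

public class RpcClientSketch {
    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("localhost");
        try (Connection conn = factory.newConnection();
             Channel channel = conn.createChannel()) {

            // Exclusive, auto-delete queue that belongs to this client only.
            String replyQueue = channel.queueDeclare().getQueue();
            String corrId = UUID.randomUUID().toString();

            AMQP.BasicProperties props = new AMQP.BasicProperties.Builder()
                    .correlationId(corrId)
                    .replyTo(replyQueue)
                    .build();

            // "rpc_queue" is an assumed name for the server-side request queue.
            channel.basicPublish("", "rpc_queue", props,
                    "request payload".getBytes(StandardCharsets.UTF_8));

            // Consume only from our private reply queue; match on correlation id.
            channel.basicConsume(replyQueue, true, new DefaultConsumer(channel) {
                @Override
                public void handleDelivery(String consumerTag, Envelope envelope,
                                           AMQP.BasicProperties properties, byte[] body) {
                    if (corrId.equals(properties.getCorrelationId())) {
                        System.out.println("Reply: " + new String(body, StandardCharsets.UTF_8));
                    }
                }
            });
            Thread.sleep(5000); // crude wait for the asynchronous reply in this sketch
        }
    }
}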

Netty client with synchronous request response

I am trying to create an HTTP client based on Netty. I have written the code based on the HttpSnoopClient example given on the Netty site. The problem is that HttpResponses are handled by HttpSnoopClientHandler and HttpRequests are sent in HttpSnoopClient, and I want to synchronize them: if I send a request, I want to make sure I send the next request only once I know the response to the previous one. Since the two are handled in different classes, this is proving difficult.
One thing I did was to create a setResponse() method in HttpTarget, and HttpSnoopClientHandler sets the HttpResponse when it receives the response from the server. But I don't think it is a good approach, since I won't be able to know which request the response was for.
So basically I want to do it synchronously, i.e. send a request (channel.writeAndFlush(req)) in HttpSnoopClient, then wait until the response is received by HttpSnoopClientHandler, and once it receives an HTTP/1.1 200 OK, send the next request.
Can anyone suggest a good approach for doing this? Thanks in advance!
I had a similar use case where I had to block concurrent requests for a resource until the first one completed. I implemented a ConcurrentHashMap<RequestKey, ArrayList<ChannelHandlerContext>> that holds the ChannelHandlerContext (ctx) of every concurrent request; on completion of the first request, an event is raised that triggers all the other ctxs to consume the cached response. Throughout this, I had to make sure AUTO_READ was set to false for fine-grained control over the reads on each channel.
// channelRead ->
if (map.containsKey(reqKey)) {
    // a request for this key is already running: just queue this ctx
    map.get(reqKey).add(ctx);
    // do nothing else; AUTO_READ = false, so no ctx.* methods are invoked here
} else {
    // first request for this key
    map.put(reqKey, new ArrayList<ChannelHandlerContext>(Collections.singletonList(ctx)));
    // continue with request execution,
    // cache the response and raise an event on completion
}

// userEventTriggered -> onCompletionEvent
List<ChannelHandlerContext> ctxList = new ArrayList<>(map.get(reqKey));
map.remove(reqKey);
for (ChannelHandlerContext blockedCtx : ctxList) {
    // respond back to each blockedCtx with the cached response
}
@norman-maurer, would you give your take on this?
As you're creating a new HttpSnoopClientHandler for each connection, I would consider turning HttpSnoopClientHandler into a ChannelDuplexHandler. In the write method you can store a reference to the outgoing http request. When the response is received you can call your setResponse method with (channel, request, response). This should provide enough context so you can process the response correctly.
If your client is pure request/response, does not issue unrelated requests separately, and you want your application thread to process responses sequentially, then you could use a SynchronousQueue to hand each response to your application thread, allowing it to continue. Alternatively, your callback can process the responses internally.
You can also extend this technique to use HTTP pipelining. HTTP pipelining guarantees that responses are returned in the order that requests are issued. In HttpSnoopClientHandler you maintain a queue of requests; as each response is returned, you match it to the request at the front of the queue.
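A rough sketch of that idea (the class name is my own, not from the snoop example): a duplex handler records each outgoing request and, relying on responses arriving in request order on a single connection, pairs every incoming response with the request at the head of the queue. It assumes an HttpObjectAggregator sits upstream so responses arrive as FullHttpResponse messages.

import java.util.ArrayDeque;
import java.util.Queue;
import io.netty.channel.ChannelDuplexHandler;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelPromise;
import io.netty.handler.codec.http.FullHttpResponse;
import io.netty.handler.codec.http.HttpRequest;

public class RequestResponseCorrelator extends ChannelDuplexHandler {

    // Requests in flight on this connection, oldest first.
    private final Queue<HttpRequest> inFlight = new ArrayDeque<>();

    @Override
    public void write(ChannelHandlerContext ctx, Object msg, ChannelPromise promise) throws Exception {
        if (msg instanceof HttpRequest) {
            inFlight.add((HttpRequest) msg);   // remember the outgoing request
        }
        ctx.write(msg, promise);               // pass it down the pipeline
    }

    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) throws Exception {
        if (msg instanceof FullHttpResponse) {
            // Responses come back in request order, so pair with the oldest request.
            HttpRequest request = inFlight.poll();
            FullHttpResponse response = (FullHttpResponse) msg;
            // e.g. hand (ctx.channel(), request, response) to your own setResponse(...)
        }
        ctx.fireChannelRead(msg);
    }
}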

How to create Ajax request that gets information as the servlet runs?

I have a form that creates an account and a servlet that handles the request.
However, the process of creating this account is long, and I want to show something like a status bar or a progress bar. Here's the POST:
$.post("createAccount.jsp", function(data) { $("#status").text(data);
});
And the servlet would continuously print data like "creating x..." then "creating y" as the servlet runs. Is there a way to accomplish this or maybe another way to tackle this issue?
Thanks
HTTP works on a request-response model. You send a request, and the server responds. After that, the server doesn't know who you are.
It's like the server is a post office that doesn't know your address. You go to it to get your letters; it doesn't come to your home to deliver them.
If you want constant notifications from the server, you can either use WebSockets (Stack Overflow also uses WebSockets) or use AJAX polling, which sends an AJAX request to the server and waits for the server to respond. On receiving the response, it issues another AJAX request, and keeps doing so until the server stops producing new data.
Read this for an explanation of AJAX polling techniques.
You could have your account creation servlet update a database or context attribute as it creates the account.
You could then have a separate AJAX request to a different servlet that sends back to the webpage the most recent update found in that database or context attribute. You would poll your server with that AJAX request every few hundred milliseconds (or whatever interval suits how long account creation takes) to get all the updates.
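As a rough sketch of that second servlet (the URL mapping and attribute name are assumptions), the account-creation code would store its progress in the session, and the polled servlet would simply echo it back:

import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Polled by the page, e.g. via $.get("progress"), every few hundred milliseconds.
public class ProgressServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        // The long-running account-creation code updates this attribute
        // ("creating x...", "creating y...") as it works.
        Object progress = req.getSession().getAttribute("accountCreationProgress");
        resp.setContentType("text/plain");
        resp.getWriter().write(progress == null ? "starting..." : progress.toString());
    }
}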

When does servlet release its thread

Assuming no keep-alives, when a servlet container is acting as a standalone server, I assume that the servlet's thread is not released until the entire response has been sent to the client (say, a web browser). Is this a correct assumption?
But what happens if the servlet is behind a reverse proxy like Nginx? Is the thread released once the response is delivered to Nginx, or is it held until the response is sent to its final client (say a browser)?
Update: Let me try to make this a bit clearer.
It takes mere milliseconds (say 2ms) for a response to be sent from the servlet to a proxy like Nginx, but it can then take an additional 80ms (or so) for the final response to be sent from Nginx to the browser. Does the servlet release the thread/stream once the response is sent to Nginx, or does it hold onto them until the response is sent to the browser (that is, the entire 80ms)?
Question: I assume that the servlet's thread is not released until the entire response is sent to the client (say a web browser). Is this a correct assumption?
Answer: No, it is wrong. The servlet container will just write the content to the socket and return. A return from the write() method does not guarantee that the response has reached the client.
Question: Is the thread released once the response is delivered to Nginx, or is it held until the response is sent to its final client (say a browser)?
Answer: When Nginx sits in front, the client for the servlet container is Nginx; the container is not aware of the actual remote client. So the thread will be released once the response has been written to Nginx.
If the servlet container is unable to send a response to the client, an exception will be triggered and handled by the container. You can enclose the writing to the output stream or writer in a try/catch/finally (with close()), but you don't need to; the container will manage it, including returning the thread to the pool.
A servlet does not see the network. According to the specification, it is handed two objects: a Request and a Response to be filled in (in the case of HTTP, an HttpServletRequest and an HttpServletResponse). It processes the request data from the request object and writes to the buffer of the response object. Once that content is committed by the servlet, the container may do some post-processing (using filters) and will transmit it back to the client.
The servlet thread returns naturally to the pool once the call to the request-handling method finishes (which may happen after the payload has been sent back to the client, if the method has further work to do).
Note that because the servlet doesn't see the network and is only concerned with a single request, the state of the HTTP connection (keep-alive or close) is independent of the servlet's lifetime; several servlets may handle the different requests pipelined in a single connection. See this question for a related issue.
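To make that lifecycle concrete, here is a minimal (hypothetical) servlet sketch: it only fills in the response object, and the thread goes back to the pool when doGet returns; actually pushing those bytes to the browser, or to a proxy such as Nginx, is entirely the container's business.

import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class HelloServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        resp.setContentType("text/plain");
        // Write into the response buffer; the container commits and transmits it.
        resp.getWriter().write("Hello " + req.getParameter("name"));
        // Returning from doGet releases this thread back to the container's pool,
        // regardless of how long delivery to the final client takes.
    }
}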
