Asynchronous Web Service Invocation - Java

I would like to understand how the asynchronous invocation model in JAX-WS works.
If, for example, I use Future<?> invokeAsync(T msg, AsyncHandler<T> handler),
then my program can resume, and when the response from the web service arrives, the result will be passed to my AsyncHandler.
If I have several threads in the same program, and one thread calls invokeAsync and resumes operation, and immediately another thread (perhaps more) also calls invokeAsync on the same web service (perhaps a different operation, but the same portType), how will the framework handle this? Will a series of POSTs go to the same web service (a POST for thread 1, a POST for thread 2, etc.), or will the next POST only be sent after a response arrives (a POST for thread 1, then, when its response arrives and is passed to the callback handler, a POST for thread 2)?
Thanks

I can't say for JAX-WS in particular, but the only way that makes sense to me is if the POSTs are independent. It would be crazy (IMO) for the client to wait until the web service had returned the first response before making the next request.
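As an illustration, here is a hedged sketch using the Dispatch form of invokeAsync that the question quotes; the service QNames, endpoint, and payload are placeholders. Each thread fires its own invokeAsync, and each invocation goes out as its own HTTP POST without waiting for responses to earlier ones.

import java.io.StringReader;
import java.util.concurrent.Future;
import javax.xml.namespace.QName;
import javax.xml.transform.Source;
import javax.xml.transform.stream.StreamSource;
import javax.xml.ws.AsyncHandler;
import javax.xml.ws.Dispatch;
import javax.xml.ws.Response;
import javax.xml.ws.Service;
import javax.xml.ws.soap.SOAPBinding;

public class AsyncClientSketch {

    // Hypothetical service coordinates; replace with your own WSDL values.
    private static final QName SERVICE = new QName("http://example.com/quotes", "QuoteService");
    private static final QName PORT    = new QName("http://example.com/quotes", "QuotePort");
    private static final String ENDPOINT = "http://example.com/quotes";

    public static void main(String[] args) {
        Service service = Service.create(SERVICE);
        service.addPort(PORT, SOAPBinding.SOAP11HTTP_BINDING, ENDPOINT);

        // Two application threads, each firing its own asynchronous invocation.
        // Each invokeAsync() call results in its own HTTP POST; the runtime does
        // not wait for the first response before sending the second request.
        for (int i = 0; i < 2; i++) {
            final int id = i;
            new Thread(() -> {
                // Dispatch instances are cheap; creating one per thread avoids
                // any doubt about their thread-safety.
                Dispatch<Source> dispatch =
                        service.createDispatch(PORT, Source.class, Service.Mode.PAYLOAD);
                Source request = new StreamSource(
                        new StringReader("<getQuote xmlns='http://example.com/quotes'/>"));

                Future<?> future = dispatch.invokeAsync(request, new AsyncHandler<Source>() {
                    @Override
                    public void handleResponse(Response<Source> res) {
                        System.out.println("Thread " + id + " got its response");
                    }
                });
                // This thread is free to do other work here; the callback fires
                // when this particular response arrives.
            }).start();
        }
    }
}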

Related

Is it legal to write and close the response in a JSP, then do some extra work?

I'm working on a web app that communicates with the server via AJAX requests. A special type of "close" request takes 5 seconds, and the web app should just fire and forget it; the result is irrelevant. Due to browser behavior (only a limited number of simultaneous AJAX requests are performed), a 5-second request may block other AJAX requests, which is unacceptable.
The smart folks here on StackOverflow have advised me to write a small server-side proxy, which the web app should call instead of the original 5-second one. The proxy should respond immediately, close the response channel, then perform an HTTP request and wait for it, spending the 5 seconds server-side instead of client-side. (The original question is here: Is there a way to perform a fire-and-forget AJAX request?)
The server is Tomcat with JSP, and I can write small JSP pages. (I'm not an experienced JSP ninja, but I'm not afraid of Java.) My question is: is it legal to write such a JSP, and what's the best practice to:
send the response,
close the reply channel (is out.close() enough?) in order to end the AJAX request on the client side,
fire and process (actually: just drop the response of) an HTTP request "in the background", which may take as long as 5 seconds?
It's not (only) your browser you should worry about. Blocking a Tomcat thread for 5 seconds severely limits your maximum number of concurrent users as well (how many requests per second do you ultimately need to handle?).
So making it "more" asynchronous on the server might make sense.
Doing it in JSP (with scriptlets?!) alone will in no way be a robust implementation, but if you need to do it that way, you should think about starting the "work to do" in a separate thread.
So instead of
<%
    do_something_heavy();
%>
you'll do something like
<%
    new Thread(new Runnable() {
        public void run() {
            do_something_heavy();
        }
    }).start();
%>
There are other options as well (JMS, ExecutorService, Spring @Async...), but this should get you started quickly; a sketch of the ExecutorService variant follows.
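If you go the ExecutorService route, a common pattern is to keep a single pool per web application and tie its lifecycle to the servlet context rather than spawning unmanaged threads from a scriptlet. A minimal sketch, assuming a listener registered in web.xml (or with @WebListener on Servlet 3.0); the class name and the "backgroundPool" attribute are made up here:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

// Ties the worker pool to the web app's lifecycle instead of leaking raw threads.
public class BackgroundPoolListener implements ServletContextListener {

    public static final String POOL_ATTR = "backgroundPool"; // hypothetical attribute name

    @Override
    public void contextInitialized(ServletContextEvent sce) {
        sce.getServletContext().setAttribute(POOL_ATTR, Executors.newFixedThreadPool(4));
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
        ExecutorService pool = (ExecutorService) sce.getServletContext().getAttribute(POOL_ATTR);
        if (pool != null) {
            pool.shutdown();
        }
    }
}

A scriptlet would then submit work with ((ExecutorService) application.getAttribute("backgroundPool")).submit(() -> do_something_heavy()); instead of calling new Thread(...).start().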
First, the best approach is to separate business logic from the view: write the Java code in a servlet and delegate only the view aspect to the JSP.
To execute your task asynchronously in the servlet code you can:
Invoke the submit method of an ExecutorService
Send a message to a JMS queue
Manually create a thread and start it
Then you can forward to the JSP.
TIP: It is possible to assign an id to the long task and return it in the JSP, with a link to monitor the status of the task.
Basically, you do something like this:
Accept the request
Asynchronously start a thread to execute the long task
Return immediately, without waiting for the long task to terminate
Or, using an id (see the sketch after this list):
Accept the request
Compute the id of the task
Asynchronously start a thread to execute the long task with that id
Return immediately a link containing the id of the long task, without waiting for its termination
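A minimal sketch of that second flow, assuming a plain servlet front controller; the servlet name, the started.jsp view, and the in-memory status map are invented for illustration:

import java.io.IOException;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical servlet: accepts the request, starts the long task in the
// background, and immediately forwards to a JSP that shows the task id.
public class FireAndForgetServlet extends HttpServlet {

    private final ExecutorService pool = Executors.newFixedThreadPool(4);
    // Very simple in-memory status registry; a real app would evict old entries.
    private final ConcurrentHashMap<String, String> status = new ConcurrentHashMap<>();

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String taskId = UUID.randomUUID().toString();
        status.put(taskId, "RUNNING");

        pool.submit(() -> {
            try {
                doSomethingHeavy();            // the ~5 second call
                status.put(taskId, "DONE");
            } catch (Exception e) {
                status.put(taskId, "FAILED");
            }
        });

        // Return immediately; the JSP can render a link like /status?id=<taskId>.
        req.setAttribute("taskId", taskId);
        req.getRequestDispatcher("/started.jsp").forward(req, resp);
    }

    private void doSomethingHeavy() throws InterruptedException {
        Thread.sleep(5000); // stand-in for the real 5-second HTTP call
    }

    @Override
    public void destroy() {
        pool.shutdown();
    }
}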

Netty client with synchronous request response

I am trying to create an HTTP client based on Netty. I have written the code based on the HttpSnoopClient example given on the Netty site. But the problem is that HttpResponses are handled by HttpSnoopClientHandler and HttpRequests are sent in HttpSnoopClient, and I want to synchronize them: if I send a request, I want to be sure I only send the next request once I know the response to the previous one. Since the two are handled in different classes, this is proving difficult.
One thing I did was to create a setResponse() method in HttpTarget, and HttpSnoopClientHandler sets the HttpResponse on it when it receives the response from the server. But I don't think that is a good approach, since I won't be able to tell which request the response was for.
So basically I want to do it synchronously, i.e. send a request (channel.writeAndFlush(req)) in HttpSnoopClient, then wait until the response is received by HttpSnoopClientHandler, and once it receives an HTTP/1.1 200 OK, send the next request.
Can anyone suggest a good approach for doing this? Thanks in advance!
I had a similar use case where I had to block concurrent requests for a resource until the first one completed. I implemented a ConcurrentHashMap<RequestKey, ArrayList<ChannelHandlerContext>> that holds the ChannelHandlerContext (ctx) of every concurrent request; on completion of the first request, I raise an event which triggers all the other contexts to consume the cached response. Throughout this I had to make sure AUTO_READ was set to false, for fine-grained control over the reads on each channel.
channelRead ->
    if (map.containsKey(reqKey)) {
        // A request for this key is already in flight: remember this ctx and
        // do nothing else (AUTO_READ = false, no ctx.* methods are initiated).
        map.get(reqKey).add(ctx);
    } else {
        // First request for this key.
        map.put(reqKey, new ArrayList<ChannelHandlerContext>() {{ add(ctx); }});
        // Continue with request execution;
        // cache the response and raise an event on completion.
    }

userEventTriggered ->
    onCompletionEvent {
        ctxList = map.get(reqKey).clone();
        map.remove(reqKey);
        for (blockedCtx : ctxList) {
            // Respond to each blockedCtx with the cached response.
        }
    }
@norman-maurer, would you give your take on this?!
As you're creating a new HttpSnoopClientHandler for each connection, I would consider turning HttpSnoopClientHandler into a ChannelDuplexHandler. In the write method you can store a reference to the outgoing HTTP request. When the response is received you can call your setResponse method with (channel, request, response). This should provide enough context for you to process the response correctly.
If your client is pure request/response, does not issue unrelated requests separately, and you want your application thread to process responses sequentially, then you could use a SynchronousQueue to coordinate responses with allowing the main thread to continue. Alternatively, your callback can process the responses internally.
You can also extend this technique to use HTTP pipelining. HTTP pipelining guarantees that responses are returned in the order that requests are issued. In HttpSnoopClientHandler you maintain a queue of requests; as each response is returned, you match it to the request at the front of the queue.
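A rough sketch of that pipelining approach, assuming Netty 4.1 with an HttpClientCodec and an HttpObjectAggregator already in the pipeline; the PipeliningClientHandler name and the setResponse hook are placeholders rather than part of the snoop example:

import java.util.ArrayDeque;
import java.util.Queue;
import io.netty.channel.Channel;
import io.netty.channel.ChannelDuplexHandler;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelPromise;
import io.netty.handler.codec.http.FullHttpResponse;
import io.netty.handler.codec.http.HttpRequest;

// Sees whole HttpRequest objects going out and FullHttpResponse objects coming in.
public class PipeliningClientHandler extends ChannelDuplexHandler {

    // Requests in flight, in the order they were written. HTTP pipelining
    // guarantees responses come back in the same order, so a FIFO queue suffices.
    private final Queue<HttpRequest> inFlight = new ArrayDeque<>();

    @Override
    public void write(ChannelHandlerContext ctx, Object msg, ChannelPromise promise) throws Exception {
        if (msg instanceof HttpRequest) {
            inFlight.add((HttpRequest) msg);
        }
        ctx.write(msg, promise); // pass the request further down the pipeline
    }

    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) throws Exception {
        if (msg instanceof FullHttpResponse) {
            HttpRequest request = inFlight.poll(); // the request this response answers
            // Remember that FullHttpResponse is reference-counted: release it
            // once your bookkeeping is done with it.
            setResponse(ctx.channel(), request, (FullHttpResponse) msg);
        } else {
            ctx.fireChannelRead(msg);
        }
    }

    // Placeholder for the setResponse(channel, request, response) hook described
    // above; wire it to your own bookkeeping.
    private void setResponse(Channel channel, HttpRequest request, FullHttpResponse response) {
        System.out.println(request.uri() + " -> " + response.status());
    }
}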

Android Threads, Services, and two-way communication between them

I'm struggling to wrap my head around what needs to happen here. I'm currently working on an app that runs a service. When started, the service opens a web server that runs in a background thread.
At any point while this service is running, the user can send commands to the device from a browser. The current sequence of events is as follows:
User sends a request to the server
The server sends a message to the service via the message handler construct; it sends data such as the URL parameters
The service does what it wants with the data, and wants to send some feedback message to the user in the browser
?????
The server's response to the request contains a feedback message from the service.
The way my functions are set up, I need to pause my serve() function while waiting for a response from the service, and then, once the message is received, resume and send an HTTP response.
WebServer.java
public Response serve(String uri, String method, Properties header, Properties parms, Properties files) {
    Bundle b = Utilities.convertToBundle(parms);
    Message msg = new Message();
    msg.setData(b);
    // Sending a message to the handler in the service.
    handler.sendMessage(msg);
    return new NanoHTTPD.Response();
}
CommandService.java
public class CommandService extends Service {
    private WebServer webserver;

    public Handler handler = new Handler() {
        @Override
        public void handleMessage(Message msg) {
            // Some type of message should be sent back after this executes.
            execute_command(msg.getData());
        }
    };
    // ...
}
Any suggestions? Is this structure the best way to go about it, or can you think of a better design that would lead to a cleaner implementation?
I think the lack of answers is because you haven't been very specific about what your question is. In my experience, it's easier to get answers on StackOverflow to simple or direct questions than to requests for general architecture advice.
I'm no expert on Android, but I'll give it a shot. My question is: why do you have a web server running in the background of a Service? Why not just have one class and make your Service the web server?
Regarding threading, communication, and sleeping, the main thing to remember is that a web server needs to always be available to accept new requests while serving current ones. Other than that, it's normal for a client to wait for a thread to complete its task (i.e. the thread "blocks"). That is why most web servers spawn a new thread to handle each request that comes in. If you have a background thread but you block the initial thread while waiting for the background thread to complete its task, then you're no better off than doing everything on one thread. Actually, the latter would be preferable for the sake of simplicity.
If Android is actually spawning new threads for you when requests come in, then there's no need for a background thread. Just do everything synchronously on one thread and rejoice in the simplicity!
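That said, if you keep the Handler-based design and really need serve() to wait for the service's feedback, one possible approach is a one-shot hand-off object carried inside the Message; the ReplyEnvelope and askService names below are invented for illustration, and the handler/execute_command pieces refer to the code in the question.

import java.util.concurrent.SynchronousQueue;
import java.util.concurrent.TimeUnit;
import android.os.Bundle;
import android.os.Handler;
import android.os.Message;

// A small carrier object: the web server puts the request data in, the
// service puts its feedback string back, and serve() waits on the queue.
class ReplyEnvelope {
    final Bundle params;
    final SynchronousQueue<String> reply = new SynchronousQueue<>();

    ReplyEnvelope(Bundle params) {
        this.params = params;
    }
}

class HandoffSketch {

    // What WebServer.serve() would do with the handler from the question.
    static String askService(Handler serviceHandler, Bundle params) throws InterruptedException {
        ReplyEnvelope envelope = new ReplyEnvelope(params);
        Message msg = Message.obtain();
        msg.obj = envelope;                 // carries both the input and the reply channel
        serviceHandler.sendMessage(msg);
        // Blocks only the web server's request thread, with a timeout as a safety net.
        return envelope.reply.poll(10, TimeUnit.SECONDS);
    }

    // What CommandService's handleMessage() would do after execute_command().
    static void answer(Message msg, String resultOfExecuteCommand) {
        ReplyEnvelope envelope = (ReplyEnvelope) msg.obj;
        // offer() is non-blocking; if serve() already timed out, the result is dropped.
        envelope.reply.offer(resultOfExecuteCommand);
    }
}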

Call a Web Service from a Servlet on App Engine

Question: What is the best way to call a web service (0.5-1.5 seconds per call) from a servlet on App Engine? Are blocking calls scalable in the App Engine environment?
Context: I am developing a web application using App Engine and J2EE. The application calls an Amazon web service to grab some information for the user. From my ASP.NET experience, the best way to make such calls is to use an async HTTP handler to prevent starvation of the IIS thread pool. That feature is not available in J2EE with the Servlet 2.5 spec (3.0 is planned).
Right now I am thinking of making my controllers (and servlets) thread-safe and request-scoped. Is there anything else I can do? Is this even an issue in a J2EE + App Engine environment?
EDIT: I am aware of App Engine's and JAX-WS's async invocation support, but I am not sure how it plays with the servlet environment. As far as I understand, to complete the servlet request the code still has to wait for the async WS call to complete (via a callback or otherwise).
I assume that doing that with synchronization primitives will block the current worker thread.
So, while a thread is blocked, the servlet container has to allocate a new thread from the thread pool to serve another user request, allocate new memory for its stack, and waste time on context switching. Moreover, requests can block the entire server once we run out of threads in the thread pool. These assumptions are based on the ASP.NET and IIS thread model. Are they applicable to the J2EE environment?
ANSWER: After studying the Apache and GAE documentation, it seems that starvation of threads in the thread pool is not a real issue. Apache, by default, has 200 threads in its thread pool (compared to 25 in ASP.NET and IIS). Based on this I can infer that threads are rather cheap in the JVM.
In case async processing is really required, or the servlet container runs out of threads, it's possible to redesign the application to send the response via the Google Channel API.
The workflow would look like this:
Make a sync request to the servlet
The servlet creates a channel for the async reply and queues a task for a background worker
The servlet returns a response to the client
[Serving other requests]
The background worker does the processing and pushes the data to the client via the Channel API
As you observe, servlets don't support using a single thread to service multiple concurrent requests; one thread is required per request. The best way to do your HTTP call is to use asynchronous urlfetch and wait on that call to complete when you need the result. This will block the request's thread, but there's no avoiding that: the thread is dedicated to the current request until it terminates, no matter what you do.
If you don't need the response from the API call to serve the user's request, you could use the task queue to do the work offline instead.
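For reference, a hedged sketch of what the asynchronous urlfetch approach can look like inside a servlet's request handling on App Engine; the example URL and helper method are placeholders:

import java.net.URL;
import java.util.concurrent.Future;
import com.google.appengine.api.urlfetch.HTTPResponse;
import com.google.appengine.api.urlfetch.URLFetchService;
import com.google.appengine.api.urlfetch.URLFetchServiceFactory;

// Called from a servlet's doGet/doPost: start the remote call early, do other
// work for the request, and only block when the result is actually needed.
public class AsyncFetchSketch {

    public byte[] loadProductInfo(String itemId) throws Exception {
        URLFetchService fetcher = URLFetchServiceFactory.getURLFetchService();

        // Hypothetical URL; the real request would be a signed Amazon API call.
        Future<HTTPResponse> pending =
                fetcher.fetchAsync(new URL("https://example.com/api/items/" + itemId));

        doOtherWorkForThisRequest();

        // The request thread blocks here until the fetch completes; that is
        // unavoidable if the response is needed to render the page.
        HTTPResponse response = pending.get();
        return response.getContent();
    }

    private void doOtherWorkForThisRequest() {
        // e.g. datastore lookups that don't depend on the fetch result
    }
}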
Isn't it OK to use fetchAsync?
Look at this, it might help:
http://today.java.net/pub/a/today/2006/09/19/asynchronous-jax-ws-web-services.html
I am not sure if you can exactly replicate what you do in .NET, but here is what you could do to simulate it, on page load:
Submit an AJAX request to the controller using a JavaScript body onload
In the controller, start the async task, send the response back to the user, and use a session token to keep track of the task
Poll the controller (add another method that asks for an update on the task, since you have the session token to track it) until you get the response
You can do this either in a waiting-for-response page or in a hidden frame that keeps polling the controller
Once you have the response you are looking for, remove the session token
If you want to do better than polling, Reverse Ajax / server push would be ideal in this case
Edit: Now I understand what you mean. I think you can have your code execute the async task without waiting for the task itself, and just send the response back to the user. Below is a simple thread that I start but do not wait for to finish; I send the response back to the user right away, and at the same time use a session token to track the request.
@Controller
@RequestMapping("/asyncTest")
public class AsyncCotroller {

    @RequestMapping(value = "/async.html", method = RequestMethod.GET)
    public ModelAndView dialogController(Model model, HttpServletRequest request) {
        System.err.println("(System.currentTimeMillis()/1000) " + (System.currentTimeMillis() / 1000));
        // Start a thread (async simulator).
        new Thread(new MyRunnbelImpl()).start();
        // Use this attribute to track the response.
        request.getSession().setAttribute("asyncTaskSessionAttribute", "asyncTaskSessionAttribute");
        // If you look at the System.err output, you will see that it is not waiting on the async task.
        System.err.println("(System.currentTimeMillis()/1000) " + (System.currentTimeMillis() / 1000));
        return new ModelAndView("test");
    }

    class MyRunnbelImpl implements Runnable {
        @Override
        public void run() {
            try {
                Thread.sleep(5000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }
}

Servlet 3.0 asynchronous

What's the difference between the Servlet 3.0 asynchronous feature and this old servlet implementation:
doGet(request, response) {
    Thread t = new Thread(new Runnable() {
        void run() {
            // heavy processing
            response.write(result);
        }
    });
    t.start();
}
In Servlet 3.0, if I spend a thread on heavy processing, I free up one more thread in the container, but I still use one up on the heavy processing... :(
Could somebody help?
This won't work. Once your doGet method ends, the response is complete and sent back to the client. Your thread may or may not still be running, but it can't change the response any longer.
What the new async feature in Servlet 3.0 does, is that it allows you to free the request thread for processing another request. What happens is the following:
RequestThread: |-- doGet() { startAsync() } // Thread free to do something else
WorkerThread: |-- do heavy processing --|
OtherThread: |-- send response --|
The important thing is that once RequestThread has started asynchronous processing via a call to startAsync(...), it is free to do something else. It can accept new requests, for example. This improves throughput.
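To make that concrete, here is a minimal Servlet 3.0 sketch along those lines; the URL pattern, the pool size, and the heavyProcessing() stand-in are arbitrary, and AsyncContext.start(Runnable) could be used instead of a hand-rolled ExecutorService.

import java.io.IOException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import javax.servlet.AsyncContext;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// The container (request) thread returns right after doGet(), while a worker
// thread finishes the heavy processing and completes the response later.
@WebServlet(urlPatterns = "/heavy", asyncSupported = true)
public class HeavyServlet extends HttpServlet {

    private final ExecutorService workers = Executors.newFixedThreadPool(10);

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) {
        final AsyncContext async = req.startAsync();   // keeps request/response open
        workers.submit(() -> {
            try {
                String result = heavyProcessing();
                async.getResponse().getWriter().write(result);
            } catch (IOException e) {
                // log and fall through to complete()
            } finally {
                async.complete();                      // now the response is sent
            }
        });
        // doGet() returns here; the container thread is free to serve other requests.
    }

    private String heavyProcessing() {
        try {
            Thread.sleep(3000);                        // stand-in for real work
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return "done";
    }

    @Override
    public void destroy() {
        workers.shutdown();
    }
}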
There are several APIs that support COMET (long-lived HTTP requests, where there is no thread-per-request problem) programming, so there is no strict need to use the Servlet 3 API to avoid thread-per-request. One is the Grizzly engine running in GlassFish 2.11 (example). A second solution is Jetty Continuations. The third is the Servlet 3 API.
The basic concept is that the request registers with some container-managed asynchronous handler, in which it can subscribe to an event identified by an object (for example, a client-id string). Then the asynchronous processing thread can tell the handler that the event has occurred, and the request gets a thread to continue on. Which API you can use depends entirely on your chosen application server. Which is your choice?
The Servlet 3.0 async feature lets you keep the HTTP connection open but release any unused threads when the request cannot be served immediately and is waiting for some event to occur, for example when you are writing a comet / reverse-AJAX application. In your case above, you are creating a completely new thread yourself, so it should not make any difference for you unless you want to keep the request waiting for some event.
Best Regards,
Keshav
Creating your own threads in a servlet container is asking for trouble. (There might be cases where you have to do it, but if you have some framework that manages the threads for you, then you should use it.)
