Using netty with 3rd party blocking API - java

I am using a 3rd party blocking API. I am going to be using this API as follows:
while (true) {
    blockingAPI();
    sendResultSomewhere();
}
blockingAPI() polls a server for a specific property until it gets a response.
In order to make things asynchronous to some extent, I could spawn this API call in a separate thread and have a callback implemented in Java to handle the response. I was wondering whether I can use the Netty framework in this scenario, and how I could do this? The examples I have seen involve a server that listens and communicates with a client, and I am not sure how my use case fits in.
If Netty cannot be used, would my best bet be spawning a new thread and implementing a callback in Java?

I'm not sure what you are really trying to do:
Spawn a new thread internally: you could use a LocalChannel with Netty to get intra-JVM communication, and therefore something like what you want, without any network involvement (everything stays within the JVM). blockingAPI() would run on the server side of the LocalChannel, while the result would be written back once the client side gets the response through the same LocalChannel.
Spawn in response to a request from outside (network): Netty can of course be used there too, perhaps still keeping the LocalChannel logic to separate the network part from the computation.
Note that I would recommend running the blocking task asynchronously on the LocalChannel side, so that the "send somewhere else" step is done without blocking Netty's network I/O thread.
Network Handler side:
localChannel = creationWithinNetworkHandler(networkChannelCtx);
localChannel.writeAndFlush(something);
while the LocalChannel handler on the server side could look like this:
protected void channelRead0(ChannelHandlerContext ctx, SomeData msg) {
    Object answer = blockingAPI(); // the blocking call runs here
    ctx.channel().writeAndFlush(answer).addListener(ChannelFutureListener.CLOSE);
}
and the LocalChannel handler on the client side could look like this:
protected void channelRead0(ChannelHandlerContext ctx, Object answer) {
    // using the ctx captured from the network channel side
    networkCtx.writeAndFlush(answer);
}
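To make this concrete, here is a minimal sketch of how the two LocalChannel sides could be wired together, assuming Netty 4. BlockingApiServerHandler and ForwardToNetworkHandler stand for the two handlers sketched above, and networkCtx / something come from the network handler, so treat all of these names as placeholders; the DefaultEventExecutorGroup keeps the blocking call off the I/O threads:
import io.netty.bootstrap.Bootstrap;
import io.netty.bootstrap.ServerBootstrap;
import io.netty.channel.Channel;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.DefaultEventLoopGroup;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.local.LocalAddress;
import io.netty.channel.local.LocalChannel;
import io.netty.channel.local.LocalServerChannel;
import io.netty.util.concurrent.DefaultEventExecutorGroup;

// Shared event loop group for the intra-JVM channels.
EventLoopGroup group = new DefaultEventLoopGroup();
// Separate executor group so blockingAPI() never runs on an I/O thread.
DefaultEventExecutorGroup blockingGroup = new DefaultEventExecutorGroup(4);
LocalAddress address = new LocalAddress("blocking-api");

// "Server" side of the LocalChannel: runs blockingAPI() for each message.
ServerBootstrap sb = new ServerBootstrap();
sb.group(group)
  .channel(LocalServerChannel.class)
  .childHandler(new ChannelInitializer<LocalChannel>() {
      @Override
      protected void initChannel(LocalChannel ch) {
          ch.pipeline().addLast(blockingGroup, new BlockingApiServerHandler());
      }
  });
sb.bind(address).sync();

// "Client" side, created from the network handler; forwards answers to networkCtx.
Bootstrap cb = new Bootstrap();
cb.group(group)
  .channel(LocalChannel.class)
  .handler(new ChannelInitializer<LocalChannel>() {
      @Override
      protected void initChannel(LocalChannel ch) {
          ch.pipeline().addLast(new ForwardToNetworkHandler(networkCtx));
      }
  });
Channel localChannel = cb.connect(address).sync().channel();
localChannel.writeAndFlush(something);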

Related

Java HTTP client which uses SocketChannel so that I may interrupt threads reading a request

It looks like SocketChannel supports being interrupted, yet regular sockets do not.
Do any Java HTTP clients exist which are able to use SocketChannel instead of Socket?
I would like to support threads being interrupted while reading from the server. Currently, when using URL#openConnection(), if the thread is stuck waiting for a response from the server, it cannot be unstuck by interrupting it.
I've never used it myself, but it looks like the closest thing to your need is a non-blocking HTTP client. The only implementation I know of is the Netty-based Spring reactive client: reactor-netty.
It should allow you to do something like this:
final Mono<HttpClientResponse> futureResponse = HttpClient.create()
        .baseUrl("http://example.com")
        .get()
        .response();
final Disposable httpHandle = futureResponse.subscribe(
        response -> onSuccess(response),
        error -> onError(error)
);
// When you want to force disconnection
httpHandle.dispose();
Note: be cautious, however, because I'm really not aware of the limitations of this system. The Spring team does provide detailed documentation to help, though.
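If the body is needed as well, a similar sketch using reactor-netty's ByteBufFlux helpers (responseContent(), aggregate(), asString()) would look roughly like this; the URL and the println/printStackTrace handlers are placeholders:
final Mono<String> body = HttpClient.create()
        .baseUrl("http://example.com")
        .get()
        .uri("/")
        .responseContent()
        .aggregate()
        .asString();

final Disposable bodyHandle = body.subscribe(
        s -> System.out.println(s),   // body received
        e -> e.printStackTrace()      // request failed
);
// Disposing still cancels the in-flight request:
bodyHandle.dispose();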

what is the most efficient way to create asynchronous request using axis in java?

I'm looking for the best solution to this problem:
I have a client and a server.
The client sends requests to the server using the call.invoke method.
The call is currently synchronous and waits for the answer.
Under load, the time it takes to receive the reply from the server is around 1 second (this is a lot of time).
On the client side we generate around 50-100 requests per second, so the queue is exploding.
For now I have created a thread pool that works asynchronously and sends the requests to the server, one per thread, but each request itself is still synchronous.
This means the thread pool needs to maintain ~100 threads if we want it to keep up.
I'm not sure this is the best solution.
I was also thinking of somehow creating one thread that sends the requests and one thread that catches the replies, but then I'm afraid I would just pass the load on to the server side.
A few things that are important:
We cannot affect the code on the server side and we cannot control the time it takes to receive a reply.
When receiving the reply we just use the data to create another data structure and pass it on - so the timestamp is not really important.
We are using the Axis API.
Any idea of the best way to solve this? Does the thread pool of 100 threads seem fine, or are there other ways?
Thanks!
You can call an Axis service in a non-blocking way by registering a callback instance.
Client class:
// Anonymous inner class implementing AxisCallback; override all of its methods.
// onMessage gets called once a result is received from the backend.
AxisCallback callBack = new AxisCallback() {
    @Override
    public void onMessage(MessageContext msgContext) {
        // this method gets called when you receive the results from the backend
        System.out.println(msgContext.getEnvelope().getBody().getFirstElement());
    }
    ...
};

ServiceClient sc = new ServiceClient();
Options opt = new Options();
// set the target endpoint
opt.setTo(new EndpointReference("http://localhost:8080/axis2/services/CountryService"));
opt.setAction("urn:getCountryDetails");
sc.setOptions(opt);
sc.sendReceiveNonBlocking(payload, callBack);
Reference for writing axis service : http://jayalalk.blogspot.com/2014/01/writing-axis2-services-and-deploying-in.html
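For completeness, here is a hedged sketch of what "override all of its methods" can look like with the Axis2 AxisCallback interface; the payload element and namespace are made-up examples, and the endpoint is taken from the snippet above:
import org.apache.axiom.om.OMAbstractFactory;
import org.apache.axiom.om.OMElement;
import org.apache.axiom.om.OMFactory;
import org.apache.axiom.om.OMNamespace;
import org.apache.axis2.addressing.EndpointReference;
import org.apache.axis2.client.Options;
import org.apache.axis2.client.ServiceClient;
import org.apache.axis2.client.async.AxisCallback;
import org.apache.axis2.context.MessageContext;

public class NonBlockingAxisClient {
    public static void main(String[] args) throws Exception {
        // Build a request payload (element and namespace names are assumptions).
        OMFactory factory = OMAbstractFactory.getOMFactory();
        OMNamespace ns = factory.createOMNamespace("http://service.example.com", "ns");
        OMElement payload = factory.createOMElement("getCountryDetails", ns);

        AxisCallback callBack = new AxisCallback() {
            @Override
            public void onMessage(MessageContext msgContext) {
                // Called when a result arrives from the backend.
                System.out.println(msgContext.getEnvelope().getBody().getFirstElement());
            }

            @Override
            public void onFault(MessageContext msgContext) {
                // Called when the service returns a SOAP fault.
                System.err.println(msgContext.getEnvelope().getBody().getFault());
            }

            @Override
            public void onError(Exception e) {
                // Called on transport or other client-side errors.
                e.printStackTrace();
            }

            @Override
            public void onComplete() {
                // Called once the operation has finished, success or failure.
            }
        };

        ServiceClient sc = new ServiceClient();
        Options opt = new Options();
        opt.setTo(new EndpointReference("http://localhost:8080/axis2/services/CountryService"));
        opt.setAction("urn:getCountryDetails");
        sc.setOptions(opt);
        sc.sendReceiveNonBlocking(payload, callBack);
    }
}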

Android Threads, Services, and two way communication between them

I'm struggling to wrap my head around what needs to happen here. I'm currently working on an app that runs a service. When started, the service opens a webserver that runs in a background thread.
At any point while this service is running, the user can send commands to the device from a browser. The current sequence of events is as follows:
User sends a request to the server
The server sends a message to the service via the message handler construct; it sends data such as the URL parameters
The service does what it wants with the data, and wants to send some feedback message to the user in the browser
?????
The server's response to the request contains a feedback message from the service.
The way my functions are set up, I need to pause my serve() function while waiting for a response from the service, and then, once the message is received, resume and send an HTTP response.
WebServer.java
public Response serve( String uri, String method, Properties header, Properties parms, Properties files )
{
    Bundle b = Utilities.convertToBundle(parms);
    Message msg = new Message();
    msg.setData(b);
    handler.sendMessage(msg); // sending a message to the handler in the service
    return new NanoHTTPD.Response();
}
CommandService.java
public class CommandService extends Service {
    private WebServer webserver;

    public Handler handler = new Handler() {
        @Override
        public void handleMessage(Message msg) {
            // some type of message should be sent back after this executes
            execute_command(msg.getData());
        }
    };
    // ...
}
Any suggestions? Is this structure the best way to go about it, or can you think of a better design that would lead to a cleaner implementation?
I think the lack of answers is because you haven't been very specific in what your question is. In my experience it's easier to get answers to simple or direct questions than to general architecture advice on StackOverflow.
I'm no expert on Android but I'll give it a shot. My question is: why do you have a webserver running in the background of a Service? Why not just have one class and make your Service the webserver?
Regarding threading, communication, and sleeping, the main thing to remember is that a webserver needs to always be available to serve new requests while it is serving current ones. Other than that, it's normal that a client will wait for a thread to complete its task (i.e. the thread "blocks"). So most webservers spawn a new thread to handle each request that comes in. If you have a background thread but you block the initial thread while you wait for the background thread to complete its task, then you're no better off than just completing everything on the one thread. Actually, the latter would be preferable for the sake of simplicity.
If Android is actually spawning new threads for you when requests come in, then there's no need for a background thread. Just do everything synchronously on one thread and rejoice in the simplicity!
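If the Service really must stay separate from the webserver, one way to fill in the "?????" step is a blocking handoff object that serve() waits on until the Service has produced its feedback. This is only a sketch under that assumption; PendingReply is a made-up helper, and in serve() you would attach it to the Message (for example via Message.obj or a map keyed by a request id) before calling await():
import java.util.concurrent.SynchronousQueue;
import java.util.concurrent.TimeUnit;

// Hypothetical helper: one instance per request.
public class PendingReply {
    private final SynchronousQueue<String> queue = new SynchronousQueue<String>();

    // Called by the Service once execute_command() has produced its feedback.
    public void complete(String feedback) {
        queue.offer(feedback);
    }

    // Called by WebServer.serve(); blocks (with a timeout) until the Service replies.
    public String await() throws InterruptedException {
        String reply = queue.poll(10, TimeUnit.SECONDS);
        return reply != null ? reply : "timed out waiting for service";
    }
}
serve() would then build the NanoHTTPD.Response from await()'s return value instead of returning immediately, at the cost of tying up that request thread while it waits.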

Call a Web Service from Servlet at AppEngine

Question: What is the best way to call a web service (0.5-1.5 seconds/call) from a servlet on AppEngine? Are blocking calls scalable in the AppEngine environment?
Context: I am developing a web application using AppEngine and J2EE. The application calls the Amazon web service to grab some information for the user. From my ASP.NET experience, the best way to do such calls is to use an async HTTP handler to prevent starvation of the IIS thread pool. This feature is not available for J2EE with the Servlet 2.5 spec (it is planned for 3.0).
Right now I am thinking of making my controllers (and servlets) thread-safe and request-scoped. Is there anything else that I can do? Is it even an issue in the J2EE + AppEngine environment?
EDIT: I am aware of AppEngine and JAX-WS async invocation support, but I am not sure how it plays with the servlet environment. As far as I understand, to complete the servlet request, the code still has to wait for the async WS call to complete (via a callback or whatever).
I assume that doing it with synchronization primitives will block the current worker thread.
So, while the thread is blocked, to serve another user request the servlet container needs to allocate a new thread from the thread pool, allocate new memory for its stack, and waste time on context switching. Moreover, requests can block the entire server when we run out of threads in the thread pool. These assumptions are based on the ASP.NET and IIS threading model. Are they applicable to the J2EE environment?
ANSWER: After studying the Apache and GAE documentation, it seems that starvation of threads in the thread pool is not a real issue. Apache by default has 200 threads in its thread pool (compared to 25 in ASP.NET and IIS). Based on this, I infer that threads are rather cheap in the JVM.
In case async processing is really required, or the servlet container runs out of threads, it's possible to redesign the application to send the response via the Google Channel API.
The workflow will look like:
Make a sync request to the servlet
The servlet creates a channel for the async reply and queues a task for a background worker
The servlet returns a response to the client
[Serving other requests]
The background worker does the processing and pushes the data to the client via the Channel API (a rough sketch follows)
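A rough sketch of the channel steps above, assuming the classic com.google.appengine.api.channel API; clientId and resultJson are placeholders:
import com.google.appengine.api.channel.ChannelMessage;
import com.google.appengine.api.channel.ChannelService;
import com.google.appengine.api.channel.ChannelServiceFactory;

// Step 2: create a channel for the async reply (clientId identifies the user/request).
ChannelService channelService = ChannelServiceFactory.getChannelService();
String token = channelService.createChannel(clientId);
// ...return the token to the browser as part of the servlet response (step 3)...

// Step 5, in the background worker: push the result to the client.
channelService.sendMessage(new ChannelMessage(clientId, resultJson));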
As you observe, servlets don't support using a single thread to service multiple concurrent requests - one thread is required per request. The best way to do your HTTP call is to use asynchronous urlfetch, and wait on that call to complete when you need the result. This will block the request's thread, but there's no avoiding that - the thread is dedicated to the current request until it terminates no matter what you do.
If you don't need the response from the API call to serve the user's request, you could use the task queue to do the work offline, instead.
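For reference, the asynchronous urlfetch mentioned above looks roughly like this (the URL is a placeholder); the request is started immediately, but the thread only blocks when get() is called:
import com.google.appengine.api.urlfetch.HTTPResponse;
import com.google.appengine.api.urlfetch.URLFetchService;
import com.google.appengine.api.urlfetch.URLFetchServiceFactory;
import java.net.URL;
import java.util.concurrent.Future;

URLFetchService fetcher = URLFetchServiceFactory.getURLFetchService();
// Kick off the Amazon call without blocking yet.
Future<HTTPResponse> pending = fetcher.fetchAsync(new URL("http://example.com/amazon-api"));

// ...do other per-request work here...

// Block only when the result is actually needed to render the response.
HTTPResponse response = pending.get();
byte[] body = response.getContent();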
Isn't it OK to use fetchAsync?
Look at this; it might help:
http://today.java.net/pub/a/today/2006/09/19/asynchronous-jax-ws-web-services.html
I am not sure if you can exactly replicate what you do in .NET, but here is what you could do to simulate it on page load:
Submit an Ajax request to the controller using a JavaScript body onload
In the controller, start the async task, send the response back to the user, and use a session token to keep track of the task
You can poll the controller (add another method to ask for an update of the task, since you have the session token to track it) until you get the response
You can do this either with a waiting-for-response page or a hidden frame that keeps polling the controller
Once you have the response you are looking for, remove the session token
Instead of polling, the best option in this case would be Reverse Ajax / server push
Edit: Now I understand what you mean. I think you can have your code execute the async task and not wait for a response from the async task itself; just send the response back to the user. Below, I have a simple thread that I start but do not wait for to finish as I send the response back to the user, and at the same time I use a session token to track the request.
@Controller
@RequestMapping("/asyncTest")
public class AsyncCotroller {

    @RequestMapping(value = "/async.html", method = RequestMethod.GET)
    public ModelAndView dialogController(Model model, HttpServletRequest request) {
        System.err.println("(System.currentTimeMillis()/1000) " + (System.currentTimeMillis() / 1000));
        // start a thread (async simulator)
        new Thread(new MyRunnbelImpl()).start();
        // use this attribute to track the response
        request.getSession().setAttribute("asyncTaskSessionAttribute", "asyncTaskSessionAttribute");
        // if you look at the printed timestamps, you will see that it is not waiting on the async task
        System.err.println("(System.currentTimeMillis()/1000) " + (System.currentTimeMillis() / 1000));
        return new ModelAndView("test");
    }

    class MyRunnbelImpl implements Runnable {
        @Override
        public void run() {
            try {
                Thread.sleep(5000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }
}

How do I make an async call to Hive in Java?

I would like to execute a Hive query on the server in an asynchronous manner. The Hive query will likely take a long time to complete, so I would prefer not to block on the call. I am currently using Thrift to make a blocking call (it blocks on client.execute()), but I have not seen an example of how to make a non-blocking call. Here is the blocking code:
TSocket transport = new TSocket("hive.example.com", 10000);
transport.setTimeout(999999999);
TBinaryProtocol protocol = new TBinaryProtocol(transport);
Client client = new ThriftHive.Client(protocol);
transport.open();
client.execute(hql); // Omitted HQL

List<String> rows;
while ((rows = client.fetchN(1000)) != null) {
    for (String row : rows) {
        // Do stuff with row
    }
}
transport.close();
The code above is missing try/catch blocks to keep it short.
Does anyone have any ideas how to do an async call? Can Hive/Thrift support it? Is there a better way?
Thanks!
AFAIK, at the time of writing Thrift does not generate asynchronous clients. The reason as explained in this link here (search text for "asynchronous") is that Thrift was designed for the data centre where latency is assumed to be low.
Unfortunately as you know the latency experienced between call and result is not always caused by the network, but by the logic being performed! We have this problem calling into the Cassandra database from a Java application server where we want to limit total threads.
Summary: for now all you can do is make sure you have sufficient resources to handle the required numbers of blocked concurrent threads and wait for a more efficient implementation.
It is now possible to make an asynchronous call in a Java thrift client after this patch was put in:
https://issues.apache.org/jira/browse/THRIFT-768
Generate the async java client using the new thrift and initialize your client as follows:
TNonblockingTransport transport = new TNonblockingSocket("127.0.0.1", 9160);
TAsyncClientManager clientManager = new TAsyncClientManager();
TProtocolFactory protocolFactory = new TBinaryProtocol.Factory();
Hive.AsyncClient client = new Hive.AsyncClient(protocolFactory, clientManager, transport);
Now you can execute methods on this client as you would on a synchronous interface. The only change is that all methods take an additional parameter of a callback.
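As a hedged illustration of that extra callback parameter (the generated execute_call type name depends on your IDL and Thrift version, so treat it as an assumption):
client.execute(hql, new org.apache.thrift.async.AsyncMethodCallback<Hive.AsyncClient.execute_call>() {
    @Override
    public void onComplete(Hive.AsyncClient.execute_call response) {
        // Query finished; fetch rows here (possibly with another async call).
    }

    @Override
    public void onError(Exception e) {
        e.printStackTrace();
    }
});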
I know nothing about Hive, but as a last resort, you can use Java's concurrency library:
// executorService is assumed to already exist, e.g.:
ExecutorService executorService = Executors.newSingleThreadExecutor();

Callable<SomeResult> c = new Callable<SomeResult>() {
    public SomeResult call() {
        // your Hive code here
        return someResult; // whatever result your Hive code produces
    }
};
Future<SomeResult> result = executorService.submit(c);
// when you need the result, this will block
result.get();
Or, if you do not need to wait for the result, use Runnable instead of Callable.
After talking to the Hive mailing list, Hive does not support async calls using Thrift.
I don't know about Hive in particular, but any blocking call can be turned into an async call by spawning a new thread and using a callback. You could look at java.util.concurrent.FutureTask, which has been designed to allow easy handling of such asynchronous operations.
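A minimal sketch of that idea, reusing the client and hql from the question (error handling and the Thrift TException are omitted for brevity):
import java.util.List;
import java.util.concurrent.FutureTask;

// Wrap the blocking Hive call in a FutureTask and run it on its own thread.
FutureTask<List<String>> task = new FutureTask<List<String>>(() -> {
    client.execute(hql);        // the blocking call now happens off the caller's thread
    return client.fetchN(1000); // first batch of rows
});
new Thread(task, "hive-query").start();

// ...do other work...

// get() blocks only if the query is still running.
List<String> firstRows = task.get();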
We fire off asynchronous calls to AWS Elastic MapReduce. AWS MapReduce can run hadoop/hive jobs on Amazon's cloud with a call to the AWS MapReduce web services.
You can also monitor the status of your jobs and grab the results off S3 once the job is completed.
Since the calls to the web services are asynchronous in nature, we never block our other operations. We continue to monitor the status of our jobs in a separate thread and grab the results when the job is complete.
