I'm struggling to wrap my head around what needs to happen here. I'm currently working on an app that runs a service. When started, the service opens a webserver that runs in a background thread.
At any point while this service is running, the user can send commands to the device from a browser. The current sequence of events is as follows.
User sends request to server
Server sends a message to the service via its Handler, passing along data such as the URL parameters
The service does what it wants with the data, and wants to send some feedback message to the user in the browser
?????
The server's response to the request contains a feedback message from the service.
The way my functions are set up, I need to pause my serve() function while waiting for a response from the service, then resume and send an HTTP response once the message is received.
WebServer.java
public Response serve(String uri, String method, Properties header, Properties parms, Properties files)
{
    Bundle b = Utilities.convertToBundle(parms);
    Message msg = new Message();
    msg.setData(b);
    // sending a message to the handler in the service
    handler.sendMessage(msg);
    return new NanoHTTPD.Response();
}
CommandService.java
public class CommandService extends Service {
    private WebServer webserver;

    public Handler handler = new Handler() {
        @Override
        public void handleMessage(Message msg) {
            // some type of message should be sent back after this executes
            execute_command(msg.getData());
        }
    };
    // ...
}
Any suggestions? Is this structure the best way to go about it, or can you think of a better design that would lead to a cleaner implementation?
I think the lack of answers is because you haven't been very specific about what your question is. In my experience it's easier to get answers to simple or direct questions than to general architecture advice on StackOverflow.
I'm no expert on Android, but I'll give it a shot. My question is: why do you have a webserver running in the background of a Service? Why not just have one class and make your Service the webserver?
Regarding threading, communication, and sleeping, the main thing to remember is that a webserver needs to always be available to serve new requests whilst serving current requests. Other than that, it's normal for a client to wait for a thread to complete its task (i.e. the thread "blocks"). So most webservers spawn a new thread to handle each request that comes in. If you have a background thread but you block the initial thread while you wait for the background thread to complete its task, then you're no better off than completing everything on the one thread. Actually, the latter would be preferable for the sake of simplicity.
If Android is actually spawning new threads for you when requests come in, then there's no need for a background thread. Just do everything synchronously on one thread and rejoice in the simplicity!
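Taking that advice with NanoHTTPD (the classic single-file version already spawns a thread per request), serve() could do the work directly and build the response from the result. A minimal sketch, assuming the old NanoHTTPD Response(status, mimeType, body) constructor, a commandService reference held by the WebServer, and execute_command(...) changed to return the feedback string (all three are assumptions, not code from the question):

public Response serve(String uri, String method, Properties header, Properties parms, Properties files)
{
    Bundle b = Utilities.convertToBundle(parms);
    // call into the service synchronously on NanoHTTPD's per-request thread
    String feedback = commandService.execute_command(b); // assumes execute_command now returns the feedback text
    return new NanoHTTPD.Response(NanoHTTPD.HTTP_OK, NanoHTTPD.MIME_PLAINTEXT, feedback);
}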
Related
I am using a 3rd party blocking API. I am going to be using this API as follows:
while(true){
blockingAPI();
sendResultSomewhere();
}
blockingAPI() polls a server for a specific property until it gets a response.
In order to make things asynchronous to some extent, I could spawn this API call in a separate thread and have a callback implemented in Java to handle the response. I was wondering if I can use the Netty framework in this scenario, and how I could do this? The examples I have seen involve a server that listens and communicates with a client, and I am not sure how my use case fits in.
If Netty cannot be used, would my best bet be spawning a new thread and implementing a callback in Java?
I'm not sure what you're really trying to do:
Spawn a new thread internally: you could use a LocalChannel with Netty for intra-JVM communication and therefore get something like what you want, without any network involvement (only within the JVM). blockingAPI() would be computed on the server side of the LocalChannel, while the result is written back once the client side receives the response through the same LocalChannel.
Spawn in response to a request from outside (network): then Netty could of course be used there too, maybe still keeping the LocalChannel logic to separate the network side from the compute side.
Note that I would recommend executing the blocking task asynchronously on the LocalChannel side, so that the "send somewhere else" step is done without blocking Netty's network I/O thread.
Network handler side:
localChannel = creationWithinNetworkHandler(networkChannelCtx);
localChannel.writeAndFlush(something);
while the LocalChannel handler on the server side could look like:
void channelRead0(ChannelHandlerContext ctx, Object someData) {
    Object answer = blockingAPI();
    ctx.channel().writeAndFlush(answer).addListener(ChannelFutureListener.CLOSE);
}
and the LocalChannel handler on the client side could look like:
void channelRead0(ChannelHandlerContext ctx, Object answer) {
    // using the ctx from the network channel side
    networkCtx.writeAndFlush(answer);
}
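For completeness, here is a minimal sketch of the wiring described above using Netty 4's local transport (imports come from io.netty.bootstrap, io.netty.channel and io.netty.channel.local). blockingAPI() and sendResultSomewhere() are the poster's own methods; treat the rest as an outline rather than a drop-in implementation:

EventLoopGroup group = new DefaultEventLoopGroup();
LocalAddress addr = new LocalAddress("blocking-api");

// "server" side of the LocalChannel: runs the blocking call and writes the result back
ServerBootstrap sb = new ServerBootstrap()
        .group(group)
        .channel(LocalServerChannel.class)
        .childHandler(new ChannelInitializer<LocalChannel>() {
            @Override
            protected void initChannel(LocalChannel ch) {
                ch.pipeline().addLast(new SimpleChannelInboundHandler<String>() {
                    @Override
                    protected void channelRead0(ChannelHandlerContext ctx, String msg) {
                        // ideally hand this off to a separate EventExecutorGroup
                        // so no event loop thread is ever blocked
                        String result = blockingAPI();
                        ctx.writeAndFlush(result);
                    }
                });
            }
        });
sb.bind(addr).sync();

// "client" side: receives the result and forwards it somewhere else
Bootstrap cb = new Bootstrap()
        .group(group)
        .channel(LocalChannel.class)
        .handler(new ChannelInitializer<LocalChannel>() {
            @Override
            protected void initChannel(LocalChannel ch) {
                ch.pipeline().addLast(new SimpleChannelInboundHandler<String>() {
                    @Override
                    protected void channelRead0(ChannelHandlerContext ctx, String answer) {
                        sendResultSomewhere(answer);
                    }
                });
            }
        });
Channel client = cb.connect(addr).sync().channel();
client.writeAndFlush("poll"); // kicks off one request/response cycle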
Kind of a "noob" problem here.
I have a small application to write (like a simple game). There is a server side and a client side. It has to use websockets as the way of communication. The server has a server class (with a main() that starts the server) as well as a server endpoint class. However, the game is not turn-based but real-time, so the server has to do certain computations every "tick" because of the dynamic field.
I assume that threads would suit this case well, but I don't know how to combine threads with this kind of server.
As far as I can see, the only thing that can receive/send messages is the endpoint. If I make it implement Runnable and pause every 0.5 seconds, it won't accept messages during that pause. If I define a different class for that purpose, I have no idea how to start it inside an endpoint and give the two a way to communicate.
Does anyone have any suggestions/info/links/anything that may help?
Thank you in advance.
The server endpoint will continuously receive data from the client side. All you have to do is process that data in some other thread. You can define a separate class for that purpose (a thread). This thread class will have two different queues:
In queue - to receive data from the endpoint
Out queue - to send data to the endpoint
(You can use ConcurrentLinkedQueue for that. more help -> How to use ConcurrentLinkedQueue?)
Start this processing thread inside the endpoint. When the endpoint receives data, put it into the in queue. Continuously listen to the out queue and send that data back to the client side.
Endpoint code
@OnMessage
public void onMessage(String message, Session peer) throws IOException {
    processingThread t = new processingThread(peer);
    t.inQueue.add(message);
    t.start();

    String s;
    // listen to the out queue and push results back to the client
    while (true) {
        while ((s = t.outQueue.poll()) != null) {
            peer.getBasicRemote().sendText(s);
        }
    }
}
processingThread Code
public class processingThread extends Thread {
    public ConcurrentLinkedQueue<String> inQueue = new ConcurrentLinkedQueue<String>();
    public ConcurrentLinkedQueue<String> outQueue = new ConcurrentLinkedQueue<String>();

    public processingThread(Session peer) { /* peer kept in case you want to push to it directly */ }

    public void run() {
        // listen to the in queue and process; after processing, put the result on the out queue
        while (true) {
            String data = inQueue.poll();
            if (data != null) outQueue.add(data); // replace with the real game-state processing
        }
    }
}
Hope this will help :)
I'm looking for the best solution to solve this problem :
I have a client and a server.
The client sends requests to the server using the call.invoke method.
The call is currently synchronous and waits for the answer.
Under load, the time it takes to receive the reply from the server is around 1 second (which is a lot).
On the client side we are generating around 50-100 requests per second, so the queue is exploding.
For now I have just created a thread pool that works asynchronously and sends the requests to the server, one per thread, but each request itself is still synchronous.
That means the thread pool has to maintain ~100 threads if we want it to keep up.
I'm not sure this is the best solution.
I was also thinking of somehow having one thread that sends the requests and one thread that catches the replies, but then I'm afraid I would just pass the load on to the server side.
A few things that are important:
We cannot affect the code on the server side and we cannot control the time it takes to receive a reply.
When we receive the reply we just use the data to build another data structure and pass it on, so the timestamp is not really important.
We are using the Axis API.
Any idea what the best way to solve this is? Does the thread pool of ~100 threads seem fine, or are there other ways?
Thanks!
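For reference, the thread-pool approach described in the question could look roughly like the sketch below (java.util.concurrent). Here call stands for the Axis Call object the question mentions, and passItOn is a placeholder for building and forwarding the data structure; each call.invoke stays synchronous inside its own task:

ExecutorService pool = Executors.newFixedThreadPool(100); // roughly one thread per in-flight call

void submitRequest(final Object[] params) {
    pool.submit(new Runnable() {
        public void run() {
            try {
                Object reply = call.invoke(params); // blocking Axis call
                passItOn(reply);                    // placeholder: build the next structure and hand it off
            } catch (Exception e) {
                e.printStackTrace();                // or real error handling
            }
        }
    });
}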
You can call the Axis service in a non-blocking way by registering a callback instance.
Client class:
ServiceClient sc = new ServiceClient();
Options opt = new Options();
// set the target endpoint
opt.setTo(new EndpointReference("http://localhost:8080/axis2/services/CountryService"));
opt.setAction("urn:getCountryDetails");
sc.setOptions(opt);

// anonymous inner class implementing AxisCallback; override all of its methods.
// onMessage gets called once a result is received from the backend
AxisCallback callBack = new AxisCallback() {
    @Override
    public void onMessage(MessageContext msgContext) {
        // this method is called when the result arrives from the backend
        System.out.println(msgContext.getEnvelope().getBody().getFirstElement());
    }
    ...
};

sc.sendReceiveNonBlocking(payload, callBack);
Reference for writing axis service : http://jayalalk.blogspot.com/2014/01/writing-axis2-services-and-deploying-in.html
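The payload in the snippet above is a plain Axiom OMElement. A minimal sketch of building it (the namespace URI and element/parameter names are made up to match the CountryService example, so adjust them to the actual service contract; imports come from org.apache.axiom.om):

OMFactory fac = OMAbstractFactory.getOMFactory();
OMNamespace ns = fac.createOMNamespace("http://sample/xsd", "ns"); // hypothetical namespace
OMElement payload = fac.createOMElement("getCountryDetails", ns);
OMElement name = fac.createOMElement("countryName", ns);           // hypothetical parameter
name.setText("Sri Lanka");
payload.addChild(name);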
On my Android app, I'm implementing a SignalR connection (https://github.com/erizet/SignalA) to connect to a hub server to send requests and receive responses.
A sample of my code is as follows:
signalAConnection = new com.zsoft.SignalA.Connection(Constants.getHubUrl(), this, new LongPollingTransport())
{
    @Override
    public void OnError(Exception exception)
    {
    }

    @Override
    public void OnMessage(String message)
    {
    }

    @Override
    public void OnStateChanged(StateBase oldState, StateBase newState)
    {
    }
};

if (signalAConnection != null)
    signalAConnection.Start();
There's also the sending bit
signalAConnection.Send(hubMessageJson, new SendCallback()
{
public void OnError(Exception ex)
{
}
public void OnSent(CharSequence message)
{
}
});
The sending and receiving will occur across activities, and some responses will arrive at random times regardless of which activity is in the foreground. Also, the connection should stay open as long as the app is running (even if it is in the background). That's why I want to implement the SignalA connection as a background service.
The question is should I implement it as:
1 - a Service (http://developer.android.com/reference/android/app/Service.html)
OR
2 - an Intent Service (http://developer.android.com/training/run-background-service/create-service.html)
Keeping in mind that I will need to send strings to the service and get response strings from the service.
I would be most grateful if someone would show me how to implement this kind of connection in code as a background service/intentservice.
Thanks for reading.
UPDATE:
Please see this demo activity made by the developer, showing how he implemented SignalA:
https://github.com/erizet/SignalA/blob/master/Demo/src/com/zsoft/SignalADemo/DemoActivity.java
The problem is that AQuery (which I know nothing about) is being used in this demo activity. Does AQuery run in the background all the time?
The problem is, the latest update on SignalA mentions the following
I have changed the transport. LongPolling now uses basic-http-client
instead of Aquery for http communication. I've removed all
dependencies on Aquery.
Hence I'm not sure whether I should follow this demo activity or not
Update 2:
This is the thing that is confusing me the most:
In an IntentService, the onHandleIntent method stops the service after it finishes its work, whereas I actually want the code in the IntentService to keep running all the time.
protected abstract void onHandleIntent (Intent intent)
Added in API level 3
This method is invoked on the worker thread with a request to process. Only one Intent is processed at a time, but the processing happens on a worker thread that runs independently from other application logic. So, if this code takes a long time, it will hold up other requests to the same IntentService, but it will not hold up anything else. When all requests have been handled, the IntentService stops itself, so you should not call stopSelf().
SignalA runs on the thread that creates and starts the connection, but all network access is done in the background. The remaining work on the starting thread is really lightweight, hence it's perfectly OK to do it on the UI thread.
To answer your question, you need a thread running the SignalA connection. Therefore I think a Service is the best choice, since SignalA needs to be running all the time.
Regarding AQuery and the demo project: I removed all dependencies on AQuery in the libraries, not in the demo. To be clear, you don't need AQuery to run SignalA.
In my case, what I wanted was a Service, not an IntentService, since I wanted something that would keep running until the app closes.
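A minimal sketch of what such a Service could look like, reusing the Connection API from the snippets above. The Stop() call and the LocalBroadcastManager hand-off are assumptions (check what the library and your activities actually need), not something taken from the SignalA docs:

public class SignalAService extends Service {

    private com.zsoft.SignalA.Connection signalAConnection;

    @Override
    public void onCreate() {
        super.onCreate();
        signalAConnection = new com.zsoft.SignalA.Connection(Constants.getHubUrl(), this, new LongPollingTransport()) {
            @Override
            public void OnError(Exception exception) { /* log / schedule a retry */ }

            @Override
            public void OnMessage(String message) {
                // hand the message to whichever Activity is interested, e.g. via a local broadcast
                Intent i = new Intent("com.example.SIGNALA_MESSAGE"); // hypothetical action name
                i.putExtra("payload", message);
                LocalBroadcastManager.getInstance(SignalAService.this).sendBroadcast(i);
            }

            @Override
            public void OnStateChanged(StateBase oldState, StateBase newState) { }
        };
        signalAConnection.Start();
    }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        return START_STICKY; // keep the service (and the connection) alive while the app runs
    }

    @Override
    public void onDestroy() {
        if (signalAConnection != null) signalAConnection.Stop(); // assumption: Stop() mirrors Start()
        super.onDestroy();
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null; // started, not bound, in this sketch
    }
}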
Question: What is best way to call a web service (0.5-1.5 seconds/call) from a servlet at AppEngine? Are blocking calls are scalable at AppEngine environment?
Context: I am developing a web application using AppEngine and J2EE. The application calls an Amazon web service to grab some information for the user. From my ASP.NET experience, the best way to do these calls is to use an async HTTP handler to prevent starvation of the IIS thread pool. This feature is not available for J2EE with the Servlet 2.5 spec (it is planned for 3.0).
Right now I am thinking of making my controllers (and servlets) thread-safe and request-scoped. Is there anything else that I can do? Is it even an issue in a J2EE + AppEngine environment?
EDIT: I am aware of AppEngine and JAX-WS async invocation support, but I am not sure how it plays with the servlet environment. As far as I understand, to complete the servlet request, the code still has to wait for the async WS call to complete (via a callback or otherwise).
I assume that doing this with synchronization primitives will block the current worker thread.
So, as long as that thread is blocked, the servlet container needs to allocate a new thread from the pool to serve another user request, allocate new memory for its stack and waste time on context switching. Moreover, requests can block the entire server when we run out of threads in the thread pool. These assumptions are based on the ASP.NET and IIS thread model. Are they applicable to a J2EE environment?
ANSWER: After studying the Apache and GAE documentation, it seems that starvation of threads in the thread pool is not a real issue. Apache by default has 200 threads in its thread pool (compared to 25 in ASP.NET and IIS). Based on this I can infer that threads are rather cheap in the JVM.
In case async processing is really required, or the servlet container runs out of threads, it's possible to redesign the application to send the response via the Google Channel API.
The workflow will look like this (a sketch of the relevant API calls follows the list):
Make sync request to servlet
Servlet creates a channel for the async reply and queues a task for a background worker
Servlet returns response to client
[Serving other requests]
Background worker does the processing and pushes data to the client via the Channel API
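A rough sketch of steps 2 and 5 with the Channel API and the task queue (com.google.appengine.api.channel and com.google.appengine.api.taskqueue); the /worker URL, clientId and resultJson are placeholders:

// step 2, inside the servlet: create a channel and queue the background work
ChannelService channelService = ChannelServiceFactory.getChannelService();
String token = channelService.createChannel(clientId);            // clientId identifies this user/page
QueueFactory.getDefaultQueue().add(
        TaskOptions.Builder.withUrl("/worker")                     // hypothetical worker servlet
                           .param("clientId", clientId));
// return the token to the page so the JavaScript client can open the channel

// step 5, inside the worker: push the result back over the channel
channelService.sendMessage(new ChannelMessage(clientId, resultJson));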
As you observe, servlets don't support using a single thread to service multiple concurrent requests - one thread is required per request. The best way to do your HTTP call is to use asynchronous urlfetch, and wait on that call to complete when you need the result. This will block the request's thread, but there's no avoiding that - the thread is dedicated to the current request until it terminates no matter what you do.
If you don't need the response from the API call to serve the user's request, you could use the task queue to do the work offline, instead.
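A minimal sketch of the asynchronous urlfetch pattern (com.google.appengine.api.urlfetch; the URL is a placeholder, and the surrounding method has to deal with the checked exceptions): start the fetch early, do the rest of the request's work, and only block on the Future when the response is actually needed.

URLFetchService fetcher = URLFetchServiceFactory.getURLFetchService();
Future<HTTPResponse> future = fetcher.fetchAsync(new URL("https://example.com/api")); // placeholder URL

// ... do whatever other per-request work you can in the meantime ...

HTTPResponse response = future.get(); // blocks this request's thread only once the result is needed
byte[] body = response.getContent();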
Isn't it OK to use fetchAsync?
Look at this, it might help:
http://today.java.net/pub/a/today/2006/09/19/asynchronous-jax-ws-web-services.html
I am not sure if you can exactly replicate what you do in .NET, but here is what you could do to simulate it on page load:
Submit an Ajax request to the controller using a JavaScript body onload
In the controller, start the async task, send the response back to the user, and use a session token to keep track of the task
Poll the controller (add another method that asks for an update on the task, since you have the session token to track it) until you get the response
You can do this either with a waiting-for-response page or with a hidden frame that keeps polling the controller
Once you have the response you are looking for, remove the session token
If you want the best option, then instead of polling, Reverse Ajax / server push would be ideal in this case.
Edit: Now I understand what you mean. I think you can have your code execute the async task without waiting for a response from it, and just send the response back to the user. Below I start a simple thread but do not wait for it to finish before sending the response back to the user, while at the same time using a session token to track the request.
import javax.servlet.http.HttpServletRequest;

import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.servlet.ModelAndView;

@Controller
@RequestMapping("/asyncTest")
public class AsyncController {

    @RequestMapping(value = "/async.html", method = RequestMethod.GET)
    public ModelAndView dialogController(Model model, HttpServletRequest request)
    {
        System.err.println("(System.currentTimeMillis()/1000) " + (System.currentTimeMillis() / 1000));
        // start a thread (async simulator)
        new Thread(new MyRunnableImpl()).start();
        // use this attribute to track the response
        request.getSession().setAttribute("asyncTaskSessionAttribute", "asyncTaskSessionAttribute");
        // if you look at the System.err output, you will see that it is not waiting on the async task
        System.err.println("(System.currentTimeMillis()/1000) " + (System.currentTimeMillis() / 1000));
        return new ModelAndView("test");
    }

    class MyRunnableImpl implements Runnable
    {
        @Override
        public void run()
        {
            try
            {
                Thread.sleep(5000); // simulate the long-running task
            } catch (InterruptedException e)
            {
                e.printStackTrace();
            }
        }
    }
}
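And a hypothetical companion poll method to add to the same controller for the flow described above. It assumes the worker stores its result in the session under a second attribute (say asyncTaskResult) when it finishes, which the Runnable in the sketch above does not yet do; the page's JavaScript would keep hitting this URL until it gets something other than PENDING (needs org.springframework.web.bind.annotation.ResponseBody):

@RequestMapping(value = "/asyncStatus.html", method = RequestMethod.GET)
@ResponseBody
public String pollStatus(HttpServletRequest request) {
    Object result = request.getSession().getAttribute("asyncTaskResult"); // set by the worker when done (assumption)
    if (result == null) {
        return "PENDING"; // still running -- the client polls again
    }
    // clean up the tracking token and hand the payload back to the page
    request.getSession().removeAttribute("asyncTaskSessionAttribute");
    request.getSession().removeAttribute("asyncTaskResult");
    return result.toString();
}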