Multi Threading application + web application - java

Is a typical J2EE web application, or any web app built on top of Java, a multi-threaded application? Does that mean that every time I write some code I have to keep race conditions or concurrent modification in mind?

Is a typical J2EE web application or any web app built on top of Java a multi-threaded application?
Yes, it is. But the application server (Tomcat, JBoss, WebSphere, etc.) manages the threads and resources for you, so in most cases you do not need to worry about race conditions or concurrent modification yourself.
When should you worry about concurrent modification? For example, if you declare an instance field in a Servlet and update that field on every request (in the doPost or doGet method of the servlet), then two users could hit the same URL at the same time and the field would end up with an unexpected value. This is covered in How do servlets work? Instantiation, sessions, shared variables and multithreading, in the "Threadsafety" section of the accepted answer. Note that a design like this is bad practice.
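For illustration only, here is a minimal sketch of the kind of unsafe shared field described above (the servlet and field names are made up):

import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class GreetingServlet extends HttpServlet {

    // A single servlet instance serves all requests, so this field is
    // shared by every request-handling thread.
    private String lastUser;

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        lastUser = req.getParameter("user");
        // Another request may overwrite lastUser right here, so the line
        // below can greet the wrong user.
        resp.getWriter().println("Hello " + lastUser);
    }
}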
Another case is firing new threads yourself and sharing resources among those threads. This is neither good nor bad practice in itself; you just have to understand the risk you are taking and accept the consequences. That is, you can have a Servlet and fire threads from it, but it's up to you to handle them correctly. Note that you should also evaluate whether you really need to fire and manage threads in your Java EE application, or whether another approach, such as firing JMS messages that are processed in parallel and asynchronously, would do.
@AndreiI noted in his/her answer that EJB prohibits using threads. This means that you cannot manage threads inside an EJB, neither by creating a new instance of Thread nor by using an ExecutorService or anything similar. In code:
@Stateless
public class FooEJB {

    public void bar() {
        // this is not allowed!
        Thread t = new Thread(new Runnable() {
            // implementation of runnable
        });
        t.start();
    }

    public void baz() {
        // this is not allowed either!
        final int numberOfThreads = ...;
        ExecutorService es = Executors.newFixedThreadPool(numberOfThreads);
        es.execute(new Runnable() { ... });
        es.shutdown();
    }
}

Like almost any framework in Java (server applications, including web frameworks, or GUI applications based on AWT or Swing), Java EE is multi-threaded. But the answer to your question is no: you do not have to care about race conditions or concurrent modification most of the time. Of course there are mistakes you must not make (like sharing mutable Servlet instance variables), but in a typical application you do not deal with such things directly. For example, the EJB specification prohibits managing threads yourself, but it offers a mechanism for asynchronous jobs instead. Excerpt from the EJB specification:
The enterprise bean must not attempt to manage threads. The enterprise bean must not attempt to start, stop, suspend, or resume a thread, or to change a thread's priority or name. The enterprise bean must not attempt to manage thread groups.
Also, the most used interface of the JPA specification, EntityManager, is not thread safe, although others (for example EntityManagerFactory) are.
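To illustrate the asynchronous-job mechanism mentioned above, here is a sketch (the bean and method names are invented) of letting the container run the work instead of managing a thread yourself:

import java.util.concurrent.Future;
import javax.ejb.AsyncResult;
import javax.ejb.Asynchronous;
import javax.ejb.Stateless;

@Stateless
public class ReportBean {

    // The container executes this method on one of its own managed threads;
    // the caller gets a Future back immediately instead of blocking.
    @Asynchronous
    public Future<String> buildReport(long reportId) {
        String report = "report-" + reportId; // placeholder for the real work
        return new AsyncResult<String>(report);
    }
}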

In a Java EE application the container takes care of threading for you; typically it uses one thread per request. With Spring or EJB you can also declare different scopes for your beans (request, session, application, and so on). So you should not normally have to manage threads directly in a Java EE application.


If multiple requests are handled by a server to run a single servlet, then where do we need to take care of synchronization?
I got the answer to how multiple requests are handled from How does a single servlet handle multiple requests from client side. But then there is another question: why do we need synchronization at all if each request is handled separately?
Can you give a real-life example of how shared state works and how one servlet request can depend on another? I am not so much interested in code as in an explanation based on some portal application, e.g. a login page accessed by n users concurrently.
If more than one request is handled by the server... From what I have read, the server creates a thread pool of n threads to serve the requests, and I guess each thread has its own set of parameters to maintain the session. So is there any chance that two or more threads (that is, two or more requests) can collide with each other?
Synchronization is required when multiple threads are modifying a shared resource.
So, as long as your servlets are independent of each other, you don't need to worry about the fact that they run in parallel.
But if they somehow work on shared state (for example by reading/writing values in some sort of centralized data store), then you have to make sure that things don't go wrong. Of course, the layer at which, and the form in which, you provide the necessary synchronization depends on your exact setup.
Yes, my answer is pretty generic; but so is your question.
Synchronization in Java is only needed if a shared object is mutable. If your shared object is read-only or immutable, you don't need synchronization, no matter how many threads access it. The same applies to what the threads are doing with the object: if all threads only read its value, no synchronization is required.
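As a small sketch of that point (the class is invented): an immutable object can be shared by any number of request threads without synchronization:

// All fields are final and there are no setters, so instances can be
// read concurrently by many threads without any locking.
public final class SiteConfig {

    private final String siteName;
    private final int maxLoginAttempts;

    public SiteConfig(String siteName, int maxLoginAttempts) {
        this.siteName = siteName;
        this.maxLoginAttempts = maxLoginAttempts;
    }

    public String getSiteName() { return siteName; }

    public int getMaxLoginAttempts() { return maxLoginAttempts; }
}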
Basically, if your servlet application is multi-threaded (and it is), then data associated with the servlet instance is not automatically thread safe. The common example given in many textbooks is a hit counter stored as a private variable, e.g.:
public class YourServlet implements Servlet {

    private int counter;

    public void service(ServletRequest req, ServletResponse res) {
        // this is not thread safe
        counter++;
    }
}
This is because the service method of the servlet is invoked by multiple threads, one per incoming HTTP request. The increment operator has to first read the current value, add one, and then write the value back. Another thread doing the same operation concurrently may increment the value after the first thread has read it but before it has written it back, resulting in a lost update.
So in this case you should use synchronisation, or even better, the AtomicInteger class included as part of the Java concurrency utilities from Java 1.5 onwards.
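For example, the counter could be made thread safe along these lines (a sketch; it extends GenericServlet rather than implementing Servlet directly so that it compiles on its own):

import java.util.concurrent.atomic.AtomicInteger;
import javax.servlet.GenericServlet;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;

public class YourServlet extends GenericServlet {

    // incrementAndGet() is an atomic read-modify-write, so no update is lost
    private final AtomicInteger counter = new AtomicInteger();

    @Override
    public void service(ServletRequest req, ServletResponse res) {
        counter.incrementAndGet();
    }
}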

Spring MVC Rest Services - Number of Threads (Controller Instances)

In our application we want to achieve higher throughput so I just want to know how threading works in Spring MVC controllers.
Thanks in advance for your help.
This helped me
http://community.jaspersoft.com/wiki/how-increase-maximum-thread-count-tomcat-level
A web application is hosted in an application server (like Tomcat). Usually the application server manages a thread pool, and every request is handled by one of its threads.
The web application doesn't have to worry about this thread pool; its size is a parameter of the application server.
To achieve higher throughput you really need to identify the bottleneck.
(In my experience, the size of the application server's thread pool is rarely the root cause of a performance problem.)
Note that the "number of controller instances" is normally one, i.e. a controller is usually a singleton shared by all threads, and therefore a controller must be thread-safe.
Let us make the question a little more specific: an application implementing a REST controller is deployed on a typical multithreaded application server (possibly running other things as well). Question: is there concurrency in the handling of separate requests to the mapped methods of the controller?
I'm not authoritative on this subject, but it is of high importance (in particular: should single-threaded logic be applied to REST controller code?).
Edit: the answer below is WRONG. Concurrent calls to different methods of the same controller are handled concurrently, and so all shared resources they use (services, repositories, etc.) must ensure thread safety. For some reason, however, calls handled by the same method of the controller are serialized (or so it appears to me as of now).
The small test below shows that even though subsequent (and rapid) calls to the mapped methods of the controller are indeed handled by different threads, single-threaded logic applies (i.e. there is no concurrency "out of the box").
Let us take the controller:
AtomicInteger count = new AtomicInteger();

@RequestMapping(value = {"/xx/newproduct"})
@ResponseBody
public Answer newProduct() {
    Integer atCount = count.incrementAndGet();
    // Any delay/work would do here
    try {
        Thread.sleep(1000);
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
    Answer ans = new Answer("thread:" + Thread.currentThread().getName() + " controller:" + this, atCount);
    count.decrementAndGet();
    return ans;
}
and launch 10 rapid (almost concurrent w.r.t. the 1000ms sleep time) REST requests, e.g. by the AngularJS code
$scope.newProd = function (cnt) {
    var url = $scope.M.dataSource + 'xx/newproduct';
    for (var i = 0; i < cnt; ++i) {
        $http.get(url).success(function (data) {
            console.log(data);
        });
    }
};
(the Answer just carries a String and an Integer; making count static would not change anything). What happens is that all requests become pending concurrently, but the responses come back sequentially, exactly 1 s apart, and none has atCount > 1. They do come from different threads, though.
Edit: this shows that concurrent calls to the same method/route are serialized. However, by adding a second method to the controller we can easily verify that calls to it are handled concurrently with calls to the first method, and hence multithreaded logic for handling requests is mandatory "out of the box".
In order to profit from multithreading one should therefore, it seems, employ traditional explicit means, such as launching any non-trivial work as a Runnable on an Executor.
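A minimal sketch of that approach (the controller, mapping and work method are all invented): the handler submits the heavy part to a small pool and responds immediately:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.ResponseBody;

@Controller
public class ProductController {

    // Small fixed pool shared by all requests to this controller
    private final ExecutorService pool = Executors.newFixedThreadPool(4);

    @RequestMapping("/xx/rebuildindex")
    @ResponseBody
    public String rebuildIndex() {
        pool.submit(new Runnable() {
            @Override
            public void run() {
                doExpensiveWork(); // hypothetical long-running job
            }
        });
        return "accepted"; // responds without waiting for the work to finish
    }

    private void doExpensiveWork() { /* ... */ }
}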
Basically this has nothing to do with Spring. Usually each request is handled in a separate thread, so the usual thing to do here is to find the bottleneck.
However, badly written beans that share state across thread boundaries, and therefore need to be synchronized, can have a very bad effect.

Threading and Concurrency Within A Servlet

I have a web application that retrieves a (large) list of results from the database, then needs to pare down the list by looking at each result, and throwing out "invalid" ones. The parameters that make a result "invalid" are dynamic, and we cannot pass the work on to the database.
So, one idea is to create a thread pool via an ExecutorService and check these results concurrently. But I keep seeing people saying "Oh, the spec prohibits spawning threads in a servlet" or "that's just a bad idea".
So, my question: what am I supposed to do? I'm in a Servlet 2.5 container, so all the asynchronous goodies that are part of the 3.0 spec are unavailable to me. Writing a separate service that I communicate with via JMS seems like overkill.
Looking for expert advice here.
Jason
Nonsense.
The JEE spec has lots of "should nots" and "thou shalt nots". The Servlet spec, on the other hand, has none of that. The Servlet spec is much more wild west; it really doesn't dive into the actual operational aspects the way the JEE spec does.
I've yet to see a JEE container (either a pure servlet container ala Tomcat/Jetty, or full boat ala Glassfish/JBoss) that actually prevented me from firing off a thread on my own. WebSphere might, it's supposed to be rather notorious, but I've not used WebSphere.
If the concept of creating unruly, self-managed threads makes you itch, then the full JEE containers internally have a formal "WorkManager" that can be used to peel threads off of. They just all expose them in different ways. That's the more "by the book-ish" mechanism for getting a thread.
But, frankly, I wouldn't bother. You'll likely have more success using the Executors out of the standard class library. If you saturate your system with too many threads and everything gets out of hand, well, that's on you. Don't Do That(tm).
As to whether an async solution is even appropriate, I'll punt on that. It's not clear from your post whether it is or not. But your question was about threads and Servlets.
Just Do It. Be aware it "may not be portable", do it right (use an Executor), take responsibility for it, and the container won't be the wiser, nor care.
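To make that concrete for the filtering problem in the question, here is a sketch using an ExecutorService from the standard library; the Result type and the isValid check are stand-ins for the real domain objects and rules:

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ResultFilter {

    /** Placeholder for whatever domain object the query returns. */
    public static class Result { }

    private final ExecutorService pool = Executors.newFixedThreadPool(8);

    public List<Result> filterValid(List<Result> candidates)
            throws InterruptedException, ExecutionException {
        List<Callable<Result>> tasks = new ArrayList<Callable<Result>>();
        for (final Result r : candidates) {
            tasks.add(new Callable<Result>() {
                public Result call() {
                    // isValid() applies the dynamic rules; null means "throw it out"
                    return isValid(r) ? r : null;
                }
            });
        }
        List<Result> valid = new ArrayList<Result>();
        for (Future<Result> f : pool.invokeAll(tasks)) { // waits for all checks
            Result checked = f.get();
            if (checked != null) {
                valid.add(checked);
            }
        }
        return valid;
    }

    private boolean isValid(Result r) {
        return true; // stand-in for the dynamic validity rules
    }
}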
Doesn't look like concurrency will help you much here. Unless it's very expensive to check each entry, making that check concurrent won't speed things up. Your bottleneck is passing the result set through the database connection, and you couldn't multithread that even if you weren't working on a servlet.
There's nothing to stop you from hitting a thread pool from your Servlet; the challenge comes in getting the results. If the Servlet invocation is expecting some result from your submission of a task to the ThreadPool, you will end up blocking, waiting for the ThreadPool work to finish, so that you can compose a response to the doGet/doPut invocation.
If, on the other hand, you design your service such that a doPut, for example, submits a task to a ThreadPool but returns a "handle" or some other unique identifier of the task to the client, then the client can "poll" that handle through some doGet API to see whether the task is done. When the task is done, the client can fetch the results.
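A rough sketch of that submit-then-poll pattern (the names, parameters and task body are invented for illustration):

import java.io.IOException;
import java.util.UUID;
import java.util.concurrent.Callable;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class TaskServlet extends HttpServlet {

    private final ExecutorService pool = Executors.newFixedThreadPool(4);
    private final ConcurrentHashMap<String, Future<String>> tasks =
            new ConcurrentHashMap<String, Future<String>>();

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        String handle = UUID.randomUUID().toString();
        tasks.put(handle, pool.submit(new Callable<String>() {
            public String call() throws Exception {
                return doExpensiveWork(); // placeholder for the real job
            }
        }));
        resp.getWriter().println(handle); // the client keeps this and polls later
    }

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        Future<String> f = tasks.get(req.getParameter("handle"));
        if (f == null) {
            resp.sendError(HttpServletResponse.SC_NOT_FOUND);
        } else if (!f.isDone()) {
            resp.getWriter().println("PENDING");
        } else {
            try {
                resp.getWriter().println(f.get());
                tasks.remove(req.getParameter("handle"));
            } catch (Exception e) {
                resp.sendError(HttpServletResponse.SC_INTERNAL_SERVER_ERROR, e.getMessage());
            }
        }
    }

    private String doExpensiveWork() {
        return "done"; // stand-in for the real result
    }
}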
It's completely fine and appropriate. I have done plenty of work with Servlets that use thread pools, on different containers, without any problems whatsoever.
EJB containers (like JBoss) tend to warn against spawning threads, but this is because EJB guarantees that an instance of a bean is only called by one thread at a time; some of its facilities rely on this, and you could break that by using your own threads. In a Servlet there is no such reliance, and hence nothing you can mess up this way.
Even in EJB containers, you can use thread pools and be fine as long as you don't interact (like call) with EJB facilities from your own threads.
The thing to watch out for with servlet/threads is that member variables of the servlet need to be thread safe.
Technically nothing stops you from using a thread pool in your servlet to do some post-processing, but you could shoot yourself in the foot: if you create a static thread pool with, say, 20 threads and 50 clients access your servlet concurrently, 30 clients will be waiting (depending on how long your post-processing takes).

JEE7: best way to create another thread that never exits

I'm writing a JEE7/Glassfish 4 application that reads data from an external queue (RabbitMQ) and processes it. It needs a method (I suppose an EJB method) containing a loop that never exits and keeps reading the queue. Since this loop never exits, I suppose it needs to run on a separate thread. My question is: what is the correct way to do this in a JEE7 application?
This may be obvious, but the ReadQueue() method needs to start automatically when the app starts and must keep running permanently.
Is the ManagedExecutorService appropriate for this?
ManagedExecutorService is exactly what you want to use for this.
The availability of this service in JEE is a great benefit. In the past, we basically just ignored the guidelines and managed all of this stuff ourselves.
The MES allows you to capture the context information of the invoking component, and tie your task in to the life cycle of the container. These are both very important in the JEE environment.
As to where to start the task, you basically have two options.
One, you can use a ServletContextListener, and have that kick off the task during container startup.
Two, you can use an #Singleton EJB, and tie in to its lifecycle methods to start your task.
If you start the task up from the ServletContextListener, then the task will run as if it's in the WAR environment. If you start it up from the #Singleton, it will run within the Session Beans environment (this mostly relates to how the JNDI appears).
Either way, you only need to worry about starting the task via these mechanisms. You should rely on the ManagedTaskListener.taskAborted interface method to shut your task down.
In theory you can work with the Thread.interrupt that is sent to your task during shutdown. I've never had good luck with that myself; I rely on an external mechanism to tell long-running tasks to shut off.
I wish I could give first hand experience with this new facility, but I haven't had an opportunity to try it out yet. But from the spec, this is what you want to do.
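For what it's worth, here is a sketch of the @Singleton variant under the assumptions above (class names and the queue-reading body are placeholders):

import javax.annotation.PostConstruct;
import javax.annotation.PreDestroy;
import javax.annotation.Resource;
import javax.ejb.Singleton;
import javax.ejb.Startup;
import javax.enterprise.concurrent.ManagedExecutorService;

@Singleton
@Startup
public class QueueReaderBootstrap {

    // Container-managed executor introduced in Java EE 7 (JSR 236)
    @Resource
    private ManagedExecutorService executor;

    private volatile boolean running = true;

    @PostConstruct
    public void start() {
        executor.submit(new Runnable() {
            @Override
            public void run() {
                while (running) {
                    readQueueOnce(); // placeholder: pull and process one message
                }
            }
        });
    }

    @PreDestroy
    public void stop() {
        running = false; // external flag, as suggested above, to end the loop
    }

    private void readQueueOnce() {
        // hypothetical RabbitMQ consume/process step
    }
}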
Starting a thread with an infinite loop that polls the queue is usually not a good idea. The nature of queues suggests asynchronous, event-driven processing, and for such problems in the JEE world you have MDBs. The only issue here is that an MDB requires a JMS provider, whereas RabbitMQ uses a different protocol (AMQP). You need a JMS-AMQP bridge to make this work; Qpid JMS could be an option, but there is no guarantee that it will work.
Here is one way to create a thread that never exits:
public class HelloRunnable implements Runnable {

    public void run() {
        while (true) {
            // do ReadQueue() here
        }
    }

    public static void main(String[] args) {
        (new Thread(new HelloRunnable())).start();
    }
}

How do we start a thread from a servlet?

What's the recommended way of starting a thread from a servlet?
Example: One user posts a new chat message to a game room. I want to send a push notification to all other players connected to the room, but it doesn't have to happen synchronously. Something like:
public class MyChatServlet extends HttpServlet {

    protected void doPost(HttpServletRequest request,
                          HttpServletResponse response)
    {
        // Update the database with the new chat message.
        final String msg = ...;
        putMsgInDatabaseForGameroom(msg);

        // Now spawn a thread which will deal with communicating
        // with apple's apns service, this can be done async.
        new Thread() {
            public void run() {
                talkToApple(msg);
                someOtherUnimportantStuff(msg);
            }
        }.start();

        // We can send a reply back to the caller now.
        // ...
    }
}
I'm using Jetty, but I don't know if the web container really matters in this case.
Thanks
What's the recommended way of starting a thread from a servlet?
You should be very careful when writing threading code in a servlet, because mistakes (like memory leaks or missing synchronization) can cause bugs that are very hard to reproduce, or can bring down the whole server.
You can start a thread with the start() method, but as far as I know the recommended approach is startAsync (Servlet 3.0).
but I don't know if the web container really matters in this case.
Yes, it matters. Most web servers (Java and otherwise, including JBoss) follow a "one thread per request" model, i.e. each HTTP request is fully processed by exactly one thread.
This thread will often spend most of its time waiting for things like DB requests. The web container will create new threads as necessary.
Hope it will help you.
I would use a ThreadPoolExecutor and submit the tasks to it (see the sketch after the list below). The executor can be configured with a fixed or varying number of threads, and with a work queue that can be bounded or not.
The advantages:
The total number of threads (as well as the queue size) can be bounded, so you have good control on resource consumption.
Threads are pooled, eliminating the overhead of thread starting per request
You can choose a task rejection policy (Occurs when the pool is at full capacity)
You can easily monitor the load on the pool
The executor mechanism supports convenient ways of tracking the asynchronous operation (using Future)
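A minimal sketch along those lines for the chat example above; the pool sizes, queue bound and rejection policy are arbitrary choices, and talkToApple stands in for the APNs call from the question:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class PushNotifier {

    // 2 core threads, at most 8, idle extras die after 60 s,
    // at most 100 queued tasks; when full, the caller runs the task itself.
    private final ThreadPoolExecutor pool = new ThreadPoolExecutor(
            2, 8, 60, TimeUnit.SECONDS,
            new ArrayBlockingQueue<Runnable>(100),
            new ThreadPoolExecutor.CallerRunsPolicy());

    public void notifyRoomAsync(final String msg) {
        pool.submit(new Runnable() {
            @Override
            public void run() {
                talkToApple(msg); // hypothetical APNs call from the question
            }
        });
    }

    public int queuedTasks() {
        return pool.getQueue().size(); // easy way to monitor the load on the pool
    }

    private void talkToApple(String msg) {
        // placeholder for the real APNs communication
    }
}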
In general that is the way: you can start a thread anywhere in a servlet web application.
But in particular, you should protect your JVM from starting too many threads per HTTP request. Someone may send a lot (or very, very many) of requests, and at some point your JVM will probably die with an out-of-memory error or something similar.
So a better choice is to use one of the queues and executors found in the java.util.concurrent package.
One option would be to use an ExecutorService and its implementations like ThreadPoolExecutor, to re-use pooled threads and thus reduce the creation overhead.
You can also use JMS to queue your tasks for later execution.
