Java ExecutorService sequentially - Spring MVC - java

I want to implement the ExecutorService in my Spring-MVC application.
I need a global ExecutorService which takes tasks, puts them in a queue, and then executes them sequentially. I want to pass tasks to it from different locations in the application. Therefore I am using Executors.newSingleThreadExecutor(), so I have only one thread executing these tasks.
However, I am not sure how to integrate it in a Spring application:
public enum TaskQueue {
    INSTANCE;

    ExecutorService executorService;

    private TaskQueue() {
        executorService = Executors.newSingleThreadExecutor();
    }

    public void addTaskToQueue(Runnable task) {
        executorService.submit(task);
        executorService.shutdown();
    }
}
So I am thinking of creating a Singleton and then just passing the Task (Runnable object) to the method:
TaskQueue.INSTANCE.addTaskToQueue(new Runnable() {
    @Override
    public void run() {
        System.out.println("Executing Task1 inside : " + Thread.currentThread().getName());
    }
});
However, I have several questions:
I am not sure how to integrate it in a Spring MVC application where I have controllers, services and so on.
The application receives some notifications from a web service. These notifications are processed in different locations in the code, and I want to execute them sequentially. So I need to identify all tasks I want to run asynchronously and then pass them to the method above (addTaskToQueue), wrapped in a Runnable object, so they can be executed asynchronously. Is this the correct approach?
So I always pass a Runnable object to this method to execute it. The method submits it and then shuts the ExecutorService down. So each time I pass a task to this service, it would have to create a new service and then close it. But I don't want that - I want the ExecutorService to stay alive and execute the tasks that are coming in, not shut down after each task. How do I achieve this?
Or am I totally wrong in implementing it this way?
EDIT:
Using the TaskExecutor provided by Spring, I would implement it like this:
@Autowired
private TaskExecutor taskExecutor;
Then I call it from different locations in my code:
taskExecutor.execute(new Runnable() {
    @Override
    public void run() {
        // TODO add long running task
    }
});
Probably I need to make sure somewhere in the configuration that it executes tasks sequentially, i.e. that it creates only one thread. Is it safe to call it from different locations? It won't create a new executor with a new queue every time, will it?

Spring already has a bean for this: org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor. It implements the DisposableBean interface, and its shutdown() method is called when the Spring context is destroyed.
<bean id="taskExecutor" class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
    <property name="corePoolSize" value="${threadPoolSize}"/>
    <property name="maxPoolSize" value="${threadPoolSize}"/>
    <property name="waitForTasksToCompleteOnShutdown" value="false"/>
</bean>
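If you prefer Java configuration over XML, a rough equivalent sized for the question's sequential requirement could look like this (a sketch, not part of the original answer; the class name ExecutorConfig is made up, and the pool size is fixed at 1 instead of using the placeholder):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

@Configuration
public class ExecutorConfig {

    @Bean
    public ThreadPoolTaskExecutor taskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(1);  // a single thread, so submitted tasks run one after another
        executor.setMaxPoolSize(1);
        executor.setWaitForTasksToCompleteOnShutdown(false);
        return executor;
    }
}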

The Singleton design pattern is not a very good fit for this. Since you're using Spring, it makes much more sense to make a component with @PostConstruct and @PreDestroy methods. Here and here are articles that explain this.
In your case, I'd do something like the following:
@Component
public class TaskQueue {

    private ExecutorService executorService;

    @PostConstruct
    public void init() {
        executorService = Executors.newSingleThreadExecutor();
    }

    @PreDestroy
    public void cleanup() {
        executorService.shutdown();
    }

    public void addTaskToQueue(Runnable task) {
        executorService.submit(task);
    }
}
And then you can autowire this component.
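For example, any Spring bean could then hand work to it (a minimal sketch; NotificationService and handleNotification are hypothetical names, not from the question):

@Service
public class NotificationService {

    @Autowired
    private TaskQueue taskQueue;

    public void handleNotification(String payload) {
        // hand the work off to the single-threaded queue; tasks run sequentially
        taskQueue.addTaskToQueue(() -> System.out.println(
                "Processing " + payload + " inside " + Thread.currentThread().getName()));
    }
}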
Edit: @Ivan's answer is to be preferred.

Related

Do I need to shutdown ExecutorService that processes data on a consistent basis in java

I have a Spring Boot application that runs tasks in separate threads every 24 hours. The tasks take a little bit of time, but after that the app becomes idle until the next batch triggers the API the next day. Demo code is as follows:
private ExecutorService es = Executors.newFixedThreadPool(2);

@PostMapping
public void startTest(@RequestBody DummyModel dummy) throws InterruptedException {
    int i = 0;
    while (i < 3) {
        es.execute(new ListProcessing(dummy));
        es.execute(new AnotherListProcessing(dummy));
        i++;
    }
    // because the methods are async, this line is reached in an instant, before processing is done
}
Now as you can see, I have no es.shutdown() after my while loop. In most articles and discussions, there seems to be an emphasis on how important a .shutdown() call is after you have completed your work. Adding it after my while loop would mean that the next POST request results in an error (which makes sense, since .shutdown() states that it will not accept new tasks after the existing tasks are completed).
Now, I want to know: is it really important to call .shutdown() here? My app will receive a POST request once a day, every day, so the ExecutorService will be used frequently. Are there downsides to not shutting down your executor for a prolonged period of time? And if I really do need to shut it down every time the ExecutorService was used, how can I do it so that the app is ready to receive a new request the following day?
I was thinking of adding these lines after my while loop:
es.shutdown();
es = Executors.newFixedThreadPool(2);
It works, but that seems:
a) unnecessary (why shut it down and waste effort recreating it), and
b) it just looks and feels wrong. There has to be a better way.
Update
So it seems that you can either create a custom ThreadPoolExecutor (which for my simple use case seems like overkill), or you can use the CachedThreadPool option of ExecutorService (though it will attempt to use as many threads as are available, so if you only need a fixed number of them, that option may not be for you).
Update 2
As explained by Thomas and Deinum, we can use a custom-defined executor. It does exactly what ExecutorService does, plus cleanup, and it also allows for a quick and easy way to configure it. For anyone curious, here is how I implemented it:
@Autowired
private TaskExecutor taskExecutor;

@PostMapping
public void startTest(@RequestBody DummyModel dummy) throws InterruptedException {
    taskExecutor.execute(new ProcessList());
    taskExecutor.execute(new AnotherProcessList());
    taskExecutor.execute(new YetAnotherProcessList());
}
where taskExecutor is a bean defined in my main class (or it can be defined in any class with the @Configuration annotation). It is as follows:
@Bean
public TaskExecutor taskExecutor() {
    ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
    executor.setCorePoolSize(5);   // minimum number of threads that are always kept around
    executor.setMaxPoolSize(10);   // if the core threads are busy and the queue is full, additional threads are created up to this limit
    executor.setQueueCapacity(5);  // number of tasks held in the queue (caution: queues require memory; larger queue = more memory)
    return executor;
}
Your proposed solution won't work reliably.
es.shutdown();
es = Executors.newFixedThreadPool(2);
This assumes that the method startTest is only ever invoked by one incoming request at a time, and that the next request always comes in after the executor has been shut down and replaced.
That solution would only work if you also created the ExecutorService inside the method, for the scope of the method. However, that is also problematic. If 100 requests come in, you will create 200 concurrent threads, and each thread takes up resources. So you have effectively created a potential resource leak (or at least an attack vector for your application).
As a general rule of thumb: if you create the executor yourself in the same scope, then you should close it; if not, leave it untouched. In your case you basically use a shared thread pool and should only do a shutdown on application stop. You could do that in a @PreDestroy method in your controller:
@PreDestroy
public void cleanUp() {
    es.shutdown();
}
However instead of adding this to your controller you could also define the ExecutorService as a bean and configure a destroy method.
@Bean(destroyMethod = "shutdown")
public ExecutorService taskExecutor() {
    return Executors.newFixedThreadPool(2);
}
You could now dependency inject the ExecutorService in your controller.
@RestController
public class YourController {

    private final ExecutorService es;

    public YourController(ExecutorService es) {
        this.es = es;
    }
}
Finally, I suspect you are even better off using the Spring (Boot) provided TaskExecutor, which taps into the Spring context lifecycle automatically. You can simply inject this into your controller instead of the ExecutorService.
@RestController
public class YourController {

    private final TaskExecutor executor;

    public YourController(TaskExecutor executor) {
        this.executor = executor;
    }
}
Spring Boot provides one by default, which will be injected; you can control it using the spring.task.execution.pool.* properties.
spring.task.execution.pool.max-size=10
spring.task.execution.pool.core-size=5
spring.task.execution.pool.queue-capacity=15
Or you could define a bean; this would override the default TaskExecutor as well.
@Bean
public ThreadPoolTaskExecutor taskExecutor() {
    ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
    executor.setCorePoolSize(5);
    executor.setMaxPoolSize(10);
    executor.setQueueCapacity(5);
    return executor;
}

Should I have a singleton ThreadPoolTaskExecutor shared between different services?

I have declared ThreadPoolTaskExecutor as a @Bean in my application context:
@Configuration
@ConfigurationProperties(prefix = "application")
@EnableCaching
public class ApplicationConfig {

    private static final int POOL_SIZE = 2;

    @Bean
    public ThreadPoolTaskExecutor taskExecutor() {
        ThreadPoolTaskExecutor pool = new ThreadPoolTaskExecutor();
        pool.setCorePoolSize(POOL_SIZE);
        return pool;
    }
}
I have 2 different services that each need an instance of ThreadPoolTaskExecutor wired in. Each service will submit a Runnable that does some service-specific job.
For example, these are the 2 services:
@Service
public class TerminatedContractsService {

    @Autowired
    private ThreadPoolTaskExecutor taskExec;

    public void notifyTerminatedContracts(Date d) {
        // do some contract specific work
        taskExec.submit(() -> System.out.println("emailing terminated contracts..."));
    }
}

@Service
public class SalaryCalculationService {

    @Autowired
    private ThreadPoolTaskExecutor taskExec;

    public void calculateSalary(Date d) {
        // do some salary related work
        taskExec.submit(() -> System.out.println("calculating salaries..."));
    }
}
It should be safe to share the same ThreadPoolTaskExecutor instance (since it's a singleton) for both services, right?
Do you foresee any issues with this, or should I use prototype scope instead?
Yes, it's ok for multiple services to use the same executor. There isn't any state kept by the executor that would make it a good idea to throw it away and create a new one.
There can be things to look out for. If you have tasks of varying duration that you submit to the same executor, short duration tasks can be blocked if they are queued up behind long running ones. You may want to make sure tasks submitted to an executor have similar durations.
Also if you have some category of task that you need to execute predictably and reliably you might want to reserve a dedicated executor for it. Otherwise if those tasks share a queue with others and there's an issue that prevents those tasks from completing or just slows them down, then the tasks you need executed reliably may be stuck queued up behind them.
But no, prototype scope shouldn't be necessary.
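If you do want to reserve a dedicated executor for one category of tasks, as described above, a minimal sketch could add a second bean and inject it by name (the bean name is made up; note that with two executor beans of the same type, the other injection points also need a @Qualifier, or one bean must be marked @Primary):

@Bean
public ThreadPoolTaskExecutor salaryTaskExecutor() {
    // dedicated pool so salary calculation is never queued behind other work
    ThreadPoolTaskExecutor pool = new ThreadPoolTaskExecutor();
    pool.setCorePoolSize(POOL_SIZE);
    return pool;
}

The service that needs the predictable behaviour then injects the dedicated pool:

@Service
public class SalaryCalculationService {

    @Autowired
    @Qualifier("salaryTaskExecutor")
    private ThreadPoolTaskExecutor taskExec;

    // calculateSalary(...) stays the same; its tasks now go to the dedicated pool
}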

java scheduler spring vs quartz

Currently I am building a Spring standalone program in order to learn new methods and architectures.
The last few days I have tried to learn about schedulers. I had never used them before, so I read some articles covering the different possible methods. Two of them are especially interesting: the Spring native @Scheduled annotation and Quartz.
From what I read, Spring is a little bit smaller than Quartz and much more basic. And Quartz is not easy to use with Spring (because of autowiring and components).
My problem now is, that there is one thing I do not understand:
From my understanding, both methods create parallel threads in order to run the jobs asynchronously. But what if I now have a Spring @Service in my main application that holds a HashMap with some information? The data is updated and changed with user interaction. In parallel, there is the scheduler. And a scheduler now wants to use this HashMap from the main application as well. Is this even possible?
Or do I understand something wrong? Because there is also the @Async annotation, and I did not understand the difference. A scheduler itself is already parallel to the main application, isn't it?
(Summing up, two questions:
Can a job that is executed every five seconds, implemented with a scheduler, use a HashMap out of a service inside the main program? (with Spring @Scheduled and/or with Quartz?)
Why is there an @Async annotation? Isn't a scheduler already parallel to the main process?
)
I have to make a few assumptions about which version of Spring you're using but as you're in the process of learning, I would assume that you're using spring-boot or a fairly new version, so please excuse if the annotations don't match your version of Spring. This said, to answer your two questions the best I can:
Can a job that is executed every five seconds, implemented with a scheduler, use a HashMap out of a service inside the main program? (with Spring @Scheduled and/or with Quartz?)
Yes, absolutely! The easiest way is to make sure that the HashMap in question is declared as static. To access the HashMap from the scheduled job, simply either autowire your service class or create a static getter for the HashMap.
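A minimal sketch of the plain Spring @Scheduled case asked about (class and field names are made up, and it assumes @EnableScheduling is active, as in the config shown below): the service holds the map, the job autowires the service, and a ConcurrentHashMap keeps access safe when the controller and the scheduler touch it from different threads.

@Service
public class DataService {

    private final Map<String, String> data = new ConcurrentHashMap<>();

    public void put(String key, String value) {
        data.put(key, value);  // updated from user interaction, e.g. a controller
    }

    public Map<String, String> getData() {
        return data;
    }
}

@Component
public class ReportJob {

    @Autowired
    private DataService dataService;

    @Scheduled(fixedDelay = 5000)
    public void report() {
        // runs every five seconds on the scheduler thread, reading the shared map
        System.out.println("Current entries: " + dataService.getData().size());
    }
}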
Here is an example of a recent Vaadin project where I needed a scheduled message sent to a set of subscribers.
SchedulerConfig.class
@Configuration
@EnableAsync
@EnableScheduling
public class SchedulerConfig {

    @Scheduled(fixedDelay = 5000)
    public void refreshVaadinUIs() {
        Broadcaster.broadcast(
            new BroadcastMessage(
                BroadcastMessageType.AUTO_REFRESH_LIST
            )
        );
    }
}
Broadcaster.class
public class Broadcaster implements Serializable {

    private static final long serialVersionUID = 3540459607283346649L;

    private static ExecutorService executorService = Executors.newSingleThreadExecutor();
    private static LinkedList<BroadcastListener> listeners = new LinkedList<BroadcastListener>();

    public interface BroadcastListener {
        void receiveBroadcast(BroadcastMessage message);
    }

    public static synchronized void register(BroadcastListener listener) {
        listeners.add(listener);
    }

    public static synchronized void unregister(BroadcastListener listener) {
        listeners.remove(listener);
    }

    public static synchronized void broadcast(final BroadcastMessage message) {
        for (final BroadcastListener listener : listeners)
            executorService.execute(new Runnable() {
                @Override
                public void run() {
                    listener.receiveBroadcast(message);
                }
            });
    }
}
Why is there an @Async annotation? Isn't a scheduler already parallel to the main process?
Yes, the scheduler is running in its own thread, but what happens to the scheduler on long-running tasks (e.g. a SOAP call to a remote server that takes a very long time to complete)?
The @Async annotation isn't required for scheduling, but if you have a long-running function being invoked by the scheduler, it becomes quite important.
This annotation is used to take a specific task and ask Spring's TaskExecutor to execute it on its own thread instead of the current thread. The @Async annotation causes the function to return immediately, with execution handed off to the TaskExecutor.
That said, without the @EnableAsync and @Async annotations, the functions you call will hold up the TaskScheduler, as they will be executed on the same thread. On a long-running operation, this would cause the scheduler to be held up and unable to execute any other scheduled functions until it returns.
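As a minimal sketch of that point (hypothetical names, assuming @EnableAsync is active as in SchedulerConfig above): the slow call is marked @Async on a separate bean, so the scheduler thread returns immediately.

@Service
public class RemoteCallService {

    @Async
    public void callSlowSoapService() {
        // long-running work, executed on Spring's TaskExecutor rather than the scheduler thread
    }
}

@Component
public class PollingJob {

    @Autowired
    private RemoteCallService remoteCallService;

    @Scheduled(fixedDelay = 5000)
    public void poll() {
        remoteCallService.callSlowSoapService();  // returns immediately
    }
}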
I would suggest a read of Spring's documentation about Task Execution and Scheduling. It provides a great explanation of the TaskScheduler and TaskExecutor in Spring.

Scheduling a Task in Spring MVC

In my Spring MVC application I need to schedule a task with a specific date and time. For example, I have to schedule sending an email, which will be configured dynamically by the customer. Spring has the @Scheduled annotation, but how can I change the value dynamically each time, with any date and time?
Any help is appreciated.
You should try TaskScheduler, see the javadoc here:
private TaskScheduler scheduler = new ConcurrentTaskScheduler();

@PostConstruct
private void executeJob() {
    scheduler.scheduleAtFixedRate(new Runnable() {
        @Override
        public void run() {
            // your business here
        }
    }, INTERVAL);
}
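For the customer-configured date and time, the same TaskScheduler also has a one-off schedule method taking a start time; a minimal sketch (scheduleEmail and sendEmail are hypothetical names, not from the answer):

public void scheduleEmail(Date customerChosenTime) {
    scheduler.schedule(new Runnable() {
        @Override
        public void run() {
            sendEmail();  // hypothetical method that does the actual work
        }
    }, customerChosenTime);
}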
Refer to Spring's Task Execution and Scheduling documentation.
Example Annotations
@Configuration
@EnableAsync
@EnableScheduling
public class MyComponent {

    @Async
    @Scheduled(fixedDelay = 5000, repeatCount = 0)
    public void doSomething() {
        // something that should execute periodically
    }
}
I think the repeatCount=0 will make the function execute only once (yet to test)
Full example with Quartz scheduler http://www.mkyong.com/spring/spring-quartz-scheduler-example/
You need to introduce XML configuration as follows
<task:annotation-driven executor="myExecutor" scheduler="myScheduler"/>
<task:executor id="myExecutor" pool-size="5"/>
<task:scheduler id="myScheduler" pool-size="10"/>
You can achieve this easily with the standard Java API, by scheduling the task with the time difference between the moment the task is created and the target date entered by the customer. Simply provide this difference as the delay parameter.
ScheduledThreadPoolExecutor
<V> ScheduledFuture<V> schedule(Callable<V> callable, long delay, TimeUnit unit)
    Creates and executes a ScheduledFuture that becomes enabled after the given delay.
ScheduledFuture<?> schedule(Runnable command, long delay, TimeUnit unit)
    Creates and executes a one-shot action that becomes enabled after the given delay.
So you either have to submit a Runnable or a Callable to this service.
You can refer to this answer for calculation between dates:
time difference
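A minimal sketch of that approach (the field and method names are made up; targetDate is the customer's chosen date):

private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

public ScheduledFuture<?> scheduleAt(Date targetDate, Runnable task) {
    // delay = difference between "now" and the customer's target date
    long delayMillis = targetDate.getTime() - System.currentTimeMillis();
    return scheduler.schedule(task, delayMillis, TimeUnit.MILLISECONDS);
}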

How to run a background job method at fixed intervals? [duplicate]

This question already has answers here:
How to run a background task in a servlet based web application?
(5 answers)
Closed 7 years ago.
I am using JSP/Servlet on Apache Tomcat. I have to run a method every 10 minutes. How can I achieve this?
As you're on Tomcat, which is just a barebones servlet container, you can't use EJB's @Schedule for this, which is what the Java EE specification recommends. Your best bet is then the ScheduledExecutorService from Java 1.5's java.util.concurrent package. You can trigger this with the help of a ServletContextListener as follows:
@WebListener
public class BackgroundJobManager implements ServletContextListener {

    private ScheduledExecutorService scheduler;

    @Override
    public void contextInitialized(ServletContextEvent event) {
        scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(new SomeTask(), 0, 10, TimeUnit.MINUTES);
    }

    @Override
    public void contextDestroyed(ServletContextEvent event) {
        scheduler.shutdownNow();
    }
}
where the SomeTask class looks like this:
public class SomeTask implements Runnable {

    @Override
    public void run() {
        // Do your job here.
    }
}
If you were actually using a real Java EE container with EJB support and all (like GlassFish, JBoss AS, TomEE, etc.), then you could use a @Singleton EJB with a @Schedule method. This way the container will worry about pooling and destroying threads itself. All you need is then the following EJB:
@Singleton
public class SomeTask {

    @Schedule(hour = "*", minute = "*/10", second = "0", persistent = false)
    public void run() {
        // Do your job here.
    }
}
Note that this way you can continue transparently using container-managed transactions the usual way (@PersistenceContext and so on), which isn't possible with ScheduledExecutorService: you'd have to manually obtain the entity manager and manually start/commit/end the transaction. But on a barebones servlet container like Tomcat you wouldn't have another option by default anyway.
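For completeness, a rough sketch of what that manual handling could look like inside such a task (the persistence unit name is made up, and in practice the EntityManagerFactory would be created once and reused rather than per run):

public class SomeTask implements Runnable {

    @Override
    public void run() {
        // no container-managed transaction here, so everything is obtained and managed by hand
        EntityManagerFactory emf = Persistence.createEntityManagerFactory("somePU");
        EntityManager em = emf.createEntityManager();
        try {
            em.getTransaction().begin();
            // do the JPA work here
            em.getTransaction().commit();
        } catch (RuntimeException e) {
            if (em.getTransaction().isActive()) {
                em.getTransaction().rollback();
            }
            throw e;
        } finally {
            em.close();
        }
    }
}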
Note that you should never use a Timer in a supposedly "lifetime long" running Java EE web application. It has the following major problems which make it unsuitable for use in Java EE (quoted from Java Concurrency in Practice):
Timer is sensitive to changes in the system clock, ScheduledExecutorService isn't.
Timer has only one execution thread, so long-running task can delay other tasks. ScheduledExecutorService can be configured with any number of threads.
Any runtime exceptions thrown in a TimerTask kill that one thread, thus making the Timer dead, i.e. scheduled tasks will not run anymore (until you restart the server). ScheduledThreadPoolExecutor not only catches runtime exceptions, it also lets you handle them if you want. The task which threw the exception will be canceled, but other tasks will continue to run (see the sketch after this list).
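Related to the last point (a small sketch, not part of the original answer): under scheduleAtFixedRate, an exception that escapes run() cancels that task's future executions, so a common pattern is to catch inside the task. Here scheduler is the ScheduledExecutorService from above and doWork() is a hypothetical method:

scheduler.scheduleAtFixedRate(() -> {
    try {
        doWork();  // hypothetical method doing the periodic job
    } catch (RuntimeException e) {
        e.printStackTrace();  // log and swallow so the schedule keeps running
    }
}, 0, 10, TimeUnit.MINUTES);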
Read up on ScheduledExecutorService; it has to be initiated by a ServletContextListener:
public class MyContext implements ServletContextListener {

    private ScheduledExecutorService sched;

    @Override
    public void contextInitialized(ServletContextEvent event) {
        sched = Executors.newSingleThreadScheduledExecutor();
        sched.scheduleAtFixedRate(new MyTask(), 0, 10, TimeUnit.MINUTES);
    }

    @Override
    public void contextDestroyed(ServletContextEvent event) {
        sched.shutdownNow();
    }
}
Also, you may try using the Java Timer from a ServletContextListener, but it's not recommended in a Java EE container since it takes away control of the thread resources from the container. (The first option with ScheduledExecutorService is the way to go.)
Timer timer = new Timer("MyTimer");
MyTask t = new MyTask();
// The second parameter specifies the starting time for the timer,
// in milliseconds or as a Date.
// The third parameter specifies the period between consecutive
// invocations of the method.
timer.schedule(t, 0, 1000 * 60 * 10);
And MyTask is a class that extends TimerTask, which implements the Runnable interface, so you have to override the run method with your code:
class MyTask extends TimerTask {

    public void run() {
        // your code here
    }
}
