Singleton and multithreading in Spring Boot: is it really multithreaded? - java

Since I am not working specifically with multithreading, these questions may be basic or even silly, please excuse me =)
My call flow looks like this:
MessageNotificationJobExecutionConfig -> AsyncMessageNotificationJobExecutor -> NotificationJobExecutor.execute()
MessageNotificationJobExecutionConfig finds the objects to process and calls AsyncMessageNotificationJobExecutor inside a loop.
AsyncMessageNotificationJobExecutor has the @Async("messageNotificationTaskExecutor") annotation on its execute() method.
AsyncMessageNotificationJobExecutor.execute() calls NotificationJobExecutor.execute().
messageNotificationTaskExecutor is an instance of ThreadPoolTaskExecutor.
Here is my question:
If I am not wrong, NotificationJobExecutor is a singleton by default.
Even though AsyncMessageNotificationJobExecutor works asynchronously and uses the thread pool task executor, all threads call the same NotificationJobExecutor instance (singleton).
I am not sure, but my understanding is that Thread_1 calls NotificationJobExecutor.execute() and, until that thread finishes its job, the other threads wait for Thread_1. Is my inference correct?
I think that even though it looks multithreaded, it actually works on a single instance.
@Component("messageNotificationTaskExecutor")
public class MessageNotificationThreadPoolTaskExecutor extends ThreadPoolTaskExecutor {

    @Value("${message.notification.task.executor.corePoolSize}")
    Integer corePoolSize;

    @Value("${message.notification.task.executor.maxPoolSize}")
    Integer maxPoolSize;

    @Value("${message.notification.task.executor.queueCapacity}")
    Integer queueCapacity;

    public MessageNotificationThreadPoolTaskExecutor() {
        super();
    }

    @PostConstruct
    public void init() {
        super.setCorePoolSize(corePoolSize);
        super.setMaxPoolSize(maxPoolSize);
        super.setQueueCapacity(queueCapacity);
    }
}
@Configuration
public class MessageNotificationJobExecutionConfig {

    protected Logger log = LoggerFactory.getLogger(getClass());

    @Autowired
    AsyncMessageNotificationJobExecutor asyncMessageNotificationJobExecutor;

    @Autowired
    MessageNotificationThreadPoolTaskExecutor threadPoolTaskExecutor;

    @Autowired
    JobExecutionRouter jobExecutionRouter;

    @Autowired
    NotificationJobService notificationJobService;

    private Integer operationType = OperationType.ACCOUNT_NOTIFICATION.getValue();

    @Scheduled(cron = "${message.notification.scheduler.cronexpression}")
    public void executePendingJobs() {
        List<NotificationJob> nextNotificationJobList = notificationJobService.findNextJobForExecution(operationType, 10);
        for (NotificationJob nextNotificationJob : nextNotificationJobList) {
            if (threadPoolTaskExecutor.getActiveCount() < threadPoolTaskExecutor.getMaxPoolSize()) {
                asyncMessageNotificationJobExecutor.execute(nextNotificationJob);
            }
        }
    }
}
@Service
public class AsyncMessageNotificationJobExecutor {

    @Autowired
    NotificationJobExecutor notificationJobExecutor;

    @Autowired
    NotificationJobService notificationJobService;

    @Async("messageNotificationTaskExecutor")
    public void execute(NotificationJob notificationJob) {
        notificationJobExecutor.execute(notificationJob);
    }
}
@Component
public class NotificationJobExecutor implements JobExecutor {

    @Override
    public Integer getOperationType() {
        return OperationType.ACCOUNT_NOTIFICATION.getValue();
    }

    @Override
    public String getOperationTypeAsString() {
        return OperationType.ACCOUNT_NOTIFICATION.name();
    }

    @Override
    public void execute(NotificationJob notificationJob) {
        // TODO: 20.08.2020 will be executed
    }
}

In the scenario you created, all your beans are singleton instances. But the flow looks something like this:
1. A call to executePendingJobs in MessageNotificationJobExecutionConfig.
2. Iterate over each NotificationJob sequentially (so this part waits).
3. A call to execute on AsyncMessageNotificationJobExecutor, which sequentially (thus blocking) adds a task to the messageNotificationTaskExecutor thread pool.
4. The task created in step 3 runs in a separate thread (this is what actually executes your method in AsyncMessageNotificationJobExecutor).
5. A blocking call to the execute method of NotificationJobExecutor.
The 'magic' happens in step 3: rather than executing the method, Spring adds a task to the messageNotificationTaskExecutor which wraps the call in step 4. This makes step 4 happen asynchronously, so multiple threads can be inside the same NotificationJobExecutor instance at the same time; they do not wait for one another. So make sure this object is stateless.
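To illustrate the statelessness point, here is a minimal, hypothetical sketch (the commented-out currentJob field and the helper send() are invented for illustration, and the other JobExecutor methods are omitted): per-call data kept in parameters and local variables is safe under concurrent calls, while a shared mutable field would be a race.
@Component
public class NotificationJobExecutor implements JobExecutor {

    // UNSAFE (hypothetical, not in your code): every pool thread would read and overwrite
    // the same reference, so data from one notification could leak into another
    // private NotificationJob currentJob;

    @Override
    public void execute(NotificationJob notificationJob) {
        // SAFE: everything one job needs lives in parameters and local variables,
        // which each thread keeps on its own stack, so concurrent calls don't interfere
        String payload = "notification-" + notificationJob; // placeholder logic for the sketch
        send(payload);
    }

    private void send(String payload) {
        // placeholder for the actual sending logic
    }

    // getOperationType() / getOperationTypeAsString() omitted for brevity
}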

Related

How to best implement a multi-threaded approach in Spring Boot

I have been struggling with implementing a multi-threaded approach in the application I am working on.
The part I want to run in parallel threads was originally written as a for loop iterating over a list.
@Service
public class ApplicationServiceImpl implements ApplicationService {

    @Override
    public ResponseEntity<Void> startProcess(List<MyObject> myObjectList) throws Exception {
        for (MyObject myObject : myObjectList) {
            AnotherTypeOfObject anotherTypeOfObject = runMethodA(myObject);
            YetAnotherTypeOfObject yetAnotherTypeOfObject = runMethodB(anotherTypeOfObject);
            runMethodC(yetAnotherTypeOfObject, aStringValue, anotherStringValue);
            runMethodD(yetAnotherTypeOfObject);
        }
    }
}
The methods private AnotherTypeOfObject runMethodA(MyObject myObject) {...}, private YetAnotherTypeOfObject runMethodB(AnotherTypeOfObject anotherTypeOfObject) {...}, private void runMethodC(YetAnotherTypeOfObject yetAnotherTypeOfObject, String aStringValue, String anotherStringValue) {...} and private void runMethodD(MyObject myObject) {...} only use local variables.
I have looked around quite a bit for a solution that would fire threads for a list of hundreds of MyObject instead of processing them one after the other.
What I have done is create a:
@Configuration
@EnableAsync
public class AsyncConfiguration {

    @Bean(name = "threadPoolTaskExecutor")
    public Executor aSyncExecutor() {
        final ThreadPoolTaskExecutor threadPoolTaskExecutor = new ThreadPoolTaskExecutor();
        threadPoolTaskExecutor.setCorePoolSize(4);
        threadPoolTaskExecutor.setMaxPoolSize(4);
        threadPoolTaskExecutor.setQueueCapacity(50);
        threadPoolTaskExecutor.setThreadNamePrefix("threadNamePrefix");
        threadPoolTaskExecutor.initialize();
        return threadPoolTaskExecutor;
    }
}
I do have plenty of log.info("some recognizable text") calls throughout methods A, B, C and D so I can see what is going on, and I aggregated these methods into one like:
private void runThreads(MyObject myObject, String aStringValue, String anotherStringValue) {
    AnotherTypeOfObject anotherTypeOfObject = runMethodA(myObject);
    YetAnotherTypeOfObject yetAnotherTypeOfObject = runMethodB(anotherTypeOfObject);
    runMethodC(yetAnotherTypeOfObject, aStringValue, anotherStringValue);
    runMethodD(yetAnotherTypeOfObject);
}
And I have tried to run the main method as:
@Override
@Async("threadPoolTaskExecutor")
public ResponseEntity<Void> startProcess(List<MyObject> myObjectList) throws Exception {
    String aStringValue = myObject.getAStringValue();
    String anotherStringValue = myObject.getAnotherStringValue();
    myObjectList.forEach(myObject -> runThreads(myObject, aStringValue, anotherStringValue));
}
I still don't get the intended result of several threads being fired for the runThreads(...) method so that the processing is done in parallel.
What am I missing here?
Actually you are not parallelising the for loop, but the method that executes the for loop. A single thread would still execute the whole loop.
You need to put the @Async annotation on top of runThreads() instead, as sketched below.
It's also not recommended to create the executor with a static configuration. Consider using the CompletableFuture API:
https://www.baeldung.com/java-completablefuture
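A minimal sketch of that suggestion, assuming a hypothetical MyObjectProcessor bean (the bean name and the return value are invented for illustration; the private runMethodA..D helpers would have to move into it): the per-item method carries @Async and lives in a different bean than the caller, so Spring's proxy can intercept each call and hand it to the threadPoolTaskExecutor.
@Service
public class MyObjectProcessor {

    // each call is submitted to the "threadPoolTaskExecutor" pool instead of running
    // on the caller's thread, so items are processed in parallel
    @Async("threadPoolTaskExecutor")
    public void runThreads(MyObject myObject, String aStringValue, String anotherStringValue) {
        // same A -> B -> C -> D steps as before, for a single item
    }
}

@Service
public class ApplicationServiceImpl implements ApplicationService {

    @Autowired
    private MyObjectProcessor myObjectProcessor;

    @Override
    public ResponseEntity<Void> startProcess(List<MyObject> myObjectList) {
        // the loop stays on the calling thread; each iteration only enqueues work
        myObjectList.forEach(myObject ->
                myObjectProcessor.runThreads(myObject, myObject.getAStringValue(), myObject.getAnotherStringValue()));
        return ResponseEntity.accepted().build();
    }
}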
If it's only for running all elements of a collection in parallel, then you can use Stream.parallel(). It uses a default ForkJoinPool with a thread per CPU core. This is the simplest method introduced in Java 8.
myObjectList.stream()
.parallel()
.forEach(myObject -> runThreads(myObject, myObject.getAStringValue(), myObject.getAnotherStringValue()));
For this you don't need any #Async or Spring-provided Executor.
You can use a custom ForkJoinPool to customize the number of threads, but the default might work well, too.
ForkJoinPool customThreadPool = new ForkJoinPool(4);
customThreadPool.submit(() -> myObjectList.stream()
        .parallel()
        .forEach(myObject -> runThreads(myObject, myObject.getAStringValue(), myObject.getAnotherStringValue())))
        .join(); // wait for completion; the parallel stream runs inside the custom pool

EJB Singleton - Lock READ method calling a Lock WRITE method of the same instance

Given a Singleton like this one:
@Singleton
public class WaitingTimeManager {

    private Map<Integer, Object> waitingTimes;

    @PostConstruct
    public void setup() {
        waitingTimes = new HashMap<>();
    }

    @Lock(LockType.READ)
    public boolean shouldWeWait(Integer id) {
        if (waitingTimes.containsKey(id)) {
            boolean wait = someLogic(waitingTimes.get(id));
            if (!wait) {
                // we don't need to wait for it anymore
                stopWaiting(id);
            }
            return wait;
        }
        return false;
    }

    @Lock(LockType.WRITE)
    public void stopWaiting(Integer id) {
        waitingTimes.remove(id);
    }
}
The first method, shouldWeWait, can be accessed by several threads at the same time. The other one, stopWaiting, needs to acquire a write lock.
Will the call to stopWaiting inside shouldWeWait try to acquire a WRITE lock, or will it simply execute because the caller already holds the READ lock?
No, it won't try to get the write lock.
The container's work is done in interceptors that wrap EJB business method calls. For example, when a stateless BeanA calls your singleton, it does so through a proxy, which is what makes the container's guarantees (acquiring the lock, etc.) possible.
But in this case the call to stopWaiting is just a plain local method call, not wrapped by the proxy, so there is no place for that magic to happen.
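For contrast, a call arriving from another bean does go through the container proxy, so the WRITE lock is acquired as declared. A minimal sketch, assuming a hypothetical client bean:
@Stateless
public class SomeClient {

    @EJB
    private WaitingTimeManager waitingTimeManager; // injected container proxy

    public void cleanup(Integer id) {
        // this call goes through the proxy, so the container acquires the WRITE lock first
        waitingTimeManager.stopWaiting(id);
    }
}
Note that you cannot simply route the internal call back through the proxy to upgrade the lock: a loopback call from a READ-locked method into a WRITE-locked method of the same singleton is rejected by the container (javax.ejb.IllegalLoopbackException).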

How to run two asynchronous tasks synchronously

I am new to the multithreading concept in Java (Spring Boot) and have a scenario to solve. There is a function in which two asynchronous functions are called. I want their execution to happen one after the other (synchronously), e.g.:
public void func() {
    call1();
    call2();
}

@Async
public void call1() {}

@Async
public void call2() {}
Can anyone please suggest a way to achieve this?
Thanks
I'm not exactly sure what the motivation is here, but from what I understand, the objective seems to be that you don't want to block the main thread (the thread executing func()) while still getting serial execution of call1() and call2(). If that's what you want, you could make call1() and call2() synchronous (i.e. remove the @Async annotation) and add a third, asynchronous method (callWrapper(), perhaps) that invokes call1() and call2() serially, as sketched below.
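A minimal sketch of that idea, assuming the methods live in a bean separate from the caller so the @Async proxy applies (the bean and method names are invented for illustration):
@Component
class CallRunner {

    // runs on a pool thread, so the caller of func() is not blocked
    @Async
    public void callWrapper() {
        call1(); // plain synchronous call
        call2(); // runs only after call1() has finished
    }

    public void call1() { /* ... */ }

    public void call2() { /* ... */ }
}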
You can wait on @Async methods if you change them to return a Future. For example like this:
@Component
class AsyncStuff {

    @Async
    public ListenableFuture<?> call1() {
        /* do things */
        return AsyncResult.forValue(null);
    }

    @Async
    public ListenableFuture<?> call2() {
        /* do other things */
        return AsyncResult.forValue(null);
    }
}

@Component
class User {

    @Autowired
    AsyncStuff asyncStuff; // @Async methods work only when they are in a different class

    public void use() throws InterruptedException, ExecutionException {
        asyncStuff
                .call1() // starts this execution in another thread
                .get();  // lets this thread wait for the other thread
        asyncStuff
                .call2() // now start the second thing
                .get();  // and wait again
    }
}
But this is guaranteed to be slower than simply doing it all without async, because all it adds is the overhead of moving execution between threads. Instead of waiting for other threads, the calling thread could simply execute the code itself in that time.

Different taskScheduler for different tasks

I'm using Spring and I have several @Scheduled classes in my application:
@Component
public class CheckHealthTask {

    @Scheduled(fixedDelay = 10_000)
    public void checkHealth() {
        // stuff inside
    }
}

@Component
public class ReconnectTask {

    @Scheduled(fixedDelay = 1200_000)
    public void run() {
        // stuff here
    }
}
I want the first task to use a pool of 2 threads, while the second uses a single thread. I don't want the second task to get stuck because the first one is using all available threads and its computation takes longer than the fixedDelay time.
Of course this is just an example to give you the idea.
I could use a configuration class like this:
@Configuration
@EnableScheduling
public class TaskConfig implements SchedulingConfigurer {

    @Override
    public void configureTasks(ScheduledTaskRegistrar taskRegistrar) {
        taskRegistrar.setScheduler(taskScheduler());
    }

    @Bean
    public Executor taskScheduler() {
        ThreadPoolTaskScheduler t = new ThreadPoolTaskScheduler();
        t.setPoolSize(2);
        t.setThreadNamePrefix("taskScheduler - ");
        t.initialize();
        return t;
    }
}
I don't understand how to define a different configuration for each @Scheduled component, though.
The first task does not require a pool of 2 threads.
Different tasks do not need to be assigned to different pools if they all use fixed delays. fixedDelay works as follows:
@Scheduled(fixedDelay = 5000)
public void doSomething() {
    // something that should execute periodically
}
This would be invoked every 5 seconds with a fixed delay, meaning that the period is measured from the completion time of each preceding invocation.
Each task only uses one thread at a time, so if your scheduler has two threads, one task cannot hold up the thread needed by the other task.

Java @Asynchronous methods: not running async

I am trying to get an asynchronous process running.
It is based on this example: http://tomee.apache.org/examples-trunk/async-methods/README.html
But the method addWorkflow(Workflow workflow) only returns once the code in run(Workflow workflow) has fully completed.
Then, when it returns and result.get() is called, I get the exception:
Caused by: java.lang.IllegalStateException: Object does not represent an acutal Future
Any suggestion what I'm missing?
@Singleton
public class WorkflowProcessor {

    @EJB
    private WorkflowManager workflowManager;

    private final static Logger log = Logger.getLogger(WorkflowProcessor.class.getName());

    public void runWorkflows(Collection<Workflow> workflows) throws Exception {
        final long start = System.nanoTime();
        final long numberOfWorkflows = workflows.size();
        Collection<Future<Workflow>> asyncWorkflows = new ArrayList<>();
        for (Workflow workflow : workflows) {
            Future<Workflow> w = addWorkflow(workflow);
            asyncWorkflows.add(w);
        }
        log.log(Level.INFO, "workflow jobs added {0}", new Object[]{numberOfWorkflows});
        for (Future<Workflow> result : asyncWorkflows) {
            result.get();
        }
        final long total = TimeUnit.NANOSECONDS.toSeconds(System.nanoTime() - start);
        log.log(Level.INFO, "WorkflowProcessor->runWorkflows {0} workflows completed in:{1}", new Object[]{numberOfWorkflows, total});
    }

    @Asynchronous
    @Lock(LockType.READ)
    @AccessTimeout(-1)
    private Future<Workflow> addWorkflow(Workflow workflow) {
        run(workflow);
        return new AsyncResult<Workflow>(workflow);
    }

    private void run(Workflow workflow) {
        this.workflowManager.runWorkflow(workflow);
    }
}
So the normal way would be to have the @Asynchronous method in a different bean from the calling method.
@Stateless
public class ComputationProcessor {

    @Asynchronous
    public Future<Data> performComputation() {
        return new AsyncResult<Data>(null);
    }
}

@Stateless
public class ComputationService {

    @Inject
    private ComputationProcessor mProcessor;

    public void ...() {
        Future<Data> result = mProcessor.performComputation();
        ...
    }
}
As you discovered, it won't work if the @Asynchronous method is in the same bean as the caller.
The issue is that the container can't decorate the implicit this reference.
In other words, the @Asynchronous annotation is never processed and you're just making an ordinary method call.
You can inject your singleton with a reference to itself (call it e.g. "self") and then call self.addWorkflow(...), so the call goes through the container proxy; note that addWorkflow would also have to be public, since a private method is not a business method and can't be intercepted.
You might also want to consider running your async code in a stateless bean. You are using a read lock for addWorkflow, but runWorkflows still has the default write lock, so I think you have a deadlock now: you're holding the lock until the work is done, but no work can be done until the write lock is released.
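A rough sketch of the self-injection idea under those assumptions (the field name self is invented, addWorkflow is made public, logging is omitted, and the class is marked @Lock(LockType.READ) so the caller's lock does not block the async calls):
@Singleton
@Lock(LockType.READ) // avoid the default WRITE lock, otherwise runWorkflows would block the async calls
public class WorkflowProcessor {

    @EJB
    private WorkflowProcessor self; // container proxy to this same singleton

    @EJB
    private WorkflowManager workflowManager;

    public void runWorkflows(Collection<Workflow> workflows) throws Exception {
        Collection<Future<Workflow>> asyncWorkflows = new ArrayList<>();
        for (Workflow workflow : workflows) {
            // goes through the proxy, so @Asynchronous is honoured
            asyncWorkflows.add(self.addWorkflow(workflow));
        }
        for (Future<Workflow> result : asyncWorkflows) {
            result.get(); // wait for all workflows to finish
        }
    }

    @Asynchronous
    @AccessTimeout(-1)
    public Future<Workflow> addWorkflow(Workflow workflow) { // public, so it is a business method
        workflowManager.runWorkflow(workflow);
        return new AsyncResult<>(workflow);
    }
}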
