Spring thread safety static utils - java

I was wondering whether the following scenario is thread-safe.
I have a Spring controller with this method:
@Autowired
private JobService jobService;

public String launch(@ModelAttribute("profile") Profile profile) {
    JobParameters jobParams = MyUtils.transform(profile);
    jobService.launch(profile.getJobName(), jobParams);
    return "job";
}
and I have a MyUtils class with a static method that transforms one kind of object into another, like so:
public class MyUtils {

    public static JobParameters transform(Profile profile) {
        JobParametersBuilder jpb = new JobParametersBuilder();
        jpb.addString("profile.name", profile.getProfileName());
        jpb.addString("profile.number", String.valueOf(profile.getNumber()));
        return jpb.toJobParameters();
    }
}
The classes JobParametersBuilder, JobParameters and JobService come from the Spring Batch core project. Profile is a simple POJO.
The question really is: is the static transform method thread-safe, given that it deals with object instances, even though all of those instances are created locally inside the method?

This concrete code IS thread-safe, provided a few conditions are met. Here is the explanation:
The launch method is called from a Spring Boot controller. Every request that reaches the controller is handled on a different thread, and that thread carries the execution to the end of the call stack (unless you make an asynchronous call inside it). By default, Tomcat can handle 200 threads at the same time, which means you can have 200 concurrent calls to your controllers, each on its own thread. Consequently, launch is thread-safe.
Profile is passed to the transform method, and if it is a simple POJO it is thread-safe, because every call gets a new instance of Profile.
Inside the transform method you instantiate a new JobParametersBuilder on every call, which is thread-safe as long as the code inside toJobParameters is thread-safe and does not keep any static or otherwise shared state.
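For contrast, here is a minimal sketch (the SHARED_BUILDER field and the transformUnsafe/transformSafe names are purely illustrative, not part of the original code) of the kind of change that would break thread safety: sharing one mutable builder across calls instead of creating it locally.

public class MyUtils {

    // NOT thread-safe: one builder instance is shared by all request threads, so
    // concurrent calls interleave their addString(...) calls and mix up parameters.
    private static final JobParametersBuilder SHARED_BUILDER = new JobParametersBuilder();

    public static JobParameters transformUnsafe(Profile profile) {
        SHARED_BUILDER.addString("profile.name", profile.getProfileName());
        return SHARED_BUILDER.toJobParameters();
    }

    // Thread-safe: each call works only on its own local builder, as in the original code.
    public static JobParameters transformSafe(Profile profile) {
        JobParametersBuilder jpb = new JobParametersBuilder();
        jpb.addString("profile.name", profile.getProfileName());
        return jpb.toJobParameters();
    }
}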

Related

java scheduler spring vs quartz

Currently I am building a standalone Spring program in order to learn new methods and architectures.
Over the last few days I have tried to learn about schedulers. I had never used them before, so I read some articles covering the different possible approaches. Two of them are especially interesting: the native Spring scheduler (@Scheduled) and Quartz.
From what I read, the Spring scheduler is a little bit smaller than Quartz and much more basic, and Quartz is not easy to use together with Spring (because of autowiring and components).
My problem now is that there is one thing I do not understand:
From my understanding, both approaches create parallel threads in order to run the jobs asynchronously. But what if I have a Spring @Service in my main application that holds a HashMap with some information? The data is updated and changed through user interaction. In parallel there are the schedulers, and a scheduled job now wants to use this HashMap from the main application as well. Is this even possible?
Or am I misunderstanding something? There is also the @Async annotation and I did not understand the difference, because a scheduler itself already runs in parallel to the main application, doesn't it?
Summing up, two questions:
Can a job that is executed every five seconds, implemented with a scheduler, use a HashMap out of a service inside the main program? (with Spring's @Scheduled and/or with Quartz?)
Why is there an @Async annotation? Isn't a scheduler already parallel to the main process?
I have to make a few assumptions about which version of Spring you're using, but as you're in the process of learning, I assume you're on Spring Boot or a fairly recent version, so please excuse me if the annotations don't match your version of Spring. That said, to answer your two questions as best I can:
Can a job that is executed every five seconds, implemented with a scheduler, use a HashMap out of a service inside the main program? (with Spring's @Scheduled and/or with Quartz?)
Yes, absolutely! The easiest way is to make sure that the HashMap in question is declared static. To access it from the scheduled job, simply either autowire your service class or create a static getter for the map.
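A minimal sketch of that idea (the class and method names are illustrative; @EnableScheduling is assumed on a configuration class, and a ConcurrentHashMap is used here as one way to keep concurrent reads and writes safe):

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Service;

@Service
public class InfoService {

    // Shared map: updated by user interaction, read by the scheduled job below.
    private static final Map<String, String> INFO = new ConcurrentHashMap<>();

    public static Map<String, String> getInfo() {
        return INFO;
    }

    public void update(String key, String value) {
        INFO.put(key, value);
    }

    // Runs every five seconds on the scheduler thread and sees the same map.
    @Scheduled(fixedDelay = 5000)
    public void report() {
        System.out.println("Current entries: " + INFO.size());
    }
}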
Here is an example of a recent Vaadin project where I needed a scheduled message sent to a set of subscribers.
SchedulerConfig.class
@Configuration
@EnableAsync
@EnableScheduling
public class SchedulerConfig {

    @Scheduled(fixedDelay = 5000)
    public void refreshVaadinUIs() {
        Broadcaster.broadcast(
            new BroadcastMessage(BroadcastMessageType.AUTO_REFRESH_LIST)
        );
    }
}
Broadcaster.class
public class Broadcaster implements Serializable {

    private static final long serialVersionUID = 3540459607283346649L;

    private static ExecutorService executorService = Executors.newSingleThreadExecutor();
    private static LinkedList<BroadcastListener> listeners = new LinkedList<BroadcastListener>();

    public interface BroadcastListener {
        void receiveBroadcast(BroadcastMessage message);
    }

    public static synchronized void register(BroadcastListener listener) {
        listeners.add(listener);
    }

    public static synchronized void unregister(BroadcastListener listener) {
        listeners.remove(listener);
    }

    public static synchronized void broadcast(final BroadcastMessage message) {
        for (final BroadcastListener listener : listeners) {
            executorService.execute(new Runnable() {
                @Override
                public void run() {
                    listener.receiveBroadcast(message);
                }
            });
        }
    }
}
Why is there an @Async annotation? Isn't a scheduler already parallel to the main process?
Yes, the scheduler runs in its own thread, but what happens to the scheduler on long-running tasks (e.g. a SOAP call to a remote server that takes a very long time to complete)?
The @Async annotation isn't required for scheduling, but if the scheduler invokes a long-running function, it becomes quite important.
The annotation hands a specific task to Spring's TaskExecutor so that it is executed on its own thread instead of the current one. A call to an @Async method returns immediately, and the actual execution is performed later by the TaskExecutor.
That said, without the @EnableAsync and @Async annotations, the functions you call will hold up the TaskScheduler, because they are executed on the same thread. On a long-running operation this would block the scheduler and keep it from executing any other scheduled functions until the call returns.
I would suggest reading Spring's documentation on Task Execution and Scheduling; it provides a great explanation of the TaskScheduler and TaskExecutor abstractions in Spring.
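A minimal sketch of the difference (ReportJobs, SoapGateway and their methods are illustrative; @EnableAsync and @EnableScheduling are assumed on a configuration class, as in the SchedulerConfig above):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.scheduling.annotation.Async;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class ReportJobs {

    @Autowired
    private SoapGateway gateway; // hypothetical long-running collaborator

    @Scheduled(fixedDelay = 5000)
    public void trigger() {
        // Returns almost immediately: the heavy work runs on a TaskExecutor thread.
        gateway.callRemoteService();
    }
}

@Component
class SoapGateway {

    // Without @Async (and @EnableAsync), this call would run on the scheduler thread
    // and block the next scheduled executions until it finishes.
    @Async
    public void callRemoteService() {
        // ... long-running SOAP call ...
    }
}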

Thread safe struts web app with spring

In a Struts 2 and Spring web-based application, please consider the sample below.
The BookManager has an action which returns a Map of books to the client. It gets the map from the service layer, which is injected by Spring.
public class BookManager extends ActionSupport {

    // with setter and getter
    private Map<String, BookVO> books;

    @Inject
    BookService bookservice;

    @Action("book-form")
    public String form() {
        setBooks(bookservice.getAllBooks());
        return SUCCESS;
    }
}
The service layer gets the book list from the DB and returns a Map.
@Named
public class BookService {

    private Map<String, BookVO> books;

    public Map<String, BookVO> getAllBooks() {
        books = new HashMap<String, BookVO>();
        // fill books from DB
        return books;
    }
}
I have tested this and found that the above implementation is not thread-safe.
I can make the code thread-safe by removing the private field books from BookService and using a method-local variable instead: Map<String, BookVO> books = new HashMap<String, BookVO>();. Why does this change make the code thread-safe?
The Struts action is thread-safe; shouldn't that ensure that even a non-thread-safe Spring service runs in a thread-safe manner?
If I use the non-thread-safe version of the service in my action, by creating a new service object instead of using Spring injection, I face no issue. Why? If the service is not thread-safe, why is creating a new instance and calling it thread-safe?
I can make the code thread-safe by removing the private field books from BookService and using a method-local variable instead. Why does this change make the code thread-safe?
Because method-local variables are thread-safe (each thread gets its own copy on its own stack), while instance fields are shared between threads and are not.
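A minimal sketch of that change, i.e. the BookService from the question with the shared field replaced by a method-local variable:

@Named
public class BookService {

    public Map<String, BookVO> getAllBooks() {
        // Local variable: every calling thread gets its own map on its own stack,
        // so concurrent calls can no longer see or overwrite each other's data.
        Map<String, BookVO> books = new HashMap<String, BookVO>();
        // fill books from DB
        return books;
    }
}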
The Struts action is thread-safe; shouldn't that ensure that even a non-thread-safe Spring service runs in a thread-safe manner?
Nope. It depends.
If I use the non-thread-safe version of the service in my action, by creating a new service object instead of using Spring injection, I face no issue. Why? If the service is not thread-safe, why is creating a new instance and calling it thread-safe?
If you instantiate it manually in the action, you are creating an instance of that object that is private to that action; it is thread-safe because actions are thread-local, and it is managed by you (which means that if your BookService class has some @Inject in it, the container won't resolve them).
If instead the DI is managed by the container, the instance is not thread-safe; what you're using (@Inject, @Named) is more than "Spring": it is Java EE, an implementation of JSR-330 (Dependency Injection) available only in CDI-enabled applications (JSR-299).
CDI beans are not thread-safe. You would need EJB3's @Singleton for this to be thread-safe, but you really don't need to retain that attribute at class level, since it's used only to be returned and is then left there to be overwritten the next time.
By the way, consider using the reference CDI implementation (Weld, in JBoss) with the Struts2 CDI plugin; it's worth a try.

Pass request scope data to async methods in CDI

A Java EE 7 application is running on WildFly 9.0.2.Final. There is a problem accessing request-scoped data from within @Asynchronous methods.
In a web filter, data (e.g. a token) is set into a @RequestScoped CDI bean. Later we want to access this data. Everything works fine as long as we stay in one thread, but as soon as code needs to run asynchronously the problem appears: CDI injects an empty bean and the request data is lost.
Here is the example:
@RequestScoped
public class CurrentUserService implements Serializable {
    public String token;
}

@Stateless
public class Service {

    @Inject
    private RestClient client;

    @Resource
    private ManagedExecutorService executorService;

    @Resource
    private ContextService contextService;

    @Asynchronous
    private <T> Future<T> getFuture(Supplier<T> supplier) {
        Callable<T> task = supplier::get;
        Callable<T> callable = contextService.createContextualProxy(task, Callable.class);
        return executorService.submit(callable);
    }

    public String getToken() throws Exception {
        return getFuture(client::getToken).get();
    }
}

@ApplicationScoped
public class RestClient {

    @Inject
    private CurrentUserService currentUserBean;

    public String getToken() {
        return currentUserBean.token;
    }
}
In the given example we want to access the current user's token (CurrentUserService#token) from the asynchronous Service.getToken method. As a result we receive null.
My expectation is that request-scoped data should be accessible from tasks started within the request scope; something like an InheritableThreadLocal could be used to allow access to the original thread's data from new threads.
Is this a bug? Maybe I'm doing something wrong? If so, what is the correct way to propagate such data into async calls?
Thanks in advance.
According to §2.3.2.1 of the Java EE Concurrency Utilities specification, you should not attempt to do this:
Tasks that are submitted to a managed instance of ExecutorService may still be running after the lifecycle of the submitting component. Therefore, CDI beans with a scope of #RequestScoped, #SessionScoped, or #ConversationScoped are not recommended to use as tasks as it cannot be guaranteed that the tasks will complete before the CDI context is destroyed.
You need to collect your request-scoped data and pass it to your asynchronous task when you create it, whether you use the concurrency utilities or @Asynchronous methods.
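A minimal sketch of that approach, reusing the beans from the question (getTokenAsync and doSomethingWith are illustrative names, not part of the original code): read the value on the request thread and hand only the plain value to the task.

@Stateless
public class Service {

    @Inject
    private CurrentUserService currentUserBean; // the @RequestScoped bean from the question

    @Resource
    private ManagedExecutorService executorService;

    public Future<String> getTokenAsync() {
        // Capture the request-scoped value while we are still on the request thread...
        final String token = currentUserBean.token;
        // ...and pass the plain value into the task; the task itself never touches
        // any request-scoped bean, so it no longer matters that the scope is gone.
        Callable<String> task = () -> doSomethingWith(token);
        return executorService.submit(task);
    }

    private String doSomethingWith(String token) {
        // Hypothetical downstream work (e.g. a remote call) that only needs the value.
        return token;
    }
}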

#PreDestroy not being called for a Runnable

I have a Spring context in which we have Runnable beans started like so:
public void startChildAndWait(Class<? extends BaseMyDispatcher> type) throws InterruptedException {
    BaseMyDispatcher child = appContext.getBean(type);
    child.initialize(this); // the child references its parent during its run method
    new Thread(child).start();
    synchronized (LOCK_OBJECT) {
        LOCK_OBJECT.wait(someTime);
    }
}
BaseMyDispatcher is an abstract class and the SampleRunnableX classes are implementations declared with prototype scope. The base class basically has a @PostConstruct method, a @PreDestroy method (whose main job is to call notify on LOCK_OBJECT), and of course a run method.
My problem is that the @PostConstruct method is called, but when the run method completes the object does not seem to be destroyed, so the @PreDestroy method is never called and the parent stays stuck waiting on LOCK_OBJECT.
The code is called from a parent Runnable (which is executed inside a ThreadPoolExecutor) that sequentially starts several children with the same startChildAndWait method, passing a different class each time:
startChildAndWait(SampleRunnable1.class);
if (run2IsRequired && lastExitCode == 100) { // runXIsRequired are booleans
    startChildAndWait(SampleRunnable2.class);
}
if (run3IsRequired && lastExitCode == 100) { // lastExitCode is an integer
    startChildAndWait(SampleRunnable3.class);
}
So what do I do to make the @PreDestroy method get called when the child thread completes?
From the documentation:
In contrast to the other scopes, Spring does not manage the complete lifecycle of a prototype bean: the container instantiates, configures, and otherwise assembles a prototype object, and hands it to the client, with no further record of that prototype instance. Thus, although initialization lifecycle callback methods are called on all objects regardless of scope, in the case of prototypes, configured destruction lifecycle callbacks are not called.
If you want something to happen when the run() method completes, put that code at the end of the run() method.
The @PreDestroy callback method will be called when your application shuts down; for that you need to register a shutdown hook with the JVM and close the application context in it when the JVM exits.
// Create the application context (it must be a ConfigurableApplicationContext so it can be closed)
final ConfigurableApplicationContext appContext =

// register a shutdown hook that closes the application context
Runtime.getRuntime().addShutdownHook(new Thread() {
    public void run() {
        appContext.close();
    }
});
If you want to execute some code when the child thread completes, it is better to put it at the end of the run method (in the thread itself).
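A minimal sketch of that suggestion applied to the question's setup (doWork and the template-method structure are illustrative; the try/finally is simply one way to make sure the parent is always notified, even when run fails):

public abstract class BaseMyDispatcher implements Runnable {

    private Object lockObject; // the parent's LOCK_OBJECT, set in initialize(...)

    @Override
    public final void run() {
        try {
            doWork(); // the subclass-specific logic (SampleRunnable1, 2, 3, ...)
        } finally {
            // Do the cleanup that used to live in @PreDestroy here, then wake the parent.
            synchronized (lockObject) {
                lockObject.notify();
            }
        }
    }

    protected abstract void doWork();
}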

how to make sure single guava service manager instance per jvm?

The singleton pattern allows you to keep one instance per application thread.
How can I make sure that only a single instance of the Guava ServiceManager is running per JVM? So that whenever a new separate entry Java thread is launched, it can check whether the ServiceManager is running.
Why do you think that simply not creating multiple instances wouldn't work? Implement a ServiceManagerProvider as a singleton and use only serviceManagerProvider.get() to access the ServiceManager.
Consider using Dependency Injection instead of the singleton (anti-)pattern:
@Singleton
public class ServiceManagerProvider implements Provider<ServiceManager> {

    private final ServiceManager serviceManager = ...

    @Override
    public ServiceManager get() {
        return serviceManager;
    }
}
Here, you get a single instance per injector, which is exactly what you (should) want.
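A short usage sketch, assuming a JSR-330 container such as Guice that honours the @Singleton scope (ServerMain and its start method are illustrative):

import com.google.common.util.concurrent.ServiceManager;

import javax.inject.Inject;

public class ServerMain {

    private final ServiceManager serviceManager;

    @Inject
    public ServerMain(ServiceManagerProvider serviceManagerProvider) {
        // Every caller that goes through the singleton provider sees the same instance.
        this.serviceManager = serviceManagerProvider.get();
    }

    public void start() {
        serviceManager.startAsync().awaitHealthy();
    }
}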
