I have a question about a general design pattern in EJB. I have a Java EE application (EJBs and Web) and I need a kind of background process that permanently scans and processes specific data via JPA.
One solution I am thinking about is to implement a @Singleton EJB. In a method annotated with @PostConstruct I can start my process.
@Singleton
@Startup
public class MyUpdateService {

    @PostConstruct
    void init() {
        while (true) {
            // scan for new data...
            // do the job...
        }
    }
}
But is this a recommended pattern? Or is there a better way to run such a class in an EJB container?
In EJB there are other patterns like the TimerService and the new Java EE 7 batch processing. But both concepts, I think, are meant for finite processes?
I am using the EJB TimerService in my current project for tasks like periodic data pruning or back-end data synchronization. It allows not only single execution, but also interval timers and timers with a calendar-based schedule.
Something like:
@Startup
@Singleton
public class SyncTimer {

    private static final long HOUR = 60 * 60 * 1000L;

    @Resource
    private TimerService timerService;

    private Timer timer;

    @PostConstruct
    public void init() {
        TimerConfig config = new TimerConfig();
        config.setPersistent(false);
        timer = timerService.createIntervalTimer(HOUR, HOUR, config);
    }

    @Timeout
    private synchronized void onTimer() {
        // every hour action
    }
}
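The calendar-based variant works the same way; here is a minimal sketch (the daily 02:00 schedule and creating it inside the same init() method are just assumptions):
ScheduleExpression expression = new ScheduleExpression().hour("2").minute("0");
TimerConfig config = new TimerConfig();
config.setPersistent(false); // assumed non-persistent, like the interval timer above
timerService.createCalendarTimer(expression, config);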
As an alternative to the TimerService mentioned by @devmind, since Java EE 7 it is possible to use the ManagedScheduledExecutorService:
@Startup
@Singleton
public class Scheduler {

    static final long INITIAL_DELAY = 0;
    static final long PERIOD = 2;

    @Resource
    ManagedScheduledExecutorService scheduler;

    @PostConstruct
    public void init() {
        this.scheduler.scheduleAtFixedRate(this::invokePeriodically,
                INITIAL_DELAY, PERIOD, TimeUnit.SECONDS);
    }

    public void invokePeriodically() {
        System.out.println("Don't use sout in prod " + LocalTime.now());
    }
}
Unlike the TimerService, the executor service can run separate tasks in parallel. See also a blog post from Adam Bien.
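For illustration, a rough sketch of two independent tasks registered on the same injected executor (syncOrders and pruneCache are just assumed method names):
@PostConstruct
public void init() {
    // both tasks run independently on the container-managed thread pool
    scheduler.scheduleAtFixedRate(this::syncOrders, 0, 30, TimeUnit.SECONDS);
    scheduler.scheduleAtFixedRate(this::pruneCache, 0, 5, TimeUnit.MINUTES);
}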
Related
I am scheduling the Spring scheduler with SchedulingConfigurer as follows. However, a new traceId is not getting created every time the processJob method is called.
Even the following statement always logs with the same traceId:
log.info("Performing task");
What is the issue here, and how do I ensure a new traceId every time this job is triggered?
I have even tried wrapping the processJob call inside a new span as follows, but no luck.
Fix 1 (not working):
private void setSchedule() {
    future =
        taskScheduler.schedule(
            () -> {
                Span newSpan = tracer.nextSpan().name("newSpan").start();
                try (SpanInScope ws = tracer.withSpanInScope(newSpan.start())) {
                    log.info("Performing task");
                    taskManager.processJob();
                } finally {
                    newSpan.finish();
                }
            },
            dynamicTrigger);
}
Original class that needs the fix:
public class SchedulerConfig
        implements SchedulingConfigurer, ApplicationListener<RefreshScopeRefreshedEvent> {

    private final DynamicTrigger dynamicTrigger;
    private final TaskManager taskManager;
    private TaskScheduler taskScheduler;
    private ScheduledFuture<?> future;

    @Bean(destroyMethod = "shutdown")
    public ExecutorService taskExecutor() {
        return Executors.newScheduledThreadPool(1);
    }

    @Override
    public void configureTasks(ScheduledTaskRegistrar taskRegistrar) {
        taskRegistrar.setScheduler(taskExecutor());
        taskScheduler = taskRegistrar.getScheduler();
        setSchedule();
    }

    private void setSchedule() {
        future =
            taskScheduler.schedule(
                () -> {
                    log.info("Performing task");
                    taskManager.processJob();
                },
                dynamicTrigger);
    }

    @Override
    public void onApplicationEvent(RefreshScopeRefreshedEvent event) {
        log.info("Rescheduling due to change in cron expression");
        future.cancel(false);
        setSchedule();
    }
}
The way you start the span is not how you are supposed to do it (e.g. you call start twice). Please check the docs to see how to do it properly: https://docs.spring.io/spring-cloud-sleuth/docs/current/reference/htmlsingle/#using-creating-and-ending-spans
The easiest way to start a new span is using @NewSpan on a method that belongs to a Spring bean; please see the docs: https://docs.spring.io/spring-cloud-sleuth/docs/current/reference/htmlsingle/#using-annotations-new-spans
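A minimal sketch of that approach (JobTask is an assumed bean name; the annotation starts a new span around the call, and a new trace if none is active):
@Component
public class JobTask {

    private static final Logger log = LoggerFactory.getLogger(JobTask.class);

    @NewSpan("process-job")
    public void processJob() {
        // runs inside the span created by Sleuth for this method call
        log.info("Performing task");
    }
}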
For scheduling, I think it is way simpler using @Scheduled, see the docs: https://docs.spring.io/spring-cloud-sleuth/docs/current/reference/htmlsingle/#sleuth-async-scheduled-integration
This is also instrumented out of the box by Sleuth so you don't need to do anything to start a new Span:
@Scheduled(fixedDelay = 1_000)
public void scheduled() {
    log.info("Hey, look what I'm doing");
}
If you don't want to use @Scheduled, you can use a TraceableScheduledExecutorService as your ExecutorService, docs: https://docs.spring.io/spring-cloud-sleuth/docs/current/reference/htmlsingle/#sleuth-async-executor-service-integration
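A rough sketch of that wiring for the SchedulerConfig from the question, assuming an injected BeanFactory field named beanFactory (the wrapper delegates to the plain executor but starts a span per run):
@Override
public void configureTasks(ScheduledTaskRegistrar taskRegistrar) {
    // wrap the plain scheduled executor so every scheduled run is traced
    taskRegistrar.setScheduler(
            new TraceableScheduledExecutorService(beanFactory, Executors.newScheduledThreadPool(1)));
    taskScheduler = taskRegistrar.getScheduler();
    setSchedule();
}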
I have a cron expression which will run every day at 7 PM. I am using the latest Spring Boot version.
@Scheduled(cron = "${my.cron.expression}")
public void scheduleTask() {
    // call the service layer where the business logic resides
    // other autowired beans here
}
I have two doubts.
Q1) How can I make sure that the cron job is executed only if the old instance has finished running?
Q2) How can I reload/refresh the application context and reload all the beans afresh for every cron job call?
For the second point, take a look at spring-cloud-config and the client library spring-cloud-config-client. There is an HTTP endpoint to refresh beans.
You would have to create shared storage for any distributed operation like this, e.g. message queues or databases.
For your Q1: I suggest you have a class-level boolean flag like isRunning, as below:
public class Scheduler {

    private boolean isRunning = false;

    @Scheduled(cron = "${my.cron.expression}")
    public void scheduleTask() {
        // skip this run if the previous one is still in progress
        if (!isRunning) {
            isRunning = true;
            try {
                // executions go here
            } finally {
                // reset only after this run has finished
                isRunning = false;
            }
        }
    }
}
Q2: To reload beans:
public class Scheduler {

    private boolean isRunning = false;

    @Scheduled(cron = "${my.cron.expression}")
    public void scheduleTask() {
        if (!isRunning) {
            isRunning = true;
            try {
                // executions go here
            } finally {
                isRunning = false;
            }
        }
        // context is the ConfigurableApplicationContext of the running application
        DefaultSingletonBeanRegistry registry =
                (DefaultSingletonBeanRegistry) context.getBeanFactory();
        registry.destroySingleton({yourbean});                       // destroys the bean object
        registry.registerSingleton({yourbeanname}, {newbeanobject}); // adds it to the singleton bean cache
    }
}
The fixedDelay property makes sure that there is a delay of n milliseconds between the finish time of one execution of a task and the start time of the next execution.
@Schedules({
    @Scheduled(fixedDelay = 1000),
    @Scheduled(cron = "${my.cron.expression}")
})
I have a class with the following function:
public class classA {
    ...
    ...
    void function_to_be_scheduled(String param) {
        ...
        ...
    }
}
I want to schedule the function using the scheduled-tasks element of the task namespace.
<task:scheduled-tasks>
<task:scheduled ref="beanA" method="function_to_be_scheduled" cron="${cron}"/>
</task:scheduled-tasks>
How do I pass the parameter to the function that I want to schedule?
According to the docs, you can't:
Notice that the methods to be scheduled must have void returns and
must not expect any arguments.
The Spring doc about scheduling says:
Notice that the methods to be scheduled must have void returns and must not expect any arguments
Since the parameter comes from the Spring config file, you can declare a bean (e.g. beanB, which wraps beanA) in the Spring file, inject the parameter you need into that bean, and then schedule the execution of a method of the bean that already knows the parameter (it can be a simple wrapper around your beanA); see the sketch below.
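A minimal sketch of such a wrapper (BeanAWrapper and its invoke method are just assumed names; the scheduled method itself stays void and parameter-less):
public class BeanAWrapper {

    private final classA beanA;
    private final String param;

    public BeanAWrapper(classA beanA, String param) {
        this.beanA = beanA;
        this.param = param;
    }

    // void, no-arg method that <task:scheduled> can reference
    public void invoke() {
        beanA.function_to_be_scheduled(param);
    }
}
The <task:scheduled> element would then point at the wrapper bean and its invoke method instead of beanA.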
You can use TaskScheduler and encapsulate your logic with a parameter in a Runnable:
@Autowired
private TaskScheduler scheduler;

public void scheduleRules() {
    MyTask task = new MyTaskImpl(someParam);
    // a CronTrigger could be passed via scheduler.schedule(task, trigger) instead
    scheduler.scheduleAtFixedRate(task, Duration.ofMinutes(1));
}
I've found that the only way to do this is to have a façade method that is @Scheduled and knows the default value required. The useful side-effect of this is that you can also provide an API through a @Controller for manual triggering with a specific parameter - useful if you need to re-run an activity (a sketch of that follows after the example below).
@Scheduled(cron = "${myChronSchedule}")
public void generateActivities() {
    this.generateActivities(LocalDate.now());
}

public void generateActivities(LocalDate theDate) {
    // do the work
    ...
}
If you don't need the façade to be public, there's no reason why it can't be private and no-one is the wiser.
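For completeness, a rough sketch of the manual-trigger side (ActivityController, ActivityService and the request mapping are assumed names, not part of the original answer):
@RestController
public class ActivityController {

    private final ActivityService activityService;

    public ActivityController(ActivityService activityService) {
        this.activityService = activityService;
    }

    // re-run the activity generation for a specific date on demand
    @PostMapping("/activities/generate")
    public void generate(@RequestParam @DateTimeFormat(iso = DateTimeFormat.ISO.DATE) LocalDate date) {
        activityService.generateActivities(date);
    }
}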
The TaskScheduler did the trick for me.
First create a configuration class that exposes a ThreadPoolTaskScheduler bean; a minimal sketch is shown below.
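Something along these lines (the pool size and thread name prefix are just example values):
@Configuration
public class ThreadPoolTaskSchedulerConfig {

    @Bean
    public ThreadPoolTaskScheduler threadPoolTaskScheduler() {
        ThreadPoolTaskScheduler scheduler = new ThreadPoolTaskScheduler();
        scheduler.setPoolSize(5);
        scheduler.setThreadNamePrefix("ThreadPoolTaskScheduler");
        return scheduler;
    }
}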
Then create a class where the magic happens
@Component
public class ThreadPoolTaskSchedulerExample {

    @Autowired
    private ThreadPoolTaskScheduler taskScheduler;

    class EmailWatch implements Runnable {

        private String userEmail;

        public EmailWatch(String userEmail) {
            this.userEmail = userEmail;
        }

        @Override
        public void run() {
            System.out.println("This is the email " + userEmail);
        }
    }

    public void watchEmail(String userEmail) {
        // refresh watch every minute
        CronTrigger cronTrigger = new CronTrigger("0 * * ? * *");
        taskScheduler.schedule(new EmailWatch(userEmail), cronTrigger);
    }
}
I would like to know what could be the best architecture for a RESTful web service with a single thread executor.
My goal :
Call a RESTful web service
The web service adds a task to a queue and all the tasks are executed one by one by a single thread.
The life cycle of the instantiated objects is really important (there must be only one task queue). I know that the life cycle of a RESTful web service is per request (similar to @RequestScoped, I think), so I see two options:
Option 1 :
public class RestService {

    protected final static Executor executor;
    protected final static Implementation1 impl1;
    protected final static Implementation2 impl2;

    static {
        executor = Executors.newSingleThreadExecutor();
        impl1 = new Implementation1();
        impl2 = new Implementation2();
    }
}

@Path("/servicename")
public class MyService extends RestService {

    @POST
    @Path("/compute")
    public void compute() {
        executor.execute(new Runnable() {
            public void run() {
                impl1.compute();
            }
        });
    }
}
Option 2 :
@Singleton
public class RestService {

    private Executor executor;
    private Implementation1 impl1;
    private Implementation2 impl2;

    public RestService() {
        executor = Executors.newSingleThreadExecutor();
        impl1 = new Implementation1();
        impl2 = new Implementation2();
    }

    public void execute(Runnable run) {
        executor.execute(run);
    }

    public Implementation1 getImplementation1() {
        return impl1;
    }

    public Implementation2 getImplementation2() {
        return impl2;
    }
}

@Path("/servicename")
public class MyService {

    @Inject
    private RestService rs;

    @POST
    @Path("/compute")
    public void compute() {
        rs.execute(new Runnable() {
            public void run() {
                rs.getImplementation1().compute();
            }
        });
    }
}
For option 1 I'm not sure about the "life cycle" of a static field. Which option should I use? How would you do this?
Thanks
EDIT:
Option 3 (thread handled by the EJB container), where "ordering" is not important:
@Singleton
public class RestService {

    private final Executor executor;
    private final Implementation1 impl1;
    private final Implementation2 impl2;

    public RestService() {
        executor = Executors.newSingleThreadExecutor();
        impl1 = new Implementation1();
        impl2 = new Implementation2();
    }

    public void compute1() {
        executor.execute(new Runnable() {
            public void run() {
                impl1.compute();
            }
        });
    }

    public void compute2() {
        executor.execute(new Runnable() {
            public void run() {
                impl2.compute();
            }
        });
    }
}
@Path("/servicename")
public class MyService {

    @Inject
    private RestService rs;

    @POST
    @Path("/compute1")
    public void compute1() {
        rs.compute1();
    }

    @POST
    @Path("/compute2")
    public void compute2() {
        rs.compute2();
    }
}
I think Option 3 is still better than Option 1 & 2.
I think it's a bad idea to do this. Threading ought to be handled by your container, not your code.
If you're deploying on a Java EE app server, you should let it handle the threading.
If you're deploying on a non-blocking I/O server like Netty or vert.x, you should let it handle the threading.
Under no circumstances should you be managing threads.
Static fields are initialized when the class is loaded. They don't have a "lifecycle" the way instances do; they won't be cleaned up until the class itself is unloaded, i.e. when its class loader is garbage collected.
If you must have this kind of behavior, I'd either use a JMS queue to order the processing or a producer/consumer deque to manage it. You want the processing to be asynchronous: let the REST service return a token or receipt to the client and have them come back to see when the processing is done. Think of a dry cleaner: if the line is long, you have no way of knowing when your shirt will be ready, but the receipt lets you come back and check when your result is available.
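A rough sketch of the receipt idea on top of Option 3 (submitCompute1, isDone and the status path are assumed names, not existing methods of the RestService above):
@Path("/servicename")
public class MyService {

    @Inject
    private RestService rs;

    @POST
    @Path("/compute1")
    public String compute1() {
        String receipt = UUID.randomUUID().toString();
        rs.submitCompute1(receipt); // enqueues the work and returns immediately
        return receipt;             // the client polls with this token
    }

    @GET
    @Path("/status/{receipt}")
    public String status(@PathParam("receipt") String receipt) {
        return rs.isDone(receipt) ? "DONE" : "PENDING";
    }
}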
If you used Spring, separated your business logic out into separate components (beans), and injected these beans into your service class, you could effectively control the threading model of your implementation simply by changing the 'scope' attribute of your business-logic beans.
Here's a description of Spring bean scopes.
This approach would allow you to easily modify/experiment with your solution and discover the best alternative for your situation.
As far as architecture is concerned, I would recommend that you layer your system (loose coupling, high cohesion, separation of concerns, etc.). In this regard, you would probably have a service layer (REST), a business layer, and a data (DAO) layer at a minimum. The layers interact with each other via interfaces. Use Spring to wire things together (inject dependencies). This gives you the flexibility to inject different implementations; an example would be to inject mock DAOs into your business classes for unit tests. The service layer, where your REST services live, would perform the translation of requests/responses, do some validation, determine which business components to invoke, and invoke them.
I have a Spring 3.0 app which connects to a web service. The web service requests are limited to 1 per second, and I need to fire ~1000 requests with a 1-second delay between each.
I'm trying to do it using Spring TaskExecutor and I've found the example here.
But how can I set the 1-second delay between each taskExecutor.execute call?
The code from the example I'm using:
import org.springframework.core.task.TaskExecutor;

public class TaskExecutorExample {

    private class MessagePrinterTask implements Runnable {

        private String message;

        public MessagePrinterTask(String message) {
            this.message = message;
        }

        public void run() {
            System.out.println(message);
        }
    }

    private TaskExecutor taskExecutor;

    public TaskExecutorExample(TaskExecutor taskExecutor) {
        this.taskExecutor = taskExecutor;
    }

    public void printMessages() {
        for (int i = 0; i < 25; i++) {
            taskExecutor.execute(new MessagePrinterTask("Message" + i));
        }
    }
}
It's called fixedDelay or fixedRate, depending on what exactly you need:
<task:scheduled-tasks scheduler="myScheduler">
    <task:scheduled ref="someObject" method="someMethod" fixed-delay="1000"/>
    <task:scheduled ref="someObject" method="someOtherMethod" fixed-rate="1000"/>
</task:scheduled-tasks>
or
@Scheduled(fixedDelay=1000)
or
@Scheduled(fixedRate=1000)
It is well documented in the Spring Reference, where you took the example from.
TaskExecutor is not the correct interface to use for this; it is used for fire-and-forget "execute this whenever you can" operations. You should use TaskScheduler instead. This provides methods such as scheduleWithFixedDelay and scheduleAtFixedRate.
Check out the javadoc to read the descriptions of these methods - be careful, it's quite subtle.
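A minimal sketch along those lines, assuming a TaskScheduler is available and that draining a queue at a fixed delay is an acceptable way to space the calls (the class and field names are made up):
public class MessagePrinterScheduler {

    private final TaskScheduler taskScheduler;
    private final Queue<String> pending = new ConcurrentLinkedQueue<String>();

    public MessagePrinterScheduler(TaskScheduler taskScheduler) {
        this.taskScheduler = taskScheduler;
    }

    public void printMessages() {
        for (int i = 0; i < 25; i++) {
            pending.add("Message" + i);
        }
        // one fixed-delay task sends at most one message per second
        taskScheduler.scheduleWithFixedDelay(new Runnable() {
            public void run() {
                String message = pending.poll();
                if (message != null) {
                    System.out.println(message);
                }
            }
        }, 1000L);
    }
}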