I'm using the @Scheduled annotation with a fixed rate in my Spring service.
In each iteration I do some logic and send an HTTP request.
It runs every 2 seconds and most of the time it is okay.
I have a console log that tells me when a new iteration begins.
Sometimes, though, that log appears 4 to 6 times per second.
I also have a log that tells me how long each iteration took, and it has never exceeded 1 second.
@Service
@EnableScheduling
public class Handler {

    @Scheduled(fixedRate = 2000)
    public void handle() {
        System.out.println("Start iteration");
        // -- logic --
    }
}
I'm using Spring Boot version 2.1.2.RELEASE.
Any ideas?
I have the following Rabbit listener:
@Slf4j
@Service
@RequiredArgsConstructor
@RabbitListener(queues = "${spring.rabbitmq.template.default-receive-queue}")
public class RabbitmqListener {

    private final Processor processor;

    @RabbitHandler(isDefault = true)
    public void receiveMessage(List<String> someData) {
        log.info("Received {} some data", someData.size());
        processor.process(someData);
        // should wait for 15 minutes here
    }
}
I need to configure the listener to wait for 15 minutes after it has processed one message before receiving the next one. It is not necessary to wait inside this method; all I need is NOT to receive any messages after processing one. It could be done with Thread.sleep(15000), but I'm not sure that's the best way to achieve this. Is there any RabbitMQ configuration for this kind of situation?
Thread.sleep(15000) will wait for 15 seconds, not 15 minutes.
It's probably not a good idea to sleep for 15 minutes; if you only need to sleep for 15 seconds, a sleep is probably okay (but you do risk a redelivery if the server crashes while you are sleeping).
You might want to consider using RabbitTemplate.receiveAndConvert() instead for this use case, rather than using a message-driven architecture.
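For example, polling on a schedule instead of using a listener might look like the minimal sketch below; PollingRabbitConsumer is a hypothetical name, and it assumes the Processor bean and queue property from the question, a message converter that yields a List<String> as in the listener above, and @EnableScheduling somewhere in the configuration.

import java.util.List;

import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Service;

@Service
public class PollingRabbitConsumer {

    private final RabbitTemplate rabbitTemplate;
    private final Processor processor;
    private final String queue;

    public PollingRabbitConsumer(RabbitTemplate rabbitTemplate,
                                 Processor processor,
                                 @Value("${spring.rabbitmq.template.default-receive-queue}") String queue) {
        this.rabbitTemplate = rabbitTemplate;
        this.processor = processor;
        this.queue = queue;
    }

    // fixedDelay is measured from the end of the previous run, so at most one
    // message is pulled and processed every 15 minutes.
    @Scheduled(fixedDelay = 15 * 60 * 1000)
    @SuppressWarnings("unchecked")
    public void pollOneMessage() {
        Object payload = rabbitTemplate.receiveAndConvert(queue);
        if (payload instanceof List) {
            processor.process((List<String>) payload);
        }
    }
}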
I have a set of Spring Boot batch jobs, which I have deployed to a Spring Cloud Data Flow server. I'm using the local server configuration, but I also want a scheduling option for each job inside my application. So, as mentioned in the document Scheduling for Local Config for scheduling jobs using the local configuration, I use REST services along with the @Scheduled annotation to kick-start the job, otherwise known as a task in SCDF.
These scheduled jobs are supposed to run at 15-minute intervals for several days, and there are 10 jobs. Here is what happens when I launch a job using the REST API:
The job gets launched and executed on the scheduled interval, and a Job Id link is produced on the Task Execution Details page.
The job gets launched again as per the scheduled interval, and a new Job Id link is produced along with the previous Job Id.
Since these are all jobs that will have several executions (500+) over several days, the Task Execution Details page ends up with hundreds of Job Ids. There is no scroll bar, and the list occupies more than half the page.
// Job Config
@Configuration
@EnableBatchProcessing
@EnableTask
public class Job1Loader {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Bean
    public Job loadJob1() {
        return jobBuilderFactory.get("JOb1Loader")
                .incrementer(new RunIdIncrementer())
                .flow(job01_step01()) // step bean defined elsewhere
                .end()
                .build(); // return job
    }
}
// Rest Controller
@RestController
public class JobLauncherController {

    Logger logger = LoggerFactory.getLogger(JobLauncherController.class);

    @Autowired
    JobLauncher jobLauncher;

    @Autowired
    @Qualifier(value = "loadJob1")
    Job job1;

    @Scheduled(cron = "0 */2 * * * ?")
    @RequestMapping("/LaunchJob1")
    public String LaunchJob1() throws Exception {
        logger.info("Executing LaunchJob1");
        JobParameters jobParameters = new JobParametersBuilder()
                .addLong("time", System.currentTimeMillis())
                .toJobParameters();
        jobLauncher.run(job1, jobParameters);
        return "Job has been launched";
    }
}
So my question is this: how can I limit the number of Job Ids listed on the "Task Execution Details" page to a minimum of 10? Or is there a possibility of introducing a scroll bar once a certain threshold of Job Ids is reached? I have attached a screenshot for better understanding.
Currently, the REST API for the Task execution response, which includes the JobExecutionIds, doesn't have such filtering options. What you mention above is more of a feature request than an issue :-)
Would you mind creating a feature request here? And of course, you are welcome to contribute by providing a Pull Request with the changes.
Is there a solution that allows me to check the JobRepository for a given job (JobInstance) for the presence of a completed execution during the current day? If there is no COMPLETED status in the batch_job_execution table for the current day, I must send a notification or an exit code saying something like "we got nothing today".
I plan to implement the solution in a class that extends JobExecutionListenerSupport, like this:
public class JobCompletionNotificationListener extends JobExecutionListenerSupport {

    private Logger logger = LoggerFactory.getLogger(JobCompletionNotificationListener.class);

    private JobRegistry jobRegistry;
    private JobRepository jobRepository;

    public JobCompletionNotificationListener(JobRegistry jobRegistry, JobRepository jobRepository) {
        this.jobRegistry = jobRegistry;
        this.jobRepository = jobRepository;
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        System.out.println("finishhhhh");
        // the logic if no job completed today (noJobCompletedToDay is a placeholder)
        if (noJobCompletedToDay) {
            Notify(); // placeholder notification method
        }
        if (jobExecution.getStatus() == BatchStatus.COMPLETED) {
            logger.info("!!! JOB FINISHED! -> example action execute after Job");
        }
    }
}
You can use JobExplorer#getLastJobExecution to get the last execution for your job instance and check if it's completed during the current day.
Depending on when you are going to do that check, you might also make sure there are no currently running jobs (JobExplorer#findRunningJobExecutions can help).
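A minimal sketch of that check, assuming Spring Batch 4.3 or later (where JobExplorer#getLastJobInstance and JobExplorer#getLastJobExecution are available); the class and method names are placeholders:

import java.time.LocalDate;
import java.time.ZoneId;

import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobInstance;
import org.springframework.batch.core.explore.JobExplorer;

public class DailyCompletionChecker {

    private final JobExplorer jobExplorer;

    public DailyCompletionChecker(JobExplorer jobExplorer) {
        this.jobExplorer = jobExplorer;
    }

    // Returns true if the last execution of the given job completed during the current day.
    public boolean completedToday(String jobName) {
        JobInstance lastInstance = jobExplorer.getLastJobInstance(jobName);
        if (lastInstance == null) {
            return false;
        }
        JobExecution lastExecution = jobExplorer.getLastJobExecution(lastInstance);
        if (lastExecution == null || lastExecution.getEndTime() == null) {
            return false;
        }
        LocalDate endDate = lastExecution.getEndTime().toInstant()
                .atZone(ZoneId.systemDefault()).toLocalDate();
        return lastExecution.getStatus() == BatchStatus.COMPLETED
                && endDate.equals(LocalDate.now());
    }
}

The afterJob callback above could call completedToday(...) and trigger the notification when it returns false, after first checking JobExplorer#findRunningJobExecutions to make sure nothing is still in flight.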
You can implement monitoring in multiple ways. Since version 4.2 Spring Batch provides support for metrics and monitoring based on Micrometer. There is an example of spring [grafana sample][1], with prometheus and grafana from which you can rely to customize a custom board or launch alerts from these tools.
If you have several batch processes it may be the best option, in addition to these tools will help you to monitor services, applications etc.
Built-in metrics:
Duration of job execution
Currently active jobs
Duration of step execution
Duration of item reading
Duration of item processing
Duration of chunk writing
You can also create your own custom metrics (e.g. execution failures).
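As a sketch of such a custom metric, a job listener could increment a Micrometer counter on failed executions; the class name and the batch.job.failures metric name are just examples, and the listener still needs to be registered on the job:

import io.micrometer.core.instrument.MeterRegistry;

import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.listener.JobExecutionListenerSupport;
import org.springframework.stereotype.Component;

@Component
public class FailureMetricsListener extends JobExecutionListenerSupport {

    private final MeterRegistry meterRegistry;

    public FailureMetricsListener(MeterRegistry meterRegistry) {
        this.meterRegistry = meterRegistry;
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        if (jobExecution.getStatus() == BatchStatus.FAILED) {
            // Count failed executions per job name; Prometheus/Grafana can alert on this counter.
            meterRegistry.counter("batch.job.failures",
                    "jobName", jobExecution.getJobInstance().getJobName()).increment();
        }
    }
}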
Otherwise, you can implement the monitoring through, for example, another independent batch process that runs and sends a notification/mail, collecting the state of the first process from the batch database, from the application, or from a filesystem shared between both processes.
You can also implement the check the way you describe it. There is an interesting thread describing how to throw an exception in one step and then process it in a subsequent step that sends an alert (or not) as appropriate.
I am getting the timeout error below while calling a Java Spring Boot API service (code attached):
o.s.w.c.request.async.WebAsyncManager : Could not complete async processing due to timeout or network error
I also want to include concurrency in the service. Please let me know how to do that.
@Configuration
public class WebConfiguration extends WebMvcConfigurerAdapter {

    @Override
    public void configureAsyncSupport(AsyncSupportConfigurer configurer) {
        configurer.setDefaultTimeout(-1);
        configurer.setTaskExecutor(asyncTaskExecutor());
    }

    @Bean
    public AsyncTaskExecutor asyncTaskExecutor() {
        return new SimpleAsyncTaskExecutor("stream-task");
    }
}
In this:
configurer.setDefaultTimeout(-1);
you are configuring the timeout incorrectly. The value passed to this method is the amount of time, in milliseconds, before an asynchronous request times out.
You should set it according to your use case, for example 5000 for 5 seconds. Or you can leave it unset, in which case the container default applies (for example, 10 seconds on Tomcat).
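A minimal sketch of the suggested change, assuming a 5-second timeout fits the use case; the bounded ThreadPoolTaskExecutor is one common way to address the concurrency part of the question and is an assumption, not something prescribed above:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.AsyncTaskExecutor;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;
import org.springframework.web.servlet.config.annotation.AsyncSupportConfigurer;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurer;

@Configuration
public class WebConfiguration implements WebMvcConfigurer {

    @Override
    public void configureAsyncSupport(AsyncSupportConfigurer configurer) {
        configurer.setDefaultTimeout(5000); // 5 seconds instead of -1
        configurer.setTaskExecutor(asyncTaskExecutor());
    }

    @Bean
    public AsyncTaskExecutor asyncTaskExecutor() {
        // A bounded pool instead of SimpleAsyncTaskExecutor, which creates a new thread per task.
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(4);
        executor.setMaxPoolSize(16);
        executor.setQueueCapacity(100);
        executor.setThreadNamePrefix("stream-task-");
        return executor; // Spring initializes the pool when the bean is created
    }
}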
I have used Spring Framework's @Scheduled to schedule my job to run every 5 minutes using cron. But sometimes my job waits indefinitely for an external resource and I can't put a timeout there. I can't use fixedDelay, because the previous run sometimes goes into that indefinite wait and I have to refresh the data every 5 minutes.
So I was looking for an option in Spring's @Scheduled to stop that process/thread after a fixed time, whether it has run successfully or not.
I found the configuration below, which initializes a ThreadPoolExecutor with a keepAliveTime of 120 seconds, and put it in a @Configuration class. Can anybody tell me whether this will work as I expect?
@Bean(destroyMethod = "shutdown")
public Executor taskExecutor() {
    int coreThreads = 8;
    int maxThreads = 20;
    final ThreadPoolExecutor threadPoolExecutor = new ThreadPoolExecutor(
            coreThreads, maxThreads, 120L,
            TimeUnit.SECONDS, new LinkedBlockingQueue<Runnable>()
    );
    threadPoolExecutor.allowCoreThreadTimeOut(true);
    return threadPoolExecutor;
}
I'm not sure this will work as expected. The keepAlive setting applies to idle threads, and a thread waiting for a resource is not idle. Furthermore, it only kicks in when the number of threads is greater than the core size, so you can't really know when it happens unless you monitor the thread pool.
keepAliveTime - when the number of threads is greater than the core, this is the maximum time that excess idle threads will wait for new tasks before terminating.
What you can do is the following:
public class MyTask {

    private final long timeout;

    public MyTask(long timeout) {
        this.timeout = timeout;
    }

    @Scheduled(cron = "0 */5 * * * ?") // your cron expression
    public void cronTask() throws Exception {
        // Note: for @Async to be applied, this call must go through the Spring proxy,
        // so in practice doSomething() usually lives in a separate bean.
        Future<Object> result = doSomething();
        result.get(timeout, TimeUnit.MILLISECONDS);
    }

    @Async
    Future<Object> doSomething() {
        // what you should do
        // get resources etc...
        return new AsyncResult<>(null);
    }
}
Don't forget to add @EnableAsync.
It's also possible to do the same without @Async by implementing a Callable.
Edit: keep in mind that this waits until the timeout, but the thread running the task won't be interrupted. You will need to call Future.cancel when the TimeoutException occurs, and inside the task check isInterrupted() to stop the processing. If you are calling an API, make sure isInterrupted() is checked.
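A minimal sketch of that handling, assuming the Future returned by the doSomething() method above; the class and method names are placeholders:

import java.util.concurrent.ExecutionException;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class TimeoutAwareTask {

    // Waits up to 'timeout' milliseconds for the async result and interrupts the worker on timeout.
    public void awaitWithTimeout(Future<Object> result, long timeout) {
        try {
            result.get(timeout, TimeUnit.MILLISECONDS);
        } catch (TimeoutException e) {
            // The task did not finish in time: cancel it and interrupt the thread running it.
            // The task must check Thread.currentThread().isInterrupted() to actually stop.
            result.cancel(true);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        } catch (ExecutionException e) {
            // The task itself failed; log or rethrow as appropriate.
        }
    }
}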
allowCoreThreadTimeOut and the keepAlive timeout setting don't help, because they only allow worker threads to be terminated after some time without work (see the Javadoc).
You say your job waits infinitely for an external resource. I'm sure that's because you (or some third-party library you are using) use sockets whose timeouts are infinite by default.
Also keep in mind that the JVM ignores Thread.interrupt() when the thread is blocked on socket connect/read.
So find out which socket library is used in your task (and how exactly it is used) and change its default timeout settings.
As an example: there is RestTemplate, which is widely used inside Spring (in the REST client, in Spring Social, in Spring Security OAuth and so on), and there is the ClientHttpRequestFactory abstraction used to create RestTemplate instances. By default, Spring uses SimpleClientHttpRequestFactory, which uses JDK sockets, and by default all of its timeouts are infinite.
So find out where exactly you freeze, read its docs and configure it properly.
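For the RestTemplate case above, a minimal sketch of setting finite timeouts on SimpleClientHttpRequestFactory; the 5 and 30 second values are only examples:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.client.SimpleClientHttpRequestFactory;
import org.springframework.web.client.RestTemplate;

@Configuration
public class RestTemplateConfig {

    @Bean
    public RestTemplate restTemplate() {
        SimpleClientHttpRequestFactory requestFactory = new SimpleClientHttpRequestFactory();
        requestFactory.setConnectTimeout(5_000);  // fail if the connection cannot be established in 5s
        requestFactory.setReadTimeout(30_000);    // fail if the server stops sending data for 30s
        return new RestTemplate(requestFactory);
    }
}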
P.S. If you don't have enough time and are feeling lucky, try running your app with the JVM properties sun.net.client.defaultConnectTimeout and sun.net.client.defaultReadTimeout set to some reasonable values (see the docs for more details).
The keepAliveTime is just for cleaning out worker threads that haven't been needed for a while; it doesn't have any impact on the execution time of the tasks submitted to the executor.
If whatever is taking time respects interrupts you can start a new thread and join it with a timeout, interrupting it if it doesn't complete in time.
public class SomeService {

    @Scheduled(fixedRate = 5 * 60 * 1000)
    public void doSomething() throws InterruptedException {
        Thread taskThread = new TaskThread();
        taskThread.start();
        taskThread.join(120 * 1000); // wait at most 2 minutes
        if (taskThread.isAlive()) {
            // We timed out
            taskThread.interrupt();
        }
    }

    private class TaskThread extends Thread {
        public void run() {
            // Do the actual work here
        }
    }
}