I have an existing Java application that uses Spring and Hibernate and is deployed in an AWS EBS environment. I now need to support thousands of lightweight but persistent jobs, and I am considering using Quartz to manage those jobs.
First, does anybody who has done this before see any issues or have any words of wisdom? Second, I am looking for samples of managing a separate bean in this application that would start the scheduler so that it could run jobs, add more jobs, or delete jobs that are no longer needed. All the samples that I have seen so far use XML configuration, but my environment does not have any XML configuration. Are there any samples I can use to accomplish this in a configuration-less Spring environment?
Thanks for your help in advance.
Waqar
I think Camel can help you:
http://camel.apache.org/quartz.html
http://camel.apache.org/cronscheduledroutepolicy.html
// Inside a RouteBuilder.configure() method:
CronScheduledRoutePolicy startPolicy = new CronScheduledRoutePolicy();
startPolicy.setRouteStartTime("*/3 * * * * ?"); // Quartz cron: start the route every 3 seconds

from("direct:start")
    .routeId("testRoute").routePolicy(startPolicy).noAutoStartup()
    .to("mock:success");
Spring Boot 2.x here. Reading through the metrics/Micrometer docs and trying to figure out how I could stop/restart all metrics collection on the fly at runtime.
It looks like the typical way a metrics registry is created is:
SimpleMeterRegistry registry = new SimpleMeterRegistry();
And I would expect this to happen in a @Bean-annotated method inside a @Configuration-annotated class. So to me, this indicates that the registry is started automatically as part of some background process. However, for my application, I want to be able to explicitly start/stop/restart metrics collection on demand.
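For illustration, the kind of registration I mean would look roughly like this (just a sketch; Spring Boot 2.x normally auto-configures a registry for you):

import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.simple.SimpleMeterRegistry;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class MetricsConfig {

    // Manually defined registry bean; Boot's auto-configuration would
    // normally provide an equivalent one.
    @Bean
    public MeterRegistry meterRegistry() {
        return new SimpleMeterRegistry();
    }
}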
Any ideas as to how to accomplish this? Thanks in advance.
I am looking for a way to store system processes / tasks that the application will then execute according to the specified system-wide conditions. The point is that the system should check the input variables and trigger specific actions in the system according to the rules of the process.
I am probably looking for some form of meta-language in which I can write rules/actions and which can be programmed to start and stop based on input system parameters.
In what format should such processes be recorded?
How should these jobs be parsed?
What design patterns apply to this?
Are there any existing solutions for this use-case?
Which Java libraries can be used for this?
If anything is unclear, I will gladly complete the question.
Thank you.
You could try Spring Batch. It introduces its own domain language for jobs and allows configuring them using either XML or Java.
Here are a couple of examples from their reference guide.
XML config:
<job id="footballJob">
    <step id="playerload" next="gameLoad"/>
    <step id="gameLoad" next="playerSummarization"/>
    <step id="playerSummarization"/>
</job>
Java config:
@Bean
public Job footballJob() {
    return this.jobBuilderFactory.get("footballJob")
            .start(playerLoad())
            .next(gameLoad())
            .next(playerSummarization())
            .build();
}
Spring Batch provides functionality for repeating or retrying failed job steps, as well as for parallel step execution.
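For example, a fault-tolerant step with retry could be sketched like this (the reader/writer beans, the Player type, and the chosen exception are placeholders, not from the reference guide):

@Bean
public Step playerLoad() {
    return this.stepBuilderFactory.get("playerLoad")
            .<Player, Player>chunk(10)                     // process items in chunks of 10
            .reader(playerReader())                        // hypothetical reader bean
            .writer(playerWriter())                        // hypothetical writer bean
            .faultTolerant()
            .retryLimit(3)                                 // retry a failing item up to 3 times
            .retry(DeadlockLoserDataAccessException.class) // from org.springframework.dao
            .build();
}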
Apache Camel can also be used for some of these tasks. It likewise provides its own DSL and both XML and Java configuration options.
Both frameworks provide abstractions for describing the sequence of actions that should be performed during a job. Apache Camel is more convenient for jobs which require integration tasks (sending messages to JMS queues, calling REST or web services, sending emails, etc.). The advantage of Spring Batch is its ability to configure the application's behavior in case of an error or the temporary inaccessibility of a service that has to be called (repeat/retry mechanisms). The two frameworks can also be integrated with each other: you can call Spring Batch jobs from Apache Camel routes or initiate Apache Camel routes from Spring Batch jobs.
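To illustrate that integration point, launching a Spring Batch job programmatically (e.g. from a Camel processor) boils down to the JobLauncher API. A sketch, where jobLauncher and footballJob are assumed to be injected Spring beans:

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;

public class JobTrigger {

    private final JobLauncher jobLauncher;
    private final Job footballJob;

    public JobTrigger(JobLauncher jobLauncher, Job footballJob) {
        this.jobLauncher = jobLauncher;
        this.footballJob = footballJob;
    }

    public void trigger() throws Exception {
        // Unique parameters so the same job can be launched repeatedly.
        JobParameters params = new JobParametersBuilder()
                .addLong("run.ts", System.currentTimeMillis())
                .toJobParameters();
        JobExecution execution = jobLauncher.run(footballJob, params);
        System.out.println("Exit status: " + execution.getStatus());
    }
}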
The most complicated solution would be to use a BPMN engine (e.g. Camunda, Activiti, jBPM), but that would probably be overkill.
I am developing a Spring Boot application that uses Spring Data JPA and will need to connect to many different databases, e.g. PostgreSQL, MySQL, MS-SQL, MongoDB.
I need to create all datasources at runtime, i.e. the user chooses these settings through the GUI of the running application:
- driver (one of a list),
- source,
- port,
- username,
- password.
After that, the user writes native SQL against the chosen database and gets the results.
I have read a lot about this on Stack Overflow and the Spring forums (e.g. AbstractRoutingDataSource), but all of those tutorials show how to create datasources from XML configuration or from a static definition in a Java bean. Is it possible to create many datasources at runtime? How do I manage transactions, and how do I create many sessionFactories? Is it possible to use the @Transactional annotation? What is the best method to do this? Can someone explain to me how to do this step by step?
Hope it's not too late for an answer ;)
I developed a module which can easily be integrated into any Spring project. It uses a meta-datasource to hold the tenant-datasource connection details.
For the tenant-datasource, an AbstractRoutingDataSource is used.
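The idea behind it, as a minimal sketch (the ThreadLocal holder, the tenant ids, and the H2 URLs are illustrative):

import java.util.HashMap;
import java.util.Map;
import org.springframework.jdbc.datasource.DriverManagerDataSource;
import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;

public class TenantRoutingDataSource extends AbstractRoutingDataSource {

    // Per-thread tenant id, set e.g. by a web filter before any DB access.
    private static final ThreadLocal<String> CURRENT_TENANT = new ThreadLocal<>();

    public static void setCurrentTenant(String tenantId) {
        CURRENT_TENANT.set(tenantId);
    }

    @Override
    protected Object determineCurrentLookupKey() {
        return CURRENT_TENANT.get();
    }

    // Builds a routing datasource over two illustrative tenant databases.
    public static TenantRoutingDataSource create() {
        TenantRoutingDataSource routing = new TenantRoutingDataSource();
        Map<Object, Object> targets = new HashMap<>();
        targets.put("tenantA", new DriverManagerDataSource("jdbc:h2:mem:tenantA"));
        targets.put("tenantB", new DriverManagerDataSource("jdbc:h2:mem:tenantB"));
        routing.setTargetDataSources(targets);
        routing.afterPropertiesSet(); // resolves the target map
        return routing;
    }
}

New tenant datasources can be added at runtime by updating the target map and calling afterPropertiesSet() again, which re-resolves it.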
Here you can find my core implementation using the AbstractRoutingDataSource:
https://github.com/Dactabird/multitenancy
Here is an example showing how to integrate it: https://github.com/Dactabird/multitenancy-sample
In this example I'm using an embedded H2 DB, but of course you can use whatever you want.
Feel free to modify it for your purposes, or to ask if any questions are left!
I have a Spring web application with an applicationContext.xml loaded through a ContextLoaderListener in an XmlWebApplicationContext. The application context has a Quartz scheduler (defined with a SchedulerFactoryBean, like here), but it has no triggers nor job details.
During loading of this main application context, I load some "plug-in" JARs containing their own pluginApplicationContext.xml file.
Each pluginApplicationContext.xml is loaded in a GenericXmlApplicationContext as a child of the main XmlWebApplicationContext.
Those plug-ins may contain Quartz jobs (QuartzJobBean) which are scheduled within the scheduler discussed above. Scheduling has to be done programmatically through the Quartz API, but this is fine for me. When the job is triggered, it is properly instantiated by Quartz and, because it extends QuartzJobBean, I'm able to get the current ApplicationContext through setApplicationContext.
The problem here is that I get the XmlWebApplicationContext instead of the GenericXmlApplicationContext from which the job was scheduled. Thus, I cannot call getBean to retrieve the beans defined within the plug-in.
I understand perfectly well why all of this happens, but I cannot find a clean and reusable solution to handle it. I've already had a look at OSGi, but we're implementing this plug-in system on an existing application, not creating a new one from scratch, and migrating the whole application to OSGi would be too much work. Do you know how OSGi and other plug-in frameworks deal with this kind of situation?
Thanks a lot for your help
I am not sure I get all those Spring problems, but I've done these things with OSGi.
What people often do not realize is that you can embed OSGi in your existing application without making any changes to the existing code. Richard Hall describes it here: http://felix.apache.org/site/apache-felix-framework-launching-and-embedding.html (the API is 100% standardized).
Once you have a framework, you can run your plugins in it. You will have to make sure the framework exports all the application packages (see the org.osgi.framework.system.packages.extra launch property). Plugins and the application can then communicate through services.
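A minimal launch sketch using the standard FrameworkFactory API (the exported package and the bundle path are illustrative; a framework implementation such as Felix must be on the classpath):

import java.util.HashMap;
import java.util.Map;
import java.util.ServiceLoader;
import org.osgi.framework.BundleContext;
import org.osgi.framework.launch.Framework;
import org.osgi.framework.launch.FrameworkFactory;

public class EmbeddedOsgiLauncher {

    public static void main(String[] args) throws Exception {
        Map<String, String> config = new HashMap<>();
        // Expose the host application's packages to the plugins.
        config.put("org.osgi.framework.system.packages.extra",
                "com.example.myapp.api;version=1.0.0");

        // Discover whichever framework implementation is on the classpath.
        FrameworkFactory factory =
                ServiceLoader.load(FrameworkFactory.class).iterator().next();
        Framework framework = factory.newFramework(config);
        framework.start();

        // Install and start a plugin bundle through the framework's context.
        BundleContext context = framework.getBundleContext();
        context.installBundle("file:plugins/my-plugin.jar").start();
    }
}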
I've never used Quartz, but I have some experience with scheduling. I register a Runnable service with cron-like properties:
@Component(properties = "cron=1 * * * *")
public class SomeImpl implements Runnable {
    public void run() {
        ...
    }
}
You will then need to make a bundle that calls that service according to its cron specification.
I agree OSGi is a good approach, but maybe you can simply create one huge application context (to rule them all)? Instead of manually starting a new child application context based on each pluginApplicationContext.xml file, simply add:
<import resource="classpath*:/pluginApplicationContext.xml"/>
This will find all pluginApplicationContext.xml files on the classpath and merge their beans into a single application context (note the classpath*: prefix, which matches every occurrence rather than only the first). From an architectural point of view this is a worse approach, but it will work if you discover all plugins at startup time.
I am looking at Spring Batch 2.0 to implement a pipeline process. The process listens for some event and needs to perform a set of transformation steps based on the event type and its content.
Spring Batch seems to be a great fit. However, going through the documentation, every example has the job and its steps configured in XML. Does the framework support creating jobs at run-time and configuring the steps dynamically?
The job configuration itself is set before the job runs, but it is possible to create a flexible job configuration with conditional flows.
You can't just change the job configuration while the job runs, but between jobs it's easy to replace the configuration.
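To give an idea of what such a conditional flow looks like, here is a sketch using the Java DSL (available from Spring Batch 2.2 onwards; the step names and exit status are made up):

@Bean
public Job eventJob() {
    return jobBuilderFactory.get("eventJob")
            .start(classifyEventStep())
            // Branch on the exit status of the first step.
            .on("TYPE_A").to(transformAStep())
            .from(classifyEventStep()).on("*").to(transformBStep())
            .end()
            .build();
}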
An add-on to Michael's answer:
Do you want to create a flow from beginning to end completely dynamically, or do you want to have some dynamics at a certain point?
Since Spring Batch instantiates jobs (with all their internals) from XML configuration, all the necessary beans have setters/getters, and you can create the Job from an empty page. This is a long and bug-prone way (you need to create a FlowJob as JobParserJobFactoryBean does, then a SimpleFlow, then a StepState, then a TaskletStep as SimpleStepFactoryBean does, and bind them all together).
I think the alternative to XML flows could be your own coded logic. For Spring Batch it will look like one step, but with a custom implementation and a subflow. See the <tasklet ref="myCleverTasklet" /> example in Example Tasklet Implementation.
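Such a "clever tasklet" could be sketched like this (the class itself is hypothetical; only the Tasklet interface comes from Spring Batch):

import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;

// Holds the dynamically coded flow logic that would otherwise be spread
// over several XML-configured steps.
public class MyCleverTasklet implements Tasklet {

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        // Inspect the incoming event (e.g. from job parameters) and run
        // the matching transformation steps here.
        return RepeatStatus.FINISHED;
    }
}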