I am looking for a way to store system processes/tasks that the application will then execute according to specified system-wide conditions. The point is that the system should check the input variables and, according to the rules of the process, trigger specific actions in the system.
I am probably looking for some form of meta-language in which I can write rules/actions, and which can be used to start and stop processes based on input system parameters.
In what format should such processes be recorded?
How should these jobs be parsed?
What design patterns apply here?
Are there any existing solutions for this use case?
Which Java libraries could be used for this?
If anything is unclear, I will gladly complete the question.
Thank you.
You could try Spring Batch. It introduces its own domain language for jobs and allows configuring them in either XML or Java.
Here are a couple of examples from their reference guide.
XML config:
<job id="footballJob">
    <step id="playerload" next="gameLoad"/>
    <step id="gameLoad" next="playerSummarization"/>
    <step id="playerSummarization"/>
</job>
Java config:
@Bean
public Job footballJob() {
    return this.jobBuilderFactory.get("footballJob")
                 .start(playerLoad())
                 .next(gameLoad())
                 .next(playerSummarization())
                 .end()
                 .build();
}
Spring Batch provides functionality for repeating or retrying failed job steps as well as for parallel steps execution.
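The retry functionality mentioned above can be sketched roughly as follows. This is a minimal illustration, not a complete configuration: the step name, chunk size, and the reader/writer beans are placeholders, and `DeadlockLoserDataAccessException` stands in for whatever transient exception you want to retry on.

```java
@Bean
public Step playerLoad() {
    // A fault-tolerant chunk-oriented step: if a DeadlockLoserDataAccessException
    // is thrown while processing a chunk, the item is retried up to 3 times.
    return this.stepBuilderFactory.get("playerLoad")
            .<Player, Player>chunk(10)
            .reader(playerReader())     // placeholder reader bean
            .writer(playerWriter())     // placeholder writer bean
            .faultTolerant()
            .retry(DeadlockLoserDataAccessException.class)
            .retryLimit(3)
            .build();
}
```

`.faultTolerant()` is what unlocks the retry/skip builder methods; without it the step fails on the first exception.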
For some tasks, Apache Camel can also be used. It likewise provides its own DSL with both XML and Java configuration options.
Both frameworks provide abstractions for describing the sequence of actions that should be performed during a job. Apache Camel is more convenient for jobs that require integration tasks (sending messages to JMS queues, calling REST or web services, sending emails, etc.). The advantage of Spring Batch is the ability to configure the application's behavior in case of an error or temporary inaccessibility of a service that should be called (repeat/retry mechanisms). The two frameworks can also be combined: you can call Spring Batch jobs from Apache Camel routes or initiate Apache Camel routes from Spring Batch jobs.
The most complicated solution would be using a BPMN engine (e.g. Camunda, Apache Activiti, jBPM), but that would probably be overkill.
I've started to use Axon 4.3.1 (the latest version) in my project and I'm having a problem.
Where can I configure the Kafka retry policies applied after an @EventHandler throws an exception?
Note: I'm using the SubscribingEventProcessor type as the event processor (in both projects). I'm using separate projects: the command model uses Mongo and publishes events on Kafka; the query model consumes events from Kafka (the event bus). They therefore run in separate JVMs.
@ProcessingGroup("event-processor") is configured on the class with the event handler method. I'd like a configuration that makes Kafka automatically retry after some time in error cases (in the query model project).
Can I use some default Axon component? Could I use something like Spring Retry, or Kafka's internal configs themselves?
I've found something like this in the documentation:
https://docs.axoniq.io/reference-guide/configuring-infrastructure-components/event-processing/event-processors#error-handling
"Based on the provided ErrorContext object, you can decide to ignore the error, schedule retries, perform dead-letter-queue delivery or rethrow the exception."
How can I configure (for example) scheduled retries on an @EventHandler after errors?
Could you help me?
Thanks.
The current implementation of Axon's Kafka Extension (version 4.0-M2) does not support setting a retry policy when it comes to event handling.
I'd argue your best approach right now is to set up something like that on Kafka itself, if that's even possible. Otherwise, forcing a replay of the events through Kafka would be your best option.
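Since the question mentions Spring Retry, one workaround on the query side (an assumption on my part, not an Axon feature) is to wrap the handler logic in a Spring Retry `RetryTemplate`, so transient failures are retried with a fixed delay before the exception propagates to Axon. The `projectionService` call is a hypothetical stand-in for your projection logic.

```java
// Sketch using Spring Retry (the spring-retry dependency); names are illustrative.
RetryTemplate retryTemplate = new RetryTemplate();

SimpleRetryPolicy retryPolicy = new SimpleRetryPolicy();
retryPolicy.setMaxAttempts(3);              // give up after 3 attempts
retryTemplate.setRetryPolicy(retryPolicy);

FixedBackOffPolicy backOffPolicy = new FixedBackOffPolicy();
backOffPolicy.setBackOffPeriod(2000L);      // wait 2 seconds between attempts
retryTemplate.setBackOffPolicy(backOffPolicy);

// Inside the @EventHandler method:
retryTemplate.execute(context -> {
    projectionService.handle(event);        // hypothetical handler logic
    return null;
});
```

If all attempts fail, the last exception is rethrown, at which point Axon's configured error handler takes over.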
I am using the Java DSL to configure my channel adapters. What I want to achieve can be described with the following piece of code:
IntegrationFlows
.from(Jms.messageDriverChannelAdapter(mqCacheConnectionFactory)
.configureListenerContainer(container -> container.sessionTransacted(transacted))
.destinations(inputDestination1, inputDestination2) // missing method
.autoStartup(autoStartup)
.id(channelName)
.errorChannel(errorChannel)
)
.channel(commonChannel)
.get();
So I would like a messageDriverChannelAdapter that is capable of receiving from multiple JMS destinations. Is this achievable?
No, it isn't possible.
The Spring Integration JMS support is fully based on the Spring JMS foundation, and its AbstractMessageListenerContainer can consume from only one destination. Therefore Jms.messageDriverChannelAdapter() doesn't provide an option to configure several destinations to listen to.
The only option you have is to configure several Jms.messageDriverChannelAdapter()s. What is good about Spring Integration is that you can direct them all to the same MessageChannel, so you won't end up in copy/paste hell.
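The several-adapters approach could look roughly like this: one flow per destination, all feeding the same channel. The bean names and destination strings are illustrative.

```java
// Sketch: one message-driven flow per JMS destination, converging on a shared channel.
@Bean
public IntegrationFlow destination1Flow(ConnectionFactory connectionFactory) {
    return IntegrationFlows
            .from(Jms.messageDriverChannelAdapter(connectionFactory)
                    .destination("inputDestination1"))
            .channel("commonChannel")
            .get();
}

@Bean
public IntegrationFlow destination2Flow(ConnectionFactory connectionFactory) {
    return IntegrationFlows
            .from(Jms.messageDriverChannelAdapter(connectionFactory)
                    .destination("inputDestination2"))
            .channel("commonChannel")
            .get();
}
```

Everything downstream of "commonChannel" is then defined once and shared by both adapters.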
I have an existing Java application that uses Spring and Hibernate and is deployed in an AWS EBS environment. I now need to support thousands of lightweight but persistent jobs, and I am considering using Quartz to manage them.
First, does anybody who has done this before see any issues or have words of wisdom? Second, I am looking for samples of managing a separate bean in this application that would start the scheduler so that it could run jobs, add more jobs, or delete jobs that are no longer needed. All the samples I have seen so far use XML configuration. My environment does not have any XML configuration. Are there any samples I can use to accomplish this in an XML-free Spring environment?
Thanks for your help in advance.
Waqar
I think Camel can help you:
http://camel.apache.org/quartz.html
http://camel.apache.org/cronscheduledroutepolicy.html
CronScheduledRoutePolicy startPolicy = new CronScheduledRoutePolicy();
startPolicy.setRouteStartTime("*/3 * * * * ?");

from("direct:start")
    .routeId("testRoute").routePolicy(startPolicy).noAutoStartup()
    .to("mock:success");
I am using Camel 2.9.0 in my project. We have a number of routes divided into different Camel contexts. Each Camel context is bundled separately and deployed in Apache Karaf. The problem is divided into two parts:
1.) Each route is a scheduled route. Although, using the Quartz component, we are able to define a cron expression in each route, we want a console from which we can trigger or stop any route and also assign a cron expression to any route. (Scheduling a route through a web console is our main objective.)
2.) We also tried to configure the cron expression for each route through quartz.properties. But if someone wants to change the cron expression at runtime in Apache Karaf, we have to stop the deployed bundle and start it again. What can be done to change the value of a cron expression at runtime?
Any replies and help would be appreciated.
Piyush
JMX provides remote context/route management support (start, stop, etc.).
see these posts for more information:
http://www.consulting-notes.com/2010/08/managing-camel-routes-with-jmx-apis.html
http://www.consulting-notes.com/2011/01/apache-camel-monitoring.html
Otherwise, to add/remove/alter routes at runtime, you'd need to get a handle on the CamelContext and leverage its APIs (addRoute(), removeRoute(), etc.).
see these for more information:
Add camel route at runtime in Java
http://camel.apache.org/loading-routes-from-xml-files.html
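Combining those APIs, changing a route's cron expression at runtime could be sketched like this: stop and remove the old route, then add it back with a new policy. This is an illustration only; the route id, endpoints, and method name are placeholders, and the method names match the Camel 2.x CamelContext API the question is using.

```java
// Assumes a running CamelContext obtained from your application (e.g. injected in Karaf).
public void rescheduleRoute(CamelContext context, String cron) throws Exception {
    // Tear down the existing route first...
    context.stopRoute("testRoute");
    context.removeRoute("testRoute");

    // ...then re-add it with a policy built from the new cron expression.
    context.addRoutes(new RouteBuilder() {
        @Override
        public void configure() {
            CronScheduledRoutePolicy policy = new CronScheduledRoutePolicy();
            policy.setRouteStartTime(cron);   // e.g. "0 0/5 * * * ?"
            from("direct:start")
                .routeId("testRoute").routePolicy(policy)
                .to("mock:success");
        }
    });
}
```

A web console would then only need to call a method like this with the user-supplied cron string, instead of redeploying the bundle.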
I am looking at Spring Batch 2.0 to implement a pipeline process. The process listens to some event and needs to perform a set of transformation steps based on the event type and its content.
Spring Batch seems to be a great fit. However, going through the documentation, every example has the job and its steps configured in XML. Does the framework support creating jobs at runtime and configuring the steps dynamically?
The job configuration itself is set before the job runs, but it is possible to create a flexible job configuration with conditional flows.
You can't change the job configuration while the job runs, but between runs it is easy to replace the configuration.
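A conditional flow in Java configuration could look roughly like this; the job and step names are placeholders standing in for the event-specific transformation steps.

```java
@Bean
public Job pipelineJob() {
    // Branch on the exit status of the first step: handle failures separately,
    // otherwise continue to the summarization step.
    return this.jobBuilderFactory.get("pipelineJob")
            .start(transformStep())
            .on("FAILED").to(errorHandlingStep())
            .from(transformStep())
            .on("*").to(summarizeStep())
            .end()
            .build();
}
```

A step can also set a custom exit status (e.g. derived from the event type), which the `.on(...)` transitions then match against.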
Addendum to Michael's answer:
Do you want to create a flow from beginning to end completely dynamically, or do you want some dynamic behavior at certain points?
Since Spring Batch instantiates jobs (with all their internals) from XML configuration, all the necessary beans have setters/getters, and you can create a Job from scratch. This is a long and bug-prone way (you need to create a FlowJob as in JobParserJobFactoryBean, then a SimpleFlow, then a StepState, then a TaskletStep as in SimpleStepFactoryBean, and bind them all together).
I think the alternative to XML flows could be your own coded logic. For Spring Batch it will look like one step, but with a custom implementation and subflow. See the <tasklet ref="myCleverTasklet" /> example in Example Tasklet Implementation.
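A skeleton of such a coded-logic step could look like this; the class name and the branching logic are illustrative, and only the Tasklet contract itself comes from Spring Batch.

```java
// A custom Tasklet: Spring Batch sees a single step, but the body can
// dynamically decide which transformations to run.
public class MyCleverTasklet implements Tasklet {

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        // Branch on the incoming event type and run the appropriate
        // transformation steps in plain Java here.
        return RepeatStatus.FINISHED;   // tell Spring Batch this step is done
    }
}
```

Returning RepeatStatus.CONTINUABLE instead would make Spring Batch invoke execute() again, which is useful for loop-like logic inside one step.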