How to load a Spring Configuration based on a command line argument?

I have a few different @Configuration classes, each of which corresponds to a different Spring Batch job, i.e., one Job bean exists in each configuration and each Step, Tasklet, etc. required for a given job exists in the same configuration class as that job. Example:
@Configuration
public class DemoJobConfiguration {
    private final JobBuilderFactory jobBuilderFactory;

    public DemoJobConfiguration(JobBuilderFactory jobBuilderFactory) {
        this.jobBuilderFactory = jobBuilderFactory;
    }

    @Bean
    public Job demoJob() {
        return jobBuilderFactory.get("demoJob").start(...).build();
    }
}

@Configuration
public class TestJobConfiguration {
    private final JobBuilderFactory jobBuilderFactory;

    public TestJobConfiguration(JobBuilderFactory jobBuilderFactory) {
        this.jobBuilderFactory = jobBuilderFactory;
    }

    @Bean
    public Job testJob() {
        return jobBuilderFactory.get("testJob").start(...).build();
    }
}
The application is a command-line application. The first argument is the name of the job to run. The associated Job bean is retrieved based on that argument and then is executed with a JobLauncher. Example:
@Override
public void run(String... args) throws Exception {
    String jobName = args[0];
    Job job = prepareJob(jobName); // gets the Job from the application context
    JobParameters jobParameters = prepareJobParameters(args); // maps args[1], etc. to JobParameter objects
    JobExecution result = jobLauncher.run(job, jobParameters);
}
What I'd like to know is whether there's a way to use a @Conditional annotation (or something else) to load a configuration class only if args[0] is a certain value, e.g.,
@Configuration
@Conditional("\"testJob\".equals(args[0])") // imagined syntax
public class TestJobConfiguration {
    ...
}
The advantage would be that only the beans relevant to the job being run are ever loaded into memory; beans belonging to other jobs are never created. That would be a major help as more jobs get added to the project.
Is loading configurations based on command line arguments possible? Has it been done before? An hour of googling didn't turn up anything, but I'm still hopeful that there's a way to accomplish this.

I figured out the answer to my own question.
Solution:
Include a command line argument in the form --jobName=testJob. Spring Boot will automatically load it into the Environment (https://docs.spring.io/spring-boot/docs/1.0.1.RELEASE/reference/html/boot-features-external-config.html).
Use the @ConditionalOnProperty annotation like so:
@Configuration
@ConditionalOnProperty(value = "jobName", havingValue = "testJob")
public class TestJobConfiguration {
    ...
}
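As a follow-up sketch (my assumption, not part of the original answer): because @ConditionalOnProperty leaves exactly one Job bean in the context per run, the launcher can inject it directly instead of looking it up by name. The JobRunner name and the "time" parameter are illustrative.

@Component
public class JobRunner implements CommandLineRunner {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job job; // exactly one Job bean is loaded, thanks to @ConditionalOnProperty

    @Override
    public void run(String... args) throws Exception {
        // A timestamp parameter keeps each launch unique.
        JobParameters params = new JobParametersBuilder()
                .addLong("time", System.currentTimeMillis())
                .toJobParameters();
        jobLauncher.run(job, params);
    }
}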

If you want more control over conditional configuration based on command line arguments, you can create a custom Condition that uses the ApplicationArguments class to fully parse the options:
public class CustomCommandLineArgsCondition implements Condition {
    @Override
    public boolean matches(ConditionContext context, AnnotatedTypeMetadata metadata) {
        ApplicationArguments args = context.getBeanFactory().getBean(ApplicationArguments.class);
        // Do something with the command line arguments.
        return true;
    }
}
You then annotate your configuration class with @Conditional(CustomCommandLineArgsCondition.class).
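A minimal end-to-end sketch (the job-name check is my assumption about how the arguments are parsed; getNonOptionArgs() returns the arguments that carry no -- prefix):

public class TestJobCondition implements Condition {
    @Override
    public boolean matches(ConditionContext context, AnnotatedTypeMetadata metadata) {
        ApplicationArguments args = context.getBeanFactory().getBean(ApplicationArguments.class);
        // Match when the first plain argument names this job, e.g. "java -jar app.jar testJob".
        List<String> plainArgs = args.getNonOptionArgs();
        return !plainArgs.isEmpty() && "testJob".equals(plainArgs.get(0));
    }
}

@Configuration
@Conditional(TestJobCondition.class)
public class TestJobConfiguration {
    // beans for testJob only
}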

Related

Spring batch reading files with MultiResourceItemReader and using ItemReadListener

Here's the scenario: I have a Spring Batch job that reads multiple input files, processes them, and finally generates more output files.
Using FlatFileItemReader and restarting the entire batch with a cron, I can process the files one by one; however, it is not feasible to restart the batch every X seconds just to process the files individually.
PS: I use an ItemReadListener to add some properties of the object being read to the jobExecutionContext; these are used later to validate (and generate, or not, the output file).
However, if I use a MultiResourceItemReader to read all the input files without completely restarting the whole context (and the resources), the ItemReadListener overwrites the properties of each object (input file) in the jobExecutionContext, so only the data from the last object in the array of input files survives.
Is there any way to use the ItemReadListener for each Resource read inside a MultiResourceItemReader?
Example Reader:
@Bean
public MultiResourceItemReader<CustomObject> multiResourceItemReader() {
    MultiResourceItemReader<CustomObject> resourceItemReader = new MultiResourceItemReader<CustomObject>();
    resourceItemReader.setResources(resources);
    resourceItemReader.setDelegate(reader());
    return resourceItemReader;
}

@Bean
public FlatFileItemReader<CustomObject> reader() {
    FlatFileItemReader<CustomObject> reader = new FlatFileItemReader<CustomObject>();
    reader.setLineMapper(customObjectLineMapper());
    return reader;
}
Example Step:
@Bean
public Step loadInputFiles() {
    return stepBuilderFactory.get("loadInputFiles").<CustomObject, CustomObject>chunk(10)
            .reader(multiResourceItemReader())
            .writer(new NoOpItemWriter())
            .listener(customObjectListener())
            .build();
}
Example Listener:
public class CustomObjectListener implements ItemReadListener<CustomObject> {

    @Value("#{jobExecution.executionContext}")
    private ExecutionContext executionContext;

    @Override
    public void beforeRead() {
    }

    @Override
    public void afterRead(CustomObject item) {
        executionContext.put("customProperty", item.getCustomProperty());
    }

    @Override
    public void onReadError(Exception ex) {
    }
}
Scheduler:
public class Scheduler {

    @Autowired
    JobLauncher jobLauncher;

    @Autowired
    Job job;

    SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

    @Scheduled(fixedDelay = 5000, initialDelay = 5000)
    public void scheduleByFixedRate() throws Exception {
        JobParameters params = new JobParametersBuilder()
                .addString("time", format.format(Calendar.getInstance().getTime()))
                .toJobParameters();
        jobLauncher.run(job, params);
    }
}
Using FlatFileItemReader and restarting the entire batch with a cron, I can process the files one by one; however, it is not feasible to restart the batch every X seconds just to process the files individually.
That is the very reason I always recommend the job-per-file approach over the single-job-for-all-files-with-MultiResourceItemReader approach, like here or here.
Is there any way to use the ItemReadListener for each Resource read inside a MultiResourceItemReader?
No, because the listener is not aware of the resource the item was read from. This is a limitation of the approach itself, not of Spring Batch. What you can do, though, is make your items aware of the resource they were read from by implementing ResourceAware, as sketched below.
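A minimal sketch of that route, assuming CustomObject is your own class: MultiResourceItemReader calls setResource(...) on each item whose type implements ResourceAware, so the listener can store one execution-context entry per file instead of overwriting a single key.

public class CustomObject implements ResourceAware {

    private Resource resource; // set by MultiResourceItemReader for each item
    private String customProperty;

    @Override
    public void setResource(Resource resource) {
        this.resource = resource;
    }

    public Resource getResource() {
        return resource;
    }

    public String getCustomProperty() {
        return customProperty;
    }
}

The listener's afterRead could then key each entry by file name:

@Override
public void afterRead(CustomObject item) {
    executionContext.put(item.getResource().getFilename(), item.getCustomProperty());
}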

spring batch using spring boot: Read arguments from config or command line and use them in job

I am pretty new to Spring. I am trying to build an ETL-like app using Spring Batch with Spring Boot.
I am able to run the basic job (read -> process -> write). Now I want to read the arguments (like date, file name, type, etc.) from a config file (later) or the command line (I can work with that now) and use them in my job.
Entry point:
// Imports
@SpringBootApplication
@EnableBatchProcessing
public class EtlSpringBatchApplication {

    public static void main(String[] args) {
        SpringApplication.run(EtlSpringBatchApplication.class, args);
    }
}
My batch configuration
// BatchConfig.java
// Imports
@Configuration
public class BatchConfig {

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    public MyDao myDao;

    @Bean
    public Job job() {
        return jobBuilderFactory
                .get("job")
                .incrementer(new RunIdIncrementer())
                .listener(new Listener(myDao))
                .flow(step1())
                .end()
                .build();
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1").<MyModel, MyModel>chunk(1000)
                .reader(Reader.reader("my_file_20200520.txt"))
                .processor(new Processor())
                .writer(new Writer(myDao))
                .build();
    }
}
I have the basic pieces in place.
Reader.java has a method to read a flat file:
public static FlatFileItemReader<MyModel> reader(String path) {......}
Processor.java has the process method defined. I added a @BeforeStep to fetch some details from the DB that are required for processing:
public class Processor implements ItemProcessor<MyModel, MyModel> {

    private static final Logger log = LoggerFactory.getLogger(Processor.class);
    private Long id = null;

    @BeforeStep
    public void getId(StepExecution stepExecution) {
        this.id = stepExecution.getJobExecution().getExecutionContext().getLong("Id");
    }

    @Override
    public MyModel process(MyModel myModel) throws Exception {
        // processing logic here
        return myModel;
    }
}
Writer.java implements ItemWriter and holds the write code.
Listener.java extends JobExecutionListenerSupport and overrides afterJob and beforeJob.
Basically, I tried to use the execution context here in beforeJob:
@Override
public void beforeJob(JobExecution jobExecution) {
    log.info("Getting the id..");
    this.id = myDao.getLatestId();
    log.info("id retrieved is: " + this.id);
    jobExecution.getExecutionContext().putLong("Id", this.id);
}
Now, what I am looking for is:
The reader should get the file name from the job arguments, i.e., when I run the job, I should be able to pass some arguments, one of them being the file path.
Later, some methods (like getId, etc.) will need a few more variables that can be passed as job arguments, i.e., run_date, type, etc.
In short, I am looking for a way to:
Pass job arguments to my app (run_date, type, file path, etc.)
Use them in the reader and other places (Listener, Writer)
Can someone tell me what additions I should make in my BatchConfig.java and elsewhere to read the job parameters (from the command line or a config file, whichever is easier)?
Both Spring Batch and Spring Boot reference documentation show how to pass parameters to a job:
Running Jobs from the Command Line
Running Spring Batch jobs from the Command Line
Moreover, the Spring Batch docs explain in detail and with code examples how to use those parameters in batch components (like the reader, writer, etc.):
Late Binding of Job and Step Attributes
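For illustration, a minimal late-binding sketch adapted from the question's reader (my sketch, not code from the documentation; the lineMapper() bean is assumed to be defined elsewhere): a @StepScope bean is created when the step starts, so job parameters can be injected straight into it.

@Bean
@StepScope
public FlatFileItemReader<MyModel> reader(@Value("#{jobParameters['filePath']}") String filePath) {
    FlatFileItemReader<MyModel> reader = new FlatFileItemReader<>();
    reader.setResource(new FileSystemResource(filePath)); // path comes straight from the launch arguments
    reader.setLineMapper(lineMapper()); // assumed to exist elsewhere
    return reader;
}

The job can then be launched with, e.g., filePath=/data/my_file_20200520.txt as a command line argument, which Spring Boot converts into a job parameter.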
You can read the values of the job parameters set from the config file inside the reader or other classes within the Spring Batch execution context. Below is a snippet for reference.
The application.yml file can have the below config:
batch.configs.filePath: c:\test
You can add the filePath read from the config to your job parameters when you start the job. Snippet of the class:
// Job and JobLauncher related autowires..
@Value("${batch.configs.filePath}")
private String filePath;

// inside a method block,
JobParameters jobParameters = new JobParametersBuilder()
        .addLong("JobID", System.currentTimeMillis())
        .addString("filePath", filePath)
        .toJobParameters();
try {
    jobLauncher.run(batchJob, jobParameters);
} catch (Exception e) {
    logger.error("Exception while running a batch job {}", e.getMessage());
}
One way to access the job parameters is to implement StepExecutionListener in your reader class and use its overridden beforeStep and afterStep methods. The same can be done in other classes as well:
public class Reader implements ItemReader<String>, StepExecutionListener {

    private String filePath;

    @Override
    public void beforeStep(StepExecution stepExecution) {
        try {
            // Read from the job parameters (that is where filePath was added above).
            filePath = stepExecution.getJobParameters().getString("filePath");
        } catch (Exception e) {
            logger.error("Exception while performing read {}", e);
        }
    }

    @Override
    public String read() throws Exception {
        // filePath value read from the job parameters can be used inside the read implementation
        return null; // placeholder
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        return ExitStatus.COMPLETED;
    }
}

Unable to resolve variable from properties file when tried to access as function parameter using #Value annotation

This may be a silly question, but I'm unable to find any satisfactory solution to my problem. In Java we don't have default values for method parameters, so I am trying to give my function parameters/arguments a default value from a properties file using the @Value annotation, but I always get null and I can't figure out why. Please help me solve the issue or point me to an appropriate link/reference.
Application.java
@SpringBootApplication
public class Application {

    public static void main(String[] args) {
        ApplicationContext context = SpringApplication.run(Application.class, args);
        Sample sample = context.getBean(Sample.class);
        System.out.println(sample.check(null));
    }
}
Sample.java
public interface Sample {
    public String check(String message);
}
SampleImpl.java
@Service
@PropertySource("classpath:app.properties")
public class SampleImpl implements Sample {

    @Value("${test}")
    String message1;

    @Override
    public String check(@Value("${test}") String message) {
        return message;
    }
}
app.properties
test=anand
But you are passing null to your method...
Perhaps what you want to do is assign a default value to test in case it's not defined in the property file:
@Value("${test:default}")
Then, when properties are autowired by Spring, if the placeholder resolver doesn't get the value from the props file, it will use whatever comes after the :.
The best use case for this (that I can think of) is when you create a Spring configuration.
Let's say you have a configuration class for DB access. Simply put:
@Configuration
public class DbConfig {

    @Value("${url:localhost}")
    String dbUrl;
    // rest for driver, user, pass etc.

    @Bean
    public DataSource createDatasource() {
        // here you use some DataSourceBuilder to configure the connection
        return DataSourceBuilder.create().url(dbUrl).build();
    }
}
Now, when the Spring application starts up, property values are resolved, and as I wrote above you can switch between the value from a property and a default value. But this is done once, when the app starts and Spring creates your beans.
If you want to check the incoming argument at runtime, a simple null check will be enough:
@Value("${test}")
String message1;

@Override
public String check(String message) {
    if (message == null) {
        return message1;
    }
    return message;
}

Spring ApplicationContext getBean only works when AutoWired before

I am creating a route controller structure for commands.
Every controller has a @ControlController annotation:
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Component // Because of @Component all controllers will be Spring managed.
public @interface ControlController {
}
The controller should contain methods with the @CommandMapping annotation:
@Target(ElementType.METHOD)
@Retention(RetentionPolicy.RUNTIME)
public @interface CommandMapping {
    String value();
}
The value of the @CommandMapping annotation is the command, so the method should be called when the value matches the command being invoked.
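For illustration, a hypothetical controller wired with these annotations could look like this (the command name is illustrative, not from the question):

@ControlController
public class MainController {

    @CommandMapping("greet")
    public void greet(Client client) {
        // Invoked when executeCommand(client, "greet") is called.
    }
}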
At the start of the application, the following code is called to fetch all the @CommandMapping methods:
/**
 * Load all controller mappings.
 */
private void fetchControllers() {
    // Get all beans with the ControlController annotation.
    Map<String, Object> controllers = this.applicationContext.getBeansWithAnnotation(ControlController.class);
    for (Map.Entry<String, Object> entry : controllers.entrySet()) {
        Class<?> controller = entry.getValue().getClass();
        for (Method method : controller.getMethods()) {
            // Check every method in a controller for the CommandMapping annotation.
            // When the annotation is present the method is a command mapping.
            if (method.isAnnotationPresent(CommandMapping.class)) {
                CommandMapping commandMapping = method.getAnnotation(CommandMapping.class);
                // Add the command mapping to the controller list.
                this.controllers.put(commandMapping.value(), method);
            }
        }
    }
}
This code finds all the beans with the @ControlController annotation and loops through all their methods looking for the @CommandMapping annotation. All matching methods are put in a Map<String, Method>.
Up to this point everything works perfectly.
The following method is used to execute the right method that belongs to a command:
/**
 * Execute a command for a client.
 *
 * @param client The client.
 * @param command The command.
 */
public void executeCommand(Client client, String command) {
    // Get the method that belongs to the command.
    Method method = this.controllers.get(command);
    Class<?> controllerClass = method.getDeclaringClass();
    // Get the controller that the method belongs to.
    Object controller = this.applicationContext.getBean(controllerClass); // Here the code just stops.
    System.out.println("Yeah"); // This isn't executed.
    try {
        List<Object> arguments = new ArrayList<>();
        for (Parameter parameter : method.getParameters()) {
            // Add arguments based on the parameter type.
        }
        method.invoke(controller, arguments.toArray(new Object[arguments.size()]));
    } catch (Exception exception) {
        exception.printStackTrace();
    }
}
The code just stops, without any exception, at the this.applicationContext.getBean(controllerClass) call.
I found out that when I autowire the controller class somewhere, it works for some reason. It doesn't matter in which class I autowire the controllers. But of course autowiring every controller is an ugly fix.
Why does the ApplicationContext.getBean get stuck and how can I fix this?
UPDATE:
I just found out that using the bean name in getBean also works.
Example:
this.applicationContext.getBean(MainController.class); //Doesn't work
this.applicationContext.getBean("mainController"); // Works
UPDATE:
I forgot to mention something very important (I think): the executeCommand method is called from a thread, but the thread is Spring managed. When I run it without a thread it works, but I really need threads. How can I make beans work in a thread?
You can try searching for the controller by its name; this solution implies finding the name of the controller by reading the annotation.
i.e.:
@Service
@Component(value = "statService")
public class Controller {...}

public class AnnotationFinder {
    public static String findComponentName(Class<?> cls) {
        for (Annotation annotation : cls.getDeclaredAnnotations()) {
            if (annotation.annotationType().equals(Component.class)) {
                return ((Component) annotation).value();
            }
        }
        return null;
    }
}
Once you have found the @Component annotation, you read its value member and then:
Object controller = this.applicationContext.getBean(AnnotationFinder.findComponentName(controllerClass));
I found out that the web application wasn't working either.
The problem was that the loop accepting connections was not running in a separate thread; it ran inside a component's @PostConstruct, so the application never fully started even though the server (my SocketServer) was running.
Because the application was not fully started, the beans didn't behave as expected. So it had nothing to do with the code I posted...
I hope someone else can still learn from my answer.
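A minimal sketch of the fix, assuming a SocketServer component with a blocking acceptLoop() method (both names are illustrative): running the loop on its own thread lets @PostConstruct return, so the context finishes refreshing and getBean(...) behaves normally.

@Component
public class SocketServerStarter {

    @Autowired
    private SocketServer socketServer; // hypothetical component with a blocking acceptLoop()

    @PostConstruct
    public void start() {
        // Start the blocking accept loop on its own thread so startup can complete.
        Thread acceptThread = new Thread(socketServer::acceptLoop, "socket-accept");
        acceptThread.setDaemon(true);
        acceptThread.start();
    }
}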

Java EE 7 Batch API : produce job scoped CDI Bean

I'm currently working on a Java EE 7 Batch API application, and I would like the lifecycle of one of my CDI beans to be tied to the current job.
Actually, I would like this bean to have a @JobScoped scope (but that doesn't exist in the API). I would also like this bean to be injectable into any of my job classes.
At first I wanted to create my own @JobScoped scope, with a JobScopedContext, etc. But then I came up with the idea that the Batch API has the JobContext bean, with a unique job id per execution.
So I wonder if I could manage the lifecycle of my job scoped bean with this JobContext.
For example, I would have my bean that I want to be job scoped:
@Alternative
public class JobScopedBean {

    private String m_value;

    public String getValue() {
        return m_value;
    }

    public void setValue(String p_value) {
        m_value = p_value;
    }
}
Then I would have the producer of this bean, which returns the JobScopedBean associated with the current job (thanks to the JobContext, which is unique per job):
public class ProducerJobScopedBean {

    @Inject
    private JobContext m_jobContext; // this is the JobContext of the Batch API

    @Inject
    private JobScopedManager m_manager;

    @Produces
    public JobScopedBean getObjectJobScoped() throws Exception {
        if (null == m_jobContext) {
            throw new Exception("Job Context not active");
        }
        return m_manager.get(m_jobContext.getExecutionId());
    }
}
And the manager, which holds the map of JobScopedBean instances:
@ApplicationScoped
public class JobScopedManager {

    private final ConcurrentMap<Long, JobScopedBean> mapObjets = new ConcurrentHashMap<Long, JobScopedBean>();

    public JobScopedBean get(final long jobId) {
        JobScopedBean returnObject = mapObjets.get(jobId);
        if (null == returnObject) {
            final JobScopedBean ajout = new JobScopedBean();
            returnObject = mapObjets.putIfAbsent(jobId, ajout);
            if (null == returnObject) {
                returnObject = ajout;
            }
        }
        return returnObject;
    }
}
Of course, I will manage the destruction of the JobScopedBean at the end of each job (through a JobListener and a CDI event), along the lines of the sketch below.
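A minimal sketch of that cleanup (the remove(long) method on the manager is my assumption; the post doesn't show one): the Batch runtime calls afterJob() once the job ends, which is the natural point to evict the bean for that execution id.

public class JobScopedCleanupListener extends AbstractJobListener {

    @Inject
    private JobContext m_jobContext;

    @Inject
    private JobScopedManager m_manager;

    @Override
    public void afterJob() throws Exception {
        // Evict the bean held for this job execution; remove(long) is assumed on the manager.
        m_manager.remove(m_jobContext.getExecutionId());
    }
}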
Can you tell me if I'm wrong with this solution?
It looks correct to me, but maybe I'm missing something?
Maybe there is a better way to handle this?
Thanks.
So it boils down to creating @Dependent scoped beans that are bound to a job at creation time. That works fine for beans with a lifespan shorter than the job, so among the standard scopes only @Dependent applies (@RequestScoped/@SessionScoped/@ConversationScoped might be OK, but they do not apply here).
It will cause problems for other scopes, especially @ApplicationScoped/@Singleton. If you inject the JobScopedBean into one of them, you might be (un)lucky enough to have an active job when they are first needed, but the bean will stay attached to that initial job (@Dependent is a pseudo-scope, so no client proxy is created to resolve the contextual instance on each access).
If you want something like that, create a custom scope.
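A minimal sketch of where that route starts, i.e. the scope annotation itself (a matching javax.enterprise.context.spi.Context implementation, registered through a CDI Extension in afterBeanDiscovery(), would then resolve instances per job execution id):

@NormalScope
@Inherited
@Target({ ElementType.TYPE, ElementType.METHOD, ElementType.FIELD })
@Retention(RetentionPolicy.RUNTIME)
public @interface JobScoped {
}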
