Passing variables to the @Qualifier annotation in Spring - java

Is it possible to pass a variable to the @Qualifier annotation in Spring?
For example,
@Autowired
@Qualifier("datasource_" + "#{jobParameters['datasource.number']}")
private DataSource ds;
I have 10 different databases that my Spring Batch job runs against every day. The database number is passed in as a job parameter, and I want to pick the data source to connect to based on that parameter.
Thanks!

Only compile-time constant expressions are allowed in annotation values.
So you are creating 10 data sources in your Spring configuration - does your job need to use all ten in one run? If you only need one connection for the lifetime of your Spring context, could you just have 10 different sets of property files?
One thing you could do is create all of your data sources in a map (keyed by database number), then inject both the map and the key into your bean, for example:
public class MyBean {

    @Autowired
    @Qualifier("dataSourceMap")
    private Map<String, DataSource> dataSourceMap;

    // late binding of jobParameters requires this bean to be step-scoped
    @Value("#{jobParameters['datasource.number']}")
    private String dbKey;

    public void useTheDataSource() {
        DataSource ds = dataSourceMap.get(dbKey);
        // ...
    }
}
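For completeness, here is a minimal sketch of how the dataSourceMap bean itself might be defined in Java config. The class name, URLs and credentials are placeholders; note that on Spring 4.3+ a map bean defined via @Bean can be injected as above, while on older versions you may need @Resource(name = "dataSourceMap") instead:

import java.util.HashMap;
import java.util.Map;
import javax.sql.DataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

@Configuration
public class DataSourceMapConfig {

    @Bean(name = "dataSourceMap")
    public Map<String, DataSource> dataSourceMap() {
        Map<String, DataSource> map = new HashMap<>();
        for (int i = 1; i <= 10; i++) {
            DriverManagerDataSource ds = new DriverManagerDataSource();
            ds.setUrl("jdbc:mysql://dbhost/db" + i); // placeholder URL
            ds.setUsername("user");                  // placeholder credentials
            ds.setPassword("password");
            map.put(String.valueOf(i), ds);
        }
        return map;
    }
}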
Or have I misunderstood?

No, you can't pass variables to any annotation in Java; this has nothing to do with Spring.
Use a workaround: create and inject a service that picks the correct database each time it's needed, as sketched below.
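A rough sketch of that workaround, assuming the ten data sources are registered as beans named datasource_1 ... datasource_10 (the class and method names here are made up):

import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.stereotype.Service;

@Service
public class DataSourceResolver {

    @Autowired
    private ApplicationContext context;

    // resolve the DataSource bean by name at runtime, e.g. "datasource_3"
    public DataSource resolve(String databaseNumber) {
        return context.getBean("datasource_" + databaseNumber, DataSource.class);
    }
}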

Related

How can I define multiple jobs dynamically in Spring Batch?

I have an application that uses Spring Batch to define a preset number of jobs, which are currently all defined in the XML.
We add more jobs over time, which requires updating the XML; however, these jobs are always based on the same parent and can easily be predetermined using a simple SQL query.
So I've been trying to switch to use some combination of XML configuration and Java-based configuration but am quickly getting confused.
Even though we have many jobs, each job definition falls into essentially one of two categories. All of the jobs inherit from one or the other parent job and are effectively identical, besides having different names. The job name is used in the process to select different data from the database.
I've come up with some code much like the following but have run into problems getting it to work.
Full disclaimer that I'm also not entirely sure I'm going about this in the right way. More on that in a second; first, the code:
@Configuration
@EnableBatchProcessing
public class DynamicJobConfigurer extends DefaultBatchConfigurer implements InitializingBean {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private JobRegistry jobRegistry;

    @Autowired
    private DataSource dataSource;

    @Autowired
    private CustomJobDefinitionService customJobDefinitionService;

    private Flow injectedFlow1;
    private Flow injectedFlow2;

    public void setupJobs() throws DuplicateJobException {
        List<JobDefinition> jobDefinitions = customJobDefinitionService.getAllJobDefinitions();
        for (JobDefinition jobDefinition : jobDefinitions) {
            Job job = null;
            if (jobDefinition.getType() == 1) {
                job = jobBuilderFactory.get(jobDefinition.getName())
                        .start(injectedFlow1).build()
                        .build();
            } else if (jobDefinition.getType() == 2) {
                job = jobBuilderFactory.get(jobDefinition.getName())
                        .start(injectedFlow2).build()
                        .build();
            }
            if (job != null) {
                jobRegistry.register(new ReferenceJobFactory(job));
            }
        }
    }

    @Override
    public void afterPropertiesSet() throws Exception {
        setupJobs();
    }

    public void setInjectedFlow1(Flow injectedFlow1) {
        this.injectedFlow1 = injectedFlow1;
    }

    public void setInjectedFlow2(Flow injectedFlow2) {
        this.injectedFlow2 = injectedFlow2;
    }
}
I have the flows that get injected defined in the XML, much like this:
<batch:flow id="injectedFlow1">
    <batch:step id="InjectedFlow1.Step1" next="InjectedFlow1.Step2">
        <batch:flow parent="InjectedFlow.Step1" />
    </batch:step>
    <batch:step id="InjectedFlow1.Step2">
        <batch:flow parent="InjectedFlow.Step2" />
    </batch:step>
</batch:flow>
So as you can see, I'm effectively kicking off the setupJobs() method (which is intended to dynamically create these job definitions) from the afterPropertiesSet() method of InitializingBean. I'm not sure that's right. It does run, but I'm not sure whether there's a different entry point better intended for this purpose. Also, to be honest, I'm not sure what the point of the @Configuration annotation is.
The problem I'm currently running into is as soon as I call register() from JobRegistry, it throws the following IllegalStateException:
To use the default BatchConfigurer the context must contain no more than one DataSource, found 2.
Note: my project actually has two data sources defined. The first is the default dataSource bean which connects to the database that Spring Batch uses. The second data source is an external database, and this second one contains all the information I need to define my list of jobs. But the main one does use the default name "dataSource" so I'm not quite sure how else I can tell it to use that one.
First of all, I don't recommend using a combination of XML and Java configuration. Use only one, preferably Java, as it's not much effort to convert XML config to Java config (unless you have some very good reasons for mixing them that you haven't explained).
I haven't used Spring Batch on its own; I have always used it with Spring Boot, and I have a project with multiple jobs defined where code similar to yours has always worked well.
For your issue, there are some answers on SO, like this or this, which basically say that you need to write your own BatchConfigurer and not rely on the default one.
Now coming to a solution using Spring Boot
With Spring Boot, you should try to separate job definitions from job executions.
First, just define the jobs and initialize the Spring context without launching any jobs (spring.batch.job.enabled=false).
In your Spring Boot main method, when you start the app with something like SpringApplication.run(Application.class, args); you get back an ApplicationContext ctx.
Now you can get the relevant beans from this context and launch specific jobs by getting their names from a property, the command line etc. and using the JobLauncher.run(...) method.
You can refer to this answer of mine if you want to order job executions. You can also write job schedulers in Java.
The point being, you separate your job-building / bean-configuration concerns from your job-execution concerns. A minimal sketch of the launch side follows.
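For illustration only (not the asker's code), a rough sketch of that approach with spring.batch.job.enabled=false; the class name, taking the job name from args[0], and the run.id parameter are all assumptions:

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.ApplicationContext;

@SpringBootApplication
public class Application {

    public static void main(String[] args) throws Exception {
        ApplicationContext ctx = SpringApplication.run(Application.class, args);

        // pick the job by name, e.g. from the first command-line argument
        Job job = ctx.getBean(args[0], Job.class);
        JobLauncher jobLauncher = ctx.getBean(JobLauncher.class);

        jobLauncher.run(job, new JobParametersBuilder()
                .addLong("run.id", System.currentTimeMillis())
                .toJobParameters());
    }
}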
Challenge
Keeping multiple jobs in a single project can be challenging when you try to have different settings for each job, since the application.properties file is environment-specific rather than job-specific, i.e. Spring Boot properties apply to all jobs.
In my particular case, the solution was actually to eliminate the @Configuration and @EnableBatchProcessing annotations from my class above. Something about these caused it to try to use the DefaultBatchConfigurer, which fails when you have more than one data source defined (even if you've identified them clearly with "dataSource" as the primary and some other name for the secondary).
The @Configuration annotation in particular wasn't necessary, because all it really does is let your class get auto-instantiated without having to define it as a bean in the app context. Since I was doing that anyway, it was superfluous.
One of the downsides of removing @EnableBatchProcessing was that I could no longer autowire the JobBuilderFactory bean, so I had to create it myself:
// dataSource and transactionManager here are the batch metadata DataSource and its PlatformTransactionManager
JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
factory.setDataSource(dataSource);
factory.setTransactionManager(transactionManager);
factory.afterPropertiesSet();
jobRepository = factory.getObject();
jobBuilderFactory = new JobBuilderFactory(jobRepository);
Then it seems I was already on the right track by using jobRegistry.register(...) to define my jobs. So essentially, once I removed those annotations, everything started working. I'm going to mark Sabir's answer as the correct one, however, because it helped me out.

What's the Spring way of accessing properties from an entity?

I'm new to Spring and I'm building an application where some entities (JPA/Hibernate) need access to a property from application.properties. I do have a configuration class in which this is trivial:
@Configuration
public class FactoryBeanAppConfig {

    @Value("${aws.accessKeyId}")
    private String awsAccessKeyId;

    @Value("${aws.secretKey}")
    private String awsSecretKey;
}
but since entities do not have (and I think should not have) annotations such as @Configuration or @Component, what's the Spring way for them to access the property?
Now, I know I can create my own class, my own bean, and make it a simple wrapper around the properties; but is that the Spring way to do it, or is there another way?
Specify the property file location using @PropertySource, something like below:
@PropertySource("classpath:/application.properties")
You also need to add the below bean to your config:
@Bean
public static PropertySourcesPlaceholderConfigurer propertyConfigIn() {
    return new PropertySourcesPlaceholderConfigurer();
}
There is no "Spring way", since JPA entities and Spring have nothing to do with each other. Most importantly, JPA entities are not Spring beans, so Spring doesn't know or care about them as they're not managed by Spring.
You can try to hack around, trying in vain to access Spring's configuration from code that should not be trying to access it, or you can accept the truth that your design is broken and you're trying to do something that's not meant to be done.
As was proposed several times, use a service class for this. It's managed by Spring, so it can access the Spring config, and it can handle entities, so no boundaries are crossed. A sketch follows.
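For example, a minimal sketch of such a service (the class name and the Object parameter standing in for your entity are illustrative; the aws.* properties are the ones from the question):

import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;

@Service
public class AwsDocumentService {

    @Value("${aws.accessKeyId}")
    private String awsAccessKeyId;

    @Value("${aws.secretKey}")
    private String awsSecretKey;

    // the service owns the configuration and acts on the entity;
    // the entity itself stays a plain JPA object with no Spring annotations
    public void store(Object entity) {
        // ... use awsAccessKeyId / awsSecretKey here, then persist/update the entity
    }
}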
First create a public static variable in some bean managed by Spring, then make the following use of the @Value annotation:
public static String variable;

@Value("${variable}")
private void setVariable(String value) {
    variable = value;
}
It will be set at startup; now you can access it from entities and everywhere else, because it is just a public static variable.
You can use @PropertySource to load the properties file as follows:
@Configuration
@PropertySource("classpath:/com/organization/config/application.properties")
public class FactoryBeanAppConfig {
    ...
}
Entities should not access environment properties. If you are using your entity through a service, then the service can access the properties to act on the entity.

Spring: Using PropertyPlaceholderConfigurer support outside of application context

My application's requirements mean that I need to create application beans manually at runtime and then add them to the application context.
These beans belong to third-party libraries, so I cannot modify them, e.g. a TibjmsConnectionFactory.
So my factory class that creates these beans needs to be provided with a Properties object that will set the username, password, connection timeouts etc.
Ideally I'd like to use Spring's property support so that I do not need to convert Strings to Integers etc.
Also, the Properties provided to my factory class will not be the same Properties used by the PropertyPlaceholderConfigurer in my overall ApplicationContext.
How do I achieve this, or is it even possible?
public class MyCustomFactoryStrategy {

    @Override
    public TibjmsConnectionFactory create(Properties properties) {
        TibjmsConnectionFactory connectionFactory = new TibjmsConnectionFactory();
        connectionFactory.setServerUrl(properties.getProperty("emsServerUrl")); // this is a string
        connectionFactory.setConnAttemptCount(new Integer(properties.getProperty("connAttemptCount"))); // this is an integer
        // ...
        return connectionFactory;
    }
}
Have a look at this post, I think it might be what you need: PropertyPlaceholderConfigurer not loading programmatically
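Separately from that post, here is one hedged sketch of getting typed values from the Properties object using Spring's Environment abstraction, so the factory does not convert Strings by hand; it reuses the factory from the question and assumes nothing beyond standard spring-core classes:

import java.util.Properties;
import org.springframework.core.env.MutablePropertySources;
import org.springframework.core.env.PropertiesPropertySource;
import org.springframework.core.env.PropertySourcesPropertyResolver;

public class MyCustomFactoryStrategy {

    public TibjmsConnectionFactory create(Properties properties) {
        // wrap the supplied Properties in a property source so Spring can resolve and convert values
        MutablePropertySources sources = new MutablePropertySources();
        sources.addLast(new PropertiesPropertySource("factoryProps", properties));
        PropertySourcesPropertyResolver resolver = new PropertySourcesPropertyResolver(sources);

        TibjmsConnectionFactory connectionFactory = new TibjmsConnectionFactory();
        connectionFactory.setServerUrl(resolver.getProperty("emsServerUrl"));
        // typed lookup: Spring's conversion service turns the String into an Integer
        connectionFactory.setConnAttemptCount(resolver.getProperty("connAttemptCount", Integer.class));
        return connectionFactory;
    }
}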

spring + testng + hibernate right approach for session factory

I am building a Spring 4 + Hibernate 5 application. I wonder whether there is any difference between defining the database connection properties like URL, username etc. via a DataSource object and via Hibernate properties like "hibernate.connection.url", "hibernate.connection.username" etc. Of course, ultimately the data source will be tied to the session factory; I just want to make sure I'm doing things the right way.
I want to define a separate DataSource object via the dataSource property so that I can use AbstractTransactionalTestNGSpringContextTests for test cases, since this class always expects a data source object. I want to use the @Rollback feature, and it works with AbstractTransactionalTestNGSpringContextTests; AbstractTestNGSpringContextTests does not support rollback, but persisting still works perfectly.
I need input on how to implement this the right way.
Adding example code to provide more information.
@ContextConfiguration(locations = { "classpath:spring/fpda_persistence_config.xml" })
@Rollback
@Transactional
public class BankTransactionDAOTest extends AbstractTestNGSpringContextTests {

    @Autowired
    private BankTransactionDAO bankTransactionDao;

    @Test
    public void createBankTransactionTest() {
        BankTransaction bt = new BankTransaction();
        bt.setAuthoritativeTableId(new BigDecimal(1234));
        bt.setBankTransactionTypeCode(new Character('C'));
        bt.setInstanceId(new BigDecimal(1234));
        bt.setRowCreatedEpochTime(new BigDecimal(1234));
        bt.setTransactionId(new BigDecimal(1234));
        bt.setTransactionKey(new BigDecimal(System.currentTimeMillis()));
        bankTransactionDao.createBankTransaction(bt);
    }
}
Here, to make the transaction rollback happen successfully, I came to know that we should extend AbstractTransactionalTestNGSpringContextTests instead of AbstractTestNGSpringContextTests. Then I should declare the dataSource property instead of defining all properties as Hibernate properties.
So overall, is it the right approach to declare some properties on the data source and some as Hibernate properties? Will it make any difference?
Thanks in advance.
In our project we use a class-level annotation for test classes:
@TestPropertySource(locations = "classpath:application-test.properties")
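Putting the two points together, a sketch only: it assumes your context exposes a DataSource and a PlatformTransactionManager (required by the transactional base class) and that application-test.properties carries the test-specific settings.

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.TestPropertySource;
import org.springframework.test.context.testng.AbstractTransactionalTestNGSpringContextTests;
import org.testng.annotations.Test;

@ContextConfiguration(locations = { "classpath:spring/fpda_persistence_config.xml" })
@TestPropertySource(locations = "classpath:application-test.properties")
public class BankTransactionDAOTest extends AbstractTransactionalTestNGSpringContextTests {

    @Autowired
    private BankTransactionDAO bankTransactionDao;

    @Test
    public void createBankTransactionTest() {
        // runs inside a transaction that is rolled back after the test
        // ... create and save the BankTransaction as in the question
    }
}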

Using a Java annotation with a dynamic parameter

What I'm trying to achieve is to have a @Resource with a dynamic name parameter. Specifically, I want to inject a DataSource object using @Resource(name = "{JNDI_NAME_PARAM}"), because we can have many data sources configured in an application server, and the data source used by the application is defined in an .xml or .config file. Since I do not know the name of the data source at compile time, I need to be able to get it at runtime. Right now I'm injecting a custom @ApplicationScoped bean which creates a data source in its @PostConstruct method using InitialContext().lookup(). However, I'm curious (mostly because it would be more elegant) how I could achieve this injection using the @Resource annotation.
I COULD create a custom default JNDI name in the app server and change the data source it points to when needed, but that can't work with more than one deployment, and many times we have the application deployed twice, once against a test database and once against a production database, so the JNDI name would have to point at two different data sources at the same time.
You can use method-based injection. It requires a setter method (setMyDB).
public class Test {

    public javax.sql.DataSource myDB;

    @Resource(name = "student")
    private void setMyDB(javax.sql.DataSource ds) {
        myDB = ds;
    }
}
If the names are known, we can declare multiple resources under @Resources:
@Resources({
    @Resource(/* your name/type */),
    @Resource(/* your name/type */)
})
