I am using Spring Boot for my application and defining the JNDI name in the application.properties file.
When I try to get the JdbcTemplate in the class below, it is null:
@Configuration
public class DemoClass {

    @Autowired
    private JdbcTemplate template;

    @Bean
    private DataSource getDS() {
        return template.getDataSource(); // NPE
    }
}
Another class:
@Component
public class SecondClass {

    @Autowired
    private JdbcTemplate template;

    public void show() {
        template.getDataSource(); // Working fine
    }
}
I am not sure whether the JdbcTemplate is configured by default. In case it is not, you can try configuring it yourself:
@Autowired
DataSource dataSource;

@Bean
public JdbcTemplate getJdbcTemplate() {
    return new JdbcTemplate(dataSource);
}
In any case, if you need only the DataSource, it is auto-configured by Spring Boot, so you can autowire it directly where you need it.
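For example, a minimal sketch of autowiring the auto-configured DataSource directly (the class name here is only illustrative):
@Component
public class DataSourceConsumer {

    private final DataSource dataSource;

    @Autowired
    public DataSourceConsumer(DataSource dataSource) {
        // Spring Boot's auto-configured DataSource is injected here
        this.dataSource = dataSource;
    }
}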
@Repository
public class DataRepository {

    private JdbcTemplate jdbcTemplate;

    @Autowired
    public void setJdbcTemplate(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    public int updateCandidate() {
        return this.jdbcTemplate.update("update .... from table where ....");
    }
}
application.properties
# database connection details
spring.datasource.url=jdbc:oracle:thin:***
spring.datasource.username=Scott
spring.datasource.password=Tiger
spring.datasource.driver-class-name=oracle.jdbc.driver.OracleDriver
spring.datasource.tomcat.initial-size=1
spring.datasource.tomcat.max-active=1
spring.datasource.tomcat.min-idle=1
spring.datasource.tomcat.max-idle=1
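Since the question mentions a JNDI name, note that instead of the URL/username/password properties above, Spring Boot can also look the DataSource up from JNDI via the spring.datasource.jndi-name property (the JNDI name below is just a placeholder):
spring.datasource.jndi-name=java:comp/env/jdbc/myDataSource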
If you're getting an NPE at getDS, it means the JdbcTemplate hasn't been injected yet, or perhaps couldn't be injected at all.
Give Spring a hint about the bean dependencies:
@Bean
public JdbcTemplate getJdbcTemplate(DataSource dataSource) {
    return new JdbcTemplate(dataSource);
}
Or
@Bean
@DependsOn("template")
public DataSource getDS() {
    return template.getDataSource();
}
By default, @Autowired sets required=true, so DemoClass should not even be constructed by Spring if the dependency is missing.
Most likely you are creating it yourself with new DemoClass(), or you have disabled annotation config altogether and DemoClass is registered manually, e.g. in XML.
Instead, ensure that DemoClass is discovered by Spring's component scan, e.g. using @SpringBootApplication or @ComponentScan, as per this example.
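A minimal sketch of such a setup, with illustrative class and package names:
@SpringBootApplication // implies component scanning from this class's package downwards
public class DemoApplication {

    public static void main(String[] args) {
        // DemoClass will be picked up as long as it lives in this package or a sub-package
        SpringApplication.run(DemoApplication.class, args);
    }
}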
Related
I have two profiles, uat-nyc and uat-ldn.
The uat-nyc datasource is Oracle and the uat-ldn one is MySQL Server.
This configuration is set up in application-uat-nyc.yml and application-uat-ldn.yml.
I have the configuration class below:
@Profile({"uat-nyc", "uat-ldn"})
@Configuration
@EnableConfigurationProperties(DataSourceProperties.class)
public class DataSourceConfig {

    private DataSourceProperties properties; // server, username, password are set here

    DataSource getDataSource() {
        // gets datasource based on profiles
    }
}
If my application is run with spring.profiles.active: uat-nyc,uat-ldn, will it create two datasources,
one with the configuration from uat-nyc and another from uat-ldn?
In the function below I am getting data from a third-party service and, depending on the region, I need to persist it into the ldn or nyc database. How can I make the if/else section dynamic? How can I get the respective datasources, i.e. ldn and nyc, inside the if/else section of the getProducts method below?
class Product {
    String name;
    int price;
    String region;
}
@Component
class ProductLoader {

    JdbcTemplate jdbcTemplate;

    public ProductLoader(DataSource ds) {
        jdbcTemplate = new JdbcTemplate(ds);
    }

    public void getProducts() {
        List<Product> products = // rest service to get products
        for (Product product : products) {
            if (product.getRegion().equals("LONDON")) {
                // write to LONDON database
                // How can I get the ldn datasource here?
            } else if (product.getRegion().equals("NewYork")) {
                // write to NewYork database
                // How can I get the NewYork datasource here?
            } else {
                // Unknown location
            }
        }
    }
}
Questions -
If my application is run with spring.profiles.active: uat-nyc,uat-ldn, will it create two datasources?
How can I inject the datasources dynamically into ProductLoader and use the specific datasource for ldn and nyc?
First you need to tell Spring that you are going to use two datasources managed by the Spring context: declare them with @Bean in a @Configuration class, then use @Autowired to inject them where needed.
You can use @Qualifier to choose and qualify your beans:
@Configuration
public class ConfigDataSource {

    // example for DataSource
    @Bean("dataSourceWithPropOnCode") // this name will qualify the bean for @Autowired
    public DataSource dataSourceWithPropOnCode() {
        return DataSourceBuilder.create().url("").password("").username("").driverClassName("").build();
    }

    @Bean("dataSourceWithPropFromProperties") // this name will qualify the bean for @Autowired
    @ConfigurationProperties(prefix = "spring.datasource.yourname-datasource") // prefix of the datasource settings in the .properties file
    public DataSource dataSourceFromProperties() {
        return DataSourceBuilder.create().build();
    }

    // example for JdbcTemplate
    @Bean("jdbcTemplateWithPropFromProperties") // this name will qualify the bean for @Autowired
    public JdbcTemplate jdbcTemplateFromProperties(@Qualifier("dataSourceWithPropFromProperties") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }

    @Bean("jdbcTemplateWithPropOnCode") // this name will qualify the bean for @Autowired
    public JdbcTemplate jdbcTemplateWithPropOnCode(@Qualifier("dataSourceWithPropOnCode") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }
}
Settings in the properties file:
spring.datasource.yourname-datasource.url=...
spring.datasource.yourname-datasource.jdbcUrl=${spring.datasource.yourname-datasource.url}
spring.datasource.yourname-datasource.username=user
spring.datasource.yourname-datasource.password=pass
spring.datasource.yourname-datasource.driver-class-name=your.driver
Using them in your services:
@Qualifier("jdbcTemplateWithPropFromProperties")
@Autowired
private JdbcTemplate jdbcTemplate1;

@Qualifier("jdbcTemplateWithPropOnCode")
@Autowired
private JdbcTemplate jdbcTemplate2;

@Qualifier("dataSourceWithPropOnCode")
@Autowired
private DataSource dataSource1;

private DataSource dataSource2;

// or constructor injection, if you prefer (MyService is just the name of the enclosing service class):
@Autowired
public MyService(@Qualifier("dataSourceWithPropFromProperties") DataSource dataSource2) {
    this.dataSource2 = dataSource2;
}
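Applied to the ProductLoader from the question, a rough sketch could inject two qualified JdbcTemplates and route by region (the bean names, getters and SQL below are only illustrative):
@Component
class ProductLoader {

    private final JdbcTemplate ldnJdbcTemplate;
    private final JdbcTemplate nycJdbcTemplate;

    public ProductLoader(@Qualifier("ldnJdbcTemplate") JdbcTemplate ldnJdbcTemplate,
                         @Qualifier("nycJdbcTemplate") JdbcTemplate nycJdbcTemplate) {
        this.ldnJdbcTemplate = ldnJdbcTemplate;
        this.nycJdbcTemplate = nycJdbcTemplate;
    }

    public void save(Product product) {
        // pick the template (and therefore the datasource) based on the product's region
        if ("LONDON".equals(product.getRegion())) {
            ldnJdbcTemplate.update("insert into products (name, price) values (?, ?)", product.getName(), product.getPrice());
        } else if ("NewYork".equals(product.getRegion())) {
            nycJdbcTemplate.update("insert into products (name, price) values (?, ?)", product.getName(), product.getPrice());
        } else {
            // unknown location
        }
    }
}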
I have a DB access module that interacts with multiple types of data sources.
Some higher modules use the DB module without interacting with a JDBC data source. Therefore, I would like the data source to be defined as lazy. However, I also define a transaction manager, which I can't define as lazy since it is not a direct dependency in my code. Since the transaction manager depends on the data source, the latter is always created.
@Configuration
@EnableTransactionManagement
public class SpringConfig {

    @Bean(destroyMethod = "close")
    @Lazy
    @Autowired
    public DataSource dataSource() throws Exception {
        // create data source
    }

    @Bean
    @Lazy
    @Autowired
    public PlatformTransactionManager txManager() throws Exception {
        DataSourceTransactionManager txManager = new DataSourceTransactionManager(dataSource());
        return txManager;
    }
}
@Lazy
@Repository("FundRepository")
public class MyRepository {

    private JdbcTemplate jdbcTemplate;

    @Autowired
    public void setDataSource(DataSource dataSource) {
        jdbcTemplate = new JdbcTemplate(dataSource);
    }
}
Any idea how to make this work?
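A sketch of one possible direction, assuming a lazy proxy is acceptable here: keep the dataSource() bean as above, but let txManager() receive the DataSource as a @Lazy parameter rather than calling dataSource() directly, so the transaction manager only holds a proxy and the real DataSource is created on first use:
@Bean
public PlatformTransactionManager txManager(@Lazy DataSource dataSource) {
    // dataSource is a lazy-initialization proxy; the real DataSource bean is only created when first used
    return new DataSourceTransactionManager(dataSource);
}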
I have two projects, one with the DAO classes and model and another with the REST controller.
Project A : DAO Classes + Model
Project B : Rest Controller
Project A
application.properties:
spring.abcDatasource.url=
spring.abcDatasource.username=
spring.abcDatasource.password=
spring.abcDatasource.driver-class-name=oracle.jdbc.driver.OracleDriver
spring.xyzDatasource.url=
spring.xyzDatasource.username=
spring.xyzDatasource.password=
spring.xyzDatasource.driver-class-name=oracle.jdbc.driver.OracleDriver
spring.datasource.initialize=false
DBConfiguration.java
@Configuration
public class DBConfiguration {

    @Primary
    @Bean(name = "abcDS")
    @ConfigurationProperties(prefix = "spring.abcDatasource")
    public DataSource abcDS() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "abcJdbc")
    public JdbcTemplate abcJdbcTemplate(@Qualifier("abcDS") DataSource abcDS) {
        return new JdbcTemplate(abcDS);
    }

    @Bean(name = "xyzDS")
    @ConfigurationProperties(prefix = "spring.xyzDatasource")
    public DataSource xyzDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "xyzJdbc")
    public JdbcTemplate ebsJdbcTemplate(@Qualifier("xyzDS") DataSource xyzDatasource) {
        return new JdbcTemplate(xyzDatasource);
    }
}
AlphaDAO.java
@Repository
public class AlphaDAO {

    @Autowired
    @Qualifier("abcJdbc")
    private JdbcTemplate abcJdbc;

    @Autowired
    @Qualifier("xyzJdbc")
    private JdbcTemplate xyzJdbc;

    SqlParameterSource namedParameters;

    public Collection<Alpha> findAll(String owner) {
        String sql = "SELECT * from alpha where OWNER in (:owner)";
        NamedParameterJdbcTemplate namedParameterJdbcTemplate = new NamedParameterJdbcTemplate(abcJdbc.getDataSource());
        namedParameters = new MapSqlParameterSource("owner", owner);
        List<Alpha> list = namedParameterJdbcTemplate.query(sql, namedParameters,
                new BeanPropertyRowMapper(Alpha.class));
        return list;
    }
}
Project B Rest Controller :
AlphaServiceApplication.java
@SpringBootApplication
@EnableAutoConfiguration(exclude = {DataSourceAutoConfiguration.class})
public class AlphaServiceApplication extends SpringBootServletInitializer implements WebApplicationInitializer {

    @Override
    protected SpringApplicationBuilder configure(SpringApplicationBuilder builder) {
        return builder.sources(AlphaServiceApplication.class);
    }

    public static void main(String[] args) {
        SpringApplication.run(AlphaServiceApplication.class, args);
    }
}
AlphaServiceController.java
#RestController
public class AlphaServiceController {
private static final Logger logger = LoggerFactory.getLogger(AlphaServiceController.class);
#Autowired
AlphaDAO dao;
#CrossOrigin(origins = "http://localhost:4200")
#RequestMapping("/alpha")
public Collection<Alpha> index(#RequestBody String owner) {
return dao.findAll(owner);
}
When I try to run the REST controller I get the following error:
APPLICATION FAILED TO START
Description:
Field dao in com.xyz.web.wip.AlphaService.AlphaServiceController required a bean of type 'com.xyz.comp.wip.alphaComp.dao.AlphaDAO' that could not be found.
Action:
Consider defining a bean of type 'com.xyz.comp.wip.alphaComp.dao.AlphaDAO' in your configuration.
Your AlphaDAO class doesn't make much sense: you are trying to autowire two fields but you still have a constructor.
Spring can't build the object because there is no qualifier on the constructor.
You can either do constructor injection or field injection, but you shouldn't use both.
I would recommend constructor injection:
@Repository
public class AlphaDAO {

    private final JdbcTemplate abcJdbc;
    private final JdbcTemplate xyzJdbc;

    @Autowired
    public AlphaDAO(
            @Qualifier("abcJdbc") JdbcTemplate abcJdbc,
            @Qualifier("xyzJdbc") JdbcTemplate xyzJdbc) {
        this.abcJdbc = abcJdbc;
        this.xyzJdbc = xyzJdbc;
    }
}
Also remove your @Bean method from the controller.
Since the DAO classes and the REST controller are in different packages, adding scanBasePackages to the @SpringBootApplication annotation, pointing one level up, worked fine.
AlphaServiceApplication.java
@SpringBootApplication(scanBasePackages = { "com.xyz" })
I am trying to configure a couple of datasources within Spring Batch. On startup, Spring Batch is throwing the following exception:
To use the default BatchConfigurer the context must contain no more than one DataSource, found 2
Snippet from Batch Configuration
@Configuration
@EnableBatchProcessing
public class BatchJobConfiguration {

    @Primary
    @Bean(name = "baseDatasource")
    public DataSource dataSource() {
        // first datasource definition here
    }

    @Bean(name = "secondaryDataSource")
    public DataSource dataSource2() {
        // second datasource definition here
    }
    ...
}
I am not sure why I am seeing this exception, because I have seen XML-based Spring Batch configurations that declare multiple datasources. I am using Spring Batch core version 3.0.1.RELEASE with Spring Boot version 1.1.5.RELEASE. Any help would be greatly appreciated.
You must provide your own BatchConfigurer; Spring does not want to make that decision for you:
@Configuration
@EnableBatchProcessing
public class BatchConfig {

    @Bean
    BatchConfigurer configurer(@Qualifier("batchDataSource") DataSource dataSource) {
        return new DefaultBatchConfigurer(dataSource);
    }
    ...
AbstractBatchConfiguration tries to look up a BatchConfigurer in the container first; only if none is found does it try to create one itself, and this is where the IllegalStateException is thrown when there is more than one DataSource bean in the container.
The approach to solving the problem is to prevent the creation of the DefaultBatchConfigurer bean inside AbstractBatchConfiguration.
To do that, we hint to the Spring container to create a DefaultBatchConfigurer by using the @Component annotation:
we annotate the configuration class where @EnableBatchProcessing is placed with @ComponentScan, scanning the package that contains an empty class derived from DefaultBatchConfigurer:
package batch_config;
...
@EnableBatchProcessing
@ComponentScan(basePackageClasses = MyBatchConfigurer.class)
public class MyBatchConfig {
    ...
}
The full code of that empty derived class is here:
package batch_config.components;

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.stereotype.Component;

@Component
public class MyBatchConfigurer extends DefaultBatchConfigurer {
}
In this configuration the @Primary annotation works for the DataSource bean, as in the example below:
@Configuration
public class BatchTestDatabaseConfig {

    @Bean
    @Primary
    public DataSource dataSource() {
        return .........;
    }
}
This works for the Spring Batch version 3.0.3.RELEASE
The simplest solution to make the @Primary annotation on the DataSource work might be to just add @ComponentScan(basePackageClasses = DefaultBatchConfigurer.class) along with the @EnableBatchProcessing annotation:
@Configuration
@EnableBatchProcessing
@ComponentScan(basePackageClasses = DefaultBatchConfigurer.class)
public class MyBatchConfig {
    ...
}
I would like to provide a solution here which is very similar to the one answered by @vanarchi, but I managed to put all the necessary configuration into one class.
For the sake of completeness, the solution here assumes that the primary datasource is HSQL.
@Configuration
@EnableBatchProcessing
public class BatchConfiguration extends DefaultBatchConfigurer {

    @Bean
    @Primary
    public DataSource batchDataSource() {
        // no need to shut down; EmbeddedDatabaseFactoryBean will take care of this
        EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder();
        EmbeddedDatabase embeddedDatabase = builder
                .addScript("classpath:org/springframework/batch/core/schema-drop-hsqldb.sql")
                .addScript("classpath:org/springframework/batch/core/schema-hsqldb.sql")
                .setType(EmbeddedDatabaseType.HSQL) // .H2 or .DERBY
                .build();
        return embeddedDatabase;
    }

    @Override
    protected JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(batchDataSource());
        factory.setTransactionManager(transactionManager());
        factory.afterPropertiesSet();
        return (JobRepository) factory.getObject();
    }

    private ResourcelessTransactionManager transactionManager() {
        return new ResourcelessTransactionManager();
    }

    // NOTE: the code below just gives developers an easy way to access the in-memory HSQL datasource,
    // since we configured it as the primary datasource for storing batch-job-related data.
    // Default username: sa, password: ''
    @PostConstruct
    public void getDbManager() {
        DatabaseManagerSwing.main(
                new String[] { "--url", "jdbc:hsqldb:mem:testdb", "--user", "sa", "--password", "" });
    }
}
There are THREE key points in this solution:
1. The class is annotated with @EnableBatchProcessing and @Configuration, and extends DefaultBatchConfigurer. By doing this, we instruct Spring Batch to use our customized batch configurer when AbstractBatchConfiguration tries to look up a BatchConfigurer.
2. The batchDataSource bean is annotated as @Primary, which instructs Spring Batch to use this datasource for storing the nine job-related tables.
3. protected JobRepository createJobRepository() throws Exception is overridden, which makes the jobRepository bean use the primary datasource, as well as a transactionManager instance separate from the one used by the other datasource(s).
The simplest solution is to extend the DefaultBatchConfigurer and autowire your datasource via a qualifier:
@Component
public class MyBatchConfigurer extends DefaultBatchConfigurer {

    /**
     * Initialize the BatchConfigurer to use the datasource of your choosing.
     * @param firstDataSource
     */
    @Autowired
    public MyBatchConfigurer(@Qualifier("firstDataSource") DataSource firstDataSource) {
        super(firstDataSource);
    }
}
Side note (as this also deals with the use of multiple data sources): if you use auto-configuration to run data initialization scripts, you may notice that they are not run against the datasource you would expect. For that issue, take a look at this: https://github.com/spring-projects/spring-boot/issues/9528
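If you hit that, a hedged workaround is to wire the initialization script explicitly to the datasource you want, for example via a DataSourceInitializer bean in a @Configuration class (the bean name and script path below are only illustrative):
@Bean
public DataSourceInitializer secondaryDataSourceInitializer(@Qualifier("secondaryDataSource") DataSource dataSource) {
    // run the script against the secondary datasource instead of the auto-configured primary one
    DataSourceInitializer initializer = new DataSourceInitializer();
    initializer.setDataSource(dataSource);
    initializer.setDatabasePopulator(new ResourceDatabasePopulator(new ClassPathResource("data-secondary.sql")));
    return initializer;
}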
You can define the beans below and make sure your application.properties file has the entries they need:
@Configuration
@PropertySource("classpath:application.properties")
public class DataSourceConfig {

    @Primary
    @Bean(name = "abcDataSource")
    @ConfigurationProperties(prefix = "abc.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().type(HikariDataSource.class).build();
    }

    @Bean(name = "xyzDataSource")
    @ConfigurationProperties(prefix = "xyz.datasource")
    public DataSource xyzDataSource() {
        return DataSourceBuilder.create().type(HikariDataSource.class).build();
    }
}
application.properties
abc.datasource.jdbc-url=XXXXX
abc.datasource.username=XXXXX
abc.datasource.password=xxxxx
abc.datasource.driver-class-name=org.postgresql.Driver
...
For reference, see: Spring Boot Configure and Use Two DataSources
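A minimal sketch of wiring the two datasources above into separate JdbcTemplates via @Qualifier (these could sit in the same DataSourceConfig class; the bean names are only illustrative):
@Bean(name = "abcJdbcTemplate")
public JdbcTemplate abcJdbcTemplate(@Qualifier("abcDataSource") DataSource abcDataSource) {
    // template backed by the primary abc datasource
    return new JdbcTemplate(abcDataSource);
}

@Bean(name = "xyzJdbcTemplate")
public JdbcTemplate xyzJdbcTemplate(@Qualifier("xyzDataSource") DataSource xyzDataSource) {
    // template backed by the xyz datasource
    return new JdbcTemplate(xyzDataSource);
}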
First, create a custom BatchConfigurer
@Configuration
@Component
public class TwoDataSourcesBatchConfigurer implements BatchConfigurer {

    @Autowired
    @Qualifier("dataSource1")
    DataSource dataSource;

    @Override
    public JobExplorer getJobExplorer() throws Exception {
        ...
    }

    @Override
    public JobLauncher getJobLauncher() throws Exception {
        ...
    }

    @Override
    public JobRepository getJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        // use the autowired data source
        factory.setDataSource(dataSource);
        factory.setTransactionManager(getTransactionManager());
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    @Override
    public PlatformTransactionManager getTransactionManager() throws Exception {
        ...
    }
}
Then,
@Configuration
@EnableBatchProcessing
@ComponentScan("package")
public class JobConfig {
    // define job, step, ...
}
I am stuck with null values in an autowired property and am hoping I can get some help.
We are using Spring Boot version 0.5.0.M6 for the project.
The four configuration files with beans are in one package and are sorted by "area":
Data source configuration
Global method security configuration (as we use Spring-ACL)
MVC configuration
Spring Security configuration
The main method that bootstraps everything is in the following file:
@EnableAspectJAutoProxy
@EnableSpringConfigured
@EnableAutoConfiguration(exclude = {
        DataSourceTransactionManagerAutoConfiguration.class,
        HibernateJpaAutoConfiguration.class,
        JpaRepositoriesAutoConfiguration.class,
        SecurityAutoConfiguration.class,
        ThymeleafAutoConfiguration.class,
        ErrorMvcAutoConfiguration.class,
        MessageSourceAutoConfiguration.class,
        WebSocketAutoConfiguration.class
})
@Configuration
@ComponentScan
public class IntegrationsImcApplication {

    public static void main(String[] args) throws Exception {
        ApplicationContext ctx = SpringApplication.run(
                IntegrationsImcApplication.class, args);
    }
}
The first file that holds the data source configuration beans is as follows (I have omitted some method body parts to make it more readable):
@EnableTransactionManagement(mode = AdviceMode.ASPECTJ)
@Configuration
public class RootDataSourceConfig
        extends TomcatDataSourceConfiguration
        implements TransactionManagementConfigurer {

    @Override
    public DataSource dataSource() {
        return jpaDataSource();
    }

    public PlatformTransactionManager annotationDrivenTransactionManager() {
        return jpaTransactionManager();
    }

    @Bean
    public HibernateExceptionTranslator hibernateExceptionTranslator() {
        return new HibernateExceptionTranslator();
    }

    @Bean(name = "jpaDataSource")
    public DataSource jpaDataSource() {......}

    @Bean(name = {"transactionManager", "txMgr"})
    public JpaTransactionManager jpaTransactionManager() {......}

    @Bean(name = "entityManagerFactory")
    public EntityManagerFactory jpaEmf() {......}
}
And here is the next configuration file, which depends on the data source from above. It has about 20 beans related to the ACL configuration, but it fails on the first bean that uses the data source:
@EnableGlobalMethodSecurity(prePostEnabled = true)
@Configuration
public class RootGlobalMethodSecurityConfig
        extends GlobalMethodSecurityConfiguration
        implements Ordered {

    @Autowired
    public DataSource dataSource;

    @Override
    public int getOrder() {
        return IntegrationsImcApplication.ROOT_METHOD_SECURITY_CONFIG_ORDER;
    }

    @Bean
    public MutableAclService aclService()
            throws CacheException, IOException {
        MutableJdbcAclService aclService = new MutableJdbcAclService(
                dataSource, aclLookupStrategy(), aclCache());
        aclService.setClassIdentityQuery("SELECT @@IDENTITY");
        aclService.setSidIdentityQuery("SELECT @@IDENTITY");
        return aclService;
    }
    ...................................
}
Basically, invoking aclService() throws an error because dataSource is null. We have tried ordering the configuration files by implementing the Ordered interface. We also tried using @AutoConfigureAfter(RootDataSourceConfig.class), but this did not help either. Instead of using @Autowired on the DataSource, we also tried injecting the RootDataSourceConfig class itself, but it was still null. We tried using @DependsOn and @Ordered on those beans, but again no success. It seems like nothing can be injected into this configuration.
The console output at startup lists the beans in the order we want them, with the data source being first. We are pretty much blocked by this.
Is there anything weird or unique we are doing here that is not working? If this is as designed, how could we inject the data source differently?
Repo: github
Eager initialization of a bean that depends on a DataSource is definitely the problem. The root cause has nothing to do with Spring Boot or autoconfiguration; it is plain old-fashioned chicken and egg: method security is applied via an aspect which is wrapped around your business beans by a BeanPostProcessor. A bean can only be post-processed by something that is initialized very early. In this case it is too early to have the DataSource injected (actually, the @Configuration class that needs the DataSource is instantiated too early to be wrapped properly by the @Configuration processing machinery, so it cannot be autowired). My proposal (which only gets you to the same point with the missing AuthenticationManager) is to declare the GlobalMethodSecurityConfiguration as a nested class instead of the one that the DataSource is needed in:
@EnableGlobalMethodSecurity(prePostEnabled = true)
@Configuration
protected static class ActualMethodSecurityConfiguration extends GlobalMethodSecurityConfiguration {

    @Autowired
    @Qualifier("aclDaoAuthenticationProvider")
    private AuthenticationProvider aclDaoAuthenticationProvider;

    @Autowired
    @Qualifier("aclAnonymousAuthenticationProvider")
    private AnonymousAuthenticationProvider aclAnonymousAuthenticationProvider;

    @Autowired
    @Qualifier("aclExpressionHandler")
    private MethodSecurityExpressionHandler aclExpressionHandler;

    @Override
    protected void configure(AuthenticationManagerBuilder auth)
            throws Exception {
        auth.authenticationProvider(aclDaoAuthenticationProvider);
        auth.authenticationProvider(aclAnonymousAuthenticationProvider);
    }

    @Override
    public MethodSecurityExpressionHandler createExpressionHandler() {
        return aclExpressionHandler;
    }
}
I.e. stick that inside the RootMethodSecurityConfiguration and remove the @EnableGlobalMethodSecurity annotation from that class.
I might have resolved the problem.
GlobalMethodSecurityConfiguration.class has the following setter that tries to autowire permission evaluators:
@Autowired(required = false)
public void setPermissionEvaluator(List<PermissionEvaluator> permissionEvaluators) {
    ....
}
And in my case the aclPermissionEvaluator() bean needs the aclService() bean, which in turn depends on another autowired property, dataSource, which does not seem to be autowired yet.
To fix this I implemented BeanFactoryAware and got the dataSource from the BeanFactory instead:
public class RootMethodSecurityConfiguration extends GlobalMethodSecurityConfiguration implements BeanFactoryAware {

    private DataSource dataSource;

    @Override
    public void setBeanFactory(BeanFactory beanFactory) throws BeansException {
        this.dataSource = beanFactory.getBean("dataSource", DataSource.class);
    }
    ....
}
After this, another exception showed up, where SecurityAutoConfiguration complains about a missing AuthenticationManager, so I just excluded it from @EnableAutoConfiguration. I am not sure if it's ideal, but I have a custom security configuration, so this way everything works OK.