Set MongoDB converter programmatically - Java

I'm trying to use a custom converter with spring-data-mongodb. I want to create it programmatically, but I get the following error:
org.springframework.core.convert.ConverterNotFoundException: No converter found capable of converting from type org.joda.time.LocalDate to type java.lang.String
at org.springframework.core.convert.support.GenericConversionService.handleConverterNotFound(GenericConversionService.java:475)
at org.springframework.core.convert.support.GenericConversionService.convert(GenericConversionService.java:175)
at org.springframework.core.convert.support.GenericConversionService.convert(GenericConversionService.java:154)
....
....
The following is the failing code snippet:
Mongo mongo = new Mongo();
MongoDbFactory mongoDbFactory = new SimpleMongoDbFactory(mongo, "database");
List<Converter> converters = new ArrayList<>();
converters.add(new LocalDateWriteConverter());
converters.add(new LocalDateReadConverter());
CustomConversions customConversions = new CustomConversions(converters);
MappingContext mappingContext = new SimpleMongoMappingContext();
MappingMongoConverter mappingMongoConverter = new MappingMongoConverter(mongoDbFactory, mappingContext);
mappingMongoConverter.setCustomConversions(customConversions);
MongoTemplate mongoTemplate = new MongoTemplate(mongoDbFactory, mappingMongoConverter);
MongoDbEvent mongoEvent = new MongoDbEvent(new LocalDate(2012, 12, 8));
mongoTemplate.insert(mongoEvent);
And here are my converter classes:
class LocalDateReadConverter implements Converter<String, LocalDate> {
@Override
public LocalDate convert(String s) {
// Conversion code omitted.
}
}
class LocalDateWriteConverter implements Converter<LocalDate, String> {
@Override
public String convert(LocalDate localDate) {
// Conversion code omitted.
}
}
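For illustration, conversion bodies like the following would satisfy the converter contracts by round-tripping Joda-Time's ISO-8601 string form (a minimal sketch; the actual omitted code may differ):
class LocalDateWriteConverter implements Converter<LocalDate, String> {
@Override
public String convert(LocalDate localDate) {
// Joda-Time's toString() yields the ISO-8601 form, e.g. "2012-12-08"
return localDate.toString();
}
}
class LocalDateReadConverter implements Converter<String, LocalDate> {
@Override
public LocalDate convert(String s) {
// Parses the ISO-8601 form written above
return LocalDate.parse(s);
}
}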
The class I'm trying to persist looks like this:
import org.joda.time.LocalDate;
public class MongoDbEvent {
private String id;
private LocalDate date;
public MongoDbEvent(LocalDate date) {
this.date = date;
}
public String getId() {
return id;
}
public LocalDate getDate() {
return date;
}
@Override
public String toString() {
return "MongoDbEvent{" +
"id='" + id + '\'' +
", date=" + date +
'}';
}
}

This answer may be a little late for the OP, but I just ran into the same problem today and found a solution...
To set it up programmatically, you need to call MappingMongoConverter.afterPropertiesSet() before you use it. I realized this from reading the code of MongoTemplate.getDefaultMongoConverter(MongoDbFactory).
Here's an example:
MappingMongoConverter converter = new MappingMongoConverter(mongoDbFactory, context);
converter.setTypeMapper(mapper);
converter.setCustomConversions(new CustomConversions(
Arrays.asList(
new TimeZoneReadConverter(),
new TimeZoneWriteConverter()
)
));
converter.afterPropertiesSet();
MongoTemplate template = new MongoTemplate(mongoDbFactory, converter);

Just a heads up: I was struggling with this problem on spring-data-mongodb 1.5.1.RELEASE using Java configuration. As some classes have changed, I'm posting my solution.
Add the following definitions in your configuration class annotated with @Configuration:
@Bean
public Mongo mongo() throws Exception {
MongoPropertiesResolver resolver = mongoResolver();
return new MongoClient(resolver.getUrl());
}
@Bean
public MongoDbFactory mongoDbFactory() throws Exception {
return new SimpleMongoDbFactory(mongo(), "database");
}
@Bean
public MongoTemplate mongoTemplate() throws Exception {
return new MongoTemplate(mongoDbFactory(), mongoConverter());
}
@Bean
public CustomConversions customConversions() {
List<Converter<?, ?>> converters = new ArrayList<Converter<?, ?>>();
converters.add(new TimeZoneReadConverter());
converters.add(new TimeZoneWriteConverter());
return new CustomConversions(converters);
}
@Bean
public MappingMongoConverter mongoConverter() throws Exception {
MongoMappingContext mappingContext = new MongoMappingContext();
DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
MappingMongoConverter mongoConverter = new MappingMongoConverter(dbRefResolver, mappingContext);
mongoConverter.setCustomConversions(customConversions());
return mongoConverter;
}
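TimeZoneReadConverter and TimeZoneWriteConverter referenced above are the answerer's own classes; assuming the time zone is stored by its ID, a minimal sketch could look like this:
@ReadingConverter
public class TimeZoneReadConverter implements Converter<String, TimeZone> {
@Override
public TimeZone convert(String source) {
// Resolve the stored ID, e.g. "Europe/Berlin", back to a java.util.TimeZone
return TimeZone.getTimeZone(source);
}
}
@WritingConverter
public class TimeZoneWriteConverter implements Converter<TimeZone, String> {
@Override
public String convert(TimeZone source) {
return source.getID();
}
}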

How to customize Mongo with custom converters is described here in detail:
http://docs.spring.io/spring-data/mongodb/docs/current/reference/html/#mapping-configuration
I injected the default configuration values so I can benefit from the application.properties configuration settings.
@Configuration
public class MongoConfiguration extends AbstractMongoConfiguration {
@Value("${spring.data.mongodb.database:test}")
String database;
@Value("${spring.data.mongodb.host:localhost}:${spring.data.mongodb.port:27017}")
String host;
@Override
protected String getDatabaseName() {
return database;
}
@Override
public Mongo mongo() throws Exception {
return new MongoClient(host);
}
@Bean
@Override
public CustomConversions customConversions() {
List<Converter<?, ?>> converterList = new ArrayList<Converter<?, ?>>();
converterList.add(new MongoColorWriter());
converterList.add(new MongoColorReader());
return new CustomConversions(converterList);
}
}
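MongoColorWriter and MongoColorReader above are the answer's own classes; assuming they map java.awt.Color to a hex string and back, a minimal sketch could be:
@WritingConverter
public class MongoColorWriter implements Converter<Color, String> {
@Override
public String convert(Color source) {
// Store the colour as "#RRGGBB"
return String.format("#%06X", source.getRGB() & 0xFFFFFF);
}
}
@ReadingConverter
public class MongoColorReader implements Converter<String, Color> {
@Override
public Color convert(String source) {
return Color.decode(source);
}
}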

With the introduction of the java.time package in Java 8, I ran into a similar issue using the new LocalDate and LocalDateTime classes from that package.
This is how I solved it:
I wrote a converter for all four of these conversion options:
DateToLocalDateTimeConverter
DateToLocalDateConverter
LocalDateTimeToDateConverter
LocalDateToDateConverter
Here is an example:
public class DateToLocalDateTimeConverter implements Converter<Date, LocalDateTime> {
@Override
public LocalDateTime convert(Date source) {
return source == null ? null : LocalDateTime.ofInstant(source.toInstant(), ZoneId.systemDefault());
}
}
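The write-side counterpart, LocalDateTimeToDateConverter, is essentially the reverse mapping with the same system-default zone assumption (a sketch; the two LocalDate converters follow the same pattern):
public class LocalDateTimeToDateConverter implements Converter<LocalDateTime, Date> {
@Override
public Date convert(LocalDateTime source) {
// java.time.LocalDateTime to java.util.Date via the system default zone
return source == null ? null : Date.from(source.atZone(ZoneId.systemDefault()).toInstant());
}
}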
Then, by including this in the XML configuration for the MongoDB connection, I was able to work with Java 8 dates in MongoDB (remember to add all the converters):
<mongo:mapping-converter>
<mongo:custom-converters>
<mongo:converter>
<bean class="package.DateToLocalDateTimeConverter" />
</mongo:converter>
</mongo:custom-converters>
</mongo:mapping-converter>

In my case the problem was that my converter was registered as a reading converter instead of a writing converter. To fix that, add the @WritingConverter annotation to your converter class:
@Component
@WritingConverter
public class NoteWriterConverter implements Converter<Note, DBObject> {
@Override
public DBObject convert(Note source) {
DBObject obj = new BasicDBObject();
obj.put("title", source.getTitle());
obj.put("reviewDate", source.getReviewDate());
obj.removeField("_class");
return obj;
}
}
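For the opposite direction you would register a separate class annotated with @ReadingConverter; a hypothetical sketch (it assumes Note has a no-arg constructor and setters mirroring the getters used above, which the question does not show):
@Component
@ReadingConverter
public class NoteReaderConverter implements Converter<DBObject, Note> {
@Override
public Note convert(DBObject source) {
// Assumed API: Note(), setTitle(String), setReviewDate(Date)
Note note = new Note();
note.setTitle((String) source.get("title"));
note.setReviewDate((Date) source.get("reviewDate"));
return note;
}
}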

As of org.springframework.data:spring-data-commons:1.13.3.RELEASE, here's how to programmatically create a MongoTemplate with custom converters:
public MongoTemplate mongoTemplate(String mongoUri) throws Exception {
MongoDbFactory factory = new SimpleMongoDbFactory(new MongoClientURI(mongoUri));
CustomConversions conversions = new CustomConversions(
Arrays.asList(new FooWriteConverter(), new FooReadConverter()));
MongoMappingContext mappingContext = new MongoMappingContext();
DbRefResolver dbRefResolver = new DefaultDbRefResolver(factory);
MappingMongoConverter mongoConverter = new MappingMongoConverter(dbRefResolver, mappingContext);
mongoConverter.setCustomConversions(conversions);
mongoConverter.afterPropertiesSet();
return new MongoTemplate(factory, mongoConverter);
}
The converters (implementation omitted)
class FooWriteConverter implements Converter<Foo, DBObject> { ... }
class FooReadConverter implements Converter<DBObject, Foo> { ... }

Related

Spring Data JDBC: can't add custom converter for enum

I want to have an enum as a field of my entity.
My application looks like this:
Spring Boot version:
plugins {
id 'org.springframework.boot' version '2.6.2' apply false
}
repository:
@Repository
public interface MyEntityRepository extends PagingAndSortingRepository<MyEntity, UUID> {
...
entity:
@Table("my_entity")
public class MyEntity{
...
private FileType fileType;
// get + set
}
enum declaration:
public enum FileType {
TYPE_1(1),
TYPE_2(2);
int databaseId;
public static FileType byDatabaseId(Integer databaseId){
return Arrays.stream(values()).findFirst().orElse(null);
}
FileType(int databaseId) {
this.databaseId = databaseId;
}
public int getDatabaseId() {
return databaseId;
}
}
My attempt:
I found the following answer and tried to follow it: https://stackoverflow.com/a/53296199/2674303
So I've added a bean:
@Bean
public JdbcCustomConversions jdbcCustomConversions() {
return new JdbcCustomConversions(asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter()));
}
converters:
@WritingConverter
public class FileTypeToDatabaseIdConverter implements Converter<FileType, Integer> {
@Override
public Integer convert(FileType source) {
return source.getDatabaseId();
}
}
@ReadingConverter
public class DatabaseIdToFileTypeConverter implements Converter<Integer, FileType> {
@Override
public FileType convert(Integer databaseId) {
return FileType.byDatabaseId(databaseId);
}
}
But I see this error:
The bean 'jdbcCustomConversions', defined in class path resource
[org/springframework/boot/autoconfigure/data/jdbc/JdbcRepositoriesAutoConfiguration$SpringBootJdbcConfiguration.class],
could not be registered. A bean with that name has already been
defined in my.pack.Main and overriding is disabled.
I've tried renaming the method jdbcCustomConversions() to myJdbcCustomConversions(). That avoided the error above, but the converter is not invoked during entity persistence, and I see another error: the application tries to save a String while the database column type is bigint.
20:39:10.689 DEBUG [main] o.s.jdbc.core.StatementCreatorUtils: JDBC getParameterType call failed - using fallback method instead: org.postgresql.util.PSQLException: ERROR: column "file_type" is of type bigint but expression is of type character varying
Hint: You will need to rewrite or cast the expression.
Position: 174
I also tried to use the latest (at the time of writing) version of Spring Boot:
id 'org.springframework.boot' version '2.6.2' apply false
But it didn't help.
What have I missed?
How can I map an enum to an integer column properly?
P.S.
I use the following code for testing:
@SpringBootApplication
@EnableJdbcAuditing
@EnableScheduling
public class Main {
public static void main(String[] args) {
ConfigurableApplicationContext applicationContext = SpringApplication.run(Main.class, args);
MyEntityRepository repository = applicationContext.getBean(MyEntityRepository.class);
MyEntity entity = new MyEntity();
...
entity.setFileType(FileType.TYPE_2);
repository.save(entity);
}
@Bean
public ModelMapper modelMapper() {
ModelMapper mapper = new ModelMapper();
mapper.getConfiguration()
.setMatchingStrategy(MatchingStrategies.STRICT)
.setFieldMatchingEnabled(true)
.setSkipNullEnabled(true)
.setFieldAccessLevel(PRIVATE);
return mapper;
}
@Bean
public AbstractJdbcConfiguration jdbcConfiguration() {
return new MySpringBootJdbcConfiguration();
}
@Configuration
static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
@Override
protected List<?> userConverters() {
return asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
}
}
}
UPDATE
My code is:
@SpringBootApplication
@EnableJdbcAuditing
@EnableScheduling
public class Main {
public static void main(String[] args) {
ConfigurableApplicationContext applicationContext = SpringApplication.run(Main.class, args);
MyEntityRepository repository = applicationContext.getBean(MyEntityRepository.class);
MyEntity entity = new MyEntity();
...
entity.setFileType(FileType.TYPE_2);
repository.save(entity);
}
@Bean
public ModelMapper modelMapper() {
ModelMapper mapper = new ModelMapper();
mapper.getConfiguration()
.setMatchingStrategy(MatchingStrategies.STRICT)
.setFieldMatchingEnabled(true)
.setSkipNullEnabled(true)
.setFieldAccessLevel(PRIVATE);
return mapper;
}
@Bean
public AbstractJdbcConfiguration jdbcConfiguration() {
return new MySpringBootJdbcConfiguration();
}
@Configuration
static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
@Override
protected List<?> userConverters() {
return asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
}
@Bean
public JdbcConverter jdbcConverter(JdbcMappingContext mappingContext,
NamedParameterJdbcOperations operations,
@Lazy RelationResolver relationResolver,
JdbcCustomConversions conversions,
Dialect dialect) {
JdbcArrayColumns arrayColumns = dialect instanceof JdbcDialect ? ((JdbcDialect) dialect).getArraySupport()
: JdbcArrayColumns.DefaultSupport.INSTANCE;
DefaultJdbcTypeFactory jdbcTypeFactory = new DefaultJdbcTypeFactory(operations.getJdbcOperations(),
arrayColumns);
return new MyJdbcConverter(
mappingContext,
relationResolver,
conversions,
jdbcTypeFactory,
dialect.getIdentifierProcessing()
);
}
}
static class MyJdbcConverter extends BasicJdbcConverter {
MyJdbcConverter(
MappingContext<? extends RelationalPersistentEntity<?>, ? extends RelationalPersistentProperty> context,
RelationResolver relationResolver,
CustomConversions conversions,
JdbcTypeFactory typeFactory,
IdentifierProcessing identifierProcessing) {
super(context, relationResolver, conversions, typeFactory, identifierProcessing);
}
@Override
public int getSqlType(RelationalPersistentProperty property) {
if (FileType.class.equals(property.getActualType())) {
return Types.BIGINT;
} else {
return super.getSqlType(property);
}
}
@Override
public Class<?> getColumnType(RelationalPersistentProperty property) {
if (FileType.class.equals(property.getActualType())) {
return Long.class;
} else {
return super.getColumnType(property);
}
}
}
}
But I get the following error:
Caused by: org.postgresql.util.PSQLException: Cannot convert an instance of java.lang.String to type long
at org.postgresql.jdbc.PgPreparedStatement.cannotCastException(PgPreparedStatement.java:925)
at org.postgresql.jdbc.PgPreparedStatement.castToLong(PgPreparedStatement.java:810)
at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:561)
at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:931)
at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.setObject(HikariProxyPreparedStatement.java)
at org.springframework.jdbc.core.StatementCreatorUtils.setValue(StatementCreatorUtils.java:414)
at org.springframework.jdbc.core.StatementCreatorUtils.setParameterValueInternal(StatementCreatorUtils.java:231)
at org.springframework.jdbc.core.StatementCreatorUtils.setParameterValue(StatementCreatorUtils.java:146)
at org.springframework.jdbc.core.PreparedStatementCreatorFactory$PreparedStatementCreatorImpl.setValues(PreparedStatementCreatorFactory.java:283)
at org.springframework.jdbc.core.PreparedStatementCreatorFactory$PreparedStatementCreatorImpl.createPreparedStatement(PreparedStatementCreatorFactory.java:241)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:649)
... 50 more
Caused by: java.lang.NumberFormatException: For input string: "TYPE_2"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Long.parseLong(Long.java:589)
at java.lang.Long.parseLong(Long.java:631)
at org.postgresql.jdbc.PgPreparedStatement.castToLong(PgPreparedStatement.java:792)
... 59 more
Try the following instead:
@Bean
public AbstractJdbcConfiguration jdbcConfiguration() {
return new MySpringBootJdbcConfiguration();
}
@Configuration
static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
@Override
protected List<?> userConverters() {
return List.of(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
}
}
Explanation:
Spring complains that the jdbcCustomConversions bean defined in the auto-configuration class clashes with the one you defined, and you don't have bean overriding enabled.
JdbcRepositoriesAutoConfiguration has changed a few times; in Spring Boot 2.6.2 it has:
@Configuration(proxyBeanMethods = false)
@ConditionalOnMissingBean(AbstractJdbcConfiguration.class)
static class SpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
}
In turn, AbstractJdbcConfiguration has:
@Bean
public JdbcCustomConversions jdbcCustomConversions() {
try {
Dialect dialect = applicationContext.getBean(Dialect.class);
SimpleTypeHolder simpleTypeHolder = dialect.simpleTypes().isEmpty() ? JdbcSimpleTypes.HOLDER
: new SimpleTypeHolder(dialect.simpleTypes(), JdbcSimpleTypes.HOLDER);
return new JdbcCustomConversions(
CustomConversions.StoreConversions.of(simpleTypeHolder, storeConverters(dialect)), userConverters());
} catch (NoSuchBeanDefinitionException exception) {
LOG.warn("No dialect found. CustomConversions will be configured without dialect specific conversions.");
return new JdbcCustomConversions();
}
}
As you can see, JdbcCustomConversions is not conditional in any way, so defining your own caused a conflict. Fortunately, it provides an extension point, userConverters(), which can be overridden to provide your own converters.
Update
As discussed in comments:
FileType.byDatabaseId is broken - it ignores its input parameter (a corrected sketch follows right after this list)
as the column type in the database is BIGINT, your converters must convert from Long, not from Integer; this addresses read queries
for writes, there is an open bug: https://github.com/spring-projects/spring-data-jdbc/issues/629. There is a hardcoded assumption that enums are converted to Strings, and only Enum -> String converters are checked.
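For the first point, a byDatabaseId that actually filters on its parameter could look like this (sketch):
public static FileType byDatabaseId(Integer databaseId) {
// Match the stored numeric id instead of blindly returning the first constant
return Arrays.stream(values())
.filter(type -> databaseId != null && type.databaseId == databaseId)
.findFirst()
.orElse(null);
}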
As we want to convert to Long, we need to make amendments to BasicJdbcConverter by subclassing it and registering the subclass as a @Bean.
You need to override two methods:
public int getSqlType(RelationalPersistentProperty property)
public Class<?> getColumnType(RelationalPersistentProperty property)
I hardcoded the Enum type and corresponding column types, but you may want to get more fancy with that.
@Bean
public AbstractJdbcConfiguration jdbcConfiguration() {
return new MySpringBootJdbcConfiguration();
}
@Configuration
static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
@Override
protected List<?> userConverters() {
return List.of(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
}
@Bean
public JdbcConverter jdbcConverter(JdbcMappingContext mappingContext,
NamedParameterJdbcOperations operations,
@Lazy RelationResolver relationResolver,
JdbcCustomConversions conversions,
Dialect dialect) {
JdbcArrayColumns arrayColumns = dialect instanceof JdbcDialect ? ((JdbcDialect) dialect).getArraySupport()
: JdbcArrayColumns.DefaultSupport.INSTANCE;
DefaultJdbcTypeFactory jdbcTypeFactory = new DefaultJdbcTypeFactory(operations.getJdbcOperations(),
arrayColumns);
return new MyJdbcConverter(
mappingContext,
relationResolver,
conversions,
jdbcTypeFactory,
dialect.getIdentifierProcessing()
);
}
}
static class MyJdbcConverter extends BasicJdbcConverter {
MyJdbcConverter(
MappingContext<? extends RelationalPersistentEntity<?>, ? extends RelationalPersistentProperty> context,
RelationResolver relationResolver,
CustomConversions conversions,
JdbcTypeFactory typeFactory,
IdentifierProcessing identifierProcessing) {
super(context, relationResolver, conversions, typeFactory, identifierProcessing);
}
@Override
public int getSqlType(RelationalPersistentProperty property) {
if (FileType.class.equals(property.getActualType())) {
return Types.BIGINT;
} else {
return super.getSqlType(property);
}
}
@Override
public Class<?> getColumnType(RelationalPersistentProperty property) {
if (FileType.class.equals(property.getActualType())) {
return Long.class;
} else {
return super.getColumnType(property);
}
}
}
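With the column now treated as BIGINT, the read converter should accept Long rather than Integer (point 2 above); a sketch of the adjusted pair, keeping the original class names:
@ReadingConverter
public class DatabaseIdToFileTypeConverter implements Converter<Long, FileType> {
@Override
public FileType convert(Long databaseId) {
// The driver returns bigint columns as Long
return databaseId == null ? null : FileType.byDatabaseId(databaseId.intValue());
}
}
@WritingConverter
public class FileTypeToDatabaseIdConverter implements Converter<FileType, Long> {
@Override
public Long convert(FileType source) {
return (long) source.getDatabaseId();
}
}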

Spring Data Mongo can't pick up custom ZonedDateTime converter, why?

My problem is like this one: CodecConfigurationException when saving ZonedDateTime to MongoDB with Spring Boot >= 2.0.1.RELEASE,
but I wrote custom ZonedDateTime converters:
ZonedDateTimeToDateConverter
@WritingConverter
public class ZonedDateTimeToDateConverter implements Converter<ZonedDateTime, Date> {
@Override
public Date convert(ZonedDateTime source) {
if (source == null) {
return null;
}
return Date.from(source.toInstant());
}
}
DateToZonedDateTimeConverter
@ReadingConverter
public class DateToZonedDateTimeConverter implements Converter<Date, ZonedDateTime> {
@Override
public ZonedDateTime convert(Date source) {
if (source == null) {
return null;
}
return ZonedDateTime.ofInstant(source.toInstant(), ZoneId.of("UTC"));
}
}
and my test:
@Autowired
ReactiveMongoOperations operations;
@Test
void test() {
ObjectId id = new ObjectId();
Document doc = new Document();
doc.append("_id", id);
// doc.append("a", ZonedDateTime.now()); // works
doc.append("zd1", new Document("f", ZonedDateTime.now())); // not working
operations.insert(doc, "test-collection").block();
Document found = Mono.from(operations.getCollection("test-collection")
.find(new Document("_id", id)).first()).block();
Assertions.assertNotNull(found);
}
If I add a ZonedDateTime instance at the first level of a document, like doc.append("a", ZonedDateTime.now()), the document saves fine. But if I place a ZonedDateTime instance in a document as a nested field (second level of nesting), I receive an exception:
org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class java.time.ZonedDateTime.
What am I doing wrong?
I solved a similar problem by adding the converters to a custom conversions configuration:
@Configuration
public class MongoCustomConverterConfig {
@Bean
public MongoCustomConversions mongoCustomConversions(){
List<Converter<?,?>> converters = new ArrayList<>();
converters.add(new ZoneDateTimeWriteConverter());
converters.add(new ZonedDateTimeReadConverter());
return new MongoCustomConversions(converters);
}
}

Dynamic datasource routing - DataSource router not initialized

I'm referring to this article, in which we can use the AbstractRoutingDataSource from Spring Framework to dynamically change the data source used by the application. I'm using MyBatis (3.3.0) with Spring (4.1.6.RELEASE). I want to switch to the backup database if an exception occurs while getting data from the main database. In this example, I have used HSQL and MySQL databases.
RoutingDataSource:
public class RoutingDataSource extends AbstractRoutingDataSource {
@Override
protected Object determineCurrentLookupKey() {
return DataSourceContextHolder.getTargetDataSource();
}
}
DataSourceContextHolder:
public class DataSourceContextHolder {
private static final ThreadLocal<DataSourceEnum> contextHolder = new ThreadLocal<DataSourceEnum>();
public static void setTargetDataSource(DataSourceEnum targetDataSource) {
contextHolder.set(targetDataSource);
}
public static DataSourceEnum getTargetDataSource() {
return (DataSourceEnum) contextHolder.get();
}
public static void resetDefaultDataSource() {
contextHolder.remove();
}
}
ApplicationDataConfig:
@Configuration
@MapperScan(basePackages = "com.sample.mapper")
@ComponentScan("com.sample.config")
@PropertySource(value = {"classpath:app.properties"},
ignoreResourceNotFound = true)
public class ApplicationDataConfig {
@Bean
public static PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer() {
PropertySourcesPlaceholderConfigurer configurer =
new PropertySourcesPlaceholderConfigurer();
return configurer;
}
@Bean
public SqlSessionFactoryBean sqlSessionFactoryBean() throws Exception {
SqlSessionFactoryBean sessionFactory = new SqlSessionFactoryBean();
RoutingDataSource routingDataSource = new RoutingDataSource();
routingDataSource.setDefaultTargetDataSource(dataSource1());
Map<Object, Object> targetDataSource = new HashMap<Object, Object>();
targetDataSource.put(DataSourceEnum.HSQL, dataSource1());
targetDataSource.put(DataSourceEnum.BACKUP, dataSource2());
routingDataSource.setTargetDataSources(targetDataSource);
sessionFactory.setDataSource(routingDataSource);
sessionFactory.setTypeAliasesPackage("com.sample.common.domain");
sessionFactory.setMapperLocations(
new PathMatchingResourcePatternResolver()
.getResources("classpath*:com/sample/mapper/**/*.xml"));
return sessionFactory;
}
@Bean
public DataSource dataSource1() {
return new EmbeddedDatabaseBuilder().setType(EmbeddedDatabaseType.HSQL).addScript(
"classpath:database/app-hsqldb-schema.sql").addScript(
"classpath:database/app-hsqldb-datascript.sql").build();
}
@Bean
public DataSource dataSource2() {
PooledDataSourceFactory pooledDataSourceFactory = new PooledDataSourceFactory();
pooledDataSourceFactory.setProperties(jdbcProperties());
return pooledDataSourceFactory.getDataSource();
}
@Bean
protected Properties jdbcProperties() {
//Get the data from properties file
Properties jdbcProperties = new Properties();
jdbcProperties.setProperty("url", datasourceUrl);
jdbcProperties.setProperty("driver", datasourceDriver);
jdbcProperties.setProperty("username", datasourceUsername);
jdbcProperties.setProperty("password", datasourcePassword);
jdbcProperties.setProperty("poolMaximumIdleConnections", maxConnectionPoolSize);
jdbcProperties.setProperty("poolMaximumActiveConnections", minConnectionPoolSize);
return jdbcProperties;
}
}
Client:
@Autowired
private ApplicationMapper appMapper;
public MyObject getObjectById(String Id) {
MyObject myObj = null;
try{
DataSourceContextHolder.setTargetDataSource(DataSourceEnum.HSQL);
myObj = appMapper.getObjectById(Id);
}catch(Throwable e){
DataSourceContextHolder.setTargetDataSource(DataSourceEnum.BACKUP);
myObj = appMapper.getObjectById(Id);
}finally{
DataSourceContextHolder.resetDefaultDataSource();
}
return getObjectDetails(myObj);
}
I'm getting the following exception
### Error querying database. Cause: java.lang.IllegalArgumentException: DataSource router not initialized
However, I'm able to get things working if I use only one database at a time, which means there is no issue with the data source configuration.
Try adding this last line (keeping the same order). The router is only initialized when afterPropertiesSet() is called; since the RoutingDataSource is created manually inside the @Bean method rather than registered as a bean itself, Spring never invokes that callback for it:
targetDataSource.put(DataSourceEnum.HSQL, dataSource1());
targetDataSource.put(DataSourceEnum.BACKUP, dataSource2());
routingDataSource.setTargetDataSources(targetDataSource);
routingDataSource.afterPropertiesSet();
I got the same issue and found a solution using the SchemaExport class of Hibernate.
For each DataSourceEnum you can manually initialize the datasource.
Here is my detailed answer to my own issue description.

Spring Batch Proxy Failure java.lang.ClassCastException: com.sun.proxy.$Proxy20 cannot be cast to

Why would I be getting a cast error from a proxy based on a bean that is properly implementing an interface? Package and object names have been changed to protect proprietary data.
I know the error
java.lang.ClassCastException: com.sun.proxy.$Proxy20 cannot be cast to package.Order
normally comes up when one is not implementing an interface as is expected. In this case I am implementing interfaces for facilitating Proxy generation.
My specific problem is using a collection of domain objects in an enhanced for loop within an ItemWriter implementation. This exception is being thrown from the enhanced for loop I reference below.
public void write(List<? extends Order> items) throws Exception {
for ( Order order : items){
Order is an interface implemented by OrderBean.
public class OrderBean implements Order
And it is declared as a prototype for use by a BeanWrapperFieldSetMapper in a Spring Java configuration class, as below:
@Bean
@Scope("prototype")
public Order order()
As an experiment I commented out the Java Configuration declaration, and replaced it with the XML declaration below.
<bean name="order"
scope="prototype"
class="package.OrderBean"/>
The entirety of my configuration appears below as requested in comments. I am not sure why the Order objects are being proxied, unless possibly it comes from the BeanWrapperFieldSetMapper.
Upon further testing, I found I get the same error from any bean set with the step scope, as in @Scope("step").
@Configuration
public class OrderProcessingBatchConfiguration {
@Value("${batch.jdbc.driver}")
private String driverClassName;
@Value("${batch.jdbc.url}")
private String driverUrl;
@Value("${batch.jdbc.user}")
private String driverUsername;
@Value("${batch.jdbc.password}")
private String driverPassword;
@Value("${order.delimiter}")
private String delimiter;
@Value("${order.item.field.names}")
private String[] orderItemFieldNames;
@Value("${order.item.file.path}")
private String orderItemFilePath;
@Value("${order.field.names}")
private String[] orderFieldNames;
@Value("${order.file.path}")
private String orderFilePath;
@Value("${query.order.clear}")
private String deleteOrderQuery;
@Value("${query.order.item.clear}")
private String deleteOrderItemQuery;
@Value("${ftp.host.name}")
private String ftpHostName;
@Value("${ftp.host.port}")
private Integer ftpHostPort;
@Value("${ftp.client.mode}")
private Integer ftpClientMode;
@Value("${ftp.file.type}")
private Integer ftpFileType;
@Value("${ftp.host.username}")
private String ftpUsername;
@Value("${ftp.host.password}")
private String ftpPassword;
@Value("${ftp.tasklet.retryIfNotFound}")
private Boolean retryIfNotFound;
@Value("${ftp.tasklet.download.attempts}")
private Integer downloadAttempts;
@Value("${ftp.tasklet.retry.interval}")
private Integer retryInterval;
@Value("${ftp.tasklet.file.name.pattern}")
private String fileNamePattern;
@Value("${ftp.host.remote.directory}")
private String remoteDirectory;
@Value("${ftp.client.local.directory}")
private File localDirectory;
@Value("${ftp.tasklet.sftp}")
private Boolean sftp;
@Value("${query.order.insert}")
private String orderInsertQuery;
@Value("${query.order.items.insert}")
private String orderItemInsertQuery;
@Autowired
@Qualifier("jobRepository")
private JobRepository jobRepository;
@Bean
public DataSource dataSource() {
BasicDataSource dataSource = new BasicDataSource();
dataSource.setDriverClassName(driverClassName);
dataSource.setUrl(driverUrl);
dataSource.setUsername(driverUsername);
dataSource.setPassword(driverPassword);
return dataSource;
}
@Bean
public SimpleJobLauncher jobLauncher() {
SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
jobLauncher.setJobRepository(jobRepository);
return jobLauncher;
}
@Bean
public PlatformTransactionManager transactionManager() {
return new DataSourceTransactionManager(dataSource());
}
@Bean
@Scope("prototype")
public OrderItem orderItem(){
return new OrderItemBean();
}
@Bean
@Scope("prototype")
public Order order(){
return new OrderBean();
}
@Bean
//@Scope("step")
public DelimitedLineTokenizer orderItemLineTokenizer(){
DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer(delimiter);
lineTokenizer.setNames(orderItemFieldNames);
return lineTokenizer;
}
@Bean
//@Scope("step")
public BeanWrapperFieldSetMapper<OrderItem> orderItemFieldSetMapper(){
BeanWrapperFieldSetMapper<OrderItem> orderItemFieldSetMapper = new BeanWrapperFieldSetMapper<OrderItem>();
orderItemFieldSetMapper.setPrototypeBeanName("orderItem");
return orderItemFieldSetMapper;
}
@Bean
//@Scope("step")
public DefaultLineMapper<OrderItem> orderItemLineMapper(){
DefaultLineMapper<OrderItem> orderItemLineMapper = new DefaultLineMapper<OrderItem>();
orderItemLineMapper.setLineTokenizer(orderItemLineTokenizer());
orderItemLineMapper.setFieldSetMapper(orderItemFieldSetMapper());
return orderItemLineMapper;
}
@Bean
//@Scope("step")
public Resource orderItemResource(){
Resource orderItemResource = new FileSystemResource(orderItemFilePath);
return orderItemResource;
}
@Bean
//@Scope("step")
public FlatFileItemReader<OrderItem> orderItemItemReader(){
FlatFileItemReader<OrderItem> orderItemItemReader = new FlatFileItemReader<OrderItem>();
orderItemItemReader.setLineMapper(orderItemLineMapper());
orderItemItemReader.setResource(orderItemResource());
return orderItemItemReader;
}
@Bean
//@Scope("step")
public DelimitedLineTokenizer orderLineTokenizer(){
DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer(delimiter);
lineTokenizer.setNames(orderFieldNames);
return lineTokenizer;
}
@Bean
//@Scope("step")
public BeanWrapperFieldSetMapper<Order> orderFieldSetMapper(){
BeanWrapperFieldSetMapper<Order> orderItemFieldSetMapper = new BeanWrapperFieldSetMapper<Order>();
orderItemFieldSetMapper.setPrototypeBeanName("order");
return orderItemFieldSetMapper;
}
@Bean
//@Scope("step")
public DefaultLineMapper<Order> orderLineMapper(){
DefaultLineMapper<Order> orderItemLineMapper = new DefaultLineMapper<Order>();
orderItemLineMapper.setLineTokenizer(orderLineTokenizer());
orderItemLineMapper.setFieldSetMapper(orderFieldSetMapper());
return orderItemLineMapper;
}
@Bean
//@Scope("step")
public Resource orderResource(){
Resource orderItemResource = new FileSystemResource(orderFilePath);
return orderItemResource;
}
@Bean
//@Scope("step")
public FlatFileItemReader<Order> orderItemReader(){
FlatFileItemReader<Order> orderItemItemReader = new FlatFileItemReader<Order>();
orderItemItemReader.setLineMapper(orderLineMapper());
orderItemItemReader.setResource(orderResource());
return orderItemItemReader;
}
@Bean
@Scope("step")
public Map<String, Order> orderCache(){
Map<String, Order> orderCache = new HashMap<String, Order>();
return orderCache;
}
@Bean
public JdbcTemplate jdbcTemplate(){
return new JdbcTemplate(dataSource());
}
@Bean
//@Scope("step")
public AggregatingFlatFileOrderItemReader aggregatingFlatFileOrderItemReader(){
AggregatingFlatFileOrderItemReader aggregatingFlatFileOrderItemReader = new AggregatingFlatFileOrderItemReader();
aggregatingFlatFileOrderItemReader.setJdbcTemplate(jdbcTemplate());
aggregatingFlatFileOrderItemReader.setOrderCache(orderCache());
aggregatingFlatFileOrderItemReader.setOrderItemFlatFileItemReader(orderItemItemReader());
aggregatingFlatFileOrderItemReader.setOrderFlatFileItemReader(orderItemReader());
aggregatingFlatFileOrderItemReader.setDeleteOrderQuery(deleteOrderQuery);
aggregatingFlatFileOrderItemReader.setDeleteOrderItemQuery(deleteOrderItemQuery);
return aggregatingFlatFileOrderItemReader;
}
@Bean
@Scope("step")
public SessionFactory ftpSessionFactory(){
DefaultFtpSessionFactory ftpSessionFactory = new DefaultFtpSessionFactory();
ftpSessionFactory.setHost(ftpHostName);
ftpSessionFactory.setClientMode(ftpClientMode);
ftpSessionFactory.setFileType(ftpFileType);
ftpSessionFactory.setPort(ftpHostPort);
ftpSessionFactory.setUsername(ftpUsername);
ftpSessionFactory.setPassword(ftpPassword);
return ftpSessionFactory;
}
@Bean
@Scope(value="step")
public FtpGetRemoteFilesTasklet myFtpGetRemoteFilesTasklet(){
FtpGetRemoteFilesTasklet ftpTasklet = new FtpGetRemoteFilesTasklet();
ftpTasklet.setRetryIfNotFound(retryIfNotFound);
ftpTasklet.setDownloadFileAttempts(downloadAttempts);
ftpTasklet.setRetryIntervalMilliseconds(retryInterval);
ftpTasklet.setFileNamePattern(fileNamePattern);
ftpTasklet.setRemoteDirectory(remoteDirectory);
ftpTasklet.setLocalDirectory(localDirectory);
ftpTasklet.setSessionFactory(ftpSessionFactory());
ftpTasklet.setSftp(sftp);
return ftpTasklet;
}
@Bean
@Scope(value="step")
public OrderItemWriter orderItemWriter(){
OrderItemWriter orderItemWriter = new OrderItemWriter();
orderItemWriter.setJdbcTemplate(jdbcTemplate());
orderItemWriter.setOrderInsertQuery(orderInsertQuery);
orderItemWriter.setOrderItemInsertQuery(orderItemInsertQuery);
return orderItemWriter;
}

Convert annotation based configuration to XML

How can I write this part of an annotation-based configuration in XML?
@Bean
public EventRepository eventRepository() throws Exception {
return new SolrRepositoryFactory(eventTemplate())
.getRepository(EventRepository.class, new EventRepositoryImpl(eventTemplate()));
}
Full code of this config:
@Configuration
public class SolrContext {
@Bean
public SolrServerFactory solrServerFactory() {
return new MulticoreSolrServerFactory(new HttpSolrServer("solr.host"));
}
@Bean
public SolrTemplate eventTemplate() throws Exception {
SolrTemplate solrTemplate = new SolrTemplate(solrServerFactory());
solrTemplate.setSolrCore("events");
return solrTemplate;
}
@Bean
public EventRepository eventRepository() throws Exception {
return new SolrRepositoryFactory(eventTemplate())
.getRepository(EventRepository.class, new EventRepositoryImpl(eventTemplate()));
}
}
I got this example from that answer.
