Spring Data Mongo can't pick up custom ZonedDateTime converter, why? - java

My problem is like the one in CodecConfigurationException when saving ZonedDateTime to MongoDB with Spring Boot >= 2.0.1.RELEASE, but I wrote custom ZonedDateTime converters:
ZonedDateTimeToDateConverter
@WritingConverter
public class ZonedDateTimeToDateConverter implements Converter<ZonedDateTime, Date> {
    @Override
    public Date convert(ZonedDateTime source) {
        if (source == null) {
            return null;
        }
        return Date.from(source.toInstant());
    }
}
DateToZonedDateTimeConverter
@ReadingConverter
public class DateToZonedDateTimeConverter implements Converter<Date, ZonedDateTime> {
    @Override
    public ZonedDateTime convert(Date source) {
        if (source == null) {
            return null;
        }
        return ZonedDateTime.ofInstant(source.toInstant(), ZoneId.of("UTC"));
    }
}
and my test:
@Autowired
ReactiveMongoOperations operations;

@Test
void test() {
    ObjectId id = new ObjectId();
    Document doc = new Document();
    doc.append("_id", id);
    // doc.append("a", ZonedDateTime.now()); // works
    doc.append("zd1", new Document("f", ZonedDateTime.now())); // not working
    operations.insert(doc, "test-collection").block();
    Document found = Mono.from(operations.getCollection("test-collection")
            .find(new Document("_id", id)).first()).block();
    Assertions.assertNotNull(found);
}
If I add a ZDT instance at the first level of the document, like doc.append("a", ZonedDateTime.now()), the document saves fine. But if I place a ZDT instance in the document as a nested field (second level of nesting), I get an exception:
org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class java.time.ZonedDateTime.
What am I doing wrong?

I solved a similar problem by adding the converters to a custom conversions configuration:
@Configuration
public class MongoCustomConverterConfig {
    @Bean
    public MongoCustomConversions mongoCustomConversions() {
        List<Converter<?, ?>> converters = new ArrayList<>();
        converters.add(new ZonedDateTimeWriteConverter());
        converters.add(new ZonedDateTimeReadConverter());
        return new MongoCustomConversions(converters);
    }
}
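Note that MongoCustomConversions is applied by Spring Data's mapping layer when it converts your mapped objects; values nested inside a raw org.bson.Document are handed straight to the driver's codec registry, which is why the nested field in the test above still fails. A minimal sketch of a mapped entity that does go through the converters (the Event class and field names are hypothetical):

import java.time.ZonedDateTime;
import org.bson.types.ObjectId;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

@Document("test-collection")
public class Event {
    @Id
    private ObjectId id;
    // written to MongoDB as java.util.Date via ZonedDateTimeToDateConverter
    private ZonedDateTime createdAt;
    // getters and setters omitted
}

// usage with the reactive template from the test:
// operations.save(new Event()).block();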

Related

How to specify DateTime in GraphQL schema?

I am building the GraphQL schema for my project, and one of my models has a date/time field.
How do I write out date formats in my GraphQL schema?
I tried DateTime and Date, but nothing shows up.
This is the model:
public Integer Id;
public String name;
public String description;
public LocalDate birthDate;
This is what's in my GraphQL schema:
type Pet {
    id: ID!
    name: String!
    description: String
    birthDate: DateTime
}
But it says:
Unknown type DateTime
Create a custom scalar for any type that is not recognized by your framework.
I am not sure which graphql-java based framework you are using; I assume the official Spring for GraphQL from the Spring team.
Create a custom scalar, e.g. my LocalDateTime scalar:
public class LocalDateTimeScalar implements Coercing<LocalDateTime, String> {
    @Override
    public String serialize(Object dataFetcherResult) throws CoercingSerializeException {
        if (dataFetcherResult instanceof LocalDateTime) {
            return ((LocalDateTime) dataFetcherResult).format(DateTimeFormatter.ISO_DATE_TIME);
        } else {
            throw new CoercingSerializeException("Not a valid DateTime");
        }
    }

    @Override
    public LocalDateTime parseValue(Object input) throws CoercingParseValueException {
        return LocalDateTime.parse(input.toString(), DateTimeFormatter.ISO_DATE_TIME);
    }

    @Override
    public LocalDateTime parseLiteral(Object input) throws CoercingParseLiteralException {
        if (input instanceof StringValue) {
            return LocalDateTime.parse(((StringValue) input).getValue(), DateTimeFormatter.ISO_DATE_TIME);
        }
        throw new CoercingParseLiteralException("Value is not a valid ISO date time");
    }
}
Register it in your custom RuntimeWiring bean:
public class Scalars {
    public static GraphQLScalarType localDateTimeType() {
        return GraphQLScalarType.newScalar()
                .name("LocalDateTime")
                .description("LocalDateTime type")
                .coercing(new LocalDateTimeScalar())
                .build();
    }
}

@Component
@RequiredArgsConstructor
public class PostsRuntimeWiring implements RuntimeWiringConfigurer {
    private final DataFetchers dataFetchers;

    @Override
    public void configure(RuntimeWiring.Builder builder) {
        builder
                //...
                .scalar(Scalars.localDateTimeType());
                //...
        // no builder.build() here; the framework builds the wiring itself
    }
}
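The scalar also has to be declared in the schema itself (e.g. in schema.graphqls), otherwise the "Unknown type" error persists; assuming the Pet type from the question:

scalar LocalDateTime

type Pet {
    id: ID!
    name: String!
    description: String
    birthDate: LocalDateTime
}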
If you are using scalars in other graphql-java based frameworks (GraphQL Java, GraphQL Java Kickstart, GraphQL Kotlin, GraphQL SPQR, Netflix DGS, etc.) and Spring integrations, check my Spring GraphQL Sample. The back-end principle is similar, just with slightly different configuration.

Spring Data JDBC: can't add custom converter for enum

I want to have an enum as a field of my entity.
My application looks like this:
Spring Boot version:
plugins {
    id 'org.springframework.boot' version '2.6.2' apply false
}
repository:
@Repository
public interface MyEntityRepository extends PagingAndSortingRepository<MyEntity, UUID> {
    ...
}
entity:
@Table("my_entity")
public class MyEntity {
    ...
    private FileType fileType;
    // get + set
}
enum declaration:
public enum FileType {
    TYPE_1(1),
    TYPE_2(2);

    int databaseId;

    public static FileType byDatabaseId(Integer databaseId) {
        return Arrays.stream(values()).findFirst().orElse(null);
    }

    FileType(int databaseId) {
        this.databaseId = databaseId;
    }

    public int getDatabaseId() {
        return databaseId;
    }
}
My attempt:
I found the following answer and tried to follow it: https://stackoverflow.com/a/53296199/2674303
So I added this bean:
@Bean
public JdbcCustomConversions jdbcCustomConversions() {
    return new JdbcCustomConversions(asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter()));
}
converters:
@WritingConverter
public class FileTypeToDatabaseIdConverter implements Converter<FileType, Integer> {
    @Override
    public Integer convert(FileType source) {
        return source.getDatabaseId();
    }
}

@ReadingConverter
public class DatabaseIdToFileTypeConverter implements Converter<Integer, FileType> {
    @Override
    public FileType convert(Integer databaseId) {
        return FileType.byDatabaseId(databaseId);
    }
}
But I see this error:
The bean 'jdbcCustomConversions', defined in class path resource
[org/springframework/boot/autoconfigure/data/jdbc/JdbcRepositoriesAutoConfiguration$SpringBootJdbcConfiguration.class],
could not be registered. A bean with that name has already been
defined in my.pack.Main and overriding is disabled.
I tried renaming the method jdbcCustomConversions() to myJdbcCustomConversions(). That avoided the error above, but the converter is not invoked during entity persistence, and I see another error: the application tries to save a String while the database column type is bigint.
20:39:10.689 DEBUG [main] o.s.jdbc.core.StatementCreatorUtils: JDBC getParameterType call failed - using fallback method instead: org.postgresql.util.PSQLException: ERROR: column "file_type" is of type bigint but expression is of type character varying
Hint: You will need to rewrite or cast the expression.
Position: 174
I also tried to use the latest (at the time of writing) version of Spring Boot:
id 'org.springframework.boot' version '2.6.2' apply false
But it didn't help.
What have I missed?
How can I map an enum to an integer column properly?
P.S.
I use the following code for testing:
@SpringBootApplication
@EnableJdbcAuditing
@EnableScheduling
public class Main {
    public static void main(String[] args) {
        ConfigurableApplicationContext applicationContext = SpringApplication.run(Main.class, args);
        MyEntityRepository repository = applicationContext.getBean(MyEntityRepository.class);
        MyEntity entity = new MyEntity();
        ...
        entity.setFileType(FileType.TYPE_2);
        repository.save(entity);
    }

    @Bean
    public ModelMapper modelMapper() {
        ModelMapper mapper = new ModelMapper();
        mapper.getConfiguration()
                .setMatchingStrategy(MatchingStrategies.STRICT)
                .setFieldMatchingEnabled(true)
                .setSkipNullEnabled(true)
                .setFieldAccessLevel(PRIVATE);
        return mapper;
    }

    @Bean
    public AbstractJdbcConfiguration jdbcConfiguration() {
        return new MySpringBootJdbcConfiguration();
    }

    @Configuration
    static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
        @Override
        protected List<?> userConverters() {
            return asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
        }
    }
}
UPDATE
My code is:
@SpringBootApplication
@EnableJdbcAuditing
@EnableScheduling
public class Main {
    public static void main(String[] args) {
        ConfigurableApplicationContext applicationContext = SpringApplication.run(Main.class, args);
        MyEntityRepository repository = applicationContext.getBean(MyEntityRepository.class);
        MyEntity entity = new MyEntity();
        ...
        entity.setFileType(FileType.TYPE_2);
        repository.save(entity);
    }

    @Bean
    public ModelMapper modelMapper() {
        ModelMapper mapper = new ModelMapper();
        mapper.getConfiguration()
                .setMatchingStrategy(MatchingStrategies.STRICT)
                .setFieldMatchingEnabled(true)
                .setSkipNullEnabled(true)
                .setFieldAccessLevel(PRIVATE);
        return mapper;
    }

    @Bean
    public AbstractJdbcConfiguration jdbcConfiguration() {
        return new MySpringBootJdbcConfiguration();
    }

    @Configuration
    static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
        @Override
        protected List<?> userConverters() {
            return asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
        }

        @Bean
        public JdbcConverter jdbcConverter(JdbcMappingContext mappingContext,
                NamedParameterJdbcOperations operations,
                @Lazy RelationResolver relationResolver,
                JdbcCustomConversions conversions,
                Dialect dialect) {
            JdbcArrayColumns arrayColumns = dialect instanceof JdbcDialect ? ((JdbcDialect) dialect).getArraySupport()
                    : JdbcArrayColumns.DefaultSupport.INSTANCE;
            DefaultJdbcTypeFactory jdbcTypeFactory = new DefaultJdbcTypeFactory(operations.getJdbcOperations(),
                    arrayColumns);
            return new MyJdbcConverter(
                    mappingContext,
                    relationResolver,
                    conversions,
                    jdbcTypeFactory,
                    dialect.getIdentifierProcessing()
            );
        }
    }

    static class MyJdbcConverter extends BasicJdbcConverter {
        MyJdbcConverter(
                MappingContext<? extends RelationalPersistentEntity<?>, ? extends RelationalPersistentProperty> context,
                RelationResolver relationResolver,
                CustomConversions conversions,
                JdbcTypeFactory typeFactory,
                IdentifierProcessing identifierProcessing) {
            super(context, relationResolver, conversions, typeFactory, identifierProcessing);
        }

        @Override
        public int getSqlType(RelationalPersistentProperty property) {
            if (FileType.class.equals(property.getActualType())) {
                return Types.BIGINT;
            } else {
                return super.getSqlType(property);
            }
        }

        @Override
        public Class<?> getColumnType(RelationalPersistentProperty property) {
            if (FileType.class.equals(property.getActualType())) {
                return Long.class;
            } else {
                return super.getColumnType(property);
            }
        }
    }
}
But I experience this error:
Caused by: org.postgresql.util.PSQLException: Cannot convert an instance of java.lang.String to type long
at org.postgresql.jdbc.PgPreparedStatement.cannotCastException(PgPreparedStatement.java:925)
at org.postgresql.jdbc.PgPreparedStatement.castToLong(PgPreparedStatement.java:810)
at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:561)
at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:931)
at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.setObject(HikariProxyPreparedStatement.java)
at org.springframework.jdbc.core.StatementCreatorUtils.setValue(StatementCreatorUtils.java:414)
at org.springframework.jdbc.core.StatementCreatorUtils.setParameterValueInternal(StatementCreatorUtils.java:231)
at org.springframework.jdbc.core.StatementCreatorUtils.setParameterValue(StatementCreatorUtils.java:146)
at org.springframework.jdbc.core.PreparedStatementCreatorFactory$PreparedStatementCreatorImpl.setValues(PreparedStatementCreatorFactory.java:283)
at org.springframework.jdbc.core.PreparedStatementCreatorFactory$PreparedStatementCreatorImpl.createPreparedStatement(PreparedStatementCreatorFactory.java:241)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:649)
... 50 more
Caused by: java.lang.NumberFormatException: For input string: "TYPE_2"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Long.parseLong(Long.java:589)
at java.lang.Long.parseLong(Long.java:631)
at org.postgresql.jdbc.PgPreparedStatement.castToLong(PgPreparedStatement.java:792)
... 59 more
Try the following instead:
@Bean
public AbstractJdbcConfiguration jdbcConfiguration() {
    return new MySpringBootJdbcConfiguration();
}

@Configuration
static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
    @Override
    protected List<?> userConverters() {
        return List.of(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
    }
}
Explanation:
Spring complains that the JdbcCustomConversions bean from the auto-configuration class is already defined (by your bean) and you don't have bean overriding enabled.
JdbcRepositoriesAutoConfiguration has changed a few times; in Spring Boot 2.6.2 it has:
@Configuration(proxyBeanMethods = false)
@ConditionalOnMissingBean(AbstractJdbcConfiguration.class)
static class SpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
}
In turn, AbstractJdbcConfiguration has:
@Bean
public JdbcCustomConversions jdbcCustomConversions() {
    try {
        Dialect dialect = applicationContext.getBean(Dialect.class);
        SimpleTypeHolder simpleTypeHolder = dialect.simpleTypes().isEmpty() ? JdbcSimpleTypes.HOLDER
                : new SimpleTypeHolder(dialect.simpleTypes(), JdbcSimpleTypes.HOLDER);
        return new JdbcCustomConversions(
                CustomConversions.StoreConversions.of(simpleTypeHolder, storeConverters(dialect)), userConverters());
    } catch (NoSuchBeanDefinitionException exception) {
        LOG.warn("No dialect found. CustomConversions will be configured without dialect specific conversions.");
        return new JdbcCustomConversions();
    }
}
As you can see, JdbcCustomConversions is not conditional in any way, so defining your own caused a conflict. Fortunately, it provides an extension point, userConverters(), which can be overridden to provide your own converters.
Update
As discussed in comments:
- FileType.byDatabaseId is broken: it ignores its input parameter (a corrected sketch follows this list)
- as the column type in the DB is BIGINT, your converters must convert from Long, not from Integer; this addresses read queries
- for writes, there is an open bug: https://github.com/spring-projects/spring-data-jdbc/issues/629. There is a hardcoded assumption that enums are converted to Strings, and only Enum -> String converters are checked.
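A minimal corrected sketch of the pieces called out above, assuming the FileType enum and the BIGINT column from the question (the write path additionally needs the converter subclass shown below because of the linked bug):

public static FileType byDatabaseId(Integer databaseId) {
    // filter by the input instead of ignoring it
    return Arrays.stream(values())
            .filter(type -> type.databaseId == databaseId)
            .findFirst()
            .orElse(null);
}

@ReadingConverter
public class DatabaseIdToFileTypeConverter implements Converter<Long, FileType> {
    @Override
    public FileType convert(Long databaseId) {
        return FileType.byDatabaseId(databaseId.intValue());
    }
}

@WritingConverter
public class FileTypeToDatabaseIdConverter implements Converter<FileType, Long> {
    @Override
    public Long convert(FileType source) {
        return (long) source.getDatabaseId();
    }
}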
As we want to convert to Long, we need to amend BasicJdbcConverter by subclassing it and registering the subclass as a @Bean.
You need to override two methods:
public int getSqlType(RelationalPersistentProperty property)
public Class<?> getColumnType(RelationalPersistentProperty property)
I hardcoded the enum type and the corresponding column types, but you may want to get fancier with that.
@Bean
public AbstractJdbcConfiguration jdbcConfiguration() {
    return new MySpringBootJdbcConfiguration();
}

@Configuration
static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
    @Override
    protected List<?> userConverters() {
        return List.of(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
    }

    @Bean
    public JdbcConverter jdbcConverter(JdbcMappingContext mappingContext,
            NamedParameterJdbcOperations operations,
            @Lazy RelationResolver relationResolver,
            JdbcCustomConversions conversions,
            Dialect dialect) {
        JdbcArrayColumns arrayColumns = dialect instanceof JdbcDialect ? ((JdbcDialect) dialect).getArraySupport()
                : JdbcArrayColumns.DefaultSupport.INSTANCE;
        DefaultJdbcTypeFactory jdbcTypeFactory = new DefaultJdbcTypeFactory(operations.getJdbcOperations(),
                arrayColumns);
        return new MyJdbcConverter(
                mappingContext,
                relationResolver,
                conversions,
                jdbcTypeFactory,
                dialect.getIdentifierProcessing()
        );
    }
}

static class MyJdbcConverter extends BasicJdbcConverter {
    MyJdbcConverter(
            MappingContext<? extends RelationalPersistentEntity<?>, ? extends RelationalPersistentProperty> context,
            RelationResolver relationResolver,
            CustomConversions conversions,
            JdbcTypeFactory typeFactory,
            IdentifierProcessing identifierProcessing) {
        super(context, relationResolver, conversions, typeFactory, identifierProcessing);
    }

    @Override
    public int getSqlType(RelationalPersistentProperty property) {
        if (FileType.class.equals(property.getActualType())) {
            return Types.BIGINT;
        } else {
            return super.getSqlType(property);
        }
    }

    @Override
    public Class<?> getColumnType(RelationalPersistentProperty property) {
        if (FileType.class.equals(property.getActualType())) {
            return Long.class;
        } else {
            return super.getColumnType(property);
        }
    }
}

Annotation based ServiceLocatorFactoryBean?

I would like to implement the Factory pattern in my project. I have gone through online resources and learned that Spring's ServiceLocatorFactoryBean should be used instead of the plain Java factory pattern.
I have followed this link, but it is explained with XML-based configuration. Can anyone tell me how to do it with an annotation-based factory pattern?
Spring Java Configuration reference guide: @Configuration
Interface Parser.java:
public interface Parser {
    void parse(String str);
}
Implementations of the above interface.
JsonParser.java
public class JsonParser implements Parser {
    @Override
    public void parse(String str) {
        System.out.println("JsonParser.parse::" + str);
    }
}
XMLParser.java
public class XMLParser implements Parser {
    @Override
    public void parse(String str) {
        System.out.println("XMLParser.parse :: " + str);
    }
}
ParserFactory.java, the actual factory interface. No implementation is needed: ServiceLocatorFactoryBean generates one that looks up beans by name.
public interface ParserFactory {
    public Parser getParser(ParserType parserType);
}
ParserType.java, an enum to specify the parser types (avoids typos and is type-safe). ServiceLocatorFactoryBean resolves the bean to return by calling toString() on the argument, which is why each constant carries its bean name.
public enum ParserType {
    JSON("jsonParser"), XML("xmlParser");

    private final String value;

    ParserType(String input) {
        this.value = input;
    }

    public String getValue() {
        return this.value;
    }

    @Override
    public String toString() {
        return this.value;
    }
}
ParserService.java, where the business logic is implemented.
@Service
public class ParserService {
    @Autowired
    private ParserFactory parserFactory;

    public void doParse(String parseString, ParserType parseType) {
        Parser parser = parserFactory.getParser(parseType);
        System.out.println("ParserService.doParse.." + parser);
        parser.parse(parseString);
    }
}
Finally, AppConfig.java, the Spring Java configuration class where all the beans are registered as container-managed beans.
@Configuration
@ComponentScan(basePackages = {"<Your Package Name>"})
public class AppConfig {
    @Bean
    public FactoryBean serviceLocatorFactoryBean() {
        ServiceLocatorFactoryBean factoryBean = new ServiceLocatorFactoryBean();
        factoryBean.setServiceLocatorInterface(ParserFactory.class);
        return factoryBean;
    }

    @Bean(name = "jsonParser")
    @Scope(scopeName = ConfigurableBeanFactory.SCOPE_PROTOTYPE)
    public JsonParser jsonParser() {
        return new JsonParser();
    }

    @Bean(name = "xmlParser")
    @Scope(scopeName = ConfigurableBeanFactory.SCOPE_PROTOTYPE)
    public XMLParser xmlParser() {
        return new XMLParser();
    }
}
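At runtime the generated ParserFactory turns the enum argument into a bean name via its toString(), so a call like the sketch below returns the prototype bean registered as "jsonParser" (illustrative only; the input string is made up):

Parser parser = parserFactory.getParser(ParserType.JSON); // resolves the bean named "jsonParser"
parser.parse("{\"name\": \"Srilekha\"}");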
Now autowire the ParserService bean into a controller or test class, and invoke doParse(String, ParserType) to test.
Here is my test.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = AppConfig.class)
public class ServiceLocatorFactoryExample {
    @Autowired
    private ParserService parserService;

    @Test
    public void testParserFactory() {
        parserService.doParse("Srilekha", ParserType.JSON);
        parserService.doParse("Srilekha", ParserType.XML);
    }
}
Look at this complete example: Service Locator Factory.
It helped me understand how it works using Spring Boot.

Set MongoDb converter programmatically

I'm trying to use a custom converter with spring-data-mongodb. I want to create it programmatically, but I get the following error:
org.springframework.core.convert.ConverterNotFoundException: No converter found capable of converting from type org.joda.time.LocalDate to type java.lang.String
at org.springframework.core.convert.support.GenericConversionService.handleConverterNotFound(GenericConversionService.java:475)
at org.springframework.core.convert.support.GenericConversionService.convert(GenericConversionService.java:175)
at org.springframework.core.convert.support.GenericConversionService.convert(GenericConversionService.java:154)
....
....
The following is the failing code snippet:
Mongo mongo = new Mongo();
MongoDbFactory mongoDbFactory = new SimpleMongoDbFactory(mongo, "database");
List<Converter> converters = new ArrayList<>();
converters.add(new LocalDateWriteConverter());
converters.add(new LocalDateReadConverter());
CustomConversions customConversions = new CustomConversions(converters);
MappingContext mappingContext = new SimpleMongoMappingContext();
MappingMongoConverter mappingMongoConverter = new MappingMongoConverter(mongoDbFactory, mappingContext);
mappingMongoConverter.setCustomConversions(customConversions);
MongoTemplate mongoTemplate = new MongoTemplate(mongoDbFactory, mappingMongoConverter);
MongoDbEvent mongoEvent = new MongoDbEvent(new LocalDate(2012, 12, 8));
mongoTemplate.insert(mongoEvent);
And here are my converter classes:
class LocalDateReadConverter implements Converter<String, LocalDate> {
    @Override
    public LocalDate convert(String s) {
        // Conversion code omitted.
    }
}

class LocalDateWriteConverter implements Converter<LocalDate, String> {
    @Override
    public String convert(LocalDate localDate) {
        // Conversion code omitted.
    }
}
The class I'm trying to persist looks like this:
import org.joda.time.LocalDate;

public class MongoDbEvent {
    private String id;
    private LocalDate date;

    public MongoDbEvent(LocalDate date) {
        this.date = date;
    }

    public String getId() {
        return id;
    }

    public LocalDate getDate() {
        return date;
    }

    @Override
    public String toString() {
        return "MongoDbEvent{" +
                "id='" + id + '\'' +
                ", date=" + date +
                '}';
    }
}
This answer may be a little late for the OP, but I just ran into the same problem today and found a solution...
To set it up programmatically, you need to call MappingMongoConverter.afterPropertiesSet() before you use it. I realized this from reading the code of MongoTemplate.getDefaultMongoConverter(MongoDbFactory).
Here's an example:
MappingMongoConverter converter = new MappingMongoConverter(mongoDbFactory, context);
converter.setTypeMapper(mapper);
converter.setCustomConversions(new CustomConversions(
        Arrays.asList(
                new TimeZoneReadConverter(),
                new TimeZoneWriteConverter()
        )
));
converter.afterPropertiesSet();

MongoTemplate template = new MongoTemplate(mongoDbFactory, converter);
Just a heads up: I was struggling with this problem on spring-data-mongodb 1.5.1.RELEASE using Java configuration. As some classes have changed, I'm posting my solution.
Add the following definitions in your configuration class annotated with @Configuration:
@Bean
public Mongo mongo() throws Exception {
    MongoPropertiesResolver resolver = mongoResolver();
    return new MongoClient(resolver.getUrl());
}

@Bean
public MongoDbFactory mongoDbFactory() throws Exception {
    return new SimpleMongoDbFactory(mongo(), "database");
}

@Bean
public MongoTemplate mongoTemplate() throws Exception {
    return new MongoTemplate(mongoDbFactory(), mongoConverter());
}

@Bean
public CustomConversions customConversions() {
    List<Converter<?, ?>> converters = new ArrayList<Converter<?, ?>>();
    converters.add(new TimeZoneReadConverter());
    converters.add(new TimeZoneWriteConverter());
    return new CustomConversions(converters);
}

@Bean
public MappingMongoConverter mongoConverter() throws Exception {
    MongoMappingContext mappingContext = new MongoMappingContext();
    DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
    MappingMongoConverter mongoConverter = new MappingMongoConverter(dbRefResolver, mappingContext);
    mongoConverter.setCustomConversions(customConversions());
    return mongoConverter;
}
How to customize Mongo with custom converters is described here in detail:
http://docs.spring.io/spring-data/mongodb/docs/current/reference/html/#mapping-configuration
I injected the default configuration values so I can benefit from the application.properties configuration settings.
@Configuration
public class MongoConfiguration extends AbstractMongoConfiguration {
    @Value("${spring.data.mongodb.database:test}")
    String database;

    @Value("${spring.data.mongodb.host:localhost}:${spring.data.mongodb.port:27017}")
    String host;

    @Override
    protected String getDatabaseName() {
        return database;
    }

    @Override
    public Mongo mongo() throws Exception {
        return new MongoClient(host);
    }

    @Bean
    @Override
    public CustomConversions customConversions() {
        List<Converter<?, ?>> converterList = new ArrayList<Converter<?, ?>>();
        converterList.add(new MongoColorWriter());
        converterList.add(new MongoColorReader());
        return new CustomConversions(converterList);
    }
}
With the introduction of the java.time package in Java 8, I ran into a similar issue using the new LocalDate and LocalDateTime classes.
This is how I solved it:
I wrote a converter for all 4 of these conversion options:
DateToLocalDateTimeConverter
DateToLocalDateConverter
LocalDateTimeToDateConverter
LocalDateToDateConverter
Here is an example
public class DateToLocalDateTimeConverter implements Converter<Date, LocalDateTime> {
    @Override
    public LocalDateTime convert(Date source) {
        return source == null ? null : LocalDateTime.ofInstant(source.toInstant(), ZoneId.systemDefault());
    }
}
Then, by including this in the XML configuration for the MongoDB connection, I was able to work with Java 8 dates in MongoDB (remember to add all the converters):
<mongo:mapping-converter>
    <mongo:custom-converters>
        <mongo:converter>
            <bean class="package.DateToLocalDateTimeConverter" />
        </mongo:converter>
    </mongo:custom-converters>
</mongo:mapping-converter>
For me, the problem was that my converter was registered as a reader instead of a writer. To fix that, add the @WritingConverter annotation to your converter class:
@Component
@WritingConverter
public class NoteWriterConverter implements Converter<Note, DBObject> {
    @Override
    public DBObject convert(Note source) {
        DBObject obj = new BasicDBObject();
        obj.put("title", source.getTitle());
        obj.put("reviewDate", source.getReviewDate());
        obj.removeField("_class");
        return obj;
    }
}
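Depending on your setup, annotating the class may not be enough for it to be picked up; if it isn't, it can be registered explicitly with the mapping layer, for example (a sketch using the same pre-2.x CustomConversions API as the rest of this thread):

List<Converter<?, ?>> converters = new ArrayList<>();
converters.add(new NoteWriterConverter());
mappingMongoConverter.setCustomConversions(new CustomConversions(converters));
mappingMongoConverter.afterPropertiesSet();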
Since org.springframework.data:spring-data-commons:1.13.3.RELEASE, here's how to programmatically create a MongoTemplate with custom converters
public MongoTemplate mongoTemplate(String mongoUri) throws Exception {
    MongoDbFactory factory = new SimpleMongoDbFactory(new MongoClientURI(mongoUri));
    CustomConversions conversions = new CustomConversions(
            Arrays.asList(new FooWriteConverter(), new FooReadConverter()));
    MongoMappingContext mappingContext = new MongoMappingContext();
    DbRefResolver dbRefResolver = new DefaultDbRefResolver(factory);
    MappingMongoConverter mongoConverter = new MappingMongoConverter(dbRefResolver, mappingContext);
    mongoConverter.setCustomConversions(conversions);
    mongoConverter.afterPropertiesSet();
    return new MongoTemplate(factory, mongoConverter);
}
The converters (implementation omitted):
class FooWriteConverter implements Converter<Foo, DBObject> { ... }
class FooReadConverter implements Converter<DBObject, Foo> { ... }
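A usage sketch (the connection string and the Foo instance are placeholders):

MongoTemplate template = mongoTemplate("mongodb://localhost:27017/mydb");
template.save(new Foo());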

java spring MappingJacksonJsonView not doing toString on mongodb ObjectId

I am using MappingJacksonJsonView in my Spring MVC application to render JSON from my controllers. I want the ObjectId on my object to render as its toString() value, but instead it is serialized into its component parts. It works just fine in my Velocity/JSP pages:
Velocity:
$thing.id
Produces:
4f1d77bb3a13870ff0783c25
Json:
<script type="text/javascript">
$.ajax({
type: 'GET',
url: '/things/show/4f1d77bb3a13870ff0783c25',
dataType: 'json',
success : function(data) {
alert(data);
}
});
</script>
Produces:
thing: {id:{time:1327331259000, new:false, machine:974358287, timeSecond:1327331259, inc:-260555739},…}
id: {time:1327331259000, new:false, machine:974358287, timeSecond:1327331259, inc:-260555739}
inc: -260555739
machine: 974358287
new: false
time: 1327331259000
timeSecond: 1327331259
name: "Stack Overflow"
XML:
<script type="text/javascript">
$.ajax({
type: 'GET',
url: '/things/show/4f1d77bb3a13870ff0783c25',
dataType: 'xml',
success : function(data) {
alert(data);
}
});
</script>
Produces:
<com.place.model.Thing>
<id>
<__time>1327331259</__time>
<__machine>974358287</__machine>
<__inc>-260555739</__inc>
<__new>false</__new>
</id>
<name>Stack Overflow</name>
</com.place.model.Thing>
Is there a way to stop MappingJacksonJsonView from extracting that much information from the ObjectId? I just want the toString() value, not all the details.
Thanks.
Adding the Spring config:
@Configuration
@EnableWebMvc
public class MyConfiguration {
    @Bean(name = "viewResolver")
    public ContentNegotiatingViewResolver viewResolver() {
        ContentNegotiatingViewResolver contentNegotiatingViewResolver = new ContentNegotiatingViewResolver();
        contentNegotiatingViewResolver.setOrder(1);
        contentNegotiatingViewResolver.setFavorPathExtension(true);
        contentNegotiatingViewResolver.setFavorParameter(true);
        contentNegotiatingViewResolver.setIgnoreAcceptHeader(false);
        Map<String, String> mediaTypes = new HashMap<String, String>();
        mediaTypes.put("json", "application/x-json");
        mediaTypes.put("json", "text/json");
        mediaTypes.put("json", "text/x-json");
        mediaTypes.put("json", "application/json");
        mediaTypes.put("xml", "text/xml");
        mediaTypes.put("xml", "application/xml");
        contentNegotiatingViewResolver.setMediaTypes(mediaTypes);
        List<View> defaultViews = new ArrayList<View>();
        defaultViews.add(xmlView());
        defaultViews.add(jsonView());
        contentNegotiatingViewResolver.setDefaultViews(defaultViews);
        return contentNegotiatingViewResolver;
    }

    @Bean(name = "xStreamMarshaller")
    public XStreamMarshaller xStreamMarshaller() {
        return new XStreamMarshaller();
    }

    @Bean(name = "xmlView")
    public MarshallingView xmlView() {
        MarshallingView marshallingView = new MarshallingView(xStreamMarshaller());
        marshallingView.setContentType("application/xml");
        return marshallingView;
    }

    @Bean(name = "jsonView")
    public MappingJacksonJsonView jsonView() {
        MappingJacksonJsonView mappingJacksonJsonView = new MappingJacksonJsonView();
        mappingJacksonJsonView.setContentType("application/json");
        return mappingJacksonJsonView;
    }
}
And my controller:
@Controller
@RequestMapping(value = { "/things" })
public class ThingController {
    @Autowired
    private ThingRepository thingRepository;

    @RequestMapping(value = { "/show/{thingId}" }, method = RequestMethod.GET)
    public String show(@PathVariable ObjectId thingId, Model model) {
        model.addAttribute("thing", thingRepository.findOne(thingId));
        return "things/show";
    }
}
By default, Jackson serializes whatever object it receives, field by field. ObjectId is exposed as a plain object, therefore its attributes are visible after conversion to JSON. You need to specify the kind of serialization required; in this case it is a string. The Thing entity class used to create ThingRepository will look like this:
import com.fasterxml.jackson.databind.annotation.JsonSerialize;
import com.fasterxml.jackson.databind.ser.std.ToStringSerializer;
import org.bson.types.ObjectId;
import org.springframework.data.annotation.Id;

public class Thing {
    @Id
    @JsonSerialize(using = ToStringSerializer.class)
    ObjectId id;
    String name;
}
Note the added annotation @JsonSerialize(using = ToStringSerializer.class), which instructs Jackson to serialize the ObjectId as a String.
The previous answer did the trick, but it was ugly and not well thought out - a workaround rather than an actual fix.
The real issue is that ObjectId is serialized into its component parts. MappingJacksonJsonView sees ObjectId for what it is, an object, and goes to work on it. The fields showing up in the JSON are the fields that make up an ObjectId. To stop such an object from being serialized this way, you have to configure a CustomObjectMapper that extends ObjectMapper.
Here is the CustomObjectMapper:
public class CustomObjectMapper extends ObjectMapper {
    public CustomObjectMapper() {
        CustomSerializerFactory sf = new CustomSerializerFactory();
        sf.addSpecificMapping(ObjectId.class, new ObjectIdSerializer());
        this.setSerializerFactory(sf);
    }
}
Here is the ObjectIdSerializer that the CustomObjectMapper uses:
public class ObjectIdSerializer extends SerializerBase<ObjectId> {
    protected ObjectIdSerializer(Class<ObjectId> t) {
        super(t);
    }

    public ObjectIdSerializer() {
        this(ObjectId.class);
    }

    @Override
    public void serialize(ObjectId value, JsonGenerator jgen, SerializerProvider provider) throws IOException, JsonGenerationException {
        jgen.writeString(value.toString());
    }
}
And here is what needs to change in your @Configuration-annotated class:
@Bean(name = "jsonView")
public MappingJacksonJsonView jsonView() {
    final MappingJacksonJsonView mappingJacksonJsonView = new MappingJacksonJsonView();
    mappingJacksonJsonView.setContentType("application/json");
    mappingJacksonJsonView.setObjectMapper(new CustomObjectMapper());
    return mappingJacksonJsonView;
}
You are basically telling Jackson how to serialize/deserialize this particular object. Works like a charm.
If you're using an autowired instance of the auto-configured mapper in Spring Boot, you can just add this customizer bean:
@Bean
public Jackson2ObjectMapperBuilderCustomizer jsonCustomizer() {
    return builder -> builder.serializerByType(ObjectId.class, ToStringSerializer.instance);
}
Relevant imports:
import com.fasterxml.jackson.databind.ser.std.ToStringSerializer;
import org.bson.types.ObjectId;
import org.springframework.boot.autoconfigure.jackson.Jackson2ObjectMapperBuilderCustomizer;
And then this will reflect anywhere the autowired mapper is used, for example:
@Service
public class MyService {
    private final ObjectMapper objectMapper;
    private final MongoTemplate mongoTemplate;

    @Autowired
    public MyService(ObjectMapper objectMapper, MongoTemplate mongoTemplate) {
        this.objectMapper = objectMapper;
        this.mongoTemplate = mongoTemplate;
    }

    public String getJsonForMongoCommand(Document document) throws JsonProcessingException {
        return objectMapper.writeValueAsString(mongoTemplate.executeCommand(document));
    }
}
Or in this specific case (untested, might be unnecessary):
@Bean(name = "jsonView")
public MappingJacksonJsonView jsonView(ObjectMapper objectMapper) {
    final MappingJacksonJsonView mappingJacksonJsonView = new MappingJacksonJsonView();
    mappingJacksonJsonView.setContentType("application/json");
    mappingJacksonJsonView.setObjectMapper(objectMapper);
    return mappingJacksonJsonView;
}
I had to just make the getId() method return a String. It was the only way to make Jackson stop serializing the ObjectId.
public String getId() {
    if (id != null) {
        return id.toString();
    } else {
        return null;
    }
}

public void setId(ObjectId id) {
    this.id = id;
}
setId() still has to take an ObjectId so Mongo (and its driver) can set the ID correctly.
Just to complement the other answers: if you face a scenario where you also need to serialize an array of ObjectId, you can create a custom serializer with the following logic:
public class ObjectIdSerializer extends JsonSerializer<Object> {
    @Override
    public void serialize(final Object value, final JsonGenerator jgen, final SerializerProvider provider) throws IOException {
        if (value instanceof Collection) {
            final Collection<String> ids = new ArrayList<>();
            for (final Object id : ((Collection<?>) value)) {
                ids.add(ObjectId.class.cast(id).toString());
            }
            jgen.writeObject(ids);
        } else {
            jgen.writeString(ObjectId.class.cast(value).toString());
        }
    }
}
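A usage sketch (the entity and field names are made up); the same serializer then handles both a single id and a collection of ids:

public class Person {
    @JsonSerialize(using = ObjectIdSerializer.class)
    private ObjectId id;

    @JsonSerialize(using = ObjectIdSerializer.class)
    private List<ObjectId> friendIds;
}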
