I'm getting the following error when using MongoDB's aggregate with a Java 8 LocalDateTime criterion:
Caused by: org.bson.codecs.configuration.CodecConfigurationException:
Can't find a codec for class java.time.LocalDateTime.
with the following piece of code:
@SpringBootApplication
public class MongojavatimeApplication implements CommandLineRunner {

    @Autowired
    private MongoTemplate template;

    public static void main(String[] args) {
        SpringApplication.run(MongojavatimeApplication.class, args);
    }

    @Override
    public void run(String... args) throws Exception {
        Criteria c = Criteria.where("createdDate").gt(LocalDateTime.now().minusDays(30));
        template.aggregate(Aggregation.newAggregation(Aggregation.match(c)), "TestJavaTime", TestJavaTime.class);
    }
}
You'll find a few tests here: LocalDateTime works fine with a Spring repository and with a classical query through the Criteria API using a MongoTemplate, but it throws this error when creating an aggregate query.
https://github.com/Farael49/spring-mongo-aggregate-localdatetime
I also did a little test replacing the LocalDateTime with java.util.Date to show it doesn't throw a codec error.
Is there something I can do, or is it a Mongo driver/Spring issue?
Thanks
I think your problem is due to the MongoDB Java driver not knowing how to serialize the LocalDateTime object. There is a good solution to this problem here: Cannot serialize LocalDate in Mongodb
Amending your code like this might work:
@Override
public void run(String... args) throws Exception {
    LocalDateTime startDateTime = LocalDateTime.now().minusDays(30);
    // LocalDateTime has no atStartOfDay(); attach a zone directly to obtain an Instant
    Instant startInstant = startDateTime.atZone(ZoneId.systemDefault()).toInstant();
    Criteria c = Criteria.where("createdDate").gt(Date.from(startInstant));
    template.aggregate(Aggregation.newAggregation(Aggregation.match(c)), "TestJavaTime", TestJavaTime.class);
}
If you want to use LocalDateTime directly, you should provide a codec like this:
public enum LocalDateTimeCodec implements Codec<LocalDateTime> {
    INSTANCE;

    @Override
    public void encode(BsonWriter writer, LocalDateTime value, EncoderContext encoderContext) {
        writer.writeDateTime(value.toInstant(ZoneOffset.UTC).toEpochMilli());
    }

    @Override
    public LocalDateTime decode(BsonReader reader, DecoderContext decoderContext) {
        return Instant.ofEpochMilli(reader.readDateTime())
                .atOffset(ZoneOffset.UTC)
                .toLocalDateTime();
    }

    @Override
    public Class<LocalDateTime> getEncoderClass() {
        return LocalDateTime.class;
    }
}
You can register it this way:
@Bean
public MongoDbFactory mongoDbFactory() throws Exception {
    CodecRegistry registry = CodecRegistries.fromRegistries(
            CodecRegistries.fromCodecs(LocalDateTimeCodec.INSTANCE),
            MongoClient.getDefaultCodecRegistry()
    );
    MongoClientOptions options = MongoClientOptions
            .builder()
            .codecRegistry(registry)
            .build();
    return new SimpleMongoDbFactory(new MongoClient(host, options), dbName);
}
where host and dbName might be autowired fields of some configuration class.
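For illustration, host and dbName could be bound from external properties along these lines (a minimal sketch; the property keys mongo.host and mongo.dbname are made up for the example):

@Configuration
public class MongoConfig {

    // Hypothetical property keys; adjust to your application.properties
    @Value("${mongo.host}")
    private String host;

    @Value("${mongo.dbname}")
    private String dbName;

    // the mongoDbFactory() bean shown above would live here
}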
I'm using Spring Boot 2.1.6 and there is an API that accepts a form including a date, like:
@Data
public class MyForm {
    private LocalDate date;
    ...
}

@Controller
public class MyController {
    @PostMapping("...")
    public ResponseEntity<...> post(@RequestBody MyForm myForm) {
        ...
    }
}
By default, Spring MVC accepts this JSON format:
{
    "date": [2020, 6, 17],
    ...
}
So on the front end, my JavaScript code just submits a form like this, i.e. JS converts a date to an array.
But when I run a Spring Boot test, this serialization does not work, with the following code:
private ObjectMapper mapper = new ObjectMapper();

@Autowired
private MockMvc mockMvc;

@Test
public void doTest() {
    MyForm form = ...
    MvcResult result = mockMvc.perform(MockMvcRequestBuilders.post("/...")
            .contentType("application/json")
            .content(mapper.writeValueAsString(form)))
            .andReturn();
    ...
}
This is because Jackson by default serializes LocalDate as:
{
    "date": {
        "year": 2020,
        "month": "JUNE",
        "monthValue": 6,
        ...
    },
    ...
}
As mentioned here: LocalDate Serialization: date as array?, there are many configurations to force Spring Boot to serialize dates in the yyyy-MM-dd format. But I don't want to change my JS code; I just want to make my test case work.
How can I configure ObjectMapper to force Jackson to serialize LocalDate to an array? I just want to get this:
{
    "date": [2020, 6, 17],
    ...
}
UPDATE
LocalDate here is java.time.LocalDate, not org.joda.time.LocalDate.
You need to register JavaTimeModule. Maven dependency:
<dependency>
<groupId>com.fasterxml.jackson.datatype</groupId>
<artifactId>jackson-datatype-jsr310</artifactId>
</dependency>
An example of how to use it:
import com.fasterxml.jackson.databind.json.JsonMapper;
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;

import java.time.LocalDate;

public class JsonApp {
    public static void main(String[] args) throws Exception {
        JsonMapper mapper = JsonMapper.builder()
                .addModule(new JavaTimeModule())
                .build();
        mapper.writeValue(System.out, new MyForm());
    }
}

class MyForm {
    private LocalDate value = LocalDate.now();

    public LocalDate getValue() {
        return value;
    }

    public void setValue(LocalDate value) {
        this.value = value;
    }
}
Above code prints:
{"value":[2020,6,17]}
See also:
jackson-modules-java8
Jackson Serialize Instant to Nanosecond Issue
Jackson deserialize elasticsearch long as LocalDateTime with Java 8
You could try to create a custom deserializer for LocalDate (the array-reading body below is one possible implementation):
class LocalDateDeserializer extends StdDeserializer<LocalDate> {

    protected LocalDateDeserializer() {
        super(LocalDate.class);
    }

    @Override
    public LocalDate deserialize(JsonParser parser, DeserializationContext context)
            throws IOException {
        // One possible implementation: read the [year, month, day] array from the question
        int[] parts = parser.readValueAs(int[].class);
        return LocalDate.of(parts[0], parts[1], parts[2]);
    }
}
And then register it by adding a Module bean. From the documentation:
Any beans of type com.fasterxml.jackson.databind.Module are automatically registered with the auto-configured Jackson2ObjectMapperBuilder and are applied to any ObjectMapper instances that it creates. This provides a global mechanism for contributing custom modules when you add new features to your application.
@Bean
public Module localDateDeserializerModule() {
    SimpleModule module = new SimpleModule();
    module.addDeserializer(LocalDate.class, new LocalDateDeserializer());
    return module;
}
You can build a converter that takes the date value and returns the wanted array.
This will be your entity:
@JsonSerialize(converter = DateToArray.class)
private LocalDate date;
Your converter:
@Component
public class DateToArray extends StdConverter<LocalDate, int[]> {

    @Override
    public int[] convert(LocalDate value) {
        // Push the date parts into an array matching the [yyyy, M, d] format
        return new int[] { value.getYear(), value.getMonthValue(), value.getDayOfMonth() };
    }
}
I'm currently migrating from Spring Data Elasticsearch 3.2.x to 4.0.0.
I'm removing a JacksonEntityMapper, which defined a custom ZonedDateTimeDeserializer, in order to use the ElasticsearchEntityMapper.
I have a ZonedDateTime field defined as follows:
@Field(type = FieldType.Date, format = DateFormat.date_time)
private final ZonedDateTime loggedIn;
However, the deserialization of this loses the zone information, so that a comparison between the field before and after being stored fails:
before
loggedIn=2020-06-01T09:50:27.389589+01:00[Europe/London]
after
loggedIn=2020-06-01T09:50:27.389+01:00
I expect the zone information to be lost as only the timezone offset is being stored. With the Jackson ZonedDateTimeDeserializer I was able to apply the Zone during the ZonedDateTime construction.
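For reference, such a deserializer might have looked roughly like this (a sketch of the pattern, not the original code; the zone Europe/London is assumed from the example values above):

public class CustomZonedDateTimeDeserializer extends JsonDeserializer<ZonedDateTime> {

    @Override
    public ZonedDateTime deserialize(JsonParser parser, DeserializationContext context) throws IOException {
        // Parse the offset-only text, then re-apply the desired zone
        return ZonedDateTime.parse(parser.getText(), DateTimeFormatter.ISO_OFFSET_DATE_TIME)
                .withZoneSameInstant(ZoneId.of("Europe/London"));
    }
}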
Ideally, I'd like to define a custom date format and converter classes to handle my scenario.
I've tried the following field configuration:
@Field(type = FieldType.Date, format = DateFormat.custom, pattern = "yyyy-MM-dd'T'HH:mm:ss.SSSSSSZ")
private final ZonedDateTime loggedIn;
With Reading/WritingConverters:
@WritingConverter
public class ZonedDateTimeToStringConverter implements Converter<ZonedDateTime, String> {

    @Override
    public String convert(ZonedDateTime source) {
        return source.format(DateTimeFormatter.ISO_OFFSET_DATE_TIME);
    }
}

@ReadingConverter
public class StringToZonedDateTimeConverter implements Converter<String, ZonedDateTime> {

    @Override
    public ZonedDateTime convert(String source) {
        return ZonedDateTime.parse(source, DateTimeFormatter.ISO_OFFSET_DATE_TIME.withZone(ZoneId.systemDefault()));
    }
}
and this configuration:
public class ElasticConfiguration extends AbstractElasticsearchConfiguration {

    @Bean
    @Override
    public ElasticsearchCustomConversions elasticsearchCustomConversions() {
        return new ElasticsearchCustomConversions(List.of(new ZonedDateTimeToStringConverter(),
                new StringToZonedDateTimeConverter()));
    }
}
However, the reading of the field fails with an exception
Caused by: java.time.DateTimeException: Unable to obtain LocalDate from TemporalAccessor: {YearOfEra=2020, MonthOfYear=8, DayOfMonth=20, OffsetSeconds=3600},ISO resolved to 11:11:11.123 of type java.time.format.Parsed
at java.base/java.time.LocalDate.from(LocalDate.java:396)
at java.base/java.time.ZonedDateTime.from(ZonedDateTime.java:560)
at org.springframework.data.elasticsearch.core.convert.ElasticsearchDateConverter.parse(ElasticsearchDateConverter.java:109)
at org.springframework.data.elasticsearch.core.convert.ElasticsearchDateConverter.parse(ElasticsearchDateConverter.java:114)
...
Looking at the exception and comparing the parsing against the successful DateFormat.date_time read, I may have an error in the pattern. The TemporalAccessor for DateFormat.date_time is {OffsetSeconds=3600, InstantSeconds=1597918271},ISO resolved to 2020-08-20T11:11:11.123, whereas my custom pattern parses to {YearOfEra=2020, MonthOfYear=8, DayOfMonth=20, OffsetSeconds=3600},ISO resolved to 11:11:11.123.
But it also seems that the custom converters I specified aren't being picked up. Note: I have other custom converters specified that are being picked up, so I don't believe it's a configuration issue.
Any help would be appreciated. I'm not sure why the custom pattern fails, but I think I could avoid it if the custom converters were picked up. I can work around the issue for now, but ideally I'd like everything to be consistent before and after the upgrade.
Don't use yyyy in a date pattern; change it to (see the Elasticsearch docs):
pattern = "uuuu-MM-dd'T'HH:mm:ss.SSSSSSZ"
In java.time patterns, yyyy is year-of-era, which cannot be resolved to a date without an era (hence the YearOfEra field in the exception), while uuuu is the plain year.
By defining the property as FieldType.Date, a converter is created internally for this property and used; the custom converters aren't needed.
ElasticsearchDateConverter is a final class and causes errors with custom date patterns.
ElasticsearchCustomConversions work only on "non-mapped" date types.
This is a limitation of the newest versions of spring-data-elasticsearch.
The fields in Elasticsearch can accept many date formats, but in Spring this is blocked.
Solution: use only the REST client and Jackson with custom date formats:
private ObjectMapper getJacksonObjectMapper() {
    if (jacksonMapper == null) {
        jacksonMapper = new ObjectMapper();
        jacksonMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
        jacksonMapper.configure(DeserializationFeature.ACCEPT_SINGLE_VALUE_AS_ARRAY, true);
        jacksonMapper.configure(DeserializationFeature.ACCEPT_EMPTY_STRING_AS_NULL_OBJECT, true);
        jacksonMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
        // jacksonMapper.disable(DeserializationFeature.READ_DATE_TIMESTAMPS_AS_NANOSECONDS);
        SimpleModule module = new SimpleModule();
        module.addDeserializer(LocalDateTime.class, new CustomLocalDateTimeDeserializer());
        module.addDeserializer(ZonedDateTime.class, new CustomZonedDateTimeDeserializer());
        module.addDeserializer(Date.class, new CustomDateDeserializer());
        jacksonMapper.registerModule(module);
    }
    return jacksonMapper;
}
public class CustomLocalDateTimeDeserializer extends JsonDeserializer<LocalDateTime> {

    @Override
    public LocalDateTime deserialize(JsonParser jsonparser, DeserializationContext context)
            throws IOException {
        String dateAsString = jsonparser.getText();
        try {
            return LocalDateTime.parse(dateAsString, DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"));
        } catch (Exception e) {
            try {
                return LocalDateTime.parse(dateAsString, DateTimeFormatter.ofPattern("yyyyMMddHHmmss"));
            } catch (Exception e1) {
                try {
                    return LocalDateTime.parse(dateAsString, DateTimeFormatter.ofPattern("yyyyMMdd"));
                } catch (Exception e2) {
                    throw new RuntimeException(e2);
                }
            }
        }
    }
}
#Bean(name="customConverter")
public ElasticsearchConverter elasticsearchConverter(SimpleElasticsearchMappingContext mappingContext,
ElasticsearchCustomConversions elasticsearchCustomConversions) {
DefaultConversionService cs=new DefaultConversionService();
MappingElasticsearchConverter converter = new MappingElasticsearchConverter(mappingContext,cs) {
#Override
public <R> R read(Class<R> type, org.springframework.data.elasticsearch.core.document.Document source) {
return getJacksonObjectMapper().convertValue(source, type);
}
};
converter.setConversions(elasticsearchCustomConversions);
return converter;
}
public ElasticsearchRestTemplate elasticSearchTemplate(#Qualifier("customConverter")ElasticsearchConverter elasticsearchConverter) {
return new ElasticsearchRestTemplate(client(), elasticsearchConverter);
}
I am attempting to persist a java.time.LocalDateTime object in my Cassandra database and keep it timezone agnostic. I am using Spring Data Cassandra to do this.
The problem is that somewhere along the line, something is treating these LocalDateTime objects as if they are in the timezone of my server, and offsetting them to UTC time when it stores them in the database.
Is this a bug or a feature? Can I work around it in some way?
Configuration:
@Configuration
@EnableCassandraRepositories(basePackages = "my.base.package")
public class CassandraConfig extends AbstractCassandraConfiguration {

    @Override
    protected String getKeyspaceName() {
        return "keyspacename";
    }

    @Bean
    public CassandraClusterFactoryBean cluster() {
        CassandraClusterFactoryBean cluster = new CassandraClusterFactoryBean();
        cluster.setContactPoints("127.0.0.1");
        cluster.setPort(9142);
        return cluster;
    }

    @Bean
    public CassandraMappingContext cassandraMapping() throws ClassNotFoundException {
        return new BasicCassandraMappingContext();
    }
}
Booking record I wish to persist:
#Table("booking")
public class BookingRecord {
#PrimaryKeyColumn(
ordinal = 0,
type = PrimaryKeyType.PARTITIONED
)
private UUID bookingId = null;
#PrimaryKeyColumn(
ordinal = 1,
type = PrimaryKeyType.CLUSTERED,
ordering = Ordering.ASCENDING
)
private LocalDateTime startTime = null;
...
}
Simple Repository Interface:
@Repository
public interface BookingRepository extends CassandraRepository<BookingRecord> { }
Save Call:
...
@Autowired
BookingRepository bookingRepository;
...

public void saveBookingRecord(BookingRecord bookingRecord) {
    bookingRepository.save(bookingRecord);
}
Here is the string used to populate startTime in the BookingRecord:
"startTime": "2017-06-10T10:00:00Z"
And here is the output from cqlsh after the timestamp has been persisted:
cqlsh:keyspacename> select * from booking ;
bookingid | starttime
--------------------------------------+--------------------------------
8b640c30-4c94-11e7-898b-6dab708ec5b4 | 2017-06-10 15:00:00.000000+0000
Cassandra stores a Date (timestamp) as milliseconds since epoch, without specific timezone information. Timezone data is handled in the layers above Cassandra.
LocalDate/LocalDateTime represent a point in time relative to your local time. Before the date/time can be saved, it needs to be enhanced with a timezone to calculate the generic representation that can be saved.
Spring Data uses your system-default timezone (Date.from(source.atZone(systemDefault()).toInstant())).
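To see the effect concretely, here is a small sketch (assuming the server runs in America/Chicago, UTC-5 in June, which matches the five-hour shift in the question):

import java.time.LocalDateTime;
import java.time.ZoneId;
import java.util.Date;

public class ImplicitConversionDemo {
    public static void main(String[] args) {
        // The zone-less value from the question
        LocalDateTime start = LocalDateTime.parse("2017-06-10T10:00:00");
        // What Spring Data effectively does before handing the value to the driver
        Date stored = Date.from(start.atZone(ZoneId.of("America/Chicago")).toInstant());
        System.out.println(stored.toInstant()); // 2017-06-10T15:00:00Z
    }
}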
If you need timezone precision and want to omit any implicit timezone conversions, use java.util.Date directly, which corresponds with Cassandra's (well, the Datastax driver's, to be precise) storage format representation.
I do actually want to use LocalDateTime and LocalDate in my project, rather than java.util.Date, since they are newer and have more attractive functionality.
After much searching I have found a workaround.
First, you must create custom implementations of Spring's Converter interface as follows:
One for Date to LocalDateTime:
public class DateToLocalDateTime implements Converter<Date, LocalDateTime> {

    @Override
    public LocalDateTime convert(Date source) {
        return source == null ? null : LocalDateTime.ofInstant(source.toInstant(), ZoneOffset.UTC);
    }
}
And one for LocalDateTime to Date:
public class LocalDateTimeToDate implements Converter<LocalDateTime, Date> {

    @Override
    public Date convert(LocalDateTime source) {
        return source == null ? null : Date.from(source.toInstant(ZoneOffset.UTC));
    }
}
Finally, you must override the customConversions method in CassandraConfig as follows:
@Configuration
@EnableCassandraRepositories(basePackages = "my.base.package")
public class CassandraConfig extends AbstractCassandraConfiguration {

    @Override
    protected String getKeyspaceName() {
        return "keyspacename";
    }

    @Override
    public CustomConversions customConversions() {
        List<Converter> converters = new ArrayList<>();
        converters.add(new DateToLocalDateTime());
        converters.add(new LocalDateTimeToDate());
        return new CustomConversions(converters);
    }

    @Bean
    public CassandraClusterFactoryBean cluster() {
        CassandraClusterFactoryBean cluster = new CassandraClusterFactoryBean();
        cluster.setContactPoints("127.0.0.1");
        cluster.setPort(9142);
        return cluster;
    }

    @Bean
    public CassandraMappingContext cassandraMapping() throws ClassNotFoundException {
        return new BasicCassandraMappingContext();
    }
}
Thanks to mp911de for putting me in the ballpark of where to look for the solution!
Following this tutorial, working with the complete code, how do I enable exception translation for Mongo?
When my MongoDB is down, I'm getting a 500 error from com.mongodb.MongoServerSelectionException. Shouldn't this be translated into DataAccessResourceFailureException by MongoExceptionTranslator? Am I supposed to register this bean somehow? I've tried:
@Bean
public MongoExceptionTranslator mongoExceptionTranslator() {
    return new MongoExceptionTranslator();
}
but still no change.
EDIT:
I've created a demo with the suggestions from Stackee007 but still can't get this to work.
MongoExceptionTranslator is already registered if your configuration registers MongoFactoryBean or SimpleMongoDbFactory. You could configure Mongo something like below, which registers SimpleMongoDbFactory.
@Configuration
@EnableMongoRepositories
public class ApplicationConfig extends AbstractMongoConfiguration {

    @Override
    protected String getDatabaseName() {
        return "yyy";
    }

    @Override
    protected UserCredentials getUserCredentials() {
        return new UserCredentials("abc", "***");
    }

    @Override
    @Bean
    public Mongo mongo() throws Exception {
        List<ServerAddress> seeds = new ArrayList<ServerAddress>();
        seeds.add(new ServerAddress("xxxx"));
        seeds.add(new ServerAddress("xxx"));
        seeds.add(new ServerAddress("xx"));
        MongoClient mongo = new MongoClient(seeds);
        mongo.setReadPreference(ReadPreference.secondaryPreferred());
        mongo.setWriteConcern(WriteConcern.ACKNOWLEDGED);
        return mongo;
    }

    @Bean
    public GridFsTemplate gridFsTemplate() throws Exception {
        return new GridFsTemplate(mongoDbFactory(), mappingMongoConverter());
    }
}
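With that in place, driver failures should surface as Spring's DataAccessException hierarchy. A minimal sketch of what calling code can then rely on (the StatusCheck class and the ping command are illustrative, not from the question):

import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.data.mongodb.core.MongoTemplate;

public class StatusCheck {

    private final MongoTemplate template;

    public StatusCheck(MongoTemplate template) {
        this.template = template;
    }

    public boolean isDatabaseReachable() {
        try {
            template.executeCommand("{ ping: 1 }");
            return true;
        } catch (DataAccessResourceFailureException e) {
            // e.g. server selection timed out because MongoDB is down
            return false;
        }
    }
}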
I have a simple document with Java 8 date/time fields:
@Document
public class Token {

    private Instant createdAt;
    ...
}
that I want to persist with Spring Data MongoDB version 1.5. But fields of type java.time.Instant cannot be deserialized correctly because MappingMongoConverter lacks converters for java.time classes.
In Spring 4 I found org.springframework.format.datetime.standard.DateTimeConverters with different Converters including InstantToLongConverter and LongToInstantConverter declared as private static classes.
How can I configure MongoTemplate to use them to map Instant fields to longs?
I don't know if this is the best way, but I added Java 8 Date/Time (JSR-310) type support to Spring Data MongoDB 1.5.0.RELEASE like this:
First step. Add simple Spring Converters:
public class InstantToLongConverter implements Converter<Instant, Long> {

    @Override
    public Long convert(Instant instant) {
        return instant.toEpochMilli();
    }
}

public class LongToInstantConverter implements Converter<Long, Instant> {

    @Override
    public Instant convert(Long source) {
        return Instant.ofEpochMilli(source);
    }
}

public class LocalDateToStringConverter implements Converter<LocalDate, String> {

    @Override
    public String convert(LocalDate localDate) {
        return localDate.toString();
    }
}

public class StringToLocalDateConverter implements Converter<String, LocalDate> {

    @Override
    public LocalDate convert(String source) {
        return LocalDate.parse(source);
    }
}
Second step. Register these custom Converters with MappingMongoConverter in your AbstractMongoConfiguration implementation like this:
@Configuration
@EnableMongoRepositories(basePackages = {"my.app.repository"})
public class MongoConfiguration extends AbstractMongoConfiguration {
    ...

    @Override
    public CustomConversions customConversions() {
        return new CustomConversions(Arrays.asList(
                new InstantToLongConverter(), new LongToInstantConverter(),
                new LocalDateToStringConverter(), new StringToLocalDateConverter()));
    }
}
Now your document's Instant fields will be persisted as long values and LocalDates as Strings.
@user882209 explained it all just perfectly.
Since Spring Data MongoDB 1.7, support for JSR-310 has been added.
If the application is backed by Spring Boot, every version above 1.2.8 contains it as well.
In a Spring Boot app you can just do the following:
@Configuration
public class MongoDbConfig {

    @Autowired
    private MongoDbFactory mongoDbFactory;

    @Bean
    public MongoTemplate mongoTemplate() {
        MappingMongoConverter converter = new MappingMongoConverter(new DefaultDbRefResolver(mongoDbFactory),
                new MongoMappingContext());
        converter.setCustomConversions(new CustomConversions(Arrays.asList(...)));
        return new MongoTemplate(mongoDbFactory, converter);
    }
}
The following converters are provided by the Jsr310Converters class:
DateToLocalDateTimeConverter - LocalDateTimeToDateConverter
DateToLocalDateConverter - LocalDateToDateConverter
DateToLocalTimeConverter - LocalTimeToDateConverter
DateToInstantConverter - InstantToDateConverter
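So instead of listing converters by hand in the Arrays.asList(...) placeholder above, the whole set can be registered in one go (a sketch using Jsr310Converters.getConvertersToRegister() from Spring Data Commons, which returns the converters listed above):

@Bean
public MongoTemplate mongoTemplate() {
    MappingMongoConverter converter = new MappingMongoConverter(new DefaultDbRefResolver(mongoDbFactory),
            new MongoMappingContext());
    // Register every JSR-310 converter Spring Data ships in one call
    converter.setCustomConversions(new CustomConversions(
            new ArrayList<>(Jsr310Converters.getConvertersToRegister())));
    return new MongoTemplate(mongoDbFactory, converter);
}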