I am attempting to persist a java.time.LocalDateTime object in my Cassandra database and keep it timezone agnostic. I am using Spring Data Cassandra to do this.
The problem is that somewhere along the line, something is treating these LocalDateTime objects as if they are in the timezone of my server, and offsetting them to UTC time when it stores them in the database.
Is this a bug or a feature? Can I work around it in some way?
Configuration:
@Configuration
@EnableCassandraRepositories(basePackages = "my.base.package")
public class CassandraConfig extends AbstractCassandraConfiguration {

    @Override
    protected String getKeyspaceName() {
        return "keyspacename";
    }

    @Bean
    public CassandraClusterFactoryBean cluster() {
        CassandraClusterFactoryBean cluster = new CassandraClusterFactoryBean();
        cluster.setContactPoints("127.0.0.1");
        cluster.setPort(9142);
        return cluster;
    }

    @Bean
    public CassandraMappingContext cassandraMapping() throws ClassNotFoundException {
        return new BasicCassandraMappingContext();
    }
}
Booking record I wish to persist:
@Table("booking")
public class BookingRecord {

    @PrimaryKeyColumn(
        ordinal = 0,
        type = PrimaryKeyType.PARTITIONED
    )
    private UUID bookingId = null;

    @PrimaryKeyColumn(
        ordinal = 1,
        type = PrimaryKeyType.CLUSTERED,
        ordering = Ordering.ASCENDING
    )
    private LocalDateTime startTime = null;

    ...
}
Simple Repository Interface:
@Repository
public interface BookingRepository extends CassandraRepository<BookingRecord> { }
Save Call:
...
@Autowired
BookingRepository bookingRepository;
...

public void saveBookingRecord(BookingRecord bookingRecord) {
    bookingRepository.save(bookingRecord);
}
Here is the string used to populate the startTime field in BookingRecord:
"startTime": "2017-06-10T10:00:00Z"
And here is the output from cqlsh after the timestamp has been persisted:
cqlsh:keyspacename> select * from booking ;
bookingid | starttime
--------------------------------------+--------------------------------
8b640c30-4c94-11e7-898b-6dab708ec5b4 | 2017-06-10 15:00:00.000000+0000
Cassandra stores a timestamp as milliseconds since the epoch, without any timezone information. Timezone handling happens in the layers above Cassandra.
LocalDate/LocalDateTime represent a date/time without a timezone. Before such a value can be saved, it must be combined with a timezone to compute the absolute instant that actually gets stored.
Spring Data uses your system default timezone (Date.from(source.atZone(systemDefault()).toInstant())).
If you need timezone precision and want to avoid any implicit timezone conversions, use java.util.Date directly, which corresponds to the storage format of Cassandra (well, of the DataStax driver, to be precise).
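The shift described above is easy to reproduce in plain Java. This sketch compares the two conversions; the zone America/Chicago is a hypothetical stand-in for the server's default zone (UTC-5 in June), not something stated in the original post:

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZoneOffset;
import java.util.Date;

public class TimezoneShiftDemo {

    // What Spring Data effectively does: interpret the LocalDateTime in the
    // JVM's default zone before converting to an instant.
    static Date viaZone(LocalDateTime ldt, ZoneId zone) {
        return Date.from(ldt.atZone(zone).toInstant());
    }

    // What the question expects: treat the LocalDateTime as UTC wall time.
    static Date viaUtc(LocalDateTime ldt) {
        return Date.from(ldt.toInstant(ZoneOffset.UTC));
    }

    public static void main(String[] args) {
        LocalDateTime start = LocalDateTime.of(2017, 6, 10, 10, 0);
        ZoneId serverZone = ZoneId.of("America/Chicago"); // hypothetical server zone
        long driftHours = (viaZone(start, serverZone).getTime() - viaUtc(start).getTime()) / 3_600_000L;
        System.out.println(driftHours + " hour(s) of drift"); // 5: 10:00 becomes 15:00 UTC, as in the cqlsh output
    }
}
```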
I do actually want to use LocalDateTime and LocalDate in my project, rather than java.util.Date, since they are newer and have more attractive functionality.
After much searching I have found a workaround.
First, you must create custom implementations of Spring's Converter interface as follows:
One for Date to LocalDateTime:
public class DateToLocalDateTime implements Converter<Date, LocalDateTime> {
    @Override
    public LocalDateTime convert(Date source) {
        return source == null ? null : LocalDateTime.ofInstant(source.toInstant(), ZoneOffset.UTC);
    }
}

And one for LocalDateTime to Date:

public class LocalDateTimeToDate implements Converter<LocalDateTime, Date> {
    @Override
    public Date convert(LocalDateTime source) {
        return source == null ? null : Date.from(source.toInstant(ZoneOffset.UTC));
    }
}
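Because both converters pin the zone to UTC, the mapping is a lossless round trip regardless of the JVM's default zone. A quick plain-Java check of the same logic, without the Spring Converter interface:

```java
import java.time.LocalDateTime;
import java.time.ZoneOffset;
import java.util.Date;

public class UtcRoundTripDemo {

    // Same logic as LocalDateTimeToDate, minus the Spring interface
    static Date toDate(LocalDateTime source) {
        return source == null ? null : Date.from(source.toInstant(ZoneOffset.UTC));
    }

    // Same logic as DateToLocalDateTime
    static LocalDateTime toLocalDateTime(Date source) {
        return source == null ? null : LocalDateTime.ofInstant(source.toInstant(), ZoneOffset.UTC);
    }

    public static void main(String[] args) {
        LocalDateTime start = LocalDateTime.of(2017, 6, 10, 10, 0);
        // Both directions use ZoneOffset.UTC, so no system-default shift can creep in
        System.out.println(toLocalDateTime(toDate(start)).equals(start)); // true
    }
}
```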
Finally, you must override the customConversions method in CassandraConfig as follows:
@Configuration
@EnableCassandraRepositories(basePackages = "my.base.package")
public class CassandraConfig extends AbstractCassandraConfiguration {

    @Override
    protected String getKeyspaceName() {
        return "keyspacename";
    }

    @Override
    public CustomConversions customConversions() {
        List<Converter> converters = new ArrayList<>();
        converters.add(new DateToLocalDateTime());
        converters.add(new LocalDateTimeToDate());
        return new CustomConversions(converters);
    }

    @Bean
    public CassandraClusterFactoryBean cluster() {
        CassandraClusterFactoryBean cluster = new CassandraClusterFactoryBean();
        cluster.setContactPoints("127.0.0.1");
        cluster.setPort(9142);
        return cluster;
    }

    @Bean
    public CassandraMappingContext cassandraMapping() throws ClassNotFoundException {
        return new BasicCassandraMappingContext();
    }
}
Thanks to mp911de for putting me in the ballpark of where to look for the solution!
Related
I'm currently migrating from Spring Data Elasticsearch 3.2.x to 4.0.0.
I'm removing a JacksonEntityMapper, that defined a custom ZonedDateTimeDeserializer, to use the ElasticsearchEntityMapper
I have a ZonedDateTime field defined as follows:
@Field(type = Date, format = DateFormat.date_time)
private final ZonedDateTime loggedIn;
However, the deserialization of this loses the zone information, so that a comparison between the field before and after being stored fails:
before
loggedIn=2020-06-01T09:50:27.389589+01:00[Europe/London]
after
loggedIn=2020-06-01T09:50:27.389+01:00
I expect the zone information to be lost as only the timezone offset is being stored. With the Jackson ZonedDateTimeDeserializer I was able to apply the Zone during the ZonedDateTime construction.
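The loss can be reproduced with plain ZonedDateTime parsing, no Elasticsearch involved: the two strings from the comparison denote the same instant (to millisecond precision), but only one retains the region id.

```java
import java.time.ZonedDateTime;
import java.time.temporal.ChronoUnit;

public class ZoneLossDemo {
    public static void main(String[] args) {
        ZonedDateTime before = ZonedDateTime.parse("2020-06-01T09:50:27.389589+01:00[Europe/London]");
        ZonedDateTime after = ZonedDateTime.parse("2020-06-01T09:50:27.389+01:00");

        // Not equal: the region id and the sub-millisecond digits are gone
        System.out.println(before.equals(after)); // false

        // But the instant survives, to millisecond precision
        System.out.println(before.toInstant().truncatedTo(ChronoUnit.MILLIS)
                .equals(after.toInstant())); // true
    }
}
```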
Ideally, I'd like to define a custom date format and converter classes to handle my scenario.
I've tried the following field configuration:
@Field(type = Date, format = DateFormat.custom, pattern = "yyyy-MM-dd'T'HH:mm:ss.SSSSSSZ")
private final ZonedDateTime loggedIn;
With Reading/WritingConverters
@WritingConverter
public class ZonedDateTimeToStringConverter implements Converter<ZonedDateTime, String> {
    @Override
    public String convert(ZonedDateTime source) {
        return source.format(DateTimeFormatter.ISO_OFFSET_DATE_TIME);
    }
}

@ReadingConverter
public class StringToZonedDateTimeConverter implements Converter<String, ZonedDateTime> {
    @Override
    public ZonedDateTime convert(String source) {
        return ZonedDateTime.parse(source, DateTimeFormatter.ISO_OFFSET_DATE_TIME.withZone(ZoneId.systemDefault()));
    }
}
and configuration
public class ElasticConfiguration extends AbstractElasticsearchConfiguration {

    @Bean
    @Override
    public ElasticsearchCustomConversions elasticsearchCustomConversions() {
        return new ElasticsearchCustomConversions(
                List.of(new ZonedDateTimeToStringConverter(), new StringToZonedDateTimeConverter()));
    }
}
However, the reading of the field fails with an exception
Caused by: java.time.DateTimeException: Unable to obtain LocalDate from TemporalAccessor: {YearOfEra=2020, MonthOfYear=8, DayOfMonth=20, OffsetSeconds=3600},ISO resolved to 11:11:11.123 of type java.time.format.Parsed
at java.base/java.time.LocalDate.from(LocalDate.java:396)
at java.base/java.time.ZonedDateTime.from(ZonedDateTime.java:560)
at org.springframework.data.elasticsearch.core.convert.ElasticsearchDateConverter.parse(ElasticsearchDateConverter.java:109)
at org.springframework.data.elasticsearch.core.convert.ElasticsearchDateConverter.parse(ElasticsearchDateConverter.java:114)
...
Comparing this exception against the successful DateFormat.date_time read suggests an error in my pattern. The TemporalAccessor for DateFormat.date_time is {OffsetSeconds=3600, InstantSeconds=1597918271},ISO resolved to 2020-08-20T11:11:11.123, whereas my custom pattern parses to {YearOfEra=2020, MonthOfYear=8, DayOfMonth=20, OffsetSeconds=3600},ISO resolved to 11:11:11.123.
It also seems that the custom converters I specified aren't being picked up. Note: I have other custom converters that are picked up, so I don't believe it's a configuration issue.
Any help would be appreciated. I'm not sure why the custom pattern fails, but I think I could avoid the problem if the custom converters were picked up. I can work around the issue for now, but ideally I'd like everything to be consistent before and after the upgrade.
Don't use yyyy in a date pattern; change it to (see the Elasticsearch docs):
pattern = "uuuu-MM-dd'T'HH:mm:ss.SSSSSSZ"
By defining the property as FieldType.Date, a converter is created internally for this property and used; the custom converters aren't needed.
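The yyyy failure can be reproduced outside Elasticsearch with a strictly resolving formatter (the assumption here is that Elasticsearch-style date parsing resolves strictly, which is what surfaces the unresolved YearOfEra seen in the stack trace above):

```java
import java.time.DateTimeException;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;
import java.time.format.ResolverStyle;

public class YearPatternDemo {

    static ZonedDateTime parseStrict(String text, String pattern) {
        return ZonedDateTime.parse(text,
                DateTimeFormatter.ofPattern(pattern).withResolverStyle(ResolverStyle.STRICT));
    }

    public static void main(String[] args) {
        String text = "2020-08-20T11:11:11.123456+0100";

        // uuuu yields a proleptic year, so the date resolves
        System.out.println(parseStrict(text, "uuuu-MM-dd'T'HH:mm:ss.SSSSSSZ"));

        // yyyy yields only YearOfEra; with no era present, strict resolution
        // cannot build a LocalDate -- the same failure as in the stack trace
        try {
            parseStrict(text, "yyyy-MM-dd'T'HH:mm:ss.SSSSSSZ");
        } catch (DateTimeException expected) {
            System.out.println("yyyy failed: " + expected.getMessage());
        }
    }
}
```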
ElasticsearchDateConverter is a final class, and it causes errors on custom date patterns.
ElasticsearchCustomConversions work only on "non-mapped" date types.
This is a limitation of the newest versions of spring-data-elasticsearch: Elasticsearch fields can accept many date formats, but Spring blocks this.
Solution: use only the REST client and Jackson with custom date formats:
private ObjectMapper getJacksonObjectMapper() {
    if (jacksonMapper == null) {
        jacksonMapper = new ObjectMapper();
        jacksonMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
        jacksonMapper.configure(DeserializationFeature.ACCEPT_SINGLE_VALUE_AS_ARRAY, true);
        jacksonMapper.configure(DeserializationFeature.ACCEPT_EMPTY_STRING_AS_NULL_OBJECT, true);
        jacksonMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
        // jacksonMapper.disable(DeserializationFeature.READ_DATE_TIMESTAMPS_AS_NANOSECONDS);
        SimpleModule module = new SimpleModule();
        module.addDeserializer(LocalDateTime.class, new CustomLocalDateTimeDeserializer());
        module.addDeserializer(ZonedDateTime.class, new CustomZonedDateTimeDeserializer());
        module.addDeserializer(Date.class, new CustomDateDeserializer());
        jacksonMapper.registerModule(module);
    }
    return jacksonMapper;
}

public class CustomLocalDateTimeDeserializer extends JsonDeserializer<LocalDateTime> {
    @Override
    public LocalDateTime deserialize(JsonParser jsonparser, DeserializationContext context) throws IOException {
        String dateAsString = jsonparser.getText();
        try {
            return LocalDateTime.parse(dateAsString, DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"));
        } catch (Exception e) {
            try {
                return LocalDateTime.parse(dateAsString, DateTimeFormatter.ofPattern("yyyyMMddHHmmss"));
            } catch (Exception e1) {
                try {
                    // A date-only pattern cannot parse directly to LocalDateTime;
                    // parse the date and assume start of day
                    return LocalDate.parse(dateAsString, DateTimeFormatter.ofPattern("yyyyMMdd")).atStartOfDay();
                } catch (Exception e2) {
                    throw new RuntimeException(e2);
                }
            }
        }
    }
}
@Bean(name = "customConverter")
public ElasticsearchConverter elasticsearchConverter(SimpleElasticsearchMappingContext mappingContext,
        ElasticsearchCustomConversions elasticsearchCustomConversions) {
    DefaultConversionService cs = new DefaultConversionService();
    MappingElasticsearchConverter converter = new MappingElasticsearchConverter(mappingContext, cs) {
        @Override
        public <R> R read(Class<R> type, org.springframework.data.elasticsearch.core.document.Document source) {
            // Bypass the mapping converter and let Jackson do the read
            return getJacksonObjectMapper().convertValue(source, type);
        }
    };
    converter.setConversions(elasticsearchCustomConversions);
    return converter;
}

@Bean
public ElasticsearchRestTemplate elasticSearchTemplate(@Qualifier("customConverter") ElasticsearchConverter elasticsearchConverter) {
    return new ElasticsearchRestTemplate(client(), elasticsearchConverter);
}
I'm having the following error when using Mongo's aggregate with a Java 8 LocalDateTime criteria.
Caused by: org.bson.codecs.configuration.CodecConfigurationException:
Can't find a codec for class java.time.LocalDateTime.
with the following piece of code
@SpringBootApplication
public class MongojavatimeApplication implements CommandLineRunner {

    @Autowired
    private MongoTemplate template;

    public static void main(String[] args) {
        SpringApplication.run(MongojavatimeApplication.class, args);
    }

    @Override
    public void run(String... args) throws Exception {
        Criteria c = Criteria.where("createdDate").gt(LocalDateTime.now().minusDays(30));
        template.aggregate(Aggregation.newAggregation(Aggregation.match(c)), "TestJavaTime", TestJavaTime.class);
    }
}
You'll find a few tests here: LocalDateTime works fine with a Spring repository and with a classical Criteria query through a MongoTemplate, but throws this error when creating an aggregate query.
https://github.com/Farael49/spring-mongo-aggregate-localdatetime
I also did a little test replacing the LocalDateTime with java.util.Date to show that it does not throw a codec error.
Is there something I can do, or is it a Mongo driver/Spring issue?
Thanks
I think your problem is due to the MongoDB Java driver not knowing how to serialize the LocalDateTime object. There is a good solution to this problem here: Cannot serialize LocalDate in Mongodb
Amending your code like this might work:
@Override
public void run(String... args) throws Exception {
    LocalDateTime startDateTime = LocalDateTime.now().minusDays(30);
    // LocalDateTime has no atStartOfDay(); attach the zone directly to get an Instant
    Instant startInstant = startDateTime.atZone(ZoneId.systemDefault()).toInstant();
    Criteria c = Criteria.where("createdDate").gt(Date.from(startInstant));
    template.aggregate(Aggregation.newAggregation(Aggregation.match(c)), "TestJavaTime", TestJavaTime.class);
}
If you want to use LocalDateTime directly you should provide a codec like this:
public enum LocalDateTimeCodec implements Codec<LocalDateTime> {
    INSTANCE;

    @Override
    public void encode(BsonWriter writer, LocalDateTime value, EncoderContext encoderContext) {
        writer.writeDateTime(value.toInstant(ZoneOffset.UTC).toEpochMilli());
    }

    @Override
    public LocalDateTime decode(BsonReader reader, DecoderContext decoderContext) {
        return Instant.ofEpochMilli(reader.readDateTime())
                .atOffset(ZoneOffset.UTC)
                .toLocalDateTime();
    }

    @Override
    public Class<LocalDateTime> getEncoderClass() {
        return LocalDateTime.class;
    }
}
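The codec's two methods are inverses. Stripped of the BSON reader/writer plumbing, the conversion math can be checked in isolation:

```java
import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneOffset;

public class CodecMathDemo {

    // encode() boils down to this epoch-milli conversion
    static long encode(LocalDateTime value) {
        return value.toInstant(ZoneOffset.UTC).toEpochMilli();
    }

    // decode() is the reverse
    static LocalDateTime decode(long millis) {
        return Instant.ofEpochMilli(millis).atOffset(ZoneOffset.UTC).toLocalDateTime();
    }

    public static void main(String[] args) {
        LocalDateTime createdDate = LocalDateTime.of(2020, 1, 15, 12, 30, 45);
        // Round trip through BSON's epoch-milli representation is lossless,
        // as long as the value carries no sub-millisecond precision
        System.out.println(decode(encode(createdDate)).equals(createdDate)); // true
    }
}
```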
You can register it this way:
@Bean
public MongoDbFactory mongoDbFactory() throws Exception {
    CodecRegistry registry = CodecRegistries.fromRegistries(
            CodecRegistries.fromCodecs(LocalDateTimeCodec.INSTANCE),
            MongoClient.getDefaultCodecRegistry());
    MongoClientOptions options = MongoClientOptions.builder()
            .codecRegistry(registry)
            .build();
    return new SimpleMongoDbFactory(new MongoClient(host, options), dbName);
}
where host and dbName might be autowired fields of some configuration class.
I'm exposing the following Map via a #RestController Servlet:
List<Map<String, Object>> results = jdbcTemplate.queryForList(..);
The map then contains a java.sql.Timestamp object.
Question: how can I set the output format that Spring with jaxb/jackson generates for the Timestamp? I want to set it globally; I do not want to loop over the map to detect and reformat the values manually.
The following did not work:
@Configuration
public class DateConfig extends WebMvcConfigurerAdapter {

    @Override
    public void addFormatters(FormatterRegistry registry) {
        super.addFormatters(registry);
        registry.addFormatterForFieldType(java.sql.Timestamp.class, new Formatter<Timestamp>() {
            @Override
            public String print(Timestamp object, Locale locale) {
                return "my custom format";
            }

            // parse() is required by the Formatter interface (omitted in the original attempt)
            @Override
            public Timestamp parse(String text, Locale locale) throws ParseException {
                throw new UnsupportedOperationException();
            }
        });
    }
}
The formatter is registered, but never called during serialization to JSON!
The current result is always like 2017-07-10T11:06:02.000+0000, but I'd like to get 2017-07-10 11:06:02 everywhere.
By default, date and time fields that are not annotated with #DateTimeFormat are converted from strings using the DateFormat.SHORT style. If you prefer, you can change this by defining your own global format.
Example below:
@Configuration
public class AppConfig {

    @Bean
    public FormattingConversionService conversionService() {
        // Use the DefaultFormattingConversionService but do not register defaults
        DefaultFormattingConversionService conversionService = new DefaultFormattingConversionService(false);

        // Ensure @NumberFormat is still supported
        conversionService.addFormatterForFieldAnnotation(new NumberFormatAnnotationFormatterFactory());

        // Register date conversion with a specific global format
        DateFormatterRegistrar registrar = new DateFormatterRegistrar();
        registrar.setFormatter(new YourTimeStampFormatter());
        registrar.registerFormatters(conversionService);

        return conversionService;
    }
}
See here
Solution as follows.
@Bean
public Jackson2ObjectMapperBuilderCustomizer jsonCustomizer() {
    return new Jackson2ObjectMapperBuilderCustomizer() {
        @Override
        public void customize(Jackson2ObjectMapperBuilder builder) {
            builder.dateFormat(new ISO8601DateFormat()); // or your custom date format
        }
    };
}
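To get the 2017-07-10 11:06:02 shape asked for rather than ISO 8601, a SimpleDateFormat with that pattern can be handed to builder.dateFormat(...). The formatting itself can be checked without Spring Boot; pinning the formatter to UTC is an assumption here, made so the output matches the UTC-rendered value from the question:

```java
import java.sql.Timestamp;
import java.text.SimpleDateFormat;
import java.util.TimeZone;

public class TimestampFormatDemo {

    static String format(Timestamp ts) {
        // Same pattern you would pass to builder.dateFormat(...)
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        fmt.setTimeZone(TimeZone.getTimeZone("UTC")); // assumption: render in UTC
        return fmt.format(ts);
    }

    public static void main(String[] args) {
        // 2017-07-10T11:06:02Z as epoch milliseconds
        System.out.println(format(new Timestamp(1499684762000L))); // 2017-07-10 11:06:02
    }
}
```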
I want my auditable (#CreatedDate and #LastModifiedDate) MongoDB document to work with ZonedDateTime fields.
Apparently this type is not supported by Spring Data (have a look at org.springframework.data.auditing.AnnotationAuditingMetadata).
Framework version: Spring Boot 2.0.0 and Spring Data MongoDB 2.0.0
Spring Data auditing error:
java.lang.IllegalArgumentException: Invalid date type for member <MEMBER NAME>!
Supported types are [org.joda.time.DateTime, org.joda.time.LocalDateTime, java.util.Date, java.lang.Long, long].
Mongo configuration:
@Configuration
@EnableMongoAuditing
public class MongoConfiguration {
}
The auditable entity:
public abstract class BaseDocument {

    @CreatedDate
    private ZonedDateTime createdDate;

    @LastModifiedDate
    private ZonedDateTime lastModifiedDate;
}
Things I tried
I also tried creating a custom converter for ZonedDateTime, but it is not considered by Spring Data. The class DateConvertingAuditableBeanWrapper has a ConversionService which is configured in the constructor method with JodaTimeConverters, Jsr310Converters and ThreeTenBackPortConverters.
Custom converter:
@Component
public class LocalDateTimeToZonedDateTimeConverter implements Converter<LocalDateTime, ZonedDateTime> {
    @Override
    public ZonedDateTime convert(LocalDateTime source) {
        return source.atZone(ZoneId.systemDefault());
    }
}
Spring Data DateConvertingAuditableBeanWrapper:
class DefaultAuditableBeanWrapperFactory implements AuditableBeanWrapperFactory {

    abstract static class DateConvertingAuditableBeanWrapper implements AuditableBeanWrapper {
        private final ConversionService conversionService;
    }
}
Is it possible to audit ZonedDateTime fields?
How can I register a converter?
Create a DateTimeProvider to provide the current time to be used when auditing:
@Component("dateTimeProvider")
public class CustomDateTimeProvider implements DateTimeProvider {
    @Override
    public Optional<TemporalAccessor> getNow() {
        return Optional.of(ZonedDateTime.now());
    }
}
And then:
Reference the DateTimeProvider component in the #EnableMongoAuditing annotation;
Create Converters for Date and ZonedDateTime;
Add the Converter instances to a MongoCustomConversions instance;
Expose the MongoCustomConversions instance as a #Bean.
@Configuration
@EnableMongoAuditing(dateTimeProviderRef = "dateTimeProvider")
public class MongoConfiguration {

    @Bean
    public MongoCustomConversions customConversions() {
        List<Converter<?, ?>> converters = new ArrayList<>();
        converters.add(new DateToZonedDateTimeConverter());
        converters.add(new ZonedDateTimeToDateConverter());
        return new MongoCustomConversions(converters);
    }

    class DateToZonedDateTimeConverter implements Converter<Date, ZonedDateTime> {
        @Override
        public ZonedDateTime convert(Date source) {
            return source == null ? null
                    : ZonedDateTime.ofInstant(source.toInstant(), ZoneId.systemDefault());
        }
    }

    class ZonedDateTimeToDateConverter implements Converter<ZonedDateTime, Date> {
        @Override
        public Date convert(ZonedDateTime source) {
            return source == null ? null : Date.from(source.toInstant());
        }
    }
}
I wouldn't, however, use ZonedDateTime for this purpose. I would stick to OffsetDateTime:
OffsetDateTime, ZonedDateTime and Instant all store an instant on the time-line to nanosecond precision. Instant is the simplest, simply representing the instant. OffsetDateTime adds to the instant the offset from UTC/Greenwich, which allows the local date-time to be obtained. ZonedDateTime adds full time-zone rules.
It is intended that ZonedDateTime or Instant is used to model data in simpler applications. This class may be used when modeling date-time concepts in more detail, or when communicating to a database or in a network protocol.
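The practical difference between the two types shows up across a DST boundary: the zoned value follows Europe/London's rules, while the offset value keeps its fixed +01:00. A small demonstration:

```java
import java.time.OffsetDateTime;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class ZoneVsOffsetDemo {
    public static void main(String[] args) {
        ZonedDateTime zoned = ZonedDateTime.of(2020, 6, 1, 9, 50, 0, 0, ZoneId.of("Europe/London"));
        OffsetDateTime offset = zoned.toOffsetDateTime();

        // Six months later London has left BST, so the zoned value's offset changes...
        System.out.println(zoned.plusMonths(6).getOffset());  // Z
        // ...while the offset value blindly keeps +01:00 and now denotes a different instant
        System.out.println(offset.plusMonths(6).getOffset()); // +01:00
    }
}
```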
I have a simple document with Java 8 date/time fields
@Document
public class Token {
    private Instant createdAt;
    ...
}
that I want to persist with Spring Data MongoDB version 1.5. But fields of type java.time.Instant could not be de-serialized correctly because MappingMongoConverter lacks converters for java.time classes.
In Spring 4 I found org.springframework.format.datetime.standard.DateTimeConverters with different Converters including InstantToLongConverter and LongToInstantConverter declared as private static classes.
How can I configure MongoTemplate to use them to map Instant fields to longs?
I don't know if this is the best way but I added Java 8 Date/Time (JSR-310) types support to Spring Data MongoDB 1.5.0.RELEASE like this:
First step. Add simple Spring Converters
public class InstantToLongConverter implements Converter<Instant, Long> {
    @Override
    public Long convert(Instant instant) {
        return instant.toEpochMilli();
    }
}

public class LongToInstantConverter implements Converter<Long, Instant> {
    @Override
    public Instant convert(Long source) {
        return Instant.ofEpochMilli(source);
    }
}

public class LocalDateToStringConverter implements Converter<LocalDate, String> {
    @Override
    public String convert(LocalDate localDate) {
        return localDate.toString();
    }
}

public class StringToLocalDateConverter implements Converter<String, LocalDate> {
    @Override
    public LocalDate convert(String source) {
        return LocalDate.parse(source);
    }
}
Second step. Register these custom Converters with MappingMongoConverter in your AbstractMongoConfiguration implementation like this:
@Configuration
@EnableMongoRepositories(basePackages = {"my.app.repository"})
public class MongoConfiguration extends AbstractMongoConfiguration {
    ...

    @Override
    public CustomConversions customConversions() {
        return new CustomConversions(Arrays.asList(
                new InstantToLongConverter(), new LongToInstantConverter(),
                new LocalDateToStringConverter(), new StringToLocalDateConverter()));
    }
}
Now your document's Instant fields will be persisted as long values and LocalDates as Strings.
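Both representations round-trip cleanly. A quick plain-Java check of the converter logic above, without the Spring Converter interface:

```java
import java.time.Instant;
import java.time.LocalDate;

public class MongoJsr310RoundTripDemo {

    // InstantToLongConverter / LongToInstantConverter logic
    static long instantToLong(Instant i) { return i.toEpochMilli(); }
    static Instant longToInstant(long l) { return Instant.ofEpochMilli(l); }

    // LocalDateToStringConverter / StringToLocalDateConverter logic
    static String localDateToString(LocalDate d) { return d.toString(); }
    static LocalDate stringToLocalDate(String s) { return LocalDate.parse(s); }

    public static void main(String[] args) {
        // Instants survive the epoch-milli round trip (at millisecond precision)
        Instant createdAt = Instant.parse("2014-05-20T10:15:30.123Z");
        System.out.println(longToInstant(instantToLong(createdAt)).equals(createdAt)); // true

        // LocalDates survive the ISO-string round trip
        LocalDate day = LocalDate.of(2014, 5, 20);
        System.out.println(stringToLocalDate(localDateToString(day))); // 2014-05-20
    }
}
```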
@user882209 explained it all perfectly.
Since Spring Data MongoDB 1.7, support for JSR-310 has been added. If the application is backed by Spring Boot, every version above 1.2.8 contains it as well.
In a Spring Boot app you can just do the following:
@Configuration
public class MongoDbConfig {

    @Autowired
    private MongoDbFactory mongoDbFactory;

    @Bean
    public MongoTemplate mongoTemplate() {
        MappingMongoConverter converter = new MappingMongoConverter(
                new DefaultDbRefResolver(mongoDbFactory), new MongoMappingContext());
        converter.setCustomConversions(new CustomConversions(Arrays.asList(...)));
        return new MongoTemplate(mongoDbFactory, converter);
    }
}
The following converters are provided by the Jsr310Converters class:
DateToLocalDateTimeConverter - LocalDateTimeToDateConverter
DateToLocalDateConverter - LocalDateToDateConverter
DateToLocalTimeConverter - LocalTimeToDateConverter
DateToInstantConverter - InstantToDateConverter