I'm currently migrating from Spring Data Elasticsearch 3.2.x to 4.0.0.
I'm removing a JacksonEntityMapper that defined a custom ZonedDateTimeDeserializer, in order to use the ElasticsearchEntityMapper.
I have a ZonedDateTime field defined as follows:
@Field(type = Date, format = DateFormat.date_time)
private final ZonedDateTime loggedIn;
However, the deserialization of this loses the zone information, so that a comparison between the field before and after being stored fails:
before
loggedIn=2020-06-01T09:50:27.389589+01:00[Europe/London]
after
loggedIn=2020-06-01T09:50:27.389+01:00
I expect the zone information to be lost as only the timezone offset is being stored. With the Jackson ZonedDateTimeDeserializer I was able to apply the Zone during the ZonedDateTime construction.
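The deserializer being removed was roughly along these lines (a sketch, not the exact original; the target zone is an assumption):
public class ZonedDateTimeDeserializer extends JsonDeserializer<ZonedDateTime> {
    @Override
    public ZonedDateTime deserialize(JsonParser parser, DeserializationContext context) throws IOException {
        // parse the offset form stored in Elasticsearch, e.g. 2020-06-01T09:50:27.389+01:00
        OffsetDateTime parsed = OffsetDateTime.parse(parser.getText(), DateTimeFormatter.ISO_OFFSET_DATE_TIME);
        // re-apply the desired zone so the zone id (not just the offset) is restored
        return parsed.atZoneSameInstant(ZoneId.of("Europe/London"));
    }
}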
Ideally, I'd like to define a custom date format and converter classes to handle my scenario.
I've tried the following field configuration:
@Field(type = Date, format = DateFormat.custom, pattern = "yyyy-MM-dd'T'HH:mm:ss.SSSSSSZ")
private final ZonedDateTime loggedIn;
With Reading/WritingConverters
@WritingConverter
public class ZonedDateTimeToStringConverter implements Converter<ZonedDateTime, String> {
@Override
public String convert(ZonedDateTime source) {
return source.format(DateTimeFormatter.ISO_OFFSET_DATE_TIME);
}
}
@ReadingConverter
public class StringToZonedDateTimeConverter implements Converter<String, ZonedDateTime> {
@Override
public ZonedDateTime convert(String source) {
return ZonedDateTime.parse(source, DateTimeFormatter.ISO_OFFSET_DATE_TIME.withZone(ZoneId.systemDefault()));
}
}
and configuration
public class ElasticConfiguration extends AbstractElasticsearchConfiguration {
@Bean
@Override
public ElasticsearchCustomConversions elasticsearchCustomConversions() {
return new ElasticsearchCustomConversions(List.of(new ZonedDateTimeToStringConverter(),
new StringToZonedDateTimeConverter()));
}
}
However, the reading of the field fails with an exception
Caused by: java.time.DateTimeException: Unable to obtain LocalDate from TemporalAccessor: {YearOfEra=2020, MonthOfYear=8, DayOfMonth=20, OffsetSeconds=3600},ISO resolved to 11:11:11.123 of type java.time.format.Parsed
at java.base/java.time.LocalDate.from(LocalDate.java:396)
at java.base/java.time.ZonedDateTime.from(ZonedDateTime.java:560)
at org.springframework.data.elasticsearch.core.convert.ElasticsearchDateConverter.parse(ElasticsearchDateConverter.java:109)
at org.springframework.data.elasticsearch.core.convert.ElasticsearchDateConverter.parse(ElasticsearchDateConverter.java:114)
...
Looking at the exception and comparing the parsing against the successful DateFormat.date_time read, I may have an error in the pattern. The TemporalAccessor for DateFormat.date_time is {OffsetSeconds=3600, InstantSeconds=1597918271},ISO resolved to 2020-08-20T11:11:11.123, whereas my custom pattern parses to {YearOfEra=2020, MonthOfYear=8, DayOfMonth=20, OffsetSeconds=3600},ISO resolved to 11:11:11.123
But it also seems that the custom converters I specified aren't being picked up. Note: I have other custom converters specified that are being picked up, so I don't believe it's a configuration issue.
Any help would be appreciated. I'm not sure why the custom pattern fails, but I think I could avoid it if the custom converters were picked up. I can work around the issue for now, but ideally I'd like everything to be consistent before and after the upgrade.
Don't use yyyy in the date pattern; change it to the following (see the Elasticsearch docs):
pattern = "uuuu-MM-dd'T'HH:mm:ss.SSSSSSZ")
By defining the property as FieldType.Date, a converter is created internally for this property and used; the custom converters aren't needed.
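Putting the two points together, the field declaration would look like this (the question's custom pattern with uuuu in place of yyyy):
@Field(type = FieldType.Date, format = DateFormat.custom, pattern = "uuuu-MM-dd'T'HH:mm:ss.SSSSSSZ")
private final ZonedDateTime loggedIn;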
ElasticsearchDateConverter is a final class and fails on some custom date patterns.
ElasticsearchCustomConversions work only on "non-mapped" date types.
This is a limitation of the newer versions of spring-data-elasticsearch.
Elasticsearch fields can accept many date formats, but on the Spring side this is blocked.
Solution: use only the REST client and Jackson with custom date formats:
private ObjectMapper getJacksonObjectMapper() {
if (jacksonMapper == null) {
jacksonMapper = new ObjectMapper();
jacksonMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
jacksonMapper.configure(DeserializationFeature.ACCEPT_SINGLE_VALUE_AS_ARRAY, true);
jacksonMapper.configure(DeserializationFeature.ACCEPT_EMPTY_STRING_AS_NULL_OBJECT, true);
jacksonMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
// strictMapper.disable(DeserializationFeature.READ_DATE_TIMESTAMPS_AS_NANOSECONDS);
SimpleModule module = new SimpleModule();
module.addDeserializer(LocalDateTime.class, new CustomLocalDateTimeDeserializer());
module.addDeserializer(ZonedDateTime.class, new CustomZonedDateTimeDeserializer());
module.addDeserializer(Date.class, new CustomDateDeserializer());
jacksonMapper.registerModule(module);
}
return jacksonMapper;
}
public class CustomLocalDateTimeDeserializer extends JsonDeserializer<LocalDateTime> {
@Override
public LocalDateTime deserialize(JsonParser jsonparser, DeserializationContext context)
throws IOException, JsonProcessingException {
String dateAsString = jsonparser.getText();
try {
return LocalDateTime.parse(dateAsString, DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"));
} catch (Exception e) {
try {
return LocalDateTime.parse(dateAsString, DateTimeFormatter.ofPattern("yyyyMMddHHmmss"));
} catch (Exception e1) {
try {
// a date-only value has no time part, so parse it as a LocalDate and take the start of day
return LocalDate.parse(dateAsString, DateTimeFormatter.ofPattern("yyyyMMdd")).atStartOfDay();
} catch (Exception e2) {
throw new RuntimeException(e2);
}
}
}
}
}
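The CustomZonedDateTimeDeserializer registered above isn't shown in the answer; a minimal sketch of what it could look like (the fallback pattern and the system-default zone are assumptions):
public class CustomZonedDateTimeDeserializer extends JsonDeserializer<ZonedDateTime> {
    @Override
    public ZonedDateTime deserialize(JsonParser jsonparser, DeserializationContext context) throws IOException {
        String dateAsString = jsonparser.getText();
        try {
            // values written with an offset, e.g. 2020-06-01T09:50:27.389+01:00
            return ZonedDateTime.parse(dateAsString, DateTimeFormatter.ISO_OFFSET_DATE_TIME);
        } catch (Exception e) {
            // fall back to a local date-time placed in the system default zone (assumption)
            return LocalDateTime.parse(dateAsString, DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSS"))
                    .atZone(ZoneId.systemDefault());
        }
    }
}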
@Bean(name="customConverter")
public ElasticsearchConverter elasticsearchConverter(SimpleElasticsearchMappingContext mappingContext,
ElasticsearchCustomConversions elasticsearchCustomConversions) {
DefaultConversionService cs=new DefaultConversionService();
MappingElasticsearchConverter converter = new MappingElasticsearchConverter(mappingContext,cs) {
@Override
public <R> R read(Class<R> type, org.springframework.data.elasticsearch.core.document.Document source) {
return getJacksonObjectMapper().convertValue(source, type);
}
};
converter.setConversions(elasticsearchCustomConversions);
return converter;
}
@Bean
public ElasticsearchRestTemplate elasticSearchTemplate(@Qualifier("customConverter") ElasticsearchConverter elasticsearchConverter) {
return new ElasticsearchRestTemplate(client(), elasticsearchConverter);
}
Related
I am using MongoTemplate with Spring Boot and am trying to aggregate data based on LocalDateTime, and I am getting this error: org.springframework.core.convert.ConverterNotFoundException: No converter found capable of converting from type [java.time.LocalDateTime] to type [java.util.Date]
I tried adding a custom converter, but it did not help. The code I added is:
@Bean
public MongoCustomConversions customConversions(){
List<Converter<?,?>> converters = new ArrayList<>();
converters.add(DateToLocalDateTimeConverter.INSTANCE);
converters.add( LocalDateTimeToDateConverter.INSTANCE);
return new MongoCustomConversions(converters);
}
enum DateToLocalDateTimeConverter implements Converter<Date, LocalDateTime> {
INSTANCE;
@Override
public LocalDateTime convert(Date source) {
return ofInstant(source.toInstant(), systemDefault());
}
}
enum LocalDateTimeToDateConverter implements Converter<LocalDateTime, Date> {
INSTANCE;
@Override
public Date convert(LocalDateTime source) {
return Date.from(source.toInstant(ZoneOffset.UTC));
}
}
Can someone tell me where I have gone wrong in creating the converter, or is there some alternative apart from changing LocalDateTime to Date in the code? The occurrences are numerous, and refactoring would take a lot of time and effort.
You can try this annotation on your entity or DTO; it will automatically format the date.
@Document(collection = "sample")
public class Foo{
@JsonFormat(shape = JsonFormat.Shape.STRING, pattern = "yyyy-MM-dd HH:mm", locale = "en-PH")
private Date createdAt;
}
I'm using Spring Boot 2.1.6 and there is an API that accepts a form including a date, like:
@Data
public class MyForm {
private LocalDate date;
...
}
@Controller
public class MyController {
@PostMapping("...")
public ResponseEntity<...> post(@RequestBody MyForm myForm) {
...
}
}
By default, Spring MVC accepts this JSON format:
{
"date": [2020, 6, 17],
...
}
So on the front end, my JavaScript code just submits a form like this; i.e., JS converts the date to an array.
But when I run a Spring Boot test with the following code, this serialization does not work:
private ObjectMapper mapper = new ObjectMapper();
@Autowired
private MockMvc mockMvc;
@Test
public void doTest() throws Exception {
MyForm form = ...
MvcResult result = mockMvc.perform(MockMvcRequestBuilders.post("/...")
.contentType("application/json").content(mapper.writeValueAsString(form))).andReturn();
...
}
This is because Jackson by default serializes LocalDate as:
{
"date": {
"year":2020,
"month":"JUNE",
"monthValue":6,
...
}
...
}
As mentioned here: LocalDate Serialization: date as array?, there are many configurations to force Spring Boot to serialize dates in the yyyy-MM-dd format. But I don't want to change my JS code. I just want to make my test case work.
How can I configure ObjectMapper to force Jackson to serialize LocalDate to Array? I just want to get this:
{
"date": [2020, 6, 17],
...
}
UPDATE
LocalDate here is java.time.LocalDate, not org.joda.time.LocalDate.
You need to register JavaTimeModule. Maven dependency:
<dependency>
<groupId>com.fasterxml.jackson.datatype</groupId>
<artifactId>jackson-datatype-jsr310</artifactId>
</dependency>
Example of how to use it:
import com.fasterxml.jackson.databind.json.JsonMapper;
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;
import java.time.LocalDate;
public class JsonApp {
public static void main(String[] args) throws Exception {
JsonMapper mapper = JsonMapper.builder()
.addModule(new JavaTimeModule())
.build();
mapper.writeValue(System.out, new MyForm());
}
}
class MyForm {
private LocalDate value = LocalDate.now();
public LocalDate getValue() {
return value;
}
public void setValue(LocalDate value) {
this.value = value;
}
}
Above code prints:
{"value":[2020,6,17]}
See also:
jackson-modules-java8
Jackson Serialize Instant to Nanosecond Issue
Jackson deserialize elasticsearch long as LocalDateTime with Java 8
You could try to create a custom deserializer for LocalDate
class LocalDateDeserializer extends StdDeserializer<LocalDate> {
@Override
public LocalDate deserialize(JsonParser parser, DeserializationContext context)
throws IOException, JsonProcessingException {
// implement;
}
}
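The deserialize body is left out above; a minimal sketch, assuming the incoming value is the [year, month, day] array form shown earlier:
class LocalDateDeserializer extends StdDeserializer<LocalDate> {
    protected LocalDateDeserializer() {
        super(LocalDate.class);
    }
    @Override
    public LocalDate deserialize(JsonParser parser, DeserializationContext context) throws IOException {
        // expects e.g. [2020, 6, 17]
        int[] parts = parser.readValueAs(int[].class);
        return LocalDate.of(parts[0], parts[1], parts[2]);
    }
}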
And then register it by adding a Module bean. From the documentation:
Any beans of type com.fasterxml.jackson.databind.Module are automatically registered with the auto-configured Jackson2ObjectMapperBuilder and are applied to any ObjectMapper instances that it creates. This provides a global mechanism for contributing custom modules when you add new features to your application.
@Bean
public Module LocalDateDeserializer() {
SimpleModule module = new SimpleModule();
module.addDeserializer(LocalDate.class, new LocalDateDeserializer());
return module;
}
You can build a converter that takes the date value and returns the wanted array.
This will be your entity:
@JsonSerialize(converter=DateToArray.class)
private LocalDate date;
Your converter:
@Component
public class DateToArray extends StdConverter<LocalDate, int[]> {
@Override
public int[] convert(LocalDate value) {
// push the date parts into an array and return it
return new int[] { value.getYear(), value.getMonthValue(), value.getDayOfMonth() };
}
}
I have a patching problem related to converting a String value to the corresponding type. When I try to patch the "Locale" type (or primitives), it works, but it fails for Instant.
Entity:
@JsonIgnore
@Field("locale")
private Locale locale;
@JsonIgnore
@Field("dateOfBirth")
private Instant dateOfBirth;
@JsonIgnore
public Locale getLocale() {
return this.locale;
}
@JsonIgnore
public void setLocale(Locale locale) {
this.locale = locale;
}
@JsonIgnore
public Instant getDateOfBirth() {
return this.dateOfBirth;
}
@JsonIgnore
public void setDateOfBirth(Instant dateOfBirth) {
this.dateOfBirth = dateOfBirth;
}
Patch method:
public static <T> T applyPatchOnObject(Class<T> type, T object, JsonNode jsonNode) {
try {
ObjectMapper mapper = new ObjectMapper();
mapper.registerModule(new JavaTimeModule());
return new JsonPatchPatchConverter(mapper).convert(jsonNode).apply(object, type);
} catch (Exception e) {
throw new UnprocessableEntityException(e.getMessage());
}
}
pom.xml
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.5.10.RELEASE</version>
<relativePath />
</parent>
<!-- Date -->
<dependency>
<groupId>com.fasterxml.jackson.datatype</groupId>
<artifactId>jackson-datatype-jsr310</artifactId>
</dependency>
Data:
[{"op": "replace", "path": "dateOfBirth", "value": "1971-01-01T01:01:01.001Z"}]
The exception:
EL1034E: A problem occurred whilst attempting to set the property 'dateOfBirth': Type conversion failure
Deeper exception:
EL1001E: Type conversion problem, cannot convert from java.lang.String to #com.fasterxml.jackson.annotation.JsonIgnore #org.springframework.data.mongodb.core.mapping.Field java.time.Instant
Edit 1:
The following code blocks work:
Code: System.out.println(mapper.readValue("1517846620.12312312", Instant.class));
Result: 2018-02-05T16:03:40.123123120Z
The following code blocks DO NOT work:
Patch: [{"op": "replace", "path": "dateOfBirth", "value": "1517846620.12312312"}]
Solution:
Although the answer from @Babl will probably work, I figured out the following things.
As @Babl pointed out, the Spring framework patching is NOT done by FasterXML but by the Spring Expression Context, so the Jackson annotations do NOT take any effect.
I was patching the User entity directly, which is VERY BAD practice.
So I ended up with the following implementation:
The Patch library
<dependency>
<groupId>com.flipkart.zjsonpatch</groupId>
<artifactId>zjsonpatch</artifactId>
<version>${zjsonpatch.version}</version>
</dependency>
The Patch method
public static <T extends EmbeddedResource> T applyPatchOnObject(Class<T> type, T object, JsonNode jsonNode) {
Assert.notNull(type, "Given type must not be null!");
Assert.notNull(object, "Given object must not be null!");
Assert.notNull(jsonNode, "Given jsonNode must not be null!");
try {
ObjectMapper mapper = new ObjectMapper().registerModule(new JavaTimeModule());
return mapper.convertValue(JsonPatch.fromJson(jsonNode).apply(mapper.convertValue(object, JsonNode.class)),
type);
} catch (Exception e) {
throw new UnprocessableEntityException(e.getMessage());
}
}
Note: the applyPatchOnObject method ONLY accepts classes which extend EmbeddedResource, which in turn extends ResourceSupport. So basically DTOs only.
The entity is the same
Introduce UserDTO with all the proper Jackson annotations:
@NotNull(message = "locale cannot be null")
@JsonProperty("locale")
private Locale locale;
@NotNull(message = "dateOfBirth cannot be null")
@JsonProperty("dateOfBirth")
private Instant dateOfBirth;
@JsonIgnore
public Locale getLocale() {
return this.locale;
}
@JsonIgnore
public void setLocale(Locale locale) {
this.locale = locale;
}
@JsonIgnore
public Instant getDateOfBirth() {
return this.dateOfBirth;
}
@JsonIgnore
public void setDateOfBirth(Instant dateOfBirth) {
this.dateOfBirth = dateOfBirth;
}
After I have my DTO patched with values, I will use ObjectMapper or some custom way to apply the changes from the DTO to the entity.
All recommendations and advice are welcome.
Basically, the problem is that the data binding is not done by FasterXML but by the Spring Expression Context, so adding jackson-datatype-jsr310 will not help at all. FasterXML is only used if the patch value is an object or an array. In your case the patch value is a string, so JsonPatchPatchConverter will try to convert the value using purely Spring tools (the Spring Expression Context). What you are missing is a String-to-Instant converter for the Spring framework. I'm quite sure that there are some implementations available, maybe even within the Spring libraries, but I will create a simple one here and show how you can register it.
First, let's create a converter (not the best implementation, just a proof of concept).
public static enum StringToInstantConverter implements Converter<String, Instant> {
INSTANCE;
@Override
public Instant convert(String source) {
try {
return Instant.parse(source);
} catch(DateTimeParseException ex) {
}
return null;
}
}
And register it before calling the applyPatchOnObject method. Something like this will work:
// REGISTER THE CONVERTER
ConversionService conversionService = DefaultConversionService.getSharedInstance();
ConverterRegistry converters = (ConverterRegistry) conversionService;
converters.addConverter(StringToInstantConverter.INSTANCE);
ObjectMapper mapper = new ObjectMapper();
ArrayNode patchArray = mapper.createArrayNode();
ObjectNode patch = mapper.createObjectNode();
patch.put("op", "replace");
patch.put("path", "dateOfBirth");
patch.put("value", "1971-01-01T01:01:01.001Z");
// [{"op": "replace", "path": "dateOfBirth", "value": "1971-01-01T01:01:01.001Z"}]
patchArray.add(patch);
// apply the patch
User patchedUser = applyPatchOnObject(User.class, new User(), patchArray);
Just to complement the answer above. If you have the Instant class available, why use SimpleDateFormat then? Just parse the input directly:
public Instant convert(String source) {
try {
return Instant.parse(source);
} catch(DateTimeParseException ex) {
}
return null;
}
A String like "1971-01-01T01:01:01.001Z" is already in the format parsed by the method above, so this should work.
If you need to parse inputs in different formats, just use a DateTimeFormatter: https://docs.oracle.com/javase/8/docs/api/java/time/format/DateTimeFormatter.html
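For example, a non-ISO input could be handled like this (the pattern is just an illustration; treating the value as UTC is an assumption):
DateTimeFormatter formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
// parse the text as a local date-time, then pin it to UTC to get an Instant
Instant instant = LocalDateTime.parse("1971-01-01 01:01:01", formatter).toInstant(ZoneOffset.UTC);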
I am attempting to persist a java.time.LocalDateTime object in my Cassandra database and keep it timezone agnostic. I am using Spring Data Cassandra to do this.
The problem is that somewhere along the line, something is treating these LocalDateTime objects as if they are in the timezone of my server, and offsetting them to UTC time when it stores them in the database.
Is this a bug or a feature? Can I work around it in some way?
Configuration:
@Configuration
@EnableCassandraRepositories(
basePackages = "my.base.package")
public class CassandraConfig extends AbstractCassandraConfiguration{
@Override
protected String getKeyspaceName() {
return "keyspacename";
}
@Bean
public CassandraClusterFactoryBean cluster() {
CassandraClusterFactoryBean cluster =
new CassandraClusterFactoryBean();
cluster.setContactPoints("127.0.0.1");
cluster.setPort(9142);
return cluster;
}
@Bean
public CassandraMappingContext cassandraMapping()
throws ClassNotFoundException {
return new BasicCassandraMappingContext();
}
}
Booking record I wish to persist:
@Table("booking")
public class BookingRecord {
@PrimaryKeyColumn(
ordinal = 0,
type = PrimaryKeyType.PARTITIONED
)
private UUID bookingId = null;
@PrimaryKeyColumn(
ordinal = 1,
type = PrimaryKeyType.CLUSTERED,
ordering = Ordering.ASCENDING
)
private LocalDateTime startTime = null;
...
}
Simple Repository Interface:
@Repository
public interface BookingRepository extends CassandraRepository<BookingRecord> { }
Save Call:
...
@Autowired
BookingRepository bookingRepository;
...
public void saveBookingRecord(BookingRecord bookingRecord) {
bookingRepository.save(bookingRecord);
}
Here is the string used to populate the starttime in BookingRecord:
"startTime": "2017-06-10T10:00:00Z"
And here is the output from cqlsh after the timestamp has been persisted:
cqlsh:keyspacename> select * from booking ;
bookingid | starttime
--------------------------------------+--------------------------------
8b640c30-4c94-11e7-898b-6dab708ec5b4 | 2017-06-10 15:00:00.000000+0000
Cassandra stores a Date (timestamp) as milliseconds since epoch without a specific timezone information. Timezone data is handled in the layers above Cassandra.
LocalDate/LocalDateTime represent a point in time relative to your local time. Before the date/time can be saved, it needs to be enhanced with a timezone to calculate the generic representation, which can be saved.
Spring Data uses your system-default timezone (Date.from(source.atZone(systemDefault()).toInstant())).
If you need timezone precision and want to omit any implicit timezone conversions, use java.util.Date directly, which corresponds to Cassandra's (well, the Datastax driver's, to be precise) storage format representation.
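A minimal sketch of that alternative, reusing the startTime column from the BookingRecord in the question (the rest of the mapping stays as posted):
@PrimaryKeyColumn(ordinal = 1, type = PrimaryKeyType.CLUSTERED, ordering = Ordering.ASCENDING)
private Date startTime = null; // java.util.Date is stored as-is, no implicit local-zone conversion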
I do actually want to use LocalDateTime and LocalDate in my project, rather than java.util.Date, since they are newer and have more attractive functionality.
After much searching I have found a workaround.
First, you must create custom implementations of Spring's Converter interface as follows:
One for Date to LocalDateTime:
public class DateToLocalDateTime implements Converter<Date, LocalDateTime> {
@Override
public LocalDateTime convert(Date source) {
return source == null ? null : LocalDateTime.ofInstant(source.toInstant(), ZoneOffset.UTC);
}
}
And one for LocalDateTime to Date:
public class LocalDateTimeToDate implements Converter<LocalDateTime, Date> {
@Override
public Date convert(LocalDateTime source) {
return source == null ? null : Date.from(source.toInstant(ZoneOffset.UTC));
}
}
Finally, you must override the customConversions method in CassandraConfig as follows:
@Configuration
@EnableCassandraRepositories(basePackages = "my.base.package")
public class CassandraConfig extends AbstractCassandraConfiguration{
@Override
protected String getKeyspaceName() {
return "keyspacename";
}
@Override
public CustomConversions customConversions() {
List<Converter> converters = new ArrayList<>();
converters.add(new DateToLocalDateTime());
converters.add(new LocalDateTimeToDate());
return new CustomConversions(converters);
}
@Bean
public CassandraClusterFactoryBean cluster() {
CassandraClusterFactoryBean cluster =
new CassandraClusterFactoryBean();
cluster.setContactPoints("127.0.0.1");
cluster.setPort(9142);
return cluster;
}
@Bean
public CassandraMappingContext cassandraMapping()
throws ClassNotFoundException {
return new BasicCassandraMappingContext();
}
}
Thanks to mp911de for putting me in the ballpark of where to look for the solution!
I have an item I'd like to store in Dynamo:
public class Statement {
@DynamoDBTypeConverted(converter = ListLineItemConverter.class)
private List<LineItem> items;
}
and the definition of LineItem is the following:
public class LineItem {
private ZonedDateTime dateStart;
private ZonedDateTime dateEnd;
private long balance;
@DynamoDBTypeConverted(converter = ZonedDateTimeConverter.class)
public ZonedDateTime getDateStart() {...}
@DynamoDBTypeConverted(converter = ZonedDateTimeConverter.class)
public ZonedDateTime getDateEnd() {...}
}
I've been using a known working converter for ZonedDateTime which is the following:
public class ZonedDateTimeConverter implements DynamoDBTypeConverter<String, ZonedDateTime> {
public ZonedDateTimeConverter(){}
@Override
public String convert(final ZonedDateTime time) {
return time.toString();
}
@Override
public ZonedDateTime unconvert(final String stringValue) {
return ZonedDateTime.parse(stringValue);
}
}
And the converter works perfectly when it's annotated on a base class. But I have a custom type nested in a list of items and I can't seem to figure out how to get DynamoDB to correctly convert / unconvert a nested ZonedDateTime.
I even wrote a custom converter for List of LineItem without luck:
public class ListLineItemConverter implements DynamoDBTypeConverter<String, List<LineItem>> {
private ObjectMapper objectMapper;
public ListLineItemConverter() {
objectMapper = new ObjectMapper();
objectMapper.configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false);
// THIS LINE OF CODE FIXED THE ISSUE FOR ME
objectMapper.findAndRegisterModules();
// THIS LINE OF CODE FIXED THE ISSUE FOR ME
}
@Override
public String convert(List<LineItem> object) {
try {
return objectMapper.writeValueAsString(object);
} catch (JsonProcessingException e) {
throw new RuntimeException("bad json marshalling");
}
}
@Override
public List<LineItem> unconvert(String object) {
try {
return objectMapper.readValue(object, new TypeReference<List<LineItem>>() {});
} catch (IOException e) {
throw new RuntimeException("bad json unmarshalling");
}
}
}
There's no combination of annotations that I can seem to use to get this to work. I always get:
com.fasterxml.jackson.databind.JsonMappingException: Can not construct instance of java.time.ZonedDateTime: no suitable constructor found, can not deserialize from Object value (missing default constructor or creator, or perhaps need to add/enable type information?)
EDIT: If I comment out the instances of ZonedDateTime from LineItem then the code works totally fine. So DynamoDB is having trouble reading the @DynamoDBTypeConverted annotation when it's buried 3 levels deep:
Statement.items.get[0].dateStart // annotations aren't working at this level
Statement.items.get[0].dateEnd // annotations aren't working at this level
It looks like there was an issue with the Jackson parser not automatically picking up nested instances of ZonedDateTime.
I'd switched to the newer version of Jackson and even included the JSR310 module to ensure compatibility with marshalling/unmarshalling the newer Java 8 time constructs, but alas, that alone wasn't enough.
Weirdly, one line of code fixed it for me:
objectMapper.findAndRegisterModules();
taken from: https://stackoverflow.com/a/37499348/584947
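If the classpath scan ever misses it, registering the JSR-310 module explicitly should be an equivalent alternative (assuming jackson-datatype-jsr310 is on the classpath):
// registers the Java 8 date/time module that findAndRegisterModules() would otherwise pick up via the ServiceLoader
objectMapper.registerModule(new JavaTimeModule());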