I am trying to format the datetime by using @Convert(converter = MyConverter.class).
This works as expected while saving, and the data is properly saved in the DB.
The issue I am facing is that the object returned by responseEntity = myrepository.save(myEntity) does not have the formatted date. The field in the responseEntity still has the old format. Am I missing anything?
My converter class:
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;
import java.util.Objects;
import javax.persistence.AttributeConverter;

public class DateTimeConverter implements AttributeConverter<LocalDateTime, String> {

    @Override
    public String convertToDatabaseColumn(LocalDateTime attribute) {
        if (Objects.isNull(attribute)) {
            return null;
        }
        attribute = attribute.atZone(ZoneId.systemDefault())
                .withZoneSameInstant(ZoneOffset.UTC)
                .toLocalDateTime();
        DateTimeFormatter formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
        return attribute.format(formatter);
    }

    @Override
    public LocalDateTime convertToEntityAttribute(String dbData) {
        if (Objects.isNull(dbData)) {
            return null;
        }
        DateTimeFormatter formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
        return LocalDateTime.parse(dbData, formatter);
    }
}
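For context, the converter is wired to the entity in the usual way via @Convert; a minimal sketch (the entity and field names here are made up):

import java.time.LocalDateTime;
import javax.persistence.Column;
import javax.persistence.Convert;
import javax.persistence.Entity;

@Entity
public class MyEntity {

    // hypothetical field: @Convert wires DateTimeConverter to this attribute
    @Convert(converter = DateTimeConverter.class)
    @Column(name = "created_at")
    private LocalDateTime createdAt;

    // getters and setters omitted
}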
Ran into a similar problem and discovered that the attribute conversion wasn't performed by save() but by saveAndFlush(): breakpoints inside the converter weren't hit during save() but were during saveAndFlush(). When save() was used, the conversion was only hit on a subsequent find... query. The same applied to exceptions: if save() should have thrown one, it was delayed until the flush or the subsequent query.
repository.save() does not return a converted value; the saved entity is only attached to the persistence context, and the conversion to the database column happens at flush time.
What you want is the value produced by convertToEntityAttribute, but convertToEntityAttribute is only called when the entity is fetched from the database.
Do this operation in the service layer instead:
entity.setProperty(entity.getProperty().atZone(ZoneId.systemDefault()).withZoneSameInstant(ZoneOffset.UTC).toLocalDateTime());
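A minimal sketch of what that could look like; the names (MyEntity, getProperty, myRepository) are placeholders, not part of the original code:

// Hypothetical service method: normalizes the field to UTC so the entity
// returned by save() already carries the converted value.
public MyEntity saveWithUtcDate(MyEntity entity) {
    LocalDateTime local = entity.getProperty();
    if (local != null) {
        entity.setProperty(local.atZone(ZoneId.systemDefault())
                .withZoneSameInstant(ZoneOffset.UTC)
                .toLocalDateTime());
    }
    return myRepository.save(entity);
}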
Related
I am connecting to a mysql server using the following DSN: jdbc:mysql://localhost/my_database?useUnicode=true&characterEncoding=utf-8&serverTimezone=UTC.
The problem I'm getting is that the java.sql.Date instance is getting timezone converted to UTC from my local timezone. My application treats dates as timezone agnostic and this is causing a few problems.
For instance, I'm in IST (UTC+05:30), when I set some date field to say '2020-01-22' in code, it gets sent to the server as '2020-01-21'. I have verified this from the mysql general log.
I have tried a few combinations of useLegacyDatetimeCode, useTimezone and noTimezoneConversionForDateType but I've been so far unable to get the mysql driver to skip conversion of the date field.
How do I get the mysql driver to skip the conversion for the Date and Time fields?
I have tried both version 6 and 8 of the Connector/J driver mysql:mysql-connector-java:<version>.
Also, I'm using JOOQ, with a simple converter to convert between LocalDate and java.sql.Date.
If you really want to have a "time-zone agnostic" date, you would have to use LocalDate within Java. LocalDate maps quite nicely to MySQL's DATE type.
https://thoughts-on-java.org/hibernate-5-date-and-time/
No time zones involved at all.
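For illustration, with a JDBC 4.2 driver (Connector/J 6 and 8 both are) you can read and write a DATE column as LocalDate directly, with no java.sql.Date and no time-zone math involved; a sketch with made-up table and column names:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.time.LocalDate;

public class LocalDateDemo {

    // given an open connection to the MySQL database
    static void roundTrip(Connection connection) throws SQLException {
        // write a DATE value verbatim, no time-zone conversion
        try (PreparedStatement ps = connection.prepareStatement(
                "INSERT INTO my_table (my_date) VALUES (?)")) {
            ps.setObject(1, LocalDate.of(2020, 1, 22)); // stored as 2020-01-22 as-is
            ps.executeUpdate();
        }

        // read it back as a LocalDate
        try (Statement st = connection.createStatement();
             ResultSet rs = st.executeQuery("SELECT my_date FROM my_table")) {
            while (rs.next()) {
                LocalDate d = rs.getObject("my_date", LocalDate.class);
                System.out.println(d); // prints 2020-01-22 regardless of JVM zone
            }
        }
    }
}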
I just had the same problem myself and for now I solved it with this converter:
import java.sql.Date;
import java.time.LocalDate;
import java.time.ZoneOffset;
import org.jooq.Converter;

public final class DateConverter implements Converter<Date, LocalDate> {

    @Override
    public final LocalDate from(final Date value) {
        if (null == value) {
            return null;
        } else {
            return value.toLocalDate();
        }
    }

    @Override
    public final Date to(final LocalDate value) {
        if (null == value) {
            return null;
        } else {
            return new Date(value.atStartOfDay(ZoneOffset.UTC).toInstant().toEpochMilli());
        }
    }

    @Override
    public final Class<Date> fromType() {
        return Date.class;
    }

    @Override
    public final Class<LocalDate> toType() {
        return LocalDate.class;
    }
}
I must admit I haven't yet thought through how this converter behaves if it is used across different time zones...
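For what it's worth, that concern can probably be avoided: java.sql.Date.valueOf(LocalDate) and Date.toLocalDate() both interpret the calendar date in the JVM's default zone, so they invert each other cleanly, whereas atStartOfDay(ZoneOffset.UTC) can shift the date by a day in zones west of UTC. A sketch of to() under that assumption:

@Override
public final Date to(final LocalDate value) {
    // Date.valueOf works purely on year/month/day, so the round trip
    // through toLocalDate() returns the original date in any zone
    return null == value ? null : Date.valueOf(value);
}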
I am working on an application which sends an object to a server for processing. The object is sent in JSON format using Spring.
My issue is that all the fields are passed correctly - EXCEPT for the Date variables. They show up as a completely different value, and I am stumped as to why.
Here is an abbreviated version of the object that is being passed:
import java.util.Date;
import java.util.List;

public class TransactionParameters {

    public Date startDate;
    public Date endDate;
    public List<String> transactionCodes;

    public Date getStartDate() {
        return startDate;
    }

    public void setStartDate(Date startDate) {
        this.startDate = startDate;
    }

    public Date getEndDate() {
        return endDate;
    }

    public void setEndDate(Date endDate) {
        this.endDate = endDate;
    }

    public List<String> getTransactionCodes() {
        return transactionCodes;
    }

    public void setTransactionCodes(List<String> transactionCodes) {
        this.transactionCodes = transactionCodes;
    }
}
Here is the JSON created:
{"transactionCodes":["195"],"startDate":1524456000000,"endDate":1524456000000}
Here is the client code:
String responseString =
    restTemplate.postForObject("http://localhost:9080/app/transaction/"
        + "testUser123", transactionParameters, String.class);
Here is the server code:
@ApiOperation(value = "Get Transactions for Customer")
@POST
@Produces({ MediaType.APPLICATION_JSON })
@Consumes(MediaType.APPLICATION_JSON)
@Path("/customerAccountTransactions/{customerCode: [a-zA-Z0-9]+}")
@RequestMapping(value = "/transaction/{customerCode: [a-zA-Z0-9]+}", method = RequestMethod.POST, produces = MediaType.APPLICATION_JSON, consumes = MediaType.APPLICATION_JSON)
@ApiImplicitParams(@ApiImplicitParam(name = AUTHORIZATION, value = AUTHORIZATION, required = true, dataType = STRING, paramType = HEADER))
public Response getAccountTransactionsForCustomer(@PathVariable(CUSTOMER_CODE) @PathParam(CUSTOMER_CODE) final String customerCode, TransactionParameters transactionParameters) throws IntegrationException {
    LOGGER.info("getAccountTransactionsForCustomer()");
    Response response = null;
    try {
        final AccountTransactionsBean atb = getTransactions(customerCode, transactionParameters);
        response = ResponseBuilder.buildSuccessResponse(atb);
    } catch (final NotAuthorizedException nae) {
        response = ResponseBuilder.buildNotAuthorizedResponse();
    }
    return response;
}
But here's my issue: when I put a breakpoint where the client calls the endpoint, the date is correct.
However, the date is wildly incorrect by the time it enters the server's endpoint.
All the other variables in the TransactionParameters bean are correct. I have also replicated this call using SOAP UI, to rule out any issues with the client, and the issue still persists.
Can anyone offer any suggestions?
Thanks in advance for any help.
The reason for this issue is that, by default, Jackson serializes java.util.Date as the number of milliseconds since the epoch, which is why the JSON contains a long rather than a readable date.
To solve this, you need to tell Jackson that those particular fields are dates and how they should be formatted. You can do that by using annotations in your POJO:
Example:
@JsonFormat(shape = JsonFormat.Shape.STRING, pattern = "yyyy-MM-dd HH:mm:ss.SSSZ")
private Date changeTimestamp;
You can use the above syntax and then change the pattern as per your need.
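For example, with that annotation and Jackson's default UTC time zone, the payload above would come out roughly like this (illustrative only, assuming the epoch value 1524456000000 shown earlier):

{"transactionCodes":["195"],"startDate":"2018-04-23 04:00:00.000+0000","endDate":"2018-04-23 04:00:00.000+0000"}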
Disclaimer: admittedly I don't know much about Spring REST, so I can only give you general pointers, but this really does seem like a de-serialization issue.
Some general things to consider:
Make sure the server and client have the same settings for serializing/de-serializing.
Make sure they are running the same versions of Spring REST and Jackson.
Set the JVM arg -Djavax.net.debug=all and run again to look at what is really being sent/received.
Since this is Spring REST, it uses Jackson under the hood, right?
Try explicitly annotating your dates and see if that helps:
public class TransactionParameters {

    @JsonFormat(pattern = "yyyy-MM-dd HH:mm:ss")
    public Date startDate;

    @JsonFormat(pattern = "yyyy-MM-dd HH:mm:ss")
    public Date endDate;

    // ...
}
You probably have to either add or remove the milliseconds (the trailing 000) in the pattern to get the conversion to work correctly.
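For instance, a pattern that keeps the milliseconds might look like this (adjust it to match your actual payload):

@JsonFormat(pattern = "yyyy-MM-dd HH:mm:ss.SSS")
public Date startDate;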
I am trying to add a timestamp field in an Android client with Firebase Firestore.
According to the documentation:
Annotation used to mark a Date field to be populated with a server
timestamp. If a POJO being written contains null for a
#ServerTimestamp-annotated field, it will be replaced with a
server-generated timestamp.
But when I try it:
@ServerTimestamp
Date serverTime = null; // I tried both java.util.Date and java.sql.Date

// ...
Map<String, Object> msg = new HashMap<>();
// ... more data
msg.put("timestamp", serverTime);
On the Cloud Firestore database this field is always null.
That is not the correct way to add the time and date to a Cloud Firestore database. The best practice is to have a model class in which you add a date field of type Date together with an annotation. This is how your model class should look:
import java.util.Date;
import com.google.firebase.firestore.ServerTimestamp;

public class YourModelClass {

    @ServerTimestamp
    private Date date;

    YourModelClass() {}

    public Date getDate() {
        return date;
    }

    public void setDate(Date date) {
        this.date = date;
    }
}
When you create an object of the YourModelClass class, there is no need to set the date. Because the date field is annotated with @ServerTimestamp, Firebase servers will read it and populate it with the server timestamp accordingly.
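A minimal sketch of such a write, assuming a made-up collection name; the date field is deliberately left null so the server fills it in:

// "yourCollection" is a hypothetical collection name
YourModelClass model = new YourModelClass();
FirebaseFirestore.getInstance()
        .collection("yourCollection")
        .add(model);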
Another approach would be to use the FieldValue.serverTimestamp() method, like this:
Map<String, Object> map = new HashMap<>();
map.put("date", FieldValue.serverTimestamp());
docRef.update(map).addOnCompleteListener(new OnCompleteListener<Void>() {/* ... */});
Use FieldValue.serverTimestamp() to get the server timestamp:
Map<String, Object> msg = new HashMap<>();
msg.put("timestamp", FieldValue.serverTimestamp());
I had a similar problem; I found this in Logcat, and it solved my problem:
firebaseFirestore = FirebaseFirestore.getInstance();
FirebaseFirestoreSettings settings = new FirebaseFirestoreSettings.Builder()
        .setTimestampsInSnapshotsEnabled(true)
        .build();
firebaseFirestore.setFirestoreSettings(settings);
I had a similar problem: I was getting the exception ...has type java.sql.Timestamp, got java.util.Date..., so I just replaced the type from Timestamp to Date (java.util.Date) and it worked fine.
I've implemented a class for a Java Web App I'm working on. The class has a LocalDateTime property 'created'. However, when I try to set that property (once), its setter is somehow called twice in succession - first setting the value I want, then setting it to null on a second call that should not even happen.
I've traced through the following method and everything looks well up to the third line.
public static ICEDocument mapDocumentFromSOLR(SolrDocument document) {
    ICEDocument result = new ICEDocument();
    Date uploaded = (Date) document.getFieldValue("CREATED");
    LocalDateTime uploadDate = LocalDateUtils.convertUtcDateToLocalDateTime(uploaded); // custom class
    result.setCreated(uploadDate); // **faulty line**
    return result;
}
Here's the class, shortened for clarity:
import java.time.LocalDateTime;
import org.springframework.data.annotation.Transient;
[...]

@JsonIgnoreProperties(ignoreUnknown = true)
public class ICEDocument implements java.io.Serializable {

    [...]

    @Transient
    private LocalDateTime created;

    [...]

    @JsonDeserialize(using = LocalDateTimeJsonDeserializer.class)
    public void setCreated(LocalDateTime created) {
        System.out.println("Setting creation date " + created); // added for debugging
        this.created = created;
    }
}
Steps I've taken trying to resolve this
Removing the @Transient annotation. The data is filled in via Hibernate (version 5.1), and I originally annotated the property because the field itself is not in the corresponding database table. I thought that might be the problem (see Object Serialization and Java Transient Variables), but removing it didn't change anything.
Changing the third line. I switched it with what was basically inside the static LocalDateUtils method. This didn't resolve the issue.
LocalDateTime uploadDate = uploaded.toInstant().atZone(ZoneId.of("UTC")).toLocalDateTime();
Removing the JSON deserializer. I don't think the JsonDeserializer is at fault, since it isn't supposed to (and, according to the debugger, doesn't) do anything at this point, but I'll add it here for completeness' sake. Could be I'm just grasping at straws at this point.
import java.io.IOException;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;

public class LocalDateTimeJsonDeserializer extends JsonDeserializer<LocalDateTime> {

    private static final String DATE_TIME = "yyyy-MM-dd' 'HH:mm:ss";

    @Override
    public LocalDateTime deserialize(JsonParser parser, DeserializationContext context)
            throws IOException, JsonProcessingException {
        DateTimeFormatter formatter = DateTimeFormatter.ofPattern(DATE_TIME);
        return LocalDateTime.parse(parser.getText(), formatter);
    }
}
Thank you for reading to the end of my rather long post.
After debugging the code, I found a line further down that set the property to null. So it was in fact a second call to the setter, and a lot of bad luck, I suppose.
But it might help to know that there wasn't anything wrong with the other factors, so I'll just leave this here. Thanks again.
I've got a dynamodb table with a timestamp ("creationDate") as a range. The model is using a joda DateTime to make it easy to use (compatibility with the rest of the code). To be able to make between queries on this range, I used a numeric type for the attribute in the table, and planned to store it as a java timestamp (milliseconds since epoch). Then, I added a marshaller to convert a joda DateTime to a String representing a long and vice-versa.
The table structure (creation):
void CreateTable()
{
    CreateTableRequest createTableRequest = new CreateTableRequest().withTableName(LinkManager.TABLE_NAME);
    ProvisionedThroughput pt = new ProvisionedThroughput()
            .withReadCapacityUnits(LinkManager.READ_CAPACITY_UNITS)
            .withWriteCapacityUnits(LinkManager.WRITE_CAPACITY_UNITS);
    createTableRequest.setProvisionedThroughput(pt);

    ArrayList<AttributeDefinition> ad = new ArrayList<AttributeDefinition>();
    ad.add(new AttributeDefinition().withAttributeName("creationDate").withAttributeType(ScalarAttributeType.N));
    ad.add(new AttributeDefinition().withAttributeName("contentHash").withAttributeType(ScalarAttributeType.S));
    createTableRequest.setAttributeDefinitions(ad);

    ArrayList<KeySchemaElement> ks = new ArrayList<KeySchemaElement>();
    ks.add(new KeySchemaElement().withAttributeName("contentHash").withKeyType(KeyType.HASH));
    ks.add(new KeySchemaElement().withAttributeName("creationDate").withKeyType(KeyType.RANGE));
    createTableRequest.setKeySchema(ks);

    this.kernel.DDB.createTable(createTableRequest);
}
The model:
@DynamoDBTable(tableName = "Link")
public class Link {

    private String ContentHash;
    private DateTime CreationDate;

    @DynamoDBHashKey(attributeName = "contentHash")
    public String getContentHash() {
        return ContentHash;
    }

    public void setContentHash(String contentHash) {
        ContentHash = contentHash;
    }

    @DynamoDBRangeKey(attributeName = "creationDate")
    @DynamoDBMarshalling(marshallerClass = DateTimeMarshaller.class)
    public DateTime getCreationDate() {
        return CreationDate;
    }

    public void setCreationDate(DateTime creationDate) {
        CreationDate = creationDate;
    }
}
The marshaller:
public class DateTimeMarshaller extends JsonMarshaller<DateTime>
{
    public String marshall(DateTime dt)
    {
        return String.valueOf(dt.getMillis());
    }

    public DateTime unmarshall(String dt)
    {
        long ldt = Long.parseLong(dt);
        return new DateTime(ldt);
    }
}
I get the following error:
Exception in thread "main" com.amazonaws.AmazonServiceException: Type of specified attribute inconsistent with type in table (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ValidationException; Request ID: 8aabb703-cb44-4e93-ab47-c527a5aa7d52)
I guess this is because the marshaller returns a String and DynamoDB wants a numeric type, as the attribute type is N. I don't know what people do in this case; I searched for a solution but couldn't find one. I only tested this on a local DynamoDB instance, which I don't think makes any difference (this is a validation check failing; there's no request even made).
The obvious workaround is to use a long type for the dates in the model and add special getters and setters to work with DateTime. Still, is there a cleaner way? I am sure I'm not the only one using a DateTime range key in a model.
What I would do is re-create the table with the range key as a String itself.
Even if it's going to be populated with long numbers, making it type S will ensure compatibility with the marshaller; a sketch follows.
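A sketch of that change against the table-creation code above. One caveat worth flagging: with a string range key, the marshalled values should be fixed-width (e.g. zero-padded) so that lexicographic order matches numeric order and between queries still behave correctly:

// creationDate as a String attribute instead of N
ad.add(new AttributeDefinition()
        .withAttributeName("creationDate")
        .withAttributeType(ScalarAttributeType.S));

// and in the marshaller, zero-pad to a fixed width so that
// string ordering matches numeric ordering
public String marshall(DateTime dt)
{
    return String.format("%019d", dt.getMillis());
}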