Joda DateTime in OrientDB Object API - java

I am using JDK 8, JodaTime 2.9.9 and OrientDB version 2.2.26.
Is there a way to use DateTime objects with OrientDB Object API?
Example class:
class Entity {
    private DateTime date;

    public Entity(DateTime date) {
        this.date = date;
    }

    public DateTime getDate() {
        return date;
    }

    public void setDate(DateTime newDate) {
        this.date = newDate;
    }
}
Registering in OrientDB:
database.getEntityManager().registerEntityClass(Entity.class);
If I try to save it:
database.save(new Entity(DateTime.now()));
Then I get:
com.orientechnologies.orient.core.exception.OSerializationException:
Linked type [class org.joda.time.DateTime:2017-09-12T11:50:25.709-03:00]
cannot be serialized because is not part of registered entities.
If I try to register DateTime:
database.getEntityManager().registerEntityClass(DateTime.class);
And I try to save the entity again:
database.save(new Entity(DateTime.now()));
Since it is a final class, Javassist cannot proxy it, so I get:
java.lang.RuntimeException: org.joda.time.DateTime is final
I don't want to change my class to store a long instead of a DateTime. Is there a way to implement and register some sort of serializer and deserializer for DateTime, or something similar that would not interfere with my entity?

OK, I found out how to do it (code in Groovy):
def orientDbServer = OServerMain.create()
System.setProperty("ORIENTDB_HOME", new File("").getAbsolutePath());
orientDbServer.startup(new OServerConfiguration().with { cfg ->
    location = "memory"
    network = new OServerNetworkConfiguration(this)
    users = [new OServerUserConfiguration(name: "root",
                                          password: "root",
                                          resources: "*")] as OServerUserConfiguration[]
    cfg
})
orientDbServer.activate()
addShutdownHook {
    orientDbServer.shutdown()
}

new OServerAdmin("localhost").connect("root", "root")
        .createDatabase("test", "document", "memory").close()

OObjectDatabaseTx database =
        new OObjectDatabaseTx("memory:localhost/test").open("admin", "admin")

OObjectSerializerContext serializerContext = new OObjectSerializerContext();
serializerContext.bind(new OObjectSerializer<DateTime, Long>() {
    @Override
    Object serializeFieldValue(Class<?> iClass, DateTime iFieldValue) {
        return iFieldValue.getMillis()
    }

    @Override
    Object unserializeFieldValue(Class<?> iClass, Long iFieldValue) {
        return new DateTime(iFieldValue)
    }
}, database)
OObjectSerializerHelper.bindSerializerContext(null, serializerContext)
It is important that the serializerContext is registered before any entity class registration.
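With the serializer context bound, the registration and save from the question should then work unchanged; a minimal Java sketch against the same database instance:

database.getEntityManager().registerEntityClass(Entity.class); // register entities only after binding the serializer
database.save(new Entity(DateTime.now()));                     // the DateTime field is persisted as a long (millis)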


What could be the reason for auto datetime conversion in a Java API response

I created a simple model class, User, using java.util.Date here:
class User {
    private int id;
    private String name;
    private Date createdAt;
}
On the user POST API call, I simply do setCreatedAt(new Date()).
The problem is that in the response I get createdAt 5:30 hours behind the actual time, even though no time-conversion method is called.
For example, I hit the POST API and the user is created at 28-10-2021 11:30:00, which I can see in the logs. But when the response is returned to Postman, it shows 28-10-2021 06:00:00. There is no time conversion in the code; even when I check the returned object at the return statement in debug mode, it shows 28-10-2021 11:30:00.
I want to know where this conversion is happening and how to stop it.
If it's a problem with the datetime library, which one should I use?
Extra information:
* My system timezone is UTC.
* I am using Ubuntu.
* I am creating RESTful APIs (JAX-RS).
EDIT 1:
The client and server are on the same machine (UTC timezone). For the client, I am using Postman.
URL: [POST] /user
Request Body:
{
"name": "XYZ"
}
Actual Response:
{
"id": 1,
"name": "XYZ",
"createdAt: "28-10-2021 06:00:00"
}
Expected Response:
{
"id": 1,
"name": "XYZ",
"createdAt: "28-10-2021 11:30:00"
}
On the user POST API call, I simply do setCreatedAt(new Date()).
It appears that you have not set the timezone while creating the instance of java.util.Date.
By default it will be treated as UTC irrespective of your system timezone. You can use SimpleDateFormat (https://docs.oracle.com/javase/7/docs/api/java/text/SimpleDateFormat.html) instead:
final String TIMEZONE = ""; // need to set the timezone here
SimpleDateFormat formatter = new SimpleDateFormat("dd-M-yyyy hh:mm:ss", Locale.ENGLISH);
formatter.setTimeZone(TimeZone.getTimeZone(TIMEZONE));
String dateInString = "28-10-2021 11:30:00";
Date date = formatter.parse(dateInString);
There might also be @JsonFormat annotations with timeZone issues; please check the linked jackson-databind issue for more details on the problem. Overriding the timezone in ObjectMapper didn't work either. You can refer to the solved example below, which implements a custom date deserializer:
@Component
public class CustomDateDeserializer extends StdDeserializer<Date> {

    private static final long serialVersionUID = 1L;

    // adjust the pattern (and the time zone, via setTimeZone) as needed
    private SimpleDateFormat formatter = new SimpleDateFormat("yyyy-MM-dd hh:mm:ss");

    public CustomDateDeserializer() {
        this(null);
    }

    public CustomDateDeserializer(Class<?> vc) {
        super(vc);
    }

    @Override
    public Date deserialize(JsonParser jsonparser, DeserializationContext context)
            throws IOException, JsonProcessingException {
        String date = jsonparser.getText();
        try {
            return formatter.parse(date);
        } catch (ParseException e) {
            throw new RuntimeException(e);
        }
    }
}
Also, add the deserializer to the setter method in your bean properties:
@JsonDeserialize(using = CustomDateDeserializer.class)
public void setReturnDateTime(Date returnDateTime) {
    this.returnDateTime = returnDateTime;
}
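For reference, the ObjectMapper-level override mentioned above (which did not resolve this particular issue) would look roughly like the sketch below; the zone and pattern are placeholders, not values taken from the question:

ObjectMapper mapper = new ObjectMapper();
// Hypothetical global configuration; adjust the zone and pattern to your needs.
mapper.setTimeZone(TimeZone.getTimeZone("UTC"));
mapper.setDateFormat(new SimpleDateFormat("dd-MM-yyyy HH:mm:ss"));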

@Convert is not getting called for repository.save

I am trying to format the datetime by using @Convert(converter = MyConverter.class).
This works as expected when saving, and the data is properly stored in the DB.
The issue I am facing is that the object returned by responseEntity = myrepository.save(myEntity) does not have the formatted date. The field in responseEntity still has the old format. Am I missing anything?
My converter class:
public class DateTimeConverter implements AttributeConverter<LocalDateTime, String> {

    @Override
    public String convertToDatabaseColumn(LocalDateTime attribute) {
        if (Objects.isNull(attribute)) {
            return null;
        }
        attribute = attribute.atZone(ZoneId.systemDefault()).withZoneSameInstant(ZoneOffset.UTC).toLocalDateTime();
        DateTimeFormatter formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
        return attribute.format(formatter);
    }

    @Override
    public LocalDateTime convertToEntityAttribute(String dbData) {
        if (Objects.isNull(dbData)) {
            return null;
        }
        DateTimeFormatter formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
        return LocalDateTime.parse(dbData, formatter);
    }
}
I ran into a similar problem and discovered that the attribute conversion wasn't done by save(); it was done by saveAndFlush(). Breakpoints inside the converter weren't hit during save() but were hit during saveAndFlush(). If save() was used, the conversion was only hit on a subsequent find... query. The same applied when save should have thrown an exception: it got delayed until the flush or a subsequent query.
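A minimal sketch of that difference, assuming a Spring Data JPA repository named myRepository for an entity that uses the converter (the names are illustrative, not from the question):

// save() only attaches the entity to the persistence context;
// the AttributeConverter has not run at this point.
MyEntity saved = myRepository.save(myEntity);

// saveAndFlush() forces a flush, so convertToDatabaseColumn runs here
// (and any exception that save() should have thrown surfaces now).
MyEntity flushed = myRepository.saveAndFlush(myEntity);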
repository.save() does not give you back a value converted by convertToEntityAttribute; the entity is simply attached to the persistence context, and the attribute is only converted (via convertToDatabaseColumn) at flush time.
But you want the value converted using convertToEntityAttribute, and convertToEntityAttribute is only called when you fetch from the database.
Do this operation in the service layer:
entity.setProperty(entity.getProperty().atZone(ZoneId.systemDefault()).withZoneSameInstant(ZoneOffset.UTC).toLocalDateTime());

Spring Boot: How to handle timestamps (i.e. save and get timestamps to and from a SQL database)

I'm developing a Spring Boot application and I'm having problems handling java.sql.Timestamp. When I store the timestamp in the database it stores the right value, but when I fetch it from the database it comes back with a 5:30-hour difference. I'm getting weird results: sometimes I get the same timestamp as in the database and sometimes I get one with a 5:30-hour difference. I even used the @JsonFormat(timezone = "GMT+05:30") annotation to get consistent results, but it still sometimes gives different results.
There is a better way to handle all timezone problems while reading from and writing to the database in Spring Boot applications.
Use Java 8's LocalDateTime instead of Timestamp. Create a converter class like this to map between Timestamp in the DB and LocalDateTime in your application:
@Converter(autoApply = true)
public class LocalDateTimeAttributeConverter implements AttributeConverter<LocalDateTime, Timestamp> {

    @Override
    public Timestamp convertToDatabaseColumn(LocalDateTime locDateTime) {
        return (locDateTime == null ? null : Timestamp.valueOf(locDateTime));
    }

    @Override
    public LocalDateTime convertToEntityAttribute(Timestamp sqlTimestamp) {
        return (sqlTimestamp == null ? null : sqlTimestamp.toLocalDateTime());
    }
}
Set your application's default timezone in your main class:
@SpringBootApplication
public class ExampleApplication {

    private static final String ZONE_ID_ISTANBUL = "Europe/Istanbul";

    public static void main(String[] args) {
        TimeZone.setDefault(TimeZone.getTimeZone(ZONE_ID_ISTANBUL));
        System.out.println("Application time zone: " + TimeZone.getDefault().getID());
        SpringApplication.run(ExampleApplication.class, args);
    }
}
By setting the default time zone, whenever you use LocalDateTime it will use this timezone by default; therefore, even though your database runs in a different timezone from your application, your code will still work in your own time zone.
Note that after creating the converter class, you have to use it in your entity as follows:
@Column(name = "insert_time", nullable = false)
@Convert(converter = LocalDateTimeAttributeConverter.class)
private LocalDateTime insertTime;

How to add a Timestamp in Firestore with Android?

I am trying to add a timestamp field in an Android client with Firebase Firestore.
According to the documentation:
Annotation used to mark a Date field to be populated with a server timestamp. If a POJO being written contains null for a @ServerTimestamp-annotated field, it will be replaced with a server-generated timestamp.
But when I try it:
@ServerTimestamp
Date serverTime = null; // I tried both java.util.Date and java.sql.Date
//...
Map<String, Object> msg = new HashMap<>();
// ... more data
msg.put("timestamp", serverTime);
On the Cloud Firestore database this field is always null.
That is not the correct way to add the time and date to a Cloud Firestore database. The best practice is to have a model class in which you add a date field of type Date together with an annotation. This is how your model class should look:
import com.google.firebase.firestore.ServerTimestamp;

import java.util.Date;

public class YourModelClass {

    @ServerTimestamp
    private Date date;

    YourModelClass() {}

    public Date getDate() {
        return date;
    }

    public void setDate(Date date) {
        this.date = date;
    }
}
When you create an object of the YourModelClass class, there is no need to set the date. Firebase servers will read your date field, as it is annotated with @ServerTimestamp, and populate it with the server timestamp accordingly.
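A minimal write under that approach might look like the following; the db reference and the "users" collection name are assumptions, not from the question:

FirebaseFirestore db = FirebaseFirestore.getInstance();
YourModelClass model = new YourModelClass(); // the date field stays null on the client
db.collection("users").add(model);           // the server populates the @ServerTimestamp field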
Another approach would be to use the FieldValue.serverTimestamp() method like this:
Map<String, Object> map = new HashMap<>();
map.put("date", FieldValue.serverTimestamp());
docRef.update(map).addOnCompleteListener(new OnCompleteListener<Void>() { /* ... */ });
Use FieldValue.serverTimestamp() to get the server timestamp:
Map<String, Object> msg = new HashMap<>();
msg.put("timestamp", FieldValue.serverTimestamp());
I had a similar problem; I found this in my logcat and it solved my problem:
firebaseFirestore = FirebaseFirestore.getInstance();
FirebaseFirestoreSettings settings = new FirebaseFirestoreSettings.Builder()
.setTimestampsInSnapshotsEnabled(true)
.build();
firebaseFirestore.setFirestoreSettings(settings);
I had a similar problem: I was getting the exception ...has type java.sql.Timestamp, got java.util.Date..., so I just replaced the type from Timestamp to Date (from java.util.Date) and it worked fine.

Using joda DateTime Range Key with AWS DynamoDB Object Persistence model

I've got a DynamoDB table with a timestamp ("creationDate") as a range key. The model uses a Joda DateTime to make it easy to use (compatibility with the rest of the code). To be able to run between queries on this range key, I used a numeric type for the attribute in the table and planned to store it as a Java timestamp (milliseconds since epoch). Then I added a marshaller to convert a Joda DateTime to a String representing a long and vice versa.
The table structure (creation):
void CreateTable()
{
    CreateTableRequest createTableRequest = new CreateTableRequest().withTableName(LinkManager.TABLE_NAME);
    ProvisionedThroughput pt = new ProvisionedThroughput()
            .withReadCapacityUnits(LinkManager.READ_CAPACITY_UNITS)
            .withWriteCapacityUnits(LinkManager.WRITE_CAPACITY_UNITS);
    createTableRequest.setProvisionedThroughput(pt);

    ArrayList<AttributeDefinition> ad = new ArrayList<AttributeDefinition>();
    ad.add(new AttributeDefinition().withAttributeName("creationDate").withAttributeType(ScalarAttributeType.N));
    ad.add(new AttributeDefinition().withAttributeName("contentHash").withAttributeType(ScalarAttributeType.S));
    createTableRequest.setAttributeDefinitions(ad);

    ArrayList<KeySchemaElement> ks = new ArrayList<KeySchemaElement>();
    ks.add(new KeySchemaElement().withAttributeName("contentHash").withKeyType(KeyType.HASH));
    ks.add(new KeySchemaElement().withAttributeName("creationDate").withKeyType(KeyType.RANGE));
    createTableRequest.setKeySchema(ks);

    this.kernel.DDB.createTable(createTableRequest);
}
The model:
@DynamoDBTable(tableName="Link")
public class Link {

    private String ContentHash;
    private DateTime CreationDate;

    @DynamoDBHashKey(attributeName = "contentHash")
    public String getContentHash() {
        return ContentHash;
    }

    public void setContentHash(String contentHash) {
        ContentHash = contentHash;
    }

    @DynamoDBRangeKey(attributeName = "creationDate")
    @DynamoDBMarshalling(marshallerClass = DateTimeMarshaller.class)
    public DateTime getCreationDate() {
        return CreationDate;
    }

    public void setCreationDate(DateTime creationDate) {
        CreationDate = creationDate;
    }
}
The marshaller:
public class DateTimeMarshaller extends JsonMarshaller<DateTime>
{
    public String marshall(DateTime dt)
    {
        return String.valueOf(dt.getMillis());
    }

    public DateTime unmarshall(String dt)
    {
        long ldt = Long.parseLong(dt);
        return new DateTime(ldt);
    }
}
I get the following error:
Exception in thread "main" com.amazonaws.AmazonServiceException: Type of specified attribute inconsistent with type in table (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ValidationException; Request ID: 8aabb703-cb44-4e93-ab47-c527a5aa7d52)
I guess this is because the marshaller returns a String and DynamoDB wants a numeric type, since the attribute type is N. I don't know what people do in this case; I searched for a solution but couldn't find one. I only tested this on a local DynamoDB instance, which I don't think makes any difference (it's a validation check failing; no request is even made).
The obvious workaround is to use a long type for the dates in the model and add special getters and setters to work with DateTime (a rough sketch of that is shown below). Still, is there a cleaner way? I'm sure I'm not the only one using a DateTime range key in a model.
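For illustration, that workaround might look roughly like this in the Link model (the accessor names are hypothetical; the stored attribute stays numeric):

// Store millis in the numeric range-key attribute, expose DateTime to the rest of the code.
@DynamoDBRangeKey(attributeName = "creationDate")
public Long getCreationDateMillis() {
    return CreationDate == null ? null : CreationDate.getMillis();
}

public void setCreationDateMillis(Long millis) {
    CreationDate = (millis == null ? null : new DateTime(millis));
}

@DynamoDBIgnore
public DateTime getCreationDate() {
    return CreationDate;
}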
What I would do is re-create the table with the range key as a String itself.
Even if it's going to be populated with long numbers, making its type S will ensure compatibility with the marshaller.
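Under that suggestion only the attribute type in the table definition changes; a sketch reusing the names from the CreateTable() code above:

// Re-create the table with the range key declared as a String (S) attribute,
// matching the String the marshaller writes.
ArrayList<AttributeDefinition> ad = new ArrayList<AttributeDefinition>();
ad.add(new AttributeDefinition().withAttributeName("creationDate").withAttributeType(ScalarAttributeType.S));
ad.add(new AttributeDefinition().withAttributeName("contentHash").withAttributeType(ScalarAttributeType.S));
createTableRequest.setAttributeDefinitions(ad);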
