Grails losing time information of a Date when parsing JSON - java

My Grails API receives the time string but loses the seconds.
I receive "16/02/2023 17:52:31", and the resulting Date object holds "16/02/2023 17:52:00".
class SampleObject {
    static hasOne = [timeObject: TimeObject]
}

class TimeObject {
    Date begin
    Date end

    static belongsTo = [sampleObject: SampleObject]

    static constraints = {
        begin nullable: true
        end nullable: true
        sampleObject nullable: false, unique: true
    }
}

def controllerMethod() {
    def bodyJSON = request.reader.text
    def sampleObject = new SampleObject(JSON.parse(bodyJSON))
    println sampleObject
}
In the sample above I send:
{
    "sampleObject": {
        "begin": "16/02/2023 17:52:31",
        "end": "16/02/2023 17:53:12"
    }
}
Result:
begin= {Date#16875} "Thu Feb 16 17:52:00 BRT 2023"
end= {Date#16876} "Thu Feb 16 17:53:00 BRT 2023"
In my application.groovy I have:
grails.databinding.dateFormats = ['dd/MM/yyyy HH:mm', 'dd/MM/yyyy HH:mm:ss', 'yyyy-MM-dd HH:mm:ss.S', "yyyy-MM-dd'T'hh:mm:ss'Z'"]
PS: I'm using Grails 3.1.

The problem was the order of grails.databinding.dateFormats: 'dd/MM/yyyy HH:mm' comes before 'dd/MM/yyyy HH:mm:ss', so the shorter pattern matches first and the seconds are discarded. I solved it by reordering the list:
grails.databinding.dateFormats = ['dd/MM/yyyy HH:mm:ss', 'dd/MM/yyyy HH:mm', 'yyyy-MM-dd HH:mm:ss.S', "yyyy-MM-dd'T'hh:mm:ss'Z'"]
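Why the order matters: Grails tries the formats in sequence, and java.text.DateFormat.parse(String) succeeds as soon as a matching prefix parses, silently ignoring any trailing text. A minimal plain-Java sketch (the class name is mine, not from the post) that reproduces the truncation:

import java.text.SimpleDateFormat;

public class DateFormatOrderDemo {
    public static void main(String[] args) throws Exception {
        String input = "16/02/2023 17:52:31";

        // parse(String) accepts a matching prefix and ignores trailing text,
        // so the shorter pattern "wins" and the :31 seconds are dropped.
        System.out.println(new SimpleDateFormat("dd/MM/yyyy HH:mm").parse(input));
        // -> Thu Feb 16 17:52:00 ... 2023 (seconds lost)

        // The longer pattern keeps the seconds, so it must be tried first.
        System.out.println(new SimpleDateFormat("dd/MM/yyyy HH:mm:ss").parse(input));
        // -> Thu Feb 16 17:52:31 ... 2023
    }
}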

Related

Spring JPA Mongo Datediff operator

I am trying to convert a native Mongo query to Spring Data.
The gist is that I have to subtract two dates that are stored as strings: first convert them with $toDate, then apply $dateDiff. This is how I get the result I want in my Compass client:
$project: {
    waiting: {
        $dateDiff: {
            startDate: { $toDate: "$date_start" },
            endDate: { $toDate: "$date_end" },
            unit: "second"
        }
    },
    _id: 0
}
In the documentation there is a reference to the use of $dateDiff, but I don't see any example, and I'm somewhat new to this technology:
https://docs.spring.io/spring-data/mongodb/docs/current/reference/html/#mongo.aggregation
(Date Aggregation Operators)
This has been the closest I have come, and it fails because the dates are not in date format:
ProjectionOperation projectStage = Aggregation.project("test").andExpression("date_start - date_end").as("test");
error during aggregation :: caused by :: can't $subtract string from string
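No accepted answer was posted; what follows is a hedged sketch, not from the original thread. Assuming MongoDB 5.0+ (which $dateDiff requires), one way to express the shell pipeline in Spring Data MongoDB is to emit the raw $dateDiff/$toDate document through a custom AggregationExpression; the class and method names here are mine:

import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationExpression;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation;

public class WaitingTimeAggregation {
    public static ProjectionOperation waitingProjection() {
        // Emit the same document as the shell query: convert both string
        // fields with $toDate, then diff them in seconds with $dateDiff.
        AggregationExpression waitingSeconds = context -> new Document("$dateDiff",
                new Document("startDate", new Document("$toDate", "$date_start"))
                        .append("endDate", new Document("$toDate", "$date_end"))
                        .append("unit", "second"));

        return Aggregation.project()
                .and(waitingSeconds).as("waiting")
                .andExclude("_id");
    }
}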

Google Calendar API v3 always returns BadRequest when creating events

I created a shared calendar and want to add events to it.
I created a project and set up a Service Account xxx@xxx.iam.gserviceaccount.com.
Then I shared the calendar with the Service Account as owner.
Then I noticed that a service account must manually add a shared calendar, as described in
https://stackoverflow.com/a/62232361/298430 and https://issuetracker.google.com/issues/148804709.
So I wrote this code:
@Test
fun addCalendarToServiceAccount() {
    val calList1: CalendarList = calendar.calendarList().list().execute()
    logger.info("calList1 = {}", calList1)

    val inserted = calendar.calendarList().insert(CalendarListEntry().setId(calendarId)).execute()
    logger.info("inserted = {}", inserted)

    val calList2: CalendarList = calendar.calendarList().list().execute()
    logger.info("calList2 = {}", calList2)
}
It works perfectly: on the first call calList1 is empty, and calList2 contains the calendar.
Then I manually inserted one event into the calendar (via the Google Calendar web UI) and wanted to check whether I could retrieve it:
@Test
fun listEvents() {
    val events: Events = calendar.events().list(calendarId).execute()
    logger.info("events = {}", events)
    events.items.forEachIndexed { index, e ->
        logger.info("Event [index = {}] , event = {}", index, e)
    }
}
It also works; the events list response is:
{
    "accessRole": "owner",
    "defaultReminders": [],
    "etag": "\"xxx\"",
    "items": [
        {
            "created": "2020-08-17T17:51:21.000Z",
            "creator": {
                "email": "xxx@gmail.com"
            },
            "end": {
                "date": "2020-08-20"
            },
            "etag": "\"xxx\"",
            "htmlLink": "https://www.google.com/calendar/event?eid=xxx",
            "iCalUID": "xxx@google.com",
            "id": "xxx",
            "kind": "calendar#event",
            "organizer": {
                "displayName": "xxx",
                "email": "xxx@group.calendar.google.com",
                "self": true
            },
            "reminders": {
                "useDefault": false
            },
            "sequence": 0,
            "start": {
                "date": "2020-08-19"
            },
            "status": "confirmed",
            "summary": "xxx test1",
            "transparency": "transparent",
            "updated": "2020-08-18T01:07:54.441Z"
        }
    ],
    "kind": "calendar#events",
    "nextSyncToken": "xxx",
    "summary": "xxx",
    "timeZone": "Asia/Taipei",
    "updated": "2020-08-18T01:07:54.688Z"
}
Then I want to programmatically insert something, like the API example shows:
@Test
fun testInsertEvent() {
    val now = LocalDateTime.now().withSecond(0).withNano(0)
    val zoneId = ZoneId.of("Asia/Taipei")
    val fromDate = Date.from(now.atZone(zoneId).toInstant())
    val endDate = Date.from(now.plusMinutes(60).atZone(zoneId).toInstant())

    val event = Event()
        .setSummary("Google I/O 2015")
        .setLocation("800 Howard St., San Francisco, CA 94103")
        .setDescription("A chance to hear more about Google's developer products.")
        .setStart(EventDateTime().setDate(DateTime(fromDate, TimeZone.getTimeZone(zoneId))))
        .setEnd(EventDateTime().setDate(DateTime(endDate, TimeZone.getTimeZone(zoneId))))

    logger.info("before insert event : {}", event)
    val eventResult: Event = calendar.events().insert(calendarId, event).execute()
    logger.info("eventResult = {}", eventResult)
}
I can see the client actually POSTs to Google's endpoint. The body is:
{
    "description": "A chance to hear more about Google's developer products.",
    "end": {
        "date": "2020-08-18T11:32:00.000+08:00"
    },
    "location": "800 Howard St., San Francisco, CA 94103",
    "start": {
        "date": "2020-08-18T10:32:00.000+08:00"
    },
    "summary": "Google I/O 2015"
}
But Google just replied 400 Bad Request, without any further description:
2020-08-18 10:32:15.974 [main] INFO c.g.a.c.h.HttpResponse - -------------- RESPONSE --------------
HTTP/1.1 400 Bad Request
Transfer-Encoding: chunked
Alt-Svc: h3-29=":443"; ma=2592000,h3-27=":443"; ma=2592000,h3-T050=":443"; ma=2592000,h3-Q050=":443"; ma=2592000,h3-Q046=":443"; ma=2592000,h3-Q043=":443"; ma=2592000,quic=":443"; ma=2592000; v="46,43"
Server: ESF
X-Content-Type-Options: nosniff
Pragma: no-cache
Date: Tue, 18 Aug 2020 02:32:15 GMT
X-Frame-Options: SAMEORIGIN
Cache-Control: no-cache, no-store, max-age=0, must-revalidate
Content-Encoding: gzip
Vary: Referer
Vary: X-Origin
Vary: Origin
Expires: Mon, 01 Jan 1990 00:00:00 GMT
X-XSS-Protection: 0
Content-Type: application/json; charset=UTF-8
2020-08-18 10:32:15.980 [main] INFO c.g.a.c.u.LoggingByteArrayOutputStream - Total: 171 bytes
2020-08-18 10:32:15.980 [main] INFO c.g.a.c.u.LoggingByteArrayOutputStream - {
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "badRequest",
        "message": "Bad Request"
      }
    ],
    "code": 400,
    "message": "Bad Request"
  }
}
I am using the same calendar instance, which successfully runs addCalendarToServiceAccount() (as owner) and listEvents().
But what goes wrong when inserting an event? Did I miss anything?
Other fields are initialized as follows:
@Value("\${google.calendar.id}")
private lateinit var calendarId: String

@Value("\${google.calendar.apiKey}")
private lateinit var apiKey: String

private val httpTransport: HttpTransport by lazy {
    GoogleNetHttpTransport.newTrustedTransport()
}

private val jacksonFactory: JsonFactory by lazy {
    JacksonFactory.getDefaultInstance()
}

private val saCredentials: GoogleCredentials by lazy {
    javaClass.getResourceAsStream("/chancer-d1de03c4c25a.json").use { iStream ->
        ServiceAccountCredentials.fromStream(iStream)
            .createScoped(listOf(
                "https://www.googleapis.com/auth/cloud-platform",
                *CalendarScopes.all().toTypedArray()
            ))
    }.apply {
        refreshIfExpired()
    }
}

private val requestInitializer: HttpRequestInitializer by lazy {
    HttpCredentialsAdapter(saCredentials)
}

private val calendar: Calendar by lazy {
    Calendar.Builder(httpTransport, jacksonFactory, requestInitializer)
        .build()
}
Environments:
<java.version>1.8</java.version>
<kotlin.version>1.4.0</kotlin.version>
<dependency>
    <groupId>com.google.api-client</groupId>
    <artifactId>google-api-client</artifactId>
    <version>1.30.10</version>
</dependency>
<dependency>
    <groupId>com.google.apis</groupId>
    <artifactId>google-api-services-calendar</artifactId>
    <version>v3-rev20200610-1.30.10</version>
</dependency>
<dependency>
    <groupId>com.google.auth</groupId>
    <artifactId>google-auth-library-oauth2-http</artifactId>
    <version>0.21.1</version>
</dependency>
Answer:
You need to use start.dateTime and end.dateTime rather than start.date and end.date.
Fix:
As per the documentation:
end.date: The date, in the format "yyyy-mm-dd", if this is an all-day event.
end.dateTime: The time, as a combined date-time value (formatted according to RFC3339). A time zone offset is required unless a time zone is explicitly specified in timeZone.
start.date: The date, in the format "yyyy-mm-dd", if this is an all-day event.
start.dateTime: The time, as a combined date-time value (formatted according to RFC3339). A time zone offset is required unless a time zone is explicitly specified in timeZone.
Therefore, you need to change your date & time setting method from:
EventDateTime().setDate(DateTime(fromDate, TimeZone.getTimeZone(zoneId)))
to:
EventDateTime().setDateTime(DateTime(fromDate, TimeZone.getTimeZone(zoneId)))
Which will change the request body to:
{
    "description": "A chance to hear more about Google's developer products.",
    "end": {
        "dateTime": "2020-08-18T11:32:00.000+08:00" // modified
    },
    "location": "800 Howard St., San Francisco, CA 94103",
    "start": {
        "dateTime": "2020-08-18T10:32:00.000+08:00" // modified
    },
    "summary": "Google I/O 2015"
}
You can see the documentation for this method here.
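For completeness, here is the corrected construction as a minimal Java sketch (the question's code is Kotlin; the wrapper class is mine, and the hard-coded RFC 3339 strings are copied from the request body above):

import com.google.api.client.util.DateTime;
import com.google.api.services.calendar.model.Event;
import com.google.api.services.calendar.model.EventDateTime;

public class TimedEventExample {
    // Note setDateTime, not setDate: this is a timed event, not an all-day one.
    static Event buildEvent() {
        return new Event()
                .setSummary("Google I/O 2015")
                .setStart(new EventDateTime()
                        .setDateTime(new DateTime("2020-08-18T10:32:00.000+08:00")))
                .setEnd(new EventDateTime()
                        .setDateTime(new DateTime("2020-08-18T11:32:00.000+08:00")));
    }
}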
References:
Events: insert | Calendar API | Google Developers
RFC 3339 - Date and Time on the Internet: Timestamps
EventDateTime (Calendar API v3-rev20200610-1.30.10)

How to combine "not" operators using the spring data Criteria builder

To set the scene, the query I want to produce is similar to the following, which I've written manually. The idea is that I want to exclude a date range from the result set: nothing from today, but if there's anything in the past or the future, I want it!
{
    "createdOn": {
        $not: {
            "$lte": ISODate("2020-03-23T23:59:59.999999999"),
            "$gte": ISODate("2020-03-23T00:00")
        }
    }
}
The code I am using to try and get the above is the following:
Criteria.where("createdOn")
.not()
.gte(LocalDate.now().atTime(LocalTime.MIN))
.lte(LocalDate.now().atTime(LocalTime.MAX))
However, that generates the following:
{
    "createdOn": {
        "$not": {
            "$gte": { "$java": "2020-03-23T00:00" }
        },
        "$lte": { "$java": "2020-03-23T23:59:59.999999999" }
    }
}
If I change the code to add another not() before the lte, it still produces the same output.
Is there any way of producing the query I want, or an alternate way of excluding a range of dates from the result set?

I tried your mongo shell query and the corresponding Spring Data MongoDB Java code. The shell query works fine, but the Java code using not() doesn't (I don't know why).
Here is another way to get the functionality you're looking for:
...nothing from today, but if there's any in the past or in the future
I am using the following input documents:
{ _id: 1, createdOn: ISODate("2020-03-21T12:05:00") },
{ _id: 2, createdOn: ISODate("2020-03-28T18:33:00") },
{ _id: 3, createdOn: ISODate("2020-03-24T01:56:00") }
And, assuming today's date: ISODate("2020-03-24T02:50:04.992Z"), the result should exclude the document with _id: 3, where the createdOn is within today.
The mongo shell query:
db.collection.find( {
    $or: [
        { createdOn: { $gt: ISODate("2020-03-24T23:59:59.99") } },
        { createdOn: { $lt: ISODate("2020-03-24T00:00:00") } }
    ]
} )
This returns the documents with _id's 1 and 2 (these exclude today's date).
The corresponding Java code:
DateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd H:m:s");
Date fromDate = dateFormat.parse("2020-03-24 00:00:00");
Date toDate = dateFormat.parse("2020-03-24 23:59:59");

// Match documents strictly before today OR strictly after today.
Criteria c = new Criteria().orOperator(
        Criteria.where("createdOn").lt(fromDate),
        Criteria.where("createdOn").gt(toDate));

Query q = Query.query(c);
MongoOperations mongoOps = new MongoTemplate(MongoClients.create(), "testDB");
List<Document> result = mongoOps.find(q, Document.class, "collection");
result.forEach(doc -> System.out.println(doc.toJson()));
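An alternative sketch, not from the original answer, that stays closer to the question's "exclude this range" phrasing is norOperator, which maps to $nor:

// Hedged alternative: $nor over a single range criterion excludes exactly
// the documents whose createdOn falls inside [fromDate, toDate].
Criteria nor = new Criteria().norOperator(
        Criteria.where("createdOn").gte(fromDate).lte(toDate));
Query norQuery = Query.query(nor);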

Pretty print String to JSON format

I execute an AWS command to retrieve the spot price history.
DescribeSpotPriceHistoryRequest request = new DescribeSpotPriceHistoryRequest()
        .withEndTime(start)
        .withInstanceTypes("m1.xlarge")
        .withProductDescriptions("Linux/UNIX (Amazon VPC)")
        .withStartTime(end);
DescribeSpotPriceHistoryResult response = client.describeSpotPriceHistory(request);
System.out.println(response.toString());
I obtain the result in a JSON-like format, but it comes back as a String:
{SpotPriceHistory: [{AvailabilityZone: us-east-1d,InstanceType: m1.xlarge,ProductDescription: Linux/UNIX,SpotPrice: 0.035000,Timestamp: Wed Nov 07 00:18:50 CET 2018}, {AvailabilityZone: us-east-1c,InstanceType: m1.xlarge,ProductDescription: Linux/UNIX,SpotPrice: 0.035000,Timestamp: Wed Nov 07 00:18:50 CET 2018}, {AvailabilityZone: us-east-1b,InstanceType: m1.xlarge,ProductDescription: Linux/UNIX,SpotPrice: 0.035000,Timestamp: Wed Nov 07 00:18:50 CET 2018}, {AvailabilityZone: us-east-1a,InstanceType: m1.xlarge,ProductDescription: Linux/UNIX,SpotPrice: 0.035000,Timestamp: Wed Nov 07 00:18:50 CET 2018}, {AvailabilityZone: us-east-1e,InstanceType: m1.xlarge,ProductDescription: Linux/UNIX,SpotPrice: 0.350000,Timestamp: Thu Sep 20 01:08:39 CEST 2018}]}
My question is: how can I improve the display of the results? Like this:
{
    "SpotPriceHistory": [
        {
            "Timestamp": "2018-09-08T17:07:14.000Z",
            "AvailabilityZone": "us-east-1d",
            "InstanceType": "p2.16xlarge",
            "ProductDescription": "Linux/UNIX",
            "SpotPrice": "4.320000"
        },
        {
            "Timestamp": "2018-09-08T17:07:14.000Z",
            "AvailabilityZone": "us-east-1c",
            "InstanceType": "p2.16xlarge",
            "ProductDescription": "Linux/UNIX",
            "SpotPrice": "4.320000"
        },
        {
            "Timestamp": "2018-09-08T17:07:14.000Z",
            "AvailabilityZone": "us-east-1b",
            "InstanceType": "p2.16xlarge",
            "ProductDescription": "Linux/UNIX",
            "SpotPrice": "4.320000"
        },
        {
            "Timestamp": "2018-09-08T12:32:28.000Z",
            "AvailabilityZone": "us-east-1e",
            "InstanceType": "p2.16xlarge",
            "ProductDescription": "Linux/UNIX",
            "SpotPrice": "4.320000"
        }
    ]
}
You're calling .toString() on the response object. Be careful with this: there is no guarantee the result will be JSON, and as you're seeing above it isn't even valid JSON, since it's missing quotes around the attribute names and values.
One option to get what you want is to call response.getSpotPriceHistory() to get the array of spot prices, then pass that through Jackson's ObjectMapper and write it as a pretty string, like so:
public static void main(String[] args) throws IOException {
    AmazonEC2 client = AmazonEC2Client.builder().build();
    DescribeSpotPriceHistoryRequest request = new DescribeSpotPriceHistoryRequest()
            .withEndTime(new Date())
            .withInstanceTypes("m1.xlarge")
            .withProductDescriptions("Linux/UNIX (Amazon VPC)")
            .withStartTime(new Date());
    DescribeSpotPriceHistoryResult response = client.describeSpotPriceHistory(request);

    ObjectMapper mapper = new ObjectMapper();
    String asPrettyJSon = mapper.writerWithDefaultPrettyPrinter()
            .writeValueAsString(response.getSpotPriceHistory());
    System.out.println(asPrettyJSon);
}
Both represent a JSON array containing JSON objects of the same structure.
Displaying the result will depend on your front-end implementation, not the layout of your JSON response.
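As a side note: if you ever hold a valid JSON string (the toString() output above is not one), you can reformat it by round-tripping through Jackson's tree model. A small sketch, with a made-up input string:

import com.fasterxml.jackson.databind.ObjectMapper;

public class PrettyPrintJson {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // A valid JSON string, unlike the unquoted toString() output above.
        String raw = "{\"AvailabilityZone\":\"us-east-1d\",\"SpotPrice\":\"0.035000\"}";
        // Parse to a JsonNode, then write it back out with indentation.
        String pretty = mapper.writerWithDefaultPrettyPrinter()
                .writeValueAsString(mapper.readTree(raw));
        System.out.println(pretty);
    }
}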

Spark Sql mapping issue

Spark 2 / Java 8 / Cassandra 2
Trying to read some data from Cassandra and then run a GROUP BY query in Spark. I have only 2 columns in my DataFrame:
transdate (Date), origin (String)
Dataset<Row> maxOrigindate = sparks.sql("SELECT origin, transdate, COUNT(*) AS cnt FROM origins GROUP BY (origin,transdate) ORDER BY cnt DESC LIMIT 1");
Get error:
Exception in thread "main" org.apache.spark.sql.AnalysisException: expression 'origins.`origin`' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first() (or first_value)
The GROUP BY issue was solved by removing the parentheses, as below.
Complete code (trying to get the max number of transactions on a date for an origin/location):
JavaRDD<TransByDate> originDateRDD = javaFunctions(sc)
        .cassandraTable("trans", "trans_by_date", CassandraJavaUtil.mapRowTo(TransByDate.class))
        .select(CassandraJavaUtil.column("origin"), CassandraJavaUtil.column("trans_date").as("transdate"))
        .limit((long) 100);
Dataset<Row> originDF = sparks.createDataFrame(originDateRDD, TransByDate.class);

String[] columns = originDF.columns();
System.out.println("originDF columns: " + columns[0] + " " + columns[1]);

originDF.createOrReplaceTempView("origins");
Dataset<Row> maxOrigindate = sparks.sql("SELECT origin, transdate, COUNT(*) AS cnt FROM origins GROUP BY origin,transdate ORDER BY cnt DESC LIMIT 1");
List list = maxOrigindate.collectAsList(); // Exception here
int j = list.size();
Output:
originDF columns: transdate origin
public static class TransByDate implements Serializable {
    private String origin;
    private Date transdate;

    public TransByDate() { }

    public TransByDate(String origin, Date transdate) {
        this.origin = origin;
        this.transdate = transdate;
    }

    public String getOrigin() { return origin; }
    public void setOrigin(String origin) { this.origin = origin; }
    public Date getTransdate() { return transdate; }
    public void setTransdate(Date transdate) { this.transdate = transdate; }
}
Schema
root
|-- transdate: struct (nullable = true)
| |-- date: integer (nullable = false)
| |-- day: integer (nullable = false)
| |-- hours: integer (nullable = false)
| |-- minutes: integer (nullable = false)
| |-- month: integer (nullable = false)
| |-- seconds: integer (nullable = false)
| |-- time: long (nullable = false)
| |-- timezoneOffset: integer (nullable = false)
| |-- year: integer (nullable = false)
|-- origin: string (nullable = true)
Exception
ERROR Executor: Exception in task 0.0 in stage 2.0 (TID 12)
scala.MatchError: Sun Jan 01 00:00:00 PST 2012 (of class java.util.Date)
at org.apache.spark.sql.catalyst.CatalystTypeConverters$StructConverter.toCatalystImpl(CatalystTypeConverters.scala:256)
at org.apache.spark.sql.catalyst.CatalystTypeConverters$StructConverter.toCatalystImpl(CatalystTypeConverters.scala:251)
at org.apache.spark.sql.catalyst.CatalystTypeConverters$CatalystTypeConverter.toCatalyst(CatalystTypeConverters.scala:103)
....
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 2.0 failed 1 times, most recent failure: Lost task 0.0 in stage 2.0 (TID 12, localhost): scala.MatchError: Sun Jan 01 00:00:00 PST 2012 (of class java.util.Date)
at org.apache.spark.sql.catalyst.CatalystTypeConverters$StructConverter.toCatalystImpl(CatalystTypeConverters.scala:256)
...
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1454)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1442)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1441)
...
at org.apache.spark.sql.Dataset$$anonfun$collectAsList$1.apply(Dataset.scala:2184)
at org.apache.spark.sql.Dataset.withCallback(Dataset.scala:2559)
at org.apache.spark.sql.Dataset.collectAsList(Dataset.scala:2184)
at spark.SparkTest.sqlMaxCount(SparkTest.java:244) -> List list = maxOrigindate.collectAsList();
Caused by: scala.MatchError: Sun Jan 01 00:00:00 PST 2012 (of class java.util.Date)
at org.apache.spark.sql.catalyst.CatalystTypeConverters$StructConverter.toCatalystImpl(CatalystTypeConverters.scala:256)
at org.apache.spark.sql.catalyst.CatalystTypeConverters$StructConverter.toCatalystImpl(CatalystTypeConverters.scala:251)
You are getting the error below:
Caused by: scala.MatchError: Sun Jan 01 00:00:00 PST 2012 (of class java.util.Date)
This error occurs because Spark SQL supports the java.sql.Date type, not java.util.Date. Please check the Spark SQL documentation; you can also refer to SPARK-2562. So change the transdate field in TransByDate from java.util.Date to java.sql.Date, and run the query without parentheses in the GROUP BY:
Dataset<Row> maxOrigindate = sparks.sql("SELECT origin, transdate, COUNT(*) AS cnt FROM origins GROUP BY origin,transdate ORDER BY cnt DESC LIMIT 1");
This will work.
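Concretely, that means switching the field type in the mapped bean. A sketch mirroring the question's class, with only the transdate type changed:

import java.io.Serializable;

// Hedged sketch: java.sql.Date is accepted by Spark SQL's type converters,
// whereas java.util.Date triggers the scala.MatchError seen above.
public static class TransByDate implements Serializable {
    private String origin;
    private java.sql.Date transdate;

    public TransByDate() { }

    public TransByDate(String origin, java.sql.Date transdate) {
        this.origin = origin;
        this.transdate = transdate;
    }

    public String getOrigin() { return origin; }
    public void setOrigin(String origin) { this.origin = origin; }
    public java.sql.Date getTransdate() { return transdate; }
    public void setTransdate(java.sql.Date transdate) { this.transdate = transdate; }
}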
