I am building a Java map from two different JSON data streams. However, if one of the streams contains a null value in the quantity, the flow errors with the following message:
($.itemid) : '"' ++ skuLookup[$.sku][0].qty as :number as :string {format: "#.0000"} ++ '"'
^
Cannot coerce a :null to a :number
(com.mulesoft.weave.mule.exception.WeaveExecutionException). Message payload is of type: String
The ideal scenario would be to ignore the null values within the iteration. Here is the DataWeave, untouched:
%output application/java
%var skuLookup = flowVars.SSRCreateStarshipItems groupBy $.sku
---
flowVars.SSRGetOrderItems map ({
($.itemid) : '"' ++ skuLookup[$.sku][0].qty as :number as :string
{format: "#.0000"} ++ '"'
})
Here is my ideal solution; however, I can't seem to get it to work:
%output application/java
%var skuLookup = flowVars.SSRCreateStarshipItems groupBy $.sku
---
flowVars.SSRGetOrderItems map ({
($.itemid) : '"' ++ skuLookup[$.sku][0].qty as :number as :string
{format: "#.0000"} ++ '"'
}) when skuLookup[$.sku][0].qty != null
Setting a default value resolves your problem.
For example, if you were passing in a payload such as this:
{
"testMap": [
{
"testName": "TestName",
"Value": "3"
},
{
"testName": "TestName",
"Value": null
}
]
}
You could convert the Value key to a formatted number string, or pass null through, with this:
%dw 1.0
%input payload application/json
%output application/json
---
payload.testMap map (data, index) -> {
testName: data.testName,
Value: data.Value as :number as :string {format: "#.0000"} default null
}
And this is the result:
[
{
"testName": "TestName",
"Value": "3.0000"
},
{
"testName": "TestName",
"Value": null
}
]
Also, if you want to remove the key completely, add this attribute to the JSON %output directive:
skipNullOn="everywhere"
I am trying to validate a JSON document which has multiple nested JSON objects.
Example:
Scenario: temp1
* def response1 =
"""
{
"productGroups": [
{
"dateLabel": "28 Aug, Wed",
"products": [
{
"id": 1439,
"product": "product 1"
},
{
"id": 1401,
"product": "product 2"
}
]
}
]
}
"""
* print response1.productGroups
Then match response1.productGroups[*] contains
"""
{
'dateLabel': #string,
'products': [
{
'id': #number,
'product': #string
}
]
}
"""
I get the following failure:
reason: actual value does not contain expected
If I change the validation to
Then match response1.productGroups[0] contains
then I get:
reason: actual and expected arrays are not the same size - 2:1
What I want to do is verify the schema of the "productGroups" objects along with the inner "products" objects.
Please spend some time reading the docs; it is worth it: https://github.com/intuit/karate#schema-validation
* def product = { 'id': '#number', 'product': '#string' }
Then match response1.productGroups[*] contains
"""
{
'dateLabel': '#string',
'products': '#[] product'
}
"""
I use DataTables with server-side processing. The JSON object I receive contains a LocalDateTime element serialized as an array:
...
"SimpleDate": [ 2000,12,31,0,0 ]
...
My columns definition in the initialization script is the following:
"columns": [
{ "data": "SimpleDate"}
]
By default, the column is rendered comma-separated: 2000,12,31,0,0
How can I change it to 31.12.2000?
I tried columnDefs and render like this:
"columnDefs": [
{
"render": function ( data, type, row ) {
return data.2 + '.' + data.1 + '.' + data.0;
},
"targets": 0
}
but this simply stops the table from rendering. I assume accessing the array via data.x is not possible here.
So, how do I do it?
You are not accessing the elements of the data array properly; data.2 is not valid JavaScript syntax. Use bracket notation instead:
"render": function ( data, type, row ) {
return data[2] + '.' + data[1] + '.' + data[0];
},
Try something like this for the complete column definition:
"columnDefs": [{
    "targets": 0,
    "data": "SimpleDate",
    "render": function ( data, type, row ) {
        return data[2] + '.' + data[1] + '.' + data[0];
    }
}]
I wrote a query in MongoDB as follows:
db.getCollection('student').aggregate(
[
{
$match: { "student_age" : { "$ne" : 15 } }
},
{
$group:
{
_id: "$student_name",
count: {$sum: 1},
sum1: {$sum: "$student_age"}
}
}
])
In other words, I want to fetch the count of students who aren't 15 years old and the sum of their ages. The query works fine and I get both data items.
In my application, I want to run the same query with Spring Data.
I wrote the following code:
Criteria where = Criteria.where("AGE").ne(15);
Aggregation aggregation = Aggregation.newAggregation(
Aggregation.match(where),
Aggregation.group().sum("student_age").as("totalAge"),
count().as("countOfStudentNot15YearsOld"));
When this code is run, the output query will be:
"aggregate" : "MyDocument", "pipeline" :
[ { "$match" { "AGE" : { "$ne" : 15 } } },
{ "$group" : { "_id" : null, "totalAge" : { "$sum" : "$student_age" } } },
{ "$count" : "countOfStudentNot15YearsOld" }],
"cursor" : { "batchSize" : 2147483647 }
Unfortunately, the result contains only the countOfStudentNot15YearsOld item.
I want to fetch the result just as my native query does.
If you're asking to return the grouping for both "15" and "not 15" in one result, then you're looking for the $cond operator, which allows "branching" based on conditional evaluation.
In the "shell" you would use it like this:
db.getCollection('student').aggregate([
{ "$group": {
"_id": null,
"countFiteen": {
"$sum": {
"$cond": [{ "$eq": [ "$student_age", 15 ] }, 1, 0 ]
}
},
"countNotFifteen": {
"$sum": {
"$cond": [{ "$ne": [ "$student_age", 15 ] }, 1, 0 ]
}
},
"sumNotFifteen": {
"$sum": {
"$cond": [{ "$ne": [ "$student_age", 15 ] }, "$student_age", 0 ]
}
}
}}
])
So you use $cond to perform a logical test, in this case whether the "student_age" in the current document is 15 or not, and then return either a numeric value (1 here, for "counting") or the actual field value when that is what you want to send to the accumulator. In short, it is a "ternary" operator, an if/then/else condition (which can also be written in the more expressive form with if/then/else keys) that you use to test a condition and decide what to return.
For the Spring MongoDB implementation you use ConditionalOperators.Cond to construct the same BSON expressions:
import org.springframework.data.mongodb.core.aggregation.*;
ConditionalOperators.Cond isFifteen = ConditionalOperators.when(new Criteria("student_age").is(15))
.then(1).otherwise(0);
ConditionalOperators.Cond notFifteen = ConditionalOperators.when(new Criteria("student_age").ne(15))
.then(1).otherwise(0);
ConditionalOperators.Cond sumNotFifteen = ConditionalOperators.when(new Criteria("student_age").ne(15))
.thenValueOf("student_age").otherwise(0);
GroupOperation groupStage = Aggregation.group()
.sum(isFifteen).as("countFifteen")
.sum(notFifteen).as("countNotFifteen")
.sum(sumNotFifteen).as("sumNotFifteen");
Aggregation aggregation = Aggregation.newAggregation(groupStage);
So basically you just extend that logic, using .then() for a "constant" value such as 1 for the "counts", and .thenValueOf() where you actually need the "value" of a field from the document, equivalent to "$student_age" in the common shell notation.
Since ConditionalOperators.Cond implements the AggregationExpression interface, it can be used with .sum() in the form that accepts an AggregationExpression rather than a string. This is an improvement on past releases of Spring Mongo, which required a $project stage so that there were actual document properties for the evaluated expression before performing the $group.
If all you want is to replicate the original query for Spring MongoDB, then your mistake was using the $count aggregation stage rather than appending count() to group():
Criteria where = Criteria.where("AGE").ne(15);
Aggregation aggregation = Aggregation.newAggregation(
Aggregation.match(where),
Aggregation.group()
.sum("student_age").as("totalAge")
.count().as("countOfStudentNot15YearsOld")
);
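Either aggregation can then be executed and the single grouped document read back. A minimal sketch, assuming an injected MongoTemplate bean named mongoTemplate and a collection named "student" (both are assumptions, not shown in the question):
import org.bson.Document;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;

// Run the pipeline against the "student" collection (name assumed) and read the
// single document produced by the $group stage.
AggregationResults<Document> results =
        mongoTemplate.aggregate(aggregation, "student", Document.class);
Document grouped = results.getUniqueMappedResult();
Number totalAge = (Number) grouped.get("totalAge");
Number count = (Number) grouped.get("countOfStudentNot15YearsOld");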
For testing purposes I need to override the 'equals' method:
def any = [equals: { true }] as String
any == 'should be true'
// false
More detail about the problem:
class EmployeeEndpointSpec extends RestSpecification {
void "test employee" () {
when:
get "/v1/employee", parameters
then:
expectedStatus.equals(response.statusCode)
expectedJson.equals(response.json)
where:
parameters << [
[:],
[id: 824633720833, style: "small"]
]
expectedStatus << [
HttpStatus.BAD_REQUEST,
HttpStatus.OK
]
expectedJson << [
[errorCode: "badRequest"],
[
id: 824633720833,
name: "Jimmy",
email: "jimmy#fakemail.com",
dateCreated:"2015-01-01T01:01:00.000", // this value should be ignored
lastUpdated: "2015-01-01T01:01:00.000" // and this
]
]
}
}
lastUpdated and dateCreated may change over time, and I need to somehow ignore them.
If there's no need to compare the mentioned fields, remove them:
class EmployeeEndpointSpec extends RestSpecification {
void "test employee" () {
when:
get "/v1/employee", parameters
then:
expectedStatus.equals(response.statusCode)
def json = response.json
json.remove('dateCreated')
json.remove('lastUpdated')
expectedJson.equals(json) // compare against the copy with the volatile fields removed
where:
parameters << [
[:],
[id: 824633720833, style: "small"]
]
expectedStatus << [
HttpStatus.BAD_REQUEST,
HttpStatus.OK
]
expectedJson << [
[errorCode: "badRequest"],
[
id: 824633720833,
name: "Jimmy",
email: "jimmy#fakemail.com",
dateCreated:"2015-01-01T01:01:00.000",
lastUpdated: "2015-01-01T01:01:00.000"
]
]
}
}
I'd also separate the negative and positive test scenarios.
You can also test keySet() separately from testing the key values instead of comparing the whole map. This is how I'd do that:
then:
def json = response.json
json.id == 824633720833
json.name == "Jimmy"
json.email == "jimmy#fakemail.com"
json.dateCreated.matches('<PATTERN>')
json.lastUpdated.matches('<PATTERN>')
In case you don't like the last two lines, they can be replaced with:
json.keySet().containsAll(['lastUpdated', 'dateCreated'])
The answer is:
String.metaClass.equals = { Object _ -> true }
I'm trying to execute a query with more than 2 optional parameters, but I don't get any results. For 2 parameters I followed the answer to this question: spring-data-mongo - optional query parameters?
So, for example, with the following query everything is OK:
#Query("{$and: [{$or : [ { $where: '?0 == null' } , { a : ?0 } ]}, {$or : [ { $where: '?1 == null' } , { b : ?1 }]}]}")
But if I add one more parameter it stops working:
#Query("{ $and : [{$and: [{$or : [ { $where: '?0 == null' } , { a : ?0 } ]}, {$or : [ { $where: '?1 == null' } , { b : ?1 }]}]},{$or : [ { $where: '?2 == null' } , { c : ?2 }]}]}")
I triple-checked the syntax and it seems OK, but I get empty results (even though I'm sure I should obtain at least a few documents).
Any idea?
If you carefully hand-format your query to be more readable, you will notice that you have made a few mistakes with the closing brackets.
Try this query instead:
{ $and :
[{
$and:
[
{$or : [ { $where: '?0 == null' } , { a : ?0 }]},
{$or : [ { $where: '?1 == null' } , { b : ?1 }]},
{$or : [ { $where: '?2 == null' } , { c : ?2 }]}
]
}]
}
Side note: I think one $and will be enough, i.e. remove the top-level $and operator.
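For reference, a minimal sketch of how the corrected query could sit in a Spring Data repository; the MyDocument type, the repository name, and the field/parameter names a, b and c are assumptions taken from the question's placeholders:
import java.util.List;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.Query;

// Hypothetical repository: each $or clause matches either "parameter is null"
// (so that filter is effectively skipped) or the field equal to the parameter.
public interface MyDocumentRepository extends MongoRepository<MyDocument, String> {

    @Query("{ $and: [" +
           " { $or: [ { $where: '?0 == null' }, { a: ?0 } ] }," +
           " { $or: [ { $where: '?1 == null' }, { b: ?1 } ] }," +
           " { $or: [ { $where: '?2 == null' }, { c: ?2 } ] }" +
           "] }")
    List<MyDocument> findByOptionalParams(String a, String b, String c);
}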