I am new to MongoDB. I have a document with an embedded object "users" whose fields hold boolean values.
I want to fire a query on the "users" object which gives me all the key names whose values are true.
Please answer in Java.
You can try the aggregation pipeline below in version 3.4.4.
It changes the users embedded document into an array of key/value pairs using $objectToArray, then uses $filter + $map to extract the keys whose value is true.
db.collection.aggregate([
  {
    $project: {
      keys: {
        $map: {
          input: {
            $filter: {
              input: { $objectToArray: "$users" },
              as: "resultf",
              cond: { $eq: ["$$resultf.v", true] }
            }
          },
          as: "resultm",
          in: "$$resultm.k"
        }
      }
    }
  }
])
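Since the question asks for Java, here is a rough equivalent of the same pipeline built with Document objects from the MongoDB Java driver; the connection string, database, and collection names are placeholders, so treat it as a sketch rather than a drop-in answer.

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

import java.util.Arrays;
import java.util.List;

public class TrueUserKeys {
    public static void main(String[] args) {
        // Placeholder URI, database, and collection names; adjust to your setup.
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            MongoCollection<Document> collection =
                    client.getDatabase("test").getCollection("collection");

            // $filter keeps only the key/value pairs of "users" whose value is true.
            Document filter = new Document("$filter",
                    new Document("input", new Document("$objectToArray", "$users"))
                            .append("as", "resultf")
                            .append("cond", new Document("$eq", Arrays.asList("$$resultf.v", true))));

            // $map then extracts just the key names from the filtered pairs.
            Document map = new Document("$map",
                    new Document("input", filter)
                            .append("as", "resultm")
                            .append("in", "$$resultm.k"));

            List<Document> pipeline =
                    Arrays.asList(new Document("$project", new Document("keys", map)));

            for (Document doc : collection.aggregate(pipeline)) {
                System.out.println(doc.toJson());
            }
        }
    }
}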
The following query will give you all the users keys whose values are true, in array format:
// For each document, collect the keys of "users" whose value is true
db.collectionName.find({}, {users: 1}).map(function(myDoc) {
    var names = [];
    for (var key in myDoc.users) {
        if (myDoc.users[key]) {
            names.push(key);
        }
    }
    return names;
});
I have to validate the values of an output JSON file. The first validation is that all the needed key-value pairs are in the file, and the second is that the values are in the expected format (say, timestamp, string, integer, ...). I cannot compare the values directly with another file having the same content, because some of the values, like timestamp and id, are random.
JSON File:
[
{
"metadata": {
"RecordEnd": 19,
"Type": null,
"RecordOffset": 0,
"Character_Set": "UTF-8",
"MsgType": 8,
"Expire": 0,
"Name": "y1",
"delivered": false,
"Timestamp": 1664189426609,
"UserID": "1jnj2232",
"Encoding": 273,
"id": "ID:414d51205120202020210040",
"DType": "type1"
}
]
I have two approaches to do this.
The first approach: create a HashMap with the same keys as in the JSON file and regex patterns as values, then compare each key-value pair against the regex matching its key.
HashMap<String, String> metaData = new HashMap<>();
metaData.put("RecordEnd", "\\d+");
metaData.put("Type", "\\w+");
metaData.put("RecordOffset", "\\d+");
metaData.put("Character_Set", "UTF-8");
metaData.put("MsgType", "\\d+");
metaData.put("Expire", "\\w+");
metaData.put("Name", "\\w+");
metaData.put("delivered", "\\w+");
metaData.put("Timestamp", "\\d+");
metaData.put("UserID", "\\w+");
metaData.put("Encoding", "\\d+");
metaData.put("id", "ID\\:\\w+");
metaData.put("DType", "type1");
ObjectMapper mapper = new ObjectMapper();
String json = FileUtils.readFileToString(new File(outputJSONFile),"UTF-8");
json = json.substring(1, json.length() - 1); // strip the surrounding [ ] so the content parses as a single object
Map<?, ?> map = mapper.readValue(json, Map.class);
HashMap<String,Object> metaMap = (HashMap<String, Object>) map.get("metadata");
metaMap.entrySet().forEach(e-> {
if (e.getValue() != null) {
if (e.getValue().toString().matches(metaData.get(e.getKey()))) {
log.info(e + "- Matched");
} else {
throw new RuntimeException(
"MetaData key " + e.getKey() + " data is invalid");
}
}
});
Here, if the number of fields becomes larger (say, 40+ fields), I have to add HashMap entries for all of them, which becomes tedious and reduces readability.
The second approach: create a static JSON file with all the keys and the values written as regex patterns, then compare the two files by matching the regex values of one file against the other. I haven't tried this yet; a rough sketch is below.
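Roughly, I imagine the second approach would look something like this with Jackson (the file name expected-format.json and the validator class are just placeholders):

import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.io.File;
import java.io.IOException;
import java.util.List;
import java.util.Map;

public class MetaDataValidator {

    // expected-format.json is assumed to mirror the output file, but with regex strings as
    // values, e.g. { "metadata": { "RecordEnd": "\\d+", "Timestamp": "\\d+", "id": "ID\\:\\w+" } }
    public static void validate(String outputJSONFile) throws IOException {
        ObjectMapper mapper = new ObjectMapper();

        Map<String, Map<String, String>> expected = mapper.readValue(
                new File("expected-format.json"),
                new TypeReference<Map<String, Map<String, String>>>() {});

        // The output file is a JSON array, so read it as a list instead of stripping brackets.
        List<Map<String, Map<String, Object>>> records = mapper.readValue(
                new File(outputJSONFile),
                new TypeReference<List<Map<String, Map<String, Object>>>>() {});

        Map<String, String> expectedMeta = expected.get("metadata");
        Map<String, Object> actualMeta = records.get(0).get("metadata");

        for (Map.Entry<String, String> e : expectedMeta.entrySet()) {
            if (!actualMeta.containsKey(e.getKey())) {
                throw new RuntimeException("MetaData key " + e.getKey() + " is missing");
            }
            Object value = actualMeta.get(e.getKey());
            if (value != null && !value.toString().matches(e.getValue())) {
                throw new RuntimeException("MetaData key " + e.getKey() + " data is invalid");
            }
        }
    }
}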
Is the second approach more efficient or is there a more suitable approach to do this?
I have a JSON object. How can I get all the keys, and later, without hard-coding the keys, how can I get the key values?
{
"A":"M1",
"Data":[
{
"B":[
{
"B1":"111",
"B2":"Warning "
},
{
"B1":"222",
"B2":"Warning "
}
],
"C":[
{
"c1":"IL2",
"c2":"[0.750183,0.00933380975964486]"
},
{
"c1":"IL1b",
"c2":"[0.750183,-1.5216938335421]"
}
]
}
]
}
This might work for you. You have to use JSONObject's keys() to get the keys and then iterate over each key to get the dynamic values.
Roughly the code will look like:
// searchResult refers to the current element in the array "search_result"
JSONObject questionMark = searchResult.getJSONObject("question_mark");
Iterator<String> keys = questionMark.keys();
while (keys.hasNext()) {
    // loop to get the dynamic key
    String currentDynamicKey = keys.next();
    // get the value of the dynamic key
    JSONObject currentDynamicValue = questionMark.getJSONObject(currentDynamicKey);
    // do something here with the value...
}
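Since the JSON in the question nests objects inside arrays, a recursive walk can visit every key without hard-coding any of them. A minimal sketch using org.json (the class and method names here are mine, purely for illustration):

import org.json.JSONArray;
import org.json.JSONObject;

public class JsonKeyWalker {

    // Recursively visit every key/value pair; "path" is just a readable label like "Data[0].B[1].B2".
    static void walk(Object node, String path) {
        if (node instanceof JSONObject) {
            JSONObject obj = (JSONObject) node;
            for (String key : obj.keySet()) {
                walk(obj.get(key), path.isEmpty() ? key : path + "." + key);
            }
        } else if (node instanceof JSONArray) {
            JSONArray arr = (JSONArray) node;
            for (int i = 0; i < arr.length(); i++) {
                walk(arr.get(i), path + "[" + i + "]");
            }
        } else {
            System.out.println(path + " = " + node); // leaf value
        }
    }

    public static void main(String[] args) {
        // Shortened version of the JSON from the question.
        String json = "{\"A\":\"M1\",\"Data\":[{\"B\":[{\"B1\":\"111\",\"B2\":\"Warning \"}]}]}";
        walk(new JSONObject(json), "");
    }
}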
I have JSON which I am trying to parse using Jackson. The JSON looks like this:
coupons: {
1: {
title: "Mode von",
description: "",
type: "offer",
code: "-",
expiry: "0000-00-00",
link: ""
},
2: {
title: "Prime 1",
description: "",
type: "offer",
code: "-",
expiry: "0000-00-00",
link: "http://test.com/"
}
}
The number of coupons is not constant here and would vary from response to response.
My dilemma is what the corresponding Java class should be to hold such an object.
I tried with a Map as:
public class Coupons {
Map<String, String> coupons = new HashMap<String, String>();
public Map<String, String> getCoupons() {
return coupons;
}
}
But -
System.out.println(coupons.getCoupons().get("type"));
System.out.println(coupons.getCoupons().get("code"));
always gives me null. What would be the right Java class for this JSON?
Your first level of keys is the index numbers 1, 2, etc.
So in order to get the type and code, you have to specify that key first.
You could do something like this (assuming the map values are themselves maps rather than plain strings):
Map<String, Map<String, String>> coupons = couponsHolder.getCoupons(); // breakpoint here to see if this is really populated
for (String key : coupons.keySet()) { // iterate over the index keys "1", "2", ...
    Map<String, String> coupon = coupons.get(key);
    String type = coupon.get("type");
    String code = coupon.get("code");
    // ...
}
Hope this helps you move on.
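For the original question of what the right class is: one possible Jackson model is a map of index keys to coupon objects. A sketch (class names are my own, and the question's Coupons class would need its map value type changed from String to a coupon bean):

import com.fasterxml.jackson.databind.ObjectMapper;

import java.util.HashMap;
import java.util.Map;

public class CouponsWrapper {

    public static class Coupon {
        public String title;
        public String description;
        public String type;
        public String code;
        public String expiry;
        public String link;
    }

    private Map<String, Coupon> coupons = new HashMap<>();

    public Map<String, Coupon> getCoupons() { return coupons; }
    public void setCoupons(Map<String, Coupon> coupons) { this.coupons = coupons; }

    public static void main(String[] args) throws Exception {
        // Shortened, quoted version of the JSON from the question.
        String json = "{\"coupons\":{\"1\":{\"title\":\"Mode von\",\"type\":\"offer\",\"code\":\"-\"},"
                    + "\"2\":{\"title\":\"Prime 1\",\"type\":\"offer\",\"code\":\"-\"}}}";
        CouponsWrapper parsed = new ObjectMapper().readValue(json, CouponsWrapper.class);
        parsed.getCoupons().forEach((index, c) ->
                System.out.println(index + ": " + c.title + " / " + c.type));
    }
}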
I am using com.mongodb.util.JSON.parse to parse a JSON file to a DBObject. How do I specify dates, refs, and object IDs in the JSON file?
Dates : { myDate: {$date: "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'" } } // Date in the mentioned ISODate string format.
Refs : { myRef : { $ref : <collname>, $id : <idvalue>[, $db : <dbname>] } } // collname is collection name, idvalue is the _id of the referred document and optionally dbname is the database the document is in.
ObjectIds : { _id : {$oid: "4e942f36de3eda51d5a7436c"} }
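As a rough illustration of how these forms come back from the parser (the values below are arbitrary, and com.mongodb.util.JSON is deprecated in newer drivers):

import com.mongodb.DBObject;
import com.mongodb.util.JSON;

public class ExtendedJsonParseExample {
    public static void main(String[] args) {
        // Arbitrary example document combining the three forms above.
        String json = "{ \"_id\" : { \"$oid\" : \"4e942f36de3eda51d5a7436c\" },"
                    + "  \"myDate\" : { \"$date\" : \"2012-01-01T00:00:00.000Z\" },"
                    + "  \"myRef\" : { \"$ref\" : \"otherCollection\", \"$id\" : \"4e942f36de3eda51d5a7436c\" } }";

        DBObject doc = (DBObject) JSON.parse(json);

        // The $oid, $date and $ref wrappers should come back as driver types
        // (ObjectId, java.util.Date, DBRef) rather than plain nested maps.
        System.out.println(doc.get("_id").getClass());
        System.out.println(doc.get("myDate").getClass());
        System.out.println(doc.get("myRef").getClass());
    }
}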
I have some data like this:
{id: 1, text: "This is a sentence about dogs", indices: ["sentence", "dogs"]}
{id: 2, text: "This sentence is about cats and dogs", indices: ["sentence", "cats", "dogs"]}
Where I have manually extracted key terms from the text and stored them as indices. I want to be able to do a search and order the results with the most matching indices. So for this example, I would like to be able to pass "cats" and "dogs" and get both objects returned, but id=2 should be first with score=2.
I first tried to use the DBCollection.group function
public DBObject group(DBObject key,
                      DBObject cond,
                      DBObject initial,
                      String reduce,
                      String finalize)
But I don't see a way to send parameters. I tried:
key: {id: true},
cond: {indices: {$in: ['cats', 'dogs']}},
initial: {score: 0},
reduce: function(doc, out){ out.score++; }
but obviously this will just return a count of 1 for each of the 2 objects.
I realised that I could send the keyword parameters as part of the initial object that seeds the reduce function.
final List<String> targetTerms = Arrays.asList("dogs", "cats");
final Datastore ds = ….
final DBCollection coll = ds.getCollection(Example.class);
BasicDBObject key = new BasicDBObject("_id", true);
BasicDBObject cond = new BasicDBObject();
cond.append("indices", new BasicDBObject("$in", targetTerms));
BasicDBObject initial = new BasicDBObject();
initial.append("score", 0);
initial.append("targetTerms", targetTerms);
String reduce = "function (obj, prev) { " +
" for (i in prev.targetTerms) {" +
" targetTerm = prev.targetTerms[i];"+
" for (j in obj.indices) {" +
" var index = obj.indices[j];"+
" if (targetTerm === index) prev.score++;" +
" }" +
" }" +
"}";
String fn = null;
final BasicDBList group = (BasicDBList) coll.group(key, cond, initial, reduce, fn);
I get results like this:
{ "_id" : { "$oid" : "4dcfe16c05a063bb07ccbb7b"} , "score" : 1.0 , "targetTerms" : [ "virtual" , "library"]}
{ "_id" : { "$oid" : "4dcfe17d05a063bb07ccbb83"} , "score" : 2.0 , "targetTerms" : [ "virtual" , "library"]}
This got me the score values that I wanted, and I am able to narrow down the entries to be processed with more specific conditional rules.
So I have a few questions:
Is this a good way to send "parameters" to the group action's reduce function?
Is there a way to sort (and perhaps limit) the output inside MongoDB before returning it to the client?
Will this break on sharded MongoDB instances?