I'm currently working on a diagram / tree graph generator. To achieve this I'm using two libraries: GraphView to generate the graph and ZoomLayout to move around the view. The main idea of this project is to save all the JSONs in an AWS database and then load a list of all the created graphs.
Since the GraphView library doesn't have the capability to change or add data in the nodes, I decided to create a JSON parser so that I can apply new changes and redraw the graph. So far I have managed to create a JSON parser that can read the following format.
example.json
{
"name": "A",
"children": [
{
"name": "B",
"children": [
{
"name": "G",
"children": [
{}
]
}
]
},
{
"name": "C",
"children": [
{
"name": "D",
"children": [
{
"name": "E",
"children": [
{}
]
},
{
"name": "F",
"children": [
{}
]
}
]
}
]
}
]
}
The parser uses a class named Nodes to iterate over all the nodes within the JSON string.
Nodes.kt
class Nodes(
var name: String,
val children: MutableList<Nodes>
){
override fun toString(): String {
return "\nName:$name\nChildren:[$children]"
}
fun hasChildren(): Boolean {
return !children.isNullOrEmpty()
}
}
With that JSON, the app generates the following graph:
The problem
In this section of the app you can enter a new string which will replace the current one in the selected node. This is done by editing the raw string without any mapping, using the String.replace() method. But that approach doesn't let me erase nodes or add new ones to the current JSON string.
To map the JSON properly I decided to make use of GSON and a MutableList. First I set up the MutableList with the data from the current JSON, and then I add a new node in front of the clicked node. The issue is that when I try to print the MutableList as a string the app throws a StackOverflowError. This also happens if I try to map it back to JSON using GSON.
This is the code that I use to replace the JSON.
// Method used to replace the current JSON with a new one by replacing the selected node with new data
private fun replaceJson(oldData: String, newData: String): Graph {
newGraph = Graph()
newStack.clear()
mNodesList.clear()
val gson = Gson()
var mappedNodes: Nodes = gson.fromJson(json, Nodes::class.java)
val mapper = ObjectMapper()
newStack.push(mappedNodes)
while (newStack.isNotEmpty()) {
replaceData(newStack.pop(), oldData, newData)
}
var position = -1
for(element in mNodesList){
if(element.name == currentNode!!.data.toString()){
println("Adding new node to ${mNodesList.indexOf(element)}")
position = mNodesList.indexOf(element)
}
}
mNodesList.add(position + 1, Nodes(newData, mNodesList))
for(node in mNodesList){
println(node.name)
}
//Stackoverflow
// println(mNodesList.toString())
//Stackoverflow
// val newJson = mapper.writerWithDefaultPrettyPrinter().writeValueAsString(mNodesList)
// println("json::: \n $newJson")
json = json.replace(oldData, newData, ignoreCase = false) //WIP Not final
return newGraph
}
// This method replaces a node's data with the newly entered data.
// It is driven by the stack in replaceJson, which walks all children and names in order.
private fun replaceData(nodes: Nodes, oldData: String, newData: String) {
for (node in nodes.children) {
if (node.hasChildren()) {
if (node.name == oldData) {
mNodesList.add(node)
newGraph.addEdge(Node(nodes.name), Node(newData)) //<--- replaces data
newStack.push(Nodes(newData, node.children))
} else {
mNodesList.add(node)
newGraph.addEdge(Node(nodes.name), Node(node.name))
newStack.push(node)
}
}
}
}
I read some posts where people use HashMaps, but I'm quite lost and I don't think I understand how JSON mapping works.
Summary
I'm looking for a way to add and delete nodes from the JSON string provided above, but I don't quite know how to fix what I already have. This is the first time I'm working with JSON and lists in Kotlin, so I would greatly appreciate any information or help; any insight on how to improve this or work around it will also be appreciated.
If anyone wants to see the code it's currently public in my GitHub repository.
PS: I tried to provide as much information as possible; if the question is still unclear I will try to improve it.
In case anyone is in a similar situation, here's the solution I came up with.
I ended up simplifying the JSON structure I was using, since the nested JSON was giving me too many problems. I decided to link children and parents in a different way. This is the current JSON structure:
{
"nodes": [
{
"data": "A",
"parent": "root"
},
{
"data": "B",
"parent": "A"
},
{
"data": "C",
"parent": "A"
},
{
"data": "G",
"parent": "B"
},
{
"data": "D",
"parent": "C"
},
{
"data": "E",
"parent": "D"
},
{
"data": "F",
"parent": "D"
},
{
"data": "H",
"parent": "F"
},
{
"data": "I",
"parent": "H"
},
{
"data": "J",
"parent": "I"
},
{
"data": "K",
"parent": "J"
}
]
}
I also remade my Nodes class and separated it into two parts: Nodes.kt and SingleNode.kt.
Now the Nodes class only contains a list of SingleNode, and SingleNode contains the data of the node and its parent.
/**
 * This class represents all the nodes.
 * @param nodes a list of all the existing nodes
 */
class Nodes(var nodes: List<SingleNode>)

/**
 * This class represents a single node.
 * @param data name of the node
 * @param parent name of its parent or upper node
 */
class SingleNode(var data: String, var parent: String)
Once I had those classes, I used the GSON library to map the JSON string into a Nodes object.
val tree: Nodes = gson.fromJson(json, Nodes::class.java)
With this structure I was able to map the nodes into a LinkedHashMap, which I can then use to add, remove or edit any key and value (which represent the name of the node and the parent).
By using a mutableListOf<SingleNode> and GSON I can then recreate a JSON based on the previously modified HashMap.
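Roughly, the round trip looks like this (a simplified sketch; the addNode helper and its parameter names are just placeholders, and error handling is omitted):
import com.google.gson.Gson

// Sketch of the flat-list round trip: JSON -> Nodes -> LinkedHashMap -> Nodes -> JSON.
// `json` is the flat {"nodes":[...]} string shown above.
fun addNode(json: String, newData: String, parent: String): String {
    val gson = Gson()
    val tree: Nodes = gson.fromJson(json, Nodes::class.java)

    // Key = node name, value = parent name, kept in insertion order.
    val map = LinkedHashMap<String, String>()
    for (node in tree.nodes) {
        map[node.data] = node.parent
    }
    map[newData] = parent // add (or edit/remove) entries as needed

    // Rebuild the JSON from the modified map.
    val rebuilt = mutableListOf<SingleNode>()
    for ((data, parentName) in map) {
        rebuilt.add(SingleNode(data, parentName))
    }
    return gson.toJson(Nodes(rebuilt))
}
Because every node only stores its parent's name, there are no circular references, so printing and GSON serialization work without the earlier StackOverflowError.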
Related
I'm trying to find all the objects in a list of objects that contain a particular field name. For example
"list": [
{
"namesArray": [],
"name": "Bob",
"id": "12345",
},
{
"namesArray": [
"Jenny"
],
"name": "Ned",
},
{
"namesArray": [],
"name": "Jane",
"id": "gkggglg",
}
]
The class looks like this:
class ListItem {
String id;
String name;
List<String> namesArray;
}
So basically I need to find all the objects that contain the field "id". Something like:
list.stream().filter(li -> li.equals("id")).collect(Collectors.toList());
I've tried following this page and it isn't quite what I want. I don't care about the values of the ids, just whether or not the object has the field at all.
From the comments, we get your actual requirement:
So all objects with a non-null id field.
It's easy to adapt the code you've already got using streams and a filter - you just need to change the predicate that's being passed to the filter method. That predicate needs to return true for any value you want to be in the result, and false for any value you want to be discarded. So all you need is:
var result = list
.stream()
.filter(item -> item.id != null)
.collect(Collectors.toList());
I want to retrieve the map value (an ArrayList) according to its key; help me with the code below.
public class MainActivity extends AppCompatActivity {
Map<String,ArrayList<Model>> map=new ArrayMap<>(); <----
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
......
.....
for (int j=0;j<jsonArray.length();j++)
{
JSONObject tempObj=jsonArray.getJSONObject(j);
String price =tempObj.getString("price");
String product=tempObj.getString("Product");
String qty=tempObj.getString("Qty");
modelList.add(new Model(id,price,product,qty,date));
map.put(key,modelList); <----
modelList.clear();
}
......
......
//here to retrieve that map in the same mainactivity
CODE....??????? <----
Here is my JSON, where the months are dynamic (jan, april, jun, ... they are not constant).
{
"response": "success",
"servicecode": "134",
"forecast": {
"month": {
"jan": [
{
"id": "1",
"price": "12",
"Product": "1086",
"Qty": "14"
},
{
"id": "2",
"price": "19",
"Product": "1746",
"Qty": "45"
}
],
"april": [
{
"id": "3",
"price": "89",
"Product": "1986",
"Qty": "15"
},
{
"id": "1",
"price": "12",
"Product": "1086",
"Qty": "145"
}
],
"jun": [
{
"id": "81",
"price": "132",
"Product": "17086",
"Qty": "1445"
},
{
"id": "11",
"price": "132",
"Product": "10786",
"Qty": "1445"
}
]
}
},
"message": "Competitor Sales."
}
What I did is take the whole response apart by key and store it in a Map. Now what I want to do is display the array for each month in a ViewPager. So tell me whether a Map will do, or suggest an alternative.
You're definitely on the right track with maps, since you're able to assign each month its own object.
However, you may want to consider using a LinkedHashMap instead of an ArrayMap, because a LinkedHashMap preserves the insertion order, while ArrayMap's docs do not mention preserving insertion order anywhere.
The reason preserving insertion order is important is that the month entries in the JSON are presented in the order you want to display them in. Therefore, if you parse "January" first, it is guaranteed to be at the front of the Map.
You can declare and initialize a LinkedHashMap as follows:
Map<String, ArrayList<Model>> map = new LinkedHashMap<>();
I am assuming that you have correctly parsed this JSON into your objects, since you did not say anything about the map being incorrectly populated, and judging by the placement of your arrows ("<----").
Now, for the code to actually use the Map.
for (String monthName : map.keySet()) {
ArrayList<Model> modelArray = map.get(monthName);
//now do whatever your view pager needs to do with this Model and its fields
for (Model model : modelArray) {
String price = model.getPrice(); //if you have a getter
String product = model.product; //if the field is public
}
}
Notes:
Using the LinkedHashMap's keySet is valid because the documentation guarantees that the key set will be in insertion order; this SO answer confirms it as well.
I personally recommend GSON for parsing JSON objects, but since you seem to be okay with the parsing, this is not a big deal.
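For reference, a GSON version of that parsing could look roughly like the sketch below (in Kotlin, as used elsewhere on this page; the same Gson calls work from Java). ForecastResponse and Forecast are made-up wrapper names, and Model is simplified to just the fields shown in the JSON:
import com.google.gson.Gson
import com.google.gson.annotations.SerializedName

// Simplified model matching the JSON keys ("Product" and "Qty" are capitalised in the feed).
class Model(
    val id: String,
    val price: String,
    @SerializedName("Product") val product: String,
    @SerializedName("Qty") val qty: String
)

// Wrapper classes for the response; declaring the field as LinkedHashMap keeps the month order.
class Forecast(val month: LinkedHashMap<String, ArrayList<Model>>)
class ForecastResponse(val forecast: Forecast)

fun parseMonths(json: String): Map<String, ArrayList<Model>> =
    Gson().fromJson(json, ForecastResponse::class.java).forecast.month
The returned map can then be iterated in insertion order (jan, april, jun, ...) exactly as in the keySet loop above.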
If you populated the ArrayMap correctly, you could get the list back as below:
ArrayList<Model> models = map.get("april");
Model model = models.get(0); // or loop over the list
String id = model.id;
String price = model.price;
.....
You should ensure the key variable in your code is a month name.
Is the month the key of your map?
If so, then you could simply define a list of months and retrieve the values accordingly, something like this:
for(String month: Arrays.asList("jan", "feb", "mar", "apr", "...")) {
if(map.containsKey(month)) { //skip months if not present
//--> code here
ArrayList<Model> models = map.get(month);
}
}
I'm trying to take an array of objects from a JSON file and I have an issue.
path.get("wgcTournaments.items")
What path should I use to get all the items (item0, item1, item2, ...) in items?
Can you please give me advice on how to do it?
JSON example
{
"wgcTournaments": {
"items": {
"jcr:primaryType": "nt:unstructured",
"item0": {
"jcr:primaryType": "nt:unstructured",
"test": "test",
"test1": "test1"
},
"item1": {
"jcr:primaryType": "nt:unstructured",
"test": "test",
"test1": "test1"
},
"item2": {
"jcr:primaryType": "nt:unstructured",
"test": "test",
"test1": "test1"
},
"item3": {
"jcr:primaryType": "nt:unstructured",
"test": "test",
"test1": "test1"
}
}
}
}
The best approach would be to filter each item from the items object, but I don't understand how to do it with JSON Path.
Finally I found a solution to my question.
If you want to get each item from items you need to use this JSON path:
path.getObject("wgcTournaments.items*.find{it.key.startsWith('item')}.value", ItemClass[].class);
Note:
This was Rest-Assured, which uses GPath; you can find more details here:
http://docs.groovy-lang.org/latest/html/documentation/#_gpath
You are trying to deserialize an object into an array of objects. Either your code or your JSON (most likely) is wrong.
If you want to deserialize items as an array, your JSON should be the following:
{
"wgcTournaments": {
"items": [
{
"jcr:primaryType": "nt:unstructured",
"item0": {},
"item1": {},
"item2": {},
"item3": {}
}
]
}
}
Otherwise, if your JSON is correct, you should deserialize your JSON using the following line:
path.getObject("wgcTournaments.items", MyClass.class)
EDIT: After your edit, this seems to be what you want:
If your JSON is correct and you indeed want an array, I assume that each itemX is the key and {} the corresponding value. In this case, you have to know that you cannot have an associative array in JSON; you need a custom solution to deserialize it, because your associative array will be parsed as an object (one such approach is sketched below).
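One such custom approach, sketched in Kotlin here to match the rest of this page (a reasonably recent Rest-Assured is assumed; ItemClass is the class from the answer above, assumed to have the test/test1 fields; Gson is used only to convert each nested map into an ItemClass):
import com.google.gson.Gson
import io.restassured.path.json.JsonPath

// Read "items" as a map of key -> nested map, keep only the "itemX" entries,
// and convert each value into an ItemClass.
fun readItems(path: JsonPath): List<ItemClass> {
    val gson = Gson()
    val items = path.getMap<String, Any>("wgcTournaments.items")
    return items
        .filterKeys { it.startsWith("item") }   // drops "jcr:primaryType"
        .values
        .map { gson.fromJson(gson.toJsonTree(it), ItemClass::class.java) }
}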
Using Java 8 and Jackson, I am trying to parse JSON from here:
REST Endpoint
The problem is, there is a single value that sometimes is an empty array and sometimes is a HashMap. I pasted two excerpts below, check out cor_icon in each:
"2630775": {
"id": "2630775",
"label": "breakfast grill",
"description": "breakfast sandwiches, turkey sausage and bacon, home fries, pork sausage and bacon, omelets made to order with whole eggs, egg whites, eggbeaters and assorted fillings",
"zero_entree": "0",
"cor_icon": {
"18": "humane"
},
Or:
"4779080": {
"id": "4779080",
"label": "sweet chili vegan soup\nchicken egg drop soup",
"description": "",
"zero_entree": "0",
"cor_icon": {
"1": "vegetarian",
"4": "vegan"
},
As opposed to:
"2630777": {
"id": "2630777",
"label": "morning pastries",
"description": "assorted danish, muffins, bagels, coffee cakes and tea breads",
"zero_entree": "0",
"cor_icon": [],
"ordered_cor_icon": [],
For my Java setter:
@JsonSetter("cor_icon")
public void setCorIcon(HashMap ci)
{
cor_icon = ci;
}
or better:
@JsonSetter("cor_icon")
public void setCorIcon(HashMap<String, String> ci)
{
cor_icon = ci;
}
This works fine when there is data, but when there is not (when I get "cor_icon": []) Jackson throws an exception saying it cannot deserialize a HashMap from cor_icon. If I change cor_icon to an ArrayList, then when there is data Jackson complains that it can't deserialize an ArrayList from cor_icon.
So what's the trick?
You need to specify a type that is valid for both JSON Arrays and JSON Objects. Two obvious choices are java.lang.Object and JsonNode: in the first case you'll get either a List or a Map (and need to cast); in the latter case an ArrayNode or an ObjectNode.
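For example, a minimal sketch of the JsonNode route (written in Kotlin as elsewhere on this page; the class and helper names are made up, and the same Jackson calls work from Java):
import com.fasterxml.jackson.annotation.JsonIgnoreProperties
import com.fasterxml.jackson.annotation.JsonProperty
import com.fasterxml.jackson.databind.JsonNode
import com.fasterxml.jackson.databind.ObjectMapper

@JsonIgnoreProperties(ignoreUnknown = true)
class MenuItem {
    // Accepts both shapes: an ObjectNode like {"18": "humane"} or an empty ArrayNode [].
    @field:JsonProperty("cor_icon")
    var corIcon: JsonNode? = null

    // Convenience accessor that flattens the node into a map; empty when the feed sent [].
    fun corIconMap(): Map<String, String> {
        val node = corIcon
        return if (node != null && node.isObject)
            node.fields().asSequence().associate { (key, value) -> key to value.asText() }
        else
            emptyMap()
    }
}

fun main() {
    val json = """{"id": "2630777", "label": "morning pastries", "cor_icon": []}"""
    val item = ObjectMapper().readValue(json, MenuItem::class.java)
    println(item.corIconMap()) // {} here, {18=humane} for the first excerpt
}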
I am developing a wireless network survey tool built with Java (Swing GUI) and a MongoDB data storage solution. I am new to MongoDB and hardly a Java guru so I need some help. I want to find if a network exists in my database and append heard points to the network document. If the network doesn't exist, I would like to create a document for that network and add the heard points. I have been trying to fix this for days but I just can't seem to wrap my head around the solution. Also, it would be nice if the BSSID was the unique id so I don't get any duplicate networks. My ideal data structure would look something like this:
{ 'bssid' : 'ca:fe:de:ad:be:ef',
'channel' : 6,
'heardpoints' : {
'point' : { 'lat' : 36.12345, 'long' : -75.234564 },
'point' : { 'lat' : 36.34567, 'long' : -75.345678 }
}
This is what I have tried so far. It seems to add the initial point but it does not add additional points after the first one was made.
BasicDBObject query = new BasicDBObject();
query.put("bssid", pkt[1]);
DBCursor cursor = coll.find(query);
if (!cursor.hasNext()) {
// Document doesnt exist so create one
BasicDBObject document = new BasicDBObject();
document.put("bssid", pkt[1]);
BasicDBObject heardpoints = new BasicDBObject();
BasicDBObject point = new BasicDBObject();
point.put("lat", latitude);
point.put("long", longitude);
heardpoints.put("point", point);
document.put("heardpoints", heardpoints);
coll.insert(document);
} else {
// Document exists so we will update here
DBObject network = cursor.next();
BasicDBObject heardpoints = new BasicDBObject();
BasicDBObject point = new BasicDBObject();
point.put("lat", latitude);
point.put("long", longitude);
heardpoints.put("point", point);
network.put("heardpoints", heardpoints);
coll.save(network);
}
I feel like I am way off the reservation on this one. Any support would help, thanks a lot!
UPDATE
I am using the upsert suggestion but still having some issue. No doubt this will work for me, I am just not doing it correctly. I am still not getting any new points past the first one added.
BasicDBObject query = new BasicDBObject("bssid", pkt[1]);
System.out.println(query);
DBCursor cursor = coll.find(query);
System.out.println(cursor);
try {
DBObject network = cursor.next();
System.out.println(network);
network.put("heardpoints", new BasicDBObject("point",
new BasicDBObject("lat", latitude)
.append("long", longitude)));
coll.update(query, network, true, false);
} catch (NoSuchElementException ex) {
System.err.println("mongo error");
} finally {
cursor.close();
}
You've got two ways to address this really, it just depends on how you actually want to use the data. In either case the first thing to address is your "ideal data structure", and mostly because it is invalid. This is the wrong part:
'heardpoints' : {
'point' : { 'lat' : 36.12345, 'long' : -75.234564 },
'point' : { 'lat' : 36.34567, 'long' : -75.345678 }
}
So this "hash/map" is invalid because you have the same "key" named twice. You cannot do that and you probably want and "array" instead, as well as something that you have a hope of using GeoSpatial queries on later when you want to:
Array Approach
"heardpoints": [
{
"geometry": {
"type": "Point",
"coordinates": [-75.234564, 36.12345 ]
},
"time": ISODate("2014-11-04T21:09:18.437Z")
},
{
"geometry": {
"type": "Point",
"coordinates": [ -75.345678, 36.34567 ]
},
"time": ISODate("2014-11-04T21:10:28.919Z")
}
]
And note the correct ordering of "lon" and "lat", which is how MongoDB and the GeoJSON spec it follows do it.
Now this is for the form where you are going to keep all of your heard data in a "single document" per "bssid" value, with each location kept in an array. Note that this is not necessarily an "upsert" per se, except in the first creation instance. The main intent is to "update" the same "bssid" document. Just in shell form now, with a Java syntax translation later:
db.collection.update(
{ "bssid": "ca:fe:de:ad:be:ef" },
{
"$setOnInsert": { "channel": 6 },
"$push": {
"heardpoints": {
"$each": [{
"geometry": {
"type": "Point",
"coordinates": [-75.234564, 36.12345 ]
},
"time": ISODate("2014-11-04T21:09:18.437Z")
}],
"$sort": { "time": -1 },
"$slice": 20
}
}
},
{ "upsert": true }
);
Whatever the language and API representation, there are basically two parts to a MongoDB update operation. Essentially this:
[ < Query >, < Update > ]
Depending on the API presentation there are technically "three" parts, where the third is Options, but beyond the basic consideration of the "upsert" option, it is important to understand how both the Query and Update document portions are handled in an update operation.
The most important thing to understand about the Update document is that it has two forms. If you just supply "keys" and "values" in a standard object form, then whatever is supplied will "overwrite" any existing content in a matched document. The other form (which will be used in all examples) is to use "update operators", which allow "parts" of the document to be modified or "augmented". That is an important distinction. But on with the examples.
On a blank collection, or at least one where the specified "bssid" value does not exist, a new document will be created containing that "bssid" field value. Additionally, there is some other behavior that is going to happen.
There is a special "update operator" in here called $setOnInsert. Just like the conditions specified in the Query portion of the statement, any fields and values mentioned here are only "created" in the document when a "new" document is inserted. So if the document matching the query condition was found then none of the operations here are actually performed to change the found document. This is a good place to set initial values and also limit the write activity on the document to just the fields where it is required.
The second section in the Update document is another "update operator" called $push. As expected by the common term in computing languages, this "adds items" to an "array". So on document creation then a new array is made and the items are appended or otherwise added to the "existing" array content in the found document.
There are some interesting modifiers here which have their own purpose. $each is a modifier that allows more than one item to be sent to an operator like $push at a time. We are only using it for a single item, but its use is generally required with the other two modifiers we are interested in.
The next is $sort, which is applied to the array elements present in the document in order to "sort" them by the given condition. In this case there is a "time" field on the array elements, so the "sort" makes sure that, as new elements are added, the contents of the array are always ordered so that the "newest" entries are at the front of the array.
The final one is $slice, which complements $sort by essentially specifying a "cap" for the array. So, just to make sure our documents never get too large, the $slice modifier, which is applied "after" the $sort modifier has done its work, "removes" any entries beyond the specified "maximum" and maintains the array at that length. So quite a useful feature.
Of course if you did not care about a "time" value then there is another way to handle this so that the "coordinate" data is only kept for "unique" combinations. That way is to use the $addToSet operator to manage array or "set" entries by itself:
db.collection.update(
{ "bssid": "ca:fe:de:ad:be:ef" },
{
"$setOnInsert": { "channel": 6 },
"$addToSet": {
"heardpoints": {
"$each": [{
"geometry": {
"type": "Point",
"coordinates": [-75.234564, 36.12345 ]
}
}]
}
}
},
{ "upsert": true }
);
Now that does not actually need the $each modifier, but it's just left there for a future point. $addToSet essentially looks at the existing array content and compares it to the element you have supplied. Where that data does not exactly match something already present in the array, it is added to the "set". Otherwise, nothing happens since the data is already there.
So if you just want the data collected for specific points where they vary, then this is a good approach. But there is a "catch", and a couple of them actually, that are worth mentioning.
Suppose you want to keep only 20 entries, as was mentioned before. While $addToSet supports the $each modifier, unfortunately the other modifiers such as $slice are not supported. So you can't "maintain a cap" with a single update statement and you would in fact have to issue "two" update operations in order to achieve this:
db.collection.update(
{ "bssid": "ca:fe:de:ad:be:ef" },
{
"$setOnInsert": { "channel": 6 },
"$addToSet": {
"heardpoints": {
"$each": [{
"geometry": {
"type": "Point",
"coordinates": [-75.234564, 36.12345 ]
}
}]
}
}
},
{ "upsert": true }
);
db.collection.update(
{ "bssid": "ca:fe:de:ad:be:ef" },
{
"$setOnInsert": { "channel": 6 },
"$push": {
"heardpoints": {
"$each": [],
"$slice": 20
}
}
}
)
But even so we have a new problem here. Aside from now requiring "two" operations, keeping this cap has another problem, which is basically that a "set" is "not ordered" in any way. So you can limit the total number of items in the list with the second update, but there is no way to remove the "oldest" item, for example.
In order to do this you want a "time" field for the "last update", but yes, there is a catch again. Once you supply a "time" value, the "distinct data" that makes up a "set" no longer holds. An $addToSet operation considers the following to be two "different" entries, since all fields, and not just the "coordinate" data, are compared:
"heardpoints": [
{
"geometry": {
"type": "Point",
"coordinates": [-75.234564, 36.12345 ]
},
"time": ISODate("2014-11-04T21:09:18.437Z")
},
{
"geometry": {
"type": "Point",
"coordinates": [-75.234564, 36.12345 ]
},
"time": ISODate("2014-11-04T21:10:28.919Z")
}
]
Where the intent is to just "update the time" on the existing point at the given coordinates, you need to take a different approach. But again this is two updates, and in reverse order: you try to update a document first and then do something else if that does not succeed, meaning the "upsert" attempt is the second operation:
var result = db.collection.update(
{
"bssid": "ca:fe:de:ad:be:ef",
"heardpoints.geometry.coordinates": [-75.234564, 36.12345 ]
},
{
"$set": {
"heardpoints.$.time": ISODate("2014-11-04T21:10:28.919Z")
}
}
);
// If result did not match and modify anything existing then perform the upsert
if ( result.nMatched == 0 ) {
db.collection.update(
{ "bssid": "ca:fe:de:ad:be:ef" }, // just this key and not the array
{
"$setOnInsert": { "channel": 6 },
"$push": {
"heardpoints": {
"$each": [{
"geometry": {
"type": "Point",
"coordinates": [-75.234564, 36.12345 ]
},
"time": ISODate("2014-11-04T21:09:18.437Z")
}],
"$sort": { "time": -1 },
"$slice": 20
}
}
},
{ "upsert": true }
);
}
So two separate operations, where the first tries to "update" an existing array entry by querying for its position. That first operation cannot be an upsert, since it would create a new document containing the same "bssid" and the array entry that was not found. That is not allowed anyway with the positional $ operator, which uses the matched position of the found element so that that element can be altered via the $set operator.
In the Java invocation there is a WriteResult type that is returned which can be used like this:
WriteResult writeResult = collection.update(query1, update1, false, false);
if ( writeResult.getN() == 0 ) {
// Upsert would be tried if the array item was not found
writeResult = collection.update(query2, update2, true, false);
}
If something was not updated then the serialized content looks like this:
{ "serverUsed" : "192.168.2.3:27017" , "ok" : 1 , "n" : 0 , "updatedExisting" : true}
Which means you basically test the n value to see what happened, and make your decision on whether to "update" the array item or "push" a new one depending on whether the query matched that array item or not.
Document Approach
The general conclusion from the above is that where you want to keep distinct data for the "coordinates" and just modify a "time" entry then the above process can get messy. The operations are not ideally atomic, and though there can be some tuning, it is probably not well suited to high volume updates.
This is a case then where the logic is to "remove" the array storage, and instead store each distinct "point" in its own document with the related "bssid" field. This simplifies the decision of whether to update or "insert" a new one into a single operation model. Documents in the collection now look like this:
{
"bssid": "ca:fe:de:ad:be:ef",
"channel": 6,
"geometry": {
"type": "Point",
"coordinates": [-75.234564, 36.12345 ]
},
"time": ISODate("2014-11-04T21:09:18.437Z")
},
{
"bssid": "ca:fe:de:ad:be:ef",
"channel": 6,
"geometry": {
"type": "Point",
"coordinates": [ -75.345678, 36.34567 ]
},
"time": ISODate("2014-11-04T21:10:28.919Z")
}
Distinct in their own collection and not bound in the same document under an array. There is data duplication but the "update" process is now much simplified:
db.collection.update(
{
"bssid": "ca:fe:de:ad:be:ef",
"geometry": {
"type": "Point",
"coordinates": [-75.234564, 36.12345 ]
}
},
{
"$setOnInsert": { "channel": 6 },
"$set": { "time": ISODate("2014-11-04T21:10:28.919Z") }
},
{ "upsert": true }
)
And all that does is match a document based on the supplied "bssid" and "point" values, either "updating" the "time" where it matched or just inserting a new document with all the values where that "bssid" and "point" data was not found.
The overall case is that while this started off with simple needs, where it was fine to "embed" the points in an array, maintaining more complex needs can make that storage form a pain to use. On the other hand, using separate documents in the collection has its benefits, but then you do have to do your own work to "clean up" entries beyond any cap limits you might want. It is arguable, though, that this does not necessarily need to be a "real time" operation.
Different approaches, so work with the one that suits you best. This is just a guide to implementing it either way, showing the pitfalls and solutions. What works best for you, only you can tell.
This really is more about the technique than the specific Java coding. That part is not hard, so here is just some of the most difficult structure from above for reference:
DBObject update = new BasicDBObject(
"$setOnInsert", new BasicDBObject(
"channel", 6
)
).append(
"$push", new BasicDBObject(
"heardpoints", new BasicDBObject(
"$each", new DBObject[]{
new BasicDBObject(
"geometry",
new BasicDBObject("type","Point").append(
"coordinates", new double[]{-75.234564, 36.12345}
)
).append(
"time", new DateTime(2014,1,1,0,0,DateTimeZone.UTC).toDate()
)
}
).append(
"$sort", new BasicDBObject(
"time", -1
)
).append("$slice", 20)
)
);
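For completeness, the simpler "document approach" upsert from the last shell example translates to the same legacy driver API roughly as below (sketched in Kotlin; the recordPoint helper and its parameter names are made up):
import com.mongodb.BasicDBObject
import com.mongodb.DBCollection
import java.util.Date

// One upsert per reading: match on bssid + point, set the channel only on insert,
// and refresh the time whenever the same point is heard again.
fun recordPoint(collection: DBCollection, bssid: String, lon: Double, lat: Double, channel: Int) {
    val query = BasicDBObject("bssid", bssid)
        .append("geometry", BasicDBObject("type", "Point")
            .append("coordinates", doubleArrayOf(lon, lat)))

    val update = BasicDBObject("\$setOnInsert", BasicDBObject("channel", channel))
        .append("\$set", BasicDBObject("time", Date()))

    collection.update(query, update, true, false) // upsert = true, multi = false
}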