I need to replace a whole existing MongoDB document from Java, instead of setting every field. Is there any way to do this? I am using Mongo with Morphia.
Right now I am setting the fields one by one; the following is my code:
final Query<Timesheet> findQuery = ds.createQuery(Timesheet.class)
        .field("procId").equal(procId);
final UpdateOperations<Timesheet> updateOperations = ds.createUpdateOperations(Timesheet.class)
        .set("wheelInTime", timesheet.getWheelInTime())
        .set("wheelOutTime", timesheet.getWheelOutTime())
        .set("tableOnTime", timesheet.getTableOnTime())
        .set("tableOffTime", timesheet.getTableOffTime());
final UpdateResults results = ds.updateFirst(findQuery, updateOperations);
You can 'overwrite' any entry in a MongoDB collection by simply creating a new DBObject with the same _id field and saving it to the database. So just set the fields on your object as you would on any Java object and use myCollection.save(obj).
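For instance, with the plain Java driver it could look roughly like this (a sketch; db, the collection name and the field set are assumptions, not code from the question):
// Hypothetical collection handle; obtain db however you already do.
DBCollection timesheets = db.getCollection("timesheet");

// Build a replacement document that carries the SAME _id as the stored one.
DBObject replacement = new BasicDBObject("_id", existingId)
        .append("procId", procId)
        .append("wheelInTime", timesheet.getWheelInTime())
        .append("wheelOutTime", timesheet.getWheelOutTime());

// save() replaces the whole stored document because that _id already exists.
timesheets.save(replacement);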
Just save the object and it will overwrite the document with the same @Id. This can be done with one line of code:
dao.save(timesheet);
A more complete example of using the Morphia DAO:
class Dao extends BasicDAO<Timesheet, String> {
    Dao(Datastore ds) {
        super(Timesheet.class, ds);
    }
}
Datastore ds = morphia.createDatastore(mongoClient, DB_NAME);
Dao dao = new Dao(ds);
dao.save(timesheet);
I am using the latest version of Alfresco, 5.1.
One of my requirements is to create properties (key/value pairs) where the user enters both the key and the value.
So I have done that like this:
Map<QName, Serializable> props = new HashMap<QName, Serializable>();
props.put(QName.createQName("customProp1"), "prop1");
props.put(QName.createQName("customProp2"), "prop2");
ChildAssociationRef associationRef = nodeService.createNode(
        nodeService.getRootNode(storeRef),
        ContentModel.ASSOC_CHILDREN,
        QName.createQName(GUID.generate()),
        ContentModel.TYPE_CMOBJECT,
        props);
Now what I want to do is search for the nodes with these newly created properties. I was able to search for the newly created property like this:
public List<NodeRef> findNodes() throws Exception {
    authenticate("admin", "admin");
    StoreRef storeRef = new StoreRef(StoreRef.PROTOCOL_WORKSPACE, "SpacesStore");
    List<NodeRef> nodeList = null;

    Map<QName, Serializable> props = new HashMap<QName, Serializable>();
    props.put(QName.createQName("customProp1"), "prop1");
    props.put(QName.createQName("customProp2"), "prop2");
    ChildAssociationRef associationRef = nodeService.createNode(
            nodeService.getRootNode(storeRef),
            ContentModel.ASSOC_CHILDREN,
            QName.createQName(GUID.generate()),
            ContentModel.TYPE_CMOBJECT,
            props);
    NodeRef nodeRef = associationRef.getChildRef();

    String query = "@cm\\:customProp1:\"prop1\"";
    SearchParameters sp = new SearchParameters();
    sp.addStore(storeRef);
    sp.setLanguage(SearchService.LANGUAGE_LUCENE);
    sp.setQuery(query);

    try {
        ResultSet results = serviceRegistry.getSearchService().query(sp);
        nodeList = new ArrayList<NodeRef>();
        for (ResultSetRow row : results) {
            nodeList.add(row.getNodeRef());
            System.out.println(row.getNodeRef());
        }
        System.out.println(nodeList.size());
    } catch (Exception e) {
        e.printStackTrace();
    }
    return nodeList;
}
The indexer configuration in alfresco-global.properties is:
index.subsystem.name=buildonly
index.recovery.mode=AUTO
dir.keystore=${dir.root}/keystore
Now my questions are:
Is it possible to achieve the same using the solr4 indexer?
Or is there any way to use the buildonly indexer for a particular query?
In your query
String query = "@cm\\:customProp1:\"prop1\"";
remove the cm prefix: since you are building the QName on the fly, the property does not fall under the cm (ContentModel) namespace. So your query will be
String query = "@\\:customProp1:\"prop1\"";
Hope this works for you.
First, double-check whether you're simply experiencing eventual consistency, as described at the link below. If you are, and if this presents a problem for you, consider switching to CMIS queries while staying on SOLR.
http://docs.alfresco.com/5.1/concepts/solr-event-consistency.html
Other than this, check if the node has been indexed at all. If it has, take a closer look at how you build your query.
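If eventual consistency is the issue but you want to stay on SOLR, Alfresco 5.x also lets you request transactional (database-backed) execution for an individual query through SearchParameters, which is roughly the per-query equivalent of the buildonly behaviour. A rough sketch, with the caveat (an assumption on my part) that transactional metadata queries only cover a subset of the query syntax and need the property to be in a registered model:
SearchParameters sp = new SearchParameters();
sp.addStore(storeRef);
sp.setLanguage(SearchService.LANGUAGE_FTS_ALFRESCO);
sp.setQuery("=cm:title:\"prop1\""); // hypothetical query against a modelled property
// Run against the database when possible, fall back to the index otherwise.
sp.setQueryConsistency(QueryConsistency.TRANSACTIONAL_IF_POSSIBLE);
ResultSet results = serviceRegistry.getSearchService().query(sp);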
See also: How to find the list of unindexed files in Alfresco
When executing queries on a standalone Neo4J server using the RestCypherEngine, what is the best practice to retrieve a collection of nodes?
I have this code snippet running....
public DbService() {
    gd = new RestGraphDatabase("http://neo4jbox:7474/db/data/");
    engine = new RestCypherQueryEngine(gd.getRestAPI());
}

public String testData() {
    try (Transaction tx = gd.beginTx()) {
        QueryResult<Map<String, Object>> result = engine.query(
                "match (n:Person{username:'jomski2009'}) return n",
                null);

        Iterator<Map<String, Object>> itr = result.iterator();
        while (itr.hasNext()) {
            Map<String, Object> item = itr.next();
            log.info(item.get("n"));
        }
        tx.success();
        return result.toString();
    }
}
When I run the code, I get the following result...
services.DbService : http://neo4jbox:7474/db/data/node/177
which is a link to the node rather than the node itself. Now I know that if I return just a subset of the node's properties in the same query, that works well. What I'd like to know is: how do I retrieve the complete node object without necessarily specifying the properties in the query?
Thanks for your help guys.
That is just the toString representation of a RestNode; it still has the properties. The relationships are not fetched, though; those will be fetched on demand.
I would recommend trying to fetch primitive values over the wire with Cypher; that works best as it minimizes the transferred data and you get only what you need.
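For example, returning just the needed properties instead of the whole node could look like this (a sketch; the property names are assumptions):
QueryResult<Map<String, Object>> result = engine.query(
        "match (n:Person{username:'jomski2009'}) return n.username as username, n.name as name",
        null);
Iterator<Map<String, Object>> itr = result.iterator();
while (itr.hasNext()) {
    Map<String, Object> row = itr.next();
    log.info(row.get("username") + " - " + row.get("name"));
}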
Here is a simple pojo:
public class Description {
    private String code;
    private String name;
    private String norwegian;
    private String english;
}
And here is the code that applies an upsert to MongoDB via Spring's MongoTemplate:
Query query = new Query(Criteria.where("code").is(description.getCode()));
Update update = new Update()
        .set("name", description.getName())
        .set("norwegian", description.getNorwegian())
        .set("english", description.getEnglish());
mongoTemplate.upsert(query, update, "descriptions");
The line that builds the Update object specifies every field of the Description class manually. But if my Description object changes, my DAO layer breaks.
So is there a way to avoid doing this, so that all the fields of my Description class are applied to the update automatically?
E.g.
Update update = new Update().fromObject(description);
Note that my pojo does not extend DBObject.
I found a pretty good solution for this question
//make a new description here
Description d = new Description();
d.setCode("no");
d.setName("norwegian");
d.setNorwegian("norwegian");
d.setEnglish("english");
//build query
Query query = new Query(Criteria.where("code").is(d.getCode()));
//build update
DBObject dbDoc = new BasicDBObject();
mongoTemplate.getConverter().write(d, dbDoc); // this is the converter Spring itself uses for conversions
Update update = Update.fromDBObject(dbDoc);
//run it!
mongoTemplate.upsert(query, update, "descriptions");
Please note that Update.fromDBObject returns an update object with all the fields in dbDoc. If you just want to update the non-null fields, you should write a new method that excludes the null fields.
For example, say the front end posts a doc like the one below:
//make a new description here
Description d = new Description();
d.setCode("no");
d.setEnglish("norwegian");
We only need to update the non-null fields:
//return Update object
public static Update fromDBObjectExcludeNullFields(DBObject object) {
    Update update = new Update();
    for (String key : object.keySet()) {
        Object value = object.get(key);
        if (value != null) {
            update.set(key, value);
        }
    }
    return update;
}
//build update
Update update = fromDBObjectExcludeNullFields(dbDoc);
The solution for the newer spring-data-mongodb versions (2.X.X).
The API has evolved; since version 2.X.X there is:
Update.fromDocument(org.bson.Document object, String... exclude)
instead of (1.X.X):
Update.fromDBObject(com.mongodb.DBObject object, String... exclude)
The full solution:
//make a new description here
Description d = new Description();
d.setCode("no");
d.setName("norwegian");
d.setNorwegian("norwegian");
d.setEnglish("english");
Query query = new Query(Criteria.where("code").is(d.getCode()));
Document doc = new Document(); // org.bson.Document
mongoTemplate.getConverter().write(d, doc);
Update update = Update.fromDocument(doc);
mongoTemplate.upsert(query, update, "descriptions");
It works!
You can use save (if the document does not exist it inserts, otherwise it updates):
save(Object objectToSave, String collectionName)
See the javadoc for details.
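For instance, using the collection name from the question:
// Inserts a new document if none with the same _id exists, otherwise replaces it.
mongoTemplate.save(description, "descriptions");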
Just like the previous answers said, use the mongoTemplate.getConverter().write() and Update.fromDocument() functions. But I found that Update.fromDocument() won't add the "$set" key and won't work directly; the solution is to add "$set" yourself, as below (PS: I'm using version 2.2.1.RELEASE):
public static Update updateFromObject(Object object, MongoTemplate mongoTemplate) {
    Document doc = new Document();
    mongoTemplate.getConverter().write(object, doc);
    return Update.fromDocument(new Document("$set", doc));
}
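Used together with the query from the question, that could look like this:
Query query = new Query(Criteria.where("code").is(description.getCode()));
Update update = updateFromObject(description, mongoTemplate);
mongoTemplate.upsert(query, update, "descriptions");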
If you want to upsert POJOs that include a property String id;, you have to exclude the _id field in the fromDBObject method: Update.fromDBObject(dbDoc, "_id").
Otherwise you get the exception:
org.springframework.dao.DuplicateKeyException: { "serverUsed" : "127.0.0.1:27017" , "ok" : 1 , "n" : 0 , "updatedExisting" : false , "err" : "E11000 duplicate key error collection: db.description index: _id_ dup key: { : null }" , "code" : 11000}; nested exception is com.mongodb.MongoException$DuplicateKey: { "serverUsed" : "127.0.0.1:27017" , "ok" : 1 , "n" : 0 , "updatedExisting" : false , "err" : "E11000 duplicate key error collection: db.description index: _id_ dup key: { : null }" , "code" : 11000}
because the _id field of the first document is null:
{
"_id" : null,
...
}
The full code, based on @PaniniGelato's answer, would be:
public class Description {
    public String id;
    ...
}
Description d = new Description();
d.setCode("no");
d.setName("norwegian");
d.setNorwegian("norwegian");
d.setEnglish("english");
//build query
Query query = new Query(Criteria.where("code").is(d.getCode()));
//build update
DBObject dbDoc = new BasicDBObject();
mongoTemplate.getConverter().write(d, dbDoc); // this is the converter Spring itself uses for conversions
Update update = Update.fromDBObject(dbDoc, "_id");
//run it!
mongoTemplate.upsert(query, update, "descriptions");
Then the upsert works in both the insert and the update case. Corrections & thoughts are welcome ;)
This is what I am doing for the time being. It is not a very elegant way to do it, but it does save a precious DB call:
import java.io.IOException;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Query;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.util.JSON;

/**
 * Perform an upsert operation to update ALL FIELDS of an object using the native mongo driver's methods,
 * since mongoTemplate's upsert method doesn't allow it.
 *
 * @param upsertQuery
 * @param object
 * @param collectionName
 */
private void performUpsert(Query upsertQuery, Object object, String collectionName) {
    ObjectMapper mapper = new ObjectMapper();
    try {
        String jsonStr = mapper.writeValueAsString(object);
        DB db = mongoTemplate.getDb();
        DBCollection collection = db.getCollection(collectionName);
        DBObject query = upsertQuery.getQueryObject();
        DBObject update = new BasicDBObject("$set", JSON.parse(jsonStr));
        collection.update(query, update, true, false);
    } catch (IOException e) {
        LOGGER.error("Unable to persist the metrics in DB. Error while parsing object: {}", e);
    }
}
There are two cases here that need to be distinguished:
Update an item that was previously fetched from the DB.
Update or insert (upsert) an item you created by code.
In case 1) you can simply use mongoTemplate.save(pojo, "collection"), because your POJO will already have a filled ObjectId in its id field.
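For case 1 that could be as simple as the following sketch (the lookup field and the setter are assumptions):
// Fetch the existing item; Spring Data fills its id field.
MyDomainClass pojo = mongoTemplate.findOne(
        Query.query(Criteria.where("email").is("user1@domain.com")), MyDomainClass.class);
pojo.setSomeField("new value"); // hypothetical setter
mongoTemplate.save(pojo, "collection"); // updates in place, since the id is already set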
In case 2) you have to explain to Mongo what "already exists" means for your domain model: by default the mongoTemplate.save() method updates an existing item if there is one with that same ObjectId. But with a newly instantiated POJO you do not have that id. Therefore the mongoTemplate.upsert() method has a query parameter that you can create like this:
MyDomainClass pojo = new MyDomainClass(...);
Query query = Query.query(Criteria.where("email").is("user1@domain.com"));
DBObject dbDoc = new BasicDBObject();
mongoTemplate.getConverter().write(pojo, dbDoc); // this is the converter Spring itself uses for conversions
dbDoc.removeField("_id"); // just to be sure not to create any duplicates
Update update = Update.fromDBObject(dbDoc);
WriteResult writeResult = mongoTemplate.upsert(query, update, UserModel.class);
I ran into the same problem. In the current Spring Data MongoDB version no such thing is available; you have to update the separate fields by hand.
However it is possible with another framework: Morphia.
This framework has a wrapper for DAO functionality: https://github.com/mongodb/morphia/wiki/DAOSupport
You can use the DAO API to do things like this:
SomePojo pojo = daoInstance.findOne("some-field", "some-value");
pojo.setAProperty("changing this property");
daoInstance.save(pojo);
I think you could add an id property to Description:
@Id
private String id;
Then fetch the existing document with your query condition, set the Description's id from that document's id, and save it.
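Roughly sketched (this costs an extra findOne call; the collection name is taken from the question, the rest is an assumption):
Description existing = mongoTemplate.findOne(
        new Query(Criteria.where("code").is(d.getCode())), Description.class, "descriptions");
if (existing != null) {
    d.setId(existing.getId()); // reuse the stored document's id so save() updates instead of inserting
}
mongoTemplate.save(d, "descriptions");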
Just use ReflectionDBObject: if you make Description extend it, your object's fields should get transferred to the Update reflectively, automagically. The note above about null fields being included in the update still holds true.
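A minimal, untested sketch of that idea (an assumption on my part; ReflectionDBObject works through JavaBean getters/setters, so the pojo needs those):
public class Description extends ReflectionDBObject {
    private String code;
    private String name;
    private String norwegian;
    private String english;
    // JavaBean getters and setters omitted
}

// Description is now itself a DBObject, so it can be passed straight to Update.fromDBObject.
Update update = Update.fromDBObject(description, "_id");
mongoTemplate.upsert(new Query(Criteria.where("code").is(description.getCode())), update, "descriptions");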
public void saveOrUpdate(String json) {
    try {
        JSONObject jsonObject = new JSONObject(json);
        DBObject update1 = new BasicDBObject("$set", JSON.parse(json));
        mongoTemplate.getCollection("collectionName").update(
                new Query(Criteria.where("name").is(jsonObject.getString("name"))).getQueryObject(),
                update1, true, false);
    } catch (Exception e) {
        throw new GenericServiceException("Error while save/update. Error msg: " + e.getMessage(), e);
    }
}
This is a very simple way to save a JSON string into a collection using MongoDB and Spring. The method can be adapted to take a JSONObject directly.
@Override
public void updateInfo(UpdateObject algorithm) {
    Document document = new Document();
    mongoTemplate.getConverter().write(algorithm, document);
    Update update = Update.fromDocument(document);
    mongoTemplate.updateFirst(query(where("_id").is(algorithm.get_id())), update, UpdateObject.class);
}
After the upsert I was trying to fetch the same record, but it gave me the old one. In the DB, however, I have the new records.
I'm using Neo4j (1.9) and lucene core 3.5, through the support offered by the neo4j-lucene-index library.
In my code, I create a new node index like this:
HashMap<String, String> stringMap = new HashMap<String, String>();
stringMap.put("type", "fulltext");
stringMap.put(IndexManager.PROVIDER, "lucene");
stringMap.put("to_lower_case", "false");
stringMap.put("similarity", MySimilarity.class.getName());
stringMap.put("analyzer", MyAnalyzer.class.getName());
simpleIndex = (LuceneIndex<Node>) template.getGraphDatabaseService()
.index().forNodes("simple-index", stringMap);
and query the node index by means of a multi-field query:
Analyzer analyzer = new MyAnalyzer();
Query query = null;
try {
    query = MultiFieldQueryParser.parse(Version.LUCENE_35,
            queriesArray, fieldsArray, flagsArray, analyzer);
} catch (ParseException e) {
    e.printStackTrace();
    return;
}
if (query == null)
    return;
QueryContext qc = new QueryContext(query).sortByScore();
IndexHits<Node> hits = simpleIndex.query(qc);
How can I change the Similarity implementation used for sorting by score to my MySimilarity class (which extends DefaultSimilarity and overrides some of its methods)? The reported query always uses DefaultSimilarity, and I was not able to find any hook to set the similarity used by the searcher activated during my query.
The fact that the supplied similarity isn't used during queries is a bug and should be fixed. Created https://github.com/neo4j/neo4j/issues/375 to track it.