When do we need to close MongoConnection - Java

I am using Mongo Casbah. Here is my code:
case class A(id: String, name: String)

class InsertClassA(a: A) {
  def insertA() = {
    val mongoClient = MongoClient(hostName, port)
    // get collection and insert record in mongo
    mongoClient.close()
  }
}
class UpdateClassA(a: A) {
  def updateA() = {
    val mongoClient = MongoClient(hostName, port)
    // get collection and update record in mongo
    mongoClient.close()
  }
}
class DeleteClassA(a: A) {
  def deleteA() = {
    val mongoClient = MongoClient(hostName, port)
    // get collection and delete record in mongo
    mongoClient.close()
  }
}
object Test extends App {
  val a = A("123", "bob")
  val insert = new InsertClassA(a)
  val update = new UpdateClassA(a)
  val delete = new DeleteClassA(a)
  insert.insertA()
  update.updateA()
  delete.deleteA()
}
I want to know when I should close a MongoClient. Is the above approach correct? If not, what is the right way to avoid wasting resources and use the MongoClient instance properly? Please guide me.

Generally, MongoClient is a heavyweight component designed for a long lifetime (i.e. the application lifetime): it maintains an internal connection pool, so constructing and closing one per operation wastes resources. Open it once at startup, retain the reference, and close it only on shutdown.
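A minimal sketch of that pattern in Java, using the plain MongoDB driver that Casbah wraps (the class name, host and port are illustrative):

import com.mongodb.MongoClient;

public final class MongoHolder {
    // created once and shared by the whole application
    private static final MongoClient CLIENT = new MongoClient("localhost", 27017);

    private MongoHolder() {}

    public static MongoClient client() {
        return CLIENT;
    }

    // call once on application shutdown, e.g. from a JVM shutdown hook
    public static void close() {
        CLIENT.close();
    }
}

Each of the insert/update/delete classes would then call MongoHolder.client() instead of constructing and closing its own client per operation.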

Unable to perform Ignite SQL query over [CustomKey, CustomValue] cache in Scala

I am trying to set up a distributed cache using Apache Ignite with Scala.
After setting up the cache, I am able to put and get items when I know the key, but SQL queries of any type always return a cursor with a null iterator.
Here is how I set up my cache (please note that this is done before Ignition.start()):
def setupTelemetryCache(): CacheConfiguration[TelemetryKey, TelemetryValue] = {
  val dataRegionName = "persistent-region"
  val cacheName = "telemetry-cache"

  // This object is required to perform SQL queries over a custom key object
  val queryEntity = new QueryEntity("TelemetryKey", "TelemetryValue")
  val fields: util.LinkedHashMap[String, String] = new util.LinkedHashMap[String, String]
  fields.put("deviceId", classOf[String].getName)
  fields.put("metricName", classOf[String].getName)
  fields.put("timestamp", classOf[String].getName)
  queryEntity.setFields(fields)

  val keyFields: util.HashSet[String] = new util.HashSet[String]()
  keyFields.add("deviceId")
  keyFields.add("metricName")
  keyFields.add("timestamp")
  queryEntity.setKeyFields(keyFields)
  queryEntity.setIndexes(Collections.emptyList[QueryIndex]())

  new CacheConfiguration()
    .setName(cacheName)
    .setDataRegionName(dataRegionName)
    .setCacheMode(CacheMode.PARTITIONED) // data is split among nodes
    .setBackups(1) // each partition has 1 backup
    .setIndexedTypes(classOf[String], classOf[TelemetryKey]) // index by ID
    .setWriteSynchronizationMode(CacheWriteSynchronizationMode.FULL_ASYNC) // faster, clients do not wait for cache synchronization; consistency issues?
    .setAtomicityMode(CacheAtomicityMode.TRANSACTIONAL) // allows transactional queries
    .setQueryEntities(Collections.singletonList(queryEntity))
}
And this is the code of my TelemetryKey:
case class TelemetryKey private (
  @(AffinityKeyMapped @field)
  @(QuerySqlField @field)(index = true)
  deviceId: String,
  @(QuerySqlField @field)(index = false)
  metricName: String,
  @(QuerySqlField @field)(index = true)
  timestamp: String) extends Serializable
And TelemetryValue:
class TelemetryValue private (valueType: ValueTypes.Value,
                              doubleValue: Option[Double],
                              stringValue: Option[String],
                              longValue: Option[Long]) extends Serializable
A sample SQL query I need to run could be "Select * from CACHE where deviceId = 'dev1234'", and I expect to receive all the Cache.Entry[TelemetryKey, TelemetryValue] entries with the same deviceId.
Here is how I perform the query:
private def sqlQuery(query: SqlQuery[TelemetryKey, TelemetryValue]):
    QueryCursor[Cache.Entry[TelemetryKey, TelemetryValue]] = {
  cache.query(query)
}

def getEntries(ofDeviceId: String):
    QueryCursor[Cache.Entry[TelemetryKey, TelemetryValue]] = {
  val q = new SqlQuery[TelemetryKey, TelemetryValue](classOf[TelemetryKey], "deviceId = ?")
  sqlQuery(q.setArgs(ofDeviceId))
}
Even when I change the body of the query, I receive an empty cursor; I cannot even perform a "Select *" query.
Thanks for the help.
There are two ways to configure indexes and queryable fields.
Annotation based configuration
Your key and value classes need to be annotated with @QuerySqlField as follows:
case class TelemetryKey private (
  @(AffinityKeyMapped @field)
  @(QuerySqlField @field)(index = true)
  deviceId: String,
  @(QuerySqlField @field)(index = false)
  metricName: String,
  @(QuerySqlField @field)(index = true)
  timestamp: String) extends Serializable
After indexed and queryable fields are defined, they have to be registered in the SQL engine along with the object types they belong to.
new CacheConfiguration()
  .setName(cacheName)
  .setDataRegionName(dataRegionName)
  .setCacheMode(CacheMode.PARTITIONED)
  .setBackups(1)
  .setIndexedTypes(classOf[TelemetryKey], classOf[TelemetryValue])
  .setWriteSynchronizationMode(CacheWriteSynchronizationMode.FULL_ASYNC)
  .setAtomicityMode(CacheAtomicityMode.TRANSACTIONAL)
UPD:
One more thing that should be fixed as well is your SqlQuery:
def getEntries(ofDeviceId: String):
    QueryCursor[Cache.Entry[TelemetryKey, TelemetryValue]] = {
  val q = new SqlQuery[TelemetryKey, TelemetryValue](classOf[TelemetryValue], "deviceId = ?")
  sqlQuery(q.setArgs(ofDeviceId))
}
QueryEntity based approach
val queryEntity = new QueryEntity(classOf[TelemetryKey], classOf[TelemetryValue])

new CacheConfiguration()
  .setName(cacheName)
  .setDataRegionName(dataRegionName)
  .setCacheMode(CacheMode.PARTITIONED)
  .setBackups(1)
  .setWriteSynchronizationMode(CacheWriteSynchronizationMode.FULL_ASYNC)
  .setAtomicityMode(CacheAtomicityMode.TRANSACTIONAL)
  .setQueryEntities(Collections.singletonList(queryEntity))
Long story short, you should supply full JVM class names to QueryEntity.
As in:
val queryEntity = new QueryEntity("com.pany.telemetry.TelemetryKey",
  "com.pany.telemetry.TelemetryValue") // or e.g. classOf[TelemetryKey].getName
Ignite needs these to distinguish the various types that can be stored in one cache; it's not decorative - there has to be an exact match.
Better yet, use setIndexedTypes() instead of setQueryEntities(): it lets you pass classes instead of strings, and it will scan the annotations you already have.
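For example, a minimal sketch (Java syntax; in Scala you would write classOf[TelemetryKey] and classOf[TelemetryValue]):

import org.apache.ignite.configuration.CacheConfiguration;

// setIndexedTypes() takes key/value class pairs, scans their @QuerySqlField
// annotations, and registers both types in the SQL engine
CacheConfiguration<TelemetryKey, TelemetryValue> cfg =
        new CacheConfiguration<>("telemetry-cache");
cfg.setIndexedTypes(TelemetryKey.class, TelemetryValue.class);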

Is there any way to update/replace whole document of mongoDB from java using mongoDB morphia?

I need to replace a whole existing document in MongoDB from Java instead of setting every field. Is there any way? I am using Mongo Morphia.
Right now I am setting the fields one by one; the following is my code:
DBObject searchObject = new BasicDBObject();
searchObject.put("procId", procId);
final UpdateOperations<Timesheet> updateOperations = ds.createUpdateOperations(Timesheet.class)
        .set("wheelInTime", timesheet.getWheelInTime())
        .set("wheelOutTime", timesheet.getWheelOutTime())
        .set("tableOnTime", timesheet.getTableOnTime())
        .set("tableOffTime", timesheet.getTableOffTime());
final UpdateResults results = ds.updateFirst(findQuery, updateOperations);
You can 'overwrite' any entry in a MongoDB collection by simply creating a new DBObject with the same _id field and saving it to the database. So just set the fields on your object as you would on any Java object and use myCollection.save(obj).
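A minimal sketch of that driver-level replace (collection, existingId, procId and timesheet are assumed to exist in the surrounding code):

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

// save() inserts when the _id is new and replaces the whole document otherwise
DBObject doc = new BasicDBObject("_id", existingId)
        .append("procId", procId)
        .append("wheelInTime", timesheet.getWheelInTime());
collection.save(doc);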
Just save the object and it will overwrite the document with the same @Id. This can be done with one line of code:
dao.save(timesheet);
A more complete example of the usage of the Morphia DAO:
class Dao extends BasicDAO<TimeSheet, String> {
    Dao(Datastore ds) {
        super(TimeSheet.class, ds);
    }
}

Datastore ds = morphia.createDatastore(mongoClient, DB_NAME);
Dao dao = new Dao(ds);
dao.save(timesheet);

Creating a unique index in MongoDB

I am using a Java program for MongoDB insertion and I am trying to create a unique index for a field. product_src is a field in my collection and I want to set it as a unique index to avoid duplicate insertion. I am trying the following code but it shows a syntax error; what is the problem with it?
DB db;
try {
    sample = new MongoClient("myIP", PORT);
    db = sample.getDB("client_mahout");
    t = db.getCollection("data_flipkart_in_avoid_duplicate_checking");
    System.out.println("enter the system ip");
    db.t.ensureIndex({"product_src":1});
} catch (Exception e) {}
t is the collection. There is a problem with the line db.t.ensureIndex({"product_src":1});.
Please give me sample code for how to create a unique index in MongoDB.
For future reference, the way to handle this in the Java Mongo driver v3.0+ is:
public void createUniqueIndex() {
    Document index = new Document("field", 1);
    MongoCollection<Document> collection = client.getDatabase("db").getCollection("Collection");
    collection.createIndex(index, new IndexOptions().unique(true));
}
You need to pass a DBObject to the ensureIndex() method, and call it on the collection reference t (db.t is shell syntax, not Java):
t.ensureIndex(new BasicDBObject("product_src", 1));
But the ensureIndex method has been deprecated since version 2.12; you need to use createIndex() instead:
t.createIndex(new BasicDBObject("product_src", 1));
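Note that neither call makes the index unique; with the legacy 2.x API you can pass an options object as the second argument (a sketch, reusing the collection t from the question):

// unique: true rejects inserts whose product_src duplicates an existing document
t.createIndex(new BasicDBObject("product_src", 1),
        new BasicDBObject("unique", true));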

Calling a method from inside StreamingMarkupBuilder

I'm using Groovy's StreamingMarkupBuilder to generate XML dynamically based on the results of a few SQL queries. I'd like to call a method from inside the closure, but the markup builder tries to create an XML node using the method name.
Here's an example of what I'm trying to do:
Map generateMapFromRow(GroovyRowResult row) {
    def map = [:]
    def meta = row.getMetaData()
    // Dynamically generate the keys and values
    (1..meta.getColumnCount()).each { column ->
        map[meta.getColumnName(column)] = row[column - 1]
    }
    return map
}
def sql = Sql.newInstance(db.url, db.user, db.password, db.driver)
def builder = new StreamingMarkupBuilder()
def studentsImport = {
    students {
        sql.eachRow('select first_name, middle_name, last_name from students') { row ->
            def map = generateMapFromRow(row) // Here is the problem line
            student(map)
        }
    }
}
println builder.bind(studentsImport).toString()
This will generate XML similar to the following:
<students>
  <generateMapFromRow>
    [first_name:Ima, middle_name:Good, last_name:Student]
  </generateMapFromRow>
  <student/>
  <generateMapFromRow>
    [first_name:Ima, middle_name:Bad, last_name:Student]
  </generateMapFromRow>
  <student/>
</students>
I've tried moving the method out to a class and calling it statically on the class, which doesn't work either.
Due to the nature of how StreamingMarkupBuilder works, I'm afraid that it isn't actually possible to do this, but I'm hoping that it is.
I may lose something in simplifying the example, but code like this will work.
In your example students is a closure call, so it may mess things up inside.
def builder = new groovy.xml.StreamingMarkupBuilder()
def generateMapFromRow = { ["$it": it] }
builder.bind {
    10.times {
        def map = generateMapFromRow(it) // generateMapFromRow is now a local variable holding a closure, so the call is resolved before the builder sees it
        student(map)
    }
}
As said here: http://groovy.codehaus.org/Using+MarkupBuilder+for+Agile+XML+creation
The thing to be careful about when using markup builders is not to overlap variables you currently have in scope. The following is a good example:
import groovy.xml.MarkupBuilder

def book = "MyBook"
def writer = new StringWriter()
def xml = new MarkupBuilder(writer)
xml.shelf() {
    book(name: "Fight Club") { // will produce an error
    }
}
println writer.toString()
Builders work similarly to methodMissing captors, and if there is a local variable with that name in scope, no node will be produced.

MongoTemplate upsert - easy way to make Update from pojo (which user has edited)?

Here is a simple pojo:
public class Description {
    private String code;
    private String name;
    private String norwegian;
    private String english;
}
And please see the following code, which applies an upsert to MongoDB via Spring's MongoTemplate:
Query query = new Query(Criteria.where("code").is(description.getCode()));
Update update = new Update()
        .set("name", description.getName())
        .set("norwegian", description.getNorwegian())
        .set("english", description.getEnglish());
mongoTemplate.upsert(query, update, "descriptions");
The line that generates the Update object specifies every field of the pojo manually, so if my pojo changes, my DAO layer breaks.
Is there a way to avoid doing this, so that all fields from my class are applied to the update automatically?
E.g.
Update update = new Update().fromObject(item);
Note that my pojo does not extend DBObject.
I found a pretty good solution for this question
// make a new description here
Description d = new Description();
d.setCode("no");
d.setName("norwegian");
d.setNorwegian("norwegian");
d.setEnglish("english");

// build the query
Query query = new Query(Criteria.where("code").is(d.getCode()));

// build the update
DBObject dbDoc = new BasicDBObject();
mongoTemplate.getConverter().write(d, dbDoc); // this is the converter Spring itself uses
Update update = Update.fromDBObject(dbDoc);

// run it!
mongoTemplate.upsert(query, update, "descriptions");
Please note that Update.fromDBObject returns an update object with all fields in dbDoc. If you just want to update the non-null fields, you should code a new method that excludes null fields.
For example, suppose the front end posts a partial document like the one below:
// make a new description here
Description d = new Description();
d.setCode("no");
d.setEnglish("norwegian");
We only want to update the fields that were actually set:
// returns an Update containing only the non-null fields
public static Update fromDBObjectExcludeNullFields(DBObject object) {
    Update update = new Update();
    for (String key : object.keySet()) {
        Object value = object.get(key);
        if (value != null) {
            update.set(key, value);
        }
    }
    return update;
}

// build the update
Update update = fromDBObjectExcludeNullFields(dbDoc);
The solution for the newer spring-data-mongodb version 2.X.X:
The API has evolved; since version 2.X.X there is
Update.fromDocument(org.bson.Document object, String... exclude)
instead of the 1.X.X
Update.fromDBObject(com.mongodb.DBObject object, String... exclude)
The full solution:
// make a new description here
Description d = new Description();
d.setCode("no");
d.setName("norwegian");
d.setNorwegian("norwegian");
d.setEnglish("english");

Query query = new Query(Criteria.where("code").is(d.getCode()));

Document doc = new Document(); // org.bson.Document
mongoTemplate.getConverter().write(d, doc);
Update update = Update.fromDocument(doc);
mongoTemplate.upsert(query, update, "descriptions");
It works!
You can use save (if the document does not exist it is inserted, otherwise it is overwritten):
save(Object objectToSave, String collectionName)
Read the javadoc.
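A one-line usage sketch, reusing the mongoTemplate and the Description pojo d from the answers above:

mongoTemplate.save(d, "descriptions"); // inserts when the mapped _id is new, replaces otherwise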
Just like the previous answers said, use the mongoTemplate.getConverter().write() and Update.fromDocument() functions. But I found that Update.fromDocument() won't add the "$set" key and won't work directly; the solution is to add "$set" yourself, like below (PS: I'm using version 2.2.1.RELEASE):
public static Update updateFromObject(Object object, MongoTemplate mongoTemplate) {
    Document doc = new Document();
    mongoTemplate.getConverter().write(object, doc);
    return Update.fromDocument(new Document("$set", doc));
}
If you want to upsert pojos including a property String id, you have to exclude the _id field in the fromDBObject method: Update.fromDBObject(dbDoc, "_id").
Otherwise you get the exception:
org.springframework.dao.DuplicateKeyException: { "serverUsed" : "127.0.0.1:27017" , "ok" : 1 , "n" : 0 , "updatedExisting" : false , "err" : "E11000 duplicate key error collection: db.description index: _id_ dup key: { : null }" , "code" : 11000}; nested exception is com.mongodb.MongoException$DuplicateKey: { "serverUsed" : "127.0.0.1:27017" , "ok" : 1 , "n" : 0 , "updatedExisting" : false , "err" : "E11000 duplicate key error collection: db.description index: _id_ dup key: { : null }" , "code" : 11000}
because the _id field of the first document is null:
{
  "_id" : null,
  ...
}
Full code based on @PaniniGelato's answer would be:
public class Description {
    public String id;
    ...
}
Description d = new Description();
d.setCode("no");
d.setName("norwegian");
d.setNorwegian("norwegian");
d.setEnglish("english");

// build the query
Query query = new Query(Criteria.where("code").is(d.getCode()));

// build the update, excluding the null _id
DBObject dbDoc = new BasicDBObject();
mongoTemplate.getConverter().write(d, dbDoc); // this is the converter Spring itself uses
Update update = Update.fromDBObject(dbDoc, "_id");

// run it!
mongoTemplate.upsert(query, update, "descriptions");
Then the upsert works for both the insert and the update case. Corrections and thoughts are welcome ;)
This is what I am doing for the time being. It's not the most elegant way to do it, but it does save a precious DB call:
import java.io.IOException;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Query;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.util.JSON;

/**
 * Performs an upsert that updates ALL FIELDS of an object using the native mongo
 * driver's methods, since mongoTemplate's upsert method doesn't allow it.
 *
 * @param upsertQuery
 * @param object
 * @param collectionName
 */
private void performUpsert(Query upsertQuery, Object object, String collectionName) {
    ObjectMapper mapper = new ObjectMapper();
    try {
        String jsonStr = mapper.writeValueAsString(object);
        DB db = mongoTemplate.getDb();
        DBCollection collection = db.getCollection(collectionName);
        DBObject query = upsertQuery.getQueryObject();
        DBObject update = new BasicDBObject("$set", JSON.parse(jsonStr));
        collection.update(query, update, true, false); // upsert = true, multi = false
    } catch (IOException e) {
        LOGGER.error("Unable to persist the metrics in DB. Error while parsing object: {}", e);
    }
}
There are two cases here that need to be distinguished:
1. Updating an item that was previously fetched from the DB.
2. Updating or inserting (upserting) an item you created in code.
In case 1) you can simply use mongoTemplate.save(pojo, "collection"), because your POJO will already have a filled ObjectId in its id field.
In case 2) you have to explain to Mongo what "already exists" means for your domain model: by default, the mongoTemplate.save() method updates an existing item if there is one with the same ObjectId. But a newly instantiated POJO does not have that id yet. Therefore the mongoTemplate.upsert() method has a query parameter that you can create like this:
MyDomainClass pojo = new MyDomainClass(...);
Query query = Query.query(Criteria.where("email").is("user1@domain.com"));
DBObject dbDoc = new BasicDBObject();
mongoTemplate.getConverter().write(pojo, dbDoc); // this is the converter Spring itself uses
dbDoc.removeField("_id"); // just to be sure not to create any duplicates
Update update = Update.fromDBObject(dbDoc);
WriteResult writeResult = mongoTemplate.upsert(query, update, UserModel.class);
I ran into the same problem. In the current Spring Data MongoDB version no such thing is available; you have to update the separate fields by hand.
However, it is possible with another framework: Morphia.
This framework has a wrapper for DAO functionality: https://github.com/mongodb/morphia/wiki/DAOSupport
You can use the DAO API to do things like this:
SomePojo pojo = daoInstance.findOne("some-field", "some-value");
pojo.setAProperty("changing this property");
daoInstance.save(pojo);
I think that you should add an id property to Description:
@Id
private String id;
Then fetch the existing document by your query condition, set the Description's id from the document's id, and save.
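A sketch of that flow (it assumes Description has the id property above with a getter and setter):

// fetch the existing document, copy its id onto the pojo, then save():
// Spring then performs an update instead of an insert
Description existing = mongoTemplate.findOne(query, Description.class, "descriptions");
if (existing != null) {
    d.setId(existing.getId());
}
mongoTemplate.save(d, "descriptions");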
Just use ReflectionDBObject: if you make Description extend it, your object's fields are transferred to the Update reflectively, automagically. The note above about null fields being included in the update still holds true.
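A minimal sketch of the idea (legacy 2.x driver; ReflectionDBObject discovers fields through bean getters/setters, so those are required):

import com.mongodb.ReflectionDBObject;

public class Description extends ReflectionDBObject {
    private String code;
    private String name;

    public String getCode() { return code; }
    public void setCode(String code) { this.code = code; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}

Since Description is now itself a DBObject, it can be handed to Update.fromDBObject(d, "_id") without going through the converter.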
public void saveOrUpdate(String json) {
    try {
        JSONObject jsonObject = new JSONObject(json);
        DBObject update1 = new BasicDBObject("$set", JSON.parse(json));
        mongoTemplate.getCollection("collectionName").update(
                new Query(Criteria.where("name").is(jsonObject.getString("name"))).getQueryObject(),
                update1, true, false);
    } catch (Exception e) {
        throw new GenericServiceException("Error while save/update. Error msg: " + e.getMessage(), e);
    }
}
This is a very simple way to save a JSON string into a collection using MongoDB and Spring.
The method can be overridden to take a JSONObject instead.
@Override
public void updateInfo(UpdateObject algorithm) {
    Document document = new Document();
    mongoTemplate.getConverter().write(algorithm, document);
    Update update = Update.fromDocument(document);
    mongoTemplate.updateFirst(query(where("_id").is(algorithm.get_id())), update, UpdateObject.class);
}
After the upsert I was trying to fetch the same record, but it gave me the old one, even though the DB had the new record.
