Convert DBObject to a POJO using MongoDB Java Driver

MongoDB seems to return BSON/JSON objects.
I thought that surely you'd be able to retrieve values as Strings, ints etc. which can then be saved as POJO.
I have a DBObject (instantiated as a BasicDBObject) as a result of iterating over a list ... (cur.next()).
Is the only way (other than using some sort of persistence framework) to get the data into a POJO to use a JSON serialiser/deserialiser?
My method looks like this:
public List<User> findByEmail(String email) {
    DBCollection userColl;
    try {
        userColl = Dao.getDB().getCollection("users");
    } catch (UnknownHostException e) {
        e.printStackTrace();
    } catch (MongoException e) {
        e.printStackTrace();
    }
    DBCursor cur = userColl.find();
    List<User> usersWithMatchEmail = new ArrayList<User>();
    while (cur.hasNext()) {
        // this is where I want to convert cur.next() into a <User> POJO
        usersWithMatchEmail.add(cur.next());
    }
    return null;
}
EDIT: It's pretty obvious, just do something like this.
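A minimal sketch of that manual mapping, for reference (the field names and the User setters are assumptions about the model):
while (cur.hasNext()) {
    DBObject obj = cur.next();
    User user = new User();
    // assumes the collection stores these fields as strings
    user.setName((String) obj.get("name"));
    user.setEmail((String) obj.get("email"));
    usersWithMatchEmail.add(user);
}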

Let Spring do the heavy lifting with the stuff it already has built for this...
The real trick is: mongoTemplate.getConverter().read(Foo.class, obj);
For example, when using a DBCursor -
while (cursor.hasNext()) {
    DBObject obj = cursor.next();
    Foo foo = mongoTemplate.getConverter().read(Foo.class, obj);
    returnList.add(foo);
}
http://revelfire.com/spring-data-mongodb-convert-from-raw-query-dbobject/
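If the template is already configured, Spring Data can also run the query and map the results in one call; a minimal sketch, assuming a mapped User class with an email field:
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

List<User> usersWithMatchEmail =
        mongoTemplate.find(Query.query(Criteria.where("email").is(email)), User.class);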

There are a few Java libraries that can help you with this (see the sketch below the list):
Morphia - http://code.google.com/p/morphia/
Spring Data for MongoDB - http://www.springsource.org/spring-data/mongodb
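For a sense of what these look like in practice, here is a minimal sketch in the Spring Data MongoDB repository style (the User class and its email field are assumptions; Spring derives the query from the method name):
import java.util.List;
import org.springframework.data.mongodb.repository.MongoRepository;

public interface UserRepository extends MongoRepository<User, String> {
    // derived query: matches documents whose "email" field equals the argument
    List<User> findByEmail(String email);
}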

Though this is a late answer, someone might find it useful.
I use GSON to convert from BasicDBObject to my own POJO, which is TinyBlogDBObject.
TinyBlogDBObject obj = convertJSONToPojo(cursor.next().toString());

private static TinyBlogDBObject convertJSONToPojo(String json) {
    Type type = new TypeToken<TinyBlogDBObject>(){}.getType();
    return new Gson().fromJson(json, type);
}
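As a side note, the TypeToken is not strictly needed for a non-generic target type, and keep in mind that cursor.next().toString() produces MongoDB extended JSON, which Gson may not map cleanly for types such as ObjectId or dates:
private static TinyBlogDBObject convertJSONToPojo(String json) {
    // sufficient for a plain, non-generic target type
    return new Gson().fromJson(json, TinyBlogDBObject.class);
}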

1. Provide a MongoDatabase bean with proper CodecRegistry
@Bean
public MongoClient mongoClient() {
    ConnectionString connectionString = new ConnectionString("mongodb://username:password@127.0.0.1:27017/dbname");
    ConnectionPoolSettings connectionPoolSettings = ConnectionPoolSettings.builder()
            .minSize(2)
            .maxSize(20)
            .maxWaitQueueSize(100)
            .maxConnectionIdleTime(60, TimeUnit.SECONDS)
            .maxConnectionLifeTime(300, TimeUnit.SECONDS)
            .build();
    SocketSettings socketSettings = SocketSettings.builder()
            .connectTimeout(5, TimeUnit.SECONDS)
            .readTimeout(5, TimeUnit.SECONDS)
            .build();
    MongoClientSettings clientSettings = MongoClientSettings.builder()
            .applyConnectionString(connectionString)
            .applyToConnectionPoolSettings(builder -> builder.applySettings(connectionPoolSettings))
            .applyToSocketSettings(builder -> builder.applySettings(socketSettings))
            .build();
    return MongoClients.create(clientSettings);
}

@Bean
public MongoDatabase mongoDatabase(MongoClient mongoClient) {
    CodecRegistry defaultCodecRegistry = MongoClientSettings.getDefaultCodecRegistry();
    CodecRegistry fromProvider = CodecRegistries.fromProviders(PojoCodecProvider.builder().automatic(true).build());
    CodecRegistry pojoCodecRegistry = CodecRegistries.fromRegistries(defaultCodecRegistry, fromProvider);
    return mongoClient.getDatabase("dbname").withCodecRegistry(pojoCodecRegistry);
}
2. Annotate POJOs
public class ProductEntity {

    @BsonProperty("name") public final String name;
    @BsonProperty("description") public final String description;
    @BsonProperty("thumb") public final ThumbEntity thumbEntity;

    @BsonCreator
    public ProductEntity(
            @BsonProperty("name") String name,
            @BsonProperty("description") String description,
            @BsonProperty("thumb") ThumbEntity thumbEntity) {
        this.name = name;
        this.description = description;
        this.thumbEntity = thumbEntity;
    }
}

public class ThumbEntity {

    @BsonProperty("width") public final Integer width;
    @BsonProperty("height") public final Integer height;
    @BsonProperty("url") public final String url;

    @BsonCreator
    public ThumbEntity(
            @BsonProperty("width") Integer width,
            @BsonProperty("height") Integer height,
            @BsonProperty("url") String url) {
        this.width = width;
        this.height = height;
        this.url = url;
    }
}
3. Query mongoDB and obtain POJOS
MongoCollection<Document> collection = mongoDatabase.getCollection("product");
Document query = new Document();
List<ProductEntity> products = collection.find(query, ProductEntity.class).into(new ArrayList<>());
Please check my answer in the other post:
POJO to org.bson.Document and Vice Versa

You can use the GSON library provided by Google; here is an example of it. There are many other APIs you can use to convert JSON into POJOs, such as the Jettison API.
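A minimal sketch of that approach for the User class from the question (the field names are assumptions, and the extended-JSON caveat mentioned above applies):
Gson gson = new Gson();
while (cur.hasNext()) {
    // serialise the DBObject to JSON, then let Gson bind it to the POJO
    User user = gson.fromJson(cur.next().toString(), User.class);
    usersWithMatchEmail.add(user);
}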

Related

How to fetch cached value using redisson client

I wanted to fetch a cached (@Cacheable) value using the Redisson client, but it returns strange data if I use any codec in the Redisson client (getBucket("fruit::1", StringCodec.INSTANCE)), and it throws an error unless I use a codec.
I have used the below code for caching:
@Cacheable(value = "fruits", key = "#id")
public Fruit getFruitById(int id) {
    // get fruit by id
    CriteriaBuilder builder = em.getCriteriaBuilder();
    CriteriaQuery<Fruit> query = builder.createQuery(Fruit.class);
    Root<Fruit> root = query.from(Fruit.class);
    query.select(root);
    query.where(builder.equal(root.get("id"), id));
    TypedQuery<Fruit> fruitQuery = em.createQuery(query);
    return fruitQuery.getSingleResult();
}
When I use a codec to get that cached data:
RBucket<String> bucket = client.getBucket("fruits::1", StringCodec.INSTANCE);
String fruit = bucket.get();
It returns the following strange data:
��srcom.home.redis.Fruit��.ܵo*rIidIpriceLnametLjava/lang/String;xp,tpomegrantite
RedisConfiguration
@Bean
public RedisCacheConfiguration cacheConfiguration() {
    RedisCacheConfiguration cacheConfig = RedisCacheConfiguration
            .defaultCacheConfig().entryTtl(Duration.ofSeconds(600))
            .disableCachingNullValues();
    return cacheConfig;
}

@Bean
public RedisCacheManager cacheManager() {
    RedisCacheManager rcm = RedisCacheManager
            .builder(this.getRedissonStoreFactory())
            .cacheDefaults(cacheConfiguration()).transactionAware().build();
    return rcm;
}

@Bean
@Primary
public RedisProperties redisProperties() {
    return new RedisProperties();
}

@Bean
public RedissonConnectionFactory getRedissonStoreFactory() {
    return new RedissonConnectionFactory(getConfig());
}

@Bean
public RedissonNode getNode() {
    RedissonNodeConfig nodeConfig = new RedissonNodeConfig(getConfig());
    nodeConfig.setExecutorServiceWorkers(
            Collections.singletonMap("ensimp", 1));
    RedissonNode node = RedissonNode.create(nodeConfig);
    node.start();
    return node;
}

@Bean
public Config getConfig() {
    Config config = new Config();
    RedisProperties properties = redisProperties();
    config.useSingleServer().setAddress(
            "redis://" + properties.getHost() + ":" + properties.getPort());
    return config;
}
redisson.json
{
"singleServerConfig":{
"idleConnectionTimeout":500,
"connectTimeout":1000,
"timeout":3000,
"retryAttempts":3,
"retryInterval":1500,
"password":null,
"subscriptionsPerConnection":5,
"clientName":null,
"address": "redis://127.0.0.1:6379",
"subscriptionConnectionMinimumIdleSize":0,
"subscriptionConnectionPoolSize":1,
"connectionMinimumIdleSize":0,
"connectionPoolSize":20,
"database":0,
"dnsMonitoringInterval":5000
},
"threads":16,
"nettyThreads":32,
"codec":{
"class":"org.redisson.codec.FstCodec"
},
"transportMode":"NIO"
}
I've used the FST codec too but got the same strange data. I want correctly decoded data; it would be great if anyone could help me with the right code.
You need to use RMapCache to obtain the data, not RBucket:
client.getMapCache("fruits::1", StringCodec.INSTANCE);
Try this:
RMapCache mycache;
mycache = client.getMapCache("fruits::1");
Then, to retrieve the data, use readAllValues():
Collection<Fruit> map = mycache.readAllValues();
System.out.println(map);

How do you execute a MongoDB query stored as string in Java?

I'm kind of new to the MongoDB Java driver and I was wondering how you could execute a query stored as a string. Is this the best way to execute them, or what would be a better approach?
I've stumbled across the piece of code below on another Stack Overflow thread, but haven't been able to get anything useful out of it. The output does not contain the result of the query at all.
The code I'm running right now:
@Test
public void testExecuteStoredQueries() {
    String code = "db.getCollection('users').find({})";
    final BasicDBObject command = new BasicDBObject();
    String formattedCode = String.format("function() { return %s ; }", code);
    System.out.println("Formatted code:");
    System.out.println(formattedCode);
    command.put("eval", formattedCode);
    Document result = DbEngine.getInstance().getDatabase().runCommand(command);
    System.out.println(result.toJson());
}
Summarized output:
{
"retval": {
"_mongo": "....",
"_db": "...",
"_collection": "...",
"_ns": "cezy.users",
"_query": {},
"_fields": null,
"_limit": 0,
"_skip": 0,
"_batchSize": 0,
"_options": 0,
"_cursor": null,
"_numReturned": 0,
"_special": false
},
"ok": 1
}
I use Morphia when I have to deal with objects, because when you retrieve data from MongoDB you get extended JSON for long values instead of plain JSON. Parsing extended JSON can be trouble and might break the code, as Gson doesn't support converting extended JSON to JSON.
private void createDatastore(boolean createIndexes) {
    Morphia morphia = new Morphia();
    morphia.map(classname.class);
    datastore = morphia.createDatastore(mongoClient, databaseName);
    if (createIndexes) {
        datastore.ensureIndexes();
    }
}

@Override
public Datastore getDatastore() {
    return this.datastore;
}
@Test
public void testExecuteStoredQueries() {
    String code = "db.getCollection('users').find({})";
    String formattedCode = String.format("function() { return %s ; }", code);
    final BasicDBObject basicObject = new BasicDBObject(new BasicDBObject("$in", formattedCode));
    Query<ClassName> query = getDatastore().createQuery(ClassName.class).filter("_eval", basicObject);
    List<ClassName> list = query.asList();
    // if you want to access each object and perform some task
    list.forEach((cursor) -> {
        // perform your task
    });
}
Removing the function creation and adding ".toArray()" pretty much solved the problem.
@Test
public void testExecuteStoredQueries() {
    String code = "db.users.find({}).toArray();";
    final BasicDBObject command = new BasicDBObject();
    command.put("eval", code);
    Document result = DbEngine.getInstance().getDatabase().runCommand(command);
    System.out.println(result.toJson());
    assertNotNull(result.get("retval"));
}
The array is in the "retval" field of the response.
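Worth noting: the eval command is deprecated and has been removed from newer MongoDB servers, so if the stored string can be reduced to just its filter document, it is safer to parse that and go through the normal find path instead; a sketch, assuming the string holds only the filter:
String storedFilter = "{ \"name\": \"bob\" }"; // the stored query, reduced to its filter part
List<Document> users = DbEngine.getInstance().getDatabase()
        .getCollection("users")
        .find(Document.parse(storedFilter))
        .into(new ArrayList<>());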

DynamoDB - how to get a Primary Key (which is a random id) from the database to make an endpoint class?

I've made a method that I use to edit an item from the database.
This is how my method looks:
public Product editProduct(PrimaryKey primaryKey, Product content) {
UpdateItemSpec updateItemSpec = new UpdateItemSpec().withPrimaryKey(primaryKey).withValueMap(createValueMap(content));
UpdateItemOutcome itemOutcome = databaseController.getTable(PRODUCT_TABLE).updateItem(updateItemSpec);
return convertToProduct(itemOutcome);
}
private Map<String, Object> createValueMap(Product content) {
Map<String, Object> result = new HashMap<>();
result.put("name", content.getName());
result.put("calories", content.getCalories());
result.put("fat", content.getFat());
result.put("carbo", content.getCarbo());
result.put("protein", content.getProtein());
result.put("productKinds", content.getProductKinds());
result.put("author", content.getAuthor());
result.put("media", content.getMedia());
result.put("approved", content.getApproved());
return result;
}
private Product convertToProduct(UpdateItemOutcome itemOutcome) {
Product product = new Product();
product.setName(itemOutcome.getItem().get("name").toString());
product.setCalories(itemOutcome.getItem().getInt("calories"));
product.setFat(itemOutcome.getItem().getDouble("fat"));
product.setCarbo(itemOutcome.getItem().getDouble("carbo"));
product.setProtein(itemOutcome.getItem().getDouble("protein"));
product.setProductKinds(itemOutcome.getItem().getList("productKinds"));
ObjectMapper objectMapper = new ObjectMapper();
try {
Author productAuthor = objectMapper.readValue(itemOutcome.getItem().getString("author"), Author.class);
product.setAuthor(productAuthor);
} catch (IOException e) {
e.printStackTrace();
}
try {
Media productMedia = objectMapper.readValue(itemOutcome.getItem().getString("media"), Media.class);
product.setMedia(productMedia);
} catch (IOException e) {
e.printStackTrace();
}
return product;
}
Now I want to create endpoint class for this method but I have problem, I need to get primarykey as parameter (it's looks like this for example: 2567763a-d21e-4146-8d61-9d52c2561fc0) and I don't know how to do this.
At the moment my class looks like that:
public class EditProductLambda implements RequestHandler<Map<String, Object>, ApiGatewayResponse> {

    private LambdaLogger logger;

    @Override
    public ApiGatewayResponse handleRequest(Map<String, Object> input, Context context) {
        logger = context.getLogger();
        logger.log(input.toString());
        try {
            Product product = RequestUtil.parseRequest(input, Product.class);
            //PrimaryKey primaryKey = XXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
            KitchenService kitchenService = new KitchenService(new DatabaseController(context, Regions.EU_CENTRAL_1), logger);
            Product editedProduct = kitchenService.editProduct(primaryKey, product);
            return ResponseUtil.generateResponse(HttpStatus.SC_CREATED, editedProduct);
        } catch (IllegalArgumentException e) {
            return ResponseUtil.generateResponse(HttpStatus.SC_BAD_REQUEST, e.getMessage());
        }
    }
}
Can someone give me some advice on how to do that? Or maybe my method is done wrong?
So first you have to create a trigger for the Lambda function, and the ideal choice here would be an API Gateway. You can pass your data as a query string or as a request body to API Gateway.
You can use a body mapping template in the integration request section of API Gateway to get the request body/query string. Construct a new JSON in the body mapping template, which will have data from the request body/query string. Since we are adding a body mapping template, your business logic will receive the JSON we constructed there.
Inside the body mapping template, to get a query string, do:
$input.params('querystringkey')
For example, inside the body mapping template (if using a query string):
#set($inputRoot = $input.path('$'))
{
"primaryKey" : "$input.params('$.primaryKey')"
}
If passing the data as the body, then:
#set($inputRoot = $input.path('$'))
{
"primaryKey" : "$input.path('$.primaryKey')"
}
Please read https://aws.amazon.com/blogs/compute/tag/mapping-templates/ for more details on body mapping templates.
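On the Lambda side, a minimal sketch of wiring this into the handler above (it assumes the mapping template puts the id under "primaryKey" in the input map and that the table's hash key attribute is named "id"; adjust both to your setup):
String id = (String) input.get("primaryKey");
PrimaryKey primaryKey = new PrimaryKey("id", id);
Product editedProduct = kitchenService.editProduct(primaryKey, product);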

POJO to org.bson.Document and Vice Versa

Is there a simple way to convert Simple POJO to org.bson.Document?
I'm aware that there are ways to do this like this one:
Document doc = new Document();
doc.append("name", person.getName()):
But does it have a much simpler and typo less way?
Currently Mongo Java Driver 3.9.1 provide POJO support out of the box
http://mongodb.github.io/mongo-java-driver/3.9/driver/getting-started/quick-start-pojo/
Let's say you have an example collection like this, with one nested object:
db.createCollection("product", {
validator: {
$jsonSchema: {
bsonType: "object",
required: ["name", "description", "thumb"],
properties: {
name: {
bsonType: "string",
description: "product - name - string"
},
description: {
bsonType: "string",
description: "product - description - string"
},
thumb: {
bsonType: "object",
required: ["width", "height", "url"],
properties: {
width: {
bsonType: "int",
description: "product - thumb - width"
},
height: {
bsonType: "int",
description: "product - thumb - height"
},
url: {
bsonType: "string",
description: "product - thumb - url"
}
}
}
}
}
}});
1. Provide a MongoDatabase bean with proper CodecRegistry
@Bean
public MongoClient mongoClient() {
    ConnectionString connectionString = new ConnectionString("mongodb://username:password@127.0.0.1:27017/dbname");
    ConnectionPoolSettings connectionPoolSettings = ConnectionPoolSettings.builder()
            .minSize(2)
            .maxSize(20)
            .maxWaitQueueSize(100)
            .maxConnectionIdleTime(60, TimeUnit.SECONDS)
            .maxConnectionLifeTime(300, TimeUnit.SECONDS)
            .build();
    SocketSettings socketSettings = SocketSettings.builder()
            .connectTimeout(5, TimeUnit.SECONDS)
            .readTimeout(5, TimeUnit.SECONDS)
            .build();
    MongoClientSettings clientSettings = MongoClientSettings.builder()
            .applyConnectionString(connectionString)
            .applyToConnectionPoolSettings(builder -> builder.applySettings(connectionPoolSettings))
            .applyToSocketSettings(builder -> builder.applySettings(socketSettings))
            .build();
    return MongoClients.create(clientSettings);
}

@Bean
public MongoDatabase mongoDatabase(MongoClient mongoClient) {
    CodecRegistry defaultCodecRegistry = MongoClientSettings.getDefaultCodecRegistry();
    CodecRegistry fromProvider = CodecRegistries.fromProviders(PojoCodecProvider.builder().automatic(true).build());
    CodecRegistry pojoCodecRegistry = CodecRegistries.fromRegistries(defaultCodecRegistry, fromProvider);
    return mongoClient.getDatabase("dbname").withCodecRegistry(pojoCodecRegistry);
}
2. Annotate your POJOs
public class ProductEntity {

    @BsonProperty("name") public final String name;
    @BsonProperty("description") public final String description;
    @BsonProperty("thumb") public final ThumbEntity thumbEntity;

    @BsonCreator
    public ProductEntity(
            @BsonProperty("name") String name,
            @BsonProperty("description") String description,
            @BsonProperty("thumb") ThumbEntity thumbEntity) {
        this.name = name;
        this.description = description;
        this.thumbEntity = thumbEntity;
    }
}

public class ThumbEntity {

    @BsonProperty("width") public final Integer width;
    @BsonProperty("height") public final Integer height;
    @BsonProperty("url") public final String url;

    @BsonCreator
    public ThumbEntity(
            @BsonProperty("width") Integer width,
            @BsonProperty("height") Integer height,
            @BsonProperty("url") String url) {
        this.width = width;
        this.height = height;
        this.url = url;
    }
}
3. Query mongoDB and obtain POJOS
MongoCollection<Document> collection = mongoDatabase.getCollection("product");
Document query = new Document();
List<ProductEntity> products = collection.find(query, ProductEntity.class).into(new ArrayList<>());
And that's it! You can easily obtain your POJOs without cumbersome manual mappings and without losing the ability to run native Mongo queries.
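Writes work the same way once the registry is in place; a small sketch inserting through a collection typed to the entity (the sample values are made up):
MongoCollection<ProductEntity> typedCollection =
        mongoDatabase.getCollection("product", ProductEntity.class);
typedCollection.insertOne(
        new ProductEntity("apple", "a fruit", new ThumbEntity(100, 100, "http://example.com/apple.png")));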
You can use Gson and Document.parse(String json) to convert a POJO to a Document. This works with version 3.4.2 of the Java driver.
Something like this:
package com.jacobcs;
import org.bson.Document;
import com.google.gson.Gson;
import com.mongodb.MongoClient;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoDatabase;
public class MongoLabs {
public static void main(String[] args) {
// create client and connect to db
MongoClient mongoClient = new MongoClient("localhost", 27017);
MongoDatabase database = mongoClient.getDatabase("my_db_name");
// populate pojo
MyPOJO myPOJO = new MyPOJO();
myPOJO.setName("MyName");
myPOJO.setAge("26");
// convert pojo to json using Gson and parse using Document.parse()
Gson gson = new Gson();
MongoCollection<Document> collection = database.getCollection("my_collection_name");
Document document = Document.parse(gson.toJson(myPOJO));
collection.insertOne(document);
}
}
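The reverse direction works the same way under this approach: read a Document, serialise it with toJson(), and let Gson bind it (a sketch reusing the collection and gson from the example above; note that fields such as _id come out as extended JSON and may need custom handling):
Document stored = collection.find().first();
MyPOJO fromDb = gson.fromJson(stored.toJson(), MyPOJO.class);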
I don't know your MongoDB version, but nowadays there is no need to convert Document to POJO or vice versa. You just need to create your collection depending on what you want to work with, Document or POJO, as follows.
//If you want to use Document
MongoCollection<Document> myCollection = db.getCollection("mongoCollection");
Document doc=new Document();
doc.put("name","ABC");
myCollection.insertOne(doc);
//If you want to use POJO
MongoCollection<Pojo> myCollection = db.getCollection("mongoCollection",Pojo.class);
Pojo obj= new Pojo();
obj.setName("ABC");
myCollection.insertOne(obj);
Please ensure your MongoDB is configured with the proper CodecRegistry if you want to use POJOs.
MongoClient mongoClient = new MongoClient();
//This registry is required for your Mongo document to POJO conversion
CodecRegistry codecRegistry = fromRegistries(MongoClient.getDefaultCodecRegistry(),
fromProviders(PojoCodecProvider.builder().automatic(true).build()));
MongoDatabase db = mongoClient.getDatabase("mydb").withCodecRegistry(codecRegistry);
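Reads then come back as POJOs directly as well; a small sketch against the same collection (Filters is com.mongodb.client.model.Filters):
MongoCollection<Pojo> pojoCollection = db.getCollection("mongoCollection", Pojo.class);
Pojo found = pojoCollection.find(Filters.eq("name", "ABC")).first();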
The point is that you do not need to put your hands on org.bson.Document.
Morphia will do all that for you behind the curtain.
import com.mongodb.MongoClient;
import org.mongodb.morphia.Datastore;
import org.mongodb.morphia.DatastoreImpl;
import org.mongodb.morphia.Morphia;
import java.net.UnknownHostException;
.....
private Datastore createDataStore() throws UnknownHostException {
MongoClient client = new MongoClient("localhost", 27017);
// create morphia and map classes
Morphia morphia = new Morphia();
morphia.map(FooBar.class);
return new DatastoreImpl(morphia, client, "testmongo");
}
......
//with the Datastore from above you can save any mapped class to mongo
Datastore datastore;
final FooBar fb = new FooBar("hello", "world");
datastore.save(fb);
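Reading the objects back is equally mapping-free; a small sketch using the same Datastore and the legacy Morphia query API:
// load every document mapped to FooBar from the "testmongo" database
List<FooBar> fooBars = datastore.createQuery(FooBar.class).asList();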
Here you will find several examples: https://mongodb.github.io/morphia/
If you are using Morphia, you can convert a POJO to document using this piece of code.
Document document = Document.parse( morphia.toDBObject( Entity ).toString() )
If you are not using Morphia, then you can do the same by writing custom mapping and converting the POJO into a DBObject and further converting the DBObject to a string and then parsing it.
If you use Spring Data MongoDB with Spring Boot, MongoTemplate has a method that does this well.
Spring Data MongoDB API
Here's a sample.
1. First, autowire the MongoTemplate in your Spring Boot project:
@Autowired
MongoTemplate mongoTemplate;
2. Use mongoTemplate in your service:
Document doc = new Document();
mongoTemplate.getConverter().write(person, doc);
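The reverse direction (Document back to POJO) goes through the same converter, mirroring the read(...) call shown in the first answer above:
// assumes a mapped Person class, as in the write example
Person person = mongoTemplate.getConverter().read(Person.class, doc);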
In order to do this, you need to configure your pom file and yml to inject MongoTemplate.
pom.xml
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>2.1.10.RELEASE</version>
</dependency>
application.yml
# mongodb config
spring:
  data:
    mongodb:
      uri: mongodb://your-mongodb-url
No, I think it can still be useful for bulkInsert. I think bulkInsert cannot work with POJOs (if I'm not mistaken).
If someone knows how to use bulkInsert with POJOs, please share.

Nested Query in DynamoDB returns nothing

I'm using DynamoDB with the Java SDK, but I'm having some issues with querying nested documents. I've included simplified code below. If I remove the filter expression, then everything gets returned. With the filter expression, nothing is returned. I've also tried using withQueryFilterEntry(which I'd prefer to use) and I get the same results. Any help is appreciated. Most of the documentation and forums online seem to use an older version of the java sdk than I'm using.
Here's the Json
{
conf:
{type:"some"},
desc: "else"
}
Here's the query
DynamoDBQueryExpression<JobDO> queryExpression = new DynamoDBQueryExpression<PJobDO>();
queryExpression.withFilterExpression("conf.Type = :type").addExpressionAttributeValuesEntry(":type", new AttributeValue(type));
return dbMapper.query(getItemType(), queryExpression);
Is it a naming issue? (your sample json has "type" but the query is using "Type")
e.g. the following is working for me using DynamoDB Local:
public static void main(String [] args) {
AmazonDynamoDBClient client = new AmazonDynamoDBClient(new BasicAWSCredentials("akey1", "skey1"));
client.setEndpoint("http://localhost:8000");
DynamoDBMapper mapper = new DynamoDBMapper(client);
client.createTable(new CreateTableRequest()
.withTableName("nested-data-test")
.withAttributeDefinitions(new AttributeDefinition().withAttributeName("desc").withAttributeType("S"))
.withKeySchema(new KeySchemaElement().withKeyType("HASH").withAttributeName("desc"))
.withProvisionedThroughput(new ProvisionedThroughput().withReadCapacityUnits(1L).withWriteCapacityUnits(1L)));
NestedData u = new NestedData();
u.setDesc("else");
Map<String, String> c = new HashMap<String, String>();
c.put("type", "some");
u.setConf(c);
mapper.save(u);
DynamoDBQueryExpression<NestedData> queryExpression = new DynamoDBQueryExpression<NestedData>();
queryExpression.withHashKeyValues(u);
queryExpression.withFilterExpression("conf.#t = :type")
.addExpressionAttributeNamesEntry("#t", "type") // returns nothing if use "Type"
.addExpressionAttributeValuesEntry(":type", new AttributeValue("some"));
for(NestedData u2 : mapper.query(NestedData.class, queryExpression)) {
System.out.println(u2.getDesc()); // "else"
}
}
NestedData.java:
@DynamoDBTable(tableName = "nested-data-test")
public class NestedData {

    private String desc;
    private Map<String, String> conf;

    @DynamoDBHashKey
    public String getDesc() { return desc; }
    public void setDesc(String desc) { this.desc = desc; }

    @DynamoDBAttribute
    public Map<String, String> getConf() { return conf; }
    public void setConf(Map<String, String> conf) { this.conf = conf; }
}
