I have a method called printStatements(Model m, Resource s, Property p, Resource o). If I try to find all predicates and objects attached to 'adam', Eclipse throws 'adam cannot be resolved as a variable'. How can I use printStatements to interrogate a subject, predicate, or object in the model? Following is the code:
package jenaHelloWorld;
import java.util.*;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.Property;
import org.apache.jena.rdf.model.Resource;
import org.apache.jena.rdf.model.Statement;
import org.apache.jena.rdf.model.StmtIterator;
import org.apache.jena.util.PrintUtil;
import org.apache.jena.rdf.model.*;
/**
* A small family tree held in a Jena Model
*/
public class FamilyModel {
// Namespace declarations
static final String familyUri = "http://family/";
static final String relationshipUri = "http://purl.org/vocab/relationship/";
// Jena model representing the family
private Model model = ModelFactory.createDefaultModel();
/**
* Creates a model and populates it with family members and their
* relationships
*/
private FamilyModel() {
// Create an empty Model
// Create the types of Property we need to describe relationships
// in the model
Property childOf = model.createProperty(relationshipUri,"childOf");
Property parentOf = model.createProperty(relationshipUri,"parentOf");
Property siblingOf = model.createProperty(relationshipUri,"siblingOf");
Property spouseOf = model.createProperty(relationshipUri,"spouseOf");
// Create resources representing the people in our model
Resource adam = model.createResource(familyUri+"adam");
Resource beth = model.createResource(familyUri+"beth");
Resource chuck = model.createResource(familyUri+"chuck");
Resource dotty = model.createResource(familyUri+"dotty");
Resource edward = model.createResource(familyUri+"edward");
Resource fran = model.createResource(familyUri+"fran");
Resource greg = model.createResource(familyUri+"greg");
Resource harriet = model.createResource(familyUri+"harriet");
// Add properties describing the relationships between them
adam.addProperty(siblingOf,beth);
adam.addProperty(spouseOf,dotty);
adam.addProperty(parentOf,edward);
adam.addProperty(parentOf,fran);
beth.addProperty(siblingOf,adam);
beth.addProperty(spouseOf,chuck);
chuck.addProperty(spouseOf,beth);
dotty.addProperty(spouseOf,adam);
dotty.addProperty(parentOf,edward);
dotty.addProperty(parentOf,fran);
// Statements can also be directly created ...
Statement statement1 = model.createStatement(edward,childOf,adam);
Statement statement2 = model.createStatement(edward,childOf,dotty);
Statement statement3 = model.createStatement(edward,siblingOf,fran);
// ... then added to the model:
model.add(statement1);
model.add(statement2);
model.add(statement3);
// Arrays of Statements can also be added to a Model:
Statement statements[] = new Statement[5];
statements[0] = model.createStatement(fran,childOf,adam);
statements[1] = model.createStatement(fran,childOf,dotty);
statements[2] = model.createStatement(fran,siblingOf,edward);
statements[3] = model.createStatement(fran,spouseOf,greg);
statements[4] = model.createStatement(fran,parentOf,harriet);
model.add(statements);
// A List of Statements can also be added
List<Statement> list = new ArrayList<>();
list.add(model.createStatement(greg,spouseOf,fran));
list.add(model.createStatement(greg,parentOf,harriet));
list.add(model.createStatement(harriet,childOf,fran));
list.add(model.createStatement(harriet,childOf,greg));
model.add(list);
}
/**
* Creates a FamilyModel and dumps the content of its RDF representation
*/
public static void main(String args[]) {
// Create a model representing the family
FamilyModel theFamily = new FamilyModel();
// Dump out a String representation of the model
//System.out.println(theFamily.model);
//StmtIterator stmts = theFamily.listStatements( adam, null, (RDFNode) null );
printStatements(theFamily.model, adam,null, null);
}
private static void printStatements(Model m,Resource s,Property p,Resource o) {
for (StmtIterator i = m.listStatements(s,p,o); i.hasNext(); ) {
Statement stmt = i.nextStatement();
System.out.println(PrintUtil.print(stmt));
}
}
}
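For what it's worth, a likely fix: 'adam' is a local variable inside the FamilyModel constructor, so it is out of scope in main. A minimal sketch (reusing the familyUri constant from the class above) is to look the resource up again from the model before calling printStatements:
// In main, re-fetch the resource by URI instead of referencing the
// constructor-local variable; the nulls act as wildcards for the other slots.
Resource adam = theFamily.model.getResource(familyUri + "adam");
printStatements(theFamily.model, adam, null, null);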
The following Python code passes ["hello", "world"] into the Universal Sentence Encoder and returns an array of floats denoting their encoded representations.
import tensorflow as tf
import tensorflow_hub as hub
module = hub.KerasLayer("https://tfhub.dev/google/universal-sentence-encoder/4")
model = tf.keras.Sequential(module)
print("model: ", model(["hello", "world"]))
This code works but I'd now like to do the same thing using the Java API. I've successfully loaded the module, but I am unable to pass inputs into the model and extract the output. Here is what I've got so far:
import org.tensorflow.Graph;
import org.tensorflow.SavedModelBundle;
import org.tensorflow.Session;
import org.tensorflow.Tensor;
import org.tensorflow.Tensors;
import org.tensorflow.framework.ConfigProto;
import org.tensorflow.framework.GPUOptions;
import org.tensorflow.framework.GraphDef;
import org.tensorflow.framework.MetaGraphDef;
import org.tensorflow.framework.NodeDef;
import org.tensorflow.util.SaverDef;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
public final class NaiveBayesClassifier
{
public static void main(String[] args)
{
new NaiveBayesClassifier().run();
}
protected SavedModelBundle loadModule(Path source, String... tags) throws IOException
{
return SavedModelBundle.load(source.toAbsolutePath().normalize().toString(), tags);
}
public void run()
{
try (SavedModelBundle module = loadModule(Paths.get("universal-sentence-encoder"), "serve"))
{
Graph graph = module.graph();
try (Session session = new Session(graph, ConfigProto.newBuilder().
setGpuOptions(GPUOptions.newBuilder().setAllowGrowth(true)).
setAllowSoftPlacement(true).
build().toByteArray()))
{
Tensor<String> input = Tensors.create(new byte[][]
{
"hello".getBytes(StandardCharsets.UTF_8),
"world".getBytes(StandardCharsets.UTF_8)
});
List<Tensor<?>> result = session.runner().feed("serving_default_inputs", input).
addTarget("???").run();
}
}
catch (IOException e)
{
e.printStackTrace();
}
}
}
I used https://stackoverflow.com/a/51952478/14731 to scan the model for possible input/output nodes. I believe the input node is "serving_default_inputs" but I can't figure out the output node. More importantly, I don't have to specify any of these values when invoking the code in Python through Keras, so is there a way to do the same using the Java API?
UPDATE: Thanks to roywei, I can now confirm that the input node is serving_default_input and the output node is StatefulPartitionedCall_1, but when I plug these names into the aforementioned code I get:
2020-05-22 22:13:52.266287: W tensorflow/core/framework/op_kernel.cc:1651] OP_REQUIRES failed at lookup_table_op.cc:809 : Failed precondition: Table not initialized.
Exception in thread "main" java.lang.IllegalStateException: [_Derived_]{{function_node __inference_pruned_6741}} {{function_node __inference_pruned_6741}} Error while reading resource variable EncoderDNN/DNN/ResidualHidden_0/dense/kernel/part_25 from Container: localhost. This could mean that the variable was uninitialized. Not found: Resource localhost/EncoderDNN/DNN/ResidualHidden_0/dense/kernel/part_25/class tensorflow::Var does not exist.
[[{{node EncoderDNN/DNN/ResidualHidden_0/dense/kernel/ConcatPartitions/concat/ReadVariableOp_25}}]]
[[StatefulPartitionedCall_1/StatefulPartitionedCall]]
at libtensorflow#1.15.0/org.tensorflow.Session.run(Native Method)
at libtensorflow#1.15.0/org.tensorflow.Session.access$100(Session.java:48)
at libtensorflow#1.15.0/org.tensorflow.Session$Runner.runHelper(Session.java:326)
at libtensorflow#1.15.0/org.tensorflow.Session$Runner.run(Session.java:276)
Meaning, I still cannot invoke the model. What am I missing?
I figured it out after roywei pointed me in the right direction.
I needed to use SavedModelBundle.session() instead of constructing my own instance. This is because the loader initializes the graph variables.
Instead of passing a ConfigProto to the Session constructor, I passed it into the SavedModelBundle loader instead.
I needed to use fetch() instead of addTarget() to retrieve the output tensor.
Here is the working code:
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.tensorflow.SavedModelBundle;
import org.tensorflow.Shape;
import org.tensorflow.Tensor;
import org.tensorflow.TensorFlowException;
import org.tensorflow.Tensors;
import org.tensorflow.framework.ConfigProto;
import org.tensorflow.framework.GPUOptions;
import org.tensorflow.framework.MetaGraphDef;
import org.tensorflow.framework.SignatureDef;
import org.tensorflow.framework.TensorInfo;
import org.tensorflow.framework.TensorShapeProto;
import org.tensorflow.framework.TensorShapeProto.Dim;
// The imports above assume the libtensorflow 1.15 artifacts used elsewhere in this post.
public final class NaiveBayesClassifier
{
public static void main(String[] args)
{
new NaiveBayesClassifier().run();
}
public void run()
{
try (SavedModelBundle module = loadModule(Paths.get("universal-sentence-encoder"), "serve"))
{
try (Tensor<String> input = Tensors.create(new byte[][]
{
"hello".getBytes(StandardCharsets.UTF_8),
"world".getBytes(StandardCharsets.UTF_8)
}))
{
MetaGraphDef metadata = MetaGraphDef.parseFrom(module.metaGraphDef());
Map<String, Shape> nameToInput = getInputToShape(metadata);
String firstInput = nameToInput.keySet().iterator().next();
Map<String, Shape> nameToOutput = getOutputToShape(metadata);
String firstOutput = nameToOutput.keySet().iterator().next();
System.out.println("input: " + firstInput);
System.out.println("output: " + firstOutput);
System.out.println();
List<Tensor<?>> result = module.session().runner().feed(firstInput, input).
fetch(firstOutput).run();
for (Tensor<?> tensor : result)
{
float[][] array = new float[tensor.numDimensions()][tensor.numElements() /
tensor.numDimensions()];
tensor.copyTo(array);
System.out.println(Arrays.deepToString(array));
}
}
}
catch (IOException e)
{
e.printStackTrace();
}
}
/**
* Loads a graph from a file.
*
* @param source the directory to load from
* @param tags the model variant(s) to load
* @return the graph
* @throws NullPointerException if any of the arguments are null
* @throws IOException if an error occurs while reading the file
*/
protected SavedModelBundle loadModule(Path source, String... tags) throws IOException
{
// https://stackoverflow.com/a/43526228/14731
try
{
return SavedModelBundle.loader(source.toAbsolutePath().normalize().toString()).
withTags(tags).
withConfigProto(ConfigProto.newBuilder().
setGpuOptions(GPUOptions.newBuilder().setAllowGrowth(true)).
setAllowSoftPlacement(true).
build().toByteArray()).
load();
}
catch (TensorFlowException e)
{
throw new IOException(e);
}
}
/**
* @param metadata the graph metadata
* @return the first signature, or null
*/
private SignatureDef getFirstSignature(MetaGraphDef metadata)
{
Map<String, SignatureDef> nameToSignature = metadata.getSignatureDefMap();
if (nameToSignature.isEmpty())
return null;
return nameToSignature.get(nameToSignature.keySet().iterator().next());
}
/**
* @param metadata the graph metadata
* @return the output signature
*/
private SignatureDef getServingSignature(MetaGraphDef metadata)
{
return metadata.getSignatureDefOrDefault("serving_default", getFirstSignature(metadata));
}
/**
* @param metadata the graph metadata
* @return a map from an output name to its shape
*/
protected Map<String, Shape> getOutputToShape(MetaGraphDef metadata)
{
Map<String, Shape> result = new HashMap<>();
SignatureDef servingDefault = getServingSignature(metadata);
for (Map.Entry<String, TensorInfo> entry : servingDefault.getOutputsMap().entrySet())
{
TensorShapeProto shapeProto = entry.getValue().getTensorShape();
List<Dim> dimensions = shapeProto.getDimList();
long firstDimension = dimensions.get(0).getSize();
long[] remainingDimensions = dimensions.stream().skip(1).mapToLong(Dim::getSize).toArray();
Shape shape = Shape.make(firstDimension, remainingDimensions);
result.put(entry.getValue().getName(), shape);
}
return result;
}
/**
* @param metadata the graph metadata
* @return a map from an input name to its shape
*/
protected Map<String, Shape> getInputToShape(MetaGraphDef metadata)
{
Map<String, Shape> result = new HashMap<>();
SignatureDef servingDefault = getServingSignature(metadata);
for (Map.Entry<String, TensorInfo> entry : servingDefault.getInputsMap().entrySet())
{
TensorShapeProto shapeProto = entry.getValue().getTensorShape();
List<Dim> dimensions = shapeProto.getDimList();
long firstDimension = dimensions.get(0).getSize();
long[] remainingDimensions = dimensions.stream().skip(1).mapToLong(Dim::getSize).toArray();
Shape shape = Shape.make(firstDimension, remainingDimensions);
result.put(entry.getValue().getName(), shape);
}
return result;
}
}
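For reference, the universal-sentence-encoder models on TF Hub emit 512-dimensional embeddings, so with the two inputs above the printed array should have shape [2][512].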
There are two ways to get the names:
1) Using Java:
You can read the input and output names from the org.tensorflow.proto.framework.MetaGraphDef stored in saved model bundle.
Here is an example on how to extract the information:
https://github.com/awslabs/djl/blob/master/tensorflow/tensorflow-engine/src/main/java/ai/djl/tensorflow/engine/TfSymbolBlock.java#L149
2) Using python:
Load the saved model in TensorFlow Python and print the names:
loaded = tf.saved_model.load("path/to/model/")
print(list(loaded.signatures.keys()))
infer = loaded.signatures["serving_default"]
print(infer.structured_outputs)
I recommend taking a look at Deep Java Library; it automatically handles the input and output names.
It supports TensorFlow 2.1.0 and allows you to load Keras models as well as TF Hub SavedModels. Take a look at the documentation here and here.
Feel free to open an issue if you have problem loading your model.
You can load TF model with Deep Java Library
System.setProperty("ai.djl.repository.zoo.location", "https://storage.googleapis.com/tfhub-modules/google/universal-sentence-encoder/1.tar.gz?artifact_id=encoder");
Criteria<NDList, NDList> criteria =
Criteria.builder()
.setTypes(NDList.class, NDList.class)
.optArtifactId("ai.djl.localmodelzoo:encoder")
.build();
ZooModel<NDList, NDList> model = ModelZoo.loadModel(criteria);
See https://github.com/awslabs/djl/blob/master/docs/load_model.md#load-model-from-a-url for detail
I need to do the same, but there still seem to be lots of missing pieces regarding DJL usage. E.g., what do I do after this?
ZooModel<NDList, NDList> model = ModelZoo.loadModel(criteria);
I finally found an example in the DJL source code. The key take-away is to not use NDList for the input/output at all:
Criteria<String[], float[][]> criteria =
Criteria.builder()
.optApplication(Application.NLP.TEXT_EMBEDDING)
.setTypes(String[].class, float[][].class)
.optModelUrls(modelUrl)
.build();
try (ZooModel<String[], float[][]> model = ModelZoo.loadModel(criteria);
Predictor<String[], float[][]> predictor = model.newPredictor()) {
return predictor.predict(inputs.toArray(new String[0]));
}
See https://github.com/awslabs/djl/blob/master/examples/src/main/java/ai/djl/examples/inference/UniversalSentenceEncoder.java for the complete example.
I am using Riak KV with the Java client and I am unable to write to the RiakNode, although I have created a bucket with the name of the namespace I want to store an object in.
I basically have the TasteOfRiak.java class, which has been provided by the basho developer website: https://raw.githubusercontent.com/basho/basho_docs/master/extras/code-examples/TasteOfRiak.java
import com.basho.riak.client.api.RiakClient;
import com.basho.riak.client.api.commands.kv.DeleteValue;
import com.basho.riak.client.api.commands.kv.FetchValue;
import com.basho.riak.client.api.commands.kv.StoreValue;
import com.basho.riak.client.api.commands.kv.UpdateValue;
import com.basho.riak.client.core.RiakCluster;
import com.basho.riak.client.core.RiakNode;
import com.basho.riak.client.core.query.Location;
import com.basho.riak.client.core.query.Namespace;
import com.basho.riak.client.core.query.RiakObject;
import com.basho.riak.client.core.util.BinaryValue;
import java.net.UnknownHostException;
public class TasteOfRiak {
// A basic POJO class to demonstrate typed exchanges with Riak
public static class Book {
public String title;
public String author;
public String body;
public String isbn;
public Integer copiesOwned;
}
// This will allow us to update the book object handling the
// entire fetch/modify/update cycle.
public static class BookUpdate extends UpdateValue.Update<Book> {
private final Book update;
public BookUpdate(Book update){
this.update = update;
}
@Override
public Book apply(Book t) {
if(t == null) {
t = new Book();
}
t.author = update.author;
t.body = update.body;
t.copiesOwned = update.copiesOwned;
t.isbn = update.isbn;
t.title = update.title;
return t;
}
}
// This will create a client object that we can use to interact with Riak
private static RiakCluster setUpCluster() throws UnknownHostException {
// This example will use only one node listening on localhost:10017
RiakNode node = new RiakNode.Builder()
.withRemoteAddress("127.0.0.1")
.withRemotePort(8087)
.build();
// This cluster object takes our one node as an argument
RiakCluster cluster = new RiakCluster.Builder(node)
.build();
// The cluster must be started to work, otherwise you will see errors
cluster.start();
return cluster;
}
public static void main( String[] args ) {
try {
// First, we'll create a basic object storing a movie quote
RiakObject quoteObject = new RiakObject()
// We tell Riak that we're storing plaintext, not JSON, HTML, etc.
.setContentType("text/plain")
// Objects are ultimately stored as binaries
.setValue(BinaryValue.create("You're dangerous, Maverick"));
System.out.println("Basic object created");
// In the new Java client, instead of buckets you interact with Namespace
// objects, which consist of a bucket AND a bucket type; if you don't
// supply a bucket type, "default" is used; the Namespace below will set
// only a bucket, without supplying a bucket type
Namespace quotesBucket = new Namespace("quotes");
// With our Namespace object in hand, we can create a Location object,
// which allows us to pass in a key as well
Location quoteObjectLocation = new Location(quotesBucket, "Iceman");
System.out.println("Location object created for quote object");
// With our RiakObject in hand, we can create a StoreValue operation
StoreValue storeOp = new StoreValue.Builder(quoteObject)
.withLocation(quoteObjectLocation)
.build();
System.out.println("StoreValue operation created");
// And now we can use our setUpCluster() function to create a cluster
// object which we can then use to create a client object and then
// execute our storage operation
RiakCluster cluster = setUpCluster();
RiakClient client = new RiakClient(cluster);
System.out.println("Client object successfully created");
StoreValue.Response storeOpResp = client.execute(storeOp);
System.out.println("Object storage operation successfully completed");
// Now we can verify that the object has been stored properly by
// creating and executing a FetchValue operation
FetchValue fetchOp = new FetchValue.Builder(quoteObjectLocation)
.build();
RiakObject fetchedObject = client.execute(fetchOp).getValue(RiakObject.class);
assert(fetchedObject.getValue().equals(quoteObject.getValue()));
System.out.println("Success! The object we created and the object we fetched have the same value");
// Now update the fetched object
fetchedObject.setValue(BinaryValue.create("You can be my wingman any time."));
StoreValue updateOp = new StoreValue.Builder(fetchedObject)
.withLocation(quoteObjectLocation)
.build();
StoreValue.Response updateOpResp = client.execute(updateOp);
updateOpResp = client.execute(updateOp);
// And we'll delete the object
DeleteValue deleteOp = new DeleteValue.Builder(quoteObjectLocation)
.build();
client.execute(deleteOp);
System.out.println("Quote object successfully deleted");
Book mobyDick = new Book();
mobyDick.title = "Moby Dick";
mobyDick.author = "Herman Melville";
mobyDick.body = "Call me Ishmael. Some years ago...";
mobyDick.isbn = "1111979723";
mobyDick.copiesOwned = 3;
System.out.println("Book object created");
// Now we'll assign a Location for the book, create a StoreValue
// operation, and store the book
Namespace booksBucket = new Namespace("books");
Location mobyDickLocation = new Location(booksBucket, "moby_dick");
StoreValue storeBookOp = new StoreValue.Builder(mobyDick)
.withLocation(mobyDickLocation)
.build();
client.execute(storeBookOp);
System.out.println("Moby Dick information now stored in Riak");
// And we'll verify that we can fetch the info about Moby Dick and
// that that info will match the object we created initially
FetchValue fetchMobyDickOp = new FetchValue.Builder(mobyDickLocation)
.build();
Book fetchedBook = client.execute(fetchMobyDickOp).getValue(Book.class);
System.out.println("Book object successfully fetched");
assert(mobyDick.getClass() == fetchedBook.getClass());
assert(mobyDick.title.equals(fetchedBook.title));
assert(mobyDick.author.equals(fetchedBook.author));
// And so on...
// Now to update the book with additional copies
mobyDick.copiesOwned = 5;
BookUpdate updatedBook = new BookUpdate(mobyDick);
UpdateValue updateValue = new UpdateValue.Builder(mobyDickLocation)
.withUpdate(updatedBook).build();
UpdateValue.Response response = client.execute(updateValue);
System.out.println("Success! All of our tests check out");
// Now that we're all finished, we should shut our cluster object down
cluster.shutdown();
} catch (Exception e) {
System.out.println(e.getMessage());
}
}
}
Whenever Eclipse executes this code:
StoreValue.Response storeOpResp = client.execute(storeOp);
System.out.println("Object storage operation successfully completed");
I get the error "ERROR com.basho.riak.client.core.RiakNode - Write failed on RiakNode".
Before running that program I have already created a quotesBucket bucket and have activated it.
Does anyone know where the problem is?
Can you store an object over HTTP? Try this in a terminal:
curl -XPUT \
-H "Content-Type: text/plain" \
-d "You're dangerous, Maverick" \
http://localhost:8098/types/default/buckets/quotes/keys/Iceman?returnbody=true
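If the curl PUT succeeds, the HTTP listener on 8098 is healthy, which would point the Java failure at the protocol buffers listener on 8087 that the Java client connects to; check that riak.conf exposes it on that port and that no firewall blocks it. As a minimal connectivity sketch, assuming the 2.x Java client's newClient convenience factory (wrap in try/catch as in the main listing):
// Hypothetical smoke test: connect on the protobuf port and fetch the quote
// stored via curl above; a clean round-trip rules out connectivity issues.
RiakClient client = RiakClient.newClient(8087, "127.0.0.1");
Location loc = new Location(new Namespace("quotes"), "Iceman");
RiakObject fetched = client.execute(new FetchValue.Builder(loc).build())
.getValue(RiakObject.class);
System.out.println(fetched.getValue().toString());
client.shutdown();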
I'm trying to use the upsert feature of MongoDB v3.2 from Java,
so any solution not including a Java answer would not be accepted.
My problem is that the upsert command overrides nested objects instead of adding new ones. I have tried to use '$addToSet' and '$push', but without success, and I get an error message indicating that the storage engine does not support this command.
I want to update the client's document as well as their inner objects such as checks and checks's values.
the global structure of the client doc is as below.
Client
|
|__Checks // array of checks , update or insert operation
|
|__values // array of values, every check has its own values (20 max)
// update using index(id)
Link: Example's source code
My intention is to use only one query to update client's document without using many queries.
I'm not a specialist in MongoDB, so any advice or criticism would be appreciated.
Even if I'm doing this all wrong, feel free to tell me, and please use Java for Mongo 3.2.
Here is the source code used to generate the last result.
package org.egale.core;
import com.mongodb.MongoClient;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoDatabase;
import com.mongodb.client.model.UpdateOptions;
import java.util.ArrayList;
import java.util.List;
import org.bson.Document;
/**
*
* #author Zied
*/
public class MongoTest {
/**
* Pojo used to populate data
*/
static class CheckModel {
public String client;
public String checkId;
public String name;
public String command;
public String description;
public String topic;
public int refresh = 60;
public int status;
public String output;
}
static MongoClient mongoClient = new MongoClient();
static String dbName = "eagle";
private static List<Document> getCheckValues(CheckModel checkModel, int index) {
final List<Document> checkValues = new ArrayList<>();
final Document val = new Document()
.append("id", index)
.append("output", checkModel.output)
.append("status", checkModel.status);
checkValues.add(val); // a second execution should append a new value, not override the existing content
return checkValues;
}
private static void insertCheck(MongoDatabase db, CheckModel checkModel) {
int idx = ++index % 20;
final List<Document> checks = new ArrayList<>();
final Document check = new Document()
.append("name", checkModel.name)
.append("command", checkModel.command)
.append("id", checkModel.checkId)
.append("description", checkModel.description)
.append("topic", checkModel.topic)
.append("last_output", checkModel.output)
.append("index", index)
.append("last_status", checkModel.status)
.append("values", getCheckValues(checkModel,idx))
.append("refresh", checkModel.refresh);
checks.add(check);
Document client = new Document()
.append("name", checkModel.client)
.append("checks", checks);
//.append("$addToSet" , new Document("checks", checks)); // <<- error here '$addToSet' is not recocnized
db.getCollection("clients") // execute client insert or update
.updateOne(
new Document().append("_id", checkModel.client), new Document("$set", client), new UpdateOptions().upsert(true)
);
}
static int index = 0;
// Name of the topic from which we will receive messages = "testt"
public static void main(String[] args) {
MongoDatabase db = mongoClient.getDatabase(dbName);
CheckModel checkModel = new CheckModel();
checkModel.command = "ls -lA";
checkModel.client = "client_001";
checkModel.description = "ls -l command";
checkModel.checkId = "lsl_command";
checkModel.name = "client 001";
checkModel.output = "result of ls -l";
checkModel.status = 0;
checkModel.topic = "basic_checks";
checkModel.refresh = 5000;
initDB(db);
// insert the first check
insertCheck(db, checkModel);
// insert the second check after some modification
// insertCheck(db, modifyData(checkModel));
}
// modify data to test the check
private static CheckModel modifyData(CheckModel checkModel){
checkModel.status = 1;
checkModel.output = "ls commadn not found";
return checkModel;
}
private static void initDB(MongoDatabase db) {
MongoCollection<Document> collection = db.getCollection("configuration");
if (collection.count() == 0) {
Document b = new Document()
.append("_id", "app_config")
.append("historical_data", 20)
.append("current_index", 0);
collection.insertOne(b);
}
Document b = new Document().append("none", "none");
MongoCollection<Document> clients = db.getCollection("clients");
clients.insertOne(b);
clients.deleteOne(b);
MongoCollection<Document> topics = db.getCollection("topics");
topics.insertOne(b);
topics.deleteOne(b);
}
}
You may use $push with $each and $slice to solve your problem; see also https://docs.mongodb.org/manual/reference/operator/update/slice/.
Suppose db.students has the following document:
{ "_id" : 10, "scores" : [ 1, 2, 3 ] }
db.students.update(
{ _id: 10 },
{
$push: {
scores: {
$each: [ 4 ],
$slice: -3
}
}
}
)
result is:
{ "_id" : 10, "scores" : [ 2, 3, 4] }
I have created an ontology in Protégé and want to display the values of its object properties and save the values in an array so that I can use them to perform reasoning. The problem is that I am not able to retrieve the datatype values; only the domain and range are being displayed, but no errors are raised. Please help me find the solution.
import java.io.IOException;
import com.hp.hpl.jena.ontology.DatatypeProperty;
import com.hp.hpl.jena.ontology.EnumeratedClass;
import com.hp.hpl.jena.ontology.Individual;
import com.hp.hpl.jena.ontology.OntClass;
import com.hp.hpl.jena.ontology.OntModel;
import com.hp.hpl.jena.ontology.OntResource;
import com.hp.hpl.jena.rdf.model.Model;
import com.hp.hpl.jena.rdf.model.ModelFactory;
import com.hp.hpl.jena.rdf.model.RDFNode;
import com.hp.hpl.jena.util.FileManager;
import com.hp.hpl.jena.util.iterator.ExtendedIterator;
public class GetStartedSemWeb {
static String defaultNameSpace ="http://semanticweb.org/ontologies#";
Model schema = null;
public static void main(String[] args) throws IOException
{
GetStartedSemWeb getsemweb = new GetStartedSemWeb();
System.out.println(" Adding student ontology ");
getsemweb.loadontology();
}
private void loadontology() throws IOException
{
schema = ModelFactory.createOntologyModel();
java.io.InputStream inschema = FileManager.get().open("C:/Users/Desktop/Documents/Extracting knowledge from ontology using jena/getstarted.owl");
schema.read(inschema,defaultNameSpace);
System.out.println("new ontology added");
ExtendedIterator it = ((OntModel) schema).listClasses();
while(it.hasNext())
{
OntClass cls= (OntClass)it.next();
System.out.println("URI of classes of Merged University Ontology are "+cls.getURI());
ExtendedIterator pinstance = ((OntClass)cls).listInstances();
while(pinstance.hasNext())
{
Individual pinstance1= (Individual)pinstance.next();
System.out.println("Individual of " +cls.getLocalName() + pinstance1.getLocalName());
ExtendedIterator dp = ((OntModel) schema).listDatatypeProperties();
while(dp.hasNext())
{
DatatypeProperty p = (DatatypeProperty) dp.next();
if (p.isDatatypeProperty() && p.getDomain()!=null && p.getRange()!=null)
{
System.out.println("Data Property Name: "+ p.getLocalName());
System.out.println("Domain: "+ p.getDomain().getLocalName());
EnumeratedClass e = null;
ExtendedIterator i = null;
if(p.getRange().asClass().isEnumeratedClass())
{
e = p.getRange().asClass().asEnumeratedClass();
i = e.getOneOf().iterator();
RDFNode prop = null;
String s=null;
System.out.println("Range: ");
while(i.hasNext())
{
prop = (RDFNode) i.next();
s=((Object) prop).getClass().toString().split("\\^\\^")[0];
RDFNode propvalue = ((OntResource) prop).getPropertyValue(p);
System.out.println(" Property value is" +propvalue);
System.out.println(s);
}
}
else
System.out.println("Range: "+ p.getRange().getLocalName());
}
System.out.println("\n");
}
}
}
}
schema.close();
}
}
OWL file content
<rdf:RDF xml:base="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl"><owl:Ontology rdf:about="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl"><owl:versionIRI rdf:resource="http://www.semanticweb.org/ontologies/2013/9/Ontology1382504980350.owl"/></owl:Ontology><!--
///////////////////////////////////////////////////////////////////////////////////////
//
// Datatypes
//
///////////////////////////////////////////////////////////////////////////////////////
--><!--
///////////////////////////////////////////////////////////////////////////////////////
//
// Object Properties
//
///////////////////////////////////////////////////////////////////////////////////////
--><!-- http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#hasaAcquired --><owl:ObjectProperty rdf:about="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#hasaAcquired"> <rdfs:range rdf:resource="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Academic"/><rdfs:domain rdf:resource="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Student"/></owl:ObjectProperty><!-- http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#isAcquiredBy --><owl:ObjectProperty rdf:about="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#isAcquiredBy"><rdfs:domain rdf:resource="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Academic"/><rdfs:range rdf:resource="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Student"/><owl:inverseOf rdf:resource="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#hasaAcquired"/></owl:ObjectProperty><!--
///////////////////////////////////////////////////////////////////////////////////////
//
// Data properties
//
///////////////////////////////////////////////////////////////////////////////////////
--><!-- http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#hasAge --> <owl:DatatypeProperty rdf:about="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#hasAge"><rdfs:domain rdf:resource="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Student"/><rdfs:range rdf:resource="http://www.w3.org/2002/07/owl#real"/></owl:DatatypeProperty><!-- http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#hasAggregate --><owl:DatatypeProperty rdf:about="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#hasAggregate"><rdfs:domain rdf:resource="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Student"/><rdfs:range rdf:resource="http://www.w3.org/2001/XMLSchema#integer"/></owl:DatatypeProperty><!-- http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#hasEmailaddress --><owl:DatatypeProperty rdf:about="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#hasEmailaddress"><rdfs:domain rdf:resource="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Student"/><rdfs:range rdf:resource="http://www.w3.org/2001/XMLSchema#string"/></owl:DatatypeProperty><!-- http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#hasGender --><owl:DatatypeProperty rdf:about="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#hasGender"><rdfs:domain rdf:resource="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Student"/><rdfs:range rdf:resource="http://www.w3.org/2001/XMLSchema#string"/></owl:DatatypeProperty><!-- http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#hasName --><owl:DatatypeProperty rdf:about="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#hasName"><rdfs:domain rdf:resource="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Student"/><rdfs:range rdf:resource="http://www.w3.org/2001/XMLSchema#string"/></owl:DatatypeProperty><!-- http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#hasPostaladress --><owl:DatatypeProperty rdf:about="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#hasPostaladress"><rdfs:domain rdf:resource="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Student"/><rdfs:range rdf:resource="http://www.w3.org/2001/XMLSchema#string"/></owl:DatatypeProperty><!--
///////////////////////////////////////////////////////////////////////////////////////
//
// Classes
//
///////////////////////////////////////////////////////////////////////////////////////
--><!-- http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Academic --><owl:Class rdf:about="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Academic"><rdfs:subClassOf rdf:resource="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Student"/></owl:Class><!-- http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#General --><owl:Class rdf:about="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#General"><rdfs:subClassOf rdf:resource="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Student"/></owl:Class><!-- http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Personal --><owl:Class rdf:about="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Personal"><rdfs:subClassOf rdf:resource="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Student"/></owl:Class><!-- http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Student --><owl:Class rdf:about="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Student"/><!--
///////////////////////////////////////////////////////////////////////////////////////
//
// Individuals
//
///////////////////////////////////////////////////////////////////////////////////////
--><!-- http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#jack --><owl:NamedIndividual rdf:about="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#jack"><hasGender>male</hasGender><hasAggregate>50</hasAggregate><hasAge>20</hasAge><hasPostaladress>illonious chicago</hasPostaladress><hasName>jack</hasName><hasEmailaddress>jackid</hasEmailaddress></owl:NamedIndividual><!-- http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#jill_ --><owl:NamedIndividual rdf:about="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#jill_"><hasGender>male</hasGender><hasEmailaddress>jillid</hasEmailaddress><hasName>jill</hasName><hasAggregate>34</hasAggregate><hasPostaladress>chicago</hasPostaladress><hasAge>21</hasAge></owl:NamedIndividual><!-- http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#jim --><owl:NamedIndividual rdf:about="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#jim"><hasEmailaddress>jimid</hasEmailaddress><hasAggregate>70</hasAggregate><hasName>jim</hasName><hasGender>male</hasGender><hasAge>23</hasAge><hasPostaladress>chicago</hasPostaladress></owl:NamedIndividual><!--
///////////////////////////////////////////////////////////////////////////////////////
//
// General axioms
//
///////////////////////////////////////////////////////////////////////////////////////
--><rdf:Description><rdf:type rdf:resource="http://www.w3.org/2002/07/owl#AllDisjointClasses"/><owl:members rdf:parseType="Collection"><rdf:Description rdf:about="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Academic"/><rdf:Description rdf:about="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#General"/><rdf:Description rdf:about="http://www.semanticweb.org/ontologies/2013/9/23/Ontology1382505604507.owl#Personal"/></owl:members></rdf:Description></rdf:RDF>
Note that the range of a DatatypeProperty cannot logically be an EnumeratedClass. An enumerated class is an owl class defined to be a fixed collection of individuals. Those individuals cannot be literals. The values within the range of a DatatypeProperty can only be a Literal.
You want to use DataRange, which represents an enumeration of Literal values. Replace your tests of .asClass().isEnumeratedClass() with .isDataRange()
In your data, you want the range for your property to be crafted using owl:DataRange and owl:oneOf as demonstrated here.
This applies only if you wish to explore the theoretical space of allowed values for an owl:DatatypeProperty whose range is enumerated as a owl:DataRange. While your code implies that is your goal, the wording of your question suggests an alternative goal: you have some owl:DatatypeProperty, ex:p, and you wish to identify all of the objects of that property as they exist within your model.
If that is your goal, then you can simply ask the model for the literal values:
final ExtendedIterator<Literal> observedRange =
new NiceIterator<RDFNode>()
.andThen(Iter.distinct(model.listObjectsOfProperty(p)))
.mapWith(new Map1<RDFNode,Literal>(){
@Override
public Literal map1(final RDFNode o) {
return o.asLiteral();
}});
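If the actual goal is the literal values attached to each individual (as the instance loop in the question suggests), a simpler hedged sketch is to ask each individual directly for its statements rather than inspecting the property's declared range; pinstance1 and p are the variables from the question's code:
// List the actual literal values of datatype property p on one individual.
StmtIterator values = pinstance1.listProperties(p);
while (values.hasNext()) {
RDFNode value = values.nextStatement().getObject();
if (value.isLiteral()) {
System.out.println(p.getLocalName() + " = " + value.asLiteral().getLexicalForm());
}
}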
My goal is to write a Java applet that writes a Word document (fetched from the DB) to a temporary directory on the client machine and opens that document using Jacob.
Through Jacob I need to keep a handle to the opened document, so that after the user closes it I can save it back to the DB with the changes.
That said, the first thing I want to know is how to capture a close/exit event through Jacob when the user closes/exits the MS Word document. How can I achieve this?
I tried the code below, which is based on the code I saw in this answer: https://stackoverflow.com/a/12332421/3813385 but it only opens the document and does not listen for the close event...
package demo;
import com.jacob.activeX.ActiveXComponent;
import com.jacob.com.Dispatch;
import com.jacob.com.DispatchEvents;
import com.jacob.com.Variant;
public class WordEventTest {
public static void main(String[] args) {
WordEventTest wordEventTest = new WordEventTest();
wordEventTest.execute();
}
public void execute() {
String strDir = "D:\\fabricasw\\workspace\\jacob\\WebContent\\docs\\";
String strInputDoc = strDir + "file_in.doc";
String pid = "Word.Application";
ActiveXComponent axc = new ActiveXComponent(pid);
axc.setProperty("Visible", new Variant(true));
Dispatch oDocuments = axc.getProperty("Documents").toDispatch();
Dispatch oDocument = Dispatch.call(oDocuments, "Open", strInputDoc).toDispatch();
WordEventHandler w = new WordEventHandler();
new DispatchEvents(oDocument, w);
}
public class WordEventHandler {
public void Close(Variant[] arguments) {
System.out.println("closed word document");
}
}
}
I would appreciate it if you could post some Java code showing how; at the very least, how to obtain the contents of a Microsoft Word document and how to detect the application close event.
For handling events, I got help from this site:
http://danadler.com/jacob/
Here is my solution that works:
import java.io.File;
import java.util.ArrayList;
import java.util.List;
import com.jacob.com.DispatchEvents;
import com.jacob.com.Variant;
import com.microsoft.word.WordApplication;
import com.microsoft.word.WordDocument;
import com.microsoft.word.WordDocuments;
public class WordEventDemo {
private WordApplication wordApp = null;
private WordDocuments wordDocs = null;
private WordDocument wordDoc = null;
private WordAppEventListener wordAppEventListener = null;
private WordDocEventListener wordDocEventListener = null;
private List<DispatchEvents> dispatchEvents = new ArrayList<DispatchEvents>();
public WordEventDemo() {
}
/**
* Start Word, open the document and register listeners for Word events
*
* @param filename
* The path of the document fetched from the database
*/
public void start(String filename) throws Exception {
// get document from DB
File fFile = new File(filename); // replace by your code to retrieve file from your DB
// open document
// create WORD instance
wordApp = new WordApplication();
// get document list
wordDocs = wordApp.getDocuments();
Object oFile = fFile.getAbsolutePath();
Object oConversion = new Boolean(false);
Object oReadOnly = new Boolean(false);
wordDoc = wordDocs.Open(oFile, oConversion, oReadOnly);
wordDoc.Activate();
wordApp.setVisible(true);
// register listeners for the word app and document
wordAppEventListener = new WordAppEventListener();
dispatchEvents.add(new DispatchEvents(wordApp, wordAppEventListener));
wordDocEventListener = new WordDocEventListener();
dispatchEvents.add(new DispatchEvents(wordDoc, wordDocEventListener));
}
// This is the event interface for the word application
public class WordAppEventListener {
public WordAppEventListener() {
}
/**
* Triggered when the Word Application is closed.
*/
public void Quit(Variant[] args) {
// Perform operations on "Quit" event
System.out.println("quitting Word!");
}
/**
* Event called by Word Application when it attempt to save a file.<br>
* For Microsoft API reference, see <a
* href="http://msdn.microsoft.com/en-us/library/ff838299%28v=office.14%29.aspx"
* >http://msdn.microsoft.com/en-us/library/ff838299%28v=office.14%29.aspx</a>
*
* @param args
* An array of 3 Variants (WARNING, they are not in the same order indicated in the msdn link)
* @param args
* [0] <b>Cancel</b> : False when the event occurs. If the event procedure sets this argument to
* True, the document is not saved when the procedure is finished.
* @param args
* [1] <b>SaveAsUI</b> : True to display the Save As dialog box.
* @param args
* [2] <b>Doc</b> : The document that is being saved.
*/
public void DocumentBeforeSave(Variant[] args) {
// Perform operations on "DocumentBeforeSave" event
System.out.println("saving Word Document");
}
}
// This is the event interface for a word document
public class WordDocEventListener {
/**
* Triggered when a Word Document is closed.
*
* @param args
*/
public void Close(Variant[] args) {
// Perform operations on "Close" event
System.out.println("closing document");
}
}
}
Then I call it simply like the following:
WordEventDemo fixture = new WordEventDemo();
fixture.start("path/to/file.docx");
// add a waiting mechanism; a fixed sleep keeps this demo simple, but it could be tied to the Close or Quit event
Thread.sleep(20000);
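As a hedged refinement of that waiting mechanism, the fixed sleep could be replaced with a CountDownLatch released by the Quit handler; the latch below is an assumption, not part of the original listing:
// Hypothetical: give WordEventDemo a latch field that WordAppEventListener.Quit()
// counts down, then block on it instead of sleeping.
java.util.concurrent.CountDownLatch quitLatch = new java.util.concurrent.CountDownLatch(1);
WordEventDemo fixture = new WordEventDemo();
fixture.start("path/to/file.docx");
quitLatch.await(); // released by quitLatch.countDown() inside the Quit event handler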