My goal is to write a Java applet that writes a Word document (fetched from the DB) to a temporary directory on the client machine and opens that document using Jacob.
Through Jacob I need to keep a handle to the opened document, so that after the user closes it I can save it back to the DB with the changes.
That said, the first thing I want to know is how to capture a close/exit event through Jacob when the user closes/exits the MS Word document. How can I achieve this?
I tried the code below, which is based on the code I saw in this answer: https://stackoverflow.com/a/12332421/3813385 but it only opens the document and does not listen for the close event...
package demo;
import com.jacob.activeX.ActiveXComponent;
import com.jacob.com.Dispatch;
import com.jacob.com.DispatchEvents;
import com.jacob.com.Variant;
public class WordEventTest {
public static void main(String[] args) {
WordEventTest wordEventTest = new WordEventTest();
wordEventTest.execute();
}
public void execute() {
String strDir = "D:\\fabricasw\\workspace\\jacob\\WebContent\\docs\\";
String strInputDoc = strDir + "file_in.doc";
String pid = "Word.Application";
ActiveXComponent axc = new ActiveXComponent(pid);
axc.setProperty("Visible", new Variant(true));
Dispatch oDocuments = axc.getProperty("Documents").toDispatch();
Dispatch oDocument = Dispatch.call(oDocuments, "Open", strInputDoc).toDispatch();
WordEventHandler w = new WordEventHandler();
new DispatchEvents(oDocument, w);
}
public class WordEventHandler {
public void Close(Variant[] arguments) {
System.out.println("closed word document");
}
}
}
I would appreciate it if you could post some Java code showing how. At least how to obtain the contents of a Microsoft Word document and how to detect the application closing event.
For handling events, I got help from this site:
http://danadler.com/jacob/
Here is my solution that works:
import java.io.File;
import java.util.ArrayList;
import java.util.List;
import com.jacob.com.DispatchEvents;
import com.jacob.com.Variant;
import com.microsoft.word.WordApplication;
import com.microsoft.word.WordDocument;
import com.microsoft.word.WordDocuments;
public class WordEventDemo {
private WordApplication wordApp = null;
private WordDocuments wordDocs = null;
private WordDocument wordDoc = null;
private WordAppEventListener wordAppEventListener = null;
private WordDocEventListener wordDocEventListener = null;
private List<DispatchEvents> dispatchEvents = new ArrayList<DispatchEvents>();
public WordEventDemo() {
}
/**
* Start Word, open the document and register listener to Word events
*
* @param filename
*            The path of the document (in the original design, fetched by id from the database)
*/
public void start(String filename) throws Exception {
// get document from DB
File fFile = new File(filename); // replace by your code to retrieve file from your DB
// open document
// create WORD instance
wordApp = new WordApplication();
// get document list
wordDocs = wordApp.getDocuments();
Object oFile = fFile.getAbsolutePath();
Object oConversion = new Boolean(false);
Object oReadOnly = new Boolean(false);
wordDoc = wordDocs.Open(oFile, oConversion, oReadOnly);
wordDoc.Activate();
wordApp.setVisible(true);
// register listeners for the word app and document
wordAppEventListener = new WordAppEventListener();
dispatchEvents.add(new DispatchEvents(wordApp, wordAppEventListener));
wordDocEventListener = new WordDocEventListener();
dispatchEvents.add(new DispatchEvents(wordDoc, wordDocEventListener));
}
// This is the event interface for the word application
public class WordAppEventListener {
public WordAppEventListener() {
}
/**
* Triggered when the Word Application is closed.
*/
public void Quit(Variant[] args) {
// Perform operations on "Quit" event
System.out.println("quitting Word!");
}
/**
* Event called by the Word Application when it attempts to save a file.<br>
* For Microsoft API reference, see <a
* href="http://msdn.microsoft.com/en-us/library/ff838299%28v=office.14%29.aspx"
* >http://msdn.microsoft.com/en-us/library/ff838299%28v=office.14%29.aspx</a>
*
* @param args
*            An array of 3 Variants (WARNING: they are not in the same order indicated in the MSDN link):<br>
*            [0] <b>Cancel</b>: False when the event occurs. If the event procedure sets this argument to
*            True, the document is not saved when the procedure is finished.<br>
*            [1] <b>SaveAsUI</b>: True to display the Save As dialog box.<br>
*            [2] <b>Doc</b>: The document that is being saved.
*/
public void DocumentBeforeSave(Variant[] args) {
// Perform operations on "DocumentBeforeSave" event
System.out.println("saving Word Document");
}
}
// This is the event interface for a word document
public class WordDocEventListener {
/**
* Triggered when a Word Document is closed.
*
* @param args
*/
public void Close(Variant[] args) {
// Perform operations on "Close" event
System.out.println("closing document");
}
}
}
Then I call it simply like the following:
WordEventDemo fixture = new WordEventDemo();
fixture.start("path/to/file.docx");
// add a waiting mechanism (could be linked to the close or quit event), to make it simple here
Thread.sleep(20000);
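If you want to replace that Thread.sleep with a real wait, one option is a java.util.concurrent.CountDownLatch that the Close/Quit handler counts down. The sketch below shows only the waiting pattern; a plain thread stands in for the Jacob COM callback, and the class and method names are mine:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

public class CloseWait {
    // Blocks until the close signal arrives or the timeout expires;
    // returns true if the close event was seen in time.
    static boolean waitForClose(CountDownLatch closed, long timeoutSeconds)
            throws InterruptedException {
        return closed.await(timeoutSeconds, TimeUnit.SECONDS);
    }

    public static void main(String[] args) throws InterruptedException {
        CountDownLatch closed = new CountDownLatch(1);

        // Stand-in for the Jacob callback: in the real code,
        // WordDocEventListener.Close(Variant[]) would call closed.countDown().
        new Thread(() -> {
            try {
                Thread.sleep(200); // simulate the user editing, then closing Word
            } catch (InterruptedException ignored) {
            }
            closed.countDown(); // signal "document closed"
        }).start();

        boolean sawClose = waitForClose(closed, 5);
        System.out.println(sawClose ? "document closed" : "timed out");
    }
}
```

In the real program, WordDocEventListener.Close(...) (or the application-level Quit handler) would call closed.countDown(), and the launcher would call waitForClose instead of sleeping for a fixed 20 seconds.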
The following Python code passes ["hello", "world"] into the Universal Sentence Encoder and returns an array of floats denoting their encoded representation.
import tensorflow as tf
import tensorflow_hub as hub
module = hub.KerasLayer("https://tfhub.dev/google/universal-sentence-encoder/4")
model = tf.keras.Sequential(module)
print("model: ", model(["hello", "world"]))
This code works but I'd now like to do the same thing using the Java API. I've successfully loaded the module, but I am unable to pass inputs into the model and extract the output. Here is what I've got so far:
import org.tensorflow.Graph;
import org.tensorflow.SavedModelBundle;
import org.tensorflow.Session;
import org.tensorflow.Tensor;
import org.tensorflow.Tensors;
import org.tensorflow.framework.ConfigProto;
import org.tensorflow.framework.GPUOptions;
import org.tensorflow.framework.GraphDef;
import org.tensorflow.framework.MetaGraphDef;
import org.tensorflow.framework.NodeDef;
import org.tensorflow.util.SaverDef;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Paths;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;
public final class NaiveBayesClassifier
{
public static void main(String[] args)
{
new NaiveBayesClassifier().run();
}
protected SavedModelBundle loadModule(Path source, String... tags) throws IOException
{
return SavedModelBundle.load(source.toAbsolutePath().normalize().toString(), tags);
}
public void run()
{
try (SavedModelBundle module = loadModule(Paths.get("universal-sentence-encoder"), "serve"))
{
Graph graph = module.graph();
try (Session session = new Session(graph, ConfigProto.newBuilder().
setGpuOptions(GPUOptions.newBuilder().setAllowGrowth(true)).
setAllowSoftPlacement(true).
build().toByteArray()))
{
Tensor<String> input = Tensors.create(new byte[][]
{
"hello".getBytes(StandardCharsets.UTF_8),
"world".getBytes(StandardCharsets.UTF_8)
});
List<Tensor<?>> result = session.runner().feed("serving_default_inputs", input).
addTarget("???").run();
}
}
catch (IOException e)
{
e.printStackTrace();
}
}
}
I used https://stackoverflow.com/a/51952478/14731 to scan the model for possible input/output nodes. I believe the input node is "serving_default_inputs" but I can't figure out the output node. More importantly, I don't have to specify any of these values when invoking the code in python through Keras so is there a way to do the same using the Java API?
UPDATE: Thanks to roywei I can now confirm that the input node is serving_default_input and the output node is StatefulPartitionedCall_1, but when I plug these names into the aforementioned code I get:
2020-05-22 22:13:52.266287: W tensorflow/core/framework/op_kernel.cc:1651] OP_REQUIRES failed at lookup_table_op.cc:809 : Failed precondition: Table not initialized.
Exception in thread "main" java.lang.IllegalStateException: [_Derived_]{{function_node __inference_pruned_6741}} {{function_node __inference_pruned_6741}} Error while reading resource variable EncoderDNN/DNN/ResidualHidden_0/dense/kernel/part_25 from Container: localhost. This could mean that the variable was uninitialized. Not found: Resource localhost/EncoderDNN/DNN/ResidualHidden_0/dense/kernel/part_25/class tensorflow::Var does not exist.
[[{{node EncoderDNN/DNN/ResidualHidden_0/dense/kernel/ConcatPartitions/concat/ReadVariableOp_25}}]]
[[StatefulPartitionedCall_1/StatefulPartitionedCall]]
at libtensorflow#1.15.0/org.tensorflow.Session.run(Native Method)
at libtensorflow#1.15.0/org.tensorflow.Session.access$100(Session.java:48)
at libtensorflow#1.15.0/org.tensorflow.Session$Runner.runHelper(Session.java:326)
at libtensorflow#1.15.0/org.tensorflow.Session$Runner.run(Session.java:276)
Meaning, I still cannot invoke the model. What am I missing?
I figured it out after roywei pointed me in the right direction.
I needed to use SavedModelBundle.session() instead of constructing my own instance. This is because the loader initializes the graph variables.
Instead of passing a ConfigProto to the Session constructor, I passed it into the SavedModelBundle loader instead.
I needed to use fetch() instead of addTarget() to retrieve the output tensor.
Here is the working code:
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.tensorflow.SavedModelBundle;
import org.tensorflow.Shape;
import org.tensorflow.Tensor;
import org.tensorflow.TensorFlowException;
import org.tensorflow.Tensors;
import org.tensorflow.framework.ConfigProto;
import org.tensorflow.framework.GPUOptions;
import org.tensorflow.framework.MetaGraphDef;
import org.tensorflow.framework.SignatureDef;
import org.tensorflow.framework.TensorInfo;
import org.tensorflow.framework.TensorShapeProto;
import org.tensorflow.framework.TensorShapeProto.Dim;
public final class NaiveBayesClassifier
{
public static void main(String[] args)
{
new NaiveBayesClassifier().run();
}
public void run()
{
try (SavedModelBundle module = loadModule(Paths.get("universal-sentence-encoder"), "serve"))
{
try (Tensor<String> input = Tensors.create(new byte[][]
{
"hello".getBytes(StandardCharsets.UTF_8),
"world".getBytes(StandardCharsets.UTF_8)
}))
{
MetaGraphDef metadata = MetaGraphDef.parseFrom(module.metaGraphDef());
Map<String, Shape> nameToInput = getInputToShape(metadata);
String firstInput = nameToInput.keySet().iterator().next();
Map<String, Shape> nameToOutput = getOutputToShape(metadata);
String firstOutput = nameToOutput.keySet().iterator().next();
System.out.println("input: " + firstInput);
System.out.println("output: " + firstOutput);
System.out.println();
List<Tensor<?>> result = module.session().runner().feed(firstInput, input).
fetch(firstOutput).run();
for (Tensor<?> tensor : result)
{
float[][] array = new float[tensor.numDimensions()][tensor.numElements() /
tensor.numDimensions()];
tensor.copyTo(array);
System.out.println(Arrays.deepToString(array));
}
}
}
catch (IOException e)
{
e.printStackTrace();
}
}
/**
* Loads a graph from a file.
*
* @param source the directory to load from
* @param tags the model variant(s) to load
* @return the graph
* @throws NullPointerException if any of the arguments are null
* @throws IOException if an error occurs while reading the file
*/
protected SavedModelBundle loadModule(Path source, String... tags) throws IOException
{
// https://stackoverflow.com/a/43526228/14731
try
{
return SavedModelBundle.loader(source.toAbsolutePath().normalize().toString()).
withTags(tags).
withConfigProto(ConfigProto.newBuilder().
setGpuOptions(GPUOptions.newBuilder().setAllowGrowth(true)).
setAllowSoftPlacement(true).
build().toByteArray()).
load();
}
catch (TensorFlowException e)
{
throw new IOException(e);
}
}
/**
* @param metadata the graph metadata
* @return the first signature, or null
*/
private SignatureDef getFirstSignature(MetaGraphDef metadata)
{
Map<String, SignatureDef> nameToSignature = metadata.getSignatureDefMap();
if (nameToSignature.isEmpty())
return null;
return nameToSignature.get(nameToSignature.keySet().iterator().next());
}
/**
* @param metadata the graph metadata
* @return the output signature
*/
private SignatureDef getServingSignature(MetaGraphDef metadata)
{
return metadata.getSignatureDefOrDefault("serving_default", getFirstSignature(metadata));
}
/**
* @param metadata the graph metadata
* @return a map from an output name to its shape
*/
protected Map<String, Shape> getOutputToShape(MetaGraphDef metadata)
{
Map<String, Shape> result = new HashMap<>();
SignatureDef servingDefault = getServingSignature(metadata);
for (Map.Entry<String, TensorInfo> entry : servingDefault.getOutputsMap().entrySet())
{
TensorShapeProto shapeProto = entry.getValue().getTensorShape();
List<Dim> dimensions = shapeProto.getDimList();
long firstDimension = dimensions.get(0).getSize();
long[] remainingDimensions = dimensions.stream().skip(1).mapToLong(Dim::getSize).toArray();
Shape shape = Shape.make(firstDimension, remainingDimensions);
result.put(entry.getValue().getName(), shape);
}
return result;
}
/**
* @param metadata the graph metadata
* @return a map from an input name to its shape
*/
protected Map<String, Shape> getInputToShape(MetaGraphDef metadata)
{
Map<String, Shape> result = new HashMap<>();
SignatureDef servingDefault = getServingSignature(metadata);
for (Map.Entry<String, TensorInfo> entry : servingDefault.getInputsMap().entrySet())
{
TensorShapeProto shapeProto = entry.getValue().getTensorShape();
List<Dim> dimensions = shapeProto.getDimList();
long firstDimension = dimensions.get(0).getSize();
long[] remainingDimensions = dimensions.stream().skip(1).mapToLong(Dim::getSize).toArray();
Shape shape = Shape.make(firstDimension, remainingDimensions);
result.put(entry.getValue().getName(), shape);
}
return result;
}
}
There are two ways to get the names:
1) Using Java:
You can read the input and output names from the org.tensorflow.proto.framework.MetaGraphDef stored in saved model bundle.
Here is an example on how to extract the information:
https://github.com/awslabs/djl/blob/master/tensorflow/tensorflow-engine/src/main/java/ai/djl/tensorflow/engine/TfSymbolBlock.java#L149
2) Using python:
Load the saved model in TensorFlow Python and print the names:
loaded = tf.saved_model.load("path/to/model/")
print(list(loaded.signatures.keys()))
infer = loaded.signatures["serving_default"]
print(infer.structured_outputs)
I recommend taking a look at Deep Java Library; it automatically handles the input and output names.
It supports TensorFlow 2.1.0 and allows you to load Keras models as well as TF Hub SavedModels. Take a look at the documentation here and here.
Feel free to open an issue if you have problem loading your model.
You can load TF model with Deep Java Library
System.setProperty("ai.djl.repository.zoo.location", "https://storage.googleapis.com/tfhub-modules/google/universal-sentence-encoder/1.tar.gz?artifact_id=encoder");
Criteria<NDList, NDList> criteria =
Criteria.builder()
.setTypes(NDList.class, NDList.class)
.optArtifactId("ai.djl.localmodelzoo:encoder")
.build();
ZooModel<NDList, NDList> model = ModelZoo.loadModel(criteria);
See https://github.com/awslabs/djl/blob/master/docs/load_model.md#load-model-from-a-url for detail
I need to do the same, but there still seem to be a lot of missing pieces regarding DJL usage. E.g., what do you do after this?:
ZooModel<NDList, NDList> model = ModelZoo.loadModel(criteria);
I finally found an example in the DJL source code. The key take-away is to not use NDList for the input/output at all:
Criteria<String[], float[][]> criteria =
Criteria.builder()
.optApplication(Application.NLP.TEXT_EMBEDDING)
.setTypes(String[].class, float[][].class)
.optModelUrls(modelUrl)
.build();
try (ZooModel<String[], float[][]> model = ModelZoo.loadModel(criteria);
Predictor<String[], float[][]> predictor = model.newPredictor()) {
return predictor.predict(inputs.toArray(new String[0]));
}
See https://github.com/awslabs/djl/blob/master/examples/src/main/java/ai/djl/examples/inference/UniversalSentenceEncoder.java for the complete example.
I have two PCs on which I am running agents. Both are connected by LAN (or WiFi). I want these agents to communicate. One of the ways I found is by giving the agent's full address. Below is the code snippet.
AID a = new AID("A@192.168.14.51:1099/JADE",AID.ISGUID);
a.addAddresses("http://192.168.14.51:7778/acc");
msg.addReceiver(a);
send(msg);
However, once I start agents on one platform, I want the agents on the other platform to be able to register services in its yellow pages, so that I can search for the appropriate agent from a list of them. I looked but could not find anything about it. Please give me suggestions on how I can achieve this.
Well, you are looking for DF federation. As far as I understand, it is nothing but 'connecting' DFs.
There is an example in the yellowPages package in the 'jade all examples' folder. It creates register, subscriber, searcher and subDF agents. The register agent registers a service with some property and the other agents do their jobs. SubDF creates a child DF, which involves DF federation. For you, I modified the code as follows:
The next three agents run on port 1099:
1)
package examples.yellowPages;
import jade.core.Agent;
import jade.core.AID;
import jade.domain.DFService;
import jade.domain.FIPAException;
import jade.domain.FIPANames;
import jade.domain.FIPAAgentManagement.DFAgentDescription;
import jade.domain.FIPAAgentManagement.ServiceDescription;
import jade.domain.FIPAAgentManagement.Property;
/**
This example shows how to register an application specific service in the Yellow Pages
catalogue managed by the DF Agent so that other agents can dynamically discover it.
In this case in particular we register a "Weather-forecast" service for
Italy. The name of this service is specified as a command line argument.
@author Giovanni Caire - TILAB
*/
public class DFRegisterAgent extends Agent {
protected void setup() {
String serviceName = "unknown";
// Read the name of the service to register as an argument
Object[] args = getArguments();
if (args != null && args.length > 0) {
serviceName = (String) args[0];
}
// Register the service
System.out.println("Agent "+getLocalName()+" registering service \""+serviceName+"\" of type \"weather-forecast\"");
try {
DFAgentDescription dfd = new DFAgentDescription();
dfd.setName(getAID());
ServiceDescription sd = new ServiceDescription();
sd.setName(serviceName);
sd.setType("weather-forecast");
// Agents that want to use this service need to "know" the weather-forecast-ontology
sd.addOntologies("weather-forecast-ontology");
// Agents that want to use this service need to "speak" the FIPA-SL language
sd.addLanguages(FIPANames.ContentLanguage.FIPA_SL);
sd.addProperties(new Property("country", "Italy"));
dfd.addServices(sd);
DFService.register(this, dfd);
}
catch (FIPAException fe) {
fe.printStackTrace();
}
}
}
2)
package examples.yellowPages;
import jade.core.Agent;
import jade.core.AID;
import jade.domain.DFService;
import jade.domain.FIPAException;
import jade.domain.FIPANames;
import jade.domain.FIPAAgentManagement.DFAgentDescription;
import jade.domain.FIPAAgentManagement.ServiceDescription;
import jade.domain.FIPAAgentManagement.SearchConstraints;
import jade.util.leap.Iterator;
/**
This example shows how to search for services provided by other agents
and advertised in the Yellow Pages catalogue managed by the DF agent.
In this case in particular we search for agents providing a
"Weather-forecast" service.
@author Giovanni Caire - TILAB
*/
public class DFSearchAgent extends Agent {
protected void setup() {
// Search for services of type "weather-forecast"
System.out.println("Agent "+getLocalName()+" searching for services of type \"weather-forecast\"");
try {
// Build the description used as template for the search
DFAgentDescription template = new DFAgentDescription();
ServiceDescription templateSd = new ServiceDescription();
templateSd.setType("weather-forecast");
template.addServices(templateSd);
SearchConstraints sc = new SearchConstraints();
// We want to receive 10 results at most
sc.setMaxResults(new Long(10));
DFAgentDescription[] results = DFService.search(this, template, sc);
if (results.length > 0) {
System.out.println("Agent "+getLocalName()+" found the following weather-forecast services:");
for (int i = 0; i < results.length; ++i) {
DFAgentDescription dfd = results[i];
AID provider = dfd.getName();
// The same agent may provide several services; we are only interested
// in the weather-forcast one
Iterator it = dfd.getAllServices();
while (it.hasNext()) {
ServiceDescription sd = (ServiceDescription) it.next();
if (sd.getType().equals("weather-forecast")) {
System.out.println("- Service \""+sd.getName()+"\" provided by agent "+provider.getName());
}
}
}
}
else {
System.out.println("Agent "+getLocalName()+" did not find any weather-forecast service");
}
}
catch (FIPAException fe) {
fe.printStackTrace();
}
}
}
3)
package examples.yellowPages;
import jade.core.Agent;
import jade.core.AID;
import jade.domain.DFService;
import jade.domain.FIPAException;
import jade.domain.FIPANames;
import jade.domain.FIPAAgentManagement.DFAgentDescription;
import jade.domain.FIPAAgentManagement.ServiceDescription;
import jade.domain.FIPAAgentManagement.Property;
import jade.domain.FIPAAgentManagement.SearchConstraints;
import jade.proto.SubscriptionInitiator;
import jade.lang.acl.ACLMessage;
import jade.util.leap.Iterator;
/**
This example shows how to subscribe to the DF agent in order to be notified
each time a given service is published in the yellow pages catalogue.
In this case in particular we want to be informed whenever a service of type
"Weather-forecast" for Italy becomes available.
@author Giovanni Caire - TILAB
*/
public class DFSubscribeAgent extends Agent {
protected void setup() {
// Build the description used as template for the subscription
DFAgentDescription template = new DFAgentDescription();
ServiceDescription templateSd = new ServiceDescription();
templateSd.setType("weather-forecast");
templateSd.addProperties(new Property("country", "Italy"));
template.addServices(templateSd);
SearchConstraints sc = new SearchConstraints();
// We want to receive 10 results at most
sc.setMaxResults(new Long(10));
addBehaviour(new SubscriptionInitiator(this, DFService.createSubscriptionMessage(this, getDefaultDF(), template, sc)) {
protected void handleInform(ACLMessage inform) {
System.out.println("Agent "+getLocalName()+": Notification received from DF");
try {
DFAgentDescription[] results = DFService.decodeNotification(inform.getContent());
if (results.length > 0) {
for (int i = 0; i < results.length; ++i) {
DFAgentDescription dfd = results[i];
AID provider = dfd.getName();
// The same agent may provide several services; we are only interested
// in the weather-forcast one
Iterator it = dfd.getAllServices();
while (it.hasNext()) {
ServiceDescription sd = (ServiceDescription) it.next();
if (sd.getType().equals("weather-forecast")) {
System.out.println("Weather-forecast service for Italy found:");
System.out.println("- Service \""+sd.getName()+"\" provided by agent "+provider.getName());
}
}
}
}
System.out.println();
}
catch (FIPAException fe) {
fe.printStackTrace();
}
}
} );
}
}
4) This is the last one. It creates a DF and registers with the DFRegister agent, i.e. DF federation is done. I ran this on port 1331. Remember to change the IP addresses. (You can run an agent on a different port by using -local-port 1331.)
Remember to run the previous agents before this one.
You can put it in a different Eclipse project and run it.
import jade.core.*;
import jade.core.behaviours.*;
import jade.domain.FIPAAgentManagement.*;
import jade.domain.FIPAException;
import jade.domain.DFService;
import jade.domain.FIPANames;
import jade.util.leap.Iterator;
/**
This is an example of an agent that plays the role of a sub-df by
automatically registering with a parent DF.
Notice that exactly the same might be done by using the GUI of the DF.
<p>
This SUBDF inherits all the functionalities of the default DF, including
its GUI.
@author Giovanni Rimassa - Universita` di Parma
@version $Date: 2003-12-03 17:57:03 +0100 (mer, 03 dic 2003) $ $Revision: 4638 $
*/
public class SubDF2 extends jade.domain.df {
public void setup() {
// Input df name
int len = 0;
byte[] buffer = new byte[1024];
try {
// AID parentName = getDefaultDF();
AID parentName = new AID("df@10.251.216.135:1099/JADE", AID.ISGUID);
parentName.addAddresses("http://NikhilChilwant:7778/acc");
//Execute the setup of jade.domain.df which includes all the default behaviours of a df
//(i.e. register, unregister,modify, and search).
super.setup();
//Use this method to modify the current description of this df.
setDescriptionOfThisDF(getDescription());
//Show the default Gui of a df.
super.showGui();
DFService.register(this,parentName,getDescription());
addParent(parentName,getDescription());
System.out.println("Agent: " + getName() + " federated with default df.");
DFAgentDescription template = new DFAgentDescription();
ServiceDescription templateSd = new ServiceDescription();
templateSd.setType("weather-forecast");
templateSd.addProperties(new Property("country", "Italy"));
template.addServices(templateSd);
SearchConstraints sc = new SearchConstraints();
// We want to receive 10 results at most
sc.setMaxResults(new Long(10));
DFAgentDescription[] results = DFService.search(this,parentName, template, sc);
/* if (results.length > 0) {*/
System.out.println("SUB DF ***Agent "+getLocalName()+" found the following weather-forecast services:");
for (int i = 0; i < results.length; ++i) {
DFAgentDescription dfd = results[i];
AID provider = dfd.getName();
// The same agent may provide several services; we are only interested
// in the weather-forcast one
Iterator it = dfd.getAllServices();
while (it.hasNext()) {
ServiceDescription sd = (ServiceDescription) it.next();
if (sd.getType().equals("weather-forecast")) {
System.out.println("- Service \""+sd.getName()+"\" provided by agent "+provider.getName());
}
}
}/*}*/
String serviceName = "unknown2";
DFAgentDescription dfd = new DFAgentDescription();
dfd.setName(getAID());
ServiceDescription sd = new ServiceDescription();
sd.setName(serviceName);
sd.setType("weather-forecast2");
// Agents that want to use this service need to "know" the weather-forecast-ontology
sd.addOntologies("weather-forecast-ontology2");
// Agents that want to use this service need to "speak" the FIPA-SL language
sd.addLanguages(FIPANames.ContentLanguage.FIPA_SL);
sd.addProperties(new Property("country2", "Italy2"));
dfd.addServices(sd);
DFService.register(this, parentName,dfd);
}catch(FIPAException fe){fe.printStackTrace();}
}
private DFAgentDescription getDescription()
{
DFAgentDescription dfd = new DFAgentDescription();
dfd.setName(getAID());
ServiceDescription sd = new ServiceDescription();
sd.setName(getLocalName() + "-sub-df");
sd.setType("fipa-df");
sd.addProtocols(FIPANames.InteractionProtocol.FIPA_REQUEST);
sd.addOntologies("fipa-agent-management");
sd.setOwnership("JADE");
dfd.addServices(sd);
return dfd;
}
}
After running the code you can see that the SubDF agent is able to find the agent which is registered on its federated DF.
You can also download the complete code here: http://tinyurl.com/Agent-on-different-platforms
I have the following classes in the project:
MailServer
MailClient
MailItem
I have to modify the MailServer so that it uses a HashMap to store MailItems instead of
an ArrayList. The keys to the HashMap must be the names of the recipients,
and each value must be an ArrayList containing all the MailItems stored for
that recipient. The names of the recipients must be case-insensitive, i.e. “paul” and “Paul” and “PAUL” are all the same person.
I'm not sure how or where to start with making the names of the recipients case-insensitive. I would appreciate any help. Thanks.
Below is my source code:
import java.util.ArrayList;
import java.util.List;
import java.util.Iterator;
import java.util.HashMap;
/**
* A simple model of a mail server. The server is able to receive
* mail items for storage, and deliver them to clients on demand.
*
* @author David J. Barnes and Michael Kölling
* @version 2011.07.31
*/
public class MailServer
{
// Storage for the arbitrary number of mail items to be stored
// on the server.
private HashMap<String, ArrayList<MailItem>> items;
/**
* Construct a mail server.
*/
public MailServer()
{
items = new HashMap<String, ArrayList<MailItem>>();
}
/**
* Return how many mail items are waiting for a user.
* @param who The user to check for.
* @return How many items are waiting.
*/
public int howManyMailItems(String who)
{
if(who != null) {
who = formatName(who);
}
// Look up this recipient's list and count its items.
ArrayList<MailItem> mails = items.get(who);
return (mails == null) ? 0 : mails.size();
}
/**
* Return the next mail item for a user or null if there
* are none.
* @param who The user requesting their next item.
* @return The user's next item.
*/
public MailItem getNextMailItem(String who)
{
if(who != null) {
who = formatName(who);
}
ArrayList<MailItem> mails = items.get((who));
if(mails == null) {
return null;
}
Iterator<MailItem> it = mails.iterator();
while(it.hasNext()) {
MailItem mail = it.next();
if(mail.getTo().equals(who)) {
it.remove();
return mail;
}
}
return null;
}
/**
* Add the given mail item to the message list.
* @param item The mail item to be stored on the server.
*/
public void post(MailItem item)
{
String who = item.getTo();
if(who != null) {
who = formatName(who);
}
if(!items.containsKey(who)) {
items.put(who, new ArrayList<MailItem>());
}
items.get(who).add(item);
}
private static String formatName(String who) {
if(who.length() > 0) {
return who.toLowerCase();
}
return "";
}
}
/**
* A class to model a simple email client. The client is run by a
* particular user, and sends and retrieves mail via a particular server.
*
* @author David J. Barnes and Michael Kölling
* @version 2011.07.31
*/
public class MailClient
{
// The server used for sending and receiving.
private MailServer server;
// The user running this client.
private String user;
/**
* Create a mail client run by user and attached to the given server.
*/
public MailClient(MailServer server, String user)
{
this.server = server;
this.user = user;
}
/**
* Return the next mail item (if any) for this user.
*/
public MailItem getNextMailItem()
{
return server.getNextMailItem(user);
}
/**
* Print the next mail item (if any) for this user to the text
* terminal.
*/
public void printNextMailItem()
{
MailItem item = server.getNextMailItem(user);
if(item == null) {
System.out.println("No new mail.");
}
else {
item.print();
}
}
/**
* Send the given message to the given recipient via
* the attached mail server.
* @param to The intended recipient.
* @param message The text of the message to be sent.
*/
public void sendMailItem(String to, String subject, String message)
{
MailItem item = new MailItem(user, to, subject, message);
server.post(item);
}
}
If I understand your question properly, recipients must be case-insensitive, and they are to be used as keys in a HashMap.
Why not just use the toLowerCase function on String, and use that version of the recipient as the key? That way, "PAUL", "Paul", and "paul" all get turned into "paul" for purposes of lookup.
You'd apply this by lowercasing the name input at the beginning of any function that uses it.
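To make that concrete, here is a minimal stand-alone sketch of the normalize-once idea. Plain Strings stand in for MailItem, and the class and method names are only illustrative:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class CaseInsensitiveMailbox {
    private final Map<String, List<String>> items = new HashMap<>();

    // Single place where the case-insensitivity rule lives.
    private static String normalize(String who) {
        return who == null ? "" : who.toLowerCase();
    }

    public void post(String to, String message) {
        // computeIfAbsent creates the per-recipient list on first use.
        items.computeIfAbsent(normalize(to), k -> new ArrayList<>()).add(message);
    }

    public int howManyMailItems(String who) {
        List<String> mails = items.get(normalize(who));
        return mails == null ? 0 : mails.size();
    }

    public static void main(String[] args) {
        CaseInsensitiveMailbox box = new CaseInsensitiveMailbox();
        box.post("Paul", "hi");
        box.post("PAUL", "hello");
        System.out.println(box.howManyMailItems("paul")); // prints 2
    }
}
```

Because every public method funnels the name through normalize, "PAUL", "Paul" and "paul" all hit the same bucket, and the rule lives in exactly one place.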
I would just convert all email-addresses to lower case:
String getEmailAddress(String emailAddress) {
if (emailAddress.length() > 0) return emailAddress.toLowerCase();
return "";
}
Hope this helps.
I've got a MongoDB collection with airports and I need to prepare some geospatial queries.
Here is a simple document from this collection:
{
"_id" : ObjectId("528e8134556062edda12ffe6"),
"id" : 6523,
"ident" : "00A",
"type" : "heliport",
"name" : "Total Rf Heliport",
"latitude_deg" : 40.07080078125,
"longitude_deg" : -74.9336013793945,
"elevation_ft" : 11,
"continent" : "NA",
"iso_country" : "US",
"iso_region" : "US-PA",
"municipality" : "Bensalem",
"scheduled_service" : "no",
"gps_code" : "00A",
"iata_code" : "",
"local_code" : "00A",
"home_link" : "",
"wikipedia_link" : "",
"keywords" : ""
}
I need to change all the documents to something like this:
{
"_id": ObjectId("528e8134556062edda12ffe6"),
"id" : 6523,
"ident" : "00A",
"type" : "heliport",
"name" : "Total Rf Heliport",
"longitude_deg" : 17.27,
"latitude_deg" : 52.22,
"loc" : {
"type" : "Point",
"coordinates" : [
17.27,
52.22
]
},
...
}
This is a simple JavaScript snippet that should work for it in the mongo shell:
var cursor = db.airports.find()
cursor.forEach(function(input) {
x = input.latitude_deg;
y = input.longitude_deg;
id = input._id;
db.airports.update({"_id":id},{$set:{"loc":{"type":"Point","coordinates":[y,x]}}});
});
but I must create a program in Java that does the same thing, and despite my attempts I haven't been able to work it out.
Can anyone steer me toward a solution?
Thanks in advance, and sorry for my poor English!
Below is an example application I created showing how to do this with the MongoDB Inc. driver (legacy) and the Asynchronous Java Driver.
For the asynchronous driver I show two different methods: synchronous and asynchronous. The main motivation for the asynchronous version is that in the synchronous/legacy case you have to wait for each update to complete. In the asynchronous model you can continue to get meaningful work done while the stream of document fetches and updates happens in the background.
HTH,
Rob.
P.S. Full disclosure: I work on the Asynchronous Java Driver.
package geojson.sof20181050;
import java.io.IOException;
import java.net.UnknownHostException;
import java.util.ArrayList;
import java.util.concurrent.Phaser;
import java.util.concurrent.atomic.AtomicLong;
import com.allanbank.mongodb.Callback;
import com.allanbank.mongodb.Durability;
import com.allanbank.mongodb.MongoClient;
import com.allanbank.mongodb.MongoCollection;
import com.allanbank.mongodb.MongoFactory;
import com.allanbank.mongodb.StreamCallback;
import com.allanbank.mongodb.bson.Document;
import com.allanbank.mongodb.bson.Element;
import com.allanbank.mongodb.bson.NumericElement;
import com.allanbank.mongodb.bson.builder.BuilderFactory;
import com.allanbank.mongodb.bson.builder.DocumentBuilder;
import com.allanbank.mongodb.builder.Find;
import com.allanbank.mongodb.builder.GeoJson;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.MongoClientURI;
import com.mongodb.WriteConcern;
/**
* ConvertToGeoJSON provides a solution to a Stack Overflow question on how to
* convert a document containing a latitude and longitude value to contain a
* GeoJSON formatted value instead. From the Stack Overflow question: Each input
* document looks like: <blockquote>
*
* <pre>
* <code>
* {
* "_id" : ObjectId("528e8134556062edda12ffe6"),
* "id" : 6523,
* "ident" : "00A",
* "type" : "heliport",
* "name" : "Total Rf Heliport",
* "latitude_deg" : 40.07080078125,
* "longitude_deg" : -74.9336013793945,
* "elevation_ft" : 11,
* "continent" : "NA",
* "iso_country" : "US",
* "iso_region" : "US-PA",
* "municipality" : "Bensalem",
* "scheduled_service" : "no",
* "gps_code" : "00A",
* "iata_code" : "",
* "local_code" : "00A",
* "home_link" : "",
* "wikipedia_link" : "",
* "keywords" : ""
* }
* </code>
* </pre>
*
* </blockquote>
* <p>
* We want the resulting document to look like:<blockquote>
*
* <pre>
* <code>
* {
* "_id": ObjectId("528e8134556062edda12ffe6"),
* "id" : 6523,
* "ident" : "00A",
* "type" : "heliport",
* "name" : "Total Rf Heliport",
* "longitude_deg" : 17.27,
* "latitude_deg" : 52.22,
* "loc" : {
* "type" : "Point",
* "coordinates" : [
* 17.27,
* 52.22
* ]
* },
* ...
* }
* </code>
* </pre>
*
* </blockquote>
* </p>
*
* @see <a
* href="http://stackoverflow.com/questions/20181050/how-do-i-update-fields-of-documents-in-mongo-db-using-the-java-to-geojson-format">Stack
* Overflow Question</a>
*/
public class ConvertToGeoJSON {
/**
* The handle to the MongoDB client. We assume MongoDB is running on your
* machine on the default port of 27017.
*/
protected final static MongoClient client;
/** The collection we will be using. */
protected final static MongoCollection theCollection;
/** The URI for connecting to MongoDB. */
protected static final String URI;
static {
URI = "mongodb://localhost:27017/";
client = MongoFactory.createClient(URI);
theCollection = client.getDatabase("db").getCollection("collection");
}
/**
* See the class Javadoc for a description of the problem.
*
* @param args
* Command line arguments. Ignored.
* @throws InterruptedException
* If waiting for the callback to complete is interrupted.
* @throws IOException
* On a failure to close the client.
*/
public static void main(final String[] args) throws InterruptedException,
IOException {
try {
// We can perform this operation two way. Synchronously and via
// streaming. We will provide an example of both. Change these
// variables to switch between them.
final boolean doLegacy = false;
final boolean doSynchronous = false;
if (doLegacy) {
doLegacy();
}
else if (doSynchronous) {
doSynchronously();
}
else {
doAsynchronously();
}
}
finally {
// Always close the client!
client.close();
}
}
/**
* Performs the document updates asynchronously. This method uses a pair of
* callbacks to perform the updates. The first receives the stream of
* documents in the collection and prepares and sends the update for each
* document. The second receives the results of the update and checks for
* errors.
* <p>
* Neither callback performs robust error handling but could be easily
* modified to retry operations etc.
* </p>
* <p>
* The advantage of this version is that the stream of updates can be
* handled concurrently with the iteration over the results in each batch of
* documents in the collection. This should result in a significant
* reduction in the wall clock time for processing the collection.
* </p>
* <p>
* We use {@link Phaser} instances to track when we are waiting for
* asynchronous operations so the main thread knows when to terminate the
* application.
* </p>
*
* @throws InterruptedException
* If waiting for the callback to complete is interrupted.
*/
protected static void doAsynchronously() throws InterruptedException {
// Execute the query to find all of the documents and stream
// them to the callback. Have that callback update the document
// asynchronously.
final Phaser finished = new Phaser(1); // Parent.
final AtomicLong updates = new AtomicLong(0);
final StreamCallback<Document> streamCallback = new StreamCallback<Document>() {
// Child Phaser for the complete stream and all updates.
final Phaser streamFinished = new Phaser(finished, 1);
@Override
public void callback(final Document doc) {
final Element id = doc.get("_id");
final NumericElement lat = doc.get(NumericElement.class,
"latitude_deg");
final NumericElement lon = doc.get(NumericElement.class,
"longitude_deg");
final DocumentBuilder query = BuilderFactory.start();
query.add(id);
final DocumentBuilder update = BuilderFactory.start();
update.push("$set").add(
"loc",
GeoJson.point(GeoJson.p(lon.getDoubleValue(),
lat.getDoubleValue())));
final Callback<Long> updateCallback = new Callback<Long>() {
// Child Phaser for the update.
final Phaser updateFinished = new Phaser(streamFinished, 1);
@Override
public void callback(final Long result) {
// All done. Notify the stream.
updates.addAndGet(result);
updateFinished.arriveAndDeregister();
}
@Override
public void exception(final Throwable thrown) {
System.err.printf("Update of %s failed.%n", id);
thrown.printStackTrace();
callback(null);
}
};
theCollection.updateAsync(updateCallback, query, update,
Durability.ACK);
}
@Override
public synchronized void done() {
streamFinished.arriveAndDeregister();
}
@Override
public void exception(final Throwable thrown) {
thrown.printStackTrace();
done();
}
};
// Now to kick off the processing.
theCollection.streamingFind(streamCallback, Find.ALL);
// Need to wait for the stream and updates to finish.
finished.arriveAndAwaitAdvance();
System.out.printf("Updated %d documents asynchronously.%n",
updates.get());
}
/**
* Performs the document updates using the legacy driver.
* <p>
* The main drawback here (other than those discussed in
* {@link #doSynchronously()}) is the difficulty of creating the GeoJSON
* documents.
* </p>
*
* @throws UnknownHostException
* On an invalid URI.
*/
protected static void doLegacy() throws UnknownHostException {
// Execute the query to find all of the documents and then
// update them.
final com.mongodb.MongoClient legacyClient = new com.mongodb.MongoClient(
new MongoClientURI(URI));
final com.mongodb.DBCollection legacyCollection = legacyClient.getDB(
theCollection.getDatabaseName()).getCollection(
theCollection.getName());
try {
int count = 0;
for (final DBObject doc : legacyCollection.find()) {
final Object id = doc.get("_id");
final Number lat = (Number) doc.get("latitude_deg");
final Number lon = (Number) doc.get("longitude_deg");
final BasicDBObject query = new BasicDBObject();
query.append("_id", id);
final ArrayList<Double> coordinates = new ArrayList<>();
coordinates.add(lon.doubleValue());
coordinates.add(lat.doubleValue());
final BasicDBObject geojson = new BasicDBObject("type", "Point");
geojson.append("coordinates", coordinates);
final BasicDBObject set = new BasicDBObject("loc", geojson);
final BasicDBObject update = new BasicDBObject("$set", set);
legacyCollection.update(query, update, /* upsert= */false,
/* multi= */false, WriteConcern.ACKNOWLEDGED);
count += 1;
}
System.out.printf("Updated %d documents via the legacy driver.%n",
count);
}
finally {
// Always close the client.
legacyClient.close();
}
}
/**
* Performs the document updates synchronously.
* <p>
* While this version is conceptually easier to implement, the need to wait
* for each update to complete before processing the next document has a
* severe impact on the wall clock time required to complete the update of
* all but the smallest of collections.
* </p>
*/
protected static void doSynchronously() {
// Execute the query to find all of the documents and then
// update them.
int count = 0;
for (final Document doc : theCollection.find(Find.ALL)) {
final Element id = doc.get("_id");
final NumericElement lat = doc.get(NumericElement.class,
"latitude_deg");
final NumericElement lon = doc.get(NumericElement.class,
"longitude_deg");
final DocumentBuilder query = BuilderFactory.start();
query.add(id);
final DocumentBuilder update = BuilderFactory.start();
update.push("$set").add(
"loc",
GeoJson.point(GeoJson.p(lon.getDoubleValue(),
lat.getDoubleValue())));
theCollection.update(query, update, Durability.ACK);
count += 1;
}
System.out.printf("Updated %d documents synchronously.%n",
count);
}
}
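The parent/child Phaser bookkeeping used in doAsynchronously() can be tried out in isolation. This sketch (my own names, no MongoDB involved) registers one child phaser per simulated update, exactly as the answer registers one per in-flight operation, and lets the parent wait for all of them:

```java
import java.util.concurrent.Phaser;
import java.util.concurrent.atomic.AtomicInteger;

public class PhaserSketch {
    /** Runs n fake "updates" on worker threads; returns how many completed. */
    public static int runTasks(int n) {
        final Phaser parent = new Phaser(1); // the caller is the only direct party
        final AtomicInteger done = new AtomicInteger();
        for (int i = 0; i < n; i++) {
            // Each child registers itself with the parent on construction.
            final Phaser child = new Phaser(parent, 1);
            new Thread(() -> {
                done.incrementAndGet();      // simulated asynchronous work
                child.arriveAndDeregister(); // signals completion up to the parent
            }).start();
        }
        // Blocks until every child has deregistered, like the
        // finished.arriveAndAwaitAdvance() call in the code above.
        parent.arriveAndAwaitAdvance();
        return done.get();
    }

    public static void main(String[] args) {
        System.out.println("completed: " + runTasks(3));
    }
}
```

Because each child registers with the parent at construction time (before the worker thread starts), the parent cannot advance until every worker has finished, with no explicit latch counting needed.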
Thanks for your answer.
I used the legacy driver method with a small change to the code, because I got:
Exception in thread "main" java.lang.ClassCastException: java.lang.String cannot be cast to java.lang.Number
The code looks like this:
import java.io.IOException;
import java.util.ArrayList;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoClient;
import com.mongodb.WriteConcern;
public class Geo {
public static void main(final String[] args) throws InterruptedException,
IOException {
MongoClient mongoClient = new MongoClient("127.0.0.1", 27017);
DB db = mongoClient.getDB("test");
DBCollection coll = db.getCollection("airports2");
try {
int count = 0;
for (final DBObject doc : coll.find()) {
final Object id = doc.get("_id");
final Number lat = Double.parseDouble(doc.get("latitude_deg").toString());
final Number lon = Double.parseDouble(doc.get("longitude_deg").toString());
final BasicDBObject query = new BasicDBObject();
query.append("_id", id);
final ArrayList<Double> coordinates = new ArrayList<>();
coordinates.add(lon.doubleValue());
coordinates.add(lat.doubleValue());
final BasicDBObject geojson = new BasicDBObject("type", "Point");
geojson.append("coordinates", coordinates);
final BasicDBObject set = new BasicDBObject("loc", geojson);
final BasicDBObject update = new BasicDBObject("$set", set);
coll.update(query, update, /* upsert= */false,
/* multi= */false, WriteConcern.ACKNOWLEDGED);
count += 1;
}
System.out.printf("Updated %d documents via the legacy driver.%n",
count);
}
finally {
// Always close the client.
mongoClient.close();
}
}
}
and it works, but the number of updated documents is not the same as db.airports.count(), which worries me.
I am using the airports collection.
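One likely cause of the mismatch: some documents apparently store latitude_deg/longitude_deg as strings (that is also why the original cast to Number failed), and an empty or malformed string makes Double.parseDouble throw mid-loop. A hedged sketch of a tolerant extractor (plain Java; which shapes the data actually takes is an assumption):

```java
public class CoordValue {
    /**
     * Interprets a raw document field value as a double.
     * Returns null when the field is missing, empty, or non-numeric,
     * so the caller can skip (and count) those documents instead of crashing.
     */
    public static Double toDouble(Object raw) {
        if (raw instanceof Number) {
            return ((Number) raw).doubleValue();
        }
        if (raw instanceof String) {
            try {
                return Double.parseDouble(((String) raw).trim());
            } catch (NumberFormatException e) {
                return null; // e.g. an empty "" value
            }
        }
        return null; // field absent or of an unexpected type
    }
}
```

In the update loop you would skip documents where either coordinate comes back null and count them; comparing updated + skipped against db.airports.count() should account for the difference.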
I'm playing with Felix and there is one thing I can't understand.
I have an OSGi Felix bundle, and I'm trying to load and use a service from this bundle.
Bundle code:
package ihtika2.i_testbundle;
import ihtika2.i_testbundle.service.TestClasssInter;
import java.util.Hashtable;
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.InvalidSyntaxException;
import org.osgi.framework.ServiceReference;
public class Activator implements BundleActivator {
@Override
public void start(BundleContext context) throws Exception {
Hashtable<String, String> props = new Hashtable<String, String>();
props.put("Funct", "TESTCl");
context.registerService(TestClasssInter.class.getName(), new TestClasss(), props);
ServiceReference[] refs;
try {
BundleContext bundleContext = context;
// System.out.println(TestClasssInter.class.getName());
refs = bundleContext.getServiceReferences("ihtika2.i_testbundle.service.TestClasssInter", "(Funct=TESTCl)");
if (refs == null) {
System.out.println("Not Found AboutForm on show!!!");
} else {
Object MainForm = bundleContext.getService(refs[0]);
TestClasssInter sdfsdf = (TestClasssInter) MainForm;
sdfsdf.printSomeLine();
// MainForm.sendContext(bundleContext);
// MainForm.showWindow();
}
} catch (InvalidSyntaxException ex) {
ex.printStackTrace();
}
}
@Override
public void stop(BundleContext context) throws Exception {
// TODO add deactivation code here
}
}
package ihtika2.i_testbundle;
import ihtika2.i_testbundle.service.TestClasssInter;
/**
*
* @author Arthur
*/
public class TestClasss implements TestClasssInter {
@Override
public void printSomeLine() {
System.out.println("TEST MESSAGE");
}
}
package ihtika2.i_testbundle.service;
/**
*
* @author Arthur
*/
public interface TestClasssInter {
public void printSomeLine();
}
As you can see in the updated example, it should print the line "TEST MESSAGE" from the bundle code. It does, so all is OK.
But if I try to execute this code in my "loader", the following error is shown:
Could not create framework: java.lang.ClassCastException: ihtika2.i_testbundle.TestClasss cannot be cast to ihtika2.i_testbundle.service.TestClasssInter
java.lang.ClassCastException: ihtika2.i_testbundle.TestClasss cannot be cast to ihtika2.i_testbundle.service.TestClasssInter
at com.google.code.ihtika.Starter.main(Starter.java:103)
Java Result: -1
The code of the loader is the following:
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package com.google.code.ihtika;
import ihtika2.i_testbundle.service.TestClasssInter;
import java.io.File;
import java.util.ArrayList;
import java.util.Map;
import java.util.ServiceLoader;
import org.osgi.framework.Bundle;
import org.osgi.framework.BundleContext;
import org.osgi.framework.BundleException;
import org.osgi.framework.InvalidSyntaxException;
import org.osgi.framework.ServiceReference;
import org.osgi.framework.launch.Framework;
import org.osgi.framework.launch.FrameworkFactory;
/**
* This class provides a static {@code main()} method so that the bundle can be
* run as a stand-alone host application. In such a scenario, the application
* creates its own embedded OSGi framework instance and interacts with the
* internal extensions to provide drawing functionality. To successfully
* launch the stand-alone application, it must be run from this bundle's
* installation directory using "{@code java -jar}". The locations of any
* additional extensions that have to be started, have to be passed as command
* line arguments to this method.
*/
public class Starter {
private static Framework m_framework = null;
/**
* Enables the bundle to run as a stand-alone application. When this static
* {@code main()} method is invoked, the application creates its own
* embedded OSGi framework instance and interacts with the internal
* extensions to provide drawing functionality. To successfully launch as a
* stand-alone application, this method should be invoked from the bundle's
* installation directory using "{@code java -jar}". The location of any
* extension that shall be installed can be passed as parameters. <p> For
* example if you build the bundles inside your workspace, maven will create
* a target directory in every project. To start the application from within
* your IDE you should pass: <p>
* <pre>
* {@code file:../servicebased.circle/target/servicebased.circle-1.0.0.jar
* file:../servicebased.square/target/servicebased.square-1.0.0.jar
* file:../servicebased.triangle/target/servicebased.triangle-1.0.0.jar}
* </pre>
*
* @param args The locations of additional bundles to start.
*
*/
public static void main(String[] args) {
// Args should never be null if the application is run from the command line.
// Check it anyway.
ArrayList<String> locations = new ArrayList<>();
indexBundlesDir("I_Bundles/Stage_300", locations);
indexBundlesDir("I_Bundles/Stage_400", locations);
indexBundlesDir("I_Bundles/Stage_500", locations);
// Print welcome banner.
System.out.println("\nWelcome to My Launcher");
System.out.println("======================\n");
try {
Map<String, String> config = ConfigUtil.createConfig();
m_framework = createFramework(config);
m_framework.init();
m_framework.start();
installAndStartBundles(locations);
for (Bundle testBundle : m_framework.getBundleContext().getBundles()) {
if (testBundle.getSymbolicName().equals("ihtika2.I_TestBundle")) {
System.out.println("found");
ServiceReference[] refs;
try {
BundleContext bundleContext = m_framework.getBundleContext();
// System.out.println(TestClasssInter.class.getName());
refs = bundleContext.getServiceReferences("ihtika2.i_testbundle.service.TestClasssInter", "(Funct=TESTCl)");
if (refs == null) {
System.out.println("Not Found AboutForm on show!!!");
} else {
Object MainForm = bundleContext.getService(refs[0]);
TestClasssInter sdfsdf = (TestClasssInter) MainForm;
// MainForm.sendContext(bundleContext);
// MainForm.showWindow();
}
} catch (InvalidSyntaxException ex) {
ex.printStackTrace();
}
}
// Dictionary<String, String> headerLine = testBundle.getHeaders();
// Enumeration e = headerLine.keys();
//
// while (e.hasMoreElements()) {
// Object key = e.nextElement();
// if (key.equals("Import-Package")) {
// System.out.println(key + " - " + headerLine.get(key));
// }
// System.out.println(key + " - " + headerLine.get(key));
// }
}
m_framework.waitForStop(0);
System.exit(0);
} catch (Exception ex) {
System.err.println("Could not create framework: " + ex);
ex.printStackTrace();
System.exit(-1);
}
}
private static void indexBundlesDir(String bundlesDir, ArrayList<String> locations) {
File dir = new File(bundlesDir);
String[] children = dir.list();
if (children == null) {
// Either dir does not exist or is not a directory
} else {
for (int i = 0; i < children.length; i++) {
// Get filename of file or directory
locations.add("file:/c:/Art/Dropbox/OpenSource/MyGIT/ihtika-2/ihtika-2/MainApplication/" + bundlesDir + "/" + children[i]);
}
}
}
/**
* Util method for creating an embedded Framework. Tries to create a
* {@link FrameworkFactory} which is then used to create the framework.
*
* @param config the configuration to create the framework with
* @return a Framework with the given configuration
*/
private static Framework createFramework(Map<String, String> config) {
ServiceLoader<FrameworkFactory> factoryLoader = ServiceLoader.load(FrameworkFactory.class);
for (FrameworkFactory factory : factoryLoader) {
return factory.newFramework(config);
}
throw new IllegalStateException("Unable to load FrameworkFactory service.");
}
/**
* Installs and starts all bundles used by the application. Therefore the
* host bundle will be started. The locations of extensions for the host
* bundle can be passed in as parameters.
*
* @param bundleLocations the locations where extensions for the host bundle
* are located. Must not be {@code null}!
* @throws BundleException if something went wrong while installing or
* starting the bundles.
*/
private static void installAndStartBundles(ArrayList<String> bundleLocations) throws BundleException {
BundleContext bundleContext = m_framework.getBundleContext();
// Activator bundleActivator = new Activator();
// bundleActivator.start(bundleContext);
for (String location : bundleLocations) {
Bundle addition = bundleContext.installBundle(location);
// System.out.println(location);
addition.start();
}
}
}
package ihtika2.i_testbundle.service;
/**
*
* @author Arthur
*/
public interface TestClasssInter {
public void printSomeLine();
}
ClassLoaders just don't work like that; it really needs to be the same class/interface. MainForm implements MainFormInterface, not MainFormInterface2, even if they are identical.
What you need to do is:
Make sure your MainFormInterface is in a separate package (I think it is: ihtika2.mainform.service)
Delete MainFormInterface2
Replace all MainFormInterface2 references to MainFormInterface
Add the package to the org.osgi.framework.system.packages.extra setting in Felix. I think the easiest place to do that is to add it to the Map after ConfigUtil.createConfig(); this way Felix gets access to the ihtika2.mainform.service package from outside OSGi
Make sure your bundle imports package ihtika2.mainform.service, so your bundle has access to ihtika2.mainform.service as well
That should do it.
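The config step can be sketched like this; the property key is the standard OSGi launcher property, while the helper class name and the package value are just illustrative:

```java
import java.util.HashMap;
import java.util.Map;

public class LauncherConfig {
    static final String EXTRA = "org.osgi.framework.system.packages.extra";

    /** Returns a copy of config with the given packages appended to the extra system packages. */
    public static Map<String, String> withExtraPackages(Map<String, String> config, String packages) {
        Map<String, String> copy = new HashMap<>(config);
        String existing = copy.get(EXTRA);
        copy.put(EXTRA, (existing == null || existing.isEmpty()) ? packages : existing + "," + packages);
        return copy;
    }
}
```

In Starter.main() that would mean something like config = LauncherConfig.withExtraPackages(config, "ihtika2.mainform.service"); before creating the framework, so the framework exports the interface package from the application class loader and the bundle resolves its import against the very same class, avoiding the ClassCastException.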