I am using OWLAPI to get all the inferred axioms and their explanations, for some ontologies. For the explanations, I need to get the explanation with the minimal set of axioms for each inferred axiom (laconic explanation), but I can't figure out how to do it, and I don't see any mention of laconic in the owlapi docs (http://owlcs.github.io/owlapi/apidocs_5/index.html).
Is there a way to get the laconic explanations for an inferred axiom in OWLAPI?
If so, could you give me an example of how to achieve this?
UPDATE: Following the advice given in comments, I changed my code to:
public static void explain(OWLAxiom entailment, ExplanationGenerator<OWLAxiom> explanation_generator) {
    try {
        Set<Explanation<OWLAxiom>> explanations =
            explanation_generator.getExplanations(entailment, 1);
        explanations.forEach(System.out::println);
        System.out.println("EndOfExplanation\n");
    } catch (Exception e) {
        System.out.println(e);
    }
}
// This method replicates code existing in the owlexplanation project;
// it's needed because the factories in owlexplanation do not set InitialEntailmentCheckStrategy correctly
public static ExplanationGeneratorFactory<OWLAxiom> createExplanationGeneratorFactory(
        OWLReasonerFactory reasonerFactory, ExplanationProgressMonitor<OWLAxiom> progressMonitor,
        Supplier<OWLOntologyManager> m) {
    EntailmentCheckerFactory<OWLAxiom> checker = new SatisfiabilityEntailmentCheckerFactory(reasonerFactory, m);
    Configuration<OWLAxiom> config = new Configuration<>(checker,
        new StructuralTypePriorityExpansionStrategy<OWLAxiom>(InitialEntailmentCheckStrategy.PERFORM, m),
        new DivideAndConquerContractionStrategy<OWLAxiom>(), progressMonitor, m);
    return new BlackBoxExplanationGeneratorFactory<>(config);
}
public static void main(String[] args) {
    ...
    ExplanationGeneratorFactory<OWLAxiom> genFac =
        ExplanationManager.createLaconicExplanationGeneratorFactory(
            reasoner_factory, OWLManager::createOWLOntologyManager);
    ExplanationGenerator<OWLAxiom> gen = genFac.createExplanationGenerator(onto);
    inferred_axioms_onto.logicalAxioms().forEach(e -> explain(e, gen));
}
but I still get a java.lang.NullPointerException, thrown at the line Set<Explanation<OWLAxiom>> explanations = explanation_generator.getExplanations(entailment, 1);.
The error stack trace is:
at org.semanticweb.owl.explanation.impl.blackbox.StructuralTypePriorityExpansionStrategy.doExpansion(StructuralTypePriorityExpansionStrategy.java:69)
at org.semanticweb.owl.explanation.impl.blackbox.BlackBoxExplanationGenerator2.doExpansion(BlackBoxExplanationGenerator2.java:262)
at org.semanticweb.owl.explanation.impl.blackbox.BlackBoxExplanationGenerator2.computeExplanation(BlackBoxExplanationGenerator2.java:183)
at org.semanticweb.owl.explanation.impl.blackbox.BlackBoxExplanationGenerator2.generateExplanation(BlackBoxExplanationGenerator2.java:292)
at org.semanticweb.owl.explanation.impl.blackbox.hst.HittingSetTree.buildHittingSetTree(HittingSetTree.java:110)
at org.semanticweb.owl.explanation.impl.blackbox.BlackBoxExplanationGenerator2.getExplanations(BlackBoxExplanationGenerator2.java:118)
at org.semanticweb.owl.explanation.impl.blackbox.BlackBoxExplanationGenerator2.getExplanations(BlackBoxExplanationGenerator2.java:94)
at org.semanticweb.owl.explanation.impl.laconic.LaconicExplanationGenerator.computePreciseJustsOptimised(LaconicExplanationGenerator.java:173)
at org.semanticweb.owl.explanation.impl.laconic.LaconicExplanationGenerator.getExplanations(LaconicExplanationGenerator.java:386)
at msc.Explainer.explain(Explainer.java:46)
at msc.Explainer.lambda$2(Explainer.java:132)
at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384)
at java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:580)
at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:270)
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
at java.util.Iterator.forEachRemaining(Iterator.java:116)
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
at msc.Explainer.main(Explainer.java:132)
I took the createExplanationGeneratorFactory method from this answer on getting explanations: Explanations in Consistent OWL Ontologies; without it I couldn't get explanations at all.
I prepared a PCollection<BeamRecord> object from a file containing JSON objects using the Beam SQL SDK.
The code below parses and maps JSON lines to ChatHistory objects, then converts the mapped objects to BeamRecord. Finally, I try to use BeamSql on the returned PCollection<BeamRecord>, but I get the exception SerializableCoder cannot be cast to BeamRecordCoder.
PCollection<ChatHistory> json_objects = lines.apply(ParDo.of(new ExtractObjectsFn()));

// Convert them to BeamRecords with the same schema as defined above via a DoFn.
PCollection<BeamRecord> apps = json_objects.apply(
    ParDo.of(new DoFn<ChatHistory, BeamRecord>() {
        @ProcessElement
        public void processElement(ProcessContext c) {
            List<String> fields_list = new ArrayList<String>(Arrays.asList("conversation_id", "message_type", "message_date", "message", "message_auto_id"));
            List<Integer> types_list = new ArrayList<Integer>(Arrays.asList(Types.VARCHAR, Types.VARCHAR, Types.VARCHAR, Types.VARCHAR, Types.VARCHAR));
            BeamRecordSqlType brtype = BeamRecordSqlType.create(fields_list, types_list);
            BeamRecord br = new BeamRecord(
                brtype,
                c.element().conversation_id,
                c.element().message_type,
                c.element().message_date,
                c.element().message,
                c.element().message_auto_id
            );
            c.output(br);
        }
    }));

return apps.apply(
    BeamSql
        .query("SELECT conversation_id, message_type, message, message_date, message_auto_id FROM PCOLLECTION")
);
Here is the generated stack trace:
java.lang.ClassCastException: org.apache.beam.sdk.coders.SerializableCoder cannot be cast to org.apache.beam.sdk.coders.BeamRecordCoder
at org.apache.beam.sdk.extensions.sql.BeamSql$QueryTransform.registerTables (BeamSql.java:173)
at org.apache.beam.sdk.extensions.sql.BeamSql$QueryTransform.expand (BeamSql.java:153)
at org.apache.beam.sdk.extensions.sql.BeamSql$QueryTransform.expand (BeamSql.java:116)
at org.apache.beam.sdk.Pipeline.applyInternal (Pipeline.java:537)
at org.apache.beam.sdk.Pipeline.applyTransform (Pipeline.java:472)
at org.apache.beam.sdk.values.PCollectionTuple.apply (PCollectionTuple.java:160)
at org.apache.beam.sdk.extensions.sql.BeamSql$SimpleQueryTransform.expand (BeamSql.java:246)
at org.apache.beam.sdk.extensions.sql.BeamSql$SimpleQueryTransform.expand (BeamSql.java:186)
at org.apache.beam.sdk.Pipeline.applyInternal (Pipeline.java:537)
at org.apache.beam.sdk.Pipeline.applyTransform (Pipeline.java:472)
at org.apache.beam.sdk.values.PCollection.apply (PCollection.java:286)
at com.mdm.trial.trial3$JsonParse.expand (trial3.java:123)
at com.mdm.trial.trial3$JsonParse.expand (trial3.java:1)
at org.apache.beam.sdk.Pipeline.applyInternal (Pipeline.java:537)
at org.apache.beam.sdk.Pipeline.applyTransform (Pipeline.java:472)
at org.apache.beam.sdk.values.PCollection.apply (PCollection.java:286)
at com.mdm.trial.trial3.main (trial3.java:160)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:282)
at java.lang.Thread.run (Thread.java:748)
I saw a similar post, but it still didn't fix my error: Running BeamSql WithoutCoder or Making Coder Dynamic
Ismail, in your case using .setCoder() should work.
I would try extracting the row type out of the ParDo and then applying it to apps before applying the SQL query:
PCollection<ChatHistory> json_objects = lines.apply(ParDo.of(new ExtractObjectsFn()));

// Create a row type first:
List<String> fields_list = new ArrayList<String>(Arrays.asList("conversation_id", "message_type", "message_date", "message", "message_auto_id"));
List<Integer> types_list = new ArrayList<Integer>(Arrays.asList(Types.VARCHAR, Types.VARCHAR, Types.VARCHAR, Types.VARCHAR, Types.VARCHAR));
final BeamRecordSqlType brtype = BeamRecordSqlType.create(fields_list, types_list);

// Convert them to BeamRecords with the same schema as defined above via a DoFn.
PCollection<BeamRecord> apps = json_objects.apply(
    ParDo.of(new DoFn<ChatHistory, BeamRecord>() {
        @ProcessElement
        public void processElement(ProcessContext c) {
            BeamRecord br = new BeamRecord(
                brtype,
                c.element().conversation_id,
                c.element().message_type,
                c.element().message_date,
                c.element().message,
                c.element().message_auto_id
            );
            c.output(br);
        }
    }));

return apps
    .setCoder(brtype.getRecordCoder())
    .apply(
        BeamSql
            .query("SELECT conversation_id, message_type, message, message_date, message_auto_id FROM PCOLLECTION")
    );
A couple of examples:
This example sets the coder by using Create.withCoder(), which does the same thing.
This example filters and converts from Events using ToRow.parDo() and then also sets the coder, which is specified here.
Please note that BeamRecord has since been renamed to Row, and a few other changes are reflected in the examples above.
I need to allocate a rather large matrix using OpenCV 3.1.0. I'm running the following code with the -Djava.library.path=$MODULE_DIR$\opencv\310\windows\x64\ -Xmx8g arguments:
public class MatTest extends BaseTest {
    static { System.loadLibrary(Core.NATIVE_LIBRARY_NAME); }

    @Test
    public void tooBig() throws IOException {
        float[] data = new float[13320 * 67294];
        Mat iMatrix = new Mat(13320, 67294, CvType.CV_32FC1);
        iMatrix.put(0, 0, data); // exception here
    }

    @Test
    public void medium() throws IOException {
        float[] data = new float[13918 * 13240];
        Mat iMatrix = new Mat(13918, 13240, CvType.CV_32FC1);
        iMatrix.put(0, 0, data);
    }
}
The second test works, while the first one throws (at the line iMatrix.put(0, 0, data)):
java.lang.Exception: unknown exception
at org.opencv.core.Mat.nPutF(Native Method)
at org.opencv.core.Mat.put(Mat.java:953)
at my.app.MatTest.tooBig(MatTest.java:19)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
Is it an OpenCV or native-library usage limitation? Is there a workaround for this issue?
Edit: attached full code and stack trace
It is an OpenCV issue. There are some variables of signed int type used for the matrix size, which my huge array exceeded. Check the source code: link. The workaround is to create a list of smaller Mats and join them using the vconcat(slices, result) function.
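The limit can be checked with plain arithmetic, without OpenCV at all (a sketch of the reasoning only; the exact signed-int field is in the OpenCV C++ source linked above): a CV_32FC1 matrix stores 4 bytes per element, and once rows * cols * 4 exceeds Integer.MAX_VALUE the native size bookkeeping overflows.

```java
public class MatSizeCheck {
    public static void main(String[] args) {
        // CV_32FC1 = one 32-bit float (4 bytes) per element
        long tooBigBytes = 13320L * 67294L * 4L; // the failing test's matrix
        long mediumBytes = 13918L * 13240L * 4L; // the passing test's matrix

        System.out.println("tooBig bytes: " + tooBigBytes
                + " (exceeds signed int: " + (tooBigBytes > Integer.MAX_VALUE) + ")");
        System.out.println("medium bytes: " + mediumBytes
                + " (exceeds signed int: " + (mediumBytes > Integer.MAX_VALUE) + ")");
    }
}
```

The first size comes out above 3.5 GB, past the 2 GB signed-int ceiling, while the second stays around 0.7 GB, which matches which test throws.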
I'm running my tests using gradle testFlavorType
JSONObject jsonObject1 = new JSONObject();
JSONObject jsonObject2 = new JSONObject();
jsonObject1.put("test", "test");
jsonObject2.put("test", "test");
assertEquals(jsonObject1.get("test"), jsonObject2.get("test"));
The above test succeeds.
jsonObject = new SlackMessageRequest(channel, message).buildBody();
String channelAssertion = jsonObject.getString(SlackMessageRequest.JSON_KEY_CHANNEL);
String messageAssertion = jsonObject.getString(SlackMessageRequest.JSON_KEY_TEXT);
assertEquals(channel, channelAssertion);
assertEquals(message, messageAssertion);
But the above two asserts fail. The stack trace says that channelAssertion and messageAssertion are null, but I'm not sure why. My question is: why are the above two asserts failing?
Below is the SlackMessageRequest.
public class SlackMessageRequest extends BaseRequest {

    // region Variables
    public static final String JSON_KEY_TEXT = "text";
    public static final String JSON_KEY_CHANNEL = "channel";
    private String mChannel;
    private String mMessage;
    // endregion

    // region Constructors
    public SlackMessageRequest(String channel, String message) {
        mChannel = channel;
        mMessage = message;
    }
    // endregion

    // region Methods
    @Override
    public MethodType getMethodType() {
        return MethodType.POST;
    }

    @Override
    public JSONObject buildBody() throws JSONException {
        JSONObject body = new JSONObject();
        body.put(JSON_KEY_TEXT, getMessage());
        body.put(JSON_KEY_CHANNEL, getChannel());
        return body;
    }

    @Override
    public String getUrl() {
        return "http://localhost:1337";
    }

    public String getMessage() {
        return mMessage;
    }

    public String getChannel() {
        return mChannel;
    }
    // endregion
}
Below is the stacktrace:
junit.framework.ComparisonFailure: expected:<#tk> but was:<null>
at junit.framework.Assert.assertEquals(Assert.java:100)
at junit.framework.Assert.assertEquals(Assert.java:107)
at junit.framework.TestCase.assertEquals(TestCase.java:269)
at com.example.app.http.request.SlackMessageRequestTest.testBuildBody(SlackMessageRequestTest.java:30)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at junit.framework.TestCase.runTest(TestCase.java:176)
at junit.framework.TestCase.runBare(TestCase.java:141)
at junit.framework.TestResult$1.protect(TestResult.java:122)
at junit.framework.TestResult.runProtected(TestResult.java:142)
at junit.framework.TestResult.run(TestResult.java:125)
at junit.framework.TestCase.run(TestCase.java:129)
at junit.framework.TestSuite.runTest(TestSuite.java:252)
at junit.framework.TestSuite.run(TestSuite.java:247)
at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:86)
at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.runTestClass(JUnitTestClassExecuter.java:86)
at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.execute(JUnitTestClassExecuter.java:49)
at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassProcessor.processTestClass(JUnitTestClassProcessor.java:64)
at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:50)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.gradle.messaging.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at org.gradle.messaging.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at org.gradle.messaging.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:32)
at org.gradle.messaging.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:93)
at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:106)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.gradle.messaging.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at org.gradle.messaging.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at org.gradle.messaging.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:360)
at org.gradle.internal.concurrent.DefaultExecutorFactory$StoppableExecutorImpl$1.run(DefaultExecutorFactory.java:64)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
EDIT 5:55PM EST
I've figured out that I can log with System.out.println("") and then see the results by running gradle testFlavorType --debug, and by trial and error I've discovered the following weird situation:
@Override
public JSONObject buildBody() throws JSONException {
    System.out.println("buildBody mChannel = " + mChannel);
    System.out.println("buildBody mMessage = " + mMessage);
    JSONObject body = new JSONObject();
    body.put(JSON_KEY_TEXT, getMessage());
    body.put(JSON_KEY_CHANNEL, getChannel());
    if (body.length() != 0) {
        Iterator<String> keys = body.keys();
        while (keys.hasNext()) {
            System.out.println("keys: " + keys.next());
        }
    } else {
        System.out.println("There are no keys????");
    }
    return body;
}
For some reason, "There are no keys????" is printing out?!?!?!?! Why?!
EDIT 6:20PM EST
I've figured out how to debug unit tests. According to the debugger, the assigned JSONObject is returning "null". I have no clue what this means (see below). Since I think this is relevant, my gradle file includes the following:
testOptions {
    unitTests.returnDefaultValues = true
}
It's especially strange because if I construct a JSONObject inside the test, then everything works fine. But if it is part of the original application's code, then it doesn't work and does the above.
As Lucas says, JSON is bundled up with the Android SDK, so you are working with a stub.
The current solution is to pull JSON from Maven Central like this:
dependencies {
    ...
    testImplementation 'org.json:json:20210307'
}
You can replace the version 20210307 with the latest one, depending on the Android API. It is not known which version of the Maven artifact corresponds exactly (or most closely) to what ships with Android.
Alternatively, you can download and include the jar:
dependencies {
    ...
    testImplementation files('libs/json.jar')
}
Note that you also need Android Studio 1.1 or higher and at least build tools version 22.0.0 for this to work.
Related issue: #179461
The class JSONObject is part of the Android SDK. That means it is not available for unit testing by default.
From http://tools.android.com/tech-docs/unit-testing-support
The android.jar file that is used to run unit tests does not contain
any actual code - that is provided by the Android system image on real
devices. Instead, all methods throw exceptions (by default). This is
to make sure your unit tests only test your code and do not depend on
any particular behaviour of the Android platform (that you have not
explicitly mocked e.g. using Mockito).
When you set the test options to
testOptions {
    unitTests.returnDefaultValues = true
}
you are fixing the "Method ... not mocked." problem, but the outcome is that when your code calls new JSONObject() you are not using the real method; you are using a mock method that doesn't do anything, it just returns a default value. That's why the object is null.
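The effect can be sketched with a hand-rolled stub (the StubJSONObject class below is hypothetical, standing in for the android.jar stub; it is not the real Android class):

```java
import java.util.Collections;
import java.util.Iterator;

// Hypothetical stand-in for the android.jar stub of JSONObject:
// every method just returns a default value instead of doing real work.
class StubJSONObject {
    public StubJSONObject put(String key, Object value) { return this; } // no-op
    public int length() { return 0; }                                    // default int
    public String getString(String key) { return null; }                 // default reference
    public Iterator<String> keys() { return Collections.emptyIterator(); }
}

public class StubDemo {
    public static void main(String[] args) {
        StubJSONObject body = new StubJSONObject();
        body.put("text", "hello");
        body.put("channel", "#tk");
        // Nothing was actually stored, so the question's symptoms appear:
        System.out.println(body.length());             // 0    -> "There are no keys????"
        System.out.println(body.getString("channel")); // null -> assertEquals fails
    }
}
```

This is exactly why the asserts compare "#tk" against null: buildBody() runs against a do-nothing class, not the real org.json implementation.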
You can find different ways of solving the problem in this question: Android methods are not mocked when using Mockito
Well, my first hunch would be that your getMessage() method returns null. You could show the body of that method in your question and have us find the answer for you, but you should probably research how to debug Android applications using breakpoints.
That way you can run your code step by step and see the values of each variable at every step. That would show you your problem in no time, and it's a skill you should definitely master as soon as possible if you intend to get seriously involved in programming.
When multiple trees in the view are expanded (Expand the selected tree), the following error is thrown.
I could not find out what exactly throws this error. This is the stack trace I got after the exception:
org.eclipse.swt.SWTError: No more handles
at org.eclipse.swt.SWT.error(SWT.java:4109)
at org.eclipse.swt.SWT.error(SWT.java:3998)
at org.eclipse.swt.SWT.error(SWT.java:3969)
at org.eclipse.swt.widgets.Display.internal_new_GC(Display.java:2589)
at org.eclipse.swt.graphics.Image.getImageData(Image.java:1371)
at org.eclipse.swt.internal.ImageList.set(ImageList.java:401)
at org.eclipse.swt.internal.ImageList.add(ImageList.java:66)
at org.eclipse.swt.widgets.Tree.imageIndex(Tree.java:3636)
at org.eclipse.swt.widgets.TreeItem.setImage(TreeItem.java:1686)
at org.eclipse.jface.viewers.TreeViewerRow.setImage(TreeViewerRow.java:166)
at org.eclipse.jface.viewers.ViewerCell.setImage(ViewerCell.java:169)
at org.eclipse.jface.viewers.WrappedViewerLabelProvider.update(WrappedViewerLabelProvider.java:166)
at org.eclipse.jface.viewers.ViewerColumn.refresh(ViewerColumn.java:152)
at org.eclipse.jface.viewers.AbstractTreeViewer.doUpdateItem(AbstractTreeViewer.java:934)
at org.eclipse.jface.viewers.AbstractTreeViewer$UpdateItemSafeRunnable.run(AbstractTreeViewer.java:102)
at org.eclipse.core.runtime.SafeRunner.run(SafeRunner.java:42)
at org.eclipse.ui.internal.JFaceUtil$1.run(JFaceUtil.java:49)
at org.eclipse.jface.util.SafeRunnable.run(SafeRunnable.java:175)
at org.eclipse.jface.viewers.AbstractTreeViewer.doUpdateItem(AbstractTreeViewer.java:1014)
at org.eclipse.jface.viewers.StructuredViewer$UpdateItemSafeRunnable.run(StructuredViewer.java:481)
at org.eclipse.core.runtime.SafeRunner.run(SafeRunner.java:42)
at org.eclipse.ui.internal.JFaceUtil$1.run(JFaceUtil.java:49)
at org.eclipse.jface.util.SafeRunnable.run(SafeRunnable.java:175)
at org.eclipse.jface.viewers.StructuredViewer.updateItem(StructuredViewer.java:2141)
at org.eclipse.jface.viewers.AbstractTreeViewer.createTreeItem(AbstractTreeViewer.java:829)
at org.eclipse.jface.viewers.AbstractTreeViewer$1.run(AbstractTreeViewer.java:804)
at org.eclipse.swt.custom.BusyIndicator.showWhile(BusyIndicator.java:70)
at org.eclipse.jface.viewers.AbstractTreeViewer.createChildren(AbstractTreeViewer.java:778)
at org.eclipse.jface.viewers.TreeViewer.createChildren(TreeViewer.java:644)
at org.eclipse.jface.viewers.AbstractTreeViewer.internalExpandToLevel(AbstractTreeViewer.java:1714)
at org.eclipse.jface.viewers.AbstractTreeViewer.internalExpandToLevel(AbstractTreeViewer.java:1724)
at org.eclipse.jface.viewers.AbstractTreeViewer.internalExpandToLevel(AbstractTreeViewer.java:1724)
at org.eclipse.jface.viewers.AbstractTreeViewer.internalExpandToLevel(AbstractTreeViewer.java:1724)
at org.eclipse.jface.viewers.AbstractTreeViewer.internalExpandToLevel(AbstractTreeViewer.java:1724)
at org.eclipse.jface.viewers.AbstractTreeViewer.expandToLevel(AbstractTreeViewer.java:1056)
at org.eclipse.jface.viewers.AbstractTreeViewer.expandToLevel(AbstractTreeViewer.java:1037)
at org.eclipse.jface.viewers.AbstractTreeViewer.expandAll(AbstractTreeViewer.java:1026)
at com.rockwellcollins.rccase.tarbuilder.actions.ExpandAllAction.run(ExpandAllAction.java:44)
at org.eclipse.jface.action.Action.runWithEvent(Action.java:498)
at org.eclipse.jface.action.ActionContributionItem.handleWidgetSelection(ActionContributionItem.java:584)
at org.eclipse.jface.action.ActionContributionItem.access$2(ActionContributionItem.java:501)
at org.eclipse.jface.action.ActionContributionItem$5.handleEvent(ActionContributionItem.java:411)
at org.eclipse.swt.widgets.EventTable.sendEvent(EventTable.java:84)
at org.eclipse.swt.widgets.Widget.sendEvent(Widget.java:1053)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Display.java:4066)
at org.eclipse.swt.widgets.Display.readAndDispatch(Display.java:3657)
at org.eclipse.ui.internal.Workbench.runEventLoop(Workbench.java:2640)
at org.eclipse.ui.internal.Workbench.runUI(Workbench.java:2604)
at org.eclipse.ui.internal.Workbench.access$4(Workbench.java:2438)
at org.eclipse.ui.internal.Workbench$7.run(Workbench.java:671)
at org.eclipse.core.databinding.observable.Realm.runWithDefault(Realm.java:332)
at org.eclipse.ui.internal.Workbench.createAndRunWorkbench(Workbench.java:664)
at org.eclipse.ui.PlatformUI.createAndRunWorkbench(PlatformUI.java:149)
at com.rockwellcollins.rccase.Application.start(Application.java:74)
at org.eclipse.equinox.internal.app.EclipseAppHandle.run(EclipseAppHandle.java:196)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:110)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:79)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:369)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:179)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:620)
at org.eclipse.equinox.launcher.Main.basicRun(Main.java:575)
at org.eclipse.equinox.launcher.Main.run(Main.java:1408)
at org.eclipse.equinox.launcher.Main.main(Main.java:1384)
// Label provider for that tree
public class ViewTreeLabelProvider extends LabelProvider
        implements IColorProvider, IBaseLabelProvider, IFontProvider {

    @Override
    public Image getImage(Object element) {
        if (element instanceof EObject) {
            return aa.getImages(element);
        }
        return super.getImage(element);
    }
}
// For loading images
public class aa {
    public static Image getImages(Object element) {
        if (element instanceof ClassA) {
            return ClassA.getimage();
        } else if (element instanceof ClassB) {
            return ClassB.getimage();
        } else if (element instanceof ClassC) {
            return ClassC.getimage();
        } else if (element instanceof ClassD) {
            return ClassD.getimage();
        }
        return null;
    }
}
My project code base is vast, so I could not share it completely; I wrote this snippet in a simple way to convey the problem.
Actually, the images are placed in the icons folder and are fetched by AbstractUIPlugin.imageDescriptorFromPlugin(plugin, path), which in turn stores the images in the image registry.
I also noticed that this may be due to the limit on GDI objects in Windows: after the 10,000 GDI object limit is reached, the exception is thrown.
By default on Windows 7, the GDIProcessHandleQuota value is 10,000. When I googled, I found that the value can be set to a maximum of 65,536.
I tried increasing GDIProcessHandleQuota from 10,000 to 65,000. The same exception is still thrown, but only after reaching 19,932 GDI objects.
I suspect that the problem is due to improper disposal of GDI objects and that the exception is thrown in the Image class.
Suggestions, please!
In your ClassA.getimage(), ClassB.getimage(), ... make sure not to create a new image each time getimage() is called; cache it:
private Image image;

public Image getImage() {
    if (image == null) {
        image = new Image(Display.getDefault(), "");
    }
    return image;
}
Since it seems you are in an Eclipse environment, it is even better to use org.eclipse.jface.resource.ImageRegistry.
Here "UIPlugin" is your plugin extending org.eclipse.ui.plugin.AbstractUIPlugin.
If you don't have one, create one and add it to your MANIFEST.MF (Bundle-Activator: YOURCLASS).
public Image getImage() {
    String key = getClass().getName();
    ImageRegistry imageRegistry = UIPlugin.getDefault().getImageRegistry();
    Image image = imageRegistry.get(key);
    if (image == null) {
        image = new Image(Display.getDefault(), "");
        imageRegistry.put(key, image);
    }
    return image;
}
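Both variants boil down to a keyed lazy cache: create each resource at most once and reuse it afterwards. A minimal stdlib sketch of that pattern (using String stand-ins for Image, since real SWT resources need a display; the LazyCache class and its key/value names are illustrative only):

```java
import java.util.HashMap;
import java.util.Map;

public class LazyCache {
    private final Map<String, String> registry = new HashMap<>();

    // Create the resource at most once per key, then always return the cached one.
    public String get(String key) {
        return registry.computeIfAbsent(key, k -> "resource-for-" + k);
    }

    public static void main(String[] args) {
        LazyCache cache = new LazyCache();
        String first = cache.get("ClassA");
        String second = cache.get("ClassA");
        // Same instance both times: no second "handle" is ever allocated.
        System.out.println(first == second);
    }
}
```

With images, this is what keeps the GDI handle count flat no matter how many times the tree asks the label provider for the same icon.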
I have problems indexing a table/collection using OrientDB. It actually looks very simple, but after some research I don't know where the problem is, or maybe I'm just blind. I followed the official test class:
https://code.google.com/p/orient/source/browse/trunk/tests/src/test/java/com/orientechnologies/orient/test/database/auto/ClassIndexTest.java
Take a look at the following code:
public class TestIndex {
    public static final String dbname = "indextest";
    public static final String colname = "object";
    public static final String indexAttribute = "ATTR1";

    public static void main(String[] args) {
        //ODatabaseDocumentTx db = new ODatabaseDocumentTx("local:/tmp/databases/" + dbname).create();
        ODatabaseDocumentTx db = new ODatabaseDocumentTx("local:/tmp/databases/orienttestdb");
        db.open("admin", "admin");
        try {
            if (db.getMetadata().getSchema().existsClass(colname))
                db.getMetadata().getSchema().dropClass(colname);
            OClass object = db.getMetadata().getSchema().getOrCreateClass(colname);
            object.createProperty(indexAttribute, OType.INTEGER);
            object.createIndex(colname, OClass.INDEX_TYPE.NOTUNIQUE, indexAttribute);
            db.getMetadata().getSchema().save();
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            db.close();
        }
    }
}
I always get the following exception:
Exception in thread "main" java.util.ServiceConfigurationError: com.orientechnologies.orient.core.index.OIndexFactory: Provider com.orientechnologies.orient.core.index.hashindex.local.OHashIndexFactory could not be instantiated: java.lang.NoSuchFieldError: UNIQUE_HASH
at java.util.ServiceLoader.fail(Unknown Source)
at java.util.ServiceLoader.access$100(Unknown Source)
at java.util.ServiceLoader$LazyIterator.next(Unknown Source)
at java.util.ServiceLoader$1.next(Unknown Source)
at com.orientechnologies.orient.core.index.OIndexes.getFactories(OIndexes.java:76)
at com.orientechnologies.orient.core.index.OIndexes.getAllFactories(OIndexes.java:87)
at com.orientechnologies.orient.core.index.OIndexes.createIndex(OIndexes.java:117)
at com.orientechnologies.orient.core.index.OIndexManagerShared.createIndex(OIndexManagerShared.java:76)
at com.orientechnologies.orient.core.index.OIndexManagerProxy.createIndex(OIndexManagerProxy.java:68)
at com.orientechnologies.orient.core.metadata.schema.OClassImpl.createIndex(OClassImpl.java:1123)
at com.orientechnologies.orient.core.metadata.schema.OClassImpl.createIndex(OClassImpl.java:1085)
at com.orientechnologies.orient.core.metadata.schema.OClassImpl.createIndex(OClassImpl.java:1081)
Does anybody know the solution for this? I also tried closing and reopening the db, saving the schema beforehand, and using different index types, but without any positive results.
Second question: is it necessary to define a schema in order to index columns/attributes in the collection, or is there another way?