Cannot start Neo4j Server after Spatial data load - java

I've been trying to use the Neo4j Spatial plugin with data loaded via Java. I have added the plugin, and when I start an empty database its presence is confirmed by the response to the following GET request to the server:
{
  "extensions": {
    "SpatialPlugin": {
      "addSimplePointLayer": "http://localhost:7474/db/data/ext/SpatialPlugin/graphdb/addSimplePointLayer",
      "findClosestGeometries": "http://localhost:7474/db/data/ext/SpatialPlugin/graphdb/findClosestGeometries",
      "addNodesToLayer": "http://localhost:7474/db/data/ext/SpatialPlugin/graphdb/addNodesToLayer",
      "addGeometryWKTToLayer": "http://localhost:7474/db/data/ext/SpatialPlugin/graphdb/addGeometryWKTToLayer",
      "findGeometriesWithinDistance": "http://localhost:7474/db/data/ext/SpatialPlugin/graphdb/findGeometriesWithinDistance",
      "addEditableLayer": "http://localhost:7474/db/data/ext/SpatialPlugin/graphdb/addEditableLayer",
      "addCQLDynamicLayer": "http://localhost:7474/db/data/ext/SpatialPlugin/graphdb/addCQLDynamicLayer",
      "addNodeToLayer": "http://localhost:7474/db/data/ext/SpatialPlugin/graphdb/addNodeToLayer",
      "getLayer": "http://localhost:7474/db/data/ext/SpatialPlugin/graphdb/getLayer",
      "findGeometriesInBBox": "http://localhost:7474/db/data/ext/SpatialPlugin/graphdb/findGeometriesInBBox",
      "updateGeometryFromWKT": "http://localhost:7474/db/data/ext/SpatialPlugin/graphdb/updateGeometryFromWKT"
    }
  },
  "node": "http://localhost:7474/db/data/node",
  "node_index": "http://localhost:7474/db/data/index/node",
  "relationship_index": "http://localhost:7474/db/data/index/relationship",
  "extensions_info": "http://localhost:7474/db/data/ext",
  "relationship_types": "http://localhost:7474/db/data/relationship/types",
  "batch": "http://localhost:7474/db/data/batch",
  "cypher": "http://localhost:7474/db/data/cypher",
  "indexes": "http://localhost:7474/db/data/schema/index",
  "constraints": "http://localhost:7474/db/data/schema/constraint",
  "transaction": "http://localhost:7474/db/data/transaction",
  "node_labels": "http://localhost:7474/db/data/labels",
  "neo4j_version": "2.3.2"
}
However, when I stop the server and load my spatial data via Java with a SpatialIndexProvider.SIMPLE_WKT_CONFIG index, adding nodes with:
try (Transaction tx = db.beginTx()) {
    // Legacy node index backed by the spatial provider
    Index<Node> index = db.index().forNodes("location", SpatialIndexProvider.SIMPLE_WKT_CONFIG);
    for (String line : lines) {
        String[] columns = line.split(",");
        Node node = db.createNode();
        // WKT takes longitude first, then latitude
        node.setProperty("wkt", String.format("POINT(%s %s)", columns[4], columns[3]));
        node.setProperty("name", columns[0]);
        // The key/value pair is a placeholder; the spatial index reads the node's "wkt" property
        index.add(node, "dummy", "value");
    }
    tx.success();
}
After a restart, I get the error:
2016-02-23 13:44:36.747+0000 ERROR [o.n.k.KernelHealth] setting TM not OK. Kernel has encountered some problem, please perform necessary action (tx recovery/restart) No index provider 'spatial' found. Maybe the intended provider (or one more of its dependencies) aren't on the classpath or it failed to load.
in Messages.log inside the graph.db directory. Is there anything obvious that I'm doing wrong?
I'm on Windows 8, Neo4j 2.3.2, Java 8, and neo4j-spatial-0.15-neo4j-2.3.0.jar.

Did you unzip the full spatial zip into the plugins directory?
Otherwise some classes that spatial needs can't be found.
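A quick way to check from the Java side (a minimal sketch, assuming the store is reopened embedded with neo4j-spatial and its dependencies on the classpath, and that "graph.db" is the store path) is to ask for the index before handing the store to the server:
import java.io.File;
import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.Transaction;
import org.neo4j.graphdb.factory.GraphDatabaseFactory;

// Sketch: if this reopen fails with the same "No index provider 'spatial'
// found" error, the jar or one of its dependencies is missing from the
// classpath, not just from the server's plugins directory.
GraphDatabaseService db = new GraphDatabaseFactory()
        .newEmbeddedDatabase(new File("graph.db"));
try (Transaction tx = db.beginTx()) {
    System.out.println("location index exists: " + db.index().existsForNodes("location"));
    tx.success();
}
db.shutdown();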

Related

problem of associating a data source associated with Word mail merge from a java application (OLE communication)

I'm working on a Java application that interacts with Word through an OLE library (org.eclipse.swt.ole.win32) to merge documents (mail merge).
The Java method that performs the merge has been working for several years without any particular problem,
but recently the data source can no longer be associated with the merge document.
The problem is random: it works on some workstations and not on others, despite identical system configurations.
No explicit error is reported on the Java side.
Here is the method that communicates with Word:
public void mergeDocument(File model, File source) throws Exception {
    OleAutomation autoMailMerge = null;
    LOGGER.log(new Status(IStatus.INFO, pluginID, "Merge d un document"));
    LOGGER.log(new Status(IStatus.INFO, pluginID, "fichier modele: " + model.getCanonicalPath()));
    LOGGER.log(new Status(IStatus.INFO, pluginID, "fichier source: " + source.getPath()));
    openDocumentReadOnly(model);
    autoMailMerge = OLEHelper.getAutomationProperty(autoDocument, "MailMerge");
    if ((source != null) && (source.exists()) && (!source.isDirectory())) {
        OLEHelper.invoke(autoMailMerge, "OpenDataSource", source.getPath());
    } else {
        throw new MSWordOleInterfaceException(MSWordOleInterfaceCst.MSG_ERROR_EMPTY_SOURCE_PATH
                + ((source == null) ? "null" : source.getPath()));
    }
    OLEHelper.invoke(autoMailMerge, "Execute");
    OleAutomation autoDocumentMerged = getActiveDocument();
    closeDocument(autoDocument);
    activateDocument(autoDocumentMerged);
    autoDocument = autoDocumentMerged;
    autoMailMerge.dispose();
}
Merging by hand from Word (associating the data source and merging) works on workstations where the Java application does not.
Using an OLE command, I confirmed that it is the data source that is not being passed (on a workstation where it works, I get back the name of the source; on one where it doesn't, the return is empty):
LOGGER.log(new Status(IStatus.INFO, pluginID, "data source name: "
        + OLEHelper.getVariantProperty(autoDataSource, "Name").getString()));
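A defensive variant of that check (a hedged sketch reusing the question's own OLEHelper wrappers; the "DataSource" property lookup follows the Word object model) would fail fast inside mergeDocument instead of merging silently without a source:
// Sketch: read MailMerge.DataSource.Name back right after OpenDataSource and
// abort if Word did not attach the source.
OleAutomation autoDataSource = OLEHelper.getAutomationProperty(autoMailMerge, "DataSource");
Variant name = OLEHelper.getVariantProperty(autoDataSource, "Name");
if (name == null || name.getString() == null || name.getString().isEmpty()) {
    throw new MSWordOleInterfaceException("Data source was not attached: " + source.getPath());
}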
A temporary workaround was found by deleting the Office-related registry key:
HKEY_CURRENT_USER\SOFTWARE\Microsoft\Office\16.0\Word\DocumentTemplateCache
but this is only a temporary fix; the problem comes back.

Cassandra read timeout in Java but runs well in cqlsh

I'm running a select query on Cassandra from Java and it returns a read timeout exception. The same query does not produce an error in cqlsh, so I guess the problem is in the code. I'm using Flink to connect to Cassandra. I also suspect the difference is that cqlsh pages its results (I have to hit Enter to get more), which may stop the driver from reading through the whole result set at once. When I use LIMIT 100 it runs fine. I have changed the read settings in cassandra.yaml, but nothing changed.
ClusterBuilder cb = new ClusterBuilder() {
    @Override
    public Cluster buildCluster(Cluster.Builder builder) {
        return builder.withPort(9042).addContactPoint("127.0.0.1")
                .withSocketOptions(new SocketOptions().setReadTimeoutMillis(60000).setKeepAlive(true).setReuseAddress(true))
                .withLoadBalancingPolicy(new RoundRobinPolicy())
                .withReconnectionPolicy(new ConstantReconnectionPolicy(500L))
                .withQueryOptions(new QueryOptions().setConsistencyLevel(ConsistencyLevel.ONE))
                .build();
    }
};
CassandraInputFormat<Tuple1<ProfileAlternative>> cassandraInputFormat = new CassandraInputFormat<>(
        "SELECT toJson(profilealternative) FROM profiles.profile where skills contains 'Financial Sector'", cb);
cassandraInputFormat.configure(null);
cassandraInputFormat.open(null);
Tuple1<ProfileAlternative> testOutputTuple = new Tuple1<>();
while (!cassandraInputFormat.reachedEnd()) {
    cassandraInputFormat.nextRecord(testOutputTuple);
    System.out.println(testOutputTuple.f0);
}
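cqlsh doesn't read the whole result set at once; hitting Enter fetches the next page. A hedged sketch of mirroring that in the driver by lowering the fetch size, so each server round trip stays small (this is a guess at the cause; the 100-row page size echoes the LIMIT 100 observation above):
import com.datastax.driver.core.ConsistencyLevel;
import com.datastax.driver.core.QueryOptions;

// Sketch: page through results 100 rows at a time instead of the default 5000,
// so a single fetch is less likely to hit the read timeout.
QueryOptions pagedOptions = new QueryOptions()
        .setConsistencyLevel(ConsistencyLevel.ONE)
        .setFetchSize(100);
Passing pagedOptions to withQueryOptions(...) inside buildCluster would replace the options built inline above.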

Using Spring Data MongoDB, how can I avoid the "Duplicate vertices" error?

I get this error on one of the polygons I am importing:
Write failed with error code 16755 and error message 'Can't extract geo keys: { _id: "b9c5ac0c-e469-4b97-b059-436cd02ffe49", _class: .... ] Duplicate vertices: 0 and 15'
Full stack Trace: https://gist.github.com/boundaries-io/927aa14e8d1e42d7cf516dc25b6ebb66#file-stacktrace
The GeoJSON MultiPolygon is imported using Spring Data MongoDB:
public class MyPolgyon {
    @Id
    String id;
    @GeoSpatialIndexed(type = GeoSpatialIndexType.GEO_2DSPHERE)
    GeoJsonPoint position;
    @GeoSpatialIndexed(type = GeoSpatialIndexType.GEO_2DSPHERE)
    GeoJsonPoint location;
    @GeoSpatialIndexed(type = GeoSpatialIndexType.GEO_2DSPHERE)
    GeoJsonPolygon polygon;

    public static GeoJsonPolygon generateGeoJsonPolygon(List<LngLatAlt> coordinates) {
        List<Point> points = new ArrayList<Point>();
        for (LngLatAlt point : coordinates) {
            org.springframework.data.geo.Point dataPoint =
                    new org.springframework.data.geo.Point(point.getLongitude(), point.getLatitude());
            points.add(dataPoint);
        }
        return new GeoJsonPolygon(points);
    }
}
How can I avoid this error in Java?
The GeoJSON loads fine in http://geojson.io.
Here is the GeoJSON: https://gist.github.com/boundaries-io/4719bfc386c3728b36be10af29860f4c#file-rol-ca-part1-geojson
I remove duplicates using:
for (com.vividsolutions.jts.geom.Coordinate coordinate : geometry.getCoordinates()) {
    Point lngLatAtl = new Point(coordinate.x, coordinate.y);
    boolean isADup = points.contains(lngLatAtl);
    if (!isADup) {
        points.add(lngLatAtl);
    } else {
        LOGGER.debug("Duplicate, [" + lngLatAtl.toString() + "] index[" + count + "]");
    }
    count++;
}
Logging:
2017-10-27 22:38:18 DEBUG TestBugs:58 - Duplicate, [Point [x=-97.009868, y=52.358242]] index[15]
2017-10-27 22:38:18 DEBUG TestBugs:58 - Duplicate, [Point [x=-97.009868, y=52.358242]] index[3348]
In this case you have a duplicate vertex at index 0 and index 1341 of the 2nd polygon:
[ -62.95859676499998, 46.20653318300003 ]
The insertion fails when MongoDB tries to build the 2dsphere index for the document. Remove the coordinate at index 1341 and you should be able to persist successfully.
You just have to cleanse the data when you find the error.
You can write a small program that reads the error from MongoDB and reports it back to the client; the client can then act on those messages and retry the request.
More information on geo errors can be found here.
You can look at the GeoParser code to see how and what errors are generated; the specific error you got comes from there as well. The error is generated by the S2 library that MongoDB uses for validation.
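For the cleansing step, a minimal sketch (using the same JTS and Spring types as the snippet above) that drops interior duplicates while preserving ring closure, since the first and last coordinate of a GeoJSON ring must be equal and a blanket contains() check would otherwise break the ring:
// Sketch: remove duplicate vertices, then re-close the ring afterwards.
List<Point> points = new ArrayList<>();
for (com.vividsolutions.jts.geom.Coordinate c : geometry.getCoordinates()) {
    Point p = new Point(c.x, c.y);
    if (!points.contains(p)) {
        points.add(p);
    }
}
if (!points.isEmpty() && !points.get(0).equals(points.get(points.size() - 1))) {
    points.add(points.get(0)); // restore ring closure
}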

Unable to Connect to BigQuery from local App Engine instance in Eclipse

I'm new to Google App Engine and I'm trying to run through some of the tutorials to see how this would work for my organization. We are looking at putting some of our data into BigQuery and converting some of our Web applications to App Engine which would need to access BigQuery data.
I am using the java-docs-samples-master code, specifically bigquery/cloud-client/src/main/java/com/example/bigquery/SimpleApp.java
I can run this from the command line using
mvn exec:java -Dexec.mainClass=com.example.bigquery.SimpleAppMain
I incorporated the code into App Engine, which I'm running in Eclipse, and created a wrapper so I could still run it from the command line. It works from the command line, but I get an error when I run it from App Engine in Eclipse.
Is there something I'm missing to configure my local App Engine to connect to Big Query?
Error:
com.google.cloud.bigquery.BigQueryException: Invalid project ID 'no_app_id'. Project IDs must contain 6-63 lowercase letters, digits, or dashes. IDs must start with a letter and may not end with a dash.
at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:86)
at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.create(HttpBigQueryRpc.java:170)
at com.google.cloud.bigquery.BigQueryImpl$3.call(BigQueryImpl.java:208)
...
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400
{ "code" : 400,
"errors" : [ {
"domain" : "global",
"message" : "Invalid project ID 'no_app_id'. Project IDs must contain 6-63 lowercase letters, digits, or dashes. IDs must start with a letter and may not end with a dash.",
"reason" : "invalid"
} ],
"message" : "Invalid project ID 'no_app_id'. Project IDs must contain 6-63 lowercase letters, digits, or dashes. IDs must start with a letter and may not end with a dash."
}
Code:
package com.example.bigquery;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FieldValue;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobId;
import com.google.cloud.bigquery.JobInfo;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.QueryResponse;
import com.google.cloud.bigquery.QueryResult;
import java.util.List;
import java.util.UUID;
public class SimpleApp {
    public void runBQ() throws Exception {
        // [START create_client]
        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
        // [END create_client]
        // [START run_query]
        QueryJobConfiguration queryConfig =
                QueryJobConfiguration.newBuilder(
                        "SELECT "
                                + "APPROX_TOP_COUNT(corpus, 10) as title, "
                                + "COUNT(*) as unique_words "
                                + "FROM `publicdata.samples.shakespeare`;")
                        // Use standard SQL syntax for queries.
                        // See: https://cloud.google.com/bigquery/sql-reference/
                        .setUseLegacySql(false)
                        .build();
        // Create a job ID so that we can safely retry.
        JobId jobId = JobId.of(UUID.randomUUID().toString());
        Job queryJob = bigquery.create(JobInfo.newBuilder(queryConfig).setJobId(jobId).build());
        // Wait for the query to complete.
        queryJob = queryJob.waitFor();
        // Check for errors
        if (queryJob == null) {
            throw new RuntimeException("Job no longer exists");
        } else if (queryJob.getStatus().getError() != null) {
            // You can also look at queryJob.getStatus().getExecutionErrors() for all
            // errors, not just the latest one.
            throw new RuntimeException(queryJob.getStatus().getError().toString());
        }
        // Get the results.
        QueryResponse response = bigquery.getQueryResults(jobId);
        // [END run_query]
        // [START print_results]
        QueryResult result = response.getResult();
        // Print all pages of the results.
        while (result != null) {
            for (List<FieldValue> row : result.iterateAll()) {
                List<FieldValue> titles = row.get(0).getRepeatedValue();
                System.out.println("titles:");
                for (FieldValue titleValue : titles) {
                    List<FieldValue> titleRecord = titleValue.getRecordValue();
                    String title = titleRecord.get(0).getStringValue();
                    long uniqueWords = titleRecord.get(1).getLongValue();
                    System.out.printf("\t%s: %d\n", title, uniqueWords);
                }
                long uniqueWords = row.get(1).getLongValue();
                System.out.printf("total unique words: %d\n", uniqueWords);
            }
            result = result.getNextPage();
        }
        // [END print_results]
    }
}
From the looks of your error message, it's probably due to your project ID not being set (it falls back to 'no_app_id'). Here is how to set your project ID for App Engine: https://developers.google.com/eclipse/docs/appengine_appid_version.
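If the Eclipse setting doesn't take effect, a minimal sketch of pinning the project ID in code instead of relying on the environment ("your-project-id" is a placeholder):
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;

// Sketch: build the client with an explicit project ID rather than the
// default instance, which resolves the project from the App Engine runtime.
BigQuery bigquery = BigQueryOptions.newBuilder()
        .setProjectId("your-project-id") // placeholder
        .build()
        .getService();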
Not sure if I am late, but I encountered such error while working with Firestore and it was due to no project being set on the 'Cloud Platform' tab in App Engine run configuration. When I logged into an account and selected a project ID, this error went away.

Corrupt Olympus Makernote Exif Directory

I'm trying to extract information from my photos via Java.
My camera, an Olympus E-510, saves all pictures with a corrupt makernote directory. When I try to get the tags from the OlympusMakernoteDirectory, there are none; each directory has one error, which turns out to be an "Illegally sized directory" error.
Do I have any chance of somehow accessing the data in the directory? I wouldn't mind juggling bytes but I have no idea where to start :(
I'm currently working on a JavaScript-only solution and will contribute it here in the next few days:
https://github.com/redaktor/exiftool.js (it is just a fork for now)
Some Olympus cameras seem to store their makernotes only in subIFDs.

"I wouldn't mind juggling bytes but I have no idea where to start :("

Start here: these two tags have a special meaning:
0x3000: 'RawInfo',
0x4000: 'MainInfo',
They can be either pointers to SubIFDs or arrays, and some cameras store the "root tags" in 0x4000:
{
0x2010: '_IFDpointer_Equipment', // (misleading name returns many things, e.g. serial)
0x2020: '_IFDpointer_CameraSettings',
0x2030: '_IFDpointer_RawDevelopment',
0x2031: '_IFDpointer_RawDevelopment2',
0x2040: '_IFDpointer_ImageProcessing',
0x2050: '_IFDpointer_FocusInfo',
0x0000: 'MakerNoteVersion',
0x0001: 'CameraSettings',
0x0003: 'CameraSettings',
0x0040: 'CompressedImageSize',
0x0081: 'PreviewImageData',
0x0088: 'PreviewImageStart',
0x0089: 'PreviewImageLength',
0x0100: 'ThumbnailImage',
0x0104: 'BodyFirmwareVersion',
0x0200: 'SpecialMode',
0x0201: 'Quality',
0x0202: 'Macro',
0x0203: 'BWMode',
0x0204: 'DigitalZoom',
0x0205: 'FocalPlaneDiagonal',
0x0206: 'LensDistortionParams',
0x0207: 'Olympus CameraType Values',
0x0208: 'Olympus TextInfo',
0x020b: 'EpsonImageWidth',
0x020c: 'EpsonImageHeight',
0x020d: 'EpsonSoftware',
0x0280: 'PreviewImage',
0x0300: 'PreCaptureFrames',
0x0301: 'WhiteBoard',
0x0302: 'OneTouchWB',
0x0303: 'WhiteBalanceBracket',
0x0304: 'WhiteBalanceBias',
0x0404: 'SerialNumber',
0x0405: 'Firmware',
0x0e00: 'PrintIM',
0x0f00: 'DataDump',
0x0f01: 'DataDump2',
0x0f04: 'ZoomedPreviewStart',
0x0f05: 'ZoomedPreviewLength',
0x0f06: 'ZoomedPreviewSize',
0x1000: 'ShutterSpeedValue',
0x1001: 'ISOValue',
0x1002: 'ApertureValue',
0x1003: 'BrightnessValue',
0x1004: 'FlashMode',
0x1006: 'ExposureCompensation',
0x1007: 'SensorTemperature',
0x1008: 'LensTemperature',
0x1009: 'LightCondition',
0x100a: 'FocusRange',
0x100b: 'FocusMode',
0x100c: 'ManualFocusDistance',
0x100d: 'ZoomStepCount',
0x100e: 'FocusStepCount',
0x100f: 'Sharpness',
0x1010: 'FlashChargeLevel',
0x1011: 'ColorMatrix',
0x1012: 'BlackLevel',
0x1013: 'ColorTemperatureBG?',
0x1014: 'ColorTemperatureRG?',
0x1017: 'RedBalance',
0x1018: 'BlueBalance',
0x1019: 'ColorMatrixNumber',
0x101a: 'SerialNumber',
0x101b: 'ExternalFlashAE1_0?',
0x101c: 'ExternalFlashAE2_0?',
0x101d: 'InternalFlashAE1_0?',
0x101e: 'InternalFlashAE2_0?',
0x101f: 'ExternalFlashAE1?',
0x1020: 'ExternalFlashAE2?',
0x1021: 'InternalFlashAE1?',
0x1022: 'InternalFlashAE2?',
0x1023: 'FlashExposureComp',
0x1024: 'InternalFlashTable',
0x1025: 'ExternalFlashGValue',
0x1026: 'ExternalFlashBounce',
0x1027: 'ExternalFlashZoom',
0x1028: 'ExternalFlashMode',
0x1029: 'Contrast',
0x102a: 'SharpnessFactor',
0x102b: 'ColorControl',
0x102c: 'ValidBits',
0x102d: 'CoringFilter',
0x102e: 'OlympusImageWidth',
0x102f: 'OlympusImageHeight',
0x1030: 'SceneDetect',
0x1031: 'SceneArea?',
0x1033: 'SceneDetectData?',
0x1034: 'CompressionRatio',
0x1035: 'PreviewImageValid',
0x1036: 'PreviewImageStart',
0x1037: 'PreviewImageLength',
0x1038: 'AFResult',
0x1039: 'CCDScanMode',
0x103a: 'NoiseReduction',
0x103b: 'FocusStepInfinity',
0x103c: 'FocusStepNear',
0x103d: 'LightValueCenter',
0x103e: 'LightValuePeriphery',
0x103f: 'FieldCount?'
}
_IFDpointer_Equipment: {
0x0000: 'EquipmentVersion',
// 0x0100: { ref: ' Olympus CameraType Values' }, // TODO
0x0101: 'SerialNumber',
0x0102: 'InternalSerialNumber',
0x0103: 'FocalPlaneDiagonal',
0x0104: 'BodyFirmwareVersion',
// 0x0201: { ref: ' Olympus LensType Values' }, // TODO
0x0202: 'LensSerialNumber',
0x0203: 'LensModel',
0x0204: 'LensFirmwareVersion',
0x0205: 'MaxApertureAtMinFocal',
0x0206: 'MaxApertureAtMaxFocal',
0x0207: 'MinFocalLength',
0x0208: 'MaxFocalLength',
0x020a: 'MaxAperture',
0x020b: 'LensProperties',
0x0301: 'Extender',
0x0302: 'ExtenderSerialNumber',
0x0303: 'ExtenderModel',
0x0304: 'ExtenderFirmwareVersion',
0x0403: 'ConversionLens',
0x1000: 'FlashType',
0x1002: 'FlashFirmwareVersion',
0x1003: 'FlashSerialNumber'
},
_IFDpointer_CameraSettings: {
0x0000: 'CameraSettingsVersion',
0x0100: 'PreviewImageValid',
0x0101: 'PreviewImageStart',
0x0102: 'PreviewImageLength',
0x0200: 'ExposureMode',
0x0201: 'AELock',
0x0203: 'ExposureShift',
0x0204: 'NDFilter',
0x0300: 'MacroMode',
0x0302: 'FocusProcess',
0x0303: 'AFSearch',
0x0304: 'AFAreas',
0x0305: 'AFPointSelected',
0x0306: 'AFFineTune',
0x0307: 'AFFineTuneAdj',
0x0401: 'FlashExposureComp',
0x0404: 'FlashControlMode',
0x0405: 'FlashIntensity',
0x0406: 'ManualFlashStrength',
0x0501: 'WhiteBalanceTemperature',
0x0502: 'WhiteBalanceBracket',
0x0503: 'CustomSaturation',
0x0504: 'ModifiedSaturation',
0x0505: 'ContrastSetting',
0x0506: 'SharpnessSetting',
0x0507: 'ColorSpace',
0x050a: 'NoiseReduction',
0x050b: 'DistortionCorrection',
0x050c: 'ShadingCompensation',
0x050d: 'CompressionFactor',
0x050f: 'Gradation',
0x0521: 'PictureModeSaturation',
0x0522: 'PictureModeHue?',
0x0523: 'PictureModeContrast',
0x0524: 'PictureModeSharpness',
0x0527: 'NoiseFilter',
0x052d: 'PictureModeEffect',
0x052e: 'ToneLevel',
0x0600: 'DriveMode',
0x0601: 'PanoramaMode',
0x0603: 'ImageQuality2',
0x0604: 'ImageStabilization',
0x0900: 'ManometerPressure',
0x0901: 'ManometerReading',
0x0902: 'ExtendedWBDetect',
0x0903: 'LevelGaugeRoll',
0x0904: 'LevelGaugePitch',
0x0908: 'DateTimeUTC'
},
_IFDpointer_RawDevelopment: {
0x0000: 'RawDevVersion',
0x0100: 'RawDevExposureBiasValue',
0x0101: 'RawDevWhiteBalanceValue',
0x0102: 'RawDevWBFineAdjustment',
0x0103: 'RawDevGrayPoint',
0x0104: 'RawDevSaturationEmphasis',
0x0105: 'RawDevMemoryColorEmphasis',
0x0106: 'RawDevContrastValue',
0x0107: 'RawDevSharpnessValue',
0x0108: 'RawDevColorSpace',
0x0109: 'RawDevEngine',
0x010a: 'RawDevNoiseReduction',
0x010b: 'RawDevEditStatus'
},
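For completeness, a minimal Java sketch (assuming the metadata-extractor library the question is using; package names follow recent versions of that library, and "photo.jpg" is a placeholder) that dumps whatever tags and errors the Olympus makernote directory does expose, which is a reasonable first step before juggling bytes:
import java.io.File;
import com.drew.imaging.ImageMetadataReader;
import com.drew.metadata.Metadata;
import com.drew.metadata.Tag;
import com.drew.metadata.exif.makernotes.OlympusMakernoteDirectory;

// Sketch: print every tag and every parse error the directory recorded,
// to see how far the parser got before the "Illegally sized directory" error.
Metadata metadata = ImageMetadataReader.readMetadata(new File("photo.jpg"));
OlympusMakernoteDirectory dir = metadata.getFirstDirectoryOfType(OlympusMakernoteDirectory.class);
if (dir != null) {
    for (Tag tag : dir.getTags()) {
        System.out.println(tag);
    }
    for (String error : dir.getErrors()) {
        System.out.println("error: " + error);
    }
}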
