I'm working on a Java application that interacts with Word through an OLE library (org.eclipse.swt.ole.win32) to merge documents (mail merge).
The Java method that performs the merge has been working for several years without any particular problem, but recently the data source can no longer be associated with the merge document.
The problem is random: on some workstations it works and on others it doesn't, even though they have the same system configuration. I get no explicit error on the Java side.
Here is the method that communicates with Word:
public void mergeDocument(File model, File source) throws Exception {
    OleAutomation autoMailMerge = null;
    LOGGER.log(new Status(IStatus.INFO, pluginID, "Merging a document"));
    LOGGER.log(new Status(IStatus.INFO, pluginID, "model file: " + model.getCanonicalPath()));
    LOGGER.log(new Status(IStatus.INFO, pluginID, "source file: " + source.getPath()));

    openDocumentReadOnly(model);
    autoMailMerge = OLEHelper.getAutomationProperty(autoDocument, "MailMerge");

    if ((source != null) && source.exists() && !source.isDirectory()) {
        OLEHelper.invoke(autoMailMerge, "OpenDataSource", source.getPath());
    } else {
        throw new MSWordOleInterfaceException(MSWordOleInterfaceCst.MSG_ERROR_EMPTY_SOURCE_PATH
                + ((source == null) ? "null" : source.getPath()));
    }

    OLEHelper.invoke(autoMailMerge, "Execute");

    OleAutomation autoDocumentMerged = getActiveDocument();
    closeDocument(autoDocument);
    activateDocument(autoDocumentMerged);
    autoDocument = autoDocumentMerged;
    autoMailMerge.dispose();
}
Merging by hand from Word (associating the data source and then merging) works on the workstations where the Java application does not.
Using an OLE call I confirmed that it is the data source that is not being set: on a workstation where it works, I get back the name of the source; on one where it doesn't, the return is empty.
LOGGER.log(new Status(IStatus.INFO, pluginID, "data source name: "
        + OLEHelper.getVariantProperty(autoDataSource, "Name").getString()));
A temporary workaround was found by deleting the Office-related registry key:
HKEY_CURRENT_USER\SOFTWARE\Microsoft\Office\16.0\Word\DocumentTemplateCache
but this is only a stopgap; the problem comes back.
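For reference, the workaround can be scripted on an affected workstation (this assumes Office 16.0; the version segment of the key path depends on the installed Office release):

```shell
reg delete "HKEY_CURRENT_USER\SOFTWARE\Microsoft\Office\16.0\Word\DocumentTemplateCache" /f
```

The `/f` flag suppresses the confirmation prompt, so the command can run from a logon script until the root cause is found.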
I get the error on one of the polygons I am importing:
Write failed with error code 16755 and error message 'Can't extract geo keys: { _id: "b9c5ac0c-e469-4b97-b059-436cd02ffe49", _class: .... ] Duplicate vertices: 0 and 15'
Full stack trace: https://gist.github.com/boundaries-io/927aa14e8d1e42d7cf516dc25b6ebb66#file-stacktrace
Here is the GeoJSON MultiPolygon I am importing using Spring Data MongoDB:
public class MyPolygon {
    @Id
    String id;

    @GeoSpatialIndexed(type = GeoSpatialIndexType.GEO_2DSPHERE)
    GeoJsonPoint position;

    @GeoSpatialIndexed(type = GeoSpatialIndexType.GEO_2DSPHERE)
    GeoJsonPoint location;

    @GeoSpatialIndexed(type = GeoSpatialIndexType.GEO_2DSPHERE)
    GeoJsonPolygon polygon;

    public static GeoJsonPolygon generateGeoJsonPolygon(List<LngLatAlt> coordinates) {
        List<Point> points = new ArrayList<Point>();
        for (LngLatAlt point : coordinates) {
            org.springframework.data.geo.Point dataPoint =
                    new org.springframework.data.geo.Point(point.getLongitude(), point.getLatitude());
            points.add(dataPoint);
        }
        return new GeoJsonPolygon(points);
    }
}
How can I avoid this error in Java?
I can load the GeoJSON fine in http://geojson.io
Here is the GeoJSON: https://gist.github.com/boundaries-io/4719bfc386c3728b36be10af29860f4c#file-rol-ca-part1-geojson
I tried removing the duplicates using:
for (com.vividsolutions.jts.geom.Coordinate coordinate : geometry.getCoordinates()) {
    Point lngLatAlt = new Point(coordinate.x, coordinate.y);
    boolean isADup = points.contains(lngLatAlt);
    if (!isADup) {
        points.add(lngLatAlt);
    } else {
        LOGGER.debug("Duplicate, [" + lngLatAlt.toString() + "] index[" + count + "]");
    }
    count++;
}
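One caveat with a plain contains() filter: it also strips the closing vertex, and GeoJSON rings must start and end on the same position. Below is a minimal self-contained sketch that drops duplicate vertices while preserving the closing vertex; it uses plain double[] pairs instead of JTS coordinates, which is an assumption made to keep the example dependency-free.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class RingDedup {
    // Drops duplicate vertices from a linear ring but keeps the closing
    // vertex, since GeoJSON rings must start and end on the same position.
    public static List<double[]> dedupRing(List<double[]> ring) {
        List<double[]> result = new ArrayList<>();
        Set<String> seen = new HashSet<>();
        for (int i = 0; i < ring.size(); i++) {
            double[] v = ring.get(i);
            boolean closing = i == ring.size() - 1 && Arrays.equals(v, ring.get(0));
            // Arrays.toString gives a value-based key; double[] itself uses
            // identity equals/hashCode, so it would never match in a HashSet.
            if (closing || seen.add(Arrays.toString(v))) {
                result.add(v);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        List<double[]> ring = Arrays.asList(
                new double[]{-97.009868, 52.358242}, // index 0
                new double[]{-96.9, 52.5},
                new double[]{-97.009868, 52.358242}, // interior duplicate: dropped
                new double[]{-97.1, 52.3},
                new double[]{-97.009868, 52.358242}  // closing vertex: kept
        );
        System.out.println(dedupRing(ring).size()); // 4
    }
}
```

This keeps the ring valid for MongoDB's 2dsphere indexing while removing the duplicate interior vertices that S2 rejects.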
Logging:
2017-10-27 22:38:18 DEBUG TestBugs:58 - Duplicate, [Point [x=-97.009868, y=52.358242]] index[15]
2017-10-27 22:38:18 DEBUG TestBugs:58 - Duplicate, [Point [x=-97.009868, y=52.358242]] index[3348]
In this case you have a duplicate vertex at index 0 and index 1341 of the 2nd polygon:
[ -62.95859676499998, 46.20653318300003 ]
The insertion fails when MongoDB tries to build the 2dsphere index for the document. Remove the coordinate at index 1341 and you should be able to persist successfully.
You just have to cleanse the data when you hit this error. You can write a small program that reads the error from MongoDB and reports the offending index back to the client; the client can act on that message and retry the request.
More information on geo errors can be found here.
You can look at the GeoParser code to see how and which errors are generated; for the specific error you got, see GeoParser. The error itself is produced by the S2 library that MongoDB uses for validation.
I'm new to Google App Engine and I'm trying to run through some of the tutorials to see how this would work for my organization. We are looking at putting some of our data into BigQuery and converting some of our Web applications to App Engine which would need to access BigQuery data.
I am using the java-docs-samples-master code, specifically bigquery/cloud-client/src/main/java/com/example/bigquery/SimpleApp.java
I can run this from the command line using
mvn exec:java -Dexec.mainClass=com.example.bigquery.SimpleAppMain
I incorporated the code into App Engine, which I'm running in Eclipse, and created a wrapper so I could still run it from the command line. It works when run from the command line, but I get an error when I run it from App Engine in Eclipse.
Is there something I'm missing to configure my local App Engine to connect to Big Query?
Error:
com.google.cloud.bigquery.BigQueryException: Invalid project ID 'no_app_id'. Project IDs must contain 6-63 lowercase letters, digits, or dashes. IDs must start with a letter and may not end with a dash.
at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:86)
at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.create(HttpBigQueryRpc.java:170)
at com.google.cloud.bigquery.BigQueryImpl$3.call(BigQueryImpl.java:208)
...
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 400
{
  "code" : 400,
  "errors" : [ {
    "domain" : "global",
    "message" : "Invalid project ID 'no_app_id'. Project IDs must contain 6-63 lowercase letters, digits, or dashes. IDs must start with a letter and may not end with a dash.",
    "reason" : "invalid"
  } ],
  "message" : "Invalid project ID 'no_app_id'. Project IDs must contain 6-63 lowercase letters, digits, or dashes. IDs must start with a letter and may not end with a dash."
}
Code:
package com.example.bigquery;

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FieldValue;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobId;
import com.google.cloud.bigquery.JobInfo;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.QueryResponse;
import com.google.cloud.bigquery.QueryResult;

import java.util.List;
import java.util.UUID;

public class SimpleApp {
  public void runBQ() throws Exception {
    // [START create_client]
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
    // [END create_client]

    // [START run_query]
    QueryJobConfiguration queryConfig =
        QueryJobConfiguration.newBuilder(
                "SELECT "
                    + "APPROX_TOP_COUNT(corpus, 10) as title, "
                    + "COUNT(*) as unique_words "
                    + "FROM `publicdata.samples.shakespeare`;")
            // Use standard SQL syntax for queries.
            // See: https://cloud.google.com/bigquery/sql-reference/
            .setUseLegacySql(false)
            .build();

    // Create a job ID so that we can safely retry.
    JobId jobId = JobId.of(UUID.randomUUID().toString());
    Job queryJob = bigquery.create(JobInfo.newBuilder(queryConfig).setJobId(jobId).build());

    // Wait for the query to complete.
    queryJob = queryJob.waitFor();

    // Check for errors.
    if (queryJob == null) {
      throw new RuntimeException("Job no longer exists");
    } else if (queryJob.getStatus().getError() != null) {
      // You can also look at queryJob.getStatus().getExecutionErrors() for all
      // errors, not just the latest one.
      throw new RuntimeException(queryJob.getStatus().getError().toString());
    }

    // Get the results.
    QueryResponse response = bigquery.getQueryResults(jobId);
    // [END run_query]

    // [START print_results]
    QueryResult result = response.getResult();

    // Print all pages of the results.
    while (result != null) {
      for (List<FieldValue> row : result.iterateAll()) {
        List<FieldValue> titles = row.get(0).getRepeatedValue();
        System.out.println("titles:");
        for (FieldValue titleValue : titles) {
          List<FieldValue> titleRecord = titleValue.getRecordValue();
          String title = titleRecord.get(0).getStringValue();
          long uniqueWords = titleRecord.get(1).getLongValue();
          System.out.printf("\t%s: %d\n", title, uniqueWords);
        }
        long uniqueWords = row.get(1).getLongValue();
        System.out.printf("total unique words: %d\n", uniqueWords);
      }
      result = result.getNextPage();
    }
    // [END print_results]
  }
}
From the looks of your error, it's probably due to your project ID not being set (hence the placeholder 'no_app_id'). Here is how to set your project ID for App Engine: https://developers.google.com/eclipse/docs/appengine_appid_version
Not sure if I am late, but I encountered this error while working with Firestore, and it was due to no project being selected on the 'Cloud Platform' tab of the App Engine run configuration. Once I logged into an account and selected a project ID, the error went away.
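As a quick local sanity check, the rule quoted in the error message can be expressed as a regex. Note that the pattern below is my own rendering of the rule stated in the error text, not Google's actual validation code:

```java
public class ProjectIdCheck {
    // "6-63 lowercase letters, digits, or dashes; must start with a letter
    //  and may not end with a dash" - as quoted by the BigQuery error above.
    private static final String PATTERN = "[a-z][a-z0-9-]{4,61}[a-z0-9]";

    public static boolean isValidProjectId(String id) {
        return id.matches(PATTERN);
    }

    public static void main(String[] args) {
        System.out.println(isValidProjectId("my-project-123")); // true
        System.out.println(isValidProjectId("no_app_id"));      // false: underscore
    }
}
```

Running the placeholder 'no_app_id' through this check fails on the underscore, which is consistent with the project ID never having been substituted in.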
I'm trying to extract information from my photos via Java.
My camera, an Olympus E-510, saves all pictures with a corrupt makernote directory. When I try to get the tags from the OlympusMakernoteDirectory, there are none, and each directory has one error, which turns out to be an "Illegally sized directory" error.
Do I have any chance of somehow accessing the data in the directory? I wouldn't mind juggling bytes but I have no idea where to start :(
I'm currently working on a JavaScript-only solution and will contribute it here in the next few days:
https://github.com/redaktor/exiftool.js (it is just a fork for now)
Some Olympus cameras seem to store their makernotes only in sub-IFDs.
"I wouldn't mind juggling bytes but I have no idea where to start :("
Start here. These two tags have a special meaning:
0x3000: 'RawInfo',
0x4000: 'MainInfo',
They can be either pointers to SubIFDs or arrays, and some cameras store the "root tags" in 0x4000. Here is the Olympus makernote tag map:
{
0x2010: '_IFDpointer_Equipment', // (misleading name returns many things, e.g. serial)
0x2020: '_IFDpointer_CameraSettings',
0x2030: '_IFDpointer_RawDevelopment',
0x2031: '_IFDpointer_RawDevelopment2',
0x2040: '_IFDpointer_ImageProcessing',
0x2050: '_IFDpointer_FocusInfo',
0x0000: 'MakerNoteVersion',
0x0001: 'CameraSettings',
0x0003: 'CameraSettings',
0x0040: 'CompressedImageSize',
0x0081: 'PreviewImageData',
0x0088: 'PreviewImageStart',
0x0089: 'PreviewImageLength',
0x0100: 'ThumbnailImage',
0x0104: 'BodyFirmwareVersion',
0x0200: 'SpecialMode',
0x0201: 'Quality',
0x0202: 'Macro',
0x0203: 'BWMode',
0x0204: 'DigitalZoom',
0x0205: 'FocalPlaneDiagonal',
0x0206: 'LensDistortionParams',
0x0207: 'Olympus CameraType Values',
0x0208: 'Olympus TextInfo',
0x020b: 'EpsonImageWidth',
0x020c: 'EpsonImageHeight',
0x020d: 'EpsonSoftware',
0x0280: 'PreviewImage',
0x0300: 'PreCaptureFrames',
0x0301: 'WhiteBoard',
0x0302: 'OneTouchWB',
0x0303: 'WhiteBalanceBracket',
0x0304: 'WhiteBalanceBias',
0x0404: 'SerialNumber',
0x0405: 'Firmware',
0x0e00: 'PrintIM',
0x0f00: 'DataDump',
0x0f01: 'DataDump2',
0x0f04: 'ZoomedPreviewStart',
0x0f05: 'ZoomedPreviewLength',
0x0f06: 'ZoomedPreviewSize',
0x1000: 'ShutterSpeedValue',
0x1001: 'ISOValue',
0x1002: 'ApertureValue',
0x1003: 'BrightnessValue',
0x1004: 'FlashMode',
0x1006: 'ExposureCompensation',
0x1007: 'SensorTemperature',
0x1008: 'LensTemperature',
0x1009: 'LightCondition',
0x100a: 'FocusRange',
0x100b: 'FocusMode',
0x100c: 'ManualFocusDistance',
0x100d: 'ZoomStepCount',
0x100e: 'FocusStepCount',
0x100f: 'Sharpness',
0x1010: 'FlashChargeLevel',
0x1011: 'ColorMatrix',
0x1012: 'BlackLevel',
0x1013: 'ColorTemperatureBG?',
0x1014: 'ColorTemperatureRG?',
0x1017: 'RedBalance',
0x1018: 'BlueBalance',
0x1019: 'ColorMatrixNumber',
0x101a: 'SerialNumber',
0x101b: 'ExternalFlashAE1_0?',
0x101c: 'ExternalFlashAE2_0?',
0x101d: 'InternalFlashAE1_0?',
0x101e: 'InternalFlashAE2_0?',
0x101f: 'ExternalFlashAE1?',
0x1020: 'ExternalFlashAE2?',
0x1021: 'InternalFlashAE1?',
0x1022: 'InternalFlashAE2?',
0x1023: 'FlashExposureComp',
0x1024: 'InternalFlashTable',
0x1025: 'ExternalFlashGValue',
0x1026: 'ExternalFlashBounce',
0x1027: 'ExternalFlashZoom',
0x1028: 'ExternalFlashMode',
0x1029: 'Contrast',
0x102a: 'SharpnessFactor',
0x102b: 'ColorControl',
0x102c: 'ValidBits',
0x102d: 'CoringFilter',
0x102e: 'OlympusImageWidth',
0x102f: 'OlympusImageHeight',
0x1030: 'SceneDetect',
0x1031: 'SceneArea?',
0x1033: 'SceneDetectData?',
0x1034: 'CompressionRatio',
0x1035: 'PreviewImageValid',
0x1036: 'PreviewImageStart',
0x1037: 'PreviewImageLength',
0x1038: 'AFResult',
0x1039: 'CCDScanMode',
0x103a: 'NoiseReduction',
0x103b: 'FocusStepInfinity',
0x103c: 'FocusStepNear',
0x103d: 'LightValueCenter',
0x103e: 'LightValuePeriphery',
0x103f: 'FieldCount?'
}
_IFDpointer_Equipment: {
0x0000: 'EquipmentVersion',
// 0x0100: { ref: ' Olympus CameraType Values' }, // TODO
0x0101: 'SerialNumber',
0x0102: 'InternalSerialNumber',
0x0103: 'FocalPlaneDiagonal',
0x0104: 'BodyFirmwareVersion',
// 0x0201: { ref: ' Olympus LensType Values' }, // TODO
0x0202: 'LensSerialNumber',
0x0203: 'LensModel',
0x0204: 'LensFirmwareVersion',
0x0205: 'MaxApertureAtMinFocal',
0x0206: 'MaxApertureAtMaxFocal',
0x0207: 'MinFocalLength',
0x0208: 'MaxFocalLength',
0x020a: 'MaxAperture',
0x020b: 'LensProperties',
0x0301: 'Extender',
0x0302: 'ExtenderSerialNumber',
0x0303: 'ExtenderModel',
0x0304: 'ExtenderFirmwareVersion',
0x0403: 'ConversionLens',
0x1000: 'FlashType',
0x1002: 'FlashFirmwareVersion',
0x1003: 'FlashSerialNumber'
},
_IFDpointer_CameraSettings: {
0x0000: 'CameraSettingsVersion',
0x0100: 'PreviewImageValid',
0x0101: 'PreviewImageStart',
0x0102: 'PreviewImageLength',
0x0200: 'ExposureMode',
0x0201: 'AELock',
0x0203: 'ExposureShift',
0x0204: 'NDFilter',
0x0300: 'MacroMode',
0x0302: 'FocusProcess',
0x0303: 'AFSearch',
0x0304: 'AFAreas',
0x0305: 'AFPointSelected',
0x0306: 'AFFineTune',
0x0307: 'AFFineTuneAdj',
0x0401: 'FlashExposureComp',
0x0404: 'FlashControlMode',
0x0405: 'FlashIntensity',
0x0406: 'ManualFlashStrength',
0x0501: 'WhiteBalanceTemperature',
0x0502: 'WhiteBalanceBracket',
0x0503: 'CustomSaturation',
0x0504: 'ModifiedSaturation',
0x0505: 'ContrastSetting',
0x0506: 'SharpnessSetting',
0x0507: 'ColorSpace',
0x050a: 'NoiseReduction',
0x050b: 'DistortionCorrection',
0x050c: 'ShadingCompensation',
0x050d: 'CompressionFactor',
0x050f: 'Gradation',
0x0521: 'PictureModeSaturation',
0x0522: 'PictureModeHue?',
0x0523: 'PictureModeContrast',
0x0524: 'PictureModeSharpness',
0x0527: 'NoiseFilter',
0x052d: 'PictureModeEffect',
0x052e: 'ToneLevel',
0x0600: 'DriveMode',
0x0601: 'PanoramaMode',
0x0603: 'ImageQuality2',
0x0604: 'ImageStabilization',
0x0900: 'ManometerPressure',
0x0901: 'ManometerReading',
0x0902: 'ExtendedWBDetect',
0x0903: 'LevelGaugeRoll',
0x0904: 'LevelGaugePitch',
0x0908: 'DateTimeUTC'
},
_IFDpointer_RawDevelopment: {
0x0000: 'RawDevVersion',
0x0100: 'RawDevExposureBiasValue',
0x0101: 'RawDevWhiteBalanceValue',
0x0102: 'RawDevWBFineAdjustment',
0x0103: 'RawDevGrayPoint',
0x0104: 'RawDevSaturationEmphasis',
0x0105: 'RawDevMemoryColorEmphasis',
0x0106: 'RawDevContrastValue',
0x0107: 'RawDevSharpnessValue',
0x0108: 'RawDevColorSpace',
0x0109: 'RawDevEngine',
0x010a: 'RawDevNoiseReduction',
0x010b: 'RawDevEditStatus'
},
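If you do want to juggle bytes yourself, the makernote payload follows the standard TIFF IFD layout: a u16 entry count, then that many 12-byte entries (tag u16, type u16, count u32, value/offset u32). The sketch below parses one IFD from a byte array; it assumes the maker-specific header has already been skipped and that you know the byte order, both of which vary by Olympus model:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.ArrayList;
import java.util.List;

public class IfdReader {
    // One 12-byte IFD entry: tag (u16), type (u16), count (u32), value/offset (u32).
    public static class Entry {
        public final int tag, type;
        public final long count, valueOrOffset;

        Entry(int tag, int type, long count, long valueOrOffset) {
            this.tag = tag;
            this.type = type;
            this.count = count;
            this.valueOrOffset = valueOrOffset;
        }
    }

    // Reads the IFD starting at 'offset': a u16 entry count followed by
    // count 12-byte entries. An "illegally sized" directory is usually a
    // count field that points past the end of the data.
    public static List<Entry> readIfd(byte[] data, int offset, ByteOrder order) {
        ByteBuffer buf = ByteBuffer.wrap(data).order(order);
        buf.position(offset);
        int count = buf.getShort() & 0xFFFF; // number of 12-byte entries
        List<Entry> entries = new ArrayList<>();
        for (int i = 0; i < count; i++) {
            int tag = buf.getShort() & 0xFFFF;
            int type = buf.getShort() & 0xFFFF;
            long n = buf.getInt() & 0xFFFFFFFFL;
            long value = buf.getInt() & 0xFFFFFFFFL;
            entries.add(new Entry(tag, type, n, value));
        }
        return entries;
    }
}
```

With the raw entries in hand, you can look up a tag like 0x4000 ('MainInfo') in the map above and follow its value as a sub-IFD offset, even when the library's own parser rejects the directory.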