I'm trying to get the GeoTools quickstart tutorial to work. I've downloaded a map from http://www.naturalearthdata.com.
The file is:
http://www.naturalearthdata.com/http//www.naturalearthdata.com/download/10m/cultural/ne_10m_admin_0_countries.zip
This is not exactly the file linked in the tutorial, because that link appears to be dead (it returns a 404 Not Found). However, since the one I've chosen comes from the same site, I hope it's correct.
The whole tutorial is a static main method with the following code:
File file = JFileDataStoreChooser.showOpenFile("shp", new File("."), null);
FileDataStore store = FileDataStoreFinder.getDataStore(file);
SimpleFeatureSource featureSource = store.getFeatureSource();
// Create a map content and add our shapefile to it
MapContent map = new MapContent();
map.setTitle("Quickstart");
Style style = SLD.createSimpleStyle(featureSource.getSchema());
Layer layer = new FeatureLayer(featureSource, style);
map.addLayer(layer);
// Now display the map
JMapFrame.showMap(map);
But when I run it and select the shp file (extracted from the download), after a few seconds I get the following exception:
SEVERE: Invalid empty measure '', was expecting a number, eventually followed by px, m or ft
In the debugger I can see that it is thrown by this line:
JMapFrame.showMap(map);
How can I fix this?
I had the same error. I don't know why, but it works for me using version 10-SNAPSHOT from the snapshot repository:
<dependency>
<groupId>org.geotools</groupId>
<artifactId>gt-shapefile</artifactId>
<version>10-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.geotools</groupId>
<artifactId>gt-swing</artifactId>
<version>10-SNAPSHOT</version>
</dependency>
<repository>
<snapshots>
<enabled>true</enabled>
</snapshots>
<id>opengeo</id>
<name>OpenGeo Maven Repository</name>
<url>http://repo.opengeo.org</url>
</repository>
I'm running Spark 2.4.3 in standalone mode on Ubuntu and using Maven to create the JAR file. Below is the code I'm trying to run, which is intended to stream data from Twitter.
Once Spark is started, the Spark master is at 127.0.1.1:7077.
The Java version being used is 1.8.
package SparkTwitter.SparkJavaTwitter;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.function.VoidFunction;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaPairDStream;
import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.twitter.TwitterUtils;
import scala.Tuple2;
import twitter4j.Status;
import twitter4j.auth.Authorization;
import twitter4j.auth.OAuthAuthorization;
import twitter4j.conf.Configuration;
import twitter4j.conf.ConfigurationBuilder;
import com.google.common.collect.Iterables;
public class TwitterStream {
public static void main(String[] args) {
// Prepare the spark configuration by setting application name and master node "local" i.e. embedded mode
final SparkConf sparkConf = new SparkConf().setAppName("Twitter Data Processing").setMaster("local[2]");
// Create Streaming context using spark configuration and duration for which messages will be batched and fed to Spark Core
final JavaStreamingContext streamingContext = new JavaStreamingContext(sparkConf, Duration.apply(10000));
// Prepare configuration for Twitter authentication and authorization
final Configuration conf = new ConfigurationBuilder().setDebugEnabled(false)
.setOAuthConsumerKey("customer key")
.setOAuthConsumerSecret("customer key secret")
.setOAuthAccessToken("Access token")
.setOAuthAccessTokenSecret("Access token secret")
.build();
// Create Twitter authorization object by passing prepared configuration containing consumer and access keys and tokens
final Authorization twitterAuth = new OAuthAuthorization(conf);
// Create a data stream using streaming context and Twitter authorization
final JavaReceiverInputDStream<Status> inputDStream = TwitterUtils.createStream(streamingContext, twitterAuth, new String[]{});
// Create a new stream by filtering the non english tweets from earlier streams
final JavaDStream<Status> enTweetsDStream = inputDStream.filter((status) -> "en".equalsIgnoreCase(status.getLang()));
// Convert stream to pair stream with key as user screen name and value as tweet text
final JavaPairDStream<String, String> userTweetsStream =
enTweetsDStream.mapToPair(
(status) -> new Tuple2<String, String>(status.getUser().getScreenName(), status.getText())
);
// Group the tweets for each user
final JavaPairDStream<String, Iterable<String>> tweetsReducedByUser = userTweetsStream.groupByKey();
// Create a new pair stream by replacing iterable of tweets in older pair stream to number of tweets
final JavaPairDStream<String, Integer> tweetsMappedByUser = tweetsReducedByUser.mapToPair(
userTweets -> new Tuple2<String, Integer>(userTweets._1, Iterables.size(userTweets._2))
);
// Iterate over the stream's RDDs and print each element on console
tweetsMappedByUser.foreachRDD((VoidFunction<JavaPairRDD<String, Integer>>)pairRDD -> {
pairRDD.foreach(new VoidFunction<Tuple2<String,Integer>>() {
@Override
public void call(Tuple2<String, Integer> t) throws Exception {
System.out.println(t._1() + "," + t._2());
}
});
});
// Triggers the start of processing. Nothing happens if streaming context is not started
streamingContext.start();
// Keeps the processing live by halting here unless terminated manually
//streamingContext.awaitTermination();
}
}
pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>SparkTwitter</groupId>
<artifactId>SparkJavaTwitter</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<name>SparkJavaTwitter</name>
<url>http://maven.apache.org</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.12</artifactId>
<version>2.4.3</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.12</artifactId>
<version>2.4.3</version>
<scope>provided</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-twitter -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-twitter_2.11</artifactId>
<version>1.6.3</version>
</dependency>
</dependencies>
</project>
To execute the code, I'm using the following command:
./bin/spark-submit --class SparkTwitter.SparkJavaTwitter.TwitterStream /home/hadoop/eclipse-workspace/SparkJavaTwitter/target/SparkJavaTwitter-0.0.1-SNAPSHOT.jar
Below is the output I'm getting.
19/11/10 22:17:58 WARN Utils: Your hostname, hadoop-VirtualBox resolves to a loopback address: 127.0.1.1; using 10.0.2.15 instead (on interface enp0s3)
19/11/10 22:17:58 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/11/10 22:17:58 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Warning: Failed to load SparkTwitter.SparkJavaTwitter.TwitterStream: twitter4j/auth/Authorization
log4j:WARN No appenders could be found for logger (org.apache.spark.util.ShutdownHookManager).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
I've been running a word-count program the same way and it works fine, and the JAR builds successfully as well. Do I have to specify any additional parameters when running the JAR?
I've faced a similar problem and found that you need to pass the jars directly to spark-submit. What I do is point at the directory where the jars used to build the project are stored, using the --jars "<path-to-jars>/*" option of spark-submit.
Perhaps this is not the best option, but it works.
Also, when updating versions, be aware that the jars in that folder must be updated as well.
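Applied to the command from the question, this would look something like the following (the /home/hadoop/spark-extra-jars path is hypothetical; point it at wherever your twitter4j and spark-streaming-twitter jars are stored):
./bin/spark-submit \
  --class SparkTwitter.SparkJavaTwitter.TwitterStream \
  --jars "/home/hadoop/spark-extra-jars/*" \
  /home/hadoop/eclipse-workspace/SparkJavaTwitter/target/SparkJavaTwitter-0.0.1-SNAPSHOT.jar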
During initialization of Apache Storm's JdbcInsertBolt I get an error:
java.lang.ClassCastException:
Cannot cast org.apache.phoenix.jdbc.PhoenixDriver to javax.sql.DataSource
at com.zaxxer.hikari.util.UtilityElf.createInstance(UtilityElf.java:90)
at com.zaxxer.hikari.pool.PoolBase.initializeDataSource(PoolBase.java:292)
at com.zaxxer.hikari.pool.PoolBase.<init>(PoolBase.java:84)
at com.zaxxer.hikari.pool.HikariPool.<init>(HikariPool.java:102)
at com.zaxxer.hikari.HikariDataSource.<init>(HikariDataSource.java:71)
at org.apache.storm.jdbc.common.HikariCPConnectionProvider.prepare(HikariCPConnectionProvider.java:53)
at org.apache.storm.jdbc.mapper.SimpleJdbcMapper.<init>(SimpleJdbcMapper.java:43)
from the underlying HikariCPConnectionProvider. What's wrong?
I am following http://storm.apache.org/releases/1.1.2/storm-jdbc.html; here is what I am doing based on that.
I would like to write data from an Apache Storm topology to an HBase table via Phoenix. For that, I downloaded the driver file (phoenix-4.7.0.2.6.5.3003-25-client.jar) from my cluster server and added it to my local Maven repository:
mvn install:install-file
-Dfile=lib\phoenix-4.7.0.2.6.5.3003-25-client.jar
-DgroupId=org.apache.phoenix
-DartifactId=phoenix-jdbc -Dversion=4.7.0 -Dpackaging=jar
After that I updated my pom.xml:
<dependency>
<groupId>org.apache.phoenix</groupId>
<artifactId>phoenix-jdbc</artifactId>
<version>4.7.0</version>
</dependency>
Then I added Storm's JDBC bolt:
<dependency>
<groupId>org.apache.storm</groupId>
<artifactId>storm-jdbc</artifactId>
<version>1.2.2</version>
<scope>provided</scope>
</dependency>
and I am set up to use the bolt. First, set up the connection provider:
Map hikariConfigMap = new HashMap();
hikariConfigMap.put("dataSourceClassName", "org.apache.phoenix.jdbc.PhoenixDriver");
hikariConfigMap.put("dataSource.url", "<zookeeperQuorumURI>:2181:/hbase-unsecure");
this.connectionProvider = new HikariCPConnectionProvider(hikariConfigMap);
Now initialize the mapper from tuple values to DB columns:
this.simpleJdbcMapper = new SimpleJdbcMapper(this.tablename, connectionProvider);
During this step the error mentioned above occurs.
Just for completeness: The JdbcInsertBolt gets created like this:
new JdbcInsertBolt(this.connectionProvider, this.simpleJdbcMapper)
.withTableName(this.tablename)
.withQueryTimeoutSecs(30);
Have you tried setting driverClassName -> org.apache.phoenix.jdbc.PhoenixDriver? The current code sets dataSourceClassName, which I guess is different.
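A rough sketch of that change against the question's setup (untested; it assumes HikariCP's driver-based configuration via driverClassName and jdbcUrl, with the JDBC URL built from the question's ZooKeeper placeholder):
Map hikariConfigMap = new HashMap();
// Tell Hikari which JDBC driver class to load, instead of treating the driver as a DataSource
hikariConfigMap.put("driverClassName", "org.apache.phoenix.jdbc.PhoenixDriver");
// With a plain driver, Hikari expects jdbcUrl rather than dataSource.url
hikariConfigMap.put("jdbcUrl", "jdbc:phoenix:<zookeeperQuorum>:2181:/hbase-unsecure");
this.connectionProvider = new HikariCPConnectionProvider(hikariConfigMap);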
Referring to this
I am trying to get the key in this way:
private static String getPolicyKey(String secretName, String keyVaultUrl, String applicationId, String applicationSecret) {
KeyVaultClient keyVaultClient = new KeyVaultClient(new ApplicationTokenCredentials(
applicationId, // Application ID
"myDomain.com", // Azure Active Directory Domain
applicationSecret, // Application Key Value
AzureEnvironment.AZURE
));
return keyVaultClient.getSecret(
keyVaultUrl, // KeyValut URL
secretName // Secret Name
).value();
}
But I am getting an Exception:
java.lang.NoSuchMethodError: com.microsoft.azure.credentials.ApplicationTokenCredentials.proxy()Ljava/net/Proxy;
at com.microsoft.azure.credentials.ApplicationTokenCredentials.acquireAccessToken(ApplicationTokenCredentials.java:137)
at com.microsoft.azure.credentials.ApplicationTokenCredentials.getToken(ApplicationTokenCredentials.java:127)
at com.microsoft.azure.credentials.AzureTokenCredentials.getToken(AzureTokenCredentials.java:39)
at com.microsoft.azure.credentials.AzureTokenCredentialsInterceptor.intercept(AzureTokenCredentialsInterceptor.java:36)
at okhttp3.RealCall$ApplicationInterceptorChain.proceed(RealCall.java:190)
at okhttp3.RealCall.getResponseWithInterceptorChain(RealCall.java:163)
at okhttp3.RealCall.execute(RealCall.java:57)
at retrofit2.OkHttpCall.execute(OkHttpCall.java:174)
at retrofit2.adapter.rxjava.RxJavaCallAdapterFactory$RequestArbiter.request(RxJavaCallAdapterFactory.java:171)
at rx.Subscriber.setProducer(Subscriber.java:211)
at rx.internal.operators.OnSubscribeMap$MapSubscriber.setProducer(OnSubscribeMap.java:102)
at retrofit2.adapter.rxjava.RxJavaCallAdapterFactory$CallOnSubscribe.call(RxJavaCallAdapterFactory.java:152)
at retrofit2.adapter.rxjava.RxJavaCallAdapterFactory$CallOnSubscribe.call(RxJavaCallAdapterFactory.java:138)
at rx.Observable.unsafeSubscribe(Observable.java:10142)
I am using the following dependencies:
<dependency>
<groupId>com.microsoft.azure</groupId>
<artifactId>azure-eventhubs</artifactId>
<version>0.15.1</version>
</dependency>
<dependency>
<groupId>com.microsoft.azure</groupId>
<artifactId>azure-keyvault</artifactId>
<version>1.0.0</version>
</dependency>
<dependency>
<groupId>com.microsoft.azure</groupId>
<artifactId>azure-client-authentication</artifactId>
<version>1.1.0</version>
</dependency>
Note: I tried azure-eventhubs version 1.0.1 as well, but that gave the same error.
This is my first time dealing with azure-eventhubs, so any direction on this would be extremely helpful.
In Maven, when a transitive dependency is shared between multiple direct dependencies at the same distance (or depth), the version that ends up in your dependency tree is the one coming from the first direct dependency in order of declaration.
Since com.microsoft.azure:azure-keyvault:1.0.0 appears first in your POM, the project ends up with version 1.0.0 of com.microsoft.azure:azure-client-runtime, which is where the class containing the proxy() method resides. The problem is that this method was not introduced until version 1.1.0.
One thing you can do here is swap the order in which the Azure Key Vault and Azure Client Authentication dependencies appear. Another is to declare directly in your POM which version of azure-client-runtime you want to use:
<dependency>
<groupId>com.microsoft.azure</groupId>
<artifactId>azure-client-runtime</artifactId>
<version>1.1.0</version>
</dependency>
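Either way, you can check which version Maven actually resolves with the dependency plugin, for example:
mvn dependency:tree -Dincludes=com.microsoft.azure:azure-client-runtime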
Using Java 8, I'd like to programmatically load a JavaScript file and execute it using Avatar JS (for Node environment support). I also want to use Maven to manage the dependencies.
Here's the simple Nashorn snippet I'm using; I'd like to extend this to support Node.js modules, ideally using Avatar JS.
ScriptEngine engine = new ScriptEngineManager().getEngineByName("nashorn");
InputStream in = getClass().getClassLoader().getResourceAsStream("js/hello-world.js");
String result = (String)engine.eval(new InputStreamReader(in));
System.out.print(result);
The relevant Maven config also looks like this:
<repositories>
<repository>
<id>nexus-snapshots</id>
<name>Nexus Snapshots</name>
<url>https://maven.java.net/content/repositories/snapshots/</url>
</repository>
</repositories>
<dependencies>
<dependency>
<groupId>com.oracle</groupId>
<artifactId>avatar-js</artifactId>
<version>0.10.32-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>com.oracle</groupId>
<artifactId>libavatar-js-linux-x64</artifactId>
<version>0.10.32-SNAPSHOT</version>
<type>pom</type>
</dependency>
</dependencies>
I get the impression there's a lot of good functionality in Avatar, but I'm struggling to find any decent docs or examples. Can anyone provide a code example of how to do this?
I figured this out; the relevant code I have running looks like this:
import com.oracle.avatar.js.Server;
import com.oracle.avatar.js.Loader;
import com.oracle.avatar.js.log.Logging;
and
String runJs() throws Throwable {
// Capture the script's output in a StringWriter rather than stdout
StringWriter scriptWriter = new StringWriter();
ScriptEngine engine = new ScriptEngineManager().getEngineByName("nashorn");
ScriptContext scriptContext = engine.getContext();
scriptContext.setWriter(scriptWriter);
// Wrap the Nashorn engine in an Avatar server and run the script relative to the working directory
Server server = new Server(engine, new Loader.Core(), new Logging(false), System.getProperty("user.dir"));
server.run("js/hello-world.js");
return scriptWriter.toString();
}
and, for now, a simple hello-world.js:
var util = require('util')
var result = util.format('hello %s', 'Phil');
print(result);
I also pass java.library.home as a JVM argument when running the application; the Avatar native library resides in that directory.
I am trying to use an open-source tool built on Batik and I am running into trouble with one of its dependencies when I try to build it. I'm pretty sure this has something to do with classpaths and library locations, but I can't figure out what is happening.
The project I am working with (SVG2EMF) uses the FreeHep EMF Driver, which in turn uses the FreeHep GraphicsIO project. Because these three have not been playing nicely on my system (Ubuntu 14.04), I've downloaded the source for all three to try to step through the problem.
Everything builds correctly and I can step through the code successfully, but the unit tests on SVG2EMF fail at the point where the EMF Driver makes a call to something from GraphicsIO. The relevant part of the code in question is here:
import org.freehep.graphicsio.ImageGraphics2D;
import org.freehep.graphicsio.ImageConstants;
// ...snip...
public class AlphaBlend extends EMFTag implements EMFConstants
{
// ...snip...
public void write(int tagID, EMFOutputStream emf) throws IOException
{
emf.writeRECTL(bounds);
emf.writeLONG(x);
emf.writeLONG(y);
emf.writeLONG(width);
emf.writeLONG(height);
dwROP.write(emf);
emf.writeLONG(xSrc);
emf.writeLONG(ySrc);
emf.writeXFORM(transform);
emf.writeCOLORREF(bkg);
emf.writeDWORD(usage);
emf.writeDWORD(size); // bmi follows this record immediately
emf.writeDWORD(BitmapInfoHeader.size);
emf.writeDWORD(size + BitmapInfoHeader.size); // bitmap follows bmi
emf.pushBuffer();
int encode;
// plain
encode = BI_RGB;
ImageGraphics2D.writeImage(
(RenderedImage) image,
ImageConstants.RAW.toLowerCase(),
ImageGraphics2D.getRAWProperties(bkg, "*BGRA"),
new NoCloseOutputStream(emf));
// emf.writeImage(image, bkg, "*BGRA", 1);
// png
// encode = BI_PNG;
// ImageGraphics2D.writeImage(image, "png", new Properties(), new
// NoCloseOutputStream(emf));
// jpg
// encode = BI_JPEG;
// ImageGraphics2D.writeImage(image, "jpg", new Properties(), new
// NoCloseOutputStream(emf));
int length = emf.popBuffer();
emf.writeDWORD(length);
emf.writeLONG(image.getWidth());
emf.writeLONG(image.getHeight());
BitmapInfoHeader header = new BitmapInfoHeader(image.getWidth(), image
.getHeight(), 32, encode, length, 0, 0, 0, 0);
bmi = new BitmapInfo(header);
bmi.write(emf);
emf.append();
}
This throws a NoClassDefFoundError relating specifically to org.freehep.graphicsio.ImageGraphics2D on that writeImage call. When I step through in the debugger, a watch on ImageConstants.RAW shows the value Unknown type "org.freehep.graphicsio.ImageConstants", even though the application built quite happily with those references. Any references to ImageGraphics2D behave in exactly the same way.
The dependency in the SVG2EMF pom.xml looks like this:
<dependencies>
<!-- some other dependencies -->
<dependency>
<groupId>org.freehep</groupId>
<artifactId>freehep-graphicsio-emf</artifactId>
<version>2.1.1</version>
</dependency>
</dependencies>
The dependency from the FreeHep EMF Driver looks like this:
<dependencies>
<!-- necessary because transitive deps seem to go above inherited deps -->
<dependency>
<groupId>org.freehep</groupId>
<artifactId>freehep-util</artifactId>
<version>2.0.2</version>
</dependency>
<dependency>
<groupId>org.freehep</groupId>
<artifactId>freehep-graphicsio</artifactId>
<version>2.1.1</version>
</dependency>
<!-- Other dependencies -->
</dependencies>
Can anybody shed any light on what is actually going on here, or what I need to do to make this work?
EDIT: I think I have found where the problem is coming from. Way down the stack trace I see a "Caused by: ExceptionInInitializerError", which appears to mark the class as inaccessible from then on. So the dependency does exist, but an exception is being thrown by the class initializer, which causes the JVM to mark the class as unusable.
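To illustrate that mechanism with a self-contained sketch (hypothetical classes, not FreeHep code): once a class's static initializer throws, the JVM marks the class as erroneous, and every later reference to it fails with NoClassDefFoundError, which matches the symptom above.
class Flaky {
    // Fails during class initialization when the system property is unset:
    // Integer.parseInt(null) throws, and the JVM wraps it in ExceptionInInitializerError.
    static final int VALUE = Integer.parseInt(System.getProperty("flaky.value"));
}
public class InitializerDemo {
    public static void main(String[] args) {
        try {
            System.out.println(Flaky.VALUE); // first use: ExceptionInInitializerError
        } catch (Throwable t) {
            t.printStackTrace();
        }
        System.out.println(Flaky.VALUE); // later use: NoClassDefFoundError: Could not initialize class Flaky
    }
}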
Further edit: To solve these problems it can be useful to know (although it is not mentioned anywhere on the freehep.org website) that the project is now hosted on GitHub, so you can find newer versions there. In my case, going straight to the latest version solved the problem.
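For example, in the SVG2EMF pom that would mean bumping the FreeHep artifact to a newer release (the version below is a placeholder; check the GitHub releases or Maven Central for the current one):
<dependency>
    <groupId>org.freehep</groupId>
    <artifactId>freehep-graphicsio-emf</artifactId>
    <!-- placeholder version: substitute the latest published release -->
    <version>2.4</version>
</dependency>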