I'm building a library for fetching encrypted secrets from cloud storage (in Scala, using the Java clients). I'm using the following google libraries:
"com.google.apis" % "google-api-services-cloudkms" % "v1-rev26-1.23.0" exclude("com.google.guava", "guava-jdk5"),
"com.google.cloud" % "google-cloud-storage" % "1.14.0",
Everything works fine locally, but when I try to run my code in Dataproc I'm getting the following error:
Exception in thread "main" java.lang.NoSuchMethodError: com.google.api.client.googleapis.services.json.AbstractGoogleJsonClient$Builder.setBatchPath(Ljava/lang/String;)Lcom/google/api/client/googleapis/services/AbstractGoogleClient$Builder;
at com.google.api.services.cloudkms.v1.CloudKMS$Builder.setBatchPath(CloudKMS.java:4250)
at com.google.api.services.cloudkms.v1.CloudKMS$Builder.<init>(CloudKMS.java:4229)
at gcp.encryption.EncryptedSecretsUser$class.clients(EncryptedSecretsUser.scala:111)
at gcp.encryption.EncryptedSecretsUser$class.getEncryptedSecrets(EncryptedSecretsUser.scala:62)
The offending line in my code is:
val kms: CloudKMS = new CloudKMS.Builder(credential.getTransport,
credential.getJsonFactory,
credential)
.setApplicationName("Encrypted Secrets User")
.build()
I see in the documentation that some Google libraries are available on Dataproc (I'm using a Spark cluster with image version 1.2.15). But as far as I can see, the transitive dependency on google-api-client is the same version I'm using locally (1.23.0). So how come the method isn't found?
Should I set up my dependencies differently for running on Dataproc?
EDIT
Finally managed to solve this in another project. It turns out that besides shading all the Google dependencies (including the gcs-connector!), you also have to register your shaded class as the provider for the gs:// file system.
Below is the Maven configuration that works for me; something similar can be achieved with sbt (see the sketch after the services-file note below):
Parent POM:
<project xmlns="http://maven.apache.org/POM/4.0.0"...>
...
<properties>
<!-- Spark version -->
<spark.version>[2.2.1]</spark.version>
<!-- Jackson-libs version pulled in by spark -->
<jackson.version>[2.6.5]</jackson.version>
<!-- Avro version pulled in by jackson -->
<avro.version>[1.7.7]</avro.version>
<!-- Kryo-shaded version pulled in by spark -->
<kryo.version>[3.0.3]</kryo.version>
<!-- Apache commons-lang version pulled in by spark -->
<commons.lang.version>2.6</commons.lang.version>
<!-- TODO: need to shade google libs because of version-conflicts on Dataproc. Remove this when Dataproc 1.3/2.0 is released -->
<bigquery-conn.version>[0.10.6-hadoop2]</bigquery-conn.version>
<gcs-conn.version>[1.6.5-hadoop2]</gcs-conn.version>
<google-storage.version>[1.29.0]</google-storage.version>
<!-- The guava version we want to use -->
<guava.version>[23.2-jre]</guava.version>
<!-- The google api version used by the google-cloud-storage lib -->
<api-client.version>[1.23.0]</api-client.version>
<!-- The google-api-services-storage version used by the google-cloud-storage lib -->
<storage-api.version>[v1-rev114-1.23.0]</storage-api.version>
<!-- Picked up by compiler and resource plugins -->
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
...
<build>
<pluginManagement>
<plugins>
...
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.1.1</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<minimizeJar>true</minimizeJar>
<filters>
<filter>
<artifact>com.google.**:*</artifact>
<includes>
<include>**</include>
</includes>
</filter>
<filter>
<artifact>com.google.cloud.bigdataoss:gcs-connector</artifact>
<excludes>
<!-- Register a provider with the shaded name instead-->
<exclude>META-INF/services/org.apache.hadoop.fs.FileSystem</exclude>
</excludes>
</filter>
</filters>
<artifactSet>
<includes>
<include>com.google.*:*</include>
</includes>
<excludes>
<exclude>com.google.code.findbugs:jsr305</exclude>
</excludes>
</artifactSet>
<relocations>
<relocation>
<pattern>com.google</pattern>
<shadedPattern>com.shaded.google</shadedPattern>
</relocation>
</relocations>
</configuration>
</execution>
</executions>
</plugin>
...
</plugins>
</build>
<dependencyManagement>
<dependencies>
<dependency>
...
<groupId>com.google.cloud.bigdataoss</groupId>
<artifactId>gcs-connector</artifactId>
<version>${gcs-conn.version}</version>
<exclusions>
<!-- conflicts with Spark dependencies -->
<exclusion>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
</exclusion>
<!-- conflicts with Spark dependencies -->
<exclusion>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-mapreduce-client-core</artifactId>
</exclusion>
<exclusion>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<!-- Avoid conflict with the version pulled in by the GCS-connector on Dataproc -->
<groupId>com.google.apis</groupId>
<artifactId>google-api-services-storage</artifactId>
<version>${storage-api.version}</version>
</dependency>
<dependency>
<groupId>commons-lang</groupId>
<artifactId>commons-lang</artifactId>
<version>${commons.lang.version}</version>
</dependency>
<dependency>
<groupId>com.esotericsoftware</groupId>
<artifactId>kryo-shaded</artifactId>
<version>${kryo.version}</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson.version}</version>
</dependency>
<dependency>
<groupId>com.google.api-client</groupId>
<artifactId>google-api-client</artifactId>
<version>${api-client.version}</version>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>${guava.version}</version>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>com.google.cloud</groupId>
<artifactId>google-cloud-storage</artifactId>
<version>${google-storage.version}</version>
<exclusions>
<!-- conflicts with Spark dependencies -->
<exclusion>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
</dependency>
...
</dependencies>
...
</project>
Child POM:
<dependencies>
<!-- Libraries available on dataproc -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.google.cloud.bigdataoss</groupId>
<artifactId>gcs-connector</artifactId>
</dependency>
<dependency>
<groupId>com.esotericsoftware</groupId>
<artifactId>kryo-shaded</artifactId>
<scope>provided</scope><!-- Pulled in by spark -->
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<scope>provided</scope><!-- Pulled in by spark -->
</dependency>
</dependencies>
And add a file named org.apache.hadoop.fs.FileSystem under path/to/your-project/src/main/resources/META-INF/services, containing the name of your shaded class, e.g.:
# WORKAROUND FOR DEPENDENCY CONFLICTS ON DATAPROC
#
# Use the shaded class as a provider for the gs:// file system
#
com.shaded.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem
(Notice that this file was filtered out of the gcs-connector library in the parent POM)
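For sbt builds, here is a rough, untested sketch of the shading part, assuming sbt-assembly 0.14.x; it only mirrors the relocation above, and the services-file trick still applies:
// build.sbt fragment (sketch): relocate all Google classes so they cannot
// collide with the older copies already on the Dataproc classpath.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.**" -> "com.shaded.google.@1").inAll
)
// Note: sbt-assembly does not rewrite META-INF/services entries, so the renamed
// GoogleHadoopFileSystem still has to be registered by hand through the resources
// file shown above, and the gcs-connector's own copy of
// META-INF/services/org.apache.hadoop.fs.FileSystem must not win the merge.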
It may not be obvious, but the google-api-client version in the latest stable GCS connector is actually 1.20.0.
The reason is that the api client version was only rolled forward to 1.23.0 in a recent commit, as part of a series of commits (including a dependency-shading commit) whose overall goal is to stop leaking the transitive dependency onto the job classpath at all, precisely to avoid version-collision issues in the future, at the cost of everyone having to bring their own fat jar containing the full api client dependencies.
However, it turns out that many people have already grown to depend on the GCS-connector-provided api client being on the classpath, so there are production workloads out there which cannot survive such a change inside a minor version upgrade. The upgraded GCS connector, which uses 1.23.0 but also shades it so that it no longer appears on the job classpath, is therefore reserved for a future Dataproc 1.3+ or 2.0+ release.
In your case, you could try using a 1.20.0 version of your dependencies (you may also have to downgrade the google-cloud-storage dependency you included, though a 1.22.0 version of that may still work, assuming no breaking changes, since setBatchPath was indeed only introduced in 1.23.0). Otherwise, you can try to shade all your own dependencies, e.g. with sbt-assembly.
We can verify that setBatchPath was introduced only in 1.23.0:
$ javap -cp google-api-client-1.22.0.jar com.google.api.client.googleapis.services.AbstractGoogleClient.Builder | grep set
public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setRootUrl(java.lang.String);
public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setServicePath(java.lang.String);
public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setGoogleClientRequestInitializer(com.google.api.client.googleapis.services.GoogleClientRequestInitializer);
public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setHttpRequestInitializer(com.google.api.client.http.HttpRequestInitializer);
public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setApplicationName(java.lang.String);
public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setSuppressPatternChecks(boolean);
public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setSuppressRequiredParameterChecks(boolean);
public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setSuppressAllChecks(boolean);
$ javap -cp google-api-client-1.23.0.jar com.google.api.client.googleapis.services.AbstractGoogleClient.Builder | grep set
public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setRootUrl(java.lang.String);
public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setServicePath(java.lang.String);
public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setBatchPath(java.lang.String);
public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setGoogleClientRequestInitializer(com.google.api.client.googleapis.services.GoogleClientRequestInitializer);
public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setHttpRequestInitializer(com.google.api.client.http.HttpRequestInitializer);
public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setApplicationName(java.lang.String);
public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setSuppressPatternChecks(boolean);
public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setSuppressRequiredParameterChecks(boolean);
public com.google.api.client.googleapis.services.AbstractGoogleClient$Builder setSuppressAllChecks(boolean);
Related
I am creating a REST client using JAX-RS (Jersey 2). The client works, but only in my IDE (IntelliJ IDEA); when I build it with Maven using the maven-assembly-plugin and run the jar, it doesn't work anymore.
I get a MessageBodyWriter not found for media type=application/json error.
I have tried adding more dependencies that people suggested in other posts, but I don't think a missing dependency is the problem, since it runs in the IDE.
Here is the code that throws the exception:
return client.target(uri)
.request(MediaType.APPLICATION_JSON)
.post(Entity.entity(transactions.get(0), MediaType.APPLICATION_JSON));
After debugging, when I replace transactions.get(0) with an empty string "", it works.
Here is the pom.xml for Maven:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>11</source>
<target>11</target>
</configuration>
</plugin>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<configuration>
<archive>
<manifest>
<mainClass>SequencerControllers</mainClass>
</manifest>
</archive>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>com.sparkjava</groupId>
<artifactId>spark-core</artifactId>
<version>2.8.0</version>
</dependency>
<dependency>
<groupId>org.glassfish.jersey.core</groupId>
<artifactId>jersey-client</artifactId>
<version>2.25.1</version>
</dependency>
<dependency>
<groupId>org.glassfish.jersey.media</groupId>
<artifactId>jersey-media-json-jackson</artifactId>
<version>2.25.1</version>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.8.0</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<version>1.7.21</version>
</dependency>
</dependencies>
Am I missing something? It really bothers me that it runs in the IDE but not when built with Maven, since I do build it with dependencies.
Check if you have the class com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider in your resulting JAR. If not, add the following dependency:
<dependency>
<groupId>com.fasterxml.jackson.jaxrs</groupId>
<artifactId>jackson-jaxrs-json-provider</artifactId>
<version>2.10.0</version>
</dependency>
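One quick way to check is to list the contents of the assembled jar; the jar name below is just a placeholder for whatever maven-assembly-plugin produces in your build:
$ jar tf target/<your-app>-jar-with-dependencies.jar | grep JacksonJsonProvider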
There's an ordering issue: the Jackson and Jackson/Jersey jars don't play well together in an "uber" assembly jar, because their individual jars contain same-named files with different contents.
The assembly plugin will pick one version as the "winner" that gets included in the uber jar. The losing duplicate will not be included, and in your case you are likely getting the wrong copy in your uber jar.
To fix this, make sure that these two entries are moved to the top of the Maven dependencies list:
<dependency>
<groupId>org.glassfish.jersey.media</groupId>
<artifactId>jersey-media-json-jackson</artifactId>
<version>${jersey-version}</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.jaxrs</groupId>
<artifactId>jackson-jaxrs-json-provider</artifactId>
<version>${jackson-version}</version>
</dependency>
I also moved this one to the bottom of the dependencies:
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson-version}</version>
</dependency>
That combination seemed to fix the uber jar issue in my situation.
So I am kind of at a dead end here; I have been troubleshooting for half a day now. I'm using Hibernate JPA persistence in a Java application.
When running the code from within the IDE (IntelliJ 2018.1.5) it works fine; however, when trying to run it from the jar via the command line I get the following error (full stack trace):
Exception in thread "main" org.hibernate.boot.archive.spi.ArchiveException: Could not build ClassFile
at org.hibernate.boot.archive.scan.spi.ClassFileArchiveEntryHandler.toClassFile(ClassFileArchiveEntryHandler.java:64)
at org.hibernate.boot.archive.scan.spi.ClassFileArchiveEntryHandler.handleEntry(ClassFileArchiveEntryHandler.java:47)
at org.hibernate.boot.archive.internal.JarFileBasedArchiveDescriptor.visitArchive(JarFileBasedArchiveDescriptor.java:147)
at org.hibernate.boot.archive.scan.spi.AbstractScannerImpl.scan(AbstractScannerImpl.java:47)
at org.hibernate.boot.model.process.internal.ScanningCoordinator.coordinateScan(ScanningCoordinator.java:75)
at org.hibernate.boot.model.process.spi.MetadataBuildingProcess.prepare(MetadataBuildingProcess.java:98)
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.<init>(EntityManagerFactoryBuilderImpl.java:228)
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.<init>(EntityManagerFactoryBuilderImpl.java:170)
at org.hibernate.jpa.boot.spi.Bootstrap.getEntityManagerFactoryBuilder(Bootstrap.java:76)
at org.hibernate.jpa.HibernatePersistenceProvider.getEntityManagerFactoryBuilder(HibernatePersistenceProvider.java:181)
at org.hibernate.jpa.HibernatePersistenceProvider.getEntityManagerFactoryBuilderOrNull(HibernatePersistenceProvider.java:129)
at org.hibernate.jpa.HibernatePersistenceProvider.getEntityManagerFactoryBuilderOrNull(HibernatePersistenceProvider.java:71)
at org.hibernate.jpa.HibernatePersistenceProvider.createEntityManagerFactory(HibernatePersistenceProvider.java:52)
at javax.persistence.Persistence.createEntityManagerFactory(Persistence.java:55)
at javax.persistence.Persistence.createEntityManagerFactory(Persistence.java:39)
at com.gsr.metrics.repository.BaseRepository.<init>(BaseRepository.java:10)
at com.gsr.metrics.repository.ProcessHistoryRepository.<init>(ProcessHistoryRepository.java:11)
at com.gsr.metrics.FileProcessor.<init>(FileProcessor.java:24)
at com.gsr.metrics.PostProcessor.run(PostProcessor.java:42)
at com.gsr.metrics.PostProcessor.main(PostProcessor.java:28)
Caused by: java.io.IOException: invalid constant type: 19 at 5
at javassist.bytecode.ConstPool.readOne(ConstPool.java:1241)
at javassist.bytecode.ConstPool.read(ConstPool.java:1172)
at javassist.bytecode.ConstPool.<init>(ConstPool.java:185)
at javassist.bytecode.ClassFile.read(ClassFile.java:807)
at javassist.bytecode.ClassFile.<init>(ClassFile.java:148)
at org.hibernate.boot.archive.scan.spi.ClassFileArchiveEntryHandler.toClassFile(ClassFileArchiveEntryHandler.java:61)
... 19 more
The build is configured with Maven; here is the POM, including the Hibernate dependency entry:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.gsr.metrics</groupId>
<artifactId>PostProcessor</artifactId>
<version>1.0-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>8</source>
<target>8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<configuration>
<archive>
<manifest>
<mainClass>com.gsr.metrics.PostProcessor</mainClass>
</manifest>
</archive>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
<version>5.2.12.Final</version>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.27</version>
</dependency>
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<version>1.4.197</version>
</dependency>
<dependency>
<groupId>com.opencsv</groupId>
<artifactId>opencsv</artifactId>
<version>4.1</version>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>1.16.4</version>
</dependency>
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-api</artifactId>
<version>5.2.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-core</artifactId>
<version>2.11.0</version>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-api</artifactId>
<version>2.11.0</version>
</dependency>
<dependency>
<groupId>com.beust</groupId>
<artifactId>jcommander</artifactId>
<version>1.72</version>
</dependency>
</dependencies>
</project>
I've tried different Hibernate versions but in all cases have gotten the same error.
The problem occurs when this statement is executed:
em = Persistence.createEntityManagerFactory(persistenceUnitName).createEntityManager();
I've stumbled upon the same problem and found a similar solution.
I was using log4j-core 2.11.1 and got the same error when running the application from the jar. I followed your solution and changed the version to log4j-core 2.8.2.
After that I still got an error, but this time it was a missing dependency: apparently one of the "com.fasterxml.jackson.core" package classes couldn't be found. So I added that dependency (version 2.9.6) and now it's running.
Hope this helps someone, because it was a very frustrating error.
After a great deal of trial and error I isolated the issue to some kind of library conflict with log4j. Don't ask me what the exact problem was, but changing the version of one of the two libraries made the problem go away. I did not have time to perform any detailed analysis of the underlying cause.
In short: in Spark 2.0, spark-submit prefers its own version of the Guava library (14.0.1), but I want to use a recent version of the jar (19.0).
Question: how do I convince Spark to use the version provided in my pom.xml file?
My suspicion: I can use the spark.driver.userClassPathFirst=true option, but it is an experimental feature (Spark 2.0.0 docs), so maybe there is a better solution?
Detailed explanation of the problem:
I'm using Spark 2.0.0 (hadoop2.7) and Elasticsearch 2.3.4, and I'm fighting with a very simple application which tries to use Spark Streaming and Elasticsearch together. Here it is:
SparkConf sparkConf = new SparkConf().setAppName("SampleApp");
JavaStreamingContext jssc = new JavaStreamingContext(sparkConf, Durations.milliseconds(500));
jssc.checkpoint("/tmp");
JavaDStream<String> messages = jssc.textFileStream("/some_directory_path");
TransportClient client = TransportClient.builder().build()
.addTransportAddress(new InetSocketTransportAddress(InetAddress.getByName("localhost"), 9300));
messages.foreachRDD(rdd -> {
XContentBuilder builder = jsonBuilder()
.startObject()
.field("words", "some words")
.endObject();
client.prepareIndex("indexName", "typeName")
.setSource(builder.string())
.get();
});
jssc.start();
jssc.awaitTermination();
The project is built with Maven. Here is the relevant part of the pom.xml:
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.0.0</version>
<scope>provided</scope>
<exclusions>
<exclusion>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>2.0.0</version>
<scope>provided</scope>
<exclusions>
<exclusion>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch</artifactId>
<version>2.3.4</version>
<exclusions>
<exclusion>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-core</artifactId>
<version>2.6.2</version>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>19.0</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.3</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<createSourcesJar>true</createSourcesJar>
<transformers>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer" />
<transformer
implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>com.abc.App</mainClass>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
As you can see, I've added some exclusions.
Everything seems great, but after running it with the command:
spark-submit --class com.abc.App --master local[2] /somePath/superApp-0.0.1-SNAPSHOT.jar
I get this exception:
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;
at org.elasticsearch.threadpool.ThreadPool.<clinit>(ThreadPool.java:190)
at org.elasticsearch.client.transport.TransportClient$Builder.build(TransportClient.java:131)
at com.abc.App.main(App.java:44)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
After adding --conf spark.driver.userClassPathFirst=true to the spark-submit command, the application seems to work correctly. But I'm not sure this is the right way to manage the problem, because that option is marked as experimental in the documentation.
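For reference, the full command from above then becomes:
spark-submit --conf spark.driver.userClassPathFirst=true --class com.abc.App --master local[2] /somePath/superApp-0.0.1-SNAPSHOT.jar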
In other words, Spark prefers the libraries from its own runtime environment and ignores those provided in the "uber" (assembled) jar. So I want to know how to change this behavior in a proper way.
Question again: what do I have to do to be sure that a particular jar defined in my POM will be used at runtime, with the defined version?
Edit: in a more complex application which tries to use both Spark Streaming and Elasticsearch, the same problem occurs with other libraries (for example io.netty:netty), and in that situation simply activating the spark.driver.userClassPathFirst option doesn't help at all.
I have a dependency, jReddit.jar, in my local repo. When I build my project, which references jReddit in its pom, I keep getting an error at server startup because it can't find jReddit.jar in the target WEB-INF. When I check the target WEB-INF, jReddit is there, but it looks like 'jReddit.jar' while actually being a folder rather than a jar. If I copy and paste the actual jar into my server it works, but I need the build to produce it correctly. Inside my local repo jReddit is there as a jar and looks normal.
I am not sure why this is happening, because one day everything was working and the next I couldn't build or deploy correctly.
<dependency>
<groupId>com.github.jreddit</groupId>
<artifactId>jreddit</artifactId>
<version>1.0.5</version>
</dependency>
I haven't changed this since it broke, so I think it's fine.
Here is the whole POM:
<?xml version="1.0" encoding="UTF-8"?>
<!--
JBoss, Home of Professional Open Source
Copyright 2013, Red Hat, Inc. and/or its affiliates, and individual
contributors by the #authors tag. See the copyright.txt in the
distribution for a full listing of individual contributors.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.jreddit.service</groupId>
<artifactId>jRedditService</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>war</packaging>
<name>WildFly Quickstarts: jRedditService</name>
<description>A starter Java EE 7 webapp project for use on JBoss WildFly / WildFly, generated from the jboss-javaee6-webapp archetype</description>
<url>http://wildfly.org</url>
<licenses>
<license>
<name>Apache License, Version 2.0</name>
<distribution>repo</distribution>
<url>http://www.apache.org/licenses/LICENSE-2.0.html</url>
</license>
</licenses>
<properties>
<!-- Explicitly declaring the source encoding eliminates the following
message: -->
<!-- [WARNING] Using platform encoding (UTF-8 actually) to copy filtered
resources, i.e. build is platform dependent! -->
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<!-- JBoss dependency versions -->
<version.wildfly.maven.plugin>1.0.2.Final</version.wildfly.maven.plugin>
<!-- Define the version of the JBoss BOMs we want to import to specify
tested stacks. -->
<version.jboss.bom>8.2.1.Final</version.jboss.bom>
<!-- other plugin versions -->
<version.compiler.plugin>3.1</version.compiler.plugin>
<version.surefire.plugin>2.16</version.surefire.plugin>
<version.war.plugin>2.5</version.war.plugin>
<!-- maven-compiler-plugin -->
<maven.compiler.target>1.7</maven.compiler.target>
<maven.compiler.source>1.7</maven.compiler.source>
</properties>
<dependencyManagement>
<dependencies>
<!-- JBoss distributes a complete set of Java EE 7 APIs including a Bill
of Materials (BOM). A BOM specifies the versions of a "stack" (or a collection)
of artifacts. We use this here so that we always get the correct versions
of artifacts. Here we use the jboss-javaee-7.0-with-tools stack (you can
read this as the JBoss stack of the Java EE 7 APIs, with some extras tools
for your project, such as Arquillian for testing) and the jboss-javaee-7.0-with-hibernate
stack you can read this as the JBoss stack of the Java EE 7 APIs, with extras
from the Hibernate family of projects) -->
<dependency>
<groupId>org.wildfly.bom</groupId>
<artifactId>jboss-javaee-7.0-with-tools</artifactId>
<version>${version.jboss.bom}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
<dependency>
<groupId>org.wildfly.bom</groupId>
<artifactId>jboss-javaee-7.0-with-hibernate</artifactId>
<version>${version.jboss.bom}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>com.github.jreddit</groupId>
<artifactId>jreddit</artifactId>
<version>1.0.5</version>
<!-- <scope>provided</scope> -->
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.5.4</version>
</dependency>
<dependency>
<groupId>com.googlecode.json-simple</groupId>
<artifactId>json-simple</artifactId>
<version>1.1</version>
</dependency>
<!-- First declare the APIs we depend on and need for compilation. All
of them are provided by JBoss WildFly -->
<!-- Import the CDI API, we use provided scope as the API is included in
JBoss WildFly -->
<dependency>
<groupId>javax.enterprise</groupId>
<artifactId>cdi-api</artifactId>
<scope>provided</scope>
</dependency>
<!-- Import the Common Annotations API (JSR-250), we use provided scope
as the API is included in JBoss WildFly -->
<dependency>
<groupId>org.jboss.spec.javax.annotation</groupId>
<artifactId>jboss-annotations-api_1.2_spec</artifactId>
<scope>provided</scope>
</dependency>
<!-- Import the JAX-RS API, we use provided scope as the API is included
in JBoss WildFly -->
<dependency>
<groupId>org.jboss.resteasy</groupId>
<artifactId>jaxrs-api</artifactId>
<scope>provided</scope>
</dependency>
<!-- Import the JPA API, we use provided scope as the API is included in
JBoss WildFly -->
<dependency>
<groupId>org.hibernate.javax.persistence</groupId>
<artifactId>hibernate-jpa-2.1-api</artifactId>
<scope>provided</scope>
</dependency>
<!-- Import the EJB API, we use provided scope as the API is included in
JBoss WildFly -->
<dependency>
<groupId>org.jboss.spec.javax.ejb</groupId>
<artifactId>jboss-ejb-api_3.2_spec</artifactId>
<scope>provided</scope>
</dependency>
<!-- JSR-303 (Bean Validation) Implementation -->
<!-- Provides portable constraints such as #Email -->
<!-- Hibernate Validator is shipped in JBoss WildFly -->
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId>
<scope>provided</scope>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
</exclusion>
</exclusions>
</dependency>
<!-- Import the JSF API, we use provided scope as the API is included in
JBoss WildFly -->
<dependency>
<groupId>org.jboss.spec.javax.faces</groupId>
<artifactId>jboss-jsf-api_2.2_spec</artifactId>
<scope>provided</scope>
</dependency>
<!-- Now we declare any tools needed -->
<!-- Annotation processor to generate the JPA 2.0 metamodel classes for
typesafe criteria queries -->
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-jpamodelgen</artifactId>
<scope>provided</scope>
</dependency>
<!-- Annotation processor that raising compilation errors whenever constraint
annotations are incorrectly used. -->
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator-annotation-processor</artifactId>
<scope>provided</scope>
</dependency>
<!-- Needed for running tests (you may also use TestNG) -->
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
<!-- Optional, but highly recommended -->
<!-- Arquillian allows you to test enterprise code such as EJBs and Transactional(JTA)
JPA from JUnit/TestNG -->
<dependency>
<groupId>org.jboss.arquillian.junit</groupId>
<artifactId>arquillian-junit-container</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jboss.arquillian.protocol</groupId>
<artifactId>arquillian-protocol-servlet</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<!-- Maven will append the version to the finalName (which is the name
given to the generated war, and hence the context root) -->
<finalName>${project.artifactId}</finalName>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.4</version>
<configuration>
<archive>
<manifest>
<addClasspath>true</addClasspath>
<mainClass>org.jReddit.service.rest.JaxRsActivator</mainClass>
<classpathPrefix>dependency-jars/</classpathPrefix>
</manifest>
</archive>
</configuration>
</plugin>
<plugin>
<artifactId>maven-war-plugin</artifactId>
<version>${version.war.plugin}</version>
<configuration>
<failOnMissingWebXml>false</failOnMissingWebXml>
<archive>
<manifest>
<addClasspath>true</addClasspath>
</manifest>
</archive>
</configuration>
</plugin>
<!-- The WildFly plugin deploys your war to a local WildFly container -->
<!-- To use, run: mvn package wildfly:deploy -->
<plugin>
<groupId>org.wildfly.plugins</groupId>
<artifactId>wildfly-maven-plugin</artifactId>
<version>${version.wildfly.maven.plugin}</version>
</plugin>
</plugins>
</build>
<profiles>
<profile>
<!-- The default profile skips all tests, though you can tune it to run
just unit tests based on a custom pattern -->
<!-- Seperate profiles are provided for running all tests, including Arquillian
tests that execute in the specified container -->
<id>default</id>
<activation>
<activeByDefault>true</activeByDefault>
</activation>
<build>
<plugins>
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<version>${version.surefire.plugin}</version>
<configuration>
<skip>true</skip>
</configuration>
</plugin>
</plugins>
</build>
</profile>
<profile>
<!-- An optional Arquillian testing profile that executes tests
in your WildFly instance -->
<!--
This profile will start a new WildFly instance, and execute the
test, shutting it down when done -->
<id>arq-wildfly-managed</id>
<dependencies>
<dependency>
<groupId>org.wildfly</groupId>
<artifactId>wildfly-arquillian-container-managed</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
</profile>
<profile>
<!-- An optional Arquillian testing profile that executes tests
in a remote WildFly instance -->
<!-- Run with: mvn clean test -Parq-wildfly-remote -->
<id>arq-wildfly-remote</id>
<dependencies>
<dependency>
<groupId>org.wildfly</groupId>
<artifactId>wildfly-arquillian-container-remote</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
</profile>
</profiles>
This is the relevant part of jReddit's POM:
<parent>
<groupId>org.sonatype.oss</groupId>
<artifactId>oss-parent</artifactId>
<version>7</version>
</parent>
<groupId>com.github.jreddit</groupId>
<artifactId>jreddit</artifactId>
<version>1.0.5</version>
<name>jReddit</name>
<description>jReddit is a wrapper for the Reddit API written in Java. </description>
<url>https://github.com/karan/jReddit</url>
I know this question was asked a long time ago, but since I had this problem and didn't find anything on Stack Overflow about it, I'll answer it myself.
Basically, this happens when the project of the dependency you built the JAR from is still open in your IDE.
While the dependency project is open, changes to it are picked up without running mvn package or mvn install; if you close the dependency project, you'll see that the folder turns into the correct JAR file as expected, and your main project will read the jar from your .m2 folder.
I'm trying to set up the pom.xml for my web app to connect to a database. The problem occurs when I change <artifactId>tapestry-core</artifactId> to <artifactId>tapestry-hibernate</artifactId>.
Here is the output when I try to build:
The POM for org.apache.tapestry:tapestry-hibernate:jar:5.4-beta-24 is missing, no dependency information available
The POM for unknown.binary:hibernate-jpa-2.1-api-1.0.0.Final:jar:SNAPSHOT is missing, no dependency information available
------------------------------------------------------------------------
BUILD FAILURE
------------------------------------------------------------------------
Total time: 2.177s
Finished at: Mon Mar 30 20:18:00 CEST 2015
Final Memory: 6M/15M
------------------------------------------------------------------------
Failed to execute goal on project TapestryApp: Could not resolve dependencies for project com.rile:TapestryApp:war:1.0-SNAPSHOT: Failure to find org.apache.tapestry:tapestry-hibernate:jar:5.4-beta-24 in https://repository.apache.org/content/groups/staging/ was cached in the local repository, resolution will not be reattempted until the update interval of apache-staging has elapsed or updates are forced -> [Help 1]
To see the full stack trace of the errors, re-run Maven with the -e switch.
Re-run Maven using the -X switch to enable full debug logging.
For more information about the errors and possible solutions, please read the following articles:
[Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
I'm using the latest version of the NetBeans IDE.
Here is the full pom.xml file:
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://maven.apache.org/POM/4.0.0">
<modelVersion>4.0.0</modelVersion>
<groupId>com.rile</groupId>
<artifactId>TapestryApp</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>war</packaging>
<name>TapestryApp</name>
<dependencies>
<!-- To set up an application with a database, change the artifactId below to
tapestry-hibernate, and add a dependency on your JDBC driver. You'll also
need to add Hibernate configuration files, such as hibernate.cfg.xml. -->
<dependency>
<groupId>org.apache.tapestry</groupId>
<artifactId>tapestry-hibernate</artifactId>
<version>${tapestry-release-version}</version>
</dependency>
<!-- Include the Log4j implementation for the SLF4J logging framework -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>${slf4j-release-version}</version>
</dependency>
<!--
<dependency>
<groupId>org.apache.tapestry</groupId>
<artifactId>tapestry-hibernate</artifactId>
<version>${tapestry-release-version}</version>
</dependency>
-->
<dependency>
<groupId>org.hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<version>2.3.2</version>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.34</version>
</dependency>
<!-- Uncomment this to add support resource minification and runtime compilation -->
<!--
<dependency>
<groupId>org.apache.tapestry</groupId>
<artifactId>tapestry-yuicompressor</artifactId>
<version>${tapestry-release-version}</version>
</dependency>
-->
<!-- Uncomment this to add support for file uploads: -->
<!--
<dependency>
<groupId>org.apache.tapestry</groupId>
<artifactId>tapestry-upload</artifactId>
<version>${tapestry-release-version}</version>
</dependency>
-->
<!-- A dependency on either JUnit or TestNG is required, or the surefire plugin (which runs the tests)
will fail, preventing Maven from packaging the WAR. Tapestry includes a large number
of testing facilities designed for use with TestNG (http://testng.org/), so it's recommended. -->
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>${testng-release-version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.easymock</groupId>
<artifactId>easymock</artifactId>
<version>${easymock-release-version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.tapestry</groupId>
<artifactId>tapestry-test</artifactId>
<version>${tapestry-release-version}</version>
<scope>test</scope>
</dependency>
<!-- Provided by the servlet container, but sometimes referenced in the application
code. -->
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>${servlet-api-release-version}</version>
<scope>provided</scope>
</dependency>
<!-- Provide dependency to the Tapestry javadoc taglet which replaces the Maven component report -->
<dependency>
<groupId>org.apache.tapestry</groupId>
<artifactId>tapestry-javadoc</artifactId>
<version>${tapestry-release-version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-entitymanager</artifactId>
<version>4.3.1.Final</version>
</dependency>
<dependency>
<groupId>unknown.binary</groupId>
<artifactId>hibernate-jpa-2.1-api-1.0.0.Final</artifactId>
<version>SNAPSHOT</version>
</dependency>
</dependencies>
<build>
<finalName>TapestryApp</finalName>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.3.2</version>
<configuration>
<source>1.6</source>
<target>1.6</target>
<optimize>true</optimize>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.7.2</version>
<configuration>
<systemPropertyVariables>
<tapestry.execution-mode>Qa</tapestry.execution-mode>
</systemPropertyVariables>
</configuration>
</plugin>
<!-- Run the application using "mvn jetty:run" -->
<plugin>
<groupId>org.mortbay.jetty</groupId>
<artifactId>maven-jetty-plugin</artifactId>
<version>6.1.16</version>
<configuration>
<!-- Log to the console. -->
<requestLog implementation="org.mortbay.jetty.NCSARequestLog">
<!-- This doesn't do anything for Jetty, but is a workaround for a Maven bug
that prevents the requestLog from being set. -->
<append>true</append>
</requestLog>
<systemProperties>
<systemProperty>
<name>tapestry.execution-mode</name>
<value>development</value>
</systemProperty>
</systemProperties>
</configuration>
</plugin>
</plugins>
</build>
<reporting/>
<repositories>
<!-- This repository is only needed when the Tapestry version is a preview release, rather
than a final release. -->
<repository>
<id>apache-staging</id>
<url>https://repository.apache.org/content/groups/staging/</url>
</repository>
<repository>
<id>unknown-jars-temp-repo</id>
<name>A temporary repository created by NetBeans for libraries and jars it could not identify. Please replace the dependencies in this repository with correct ones and delete this repository.</name>
<url>file:${project.basedir}/lib</url>
</repository>
</repositories>
<properties>
<tapestry-release-version>5.4-beta-24</tapestry-release-version>
<servlet-api-release-version>2.5</servlet-api-release-version>
<testng-release-version>6.5.2</testng-release-version>
<easymock-release-version>3.0</easymock-release-version>
<slf4j-release-version>1.7.7</slf4j-release-version>
</properties>
</project>
This version of tapestry-hibernate is not available in the Maven Central repository, nor in the Apache staging repository (which you refer to in your POM file).
If you want to go with a beta version, then I suggest you take the latest one available.
EDIT:
At the end of your POM file you have this property:
<tapestry-release-version>5.4-beta-24</tapestry-release-version>
change it to:
<tapestry-release-version>5.4-beta-28</tapestry-release-version>
EDIT2:
BTW, you define the following dependency twice:
<dependency>
<groupId>org.apache.tapestry</groupId>
<artifactId>tapestry-hibernate</artifactId>
<version>${tapestry-release-version}</version>
</dependency>
which does no harm at the moment, but you should clean it up.