I get an InvocationTargetException when running a project with the Spark 1.3 library via Maven in IntelliJ.
I hit this error only in the IntelliJ IDE; after I deployed the jar and ran it via spark-submit, the error went away.
Has anyone met the same problem before? I'd like to fix this to make debugging easier; otherwise I have to package the jar every time I want to run the code.
Details are below:
2015-04-21 09:39:13 ERROR MetricsSystem:75 - Sink class org.apache.spark.metrics.sink.MetricsServlet cannot be instantialized
2015-04-21 09:39:13 ERROR TrainingSFERunner:144 - java.lang.reflect.InvocationTargetException
2015-04-20 16:08:44 INFO BlockManagerMaster:59 - Registered BlockManager
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:187)
at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:181)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
at org.apache.spark.metrics.MetricsSystem.registerSinks(MetricsSystem.scala:181)
at org.apache.spark.metrics.MetricsSystem.start(MetricsSystem.scala:98)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:390)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
at spark.mllibClassifier.JavaRandomForests.run(JavaRandomForests.java:105)
at spark.mllibClassifier.SparkMLlibMain.runMain(SparkMLlibMain.java:263)
at spark.mllibClassifier.JavaRandomForests.main(JavaRandomForests.java:221)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Caused by: java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.module.SimpleSerializers.<init>(Ljava/util/List;)V
at com.codahale.metrics.json.MetricsModule.setupModule(MetricsModule.java:223)
at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:469)
at org.apache.spark.metrics.sink.MetricsServlet.<init>(MetricsServlet.scala:45)
My pom file is as follows.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>projects</groupId>
<artifactId>project1</artifactId>
<version>1.0-SNAPSHOT</version>
<dependencies>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-lang3</artifactId>
<version>3.0</version>
</dependency>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-core</artifactId>
<version>5.0.0</version>
</dependency>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-analyzers-common</artifactId>
<version>5.0.0</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
<version>2.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.3.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.10</artifactId>
<version>1.3.0</version>
</dependency>
<dependency>
<groupId>colt</groupId>
<artifactId>colt</artifactId>
<version>1.2.0</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.2</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
</plugin>
</plugins>
</build>
</project>
Strangely, I found that the error no longer appeared once I moved the Spark-related dependencies to the front.
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.3.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.10</artifactId>
<version>1.3.0</version>
</dependency>
<!-- ...the rest of the dependencies... -->
</dependencies>
So the order of the dependencies matters! Does anyone know why?
I think the issue here is the Jackson dependency. I had a similar issue, and the problem was multiple jackson-core and jackson-databind versions. I think it is a Scala-related issue. Anyway, add these two Jackson dependencies to the pom with lower versions and it should work. You may not find the right version on the first try; this one works for me.
<properties>
    <jackson-core.version>2.4.4</jackson-core.version>
    <spark.version>2.3.0</spark.version>
</properties>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>${spark.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-mllib -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.11</artifactId>
<version>${spark.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-core -->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>${jackson-core.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-databind -->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson-core.version}</version>
</dependency>
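For background on why the order mattered above: when two versions of the same artifact sit at the same depth of the dependency tree, Maven keeps the one declared first. A less order-sensitive sketch (using the same property as above) is to pin Jackson once in dependencyManagement, so every transitive request resolves to the same version:
<dependencyManagement>
    <dependencies>
        <!-- force a single Jackson version for direct and transitive dependencies alike -->
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-core</artifactId>
            <version>${jackson-core.version}</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>${jackson-core.version}</version>
        </dependency>
    </dependencies>
</dependencyManagement>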
Jackson dependency issue: I had the spark-3.0.3-hadoop2.7 distribution and had two versions of the jackson-annotations and jackson-databind jars. I removed those, and this error was solved.
Related
I know there are a few similar questions, but they are mostly outdated and did not help resolve my problem.
I'm trying to run my first Selenium test using Selenium version 4.2.0, and it's my first time using Maven. It always throws the error below; I have tried everything I could to overcome this, but I still don't know how to solve it or where the issue is.
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.collect.ImmutableList.toImmutableList()Ljava/util/stream/Collector;
at org.openqa.selenium.remote.CapabilitiesUtils.makeW3CSafe(CapabilitiesUtils.java:100)
at org.openqa.selenium.remote.CapabilitiesUtils.makeW3CSafe(CapabilitiesUtils.java:72)
at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:267)
at java.util.Collections$2.tryAdvance(Collections.java:4719)
at java.util.Collections$2.forEachRemaining(Collections.java:4727)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
at org.openqa.selenium.remote.DriverCommand.NEW_SESSION(DriverCommand.java:65)
at org.openqa.selenium.remote.RemoteWebDriver.startSession(RemoteWebDriver.java:247)
at org.openqa.selenium.remote.RemoteWebDriver.<init>(RemoteWebDriver.java:163)
at org.openqa.selenium.firefox.FirefoxDriver.<init>(FirefoxDriver.java:126)
at org.openqa.selenium.firefox.FirefoxDriver.<init>(FirefoxDriver.java:114)
at org.openqa.selenium.firefox.FirefoxDriver.<init>(FirefoxDriver.java:94)
at com.google.ChromeDriverDemo.main(ChromeDriverDemo.java:15)
Here is my POM file:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.google</groupId>
<artifactId>Selenium.test</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<name>Selenium.test</name>
<url>http://maven.apache.org</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.22.2</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.5</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<version>1.7.21</version>
</dependency>
<dependency>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<type>maven-plugin</type>
</dependency>
<dependency>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.12.4</version>
<type>maven-plugin</type>
</dependency>
<dependency>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.4</version>
<type>maven-plugin</type>
</dependency>
<dependency>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-resources-plugin</artifactId>
<version>2.6</version>
<type>maven-plugin</type>
</dependency>
<dependency>
<groupId>org.seleniumhq.selenium</groupId>
<artifactId>selenium-java</artifactId>
<version>4.2.0</version>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>31.1-jre</version>
</dependency>
</dependencies>
</project>
It's a typical jar conflict problem: selenium-java includes a guava dependency.
You can use the Maven Helper plugin or the mvn dependency:tree command to help you analyze it.
Try excluding the guava dependency, as in the sketch below.
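A minimal sketch of that exclusion (assuming selenium-java is where the conflicting Guava comes from; confirm with mvn dependency:tree -Dincludes=com.google.guava first):
<dependency>
    <groupId>org.seleniumhq.selenium</groupId>
    <artifactId>selenium-java</artifactId>
    <version>4.2.0</version>
    <exclusions>
        <!-- drop Selenium's transitive Guava so the explicitly declared 31.1-jre wins -->
        <exclusion>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
        </exclusion>
    </exclusions>
</dependency>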
I am trying to run my Java Maven project as a JAR file. It runs in IntelliJ IDEA 2017; however, when run from the JAR file it fails. The following error is shown:
Exception in thread "main" java.util.ServiceConfigurationError: Cannot instantiate SPI class: org.apache.lucene.codecs.lucene62.Lucene62Codec
at org.apache.lucene.util.NamedSPILoader.reload(NamedSPILoader.java:82)
at org.apache.lucene.util.NamedSPILoader.<init>(NamedSPILoader.java:51)
at org.apache.lucene.util.NamedSPILoader.<init>(NamedSPILoader.java:38)
at org.apache.lucene.codecs.Codec$Holder.<clinit>(Codec.java:47)
at org.apache.lucene.codecs.Codec.getDefault(Codec.java:143)
at org.apache.lucene.index.LiveIndexWriterConfig.<init>(LiveIndexWriterConfig.java:121)
at org.apache.lucene.index.IndexWriterConfig.<init>(IndexWriterConfig.java:151)
at com.hrforecast.skillextraction.LuceneIndexManager.buildLuceneIndex(LuceneIndexManager.java:54)
at com.hrforecast.skillextraction.SkillExtractionModule.getJobsSearcher(SkillExtractionModule.java:129)
at com.hrforecast.skillextraction.SkillExtractionModule.run(SkillExtractionModule.java:60)
at com.hrforecast.skillextraction.Main.main(Main.java:28)
Caused by: java.lang.IllegalArgumentException: An SPI class of type org.apache.lucene.codecs.PostingsFormat with name 'Lucene50' does not exist. You need to add the corresponding JAR file supporting this SPI to your classpath.
The current classpath supports the following names: [IDVersion]
at org.apache.lucene.util.NamedSPILoader.lookup(NamedSPILoader.java:116)
at org.apache.lucene.codecs.PostingsFormat.forName(PostingsFormat.java:112)
at org.apache.lucene.codecs.lucene62.Lucene62Codec.<init>(Lucene62Codec.java:167)
at org.apache.lucene.codecs.lucene62.Lucene62Codec.<init>(Lucene62Codec.java:82)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at java.lang.Class.newInstance(Unknown Source)
at org.apache.lucene.util.NamedSPILoader.reload(NamedSPILoader.java:72)
... 10 more
I have seen similar problems in other Stack Overflow questions, for example "What causes err 'A SPI class of type lucene.codecs.Codec name Lucene42'", but this also did not solve my problem.
My pom.xml is below. As you can see, I also tried to use resource transformers, but it changed nothing.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
...
<dependencies>
<!-- Spring framework -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
<version>3.2.2.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
<version>3.2.2.RELEASE</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.mongodb/mongo-java-driver -->
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>3.5.0</version>
</dependency>
<!-- Spring data mongodb -->
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.2.0.RELEASE</version>
</dependency>
<dependency>
<groupId>cglib</groupId>
<artifactId>cglib</artifactId>
<version>2.2.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.lucene/lucene-core -->
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-core</artifactId>
<version>6.6.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.lucene/lucene-analyzers-common -->
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-analyzers-common</artifactId>
<version>6.6.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/junit/junit -->
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
<scope>test</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.lucene/lucene-queryparser -->
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-queryparser</artifactId>
<version>6.6.0</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.5</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.7.5</version>
</dependency>
<dependency>
<groupId>de.flapdoodle.embed</groupId>
<artifactId>de.flapdoodle.embed.mongo</artifactId>
<version>1.26</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<!--<plugin>-->
<!--<artifactId>maven-compiler-plugin</artifactId>-->
<!--<version>3.0</version>-->
<!--<configuration>-->
<!--<source>1.6</source>-->
<!--<target>1.6</target>-->
<!--</configuration>-->
<!--</plugin>-->
<!--<plugin>-->
<!--<groupId>org.apache.maven.plugins</groupId>-->
<!--<artifactId>maven-eclipse-plugin</artifactId>-->
<!--<version>2.9</version>-->
<!--<configuration>-->
<!--<downloadSources>true</downloadSources>-->
<!--<downloadJavadocs>true</downloadJavadocs>-->
<!--</configuration>-->
<!--</plugin>-->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.3</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<transformers>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>com..Main</mainClass>
</transformer>
<transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
Any help or guidance will be appreciated.
My solution was to use the Maven Shade plugin, and since I was using IntelliJ, the problem lay in how I was creating the JAR file.
Instead of creating it with an 'artifacts' build in IntelliJ, I created the JAR file with the mvn tool in the terminal (mvn clean package). Make sure your JAVA_HOME is set correctly; I had to install openjdk8 via apt-get again. In the end, the correct JAR file was created in the target directory. The ServicesResourceTransformer in the pom above is likely the decisive piece: it merges the META-INF/services files from the individual Lucene jars, which is exactly what the SPI loader in the stack trace needs, whereas the IntelliJ artifact build keeps only one copy.
See here for more details on how to create a JAR file using Maven Shade.
(This solution worked for Lucene 6.6.0, Windows 10 bash).
When I use Spark 1.6.1, everything is fine. When I switch to Spark 2.1.0, I come across the problem below:
Task 33 in stage 3.0 failed 4 times; aborting job
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 33 in stage 3.0 failed 4 times, most recent failure: Lost task 33.3 in stage 3.0 (TID 310, 192.168.1.5, executor 3): java.io.InvalidClassException: scala.Tuple2; local class incompatible: stream classdesc serialVersionUID = -4864544146559264103, local class serialVersionUID = 3356420310891166197
I know -4864544146559264103 corresponds to Scala 2.10, while 3356420310891166197 corresponds to Scala 2.11, and I changed my configuration to the Scala 2.11 builds accordingly.
EDIT: the entire pom file is shown below.
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>test.spark</groupId>
<artifactId>spark</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<name>spark</name>
<url>http://maven.apache.org</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<build>
<plugins>
<plugin>
<!-- solve the problem of : java.lang.ClassNotFoundException: kafka.producer.ProducerConfig -->
<artifactId>maven-assembly-plugin</artifactId>
<version>2.4</version>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.6.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.11</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-server</artifactId>
<version>1.1.2</version>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-client</artifactId>
<version>1.1.2</version>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-common</artifactId>
<version>1.1.2</version>
</dependency>
<dependency>
<groupId>io.fastjson</groupId>
<artifactId>boon</artifactId>
<version>0.33</version>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.7</version>
</dependency>
<dependency>
<groupId>com.googlecode.json-simple</groupId>
<artifactId>json-simple</artifactId>
<version>1.1.1</version>
</dependency>
</dependencies>
</project>
The problem still exists even so. How can I fix it? Any details needed will be added. Thanks for any help!
Finally, I fixed this problem. It was my fault: the pom file is fine, and the project works well.
The problem resulted from one detail the question didn't mention (I'm sorry about that): the code reads scala.Tuple2 objects from HDFS. Those objects were generated with Scala 2.10 by another project, which is why the error occurs.
Anyway, thanks for your help.
My Java class throws this error when deployed on Tomcat:
org.apache.jasper.JasperException:
javax.servlet.ServletException: java.lang.NoSuchMethodError: com.fasterxml.jackson.core.JsonParser.getValueAsString()Ljava/lang/String;
org.apache.jasper.servlet.JspServletWrapper.handleJspException(JspServletWrapper.java:556)
When tested locally, the class runs perfectly well. I understand that there might be a conflict of Jackson versions, but I can't find it.
Here is an extract of my pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.skylads.skott.webapp</groupId>
<version>0.0.1-SNAPSHOT</version>
<packaging>war</packaging>
<name>Skylads CRM App</name>
<description>CRM App integrating various features</description>
<build>
<sourceDirectory>src</sourceDirectory>
<resources>
<resource>
<directory>src</directory>
<excludes>
<exclude>**/*.java</exclude>
</excludes>
</resource>
<resource>
<directory>resources</directory>
<excludes>
<exclude>**/*.java</exclude>
</excludes>
</resource>
</resources>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.3</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
<plugin>
<artifactId>maven-war-plugin</artifactId>
<version>2.6</version>
<configuration>
<warSourceDirectory>WebContent</warSourceDirectory>
<failOnMissingWebXml>false</failOnMissingWebXml>
</configuration>
</plugin>
</plugins>
</build>
<artifactId>crm</artifactId>
<dependencies>
<dependency>
<groupId>com.google.apis</groupId>
<artifactId>google-api-services-oauth2</artifactId>
<exclusions>
<exclusion>
<artifactId>jackson-core</artifactId>
<groupId>com.fasterxml.jackson.core</groupId>
</exclusion>
</exclusions>
<version>v2-rev98-1.20.0</version>
</dependency>
<dependency>
<groupId>com.google.apis</groupId>
<artifactId>google-api-services-plus</artifactId>
<version>v1-rev323-1.21.0</version>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.5</version>
</dependency>
<dependency>
<groupId>com.google.oauth-client</groupId>
<artifactId>google-oauth-client-java6</artifactId>
<version>1.21.0</version>
</dependency>
<dependency>
<groupId>com.google.http-client</groupId>
<artifactId>google-http-client-jackson</artifactId>
<version>1.20.0</version>
</dependency>
<dependency>
<groupId>com.google.oauth-client</groupId>
<artifactId>google-oauth-client</artifactId>
<version>1.21.0</version>
</dependency>
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
<version>4.5.1</version>
</dependency>
<dependency>
<groupId>org.apache.tomcat</groupId>
<artifactId>tomcat-jdbc</artifactId>
<version>7.0.35</version>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.38</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>jstl</artifactId>
<version>1.2</version>
</dependency>
<dependency>
<groupId>org.scribe</groupId>
<artifactId>scribe</artifactId>
<version>1.1.2</version>
</dependency>
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>2.9.2</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
</dependency>
</dependencies>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.1.3</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.1.3</version>
</dependency>
</dependencies>
</dependencyManagement>
</project>
App runs on Tomcat 7.0.
Edit: Full pom.xml added
Any help very much appreciated.
OK guys, I'm out of this nightmare. I used http://jhades.org/ to identify the overlapping jars, went into WEB-INF, and removed the old jars that were conflicting and creating the issue.
Maven is very cool but doesn't solve everything; I lost more than three hours on this issue...
Thanks again for your contributions, which were very helpful.
Soufian
As you said, on Tomcat you are using a different version of jackson-core than in your build.
Find out what you use in Maven with mvn help:effective-pom.
Then have a look at what you have packaged in your war.
It's also possible that Tomcat ships its own Jackson library and that this causes the problem.
You have not provided extensive information in your question, but I can tell you that version 2.1.0 is the first version to have the getValueAsString method. The version in your pom.xml does have it; older versions do not, and that is where your problem lies. So in my experience this is clearly a case of two jackson-core jar versions on the classpath, with the class loaders finding the older one first. You need to do something like:
$ find . -name "*jackson*"
in your Tomcat 7 directory. Find the older version and remove it from your installation. Read the Tomcat 7 documentation to learn more about class loading in your Tomcat version.
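If deleting jars by hand feels fragile, a pom-side sketch of the same idea (assuming mvn dependency:tree shows the stale jackson-core arriving transitively through one of the Google client artifacts; google-api-services-plus here is only an illustration) is to repeat the exclusion already applied to google-api-services-oauth2:
<dependency>
    <groupId>com.google.apis</groupId>
    <artifactId>google-api-services-plus</artifactId>
    <version>v1-rev323-1.21.0</version>
    <exclusions>
        <!-- illustrative: drop the transitive jackson-core so the managed 2.1.3 wins -->
        <exclusion>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-core</artifactId>
        </exclusion>
    </exclusions>
</dependency>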
In my project I want to avoid a version conflict between the Neo4j Lucene indexer (which uses Lucene version 3.6.2) and Apache Lucene (Lucene version 5.3.0). For this I want to use the Maven Shade plugin. I added the plugin to my project's pom.xml file, but the problem wasn't solved. I get this exception:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.lucene.analysis.standard.StandardAnalyzer: method <init>()V not found
at com.sessa.col.spr.act.dictionary.DictionaryConfiguration.writerConfiguration(DictionaryConfiguration.java:124)
at com.sessa.col.spr.act.process_flow.Flow.startProcess(Flow.java:59)
at com.sessa.col.spr.act.process_flow.FlowHandler.main(FlowHandler.java:17)
It seems that it is caused by a version conflict again. I guess I'm not using the Maven Shade plugin correctly. How should it be used?
pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.sessa.col.spr.act</groupId>
<artifactId>Color-Spreading-Activation</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<name>Color-Spreading-Activation</name>
<url>http://maven.apache.org</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<neo4j-version>2.2.5</neo4j-version>
</properties>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.4</version>
</dependency>
<dependency>
<groupId>org.neo4j</groupId>
<artifactId>neo4j</artifactId>
<version>${neo4j-version}</version>
</dependency>
<dependency>
<groupId>edu.stanford.nlp</groupId>
<artifactId>stanford-corenlp</artifactId>
<version>3.5.2</version>
</dependency>
<dependency>
<groupId>edu.stanford.nlp</groupId>
<artifactId>stanford-parser</artifactId>
<version>3.5.2</version>
</dependency>
<dependency>
<groupId>edu.stanford.nlp</groupId>
<artifactId>stanford-corenlp</artifactId>
<version>3.5.2</version>
<classifier>models</classifier>
</dependency>
<dependency>
<groupId>com.sparsity</groupId>
<artifactId>sparkseejava</artifactId>
<version>5.1.0</version>
</dependency>
<dependency>
<groupId>org.apache.jena</groupId>
<artifactId>jena-tdb</artifactId>
<version>1.1.2</version>
</dependency>
<dependency>
<groupId>org.apache.opennlp</groupId>
<artifactId>opennlp-tools</artifactId>
<version>1.5.3</version>
</dependency>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-core</artifactId>
<version>5.3.0</version>
</dependency>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-analyzers-common</artifactId>
<version>5.3.0</version>
</dependency>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-queryparser</artifactId>
<version>5.3.0</version>
</dependency>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-queries</artifactId>
<version>5.3.0</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.3</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<createDependencyReducedPom>false</createDependencyReducedPom>
<relocations>
<relocation>
<pattern>org.apache.lucene</pattern>
<shadedPattern>shaded_lucene_3_6_2.org.apache.lucene</shadedPattern>
</relocation>
</relocations>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
<repositories>
<repository>
<id>neo4j-repo</id>
<name>Neo4j Repository</name>
<url>http://m2.neo4j.org/content/repositories/releases</url>
</repository>
</repositories>
I doubt that maven-shade will help you here. You basically want to have multiple versions of the same jar (Lucene 3.6.2 and 5.x.y) in use at the same time.
The only solution I'm aware of here is using classloader separation.
However, it might be worth refactoring the architecture to avoid the problem by separating Neo4j and your code into separate JVMs.
1. Add this project to Eclipse: https://github.com/lagodiuk/neo4j-uber-jar.
2. Use mvn install to create the ~SNAPSHOT.jar.
3. Add that .jar to your project (the one with the conflict); a sketch of the dependency entry follows below.
4. Remove the neo4j Maven dependency from that project.
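For step 3, a minimal sketch of the dependency entry after mvn install, assuming hypothetical coordinates (take the real groupId, artifactId, and version from the neo4j-uber-jar project's own pom):
<dependency>
    <!-- hypothetical coordinates; copy them from the uber-jar's pom.xml -->
    <groupId>com.lagodiuk</groupId>
    <artifactId>neo4j-uber-jar</artifactId>
    <version>1.0-SNAPSHOT</version>
</dependency>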