BigQuery: java.lang.NoClassDefFoundError: com/google/api/gax/retrying/ExceptionRetryAlgorithm

I'm working on a Google Dataflow Java project and want to create and update BigQuery tables during the initialisation of the pipeline. To get started, I'm instantiating the BigQuery Java client like this:
BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
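For context, the kind of table setup I'm aiming for once the client instantiates is roughly the following sketch (the dataset and table names are just placeholders):
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Field;
import com.google.cloud.bigquery.Schema;
import com.google.cloud.bigquery.StandardSQLTypeName;
import com.google.cloud.bigquery.StandardTableDefinition;
import com.google.cloud.bigquery.TableId;
import com.google.cloud.bigquery.TableInfo;

public class TableSetup {
    public static void main(String[] args) {
        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

        // Placeholder dataset and table names, purely for illustration.
        TableId tableId = TableId.of("my_dataset", "my_table");
        Schema schema = Schema.of(
                Field.of("id", StandardSQLTypeName.INT64),
                Field.of("name", StandardSQLTypeName.STRING));

        // Create the table only if it does not exist yet.
        if (bigquery.getTable(tableId) == null) {
            bigquery.create(TableInfo.of(tableId, StandardTableDefinition.of(schema)));
        }
    }
}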
But even the first getService() call throws the following exception at runtime:
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/api/gax/retrying/ExceptionRetryAlgorithm
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at com.google.cloud.BaseService.<clinit>(BaseService.java:48)
at com.google.cloud.bigquery.BigQueryOptions$DefaultBigQueryFactory.create(BigQueryOptions.java:46)
at com.google.cloud.bigquery.BigQueryOptions$DefaultBigQueryFactory.create(BigQueryOptions.java:40)
at com.google.cloud.ServiceOptions.getService(ServiceOptions.java:426)
at com.mydomain.Analysis.main(Analysis.java:359)
Caused by: java.lang.ClassNotFoundException: com.google.api.gax.retrying.ExceptionRetryAlgorithm
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 17 more
This is my pom.xml:
<properties>
<beam.version>2.19.0</beam.version>
</properties>
<dependencies>
<!-- https://mvnrepository.com/artifact/org.apache.commons/commons-collections4 -->
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-collections4</artifactId>
<version>4.0</version>
</dependency>
<dependency>
<groupId>com.google.cloud</groupId>
<artifactId>google-cloud-bigquery</artifactId>
<version>1.110.0</version>
</dependency>
<!-- Beam -->
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-java-core</artifactId>
<version>${beam.version}</version>
</dependency>
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-java-io-google-cloud-platform</artifactId>
<version>${beam.version}</version>
<exclusions>
<exclusion>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-java-extensions-google-cloud-platform-core</artifactId>
<version>${beam.version}</version>
</dependency>
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-runners-google-cloud-dataflow-java</artifactId>
<version>${beam.version}</version>
</dependency>
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-java-io-jdbc</artifactId>
<version>${beam.version}</version>
</dependency>
</dependencies>
What am I missing?

From the error it looks like the jar containing com.google.api.gax.retrying.ExceptionRetryAlgorithm is missing from the classpath.
A quick Google search pointed me to adding the dependency below to the pom.xml; the gax jar is what provides the ExceptionRetryAlgorithm class.
<dependency>
<groupId>com.google.api</groupId>
<artifactId>gax</artifactId>
<version>0.14.0</version>
</dependency>
I hope that adding the above dependency to your pom resolves the issue.
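If adding it directly does not help, it may be that Beam 2.19.0's GCP modules and google-cloud-bigquery 1.110.0 each pull in their own transitive copy of gax, and Maven resolves to a version that lacks this class. Running mvn dependency:tree -Dincludes=com.google.api:gax shows which gax version actually wins. Pinning gax in dependencyManagement is one way to force a single, known version; the version below is just the one from this answer and should be adjusted to whatever your google-cloud-bigquery release expects:
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.google.api</groupId>
<artifactId>gax</artifactId>
<!-- adjust to the gax version your google-cloud-bigquery release needs -->
<version>0.14.0</version>
</dependency>
</dependencies>
</dependencyManagement>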

Related

Running Mahout in Hadoop Cluster - java.lang.ClassNotFoundException

I am trying to run Mahout code on a Hadoop cluster, and it throws the error below.
Error: java.lang.ClassNotFoundException: org.apache.mahout.cf.taste.model.DataModel
at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2625)
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2590)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2686)
at org.apache.hadoop.mapreduce.task.JobContextImpl.getReducerClass(JobContextImpl.java:211)
at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:614)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:390)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:178)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:172)
It works fine locally, though. Here's my pom.xml:
<dependencies>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>3.3.1</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>1.2.1</version>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>1.18.24</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.mahout</groupId>
<artifactId>mahout-core</artifactId>
<version>0.9</version>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-math3</artifactId>
<version>3.6.1</version>
</dependency>
</dependencies>
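Since the job works locally but the cluster's reduce task cannot load org.apache.mahout.cf.taste.model.DataModel, the usual suspect is that the submitted job jar does not bundle the Mahout classes. One common remedy, sketched here with the maven-shade-plugin (the plugin version is just a recent release, adjust as needed), is to build a fat jar and submit that:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.2.4</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
Alternatively, if the driver uses Hadoop's ToolRunner, the Mahout jars can be shipped alongside the job with -libjars instead of shading.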

Tomcat errors in IDEA

When I run my code with Tomcat in IDEA, it fails with the following error. I know there is something wrong with the Tomcat configuration, but I do not know how to find it.
[2018-12-05 09:01:14,894] Artifact SSMCRUD:Web exploded: Waiting for server connection to start artifact deployment...
Caused by: java.lang.NoClassDefFoundError: org/springframework/core/io/Resource
at java.lang.Class.getDeclaredFields0(Native Method)
at java.lang.Class.privateGetDeclaredFields(Unknown Source)
at java.lang.Class.getDeclaredFields(Unknown Source)
at org.apache.catalina.util.Introspection.getDeclaredFields(Introspection.java:106)
at org.apache.catalina.startup.WebAnnotationSet.loadApplicationAnnotations(WebAnnotationSet.java:63)
at org.apache.catalina.startup.ContextConfig.applicationAnnotationsConfig(ContextConfig.java:336)
at org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:786)
at org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:307)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:95)
at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5212)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:145)
... 42 more
Caused by: java.lang.ClassNotFoundException: org.springframework.core.io.Resource
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1335)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1163)
... 56 more
The following is my pom:
<groupId>com.dxq</groupId>
<artifactId>SSM-CRUD</artifactId>
<version>1.0-SNAPSHOT</version>
<!-- Project dependency jars -->
<dependencies>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-web</artifactId>
<version>4.3.7.RELEASE</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.springframework/spring-context -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
<version>4.3.7.RELEASE</version>
</dependency>
<!-- Aspect-oriented programming -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
<version>4.3.7.RELEASE</version>
</dependency>
<dependency>
<groupId>org.mybatis</groupId>
<artifactId>mybatis</artifactId>
<version>3.4.2</version>
</dependency>
<!-- Adapter package for integrating MyBatis with Spring -->
<dependency>
<groupId>org.mybatis</groupId>
<artifactId>mybatis-spring</artifactId>
<version>1.3.1</version>
</dependency>
<dependency>
<groupId>org.mortbay.jetty</groupId>
<artifactId>servlet-api</artifactId>
<scope>provided</scope>
<version>2.5-20081211</version>
</dependency>
</dependencies>
</project>
What is wrong with my code? Is there something wrong with the configuration in my pom.xml?
Those are my dependencies above, and I cannot find anything wrong with them!
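org.springframework.core.io.Resource lives in spring-core, which spring-web normally pulls in transitively, so the class is probably missing from the deployed artifact's WEB-INF/lib rather than from the pom; checking the IDEA artifact settings is usually the first step. As a hedged sanity check, spring-core can also be declared explicitly, matching the other Spring versions:
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
<version>4.3.7.RELEASE</version>
</dependency>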

Getting class not found for org.json.simple.parser.ParseException even though the dependency is there in pom.xml?

Here is my pom.xml
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>1.2.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.googlecode.json-simple</groupId>
<artifactId>json-simple</artifactId>
<version>1.1.1</version>
</dependency>
</dependencies>
I have added com.googlecode.json-simple as a dependency in the project, and it builds fine in Eclipse, but when I run the jar on Hadoop I get:
Error: java.lang.ClassNotFoundException: org.json.simple.parser.ParseException
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2134)
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2099)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193)
at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:186)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:745)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Any ideas?
I think the issue could be because of the scope. Please try changing the scope to "compile" or "test"; I think you are only using it for testing. If the dependency is declared with provided scope, as below, the json-simple jar is not packaged with your job:
<dependency>
<groupId>com.googlecode.json-simple</groupId>
<artifactId>json-simple</artifactId>
<version>1.1</version>
<scope>provided</scope>
</dependency>
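For reference, a compile-scoped declaration (using the 1.1.1 version from the question) would look like the following; note that with Hadoop even a compile-scoped dependency only helps if the jar actually reaches the cluster, for example via a fat jar or -libjars:
<dependency>
<groupId>com.googlecode.json-simple</groupId>
<artifactId>json-simple</artifactId>
<version>1.1.1</version>
<scope>compile</scope>
</dependency>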

java.lang.NoClassDefFoundError

Here is what I get when compiling the project with this command:
mvn compile exec:java -Dexec.classPathScope=compile -Dexec.mainClass=trident.myproject
It produces this:
java.lang.NoClassDefFoundError: storm/trident/state/StateFactory
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2427)
at java.lang.Class.getMethod0(Class.java:2670)
at java.lang.Class.getMethod(Class.java:1603)
at org.codehaus.mojo.exec.ExecJavaMojo$1.run(ExecJavaMojo.java:285)
at java.lang.Thread.run(Thread.java:662)
Caused by: java.lang.ClassNotFoundException: storm.trident.state.StateFactory
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
... 6 more
[INFO] ------------------------------------------------------------------------
[ERROR] BUILD ERROR
[INFO] ------------------------------------------------------------------------
[INFO] An exception occured while executing the Java class. storm/trident/state/StateFactory
storm.trident.state.StateFactory
I checked the folder and cannot find this class, although I ran it successfully before.
I wrote the pom file in parts:
<dependencies>
<dependency>
<groupId>storm</groupId>
<artifactId>storm</artifactId>
<version>0.8.2</version>
<scope>provided</scope>
</dependency>
part 2 of pom.xml
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>colt</groupId>
<artifactId>colt</artifactId>
<version>1.2.0</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.0.0</version>
</dependency>
part 3 of pom.xml
<dependency>
<groupId>org.twitter4j</groupId>
<artifactId>twitter4j-core</artifactId>
<version>4.0.2</version>
</dependency>
<dependency>
<groupId>org.twitter4j</groupId>
<artifactId>twitter4j-stream</artifactId>
<version>4.0.2</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
</dependency>
</dependencies>
You might have changed the scope of the dependency in your pom.xml file. If your container provides the dependency, set the scope to provided:
<scope>provided</scope>
If the container does not provide the dependency, change the scope to compile:
<scope>compile</scope>
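In this case the project is launched locally with mvn exec:java rather than submitted to a Storm cluster that supplies storm.jar, so dropping the provided scope is the likely fix; a sketch, keeping the version from the question:
<dependency>
<groupId>storm</groupId>
<artifactId>storm</artifactId>
<version>0.8.2</version>
<scope>compile</scope>
</dependency>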

Kafka Consumer Example throws ClassNotFoundException

I have Kafka 0.8 built using Scala 2.8, following the steps from this site. I could also run the kafka-console-producer and kafka-console-consumer examples successfully. I am now experimenting with the Consumer Group Example (please see the "Full Source Code" section). The exception I get is below:
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:622)
at org.codehaus.mojo.exec.ExecJavaMojo$1.run(ExecJavaMojo.java:297)
at java.lang.Thread.run(Thread.java:701)
Caused by: java.lang.NoClassDefFoundError: scala/Tuple2$mcJJ$sp
at kafka.consumer.ConsumerConfig.<init>(Unknown Source)
at ConsumerGroupExample.createConsumerConfig(ConsumerGroupExample.java:55)
at ConsumerGroupExample.<init>(ConsumerGroupExample.java:18)
at ConsumerGroupExample.main(ConsumerGroupExample.java:64)
... 6 more
Caused by: java.lang.ClassNotFoundException: scala.Tuple2$mcJJ$sp
at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at java.lang.ClassLoader.loadClass(ClassLoader.java:323)
at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
... 10 more
My dependencies look like this:
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.10</artifactId>
<version>0.8.0</version>
</dependency>
<dependency>
<groupId>javax.inject</groupId>
<artifactId>javax.inject</artifactId>
<version>1</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.8.0</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
</dependency>
<dependency>
<groupId>com.101tec</groupId>
<artifactId>zkclient</artifactId>
<version>0.3</version>
</dependency>
<dependency>
<groupId>com.yammer.metrics</groupId>
<artifactId>metrics-core</artifactId>
<version>2.2.0</version>
</dependency>
As a note, I could get it running with a Python client, but I really need this done in Java. I would really appreciate any help!
You're using Kafka built against Scala 2.10 (see the _2.10 suffix in the artifact name):
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.10</artifactId>
<version>0.8.0</version>
</dependency>
together with Scala 2.8.0:
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.8.0</version>
</dependency>
Major Scala versions (2.8.*, 2.9.*, 2.10.*) are binary incompatible with each other; pick a matching version of either the Kafka artifact or your project's Scala library and everything will be fine.
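For example, keeping kafka_2.10 and aligning the Scala library with it would look like this (2.10.4 is shown purely as an illustrative 2.10.x release):
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.10.4</version>
</dependency>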
