I am using Hadoop 3.2.1 with Pig 0.17.0 and Oozie 5.1.0. While running the Pig workflow sample in Oozie, I encountered the issue described below.
I am currently using Guava jar 20.0, but I have also tried other Guava versions (27.0, 14.02, 11.0) in both the share/lib/ directory and the local directory, and I get the same error every time.
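For reference, swapping the jar in the sharelib generally looks like the following (a sketch; the paths assume the default Oozie sharelib layout on HDFS, lib_<timestamp> stands for the actual sharelib directory, and the Oozie URL is an example):
# remove the old Guava jar from the Pig sharelib on HDFS
hdfs dfs -rm /user/oozie/share/lib/lib_<timestamp>/pig/guava-*.jar
# upload the replacement version
hdfs dfs -put guava-20.0.jar /user/oozie/share/lib/lib_<timestamp>/pig/
# tell the Oozie server to pick up the new sharelib contents
oozie admin -oozie http://localhost:11000/oozie -sharelibupdate
The error is: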
java.lang.NoSuchMethodError: com.google.common.io.Closeables.closeQuietly(Ljava/io/Closeable;)V
at org.apache.pig.Main.log4jConfAsProperties(Main.java:826)
at org.apache.pig.Main.configureLog4J(Main.java:769)
at org.apache.pig.Main.run(Main.java:409)
at org.apache.pig.PigRunner.run(PigRunner.java:49)
at org.apache.oozie.action.hadoop.PigMain.logExpandedScript(PigMain.java:244)
at org.apache.oozie.action.hadoop.PigMain.run(PigMain.java:199)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:104)
at org.apache.oozie.action.hadoop.PigMain.main(PigMain.java:77)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
Related
We just upgraded to Java 8 and I'm seeing the following error in our logs. My lead tells me I need to recompile with an older version of JAXB. Can anyone tell me where I can go to download JAXB 2.0, or suggest another solution to this error?
java.lang.NoSuchMethodError: javax.xml.bind.annotation.XmlAccessorType.value()Ljavax/xml/bind/annotation/AccessType;
at com.sun.xml.bind.v2.model.impl.ClassInfoImpl.getAccessType(ClassInfoImpl.java:339)
at com.sun.xml.bind.v2.model.impl.ClassInfoImpl.getProperties(ClassInfoImpl.java:228)
at com.sun.xml.bind.v2.model.impl.RuntimeClassInfoImpl.getProperties(RuntimeClassInfoImpl.java:87)
at com.sun.xml.bind.v2.model.impl.ModelBuilder.getClassInfo(ModelBuilder.java:127)
at com.sun.xml.bind.v2.model.impl.RuntimeModelBuilder.getClassInfo(RuntimeModelBuilder.java:49)
at com.sun.xml.bind.v2.model.impl.RuntimeModelBuilder.getClassInfo(RuntimeModelBuilder.java:41)
at com.sun.xml.bind.v2.model.impl.ModelBuilder.getTypeInfo(ModelBuilder.java:189)
at com.sun.xml.bind.v2.model.impl.RegistryInfoImpl.<init>(RegistryInfoImpl.java:51)
at com.sun.xml.bind.v2.model.impl.ModelBuilder.addRegistry(ModelBuilder.java:232)
at com.sun.xml.bind.v2.model.impl.ModelBuilder.getTypeInfo(ModelBuilder.java:201)
at com.sun.xml.bind.v2.runtime.JAXBContextImpl.getTypeInfoSet(JAXBContextImpl.java:327)
at com.sun.xml.bind.v2.runtime.JAXBContextImpl.<init>(JAXBContextImpl.java:198)
at com.sun.xml.bind.v2.ContextFactory.createContext(ContextFactory.java:76)
at com.sun.xml.bind.v2.ContextFactory.createContext(ContextFactory.java:55)
at com.sun.xml.bind.v2.ContextFactory.createContext(ContextFactory.java:124)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at javax.xml.bind.ContextFinder.newInstance(ContextFinder.java:171)
at javax.xml.bind.ContextFinder.newInstance(ContextFinder.java:131)
at javax.xml.bind.ContextFinder.find(ContextFinder.java:335)
at javax.xml.bind.JAXBContext.newInstance(JAXBContext.java:431)
at javax.xml.bind.JAXBContext.newInstance(JAXBContext.java:394)
at javax.xml.bind.JAXBContext.newInstance(JAXBContext.java:298)
at com.sun.xml.internal.ws.assembler.MetroConfigLoader$3.run(MetroConfigLoader.java:259)
at com.sun.xml.internal.ws.assembler.MetroConfigLoader$3.run(MetroConfigLoader.java:256)
at java.security.AccessController.doPrivileged(Native Method)
at com.sun.xml.internal.ws.assembler.MetroConfigLoader.createJAXBContext(MetroConfigLoader.java:255)
at com.sun.xml.internal.ws.assembler.MetroConfigLoader.loadMetroConfig(MetroConfigLoader.java:241)
at com.sun.xml.internal.ws.assembler.MetroConfigLoader.init(MetroConfigLoader.java:131)
at com.sun.xml.internal.ws.assembler.MetroConfigLoader.<init>(MetroConfigLoader.java:104)
at com.sun.xml.internal.ws.assembler.TubelineAssemblyController.getTubeCreators(TubelineAssemblyController.java:78)
at com.sun.xml.internal.ws.assembler.MetroTubelineAssembler.createClient(MetroTubelineAssembler.java:103)
at com.sun.xml.internal.ws.client.Stub.createPipeline(Stub.java:328)
at com.sun.xml.internal.ws.client.Stub.<init>(Stub.java:295)
at com.sun.xml.internal.ws.client.Stub.<init>(Stub.java:228)
at com.sun.xml.internal.ws.client.Stub.<init>(Stub.java:243)
at com.sun.xml.internal.ws.client.sei.SEIStub.<init>(SEIStub.java:84)
at com.sun.xml.internal.ws.client.WSServiceDelegate.getStubHandler(WSServiceDelegate.java:814)
at com.sun.xml.internal.ws.client.WSServiceDelegate.createEndpointIFBaseProxy(WSServiceDelegate.java:803)
at com.sun.xml.internal.ws.client.WSServiceDelegate.getPort(WSServiceDelegate.java:436)
at com.sun.xml.internal.ws.client.WSServiceDelegate.getPort(WSServiceDelegate.java:404)
at com.sun.xml.internal.ws.client.WSServiceDelegate.getPort(WSServiceDelegate.java:386)
at javax.xml.ws.Service.getPort(Service.java:119)
If you're using Maven or Gradle, just declare the appropriate version and it will be downloaded from the Maven Central repository. Otherwise, you can download it manually and add it to the classpath:
https://mvnrepository.com/artifact/javax.xml.bind/jaxb-api/2.0
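The manual route might look like this (a sketch; the jar URL follows the standard Maven Central repository layout, and myapp.jar and com.example.MyApp are hypothetical):
# download jaxb-api 2.0 from Maven Central
wget https://repo1.maven.org/maven2/javax/xml/bind/jaxb-api/2.0/jaxb-api-2.0.jar
# add the downloaded jar to the application classpath
java -cp jaxb-api-2.0.jar:myapp.jar com.example.MyApp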
I am trying to run the CountVectorizerDemo program provided here:
https://github.com/apache/spark/blob/master/examples/src/main/java/org/apache/spark/examples/ml/JavaCountVectorizerExample.java
I'm getting the following error and don't know what the problem is.
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.ml.util.SchemaUtils$.checkColumnType$default$4()Ljava/lang/String;
at org.apache.spark.ml.feature.CountVectorizerParams$class.validateAndTransformSchema(CountVectorizer.scala:71)
at org.apache.spark.ml.feature.CountVectorizer.validateAndTransformSchema(CountVectorizer.scala:107)
at org.apache.spark.ml.feature.CountVectorizer.transformSchema(CountVectorizer.scala:168)
at org.apache.spark.ml.PipelineStage.transformSchema(Pipeline.scala:59)
at org.apache.spark.ml.feature.CountVectorizer.fit(CountVectorizer.scala:130)
at com.bah.ossem.spark.topic.CountVectorizerDemo.main(CountVectorizerDemo.java:42)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
The problem was that my cluster was running Spark core 1.4 while my application was built against Spark core 1.5.1 and MLlib 1.5.1. Upgrading my AWS cluster to Spark 1.5.1 fixed the problem.
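The general check is to compare the version the cluster actually runs against the one the application was built with, for example (a sketch; the spark.version property assumes your POM parameterizes the Spark dependencies):
# print the Spark version installed on the cluster
spark-submit --version
# rebuild the application against that same version
mvn clean package -Dspark.version=1.5.1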
My CDH 5.2 cluster has a problem running HBase MapReduce jobs.
For example, I added the HBase classpath to the Hadoop classpath:
vi /etc/hadoop/conf/hadoop-env.sh
and added the line:
export HADOOP_CLASSPATH=`/usr/lib/hbase/bin/hbase classpath`:$HADOOP_CLASSPATH
Then, when I run:
hadoop jar /usr/lib/hbase/hbase-server-0.98.6-cdh5.2.1.jar rowcounter "mytable"
I get the following exception:
14/12/09 03:44:02 WARN security.UserGroupInformation: PriviledgedActionException as:root (auth:SIMPLE) cause:java.io.FileNotFoundException: File does not exist: hdfs://clusterName/usr/lib/hbase/lib/hbase-client-0.98.6-cdh5.2.1.jar
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:54)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.io.FileNotFoundException: File does not exist: hdfs://clusterName/usr/lib/hbase/lib/hbase-client-0.98.6-cdh5.2.1.jar
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1083)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1075)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1075)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1313)
at org.apache.hadoop.hbase.mapreduce.RowCounter.main(RowCounter.java:191)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:145)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:153)
I have the same problem with CDH 5.2.0. As a workaround, I manually copied the jar file into HDFS, and the exceptions went away.
So the problem was an environment issue: when I added the jars below to /usr/lib/hadoop/lib, everything worked fine (a copy sketch follows the list).
hbase-client-0.98.6-cdh5.2.1.jar
hbase-common-0.98.6-cdh5.2.1.jar
hbase-protocol-0.98.6-cdh5.2.1.jar
hbase-server-0.98.6-cdh5.2.1.jar
hbase-prefix-tree-0.98.6-cdh5.2.1.jar
hadoop-core-2.5.0-mr1-cdh5.2.1.jar
htrace-core-2.04.jar
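Something like the following covers the copy (a sketch; the paths assume the stock CDH 5.2.1 package layout, and the MR1 hadoop-core jar would come from its own package directory):
# copy the HBase jars next to the Hadoop libraries
for jar in hbase-client hbase-common hbase-protocol hbase-server hbase-prefix-tree; do
    cp /usr/lib/hbase/lib/${jar}-0.98.6-cdh5.2.1.jar /usr/lib/hadoop/lib/
done
cp /usr/lib/hbase/lib/htrace-core-2.04.jar /usr/lib/hadoop/lib/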
My machine has the following RPMs installed:
>> rpm -qa | grep cdh
zookeeper-3.4.5+cdh5.2.1+84-1.cdh5.2.1.p0.13.el6.x86_64
hadoop-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64
hadoop-0.20-mapreduce-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64
hbase-regionserver-0.98.6+cdh5.2.1+64-1.cdh5.2.1.p0.9.el6.x86_64
cloudera-cdh-5-0.x86_64
bigtop-utils-0.7.0+cdh5.2.1+0-1.cdh5.2.1.p0.13.el6.noarch
bigtop-jsvc-0.6.0+cdh5.2.1+578-1.cdh5.2.1.p0.13.el6.x86_64
parquet-1.5.0+cdh5.2.1+38-1.cdh5.2.1.p0.12.el6.noarch
hadoop-hdfs-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64
hadoop-mapreduce-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64
hadoop-0.20-mapreduce-tasktracker-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64
hbase-0.98.6+cdh5.2.1+64-1.cdh5.2.1.p0.9.el6.x86_64
avro-libs-1.7.6+cdh5.2.1+69-1.cdh5.2.1.p0.13.el6.noarch
parquet-format-2.1.0+cdh5.2.1+6-1.cdh5.2.1.p0.14.el6.noarch
hadoop-yarn-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64
hadoop-hdfs-datanode-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64
I still wonder which rpm is missing.
Instead of adding the file manually to HDFS, you can add the HBase library path to your .bashrc file: add the lib folder in HBase to the CLASSPATH, and add the HBase classpath to HADOOP_CLASSPATH.
Your .bashrc file should contain the following:
# append the full HBase classpath and the MapReduce-specific subset
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:`${HBASE_HOME}/bin/hbase classpath`
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:`${HBASE_HOME}/bin/hbase mapredcp`
# make the HBase jars visible to plain javac/java as well
export CLASSPATH=${HBASE_HOME}/lib/*
Note: CLASSPATH should point to the lib folder of your HBase installation. Use the following to compile and run your Java code:
javac Example.java
java -classpath $CLASSPATH:. Example
I am running GraphBuilder with Hadoop (single-node). I followed the tutorial at https://01.org/graphbuilder/documentation/how-run-demo-application. However, when I run the command
bin/hadoop jar /usr/local/graphbuilder/target/graphbuilder-0.0.1-SNAPSHOT-hadoop-job.jar com.intel.hadoop.graphbuilder.demoapps.wikipedia.docwordgraph.TFIDFGraphEnd2End 1 /user/hduser/wiki-input /user/hduser/en-wiki-articles-output ingressCode*
I get the error:
INFO docwordgraph.CreateWordCountGraph: ========== Creating Graph ================
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.filecache.DistributedCache.addFileToClassPath(Lorg/apache/hadoop/fs/Path;Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/fs/FileSystem;)V
at com.intel.hadoop.graphbuilder.util.FsUtil.distributedTempClassToClassPath(FsUtil.java:46)
at com.intel.hadoop.graphbuilder.job.AbstractPreprocessJob.run(AbstractPreprocessJob.java:111)
at com.intel.hadoop.graphbuilder.demoapps.wikipedia.docwordgraph.CreateWordCountGraph.main(CreateWordCountGraph.java:77)
at com.intel.hadoop.graphbuilder.demoapps.wikipedia.docwordgraph.TFIDFGraphEnd2End.main(TFIDFGraphEnd2End.java:69)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Can you please guide me on how to solve this issue? Thanks.
It looks like you're running a back-level version of Hadoop. Check which Hadoop version your version of GraphBuilder requires and make sure that is the version you're running.
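A quick way to confirm what you are actually running:
# print the Hadoop version on the machine launching the job
hadoop version
Compare that against the version listed in the GraphBuilder documentation or its build configuration.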
I am facing some issues while trying to launch a MapReduce job on our Hadoop cluster from Eclipse. I added a folder named "conf" as a class folder, and under that folder I imported "core-site.xml", "hdfs-site.xml", "mapred-site.xml", and "hbase-site.xml". My Hadoop cluster runs Hadoop 0.20.205.0 and HBase 0.94.1. We are able to successfully submit jobs to the cluster using the "hadoop jar" command. Since this is very cumbersome, I want to set up Eclipse so that I can submit Hadoop jobs to the cluster just by running the program.
After adding the required dependencies to the project, I get the following exception when I run the "PiEstimator.java" example (from Hadoop 0.20.205.0).
Number of Maps = 4
Samples per Map = 4
Exception in thread "main" org.apache.hadoop.ipc.RemoteException: java.io.IOException: java.lang.NoSuchMethodException: org.apache.hadoop.hdfs.protocol.ClientProtocol.create(java.lang.String, org.apache.hadoop.fs.permission.FsPermission, java.lang.String, boolean, boolean, short, long)
at java.lang.Class.getMethod(Class.java:1632)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:557)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
at org.apache.hadoop.ipc.Client.call(Client.java:1066)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
at com.sun.proxy.$Proxy1.create(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
at com.sun.proxy.$Proxy1.create(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.<init>(DFSClient.java:3245)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:713)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:182)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:555)
at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:892)
at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:393)
at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:284)
at com.amazon.seo.mapreduce.examples.PiEstimator.estimate(PiEstimator.java:265)
at com.amazon.seo.mapreduce.examples.PiEstimator.run(PiEstimator.java:325)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at com.amazon.seo.mapreduce.examples.PiEstimator.main(PiEstimator.java:333)
Can you please help me understand what part of my setup is wrong and how to fix it?
Were you able to resolve this issue?
I resolved a similar class-definition error as follows:
Create the jar as a runnable JAR file (File >> Export >> Runnable JAR file).
export HADOOP_CLASSPATH=<path to your runnable jar>
This makes Hadoop pick up the correct class from your jar file.
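Putting it together, the flow might look like this (a sketch; the jar path is hypothetical, and the main class and arguments are taken from the stack trace above):
# point Hadoop at the runnable jar exported from Eclipse
export HADOOP_CLASSPATH=/home/user/PiEstimator.jar
# submit the job with that jar
hadoop jar /home/user/PiEstimator.jar com.amazon.seo.mapreduce.examples.PiEstimator 4 4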
Unfortunately, I believe that you will have to upgrade your Hadoop version to at least 2.5.0.