NoClassDefFoundError in Hadoop - Java

I am using Hadoop version 2.4.1. I am trying to run a MapReduce job that moves data from the local file system to an HDFS cluster (the output directory). If I set the output directory to a path on my local file system, the program runs fine. But when I set the output directory to a path on the HDFS cluster, I get the error below:
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/protobuf/ServiceException
at org.apache.hadoop.ipc.ProtobufRpcEngine.<clinit>(ProtobufRpcEngine.java:69)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:1834)
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1799)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1893)
at org.apache.hadoop.ipc.RPC.getProtocolEngine(RPC.java:203)
at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:537)
at org.apache.hadoop.hdfs.NameNodeProxies.createNNProxyWithClientProtocol(NameNodeProxies.java:328)
at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:235)
at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:139)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:510)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:453)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:136)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2397)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2431)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2413)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.setOutputPath(FileOutputFormat.java:160)
at s1.run(s1.java:66)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at s1.main(s1.java:75)
Caused by: java.lang.ClassNotFoundException: com.google.protobuf.ServiceException
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 25 more
I saw some posts stating that the issue could be related to the protobuf dependency.
Hadoop 2.2.0 mapreduce job not running after upgrading from hadoop 1.0.4
I am using the hadoop-common 2.5.2 jar, which includes the protobuf classes. Any help solving this would be appreciated.

Got it working! I found that there were some 2.2-version jars that were incompatible with the current version. When I updated those, the program worked fine.
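For anyone chasing a similar version mismatch, a quick way to see which jar a class is actually loaded from is to ask its code source at runtime. This is only an illustrative sketch (the default class name is the protobuf class from the stack trace; pass any other fully qualified name as an argument):

// Illustrative sketch: print which jar on the classpath provides a given class.
public class WhichJar {
    public static void main(String[] args) throws Exception {
        String name = args.length > 0 ? args[0] : "com.google.protobuf.ServiceException";
        Class<?> c = Class.forName(name);
        // getCodeSource() can be null for JDK classes; for classes loaded from jars it points at the jar file.
        System.out.println(name + " loaded from: "
                + c.getProtectionDomain().getCodeSource().getLocation());
    }
}

Running this on the job's classpath shows whether the class comes from an old 2.2 jar or the intended one.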

If you compile the *.java files with the default Java CLASSPATH, it works fine.
Edit hadoop-env.sh:
export HADOOP_CLASSPATH=${CLASSPATH}
Then restart the Hadoop services.

NoClassDefFoundError is thrown by the JVM at runtime when a class that was present at compile time cannot be found on the classpath.
Check your classpath.
Also check this answer; it could be useful for resolving the NoClassDefFoundError: link
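As a minimal illustration (not from the original answers), this hypothetical snippet prints the classpath the JVM is actually searching and checks whether a given class can be loaded from it:

// Minimal sketch: inspect the runtime classpath and try to load a class by name.
public class ClasspathCheck {
    public static void main(String[] args) {
        // Show exactly which entries the JVM searches at runtime.
        System.out.println("java.class.path = " + System.getProperty("java.class.path"));
        if (args.length == 0) {
            return;
        }
        try {
            Class.forName(args[0]);
            System.out.println(args[0] + " is present on the runtime classpath.");
        } catch (ClassNotFoundException e) {
            System.out.println(args[0] + " is missing from the runtime classpath.");
        }
    }
}

Usage, for example: java ClasspathCheck com.google.protobuf.ServiceException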

Related

Jar cannot resolve org.apache.commons import

I am not particularly experienced with Java and am trying to get a jar file running on my Ubuntu machine (https://sites.google.com/site/communitydetectionslpa/home).
However, when I run the jar file with the command suggested by the developers, I receive the following error:
java -jar GANXiSw.jar -i test.ipairs
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/collections/map/MultiKeyMap
at Net.<init>(Net.java:38)
at SLPAw.<init>(SLPAw.java:146)
at SLPAw.main(SLPAw.java:2050)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.collections.map.MultiKeyMap
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 3 more
Apparently Java is not able to load the org.apache.commons class properly. After some research I checked whether I have libcommons-collections3-java installed, which is the case.
I read something about adding the library explicitly to my CLASSPATH, but I also read that this is not good practice.
What is the best approach to fix my issue?

Jenkins executor not building MasterToSlaveCallable

I am trying to build a project in Jenkins.
There is some misconfiguration, or something has been broken, and I cannot understand what.
The builds get executed, but no commands are printed in the Console Output.
It has to do with the executors. The Jenkins log is:
java.lang.NoClassDefFoundError: jenkins/security/MasterToSlaveCallable
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:792)
at jenkins.util.AntClassLoader.defineClassFromData(AntClassLoader.java:1138)
at hudson.ClassicPluginStrategy$AntClassLoader2.defineClassFromData(ClassicPluginStrategy.java:756)
at jenkins.util.AntClassLoader.getClassFromStream(AntClassLoader.java:1309)
at jenkins.util.AntClassLoader.findClassInComponents(AntClassLoader.java:1365)
at jenkins.util.AntClassLoader.findClass(AntClassLoader.java:1325)
at jenkins.util.AntClassLoader.loadClass(AntClassLoader.java:1078)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at hudson.maven.MavenModuleSetBuild.run(MavenModuleSetBuild.java:531)
at hudson.model.ResourceController.execute(ResourceController.java:89)
at hudson.model.Executor.run(Executor.java:240)
Caused by: java.lang.ClassNotFoundException: jenkins.security.MasterToSlaveCallable
at jenkins.util.AntClassLoader.findClassInComponents(AntClassLoader.java:1375)
at jenkins.util.AntClassLoader.findClass(AntClassLoader.java:1325)
at jenkins.util.AntClassLoader.loadClass(AntClassLoader.java:1078)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 12 more
Thank you
A NoClassDefFoundError indicates a likely incompatibility between a Jenkins plugin and either another plugin or Jenkins core. I would double-check the versions of core and your plugins on the Jenkins management page to make sure you're not running any incompatible versions.

Cannot run the code due to java.lang.NoClassDefFoundError

I am investigating the jsprit library. For this I just created a new project in Eclipse and copy-pasted the demo example class. Then I added all the jars to the build path, including log4j-1.2.17.jar. Nevertheless, I cannot execute the demo code due to the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/logging/log4j/LogManager
at jsprit.core.problem.vehicle.VehicleImpl$Builder.<clinit>(VehicleImpl.java:108)
at com.test.jsprit.main(jsprit.java:49)
Caused by: java.lang.ClassNotFoundException: org.apache.logging.log4j.LogManager
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 2 more
What does this actually mean, and how do I solve the issue? Does it mean that some other *.jar file is missing?
UPDATE:
The error occurs at the following line:
Builder vehicleBuilder = VehicleImpl.Builder.newInstance("vehicle");
You are using the wrong version of the log4j lib. You need the 2.x version.
See:
LogManager (Version 1.x)
LogManager (Version 2.x)
log4j-1.2.17.jar implements log4j version 1 and is pretty old. What you need is an implementation of log4j 2; add this as the dependency: link
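To make the version difference concrete, here is a minimal sketch (not part of the original answer) of the log4j 2.x API that jsprit's VehicleImpl relies on; the org.apache.logging.log4j package exists only in the 2.x jars (log4j-api plus log4j-core), not in log4j-1.2.17.jar, which is why the 1.x jar cannot satisfy the import:

// Minimal sketch, assuming log4j-api 2.x and log4j-core 2.x are on the classpath.
// log4j 1.x provides org.apache.log4j.LogManager instead, so log4j-1.2.17.jar
// cannot supply this class and the JVM throws NoClassDefFoundError.
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class Log4j2Smoke {
    private static final Logger LOG = LogManager.getLogger(Log4j2Smoke.class);

    public static void main(String[] args) {
        LOG.info("log4j 2.x is available on the classpath");
    }
}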

HBase not starting after adding the jar file with Mapper/Reducer

I am trying to write a Mapper/Reducer for HBase, and I added the jar. However, after adding the jar file to the lib directory, I cannot start HBase. I want to debug what is going wrong. How can I change the log level? Will it help?
Following is the exception:
java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMasterCommandLine$LocalHMaster
at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:143)
at org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:217)
at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:153)
at org.apache.hadoop.hbase.master.HMasterCommandLine.startMaster(HMasterCommandLine.java:224)
at org.apache.hadoop.hbase.master.HMasterCommandLine.run(HMasterCommandLine.java:139)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.hbase.util.ServerCommandLine.doMain(ServerCommandLine.java:126)
at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:2290)
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.ipc.RPC.getProtocolProxy(Ljava/lang/Class;JLjava/net/InetSocketAddress;Lorg/apache/hadoop/security/UserGroupInformation;Lorg/apache/hadoop/conf/Configuration;Ljavax/net/SocketFactory;ILorg/apache/hadoop/io/retry/RetryPolicy;Ljava/util/concurrent/atomic/AtomicBoolean;)Lorg/apache/hadoop/ipc/ProtocolProxy;
at org.apache.hadoop.hdfs.NameNodeProxies.createNNProxyWithClientProtocol(NameNodeProxies.java:420)
at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:316)
at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:178)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:665)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:601)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:148)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2625)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2607)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
at org.apache.hadoop.hbase.util.FSUtils.getRootDir(FSUtils.java:1004)
at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:562)
at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:364)
at org.apache.hadoop.hbase.master.HMasterCommandLine$LocalHMaster.<init>(HMasterCommandLine.java:307)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:139)
... 7 more
So it seems the error was due to a mismatch between the Hadoop libraries in HBase's lib directory (which were hadoop-*-2.5.1) and my actual Hadoop installation (hadoop-*-2.6.0). My jar was looking for classes that it could not find in the older version of the Hadoop libraries, which is why it was failing. This answer made me realize the issue. After I copied all the hadoop-*-2.6.0 jars into the lib directory, HBase started as expected. The same is also mentioned in the HBase-Hadoop compatibility documentation.
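If you want to confirm this kind of mismatch before swapping jars, one option (a diagnostic sketch, not part of the original fix) is to check at runtime which jar provides org.apache.hadoop.ipc.RPC and which getProtocolProxy overloads it actually exposes, and compare that against the signature in the NoSuchMethodError:

// Diagnostic sketch: run with HBase's lib directory on the classpath.
import java.lang.reflect.Method;

public class RpcVersionCheck {
    public static void main(String[] args) throws Exception {
        Class<?> rpc = Class.forName("org.apache.hadoop.ipc.RPC");
        // Shows which hadoop-common jar the class was actually loaded from.
        System.out.println("RPC loaded from: "
                + rpc.getProtectionDomain().getCodeSource().getLocation());
        // List the getProtocolProxy overloads available in that jar.
        for (Method m : rpc.getMethods()) {
            if (m.getName().equals("getProtocolProxy")) {
                System.out.println(m);
            }
        }
    }
}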

Pentaho RowListener ClassNotFoundException

I am trying to perform MapReduce on Pentaho 5. For Pentaho 5, the Pentaho applications come pre-configured for Apache Hadoop 0.20.2, and it says no further configuration is required for this version. I installed Hadoop 0.20.2 on Windows using Cygwin and everything works fine. I ran a simple job in Pentaho that copies a file into HDFS; it finished successfully and the file was copied into HDFS. But as soon as I run a MapReduce job, Pentaho says the job finished, yet the MapReduce task failed, the result is missing from the output directory on HDFS, and the log file says:
Error: java.lang.ClassNotFoundException: org.pentaho.di.trans.step.RowListener
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:762)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:807)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:833)
at org.apache.hadoop.mapred.JobConf.getMapRunnerClass(JobConf.java:790)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:354)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
at org.apache.hadoop.mapred.Child.main(Child.java:170)
Please help me out.
Maybe a little bit old, but I thought it might help someone.
This can be caused by the following:
For Hadoop versions > 0.20: check that you set up your environment correctly; see Pentaho Support: Creating a New Hadoop Configuration.
Check on HDFS whether you have /opt/pentaho/mapreduce/, and check the folder permissions on HDFS for /opt/pentaho (did you find the kettle-*.jar files in the lib folder?).
Check the classpath separator ("," on Windows, ":" on Linux). To change it, edit spoon.sh (or spoon.bat) and modify the OPT variable like this: OPT="$OPT -Dhadoop.cluster.path.separator=,"
