Hi, I am new to Apache Mahout. I am getting an error while running the "classify-20newsgroups.sh" example, which automatically downloads its dataset from the internet.
Error trace:
hduser@raj-Lenovo-G550:/usr/local/mahout/examples$ bin/classify-20newsgroups.sh
Please select a number to choose the corresponding task to run
1. cnaivebayes
2. naivebayes
3. sgd
4. clean -- cleans up the work area in /tmp/mahout-work-hduser
Enter your choice : 3
ok. You chose 3 and we'll use sgd
creating work directory at /tmp/mahout-work-hduser
Downloading 20news-bydate
bin/classify-20newsgroups.sh: line 68: curl: command not found
Extracting...
tar (child): ../20news-bydate.tar.gz: Cannot open: No such file or directory
tar (child): Error is not recoverable: exiting now
tar: Child returned status 2
tar: Error is not recoverable: exiting now
Training on /tmp/mahout-work-hduser/20news-bydate/20news-bydate-train/
MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
Running on hadoop, using /usr/local/hadoop-1.2.1/bin/hadoop and HADOOP_CONF_DIR=/usr/local/hadoop-1.2.1/conf
MAHOUT-JOB: /usr/local/mahout/mahout-examples-0.9-job.jar
14/08/06 14:07:53 WARN driver.MahoutDriver: No org.apache.mahout.classifier.sgd.TrainNewsGroups.props found on classpath, will use command-line arguments only
Exception in thread "main" java.lang.NullPointerException
at org.apache.mahout.classifier.sgd.TrainNewsGroups.main(TrainNewsGroups.java:106)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
Can anybody please help?
Edit: I installed curl with sudo apt-get install curl, but when I re-ran the script I got:
hduser@raj-Lenovo-G550:/usr/local/mahout/examples$ bin/classify-20newsgroups.sh
Please select a number to choose the corresponding task to run
1. cnaivebayes
2. naivebayes
3. sgd
4. clean -- cleans up the work area in /tmp/mahout-work-hduser
Enter your choice : 3
ok. You chose 3 and we'll use sgd
creating work directory at /tmp/mahout-work-hduser
Training on /tmp/mahout-work-hduser/20news-bydate/20news-bydate-train/
MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
Running on hadoop, using /usr/local/hadoop-1.2.1/bin/hadoop and HADOOP_CONF_DIR=/usr/local/hadoop-1.2.1/conf/
MAHOUT-JOB: /usr/local/mahout/mahout-examples-0.9-job.jar
14/08/06 17:06:41 WARN driver.MahoutDriver: No org.apache.mahout.classifier.sgd.TrainNewsGroups.props found on classpath, will use command-line arguments only
Exception in thread "main" java.lang.NullPointerException
at org.apache.mahout.classifier.sgd.TrainNewsGroups.main(TrainNewsGroups.java:106)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
The problem here is that the script cannot download the 20newsgroups corpus: the curl command it uses is not installed on the operating system. Look at this line of the error output: bin/classify-20newsgroups.sh: line 68: curl: command not found.
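Note that installing curl alone is likely not enough here: the second trace has no "Downloading 20news-bydate" line, which suggests the (empty) work directory created during the first failed run makes the script skip the download, so the NullPointerException persists. A minimal cleanup sketch, assuming the work directory shown in the trace:
# Remove the stale work directory (or re-run the script and choose
# option 4, "clean"), then run the example again so it re-downloads:
rm -rf /tmp/mahout-work-hduser
bin/classify-20newsgroups.sh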
I have Hadoop:
hadoop@nodo1:/opt/hadoop$ hadoop version
Hadoop 2.7.7
Subversion Unknown -r c1aad84bd27cd79c3d1a7dd58202a8c3ee1ed3ac
Compiled by stevel on 2018-07-18T22:47Z
Compiled with protoc 2.5.0
From source with checksum 792e15d20b12c74bd6f19a1fb886490
This command was run using /opt/hadoop/share/hadoop/common/hadoop-common-2.7.7.jar
As I learned in a course, I use the wordcount example from /opt/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.7.jar. But when I run it, the following error is shown:
hadoop@nodo1:/opt/hadoop$ hadoop jar /opt/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.7.jar wordcount /libros /output3
org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://nodo1:9000/output3 already exists
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:146)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:266)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:139)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
at org.apache.hadoop.examples.WordCount.main(WordCount.java:87)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:71)
at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:74)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:226)
at org.apache.hadoop.util.RunJar.main(RunJar.java:141)
In the input path I have one book:
hadoop@nodo1:/opt/hadoop$ hdfs dfs -ls /libros/
Found 1 items
-rw-r--r-- 1 hadoop supergroup 2198927 2018-11-02 10:22 /libros/quijote.txt
Thanks for your help.
First, check whether the output directory already exists:
hdfs dfs -ls /output3
MapReduce deliberately refuses to write into an existing output directory so that it never overwrites the results of a previous job. If /output3 exists, either delete the output directory hdfs://nodo1:9000/output3 (see the sketch after the command below) or use a different name:
# Change output3 to output4
hadoop jar /opt/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.7.jar wordcount /libros /output4
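If you prefer to reuse the same path, a minimal sketch of the cleanup (note this permanently deletes the directory and everything in it):
# Recursively remove the existing output directory
hdfs dfs -rm -r /output3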
My CDH 5.2 cluster has a problem running HBase MapReduce jobs.
For example, I added the HBase classpath to the Hadoop classpath:
vi /etc/hadoop/conf/hadoop-env.sh
add the line:
export HADOOP_CLASSPATH="/usr/lib/hbase/bin/hbase classpath:$HADOOP_CLASSPATH"
And when I run:
hadoop jar /usr/lib/hbase/hbase-server-0.98.6-cdh5.2.1.jar rowcounter "mytable"
I get the following exception:
14/12/09 03:44:02 WARN security.UserGroupInformation: PriviledgedActionException as:root (auth:SIMPLE) cause:java.io.FileNotFoundException: File does not exist: hdfs://clusterName/usr/lib/hbase/lib/hbase-client-0.98.6-cdh5.2.1.jar
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:54)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.io.FileNotFoundException: File does not exist: hdfs://clusterName/usr/lib/hbase/lib/hbase-client-0.98.6-cdh5.2.1.jar
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1083)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1075)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1075)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1313)
at org.apache.hadoop.hbase.mapreduce.RowCounter.main(RowCounter.java:191)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:145)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:153)
I have the same problem with CDH 5.2.0. As a workaround, I manually copied the jar file into HDFS, and then the exceptions stopped (a sketch of the copy follows).
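A minimal sketch of that workaround, mirroring the jar at the HDFS path named in the exception (paths assumed from the trace):
# Create the expected directory on HDFS and upload the local jar there
hdfs dfs -mkdir -p /usr/lib/hbase/lib
hdfs dfs -put /usr/lib/hbase/lib/hbase-client-0.98.6-cdh5.2.1.jar /usr/lib/hbase/lib/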
So, the problem was an environment issue: when I added the jars below into /usr/lib/hadoop/lib, everything worked fine (a copy-command sketch follows the list).
hbase-client-0.98.6-cdh5.2.1.jar
hbase-common-0.98.6-cdh5.2.1.jar
hbase-protocol-0.98.6-cdh5.2.1.jar
hbase-server-0.98.6-cdh5.2.1.jar
hbase-prefix-tree-0.98.6-cdh5.2.1.jar
hadoop-core-2.5.0-mr1-cdh5.2.1.jar
htrace-core-2.04.jar
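A sketch of that copy, assuming the jars sit in their usual CDH locations (adjust the paths to your installation):
# Copy the HBase and htrace jars into Hadoop's lib directory
cp /usr/lib/hbase/lib/hbase-{client,common,protocol,server,prefix-tree}-0.98.6-cdh5.2.1.jar \
   /usr/lib/hbase/lib/htrace-core-2.04.jar /usr/lib/hadoop/lib/
# hadoop-core-2.5.0-mr1-cdh5.2.1.jar ships with the MR1 package; copy it
# from wherever it lives on your system as well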
My machine has the following rpms in it:
>> rpm -qa | grep cdh
zookeeper-3.4.5+cdh5.2.1+84-1.cdh5.2.1.p0.13.el6.x86_64
hadoop-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64
hadoop-0.20-mapreduce-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64
hbase-regionserver-0.98.6+cdh5.2.1+64-1.cdh5.2.1.p0.9.el6.x86_64
cloudera-cdh-5-0.x86_64
bigtop-utils-0.7.0+cdh5.2.1+0-1.cdh5.2.1.p0.13.el6.noarch
bigtop-jsvc-0.6.0+cdh5.2.1+578-1.cdh5.2.1.p0.13.el6.x86_64
parquet-1.5.0+cdh5.2.1+38-1.cdh5.2.1.p0.12.el6.noarch
hadoop-hdfs-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64
hadoop-mapreduce-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64
hadoop-0.20-mapreduce-tasktracker-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64
hbase-0.98.6+cdh5.2.1+64-1.cdh5.2.1.p0.9.el6.x86_64
avro-libs-1.7.6+cdh5.2.1+69-1.cdh5.2.1.p0.13.el6.noarch
parquet-format-2.1.0+cdh5.2.1+6-1.cdh5.2.1.p0.14.el6.noarch
hadoop-yarn-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64
hadoop-hdfs-datanode-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64
I still wonder which rpm is missing.
Instead of adding the file manually to HDFS, you can set the HBase library path in your .bashrc file: add HBase's lib folder to CLASSPATH, and add HBase's classpath to HADOOP_CLASSPATH.
Your .bashrc file should contain the following:
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:`${HBASE_HOME}/bin/hbase classpath`
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:`${HBASE_HOME}/bin/hbase mapredcp`
export CLASSPATH=${HBASE_HOME}/lib/*
Note: CLASSPATH should point to the lib folder of your HBase installation. Use the following to compile and run your Java code:
javac Example.java
java -classpath $CLASSPATH:. Example
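Note that the exports above assume HBASE_HOME is already set; if it is not, define it first (the path here is only an example, adjust it to your installation):
export HBASE_HOME=/usr/lib/hbase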
I am running GraphBuilder with Hadoop (single-node). I have followed the tutorial "https://01.org/graphbuilder/documentation/how-run-demo-application". However, when I run the command
bin/hadoop jar /usr/local/graphbuilder/target/graphbuilder-0.0.1-SNAPSHOT-hadoop-job.jar com.intel.hadoop.graphbuilder.demoapps.wikipedia.docwordgraph.TFIDFGraphEnd2End 1 /user/hduser/wiki-input /user/hduser/en-wiki-articles-output ingressCode*
I get this error:
INFO docwordgraph.CreateWordCountGraph: ========== Creating Graph ================
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.filecache.DistributedCache.addFileToClassPath(Lorg/apache/hadoop/fs/Path;Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/fs/FileSystem;)V
at com.intel.hadoop.graphbuilder.util.FsUtil.distributedTempClassToClassPath(FsUtil.java:46)
at com.intel.hadoop.graphbuilder.job.AbstractPreprocessJob.run(AbstractPreprocessJob.java:111)
at com.intel.hadoop.graphbuilder.demoapps.wikipedia.docwordgraph.CreateWordCountGraph.main(CreateWordCountGraph.java:77)
at com.intel.hadoop.graphbuilder.demoapps.wikipedia.docwordgraph.TFIDFGraphEnd2End.main(TFIDFGraphEnd2End.java:69)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Can you please guide me on how to solve this issue? Thanks.
It looks like you're running a back-level version of Hadoop: a NoSuchMethodError at runtime means the DistributedCache class on your classpath doesn't have the addFileToClassPath signature that GraphBuilder was compiled against. Check which Hadoop version your version of GraphBuilder needs and make sure that's the version you're running.
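To see what is actually being picked up at runtime, you can check the version and classpath directly (both are standard Hadoop commands):
# Print the Hadoop version in use and the classpath the launcher builds
hadoop version
hadoop classpath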
When I try to run my JUnit tests with coverage, I receive the following error:
FATAL ERROR in native method: processing of -javaagent failed
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at sun.instrument.InstrumentationImpl.loadClassAndStartAgent(InstrumentationImpl.java:382)
at sun.instrument.InstrumentationImpl.loadClassAndCallPremain(InstrumentationImpl.java:397)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.coverage.main.CoveragePremain.premain(CoveragePremain.java:50)
... 6 more
Caused by: java.lang.ArrayIndexOutOfBoundsException: 1
at com.intellij.rt.coverage.instrumentation.Instrumentator.premain(Instrumentator.java:40)
... 11 more
Exception in thread "main"
Process finished with exit code 1
Can anybody help me fix it?
Take a look here:
https://youtrack.jetbrains.com/issue/IDEABKL-5941
You can add a -Djava.io.tmpdir parameter to IDEA's launch script, or change the TMP system property. You will probably also have to change idea.config.path and idea.system.path in idea.properties in the installation directory.
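A minimal sketch of those idea.properties overrides (the paths are hypothetical; point them at a short, writable location):
# in <IDEA installation>/bin/idea.properties
idea.config.path=/home/youruser/.idea-config
idea.system.path=/home/youruser/.idea-system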
I have a core.31690 file which I want to convert to hprof. I tried:
jmap -dump:format=b,file=mydump.hprof /usr/bin/java core.31690
but I got the following:
Attaching to core core.31690 from executable /usr/bin/java, please wait...
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at sun.tools.jmap.JMap.runTool(JMap.java:179)
at sun.tools.jmap.JMap.main(JMap.java:110)
Caused by: sun.jvm.hotspot.debugger.UnmappedAddressException
at sun.jvm.hotspot.debugger.PageCache.checkPage(PageCache.java:208)
at sun.jvm.hotspot.debugger.PageCache.getData(PageCache.java:63)
at sun.jvm.hotspot.debugger.DebuggerBase.readBytes(DebuggerBase.java:217)
at sun.jvm.hotspot.debugger.linux.LinuxDebuggerLocal.readCInteger(LinuxDebuggerLocal.java:482)
at sun.jvm.hotspot.debugger.linux.LinuxAddress.getCIntegerAt(LinuxAddress.java:69)
at sun.jvm.hotspot.utilities.CStringUtilities.getString(CStringUtilities.java:61)
at sun.jvm.hotspot.HotSpotTypeDataBase.readVMTypes(HotSpotTypeDataBase.java:128)
at sun.jvm.hotspot.HotSpotTypeDataBase.<init>(HotSpotTypeDataBase.java:85)
at sun.jvm.hotspot.bugspot.BugSpotAgent.setupVM(BugSpotAgent.java:568)
at sun.jvm.hotspot.bugspot.BugSpotAgent.go(BugSpotAgent.java:494)
at sun.jvm.hotspot.bugspot.BugSpotAgent.attach(BugSpotAgent.java:348)
at sun.jvm.hotspot.tools.Tool.start(Tool.java:169)
at sun.jvm.hotspot.tools.HeapDumper.main(HeapDumper.java:77)
... 6 more
I tried looking online but couldn't find anything. Any pointers?