I am running a simple MapReduce job and am getting the following error:
Exception in thread "main" java.io.IOException: Error opening job jar: Test.jar
at org.apache.hadoop.util.RunJar.main(RunJar.java:90)
Caused by: java.util.zip.ZipException: error in opening zip file
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:114)
at java.util.jar.JarFile.<init>(JarFile.java:133)
at java.util.jar.JarFile.<init>(JarFile.java:70)
at org.apache.hadoop.util.RunJar.main(RunJar.java:88)
Some details of the problem:
My Hadoop version is 0.20.
I have set new JobConf(Statecount.class) where Statecount.class is the class from which I am running this job. What do I have to do to resolve this error?
Can anyone help me?
Thanks.
Check that the Hadoop user (usually 'hadoop') has permission to read this file.
Sometimes Hadoop needs certain files to be on HDFS rather than on your local file system.
Are you trying to run a jar named Test.jar via the Java program RunJar?
If so, remember that any local path you use can only be on the name node.
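A quick way to check whether the jar itself is intact is to open it as a jar archive; if the file is corrupt or truncated, this throws the same ZipException. A minimal sketch, assuming the jar sits in the current directory:

import java.util.jar.JarFile;

public class JarCheck {
    public static void main(String[] args) throws Exception {
        // Opening the jar forces the zip headers to be parsed; a corrupt
        // or truncated file fails here with java.util.zip.ZipException.
        try (JarFile jar = new JarFile("Test.jar")) {
            System.out.println("Jar is readable, entries: " + jar.size());
        }
    }
}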
I created the credentials like this:
Configuration conf = new Configuration();
conf.set("fs.s3a.impl", org.apache.hadoop.fs.s3a.S3AFileSystem.class.getName());
conf.set("fs.s3a.access.key", "ACCESS_ID");
conf.set("fs.s3a.secret.key", "SECRET VALUE");
conf.set("fs.s3a.endpoint", "s3.us-east-1.amazonaws.com");
conf.set("fs.s3a.aws.credentials.provider", SimpleAWSCredentialsProvider.NAME);
conf.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
and the Path:
Path path = new Path("s3a://<backet>/<directory>/my.parquet");
When I tried to write the Parquet data:
try (ParquetWriter<GenericData.Record> writer = AvroParquetWriter.<GenericData.Record>builder(path)
        .withSchema(avroSchema)
        .withConf(conf)
        .withCompressionCodec(SNAPPY)
        .withWriteMode(OVERWRITE)
        .build()) {
    writer.write(record); // record: a GenericData.Record matching avroSchema
}
The following error occurred:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.metrics2.lib.MetricsRegistry.newCounter(Ljava/lang/String;Ljava/lang/String;J)Lorg/apache/hadoop/metrics2/lib/MutableCounterLong;
at org.apache.hadoop.fs.s3a.S3AInstrumentation.counter(S3AInstrumentation.java:183)
at org.apache.hadoop.fs.s3a.S3AInstrumentation.counter(S3AInstrumentation.java:206)
at org.apache.hadoop.fs.s3a.S3AInstrumentation.<init>(S3AInstrumentation.java:160)
at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:174)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1446)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1464)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:263)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:187)
at org.apache.parquet.hadoop.ParquetFileWriter.<init>(ParquetFileWriter.java:209)
at org.apache.parquet.hadoop.ParquetWriter.<init>(ParquetWriter.java:266)
at org.apache.parquet.hadoop.ParquetWriter$Builder.build(ParquetWriter.java:489)
at Main.main(Main.java:64)
These are my dependencies:
org.apache.avro:avro:1.8.2, org.apache.hadoop:hadoop-core:1.2.1, org.apache.parquet:parquet-hadoop:1.8.1, org.apache.parquet:parquet-avro:1.8.1, org.apache.hadoop:hadoop-hdfs:3.2.1, org.apache.hadoop:hadoop-aws:2.8.2
Do you know how to resolve this?
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.metrics2.lib.MetricsRegistry.newCounter(Ljava/lang/String;Ljava/lang/String;J)Lorg/apache/hadoop/metrics2/lib/MutableCounterLong;
The MetricsRegistry class is part of the hadoop-common library. You are missing it from the list of dependencies.
Make sure the version of this jar is the same as that of the hadoop-aws jar.
It is also recommended to use the same version for all the Hadoop dependency jars.
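To confirm a version clash, you can print which jar the MetricsRegistry class was actually loaded from; a hedged diagnostic sketch:

// Prints the jar that supplied MetricsRegistry; if it does not match the
// hadoop-common version expected by hadoop-aws, the versions are mixed.
System.out.println(org.apache.hadoop.metrics2.lib.MetricsRegistry.class
        .getProtectionDomain().getCodeSource().getLocation());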
Update:
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
You are trying to use the Hadoop libraries to write to S3. In that case, Hadoop will attempt to use the local filesystem for creating temporary files. Since this is Windows, you need the winutils binaries.
Download the winutils.exe binary and other required files as per your Hadoop version from here.
Set the environment variable %HADOOP_HOME% to point to the directory where these binaries were installed.
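If setting the environment variable is inconvenient, the same location can be supplied from code before the first Hadoop call; a minimal sketch, assuming the binaries were unpacked to C:\hadoop (so winutils.exe is at C:\hadoop\bin\winutils.exe):

// Must be set before any Hadoop class initializes its shell utilities.
System.setProperty("hadoop.home.dir", "C:\\hadoop");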
Read more: Hadoop2 - Windows Problems
I want to verify a CAP file from cmd on Windows using the verifycap.bat tool.
I used the following command:
C:\Users\Mohammad\Desktop\VERIFY CAP\verifycap\bin>verifycap.bat imp.exp imp.cap > imp.hash
but the following exception occurs:
java.util.zip.ZipException: error in opening zip file
Can anyone help me with resolving this error, please?
I am trying to copy a file from HDFS to the local Linux file system using the Hadoop FileSystem class.
I have permission to create folders in the path I am copying to; I checked using the mkdir command.
I also tried the shell command hadoop fs -copyToLocal hdfsFilePath localFilePath, and it worked.
I am running this on a YARN cluster.
I tried the approaches below, but I am getting the java.io.IOException: Mkdirs failed to create file:/home/user error.
Error log:
16/01/14 01:09:36 ERROR util.FileUtil:
java.io.IOException: Mkdirs failed to create /home/user (exists=false, cwd=file:/hdfs4/yarn/nm/usercache/user/appcache/application_1452126203792_8862/container_e2457_1452126203792_8862_01_000001)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:442)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:428)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:908)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:889)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:786)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:365)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:338)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:289)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1970)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1939)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1915)
at com.batch.util.FileUtil.copyToLocalFileSystem(FileUtil.java:66)
at com.batch.dao.impl.DaoImpl.writeFile(DaoImpl.java:108)
at com.batch.JobDriver.runJob(JobDriver.java:79)
at com.batch.JobDriver.main(JobDriver.java:54)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:480)
Actually, I am passing localFilePath as /home/user/test, but I am getting an error saying it failed to create file:/home/user.
fs.copyToLocalFile(hdfsFilePath, localFilePath);
fs.copyToLocalFile(false, hdfsFilePath, localFilePath, true);
This week I faced the same thing. The problem was that I was deploying the job in cluster mode, so the machine where the job ran did not have that directory created. Is it possible you are deploying the job in cluster mode? If so, try deploying it in client mode (the output directory has to exist, though).
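If you have to stay in cluster mode, one workaround (a hedged sketch, not part of this answer; fs and the paths are taken from the question) is to create the local directory on the executing node yourself before calling the four-argument copyToLocalFile overload the question already uses:

// Pre-create the destination's parent directory on whichever node runs this.
java.io.File parent = new java.io.File("/home/user/test").getParentFile();
if (!parent.exists() && !parent.mkdirs()) {
    throw new java.io.IOException("Could not create " + parent);
}
// delSrc=false keeps the HDFS copy; useRawLocalFileSystem=true avoids .crc files.
fs.copyToLocalFile(false, hdfsFilePath, localFilePath, true);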
For anyone looking for this exact error, but maybe not from YARN:
I had this exact error when trying to run org.apache.hadoop.fs.FileSystem.copyToLocalFile on my local (Mac) machine, with local FS configured using the job.local.dir attribute.
This was the exception:
java.io.IOException: Mkdirs failed to create file:/User/yossiv/algo-resources/AWS/QuerySearchEngine.blacklistVersionFile (exists=false, cwd=file:/Users/yossiv/git/c2s-algo)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:456)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:441)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:928)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:909)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:806)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:368)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:341)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:292)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2066)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2035)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2011)
What fixed it was to change job.local.dir to be under the current directory, which is listed in the exception text after cwd=. In my case that's /Users/yossiv/git/c2s-algo.
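In code, that change looks roughly like this (a hedged sketch; the subdirectory name is an assumption):

// Keep job.local.dir under the process's working directory (the cwd=
// value from the exception) so the local file system can create it.
Configuration conf = new Configuration();
conf.set("job.local.dir", System.getProperty("user.dir") + "/local-work");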
I broke my head over this for two days; hope this helps someone.
Is there a way to simply run a Python script in Apache Storm?
I'm trying to figure out how to use Storm to run scripts, but am having trouble. It seems like I need to create a Java program that calls the script and uses it as a bolt, but I simply want to send a very basic Python script to Storm to see whether it is possible.
I read that the following command is helpful for submitting topologies to Storm, but I am having trouble understanding the syntax, and whether I can submit arbitrary Python code to Storm or whether it needs a specific structure.
Can someone clarify whether or not I can submit any Python script to Storm, and if so, what the following line of code means?
storm shell resources/ python topology.py arg1 arg2
When I try to submit a basic Python script using the above command, I get the following output.
956 [main] INFO backtype.storm.StormSubmitter - Uploading topology jar stormshell8691441.jar to assigned location: /home/scix3/apache/storm/data/nimbus/inbox/stormjar-ae0739f9-7c93-4f00-a02b-c4eceba3b005.jar
966 [main] INFO backtype.storm.StormSubmitter - Successfully uploaded topology jar to assigned location: /home/scix3/apache/storm/data/nimbus/inbox/stormjar-ae0739f9-7c93-4f00-a02b-c4eceba3b005.jar
Exception in thread "main" java.io.IOException: Cannot run program "simple.py" (in directory "."): error=2, No such file or directory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
at java.lang.Runtime.exec(Runtime.java:617)
at org.apache.commons.exec.launcher.Java13CommandLauncher.exec(Java13CommandLauncher.java:58)
at org.apache.commons.exec.DefaultExecutor.launch(DefaultExecutor.java:254)
at org.apache.commons.exec.DefaultExecutor.executeInternal(DefaultExecutor.java:319)
at org.apache.commons.exec.DefaultExecutor.execute(DefaultExecutor.java:160)
at org.apache.commons.exec.DefaultExecutor.execute(DefaultExecutor.java:147)
at backtype.storm.util$exec_command_BANG_.invoke(util.clj:386)
at backtype.storm.command.shell_submission$_main.doInvoke(shell_submission.clj:29)
at clojure.lang.RestFn.applyTo(RestFn.java:139)
at backtype.storm.command.shell_submission.main(Unknown Source)
Caused by: java.io.IOException: error=2, No such file or directory
at java.lang.UNIXProcess.forkAndExec(Native Method)
at java.lang.UNIXProcess.<init>(UNIXProcess.java:186)
at java.lang.ProcessImpl.start(ProcessImpl.java:130)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
... 10 more
The exact command I'm using (possibly incorrect) is storm shell resources/ simple.py
simple.py is merely a print 'Hello, world' script.
I'm using storm version 0.9.4
Yes, you can run Python on Storm. In fact, you can run code from just about any language on a Storm cluster; it's just a matter of implementing the multilang API.
However, there are some requirements for that to work, and as far as I can tell those requirements are not spelled out in the Storm documentation. The fastest path to getting up and running is to take the splitsentence.py example from the Storm source and run with it.
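For reference, the storm-starter pattern is a thin Java wrapper that hands tuples to the Python script over the multilang protocol; a sketch modeled on the splitsentence example for Storm 0.9.x (the script must be packaged into the topology jar's resources/ directory):

import backtype.storm.task.ShellBolt;
import backtype.storm.topology.IRichBolt;
import backtype.storm.topology.OutputFieldsDeclarer;
import backtype.storm.tuple.Fields;
import java.util.Map;

// Delegates all tuple processing to splitsentence.py via stdin/stdout.
public class SplitSentence extends ShellBolt implements IRichBolt {
    public SplitSentence() {
        super("python", "splitsentence.py");
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        declarer.declare(new Fields("word")); // fields emitted by the script
    }

    @Override
    public Map<String, Object> getComponentConfiguration() {
        return null;
    }
}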
Try pyleus (https://github.com/Yelp/pyleus) or streamparse (https://github.com/Parsely/streamparse). I recommend pyleus, as it is simpler.
I am trying to replace a jar file after it has been added to the classpath programmatically.
When I try to replace a jar that has been loaded, I get the error below.
I'm using Windows 8 and JDK 7.
Any suggestions or workarounds are highly appreciated.
Error line:
Files.copy(Paths.get(sourcePath), Paths.get(desPath), StandardCopyOption.REPLACE_EXISTING);
Exception:
java.nio.file.NoSuchFileException: D:\test\test.jar -> D:\project\plugins\test.jar
at sun.nio.fs.WindowsException.translateToIOException(WindowsException.java:79)
at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:97)
at sun.nio.fs.WindowsFileCopy.copy(WindowsFileCopy.java:205)
at sun.nio.fs.WindowsFileSystemProvider.copy(WindowsFileSystemProvider.java:278)
at java.nio.file.Files.copy(Files.java:1225)
at org.test.Test.apply(Test.java:89)
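NoSuchFileException here usually means one of the paths involved (the source file or the destination's parent directory) did not exist at the time of the copy. A hedged sketch of a defensive check before the copy, with the paths taken from the exception above:

import java.nio.file.*;

Path source = Paths.get("D:\\test\\test.jar");
Path target = Paths.get("D:\\project\\plugins\\test.jar");
// Report which path is actually missing before attempting the replace.
if (!Files.exists(source)) {
    System.err.println("Source jar is missing: " + source);
} else if (!Files.exists(target.getParent())) {
    System.err.println("Destination directory is missing: " + target.getParent());
} else {
    Files.copy(source, target, StandardCopyOption.REPLACE_EXISTING);
}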