All I am trying to do is create a directory in HDFS programmatically using Java, but I am getting this error:
Exception in thread "main" java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.s3a.S3AFileSystem could not be instantiated
Caused by: java.lang.NoClassDefFoundError: com/amazonaws/AmazonServiceException
Caused by: java.lang.ClassNotFoundException: com.amazonaws.AmazonServiceException
Not sure where all this Amazon S3 stuff came from. Please help.
Here's the code. This is Hadoop 2.7.
package tas.module1;

import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class JavaClient {

    public JavaClient() {
    }

    public static void main(String[] args) throws IOException {
        JavaClient jc = new JavaClient();

        // Load the cluster configuration and pin the HDFS/local filesystem implementations.
        Configuration config = new Configuration();
        config.addResource(new Path("/usr/local/hadoop-2.7.1/etc/hadoop/core-site.xml"));
        config.addResource(new Path("/usr/local/hadoop-2.7.1/etc/hadoop/hdfs-site.xml"));
        config.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
        config.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());

        // Create a directory under the current working directory.
        FileSystem dfs = FileSystem.get(config);
        String dirName = "TestDirectory";
        System.out.println(dfs.getWorkingDirectory() + " this is from /n/n");
        Path src = new Path(dfs.getWorkingDirectory() + "/" + dirName);
        dfs.mkdirs(src);
        System.out.println("created dir");

        dfs.close();
    }
}
Ah, this is a bug fixed a while back in HADOOP-12636, which relates to the Java service API and classpaths. Hadoop 2.7.2 enumerates all available filesystem implementation classes in the JARs, and fails here due to transitive classpath problems.
If you drop the hadoop-aws JAR from your classpath this will go away, or just upgrade to Hadoop 2.7.3.
It seems that you are missing a dependency needed to work with the S3 file system. To work with it, you need the AWS Java SDK JARs deployed on your cluster. You can download aws-java-sdk from http://sdk-for-java.amazonwebservices.com/latest/aws-java-sdk.zip. Then you need to unzip it and copy every JAR in aws-java-sdk/lib/ and aws-java-sdk/third-party/ to your datanodes.
Another option is to create an uber JAR and include this dependency directly in your JAR through Maven:
<!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk -->
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk</artifactId>
    <version>1.11.91</version>
</dependency>
Related
I have:
Oracle 19c
Java 8 on its machine
What I did:
I wrote a simple class with one method in Java 8.
package <mypackage>;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.networknt.schema.JsonSchema;
import com.networknt.schema.JsonSchemaFactory;
import com.networknt.schema.SpecVersion;
import com.networknt.schema.ValidationMessage;
import java.io.IOException;
import java.io.Reader;
import java.sql.Clob;
import java.sql.SQLException;
import java.util.Set;
import java.util.stream.Collectors;
public class <classname> {
<some simple validation function>
}
I compile my project with Maven and the maven-assembly-plugin to build a .jar file with dependencies.
I upload it with the loadjava tool: loadjava -f -r -v -synonym -oracleresolver -resolve -grant <user> -thin -user <credentials> <filename>.jar
There were 0 errors during upload. All uploaded classes (including dependencies) have 'VALID' status in the dba_objects table.
I wrote a PL/SQL wrapper over my Java function.
create FUNCTION <funcname>(P_IN_BODY_TEXT CLOB, P_IN_BODY_SCHEMA_TEXT CLOB)
RETURN VARCHAR2
AS LANGUAGE JAVA NAME '<packagename>.<classname>.<funcname>(java.sql.Clob, java.sql.Clob) return java.lang.String';
/
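For reference, here is a minimal sketch of a Java method that such a call spec could bind to, built only from the imports shown above; the package, class, and method names and the JSON Schema draft version are placeholders, not the actual uploaded code:

// Sketch only: placeholder names, assumed schema draft; matches the wrapper's (Clob, Clob) -> String shape.
package mypackage;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.networknt.schema.JsonSchema;
import com.networknt.schema.JsonSchemaFactory;
import com.networknt.schema.SpecVersion;
import com.networknt.schema.ValidationMessage;
import java.io.IOException;
import java.sql.Clob;
import java.sql.SQLException;
import java.util.Set;
import java.util.stream.Collectors;

public class JsonValidator {                                   // placeholder class name

    // Must be static so the PL/SQL call spec can bind to it.
    public static String validate(Clob bodyText, Clob bodySchemaText)
            throws IOException, SQLException {
        ObjectMapper mapper = new ObjectMapper();
        JsonNode schemaNode = mapper.readTree(bodySchemaText.getCharacterStream());
        JsonNode bodyNode = mapper.readTree(bodyText.getCharacterStream());

        JsonSchema schema = JsonSchemaFactory
                .getInstance(SpecVersion.VersionFlag.V7)       // assumed draft version
                .getSchema(schemaNode);

        // Collect all validation messages into a single result string.
        Set<ValidationMessage> errors = schema.validate(bodyNode);
        return errors.stream()
                .map(ValidationMessage::getMessage)
                .collect(Collectors.joining("; "));
    }
}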
I use this function in my ORDS REST service.
When making a request to ORDS, I get this exception:
The request could not be processed because an error occurred whilst attempting to evaluate
the SQL statement associated with this resource.
Please check the SQL statement is correctly formed and executes without error. SQL Error Code: 29532,
Error Message: ORA-29532: Java call terminated by uncaught Java exception:
java.lang.NoClassDefFoundError ORA-06512: at <rest stacktrace that gives me nothing>
Question is:
What is the root of this problem? The -synonym flag makes the tool create synonyms for me, and all the classes are valid. Do I need some permissions to use Java packages like java.sql that my class imports? I have uploaded some other open-source Java libraries into my Oracle before, but they don't have dependencies; is that the problem?
The problem was in the slf4j library, which throws this exception. slf4j was a dependency of the library I used.
I didn't dig into the problem; I just picked another library with fewer dependencies and it works.
I am new to Java and Docker, so this may be very simple.
The program reads user input and passes it to a function that does a DFS (depth-first search) for broken links with a depth limit.
import java.util.Scanner;

public class CrawlerTest {

    public static void main(String[] args) {
        Scanner reader = new Scanner(System.in);

        System.out.println("Enter full website url to crawl, starting with http://");
        String domain = reader.next();

        System.out.println("Enter max crawl depth: ");
        int maxDepth = reader.nextInt();
        reader.close();

        Crawler crawler = new Crawler();
        crawler.crawl(domain, maxDepth);
    }
}
and the Crawler class imports the following libraries
import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Stack;
import org.jsoup.Connection;
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;
import org.jsoup.select.Elements;
import javafx.util.Pair;
I exported a runnable JAR file in Eclipse.
I created a Dockerfile with the contents below:
FROM openjdk:12-alpine
WORKDIR /
ADD Test.jar Test.jar
EXPOSE 8080
CMD java -jar Test.jar
I built the Docker image with docker image build . This succeeds and I get a Docker image id.
Next, I just run this image with docker run -it.
I am prompted to enter user input, which I successfully do. Then, on hitting Enter a second time, I see the following errors, which I don't see when just running the jar file in the console:
Add --attach to the docker run command.
Without --attach, there is no console for the Java program to use, so any use of System.in will fail.
JavaFX was removed from the default Java distribution starting with JDK 11; it needs to be added explicitly via the Java module system. That's why you are getting a NoClassDefFoundError for the Pair class. Either switch to a Java version prior to 11 or remove the Pair class to resolve the issue. You can also add the JavaFX module to your module path.
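If dropping JavaFX is the route you take, a plain Java stand-in for javafx.util.Pair keeps the rest of the Crawler code unchanged; here is a minimal sketch (java.util.AbstractMap.SimpleEntry is another option, since it also exposes getKey/getValue):

// Drop-in replacement for javafx.util.Pair so the crawler no longer needs JavaFX.
public class Pair<K, V> {
    private final K key;
    private final V value;

    public Pair(K key, V value) {
        this.key = key;
        this.value = value;
    }

    public K getKey() {
        return key;
    }

    public V getValue() {
        return value;
    }
}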
I'm new to Spark, and I want to use the FP-Growth implementation found in MLlib with Java and Maven. But I get this error during compilation:
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] /home/cjd/fpgexample/src/main/java/org/fpgexample/FpgTest.java:[25,7] cannot find symbol
symbol: class Function
location: class org.fpgexample.FpgTest
These are the imports and the error line:
package org.fpgexample;
import java.util.Arrays;
import java.util.List;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.mllib.fpm.AssociationRules;
import org.apache.spark.mllib.fpm.FPGrowth;
import org.apache.spark.mllib.fpm.FPGrowthModel;
JavaRDD<List<String>> transactions = data.map(new Function<String, List<String>>() {
I updated the Maven compiler plugin to 3.3 (using JDK 1.7), and spark-core and mllib to 2.11 (version 1.5.1). (With mllib 2.10, version 1.4, Maven didn't recognize fpm.AssociationRules.)
EDIT: I changed the Maven compiler, pom.xml, and JDK from 1.7 to 1.8, but the error persists.
You should import the correct package as below:
import org.apache.spark.api.java.function.Function;
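With that import in place, the anonymous Function from the question compiles. Here is a minimal sketch of the surrounding setup, assuming the usual FP-Growth-style input where each line is a space-separated transaction (the class name, master URL, and input file name are assumptions):

import java.util.Arrays;
import java.util.List;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

public class FpgImportExample {                     // hypothetical class name
    public static void main(String[] args) {
        // Assumed local setup; adjust the master and app name as needed.
        SparkConf conf = new SparkConf().setAppName("FpgImportExample").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Assumed input file: one space-separated transaction per line.
        JavaRDD<String> data = sc.textFile("sample_fpgrowth.txt");

        // With org.apache.spark.api.java.function.Function imported, this map call compiles.
        JavaRDD<List<String>> transactions = data.map(new Function<String, List<String>>() {
            public List<String> call(String line) {
                return Arrays.asList(line.split(" "));
            }
        });

        System.out.println("transactions: " + transactions.count());
        sc.stop();
    }
}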
I have been given a Java app to modify that contains the following imports:
import javax.xml.soap.MessageFactory;
import javax.xml.soap.SOAPException;
import javax.xml.soap.SOAPMessage;
import javax.xml.soap.SOAPPart;
But when I run it from cmd it throws the error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/axis/SOAPPart
It seems to find the other imports without issue. Does 'NoClassDefFoundError' mean the class itself wasn't found? If so, how can I check that it is there, and replace it if it is not? (I am using Eclipse.)
Update: Okay, I have found the class in 'JRE System Library > rt.jar > javax.xml.soap > SOAPPart.class'. So if the class is there, why do I get the error?
Turns out I needed to choose Export > 'Runnable JAR file' in Eclipse instead of 'JAR file'.
import javax.smartcardio.Card;
import javax.smartcardio.CardChannel;
import javax.smartcardio.CardException;
import javax.smartcardio.CardTerminal;
import javax.smartcardio.CommandAPDU;
import javax.smartcardio.ResponseAPDU;
import javax.smartcardio.TerminalFactory;
TerminalFactory terminalFactory = TerminalFactory.getDefault();
I want to use USB host to communicate with a smart card on an Android pad,
but why do I get java.lang.NoClassDefFoundError: javax.smartcardio.TerminalFactory?
I already import the classes......
and I don't know how to package this lib into the app.
When you did the development, the required JAR which contains
javax.smartcardio.TerminalFactory
is on the classpath.
But when you did installation/deployment/packaging, did you ensure that the required JAR is also packaged along with the app?
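One quick way to confirm that is a runtime check before touching the API; here is a small diagnostic sketch (the class name is hypothetical):

// Hypothetical diagnostic: is javax.smartcardio actually present on the runtime classpath?
public class SmartcardioCheck {
    public static void main(String[] args) {
        try {
            Class.forName("javax.smartcardio.TerminalFactory");
            System.out.println("javax.smartcardio.TerminalFactory is available at runtime");
        } catch (ClassNotFoundException e) {
            System.out.println("javax.smartcardio.TerminalFactory is NOT available at runtime: " + e);
        }
    }
}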