I built a *.jar file for my Apache Spark Scala project using Maven. When I try to execute the main class, at some line of the code it throws Exception in thread "main" java.lang.NoClassDefFoundError for the class org.apache.spark.ml.recommendation.ALS.
I run spark-submit as follows:
sudo -u hdfs /usr/bin/spark-submit --class
org.apache.spark.examples.ml.MyTest spark-examples-*.jar --rank 10 --path
/home/ubuntu/data
It looks like the only class it cannot find is org.apache.spark.ml.recommendation.ALS. I have the following import statements in the class:
package org.apache.spark.examples.ml
import java.util.Date
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.examples.mllib.AbstractParams
import org.apache.spark.ml.recommendation.ALS
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.{Row, SQLContext}
import scopt.OptionParser
How can I solve this issue?
UPDATE 1:
I added the maven-assembly-plugin to pom.xml and put assembly.xml into the resources folder. Then mvn package succeeded, but the same problem remains.
UPDATE 2:
jar -tvf spark-examples-1.5.3-SNAPSHOT-hadoop2.2.0.jar | grep "ALS"
2678 Mon May 23 13:11:44 CEST 2016 org/apache/spark/examples/ml/recommendation/MyFunc$ALSParams.class
First, find out the reason for the failure: 1) is the class missing, or 2) is it an initialization problem?
To find out whether the class is available, run jar -tvf <.jar> | grep "class name". If the class file is present in the jar, you are looking at an initialization issue; if not, the class is missing from the jar.
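The initialization case is easy to reproduce. Below is a minimal, self-contained sketch (the class names are made up for illustration, not taken from the question) showing that when a static initializer throws, the first use of the class fails with ExceptionInInitializerError and every later use fails with NoClassDefFoundError, even though the .class file is on the classpath:

public class InitFailureDemo {

    static class Broken {
        // Runs on first use of the class and throws, leaving the class in a failed state.
        static final long VALUE = init();

        static long init() {
            throw new RuntimeException("static initializer failed");
        }
    }

    public static void main(String[] args) {
        try {
            System.out.println(Broken.VALUE); // triggers static initialization
        } catch (Throwable t) {
            System.out.println("first use:  " + t); // ExceptionInInitializerError
        }
        try {
            System.out.println(Broken.VALUE); // class initialization already failed
        } catch (Throwable t) {
            System.out.println("second use: " + t); // NoClassDefFoundError
        }
    }
}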
Related
I have:
Oracle 19c
Java 8 on its machine
What I did:
I wrote a simple class with one method in Java 8.
package <mypackage>;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.networknt.schema.JsonSchema;
import com.networknt.schema.JsonSchemaFactory;
import com.networknt.schema.SpecVersion;
import com.networknt.schema.ValidationMessage;
import java.io.IOException;
import java.io.Reader;
import java.sql.Clob;
import java.sql.SQLException;
import java.util.Set;
import java.util.stream.Collectors;
public class <classname> {
<some simple validation function>
}
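For reference, a hypothetical sketch of what that simple validation function might look like, reconstructed only from the imports above; the method name validateJson, the schema draft version, and the return convention are my assumptions, not the asker's actual code:

// Hypothetical reconstruction: validate a JSON body (CLOB) against a JSON schema (CLOB).
// A static method is used because PL/SQL call specs bind to static Java methods.
public static String validateJson(Clob body, Clob schemaClob) throws SQLException, IOException {
    ObjectMapper mapper = new ObjectMapper();
    try (Reader bodyReader = body.getCharacterStream();
         Reader schemaReader = schemaClob.getCharacterStream()) {
        JsonNode json = mapper.readTree(bodyReader);
        JsonNode schemaNode = mapper.readTree(schemaReader);
        JsonSchema schema = JsonSchemaFactory
                .getInstance(SpecVersion.VersionFlag.V7) // assumed draft-07
                .getSchema(schemaNode);
        Set<ValidationMessage> errors = schema.validate(json);
        return errors.isEmpty()
                ? "OK"
                : errors.stream()
                        .map(ValidationMessage::getMessage)
                        .collect(Collectors.joining("; "));
    }
}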
I compiled my project with Maven and the maven-assembly-plugin to build a .jar file with dependencies.
I uploaded it with the loadjava tool: loadjava -f -r -v -synonym -oracleresolver -resolve -grant <user> -thin -user <credentials> <filename>.jar
There were 0 errors during upload. All uploaded classes (including dependencies) have 'VALID' status in the dba_objects table.
I wrote a PL/SQL wrapper over my Java function.
create FUNCTION <funcname>(P_IN_BODY_TEXT CLOB, P_IN_BODY_SCHEMA_TEXT CLOB)
RETURN VARCHAR2
AS LANGUAGE JAVA NAME '<packagename>.<classname>.<funcname>(java.sql.Clob, java.sql.Clob) return java.lang.String';
/
I use this function in my ORDS REST service.
When making a request to ORDS, I get this exception:
The request could not be processed because an error occurred whilst attempting to evaluate
the SQL statement associated with this resource.
Please check the SQL statement is correctly formed and executes without error. SQL Error Code: 29532,
Error Message: ORA-29532: Java call terminated by uncaught Java exception:
java.lang.NoClassDefFoundError ORA-06512: at <rest stacktrace that gives me nothing>
The question is:
What is the root of this problem? With the -synonym flag the tool creates synonyms for me, and all the classes are valid. Do I need some permissions to use Java packages like java.sql that my class imports? I have uploaded some other open-source Java libraries into my Oracle, but they don't have dependencies - is that the problem?
The problem was in the slf4j library, which throws this exception; slf4j was a dependency of the library I used.
I didn't dig into the problem; I just picked another library with fewer dependencies, and it works.
I am trying to use Quarkus native to build my AWS Lambda.
My setup is:
GraalVM 19.3.0
Java 11
Ubuntu
When I run
docker run -v /home/mypc/dev/java/quarkus/alexa_swear/target/<my project>-1.0-SNAPSHOT-native-image-source-jar:/project:z --user 1000:1000 --rm quay.io/quarkus/ubi-quarkus-native-image:19.2.1 -J-Djava.util.logging.manager=org.jboss.logmanager.LogManager --initialize-at-build-time= -H:InitialCollectionPolicy=com.oracle.svm.core.genscavenge.CollectionPolicy\$BySpaceAndTime -jar <my project>-1.0-SNAPSHOT-runner.jar -J-Djava.util.concurrent.ForkJoinPool.common.parallelism=1 -H:FallbackThreshold=0 -H:+ReportExceptionStackTraces -H:+AddAllCharsets -H:EnableURLProtocols=http -H:-JNI --no-server -H:-UseServiceLoaderFeature -H:+StackTrace <my project>-1.0-SNAPSHOT-runner
I get the following error:
[alexa_swear-1.0-SNAPSHOT-runner:23] (typeflow): 52,070.99 ms
[alexa_swear-1.0-SNAPSHOT-runner:23] (objects): 25,961.57 ms
[alexa_swear-1.0-SNAPSHOT-runner:23] (features): 803.41 ms
[alexa_swear-1.0-SNAPSHOT-runner:23] analysis: 81,015.48 ms
[alexa_swear-1.0-SNAPSHOT-runner:23] (clinit): 1,277.52 ms
[alexa_swear-1.0-SNAPSHOT-runner:23] universe: 4,416.32 ms
Error: Unsupported features in 5 methods
Detailed message:
Call path from entry point to java.lang.Runtime.traceInstructions(boolean):
at java.lang.Runtime.traceInstructions(Runtime.java)
at com.oracle.svm.reflect.Runtime_traceInstructions_91eaacf084b9d7e2af6dcc0028ee87fea9223b51_77.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Method.java:498)
at sun.net.www.protocol.http.NTLMAuthenticationProxy.isTrustedSite(NTLMAuthenticationProxy.java:102)
at sun.net.www.protocol.http.HttpURLConnection.getServerAuthentication(HttpURLConnection.java:2481)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1743)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1498)
at io.quarkus.amazon.lambda.runtime.AmazonLambdaRecorder$2.run(AmazonLambdaRecorder.java:171)
at java.lang.Thread.run(Thread.java:748)
at com.oracle.svm.core.thread.JavaThreads.threadStartRoutine(JavaThreads.java:460)
at com.oracle.svm.core.posix.thread.PosixJavaThreads.pthreadStartRoutine(PosixJavaThreads.java:193)
at com.oracle.svm.core.code.IsolateEnterStub.PosixJavaThreads_pthreadStartRoutine_e1f4a8c0039f8337338252cd8734f63a79b5e3df(generated:0) ... 6 more
Error: Image build request failed with exit status 1
The above error is truncated: the same call stack points to different unsupported methods, such as java.lang.Thread.stop.
My basic understanding is that io.quarkus.amazon.lambda.runtime.AmazonLambdaRecorder$2.run(AmazonLambdaRecorder.java:171) references some unsupported methods, such as java.lang.Thread.resume().
I have also tried with Quarkus 19.2.1 unsuccessfully.
The above command was executed by mvn clean install -Pnative -Dnative-image.docker-build=true -e.
Finally I have found the cause of my issue.
In the non-working version of my code, I use a factory of com.amazon.ask.AlexaSkill that is somehow injected in the entry point, as follows:
package io.mirko.lambda;
import com.amazon.ask.AlexaSkill;
import com.amazon.ask.Skills;
import com.amazon.ask.dispatcher.request.handler.HandlerInput;
import com.amazon.ask.dispatcher.request.handler.RequestHandler;
import com.amazon.ask.model.RequestEnvelope;
import com.amazon.ask.model.ResponseEnvelope;
import com.amazon.ask.request.interceptor.GenericRequestInterceptor;
import io.mirko.lambda.handlers.*;
import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Instance;
import javax.enterprise.inject.Produces;
import javax.inject.Inject;
import javax.inject.Named;
import java.util.*;
import java.util.stream.StreamSupport;
public class SkillFactory {
@Inject
Instance<RequestHandler> handlers;
@Produces
@ApplicationScoped
@Named
public AlexaSkill<RequestEnvelope, ResponseEnvelope> createSkill() {
return Skills.standard()
.addRequestHandlers(handlers.stream().toArray(RequestHandler[]::new))
.addRequestInterceptor(new GenericRequestInterceptor<HandlerInput>() {
@Override
public void process(HandlerInput handlerInput) {
System.out.format("Processing %s\n", handlerInput.getRequest());
}
})
// Add your skill id below
//.withSkillId("")
.build();
}
}
...
public class SwearStreamLambda extends SkillStreamHandler {
    @Named("swearStream")
    public SwearStreamLambda() {
        //noinspection unchecked
        super((AlexaSkill<RequestEnvelope, ResponseEnvelope>)
            getBean(new ParameterizedTypeImpl(AlexaSkill.class, RequestEnvelope.class, ResponseEnvelope.class)));
    }
}
By removing the SkillFactory class and moving its logic inside the SwearStreamLambda class, the compilation went well.
Some notes:
The issue is not related to CDI, as it is heavily used throughout the project
The issue is not related to the @Produces annotation, which is present in other parts of the project
The issue is not related to javax.enterprise.inject.Instance, as its removal does not solve the issue
All in all, I could not find the root cause of the problem, but I consider my issue solved.
P.S. The original problem is eradicated by building with the following:
mvn clean install -Pnative -Dnative-image.docker-build=true -Dquarkus.native.enable-jni=true
Please see https://github.com/quarkusio/quarkus/issues/6395#issuecomment-570755587.
This does not solve all the problems, as Quarkus reflection has to be configured, but it solves the specific issue.
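For the reflection configuration mentioned above, the usual Quarkus mechanism is the @RegisterForReflection annotation. A minimal sketch follows; which classes actually need to be registered is an assumption here, not something confirmed by the original post:

import io.quarkus.runtime.annotations.RegisterForReflection;

// Sketch: tell the native-image build to keep reflective access to these classes.
// The listed Alexa model classes are illustrative guesses, not a verified list.
@RegisterForReflection(targets = {
        com.amazon.ask.model.RequestEnvelope.class,
        com.amazon.ask.model.ResponseEnvelope.class
})
public class ReflectionConfiguration {
    // Intentionally empty: the annotation alone carries the configuration.
}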
Just an update for 2022:
There is an extension in Quarkus which already does some configuration for transitive dependencies: https://github.com/quarkiverse/quarkus-amazon-alex
You need GraalVM and Docker installed.
I just went to:
https://code.quarkus.io/?b=GRADLE&e=io.quarkiverse.amazonalexa%3Aquarkus-amazon-alexa&extension-search=origin:platform%20alexa
downloaded the zip
overwrote all of my existing project/skill
made small further adaptations to my existing skill as described here: https://quarkus.io/guides/writing-native-applications-tips
gradlew build -x test -Dquarkus.package.type=native -Dquarkus.native.container-build=true
uploaded functions.zip to AWS Lambda
I'm using the following package for calling Weka functions from within Matlab: https://github.com/NicholasMcCarthy/wekalab
My code is:
close all; clear all; clc;
dbstop if error
%%
javaclasspath('C:\Program Files (x86)\Weka-3-8\weka.jar');
javaaddpath('C:\Users\PC\wekafiles\packages\imageFilters\imageFilters.jar');
%%
import weka.filters.*
import weka.filters.Filter.*
import weka.filters.unsupervised.instance.imagefilter.*
import weka.filters.unsupervised.instance.imagefilter.BinaryPatternsPyramidFilter.*
import weka.classifiers.Classifier.*
import weka.classifiers.functions.SMO.*
import weka.classifiers.Evaluation.*
import weka.core.Attribute.*
import weka.core.FastVector.*
import weka.core.Instances.*
import weka.core.DenseInstance.*
import weka.classifiers.Classifier.*
import weka.classifiers.Evaluation.*
import weka.core.converters.ArffLoader.*
import weka.filters.unsupervised.instance.imagefilter.*
import weka.core.converters.ConverterUtils.*;
D = wekaLoadData('E:\pro\program\selectedPics\character\test.arff', 'ARFF');
myFilter = wekaFilter('weka.filters.unsupervised.instance.imagefilter.BinaryPatternsPyramidFilter');
filteredData = wekaApplyFilter(D, myFilter);
When I use the default filters of Weka,
myFilter = wekaFilter('weka.filters.unsupervised.attribute.Standardize');
it works fine, but when I use an installed Weka package (imageFilters) it gives me this error:
Error using javaObject
No class weka.filters.unsupervised.instance.imagefilter.BinaryPatternsPyramidFilter can be located on the Java class path
image filter (package) path: C:\Users\PC\wekafiles\packages\imageFilters
weka path: C:\Program Files (x86)\Weka-3-8
Environment variables & their paths:
CLASSPATH
C:\Program Files (x86)\Weka-3-8\weka.jar;
C:\Program Files(x86)\Weka-3-8\imageFilters\imageFilters.jar;
C:\Users\PC\wekafiles\packages\imageFilters\src\main\java;
C:\Users\PC\wekafiles\packages\imageFilters\src\main\java\filters\unsupervised\instance\imagefilter;
C:\Users\PC\wekafiles\packages\imageFilters\src\main\java\filters\unsupervised\instance;
java
C:\Program Files\Java\jre1.8.0_181\bin;
PATH
C:\Program Files\Java\jdk1.8.0_181\bin;
PATH_HOME
C:\Program Files\Java\jdk1.8.0_181;
You can use weka.Run in order to use additional packages in Weka.
On the Terminal:
Before using it you may want to add your weka.jar file into CLASSPATH:
export CLASSPATH=path_to_weka.jar
For example, a way to use this command:
java weka.Run weka.filters.unsupervised.instance.imagefilter.BinaryPatternsPyramidFilter -D "directory where files to filter are located" -i "input arff" -o "output arff"
For more info, check: Weka official documentation
In my case, I used the manual for Weka 3-7-8. The information mentioned is on page 26, under 'Running installed learning algorithms'. This could change from version to version.
Here is a list of every manual you could need: Weka manuals for every version
As for using it from Java, I don't have information about that. I wanted to answer this question because there might be people out there needing help with command-line Weka.
I have been given a Java app to modify. It has the following imports:
import javax.xml.soap.MessageFactory;
import javax.xml.soap.SOAPException;
import javax.xml.soap.SOAPMessage;
import javax.xml.soap.SOAPPart;
But when I run it from cmd it throws the error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/axis/SOAPPart
It seems to find the other imports without issue. Does 'NoClassDefFound' mean the class itself wasn't found? If so, how can I check that it is there, and replace it if not? (I am using Eclipse)
Update: Okay, I have found the class in 'JRE System Library > rt.jar > javax.xml.soap > SOAPPart.class'. So if the class is there, why do I get the error?
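A quick way to check which classes are actually visible at runtime is Class.forName; here is a minimal sketch that probes the class named in the stack trace (note it is the Axis class, not the javax.xml.soap one from the imports):

public class ClasspathCheck {
    public static void main(String[] args) {
        try {
            // The name comes from the NoClassDefFoundError message above.
            Class.forName("org.apache.axis.SOAPPart");
            System.out.println("class is visible on the runtime classpath");
        } catch (ClassNotFoundException e) {
            System.out.println("class is NOT on the runtime classpath");
        }
    }
}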
Turns out I needed to choose Export > 'Runnable JAR file' in Eclipse instead of 'JAR file'.
I'm trying to parse a sentence with Malt Parser in NLTK. When I call raw_parse(sent), it gives an error with exit code 1. I executed the java command in a terminal and it gives a class-not-found exception; I don't understand what is wrong:
java -Xmx1024m -jar /usr/local/bin/malt.jar -w /home/abc/maltparser-1.7.2 -c engmalt.linear-1.7 -i /home/abc/maltparser-1.7.2/malt_input.conllrPZgwc -o /home/abc/maltparser-1.7.2/malt_output.conllDMSKpg -m parse
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/log4j/Layout
Your working directory is not set correctly. Log4j is a package used by Malt Parser for logging (see maltparser-1.7.2/lib/log4j.jar).
In order to run Malt Parser from NLTK, the working directory should be set to your Malt Parser folder (in your case /home/abc/maltparser-1.7.2).
So, step one is getting the latest NLTK from git:
git clone https://github.com/nltk/nltk.git
Install NLTK:
sudo python setup.py install
To run Malt Parser using NLTK try this code example:
import os
import nltk
os.environ['MALTPARSERHOME']="/home/abc/maltparser-1.7.2"
verbose = False
maltParser = nltk.parse.malt.MaltParser(working_dir="/home/abc/maltparser-1.7.2",
mco="engmalt.linear-1.7",
additional_java_args=['-Xmx512m'])
print(maltParser.raw_parse('This is a test sentence', verbose=verbose).tree().pprint())
As you may notice, I'm using the pre-trained mco file (engmalt.linear-1.7), which can be downloaded from here:
http://www.maltparser.org/mco/english_parser/engmalt.html
Move this mco file to the /home/abc/maltparser-1.7.2 directory.
Finally, NLTK only accepts malt.jar, so create a copy (or rename it):
cp maltparser-1.7.2.jar malt.jar
The copy can still be located in your /home/abc/maltparser-1.7.2 directory.
Hopefully you'll get it running!