I can compile the Java Web Service client fine with the following command:
javac
-classpath lib\spring-ws-2.0.0-M2-all.jar;lib\xml-apis.jar;lib\j2ee.jar;lib\saaj.jar;lib\saaj-impl.jar
WebServiceClient.java
When I actually run it (java WebServiceClient), it gives me the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/springframework/ws/client/core/WebServiceTemplate
at WebServiceClient.<init>(WebServiceClient.java:14)
at WebServiceClient.main(WebServiceClient.java:37)
Caused by: java.lang.ClassNotFoundException: org.springframework.ws.client.core.WebServiceTemplate
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
... 2 more
Here's the code for WebServiceClient.java:
import java.io.StringReader;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import org.springframework.ws.WebServiceMessageFactory;
import org.springframework.ws.client.core.WebServiceTemplate;
import org.springframework.ws.transport.WebServiceMessageSender;
public class WebServiceClient {
private static final String MESSAGE =
"<message xmlns=\"http://tempuri.org\">Hello Web Service World</message>";
private final WebServiceTemplate webServiceTemplate = new WebServiceTemplate();
public void setDefaultUri(String defaultUri) {
webServiceTemplate.setDefaultUri(defaultUri);
}
// send to the configured default URI
public void simpleSendAndReceive() {
StreamSource source = new StreamSource(new StringReader(MESSAGE));
StreamResult result = new StreamResult(System.out);
webServiceTemplate.sendSourceAndReceiveToResult(source, result);
}
// send to an explicit URI
public void customSendAndReceive() {
StreamSource source = new StreamSource(new StringReader(MESSAGE));
StreamResult result = new StreamResult(System.out);
webServiceTemplate.sendSourceAndReceiveToResult("http://wsdl",
source, result);
}
public static void main(String[] args) throws Exception {
WebServiceClient ws = new WebServiceClient();
ws.setDefaultUri("http://wsdl");
ws.simpleSendAndReceive();
}
}
Any help is appreciated.
Try (note the leading ".;": when -classpath is given explicitly, the current directory holding WebServiceClient.class must be listed too):
java -classpath .;lib\spring-ws-2.0.0-M2-all.jar;lib\xml-apis.jar;lib\j2ee.jar;lib\saaj.jar;lib\saaj-impl.jar WebServiceClient
I suppose your folder structure is as follows:
\WebServiceClient.java
\WebServiceClient.class
\lib\spring-ws-2.0.0-M2-all.jar
\lib\xml-apis.jar
\lib\j2ee.jar
\lib\saaj.jar
\lib\saaj-impl.jar
When you passed that classpath to your javac invocation, it was necessary because your classes referenced classes that are defined only in those JARs.
The same holds true at runtime: your compiled Java bytecode needs to be able to "see" those JARs in order to load the classes and use the Spring functionality. So you can't merely invoke java WebServiceClient and expect it to work.
Instead you'll need to invoke the command that pakore's answer shows, which looks like it should work. If in doubt, after successfully compiling, press the Up arrow to recall the last command, delete the c from javac and delete the .java from the filename at the end. (If your shell doesn't support this, copy and paste the previous line via e.g. Notepad.)
org.springframework.ws.client.core.WebServiceTemplate is located in spring-ws-core.jar. Have you checked whether it is included in your WAR/EAR when the application is deployed to the application server where you are trying to run it, or whether it is part of the server's lib directory? A successful compilation doesn't mean all classes required to run an application will be there at runtime.
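To confirm that a JAR on your classpath actually contains the missing class, listing its contents is a quick check (a sketch; Windows syntax assumed from the paths above, and spring-ws-core.jar can be checked the same way):
jar tf lib\spring-ws-2.0.0-M2-all.jar | findstr WebServiceTemplate
If nothing is printed, the class isn't in that archive, and you need the JAR that does contain it on the runtime classpath.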
I am working on a project that is supposed to parse texts from PDF files.
Since it has multiple dependencies, I decided to build a combined JAR with all the dependencies and the classes.
However, when I build the JAR including dependencies via IntelliJ IDEA, even though the JAR file is added properly and I can import the class, the program throws NoClassDefFoundError.
Firstly, I thought the JAR wasn't on the classpath. However, even if I add -cp TessaractPDF.jar through VM Options, the class still goes undetected.
It is worth mentioning that everything works smoothly if I build the JAR without dependencies and add the dependencies manually.
What should I do?
Exception in thread "main" java.lang.NoClassDefFoundError: me/afifaniks/parsers/TessPDFParser
at Test.main(Test.java:20)
Caused by: java.lang.ClassNotFoundException: me.afifaniks.parsers.TessPDFParser
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 1 more
Code Snippet:
import me.afifaniks.parsers.TessPDFParser;
import java.io.IOException;
import java.util.HashMap;
public class Test {
public static void main(String[] args) throws IOException {
System.out.println(System.getProperty("java.class.path")); // the property name is "java.class.path", not "java.classpath"
HashMap<String, Object> arguments = new HashMap<>();
arguments.put("imageMode", "binary");
arguments.put("toFile", false);
arguments.put("tessDataPath", "/home/afif/Desktop/PDFParser/tessdata");
TessPDFParser pdfParser = new TessPDFParser("hiers15.pdf", arguments);
String text = (String) pdfParser.convert();
System.out.println(text);
}
}
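One quick diagnostic (a sketch; the JAR and class names are taken from the question): list the fat JAR's contents to confirm the dependency classes were actually packaged into it, since IDEA artifact configurations can be set up to reference dependency JARs instead of extracting them:
jar tf TessaractPDF.jar | grep TessPDFParser
If TessPDFParser (or the dependencies' classes) are missing from the listing, the artifact was not built as a true fat JAR.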
I am trying to classify an instance using a .model file which I have created in the Weka GUI. It seems I have successfully classified the test instance; however, I am not sure whether I am loading my .model file correctly, and I don't understand the Stub! error.
I have tried removing the extends AppCompatActivity to see if that makes any difference to loading the .model file. It turns out that to use getAssets(), the code must be in an Activity. However, I am still unsure whether the model has loaded, and the unusual error remains. I have followed the basic framework of davidmascharka's work on GitHub (he also loads a WEKA model from assets), but mine fails with the error below.
Here's my code:
package com.example.owner.introductoryapplication;
import android.support.v7.app.AppCompatActivity;
import weka.classifiers.Classifier;
import weka.classifiers.rules.DecisionTable;
import weka.core.Attribute;
import weka.core.DenseInstance;
import weka.core.Instances;
import java.util.ArrayList;
public class Test extends AppCompatActivity {
public static void main(String[] args) {
Test test = new Test();
test.start();
}
public void start() {
//LOADS THE MODEL...------------------------------------------------------
String rootPath = "/assets/";
String fileName = "PGBD_DecisionTableUPD.model";
Classifier cls = null;
try {
//cls = (Classifier) weka.core.SerializationHelper.read(rootPath + fileName);
cls = (DecisionTable) weka.core.SerializationHelper.read(getAssets().open(fileName));
} catch (Exception e) {
e.printStackTrace();
}
}
}
And here's my error output:
Exception in thread "main" java.lang.RuntimeException: Stub!
at android.content.Context.<init>(Context.java:67)
at android.content.ContextWrapper.<init>(ContextWrapper.java:30)
at android.view.ContextThemeWrapper.<init>(ContextThemeWrapper.java:40)
at android.app.Activity.<init>(Activity.java:643)
at android.support.v4.app.SupportActivity.<init>(ComponentActivity.java:46)
at android.support.v4.app.FragmentActivity.<init>(FragmentActivity.java:68)
at android.support.v7.app.AppCompatActivity.<init>(AppCompatActivity.java:62)
at com.example.owner.introductoryapplication.Test.<init>(Test.java:13)
at com.example.owner.introductoryapplication.Test.main(Test.java:15)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMainV2.main(AppMainV2.java:131)
Process finished with exit code 1
I expect the program to at least run! I have absolutely no clue why it doesn't. I tried switching the order of my dependencies, hoping that would make a difference, but no luck.
Any ideas?
Thanks in advance.
This may have been covered before, but the android.jar you compile against contains only stub implementations of the framework classes; every method simply throws RuntimeException("Stub!"), so Activity code cannot be executed on a desktop JVM. Essentially, you must configure the run configuration to launch the app instead of a specific file's main method.
If you want to see how a specific class behaves, you can use the app's debug option instead.
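For reference, a minimal sketch of that suggestion: move the model loading out of main() and into the Activity lifecycle, so getAssets() runs on a Context the framework has actually initialized (the file name comes from the question; the rest is an assumption):
package com.example.owner.introductoryapplication;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import weka.classifiers.Classifier;
public class Test extends AppCompatActivity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        try {
            // Safe here: the framework constructed this Activity, so getAssets() has a real Context
            Classifier cls = (Classifier) weka.core.SerializationHelper
                    .read(getAssets().open("PGBD_DecisionTableUPD.model"));
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Calling new Test() from a desktop main(), as in the question, constructs an Activity by hand against the SDK's stub classes, which is exactly what triggers the Stub! RuntimeException.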
I am using Hadoop 1.0.3 and HBase 0.94.22. I am trying to run a mapper program to read values from an HBase table and output them to a file. I am getting the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:340)
at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
The code is below:
import java.io.IOException;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
public class Test {
static class TestMapper extends TableMapper<Text, IntWritable> {
private static final IntWritable one = new IntWritable(1);
public void map(ImmutableBytesWritable row, Result value, Context context) throws IOException, InterruptedException
{
ImmutableBytesWritable userkey = new ImmutableBytesWritable(row.get(), 0 , Bytes.SIZEOF_INT);
String key =Bytes.toString(userkey.get());
context.write(new Text(key), one);
}
}
public static void main(String[] args) throws Exception {
HBaseConfiguration conf = new HBaseConfiguration();
Job job = new Job(conf, "hbase_freqcounter");
job.setJarByClass(Test.class);
Scan scan = new Scan();
FileOutputFormat.setOutputPath(job, new Path(args[0]));
String columns = "data";
scan.addFamily(Bytes.toBytes(columns));
scan.setFilter(new FirstKeyOnlyFilter());
TableMapReduceUtil.initTableMapperJob("test",scan, TestMapper.class, Text.class, IntWritable.class, job);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(IntWritable.class);
System.exit(job.waitForCompletion(true)?0:1);
}
}
I exported the above code to a JAR file, and on the command line I use the command below to run it:
hadoop jar /home/testdb.jar test
where test is the folder to which the mapper results should be written.
I have checked a few other links, like "Caused by: java.lang.ClassNotFoundException: org.apache.zookeeper.KeeperException", where it was suggested to include the ZooKeeper JAR in the classpath, but while creating the project in Eclipse I already included zookeeper-3.4.5.jar from the lib directory of HBase. I also visited "HBase - java.lang.NoClassDefFoundError in java", but I am using a mapper class to get the values from the HBase table, not a client API. I know I am making a mistake somewhere; could you please help me out?
I have noticed another strange thing: when I remove all of the code in the main function except the first line, "HBaseConfiguration conf = new HBaseConfiguration();", then export the code to a JAR file and run it as hadoop jar test.jar, I still get the same error. It seems either I am defining the conf variable incorrectly or there is some issue with my environment.
I found the fix to the problem: I had not added the HBase classpath in the hadoop-env.sh file. Below is what I added to make the job work.
$ export HADOOP_CLASSPATH=$HBASE_HOME/hbase-0.94.22.jar:\
$HBASE_HOME/hbase-0.94.22-test.jar:\
$HBASE_HOME/conf:\
${HBASE_HOME}/lib/zookeeper-3.4.5.jar:\
${HBASE_HOME}/lib/protobuf-java-2.4.0a.jar:\
${HBASE_HOME}/lib/guava-11.0.2.jar
I tried editing the hadoop-env.sh file, but the changes mentioned here didn't work for me.
What worked is this:
export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:$HBASE_HOME/lib/*"
I just added that at the end of my hadoop-env.sh.
Do not forget to set your HBASE_HOME variable.
You can also replace the $HBASE_HOME with the actual path of your hbase installation.
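For example (the path is just a placeholder for wherever HBase is installed):
export HBASE_HOME=/usr/local/hbase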
In case someone has different paths/configuration, here is what I added to hadoop-env.sh to make it work:
$ export HADOOP_CLASSPATH="$HBASE_HOME/lib/hbase-client-0.98.11-hadoop2.jar:\
$HBASE_HOME/lib/hbase-common-0.98.11-hadoop2.jar:\
$HBASE_HOME/lib/protobuf-java-2.5.0.jar:\
$HBASE_HOME/lib/guava-12.0.1.jar:\
$HBASE_HOME/lib/zookeeper-3.4.6.jar:\
$HBASE_HOME/lib/hbase-protocol-0.98.11-hadoop2.jar"
NOTE: if you haven't set $HBASE_HOME, you have two choices:
- export HBASE_HOME=[your hbase installation path]
- or just replace $HBASE_HOME with your full HBase path
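The next answer takes a different approach: instead of hard-coding JAR names, it asks HBase itself for the classpath its MapReduce jobs need (assuming your HBase version provides the mapredcp command):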
HADOOP_USER_CLASSPATH_FIRST=true \
HADOOP_CLASSPATH=$($HBASE_HOME/bin/hbase mapredcp) \
hadoop jar /home/testdb.jar test
Here CreateTable is my Java class file. Use this command:
java -cp .:/home/hadoop/hbase/hbase-0.94.8/hbase-0.94.8.jar:/home/hadoop/hbase/hbase-0.94.8/lib/* CreateTable
I'm trying Hadoop's basic MapReduce program, whose tutorial is at http://java.dzone.com/articles/hadoop-basics-creating
The full code of the class is below (the code is also available at the URL above):
import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.KeyValueTextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;
public class Dictionary {
public static class WordMapper extends Mapper<Text, Text, Text, Text> {
private Text word = new Text();
public void map(Text key, Text value, Context context) throws IOException, InterruptedException {
StringTokenizer itr = new StringTokenizer(value.toString(), ",");
while (itr.hasMoreTokens()) {
word.set(itr.nextToken());
context.write(key, word);
}
}
}
public static class AllTranslationsReducer extends Reducer<Text, Text, Text, Text> {
private Text result = new Text();
public void reduce(Text key, Iterable<Text> values, Context context) throws IOException, InterruptedException {
String translations = "";
for (Text val : values) {
translations += "|" + val.toString();
}
result.set(translations);
context.write(key, result);
}
}
public static void main(String[] args) throws Exception {
System.out.println("welcome to Java 1");
Configuration conf = new Configuration();
System.out.println("welcome to Java 2");
Job job = new Job(conf, "dictionary");
job.setJarByClass(Dictionary.class);
job.setMapperClass(WordMapper.class);
job.setReducerClass(AllTranslationsReducer.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(Text.class);
job.setInputFormatClass(KeyValueTextInputFormat.class);
FileInputFormat.addInputPath(job, new Path("/tmp/hadoop-cscarioni/dfs/name/file"));
FileOutputFormat.setOutputPath(job, new Path("output"));
System.exit(job.waitForCompletion(true) ? 0 : 1);
}
}
But after running it in Eclipse, I'm getting the error:
welcome to Java 1
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory
at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:73)
at Dictionary.main(Dictionary.java:43)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.logging.LogFactory
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
... 2 more
Please note that the exception is NoClassDefFoundError instead of ClassNotFoundException.
Note: NoClassDefFoundError is thrown when a class is not visible at run time but was visible at compile time. This can happen in the distribution or production of JAR files, where not all the required class files were included.
To fix: check for differences between your build-time and runtime classpaths.
NoClassDefFoundError and ClassNotFoundException are different; one is an Error and the other is an Exception.
NoClassDefFoundError arises from the JVM having problems finding a class it expected to find: a program that worked at compile time can't run because class files cannot be found.
ClassNotFoundException: this exception indicates that the class was not found on the classpath, i.e. we are trying to load the class definition and the class/JAR containing it does not exist on the classpath.
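A minimal sketch that shows the two cases side by side (com.example.Missing is a made-up class name):
public class LoadFailureDemo {
    public static void main(String[] args) {
        // ClassNotFoundException: thrown by explicit, reflective loading when
        // the named class is absent from the classpath.
        try {
            Class.forName("com.example.Missing");
        } catch (ClassNotFoundException e) {
            System.out.println("ClassNotFoundException: " + e.getMessage());
        }
        // NoClassDefFoundError, by contrast, is thrown by the JVM itself when
        // code that was compiled against a class (here, Hadoop's Configuration)
        // cannot find it at run time, e.g. because a JAR that was on the
        // compile classpath is missing from the runtime classpath.
    }
}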
NoClassDefFoundError occurs when a class was visible at compile time but is not visible at run time. This is often related to JAR files, where not all the required class files were included.
So try adding commons-logging-1.1.1.jar, which you can get from http://commons.apache.org/logging/download_logging.cgi, to your classpath.
NoClassDefFoundError occurs when the named class is successfully located in the classpath, but for some reason cannot be loaded and verified. Most often the problem is that another class needed for the verification of the named class is either missing or is the wrong version.
Generally speaking, this error means "double-check that you have all the right JAR files (of the right version) in your classpath".
It's a very common error when you run a Hadoop map/reduce program in a local IDE (Eclipse).
You have probably already added hadoop-core.jar to your build path, so no compile error is detected in your program. But you get the error when you run it, because hadoop-core depends on commons-logging.jar (as well as some other JARs). You may need to add the JARs under /lib to your build path as well.
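Outside Eclipse, the equivalent runtime fix would look something like this (a sketch; the JAR name and lib directory are assumptions based on a typical Hadoop 1.x layout):
java -cp ".:hadoop-core.jar:lib/*" Dictionary
The lib/* wildcard is what pulls in commons-logging.jar and the other transitive dependencies.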
I recommend using Maven or another dependency management tool to manage the dependencies.
Please read this article: http://kishorer.in/2014/10/22/running-a-wordcount-mapreduce-example-in-hadoop-2-4-1-single-node-cluster-in-ubuntu-14-04-64-bit/. It explains how to reference dependencies in Eclipse without Maven. However, Maven is the preferred way, from what I understand.
I am new to boilerpipe. I tried to run the sample code given on their website:
import java.net.URL;
import de.l3s.boilerpipe.extractors.ArticleExtractor;
import de.l3s.boilerpipe.extractors.DefaultExtractor;
public class TESTURLBOILERPIPE {
public static void main(String[] arges) throws Exception
{
final URL url = new URL(
"http://www.l3s.de/web/page11g.do?sp=page11g&link=ln104g&stu1g.LanguageISOCtxParam=en");
ArticleExtractor ae = new ArticleExtractor();
System.out.println(ae.getText(url)); // equivalently: ArticleExtractor.INSTANCE.getText(url)
}
}
I have added all the required JAR files to the classpath; however, I get the exception:
Exception in thread "main" java.lang.IllegalArgumentException: usage: supply url to fetch
at org.jsoup.helper.Validate.isTrue(Validate.java:45)
at org.jsoup.examples.HtmlToPlainText.main(HtmlToPlainText.java:26)
I don't know Boilerpipe, but are you sure you are trying to run the correct Java class? The stack trace looks like you are trying to run HtmlToPlainText (without arguments, thus the exception), but from the code you posted I think you would like to run your TESTURLBOILERPIPE class.
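If the classpath rather than the main class turns out to be the problem, an explicit invocation might look like this (a sketch; the JAR names are assumptions based on the boilerpipe 1.2.0 distribution):
java -cp .:boilerpipe-1.2.0.jar:nekohtml-1.9.13.jar:xerces-2.9.1.jar TESTURLBOILERPIPE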
Try using the Python wrapper. It takes care of all the dependencies, though you might have to install JPype manually (its source code is on SourceForge).
https://github.com/misja/python-boilerpipe