How to access a Java dependency on the command line?

I have a moderately old, small Java application with an option to read and export PDF files using the Apache PDFBox library (hereafter "pdfbox-app.jar"). All the files, including this resource, are stored in a single flat folder.
This works fine when called from a JAR file:
D:\Prog\!GitHub\Arena>java -jar Athena.jar NPCGenerator -p
OED NPC Generator
-----------------
Writing Gwenllian-ElfFtr1Wiz1.pdf
It similarly works fine when run from my IDE (jGrasp).
But it fails when called from the command line, outside of its JAR:
D:\Prog\!GitHub\Arena>java NPCGenerator -p
OED NPC Generator
-----------------
Writing Eoin-HalflingFtr1.pdf
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/pdfbox/pdmodel/PDDocument
at CharacterPDF.writePDF(CharacterPDF.java:49)
at NPCGenerator.printToPDF(NPCGenerator.java:294)
at NPCGenerator.makeAllNPCs(NPCGenerator.java:270)
at NPCGenerator.main(NPCGenerator.java:308)
Caused by: java.lang.ClassNotFoundException: org.apache.pdfbox.pdmodel.PDDocument
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:641)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520)
... 4 more
What should I be doing to run this on the command line outside of its own JAR?

You should put the pdfbox jar, and any other dependencies, on the classpath:
java -classpath .;pdfbox-app.jar NPCGenerator -p
Without that, Java doesn't know where to look for the org/apache/pdfbox classes. It looks for .class files relative to the default classpath (which is just ., the current directory), but it never looks inside a jar unless that jar is explicitly on the classpath.
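Note that ; is the classpath separator on Windows, as in the D:\ prompt above. On Linux or macOS the separator is a colon, so the equivalent command would be:
java -classpath .:pdfbox-app.jar NPCGenerator -p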

Related

cobertura-instrument.sh fails to instrument jar file with java.lang.NoClassDefFoundError: net.sourceforge.cobertura.instrument.InstrumentMain

I'm trying to instrument a jar file (from the Spacewalk project) so I can measure the code coverage of my testing, but it is failing:
# /opt/cobertura-2.1.1/cobertura-instrument.sh --datafile /tmp/out /usr/share/rhn/lib/rhn.jar
Exception in thread "main" java.lang.NoClassDefFoundError: net.sourceforge.cobertura.instrument.InstrumentMain
Caused by: java.lang.ClassNotFoundException: net.sourceforge.cobertura.instrument.InstrumentMain
at java.net.URLClassLoader.findClass(URLClassLoader.java:432)
at java.lang.ClassLoader.loadClass(ClassLoader.java:676)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:358)
at java.lang.ClassLoader.loadClass(ClassLoader.java:642)
Could not find the main class: net.sourceforge.cobertura.instrument.InstrumentMain. Program will exit.
I have also tried to provide one random class from that jar (ideally I want to instrument all of them), but with the same result:
# jar tf rhn.jar | tail
org/cobbler/CobblerConnection.class
[...]
# /opt/cobertura-2.1.1/cobertura-instrument.sh --datafile /tmp/out /usr/share/rhn/lib/rhn.jar org.cobbler.CobblerConnection
I'm pretty sure I'm just missing something in what it is trying to tell me.
I'm using cobertura-2.1.1 downloaded from SourceForge and extracted into /opt, running on Red Hat Enterprise Linux 6.
OK, this was simple:
# dos2unix /opt/cobertura-2.1.1/cobertura-instrument.sh
It is also missing a bash shebang (#!/bin/bash), so you might need to add one to the beginning of the file (I do not know why it worked for me even without that).
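If you do need to add the shebang, a GNU sed one-liner like this would do it (a sketch; adjust the path to your install):
# sed -i '1i #!/bin/bash' /opt/cobertura-2.1.1/cobertura-instrument.sh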

ClassNotFoundException while running jar

I am getting a ClassNotFoundException when I try to run a jar with the command below:
java -jar MyJar.jar
I created the jar with Eclipse. MyJar depends on a couple of other jars. I saw in Eclipse that all these other jars are on the classpath.
I also added these jars to the classpath in Unix using export CLASSPATH. But I still get the ClassNotFoundException.
Exception Stack Trace:
Exception in thread "main" java.lang.NoClassDefFoundError: org.apache.hadoop.conf.Configuration
at com.a.HDFSCopy.readURI(HDFSCopy.java:16)
at com.a.CopyMain.main(CopyMain.java:9)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
at java.net.URLClassLoader.findClass(URLClassLoader.java:434)
at java.lang.ClassLoader.loadClass(ClassLoader.java:660)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:358)
at java.lang.ClassLoader.loadClass(ClassLoader.java:626)
... 2 more
Class Path Before:
/opt/ibm/biginsights/IHC/lib/biginsights-gpfs-1.1.1.jar:/opt/ibm/biginsights/IHC/hadoop-core.jar:
Class Path After (included the locations of the jars needed for dependencies) :
[biadmin@big-instght-15 ~]$ echo $CLASSPATH
/opt/ibm/biginsights/IHC/lib/biginsights-gpfs-1.1.1.jar:/opt/ibm/biginsights/IHC/hadoop-core.jar::/home/biadmin/hadoop_jars/commons-logging-1.1.1.jar:/home/biadmin/hadoop_jars/commons-configuration-1.8.jar:/home/biadmin/hadoop_jars/commons-lang-2.4.jar
But I realized that if I open a new terminal and echo $CLASSPATH, it displays only the Class Path Before; it doesn't reflect the changes I made, i.e. it doesn't show the Class Path After.
How to fix this?
Thanks,
Mahalakshmi
What is the main class listed in MANIFEST.MF? If you unjar the jar, is that class present in the correct location inside it?
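Two things are worth noting here. First, an export made in one terminal applies only to that shell session, which is why a new terminal shows only the old classpath; to make it persist, the export line has to go in ~/.bashrc or a similar startup file. Second, and more importantly, when you launch with java -jar, both the CLASSPATH environment variable and any -cp flag are ignored entirely: the dependencies must be listed in the manifest's Class-Path attribute, with paths relative to MyJar.jar's location. A sketch of what META-INF/MANIFEST.MF would need (the main class comes from the stack trace above; the lib/ paths are illustrative):
Main-Class: com.a.CopyMain
Class-Path: lib/hadoop-core.jar lib/commons-logging-1.1.1.jar lib/commons-configuration-1.8.jar lib/commons-lang-2.4.jar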

bin/hive giving errors

I have installed Hive from source and run ant package.
As per the cwiki.apache.org document, I have also set the PATH variables, i.e. $HIVE_HOME and $PATH, but running the command from the base directory (bin/hive or hive) gives the following error.
I have applied the patch (HIVE-3606.1.patch) to resolve it, but it's still not working.
Command to add patch:
hive-0.10.0-bin]$ patch -p0 < ~/Downloads/HIVE-3606.1.patch
To run Hive:
hive-0.10.0-bin]$ bin/hive
Exception in thread "main" java.lang.NoSuchFieldError: ALLOW_UNQUOTED_CONTROL_CHARS
at org.apache.hadoop.hive.ql.udf.generic.GenericUDTFJSONTuple.<clinit>(GenericUDTFJSONTuple.java:59)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:113)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.registerGenericUDTF(FunctionRegistry.java:545)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.registerGenericUDTF(FunctionRegistry.java:539)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.<clinit>(FunctionRegistry.java:472)
at org.apache.hadoop.hive.ql.session.SessionState.<init>(SessionState.java:202)
at org.apache.hadoop.hive.cli.CliSessionState.<init>(CliSessionState.java:86)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:635)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Can anyone help here?
It's most probably because your Hadoop uses a different (older) version of the Jackson libraries than Hive. As a quick workaround you can replace jackson-core-asl-X.X.X.jar and jackson-mapper-asl-X.X.X.jar in $HADOOP_HOME/lib with the newer ones from $HIVE_HOME/lib.
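For example (a sketch of the swap; the exact version numbers depend on your installations, and the old jars are set aside rather than deleted):
$ mkdir -p /tmp/jackson-backup
$ mv $HADOOP_HOME/lib/jackson-core-asl-*.jar $HADOOP_HOME/lib/jackson-mapper-asl-*.jar /tmp/jackson-backup/
$ cp $HIVE_HOME/lib/jackson-core-asl-*.jar $HIVE_HOME/lib/jackson-mapper-asl-*.jar $HADOOP_HOME/lib/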
It's because you are working with an old version of Hadoop.
If you have an old version of Hadoop, it's better to compile Hive from source yourself with the following commands:
$ svn co http://svn.apache.org/repos/asf/hive/trunk hive
$ cd hive
$ mvn clean install -Phadoop-2,dist
Check this link for more info: https://cwiki.apache.org/confluence/display/Hive/GettingStarted
Then rename the jackson* files in $HADOOP_HOME/lib by adding an .old suffix to them (it's good practice not to delete them, as we may want them in the future):
$ mv jackson-core-asl-1.0.1.jar jackson-core-asl-1.0.1.jar.old
$ mv jackson-mapper-asl-1.0.1.jar jackson-mapper-asl-1.0.1.jar.old
You can find the new jackson compiled files somewhere around Hive's packaging folder, mine is in:
packaging/target/apache-hive-0.14.0-SNAPSHOT-bin/apache-hive-0.14.0-SNAPSHOT/bin/hcatalog/share/webhcat/svr/lib
If you can't find it, that's OK; use the following command in your Hive directory.
$ find ./ -iname "*jackson*"
It will show you all the jackson* files it can find. Then go to the folder that contains them and copy all of them to $HADOOP_HOME/lib (we may currently need only the "jackson-core-*" ones, but we copy them all for future use):
$ cp jackson* $HADOOP_HOME/lib
Ask if you have more questions.

Amazon Web Services Java classpath

I am trying to run an application that reads and writes to Amazon DynamoDB. I downloaded the Eclipse toolkit and the AWS SDK, and when I run the application from my local PC it works perfectly. Next, I exported it to a jar file and uploaded it to my EC2 instance. However, when I run it there I get an error.
/home/apps/java/database/bin$ java -jar myJar.jar
Exception in thread "main" java.lang.NoClassDefFoundError: com/amazonaws/auth/AWSCredentials
Caused by: java.lang.ClassNotFoundException: com.amazonaws.auth.AWSCredentials
at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
Could not find the main class: DynamoDB. Program will exit.
I assume it has to do with the classpath, but in /home/ubuntu/.bashrc I have set it as such:
CLASSPATH="./:/home/apps/java/database/bin/*:/home/apps/java/database/bin/aws-java-sdk-1.3.12.jar"
export CLASSPATH
/home/apps/java/database/bin contains all the .jar files that are in the AWS SDK:
aspectjrt.jar
aspectjweaver.jar
aws-java-sdk-1.3.12.jar
aws-java-sdk-1.3.12-javadoc.jar
aws-java-sdk-1.3.12-sources.jar
aws-java-sdk-flow-build-tools-1.3.12.jar
commons-codec-1.3.jar
commons-logging-1.1.1.jar
freemarker-2.3.18.jar
httpclient-4.1.1.jar
httpcore-4.1.jar
jackson-core-asl-1.8.7.jar
jackson-mapper-asl-1.8.7.jar
mail-1.4.3.jar
myJar.jar
spring-beans-3.0.7.jar
spring-context-3.0.7.jar
spring-core-3.0.7.jar
stax-1.2.0.jar
stax-api-1.0.1.jar
What am I missing? I have been looking at this for a day and a half. Thank you in advance!
The classpath entry "/directory/*" may be messing things up. Classpath entries are separated with colons, but the shell's asterisk expansion gives space-separated names. Try this little shell script to start it:
#!/bin/sh
JAVA_OPTS="-Xms256M -Xmx4G"
CP=`find /home/apps/java/database/bin/*jar -exec echo -n "{}:" \;`
java -cp ${CP%?} DynamoDB
A couple of notes:
the JAVA_OPTS is only there as a reminder that you may need more memory than the default.
the crazy syntax for CP on the final line strips the last character, since the "find" line is leaving a colon on the end.
note that -cp is silently ignored when -jar is used, which is why the script puts every jar on the classpath (the find picks up myJar.jar too, since it sits in the same directory) and launches the main class directly; the class name DynamoDB comes from the error message above.
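Alternatively, on Java 6 and later the JVM can expand a wildcard classpath entry itself, provided the shell doesn't expand it first (hence the single quotes), so the whole script could be reduced to:
java -cp '/home/apps/java/database/bin/*' DynamoDB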
Hope this helps!
This list of libs works for me:
aws-java-sdk-1.11.285-javadoc.jar
aws-java-sdk-1.11.285-sources.jar
aws-java-sdk-1.11.285.jar
aws-java-sdk.jar
aspectjrt-1.8.2.jar
aspectjweaver.jar
aws-swf-build-tools-1.1.jar
commons-codec-1.9.jar
commons-logging-1.1.3.jar
freemarker-2.3.9.jar
httpclient-4.5.2.jar
httpcore-4.4.4.jar
ion-java-1.0.2.jar
jackson-annotations-2.6.0.jar
jackson-core-2.6.7.jar
jackson-databind-2.6.7.1.jar
jackson-dataformat-cbor-2.6.7.jar
javax.mail-api-1.4.6.jar
jmespath-java-1.11.285.jar
joda-time-2.8.1.jar
netty-buffer-4.1.17.Final.jar
netty-codec-4.1.17.Final.jar
netty-codec-http-4.1.17.Final.jar
netty-common-4.1.17.Final.jar
netty-handler-4.1.17.Final.jar
netty-resolver-4.1.17.Final.jar
netty-transport-4.1.17.Final.jar
spring-beans-3.0.7.RELEASE.jar
spring-context-3.0.7.RELEASE.jar
spring-core-3.0.7.RELEASE.jar
spring-test-3.0.7.RELEASE.jar

Protocol Buffers Invalid embedded descriptor problem

I'm having some problems at runtime with some of my generated protocol buffer classes.
My project layout is as follows:
module/
  protobuf-api/
    proto/
      com/foo/api/Service.proto
      com/foo/shared/Shared.proto
      org/bar/api/Message1.proto
      org/bar/api/Message2.proto
The Service.proto file depends on Shared.proto and some of the Message*.proto files. From the protobuf-api directory, I run the following command to compile:
find . -name *.proto -exec protoc --java_out=java -I=proto {} \;
When I attempt to run my Service, I get the following exception:
java.lang.ExceptionInInitializerError
at com.linkedin.history.api.protobuf.HistoryServiceProtos$HistoryServiceQuery.<clinit>(HistoryServiceProtos.java:544)
at com.linkedin.history.api.serializer.HistoryServiceSerializer.serialize(HistoryServiceSerializer.java:47)
at test.history.serializer.TestSerializer.testHistoryServiceQuery(TestSerializer.java:38)
at test.fwk.util.core.BaseTestSuiteCore.run(BaseTestSuiteCore.java:304)
at test.fwk.util.core.BaseTestSuiteConf.run(BaseTestSuiteConf.java:186)
at test.fwk.lispring.BaseTestSuite.run(BaseTestSuite.java:232)
at test.fwk.lispring.BaseTestSuite.callAppropriateRun(BaseTestSuite.java:265)
at test.fwk.util.core.BaseTestSuiteCore.run(BaseTestSuiteCore.java:199)
Caused by: java.lang.IllegalArgumentException: Invalid embedded descriptor for "com/linkedin/history/api/protobuf/HistoryService.proto".
at com.google.protobuf.Descriptors$FileDescriptor.internalBuildGeneratedFileFrom(Descriptors.java:268)
at com.linkedin.history.api.protobuf.HistoryServiceProtos.<clinit>(HistoryServiceProtos.java:1794)
Caused by: com.google.protobuf.Descriptors$DescriptorValidationException: com/linkedin/history/api/protobuf/HistoryService.proto: Dependencies passed to FileDescriptor.buildFrom() don't match those listed in the FileDescriptorProto.
at com.google.protobuf.Descriptors$FileDescriptor.buildFrom(Descriptors.java:221)
at com.google.protobuf.Descriptors$FileDescriptor.internalBuildGeneratedFileFrom(Descriptors.java:266)
I've read the post here but I think I'm doing everything correctly. Any suggestions on why I'm having the initializer errors? I'm compiling everything with the same -I flag.
I suspect that the problem is that when you find each proto file, you're giving protoc the full path, e.g. proto/com/foo/api/Service.proto, but when another file refers to it via the include directory, it uses com/foo/api/Service.proto.
Simple fix - run this from the proto directory:
find . -name '*.proto' -exec protoc --java_out=../java -I=. {} \;
I must admit I can't remember a lot of the details of protoc (which I really should) but I suspect that will work.
Another alternative which may work:
protoc --java_out=java `find . -name '*.proto'`
i.e. pass all the proto files into a single call to protoc.
I had the same type of error in C#, and here was my problem: I called protoc in a pre-build step in my project. There I used Visual Studio built-in macros like $(SolutionDir) and $(ProjectDir) to retrieve the necessary paths. Since I referenced *.proto files from other projects, I used two --proto_path options: one for the root path (to resolve import paths) and one for the file itself. My solution file was inside a subdirectory of the root directory, so I used the relative path .. to get to the root. Proto files are always in a subdirectory gen of the particular project. All in all, the command was like this:
protoc.exe --proto_path=$(SolutionDir).. --proto_path=$(ProjectDir)gen $(ProjectDir)gen\DemoFile.proto
It compiled fine, but I got a System.TypeInitializationException at runtime on calling the CreateBuilder() method. The problem was that the two paths $(SolutionDir).. and $(ProjectDir) (though effectively pointing to the same directory) had different textual representations due to the relative path component "..". I solved the problem by consistently using the same path, like this:
protoc.exe --proto_path=$(SolutionDir).. $(SolutionDir)..\My\Demo\Project\Directory\gen\DemoFile.proto
It cost me almost 3 days to narrow down and recognize the problem, so I'm sharing my solution here in the hope that it will save someone some time.
