I am having trouble with Mockito dependencies. I wrote a sample test and when I ran it I got this exception:
java.lang.NoClassDefFoundError: javassist/NotFoundException
at org.powermock.core.transformers.TestClassTransformerBuilder$RemovesTestMethodAnnotation.fromMethods(TestClassTransformerBuilder.java:62)
at org.powermock.tests.utils.impl.AbstractCommonTestSuiteChunkerImpl.createDefaultMockLoader(AbstractCommonTestSuiteChunkerImpl.java:126)
....
the full Exception can be viewed at : https://pastebin.com/xWqUX0Wc
and the
test code - https://pastebin.com/pbWLc27B
My dependencies are the following:
mockito-all-1.9.5.jar
powermock-api-mockito-1.6.3.jar
powermock-api-support-1.4.9.jar
powermock-core-2.0.4.jar
powermock-module-junit-1.7.4.jar
powermock-module-junit-common-1.7.4.jar
powermock-reflect-2.0.4.jar
powermock-test-utils-1.5.3.jar
Where could the problem be? I guess something is wrong with the versions of the jars. Which versions of the jars would you suggest using?
You are missing the Javassist jar. Download the jar below and add it to your project, or add it as a dependency in your Maven pom.xml file:
<javassist.version>3.20.0-GA</javassist.version>
<dependency>
<groupId>org.javassist</groupId>
<artifactId>javassist</artifactId>
<version>${javassist.version}</version>
<scope>compile</scope>
</dependency>
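If you want to double-check that Javassist actually ends up on the classpath after adding the dependency, a throwaway check like the following can help (a sketch only; JavassistCheck is a hypothetical class name, and javassist.NotFoundException is the class from your stack trace):

```java
public class JavassistCheck {
    public static void main(String[] args) throws Exception {
        // Throws ClassNotFoundException if the javassist jar is still missing from the classpath
        System.out.println(Class.forName("javassist.NotFoundException"));
    }
}
```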
Related
I have a simple Maven project that makes SIF calls to MDM Hub, and I am adding Castor dependencies for this.
Maven dependency added:
<dependency>
<groupId>org.codehaus.castor</groupId>
<artifactId>castor-xml</artifactId>
<version>1.4.1</version>
</dependency>
This downloaded the castor-xml-1.4.1.jar file.
Right at the line calling sipClient.process(req), the below exception is thrown:
Exception in thread "main" java.lang.NoSuchMethodError: org.exolab.castor.xml.Marshaller.getResolver()Lorg/exolab/castor/xml/ClassDescriptorResolver;
at com.siperian.sif.message.CastorUtil.setMappingLoader(CastorUtil.java:470)
at com.siperian.sif.message.CastorUtil.beanToXmlString(CastorUtil.java:358)
at com.siperian.sif.message.CastorUtil.beanToXmlString(CastorUtil.java:323)
at com.siperian.sif.message.CastorUtil.beanToXmlString(CastorUtil.java:309)
at com.siperian.sif.message.CastorUtil.beanToXmlString(CastorUtil.java:295)
at com.siperian.sif.client.HttpSiperianClient._process(HttpSiperianClient.java:117)
at com.siperian.sif.client.SiperianClient.process(SiperianClient.java:179)
I can see the getResolver method and ClassDescriptorResolver in the jar file in Java Decompiler (screenshots: classResolverDescriptor, getResolver method).
I get the same exception even with the 1.3.2 dependency.
Should I download any extra dependencies?
Thanks
This specific error occurs for one of two reasons:
1- You are missing the jar file that has this method (this is probably not your issue, as you stated you can see the method in the decompiled jar).
2- You have two or more versions of the jar in your dependencies, and the classloader is actually picking up the one which does not have the method you need.
How you should approach this is as follows:
Go to your IDE and open your pom.xml file.
Open the Dependency Hierarchy view and search for org.codehaus.castor or castor-xml and see how many different versions you have.
If you have more than one, and some are pulled in transitively by another dependency, you can use <exclusions> in your pom.xml to remove the versions which you don't want.
If you prefer the command line, you can do the same check using mvn dependency:tree.
Hope this helps you in some way.
-- Edited --
Your code is running against the 1.3.2 dependency. How can you tell? Download castor-xml-1.3.2.jar, extract it, and look at the Marshaller class. You will see that its getResolver() method returns XMLClassDescriptorResolver rather than ClassDescriptorResolver, so the signature your code was compiled against does not exist at runtime, and therefore you get the NoSuchMethodError.
```
/**
 * Returns the ClassDescriptorResolver for use during marshalling
 *
 * @return the ClassDescriptorResolver
 * @see #setResolver
 */
public XMLClassDescriptorResolver getResolver() {
}
```
Therefore you need to find out which dependency in your hierarchy pulls in this 1.3.2 jar, and exclude it from that dependency.
An example of how to do the exclusion in pom.xml:
<dependency>
<groupId>sample.group.which.has.castor.in.it</groupId>
<artifactId>artifact.which.has.castor.in.it</artifactId>
<version>1.0</version>
<scope>compile</scope>
<exclusions>
<exclusion> <!-- declare the exclusion here; an exclusion takes only groupId and artifactId -->
<groupId>org.codehaus.castor</groupId>
<artifactId>castor-xml</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency> <!-- add the proper dependency as well, as it is needed -->
<groupId>org.codehaus.castor</groupId>
<artifactId>castor-xml</artifactId>
<version>1.4.1</version>
</dependency>
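If you want to confirm at runtime which jar the Marshaller class is actually loaded from, a small diagnostic like the following can help (just a sketch; WhichJarCheck is a hypothetical class name, and it assumes some castor-xml jar is on the classpath):

```java
public class WhichJarCheck {
    public static void main(String[] args) {
        // Prints the jar that Marshaller was actually loaded from, so you can confirm
        // whether the 1.3.2 or the 1.4.1 castor-xml jar won on the classpath.
        // (getCodeSource() may be null for classes loaded by the bootstrap classloader.)
        System.out.println(org.exolab.castor.xml.Marshaller.class
                .getProtectionDomain().getCodeSource().getLocation());
    }
}
```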
I'm hitting this error when I run my packaged jar:
java.lang.NoClassDefFoundError: com/google/common/base/Joiner
I simply call: java -jar xxx.jar
I already added the dependency in my pom.xml:
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>18.0</version>
<scope>compile</scope>
</dependency>
I'm using IntelliJ editor. I have a unit test for the function which uses the Joiner class. It runs successfully from within IntelliJ.
I put my cursor on Joiner and use "command + B" to jump to the declaration of the Joiner class. It opens the decompiled source code page, and in the heading it shows the path as: guava-18.0.jar/com/google/common/base/Joiner
So everything looks correct.
Can anyone help me figure out why I hit this error?
The jar produced by the default Maven packaging does not contain its dependencies, so when you run java -jar xxx.jar, Guava is not on the classpath even though it is a compile-scope dependency and available inside IntelliJ. You could shade Guava into your jar; it results in a bigger jar file, but that way the dependency is guaranteed to be included in your program. Read more about shading Maven dependencies into your jar file in the official Maven documentation: http://maven.apache.org/plugins/maven-shade-plugin/
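One way to see why this happens (a sketch, not from the original answer): print the classpath the JVM is actually using. When you start the program with java -jar xxx.jar, it contains only xxx.jar, so Guava is not available even though it is declared in the pom. ClasspathCheck below is just a hypothetical class name.

```java
public class ClasspathCheck {
    public static void main(String[] args) {
        // With "java -jar xxx.jar" this prints only "xxx.jar": dependencies such as Guava
        // are not included unless they are shaded into the jar or added to the classpath explicitly.
        System.out.println(System.getProperty("java.class.path"));
    }
}
```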
I want to upload a file to HDFS. I compiled my code using the following jars as dependencies:
hadoop-auth-2.6.1.jar,
hadoop-common-2.6.1.jar and
hadoop-hdfs-2.6.1.jar
My code:
I compiled it with Ant, but it gave me this error: No FileSystem for scheme:hdfs.
Then I changed the code and compiled again:
But now I got another error: Class org.apache.hdfs.DistributedFileSystem not found.
What's wrong? And what should I do?
DistributedFileSystem is part of hadoop-core.
To fix this problem, you need to include hadoop-core-1.2.1.jar also (Note: I am using Maven for building):
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>1.2.1</version>
</dependency>
Overall, I am using following Maven dependencies:
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.7.1</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-mapreduce-client-core</artifactId>
<version>2.7.1</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>1.2.1</version>
</dependency>
While getting the Hadoop FileSystem object like below:
FileSystem fs = FileSystem.get(hdfsUrl,configuration);
If you get the following error:
"No FileSystem for scheme:hdfs"
You can resolve it by setting the following two properties on the configuration object:
configuration.set("fs.hdfs.impl","org.apache.hadoop.hdfs.DistributedFileSystem");
configuration.set("fs.file.impl", "org.apache.hadoop.fs.LocalFileSystem");
Now, you may get a new error like the one below:
java.lang.ClassNotFoundException: Class org.apache.hadoop.hdfs.DistributedFileSystem not found
hadoop-common.jar uses Thread.currentThread().getContextClassLoader() and configuration.getClassLoader() to load classes.
So, if you set your classloader by using
Thread.currentThread().setContextClassLoader(yourClassLoader);
configuration.setClassLoader(yourClassLoader);
you will be able to load the required classes from the other Hadoop jars (e.g. hadoop-hdfs).
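Putting those pieces together, a minimal upload sketch might look like the following (an illustration only, not the original poster's code; the namenode URI and file paths are placeholders, and hadoop-common plus hadoop-hdfs are assumed to be on the classpath):

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsUploadSketch {
    public static void main(String[] args) throws Exception {
        Configuration configuration = new Configuration();
        // Map the schemes explicitly so the right FileSystem implementations are used
        configuration.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
        configuration.set("fs.file.impl", "org.apache.hadoop.fs.LocalFileSystem");

        // Make Hadoop load classes with the same classloader as your own code
        ClassLoader yourClassLoader = HdfsUploadSketch.class.getClassLoader();
        Thread.currentThread().setContextClassLoader(yourClassLoader);
        configuration.setClassLoader(yourClassLoader);

        // Placeholder namenode URI and paths; adjust them to your cluster
        FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:9000"), configuration);
        fs.copyFromLocalFile(new Path("/tmp/local-file.txt"), new Path("/user/me/remote-file.txt"));
        fs.close();
    }
}
```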
Let me know if you need more help. And don't forget to upvote if you find this bit useful.
I had the same problem when I compiled my Java code into an executable jar and ran the compiled jar. There was always some "not found" error (e.g. in your case, no FileSystem for the hdfs scheme), which means some Hadoop jar was not included in the build.
The solution is to add the correct dependencies in Maven/Gradle, or to add (all) the jars manually.
In my case, the hdfs scheme is handled by the class org.apache.hadoop.hdfs.DistributedFileSystem, which comes from the jar hadoop-hdfs-client-3.2.1.jar.
The relevant jars which were used can be found in the log file (if you successfully run the program and have the log file). In my example it is the following:
You can simply add all the jars from the installed Hadoop folder. They should be in the common/, hdfs/, ... subfolders under hadoop-3.2.1/share/hadoop. There are possibly other jars which are used but not shown in the log; to be safe, just include all of them. You can run hdfs classpath in the terminal to find the location of all the jars.
Once all the jars have been added, you may also need to set the Hadoop configuration in your Java code:
Configuration hadoopConfiguration = new Configuration();
hadoopConfiguration.addResource(new Path(CoreSiteXMLStr));
hadoopConfiguration.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
Summary
When trying XMLConfiguration configuration = new XMLConfiguration("config/config.xml"); with only commons-configuration 1.10, I need to add more dependencies (namely commons-collections, not newer than 3.2.1) to my Maven setup. Why is that so, and why doesn't Maven simply resolve all needed dependencies?
Details
I am trying to get commons-configuration to work. First I wanted to use the latest version, 2.0-alpha2, which didn't work well at all since I was unable to configure Maven to download the correct resources - but that is another story.
After I found out that version 1.10 is in fact "one point ten" (not "one point one zero") and thus the latest version of commons-configuration 1 (and covered by the tutorials), I decided to give it a try instead.
For my Maven dependencies (integrated in Eclipse) I used:
<dependency>
<groupId>commons-configuration</groupId>
<artifactId>commons-configuration</artifactId>
<version>1.10</version>
</dependency>
However, when trying out this example:
package main;
import java.util.Iterator;
import org.apache.commons.configuration.ConfigurationException;
import org.apache.commons.configuration.XMLConfiguration;
public class ConfigurationTest {

    public static void main(String... args) {
        try {
            XMLConfiguration configuration =
                    new XMLConfiguration("config/config.xml");
            Iterator<String> iterator = configuration.getKeys();
            while (iterator.hasNext()) {
                System.out.println(iterator.next());
            }
        } catch (ConfigurationException e) {
            e.printStackTrace();
        }
    }
}
with the following config.xml:
<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
<configuration>
<property>value</property>
<nestedproperty>
<arrayvalue>0,1,2,3,4</arrayvalue>
<property>anothervalue</property>
</nestedproperty>
</configuration>
I got the error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/collections/CollectionUtils
at org.apache.commons.configuration.XMLConfiguration.constructHierarchy(XMLConfiguration.java:640)
at org.apache.commons.configuration.XMLConfiguration.initProperties(XMLConfiguration.java:596)
at org.apache.commons.configuration.XMLConfiguration.load(XMLConfiguration.java:1009)
at org.apache.commons.configuration.XMLConfiguration.load(XMLConfiguration.java:972)
at org.apache.commons.configuration.XMLConfiguration$XMLFileConfigurationDelegate.load(XMLConfiguration.java:1647)
at org.apache.commons.configuration.AbstractFileConfiguration.load(AbstractFileConfiguration.java:324)
at org.apache.commons.configuration.AbstractFileConfiguration.load(AbstractFileConfiguration.java:261)
at org.apache.commons.configuration.AbstractFileConfiguration.load(AbstractFileConfiguration.java:238)
at org.apache.commons.configuration.AbstractHierarchicalFileConfiguration.load(AbstractHierarchicalFileConfiguration.java:184)
at org.apache.commons.configuration.AbstractHierarchicalFileConfiguration.<init>(AbstractHierarchicalFileConfiguration.java:95)
at org.apache.commons.configuration.XMLConfiguration.<init>(XMLConfiguration.java:261)
at main.ConfigurationTest.main(ConfigurationTest.java:12)
I first hoped they (not me, of course) had just screwed up some Maven dependencies, and since I didn't care much which version to use anymore (I didn't get 2.0 to work, remember?), I decided to go down to version 1.9 by replacing the Maven dependency with:
<dependency>
<groupId>commons-configuration</groupId>
<artifactId>commons-configuration</artifactId>
<version>1.9</version>
</dependency>
That solved the problem pretty well; the test case runs:
property
nestedproperty.arrayvalue
nestedproperty.property
But when I tried to implement an example similar to the one referenced in Very simple Apache-commons configuration example throws NoClassDefFoundError and its follow-up question, I got the exact same error referenced there - but the solution, importing org.apache.commons.beanutils.PropertyUtils, is not working for me, as I am missing beanutils. So basically, by downgrading I just switched from the error of missing commons-collections to missing commons-beanutils.
There is a dependency overview where you can see which dependencies are used for which features. I was a bit surprised to learn that version 1.10 now uses different dependencies (namely CollectionUtils) in the constructor call than 1.9 did. Since there were dependency problems in 1.10 as well as in 1.9, I just stuck with the newer version.
I found CollectionUtils located in the following artifact (as I was pointed there by its Maven repository page):
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-collections4</artifactId>
<version>4.0</version>
</dependency>
Sadly, that one (not obvious to me at first) doesn't define the class CollectionUtils in the package collections, but in the package collections4. This problem was hinted at on the dependency overview page, but they only mentioned possible problems with earlier versions... By that point I had stopped thinking much about it and simply changed the dependency to:
<dependency>
<groupId>commons-collections</groupId>
<artifactId>commons-collections</artifactId>
<version>3.2.1</version>
</dependency>
I got everything to work (more or less; the exceptions I get now at least no longer come from missing class definitions) after using these dependencies:
<dependencies>
<dependency>
<groupId>commons-configuration</groupId>
<artifactId>commons-configuration</artifactId>
<version>1.10</version>
</dependency>
<dependency>
<groupId>commons-collections</groupId>
<artifactId>commons-collections</artifactId>
<version>3.2.1</version>
</dependency>
<dependency>
<groupId>commons-beanutils</groupId>
<artifactId>commons-beanutils</artifactId>
<version>1.9.2</version>
</dependency>
</dependencies>
Why do I have to add the dependencies myself? I thought the whole point of using Maven is to avoid having to do such things, and in terms of javadocs and source files it does a pretty good job.
By now I am convinced that the dependencies are excluded from the hierarchy by design (is that so?), probably to avoid overhead. However, is there a way to either simply get all dependencies at once, or even better, to get exactly the dependencies I need? And why is it designed this way?
If we analyse commons-configuration's POM we see that the commons-collections dependency is optional:
<dependencies>
<dependency>
<groupId>commons-collections</groupId>
<artifactId>commons-collections</artifactId>
<version>3.2.1</version>
<optional>true</optional>
</dependency>
...
Furthermore, from the Maven docs:
If a user wants to use functionality related to an optional
dependency, they will have to redeclare that optional dependency in
their own project.
This issue is explained on the Runtime dependencies page of the Commons Configuration website.
Quoting from that page:
A lot of dependencies are declared in the Maven POM. These are all needed during compile time. On runtime however you only need to add the dependencies to your classpath that are required by the parts of the Commons Configuration package you are using. The following table helps you to determine which dependencies you have to include based on the components you intend to use.
The other answers explain why this works from a Maven perspective. This answer is intended to provide a defence, of sorts, to the Commons Configuration folks. They did at least warn you!
In cases where the dependencies are on other Apache Commons components, they've taken the time to test with a variety of versions and have posted information on compatibility at the bottom of that page.
Maven tries to resolve all necessary dependencies for a library you're using in your pom. However, sometimes a library has dependencies which are only necessary for specific features, and its authors don't want to force every user of the library to download them if they aren't used. In that case the dependency is declared as optional. This is what happened with commons-collections within commons-configuration. See the commons-configuration POM here.
I am trying to use H2 to connect to a database in Java (using Eclipse as the IDE). The sample code (below) throws a ClassNotFoundException. The thing is, I did add the H2 jar file to the system CLASSPATH. I have even checked it's there several times via printenv in the console. Am I omitting a step?
CODE:
import java.sql.*;
public class Program {

    /**
     * @param args
     */
    public static void main(String[] args) throws Exception {
        try {
            System.out.println("hello, world!");
            Class.forName("org.h2.Driver");
            Connection conn = DriverManager.getConnection("jdbc:h2:~/testdb", "sa", "");
            // add application code here
            conn.close();
        } catch (ClassNotFoundException ex) {
            System.out.println("ERROR: Class not found: " + ex.getMessage());
        }
        System.exit(0);
    }
}
In my case (a bit unrelated, but worth mentioning), I added this to my Maven pom and the error message went away:
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<version>xxx</version> <!-- ex: 1.2.140 -->
</dependency>
or if you are only using h2 during unit testing:
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<version>xxx</version> <!-- ex: 1.2.140 -->
<scope>test</scope>
</dependency>
I was having the following error (using IntelliJ):
java ClassNotFoundException for org.h2.Driver
Solved it by removing the scope from my pom.
was:
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<version>1.4.197</version>
<scope>test</scope>
</dependency>
changed to:
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<version>1.4.197</version>
</dependency>
This type of error can also come up when a Maven quickstart project is used as a dependency of another project and the dependency is declared with test scope (i.e. only for JUnit). In that case it is available for the tests but not in the application itself.
Recently I encountered the java.lang.ClassNotFoundException: org.h2.Driver exception in IntelliJ IDEA 2017.2 EAP while using the latest version (1.4.196) of the H2 driver. The solution was to downgrade to 1.4.195, which worked.
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<version>1.4.195</version>
<scope>test</scope>
</dependency>
The sample code (below) throws a ClassNotFoundException
Then the driver is not on the classpath.
The thing is, I did add the h2 jar file to the system CLASSPATH. I have even checked it's there several times via 'printenv' in the console.
How did you do that exactly? Please show the obtained output.
Am I omitting a step?
I can't say with the provided information. But relying on the CLASSPATH environment variable is a bad practice anyway, and you should use the -cp option if you're running Java on the command line. Like this:
java -cp h2.jar com.acme.Program
Is there a way I can set Eclipse to use the jar file when I use the RUN menu so that I don't have to run from the Console all the time?
Yes. Under Eclipse, add the JAR to the project build path: right-click on your project, then Properties > Java Build Path > Libraries > Add JARs... (assuming the H2 JAR is available in a directory relative to your project). Other IDEs have an equivalent way of doing this.
In my case (I use sbt):
change
libraryDependencies += "com.h2database" % "h2" % "1.4.196" % Test
to
libraryDependencies += "com.h2database" % "h2" % "1.4.196"
If you use Gradle, change the dependency in build.gradle from:
testCompile group: 'com.h2database', name: 'h2', version: '1.4.199'
to
compile group: 'com.h2database', name: 'h2', version: '1.4.199'
The dependency's <scope> has to allow the database driver you are working with to be available at runtime. If you change your db from one to another, make sure to update the dependency (and its scope) as well. As soon as I changed it, the error went away.
Use the release version:
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<version>RELEASE</version>
<scope>compile</scope>
</dependency>
In my case it was actually a connection string issue; I saw this same error.
After I added mem to the URL string as below, it worked.
String url = "jdbc:h2:mem:~/test";
Remove the scope tag:
<scope>test</scope>
Using <scope>test</scope> will not work here, logically. Try it with <scope>runtime</scope> or <scope>provided</scope>, unless you need it only for the testing phase.
The Maven docs say that a <scope>test</scope> dependency is not required for normal use of the application and is only available for the test compilation and execution phases:
https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html
I have faced the same issue when using ":" more than once in the H2 URL, because the driver manager cannot identify the proper connection type.
When you use "jdbc:h2:localhost:123/db", it gets divided into "jdbc", "h2:localhost", "123/db",
so the expected value here is h2, but it becomes h2:localhost, because of an issue in the core regex used to identify the driver class from the list of loaded drivers.
Use the full path, like:
Don't: "jdbc:h2:testdb", Do:"jdbc:h2:/c:/.../testdb"
Don't: "jdbc:h2:localhost:123/db", Do: "jdbc:h2://localhost:123/db"
I am working with IntelliJ. I had h2-1.4.200 in the lib folder. I tried every suggestion here. Strangely, my problem got solved only by going to these places:
Project Structure -> Artifact -> Output Layout -> Available Elements
and then expanding the content of the folder, right-clicking on "h2-1.4.200", and finally selecting "Extract Into Output Root". :)
The strange solution
This worked with:
testRuntimeOnly 'com.h2database:h2:1.4.200'
and, in the application.yaml properties,
remove or comment out # driverClassName: org:h2:Driver
As per the latest Spring Boot documentation, the JDBC driver class is not required, as it is automatically detected.
I ran into this issue; here are the steps to fix it:
1 - Corrected org.h2.driver to org.h2.Driver (the driver class name is case-sensitive).
2 - Verify that the scope of the h2 dependency in pom.xml is not set to test; if it is, remove it so your dependency looks similar to:
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<version>2.1.214</version>
</dependency>
3 - Finally, I had to run mvn clean verify compile. This will refresh the classes in the target folder and add the H2 classes to your classpath if they are not there.
Hope it helps :)
I use sbt.
In build.sbt, I removed the "h2" dependency and included this instead:
"com.h2.database" % "h2" % "1.4.200"
And it worked!