I have a mock class with a trivial implementation of a service I provide from a module. I'm using OpenJDK 11.0.3, Gradle 5.2.1 and IntelliJ 2019.2.
In /main/code/myPackage/myService.java I have:
package myPackage;

// must be declared abstract since it has an abstract method
abstract class myService {
    public abstract void someFunction();
}
And in my test/code/somePackage/myMockService I have:
package myPackage;

// no import needed, they're in the same package
class myMockService extends myService {
    @Override
    public void someFunction() { System.out.println("Hello World"); }
}
In my main/code/module-info.java I have:
module myModule {
    exports somePackage;
}
I've tried several variations on a test/code/module-info.java, without success. For example:
// "open module" lets anyone use reflection within (mostly JUnit 5 in my case)
import myPackage.myService;
import myPackage.myMockService;
open module myTestModule {
exports myPackage;
provides myService with myMockService
}
The above module-info.java spews errors about how "module name myTestModule does not match expected name myModule" and "package 'myPackage' is not visible" (from myMockService.java), explaining that "package myPackage is declared in module myModule, but module myTestModule does not read it".
On the other hand, with the following module-info.java, I get a different batch of errors (below the code)
import myPackage.myService;
import myPackage.myMockService;

open module myModule {
    provides myService with myMockService;
}
Without a requires myModule;, every reference to the main code branch from my test code gives an "error: cannot find symbol". With a requires myModule;, I get an "error: cyclic dependence involving myModule".
So... my tests can't be in a different module. AND they can't be the same module! [long string of expletives deleted]
How do I introduce a mock version of a service in test code rather than creating an entirely different module/gradle sub-project?
Or is this simply a case where that's not possible, and while you can have a separate test module-info, you can't do much with it?
Or is there some way to dynamically load things at runtime such that I don't have to put every little mock service in any module-info, test or otherwise? Such that ServiceLoader.load() will find them. Hmm... perhaps extend ServiceLoader and wrap its usage in main code such that it'll use the right one either in production code or test code...
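For what it's worth, the kind of wrapper I have in mind would look roughly like this (just a sketch, all names invented; the test would call setOverride with the mock, production code never would):

package myPackage;

import java.util.ServiceLoader;

// Hypothetical locator: main code asks this class instead of calling ServiceLoader directly,
// so test code can push a mock in without any module-info gymnastics.
// The module that uses it still needs "uses myPackage.myService;" in its module-info.
public final class myServiceLocator {
    private static volatile myService override;   // set only from test code

    public static void setOverride(myService mock) {
        override = mock;
    }

    public static myService get() {
        if (override != null) {
            return override;
        }
        return ServiceLoader.load(myService.class)
                .findFirst()
                .orElseThrow(() -> new IllegalStateException("no myService provider found"));
    }
}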
a) Welcome to "Testing in the Modular World"!
TL;DR https://sormuras.github.io/blog/2018-09-11-testing-in-the-modular-world.html
Having one or more dedicated test modules is good. With all bells and whistles, read: module-info.java declarations. Those test modules are your main modules' first clients. Just make sure that your build tool packages all main modules before compiling and running the test modules; otherwise you don't test your main modules as close to reality as possible. Others will consume your main modules as JAR files, so should you. This solves all issues with services and multi-release JARs as well.
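As a hedged sketch of such a dedicated test module for the question's setup (it assumes myModule actually exports myPackage and that myService is public; the mock has to live in a package owned by the test module, because two modules may not split the package myPackage, and myTestModule.mocks is made up here):

// test sources' module-info.java: the test module is just another client of myModule
open module myTestModule {
    requires myModule;                  // read the module under test
    requires org.junit.jupiter.api;     // JUnit 5 API
    uses myPackage.myService;           // only needed if the tests call ServiceLoader themselves
    provides myPackage.myService with myTestModule.mocks.myMockService;  // mock in a test-owned package
}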
Now the interesting part: in-module testing, also named white-box testing. How do you test types residing in non-exported packages, or package-private types in exported packages? Either use a build tool that knows how to patch test modules into main modules (or vice versa) at test compile time and/or test runtime, like pro or Bach.java (which I maintain), or, in your case of using Gradle, see part b) of this answer.
b) Gradle and Java main, test, … modules are not friends out-of-the-box, yet
Best plugin-based solution: https://github.com/java9-modularity/gradle-modules-plugin -- which honors the "pass these java command-line options at test runtime" module-info.test configuration file (which I invented). Here you basically describe your test module requirements via verbose command-line options, although a perfect DSL already exists: module-info.java ... which loops back to a) and the module-aware build tools.
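To give an idea of what ends up in such a module-info.test file for the question's module (treat the exact line layout as an assumption and check the plugin's README), it is essentially a list of extra options handed to the test JVM:

--add-modules
  org.junit.jupiter.api
--add-reads
  myModule=org.junit.jupiter.api
--add-opens
  myModule/myPackage=org.junit.platform.commons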
c) IntelliJ IDEA and Java test modules are ... improving!
https://youtrack.jetbrains.com/issue/IDEA-171419 module-info.java support in 2019.3
https://youtrack.jetbrains.com/issue/IDEA-222831 module-info.test support, soon?
Related
I have two jars. One provides a service interface and a service-loading class, and the other provides the implementation of that service.
This works perfectly when running on JDK 8, but I get a "service type not accessible to unnamed module #3754a4bf" error when running on JDK 9 or higher.
I migrated those two jars to module-based jars, and that works great when running on JDK 9 or higher, but it fails on JDK 8, of course, because of the wrong class file version.
So, taken individually, I don't have a problem using the Java ServiceLoader API with Java 8 or with Java 9.
I'm aware of https://blog.codefx.org/tools/multi-release-jars-multiple-java-versions/, but I want to avoid making the build process more complicated. The build already involves class relocation and other stuff.
My question: is there a way to use the same jars on both JDK 8 and JDK 9+ with the Java ServiceLoader API?
In JDK 9 and up, all jars are effectively 'modules'. If a jar does not have a module definition (which you create by making a file named module-info.java and putting a module declaration inside), its name is 'unnamed module #whatever', it exports all its packages, and it can access anything exported by any other module (in module-speak: it 'reads' everything). Meaning, if all your classpath dependencies are such unnamed modules, they all export everything and they all read everything, so they can all access anything marked public from anybody else, which is how it worked in JDK 8, and thus why it's all compatible.
To be clear: in JDK 9, for code in module (jar) A to access a method, class, or field from module (jar) B, then in addition to the usual access modifiers (the public keyword), B needs to export that package and A needs to read that module, or it doesn't work. If you aren't explicitly writing module-info files for either side, you get the default behaviour of 'export everything' and 'read everything', bringing us back to the JDK 8 scenario: the thing has to be marked public, and then you can access it.
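Spelled out as explicit declarations, just to illustrate the mechanism (module and package names invented):

module b.mod {
    exports com.example.b.api;   // B makes this package visible to modules that read it
}

module a.mod {
    requires b.mod;              // A reads B, so public types in com.example.b.api are accessible to A
}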
I actually do not recommend you make that module-info file; that's a big step, one you should only take once you're familiar with the module system.
The error means that the 'service type'[1] is not 'accessible'[2] to 'unnamed module #3754a4bf'[3]. Let's break that into pieces:
[1] ServiceLoader works by defining an interface that a 'service provider' implements; the class using the service loader then ends up with a bunch of instances of that interface, each representing one implementation. It's the Foo in ServiceLoader.load(Foo.class).
[2] 'accessible' is module-speak for: either the code that needs this did not explicitly 'read' it, or the code that has it did not export it.
[3] 'unnamed module #3754a4bf' is the name of the module that is trying to access it.
Taking this all together, it means you're operating under false assumptions: whatever jar contains your service interface (that Foo I was talking about) is NOT an unnamed module, or, possibly, the interface isn't public. Note that if it is a public interface inside a non-public class (i.e. /* package private */ class Example { public interface MyServiceInterface {} }), that probably still counts as 'not public enough'.
If it's a named module (which means it has a module-info.java file), then export the package that the service interface is in; see any Jigsaw (the name of the Java module system) tutorial on how to set that up. If it's not, ensure the service type is a public interface or abstract class, and if it's nested inside some other type, make sure those are public too. If neither is the case, check your classpaths; javac is rather adamant that one of these two is the case.
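For example, a minimal 'good' shape looks like this (names hypothetical); on the non-modular path only the public, top-level interface matters, and on the named-module path the owning jar additionally exports its package:

// com/example/api/Foo.java -- the service type handed to ServiceLoader.load(Foo.class)
package com.example.api;

public interface Foo {              // public, and not nested inside a non-public type
    void run();
}

// module-info.java, only if that jar is a named module
module com.example.api {
    exports com.example.api;        // without this, the service type is not accessible to other modules
}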
This is my base module, which needs implementations of the interfaces defined in the myspi package. Various providers can offer MyProvider implementations, and the base module uses them via the myspi.MyProvider interface.
module base {
    exports myspi;
    uses myspi.MyProvider;
}
This is my sample implementation module, which provides the MyProvider service with MyProviderImpl:
module myspi.provider {
    requires base;   // needed to see the myspi.MyProvider interface
    provides myspi.MyProvider with myspi.provider.MyProviderImpl;
}
All of this works fine when I load the implementations in the base module with:
public static List<MyProvider> getMyProviders() {
    var myProviders = new ArrayList<MyProvider>();
    for (MyProvider myProvider : ServiceLoader.load(MyProvider.class)) {
        myProviders.add(myProvider);
    }
    return myProviders;
}
But the same code returns an empty list in the JUnit 5 test code (the ServiceLoader finds no providers). How can I test the service provider modules with JUnit 5? Or is there any alternative to JUnit that allows us to create test modules (a modularized test API) that declare "uses myspi.MyProvider" in their module-info and work fine with getMyProviders()?
Basically you're on the right track. You need to convince the Java module system that your test modules are the single source of truth when it comes to resolving modules at test runtime.
Black-box testing is easy.
White-box testing in the modular world, meaning testing protected and package-private members within a module, is tricky. There are at least two ways to achieve this: a) use java command-line options to configure the Java module system at test startup, or b) blend the main sources into the test sources at compile time and maintain a dedicated module-info.java in your test sources.
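As an illustration of option b) for the base/myspi setup from the question (MyProviderStub is a made-up in-module test implementation), the module-info.java kept in the test sources mirrors the main one and adds the test-only directives:

// test sources' module-info.java -- same module name as the main one, used when compiling and running tests
open module base {
    exports myspi;
    uses myspi.MyProvider;
    requires org.junit.jupiter.api;                         // test-only requirement
    provides myspi.MyProvider with myspi.MyProviderStub;    // hypothetical stub living in the test sources
}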
Please visit the links to the blogs and examples posted over at How to make a modular build with jdk > 1.8
Here is an excerpt for convenience:
Examples
Work-in-progress blueprint https://github.com/sormuras/sandbox/tree/master/sors-modular-testing-blueprint
Integration tests starting with "modular-world-" at https://github.com/sormuras/junit-platform-maven-plugin/tree/master/src/it
Background and other resources
https://github.com/junit-team/junit5-samples/tree/master/junit5-modular-world
https://github.com/forax/pro
https://blog.codefx.org/java/five-command-line-options-to-hack-the-java-9-module-system/
And expect most IDEs not to support you here either. For now.
SOLVED!
I've moved JUnit from the class-path to the module-path and also removed all the JUnit 4 compatibility stuff such as @RunWith etc., making my test a pure JUnit 5 test.
I've added a module-info.java (JUnit 5 doesn't require an open module, although the books say the opposite).
After I'd modularized the tests I found that the ServiceLoader code still didn't execute, so I started looking for the fault myself.
And I found it! Running the ServiceLoader code in the base module worked because the base module refers to the exported myProvider.jar, which in turn accesses a myProvider-config.properties file in the same directory. Without this config file myProvider cannot work properly.
The problematic test module, on the other hand, referred to the Eclipse project of myProvider instead of its exported .jar file, and hence could not find its config file and exited. I had moved this config file from NetBeans to Eclipse by simply copying it into the same directory. Thus the missing config file was the problem.
After changing the project settings I could run the tests without any failure.
I would like to thank all the contributors who responded.
This is quite an old post, but if anyone gets here trying to test Java modules with JUnit 5 and Gradle, especially the consumer/provider setup presented in this post, Sormuras' solution is the easy way: patch the consumer module with the test classes.
It is supported by the gradle-modules-plugin, which does that out of the box:
https://github.com/java9-modularity/gradle-modules-plugin
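Applying it is a couple of lines in build.gradle; the plugin id below is the one I recall from the project's README, and the version is just a placeholder, so check there for the current one:

plugins {
    id 'java'
    id 'org.javamodularity.moduleplugin' version '1.x'   // placeholder version, see the plugin's README
}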
I have an "executable" Java 9 module (meaning it won't expose any packages, it just contains a main function) which I need to test.
I am using Gradle's java-library and org.gradle.java.experimental-jigsaw plugins.
I have some package-private methods I need to test, and when I run in IntelliJ the tests work, but when running with Gradle, I get many errors like this:
abc.MyClassTest > myTestMethod FAILED
java.lang.IllegalAccessException
In the Gradle report, I see the root of the error:
class org.junit.runners.BlockJUnit4ClassRunner (in module junit)
cannot access class abc.MyClassTest (in module com.my.mod)
because module com.my.mod does not export abc to module junit
If I add this to my module-info.java file, it works with a warning:
exports abc to junit; // I don't really want to export this
Warning (when compiling):
warning: [module] module not found: junit
This looks pretty horrible even without the warning, in my opinion.
My question: how to "open" this package for tests only to avoid warnings and errors?
As the Gradle Plugin is still experimental (6 months after the release of Java 9), it seems it still has some bugs and missing features.
But here's how to work around the issue until the Gradle devs improve the situation.
Add this to your build file:
test {
    doFirst {
        jvmArgs += [
            '--add-exports', "module/package=junit",
        ]
    }
}
Where module/package is the name of the module and package you need to make visible to JUnit.
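With the module and package names from the question above, that pair would presumably read:

            '--add-exports', "com.my.mod/abc=junit",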
This is basically equivalent to the exports package to junit clause in module-info.java, but without having the clause make it into the compiled jar!
This option is not documented in javac -help, but you can at least find it mentioned in the Oracle JDK Docs and, more thoroughly, in JEP-261... see the Breaking encapsulation section.
I have some JUnit tests contained in a .jar that is intended to be used as a library. The library contains some tests that should be run whenever the library is used in another project.
However, when I create a new project using the library and run JUnit on it in Eclipse, the tests in the dependency .jar don't run / don't get detected by the JUnit test runner. I get the message:
No tests found with test runner 'JUnit 4'.
Is there a way I can configure the dependency .jar so that the tests will run alongside any tests that might be contained in the main project?
Basically I want the dependency .jar to "export" the tests to whatever projects it is used in.
I'm using Eclipse Juno, JUnit 4.10, and Maven for the dependency management.
EDIT:
The point of this library is to be able to help test projects that use it - i.e. it runs some specialised tests. This is why I want to be able to import the library .jar and have it contribute the extra tests to the importing project.
You can try Maven Surefire.
In some cases it would be useful to have a set of tests that run with various dependency configurations. One way to accomplish this would be to have a single project that contains the unit tests and generates a test jar. Several test configuration projects could then consume the unit tests and run them with different dependency sets. The problem is that there is no easy way to run tests in a dependency jar. The Surefire plugin should have a configuration to allow me to run all or a set of unit tests contained in a dependency jar.
This can be done as follows (Junit 3):
Ensure test jar contains a class which has a static suite() method
import junit.framework.Test;
import junit.framework.TestSuite;

public class AllTests {
    public static Test suite() {
        TestSuite suite = new TestSuite("All Tests");
        suite.addTestSuite(TestOne.class);
        suite.addTestSuite(TestTwo.class);
        return suite;
    }
}
Then in the project using the test-jar dependency:
create a TestCase:
package org.melati.example.contacts;

import org.melati.poem.AllExportedTests;
import junit.framework.Test;
import junit.framework.TestCase;

public class PoemTest extends TestCase {
    public static Test suite() {
        return AllExportedTests.suite();
    }
}
Now the tests will be found.
I think that making a library of unit tests (@Test-annotated methods) is a bad idea. However, making a library of reusable test components is a good one. We've done this in a few open source projects, and you can take a look at how it works.
One Maven module exports test components (we call them "mocks") from the src/mock/java directory. The exported artifact has a -mock classifier. See rexsl/pom.xml (pay attention to the highlighted lines).
Mock artifacts are deployed to Maven Central together with the usual artifacts: http://repo1.maven.org/maven2/com/rexsl/rexsl-core/0.3.8/ (pay attention to the ...-mock.jar files).
Modules that need those mocks can include them as usual artifacts, for example rexsl-core/pom.xml (see the highlighted lines).
Then, in your unit tests, just use the classes from those mock libraries, like regular mock builders, for example BulkHttpFeederTest.
That's how you can make your test artifacts reusable in an elegant way. Hope it helps.
@Mikera,
I find that this may help you. Just extend the TestCase class in one of the Java classes in your project, and you can run that particular class as a JUnit test.
I am not sure that this is desirable. On the one hand, if you use a jar, its behaviour might be influenced by the external context, e.g. other libraries on the classpath; from inside the jar, there is no simple way to analyse this context and adjust the tests accordingly. On the other hand, if you write and compile a library, you should test it before packaging it as a jar. You might even want to leave the tests out of the jar entirely.
If it is really important to you to run the tests again, I would be interested in what could make them fail without the jar itself changing. In that case, however, you might want to extend the test runner. As far as I know it uses reflection; you can quite easily load jars in a classloader, go through all their classes, identify the test classes by reflection and assemble test suites. You could look into the test runner for an example. Still, you would need to start this process from outside, e.g. from inside one of your test classes in the client project. Here, QATest's approach might be helpful: by providing an overridden version of the test suite or test runner, you could automate this, provided the client uses your overridden API.
Let me know if this rather costly approach seems to be applicable in your scenario and I can provide code examples.
Why should the user of the jar run the test cases inside the jar?! When the jar is packaged and delivered, that means its unit tests have already run successfully.
Typically, the jar itself should be treated either as a separate project or as one of the modules. In both cases, the unit tests are run before it is delivered.
As I read in The Art of Unit Testing (pp. 78-80), .NET can hide seams that exist only for testing from production code at runtime. For example:
public class LogAnalyzer
{
    ...
    internal LogAnalyzer(IExtensionManager extentionMgr)
    {
        manager = extentionMgr;
    }
}
and the internals are made visible to the test assembly like this:
using System.Runtime.CompilerServices;

[assembly: InternalsVisibleTo("AOUT.CH3.Logan.Tests")]
So the internal LogAnalyzer constructor can only be called from test classes, without the worry of adding extra cost to production code just for the sake of testability.
After a brief survey, it seems Java does not have an equivalent feature.
But does Java have alternatives?
Thanks.
What about implementing your own custom ClassLoader? You can define your own annotation, like @HideFromProductionCode, and have your custom ClassLoader throw an exception if it loads a class that has the @HideFromProductionCode annotation. See How to set my custom class loader to be the default?
Alternatively, just add a script to your build process that goes through all your compiled production code and looks for uses of the @HideFromProductionCode annotation.
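A minimal sketch of the class-loader idea, assuming a runtime-retained marker annotation (everything below is illustrative, not an existing API):

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Marker for types that only tests may load.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
@interface HideFromProductionCode {}

// Delegates loading to the parent, then rejects classes carrying the marker.
class ProductionClassLoader extends ClassLoader {
    ProductionClassLoader(ClassLoader parent) {
        super(parent);
    }

    @Override
    protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
        Class<?> clazz = super.loadClass(name, resolve);
        if (clazz.isAnnotationPresent(HideFromProductionCode.class)) {
            throw new ClassNotFoundException(name + " is marked @HideFromProductionCode");
        }
        return clazz;
    }
}

Note the check only applies to lookups that actually go through this loader, hence the linked question about making it the default.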
One fairly straightforward approach would be to use a Maven-like directory structure, with separate directories for production code and test code (typically src/main and src/test). When unit tests are run, the classpath includes both the main and the test directories, but the JAR that gets deployed to production only includes classes from the main directory; this way, production code that references test classes will result in a compile error.
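For reference, the layout being described is the standard Maven/Gradle one, and only the main tree ends up in the shipped artifact:

src/main/java/...   production code, compiled and packaged into the jar
src/test/java/...   test-only code, on the test classpath but never shipped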