Gradle and JaCoCo: instrument classes from a separate subproject - java

I have a legacy application that has a unit test module that's separate from the application modules. I'm converting the project to use Gradle and the structure looks like this:
/root
/module1
/module2
...
/moduleN
/test
where the test module executes tests for module1 through moduleN (and depends on them). I know this isn't a very good practice, as it somewhat defeats the purpose of unit tests, but as we all know, legacy code is always a headache to work with.
So before I start refactoring the code so that each module has its own unit tests (which means disassembling the test module in a sensible way, i.e., a lot of work), I wanted to find a temporary solution to get correct code coverage, i.e., have JaCoCo instrument all the classes from module1, ..., moduleN instead of just the classes in the test module.
Is there a way to tell JaCoCo to instrument classes from other modules?

To include coverage results from the "module*" subprojects in the "test" project, you might want to try something like this in the build.gradle of the test project:
// [EDIT] - 'afterEvaluate' did not work here; use 'gradle.projectsEvaluated' instead (see comments)
// afterEvaluate {
gradle.projectsEvaluated {
    // include sources from all dependent projects (compile dependencies) in the JaCoCo test report
    jacocoTestReport {
        // get all projects we have a (compile) dependency on
        def projs = configurations.compile.getAllDependencies().withType(ProjectDependency).collect { it.getDependencyProject() }
        projs.each {
            additionalSourceDirs files(it.sourceSets.main.java.srcDirs)
            additionalClassDirs files(it.sourceSets.main.output)
        }
    }
}
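With that in place, running the test project's test task followed by its jacocoTestReport task should produce a report that covers the other modules' classes as well. Assuming the test module's Gradle path is :test, that would be something like:
gradle :test:test :test:jacocoTestReport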

Related

In a Java Maven or Gradle project, can we write test cases in the main package rather than in the test package?

How do we make a Maven or Gradle build pick up test cases from the main package if we have written them there?
In Gradle you just redefine the main and test sourceSets with filters in your build.gradle file to exclude/include the test files in the specific phase of the build.
For example, a file named LibraryTest.java will be compiled and executed only during the test phase.
sourceSets {
    main {
        java {
            srcDirs = ['src/main/java']
            exclude '**/*Test.java'
        }
    }
    test {
        java {
            srcDirs = ['src/main/java']
            include '**/*Test.java'
        }
    }
}
You can write your test cases wherever you want, BUT!
It is not a recommended practice; if you are using Maven/Gradle, they give you a dedicated folder/path for writing test cases.
The reason it is not recommended is that Maven/Gradle provide a lot of plugins that help you run your test cases, generate reports for them, and fail the build if test cases fail.
All these plugins look at the default path, so if you decide to use a path other than the default, you need to change the test path in every one of those plugins.
So if you choose your own path for test sources, you are just adding the overhead of additional configuration changes.
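For example, in Maven a non-default test source location has to be declared explicitly in the POM; an illustrative snippet (individual plugins may still need their own include/exclude tweaks on top of it):
<build>
    <testSourceDirectory>src/main/java</testSourceDirectory>
</build>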

Java 9 + maven + junit: does test code need module-info.java of its own and where to put it?

Let's say I have a Java project using Maven 3 and junit. There are src/main/java and src/test/java directories which contain main sources and test sources, respectively (everything is standard).
Now I want to migrate the project to Java 9. The src/main/java content represents a Java 9 module; there is a com/acme/project/module-info.java looking approximately like this:
module com.acme.project {
    requires module1;
    requires module2;
    ...
}
What if the test code needs a module-info.java of its own? For example, to add a dependency on some module that is only needed for tests, not for production code. In such a case, I have to put a module-info.java in src/test/java/com/acme/project/, giving the module a different name. This way Maven seems to treat main sources and test sources as different modules, so I have to export packages from the main module to the test module and require packages in the test module, something like this:
main module (in src/main/java/com/acme/project):
module prod.module {
    exports com.acme.project to test.module;
}
test module (in src/test/java/com/acme/project):
module test.module {
    requires junit;
    requires prod.module;
}
This produces
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.7.0:testCompile (default-testCompile) on project test-java9-modules-junit: Compilation failure: Compilation failure:
[ERROR] /home/rpuch/git/my/test-java9-modules-junit/src/test/java/com/acme/project/GreeterTest.java:[1,1] package exists in another module: prod.module
because one package would be defined in two modules. So now I would have to split the main module and the test module into different projects, which is not convenient.
I feel I am following the wrong path; it all starts looking very ugly. How can I have a module-info.java of its own in the test code, or how do I achieve the same effects (requires, etc.) without it?
The module system does not distinguish between production code and test code, so if you choose to modularize test code, the prod.module and the test.module cannot share the same package com.acme.project, as described in the specs:
Non-interference — The Java compiler, virtual machine, and run-time system must ensure that modules that contain packages of the same name do not interfere with each other. If two distinct modules contain packages of the same name then, from the perspective of each module, all of the types and members in that package are defined only by that module. Code in that package in one module must not be able to access package-private types or members in that package in the other module.
As indicated by Alan Bateman, the Maven compiler plugin uses --patch-module and other options provided by the module system when compiling code in the src/test/java tree, so that the module under test is augmented with the test classes. And this is also done by the Surefire plugin when running the test classes (see Support running unit tests in named Java 9 modules). This means you don't need to place your test code in a module.
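Conceptually, that means the plugins end up passing options roughly like the following to javac and to the test JVM (an illustration using the module name from the question, not the exact generated command line):
--patch-module com.acme.project=target/test-classes
--add-reads com.acme.project=ALL-UNNAMED
Here --patch-module merges the test classes into the main module, and --add-reads lets the patched module read junit and the other test libraries that stay on the classpath (the unnamed module).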
You might want to rethink the project design you're trying to implement. Since the module and its tests live in the same project, you should refrain from using a separate module for each of them.
There should be just one single module-info.java for a module and its corresponding tests.
Your project structure might look like this:
Project/
|-- pom.xml/
|
|-- src/
| |-- test/
| | |-- com.acme.project
| | | |-- com/acme/project
| | | | |-- SomeTest.java
| |
| |-- main/
| | |-- com.acme.project
| | | |-- module-info.java
| | | |-- com/acme/project
| | | | |-- Main.java
where the module-info.java could be:
module com.acme.project {
    requires module1;
    requires module2;
    // requires junit; not required when using Maven
}
To sum up all of the above as per your question:
    I feel I am following the wrong path; it all starts looking very ugly. How can I have a module-info.java of its own in the test code, or how do I achieve the same effects (requires, etc.) without it?
Yes, you are on the wrong path; you should not manage a separate module for the test code, as that only makes things complex.
You can achieve a similar effect by treating junit as a compile-time-only dependency using the following directive:
requires static junit;
Using Maven, you can achieve this by following the structure stated above and relying on the maven-surefire-plugin, which takes care of patching the tests into the module by itself.
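In other words, junit stays an ordinary test-scoped dependency in the POM and never has to appear in the main module descriptor. For illustration, with the usual JUnit 4 coordinates:
<dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.12</version>
    <scope>test</scope>
</dependency>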
Adding some details.
In Java since 9, a jar file (or a directory with classes) may be put on the classpath (as before) or on the module path. If it is added to the classpath, its module-info is ignored, and no module-related restrictions (what reads what, what exports what, etc.) are applied. If, however, a jar is added to the module path, it is treated as a module, so its module-info is processed, and the additional module-related restrictions are enforced.
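As an illustration (the jar name, module name, and main class are made up), the same jar behaves differently depending on how it is launched:
# module-info ignored, code runs from the classpath
java -cp app.jar com.acme.project.Main
# module-info processed, module rules enforced
java --module-path app.jar -m com.acme.project/com.acme.project.Main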
Currently (version 2.20.1), maven-surefire-plugin can only work the old way: it puts the classes being tested on the classpath, and the module path is ignored. So, right now, adding a module-info to a Maven project should not change anything for tests run with Maven (via the surefire plugin).
In my case, the command line is like the following:
/bin/sh -c cd /home/rpuch/git/my/test-java9-modules-junit && /home/rpuch/soft/jdk-9/bin/java --add-modules java.se.ee -jar /home/rpuch/git/my/test-java9-modules-junit/target/surefire/surefirebooter852849097737067355.jar /home/rpuch/git/my/test-java9-modules-junit/target/surefire 2017-10-12T23-09-21_577-jvmRun1 surefire8407763413259855828tmp surefire_05575863484264768860tmp
The classes under test are not added as a module, so they are on the classpath.
Currently, work is under way in https://issues.apache.org/jira/browse/SUREFIRE-1262 (SUREFIRE-1420 is marked as a duplicate of SUREFIRE-1262) to teach the surefire plugin to put the code under test on the module path. Once that is finished and released, the module-info will be taken into account. And if they make the module under test read the junit module automatically (as SUREFIRE-1420 suggests), the module-info (which is the main module descriptor) will not have to include a reference to junit (which is only needed for tests).
To summarize:
module-info just needs to be added to the main sources
for the time being, surefire ignores the new module-related logic (but this will change in the future)
(once modules work under surefire tests) junit will probably not need to be added to the module-info
(once modules work under surefire tests) if some module is required by the tests (and only by them), it may be added as a compile-only dependency (using requires static), as suggested by @nullpointer. In this case, the Maven module will have to depend on an artifact supplying that test-only module using compile (not test) scope, which I don't like much.
I just want to add my $0.02 here on the general testing approach, since it seems no one is addressing Gradle, and we use it.
First things first, one needs to tell Gradle about modules. It is fairly trivial, via the following (this will be on by default from Gradle 7):
plugins.withType(JavaPlugin).configureEach {
    java {
        modularity.inferModulePath = true
    }
}
Once you need to test your code, Gradle says this:
If you don’t have a module-info.java file in your test source set (src/test/java) this source set will be considered as traditional Java library during compilation and test runtime.
In plain English: if you do not define a module-info.java for testing purposes, things "will just work", and in the majority of cases this is exactly what we want.
But that is not the end of the story. What if I do want to define a JUnit 5 Extension, registered via the ServiceLoader? That means I need to go into a module-info.java for the tests, one that I do not yet have.
And Gradle has that solved again:
Another approach for whitebox testing is to stay in the module world by patching the tests into the module under test. This way, module boundaries stay in place, but the tests themselves become part of the module under test and can then access the module’s internals.
So we define a module-info.java in src/test/java, where I can put:
provides org.junit.jupiter.api.extension.Extension with zero.x.extensions.ForAllExtension;
We also need to do --patch-module, just like the Maven plugins do. It looks like this:
def moduleName = "zero.x"
def patchArgs = ["--patch-module", "$moduleName=${tasks.compileJava.destinationDirectory.asFile.get().path}"]
tasks.compileTestJava {
    options.compilerArgs += patchArgs
}
tasks.test {
    jvmArgs += patchArgs
}
The only problem is that IntelliJ does not "see" this patch and thinks that we also need a requires directive (requires zero.x.services), but that's not really the case. All the tests run just fine from the command line and from IntelliJ.
The example is here
Also note that maven-surefire-plugin now has useModulePath as a configuration option, which can be set to false.
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>3.0.0-M6</version>
    <configuration>
        <useModulePath>false</useModulePath> <!-- tests use classpath -->
    </configuration>
</plugin>
With this option a project uses the module path for the main code but the classpath for tests and testing. It is probably not a bad approach to fall back on if "patching" the module path gets painful.
Edit: we can also set this via the surefire.useModulePath property, e.g.:
<properties>
    <surefire.useModulePath>false</surefire.useModulePath>
</properties>
I was not able to make it work even with the latest Maven surefire plugin version (3.0.0-M5). It seems that if the main sources use a module, the compiler plugin on Java 11 also expects referenced packages to be in a module.
My solution was to place a module-info.java of its own inside the test sources (src/test/java in Maven) for the test module, with the contents below.
In my case I had to use the keyword open (see "Allowing runtime-only access to all packages in a module") because I'm using Mockito in my tests, which requires reflective access.
// the same module name as for the main module can be used, so this test module is also named "com.foo.bar"
open module com.foo.bar {
    // I use JUnit 4
    requires junit;
    // require Mockito here
    requires org.mockito;
    // very important, Mockito needs it
    requires net.bytebuddy;
    // add your own stuff here
    requires org.bouncycastle.provider;
}

Shared test code in multi-project Maven Java environment

I have a Maven multi-project set-up with a parent POM, one library-type jar, and several application jars using this library. Now I'd like to take advantage of this library jar's testing facilities (especially the API mocking features) in the unit tests of the applications, to avoid repeating the same unit test code in all of the applications, which is what I am doing now. However, it seems this is not trivial.
Have any of you done this, or do you have any other solutions to suggest for this problem?
[edit]
clarification:
If I have applications A, B and library L, and dependencies like this:
A -> L <- B
I would like the test code of the above to have corresponding dependencies:
A' -> L' <- B'
So the application's test code could see the library's test code.
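A common way to get the A' -> L' <- B' arrangement with plain Maven is to attach the library's test classes as a test-jar and let the applications depend on that artifact with test scope. A minimal sketch (coordinates are placeholders):
In L's pom.xml:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <executions>
        <execution>
            <goals>
                <goal>test-jar</goal>
            </goals>
        </execution>
    </executions>
</plugin>
In A's and B's poms:
<dependency>
    <groupId>com.example</groupId>
    <artifactId>library-l</artifactId>
    <version>1.0</version>
    <type>test-jar</type>
    <scope>test</scope>
</dependency>
Note that a test-jar only ships the compiled test classes; the library's test-scoped dependencies are not carried over transitively, so the applications may need to declare those again in their own POMs.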

Run JUnit tests from a dependency jar in Eclipse

I have some JUnit tests contained in a .jar that is intended to be used as a library. The library contains some tests that should be run whenever the library is used in another project.
However, when I create a new project using the library and run JUnit on it in Eclipse, the tests in the dependency .jar don't run / don't get detected by the JUnit test runner. I get the message:
No tests found with test runner 'JUnit 4'.
Is there a way I can configure the dependency .jar so that the tests will run alongside any tests that might be contained in the main project?
Basically I want the dependency .jar to "export" the tests to whatever projects it is used in.
I'm using Eclipse Juno, JUnit 4.10, and Maven for the dependency management.
EDIT:
The point of this library is to be able to help test projects that use it - i.e. it runs some specialised tests. This is why I want to be able to import the library .jar and have it contribute the extra tests to the importing project.
You can try Maven Surefire.
In some cases it would be useful to have a set of tests that run with various dependency configurations. One way to accomplish this would be to have a single project that contains the unit tests and generates a test jar. Several test configuration projects could then consume the unit tests and run them with different dependency sets. The problem is that there is no easy way to run tests in a dependency jar. The Surefire plugin should have a configuration to allow me to run all or a set of unit tests contained in a dependency jar.
This can be done as follows (JUnit 3):
Ensure the test jar contains a class which has a static suite() method:
import junit.framework.Test;
import junit.framework.TestSuite;

public class AllTests {
    public static Test suite() {
        TestSuite suite = new TestSuite("All Tests");
        suite.addTestSuite(TestOne.class);
        suite.addTestSuite(TestTwo.class);
        return suite;
    }
}
Then in the project using the test-jar dependency:
create a TestCase:
package org.melati.example.contacts;

import org.melati.poem.AllExportedTests;
import junit.framework.Test;
import junit.framework.TestCase;

public class PoemTest extends TestCase {
    public static Test suite() {
        return AllExportedTests.suite();
    }
}
Now the tests will be found.
I think that making a library of unit tests (@Test annotated methods) is a bad idea. However, making a library of reusable test components is a good one. We've done this in a few open source projects, and you can take a look at how it works.
One Maven module exports test components (we call them "mocks") from the src/mock/java directory. The exported artifact has a -mock classifier. See rexsl/pom.xml (pay attention to the highlighted lines).
Mock artifacts are deployed to Maven Central together with the usual artifacts: http://repo1.maven.org/maven2/com/rexsl/rexsl-core/0.3.8/ (pay attention to the ...-mock.jar files)
Modules that need those mocks can include them as usual artifacts, for example rexsl-core/pom.xml (see the highlighted lines):
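Such a dependency would look roughly like this (an illustrative snippet reconstructed from the coordinates above; see the linked pom.xml for the exact lines):
<dependency>
    <groupId>com.rexsl</groupId>
    <artifactId>rexsl-core</artifactId>
    <version>0.3.8</version>
    <classifier>mock</classifier>
    <scope>test</scope>
</dependency>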
Then, in your unit tests, just use the classes from those mock libraries like regular mock builders, for example: BulkHttpFeederTest
That's how you can make your test artifacts reusable, in an elegant way. Hope it helps.
@Mikera,
I find that this may help you. Just have one of the Java classes in your project extend TestCase, and you can then run that particular class as a JUnit test.
I am not sure that this is desirable. On the one hand, if you use a jar, its behaviour might be influenced by the external context, e.g. other libraries on the classpath. From inside the jar there is no simple way to analyse this context and adjust the tests accordingly. On the other hand, if you write and compile a library, you should test it before packaging it as a jar. You might even want to leave your tests out of the jar.
If it is really important to you to run the tests again, I would be interested in what could make them fail without changing the jar. In that case, however, you might want to extend the test runner. As far as I know it uses reflection. You can quite easily load jars in a classloader and go through all their classes. By reflection you can identify the test classes and assemble test suites. You could look into the test runner for an example. Still, you would need to start this process from outside, e.g. from inside one of your test classes in the client project. Here, QATest's approach might be helpful: by providing an overridden version of the test suite or test runner, you could automate this, if the client uses your overridden API.
Let me know if this rather costly approach seems to be applicable in your scenario and I can provide code examples.
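A rough sketch of that reflective approach (JUnit 3 style, with made-up class and path names) could look like this:
import java.io.File;
import java.lang.reflect.Modifier;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.Enumeration;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import junit.framework.TestCase;
import junit.framework.TestSuite;

public class JarSuiteBuilder {
    // Scans a jar, loads every concrete TestCase subclass and bundles them into one suite.
    public static TestSuite suiteFromJar(String jarPath) throws Exception {
        TestSuite suite = new TestSuite("Tests from " + jarPath);
        URL[] urls = { new File(jarPath).toURI().toURL() };
        try (JarFile jar = new JarFile(jarPath);
             URLClassLoader loader = new URLClassLoader(urls, JarSuiteBuilder.class.getClassLoader())) {
            Enumeration<JarEntry> entries = jar.entries();
            while (entries.hasMoreElements()) {
                String name = entries.nextElement().getName();
                if (!name.endsWith(".class") || name.contains("$")) {
                    continue; // skip resources and inner classes
                }
                String className = name.substring(0, name.length() - ".class".length()).replace('/', '.');
                Class<?> clazz = loader.loadClass(className);
                if (TestCase.class.isAssignableFrom(clazz) && !Modifier.isAbstract(clazz.getModifiers())) {
                    suite.addTestSuite(clazz.asSubclass(TestCase.class));
                }
            }
        }
        return suite;
    }
}
A test class in the client project could then delegate its own static suite() method to JarSuiteBuilder.suiteFromJar(...), so that the JUnit runner in Eclipse picks up the bundled tests.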
Why should the user of the jar run the test cases inside the jar? When the jar is packaged and delivered, that means the unit tests have already run successfully.
Typically, the jar itself should be treated either as a separate project or as one of the modules. In both cases, unit tests are run before it is delivered.

Gradle - how to run the tests from a different gradle project and still get coverage data

Does anyone know how to run the tests from a different Gradle project and still get Emma coverage reporting data?
Here is my current layout:
Root/
settings.gradle (no explicit build.gradle - just defines all subprojects)
SubProjectA/
build.gradle
src/ (all real source is here)
SubProjectATest/
build.gradle
src/ (all testing code is here)
SubProjectB/ (similar structure as A)
SubProjectBTest/ (similar structure as ATest)
I am currently using the emma plugin, and I would like to build SubProjectA and run all the tests in SubProjectATest from within the build.gradle of SubProjectA.
Here are some things I tried inside the build.gradle of SubProjectA
1. testCompile project(':SubProjectATest').sourceSets.test.classes (as suggested by this article), but I got an error "Could not find property 'sourceSets' on project"
2. Just the straight-up testCompile project(':SubProjectATest'), but then I get "..SubProjectA/build/classes/test', not found" and also "Skipping task ':SubProjectA:compileTestJava' as it has no source files."
3. Simply adding a sourceSet like the following:
test {
    java {
        srcDir '../SubProjectATest/src'
    }
}
Adding the source set (option 3) is the only option that worked, but it seems sloppy to do it this way. Does anyone know how to do this using project dependencies?
Update #1
I also tried one of the answers below, using test.dependsOn, and the tests do run, but the Emma plugin reported the following: "build/classes/test', not found"
Options 1 and 2 just add classes to the test compile classpath. This doesn't have any effect on which tests are going to be executed.
Option 3 is the wrong approach, because you should not add sources from project X to project Y.
If what you want is for gradle :SubProjectA:test to also execute :SubProjectATest:test, all you need to do is add a task dependency:
SubProjectA/build.gradle:
test.dependsOn(":SubProjectATest:test")
By the way, what is your motivation for putting the tests in a separate project?
