I'm using Lombok in a Gradle-based project, and the build process requires delomboking the source code before continuing with unit tests or the production build. What is the best way to achieve something like that in Gradle?
At the moment the generated classes are created in a /build/delombok directory.
My first thought was to create a new source set and a task based on compileJava, but if I do that I will probably have to update every other Gradle task to depend on my new one. Is this the right approach, or is there something better?
I'm using Gradle 4.10.
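Roughly what I have in mind is sketched below. This is only a sketch: the 'lombokTool' configuration name and the Lombok version are placeholders, and a real setup would probably also need to hand delombok the compile classpath so it can resolve symbols.
configurations {
    lombokTool
}
dependencies {
    lombokTool 'org.projectlombok:lombok:1.18.8'
}
task delombok(type: JavaExec) {
    classpath = configurations.lombokTool
    main = 'lombok.launch.Main'   // the Main-Class declared in lombok.jar
    args 'delombok', 'src/main/java', '-d', "$buildDir/delombok"
}
compileJava {
    dependsOn delombok
    // Compile the delomboked sources instead of src/main/java
    source = files("$buildDir/delombok")
}
My hope is that hooking delombok in front of compileJava means every other task picks it up transitively, but I'm not sure this is the idiomatic way.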
My library depends on another library; let's call it "lib". I want to test my library with multiple versions of lib, in an automated manner.
Test if my library compiles for each version of lib.
Run JUnit 5 tests for each version of lib.
Are there any existing solutions for this?
I could write a script that changes the version number of lib in my pom.xml and executes mvn compile and mvn surefire:test. I could also use profiles and automate this with a script. I was hoping there is a better way, through something like a Maven plugin.
Maven focuses on reproducible builds, which means that if you repeat the build at a later date you should get the same results, which in turn requires that the dependency versions are fixed.
This fundamental mindset is exactly what you want to challenge. Maven won't like it, even if you have a good reason, and you will most likely need a separate full run for each version instead of looping inside Maven.
The way I would approach this is with a bill-of-materials POM whose dependencyManagement section pins the exact version you want, generated on the local filesystem before each run, and then orchestrate a run for each version you want to test.
You can also leverage your CI system and have a repository that orchestrates this. GitHub Actions can do matrix builds, which might be what you need.
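As a rough sketch of the "one full run per version" idea (a property-based variant of the BOM approach; the property name, coordinates and versions below are placeholders):
<!-- pom.xml: the version of lib comes from a property so each run can override it -->
<properties>
    <lib.version>1.0.0</lib.version>
</properties>
<dependencies>
    <dependency>
        <groupId>com.example</groupId>
        <artifactId>lib</artifactId>
        <version>${lib.version}</version>
    </dependency>
</dependencies>
<!-- each matrix entry or loop iteration then runs, for example:
     mvn -B test -Dlib.version=1.1.0 -->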
I had code that worked correctly when executed during standard unit testing, but didn't work when it was compiled into a jar and added as a dependency of another project.
Finding the root cause and fixing it wasn't an issue, but it got me thinking: how can I test a freshly built jar artifact before deploying it anywhere, to make sure that it will work for end users and other projects? I have googled this topic for several hours but haven't found anything close.
Maybe I'm totally wrong and trying to achieve something weird, but I cannot figure out another way to verify compiled packages and be confident that they will work for others.
Some details about the project: a simple Java library with a few classes, using Gradle 5.5 as the build system and Travis CI as the CI/CD tool. For testing I'm using TestNG, but I can easily switch to JUnit if required.
If you're curious about the code that was not working when compiled into the package, here is a simplified version:
public String readResourceByURI() throws IOException, URISyntaxException
{
    // Resolving the resource URL to a Path only works when the resource is a plain file on disk
    return new String(Files.readAllBytes(Paths.get(ClassLoader.getSystemClassLoader().getResource("resource.txt").toURI())));
}
This function will throw java.nio.file.FileSystemNotFoundException if packaged into the jar file. But as I said the problem is not with the code...
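Just for context (since the question is about testing, not about this particular bug), the usual jar-safe variant is to read the resource as a stream instead of turning its URL into a Path; the snippet below assumes Java 9+ for readAllBytes():
public String readResourceByStream() throws IOException
{
    // Works both from the filesystem and from inside a jar, because no Path is derived from the resource URL
    try (InputStream in = ClassLoader.getSystemClassLoader().getResourceAsStream("resource.txt")) {
        return new String(in.readAllBytes(), StandardCharsets.UTF_8);
    }
}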
Ideally, I want to create a build pipeline that produces jar artifacts, which will then be tested, and if the tests are successful those jars will be automatically deployed to a repository (Maven and/or Bintray).
At the moment all tests are executed before the jar is created, and as a result there is a chance that the compiled code inside the jar will not work due to packaging.
So, to simplify my question: I'm looking for a Gradle configuration that can execute unit tests against a freshly built jar file.
This is what I came up with:
test {
    // Add dependency on jar task, since it will be main target for testing
    dependsOn jar
    // Rearrange test classpath, add compiled JAR instead of main classes
    classpath = project.sourceSets.test.output + configurations.testRuntimeClasspath + files(jar.archiveFile)
    useTestNG()
}
Here I'm changing the default classpath for the test task by combining the folder with the test classes, the runtime dependencies, and the compiled JAR file. I'm not sure if this is the correct way to do it...
I don't think there is a good way to detect this kind of problem in a unit test. It is the kind of problem that is normally found in an integration test.
If your artifact / deliverable is a library, integration tests don't normally make a lot of sense. However, you could spend some time to create a sample or test application that uses your library, which you can then write an integration test for.
You would need to ask yourself whether there are enough potential errors of this kind to warrant doing that:
I don't imagine that you will make this particular mistake again soon.
Other problems of this nature might include assumptions in your library about the OS platform or (occasionally) Java versions ... which can only really be tested by running an application on the different platforms.
Maybe the pragmatic answer is to recognize that you cannot (or cannot afford to) test everything.
Having said that, one possible approach might be to choose a free-standing test runner (packaged as a non-GUI Java application). Then get Gradle to run the test runner as a scripted task with the JARs for your library and the unit tests on the classpath.
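A rough sketch of that idea, assuming the tests are run with JUnit 5 and that junit-platform-console-standalone is on the test runtime classpath (both assumptions, since the project currently uses TestNG):
task testAgainstJar(type: JavaExec) {
    dependsOn jar, testClasses
    // The library code comes from the freshly built jar, not from build/classes
    classpath = files(jar.archiveFile) + sourceSets.test.output + configurations.testRuntimeClasspath
    main = 'org.junit.platform.console.ConsoleLauncher'
    args '--scan-classpath', '--fail-if-no-tests'
}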
In Gradle you can try to execute a scripted task that runs code from your jar. It is a bit convoluted, but it works on a simple POC.
In the main Gradle project, create a subproject 'child'.
Add information about it in settings.gradle:
include 'child'
In build.gradle add this:
task externalTest {
    // Copy the tests into the child project (runs at configuration time, as in the original POC)
    copy {
        from 'src/test'
        into 'child/src/test'
    }
}
externalTest.dependsOn(':child:build')
// Build (and test) the child project after the jar has been produced;
// finalizedBy actually runs the task, unlike merely referencing it in doLast
jar.finalizedBy externalTest
And in child/build.gradle add the parent jar as a dependency:
dependencies {
    compile files('../build/libs/parent.jar')
}
Now, when the main project is built, the child project will be built after the jar has been created.
For example, I have a program with version 0.0.1. Maven must create a separate folder for it - "target/0.0.1/" instead of "target/". The same must be done for versions "0.0.2", "0.0.3", etc.
I use Eclipse and its bundled Maven:
Version: Oxygen.3a Release (4.7.3a)
Build id: 20180405-1200
JDK 1.8.0_172
Maven doesn't work that way, and trying to do something like that will lead to a path of suffering. Options I see include
Creating a separate assembly (and output Jar) for each version (see Maven Assembly Plugin)
Create a multi-project reactor with a separate output configuration for every project. Keep common code in one project that you link as dependency from the others. Possibly use the maven-shade-plugin to re-link the packages in your common project into the individual output projects
As you can see, both of these approaches are pretty hacky and require advanced Maven skills. It would be much easier to have parameterized builds where you pass in the output version. But that would make sense on a CI server like Jenkins.
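If all you really need is the jar itself landing in a version-named folder, a lighter-weight sketch is to point the maven-jar-plugin at a different output directory; whether that is enough for your use case is an assumption:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <configuration>
        <!-- e.g. target/0.0.1/myapp-0.0.1.jar -->
        <outputDirectory>${project.build.directory}/${project.version}</outputDirectory>
    </configuration>
</plugin>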
I am wondering how to perform specific tasks during a Maven build: I would like to use some of my own code to do preprocessing on data that I ship in the resulting jar. Given some input.xml in src/main/resources, I would like to be able to call a Java function / main method to obtain a file output.xml, which is then available as a resource (and probably placed in target/classes/...). With Makefiles this would correspond to an additional rule, and I guess this could be done with an Ant task as well (though I have never used Ant myself). Can I add such a rule to a Maven project as well?
You can use the Maven Exec Plugin to run arbitrary Java code during your build.
If you should happen to have your tasks formulated as Ant targets, the Maven Antrun Plugin can be used to run those.
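For the Exec Plugin route, a hedged sketch of what that could look like; the class name, phase, and file names are assumptions:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>exec-maven-plugin</artifactId>
    <version>1.6.0</version>
    <executions>
        <execution>
            <id>generate-output-xml</id>
            <!-- runs after compilation so the preprocessor class itself is available -->
            <phase>process-classes</phase>
            <goals>
                <goal>java</goal>
            </goals>
            <configuration>
                <mainClass>com.example.Preprocessor</mainClass>
                <arguments>
                    <argument>src/main/resources/input.xml</argument>
                    <argument>${project.build.outputDirectory}/output.xml</argument>
                </arguments>
            </configuration>
        </execution>
    </executions>
</plugin>
Writing the result into ${project.build.outputDirectory} (target/classes) means it is picked up by the jar like any other resource.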
I've set up a Gradle task to auto-generate one of the subprojects of my Gradle build on which another depends (reason for doing this: long story involving Apache Cordova!). So the root build.gradle contains this autogenerate task that creates a "CordovaLib" subproject. The build.gradle in the other subproject (the one that isn't autogenerated) depends on CordovaLib:
dependencies {
    compile project(':CordovaLib')
}
Is there a way to execute the autogenerate task before the non-generated subproject's build.gradle is evaluated (specifically the above line)? I'm using Gradle 1.11 on JDK 1.7 and as it currently stands I can't even run gradle tasks without it failing due to the missing project.
It isn't possible to execute a task before build files have been evaluated, at least not without complications such as one build executing another build using a GradleBuild task. You are likely better off checking the generated project into source control, or finding a solution that doesn't involve generating build scripts.
You can use a Gradle init script to achieve this.
https://gradle.org/docs/current/userguide/init_scripts.html
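A rough sketch of that idea (not tested against the Gradle 1.11 toolchain; generate-cordovalib.sh is a hypothetical placeholder for whatever actually produces the project):
// init.gradle -- run with: gradle --init-script init.gradle build
settingsEvaluated { settings ->
    def cordovaDir = new File(settings.rootDir, 'CordovaLib')
    if (!cordovaDir.exists()) {
        // Generate the CordovaLib subproject before any build.gradle is evaluated
        def proc = new ProcessBuilder('sh', 'generate-cordovalib.sh')
                .directory(settings.rootDir)
                .inheritIO()
                .start()
        if (proc.waitFor() != 0) {
            throw new GradleException('CordovaLib generation failed')
        }
    }
}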