I am trying to generate classes from multiple .xsds in Gradle. I tried the configuration below, but it does not work and I get an error that the file does not exist.
buildscript {
    ext {
        springBootVersion = '1.5.4.RELEASE'
    }
    repositories {
        jcenter()
        mavenCentral()
    }
    dependencies {
        classpath("org.springframework.boot:spring-boot-gradle-plugin:${springBootVersion}")
        classpath("com.github.jacobono:gradle-jaxb-plugin:1.3.6")
    }
}

apply plugin: 'java'
apply plugin: 'eclipse-wtp'
apply plugin: 'org.springframework.boot'
apply plugin: 'war'
apply plugin: 'com.github.jacobono.jaxb'

version = '0.0.1-SNAPSHOT'

repositories {
    mavenCentral()
}

configurations.all {
    exclude module: 'spring-boot-starter-logging'
}

configurations {
    jaxb
}

task createDirs {
    file("$buildDir/generated-sources").mkdirs()
}

xjc.dependsOn createDirs

jaxb {
    xsdDir = "src/main/resources/xsd"
    xjc {
        destinationDir = "$buildDir/generated-sources"
        taskClassname = "org.jvnet.jaxb2_commons.xjc.XJC2Task"
        generatePackage = "com.test.generated1"
        args = ["-Xinheritance", "-Xannotate"]
    }
}

compileJava {
    dependsOn(xjc)
}

compileTestJava {
    dependsOn(xjc)
}

dependencies {
    compile('org.springframework.boot:spring-boot-starter-web')
    compile('org.springframework.boot:spring-boot-starter-log4j2')
    compile('org.springframework.boot:spring-boot-starter-actuator')
    jaxb('org.jvnet.jaxb2_commons:jaxb2-basics-ant:0.6.5')
    jaxb('org.jvnet.jaxb2_commons:jaxb2-basics:0.6.4')
    jaxb('org.jvnet.jaxb2_commons:jaxb2-basics-annotate:0.6.4')
    jaxb('com.sun.xml.bind:jaxb-xjc:2.2.7-b41')
    jaxb('com.sun.xml.bind:jaxb-impl:2.2.7-b41')
}
If I remove destinationDir = file("build/generated-sources"), it generates classes inside src/main/java, which I do not want.
With it, I get the error .../build/generated-sources: non-existent directory
Any help is appreciated :)
NOTE: Gradle Version 3.2.1
Try using the $buildDir variable for the build directory: "$buildDir/generated-sources"
As vampire noticed, you need to create the directory first.
I would suggest creating a task for it with a doFirst() block, so the directory is created during the execution phase, right before it is needed.
task createDirs {
    doFirst {
        file("$buildDir/generated-sources").mkdirs()
    }
}
and then run it before xjc:
xjc.dependsOn createDirs
And in this particular task, try passing just the raw build path, because it seems like destinationDir is already resolved relative to the project root. I am not sure what is going on there. Other than that, I strongly suggest using the $buildDir variable.
jaxb {
    xsdDir = "src/main/resources/xsd"
    xjc {
        destinationDir = "build/generated-sources"
        taskClassname = "org.jvnet.jaxb2_commons.xjc.XJC2Task"
        generatePackage = "com.test.generated1"
        args = ["-Xinheritance", "-Xannotate"]
    }
}
Another thing: you could declare the XSD directory and the output directory as task inputs and outputs, so the expensive xjc task is skipped when nothing has changed.
xjc {
    inputs.dir("$projectDir/src/main/resources/xsd")
    outputs.dir("$buildDir/generated-sources")
}
One point is that you should use the variable buildDir instead of using hard-coded 'build'.
That's not your problem though. Your problem is that the plugin you are using does not create a non-existent target directory. So add the creation action to the task, like:
jaxb {
    doFirst {
        file("$buildDir/generated-sources").mkdirs()
    }
}
Or if you prefer, create a separate task that does only the directory creation and then add a dependency from jaxb to your directory creation task.
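If you go that route, a minimal sketch could look like this (the task name createGeneratedSourcesDir is just an example, and xjc is the generation task from the plugin used above):
task createGeneratedSourcesDir {
    doLast {
        // this task's only job: make sure the output directory exists
        file("$buildDir/generated-sources").mkdirs()
    }
}

xjc.dependsOn createGeneratedSourcesDir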
A bit of context first:
I am working on migrating my company's projects to be built with Gradle.
One thing this results in is redundancy in my build.gradle files,
as I am configuring the same skeleton over and over again.
This includes:
Applying the java, maven-publish and org.sonarqube plugins
Configuring the repositories to be mavenCentral and our private Artifactory repo
Configuring the publishing block, which is all the same except for the artifactId
Building a manifest inside the jar block (using helper methods to correctly build the manifest's classpath)
Helper methods
Two tasks
Two dependsOn statements
My build.gradle file as of now:
plugins {
    id 'io.spring.dependency-management' version '1.0.12.RELEASE'
    id "org.sonarqube" version "3.2.0"
    id 'maven-publish'
    id 'java'
}

group = 'group'
version = 'version'
sourceCompatibility = '11'
ext.artifactName = 'ProjectName'

// Where to look for dependencies:
repositories {
    mavenCentral()
    maven {
        credentials {
            username = "${artifactory_user}"
            password = "${artifactory_password}"
        }
        url "${artifactory_contextUrl}"
        allowInsecureProtocol = true
    }
}

// Where to publish what Artifacts to:
publishing {
    publications {
        maven(MavenPublication) {
            groupId = 'group'
            artifactId = 'ProjectName'
            String buildEnvVar = System.env.BUILDENVIRONMENT
            if (buildEnvVar == null) {
                version = 'LOCAL BUILD'
            } else {
                version = 'version'
            }
            from components.java
        }
    }
    repositories {
        maven {
            // change to point to your repo, e.g. http://my.org/repo
            name = "gradle-dev"
            url = "${artifactory_contextUrl}"
            allowInsecureProtocol = true
            credentials {
                username = "${artifactory_user}"
                password = "${artifactory_password}"
            }
        }
    }
}

dependencies {...}

jar {
    // configuration of variables
    String dateString = new Date().format("yyyy-MM-dd HH:mm:ss ")
    String localBuild = "LOCAL BUILD by " + System.getProperty("user.name") + " on " + dateString
    String buildEnvVar = System.env.BUILDENVIRONMENT
    String buildEnvironment
    String classpath = createCP()
    if (buildEnvVar == null) {
        buildEnvironment = localBuild
        archiveName = "ProjectName"
    } else {
        buildEnvironment = buildEnvVar
        archiveFileName = "ProjectName_" + version + ".jar"
        delete fileTree("build/libs") {
            include('*')
        }
    }
    manifest {
        attributes(
            "Main-Class": "org.example.foo",
            "Specification-Title": "ProjectName",
            "Specification-Vendor": "blab",
            "Specification-Version": "Spec-version",
            "Implementation-Title": "$System.env.JOB_NAME",
            "Implementation-Version": "Impl-version",
            "Implementation-Vendor": "blub",
            "Implementation-Vendor-Id": "blob",
            "Implementation-Url": "bleb",
            "Build-By": buildEnvironment,
            'Class-Path': classpath
        )
    }
}

String createCP() {
    // super secret can't share
}

// will suffix the jars with release or debug, depending on it being compiled with or without debug-information
project.gradle.taskGraph.whenReady {
    boolean isDebug = project.gradle.taskGraph.getAllTasks().join(' ').contains('debugJar')
    compileJava.options.debug = isDebug
    String suffix = isDebug ? "debug" : "release"
    String fullJarName = "$artifactName-$suffix" + ".jar"
    jar.setProperty('archiveName', fullJarName)
}

tasks.named('test') {
    useJUnitPlatform()
}

task debugJar() {}
debugJar.dependsOn(jar)

// Downloads all Jars the project depends on, and saves them in buildDirectory/output/libs if the gradle build command is executed.
task copyToLib(type: Copy) {
    into "${buildDir}/output/libs"
    from configurations.runtimeClasspath
}
build.dependsOn(copyToLib)
What I want to achieve:
plugins {
    id 'io.spring.dependency-management' version '1.0.12.RELEASE'
    id "org.sonarqube" version "3.2.0"
    id 'maven-publish'
    id 'java'
    id 'mySuperPlugin'
}

// Configure mySuperPlugin
mySuperPlugin {
    artifactId = 'xyz'
    mainClass = 'org.example.foo'
    version = 'version'
    stuffFromOtherTasks = ...
}

// Where to look for dependencies:
repositories {
    mavenCentral()
    maven {
        credentials {
            username = "${artifactory_user}"
            password = "${artifactory_password}"
        }
        url "${artifactory_contextUrl}"
        allowInsecureProtocol = true
    }
}
dependencies {...}
Most of the values are the same.
The ones that aren't are passed in via environment variables (Jenkins job name, ...)
or are determined through helper methods.
I reckon that I will most likely not end up with a build file quite like the one above,
but at least some of the build file should be possible to extract.
I already know that I can create separate tasks in a plugin, like comparing two files that have been passed in. What I haven't found a solution for yet:
Can I modify the jar task of the project applying the plugin, from inside the plugin?
How do I pass outputs from other tasks into my plugin's tasks?
How do I access the applying project's data (e.g. the runtimeClasspath)?
Is a plugin even what I want, or is there another way of cutting down the build.gradle file?
I am relatively inexperienced with Gradle. I have read through quite a bit of the docs and other posts, but chances are I just overlooked some best-practice way of doing certain things.
Therefore, feel free to criticize my build file as well as my approach!
You can do this in a couple of ways, and it comes down to whether your project is composed of multiple subprojects or whether they are standalone projects. For standalone projects you can do the following in your settings.gradle file:
includeBuild("../common-project")
The common project would just house the common build.gradle file, and you'd declare all of the items you want to share in there. It would look like a normal build.gradle file.
That would require each project to share the common configuration from another project, but wouldn't require any additional projects to be checked out: just the common project and the project itself. For more details, see:
https://docs.gradle.org/current/userguide/composite_builds.html
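For example, assuming a hypothetical layout where the common build sits in a sibling directory, the consuming project's settings.gradle might contain nothing more than:
// settings.gradle of the consuming project (name and path are illustrative)
rootProject.name = 'my-service'

// pull in the build that holds the shared configuration
includeBuild('../common-project')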
If you have more of a multi-project build, say many microservices or subprojects that make up a larger project, then use a multi-project setup and declare the common pieces in the allprojects closure of the root build.gradle:
allprojects {
    apply plugin: 'java'
    apply plugin: 'web'
    repositories {
        ....
    }
    dependencies {
        ....
    }
}
In the multi-project case you'd have to check out the top-level project and all subprojects. Your folder layout might look like this:
- my-project
    - service-1
        src
        build.gradle
    - service-2
        src
        build.gradle
    - service-3
        src
        build.gradle
    build.gradle
    settings.gradle
In the multi-project setup, service-1, service-2, and service-3 would be declared in the settings.gradle file using include, like so:
rootProject.name = 'my-project'
include 'service-1'
include 'service-2'
include 'service-3'
In a multi-project setup you'd typically house everything in a single source repository, as opposed to the includeBuild approach, where each project would usually belong to a separate source code repo. The former forces you to check out the appropriate number of projects in the multi-project case, whereas in the includeBuild case the developer has to know to check out a minimum of two projects.
I have the following folder structure:
-bin
-build/build.gradle (the gradle script)
-lib/[*.jar] (libraries that the project is using)
-src/folder/folder/[*.java] (project's source code)
The build.gradle script has the following content:
plugins {
    id 'java'
    id 'groovy'
}

buildDir = new File('../bin')
sourceCompatibility = JavaVersion.VERSION_1_8

sourceSets {
    main {
        java {
            allJava.srcDirs = ['../src/folder/folder/ ']
            compileClasspath = fileTree('../bin')
        }
    }
}

repositories {
    flatDir {
        dirs '../lib'
    }
}

dependencies {
    implementation fileTree('../lib')
}

tasks.register('javac', JavaCompile) {
    println 'Call javac'
    source.forEach { e -> println e }
    classpath = sourceSets.main.compileClasspath
    destinationDirectory = file('../bin')
    source sourceSets.main.allJava.srcDirs
    includes.add('*.java')
    sourceCompatibility = JavaVersion.VERSION_1_8
}
When running gradle javac I got the error: error: cannot find symbol import com...
The documentation clearly says:
dependencies {
    ...
    // putting all jars from 'libs' onto compile classpath
    implementation fileTree('libs')
}
I'm using Gradle 7.3.1
Allow me to give you some general advice first.
I can strongly recommend using the Kotlin DSL instead of the Groovy DSL.
You instantly get strongly typed code in the build scripts and much better IDE support.
Also, you should imho consider changing your project layout to be more like most other Java projects out there, and especially not use a libs directory but normal dependencies from a repository, where transitive dependencies are then handled automatically and so on.
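As a sketch of that last point (the coordinates below are purely illustrative, not taken from your project), dependencies resolved from a repository look like this:
repositories {
    mavenCentral()
}

dependencies {
    // resolved from the repository; transitive dependencies are handled automatically
    implementation 'org.apache.commons:commons-lang3:3.12.0'
}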
But to answer your actual question, this is the complete build script in Groovy DSL that you want:
plugins {
    id 'java'
}

buildDir = '../bin'

java {
    toolchain {
        languageVersion.set(JavaLanguageVersion.of(8))
    }
}

sourceSets {
    main {
        java {
            srcDirs = ['../src/folder/folder']
        }
    }
}

dependencies {
    implementation fileTree('../lib')
}
And this is the matching Kotlin DSL version:
plugins {
    java
}

setBuildDir("../bin")

java {
    toolchain {
        languageVersion.set(JavaLanguageVersion.of(8))
    }
}

sourceSets {
    main {
        java {
            setSrcDirs(listOf("../src/folder/folder"))
        }
    }
}

dependencies {
    implementation(fileTree("../lib"))
}
After updating to Gradle 5.2.1 my build is failing with this error:
Gradle DSL method not found: 'destination()'
I figured out that this error has something to do with my analysis.gradle.
My analysis.gradle looks like this:
apply plugin: 'checkstyle'
apply plugin: 'pmd'
apply plugin: 'jacoco'

jacoco {
    toolVersion = "0.7.7.201606060606"
}

check.dependsOn 'checkstyle', 'pmd', 'lint'

task checkstyle(type: Checkstyle) {
    println "----- checkstyle -----"
    configFile file(projectDir.getAbsolutePath() + '/analysis/checkstyle-ruleset.xml')
    source 'src'
    source '../domain/src'
    source '../util/src'
    include '**/*.java'
    exclude '**/gen/**'
    exclude '**/java-gen/**'
    exclude '**/androidTest/**'
    exclude '**/test/**'
    ignoreFailures = true
    classpath = files()
    reports {
        xml {
            destination buildDir.absolutePath + "/outputs/reports/checkstyle_report.xml"
        }
    }
}
I think I have to replace the destination call, but I have no idea what to replace it with.
Before Gradle 5.0 the method setDestination(Object file) was already deprecated, see setDestination(Object file).
In Gradle 5.x this method has been removed; you must now use setDestination(File file), which takes a File parameter (see setDestination(File file)).
So you need to change your code to:
reports {
    xml {
        destination file("$buildDir/outputs/reports/checkstyle_report.xml")
    }
}
All adjustments were done in my quality.gradle. Check the config folder for the quality.gradle file and change every usage of
destination "$reportsDir/pmd/pmd.xml"
to
destination file("$reportsDir/pmd/pmd.html")
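In context, the relevant part of quality.gradle might then look roughly like this (a sketch only; reportsDir is assumed to be defined elsewhere in that file, as in the snippet above):
tasks.withType(Pmd) {
    reports {
        html.enabled = true
        // the destination must now be a File, hence the file(...) wrapper
        html.destination = file("$reportsDir/pmd/pmd.html")
    }
}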
Trying to exclude packages from the coverage report because my Jenkins pipeline fails. I have a subproject with all POJOs. I don't want to write unit tests for all of these; hence, they drag down branch/line coverage so that coverage ends up below the threshold and fails my build.
It should be possible to exclude some packages, but I can't get it to work.
I have the following jacoco.gradle file:
apply plugin: 'jacoco'
apply plugin: 'java'

jacoco {
    toolVersion = "0.8.2"
}

jacocoTestReport {
    reports {
        xml.enabled true
        csv.enabled false
        html.enabled true
    }
    afterEvaluate {
        classDirectories = files(classDirectories.files.collect {
            fileTree(dir: it, exclude: '**xxx/yyy/zzz/**')
        })
    }
}

task coverage { dependsOn 'jacocoTestReport' }
check.dependsOn 'jacocoTestReport'
The following in my sonar.gradle file:
apply plugin: 'org.sonarqube'
sonarqube {
    properties {
        property "sonar.forceAnalysis", "true"
        property "sonar.forceAuthentication", "true"
        property "sonar.java.coveragePlugin", "jacoco"
        property "sonar.login", ""
        property "sonar.password", ""
    }
}

subprojects {
    sonarqube {
        properties {
            property "sonar.jacoco.reportPaths", "$buildDir/reports/jacoco/allTests.exec"
            property "sonar.junit.reportsPath", "$buildDir/test-results/test"
        }
    }
}
task sonar { dependsOn 'sonarqube' }
In my build.gradle:
apply from: 'gradle/sonar.gradle'
...
apply plugin: 'java'
...
subprojects {
    apply from: '../gradle/jacoco.gradle'
    ...
}
And finally, from my Jenkinsfile:
step([$class: 'JacocoPublisher', changeBuildStatus: false,
      exclusionPattern: '**/*Test*.class', inclusionPattern: '**/*.class',
      minimumBranchCoverage: '80', sourcePattern: '**/src'])

try {
    dir(BUILD_DIR) {
        sh './gradlew sonar'
    }
But the xxx.yyy.zzz package is still included in the coverage report in Jenkins!
Finally got it working! The key is the JacocoPublisher.
step([$class: 'JacocoPublisher', changeBuildStatus: false,
      exclusionPattern: '**/xxx/yyy/zzz/**/*.class, **/*Test*.class',
      inclusionPattern: '**/*.class',
      minimumBranchCoverage: '80', sourcePattern: '**/src'])
This is the only way I got it to work with Jenkins.
Setting sonar.coverage.exclusions or using the afterEvaluate approach above had no effect.
I was able to accomplish file/package exclusions with the jacoco() Jenkins pipeline step (which in our standard Jenkinsfile gets called in pipeline { ... post { ... always { ... } } }) by adding directly to the step's arguments, like so:
...
post {
    always {
        ...
        // Change the exclusion for the jacoco() Jenkins plugin to exclude the testutils package.
        jacoco(exclusionPattern: '**/testutils/**,**/*Test*.class')
    }
}
It seems one of the big sources of confusion in this whole topic is that there is a jacocoTestReport Gradle task, which has its own exclusion syntax, and then this jacoco() Jenkins pipeline step, which seems totally independent. Sonar coverage exclusion looks like it may be a third thing.
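For completeness, the Sonar-side exclusion mentioned here is a property set on the sonarqube block, in the same style as the other properties in the sonar.gradle above (the pattern is only an example, and as noted above it did not influence the Jenkins JaCoCo plugin in this case):
sonarqube {
    properties {
        // Sonar's own coverage exclusion, independent of jacocoTestReport and the Jenkins jacoco() step
        property "sonar.coverage.exclusions", "**/xxx/yyy/zzz/**"
    }
}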
Lombok makes Java great, Gradle is an awesome, flexible build tool, Eclipse simplifies development significantly, and Javadocs make the world go round.
So when I started a new project, I wanted to figure out how to merge all this magic. Attached is the build script I wrote for this purpose, isolated from the other concerns of the project. Save it somewhere accessible and include it using apply from: "path/to/lombok.gradle".
/**
 * Project Lombok, Eclipse, Javadocs and Gradle
 */

// Doing this twice (once here, once in your main project) has no effect on Gradle, but
// this script depends on the Java and Eclipse plugins
apply plugin: 'java'
apply plugin: 'eclipse'

// Create a configuration to hold the lombok jar as a dependency
configurations {
    lombok
}

// Add the lombok jar to the configuration
dependencies {
    lombok 'org.projectlombok:lombok:+'
}

// Add the lombok configuration to all of the compile classpaths
sourceSets.each { sourceSet ->
    sourceSet.compileClasspath += configurations.lombok
    sourceSet.ext.delombok = new File(buildDir, "generated-src/delombok/" + sourceSet.name);
}

// This task will download lombok and install it in your eclipse instance
task installLombok() {
    dependsOn configurations.lombok
} << {
    File jarFile = null;
    configurations.lombok.resolvedConfiguration.resolvedArtifacts.find {
        if ("lombok".equals(it.name)) {
            jarFile = it.file;
        }
    }
    javaexec {
        main = "-jar"
        args = [
            jarFile,
            "install",
            "auto"
        ]
    }
}

// Install lombok into eclipse when you set up the project (optional line)
eclipseProject.dependsOn installLombok

// Javadoc doesn't handle lombok'd code, so we have to "delombok" it - that is, expand the
// neat annotations so that Javadoc can do something with them.
task delombok() {
    dependsOn configurations.compile
    dependsOn configurations.lombok
} << {
    File jarFile = null;
    configurations.lombok.resolvedConfiguration.resolvedArtifacts.find {
        if ("lombok".equals(it.name)) {
            jarFile = it.file;
        }
    }
    sourceSets.each { sourceSet ->
        def classPath = sourceSet.compileClasspath.files.join(";")
        def delombokPath = sourceSet.ext.delombok
        delombokPath.mkdirs();
        javaexec {
            main = "-jar"
            args jarFile, "delombok"
            if (!classPath.empty) {
                args "-c", classPath
            }
            args "-d", delombokPath
            args sourceSet.allJava.srcDirs
        }
    }
}

javadoc {
    dependsOn delombok
    // At your discretion; I actually use ext.apiDelombok in my case
    source = sourceSets.main.ext.delombok
}
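Usage from the main build.gradle is then just a matter of applying the script; the path below is whatever location you saved it to:
// main build.gradle
apply plugin: 'java'
apply plugin: 'eclipse'

// pull in the Lombok/Eclipse/Javadoc wiring from the script above
apply from: 'gradle/lombok.gradle'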