Empty source jar with '<<' (doLast), screwed up subproject dependencies without it

I have a task in build.gradle, looking like this:
task sourceJar(type: Jar, dependsOn: classes) << {
    classifier = 'sources'
    from sourceSets.main.allSource
}
Running gradle sourceJar creates a jar file in libs/ but it is empty (does not include any sources, just the manifest).
Removing << fixes it for some reason (the jar is then created properly), but it screws up other things: the subprojects lose the compile dependencies that are defined specifically for them.
So, three (maybe four?) questions here:
(1) What's wrong? Why are the source sets empty when the task is defined with <<?
(2) Why does removing << fix it? My understanding is that it just causes the body of the block to be executed "inline", every time, not just when the task is specifically executed.
(3) How do I fix this? I can't just remove <<, because, like I said, that screws up other things (but see question #4).
(4) Why does removing << screw up the subprojects? Is this expected?
To clarify, here is what I am talking about:
subprojects {
    apply plugin: 'java'
    dependencies {
        compile project(':a')
    }
    task cp << {
        println("PROJECT " + project.name + ">> " + sourceSets.main.runtimeClasspath.collect { it.absolutePath }.join(':'))
    }
}
project(':b') {
    dependencies {
        compile project(':c')
    }
}
Running gradle -q b:cp prints out
PROJECT b>> b/build/classes/main:b/build/resources/main:a/build/libs/a.jar:c/build/libs/c.jar
(I removed the absolute paths). This is what I want.
Now, if I remove << from the file, and run gradle -q b:cp again, I get this
PROJECT a>> a/build/classes/main:a/build/resources/main:/a/build/libs/a.jar
PROJECT b>> b/build/classes/main:b/build/resources/main:a/build/libs/a.jar
PROJECT c>> c/build/classes/main:c/build/resources/main:a/build/libs/a.jar
This is wrong in two ways: first, I did not ask it to be run for all three subprojects, just for b, and second, notice that b does not have c in its classpath any more.
Can someone with a clue please help me figure out what's going on here ... I am really about to give up and switch to sbt (yes, it is a threat!).

When you declare a task of type: Jar, you do not have to use <<, because you are configuring a Jar task that already has the required actions wired up correctly. But first it sounds like you need to read up on what << means and on Gradle's configuration and execution phases.
Please see Peter's answer here: Why is my Gradle task always running?
(<< is Gradle shorthand for doLast. Any task code that isn't enclosed in doLast, or isn't added via <<, is executed in the configuration phase rather than the execution phase. This can cause your printlns, for example, to run even when the task isn't explicitly invoked, because all tasks are configured even if they are not executed.)
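To illustrate the two phases, here is a minimal sketch (the task name demo is just an example): the bare println runs whenever the project is configured, while the doLast action runs only when the task itself is requested.
task demo {
    println 'configuration phase: runs whenever this project is configured'
    doLast {
        println 'execution phase: runs only when demo is requested'
    }
}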
Second, your cp task doesn't extend a task type, so it does need << in its definition:
task cp << { ... }
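Putting the two together, the sourceJar task from the question should drop the << so that classifier and from are applied during configuration (a sketch based on the question's own task):
task sourceJar(type: Jar, dependsOn: classes) {
    classifier = 'sources'
    from sourceSets.main.allSource
}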

Related

Configure plugin from custom task [duplicate]

I'm using Gradle. I have two tasks: "a" and "b". I want task "a" to call task "b". How can I do this?
task b(type: Exec) {
    description "Task B"
    commandLine 'echo', 'task-b'
}
task a(type: Exec) {
    description "Task A"
    commandLine 'echo', 'task-a'
    // TODO: run task b
}
In Ant this is a piece of cake:
<target name="a">
    <echo message="task-a"/>
    <antcall target="b"/>
</target>
<target name="b">
    <echo message="task-b"/>
</target>
The first method I tried is the "dependsOn" feature. However, this is not ideal, as we would need to think of all the tasks in reverse, and it has several other issues as well (like running a task only when a condition is satisfied).
Another method I tried is:
b.mustRunAfter(a)
However this only works if I run the gradle tasks like so:
gradle -q a b
Which is also not ideal.
Is there any way to simply call another task from an existing task?
As suggested, one method would be to add a finalizer for the task:
task beta << {
    println 'Hello from beta'
}
task alpha << {
    println "Hello from alpha"
}
// some condition
if (project.hasProperty("doBeta")) {
    alpha.finalizedBy beta
}
Then we can execute the other task if needed. As for executing one task from within another, you cannot do that: task declaration is declarative, not imperative. A task can depend on another task, but it cannot execute another task.
$ gradle -q alpha
Hello from alpha
$ gradle -q alpha -PdoBeta
Hello from alpha
Hello from beta
You can use
a.dependsOn 'b'
Or
a.dependsOn b
Or
task a(type: Exec, dependsOn: 'b') { ... }
etc
See adding dependencies to tasks
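Applied to the two Exec tasks from the question, a minimal sketch using the dependsOn shorthand:
task b(type: Exec) {
    description "Task B"
    commandLine 'echo', 'task-b'
}
task a(type: Exec, dependsOn: 'b') {
    description "Task A"
    commandLine 'echo', 'task-a'
}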
To summarize and combine the answers from @JBirdVegas and @lance-java, using the non-deprecated doLast instead of leftShift (<<):
task beta {
    doLast {
        println 'Hello from beta'
    }
}
task alpha {
    doLast {
        println 'Hello from alpha'
    }
}
// some condition
if (project.hasProperty('doBeta')) {
    alpha.finalizedBy beta // run 'beta' after 'alpha'
    // or
    // alpha.dependsOn beta // run 'beta' before 'alpha'
}
It works fine but produces a warning: "Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0." I am using Gradle 4.7, so this means some of the features used in build.gradle will not work as-is in Gradle 5.0.
Run the Gradle build with the command line argument --warning-mode=all to see exactly which deprecated features are being used. It will give you a detailed description of the issues found, with links to the Gradle docs for instructions on how to fix your build.
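For example (the build task here is just a placeholder; any task invocation works):
$ gradle build --warning-mode=all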

Gradle - Write Task Output Into A File

I am working with Gradle 7.1, and I am trying to write some of the tasks' results into a file.
Specifically, I would like to write the output of the dependencies task into a file after each jar task execution.
Looking for solutions, I understand that first I need jar.finalizedBy(dependencies) in order for it to run.
However, I can't find how to redirect the dependencies task's output into a file. All the solutions that I have found discuss Exec tasks, which dependencies isn't.
I am looking for something like dependencies.doFirst(///REDIRECT HERE).
You can make the dependencies task write to a file by attaching a StandardOutputListener:
tasks.named('dependencies').configure {
    it.logging.addStandardOutputListener(new StandardOutputListener() {
        @Override
        void onOutput(CharSequence charSequence) {
            project.file("$buildDir/dependencies_task_output.txt") << charSequence
        }
    })
}
This can also be done with any other Gradle task.
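A hedged usage sketch for the jar case from the question, relying on the finalizedBy idea the question already mentions (here the task is referenced by name rather than by the dependencies property):
jar.finalizedBy 'dependencies'
With that in place, every jar execution is followed by the dependencies report, and the listener above appends its console output to build/dependencies_task_output.txt.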

How to make Gradle fail the build if a file dependency is not found?

I have a Gradle build that has some dependencies of the form
compile files('path/to/local/lib.jar')
(the build is being migrated - eventually these will be replaced)
The build failed because one of these paths was incorrectly specified. But it failed due to a compile error - it looked like Gradle silently ignored the missing dependency.
Is there a simple option or switch that will force Gradle to fail the build if any dependency (particularly a local file dependency) cannot be resolved (e.g., the file is missing)?
Edit: to clarify further:
If a dependency cannot be found in the configured repositories, Gradle will fail the build when attempting to resolve them, as expected.
BUT - if a dependency is defined as "compile files ....", and the file specified does not exist at build time, Gradle will IGNORE that error, and attempt compilation anyway. That seems spectacularly wrong-headed and inconsistent default behaviour.
My question is: is there a Gradle option, switch, environment variable, or system property that I can set to force Gradle to verify that file dependencies exist? (I.e., behave in a sane and rational way?)
This is a bit of an old thread, but given that none of the currently proposed solutions actually works, and the solution appears to be trivial (collating two of them), I am leaving it here for future reference.
The point here is that we simply want to ensure that the files do exist, so we can just use the exists() method of the File class:
task ensureDepsExist() {
    doLast {
        configurations.implementation.canBeResolved(true)
        Set<File> impFiles = configurations.implementation.resolve()
        impFiles.forEach { f ->
            if (!f.exists()) {
                ant.fail "${f} could not be found"
            }
        }
    }
}
compileJava.dependsOn ensureDepsExist
The canBeResolved() call is required, or Gradle will complain that the configuration's dependencies cannot be resolved.
Here's how you can check transitive dependencies using Gradle 7.3 (example: Fail if the project depends on log4j directly or transitively).
Kotlin DSL
configurations {
    all {
        resolutionStrategy {
            eachDependency {
                if (requested.name == "log4j") {
                    throw RuntimeException("Project depends on log4j")
                }
            }
        }
    }
}
Groovy DSL
configurations.all {
    resolutionStrategy.eachDependency { DependencyResolveDetails details ->
        if (details.requested.name == 'log4j') {
            throw new RuntimeException("Project depends on log4j")
        }
    }
}
You could do something as shown below. It is not a built-in Gradle function but does not require code to check each dependency specifically (it checks all in the compile configuration):
apply plugin: 'java'
dependencies {
    compile files('lib/abc.jar')
    compile files('lib/def.jar')
}
task checkDependencies() {
    doLast {
        configurations.compile.each { file ->
            assert file.exists()
        }
    }
}
compileJava.dependsOn checkDependencies
To fail the build you can call:
ant.fail('message why it failed')
Then you can craft a condition and fail the build with a nice message ;)
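For example, a minimal sketch that fails when one of the local jars from the question is missing (the task name and path are illustrative, not part of the original answer):
task verifyLocalLib {
    doLast {
        def lib = file('path/to/local/lib.jar') // hypothetical path; adjust to your jar
        if (!lib.exists()) {
            ant.fail("Missing file dependency: ${lib}")
        }
    }
}
compileJava.dependsOn verifyLocalLib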
I would suggest creating a task that first brings the file into the project, with a condition that checks whether the file is available; if it is not, throw a Gradle exception and fail the build with a message. Execute that task early in the execution phase.
I have no chance to test it right now, but it could be something like this (correct me if any syntax is wrong), and you should get the idea:
def yourDep = file($/\path\to\your\dependency/$)
task bringDeps << {
    if (yourDep.exists()) {
        copy {
            from yourDep
            into "$projectDir/depsOrSmthg"
        }
    } else {
        ant.fail('message why it failed')
    }
}
task ensureDependenciesExist() {
    doLast {
        configurations.implementation.canBeResolved(true)
        DependencySet deps = configurations.implementation.getDependencies()
        Set<File> impFiles = configurations.implementation.resolve()
        deps.each { d ->
            boolean depWasResolved = impFiles.any { impFile -> impFile.name.find(".*${d.name}.*${d.version}") }
            if (!depWasResolved) {
                println "${d} was not resolved"
                assert depWasResolved
            }
        }
    }
}
compileJava.dependsOn ensureDependenciesExist

Gradle strange behavior while extending sourceSets with Map variable

We are developing a Java project that is able to instrument (change) class files at build time. We defined a Gradle task that invokes a Java-based Ant task which takes an inputDir (e.g. build/classes), an outputDir (e.g. build/classes-instrumented) and possibly other parameters. The task gets invoked separately for main and test class files after compilation. Since the "normal" Java sourceSet is not a good fit, our first thought was to implement our own sourceSet, but we couldn't find an easy way to do that. A reasonable alternative, similar to ANTLR etc., seemed to be extra variables. Since I needed several, I went for a Map.
sourceSets.all { ext.instrumentation = [:] }
sourceSets.all {
    instrumentation.inputDir = null
    instrumentation.outputDir = null
    instrumentation.classPath = null
}
def postfix = '-instrumented'
Below you see how we initialize the variables.
sourceSets {
    main {
        instrumentation.inputDir = sourceSets.main.output.classesDir
        instrumentation.outputDir = instrumentation.inputDir + postfix
        instrumentation.classPath = sourceSets.main.output + configurations.compile
    }
    test {
        instrumentation.inputDir = sourceSets.test.output.classesDir
        instrumentation.outputDir = instrumentation.inputDir + postfix
    }
}
However, it fails with "Could not find method main() for arguments [build_f2cvmoa3v4hnjefifhpuk6ira$_run_closure5_closure23#12a14b74] on root project 'Continuations'."
We are using Gradle 2.1
I have the following questions:
(1) Any idea why the first one fails?
(2) Is an extra variable a reasonable way to approach this problem?
Thanks a lot for your help!
Solution: install the latest version.
I had the same problem: I was reading the Gradle 3 documentation, but Gradle 2.7 was installed.
I checked the Gradle version (2.7), then read the Gradle 2.7 docs (https://docs.gradle.org/2.7/userguide/tutorial_java_projects.html#N103CD), but found no info about sourceSets in the Java plugin for that version.
Installing Gradle 3 solved the problem.

Excluding a folder from test runs with TestNG and Gradle

I'm trying to exclude a 'quarantine' folder that I set up for Selenium tests that need to be updated and that I do not wish to run. I know that one solution is to set up and assign test groups for the tests in these classes, but given the sheer size and volume of tests that will be in here, I'd rather do it using an Ant-style filter.
Here is a snippet of my build.gradle file:
apply plugin: 'java'
apply plugin: 'idea'
apply plugin: 'eclipse'
repositories {
    mavenCentral()
}
dependencies {
    compile "org.seleniumhq.selenium:selenium-java:2.35.0"
    compile "org.testng:testng:5.14.10"
    testCompile('org.uncommons:reportng:1.1.2') {
        exclude group: 'org.testng'
    }
    testCompile "junit:junit:4.8.2"
    compile "com.jayway.restassured:rest-assured:1.8.1"
}
//initialize thread count variable for parallel testing and default to 1
def threadCount = System.getProperty("MAXTHREADS", "1")
tasks.withType(Test) {
    maxParallelForks = 1
    forkEvery = 1000
    ignoreFailures = false
    // Pass all system properties to the tests
    systemProperties = System.getProperties()
    // Makes the standard streams (err and out) visible at console when running tests
    testLogging.showStandardStreams = true
    exclude '**/tasks/'
    exclude '**/disabled/'
    classpath += configurations.testCompile
}
task firefox(type: Test) {
    maxParallelForks = Integer.valueOf(threadCount) //default is 1 if not specified
    testLogging.events "started"
    testLogging {
        events "started", "passed", "skipped", "failed", "standardOut", "standardError"
        exceptionFormat "full" // default is "short"
    }
    useTestNG() {
        excludeGroups 'chrome'
        useDefaultListeners = false
        listeners << 'org.uncommons.reportng.HTMLReporter'
        listeners << 'org.uncommons.reportng.JUnitXMLReporter'
        listeners << 'com.xmatters.testng.Listener'
    }
    testResultsDir = file("${buildDir}/test-results/firefox")
    testReportDir = file("${reporting.baseDir}/firefox")
    systemProperties.BROWSER = System.getProperty('BROWSER', 'firefox')
    exclude '**/selenium/'
    exclude '**/setupscripts/'
}
task chrome(type: Test) {
    maxParallelForks = Integer.valueOf(threadCount) //default is 1 if not specified
    testLogging.events "started"
    useTestNG() {
        useDefaultListeners = false;
        listeners << 'org.uncommons.reportng.HTMLReporter'
        listeners << 'org.uncommons.reportng.JUnitXMLReporter'
        listeners << 'com.xmatters.testng.Listener'
    }
    testResultsDir = file("${buildDir}/test-results/chrome")
    testReportDir = file("${reporting.baseDir}/chrome")
    systemProperties.BROWSER = System.getProperty('BROWSER', 'chrome')
    exclude '**/selenium/'
    exclude '**/setupscripts/'
}
On line 34 you can see the exclude '**/disabled/' that I added. This folder is a couple of levels up from the root folder. The preceding line with exclude '**/tasks/' was already in the build file and seems to work fine with a similar directory structure.
When I run the build, tests in the /disabled/ folder are still getting run. Is there something I'm doing wrong here? I'm assuming that, with that syntax, a directory named 'disabled' a couple of levels up would be ignored by scanForTestClasses, which is true by default. Any idea what is up here?
One other thing I've noticed in the Gradle test report is that the package name listed in the report is default-package for the excluded tests that are not being excluded, whereas the other tests that are meant to be run list the correct package names. The package names in the Java files match their folder structure correctly, so I'm not sure why this is being reported this way. I've checked for duplicates, typos, etc., and am not getting anywhere.
If anyone could shed some light on this, that would be great, as having these incomplete/broken test classes run is causing failures that should be ignored until the tests are updated.
These tests are being run using the Gradle-wrapper-generated bash script on our test CI (Jenkins) box running on Linux.
Looks like the exclude pattern is applied to the relative path of the files (i.e. relative to your root folder), which explains why it works for folders under your root folder.
Using an excludeSpec (see Gradle Test task DSL) should work fine:
exclude { it.file.canonicalPath.contains('/disabled/')}
Of course, pay attention to / vs \ according to your OS.
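For example, applied to the shared Test configuration from the question (a sketch; on Windows the canonical path would use backslashes):
tasks.withType(Test) {
    exclude { it.file.canonicalPath.contains('/disabled/') }
}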
