I'm converting a large project from Maven to Gradle. All of the projects, except for one, use the java-library plugin.
One of the projects, let's call it build-info-impl, creates a .jar file containing a build.properties file with the current date, commit hash, and other properties. The date, of course, changes every time a build occurs, so that task is marked with outputs.upToDateWhen { false }.
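A minimal sketch of what such a generator task might look like (the task name generateBuildInfo and the property keys below are assumptions, not the actual build script):

task generateBuildInfo {
    def propsFile = file("$buildDir/generated/build.properties")
    outputs.file propsFile
    outputs.upToDateWhen { false }   // the date changes on every build
    doLast {
        propsFile.parentFile.mkdirs()
        propsFile.text = "build.date=${new Date()}\n"
    }
}

jar {
    from generateBuildInfo   // bundle the generated file into the .jar
}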
Other projects depend on build-info-impl using an implementation dependency, like so:
dependencies {
    implementation project(':mqm-server-root:mqm-infra-common:mqm-build-info')
}
The problem
When I build the projects, the test tasks of all sub-projects always run. When I add --info, I see the following output:
Executing task ':server-root:pom:platform-services:platform-services-impl:test' (up-to-date check took 0.017 secs) due to:
Input property 'classpath' file C:\Users\user\Documents\dev\project\gradle\Server\infra-common\build-info-impl\build\libs\build-info-impl-1.24.9-SNAPSHOT.jar has changed.
As far as I understand, this means that the .jar file did get onto the classpath, even though the Java Library plugin documentation clearly states:
Dependencies found in the implementation configuration will, on the other hand, not be exposed to consumers, and therefore not leak into the consumers' compile classpath
How can I debug why that .jar file ends up on the classpath?
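One hedged way to check which configuration actually pulls the jar in is to dump the resolved test classpath. Note that the quoted passage is about the compile classpath, while the test task's 'classpath' input is the runtime classpath, which implementation dependencies do end up on. The helper task below is purely illustrative, not part of the original build:

// Hypothetical debugging task
task printTestClasspath {
    doLast {
        configurations.testRuntimeClasspath.each { println it }
    }
}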
Related
Since upgrading to Gradle 7.x, I'm getting tons of these in my build:
Gradle detected a problem with the following location: '/Users/me/myapp/server/build/libs/myapp-server-1.0-SNAPSHOT.jar'. Reason: Task ':myapp-content:war' uses this output of task ':myapp-server:explodedWar' without declaring an explicit or implicit dependency. This can lead to incorrect results being produced, depending on what order the tasks are executed. Please refer to https://docs.gradle.org/7.4/userguide/validation_problems.html#implicit_dependency for more details about this problem.
The culprit seems to be some code I cargo-culted from somewhere on the web for unpacking the war after the build:
// This builds the server as a WAR file, though we don't actually use it.
// The important thing is the exploded WAR, from which the Dockerfile copies in
// all the app server's dependency jars.
task explodedWar(type: Copy) {
    into "$buildDir/libs"
    with war
}
war.finalizedBy "explodedWar"
So basically I am unpacking the war at the end of the build, so that Docker can find all the files when I (later) do a Docker build.
Gradle seems to dislike this, and I get well over 100 warnings like the one above on each build/test run, for just about every top level build and test target in all my subprojects (each of which has a war-exploding step like that one).
I don't understand why all the other targets are seen as depending on the output of the war step.
What's the right way to do this?
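One hedged sketch, based on what the warning itself asks for: declare the dependency explicitly in the consuming project. The task paths are copied from the message above; whether this is the right wiring for the real build is an assumption.

// In myapp-content/build.gradle (assumed location)
war.dependsOn ':myapp-server:explodedWar'

Another option along the same lines is to have explodedWar copy into a directory of its own (for example "$buildDir/exploded") instead of "$buildDir/libs", so its declared outputs no longer overlap with the archive outputs that other tasks consume.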
I am used to specifying project dependencies in Ant/NetBeans, where a project recompiles if a dependency (another, otherwise separate project) changes, "clean and build" cleans and rebuilds dependencies, and so on. There is also source code navigation, where NetBeans switches seamlessly between projects.
Now I want to learn Gradle, but I was told that I should use a repository such as Maven Central for accessing dependencies. Project dependency configuration in the NetBeans UI is gone for a Gradle project. Hence the question: is this kind of deep integration between a project and its dependencies possible in a Gradle project?
For your project's own source, meaning the stuff under the typical src/main/java, Gradle "caches" the work out of the box for most built-in tasks.
In Gradle, this is known as Up-to-date checks (AKA Incremental Build):
Once your source files have been compiled, there should be no need to recompile them unless something has changed that affects the output, such as the modification of a source file or the removal of an output file.
If you have defined a custom task, or a task that needs to be incremental (cacheable), then you'll need to follow the incremental build documentation to make your custom task incremental.
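For illustration, a minimal sketch of a custom task with declared inputs and outputs (the task name and file paths are made up); declaring them is what lets Gradle skip the task when nothing changed:

task generateReport {
    def source = file('notes.txt')             // hypothetical input
    def report = file("$buildDir/report.txt")  // hypothetical output
    inputs.file source
    outputs.file report
    doLast {
        report.parentFile.mkdirs()
        report.text = source.text.toUpperCase()
    }
}

On a second run with no changes to notes.txt, the task is reported as UP-TO-DATE and skipped.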
And for the following:
"clean and build" cleans and rebuilds dependencies etc.
Gradle does not "build" dependencies. It will retrieve dependencies from the configured repositories in the project and then cache them.
You can configure the build cache to suit your needs: https://docs.gradle.org/current/userguide/build_cache.html#sec:build_cache_configure
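For example, a minimal local-cache configuration in settings.gradle might look like the sketch below (the directory name is an arbitrary choice); note that the cache is only used when builds run with --build-cache or with org.gradle.caching=true in gradle.properties:

// settings.gradle
buildCache {
    local {
        directory = new File(rootDir, 'build-cache')
        removeUnusedEntriesAfterDays = 30
    }
}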
I have a simple testing-purpose Gradle project whose dependencies I want to scan using the gradle dependencies command. Before I do so, I want to make sure that the project's dependencies are actually present in Gradle's cache (.gradle/caches/modules-2/files-2/1). To do so, I run the gradle assemble command to download the missing dependencies before scanning.
I found out that this only works if the project has a src/main/java folder with a Java file inside it (even if that Java file is completely empty).
Is this a valid workaround? Is there any better solution to guarantee the dependencies are found in the cache folder before scanning them?
What is the reason that you want to do that?
The assemble task assembles your source files; if there is nothing to assemble, the task does not need to run. Adding a Java file to src is just a hack to force this task and the tasks it depends on to run.
Depending on what you want to achieve, there are a few ways to 'scan' dependencies.
For more info you can visit https://docs.gradle.org/current/userguide/userguide_single.html#sec:listing_dependencies
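If the goal is only to get the artifacts into the cache without the dummy Java file, one hedged alternative is a small task that resolves every resolvable configuration (the name resolveAll is made up):

task resolveAll {
    doLast {
        configurations
            .findAll { it.canBeResolved }
            .each { it.resolve() }
    }
}

Running gradle resolveAll downloads the dependency files into the cache before you run gradle dependencies.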
Additionally:
There is a Netflix plugin that I believe can scan through your Gradle scripts and check for unused dependencies: https://github.com/nebula-plugins/gradle-lint-plugin
There is a plugin that can scan the dependencies you use for vulnerabilities: https://jeremylong.github.io/DependencyCheck/dependency-check-gradle/
For several months we've been using Buildship 1.X plus some manual .launch/tasks to build our Eclipse/WTP config files per development environment. I am currently attempting to migrate to using Buildship 2 (which I'm hoping will rid us of the need for the manual bits.)
However, when I import the projects (which have no Eclipse config files at this point) via the Buildship/Gradle import, the subprojects are included as 'Libraries' rather than as 'Projects'. In contrast, if I use Gradle's eclipse task to generate the Eclipse config files (i.e. .classpath), the configuration ends up as I would expect. Is this a current limitation of Buildship, or do I need to do something differently in my Gradle files to coerce Buildship to bring them in as Projects?
Ultimately I don't know that I should care about this difference, but I do know that I'm getting compiler errors saying classes from the subprojects are missing from the classpath. As long as I can fix that issue, I'm perfectly happy.
Potentially helpful info
settings.gradle:
rootProject.name = 'projectroot'
include 'Project2.0'
project(':Project2.0').name = 'projectx'
include 'the-platform'
include 'the-platform:central-repo:central-repo-common'
include 'the-platform:central-repo:central-repo-model'
include 'the-platform:central-repo:central-repo-persist'
include 'the-platform:central-repo:central-repo-service'
Project2.0/build.gradle (snippet):
dependencies {
    ...
    compile project(':the-platform:central-repo:central-repo-common')
    compile project(':the-platform:central-repo:central-repo-model')
    compile project(':the-platform:central-repo:central-repo-persist')
    compile project(':the-platform:central-repo:central-repo-service')
    ...
}
Hmmm, never mind. My intuition that the difference in behavior between Buildship and Gradle's eclipse plugin was responsible for my classpath issues turned out to be wrong. Something else (as yet unexplained) must have been the issue, as it is working correctly now.
There are several projects that need to be built in a specific order. By "build in order" I mean running clean and build for each project in this sequence:
1. clean, build - project 2
2. clean, build - project 1
3. clean, build - project 4
4. clean, build - project 3
-Root Folder
------- project1
-----------build.gradle
------- project2
-----------build.gradle
------- project3
-----------build.gradle
------- project4
-----------build.gradle
--build.gradle
--settings.gradle
build.gradle of each project (1, 2, 3, 4):
apply plugin: 'java'

dependencies {
    // other dependencies
}
root_folder/settings.gradle
include ':project1', ':project2', ':project3', ':project4'
root_folder/build.gradle
????????
How can I organize the build order of all the projects in the root build script?
In your root build.gradle:
dependencies {
    compile project(':project1'), project(':project2')
}
Now, when you run gradle build in the root project, Gradle is guaranteed to always build project1 and project2 first.
You can read more about multi-project builds and build order in the User Guide.
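If the 2 → 1 → 4 → 3 sequence in the question reflects real dependencies between the projects, the same idea can also be expressed in each consuming project rather than in the root script (a sketch; whether these dependencies actually exist is an assumption):

// project1/build.gradle
dependencies {
    compile project(':project2')   // project2 is built before project1
}

// project3/build.gradle
dependencies {
    compile project(':project4')   // project4 is built before project3
}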
Note that doing "clean" every time you build is usually a bad practice, as it only takes time without giving any benefit. Gradle automatically keeps track of which parts of a project need to be re-built, and which are not changed (and thus don't need to be re-built).
You can read more about how Gradle determines what tasks are up-to-date in this section of the user's guide.
Before a task is executed for the first time, Gradle takes a snapshot of the inputs. This snapshot contains the set of input files and a hash of the contents of each file. Gradle then executes the task. If the task completes successfully, Gradle takes a snapshot of the outputs. This snapshot contains the set of output files and a hash of the contents of each file. Gradle persists both snapshots for the next time the task is executed.
Each time after that, before the task is executed, Gradle takes a new snapshot of the inputs and outputs. If the new snapshots are the same as the previous snapshots, Gradle assumes that the outputs are up to date and skips the task. If they are not the same, Gradle executes the task. Gradle persists both snapshots for the next time the task is executed.