Gradle subproject tasks are not getting picked up - java

I have a multi-module setup for a Java project with the following structure.
mainApp
|--> core-module
| |--> src
| |--> build.gradle
| |--> gradle.properties
|
|--> lib-module
| |--> src
| |--> build.gradle
| |--> gradle.properties
|--> lib-another-module
| |--> src
| |--> build.gradle
| |--> gradle.properties
|--> settings.gradle
|--> build.gradle
In mainApp/build.gradle I have the following:
subprojects {
    test.dependsOn "CreateMessageKeys"
    //test.dependsOn ":CreateMessageKeys"
    //test.dependsOn ("CreateMessageKeys")
    //test.dependsOn (":CreateMessageKeys") none of this working....
}
task CreateMessageKeys(type: CreateMessageKeysTask) {
    destDir = "bundle-common/src/";
    outputClass = "common.messages.MessageKeys";
}
and my core-module/build.gradle has a test target:
test {
    useTestNG() {
        useDefaultListeners = true
        suites 'test/testng.xml'
    }
}
but I am getting the error below. What am I missing here?
Caused by: groovy.lang.MissingMethodException: No signature of method: java.lang.String.dependsOn() is applicable for argument types: (String) values: [CreateMessageKeys]
Possible solutions: respondsTo(java.lang.String)
Edit
subprojects {
    test.dependsOn(rootProject.tasks['CreateMessageKeys'])
}
task CreateMessageKeys(type: CreateMessageKeysTask) {
    destDir = "bundle-common/src/";
    outputClass = "common.messages.MessageKeys";
}
It generates the error:
* What went wrong:
A problem occurred evaluating root project 'myApp'.
> Task with name 'CreateMessageKeys' not found in root project 'mainApp'.

The task definition does not look correct; see the Gradle docs on defining tasks.
It should look like this:
task ('CreateMessageKeys', type: CreateMessageKeysTask) {
    destDir = "bundle-common/src/";
    outputClass = "common.messages.MessageKeys";
}
or
task (CreateMessageKeys, type: CreateMessageKeysTask) {
    destDir = "bundle-common/src/";
    outputClass = "common.messages.MessageKeys";
}
Somewhat unrelated to the question itself:
In your second example, perhaps the task CreateMessageKeys does not exist yet when this is evaluated. One possible workaround would be the following:
subprojects.each {
    it.afterEvaluate {
        it.test.dependsOn(...)
    }
}
Or simply putting the task definition above this block could resolve this.
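For example, a minimal sketch combining both suggestions (the CreateMessageKeysTask type and its properties are taken from the question; this is illustrative, not a tested drop-in):

// Root build.gradle: define the task before the subprojects block so the lookup can find it.
task CreateMessageKeys(type: CreateMessageKeysTask) {
    destDir = "bundle-common/src/";
    outputClass = "common.messages.MessageKeys";
}

subprojects {
    // Defer until each subproject is evaluated, so its 'test' task exists.
    afterEvaluate {
        test.dependsOn(rootProject.tasks['CreateMessageKeys'])
    }
}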

Related

Gradle circular dependency, but not seeing it

So I have a "main" module called main-data. It doesn't depend on any of my other projects.
main-data build.gradle
dependencies {
    implementation("org.postgresql:postgresql:${Versions.postgresVersion}")
    implementation("com.zaxxer:HikariCP:${Versions.hikariVersion}")
}

// To run: ./gradlew flywayMigrate
flyway {
    url = "jdbc:postgresql://localhost:5432/app"
    user = "app"
    password = "app"
    validateOnMigrate = false
}
Then I have another project vendor-data that depends on main-data.
dependencies {
    implementation(project(":app-main-data"))
}
And finally I have another module invoice-data that depends on both.
dependencies {
    implementation(project(":app-main-data"))
    implementation(project(":app-modules:vendors:data"))
}
And the error:
FAILURE: Build failed with an exception.
* What went wrong:
Circular dependency between the following tasks:
:app-modules:invoices:data:compileJava
+--- :app-modules:invoices:data:compileJava (*)
\--- :app-modules:invoices:data:compileKotlin
     +--- :app-modules:invoices:data:compileJava (*)
     \--- :app-modules:invoices:data:compileKotlin (*)
(*) - details omitted (listed previously)
My settings.gradle
include(
    ":app-main",
    ":app-main-data",
)
include(
    ":app-modules:vendors:data",
)
include(
    ":app-modules:invoices:api",
    ":app-modules:invoices:data",
)
Where is the circular dependency here? As soon as I remove implementation(project(":app-modules:vendors:data")) from the invoices project, it works, but I'm puzzled.
Can anyone understand this?

Get module path not root path in Java

I have a multi-module Java (Spring) project built with Gradle 6.7.1, and I develop it in JetBrains IDEA. The file structure looks like this:
root
|--orm
| +---hibernates
|
|--web
|--mvc
|--rest
I have tried code like the line below in my module project, but all I get is the root path (/home/user/IdeaProjects/root/), not the module path (/home/user/IdeaProjects/root/web/mvc). How can I get the module path (/home/user/IdeaProjects/root/web/mvc)?
new File("").getAbsolutePath()
Assuming, for instance, that your mvc project is set up like this in settings.gradle in the root folder:
include 'mvc'
project(':mvc').projectDir = new File('./web/mvc')
Then, to get the path /home/user/IdeaProjects/root/web/mvc, just try this:
println project(':mvc').projectDir
This will print:
/home/user/IdeaProjects/root/web/mvc
Based on the answer of @ToYonos, we can do it like this:
1. settings.gradle gets the project path of every module.
2. The build writes a key/value into info.properties in every module.
3. The Spring project reads this properties file.
Code
Because the structure of my project is:
root
|--orm
| +---mybatis
| +---jpa
| +---...
|--web
+--mvc
+--rest
+--...
So I loop two levels deep to get the module names, and I exclude directories without a build.gradle.
file("${rootDir}").eachDir {
it.eachDirMatch(~/.*/) {
if (it.list().contains("build.gradle")) {
def moduleName = "${it.parentFile.name}:${it.name}"
println " ${moduleName}"
include moduleName
}}}
And then read and write info.properties:
import java.nio.file.Paths

// read
def project_dir = project(":${moduleName}").projectDir
def propFile = Paths.get("${project_dir}", "src", "main", "resources", "info.properties").toFile()
propFile.createNewFile()
Properties props = new Properties()
propFile.withInputStream {
    props.load(it)
}

// write
props.setProperty("project.dir", "$project_dir")
props.store propFile.newWriter(), null
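For step 3, a minimal sketch of the read on the application side (plain java.util.Properties; the class name is just an example, and it assumes info.properties ends up on the module's classpath):

import java.io.InputStream;
import java.util.Properties;

public class ModuleInfo {
    public static String moduleDir() throws Exception {
        Properties props = new Properties();
        // info.properties is generated into src/main/resources by the build script above.
        // Null check omitted for brevity.
        try (InputStream in = ModuleInfo.class.getResourceAsStream("/info.properties")) {
            props.load(in);
        }
        return props.getProperty("project.dir");
    }
}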

geotools 20.5 error: Provider org.geotools.referencing.factory.epsg.CartesianAuthorityFactory could not be instantiated

I am working on a Java project which requires converting WGS84 to UTM. I used GeoTools v20.5 to create a transform with the following code:
transform = CRS.findMathTransform(
        CRS.decode("EPSG:4326", true),
        CRS.decode("EPSG:3857", true),
        false);
It was working correctly until GeoTools changed their repositories.
Now when I run the program, I get a warning:
WARNING: Can't load a service for category "CRSAuthorityFactory". Cause is "ServiceConfigurationError: org.opengis.referencing.crs.CRSAuthorityFactory: Provider org.geotools.referencing.factory.epsg.CartesianAuthorityFactory could not be instantiated".
followed by these errors:
Caused by: java.lang.NoSuchFieldError: ONE
Execution failed for task ':Application.main()'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 255
My environment: Ubuntu 20.04 with OpenJDK 8 (64-bit).
I tested on another Windows machine and it hit the same error.
Here are the libraries I am using from org.geotools:
def geotoolsVersion=20.5
compile group: 'org.geotools', name: 'gt-opengis', version: geotoolsVersion
compile group: 'org.geotools', name: 'gt-referencing', version: geotoolsVersion
compile group: 'org.geotools', name: 'gt-epsg-wkt', version: geotoolsVersion
compile group: 'org.geotools', name: 'gt-geometry', version: geotoolsVersion
As you may notice, I am using gt-epsg-wkt instead of the database-backed modules, since I may not have write permission to some directories in production. But I tested the database-backed plugins too and still hit the same error.
I tried to debug which part of the GeoTools code causes the error, and this is what I found.
The error starts at the last line, Units.autoCorrect(...), of the following code in the Parser.java class in the gt-referencing library.
/**
 * Parses an "UNIT" element. This element has the following pattern:
 *
 * <blockquote>
 *
 * <code>
 * UNIT["<name>", <conversion factor> {,<authority>}]
 * </code>
 *
 * </blockquote>
 *
 * @param parent The parent element.
 * @param unit The contextual unit. Usually {@link SI#METRE} or {@link SI#RADIAN}.
 * @return The "UNIT" element as an {@link Unit} object.
 * @throws ParseException if the "UNIT" can't be parsed.
 * @todo Authority code is currently ignored. We may consider to create a subclass of {@link
 *     Unit} which implements {@link IdentifiedObject} in a future version.
 */
@SuppressWarnings("unchecked")
private <T extends Quantity<T>> Unit<T> parseUnit(final Element parent, final Unit<T> unit)
        throws ParseException {
    final Element element = parent.pullElement("UNIT");
    final String name = element.pullString("name");
    final double factor = element.pullDouble("factor");
    final Map<String, ?> properties = parseAuthority(element, name);
    element.close();
    Unit<T> finalUnit = (factor != 1) ? unit.multiply(factor) : unit;
    return Units.autoCorrect(finalUnit);
}
Then I stepped into it and found the following method in the DefaultUnitParser.java class in the gt-referencing library:
DefaultUnitParser.getInstance()
// This method returns null, with an error saying org.geotools.measure.units failed to instantiate.
I am totally lost now. Why was it working, and why did it suddenly stop working after they changed their remote repo?!
If you need further info from me, please leave a comment. I am still waiting for a solution, since I cannot easily change the GeoTools library.
Thanks all.
BTW, I confirmed it retrieves the correct WKT via code for both EPSG:4326 and EPSG:3857.
UPDATE
I downgraded the GeoTools version to 12.5, which is the version listed on their website https://geotools.org/about.html, and switched the JTS library to com.vividsolutions.jts; it is working now. I think I may need to raise an issue on their GitHub.
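Roughly, the dependency block for that workaround would look something like the sketch below (the com.vividsolutions:jts:1.13 coordinates are an assumption, not taken from the build above; verify the exact JTS version GeoTools 12.5 expects):

def geotoolsVersion = '12.5'
compile group: 'org.geotools', name: 'gt-opengis', version: geotoolsVersion
compile group: 'org.geotools', name: 'gt-referencing', version: geotoolsVersion
compile group: 'org.geotools', name: 'gt-epsg-wkt', version: geotoolsVersion
// JTS from com.vividsolutions instead of org.locationtech (assumed coordinates, double-check)
compile group: 'com.vividsolutions', name: 'jts', version: '1.13'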
The trick to solving this sort of issue is to see what Gradle is pulling in as dependencies using gradle -q dependencies. If you have a well-constructed build you should see something like:
+--- org.locationtech.jts:jts-core:1.16.1
+--- org.geotools:gt-opengis:23.1
| +--- commons-pool:commons-pool:1.5.4
| +--- systems.uom:systems-common-java8:0.7.2
| | +--- tec.uom:uom-se:1.0.8
| | | +--- javax.measure:unit-api:1.0
| | | \--- tec.uom.lib:uom-lib-common:1.0.2
| | | \--- javax.measure:unit-api:1.0
| | +--- si.uom:si-quantity:0.7.1
| | | \--- javax.measure:unit-api:1.0
| | \--- si.uom:si-units-java8:0.7.1
| | +--- javax.measure:unit-api:1.0
| | +--- tec.uom:uom-se:1.0.8 (*)
| | \--- si.uom:si-quantity:0.7.1 (*)
| \--- javax.media:jai_core:1.1.3
+--- org.geotools:gt-epsg-wkt:23.1
| +--- org.geotools:gt-referencing:23.1
| | +--- org.ejml:ejml-ddense:0.34
| | | \--- org.ejml:ejml-core:0.34
| | +--- commons-pool:commons-pool:1.5.4
| | +--- org.geotools:gt-metadata:23.1
If, however, you see something like:
+--- org.locationtech.jts:jts-core:1.16.1
+--- org.geotools:gt-opengis:23.1
+--- org.geotools:gt-epsg-wkt:23.1
+--- org.geotools:gt-geometry:23.1
+--- org.geotools:gt-referencing:23.1
+--- org.geotools:gt-main:23.1
\--- org.geotools:gt-metadata:23.1
then something is wrong. From my (brief) experiments with Gradle it seems that you need something like:
repositories {
    maven { url "http://download.java.net/maven/2" }
    maven { url "https://repo.osgeo.org/repository/release/" }
    maven { url "http://maven.geo-solutions.it/" }
    maven { url "https://repo.maven.apache.org/maven2" }
}
to make sure that Gradle can find all the dependencies that you (and GeoTools) need. Note that if you are using snapshots of GeoTools you will need https://repo.osgeo.org/repository/snapshot/ instead of https://repo.osgeo.org/repository/release/.
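Putting those repositories together with the dependency style used in the question, a minimal sketch might look like this (23.1 matches the tree above; adjust for your GeoTools release and configuration names):

repositories {
    maven { url "https://repo.osgeo.org/repository/release/" }
    maven { url "https://repo.maven.apache.org/maven2" }
}

def geotoolsVersion = '23.1'
dependencies {
    compile group: 'org.geotools', name: 'gt-opengis', version: geotoolsVersion
    compile group: 'org.geotools', name: 'gt-referencing', version: geotoolsVersion
    compile group: 'org.geotools', name: 'gt-epsg-wkt', version: geotoolsVersion
}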

Make Zip task run after dependencies are resolved

I have a subproject B that depends on another subproject A. I have included subproject A in the "build.gradle" of subproject B:
dependencies {
    compile project(':projA')
}
Both of my subprojects A and B create a bundled zip upon a release. I want to copy some files belonging to subproject A to subproject B without referencing subproject A again. The root project's "build.gradle" script contains the following task:
subprojects {
    task bundleBin(type: Zip) {
        description 'Creates "bin.zip" bundle.'
        dependsOn build
        def bundleName = "$outputName-bin"

        /// THIS DOES NOT WORK
        def deps = configurations.runtime.getAllDependencies().findAll { it instanceof ProjectDependency }
        println "GROOT: " + deps

        into("$bundleName/dep") {
            /// THE LINE BELOW WORKS
            /// I do not want a fixed reference since it is already defined in each subproject's "build.gradle" file
            //from project(':projA').file('conf/')
            for (dep in deps) {
                def proj = dep.getDependencyProject()
                from (proj.projectDir) {
                    include "conf/"
                    include "scripts/"
                }
            }
        }
        into(bundleName) {
            from(".") {
                include "conf/"
                include "scripts/"
            }
        }
        into("$bundleName/lib") {
            from configurations.runtime.allArtifacts.files
            from configurations.runtime
        }
        archiveName = "${bundleName}.zip"
    }
}
The reason I do not want to reference subproject A again is that I have a list of projects that depend on other projects, and I do not want to maintain each dependency individually.
What I want the above script to do is, when running for B, take "conf/" and "scripts/" from A and B and put them in "B-bin.zip". Likewise, if I have a subproject C that depends on A and B, the script should take "conf/" and "scripts/" from A, B, and C and put them in "C-bin.zip".
When I run the above script, the dependencies do not appear unless I wrap that logic in "doLast"; however, that does not work for the Zip task.
My question is, how do I fix this?
You need to make sure to resolve the configuration first.
You could do that by using .resolvedConfiguration but note that resolving at configuration time means that this will be done regardless of what task is called, and should be avoided.
This answer suggests you can achieve the same by iterating directly over the configuration.
You could use gradle.taskGraph.whenReady to delay resolving the configuration only if your task is about to be executed. You can still configure your task there.
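A rough sketch of that whenReady variant, reusing the names from the question (illustrative only, not a tested drop-in):

subprojects {
    gradle.taskGraph.whenReady { graph ->
        if (graph.hasTask(bundleBin)) {
            // Only resolved once bundleBin is actually scheduled to run.
            def deps = configurations.runtime.allDependencies.findAll { it instanceof ProjectDependency }
            def bundleName = "$outputName-bin"
            deps.each { dep ->
                bundleBin.into("$bundleName/dep") {
                    from(dep.dependencyProject.projectDir) {
                        include "conf/"
                        include "scripts/"
                    }
                }
            }
        }
    }
}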
As mentioned by @Alpar, this is due to the Zip task processing the dependencies during the configuration phase. To resolve it, I followed this answer.
Thus, my bundle code now looks like:
task bundleBin << {
    task bundleBin_childTask(type: Zip) {
        def bundleName = "$outputName-bin"
        def deps = configurations.runtime.getAllDependencies().findAll { it instanceof ProjectDependency }
        into(bundleName) {
            for (dep in deps) {
                def proj = dep.getDependencyProject()
                from (proj.projectDir) {
                    include "conf/"
                    include "scripts/"
                }
            }
        }
        into(bundleName) {
            from(".") {
                include "conf/"
                include "scripts/"
            }
        }
        into("$bundleName/lib") {
            from configurations.runtime.allArtifacts.files
            from configurations.runtime
        }
        archiveName = "${bundleName}.zip"
    }
    bundleBin_childTask.execute()
}
This solution forces the Zip task to resolve its included files in the execution phase.

Replace File in Gradle Build

I am trying to replace a file in my resource folder (src/main/resources) with a new file generated by the Gradle build script. I'm having trouble doing this; the exclusion seems to be remembered and prevents the addition of my new file.
Here's a short example that illustrates the behavior.
Project Structure:
TestProject
-- src/main/java
---- entry
------ EntryPoint.java
---- run
------ HelloWorldTest.java
-- src/main/resources
---- test.properties // FILE TO REPLACE
test.properties contents in src/main/resources:
Wrong File with extra text to make it obvious which one is being put into the jar based on size
build.gradle:
apply plugin: 'java'

task makeProp {
    def propDir = new File(buildDir, "props")
    ext.propFile = new File(propDir, "test.properties")
    outputs.file propFile
    doLast {
        propDir.mkdirs()
        propFile.createNewFile()
        propFile.withWriter('utf-8') { writer ->
            writer.writeLine 'Right File'
        }
    }
}

jar {
    dependsOn('makeProp')
    if (project.hasProperty('testExclude')) {
        sourceSets {
            exclude('test.properties')
        }
    }
    from (makeProp.propFile) {
        into '/'
    }
}
JAR contents of ./gradlew build (both files included):
Archive: TestProject.jar
Length Date Time Name
-------- ---- ---- ----
0 08-07-15 14:27 META-INF/
25 08-07-15 14:27 META-INF/MANIFEST.MF
0 08-07-15 13:50 run/
499 08-07-15 13:50 run/HelloWorldTest.class
0 08-07-15 13:50 entry/
1413 08-07-15 13:50 entry/EntryPoint.class
95 08-07-15 14:27 test.properties
11 08-07-15 14:03 test.properties
-------- -------
2043 8 files
JAR contents of ./gradlew build -PtestExclude (neither file included):
Archive: TestProject.jar
Length Date Time Name
-------- ---- ---- ----
0 08-07-15 14:29 META-INF/
25 08-07-15 14:29 META-INF/MANIFEST.MF
0 08-07-15 13:50 run/
499 08-07-15 13:50 run/HelloWorldTest.class
0 08-07-15 13:50 entry/
1413 08-07-15 13:50 entry/EntryPoint.class
-------- -------
1937 6 files
I have done something very similar and this is what worked for me. The main objective is to make sure the task runs before your jar is created and your files are processed. Try this out.
// create a properties file and add it to the resources folder before processing
task propertiesFile << {
    description 'Dynamically creates a properties file.'
    // needed for the first pass
    def folder = project.file('src/main/resources');
    if (!folder.exists()) {
        folder.mkdirs()
    }
    // write it to a properties file
    def props = project.file('src/main/resources/test.properties')
    props.delete()
    props << 'write this to my file!!!!'
}
processResources.dependsOn propertiesFile
The solution I went with in the end was similar to what was posted by Pumphouse, but instead of deleting the file, I renamed it to a temp name, excluded that temp name, and renamed it once I was done (I needed to preserve the file contents and position after the build was complete).
Modified build.gradle:
apply plugin: 'java'

def props = project.file('src/main/resources/test.properties')
def temp = project.file('src/main/resources/temp.properties')

task makeProp {
    ... // Unchanged
}

jar {
    dependsOn('makeProp')
    // Move the properties file to a temp location
    props.renameTo(temp) // This returns a boolean; can perform conditional checks.
    // Exclude the temp file
    if (project.hasProperty('testExclude')) {
        sourceSets {
            exclude('temp.properties')
        }
    }
    // Insert the right prop file
    from (makeProp.propFile) {
        into '/'
    }
}

jar << {
    // Restore the temp file
    temp.renameTo(props)
}
The thing is that the exclusion patterns for the Gradle SourceSet apply to all included paths and not only to specific paths. Therefore, in your example above, all files named test.properties will be excluded, regardless of their location.
What you can do is have the default test.properties located somewhere else and make the build copy/generate the relevant version based on the scenario.
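A minimal sketch of that idea, assuming the default file is moved to a directory such as config-defaults/ (the directory name and the reuse of the testExclude property are just examples):

processResources {
    dependsOn makeProp
    // The default test.properties no longer lives in src/main/resources; pick whichever file fits the scenario.
    from project.hasProperty('testExclude') ? makeProp.propFile : file('config-defaults/test.properties')
}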
