Multi-module project, Gradle and Bintray - java

I have a multi-module Java project (Jodd). The main module, i.e. the root project, is not a Java project. Only a large subset of the submodules are Java projects:
root
|-- jodd-core
|-- jodd-bean
...
I wanted to apply the Bintray plugin. The first thing I tried was:
1. Apply the Bintray plugin to just the Java submodules.
This worked fine, except that, since Bintray is not enabled on the root project, I am not able to simply invoke:
gradle bintrayUpload
because this task does not exist in the root project. Then I tried this:
2. Apply the Bintray plugin to all modules, including the root.
This worked, except that now there is an empty package on Bintray for the root project, which does not contain any files.
Question
What would be the right way to upload to Bintray? I think I would go with solution 1 and create my own custom task that depends on the bintrayUpload tasks of all the Java modules. Am I missing something?

Solution #1 seems to be the way to go. I've created a task bintray that depends on all the bintrayUpload tasks from the submodules that have something to publish, roughly as sketched below.
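For illustration, a minimal sketch of such an aggregate task in the root build script (shown here in Kotlin DSL; the plugin id com.jfrog.bintray is an assumption, and a Groovy build would need the equivalent syntax):

// Root build.gradle.kts -- sketch only.
tasks.register("bintray") {
    group = "publishing"
    description = "Runs bintrayUpload in every submodule that applies the Bintray plugin."
    subprojects.forEach { sub ->
        // Only submodules that apply the Bintray plugin have a bintrayUpload task.
        sub.plugins.withId("com.jfrog.bintray") {
            dependsOn("${sub.path}:bintrayUpload")
        }
    }
}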

Related

Gradle Plugin dependency

What is the exact dependency I need to develop a Gradle Plugin in Java? Ideally I would like to get it from a well-known repository such as Maven Central or similar.
I have a Maven project with the core functionality, and I just added two extra plugins: one for Ant, one for Maven. They are already tested and working; easy! Now I want to add a third module for a Gradle plugin, to make this functionality available from any Gradle project as well.
However, I can't find the exact dependencies I need to develop a Gradle plugin.
The Gradle docs (such as https://docs.gradle.org/current/userguide/java_gradle_plugin.html) are not very well written, to say the least. They mention:
the gradleApi() dependency
or the java-gradle-plugin dependency
But they are quite unclear... no group, no version (really?).
If anyone can enlighten me to where I can get these dependencies from, I would be very thankful.
Gradle's public and internal APIs, aka gradleApi(), are bundled with the Gradle distribution and not independently published and therefore not easily consumable by Maven builds. There's the pending epic #1156 (Ensure plugin cross-version compatibility by allowing a user to depend on gradlePublicApi()) that might help here.
Since Gradle plugins are best built with Gradle, a pragmatic solution is to invoke the Gradle build from Maven and attach the produced artifact to the Maven build. Andres Almiray (aalmiray) once described this in the blog post Running Gradle Inside Maven (Web Archive link). He describes the following high-level steps:
Create a new Maven module (e.g. gradle-plugin) and attach it to the parent POM.
In the POM of gradle-plugin, add a dependency on your core module. Use the maven-dependency-plugin to copy the dependencies into the Maven build folder, e.g. target/dependencies.
Create the build.gradle, add a Maven repository that points to target/dependencies (step 2), and let it depend on the core module as well as gradleApi(). Implement the Gradle plugin (a sketch of this file follows below the list).
Use the exec-maven-plugin to invoke the Gradle build.
Use the maven-resources-plugin to copy the Gradle built plugin jars to the standard Maven build folder.
Use the build-helper-maven-plugin to attach the copied jars to the Maven build.
A sample project can be found here (gradle-in-maven).
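For orientation, a hedged Kotlin DSL sketch of the build script described in step 3 (the directory target/dependencies comes from step 2; the coordinates com.example:core:1.0.0 for the core module are purely hypothetical):

// gradle-plugin/build.gradle.kts -- sketch only.
plugins {
    `java-gradle-plugin`
}

repositories {
    // Populated by the maven-dependency-plugin in step 2; its useRepositoryLayout
    // option makes the directory follow a Maven repository layout.
    maven { url = uri("target/dependencies") }
    mavenCentral()
}

dependencies {
    implementation(gradleApi())               // Gradle's own API, no group/version needed
    implementation("com.example:core:1.0.0")  // hypothetical coordinates of the core module
}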
https://docs.gradle.org/current/userguide/custom_plugins.html#sec:custom_plugins_standalone_project
There it is mentioned that the dependency is gradleApi(), and I know from experience that this works. The localGroovy() dependency on that page is only needed if your plugin code itself uses Groovy (it does not apply if you only use Groovy in the build.gradle of your plugin).
java-gradle-plugin is a plugin that makes it a bit simpler to develop plugins, but it is not required. I personally prefer using gradleApi() only.
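As a reference point, a minimal sketch of a standalone plugin build (Kotlin DSL; whether you pick java-gradle-plugin or a plain gradleApi() dependency is up to you):

// build.gradle.kts of a standalone Gradle-plugin project -- sketch only.
plugins {
    `java-gradle-plugin`   // optional: adds gradleApi() for you and validates plugin metadata
}

dependencies {
    // With only the plain java plugin you would declare this yourself;
    // no group or version is needed because the API ships with the Gradle distribution.
    implementation(gradleApi())
    // implementation(localGroovy())  // only if the plugin code itself is written in Groovy
}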
EDIT:
It appears I've misunderstood the question. Here are the steps to get the gradleApi() jar:
Create a Gradle project with your desired Gradle version.
Add an implementation gradleApi() dependency.
Import/run the project once.
Go to your .gradle folder (located in your home folder on Linux-based operating systems).
Open the caches folder.
Open the folder of the version you want, e.g. 6.0.1.
Open the generated-gradle-jars folder.
Copy the jar to wherever you want and use it.
For me the 6.0.1 jar is at ~/.gradle/caches/6.0.1/generated-gradle-jars/gradle-api-6.0.1.jar
Please note that I have not tested this: I know the jar is there, but I haven't tried using it.

Gradle multi-project build order using Kotlin script

I use Kotlin DSL scripts (.kts) for building. The structure of my project is:
Root project 'demo'
+--- Project ':backend'
\--- Project ':frontend'
I need to build the frontend project first, then the backend. I tried
include(":frontend")
include(":backend)
and
include(":frontend", ":backend")
with and without the leading : in settings.gradle.kts of the root project, but the build order is still alphabetical: backend, then frontend.
Do you have any idea what is wrong?
There is nothing wrong. If you don't specify any inter-project dependencies, Gradle will build the projects in alphabetical order. This should be fine if the two projects are unrelated, as they are now.
But let's say you would like to build the frontend (using Node) and then include those resources in the backend (using Spring Boot). Then you need to make the backend depend on the frontend project, and Gradle will honor the dependency graph and build the frontend first.
There are many ways to do that. One is to use the java plugin in the frontend project to build a jar file of your frontend resources; you can then declare a normal project dependency on it. You could also make a dependency directly on the frontend project's "internal" build tasks, but that is a bit frowned upon. Or you could declare your own artifact, or do it in a number of other ways.
For the first approach, you can build a jar file of your frontend resources like this:
plugins {
    // ...
    id("java")
}

java {
    // Required to make the jar artifact compatible with your backend, which is configured for Java 1.8
    targetCompatibility = JavaVersion.VERSION_1_8
}

tasks.named("jar", Jar::class) {
    dependsOn("assembleFrontend")
    from("$buildDir/dist")
    into("static")
}
Then in the backend, depend on it like this:
dependencies {
    // ...
    runtimeOnly(project(":frontend"))
}
There are a few other things wrong with your build script as well.
The runtime configuration is deprecated; use runtimeOnly instead (for your spring-boot-devtools dependency).
A multi-project build should have only a single settings.gradle file, but you have one in each project. Delete all of them except the one in the root folder.
You have declared the org.siouan.frontend plugin twice: once using the recommended way and once using the "old" way. Remove the latter (everything in the buildscript block and the apply statement).
Also, while I am not familiar with the org.siouan.frontend plugin, it appears it does not declare inputs and outputs for you, probably because it is very generic. So, to avoid running npm every time you build your backend (now that you have a dependency on the frontend), you should declare proper inputs and outputs for the frontend tasks such as installFrontend and assembleFrontend, as sketched below.
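For example, a hedged sketch of what such declarations could look like in the frontend's build.gradle.kts (the task names come from the org.siouan.frontend plugin; the input/output paths are assumptions about a typical npm layout):

// frontend/build.gradle.kts -- sketch only.
tasks.named("installFrontend") {
    inputs.files("package.json", "package-lock.json")
    outputs.dir("node_modules")
}

tasks.named("assembleFrontend") {
    inputs.dir("src")
    inputs.file("package.json")
    outputs.dir("$buildDir/dist")
}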

How do I get Maven to ignore local project for local repository version

I have a multi-module Maven project where we shade Google's Guava, relocating its packages into our own package tree so that we don't have to worry about version conflicts. I understand that this is a fairly common practice.
Building the jar works fine, as does building the system generally. However, when executing the site:site goal, Maven calls compiler:testCompile again, and at this point the relocated classes cannot be found. Let me point out that the earlier testCompile worked just fine.
I suspect that this is because the shading module is a peer of the module that fails, and that during the second testCompile execution Maven finds the shading module, looks for classes to compile against, finds none, and so dies. There is a properly named shaded jar in the target directory, and it does contain the classes that are being looked for.
What I think I want to know is: is there a mechanism to tell Maven to look for the jar in the sub-module (the source never existed there) and skip the compiled classes? I suppose, as a last resort, I could extract the contents of the just-built jar into the target/classes directory so they can be found.
Any assistance would be appreciated.
We had the same problem in the past and ended up with a separate release cycle for the shaded artefact and the other modules, while keeping them in the same repo:
/pom.xml - root for all child modules, except the shaded one
/shaded/pom.xml - no ref to parent (no <parent> section), own release cycle
/module1/pom.xml - explicit ref to released version of shaded artefact
...
/moduleN/pom.xml
In the IDE, these two files, /pom.xml and /shaded/pom.xml, were imported as separate projects.
This became even more convenient later on, since the shaded module actually needed to be rebuilt only a few times a year.

Gradle build fails

I have two Gradle projects, A and B. B is a simple Java application and is consumed by A as a compile project dependency. A is a web application.
Both A and B apply the java plugin; A applies the war plugin as well.
When building A, I get the following error:
FAILURE: Build failed with an exception.
* What went wrong:
Could not determine the dependencies of task ':war'.
> Configuration with name 'default' not found.
When building B separately I get no errors. When building from the root I get no errors either. The issue shows up only when building A.
I've also tried copying the B.jar to the lib folder of A and setting a dependency:
compile files("lib/B.jar")
The build in this case works fine.
What configurations do I need to include to avoid the error?
This can happen, when one subproject doesn't see another, if the settings.gradle file is not in a common parent directory of both subprojects and the subprojects are included via includeFlat. In that case you can call any task from the parent project and all subprojects will know about each other, but you won't be able to build a single subproject on its own.
In any case, you need to show your project structure and your build/settings files to track down the problem.
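For comparison, the conventional layout that avoids the 'default' configuration error is a single settings file in the common parent directory plus a project dependency in A (a minimal Kotlin DSL sketch; the project names A and B are taken from the question):

// settings.gradle.kts in the common parent directory of A and B
rootProject.name = "root"
include(":A", ":B")

// A/build.gradle.kts
dependencies {
    implementation(project(":B"))  // "compile project(':B')" in older Groovy builds
}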

get maven clean install to work like maven clean + maven install

I have the following project hierarchy:
app
|-module1
| |-pom.xml
|-module2
| |-pom.xml
|-pom.xml
Module1 and module2 both copy files to the same target directory, so I'm using the app's pom.xml to clear that directory. My problem is that the execution order right now is module1[clean], module1[install], module2[clean], module2[install], app[clean], app[install], so everything module1 and module2 put into that directory gets deleted.
I would like it to execute all the clean phases first and then all the install phases, even when I run mvn clean install. Or, if there is another way to execute app[clean] before module1[install] and module2[install], that would work too.
EDIT
I ended up making a separate module (a NetBeans POM project) just for cleaning. Not the solution I was hoping for, but it works for now.
The root of the problem here is that you're trying to make Maven do something that contradicts Maven's multi-module conventions, and that also conflicts with Maven's understanding of what a target directory is. There is a reason why Maven's reactor operates the way it does: to preserve the Maven convention of how modules are structured in a multi-module build.
In Maven, the target directory is supposed to belong only to one project: each project has its own target directory. In your scenario, there should really be a different target directory for app, module1 and module2.
I suppose your best bet, in order to both achieve your objective and keep your build process flexible, is to:
Have module1 output its own JAR into its own target directory (module1/target).
Have module2 output its own JAR into its own target directory (module2/target).
Add a plugin to app (the parent module) that will collect whatever it needs from module1/target and module2/target into app/target, and do whatever processing on those artifacts.
