I created a small library project with the following build.gradle:
apply plugin: 'java'
apply plugin: 'gwt-base'
apply plugin: 'eclipse'

buildscript {
    repositories {
        mavenCentral()
        jcenter()
        maven {
            url new File(rootProject.projectDir.parentFile, 'repo').toURI()
            // url "https://oss.sonatype.org/content/repositories/snapshots/"
        }
    }
    dependencies {
        classpath 'de.richsource.gradle.plugins:gwt-gradle-plugin:0.6'
    }
}

gwt {
    gwtVersion = '2.7.0'
}
The folder structure looks like this:
/library
/library/Library.gwt.xml
/library/client/HelloWorldWidget.java
The sources are taken from here.
When I perform a gradle build, Gradle generates a jar file which contains neither the sources nor the gwt.xml module file.
How can I force gradle to include the sources and the gwt.xml file in the generated jar?
Here is the solution: to include the *.java files, use:
jar {
    from('src/main/java') {
        include '**/*.java'
    }
}
To include any other resources, such as gwt.xml files, put them into:
/src/main/resources
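If you would rather keep the module file where it currently sits (the library folder from the question's layout) instead of moving it to src/main/resources, a rough sketch like the following should also copy it into the jar:
jar {
    // assumption: the module file lives at library/Library.gwt.xml in the project root
    from('library') {
        include '**/*.gwt.xml'
    }
}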
Alternatively you can use:
jar {
    from project.sourceSets.main.allSource
    from project.sourceSets.main.output
}
When using the Java plugin, Gradle assumes that the project has the standard structure, i.e. 'src/main/java' and 'src/test/java'. Therefore, when executing the build tasks, it simply doesn't find any of those directories.
The best way to fix this is to define your project structure by modifying the existing source sets, which define where your sources are:
sourceSets {
    main {
        java {
            include '/library/**'
        }
    }
    test {
        java {
            include '/test/sources/directory/**'
        }
    }
}
I have a multi-module Java/Kotlin app built with Gradle.
I want to make a .jar so I can launch my app in a terminal, like java -jar myApp.jar.
What is the correct way to build a .jar in a multi-module app?
The .jar generated by IDEA does not run when I try it in the terminal, due to this error:
no main manifest attribute, in /Users/me/IdeaProjects/MyProject/out/artifacts/MyProject_jar/MyProject.jar
project structure:
- :ApplicationName
- :bot-app
- src/main/java/main
Main.java // psvm
- src/main/resources
- META-INF
MANIFEST.MF
build.gradle // module's build
- :data
- :utils
build.gradle // application (root) build
So, in my multi-module project the main class is located in the :bot-app module.
Each module has its own build.gradle, and in the root project I have the app's build.gradle.
Module build.gradle
buildscript {
    repositories {
        ...
    }
    dependencies {
        classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:1.8.0"
    }
}

apply plugin: "org.jetbrains.kotlin.jvm"
apply plugin: 'kotlin-kapt'

group 'org.my_project'
version '2.4.0'

repositories {
    mavenCentral()
    maven { url 'https://jitpack.io' }
}

dependencies {
    implementation "org.jetbrains.kotlin:kotlin-stdlib"
}
And this is my root build.gradle:
buildscript {
    repositories {
        ...
    }
    dependencies {
        classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:1.8.0"
    }
}

apply plugin: 'java'

tasks.withType(Jar) {
    duplicatesStrategy = DuplicatesStrategy.EXCLUDE
    manifest {
        attributes["Main-Class"] = "main.Main"
    }
}
As you can see, I added
tasks.withType(Jar) {
    duplicatesStrategy = DuplicatesStrategy.EXCLUDE
    manifest {
        attributes["Main-Class"] = "main.Main"
    }
}
but it does not work for me. It worked with a single module, but not after refactoring to multiple modules.
How do I build a .jar in a multi-module app?
----
UPD:
If I delete the .gradle folder in the root project and then try to run the app via IDEA, it works well. But when I build artifacts via IDEA, the jar is created but does not run, failing with the error:
no main manifest attribute
And each subsequent build in IDEA fails with the error:
`Execution failed for task ':bot-app:jar'.
Entry META-INF/bot-app.kotlin_module is a duplicate but no duplicate handling strategy has been set.`
If I delete .gradle again, build in IDEA works well.
You need to delete:
tasks.withType(Jar) {
    duplicatesStrategy = DuplicatesStrategy.EXCLUDE
    manifest {
        attributes["Main-Class"] = "main.Main"
    }
}
This code was unnecessary in my case.
Then create MANIFEST.MF at the root of the project.
Note: you must create it via File -> Project Structure -> Artifacts -> +,
and in the field "Directory for META-INF/MANIFEST.MF:" you must set the root folder location (NOT the module folder, but the root).
Remove .gradle, clean, and build artifacts.
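For completeness, a pure-Gradle alternative (not part of the original answer, and only a rough sketch) is to configure the jar task of the module that actually contains the main class, for example in bot-app/build.gradle:
// bot-app/build.gradle -- sketch only; assumes the main class is main.Main
jar {
    manifest {
        attributes 'Main-Class': 'main.Main'
    }
    duplicatesStrategy = DuplicatesStrategy.EXCLUDE
    // bundle the runtime dependencies so `java -jar bot-app.jar` works on its own
    from {
        configurations.runtimeClasspath.collect { it.isDirectory() ? it : zipTree(it) }
    }
}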
I am converting over to using IntelliJ (version 2019.1). The multi-project directory structure used has the standard src/main/java and src/test/java for each project, but additionally has some non-standard ones such as: src/testsupport/java.
Gradlew (using the internal/recommended gradlew packaged within IntelliJ) is used to import the projects. The Gradle build files include both:
apply plugin: 'idea'
apply plugin: 'java'
Edited to improve clarity
Every project imports fine. Interproject references work to the standard directories. However, when I am in Project B, but need access to src/generated/java or src/testsupport/java from Project A, those are not imported (import statements that compile fine from the gradle command line show up as unresolvable within IntelliJ). Is there a configuration change or something needed to make these take effect?
Currently, I have:
subprojects {
    idea {
        module {
            testSourceDirs += project.sourceSets.generated.java.srcDirs
            testSourceDirs += project.sourceSets.testsupport.java.srcDirs
        }
    }
}
You need to help Gradle out by creating a source set for the custom sources your projects define. So, from your question, something like:
(using Kotlin DSL)
allprojects {
    apply {
        plugin("idea")
        plugin("java-library")
    }

    repositories {
        mavenCentral()
    }

    configure<SourceSetContainer> {
        create("generated") {
            compileClasspath += project.the<SourceSetContainer>()["main"].output
            runtimeClasspath += project.the<SourceSetContainer>()["main"].output
        }
        create("testsupport") {
            compileClasspath += project.the<SourceSetContainer>()["main"].output
            runtimeClasspath += project.the<SourceSetContainer>()["main"].output
        }
    }

    val api by configurations
    val testImplementation by configurations
    val testRuntimeOnly by configurations

    dependencies {
        api(platform("org.junit:junit-bom:5.5.1"))
        testImplementation("org.junit.jupiter:junit-jupiter-api")
        testRuntimeOnly("org.junit.jupiter:junit-jupiter-engine")
    }

    val test by tasks.getting(Test::class) {
        useJUnitPlatform()
    }
}
The above will give you the extra generated and testsupport source sets in each project.
Now, to use projectA in projectB, projectB's Gradle file would include a dependency on projectA:
dependencies {
    implementation(project(":projectA"))
}
This should hopefully get you started. Keep in mind, the examples given above use the Kotlin DSL which you should be able to convert back to Groovy.
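For reference, a rough Groovy DSL translation of the same idea (an untested sketch, not part of the original answer) could look like this:
allprojects {
    apply plugin: 'idea'
    apply plugin: 'java-library'

    repositories {
        mavenCentral()
    }

    sourceSets {
        generated {
            compileClasspath += sourceSets.main.output
            runtimeClasspath += sourceSets.main.output
        }
        testsupport {
            compileClasspath += sourceSets.main.output
            runtimeClasspath += sourceSets.main.output
        }
    }

    dependencies {
        api platform('org.junit:junit-bom:5.5.1')
        testImplementation 'org.junit.jupiter:junit-jupiter-api'
        testRuntimeOnly 'org.junit.jupiter:junit-jupiter-engine'
    }

    test {
        useJUnitPlatform()
    }
}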
References:
https://docs.gradle.org/current/userguide/java_plugin.html#source_sets
https://docs.gradle.org/current/userguide/java_testing.html#sec:configuring_java_integration_tests
I have some code that I do not want included in the jar file based on a condition.
My build script looks like
plugins {
    id 'java'
    id 'org.springframework.boot' version '2.0.0.RELEASE'
}

sourceSets {
    main {
        java {
            if (project.environment == 'prod') {
                exclude '**/dangerous/**'
            }
            forEach {
                println it.absolutePath
            }
        }
    }
}
Now, when I run the script with gradlew clean build bootJar -Penvironment=prod, the absolute paths of everything except the dangerous Java files are printed, but those files are still included in the jar.
If I remove the boot plugin and run the jar task, the dangerous class files are still included in the jar.
gradlew clean build jar -Penvironment=prod
plugins {
    id 'java'
}

sourceSets {
    main {
        java {
            if (project.environment == 'prod') {
                exclude '**/dangerous/**'
            }
            forEach {
                println it.absolutePath
            }
        }
    }
}
If I add an exclude clause to the jar task, the dangerous files are not printed, and they are not included in the jar.
gradlew clean build jar -Penvironment=prod
plugins {
    id 'java'
}

sourceSets {
    main {
        java {
            if (project.environment == 'prod') {
                exclude '**/dangerous/**'
            }
            forEach {
                println it.absolutePath
            }
        }
    }
}

jar {
    exclude '**/dangerous/**'
}
If I enable the boot plugin, and use the bootJar task (which inherits from the Jar task) (gradlew clean build bootJar -Penvironment=prod), I do not see the dangerous files printed, but the files are still included in the jar.
plugins {
    id 'java'
    id 'org.springframework.boot' version '2.0.0.RELEASE'
}

sourceSets {
    main {
        java {
            if (project.environment == 'prod') {
                exclude '**/dangerous/**'
            }
            forEach {
                println it.absolutePath
            }
        }
    }
}

bootJar {
    exclude '**/dangerous/**'
}
How can I exclude a java file conditionally with the Spring Boot Gradle Plugin and bootJar task?
I was having the same issue with 2.0.1.RELEASE. I created the jar using the bootJar task and added exclude inside it with the file patterns I wanted to exclude from the executable jar.
This worked fine with Spring Boot version 2.0.4.RELEASE.
bootJar {
    exclude("**/dangerous/*")
}
I narrowed down the problem. I didn't put in all of the plugins up above, because I thought the only important ones were java and spring boot. However, my actual code also uses the protobuf plugin. If I remove the configuration property generatedFilesBaseDir, then it successfully excludes the dangerous directory.
However, this opens up a new question: what the hell is happening?
I was specifying the generated files base dir property so I could reference the generated classes in my source code, but I think I may need to create a different project just for the proto, and add that project as a reference to my main module.
Edit
Making a separate project for the protobuf files and referencing it as a project seems to be a viable workaround for this issue.
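A minimal sketch of that workaround, using hypothetical module names (proto-api holding only the .proto files and the protobuf plugin configuration):
// sketch only -- module names are hypothetical
// settings.gradle
include 'proto-api', 'app'

// app/build.gradle
dependencies {
    implementation project(':proto-api') // generated protobuf classes come from the proto-only module
}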
I'm having a lot of trouble figuring out how to compile a GRPC Java server. I looked all over the grpc.io website and closest thing I found was this: http://www.grpc.io/docs/#quick-start , where I run
../gradlew -PskipCodegen=true installDist to build, and
./build/install/grpc-examples/bin/hello-world-client to run the client. This all works, but only for the hello-world tutorial. I have no idea how to do this for my own client/server. I'm able to generate the client/server protobufs using the .proto file. I looked in their readme and Java tutorial and couldn't find out how to compile the actual server (and client) after I write them
https://github.com/grpc/grpc-java/blob/master/examples/README.md
(I can't link the Java tutorial because I don't have enough reputation). Unless there's documentation I'm missing, does anyone know how to compile a server and client that implement the gRPC classes generated from the .proto file? I did spend a fair amount of time searching. Any advice is much appreciated, thanks.
I also had a similar problem; ensure that:
You have configured protoc correctly, i.e. downloaded the executable and set it up as an environment variable of your OS.
Your build.gradle file includes the protobuf-gradle-plugin:
apply plugin: 'java'
apply plugin: 'idea'
apply plugin: 'com.google.protobuf'

ext.grpcVersion = '1.0.1'
ext.protobufVersion = '3.0.2'

buildscript {
    repositories { mavenCentral() }
    dependencies {
        classpath 'com.google.protobuf:protobuf-gradle-plugin:0.8.0'
    }
}

repositories {
    mavenCentral()
}

dependencies {
    compile "com.google.protobuf:protobuf-java:${protobufVersion}"
    compile "io.grpc:grpc-all:${grpcVersion}"
}

protobuf {
    protoc { artifact = "com.google.protobuf:protoc:${protobufVersion}" }
    plugins {
        grpc { artifact = "io.grpc:protoc-gen-grpc-java:${grpcVersion}" }
    }
    generateProtoTasks {
        ofSourceSet('main')*.plugins { grpc { } }
    }
}

idea {
    module {
        sourceDirs += file("${protobuf.generatedFilesBaseDir}/main/java");
        sourceDirs += file("${protobuf.generatedFilesBaseDir}/main/grpc");
    }
}
Your proto file:
syntax = "proto3";
package com.company.project;
service CompanyService{
rpc call(RequestMessage) returns (ResponseMessage) {}
}
Run gradle clean build, and it should generate your service and client classes for CompanyService.
The idea plugin is just there to tell IntelliJ to recognize src/main/proto as a source set.
To actually execute the client and server, you will need to write the implementations mentioned in the gRPC tutorial, and then apply the application plugin in order to generate the executable jar correctly:
// build.gradle code...
apply plugin: 'application'

mainClassName = 'com.company.project.MainClass'

jar { manifest { attributes('Main-Class': mainClassName) } }
I had a similar issue but solved it in Gradle by adding the 'application' plugin. Before, I was using the 'java' plugin and I could only generate a jar file. After switching to the 'application' plugin there is a Gradle task similar to the gRPC example.
./gradlew installDist
And now to start your server you can run something similar to this:
./build/install/your-project/bin/your-server
To actually generate the Java classes from my .proto files I needed to run './gradlew build' and also include the generated sources using the sourceDirs entries you can see in the build.gradle below.
This is the full build.gradle file.
apply plugin: 'application'
apply plugin: 'com.google.protobuf'
apply plugin: 'idea'

buildscript {
    repositories {
        mavenCentral()
    }
    dependencies {
        classpath 'com.google.protobuf:protobuf-gradle-plugin:0.7.6'
    }
}

repositories {
    jcenter()
}

dependencies {
    compile 'io.grpc:grpc-all:0.14.0'
}

mainClassName = "com.domain.service.YourMainClass"

protobuf {
    protoc {
        artifact = "com.google.protobuf:protoc:3.0.0-beta-2"
    }
    plugins {
        grpc {
            artifact = 'io.grpc:protoc-gen-grpc-java:0.14.0'
        }
    }
    generateProtoTasks {
        all()*.plugins {
            grpc {}
        }
    }
}

idea {
    module {
        sourceDirs += file("${projectDir}/build/generated/source/proto/main/grpc");
        sourceDirs += file("${projectDir}/build/generated/source/proto/main/java");
    }
}
I am new to gRPC so any improvements to my Gradle file would be appreciated.
This question has been answered on groups.google.com by 'Eric Anderson':
The JAR only has your code in it. It sounds like you want to make a "fat" jar which includes all your dependencies. You can do something like this:
jar {
    from {
        configurations.compile.collect {
            it.isDirectory() ? it : zipTree(it)
        }
    }
}
Note that this isn't gRPC-specific; it's just working with Gradle. There may be alternatives; a quick Google search turned up gradle-fatjar-plugin, for example.
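Another commonly used option, not mentioned in the quoted answer, is the Shadow plugin, which adds a shadowJar task that merges the runtime dependencies into a single jar. A minimal sketch (the version number is only an example; pick one matching your Gradle release):
plugins {
    id 'java'
    // Shadow plugin from the Gradle plugin portal
    id 'com.github.johnrengelman.shadow' version '7.1.2'
}

// running `./gradlew shadowJar` then produces an all-in-one jar under build/libs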
As of now we have a project structure with a single source folder named src, which contains the source code for three modules. What I want to do is:
1) Compile the source code. This is easily done with a sourceSets definition:
sourceSets {
    main {
        java {
            srcDir 'src'
        }
    }
}
2) Put the compilation results into three jars. I am doing this now via three separate 'Jar'-type tasks:
util.jar
task utilJar(type: Jar) {
    from(sourceSets.main.output) {
        include "my/util/package/**"
    }
}
client.jar
task clientJar(type: Jar) {
    from(sourceSets.main.output) {
        include "my/client/package/**"
    }
}
server.jar
task serverJar(type: Jar) {
    from(sourceSets.main.output) {
        include "**"
    }
    excludes.addAll(utilJar.includes)
    excludes.addAll(clientJar.includes)
}
The thing is that server.jar should contain all classes that are not contained in client.jar and util.jar. In the Ant build script we solve this problem using the difference Ant task. How can this be done in Gradle (my current approach doesn't work)?
Maybe my approach is completely wrong. Please advise.
P.S. as for now we CAN NOT change the project source code folder structure.
I will post my working solution here as an answer (I got a hint on Gradle's forum).
Scopes in Gradle are a very strange thing :) I thought that every task definition creates an object of some 'Task' class, something like 'JarTask' in this particular case, and that I could then access any property of that class from anywhere in my build.gradle script. However, the only place where I found I can see the patterns included in the jar file is inside a from block of a task. So my working solution for now is to:
1) Define a project-level collection to hold the patterns to be excluded from server.jar.
2) Exclude all those patterns in the from block of the serverJar task.
Please see the final version below:
sourceSets {
    main {
        java {
            srcDir 'src'
        }
    }
}

// holds classes included into client.jar and util.jar, so they are to be excluded from server.jar
ext.serverExcludes = []

// util.jar
task utilJar(type: Jar) {
    from(sourceSets.main.output) {
        include "my/util/package/**"
        project.ext.serverExcludes.addAll(includes)
    }
}

// client.jar
task clientJar(type: Jar) {
    from(sourceSets.main.output) {
        include "my/client/package/**"
        project.ext.serverExcludes.addAll(includes)
    }
}

// server.jar
task serverJar(type: Jar) {
    from(sourceSets.main.output) {
        exclude project.ext.serverExcludes
    }
}
I think the approach is wrong. I recommend making a project with 3 sub projects.
project
- util
- server (depends on util)
- client (depends on util)
If for some reason you cannot change the class structure use this kind of build files:
settings.gradle
include 'util', 'client', 'server'
build.gradle
subprojects {
    apply plugin: 'java'
}

project(':util') {
    sourceSets {
        main {
            java {
                srcDir '../src'
                include 'util/**'
            }
        }
    }
}

project(':server') {
    sourceSets {
        main {
            java {
                srcDir '../src'
                include 'server/**'
            }
        }
    }
    dependencies {
        compile project(':util')
    }
}

project(':client') {
    sourceSets {
        main {
            java {
                srcDir '../src'
                include 'client/**'
            }
        }
    }
    dependencies {
        compile project(':util')
    }
}
You still need directories for subprojects but the sources are in one place as you wanted.
When you run gradle assemble you will have 3 jars, each with a separate set of classes. The advantage of this solution is that we make a proper Gradle multi-module project with correct dependencies, not just tasks for building jars.
Please read Multi-Project Builds.
We have the same problem at my company, i.e. legacy code that is difficult to migrate to a "good" project structure, and the need to build several jars from the same codebase. We decided to define different sourceSets and build each of the sourceSets using standard Gradle.
We then use an iterator to add jar and javadoc tasks for each sourceSet:
sourceSets.all { SourceSet sourceSet ->
    Task jarTask = tasks.create("jar" + sourceSet.name, Jar.class)
    jarTask.from(sourceSet.output)
    // Configure other jar task properties: group, description, manifest etc

    Task javadocTask = tasks.create("javadoc" + sourceSet.name, Javadoc.class)
    javadocTask.setClasspath(sourceSet.output + sourceSet.compileClasspath)
    javadocTask.setSource(sourceSet.allJava)
    // Extra config for the javadoc task: group, description etc

    Task javadocJarTask = tasks.create("javadocJar" + sourceSet.name, Jar.class)
    javadocJarTask.setClassifier("javadoc") // adds "-javadoc" to the name of the jar
    javadocJarTask.from(javadocTask.outputs)
    // Add extra config: group, description, manifest etc
}
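With this loop, each source set named X gets jarX, javadocX and javadocJarX tasks that can be run individually. If you also want the standard build to produce them, one optional (untested) addition would be:
// optional sketch: make `gradle assemble` build every per-sourceSet jar as well
assemble.dependsOn tasks.withType(Jar)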
I agree in principle with the accepted answer too.
I found a project where the client requires two JARs of essentially the same files, except that the Manifest differs only in the Class-Path key.
jar {
    manifest {
        attributes(
            "Main-Class": platformMainClass,
            "Implementation-Title": platformDisplayName,
            "Implementation-Description": platformDescription,
            "Platform-Version": platformVersion,
            "Implementation-Version": version,
            "Build-Assembly-User": System.getProperty("user.name"),
            "Build-Assembly-Date": new java.util.Date().toString(),
            "Class-Path": configurations.compile.collect { "lib/" + it.getName() }.join(' ')
        )
    }
    duplicatesStrategy = DuplicatesStrategy.EXCLUDE
    exclude(['log4j*.properties', 'uk/gov/acme/secret/product/server/**'])
}
The second jar, built from the same source code with a similar manifest, is then:
task applicationClientJar(type: Jar, description: "Creates the Application Client JAR file.") {
    dependsOn compileJava
    manifest {
        attributes(
            "Main-Class": platformMainClass,
            "Implementation-Title": platformDisplayName,
            "Implementation-Description": platformDescription,
            "Platform-Version": platformVersion,
            "Implementation-Version": version,
            "Assembly-Date": new java.util.Date().toString()
        )
    }
    archiveName = "acme-client-${platformVersion}.jar"
    destinationDir = file("${buildDir}/libs")
    from sourceSets.main.output
    duplicatesStrategy = DuplicatesStrategy.EXCLUDE
    exclude(['log4j*.properties', 'uk/gov/acme/secret/product/server/**'])
}
So Grzegorz's approach is correct, because Gradle should know there are two different JARs, each with its own GAV. Multi-module is the preferred option.
compile "uk.gov.acme.secret:acme:1.0" // CORE
compile "uk.gov.acme.secret:acme-client:1.0"
The only way to configure this is to use a multi-module Gradle project and then add a compile and/or deploy dependency on the core / main project.
project(':common:acme-micro-service-webapp') {
    dependencies {
        compile project(':common:acme-core')
    }
}
Inside the 'acme-micro-service-webapp' project, this ensures that the dependent 'common:acme-core' is compiled first.
PS: I am still trying to figure out a better solution.
PPS: If you are using Maven as well, it may be possible to hook into the install task.