I'm evaluating Gradle as a replacement for an Ant build script, and I can't find a solution for creating a standard build script that correctly manages dev/prod environments.
The Ant script (it's for a Java project, not Android) is structured in this way:
a common script with the standard tasks (compile, build-jar, build-war)
a specific project script that includes the first one and, through some properties, defines where the war task should pick up the correct files
Our project structure/tasks allow us to override entire directories in the final war. Let's consider this example:
the dev configuration is the standard one and lives in the dir webcontent
there are multiple prod configurations (one for each specific installation; we have no more than 10 different prod configs), all under the prod dir (i.e. prod/conf1, prod/conf2, etc.)
The Ant build has a dev_build task as well as a prod_conf1_build task, a prod_conf2_build task, etc.
The XXX_build tasks all do the same things:
specify the parent dir (it's a project property) that contains the env dir/files
call the same Ant task that builds the war using the property specified in the calling task
I'm trying to do the same in Gradle, but it seems that even calling a task from another one creates problems (i.e. the task is always up to date).
Here is the script (it's a working draft, I'm learning Gradle) that tries to do the same, but it's not working: when I call war_prod, the task does nothing since it reports up-to-date.
apply plugin: 'java'
apply plugin: 'war'
apply plugin: 'eclipse'

project.ext.envdir = ""

eclipse {
    jdt {
        sourceCompatibility = 1.8
        targetCompatibility = 1.8
        javaRuntimeName = "jdk-1.8.x"
    }
}

// In this section you declare where to find the dependencies of your project
repositories {
    maven {
        url 'http://artifactory.zzzz.priv/artifactory/libs-release'
        url 'http://artifactory.zzzz.priv/artifactory/libs-snapshot'
        credentials {
            username 'xxxx'
            password 'yyyy'
        }
    }
}

// In this section you declare the dependencies for your production and test code
dependencies {
    // The production code uses the SLF4J logging API at compile time
    compile 'org.slf4j:slf4j-api:1.7.18'
    // Declare the dependency for your favourite test framework you want to use in your tests.
    // TestNG is also supported by the Gradle Test task. Just change the
    // testCompile dependency to testCompile 'org.testng:testng:6.8.1' and add
    // 'test.useTestNG()' to your build script.
    testCompile 'junit:junit:4.12'
}

task war_prod {
    project.ext.envdir = 'prod/conf1'
    project.ext.envdir = project.ext.envdir.replaceAll('\\\\', File.pathSeparator)
    project.ext.envdir = project.ext.envdir.replaceAll('/', File.pathSeparator)
    tasks.war.execute()
}

war {
    eachFile {
        println 'endir' + project.ext.envdir
        println 'evaluating' + it
        FileTree tree = fileTree(dir: project.ext.envdir)
        tree.visit { FileVisitDetails file ->
            if (!file.file.isDirectory()) {
                println '\tFileVisitDetails relpath ' + file.relativePath
                println '\tsourcepath ' + it.file.getAbsolutePath()
                println '\tcontains ' + it.file.getAbsolutePath().contains(project.ext.envdir)
                if (it.relativePath == file.relativePath && !it.file.getAbsolutePath().contains(project.ext.envdir)) {
                    it.exclude()
                    println '\texcluding ' + it
                } else {
                    if (it != null) {
                        //println '\tincluding ' + it
                    }
                }
            }
        }
    }
    from 'prod/conf1'
}
Can anyone point me in the right direction for creating a correct gradle script?
Is there a specific gradle way to build war files with prod/dev configurations (where the configuration is represented by some dir and files)?
In such scenarios task rules might be very useful. The basic idea is to keep the configurations in a structured way and use a general rule to build a war file for a given configuration. Please have a look at the build.gradle below:
apply plugin: 'war'

repositories {
    mavenCentral()
}

tasks.addRule("Pattern: buildWar<ENV>") { String taskName ->
    if (taskName.startsWith('buildWar')) {
        def env = (taskName - 'buildWar').toLowerCase()
        if (env in ['dev', 'prod']) {
            task(taskName, type: War) {
                println "Configuring env: $env"
                from("src/main/conf/$env") {
                    into("conf")
                }
            }
        } else {
            println "Invalid env: $env, skipping."
        }
    }
}
The buildWar<ENV> rule defined here is pretty self-descriptive. It accepts two environments, dev and prod, and prepares the war file by taking the configuration from the appropriate folder. You can find a demo here. In case of questions, just ask.
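For example (my own illustration, not part of the original answer), the war for a given environment would then be built by invoking the rule-generated task by name:

gradle buildWarDev
gradle buildWarProd

Each call copies src/main/conf/dev (or .../prod) into the conf folder inside the resulting war.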
P.S. Gradle has a rather different working model than Ant, so start with the basics. And, importantly, never run a task from within another task.
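If one task really does need to trigger another, the idiomatic way is to declare a task dependency rather than calling execute(). A minimal sketch (this only illustrates the wiring; it does not solve the per-environment envdir configuration, which the rule above handles):

// instead of calling tasks.war.execute() from inside war_prod:
task war_prod {
    dependsOn war
}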
Related
I have some code that I do not want included in the jar file based on a condition.
My build script looks like
plugins {
    id 'java'
    id 'org.springframework.boot' version '2.0.0.RELEASE'
}
sourceSets {
    main {
        java {
            if (project.environment == 'prod') {
                exclude '**/dangerous/**'
            }
            forEach {
                println it.absolutePath
            }
        }
    }
}
Now, when I run the script with gradlew clean build bootJar -Penvironment=prod, the absolute paths of everything but the dangerous Java files are printed, but those files are still included in the jar.
If I remove the boot plugin and run the jar task, the dangerous class files are still included in the jar.
gradlew clean build jar -Penvironment=prod
plugins {
    id 'java'
}
sourceSets {
    main {
        java {
            if (project.environment == 'prod') {
                exclude '**/dangerous/**'
            }
            forEach {
                println it.absolutePath
            }
        }
    }
}
If I add an exclude clause to the jar task, the dangerous files are not printed, and they are not included in the jar.
gradlew clean build jar -Penvironment=prod
plugins {
    id 'java'
}
sourceSets {
    main {
        java {
            if (project.environment == 'prod') {
                exclude '**/dangerous/**'
            }
            forEach {
                println it.absolutePath
            }
        }
    }
}
jar {
    exclude '**/dangerous/**'
}
If I enable the boot plugin and use the bootJar task (which inherits from the Jar task) with gradlew clean build bootJar -Penvironment=prod, I do not see the dangerous files printed, but the files are still included in the jar.
plugins {
    id 'java'
    id 'org.springframework.boot' version '2.0.0.RELEASE'
}
sourceSets {
    main {
        java {
            if (project.environment == 'prod') {
                exclude '**/dangerous/**'
            }
            forEach {
                println it.absolutePath
            }
        }
    }
}
bootJar {
    exclude '**/dangerous/**'
}
How can I exclude a java file conditionally with the Spring Boot Gradle Plugin and bootJar task?
I was having the same issue with 2.0.1.RELEASE. I created the jar using the bootJar task and added an exclude inside it with the file patterns I wanted to keep out of the executable jar.
This worked fine with Spring Boot 2.0.4.RELEASE.
bootJar {
    exclude("**/dangerous/*")
}
I narrowed down the problem. I didn't put in all of the plugins up above, because I thought the only important ones were java and spring boot. However, my actual code also uses the protobuf plugin. If I remove the configuration property generatedFilesBaseDir, then it successfully excludes the dangerous directory.
However, this opens up a new question: what the hell is happening?
I was specifying the generated-files base dir property so I could reference the generated classes in my source code, but I think I may need to create a separate project just for the proto files and add that project as a dependency of my main module.
Edit
Making a separate project for the protobuf files and referencing it as a project dependency seems to be a viable workaround for this issue.
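A minimal sketch of that layout, assuming the protobuf sources live in a sibling module named proto (the names here are hypothetical, not from the original post):

// settings.gradle
include 'proto'

// build.gradle of the main module
dependencies {
    compile project(':proto') // or implementation on newer Gradle versions
}

The protobuf plugin and its generatedFilesBaseDir setting then stay confined to the proto module, so they don't interact with the main module's sourceSets excludes.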
I'm trying to achieve a simple scenario in my Spring Boot project build: including/excluding dependencies and packaging a war or jar depending on the environment.
So, for example, for the dev environment include devtools and package a jar; for prod, package a war, etc.
I know it is not XML-based configuration anymore and I can basically write if statements in my build.gradle, but is there a recommended way of achieving this?
Can I declare some common dependencies and refer to them in a single file instead of creating multiple build files?
Is there a best practice changing build configuration based on the build target environment?
ext {
    devDependencies = ['org.foo:dep1:1.0', 'org.foo:dep2:1.0']
    prodDependencies = ['org.foo:dep3:1.0', 'org.foo:dep4:1.0']
    isProd = System.properties['env'] == 'prod'
    isDev = System.properties['env'] == 'dev'
}

apply plugin: 'java'

dependencies {
    compile 'org.foo:common:1.0'
    if (isProd) {
        compile prodDependencies
    }
    if (isDev) {
        compile devDependencies
    }
}

if (isDev) tasks.withType(War).all { it.enabled = false }
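Since this variant reads System.properties, the environment would be selected with a JVM system property on the command line, e.g.:

gradle build -Denv=prod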
My version (inspired by Lance Java's answer):
apply plugin: 'war'

ext {
    devDependencies = {
        compile 'org.foo:dep1:1.0', {
            exclude module: 'submodule'
        }
        runtime 'org.foo:dep2:1.0'
    }
    prodDependencies = {
        compile 'org.foo:dep1:1.1'
    }
    commonDependencies = {
        compileOnly 'javax.servlet:javax.servlet-api:3.0.1'
    }
    env = findProperty('env') ?: 'dev'
}

dependencies project."${env}Dependencies"
dependencies project.commonDependencies

if (env == 'dev') {
    war.enabled = false
}
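Here env is read with findProperty, i.e. it is a project property, so it would be passed with -P instead, for example:

gradle war -Penv=prod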
Sometimes it's also useful to completely switch between different build files by adding some lines of code to the file settings.gradle. This solution reads the environment variable BUILD_PROFILE and inserts it into the buildFileName:
// File: settings.gradle
println "> Processing settings.gradle"
def buildProfile = System.getenv("BUILD_PROFILE")
if (buildProfile != null) {
    println "> Build profile: $buildProfile"
    rootProject.buildFileName = "build-${buildProfile}.gradle"
}
println "> Build file: $rootProject.buildFileName"
Then you run gradle like this, e.g. to use build-local.gradle:
$ BUILD_PROFILE="local" gradle compileJava
> Processing settings.gradle
> Build profile: local
> Build file: build-local.gradle
BUILD SUCCESSFUL in 3s
This approach also works for CI/CD pipelines, where you might want to add extra tasks like checking quality gates or other time-consuming things you don't want to execute locally.
I'm using Gretty to run my web application via gradle appRun. I'm also using the Gradle Asset Pipeline plugin to compile my Less files to CSS.
I want to integrate with Gretty's Fast reload feature so that when I change a Less file, it automatically compiles it and copies the CSS to the in-place web-app.
I have implemented a solution using Gretty's onScanFilesChanged setting in my build.gradle file:
buildscript {
    dependencies {
        classpath 'org.akhikhl.gretty:gretty:1.2.4'
        classpath 'com.bertramlabs.plugins:asset-pipeline-gradle:2.7.0'
        classpath 'com.bertramlabs.plugins:less-asset-pipeline:2.7.0'
    }
}

apply plugin: 'java'
apply plugin: 'eclipse'
apply plugin: 'war'
apply plugin: 'org.akhikhl.gretty'
apply plugin: 'com.bertramlabs.asset-pipeline'

dependencies {
    // ...
}

assets {
    excludes = ['bootstrap/**']
}

war.dependsOn assetCompile

gretty {
    servletContainer = 'tomcat8'
    enableNaming = true
    contextPath = '/'
    // This affects the war task as well
    webappCopy {
        from 'build/assets', { into 'stylesheet' }
    }
    afterEvaluate {
        prepareInplaceWebAppFolder.dependsOn assetCompile
    }
    scanDir "src/assets"
    fastReload "src/assets"
    onScanFilesChanged { List<String> files ->
        if (files.findAll { it.endsWith ".less" }.size() > 0) {
            assetCompile.compile()
        }
    }
}
Is there a neater way to do this that doesn't involve so much code in the build.gradle file?
The behavior you are describing is what Gretty is doing by default. The documentation states:
fastReload: When set to true (the default), webAppDir folder (which is typically src/main/webapp) is set as being fast-reloaded. That means: whenever some files within webAppDir are changed, these files are copied into running web-app without web-app restart.
This means any change in a subdirectory of src/main/webapp will trigger Gretty's fast reload, but any change made outside of this directory triggers a server restart.
A smarter approach to your problem would be to override the output path of the assetCompile task to a subdirectory of src/main/webapp, or to hook a copy task onto it like this in your build.gradle file:
task copyAssets(type: Copy) {
    from buildDir + '/assets'
    into webAppDir + '/stylesheet'
}
copyAssets.shouldRunAfter assetCompile
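Note that shouldRunAfter only orders the two tasks when both are scheduled; something still has to trigger copyAssets. One option (an assumption on my part, not from the original answer) is to hook it into the same Gretty task the question already wires up:

afterEvaluate {
    // make sure the copied stylesheets end up in the in-place web app
    prepareInplaceWebAppFolder.dependsOn copyAssets
}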
I have a Gradle build script which has to instantiate a Java class in a Task and call a method on the created object. Currently, I have the following:
apply plugin: 'java'

dependencies {
    compile files("libs/some.library.jar")
}

task A << {
    def obj = new some.library.TestClass()
    obj.doSomething()
}
The problem is that the class some.library.TestClass is not found. I read this article about how to use Groovy classes in Gradle, but I need my Java class to come from an external JAR file. How can I add a jar to the build source? It seems that the dependencies block doesn't do what I expect it to do. Can anyone give me a hint in the right direction?
The dependency compile files("libs/some.library.jar") is added as a project dependency, not as a dependency of the build script itself. What you need to do is add this dependency to the script's classpath scope:
apply plugin: 'java'

buildscript {
    dependencies {
        classpath files("libs/some.library.jar")
    }
}

task A << {
    def obj = new some.library.TestClass()
    obj.doSomething()
}
Now it should work.
I'm currently editing two Gradle projects ("database" and "masterdata"). Masterdata depends on the database project. Database is published to a Nexus server, from where it is loaded by masterdata as a dependency.
The build.gradle of masterdata:
import org.gradle.plugins.ide.eclipse.model.SourceFolder

apply plugin: "java"
apply plugin: "eclipse"

sourceCompatibility = 1.7
version = '0.1-SNAPSHOT'
group = "net.example"

def nexusHost = "http://nexus:8081"

repositories {
    logger.lifecycle("Configuration: Repositories")
    maven {
        url nexusHost + "/nexus/content/groups/public"
    }
}

dependencies {
    logger.lifecycle("Configuration: Dependencies")
    compile 'net.example:database:0.1-SNAPSHOT' // project where the changes happen
    compile 'com.google.guava:guava:14.0.1'
    testCompile 'ch.qos.logback:logback-classic:1.0.13'
    testCompile 'org.testng:testng:6.8.5'
    testCompile 'org.dbunit:dbunit:2.4.9'
    testCompile 'org.mockito:mockito-all:1.9.5'
    testCompile 'org.easytesting:fest-assert-core:2.0M10'
    testCompile 'org.hsqldb:hsqldb:2.2.9'
}

eclipse.classpath.file {
    beforeMerged { classpath ->
        classpath.entries.clear()
        logger.lifecycle("Classpath entries cleared")
    }
    whenMerged { cp ->
        cp.entries.findAll { it instanceof SourceFolder && it.path.startsWith("src/main/") }*.output = "bin/main"
        cp.entries.findAll { it instanceof SourceFolder && it.path.startsWith("src/test/") }*.output = "bin/test"
        cp.entries.removeAll { it.kind == "output" }
        logger.lifecycle("Classpath entries modified")
    }
}
When I change something in the database project, it needs a complete build, publish, etc. until I see the changes in the masterdata project. At the company where I previously worked we had a similar setup using Maven, and there I saw changes in dependencies immediately, without publishing them first. Is this also possible with Gradle, maybe via multi-project builds?
Basically, the following entry is missing from .classpath:
<classpathentry combineaccessrules="false" kind="src" path="/database"/>
Is there a way to automate generating it?
Update: As a workaround I add the entry manually to the .classpath file.
I did some additional searching, and currently this is only possible with multi-project builds. Basically you need all your projects in one big multi-project build; there you can reference them as you like and, it seems, get the correct dependencies in Eclipse.
There is a Jira issue with a feature request to make this possible without a multi-project build. Custom logic for Eclipse will only help with builds done inside Eclipse, because a Gradle build would still use the dependency from the repository, where the changes are missing. You need to make sure that all changed dependencies are built and published before building the main project.
Eclipse workaround:
// requires these imports at the top of build.gradle:
import org.gradle.plugins.ide.eclipse.model.Library
import org.gradle.plugins.ide.eclipse.model.ProjectDependency

eclipse.classpath.file {
    whenMerged { cp ->
        // remove the library (jar) dependency on someProject
        def toBeRemoved = cp.entries.findAll {
            it instanceof Library && ((Library) it).library.path.contains('someProject')
        }
        // configure the project dependencies:
        def toBeAdded = [new ProjectDependency('/someProject', null)]
        cp.entries -= toBeRemoved
        cp.entries += toBeAdded
    }
}
This would still fail when doing a manual Gradle build, but if you're using a CI system with a good build order you should be fine.
Solution: Multi-Project Build
Creating a multi-project build is easier than I thought, and it is also possible when the "subproject" sits at the same directory level.
settings.gradle:
includeFlat 'database'
build.gradle:
dependencies {
    ...
    compile project(':database')
    ...
}
This works well with Gradle builds and with Eclipse. The only disadvantage is that you always have to check out the subproject every time you check out the project that depends on it. I'm sure someone can build some fancy Groovy logic to fix that.
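A possible sketch of such logic (my own assumption, not from the original post): only include the sibling project when it has actually been checked out, and fall back to the published artifact otherwise.

// settings.gradle
if (new File(settingsDir, '../database').exists()) {
    includeFlat 'database'
}

// build.gradle
dependencies {
    if (findProject(':database') != null) {
        compile project(':database') // local checkout, built as part of this build
    } else {
        compile 'net.example:database:0.1-SNAPSHOT' // fall back to the Nexus artifact
    }
}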