Gradle "setup" tasks (pre-build / compile / jar) - java

Relatively new to Java and Gradle -- trying to do things "right". Prior to building my application (I've added the Gradle "application" plugin) I want to set up some environment and system things -- for example, I'd like to create the log/ directory and a log.txt file.
So I'm doing something like:
task setup {
    println 'Setup task executing ...'
    File d = new File('log');
    d.mkdir();
    f = new File(d.getPath() + '/log.txt');
    f.createNewFile();
}
Which works -- but I get a bunch of stdout warnings when running > gradle setup
Setup task executing ...
Creating properties on demand (a.k.a. dynamic properties) has been deprecated and is scheduled to be removed in Gradle 2.0. Please read http://gradle.org/docs/current/dsl/org.gradle.api.plugins.ExtraPropertiesExtension.html for information on the replacement for dynamic properties.
Deprecated dynamic property: "f" on "task ':setup'", value: "log/log.txt".
:setup UP-TO-DATE
So one question: What is the correct way to leverage Gradle to perform setup / installation tasks? (This should only really be executed once, when the application is deployed)

Ah, you are mixing task configuration and execution. This:
task foo {
    // Stuff
}
is not the same as this:
task foo << {
    // Stuff
}
In the first, "stuff" is run at configuration time, leading to the warnings that you're seeing (because f is interpreted as a project variable during this phase). In the second, it's run at execution time.
(Gradle is great, but this very subtle syntax distinction can be the source of many infuriating bugs!)
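(For completeness: the << operator was later deprecated and removed in Gradle 5.0, so on current Gradle versions the execution-time variant is written with a doLast block. A minimal sketch of the setup task in that style, using def to avoid the dynamic-property warning:)

```groovy
// Everything inside doLast runs at execution time, not configuration time.
task setup {
    doLast {
        println 'Setup task executing ...'
        def logDir = new File(projectDir, 'log')
        logDir.mkdirs()
        new File(logDir, 'log.txt').createNewFile()
    }
}
```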
As for how to do setup properly: since you're using the Application plugin, you should look into "Including other resources in the distribution" in its documentation.
(You should also consider moving the directory-creation logic into your application itself, as ideally you want it to be robust against someone deleting the log directory!)
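(Following that last suggestion, the directory creation can live in application startup code instead of the build. A sketch -- the class and method names here are mine, not from the question:)

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class LogSetup {
    // Idempotent: safe to call on every startup, and it recreates the
    // directory and file if someone deleted them.
    static Path ensureLogFile(Path baseDir) throws IOException {
        Path logDir = baseDir.resolve("log");
        Files.createDirectories(logDir);          // no-op if it already exists
        Path logFile = logDir.resolve("log.txt");
        if (Files.notExists(logFile)) {
            Files.createFile(logFile);
        }
        return logFile;
    }

    public static void main(String[] args) throws IOException {
        System.out.println(ensureLogFile(Path.of("build-test-dir")));
    }
}
```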

Related

gradle: tar task not creating a tar.gz

Hi I have a tar task that I made after looking at numerous methods and some SO posts.
task buildDist(type: Tar, dependsOn: jar) {
    print 'here'
    archiveName = 'xyz-' + version
    destinationDir = file('build/dist')
    extension = 'tar.gz'
    compression = Compression.GZIP
    from 'build/libs'
    include 'xyz.jar'
}
buildDist.mustRunAfter jar
I have the java plugin applied and the jar task makes the xyz.jar file available under build/libs. The build/dist directory does not exist yet, but I tried new File("build/dist") as well. That did not work either - I even pointed it at the build directory, which does exist - it doesn't work. I run the entire script with ./gradlew clean build. The print in the above code does print.
I am making a few assumptions here as you didn't post the output from running Gradle.
The build task is just a normal Gradle task that doesn't do anything by itself. Instead, it depends on other tasks. If you create your own custom task and you would like to have it included when executing build, you have to add a dependency to it. If this is not the problem and you have actually done this, please give some more details as to what makes it "not work" when you run build.
If you want to test your task in isolation (e.g. to make sure it works correctly without running unit tests or whatever else that is unrelated), just run gradlew cleanBuildDist buildDist.
A note about the 'print' statement - it executes during the configuration phase, but this doesn't mean you can use it to test whether the task actually executes. In fact, it will most likely print no matter what task you execute. If you wanted to print something at execution time, you would have to put it in a doLast block.
There are a few other things you should change as well:
It is not a good practice to use relative references. Instead, use the buildDir property to get an absolute reference to the build directory.
Don't use deprecated methods like archiveName and destinationDir. Use archiveFileName and destinationDirectory instead.
The extension property is also deprecated, but it is ignored if you set the full name of the archive yourself. So just remove it. This also means you are missing the extension on the full name.
The from and include combination is a little fragile. Just use from jar.archivePath if you only want to gzip your application jar.
Example:
task buildDist(type: Tar, dependsOn: jar) {
    archiveFileName = "${jar.baseName}-${version}.tar.gz"
    destinationDirectory = file("$buildDir/dist")
    compression = Compression.GZIP
    from jar.archivePath
}
build.dependsOn buildDist
Lastly, if your intention is to create a distribution of your application that is runnable on its own (with all required dependencies), you should consider using the distribution plugin and perhaps also the application plugin.
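(A minimal sketch of the distribution plugin approach mentioned above -- the plugin and its distTar task are standard, but the exact contents block is an assumption based on the question:)

```groovy
apply plugin: 'distribution'

distributions {
    main {
        contents {
            from jar   // ship the application jar produced by the java plugin
        }
    }
}

// The plugin adds distTar/distZip tasks; gzip the tar variant:
distTar {
    compression = Compression.GZIP
}
```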

Local Unit test: getResourceAsStream when reading production resource

There are a ton of questions regarding getResource or getResourceAsStream returning null, and so far I understand the issue, but I currently cannot properly solve it.
I have a resource file which is used by some class in production. The file is located in
app\src\main\res\raw\some.def
The class SomeManager uses this to access this file:
InputStream stream = SomeClass.class.getResourceAsStream("/res/raw/some.def");
This succeeds when running the debug variant of the application on the emulator and it also succeeds when running the debug variant of the instrumented tests. I assume because the resource is properly packaged into the jar?
However, when I run local JUnit tests in Android Studio, this resource is not found. I did not fully understand what exactly is executed when running a local test, and I am not sure how to provide the resource file in a way that it can be loaded in a test.
I would like to avoid doubling this resource file because it is actually something I want to test, I also would like to not change the getResourceAsStream path because this is the production file I want to test.
I am using gradle and android studio if that matters.
I debugged this issue with Sysinternals' Process Monitor and realized that when I run code locally on my machine, resources are looked up as streams from various locations on disk. One of those locations is
<build_directory>/intermediates/classes/<build_type>, where it is obviously missing.
The solution to this was to create a copy task that performs the copying and make it robust enough to work for all build types.
So I modified my app's gradle file and added those dynamic tasks:
android.buildTypes.all { theBuildType ->
    task "copyDebugAssets_${theBuildType.name}"(type: Copy) {
        from "${projectDir}/src/main/res"
        into "${buildDir}/intermediates/classes/${theBuildType.name}/res"
        eachFile { println it.name }
    }
}
tasks.whenTaskAdded { task ->
    // println "A message which is logged at QUIET level ${task.name}"
    if (task.name.startsWith("process") && task.name.endsWith("Resources")) {
        def partInBetween = task.name.substring("process".length(), task.name.indexOf("Resources")).toLowerCase()
        if (partInBetween == "debugandroidtest") {
            return
        }
        def dependentTask = "copyDebugAssets_${partInBetween}"
        println "Registering ${dependentTask} to ${task.name} for config '${partInBetween}'"
        task.dependsOn dependentTask
    }
}
I have really no idea how to properly use gradle, but the first statement generates as many copyDebugAssets_xxx tasks as there are build types. After syncing you can see and execute them in the Gradle projects view.
To avoid having to call them manually whenever a clean or rebuild is done, the second part registers the copyDebugAssets_xxx copy tasks as dependencies of the various process<Configuration>Resources tasks, which are then called automatically. So far I can run local unit tests in multiple build types successfully.

JOOQ Code Generation Before Source Code Compilation

I am using the JOOQ code generation tool to generate source code for my schema (MySQL). I would like to generate the source code every time I compile my project, but I am not able to, because when I run the code generation Gradle task, the compiler starts complaining about references to the deleted source code.
Here is what I did:-
Created an empty Spring Boot project.
Generated source code using a config xml (jooq.xml).
Triggered code generation using a Gradle task.
Build.gradle
task generateJooqDatabaseSource(type: JavaExec) {
    classpath = sourceSets.main.runtimeClasspath
    main = 'org.jooq.util.GenerationTool'
    args = ['/jooq.xml']
    standardOutput = System.out
    errorOutput = System.err
}
Used the generated source code and wrote SQLs using JOOQ.
Everything is fine till here. But now I don't want to Push the generated Java Classes to my Project. I would like it to create every time when I compile my Project.
So let's delete the generated source code and re-generate it (say, for my test environment).
But as soon as I run the Gradle task generateJooqDatabaseSource, it starts complaining about the generated code references:
error: package autogenered.jooq.code.db.tables does not exist
import autogenered.jooq.code.db.tables.Author;
Tried googling the problem and found suggestions to use plugins like flyway, suggested here
But I really don't want to add another plugin if it can be achieved easily without it.
PS:- I just started to use Gradle and JOOQ a couple of days ago; apologies if the answer is obvious.
Adding the following lines in build.gradle did the trick for me:
compileJava.dependsOn(generateJooqDatabaseSource)
generateJooqDatabaseSource.dependsOn = [processResources, processTestResources]
IntelliJ-specific configuration: added the gradle build task to be triggered every time I do Make Project (Ctrl-F9) or Re-build Project.

Turn off parts of code to speed up build times (Gradle)

I have an Android project that has grown with time, and with the size have grown the gradle build times.
It was bearable while it was under the 65k limit - around 14s.
Now with multidex it takes 36s.
So my question is - are there any ways to "turn off" parts of the code that are not being used so it's back under the 65k limit?
For example, turn off the Amazon S3 SDK, which is brought in via gradle and adds many thousands of methods.
I know you can strip code with proguard, but that just bumps up the build time even higher.
I'm happy with it crashing at runtime when I open the parts that use it, just want to make testing quicker.
At the moment when I remove amazon from gradle imports, I obviously get this:
Error:(24, 26) error: package com.amazonaws.auth does not exist
Is there a way to somehow ignore the error? I know that in Picasso, it has a runtime check to see if you have OkHttp, and if you haven't - use standard networking.
static Downloader createDefaultDownloader(Context context) {
    if (SDK_INT >= GINGERBREAD) {
        try {
            Class.forName("com.squareup.okhttp.OkHttpClient");
            return OkHttpLoaderCreator.create(context);
        } catch (ClassNotFoundException ignored) {}
    }
    return new UrlConnectionDownloader(context);
}
Is there something like this I could do? Or any other way?
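(For reference, the Picasso check above boils down to a plain Class.forName probe; stripped of the Picasso specifics it looks like this -- the method and return values here are mine, for illustration only:)

```java
public class OptionalDependency {
    // Decide which implementation to use based on whether the optional
    // library is present on the classpath at runtime.
    static String pickDownloader() {
        try {
            Class.forName("com.squareup.okhttp.OkHttpClient");
            return "okhttp";
        } catch (ClassNotFoundException ignored) {
            return "urlconnection"; // fallback when the library is absent
        }
    }

    public static void main(String[] args) {
        System.out.println(pickDownloader());
    }
}
```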
The only realistic way of doing this (that I'm aware of) is to refactor your project so that your packages are split into separate modules. You would therefore have separate gradle build files for each module, but would only have to recompile each module whenever they were touched. You could, for instance, have a data access package and a UI package. That seems like a pretty natural split.
I realize that this is a disappointing answer but the issue you're complaining about is that your build dependencies require all those extra unnecessary libraries and method calls: not that your code uses them.
The only other tip I can give you is that the Google Play Services API kit has tens of thousands of method calls. If you can include only the pieces that you're using, you stand a much better chance of staying beneath the 65k limit.
It is possible to specify compile-time dependencies for each build type independently. I use this method to include "production-only" dependencies in only the release builds, reducing the method count for debug builds.
For example, I only include Crashlytics in release builds. So in build.gradle I include the dependency for only my release build (and beta and alpha):
releaseCompile('com.crashlytics.sdk.android:crashlytics:2.5.5@aar') {
    transitive = true;
}
Then I abstract the functionality of Crashlytics into a class called CrashReportingService. In my debug source code, this class does nothing:
/app/src/debug/java/com/example/services/CrashReportingService.java:
public class CrashReportingService {
    public static void initialise(Context context) {
    }

    public static void logException(Throwable throwable) {
    }
}
And I flesh out the implementation in my release source code:
/app/src/release/java/com/example/services/CrashReportingService.java
public class CrashReportingService {
    public static void initialise(Context context) {
        Fabric.with(context, new Crashlytics());
    }

    public static void logException(Throwable throwable) {
        Crashlytics.getInstance().core.logException(throwable);
    }
}
Crashlytics is now only included in release builds and there is no reference to Crashlytics in my debug builds. Back under 65k methods, hooray!
I have got another option. It also helps to speed things up, though not exactly in the way you asked: use the Gradle daemon.
If you use the new Gradle build system with Android (or Android Studio) you might have realized that even the simplest Gradle call (e.g. gradle projects or gradle tasks) is pretty slow. On my computer it took around eight seconds for that kind of Gradle call. You can decrease this startup time (on my computer down to two seconds) if you tell Gradle to use a daemon to build. Just create a file named gradle.properties in the following directory:
/home/<username>/.gradle/ (Linux)
/Users/<username>/.gradle/ (Mac)
C:\Users\<username>\.gradle (Windows)
Add this line to the file:
org.gradle.daemon=true
From now on Gradle will use a daemon to build, whether you are using Gradle from the command line or building in Android Studio. You could also place the gradle.properties file in the root directory of your project and commit it to your SCM system. But then you would have to do this for every project (if you want to use the daemon in every project).
Note: If you don’t build anything with Gradle for some time (currently
3 hours), it will stop the daemon, so that you will experience a long
start-up time at the next build.
How does the Gradle Daemon make builds faster?
The Gradle Daemon is a long lived build process. In between builds it waits idly for the next build. This has the obvious benefit of only requiring Gradle to be loaded into memory once for multiple builds, as opposed to once for each build. This in itself is a significant performance optimization, but that's not where it stops.
A significant part of the story for modern JVM performance is runtime code optimization. For example, HotSpot (the JVM implementation provided by Oracle and used as the basis of OpenJDK) applies optimization to code while it is running. The optimization is progressive and not instantaneous. That is, the code is progressively optimized during execution which means that subsequent builds can be faster purely due to this optimization process.
Experiments with HotSpot have shown that it takes somewhere between 5
and 10 builds for optimization to stabilize. The difference in
perceived build time between the first build and the 10th for a Daemon
can be quite dramatic.
The Daemon also allows more effective in memory caching across builds. For example, the classes needed by the build (e.g. plugins, build scripts) can be held in memory between builds. Similarly, Gradle can maintain in-memory caches of build data such as the hashes of task inputs and outputs, used for incremental building.
Some other ways to speed up the process
How to Speed Up Your Gradle Build From 90 to 8 Minutes?
How to optimize gradle build performance regarding build duration and RAM usage?

Gradle test tasks with webcontext fails to use entities

I have an issue with a gradle task that should run tests. Since these are legacy tests migrated from ant, we do not want to include them in our regular test suite - especially considering that the ant ones are in TestNG, while those made by us and used on a regular basis use Spock and JUnit.
The problem is that those tests use some context which works pretty well when I run them under the Eclipse IDE, but it fails if I try to do something like:
task testNgTesting(type: Test, dependsOn: testClasses) {
    useTestNG()
    includes = ["**/*IT*"]
}
But when I use that task I get errors like "org.hibernate.MappingException: Unknown entity" or "java.lang.IllegalArgumentException: No query defined for that name".
Actually the problem is deeper. Gradle tries to be smart: whatever source folders are defined, it puts class files into the classes folder and all other files into resources. When persistence.xml is loaded, it scans for annotated entities starting from the classpath root of the folder it is present in (i.e. build/resources/main). Since all those classes are in build/classes/main, it fails to find them. The workaround I've made is to introduce two copy tasks: one copies persistence.xml into the classes folder, and another moves the file back out of it after the tests are finished. You might want to use something like
testNgTesting.finalizedBy(cleanAfterIntegrationTests) to make sure the cleanup occurs even if some tests fail.
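(A sketch of the workaround described above -- the task names are assumptions, and the paths follow the pre-Gradle-4.0 build/classes/main layout mentioned in the answer:)

```groovy
task copyPersistenceXml(type: Copy) {
    from "${buildDir}/resources/main/META-INF"
    into "${buildDir}/classes/main/META-INF"
    include 'persistence.xml'
}

// Remove the copy again after the tests, so only resources keeps the file.
task cleanAfterIntegrationTests(type: Delete) {
    delete "${buildDir}/classes/main/META-INF/persistence.xml"
}

testNgTesting.dependsOn copyPersistenceXml
testNgTesting.finalizedBy(cleanAfterIntegrationTests)
```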
