How do I include a local jar dependency in Gradle? - java

This must be fairly simple to accomplish, but I can't seem to get it right.
I have a Gradle task that creates a jar from some external classes, and my code is heavily dependent on those classes. When I try to build, I get errors from compileJava saying package <com.etc...> does not exist for the import lines in my classes.
Here's the relevant code:
project.ext.set("myVersion", "v1")
dependencies {
// tried this, but it gives me a circular dependency error for my compile & zip tasks
compile files('${buildDir}/dist/my-jar-${project.myVersion}.jar') {
builtBy 'zipExternalClasses'
}
// tried either of these, but still get package does not exist
compile files('${buildDir}/dist/my-jar-${project.myVersion}.jar')
runtime files('${buildDir}/dist/my-jar-${project.myVersion}.jar')
}
// The dependent task compileExternalClasses compiles the classes from a source folder
// I can see that the jar is successfully created in 'build/dist'
task zipExternalClasses(dependsOn: 'compileExternalClasses', type: Jar) {
// code for zipping compiled external classes
}

This is what I do to include local jars.
I place them in an app/libs folder.
Then my build.gradle (Module: app) looks like this:
dependencies {
compile project(':protobuf')
compile files('libs/android-support-v13.jar')
}
Where "android-support-v13.jar" is the jar file i previously placed in libs folder.

I think the problem might be that you are using single quotes in your file path. A single-quoted String in Groovy does not do String interpolation, so what you essentially get as your path is the literal ${buildDir}/dist/my-jar-${project.myVersion}.jar, which is clearly not right.
Just try double quotes like below:
dependencies {
compile files("${buildDir}/dist/my-jar-${project.myVersion}.jar")
}
The variables buildDir and project.myVersion will be substituted with their real values when the String is evaluated.
Take a look at the Groovy documentation about String and GString; I'm sure you'll find it useful.
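For illustration, a minimal plain-Groovy sketch of the difference between the two quote styles (the variable here is just an example, not taken from the build above):
def myVersion = 'v1'
println 'my-jar-${myVersion}.jar'   // prints: my-jar-${myVersion}.jar (no interpolation)
println "my-jar-${myVersion}.jar"   // prints: my-jar-v1.jar (GString interpolation)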

Related

Gradle include outside resource including the containing folder

I am converting a project built with Ant to use Gradle. The project looks something like this:
root
|-RelevantProject
...
|-LotsOfOtherSubprojects
...
|-Resources
|--resources
|---subfolder
|----bunchOfProps.properties
The code references these as Resources/resources/subfolder/bunchOfProps.properties. This code and the folder structure cannot be changed, as the Ant scripts need to keep functioning.
I have attempted to include this as
sourceSets {
main {
resources {
srcDir '../Resources'
}
}
}
This fails because the top-level Resources folder is now cut off; the code would work if it were looking for resources/subfolder/bunchOfProps.properties.
I have also attempted compile files('../Resources'), with the same problem (hard to say, as this one did not appear in the build directory at all). compile fileTree(dir: '../', include: '**/*.properties'), which I hoped would just pick up the relevant files, also did not show up in the build directory.
Simply using the root directory as a resource folder caused problems as it included other projects and even the .gradle directory. I haven't yet gotten it to compile this way. Not sure yet if I can exclude enough things to get this to work.
PrasadU's answer is sort-of correct, but it breaks up-to-date checking as it introduces a task where the output overlaps with the one from processResources. It is better to just reconfigure the latter task instead:
processResources {
from(projectDir) {
include("Resources/**")
}
}
If you just need to copy the files:
ext.prjRoot = project.projectDir.toString()
task copyExtResources(type: Copy) {
from prjRoot
include "Resources/**"
into "$buildDir/resources/main"
}
processResources.dependsOn copyExtResources

Gradle + Eclipse : use class from existing project in a new project

I know there are a lot of questions that seem similar. I have also spent a few hours getting to grips with Gradle multiprojects. But I still don't understand what the best course of action is here. Incidentally I am using Groovy as my coding language, but explanations referencing Java would be just as good.
I have developed an Eclipse Gradle project, "ProjectA", which in particular has a class, IndexManager, which is responsible for creating and opening and querying Lucene indices.
Now I am developing a new Eclipse Gradle project, "ProjectB", which would like to use the IndexManager class from ProjectA.
This doesn't really mean that I would like both projects to be part of a multiproject. I don't want to compile the latest version of ProjectA each time I compile ProjectB - instead I would like ProjectB to be dependent on a specific version of ProjectA's IndexManager. With the option of upgrading to a new version at some future point. I.e. much as with the sorts of dependencies you get from Maven or JCenter...
Both projects have the application plugin, so ProjectA produces an executable .jar file whose name incorporates the version. But currently this contains only the .class files, the resource files, and a file called MANIFEST.MF containing the line "Manifest-Version: 1.0". Obviously it doesn't contain any of the dependencies (e.g. Lucene jar files) needed by the .class files.
The application plugin also lets you produce a runnable distribution: this consists of an executable file (2 in fact, one for *nix/Cygwin, one for Windows), but also all the .jar dependencies needed to run it.
Could someone explain how I might accomplish the task of packaging up this class, IndexManager (or alternatively all the classes in ProjectA possibly), and then including it in my dependencies clause of ProjectB's build.gradle... and then using it in a given file (Groovy or Java) of ProjectB?
Or point to some tutorial about the best course of action?
One possible answer to this which I seem to have found, but find a bit unsatisfactory, appears to be to take the class which is to be used by multiple projects, here IndexManager, and put it in a Gradle project which is specifically designed to be a Groovy library. To this end, you can kick it off by creating the project directory and then:
$ gradle init --type groovy-library
... possible to do from the Cygwin prompt, but not from within Eclipse as far as I know. So you then have to import it into Eclipse. build.gradle in this library project then has to include the dependencies needed by IndexManager, in this case:
compile 'org.apache.lucene:lucene-analyzers-common:6.+'
compile 'org.apache.lucene:lucene-queryparser:6.+'
compile 'org.apache.lucene:lucene-highlighter:6.+'
compile 'commons-io:commons-io:2.6'
compile 'org.apache.poi:poi-ooxml:4.0.0'
compile 'ch.qos.logback:logback-classic:1.2.1'
After this, I ran gradle jar to create the .jar which contains this IndexManager class, initially without any fancy stuff in the manifest (e.g. name, version). And I put this .jar file in a dedicated local directory.
Then I created another Gradle project to use this .jar file, the critical dependency here being
compile files('D:/My Documents/software projects/misc/localJars/XGradleLibExp.jar' )
The file to use this class looks like this:
package core
import XGradleLibExp.IndexManager
class Test {
public static void main( args ) {
println "hello xxx"
Printer printer = new Printer()
IndexManager im = new IndexManager( printer )
def result = im.makeIndexFromDbaseTable()
println "call result $result"
}
}
class Printer {
def outPS = new PrintStream(System.out, true, 'UTF-8' )
}
... I had designed IndexManager to use an auxiliary class, which had a property outPS. Groovy duck-typing means you just have to supply anything with such a property and hopefully things work.
The above arrangement didn't run: although you can do build and installdist without errors, the attempt to execute the distributed executable fails because the above 6 compile dependency lines are not present in build.gradle of the "consumer" project. When you put them in this "consumer" Gradle project's build.gradle, it works.
No doubt you can add the version to the generated .jar file, and thus keep older versions for use with "consumer" projects. What I don't understand is how you might harness the mechanism which makes the downloading and use of the dependencies needed by the .jar as automatic as we are used to for things obtained from "real repositories".
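For the first part, a sketch of stamping a version into the generated jar's file name via the standard jar task (the baseName and version values are illustrative, not taken from the project):
jar {
    baseName = 'XGradleLibExp'   // illustrative; matches the jar name used above
    version  = '1.0.0'           // illustrative version, ends up in the file name
}
// produces build/libs/XGradleLibExp-1.0.0.jar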
PS in the course of my struggles today I seem to have found that Gradle's "maven-publish" plugin is not compatible with Gradle 5.+ (which I'm using). This may or may not be relevant: some people have talked of using a "local Maven repository". I have no idea whether this is the answer to my problem... Await input from an über-Gradle-geek... :)
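For the second part, a minimal sketch of the "local Maven repository" idea, assuming the maven-publish plugin is usable with your Gradle version (the group and version coordinates are made up for illustration):
// ProjectA/build.gradle
apply plugin: 'groovy'
apply plugin: 'maven-publish'
group = 'com.example'          // hypothetical coordinates
version = '1.0.0'
publishing {
    publications {
        indexManagerLib(MavenPublication) {
            from components.java   // publishes the jar plus a POM listing its dependencies
        }
    }
}
// run: gradle publishToMavenLocal

// ProjectB/build.gradle
repositories {
    mavenLocal()
}
dependencies {
    compile 'com.example:ProjectA:1.0.0'   // transitive dependencies come from the published POM
}
Publishing this way is what makes the dependency behave like one from a "real repository": the POM carries the six compile dependencies, so the consumer project no longer has to repeat them.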
You should be able to update the Eclipse model to reflect this project-to-project dependency. It looks something like this (in ProjectB's build.gradle):
apply plugin: 'eclipse'
eclipse {
classpath.file.whenMerged {
entries << new org.gradle.plugins.ide.eclipse.model.ProjectDependency('/ProjectA')
}
project.file.whenMerged {
// add a project reference, which should show up in /ProjectB/.project's <projects> element
}
}
These changes apply to the running (in-memory) data model, so they may not actually alter the .classpath and .project files. More info can be found here: https://docs.gradle.org/current/dsl/org.gradle.plugins.ide.eclipse.model.EclipseModel.html
This issue is discussed here: http://gradle.1045684.n5.nabble.com/Gradle-s-Eclipse-DSL-and-resolving-dependencies-to-workspace-projects-td4856525.html and a bug was opened but never resolved here: https://issues.gradle.org/browse/GRADLE-1014

Annotation processor in Gradle outputs source files to build/classes making javadoc fail. How to fix it?

I have an annotation processor that is automatically picked up by the Java compiler at build time (using SPI). During a Gradle build, the generated Java sources of this annotation processor are put in build/classes, because Gradle tells the annotation processor that this is the place to output generated source files.
When the standard javadoc task is run, it tries to create Javadoc for all files in build/classes, including *.java. This fails because javadoc only expects *.class files, which makes the whole build fail.
So my question is:
Is this a Gradle bug/feature?
How do I fix it/make it work?
It seems the problem is that the generated source files are not picked up, making the javadoc fail because it had nothing to process.
I'm posting the solution here in case somebody is experiencing the same problem:
The problem with compile-time source generation in Gradle is that the outputted sources are not automatically picked up by javadoc. If all your sources are auto-generated, the build will fail with an error saying that no sources could be processed; otherwise the build will succeed, but you will have no Javadoc for your generated Java sources.
The root problem here is Gradle's poor integration with sources that are both generated and compiled during the same compile step. To remedy this, I changed my build files as follows.
project layout:
rootproject
rootproject/annotationProcessor
rootproject/userOfAnnotationProcessor
Build file of userOfAnnotationProcessor:
def generatedSources = "$buildDir/generated-src"
def generatedOutputDir = file("$generatedSources")
compileJava {
doFirst {
generatedOutputDir.exists() || generatedOutputDir.mkdirs()
options.compilerArgs = [
'-s', "${generatedSources}"
]
}
}
sourceSets {
main {
java {
srcDirs += generatedOutputDir
}
}
}
javadoc {
source = sourceSets.main.resources
}
compileJava.dependsOn clean
The trick here is not to add your generated sources to a custom source set, or else we'll run into trouble when trying to build aggregated Javadoc in our root project. However, this solution has the nasty side effect that our generated sources are added twice, for some reason, when building again after a first clean+build. The solution is to always do a clean+build.
Now when doing an aggregate javadoc build, we'd like our generated source javadoc to be part of it as well.
This is what our root project build file looks like:
def exportedProjects = [
":annotationProcessor",
":userOfAnnotationProcessor",
]
task alljavadoc(type: Javadoc) {
source exportedProjects.collect { project(it).sourceSets.main.allJava }
classpath = files(exportedProjects.collect { project(it).sourceSets.main.compileClasspath })
destinationDir = file("${buildDir}/docs/javadoc")
}
alljavadoc.dependsOn(":userOfAnnotationProcessor:compileJava")
If we had used a custom source set previously, Gradle would now start complaining about source set properties not being found. Why? I don't know... A last important thing to notice is that our alljavadoc depends on the compilation step of userOfAnnotationProcessor; this is needed to make sure our generated source files are there when the aggregated Javadoc is built.
I hope I've helped somebody with this explanation!
I am not quite sure whether it is a bug or not, but as a workaround just filter the sources of javadoc.
Depending on what your build script looks like, it should look something like this:
task myJavadocs(type: Javadoc) {
classpath = sourceSets.main.output.filter { it -> !it.name.endsWith('.java') }
}

Dynamically add JAR to Gradle dependencies

Currently, my build.gradle has a dependency on an external library built with Ant. To accomplish building the library, I followed the advice here and created a task which builds the external library, and copies it to the libs/ folder.
The task is called as part of a dependency:
build.gradle
dependencies {
compile fileTree('libs') {
include '*.jar'
builtBy 'myTask'
}
}
task myTask (type: GradleBuild) { GradleBuild antBuild ->
antBuild.buildFile('external-stub.gradle')
antBuild.tasks = ['clean', 'ivy.check', 'ivy.download', 'ivy.task', 'ivy',
'init', 'mergeCode', 'compile', 'jar', 'copyJarsToProject']
}
However, when the compile actually runs, the library I just built and copied in is not included in the dependencies, as evidenced by a whole lot of compilation errors.
Am I including the library the wrong way?
The full build.gradle and associated files are over at Github, and I've linked to the specific commit I'm having issues with: Source Repository
Alright, took me a while to get a build I was happy with. But, here's what was changed.
The JAR is still built in the same style, but the build was moved to the external project (so that the main build project isn't reaching across to it). I'll give an in-depth explanation below, but the commits are here and here. These are in order.
Basically, we export the jar as an artifact that other projects can depend on, rather than copying over the Jar ourselves. This way, the Ant build runs and other projects can see the Jar we just created. This is the end of the first commit. In the second commit, the task outputs are marked as needing to be regenerated only if the Jar does not exist. This was due to the fact that whenever I tried to build the app, it would take minutes to regen the Jar, and then have to repackage everything else as well. The code is below:
build.gradle External Project
configurations {
buildJSword
}
task doBuildJSword (type: GradleBuild) {
buildFile = 'jsword-stub.gradle'
tasks = ['clean', 'ivy.check', 'ivy.download', 'ivy.task', 'ivy',
'init', 'mergeCode', 'compile', 'jar'] //, 'copyJarsToMinimalBible']
ext.outputJar = file('distribution/jsword.jar')
outputs.upToDateWhen {
ext.outputJar.exists()
}
}
artifacts {
buildJSword(doBuildJSword.ext.outputJar) {
builtBy doBuildJSword
}
}
Then, the main project just has to add this project as a compile-time dependency:
build.gradle Main Project
compile project(path: ':jsword-minimalbible', configuration: 'buildJSword')
Hope this is helpful for anyone with a similar issue, let me know if you have questions!
Note: The build currently does not clean itself properly, so if you change any code in the external project, you need to delete the external Jar for everything to regenerate itself correctly.
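One possible workaround, sketched under the assumption that the external project applies a plugin providing the standard clean task and that the jar sits at the distribution/jsword.jar path used above, is to wire a Delete task into clean:
task cleanExternalJar(type: Delete) {
    delete file('distribution/jsword.jar')   // same path as ext.outputJar above
}
clean.dependsOn cleanExternalJar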

Copy file from a jar in a custom gradle plugin

I am writing a custom Gradle plugin in which I would like to copy a specific file from a jar on the classpath into the buildDir. I played around in a sandbox project and got this solution working:
task copyFile(type: Copy) {
from zipTree(project.configurations.compile.filter{it.name.startsWith('spring-webmvc')}.singleFile)
include "overview.html"
into project.buildDir
}
but if I copy it into my plugin:
project.task(type: Copy, "copyFile") {
from zipTree(project.configurations.compile.filter{it.name.startsWith('spring-webmvc')}.singleFile)
include "overview.html"
into project.buildDir
}
I got the error:
* What went wrong:
A problem occurred evaluating root project 'gradle-springdoc-plugin-test'.
> Could not find method zipTree() for arguments [/Users/blackhacker/.gradle/caches/artifacts-26/filestore/org.springframework/spring-webmvc/4.0.0.RELEASE/jar/a82202c4d09d684a8d52ade479c0e508d904700b/spring-webmvc-4.0.0.RELEASE.jar] on task ':copyFile'.
The result of
println project.configurations.compile.filter{it.name.startsWith('spring-webmvc')}.singleFile.class
is
class java.io.File
What am I doing wrong?
Unlike a build script, a plugin does not have an implicit project context (unless you give it one). Hence you'll have to use project.task rather than task, project.zipTree rather than zipTree, project.file rather than file, etc.
PS: In your case, it's important to use project.zipTree { ... } (note the curly braces) to defer searching for the file until the Zip contents are actually requested. Otherwise you risk slowing down each build invocation (even ones that never execute copyFile) and, if the file is being produced by the same build, even build failures (because the configuration is resolved before the file has been added).
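Putting both points together, the task from the question might be registered inside the plugin roughly like this (a sketch, reusing the configuration and file names from above):
project.task(type: Copy, "copyFile") {
    // defer resolving the configuration until the zip contents are actually requested
    from project.zipTree {
        project.configurations.compile.filter { it.name.startsWith('spring-webmvc') }.singleFile
    }
    include "overview.html"
    into project.buildDir
}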
