I have a Gradle build script which has to call a Scala (or Java) method in a task.
Currently, I have src/.../utils/lib.java.
This file has a simple class with a static method that I would like to call from the build.gradle script.
But I have no idea how to import this file and use the method.
This really depends on how your Gradle script is set up and what you are trying to accomplish. Without seeing it, it is hard to give a concrete, all-encompassing response. Here are the two most common approaches I know of for executing code from a Gradle script.
1) To directly answer your question: to execute a static Java method, it needs to be compiled first. If the rest of your build depends on it being executed, then you would need to structure this as a multi-project build.
https://docs.gradle.org/current/userguide/multi_project_builds.html
Assuming you have
apply plugin: 'java'
You would then be able to create a jar containing the class with your static method, and create a task of type Exec or JavaExec that executes it.
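For example, a minimal sketch of what that could look like in the root project's build.gradle (the subproject name ':utils' and the class name com.example.utils.Lib are assumptions, not taken from your build):
// Sketch only: assumes a subproject ':utils' that applies the 'java' plugin and
// contains a class com.example.utils.Lib with a static main(String[]) method
task runLibMethod(type: JavaExec, dependsOn: ':utils:classes') {
    // wrap in files { } so the subproject's source sets are resolved lazily,
    // after ':utils' has been configured
    classpath = files { project(':utils').sourceSets.main.runtimeClasspath }
    main = 'com.example.utils.Lib'
}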
2) The easier and less complex approach:
Since Gradle lets you execute Groovy code directly, I would, if possible, move the logic from the static method into the Gradle file itself. You can reference objects and methods in the org.gradle.api package to customize a lot of build actions.
https://docs.gradle.org/current/javadoc/
Here is an example from my build.gradle, used to copy resource files into the build directory after the compile step, before they are packaged into a jar.
compileJava {
    doLast {
        def f = file("src/main/java/resources/db/migration")
        println "Copying SQL from ${f} to ${buildDir}/db/migration"
        copy {
            from file(f)
            into file("$buildDir/db/migration")
            include('*.sql')
        }
    }
}
This may not be the easy solution you were looking for, but I hope it gives you some good guidance.
Hi, I have a tar task that I put together after looking at numerous approaches and some SO posts.
task buildDist(type: Tar, dependsOn: jar) {
    print 'here'
    archiveName = 'xyz-' + version
    destinationDir = file('build/dist')
    extension = 'tar.gz'
    compression = Compression.GZIP
    from 'build/libs'
    include 'xyz.jar'
}
buildDist.mustRunAfter jar
I have the java plugin applied, and the jar task makes the xyz.jar file available under build/libs. The build/dist directory does not exist yet, but I tried new File("build/dist") as well. That did not work either; I even pointed it at the build directory, which does exist, and that didn't work. I run the entire script with ./gradlew clean build. The print in the above code does print.
I am making a few assumptions here as you didn't post the output from running Gradle.
The build task is just a normal Gradle task that doesn't do anything by itself. Instead, it depends on other tasks. If you create your own custom task and would like it to be included when executing build, you have to add a dependency on it from build. If this is not the problem and you have actually done this, please give some more details as to what makes it "not work" when you run build.
If you want to test your task in isolation (e.g. to make sure it works correctly without running unit tests or whatever else is unrelated), just run gradlew cleanBuildDist buildDist.
A note about the 'print' statement: it executes during the configuration phase, so you can't use it to test whether the task actually executes. In fact, it will most likely print no matter what task you execute. If you want to print something at execution time, you have to put it in a doLast block.
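For example, with your task it would look roughly like this (a sketch just to illustrate the two phases; the messages are only illustrative):
buildDist {
    println 'configuring buildDist'        // configuration phase: printed on almost any invocation
    doLast {
        println 'buildDist just executed'  // execution phase: printed only when the task actually runs
    }
}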
There are a few other things you should change as well:
It is not good practice to use relative paths. Instead, use the buildDir property to get an absolute reference to the build directory.
Don't use deprecated properties like archiveName and destinationDir. Use archiveFileName and destinationDirectory instead.
The extension property is also deprecated, but it is ignored anyway if you set the full name of the archive yourself, so just remove it. This also means the full name you are currently setting is missing the extension.
The from and include parts are a little fragile. Just use from jar.archivePath if you only want to gzip your application jar.
Example:
task buildDist(type: Tar, dependsOn: jar) {
    archiveFileName = "${jar.baseName}-${version}.tar.gz"
    destinationDirectory = file("$buildDir/dist")
    compression = Compression.GZIP
    from jar.archivePath
}
build.dependsOn buildDist
Lastly, if your intention is to create a distribution of your application that is runnable on its own (with all required dependencies), you should consider using the distribution plugin and perhaps also the application plugin.
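For example, the application-plugin route is roughly this (a sketch; the main class name is a placeholder, not from your build):
apply plugin: 'application'

mainClassName = 'com.example.Main'   // placeholder: your actual main class

// The plugin adds 'distTar' and 'distZip' tasks that bundle the jar, all runtime
// dependencies and start scripts under build/distributions
distTar {
    compression = Compression.GZIP
}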
My product uses a third-party dependency called matlab-control, version 4.1.0. This jar file allows Java to send commands to MATLAB.
The way we use it is by calling a Java command from MATLAB (triggered by the user); this performs a bunch of computation in the Java plugin, then Java sends a notification to MATLAB, which then starts executing M-code with the data received from Java.
So far, the jar file has been stored in our repository and loaded directly in MATLAB.
We want to move to matlab-control v5.0.0, which is on Maven. So we thought this was the occasion to get rid of the checked-in jar file and include it as a Maven dependency in our Gradle build.
However, the API does not seem to be visible to MATLAB, which throws the exception 'can not find class org.n52.matlabcontrol.MatlabProxy'.
If I test the connection directly from Java (by creating a Java test), Java will launch MATLAB by itself and request the command to be executed (for instance, a hello-world command). That works fine, and I can then instantiate a MatlabProxy. But if I launch MATLAB and try to access the MatlabProxy directly from it, it cannot find it.
So it seems that the matlab-control API is not exposed by my Gradle build. So far I have tried the following:
apply plugin: 'java-library'
dependencies {
    implementation('org.n52.matlab:matlab-control:5.0.0')
    api('org.n52.matlab:matlab-control:5.0.0')
}
But it is not working, so what am I missing?
Thanks to JB Nizet, who pointed me in the right direction, I found a solution.
I just had to create a configuration first, then assign dependencies to this configuration in order to copy them:
configurations {
    deployerJars
}
dependencies {
    deployerJars group: 'org.n52.matlab', name: 'matlab-control', version: '5.0.0'
}
task copyToLib(type: Copy) {
    into "$buildDir/libs"
    from configurations.deployerJars
}
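And to have the jars copied on every build (an assumption about how you want it wired; adjust as needed):
// Hook the copy into the normal build so build/libs always contains the
// matlab-control jar that MATLAB loads
build.dependsOn copyToLib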
I know there are a lot of questions that seem similar. I have also spent a few hours getting to grips with Gradle multiprojects. But I still don't understand what the best course of action is here. Incidentally I am using Groovy as my coding language, but explanations referencing Java would be just as good.
I have developed an Eclipse Gradle project, "ProjectA", which in particular has a class, IndexManager, which is responsible for creating and opening and querying Lucene indices.
Now I am developing a new Eclipse Gradle project, "ProjectB", which would like to use the IndexManager class from ProjectA.
This doesn't really mean that I would like both projects to be part of a multiproject. I don't want to compile the latest version of ProjectA each time I compile ProjectB - instead I would like ProjectB to be dependent on a specific version of ProjectA's IndexManager. With the option of upgrading to a new version at some future point. I.e. much as with the sorts of dependencies you get from Maven or JCenter...
Both projects have the application plugin, so ProjectA produces an executable .jar file whose name incorporates the version. But currently this contains only the .class files, the resource files, and a file called MANIFEST.MF containing the line "Manifest-Version: 1.0". Obviously it doesn't contain any of the dependencies (e.g. Lucene jar files) needed by the .class files.
The application plugin also lets you produce a runnable distribution: this consists of an executable file (2 in fact, one for *nix/Cygwin, one for Windows), but also all the .jar dependencies needed to run it.
Could someone explain how I might accomplish the task of packaging up this class, IndexManager (or alternatively all the classes in ProjectA), then including it in the dependencies clause of ProjectB's build.gradle... and then using it in a given file (Groovy or Java) of ProjectB?
Or point to some tutorial about the best course of action?
One possible answer, which I seem to have found but find a bit unsatisfactory, is to take the class that is to be used by multiple projects (here IndexManager) and put it in a Gradle project specifically designed to be a Groovy library. To this end, you can kick it off by creating the project directory and then:
$ gradle init --type groovy-library
... possible to do from the Cygwin prompt, but not from within Eclipse as far as I know. So you then have to import it into Eclipse. build.gradle in this library project then has to include the dependencies needed by IndexManager, in this case:
compile 'org.apache.lucene:lucene-analyzers-common:6.+'
compile 'org.apache.lucene:lucene-queryparser:6.+'
compile 'org.apache.lucene:lucene-highlighter:6.+'
compile 'commons-io:commons-io:2.6'
compile 'org.apache.poi:poi-ooxml:4.0.0'
compile 'ch.qos.logback:logback-classic:1.2.1'
After this, I ran gradle jar to create the .jar which contains this IndexManager class, initially without any fancy stuff in the manifest (e.g. name, version). And I put this .jar file in a dedicated local directory.
Then I created another Gradle project to use this .jar file, the critical dependency here being
compile files('D:/My Documents/software projects/misc/localJars/XGradleLibExp.jar' )
The file that uses this class looks like this:
package core

import XGradleLibExp.IndexManager

class Test {
    public static void main( args ) {
        println "hello xxx"
        Printer printer = new Printer()
        IndexManager im = new IndexManager( printer )
        def result = im.makeIndexFromDbaseTable()
        println "call result $result"
    }
}

class Printer {
    def outPS = new PrintStream(System.out, true, 'UTF-8' )
}
... I had designed IndexManager to use an auxiliary class, which had a property outPS. Groovy duck-typing means you just have to supply anything with such a property and hopefully things work.
The above arrangement didn't run: although you can do build and installDist without errors, the attempt to execute the distributed executable fails because the six compile dependency lines above are not present in the build.gradle of the "consumer" project. When you put them in this "consumer" project's build.gradle, it works.
No doubt you can add the version to the generated .jar file, and thus keep older versions for use with "consumer" projects. What I don't understand is how you might harness the mechanism which makes the downloading and use of the dependencies needed by the .jar as automatic as we are used to for things obtained from "real repositories".
PS: in the course of my struggles today I seem to have found that Gradle's "maven-publish" plugin is not compatible with Gradle 5.+ (which I'm using). This may or may not be relevant: some people have talked of using a "local Maven repository". I have no idea whether this is the answer to my problem... Awaiting input from an über-Gradle-geek... :)
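For reference, my understanding is that the "local Maven repository" route people mention boils down to roughly this in the library project's build.gradle (a sketch only, assuming maven-publish can actually be applied on my Gradle version; the coordinates are placeholders):
apply plugin: 'maven-publish'

group = 'com.example'    // placeholder coordinates
version = '1.0.0'

publishing {
    publications {
        mavenJava(MavenPublication) {
            from components.java   // publishes the jar together with its dependency metadata
        }
    }
}
// 'gradle publishToMavenLocal' then installs it into ~/.m2; a consumer project can use
// repositories { mavenLocal() } plus a normal dependency on 'com.example:<artifact>:1.0.0'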
You should be able to update the Eclipse model to reflect this project-to-project dependency. It looks something like this (in ProjectB's build.gradle):
apply plugin: 'eclipse'

eclipse {
    classpath.file.whenMerged {
        entries << new org.gradle.plugins.ide.eclipse.model.ProjectDependency('/ProjectA')
    }
    project.file.whenMerged {
        // add a project reference, which should show up in /ProjectB/.project's <projects> element
    }
}
These changes may be to the running data model, so they may not actually alter the .classpath and .project files. More info can be found here: https://docs.gradle.org/current/dsl/org.gradle.plugins.ide.eclipse.model.EclipseModel.html
This issue is discussed here: http://gradle.1045684.n5.nabble.com/Gradle-s-Eclipse-DSL-and-resolving-dependencies-to-workspace-projects-td4856525.html and a bug was opened but never resolved here: https://issues.gradle.org/browse/GRADLE-1014
I can't tell if this is a bug with Gradle 1.0m7, or if we are just doing this wrong.
We have some classes that get compiled as part of a project that we want to jar individually into their own artifact. These are, for example, standalone domain model objects that we want to share with another project.
I'd prefer not to go the multi-project build route, so how do we tell Gradle to create another jar for these?
Currently we are doing this:
task modelJar(type: Jar) {
    classifier = 'model'
    from fileTree(dir: sourceSets.main.classesDir).matching { include 'com/foo/bar/model/**' }
}

artifacts {
    archives modelJar
}
The issue here is that the modelJar task runs before the classes are compiled. At first we didn't realise this and thought it was working. It turns out the artifact was picking up classes from the previous run, not the current run. Doing clean before the build results in a jar with no classes in it, which reveals the problem.
I was looking at custom configuration, but it seems pretty complex and I didn't want to overly complicate the build file.
Appreciate any advice.
Thanks.
The most convenient way to do this is:
task modelJar(type: Jar) {
    classifier = 'model'
    from sourceSets.main.output
    include 'com/foo/bar/model/**'
}
Some background:
sourceSets.main.output is a buildable file collection. This means that if a task works with this file collection, Gradle knows the collection must be created before the task can use it. In this particular case, sourceSets.main.output is wired to the classes task of the java plugin. Therefore your modelJar task does not need to depend on the classes task explicitly.
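With that in place, the artifacts block from your question works as-is (just mind the task name casing):
artifacts {
    archives modelJar   // publish the classifier 'model' jar alongside the main jar
}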
How about making the modelJar task depend on the built-in classes task? This should make sure compilation is done before the modelJar task runs.
task modelJar(dependsOn: classes, type: Jar){
...
Has anyone successfully created a NetBeans project that combines Clojure and Java source?
I have projects where the driver program (startup, gui, user prefs, etc.) are in Java, but the logic is in Clojure. At the moment, I compile the Clojure code to a jar in one project and import it as a library in a separate Java project. It would be convenient if all the source could be combined in one single NetBeans project.
Has anyone come up with a method to do this?
One possible solution is to modify your NetBeans Java project's Ant script (build.xml in your root directory) to have it compile the Clojure source as part of the build.
By default, NetBeans creates several placeholder Ant targets in the root project directory's build.xml for you to override to automate tasks beyond the standard build process (such as compiling other languages to use their libraries in your current project). By overriding one of the placeholder targets in that build script, such as "-pre-compile", you could write a simple target that calls the Clojure compilation process using the Ant "exec" task and places all the resulting class files (or JAR) in the appropriate build directory.
If you do this frequently, you could define an Ant extension (via a macro or Ant plugin) so you don't have to modify the build.xml each time.
I use the RT method. I put my Clojure code into a script file that I include and process at startup:
try {
    RT.loadResourceScript("com/mydomain/app/clojure_scripts.clj"); // Initialize Clojure script processor with our script
} catch (Exception e) {
    Util.logException(e, "Unable to run Clojure initialization script.");
}
Then, since my main logic is in Java and I'm only calling out to Clojure for calculations, I use some glue code to map the calls for me:
import clojure.lang.RT;
import clojure.lang.Var;

/*
 * Class to wrap Clojure scripts with Java-friendly methods.
 */
public class Clojure {
    private static final String ns = "com.mydomain.app";

    public static double calculate(final double size, final double otherVar) {
        // Look up the 'calculate' var in the Clojure namespace and invoke it
        Var report = RT.var(ns, "calculate");
        return (Double) report.invoke(size, otherVar);
    }
}