I have been trying to execute a jar file, ssr.runner.jar, using a task of type JavaExec, but Gradle is giving me an error that no main class is specified. I am new to Gradle, so if anyone could elaborate in detail, that would be great. Here is the task that I wrote:
task executeSqlScriptRunnerBeforeTenantCreation(type:JavaExec) {
    description "Execute ssr.runner to install certificates into the device"
    doLast {
        if(scriptRunnerVariables.get('SSR_INTERSECT_MOCK') == 'true') {
            println "Executing SQL Script Runner..."
            println "Adding certificates for mocked intersect service"
            if(scriptRunnerVariables.get('SSR_DB_TYPE') == 'SQL_SERVER') {
                classpath = file("{$workingDir}\\ssr.runner.jar")
                main = '-jar'
                args '-dbtype', "${scriptRunnerVariables.get('SSR_DB_TYPE')}", '-dbhost', "${scriptRunnerVariables.get('SSR_DB_HOST')}", '-dbinstance',
                        "${scriptRunnerVariables.get('SSR_DB_INSTANCE')}", '-dbname', "${scriptRunnerVariables.get('SSR_DB_NAME')}", '-dbuser', "${scriptRunnerVariables.get('SSR_DB_USER')}",
                        '-dbpass', "${scriptRunnerVariables.get('SSR_DB_PASS')}", '-sqlscriptpath', "${scriptRunnerVariables.get('SSR_INTERSECT_MOCK_DB_SCRIPT')}"
            }
        }
    }
}
First of all, you're mixing two phases: configuration and execution. There's no need to add doLast for predefined tasks. Then this is (probably) how the script should look:
task executeSqlScriptRunnerBeforeTenantCreation(type:JavaExec) {
    description "Execute ssr.runner to install certificates into the device"
    if(scriptRunnerVariables.get('SSR_INTERSECT_MOCK') == 'true') {
        println "Configuring SQL Script Runner..."
        println "Adding certificates for mocked intersect service"
        if(scriptRunnerVariables.get('SSR_DB_TYPE') == 'SQL_SERVER') {
            main = '<FULLY QUALIFIED NAME OF CLASS YOU NEED TO RUN>'
            classpath = file("${workingDir}\\ssr.runner.jar")
            args '-dbtype',
                    "${scriptRunnerVariables.get('SSR_DB_TYPE')}",
                    '-dbhost',
                    "${scriptRunnerVariables.get('SSR_DB_HOST')}",
                    '-dbinstance',
                    "${scriptRunnerVariables.get('SSR_DB_INSTANCE')}",
                    '-dbname',
                    "${scriptRunnerVariables.get('SSR_DB_NAME')}",
                    '-dbuser',
                    "${scriptRunnerVariables.get('SSR_DB_USER')}",
                    '-dbpass',
                    "${scriptRunnerVariables.get('SSR_DB_PASS')}",
                    '-sqlscriptpath',
                    "${scriptRunnerVariables.get('SSR_INTERSECT_MOCK_DB_SCRIPT')}"
        }
    }
}
main was misconfigured: it should be passed the fully qualified (package included) name of the Java class you need to run. Of course, what is configured via classpath, main, args and so on in the configuration phase will only be run in the execution phase if both if conditions evaluate to true.
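If you can't or don't want to hard-code a main class name, a possible alternative (a sketch only, not part of the original answer) is to let the JVM read the Main-Class entry from the jar's manifest by running java -jar through an Exec task, assuming ssr.runner.jar is an executable jar:

task executeSsrRunner(type: Exec) {
    // Sketch: relies on ssr.runner.jar having a Main-Class attribute in its manifest;
    // paths and scriptRunnerVariables are taken from the question
    commandLine 'java', '-jar', "${workingDir}\\ssr.runner.jar",
            '-dbtype', scriptRunnerVariables.get('SSR_DB_TYPE'),
            '-dbhost', scriptRunnerVariables.get('SSR_DB_HOST')
    // ...remaining -db* and -sqlscriptpath arguments passed the same way
}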
My project root directory is:
D:/Project/Node_Project
I am using a Gradle plugin to install Node.js temporarily in my project root directory so that some Node.js commands can run while the project builds. The plugin is configured as below:
plugins {
    id "com.github.node-gradle.node" version "2.2.4"
}

node {
    download = true
    version = "10.10.0"
    distBaseUrl = 'https://nodejs.org/dist'
    workDir = file("${project.buildDir}/nodejs")
}
So, nodejs is getting installed inside the project in the location:
D:/Project/Node_Project/build/nodejs/node-v10.10.0-win-x64
Now, I am using the .execute(String[] envp, File dir) method (the first argument sets environment variables, the second is the working directory) to run a Windows command that depends on Node.js. Code below:
cmd = "node connect.js"
def process = cmd.execute(["PATH=${project.projectDir}/build/nodejs/node-v10.10.0-win-x64"],null)
In the above .execute method, is there a way to auto-populate the "build/nodejs/node-v10.10.0-win-x64" part of the string instead of hardcoding it into the method?
Something like:
def process = cmd.execute(["PATH=${project.projectDir}/.*"],null)
Syntax of .execute method:
https://docs.groovy-lang.org/latest/html/groovy-jdk/java/lang/String.html#execute(java.lang.String[],%20java.io.File)
All of this code is inside the build.gradle file. Please help!
I asked why you don't just write a task of type NodeTask, but I understand that you'd like to run it in the background, which you can't do with that.
You could list the content of a directory and use that as part of the command. But you could also just grab it from the extension provided by the plugin.
This is not documented and it might break in future releases of the plugin, but you can do something like this (Groovy DSL):
task connectJS {
    dependsOn nodeSetup
    doFirst {
        def connectProcess = "$node.variant.nodeExec $projectDir/src/js/connect.js".execute()
        // Blocking readers (if async, pipe to a log file instead)
        connectProcess.in.eachLine { logger.info(it) }
        connectProcess.err.eachLine { logger.error(it) }
    }
}
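If you prefer not to rely on the plugin's internals, the directory-listing approach mentioned above can work as well. A rough sketch, assuming the plugin's default extraction layout under build/nodejs (the exact directory name is not guaranteed):

// Sketch: locate the extracted Node.js distribution instead of hard-coding
// "node-v10.10.0-win-x64"; assumes nodeSetup has already run and extracted it
def nodeDistDir = file("${project.buildDir}/nodejs")
        .listFiles()
        ?.find { it.isDirectory() && it.name.startsWith('node-v') }
def process = "node connect.js".execute(["PATH=${nodeDistDir}"], null)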
I have a Gradle build that has some dependencies of the form
compile files('path/to/local/lib.jar')
(the build is being migrated - eventually these will be replaced)
The build failed because one of these paths was incorrectly specified. But it failed due to a compile error - it looked like Gradle silently ignored the missing dependency.
Is there a simple option or switch that will force Gradle to fail the build if any dependency (particularly local file dependencies) cannot be resolved (eg., file missing)?
Edit: to clarify further:
If a dependency cannot be found in the configured repositories, Gradle will fail the build when attempting to resolve them, as expected.
BUT - if a dependency is defined as "compile files ....", and the file specified does not exist at build time, Gradle will IGNORE that error, and attempt compilation anyway. That seems spectacularly wrong-headed and inconsistent default behaviour.
My question is - is there a Gradle option or switch or environment variable or system property that I can set to force Gradle to verify that file dependencies exist? (E.g., behave in a sane and rational way?)
This is a bit of an old thread, but given that none of the currently proposed solutions actually works, and the solution appears to be trivial (collating two of them), I am leaving it here for future reference.
The point here is that we simply want to ensure that the files do exist, so we can just use the exists() method of the File class:
task ensureDepsExist() {
    doLast {
        configurations.implementation.canBeResolved(true)
        Set<File> impFiles = configurations.implementation.resolve()
        impFiles.forEach { f ->
            if (!f.exists()) {
                ant.fail "${f} could not be found"
            }
        }
    }
}
compileJava.dependsOn ensureDepsExist
The canBeResolved() call is required, or Gradle will complain that the configuration's dependencies cannot be resolved.
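If you would rather not mutate the implementation configuration, one option (a sketch, assuming the standard java plugin configurations) is to resolve compileClasspath or runtimeClasspath instead, since those extend implementation and are resolvable by default:

task ensureDepsExistAlt {
    doLast {
        // compileClasspath is resolvable by default and includes the implementation dependencies
        configurations.compileClasspath.resolve().each { f ->
            if (!f.exists()) {
                ant.fail "${f} could not be found"
            }
        }
    }
}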
Here's how you can check transitive dependencies using Gradle 7.3 (example: Fail if the project depends on log4j directly or transitively).
Kotlin DSL
configurations {
    all {
        resolutionStrategy {
            eachDependency {
                if (requested.name == "log4j") {
                    throw RuntimeException("Project depends on log4j")
                }
            }
        }
    }
}
Groovy DSL
configurations.all {
    resolutionStrategy.eachDependency { DependencyResolveDetails details ->
        if (details.requested.name == 'log4j') {
            throw new RuntimeException("Project depends on log4j")
        }
    }
}
You could do something as shown below. It is not a built-in Gradle function but does not require code to check each dependency specifically (it checks all in the compile configuration):
apply plugin: 'java'

dependencies {
    compile files('lib/abc.jar')
    compile files('lib/def.jar')
}

task checkDependencies() {
    doLast {
        configurations.compile.each { file ->
            assert file.exists()
        }
    }
}
compileJava.dependsOn checkDependencies
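If your file dependencies are spread over several configurations, a rough generalization of the same idea (sketch only; resolving every configuration may not be desirable in larger builds) could be:

task checkAllFileDeps {
    doLast {
        // Iterate every resolvable configuration and assert each resolved file exists
        configurations.findAll { it.canBeResolved }.each { cfg ->
            cfg.each { f ->
                assert f.exists() : "$f (from ${cfg.name}) is missing"
            }
        }
    }
}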
To fail the build you can:
ant.fail('message why it failed')
Then you can craft a condition and fail the build with a nice message ;)
I would suggest creating a task that brings the file into the project first, with a condition that checks whether the file is available; if not, throw a Gradle exception and fail the build with a message, and execute that task first in the execution phase.
I have no chance to test it now, but it could be something like this (correct me if any syntax is wrong) - you should get the idea.
def yourDep = file($/\path\to\your\dependency/$)

task bringDeps {
    doLast {
        if (yourDep.exists()) {
            copy {
                from yourDep
                into "$projectDir/depsOrSmthg"
            }
        } else {
            ant.fail('message why it failed')
        }
    }
}
// A variation of the ensureDepsExist task above that also reports any declared
// dependency that did not resolve to a file
task ensureDependenciesExist() {
    doLast {
        configurations.implementation.canBeResolved(true)
        DependencySet deps = configurations.implementation.getDependencies()
        Set<File> impFiles = configurations.implementation.resolve()
        deps.each { d ->
            // A dependency counts as resolved if some resolved file matches its name and version
            boolean depWasResolved = impFiles.any { impFile -> impFile.name.find(".*${d.name}.*${d.version}") }
            if (!depWasResolved) {
                println "${d} was not resolved"
                assert depWasResolved
            }
        }
    }
}
compileJava.dependsOn ensureDependenciesExist
One of my build tasks pulls information on the current SVN branch. On builds from a tag I want to be more strict and fail the build when e.g., a link checker finds dead links for the online help files. On regular builds from branches or trunk this should not break the build.
I have the following code, where the mentioned Perl script creates a properties file:
task generateSvnInfo(type: Exec) {
    outputs.files "generated/svninfo"
    executable "perl"
    args "..."
}

Properties buildProps = new Properties()

task svninfo() {
    inputs.files generateSvnInfo.outputs.files
    outputs.upToDateWhen { false }
    buildProps.load(new FileInputStream(inputs.files.getSingleFile()))
}
Now my other targets depend on svninfo (and the fact that it populates buildProps).
task checkHelpLinks(type: Exec) {
    dependsOn "svninfo"
    executable "perl"
    args "..."
}
This will always fail if it finds dead help links. As far as I understand it, ignoreExitValue is false by default. To set it to true on non-tag builds, I can add this to the checkHelpLinks task:
ignoreExitValue = true
doLast {
    ignoreExitValue = buildProps.FROM_TAG == "false"
}
This works, but I have four or five of these check tasks and would like to not duplicate that code around. So I tried
tasks.grep(~ /^check.+/).each { task ->
    task.ignoreExitValue = true
    task.doLast {
        task.ignoreExitValue = buildProps.FROM_TAG == "false"
    }
}
This code does not seem to get executed. I thought that may be because I compare the Task object to a String in grep, but using
tasks.grep(it.name =~ /^check.+/).each { task ->
gets me a build script error ("Could not find property 'it' on root project 'foo'").
How can I add my tag check to all check tasks?
Is there a better way to load the properties from a file that is created as part of the build process?
Is there a SVN plugin that would do the work for me?
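For what it's worth, a minimal sketch of the closure form that grep expects (so that 'it' is actually defined) would be:

// Sketch: pass grep a closure, so 'it' refers to each task being filtered
tasks.grep { it.name =~ /^check.+/ }.each { task ->
    task.ignoreExitValue = true
    task.doLast {
        task.ignoreExitValue = buildProps.FROM_TAG == "false"
    }
}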
I have these files under the <project_root> folder:
./build.gradle
./build/libs/vh-1.0-SNAPSHOT.jar
./libs/groovy-all-2.1.7.jar
./src/main/groovy/vh/Main.groovy
In the build.gradle file, I have this task:
task vh( type:Exec ) {
    commandLine 'java -cp libs/groovy-all-2.1.7.jar:build/libs/' +
            project.name + '-' + version + '.jar vh.Main'
}
The Main.groovy file is simple:
package vh

class Main {
    static void main( String[] args ) {
        println 'Hello, World!'
    }
}
After plugging in the string values, the command line is:
java -cp libs/groovy-all-2.1.7.jar:build/libs/vh-1.0-SNAPSHOT.jar vh.Main
If I run the command directly from the shell, I get the correct output. However, if I run gradle vh, it fails. So, how do I make it work? Thank you very much.
Exec.commandLine expects a list of values: one value for the executable, and another value for each argument. To execute Java code, it's better to use the JavaExec task:
task vh(type: JavaExec) {
    main = "vh.Main"
    classpath = files("libs/groovy-all-2.1.7.jar", "build/libs/${project.name}-${version}.jar")
}
Typically, you wouldn't have to hardcode the class path like that. For example, if you are using the groovy plugin, and groovy-all is already declared as a compile dependency (and knowing that the second Jar is created from the main sources), you would rather do:
classpath = sourceSets.main.runtimeClasspath
To find out more about the Exec and JavaExec task types, consult the Gradle Build Language Reference.
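For completeness, if you wanted to stay with an Exec task, a hedged sketch of the "one value per argument" form described above could look like this (task name arbitrary):

task vhExec(type: Exec) {
    // Sketch only: the same command split into separate values for commandLine
    commandLine 'java', '-cp',
            "libs/groovy-all-2.1.7.jar:build/libs/${project.name}-${version}.jar",
            'vh.Main'
}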
I would like to use the Gradle "application" plugin to create startScripts for a second mainClass. Is this possible? Even if the application plugin doesn't have this functionality built in, is it possible to leverage the startScripts task to create a second pair of scripts for a different mainClass?
Add something like this to your root build.gradle:
// Creates scripts for entry points
// Subproject must apply application plugin to be able to call this method.
def createScript(project, mainClass, name) {
    project.tasks.create(name: name, type: CreateStartScripts) {
        outputDir = new File(project.buildDir, 'scripts')
        mainClassName = mainClass
        applicationName = name
        classpath = project.tasks[JavaPlugin.JAR_TASK_NAME].outputs.files + project.configurations.runtimeClasspath
    }
    project.tasks[name].dependsOn(project.jar)
    project.applicationDistribution.with {
        into("bin") {
            from(project.tasks[name])
            fileMode = 0755
        }
    }
}
Then call it as follows either from the root or from subprojects:
// The next two lines disable the tasks for the primary main which by default
// generates a script with a name matching the project name.
// You can leave them enabled but if so you'll need to define mainClassName
// And you'll be creating your application scripts two different ways which
// could lead to confusion
startScripts.enabled = false
run.enabled = false
// Call this for each Main class you want to expose with an app script
createScript(project, 'com.foo.MyDriver', 'driver')
I combined parts of both of these answers to arrive at this relatively simple solution:
task otherStartScripts(type: CreateStartScripts) {
    description "Creates OS specific scripts to call the 'other' entry point"
    classpath = startScripts.classpath
    outputDir = startScripts.outputDir
    mainClassName = 'some.package.app.Other'
    applicationName = 'other'
}

distZip {
    baseName = archivesBaseName
    classifier = 'app'
    //include our extra start script
    //this is a bit weird, I'm open to suggestions on how to do this better
    into("${baseName}-${version}-${classifier}/bin") {
        from otherStartScripts
        fileMode = 0755
    }
}
startScripts is created when the application plugin is applied.
You can create multiple tasks of type CreateStartScripts, and in each task you configure a different mainClassName. For convenience, you can do this in a loop, as sketched below.
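A minimal sketch of such a loop (the main class names here are hypothetical placeholders):

// Sketch: one CreateStartScripts task per (hypothetical) main class
['com.example.Foo', 'com.example.Bar'].each { fqcn ->
    def appName = fqcn.tokenize('.').last().toLowerCase()
    tasks.create(name: "${appName}StartScripts", type: CreateStartScripts) {
        mainClassName = fqcn
        applicationName = appName
        outputDir = startScripts.outputDir
        classpath = startScripts.classpath
    }
}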