I'm generating Javadoc for my Android project with this Gradle task:
android.applicationVariants.all { variant ->
    task("generate${variant.name.capitalize()}Javadoc", type: Javadoc) {
        description "Generates Javadoc for $variant.name."
        source = variant.javaCompile.source
        classpath = files(variant.javaCompile.classpath.files, project.android.getBootClasspath())
        exclude '**/BuildConfig.java'
        exclude '**/R.java'
        options.links("http://docs.oracle.com/javase/7/docs/api/")
        options.linksOffline("http://d.android.com/reference", "${android.sdkDirectory}/docs/reference")
        options {
            failOnError false
        }
        destinationDir = file("${project.projectDir}/javadoc")
    }
}
It excludes R.java, so I don't get R.html in the output directory.
However, I'm getting very annoying "cannot find symbol class R" errors while generating docs for my regular Java classes, on the line import com.mypackagename.R. I use common Android things like R.string.string_res, so I can't remove this import.
Is there a proper way to add the R symbol to the index without including it in the Javadoc, or, at least, a way to simply suppress this error?
You can try adding the following two lines to your task:
classpath += files("build/generated/source/r/${variant.flavorName}/release")
classpath += files("build/generated/source/buildConfig/${variant.flavorName}/release")
But in this case your task should depend on one of the tasks that generate the R class.
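For example, a hedged sketch (the name of the task that generates R differs between Android Gradle Plugin versions, so treat process${variant.name.capitalize()}Resources as an assumption and verify it with ./gradlew tasks):
task("generate${variant.name.capitalize()}Javadoc", type: Javadoc) {
    // Assumption: in many AGP versions the R class is produced during resource
    // processing; adjust the task name if your plugin version uses another task.
    dependsOn "process${variant.name.capitalize()}Resources"
    // ... rest of the Javadoc configuration as above ...
}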
I'm trying to develop a game plugin (Old School RuneScape). I'm trying to add org.json so that it's easier to read/write game state, but I can't seem to figure out how to get org.json packaged with my plugin. It compiles fine, but doesn't run with that package. Any help?
This is what my plugin.gradle.kts looks like:
version = "4.0.0"

project.extra["PluginName"] = "Plugin Name"
project.extra["PluginDescription"] = "Misc QOL fixes I wanted"

repositories {
    mavenCentral()
}

dependencies {
    // https://mavenlibs.com/maven/dependency/org.json/json
    compileOnly(group = "org.json", name = "json", version = "20220320")
}

tasks {
    jar {
        manifest {
            attributes(
                mapOf(
                    "Plugin-Version" to project.version,
                    "Plugin-Id" to nameToId(project.extra["PluginName"] as String),
                    "Plugin-Provider" to project.extra["PluginProvider"],
                    "Plugin-Description" to project.extra["PluginDescription"],
                    "Plugin-License" to project.extra["PluginLicense"]
                )
            )
        }
    }
}
Edit: I tried compileOnly, implementation, and testImplementation, all with the same error: "ClassNotFoundException: org.json.JSONObject".
You are using the wrong configuration: you should use "implementation" instead of "compileOnly", as per this documentation.
The gist of it is that "compileOnly" means these libraries are only needed at compile time, not at runtime, so they are not included in the jar, which is what is used at runtime. The "implementation" configuration means these libraries are needed both at compile time and at runtime. Alternatively, you could also use "runtimeOnly" to indicate the package is only needed at runtime, but I don't know if that would work with your project.
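For instance, assuming the rest of your plugin.gradle.kts stays unchanged, the dependencies block would simply become:
dependencies {
    // implementation keeps the library on both the compile and runtime classpaths;
    // whether it ends up bundled inside the plugin jar still depends on how the
    // plugin host packages dependencies.
    implementation(group = "org.json", name = "json", version = "20220320")
}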
I am new to Gradle and trying to migrate an existing system build from Ant to Gradle.
As part of this I need to run a Java program on every file in a directory. The directory contains XML files, and the Java code parses and converts each .xml file to a .java file after performing some business-specific transformation (these Java files are then compiled to produce the classes and packages in the final jar).
Below is a function I wrote in Gradle:
private runJavaFile(String dirPath) {
    FileTree tree = fileTree(dir: dirPath, include: '**/*.xml')
    tree.each {
        def xmlfile = it.path
        def javaFile = it.path.replaceFirst(".xml", ".java")
        javaexec { // getting error on this line
            classpath configurations.all
            main = 'XmlToJavaParser'
            args = ["$xmlfile", "$javaFile", 'Java']
        }
    }
}
I am calling this function from a Gradle task, passing the directory path that contains the XML files to be parsed.
While running the task, I am getting the error below:
> Resolving configuration 'apiElements' directly is not allowed
Any help would be appreciated.
Let me know if any more information is needed.
In Gradle, a configuration represents a group of artifacts and their dependencies. You typically have several configurations depending on what you want to do. For instance, you could have one where you declare which dependencies are needed for compilation, which are only needed at runtime, or which are needed for running a particular Java application.
In your case, you are saying that the classpath to the XmlToJavaParser class is "all configurations combined" and that doesn't really make sense. You are also not allowed to do that as some configurations from the Java plugin are not resolvable like this, which is why you get an error.
So to fix it, you should declare your own configuration for XmlToJavaParser. You can then declare dependencies for it like you normally do. Example (using the Groovy DSL):
configurations {
    xmlJavaParser {
        canBeResolved = true
        canBeConsumed = false
    }
}

dependencies {
    xmlJavaParser "org.example:xml-java-parser:1.0" // or whatever you need
}

private runJavaFile(String dirPath) {
    // ...
    javaexec {
        classpath = configurations.xmlJavaParser // The configuration is referenced here
        main = 'XmlToJavaParser'
        args = ["$xmlfile", "$javaFile", 'Java']
    }
}
There are also other ways to go about it. But the main point is to not use configurations.all as a classpath.
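For example, one alternative (a sketch under the assumption that the input and output paths are known up front; the paths below are placeholders, not from the original build) is a dedicated JavaExec task using the same custom configuration:
// Hypothetical one-off conversion wired up as its own task.
task convertExampleXml(type: JavaExec) {
    classpath = configurations.xmlJavaParser
    main = 'XmlToJavaParser'
    args 'src/main/xml/Example.xml', 'build/generated/Example.java', 'Java'
}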
I have a Gradle build that has some dependencies of the form
compile files('path/to/local/lib.jar')
(the build is being migrated - eventually these will be replaced)
The build failed because one of these paths was incorrectly specified. But it failed due to a compile error - it looked like Gradle silently ignored the missing dependency.
Is there a simple option or switch that will force Gradle to fail the build if any dependency (particularly local file dependencies) cannot be resolved (eg., file missing)?
Edit: to clarify further:
If a dependency cannot be found in the configured repositories, Gradle will fail the build when attempting to resolve it, as expected.
BUT - if a dependency is defined as "compile files ....", and the specified file does not exist at build time, Gradle will IGNORE that error and attempt compilation anyway. That seems spectacularly wrong-headed and inconsistent default behaviour.
My question is - is there a Gradle option, switch, environment variable or system property that I can set to force Gradle to verify that file dependencies exist? (I.e., behave in a sane and rational way?)
This is a bit of an old thread, but given that none of the currently proposed solutions actually works, and the solution appears to be trivial (collating two of them), I am leaving it here for future reference.
The point here is that we simply want to ensure that the files do exist, so we can just use the exists() method of the File class:
task ensureDepsExist() {
    doLast {
        configurations.implementation.canBeResolved(true)
        Set<File> impFiles = configurations.implementation.resolve()
        impFiles.forEach { f ->
            if (!f.exists()) {
                ant.fail "${f} could not be found"
            }
        }
    }
}

compileJava.dependsOn ensureDepsExist
The canBeResolved() call is required, or Gradle will complain that the configuration's dependencies cannot be resolved.
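If you prefer not to change canBeResolved on the built-in implementation configuration, a hedged alternative (assuming the java plugin is applied) is to resolve runtimeClasspath, which already extends implementation and is resolvable out of the box:
// Assumption: the 'java' plugin is applied, so runtimeClasspath exists and
// includes the file dependencies declared on implementation/compile.
task ensureDepsExistViaRuntimeClasspath() {
    doLast {
        configurations.runtimeClasspath.resolve().each { f ->
            if (!f.exists()) {
                ant.fail "${f} could not be found"
            }
        }
    }
}

compileJava.dependsOn ensureDepsExistViaRuntimeClasspath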
Here's how you can check transitive dependencies using Gradle 7.3 (example: Fail if the project depends on log4j directly or transitively).
Kotlin DSL
configurations {
    all {
        resolutionStrategy {
            eachDependency {
                if (requested.name == "log4j") {
                    throw RuntimeException("Project depends on log4j")
                }
            }
        }
    }
}
Groovy DSL
configurations.all {
    resolutionStrategy.eachDependency { DependencyResolveDetails details ->
        if (details.requested.name == 'log4j') {
            throw new RuntimeException("Project depends on log4j")
        }
    }
}
You could do something as shown below. It is not a built-in Gradle function but does not require code to check each dependency specifically (it checks all in the compile configuration):
apply plugin: 'java'

dependencies {
    compile files('lib/abc.jar')
    compile files('lib/def.jar')
}

task checkDependencies() {
    doLast {
        configurations.compile.each { file ->
            assert file.exists()
        }
    }
}

compileJava.dependsOn checkDependencies
To fail the build you can:
ant.fail('message why it failed')
Then you can craft a condition and fail the build with a nice message ;)
I would suggest creating a task that first brings the file into the project, with a condition that checks whether the file is available; if not, throw a Gradle exception and fail the build with a message. Execute that task first in the execution phase.
I have no chance to test it now, but it could be something like this (correct me if any syntax is wrong, but you should get the idea):
def yourDep = file($/\path\to\your\dependency/$)

task bringDeps {
    doLast {
        if (yourDep.exists()) {
            copy {
                from yourDep
                into "$projectDir/depsOrSmthg"
            }
        } else {
            ant.fail('message why it failed')
        }
    }
}
// Checks that every dependency declared on the implementation configuration
// was actually resolved to a file whose name matches the dependency's name and version.
task ensureDependenciesExist() {
    doLast {
        configurations.implementation.canBeResolved(true)
        DependencySet deps = configurations.implementation.getDependencies()
        Set<File> impFiles = configurations.implementation.resolve()
        deps.each { d ->
            boolean depWasResolved = impFiles.any { impFile -> impFile.name.find(".*${d.name}.*${d.version}") }
            if (!depWasResolved) {
                println "${d} was not resolved"
                assert depWasResolved
            }
        }
    }
}

compileJava.dependsOn ensureDependenciesExist
We are developing a Java project that is able to instrument (change) class files at build time. We defined a Gradle task that invokes a Java-based Ant task which takes an inputDir (e.g. build/classes), an outputDir (e.g. build/classes-instrumented) and possibly other parameters. The task gets invoked separately for main and test class files after compilation.
Since the "normal" java sourceSet is not a good fit, our first thought was to implement our own sourceSet, but we couldn't find an easy way. A reasonable alternative, similar to ANTLR etc., seemed to be extra variables. Since I needed several, I went for a Map.
sourceSets.all { ext.instrumentation = [:] }

sourceSets.all {
    instrumentation.inputDir = null
    instrumentation.outputDir = null
    instrumentation.classPath = null
}

def postfix = '-instrumented'
Below you can see how we initialize the variables:
sourceSets {
    main {
        instrumentation.inputDir = sourceSets.main.output.classesDir
        instrumentation.outputDir = instrumentation.inputDir + postfix
        instrumentation.classPath = sourceSets.main.output + configurations.compile
    }
    test {
        instrumentation.inputDir = sourceSets.test.output.classesDir
        instrumentation.outputDir = instrumentation.inputDir + postfix
    }
}
However it fails with "Could not find method main() for arguments [build_f2cvmoa3v4hnjefifhpuk6ira$_run_closure5_closure23#12a14b74] on root project 'Continuations'."
We are using Gradle 2.1
I have the following questions:
Any idea why the first one fails?
Is the extra variable a reasonable solution to approach the problem?
Thanks a lot for your help.
Solution: install the latest version.
I had the same problem: I was reading the documentation for Gradle 3, but Gradle 2.7 was installed.
I checked the Gradle version: 2.7.
Then I read the Gradle 2.7 docs (https://docs.gradle.org/2.7/userguide/tutorial_java_projects.html#N103CD), but found no info about sourceSets in the java plugin for that version.
I installed Gradle 3 --> problem solved.
I have a simple Gradle build script to compile and package (similar to the application plugin) my Java application. The only thing I cannot accomplish is replacing the current version number in a simple .properties file.
I have created a file 'src/main/resources/app-info.properties' with a single line 'application.version = #version#'. Now I want to replace this version string whenever the file is copied to the build folder (I think this happens during the build task).
I already tried a simple solution with Ant's ReplaceTokens. It replaced the version but also broke my .png files in the resources.
So is there a simple solution to just replace tokens in one single file during the build task (or whatever task handles the copy to the build folder)?
Thank you for any help!
Ben
====== Edit based on the comment from Opal =====
Based on the hint I have added the following:
import org.apache.tools.ant.filters.ReplaceTokens

// ...

build {
    from('src/main/resources') {
        include '*.properties'
        filter(ReplaceTokens, tokens: [version: project.version])
    }
}
Which throws this error:
Could not find method from() for arguments [src/main/resources, build_vbjud9ah7v3pj5e7c5bkm490b$_run_closure6_closure12#43ead1a8] on root project
Seems like I am on the wrong task?
====== Edit for completeness, adding the solution based on Opal's suggestion =====
Thanks man, the following is the working solution!
processResources {
    from('src/main/resources') {
        include '*.properties'
        filter(ReplaceTokens, tokens: [version: project.version])
    }
}
Books and blogs alike, including the answer from Opal, all recommend using a mixture of exclude/include, from() and filter(). And of course, so did I on my first attempt to replace the text {{app javascript library}} in an index.html file with the path of a JavaScript library, which depended on a simple project property setting.
The problem that hit me was that my 'war' task produced duplicate index.html files in the war archive, and getting rid of the problem using the pattern described previously resulted in one huge, unreadable hack.
Then I found a really straightforward solution. The following example is from my own build script; you will have to customize it a bit to suit your needs:
war {
    eachFile { copyDetails ->
        if (copyDetails.path == 'index.html') {
            filter { line ->
                line.replace('{{app javascript library}}', "lib/someLib.js")
            }
        }
    }
}
What you need to do is include the file for replacement and exclude the other files from replacement. Here is a sample usage; search for ReplaceTokens and you'll see what I am talking about.
You need to add filtering to the processResources task. Sample code:
processResources {
    def profile = project.properties['profile']
    def replace_tokens = profile ? filter_tokens[profile] : filter_tokens['default']
    exclude '**/log4j-test.xml'
    from('src/main/resources') {
        exclude '**/*.ttf'
        filter(ReplaceTokens, tokens: replace_tokens)
    }
    from('src/main/resources') {
        include '**/*.ttf'
    }
}
Above, .ttf (binary) files are excluded from filtering but still copied. replace_tokens is a token map taken from a map defined in another part of the script, for example as sketched below.
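A minimal, hypothetical sketch of what that filter_tokens map could look like (the profile names and token values are assumptions, not taken from the original script):
// Hypothetical token maps keyed by profile name; adjust to your own profiles.
ext.filter_tokens = [
    'default': [version: project.version, env: 'development'],
    'prod'   : [version: project.version, env: 'production']
]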