Gradle exclude VS include - java

Thanks to answers on this question, the following works:
task copyToLib(type: Copy) {
    into "$buildDir/myapp/lib"
    from configurations.runtime {
        exclude module: 'commons-io'
    }
}
I would assume that the following should also work (include instead of exclude):
task copyToLib(type: Copy) {
    into "$buildDir/myapp/lib"
    from configurations.runtime {
        include module: 'commons-io'
    }
}
But I'm getting the following error:
org.gradle.internal.metaobject.AbstractDynamicObject$CustomMessageMissingMethodException: Could not find method include() for arguments [{module=commons-io}] on configuration ':runtime' of type org.gradle.api.internal.artifacts.configurations.DefaultConfiguration.
Is that expected or am I missing anything obvious?

Groovy allows you to omit a lot of parentheses and other unnecessary syntax, but this can also lead to undesired behaviour, as in your case.
A common approach to creating a child CopySpec via the from(Object, Closure) method looks just like your code:
[...]
from 'sourcePath' {
    // configuration closure
}
[...]
First you pass an object, which will be evaluated via Project.files(); then you pass a closure for configuration. The parentheses of the method call can be omitted. Easy-peasy.
But in your example, the expression passed as the object is itself a method call that configures a Configuration in the ConfigurationContainer, just like in the following common piece of Gradle code:
configurations.runtime {
    exclude module: 'xyz'
}
So the passed closure is interpreted as configuring the Configuration (globally, by the way), not the CopySpec. One way to handle this problem is to write the omitted parentheses explicitly:
[...]
from(configurations.runtime, {
    // configuration closure
})
[...]
Please note: with the above example you will be able to use both the exclude() and the include() method, but not the way you used them in your code examples! In a CopySpec you can only exclude (or include) files or file patterns, not modules. The methods simply won't accept a map; you need to pass strings (file patterns) or a closure.
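For example, a variant that copies only the commons-io jar could look roughly like this (a sketch; the commons-io*.jar pattern assumes the artifact's file name starts with commons-io):
task copyToLib(type: Copy) {
    into "$buildDir/myapp/lib"
    from(configurations.runtime) {
        include 'commons-io*.jar'   // a file name pattern, not a module notation
    }
}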

Related

Getting the resource path of the java project from a custom gradle plugin

Trying to create a custom Gradle plugin in Java, how do I get the resources path from inside the task class?
public class MyCustomPlugin implements Plugin<Project> {
    @Override
    public void apply(Project project) {
        project.getTasks().register("doStuff", CustomTask.class);
    }
}
public class CustomTask extends DefaultTask {
    // How do I get the java project resources dir from here?
    private final DirectoryProperty directoryProperty;

    @Inject
    public CustomTask(ProjectLayout projectLayout) {
        directoryProperty = projectLayout.getBuildDirectory();
    }

    @TaskAction
    public void execute() {
        ...
    }
}
I would recommend not getting the directory inside the task, because the plugin that provides it might not be applied. Instead, do it from within the plugin that registers the task; this way you can also ensure that the necessary plugin is actually applied. If the task is then used without a value assigned to its input, Gradle will display an error explaining that nothing was assigned.
With the kotlin-dsl:
@CacheableTask
abstract class CustomTask : DefaultTask() {
    @get:InputFiles
    abstract val resources: ConfigurableFileCollection
    //...
}
I cannot answer whether @InputFiles is the right annotation for your use case, because I don't know what you want to do with the resources. Refer to the Gradle documentation for more information on the available annotations and what they do.
plugins {
    java
}

tasks.register<CustomTask>("customTask") {
    resources.setFrom(sourceSets.main.map { it.resources })
}
Notice the map {}, which ensures that our task has a dependency on the processResources task; this is done automatically for us because we stick to Gradle's provider API for everything.
Note that the resources are by default in one directory, but they don't have to be; this is why the resources are defined as a SourceDirectorySet and not as a Provider<Directory>. The same is true for anything that originates from the SourceSetContainer. It is easier to explain with Java source code: imagine you have Java and Kotlin, then you will have src/main/java and src/main/kotlin, hence two directories. The former will have a **/*.java include filter, whereas the latter has a **/*.kt include filter. If we just want to get all sources we use sourceSets.main.map { it.java.sourceDirectories }, and if we want only one of the two it gets complicated. 😝
First, you'd have to ensure this is a Java project: either by applying the "java" plugin from your plugin (project.getPluginManager().apply("java")), or by only registering the task when the "java" plugin has been applied by the user (project.getPluginManager().withPlugin("java", ignored -> { project.getTasks().register(…); });).
You could then get the resources from the main source set:
SourceSetContainer sourceSets = project.getExtensions().getByType(SourceSetContainer.class);
// Use named() instead of getByName() if you prefer/need to use providers
SourceSet mainSourceSet = sourceSets.getByName(SourceSet.MAIN_SOURCE_SET_NAME);
SourceDirectorySet resources = mainSourceSet.getResources();
BTW, the best practice is to have tasks only declare their inputs and outputs (e.g. I need a set of directories, or files, as inputs, and my outputs will be one single file, or in one single directory) and have the actual wiring with default values be done by the plugin.
You could have the plugin register the task unconditionally and then, only when the "java" plugin is applied, configure its inputs to the project resources; or you could conditionally register the task, or unconditionally apply the "java" plugin, as shown above.
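A rough sketch of that wiring, written in Groovy rather than Java for brevity, and assuming CustomTask exposes resources as a ConfigurableFileCollection like the Kotlin snippet above:
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.tasks.SourceSet
import org.gradle.api.tasks.SourceSetContainer

class MyCustomPlugin implements Plugin<Project> {
    void apply(Project project) {
        def customTask = project.tasks.register('doStuff', CustomTask)
        // only wire the default value once the 'java' plugin is actually present
        project.pluginManager.withPlugin('java') {
            def sourceSets = project.extensions.getByType(SourceSetContainer)
            customTask.configure { task ->
                task.resources.from(sourceSets.named(SourceSet.MAIN_SOURCE_SET_NAME).map { it.resources })
            }
        }
    }
}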
You can access the sources through the project's source sets:
@Inject
public CustomTask(Project project) {
    directoryProperty = project.getLayout().getBuildDirectory();
    sourceSet = project.getExtensions().getByType(SourceSetContainer.class)
            .getByName(SourceSet.MAIN_SOURCE_SET_NAME);
}
See also the reference documentation here: https://docs.gradle.org/current/userguide/java_plugin.html#sec:java_project_layout

How to make an interface a compile-only dependency when loading its implementation dynamically

Consider the following interface
// src/MyInterface.java
interface MyInterface {
    public void quack();
}
which is used dynamically by the following application, i.e. its implementation is loaded dynamically; for demonstration purposes we'll just use the implementing class's name to determine which implementation to load.
// src/Main.java
class Main {
    public static void main(String[] args) {
        try {
            MyInterface obj = (MyInterface) Class.forName("Implementation")
                    .getDeclaredConstructor()
                    .newInstance();
            obj.quack();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
The following implementation of the interface is available:
// src/Implementation.java
class Implementation implements MyInterface {
    public void quack() {
        System.out.println("This is a sample implementation!");
    }
}
Intuitively, I would think that MyInterface provides information that is only relevant at compile time, such as which methods can be invoked on objects that implement it, but that it shouldn't be needed at runtime, since it doesn't provide any "executable code". But this is not the case: if I try to run the compiled Main.class without MyInterface.class, it complains:
$ javac -d bin/ src/*
$ rm bin/MyInterface.class
$ java -cp bin/ Main
Exception in thread "main" java.lang.NoClassDefFoundError: MyInterface
[...]
I guess it makes sense, because it needs access to MyInterface's Class object to perform the cast to MyInterface, so it needs to load MyInterface. But I feel there should be a way to make it a compile-time-only dependency. How?
Some context
This question arose when I learned that there can be compile-time only dependencies, an example of which is the servlet api. I read that when compiling servlet code, you need to have the servlet-api (in Tomcat's case) jar, but at runtime it is not needed because the server provides an implementation. Since I didn't understand exactly how that could work, I tried setting up the little experiment above. Did I misunderstand what that means?
Edit: this Gradle page mentions that a compile-time only dependency could be
Dependencies whose API is required at compile time but whose implementation is to be provided by a consuming library, application or runtime environment.
What would be an example for that? I find that sentence a bit confusing, because it seems to imply that the API is not needed at runtime, and only the implementation is. From the answers, I gather that's not possible, right? (Unless somehow implementing a custom classloader?)
Yes, it looks like you misunderstood the example with servlet-api.jar. You need it in your project as a compile-time dependency because Tomcat itself ships with that jar and adds it to the runtime classpath.
If you use classes/interfaces in your code, they have to end up on the classpath somehow, since your code depends on them.
And starting with Java 8, interfaces can have default implementations for methods ("executable code"), and interfaces can also contain constants.
Maybe it is possible to run the application without the interface declaration, but in that case you would need to write a custom ClassLoader that checks for an interface implementation and loads it instead of the interface itself.
Did I misunderstand what that means?
Yes.
You're talking about "provided" dependencies (at least, that's what Maven calls them). Such a dependency still must be present on the classpath/modulepath at both compile-time and runtime. However, you don't have to include the provided dependency with your application when deploying your application, because the target container/framework already includes the dependency.
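In Gradle terms this corresponds roughly to a compileOnly dependency (Maven calls it provided); for servlet code, for example, it might look like this (coordinates shown purely as an illustration):
dependencies {
    // available for compilation, but not packaged with the application:
    // the servlet container (e.g. Tomcat) supplies these classes at runtime
    compileOnly 'javax.servlet:javax.servlet-api:4.0.1'
}
The interface classes still end up on the runtime classpath; they just come from the container rather than from your artifact.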

Gradle generate sources.jar for only public interfaces

I am working on a closed-source Android library (published as an AAR), and want to include some javadocs for consumers, which requires a sources.jar.
I know I could cherry-pick each file using an includes property or maybe even a whole package/folder.
task('androidSourcesJar', type: Jar) {
    classifier = 'sources'
    baseName = artifactBaseName
    from android.sourceSets.main.java.srcDirs
    include('MyInterface1.kt', 'MyInterface2.kt', 'MyInterface3.kt')
}
Instead, is there a way to include only public classes, interfaces, methods, etc? This seems like a problem that would've come up before.
You could try adding something like this, instead of your include:
from 'src/main/java'
eachFile { currentFile ->
    String contents = new File(currentFile.getSourcePath()).text
    if (!contents.contains("public class")) {
        currentFile.exclude()
    }
}
I'm not entirely sure if that works, but it should set you on the right path to where you want to go.
Since Gradle does not actually do any code analysis, you can't simply say "only include files that contain public classes". Instead, you either have to write a custom plugin that only includes public classes, or do something like what I provided. It includes everything from the source directory, but runs a little bit of code on each file: first it reads the contents of the file, then it checks whether that file contains public class. If not, the file doesn't have a public class and should be excluded.
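Putting the pieces together, the whole task could look roughly like this (the text match is only a heuristic; keep in mind that in Kotlin public is the default visibility and is usually not written out, so the pattern would need widening for idiomatic Kotlin sources):
task androidSourcesJar(type: Jar) {
    classifier = 'sources'
    baseName = artifactBaseName
    from android.sourceSets.main.java.srcDirs
    eachFile { details ->
        // heuristic: keep only files whose text declares an explicitly public type
        def text = details.file.text
        if (!(text =~ /public\s+(class|interface|object)/)) {
            details.exclude()
        }
    }
}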
Hope this helps! Feel free to ask any more questions if you have any.

Use of Paranamer in gradle to generate parameter name data

I am trying to get the parameter names of a class inside my Android application, to supply input to a help() method which will print all available methods with their parameter types and names. To get the parameter names, I am trying to use the Paranamer jar. I added paranamer.jar as a module library and I am able to import and use it, but I have no idea how to plug it into Gradle in order to generate the parameter name data. Any help?
Java code:
public void help() {
    Method[] declaredMethods = myclass.class.getDeclaredMethods();
    Paranamer paranamer = new CachingParanamer();
    for (Method declaredMethod : declaredMethods) {
        String[] parameterNames = paranamer.lookupParameterNames(declaredMethod, false);
        //process paranamer.
    }
}
Use their Ant task; Ant is neatly integrated into Gradle: https://docs.gradle.org/current/userguide/userguide_single.html#sec:using_custom_ant_tasks
It should be something like:
configurations {
    paranamer
}

dependencies {
    paranamer 'com.thoughtworks.paranamer:paranamer-ant:2.8'
}

compileJava.doLast {
    ant.taskdef name: 'paranamer',
            classname: 'com.thoughtworks.paranamer.ant.ParanamerTask',
            classpath: configurations.paranamer.asPath
    ant.paranamer(classdir: destinationDir) {
        source.addToAntBuilder ant, 'fileset', FileCollection.AntType.FileSet
    }
}
I didn't test this; I don't use Paranamer, I just made this up from the docs and source. destinationDir and source are not to be replaced by actual strings; they are fields of the compile task.
I made this an additional action of the compileJava task instead of a separate task, because it modifies the class files produced by the compile task in place. With a separate task you would have to copy the class files somewhere, run Paranamer, and then make sure the modified files are packaged instead of the original ones. And if a separate task modified the result of the compileJava task in place, it would break incremental building, because the compileJava task would then always be out of date and run every time. The way I suggest it, the work is part of the compileJava task itself and is finished before the output snapshot used for the up-to-date check is taken, so it should work flawlessly in this regard.
If you have multiple source sets or custom JavaCompile tasks and want to modify all of them, it would instead be something like:
tasks.withType(JavaCompile) {
    it.doLast {
        ...
    }
}
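One caveat worth adding: lookupParameterNames() runs inside your app, so the Paranamer runtime library itself still needs to be on the runtime classpath. Rather than adding the jar by hand, it can be declared as an ordinary dependency (version shown only as an example):
dependencies {
    implementation 'com.thoughtworks.paranamer:paranamer:2.8'   // or 'compile' on older Gradle/Android plugin versions
}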

Multi-module annotation processing in Android Studio

I have a project with multiple modules in Android Studio. A module may have a dependency on another module, for example:
Module PhoneApp -> Module FeatureOne -> Module Services
I've included my annotation processing in the root module, but the android-apt annotation processing occurs only at the topmost level (PhoneApp), so it should theoretically have access to all the modules at compile time. However, what I'm seeing in the generated java file is only the classes annotated in PhoneApp and none from the other modules.
PhoneApp/build/generated/source/apt/debug/.../GeneratedClass.java
In the other modules, I am finding a generated file in the intermediates directory that contains only the annotated files from that module.
FeatureOne/build/intermediates/classes/debug/.../GeneratedClass.class
FeatureOne/build/intermediates/classes/debug/.../GeneratedClass.java
My goal is to have a single generated file in PhoneApp that allows me to access the annotated files from all modules. I'm not entirely sure why the code generation process runs for each module and fails to aggregate all annotations in PhoneApp. Any help appreciated.
The code is fairly simple and straightforward so far; checkIsValid() is omitted as it works correctly:
Annotation Processor:
@Override
public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
    try {
        for (Element annotatedElement : roundEnv.getElementsAnnotatedWith(GuiceModule.class)) {
            if (checkIsValid(annotatedElement)) {
                AnnotatedClass annotatedClass = new AnnotatedClass((TypeElement) annotatedElement);
                if (!annotatedClasses.containsKey(annotatedClass.getSimpleTypeName())) {
                    annotatedClasses.put(annotatedClass.getSimpleTypeName(), annotatedClass);
                }
            }
        }
        if (roundEnv.processingOver()) {
            generateCode();
        }
    } catch (ProcessingException e) {
        error(e.getElement(), e.getMessage());
    } catch (IOException e) {
        error(null, e.getMessage());
    }
    return true;
}
private void generateCode() throws IOException {
    PackageElement packageElement = elementUtils.getPackageElement(getClass().getPackage().getName());
    String packageName = packageElement.isUnnamed() ? null : packageElement.getQualifiedName().toString();
    ClassName moduleClass = ClassName.get("com.google.inject", "Module");
    ClassName contextClass = ClassName.get("android.content", "Context");
    TypeName arrayOfModules = ArrayTypeName.of(moduleClass);
    MethodSpec.Builder methodBuilder = MethodSpec.methodBuilder("juice")
            .addParameter(contextClass, "context")
            .addModifiers(Modifier.PUBLIC, Modifier.STATIC)
            .returns(arrayOfModules);
    methodBuilder.addStatement("$T<$T> collection = new $T<>()", List.class, moduleClass, ArrayList.class);
    for (String key : annotatedClasses.keySet()) {
        AnnotatedClass annotatedClass = annotatedClasses.get(key);
        ClassName className = ClassName.get(annotatedClass.getElement().getEnclosingElement().toString(),
                annotatedClass.getElement().getSimpleName().toString());
        if (annotatedClass.isContextRequired()) {
            methodBuilder.addStatement("collection.add(new $T(context))", className);
        } else {
            methodBuilder.addStatement("collection.add(new $T())", className);
        }
    }
    methodBuilder.addStatement("return collection.toArray(new $T[collection.size()])", moduleClass);
    TypeSpec classTypeSpec = TypeSpec.classBuilder("FreshlySqueezed")
            .addModifiers(Modifier.PUBLIC, Modifier.FINAL)
            .addMethod(methodBuilder.build())
            .build();
    JavaFile.builder(packageName, classTypeSpec)
            .build()
            .writeTo(filer);
}
This is just a demo of annotation processing that works with Guice, if anyone is curious.
So how can I get all the annotated classes to be included in the generated PhoneApp .java file from all modules?
It's never too late to answer a question on SO, so...
I faced a very similar complication during one of my tasks at work.
And I was able to resolve it.
Short version
All that moduleA needs to know about the classes generated in moduleB is their package and class names. That information can be stored in some kind of generated MyClassesRegistrar class placed in a known package. Use suffixes to avoid name clashes, look the registrars up by package, instantiate them, and use the data they carry.
Long version
First of all: you will NOT be able to include your compile-time-only processor dependency ONLY in the topmost module (let's call it the "app" module, as a typical Android project structure does). Annotation processing just does not work that way and, as far as I could find out, nothing can be done about this.
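In practice that means every module containing annotated classes has to declare the processor itself, along the lines of (module names are placeholders):
// build.gradle of PhoneApp, FeatureOne and Services alike
dependencies {
    implementation project(':my-annotations')      // hypothetical module holding the annotations
    annotationProcessor project(':my-processor')   // or 'apt project(...)' when using the android-apt plugin
}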
Now to the details. My task was this:
I have human-written annotated classes; I'll call them "events". At compile time I need to generate helper classes for those events that incorporate their structure and content, both statically available (annotation values, constants, etc.) and runtime available (I pass event objects to the helpers when using them). A helper's class name is the event class name plus a suffix, so I don't know it until code generation has finished.
So after the helpers are generated I create a factory and generate code that provides a new helper instance for a given MyEvent.class. Here's the problem: I only needed one factory, in the app module, but it should be able to provide helpers for events from the library modules, and that can't be done in a straightforward way.
What I did was:
skip generating the factory for modules that my app module depends upon;
in non-app modules, generate so-called HelpersRegistrar implementation(s):
– they all share the same package (you'll see why later);
– their names don't clash because of the suffix (see below);
– differentiation between the app module and a library module is done via the javac "-Amylib.suffix=MyModuleName" param, which the user MUST set; this is a limitation, but a minor one. No suffix is specified for the app module;
– a generated HelpersRegistrar implementation can provide everything I need for generating the factory code later: event class name, helper class name, and package (helper and event share a package so they have package visibility to each other), all as Strings incorporated in a POJO;
in the app module I generate the helpers as usual, then obtain the HelpersRegistrars by their package, instantiate them, and run through their content to enrich my factory with code that provides the helpers from the other modules. All I needed for this was the class names and a package.
Voilà! My factory can provide instances of helpers both from the app module and from the other modules.
The only uncertainty left is the order of creating and running processor instances in the app module and in the other modules. I have not found any solid info on this, but running my example shows that the compiler (and therefore code generation) first runs in the modules that we depend upon, and then in the app module (otherwise compilation of the app module would fail). This gives us reason to expect a known order of processor executions across modules.
Another, slightly similar approach: skip the registrars, generate factories in all modules, and write the factory in the app module so that it uses the other factories, which you obtain and name the same way as the registrars above.
An example can be seen here: https://github.com/techery/janet-analytics - this is a library where I applied this approach (the one without registrars, since I have factories; that may not be the case for you).
P.S.: the suffix param can be switched to a simpler "-Amylibraryname.library=true", and the factory/registrar names can be autogenerated/incremented.
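For reference, such a -A option can be passed to the processor from a module's build script roughly like this (the property name is the one from the example above; the module name is a placeholder):
tasks.withType(JavaCompile) {
    options.compilerArgs << '-Amylib.suffix=FeatureOne'
}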
Instead of using Filer to save the generated file, use regular Java file writing. You will need to serialize objects to temp files during processing, because even static variables don't persist between modules. Configure Gradle to delete the temp files before compilation.
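A very loose sketch of that last point (scratch location and wiring are hypothetical; the directory should only be cleared once per build, before the first module compiles, so the modules can still share the serialized data):
// root build.gradle
task cleanAptScratch(type: Delete) {
    delete "${rootDir}/apt-scratch"   // hypothetical temp dir used by the processor
}
subprojects {
    tasks.withType(JavaCompile) {
        it.dependsOn rootProject.tasks.cleanAptScratch
    }
}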
