Use of Paranamer in gradle to generate parameter name data - java

I am trying to get the parameter names of a class inside my Android application, to supply input to a help() method which prints all available methods along with their parameter types and names. To get the parameter names I am trying to use the paranamer jar. I added paranamer.jar as a module library and I am able to import and use it, but I have no idea how to plug it into Gradle in order to generate the parameter name data. Can anyone help?
Java code:
public void help() {
    Method[] declaredMethods = myclass.class.getDeclaredMethods();
    Paranamer paranamer = new CachingParanamer();
    for (Method declaredMethod : declaredMethods) {
        String[] parameterNames = paranamer.lookupParameterNames(declaredMethod, false);
        // process the parameter names
    }
}

Use their Ant task; Ant is neatly integrated into Gradle: https://docs.gradle.org/current/userguide/userguide_single.html#sec:using_custom_ant_tasks
It should be something like:
configurations {
    paranamer
}

dependencies {
    paranamer 'com.thoughtworks.paranamer:paranamer-ant:2.8'
}

compileJava.doLast {
    ant.taskdef name: 'paranamer',
            classname: 'com.thoughtworks.paranamer.ant.ParanamerTask',
            classpath: configurations.paranamer.asPath
    ant.paranamer(classdir: destinationDir) {
        source.addToAntBuilder ant, 'fileset', FileCollection.AntType.FileSet
    }
}
I didn't test this; I don't use paranamer, I just put it together from the docs and source. destinationDir and source are not to be replaced by actual strings; they are fields of the compile task.
I made this an additional action of the compileJava task instead of a separate task because it modifies the class files produced by the compile task in place. With a separate task you would have to copy the class files somewhere, run the paranamer task on them, and then make sure the modified files are packaged instead of the originals. If a separate paranamer task modified the output of compileJava in place, it would break incremental building: compileJava would always be out-of-date and run every time. The way I suggest it, the processing is part of the compileJava task itself and thus happens before the up-to-date check logic, so it should work flawlessly in this regard.
If you have multiple source sets or custom JavaCompile tasks and want to modify all of them, it would instead be something like:
tasks.withType(JavaCompile) {
    it.doLast {
        ...
    }
}
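Note that the Ant task only weaves the parameter name data into the class files; at runtime the application still needs the paranamer library itself on the classpath so that CachingParanamer can read that data. A minimal sketch, assuming the usual Maven Central coordinates of the runtime artifact:

dependencies {
    // runtime library (not the -ant artifact used above); version assumed
    compile 'com.thoughtworks.paranamer:paranamer:2.8'
}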

Related

Getting the resource path of the java project from a custom gradle plugin

Trying to create a custom Gradle plugin in Java, how do I get the resources path from inside the task class?
public class MyCustomPlugin implements Plugin<Project> {
    @Override
    public void apply(Project project) {
        project.getTasks().register("doStuff", CustomTask.class);
    }
}
public class CustomTask extends DefaultTask {
    private final DirectoryProperty directoryProperty;

    // How do I get the java project resources dir from here?
    @Inject
    public CustomTask(ProjectLayout projectLayout) {
        directoryProperty = projectLayout.getBuildDirectory();
    }

    @TaskAction
    public void execute() {
        ...
    }
}
I would recommend not resolving the directory inside the task, because the plugin that provides it might not be applied. Instead I would do it from within the plugin that registers the task; this way you can also ensure that the necessary plugin is actually applied. Gradle will display an error if the task is used without a value assigned to the input, explaining that nothing was assigned.
With the Kotlin DSL:
@CacheableTask
abstract class CustomTask : DefaultTask() {
    @get:InputFiles
    abstract val resources: ConfigurableFileCollection
    //...
}
I cannot say whether @InputFiles is the right annotation for your use case, because I don't know what you want to do with the resources. Refer to the Gradle documentation for more information on the available annotations and what they do.
plugins {
    java
}

tasks.register<CustomTask>("customTask") {
    resources.setFrom(sourceSets.main.map { it.resources })
}
Notice the map {}, which ensures that our task has a dependency on the processResources task; this is done automatically for us because we stick to Gradle's provider API for everything.
Note that the resources are by default in one directory, but they don't have to be. This is why the resources are defined as a SourceDirectorySet and not as a Provider<Directory>. The same is true for anything that originates from the SourceSetContainer. It is easier to explain with Java source code: imagine you have Java and Kotlin, then you will have src/main/java and src/main/kotlin, hence two directories. The former has a **/*.java include filter, whereas the latter has a **/*.kt include filter. If we just want all sources we use sourceSets.main.map { it.java.sourceDirectories }, and if we want only one of the two it gets complicated. 😝
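For readers on the Groovy DSL, a rough equivalent of the snippets above might be the following untested sketch (it assumes the java plugin is applied; printSourceDirs is just an illustrative task name):

task printSourceDirs {
    def allDirs = sourceSets.main.allSource.sourceDirectories   // every language
    def javaDirs = sourceSets.main.java.sourceDirectories       // **/*.java dirs only
    doLast {
        allDirs.each { println "all:  $it" }
        javaDirs.each { println "java: $it" }
    }
}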
First, you'd have to ensure this is a Java project: either applying the "java" plugin from your plugin (project.getPluginManager().apply("java")), or only registering the task when the "java" plugin has been applied by the user (project.getPluginManager().withPlugin("java", ignored -> { project.getTasks().register(…); });).
You could then get the resources from the main source set:
SourceSetContainer sourceSets = project.getExtensions().getByType(SourceSetContainer.class);
// Use named() instead of getByName() if you prefer/need to use providers
SourceSet mainSourceSet = sourceSets.getByName(SourceSet.MAIN_SOURCE_SET_NAME);
SourceDirectorySet resources = mainSourceSet.getResources();
BTW, the best practice is to have tasks only declare their inputs and outputs (e.g. I need a set of directories, or files, as inputs, and my outputs will be one single file, or in one single directory) and have the actual wiring with default values be done by the plugin.
You could have the plugin unconditionally register the task and then, when the "java" plugin is applied, conditionally configure its inputs to the project resources; or conditionally register the task; or unconditionally apply the "java" plugin, as I showed above.
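A minimal Groovy sketch of the first variant (register unconditionally, wire conditionally); names follow the question, and resources is assumed to be the ConfigurableFileCollection input shown earlier:

import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.tasks.SourceSet
import org.gradle.api.tasks.SourceSetContainer

class MyCustomPlugin implements Plugin<Project> {
    void apply(Project project) {
        def customTask = project.tasks.register('doStuff', CustomTask)
        // only wire the resources once the java plugin is actually applied
        project.pluginManager.withPlugin('java') {
            def sourceSets = project.extensions.getByType(SourceSetContainer)
            customTask.configure { task ->
                task.resources.from(sourceSets.named(SourceSet.MAIN_SOURCE_SET_NAME)
                        .map { it.resources })
            }
        }
    }
}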
You can access the sources through the project's source sets:
@Inject
public CustomTask(Project project) {
    directoryProperty = project.getLayout().getBuildDirectory();
    sourceSet = project.getExtensions()
            .getByType(SourceSetContainer.class)
            .getByName(SourceSet.MAIN_SOURCE_SET_NAME);
}
See also the reference documentation here: https://docs.gradle.org/current/userguide/java_plugin.html#sec:java_project_layout

Gradle exclude VS include

Thanks to answers on this question, the following works:
task copyToLib(type: Copy) {
    into "$buildDir/myapp/lib"
    from configurations.runtime {
        exclude module: 'commons-io'
    }
}
I would assume that the following should also work (include instead of exclude):
task copyToLib(type: Copy) {
    into "$buildDir/myapp/lib"
    from configurations.runtime {
        include module: 'commons-io'
    }
}
But I'm getting the following error:
org.gradle.internal.metaobject.AbstractDynamicObject$CustomMessageMissingMethodException: Could not find method include() for arguments [{module=commons-io}] on configuration ':runtime' of type org.gradle.api.internal.artifacts.configurations.DefaultConfiguration.
Is that expected or am I missing anything obvious?
Groovy allows you to omit a lot of braces and other unnecessary syntax, but this may also lead to undesired behaviour, as in your case.
A common approach to create a child CopySpec via the from(Object, Closure) method looks just like your code:
[...]
from 'sourcePath' {
    // configuration closure
}
[...]
First you pass an object, which will be evaluated via Project.files(), then you pass a closure for configuration. Braces can be omitted. Easy-peasy.
But, in your example, the expression passed as object is a method call to configure a Configuration in a ConfigurationContainer, just like in the following common piece of Gradle code:
configurations.runtime {
    exclude module: 'xyz'
}
So the passed closure is interpreted as configuring the Configuration (globally, by the way) and not the CopySpec. One way to handle this problem is to add the omitted braces explicitly:
[...]
from(configurations.runtime, {
    // configuration closure
})
[...]
Please note: using the above example you will be able to call both the exclude() and the include() method, but not the way you used them in your code examples! In a CopySpec you can only exclude (or include) files or file patterns, not modules. The methods simply won't accept a map; you need to pass a list of strings or another closure.
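For example, a sketch that keeps only the commons-io jar by file-name pattern (assuming the artifact's file name starts with commons-io):

task copyToLib(type: Copy) {
    into "$buildDir/myapp/lib"
    from(configurations.runtime, {
        // CopySpec filtering works on file names/patterns, not modules
        include '**/commons-io*.jar'
    })
}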

how to run a groovy method from gradle task?

I am using
Gradle version 2.14
Groovy version 2.4.4
JVM: 1.8.0_121
I want to run a specific Groovy method from a Groovy class. How do I call into a Groovy class from a Gradle task?
My task looks somewhat like this:
task someTask << {
    // Do something
    // Call method, which returns a java.io.File
    // Do something else
}
and my groovy class/method
File getSomeFile(String parameter) {
    // Do something
    // return an instance of java.io.File or maybe null, depending
}
So how do I call the method which takes a parameter and returns a java.io.File?
(I hope this is not a duplicate; I looked around without finding exactly what I need.)
class Foo {
    void bar() { println 'bar'; }
}

task someTask {
    doLast {
        new Foo().bar();
    }
}
Gradle scripts ARE Groovy scripts, so just do it as in any other Groovy script. Just make sure your class is on the classpath, e.g. by depending on the library that contains it in buildscript { dependencies { } }, or by putting the file into the buildSrc project of your Gradle build.
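Applied to the method from the question, a minimal sketch (the class name FileLocator is made up for illustration):

class FileLocator {
    File getSomeFile(String parameter) {
        // do something; return a java.io.File or maybe null, depending
        def file = new File(parameter)
        return file.exists() ? file : null
    }
}

task someTask {
    doLast {
        // call the method with a parameter and use the returned File
        File result = new FileLocator().getSomeFile('settings.gradle')
        println "Got: $result"
    }
}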

Correct way to define a Gradle plugin property extension with Java?

I'm trying to create a Gradle plugin in Java that has property extensions (not conventions, as this is apparently the old, wrong way). For the record, I'm working with Gradle 1.6 on a Linux machine (Ubuntu 12.04).
I've gotten as far as figuring out that this should be done in the Plugin class definition. Here is one way of adding an extension. Create an extension class that contains your properties:
public class MyPluginExtensions {
    File sourceDir;
    File outputDir;
    private final Project project;

    public MyPluginExtensions(Project project) {
        this.project = project;
        sourceDir = new File(project.getProjectDir(), "src");
        outputDir = new File(project.getBuildDir(), "out");
    }
}
Now add these extensions to the project in the main plugin class:
public class MyPlugin implements Plugin<Project> {
    @Override
    public void apply(Project project) {
        Map<String, Object> taskInfo = new HashMap<String, Object>();
        taskInfo.put("type", MyPluginTask.class);
        taskInfo.put("description", "Generates blah from blah.");
        taskInfo.put("group", "Blah");
        Task myPluginTask = project.task(taskInfo, "myPluginTask");

        // Define conventions and attach them to tasks
        MyPluginExtensions extensions = new MyPluginExtensions(project);
        myPluginTask.getExtensions().add("sourceDir", extensions.sourceDir);
        myPluginTask.getExtensions().add("outputDir", extensions.outputDir);
    }
}
This approach, however, doesn't seem to be correct: a new project property shows up in the project.ext namespace instead. I expect to be able to address the plugin extensions as:
in my build.gradle:
myPluginTask.sourceDir = file('the/main/src')
myPluginTask.outputDir = file('the/output')
However, when I put such things in a Gradle script that uses my plugin and try to set this property, Gradle tells me I can't set it:
* What went wrong:
A problem occurred evaluating script.
> There's an extension registered with name 'sourceDir'. You should not reassign it via a property setter.
So, what's the right way to add property extensions for a task in a Java-based Gradle plugin?
EDIT:
Based on some other SO posts, I tried just adding my extensions object in one shot:
// first attempt:
//myPluginTask.getExtensions().add("sourceDir", extensions.sourceDir);
//myPluginTask.getExtensions().add("outputDir", extensions.outputDir);

// second attempt
myPluginTask.getExtensions().add("myPluginTask", extensions);
This appears to work. However, Gradle is now complaining that I've added a dynamic property:
Deprecated dynamic property: "sourceDir" on "task ':myPluginTask'", value: "/Users/jfer...".
So, again, what's the right way to add a plugin extension property?
EDIT 2
So, taking yet another shot at this, I'm adding the extension to the project object and using the create method instead:
// first attempt:
//myPluginTask.getExtensions().add("sourceDir", extensions.sourceDir);
//myPluginTask.getExtensions().add("outputDir", extensions.outputDir);

// second attempt
// myPluginTask.getExtensions().add("myPluginTask", extensions);

// third attempt
project.getExtensions().create("myPluginTask", MyPluginExtensions.class, project);
However, this fails for a couple of reasons:
Creating a properties extension with the same name ("myPluginTask") as the task creates a collision between the task name and the extension name, causing the task to disappear from Gradle's perspective (and to throw oblique errors such as "No such property: dependsOn for class ...MyPluginExtensions").
If I provide a name that does not collide with a task name (e.g., "myPluginPropExt"), the create() method works, but DOES NOT add the extension in its own namespace as expected (e.g., project.myPluginPropExt.propertyName); instead it adds the properties in the project namespace (e.g., project.propertyName), which is not correct and causes Gradle to throw a bunch of "deprecated dynamic property" warnings.
So here is a solution to my problem:
public class MyPlugin implements Plugin<Project> {
    @Override
    public void apply(Project project) {
        Map<String, Object> taskInfo = new HashMap<String, Object>();
        taskInfo.put("type", MyPluginTask.class);
        taskInfo.put("description", "Generates blah from blah.");
        taskInfo.put("group", "Blah");
        Task myPluginTask = project.task(taskInfo, "myPluginTask");

        // Define conventions and attach them to tasks
        MyPluginExtensions extensions = new MyPluginExtensions(project);

        // the magic extension code:
        project.getExtensions().add("myPluginName", extensions);
    }
}
Now I can set a value for one of the extension properties in my build.gradle file like so (and I don't get a warning about adding deprecated dynamic properties):
myPluginName.sourceDir = file('the/main/src')
The final trick is to get this value in my Plugin's task:
public class MyPluginTask extends DefaultTask {
    @TaskAction
    public void action() {
        MyPluginExtensions extensions = (MyPluginExtensions) getProject()
                .getExtensions().findByName("myPluginName");
        System.out.println("sourceDir value: " + extensions.sourceDir);
    }
}
This works, but what annoys me about this solution is that I want to be able to put the extension properties in the same namespace as the task (e.g., myPluginTask.sourceDir), which I have seen in Groovy-based plugins; apparently this is not supported or just doesn't work.
In the meantime, hope this helps someone else.
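For what it's worth, one way to get exactly the myPluginTask.sourceDir namespace is to skip the extension entirely and declare the properties on the task class itself. A sketch, shown in Groovy for brevity (the same works in Java); the @InputDirectory/@OutputDirectory annotations assume the directories are a task input and output respectively:

class MyPluginTask extends DefaultTask {
    @InputDirectory
    File sourceDir

    @OutputDirectory
    File outputDir

    @TaskAction
    void action() {
        println "sourceDir value: $sourceDir"
    }
}

With the task registered under the name myPluginTask, the build script can then set myPluginTask.sourceDir = file('the/main/src') directly.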
The code is adding an extension to a task (rather than a project), which is rarely useful. After that, it tries to set myPluginTask.sourceDir = file('the/main/src'), which isn't possible because the extension was just registered under that same name.
When your task and your extension have the same name, you can do this in your build.gradle:
myPluginTask.sourceDir = file('the/main/src')
project.tasks.myPluginTask.dependsOn clean

AnnotationProcessor using multiple source-files to create one file

I have two classes with methods, and I want to combine the methods of the two classes into one class.
#Service("ITestService")
public interface ITest1
{
#Export
void method1();
}
#Service("ITestService")
public interface ITest2
{
#Export
void method2();
}
Result should be:
public interface ITestService extends Remote
{
    void method1();
    void method2();
}
The first run of my AnnotationProcessor generates the correct output (because the RoundEnvironment contains both classes).
But if I edit one of the classes (for example, adding a new method), the RoundEnvironment contains only the edited class, so the result is the following (after adding newMethod() to interface ITest1):
public interface ITestService extends Remote
{
    void method1();
    void newMethod();
}
Now method2 is missing. I don't know how to fix this. Is there a way (an environment) to access all classes in the project? Or is there another way to solve this?
The code to generate the class is pretty long, so here is a short description of how I generate it: I iterate through the elements with env.getElementsAnnotatedWith(Service.class), extract the methods, and write them into the new file with:
FileObject file = filer.createSourceFile("com/test/" + serviceName);
file.openWriter().append(serviceContent).close();
--- Option 1 - Manual compilation from command line ---
I tried to do what you want, which is to access all the classes from a processor. As people commented, javac always compiles all classes, and from the RoundEnvironment I do have access to all classes that are being compiled, every time (even when no files changed), with one small detail: only as long as all classes show up on the list of classes to be compiled.
I've done a few tests with two interfaces where one (A) depends on the other (B) via extends, and I ran into the following scenarios:
If I ask the compiler to explicitly compile only the interface that has the dependency (A), passing the full path to the java file into the command line, and adding the output folder to the classpath, only the interface I passed into the command line gets processed.
If I explicitly compile only (A) and don't add the output folder to the classpath, the compiler still only processes interface (A). But it also gives me the warning: Implicitly compiled files were not subject to annotation processing.
If I use * or pass both classes to the compiler on the command line, then I get the expected result: both interfaces get processed.
If you set the compiler to be verbose, you'll get an explicit message showing which classes will be processed in each round. This is what I got when I explicitly passed interface (A):
Round 1:
input files: {com.bearprogrammer.test.TestInterface}
annotations: [com.bearprogrammer.annotation.Service]
last round: false
And this is what I got when I added both classes:
Round 1:
input files: {com.bearprogrammer.test.AnotherInterface, com.bearprogrammer.test.TestInterface}
annotations: [com.bearprogrammer.annotation.Service]
last round: false
In both cases I see that the compiler parses both classes, but in a different order. For the first case (only one interface added):
[parsing started RegularFileObject[src\main\java\com\bearprogrammer\test\TestInterface.java]]
[parsing completed 15ms]
[search path for source files: src\main\java]
[search path for class files: ...]
[loading ZipFileIndexFileObject[lib\processor.jar(com/bearprogrammer/annotation/Service.class)]]
[loading RegularFileObject[src\main\java\com\bearprogrammer\test\AnotherInterface.java]]
[parsing started RegularFileObject[src\main\java\com\bearprogrammer\test\AnotherInterface.java]]
For the second case (all interfaces added):
[parsing started RegularFileObject[src\main\java\com\bearprogrammer\test\AnotherInterface.java]]
...
[parsing started RegularFileObject[src\main\java\com\bearprogrammer\test\TestInterface.java]]
[search path for source files: src\main\java]
[search path for class files: ...]
...
The important detail here is that the compiler is loading the dependency as an implicit object for the compilation in the first case. In the second case it will load it as part of the to-be-compiled-objects (you can see this because it starts searching other paths for files after the provided classes are parsed). And it seems that implicit objects aren't included in the annotation processing list.
For more details about the compilation process, check this Compilation Overview, which does not explicitly say which files are picked up for processing.
The solution in this case would be to always pass all classes to the compiler command.
--- Option 2 - Compiling from Eclipse ---
If you are compiling from Eclipse, incremental build will make your processor fail (I haven't tested it). But I would think you can work around that by asking for a clean build (Project > Clean..., also untested), or by writing an Ant build that always cleans the classes directory and setting it up as an Ant Builder in Eclipse.
--- Option 3 - Using build tools ---
If you are using some other build tool like Ant, Maven, or Gradle, the best solution would be to have the source generation in a step separate from your compilation. You would also need your processor compiled in a separate, earlier step (or a separate subproject if using multi-project builds in Maven/Gradle). This would be the best scenario because:
For the processing step you can always do a full, clean "compilation" without actually compiling the code (using the -proc:only option of javac to only process the files; a Gradle sketch follows below).
With the generated source code in place, Gradle would be smart enough not to recompile the generated source files if they didn't change; Ant and Maven would only recompile the needed files (the generated ones and their dependents).
For this third option you could also set up an Ant build script that generates those files from Eclipse, as a builder that runs before your Java builder: generate the source files into some special folder and add that folder to your classpath/build path in Eclipse.
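A hedged Gradle sketch of such a separate processing step (task, configuration, and directory names are illustrative; with -proc:only and no -s flag, the generated sources typically end up in the class output directory):

configurations {
    processor
}

dependencies {
    // the processor is built in a separate subproject, as suggested above
    processor project(':my-processor')
}

task generateSources(type: JavaCompile) {
    source = sourceSets.main.java
    classpath = sourceSets.main.compileClasspath
    destinationDir = file("$buildDir/generated")          // generated sources land here
    options.compilerArgs = ['-proc:only',                 // only run annotation processing
                            '-processorpath', configurations.processor.asPath]
    doFirst { destinationDir.mkdirs() }
}

compileJava.dependsOn generateSources
compileJava.source "$buildDir/generated"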
NetBeans' @Messages annotation generates a single Bundle.java file for all classes in the same package. It works correctly with incremental compilation thanks to the following trick in the annotation processor:
Set<Element> toProcess = new HashSet<Element>();
for (Element e : roundEnv.getElementsAnnotatedWith(Messages.class)) {
    PackageElement pkg = findPkg(e);
    // getEnclosedElements() yields all top-level elements of the package
    for (Element elem : pkg.getEnclosedElements()) {
        if (elem.getAnnotation(Messages.class) != null) {
            toProcess.add(elem);
        }
    }
}
// now process all package elements in toProcess
// rather than just those provided by the roundEnv

PackageElement findPkg(Element e) {
    for (;;) {
        if (e instanceof PackageElement) {
            return (PackageElement) e;
        }
        e = e.getEnclosingElement();
    }
}
By doing this, one can be sure all (top-level) elements in a package are processed together, even if the compilation has only been invoked on a single source file in the package.
If you know where to look for your annotation (top-level elements in a package, or even any element in a package), you should always be able to get the list of all such elements.
