I've been building a plugin API for a Java project (it loads external JAR files), and I want to be able to add any Guice module found inside a plugin to my project's dependency graph.
My approach is to have a PluginsModule whose configure method scans the plugins for other modules and installs them, using Java's ServiceLoader.
I made a test plugin with a module and confirmed that it does get installed; no problems at this point. The problems appear as soon as I do anything inside that module. For example, I bound an interface to an implementation in that plugin (to be clear, the same binding works when it isn't loaded through a plugin, so it's not a binding problem) and tried to inject it, and I get configuration errors saying there is no implementation bound for that interface.
public enum StandardGuiceModuleScanningStrategy implements GuiceModuleScanningStrategy {
    INSTANCE;

    @Override
    public Set<Module> scan(Path directory) throws IOException {
        File directoryAsFile = directory.toFile();
        File[] childrenFiles = directoryAsFile.listFiles();
        if (!directoryAsFile.isDirectory()
                || childrenFiles == null
                || childrenFiles.length == 0) {
            return Collections.emptySet();
        }
        Set<Module> modules = new HashSet<>();
        for (File childFile : childrenFiles) {
            ClassLoader directoryClassLoader = new URLClassLoader(
                    new URL[]{childFile.toURI().toURL()});
            ServiceLoader<Module> moduleServiceLoader = ServiceLoader.load(
                    Module.class, directoryClassLoader);
            moduleServiceLoader.forEach(modules::add);
        }
        return modules;
    }
}
That implementation of my GuiceModuleScanningStrategy uses ServiceLoader, as mentioned above. I also tried other approaches, like scanning the JAR file for Module implementations and checking whether they carry a specific annotation.
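A minimal sketch of a PluginsModule that installs the scanned modules (assuming the GuiceModuleScanningStrategy above; the plugins directory path and error handling are placeholders) might look like this:
public final class PluginsModule extends AbstractModule {

    private final Path pluginsDirectory;

    public PluginsModule(Path pluginsDirectory) {
        this.pluginsDirectory = pluginsDirectory;
    }

    @Override
    protected void configure() {
        try {
            // Scan the plugins directory and install every module that was discovered.
            for (Module module : StandardGuiceModuleScanningStrategy.INSTANCE.scan(pluginsDirectory)) {
                install(module);
            }
        } catch (IOException e) {
            addError(e); // surface scanning failures as Guice configuration errors
        }
    }
}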
All Guice modules annotated with @GuiceModule will be installed into a child injector. All classes annotated with @AutoBind will be bound to all interfaces they implement. You can also give the annotation a name, which produces a named binding, and you can override which interfaces should be used. And if you don't want all of the features, just override the StartupModule and bind only the features you want, or your own.
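As a rough illustration of how that auto-binding approach is meant to be used (the annotation names come from the answer above; the actual library may differ in the details), a plugin class might look like this:
public interface PaymentService {
    void pay(long amountCents);
}

@AutoBind // the library would bind this class to the PaymentService interface it implements
public class CreditCardPaymentService implements PaymentService {
    @Override
    public void pay(long amountCents) {
        // ...
    }
}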
Trying to create a custom Gradle plugin in Java: how do I get the resources path from inside the task class?
public class MyCustomPlugin implements Plugin<Project> {
    @Override
    public void apply(Project project) {
        project.getTasks().register("doStuff", CustomTask.class);
    }
}
public class CustomTask extends DefaultTask {
    private final DirectoryProperty directoryProperty;

    // How do I get the java project resources dir from here?
    @Inject
    public CustomTask(ProjectLayout projectLayout) {
        directoryProperty = projectLayout.getBuildDirectory();
    }

    @TaskAction
    public void execute() {
        ...
    }
}
I would recommend not getting the directory inside the task, because the plugin that provides it might not be applied. Instead, do it from within the plugin that registers the task; this way you can also ensure that the necessary plugin is actually applied. Gradle will display an error explaining that nothing was assigned if the task is used without a value being assigned to the input.
With the kotlin-dsl:
@CacheableTask
abstract class CustomTask : DefaultTask() {
    @get:InputFiles
    abstract val resources: ConfigurableFileCollection
    //...
}
I cannot say whether @InputFiles is the right annotation for your use case, because I don't know what you want to do with the resources. Refer to the Gradle documentation for more information on the available annotations and what they do.
plugins {
    java
}
tasks.register<CustomTask>("customTask") {
    resources.setFrom(sourceSets.main.map { it.resources })
}
Notice the map {}, which ensures that our task has a dependency on the processResources task; this happens automatically because we stick to Gradle's provider API for everything.
Note that the resources are by default in one directory, but they don't have to be. This is why the resources are defined as a SourceDirectorySet and not as a Provider<Directory>. The same is true for anything that originates from the SourceSetContainer. It is easier to explain with Java source code: imagine you have Java and Kotlin, then you will have src/main/java and src/main/kotlin, hence two directories. The former will have a **/*.java include filter, whereas the latter has a **/*.kt include filter. If we just want to get all sources, we use sourceSets.main.map { it.java.sourceDirectories }, and if we want only one of the two it gets complicated. 😝
First, you'd have to ensure this is a Java project: either by applying the "java" plugin from your plugin (project.getPluginManager().apply("java")), or by only registering the task when the "java" plugin has been applied by the user (project.getPluginManager().withPlugin("java", ignored -> { project.getTasks().register(…); });).
You could then get the resources from the main source set:
SourceSetContainer sourceSets = project.getExtensions().getByType(SourceSetContainer.class);
// Use named() instead of getByName() if you prefer/need to use providers
SourceSet mainSourceSet = sourceSets.getByName(SourceSet.MAIN_SOURCE_SET_NAME);
SourceDirectorySet resources = mainSourceSet.getResources();
BTW, the best practice is to have tasks only declare their inputs and outputs (e.g. I need a set of directories, or files, as inputs, and my outputs will be one single file, or in one single directory) and have the actual wiring with default values be done by the plugin.
You could have the plugin unconditionally register the task and then, when the "java" plugin is applied, conditionally configure its inputs to the project resources; or conditionally register the task, or unconditionally apply the "java" plugin, as shown above.
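Putting those pieces together, a plugin might wire the main source set's resources into the task roughly like this (a sketch; it assumes the task exposes a ConfigurableFileCollection input named resources, as in the Kotlin example above):
public class MyCustomPlugin implements Plugin<Project> {
    @Override
    public void apply(Project project) {
        TaskProvider<CustomTask> doStuff =
                project.getTasks().register("doStuff", CustomTask.class);

        // Only wire the resources once the "java" plugin has been applied by the user.
        project.getPluginManager().withPlugin("java", ignored -> {
            SourceSetContainer sourceSets =
                    project.getExtensions().getByType(SourceSetContainer.class);
            doStuff.configure(task -> task.getResources().from(
                    sourceSets.named(SourceSet.MAIN_SOURCE_SET_NAME)
                              .map(SourceSet::getResources)));
        });
    }
}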
You can access the sources through the project's source sets.
@Inject
public CustomTask(Project project) {
    directoryProperty = project.getLayout().getBuildDirectory();
    sourceSet = project.getExtensions()
            .getByType(SourceSetContainer.class)
            .getByName(SourceSet.MAIN_SOURCE_SET_NAME);
}
See also the reference documentation here: https://docs.gradle.org/current/userguide/java_plugin.html#sec:java_project_layout
I am writing a custom Gradle plugin which will generate some code for me, based on the code it finds in the project it is applied to.
For this I need to find all classes that extend a specific class.
The problem is that the class, that is extended, is not loaded in the classpath, since it is a dependency of the other project.
Currently I have this for my custom task:
public class GenerateCodeTask extends DefaultTask {
    @TaskAction
    public void generateCode() throws MalformedURLException, ClassNotFoundException {
        File buildDir = new File(getProject().getBuildDir(), "classes/main");
        File root = new File(getProject().getProjectDir(), "src/main/generated");
        URLClassLoader classLoader = new URLClassLoader(new URL[]{buildDir.toURI().toURL()});
        Class itemClass = classLoader.loadClass("net.minecraft.item.Item");
        Reflections reflections = new Reflections(classLoader);
        Set<Class<?>> items = reflections.getSubTypesOf(itemClass);
    }
}
And this for the plugin
public class EasymodsPlugin implements Plugin<Project> {
    @Override
    public void apply(Project p) {
        Task t = p.getTasks().create("generateCode", GenerateCodeTask.class);
        t.dependsOn(p.getTasks().getByPath("compileJava"));
    }
}
This is the error I am getting
java.lang.ClassNotFoundException: net.minecraft.item.Item
I know that the problem is that the library containing the class is not loaded, and that causes the exception.
What I want is to be able to load all dependencies of my project into the class loader, so I can use Reflections to find all "items" in the project (which I need in order to generate code).
Greetings Failender
I think you almost got it.
You need the compileClasspath property. I pass it as an input parameter to my task, and build the Class Loader from it:
In the plugin:
Set<File> ccp = project.getConfigurations().getByName("compileClasspath").getFiles();
task.classpath = ccp;
In the task:
@InputFiles
Iterable<File> classpath;
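With that input in place, the class loader can then be built inside the task action roughly like this (a sketch; the classes directory and class name reuse the question's values):
@TaskAction
public void generateCode() throws Exception {
    List<URL> urls = new ArrayList<>();
    // Compiled classes of this project.
    urls.add(new File(getProject().getBuildDir(), "classes/main").toURI().toURL());
    // Everything on the compile classpath: dependent projects and library JARs.
    for (File file : classpath) {
        urls.add(file.toURI().toURL());
    }
    URLClassLoader classLoader = new URLClassLoader(urls.toArray(new URL[0]),
            getClass().getClassLoader());
    Class<?> itemClass = classLoader.loadClass("net.minecraft.item.Item");
    Reflections reflections = new Reflections(classLoader);
    // ... use reflections.getSubTypesOf(itemClass) to find all items
}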
What this property adds is:
the path to the compiled classes of the projects it depends on, generated under the project_id/build dir (so no need to build that path manually);
all the library dependencies of your project, which I think is what you were missing.
So, if your class falls into the first case (a class from a project you depend on):
It is better to have separate subprojects: one with the original classes to extend and another with the generated ones.
The second then depends on the first:
dependencies {
    compile project(':my_project_with_classes_to_extend')
}
If it's the second case (the class comes from a library), you can just add the library as a dependency of your project and it will find the class.
Then rewire the tasks so the first project is built before your task runs (at the root level of your second project's build.gradle):
I think that's the part that wasn't really working for you; apparently compileJava and build are not exactly the same. Or at least, compileJava wasn't working for me either.
myGeneratorTask.dependsOn( ":my_project_with_classes_to_extend:build" )
compileJava.dependsOn( "myGeneratorTask" )
I'm currently writing an application that needs to operate on different types of devices. My approach would be to make a "modular" application that can dynamically load different classes according to the device they need to operate on.
To make the application easily extensible, my goal is to assign a specific path to the additional modules (either .jar or .class files) leaving the core program as it is. This would be crucial when having different customers requiring different modules (without having to compile a different application for each of them).
These modules would implement a common interface, while the "core" application uses the methods defined on the interface and lets the individual implementations do the work. What's the best way to load them on demand? I was considering URLClassLoader, but I don't know whether this approach is still current given newer patterns and Java trends, as I would like to avoid a poorly designed application and deprecated techniques. What's the best alternative approach to make a modular and easily extensible application with JDK 9 (one that can be extended just by adding module files to a folder)?
In addition to the ServiceLoader usage given by @SeverityOne, you can use module-info.java to declare the different implementations of the interface, using the "uses"/"provides" keywords.
Then you use a module path instead of a classpath; it loads the whole directory containing your modules, so there is no need to create a specific class loader.
The ServiceLoader usage:
public static void main(String[] args) {
    ServiceLoader<IGreeting> sl = ServiceLoader.load(IGreeting.class);
    IGreeting greeting = sl.findFirst().orElseThrow(NullPointerException::new);
    System.out.println(greeting.regular("world"));
}
In the user's project:
module pl.tfij.java9modules.app {
    exports pl.tfij.java9modules.app;
    uses pl.tfij.java9modules.app.IGreeting;
}
In the provider project:
module pl.tfij.java9modules.greetings {
    requires pl.tfij.java9modules.app;
    provides pl.tfij.java9modules.app.IGreeting
            with pl.tfij.java9modules.greetings.Greeting;
}
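For completeness, the provider class named in the provides ... with clause is an ordinary implementation of the interface, roughly like this (a sketch; the actual class in the linked repository may differ):
package pl.tfij.java9modules.greetings;

import pl.tfij.java9modules.app.IGreeting;

public class Greeting implements IGreeting {
    @Override
    public String regular(String name) {
        return "Hello, " + name + "!";
    }
}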
And finally, the CLI usage:
java --module-path mods --module pl.tfij.java9modules.app
Here is an example: GitHub example (thanks to the "tfij/" repository for the initial example).
Edit, I realized the repository already provides decoupling examples:
https://github.com/tfij/Java-9-modules---reducing-coupling-of-modules
It sounds like you might want to use the ServiceLoader class, which has been available since Java 6. However, bear in mind that if you want to use Spring dependency injection, this is probably not what you want.
There are two scenarios.
Implementation JARs are on the classpath
In this scenario you can simply use the ServiceLoader API (refer to @pdem's answer).
Implementation JARs are not on the classpath
Let's assume BankController is your interface and CoreController is your implementation.
If you want to load the implementation dynamically from an arbitrary path, create a new module layer and load the class from it.
Refer to the following piece of code:
private final BankController loadController(final BankConfig config) {
    System.out.println("Loading bank with config : " + JSON.toJson(config));
    try {
        // The current ModuleLayer is usually the boot layer, but it can differ if you are using multiple layers.
        ModuleLayer currentModuleLayer = this.getClass().getModule().getLayer(); //ModuleLayer.boot();
        final Set<Path> modulePathSet = Set.of(new File("path of implementation").toPath());
        // ModuleFinder to find modules on the given path
        final ModuleFinder moduleFinder = ModuleFinder.of(modulePathSet.toArray(new Path[0]));
        // I really don't know why it requires an empty finder.
        final ModuleFinder emptyFinder = ModuleFinder.of(new Path[0]);
        // Module names to be loaded
        final Set<String> moduleNames = moduleFinder.findAll().stream()
                .map(moduleRef -> moduleRef.descriptor().name())
                .collect(Collectors.toSet());
        // Unless you need a URLClassLoader for a Tomcat-like situation, use the current class loader.
        final ClassLoader loader = this.getClass().getClassLoader();
        // Derive a new configuration from the current module layer's configuration.
        final Configuration configuration = currentModuleLayer.configuration()
                .resolveAndBind(moduleFinder, emptyFinder, moduleNames);
        // New module layer derived from the current module layer
        final ModuleLayer moduleLayer = currentModuleLayer.defineModulesWithOneLoader(configuration, loader);
        // Find the module and load the implementation class.
        final Class<?> controllerClass = moduleLayer.findModule("org.util.npci.coreconnect").get()
                .getClassLoader().loadClass("org.util.npci.coreconnect.CoreController");
        // Create a new instance of the implementation; here org.util.npci.coreconnect.CoreController implements org.util.npci.api.BankController.
        final BankController bankController = (BankController) controllerClass.getConstructors()[0].newInstance(config);
        return bankController;
    } catch (Exception e) {
        BootLogger.info(e);
    }
    return null;
}
Reference : https://docs.oracle.com/javase/9/docs/api/java/lang/module/Configuration.html
I have a project with multiple modules in Android Studio. A module may have a dependency on another module, for example:
Module PhoneApp -> Module FeatureOne -> Module Services
I've included my annotation processing in the root module, but the android-apt annotation processing occurs only at the topmost level (PhoneApp), which should theoretically have access to all the modules at compile time. However, what I'm seeing in the generated Java file is only the classes annotated in PhoneApp and none from the other modules.
PhoneApp/build/generated/source/apt/debug/.../GeneratedClass.java
In the other modules, I am finding a generated file in the intermediates directory that contains only the annotated files from that module.
FeatureOne/build/intermediates/classes/debug/.../GeneratedClass.class
FeatureOne/build/intermediates/classes/debug/.../GeneratedClass.java
My goal is to have a single generated file in PhoneApp that allows me to access the annotated files from all modules. I'm not entirely sure why the code generation process runs for each module and fails to aggregate all annotations in PhoneApp. Any help appreciated.
The code is fairly simple and straightforward so far; checkIsValid() is omitted as it works correctly:
Annotation Processor:
@Override
public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
    try {
        for (Element annotatedElement : roundEnv.getElementsAnnotatedWith(GuiceModule.class)) {
            if (checkIsValid(annotatedElement)) {
                AnnotatedClass annotatedClass = new AnnotatedClass((TypeElement) annotatedElement);
                if (!annotatedClasses.containsKey(annotatedClass.getSimpleTypeName())) {
                    annotatedClasses.put(annotatedClass.getSimpleTypeName(), annotatedClass);
                }
            }
        }
        if (roundEnv.processingOver()) {
            generateCode();
        }
    } catch (ProcessingException e) {
        error(e.getElement(), e.getMessage());
    } catch (IOException e) {
        error(null, e.getMessage());
    }
    return true;
}
private void generateCode() throws IOException {
    PackageElement packageElement = elementUtils.getPackageElement(getClass().getPackage().getName());
    String packageName = packageElement.isUnnamed() ? null : packageElement.getQualifiedName().toString();
    ClassName moduleClass = ClassName.get("com.google.inject", "Module");
    ClassName contextClass = ClassName.get("android.content", "Context");
    TypeName arrayOfModules = ArrayTypeName.of(moduleClass);
    MethodSpec.Builder methodBuilder = MethodSpec.methodBuilder("juice")
            .addParameter(contextClass, "context")
            .addModifiers(Modifier.PUBLIC, Modifier.STATIC)
            .returns(arrayOfModules);
    methodBuilder.addStatement("$T<$T> collection = new $T<>()", List.class, moduleClass, ArrayList.class);
    for (String key : annotatedClasses.keySet()) {
        AnnotatedClass annotatedClass = annotatedClasses.get(key);
        ClassName className = ClassName.get(annotatedClass.getElement().getEnclosingElement().toString(),
                annotatedClass.getElement().getSimpleName().toString());
        if (annotatedClass.isContextRequired()) {
            methodBuilder.addStatement("collection.add(new $T(context))", className);
        } else {
            methodBuilder.addStatement("collection.add(new $T())", className);
        }
    }
    methodBuilder.addStatement("return collection.toArray(new $T[collection.size()])", moduleClass);
    TypeSpec classTypeSpec = TypeSpec.classBuilder("FreshlySqueezed")
            .addModifiers(Modifier.PUBLIC, Modifier.FINAL)
            .addMethod(methodBuilder.build())
            .build();
    JavaFile.builder(packageName, classTypeSpec)
            .build()
            .writeTo(filer);
}
This is just for a demo of annotation processing that works with Guice, if anyone is curious.
So how can I get all the annotated classes to be included in the generated PhoneApp .java file from all modules?
It's never too late to answer a question on SO, so...
I faced a very similar complication during one of my tasks at work.
And I was able to resolve it.
Short version
All you need to know in moduleA about the generated classes from moduleB is their package and class names. That information can be stored in some kind of generated MyClassesRegistrar class placed in a known package. Use suffixes to avoid name clashes, look the registrars up by package, instantiate them, and use the data they carry.
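As an illustration of the idea (all names here are hypothetical; the generated code in the actual project will differ), a registrar generated in a library module might look like this:
// Generated into a well-known package, e.g. com.example.generated, with a per-module suffix.
public final class HelpersRegistrarMyModuleName implements HelpersRegistrar {

    @Override
    public List<HelperEntry> entries() {
        return Arrays.asList(
                // event package, event class name, generated helper class name
                new HelperEntry("com.example.mymodule.events", "LoginEvent", "LoginEventHelper"),
                new HelperEntry("com.example.mymodule.events", "PurchaseEvent", "PurchaseEventHelper"));
    }
}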
Long version
First of all: you will NOT be able to include your compile-time-only dependency ONLY in the topmost module (let's call it the "app" module, as a typical Android project structure does). Annotation processing just does not work that way and, as far as I could find out, nothing can be done about this.
Now to the details. My task was this:
I have human-written annotated classes; I'll call them "events". At compile time I need to generate helper classes for those events that incorporate their structure and content, both statically available (annotation values, constants, etc.) and runtime available (I pass event objects to those helpers when using the latter). The helper class name is derived from the event class name plus a suffix, so I don't know it until code generation has finished.
So after the helpers are generated I create a factory and generate code that provides a new helper instance for a given MyEvent.class. Here's the problem: I only needed one factory in the app module, but it has to be able to provide helpers for events from the library modules, and this can't be done in a straightforward way.
What I did was:
skip generating the factory for modules that my app module depends upon;
in non-app modules generate so-called HelpersRegistrar implementation(s):
– they all share the same package (you'll see why later);
– their names don't clash because of the suffix (see below);
– differentiation between the app module and a library module is done via the javac "-Amylib.suffix=MyModuleName" param, which the user MUST set; this is a limitation, but a minor one. No suffix is specified for the app module;
– the generated HelpersRegistrar implementation provides everything I need for generating the factory later: event class name, helper class name, and package (the two share a package for package visibility between helper and event), all Strings incorporated in a POJO;
in the app module I generate helpers as usual, then obtain the HelpersRegistrars by their package, instantiate them, and run through their content to enrich my factory with code that provides helpers from the other modules. All I needed for this was the class names and a package.
Voilà! My factory can provide instances of helpers both from the app module and from other modules.
The only uncertainty left is the order in which processor-class instances are created and run in the app module and in the other modules. I have not found any solid info on this, but running my example shows that the compiler (and therefore code generation) first runs in the modules we depend upon, and then in the app module (otherwise compilation of the app module would be f..cked). This gives us reason to expect a known order of processor executions across modules.
Another, slightly similar approach: skip the registrars, generate factories in all modules, and write the factory in the app module to use the other factories, which you find and name the same way as the registrars above.
An example can be seen here: https://github.com/techery/janet-analytics - this is a library where I applied this approach (the one without registrars, since I have factories, but that may not be the case for you).
P.S.: the suffix param can be switched to a simpler "-Amylibraryname.library=true", and the factory/registrar names can be auto-generated/incremented.
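For reference, a processor reads such -A options through its ProcessingEnvironment; a minimal sketch (the option name mirrors the hypothetical one above):
@SupportedOptions("mylib.suffix")
public abstract class SuffixAwareProcessor extends AbstractProcessor {

    protected String moduleSuffix() {
        // Returns null in the app module, where no suffix is passed.
        return processingEnv.getOptions().get("mylib.suffix");
    }
}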
Instead of using the Filer to save the generated file, use regular Java file writing. You will need to serialize objects to temp files during processing, because even static variables won't be preserved between modules. Configure Gradle to delete the temp files before compilation.
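One way to do that cleanup, sketched here as a hypothetical Gradle plugin snippet (the temp directory location is an assumption), is to wipe the directory before every compilation:
public class CleanProcessorTempPlugin implements Plugin<Project> {
    @Override
    public void apply(Project project) {
        File tempDir = new File(project.getBuildDir(), "processor-temp"); // hypothetical temp location
        // Delete the processor's temp files right before each compile task runs.
        project.getTasks().withType(JavaCompile.class).configureEach(task ->
                task.doFirst(t -> project.delete(tempDir)));
    }
}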
I'm using Reflections to find classes that have a specific annotation. My project structure is the following:
One WAR package:
WEB-INF/classes/...packages.../ClassAnnoted1.class
One JAR package that is included by the war that has a class that executes this code:
Reflections reflections = new Reflections(ClasspathHelper.forWebInfClasses(servletContext));
Set set = reflections.getTypesAnnotatedWith(CustomAnnotation.class);
CustomAnnotation is also present in the JAR package.
The set size is correct (i.e. if I have 3 classes with the annotation in my WAR and the JAR, the set size comes back as 3), but all elements inside it are null instead of Class objects. I need to get the class and check the annotation parameters inside the JAR's class.
Anyone got any idea of why this is happening?
EDIT:
Reflections reflections = new Reflections("com.my.customAnnotededClasses"); //package that my annotated class is in
Set set = reflections.getTypesAnnotatedWith(CustomAnnotation.class);
This also does not work; in this case the set size is zero instead of the number of classes with the annotation.
EDIT 2:
Ok, the real problem was that I was packaging my whole application as an EAR so I had the following:
EAR
----> WAR
----> JAR
The JAR was included in the EAR lib folder and not in the WAR lib folder, so the JAR classes couldn't see the WAR classes. Once I made the WAR depend on the JAR directly, like this:
EAR
----> WAR
---------> JAR
It started working. But the original question still stands: there might be situations where I want the JAR included in the EAR instead of the WAR (if I have multiple WARs that need to use my JAR, for instance).
I guess I can't do it using the Reflections library, so I did it by hand:
public static List<Class<?>> getClassesAnnotatedWith(Class annotation, ServletContext servletContext) {
    List<Class<?>> webClasses, jarClasses;
    webClasses = getClassesAnnotedWithFromClassLoader(annotation, servletContext.getClassLoader());
    jarClasses = getClassesAnnotedWithFromClassLoader(annotation, Thread.currentThread().getContextClassLoader());
    for (Class<?> jarClass : jarClasses) {
        // Only add the JAR class if no WAR class with the same name was already found.
        boolean alreadyPresent = false;
        for (Class<?> webClass : webClasses) {
            if (jarClass.getName().equals(webClass.getName())) {
                alreadyPresent = true;
                break;
            }
        }
        if (!alreadyPresent) {
            webClasses.add(jarClass);
        }
    }
    return webClasses;
}

private static List<Class<?>> getClassesAnnotedWithFromClassLoader(Class annotation, ClassLoader classLoader) {
    List<Class<?>> classes = new ArrayList<Class<?>>();
    // Walk up the hierarchy until we reach java.lang.ClassLoader, where the "classes" field is declared.
    Class<?> classLoaderClass = classLoader.getClass();
    while (!classLoaderClass.getName().equals("java.lang.ClassLoader")) {
        classLoaderClass = classLoaderClass.getSuperclass();
    }
    try {
        Field fldClasses = classLoaderClass.getDeclaredField("classes");
        fldClasses.setAccessible(true);
        Vector<Class<?>> classesVector = (Vector<Class<?>>) fldClasses.get(classLoader);
        for (Class c : classesVector) {
            if (c.isAnnotationPresent(annotation)) {
                classes.add(c);
            }
        }
    } catch (Exception e) {
        // Ignored: if the field is missing or inaccessible, an empty list is returned.
    }
    return classes;
}
I get the ClassLoader from my WAR package through the ServletContext object. There is also a protection in case a class is defined in both the WAR and the JAR with the annotation and same name (you should probably check if the packages are the same too though).
Note that you should probably never use this code in your own projects (maybe only for debugging). It involves reflecting the ClassLoader class to make the "classes" field accessible. This field might not exist in Java 9, for example, so beware. It might also cause security problems if you are interacting with modules written by third parties.
I had a similar problem once. Are you sure you included the annotation classes in your classpath? If they are not loaded, they will somehow be found but not actually returned, without any exception or anything.
The Reflections library gave me various problems. Now I am using the reflection part of the Guava library: until now, no unexpected behavior has occurred.
In any case, I think that it is very rare that the source of the problem is the Java classloader.
Maybe try to load the CustomAnnotation class before using it in the Reflections API.
Your code should work in conventional environments.
However, in different environments, such as OSGi, you get:
1) URLs with a different protocol (bundle/vfs/...)
2) a different class loader.
In the first case, you should a) add the relevant UrlType (see the DefaultUrlTypes in Vfs for examples), or b) use a different method to get the URLs (see the other methods in ClasspathHelper and examine the returned URL list).
In the second case, you should a) pass the custom class loader to the Reflections constructor or ConfigurationBuilder so that resolution happens against it, or b) query the store directly: reflections.getStore().get(TypeAnnotationsScanner.class)
see also #8339845, JbossIntegration
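As a sketch of option a) from the second case above (the package name is just a placeholder reused from the question), configuring Reflections with an explicit class loader could look like this:
ClassLoader webClassLoader = servletContext.getClassLoader();
Reflections reflections = new Reflections(new ConfigurationBuilder()
        .setUrls(ClasspathHelper.forPackage("com.my.customAnnotededClasses", webClassLoader))
        .addClassLoaders(webClassLoader)
        .setScanners(new TypeAnnotationsScanner(), new SubTypesScanner()));
Set<Class<?>> annotated = reflections.getTypesAnnotatedWith(CustomAnnotation.class);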