I recently found contradictory documentation about whether the service loader will locate providers added to the module path after boot.
The documentation of ServiceLoader::reload:
public void reload()
Clear this loader's provider cache so that all providers will be reloaded.
After invoking this method, subsequent invocations of the iterator or stream methods will lazily locate providers (and instantiate in the case of iterator) from scratch, just as is done by a newly-created service loader.
This method is intended for use in situations in which new service providers can be installed into a running Java virtual machine.
This clearly indicates that service resolution is completely dynamic.
On the other hand, ModuleFinder::findAll contradicts it. "A ModuleFinder is used to find modules during resolution or service binding." — Javadoc
Set<ModuleReference> findAll()
Returns the set of all module references that this finder can locate.
A ModuleFinder provides a consistent view of the modules that it locates. If findAll is invoked several times then it will return the same (equals) result each time. For each ModuleReference element in the returned set then it is guaranteed that find will locate the ModuleReference if invoked to find that module.
According to this quote the resolution is fixed at module layer creation, which is actually expected, as the whole Java Platform Module System is static by design. If new providers required other modules, the existing module graph would have to be modified.
So my question is: Is the first quote leftover documentation from the Java 8 Javadoc, or are there really cases where I can add new providers dynamically?
Here I'm going to prove that the module finder docs are correct:
Class com.service.Service:
package com.service;
import java.util.ServiceLoader;
import java.util.stream.Collectors;
public interface Service {
public static void main(String[] args) throws InterruptedException {
ServiceLoader<Service> loader = ServiceLoader.load(Service.class);
for (int i = 0; i < 5; i++) {
System.out.print("Attempt " + (i + 1) + ": ");
System.out.println(loader
.stream()
.map(ServiceLoader.Provider::type)
.map(Object::toString)
.collect(Collectors.joining(", ")));
Thread.sleep(5000);
loader.reload();
}
}
}
module-info:
module service {
exports com.service;
uses com.service.Service;
}
In a different module, class com.provider.Provider:
package com.provider;
import com.service.Service;
public class Provider implements Service {
}
module-info:
module provider {
exports com.provider;
requires service;
provides com.service.Service with com.provider.Provider;
}
Here's a live GIF of what happens when I first run it without the provider on the module path. On the second run the provider is already there, and I'll try to remove it while running.
The service provider API works on top of Java class loaders, which are not dynamic by default. The class path is determined at JVM startup and is not updated afterwards: the JAR you are trying to delete is held open by the JVM and will not be released until shutdown. If you need different behavior, you'll need a custom class loader, like the ones used by JEE application servers for the deployment of webapps, or the class loaders in OSGi implementations.
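As a rough sketch of that custom class loader approach (everything here is illustrative: the plugins/provider.jar path is an assumption, and the JAR is expected to register its provider the class-path way, via a META-INF/services/com.service.Service entry rather than a module-info provides clause), new providers can be picked up after startup by handing ServiceLoader an explicit loader:
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Paths;
import java.util.ServiceLoader;
import com.service.Service;

public class PluginScanner {
    public static void main(String[] args) throws Exception {
        // Hypothetical location of a provider JAR dropped in after the JVM has started.
        URL[] urls = { Paths.get("plugins/provider.jar").toUri().toURL() };
        try (URLClassLoader pluginLoader =
                 new URLClassLoader(urls, Service.class.getClassLoader())) {
            // Providers are located through this loader's META-INF/services resources,
            // not through the module graph that was fixed at boot.
            ServiceLoader<Service> loader = ServiceLoader.load(Service.class, pluginLoader);
            loader.forEach(s -> System.out.println("Found provider: " + s.getClass()));
        }
    }
}
Classes from that JAR live in the URLClassLoader's unnamed module, so swapping providers later is a matter of discarding this loader and creating a new one, which is essentially what application servers and OSGi containers do for you.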
suppose you have moduleA and moduleB. ModuleA defines an interface (for instance for a Service) and ModuleB has a concrete class that implements the interface (provides the service).
Now if the interface has a default method and you invoke it on the class in moduleB (from another module) is this invocation supposed to be performed inside moduleA or moduleB?
Apparently it is from moduleA ... what's the rationale?
Example: suppose you have a code that does this:
InputStream is = this.getClass().getResourceAsStream(fullPath);
If this code lies in the implementation of the service in moduleB, the stream will be opened. But if the code lies in the default method in moduleA, then when the service is invoked on moduleB you will need to have the resource's package "open" in moduleB (so it seems that the invocation is treated as coming from "outside" moduleB).
I would like to read about the reason for that.
Thanks.
Editing my question with an example.
Suppose you have in moduleA this code:
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public interface PropertiesProvider {
    default Properties get(String domain) {
        Class<?> clazz = this.getClass();
        System.out.println(" CLASS " + clazz);
        InputStream is = clazz.getResourceAsStream(domain);
        if (is != null) {
            Properties props = new Properties();
            try {
                props.load(is);
                return props;
            } catch (IOException e) {
                // log
            }
        }
        return null;
    }
}
and in moduleB
public class PropertiesProviderImpl implements PropertiesProvider {}
If you invoke the service from moduleA, the call is traced to come from the class PropertiesProviderImpl; it finds the resource but does not load it if it is not "opened".
If you copy the code into PropertiesProviderImpl, the call is traced to that same class; it finds the resource and loads it even when it is not "opened".
So my question is: why the difference, since the call comes from the same class?
(The difference being that in one case the method is kind-of inherited from the default method in the interface.)
Look at the documentation of getResourceAsStream: "If this class is in a named Module then this method will attempt to find the resource in the module."
In the first case your code (in moduleA) sees the Type but cannot see the class which implements your Type, because it is in moduleB. In the second case your code can see the class which "implements" the Type.
Look at the reference below; the most important sentences are:
In a modular setting the invocation of Class::forName will continue to work so long as the package containing the provider class is known to the context class loader. The invocation of the provider class’s constructor via the reflective newInstance method, however, will not work: The provider might be loaded from the class path, in which case it will be in the unnamed module, or it might be in some named module, but in either case the framework itself is in the java.xml module. That module only depends upon, and therefore reads, the base module, and so a provider class in any other module will not be accessible to the framework.
[...]
instead, revise the reflection API simply to assume that any code that reflects upon some type is in a module that can read the module that defines that type.
[Long answer]: reflective-readability
A framework is a facility that uses reflection to load, inspect, and instantiate other classes at run time [...]
Given a class discovered at run time, a framework must be able to access one of its constructors in order to instantiate it. As things stand, however, that will usually not be the case.
The platform’s streaming XML parser, e.g., loads and instantiates the implementation of the XMLInputFactory service named by the system property javax.xml.stream.XMLInputFactory, if defined, in preference to any provider discoverable via the ServiceLoader class. Ignoring exception handling and security checks the code reads, roughly:
String providerName
= System.getProperty("javax.xml.stream.XMLInputFactory");
if (providerName != null) {
Class providerClass = Class.forName(providerName, false,
Thread.currentThread().getContextClassLoader());
Object ob = providerClass.newInstance();
return (XMLInputFactory)ob;
}
// Otherwise use ServiceLoader
...
In a modular setting the invocation of Class::forName will continue to work so long as the package containing the provider class is known to the context class loader. The invocation of the provider class’s constructor via the reflective newInstance method, however, will not work: The provider might be loaded from the class path, in which case it will be in the unnamed module, or it might be in some named module, but in either case the framework itself is in the java.xml module. That module only depends upon, and therefore reads, the base module, and so a provider class in any other module will not be accessible to the framework.
To make the provider class accessible to the framework we need to make the provider’s module readable by the framework’s module. We could mandate that every framework explicitly add the necessary readability edge to the module graph at run time, as in an earlier version of this document, but experience showed that approach to be cumbersome and a barrier to migration.
We therefore, instead, revise the reflection API simply to assume that any code that reflects upon some type is in a module that can read the module that defines that type. This enables the above example, and other code like it, to work without change. This approach does not weaken strong encapsulation: A public type must still be in an exported package in order to be accessed from outside its defining module, whether from compiled code or via reflection.
Since we didn't understand the previous response precisely, we carried out some additional tests.
In each test the resource file is not "opened".
1)
The code invoking clazz.getResourceAsStream is in the default method of the interface defining the service. The class implementing the interface does not define any method.
-> this.getClass() yields the implementing class, but the test fails to find the resource.
2)
We added this code in the default method:
Object obj = clazz.getConstructor().newInstance();
and yes, it fails.
3) We changed the code so that PropertiesProvider is an abstract class and PropertiesProviderImpl inherits from it.
Same behaviour.
So yes, it means that the same code will behave differently depending on whether you inherit it or invoke it directly.
This is worrying: it means the inner logic of the language is going to lead to convoluted, byzantine behaviours (the reason why we dumped C++).
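For what it's worth, a minimal sketch of the module descriptor change that makes the first case work, assuming the properties resource sits in the package com.example.provider of moduleB and the interface lives in moduleA (all module and package names here are made up):
// module-info.java of moduleB (hypothetical names throughout)
module moduleB {
    requires moduleA;
    provides com.example.api.PropertiesProvider
        with com.example.provider.PropertiesProviderImpl;
    // Resources in a named module are encapsulated; opening the package that holds the
    // .properties file lets code compiled into another module (here: the default method
    // declared in moduleA) read it via Class::getResourceAsStream.
    opens com.example.provider;
}
When the same lookup code is compiled into PropertiesProviderImpl itself, the caller is moduleB, the access stays inside the module, and no opens directive is needed, which matches the behaviour observed in the tests above.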
I'm currently writing an application that has to operate on different types of devices. My approach would be to make a "modular" application that can dynamically load different classes according to the device they need to operate on.
To make the application easily extensible, my goal is to assign a specific path to the additional modules (either .jar or .class files), leaving the core program as it is. This would be crucial when different customers require different modules (without having to compile a different application for each of them).
These modules would implement a common interface, while the "core" application can use the methods defined on the interface and let the individual implementations do the work. What's the best way to load them on demand? I was considering URLClassLoader, but I don't know if this approach is up to date with current patterns and Java trends, as I would like to avoid a poorly designed application and deprecated techniques. What's the best approach to make a modular and easily extensible application with JDK 9, one that can be extended just by adding module files to a folder?
In addition to the ServiceLoader usage given by @SeverityOne, you can use module-info.java to declare the different implementations of the interface, using the "uses"/"provides" keywords.
Then you use a module path instead of a class path; it loads all the directories containing your modules, so you don't need to create a specific class loader.
The ServiceLoader usage:
public static void main(String[] args) {
ServiceLoader<IGreeting> sl = ServiceLoader.load(IGreeting.class);
IGreeting greeting = sl.findFirst().orElseThrow(NullPointerException::new);
System.out.println( greeting.regular("world"));
}
In the users project:
module pl.tfij.java9modules.app {
exports pl.tfij.java9modules.app;
uses pl.tfij.java9modules.app.IGreeting;
}
In the provider project:
module pl.tfij.java9modules.greetings {
requires pl.tfij.java9modules.app;
provides pl.tfij.java9modules.app.IGreeting
with pl.tfij.java9modules.greetings.Greeting;
}
And finally the CLI usage
java --module-path mods --module pl.tfij.java9modules.app
Here is an example: GitHub example (thanks to the "tfij/" repository for the initial example).
Edit: I realized the repository already provides decoupling examples:
https://github.com/tfij/Java-9-modules---reducing-coupling-of-modules
It sounds like you might want to use the ServiceLoader class, which has been available since Java 6. However, bear in mind that, if you want to use Spring dependency injection, this is probably not what you want.
There are two scenarios.
Implementation JARs are on the class path
In this scenario you can simply use the ServiceLoader API (refer to @pdem's answer).
Implementation JARs are not on the class path
Let's assume BankController is your interface and CoreController is your implementation.
If you want to load the implementation dynamically from a dynamic path, create a new module layer and load the class.
Refer to the following piece of code:
private final BankController loadController(final BankConfig config) {
    System.out.println("Loading bank with config : " + JSON.toJson(config));
    try {
        // The current ModuleLayer is usually the boot layer, but it can be different
        // if you are using multiple layers.
        ModuleLayer currentModuleLayer = this.getClass().getModule().getLayer(); // ModuleLayer.boot();
        final Set<Path> modulePathSet = Set.of(new File("path of implementation").toPath());
        // ModuleFinder to find modules on that path
        final ModuleFinder moduleFinder = ModuleFinder.of(modulePathSet.toArray(new Path[0]));
        // resolveAndBind takes a "before" and an "after" finder; an empty "after" finder is fine here
        final ModuleFinder emptyFinder = ModuleFinder.of(new Path[0]);
        // Names of the modules to be loaded (used as root modules)
        final Set<String> moduleNames = moduleFinder.findAll().stream()
                .map(moduleRef -> moduleRef.descriptor().name())
                .collect(Collectors.toSet());
        // Unless you need a URLClassLoader for a Tomcat-like situation, use the current class loader
        final ClassLoader loader = this.getClass().getClassLoader();
        // Derive a new configuration from the current module layer's configuration
        final Configuration configuration = currentModuleLayer.configuration()
                .resolveAndBind(moduleFinder, emptyFinder, moduleNames);
        // New module layer derived from the current module layer
        final ModuleLayer moduleLayer = currentModuleLayer.defineModulesWithOneLoader(configuration, loader);
        // Find the module and load the implementation class
        final Class<?> controllerClass = moduleLayer.findModule("org.util.npci.coreconnect")
                .get().getClassLoader().loadClass("org.util.npci.coreconnect.CoreController");
        // Create a new instance of the implementation; in this case
        // org.util.npci.coreconnect.CoreController implements org.util.npci.api.BankController
        final BankController bankController = (BankController) controllerClass.getConstructors()[0].newInstance(config);
        return bankController;
    } catch (Exception e) {
        BootLogger.info(e);
    }
    return null;
}
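As a side note (not part of the original answer): once the new layer exists, ServiceLoader can locate the implementation by service type instead of loading the class reflectively, provided the implementation module declares provides org.util.npci.api.BankController with ... and the provider exposes a no-arg constructor or a static provider() method (unlike the config-taking constructor used above):
// Sketch: service lookup against the freshly created layer.
ServiceLoader<BankController> providers = ServiceLoader.load(moduleLayer, BankController.class);
BankController bankController = providers.findFirst()
        .orElseThrow(() -> new IllegalStateException("no BankController provider in layer"));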
Reference : https://docs.oracle.com/javase/9/docs/api/java/lang/module/Configuration.html
General idea: I'm writing a loader for Java that allows dynamically reloading classes in order to change an implementation without restarting the entire program, keeping the main application running and minimizing downtime.
Every external piece of code is grouped into "modules"; each module has a main class with "onEnable, postEnable, onDisable" entry/exit points and can consist of any number of classes. To load a module, the class containing the entry point is specified, then loaded.
I'll reference them as "modules" and "additional classes" in the following: "module" being the class containing the above-mentioned functions by implementing the "public interface Module", and "additional classes" referring to everything the module uses at runtime that isn't a Module by itself (e.g. we have a module called "Car implements Module", and that module requires a class "Engine" to function -> "Car" is the module, "Engine" is an additional class).
Code of what I'm doing to load a module initially (name is a String containing the full classname including path, example given later):
Class<?> clazz = mainLoader.loadClass(name);
Module module = (Module) clazz.newInstance();
addLoadedModule(module);
enableLoadedModule(module);
And here's how I reload the module when it's already existing, so that I can override the implementation. "m" is an instance of the current implementation of the Module that is supposed to be reloaded.
boolean differs = false;
Class<?> newClass = null;
try (URLClassLoader cl = new URLClassLoader(urls, mainLoader.getParent()))
{
// Try to load the class and check if it differs from the already known one
newClass = cl.loadClass(m.getClass().getName());
differs = m.getClass() != newClass;
}
catch (IOException | ClassNotFoundException e)
{
// Class couldn't be found, abort.
e.printStackTrace();
return;
}
if (!differs)
{
// New class == old class -> no need to reload it
return;
}
Module module = null;
try
{
// Try to instantiate the class
module = (Module) newClass.newInstance();
}
catch (InstantiationException | IllegalAccessException e)
{
// Can't instantiate, abort
e.printStackTrace();
return;
}
// Check versions, only reload if the new implementation's version differs from the current one. Version is a custom annotation, don't worry about that; the version check works fine
Version oldVersion = m.getClass().getAnnotation(Version.class);
Version newVersion = module.getClass().getAnnotation(Version.class);
if (oldVersion.equals(newVersion))
{
return;
}
// And if everything went well, disable and remove the old module from the list, then add and enable the new module.
disableModule(m);
modules.remove(m);
modules.put(module, false);
enableLoadedModule(module);
This is the mainLoader; urls is a URL[] pointing to the location containing the external classes to load:
mainLoader = new URLClassLoader(urls, this.getClass().getClassLoader());
The problem arises when I try to RE-load an implementation, that requires multiple classes:
Module of class A requires class B to function. This is what happens when I try to dynamically load, then reload class A:
load A -> "Sure, but I'll need B with it." -> automatically loads B -> "Here ya go, A works fine now."
reload A -> "Sure, but I'll need B with it." -> crashes because B couldn't be found
Both classes are located in the exact same folder, structure like this:
Class A implements Module: com/foo/bar/A.class
Class B: com/foo/bar/B.class
urls: ["com/foo/bar/"]
I call the function with load("com.foo.bar.A"), which works when attempting to load it the first time, but fails when trying to reload it as described above.
It works fine when trying to load a "single class module"; the problem arises when the module relies on an additional external class. I tried using different class loaders as the parent for the URLClassLoader in the reloading process: the system class loader, Module.class.getClassLoader(), mainLoader (using that one, it won't ever find the new class definition because it already knows about it and therefore won't even attempt to load it from the drive again), mainLoader.getParent(), the class loader of the old module, and the parent of the module's class loader.
I'm probably just overlooking something obvious, but I can't figure out why it manages to load the "extra" classes the first time, but fails when I reload the base class...
If you need any debug outputs or exact errors, let me know. I replaced the debug outputs with comments explaining what does what, so I have a fairly detailed log of what's happening when, but it didn't seem necessary to include it: the code goes through the entire "check and then load" process just fine and crashes when trying to enable the module. The "onEnable" method of the module requires the additional class B, and that's where it fails. As I said, if you need the implementation of the classes A and B, Module, any other code, or the debug outputs, let me know and I'll add them in as requested.
There are a few things you can try:
Create an extension of URLClassLoader so that you can track when it loads a class and which class loader is used to load it; see the sketch below.
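A minimal sketch of such a tracking loader (the class name and the logging are my own choices):
import java.net.URL;
import java.net.URLClassLoader;

// Logs every class it is asked to load and which loader ended up defining it.
public class TracingClassLoader extends URLClassLoader {
    public TracingClassLoader(URL[] urls, ClassLoader parent) {
        super(urls, parent);
    }

    @Override
    protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
        Class<?> c = super.loadClass(name, resolve);
        System.out.println("loadClass(" + name + ") -> defined by " + c.getClassLoader());
        return c;
    }
}
Using it in place of the plain URLClassLoader during the reload shows whether B is resolved by the new loader or delegated to a parent that still holds the old definition.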
Your other issue: make sure none of these classes are available on the "default" class path, as that will cause that version to be used. You are not overriding the default class-loading behaviour, which is to check the parent for the class first.
The other issue you're probably facing relates to the way the VM caches classes. I'm not entirely sure how this works, but from what I've experienced it seems that once a class is loaded it is put in a shared storage space so that it is not loaded again. A class in this shared space will not be unloaded until the class loader that loaded it becomes unreachable.
The solution lies in the class loader being closed and deleted as soon as the loading of the initial class is done, because the class loader only exists within the try-with-resources block. I solved the issue by storing the class loader in a map until a new implementation of the module is loaded; then I can discard the old loader and store the new one instead.
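A rough sketch of that bookkeeping, with illustrative names: keep the loader that defined the current implementation alive, and close it only once its replacement has been loaded.
import java.io.IOException;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.HashMap;
import java.util.Map;

public class ModuleLoaders {
    // One live class loader per module name, kept open while that module is in use.
    private final Map<String, URLClassLoader> loaders = new HashMap<>();

    Class<?> reload(String moduleClassName, URL[] urls, ClassLoader parent) throws ClassNotFoundException {
        URLClassLoader fresh = new URLClassLoader(urls, parent);
        Class<?> newClass = fresh.loadClass(moduleClassName);
        URLClassLoader previous = loaders.put(moduleClassName, fresh);
        if (previous != null) {
            try {
                previous.close(); // safe to release only after the new implementation is loaded
            } catch (IOException ignored) {
            }
        }
        return newClass;
    }
}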
I have a project with multiple modules in Android Studio. A module may have a dependency on another module, for example:
Module PhoneApp -> Module FeatureOne -> Module Services
I've included my annotation processing in the root module, but the android-apt annotation processing occurs only at the topmost level (PhoneApp), so it should theoretically have access to all the modules at compile time. However, what I'm seeing in the generated Java file is only the classes annotated in PhoneApp and none from the other modules.
PhoneApp/build/generated/source/apt/debug/.../GeneratedClass.java
In the other modules, I am finding a generated file in the intermediates directory that contains only the annotated files from that module.
FeatureOne/build/intermediates/classes/debug/.../GeneratedClass.class
FeatureOne/build/intermediates/classes/debug/.../GeneratedClass.java
My goal is to have a single generated file in PhoneApp that allows me to access the annotated files from all modules. I'm not entirely sure why the code generation process runs for each module and fails to aggregate all annotations at PhoneApp. Any help appreciated.
The code is fairly simple and straightforward so far; checkIsValid() is omitted as it works correctly:
Annotation Processor:
@Override
public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
try {
for (Element annotatedElement : roundEnv.getElementsAnnotatedWith(GuiceModule.class)) {
if (checkIsValid(annotatedElement)) {
AnnotatedClass annotatedClass = new AnnotatedClass((TypeElement) annotatedElement);
if (!annotatedClasses.containsKey(annotatedClass.getSimpleTypeName())) {
annotatedClasses.put(annotatedClass.getSimpleTypeName(), annotatedClass);
}
}
}
if (roundEnv.processingOver()) {
generateCode();
}
} catch (ProcessingException e) {
error(e.getElement(), e.getMessage());
} catch (IOException e) {
error(null, e.getMessage());
}
return true;
}
private void generateCode() throws IOException {
PackageElement packageElement = elementUtils.getPackageElement(getClass().getPackage().getName());
String packageName = packageElement.isUnnamed() ? null : packageElement.getQualifiedName().toString();
ClassName moduleClass = ClassName.get("com.google.inject", "Module");
ClassName contextClass = ClassName.get("android.content", "Context");
TypeName arrayOfModules = ArrayTypeName.of(moduleClass);
MethodSpec.Builder methodBuilder = MethodSpec.methodBuilder("juice")
.addParameter(contextClass, "context")
.addModifiers(Modifier.PUBLIC, Modifier.STATIC)
.returns(arrayOfModules);
methodBuilder.addStatement("$T<$T> collection = new $T<>()", List.class, moduleClass, ArrayList.class);
for (String key : annotatedClasses.keySet()) {
AnnotatedClass annotatedClass = annotatedClasses.get(key);
ClassName className = ClassName.get(annotatedClass.getElement().getEnclosingElement().toString(),
annotatedClass.getElement().getSimpleName().toString());
if (annotatedClass.isContextRequired()) {
methodBuilder.addStatement("collection.add(new $T(context))", className);
} else {
methodBuilder.addStatement("collection.add(new $T())", className);
}
}
methodBuilder.addStatement("return collection.toArray(new $T[collection.size()])", moduleClass);
TypeSpec classTypeSpec = TypeSpec.classBuilder("FreshlySqueezed")
.addModifiers(Modifier.PUBLIC, Modifier.FINAL)
.addMethod(methodBuilder.build())
.build();
JavaFile.builder(packageName, classTypeSpec)
.build()
.writeTo(filer);
}
This is just for a demo of annotation processing that works with Guice, if anyone is curious.
So how can I get the annotated classes from all modules included in the generated .java file in PhoneApp?
It's never too late to answer a question on SO, so...
I faced a very similar complication during one of my tasks at work, and I was able to resolve it.
Short version
All you need to know about the generated classes from moduleB in moduleA is the package and class name. That can be stored in some kind of generated MyClassesRegistrar class placed in a known package. Use suffixes to avoid name clashes, and get the registrars by package. Instantiate them and use the data they carry.
Long version
First of all, you will NOT be able to include your compile-time-only dependency ONLY in the topmost module (let's call it the "app" module, as your typical Android project structure does). Annotation processing just does not work that way and, as far as I could find out, nothing can be done about this.
Now to the details. My task was this:
I have human-written annotated classes; I'll name them "events". At compile time I need to generate helper classes for those events to incorporate their structure and content (both statically available (annotation values, constants, etc.) and runtime available (I pass event objects to those helpers when using the latter)). The helper class name depends on the event class name plus a suffix, so I don't know it until code generation has finished.
So after the helpers are generated, I create a factory and generate code to provide a new helper instance based on the MyEvent.class provided. Here's the problem: I only needed one factory, in the app module, but it has to be able to provide helpers for events from library modules as well; this can't be done straightforwardly.
What I did was:
skip generating the factory for the modules that my app module depends upon;
in non-app modules, generate so-called HelpersRegistrar implementation(s):
– they all share the same package (you'll know why later);
– their names don't clash because of the suffix (see below);
– differentiation between the app module and a library module is done via the javac "-Amylib.suffix=MyModuleName" param, which the user MUST set - this is a limitation, but a minor one. No suffix must be specified for the app module;
– the generated HelpersRegistrar implementation can provide all I need for generating the factory code later: event class name, helper class name, package (these two share a package for package-visibility between helper and event) - all Strings, incorporated in a POJO;
in the app module I generate helpers as usual, then I obtain the HelperRegistrars by their package, instantiate them, and run through their content to enrich my factory with code that provides helpers from other modules. All I needed for this was class names and a package.
Voilà! My factory can provide instances of helpers both from app module and from other modules.
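A minimal sketch of what the registrar contract could look like (the names and the POJO shape are my assumptions, not the exact code behind this answer):
import java.util.List;

// Hand-written contract shared by all modules (e.g. shipped with the processor's API artifact).
public interface HelpersRegistrar {
    List<HelperEntry> entries();

    // Plain POJO carrying only Strings, as described above.
    final class HelperEntry {
        public final String packageName;
        public final String eventClassName;
        public final String helperClassName;

        public HelperEntry(String packageName, String eventClassName, String helperClassName) {
            this.packageName = packageName;
            this.eventClassName = eventClassName;
            this.helperClassName = helperClassName;
        }
    }
}
A library module compiled with -Amylib.suffix=FeatureOne would then get a generated implementation such as HelpersRegistrarFeatureOne in the shared, well-known package; the app-module processor looks that package up, instantiates each registrar, and feeds its entries into the factory's code generation.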
The only uncertainty left is the order of creating and running the processor instances in the app module and in the other modules. I have not found any solid info on this, but running my example shows that the compiler (and, therefore, code generation) first runs in the modules that we depend upon, and then in the app module (otherwise compilation of the app module would be f..cked). This gives us reason to expect a known order of processor executions across modules.
Another, slightly similar, approach is this: skip the registrars, generate factories in all modules, and write the factory in the app module to use the other factories, which you obtain and name the same way as the registrars above.
An example can be seen here: https://github.com/techery/janet-analytics - this is a library where I applied this approach (the one without registrars, since I have factories, but that may not be the case for you).
P.S.: the suffix param can be switched to a simpler "-Amylibraryname.library=true" and the factory/registrar names can be autogenerated/incremented.
Instead of using Filer to save the generated file, use regular Java file writing. You will need to serialize objects to temp files during processing, because even static variables are not preserved between modules. Configure Gradle to delete the temp files before compilation.
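A rough sketch of that idea (the registryFile option name and the helper class are made up): each module's compilation appends what it found to a plain file outside the Filer, the app module reads the accumulated list in its final round, and Gradle deletes the file before compilation so stale entries don't survive.
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import javax.annotation.processing.ProcessingEnvironment;

// Helpers an annotation processor could use instead of Filer; pass -AregistryFile=<path> to javac.
final class AnnotatedClassRegistry {

    // Called from process() in every module: append the collected class names.
    static void append(ProcessingEnvironment env, Collection<String> classNames) throws IOException {
        Path registry = Paths.get(env.getOptions().get("registryFile"));
        Files.createDirectories(registry.getParent());
        Files.write(registry, classNames, StandardCharsets.UTF_8,
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
    }

    // Called from the app module's final round: read everything accumulated so far.
    static List<String> readAll(ProcessingEnvironment env) throws IOException {
        Path registry = Paths.get(env.getOptions().get("registryFile"));
        return Files.exists(registry)
                ? Files.readAllLines(registry, StandardCharsets.UTF_8)
                : Collections.emptyList();
    }
}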
I have written this project and already use it in other libraries of mine.
However, I find something amiss. Namely, in each user of this library, I create a utility class whose only role is to provide one or more MessageBundles. And this sucks.
I'd like to have, built into the library, a mechanism that lets library users register and recall bundles.
My first idea would be to have a singleton factory with a .register() and .get() method (with appropriate checks for duplicate keys etc) and call these from within static initialization blocks...
... But there is a problem: there is no guarantee as to which static initialization block will be called first.
Knowing that I'd like to keep the dependencies of this library "intact" (which is to say, no external dependency at all), what solution would you recommend?
(note: this is Java 6+)
You could use the standard support for service providers: ServiceLoader. You would simply require each user of your library to provide an implementation of some interface, for example
public interface MessageBundleProvider {
List<MessageBundle> getBundles();
}
The name of the class implementing this interface would have to be specified in a file named META-INF/services/com.example.MessageBundleProvider inside the JAR of the user library.
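For instance, a user library could ship something like this (the class, package, and helper names are purely illustrative):
// Hypothetical provider in a user library; adjust the imports to the library's real packages.
package com.example.userlib;

import java.util.Collections;
import java.util.List;

import com.example.MessageBundle;
import com.example.MessageBundleProvider;

public final class UserLibMessageBundleProvider implements MessageBundleProvider {
    @Override
    public List<MessageBundle> getBundles() {
        // Build the bundle exactly as the existing utility class does; placeholder here.
        return Collections.singletonList(createUserLibBundle());
    }

    private MessageBundle createUserLibBundle() {
        throw new UnsupportedOperationException("illustrative placeholder");
    }
}
plus a resource file META-INF/services/com.example.MessageBundleProvider whose single line is com.example.userlib.UserLibMessageBundleProvider.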
At runtime, your library would automatically discover all the message bundle providers using the following code:
private static final ServiceLoader<MessageBundleProvider> LOADER
= ServiceLoader.load(MessageBundleProvider.class);
private static final List<MessageBundle> BUNDLES;
static {
BUNDLES = new ArrayList<MessageBundle>();
for (MessageBundleProvider provider : LOADER) {
for (MessageBundle bundle : provider.getBundles()) {
BUNDLES.add(bundle);
}
}
}
Disclaimer: I know that ServiceLoader exists, but I've never used it before. It's how all the standard Java service providers are discovered, though (like JDBC drivers, charset providers, etc.).