I'm trying to determine the best way to create a new instance of a class based on which classes are available on the classpath at runtime.
For example, I have a library that requires a JSON response to be parsed in multiple classes. The library has the following interface:
JsonParser.java:
public interface JsonParser {
<T> T fromJson(String json, Class<T> type);
<T> String toJson(T object);
}
This interface has multiple implementations, e.g. GsonJsonParser, JacksonJsonParser, and Jackson2JsonParser, and currently the user of the library is required to "pick" the implementation to use based on which JSON library they've included in their project. For example:
JsonParser parser = new GsonJsonParser();
SomeService service = new SomeService(parser);
What I'd like to do is dynamically detect which library is on the classpath and create the proper instance, so that the user of the library doesn't have to think about it (or even have to know that some other class parses JSON internally).
I'm considering something similar to the following:
try {
Class.forName("com.google.gson.Gson");
return new GsonJsonParser();
} catch (ClassNotFoundException e) {
// Gson isn't on classpath, try next implementation
}
try {
Class.forName("com.fasterxml.jackson.databind.ObjectMapper");
return new Jackson2JsonParser();
} catch (ClassNotFoundException e) {
// Jackson 2 was not found, try next implementation
}
// repeated for all implementations
throw new IllegalStateException("You must include either Gson or Jackson on your classpath to utilize this library");
Would this be an appropriate solution? It seems like kind of a hack, and it uses exceptions for flow control.
Is there a better way to do this?
Essentially you want to create your own JsonParserFactory. We can see how it's implemented in the Spring Boot framework:
public static JsonParser getJsonParser() {
if (ClassUtils.isPresent("com.fasterxml.jackson.databind.ObjectMapper", null)) {
return new JacksonJsonParser();
}
if (ClassUtils.isPresent("com.google.gson.Gson", null)) {
return new GsonJsonParser();
}
if (ClassUtils.isPresent("org.yaml.snakeyaml.Yaml", null)) {
return new YamlJsonParser();
}
return new BasicJsonParser();
}
So your approach is nearly the same as this, except for the use of the ClassUtils.isPresent method.
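If you don't want to pull in Spring just for that, an isPresent-style helper is easy to write yourself; here is a minimal sketch (not Spring's actual implementation):
private static boolean isPresent(String className) {
    try {
        // initialize = false: probe for the class without running its static initializers
        Class.forName(className, false, Thread.currentThread().getContextClassLoader());
        return true;
    } catch (ClassNotFoundException | LinkageError e) {
        return false;
    }
}
Your selection logic then becomes a chain of plain if statements, just like the Spring Boot version above.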
This sounds like a perfect case for the Service Provider Interface (SPI) pattern. Check out the java.util.ServiceLoader documentation for an example of how to implement it.
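A minimal sketch of what that could look like for your JsonParser, assuming each implementation jar ships a META-INF/services file named after the JsonParser interface's fully qualified name and listing its implementation class:
public static JsonParser getJsonParser() {
    // returns the first implementation registered on the classpath
    for (JsonParser parser : ServiceLoader.load(JsonParser.class)) {
        return parser;
    }
    throw new IllegalStateException("No JsonParser implementation found on the classpath");
}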
If only one of the implementations (GsonJsonParser, JacksonJsonParser, Jackson2JsonParser) would be present at runtime and there is no other option, then you'd have to use Class.forName().
You can handle it a bit more cleverly, though.
For example, you can put all the candidate class names into a Set<String> and loop over them, as sketched below. For any name that throws ClassNotFoundException you just continue; for the one that doesn't, you create the corresponding parser.
Yes, it is a hack, and your code becomes dependent on those library names. If there is any chance that you can include all three JsonParser implementations on your classpath and use some logic to decide which implementation to use, that would be a much better approach.
If this is not possible, you can continue with the approach above.
Also, instead of the plain Class.forName(String name), you can use the overload Class.forName(String name, boolean initialize, ClassLoader loader) with initialize = false, which will NOT run any static initializers (if present in the class); for the loader, pass something like getClass().getClassLoader().
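A rough sketch of that idea, using a map from marker class name to parser factory rather than a bare Set so that the match can be turned into an instance (the class names come from the question, the rest is illustrative):
// candidate detection classes mapped to the parser that should be used when they are present
Map<String, Supplier<JsonParser>> candidates = new LinkedHashMap<>();
candidates.put("com.google.gson.Gson", GsonJsonParser::new);
candidates.put("com.fasterxml.jackson.databind.ObjectMapper", Jackson2JsonParser::new);

for (Map.Entry<String, Supplier<JsonParser>> entry : candidates.entrySet()) {
    try {
        // initialize = false: only test for presence, without running static initializers
        Class.forName(entry.getKey(), false, JsonParser.class.getClassLoader());
        return entry.getValue().get();
    } catch (ClassNotFoundException e) {
        // not on the classpath, try the next candidate
    }
}
throw new IllegalStateException("No supported JSON library found on the classpath");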
The simple approach is the one SLF4J uses: create a separate wrapper library per underlying JSON library (GSON, Jackson, etc.) with a com.mypackage.JsonParserImpl class that delegates to the underlying library. Put the appropriate wrapper in the classpath alongside the underlying library. Then you can get the current implementation like:
public JsonParser getJsonParser() {
    // you probably want to cache the result instead of looking it up every time
    try {
        return (JsonParser) Class.forName("com.mypackage.JsonParserImpl")
                .getDeclaredConstructor().newInstance();
    } catch (ReflectiveOperationException e) {
        throw new IllegalStateException("No JsonParser implementation found on the classpath", e);
    }
}
This approach uses the class loader to locate the JSON parser. It is the simplest and requires no 3rd party dependencies or frameworks. I see no drawbacks to it relative to Spring, Service Provider, or any other method of locating resources.
Alternatively, use the Service Provider API, as Daniel Pryden suggests. To do this, you still create a separate wrapper library per underlying JSON library. Each library includes a text file at location "META-INF/services/com.mypackage.JsonParser" whose content is the fully qualified name of the JsonParser implementation in that library. Then your getJsonParser method would look like:
public JsonParser getJsonParser() {
return ServiceLoader.load(JsonParser.class).iterator().next();
}
IMO this approach is unnecessarily more complex than the first.
Related
Can I do it with reflection or something like that?
I have been searching for a while and there seems to be different approaches, here is a summary:
The Reflections library is pretty popular, if you don't mind adding the dependency. It would look like this:
Reflections reflections = new Reflections("firstdeveloper.examples.reflections");
Set<Class<? extends Pet>> classes = reflections.getSubTypesOf(Pet.class);
ServiceLoader (as per erickson's answer), which would look like this:
ServiceLoader<Pet> loader = ServiceLoader.load(Pet.class);
for (Pet implClass : loader) {
System.out.println(implClass.getClass().getSimpleName()); // prints Dog, Cat
}
Note that for this to work you need to define Pet as a Service Provider Interface (SPI) and declare its implementations. You do that by creating a file in resources/META-INF/services named examples.reflections.Pet and declaring all implementations of Pet in it:
examples.reflections.Dog
examples.reflections.Cat
A package-level annotation. Here is an example:
Package[] packages = Package.getPackages();
for (Package p : packages) {
MyPackageAnnotation annotation = p.getAnnotation(MyPackageAnnotation.class);
if (annotation != null) {
Class<?>[] implementations = annotation.implementationsOfPet();
for (Class<?> impl : implementations) {
System.out.println(impl.getSimpleName());
}
}
}
and the annotation definition:
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.PACKAGE)
public @interface MyPackageAnnotation {
    Class<?>[] implementationsOfPet() default {};
}
You must declare the package-level annotation in a file named package-info.java inside that package. Here are sample contents:
@MyPackageAnnotation(implementationsOfPet = {Dog.class, Cat.class})
package examples.reflections;
Note that only packages that are known to the ClassLoader at that time will be loaded by a call to Package.getPackages().
In addition, there are other approaches based on URLClassLoader, but they will always be limited to classes that have already been loaded, unless you do a directory-based search.
What erickson said, but if you still want to do it then take a look at Reflections. From their page:
Using Reflections you can query your metadata for:
get all subtypes of some type
get all types annotated with some annotation
get all types annotated with some annotation, including annotation parameters matching
get all methods annotated with some annotation
In general, it's expensive to do this. To use reflection, the class has to be loaded. If you want to load every class available on the classpath, that will take time and memory, and isn't recommended.
If you want to avoid this, you'd need to implement your own class-file parser that operates more efficiently, instead of using reflection. A bytecode engineering library may help with this approach.
The Service Provider mechanism is the conventional means to enumerate implementations of a pluggable service, and has become more established with the introduction of Project Jigsaw (modules) in Java 9. Use the ServiceLoader in Java 6, or implement your own in earlier versions. I provided an example in another answer.
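Under the module system the same contract can also be declared in module descriptors instead of (or alongside) META-INF/services files. A rough sketch, with made-up module and package names:
// module-info.java of a module that ships an implementation
module com.example.pets.dogs {
    requires com.example.pets.api;
    provides com.example.pets.api.Pet with com.example.pets.dogs.Dog;
}
// module-info.java of the module that looks implementations up via ServiceLoader
module com.example.pets.app {
    requires com.example.pets.api;
    uses com.example.pets.api.Pet;
}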
Spring has a pretty simple way to achieve this:
public interface ITask {
void doStuff();
}
@Component
public class MyTask implements ITask {
public void doStuff(){}
}
Then you can autowire a list of type ITask and Spring will populate it with all implementations:
@Service
public class TaskService {
@Autowired
private List<ITask> tasks;
}
The most robust mechanism for listing all classes that implement a given interface is currently ClassGraph, because it handles the widest possible array of classpath specification mechanisms, including the new JPMS module system. (I am the author.)
try (ScanResult scanResult = new ClassGraph().whitelistPackages("x.y.z")
.enableClassInfo().scan()) {
for (ClassInfo ci : scanResult.getClassesImplementing("x.y.z.SomeInterface")) {
foundImplementingClass(ci); // Do something with the ClassInfo object
}
}
With ClassGraph it's pretty simple:
Groovy code to find implementations of my.package.MyInterface:
@Grab('io.github.classgraph:classgraph:4.6.18')
import io.github.classgraph.*
new ClassGraph().enableClassInfo().scan().withCloseable { scanResult ->
scanResult.getClassesImplementing('my.package.MyInterface').findAll{!it.abstract}*.name
}
What erickson said is best. Here's a related question-and-answer thread: http://www.velocityreviews.com/forums/t137693-find-all-implementing-classes-in-classpath.html
The Apache BCEL library allows you to read classes without loading them. I believe it will be faster because you should be able to skip the verification step. The other problem with loading all classes using the classloader is that you will suffer a huge memory impact as well as inadvertently run any static code blocks which you probably do not want to do.
The Apache BCEL library link - http://jakarta.apache.org/bcel/
Yes, the first step is to identify "all" the classes that you care about. If you already have this information, you can enumerate through each of them and use instanceof to validate the relationship. A related article is here: https://web.archive.org/web/20100226233915/www.javaworld.com/javaworld/javatips/jw-javatip113.html
Also, if you are writing an IDE plugin (where what you are trying to do is relatively common), then the IDE typically offers you more efficient ways to access the class hierarchy of the current state of the user code.
I ran into the same issue. My solution was to use reflection to examine all of the methods in an ObjectFactory class, eliminating those that were not createXXX() methods returning an instance of one of my bound POJOs. Each class discovered this way is added to a Class[] array, which is then passed to the JAXBContext instantiation call. This performs well, needing only to load the ObjectFactory class, which was about to be needed anyway. I only need to maintain the ObjectFactory class, a task either performed by hand (in my case, because I started with POJOs and used schemagen) or generated as needed by xjc. Either way, it is performant, simple, and effective.
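A rough sketch of that scan (class and package names assumed; adjust the filter to however your ObjectFactory is generated):
List<Class<?>> boundClasses = new ArrayList<>();
for (Method m : ObjectFactory.class.getDeclaredMethods()) {
    // keep the no-argument createXxx() factory methods whose return type lives in the
    // same package as the ObjectFactory, i.e. the bound POJOs
    if (m.getName().startsWith("create")
            && m.getParameterCount() == 0
            && m.getReturnType().getPackage() == ObjectFactory.class.getPackage()) {
        boundClasses.add(m.getReturnType());
    }
}
// throws JAXBException
JAXBContext context = JAXBContext.newInstance(boundClasses.toArray(new Class<?>[0]));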
A new version of @kaybee99's answer, but now returning what the OP asks for: the implementations...
Spring has a pretty simple way to achieve this:
public interface ITask {
void doStuff();
default ITask getImplementation() {
return this;
}
}
@Component
public class MyTask implements ITask {
public void doStuff(){}
}
Then you can autowire a list of type ITask and Spring will populate it with all implementations:
@Service
public class TaskService {
    @Autowired(required = false)
    private List<ITask> tasks;

    public void doAllStuff() {
        if (tasks != null) {
            for (ITask taskImpl : tasks) {
                taskImpl.doStuff();
            }
        }
    }
}
We are building a product that needs to run in production environments. We need to modify some of the functionality of an existing library. The existing library has classes and methods; we need to override one or more methods so that the caller uses our overridden methods instead of the original library's.
OriginalLibrary
package com.original.library ;
public class OriginalLibrary {
public int getValue() {
return 1 ;
}
public int getAnotherValue() {
return 2 ;
}
}
Original Client
public class MyClient {
private OriginalLibrary originalLibraryObject ;
public MyClient () {
originalLibraryObject = new OriginalLibrary() ;
System.out.println(originalLibraryObject.getValue()) ;
System.out.println(originalLibraryObject.getAnotherValue()) ;
}
}
Output
1
2
Now, I need to change getValue() to return 3, instead of 1
Needed Output
3
2
package com.original.library.improved ;
public class OriginalLibrary extends com.original.library.OriginalLibrary {
public int getValue() {
return 3 ;
}
public int getAnotherValue() {
return super.getAnotherValue() ;
}
}
If I do the above, I need to tell my Original Client to reorder and use my new com.original.library.improved jar file before com.original.library.
I am almost convinced that this is the least intrusive way to roll out my improved services on top of the OriginalLibrary. I would have preferred a solution where I only need to tell the customer to add my jar file, with no need to recompile or relink their client code.
Javassist is an excellent library for bytecode manipulation. I have modified the code below to match your sample code; you will have to explore Javassist further for your actual requirement.
CtClass etype = ClassPool.getDefault().get("com.original.library.OriginalLibrary");
// get method from class
CtMethod cm = etype.getDeclaredMethod("getValue");
// change the method body
cm.setBody("return 3;");
etype.rebuildClassFile();
// give the path where the classes are placed; in my Eclipse project it is bin
etype.writeFile("bin");
OriginalLibrary originalLibraryObject;
originalLibraryObject = new OriginalLibrary();
System.out.println(originalLibraryObject.getValue());
System.out.println(originalLibraryObject.getAnotherValue());
Now the output of getValue() is 3 because I changed the body of that method.
A couple of questions -
How is the client getting an instance of your library's class?
If they are using new OriginalLibrary(), then you're pretty much stuck with creating a new subclass of OriginalLibrary and then asking your client to use your new OriginalLibraryImproved class. This is a common problem encountered in projects and is one reason why a library should not allow its clients to instantiate its classes directly using the new operator.
If instead, your client is instantiating OriginalLibrary using a factory method provided by the library (say, OriginalLibrary.getInstance()), you may want to check if there are any hooks into the factory that allow you to change the object being returned.
Do you have full control of the source code of the original library?
If yes, then you definitely should (and I cannot emphasize this strongly enough) provide factory methods for any class in the library that is instantiable. Doing this allows you to change the actual object being returned without modifying the client (as long as the returned object's class is a subclass of the return value from the factory method).
If not, then I suggest you do the following.
Create a subclass of OriginalLibrary (say, OriginalLibraryImproved).
Create a Factory class named OriginalLibraryFactory that has a static method named getInstance(). Write code to return an instance of OriginalLibraryImproved from this method.
Ask your client to replace all occurrences of new OriginalLibrary() with OriginalLibraryFactory.getInstance(). Note that this approach will only involve adding an extra import for the factory class. The client will still refer to the returned instance using the same OriginalLibrary reference as before.
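A minimal sketch of steps 1 and 2, reusing the names from the question:
public class OriginalLibraryImproved extends OriginalLibrary {
    @Override
    public int getValue() {
        return 3;
    }
}

public class OriginalLibraryFactory {
    public static OriginalLibrary getInstance() {
        // swap in a different subclass here later without touching the client
        return new OriginalLibraryImproved();
    }
}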
The advantage of this approach is that it gives you complete flexibility to change the implementation details of OriginalLibraryImproved without affecting the client in any way. You could also swap OriginalLibraryImproved with a newer version like OriginalLibraryImprovedVer2 and the client will be oblivious to the fact that it is using a new class. You'll just have to make sure that OriginalLibraryImprovedVer2 subclasses OriginalLibrary.
An even more flexible approach is to use the Wrapper or Decorator pattern to avoid the pitfalls of inheritance. You can understand more about the Decorator pattern here.
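Since OriginalLibrary is a concrete class rather than an interface, a decorator here would still extend it for type compatibility, but it forwards to a wrapped instance instead of relying on super. Again just a sketch with the question's names:
public class OriginalLibraryDecorator extends OriginalLibrary {
    private final OriginalLibrary delegate;

    public OriginalLibraryDecorator(OriginalLibrary delegate) {
        this.delegate = delegate;
    }

    @Override
    public int getValue() {
        return 3; // changed behaviour
    }

    @Override
    public int getAnotherValue() {
        return delegate.getAnotherValue(); // everything else is forwarded
    }
}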
In a nutshell, try to avoid forcing your clients to use new and try to avoid inheritance unless you have very compelling reasons.
It sounds like a stupid question. However, in the API that I am using, version 1.7.2 has the method Bukkit.getServer().getOnlinePlayers() returning a Player[], and version 1.7.10 has it returning a Collection<Player>. I need to make my plugin compatible with both.
I have the APIs for both, but aside from creating separate plugins, I currently have no idea how to do this.
I currently just convert the collection into an array anyway. So, is there any way (I can already get the version) to skip the .toArray() call when the version is less than 1.7.9, since the method already returns an array there?
You could do it with reflection. Something like this:
List<Player> getPlayers() {
    try {
        Method method = Server.class.getMethod("getOnlinePlayers");
        Object result = method.invoke(Bukkit.getServer());
        if (result instanceof Player[])
            return Arrays.asList((Player[]) result);
        else
            return new ArrayList<>((Collection<Player>) result); // newer versions return a Collection
    } catch (ReflectiveOperationException e) {
        // something went wrong! If you have a better way to handle problems, do that instead
        throw new RuntimeException(e);
    }
}
There are actually two ways to do this. @immibis's answer gives the first way. The second way involves creating a "version adapter" API with multiple plugin implementations for different versions of (in this case) Bukkit; e.g.
public interface BukkitVersionAdapter {
Collection<Player> getOnlinePlayers();
...
}
public class BukkitVersionAdapterV1dot7 implements BukkitVersionAdapter {
public Collection<Player> getOnlinePlayers() {
return Arrays.asList(Bukkit.getServer().getOnlinePlayers());
}
...
}
public class BukkitVersionAdapterV1dot8 implements BukkitVersionAdapter {
public Collection<Player> getOnlinePlayers() {
return Bukkit.getServer().getOnlinePlayers();
}
...
}
The version-specific adapter classes then need to be compiled against the Bukkit API jars for the respective Bukkit versions.
Then when you start your main application (or Bukkit plugin I guess), you do something like this:
String version = // get the Bukkit version
String className = // map the version to a class name; e.g.
// "pkg.BukkitVersionAdapterV1dot7" or
// "pkg.BukkitVersionAdapterV1dot8"
Class clazz = Class.forName(className);
BukkitVersionAdapter adapter =
(BukkitVersionAdapter) clazz.newInstance();
// Then ...
Collection<Player> players = adapter.getOnlinePlayers();
This is all rather cumbersome, but it has two benefits compared with the approach of using reflection to make the calls:
The logic that is specific to different Bukkit versions is now isolated to one part of the codebase ... rather than being scattered (potentially) all over the place.
The overheads of reflection are only incurred at startup. After that, all of the method calls to Bukkit made via the adapter API are regular Java method calls.
Depending on the context, these may make the adapter approach more appropriate.
If the class org.bukkit.Server just changed the return type of the method getOnlinePlayers(), rather than deprecating it and introducing a new method (yuck! who dares to change an already published interface in such a way?), you have no clean way to call the method from a single piece of code. You simply cannot write something like
Server server = Bukkit.getServer();
// One of the branches will not be compilable
if (version <= xxxx) {
Player[] players = server.getOnlinePlayers();
}
else {
Collection<Player> players = server.getOnlinePlayers();
}
I'm afraid the only way is to use reflection, as others have recommended in the meantime.
You are doing well. Unfortunately there is no way to create a strongly typed method that can deal with both a collection and an array.
(You can create a method that accepts Object, examine it using instanceof, cast, and deal with both collections and arrays, but this solution is ugly; see the sketch below.)
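A sketch of that ugly-but-workable shape (you would still need reflection, as in the other answers, to obtain the result as an Object in the first place):
@SuppressWarnings("unchecked")
static Collection<Player> toPlayerCollection(Object result) {
    if (result instanceof Player[]) {
        return Arrays.asList((Player[]) result); // 1.7.2 and earlier: Player[]
    }
    return (Collection<Player>) result; // 1.7.10 and later: Collection<Player>
}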
Suppose I have a pojo:
import org.codehaus.jackson.map.*;
public class MyPojo {
int id;
public int getId()
{ return this.id; }
public void setId(int id)
{ this.id = id; }
public static void main(String[] args) throws Exception {
MyPojo mp = new MyPojo();
mp.setId(4);
ObjectMapper mapper = new ObjectMapper();
mapper.configure(SerializationConfig.Feature.WRAP_ROOT_VALUE, true);
System.out.println(mapper.getSerializationConfig().isEnabled(SerializationConfig.Feature.WRAP_ROOT_VALUE));
System.out.println(mapper.writeValueAsString(mp));
}
}
When I serialize using the Jackson ObjectMapper, I just get
true
{"id":4}
but I want
true
{"MyPojo":{"id":4}}
I've searched all over; Jackson's documentation is really disorganized and mostly out of date.
By adding the Jackson annotation @JsonTypeInfo at the class level you can get the expected output. I just added it; no other changes to your class.
package com.test.jackson;
import org.codehaus.jackson.map.ObjectMapper;
import org.codehaus.jackson.map.SerializationConfig;
import com.fasterxml.jackson.annotation.JsonTypeInfo;
import com.fasterxml.jackson.annotation.JsonTypeInfo.As;
import com.fasterxml.jackson.annotation.JsonTypeInfo.Id;
@JsonTypeInfo(include=As.WRAPPER_OBJECT, use=Id.NAME)
public class MyPojo {
// Remain same as you have
}
output:
{
"MyPojo": {
"id": 4
}
}
I'm not using Jackson, but while searching I found this configuration that seems to be what you want: WRAP_ROOT_VALUE
Feature that can be enabled to make root value (usually JSON Object but can be any type) wrapped within a single property JSON object, where key as the "root name", as determined by annotation introspector (esp. for JAXB that uses @XmlRootElement.name) or fallback (non-qualified class name). Feature is mostly intended for JAXB compatibility.
Default setting is false, meaning root
value is not wrapped.
So you can configure the mapper:
objectMapper.configure(SerializationConfig.Feature.WRAP_ROOT_VALUE, true);
I hope it helps you...
Below is a way to achieve this
Map<String, MyPojo> singletonMap = Collections.singletonMap("mypojo", mp);
System.out.println(mapper.writeValueAsString(singletonMap));
Output
{ "mypojo" : { "id" : 4}}
The advantage here is that we can give our own name for the root key of the JSON object. With the above code, mypojo will be the root key. This approach is most useful when we use a JavaScript template like Mustache.js to iterate over JSON objects.
To achieve this you need to use the @JsonTypeInfo annotation on your class, and in particular WRAPPER_OBJECT:
@JsonTypeName("foo")
@JsonTypeInfo(include = JsonTypeInfo.As.WRAPPER_OBJECT, use = JsonTypeInfo.Id.NAME)
public class Bar {
}
There is also a nice annotation for this (note that it takes effect when root value wrapping, i.e. WRAP_ROOT_VALUE, is enabled):
@JsonRootName(value = "my_pojo")
public class MyPojo{
...
}
will generate:
{
"my_pojo" : {...}
}
How about the simplest possible solution: just use a wrapper class like:
class Wrapper {
public MyPojo MyPojo;
}
and wrapping/unwrapping in your code?
Beyond this, it would help to know WHY you would like an additional JSON object entry like this. I know this is done by libs that emulate JSON via an XML API (because of the impedance between XML and JSON when converting from XML to JSON), but for pure JSON solutions it is usually not needed.
Is it to allow you to figure out what the actual type is?
If so, perhaps you could consider enabling polymorphic type handling, to let Jackson handle it automatically (see the 1.5 release notes, entry for PTH, for details).
There is another way I used that worked for me.
I am working with a third-party jar, so I have no control over annotations.
So I had to resort to a bit of a hack.
Override: org.codehaus.jackson.map.ser.BeanSerializerFactory.findBeanProperties(SerializationConfig, BasicBeanDescription)
Add your property as below
List<BeanPropertyWriter> props = super.findBeanProperties(config, beanDesc);
BeanPropertyWriter bpw = null;
try {
Class<?> cc = beanDesc.getType().getRawClass();
Method m = cc.getMethod("getClass");
bpw = new BeanPropertyWriter("$className", null, null, m, null,true, null);
} catch (SecurityException e) {
// TODO
} catch (NoSuchMethodException e) {
// TODO
}
props.add(bpw);
return props;
This way i get more control and can do other kind of filters too.
@JsonTypeInfo(include=As.WRAPPER_OBJECT, use=Id.NAME)
This annotation works perfectly, as suggested by Arun Prakash. I was trying to get json in this form:
{"Rowset":{"ROW":{"receiptno":"881604199388936","status":"SUCCESS"}}}
but getting like this:
{"ROW":{"receiptno":"881604199388936","status":"SUCCESS"}}
Now that annotation resolved my problem.
I would be interested in hearing the OP's solution for this. I'm having similar issues where my RESTful web service is serializing objects as either XML or JSON for clients. The Javascript clients need to know the wrapping type so that can parse it. Coupling the type to a URI pattern is not an option.
Thanks.
Edit: I noticed that Spring MappingJacksonJsonMarshaller adds the wrapping class when marshalling, so I stepped through the code in debug and noticed that Spring passes in a HashMap with a single key-value pair such that the key is the wrapping name and the value is the object. So, I extended JacksonJaxbJsonProvider, override the writeTo() method and added the following:
HashMap<String, Object> map = new HashMap<String, Object>();
map.put(value.getClass().getSimpleName(), value);
super.writeTo(map, type, genericType, annotations, mediaType, httpHeaders,entityStream);
It's a bit of a hack, but it works nicely.
Use withRootName:
ObjectWriter writer = objectMapper.writer().withRootName(MyPojo.class.getSimpleName()); // getSimpleName() gives "MyPojo" rather than the fully qualified name
System.out.println(writer.writeValueAsString(mp));
I have found through experience that it is a good idea for all JSON to include both the backend type (as a string) and the component type used to render it in the front end (if using something like angular or Vue).
The justification for doing this is so that you can process various types with a single set of code.
In vue, for example, having the name of the UI component in the data allows you, among other things, to have a screen rendering a list of children of different types using only a single tag in the parent template.
<component :is="child.componentType"/>.
For backend systems and web services - I prefer to use a single web service processor class that provides logging, auditing and exception handling for all web services by looking up the appropriate processor class based on the incoming payload. That makes the implementation of all my web services look exactly the same (about 3 lines of code), and I get detailed event logging through the lifecycle of the call without writing any per service code to do so.
Having the type wrapping the JSON makes it self documenting. If all you see are the properties, you have no idea what you are looking at until you find the corresponding end point.
If you want to write data driven software, being able to identify what you are processing is a basic requirement.
I refer to "service provider framework" as discussed in Chapter 2 of Effective Java, which seems like exactly the right way to handle a problem I am having, where I need to instantiate one of several classes at runtime, based on a String to select which service, and an Configuration object (essentially an XML snippet):
But how do I get the individual service providers (e.g. a bunch of default providers + some custom providers) to register themselves?
interface FooAlgorithm
{
/* methods particular to this class of algorithms */
}
interface FooAlgorithmProvider
{
public FooAlgorithm getAlgorithm(Configuration c);
}
class FooAlgorithmRegistry
{
private FooAlgorithmRegistry() {}
static private final Map<String, FooAlgorithmProvider> directory =
new HashMap<String, FooAlgorithmProvider>();
static public FooAlgorithmProvider getProvider(String name)
{
return directory.get(name);
}
static public boolean registerProvider(String name,
FooAlgorithmProvider provider)
{
if (directory.containsKey(name))
return false;
directory.put(name, provider);
return true;
}
}
e.g. if I write custom classes MyFooAlgorithm and MyFooAlgorithmProvider to implement FooAlgorithm, and I distribute them in a jar, is there any way to get registerProvider to be called automatically, or will my client programs that use the algorithm have to explicitly call FooAlgorithmRegistry.registerProvider() for each class they want to use?
I think you need to create a META-INF/services/fully.qualified.InterfaceName file and list the implementations there, but I don't remember the details (see the JAR File Specification).
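For the example in this question, the provider jar would carry a file like this (package names are made up for illustration):
META-INF/services/com.example.foo.FooAlgorithmProvider
with a single line as its contents:
com.example.custom.MyFooAlgorithmProvider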
Chapter 8 of the book Practical API Design: Confessions of a Java Framework Architect is about SPI.
The ServiceLoader might help you to list available implementations. For example with the PersistenceProvider interface:
ServiceLoader<PersistenceProvider> loader =
ServiceLoader.load(PersistenceProvider.class);
Iterator<PersistenceProvider> implementations = loader.iterator();
while(implementations.hasNext()) {
PersistenceProvider implementation = implementations.next();
logger.info("PersistenceProvider implementation: " + implementation);
}
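Applied to your FooAlgorithmRegistry, the registry could run that discovery once, for example in a static initializer, so neither clients nor providers have to call registerProvider by hand. This assumes you add some way to name a provider, say a getName() method on FooAlgorithmProvider (not part of your current interface):
static {
    for (FooAlgorithmProvider provider : ServiceLoader.load(FooAlgorithmProvider.class)) {
        directory.put(provider.getName(), provider);
    }
}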
You could have the client JAR register the providers in a static initializer block within some class that you know will be called before FooAlgorithmRegistry.getProvider(), something like:
static {
FooAlgorithmRegistry.registerProvider("test", new MyFooAlgorithmProvider());
}
But, it might be pretty hard to find a way to guarantee that this will run (static initializers are guaranteed to be run once and only once, when the class is first loaded) before the accessor method of the factory.