How to handle version conflicts with my Java application using SPI extensions

I am writing a plugin API for a Java application. The idea is that third parties will eventually provide their own plugin extensions for the application, and all the user needs to do is place the plugin jar into the application's plugins directory. For the impatient, my question in short: how do I handle possible version conflicts when a plugin was built against a different API version than the one on the system? See below for details of my situation and what I have considered.
I have read a number of articles on service provider interfaces (SPI) and have something working with them. The problem comes when considering how to deal with differing version combinations.
I am aware of the technique of adding extension interfaces when an API grows, rather than changing the existing interface (e.g. API 1.0 has MyInterface, API 1.1 adds MyInterface2 with the new methods, and so on). With this technique, if the user has the latest API then older plugins should work fine, but what happens if the user has an old API and newer plugins?
As an example, the user has API 1.0, which only has MyInterface, but installs a binary plugin compiled against API 1.1 whose provider class implements MyInterface2. Whilst the application may only ever call plugins through MyInterface, what happens if the plugin internally calls MyInterface2? Will this cause an error or exception, and when (i.e. when the class is loaded, or when the method from MyInterface2 is called)? Also, is this behaviour standard across JVMs, or may it differ depending on the JVM used?
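To make the scenario concrete, here is a rough sketch of what I mean (only MyInterface and MyInterface2 are the real names from above; the plugin class, the host loop, and the assumption that MyInterface2 extends MyInterface are placeholders):

interface MyInterface {                         // present since API 1.0
    void doSomething();
}

interface MyInterface2 extends MyInterface {    // added in API 1.1
    void doSomethingNew();
}

// Plugin jar compiled against API 1.1, registered as a MyInterface provider
// via META-INF/services (fully qualified interface name in practice).
class ThirdPartyPlugin implements MyInterface2 {
    public void doSomething() { doSomethingNew(); }   // internally relies on the 1.1 method
    public void doSomethingNew() { /* new behaviour */ }
}

// Host application, possibly shipping only API 1.0:
class PluginHost {
    public static void main(String[] args) {
        for (MyInterface plugin : java.util.ServiceLoader.load(MyInterface.class)) {
            plugin.doSomething();   // where exactly does this fail if MyInterface2 is absent at runtime?
        }
    }
}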
Finally, would it be better to use a plugin framework, and would such a framework be able to check version requirements? Searching the internet I found PF4J on GitHub; a quick look at its source code suggests it supports some sort of version checking.

Related

Classloader problem with JAXB internal implementation interface com.sun.xml.bind.namespacePrefixMapper

I need help with the following problem: I use WebSphere Liberty 19.0.0.9 with Oracle and IBM Java 1.8 and run an older application (EAR) containing an EJB which serializes XML with JAXB. The application needs to control XML namespace definitions and prefixes, and does this by passing an implementation of com.sun.xml.bind.marshaller.NamespacePrefixMapper to javax.xml.bind.Marshaller.setProperty with the property name "com.sun.xml.bind.namespacePrefixMapper".
At runtime the error java.lang.NoClassDefFoundError: com/sun/xml/bind/marshaller/NamespacePrefixMapper occurs when loading the implementation class.
The server.xml contains the feature javaee-8.0, and Liberty's JAXB implementation wlp-19.0.0.9\lib\com.ibm.ws.jaxb.tools.2.2.10_1.0.32.jar contains the class com.sun.xml.bind.marshaller.NamespacePrefixMapper.
I tried to solve it by putting jaxb-impl-2.2.4.jar into EAR/lib (which is the wrong way, because JAXB is provided by Java EE), but then an error occurred in com.sun.xml.bind.v2.runtime.MarshallerImpl.setProperty(MarshallerImpl.java:511): the check if(!(value instanceof NamespacePrefixMapper)) failed, because the classloader of the implementation (AppClassLoader) provided a different class object for NamespacePrefixMapper than the MarshallerImpl's classloader (org.eclipse.osgi.internal.loader.EquinoxClassLoader). But this showed that Liberty can access the NamespacePrefixMapper.
I made several attempts to load the implementation and the MarshallerImpl with the same classloader, and I tried to solve it via classloader settings in server.xml. No success.
I know that it is not recommended to use such JAXB implementation specific classes, but the application was developed this way and cannot be changed easily.
Any help is appreciated that tells me how to convince Liberty either to provide the NamespacePrefixMapper class to the application classloader, or to use the application classloader's NamespacePrefixMapper in the MarshallerImpl as well.
Thank you.
// The implementation class looks, for example, like this:
public class MyNamespacePrefixMapperImpl extends com.sun.xml.bind.marshaller.NamespacePrefixMapper { ... }

// Usage in the EJB (SomeMappedClass stands in for one of the JAXB-mapped classes):
JAXBContext c = JAXBContext.newInstance(SomeMappedClass.class);
Marshaller m = c.createMarshaller();
com.sun.xml.bind.marshaller.NamespacePrefixMapper mapper = new MyNamespacePrefixMapperImpl(); // Here the NoClassDefFoundError occurs.
m.setProperty("com.sun.xml.bind.namespacePrefixMapper", mapper); // Here the instanceof check fails if jaxb-impl.jar is in EAR/lib.
This is a precarious situation without an easy solution. Liberty attempts to "hide" internal packages to avoid scenarios where users want a slightly different version of an implementation than what the framework provides. The most glaring example of this problem was in traditional WAS, where users wanted a different version of Jakarta Commons Logging than what shipped with WAS; this required users to provide their own, either in an isolated shared library or via parent-last classloading hacks. Liberty avoids those issues by isolating the internal implementations from the user applications.
So that works great when a user wants to use a different version of a third party library than what Liberty provides, but as you have discovered, that doesn't work so great when your legacy application depends on those hidden/isolated third party libraries.
The ideal solution would be to refactor the application code so that it does not depend on internal JAXB classes - somebody with more JAXB expertise may be able to help with this... But it sounds like that may not be feasible, so an alternative would be to create a user feature. A user feature is essentially an extension to Liberty's runtime, so it has access to packages that user applications cannot see. It also allows you to expose packages as APIs to user applications - so you could use a user feature to expose the com.sun.xml.bind.marshaller package as a public API, and then your application could extend its classes freely. You could also include your MyNamespacePrefixMapperImpl class in the user feature and register it there so that it would automatically apply to all applications in your server.
You can find more information on user features here:
https://www.ibm.com/support/knowledgecenter/en/SSEQTP_liberty/com.ibm.websphere.wlp.doc/ae/twlp_feat_example.html
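As a rough illustration only (the feature name, bundle symbolic name and version ranges below are invented, and the exact manifest headers should be checked against that documentation), the feature manifest of such a user feature could expose the package roughly like this:

Subsystem-ManifestVersion: 1
IBM-Feature-Version: 2
Subsystem-SymbolicName: com.example.jaxbInternals-1.0; visibility:=public
Subsystem-Version: 1.0.0
Subsystem-Type: osgi.subsystem.feature
Subsystem-Content: com.example.jaxb.prefixmapper.bundle; version="[1.0.0,2.0.0)"
IBM-API-Package: com.sun.xml.bind.marshaller; type="api"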
Hope this helps, Andy

Maximum Reusability for Two Implementations with Different Dependencies

I have a task that includes migrating an API gateway from Zuul to Spring Cloud Gateway. There are currently two main versions: 1.0.1.RELEASE and 2.0.0.RC1. The first version is very basic and I'd have to manually implement filters related to rate limiting, authentication, and so on, while the second version has all the features we need, with complete YAML support. We have a strict rule in the company never to use beta or RC versions, and we need the first version of the gateway to be in production within a couple of weeks, so there is not enough time to wait for the final release of version 2.
My team leader specifically asked me to build two versions of the module, one using version 1.0.1 of SCG and one using 2.0.0. How do I implement the module for maximum reusability? I mean, I want switching between the two versions to be as easy as possible, and I want to reuse as much of the logic as I can. The first thing that came to my mind is simply to create two separate projects. What do you think?
As I understand the question, you want an easy transition from version 1.0.1.RELEASE to 2.0.0.RC1 of some dependency.
I would approach it as follows:
Create 3 modules (or projects):
api
bindings-1
bindings-2
The api module contains the API which you'll define to access functions of the dependency.
The bindings-1 and bindings-2 modules both implement what's defined in api, but based on versions 1.0.1.RELEASE and 2.0.0.RC1 respectively.
Your code will use the dependency only and exclusively via the api module, with no direct access to the classes and methods provided by the dependency. I would not even include the dependency as a compile-time dependency of your main code. You then import bindings-1 or bindings-2 depending on which version you want to use.
Having a separate api module requires a certain effort, and it may seem over-engineered. But if you don't do this, bindings to the dependency will spread through your code, and switching from one version to another will be much more difficult.
With a dedicated api you will be forced to crystallize everything you need from the dependency in your api - in a version-independent manner.
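A minimal sketch of what I mean (the interface and its methods are invented placeholders; your real api should only contain the operations your gateway code actually needs):

// api module: no compile-time dependency on Spring Cloud Gateway at all.
public interface GatewayRouting {
    void addRoute(String id, String path, String targetUri);
    void removeRoute(String id);
}

// bindings-1 module: implements the api against 1.0.1.RELEASE.
// bindings-2 contains an equivalent class written against 2.0.0.RC1.
// Only one of the two bindings modules ends up on the application's classpath.
public class GatewayRoutingV1 implements GatewayRouting {
    @Override
    public void addRoute(String id, String path, String targetUri) {
        // translate the call into the 1.0.1.RELEASE-specific API here
    }
    @Override
    public void removeRoute(String id) {
        // ...
    }
}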
I would also not develop bindings-1/bindings-2 as SCM branches. It's not like you'll be merging them, so why branches?

Java package naming for versioning external APIs

Are there any conventions of how to name Java packages containing classes that consume an external, versioned API?
Let's assume we have a major-minor semantic versioning scheme of a service and we have to implement a consumer that is compatible with and bound to a specific version of that API. What are the best practices to name the packages (and classes)?
Currently, we're using the scheme ${service}_${M}_${N} (with M = major version, N = minor version), for example com.example.theService_1_0.
But SonarQube complains that this does not match naming conventions.
Of course I can just disable this rule, but I wonder if there are any best-practices?
I'm looking for a general approach, not something specific to REST, because I've encountered consumer implementations for web services, REST and CORBA. And I'm not sure artifact versioning (as in Maven) works well here, because it relates to the version of the implementation and not to the version of the consumed API.
I know there are questions around api versioning, but those are about the producer, not the consumer.
Yes, Java package names have a strong and unambiguous convention for indicating dependencies' versions: don't.
If you change your application to use a new version of an external API, you create a new version of your application. The version number you are looking for is your application's version number. On many Java projects, dependencies' versions are managed in a Maven configuration file.
In any case, when classes use a particular API version, the class and package names have no business exposing this information, which would violate encapsulation, apart from anything else. These names have a different purpose.
Note that this is no different when you use HTTP/REST APIs or Java APIs. After all, I don’t think you’d name your class TheServiceWithLog4J_12_1_5. I hope not, at least.
You didn’t mention whether you have a requirement to support multiple versions of this external API at the same time. Even if you did, I still wouldn’t recommend exposing the API version number in the package name. Instead, use the package name to indicate why you have two versions, and the important difference.
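For example (hypothetical names), the consumer code can simply live in a package named after its role, with a sibling package only if an older service API genuinely has to be supported in parallel:

// Hypothetical layout: the package says what the code is for, not which API version it talks to.
// A sibling such as com.example.theservice.consumer.legacy would only exist while the old
// service API still has to be supported, and its name explains why it is there.
package com.example.theservice.consumer;

public class TheServiceClient {
    // talks to the currently supported version of the external service
}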

Java application accessing multiple versions of a library

I am writing a client (Eclipse RCP) that needs to be able to use multiple versions of a library (which encapsulates the backend interface). Each version of the library adds new classes/methods that are used by the client. If an older lib version is used, the client may reference new classes/methods that are not present in that version's bytecode, resulting in a NoClassDefFoundError.
So I am thinking about how to do this best. The simplest way to make it fail-safe, I guess, is to wrap all calls to such code in try/catch blocks. I was also thinking of writing a custom annotation to mark new code in the library source, and then issuing a compiler warning when such marked code is accessed from code that is not secured by a try/catch (can this be done with a custom annotation? I haven't written one before). Or can someone think of a more elegant approach?
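For illustration, the try/catch idea would look roughly like this (NewFeatureApi is a placeholder for any class that only exists in newer versions of the lib):

// Sketch of the "guarded call" idea. The guard has to live in a class that does not
// itself fail to link when the new classes are missing.
public final class NewFeatureAccess {

    /** @return true if the new feature was used, false if the installed lib is too old */
    public static boolean tryUseNewFeature(String input) {
        try {
            NewFeatureApi.process(input);                    // class/method may not exist at runtime
            return true;
        } catch (NoClassDefFoundError | NoSuchMethodError e) {
            return false;                                    // old lib version: degrade gracefully
        }
    }
}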
You can use a Maven repository to provide the different versions of your library.
The Maven Central repository is available for releases:
http://maven.apache.org/guides/mini/guide-central-repository-upload.html
You can also set up your own instance of Artifactory: http://www.jfrog.com/confluence/display/RTF/Configuring+Maven+Deployment+to+Artifactory

Extension functions and classpaths

I am aware that this is a rather long and detailed post. I would have made it shorter and simpler if I could. I'd be grateful for any advice or ideas.
Background
I am writing XSLT to transfer data between two applications - call them Source and Target.
My XSLT is called by an integration engine supplied by the supplier of the Source System - call them Source Co.
The integration engine updates the Target by calling an adapter included in the engine that wraps an API written by the supplier of the Target system - call them Target Co. This all runs inside a J2EE server.
When the integration engine is deployed to the Java EE server it copies the JAR file implementing the Target System API so that it will be on the engine's classpath.
My situation
Source Co's adapter that wraps Target Co's API only exposes a subset of that API. At times, my customers' business requirements can only be met by side-stepping the Source Co adapter and calling the API directly from Java.
I have achieved this already by:
Writing Java classes that accept and return DOM documents and call the API
Deploying them as JARs into the Java EE engine so that they will be on the classpath visible to the integration engine and hence my XSLT
Referencing the Java class from my XSLT by declaring a namespace for it and then calling the appropriate public static method via that namespace (a sketch of such a class follows below)
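Roughly, such a class is just a holder for public static methods working on DOM documents (the class name, method name and body are placeholders):

import org.w3c.dom.Document;

// Bridge between the XSLT and the Target Co API; called from XSLT via a namespace
// bound to this class (Saxon-style reflexive extension function).
public final class TargetApiBridge {

    private TargetApiBridge() { }

    public static Document createBusinessEntity(Document request) {
        // ... call the Target Co API here and build a DOM document for the response ...
        return request;   // placeholder
    }
}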
This all works well. However...
My Problem
Recent versions of Target Co's API have:
Removed several deprecated methods
Exposed access to additional business entities in the Target system
Source Co's adapter uses these removed methods and therefore it needs to have an old version of the Target Co API on its classpath.
A customer's latest business requirements can only be met by using the latest API to access these additional business entities.
My question
How can I have the latest version of the API on the classpath of my extension function without it being on the classpath for the integration engine's adapter?
I could use custom classloaders, of course. I've done this before to load a third-party database driver jar that would also have broken the integration engine if it had been on its classpath. In the database driver case, this involved using the custom classloader just once. In the current case, however, I can't see how to avoid having many, many calls to the classloader littered all over my code, and that feels very wrong.
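For reference, this is the kind of thing I mean: loading the newer API jar once through a dedicated classloader and funnelling all access through one facade (the jar path, class name and method names below are placeholders):

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

// Sketch of the custom-classloader approach: the newer Target API jar is loaded once,
// in isolation from the engine's classpath, and all access to it goes through reflection here.
public final class LatestApiLoader {

    private static final URLClassLoader LOADER;

    static {
        try {
            LOADER = new URLClassLoader(
                    new URL[] { new File("/opt/ext/target-api-latest.jar").toURI().toURL() },
                    null);  // null parent: only JDK classes are visible, so the engine's older API jar cannot leak in
        } catch (Exception e) {
            throw new ExceptionInInitializerError(e);
        }
    }

    public static Object newEntityService() throws Exception {
        // Anything crossing this boundary has to be exchanged as JDK types (DOM, String, ...),
        // because classes loaded here are distinct from any class the engine has loaded.
        Class<?> serviceClass = LOADER.loadClass("com.targetco.api.EntityService");
        return serviceClass.getDeclaredConstructor().newInstance();
    }
}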
Some technical details
Source System - SAP
Target System - Oracle's Primavera
Java EE Engine - Netweaver 7.2
XSLT Processor - Saxon
