I need a way to version my Java service the same way a REST API is versioned. I want to use Semantic Versioning and manage my Java API lifecycle just as I would for REST.
Example:

String runService(String serviceVersion, String inputToService...)

Based on the version, the method dispatches to the appropriate implementation of the service.
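One minimal way to sketch that dispatch idea is a registry mapping version strings to implementations. This is only an illustration, assuming a single string input and output; the class and version strings are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Hypothetical sketch: a registry mapping a semantic version string
// to the implementation that handles requests for that version.
public class VersionedService {
    private final Map<String, Function<String, String>> impls = new HashMap<>();

    public VersionedService() {
        impls.put("1.0.0", input -> "v1 handled: " + input);
        impls.put("2.0.0", input -> "v2 handled: " + input);
    }

    public String runService(String serviceVersion, String input) {
        Function<String, String> impl = impls.get(serviceVersion);
        if (impl == null) {
            throw new IllegalArgumentException("Unsupported version: " + serviceVersion);
        }
        return impl.apply(input);
    }
}
```

Each map entry could of course delegate to a class loaded from a version-specific jar, which is where the classloader approach below would plug in.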
I am thinking of loading separate versions of the jar through a classloader, as described in Java, Classpath, Classloading => Multiple Versions of the same jar/project.
Is there a best way to do this, or should I switch to REST?
Related
I am writing a plugin API for a Java application, the idea being that eventually third parties will provide their own plugin extensions for the application and all the user needs to do is place the plugin jar into a plugins directory of the application. For the impatient, my question in short is how to handle possible version conflicts when the plugin relates to a different API than that on the system, see below for details of my situation and what I have thought about.
I have read a number of articles using service provider interfaces and have something working with that. The problem comes when considering how to deal with differing version combinations.
I am aware of the technique of when adding to an API adding extension interfaces, rather than changing the existing interface (eg. API 1.0 having MyInterface, API 1.1 adding MyInterface2 with the new methods, etc). With this technique if the user has the latest API then older plugins should work fine, but what happens if the user has an old API and newer plugins?
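For reference, the extension-interface technique described above looks roughly like this; only the interface names MyInterface/MyInterface2 come from the question, the method names are made up:

```java
// Shipped in API 1.0 (each interface would live in its own file).
interface MyInterface {
    String doWork(String input);
}

// Added in API 1.1, leaving the 1.0 interface untouched.
interface MyInterface2 extends MyInterface {
    String doMoreWork(String input);
}

// A host built against API 1.1 can probe what a plugin supports:
class Host {
    static String call(MyInterface plugin, String input) {
        if (plugin instanceof MyInterface2) {
            return ((MyInterface2) plugin).doMoreWork(input);
        }
        return plugin.doWork(input);
    }
}
```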
So, as an example, the user has API 1.0 with only MyInterface, but installs a binary plugin compiled against API 1.1 whose provider class implements MyInterface2. While the application may only ever call plugins through MyInterface, what happens if the plugin internally calls MyInterface2? Will this cause an error or exception, and when (i.e. when the class is loaded, or when the method from MyInterface2 is called)? Also, is this behaviour standard across JVMs, or may it differ depending on the JVM used?
Finally, would it be better to use a plugin framework, and would that be able to check version requirements? Searching the internet I found PF4J on GitHub. A quick look at the source code suggests it may support some sort of version checks.
I have a task that includes migrating an API Gateway from Zuul to Spring Cloud Gateway. There are two main versions currently: 1.0.1.RELEASE and 2.0.0.RC1. The first version is very basic and I'd have to manually implement filters related to rate limiting, authentication, etc...
While the second version has all the features we need with complete YML support. We have a strict rule in the company to never use beta or RC, and we need the first version of the gateway to be in production within a couple of weeks so there is not enough time to wait for the final release of version 2.
My team leader specifically asked me to build the module against both version 1.0.1 and version 2.0.0 of SCG. How do you implement the module for maximum reusability? I want switching between the two versions to be as easy as possible, and I want to reuse as much of the logic as I can. The first thing that came to mind is simply to create two separate projects. What do you think?
As I understand the question, you want an easy transition from the version 1.0.1.RELEASE to 2.0.0.RC1 of some dependency.
I would approach it as follows:
Create 3 modules (or projects):
api
bindings-1
bindings-2
The api module contains the API which you'll define to access functions of the dependency.
The bindings-1 and bindings-2 modules both implement what's defined in api, but against versions 1.0.1.RELEASE and 2.0.0.RC1 respectively.
Your code will use the dependency only and exclusively via the api module, with no direct access to the classes and methods provided by the dependency. I would not even include the dependency as a compile-time dependency of your main code. You then import bindings-1 or bindings-2 depending on which version you want to use.
Having a separate api module requires some effort and may seem over-engineered. But if you don't do this, bindings to the dependency will diffuse through your code, and switching from one version to another will be much more difficult.
With a dedicated api you will be forced to crystallize everything you need from the dependency in your api - in a version-independent manner.
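As a sketch of that structure, the api module might expose a version-independent interface that both bindings implement. The interface and class names below are hypothetical, not from Spring Cloud Gateway:

```java
// api module: a version-independent facade over the gateway features
// your application actually needs.
interface GatewayRateLimiter {
    boolean allow(String clientId);
}

// bindings-1 (built against SCG 1.0.1.RELEASE) would implement the
// facade with hand-rolled filter logic; bindings-2 (against 2.0.0)
// would delegate to the built-in support. Shown here as a stub.
class Bindings1RateLimiter implements GatewayRateLimiter {
    @Override
    public boolean allow(String clientId) {
        // real version: consult the 1.0.1-based rate-limiting filter
        return true;
    }
}
```

Swapping versions then means swapping which bindings module is on the classpath; the application code compiles only against the api module.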
I would also not maintain bindings-1/bindings-2 as SCM branches. It's not as if you'll ever merge them, so why branches?
In a project I have different processors for each request message, and each of these processors can be enabled or disabled via configuration. I have two options: one is to use the jar library of each processor in my application and call its classes directly; the other is to make each processor a standalone web API that accepts and returns JSON objects, so that communication between processors happens over the web API instead of through jar libraries.
Which of these options do you think is better, and what do I need to consider when making such a decision?
Thank you
It depends on what the service is, but overall:
Using the jar saves you network hops between your client and the new service
Conversely, if the jar takes up a lot of your resources, it might be more performant to run it as a separate service on a different machine
A jar file is easily manageable as a project dependency, whereas an API service will likely involve a more involved release process
If you manage the jar file yourself, you are probably prone to tighter coupling, since you are in control of it; having an API pushes you somewhat in the direction of writing cleaner code
I think it really comes down to what your jar is doing and what makes the most sense for the service you've packaged in it.
I want to make a Java application that supports plug-ins. My core will use jars for certain processes. If my plug-ins were also to use these jars, do the plug-ins need to configure their build path to include those jars, or is there a way for the jars to be imported similarly to how I import packages from the main application?
Guice and Spring are tools for dependency injection: they make creating objects easier because they take care of instantiating objects and wiring them into the other objects that depend on them.
Now, when we talk about plugins, we are usually also talking about dynamically loading new classes into a running app. Think of the Eclipse IDE: its architecture was designed from the beginning to be "pluggable", so you can download jars and Eclipse will add them to the running application without a restart.
In this case, if you want to build pluggable apps in the sense of dynamic classloading, I'd recommend not going down this path on your own, but researching subjects such as OSGi. One popular OSGi framework is Apache Felix: http://felix.apache.org/
Another approach to application extension (we might call this pluggable too, in a way), depending on how your app is organized and what it does, is to develop a DSL (http://en.wikipedia.org/wiki/Domain-specific_language) for it and let people extend it by adding scripts. Isn't it something like this when a browser lets you add pieces of functionality written in JavaScript? Groovy makes writing DSLs easier in some respects for Java programmers (see http://docs.codehaus.org/display/GROOVY/Writing+Domain-Specific+Languages).
If you want dynamically pluggable systems, OSGi can give you this, but OSGi is, in my opinion, an over-complicated technology; use it only if you are really sure you need that dynamic pluggability.
Another option for building extensible systems is the ServiceProvider mechanism. This is a core Java mechanism; for example, it's the one JDBC implementations use: you can put a JDBC driver on your classpath and the application can find it and use it without you explicitly importing the driver classes in your code.
Here is an example of using ServiceProvider in your own applications: http://docs.oracle.com/javase/tutorial/ext/basics/spi.html#limitations-of-the-service-loader-api
It's of course more limited than OSGi, but it's very easy to use once you get the idea, and you don't need any external library because it's a core Java mechanism.
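To make the mechanism concrete, here is a minimal sketch using the standard java.util.ServiceLoader API; the Plugin interface name is made up for illustration:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.ServiceLoader;

// Hypothetical plugin contract. A plugin jar registers an
// implementation by listing its fully qualified class name in a
// META-INF/services/<interface-name> file inside the jar.
interface Plugin {
    String name();
}

class PluginHost {
    // The host discovers every registered implementation that is
    // currently on the classpath, without importing any of them:
    static List<String> discover() {
        List<String> names = new ArrayList<>();
        for (Plugin p : ServiceLoader.load(Plugin.class)) {
            names.add(p.name());
        }
        return names;
    }
}
```

Dropping a new jar with a services registration file onto the classpath is enough for `discover()` to pick it up on the next run, which is exactly how JDBC drivers are found.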
EDIT: about the libraries.
At runtime: with ServiceProvider there are no separate classloaders (you can implement that yourself, of course; in OSGi this separation is implemented by default). At runtime, if your plugin needs class X and that class is on the classpath, all is fine. The limitation is that the main application and all the plugins share that single version of the dependency (Guice 3, for example); you cannot have one plugin using version X and another plugin using version X+2 if those versions are incompatible. (This is the famous "jar hell", and one of the principal motivations behind the Jigsaw project, for example.)
At compile time, include the dependency in your POM, Ant build file, Gradle build file, or whatever build system you use, as usual.
I am aware that this is a rather long and detailed post. I would have made it shorter and simpler if I could. I'd be grateful for any advice or ideas.
Background
I am writing XSLT to transfer data between two applications - call them Source and Target.
My XSLT is called by an integration engine supplied by the supplier of the Source System - call them Source Co.
The integration engine updates the Target by calling an adapter included in the engine that wraps an API written by the supplier of the Target system - call them Target Co. This all runs inside a J2EE server.
When the integration engine is deployed to the Java EE server it copies the JAR file implementing the Target System API so that it will be on the engine's classpath.
My situation
Source Co's adapter that wraps Target Co's API only exposes a subset of the API. At times, my customers' business requirements can only be met by side-stepping the Source Co adapter and calling the API directly from Java.
I have achieved this already by:
Writing Java classes that accept and return DOM documents and call the API
Deploying them as JARs into the Java EE engine so that they will be on the classpath visible to the integration engine and hence my XSLT
In my XSLT I point at the Java class by declaring a namespace and then call the appropriate public static method via that namespace
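The extension-function shape described above might look roughly like this; the class and method names are illustrative, not from the actual project:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

// Hypothetical sketch: a public static method that accepts and
// returns DOM documents, callable from Saxon via a java: namespace.
public class TargetApiExtension {
    public static Document callApi(Document request) throws Exception {
        // The real class would call the Target Co API here; this stub
        // just builds a trivial response document from the request.
        Document response = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().newDocument();
        Element root = response.createElement("response");
        root.setAttribute("for", request.getDocumentElement().getTagName());
        response.appendChild(root);
        return response;
    }
}
```

In the XSLT it would then be reached with something like `xmlns:ext="java:TargetApiExtension"` and `select="ext:callApi($doc)"` (Saxon's reflexive extension mechanism, which requires a Saxon edition that supports it).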
This all works well. However...
My Problem
Recent versions of Target Co's API have:
Removed several deprecated methods
Exposed access to additional business entities in the Target system
Source Co's adapter uses these removed methods and therefore it needs to have an old version of the Target Co API on its classpath.
A customer's latest business requirements can only be met by using the latest API to access these additional business entities.
My question
How can I have the latest version of the API on the classpath of my extension function without it being on the classpath for the integration engine's adapter?
I could use custom class loaders, of course. I've done this before to load a third-party database driver jar that would also have broken the integration engine had it been on its classpath. In the database driver case this involves using the custom classloader just once. In the current case, however, I can't see how to avoid having many, many calls to the classloader littered all over my code, and this feels very wrong.
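One way to keep it to a single use, sketched under assumptions (the paths and class names are hypothetical): hide the new API jar behind one facade that is loaded through its own classloader exactly once, and have everything else talk to the facade through an interface visible to the parent loader.

```java
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

// Sketch: load a facade class from an isolated jar in one place.
// Note that a plain URLClassLoader delegates to its parent FIRST, so
// if the old API is on the parent classpath you would need a
// child-first classloader instead to pick up the new version.
public class ApiFacadeLoader {
    public static Object loadFacade(String jarPath, String facadeClass) throws Exception {
        URLClassLoader loader = new URLClassLoader(
                new URL[] { new File(jarPath).toURI().toURL() },
                ApiFacadeLoader.class.getClassLoader());
        return loader.loadClass(facadeClass)
                     .getDeclaredConstructor().newInstance();
    }
}
```

The rest of the code then casts the returned object to a shared interface and never touches the classloader again, so the "littered calls" collapse to this one bootstrap point.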
Some technical details
Source System - SAP
Target System - Oracle's Primavera
Java EE Engine - Netweaver 7.2
XSLT Processor - Saxon