I am writing a client (Eclipse RCP) that needs to be able to use multiple versions of a library (which encapsulates the backend interface). Each version of the library adds some new classes/methods that are used by the client. If an older lib version is used, the client will access new classes/methods that are not present in the lib bytecode, resulting in a NoClassDefFoundError.
So I am wondering how to do this the best way. The simplest way to make it fail-safe, I guess, is to wrap all calls to such code in try/catch blocks. I was thinking of writing a custom annotation for marking new code in the library source, and then issuing a compiler warning when such marked code is accessed from code that is not secured by try/catch (can this be done with a custom annotation? I haven't written one before). Or can someone think of a different, more elegant approach?
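For illustration, here is a minimal sketch of the try/catch idea, assuming a hypothetical com.example.backend.NewFeatureService class that only exists in newer versions of the library:

    // NewFeatureService is a hypothetical class that only newer library versions provide.
    public final class NewFeatureAccess {

        // Probe once whether the newer API is on the classpath at all.
        public static boolean isNewApiAvailable() {
            try {
                Class.forName("com.example.backend.NewFeatureService");
                return true;
            } catch (ClassNotFoundException e) {
                return false;
            }
        }

        // Guard a direct call so an older library version degrades gracefully.
        public static void callNewFeature() {
            try {
                new com.example.backend.NewFeatureService().doSomethingNew();
            } catch (NoClassDefFoundError e) {
                // Old library on the classpath: fall back or disable the feature in the UI.
            }
        }
    }

Centralizing the guarded calls in one facade class like this also keeps the try/catch noise out of the rest of the client code.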
You can use a Maven repository to provide the different versions of your library.
The Maven Central repository is available for releases:
http://maven.apache.org/guides/mini/guide-central-repository-upload.html
You can set up your own instance of Artifactory: http://www.jfrog.com/confluence/display/RTF/Configuring+Maven+Deployment+to+Artifactory
I am writing a plugin API for a Java application, the idea being that eventually third parties will provide their own plugin extensions for the application, and all the user needs to do is place the plugin jar into the application's plugins directory. For the impatient, my question in short is how to handle possible version conflicts when a plugin was built against a different version of the API than the one on the system; see below for details of my situation and what I have thought about.
I have read a number of articles using service provider interfaces and have something working with that. The problem comes when considering how to deal with differing version combinations.
I am aware of the technique of adding extension interfaces when extending an API, rather than changing the existing interfaces (e.g. API 1.0 having MyInterface, API 1.1 adding MyInterface2 with the new methods, etc.). With this technique, if the user has the latest API then older plugins should work fine, but what happens if the user has an old API and newer plugins?
So, as an example, the user has API 1.0 with only MyInterface, but installs a binary plugin compiled against API 1.1 whose provider class implements MyInterface2. While the application may only ever call plugins using MyInterface, what happens if the plugin internally calls MyInterface2? Will this cause an error or exception, and when (i.e. when the class is loaded or when the method from MyInterface2 is called)? Also, is this standard across JVMs or may it differ depending on the JVM used?
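To make the extension-interface technique above concrete, here is a rough sketch (the method names are invented) of how a newer host can accept both old and new plugins; the failure case described in the question is the opposite direction, where the plugin class itself refers to an interface the old host does not ship:

    // API 1.0
    public interface MyInterface {
        void run();
    }

    // API 1.1: new methods go into an extension interface, MyInterface stays untouched.
    public interface MyInterface2 extends MyInterface {
        void runFaster();
    }

    // Host code built against API 1.1 that still accepts plugins compiled against API 1.0.
    public class PluginRunner {
        public void invoke(MyInterface plugin) {
            if (plugin instanceof MyInterface2) {
                ((MyInterface2) plugin).runFaster();   // plugin knows the newer API
            } else {
                plugin.run();                          // older plugins keep working
            }
        }
    }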
Finally, would it be better to use a plugin framework, and would that be able to check version requirements? Searching the internet I found PF4J on GitHub. A quick look at the source code suggests it may support some sort of version checks.
I want to build a REST API using RestEasy. The generated file should be deployed in a WildFly application server.
I face the issue described in the following SO-question:
AsynchronousDispatcher error
The marked solution tells me to set the dependency to "provided", which, as far as I understand, means that the library is not included in my WAR file but taken directly from the app server...
Isn't that just wrong?
My idea would be to build a self-contained WAR file which contains all the needed libraries in the versions I need.
When the library is provided by the app server, I get whatever version is currently available there. I have no real control over that version... if someone decides to update the RestEasy library on the server, it might break my app.
Am I missing something, or am I doing something completely wrong?
One of the big advantages of Java EE is developing against the API and not having to worry about the implementation. Java EE containers provide the APIs and the implementations for those APIs. If you include implementation dependencies, one of two things is likely to happen.
Your dependencies will be ignored, making it pointless to include them in your deployment.
You'll get conflicts between the dependencies you included and what the server is expecting. This could show up as things like:
ClassCastException because two copies of the same class are found on the classpath.
NoSuchMethodError because there is a version mismatch.
Various other issues caused by conflicts.
Developing against the API instead of the implementation also allows you to easily switch between Java EE compliant containers with no or minimal changes to your deployment. The APIs are generally backwards compatible as well, making version upgrades less of an issue.
If you want to use a fat WAR (including implementations) instead of a skinny WAR (not including the implementations), then a servlet container is probably a better solution. WildFly does have a servlet-only download. I'd encourage you, though, to trust the container to do the right thing with the implementation dependencies :). Usually the only time there is an issue with upgrading is if you're upgrading Java EE versions. Even then it's usually pretty safe.
I am developing a Spring (a Java framework for server-side web development) web application, which will respond to a client-side Java application (which uses socket communication) with a JSON object. At the same time, I'm working on both the server-side and client-side Java applications.
The problem is that I have a bunch of files (say, the interfaces for a JSON variable) that are used in both projects. For now, I have duplicate copies of those interfaces in different packages in the two projects. But this causes inconsistency, because I have to update both files whenever I need to make a change to an interface.
Does anyone have a neat solution for this?
Thanks
You should treat your shared code at the package level and not the file level.
You should create a package of interface definitions that are used by both the client and server side of your architecture and whenever that package changes, both sides will have to change accordingly.
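As a rough illustration (the package and type names here are made up), the shared definitions would live in their own project/jar that both the client and the server depend on:

    // Lives in a separate "common" project that both client and server depend on.
    package com.example.common.json;

    // Single definition of the JSON payload contract, shared by both sides.
    public interface UserMessage {
        String getUserName();
        String getText();
        long getTimestamp();
    }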
EDIT:
I wasn't explicit about it, but zellus' suggestion of importing the common code as a jar is a good one.
You might create a separate project for your common JSON code. If you are using Subversion, svn:externals allows a neat integration at the source level. Importing the common code as a jar file is another approach.
If you're using Maven, you could create a local Maven project containing all the classes you need in different projects, and add that project as a dependency in the pom.xml of each project that requires these classes.
I have a scenario where I have code written against version 1 of a library but I want to ship version 2 of the library instead. The code has shipped and is therefore not changeable. I'm concerned that it might try to access classes or members of the library that existed in v1 but have been removed in v2.
I figured it would be possible to write a tool to do a simple check to see if the code will link against the newer version of the library. I appreciate that the code may still be very broken even if the code links. I am thinking about this from the other side - if the code won't link then I can be sure there is a problem.
As far as I can see, I need to run through the bytecode checking for references, method calls and field accesses to library classes then use reflection to check whether the class/member exists.
I have a three-fold question:
(1) Does such a tool exist already?
(2) I have a niggling feeling it is much more complicated than I imagine and that I have missed something major - is that the case?
(3) Do you know of a handy library that would allow me to inspect the bytecode such that I can find the method calls, references etc.?
Thanks!
I think that Clirr - a binary compatibility checker - can help here:
Clirr is a tool that checks Java libraries for binary and source compatibility with older releases. Basically you give it two sets of jar files and Clirr dumps out a list of changes in the public api. The Clirr Ant task can be configured to break the build if it detects incompatible api changes. In a continuous integration process Clirr can automatically prevent accidental introduction of binary or source compatibility problems.
Changing the library in your IDE will result in all possible compile-time errors.
You don't need anything else, unless your code uses another library, which in turn uses the updated library.
Be especially wary of Spring configuration files. Class names are configured as text and don't show up as missing until runtime.
If you have access to the source code, you could just compile the source against the new library. If it doesn't compile, you definitely have a problem. If it compiles, you may still have a problem if the program uses reflection, some kind of IoC stuff like Spring, etc.
If you have unit tests, then you have a better chance of catching any linking errors.
If you only have the .class files of the program, then I don't know of any tool that would help besides decompiling the class files to source and compiling that source against the new library, but that doesn't sound too healthy.
The checks you mentioned are done by the JVM/Java class loader, see e.g. Linking of Classes and Interfaces.
So "attempting to link" can be simply achieved by trying to run the application. Of course you could hoist the checks to run them yourself on your collection of .class/.jar files. I guess a bunch of 3rd party byte code manipulators like BCEL will also do similar checks for you.
I notice that you mention reflection in the tags. If you load classes/invoke methods through reflection, there's no way to analyse this in general.
Good luck!
I'm developing a Java plugin for an existing Java program. The existing program uses a specific version of eclipse.uml2.* and my plugin does too. Unfortunately I need a newer version for my plugin.
In order to run the plugin, I need to export it into a jar file (with all jars packed). Then the program executes it. But somehow the new eclipse.uml2.* classes seem to interfere with the program and it crashes.
Is there a way to "separate" both versions of the jar files?
One approach would be to use a custom class loader in your application. This can very easily introduce bugs that are difficult to trace, so take care.
http://www.devx.com/Java/Article/31614/1954
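Very roughly, the idea looks like this (the jar path and the use of UMLFactory are placeholders); note that classes loaded through the isolated loader cannot be exchanged with same-named classes loaded by the host application:

    import java.net.URL;
    import java.net.URLClassLoader;

    public class IsolatedUml2Loader {

        // Load a class from the newer eclipse.uml2 jars in an isolated class loader.
        public static Class<?> loadNewUmlFactory() throws Exception {
            URL[] newerJars = { new URL("file:/path/to/plugin/lib/org.eclipse.uml2.uml_newer.jar") };
            // A null parent means no delegation to the application class loader,
            // so the host's older eclipse.uml2 classes are never picked up here.
            ClassLoader isolated = new URLClassLoader(newerJars, null);
            return Class.forName("org.eclipse.uml2.uml.UMLFactory", true, isolated);
        }
    }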
This is the exact problem OSGi tries to solve. Would it be feasible to rework the Java app to another plugin platform?
This will be difficult. You could conceivably use class loader tricks to allow both versions of the eclipse.uml2.* classes to be loaded in the same JVM. But as far as the JVM is concerned they would be different sets of classes, and your plugin and the base Java app wouldn't be able to exchange instances.
It is probably simpler (and less risky ... in terms of likelihood of success) to rebuild (and if necessary modify) either the base program or your plugin so that they both work with the same version of the eclipse.uml2.* classes.