I'm trying to evaluate whether it's appropriate for our shop to use the NetBeans Lookup API without the whole NetBeans Platform.
So far, I have managed to create a project with this code:
// Lookup here is org.openide.util.Lookup (available standalone, e.g. via the org-openide-util-lookup artifact)
for (SomeInterface si : Lookup.getDefault().lookupAll(SomeInterface.class)) {
    si.doSomething();
}
I also created a couple of other projects, each with an AnImplementation class implementing SomeInterface, and the accompanying file META-INF/services/path.to.SomeInterface containing a line referencing the class (e.g. "other.path.to.AnImplementation").
When I add these implementing projects to the libraries (dependencies) of the main project in the NetBeans IDE, it works fine and I can see the successive results of doSomething() from both implementations.
My question is how to make that work without referencing the sub-projects in the main project; the jars of the sub-projects wouldn't be included in the generated jar of the main project when building, and one would be able to add or remove them at will, altering the result of the above code.
If I'm not mistaken, this is the behavior advertised in the Lookup API documentation.
Thanks in advance.
Edit: For now, my conclusion is that without the NetBeans Platform (or OSGi?) it's not possible to detect which service providers are present at startup time. You need to reference their jars in your classpath, and thus to identify them before startup. Feel free to prove me wrong.
You have to reference the sub-projects in your calling application, as this puts them on the classpath. If the jar/library is not on the classpath, then APIs like Lookup and ServiceLoader won't be able to find it.
If you use OSGi or the NetBeans Platform, these systems allow you to change the classpath at runtime.
Geertjan's blog has an entry about exactly this (using the Lookup API outside of the NetBeans Platform); in it he also references John O'Connor's blog, which contrasts the ServiceLoader and Lookup APIs.
EDIT
I've just seen Jon Skeet's answer to a similar question.
You can use the -Djava.ext.dirs=lib property to set a folder (in this case 'lib') as the place where the JVM must look up jars for your classpath.
In my understanding you don't have to bundle all the modules together with the main project for this to work. All you need to do is make sure that your modules are on the classpath when starting the application, because the global Lookup uses the ServiceLoader mechanism under the hood. Based on your question, I recommend considering whether
using the ServiceLoader directly is a better match for your problem (see the sketch just below), or
some DI framework like Guice is worth a try, or
OSGi offers something useful for you as well, and using that.
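For the ServiceLoader route, the core consumer-side pattern looks roughly like this (a minimal sketch reusing SomeInterface from the question; the provider jars keep their META-INF/services/path.to.SomeInterface entries and only need to be on the classpath at startup):

import java.util.ServiceLoader;

public class Main {
    public static void main(String[] args) {
        // finds every implementation registered on the classpath via
        // META-INF/services/path.to.SomeInterface
        for (SomeInterface si : ServiceLoader.load(SomeInterface.class)) {
            si.doSomething();
        }
    }
}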
Don't get me wrong, I absolutely love NetBeans and the NetBeans platform, but in my opinion using the Lookup alone is of limited use because of the possibilities listed above.
Related
One Java app references two third-party jars (packageA and packageB), which in turn reference packageC-0.1 and packageC-0.2 respectively. It would work well if packageC-0.2 were compatible with packageC-0.1. However, sometimes packageA uses something that is no longer supported in packageC-0.2, and Maven will only put a single version of a jar on the classpath. This issue is also known as "Jar Hell".
It would be difficult in practice to rewrite packageA or force its developers to update to packageC-0.2.
How do you tackle these problems? This often happens in large-scale companies.
I should point out that this problem mostly occurs in big companies, because a big company has many departments and it would be very expensive to make the whole company update a dependency every time some developers want the new features of a newer version of a jar. It is not a big deal in small companies.
Any response will be highly appreciated.
Let me throw out a brick to attract jade, as the saying goes, and offer a first rough answer in the hope of drawing out better ones.
Alibaba is one of the largest e-commerce companies in the world, and we tackle these problems by creating an isolation container named Pandora. Its principle is simple: package those middlewares together and load them with different ClassLoaders, so that they can work well together even when they reference the same packages in different versions. But this needs a runtime environment provided by Pandora, which runs as a Tomcat process. I have to admit that this is a heavyweight approach. Pandora builds on the fact that the JVM identifies a class by its classloader plus its class name.
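The core idea can be sketched with plain URLClassLoaders (a minimal illustration, not Pandora's actual code; the jar paths and the class name packagec.SomeClass are hypothetical):

import java.net.URL;
import java.net.URLClassLoader;

public class IsolationDemo {
    public static void main(String[] args) throws Exception {
        // Each loader sees a different version of packageC; passing null as the
        // parent keeps the classes from being shared via the application classloader.
        URLClassLoader loaderV1 = new URLClassLoader(
                new URL[]{new URL("file:lib/packageC-0.1.jar")}, null);
        URLClassLoader loaderV2 = new URLClassLoader(
                new URL[]{new URL("file:lib/packageC-0.2.jar")}, null);

        Class<?> c1 = loaderV1.loadClass("packagec.SomeClass");
        Class<?> c2 = loaderV2.loadClass("packagec.SomeClass");

        // Same name, but two distinct classes as far as the JVM is concerned
        System.out.println(c1 == c2); // prints false
    }
}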
If you know someone who might know the answer, please share the link with them.
We are a large company and we have this problem a lot. We have large dependency trees that span several developer groups. What we do:
We manage versions with BOMs (Maven dependencyManagement lists) of "recommended versions" that are published by the maintainers of the jars. This way, we make sure that recent versions of the artifacts are used.
We try to reduce the large dependency trees by separating the functionality that is used inside a developer group from the one that they offer to other groups.
But I admit that we are still trying to find better strategies. Let me also mention that using "microservices" is a strategy against this problem, but in many cases it is not a valid strategy for us (mainly because we could not have global transactions on databases any more).
This is a common problem in the java world.
Your best options are to regularly maintain and update dependencies of both packageA and packageB.
If you have control over those applications - make time to do it. If you don't have control, demand that the vendor or author make regular updates.
If both packageA and packageB are used internally, you can use the following practice: have all internal projects in your company refer to a parent in the Maven pom.xml that defines "up to date" versions of commonly used third-party libraries.
For example:
<framework.jersey>2.27</framework.jersey>
<framework.spring>4.3.18.RELEASE</framework.spring>
<framework.spring.security>4.2.7.RELEASE</framework.spring.security>
Therefore, if your projects "A" and "B" both use Spring, and both use the latest version of your company's "parent" pom, they should both use 4.3.18.RELEASE.
When a new version of spring is released and desirable, you update your company's parent pom, and force all other projects to use that latest version.
This will solve many of these dependency mismatch issues.
Don't worry, it's common in the java world, you're not alone. Just google "jar hell" and you can understand the issue in the broader context.
By the way, mvn dependency:tree is your friend for isolating these dependency problems.
I agree with the answer of @JF Meier. In a Maven multi-module project, the dependencyManagement node is usually defined in the parent POM to do unified version management: the dependencies declared there fix the versions in one place, and modules that then declare those dependencies directly do not need to specify a version. It looks like this:
In the parent POM:
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.devzuz.mvnbook.proficio</groupId>
      <artifactId>proficio-model</artifactId>
      <version>${project.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>
In your module, you do not need to set the version:
<dependencies>
  <dependency>
    <groupId>com.devzuz.mvnbook.proficio</groupId>
    <artifactId>proficio-model</artifactId>
  </dependency>
</dependencies>
This will avoid the problem of inconsistent versions.
This question can't be answered in general.
In the past we usually just didn't use different versions of the same dependency. If the version changed, team- or company-wide refactoring was necessary. I also doubt that mixing versions is even possible with most build tools.
But to answer your question..
Simple answer: don't use two versions of one dependency within one compilation unit (usually a module).
But if you really have to do this, you could write a wrapper module that references the legacy version of the library.
But my personal opinion is that within one module there should not be a need for these constructs, because "one module" should be small enough to stay manageable. Otherwise it might be a strong indicator that the project could use some modularization refactoring. However, I know very well that some projects of "large-scale companies" can be a huge mess where no 'good' option is available. I guess you are talking about a situation where packageA is owned by a different team than packageB, and this is generally a very bad design decision due to the lack of separation and the inherent dependency problems.
First of all, try to avoid the problem. As mentioned in @Henry's comment, don't use third-party libraries for trivial tasks.
However, we all use libraries. And sometimes we end up with the problem you describe, where we need two different versions of the same library. If library 'C' has removed and added some APIs between the two versions, and the removed APIs are needed by 'A', while 'B' needs the new ones, you have an issue.
In my company, we run our Java code inside an OSGi container. Using OSGi, you can modularize your code into "bundles", which are jar files with some special directives in their manifest file. Each bundle jar has its own classloader, so two bundles can use different versions of the same library. In your example, you could split the application code that uses packageA into one bundle, and the code that uses packageB into another. The two bundles can call each other's APIs, and it will all work fine as long as your bundles do not use packageC classes in the signatures of the methods exposed to the other bundle (exposing them is known as API leakage).
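To illustrate (a hedged sketch with hypothetical names, not tied to any particular OSGi framework): the API that the packageA bundle exports should only mention its own or JDK types, so packageC never leaks into the shared signatures.

// Exported by the bundle wrapping packageA; no packageC types appear here.
public interface ReportService {
    String renderReport(String input);
}

// Internal implementation, free to use packageC-0.1 behind the bundle's classloader.
class ReportServiceImpl implements ReportService {
    @Override
    public String renderReport(String input) {
        // call into packageC-0.1 here; those classes never escape the bundle
        return "rendered: " + input;
    }
}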
To get started with OSGi, you can e.g. take a look at OSGi enRoute.
I want to make a Java application that supports plug-ins. My core will use jars for certain processes. If my plug-ins were also to use these jars, do the plug-ins need to configure their build path to include the jars they would also use, or is there a way for the jars to be imported similarly to how I import packages from the main application?
Guice and Spring are tools for dependency injection, which means that creating objects is easier with them because they take care of instantiating objects and placing them into other objects that depend on them.
Now, when we talk about plugins, we are usually also talking about dynamically loading new classes into a running app. Think of the Eclipse IDE: its architecture was designed from the beginning to be "pluggable", so you can download jars and Eclipse will add them to the running application without needing an application restart.
In this case, if you want to build pluggable apps in the sense of dynamic classloading, I'd recommend you not to go down this path yourself, but to research subjects such as OSGi. One popular OSGi framework is http://felix.apache.org/
Another approach to application extension (we may call this pluggable too, somehow, I guess), depending on how your app is organized and what it does, is to develop a DSL (http://en.wikipedia.org/wiki/Domain-specific_language) for it and extend it by letting people add scripts to it. Isn't it something like this when a browser lets you add pieces of functionality written in JavaScript? Groovy makes DSLs easier in some respects for Java programmers (see http://docs.codehaus.org/display/GROOVY/Writing+Domain-Specific+Languages).
If you want dynamically pluggable systems, OSGi can give you this, but OSGi is, IMO, an over-complicated technology; use it only if you are really sure you need this dynamic pluggability.
Another option for building extensible systems is to use the ServiceProvider mechanism. This is a core Java mechanism; for example, it is the one that JDBC implementations use: you can put a JDBC driver on your classpath and the application can find it and use it without you having to explicitly import the driver classes in your code.
This is an example of using ServiceProvider in your own applications: http://docs.oracle.com/javase/tutorial/ext/basics/spi.html#limitations-of-the-service-loader-api
It is of course more limited than OSGi, but it is very easy to use once you get the idea, and you don't need any external library because it is a core Java mechanism.
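For illustration, the provider side is just an ordinary class plus a registration file (a minimal sketch with hypothetical names; these are two separate source files plus one resource file inside the plugin jar):

// Shared interface, in a hypothetical package com.example
package com.example;

public interface Plugin {
    void run();
}

// In the plugin jar: the implementation
package com.example.plugins;

public class EchoPlugin implements com.example.Plugin {
    @Override
    public void run() {
        System.out.println("EchoPlugin running");
    }
}

// The plugin jar also contains the text file
//   META-INF/services/com.example.Plugin
// with the single line:
//   com.example.plugins.EchoPlugin
// The core application then discovers every plugin on the classpath with
// java.util.ServiceLoader.load(Plugin.class).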
EDIT: about the libraries.
At runtime: with ServiceProvider there are no separate classloaders (you can of course implement that yourself; in OSGi this separation is implemented by default). At runtime, if your plugin needs class X and this class is on the classpath, all is fine; the limitation is that the main application and all the plugins use the same version of the dependency (Guice 3, for example), and you cannot have one plugin using version X and another plugin using version X+2 if these versions are not compatible. (This is the famous "jar hell", and one of the principal motivations behind the Jigsaw project, for example.)
At compile time, include the dependency in your POM, Ant build file, Gradle build file, or whatever build system you use, as usual.
I guess this is kind of a follow-on to question 1522329.
That question talked about getting a list of all classes used at runtime via the java -verbose:class option.
What I'm interested in is automating the build of a JAR file which contains my class(es), and all other classes they rely on. Typically, this would be where I am using code from some third party open source product's "client logic" but they haven't provided a clean set of client API objects. Their complete set of code goes server-side, but I only need the necessary client bits.
This would seem a common issue but I haven't seen anything (e.g. in Eclipse) which helps with this. Am I missing something?
Of course I can still do it manually: bite the bullet and include all the third-party code in a massive JAR (offending my purist sensibilities), walk through the source, use trial and error, or use -verbose:class type stuff (but the latter wouldn't work where, say, my code runs as part of a J2EE servlet, and thus I only want to see this for a given Tomcat webapp and, ideally, only for classes related to my classes therein).
I would recommend using a build system such as Ant or Maven. Maven is designed with Java in mind, and is what I use pretty much exclusively. You can even have Maven assemble (using the assembly plugin) all of the dependent classes into one large jar file, so you don't have to worry about dependencies.
http://maven.apache.org/
Edit:
Regarding the servlet, you can also define which dependencies you want packaged up with your jar, and if you are making a stand alone application you can have the jar tool make an executable jar.
note: yes, I am a bit of a Maven advocate, as it has made the project I work on much easier. No I do not work on the project personally. :)
Take a look at ProGuard.
ProGuard is a free Java class file shrinker, optimizer, obfuscator, and preverifier. It detects and removes unused classes, fields, methods, and attributes. It optimizes bytecode and removes unused instructions. It renames the remaining classes, fields, and methods using short meaningless names. Finally, it preverifies the processed code for Java 6 or for Java Micro Edition.
What you want is not only to include the classes you rely on but also the classes, the classes you rely on, rely on. And so on, and so forth.
So that's not really a build problem, but more a dependency one. To answer your question, you can either solve this with Maven (apparently) or Ant + Ivy.
I work with Ivy and I sometimes build an "uber-jar" using the zipgroupfileset functionality of the Ant Jar task. Not very elegant, some would say, but it's done in 10 seconds :-)
I often read about dependency injection and I did research on Google, and I understand in theory what it can do and how it works, but I'd like to see an actual code base using it (Java/Guice would be preferred).
Can anyone point me to an open-source project where I can see how it's really used? I think browsing the code and seeing the whole setup shows me more than the usual snippets in the introduction articles you find around the web. Thanks in advance!
The Wave Protocol Server is my favourite example app.
I struggled a bit with this exact issue. It's so abstract and simple I was always worried I was "doing it wrong".
I had been using it in the main project, which has dependencies on other projects, because the Guice module that sets up the bindings was part of the main project.
I finally realized the libraries should be supplying the Modules themselves. At that point you can depend only on an instance of a Module (not a specific one), and the interfaces that are bound by it.
Taking it one step better, you can use the new ServiceLoader mechanism in Java 6 to automatically locate and install all Guice modules available on the classpath. Then you can swap in dependencies just by changing the classpath (db-real.jar vs. db-mock.jar).
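A rough sketch of that idea (assuming each library registers its com.google.inject.Module implementation in a META-INF/services/com.google.inject.Module file; the class names here are illustrative):

import java.util.ArrayList;
import java.util.List;
import java.util.ServiceLoader;

import com.google.inject.Guice;
import com.google.inject.Injector;
import com.google.inject.Module;

public class Bootstrap {
    public static void main(String[] args) {
        // Collect every Guice module found on the classpath...
        List<Module> modules = new ArrayList<Module>();
        for (Module m : ServiceLoader.load(Module.class)) {
            modules.add(m);
        }
        // ...and build the injector from whatever was found, so swapping
        // db-real.jar for db-mock.jar swaps the bindings without code changes.
        Injector injector = Guice.createInjector(modules);
        // injector.getInstance(SomeService.class) would then use whichever
        // binding the modules on the classpath provided (SomeService is hypothetical).
    }
}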
I understand you're in Java-land, but in the .NET space there are several open-source apps written using an inversion of control container. Check out CodeCampServer, in which the UI module doesn't have a reference to the dependency resolution module. There is an HttpModule that does the work. (An HttpModule is just an external library you can plug in that handles events in ASP.NET; in CodeCampServer the UI project loads this DependencyRegistrarModule at run time, without any compile-time reference to it.)
I think dependency injection has a way of disappearing from view if used properly: it will be just a way of initializing/wiring your application. If it looks fancier than that, you are probably looking at extra features of the framework at hand, and not at bare-bones dependency injection.
Edit: I'd recommend actually starting to use it instead of trying to find examples, and then come back and post questions here if you can't get stuff to work like you'd think it should :-)
I want to create a Java program that can be extended with plugins. How can I do that and where should I look for?
I have a set of interfaces that the plugins must implement, and each plugin should be packaged in a jar. The program should watch for new jars in a folder relative to the program and register them somehow.
Although I do like Eclipse RCP, I think it's too much for my simple needs.
Same thing goes for Spring, but since I was going to look at it anyway, I might as well try it.
But still, I'd prefer to find a way to create my own plugin "framework" as simple as possible.
I've done this for software I've written in the past; it's very handy. I did it by first creating an interface that all my 'plugin' classes needed to implement. I then used the Java ClassLoader to load those classes and create instances of them.
One way you can go about it is this:
// needs java.io.File, java.net.URL and java.net.URLClassLoader
File dir = new File("put path to classes you want to load here");
URL loadPath = dir.toURI().toURL();
URL[] classUrl = new URL[]{loadPath};
ClassLoader cl = new URLClassLoader(classUrl);
Class<?> loadedClass = cl.loadClass("classname"); // must be the fully qualified package.ClassName
That has loaded the class, now you need to create an instance of it, assuming the interface name is MyModule:
MyModule modInstance = (MyModule)loadedClass.newInstance(); // requires a public no-arg constructor
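To also cover the "watch a folder for new jars" part of the question, a rough sketch could look like this (it reuses the MyModule interface from above; the "plugins" folder name and the use of ServiceLoader for discovery are my own assumptions, and each plugin jar would need a META-INF/services entry for MyModule):

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ArrayList;
import java.util.List;
import java.util.ServiceLoader;

public class PluginScanner {
    public static void main(String[] args) throws Exception {
        // Collect every jar in a "plugins" folder relative to the application
        File pluginDir = new File("plugins");
        File[] files = pluginDir.listFiles();
        if (files == null) {
            return; // no plugins folder present
        }
        List<URL> urls = new ArrayList<URL>();
        for (File f : files) {
            if (f.getName().endsWith(".jar")) {
                urls.add(f.toURI().toURL());
            }
        }
        ClassLoader cl = new URLClassLoader(
                urls.toArray(new URL[0]), PluginScanner.class.getClassLoader());

        // Each plugin jar registers its implementation under
        // META-INF/services/<fully qualified name of MyModule>
        for (MyModule module : ServiceLoader.load(MyModule.class, cl)) {
            System.out.println("Loaded plugin: " + module.getClass().getName());
            // register the plugin with the application here
        }
    }
}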
Look into OSGi.
On one hand, OSGi provides all sorts of infrastructure for managing, starting, and doing lots of other things with modular software components. On the other hand, it could be too heavy-weight for your needs.
Incidentally, Eclipse uses OSGi to manage its plugins.
I recommend that you take a close look at the Java Service Provider (SPI) API. It provides a simple system for finding all of the classes in all Jars on the classpath that expose themselves as implementing a particular service. I've used it in the past with plugin systems with great success.
Although I'll second the accepted solution, if a basic plugin support is needed (which is the case most of the time), there is also the Java Plugin Framework (JPF) which, though lacking proper documentation, is a very neat plugin framework implementation.
It's easily deployable and, once you get through the classloading idiosyncrasies, very easy to develop with. One thing to be aware of: plugin folders below the plugins directory must be named after the plugin's full package name, and the class files inside them must still be deployed under the normal package directory structure. E.g.
plugins
`-com.my.package.plugins
   `-com
      `-my
         `-package
            `-plugins
               |- Class1.class
               `- Class2.class
On the home-grown classloader approach:
While it's definitely a good way to learn about classloaders, there is something called "classloader hell", mostly known to people who have wrestled with it in bigger projects. Conflicting classes are easy to introduce and hard to resolve.
And there is a good reason why eclipse made the move to OSGi years ago.
So, if it's more than a pet project, take a serious look into OSGi. It's worth looking at.
You'll learn about classloaders PLUS an emerging technology standard.
Have you considered building on top of Eclipse's Rich Client Platform, and then exposing the Eclipse extension framework?
Also, depending on your needs, the Spring Framework might help with that and other things you might want to do: http://www.springframework.org/