I am using Apache Felix to create an embedded OSGi host application, and the following code to expose the packages I want to make visible to bundles:
List<String> extra = new ArrayList<>();
extra.add("some.example.package.to.expose.1");
extra.add("some.example.package.to.expose.2");
extra.add("some.example.package.to.expose.3");
config.put(Constants.FRAMEWORK_SYSTEMPACKAGES_EXTRA, String.join(",", extra));
Everything works great and these packages are exposed. However, I need the bundles to have access to ALL of the host project's declared dependencies. For example, the parent application declares Jackson, various Apache libraries, etc., and I need the bundles to have access to these.
I tried adding the packages explicitly, but that does not seem to do the trick when they come from dependencies. For example, in the bundle I want to use Jackson's com.fasterxml.jackson.core.type.TypeReference, so I added com.fasterxml.jackson.core.type to the extra list above, but that does not solve the problem: the package still doesn't get exposed.
In a perfect world I would just make ALL the host dependencies available without having to state each one explicitly.
You will have to configure each package explicitly. In OSGi you would normally install the dependencies as bundles instead, so the framework settings do not support mass-exporting system packages.
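If you do go the bundle route, a rough sketch of what that could look like with the embedded framework is below. The jar paths and versions are made up for illustration, and it assumes framework is the running Felix Framework instance; Jackson jars already ship as OSGi bundles, so they can be installed directly rather than exported from the system bundle:

import org.osgi.framework.Bundle;
import org.osgi.framework.BundleContext;
import org.osgi.framework.BundleException;
import org.osgi.framework.launch.Framework;

// Install host dependencies that are already OSGi bundles instead of
// listing their packages in FRAMEWORK_SYSTEMPACKAGES_EXTRA.
void installDependencyBundles(Framework framework) throws BundleException {
    BundleContext ctx = framework.getBundleContext();
    Bundle jacksonCore = ctx.installBundle("file:lib/jackson-core-2.15.2.jar");         // illustrative path/version
    Bundle jacksonDatabind = ctx.installBundle("file:lib/jackson-databind-2.15.2.jar"); // illustrative path/version
    jacksonCore.start();
    jacksonDatabind.start();
}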
I have a Spring Boot application that works as expected when run with the embedded Tomcat, but I noticed that if I try to run it from an existing Tomcat instance that I'm using with a previous project, it fails with a NoClassDefFoundError for a class that I don't use anywhere in my application.
I noticed that in the /lib directory I had a single jar that contained a few Spring-annotated classes, so as a test I cleaned out the /lib directory, which resolved the issue. My assumption is that Spring is seeing some of the configurations/beans/imports on the classpath due to them existing in the /lib directory and is either trying to autoconfigure something on its own, or is actually trying to instantiate some of these classes.
So then my question is - assuming I can't always fully control the contents of everything on the classpath, how can I prevent errors like this from occurring?
EDIT
For a little more detail - the class not being found is DefaultCookieSerializer, which is part of the spring-session-implementation dependency. It is pulled in by one of the classes in the jar located in /lib, but it is not part of my application.
Check the features provided by @EnableAutoConfiguration. You can explicitly configure the set of auto-configuration classes for your application. This tutorial can be a good starting point.
You can remove the @SpringBootApplication annotation from the main class and replace it with a @ComponentScan annotation and an @Import annotation that explicitly lists only the configuration classes you want to load. For example, in a Spring Boot MVC app that uses metrics, web client, rest template, Jackson, etc., I was able to replace the @SpringBootApplication annotation with the code below and get it working exactly as it was before, with all functional tests passing:
@Import({
    MetricsAutoConfiguration.class,
    InfluxMetricsExportAutoConfiguration.class,
    ServletWebServerFactoryAutoConfiguration.class,
    DispatcherServletAutoConfiguration.class,
    WebMvcAutoConfiguration.class,
    JacksonAutoConfiguration.class,
    WebClientAutoConfiguration.class,
    RestTemplateAutoConfiguration.class,
    RefreshAutoConfiguration.class,
    ValidationAutoConfiguration.class
})
@ComponentScan
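For context, a trimmed-down sketch of what the main class might look like after that change; the package name and the exact set of auto-configurations here are just an example - pick the ones your app actually needs:

package com.example.app; // hypothetical package

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.jackson.JacksonAutoConfiguration;
import org.springframework.boot.autoconfigure.web.servlet.DispatcherServletAutoConfiguration;
import org.springframework.boot.autoconfigure.web.servlet.ServletWebServerFactoryAutoConfiguration;
import org.springframework.boot.autoconfigure.web.servlet.WebMvcAutoConfiguration;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Import;

// Only the explicitly imported auto-configurations are applied; nothing is
// picked up just because it happens to be on the classpath.
@Import({
    ServletWebServerFactoryAutoConfiguration.class,
    DispatcherServletAutoConfiguration.class,
    WebMvcAutoConfiguration.class,
    JacksonAutoConfiguration.class
})
@ComponentScan
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}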
The likely culprit of the mentioned exception is incompatible jars on the classpath.
As we don't know which library you are having the issue with, we can't tell you the exact reason, but the situation looks like this:
One of Spring Boot's auto-configuration classes is triggered by the presence of a class on the classpath.
The triggered configuration tries to create a bean of a class that is not present in the jar you have (but is in the specific version mentioned in the Spring BOM).
Version incompatibilities may also cause NoSuchMethodError failures.
That's one of the reasons why it is good practice not to run Spring Boot applications inside a container (make jar, not war), but as a runnable jar with an embedded container.
Even before Spring Boot it was preferred to take into account the libraries present on the runtime classpath and mark them as provided inside your project. Having different versions of a library on the classpath may cause weird ClassCastExceptions where the names on both ends match, but the rest doesn't.
You could resolve specific cases by disabling the auto-configuration that causes your issue. You can do that either by adding an exclude to your @SpringBootApplication annotation or by using a property file; a sketch of both options is below.
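For example, if it is the session auto-configuration that gets triggered by the stray classes in /lib (an assumption - substitute whichever auto-configuration class your stack trace actually points at), the exclusion could look like this:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.session.SessionAutoConfiguration;

// Excluding the suspected auto-configuration keeps Spring Boot from trying
// to create beans for classes it only found in the container's /lib directory.
// The same can be done in application.properties with:
// spring.autoconfigure.exclude=org.springframework.boot.autoconfigure.session.SessionAutoConfiguration
@SpringBootApplication(exclude = SessionAutoConfiguration.class)
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}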
Edit:
If you don't use a very broad package scan (or a package name from outside your project in the package scan) in your Spring Boot application, it is unlikely that Spring Boot simply imports configuration from the classpath.
As I have mentioned before, it is rather some auto-configuration that is triggered by the existence of a class on the classpath.
Theoretical solution:
You could use the Maven Shade Plugin to relocate all packages into your own package space: see the docs.
The problems you'd have to face:
Defining a very broad relocation pattern that still excludes the JEE classes which must remain untouched so that the container knows how to run your application.
Relocation most likely won't affect package names used as strings in Spring Boot annotations (like @ComponentScan or @ConditionalOnClass). As far as I know this is not implemented yet; you'd have to implement it yourself, maybe as some kind of Shade plugin resource processor.
When relocating classes you'd have to replace package names in all relevant configuration files located in the jars, and possibly also merge some of them.
You'd also have to take into account how the libraries you use, and Spring itself, refer to package names or files.
This is definitely not a trivial task, with many traps ahead. But if done right, it would possibly allow you to disregard what is on the container's classpath: Spring Boot would look for classes in the relocated packages, and you wouldn't have those in the ordinary jars.
I have a Java web application running on a single-node web application server setup, in which I am using a library that I included in my WEB-INF and use in my code.
The issue is that another application added its libraries to the WebSphere parent lib folder, one of which is the same library I am using but in an older version, creating a conflict and breaking my code.
The server class loader is unfortunately configured parent-first and I cannot change that. My question is: how can I make my app use my library, ignoring the one loaded by the parent class loader?
The solution is to move the conflicting package to a shared library, configure the library to use an isolated class loader, and associate that library with your application or module. The "isolated class loader" setting creates a separate parent-last class loader for the shared library, so you get that behavior targeted to only the artifacts that need it rather than having to apply it to the entire application or module.
https://www.ibm.com/support/knowledgecenter/en/SSAW57_8.5.5/com.ibm.websphere.nd.multiplatform.doc/ae/tcws_sharedlib.html
I'm specifically referencing the "Use an isolated class loader for this shared library" setting.
If you can't change your application server setup there are basically three things you can do:
Downgrade your application dependency to the lower version used by the WebSphere server and keep it in sync. This is preferable as it's the least hassle.
Shade the dependency during the build into your own package to prevent the package clash. This can be done with the Maven Shade Plugin; see the Relocating Classes usage example and the configuration sketch after this list.
Write a new custom classloader to work around the problem.
I'd try them in 1 -> 2 -> 3 order. Option 3 is possible but is an error-prone nightmare. I'd rather deploy to another server than do it.
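A minimal sketch of what option 2 could look like in the application's pom.xml; the package names here are placeholders for whatever library actually clashes:

<!-- Relocates the conflicting library's packages into the application's own
     namespace at build time, so the copy in the server's parent lib folder
     is never selected by the parent-first class loader. -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.4.1</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <relocations>
                    <relocation>
                        <pattern>com.conflicting.library</pattern>
                        <shadedPattern>com.mycorp.shaded.com.conflicting.library</shadedPattern>
                    </relocation>
                </relocations>
            </configuration>
        </execution>
    </executions>
</plugin>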
My target platform is a WebLogic 12c application server.
I have an ear-project, which on startup requires e.g. org.apache.commons.logging.LogFactory.
I know that this class - and related classes - can be found in <WL_HOME>/modules/com.bea.core.apache.commons.logging.api_1.1.1.jar, but it is not available on the classpath by default.
In such cases, am I supposed to somehow make the jar file in <WL_HOME>/modules available on the classpath, or should I provide whatever jar file I find suitable, either bundled in the application or placed in <WL_HOME>/user_projects/domains/<mydomain>/lib?
If I am to use the one in the <WL_HOME>/modules folder - how do I configure my domain to make it available?
To me it seems reasonable that the jar files in the modules folder should be considered provided dependencies, but so far I have been unable to find the right way to enable them as such - I have been browsing for an answer for hours :-)
UPDATE:
I know I can simply add them to the CLASSPATH variable in the server startup script - my question is more like - should I? Is there a better way - or should I completely forget about <WL_HOME>/modules?
That's a short-sighted approach.
you need to reboot the server to upgrade the libraries
every app on the server must be okay with those libraries on its classpath
WebLogic has the concept of shared Java EE libraries (example). In short, you add extra lines to the library's MANIFEST.MF and deploy the jar differently; then you can reference it from other apps using weblogic-application.xml (or weblogic.xml for a WAR).
The point is that you can upgrade the library without restarting the server, provided you gave it a version like 1.1 (there were bugs last time I named it 1.1.1 - it needed to be able to parse the version as a floating-point number to upgrade seamlessly).
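As a rough sketch of the two pieces involved (names and versions are placeholders): the library jar's MANIFEST.MF gets entries such as Extension-Name: commons-logging, Specification-Version: 1.1 and Implementation-Version: 1.1.0, the jar is deployed to the server as a library, and the application then references it from its weblogic-application.xml:

<!-- Reference to the deployed shared library; library-name must match the
     Extension-Name in the library's manifest. -->
<library-ref>
    <library-name>commons-logging</library-name>
    <specification-version>1.1</specification-version>
</library-ref>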
If you just want to include some libraries but not share them outside the app, then just specify the correct <prefer-application-packages> or <prefer-web-inf-classes> element, depending on whether you have an EAR or a WAR.
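For the EAR case, a minimal sketch of the descriptor might look like the following (the package name is just an example); for a WAR you would instead set <prefer-web-inf-classes>true</prefer-web-inf-classes> inside <container-descriptor> in weblogic.xml:

<!-- weblogic-application.xml: classes in the listed packages are loaded from
     the application rather than from the server's class loader. -->
<weblogic-application xmlns="http://xmlns.oracle.com/weblogic/weblogic-application">
    <prefer-application-packages>
        <package-name>org.apache.commons.logging.*</package-name>
    </prefer-application-packages>
</weblogic-application>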
If I deploy a war file into a Gemini container (e.g. Virgo has one), it is transformed on the fly into an OSGi bundle by adding some package imports (among other things).
Is it possible to somehow extend these default package imports, for example using a bundle listener or something like that?
Regards
I would strongly recommend that you do the transformation yourself before deploying into the Gemini container, rather than forcing Gemini to do the transformation on-the-fly. First it is very easy to do; second it will be much faster to deploy; third you will be able to add the specific imports that you want.
In order to turn a standard WAR file into a WAB (Web Application Bundle) that remains compatible with traditional WAR deployment, you just need to add the following headers to the MANIFEST.MF of the WAR:
Web-ContextPath to define the context path under which the web application will be served
Set Bundle-ClassPath to WEB-INF/classes plus any JARs under WEB-INF/lib. You will have to name these explicitly e.g.: Bundle-ClassPath: WEB-INF/classes,WEB-INF/lib/a.jar,WEB-INF/lib/b.jar...
Import-Package: javax.servlet,javax.servlet.http plus anything else you want to import.
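Put together, the added section of the MANIFEST.MF might look roughly like this; the symbolic name, context path and jar names are examples, and Bundle-ManifestVersion, Bundle-SymbolicName and Bundle-Version are the standard OSGi headers every bundle also needs:

Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-SymbolicName: com.example.mywebapp
Bundle-Version: 1.0.0
Web-ContextPath: /myapp
Bundle-ClassPath: WEB-INF/classes,WEB-INF/lib/a.jar,WEB-INF/lib/b.jar
Import-Package: javax.servlet,javax.servlet.http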
We have the following scenario with our project:
A core web application packaged as a war file (call it the Core project).
The need to "customize" or "extend" the core app per customer (call it the Customer project). This mostly includes new bean definitions (we're using Spring), i.e. replacing service implementations in the core.war with customer-specific implementations.
We want to develop the Core and Customer projects independently.
When the Customer project is developed, we need to be able to run/debug it in Eclipse (on Tomcat) with the Core project as a dependency.
When the Customer project is built, the resulting war file "includes" the core and customer projects. So this .war is the customer-specific version of the application.
I'm looking for suggestions as to the best way to do this in terms of tooling and project configuration.
We're using Ant currently, but would like to avoid getting buried in more Ant. Has anyone done this with Maven?
I've seen a lot of posts on how to build a web application that depends on a java application, but nothing on a web application depending on another web app.
Thanks!
Sounds like Maven WAR overlay does what you want.
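A minimal sketch of how that might look in the Customer project's pom.xml (group/artifact IDs are made up): declaring the Core war as a war-type dependency of a war-packaged project makes the maven-war-plugin overlay it into the Customer war at package time, with the Customer project's own files winning on conflicts.

<!-- In the Customer project's pom.xml (packaging: war) -->
<dependency>
    <groupId>com.mycorp</groupId>
    <artifactId>core-webapp</artifactId>
    <version>1.0.0</version>
    <type>war</type>
    <scope>runtime</scope>
</dependency>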
In Eclipse there is a "native" WTP way to do this. It mainly uses linked folders and a little hack in the .settings/org.eclipse.wst.common.component file. You can read about it at http://www.informit.com/articles/article.aspx?p=759232&seqNum=3, in the chapter called "Dividing a Web Module into Multiple Projects". The problem with this is that the linked folder must be relative to some path variable, which can be defined on the Window/Preferences/General/Workspace/Linked Resources tab; otherwise the linked folder definition (found in the .project file in the project root) will contain a workstation-specific path. The path variable should practically be the workspace root. This solution works great with WTP; deploy and everything else works like it should.
The second solution is to use Ant for it. Forget it. You will deeply regret it.
The third solution is to use Maven for it. You can forget the comfort of WTP publishing unless you do some tricks. Use war overlays as others suggested. Be sure to install both m2eclipse and the m2eclipse extras. There is an extension plugin released recently that can help you, described at this blog. I did not try it, but it looks OK. Anyway, Maven has nothing to do with linked folders, so I think even the first solution and the Maven overlay can live together if necessary.
As for headless builds, you can use HeadlessEclipse for the first solution. It is dead (by me) now, but still works :). If you use the Maven overlay + Eclipse stuff, headless builds are covered by Maven.
This is a little more involved, but at a high level we do it as follows. We have the core platform UI divided into multiple war modules based on features (login-ui, catalog-mgmt-ui, etc.). Each of these core modules is customizable by the customer-facing team.
We merge all of these modules at build time into one single war module. The merge rules are based on Maven's assembly plugin.
You usually start from the Java source code. WARs don't include the Java source code, just the compiled classes under WEB-INF/classes or JARs under WEB-INF/libs.
What I would do is use Maven and start a brand new empty webapp project with it: http://maven.apache.org/guides/mini/guide-webapp.html
After you have the new empty project structure, copy the Java source code to it (src/main/java) and fill out the dependencies list in pom.xml.
Once you've done all this you can use mvn clean package to create a standard WAR file that you can deploy to Tomcat.
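A rough sketch of those steps on the command line, with placeholder group/artifact IDs as in the linked guide:

# generate the empty webapp project structure
mvn archetype:generate -DgroupId=com.example -DartifactId=my-webapp \
    -DarchetypeArtifactId=maven-archetype-webapp -DinteractiveMode=false
# copy the Java sources into src/main/java, declare dependencies in pom.xml, then build the WAR
mvn clean package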
You might want to look into designing your core app with pluggable features based on interfaces.
For example say your core app has some concept of a User object and needs to provide support for common user based tasks. Create a UserStore interface;
public interface UserStore
{
    public User validateUser(String username, String password) throws InvalidUserException;
    public User getUser(String username);
    public void addUser(User user);
    public void deleteUser(User user);
    public void updateUser(User user);
    public List<User> listUsers();
}
You can then code your core app (logon logic, registration logic etc) against this interface. You might want to provide a default implementation of this interface in your core app, such as a DatabaseUserStore which would effectively be a DAO.
You then define the UserStore as a Spring bean and inject it where needed;
<bean id="userStore" class="com.mycorp.auth.DatabaseUserStore">
<constructor-arg ref="usersDataSource"/>
</bean>
This allows you to customise or extend the core app depending on a specific customer's needs. If a customer wants to integrate the core app with their Active Directory server, you write an LDAPUserStore class that implements your UserStore interface using LDAP, configure it as a Spring bean, and package the custom class as a dependent jar. A sketch of such an implementation is below.
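A minimal sketch of what that customer-specific class could look like; the LDAP calls are stubbed out, the constructor argument is just an example of what you might inject from the Spring configuration, and it assumes UserStore, User and InvalidUserException are visible (same package or imported):

package com.mycorp.auth; // example customer-specific package

import java.util.Collections;
import java.util.List;

// Customer-specific UserStore backed by LDAP/Active Directory; the directory
// access is left as stubs to keep the sketch short.
public class LDAPUserStore implements UserStore
{
    private final String ldapUrl;

    public LDAPUserStore(String ldapUrl)
    {
        this.ldapUrl = ldapUrl;
    }

    public User validateUser(String username, String password) throws InvalidUserException
    {
        // e.g. bind against the directory with these credentials and map the
        // entry to a User; throw InvalidUserException if the bind fails
        return null; // stub
    }

    public User getUser(String username) { /* search the directory by username */ return null; }

    public void addUser(User user) { /* create a directory entry */ }

    public void deleteUser(User user) { /* remove the directory entry */ }

    public void updateUser(User user) { /* modify the directory entry */ }

    public List<User> listUsers() { /* search and map all entries */ return Collections.emptyList(); }
}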
What you are left with is a core app which everyone uses, and a set of customer-specific extensions that you can provide and sell separately; heck, you can even have the customer write their own extensions.