My web.xml is different in the development and production environments. For example, in the development environment there is no need for security constraints.
Typically I deploy new application version as follows:
Export Eclipse project to WAR.
Upload WAR to the server.
Redeploy.
The problem is that I have to manually uncomment security constraints in web.xml before exporting.
How do you solve this problem?
I have also come across the opinion in some articles that "web.xml is rarely changed". But how can web.xml not change if it is exported to the WAR on every update?
Thanks in advance!
If you can't use the same web.xml during development, I would automate the build process, use two web.xml files and bundle the "right" one at build time depending on the targeted environment, as Brian suggested. But instead of Ant I'd choose Maven, because it will require less work IMHO and it has a built-in feature called profiles that is perfect for managing environment-specific stuff like this.
In other words, I'd put the build under Maven 2 and use a production profile containing a specific maven-war-plugin configuration to build a WAR containing a web.xml having the required security constraints. Another option would be to merge the development web.xml (cargo can do that) to add the security-constraints but this is already a bit more "advanced" solution (a bit more complex to put in place).
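A minimal sketch of that profile approach (the profile id and the web.xml path are illustrative; maven-war-plugin's webXml parameter points the build at an alternative deployment descriptor):

```xml
<profiles>
  <profile>
    <id>production</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-war-plugin</artifactId>
          <configuration>
            <!-- use the web.xml that contains the security constraints -->
            <webXml>src/main/config/prod/web.xml</webXml>
          </configuration>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
```

Building with `mvn package -Pproduction` would then pick up the production descriptor, while a plain `mvn package` keeps the development one.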
I would create a development and production deployment with different web.xml configs. Automate the building/maintenance of these via your build (Ant/Maven etc.) to keep control of the common elements required.
I had to solve this problem many times in the past, and ended up writing XMLTask - an Ant plugin which allows the modification of XML files without using normal text replacement (it's a lot cleverer than that) and without having to mess with XSLTs (it's a lot simpler than that). If you follow the above approach you may want to check this out. Here's an article I wrote about it.
Assuming that you're stuck with the idea of web.xml changing before deployment to production, then my likely approach would be to run the development web.xml through a simple XSL transform which "decorated" the web.xml with your production-only elements, such as security constraints. Assuming that you can hook this step into your build process, then the production-ready web.xml should appear during your export process.
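A sketch of such a decorating transform, assuming a web.xml without a default namespace (for a namespaced descriptor you would declare and match the namespace) and an illustrative security constraint:

```xml
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <!-- identity transform: copy everything as-is -->
  <xsl:template match="@*|node()">
    <xsl:copy>
      <xsl:apply-templates select="@*|node()"/>
    </xsl:copy>
  </xsl:template>
  <!-- append production-only elements to the web-app root -->
  <xsl:template match="web-app">
    <xsl:copy>
      <xsl:apply-templates select="@*|node()"/>
      <security-constraint>
        <web-resource-collection>
          <web-resource-name>protected</web-resource-name>
          <url-pattern>/*</url-pattern>
        </web-resource-collection>
        <auth-constraint>
          <role-name>user</role-name>
        </auth-constraint>
      </security-constraint>
    </xsl:copy>
  </xsl:template>
</xsl:stylesheet>
```

Any XSLT 1.0 processor (e.g. Ant's xslt task) can apply this during the build.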
However, it is generally a good idea not to have different web.xml across environments, it devalues your testing. Having the same value in all environments will reduce the risk of bugs appearing only in your production environment.
I converted my project to be built using Ant. The starting point was just this build.xml: http://tomcat.apache.org/tomcat-6.0-doc/appdev/build.xml.txt
The above build doesn't have the feature of copying in a different web.xml (based on, e.g., a property set when building), but you'll learn how to do that once you get a bit into Ant; it should be pretty easy.
As a nice side effect, deploying to a remote tomcat is now just a couple of clicks away from within Eclipse instead of Export->war and manually copying it to the server.
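As a rough sketch of the missing feature (the property and path names here are assumptions, not part of the linked build.xml), a target that copies the environment-specific web.xml before packaging could look like:

```xml
<!-- invoked e.g. with: ant -Dtarget.env=prod dist -->
<target name="select-webxml">
  <!-- config/dev/web.xml and config/prod/web.xml are assumed locations -->
  <copy file="config/${target.env}/web.xml"
        tofile="${build.home}/WEB-INF/web.xml"
        overwrite="true"/>
</target>
```

Making the packaging target depend on this one ensures the right descriptor always ends up in the WAR.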
I would add the necessary infrastructure to allow a mechanical build, with ant or maven.
When THAT is done you can have your mechanical build create two targets, one for test and one for production.
You should however, strongly consider testing the same code as you have in production. You will be bitten otherwise.
I believe having a single WAR that works in multiple environments is a better solution than baking a new one per environment (dev, qual, and prod) via a profile option. It is super annoying that there is no better mechanism to get environment variables directly into web.xml without using a library like Spring.
One solution for web.xml environment configuration, given that your environment customization is related to filter init-params such as:
<filter>
<filter-name>CAS Filter</filter-name>
<filter-class>edu.yale.its.tp.cas.client.filter.CASFilter</filter-class>
<init-param>
<param-name>edu.yale.its.tp.cas.client.filter.loginUrl</param-name>
<param-value>https://<foo>:8443/login</param-value>
...
The particular filter class referenced above (CASFilter) is public. This means you can extend it with a custom adapter that adds in your environment configuration. This allows you to stay out of that nasty web.xml file.
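The core of such an adapter can be sketched as below; the environment variable name and the fallback logic are assumptions, and a real adapter would do this inside the filter's init() rather than in a standalone class:

```java
// Sketch: resolve a filter init-param from the environment first,
// falling back to the value hard-coded in web.xml.
public class EnvAwareParam {

    // hypothetical helper; a real CASFilter subclass would call this from init()
    public static String resolve(String envVar, String webXmlValue) {
        String fromEnv = System.getenv(envVar);
        return (fromEnv != null && !fromEnv.isEmpty()) ? fromEnv : webXmlValue;
    }

    public static void main(String[] args) {
        // with CAS_LOGIN_URL unset, the web.xml default wins
        System.out.println(resolve("CAS_LOGIN_URL", "https://localhost:8443/login"));
    }
}
```

The same web.xml can then ship to every environment, with per-environment values injected from outside.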
I just started to develop a Java web application based on the Ninja framework. Everything works great, but: with all the Ninja dependencies, the deploy WAR is around 25MB. I really hope I won't have to upload a 25MB Java archive all the time - especially since the dependencies won't change nearly as often as, e.g., a stylesheet of my app.
Is there a practical solution to move the Ninja framework dependencies to a separate jar? I am working with Eclipse, so a solution that integrates with the IDE would be great.
So far I have looked into Maven dependency scoping and have (unsuccessfully) tried to move the dependencies into a separate project and refer to that project with a system-scoped dependency (which, in my understanding, I would be able to deploy as a separate jar file). I currently fail at building this dependency jar with Maven - but I also wonder if there are better approaches.
I deploy the application on a Tomcat server in a Plesk installation.
Another option would be to exclude libraries that you don't use. For instance, if you don't use JPA you can safely exclude it from the build via Maven's exclusions tag.
Background: Ninja 4 potentially bundles too many libraries by default. That's cool, because everything will work out of the box without thinking about libraries needed. The downside is that the jar/war may be too big for what you want to do. There are discussions on the way to make Ninja more modular - feel free to chime in on our mailing list :)
But as written above - you can cut Ninja's bundle down yourself using Maven's exclude.
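As a sketch of that exclusion (the artifact names and version property are assumptions; check `mvn dependency:tree` for the real coordinates in your build):

```xml
<dependency>
  <groupId>org.ninjaframework</groupId>
  <artifactId>ninja-core</artifactId>
  <version>${ninja.version}</version>
  <exclusions>
    <!-- hypothetical: drop the JPA/Hibernate stack if you don't use it -->
    <exclusion>
      <groupId>org.hibernate</groupId>
      <artifactId>hibernate-entitymanager</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Each exclusion trims the transitive dependencies Maven packages into WEB-INF/lib.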
If you have to use all the dependencies, there is no way to avoid deploying them with your application.
You don't tell if you are deploying into a container (maybe Tomcat). If you do, you can try to deploy the needed libraries into the container and set the Maven scope to provided to avoid redeploying the libraries.
Having the libraries provided by the container has benefits, but it can also be a burden. Depends strongly on your deployment and operation processes.
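A sketch of the provided-scope idea (the coordinates are illustrative); the jars must then already exist in the container, e.g. in Tomcat's lib or shared/lib directory:

```xml
<dependency>
  <groupId>org.ninjaframework</groupId>
  <artifactId>ninja-servlet</artifactId>
  <version>${ninja.version}</version>
  <!-- provided: compiled against, but not packaged into the WAR -->
  <scope>provided</scope>
</dependency>
```

The WAR shrinks accordingly, at the cost of coupling your deployments to the container's library setup.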
This may be a very rudimentary question, but please help me out if this is well-known and has been solved elsewhere.
I have a multi-WAR setup (all Maven modules), say kilo-webapp1 and kilo-webapp2, as two WARs that I need to deploy on a Tomcat instance. Both webapps use services from a common service jar, say kilo-common-services.jar, which has its own Spring context that is loaded by the users of the jar, viz. kilo-webapp1 and kilo-webapp2 in this case. It so happens that the initialization of the services in kilo-common-services takes a long time, so I want it to happen only once. That keeps the time to bring up the instance low, and also lets me use it as a second-level cache that is kept current in the JVM instance. To do this, we resorted to the following steps:
1. Modified catalina.properties in CATALINA_BASE to set shared.loader to ${catalina.base}/shared/lib.
2. Copied kilo-common-services.jar and all of its dependent jars to CATALINA_BASE/shared/lib. [Manual step]
3. Copied the Spring-related jars to CATALINA_BASE/shared/lib. [Manual step]
4. Created a beanRefContext.xml file in kilo-common-services.jar, defining a new ClassPathXmlApplicationContext whose constructor is given the location of the Spring context file for the common services.
5. Marked the dependency scope of kilo-common-services.jar and every other dependency (like the Spring jars) as provided in the kilo-webapp1 and kilo-webapp2 POM files. For Spring this is needed to ensure that classpath scanning is not triggered twice; leaving these out of the provided scope also causes various ClassCastExceptions (for log4j, let's say).
6. Made the web.xml of kilo-webapp1 and kilo-webapp2 indicate that their parentContext is the servicesContext defined in kilo-common-services.jar.
I was able to verify that only one instance of the kilo-common-services services exists, but the setup, as you might have imagined, is painful. If someone has best practices for such a setup in an IDE like Eclipse, I would really appreciate it. My problems are as below:
#2 is becoming a challenge. I am currently running mvn dependency:copy-dependencies on kilo-common-services to copy dependent jars from target/dependency to the shared/lib which is a woefully manual step. Time and again, I forget to regenerate dependencies and have to do a redeploy again.
#3 is also not straightforward, as time and again there are newer common dependencies and we always have to remember to copy them to shared/lib to avoid ClassCastExceptions.
#5 is again a maintenance nightmare.
Also, as time progresses, there will be more such disparate common jars that need to be shared, and it would involve pain for each of those jars. Feel free to critique the setup and propose a better one in its place that may be easy to use (from an IDE as well). Would be happy to provide any other details.
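The manual copy in #2 could at least be bound to the build itself; a sketch using maven-dependency-plugin (the output path is an assumption and would normally come from a property per machine):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-to-shared-lib</id>
      <phase>package</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <!-- ${catalina.base} assumed to point at the Tomcat instance -->
        <outputDirectory>${catalina.base}/shared/lib</outputDirectory>
        <includeScope>runtime</includeScope>
      </configuration>
    </execution>
  </executions>
</plugin>
```

Bound to the package phase, the dependencies are refreshed on every build instead of on memory.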
Thanks in advance!
The problem is that your architecture is broken (and that's why you're struggling with the solution). You have two solutions:
1) If you want to share a service that takes a long time (to initialise) between two war applications, make that a separate service completely and access it via rest or any kind of remoting.
2) Merge both webapps into one.
Having the common library in the shared lib folder is going to bring you lots of headaches, and you'll end up rolling it back.
My (personal) approach would be to merge both applications, but keep the packages separate enough and have separate spring configurations. In this way, at least you still keep the logic separation of both webapps.
Also since both run on the same container, there's little gain from having 2 separate wars (unless you're planning to move them to different containers very soon).
About the IDE, you can use the maven-cargo-plugin to start up a tomcat with several web applications with (almost) any configuration you want.
We are developing a RESTful SOA with Spring and Tomcat, utilizing Domain Driven Design (well, that's the plan anyway). There is a migrationProject and an initial basic search service: two separate WAR files, with two separate POMs. Both utilize the same domain objects.
So I will have a separate project that is just the domain objects. I will wrap them up into a jar, and then, using Maven and/or Jenkins, it will deploy automatically (for example, whenever it is pushed to a specific repository).
Having two copies of the same jar sounds like a much worse idea to me. It's not your architecture that is broken; it's your deployment and development process that needs improvement, imho.
(my kind of related question).
Our long term plan is to have one project as the restful interface, with multiple Controllers that have service classes and repositories injected into them from their dependencies.
I have an ant script that I use to build my J2EE application and create jar files. The problem is the following: Two jar files are necessary for the application to run.
commons-math-2.0.jar
commons-math-1.0.jar
However, I want to only use the 2.0 for a particular package inside the application with the rest of the application using 1.0. How can I build the application to only use the 2.0 version for example with a package name such as com.naurus.eventhandler.risk? Again, I'm using an Ant script, but if there's an easier way to do this sort of thing I'm willing to experiment. Thanks!
If the two jars contain different classes/packages, there should be no problem having both of them on the application classpath. It is then a matter of discipline not to use the classes from one jar in the other package.
However I guess these two jars contain mostly the same classes/methods? There are many ways of using different versions of the same classes:
Using different ClassLoader instances. I would not qualify this as "easy"; far from it, as it means opening the door to a bunch of nasty bugs. (It can be helped by using a tool like OSGi.)
Splitting the application into two processes, those processes being launched by the same Ant target and using any means (CORBA, RMI, REST, etc.) to communicate.
I would not advise using any of these methods though. It would probably be simpler to make all your packages use the same version. Is there any specific difficulty in doing so?
That will be problematic since both JAR files will end up in the same classpath when you deploy your J2EE application. You could achieve what you are trying to attempt with OSGI bundles, which allow each package to have separate dependencies. However, that is a relatively large refactoring of your application.
IMO, it would be best to either:
a) Duplicate the features you need from 2.0 (if the number is small and the license allows it, e.g., package individual classes).
or
b) Spend the time to upgrade the entire application to 2.0
You could use the manifest in your jar to define the classpath.
http://docs.oracle.com/javase/tutorial/deployment/jar/manifestindex.html
Although honestly it seems a bit convoluted, it is your requirement.
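For reference, the mechanism the linked tutorial describes is a Class-Path entry in the jar's META-INF/MANIFEST.MF; the path below is illustrative, using the jar name from the question:

```
Class-Path: lib/commons-math-2.0.jar
```

Note the paths are resolved relative to the jar containing the manifest, and entries are separated by spaces.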
I would like to develop a web application in Java/Spring/Hibernate serving as a business platform that I could connect plugins to (i.e. CRM plugin, ware plugin, sales plugin). Those plugins could be dependent on other plugins in a tree manner. The parent project would be packaged as war having all the basic configuration and looks (Spring configs, CSS, scripts), ready-to-go user and group management, security settings, etc.
Altogether, I would like it to behave and look a bit like Joomla, but be built using different tools for different purposes. And I have a few questions concerning that project:
Do you know of any open source projects offering such a platform ready to go?
If not, is Maven applicable for managing those plugins?
What is the best way to package and deploy those plugins?
And last but not least, is this the right way to go, or is it a dead end? Would it be better to create a separate web app for those business needs?
There are lots of ways to build plugin modules.
Some Ideas:
You could package every plugin module as a jar and, in the classpath root of this jar, put a Spring config file with the beans configuration. When you want to use a specific plugin, you can "turn on" the beans of this package in a web application by simply adding this file to the contextConfigLocation parameter in your web.xml:
<listener>
<listener-class>org.springframework.web.context.ContextLoaderListener</listener-class>
</listener>
<context-param>
<param-name>contextConfigLocation</param-name>
<param-value>
classpath:module1.xml
classpath:module2.xml
classpath:module3.xml
classpath:module4.xml
</param-value>
</context-param>
So you are able to use those beans in your web application. Another way of doing this, would be to use a more annotations driven approach. Or you can mix the methods.
Some time ago, I structured a way to automatically hot-detect plugins (at execution time, without having to restart the application) in a desktop application by detecting all implementations of a specific abstract class (a contract) on the classpath. So all I had to do to build a new plugin was to implement this "contract". I used some "classloader" goodies to do this.
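The "contract" idea can be sketched without any framework: load an implementation class by name at runtime and instantiate it if it satisfies the plugin interface. All class names here are made up for illustration; a real version would scan jars for candidates instead of hard-coding one.

```java
// Sketch of runtime plugin discovery: load a class reflectively and
// instantiate it only if it implements the Plugin contract.
public class PluginLoader {

    public interface Plugin {
        String name();
    }

    // stand-in implementation; real plugins would live in dropped-in jars
    public static class HelloPlugin implements Plugin {
        public String name() { return "hello"; }
    }

    public static Plugin load(String className) throws Exception {
        Class<?> c = Class.forName(className);
        if (!Plugin.class.isAssignableFrom(c)) {
            throw new IllegalArgumentException(className + " is not a Plugin");
        }
        return (Plugin) c.getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) throws Exception {
        Plugin p = load(HelloPlugin.class.getName());
        System.out.println(p.name()); // prints "hello"
    }
}
```

Swapping Class.forName for a URLClassLoader over a plugins directory gives the hot-detect behaviour described above.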
When talking about "modules", you may also want to look at OSGi.
Well... those are some ideas. I hope it helps in any way. ;-)
I think this is a fine way to design a web application, depending on the requirements. I use plugins because I had several clients using the same codebase with different requirements. If you are developing for one installation, I would say don't waste your time.
Now for the how-to. "Plugins" are a very vague concept. I've used plugins
to intercept method calls
to run background processes
to add additional views in my web application
The question is now, how does this work. The method interceptor works using a org.aopalliance.intercept.MethodInterceptor. The background processors use a TimerTask. The additional views in the web application use Spring MVC routing.
My plugins are packaged as JARs and discovered at application startup time as Spring application contexts.
This is all very general, but might give you some ideas to go off of.
Do you know of any open source projects offering such a platform ready to go?
Have a look at Spring Roo
If not, is Maven applicable for managing those plugins?
Yes, it is. Check out how AppFuse uses it.
What is the best way to package and deploy those plugins?
Again, check how Spring ROO or AppFuse does it.
Hope that helps.
And last but not least, is this the right way to go, or is it a dead end? Would it be better to create a separate web app for those business needs?
I have had negative experiences with modularisation in the JPA area. For example, an @Entity Customer is included in the CRM module but is used intensively from other modules. The first natural idea, one module = its own persistence unit, is very hard to realise; JPA ends up reaching across modules, so the modularisation idea hits a dead end and the modules are not separated.
I use a kind of modularisation "in process & in JAR": a structure is built in which some menus / entities etc. belong to "modules" in a lighter sense.
When I'm deploying a WAR (or EAR) to an application server I have to be sure that the environment (everything around the AS) is ready for my application. Is it possible to instruct AS to execute certain Java classes right after deployment, and report a deployment problem if one of them reports a failure?
Implement ServletContextListener and register it with <listener-class> inside your web.xml
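A minimal sketch of such a listener, assuming javax.servlet on the classpath; checkEnvironment() is a placeholder for your own verification logic, and throwing from contextInitialized makes most containers abort the deployment:

```java
import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

// Sketch: fail the deployment if the environment is not ready.
public class StartupCheckListener implements ServletContextListener {

    @Override
    public void contextInitialized(ServletContextEvent sce) {
        if (!checkEnvironment()) {
            throw new IllegalStateException("Environment check failed");
        }
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
    }

    // placeholder: replace with real checks (DB reachable, dirs exist, ...)
    private boolean checkEnvironment() {
        return true;
    }
}
```

The class is then registered with a <listener-class> element in web.xml, as described above.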
What if it's EAR without web.xml at all? I mean, is there any more generic approach?
I'll put my answer back then :) To my knowledge, there is nothing standardized in Java EE for that so the answer is "it depends on what your application server has to offer". For example, with WebLogic you can create ApplicationLifecycleListener classes.
Depending on the complexity of the checks you want to perform, it might be simpler to create some kind of status page deployed as part of the application and check it after deployment (you could also poll it regularly later to check the health of your app).
For complex needs, using a real monitoring solution might be a better choice.