Incremental deployment of Java web applications

We have the following problem. Developers frequently need to make small changes to our web applications. When I say small, I mean things like correcting the spelling on a web page or similar. Generating and redeploying WAR archives can be slow and costly in such scenarios.
How could we automate and install changes incrementally? For example: generate a new exploded WAR, compare its files with the exploded WAR in production, and then replace in production only the files affected by the change: .jsp, .html, .class, etc.
This need not be hot deployment; it's OK to restart the server. What I wish to avoid is having to copy and deploy WARs that can be 80 MB in size. Sometimes connections are slow, and making a change to a web application as minuscule as a simple spelling correction can take hours.
We use Maven to automate our build process. The key issue is to automate the whole process, so that I can be sure that app v2.2.3 in my Subversion repository is exactly what I have in production after the incremental deployment.
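For what it's worth, a minimal sketch of the compare-and-copy idea described above, using only the JDK. Both paths are hypothetical, MD5 is used purely for change detection (not security), and files deleted from the new build are not handled:

import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.stream.Stream;

public class IncrementalSync {

    public static void main(String[] args) throws IOException {
        Path source = Paths.get("target/myapp");              // freshly built exploded WAR (hypothetical path)
        Path target = Paths.get("/opt/tomcat/webapps/myapp"); // production exploded WAR (hypothetical path)

        try (Stream<Path> files = Files.walk(source)) {
            files.filter(Files::isRegularFile).forEach(src -> {
                try {
                    Path dst = target.resolve(source.relativize(src));
                    // Copy the file only if it is new or its content differs.
                    if (!Files.exists(dst) || !digest(src).equals(digest(dst))) {
                        Files.createDirectories(dst.getParent());
                        Files.copy(src, dst, StandardCopyOption.REPLACE_EXISTING);
                        System.out.println("updated " + dst);
                    }
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
        }
    }

    // Hex-encoded MD5 of a file's contents, used only to detect changes.
    private static String digest(Path file) throws IOException {
        try {
            byte[] hash = MessageDigest.getInstance("MD5").digest(Files.readAllBytes(file));
            StringBuilder hex = new StringBuilder();
            for (byte b : hash) hex.append(String.format("%02x", b));
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e);
        }
    }
}

The same effect can be had with rsync -rc over a network link, but a small program like this is easy to call from a Maven build and to extend, for example to restart the server only when a .class file actually changed.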

We used to do this sort of thing all the time. We worked in a bank, and there were sometimes legal phrases or terms and conditions that needed to be changed today (or, more usually, yesterday).
We did two things to help us deploy quickly. The first was a good change-control and build process: we could change and deploy any version we liked, and we had a good test suite with which we could test changes easily.
The second was more controversial. All of our HTML was deployed as separate files on the server; there was no WAR. Therefore, when circumstances came up where we needed to change something textual quickly, we could do it. If Java code needed changing, we always did a FULL build and deploy.
This is not something I'd recommend, but it was good for our situation.
The point of a WAR is that everything gets deployed at the same time; if you're using a WAR, that means you want it deployed all at once.
One suggestion is not to make such corrections so often (once a week, perhaps?). Then you won't have so much pain.

Hard to say. You can of course replace single class files in an exploded webapp, but this is generally a bad idea and you don't see many people doing it.
The reason is that when you make small changes like this, it becomes harder and harder to detect differences between production and development. The chance of sending a wrong class file and breaking the production server increases over time.
When you say text changes: isn't it an idea to keep the text resources separate from the WAR file? That way, not only developers but maybe even the customer can easily add or change translations.
To the customer it's important, but technically it's silly to do an 80 MB deploy over a slow line to fix a small typo.
You can also look at your build/delivery cycle and increase testing efforts to prevent the need for these small changes.
Hope this helps.

You can have the master WAR deployed somewhere the running servers can access it. Then, instead of deploying WAR files to the individual servers, you can use rsync and Perl to determine whether any files in the master WAR have changed, distribute the changed files to the servers, and execute restarts.

diff and patch:
http://stephenjungels.com/jungels.net/articles/diff-patch-ten-minutes.html

At the moment I have SVN installed on the remote server, so in the case of a simple update you can just update a single file; transferring the big WAR file would be quite impractical.
You can automate this into a single-click deployment using PuTTY/plink (if you are using Windows) by creating a simple script on the local machine and another one on the remote machine.
At the moment I have a DEVELOPMENT SVN and a LIVE SVN. The Ant build merges DEV into LIVE and commits the result back to the LIVE repository. At that stage the remote server can do an svn up and automatically receive the requested files.
You can further improve the update script to restart the server when classes have changed, and skip the restart when only scripts/JSPs were updated.
This way you also have the option to roll back to a previous version, so you can be sure you have a working web app at all times.
To improve the SVN merging process, this tool is quite useful: http://www.orcaware.com/svn/wiki/Svnmerge.py

The usual answer is to use a Continuous Integration system which watches your Subversion repository, builds the artifacts, and deploys them; you just want your web application to be able to keep working after being redeployed. The question is whether that is fast enough for you.

I don't think there's a straightforward answer to this one.
The key here is modularisation, a problem which I don't think is solved very well with Java applications at present. You may want to look at OSGi or dynamic modules, although I'm not sure how effective they are in terms of this problem.
I've seen solutions where people drop classes into the application server/servlet container. I don't agree with it, but it does appear to work... I'm sure there are horror stories, though!
Maven certainly makes things easier by splitting applications into modules, but if you do this and deploy modules independently, you need to make sure that the various versions play nicely together in a test environment to begin with...
An alternative is to partition your application in terms of functionality and host separate functions on various servers, e.g:
Customer Accounts - Server A
Search - Server B
Online Booking - Server C
Payment Services - Server D
The partitioning makes it easier to deploy applications, but again you have to make sure that your modules play nicely together first. Hope that helps.

I have had a similar situation before. It really is a separation-of-concerns issue, and it's not too straightforward. What you need to do is separate the text from the template/HTML page.
We solved this by placing our text in a database table and using the table as a message resource, the same way people use myMessages.properties for internationalization (i18n). This gives you two advantages: you can internationalize the text, and you can make changes in production instantly and easily without a code deployment. We also cached the table to ensure performance didn't suffer much at all.
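For illustration, a rough sketch of that approach with a simple cache. The table and column names (app_messages, msg_key, msg_text) and the class itself are hypothetical; a real version would also handle locales and cache expiry:

import javax.sql.DataSource;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class DbMessageSource {

    private final Map<String, String> cache = new ConcurrentHashMap<>();
    private final DataSource dataSource;

    public DbMessageSource(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    // Returns the text for a key, loading it from the database on first use.
    public String getMessage(String key) {
        return cache.computeIfAbsent(key, this::load);
    }

    // After editing a row in production, clear the cache to pick up the change.
    public void refresh() {
        cache.clear();
    }

    private String load(String key) {
        String sql = "SELECT msg_text FROM app_messages WHERE msg_key = ?";
        try (Connection con = dataSource.getConnection();
             PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setString(1, key);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() ? rs.getString(1) : "??" + key + "??";
            }
        } catch (SQLException e) {
            throw new IllegalStateException("Could not load message " + key, e);
        }
    }
}

A typo fix then becomes a single UPDATE on app_messages followed by a call to refresh(), with no deployment at all.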
Not a solution for all, but it did work really well for us.

Multiple versions in a web application: duplication or messy code?

I used to manage versions with a tag in Git, but that was a long time ago, for stand-alone applications. Now the problem is that I have a web application, and clients connecting to the same application may expect to communicate with different versions of it.
So I added a path variable for the version to the input, like this:
@PathParam("version") String version
And the client can specify the version in the URL:
https://whatever.com/v.2/show
Then across the code I added conditions like this:
if (version.equals("v.2")) {
    // Do something
}
else if (version.equals("v.3")) {
    // Do something else
}
else {
    // Or something different
}
The problem is that my code is becoming very messy. So I decided to do it a different way: I added this condition at only one point in the code, and from there I call different classes according to the version:
MyClassVersion2.java
MyClassVersion3.java
MyClassVersion4.java
The problem now is that I have a lot of duplication, and I want to solve that problem as well. How can I build a web application that:
1) deals with multiple versions,
2) is not messy (with a lot of conditions), and
3) doesn't have much duplication?
Normally, when we speak of an old version of an application, we mean that the behavior and appearance of that version is cast in stone and does not change. If you make even the slightest modification to the source files of that application, then its behavior and/or appearance may change (and according to Murphy's law it will change), which is unacceptable.
So, if I were you, I would lock all the source files of the old version in the source code repository, so that nobody can commit to them, ever. This approach solves the problem and dictates how you have to go about everything else: Every version would have to have its own set of source files which would be completely unrelated to the source files of all other versions.
Now, if the old versions of the application must have something in common with the newest version, and this thing changes, (say, the database,) then we are not exactly talking about different versions of the application, we have something more akin to different skins: The core of the application evolves, but users who picked a skin some time ago are allowed to stick with that skin. In this case, the polymorphism solution which has already been suggested by others might be a better approach.
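For illustration, a minimal sketch of that polymorphism approach (all class and method names are made up, not taken from the original post):

import java.util.HashMap;
import java.util.Map;

// Shared behaviour lives in a base class; each version overrides only
// what differs, and the version check shrinks to a single map lookup.
interface ShowHandler {
    String show();
}

class DefaultShowHandler implements ShowHandler {
    public String show() {
        return "behaviour shared by all versions";
    }
}

class ShowHandlerV2 extends DefaultShowHandler {
    @Override
    public String show() {
        return "v.2-specific behaviour";
    }
}

class ShowHandlerV3 extends DefaultShowHandler {
    // Inherits show() unchanged: versions that don't differ add no code.
}

class HandlerRegistry {
    private static final Map<String, ShowHandler> HANDLERS = new HashMap<>();
    static {
        HANDLERS.put("v.2", new ShowHandlerV2());
        HANDLERS.put("v.3", new ShowHandlerV3());
    }

    // The only place in the application that inspects the version string.
    static ShowHandler forVersion(String version) {
        ShowHandler handler = HANDLERS.get(version);
        return handler != null ? handler : new DefaultShowHandler();
    }
}

The resource method then shrinks to return HandlerRegistry.forVersion(version).show();, and supporting v.4 means adding one subclass and one map entry rather than another branch in every conditional.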
Your version number is in a place in the URL called the 'context root'.
You could release multiple different WAR files, each of which is configured to respond on a different context root.
So: one WAR for version 1, one WAR for version 2, etc.
This leaves you with code duplication.
So what you are really asking is, "how do I efficiently modularise Java web applications?".
This is a big question, and leads you into "Enterprise Java".
Essentially you need to solve it by abstracting your common code to a different application. Usually this is called 'n-tier' design.
So you'd create an 'integration tier' application which your 'presentation' layer WAR files speak to.
The Integration tier contains all the common code so that it isn't repeated.
Your integration tier could be EJB or webservices etc.
Or you could investigate using OSGi.

Can you remove Sakai core tools you don't want/need?

Something I've been wondering recently: is it possible to actually "remove" core tools from the vanilla Sakai build without a huge effort (editing loads of config files)?
I know about stealthing tools (https://confluence.sakaiproject.org/display/DOC/Provisional+Tools), and I "think" there's some way to "disable" tools (or is that just stealthing?). But simply to remove the possibility of potential problems and lower the service's memory footprint and startup time, it would be nice if there were a supported means to not have tool X, Y, or Z in the service at all.
I've never tried just removing JARs to see what happens, but I suspect that mightn't be a good idea. The build probably needs to be compiled with only the desired tools deployed to the webapps directory, which I would think means changing a whole load of Maven files so that a "mvn clean install sakai:deploy" produces something lighter.
The Sakai architecture is actually more akin to a lot of loosely (or tightly in some cases) coupled tools than a unified system. This is an advantage from the perspective that you can do exactly the thing you want to do here. It is a disadvantage from a unified user experience perspective (though that is not an architectural limitation but rather a side effect of how the tool teams were run early on in the project).
If you want to remove a tool (like Samigo for this example) then you can simply delete the war file (and directory) related to it from your TOMCAT_HOME/webapps directory. Run this from your tomcat home directory:
rm -rf webapps/samigo-app*
When you start up Tomcat, the tool will not be loaded and things will work fine (assuming there is not another tool or part of Sakai that expects it to be there). Some tools, like Resources (sakai-content-tool), should not be removed for that reason (though stealthing them would be fine).
Please note that removing only the tool will not save you as much as you might hope, since most tools also have a related service which lives in TOMCAT_HOME/components. The service component is actually an exploded WAR file (basically the same as the tool webapp), but it has no interface and has to follow some Sakai conventions in order to load properly. In the case of Samigo again, you could remove it like so (running from your Tomcat home):
rm -rf components/samigo-pack
You should NOT do this while the system is running. You should also NOT remove the API jars from shared.
When you restart Sakai after removing the component, you will see a more significant drop in resource use, since the tool service is no longer loaded into memory and initialized. I saw about a 5-second reduction in startup time (90 s to 85 s) and about a 25 MB reduction in JVM memory used (from 795 to 770) by removing Samigo and its service.
Your best bet would be to "trial and error" your way to the optimal solution for your situation: try removing a tool and its service (if it has one) and see whether things start up without errors and the tools you do use work as expected.
Also, please note that removing a tool will NOT remove the tool pages in existing courses. You will end up with a page which simply displays nothing (because Sakai sees it as an empty page in the course now). If you add the tool back into the system then it will appear on the page again.
UPDATE: If you want to remove the blank tool page, there is one easy option: just go into the site and remove the page the tool is on. This can be done from the Sites admin tool.
Alternatively, you could go into the database and remove all the pages which contain the specific tool id. This is pretty risky though so I don't recommend it.
Generally, the removal of a tool like this would happen before the tool is used in production so hopefully this is a rare case.
After fairly extensive testing, this is what I've found you can remove related to VLE functionality. This probably won't apply to that many people, but it's useful if you purely want collaborative tools (for running a research VRE, or just a slimline tool provider):
Under tomcat webapps...
samigo (also make sure you remove the samigo folder under <tomcat root>/sakai/samigo)
presence (make sure you turn off presence in sakai.properties too though!)
sakai-podcasts
podcasts
lessonbuilder
osp (all of it i.e delete all wars referencing osp-*)
sakai-signup-tool (we don't have a need for this, but you might)
citations
polls-tool
sakai-gradebook-tool (DO NOT remove sakai-gradebook-testservice!)
grades-rest
dav (assuming you don't use webdav; make sure to turn off webdav in sakai.properties. We use shibboleth for SSO, so we can't currently use webdav... also, the advent of multi drag-n-drop + zip archiving of files/folders in resources has made webdav doubly needless)
sakai-syllabus-tool
sakai-reset-pass (Again, we use shibboleth SSO, so don't need password reset functionality)
DO NOT remove sakai-assignment-tool
sakai-postem
Under tomcat components...
samigo
presence
sakai-podcasts
lessonbuilder-components
osp (all of it)
sakai-signup
citations
polls-tool
(I haven't fully tested this, but it seems wise to NOT remove grade related directories)
sakai-syllabus-tool
DO NOT remove sakai-assignment-tool
Removing this stuff reduced my startup time by a couple of minutes and reduced my memory footprint on the server too (I don't have exact figures for that).

Best place for Jars

A quick question about best practice, please, if anyone can help.
I am about to deploy my project onto a web server running Tomcat. This server will host quite a few domains which are mainly static, with just HTML code. Mine, however, includes a database connector and also some JAX JARs.
My question is
For best practice, is it better to put the .jar files into $TOMCAT_HOME/lib so that they are available to all webapps (and maybe used by others in the future), or should I keep them in the WEB-INF/lib folder, which is webapp-specific? If I then build another webapp that uses these JARs, I would have to duplicate them in that webapp's WEB-INF/lib folder.
I know it would work either way, but what is best practice, please?
I would tend to keep them per-webapp. That gives you the opportunity to upgrade one webapp without having to touch the others. So you can roll out fixes etc. to one app without having to rebuild/retest the others.
An application server like Tomcat is designed to be able to isolate webapps from one another, enabling them to change independently, use different versions of the same libraries etc. Unless you are absolutely sure that your apps will always need the same version, and will always be ready to upgrade simultaneously (or never) to newer versions, then keeping them per-webapp makes more sense.

How to organize staging deployment of Java web app?

We have a Java web app and a number of developers working on it. Every developer works on his/her own feature, in its own branch. When the feature is ready, we want to review it and visually test it (after all unit and integration tests have passed, of course). We want to automate this deployment process. Ideally, we would like to let our developers click just one button somewhere to get the application deployed to http://example.com/staging/branches/foo (where branches/foo is the developer's path in the SVN repository).
Then, the deployment is reviewed (by project sponsors mostly), merged into /trunk, and removed from the staging server.
I think that I'm not the first one who needs to implement such a scenario. What are the tools and technologies that may help me?
Typically, I would use a stage environment to test the "trunk" (i.e., all the individual branches for a release merged together). There are several reasons for this:
Stakeholders and sponsors usually don't have time to test individual branches. They want to test the entire release. It also tends to get very confusing for people outside the immediate team to keep track of different, changing URLs and to understand why feature X works on one URL and not another. Always keep it simple for your sponsors.
It tends to become very messy and costly to maintain more than one instance of third-party dependencies (databases, service providers etc) for proper stage testing. Bear in mind that you want to maintain realistic test-data at all times.
Until you merge all individual branches together for a release, there will be collisions and integration bugs that will be missed. Assume that automated integration tests won't be perfect.
All that being said, there are lots of good tools for automatic build/deploy out there. Not knowing anything about your build setup and deployment environment, a standard setup could consist of a build server, Maven, and Tomcat. The build server would execute the build and deploy the resulting application to the test server. If you are using Maven and Tomcat, there is a plugin available for this task (http://mojo.codehaus.org/tomcat-maven-plugin/introduction.html). There are a number of good build servers out there as well with good support for Maven; TeamCity is popular, as is Hudson CI.
Basically you can use Hudson/Jenkins.
There are ways to manage multiple deployments on one machine with some plugins, as stated in the following post on Jenkins Users; you'll just have to map those deployments to the branches the developers are working on.
As @pap said, Hudson and other CI software will build, test (if you have any tests), and deploy webapps; you'll just have to configure the procedure. Hope the link is helpful.

Java EE Jar file sharing

At our shop, we are maintaining roughly 20 Java EE web applications. Most of these applications are fairly CRUD-like in their architecture, with a few of them being pretty processor intensive calculation applications.
For the deployment of these applications we have been using Hudson set up to monitor our CVS repository. When we have a check-in, the projects are set to be compiled and deployed to our Tomcat 6.0 server (Solaris 10, sparc Dual-core 1.6 GHz processor, 2 GB RAM...not the beefiest machine by any stretch of the imagination...) and, if any unit-tests exist for the project, those are executed and the project is only deployed if the unit-tests pass. This works great.
Now, over time, I've noticed that a lot of the projects I create utilize the same .jar files over and over again (Hibernate, POI (Excel output), SQL Server JDBC driver, JSF, ICEFaces, business logic .jar files, etc.). Our practice has been to just keep a folder on our network drive stocked with all the default .jar files we have been using, and when a new project is started we copy this set of .jar files into the new project and go from there... and I feel so dirty every time this happens that it has started to keep me up at night. I have been told by my co-workers that it is "extremely difficult" to set up a .jar repository on the Tomcat server, which I don't buy for a second... I attribute it to pure laziness and, probably, no desire to learn the best practice. I could be wrong; however, I am just stating my feelings on the matter. This also seems to bloat the size of the .war files that get deployed to the server.
From my understanding, Tomcat itself has a set of .jar files that are accessible to all applications deployed to it, so I would think we would be able to consolidate all of these duplicate .jar files in all our projects and move them onto the tomcat server. This would involve only updating one .jar file on the server if, for example, we need to update the ICEFaces .jar files to a new version.
Another part of me says that by keeping only one copy of the .jar files on the server, I might need to mirror the server's lib directory in my development environment as well (i.e., include those .jar files as Eclipse dependencies).
My gut instinct tells me that I want to move those duplicated .jar files onto the server...will this work?
I think Maven and Ivy were born to help manage JAR dependencies. Maybe you'll find that those are helpful.
As far as the debate about duplicating the JARs in every project versus putting them in the server/lib, I think it hinges on one point: How likely is it that you'll want to upgrade every single application deployed on Tomcat at the same time? Can you ever envision a time where you might have N apps running on that server, and the (N+1)th app could want or require a newer version of a particular JAR?
If you don't mind keeping all the apps in synch, by all means have them use a common library base.
Personally, I think that disk space is cheap. My preference is to duplicate JARs for each app and put them in the WAR file. I like the partitioning. I'd like to see more of it when OSGi becomes more mainstream.
It works most of the time, but you can get into annoying situations where a JAR that you have moved into Tomcat tries to create an instance of a class in one of your web application's JARs, leading to ClassNotFoundExceptions being thrown. I used to do this, but stopped because of these problems.
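For completeness: if a JAR in the container's shared classpath really must instantiate classes living in a webapp's WEB-INF/lib, one common workaround is to load them through the thread context classloader, which the container sets to the current webapp's loader. A minimal sketch (the class name passed in is hypothetical):

public class ContextLoaderFactory {
    public static Object newWebappInstance(String className) throws Exception {
        // The common classloader (jars in $CATALINA_HOME/lib) cannot see
        // classes in a webapp's WEB-INF/lib, but the thread context
        // classloader, which Tomcat sets per request to the webapp's own
        // loader, can.
        ClassLoader webappLoader = Thread.currentThread().getContextClassLoader();
        Class<?> type = Class.forName(className, true, webappLoader);
        return type.getDeclaredConstructor().newInstance();
    }
}

Usage would look like Object impl = ContextLoaderFactory.newWebappInstance("com.example.MyWebappClass");. That said, keeping such JARs inside each WAR, as suggested below, sidesteps the problem entirely.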
I really don't think putting libraries in common/lib is a good idea. The idea behind using WAR files as applications in a servlet container is to have real isolation between your webapps. You could face errors like deploying some third-party WAR (with its own libraries inside WEB-INF/lib) that behaves unexpectedly because it loaded another version of one of its libraries from the common classloader (remember that the regular class-loading behavior is to look first at the common classloader and, only if the class isn't found there, at the one for your webapp). Not to mention how painful it could be to move some application to another servlet container or an application server.
As mentioned before, you could use Maven to deal with JAR dependencies, and if you want homogeneous use of libraries, define a parent POM (Maven jargon) across all your applications.
In my experience you should be very careful with sharing libraries between web applications by moving them into the web container itself.
Let them live in WEB-INF/lib so your WARs are self-contained (you WILL be glad you did one day).
What you might consider instead is employing Maven or Apache Ivy to pull in library JARs from a common repository. This is very useful and should not be a problem in your scenario.
Edit: a notable exception is the Metro library (the web service layer from GlassFish), which needs to be in the web container and not in the web application.
