I know there is a lot of information here... but there may be other people with problems like this, and I think it would be a great help to discuss this, or at least get some decent input/suggestions brewing.
Alright, let me start out by giving an overview of our environment.
We have a multi-module maven project with about 11 JARs. Dependent on those internal JARs are 9 WAR files, of which 8 are placed in an EAR file. The remaining WAR file is deployed on its own as a separate application. When the 8 WAR files are built (that reside in the EAR), they are built as skinny WAR files, so the resultant EAR file is at a minimal size with all dependencies in the APP-INF/lib section. All of this works with no issues. We currently deploy to a remote WebLogic 10.3 server that has a lot of memory and CPU, so the load is not on our individual machines. We're also publishing nightly snapshots using a continuous integration build server.
Artifacts that we're deploying:
EAR file containing 8 WAR files, 11 internal JARs, and third party libs: ~70MB
Other WAR file: ~110MB
Some of our software engineers would like to work from home over a VPN connection and have incremental/hot deployment options. Otherwise, because of how we deploy with WebLogic/Maven, they are forced to build the entire EAR file or the 110MB WAR file and upload it over VPN. This is not fun, and it's not fast. I have been reading up on JRebel and was wondering if anyone else uses JRebel with a multi-module Maven project doing remote deployments, and how to do it efficiently.
From some of my reading, it is recommended to 'upload the changes' to the server and have the rebel.xml configurations read those directories for that particular deployment, which... well, brings us to the issue at hand. How do I tell Maven to dump changed resources/newly compiled class files to some other directory so that I can upload them to the server, into the appropriate folders (our server hosts something like 10+ WebLogic instances running on various ports, one instance per developer)? Alternatively, the developers could share their workspace folder on the network, and the rebel.xml files (in a JAR, for example) could point to the appropriate //COMPUTERNAME/workspace/jarProjectName/target/classes folder. The problem I foresee with that is that every time they start WebLogic, it's going to fetch all the .class, configuration, and JSP files across the network, because rebel.xml takes precedence, and that will be terrible over VPN. AFTER the deployment is up, though, hot deployment should work as usual. I just don't want the unnecessary overhead of transferring all the classes over the network for the first boot. Not only that, sometimes developers are at the office, turn off their computers, and then go home. What happens to JRebel/WebLogic then?
It seems like a much better idea to detect which files have changed in the various Maven projects and FTP them to the proper location on the server, so that JRebel can do its thing completely server-side. Does anyone have a good way to do this? Or maybe someone has a solution that does not involve JRebel at all. Let's talk.
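To sketch the server-side layout described above: a rebel.xml bundled in one of the JARs could point at directories on the server itself, so JRebel only ever watches local disk once the changed files have been uploaded. All paths here are hypothetical examples, not our actual layout:

```xml
<!-- rebel.xml bundled in the JAR; paths are illustrative only -->
<application xmlns="http://www.zeroturnaround.com">
  <classpath>
    <!-- JRebel reloads classes from this server-local directory -->
    <dir name="/opt/weblogic/instances/dev1/jrebel/myjar/classes"/>
  </classpath>
</application>
```

Each developer's WebLogic instance would get its own directory tree, and only the changed .class files would need to cross the VPN.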
It is now possible to use JRebel for remote deployments as well. It is really easy to set up, with no need for special networking configurations, opening ports on the remote machine, etc.
http://zeroturnaround.com/jrebel/remoting
It relies heavily on an IDE plugin, but the experience is then as if you were developing on a local machine.
What you should do is have each developer run their own local instances of WebLogic.
There is a fair bit of memory usage with WebLogic, but having to do a deployment over a VPN is going to be a losing proposition. The only way this might work is using LiveRebel. But again, you will still pay a heavy penalty for network transmission, especially over a slow connection.
You are most likely better off running your app in the JDeveloper WLS and dropping the huge shared instance of WebLogic.
Why not use the Samba (http://en.wikipedia.org/wiki/Samba_(software)) protocol for this? You will just need a network drive to be used as a shared location. Developers could set the compiler output path to point to that location, and in the deployed app, the rebel.xml paths should point to the same directories. That would do the trick.
Even if a developer switches off his machine, WebLogic will keep running.
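As a minimal sketch of this setup (the share path and project name are hypothetical): point the compiled classes at the network drive in the pom.xml, and have rebel.xml read the same location:

```xml
<!-- pom.xml: write compiled classes to the shared drive (path is illustrative) -->
<build>
  <outputDirectory>//FILESERVER/shared/dev1/myjar/classes</outputDirectory>
</build>

<!-- rebel.xml: JRebel watches the same shared directory -->
<application xmlns="http://www.zeroturnaround.com">
  <classpath>
    <dir name="//FILESERVER/shared/dev1/myjar/classes"/>
  </classpath>
</application>
```

Because the share lives on a machine that stays up, a developer switching off their workstation does not affect the running WebLogic instance.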
Related
I have a Java Spring Boot application that I build with Maven. With the spring-boot-maven-plugin, I can create a fat, executable JAR file. Then I copy it to the remote server and run it. But sometimes I change only one line, or even one word, in my code, and I have to do the whole build/copy step again. I'm sure that I'm doing it wrong, but I couldn't find a more efficient way (like Capistrano in Rails).
At this point, I'm planning a clone-the-source-to-the-server approach: push from local, pull on the remote, then build and run there. What is the correct (or elegant) way of doing this deployment?
For an automatic build and deployment process (continuous integration), you can use Jenkins. Refer to this documentation for more details: https://jenkins.io/doc/
I would say it depends on where you are trying to do it.
The best and most agile way to do it for a controlled environment is surely a CI/CD (Continuous Integration and Continuous Deployment) pipeline, which compiles, builds, tests, and deploys your code on every commit made to the source code. BUT it may be too slow for a development environment, where you would like a shorter feedback cycle and faster feedback to see how the code is progressing.
However, if you are talking about a development environment, I will strike another chord and ask why deploy to an external server AT ALL while developing. When you use Spring Boot, which helps you develop a self-contained application, you get the Tomcat server embedded with it for free. That gives you the choice to run and test the code anywhere you develop.
A simple Maven goal - mvn spring-boot:run - can make the code run anywhere you like.
There is another magical library available in Spring Boot, known as Devtools, which is meant to support agile developers. Once on the app classpath, the library automatically reloads code into the running application (running locally with embedded Tomcat) as soon as there is a saved change. This is one of the coolest gadgets a developer can have.
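Enabling Devtools is just a dependency in the pom.xml; marking it optional keeps it from leaking into projects that depend on yours:

```xml
<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-devtools</artifactId>
  <optional>true</optional>
</dependency>
```

With this on the classpath, mvn spring-boot:run restarts the application automatically whenever compiled classes change.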
Use of the Spring Loaded library (or JRebel for non-Spring-Boot apps) can also help a developer hot-swap bytecode to load changes into a running application as soon as they are saved.
I hope it helps.
I've been reading about some of the (relatively) new application frameworks for Java, such as Akka, Play, and Vert.x. I can't find a clear answer, however, on whether applications created with these frameworks are deployed like traditional EE applications. That is, are they packaged as WAR/EAR files and deployed to an application server like WebSphere? In my mind, a lot of the WAR/EAR infrastructure was built with traditional EE apps in mind.
By default they are not deployed like normal EE applications. These frameworks try to simplify things and make writing code faster and easier, so most of the time they have their own deployment mode and bring their own web server. They also follow the Docker-style approach of fat JARs that can be used as microservices.
So from my point of view it looks like this (I could be wrong; I have not used them):
Akka: it is possible to add it to WEB-INF/lib in a WAR file
Play: the native installer is recommended. They dropped the WAR option, but there seems to be a GitHub plugin
Vert.x: appears to have no support for EAR or WAR files
I'm trying to build a Dynamic Web Application in Eclipse and I'm having trouble when I test my changes.
Sometimes when I make changes, I relaunch the Tomcat server and behold, my project looks exactly the same as last time. I then clean and rebuild the project a few times as well as clean the server, and finally my changes are visible.
This is time consuming and doesn't feel correct.
Is there a proper, quicker, more efficient way to launch a clean copy of your project on the local server?
I have used the Eclipse Tomcat plugin for years with no issues, and more recently the Spring Tool Suite with its plugin for their customized Tomcat. As you say, having your project Build Automatically and having the server configuration, via the plugin, set to Publish Changes Automatically goes some way towards alleviating the pain of Java web development.
For non-Java files (JSP, CSS, JSF, JS, etc.) you should see the changes instantly. For Java files, the plugin will reload the application to reflect the changed classes. While this works okay for small applications, for large applications the reload time can become significant. There can also be PermGen issues after multiple reloads, and you can lose your session and be booted back to some previous screen, having to step back through the various screens to see your changes.
The ultimate solution, then, is a tool called JRebel, which will swap your updated classes into the running VM without reloading the whole application, as well as publishing non-Java files, reloading application config files (Log4j, Spring), managing JPA entities, and a lot more.
http://zeroturnaround.com/software/jrebel/
It's a commercial product and, while not exactly cheap, in my experience worth every penny. I've been using it for years and couldn't live without it. Your productivity will increase x-fold, especially when working on larger applications. No connection, just a satisfied customer.
For a free alternative see:
http://www.hotswapagent.org
In my experience, the Tomcat plugin in Eclipse has been flaky like this from the beginning. I suggest that you not use it - it's just not worth the trouble.
There are at least two alternatives.
First, you could just install tomcat locally, build your war file, and deploy it manually.
A better alternative, IMHO, is to set up a Maven build for your project and then use the Maven Tomcat plugin, which just works. I've never seen it get flaky.
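A minimal configuration for the Tomcat Maven plugin might look like this (the port and context path are just illustrative):

```xml
<plugin>
  <groupId>org.apache.tomcat.maven</groupId>
  <artifactId>tomcat7-maven-plugin</artifactId>
  <version>2.2</version>
  <configuration>
    <port>8080</port>
    <path>/myapp</path>
  </configuration>
</plugin>
```

Then mvn tomcat7:run starts an embedded Tomcat with your webapp deployed; no separate server install is needed.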
Something similar is probably available for Gradle, if you want to use that instead of Maven.
I've discovered that with Build Automatically selected under the Project menu, changes in the code are automatically recompiled and picked up by the currently running Tomcat server instance.
Now it's going great! Much faster!
I'm noticing a lot of projects (DropWizard, Grails, etc.) starting to embrace the notion of a "fat" JAR (using an embedded web server like Jetty or Tomcat) vs. the traditional WAR deploy. Both methods involve a single JVM process (i.e. no matter how many WARs are deployed to Tomcat, it's all the same JVM process).
Under what circumstances is either deployment method preferable over the other?
Here are some reasons:
In favor of JAR:
Simple to build and deploy.
Embedded servers like Jetty are easy to operate.
Applications are easy for users to start and they can run on their personal computers too, because they are lightweight.
Starting and stopping applications will require less knowledge than managing web servers.
In favor of WAR or EAR:
The server would provide features like deployment, restart, security and so on for multiple web applications simultaneously.
Perhaps a separate deployment team can handle the starting and stopping of apps.
If your supervisors like to follow rules, they will be happy to find that you are not breaking them.
Having said this, you can always provide 2 or 3 types of executables to cater to all needs. Any build tool makes this easy.
Distributing an application with an embedded webserver allows for standalone setup and running it by just calling java -jar application.jar.
However, there may be users who want to be in control of which web server is used or who want to deploy multiple applications into a single webserver (e.g. in order to prevent port clashes especially with ports 80 and 8080). In that case a "fat" jar might cause problems or at least some unneeded code and thus a larger memory footprint.
IMHO the best approach for those two cases would be to provide two artifacts: a "fat" jar for (easier) standalone setup and an application-only war/ear for those who want to deploy the application in their own container.
I am thinking about the user perspective. You could wrap this self-contained JAR within a .exe or .dmg and just install it, without the need for additional instructions on how to deploy. Also, since you are deploying for one particular server only, you can take advantage of that particular server's features.
I work in a team of Java developers. We write the code in Eclipse, and then we use maven to build the war. Afterwards we deploy the war in Tomcat.
Is there a free way to autodeploy files on save ?
Thanks.
JRebel gives you exactly that: auto-deploy files on save, using Eclipse AND Tomcat, but you do have to pay for it.
I recommend JRebel, but a quick glance around for free alternatives brings up this SO question, where someone suggested the Dynamic Code Evolution VM as a similar product.
There is a good article on the different ways to hot-deploy Java web apps here, which also details some of the other approaches already mentioned.
If you want to deploy for testing purposes on your developer machine, you should use the Tomcat Maven Plugin or, better, the Maven Jetty Plugin (better because it's lighter and faster).
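A minimal Jetty plugin setup (the version shown is just an example) that rescans for changed classes every few seconds looks like this:

```xml
<plugin>
  <groupId>org.eclipse.jetty</groupId>
  <artifactId>jetty-maven-plugin</artifactId>
  <version>9.4.51.v20230217</version>
  <configuration>
    <!-- redeploy automatically when compiled classes change -->
    <scanIntervalSeconds>5</scanIntervalSeconds>
  </configuration>
</plugin>
```

Run it with mvn jetty:run; saved changes are picked up on the next scan.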
If you want to deploy to a remote server, say at every commit to your SCM, you should use Continuous Integration tools such as Jenkins or Apache Continuum.
If you have your class files, you could put them in WEB-INF/classes. Is that what you were looking for?
You can use the Maven Jetty plugin.
Don't build with Maven; use exploded deployment, so that static UI files and JSPs can be picked up automatically by the container. For reloading changes to class files, you can run the application in a debug session and use hotswap (which only allows changes to method bodies), or overcome your demand for free software and buy yourself a JRebel license, which can be used for free on non-commercial projects (http://social.jrebel.com)