Auto-deployment tool for Java (daemon or web) applications - java

I want a tool that can automatically deploy from a local computer to a remote Ubuntu server.
My projects are in Java. They can be web apps, daemon apps, or anything else (but for now only Java); I use Spring, Hibernate, and Maven builds in my projects.
Is there a tool that can help with SSH login, running SQL scripts, copying files, editing configuration (in a few .conf files: MySQL username, password, some URL addresses, ...), starting the newly installed service, doing version control, and so on?
Deploying/patching packages manually is tedious and time-consuming.
Or do I have to write my own tool?
Edit: we don't want (too many) developers to know the server configuration details (for security reasons); deployment is done by only one (or a few) sysadmins.
I have thought about Puppet and Chef. Do you think these two systems could help in my situation?
All suggestions are welcome.
Thanks in advance.

I suggest using Maven for that. You can specify a plugin for each of the tasks and bind the plugins to a certain build phase.
For SSH upload, use maven-deploy-plugin.
To run the SQL scripts, use sql-maven-plugin.
Copying files can be done with the resources or assembly plugin, depending on your needs.
The easiest way to edit a conf file is to replace it. If that's not an option, use maven-replacer-plugin to replace entries in a config file by regexp.
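As a rough sketch of that last idea (the file path, the @...@ placeholder tokens and the property names are assumptions, and the plugin coordinates/version should be double-checked against the plugin's documentation), the replacer plugin can be bound to prepare-package like this:

    <plugin>
        <groupId>com.google.code.maven-replacer-plugin</groupId>
        <artifactId>replacer</artifactId>
        <version>1.5.3</version>
        <executions>
            <execution>
                <phase>prepare-package</phase>
                <goals><goal>replace</goal></goals>
            </execution>
        </executions>
        <configuration>
            <!-- hypothetical config file copied into the build output -->
            <file>target/classes/app.conf</file>
            <replacements>
                <replacement>
                    <token>@db.username@</token>
                    <value>${db.username}</value>
                </replacement>
                <replacement>
                    <token>@db.password@</token>
                    <value>${db.password}</value>
                </replacement>
            </replacements>
        </configuration>
    </plugin>

The ${db.username} and ${db.password} values could then be supplied per environment, e.g. from a Maven profile or settings.xml, so most developers never see the production values.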

Check out kwatee (I am the creator). It's easy to configure via a web administration interface, and deployment operations can then be fully automated using CLI tools or an Ant or Maven plugin.

Related

Tomcat: dynamically extend heap memory

I have a task to automatically extend the heap memory for our Tomcat application. I know it isn't possible through JDK features alone, and I'm looking for any (dirty or not) hack to get that kind of feature. The main reason for this requirement is minimal configuration of Tomcat before starting a demo version of our application. In other words, no user of our application will have any access to the JVM/Tomcat configuration.
The application requires ~1024M of heap memory; the default value for Tomcat 8 is 256M, which isn't enough for our purposes.
At the moment I have two possible solutions:
A .sh/.bat script that configures Tomcat (see the setenv sketch below). Pros: it does the job. Cons: it's another configuration point for the demo stand (copying a script).
A wrapper for our application, which goes in the same WAR file and configures Tomcat if required. Pros: it does the job, and there is no new configuration point (just install Tomcat and copy the WAR file).
Is there another, more common and simpler way to do that?
EDIT: The main goal is to install our application in the following steps:
Install Tomcat
Copy the WAR file
Start Tomcat
No additional configuration; just copy the WAR and start Tomcat.
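For reference, a minimal sketch of the script option: Tomcat's startup scripts source an optional bin/setenv.sh (setenv.bat on Windows) if it exists, so the script only needs to drop a file like this next to catalina.sh (the heap sizes below are simply the values from the question):

    # bin/setenv.sh - picked up automatically by catalina.sh at startup
    export CATALINA_OPTS="$CATALINA_OPTS -Xms256m -Xmx1024m"

This still counts as an extra configuration step, which is exactly the downside the question mentions.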
That is commonly solved by wrapping the installation and configuration of Tomcat in an installation script. Pros: the end user just downloads the installer and runs it. Cons: the installation script must be tailored for the final environment (one for Windows, one for Linux) and can be hard to write. A sometimes simpler way is to provide a zip containing a readme.txt file and an install.bat (or install.sh).
An alternative, if the configuration is really complex, is to configure a VM directly (VMDK is a de facto standard) and let the users install the virtualizer they prefer.

SpringSource integration with eclipse

I am attempting to integrate the SonarQube plug-in with Eclipse. However, I want to do this on a machine that does not have network access. I have copied the .jar files into the Spring plugins folder, but this does not properly install it.
I was wondering how I can go about this installation process?
Thanks
Copying the plugin JARs is not a good idea. You will miss some items (especially features, config, ...).
You could download the update site content from http://downloads.sonarsource.com/eclipse/eclipse/ and then copy it to your offline computer and configure a local update site (file://xxxx).

Eclipse: Is it possible to publish Javascript edits to an external Tomcat instance

I am converting an application from Flex to Javascript. My workflow within Eclipse for Flex was to use Maven to start my Java web app in Tomcat and then have Eclipse configured to compile edited Actionscript files to a SWF and save it to my exploded WAR directory (that Maven/Cargo uses).
It worked very well for a long time allowing me to edit actionscript source code, flip over the browser, refresh the screen and see the changes.
I am new to Javascript however, and am struggling to get the same workflow up and running. The part I don't understand is how to tell Eclipse that I would like my edited Javascript files to be written out to a particular directory (that contains the exploded WAR). In my WAR project (a WTP dynamic web project) there is something that looks like a Javascript build path called "Javascript resources", but there is no output directory.
I would really like to continue to run Tomcat and Jetty via Maven if at all possible. I realize I can do what I want via WTP (M2E-WTP), but would prefer to use Maven/Cargo.
Denis's suggestion to create a custom builder is probably the best solution if you want to keep a pure Maven/Cargo approach with Eclipse.
If you are deploying to an exploded war directory, then another similar idea would be to use a File Synchronization plugin. These will automatically copy modified files to configured folders. See:
http://andrei.gmxhome.de/filesync/
https://wiki.onehippo.com/display/CMS7/Use+Filesync+Eclipse+plugin+for+faster+turn+around
-------------
FWIW, I don't think the Maven Tomcat/Cargo plugins are ideal for real-time web development, especially on the frontend side of things. They are useful mainly for controlled deployments or bootstrapping a server without initial setup. My thoughts:
Eclipse WTP used to be great for real-time web development, but I stopped using it a few years back as it just got too hard to make it work correctly in a Maven environment. My preferred approach these days looks like this:
Do not install or use Eclipse WTP.
Use m2eclipse to integrate Maven with Eclipse.
Use Maven to do clean builds and generate the exploded WAR directory in the target folder.
Set up an independent Tomcat server to load the webapp from the exploded target folder.
I suspect the Tomcat setup/startup could be integrated into Maven, but it's not worth the extra complexity to me.
Then, I configure JRebel (automatically via Maven) to handle java and web resource file changes. With this setup, I almost never have to redeploy or restart Tomcat. All changes (java, html, js, etc.) are seen immediately.
I think the same setup could be used without JRebel (for non-java files only) by configuring the web source folders as source folders in Eclipse with custom target output path being the corresponding directory in the exploded war directories. If that didn't work, then it would definitely work by using the custom builder or file synchronization solutions mentioned above.
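For the "independent Tomcat loading the exploded target folder" step, one common approach (a sketch; the file name, application name and path are assumptions) is a context descriptor dropped into conf/Catalina/localhost/ of the external Tomcat:

    <!-- conf/Catalina/localhost/myapp.xml -->
    <!-- Points Tomcat at the exploded WAR Maven builds under target/,  -->
    <!-- so rebuilt resources are served without copying a .war around. -->
    <Context docBase="/home/me/projects/myapp/target/myapp-1.0-SNAPSHOT"
             reloadable="true" />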
Eclipse uses the concept of a "builder" to build a project. It comes with built-in builders such as the Java compiler or the WAR builder of WTP.
But Eclipse also lets you set up your own builder using Ant files: right-click the project, open the project properties, go to the Builders section, and click the New button.
You can pass arguments to your Ant file and use variables defined by Eclipse when invoking it.
Do not forget to fill in the Refresh tab if you want Eclipse to be notified of the produced files.
Do not forget to fill in the Build Options tab, section "Specify working set of relevant resources", so that your builder is called each time a source file used by the build file is changed inside Eclipse.
Also go to the Targets tab to specify during which type of build your Ant file is called and which target is invoked.
I know this solution may not be the best for you, since your build process will be described more than once, but it may help you achieve your goal.
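A minimal Ant file for such a builder might look like this (a sketch; the source and destination paths are assumptions and would need to match your project layout):

    <project name="sync-js" default="copy-js" basedir=".">
        <target name="copy-js">
            <!-- Copy edited JavaScript files into the exploded WAR so a   -->
            <!-- simple browser refresh picks them up without a redeploy.  -->
            <copy todir="target/myapp-1.0-SNAPSHOT/js" preservelastmodified="true">
                <fileset dir="src/main/webapp/js" includes="**/*.js"/>
            </copy>
        </target>
    </project>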

Custom installer package for linux/solaris

I am in the unfortunate situation where I need to deploy and upgrade packages and config files on machines with no root access and no ability to use or install a package manager. Are there any neat solutions that allow the creation of custom install packages?
I am open to custom compiles of some software in a custom location on the servers if it helps the situation.
I'm almost at the point where I might end up having to write my own Java package management system :(
In case it's relevant, some further information: the installer needs to install and configure the following:
Apache Tomcat
WAR files into Apache Tomcat
ActiveMQ
Some JAR files with some corresponding Cron entries
This sounds a bit perverse. Why do you need to "deploy" Tomcat / ActiveMQ to (lots of) machines that you don't have root or sudo access to?
Anyway, I don't see the need for a custom installer to do this (* see note below).
Running yum --installroot /home/whatever <package> should install <package> in a non-standard location. If you cannot use yum or the like, you should be able to download a binary ZIP or TAR file and unpack it. And once you have installed / unpacked whatever, you can go in and edit the configuration files using the relevant app tools ... or a text editor. Tomcat can be installed in any directory you like and run under your own login account if need be. I imagine ActiveMQ is the same.
Deployment of a WAR file is simply a matter of copying it to Tomcat's webapps directory.
Creation of a cron entry is simply a matter of running the crontab(1) command.
And if you have to go through this process lots of times, you could write some shell scripts to do the repetitive work for you.
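A rough sketch of such a script (the install directory, Tomcat version, file names and cron schedule are all placeholders):

    #!/bin/sh
    # Unpack a Tomcat binary tarball into the user's home directory,
    # drop in the WAR, and register a cron entry - no root required.
    INSTALL_DIR="$HOME/apps"
    mkdir -p "$INSTALL_DIR"
    tar xzf apache-tomcat-9.0.85.tar.gz -C "$INSTALL_DIR"
    cp myapp.war "$INSTALL_DIR/apache-tomcat-9.0.85/webapps/"
    # Append a cron entry that runs one of the JARs nightly at 02:00.
    ( crontab -l 2>/dev/null; echo "0 2 * * * java -jar $INSTALL_DIR/myjob.jar" ) | crontab -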
(* Note - there are a couple of possible roadblocks.
You will need root/sudo access to deploy a startup file for Tomcat, etc. to "/etc/init.d" so it starts automatically when the system boots. There is no easy way around this: the "/etc/init.d" directory is only writable by root.
If you want to manually launch Tomcat on ports 80 / 443, you will need root/sudo access. Again, there is no easy way around this: only a "root" process can listen on port numbers less than 1024.)
Take a look at InstallJammer. You can develop graphical or console-based installers for both platforms from a single project. They won't require root unless you need them to.
InstallBuilder is the tool we use to package Bitnami stacks, including the Java ones like Alfresco, which bundle the JRE, Tomcat, etc. and do not require admin privileges.

Deploy java (command line) app using Netbeans / ant

I've finally managed to create a NetBeans project out of an old standalone (not web) Java application which consisted only of individual .java sources. Now I have basically two questions regarding NetBeans/Subversion interaction and application deployment:
Do you check in all the Netbeans project files into the repository, normally?
If I build the project using NetBeans (or Ant) I get a .jar file and some additional jar libraries. In order for the app to run properly on the server, some additional config files and directories (log/, for example) are needed. The application itself is a J2SE application (no frameworks) which runs from the command line on a Linux platform. How would you deploy and install such an application? It would also be nice if I could see which version of the app is currently installed (maybe by appending the version number to the installed app path).
Thanks for any tips.
No, not usually. Anything specific to NetBeans (or Eclipse, IntelliJ, etc.) I don't check in; try to make it build from the command line with your Ant script and produce exactly what you want. The build.xml is something that can be used by other IDEs, or with Anthill or CruiseControl for automated builds/continuous integration, so that should be checked in. Check in what is needed to produce/create your artifacts.
You don't specify what type of server, or what exact type of application. Some apps are deployed via JNLP/WebStart to be downloaded by multiple users, and have different rules than something deployed standalone for one user on a server to run with no GUI as a monitoring application. I cannot help you more with that unless you can give some more details about your application, the server environment, etc.
Regarding the config files, how do you access those? Are they static and never going to change (something you can load using a ResourceBundle)? You can add them to the jar file and look them up via ResourceBundle, but it all depends on what you are doing there. If they have to live outside the jar file so they can be modified without recompiling, have them copied by an installer script.
As for directories, must they already exist? Or does the application check for their existence, and create them if necessary? If the app can create them if absent, you have no need to create them. If they need to be there, you could make it part of the install script to create those folders before the jar files are installed.
Version number could be as simple as adding an about box somewhere in the app, and looking up the version string in a config/properties file. It has to be maintained, but at least you would be able to access something that would let you know you have deployed build 9876.5.4.321 (or whatever version numbering scheme you use).
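A minimal sketch of that idea (the file name version.properties and the key build.version are assumptions; the file would be bundled on the classpath by the build):

    import java.io.IOException;
    import java.io.InputStream;
    import java.util.Properties;

    public final class Version {
        // Reads the version string from a properties file on the classpath,
        // e.g. version.properties containing: build.version=9876.5.4.321
        public static String get() {
            Properties props = new Properties();
            try (InputStream in = Version.class.getResourceAsStream("/version.properties")) {
                if (in != null) {
                    props.load(in);
                }
            } catch (IOException e) {
                // ignore and fall through to the default below
            }
            return props.getProperty("build.version", "unknown");
        }

        public static void main(String[] args) {
            System.out.println("Deployed version: " + get());
        }
    }

The same string can be printed at startup or shown in an about box.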
Ideally, you should not tie down your application sources and config to a particular IDE.
To answer your questions:
I suggest you do not. Keep your repository structure independent of the IDE.
You might have to change your application so that its structure is generic and can be edited in any IDE.
Is this a web app? A standalone Java app? If you clarify these, it would be easier to answer your query.
We don't check in the /build or the /dist directories.
We tend to use this structure for our NetBeans projects in SVN:
/project1/
    /trunk
    /tags/
        /1.0
        /1.1
    /binaries/
        /1.0
        /1.1
When a change is needed, we check out the NetBeans project from trunk/, make changes to it, and check it back in. Once a release of the project is needed, we do an SVN copy of the NetBeans project files to the next tag version. We also take a copy of the deployable (JAR or WAR) and place it in the version directory under binaries, along with any dependencies and config files.
By doing this we have a clean, versioned deployable that is separate from the source. Our deployables are versioned in the name: project1-1.0.jar, project1-1.1.jar, and so on.
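For reference, the tag copy is a single SVN command (the repository URL and version number here are placeholders):

    svn copy https://svn.example.com/repos/project1/trunk \
             https://svn.example.com/repos/project1/tags/1.1 \
             -m "Tag project1 release 1.1"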
I disagree with talonx about keeping your source non-IDE-specific: by not storing the IDE files in SVN along with your source, you are adding extra complication to the checkout, change, check-in, deploy cycle. If you store the IDE project files in SVN, you can simply check out the project, fire up the IDE, and hit build. You don't have to go through the steps of setting up a new project in the IDE, including the files you put in SVN, setting up dependencies, etc. It saves time and means all developers are working with the same setup, which reduces errors and discrepancies. The last thing you want is for a developer to check out a project to make a small bug fix and have to spend time finding dependencies and setting things up.
To answer question #2 -- who's your consumer for this app?
If it's an internal app and only you (or other developers) are going to be deploying it, then what you have is perfectly all right. Throw in a README file explaining the required directories.
If you're sending it out to a client to install, that's a different question, and you should use an installer. There are a few installers out there that wrap an Ant script and your resources, which is a nice approach, particularly if you don't need a GUI... just write a simple Ant script to put everything in the right place.
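As a rough sketch of what such an Ant script might do (all paths, names and the version property are assumptions):

    <project name="install" default="install" basedir=".">
        <property name="install.dir" value="/opt/myapp"/>
        <property name="app.version" value="1.0.0"/>
        <target name="install">
            <!-- Create the runtime directories the app expects. -->
            <mkdir dir="${install.dir}/myapp-${app.version}/log"/>
            <!-- Copy the built jar, its libraries and config into a versioned path. -->
            <copy todir="${install.dir}/myapp-${app.version}">
                <fileset dir="dist" includes="*.jar,lib/**,*.properties"/>
            </copy>
        </target>
    </project>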
Version number is up to you -- naming the JARs isn't a bad idea. I also have a habit of printing out the version number on startup, which can come in handy.
