I'm currently migrating our build process from Ant to Maven. Our application is deployed to many different customers, each with a unique set of dependencies and configuration. I can implement different profiles to model these and build the required wars from them. However, this is a process that happens at compile time.
Each release is tagged in SVN as well as uploaded to our internal Nexus repository. I want to be able to take a defined release and reconstruct it based on a profile. Is there a way to do something like this? Is there something other than profiles I should be using?
"declare several execution for the war plugin to produce several artifacts (and install/deploy them)" This sounds like this might be the way forward. How would I go about doing this?
This goes a bit against a Maven golden rule (the one main artifact per module rule), but it can be done. The One artifact with multiple configurations in Maven blog post describes one way to implement this approach:
I decided to put all the environment specific configuration in a special source tree, with the following structure:

+-src/
  +-env/
    +-dev/
    +-test/
    +-prod/
Then I configured the maven-war-plugin to have three different executions (the default plus two extra), one for each environment, producing three different war files: beer-1.0-dev.war, beer-1.0-test.war and beer-1.0-prod.war. Each of these configurations used the standard output files from the project and then copied the content from the corresponding src/env/ directory onto the output files, enabling an override file to be placed in the corresponding src/env/ directory. It also supported copying a full tree structure into the output directory. Thus if you for instance wanted to replace the web.xml in test you simply created the following directory:

src/env/test/WEB-INF/

and placed your test specific web.xml in this directory, and if you wanted to override a db.property file placed in the classpath root directory for the test environment you created the following directory:

src/env/test/WEB-INF/classes

and placed your test specific db.property file in this directory. I kept the src/main directory configured for the development environment. The reason for this was to be able to use the maven-jetty-plugin without any extra configuration.

Configuration

Below you find the maven-war-plugin configuration that I used to accomplish this:
<plugin>
  <artifactId>maven-war-plugin</artifactId>
  <configuration>
    <classifier>prod</classifier>
    <webappDirectory>${project.build.directory}/${project.build.finalName}-prod</webappDirectory>
    <webResources>
      <resource>
        <directory>src/env/prod</directory>
      </resource>
    </webResources>
  </configuration>
  <executions>
    <execution>
      <id>package-test</id>
      <phase>package</phase>
      <configuration>
        <classifier>test</classifier>
        <webappDirectory>${project.build.directory}/${project.build.finalName}-test</webappDirectory>
        <webResources>
          <resource>
            <directory>src/env/test</directory>
          </resource>
        </webResources>
      </configuration>
      <goals>
        <goal>war</goal>
      </goals>
    </execution>
    <execution>
      <id>package-dev</id>
      <phase>package</phase>
      <configuration>
        <classifier>dev</classifier>
        <webappDirectory>${project.build.directory}/${project.build.finalName}-dev</webappDirectory>
        <webResources>
          <resource>
            <directory>src/env/dev</directory>
          </resource>
        </webResources>
      </configuration>
      <goals>
        <goal>war</goal>
      </goals>
    </execution>
  </executions>
</plugin>
(...) I can define each customer project with profiles but I don't know if there's a way to release them to a repository.
You have several options:
use profiles and run the build several times (create artifacts with a classifier and install/deploy them); see the sketch after this list
declare several executions for the war plugin to produce several artifacts (and install/deploy them)
use different modules (and maybe war overlays to merge a common part with a specific one)
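For the first option, a minimal sketch (the customer id and property name below are illustrative, not from the question) could drive both the war classifier and the environment resource directory from an active profile:

<profiles>
  <profile>
    <id>customer-a</id> <!-- hypothetical customer id -->
    <properties>
      <customer>customer-a</customer>
    </properties>
    <!-- customer-specific dependencies could also be declared here -->
  </profile>
</profiles>

<build>
  <plugins>
    <plugin>
      <artifactId>maven-war-plugin</artifactId>
      <configuration>
        <!-- the classifier makes install/deploy publish e.g. app-1.0-customer-a.war -->
        <classifier>${customer}</classifier>
        <webResources>
          <resource>
            <directory>src/env/${customer}</directory>
          </resource>
        </webResources>
      </configuration>
    </plugin>
  </plugins>
</build>

Running mvn clean deploy -Pcustomer-a once per profile would then push one classified war per customer to Nexus.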
Or at least a way in Maven to automatically build an artifact with a specified profile from say an SVN tag.
Well, this is doable. But without more details about a particular problem, it's hard to be more precise.
I would take a look at your architecture and see if there is a way to split the project into multiple projects. One would be the main code base; the other projects would depend on the JAR file produced by the main project and add their own configuration, dependencies, etc. to produce your final artifact.
This would let you version customer-specific code independently while keeping common code in one place, separate from the customer-specific parts.
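As a rough sketch of that split (module names and coordinates below are hypothetical), a per-customer war module would simply pull in the shared core jar and layer its own configuration and dependencies on top:

<!-- customer-a/pom.xml (hypothetical customer module) -->
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>webapp-customer-a</artifactId>
  <version>1.0.0</version>
  <packaging>war</packaging>

  <dependencies>
    <!-- the shared code base produced by the main project -->
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>webapp-core</artifactId>
      <version>1.0.0</version>
    </dependency>
    <!-- customer-specific dependencies go here -->
  </dependencies>
</project>

Each customer module can then be tagged, versioned and released to Nexus independently of the others.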
Have you taken a look at the Maven Assembly plugin?
This plugin allows you to customize how your distribution is assembled, i.e. what format (.tar.gz, .zip, etc.), directory structure, and so on. I think you should be able to bind several executions of the plugin to the package phase to assemble multiple variations of your output (i.e. the packaging for customer 1, customer 2, etc., separately).
The deploy plugin should then automatically handle deploying each of your assembled packages in the target directory to the repository.
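A minimal sketch of that idea, assuming one hypothetical assembly descriptor per customer (the paths and ids below are illustrative):

<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <executions>
    <execution>
      <id>customer-a-dist</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
      <configuration>
        <descriptors>
          <descriptor>src/assembly/customer-a.xml</descriptor>
        </descriptors>
      </configuration>
    </execution>
    <execution>
      <id>customer-b-dist</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
      <configuration>
        <descriptors>
          <descriptor>src/assembly/customer-b.xml</descriptor>
        </descriptors>
      </configuration>
    </execution>
  </executions>
</plugin>

The <id> inside each descriptor becomes the classifier of the attached archive, so mvn deploy publishes every customer package alongside the main artifact.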
I ended up doing something slightly different. We're not storing the releases in our internal repository. Instead we're building using Hudson and a multi-configuration project (one configuration/profile for each customer). This way when a release is made the Hudson job is run to build different wars for all customers. They are then stored on the Hudson server instead of Nexus. Builds for specific versions and customers can also be built at any time from the releases in Nexus. – samblake Mar 16 '11 at 12:32
I am using Maven to build my project. It was working fine up until I put in a parent POM. Now, the project still builds but the output is two jar files instead of one. One of them ends in -boot.jar and seems to be the correct jar file, because it has all the dependencies baked in and is over 60 MB. The other file has the correct name (projectId-version.jar) but is less than 1 MB, and this is the one that gets picked up by the pipeline process, so deployment fails.
I need mvn to build just one jar with all the dependencies baked in.
I am using Spring Boot 1.5.19 (the parent POM has this dependency). Any ideas?
Please see the Spring Boot Maven plugin's usage documentation:
https://docs.spring.io/spring-boot/docs/2.1.4.RELEASE/maven-plugin/repackage-mojo.html
https://docs.spring.io/spring-boot/docs/current/reference/html/build-tool-plugins-maven-plugin.html
Specifically, see the repackage goal. The parent POM you are using must be configuring a classifier, so the repackaged (executable) archive is attached as the -boot.jar while the original thin jar keeps the main artifact name:
Classifier to add to the repackaged archive. If not given, the main artifact will be replaced by the repackaged archive. If given, the classifier will also be used to determine the source archive to repackage: if an artifact with that classifier already exists, it will be used as source and replaced. If no such artifact exists, the main artifact will be used as source and the repackaged archive will be attached as a supplemental artifact with that classifier. Attaching the artifact allows to deploy it alongside to the original one.
You can define the plugin configuration in your own POM and override whatever is defined in the parent, for example with an empty classifier so that the repackaged jar replaces the main artifact:
<plugin>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-maven-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>repackage</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <classifier></classifier>
  </configuration>
</plugin>
I already found an answer here on Stack Overflow about how to include a 3rd party JAR in a project without installing it to a "local repository":
Can I add jars to maven 2 build classpath without installing them?
But, when I use the Maven Shade Plugin to create a JAR that includes all the dependencies of the project as well, the 3rd party JAR is not included automatically.
How can I make the Maven Shade Plugin add such a 3rd party JAR in to the shaded JAR?
As per the answer I got, I made it work. What I did was add this snippet to the beginning of my pom.xml:
<repositories>
  <repository>
    <id>repo</id>
    <url>file://${basedir}/repo</url>
  </repository>
</repositories>
Then I added a dependency for my project, also in pom.xml:
<dependencies>
  <dependency>
    <groupId>dummy</groupId>
    <artifactId>dummy</artifactId>
    <version>0.0.0</version>
    <scope>compile</scope>
  </dependency>
</dependencies>
And then I ran a command to add the package to 'repo':
mvn org.apache.maven.plugins:maven-install-plugin:2.3.1:install-file \
    -Dfile=<my-jar>.jar -DgroupId=dummy -DartifactId=dummy \
    -Dversion=0.0.0 -Dpackaging=jar -DlocalRepositoryPath=`pwd`/repo/
(Not sure if the repo path needs to be a full path, but didn't want to take chances.)
The contents of the repo subdirectory are now:
repo/dummy/dummy/0.0.0/dummy-0.0.0.jar
repo/dummy/dummy/0.0.0/dummy-0.0.0.pom
repo/dummy/dummy/maven-metadata-local.xml
Now I can check this in to version control, and have no local or remote dependencies.
But, when I use the Maven Shade Plugin to create a JAR that includes all the dependencies of the project as well, the 3rd party JAR is not included automatically.
Yes, because system-scoped dependencies are assumed to be always present (this is exactly what the system scope is about), so they won't be included. People actually don't understand what system scope dependencies are; they just keep abusing them (yes, this is abuse), and then get side effects and wonder why (as Brian pointed out in his answer).
I have already written about this many, many times here on SO, and in 99% of cases system-scoped dependencies should be avoided. I'll repeat what the Dependency Scopes mini guide says one more time:
system: This dependency is required in some phase of your project's lifecycle, but is system-specific. Use of this scope is discouraged: This is considered an "advanced" kind of feature and should only be used when you truly understand all the ramifications of its use, which can be extremely hard if not actually impossible to quantify. This scope by definition renders your build non-portable. It may be necessary in certain edge cases. The system scope includes the <systemPath> element which points to the physical location of this dependency on the local machine. It is thus used to refer to some artifact expected to be present on the given local machine and not in a repository; and whose path may vary machine-to-machine. The systemPath element can refer to environment variables in its path: ${JAVA_HOME} for instance.
So, instead of using the system scope, either:
Add your libraries to your local repository via install:install-file. This is a quick and dirty way to get things working, and it might be an option if you're working alone, but it makes your build non-portable.
Install and run an "enterprise repository" like Nexus, Archiva, or Artifactory and add your libraries via deploy:deploy-file. This is the ideal scenario.
Set up a file-based repository as described in this previous answer and put your libraries in there. This is the best compromise if you don't have a corporate repository but need to work as a team and don't want to sacrifice portability.
Please, stop using the system scope.
The Maven addjars plugin solves this problem - see
http://code.google.com/p/addjars-maven-plugin/wiki/UsagePage
I used <resources> to include my lib directory with all its jars, i.e.:
<build>
  <resources>
    <resource>
      <directory>${project.basedir}</directory>
      <includes>
        <include>lib/*.jar</include>
      </includes>
    </resource>
  </resources>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.3</version>
      <configuration>
        <createDependencyReducedPom>false</createDependencyReducedPom>
      </configuration>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
If you only need a quick and dirty solution, you can add the content of the extracted jar file to your src/main/resources directory.
This is probably a really fundamental question, but I'm afraid I don't know much about Java and I couldn't find the answer anywhere.
I'm attempting to build an Ant library which depends on the TFS SDK. I followed the guide to setting up a project, but when I export it as a JAR and try to run a task using Ant I get the following error:
java.lang.NoClassDefFoundError: /com/microsoft/tfs/core/util/TFSUser
I realise I could put the TFS SDK JAR in my Ant lib folder, but if possible I'd like my JAR to include it so the library just works without having to do so.
This answer seems to say it's possible to include all the resources needed to run using Eclipse (I'm using 3.7.2) but it doesn't detail how to actually do it. What is the option in Eclipse to do so?
Select "Extract required libraries into generated JAR" as you do the export.
Select "Extract required libraries into generated JAR" as you do the export.
Use File -> Export -> Java -> Runnable JAR file instead from Eclipse.
The "Extract required libraries into generated JAR" should be what you need.
When you build a jar you get a JAR containing just your code, not any dependencies your jar requires. You could use something like jarjar to combine all the dependencies into one easy-to-manage jar file, or copy all the dependent JARs into a folder for ease of use. It looks like Eclipse has options to do this kind of thing as well (see the posts above).
The other option would be to use a dependency management system such as Maven or Ivy. This has a higher learning curve, but for a library it is worthwhile, as it will allow users of your library to easily grab all the dependencies. For an end-user application a single distributable is likely the better option (for which you could use Maven or Ivy to internally manage the dependencies and then something like jarjar or Java Web Start to distribute to your end users).
Just in case you're doing this with Maven, you need to include the following plugin:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>2.4</version>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
    <properties>
      <property>
        <name>listener</name>
        <value>com.example.TestProgressListener</value>
      </property>
    </properties>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id> <!-- this is used for inheritance merges -->
      <phase>package</phase> <!-- bind to the packaging phase -->
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
Read more at How do I build a JAR file with dependencies?
You would need to rely on the Class-Path attribute in the manifest file. This is explained well at How to package libraries into my jar using Ant.
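A rough sketch of that idea in an Ant build.xml, assuming (hypothetically) that the SDK jar is shipped in a lib folder next to your task jar; the file names below are illustrative:

<!-- build.xml fragment: write a Class-Path entry into the task jar's manifest -->
<jar destfile="dist/my-ant-tasks.jar" basedir="build/classes">
  <manifest>
    <!-- resolved relative to the location of my-ant-tasks.jar at runtime -->
    <attribute name="Class-Path" value="lib/com.microsoft.tfs.sdk.jar"/>
  </manifest>
</jar>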
I have a POM-based Java project. It contains a number of servlets for deployment in a WAR. However, in addition to this, I also have classes that launch the application as a standalone using embedded servlet and database environments (for a turnkey development environment). Additionally, there is also a command-line client for the application.
I would like to have the ability to build the project into both the WAR and two separate executable JARs (one server, one client). I'm not concerned about the JARs/WAR containing some unnecessary code or dependencies; I just want all three to work.
What's the "correct" way to do this with Maven?
Multiple projects are the way to do this. Put the common code in the first project, along with the standalone support. Then make a second project with war packaging that depends on the first.
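A rough sketch of that layout, with hypothetical module names: a parent aggregator lists one module per deliverable, and the war and executable-jar modules each depend on the core module:

<!-- parent pom.xml (hypothetical aggregator) -->
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>myapp-parent</artifactId>
  <version>1.0.0</version>
  <packaging>pom</packaging>

  <modules>
    <module>myapp-core</module>       <!-- common code, jar packaging -->
    <module>myapp-webapp</module>     <!-- servlets, war packaging -->
    <module>myapp-standalone</module> <!-- embedded servlet/database launcher -->
    <module>myapp-cli</module>        <!-- command-line client -->
  </modules>
</project>

The two executable-jar modules can then use the shade or assembly plugin (jar-with-dependencies) to make themselves runnable, as shown elsewhere on this page.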
You could use the assembly plugin to do this. The assembly plugin can package a zip or tar.gz archive for you, which is a perfect distribution format for standalone applications. When you configure the assembly plugin you can bind it to the package phase, so the application gets packaged in two formats: war and zip.
<plugins>
  <plugin>
    <artifactId>maven-assembly-plugin</artifactId>
    <version>2.2.1</version>
    <configuration>
      <!-- the plugin needs an assembly descriptor telling it what to put in the archive;
           the path below is an example -->
      <descriptors>
        <descriptor>src/assembly/standalone.xml</descriptor>
      </descriptors>
    </configuration>
    <executions>
      <execution>
        <id>make-assembly</id>
        <phase>package</phase>
        <goals>
          <goal>single</goal>
        </goals>
      </execution>
    </executions>
  </plugin>
</plugins>
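The plugin then needs an assembly descriptor describing what goes into the zip. A minimal hypothetical src/assembly/standalone.xml (directory names are illustrative) might look like this:

<assembly>
  <id>standalone</id>
  <formats>
    <format>zip</format>
  </formats>
  <dependencySets>
    <dependencySet>
      <!-- copy the project artifact and its runtime dependencies into lib/ -->
      <outputDirectory>lib</outputDirectory>
      <useProjectArtifact>true</useProjectArtifact>
    </dependencySet>
  </dependencySets>
  <fileSets>
    <fileSet>
      <!-- hypothetical start scripts shipped alongside the jars -->
      <directory>src/main/scripts</directory>
      <outputDirectory>bin</outputDirectory>
    </fileSet>
  </fileSets>
</assembly>

The <id> ends up as the classifier of the attached archive, so mvn deploy publishes both the war and, for example, myapp-1.0-standalone.zip.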
Up until now we used Ant in my company. Whenever we wanted to send the application to the client, we ran a special Ant script that packaged all our source code with all the jar libraries and Ant itself, along with a simple batch file.
Then the client could put the files on a computer with no network access at all (and not even Ant) and run the batch file. As long as the computer had a valid JDK the batch script would compile all the code using the jars and create a WAR/EAR that would finally be deployed by the client on the application server.
Lately we migrated to Maven 2, but I haven't found a way to do the same thing. I have seen the Maven assembly plugin, but this just creates source distributions or binary ones. Our scenario is actually a mix, since it contains our source code but binary jars of the libraries we use (e.g. Spring, Hibernate).
So is it possible to create with Maven a self-contained assembly/release/package that one can run on a computer with no network access at all? That means that all libraries should be contained inside.
Extra bonus if Maven itself is contained inside as well, but this is not a strict requirement. The final package should be easily compiled by just one command (easy for a system administrator to perform).
I was thinking of writing my own Maven plugin for this but I suspect that somebody has already encountered this.
From your dev environment, if you include the following under build plugins
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
</plugin>
and invoke mvn assembly:assembly, you would get yourApp-version-jar-with-dependencies.jar in the target folder. This is a self-sufficient jar, and with a Main-Class entry in MANIFEST.MF, anybody can double-click and run the application.
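To have that Main-Class entry written for you, the assembly plugin's archive configuration can be added; a minimal sketch (the launcher class name is hypothetical):

<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
    <archive>
      <manifest>
        <!-- hypothetical entry point; replace with your own launcher class -->
        <mainClass>com.example.Launcher</mainClass>
      </manifest>
    </archive>
  </configuration>
</plugin>

java -jar yourApp-version-jar-with-dependencies.jar then starts the application without any further classpath setup.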
You might try this approach:
Use mvn ant:ant to create Ant build scripts from a Maven project
Make sure Ant is a project dependency
Use the assembly to build an Ant system

or plan b:

Use mvn ant:ant to create Ant build scripts from a Maven project
Make sure Ant is a project dependency
Write a "bootstrap class" to call Ant and run the build
Use appassembler to build a scripted build and install environment
In plan b, you'd write scripts to set up a source tree someplace from the packaged source jars, and then use the appassembler-built bat or sh scripts to call the bootstrap and build via Ant. Your bootstrap can do anything you need to do before or after the build.
Hope this helps.
Perhaps an answer that I submitted for a similar question could be of some assistance. See Can maven collect all the dependant jars for a project to help with application deployment? The one piece missing is how to include the source code in the assembly. I have to imagine that there is some way to manage that with the assembly plugin. This also doesn't address the inclusion of Maven in the distribution.
What was the reason for moving from Ant to Maven? It sounds like you had everything worked out well with the Ant solution, so what is Maven buying you here?
If it is just dependency management, there are techniques for leveraging Maven from Ant that give you the best of both worlds.
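One such technique is the Maven Ant Tasks antlib, which lets an Ant build resolve dependencies from a Maven repository. A rough sketch of a build.xml fragment, assuming the maven-ant-tasks jar is available to Ant and using illustrative coordinates:

<project name="offline-build" xmlns:artifact="antlib:org.apache.maven.artifact.ant">
  <!-- resolve the declared dependencies into an Ant path -->
  <artifact:dependencies pathId="compile.classpath">
    <dependency groupId="org.springframework" artifactId="spring-core" version="3.0.5.RELEASE"/>
  </artifact:dependencies>

  <target name="compile">
    <javac srcdir="src/main/java" destdir="build/classes" classpathref="compile.classpath"/>
  </target>
</project>

For the no-network scenario, a nested remoteRepository element with a file:// URL can point the resolution at a repository bundled with the distribution instead of a remote one.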
The source plugin will give you a jar containing the source of a project ("source:jar"). You could then use the assembly plugin to combine the source jars from your internal projects (using the sources to reference these source jars) and the binary jars from the external projects into one distribution.
However, as for turning this into a compilable unit, I have no suggestions. You could certainly bundle Maven, but you'd need to create a bundle containing all the plugins you need to build your project! I don't know of any existing tool to do that.
This is how I do it: in the build part of the POM, add this:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-source-plugin</artifactId>
      <executions>
        <execution>
          <id>attach-sources</id>
          <phase>verify</phase>
          <goals>
            <goal>jar</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
And then on the profiles section add this bit in:
<profiles>
  <profile>
    <id>release</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-source-plugin</artifactId>
          <executions>
            <execution>
              <goals>
                <goal>jar</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
And when I do a Maven install it builds the jar and also installs a jar of the source alongside it.