When doing a mvn install I want to end up with 2 WAR files in my target directory. One will contain the production web.xml and the other will contain the test/uat web.xml.
I've tried this:
<build>
<finalName>cas-server</finalName>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.1-beta-1</version>
<configuration>
<webXml>src/main/config/prod/web.xml</webXml>
<warName>cas-prod</warName>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.1-beta-1</version>
<configuration>
<webXml>src/main/config/test/web.xml</webXml>
<warName>cas-test</warName>
</configuration>
</plugin>
</plugins>
</build>
But I only end up with the test WAR.
I don't think you can do this in one step with that setup. Maven doesn't complain about the duplicate plugin declaration; it effectively merges the two <plugin> blocks and the last <configuration> wins, which is why you only end up with the test WAR. I'd suggest using profiles, and maybe filtering, to manage this use case.
If your web.xml files are really different, you could simply put two maven-war-plugin configurations in two profiles. Or, better, you could merge them into something like this:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.1-beta-1</version>
<configuration>
<webXml>src/main/config/${env}/web.xml</webXml>
<warName>cas-${env}</warName>
</configuration>
</plugin>
And set the env property in two profiles to pick up the right web.xml at build time.
<profiles>
<profile>
<id>uat</id>
<properties>
<env>test</env>
</properties>
</profile>
<profile>
<id>prod</id>
<properties>
<env>prod</env>
</properties>
</profile>
</profiles>
If your web.xml files are similar (i.e. only some values differ in them), you could define those properties and their values in two profiles and use filtering to apply them. Something like this:
<profiles>
<profile>
<id>env-uat</id>
<activation>
<property>
<name>env</name>
<value>uat</value>
</property>
</activation>
<properties>
<key1>uat_value_key_1</key1>
<keyN>uat_value_key_n</keyN>
</properties>
</profile>
<profile>
<id>env-prod</id>
<activation>
<property>
<name>env</name>
<value>prod</value>
</property>
</activation>
<properties>
<key1>prod_value_key_1</key1>
<keyN>prod_value_key_n</keyN>
</properties>
</profile>
</profiles>
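For the filtering variant, the war plugin also has to be told to filter the deployment descriptor, and web.xml then references the keys as ${key1}, ${keyN}, and so on. A minimal sketch, assuming a maven-war-plugin version that supports filteringDeploymentDescriptors:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-war-plugin</artifactId>
  <configuration>
    <!-- replace ${...} placeholders in WEB-INF/web.xml while packaging -->
    <filteringDeploymentDescriptors>true</filteringDeploymentDescriptors>
  </configuration>
</plugin>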
Then activate one profile or the other by passing the env property on the command line, e.g.:
mvn -Denv=uat package
Another option would be to put the values into specific filters and pick up the right one at build time (like in this post).
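A sketch of that filter-file variant, assuming one properties file per environment under src/main/filters (the file names here are made up):
<build>
  <filters>
    <!-- resolves to filter-uat.properties or filter-prod.properties -->
    <filter>src/main/filters/filter-${env}.properties</filter>
  </filters>
  <resources>
    <resource>
      <directory>src/main/resources</directory>
      <filtering>true</filtering>
    </resource>
  </resources>
</build>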
There are really many options, but as I said, I don't think you can do this without running the build twice.
More resources on profiles/filtering:
Maven Book: Chapter 11. Build Profiles
Maven Book: Chapter 15.3. Resource Filtering
Introduction to Build Profiles
Use an alternative Maven Profile during test phase
maven profile filtering search on Google
You can tell the Maven Assembly plugin to simply generate two assemblies. You just write an assembly descriptor file for each output you wish to create and list them in the plugin config.
For example, I'm using it to generate a WAR file and a TGZ file, but there's no reason you couldn't produce two WARs the same way. mvn package will then generate both files.
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.3</version>
<configuration>
<descriptors>
<descriptor>src/main/assembly/assembly-war.xml</descriptor>
<descriptor>src/main/assembly/assembly-dist.xml</descriptor>
</descriptors>
</configuration>
<executions>
<execution>
<id>dist-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
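For reference, a WAR-producing descriptor could look roughly like the sketch below (the id and paths are assumptions, not taken from the original post); assembly-dist.xml would do the same with a different format such as tar.gz:
<!-- src/main/assembly/assembly-war.xml -->
<assembly>
  <id>webapp</id>
  <formats>
    <format>war</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <fileSets>
    <!-- repackage the exploded webapp that the war plugin already built -->
    <fileSet>
      <directory>${project.build.directory}/${project.build.finalName}</directory>
      <outputDirectory>/</outputDirectory>
    </fileSet>
  </fileSets>
</assembly>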
I'd generally suggest using profiles and running two dedicated builds. However, it should be possible to create any number of artifacts using the maven-assembly-plugin.
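With the uat and prod profiles shown earlier, the two dedicated builds would simply be run back to back (just make sure the resulting WAR names differ so one doesn't overwrite the other):
mvn -Puat clean package
mvn -Pprod clean package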
Old question, but I want to answer it for completeness.
You can do this in one build step very simply with the war plugin with two executions. See the sample code below:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<executions>
<execution>
<id>build-context-one</id>
<phase>install</phase>
<goals>
<goal>war</goal>
</goals>
<configuration>
<classifier>context-one</classifier>
<webResources>
<resource>
<filtering>true</filtering>
<directory>src/main/webapp</directory>
<includes>
<include>**</include>
</includes>
</resource>
<resource>
<directory>your-context-one-directory</directory>
</resource>
</webResources>
</configuration>
</execution>
<execution>
<id>build-context-two</id>
<phase>install</phase>
<goals>
<goal>war</goal>
</goals>
<configuration>
<classifier>context-two</classifier>
<webResources>
<resource>
<filtering>true</filtering>
<directory>src/main/webapp</directory>
<includes>
<include>**</include>
</includes>
</resource>
<resource>
<directory>your-context-two-directory</directory>
</resource>
</webResources>
</configuration>
</execution>
</executions>
</plugin>
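A single run through the install phase should then produce the default WAR plus one additional WAR per execution, named with the configured classifier (a sketch; the exact names depend on your finalName):
mvn clean install
# target/<finalName>.war
# target/<finalName>-<classifier>.war   (one per execution)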
I think this can only be achieved by writing a custom Maven plugin, or by hooking into the build lifecycle and running the WAR assembly process twice (this is just a rough idea).
Maybe you could create two profiles and run the goal twice with different profiles (mvn -P prof1 package, mvn -P prof2 package), but be careful with the generated artifact names so one build doesn't overwrite the other. Or you might be able to create a custom plugin that drives other plugins and assembles the two WAR files.
While I use Ivy rather than Maven, here's how you would generally approach this:
Publish your application itself as a JAR (together with its dependencies and other static content) to a private repository or similar, then create individual projects that build the deployment-specific WARs, each with its context-specific configuration. Building any of the individual deployment projects then gives you the latest version of your actual application with that build's specific configuration.
I would assume that since this is trivial in Ivy, Maven should be able to do it just as easily.
Sort-of-ugly hack (it breaks Maven's idea of declaring intentions instead of actions), but it worked for me: I had to generate two WARs that shared the same back-end code base but differed in their MVC controller packages.
After banging my head against several plugins, I thought "hey, I could do this easily in Ant", which led me to use the maven-antrun-plugin to generate the WARs during the "package" phase (at which point we already have all the files). Sort of like this:
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.3</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>run</goal>
</goals>
<configuration>
<tasks>
<delete file="target/war-1.war" />
<delete file="target/war-2.war" />
<war destfile="target/war-1.war">
<fileset dir="target/original">
<exclude name="**/WEB-INF/classes/package_of_war2/**" />
</fileset>
</war>
<war destfile="target/war-2.war">
<fileset dir="target/original">
<exclude name="**/WEB-INF/classes/package_of_war1/**" />
</fileset>
</war>
<delete file="target/original.war" /
</tasks>
</configuration>
</execution>
</executions>
</plugin>
(to be fair, I did not delete the original file, but you should be able to do so.)
In your case, you could package one WAR with the original web.xml, then rename/move the alternate web.xml over the original one, and package the second WAR.
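Alternatively, instead of shuffling web.xml files around, Ant's <war> task can be pointed directly at an alternative descriptor via its webxml attribute. A sketch along those lines (the paths are assumptions based on the question):
<war destfile="target/cas-test.war" webxml="src/main/config/test/web.xml">
  <fileset dir="target/original">
    <exclude name="WEB-INF/web.xml" />
  </fileset>
</war>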
Simpler:
Just create a multi-module project.
Each module would use WAR packaging :)
Build from the parent POM and voilà!
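A sketch of what the parent POM's module list might look like (the module names are made up), with each module holding its own web.xml and <packaging>war</packaging>:
<modules>
  <module>cas-server-prod</module>
  <module>cas-server-test</module>
</modules>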
Related
I am trying to build and test a JavaFX application on a headless build server. Locally I am using TestFX and Monocle https://github.com/TestFX/Monocle and it's working fine. However, I had to manually install Monocle into the Java extensions folder as per this question: JavaFX + maven + TestFX + monocle don't work together
Now I need to use a headless build server to automate our deployment. I can't figure out how to get this Java extension installed correctly with Maven, without doing it manually. This seemed to be the right feature: https://maven.apache.org/pom.html#Extensions,
<extensions>
<extension>
<groupId>org.testfx</groupId>
<artifactId>openjfx-monocle</artifactId>
<version>8u76-b04</version>
</extension>
</extensions>
but the tests fail with a NoClassDefFoundError (which doesn't happen if I manually put the jar into the extensions folder). I don't know how to debug this, or whether I'm even using the right feature. Any suggestions?
I had a similar headache some time ago. I solved it by copying both openjfx-monocle and everything from the JDK's extensions folder into a folder under target/ and then pointing the java.ext.dirs system property at that path. This way I could avoid the NoClassDefFoundError and successfully run all tests on Jenkins. Here is the profile part:
<!--
This profile is used to make headless tests work with the Monocle Platform.
It first copies the extensions from the JDK to the target/java-extensions folder.
Then copies the openjfx-monocle implementation to the same folder.
Afterwards it sets the extensions path to the folder with the copied extensions and the monocle platform.
-->
<profile>
<id>headless-tests</id>
<activation>
<property>
<name>headless.tests</name>
<value>true</value>
</property>
</activation>
<build>
<plugins>
<plugin>
<artifactId>maven-resources-plugin</artifactId>
<version>2.7</version>
<executions>
<execution>
<id>copy-external-jars</id>
<phase>generate-sources</phase>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<outputDirectory>target/java-extensions</outputDirectory>
<resources>
<resource>
<directory>${java.home}/lib/ext/</directory>
</resource>
</resources>
</configuration>
</execution>
<execution>
<id>copy-monocle-to-extensions</id>
<phase>generate-sources</phase>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<outputDirectory>target/java-extensions</outputDirectory>
<resources>
<resource>
<directory>src/test/resources/test-lib</directory>
<includes>
<include>openjfx-monocle-8u76-b04.jar</include>
</includes>
</resource>
</resources>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.18.1</version>
<configuration>
<argLine>-Djava.ext.dirs=${project.basedir}/target/java-extensions</argLine>
</configuration>
</plugin>
</plugins>
</build>
</profile>
In my case I copied the Monocle jar from Maven into the src/test/resources folder. This can be improved further by using the Maven Dependency Plugin to copy the Monocle jar directly with Maven instead of keeping it in src/test/resources.
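A sketch of that improvement, replacing the second resources execution with a dependency:copy execution (the coordinates are taken from the question's <extension> block):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-monocle-to-extensions</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>copy</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>org.testfx</groupId>
            <artifactId>openjfx-monocle</artifactId>
            <version>8u76-b04</version>
          </artifactItem>
        </artifactItems>
        <outputDirectory>${project.build.directory}/java-extensions</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>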
I have two Git repositories. One repository contains Java services (a Maven web project) and the other contains the UI {HTML, JS, CSS} (non-Maven). When building the Java services repository, I want to include the latest UI (master) in the WAR file. I tried the maven-resources-plugin:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.2</version>
<configuration>
<source>1.7</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<artifactId>maven-resources-plugin</artifactId>
<version>2.7</version>
<executions>
<execution>
<id>copy-resources</id>
<!-- here the phase you need -->
<phase>install</phase>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<outputDirectory>${basedir}/target/</outputDirectory>
<resources>
<resource>
<directory>/home/srinivas/AAA/bb/</directory>
</resource>
</resources>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
mvn install
It copies the resources to the target folder, but they are not placed in the WAR file.
You are using the wrong phase in your execution: the package phase is where your WAR is actually created, so you need to run the copy at an earlier phase, e.g. prepare-package.
You should definitely read Introduction to the Build Lifecycle for clarification.
In addition you should not become accustomed to pulling in resources via maven-resources-plugin from the file system. This is generally frowned upon as bad practice since other developers will not be able to reproduce your build.
Using a repository manager to store your dependencies is the way to go here. Read Why do I need a Repository Manager? to get started.
At first glance, change:
<phase>install</phase>
to
<phase>prepare-package</phase>
I have a Perl file at src/main/java/com/pac/report.pl which I want to package alongside my classes in the jar file.
Using the maven-jar-plugin include directives, I have tried the configuration below and various other suggestions I pulled off the web, but it doesn't copy the Perl file into the jar alongside my classes. Does anyone know what I am doing wrong?
<plugin>
<artifactId>maven-jar-plugin</artifactId>
<version>2.4</version>
<configuration>
<includes>
<include>**/*</include>
</includes>
</configuration>
</plugin>
EDIT
Also, let me point out that I don't want to place the file in the resources directory, for legacy and dependency reasons.
That is because the classes packaged into your jar aren't taken from src, but rather from target (specifically /target/classes), and the compiler completely ignores your non-java file.
Try placing your file in src/main/resources/com/pac/report.pl and it should be packaged into the jar (with the relative path /com/pac/report.pl), since that's the default location where the resources plugin looks for additional files to add to /target before the jar plugin runs.
EDIT: or, if you don't want to / can't do this the way Maven expects, you could manually bind an execution of the resources plugin to the lifecycle to pick up your file and copy it over to target. Something like this:
<build>
<plugins>
<plugin>
<artifactId>maven-resources-plugin</artifactId>
<version>2.6</version>
<executions>
<execution>
<id>copy-resources</id>
<phase>compile</phase> <!-- or later -->
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<outputDirectory>${basedir}/target/classes</outputDirectory>
<resources>
<resource>
<!-- path to your *.pl file here -->
</resource>
</resources>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
...
</build>
What's the easiest/right way to conditionally exclude a Java file from compilation in a Maven project?
I would like to be able to set a 'boolean' property in the pom.xml:
<properties>
<IncludeMayBe>true</IncludeMayBe>
</properties>
...
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.3.2</version>
<configuration>
<excludes>
????
</excludes>
</configuration>
</plugin>
Is there a way to fiddle with the compiler plugin? Or should I go for profiles? I feel like creating a profile is overkill, but maybe it's the only solution...
EDIT:
We have established that profiles are the solution. For conditional activation from within the pom.xml, one can use the following:
<profiles>
<profile>
<activation>
<property>
<name>IncludeMayBe</name>
<value>true</value>
</property>
</activation>
...
</profile>
</profiles>
I suggest you use the Build helper maven plugin.
Using this, you can have several source directories.
Then you can control what source directories are included using profiles.
Assuming you have your monitoring classes under src/monitoring/java, you could add the following to your pom.xml:
<profiles>
<profile>
<id>monitoring</id>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>1.5</version>
<executions>
<execution>
<id>add-sources</id>
<phase>generate-sources</phase>
<goals>
<goal>add-source</goal>
</goals>
<configuration>
<sources>
<source>${basedir}/src/monitoring/java</source>
</sources>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
</profiles>
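The extra source root is then only compiled when the profile is activated, e.g.:
mvn -Pmonitoring clean package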
You cannot do this using the compiler plugin (see below).
But even if you could, it doesn't feel right. It would mean that the files in an artifact JAR file would depend on command-line switches and/or environment settings, and that makes it harder for other folks to reproduce builds.
My gut feeling is that you'd be better off modularizing your Maven project and using profiles to determine which modules get built.
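A sketch of that idea, with the optional class in its own module that is only built when a profile is active (the module and profile names are made up):
<modules>
  <module>core</module>
</modules>
<profiles>
  <profile>
    <id>with-maybe</id>
    <modules>
      <module>maybe-extra</module>
    </modules>
  </profile>
</profiles>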
I had a look at the source code of the compiler plugin mojo, and it looks like there's no way to configure source include / exclude filters. At some point, someone has implemented filters, but the relevant Map objects are private and there is no way to populate them, and hence no way to use this functionality.
The code is here: http://svn.apache.org/viewvc/maven/plugins/tags/maven-compiler-plugin-2.3.2/src/main/java/org/apache/maven/plugin/
I guess you could hack your own version of the plugin ... but it seems like a bad idea.
I want to add a jar file via systemPath, from the local file system relative to my project directory structure, not from a remote repository. I added the dependency declaration, but Maven doesn't do anything else with it.
With the declaration below, I want the jar file copied to my target WEB-INF/lib directory and also packaged as part of the WAR file. At present, that doesn't happen. How do I get the jar file copied into my WAR file?
This is the output from Maven in debug mode:
[DEBUG] cglib:cglib-nodep:jar:2.2:test (setting scope to: compile)
[DEBUG] Retrieving parent-POM: org.objenesis:objenesis-parent:pom:1.2 for project: null:objenesis:ja
[DEBUG] org.objenesis:objenesis:jar:1.2:test (selected for test)
[DEBUG] org.javap.web:testRunWrapper:jar:1.0.0:system (selected for system)
[DEBUG] Plugin dependencies for:
...
<dependency>
<groupId>org.javap.web</groupId>
<artifactId>testRunWrapper</artifactId>
<version>1.0</version>
<scope>system</scope>
<systemPath>${basedir}/lib/testRunWrapper.jar</systemPath>
</dependency>
<plugin>
<artifactId>maven-war-plugin</artifactId>
<configuration>
<webResources>
<resource>
<directory>WebContent</directory>
</resource>
</webResources>
</configuration>
</plugin>
OK, I did this (note the directory structure at the bottom): with the approach below, the jar file from the project-relative path is treated as a first-class citizen like the other jars. The pom.xml listing below corrects my original problem, and the jar file ends up in my target directory.
<repositories>
<repository>
<id>JBoss</id>
<name>JBoss Repository</name>
<layout>default</layout>
<url>http://repository.jboss.org/maven2</url>
</repository>
<repository>
<id>my-local-repo</id>
<url>file://${basedir}/lib/repo</url>
</repository>
</repositories>
<dependency>
<groupId>testRunWrapper</groupId>
<artifactId>testRunWrapper</artifactId>
<version>1.0.0</version>
</dependency>
$ find repo
repo
repo/testRunWrapper
repo/testRunWrapper/testRunWrapper
repo/testRunWrapper/testRunWrapper/1.0.0
repo/testRunWrapper/testRunWrapper/1.0.0/testRunWrapper-1.0.0.jar
Using the maven dependency plugin does the job:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.8</version>
<executions>
<execution>
<id>copy-dependencies</id>
<phase>compile</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<outputDirectory>${project.build.directory}/${project.build.finalName}/WEB-INF/lib</outputDirectory>
<includeScope>system</includeScope>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
Don't use system scope. To do what you want, just declare it as a regular (compile) dependency and use mvn install:install-file to put the jar into your local repository. Everything else will then work as you want (the lib will be copied, etc.). That will mean the build only works on your machine, however.
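For example, with the coordinates from the question's dependency declaration, the one-off install could look like this (a sketch; adjust the file path and version to match your jar):
mvn install:install-file -Dfile=lib/testRunWrapper.jar \
    -DgroupId=org.javap.web -DartifactId=testRunWrapper \
    -Dversion=1.0.0 -Dpackaging=jar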
To properly fix this for your (internal) team, you will want to set up a repository manager (e.g. Artifactory, Nexus, or Archiva). This is almost a must for team use of Maven.
If this is for public (e.g. open source) use you can either mimic a repository via an http server or put up a real repository.
Try something like this (using the Ant plugin to manually copy the jar to the output directory):
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<phase>test</phase>
<goals>
<goal>run</goal>
</goals>
<configuration>
<tasks>
<copy file="${project.basedir}/pathToJAR.jar"
todir="${project.build.directory}/outputFileName/WEB-INF/lib"/>
</tasks>
</configuration>
</execution>
</executions>
</plugin>
AFAIK, system scoped dependencies are somewhat like those with provided scope and thus are not included in the target artifact. Why don't you install the dependency into your local repository instead?
From the doc:
system
This scope is similar to provided except that you have to provide the JAR which contains it explicitly. The artifact is always available and is not looked up in a repository.
In case that answer didn't work for you (as it didn't for me) and you know that system is a bad scope, you can try installing the jar with the maven-install-plugin, which installs the JAR into your actual local Maven repository. Basically you only need to add this plugin to your pom.xml:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-install-plugin</artifactId>
<version>2.4</version>
<executions>
<execution>
<phase>initialize</phase>
<goals>
<goal>install-file</goal>
</goals>
<configuration>
<groupId>myGroupId</groupId>
<artifactId>myArtifactId</artifactId>
<version>myVersion</version>
<packaging>jar</packaging>
<file>${basedir}/lib/xxx.jar</file>
</configuration>
</execution>
</executions>
</plugin>
Fill in the appropriate values for groupId, artifactId and version, put your original jar file into the <project-home>/lib directory, and adjust <file> above accordingly. You can add more execution sections, but then don't forget to give them ids, like:
<execution>
<id>common-lib</id>
Everybody who updates from the code-repo needs to call mvn initialize once.
And all Eclipse-enthusiasts may add this to pom.xml, too, to get rid of errors in Eclipse:
<pluginManagement>
<plugins>
<!-- This plugin's configuration is used to store Eclipse m2e settings
only. It has no influence on the Maven build itself. -->
<plugin>
<groupId>org.eclipse.m2e</groupId>
<artifactId>lifecycle-mapping</artifactId>
<version>1.0.0</version>
<configuration>
<lifecycleMappingMetadata>
<pluginExecutions>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-install-plugin</artifactId>
<versionRange>[2.4,)</versionRange>
<goals>
<goal>install-file</goal>
</goals>
</pluginExecutionFilter>
<action>
<execute></execute>
</action>
</pluginExecution>
</pluginExecutions>
</lifecycleMappingMetadata>
</configuration>
</plugin>
</plugins>
</pluginManagement>
The problem with using a file-system reference is that dependent projects will not be able to access this jar file: the dependent project's ${basedir} is different, and thus the .jar file won't be found.
Global repositories on the other hand are universally accessible.