Using Maven Profiles with a WAR-Packaged Product

We have a Maven build (version 2.2.1) that currently produces a WAR file. Our output directory is target/, so we end up with a build artifact target/MyWar.war.
I'm adding two profiles to our pom.xml to facilitate specific build "flavors", each of which requires a specific version of an A.xml file. In the intermediate build directory target/MyWar/ there are three files:
A.xml
A_1.xml
A_2.xml
Building in Maven without a specified profile should use A.xml, and it does currently. I want to use maven-antrun-plugin to (for Profile 1) replace A.xml with A_1.xml, and for Profile 2 replace A.xml with A_2.xml. (Removing the _1 and _2 suffixes.)
This is an example of Profile 1's execution:
<profile>
<id>Profile1</id>
<build>
...
<execution>
<id>Profile1-Replace</id>
<phase>package</phase>
<goals>
<goal>run</goal>
</goals>
<configuration>
<tasks>
<delete file="${project.build.outputDirectory}/../MyWar/A.xml" />
<copy file="${project.build.outputDirectory}/../MyWar/A_1.xml" tofile="${project.build.outputDirectory}/../MyWar/A.xml" />
</tasks>
</configuration>
</execution>
...
</build>
</profile>
This correctly replaces the files in the intermediate target/MyWar/ directory, but for whatever reason the final WAR that's being produced in target does not reflect these changes.
It's as if running in the 'package' phase is too late and the WAR has already been built. Changing the phase to 'compile' or 'test' (the immediately preceding phases) fails, because the A.xml file (and the intermediate build directory) have not even been created yet.

I would suggest using the process-resources phase instead, or even generate-resources if you feel that is a better fit. As a last resort, use prepare-package. But the package phase is the wrong place to do this sort of thing; modifications like this would typically be made against resources in the source tree rather than the packaged output.
However, if you do the file manipulation in a separate directory, then you can add it during the package phase using the maven-war-plugin as follows:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<configuration>
<webResources>
<resource>
<directory>A_variant</directory>
</resource>
</webResources>
</configuration>
</plugin>
Of course if you need to go this route, it would be simpler to keep three directories and choose the appropriate one in your profile.
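A rough sketch of that last suggestion (the directory names and the a.xml.dir property are made up for illustration): keep one directory per flavor, each containing its own A.xml, and let each profile point the war plugin at the right one via a property.

<properties>
    <!-- directory whose A.xml is used when no profile is given -->
    <a.xml.dir>src/main/config/default</a.xml.dir>
</properties>

<profiles>
    <profile>
        <id>Profile1</id>
        <properties>
            <a.xml.dir>src/main/config/profile1</a.xml.dir>
        </properties>
    </profile>
    <profile>
        <id>Profile2</id>
        <properties>
            <a.xml.dir>src/main/config/profile2</a.xml.dir>
        </properties>
    </profile>
</profiles>

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-war-plugin</artifactId>
            <configuration>
                <webResources>
                    <resource>
                        <!-- copies the selected A.xml into the root of the WAR -->
                        <directory>${a.xml.dir}</directory>
                    </resource>
                </webResources>
            </configuration>
        </plugin>
    </plugins>
</build>

With A.xml living only in those three directories (and not in src/main/webapp), a plain mvn package picks up the default flavor and mvn package -PProfile1 picks up the first one, with no post-packaging file juggling.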

Related

Spread config files to child poms (maven)

I have a parent pom which is inherited by multiple other poms
superpom
|--pokemon
|--|--app
|--|--infrastructure
|--yu-gi-oh
|--|--app
|--|--infrastructure
I have multiple config files like:
a cve-suppress.xml file for the good old dependency plugin (can be directly on pom level)
logback.xml (must be in test/resources)
...
Of course, I could keep copies of these files in every module of every project, but then any change would have to be made everywhere, which is time-consuming.
How can I effectively copy the files to the child modules at build time?
The files are mostly used for testing in GitLab pipelines.
Possible ideas
1. Resource Plugin in superpom
<plugin>
<artifactId>maven-resources-plugin</artifactId>
<version>3.0.2</version>
<executions>
<execution>
<id>copy-resource-one</id>
<phase>generate-sources</phase>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<outputDirectory>${basedir}/target/destination-folder</outputDirectory>
<resources>
<resource>
<directory>source-files</directory>
<includes>
<include>foo.txt</include>
</includes>
</resource>
</resources>
</configuration>
</execution>
</executions>
</plugin>
the superpom specifies the path to the resource
not working, because when e.g. pokemon executes the plugin, the file does not exist within that module's scope
2. Use Gitlab variables/files
copy files to the desired places in the GitLab pipelines
problem 1 -> copying the file into every test/resources folder of every module is tedious, and changes to the paths may lead to errors
problem 2 -> the file content then lives in GitLab, separated from the parent POM, so the overview might get blurry
3. Use mojo exec plugin
trigger a script that creates the file directly
I am not sure how to do this exactly; I cannot find good examples so far :/ (my rough, untested guess is sketched below)
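Something like the following is what I have in mind (untested; the plugin version and the shared-config path are assumptions, and using cp assumes Linux runners and Maven 3.3+ for ${maven.multiModuleProjectDirectory}):

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>exec-maven-plugin</artifactId>
    <version>3.1.0</version>
    <executions>
        <execution>
            <id>copy-shared-logback</id>
            <phase>generate-test-resources</phase>
            <goals>
                <goal>exec</goal>
            </goals>
            <configuration>
                <!-- copy the shared file from the repository root into the current module -->
                <executable>cp</executable>
                <arguments>
                    <argument>${maven.multiModuleProjectDirectory}/shared-config/logback.xml</argument>
                    <argument>${project.basedir}/src/test/resources/logback.xml</argument>
                </arguments>
            </configuration>
        </execution>
    </executions>
</plugin>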
Do you have any other ideas? Is there a way I am missing, or a dedicated plugin for this?

Unzip and re zip a file using Maven?

Question: is there any way in Maven (without resorting to an Ant plugin) to unzip a file, cd into the directory, remove a file, and then rezip it, all as part of the build?
This is necessary because it is a complex build, and I also do not want to have to use Gradle to accomplish this task.
The requirement of unzipping, removing a file and zipping again can be met in a single step by the truezip-maven-plugin and its remove goal, which:
Remove a set of files from an existing archive.
The official examples also cover this scenario.
Given the following snippet:
<properties>
<archive>${project.basedir}/sample.zip</archive>
</properties>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>truezip-maven-plugin</artifactId>
<version>1.2</version>
<executions>
<execution>
<id>remove-a-file</id>
<goals>
<goal>remove</goal>
</goals>
<phase>package</phase>
<configuration>
<fileset>
<!-- note how the archive is treated as a normal file directory -->
<directory>${archive}</directory>
<includes>
<include>hello.txt</include>
</includes>
</fileset>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
And executing:
mvn clean package
The build will process the ${archive} file (in this case a sample.zip at the same level as the pom.xml file, that is, in the project.basedir directory), remove the hello.txt file from it, and then rezip everything.
I just tested it successfully; you can even skip the properties section if it's not required. However, you should also be aware that:
The zip file should not be under version control, otherwise it would create conflicts at each build
The behavior most probably should not be part of the default Maven build, hence good candidate for a Maven profile
the plugin replaces the original file, so if that is an issue you could first copy it to another location and then process the copy as above. To copy it, you could use the maven-resources-plugin and its copy-resources goal (see the sketch below).
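For instance, a rough sketch of that copy step (the prepare-package binding and target location are assumptions; ${archive} would then point at the copy in target/ instead of the original):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-resources-plugin</artifactId>
    <executions>
        <execution>
            <id>copy-archive</id>
            <!-- runs before the truezip remove execution bound to package -->
            <phase>prepare-package</phase>
            <goals>
                <goal>copy-resources</goal>
            </goals>
            <configuration>
                <outputDirectory>${project.build.directory}</outputDirectory>
                <resources>
                    <resource>
                        <directory>${project.basedir}</directory>
                        <includes>
                            <include>sample.zip</include>
                        </includes>
                    </resource>
                </resources>
            </configuration>
        </execution>
    </executions>
</plugin>

The <archive> property would then be ${project.build.directory}/sample.zip, leaving the original untouched.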

Does maven have an ability to pack single *.dll to jar without any sources?

I'd like to add *.dll files as third-party libs to my repository and, during the packaging process, just pack them into a *.jar, sign them and copy them to some specific folder.
Signing and copying are already done and work correctly (as expected, using the maven-dependency-plugin and maven-jarsigner-plugin). But I didn't find any method to automatically pack a single dll into a jar (without any sources, like the maven-assembly-plugin does).
The only solution I see at the moment: add to my repository not a "pure" dll, but a lib already packed into a jar (packed by myself)... but that's not a good idea, I guess.
It sounds like you've successfully retrieved your .dll (with dependency plugin) and signed it (jarsigner plugin), and it's somewhere in your ${project.build.directory} (which defaults to target).
If that's correct, give this a try:
Define the packaging of your project as jar
Retrieve dlls
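(You already have this step working, but for completeness, the retrieval might look roughly like the following; the coordinates and output directory are placeholders.)

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <id>copy-dlls</id>
            <phase>generate-resources</phase>
            <goals>
                <goal>copy</goal>
            </goals>
            <configuration>
                <artifactItems>
                    <artifactItem>
                        <!-- placeholder coordinates for the dll deployed to your repository -->
                        <groupId>com.example</groupId>
                        <artifactId>some-native-lib</artifactId>
                        <version>1.0</version>
                        <type>dll</type>
                    </artifactItem>
                </artifactItems>
                <outputDirectory>${project.build.directory}/dlls</outputDirectory>
            </configuration>
        </execution>
    </executions>
</plugin>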
Make sure the jarsigner:sign goal is bound to the prepare-package phase. It binds to package by default and we need to ensure jarsigner:sign runs before jar:jar.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jarsigner-plugin</artifactId>
<version>1.2</version>
<executions>
<execution>
<id>sign</id>
<phase>prepare-package</phase> <!-- important -->
<goals>
<goal>sign</goal>
</goals>
</execution>
</executions>
</plugin>
Configure the jar plugin to include the signed dll(s)
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.4</version>
<executions>
<execution>
<!-- using this ID merges this config with default -->
<!-- So it should not be necessary to specify phase or goals -->
<!-- Change classes directory because it will look in target/classes
by default and that probably isn't where your dlls are. If
the dlls are in target then directoryContainingSignedDlls is
simply ${project.build.directory}. -->
<id>default-jar</id>
<configuration>
<classesDirectory>directoryContainingSignedDlls</classesDirectory>
<includes>
<include>**/*.dll</include>
</includes>
</configuration>
</execution>
</executions>
</plugin>
Now, running mvn clean package should give you a jar containing your signed dlls.
If JACOB requires manifest config there are docs explaining how to do this.
Good luck!
I would recommend packing your dlls as a zip archive via the maven-assembly-plugin and letting that module deploy the zip archive as an artifact attached to your usual POM. The packaging of that project should be pom instead of the default.
I would be a little bit confused if I downloaded a jar and found dlls inside it,
but if you prefer, you could create the jar via the maven-assembly-plugin or use the maven-jar-plugin.
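If you go the assembly route, a minimal descriptor for such a zip might look like the following sketch (the directory holding the dlls is an assumption):

<assembly>
    <id>dlls</id>
    <formats>
        <format>zip</format>
    </formats>
    <includeBaseDirectory>false</includeBaseDirectory>
    <fileSets>
        <fileSet>
            <!-- adjust to wherever your (signed) dlls end up -->
            <directory>${project.build.directory}/dlls</directory>
            <outputDirectory>/</outputDirectory>
            <includes>
                <include>**/*.dll</include>
            </includes>
        </fileSet>
    </fileSets>
</assembly>

Reference it from the POM via the assembly plugin's <descriptors> configuration and bind the single goal to the package phase; the resulting zip is then attached and deployed alongside the pom artifact.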

In Maven2, what's the simplest way to build a WAR and the EAR to contain that WAR in a single POM?

Situation is pretty straightforward. I have a Java webapp that I'm converting to be built with Maven. At present, the app is built with Ant into a single WAR file, which is then bundled into an EAR with a very simple application.xml.
maven-war-plugin and maven-ear-plugin both look pretty straightforward to me, and it appears they're setting me up to be forced to consider the above as two distinct projects, with the WAR project as a dependency of the EAR project. This seems a tad inconvenient, especially because a profile setting of the WAR project will change for each environment, which seems like it would force me to duplicate that build tweaking each time I attempted to build the EAR as well.
All of that to say: is there a straightforward way to build the WAR and package that into this trivially-simple EAR? I'd like to avoid maintaining these as two separate projects, but would similarly prefer not to resort to an overly messy hack using assemblies to accomplish this.
Short answer: no, there is no simple Maven way to do that, as it would go against the Maven rule of "one artifact per project" (understand: one output per project, which is true in 99% of cases).
And actually, I would strongly recommend not going the hacky way: forget about using assemblies to create an EAR. Instead, create two modules, one with a packaging of type war and the other with a packaging of type ear depending on the war artifact, and declare them both as modules of a parent pom.xml. Like this:
my-project
|-- pom.xml // packaging of type pom and my-war and my-ear as modules
|-- my-war
| `-- pom.xml // packaging of type war
`-- my-ear
`-- pom.xml // packaging of type ear
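The EAR module's POM then just declares the WAR as a dependency and lets the maven-ear-plugin pull it in. A minimal sketch (the group and artifact ids are made up):

<!-- my-ear/pom.xml -->
<packaging>ear</packaging>

<dependencies>
    <dependency>
        <groupId>com.example</groupId>
        <artifactId>my-war</artifactId>
        <version>${project.version}</version>
        <type>war</type>
    </dependency>
</dependencies>

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-ear-plugin</artifactId>
            <configuration>
                <modules>
                    <webModule>
                        <groupId>com.example</groupId>
                        <artifactId>my-war</artifactId>
                        <contextRoot>/myapp</contextRoot>
                    </webModule>
                </modules>
            </configuration>
        </plugin>
    </plugins>
</build>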
If you go for Maven, adopt the Maven philosophy and don't fight against it; it will save you a lot of pain. Seriously, hacking assemblies to do what the maven-ear-plugin already does is just anti-DRY. You'd be better off sticking with Ant in that case.
I know this is 5 years old now, but it was still the first answer that came up when I searched. Also, whilst "that's not the maven way" is a perfectly reasonable answer for some people, others may still prefer to use a single pom as the OP asked, and it is really not that complicated.
First, create a standard war pom.xml to generate the war file you want to include in the ear. Leave the packaging as war.
Then write your own application.xml (in src/main/application or wherever) using a placeholder for the war file name:
<application xmlns="http://java.sun.com/xml/ns/javaee" ... >
<module>
<web>
<web-uri>${project.build.finalName}.war</web-uri>
<context-root>myapp</context-root>
</web>
</module>
</application>
And include any other server-specific xml files (weblogic-application.xml etc.) in the same location.
Next, add a resources section to replace the placeholder with the war file name:
<resources>
<resource>
<directory>src/main/application</directory>
<filtering>true</filtering>
<includes>
<include>META-INF/*.xml</include>
</includes>
</resource>
</resources>
Finally, add an Ant ear task to build the EAR:
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>run</goal>
</goals>
<configuration>
<tasks>
<ear destfile="${project.build.directory}/${project.build.finalName}.ear"
appxml="${project.build.outputDirectory}/META-INF/application.xml">
<fileset dir="${project.build.outputDirectory}" includes="META-INF/*.xml"/>
<fileset dir="${project.build.directory}" includes="${project.build.finalName}.war"/>
</ear>
</tasks>
</configuration>
</execution>
</executions>
</plugin>
And that's it.
In Maven, every single project produces one artifact. In your situation I suggest creating two projects, one for the WAR and one for the EAR. If you need multiple versions of your projects, you can achieve that using classifiers and profiles.
This is an excerpt from the RichFaces examples POM.
<plugin>
<artifactId>maven-war-plugin</artifactId>
<executions>
<execution>
<id>jee5</id>
<phase>package</phase>
<goals>
<goal>war</goal>
</goals>
<configuration>
<webappDirectory>${project.build.directory}/${project.build.finalName}-jee5</webappDirectory>
<classifier>jee5</classifier>
<packagingExcludes>WEB-INF/lib/jsf-api*,WEB-INF/lib/jsf-impl*,WEB-INF/lib/el-*</packagingExcludes>
<warSourceExcludes>WEB-INF/lib/jsf-api*,WEB-INF/lib/jsf-impl*,WEB-INF/lib/el-*</warSourceExcludes>
</configuration>
</execution>
<execution>
<id>tomcat6</id>
<phase>package</phase>
<goals>
<goal>war</goal>
</goals>
<configuration>
<webappDirectory>${project.build.directory}/${project.build.finalName}-tomcat6</webappDirectory>
<classifier>tomcat6</classifier>
<packagingExcludes>WEB-INF/lib/el-*</packagingExcludes>
<warSourceExcludes>WEB-INF/lib/el-*</warSourceExcludes>
</configuration>
</execution>
</executions>
<configuration>
<webResources>
<resource>
<directory>${basedir}/src/main/java</directory>
<targetPath>/WEB-INF/src</targetPath>
</resource>
</webResources>
</configuration>
</plugin>
In your EAR POM, use profiles to import the required dependency with the appropriate classifier.
<profile>
<id>jee5</id>
<dependencies>
<dependency>
<groupId>org.richfaces.samples</groupId>
<artifactId>richfaces-demo</artifactId>
<version>${richfaces-version}</version>
<classifier>jee5</classifier>
<type>war</type>
<scope>runtime</scope>
</dependency>
</dependencies>
</profile>

Using Maven to run a WAR dynamically in Tomcat, how does one add classpath entries so only Tomcat sees them?

Scenario is such: I have a webapp that I'd like to run dynamically with the tomcat-maven-plugin's tomcat:run goal. The wrinkle is that I have numerous classpath resources that need to differ between the packaged artifact and the one run off a local workstation.
Failed Attempts:
1.) My first attempt was to use the build-helper-maven-plugin, but it won't work because the target configuration files will (inconsistently!) work their way into the packaged WAR archive.
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>1.3</version>
<executions>
<execution>
<id>add-resource</id>
<phase>generate-resources</phase>
<goals>
<goal>add-resource</goal>
</goals>
<configuration>
<resources>
<resource>
<directory>${basedir}/src/main/resources-env/${testEnv}</directory>
<targetPath>${basedir}/target/classes</targetPath>
</resource>
</resources>
</configuration>
</execution>
</executions>
</plugin>
2.) My second attempt was to add the folder (since the files-to-be-deployed aren't present in Tomcat's classpath yet either) to -Djava.ext.dirs, but it has no effect (I actually suspect that this systemProperties element is misconfigured or otherwise not working at all). See:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>tomcat-maven-plugin</artifactId>
<version>1.0-beta-1</version>
<configuration>
<tomcatWebXml>${basedir}/src/main/mock/web.xml</tomcatWebXml>
<systemProperties>
<property>
<name>java.ext.dirs</name>
<value>${basedir}/src/main/resources-env/${testEnv}</value>
</property>
</systemProperties>
<path>/licensing</path>
</configuration>
</plugin>
I'm not sure what to attempt next. The heart of the problem seems to be that this plugin lacks something like Surefire's <additionalClasspathElement> element.
Would the next step be to create a custom catalina.properties and add it to a <configurationDir>? If so, what would catalina.properties need to look like?
Edit: More thorough explanation follows
I understand this question reads somewhat vaguely, so I'll try to elaborate a bit.
My POM uses the webResources functionality of the WAR plugin to copy some environment-specific config files, without using a profile to do it, by copying in a resource directory named /src/main/resources-env/${env} like so:
...
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
...
<configuration>
...
<webResources>
<!-- Copy Environment-specific resources to classes -->
<resource>
<directory>${basedir}/src/main/resources-env/${env}</directory>
<targetPath>WEB-INF/classes</targetPath>
</resource>
</webResources>
</configuration>
</plugin>
This will copy the (default, DEV) environment resources into the package and currently works fine. Note also that because this happens as part of packaging, the tomcat:run goal never sees these resources (which is desired, as the environments differ).
So the problem is this: when the dynamic tomcat:run is executed, the application will fail because its classpath (it looks at target/classes) will lack the needed local workstation environmental config files. All I need to do is get those on the path for tomcat, but would like to do so without adding anything to the command line and definitely without breaking the build's integrity if someone follows up with a mvn package and doesn't clean first.
I hope this is more clear.
I may be missing something but why don't you declare the required dependencies in a profile and use this profile when running Tomcat? I don't get why you would need to put these resources at Tomcat's classpath level.
UPDATE: I'm editing my answer to cover the comment from the OP itself answering my question above.
You're correct, the files really need to be in the webapp classpath, not Tomcat's. So how could I make a profile that activates automatically for tomcat:run, but without additional command-line args?
I don't know how to do this without declaring the profile as <activeByDefault> or listing it under the <activeProfiles> (but this is not what I had in mind, I would rather use property activation and call something like mvn tomcat:run -Denv=test, not sure to understand why this is a problem).
And how should I "declare the dependencies" in the profile while ensuring that subsequent invocations never let them into the packaged WAR via a vanilla mvn package?
If the previously mentioned profile is active by default, then you'll need to exclude it if you don't want it, by calling something like mvn package -P !profile-1. A profile can't be magically deactivated for one particular goal (at least, not to my knowledge).
Actually, my understanding is that you really have two different context here: the "testing" context (where you want to include more things in the WAR) and the "normal" context (where you don't want these things to be included). To be honest, I don't know how you could distinguish these two situations without specifying any additional parameter (either to activate a profile or to deactivate it depending on the context). You must have valid reasons but, as I said, I don't really understand why this is a problem. So maybe profiles are not a solution for your situation. But I'd really like to understand why because this seems to be a typical use case for profiles :)
UPDATE2: Having read your comment on the other answer and your update, I realize that my initial understanding was wrong (I thought you were talking about dependencies in the Maven sense). But I still think that profiles could help you, for example to customize the <resources> as in this blog post (that is just one way to do it; using a property like src/main/resources/${env} in the path is another way to go). But this won't solve all your concerns (like not specifying additional command-line params or automagically cleaning the target directory). I don't have any solutions for that.
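For illustration, a rough sketch of that property-activated profile idea, reusing the resources-env layout from the question (untested; the profile id is made up):

<profile>
    <id>local-tomcat</id>
    <activation>
        <!-- active whenever -Denv=... is passed on the command line -->
        <property>
            <name>env</name>
        </property>
    </activation>
    <build>
        <resources>
            <resource>
                <directory>src/main/resources</directory>
            </resource>
            <!-- puts the environment-specific files on the webapp classpath for tomcat:run -->
            <resource>
                <directory>src/main/resources-env/${env}</directory>
            </resource>
        </resources>
    </build>
</profile>

It would be activated by something like mvn tomcat:run -Denv=test.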
Add the dependencies element directly to the plugin element.
Here is an example of doing the same with the Jetty plugin from the (still in development) Maven Handbook: http://www.sonatype.com/books/mhandbook-stage/reference/ch06s03.html
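In other words, something along these lines (the dependency coordinates are placeholders); dependencies declared at the plugin level go on the plugin's classpath rather than into the packaged WAR:

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>tomcat-maven-plugin</artifactId>
    <version>1.0-beta-1</version>
    <configuration>
        <path>/licensing</path>
    </configuration>
    <dependencies>
        <!-- placeholder coordinates for the extra classpath resources -->
        <dependency>
            <groupId>com.example</groupId>
            <artifactId>local-env-config</artifactId>
            <version>1.0</version>
        </dependency>
    </dependencies>
</plugin>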
Vote for http://jira.codehaus.org/browse/MTOMCAT-77 which addresses this need.
Here's the solution I have in place at the moment.
Special thanks to Pascal's diligent conversation here, but I ultimately decided to make a change to how I was loading my environment-specific config files throughout the goals and now I believe I'm getting most of what I initially wanted.
I removed the config files from the WAR plugin's <webResources> and the test config from <testResources>, and I now manage the resource copying manually with the maven-resources-plugin, copying them directly into target/classes at the point in the build where they're needed. This way Tomcat can see the config, but the tests aren't broken by having duplicate or differing config files on the path.
It's definitely a mess, but it works. Listing:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-resources-plugin</artifactId>
<version>2.4.1</version>
<executions>
<execution>
<id>copy-env-resources</id>
<phase>process-resources</phase>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<resources>
<resource>
<directory>${basedir}/src/main/resources-env/${env}</directory>
<filtering>true</filtering>
</resource>
</resources>
<outputDirectory>${basedir}/target/classes</outputDirectory>
</configuration>
</execution>
<execution>
<id>copy-testEnv-resources</id>
<phase>process-test-resources</phase>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<resources>
<resource>
<directory>${basedir}/src/main/resources-env/${testEnv}</directory>
<filtering>true</filtering>
</resource>
</resources>
<outputDirectory>${basedir}/target/classes</outputDirectory>
</configuration>
</execution>
<execution>
<id>copy-env-resources-again</id>
<phase>prepare-package</phase>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<resources>
<resource>
<directory>${basedir}/src/main/resources-env/${env}</directory>
<filtering>true</filtering>
</resource>
</resources>
<outputDirectory>${basedir}/target/classes</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
So a mvn clean install will build and test with ${env} and ${testEnv} appropriately. A mvn -Denv=someLocalConfig tomcat:run (which in my case is identical to my default ${testEnv}) will make sure src/main/resources-env/someLocalConfig gets loaded for Tomcat's dynamic execution, but without requiring that I do a clean before successfully rebuilding.
Like I said, messy that I'm rewriting the same cluster of files to the same target location at each phase, but it accomplishes what I'd meant to.
