I have a POM file which includes a property (under the properties section) that holds an IP value; this is the value we push to git.
<device.ip>1.2.3.4</device.ip>
But for my builds I need to use another IP value, so I have to change it to the IP I require whenever I start to work on a new branch.
I'd like to be able to check the property value at build startup and abort the build if the value differs from what I need.
Any other solutions are also welcome.
(I hope my question won't be downvoted for lack of code; there is really no code to write here, and the scenario is quite self-explanatory.)
Thank you for your advice.
You could use the maven-enforcer-plugin, which supports such checks.
The usage looks like this for the requireProperty rule:
<project>
[...]
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-enforcer-plugin</artifactId>
<version>3.0.0-M2</version>
<executions>
<execution>
<id>enforce-property</id>
<goals>
<goal>enforce</goal>
</goals>
<configuration>
<rules>
<requireProperty>
<property>device.ip</property>
<message>You must set a device.ip property!</message>
<regex>.*\d.*</regex> <!-- Express the value you need. -->
<regexMessage>The device.ip property contain...</regexMessage>
</requireProperty>
</rules>
<fail>true</fail>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
[...]
</project>
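With this in place the check runs on every build, since the enforce goal binds to the validate phase by default, so a mismatching value aborts the build before anything is compiled. If you would rather override the value per branch than edit the POM, you could also pass it on the command line; the IP below is just a placeholder:
mvn clean install -Ddevice.ip=10.0.0.5
Command-line user properties take precedence over POM properties, so this overrides the committed value for that one build.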
I would suggest splitting your project into modules.
Module 1 contains your code without any configuration.
Module 2 refers to module 1 and additionally contains what goes into production. For multiple deployments create an additional module for each. This is where your production property goes.
Module 3 refers to module 1 (but not 2) and contains whatever you need for development (configuration like this property and helper classes). For complex scenarios make an additional module for each.
This has worked well for me.
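As a rough sketch, the parent's modules section for that layout might look like this (the module names are made up for illustration):
<modules>
  <module>myapp-core</module> <!-- module 1: plain code, no configuration -->
  <module>myapp-prod</module> <!-- module 2: depends on core, holds the production property -->
  <module>myapp-dev</module>  <!-- module 3: depends on core, holds development configuration -->
</modules>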
I have a Java (11.0.7) Maven (3.0.6) multi-module project that contains the following module declarations:
<modules>
<module>jdrum-commons</module>
<module>jdrum-datastore-base</module>
<module>jdrum-datastore-simple</module>
<module>jdrum</module>
</modules>
Each of these Maven modules contains a module-info that defines the necessary requirements and exports to restrict access and visibility.
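For illustration, the module-info.java of jdrum-datastore-simple looks roughly like this (the exact requires and exports entries here are an assumption; only the module name jdrum.datastore.simple appears later in this question):
module jdrum.datastore.simple {
    // hypothetical: depends on the base datastore module
    requires jdrum.datastore.base;
    // hypothetical: the exported API package
    exports at.rovo.drum.datastore.simple;
}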
As such, jdrum-datastore-simple has some test utility classes that I reuse in jdrum's tests. By configuring the surefire plugin in jdrum's config via the code snippet below I am able to package the whole project without any issues.
<build>
<plugins>
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<argLine>
<!-- Allow the unnamed module access to the tests at test-time -->
--add-opens jdrum/at.rovo.drum.impl=ALL-UNNAMED
--illegal-access=deny
</argLine>
</configuration>
</plugin>
</plugins>
</build>
Within the parent POM I've also configured the generation of a report via the site lifecycle, which also generates the Javadoc of the respective projects. The configuration for the JAR containing the Javadoc as well as the configuration for the Javadoc generation as part of the report are the same and look like this:
<!-- Generate Javadoc while reporting -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<version>3.2.0</version>
<inherited>true</inherited>
<configuration>
<verbose>true</verbose>
<source>${maven.compiler.source}</source>
<show>protected</show>
<failOnWarnings>false</failOnWarnings>
<release>${maven.compiler.release}</release>
<stylesheet>java</stylesheet>
</configuration>
<reportSets>
<reportSet>
<id>html</id>
<reports>
<report>javadoc</report>
</reports>
</reportSet>
</reportSets>
</plugin>
The Javadoc generation as part of the package step, which produces the project-version-javadoc.jar as output, succeeds, as both the jdrum-datastore-simple dependency and its tests are only included at test time:
<!-- Test data store to use for testing -->
<dependency>
<groupId>at.rovo</groupId>
<artifactId>jdrum-datastore-simple</artifactId>
<version>${project.parent.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>at.rovo</groupId>
<artifactId>jdrum-datastore-simple</artifactId>
<version>${project.parent.version}</version>
<scope>test</scope>
<type>test-jar</type>
</dependency>
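For completeness: the test-jar artifact referenced above is presumably produced in jdrum-datastore-simple itself with the maven-jar-plugin's test-jar goal, along these lines:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>test-jar</goal>
      </goals>
    </execution>
  </executions>
</plugin>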
If I changed the scope from test to compile or provided, the Javadoc generation would also fail, with an error such as
Exit code: 1 - javadoc: error - The code being documented uses packages in the unnamed module, but the packages defined in https://github.com/RovoMe/JDrum/jdrum-datastore-simple/apidocs/ are in named modules.
The issue here, as far as I understand the problem, is that the jdrum-datastore-simple module is not added to the module path of Javadoc. The next logical step was therefore to add that module to the configuration, like so:
<reporting>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<configuration>
<additionalOptions>
<option>--add-modules</option>
<option>jdrum.datastore.simple</option>
</additionalOptions>
</configuration>
</plugin>
</plugins>
</reporting>
This adds the jdrum-datastore-simple module to the Javadoc configuration string, which can be seen in the jdrum/target/site/apidocs/options file that now contains an
...
--add-modules
jdrum.datastore.simple
...
entry. On further analysis of the generated options file it is apparent that the module path is missing a reference to the actual JAR file; hence Javadoc cannot locate the defined module, the Javadoc generation fails, and with it the Maven process. If I update that options file, add the path to the missing JAR file, and then only perform a mvn package site, the whole process succeeds and all is fine (as would a pure invocation of the javadoc.bat located in the target/site/apidocs folder).
Now, in order to make the whole process more dynamic, I wanted to add or update the module path myself. However, the maven-javadoc-plugin does not directly allow this. Therefore I came up with adding a further maven-javadoc-plugin option of --module-path, plus a further option entry containing the whole path. By the whole path I mean the path to every single dependency, not only the path to jdrum-datastore-simple. This also works, but because the paths to the respective JAR files are hardcoded, the project is not usable by other users unless they have the same system and path structure I used. To fix this I replaced the respective path segments with the ${settings.localRepository} and ${project.parent.basedir} properties for the respective modules on the module path. Unfortunately Javadoc is rather picky about the path structure it accepts: on my Windows machine Maven returns a path structure starting with C:\Users\..., which Javadoc can't handle. If the path structure looks like C:/Users/..., however, Javadoc is fine with the values.
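Expressed in the plugin configuration, that attempt looked roughly like the following; the concrete path is exactly the hardcoded, machine-specific part that causes the portability problem (path and version here are illustrative only):
<additionalOptions>
  <option>--add-modules</option>
  <option>jdrum.datastore.simple</option>
  <option>--module-path</option>
  <option>C:/Users/me/.m2/repository/at/rovo/jdrum-datastore-simple/1.0/jdrum-datastore-simple-1.0.jar</option>
</additionalOptions>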
On further research I stumbled upon this thread, which suggests using Maven's build-helper-maven-plugin to define new properties for e.g. the M2 repository, and using its built-in regex capability to replace \ characters with /. However, adding a configuration such as
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>3.1.0</version>
<executions>
<execution>
<id>replace-local-repo-characters</id>
<goals>
<goal>regex-property</goal>
</goals>
<configuration>
<name>tag.m2repo</name>
<value>${settings.localRepository}</value>
<regex>\\</regex>
<replacement>/</replacement>
<failIfNoMatch>false</failIfNoMatch>
</configuration>
</execution>
<execution>
<id>replace-local-path-characters</id>
<goals>
<goal>regex-property</goal>
</goals>
<configuration>
<name>tag.basedir</name>
<value>${project.parent.basedir}</value>
<regex>\\</regex>
<replacement>/</replacement>
<failIfNoMatch>false</failIfNoMatch>
</configuration>
</execution>
</executions>
</plugin>
and using the introduced properties instead does not work at all, as Maven complains about an invalid value. If I use $\{settings.localRepository} Maven is fine with the provided value; however, in the final options file it is not the actual value of settings.localRepository that is substituted but the provided string itself, and I end up with something like $/{settings.localRepository}/org/slf4j/..., which Javadoc can't resolve, so it still misses the correct location of the jdrum-datastore-simple dependency.
So, how can I add the path of the missing dependency to the maven-javadoc-plugin's module path in the generated options file, so that Maven is actually able to generate the whole report?
It seems that with Java 11 Update 9 (maybe also with Update 8; not tested) the maven-javadoc-plugin is able to correctly generate the Javadoc for multi-module projects without the need to alter the module path.
For those interested in what the actual Maven POMs look like:
Parent POM
POM for a shared module
POM for a sharing and consuming module
POM for the consuming module
If I have two dependencies which are the same in the same POM, I want the build to fail. Currently I can detect this happening with the Maven Dependency Plugin's analyze-duplicate goal. However, there's no option to failOnWarning like some of the other goals have (plus, it prints at info level, not warning). Is there an alternative to extending this plugin?
Generally, when you want the build to fail for some reason, the right plugin to look into is the Maven Enforcer Plugin. This plugin can be configured with a set of rules that, when violated, will fail the build.
In this case, it would need to be a rule that checks for duplicate dependencies, and there is a built-in rule just for that: <banDuplicatePomDependencyVersions>. As such, you could have
<plugin>
<artifactId>maven-enforcer-plugin</artifactId>
<version>1.4.1</version>
<executions>
<execution>
<id>enforce-no-duplicate-dependencies</id>
<goals>
<goal>enforce</goal>
</goals>
<configuration>
<rules>
<banDuplicatePomDependencyVersions/>
</rules>
</configuration>
</execution>
</executions>
</plugin>
This rule is unfortunately not documented (yet, it will be in the next version, see MENFORCER-259), but it exists since version 1.3 of the plugin (MENFORCER-152).
What this rule does is check that there are no two duplicate declarations with the same 'dependencies.dependency.(groupId:artifactId:type:classifier)'; which is to say that two declared dependencies with the same group id and artifact id in the POM must have a different type and/or classifier.
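For example, a POM fragment like the following would trip the rule, since the same groupId and artifactId are declared twice with no distinguishing type or classifier (the coordinates are placeholders):
<dependencies>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>some-lib</artifactId>
    <version>1.0</version>
  </dependency>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>some-lib</artifactId>
    <version>1.1</version>
  </dependency>
</dependencies>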
I want to deploy two jar artifacts with different classifiers, but at the moment that fails because both supply their own version of pom.xml. How can I fix that, so that both pom.xmls can be uploaded along with their artifacts?
Example - I have com.test.company.somelib-1.0.0-cmp1.jar and com.test.company.somelib-1.0.0-cmp2.jar, where cmpX is a classifier. Both packages contain (logically) the same code and classes (of the same version), they only differ slightly in the way they were preprocessed. The classifier annotation is there due to backwards compatibility we need to maintain.
Long story short, the first artifact uploads fine; the second one fails with Forbidden, because our repository does not allow overwriting artifacts (and I want to keep it that way).
There is a slightly different pipeline that creates both the packages, so it is easier to have their builds separate. I just want to deploy them as two packages of the same name and different classifier.
Thanks for your help.
Edit: it has been suggested to use Maven profiles. I can see that they would work, but they would not be ideal.
Consider the setup I have depicted below: there is a CI server (TeamCity).
There is a "starter" build (Sources). This build checks out all required source files.
From this starter build several other builds are triggered (processing using x.x.x/compile). Each of those builds adjusts a template-pom.xml (fills in a particular classifier and other info), and then builds and deploys its artifact to our Artifactory.
With the setup I want to achieve, if I decide to add another processing build, all I need to do is add another "branch". If I was using profiles, I would also need to add a new profile to the pom.xml file.
Please correct me if I am wrong. Profiles seem to be able to achieve the goal, but not ideally, at least in my case.
I strongly discourage having 2 (or more) different pom files with the same GAV.
But I understand your need arises from legacy reasons.
I have not tried this myself, but it could work:
Leave one build (= maven project) as you have it now. On the other build skip the normal deployment and manually invoke the deploy-file goal of the deploy plugin like so:
<build>
<plugins>
<!-- skip normal execution of deploy plugin -->
<plugin>
<artifactId>maven-deploy-plugin</artifactId>
<executions>
<execution>
<id>default-deploy</id>
<configuration>
<skip>true</skip>
</configuration>
</execution>
</executions>
</plugin>
<!-- invoke with goal: deploy-file -->
<plugin>
<artifactId>maven-deploy-plugin</artifactId>
<executions>
<execution>
<id>someId</id>
<phase>deploy</phase>
<goals>
<goal>deploy-file</goal>
</goals>
<inherited>false</inherited>
<configuration>
<file>path-to-your-artifact-jar</file>
<generatePom>false</generatePom>
<artifactId>xxx</artifactId>
<groupId>xxx</groupId>
<version>xxx</version>
<classifier>xxx</classifier>
<packaging>xxx</packaging>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
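Note that the xxx values and path-to-your-artifact-jar above are placeholders to fill in, and that deploy-file also needs to know where to upload; in a setup like this I would expect to additionally supply its url and repositoryId parameters pointing at your Artifactory. With that in place, a plain mvn deploy on the second build skips the default upload and pushes only the classified JAR.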
I have a Maven project which includes a Maven plugin (the Liquibase Maven plugin) that exposes different goals.
Two of these goals (update and diff) need different parameters which conflict with each other (because the semantics of the two goals differ), so I need to give Maven different properties for the two goal executions.
Here is what I've done:
<plugin>
<groupId>org.liquibase</groupId>
<artifactId>liquibase-maven-plugin</artifactId>
<version>3.4.1</version>
<!-- This configuration is used for every goal except "diff" -->
<configuration>
<propertyFile>src/main/resources/liquibase.properties</propertyFile>
<promptOnNonLocalDatabase>false</promptOnNonLocalDatabase>
</configuration>
<executions>
<execution>
<id>default-cli</id>
<goals>
<goal>diff</goal>
</goals>
<!-- This configuration is used for the "diff" goal -->
<configuration>
<propertyFile>src/main/resources/liquibaseDiff.properties</propertyFile>
<promptOnNonLocalDatabase>false</promptOnNonLocalDatabase>
</configuration>
</execution>
</executions>
</plugin>
However, this configuration is wrong, because for every goal (diff, update, or the others) only the liquibaseDiff.properties file is used.
Is there any way to pass different configurations for different goals in Maven?
Configuration of plugins can be done in two different locations:
Globally, for all executions. The global configuration is done with the <configuration> element under <plugin>. This configuration is inherited by all executions.
Per execution. This is done using the <configuration> element under <execution>.
In your example, consider this POM:
<plugin>
<groupId>org.liquibase</groupId>
<artifactId>liquibase-maven-plugin</artifactId>
<version>3.4.1</version>
<configuration>
<!-- put the configuration here that is common to all executions -->
</configuration>
<executions>
<execution>
<id>diff</id>
<goals>
<goal>diff</goal>
</goals>
<configuration>
<!-- put the specific configuration of the diff goal here, this will inherit from the global configuration -->
</configuration>
</execution>
<execution>
<id>update</id>
<goals>
<goal>update</goal>
</goals>
<configuration>
<!-- put the specific configuration of the update goal here, this will inherit from the global configuration -->
</configuration>
</execution>
</executions>
</plugin>
The default inheritance behavior is to merge the content of the configuration element according to element name. If the child POM has a particular element, that value becomes the effective value. If the child POM does not have an element, but the parent does, the parent value becomes the effective value.
In case of conflicts, you can control the default inheritance performed by Maven using combine.children and combine.self. Quoting the Maven docs:
combine.children="append" results in the concatenation of parent and child elements, in that order. combine.self="override", on the other hand, completely suppresses parent configuration.
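For instance, to have a child configuration's list items appended to the inherited ones instead of replacing them, you could write something like this in an execution's configuration (the element names are purely illustrative):
<configuration>
  <items combine.children="append">
    <item>value-added-by-this-execution</item>
  </items>
</configuration>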
In addition to this, you need to be aware that when executing a Maven command that directly invokes a goal on the command line, such as mvn liquibase:diff, it creates a new execution with the id default-cli. As such, since the specific configuration above of the diff goal is done in an execution with id diff, it will not be used. This is actually normal: the same goal of the same plugin could be present in multiple execution blocks with different configurations, so which one should be used if the goal is executed on the command line without additional information?
Typically, this situation is solved in two ways:
Execute on the command line a specific execution, i.e. the one you configured. This has been possible since Maven 3.3.1, and you would execute
mvn liquibase:diff#diff
The #diff in the command above refers to the unique <id> of the execution that is configured in the POM.
Bind your execution to a specific phase of the Maven lifecycle, and let it be executed within the normal flow of the lifecycle. This is generally the preferred solution. In the example above, we could, for example, add a <phase>test</phase> in the execution block of the diff execution; Maven will then execute it when the test phase is run during the course of the build.
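A minimal sketch of that phase binding, reusing the diff execution from the POM above:
<execution>
  <id>diff</id>
  <phase>test</phase>
  <goals>
    <goal>diff</goal>
  </goals>
  <configuration>
    <propertyFile>src/main/resources/liquibaseDiff.properties</propertyFile>
  </configuration>
</execution>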
The scenario is this: I have a webapp that I'd like to run dynamically with the tomcat-maven-plugin's tomcat:run goal. The wrinkle is that I have numerous classpath resources that need to differ between the packaged artifact and the one run off a local workstation.
Failed Attempts:
1.) My first attempt was to use the build-helper-maven-plugin, but it won't work because the target configuration files will (inconsistently!) work their way into the packaged WAR archive.
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>1.3</version>
<executions>
<execution>
<id>add-resource</id>
<phase>generate-resources</phase>
<goals>
<goal>add-resource</goal>
</goals>
<configuration>
<resources>
<resource>
<directory>${basedir}/src/main/resources-env/${testEnv}</directory>
<targetPath>${basedir}/target/classes</targetPath>
</resource>
</resources>
</configuration>
</execution>
</executions>
</plugin>
2.) My second attempt was to add the folder (since the files-to-be-deployed aren't present in Tomcat's classpath yet either) to -Djava.ext.dirs, but it has no effect (I actually suspect that this systemProperties element is misconfigured or otherwise not working at all). See:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>tomcat-maven-plugin</artifactId>
<version>1.0-beta-1</version>
<configuration>
<tomcatWebXml>${basedir}/src/main/mock/web.xml</tomcatWebXml>
<systemProperties>
<property>
<name>java.ext.dirs</name>
<value>${basedir}/src/main/resources-env/${testEnv}</value>
</property>
</systemProperties>
<path>/licensing</path>
</configuration>
</plugin>
I'm not sure what to attempt next. The heart of the problem seems to be that this plugin lacks something like Surefire's <additionalClasspathElement> element.
Would the next step be to create a custom catalina.properties and add it to a <configurationDir>? If so, what would catalina.properties need to look like?
Edit: More thorough explanation follows
I understand this question reads somewhat vaguely, so I'll try to elaborate a bit.
My POM uses the webResources functionality of the WAR plugin to copy some environment-specific config files, without using a profile to do it, by copying in a resource directory named /src/main/resources-env/${env} like so:
...
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
...
<configuration>
...
<webResources>
<!-- Copy Environment-specific resources to classes -->
<resource>
<directory>${basedir}/src/main/resources-env/${env}</directory>
<targetPath>WEB-INF/classes</targetPath>
</resource>
</webResources>
</configuration>
</plugin>
This will copy the (default, DEV) environment resources into the package and currently works fine. Note also that because this happens as part of packaging, the tomcat:run goal is never privy to these resources (which is desired, as the environments differ).
So the problem is this: when the dynamic tomcat:run is executed, the application will fail because its classpath (it looks at target/classes) will lack the needed local-workstation environment config files. All I need to do is get those onto the path for Tomcat, but I'd like to do so without adding anything to the command line, and definitely without breaking the build's integrity if someone follows up with a mvn package and doesn't clean first.
I hope this is more clear.
I may be missing something but why don't you declare the required dependencies in a profile and use this profile when running Tomcat? I don't get why you would need to put these resources at Tomcat's classpath level.
UPDATE: I'm editing my answer to cover the comment from the OP itself answering my question above.
You're correct, the files really need to be in the webapp classpath, not Tomcat's. So how could I make a profile that activates automatically for tomcat:run, but without additional cmd line args?
I don't know how to do this without declaring the profile as <activeByDefault> or listing it under the <activeProfiles> (but this is not what I had in mind, I would rather use property activation and call something like mvn tomcat:run -Denv=test, not sure to understand why this is a problem).
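For reference, the property-activated profile I have in mind would look something like this (the profile id and contents are illustrative):
<profiles>
  <profile>
    <id>env-test</id>
    <activation>
      <property>
        <name>env</name>
        <value>test</value>
      </property>
    </activation>
    <!-- resources or plugin configuration needed for tomcat:run go here -->
  </profile>
</profiles>
so that something like mvn tomcat:run -Denv=test picks it up.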
And how should I "declare the dependencies" in the profile while ensuring that subsequent invocations never let them into the packaged WAR via a vanilla mvn package?
If the previously mentioned profile is active by default, then you'll need to exclude it if you don't want it, by calling something like mvn package -P !profile-1. A profile can't be magically deactivated for one particular goal (at least, not to my knowledge).
Actually, my understanding is that you really have two different context here: the "testing" context (where you want to include more things in the WAR) and the "normal" context (where you don't want these things to be included). To be honest, I don't know how you could distinguish these two situations without specifying any additional parameter (either to activate a profile or to deactivate it depending on the context). You must have valid reasons but, as I said, I don't really understand why this is a problem. So maybe profiles are not a solution for your situation. But I'd really like to understand why because this seems to be a typical use case for profiles :)
UPDATE2: Having read your comment to the other answer and your update, I realize that my initial understanding was wrong (I thought you were talking about dependencies in the Maven sense). But I still think that profiles could help you, for example to customize the <resources> as in this blog post (this is just one way to do it; using a property like src/main/resources/${env} in the path is another way to go). But this won't solve all your concerns (like not specifying additional command line params or automagically cleaning the target directory). I don't have any solutions for that.
Add the dependencies element directly to the plugin element.
Here is an example of doing the same with the Jetty plugin from the (still in development) Maven Handbook: http://www.sonatype.com/books/mhandbook-stage/reference/ch06s03.html
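Applied to the Tomcat plugin, that would look something like the following sketch (whether plugin-level dependencies end up visible to the running webapp is the part to verify; the dependency coordinates are placeholders):
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>tomcat-maven-plugin</artifactId>
  <version>1.0-beta-1</version>
  <dependencies>
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>env-config</artifactId>
      <version>1.0</version>
    </dependency>
  </dependencies>
</plugin>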
Vote for http://jira.codehaus.org/browse/MTOMCAT-77 which addresses this need.
Here's the solution I have in place at the moment.
Special thanks to Pascal for the diligent conversation here; ultimately, though, I decided to change how I load my environment-specific config files throughout the goals, and now I believe I'm getting most of what I initially wanted.
I removed the config files from the WAR plugin's <webResources> and the test config from <testResources>, and am now manually managing the resource copying with the maven-resources-plugin, copying the files directly into target/classes at the phase they're needed. This way Tomcat can see the config, but the tests aren't broken by duplicate or differing config files on the path.
It's definitely a mess, but it works. Listing:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-resources-plugin</artifactId>
<version>2.4.1</version>
<executions>
<execution>
<id>copy-env-resources</id>
<phase>process-resources</phase>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<resources>
<resource>
<directory>${basedir}/src/main/resources-env/${env}</directory>
<filtering>true</filtering>
</resource>
</resources>
<outputDirectory>${basedir}/target/classes</outputDirectory>
</configuration>
</execution>
<execution>
<id>copy-testEnv-resources</id>
<phase>process-test-resources</phase>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<resources>
<resource>
<directory>${basedir}/src/main/resources-env/${testEnv}</directory>
<filtering>true</filtering>
</resource>
</resources>
<outputDirectory>${basedir}/target/classes</outputDirectory>
</configuration>
</execution>
<execution>
<id>copy-env-resources-again</id>
<phase>prepare-package</phase>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<resources>
<resource>
<directory>${basedir}/src/main/resources-env/${env}</directory>
<filtering>true</filtering>
</resource>
</resources>
<outputDirectory>${basedir}/target/classes</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
So a mvn clean install will build and test with ${env} and ${testEnv} appropriately. A mvn -Denv=someLocalConfig tomcat:run (which in my case is identical to my default ${testEnv}) will make sure that src/main/resources-env/someLocalConfig gets loaded for Tomcat's dynamic execution, but without requiring that I do a clean before successfully rebuilding.
Like I said, it's messy that I'm rewriting the same cluster of files to the same target location in each phase, but it accomplishes what I meant it to.