In my Maven project, I've got an XML file in the resources. Depending on some input parameter, I want the file to be adapted before it is packaged into a JAR or WAR. Of course, the original file must not be touched.
Creating multiple XML files and selecting a suitable one (for example, via Spring profiles) is not an option, as there can be numerous combinations of contents in the XML file.
So I thought of creating a Maven plugin that manipulates the file before packaging. Presumably, I need to manipulate the file after Maven has copied it to the target folder, but before Maven packages it into the JAR/WAR.
@Mojo(name = "manipulate-xml", defaultPhase = LifecyclePhase.PREPARE_PACKAGE)
public class MyMojo extends AbstractMojo {

    @Parameter(defaultValue = "${project}", required = true, readonly = true)
    MavenProject project;

    @Parameter(property = "option")
    String option;

    public void execute() throws MojoExecutionException {
        if (option.equals("optionA")) {
            // get file from target and manipulate
        } else if (option.equals("optionB")) {
            // get file from target and manipulate
        }
    }
}
Then I could bind the Maven plugin into my project's build and run it with
mvn clean package -Doption=optionA
However, now I am stuck: I do not know how to get the file from target, or even whether this is the right approach.
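For reference, the copied file lives under the build output directory, which the injected MavenProject exposes; a minimal sketch of locating it inside execute(), assuming a hypothetical resource name config.xml:

// Sketch: resolve the copied resource under target/classes rather than
// touching the original in src/main/resources.
File outputDir = new File(project.getBuild().getOutputDirectory());
File xmlFile = new File(outputDir, "config.xml"); // hypothetical file name
if (!xmlFile.isFile()) {
    throw new MojoExecutionException("Resource not found: " + xmlFile);
}
// read, transform, and rewrite xmlFile in place; the source file stays untouched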
Besides, is it possible to prevent some dependencies from being packaged into the JAR/WAR during packaging?
I appreciate any help.
Depending on what "manipulating" means, you can use the capabilities of the Maven Resources Plugin (https://maven.apache.org/plugins/maven-resources-plugin/index.html).
If you only need to modify some simple values inside the XML, use properties in the XML and let the resources plugin replace them during the build. The values can either be set in the pom.xml or passed to Maven via -Dproperty=value.
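For example, a minimal sketch (the property name app.option and the file name config.xml are made up): the resource contains a placeholder, and the pom enables filtering so the placeholder is replaced at build time.

<!-- src/main/resources/config.xml -->
<config>
    <option>${app.option}</option>
</config>

<!-- pom.xml: enable filtering for the resources -->
<build>
    <resources>
        <resource>
            <directory>src/main/resources</directory>
            <filtering>true</filtering>
        </resource>
    </resources>
</build>

Building with mvn clean package -Dapp.option=optionA then writes optionA into the packaged copy, while src/main/resources stays untouched.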
If you want to select different files, define multiple Maven profiles; in each one, configure the resources plugin to copy only the wanted files, and then select the correct profile for the build.
If the built-in possibilities are not enough, you might even write your own filter for the resources plugin; that might be easier than writing a full-fledged custom Maven plugin.
Related
I'm currently writing a custom Maven plugin for generating an XML file in a multi-module Maven project.
My Maven structure is pretty standard: one parent project and one module per project component in the parent project folder:
-- Parent
   -- module A
   -- module B
   -- module C
I need to list, by module, a set of classes flagged by a custom annotation.
I already wrote a set of custom annotations and an annotation processor to create an XML file at compile time in the corresponding module's output directory (${project.build.outputDirectory}).
Now I need to merge each module's XML into one file, but I don't know how to access the modules from within my Maven plugin, short of having each path passed in as a parameter (I don't like this method).
Any idea how to do this?
Can Maven plugins traverse the project's modules?
Thank you in advance.
To get the list of all projects, you can use the following (where session is the injected MavenSession):
List<MavenProject> projectList = session.getProjectDependencyGraph().getSortedProjects();
If one of your goals is executed in the build, you will get everything you need there. Every MavenProject offers getBasedir(), etc.
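For example, a minimal sketch of an aggregator Mojo that walks all modules (the goal name and the file name generated.xml are made up; session is the standard injected MavenSession):

import java.io.File;
import org.apache.maven.execution.MavenSession;
import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.plugins.annotations.LifecyclePhase;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.project.MavenProject;

@Mojo(name = "merge-xml", defaultPhase = LifecyclePhase.PACKAGE, aggregator = true)
public class MergeXmlMojo extends AbstractMojo {

    @Parameter(defaultValue = "${session}", readonly = true, required = true)
    private MavenSession session;

    public void execute() throws MojoExecutionException {
        for (MavenProject p : session.getProjectDependencyGraph().getSortedProjects()) {
            // hypothetical: each module's annotation processor wrote its XML here
            File generated = new File(p.getBuild().getOutputDirectory(), "generated.xml");
            getLog().info(p.getArtifactId() + " -> " + generated);
            // ... merge the per-module files into one result
        }
    }
}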
After some research, it seems that MavenProject.getCollectedProjects() returns the list of projects being processed by a goal execution in a multi-module project.
I am migrating an EAR application from Log4j 1.2.17 to Log4j 2 (version 2.4). Please find below the EAR structure.
EAR
-- APPLICATION JAR 1 (contains custom plugin)
-- APPLICATION JAR 2
-- APPLICATION JAR 3 (contains custom plugin)
-- APPLICATION JAR 4
-- APPLICATION WAR 1
-- APPLICATION WAR 2
-- APPLICATION WAR 3
-- OTHER THIRD PARTY APIs
-- lib/log4j-api-2.4.jar
-- lib/log4j-core-2.4.jar
-- lib/log4j-jcl-2.4.jar
-- lib/log4j-web-2.4.1.jar
-- META-INF/log4j2.xml
-- META-INF/MANIFEST.MF (contains all jars in class-path entry)
Custom plugin classes in all the jars are in the same package - com.test.it.logging.
Please find below the initialization code.
Adding the custom plugins package:
PluginManager.addPackage("com.test.it.logging");
Initializing the logging configuration using log4j2.xml.
String path = "path/log4j2.xml";
System.setProperty("log4j.configurationFile", path);
None of the defined custom plugins are getting detected. I have tried every available combination of initializing log4j2.xml and the plugins, but nothing has worked.
It feels as if custom plugins do not work in an EAR at all, as I have tried all the permutations and combinations. Is this a bug in Log4j 2 (version 2.4)? If not, please guide me on how to define a logging configuration for an EAR whose custom plugins are scattered across many JARs inside it.
Can anyone please let me know how to configure this?
Also, please see my question posted on Stack Overflow about the same issue:
Custom plugin not getting detected in EAR with log4j2 API
I am using WildFly 8.2.0-Final as the application server and Maven for building the EAR.
Just a note: I always find a Log4j2Plugins.dat file inside the JARs containing custom plugins, irrespective of the options I try for plugin detection.
Your response is highly important to me. Thanks.
I don't believe the Log4j classes have visibility into the class loaders for the WARs and application JARs.
When compiling a custom plugin, the log4j-core pom.xml defines an annotation processor that automatically generates plugin cache data in the file META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat.
You can see this under your target/classes in a Maven project.
The log4j-core-2.x.x.jar also contains a Log4j2Plugins.dat defining its cache data.
The problem is that when testing an EAR with ShrinkWrap, a single JAR is created, and normally the Log4j2Plugins.dat from log4j-core-2.x.x.jar is the one that ends up in the test JAR, as it would most likely be first on the class path.
This means your custom plugin cache is missing.
The solution with ShrinkWrap is to create a new Log4j2Plugins.dat that merges any required custom plugin cache files with the core's, and then add that to the JAR.
The following function achieves that...
private static void mergeLog4J2Log4j2PluginsFile(JavaArchive ja, Class... uniqueJARClasses) {
    // @author Johnathan Ingram <jingram@rogueware.org>
    // Log4j 2 uses /META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat within a JAR to define custom plugins.
    // This file is automatically generated by the annotation processor defined in the log4j-core-2.x.x pom.xml when compiling your custom plugin.
    // The problem with ShrinkWrap is that the JAR is not preserved and only a single
    //   /META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat
    // file can exist, as JAR files cannot be added to a JAR file as a library.
    // This is normally the default contained in log4j-core-2.x.x.jar, which does not expose any custom plugins.
    // To rectify this, both the core and the custom plugin JARs' Log4j2Plugins.dat files need to be merged into a single Log4j2Plugins.dat.
    try {
        // List of a unique class in each JAR containing a Log4j2Plugins.dat requiring merging
        Vector<URL> datUrls = new Vector<URL>(); // Vector because PluginCache.loadCacheFiles() expects an Enumeration
        for (Class klass : uniqueJARClasses) {
            // Find the JAR the class belongs to
            URL classLoc = klass.getProtectionDomain().getCodeSource().getLocation();
            URL resourceURL = classLoc.toString().endsWith(".jar")
                    ? new URL("jar:" + URLDecoder.decode(classLoc.toString(), "UTF-8") + "!/META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat")
                    : new URL(URLDecoder.decode(classLoc.toString(), "UTF-8") + "/META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat");
            datUrls.add(resourceURL);
        }

        // Use the Log4j 2 PluginCache to build a merged Log4j2Plugins.dat
        File mergedDatFile = new File("target/Log4j2Plugins.dat");
        try (FileOutputStream fo = new FileOutputStream(mergedDatFile)) {
            org.apache.logging.log4j.core.config.plugins.processor.PluginCache pc =
                    new org.apache.logging.log4j.core.config.plugins.processor.PluginCache();
            pc.loadCacheFiles(datUrls.elements());
            pc.writeCache(fo);
        }

        // Replace the default Log4j2Plugins.dat if present
        ja.delete("/META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat");
        ja.addAsManifestResource(mergedDatFile, "org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat");
    } catch (Exception ex) {
        ex.printStackTrace(System.err);
    }
}
To run:
JavaArchive ja = ShrinkWrap.create(JavaArchive.class, "my-test.jar");
...
mergeLog4J2Log4j2PluginsFile(ja, org.apache.logging.log4j.core.config.plugins.processor.PluginCache.class, MyCustomPlugin.class);
I've been tasked with writing some base classes for automated integration tests for an existing project, and I've run into a snag with project dependencies.
The (abstracted) project layout is this:
project/
    MainProject
    Plugins/
        ...
    ConfigurationProject
    IntegrationTestProject
IntegrationTestProject has a dependency upon ConfigurationProject, the latter having the following layout:
ConfigurationProject/
    PluginConfigs/
        <plugin configuration files>
    <main configuration files>
Notably, the main configuration files reside in the root of the project. In an attempt to add them to my classpath, my primary build.gradle has the following:
project('ConfigurationProject') {
description = 'Configuration'
sourceSets.main.resources.srcDir projectDir
}
This seems to be okay: Eclipse shows all files in the root as part of the project resources, and assembling packages everything up as expected.
However, when I actually run the integration tests, the ConfigurationProject resources do not seem to be on the classpath: the tests fail to pull config information, which is further confirmed by the absence of ConfigurationProject in the output of this snippet:
public void classpathScanner() {
    ClassLoader c = getClass().getClassLoader();
    System.out.println("c=" + c);
    URLClassLoader u = (URLClassLoader) c;
    URL[] urls = u.getURLs();
    for (URL i : urls) {
        System.out.println("url: " + i);
    }
}
The ConfigurationProject is included in IntegrationTestProject via the IntegrationTestProject's build.gradle:
dependencies {
compile project(':ConfigurationProject')
}
I have only observed this problem when adding the project root as a resource; adding subfolders of a project to sourceSets seems fine (and is used elsewhere in this project). Moving the configuration files to a subfolder is an option I have considered and will enact if I find no other solution, but I wanted to see if there were options that did not involve this course of action.
The answer to this is essentially: don't make the project root a resource folder; Gradle hates that, and even if it didn't, it's bad form anyway. I fixed this by moving the configuration files to a more sensible resource path.
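For illustration, a sketch of the fix in the ConfigurationProject block (the folder name src/main/config is made up):

project('ConfigurationProject') {
    description = 'Configuration'
    // point at a dedicated folder instead of the project root
    sourceSets.main.resources.srcDir 'src/main/config'
}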
How can you get all the dependencies of a MavenProject (including transitive ones) using Aether?
I have seen numerous examples where you specify the GAV coordinates (groupId:artifactId:version) and it resolves the artifact and all its dependencies. That is all fine. However, if your plugin is supposed to be invoked from the same project whose dependencies you're trying to resolve, this does not seem to work (or perhaps I am doing it wrong). Could somebody please give me a working example of how to do it?
I have tried the example with jcabi-aether shown in this SO post.
Try the utility class Classpath from jcabi-aether:
Collection<File> jars = new Classpath(
    this.getProject(),
    new File(this.session.getLocalRepository().getBasedir()),
    "test" // the scope you're interested in
);
You will get the list of JARs and directories that are in the "test" scope of the current Maven project your plugin runs in.
If you would rather get a list of Artifacts instead of Files, use the Aether class directly:
Aether aether = new Aether(this.getProject(), repo);
Set<Artifact> artifacts = new HashSet<Artifact>();
for (Artifact dep : this.getProject().getDependencyArtifacts()) {
    artifacts.addAll(aether.resolve(dep, JavaScopes.COMPILE));
}
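For context, both snippets assume the Mojo has the standard project and session values injected; a sketch of those declarations (the getter is illustrative):

@Parameter(defaultValue = "${project}", readonly = true, required = true)
private MavenProject project;   // what this.getProject() returns

@Parameter(defaultValue = "${session}", readonly = true, required = true)
private MavenSession session;   // provides session.getLocalRepository()

protected MavenProject getProject() {
    return project;
}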
I have a multi-module Maven project, including a separate assembly project. As I develop and run my application from Eclipse (during development), I have specific configuration files (e.g. log4j or other property files) in my main module (which contains the main class). These files contain development-specific settings. The assembly project contains each of the config files for production; the assembled product should use these configs instead. This is my current setup:
MainModule/src/main/resources
+configA.properties
+log4j.properties
Module1/src/main/resources
+configB.properties
AssemblyProj/src/main/resources
+configA.properties
+configB.properties
+log4j.properties
And the generated project has this structure:
libs/
+MainModule.jar
+Module1.jar
configs/
+configA.properties
+configB.properties
+log4j.properties
The configs/ directory overlays the config files in each *.jar because of the classpath ordering, i.e.
java -cp configs/;libs/* My.Main.Class
Now the problem I have is that all the dev configs are still included in each JAR. I also have a bad feeling about using this classpath-overlay method. Is there an accepted practice for doing this in a better way?
Extract these resources into classifier-based dependencies for each of the mentioned modules. Then define <profiles/> that trigger their usage. In your assembly, use the classifiers as necessary.
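A sketch of the idea (all coordinates, profile ids, and classifier names are made up): publish the config files as secondary artifacts with classifiers, then let a profile choose which classifier the assembly consumes.

<profiles>
    <profile>
        <id>dev</id>
        <activation><activeByDefault>true</activeByDefault></activation>
        <properties>
            <config.classifier>dev-config</config.classifier>
        </properties>
    </profile>
    <profile>
        <id>prod</id>
        <properties>
            <config.classifier>prod-config</config.classifier>
        </properties>
    </profile>
</profiles>

<dependencies>
    <dependency>
        <groupId>com.example</groupId>
        <artifactId>main-module</artifactId>
        <version>${project.version}</version>
        <classifier>${config.classifier}</classifier>
    </dependency>
</dependencies>

Building with mvn package -Pprod then pulls the production config artifact into the assembly instead of the development one.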