In my build.gradle file I need to add the following block:
shadowJar {
mergeServiceFiles()
}
Otherwise the jar does not run properly. What exactly does this do?
I use the Gradle plugin in Eclipse Luna. I create the jar from one Java project, which depends on another one.
mergeServiceFiles is declared here, and its implementation is as follows:
/**
* Syntactic sugar for merging service files in JARs
* @return
*/
public ShadowJar mergeServiceFiles() {
try {
transform(ServiceFileTransformer.class);
} catch (IllegalAccessException e) {
} catch (InstantiationException e) {
}
return this;
}
As you can see, it uses ServiceFileTransformer, which is defined here. From its docs:
Modified from org.apache.maven.plugins.shade.resource.ServiceResourceTransformer.java.
Resources transformer that appends entries in META-INF/services resources into a single resource. For example, if there are several META-INF/services/org.apache.maven.project.ProjectBuilder resources spread across many JARs, the individual entries will all be concatenated into a single META-INF/services/org.apache.maven.project.ProjectBuilder resource packaged into the resultant JAR produced by the shading process.
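In other words, calling mergeServiceFiles() is roughly the same as registering the transformer yourself. A sketch against the plugin's Groovy DSL (the import path matches the com.github.jengelman shadow plugin and may differ in other versions):

import com.github.jengelman.gradle.plugins.shadow.transformers.ServiceFileTransformer

shadowJar {
    // same effect as mergeServiceFiles(): concatenate duplicate
    // META-INF/services entries instead of letting one overwrite the other
    transform(ServiceFileTransformer)
}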
TL;DR - It merges the service files in the META-INF/services folder.
Long Answer
Some libraries (e.g. Micronaut) create service files in the META-INF/services folder. These files can contain any information that is useful at runtime. In the case of the Micronaut framework, it lists its bean references (the beans to be instantiated) in a file called io.micronaut.inject.BeanDefinitionReference under META-INF/services.
Usually, if you just have one application, it works fine even without mergeServiceFiles(). But if you are bundling two Micronaut projects into a single jar (e.g. a Micronaut lib project with some utils and a Micronaut app project with the core business logic), you will have two io.micronaut.inject.BeanDefinitionReference files, each containing the beans of its own project.
If you don't use mergeServiceFiles(), one of the BeanDefinitionReference files will be overwritten by the other. In that case, you will get a runtime exception saying BeanNotInstantiated or something of that sort.
Using mergeServiceFiles() merges (or concatenates in this case) the BeanDefinitionReference files of both the projects so that at runtime, you get all the beans defined.
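To make this concrete, here is a sketch with hypothetical class names. Suppose the lib JAR's META-INF/services/io.micronaut.inject.BeanDefinitionReference contains:
com.example.lib.$UtilsDefinitionReference
and the app JAR's copy of the same file contains:
com.example.app.$OrderServiceDefinitionReference
Without mergeServiceFiles(), only one of the two files survives in the shaded jar. With it, the resulting file is the concatenation of both:
com.example.lib.$UtilsDefinitionReference
com.example.app.$OrderServiceDefinitionReference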
More details can be found in the gradle forum topic here.
Related
In my Maven project, I've got an XML file in resources. Depending on some input parameter, I want the file to be adapted before it is packaged into a jar or war. Of course, the original file shall not be touched.
It is not an option to create multiple XML files and select a suitable one, for example with Spring profiles, as there can be numerous combinations of contents in the XML file.
So I thought of creating a Maven plugin that manipulates the file before packaging. Probably I need to manipulate the file after Maven has copied it to the target folder but before Maven packages it into the jar/war.
import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.plugins.annotations.LifecyclePhase;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.project.MavenProject;

@Mojo(name = "manipulate-xml", defaultPhase = LifecyclePhase.PREPARE_PACKAGE)
public class MyMojo extends AbstractMojo {
    @Parameter(defaultValue = "${project}", required = true, readonly = true)
    MavenProject project;

    @Parameter(property = "option")
    String option;

    public void execute() throws MojoExecutionException {
        if (option.equals("optionA")) {
            // get file from target and manipulate
        } else if (option.equals("optionB")) {
            // get file from target and manipulate
        }
    }
}
Then I could embed the Maven plugin into my project and build the project with
mvn clean package -Doption=optionA
However, now I am stuck: I do not know how to get the file from target, or even whether this is the right approach.
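I imagine something along these lines, but I am not sure (config.xml is just a placeholder for the real resource name):

// Sketch: resolve the copied resource under target/classes, where the
// resources plugin places it during process-resources.
// "config.xml" is a placeholder for the real file name.
java.nio.file.Path xml = java.nio.file.Paths.get(
        project.getBuild().getOutputDirectory(), "config.xml");
if (!java.nio.file.Files.exists(xml)) {
    throw new MojoExecutionException("Resource not found: " + xml);
}
// read, manipulate and write back here, before the package phase runs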
Besides, is it possible during the packaging to prevent some dependencies from being packaged into the jar/war?
I appreciate any help.
Depending on what "manipulating" means, you can use the capabilities of the Maven Resources Plugin (https://maven.apache.org/plugins/maven-resources-plugin/index.html).
If you only need to modify some simple values inside the XML, use property placeholders in the XML and let the resources plugin replace them during the build. The values can either be set in the pom.xml or passed to Maven via -Dproperty=value.
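For that route, a minimal pom.xml sketch (the property name my.option and the placeholder are made up for illustration):

<build>
  <resources>
    <resource>
      <directory>src/main/resources</directory>
      <!-- enables ${...} placeholder replacement while copying resources -->
      <filtering>true</filtering>
    </resource>
  </resources>
</build>

With e.g. <option>${my.option}</option> in the XML resource, the build would then be invoked as mvn clean package -Dmy.option=optionA.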
If you want to select different files, define multiple Maven profiles; in each one, configure the resources plugin to copy only the wanted files, and then select the correct profile in the build.
If the built-in possibilities are not enough, you could even write your own filter for the resources plugin; that might be easier than writing a full-fledged custom Maven plugin.
I use package-info.java to specify @XmlAccessorType(XmlAccessType.NONE) and some XML Java adapters using @XmlJavaTypeAdapters. The model objects (with JAXB annotations) are placed in a separate Maven module shared by other modules. The configuration in package-info.java is not discovered if the model objects are in a separate Maven module; if, for testing purposes, I move the model objects into the same Maven module, everything is OK. I think a separate Maven module can be considered equivalent to a 3rd-party lib from the JAXBContext point of view. I use the JDK 1.7 JAXB reference implementation. Any ideas how the configuration may differ?
I also encountered this problem; in my case the qualified/unqualified property from package-info.java was ignored. I managed to find two ways to work around this:
like Pavla wrote, copy all JAXB classes together with package-info.java locally
include the module as a dependency with compile scope (which gives a similar result to the classes being in the module; in my case I created a separate jar lib with the JAXB classes)
I also noticed that it fails only when creating web services (creating an object and sending it to a WS works fine across different modules).
I am using JBoss AS 7.1.1 and CXF 2.4.6. At service registration time, JBoss generates the WSDL from JAXB (in my case at /opt/jboss/jboss-as-7.1.1.Final/standalone/data/wsdl/module.war/SubmitMessage.wsdl). In my local setup the file is generated properly.
Any ideas why creating a WS behaves like this?
I hit this issue recently, and the actual problem (with Java 8, i.e. no Java modules involved) was that I had two *.jar files on the classpath which both contained the same package: in one JAR there was a package-info.class with JAXB annotations, and in the other one there wasn't.
In that case, I guess, whether the package-info.class file is discovered depends on the classpath ordering (which is very brittle and only semi-deterministic).
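For what it's worth, a quick diagnostic sketch (SomeModelClass is a placeholder for any class from the affected package):

// Prints the JAXB config actually visible for the package; null means the
// annotation-less package-info.class (or none at all) won on the classpath.
// SomeModelClass stands in for any class in the affected package.
Package pkg = SomeModelClass.class.getPackage();
System.out.println(pkg.getAnnotation(javax.xml.bind.annotation.XmlAccessorType.class));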
I have a WAR application that includes a JAR library. The JAR library contains the batch job and the batch artifacts (META-INF/batch-jobs/...). The WAR app includes this jar as a library and defines a JAX-RS service that allows clients to invoke the batch job through the JobOperator interface...
When I run this deployment, the JSR 352 implementation (JBeret) keeps complaining that the job cannot be found anywhere when the JobOperator interface is called... However, if the batch job and the batch artifacts are included as classes of the WAR deployment, everything runs smoothly...
So, what is the problem?
After a "little" research, i found the answer (dispersed) in the following links:
Wildfly Issues
Mailing list
Briefly, in order to make this kind of deployment work, you have to modify the deployment that calls the JobOperator interface to invoke the requested job (in my case, the WAR file)... These are the modifications:
Include an "empty" batch-jobs folder under the META-INF folder. (I guess truly empty is optional; I had to put a README file in that folder to prevent Git from removing it.)
Define a service-provider file under the META-INF/services folder. This file must be called org.jberet.spi.JobXmlResolver and should contain the following implementation as its content: org.jberet.tools.MetaInfBatchJobsJobXmlResolver, as shown below.
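That is, the file is a plain one-line text file:

META-INF/services/org.jberet.spi.JobXmlResolver:
org.jberet.tools.MetaInfBatchJobsJobXmlResolver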
That's all.
The WildFly issue (https://issues.jboss.org/browse/WFLY-7000, similar to the one mentioned above, but a different one) has been fixed and should address your point 1 (having to use an empty batch-jobs/ directory).
I am migrating an EAR application from Log4J 1.2.17 to Log4J2 2.4. Please find below the EAR structure.
EAR
-- APPLICATION JAR 1 (contains custom plugin)
-- APPLICATION JAR 2
-- APPLICATION JAR 3 (contains custom plugin)
-- APPLICATION JAR 4
-- APPLICATION WAR 1
-- APPLICATION WAR 2
-- APPLICATION WAR 3
-- OTHER THIRD PARTY APIs
-- lib/log4j-api-2.4.jar
-- lib/log4j-core-2.4.jar
-- lib/log4j-jcl-2.4.jar
-- lib/log4j-web-2.4.1.jar
-- META-INF/log4j2.xml
-- META-INF/MANIFEST.MF (contains all jars in class-path entry)
Custom plugin classes in all the jars are in the same package - com.test.it.logging.
Please find below the initialization code.
Adding the custom plugins package.
PluginManager.addPackage("com.test,it.logging");
Initializing the logging configuration using log4j2.xml.
String path = "path/log4j2.xml";
System.setProperty("log4j.configurationFile", path);
None of the defined custom plugins are detected, and I have tried every available combination of initializing log4j2.xml and the plugins, but nothing worked.
It makes me feel that custom plugins do not work at all in an EAR, as I have tried all the permutations and combinations. Is this a bug in Log4j2 (version 2.4)? If not, then please guide me on how to define a logging configuration for an EAR whose custom plugins are scattered across many jars.
Can anyone please let me know how to configure this?
Also, please find below my question posted on Stack Overflow about the same issue:
Custom plugin not getting detected in EAR with log4j2 API
I am using Wildfly 8.2.0-Final AS and maven for building EAR.
Just adding a note that I always find the Log4j2Plugins.dat file inside the jars containing custom plugins, irrespective of the options I try for plugin detection.
Your response is highly important to me. Thanks.
I don't believe the log4j classes have visibility into the classloaders for the war and application jars.
When compiling a custom Plugin, the Log4J pom.xml defines a plugin that automatically generates cache data in the file META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat
You can see this under your target/classes in a Maven project.
The log4j-core-2.x.x.jar also contains a Log4j2Plugins.dat defining its cache data.
The problem is that a single JAR is created when testing an EAR using ShrinkWrap, and normally the Log4j2Plugins.dat from log4j-core-2.x.x.jar is the one added to the test JAR, as it would most likely be first on the class path.
This means your custom plugin cache is missing.
The solution using ShrinkWrap is to create a new Log4j2Plugins.dat, merging any required custom plugin cache files with the core's, and then adding that to the JAR.
The following function achieves that...
private static void mergeLog4J2Log4j2PluginsFile(JavaArchive ja, Class... uniqueJARClasses) {
// @Author: Johnathan Ingram <jingram@rogueware.org>
// Log4J2 uses /META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat within a JAR to define custom plugins
// This is automatically generated by the plugin defined in the log4j-core-2.x.x pom.xml when compiling your custom plugin
// The problem with shrinkwrap is that the JAR is not preserved and only a single
// /META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat
// file can exist as JAR files cannot be added to a JAR file as a library.
// This is normally the default contained in log4j-core-2.x.x.jar which does not expose any custom plugins
// To rectify, both the core and the custom plugin JAR file Log4j2Plugins.dat need to be merged into a single Log4j2Plugins.dat
try {
// List of a unique class in each JAR containing a Log4j2Plugins.dat requiring merging
Vector<URL> datUrls = new Vector<URL>();
for (Class klass : uniqueJARClasses) {
// Find the JAR the class belongs to
URL classLoc = klass.getProtectionDomain().getCodeSource().getLocation();
URL resourceURL = classLoc.toString().endsWith(".jar")
? new URL("jar:" + URLDecoder.decode(classLoc.toString(), "UTF-8") + "!/META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat")
: new URL(URLDecoder.decode(classLoc.toString(), "UTF-8") + "/META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat");
datUrls.add(resourceURL);
}
// Use the Log4J2 PluginCache to build a merged Log4j2Plugins.dat
File mergedDatFile = new File("target/Log4j2Plugins.dat");
try (FileOutputStream fo = new FileOutputStream(mergedDatFile)) {
org.apache.logging.log4j.core.config.plugins.processor.PluginCache pc = new org.apache.logging.log4j.core.config.plugins.processor.PluginCache();
pc.loadCacheFiles(datUrls.elements());
pc.writeCache(fo);
}
// Replace the default Log4j2Plugins.dat if present
ja.delete("/META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat");
ja.addAsManifestResource(mergedDatFile, "org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat");
} catch (Exception ex) {
ex.printStackTrace(System.err);
}
}
To run:
JavaArchive ja = ShrinkWrap.create(JavaArchive.class, "my-test.jar");
...
mergeLog4J2Log4j2PluginsFile(ja, org.apache.logging.log4j.core.config.plugins.processor.PluginCache.class, MyCustomPlugin.class);
I have an ear file that will have root deployments (deploy configuration), lib deployments (earlib configuration) and an additional custom utilJars configuration that I want to place in a utilJars folder at the root of the ear. I am aware that the first two configurations are handled automatically by the Ear task.
How can I add an additional CopySpec to the Ear task (or any AbstractCopyTask for that matter) to handle the third configuration?
My understanding of copy specs was erroneous. Instead of copy specs existing side by side, they exist in a hierarchy, as explained in section 16.6.3 of the Gradle user guide.
Thus additional copy specs may be 'nested' within the root specification of the task. These nested specs inherit from the parent spec unless otherwise specified.
E.g. in the following task, the root spec contains an into, an exclude and a nested from spec. Inside the from spec there is an include, which applies only within that child spec and does not affect the root spec. The nested into('libs') spec, however, overrides the destination it inherits: it copies everything from the 'runtime' configuration into the libs folder, but nothing from the src/dist folder.
task nestedSpecs(type: Copy) {
into 'build/explodedWar'
exclude '**/*staging*'
from('src/dist') {
include '**/*.html'
}
into('libs') {
from configurations.runtime
}
}
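Applied to the original question, the same nesting on the Ear task would look something like this (a sketch, assuming a utilJars configuration has been declared in the build):

ear {
    // nested child spec: inherits the Ear task's root destination and
    // redirects these files into a utilJars folder inside the ear
    into('utilJars') {
        from configurations.utilJars
    }
}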
I hope this helps someone else :)