I am migrating an EAR application from Log4J 1.2.17 to Log4J2 2.4. Please find below the EAR structure.
EAR
-- APPLICATION JAR 1 (contains custom plugin)
-- APPLICATION JAR 2
-- APPLICATION JAR 3 (contains custom plugin)
-- APPLICATION JAR 4
-- APPLICATION WAR 1
-- APPLICATION WAR 2
-- APPLICATION WAR 3
-- OTHER THIRD PARTY APIs
-- lib/log4j-api-2.4.jar
-- lib/log4j-core-2.4.jar
-- lib/log4j-jcl-2.4.jar
-- lib/log4j-web-2.4.1.jar
-- META-INF/log4j2.xml
-- META-INF/MANIFEST.MF (contains all jars in class-path entry)
Custom plugin classes in all the JARs are in the same package, com.test.it.logging.
Below is the initialization code.
Adding the custom plugins package:
PluginManager.addPackage("com.test.it.logging");
Initializing the logging configuration using log4j2.xml:
String path = "path/log4j2.xml";
System.setProperty("log4j.configurationFile", path);
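For reference, a minimal sketch of the kind of log4j2.xml being loaded here (the appender and layout are illustrative; the packages attribute on Configuration is another way to point Log4j2 at custom plugin packages):
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative minimal configuration; packages lists custom plugin packages -->
<Configuration status="warn" packages="com.test.it.logging">
    <Appenders>
        <Console name="Console" target="SYSTEM_OUT">
            <PatternLayout pattern="%d{ISO8601} %-5p [%c] %m%n"/>
        </Console>
    </Appenders>
    <Loggers>
        <Root level="info">
            <AppenderRef ref="Console"/>
        </Root>
    </Loggers>
</Configuration>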
None of the defined custom plugins are detected, and I have tried every available combination for initializing log4j2.xml and the plugins, but nothing worked.
It feels as though custom plugins do not work at all in an EAR, since I have tried all the permutations and combinations. Is this a bug in Log4j2 (version 2.4)? If not, then please guide me on how to define a logging configuration with custom plugins that are scattered across many JARs within an EAR.
Also, below is my question posted on Stack Overflow on the same topic:
Custom plugin not getting detected in EAR with log4j2 API
I am using WildFly 8.2.0-Final AS, and Maven for building the EAR.
Just adding a note that I always find a Log4j2Plugins.dat file inside the JARs containing custom plugins, irrespective of the options I try for plugin detection.
Your response is highly important to me; thanks.
I don't believe the log4j classes have visibility into the classloaders for the WAR and application JARs.
When compiling a custom plugin, the Log4j pom.xml defines a build plugin that automatically generates cache data in the file META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat.
You can see this under your target/classes in a Maven project.
The log4j-core-2.x.x.jar also contains a Log4j2Plugins.dat defining its cache data.
The problem is that a single JAR is created when testing an EAR with ShrinkWrap, and normally the Log4j2Plugins.dat from log4j-core-2.x.x.jar is the one added to the test JAR, as it would most likely be first on the class path.
This means your custom plugin cache is missing.
The solution with ShrinkWrap is to create a new Log4j2Plugins.dat that merges any required custom plugin cache files with the core's, and then add that to the JAR.
The following function achieves that:
// Requires: java.io.File, java.io.FileOutputStream, java.net.URL,
// java.net.URLDecoder, java.util.Vector, org.jboss.shrinkwrap.api.spec.JavaArchive
private static void mergeLog4J2Log4j2PluginsFile(JavaArchive ja, Class... uniqueJARClasses) {
    // Author: Johnathan Ingram <jingram@rogueware.org>
    // Log4j2 uses /META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat
    // within a JAR to define custom plugins.
    // This is automatically generated by the plugin defined in the log4j-core-2.x.x
    // pom.xml when compiling your custom plugin.
    // The problem with ShrinkWrap is that the JAR is not preserved and only a single
    // /META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat
    // file can exist, as JAR files cannot be added to a JAR file as a library.
    // This is normally the default contained in log4j-core-2.x.x.jar, which does not
    // expose any custom plugins.
    // To rectify this, both the core and the custom plugin JARs' Log4j2Plugins.dat
    // need to be merged into a single Log4j2Plugins.dat.
    try {
        // List of a unique class in each JAR containing a Log4j2Plugins.dat requiring merging
        Vector<URL> datUrls = new Vector<URL>();
        for (Class klass : uniqueJARClasses) {
            // Find the JAR the class belongs to
            URL classLoc = klass.getProtectionDomain().getCodeSource().getLocation();
            URL resourceURL = classLoc.toString().endsWith(".jar")
                    ? new URL("jar:" + URLDecoder.decode(classLoc.toString(), "UTF-8") + "!/META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat")
                    : new URL(URLDecoder.decode(classLoc.toString(), "UTF-8") + "/META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat");
            datUrls.add(resourceURL);
        }
        // Use the Log4j2 PluginCache to build a merged Log4j2Plugins.dat
        File mergedDatFile = new File("target/Log4j2Plugins.dat");
        try (FileOutputStream fo = new FileOutputStream(mergedDatFile)) {
            org.apache.logging.log4j.core.config.plugins.processor.PluginCache pc =
                    new org.apache.logging.log4j.core.config.plugins.processor.PluginCache();
            pc.loadCacheFiles(datUrls.elements());
            pc.writeCache(fo);
        }
        // Replace the default Log4j2Plugins.dat if present
        ja.delete("/META-INF/org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat");
        ja.addAsManifestResource(mergedDatFile, "org/apache/logging/log4j/core/config/plugins/Log4j2Plugins.dat");
    } catch (Exception ex) {
        ex.printStackTrace(System.err);
    }
}
To run:
JavaArchive ja = ShrinkWrap.create(JavaArchive.class, "my-test.jar");
...
mergeLog4J2Log4j2PluginsFile(ja, org.apache.logging.log4j.core.config.plugins.processor.PluginCache.class, MyCustomPlugin.class);
Related
I've got a Spring Boot application built as a multi-module Gradle project (old-style, not fancy Jigsaw).
What I want to achieve is to replace the Java platform loggers (e.g., SSLLogger / System.getLogger) with the slf4j and Logback that are used in my app and managed by a spring-boot-admin server at runtime. I need all my loggers to write to a file instead of the console, or else I won't see logs in Logz.io.
I do not control how my app is deployed, but I do control the way the fat JAR is built (so manuals with terminal commands 'java - ...' are not very helpful :( ).
I started to follow the https://www.baeldung.com/java-9-logging-api guide, but got stuck.
For simplicity, my structure is:
build.gradle
/application-module
build.gradle (combines 3 other modules)
/src /...
/rest-module
build.gradle
/src /...
/service-module
build.gradle
/src /...
/persistency-module
build.gradle
/src /...
So, I want to add one more module
/log-module
/src -> with actual classes
module-info.java
Slf4jLogger implements System.Logger
Slf4jLoggerFinder extends System.LoggerFinder
and include it into my application-module.
But when trying to build it all, I get 'error: module not found: org.slf4j', and the app does not build.
So, what am I doing wrong? What additional plugins/configuration do I need? And will it even let me achieve my goal?
Okay, I managed to find the solution. It's a combination of
https://www.baeldung.com/java-9-logging-api
https://www.baeldung.com/java-spi
So I didn't even need Jigsaw modules, only the JDK's service provider mechanism.
In fact, you need 3 files [these are IDs of pastebin samples, but they are almost the same as in the java-9-logging-api article]:
AkXY3zgu -> adapter class
YFUkZwat -> logger provider
CD6NNibj -> META-INF file, and that's the trickiest part (the file name :) )
file name -> META-INF/services/java.lang.System$LoggerFinder
its content is the provider class name: com.my-projects.commons.logs.Slf4jLoggerFinder
And now, on regular app startup, the system logger will be replaced with the slf4j adapter.
But still, check how the system logger you care about is created; for example, I mostly need SSLLogger, and there is some system-property-based logic there...
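For completeness, a minimal sketch of what the adapter and provider classes might look like (the pastebin samples are not reproduced here, so treat this as an illustration of the pattern, not the original code; class names follow the question):
// Slf4jLogger.java -- adapter routing java.lang.System.Logger calls to SLF4J
import java.text.MessageFormat;
import java.util.ResourceBundle;
import org.slf4j.LoggerFactory;
public class Slf4jLogger implements System.Logger {
    private final org.slf4j.Logger delegate;
    private final String name;
    public Slf4jLogger(String name) {
        this.name = name;
        this.delegate = LoggerFactory.getLogger(name);
    }
    @Override
    public String getName() {
        return name;
    }
    @Override
    public boolean isLoggable(Level level) {
        switch (level) {
            case TRACE:   return delegate.isTraceEnabled();
            case DEBUG:   return delegate.isDebugEnabled();
            case INFO:    return delegate.isInfoEnabled();
            case WARNING: return delegate.isWarnEnabled();
            case ERROR:   return delegate.isErrorEnabled();
            default:      return false; // ALL / OFF
        }
    }
    @Override
    public void log(Level level, ResourceBundle bundle, String msg, Throwable thrown) {
        if (!isLoggable(level)) return;
        if (level == Level.ERROR) delegate.error(msg, thrown);
        else if (level == Level.WARNING) delegate.warn(msg, thrown);
        else if (level == Level.DEBUG) delegate.debug(msg, thrown);
        else if (level == Level.TRACE) delegate.trace(msg, thrown);
        else delegate.info(msg, thrown);
    }
    @Override
    public void log(Level level, ResourceBundle bundle, String format, Object... params) {
        // System.Logger uses MessageFormat-style templates for the varargs variant
        log(level, bundle, MessageFormat.format(format, params), (Throwable) null);
    }
}
// Slf4jLoggerFinder.java -- the provider class named in the services file
public class Slf4jLoggerFinder extends System.LoggerFinder {
    @Override
    public System.Logger getLogger(String name, Module module) {
        return new Slf4jLogger(name);
    }
}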
I was recently asked to troubleshoot an issue with a Spring Boot program that gets executed by Oozie. Unfortunately, I don't have access to the Spring Boot application or the logs. :) I do have the output from mvn dependency:tree -Ddetail=true
I'm told that the Spring Boot application runs fine on its own but won't run when executed as an Oozie Java action. We suspect that some of the dependencies that are added to the classpath by Oozie conflict with dependencies from Spring Boot.
This is somewhat speculative, but I'd like to run a simple Oozie Java action that captures the group, artifact, and version for all the dependencies that are added to the classpath and compare that to the dependency tree from the Spring Boot application. I'm thinking that, if there are version conflicts, it might be possible to exclude/resolve them in the pom.xml.
I wrote a class that writes the names of the JARs on the classpath to a text file:
import java.io.BufferedOutputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.OutputStreamWriter;
import java.io.PrintWriter;
import java.io.UnsupportedEncodingException;
import java.net.URL;
import java.net.URLClassLoader;

void captureClasspath() {
    PrintWriter out = null;
    try {
        // Works on Java 8, where the system class loader is a URLClassLoader
        ClassLoader cl = ClassLoader.getSystemClassLoader();
        URL[] urls = ((URLClassLoader) cl).getURLs();
        out = new PrintWriter(new OutputStreamWriter(
                new BufferedOutputStream(new FileOutputStream("/tmp/classpath_capture.txt")), "UTF-8"));
        for (URL url : urls) {
            out.println(url.getFile());
        }
    } catch (UnsupportedEncodingException | FileNotFoundException e) {
        e.printStackTrace();
    } finally {
        if (out != null) {
            out.flush();
            out.close();
        }
    }
}
The output looks like this:
/hadoop/yarn/local/filecache/10/mapreduce.tar.gz/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.1.2.4.3.0-227-tests.jar
/hadoop/yarn/local/filecache/10/mapreduce.tar.gz/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.1.2.4.3.0-227.jar
/hadoop/yarn/local/filecache/10/mapreduce.tar.gz/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.7.1.2.4.3.0-227.jar
/hadoop/yarn/local/filecache/10/mapreduce.tar.gz/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.7.1.2.4.3.0-227.jar
/hadoop/yarn/local/filecache/10/mapreduce.tar.gz/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.1.2.4.3.0-227.jar
... etc ... (more than 300 lines)
Instead of the filename, I'd like to extract the group, artifact and version from these jars. Is that possible? Or is there a better strategy to troubleshoot/resolve this issue given the limited input (no application logs, code, etc...)?
Instead of the filename, I'd like to extract the group, artifact and version from these jars. Is that possible?
This would require reading the contents of each jar file and pulling the group, artifact and version from the relevant entry within the jar file. Some of the relevant methods for implementing this are JarFile#entries(), JarFile#getEntry(String) and JarFile#getInputStream(ZipEntry).
Maven builds will store an entry in the jar at META-INF/maven/<group>/<artifact>/pom.properties. For example, running jar xf hadoop-common.jar extracts META-INF/maven/org.apache.hadoop/hadoop-common/pom.properties, which contains the following data:
#Generated by Maven
#Thu Aug 18 01:41:25 UTC 2016
version=2.7.3
groupId=org.apache.hadoop
artifactId=hadoop-common
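Building on the JarFile methods mentioned above, a sketch of a hypothetical helper (not from the original answer) that reads each JAR's pom.properties entries and prints group:artifact:version might look like this:
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;
import java.util.jar.JarFile;

public class GavExtractor {
    // Scan a JAR for META-INF/maven/<group>/<artifact>/pom.properties
    // entries and print groupId:artifactId:version for each one found.
    static void printGav(String jarPath) throws IOException {
        try (JarFile jar = new JarFile(jarPath)) {
            jar.stream()
               .filter(e -> e.getName().startsWith("META-INF/maven/")
                         && e.getName().endsWith("/pom.properties"))
               .forEach(e -> {
                   try (InputStream in = jar.getInputStream(e)) {
                       Properties p = new Properties();
                       p.load(in);
                       System.out.println(p.getProperty("groupId") + ":"
                               + p.getProperty("artifactId") + ":"
                               + p.getProperty("version"));
                   } catch (IOException ex) {
                       ex.printStackTrace(); // keep scanning the other entries
                   }
               });
        }
    }
}
Note that JARs not built by Maven won't contain pom.properties, so those would still need to be identified by filename or manifest attributes.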
Several common sources of classpath version conflicts for Hadoop applications are Guava, Jackson and Protobuf.
In my build.gradle file I need to add the following:
shadowJar {
mergeServiceFiles()
}
Otherwise the JAR does not run properly. I wonder what this line does exactly?
I use the Gradle plugin in Eclipse Luna. I create the JAR from one Java project which depends on another one.
mergeServiceFiles is declared exactly here and its implementation is as follows:
/**
 * Syntactic sugar for merging service files in JARs
 * @return
 */
public ShadowJar mergeServiceFiles() {
    try {
        transform(ServiceFileTransformer.class);
    } catch (IllegalAccessException e) {
    } catch (InstantiationException e) {
    }
    return this;
}
As you can see, it uses ServiceFileTransformer, which is defined here. From its docs:
Modified from org.apache.maven.plugins.shade.resource.ServiceResourceTransformer.java
Resources transformer that appends entries in META-INF/services resources into a single resource. For example, if there are several META-INF/services/org.apache.maven.project.ProjectBuilder resources spread across many JARs, the individual entries will all be concatenated into a single META-INF/services/org.apache.maven.project.ProjectBuilder resource packaged into the resultant JAR produced by the shading process.
TL;DR - It merges the service files in the META-INF/services folder.
Long Answer
Some libraries (e.g., Micronaut) create a few service files in the META-INF/services folder. These files can contain any information useful at runtime. In the case of the Micronaut framework, it lists the bean references (the beans to be instantiated) in a file called io.micronaut.inject.BeanDefinitionReference under META-INF/services.
Usually, if you just have one application, it works fine even without mergeServiceFiles(). But if there are two Micronaut projects that you are bundling into a single JAR (e.g. a Micronaut lib project with some utils and a Micronaut app project with the core business logic), you will have two io.micronaut.inject.BeanDefinitionReference files. Each one will contain the beans of its own project.
If you don't use mergeServiceFiles(), one of the BeanDefinitionReference files will be overwritten by the other. In that case, you will get a runtime exception saying BeanNotInstantiated or something of that sort.
Using mergeServiceFiles() merges (or concatenates, in this case) the BeanDefinitionReference files of both projects so that at runtime you get all the beans defined.
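As an illustration (project and class names are hypothetical), the two service files and the merged result might look like:
# lib project: META-INF/services/io.micronaut.inject.BeanDefinitionReference
com.example.lib.$StringUtilsBean$Reference
# app project: META-INF/services/io.micronaut.inject.BeanDefinitionReference
com.example.app.$OrderServiceBean$Reference
# shadow JAR after mergeServiceFiles(): both entries concatenated
com.example.lib.$StringUtilsBean$Reference
com.example.app.$OrderServiceBean$Reference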
More details can be found in the gradle forum topic here.
I'm trying to find out if we can load an Oracle Commerce component from the file system. Generally we assemble all the code into an EAR file and deploy it; however, I have a requirement to store some components on the file system rather than packaging them with the EAR file.
I know that we can use a URLClassLoader to load a class as shown below,
// Requires: java.io.File, java.net.URL, java.net.URLClassLoader
File classDir = new File("A:\\LodeeModule\\classes");
URL[] url = { classDir.toURI().toURL() };
ClassLoader loader = new URLClassLoader(url);
// The .class files must sit under the package directory, e.g. classes/com/buddha/testers
File pkgDir = new File(classDir, "com/buddha/testers");
for (File file : pkgDir.listFiles((dir, name) -> name.endsWith(".class"))) {
    String filename = file.getName().replace(".class", "");
    loader.loadClass("com.buddha.testers." + filename).getConstructor().newInstance();
}
but how can we use the same for a component which has to be resolved by Nucleus at a later point in time? Is there any way to instruct Nucleus to resolve a component from the file system?
You should just be able to add the JAR that contains the component's classes to the CLASSPATH system variable used by the application server instance.
Then in the component configuration just define the implementing class as you normally would:
$class=some.class.path.class
If you are using JBoss EAP 6+ with a newer version of ATG (11.0+) you might have more trouble; you have to jump through some extra hoops due to its classloader.
https://docs.jboss.org/author/display/AS7/Class+Loading+in+AS7
Essentially you would need to define a JBoss module containing your JAR files, and define a dependency between the EAR's "module" and the module containing your classes, as sketched below.
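A minimal sketch of that setup, assuming a module named com.example.mycomponents and the JAR names from the ClassLoaderService example below (both names are illustrative):
<!-- modules/com/example/mycomponents/main/module.xml -->
<module xmlns="urn:jboss:module:1.1" name="com.example.mycomponents">
    <resources>
        <resource-root path="someClasses.jar"/>
        <resource-root path="someOtherClasses.jar"/>
    </resources>
</module>
<!-- META-INF/jboss-deployment-structure.xml inside the EAR -->
<jboss-deployment-structure>
    <deployment>
        <dependencies>
            <module name="com.example.mycomponents"/>
        </dependencies>
    </deployment>
</jboss-deployment-structure>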
Alternatively you can define a ClassLoaderService that will manage the classes for your JARs
To do this, you need to define a new ClassLoaderService, so create a new properties file as you would with any other component.
/my/custom/ClassLoaderService.properties
$class=atg.nucleus.ServicesManifestClassLoaderService
$description=Custom Class Loader Service.
# The files to go into the classpath of the classloader
classpathFiles=\
/path/to/my/jars/lib/someClasses.jar,\
/path/to/my/jars/lib/someOtherClasses.jar
loggingDebug=false
Then in the actual component that needs these classes, add this line:
$classloader=/my/custom/ClassLoaderService
I think you're looking for the atg.dynamo.data-dir property. If you specify that property, Dynamo will look at that location for the "server configs", i.e. the properties files. This allows you to separate the configs from the EAR file.
Note: You can still include configs in the EAR; I believe they will still take precedence.
It's usually specified when you start the server, something like:
run.sh -c <your server> -Datg.dynamo.data-dir=/data/something/serverconfigs
This feature is largely undocumented, but many people know about it.
See http://docs.oracle.com/cd/E24152_01/Platform.10-1/ATGPlatformProgGuide/html/s0302developmentmodeandstandalonemode01.html
EDIT:
I mistook what you were originally asking. You might want to take a look at the disposable classloader that ATG provides, but keep in mind it is only intended for development purposes.
Project setup:
Logging-1.0.jar
contains a Logger.class which uses slf4j/log4j
depends on slf4j-api.jar, slf4j-log4j.jar, log4j.jar
LoggingOSGI-1.0.jar
wraps the logging project
contains an Activator and MANIFEST.MF
lib/ contains logging-1.0.jar, slf4j-api.jar, slf4j-log4j.jar, log4j.jar
jars from lib/ are added to classpath and packages from logging-1.0.jar are exported
SomeBundle-1.2.jar
contains an Activator and MANIFEST.MF
has a dependency on LoggingOSGI-1.0.jar
Accessing the Logger class from SomeBundle works, but the logging project can't find log4j.properties (log4j:WARN No appenders could be found for logger).
Questions:
Where do I have to place the log4j.properties?
Any ideas what I could try? (Already tried: different directories, Eclipse buddy class loading, -Dlog4j.configuration as a VM argument.)
Would an extension point that tells the logging project the location of log4j.properties be a good solution?
When I last tried this, around six years ago, the solution turned out to be to create a fragment bundle containing the log4j.properties file, and then to attach that fragment (via the Fragment-Host manifest header) to the bundle that loads the logging library ("Logging-1.0.jar," in your case). It felt like a lot of project structure, build-time, and deployment overhead for what seems like such a simple goal.
See section 3.14 of the OSGi Service Platform Core Specification for more detail on fragment bundles.
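A sketch of such a fragment's MANIFEST.MF, assuming the logging bundle's symbolic name is com.example.logging (illustrative; use your actual host bundle's Bundle-SymbolicName), with log4j.properties placed at the fragment root:
Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-SymbolicName: com.example.logging.config
Bundle-Version: 1.0.0
Fragment-Host: com.example.logging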
An alternate idea is to consider using the Configuration Admin Service to designate the path to a logging configuration file on disk, outside of your bundles. That would require augmenting your logging library to look up a configuration (or, better, listen for one) and then pass that configuration through to the logging implementation.
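An illustrative sketch of that idea (the class name, PID, and property key are all hypothetical), where a ManagedService receives the configured path and passes it through to log4j:
import java.util.Dictionary;
import java.util.Hashtable;
import org.apache.log4j.PropertyConfigurator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.Constants;
import org.osgi.service.cm.ConfigurationException;
import org.osgi.service.cm.ManagedService;

// Receives configuration from Configuration Admin and passes the
// configured file path through to the logging implementation.
public class LoggingConfigurator implements ManagedService {
    public static void register(BundleContext ctx) {
        Dictionary<String, Object> props = new Hashtable<>();
        props.put(Constants.SERVICE_PID, "com.example.logging");
        ctx.registerService(ManagedService.class.getName(), new LoggingConfigurator(), props);
    }
    @Override
    public void updated(Dictionary<String, ?> config) throws ConfigurationException {
        if (config == null) {
            return; // no configuration stored yet
        }
        Object path = config.get("log4j.properties.path");
        if (path != null) {
            PropertyConfigurator.configure(path.toString());
        }
    }
}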
I would also be remiss to not point out the OSGi Log Service, specified in section 101 of the OSGi Service Platform Service Compendium.
To solve my problem, I added the code below to the Activator of LoggingOSGI-1.0, which configures log4j. The file path is taken from a system property: -Dlog4j.configuration=path/to/log4j.properties.
I'm still interested in other approaches to, or opinions on, this solution.
import org.apache.log4j.PropertyConfigurator;
import org.osgi.framework.BundleContext;

private static final String LOG4J_CONFIG_KEY = "log4j.configuration";

public void start(BundleContext bundleContext) throws Exception {
    Activator.context = bundleContext;
    // Configure log4j from the file named by -Dlog4j.configuration, if present
    if (System.getProperties().containsKey(LOG4J_CONFIG_KEY)) {
        String file = System.getProperties().getProperty(LOG4J_CONFIG_KEY);
        PropertyConfigurator.configure(file);
    }
}