Logback multiple configuration error on multi-project JUnit 4/Eclipse/Gradle - java

I've got a Java Eclipse/Gradle multi-project build that uses SLF4J/Logback in the client, the server, and a couple of libraries. I cannot figure out how to manage my Logback configurations in a way that permits sane JUnit 4 testing without generating the error
Resource [logback-test.xml] occurs multiple times on the classpath.
My configuration files look like this:
project
  server
    config/logback-dev.xml
    src/main/resources/logback.xml
    src/test/resources/logback-test.xml
  client
    config/logback-dev.xml
    src/main/resources/logback.xml
    src/test/resources/logback-test.xml
  library
    src/test/resources/logback-test.xml
The production config logback.xml for each distribution module goes in src/main/resources. It gets added to the build and ends up in the jar. This seems necessary because Logback's database appender is part of the application, and I need a default log configuration that includes it.
The development config logback-dev.xml lives in module/config. I specify its location as a VM argument in Eclipse: -Dlogback.configurationFile=config/logback-dev.xml
In theory, my unit tests should all use logback-test.xml, which should live in module/src/test/resources for each module.
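For illustration, the content of those files isn't the issue; a minimal console-only logback-test.xml along these lines (appender and pattern are just examples) is all the tests need:

    <configuration>
      <!-- minimal test configuration: log everything to the console -->
      <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
          <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
        </encoder>
      </appender>
      <root level="DEBUG">
        <appender-ref ref="STDOUT" />
      </root>
    </configuration>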
That last point is the problem. Running a JUnit 4 test on a module with dependencies picks up its own src/test/resources/logback-test.xml... but it also picks up the logback-test.xml that Eclipse has placed in module/bin for each dependency. Logback gets loaded before JUnit runs any user code, so programmatic solutions won't work.
I have dozens of JUnit 4 test classes that I want to run in Eclipse. I'd rather not get stuck manually specifying VM arguments for each of several dozen JUnit run configurations in Eclipse.
How can I avoid the multiple logback-test.xml error for my unit tests without manually configuring every JUnit class?

Related

What is the relationship between Spring Boot and the Maven pom.xml file?

I use Maven to build Spring Boot applications. I know that the Maven Spring Boot dependencies contain annotations (such as @AutoConfiguration or @Condition*). At what point in the SpringApplication.run method in the main class does it actually read from the pom.xml file? I'm stepping through the code line by line and I can't figure out at what point Spring Boot interacts with the pom.xml file.
@EnableAutoConfiguration enables auto-configuration of the application context based on the jar files on the classpath and the user-defined beans, so presumably the pom.xml eventually gets added as a source and the annotations in the dependencies are read while auto-configuration runs, but I can't figure out where or when it does so.
SpringApplication.run doesn't read the pom.xml file; it works at a higher level of abstraction. The pom.xml file is used by your IDE or by the mvn command-line application to download dependencies and configure your application's classpath.
By the time SpringApplication.run is called, the classpath is fully configured, but SpringApplication itself isn't aware of how that was done.
Auto-configuration works by searching all jars on the classpath for files named META-INF/spring/org.springframework.boot.autoconfigure.AutoConfiguration.imports (if you're using a recent version). Once these files have been found, the classes listed in them are loaded and used for configuration.
If you want to step through the code, you can set a breakpoint at AutoConfigurationImportSelector.getCandidateConfigurations(...) and watch the auto-configuration candidate classes being found. The exact auto-configurations that get applied depend on the @Conditional... annotations on them.
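To make that concrete, here is a rough sketch (class and bean names are made up) of what such an auto-configuration looks like; its fully qualified name would be listed in the library jar's META-INF/spring/org.springframework.boot.autoconfigure.AutoConfiguration.imports file:

    import javax.sql.DataSource;

    import org.springframework.boot.autoconfigure.AutoConfiguration;
    import org.springframework.boot.autoconfigure.condition.ConditionalOnClass;
    import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean;
    import org.springframework.context.annotation.Bean;

    // Hypothetical auto-configuration shipped inside a library jar.
    @AutoConfiguration
    @ConditionalOnClass(DataSource.class)   // only applies if a DataSource class is on the classpath
    public class MyLibraryAutoConfiguration {

        @Bean
        @ConditionalOnMissingBean           // backs off if the application defines its own bean
        MyLibraryService myLibraryService() {
            return new MyLibraryService();
        }

        // Hypothetical service the auto-configuration provides.
        public static class MyLibraryService {
        }
    }

Note that none of this involves the pom.xml: the jar containing the class (and the imports file) simply has to be on the classpath.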

log4j2 logging for a Java SDK

Let's say we build a Java SDK that different projects can consume by adding it as a jar on the classpath or as a dependency in their Maven pom.xml or Gradle file. Logs from the SDK are not visible at run time when other projects consume this library. I tried SLF4J, and none of the logs are visible at run time when the library is used by other projects. Should I go with Log4j 2? If yes, should I provide a Log4j 2 configuration/properties file in my SDK? Will the properties/configuration be picked up at run time from the consuming projects? What are the best practices for this? Can you please advise?
Best practice #1: never include the logging configuration file in your jar.
Applications using your library will likely want to provide their own logging configuration. If one of the jars contains a different configuration it becomes a “fun” guessing game for these applications to figure out why their configuration isn’t working. (And depending on the classpath order, their configuration may sometimes take precedence over the configuration in the library, so sometimes it will work, just to make it extra challenging...)
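As a sketch of what that means for the SDK itself (the class is hypothetical): depend on slf4j-api only, log through it, and ship neither a binding nor a configuration file. The consuming application picks the backend (Logback, Log4j 2, ...) and supplies its own configuration, so whether the SDK's log statements show up is decided entirely by that application's configuration, e.g. the level it sets for the SDK's packages.

    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    // Inside the SDK: code against the SLF4J API only; no binding, no config file.
    public class PaymentClient {

        private static final Logger log = LoggerFactory.getLogger(PaymentClient.class);

        public void charge(String accountId) {
            log.debug("charging account {}", accountId);
            // ... actual SDK work ...
        }
    }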

Managing dependencies in a JAR - Test Automation

I'm not sure if this is the best place to post such a question, but here it is. I'm a test automation engineer who works primarily with backend Spring Boot command-line apps. My tests, at a high level, are designed to ensure that any type of data thrown at the app will be handled correctly. We are a Java shop.
As with any "good" testing practice, I am treating the app like a black box, in that I do not pull in the model objects to run my tests. I simply supply the app with data and execute a command-line script (run.sh) that takes my data and processes it. My tests consist mostly of JDBC code (to interact with the database) and a slew of ArrayList utilities that I have put together to sort result sets and pull out specific DB information.
Thus far, I have been deploying my tests as a JAR. I bundle everything up and deploy it to the environment with a script that executes the tests. The tests do not run when the app runs; though they live inside the project, they are a separate entity with separate launcher classes. However, I am finding that managing dependencies in a JAR is a real headache. Is there a better way to deploy automation/integration tests for command-line apps?
I'm pulling in the Maven Shade plugin to bundle all of my dependencies into a "God JAR", but that isn't helping me resolve the issues that occur when I attempt to execute the JAR. I get multiple bean instantiation errors relating to the app itself, not to my tests. For this reason, I pull in the app model, and the app itself, as dependencies. When I first ran the tests, they worked just fine; deployed to the environment, they continued to work correctly. Fast forward a couple of months and a few changes to the app, and now it's a dependency nightmare when I build a new JAR.
TL;DR: I'm having trouble managing dependencies in a Maven-built integration-test JAR. Is there a better way to deploy automation/integration tests for command-line apps where dependency management is easier?
(Note: I'm relatively new to this world, so pardon me if the question seems a bit vague).
I think the error happens because you use the Shade plugin to re-package the Spring Boot jar. The way Spring Boot works is to add dependencies into the jar as nested jars and to configure its own class loader (via the jar's metadata) that is capable of reading classes from jar files inside the jar file. The standard Java class loader does not do this - that's probably why the Shade plugin misses some jars (probably the ones embedded in the Spring Boot uber jar).
What I would try is to create a test version of the Spring Boot app that contains the test classes in the compile scope plus a dependency on the original Spring Boot jar. You don't need the uber jar for that dependency - you may therefore have to add a classifier to the original (app) Spring Boot plugin configuration so the plain jar is still available, since it is replaced by default. Then use the Spring Boot plugin to package the test version (using that dependency and the classifier you gave the original app).
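As a sketch of the classifier part (the exec value is just an example), the original application's pom.xml could keep the plain jar available like this:

    <!-- In the original app: give the repackaged (executable) jar a classifier
         so the plain jar remains available for other modules to depend on. -->
    <plugin>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-maven-plugin</artifactId>
      <configuration>
        <classifier>exec</classifier>
      </configuration>
    </plugin>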

Multiple different classpaths during maven test execution

For a Java project I want to spin up a server application during integration tests (currently via maven-failsafe-plugin, but that can be switched).
Problem
The server application should be fetched via Maven
My project and the server depend on a shared library
The version of my copy of the shared lib may be different from the one used by the server application
(During test there is even a third application involved, but the same requirements apply)
Current solution
Create a classloader manually, build the classpath manually, and start the server application in that custom classpath (sketched below).
Manual dependency resolution sucks; it has to be redone on every dependency change.
Alternatively: put everything onto the classpath and remove whatever breaks.
Also manual...
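Roughly what the current manual approach looks like (jar paths and class names are made up, and the list has to be maintained by hand whenever dependencies change):

    import java.net.URL;
    import java.net.URLClassLoader;
    import java.nio.file.Path;

    public class ServerLauncher {

        public static void main(String[] args) throws Exception {
            // Hand-maintained classpath for the server application.
            URL[] serverClasspath = {
                Path.of("target/it/server-app-1.2.0.jar").toUri().toURL(),
                Path.of("target/it/shared-lib-2.0.0.jar").toUri().toURL(),
            };
            // parent = null isolates the server from the test classpath,
            // so it can use its own version of the shared library.
            try (URLClassLoader serverLoader = new URLClassLoader(serverClasspath, null)) {
                Class<?> serverMain = serverLoader.loadClass("com.example.server.Main");
                serverMain.getMethod("main", String[].class)
                          .invoke(null, (Object) new String[0]);
            }
        }
    }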
Wish
Specify a "dependency profile" in pom.xml for each component
During the test, call something like Maven.getClassLoaderForProfile("server") and receive a classloader with all dependencies (including transitive ones)
Load the application in this classloader

Avoid getting slf4j from maven

Maven uses SLF4J, so as soon as it is launched, it initializes SLF4J with its default implementation contained in apache-maven-3.3.9\lib\slf4j-simple-1.7.5.jar and with the configuration file defined in apache-maven-3.3.9\conf\logging\simplelogger.properties.
After that, it loads the POM file and finds my jetty-maven-plugin, which launches a webapp. In this webapp I want to use a different SLF4J implementation, but I can't because SLF4J is already initialized.
I understand that Maven is mainly a build tool and not a tool for launching apps, but I can't modify the Apache Maven log configuration for every project just to get pretty logs for each of them.
Has anyone already faced this issue and found a way around it?
Note:
run-forked instead of run works, but in that case I can no longer debug from Eclipse, so I would prefer another solution.
Older versions of Maven, such as 3.0.3, work because they didn't use SLF4J.
