I'm working on a multi-module Maven-based project in which one of the modules contains a few annotation processors for the custom annotations used by the other modules. When I add the annotation-processor module as a dependency of any other module, the annotations of that module are processed by those annotation processors.
But recently I integrated the Checker Framework (for type annotations), and then all the custom annotation processors I mentioned above stopped working. Any idea on how to get them to work alongside the Checker Framework would be greatly appreciated.
To clarify the scenario:
Let's say I have a Maven module named module_A. In this module I have a class-level annotation called "@FoodItem". I need to enforce a rule that any class annotated with "@FoodItem" must implement the interface "Food". So I wrote an annotation processor, "FoodItemAnnotationProcessor", in the same module (module_A), which processes such classes and checks for compliance with that rule.
Then let's say I have another module named module_B which has a Maven dependency on module_A. In this module I have a class called "Pizza" which is annotated with "@FoodItem".
If I build the project (which has module_A and module_B) with the above configuration, the "FoodItemAnnotationProcessor" is executed at compile time and validates the class "Pizza" against the rule mentioned above.
After that I integrated the Checker Framework into module_B (as mentioned here). The Checker Framework validations are then executed at compile time as expected, but the "FoodItemAnnotationProcessor" ceased to work.
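For reference, the rule described above can be implemented with a processor along these lines. This is a sketch only: the package name com.example.food and the processor internals are assumptions, while the class names come from the description above.

// Food.java
package com.example.food;

public interface Food { }

// FoodItem.java
package com.example.food;

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

@Target(ElementType.TYPE)
@Retention(RetentionPolicy.CLASS)
public @interface FoodItem { }

// FoodItemAnnotationProcessor.java
package com.example.food;

import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import javax.lang.model.type.TypeMirror;
import javax.tools.Diagnostic;

@SupportedAnnotationTypes("com.example.food.FoodItem")
@SupportedSourceVersion(SourceVersion.RELEASE_8)
public class FoodItemAnnotationProcessor extends AbstractProcessor {

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        // Resolve the Food interface that annotated classes must implement.
        TypeElement food = processingEnv.getElementUtils().getTypeElement("com.example.food.Food");
        if (food == null) {
            return false; // Food is not on the compilation classpath; nothing to check.
        }
        TypeMirror foodType = food.asType();
        for (Element element : roundEnv.getElementsAnnotatedWith(FoodItem.class)) {
            // Report a compile error for annotated classes that do not implement Food.
            if (!processingEnv.getTypeUtils().isAssignable(element.asType(), foodType)) {
                processingEnv.getMessager().printMessage(Diagnostic.Kind.ERROR,
                        "Classes annotated with @FoodItem must implement Food", element);
            }
        }
        return true;
    }
}

module_A would also register the processor in META-INF/services/javax.annotation.processing.Processor, which is what the auto-discovery described in the answer below relies on.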
To understand the problem you must know how javac finds your annotation processors.
When you don't supply the -processor argument to javac (see doc-javac-options), the annotation-processor auto-discovery feature (see javac-doc: Annotation processing) is activated. This means that javac will search for all available annotation processors on your classpath (or processor path, if you have specified one).
Jars that include a META-INF/services/javax.annotation.processing.Processor file can specify their annotation-processor classes, and javac will automatically use them.
The "problem" is that the Checker Framework ships multiple annotation processors for its checks, but you may only want to use some of them: thus the annotation-discovery process cannot be used, and you must manually specify all annotation processors to run in your build file.
For a Maven build you can do it like this (see the Checker Framework doc for Maven):
<annotationProcessors>
<!-- Add all the checkers you want to enable here -->
<annotationProcessor>org.checkerframework.checker.nullness.NullnessChecker</annotationProcessor>
</annotationProcessors>
This will explicitly set the -processor argument for javac (see doc-javac-options), which disables the default annotation-discovery process.
So the solution is to manually add all annotation processors that you want to run (in addition to the checker-framework checkers).
E.g. when you want to run the NullnessChecker and Dagger, you must specify both:
<annotationProcessors>
<!-- Add all the checkers you want to enable here -->
<annotationProcessor>org.checkerframework.checker.nullness.NullnessChecker</annotationProcessor>
<!-- Add all your other annotation processors here -->
<annotationProcessor>dagger.internal.codegen.ComponentProcessor</annotationProcessor>
</annotationProcessors>
Hint:
to find out which annotation processors you are currently using, run your build and pass the non-standard option -XprintProcessorInfo to javac.
UPDATE:
The checkers also support a form of auto-discovery (doc-ref). Note: I have not used this yet.
2.2.3 Checker auto-discovery
“Auto-discovery” makes the javac compiler always run a checker plugin,
even if you do not explicitly pass the -processor command-line option.
This can make your command line shorter, and ensures that your code is
checked even if you forget the command-line option.
To enable auto-discovery, place a configuration file named
META-INF/services/javax.annotation.processing.Processor in your
classpath. The file contains the names of the checker plugins to be
used, listed one per line. For instance, to run the Nullness Checker
and the Interning Checker automatically, the configuration file should
contain:
org.checkerframework.checker.nullness.NullnessChecker
org.checkerframework.checker.interning.InterningChecker
Is there a way to overwrite a configuration in a Quarkus extension with a hard-coded value?
What I'm trying to do: I am creating a custom Quarkus extension for JSON logging, based on quarkus-logging-json but with additional (non-static) fields. I reuse some classes from that extension's runtime library, so it is a Maven dependency of my extension's runtime module (and its deployment artifact also needs to be declared as a dependency of my deployment module, because the Quarkus extension plugin checks this).
It seems to work fine, except that I now have 2 formatters, and the following line is logged:
LogManager error of type GENERIC_FAILURE: Multiple console formatters were activated
I would like to disable the quarkus-logging-json extension completely by hard-coding these values:
quarkus.console.json.enable=false
quarkus.file.json.enable=false
Is there a way to do this?
Thank you.
An extension cannot override runtime configuration values; it can, however, set a default value using io.quarkus.deployment.builditem.RunTimeConfigurationDefaultBuildItem
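A minimal sketch of what that can look like in the extension's deployment module. The class and method names below are made up for illustration, and the property keys are the ones from the question; a RunTimeConfigurationDefaultBuildItem only sets a default, so users can still override it.

import io.quarkus.deployment.annotations.BuildProducer;
import io.quarkus.deployment.annotations.BuildStep;
import io.quarkus.deployment.builditem.RunTimeConfigurationDefaultBuildItem;

public class JsonLoggingDefaultsProcessor {

    @BuildStep
    void disableOtherJsonFormatters(BuildProducer<RunTimeConfigurationDefaultBuildItem> defaults) {
        // These are default values only; explicit user configuration still wins.
        defaults.produce(new RunTimeConfigurationDefaultBuildItem("quarkus.console.json.enable", "false"));
        defaults.produce(new RunTimeConfigurationDefaultBuildItem("quarkus.file.json.enable", "false"));
    }
}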
My gradle project contains 3 sub-projects with one source file each:
root-project\
sub-project-abstract\
...AbstractFoo.java
sub-project-commons\
...ConcreteFoo.java (extends AbstractFoo)
sub-project-main\
...Main.java (instantiates ConcreteFoo)
build.gradle of sub-project-commons:
dependencies {
implementation(project(':sub-project-abstract'))
}
build.gradle of sub-project-main:
dependencies {
implementation(project(':sub-project-commons'))
}
The Main class in sub-project-main is aware of ConcreteFoo; however, compilation fails with "cannot access AbstractFoo".
For some reason, I expected sub-project-commons to "export" ConcreteFoo and AbstractFoo, since it's an implementation dependency. In other words, from the perspective of sub-project-main, AbstractFoo is a transitive dependency.
However, this doesn't seem to be the case.
I know that I could probably make it work by explicitly adding sub-project-abstract as a direct dependency of sub-project-main. However, that's something I want to avoid due to the nature of the commons project (my actual project contains up to 10 subprojects, and it should be possible to reuse the commons project without declaring a dependency on sub-project-abstract every single time the commons project is referenced).
Is there a way to make the Main-class aware of AbstractFoo without directly declaring sub-project-abstract as a dependency (but indirectly via sub-project-commons)?
This is expected behavior for the implementation configuration. You should apply the Java Library Plugin and use the api configuration.
The key difference between the standard Java plugin and the Java Library plugin is that the latter introduces the concept of an API exposed to consumers. A library is a Java component meant to be consumed by other components. It’s a very common use case in multi-project builds [emphasis added], but also as soon as you have external dependencies.
The plugin exposes two configurations that can be used to declare dependencies: api and implementation. The api configuration should be used to declare dependencies which are exported by the library API, whereas the implementation configuration should be used to declare dependencies which are internal to the component.
[...]
Dependencies appearing in the api configurations will be transitively exposed to consumers of the library, and as such will appear on the compile classpath of consumers. Dependencies found in the implementation configuration will, on the other hand, not be exposed to consumers, and therefore not leak into the consumers' compile classpath. [...]
In sub-project-commons (Kotlin DSL):
plugins {
...
`java-library`
}
...
dependencies {
api(project(":sub-project-abstract"))
}
...
I have a Spring Boot application that works as expected when run with the embedded Tomcat, but I noticed that if I try to run it from an existing Tomcat instance that I'm using with a previous project, it fails with a NoClassDefFoundError for a class that I don't use anywhere in my application.
I noticed that the /lib directory contained a single jar with a few Spring-annotated classes, so as a test I cleaned out the /lib directory, which resolved the issue. My assumption is that Spring sees some of the configurations/beans/imports on the classpath because they exist in the /lib directory and either tries to auto-configure something on its own, or actually tries to instantiate some of these classes.
So then my question is - assuming I can't always fully control the contents of everything on the classpath, how can I prevent errors like this from occurring?
EDIT
For a little more detail: the class not being found is DefaultCookieSerializer, which is part of the spring-session-implementation dependency. It is referenced by one of the classes in the jar located in /lib, but it is not part of my application.
Check the features provided by @EnableAutoConfiguration. You can explicitly configure the set of auto-configuration classes for your application. This tutorial can be a good starting point.
You can remove the @SpringBootApplication annotation from the main class and replace it with a @ComponentScan annotation and an @Import annotation that explicitly lists only the configuration classes you want to load. For example, in a Spring Boot MVC app that uses metrics, web client, rest template, Jackson, etc., I was able to replace the @SpringBootApplication annotation with the code below and get it working exactly as it was before, with all functional tests passing:
@Import({ MetricsAutoConfiguration.class,
InfluxMetricsExportAutoConfiguration.class,
ServletWebServerFactoryAutoConfiguration.class,
DispatcherServletAutoConfiguration.class,
WebMvcAutoConfiguration.class,
JacksonAutoConfiguration.class,
WebClientAutoConfiguration.class,
RestTemplateAutoConfiguration.class,
RefreshAutoConfiguration.class,
ValidationAutoConfiguration.class
})
@ComponentScan
The likely culprit of the mentioned exception is incompatible jars on the classpath.
As we don't know which library you have the issue with, we can't tell you the exact cause, but the situation looks like this:
One of Spring Boot's auto-configuration classes is triggered by the presence of a class on the classpath.
The triggered configuration tries to create a bean of a class that is not present in the jar you have (but which is present in the specific version mentioned in the Spring BOM).
Version incompatibilities may also cause NoSuchMethodError failures.
That's one of the reasons why it is good practice not to run Spring Boot applications inside an external container (make jar, not war), but as a runnable jar with an embedded container.
Even before Spring Boot, it was preferable to take into account the libraries present on the runtime classpath and mark them as provided in your project. Having different versions of a library on the classpath may cause weird ClassCastExceptions where the names on both ends match, but the rest doesn't.
You can resolve specific cases by disabling the auto-configuration that causes your issue, either by adding exclude to your @SpringBootApplication or by using a property file.
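For example (a sketch: SessionAutoConfiguration is only a guess at the auto-configuration that drags in the spring-session classes; substitute whichever auto-configuration is actually triggered in your case):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.session.SessionAutoConfiguration;

// Exclude the suspected auto-configuration so Spring Boot never tries to create its beans.
@SpringBootApplication(exclude = SessionAutoConfiguration.class)
public class MyApplication {
    public static void main(String[] args) {
        SpringApplication.run(MyApplication.class, args);
    }
}

The property-file equivalent is spring.autoconfigure.exclude=org.springframework.boot.autoconfigure.session.SessionAutoConfiguration.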
Edit:
If you don't use a very broad package scan (or use a package name from outside your project in the package scan) in your Spring Boot application, it is unlikely that Spring Boot simply imports configuration from the classpath.
As I mentioned before, it is more likely some auto-configuration being triggered by the existence of a class on the classpath.
Theoretical solution:
You could use the Maven Shade Plugin to relocate all packages into your own package space: see the docs.
The problems you'd have to face:
Defining a very broad relocation pattern that still excludes the JEE classes the container needs so that it knows how to run your application.
Relocation most likely won't affect package names used as strings in the Spring Boot annotations (like @ComponentScan or @ConditionalOnClass). As far as I know this is not implemented yet; you'd have to implement it yourself, maybe as some kind of Shade plugin resource processor.
When relocating classes you'd have to replace package names in all relevant configuration located in the jars, and possibly also merge some of those files.
You'd also have to take into account how the libraries you use (or Spring uses) rely on package names or files.
This is definitely not a trivial task, with many traps ahead. But if done right, it would possibly allow you to disregard what is on the container's classpath: Spring Boot would look for classes in the relocated packages, and you wouldn't have those in ordinary jars.
I have a Java project that I'm migrating from Java 8 to Java 13. This project uses ResourceBundles to enable language localisation.
In Java 8, I provided a custom ResourceBundle.Control to ResourceBundle.getBundle(baseName, control) but this doesn't work anymore in Java 9+. As I understand it, I must instead provide a custom ResourceBundleProvider interface, which I called UiProvider, and an implementation of this interface, UiProviderImpl, which must be used as a service.
To generate module descriptors, I'm using the Moditect Maven plugin. But it doesn't look like I can add a provides directive anywhere, only exports, opens and uses directives. Or am I missing something? Here's an excerpt of my pom.xml with what I tried to configure. Can this be fixed?
<module>
<moduleInfo>
<name>net.babelsoft.negatron</name>
<opens>net.babelsoft.negatron;</opens>
<uses>theme.language.spi.UiProvider</uses>
<provides>theme.language.spi.UiProvider with theme.language.spi.UiProviderImpl</provides>
</moduleInfo>
</module>
At the time I wrote my question, Moditect didn't support the provides directive within the moduleInfo tag.
The only way was to use a moduleInfoSource tag, which forces the developer to write the actual content of module-info.java directly, which is not very satisfactory.
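With moduleInfoSource you end up spelling out the descriptor by hand, roughly like this (a sketch using only the names from the POM excerpt above; requires directives omitted):

module net.babelsoft.negatron {
    opens net.babelsoft.negatron;
    uses theme.language.spi.UiProvider;
    provides theme.language.spi.UiProvider with theme.language.spi.UiProviderImpl;
}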
After discussing with the author of Moditect, I submitted a pull request to add support for the provides directive within the moduleInfo tag. It hasn't been merged into Moditect's source code yet...
The documentation of Maven Compiler plugin mentions the following:
annotationProcessors:
Names of annotation processors to run. Only applies to JDK 1.6+ If not
set, the default annotation processors discovery process applies.
What is the default annotation processors discovery process here? Is there any other way to set up annotation processors than this configuration tag?
I've found that the Getting Started with the Annotation Processing Tool (apt) documentation mentions a default discovery procedure, but it works with factory classes, not processors, and unfortunately it uses the tools.jar and com.sun packages from the JDK. Is this the default annotation processors discovery process?
The default way to make an annotation processor available to the compiler is to register it in a file named META-INF/services/javax.annotation.processing.Processor. The file can list a number of processors: each processor's fully-qualified class name on its own line, with a newline at the end. The compiler defaults to using the processors found this way if none are specified explicitly.
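For example, a jar that ships a (hypothetical) processor com.example.MyProcessor would contain a META-INF/services/javax.annotation.processing.Processor file whose entire content is:

com.example.MyProcessor

javac then runs that processor automatically whenever the jar is on the compile classpath (or processor path), unless processors are specified explicitly via the -processor option or the annotationProcessors configuration tag.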