I have an Android library module where a public class uses some annotations from an external library. The annotations are purely used internally, so I don't want to expose that library as an api Gradle dependency, leaking it to clients, but want to keep it as an implementation one.
However, the presence of the annotation causes warnings for applications using my library, since the annotation class cannot be resolved. This is quite understandable - a public class contains a symbol which is not on the compile path of the client - but is there a way to keep annotations internal when packaging a library, or to have them ignored by the calling application? The annotations have RUNTIME retention, so they cannot simply be stripped out in a build step or similar.
Setup for illustrative purposes:
my-library:
build.gradle has an implementation dependency on com.example.foobar, which contains the @Example annotation
Foo.java is a public class annotated with @Example:
@Example(foo = "bar")
public class Foo {
...
}
Other classes in my-library itself require the annotation to be present at runtime.
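For illustrative purposes, the relevant part of my-library's build.gradle (the coordinates are placeholders):
dependencies {
    // internal-only dependency that provides the @Example annotation;
    // implementation keeps it off the consumers' compile classpath
    implementation 'com.example:foobar:1.0'
}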
some-client:
build.gradle has an implementation dependency on com.example.mylibrary, which contains the class Foo
This application uses Foo objects, but doesn't need to know about the annotation; however, when building it will get:
classes.jar(com/example/Foo.class): warning: Cannot find annotation method 'foo()' in type 'Example': class file for com.example.Example not found
If I change implementation to api for com.example.foobar, the warning is eliminated, but the application will now get the @Example annotation on its build path, which is an internal implementation detail. Is there another way?
Related
My gradle project contains 3 sub-projects with one source file each:
root-project\
sub-project-abstract\
...AbstractFoo.java
sub-project-commons\
...ConcreteFoo.java (extends AbstractFoo)
sub-project-main\
...Main.java (instantiates ConcreteFoo)
build.gradle of sub-project-commons:
dependencies {
implementation(project(":sub-project-abstract"))
}
build.gradle of sub-project-main:
dependencies {
implementation(project(":sub-project-commons"))
}
The Main class in sub-project-main is aware of ConcreteFoo; however, compilation fails with "cannot access AbstractFoo".
For some reason, I expected sub-project-commons to "export" ConcreteFoo and AbstractFoo, since it's an implementation dependency. Or in other words, from the perspective of sub-project-main, AbstractFoo is a transitive dependency.
However, this doesn't seem to be the case.
I know that I could probably make it work by explicitly adding sub-project-abstract as a direct dependency of sub-project-main. However, that's something I want to avoid due to the nature of the commons project (my actual project contains up to 10 subprojects, and it should be possible to reuse the commons-project without declaring a dependency on sub-project-abstract every single time the commons-project is referenced).
Is there a way to make the Main-class aware of AbstractFoo without directly declaring sub-project-abstract as a dependency (but indirectly via sub-project-commons)?
This is expected behavior for the implementation configuration. You should apply the Java Library Plugin and use the api configuration.
The key difference between the standard Java plugin and the Java Library plugin is that the latter introduces the concept of an API exposed to consumers. A library is a Java component meant to be consumed by other components. It’s a very common use case in multi-project builds [emphasis added], but also as soon as you have external dependencies.
The plugin exposes two configurations that can be used to declare dependencies: api and implementation. The api configuration should be used to declare dependencies which are exported by the library API, whereas the implementation configuration should be used to declare dependencies which are internal to the component.
[...]
Dependencies appearing in the api configurations will be transitively exposed to consumers of the library, and as such will appear on the compile classpath of consumers. Dependencies found in the implementation configuration will, on the other hand, not be exposed to consumers, and therefore not leak into the consumers' compile classpath. [...]
In sub-project-commons (Kotlin DSL):
plugins {
...
`java-library`
}
...
dependencies {
api(project(":sub-project-abstract"))
}
...
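With that change, sub-project-main's own dependency declaration can stay exactly as it is; AbstractFoo now reaches its compile classpath transitively via the api dependency of sub-project-commons:
dependencies {
    implementation(project(":sub-project-commons"))
}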
I have a Spring Boot application that works as expected when run with the embedded Tomcat, but I noticed that if I try to run it from an existing Tomcat instance that I'm using with a previous project, it fails with a NoClassDefFoundError for a class that I don't use anywhere in my application.
I noticed that in the /lib directory I had a single jar containing a few Spring-annotated classes, so as a test I cleaned out the /lib directory, which resolved the issue. My assumption is that Spring is seeing some of the configurations/beans/imports on the classpath because they exist in the /lib directory, and is either trying to autoconfigure something on its own or actually trying to instantiate some of these classes.
So my question is: assuming I can't always fully control the contents of everything on the classpath, how can I prevent errors like this from occurring?
EDIT
For a little more detail - the class not being found is DefaultCookieSerializer, which is part of the spring-session implementation dependency. It is pulled in by one of the classes in the jar located in /lib, but it is not part of my application.
Check the features provided by @EnableAutoConfiguration. You can explicitly configure the set of auto-configuration classes for your application. This tutorial can be a good starting point.
You can remove the @SpringBootApplication annotation from the main class and replace it with a @ComponentScan annotation plus an @Import annotation that explicitly lists only the configuration classes you want to load. For example, in a Spring Boot MVC app that uses metrics, web client, rest template, Jackson, etc., I was able to replace the @SpringBootApplication annotation with the code below and get it working exactly as before, with all functional tests passing:
@Import({ MetricsAutoConfiguration.class,
InfluxMetricsExportAutoConfiguration.class,
ServletWebServerFactoryAutoConfiguration.class,
DispatcherServletAutoConfiguration.class,
WebMvcAutoConfiguration.class,
JacksonAutoConfiguration.class,
WebClientAutoConfiguration.class,
RestTemplateAutoConfiguration.class,
RefreshAutoConfiguration.class,
ValidationAutoConfiguration.class
})
@ComponentScan
The likely culprit of the mentioned exception is incompatible jars on the classpath.
Since we don't know which library you have the issue with, we can't tell you the exact reason, but the situation probably looks like this:
One of the Spring Boot autoconfiguration classes is triggered by the presence of a class on the classpath
The triggered configuration tries to create some bean of a class that is not present in the jar you have (but is in the specific version mentioned in the Spring BOM)
Version incompatibilities may also cause NoSuchMethodError failures.
That's one of the reasons why it is good practice not to run Spring Boot applications inside the container (make jar not war), but as a runnable jar with an embedded container.
Even before Spring Boot, it was good practice to take into account the libraries present on the runtime classpath and mark them as provided in your project. Having different versions of a library on the classpath may cause weird ClassCastExceptions where the names on both ends match, but the rest doesn't.
You could resolve specific cases by disabling the autoconfiguration that causes your issue. You can do that either by adding an exclude to your @SpringBootApplication or by using a property file.
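For example, a minimal sketch, assuming the session auto-configuration is the one being triggered (which the DefaultCookieSerializer case in the question suggests):
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.session.SessionAutoConfiguration;

// Hypothetical: exclude the auto-configuration triggered by
// spring-session classes found on the container classpath.
@SpringBootApplication(exclude = SessionAutoConfiguration.class)
public class MyApplication {
    public static void main(String[] args) {
        SpringApplication.run(MyApplication.class, args);
    }
}
The property-file equivalent is to set spring.autoconfigure.exclude to the fully qualified name of the auto-configuration class in application.properties.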
Edit:
If you don't use a very broad package scan (or use a package name from outside your project in the package scan) in your Spring Boot application, it is unlikely that Spring Boot simply imports configuration from the classpath.
As I mentioned before, it is rather some autoconfiguration being triggered by the existence of a class on the classpath.
Theoretical solution:
You could use the Maven Shade plugin to relocate all packages into your own package space: see the docs.
The problems you'd have to face:
Defining a very broad relocation pattern that still excludes the JEE classes that must stay untouched, so that the container knows how to run your application.
Relocation most likely won't affect package names used as strings in Spring Boot annotations (like @ComponentScan or @ConditionalOnClass). As far as I know this is not implemented yet; you'd have to implement it yourself - maybe as some kind of Shade plugin resource processor.
When relocating classes you'd have to replace package names in all relevant configuration files located in the jars, and possibly merge some of them.
You'd also have to take into account how the libraries you use (and Spring itself) use package names or files.
This is definitely not a trivial task, with many traps ahead. But if done right, it would possibly allow you to disregard what is on the container's classpath: Spring Boot would look for classes in the relocated packages, and you wouldn't have those in ordinary jars.
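A minimal relocation sketch for the pom (the pattern and shaded package are illustrative only):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- illustrative: move Spring classes into your own namespace -->
            <pattern>org.springframework</pattern>
            <shadedPattern>com.mycompany.shaded.org.springframework</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>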
I'm working on a multi-module Maven project in which one of the modules contains a few annotation processors for the custom annotations used by other modules. When I add a dependency on the annotation-processor module to any other module, the annotations of that module are processed by those annotation processors.
But recently I integrated the Checker Framework (for type annotations), and all the custom annotation processors I mentioned above stopped working. Any idea on how to get them to work alongside the Checker Framework would be greatly appreciated.
To clarify the scenario:
Let's say I have a Maven module named module_A. In this module I have a class-level annotation called @FoodItem. I need to enforce a rule that any class annotated with @FoodItem must implement the interface Food. So I wrote an annotation processor, FoodItemAnnotationProcessor, in the same module (module_A), which processes such classes and checks for compliance with that rule.
Then let's say I have another module named module_B which has a Maven dependency on module_A. In this module I have a class called Pizza which is annotated with @FoodItem.
If I build the project (which has module_A and module_B) with the above configuration, the FoodItemAnnotationProcessor is executed at the compile stage and validates the class Pizza against the rule mentioned above.
After that, I integrated the Checker Framework into module_B (as mentioned here). Checker Framework validations are then executed at compile time as expected, but the FoodItemAnnotationProcessor ceased to work.
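For illustration, a simplified sketch of that processor (package names such as com.example are assumed):
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import javax.lang.model.type.TypeMirror;
import javax.tools.Diagnostic;

@SupportedAnnotationTypes("com.example.FoodItem")
@SupportedSourceVersion(SourceVersion.RELEASE_8)
public class FoodItemAnnotationProcessor extends AbstractProcessor {
    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        // the Food interface that every @FoodItem class must implement
        TypeMirror food = processingEnv.getElementUtils()
                .getTypeElement("com.example.Food").asType();
        for (TypeElement annotation : annotations) {
            for (Element element : roundEnv.getElementsAnnotatedWith(annotation)) {
                if (!processingEnv.getTypeUtils().isAssignable(element.asType(), food)) {
                    processingEnv.getMessager().printMessage(Diagnostic.Kind.ERROR,
                            "Classes annotated with @FoodItem must implement Food", element);
                }
            }
        }
        return true;
    }
}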
To understand the problem you must know how javac finds your annotation processors.
When you don't supply the --processor argument to javac (see doc-javac-options), the annotation-processor auto-discovery feature (see javac-doc: Annotation processing) is activated. This means that javac will search for all available annotation processors on your classpath (or on the processor path, if you have specified one).
Jars which include a META-INF/services/javax.annotation.processing.Processor file can specify their annotation processor classes, and javac will automatically use them.
The "problem" is that the Checker Framework has multiple annotation processors for its checks, but you may only want to use some of them: thus annotation-processor auto-discovery cannot be used, and you must manually specify all annotation processors to run in your build file.
For a Maven build you can do it like this (see the Checker Framework doc for Maven):
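For instance, a jar shipping the FoodItemAnnotationProcessor from the question above would typically contain a META-INF/services/javax.annotation.processing.Processor file listing one processor class per line (the package is assumed):
com.example.processor.FoodItemAnnotationProcessor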
<annotationProcessors>
<!-- Add all the checkers you want to enable here -->
<annotationProcessor>org.checkerframework.checker.nullness.NullnessChecker</annotationProcessor>
</annotationProcessors>
This will explicitly set the --processor argument for javac (see doc-javac-options), which disables the default annotation-discovery process.
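Under the hood this corresponds to an explicit javac invocation along these lines (processor class names are comma-separated; the custom processor's package is assumed):
javac -processor org.checkerframework.checker.nullness.NullnessChecker,com.example.processor.FoodItemAnnotationProcessor Pizza.java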
So the solution is to manually add all annotation processors that you want to run (in addition to the checker-framework checkers).
E.g. when you want to run the NullnessChecker and Dagger, you must specify both:
<annotationProcessors>
<!-- Add all the checkers you want to enable here -->
<annotationProcessor>org.checkerframework.checker.nullness.NullnessChecker</annotationProcessor>
<!-- Add all your other annotation processors here -->
<annotationProcessor>dagger.internal.codegen.ComponentProcessor</annotationProcessor>
</annotationProcessors>
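For context, the <annotationProcessors> element belongs inside the maven-compiler-plugin configuration; a minimal sketch (the custom processor's fully qualified name is assumed):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <annotationProcessors>
      <annotationProcessor>org.checkerframework.checker.nullness.NullnessChecker</annotationProcessor>
      <annotationProcessor>com.example.processor.FoodItemAnnotationProcessor</annotationProcessor>
    </annotationProcessors>
  </configuration>
</plugin>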
Hint:
To find out which annotation processors you are currently using, run your build and pass the non-standard option -XprintProcessorInfo to javac.
UPDATE:
The checkers also support some sort of auto-discovery (doc-ref) - Note: I have not used this yet.
2.2.3 Checker auto-discovery
“Auto-discovery” makes the javac compiler always run a checker plugin,
even if you do not explicitly pass the -processor command-line option.
This can make your command line shorter, and ensures that your code is
checked even if you forget the command-line option.
To enable auto-discovery, place a configuration file named
META-INF/services/javax.annotation.processing.Processor in your
classpath. The file contains the names of the checker plugins to be
used, listed one per line. For instance, to run the Nullness Checker
and the Interning Checker automatically, the configuration file should
contain:
org.checkerframework.checker.nullness.NullnessChecker
org.checkerframework.checker.interning.InterningChecker
I use Quartz in my Grails project.
I now wanted to include the Weceem plugin (a "lightweight" CMS).
It turns out that the plugin itself uses Quartz as well.
Now I have a compile error saying:
Invalid duplicate class definition of class QuartzConfig :
The sources xxx\target\work\plugins\weceem-1.4\grails-app\conf\QuartzConfig.groovy and xxx\grails-app\conf\QuartzConfig.groovy each contain a class with the name QuartzConfig.
QuartzConfig.groovy /xxx/.link_to_grails_plugins/weceem-1.4/grails-app/conf
What can I do?
EDIT: Of course, I want to use my QuartzConfig. It should override the plugin's one.
I have projectA, projectB, and projectC Eclipse Maven projects.
ProjectA contains:
IMyApi interface.
"Empty" META-INF\beans.xml file.
ProjectB contains:
IMyConfig interface.
MyConfigJndi implementation of IMyConfig.
MyApiImpl implementation of IMyApi, with a property @Inject private IMyConfig config;.
"Empty" META-INF\beans.xml file.
ProjectC contains:
a MyConfigAlter implementation of IMyConfig, marked as @Alternative.
a Main class (and method) that initializes Weld SE and retrieves an IMyApi bean.
a META-INF\beans.xml where MyConfigAlter is listed in the alternatives section.
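For reference, that beans.xml looks roughly like this (the package name is assumed):
<beans xmlns="http://xmlns.jcp.org/xml/ns/javaee">
    <alternatives>
        <class>com.example.MyConfigAlter</class>
    </alternatives>
</beans>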
Now, when I run the Main class, the IMyApi bean is successfully retrieved (as a MyApiImpl instance). But that instance has had a MyConfigJndi instance injected into its config property, instead of the alternative version (MyConfigAlter).
I am using Eclipse Luna + M2Eclipse.
What am I doing wrong?
UPDATE: I found out that using @Specializes instead of @Alternative solves the issue, but I still think it is not the proper solution (in some situations I may not have access to the "default" implementation).
UPDATE 2:
I am using Weld-se, 2.2.10.Final:
<dependency>
<groupId>org.jboss.weld.se</groupId>
<artifactId>weld-se</artifactId>
<version>2.2.10.Final</version>
<scope>runtime</scope>
</dependency>
And the initialization is simply
WeldContainer weld = new Weld().initialize();
IMyApi myApi = weld.instance().select(IMyApi.class).get();
Selecting an alternative using the alternatives element in the beans.xml descriptor only affects the corresponding bean archive, i.e. ProjectC in your case, as documented in Declaring selected alternatives for a bean archive. Based on that, it is logical that the ProjectB bean archive gets the MyConfigJndi implementation injected.
Since CDI 1.2, it is possible to select an alternative globally for the application using the #Priority annotation as documented in Declaring selected alternatives for an application.
So in your case, you could write:
@Priority(Interceptor.Priority.APPLICATION)
@Alternative
public class MyConfigAlter implements IMyConfig {
...
}
Another way to solve this is to use -Dorg.jboss.weld.se.archive.isolation=false - from http://docs.jboss.org/weld/reference/2.2.11.Final/en-US/html/environments.html#_bean_archive_isolation_2
The reason this happens is that each JAR on the classpath becomes its own bean archive. Since the CDI spec as of 1.2 does not include an SE specification, there is no definition of how the classpath operates in this mode. This isn't necessarily how an SE app would be designed, since you don't have unique classloaders for each JAR.
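A small sketch of the same flag set programmatically (this assumes Weld reads the system property during initialization, so it must be set before new Weld().initialize()):
// equivalent to passing -Dorg.jboss.weld.se.archive.isolation=false on the command line
System.setProperty("org.jboss.weld.se.archive.isolation", "false");
WeldContainer weld = new Weld().initialize();
IMyApi myApi = weld.instance().select(IMyApi.class).get();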