Is there a way to get Dagger to spit out why it didn't generate a particular component?
I tried refactoring some of our modules and ended up breaking something, but I have no idea what I broke! All I can see is that all my DaggerFoo components are missing, because Dagger is apparently failing silently.
I've tried compiling with verbose diagnostics and a higher maximum error count, but I still see absolutely nothing from Dagger itself saying what went wrong.
-Xdiags:verbose
-Xmaxerrs=1000
I have no relevant errors to share, because none are printed!
How the heck do you debug Dagger2?
Dagger runs as an annotation processor, so its error messages will manifest as compiler errors. These will often look like the following ("X cannot be provided..."):
error: some.injected.ClassName cannot be provided without an @Inject constructor
or from an @Provides- or @Produces-annotated method.
some.injected.ClassName is injected at
some.class.that.InjectsIt
some.class.that.InjectsThatAbove
some.class.that.FurtherInjectsThat
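For orientation, here is a minimal sketch (class names are made up, not taken from the question) of the two usual ways to satisfy such an error: give the class an @Inject constructor, or bind it from a @Provides method in a @Module.
import javax.inject.Inject;
import dagger.Module;
import dagger.Provides;

// Option 1: let Dagger construct the class itself via an @Inject constructor.
class ClassName {
    @Inject
    ClassName() {}
}

// Option 2: provide an instance explicitly from a module
// (the module must then be listed on the component).
@Module
class ClassNameModule {
    @Provides
    static ClassName provideClassName() {
        return new ClassName();
    }
}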
If you're not sure where to look for compiler errors, you can see some other answers here:
Android Studio: Where is the Compiler Error Output Window?
How to view the list of compile errors in IntelliJ?
eclipse annotation processor not working. Where are errors shown?
If you've edited your project configuration or Gradle build script, it is also possible that Dagger is no longer running at all, or that it hasn't run for a while and the build has been working only off its previous output. If so, check your Gradle file or Eclipse project definition to ensure that you are including Dagger as an annotationProcessor, and that you have at least one @Component interface for that annotation processor to find.
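As a sanity check that the processor has something to work from, a minimal sketch of what it needs to find might look like this (names are illustrative; the processor generates DaggerCarComponent from it):
import dagger.Component;
import javax.inject.Inject;

class Engine {
    @Inject
    Engine() {}
}

// Without at least one interface like this on the compile path, the
// annotation processor has nothing to generate, and no DaggerFoo
// classes will appear.
@Component
interface CarComponent {
    Engine engine();
}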
I am working on a rather large project consisting of Spring and Java EE components. Right now I am migrating from javax to Jakarta EE and also from Java 11 to 17. The build tool is Gradle with several modules that depend on one another. I have already made the changes to get the first modules to compile and am now at the module containing the JPA entities.
Suddenly I find myself in a situation where the metamodel is no longer generated.
My basic assumption here is that Gradle itself does not participate in generating the metamodel. Instead, the annotation processors are handed over to javac, which recognizes and executes them as part of compilation.
I have debugged the Gradle execution, and the processorpath option passed to javac contains the jpa-metamodel jar as expected.
Therefore, I assume I am looking at a Java problem rather than a Gradle one.
I did a very simple POC project with the same Java/Hibernate versions, and there it worked.
With this project I was able to verify that the metamodel generator issues a log statement. This statement is also issued if the source code does not compile, so it seems to be an indicator of whether the modelgen is running at all.
This log message is not written in my main project.
Also, besides the jpa-metamodel generator, there are two other annotation processors at work. Both show similar behavior, i.e. they print a log line when working, but not in my updated project.
This brings me to the assumption that the problem is not with the jpa-metamodel generator but rather with annotation processing itself.
Where should I be looking next?
EDIT
I have done some more research. Via gradle --debug I was able to extract the actual javac call. Executing that from the console shows the same problem.
I reduced the number of elements to be compiled. It seems that as soon as an actual entity is put on the list of things to be compiled, the APT generator goes dormant. Putting only a MappedEntity supertype on the list still leaves the APT active. WTF?
EDIT2
I am currently on my way to a reproducible example. I've cut out a lot of business-related libs, and right now what I have is a javac call where -processorpath only mentions the jpamodelgen and -classpath only mentions hibernate-core (both in version 6.1.6).
In this constellation the modelgen does not work. As soon as I remove hibernate-core, it works.
Next step: I'll strip down the entities.
All right, that was unexpected.
I condensed it down to a minimal example. For this issue to hit, a call similar to this one is required:
javac \
    -processorpath hibernate-jpamodelgen-6.1.6.Final.jar \
    -classpath hibernate-core-6.1.6.Final.jar \
    BaseEntity.java
The additional condition that triggers this issue is that BaseEntity has to carry a dysfunctional @Type annotation. @Type changed its signature from Hibernate 5 to Hibernate 6, and my entities were not yet up to date.
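For illustration only (field and type names here are made up, not taken from the report), such a dysfunctional entity looks roughly like this; the @Type usage is valid Hibernate 5 but no longer compiles against hibernate-core 6.x:
import jakarta.persistence.Entity;
import jakarta.persistence.Id;
import org.hibernate.annotations.Type;

@Entity
public class BaseEntity {

    @Id
    private Long id;

    // Hibernate 5 style: @Type had a String "type" attribute. In Hibernate 6
    // the annotation instead takes a UserType implementation class, e.g.
    // @Type(MyCustomType.class), so this line fails to compile against 6.1.6
    // -- and, as described above, also silences the metamodel generator.
    @Type(type = "org.hibernate.type.TextType")
    private String payload;
}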
Interestingly enough, neither a general compile error nor a specific one on other annotations (I tried a broken Spring @Repository one) stops the modelgen from working (nor does the fact that the processorpath is incomplete in the example).
I will report this as an issue to the Hibernate project, as I believe the modelgen should react more gracefully in such a situation.
Reported as bug here:
https://hibernate.atlassian.net/browse/HHH-15946
What are the possible causes of AbstractMethodError?
Exception in thread "pool-1-thread-1" java.lang.AbstractMethodError:
org.apache.thrift.ProcessFunction.isOneway()Z
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:51)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at com.gemfire.gemstone.thrift.hbase.ThreadPoolServer$ClientConnnection.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
It usually means that you are using an old version of an interface implementation which is missing a new interface method. For example, the java.sql.Connection interface got a new getSchema method in Java 1.7. If you have a 1.6 JDBC driver and call Connection.getSchema, you will get an AbstractMethodError.
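As a sketch of that mechanism with made-up names (it cannot be reproduced in a single compilation, because javac would reject it; the situation only arises when class files compiled against different library versions are mixed at run time):
// "Version 2" of a library interface, as present on the classpath at run time:
public interface SchemaAware {
    void connect();
    String getSchema();   // method added in a later release
}

// An implementation compiled against "version 1" (before getSchema existed)
// and shipped as a prebuilt jar -- its .class file simply has no getSchema():
//
//     public class OldDriverConnection implements SchemaAware {
//         public void connect() { /* ... */ }
//     }
//
// Mixing that old class file with the new interface at run time:
//
//     SchemaAware s = new OldDriverConnection();
//     s.getSchema();    // throws java.lang.AbstractMethodError
//
// In a single compilation the compiler would reject OldDriverConnection,
// which is why the error only shows up at run time.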
The simple answer is this: some code is trying to call a method which is declared abstract. Abstract methods have no body and cannot be executed. Since you have provided so little information, I can't really elaborate on how this can happen; the compiler usually catches this problem, so as described here, it means the class must have changed incompatibly at run time.
From the documentation of AbstractMethodError:
Thrown when an application tries to call an abstract method. Normally,
this error is caught by the compiler; this error can only occur at run
time if the definition of some class has incompatibly changed since
the currently executing method was last compiled.
A kind of special case of the above answer.
I had this error because I was using a spring-boot-starter-parent (e.g. 2.1.0.RELEASE, which uses Spring version 5.1.2.RELEASE) but also included a BOM that defined some Spring dependencies in an older version (e.g. 5.0.9.RELEASE).
So one thing to do is to check your dependency tree (in Eclipse, for example, you can use the Dependency Hierarchy view) to see whether you are using the same versions.
So one solution could be to upgrade the Spring dependencies in your BOM; another could be to exclude them (but depending on how many there are, this could get ugly).
If you download a project as a zip file, unzip it, and import it into Android Studio, you may be unable to run the project because of this error.
I got past the problem by deleting my Android Studio installation, then downloading and installing the newest version.
I truly hope it helps.
If you are getting this error on the implemented methods, make sure you have added your dependencies correctly as mentioned in this thread.
As Damian quoted:
Normally, this error is caught by the compiler; this error can only
occur at run time if [...]
I had the same error, not caught by the compiler but occurring at run time. To solve it, I simply compiled again without modifying the code.
If you are getting this error on a minified build using ProGuard, check whether the class is a POJO class, and if so, exclude it from ProGuard using the rule below:
-keep class your.application.package.pojo.** {*;}
I had the same error when I imported an Eclipse project into the IntelliJ IDE. I imported it again without the .iml file, and my problem was solved.
I got this problem when I updated my Kotlin plugin to a new version. The problem was that my POM file was still using the older Kotlin version. I mention it in case someone else makes the same mistake.
I am getting various errors like these infrequently on Android. I have to clean everything, change some configuration, rebuild, then change the configuration back to normal; somehow the build tools just don't rebuild everything they should for whatever reason (obviously an Android Gradle bug).
I'm looking at using the GWT-Jackson-Apt library for certain RPC tasks, but when looking at the examples and trying to run some demos, there are always interfaces with a bizarre call to an undefined constructor.
@JSONMapper
public interface SampleMapper extends ObjectMapper<SimpleBean> {
    SampleMapper INSTANCE = new App_SampleMapperImpl();
}
source: https://github.com/DominoKit/gwt-jackson-apt/blob/f60d0358b90bcbf78d066796f680aeae1d7156bb/samples/basic/basic-client/src/main/java/org/dominokit/jacksonapt/samples/basic/App.java
I've been digging around, but there is no definition of App_SampleMapperImpl() anywhere in the source code, and it doesn't compile, saying that the symbol is undefined.
The exact same thing is done in the readme file's examples, which can be found on this page: https://github.com/DominoKit/gwt-jackson-apt/tree/f60d0358b90bcbf78d066796f680aeae1d7156bb
Can anyone explain what is going on here? How is this constructor being defined, or implied? And what do I need to do to make the example compile?
Assuming you are making a Maven project, the important thing is to include the annotation processor which generates the mappers. Then, once the project knows how to generate them, you'll be able to use them in your code.
Annotation processors run while the compiler is running, which means you technically get to write code that doesn't appear to compile. As the compiler runs, it asks all registered annotation processors to generate code based on the annotations and existing types (not the missing references like App_SampleMapperImpl, as you might think). The processor then runs, generates the missing class, and the compile continues.
Usually what happens is that you build while writing code (Eclipse, for example, does this every time a file is saved, IntelliJ does it when you ask for a build, etc.), and then the class exists and can be referenced going forward. Even when the project is cleaned and rebuilt, while the reference seems like it should not work, it will work as soon as the compiler runs.
In this case, we'll need to follow the example to make sure the processor is present. In https://github.com/DominoKit/gwt-jackson-apt/blob/f60d0358b90bcbf78d066796f680aeae1d7156bb/samples/shared-mappers/shared-mappers-shared/pom.xml, we see this in the dependencies:
<dependency>
    <groupId>org.dominokit.jackson</groupId>
    <artifactId>jackson-apt-processor</artifactId>
    <version>1.0-SNAPSHOT</version>
    <scope>provided</scope>
</dependency>
This is marked scope=provided since it is only needed at compile time and shouldn't be included in later dependency graphs. For each specific IDE, you may need to specify additional options to get the processor to re-run automatically (a checkbox in Eclipse, nothing in IntelliJ I believe, and I haven't used other IDEs in too long to say).
One final note for Maven: you must use a relatively recent maven-compiler-plugin so that generated code is handled correctly. The latest is 3.8.0, published July 2018, but I think anything after 3.5.1 will be sufficient if you must use an older one.
Just follow the example on the main page of the project: https://github.com/DominoKit/gwt-jackson-apt/
Does that work?
I'm trying to understand more about how GWT compilation works.
More specifically, I want to know how GWT decides that a particular error is fatal and the app compilation should fail because of it, and how it decides that compilation is successful even though there are compilation errors.
The reason I'm asking is that it's very difficult, when searching my log, to distinguish legitimate errors from ones that don't seem to cause any problem.
I'm talking about GWT 2.7 and GWT 2.8 (which I've seen exhibit the same behavior).
Also, I'm using GWTP 1.5.3, if that is somehow relevant.
A concrete example: I have this error in my logs:
Tracing compile failure path for type 'myApp.ClientModule'
Errors in 'file:/E:/data/.../myApp/ClientModule.java'
Line 24: No source code is available for type myApp.client.ServicesProvidersModuleGen; did you forget to inherit a required module?
Checked 1 dependencies for errors.
The error above does not make my app fail compilation, and myApp works just fine (the class is something that registers some GIN bindings, which also work).
Why didn't GWT fail my compilation when it encountered that error?
Additionally, I also have other errors such as:
Errors in 'com/google/gwt/validation/client/impl/AbstractGwtSpecificValidator.java'
Line 102: No source code is available for type javax.validation.ValidationException; did you forget to inherit a required module?
Line 177: No source code is available for type javax.validation.ConstraintValidator<A,T>; did you forget to inherit a required module?
Line 153: No source code is available for type javax.validation.groups.Default; did you forget to inherit a required module?
Line 302: No source code is available for type javax.validation.ConstraintViolation<T>; did you forget to inherit a required module?
These errors also don't fail my compilation. Why?
Edit1: forgot to add.
I'm tempted to guess that compilation fails when the error is in something directly reachable from an entry point, and that compilation is OK when that code is not reachable.
However, I have the counter-example of code with annotations.
I have code that IS reachable from the entry point, and has annotations whose source code is not available, and yet the compilation succeeds (although this is the only exception that I could find so far).
Your analysis is good.
GWT will scan the entire classpath, ignoring everything not on the source path and "rebasing" super-sources. During that scan it emits the kind of error you saw, but only when code reachable from an entry point actually uses the missing sources does the error become fatal. Annotations are no exception, but code will never actually reach them, as they're just metadata (unless you implement an @interface, which Java allows). Annotations can be used by generators though, in which case they can fail the build.
Note that if you use -failOnError (or -strict, which is an alias), then all errors are fatal. You should aim to turn this on, IMO.
I have an annotation processor for an annotation with retention policy SOURCE.
I have no idea how to step-debug it.
I have added print statements and logger calls, and when I run mvn install, compile, or package, or ant javac, I see their output in the compile log.
However, I have no idea how to step-debug the processor in Eclipse. I mean, how do you step-debug compile time?
An option in recent times is to use something like http://github.com/google/compile-testing, which lets you invoke the compilation job against arbitrary annotation processors, in which you can set breakpoints, step through, etc.
@Test public void testStuff() {
  // Create a source file to process, or load one from disk.
  JavaFileObject file = JavaFileObjects.forSourceLines("test.TestInterface",
      "package test;",
      "",
      "import bar.*;",
      "",
      "@MyAnnotation(blah=false)",
      "interface TestInterface {",
      "  Bar someBar();",
      "}");
  // Assert conditions following a compilation in the context of MyProcessor.
  assert_().about(javaSource()).that(file)
      .processedWith(new MyProcessor())
      .failsToCompile()
      .withErrorContaining("some error message").in(file).onLine(5);
}
This test expects that you will get some error message because @MyAnnotation is incorrectly declared in the test source. If this assertion fails, you can run the test in debug mode in your IDE, set breakpoints in MyProcessor, and step through with a full compiler environment active during debugging.
For unit testing specific methods within your processor, you can also use the @Rule called CompilationRule, from which you can obtain the Elements and Types utility classes in order to test specific logic in your processor in a more isolated way.
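A small sketch of that approach (the assertion and the helper being exercised are placeholders; CompilationRule itself comes from the compile-testing library):
import static org.junit.Assert.assertNotNull;

import com.google.testing.compile.CompilationRule;
import javax.lang.model.element.TypeElement;
import javax.lang.model.util.Elements;
import javax.lang.model.util.Types;
import org.junit.Rule;
import org.junit.Test;

public class MyProcessorUtilsTest {
    // CompilationRule runs each test inside a live javac instance, so real
    // Elements and Types implementations are available to the test body.
    @Rule public final CompilationRule compilation = new CompilationRule();

    @Test public void resolvesTypeElement() {
        Elements elements = compilation.getElements();
        Types types = compilation.getTypes();
        TypeElement string = elements.getTypeElement("java.lang.String");
        assertNotNull(string);
        // ... call your processor's helper logic with these utilities
        // and assert on the result.
    }
}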
You have to invoke the Java compiler from Eclipse, using a debug configuration (you'll need to create the configuration manually, from the "Debug Configurations..." menu choice).
The "correct" way to invoke the Java compiler under JDK 1.6 or above is to use the JavaCompiler interface in javax.tools, which you get from the ToolProvider (I include all the links because there's a decent amount of class/package documentation that you should read).
The "quick-and-dirty" way (that should work, but I make no guarantees) is to invoke com.sun.tools.javac.Main.main(), passing it your normal command-line arguments. To do this, you'll need tools.jar on your classpath (it's found in $JAVA_HOME/lib).
Annotation processing occurs during compilation, so normal debugging won't work. If you want to debug it in the context of your project, you can use Eclipse remote debugging while having Gradle or Maven in debug mode. Then you can put breakpoints in the annotation processor's files.
See Debugging an Annotation Processor in any project.
Disclaimer: I wrote the post.