APT generators stop working after update to Jakarta/Java 17

I am working on a rather large project consisting of Spring and Java EE components. Right now I am updating from javax to Jakarta EE and also from Java 11 to 17. The build tool is Gradle, with several modules that depend on one another. I have already changed things to make the first modules compile and am now at the module containing the JPA entities.
Suddenly I find myself in a situation where the metamodel is no longer created.
My basic assumption here is that Gradle itself does not participate in the generation of the metamodel. Instead, the APT generators are handed over to javac, which recognizes them and executes them as part of compilation.
I have debugged the Gradle execution, and the -processorpath option of javac contains the jpa-metamodel JAR as expected.
Therefore I assume I am looking at a Java problem rather than a Gradle problem.
I did a very simple POC project with the same Java/Hibernate versions and there it worked.
With this project I was able to verify that the metamodel generator issues a log statement. This statement is also issued if the source code does not compile, so it seems to be an indicator of whether the modelgen is working at all.
This log message is not written in my main project.
Also, next to the jpa-metamodel there are two other APT generators at work. Both show similar behavior, i.e. they print a log line when working, but not in my updated project.
This brings me to the assumption that the problem is not with the jpa-metamodel-gen but rather with APT itself.
Where should I be looking next?
EDIT
I have done some more research. Via gradle --debug I was able to extract the actual javac call. Executing that from the console shows the same error.
I reduced the number of elements to be compiled. It seems that as soon as I put an actual entity on the list of things to be compiled, the APT generator goes dormant. Putting only a mapped superclass (the entities' supertype) on the list still leaves the APT active. WTF?
EDIT2
I am currently working toward a reproducible example. I've cut out a lot of business-related libs, and right now what I have is a javac call where -processorpath only mentions the jpamodelgen and -classpath only mentions hibernate-core (both in version 6.1.6).
In this constellation the modelgen does not work. As soon as I remove hibernate-core, it works.
Next step: I'll strip down the entities.

All right, that was unexpected.
I condensed it down to a minimal example. Required for this issue to hit is a call similar to this one:
javac \
  -processorpath hibernate-jpamodelgen-6.1.6.Final.jar \
  -classpath hibernate-core-6.1.6.Final.jar \
  BaseEntity.java
The additional condition that triggers this issue is that BaseEntity has to carry a broken @Type annotation. @Type changed its signature from Hibernate 5 to Hibernate 6, and my entities were not yet up to date.
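For illustration, here is a minimal sketch of the kind of entity that triggers it (class and field names invented; the point is the old Hibernate 5 string-based @Type usage, which no longer matches the Hibernate 6 signature):

import jakarta.persistence.Entity;
import jakarta.persistence.Id;
import org.hibernate.annotations.Type;

@Entity
public class BaseEntity {

    @Id
    private Long id;

    // Hibernate 5 style: @Type took a String attribute named "type".
    // In Hibernate 6, @Type expects a Class<? extends UserType<?>> instead,
    // so this usage no longer compiles -- and it is exactly such a broken
    // annotation that silently disabled the metamodel generator for me.
    @Type(type = "org.hibernate.type.TextType")
    private String description;
}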
Interestingly enough, neither a general compile error nor a broken usage of other annotations (I tried a broken Spring @Repository one) stops the modelgen from working (nor does the fact that the processorpath in the example is incomplete).
I will report this as an issue to the Hibernate project, as I believe the modelgen should react more gracefully in such a situation.
Reported as bug here:
https://hibernate.atlassian.net/browse/HHH-15946

Related

JasperReport cannot find symbol JREvaluator in WildFly, works without server

Recently I've been working on report generation with Jasper. I created a simple program to test it, and when running it via the IDE it worked fine.
Then I moved the (very short) class to a WildFly server application, and despite having the exact same code and library, generation fails with cannot find symbol. The symbols it cannot find are JREvaluator and JRFillVariable, as well as packages such as net.sf.jasperreports.engine.
So far I have confirmed that:
The project builds (meaning those classes are visible to javac, but not to the JVM)
jasperreports-6.13.0.jar is added to the war (it's present in the /WEB-INF/lib folder alongside other libraries, like gson and hibernate)
jasperreports-6.13.0.jar contains the missing classes
It looks to me like the problem doesn't lie in the library not being loaded or in missing classes (because it works in the testing environment), but rather that something is preventing the JBoss class loader from loading those classes.
Attempted (and failed) solutions
Clean and Build
Adding -Djava.awt.headless=true to the VM options - this did not change anything
Adding -Djava.awt.headless=false to the VM options - also didn't change a thing, but once caused a NullPointerException inside the jasperreports library. The testing program worked in both cases
Adding commons-beanutils-1.9.4.jar, commons-digester-2.1.jar, commons-collections4-4.4.jar and commons-logging-1.2.jar - no change
Adding jasper-compiler-jdt-5.5.23.jar - this caused a different error, namely NoSuchMethodError for org.eclipse.jdt.internal.compiler.ICompilerRequestor and a few others. This library, however, should not be necessary, as - from what I understand - jasperreports-6.13.0.jar already contains its own compiler, and a separate compiler library has not been required for a long time.
What has not been attempted:
Forcing the classes to load (http://www.java2s.com/Code/Java/Reflection/Forcethegivenclasstobeloadedfully.htm)
Dynamically loading the jar at runtime or using a custom class loader
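For reference, the "force the classes to load" idea boils down to something like this minimal sketch (my own version, not the code from the linked page); it at least surfaces the ClassNotFoundException or NoClassDefFoundError early and names the class that is actually missing:

public final class ClassLoadCheck {
    public static void main(String[] args) throws Exception {
        // Resolve the class through the context class loader and initialize it
        // (static initializers included), so missing dependencies show up now.
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        Class<?> evaluator = Class.forName(
                "net.sf.jasperreports.engine.fill.JREvaluator", true, cl);
        System.out.println("Loaded: " + evaluator.getName());
    }
}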
Update: after looking at this answer and applying the suggestion, the missing class was different, which suggests that the dependencies inside jasperreports.jar are not being loaded properly.
I have figured it out
For some reason, in the server project the libraries used by jasperreports.jar were not loaded, while in the testing project they were (might be due to WildFly, might be due to differences between IntelliJ and NetBeans).
Here is the list of libraries I added, based on the pom.xml file inside jasperreports.jar. Some might not be necessary and the list might not be exhaustive (I basically stopped adding libraries once the report started generating), but it's a good enough base if someone else runs into this problem:
commons-beanutils-1.9.4.jar
itext-2.1.7.jar
poi-ooxml-4.1.1.jar
commons-collections4-4.4.jar
jcommon-1.0.23.jar
xalan-2.7.2.jar
commons-digester-2.1.jar
jfreechart-1.0.19.jar
xmpcore-5.1.3.jar
commons-logging-1.2.jar
poi-4.1.1.jar

GWT-Jackson-Apt seemingly undefined class constructor call

I'm looking at using the GWT-Jackson-Apt library for certain RPC, but when looking at examples and trying to run some demos, there are always interfaces with a bizarre, seemingly undefined constructor call.
@JSONMapper
public interface SampleMapper extends ObjectMapper<SimpleBean> {
    SampleMapper INSTANCE = new App_SampleMapperImpl();
}
source: https://github.com/DominoKit/gwt-jackson-apt/blob/f60d0358b90bcbf78d066796f680aeae1d7156bb/samples/basic/basic-client/src/main/java/org/dominokit/jacksonapt/samples/basic/App.java
I've been digging around, but there is no definition of App_SampleMapperImpl() anywhere in the source code, and it doesn't compile, complaining about an undefined symbol.
The exact same thing is done in the readme file's examples, which can be found on this page: https://github.com/DominoKit/gwt-jackson-apt/tree/f60d0358b90bcbf78d066796f680aeae1d7156bb
Can anyone explain what is going on here? How is this constructor being defined, or implied? And what do I need to do to make the example compile?
Assuming you are making a Maven project, the important thing is to include the annotation processor which generates the mappers. Then, once the project knows how to generate them, you'll be able to use them in your code.
Annotation processors run while the compiler is running, which means you technically get to write code that doesn't look like it will compile. Then, as the compiler is running, it asks all registered annotation processors to generate code based on the annotations and existing types (not based on missing references like App_SampleMapperImpl, as you might think). The processor then runs, generates the missing class, and the compile continues.
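To make that concrete, here is a toy sketch of such a processor (names entirely hypothetical, and heavily simplified compared to the real jackson-apt-processor, which emits complete mapper implementations). It shows where the "missing" App_SampleMapperImpl class would come from:

import java.io.IOException;
import java.io.Writer;
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;

// Hypothetical stand-in for the real processor, for illustration only.
@SupportedAnnotationTypes("org.example.JSONMapper")
@SupportedSourceVersion(SourceVersion.RELEASE_8)
public class ToyMapperProcessor extends AbstractProcessor {

    @Override
    public boolean process(Set<? extends TypeElement> annotations,
                           RoundEnvironment roundEnv) {
        for (TypeElement annotation : annotations) {
            for (Element mapper : roundEnv.getElementsAnnotatedWith(annotation)) {
                String pkg = processingEnv.getElementUtils()
                        .getPackageOf(mapper).getQualifiedName().toString();
                String impl = "App_" + mapper.getSimpleName() + "Impl";
                try (Writer w = processingEnv.getFiler()
                        .createSourceFile(pkg + "." + impl, mapper).openWriter()) {
                    // A real processor emits the full mapper implementation here;
                    // this stub only demonstrates the generation step itself.
                    w.write("package " + pkg + ";\n"
                          + "public class " + impl + " { /* generated */ }\n");
                } catch (IOException ex) {
                    throw new RuntimeException(ex);
                }
            }
        }
        return true; // the annotation has been handled by this processor
    }
}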
Usually what happens is that you build while writing code (Eclipse, for example, does this every time a file is saved; IntelliJ does it when you ask for a build, etc.), and then the class exists and can be referenced going forward. Even when the project is cleaned and rebuilt, while the reference seems like it should not work, it will work as soon as the compiler runs.
In this case, we'll need to follow the example to make sure the processor is present. In https://github.com/DominoKit/gwt-jackson-apt/blob/f60d0358b90bcbf78d066796f680aeae1d7156bb/samples/shared-mappers/shared-mappers-shared/pom.xml, we see this in the dependencies:
<dependency>
  <groupId>org.dominokit.jackson</groupId>
  <artifactId>jackson-apt-processor</artifactId>
  <version>1.0-SNAPSHOT</version>
  <scope>provided</scope>
</dependency>
This is marked scope=provided since it is only needed at compile time and shouldn't be included in later dependency graphs. For each specific IDE, you may need to specify additional options to get it to re-run automatically (a checkbox in Eclipse, nothing in IntelliJ I believe, and I haven't used other IDEs in too long to say).
One final note for Maven: you must use a relatively recent maven-compiler-plugin so that generated code is handled correctly: the latest is 3.8.0, published July 2018, but I think anything after 3.5.1 will be sufficient if you must use an older one.
Just follow the example on the main page of the project: https://github.com/DominoKit/gwt-jackson-apt/
Does that work?

SBT: Java Annotation Processors and compileIncremental error

I'm using the immutables.org and MapStruct annotation processors in my sbt project (I've moved them to subprojects so they don't interfere with each other).
Sometimes compiling my project fails in compileIncremental, because either the annotation processor creates a new file but the compiler has already read the previously generated file, or I changed my interface in src/main/java but the (previously) generated sources still "implement" the old interface (they would be overwritten, but only after the sources in src/main/java have been processed).
My workaround was to create a task that deletes the generated sources beforehand, and made (compile in Compile) depend on it.
Is there another way to do this? Like disabling compileIncremental for one single project? Or specifying the order of compilation (first normal sources, then unmanagedSources)?
Alternatively, finding out whether the source files really changed and only then deleting the generated sources would also work for me, but I'm not sure how to approach that.
Any help would be greatly appreciated!
Thanks,
Dominik

RemoteActorRefProvider ClassNotFound

I'm struggling trying to get remote actors set up in Scala. I'm running Scala 2.10.2 and Akka 2.2.1.
I compile using [I've shortened the paths in the classpath arg for clarity's sake]:
$ scalac -classpath "akka-2.2.1/lib:akka-2.2.1/lib/scala-library.jar:akka-2.2.1/lib/akka:akka-2.2.1/lib/akka/scala-reflect-2.10.1.jar:akka-2.2.1/lib/akka/config-1.0.2.jar:akka-2.2.1/lib/akka/akka-remote_2.10-2.2.1.jar:akka-2.2.1/lib/akka/akka-kernel_2.10-2.2.1.jar:akka-2.2.1/lib/akka/akka-actor_2.10-2.2.1.jar:." [file.scala]
I've continuously added new libraries trying to debug this - I'm pretty sure all I really need to include is akka-remote, but the others shouldn't hurt.
No issues compiling.
I attempt to run like this:
$ scala -classpath "[same as above]" [application]
And I receive a NoSuchMethodException:
java.lang.NoSuchMethodException: akka.remote.RemoteActorRefProvider.<init>(java.lang.String, akka.actor.ActorSystem$Settings, akka.event.EventStream, akka.actor.Scheduler, akka.actor.DynamicAccess)
at java.lang.Class.getConstructor0(Class.java:2810)
at java.lang.Class.getDeclaredConstructor(Class.java:2053)
...
Looking into the source code, it appears that Akka 2.2.x's flavor of this constructor takes 4 arguments (the Scheduler was removed), but in Akka < 2.2.x the constructor takes 5 args.
Thus, I'm thinking my classpath isn't set up quite right. At compile time, Scala must be finding the < 2.2.x flavor. I don't even know where it would be finding it, since I only have Akka 2.2.1 installed.
Any suggestions!? Thanks! (Please don't say to use SBT).
The problem here is that the Scala distribution contains akka-actor 2.1.0 and helpfully puts it on the boot class path for you. We very strongly recommend using a dependency manager like sbt or Maven when building anything that goes beyond the most trivial projects.
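To see this for yourself, a small diagnostic along these lines (my own sketch, not from the original thread) prints where akka-actor is actually loaded from and which constructors the provider really has; a null code source typically means the class came from the boot class path:

public final class WhichAkka {
    public static void main(String[] args) throws Exception {
        // Where does akka-actor come from? A null code source usually means
        // the boot class path, i.e. the copy bundled with the Scala distribution.
        Class<?> actorSystem = Class.forName("akka.actor.ActorSystem");
        System.out.println("akka-actor from: "
                + actorSystem.getProtectionDomain().getCodeSource());

        // Which constructors does the provider on the classpath actually expose?
        Class<?> provider = Class.forName("akka.remote.RemoteActorRefProvider");
        for (java.lang.reflect.Constructor<?> ctor : provider.getConstructors()) {
            System.out.println(ctor);
        }
    }
}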
As noted in another answer, the problem is that scala puts a different version of Akka into the bootclasspath.
To more directly answer your question (as you said you don't want to use sbt): you can execute your program with java instead of scala. You just have to put the appropriate Scala jars into the classpath.
Here is a spark-dev message about the problem. The important part is: "the workaround is to use java to launch the application instead of scala. All you need to do is to include the right Scala jars (scala-library and scala-compiler) in the classpath."
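Concretely, the launch then looks something like this (a sketch; the jar locations depend on your installation, and [same as above] is the classpath from the question):
$ java -classpath "/path/to/scala-library.jar:/path/to/scala-compiler.jar:[same as above]" [application]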

Eclipse 3.5+ - Annotation processor: Generated classes cannot be imported

I am using a 3rd party annotation processor for generating meta-data code (.java files) from the annotated classes in my project.
I have successfully configured the processor through Eclipse (Properties -> Java Compiler -> Annotation Processing), and the code generation works fine (code is automatically created and generated). Also, Eclipse successfully auto-completes the generated classes and their fields, without any errors. Let's say that I have a class "some.package.Foo" and that the generated meta-data class is "some.package.Foo_". With the help of auto-completion, I can get the following code in the Eclipse editor, without any errors:
import some.package.Foo_;
...
public class Test {
    void test() {
        Foo_.someField = null; // try to access a field from the generated class Foo_
    }
}
However, as soon as I actually build the project (or just save the file, since Build Automatically is enabled), I get an error telling me that "some.package.Foo_" cannot be resolved.
It seems like Eclipse is generating and compiling some.package.Foo_ at the same time or, more likely, resolving the imports before the generated class exists.
I found two temporary solutions (which practically defeat the use of the annotation processor in the first place):
Before each build of those generated classes, I right-click on every generated file, go to Properties, and uncheck the "Derived" tick. After that, I do a cleanup of the project and the imports are fine - there are no more errors. However, if I do the cleanup one more time, the errors show up again, because regenerating the files causes the "Derived" tick to be checked again (automatically). So this is really annoying and time-consuming.
I also uncheck the "Derived" tick from all those files, and this time I also uncheck the "Derived" tick from the source folder and packages which contain those files. Then I disable the annotation processor and do the cleanup. There are no more import errors, even if I do another cleanup, but there is no benefit to using the annotation processor, because if I were to change something which would update the model, I would need to turn the annotation processor back on and repeat this tedious procedure to turn it off after it has generated the new version of those files.
Is this a bug in Eclipse? If yes, is there a better workaround or quick-fix than the two I have stated above? If not, what should I try to solve the problem?
I also tried rearranging the order of the libraries on the build path and it doesn't help.
I assume that you are generating sources in the last processor round. This is not the recommended way, and it leads exactly to the problem you are seeing.
An explanation is here: http://code.google.com/p/acris/wiki/CodeGenerationPlatform_Pitfall_Rounds
So my advice is to generate sources in the regular processing rounds; the final round should be used just for notification that processing is over, or the like.
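In code, that advice amounts to something like this sketch inside the processor's process method (generateSources being a hypothetical helper for whatever your processor emits):

@Override
public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
    if (roundEnv.processingOver()) {
        // Final round: do not create new source files here; they would not
        // be subject to any further compilation/processing rounds.
        return false;
    }
    // Regular rounds: safe to generate sources via processingEnv.getFiler().
    generateSources(roundEnv); // hypothetical helper
    return true;
}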
Hopefully this helps you.
I have a similar problem, and the only thing I've found is that it's specifically the imports that don't work; references in the class itself do work. The workaround I've used is to use the FQCN everywhere the generated class is needed (except when the generated class is in the same package, since then the import is obviously not needed).
So to use your example, I'd do:
public class Test {
    void test() {
        some.package.Foo_.someField = null; // access the field from the generated class Foo_ via its FQCN
    }
}
My only guess, then, is that the Eclipse compiler is processing the imports before doing the annotation processing, which IMHO must be a bug in Eclipse.
I know this question is over a year old, so I'd be interested to know if you've found any other way to fix it.
We were experiencing a similar problem and apparently just solved it, so thought of sharing it at SO, in case it helps someone.
We are using:
Eclipse Indigo (Build id: 20120216-1857)
m2e Connector for maven
openJPA for static metamodel class generation
Our problem:
Say we have a package named com.abc.xyz and an entity class in there named OurEntity. When we build the projects (JPA, EJB, EAR etc., all together, with an mvn clean at the beginning), the metamodel classes get generated and also get appropriately packaged within the PU jar. But when we try to import the generated metamodel class com.abc.xyz.OurEntity_, Eclipse cannot resolve it. The OP apparently got past this point :-). The Maven build failed, saying it could not resolve that class. Not much help from Google, except for a few bug reports such as this one: https://bugs.eclipse.org/bugs/show_bug.cgi?id=350378
That bug report said importing the whole package, as opposed to the single class, helped. We tried that, but with no benefit. It also said (and so did David Heitzman) that using the fully qualified class name worked for them. That did not work either.
The solution:
We added the PU jar to the Eclipse build path of the project that needed to use the metamodel classes. All of a sudden, all the red underlines went away (no surprise). The fear was that there might then be two PUs in the same EAR, but Maven automagically took care of that.
As this rather old question got some attention without pointing to the very probable eclipse bug the OP was specifically asking for, I'd like to complement the above answers with a pointer to the eclipse bug tracker:
Cannot resolve import for generated class IF processing annotations with parameters referencing constants
The workarounds include:
doing a wildcard import of the package defining the generated classes (i.e. import some.package.*;)
using the fully qualified name of your generated class, i.e. referring to some.package.Foo_ in your code and not using an import
switching to a newer Eclipse. This specific Eclipse bug is resolved as of Eclipse 4.4 (aka Luna).
