I am using a third-party annotation processor to generate meta-data code (.java files) from the annotated classes in my project.
I have successfully configured the processor through Eclipse (Properties -> Java Compiler -> Annotation Processing) and the code generation works fine (the files are created automatically). Eclipse also successfully auto-completes the generated classes and their fields, without any errors. Let's say that I have a class "some.package.Foo" and that the generated meta-data class is "some.package.Foo_". With the help of auto-completion, I can get the following code in the Eclipse editor, without any errors:
import some.package.Foo_;
...
public class Test {
    void test() {
        Foo_.someField = null; // try to access a field from the generated class Foo_
    }
}
However, as soon as I actually build the project (or just save the file, since Build Automatically is enabled), I get an error saying that "some.package.Foo_" cannot be resolved.
It seems as if Eclipse is generating and compiling some.package.Foo_ at the same time, so the generated class is not yet available when the rest of the code is compiled.
I found two temporary solutions (which practically defeat the purpose of using the annotation processor in the first place):
Before each build of the generated classes, I right-click every generated file, go to Properties, and uncheck the "Derived" tick. After that, I do a clean-up of the project and the imports are fine - there are no more errors. However, if I do the clean-up one more time, the errors show up again, because regenerating the files causes the "Derived" tick to be checked again (automatically). So this is really annoying and time-consuming.
As in the first workaround, I uncheck the "Derived" tick from all those files, and this time I also uncheck the "Derived" tick from the source folder and packages which contain those files. Then I disable the annotation processor and do the clean-up. There are no more import errors, even if I do another clean-up, but there is no benefit to using the annotation processor: if I were to change something that updates the model, I would need to turn the annotation processor back on, and then repeat this tedious procedure to turn it off again after it has generated the new version of those files.
Is this a bug in Eclipse? If yes, is there a better workaround or quick-fix than the two I have stated above? If not, what should I try to solve the problem?
I also tried rearranging the order of the libraries on the build path, but it didn't help.
I assume that you are generating sources in the last processor round. This is not the recommended way, and it leads to exactly the problem you are having.
An explanation is here: http://code.google.com/p/acris/wiki/CodeGenerationPlatform_Pitfall_Rounds
So my advice is to generate sources in the regular processing rounds; the final round should be used just for notification that processing is over, or something like that.
Hopefully this helps you.
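For illustration, here is a minimal sketch of that structure (the annotation name my.annotations.GenerateMeta and the target package meta are made up): sources are written in the regular rounds, while the final round only reports that processing is done.

import java.io.IOException;
import java.io.Writer;
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import javax.tools.Diagnostic;

@SupportedAnnotationTypes("my.annotations.GenerateMeta") // hypothetical annotation
@SupportedSourceVersion(SourceVersion.RELEASE_8)
public class MetaProcessor extends AbstractProcessor {

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        if (roundEnv.processingOver()) {
            // Final round: only notify, never create source files here.
            processingEnv.getMessager().printMessage(Diagnostic.Kind.NOTE, "meta-data generation finished");
            return false;
        }
        // Regular rounds: generate sources here, so the compiler picks them up in the next round.
        for (TypeElement annotation : annotations) {
            for (Element element : roundEnv.getElementsAnnotatedWith(annotation)) {
                String name = element.getSimpleName() + "_";
                try (Writer w = processingEnv.getFiler().createSourceFile("meta." + name, element).openWriter()) {
                    w.write("package meta;\npublic class " + name + " {}\n");
                } catch (IOException e) {
                    processingEnv.getMessager().printMessage(Diagnostic.Kind.ERROR, e.getMessage(), element);
                }
            }
        }
        return false;
    }
}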
I have a similar problem, and the only thing I've found is that it's the imports specifically that don't work, but the references in the class itself do work. The workaround I've used is to use the FQCN in all cases where the generated class is needed (except when the generated class is in the same package, since then the import is obviously not needed).
So to use your example, I'd do:
public class Test {
    void test() {
        some.package.Foo_.someField = null; // try to access a field from the generated class Foo_
    }
}
My only guess then is that the Eclipse compiler is processing the imports before doing the annotation processing, which IMHO must be a bug in Eclipse.
I know this question is over a year old, so I'd be interested to know if you've found any other way to fix it.
We were experiencing a similar problem and apparently just solved it, so I thought of sharing it on SO, in case it helps someone.
We are using:
Eclipse Indigo (Build id: 20120216-1857)
m2e Connector for maven
openJPA for static metamodel class generation
Our problem:
Say we have a package named com.abc.xyz and an entity class in there named OurEntity. When we build the projects (JPA, EJB, EAR etc., all together with an mvn clean at the beginning), the metamodel classes get generated and also get appropriately packaged within the PU jar. But when we try to import the generated metamodel class com.abc.xyz.OurEntity_, Eclipse cannot resolve it. (The OP apparently got past this point :-).) The Maven build failed, saying it could not resolve that class. Not much help from Google except for a few bug reports such as this one: https://bugs.eclipse.org/bugs/show_bug.cgi?id=350378
That bug report said importing the whole package, as opposed to the single class, helped. We tried that, but with no benefit. It also said (and so did David Heitzman) that using the fully qualified class name worked for them. That did not work either.
The solution:
We added the PU jar to the Eclipse build path of the project that needed to use the metamodel classes. All of a sudden, all the red underlines went away (not a surprise). The fear was that there might now be two PUs in the same EAR, but Maven automagically took care of that.
As this rather old question got some attention without pointing to the very probable Eclipse bug the OP was specifically asking about, I'd like to complement the above answers with a pointer to the Eclipse bug tracker:
Cannot resolve import for generated class IF processing annotations with parameters referencing constants
The workarounds include:
doing a wildcard import of the package defining the generated classes (i.e. import some.package.*; see the sketch below)
using the fully qualified name of your generated class, i.e. referring to some.package.Foo_ in your code and not using an import
switching to a newer Eclipse; this specific bug is resolved as of Eclipse version 4.4 (aka Luna).
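Applied to the OP's example, the wildcard-import workaround looks like this (keeping the OP's placeholder package name):

import some.package.*; // wildcard import instead of importing Foo_ directly

public class Test {
    void test() {
        Foo_.someField = null; // resolves once the processor has generated Foo_
    }
}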
I'm looking at using the GWT-Jackson-Apt library for certain RPC tasks, but when looking at examples and trying to run some demos, there are always interfaces with a bizarre call to an undefined constructor.
@JSONMapper
public interface SampleMapper extends ObjectMapper<SimpleBean> {
    SampleMapper INSTANCE = new App_SampleMapperImpl();
}
source: https://github.com/DominoKit/gwt-jackson-apt/blob/f60d0358b90bcbf78d066796f680aeae1d7156bb/samples/basic/basic-client/src/main/java/org/dominokit/jacksonapt/samples/basic/App.java
I've been digging around, but there is no definition of App_SampleMapperImpl anywhere in the source code, and the example doesn't compile, complaining about an undefined symbol.
The exact same thing is done in the readme file's examples, which can be found on this page: https://github.com/DominoKit/gwt-jackson-apt/tree/f60d0358b90bcbf78d066796f680aeae1d7156bb
Can anyone explain what is going on here? How is this constructor being defined, or implied? And what do I need to do to make the example compile?
Assuming you are making a Maven project, the important thing is to include the annotation processor which generates the mappers. Then, once the project knows how to generate them, you'll be able to use them in your code.
Annotation processors run while the compiler is running, which means you technically get to write code which doesn't look like it will compile. Then, as the compiler runs, it asks all registered annotation processors to generate code based on the annotations and existing types (not the missing references like App_SampleMapperImpl, as you might think). The processor then runs, generates the missing class, and the compilation continues.
Usually what happens is that you build while writing code (Eclipse, for example, does this every time a file is saved; IntelliJ does it when you ask for a build; etc.), and then the class exists and can be referenced going forward. Even when the project is cleaned and rebuilt, the reference seems like it should not work, but it will work as soon as the compiler runs.
In this case, we need to follow the example to make sure the processor is present. In https://github.com/DominoKit/gwt-jackson-apt/blob/f60d0358b90bcbf78d066796f680aeae1d7156bb/samples/shared-mappers/shared-mappers-shared/pom.xml, we see this in the dependencies:
<dependency>
    <groupId>org.dominokit.jackson</groupId>
    <artifactId>jackson-apt-processor</artifactId>
    <version>1.0-SNAPSHOT</version>
    <scope>provided</scope>
</dependency>
This is marked scope=provided since it is only needed at compile time and shouldn't be included in later dependency graphs. For each specific IDE, you may need to specify additional options to get the processor to re-run automatically (a checkbox in Eclipse, nothing in IntelliJ I believe, and I haven't used other IDEs in too long to say).
One final note for Maven: you must use a relatively recent maven-compiler-plugin so that generated code is handled correctly. The latest is 3.8.0, published July 2018, but I think anything after 3.5.1 will be sufficient if you must use an older one.
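For reference, a minimal sketch of pinning the compiler plugin in the pom (the 3.8.0 version is taken from the note above; the Java level is an assumption):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>3.8.0</version>
    <configuration>
        <source>1.8</source> <!-- assumed Java level -->
        <target>1.8</target>
    </configuration>
</plugin>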
Just follow the example on the main page of the project: https://github.com/DominoKit/gwt-jackson-apt/
Does that work?
I'm using the immutables.org and MapStruct annotation processors in my sbt project (I've moved them to subprojects so they don't interfere with each other).
Sometimes, compiling my project fails in compileIncremental, either because the annotation processor creates a new file after the compiler has already read the previously generated one, or because I changed my interface in src/main/java but the (previously) generated sources still "implement" the old interface (they would be overwritten, but only after the sources in src/main/java have been processed).
My workaround was to create a task that deletes the generated sources beforehand, and to make (compile in Compile) depend on it, roughly as sketched below.
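A minimal sketch of such a task (sbt 0.13 syntax to match the question; the task name is made up, and it assumes the processors write into sourceManaged):

lazy val deleteGenerated = taskKey[Unit]("Deletes previously generated sources")

deleteGenerated := {
  // Assumption: generated sources live under sourceManaged; adjust to the real output directory.
  IO.delete((sourceManaged in Compile).value)
}

compile in Compile := ((compile in Compile) dependsOn deleteGenerated).value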
Is there another way to do this? Like disabling compileIncremental for a single project? Or specifying the order of compilation (e.g., first the normal sources, then unmanagedSources)?
Alternatively, finding out whether the source files really changed and only then deleting the generated sources would also work for me, but I'm not sure how to approach that.
Any help would be greatly appreciated!
Thanks,
Dominik
I am working on a project whose input is an XSD. From the input XSD, JAXB classes are generated in a particular package.
There is a reflection-based class which loads a class from the JAXB-generated ObjectFactory.java:
Class<?> aClass = Class.forName("pkg.ObjectFactory");
But it throws a ClassNotFoundException.
Refreshing the Eclipse project by right-clicking on it resolves the exception.
How can I solve this problem automatically?
Though you should wait for an Eclipse-only answer too, maybe you should consider a proper build infrastructure like Maven, especially if your project has reached a size where you would like to split the build system into separate parts.
I am in favour of many independently versioned (sub)projects. An explicit structure is best done with a Maven build infrastructure, which is available for all IDEs. Then one project might generate the JAXB source classes, and the main project might depend on that project.
I am importing a jar file, com.ibm.mq.jar, into my workspace (Eclipse IDE).
While importing, a screen came up where I could see all the classes in the jar file.
After I imported it into the workspace, I was able to import the package, and the following statement didn't give any error:
import com.ibm.mq.*;
But in code I am not able to use any of the classes which are in the package.
For example, "MQC" is a class in the package, but in code it doesn't resolve (an "MQC cannot be resolved to a type" error comes up if I try to use it).
This jar file actually contains the WebSphere MQ API classes.
Can anyone advise what I am missing?
If you're using MQ 7, check its documentation. There was some deprecation going on around com.ibm.mq.MQC and, depending on the version you use, that class was replaced by com.ibm.mq.constants.MQConstants. Like this one, there are other cases.
In fact, com.ibm.mq only contains the exception MQException, so you won't find any other classes there. I suggest you check the version you're using and dig a little deeper into the docs as a first step.
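For instance, code that used the old MQC interface would reference MQConstants instead (a sketch, assuming the MQ 7 jars are on the build path):

import com.ibm.mq.constants.MQConstants;

public class MqOptionsExample {
    public static void main(String[] args) {
        // MQConstants replaces the deprecated MQC as the home of these constants
        int openOptions = MQConstants.MQOO_INPUT_AS_Q_DEF | MQConstants.MQOO_OUTPUT;
        System.out.println("open options: " + openOptions);
    }
}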
Suppose I have a Java project myProject and am using an external library jar (someJar.jar) which has a class com.somepackage.Class1.
Now I find an updated version of Class1.java which fixes a bug in the original jar.
I include the new Class1.java in my source code under the package com.somepackage.
When I build the project (e.g., using NetBeans), there is a dist\myProject.jar which contains the class com.somepackage.Class1.class, and a dist\lib\someJar.jar which also contains a class with the same name.
When I run the program (e.g., using java -jar dist\myProject.jar), the new version of Class1.class is used (as I want).
How does Java decide which class file to use in the case of such duplicates? Is there any way I can specify precedence?
Is there any 'right' way to avoid such clashes?
In ProGuard, when I try to compress my code, I get a duplicate class error. How do I eliminate this?
Java decides which one to use based on the order of the classpath. List yours first and you'll be fine.
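For example, when both jars contain com.somepackage.Class1, running java -cp "dist\myProject.jar;dist\lib\someJar.jar" com.somepackage.Main (Main being a hypothetical entry point) loads Class1 from myProject.jar, because the application class loader searches classpath entries left to right and uses the first match. With java -jar, the jar itself is searched before the entries listed in its manifest's Class-Path attribute.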
The "right" way would be to fix the orignal source, but sometimes that's not always an option.
I haven't used ProGuard, but I have re-jarred libaries before that had duplicate classes. The solution in my case was to tell Ant to ignore duplicate classes. I would assume ProGuard would have that support too.
Can you not create an updated jar file which contains the bug fix? It's going to make things a lot simpler if you don't have two versions of the same fully-qualified class around.
1) An updated jar is the better solution.
2) Use a different class name. Is there a reason why you want to use the same class name and the same packaging? I don't think there is one.
3) Create a wrapper/proxy class that encapsulates all the calls to the jar; you can then call this new class, which fixes the bug (provided it has a different name and packaging). See the sketch below.
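A minimal sketch of option 3, with hypothetical names and a hypothetical method on Class1:

package com.mycompany.fixes; // deliberately a different package than the buggy class

import com.somepackage.Class1;

// Funnels every call to the library class through one place, so the fix
// lives here instead of in a duplicate com.somepackage.Class1 on the classpath.
public class Class1Wrapper {
    private final Class1 delegate = new Class1();

    public String doWork(String input) { // hypothetical method
        // apply the corrected behaviour here before/after delegating
        return delegate.toString() + ":" + input;
    }
}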