I was surprised to discover that I have Java code in my project which compiles and runs fine in Eclipse, but throws a compilation error in javac.
A self-contained snippet:
import java.util.HashSet;
import java.util.Set;

public class Main {
    public static void main(String[] args) {
        Set<Integer> setOfInts = new HashSet<Integer>();
        Set<Object> setOfObjects = covariantSet(setOfInts);
    }

    public static <S, T extends S> Set<S> covariantSet(Set<T> set) {
        return new HashSet<S>(set);
    }
}
Compilation in javac returns:
Main.java:10: incompatible types
found : java.util.Set<java.lang.Integer>
required: java.util.Set<java.lang.Object>
Set<Object> setOfObjects = covariantSet(setOfInts);
^
This error now prevents the project from building in Maven. Since the Eclipse compiler is built to be more tolerant, do I have to assume that the definition and usage of a static method as in the snippet above is not valid Java?
It seems that Sun's 1.6 JDK can't infer the correct type. The following seems to work on my machine:
Set<Object> setOfObjects = Main.<Object, Integer>covariantSet(setOfInts);
Note that you must invoke the static method prefixed with the class name.
You are right. This problem indeed exists. Eclipse does not use javac. It uses its own compiler.
Actually javac is "right". Generics are implemented by erasure: the type S is not included in your byte code, so the JVM does not have enough information about the return type at runtime. To solve the problem, change the method prototype as follows:
public static <S, T extends S> Set<S> covariantSet(Set<T> set, Class<S> returnType)
Now the return type is passed to the method at runtime and the compiler should not complain.
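A minimal sketch of that change together with a call site (the Object.class argument is what pins S down explicitly):

public static <S, T extends S> Set<S> covariantSet(Set<T> set, Class<S> returnType) {
    // returnType is never read; it exists only so the caller can fix S.
    return new HashSet<S>(set);
}

// Call site: S is inferred from Object.class, T from the set.
Set<Object> setOfObjects = covariantSet(setOfInts, Object.class);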
In your Maven build script you have set the compiler version.
In Ant it looks like this:
<property name="source.version" value="1.5" />
Search for 1.3 or 1.4, or compile, to find that value in the Maven scripts.
With the value 1.5 the compiler will accept the generics (see your error messages).
I know it's an old question, but I want to mention that the function could be written as:
import java.util.HashSet;
import java.util.Set;

public class Main {
    public static void main(String[] args) {
        Set<Integer> setOfInts = new HashSet<Integer>();
        Set<Object> setOfObjects = covariantSet(setOfInts);
    }

    public static <S> Set<S> covariantSet(Set<? extends S> set) {
        return new HashSet<S>(set);
    }
}
It's a little bit cleaner, and you can use the function exactly how you intended to (with implicit generic typing).
Add the following plugin to your pom.xml:
<plugin>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>2.3.2</version>
  <configuration>
    <source>1.6</source>
    <target>1.6</target>
  </configuration>
</plugin>
Related
I have a problem understanding the behaviour of Java generics in the following case.
Given some parametrised interface, IFace<T>, and a method on some class that returns a class extending this interface, <C extends IFace<?>> Class<C> getClazz(), a Java compilation error is produced by Gradle (1.8 Oracle JDK, on OSX and Linux), but not by the Eclipse compiler within the Eclipse IDE (the code also happily runs under the Eclipse RCP OSGi runtime), for the following implementation:
public class Whatever {
    public interface IFace<T> {}

    @SuppressWarnings("unchecked")
    protected <C extends IFace<?>> Class<C> getClazz() {
        return (Class<C>) IFace.class;
    }
}
➜ ./gradlew build
:compileJava
/Users/user/src/test/src/main/java/Whatever.java:6: error: incompatible types: Class<IFace> cannot be converted to Class<C>
return (Class<C>) IFace.class;
^
where C is a type-variable:
C extends IFace<?> declared in method <C>getClazz()
1 error
:compileJava FAILED
This implementation is not a very logical one; it is a default that somebody thought was good. But I would like to understand why it does not compile, rather than question the logic of the code.
The easiest fix was to drop a part of the generic definition in the method signature. The following compiles without issues, but relies on a raw type:
protected Class<? extends IFace> getClazz() {
    return IFace.class;
}
Why would this compile and the above not? Is there a way to avoid using the raw type?
It's not compiling because it's not type-correct.
Consider the following:
class Something implements IFace<String> {}
Class<Something> clazz = new Whatever().getClazz();
Something sth = clazz.newInstance();
This would fail with an InstantiationException, because clazz is IFace.class, and so it can't be instantiated; it's not Something.class, which could be instantiated.
Ideone demo
But the non-instantiability isn't the relevant point here - it is fine for a Class to be non-instantiable - it is that this code has tried to instantiate it.
Class<T> has a method T newInstance(), which must either return a T, if it completes successfully, or throw an exception.
If the clazz.newInstance() call above did succeed (and the compiler doesn't know that it won't), the returned value would be an instance of IFace, not Something, and so the assignment would fail with a ClassCastException.
You can demonstrate this by changing IFace to be instantiable:
class IFace<T> {}
class Something extends IFace<String> {}
Class<Something> clazz = new Whatever().getClazz();
Something sth = clazz.newInstance(); // ClassCastException
Ideone demo
By raising an error like it does, the compiler is removing the potential for getting into this situation at all.
So, please don't try to fudge the compiler's errors away with raw types. It's telling you there is a problem, and you should fix it properly. Exactly what the fix looks like depends upon what you actually use the return value of Whatever.getClazz() for.
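For instance, one type-correct alternative is to have the caller supply the class token. This is only a hedged sketch; the token parameter is my addition, not anything from the original code:

protected <C extends IFace<?>> Class<C> getClazz(Class<C> token) {
    // The caller provides the concrete token, so the returned Class<C>
    // really is the caller's type and no unchecked cast is needed.
    return token;
}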
It is kind of funny that the Eclipse compiler does compile the code, but the Oracle Java compiler will not. You can use the Eclipse compiler during the Gradle build to make sure Gradle compiles the same way the IDE does. Add the following snippet to your build.gradle file:
configurations {
    ecj
}

dependencies {
    ecj 'org.eclipse.jdt.core.compiler:ecj:4.4.2'
}

compileJava {
    options.fork = true
    options.forkOptions.with {
        executable = 'java'
        jvmArgs = ['-classpath', project.configurations.ecj.asPath, 'org.eclipse.jdt.internal.compiler.batch.Main', '-nowarn']
    }
}
It fails to compile because C could be anything, while the compiler can be sure that IFace.class does not fulfill that requirement:
class X implements IFace<String> {
}

Class<X> myX = myWhatever.getClazz(); // would be bad because IFace.class is not a Class<X>.
Andy just demonstrated why this assignment would be bad (e.g. when trying to instantiate that class), so my answer is not very different from his, but perhaps a little easier to understand...
This is all about the nice Java compiler feature of inferring a method's type parameters from the calling context. You surely know the method
Collections.emptyList();
Which is declared as
public static <T> List<T> emptyList() {
    // ...
}
An implementation returning (List<T>) new ArrayList<String>(); would obviously be illegal, even with SuppressWarnings, as T may be anything the caller assigns (or uses) the method's result to (type inference). But this is very similar to what you try when returning IFace.class where another class could be required by the caller.
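To spell that out, here is such a hypothetical (unsound) implementation, with a made-up class name for illustration; it compiles only because the unchecked warning is suppressed, but it lies to the caller about T:

import java.util.ArrayList;
import java.util.List;

public class UnsoundExample { // hypothetical class, for illustration only
    @SuppressWarnings("unchecked")
    public static <T> List<T> emptyList() {
        // Unsound: the caller's T need not be String; the compiler only
        // lets this pass because the unchecked warning is suppressed.
        return (List<T>) new ArrayList<String>();
    }
}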
Oh, and for the ones really enjoying Generics, here is the possibly worst solution to your problem:
public <C extends IFace<?>> Class<? super C> getClazz() {
    return IFace.class;
}
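A hedged usage sketch of that signature (assuming the method still lives in Whatever with the nested IFace from above): the caller can now only obtain a supertype-bounded token, so the unsafe Class<Something> assignment no longer type-checks.

class Something implements Whatever.IFace<String> {}

// Only a supertype-bounded token can be obtained:
Class<? super Something> token = new Whatever().<Something>getClazz();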
The following will probably work:
public class Whatever {
    public interface IFace<T> {}

    @SuppressWarnings("unchecked")
    protected <C extends IFace> Class<C> getClazz() {
        return (Class<C>) IFace.class;
    }
}
In your former code, the problem is that C has to extend IFace<?>, but you provided only IFace. For the type system, Class<IFace> != Class<IFace<?>>, therefore Class<IFace> cannot be cast to Class<C extends IFace<?>>.
Maybe some better solution exists, as I am not a generics expert.
I have a class that calls one of its own methods internally several times. All of these methods take a generic parameter (Guava's Predicate). Eclipse compiles this fine, reports no errors and shows no warning indicators, with compiler settings set to Java 1.6 compatibility. Gradle (using JDK 1.6.0_37) reports that for one of the calls it cannot find the symbol for the method, but for the others it can. This seems to involve the use of Guava's Predicates#and() static method, while a similar call with Guava's Predicates#not() works.
I have simplified the code down to the following:
import static com.google.common.base.Predicates.and;
import static com.google.common.base.Predicates.not;

import java.util.List;

import com.google.common.base.Predicate;
import com.google.common.base.Predicates;
import com.google.common.collect.FluentIterable;

public class MyClass {
    public List<String> doStuffAnd(List<String> l, Predicate<String> p1, Predicate<String> p2) {
        // eclipse fine, gradle complains it can't find symbol doStuff
        return doStuff(l, and(p1, p2));
    }

    public List<String> doStuffNot(List<String> l, Predicate<String> p) {
        // both eclipse and gradle compile fine
        return doStuff(l, not(p));
    }

    public List<String> doStuff(List<String> l, Predicate<String> p) {
        return FluentIterable.from(l).filter(p).toList();
    }
}
Resulting compile error is:
doStuff(java.util.List,com.google.common.base.Predicate)
in MyClass cannot be applied to
(java.util.List,com.google.common.base.Predicate)
return doStuff(l, and(p1, p2));
^
If I explicitly type the call to Predicates.and() as follows
return doStuff(l, Predicates.<String>and(p1, p2));
then it is fine. But I don't have to do that with the call to Predicates.not(). It also works if I extract the and() expression into a local variable, as in the sketch below.
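That local-variable variant, inside doStuffAnd, looks like this:

// Naming the intermediate predicate fixes the inferred type.
Predicate<String> combined = and(p1, p2);
return doStuff(l, combined);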
What is the difference between the call using #and and the call using #not?
Is there anything I can do avoid this that involves neither typing the and call nor extracting the and expression?
And why is there a difference between the gradle compiler and Eclipse compiler?
OP's solution
The difference between and and not is that and defines its generic signature for its parameters as Predicate<? super T>, whereas not uses a simple parameter signature of Predicate<T>.
So to solve this, I define doStuffAnd with its parameters typed as Predicate<? super String>.
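A hedged sketch of that change, keeping the original names (I also widen doStuff's parameter to match, which FluentIterable#filter accepts anyway):

public List<String> doStuffAnd(List<String> l,
                               Predicate<? super String> p1,
                               Predicate<? super String> p2) {
    return doStuff(l, and(p1, p2)); // no explicit type witness needed
}

public List<String> doStuff(List<String> l, Predicate<? super String> p) {
    return FluentIterable.from(l).filter(p).toList();
}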
A couple of days ago, I started refactoring some code to use the new Java 8 Streams library. Unfortunately, I ran into a compile time error when performing Stream::map with a method which is declared to throw a generic E that is further specified to be a RuntimeException.
Interestingly enough, the compile time error goes away when I switch to using a method reference.
Is this a bug, or is my method reference not equivalent to my lambda expression?
(Also, I know I can replace p->p.execute(foo) with Parameter::execute. My actual code has additional parameters for the execute method).
Error message
Error:(32, 43) java: unreported exception E; must be caught or declared to be thrown
Code
import java.util.ArrayList;
import java.util.List;

public class JavaBugTest
{
    interface AbleToThrowException<E extends Exception>
    {
    }

    interface Parameter {
        public <E extends Exception> Object execute(AbleToThrowException<E> algo) throws E;
    }

    interface ThrowsRuntimeException extends AbleToThrowException<RuntimeException>
    {
    }

    static ThrowsRuntimeException foo;

    public static Object manualLambda(Parameter p)
    {
        return p.execute(foo);
    }

    public static void main(String[] args)
    {
        List<Parameter> params = new ArrayList<>();
        params.stream().map(p -> p.execute(foo)); // Gives a compile time error.
        params.stream().map(JavaBugTest::manualLambda); // Works fine.
    }
}
System setup
OS: Windows x64
Java compiler version: Oracle JDK 1.8.0_11
IDE: Intellij
A very simple solution is to explicitly provide a type argument for Parameter#execute(..).
params.stream().map(p -> p.<RuntimeException>execute(foo)); // Compiles fine.
Without the explicit type argument, it seems like the JDK compiler cannot infer a type argument from the invocation context, though it should. This is a bug and should be reported as such. I have now reported it and will update this answer with new details when I have them.
Bug Report
I defined a parameterized interface:
import com.google.common.base.Optional;

public interface AbstractResource<S extends Parent> {
    Optional<S> getOption();
    Optional<S> getAbsent();
    Optional<S> getNull();
}
Then, I implemented it as a raw type. Observe that I'm breaking the interface by returning the Optional types Child, Object and Integer for the respective methods.
public class FooResource implements AbstractResource { // Did not add type parameter
    @Override
    public Optional<Child> getOption() {
        Child child = new Child("John");
        return Optional.of(child);
    }

    @Override
    public Optional<Object> getAbsent() {
        return Optional.absent();
    }

    @Override
    public Optional<Integer> getNull() {
        return null;
    }
}
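(The Parent and Child types are not shown in the question; a minimal hypothetical pair that lets the snippet compile could look like this:)

// Hypothetical Parent/Child, assumed by the snippets above:
class Parent {}

class Child extends Parent {
    private final String name;

    Child(String name) {
        this.name = name;
    }
}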
When compiling with the -Xlint:unchecked option, why doesn't the compiler show a warning that FooResource fails to add type parameters? In fact, it compiles successfully.
-Xlint:unchecked is for unchecked conversions (i.e. casts). It has nothing to do with using raw types. Compile your classes with the option -Xlint. You will then get the expected output:
FooResource.java:3: warning: [rawtypes] found raw type: AbstractResource
AbstractResource
^
missing type arguments for generic class AbstractResource<S>
where S is a type-variable:
S extends Parent declared in interface AbstractResource
1 warning
I would suggest using an IDE. Eclipse - for example - shows such warnings out of the box.
I just found out that Java 7 supports more "Xlint" options than Java 6 does. So the option "-Xlint:rawtypes" will indeed help here.
If you want to see the warning, you should be using
-Xlint:rawtypes
instead of
-Xlint:unchecked
For Maven builds, refer to this:
http://maven.apache.org/plugins/maven-compiler-plugin/examples/pass-compiler-arguments.html
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.1</version>
  <configuration>
    <compilerArgument>-Xlint:rawtypes</compilerArgument>
  </configuration>
</plugin>
This compiles fine using Eclipse:
abstract class CollectionView implements Collection<Object> {
    ...
    public Object[] toArray(Object[] o) {
        if (fast) {
            return get(map).toArray(o);
        } else {
            synchronized (map) {
                return get(map).toArray(o);
            }
        }
    }
    ...
}

class KeySet extends CollectionView implements Set<Object> {
    protected Collection<Object> get(Map<Object, Object> map) {
        return map.keySet();
    }

    protected Object iteratorNext(Map.Entry entry) {
        return entry.getKey();
    }
}
but it fails to compile when using Ant:
error: KeySet is not abstract and does not override
abstract method toArray(T[]) in Set
I can see why the code would compile using Eclipse: KeySet already inherits the implementation of the toArray(T[]) method from CollectionView.
But why does it fail when I compile using Ant?
<javac srcdir="src" destdir="bin" debug="on">
    <compilerarg value="-Xlint:unchecked"/>
    <compilerarg value="-Xlint:deprecation"/>
</javac>
First we should note that the exact signature of the method expected to be implemented is:
<T> T[] toArray(T[] a);
Both javac and Eclipse do warn you about this 'type safety' issue, and if you change the signature to the expected one, javac is happy.
If you put an @Override on the toArray method, even with the signature that uses the raw Object type, both Eclipse and javac correctly see it as an override of the method declared by Collection. So the issue is not there.
The inconsistency, and I think the bug in javac, is that in a subclass implementation javac doesn't recognize the inherited method Object[] toArray(Object[] o) as implementing <T> T[] toArray(T[] a). If it does so for the abstract class, it should also do so for every subclass.
It is not the first time javac has had a bug in this area; see this thread for instance. I have searched the Oracle bug database and found nothing reported about what you have found.
There are then two workarounds: in the abstract class, use the expected signature (sketched after the snippet below); or do the override 'manually' in the subclass:
public Object[] toArray(Object[] o) {
    return super.toArray(o);
}
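And the first workaround, a hedged sketch that declares the generic signature expected by Collection<T> in the abstract class instead (fast, map and get as in the original snippet):

public <T> T[] toArray(T[] o) {
    if (fast) {
        return get(map).toArray(o);
    } else {
        synchronized (map) {
            return get(map).toArray(o);
        }
    }
}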
There are several cases where Eclipse compiles fine and javac doesn't. If you do not mind, there are three ways that I know of to build using the Eclipse compiler:
1. Package Eclipse pre-compiled classes (hacky, NOT recommended).
2. Use the Eclipse compiler adapter with Ant. When you specify the property build.compiler, all javac tasks from then on in your Ant build will be affected. You can set it to "org.eclipse.jdt.core.JDTCompilerAdapter". Note that you will have to include this class (and the classes it depends on) in your Ant build classpath; the most straightforward way is to add the necessary jars to the lib folder of your Ant installation.
3. When building with Maven, configure this:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.1</version>
  <configuration>
    <compilerId>eclipse</compilerId>
    <compilerVersion>1.6</compilerVersion>
    <source>1.6</source>
    <target>1.6</target>
    <optimize>true</optimize>
  </configuration>
  <dependencies>
    <dependency>
      <groupId>org.codehaus.plexus</groupId>
      <artifactId>plexus-compiler-eclipse</artifactId>
      <version>2.2</version>
    </dependency>
  </dependencies>
</plugin>
in the plugins section of the build section of your pom.xml.