I have a class that calls one of its own methods internally several times. All of these methods take a generic parameter (Guava's Predicate). Eclipse compiles this fine, reporting no errors and no warning indicators, with compiler settings set to Java 1.6 compatibility. Gradle (using JDK 1.6.0_37) reports that it cannot find the symbol for that method at one of the call sites, but resolves the other calls fine. This seems to involve the use of Guava's Predicates#and() static method; a similar call with Guava's Predicates#not() works.
I have simplified the code down to the following:
import static com.google.common.base.Predicates.and;
import static com.google.common.base.Predicates.not;

import java.util.List;

import com.google.common.base.Predicate;
import com.google.common.base.Predicates;
import com.google.common.collect.FluentIterable;

public class MyClass {
    public List<String> doStuffAnd(List<String> l, Predicate<String> p1, Predicate<String> p2) {
        // eclipse fine, gradle complains it can't find symbol doStuff
        return doStuff(l, and(p1, p2));
    }

    public List<String> doStuffNot(List<String> l, Predicate<String> p) {
        // both eclipse and gradle compile fine
        return doStuff(l, not(p));
    }

    public List<String> doStuff(List<String> l, Predicate<String> p) {
        return FluentIterable.from(l).filter(p).toList();
    }
}
Resulting compile error is:
doStuff(java.util.List,com.google.common.base.Predicate)
in MyClass cannot be applied to
(java.util.List,com.google.common.base.Predicate)
return doStuff(l, and(p1, p2));
^
If I explicitly type the call to Predicates.and() as follows
return doStuff(l, Predicates.<String>and(p1, p2));
then it is fine. But I don't have to do that with the call to Predicates.not(). It also works if I extract the #and expression into a local variable.
What is the difference between the call using #and and the call using #not?
Is there anything I can do to avoid this that involves neither explicitly typing the and call nor extracting the and expression?
And why is there a difference between the gradle compiler and Eclipse compiler?
OP's solution
The difference between and and not is that and declares its generic parameter types as Predicate<? super T>, whereas not uses a plain Predicate<T> parameter.
So to solve this, I define doStuffAnd with parameters of type Predicate<? super String>.
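The fix can be sketched without Guava. In the following stand-alone example, java.util.function.Predicate stands in for Guava's Predicate, and the and helper mirrors Predicates.and's Predicate<? super T> parameter signature (both stand-ins are assumptions for illustration, not the original code):

```java
import java.util.List;
import java.util.function.Predicate;
import java.util.stream.Collectors;

public class VarianceDemo {
    // mirrors Guava's Predicates.and: parameters are typed Predicate<? super T>
    public static <T> Predicate<T> and(Predicate<? super T> a, Predicate<? super T> b) {
        return t -> a.test(t) && b.test(t);
    }

    // the fix: accept Predicate<? super String> instead of Predicate<String>,
    // so the result of and(p1, p2) is accepted however T is inferred
    public static List<String> doStuff(List<String> l, Predicate<? super String> p) {
        return l.stream().filter(p).collect(Collectors.toList());
    }

    public static List<String> doStuffAnd(List<String> l, Predicate<String> p1, Predicate<String> p2) {
        return doStuff(l, and(p1, p2)); // compiles without an explicit type witness
    }

    public static void main(String[] args) {
        List<String> result = doStuffAnd(
                List.of("apple", "ant", "berry"),
                s -> s.startsWith("a"),
                s -> s.length() > 3);
        System.out.println(result); // [apple]
    }
}
```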
Related
I developed some code in Eclipse, tested it successfully, pushed it to our Jenkins CI server, and got an email that Maven was choking with a Java compile error. I subsequently isolated the problem and created the following minimal example showing the issue:
import java.util.List;
import java.util.function.Function;

class MinimalTypeFailureExample {
    public static void main(String[] args) {
        List<String> originalList = null; // irrelevant
        List<IntToByteFunction> resultList = transform(originalList,
                outer -> inner -> doStuff(inner, outer));
        System.out.println(resultList);
    }

    static <F, T> List<T> transform(List<F> originalList,
            MyFunction<? super F, ? extends T> function) {
        return null; // irrelevant
    }

    static Byte doStuff(Integer inner, String outer) {
        return null; // irrelevant
    }
}
@FunctionalInterface
interface MyFunction<F, T> extends Function<F, T> {
    @Override
    T apply(F input);
}

@FunctionalInterface
interface IntToByteFunction {
    Byte applyIntToByte(Integer inner);
}
In Eclipse, this code compiles without error and appears to execute as intended. However, compiling with javac gives the following error:
MinimalTypeFailureExample.java:7: error: incompatible types: cannot infer type-variable(s) F,T
List<IntToByteFunction> resultList = transform(originalList, outer -> inner -> doStuff(inner, outer));
^
(argument mismatch; bad return type in lambda expression
T is not a functional interface)
where F,T are type-variables:
F extends Object declared in method <F,T>transform(List<F>,MyFunction<F,? extends T>)
T extends Object declared in method <F,T>transform(List<F>,MyFunction<F,? extends T>)
1 error
Changing the argument type of transform() from MyFunction to Function, or removing the wildcard ? extends in the argument type, makes the example code compile in javac.
Clearly, either Eclipse or javac is in violation of the Java Language Specification. The question is, do I file the bug report on Eclipse or javac? The type inference rules for generic lambdas are so complex that I have no idea whether this program is legal Java or not according to the JLS.
Motivation note
In the original code, transform() was Guava's com.google.common.collect.Lists.transform(). The MyFunction interface was Guava's com.google.common.base.Function interface, which extends java.util.function.Function for historical reasons.
The purpose of this code was to create a view of a list of a first type as a list of a second type. The second type was a functional interface type and I wanted to populate the output list with functions of this type constructed based on the values in the input list—hence the curried lambda expression.
Version info for reproducibility
Eclipse versions tested:
2018-09 (4.9.0) Build id: 20180917-1800
2019-03 RC1 (4.11 RC1) Build id: 20190307-2044
javac versions tested:
1.8.0_121
JDK 10.0.1 via the JDoodle online Java compiler
It looks like you ran into JDK bug JDK-8156954, which was fixed in Java 9 but not in Java 8.
It is a Java 8 javac bug because, in your example, all type variables of the transform method can be inferred without violating the Java Language Specification, as follows:
F: String (via first parameter originalList of type List<String>)
T: IntToByteFunction (via return type List<IntToByteFunction>)
These inferred variable types are compatible with the type of the second parameter, the chained lambda expression:
outer -> inner -> doStuff(inner, outer) resolves (with doStuff(Integer, String)) to
String -> Integer -> doStuff(Integer, String) resolves to
String -> Integer -> Byte is compatible with
String -> IntToByteFunction is compatible with
MyFunction<? super String, ? extends IntToByteFunction>
Your example can be minimized further:
import java.util.function.Function;

class MinimalTypeFailureExample {
    void foo() {
        transform((Function<Integer, String>) null, o -> i -> { return ""; });
    }

    <T, F> void transform(F f, MyFunction<T, ? extends F> m) {}
}
@FunctionalInterface
interface MyFunction<T, R> extends Function<T, R> {
    @Override
    R apply(T t);
}
MyFunction overrides apply with an identical signature (R apply(T t);). If Function is used instead of MyFunction, or if MyFunction extends Function but without @Override R apply(T t);, the error disappears. The error also disappears with F instead of ? extends F.
Even though your example differs from the example in the mentioned bug, it can be assumed to be the same bug, because it is the only "argument mismatch; bad return type in lambda expression" bug that was fixed in Java 9 but not in Java 8 and that occurs only with lambda expressions in combination with Java generics.
I tried the example code with javac 11.0.2 and received no error. That suggests the bug was in javac and has been fixed in recent versions. I am slightly surprised by this because, as mentioned, I did test JDK 10 in an online interface.
I am open to other answers that provide more details on the specific problem, such as a JDK bug number for the issue.
As a workaround to make the code compile in JDK 8, an explicit cast can be added to the inner lambda expression:
List<IntToByteFunction> resultList = transform(originalList,
outer -> (IntToByteFunction) inner -> doStuff(inner, outer));
When working with some existing code, I encountered a problem at runtime when running the code with Eclipse Neon.3. Unfortunately, I wasn't able to reproduce the exception in a minimal working example, but the following output was produced by the classloader:
Exception in thread "main" java.lang.VerifyError: Bad type on operand stack
Exception Details:
Location:
...
Reason:
Type 'java/lang/Object' (current frame, stack[8]) is not assignable to 'MyMap'
Current Frame:
...
Bytecode:
...
Stackmap Table:
...
It works on the command line and in older Eclipse versions, so it did not matter much at the time. Last week, Eclipse Oxygen.1 was released and we started using it. Now, the same code produces a compile-time error:
Problem detected during type inference: Unknown error at invocation of reduce(Main.MyMap<MyKey,MyValue>, BiFunction<Main.MyMap<MyKey,MyValue>,? super MyValue,Main.MyMap<MyKey,MyValue>>, BinaryOperator<Main.MyMap<MyKey,MyValue>>)
I've managed to put together a minimal working example that produces this error in Eclipse but works on the command line:
import java.util.HashMap;
import java.util.HashSet;
import java.util.Set;

public class Main<MyKey, MyValue> {
    static class MyMap<K, V> extends HashMap<K, V> {
        public MyMap<K, V> putAllReturning(MyMap<K, V> c) { putAll(c); return this; }
        public MyMap<K, V> putReturning(K key, V value) { put(key, value); return this; }
    }

    public Main() {
        Set<MyValue> values = new HashSet<>(); // actually something better
        final MyMap<MyKey, MyValue> myMap =
                values.stream()
                        .reduce(
                                new MyMap<MyKey, MyValue>(),
                                (map, value) -> {
                                    Set<MyKey> keys = new HashSet<>(); // actually something better
                                    return keys.stream()
                                            .reduce(
                                                    map, // this would work syntactically: new MyMap<MyKey, MyValue>(),
                                                    (map2, key) -> map2.putReturning(key, value),
                                                    MyMap::putAllReturning);
                                },
                                MyMap::putAllReturning
                        );
    }
}
It seems that the first parameter to the inner reduce causes type-inference to break, as replacing it with another instance of the same type (but explicitly declared instead of inferred) eliminates the error.
Knowing the exact source of the error in the class allowed us to rewrite the code (extracting the lambda expression passed to the outer reduce to its own method). However, I'd still be interested in an explanation to why such a construct might break both of the newer eclipse compilers.
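For reference, the rewrite we ended up with can be sketched as follows (class and method names here are illustrative, not the original code); giving the outer accumulator a named method with an explicit signature means the compiler never has to infer types for a lambda nested inside another lambda:

```java
import java.util.HashMap;
import java.util.Set;

public class ReduceWorkaround<MyKey, MyValue> {
    static class MyMap<K, V> extends HashMap<K, V> {
        public MyMap<K, V> putAllReturning(MyMap<K, V> c) { putAll(c); return this; }
        public MyMap<K, V> putReturning(K key, V value) { put(key, value); return this; }
    }

    public MyMap<MyKey, MyValue> build(Set<MyValue> values, Set<MyKey> keys) {
        // the outer accumulator is now a plain method call, so no nested
        // lambda type inference is required
        return values.stream().reduce(
                new MyMap<MyKey, MyValue>(),
                (map, value) -> addAllKeys(map, value, keys),
                MyMap::putAllReturning);
    }

    private MyMap<MyKey, MyValue> addAllKeys(MyMap<MyKey, MyValue> map, MyValue value, Set<MyKey> keys) {
        return keys.stream().reduce(
                map,
                (map2, key) -> map2.putReturning(key, value),
                MyMap::putAllReturning);
    }
}
```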
Well, for me on Oxygen it worked by simply declaring the parameters of the reduce explicitly:
(MyMap<MyKey, MyValue> map, MyValue value) -> ....
Knowing the exact problem and why the eclipse compiler can't infer the types is beyond me; either way, this should be reported (it might already be known to the eclipse team).
I can also confirm that it compiles OK with jdk-8-131 and jdk-9-175.
I have a problem understanding the behaviour of Java generics in the following case.
Given some parametrised interface, IFace<T>, and a method on some class that returns a class implementing this interface, <C extends IFace<?>> Class<C> getClazz(), the following implementation produces a java compilation error under gradle (1.8 Oracle JDK, OSX and Linux) but not under the Eclipse compiler within the Eclipse IDE (it also happily runs under the Eclipse RCP OSGi runtime):
public class Whatever {
    public interface IFace<T> {}

    @SuppressWarnings("unchecked")
    protected <C extends IFace<?>> Class<C> getClazz() {
        return (Class<C>) IFace.class;
    }
}
➜ ./gradlew build
:compileJava
/Users/user/src/test/src/main/java/Whatever.java:6: error: incompatible types: Class<IFace> cannot be converted to Class<C>
return (Class<C>) IFace.class;
^
where C is a type-variable:
C extends IFace<?> declared in method <C>getClazz()
1 error
:compileJava FAILED
This implementation is not a very logical one; it is the default that somebody thought was good. But I would like to understand why it does not compile rather than question the logic of the code.
The easiest fix was to drop a part of the generic definition in the method signature. The following compiles without issues, but relies on a raw type:
protected Class<? extends IFace> getClazz() {
    return IFace.class;
}
Why would this compile and the above not? Is there a way to avoid using the raw type?
It's not compiling because it's not type-correct.
Consider the following:
class Something implements IFace<String> {}
Class<Something> clazz = new Whatever().getClazz();
Something sth = clazz.newInstance();
This would fail with an InstantiationException, because clazz is IFace.class, and so it can't be instantiated; it's not Something.class, which could be instantiated.
Ideone demo
But the non-instantiability isn't the relevant point here - it is fine for a Class to be non-instantiable - it is that this code has tried to instantiate it.
Class<T> has a method T newInstance(), which must either return a T, if it completes successfully, or throw an exception.
If the clazz.newInstance() call above did succeed (and the compiler doesn't know that it won't), the returned value would be an instance of IFace, not Something, and so the assignment would fail with a ClassCastException.
You can demonstrate this by changing IFace to be instantiable:
class IFace<T> {}
class Something extends IFace<String> {}
Class<Something> clazz = new Whatever().getClazz();
Something sth = clazz.newInstance(); // ClassCastException
Ideone demo
By raising an error like it does, the compiler is removing the potential for getting into this situation at all.
So, please don't try to fudge the compiler's errors away with raw types. It's telling you there is a problem, and you should fix it properly. Exactly what the fix looks like depends upon what you actually use the return value of Whatever.getClazz() for.
It is kind of funny that the Eclipse compiler compiles the code while the Oracle Java compiler will not. You can use the Eclipse compiler during the gradle build to make sure gradle compiles the same way the IDE does. Add the following snippet to your build.gradle file:
configurations {
    ecj
}

dependencies {
    ecj 'org.eclipse.jdt.core.compiler:ecj:4.4.2'
}

compileJava {
    options.fork = true
    options.forkOptions.with {
        executable = 'java'
        jvmArgs = ['-classpath', project.configurations.ecj.asPath, 'org.eclipse.jdt.internal.compiler.batch.Main', '-nowarn']
    }
}
It fails to compile because C could possibly be anything, whereas the compiler can be sure that IFace.class does not fulfill that requirement:
class X implements IFace<String> {
}
Class<X> myX = myWhatever.getClazz(); // would be bad because IFace.class is not a Class<X>.
Andy just demonstrated why this assignment would be bad (e.g. when trying to instantiate that class), so my answer is not very different from his, but perhaps a little easier to understand...
This is all about the nice Java compiler feature of inferring a method's type parameters from the calling context. You surely know the method
Collections.emptyList();
which is declared as
public static <T> List<T> emptyList() {
    // ...
}
An implementation returning (List<T>) new ArrayList<String>(); would obviously be unsound, even with @SuppressWarnings, as T may be anything the caller assigns (or uses) the method's result as (type inference). But this is very similar to what you are trying when returning IFace.class where the caller may require another class.
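To make the analogy concrete, here is a small stand-alone demo (all names are made up) of such an unsound emptyList-style method; the unchecked cast compiles, but the lie surfaces as a ClassCastException at the caller:

```java
import java.util.ArrayList;
import java.util.List;

public class UncheckedLieDemo {
    // unsound: claims to produce a List<T> but actually builds a List<String>
    @SuppressWarnings("unchecked")
    public static <T> List<T> stringList() {
        List<String> l = new ArrayList<>();
        l.add("oops");
        return (List<T>) l; // unchecked cast: compiles, but T need not be String
    }

    public static void main(String[] args) {
        List<Integer> ints = stringList(); // T inferred as Integer, no compile error
        try {
            Integer i = ints.get(0); // implicit checkcast to Integer fails here
            System.out.println(i);
        } catch (ClassCastException e) {
            System.out.println("ClassCastException: the cast lied about T");
        }
    }
}
```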
Oh, and for the ones really enjoying Generics, here is the possibly worst solution to your problem:
public <C extends IFace<?>> Class<? super C> getClazz() {
    return IFace.class;
}
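For what it's worth, a stand-alone sketch of that Class<? super C> variant (Something is a hypothetical implementing class added for illustration) shows what the caller actually gets:

```java
public class SuperDemo {
    interface IFace<T> {}
    static class Something implements IFace<String> {}

    // the variant above: the return type only promises some supertype of C
    static <C extends IFace<?>> Class<? super C> getClazz() {
        return IFace.class;
    }

    public static void main(String[] args) {
        // the caller can no longer treat the result as Class<Something>,
        // so the unsound newInstance() scenario is ruled out at compile time
        Class<? super Something> clazz = SuperDemo.<Something>getClazz();
        System.out.println(clazz.equals(IFace.class)); // true
    }
}
```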
The following will probably work:
public class Whatever {
    public interface IFace<T> {}

    @SuppressWarnings("unchecked")
    protected <C extends IFace> Class<C> getClazz() {
        return (Class<C>) IFace.class;
    }
}
In your former code, the problem is that C has to extend IFace<?>, but you provided only IFace. And to the type system Class<IFace> != Class<IFace<?>>, therefore Class<IFace> cannot be cast to Class<C extends IFace<?>>.
Maybe some better solution exists, as I am not a generics expert.
I'm trying to compile this piece of code:
import java.util.Collection;
import java.util.function.BiConsumer;
import de.hybris.platform.servicelayer.exceptions.ModelSavingException;
import de.hybris.platform.servicelayer.model.ModelService;
public class Foo {
    public static interface ModelService2 {
        public abstract void saveAll(Object[] paramArrayOfObject) throws ModelSavingException;
        public abstract void saveAll(Collection<? extends Object> paramCollection) throws ModelSavingException;
        public abstract void saveAll() throws ModelSavingException;
    }

    public void bar() {
        final BiConsumer<ModelService2, Collection<? extends Object>> consumer1 = ModelService2::saveAll;
        final BiConsumer<ModelService, Collection<? extends Object>> consumer2 = ModelService::saveAll;
    }
}
The interface ModelService is defined by the SAP hybris platform. ModelService2 just replicates the overloaded methods with name saveAll defined in the interface of the hybris platform.
I get the following compiler error when compiling the above:
1. ERROR in src\Foo.java (at line 17)
final BiConsumer<ModelService, Collection<? extends Object>> consumer2 = ModelService::saveAll;
^^^^^^^^^^^^^^^^^^^^^
Cannot make a static reference to the non-static method saveAll(Object[]) from the type ModelService
Why does the compiler do different type inference for ModelService when the only difference I'm able to spot is where each of the interfaces is located?
I'm using javac 1.8.0_77 for compilation in this case. Eclipse for example doesn't report any errors for the above code.
EDIT:
A relatively similar error also happens for the following variable declarations:
final Consumer<ModelService2> consumer3 = ModelService2::saveAll;
final Consumer<ModelService> consumer4 = ModelService::saveAll;
The compile error in this case is:
1. ERROR in src\Foo.java (at line 19)
final Consumer<ModelService> consumer4 = ModelService::saveAll;
^^^^^^^^^^^^^^^^^^^^^
Cannot make a static reference to the non-static method saveAll(Object[]) from the type ModelService
EDIT2:
compilation arguments are:
'-noExit'
'-classpath'
'<classpath>'
'-sourcepath'
'<source path>'
'-d'
'<path>\classes'
'-encoding'
'UTF8'
EDIT 3:
These are the definitions for the 3 methods shown by the Eclipse class file viewer:
// Method descriptor #43 (Ljava/util/Collection;)V
// Signature: (Ljava/util/Collection<+Ljava/lang/Object;>;)V
public abstract void saveAll(java.util.Collection arg0) throws de.hybris.platform.servicelayer.exceptions.ModelSavingException;
// Method descriptor #45 ([Ljava/lang/Object;)V
public abstract void saveAll(java.lang.Object... arg0) throws de.hybris.platform.servicelayer.exceptions.ModelSavingException;
// Method descriptor #10 ()V
public abstract void saveAll() throws de.hybris.platform.servicelayer.exceptions.ModelSavingException;
Resolution:
The problem is caused by the eclipse compiler for java v4.4.1. It has been fixed since at least v4.5.1. I initially failed to notice that it was the eclipse compiler that the hybris platform used to compile the code when building from the command line.
The interaction between method overloading, varargs and type inference is perhaps the most complicated and hairy part of Java type checking. It's an area where bugs turn up regularly and where there are often differences between different compilers.
My guess is the following:
ModelService has a varargs saveAll. Because of this, saveAll with two Object arguments is a valid method call on such an object. If that method were static, it would be valid to call it with one ModelService and one Collection, so a method reference expression of that form would be valid for a BiConsumer<ModelService, Collection<? extends Object>> type.
Because of a compiler bug, the compiler notes this possibility, notes that the method is not static, and thus infers that the method reference expression is not valid here. This generates the compilation error.
ModelService2.saveAll, on the other hand, is not varargs and cannot be called with one ModelService2 and one Collection. Because of this, the compiler does not get stuck in this bug when it tries that possibility.
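The overload shape can be reproduced without the hybris classes. The following stand-alone sketch (Service is a made-up interface mirroring ModelService's saveAll overloads) compiles fine with current compilers, and shows that the unbound reference resolves to the Collection overload even though the varargs overload is also applicable:

```java
import java.util.Collection;
import java.util.Collections;
import java.util.function.BiConsumer;

public class VarargsRefDemo {
    // made-up interface mirroring the saveAll overloads of hybris' ModelService
    interface Service {
        void saveAll(Object... items);
        void saveAll(Collection<?> items);
        void saveAll();
    }

    public static void main(String[] args) {
        // the Collection overload is the most specific applicable method, so
        // the unbound reference resolves to it; the varargs overload is merely
        // also applicable, which is the possibility the buggy compiler tripped on
        BiConsumer<Service, Collection<?>> consumer = Service::saveAll;

        Service s = new Service() {
            public void saveAll(Object... items) { System.out.println("varargs"); }
            public void saveAll(Collection<?> items) { System.out.println("collection"); }
            public void saveAll() { System.out.println("none"); }
        };
        consumer.accept(s, Collections.emptyList());
    }
}
```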
When I tried this code with Eclipse 4.5.2 and javac 1.8.0_77 all of your examples compiled for me. I have no idea why you are getting different results.
A couple of days ago, I started refactoring some code to use the new Java 8 Streams library. Unfortunately, I ran into a compile time error when performing Stream::map with a method which is declared to throw a generic E that is further specified to be a RuntimeException.
Interestingly enough, the compile-time error goes away when I switch to using a method reference.
Is this a bug, or is my method reference not equivalent to my lambda expression?
(Also, I know I can replace p->p.execute(foo) with Parameter::execute. My actual code has additional parameters for the execute method).
Error message
Error:(32, 43) java: unreported exception E; must be caught or declared to be thrown
Code
import java.util.ArrayList;
import java.util.List;

public class JavaBugTest
{
    interface AbleToThrowException<E extends Exception>
    {
    }

    interface Parameter {
        public <E extends Exception> Object execute(AbleToThrowException<E> algo) throws E;
    }

    interface ThrowsRuntimeException extends AbleToThrowException<RuntimeException>
    {
    }

    static ThrowsRuntimeException foo;

    public static Object manualLambda(Parameter p)
    {
        return p.execute(foo);
    }

    public static void main(String[] args)
    {
        List<Parameter> params = new ArrayList<>();
        params.stream().map(p -> p.execute(foo)); // Gives a compile time error.
        params.stream().map(JavaBugTest::manualLambda); // Works fine.
    }
}
System setup
OS: Windows x64
Java compiler version: Oracle JDK 1.8.0_11
IDE: Intellij
A very simple solution is to explicitly provide a type argument for Parameter#execute(..):
params.stream().map(p -> p.<RuntimeException>execute(foo)); // Compiles fine.
Without the explicit type argument, it seems the JDK compiler cannot infer a type argument from the invocation context, though it should. This is a bug and should be reported as such. I have now reported it and will update this answer with new details when I have them.
Bug Report