This code doesn't compile:
List<String> pairs = new ArrayList<>();
System.out.println(pairs.stream().collect(Collectors.toMap(x -> x.split("=")[0], x -> x.split("=")[1])));
The compilation error is: The method split(String) is undefined for the type Object
It is reported at the line: System.out.println(pairs.stream().collect(Collectors.toMap(x -> x.split("=")[0], x -> x.split("=")[1])));
But this one compiles fine:
List<String> pairs = new ArrayList<>();
Map<String,String> map = pairs.stream().collect(Collectors.toMap(x -> x.split("=")[0], x -> x.split("=")[1]));
System.out.println(map);
Can someone explain why?
MORE INFORMATION
It was IntelliJ IDEA 12, JDK 1.8.0_11, Windows 64-bit.
I assume you are using an IDE (like Eclipse). Eclipse, for example, uses its own compiler and does not use the "javac" command from the JDK.
I can reproduce your problem, but only with Eclipse. Compiling this code on the command line with "javac" works just fine.
The problem is simple: the Eclipse compiler is not able to infer the type String for the lambda parameters passed to toMap, so it simply infers Object (the type it can safely assume), and Object does not have a split method.
You can force the compiler to treat the parameter as a String by explicitly declaring the type inside the lambda:
List<String> pairs = new ArrayList<>();
System.out.println(pairs.stream().collect(Collectors.toMap((String x) -> x.split("=")[0], x -> x.split("=")[1])));
... or by explicitly providing the type arguments for the generic toMap method:
List<String> pairs = new ArrayList<>();
System.out.println(pairs.stream().collect(Collectors.<String, String, String> toMap(x -> x.split("=")[0], x -> x.split("=")[1])));
Different versions of IntelliJ behave differently here; it is only red underlining in the IDE's editor, and the code compiles successfully with the JDK.
IntelliJ 13 handles your code fine. IntelliJ 12's support for lambda expressions is poor; I have run into similar discrepancies between those two versions when using lambda expressions.
Related
I have some fairly complex code that uses Javaslang. If I compile it into a jar, it runs fine. However, when I try to step into it in Eclipse for debugging, Eclipse flags it as a compilation error and dies when it reaches that line. The particularly weird part is that this worked a week ago, and the code has not changed in the interim.
Things I have tried:
clean project (including unchecking 'build automatically')
delete project from Eclipse, delete .project and .settings, re-import from scratch
delete project from Eclipse, delete .project, .classpath, .settings, do mvn eclipse:eclipse, reimport
Maven builds this without errors [both within Eclipse and from the command line]. I can run the project this depends on and have it access this code from the JAR, so I know it works. I just cannot have Eclipse access the code from the project, either in 'run' or 'debug' mode.
Seq<Tuple2<StateProbabilityVector, ScenData>> resultStateProbs =
    futures.
        flatMap(Future::get).
        toList();
// Update the target counts.
// THIS ENTIRE STATEMENT IS THE ERROR
Seq<Tuple2<ScenState, Probability>> result =
resultStateProbs.flatMap(tuple -> tuple.apply((spv, baTargetCount) ->
{
return spv.getStateProbList().
peek(sp -> logger.debug("Checking if {} > {}: {}",
sp.getProbability(),
intermediateMinProb,
sp.getProbability().greaterThan(intermediateMinProb))).
filter(sp -> sp.getProbability().greaterThan(intermediateMinProb)).
map(sp -> updateScenarioData(sp, baTargetCount, dupStateInfo));
}));
// signature for updateScenarioData
protected abstract Tuple2<ScenState, Probability> updateScenarioData(StateProbability stateProb,
ScenData scenData,
DSI dupStateInfo);
// truncated def of StateProbabilityVector
@Getter @ToString @Builder
public class StateProbabilityVector {
@NonNull
private final Seq<StateProbability> stateProbList;
}
So the types are all correct, but Eclipse claims:
> Type mismatch: cannot convert from Object to Iterable<? extends Object>
> Type mismatch: cannot convert from Seq<Object> to Seq<Tuple2<ScenState,Probability>>
As Nándor comments, this is probably down to a difference between the Eclipse compiler and javac, and the problem can probably be solved with a type witness in the right place. To find the right place, I would start by breaking up the functional method chain and extracting some local variables:
Seq<Tuple2<ScenState, Probability>> result =
resultStateProbs.flatMap(tuple -> {
Seq<Tuple2<ScenState, Probability>> filteredAndUpdated =
tuple.apply((spv, baTargetCount) -> {
Seq<StateProbability> stateProbList = spv.getStateProbList();
stateProbList.peek(sp -> {
logger.debug("Checking if {} > {}: {}", sp.getProbability(), intermediateMinProb, sp.getProbability().greaterThan(intermediateMinProb));
});
Seq<StateProbability> filtered = stateProbList.filter(sp ->
sp.getProbability().greaterThan(intermediateMinProb));
Seq<Tuple2<ScenState, Probability>> updated = filtered.map(sp ->
updateScenarioData(sp, baTargetCount, dupStateInfo));
return updated;
});
return filteredAndUpdated;
});
If you use Eclipse's extract variable refactoring, that by itself may tell you where it's inferring the wrong types, and explicitly declaring the correct types of the local variables might be enough to fix the problem all by itself.
If not, it should at least narrow the error down, and show you exactly where in the call chain Eclipse is having trouble. You can then probably fix it with type witnesses or, if all else fails, explicit casts, and then (with that type information added) perhaps inline the variables again, although this code is dense enough that I might leave them in.
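For illustration, here is a minimal, self-contained sketch of the type-witness syntax on Javaslang's flatMap. It uses plain Strings and Integers instead of your domain types and assumes Javaslang 2.x package names; in your code the witness would name Tuple2<ScenState, Probability> instead.
import javaslang.Tuple;
import javaslang.Tuple2;
import javaslang.collection.List;
import javaslang.collection.Seq;

public class TypeWitnessSketch {
    public static void main(String[] args) {
        Seq<Tuple2<String, Integer>> pairs = List.of(Tuple.of("a", 1), Tuple.of("b", 2));

        // The explicit <...> before flatMap is the type witness that pins down
        // the element type of the resulting Seq.
        Seq<Tuple2<String, Integer>> result =
                pairs.<Tuple2<String, Integer>>flatMap(t -> List.of(t));

        System.out.println(result);
    }
}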
Side notes:
peek() will only debug the first StateProbability -- is that your intent?
consider adding a greaterThan() method to StateProbability so you don't have to repeatedly call getProbability().greaterThan(). (If the answer to #1 is "no", this method would also be a good place to put the debug statement.)
consider adding a method on ScenState that would return a prefiltered list, like Seq<StateProbability> ScenState.allGreaterThan(Probability); a rough sketch of both helpers follows below.
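Here is a rough sketch of what those two helpers could look like. These are hypothetical fragments written against the types shown in your snippets (StateProbability, Probability, Seq), assuming a logger is available on StateProbability; they are not a definitive implementation.
// Hypothetical addition to StateProbability:
public boolean greaterThan(Probability threshold) {
    boolean result = getProbability().greaterThan(threshold);
    logger.debug("Checking if {} > {}: {}", getProbability(), threshold, result);
    return result;
}

// Hypothetical addition to StateProbabilityVector (or ScenState, if that is where the list should live):
public Seq<StateProbability> allGreaterThan(Probability threshold) {
    return stateProbList.filter(sp -> sp.greaterThan(threshold));
}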
I have the following method signature
JavaPairRDD<K,Object> countApproxDistinctByKey(double relativeSD)
from class
Class JavaPairRDD<K,V>
Javadoc here
In my code, I do the following
JavaPairRDD<String, String> mapToPair = ... some calculations ...
JavaPairRDD<String, Long> reachedDeviceRDD = mapToPair.countApproxDistinctByKey(0.01);
In Eclipse (Mars) this assignment throws the following error:
Type mismatch: cannot convert from JavaPairRDD<String,Object> to JavaPairRDD<String,Long>
which is correct, given the above signature!
But my problem is that I have been programming with IntelliJ IDEA for 6+ months, and this error has never shown up until now.
It doesn't come up in NetBeans either, and it doesn't show up when compiling the project with Maven (i.e. javac).
It seems to me that the compiler is "automatically casting" the Object type argument to Long, which would be madness.
I don't know what I'm missing.
Additional info:
The classes are from the Apache Spark project, which is written in Scala.
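One workaround that should satisfy both compilers (a hedged sketch: it assumes the values really are java.lang.Long at runtime, which the underlying Scala return type suggests) is to remap the values explicitly:
// Hypothetical workaround: pin the value type via mapValues and a cast.
JavaPairRDD<String, Long> reachedDeviceRDD =
        mapToPair.countApproxDistinctByKey(0.01)
                 .mapValues(v -> (Long) v);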
Edit:
You can find a code example here:
I am trying out Spark Programming examples using Java 1.8 in Eclipse Luna and have the following code -
JavaPairRDD<String, Integer> counts = ones
.reduceByKey(new Function2<Integer, Integer, Integer>() {
@Override
public Integer call(Integer i1, Integer i2) {
return i1 + i2;
}
});
List<Tuple2<String, Integer>> output = counts.collect(); //Compilation Error
I am using M2Eclipse to build and create the JAR, and spark-submit to execute the JAR locally. The JAR works and prints the correct output, but Eclipse always flags the above-mentioned line as a compilation error: The type Tuple2 is not generic; it cannot be parameterized with arguments <String, Integer>
Even the programming examples referred to on the Spark web page use the same notation for Tuple2. https://spark.apache.org/docs/0.9.0/java-programming-guide.html
I cannot understand why Eclipse shows this as a compilation error, since the return type of the collect call is List<Tuple2<String,Integer>>.
Any help is greatly appreciated.
As mentioned by @Holger in the comments, two scala-library JARs had been added to the build path. Removing the earlier version made the compilation errors disappear.
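If it is not obvious where the second scala-library comes from, listing the dependency tree usually reveals it (assuming a Maven build, as in the question):
$ mvn dependency:tree -Dincludes=org.scala-lang:scala-library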
This fixed my problem in IntelliJ IDEA, too. I called a function whose parameter was a generic type parameterized with Tuple2; the IDE always showed an error even though compilation succeeded, which confused me for several days. After removing several dependency shaded JARs (which bundled something related to scala-library), the error disappeared.
I have the following code:
List<Person> personList = getPersons();
List<Function<List<Person>, Stream<Person>>> streams = new ArrayList<>();
streams.add(p -> p.stream());
streams.add(p -> p.parallelStream());
IntelliJ IDEA suggests I should replace the lambda expressions with method references.
I'd like to do so, only I'm not sure what the new generic type of the streams list should be.
I tried to evaluate the expression personList::stream but I get "No such instance field: 'stream'". If I try List::stream or ArrayList::stream (The concrete type of the person list) I get: "No such static field: 'stream'".
Is there a way to add method references to a list?
If so, what should the list's generic type be?
Thanks
As assylias pointed out, IDEA was just complaining and the code ran without problems.
I still had problems with the same code in IDEA 13, since streams.add expected a function whose Stream return type did not match the one IDEA inferred for List::stream. To solve it I ended up using the following code:
List<Person> personList = getPersons();
List<Supplier<Stream<Person>>> streams = new ArrayList<>();
streams.add(personList::stream);
streams.add(personList::parallelStream);
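For completeness, a minimal self-contained sketch of the Supplier-based approach (using Strings in place of the Person type, which isn't shown here):
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.function.Supplier;
import java.util.stream.Stream;

public class SupplierListDemo {
    public static void main(String[] args) {
        List<String> personList = Arrays.asList("Ann", "Bob"); // stand-in for getPersons()
        List<Supplier<Stream<String>>> streams = new ArrayList<>();
        streams.add(personList::stream);
        streams.add(personList::parallelStream);

        // Each supplier produces a fresh stream on demand.
        streams.forEach(s -> System.out.println(s.get().count()));
    }
}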
This compiles fine (b119):
List<String> personList = Arrays.asList("a", "b");
List<Function<List<String>, Stream<String>>> streams = new ArrayList<>();
streams.add(List::stream);
streams.add(List::parallelStream);
You may be using an old build of the JDK, or IntelliJ is messing with you!
personList::stream is basically the same as p -> p.stream(). Neither has a type per se. The type of the expression is the type of the context that accepts it.
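A small self-contained example of that target typing: the same stream method fits two different functional interfaces depending on the context it appears in (class and variable names here are made up for the demo):
import java.util.Arrays;
import java.util.List;
import java.util.function.Function;
import java.util.function.Supplier;
import java.util.stream.Stream;

public class TargetTypingDemo {
    public static void main(String[] args) {
        List<String> list = Arrays.asList("a", "b");

        // Bound reference: the target type Supplier<Stream<String>> shapes the expression.
        Supplier<Stream<String>> bound = list::stream;

        // Unbound reference: the same method, but now the receiver is the function's argument.
        Function<List<String>, Stream<String>> unbound = List::stream;

        System.out.println(bound.get().count());         // prints 2
        System.out.println(unbound.apply(list).count()); // prints 2
    }
}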
I'm playing with Java 8 and e(fx)clipse and just trying things out. I'm trying to apply a map function that removes all "a" characters from a nullable string. However, the subsequent filter and map calls have compile errors because map is returning Optional<Object> instead of Optional<String>.
What am I doing wrong?
Optional.ofNullable(string)
.map( s -> s.replaceAll("a", "") )
.filter( s -> !((String) s).isEmpty() ) //notice the need for cast
.map( s -> "String: " + s )
.ifPresent(System.out::println);
This seems to be an issue with type inference. Optional.map takes an argument of type Function<? super T, ? extends U> and returns an Optional<U>.
Here, for a String input, T is inferred as String, but U is inferred as Object, which is why it returns an Optional<Object>. I haven't used Java 8 much, but I guess this is what is happening.
You can get Optional<String> by giving an explicit type argument:
Optional.ofNullable(string).<String>map(s -> s.replaceAll("a", ""))
.filter(s -> s.isEmpty())
.map(s -> "String" + s)
.ifPresent(System.out::println);
<String> in between . and map(...) denotes an explicit type argument, to ensure that the type argument U is inferred as String.
Frankly I would have expected that to work without explicit type argument, because the Function that is passed to map has both the input and output type as String:
s (type is String) -> s.replaceAll(..) (this also returns String)
and since String satisfies both bounds - ? super T and ? extends U. Remember that in generics, String super String and String extends String both hold. So it took me by surprise that it isn't working as expected. But then, I guess it is exactly for these kinds of surprises that Java has explicit type arguments.
Note: it turned out to be an issue with Eclipse. The original code works fine on the command line.
Another option is to create a Function<T, R> object beforehand, and pass it:
Function<String, String> func = s -> s.replaceAll("a", "");
Optional.ofNullable(string).map(func)
.filter(s -> !s.isEmpty())
.map(s -> "String" + s)
.ifPresent(System.out::println);
Reference:
What is explicit type argument inference?
It seems that your problem has already been fixed in a later Java 8 build because this compiles for me without errors:
public static void main(String[] args) {
Optional.ofNullable("x")
.map(s -> s.replaceAll("a","b"))
.filter(s -> !s.isEmpty());
}
My Java version:
$ java -version
java version "1.8.0-ea"
Java(TM) SE Runtime Environment (build 1.8.0-ea-b108)
Java HotSpot(TM) 64-Bit Server VM (build 25.0-b50, mixed mode)
Note that the Java JDK build number is irrelevant if you are seeing errors within Eclipse because Eclipse uses its own Java compiler. Try to compile your classes with javac from the command line.
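For example, if you drop the snippet above into a class saved as Main.java (a hypothetical file name), compiling it directly with the JDK bypasses the Eclipse compiler entirely:
$ javac Main.java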