I have a requirement to validate a field against some predefined values (which can grow in the future). For this I have created an enum and defined a method that returns a stream of the allowed values.
import java.util.Arrays;
import java.util.List;
import java.util.stream.Stream;

public enum EnumDemo {
    VERSION("1.0.0", "2.0.3");

    private final List<String> ver;

    EnumDemo(String... ver) {
        this.ver = Arrays.asList(ver);
    }

    public List<String> getVer() {
        return ver;
    }

    public static Stream<EnumDemo> stream() {
        return Arrays.stream(EnumDemo.values());
    }
}
Now I need to validate a field against the values defined in this Enum.
I'm using:
Optional<EnumDemo> ab = EnumDemo.stream()
.map(l -> {l.getVer().stream()
.filter(c -> c.equals("2.0.3"))
.findFirst();})
.findFirst();
System.out.println(ab.get().getVer());
But it is giving me a compilation error. Any help would be appreciated.
Edit:
Compilation Error:
The method map(Function<? super EnumDemo,? extends R>) in the type Stream<EnumDemo> is not applicable for the arguments ((<no type> l) -> {})
You should write it this way:
Optional<EnumDemo> ab = EnumDemo.stream().filter(l -> l.getVer().contains("2.0.3"))
.findFirst();
By the way, it wasn't working because you used {} for the lambda expression, so it was expecting a return statement in the {}. You could either remove the {} (along with the ;) or add in the return.
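A minimal illustration of the difference, using a plain Function (not taken from the question's code):
Function<String, Integer> expr = s -> s.length();              // expression body: the value is returned implicitly
Function<String, Integer> block = s -> { return s.length(); }; // block body: an explicit return is required
Both declarations are equivalent; the second form only compiles because of the return statement.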
Anyway, the original code looked confusing, so I'm not sure I guessed the intention correctly, but this implementation should be clearer.
Edit
Based on your comment, this is what you need:
EnumDemo.stream().flatMap(l -> l.getVer().stream())
.filter("2.0.3"::equals)
.findAny()
.ifPresent(System.out::println);
Update
Holger commented that there is a shorter and more meaningful way, with better performance:
if(EnumDemo.stream()
.anyMatch(l -> l.getVer().contains(userString))) {
System.out.println(userString);
}
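If this check is needed in more than one place, it can be wrapped in a small helper on the enum itself. This is just a sketch; the name isValidVersion is made up for illustration and not part of the original code:
public static boolean isValidVersion(String userString) {
    // true if any constant lists the given value among its allowed versions
    return stream().anyMatch(l -> l.getVer().contains(userString));
}
With that, the validation at the call site becomes EnumDemo.isValidVersion("2.0.3").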
To understand it, you have to think about lambdas. Lambdas implement functional interfaces but are treated specially by the JVM, so not every lambda needs its own class to represent it (stateless lambdas can be compiled to plain methods).
Now when looking at the map() method in the Stream interface:
<R> Stream<R> map(Function<? super T, ? extends R> mapper);
You see that it expects an implementation of the Function interface. You have many different ways to provide that mapper. In this example, let's map from Object to String:
1. Using an inline lambda:
.map(o -> o.toString())
2. Using a multiline lambda:
.map(o -> {
return o.toString();
})
3. Using method references:
.map(Object::toString)
4. Using an anonymous class:
.map(new Function<Object, String>() {
    @Override
    public String apply(Object o) {
        return o.toString();
    }
})
Your current code uses the second approach, but without a return statement. This is easiest to see by comparing with the anonymous class in the fourth example: a method body without a return statement naturally returns no value.
And that's why you get the compilation error.
You just have to add the return statement:
.map(l -> {
return l.getVer().stream()
.filter(c -> c.equals("2.0.3"))
.findFirst();
});
Or remove the brackets {}:
.map(l -> l.getVer().stream()
.filter(c -> c.equals("2.0.3"))
.findFirst());
Or even use the approach provided by @Jai in his answer, which works even better than what you currently have.
You are using a lambda expression that doesn't return any value, so it gives a compilation error. It is better to use ifPresent():
String val="2.0.3";
EnumDemo.stream()
.flatMap(l -> l.getVer().stream())
.filter(c -> c.equals(val))
.findAny()
.ifPresent(x -> System.out.println(x));
I can't understand why the String::toUpperCase expression works fine inside the Stream map pipeline. Take this example:
Stream.of("test1", "test2", "test3", "test4")
.filter(s -> s.contains("r"))
.map(s -> s + "map")
.map(String::toUpperCase)
.forEach(System.out::println);
When I look at the definition of the map operation used in the example above, map(Function<? super String, ? extends String> mapper), I see that the Function interface is being used.
In this example, .map(s -> s + "map") is fine: as I understand it, we are looking for a Function, more precisely its R apply(T t) method, and that is exactly what the lambda expression s -> s + "map" provides: a function with a parameter s that returns s plus the String "map". It conforms to the spec; both T and R are present.
On the other hand, for the second one, map(String::toUpperCase), I can't understand why the expression toUpperCase is accepted as a Function. Note that the method itself is just this:
public String toUpperCase() {
return toUpperCase(Locale.getDefault());
}
and we are looking for R apply(T t); there is no T parameter in this toUpperCase method. Why does this one work?
What's much easier to understand, in terms of the apply method of the Function interface, is the anonymous class representation of the method reference String::toUpperCase. It goes like this:
new Function<String, String>() {
    @Override
    public String apply(String str) { // read as: given a String, return a String (uppercased)
        return str.toUpperCase();
    }
}
The String arguments (str) provided to the above apply method are the ones coming from the Stream after the previous map operation.
It's called a method reference and it's syntactic sugar for a lambda expression. In other words the following:
String::toUpperCase
is equivalent to:
s -> s.toUpperCase()
It's a function that takes a String s and returns a String with all letters of s uppercased; in other words, a Function<String, String>.
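A minimal, self-contained way to see the equivalence (plain JDK types, nothing from the question):
Function<String, String> viaLambda = s -> s.toUpperCase();
Function<String, String> viaReference = String::toUpperCase;

System.out.println(viaLambda.apply("test"));    // TEST
System.out.println(viaReference.apply("test")); // TEST
Because String::toUpperCase is an unbound instance-method reference, the function's single String parameter becomes the receiver the method is invoked on.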
Let's say I have two classes and two methods:
class Scratch {
    private class A {}
    private class B extends A {}

    public Optional<A> getItems(List<String> items) {
        return items.stream()
                .map(s -> new B())
                .findFirst();
    }

    public Optional<A> getItems2(List<String> items) {
        return Optional.of(
                items.stream()
                        .map(s -> new B())
                        .findFirst()
                        .get()
        );
    }
}
Why does getItems2 compile while getItems gives the compiler error
incompatible types: java.util.Optional<Scratch.B> cannot be converted to java.util.Optional<Scratch.A>
So when I get the value of the Optional returned by findFirst and wrap it again with Optional.of the compiler recognizes the inheritance but not if I use directly the result of findFirst.
An Optional<B> is not a subtype of Optional<A>. Unlike other programming languages, Java’s generic type system does not know “read only types” or “output type parameters”, so it doesn’t understand that Optional<B> only provides an instance of B and could work at places where an Optional<A> is required.
When we write a statement like
Optional<A> o = Optional.of(new B());
Java’s type inference uses the target type to determine that we want
Optional<A> o = Optional.<A>of(new B());
which is valid as new B() can be used where an instance of A is required.
The same applies to
return Optional.of(
items.stream()
.map(s -> new B())
.findFirst()
.get()
);
where the method’s declared return type is used to infer the type arguments to the Optional.of invocation and passing the result of get(), an instance of B, where A is required, is valid.
Unfortunately, this target type inference doesn’t work through chained invocations, so for
return items.stream()
.map(s -> new B())
.findFirst();
it is not used for the map call. So for the map call, type inference uses the type of new B(), and its result type will be Stream<B>. The second problem is that findFirst() is not generic; calling it on a Stream<T> invariably produces an Optional<T> (and Java's generics do not allow declaring a type variable like <R super T>, so it is not even possible to produce an Optional<R> with the desired type here).
The solution is to provide an explicit type for the map call:
public Optional<A> getItems(List<String> items){
return items.stream()
.<A>map(s -> new B())
.findFirst();
}
Just for completeness: as said, findFirst() is not generic and hence can't use the target type. Chaining a generic method that allows a type change would also fix the problem:
public Optional<A> getItems(List<String> items){
return items.stream()
.map(s -> new B())
.findFirst()
.map(Function.identity());
}
But I recommend using the solution of providing an explicit type for the map invocation.
The issue you have is with inheritance for generics.
Optional<B> doesn't extend Optional<A>, so it can't be returned as such.
I'd imagine that something like this:
public Optional<? extends A> getItems( List<String> items){
return items.stream()
.map(s -> new B())
.findFirst();
}
Or:
public Optional<?> getItems( List<String> items){
return items.stream()
.map(s -> new B())
.findFirst();
}
Would work fine, depending on your needs.
An Optional<B> is not a sub-class of Optional<A>.
In the first case, you have a Stream<B>, so findFirst returns an Optional<B>, which cannot be converted to an Optional<A>.
In the second case, you have a stream pipeline that returns an instance of B. When you pass that instance to Optional.of(), the compiler sees that the return type of the method is Optional<A>, so Optional.of() returns an Optional<A> (an Optional<A> can hold an instance of B as its value, since B extends A).
Look at this similar example:
Optional<A> optA = Optional.of(new B()); //OK
Optional<B> optB = Optional.of(new B()); //OK
Optional<A> optA2 = optB; //doesn't compile
You can make the second method fail by rewriting it as:
public Optional<A> getItems2(List<String> items) {
return Optional.<B>of(items.stream().map(s -> new B()).findFirst().get());
}
This is simply because generic types are invariant.
Why the difference? See the declaration of Optional.of:
public static <T> Optional<T> of(T value) {
return new Optional<>(value);
}
The type of the optional is picked up from the target of the assignment (or return type in this case).
And Stream.findFirst():
//T comes from Stream<T>, it's not a generic method parameter
Optional<T> findFirst();
In this case, however, return items.stream().map(s -> new B()).findFirst(); doesn't type the result of .findFirst() based on the declared return type of getItems (T comes strictly from the type argument of Stream<T>).
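The same contrast can be reproduced with plain JDK types, with CharSequence playing the role of A and String the role of B (a sketch, not taken from the question):
Optional<CharSequence> ok = Optional.of("hello");               // compiles: target typing picks Optional.<CharSequence>of
// Optional<CharSequence> bad = Stream.of("hello").findFirst(); // does not compile: findFirst() yields Optional<String>
Optional<CharSequence> fixed = Stream.of("hello")
        .<CharSequence>map(s -> s)                              // explicit type for map, as in the solution above
        .findFirst();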
If class B inherits from class A, that doesn't mean Optional<B> inherits from Optional<A>. Optional<B> is a different type.
I wish to do lazy evaluation on a list of functions I've defined as follows:
Optional<Output> output = Stream.<Function<Input, Optional<Output>>> of(
classA::eval, classB::eval, classC::eval)
.map(f -> f.apply(input))
.filter(Optional::isPresent)
.map(Optional::get)
.findFirst();
where, as you see, each class (A, B and C) has an Optional<Output> eval(Input in) method defined. If I try to do
Stream.of(...)....
omitting the explicit type, it gives a
T is not a functional interface
compilation error; it does not accept a functional interface type for the generic type T in of(T... values).
Is there a snappier way of creating a stream of these functions? I hate having to explicitly parameterize the of call with Function and its input/output types. Couldn't this work in a more generic manner?
This issue stems from the topic of the following question;
Lambda Expression and generic method
You can break it into two lines:
Stream<Function<Input, Optional<Output>>> stream = Stream
.of(classA::eval, classB::eval, classC::eval);
Optional<Output> out = stream.map(f -> f.apply(input))
.filter(Optional::isPresent)
.map(Optional::get)
.findFirst();
or use casting:
Optional<Output> out = Stream.of(
(Function<Input, Optional<Output>>) classA::eval,
classB::eval,
classC::eval)
.map(f -> f.apply(input))
.filter(Optional::isPresent)
.map(Optional::get)
.findFirst();
but I don't think you can avoid specifying the type of the Stream element - Function<Input, Optional<Output>> - somewhere, since otherwise the compiler can't infer it from the method references.
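Another option along the same lines is to write the Function type once, inside a small reusable helper, so the call site stays clean. This is only a sketch with made-up names (firstResult, and evalA/evalB/evalC standing in for the question's classA/classB/classC); it relies on the method references being exact (eval not overloaded) so that inference can work:
import java.util.Optional;
import java.util.function.Function;
import java.util.stream.Stream;

public class FirstResultDemo {

    // Hypothetical helper: the Function type is spelled out once here,
    // so the call site below needs no cast or explicit type witness.
    @SafeVarargs
    static <I, O> Optional<O> firstResult(I input, Function<I, Optional<O>>... fns) {
        return Stream.of(fns)
                .map(f -> f.apply(input))
                .filter(Optional::isPresent)
                .map(Optional::get)
                .findFirst();
    }

    // Stand-ins for classA::eval, classB::eval and classC::eval.
    static Optional<Integer> evalA(String s) { return Optional.empty(); }
    static Optional<Integer> evalB(String s) { return Optional.of(s.length()); }
    static Optional<Integer> evalC(String s) { return Optional.of(-1); }

    public static void main(String[] args) {
        // evalA yields empty, evalB yields a value; findFirst() short-circuits,
        // so evalC is never applied. Prints Optional[4].
        System.out.println(firstResult("test",
                FirstResultDemo::evalA,
                FirstResultDemo::evalB,
                FirstResultDemo::evalC));
    }
}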
There is a way that allows to omit the Function<Input, Optional<Output>> type, but it’s not necessarily an improvement
Optional<Output> o =
Stream.concat(Stream.of(input).map(classA::eval),
Stream.concat(Stream.of(input).map(classB::eval),
Stream.of(input).map(classC::eval)))
.filter(Optional::isPresent)
.map(Optional::get)
.findFirst();
and it doesn’t scale.
It seems the best option is to wait for Java 9, where you can use
Optional<Output> o = classA.eval(input)
.or(() -> classB.eval(input))
.or(() -> classC.eval(input));
I have the following expression:
scheduleIntervalContainers.stream()
.filter(sic -> ((ScheduleIntervalContainer) sic).getStartTime() != ((ScheduleIntervalContainer)sic).getEndTime())
.collect(Collectors.toList());
...where scheduleIntervalContainers has element type ScheduleContainer:
final List<ScheduleContainer> scheduleIntervalContainers
Is it possible to check the type before the filter?
You can apply another filter to keep only the ScheduleIntervalContainer instances, and adding a map will save you the later casts:
scheduleIntervalContainers.stream()
.filter(sc -> sc instanceof ScheduleIntervalContainer)
.map (sc -> (ScheduleIntervalContainer) sc)
.filter(sic -> sic.getStartTime() != sic.getEndTime())
.collect(Collectors.toList());
Or, as Holger commented, you can replace the lambda expressions with method references if you prefer that style:
scheduleIntervalContainers.stream()
.filter(ScheduleIntervalContainer.class::isInstance)
.map (ScheduleIntervalContainer.class::cast)
.filter(sic -> sic.getStartTime() != sic.getEndTime())
.collect(Collectors.toList());
A pretty elegant option is to use method references of the class:
scheduleIntervalContainers
.stream()
.filter( ScheduleIntervalContainer.class::isInstance )
.map( ScheduleIntervalContainer.class::cast )
.filter( sic -> sic.getStartTime() != sic.getEndTime())
.collect(Collectors.toList() );
There is a small problem with @Eran's solution: typing the class name in both filter and map is error-prone, as it is easy to forget to change the name of the class in both places. An improved solution would be something like this:
private static <T, R> Function<T, Stream<R>> select(Class<R> clazz) {
    // flatMap treats a null mapped stream as an empty stream, though returning Stream.empty() would be clearer
    return e -> clazz.isInstance(e) ? Stream.of(clazz.cast(e)) : null;
}
scheduleIntervalContainers
.stream()
.flatMap(select(ScheduleIntervalContainer.class))
.filter( sic -> sic.getStartTime() != sic.getEndTime())
.collect(Collectors.toList());
However, there might be a performance penalty in creating a Stream for every matching element, so be careful when using it on huge data sets. I learned this solution from @Tagir Valeev.
Instead of a filter + map like other answers suggest, I would recommend this utility method:
public static <Super, Sub extends Super> Function<Super, Stream<Sub>> filterType(Class<Sub> clz) {
return obj -> clz.isInstance(obj) ? Stream.of(clz.cast(obj)) : Stream.empty();
}
Use it as:
Stream.of(dog, cat, fish)
.flatMap(filterType(Dog.class));
Compared to filter + map it has the following advantages:
If the class does not extend your class you will get a compile error
Single place, you can never forget to change a class in either filter or map
Filter by class type with StreamEx
StreamEx.of(myCollection).select(TheThing.class).toList();
I have written the following code:
Set<Pair<Predicate>> interestingPairs = ...
...
interestingPairs.stream()
.filter(pair -> pair.getFirst().isNegated() == pair.getSecond().isNegated())
.flatMap( pair -> findUnification(pair)
.map(u -> {
Set<String> aVars = pair.getFirst().getAllVariables();
Set<String> bVars = pair.getSecond().getAllVariables();
if(u.sigma.isEmptyOrVarByVar(aVars) && !u.sigmaPrime.isEmptyOrVarByVar(bVars))
return Stream.of(pair.getFirst());
if (!u.sigma.isEmptyOrVarByVar(bVars) && u.sigmaPrime.isEmptyOrVarByVar(aVars))
return Stream.of(pair.getSecond());
return Stream.empty();
})
.orElse(Stream.empty()))
.forEach(toRemove -> this.predicates.remove((Predicate)toRemove));
I'm using the IntelliJ IDE. After the flatMap operation the elements of my stream have type Object instead of Predicate, so in the forEach, toRemove doesn't have the desired type. When I change return Stream.empty() to return Stream.<Predicate>empty(), the element type after flatMap is Predicate. I could even understand this if it weren't for .orElse(Stream.empty()), where I don't have to add the <Predicate> explicitly. What's the thing I don't get here?
I assume that your Predicate is your own non-generic type, not java.util.function.Predicate. It seems that the compiler fails to infer the type of the .flatMap argument. You may help it by specifying the type explicitly:
interestingPairs.stream()
.filter(pair -> pair.getFirst().isNegated() == pair.getSecond().isNegated())
.<Predicate>flatMap( pair -> findUnification(pair)
... and so on
To my understanding, the current JLS type inference rules do not cover this case, so it's not a compiler bug. You just have to specify the type argument explicitly sometimes, or extract complex lambdas into intermediate variables.
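For illustration, here is a self-contained version of the same kind of fix with plain JDK types (nothing from the question's code). The block lambda returns Stream.of(...) on one path and Stream.empty() on the other, which is the situation described above; the explicit <Integer> witness pins the element type, just like writing Stream.<Integer>empty() on the empty branch would:
import java.util.stream.Stream;

public class FlatMapInferenceDemo {
    public static void main(String[] args) {
        Stream.of("1", "x", "2")
                .<Integer>flatMap(s -> {
                    if (s.matches("\\d+")) {
                        return Stream.of(Integer.parseInt(s));
                    }
                    return Stream.empty();
                })
                .forEach(n -> System.out.println(n + 1)); // n is an Integer here; prints 2 and 3
    }
}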