Apologies, that is probably the worst title I've used, but I can't quite think how else to word it.
I'm calling a method table.getColData(COL_1) which has a generic return type:
public <T extends Object> List<T> getColData(final String col)
I am calling it twice and getting two lists of strings. I want to concatenate these and have ended up with this -
List<String> a = table.getColData(col1);
List<String> b = table.getColData(col2);
List<String> c = Stream.concat(a.stream(), b.stream()).collect(Collectors.toList());
which works nicely, I think. I can't find a way to avoid the two declarations, though, without getting an error, as the concat thinks it has a list of objects that are not Strings (the IDE prompts me to change the type of c to List<Object>).
Is there an easy way to do this to make it look a little more polished?!
You are limited by the compiler's type inference.
List<String> c = Stream.concat(table.getColData(col1).stream(), table.getColData(col2).stream()).collect(Collectors.toList());
cannot compile, because the return type of getColData() is inferred as List<Object> when you don't specify a target type at the invocation site.
You can concatenate two streams of List<Object>, but it will produce a Stream<Object>, not a Stream<String>.
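To see the limitation concretely, here is roughly what the inlined version can produce under the question's declarations; only the declared type of c differs:
// With no String target type at the invocation, T is inferred as Object, so the
// concatenated stream is a Stream<Object> and collect() yields a List<Object>.
List<Object> c = Stream
        .concat(table.getColData(col1).stream(), table.getColData(col2).stream())
        .collect(Collectors.toList());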
Introducing two intermediary variables is not necessarily the best way.
1) As an alternative, as suggested by Holger, you could specify the type argument T explicitly at the call site:
Stream.concat(table.<String>getColData(col1).stream(), table.<String>getColData(col2).stream())
.collect(Collectors.toList());
2) You could also transform the Stream<Object> into a Stream<String> in a map() operation:
List<String> c = Stream.concat(table.getColData(col1).stream(), table.getColData(col2).stream())
.map(s -> (String) s)
.collect(Collectors.toList());
3) Or introduce an additional method that avoids any explicit cast by concatenating the streams of the two lists and collecting them into a List that it returns:
public <T> List<T> concatAndCollectToList(final List<T> a, List<T> b) {
return Stream.concat(a.stream(), b.stream())
.collect(Collectors.toList());
}
You can now simply write:
List<String> c = concatAndCollectToList(table.getColData(col1), table.getColData(col2));
To be honest, I would add a type token (a Class<T> method argument, even if it is otherwise unused) to make this method type safe:
public static <T> List<T> getColData(String col, Class<T> clazz) {
// whatever you did as before
}
With that in place, your type safety holds:
List<String> c = Stream.concat(
getColData("", String.class).stream(),
getColData("", String.class).stream())
.collect(Collectors.toList());
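For illustration only, the body might use the Class token to cast each raw value, along these lines (fetchRawColumn is a hypothetical stand-in for whatever the lookup did before):
public static <T> List<T> getColData(String col, Class<T> clazz) {
    List<?> raw = fetchRawColumn(col);   // hypothetical: the existing untyped lookup
    return raw.stream()
              .map(clazz::cast)          // fails fast if a value is not actually a T
              .collect(Collectors.toList());
}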
Firstly, the T extends Object is redundant, since every type T extends Object by definition.
Since the method returns List<T>, it is unknown at compile time what T is; therefore, passing those two generic results into the Stream without a target type gives a collected result of List<Object>, regardless of T.
You have to map each of the values to String; at this point you have to ensure that all the list items are convertible into String, using a cast or a conversion:
List<String> c = Stream
.concat(table.getColData(col1).stream(), table.getColData(col2).stream())
.map(s -> (String) s)
.collect(Collectors.toList());
I recommend the shorter and more readable way, which uses Stream::flatMap:
List<String> c = Stream.of(table.getColData(col1), table.getColData(col2))
.flatMap(list -> list.stream().map(s -> (String) s))
.collect(Collectors.toList());
The result depends on what exactly the method getColData returns and whether it is convertible to String (as noted by @Eugene).
The simplest would be:
Stream.of(col1, col2)
.map(table::<String>getColData)
.flatMap(List::stream)
.collect(Collectors.toList());
Since your getColData method is generic in T, you can specify what type T is by using a type witness, <String>. The rest is the Java syntax for method references.
Also, the use of generics can be questioned here. This is equivalent to having
public List<Object> getColData(final String col)
and casting each element to a String:
Stream.of(col1, col2)
        .map(table::getColData)
        .flatMap(List::stream)
        .map(o -> (String) o)
        .collect(Collectors.toList());
The simplest approach would be to just add an explicit type argument to the method call using <>:
Stream.concat(
table.<String>getColData(col1).stream(),
table.<String>getColData(col2).stream())
.collect(toList());
Let's say I have two classes and two methods:
class Scratch {
private class A{}
private class B extends A{}
public Optional<A> getItems(List<String> items){
return items.stream()
.map(s -> new B())
.findFirst();
}
public Optional<A> getItems2(List<String> items){
return Optional.of(
items.stream()
.map(s -> new B())
.findFirst()
.get()
);
}
}
Why does getItems2 compile, while getItems gives a compiler error:
incompatible types: java.util.Optional<Scratch.B> cannot be converted to java.util.Optional<Scratch.A>
So when I get the value of the Optional returned by findFirst and wrap it again with Optional.of, the compiler recognizes the inheritance, but not if I use the result of findFirst directly.
An Optional<B> is not a subtype of Optional<A>. Unlike other programming languages, Java’s generic type system does not know “read only types” or “output type parameters”, so it doesn’t understand that Optional<B> only provides an instance of B and could work at places where an Optional<A> is required.
When we write a statement like
Optional<A> o = Optional.of(new B());
Java’s type inference uses the target type to determine that we want
Optional<A> o = Optional.<A>of(new B());
which is valid as new B() can be used where an instance of A is required.
The same applies to
return Optional.of(
items.stream()
.map(s -> new B())
.findFirst()
.get()
);
where the method’s declared return type is used to infer the type arguments to the Optional.of invocation and passing the result of get(), an instance of B, where A is required, is valid.
Unfortunately, this target type inference doesn’t work through chained invocations, so for
return items.stream()
.map(s -> new B())
.findFirst();
it is not used for the map call. So for the map call, the type inference uses the type of new B(), and its result type will be Stream<B>. The second problem is that findFirst() is not generic; calling it on a Stream<T> invariably produces an Optional<T> (and Java's generics do not allow declaring a type variable like <R super T>, so it is not even possible to produce an Optional<R> with the desired type here).
The solution is to provide an explicit type for the map call:
public Optional<A> getItems(List<String> items){
return items.stream()
.<A>map(s -> new B())
.findFirst();
}
Just for completeness: as said, findFirst() is not generic and hence can't use the target type. Chaining a generic method that allows a type change would also fix the problem:
public Optional<A> getItems(List<String> items){
return items.stream()
.map(s -> new B())
.findFirst()
.map(Function.identity());
}
But I recommend using the solution of providing an explicit type for the map invocation.
The issue you have is with inheritance for generics.
Optional<B> doesn't extend Optional<A>, so it can't be returned as such.
I'd imagine that something like this:
public Optional<? extends A> getItems( List<String> items){
return items.stream()
.map(s -> new B())
.findFirst();
}
Or:
public Optional<?> getItems( List<String> items){
return items.stream()
.map(s -> new B())
.findFirst();
}
Would work fine, depending on your needs.
An Optional<B> is not a sub-class of Optional<A>.
In the first case, you have a Stream<B>, so findFirst returns an Optional<B>, which cannot be converted to an Optional<A>.
In the second case, you have a stream pipeline that returns an instance of B. When you pass that instance to Optional.of(), the compiler sees that the return type of the method is Optional<A>, so Optional.of() returns an Optional<A>, since an Optional<A> can hold an instance of B as its value (because B extends A).
Look at this similar example:
Optional<A> optA = Optional.of(new B()); //OK
Optional<B> optB = Optional.of(new B()); //OK
Optional<A> optA2 = optB; //doesn't compile
You can make the second method fail by rewriting it as:
public Optional<A> getItems2(List<String> items) {
return Optional.<B>of(items.stream().map(s -> new B()).findFirst().get());
}
This is simply because generic types are invariant.
Why the difference? See the declaration of Optional.of:
public static <T> Optional<T> of(T value) {
return new Optional<>(value);
}
The type of the optional is picked up from the target of the assignment (or return type in this case).
And Stream.findFirst():
//T comes from Stream<T>, it's not a generic method parameter
Optional<T> findFirst();
In this case, however, return items.stream().map(s -> new B()).findFirst(); doesn't type the result of .findFirst() based on the declared return type of getItems (T is strictly based on the type argument of Stream<T>).
If class B inherits from class A, that doesn't mean Optional<B> inherits from Optional<A>. Optional<B> is a different type.
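For intuition on why Java enforces this invariance (as noted above, its type system has no notion of read-only type parameters), consider what covariance would allow with a mutable container like List, using the question's A and B; the rejected line is commented out:
List<B> bs = new ArrayList<>();
// List<A> as = bs;   // does not compile, by design
// as.add(new A());   // otherwise this would sneak an A into a list that must hold only Bs
B element = bs.isEmpty() ? null : bs.get(0);   // every read from bs must still yield a B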
I was trying to write a mkString function in Java 8, à la Scala's useful mkString, and ran into two issues that I could use some help on:
I am unable to make the first argument of mkString a generic Collection reference like Collection<Object> c and have callers pass ANY type of collection.
I am unable to reference the returned result of reduce() inline to access its length in order to remove the extra leading separator.
Here's the code:
public static void main(String[] args) {
List<Integer> numbers = Arrays.asList(1, 2, 3, 4, 5);
System.out.println(mkString(numbers, ","));
}
public static String mkString(Collection<Integer> c, String sep) {
return c.stream()
.map(e -> String.valueOf(e))
.reduce("", (a, b) -> a + sep + b)
.substring(1, <<>>.length);
}
Note that if you're doing this not for self-education but to actually use it in some production code, you might want to consider the built-in Collectors.joining collector:
String result = numbers.stream()
.map(Object::toString)
// or
// .map(x -> x.toString()) // exactly the same
// or
// .map(String::valueOf) // handles nulls by turning them to the string "null"
.collect(Collectors.joining(","));
It has several overloads, similar to Scala's mkString. Still, this collector only accepts CharSequences, so you need to convert your values to strings explicitly as a separate map step.
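For example, the three-argument overload adds a prefix and a suffix around the joined result (the delimiter, prefix and suffix below are just example values):
String bracketed = numbers.stream()
        .map(Object::toString)
        .collect(Collectors.joining(", ", "[", "]"));   // "[1, 2, 3, 4, 5]"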
Additionally, there is the String.join method, which also works for a collection of CharSequences. If you specifically have one of those (e.g. List<String>), it might be more convenient to use this method rather than converting the collection to a stream first:
List<String> strings = ...;
String result = String.join(",", strings);
// vs
String result = strings.stream().collect(Collectors.joining(","));
If I remember my Java correctly, you can declare the argument type as Collection<?> to be able to pass a collection of any objects.
As for biting the separator off, I think just .substring(1) will do what you want.
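Put together, that suggestion might look roughly like this (kept close to the question's reduce-based code; it assumes a non-empty collection, and using sep.length() instead of 1 also handles multi-character separators):
public static String mkString(Collection<?> c, String sep) {
    return c.stream()
            .map(String::valueOf)               // any element type becomes a String
            .reduce("", (a, b) -> a + sep + b)  // yields sep + e1 + sep + e2 + ...
            .substring(sep.length());           // strip the leading separator
}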
You can do it like this:
public static <T> String mkString(Collection<T> c, String sep) { // generic impl
return c.stream()
.map(String::valueOf)
.reduce("", (a, b) -> a + sep + b)
.substring(1); // substring implementation to strip leading character
}
Any type of collection in Java means Collection<?>, which in your case is semantically the same as Collection<T> (it is said that if a type parameter is used only once, it can safely be replaced with a wildcard). But since you want to be able to join any collection, you should also ask callers to supply a Function that transforms that type into a String representation; thus your method would become:
public static <T> String mkString(Collection<T> c,
Function<T, ? extends CharSequence> mapper,
String sep) {
return c.stream()
.map(mapper)
.collect(Collectors.joining(sep));
}
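A usage sketch for this variant, with the question's list of integers:
List<Integer> numbers = Arrays.asList(1, 2, 3, 4, 5);
String joined = mkString(numbers, String::valueOf, ",");   // "1,2,3,4,5"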
You can utilize String.join with a generic type:
public static <T> String mkString(Collection<T> c, String sep) {
return String.join(sep, c.stream()
.map(e -> String.valueOf(e))
.collect(Collectors.toList()));
}
Here it is in action with both Strings and other objects.
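For instance, a quick demo along those lines:
List<String> words = Arrays.asList("a", "b", "c");
List<Integer> numbers = Arrays.asList(1, 2, 3);
System.out.println(mkString(words, "-"));    // a-b-c
System.out.println(mkString(numbers, ", ")); // 1, 2, 3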
I have a requirement to validate a field against some predefined values (which can grow in the future). For this I have created an enum and defined a method that returns a stream of its constants.
public enum EnumDemo {
VERSION("1.0.0","2.0.3");
private List<String> ver;
EnumDemo(String... ver) {
this.ver = Arrays.asList(ver);
}
public List<String> getVer() {
return ver;
}
public static Stream<EnumDemo> stream() {
return Arrays.stream(EnumDemo.values());
}
}
Now I need to validate a field against the values defined in this Enum.
I'm using:
Optional<EnumDemo> ab = EnumDemo.stream()
.map(l -> {l.getVer().stream()
.filter(c -> c.equals("2.0.3"))
.findFirst();})
.findFirst();
System.out.println(ab.get().getVer());
But it is giving me a compilation error. Any help would be appreciated.
Edit:
Compilation Error:
The method map(Function<? super EnumDemo,? extends R>) in the type Stream<EnumDemo> is not applicable for the arguments ((<no type> l) -> {})
You should write it this way:
Optional<EnumDemo> ab = EnumDemo.stream().filter(l -> l.getVer().contains("2.0.3"))
.findFirst();
By the way, it wasn't working because you used {} for the lambda expression, so it was expecting a return statement in the {}. You could either remove the {} (along with the ;) or add in the return.
Anyway, the original code looked confusing; I'm not sure if I guessed the intention correctly, but this implementation should be clearer.
Edit
Based on your comment, this is what you need:
EnumDemo.stream().flatMap(l -> l.getVer().stream())
.filter("2.0.3"::equals)
.findAny()
.ifPresent(System.out::println);
Update
Holger commented that there is a shorter and more meaningful way, with better performance:
if(EnumDemo.stream()
.anyMatch(l -> l.getVer().contains(userString))) {
System.out.println(userString);
}
To understand it, you have to think about lambdas. Lambdas implement functional interfaces but are treated specially by the JVM, so not every lambda needs its own class (stateless lambdas can be compiled to plain methods).
Now when looking at the map() method in the Stream interface:
<R> Stream<R> map(Function<? super T, ? extends R> mapper);
You see that it expects an implementation of the Function interface. You have many different ways to provide that mapper. In this example, let's map from Object to String:
1. Using an inline lambda:
.map(o -> o.toString())
2. Using a multiline lambda:
.map(o -> {
return o.toString();
})
3. Using method references:
.map(Object::toString)
4. Using an anonymous class:
.map(new Function<Object, String>(){
@Override
public String apply(Object o){
return o.toString();
}
})
Your current code uses approach 2, but without a return statement. This is easiest to see by comparing it with the anonymous class in approach 4: when a method body has no return statement, no value is returned.
And that's why you get the compilation error.
You just have to add the return statement:
.map(l -> {
return l.getVer().stream()
.filter(c -> c.equals("2.0.3"))
.findFirst();
});
Or remove the braces {}:
.map(l -> l.getVer().stream()
.filter(c -> c.equals("2.0.3"))
.findFirst());
Or even use the approach provided by @Jai in his answer, which works even better than what you currently have.
You are using a lambda expression with a block body and not returning any value, so it gives a compilation error. It is better to use ifPresent():
String val="2.0.3";
EnumDemo.stream()
.flatMap(l -> l.getVer().stream())
.filter(c -> c.equals(val))
.findAny()
.ifPresent(x -> System.out.println(x));
I have written the following code:
Set<Pair<Predicate>> interestingPairs = ...
...
interestingPairs.stream()
.filter(pair -> pair.getFirst().isNegated() == pair.getSecond().isNegated())
.flatMap( pair -> findUnification(pair)
.map(u -> {
Set<String> aVars = pair.getFirst().getAllVariables();
Set<String> bVars = pair.getSecond().getAllVariables();
if(u.sigma.isEmptyOrVarByVar(aVars) && !u.sigmaPrime.isEmptyOrVarByVar(bVars))
return Stream.of(pair.getFirst());
if (!u.sigma.isEmptyOrVarByVar(bVars) && u.sigmaPrime.isEmptyOrVarByVar(aVars))
return Stream.of(pair.getSecond());
return Stream.empty();
})
.orElse(Stream.empty()))
.forEach(toRemove -> this.predicates.remove((Predicate)toRemove));
I'm using the IntelliJ IDE. After the flatMap operation, the elements of my stream have type Object instead of Predicate, so in the forEach, toRemove does not have the desired type. When I change return Stream.empty() to return Stream.<Predicate>empty(), the element type after flatMap is Predicate. I would understand this if it weren't for .orElse(Stream.empty()), where I don't have to add the <Predicate> explicitly. What's the thing I don't get here?
I assume that your Predicate is a custom non-generic type, not java.util.function.Predicate. It seems the compiler fails to infer the type argument of the flatMap call. You may help it by specifying it explicitly:
interestingPairs.stream()
.filter(pair -> pair.getFirst().isNegated() == pair.getSecond().isNegated())
.<Predicate>flatMap( pair -> findUnification(pair)
... and so on
To my understanding, the current JLS type inference rules do not cover this case, so it's not a compiler bug. You sometimes just have to specify a type argument explicitly, or extract complex lambdas into intermediate variables, as sketched below.
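A minimal illustration of the extract-to-a-variable workaround (Optional<Integer> stands in for the question's types, and someStreamOfOptionals is a hypothetical Stream<Optional<Integer>>):
// Declaring the lambda with an explicit Function type pins down its element type,
// so the chained map(...).orElse(Stream.empty()) stays a Stream<Integer> and a
// later flatMap needs no type witness.
Function<Optional<Integer>, Stream<Integer>> toStream =
        opt -> opt.map(Stream::of).orElse(Stream.empty());
Stream<Integer> flattened = someStreamOfOptionals.flatMap(toStream);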
Let me preface this by saying that my objects' equals implementation is not how I need to filter, so distinct() by itself does not work.
class MyObject {
String foo;
MyObject( String foo ) {
this.foo = foo;
}
public String getFoo() { return foo; }
}
Collection<MyObject> listA = Arrays.asList("a", "b", "c").stream().map(MyObject::new)
.collect(Collectors.toList());
Collection<MyObject> listB = Arrays.asList("b", "d").stream().map(MyObject::new)
.collect(Collectors.toList());
// magic
How can I merge and deduplicate the lists so that the resulting list contains MyObjects for "a", "b", "c", "d"?
Note: This is a simplification of the objects we actually need to deduplicate, which are complex DTOs of entities loaded by Hibernate, but this example should adequately demonstrate the objective.
Such a feature is discussed by the JDK developers (see JDK-8072723) and might be included in Java 9 (though that is not guaranteed). The StreamEx library, developed by me, already has this feature, so you can use it:
List<MyObject> distinct = StreamEx.of(listA).append(listB)
.distinct(MyObject::getFoo).toList();
The StreamEx class is an enhanced Stream which is completely compatible with the JDK Stream but has many additional operations, including distinct(Function), which allows you to specify a key extractor for the distinct operation. Internally it's pretty similar to the solution proposed by @fge.
You can also consider writing custom collector which will combine getting distinct objects and storing them to list:
public static <T> Collector<T, ?, List<T>> distinctBy(Function<? super T, ?> mapper) {
return Collector.<T, Map<Object, T>, List<T>> of(LinkedHashMap::new,
(map, t) -> map.putIfAbsent(mapper.apply(t), t), (m1, m2) -> {
for(Entry<Object, T> e : m2.entrySet()) {
m1.putIfAbsent(e.getKey(), e.getValue());
}
return m1;
}, map -> new ArrayList<>(map.values()));
}
This collector intermediately collects the results into a Map<Key, Element>, where Key is the extracted key and Element is the corresponding stream element. To make sure that exactly the first occurrence is preserved among all repeating elements, a LinkedHashMap is used. Finally, you just take the values() of this map and dump them into a list. So now you can write:
List<MyObject> distinct = Stream.concat(listA.stream(), listB.stream())
.collect(distinctBy(MyObject::getFoo));
If you don't care whether the resulting collection is a list or not, you can even remove the new ArrayList<>() step (just use Map::values as the finisher). More simplifications are possible if you don't care about order:
public static <T> Collector<T, ?, Collection<T>> distinctBy(Function<? super T, ?> mapper) {
return Collector.<T, Map<Object, T>, Collection<T>> of(HashMap::new,
(map, t) -> map.put(mapper.apply(t), t),
(m1, m2) -> { m1.putAll(m2); return m1; },
Map::values);
}
Such collector (preserving the order and returning the List) is also available in StreamEx library.
If .equals() does not work for you, then you may want to have a go at using Guava's Equivalence.
Provided that your type is T, you need to implement an Equivalence<T>; once you have this, you need to create a:
Set<Equivalence.Wrapper<T>>
into which you'll gather your values. Then, provided your implementation of Equivalence<T> is some static variable named EQ, adding to this set is as simple as:
coll1.stream().map(EQ::wrap).forEach(set::add);
coll2.stream().map(EQ::wrap).forEach(set::add);
And then to obtain a List<T> from this set, you could:
final Set<T> unwrapped = set.stream().map(Equivalence.Wrapper::get)
.collect(Collectors.toSet());
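For completeness, the EQ constant referenced above could be implemented roughly like this, keying on the objects' business identity (MyObject and getFoo are from the question; the API is Guava's Equivalence):
static final Equivalence<MyObject> EQ = new Equivalence<MyObject>() {
    @Override
    protected boolean doEquivalent(MyObject a, MyObject b) {
        return a.getFoo().equals(b.getFoo());   // equal iff the business keys match
    }
    @Override
    protected int doHash(MyObject o) {
        return o.getFoo().hashCode();
    }
};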
But of course, since in your comments you say you can do it with a loop, well... Why not keep using that loop?
If it works, don't fix it...
Collection<MyObject> result = Stream.concat(listA.stream(), listB.stream())
.filter(distinct(MyObject::getFoo))
.collect(Collectors.toList());
public static <T> Predicate<T> distinct(Function<? super T, Object> keyExtractor) {
Map<Object, String> seen = new ConcurrentHashMap<>();
return t -> seen.put(keyExtractor.apply(t), "") == null;
}
I found this distinct function once in a blog post (I can't remember the link at the moment).