Java 8 lambda get and remove element from list - java

Given a list of elements, I want to get the element with a given property and remove it from the list. The best solution I found is:
ProducerDTO p = producersProcedureActive
.stream()
.filter(producer -> producer.getPod().equals(pod))
.findFirst()
.get();
producersProcedureActive.remove(p);
Is it possible to combine get and remove in a lambda expression?

To remove an element from the list:
objectA.removeIf(x -> conditions);
e.g.:
objectA.removeIf(x -> blockedWorkerIds.contains(x));
List<String> str1 = new ArrayList<String>();
str1.add("A");
str1.add("B");
str1.add("C");
str1.add("D");
List<String> str2 = new ArrayList<String>();
str2.add("D");
str2.add("E");
str1.removeIf(x -> str2.contains(x));
str1.forEach(System.out::println);
OUTPUT:
A
B
C

Although the thread is quite old, I still thought I'd provide a solution using Java 8.
Make use of the removeIf function. Its time complexity is O(n).
producersProcedureActive.removeIf(producer -> producer.getPod().equals(pod));
API reference: removeIf docs
Assumption: producersProcedureActive is a List
NOTE: With this approach you won't be able to get the hold of the deleted item.

Consider using vanilla java iterators to perform the task:
public static <T> T findAndRemoveFirst(Iterable<? extends T> collection, Predicate<? super T> test) {
    T value = null;
    for (Iterator<? extends T> it = collection.iterator(); it.hasNext(); )
        if (test.test(value = it.next())) {
            it.remove();
            return value;
        }
    return null;
}
Advantages:
It is plain and obvious.
It traverses only once and only up to the matching element.
You can do it on any Iterable even without stream() support (at least those implementing remove() on their iterator).
Disadvantages:
You cannot do it in place as a single expression (auxiliary method or variable required)
As for the question
"Is it possible to combine get and remove in a lambda expression?"
other answers clearly show that it is possible, but you should be aware of the following:
Search and removal may traverse the list twice.
A ConcurrentModificationException may be thrown when removing an element from the list being iterated.
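For illustration, a call for the example in the question might look like this (ProducerDTO, producersProcedureActive and pod are the names from the original post):
ProducerDTO removed = findAndRemoveFirst(producersProcedureActive,
        producer -> producer.getPod().equals(pod)); // null if no producer with that pod exists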

The direct solution would be to invoke ifPresent(consumer) on the Optional returned by findFirst(). This consumer will be invoked when the optional is not empty. Another benefit is that it won't throw an exception if the find operation returned an empty optional, as your current code would do; instead, nothing will happen.
If you want to return the removed value, you can map the Optional to the result of calling remove:
producersProcedureActive.stream()
.filter(producer -> producer.getPod().equals(pod))
.findFirst()
.map(p -> {
producersProcedureActive.remove(p);
return p;
});
But note that the remove(Object) operation will again traverse the list to find the element to remove. If you have a list with random access, like an ArrayList, it would be better to make a Stream over the indexes of the list and find the first index matching the predicate:
IntStream.range(0, producersProcedureActive.size())
.filter(i -> producersProcedureActive.get(i).getPod().equals(pod))
.boxed()
.findFirst()
.map(i -> producersProcedureActive.remove((int) i));
With this solution, the remove(int) operation operates directly on the index.

You can use Java 8's filter, and create another list if you don't want to change the old list:
List<ProducerDTO> result = producersProcedureActive
.stream()
.filter(producer -> producer.getPod().equals(pod))
.collect(Collectors.toList());

I'm sure this will be an unpopular answer, but it works...
ProducerDTO[] p = new ProducerDTO[1];
producersProcedureActive
        .stream()
        .filter(producer -> producer.getPod().equals(pod))
        .findFirst()
        .ifPresent(producer -> { producersProcedureActive.remove(producer); p[0] = producer; });
p[0] will either hold the found element or be null.
The "trick" here is circumventing the "effectively final" problem by using an array reference that is effectively final, but setting its first element.

With Eclipse Collections you can use detectIndex along with remove(int) on any java.util.List.
List<Integer> integers = Lists.mutable.with(1, 2, 3, 4, 5);
int index = Iterate.detectIndex(integers, i -> i > 2);
if (index > -1) {
integers.remove(index);
}
Assert.assertEquals(Lists.mutable.with(1, 2, 4, 5), integers);
If you use the MutableList type from Eclipse Collections, you can call the detectIndex method directly on the list.
MutableList<Integer> integers = Lists.mutable.with(1, 2, 3, 4, 5);
int index = integers.detectIndex(i -> i > 2);
if (index > -1) {
integers.remove(index);
}
Assert.assertEquals(Lists.mutable.with(1, 2, 4, 5), integers);
Note: I am a committer for Eclipse Collections

The logic below is a solution that does not modify the original list:
List<String> str1 = new ArrayList<String>();
str1.add("A");
str1.add("B");
str1.add("C");
str1.add("D");
List<String> str2 = new ArrayList<String>();
str2.add("D");
str2.add("E");
List<String> str3 = str1.stream()
.filter(item -> !str2.contains(item))
.collect(Collectors.toList());
str1 // ["A", "B", "C", "D"]
str2 // ["D", "E"]
str3 // ["A", "B", "C"]

When we want to get multiple elements from a List into a new list (filtering with a predicate) and remove them from the existing list, I could not find a proper answer anywhere.
Here is how we can do it using Java Streaming API partitioning.
Map<Boolean, List<ProducerDTO>> classifiedElements = producersProcedureActive
.stream()
.collect(Collectors.partitioningBy(producer -> producer.getPod().equals(pod)));
// get two new lists
List<ProducerDTO> matching = classifiedElements.get(true);
List<ProducerDTO> nonMatching = classifiedElements.get(false);
// OR get non-matching elements to the existing list
producersProcedureActive = classifiedElements.get(false);
This way you effectively remove the filtered elements from the original list and add them to a new list.
Refer to the 5.2. Collectors.partitioningBy section of this article.

As others have suggested, this might be a use case for loops and iterables. In my opinion, this is the simplest approach. If you want to modify the list in-place, it cannot be considered "real" functional programming anyway. But you could use Collectors.partitioningBy() in order to get a new list with elements which satisfy your condition, and a new list of those which don't. Of course with this approach, if you have multiple elements satisfying the condition, all of those will be in that list and not only the first.

The task is: get *and* remove an element from the list.
p.stream().collect(Collectors.collectingAndThen(
        Collector.of(
                ArrayDeque::new,
                (a, producer) -> {                 // accumulate matching producers
                    if (producer.getPod().equals(pod))
                        a.addLast(producer);
                },
                (a1, a2) -> a1,                    // combiner (not parallel-safe)
                rslt -> rslt.pollFirst()),         // finisher: first match, or null
        e -> {
            if (e != null)
                p.remove(e); // remove
            return e;        // get
        }));

resumoRemessaPorInstrucoes.removeIf(item ->
item.getTipoOcorrenciaRegistro() == TipoOcorrenciaRegistroRemessa.PEDIDO_PROTESTO.getNome() ||
item.getTipoOcorrenciaRegistro() == TipoOcorrenciaRegistroRemessa.SUSTAR_PROTESTO_BAIXAR_TITULO.getNome());

Combining my initial idea and your answers I reached what seems to be the solution
to my own question:
public ProducerDTO findAndRemove(String pod) {
    ProducerDTO p = null;
    try {
        p = IntStream.range(0, producersProcedureActive.size())
                .filter(i -> producersProcedureActive.get(i).getPod().equals(pod))
                .boxed()
                .findFirst()
                .map(i -> producersProcedureActive.remove((int) i))
                .get();
        logger.debug(p);
    } catch (NoSuchElementException e) {
        logger.error("No producer found with POD [" + pod + "]");
    }
    return p;
}
It removes the object using remove(int), which does not traverse the list again (as suggested by @Tunaki), and it returns the removed object to the caller.
I read your answers suggesting that I choose safe methods like ifPresent instead of get, but I could not find a way to use them in this scenario.
Are there any important drawbacks to this kind of solution?
Edit following @Holger's advice
This should be the function I needed
public ProducerDTO findAndRemove(String pod) {
    return IntStream.range(0, producersProcedureActive.size())
            .filter(i -> producersProcedureActive.get(i).getPod().equals(pod))
            .boxed()
            .findFirst()
            .map(i -> producersProcedureActive.remove((int) i))
            .orElseGet(() -> {
                logger.error("No producer found with POD [" + pod + "]");
                return null;
            });
}
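A hypothetical call site, where "POD-42" is just an example value:
ProducerDTO removed = findAndRemove("POD-42"); // returns null (and logs an error) if nothing matches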

A variation of the above:
import static java.util.function.Predicate.not;
final Optional<MyItem> myItem = originalCollection.stream().filter(myPredicate(someInfo)).findFirst();
final List<MyItem> myOtherItems = originalCollection.stream().filter(not(myPredicate(someInfo))).toList();
private Predicate<MyItem> myPredicate(Object someInfo) {
return myItem -> myItem.someField() == someInfo;
}

Related

obtaining unique number from a list of duplicate integers using java 8 streams

I'm trying to obtain a list of only the duplicated numbers from a list of integers:
final Set<Integer> setOfNmums = new HashSet<>();
Arrays.asList(5,6,7,7,7,6,2,4,2,4).stream()
.peek(integer -> System.out.println("XX -> " + integer))
.filter(n -> !setOfNmums.add(n))
.peek(System.out::println)
.map(String::valueOf)
.sorted()
.collect(Collectors.toList());
The output is 2, 4, 6, 7, 7.
Expected: 2, 4, 6, 7.
I don't understand how that's happening. Is this running in parallel? How am I getting two '7's?
The HashSet's add should return false if the element already exists, and that is what the filter uses, right?
Yes, I can use distinct, but I'm curious to know why the filter fails. Is it being done in parallel?
Your filter rejects the first occurrence of each element and accepts all subsequent occurrences. Therefore, when an element occurs n times, you’ll add it n-1 times.
Since you want to accept all elements which occur more than once, but only accept them a single time, you could use .filter(n -> !setOfNmums.add(n)).distinct(), or you can enhance the set to a map, to accept an element only on its second occurrence (both variants are shown below).
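The first variant might look roughly like this, a sketch reusing the setOfNmums set from the question (assumed to be freshly created):
List<String> result = Stream.of(5, 6, 7, 7, 7, 6, 2, 4, 2, 4)
        .filter(n -> !setOfNmums.add(n)) // keep the second and later occurrences
        .distinct()                      // collapse them to a single occurrence each
        .map(String::valueOf)
        .sorted()
        .collect(Collectors.toList());
The map-based variant, which accepts an element exactly on its second occurrence, looks like this: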
Map<Integer, Integer> occurrences = new HashMap<>();
List<String> result = Stream.of(5,6,7,7,7,6,2,4,2,4)
.filter(n -> occurrences.merge(n, 1, Integer::sum) == 2)
.map(String::valueOf)
.sorted()
.collect(Collectors.toList());
But generally, using stateful filters with streams is discouraged.
A cleaner solution would be
List<String> result = Stream.of(5,6,7,7,7,6,2,4,2,4)
.collect(Collectors.collectingAndThen(
Collectors.toMap(String::valueOf, x -> true, (a,b) -> false, TreeMap::new),
map -> { map.values().removeIf(b -> b); return new ArrayList<>(map.keySet()); }));
Note that this approach doesn't count the occurrences but only remembers whether an element is unique or has been seen at least a second time. This works by mapping each element to true with the second argument to the toMap collector, x -> true, and resolving multiple occurrences with a merge function of (a,b) -> false. The subsequent map.values().removeIf(b -> b) will remove all unique elements, i.e. those mapped to true.
You can use the .distinct() function in your stream; check this out.
Since Holger already explained why your solution didn't work, I'll just provide an alternative.
Why not use Collections.frequency(collection, element) together with distinct()?
The solution would be quite simple (I apologize for the formatting; I just copied it from my IDE and there doesn't seem to be an autoformat feature on Stack Overflow):
List<Integer> numbers = List.of(5, 6, 7, 7, 7, 6, 2, 4, 2, 4);
List<String> onlyDuplicates = numbers.stream()
.filter(n -> Collections.frequency(numbers, n) > 1)
.distinct()
.sorted()
.map(String::valueOf)
.toList();
This simply keeps all elements that occur more than once and then filters out the duplicates before sorting, converting each element to a string and collecting to a list since that seems to be what you want.
If you need a mutable list, you can use collect(toCollection(ArrayList::new)) instead of toList().

Create all possible combinations of elements

I need to create all possible combinations of some kind of Key, which is composed of X (in my case, 8) equally important elements. So I came up with code like this:
final LinkedList<Key> keys = new LinkedList<>();
firstElementCreator.getApplicableElements() // All creators return a Set of elements
        .forEach(first -> secondElementCreator.getApplicableElements()
                .forEach(second -> thirdElementCreator.getApplicableElements()
                        // ... more creators
                        .forEach(X -> keys.add(new Key(first, second, third, ..., X)))))))));
return keys;
and it's working, but there are X nested forEach calls, and I have a feeling that I'm missing an easier/better/more elegant solution. Any suggestions?
Thanks in advance!
Is it a Cartesian product? Many libraries provide an API for this, for example Sets and Lists in Guava:
List<ApplicableElements> elementsList = Lists.newArrayList(firstElementCreator, secondElementCreator...).stream()
.map(c -> c.getApplicableElements()).collect(toList());
List<Key> keys = Lists.cartesianProduct(elementsList).stream()
.map(l -> new Key(l.get(0), l.get(1), l.get(2), l.get(3), l.get(4), l.get(5), l.get(6), l.get(7))).collect(toList());
Since the number of input sets is fixed (it has to match the number of arguments in the Key constructor), your solution is actually not bad.
It's more efficient and easier to read without the lambdas, though, like:
for (Element first : firstElementCreator.getApplicableElements()) {
    for (Element second : secondElementCreator.getApplicableElements()) {
        for (Element third : thirdElementCreator.getApplicableElements()) {
            keys.add(new Key(first, second, third));
        }
    }
}
The canonical solution is to use flatMap. However, the tricky part is to create the Key object from the multiple input levels.
The straightforward approach is to do the evaluation in the innermost function, where every value is in scope:
final List<Key> keys = firstElementCreator.getApplicableElements().stream()
.flatMap(first -> secondElementCreator.getApplicableElements().stream()
.flatMap(second -> thirdElementCreator.getApplicableElements().stream()
// ... more creators
.map( X -> new Key( first, second, third, ..., X ) ) ) )
.collect(Collectors.toList());
but this soon becomes impractical with deep nesting
A solution without deep nesting requires elements to hold intermediate compound values. E.g. if we define Key as
class Key {
    String[] data;

    Key(String... arg) {
        data = arg;
    }

    public Key add(String next) {
        int pos = data.length;
        String[] newData = Arrays.copyOf(data, pos + 1);
        newData[pos] = next;
        return new Key(newData);
    }

    @Override
    public String toString() {
        return "Key(" + Arrays.toString(data) + ')';
    }
}
(assuming String as element type), we can use
final List<Key> keys =
firstElementCreator.getApplicableElements().stream().map(Key::new)
.flatMap(e -> secondElementCreator.getApplicableElements().stream().map(e::add))
.flatMap(e -> thirdElementCreator.getApplicableElements().stream().map(e::add))
// ... more creators
.collect(Collectors.toList());
Note that these flatMap steps are now on the same level, i.e. not nested anymore. Also, all these steps are identical, only differing in the actual creator, which leads to the general solution supporting an arbitrary number of Creator instances.
List<Key> keys = Stream.of(firstElementCreator, secondElementCreator, thirdElementCreator
/* , and, some, more, if you like */)
.map(creator -> (Function<Key,Stream<Key>>)
key -> creator.getApplicableElements().stream().map(key::add))
.reduce(Stream::of, (f1,f2) -> key -> f1.apply(key).flatMap(f2))
.apply(new Key())
.collect(Collectors.toList());
Here, every creator is mapping to the identical stream-producing function of the previous solution, then all are reduced to a single function combining each function with a flatMap step to the next one, and finally the resulting function is executed to get a stream, which is then collected to a List.

Java predicate - match against first predicate [duplicate]

I've just started playing with Java 8 lambdas and I'm trying to implement some of the things that I'm used to in functional languages.
For example, most functional languages have some kind of find function that operates on sequences or lists and returns the first element for which the predicate is true. The only way I can see to achieve this in Java 8 is:
lst.stream()
.filter(x -> x > 5)
.findFirst()
However this seems inefficient to me, as the filter will scan the whole list, at least to my understanding (which could be wrong). Is there a better way?
No, filter does not scan the whole stream. It's an intermediate operation, which returns a lazy stream (actually all intermediate operations return a lazy stream). To convince you, you can simply do the following test:
List<Integer> list = Arrays.asList(1, 10, 3, 7, 5);
int a = list.stream()
.peek(num -> System.out.println("will filter " + num))
.filter(x -> x > 5)
.findFirst()
.get();
System.out.println(a);
Which outputs:
will filter 1
will filter 10
10
You see that only the first two elements of the stream are actually processed.
So you can go with your approach which is perfectly fine.
However this seems inefficient to me, as the filter will scan the whole list
No it won't - it will "break" as soon as the first element satisfying the predicate is found. You can read more about laziness in the stream package javadoc, in particular (emphasis mine):
Many stream operations, such as filtering, mapping, or duplicate removal, can be implemented lazily, exposing opportunities for optimization. For example, "find the first String with three consecutive vowels" need not examine all the input strings. Stream operations are divided into intermediate (Stream-producing) operations and terminal (value- or side-effect-producing) operations. Intermediate operations are always lazy.
return dataSource.getParkingLots()
.stream()
.filter(parkingLot -> Objects.equals(parkingLot.getId(), id))
.findFirst()
.orElse(null);
I had to filter out only one object from a list of objects, so I used this; hope it helps.
In addition to Alexis C's answer, if you are working with an array list in which you are not sure whether the element you are searching for exists, use this.
Integer a = list.stream()
.peek(num -> System.out.println("will filter " + num))
.filter(x -> x > 5)
.findFirst()
.orElse(null);
Then you could simply check whether a is null.
Already answered by @AjaxLeung, but in comments and hard to find.
If you only need a check,
lst.stream()
.filter(x -> x > 5)
.findFirst()
.isPresent()
is simplified to
lst.stream()
.anyMatch(x -> x > 5)
import org.junit.Test;
import java.util.Arrays;
import java.util.List;
import java.util.Optional;
// Stream is ~30 times slower for same operation...
public class StreamPerfTest {
    int iterations = 100;
    List<Integer> list = Arrays.asList(1, 10, 3, 7, 5);

    // 55 ms
    @Test
    public void stream() {
        for (int i = 0; i < iterations; i++) {
            Optional<Integer> result = list.stream()
                    .filter(x -> x > 5)
                    .findFirst();
            System.out.println(result.orElse(null));
        }
    }

    // 2 ms
    @Test
    public void loop() {
        for (int i = 0; i < iterations; i++) {
            Integer result = null;
            for (Integer walk : list) {
                if (walk > 5) {
                    result = walk;
                    break;
                }
            }
            System.out.println(result);
        }
    }
}
A generic utility function with looping seems a lot cleaner to me:
static public <T> T find(List<T> elements, Predicate<T> p) {
    for (T item : elements) if (p.test(item)) return item;
    return null;
}

static public <T> T find(T[] elements, Predicate<T> p) {
    for (T item : elements) if (p.test(item)) return item;
    return null;
}
In use:
List<Integer> intList = Arrays.asList(1, 2, 3, 4, 5);
Integer[] intArr = new Integer[]{1, 2, 3, 4, 5};
System.out.println(find(intList, i -> i % 2 == 0)); // 2
System.out.println(find(intArr, i -> i % 2 != 0)); // 1
System.out.println(find(intList, i -> i > 5)); // null
Improved one-liner answer: if you are looking for a boolean return value, we can do it better by adding isPresent:
return dataSource.getParkingLots().stream().filter(parkingLot -> Objects.equals(parkingLot.getId(), id)).findFirst().isPresent();

How to use Java 8 streams to find all values preceding a larger value?

Use Case
Through some coding Katas posted at work, I stumbled on this problem that I'm not sure how to solve.
Using Java 8 Streams, given a list of positive integers, produce a
list of integers where the integer preceded a larger value.
[10, 1, 15, 30, 2, 6]
The above input would yield:
[1, 15, 2]
since 1 precedes 15, 15 precedes 30, and 2 precedes 6.
Non-Stream Solution
public List<Integer> findSmallPrecedingValues(final List<Integer> values) {
    List<Integer> result = new ArrayList<Integer>();
    for (int i = 0; i < values.size(); i++) {
        Integer next = (i + 1 < values.size() ? values.get(i + 1) : -1);
        Integer current = values.get(i);
        if (current < next) {
            result.add(current);
        }
    }
    return result;
}
What I've Tried
The problem I have is I can't figure out how to access next in the lambda.
return values.stream().filter(v -> v < next).collect(Collectors.toList());
Question
Is it possible to retrieve the next value in a stream?
Should I be using map and mapping to a Pair in order to access next?
Using IntStream.range:
static List<Integer> findSmallPrecedingValues(List<Integer> values) {
return IntStream.range(0, values.size() - 1)
.filter(i -> values.get(i) < values.get(i + 1))
.mapToObj(values::get)
.collect(Collectors.toList());
}
It's certainly nicer than an imperative solution with a large loop, but still a bit meh as far as the goal of "using a stream" in an idiomatic way.
Is it possible to retrieve the next value in a stream?
Nope, not really. The best cite I know of for that is in the java.util.stream package description:
The elements of a stream are only visited once during the life of a stream. Like an Iterator, a new stream must be generated to revisit the same elements of the source.
(Retrieving elements besides the current element being operated on would imply they could be visited more than once.)
We could also technically do it in a couple of other ways:
Statefully (very meh).
Using the stream's iterator, which is technically still using the stream (see the sketch below).
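A rough sketch of the iterator idea, assuming values is the input list from the question:
List<Integer> result = new ArrayList<>();
Iterator<Integer> it = values.stream().iterator();
if (it.hasNext()) {
    Integer previous = it.next();
    while (it.hasNext()) {
        Integer current = it.next();
        if (previous < current) {
            result.add(previous); // previous element is smaller than the one that follows it
        }
        previous = current;
    }
}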
That's not pure Java 8, but recently I've published a small library called StreamEx which has a method for exactly this task:
// Find all numbers where the integer preceded a larger value.
Collection<Integer> numbers = Arrays.asList(10, 1, 15, 30, 2, 6);
List<Integer> res = StreamEx.of(numbers).pairMap((a, b) -> a < b ? a : null)
.nonNull().toList();
assertEquals(Arrays.asList(1, 15, 2), res);
The pairMap operation is internally implemented using a custom spliterator. As a result you have quite clean code which does not depend on whether the source is a List or anything else. Of course it works fine with a parallel stream as well.
I committed a testcase for this task.
It's not a one-liner (it's a two-liner), but this works:
List<Integer> result = new ArrayList<>();
values.stream().reduce((a,b) -> {if (a < b) result.add(a); return b;});
Rather than solving it by "looking at the next element", this solves it by "looking at the previous element", which reduce() gives you for free. I have bent its intended usage by injecting a code fragment that populates the list based on a comparison of the previous and current elements, then returns the current element so the next iteration will see it as its previous element.
Some test code:
List<Integer> result = new ArrayList<>();
IntStream.of(10, 1, 15, 30, 2, 6).reduce((a,b) -> {if (a < b) result.add(a); return b;});
System.out.println(result);
Output:
[1, 15, 2]
The accepted answer works fine whether the stream is sequential or parallel, but it can suffer if the underlying List is not random-access, due to the repeated calls to get.
If your stream is sequential, you might roll this collector:
public static Collector<Integer, ?, List<Integer>> collectPrecedingValues() {
    int[] holder = {Integer.MAX_VALUE};
    return Collector.of(ArrayList::new,
            (l, elem) -> {
                if (holder[0] < elem) l.add(holder[0]);
                holder[0] = elem;
            },
            (l1, l2) -> {
                throw new UnsupportedOperationException("Don't run in parallel");
            });
}
and a usage:
List<Integer> precedingValues = list.stream().collect(collectPrecedingValues());
Nevertheless, you could also implement a collector that works for both sequential and parallel streams. The only thing is that you need to apply a final transformation, but here you have control over the List implementation, so you won't suffer from the get performance.
The idea is to first generate a list of pairs (each represented by an int[] array of size 2) containing the values in the stream sliced by a window of size two with a gap of one. When we need to merge two lists, we check for emptiness and merge the gap of the last element of the first list with the first element of the second list. Then we apply a final transformation to filter only the desired values and map them to produce the desired output.
It might not be as simple as the accepted answer, but it can be an alternative solution.
public static Collector<Integer, ?, List<Integer>> collectPrecedingValues() {
    return Collectors.collectingAndThen(
            Collector.of(() -> new ArrayList<int[]>(),
                    (l, elem) -> {
                        if (l.isEmpty()) l.add(new int[]{Integer.MAX_VALUE, elem});
                        else l.add(new int[]{l.get(l.size() - 1)[1], elem});
                    },
                    (l1, l2) -> {
                        if (l1.isEmpty()) return l2;
                        if (l2.isEmpty()) return l1;
                        l2.get(0)[0] = l1.get(l1.size() - 1)[1];
                        l1.addAll(l2);
                        return l1;
                    }),
            l -> l.stream().filter(arr -> arr[0] < arr[1]).map(arr -> arr[0]).collect(Collectors.toList()));
}
You can then wrap these two collectors in a utility method, check whether the stream is parallel with isParallel, and then decide which collector to return.
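One way to read that suggestion is a small helper that takes the stream itself, since isParallel is a method on Stream. This is only a sketch, and it assumes the parallel-capable collector above has been renamed collectPrecedingValuesParallel to avoid the name clash:
public static List<Integer> precedingValues(Stream<Integer> stream) {
    return stream.isParallel()
            ? stream.collect(collectPrecedingValuesParallel()) // the collectingAndThen-based collector above
            : stream.collect(collectPrecedingValues());        // the sequential-only collector above
}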
If you're willing to use a third party library and don't need parallelism, then jOOλ offers SQL-style window functions as follows
System.out.println(
Seq.of(10, 1, 15, 30, 2, 6)
.window()
.filter(w -> w.lead().isPresent() && w.value() < w.lead().get())
.map(w -> w.value())
.toList()
);
Yielding
[1, 15, 2]
The lead() function accesses the next value in traversal order from the window.
Disclaimer: I work for the company behind jOOλ
You can achieve that by using a bounded queue to store the elements which flow through the stream (this is based on the idea I described in detail here: Is it possible to get next element in the Stream?).
The example below first defines an instance of the BoundedQueue class which will store elements going through the stream (if you don't like the idea of extending LinkedList, refer to the link mentioned above for an alternative and more generic approach). Later you just examine the two subsequent elements, thanks to the helper class:
public class Kata {
    public static void main(String[] args) {
        List<Integer> input = new ArrayList<Integer>(asList(10, 1, 15, 30, 2, 6));

        class BoundedQueue<T> extends LinkedList<T> {
            public BoundedQueue<T> save(T curElem) {
                if (size() == 2) { // we need to know only two subsequent elements
                    pollLast();    // remove last to keep only the requested number of elements
                }
                offerFirst(curElem);
                return this;
            }

            public T getPrevious() {
                return (size() < 2) ? null : getLast();
            }

            public T getCurrent() {
                return (size() == 0) ? null : getFirst();
            }
        }

        BoundedQueue<Integer> streamHistory = new BoundedQueue<Integer>();
        final List<Integer> answer = input.stream()
                .map(i -> streamHistory.save(i))
                .filter(e -> e.getPrevious() != null)
                .filter(e -> e.getCurrent() > e.getPrevious())
                .map(e -> e.getPrevious())
                .collect(Collectors.toList());

        answer.forEach(System.out::println);
    }
}

Return empty element from Java 8 map operation

Using a Java 8 stream, what is the best way to map a List<Integer> when you have no output for an input Integer?
Simply return null? But then my output list size will be smaller than my input size...
List<Integer> input = Arrays.asList(0,1,2,3);
List<Integer> output = input.stream()
.map(i -> {
Integer out = crazyFunction(i);
if(out == null || out.equals(0))
return null;
return Optional.of(out);
})
.collect(Collectors.toList());
I don’t get why you (and all answers) make it so complicated. You have a mapping operation and a filtering operation. So the easiest way is to just apply these operation one after another. And unless your method already returns an Optional, there is no need to deal with Optional.
input.stream().map(i -> crazyFunction(i))
.filter(out -> out!=null && !out.equals(0))
.collect(Collectors.toList());
It may be simplified to
input.stream().map(context::crazyFunction)
.filter(out -> out!=null && !out.equals(0))
.collect(Collectors.toList());
But you seem to have a more theoretical question about what kind of List to generate, one with placeholders for absent values or one with a different size than the input list.
The simple answer is: don’t generate a list. A List is not an end in itself so you should consider for what kind of operation you need this list (or its contents) and apply the operation right as the terminal operation of the stream. Then you have your answer as the operation dictates whether absent values should be filtered out or represented by a special value (and what value that has to be).
It might be a different answer for different operations…
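For example (just a sketch, assuming the eventual goal were a sum of the results rather than a list), absent values simply disappear inside the terminal operation:
int sum = input.stream()
        .map(i -> crazyFunction(i))                   // crazyFunction as in the question
        .filter(out -> out != null && !out.equals(0))
        .mapToInt(Integer::intValue)
        .sum();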
Replace the map call with flatMap. The map operation produces one output value per input value, whereas the flatMap operation produces any number of output values per input value, including zero.
The most straightforward way is probably to replace the check like so:
List<Integer> output = input.stream()
.flatMap(i -> {
Integer out = crazyFunction(i);
if (out == null || out.equals(0))
return Stream.empty();
else
return Stream.of(out);
})
.collect(Collectors.toList());
A further refactoring could change crazyFunction to have it return an Optional (probably OptionalInt). If you call it from map, the result is a Stream<OptionalInt>. Then you need to flatMap that stream to remove the empty optionals:
List<Integer> output = input.stream()
.map(this::crazyFunctionReturningOptionalInt)
.flatMap(o -> o.isPresent() ? Stream.of(o.getAsInt()) : Stream.empty())
.collect(toList());
The result of the flatMap is a Stream<Integer> which boxes up the ints, but this is OK since you're going to send them into a List. If you weren't going to box the int values into a List, you could convert the Stream<OptionalInt> to an IntStream using the following:
flatMapToInt(o -> o.isPresent() ? IntStream.of(o.getAsInt()) : IntStream.empty())
For further discussion of dealing with streams of optionals, see this question and its answers.
Simpler variants of @Martin Magakian's answer:
List<Integer> input = Arrays.asList(0,1,2,3);
List<Optional<Integer>> output =
input.stream()
.map(i -> crazyFunction(i)) // you can also use a method reference here
.map(Optional::ofNullable) // returns empty optional
// if original value is null
.map(optional -> optional.filter(out -> !out.equals(0))) // return empty optional
// if captured value is zero
.collect(Collectors.toList())
;
List<Integer> outputClean =
output.stream()
.filter(Optional::isPresent)
.map(Optional::get)
.collect(Collectors.toList())
;
You can wrap the output into an Optional which may or may not contain a non-null value.
With an output: return Optional.of(out);
Without output: return Optional.<Integer>empty();
You have to wrap the value into an Optional so that the output list does not need to contain null values.
List<Integer> input = Arrays.asList(0,1,2,3);
List<Optional<Integer>> output = input.stream()
.map(i -> {
Integer out = crazyFunction(i);
if(out == null || out.equals(0))
return Optional.<Integer>empty();
return Optional.of(out);
})
.collect(Collectors.toList());
This will make sure input.size() == output.size().
Later on you can exclude the empty Optional using:
List<Integer> outputClean = output.stream()
.filter(Optional::isPresent)
.map(i -> {
return i.get();
})
.collect(Collectors.toList());
