RxJava: Split Rx Flowable into multiple streams

I would like to perform some operations on a stream, then split it into two streams and process them separately.
Example to show the problem:
Flowable<SuccessfulObject> stream = Flowable.fromArray(
new SuccessfulObject(true, 0),
new SuccessfulObject(false, 1),
new SuccessfulObject(true, 2));
stream = stream.doOnEach(System.out::println);
Flowable<SuccessfulObject> successful = stream.filter(SuccessfulObject::isSuccess);
Flowable<SuccessfulObject> failed = stream.filter(SuccessfulObject::isFail);
successful.doOnEach(successfulObject -> {/*handle success*/}).subscribe();
failed.doOnEach(successfulObject -> {/*handle fail*/}).subscribe();
Class:
class SuccessfulObject {
private boolean success;
private int id;
public SuccessfulObject(boolean success, int id) {
this.success = success;
this.id = id;
}
public boolean isSuccess() {
return success;
}
public boolean isFail() {
return !success;
}
public void setSuccess(boolean success) {
this.success = success;
}
@Override
public String toString() {
return "SuccessfulObject{" +
"id=" + id +
'}';
}
}
But this code prints every element twice, whereas I would like the operations before the split to run only once.
Output:
OnNextNotification[SuccessfulObject{id=0}]
OnNextNotification[SuccessfulObject{id=1}]
OnNextNotification[SuccessfulObject{id=2}]
OnCompleteNotification
OnNextNotification[SuccessfulObject{id=0}]
OnNextNotification[SuccessfulObject{id=1}]
OnNextNotification[SuccessfulObject{id=2}]
OnCompleteNotification
How can I process the stream to achieve this behaviour?

Use publish to share a subscription to the source:
Flowable<Integer> source = Flowable.range(1, 5);
ConnectableFlowable<Integer> cf = source.publish();
cf.filter(v -> v % 2 == 0).subscribe(v -> System.out.println("Even: " + v));
cf.filter(v -> v % 2 != 0).subscribe(v -> System.out.println("Odd: " + v));
cf.connect();
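Applied to the example above, a minimal sketch might look like the following (an assumption here is that there are exactly two downstream subscribers, so autoConnect(2) can trigger the single upstream subscription once both have subscribed):
Flowable<SuccessfulObject> shared = stream.doOnEach(System.out::println).publish().autoConnect(2);
shared.filter(SuccessfulObject::isSuccess).subscribe(obj -> {/* handle success */});
shared.filter(SuccessfulObject::isFail).subscribe(obj -> {/* handle fail */});
With this setup the doOnEach side effect runs once per element, and both filtered branches see the same upstream emissions.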

Related

Osmosis WayNode instances always return 0 from getLatitude and getLongitude

I am trying to use openstreetmap osmosis to read a pbf file of an airport and extract features like gates and runways.
I am using code similar to: http://www.javaoptimum.com/how-to-read-osm-pbf-files-programmatically-java/
When the code encounters a Node instance, it returns reasonable values from getLatitude and getLongitude...
However, when the code encounters a Way instance, the coordinates appear to be zero. Here is the code I am using:
Sink sinkImplementation = new Sink() {
public void process(EntityContainer entityContainer) {
Entity entity = entityContainer.getEntity();
entity.getTags().forEach((tag) -> {
if ("aeroway".equals(tag.getKey())) {
if (entity instanceof Node) {
if ("holding_position".equals(tag.getValue())) {
installPointHook(airportIcaoCode, entity, tag);
} else if ("gate".equals(tag.getValue())) {
installPointHook(airportIcaoCode, entity, tag);
} else {
LOGGER.info("Ignoring unrecognized tag value " + tag.getValue());
}
} else if (entity instanceof Way) {
Way way = (Way)entity;
if ("runway".equals(tag.getValue())) {
way.getWayNodes().forEach((it) -> System.out.println(it + " : " + it.getLatitude()+","+it.getLongitude()));
} else if ("taxiway".equals(tag.getValue())) {
way.getWayNodes().forEach((it) -> System.out.println(it + " : " + it.getLatitude()+","+it.getLongitude()));
} else if ("apron".equals(tag.getValue())) {
way.getWayNodes().forEach((it) -> System.out.println(it + " : " + it.getLatitude()+","+it.getLongitude()));
} else if ("hangar".equals(tag.getValue())) {
way.getWayNodes().forEach((it) -> System.out.println(it + " : " + it.getLatitude()+","+it.getLongitude()));
} else {
LOGGER.info("Ignoring unrecognized tag value " + tag.getValue());
}
} else if (entity instanceof Relation) {
LOGGER.info("Ignoring unrecognized tag value " + tag.getValue());
}
}
});
}
public void initialize(Map<String, Object> arg0) {
}
public void complete() {
}
@Override
public void close() {
}
};
Is there some other processing I need to do in order to get the coordinates for Ways?
It turns out that ways don't have coordinates themselves; instead they hold lists of WayNodes, which by default only carry the referenced node ids, so the coordinates have to be looked up from the corresponding Node entities:
public void process(EntityContainer entityContainer) {
Entity entity = entityContainer.getEntity();
entity.getTags().forEach((tag) -> {
if (tag.getKey().equals("aeroway") && tag.getValue().equals("runway")
&& entity instanceof Way) {
final List<WayNode> wayNodes = ((Way) entity).getWayNodes();
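// note: 'nodes' is assumed to be a Map<Long, Node> filled while reading (see the Sink below), and 'runways' an output list of the enclosing class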
Runway runway = new Runway(entity.getId(), nodes.get(wayNodes.get(0).getNodeId()),
nodes.get(wayNodes.get(wayNodes.size() - 1).getNodeId()));
runways.add(runway);
}
});
}
You could enhance the WayNodes with coordinates using the following snippets:
private static class MySink implements Sink {
public void process(EntityContainer entityContainer) {
if (entityContainer.getEntity() instanceof Node) {
Node node = (Node) entityContainer.getEntity();
nodes.put(node.getId(), node);
}
...
}
...
}
for (int i = 0; i < way.getWayNodes().size(); i++) {
WayNode wayNode = way.getWayNodes().get(i);
Node node = sink.nodes.get(wayNode.getNodeId());
way.getWayNodes().set(i, new WayNode(wayNode.getNodeId(), node.getLatitude(), node.getLongitude()));
}

Kafka Streams to topic

I need to calculate an average with Kafka Streams. The producer writes with Avro, so I need to deserialize with Avro; I receive a GenericRecord containing a JSON string that I then have to process.
I use a user-defined type as a helper:
private class Tuple {
public int occ;
public int sum;
public Tuple (int occ, int sum) {
this.occ = occ;
this.sum = sum;
}
public void sum (int toAdd) {
this.sum += toAdd;
this.occ ++;
}
public Double getAverage () {
return (double) this.sum / this.occ; // cast before dividing to avoid integer division
}
public String toString() {
return "occorrenze: " + this.occ + ", somma: " + sum + ", media -> " + getAverage();
}
}
Now the processing:
StreamsBuilder builder = new StreamsBuilder();
KStream<GenericRecord, GenericRecord> source =
builder.stream(topic);
KStream<GenericRecord, GenericRecord>[] branches = source.branch(
(key,value) -> partition(value.toString()),
(key, value) -> true
);
KGroupedStream <String, String> groupedStream = branches[0]
.mapValues(value -> createJson(value.toString()))
.map((key, value) -> KeyValue.pair(new String("T_DUR_CICLO"), value.getNumberValue("payload", "T_DUR_CICLO")))
.groupByKey( Serialized.with(
Serdes.String(), /* key (note: type was modified) */
Serdes.String())); /* value */
branches[0].foreach((key, value) -> System.out.println(key + " " + value));
KTable<String, Tuple> aggregatedStream = groupedStream.aggregate(
() -> new Tuple(0, 0), // initializer
(aggKey, newValue, aggValue) -> new Tuple (aggValue.occ + 1, aggValue.sum + Integer.parseInt(newValue)),
Materialized.as("aggregate-state-store").with(Serdes.String(), new MySerde()));
aggregatedStream
.toStream()
.foreach((key, value) -> System.out.println(key + ": " + value));
KStream<String, Double> average = aggregatedStream
.mapValues(v -> v.getAverage())
.toStream();
The problem occurs when I try to store the stream in a topic with:
average.to("average");
Here is the exception:
Exception in thread "streamtest-6d743b83-ce22-435e-aee5-76a745ce3571-StreamThread-1" org.apache.kafka.streams.errors.ProcessorStateException: task [1_0] Failed to flush state store KSTREAM-AGGREGATE-STATE-STORE-0000000007
at org.apache.kafka.streams.processor.internals.ProcessorStateManager.flush(ProcessorStateManager.java:242)
at org.apache.kafka.streams.processor.internals.AbstractTask.flushState(AbstractTask.java:202)
at org.apache.kafka.streams.processor.internals.StreamTask.flushState(StreamTask.java:420)
at org.apache.kafka.streams.processor.internals.StreamTask.commit(StreamTask.java:394)
at org.apache.kafka.streams.processor.internals.StreamTask.commit(StreamTask.java:382)
at org.apache.kafka.streams.processor.internals.AssignedTasks$1.apply(AssignedTasks.java:67)
at org.apache.kafka.streams.processor.internals.AssignedTasks.applyToRunningTasks(AssignedTasks.java:362)
at org.apache.kafka.streams.processor.internals.AssignedTasks.commit(AssignedTasks.java:352)
at org.apache.kafka.streams.processor.internals.TaskManager.commitAll(TaskManager.java:401)
at org.apache.kafka.streams.processor.internals.StreamThread.maybeCommit(StreamThread.java:1042)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:845)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:767)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:736)
Caused by: org.apache.kafka.streams.errors.StreamsException: A serializer (key: io.confluent.kafka.streams.serdes.avro.GenericAvroSerializer / value: io.confluent.kafka.streams.serdes.avro.GenericAvroSerializer) is not compatible to the actual key or value type (key type: java.lang.String / value type: java.lang.Double). Change the default Serdes in StreamConfig or provide correct Serdes via method parameters.
at org.apache.kafka.streams.processor.internals.SinkNode.process(SinkNode.java:94)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:143)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:126)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:90)
at org.apache.kafka.streams.kstream.internals.KStreamMapValues$KStreamMapProcessor.process(KStreamMapValues.java:41)
at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:50)
at org.apache.kafka.streams.processor.internals.ProcessorNode.runAndMeasureLatency(ProcessorNode.java:244)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:133)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:143)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:126)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:90)
at org.apache.kafka.streams.kstream.internals.KTableMapValues$KTableMapValuesProcessor.process(KTableMapValues.java:106)
at org.apache.kafka.streams.kstream.internals.KTableMapValues$KTableMapValuesProcessor.process(KTableMapValues.java:83)
at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:50)
at org.apache.kafka.streams.processor.internals.ProcessorNode.runAndMeasureLatency(ProcessorNode.java:244)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:133)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:143)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:90)
at org.apache.kafka.streams.kstream.internals.ForwardingCacheFlushListener.apply(ForwardingCacheFlushListener.java:42)
at org.apache.kafka.streams.state.internals.CachingKeyValueStore.putAndMaybeForward(CachingKeyValueStore.java:101)
at org.apache.kafka.streams.state.internals.CachingKeyValueStore.access$000(CachingKeyValueStore.java:38)
at org.apache.kafka.streams.state.internals.CachingKeyValueStore$1.apply(CachingKeyValueStore.java:83)
at org.apache.kafka.streams.state.internals.NamedCache.flush(NamedCache.java:141)
at org.apache.kafka.streams.state.internals.NamedCache.flush(NamedCache.java:99)
at org.apache.kafka.streams.state.internals.ThreadCache.flush(ThreadCache.java:125)
at org.apache.kafka.streams.state.internals.CachingKeyValueStore.flush(CachingKeyValueStore.java:123)
at org.apache.kafka.streams.state.internals.InnerMeteredKeyValueStore.flush(InnerMeteredKeyValueStore.java:284)
at org.apache.kafka.streams.state.internals.MeteredKeyValueBytesStore.flush(MeteredKeyValueBytesStore.java:149)
at org.apache.kafka.streams.processor.internals.ProcessorStateManager.flush(ProcessorStateManager.java:239)
... 12 more
Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.avro.generic.GenericRecord
at io.confluent.kafka.streams.serdes.avro.GenericAvroSerializer.serialize(GenericAvroSerializer.java:39)
at org.apache.kafka.streams.processor.internals.RecordCollectorImpl.send(RecordCollectorImpl.java:156)
at org.apache.kafka.streams.processor.internals.RecordCollectorImpl.send(RecordCollectorImpl.java:101)
at org.apache.kafka.streams.processor.internals.SinkNode.process(SinkNode.java:89)
... 41 more
----- EDIT ------
I added the classes for serialization and deserialization.
Serializer:
private class TupleSerializer implements Serializer<Tuple> {
@Override
public void configure(Map<String, ?> map, boolean bln) {
}
@Override
public byte[] serialize(String string, Tuple t) {
ByteBuffer buffer = ByteBuffer.allocate(4 + 4);
buffer.putInt(t.occ);
buffer.putInt(t.sum);
return buffer.array();
}
@Override
public void close() {
}
}
Deserializer:
private class TupleDeserializer implements Deserializer<Tuple> {
@Override
public void configure(Map<String, ?> map, boolean bln) {
}
@Override
public void close() {
}
@Override
public Tuple deserialize(String string, byte[] bytes) {
ByteBuffer buffer = ByteBuffer.wrap(bytes);
int occ = buffer.getInt();
int sum = buffer.getInt();
return new Tuple (occ, sum);
}
}
MySerde:
private class MySerde implements Serde<Tuple> {
@Override
public void configure(Map<String, ?> map, boolean bln) {
}
@Override
public void close() {
}
@Override
public Serializer<Tuple> serializer() {
return new TupleSerializer ();
}
@Override
public Deserializer<Tuple> deserializer() {
return new TupleDeserializer ();
}
}
You have to pass the Serdes to the .to() method to override the default serde types:
average.to("average", Produced.with(Serdes.String(), Serdes.Double()));
Please refer to the documentation for more details:
https://docs.confluent.io/current/streams/developer-guide/dsl-api.html#writing-streams-back-to-kafka
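Alternatively, if most of the topology uses the same types, the defaults can be configured once in the application properties. A sketch, assuming the standard Kafka Streams config keys (the Double-valued sink still needs its own Produced because it differs from the default):
Properties props = new Properties();
// assumed defaults for this application: String keys and String values
props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
// the average stream has Double values, so it still overrides the value serde at the sink
average.to("average", Produced.with(Serdes.String(), Serdes.Double()));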

CompletableFutures: processing of CompletableFuture chains in parallel

I am designing an asynchronous call with CompletableFutures. This is a batch call, where I need to process several entities at once. At the end of the call I have to collect information about the status of the processing of every item.
As input I have an array of ids of those entities. These are complex entities; I have to make several DAO calls in order to assemble each one into an object. Each DAO method returns a CompletableFuture<PartX>.
I am chaining those DAO calls because if one of the pieces does not exist I won't be able to construct the whole object. Here is what my snippet looks like:
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.stream.Collectors;
import com.google.common.collect.Lists;
public class CfChainsAllOfTest {
private DAO dao = new DAO();
public static void main(String[] args) {
CompletableFuture<Void> resultPrintingCf = new CfChainsAllOfTest().fetchAllInParallelAndCollect(Lists.newArrayList(1l, 2l, 3l)).thenAccept(results -> {
System.out.println("[" + Thread.currentThread().getName() + "]" + results);
});
resultPrintingCf.join();
}
private CompletableFuture<List<Item>> fetchAllInParallelAndCollect(List<Long> ids) {
List<CompletableFuture<Item>> cfs = Lists.newArrayList();
for (Long id : ids) {
// I want this to be an instant non-blocking operation
cfs.add(fetchSingle(id));
System.out.println("[" + Thread.currentThread().getName() + "]" + "After completable future was added to the list, id=" + id);
}
return waitAllOfAndCollect(cfs);
}
private CompletableFuture<Item> fetchSingle(Long id) {
return getPartCAndSetOnItem(new Item(id)).thenCompose(this::getPartBAndSetOnItem).thenCompose(this::getPartAAndSetOnItem);
}
public CompletableFuture<Item> getPartCAndSetOnItem(Item item) {
return dao.getPartC(item.getId()).thenCompose(partC -> {
CompletableFuture<Item> cf = new CompletableFuture<>();
item.setPartC(partC);
cf.complete(item);
return cf;
});
}
public CompletableFuture<Item> getPartBAndSetOnItem(Item item) {
return dao.getPartB(item.getId()).thenCompose(partB -> {
CompletableFuture<Item> cf = new CompletableFuture<>();
item.setPartB(partB);
cf.complete(item);
return cf;
});
}
public CompletableFuture<Item> getPartAAndSetOnItem(Item item) {
return dao.getPartA(item.getId()).thenCompose(partA -> {
CompletableFuture<Item> cf = new CompletableFuture<>();
item.setPartA(partA);
cf.complete(item);
return cf;
});
}
private static <T> CompletableFuture<List<T>> waitAllOfAndCollect(List<CompletableFuture<T>> futures) {
CompletableFuture<Void> allDoneFuture = CompletableFuture.allOf(futures.toArray(new CompletableFuture[futures.size()]));
return allDoneFuture.thenApply(v -> futures.stream().map(future -> future.join()).collect(Collectors.<T> toList()));
}
static class DAO {
public CompletableFuture<PartC> getPartC(Long id) {
return CompletableFuture.supplyAsync(() -> {
System.out.println("[" + Thread.currentThread().getName() + "]" + "Fetching Part C from database for id=" + id);
try {
Thread.sleep(5000);
} catch (InterruptedException e) {
}
System.out.println("[" + Thread.currentThread().getName() + "]" + "Part C fetched from db for id=" + id);
return new PartC();
});
}
public CompletableFuture<PartB> getPartB(Long id) {
return CompletableFuture.supplyAsync(() -> {
System.out.println("[" + Thread.currentThread().getName() + "]" + "Fetching Part B from database for id=" + id);
try {
Thread.sleep(5000);
} catch (InterruptedException e) {
}
System.out.println("[" + Thread.currentThread().getName() + "]" + "Part B fetched from db for id=" + id);
return new PartB();
});
}
public CompletableFuture<PartA> getPartA(Long id) {
return CompletableFuture.supplyAsync(() -> {
System.out.println("[" + Thread.currentThread().getName() + "]" + "Fetching Part A from database for id=" + id);
try {
Thread.sleep(5000);
} catch (InterruptedException e) {
}
System.out.println("[" + Thread.currentThread().getName() + "]" + "Part A fetched from db for id=" + id);
return new PartA();
});
}
}
static class Item {
private final Long id;
private PartA partA;
private PartB partB;
private PartC partC;
public Item(Long id) {
this.id = id;
}
public Long getId() {
return id;
}
public PartA getPartA() {
return partA;
}
public void setPartA(PartA partA) {
this.partA = partA;
}
public PartB getPartB() {
return partB;
}
public void setPartB(PartB partB) {
this.partB = partB;
}
public PartC getPartC() {
return partC;
}
public void setPartC(PartC partC) {
this.partC = partC;
}
@Override
public String toString() {
return "Item [id=" + id + ", partA=" + partA + ", partB=" + partB + ", partC=" + partC + "]";
}
}
static class PartA {
@Override
public String toString() {
return "Part A";
}
}
static class PartB {
@Override
public String toString() {
return "Part B";
}
}
static class PartC {
@Override
public String toString() {
return "Part C";
}
}
}
The problem is that the processing for each item is not really done in parallel because of the chaining. It looks like chaining CompletableFutures is a blocking call. I would expect the chain of CFs to return a CompletableFuture<Whole> immediately and only start computing the value after that.
That said, what would be the best way to achieve such behavior? Thanks.
The problem is with this method:
private CompletableFuture<Item> fetchSingle(Long id) {
return getPartCAndSetOnItem(new Item(id)).thenCompose(this::getPartBAndSetOnItem).thenCompose(this::getPartAAndSetOnItem);
}
Basically you are telling: get part C, then get part B, then get part A.
Instead, you should call the 3 methods and then merge the results, though the merge step hardly matters here because you just store each result on the Item (pay attention to the Java memory model, as your Item is not synchronized: this might not work properly for more complex examples).
So, basically:
private CompletableFuture<Item> fetchSingle(Long id) {
Item result = new Item(id);
CompletableFuture<?> c = getPartCAndSetOnItem(result);
CompletableFuture<?> b = getPartBAndSetOnItem(result);
CompletableFuture<?> a = getPartAAndSetOnItem(result);
return CompletableFuture.allOf(a, b, c).thenApply(__ -> result);
}
Of course, the drawback is that you perform all 3 calls even if one fails, but you cannot have your cake and eat it…
As a side note, your getPartXAndSetOnItem() methods can be simplified to
public CompletableFuture<Item> getPartXAndSetOnItem(Item item) {
return dao.getPartX(item.getId()).thenApply(partX -> {
item.setPartX(partX);
return item;
});
}
or, considering we don't care about the actual result type in fetchSingle():
public CompletableFuture<?> getPartXAndSetOnItem(Item item) {
return dao.getPartX(item.getId()).thenAccept(item::setPartX);
}
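Putting the two suggestions together, fetchSingle() could look like this (a sketch equivalent to the allOf() version above, just using thenAccept to store each part as it arrives):
private CompletableFuture<Item> fetchSingle(Long id) {
    Item item = new Item(id);
    CompletableFuture<Void> c = dao.getPartC(id).thenAccept(item::setPartC);
    CompletableFuture<Void> b = dao.getPartB(id).thenAccept(item::setPartB);
    CompletableFuture<Void> a = dao.getPartA(id).thenAccept(item::setPartA);
    // the three DAO calls now run concurrently; the result completes when all of them have finished
    return CompletableFuture.allOf(a, b, c).thenApply(ignored -> item);
}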

couchbase connect closed when upsert

I have two methods that upsert into Couchbase. I wrote two JUnit tests with Spring Boot Test. After one test completes, the other test throws an exception saying the Couchbase connection is closed. How can I resolve this?
Here are the two upsert methods (I also don't know which one is better):
public List<RawJsonDocument> upsert2(String generatorId, String idPrefix, List<String> contents)
{
List<RawJsonDocument> rjd = new ArrayList<RawJsonDocument>(contents.size());
Observable.from(contents).flatMap(new Func1<String,Observable<String>>(){
@Override
public Observable<String> call(String t)
{
return bucket.async().counter(generatorId, 1)
.map(jsonLongDocument -> {
String idStr = idPrefix + generatorId + jsonLongDocument.content();
String jsonStr = idStr + "=" + t;
return jsonStr;
});
}}).subscribe(new Action1<String>() {
@Override
public void call(String t)
{
String[] s = t.split("[=]");
LOGGER.debug("\n methord2 generatorId:" + s[0] + "\n content:" + s[1]);
bucket.async().upsert(RawJsonDocument.create(s[0],s[1]));
}});
return rjd;
}
public List<RawJsonDocument> upsert1(String generatorId, String idPrefix, List<String> contents)
{
if(contents == null)
{
return null;
}
List<RawJsonDocument> rjd = new ArrayList<RawJsonDocument>(contents.size());
Observable.from(contents).flatMap(new Func1<String,Observable<RawJsonDocument>>(){
@Override
public Observable<RawJsonDocument> call(String t)
{
return bucket.async().counter(generatorId, 1)
.map(jsonLongDocument -> {
String idStr = idPrefix + generatorId + jsonLongDocument.content();
LOGGER.debug("\n method3 generatorId:" + idStr + "\n content:" + t);
return RawJsonDocument.create(idStr,t);
});
}}).subscribe(new Action1<RawJsonDocument>() {
@Override
public void call(RawJsonDocument t)
{
rjd.add(bucket.async().upsert(t).toBlocking().single());
}});
return rjd;
}
This is my JUnit test:
@Test
public void testIncrementIds3()
{
assertThat(generatorId.upsert2("counter", "idprefix", Arrays.asList("aabbccdd","ffddeeaa")).size(),is(2));
assertThat(generatorId.upsert1("counter", "idprefix", Arrays.asList("aabbccdd","ffddeeaa")).size(),is(2));
}

Sample request JSON using RxJava and RxApacheHTTP not working

I'm trying to retrieve a JSON object from a request using RxJava!
In my example I have a RESTful Java service that works perfectly in the browser.
//Object Data
@XmlRootElement
public class Person {
private String firstName;
private String lastName;
//getters and setters
}
RESTful Java resource:
@Path("/persons")
public class PersonResource {
@GET
@Produces(MediaType.APPLICATION_JSON)
public List<Person> getBandas() {
Person paulo = new Person();
paulo.setFirstName("Paulo Henrique");
paulo.setLastName("Pereira Santana");
//
List<Person> persons = new ArrayList<>();
persons.add(paulo);
return persons;
}
}
As a result (in the browser: http://localhost:8080/learn-java-rs/persons) I get a JSON object:
{"Person": {
"firstName": "Paulo Henrique",
"lastName": "Pereira Santana"
}
}
I tried to make the same request using RxJava, but it did not work (or I did not understand the implementation). Here is the code:
public class Example {
public static void main(String[] args) {
// TODO Auto-generated method stub
try(CloseableHttpAsyncClient client = HttpAsyncClients.createDefault()) {
client.start();
Observable<Map> requestJson = requestJson(client, "http://localhost:8080/learn-java-rs/persons");
Helpers.subscribePrint(requestJson.map(json -> json.get("firstName") + " " + json.get("lastName")), "person");
} catch (IOException e1) {
e1.printStackTrace();
}
}
private static Map<String, Set<Map<String, Object>>> cache = new ConcurrentHashMap<>();
@SuppressWarnings({"rawtypes","unchecked"})
private static Observable<Map> requestJson(HttpAsyncClient client,String url){
Observable<String> rawResponse = ObservableHttp.createGet(url, client)
        .toObservable()
        .flatMap(resp -> resp.getContent()
                .map(bytes -> new String(bytes, java.nio.charset.StandardCharsets.UTF_8)))
        .retry(5)
        .cast(String.class)
        .map(String::trim)
        .doOnNext(resp -> getCache(url).clear());
Observable<String> objects = rawResponse
        .filter(data -> data.startsWith("{"))
        .map(data -> "[" + data + "]");
Observable<String> arrays = rawResponse.filter(data -> data.startsWith("["));
Observable<Map> response = arrays.ambWith(objects)
        .map(data -> new Gson().fromJson(data, List.class))
        .flatMapIterable(list -> list)
        .cast(Map.class)
        .doOnNext(json -> getCache(url).add((Map<String, Object>) json));
return Observable.amb(fromCache(url),response);
}
private static Observable<Map<String, Object>> fromCache(String url) {
return Observable.from(getCache(url)).defaultIfEmpty(null)
.flatMap(json -> (json == null) ? Observable.never() : Observable.just(json))
.doOnNext(json -> json.put("person", true));
}
private static Set<Map<String, Object>> getCache(String url) {
if (!cache.containsKey(url)) {
cache.put(url, new HashSet<Map<String,Object>>());
}
return cache.get(url);
}
Edit
public static <T> Subscription subscribePrint(Observable<T> observable,
String name) {
return observable.subscribe(
(v) -> System.out.println(Thread.currentThread().getName()
+ "|" + name + " : " + v), (e) -> {
System.err.println("Error from " + name + ":");
System.err.println(e);
System.err.println(Arrays
.stream(e.getStackTrace())
.limit(5L)
.map(stackEl -> " " + stackEl)
.collect(Collectors.joining("\n"))
);
}, () -> System.out.println(name + " ended!"));
}
When I run it, nothing happens.
Could someone tell me what I'm missing?
Note: I used RxJava 1.1.0 and rxapache-http 0.21.0.
