I am designing an asynchronous call with CompletableFutures. This is a batch call, where I need to process several entities at once. At the end of the call I have to collect the processing status of every item.
As input I have an array of ids of those entities. Each entity is complex: I have to make several DAO calls in order to assemble it into an object. Each DAO method returns a CompletableFuture<PartX>.
I am chaining those DAO calls because if one of the pieces does not exist I won't be able to construct the whole object. Here is what my snippet looks like:
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.stream.Collectors;
import com.google.common.collect.Lists;
public class CfChainsAllOfTest {
private DAO dao = new DAO();
public static void main(String[] args) {
CompletableFuture<Void> resultPrintingCf = new CfChainsAllOfTest().fetchAllInParallelAndCollect(Lists.newArrayList(1L, 2L, 3L)).thenAccept(results -> {
System.out.println("[" + Thread.currentThread().getName() + "]" + results);
});
resultPrintingCf.join();
}
private CompletableFuture<List<Item>> fetchAllInParallelAndCollect(List<Long> ids) {
List<CompletableFuture<Item>> cfs = Lists.newArrayList();
for (Long id : ids) {
// I want this to be an instant non-blocking operation
cfs.add(fetchSingle(id));
System.out.println("[" + Thread.currentThread().getName() + "]" + "After completable future was added to the list, id=" + id);
}
return waitAllOfAndCollect(cfs);
}
private CompletableFuture<Item> fetchSingle(Long id) {
return getPartCAndSetOnItem(new Item(id)).thenCompose(this::getPartBAndSetOnItem).thenCompose(this::getPartAAndSetOnItem);
}
public CompletableFuture<Item> getPartCAndSetOnItem(Item item) {
return dao.getPartC(item.getId()).thenCompose(partC -> {
CompletableFuture<Item> cf = new CompletableFuture<>();
item.setPartC(partC);
cf.complete(item);
return cf;
});
}
public CompletableFuture<Item> getPartBAndSetOnItem(Item item) {
return dao.getPartB(item.getId()).thenCompose(partB -> {
CompletableFuture<Item> cf = new CompletableFuture<>();
item.setPartB(partB);
cf.complete(item);
return cf;
});
}
public CompletableFuture<Item> getPartAAndSetOnItem(Item item) {
return dao.getPartA(item.getId()).thenCompose(partA -> {
CompletableFuture<Item> cf = new CompletableFuture<>();
item.setPartA(partA);
cf.complete(item);
return cf;
});
}
private static <T> CompletableFuture<List<T>> waitAllOfAndCollect(List<CompletableFuture<T>> futures) {
CompletableFuture<Void> allDoneFuture = CompletableFuture.allOf(futures.toArray(new CompletableFuture[futures.size()]));
return allDoneFuture.thenApply(v -> futures.stream().map(future -> future.join()).collect(Collectors.<T> toList()));
}
static class DAO {
public CompletableFuture<PartC> getPartC(Long id) {
return CompletableFuture.supplyAsync(() -> {
System.out.println("[" + Thread.currentThread().getName() + "]" + "Fetching Part C from database for id=" + id);
try {
Thread.sleep(5000);
} catch (InterruptedException e) {
}
System.out.println("[" + Thread.currentThread().getName() + "]" + "Part C fetched from db for id=" + id);
return new PartC();
});
}
public CompletableFuture<PartB> getPartB(Long id) {
return CompletableFuture.supplyAsync(() -> {
System.out.println("[" + Thread.currentThread().getName() + "]" + "Fetching Part B from database for id=" + id);
try {
Thread.sleep(5000);
} catch (InterruptedException e) {
}
System.out.println("[" + Thread.currentThread().getName() + "]" + "Part B fetched from db for id=" + id);
return new PartB();
});
}
public CompletableFuture<PartA> getPartA(Long id) {
return CompletableFuture.supplyAsync(() -> {
System.out.println("[" + Thread.currentThread().getName() + "]" + "Fetching Part A from database for id=" + id);
try {
Thread.sleep(5000);
} catch (InterruptedException e) {
}
System.out.println("[" + Thread.currentThread().getName() + "]" + "Part A fetched from db for id=" + id);
return new PartA();
});
}
}
static class Item {
private final Long id;
private PartA partA;
private PartB partB;
private PartC partC;
public Item(Long id) {
this.id = id;
}
public Long getId() {
return id;
}
public PartA getPartA() {
return partA;
}
public void setPartA(PartA partA) {
this.partA = partA;
}
public PartB getPartB() {
return partB;
}
public void setPartB(PartB partB) {
this.partB = partB;
}
public PartC getPartC() {
return partC;
}
public void setPartC(PartC partC) {
this.partC = partC;
}
@Override
public String toString() {
return "Item [id=" + id + ", partA=" + partA + ", partB=" + partB + ", partC=" + partC + "]";
}
}
static class PartA {
@Override
public String toString() {
return "Part A";
}
}
static class PartB {
@Override
public String toString() {
return "Part B";
}
}
static class PartC {
@Override
public String toString() {
return "Part C";
}
}
}
The problem is that the processing for each item is not really done in parallel because of the chaining. It looks as though chaining CompletableFutures is a blocking call. I would expect the chain of CFs to return a CompletableFuture<Whole> immediately and only start computing the value after that.
That said, what would be the best way to achieve such behavior? Thanks.
The problem is with this method:
private CompletableFuture<Item> fetchSingle(Long id) {
return getPartCAndSetOnItem(new Item(id)).thenCompose(this::getPartBAndSetOnItem).thenCompose(this::getPartAAndSetOnItem);
}
Basically you are saying: get part C, then get part B, then get part A.
Instead, you should start the 3 calls and then merge the results – though the merge might not be necessary here, since you just store each result on the Item (pay attention to the Java memory model here, as your Item is not synchronized: it might not work properly for more complex examples).
So, basically:
private CompletableFuture<Item> fetchSingle(Long id) {
Item result = new Item(id);
CompletableFuture<?> c = getPartCAndSetOnItem(result);
CompletableFuture<?> b = getPartBAndSetOnItem(result);
CompletableFuture<?> a = getPartAAndSetOnItem(result);
return CompletableFuture.allOf(a, b, c).thenApply(__ -> result);
}
Of course, the drawback is that you perform all 3 calls even if one fails, but you cannot have your cake and eat it…
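For reference, the "merge the results" variant mentioned above could look like the following sketch (my own illustration, not from the original code; AbstractMap.SimpleEntry is just a convenient pair holder). Its advantage is that no future mutates a shared, unsynchronized Item:
private CompletableFuture<Item> fetchSingleByMerging(Long id) {
    CompletableFuture<PartA> a = dao.getPartA(id);
    CompletableFuture<PartB> b = dao.getPartB(id);
    CompletableFuture<PartC> c = dao.getPartC(id);
    // thenCombine pairs the results as they complete; the Item is built once, at the end
    return a.thenCombine(b, java.util.AbstractMap.SimpleEntry::new)
            .thenCombine(c, (ab, partC) -> {
                Item item = new Item(id);
                item.setPartA(ab.getKey());
                item.setPartB(ab.getValue());
                item.setPartC(partC);
                return item;
            });
}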
As a side note, your getPartXAndSetOnItem() methods can be simplified to
public CompletableFuture<Item> getPartXAndSetOnItem(Item item) {
return dao.getPartX(item.getId()).thenApply(partX -> {
item.setPartX(partX);
return item;
});
}
or, considering we don't care about the actual result type in fetchSingle():
public CompletableFuture<?> getPartXAndSetOnItem(Item item) {
return dao.getPartX(item.getId()).thenAccept(item::setPartX);
}
(Note thenAccept rather than thenRun: thenRun takes a Runnable, so it would not compile with the one-argument item::setPartX.)
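And since the original goal was to collect the processing status of every item at the end of the batch, here is a minimal sketch of that collection step (my own addition; the plain status strings are placeholders for whatever report type you actually need). handle(...) converts a failure into a value, so one failed item no longer fails the whole batch:
private CompletableFuture<List<String>> fetchAllAndReportStatus(List<Long> ids) {
    List<CompletableFuture<String>> statusCfs = ids.stream()
            // handle(...) maps both outcomes to a value, so allOf(...) below never completes exceptionally
            .map(id -> fetchSingle(id).handle((item, ex) -> ex == null
                    ? "id=" + id + ": OK"
                    : "id=" + id + ": FAILED (" + ex.getMessage() + ")"))
            .collect(Collectors.toList());
    return CompletableFuture.allOf(statusCfs.toArray(new CompletableFuture[0]))
            .thenApply(v -> statusCfs.stream()
                    .map(CompletableFuture::join)
                    .collect(Collectors.toList()));
}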
Related
I have two methods that upsert into Couchbase, and I wrote two JUnit tests with @SpringBootTest. After one JUnit test completes, the other test throws this exception. How can I resolve this?
There are two upsert methods; I don't know which one is better:
public List<RawJsonDocument> upsert2(String generatorId, String idPrefix, List<String> contents)
{
List<RawJsonDocument> rjd = new ArrayList<RawJsonDocument>(contents.size());
Observable.from(contents).flatMap(new Func1<String,Observable<String>>(){
@Override
public Observable<String> call(String t)
{
return bucket.async().counter(generatorId, 1)
.map(jsonLongDocument -> {
String idStr = idPrefix + generatorId + jsonLongDocument.content();
String jsonStr = idStr + "=" + t;
return jsonStr;
});
}}).subscribe(new Action1<String>() {
@Override
public void call(String t)
{
String[] s = t.split("[=]");
LOGGER.debug("\n methord2 generatorId:" + s[0] + "\n content:" + s[1]);
bucket.async().upsert(RawJsonDocument.create(s[0],s[1]));
}});
return rjd;
}
public List<RawJsonDocument> upsert1(String generatorId, String idPrefix, List<String> contents)
{
if(contents == null)
{
return null;
}
List<RawJsonDocument> rjd = new ArrayList<RawJsonDocument>(contents.size());
Observable.from(contents).flatMap(new Func1<String,Observable<RawJsonDocument>>(){
@Override
public Observable<RawJsonDocument> call(String t)
{
return bucket.async().counter(generatorId, 1)
.map(jsonLongDocument -> {
String idStr = idPrefix + generatorId + jsonLongDocument.content();
LOGGER.debug("\n method3 generatorId:" + idStr + "\n content:" + t);
return RawJsonDocument.create(idStr,t);
});
}}).subscribe(new Action1<RawJsonDocument>() {
@Override
public void call(RawJsonDocument t)
{
rjd.add(bucket.async().upsert(t).toBlocking().single());
}});
return rjd;
}
This is my JUnit test:
@Test
public void testIncrementIds3()
{
assertThat(generatorId.upsert2("counter", "idprefix", Arrays.asList("aabbccdd","ffddeeaa")).size(),is(2));
assertThat(generatorId.upsert1("counter", "idprefix", Arrays.asList("aabbccdd","ffddeeaa")).size(),is(2));
}
I would like to perform some operations on a stream, then split it into two streams, and then process them separately.
Example showing the problem:
Flowable<SuccessfulObject> stream = Flowable.fromArray(
new SuccessfulObject(true, 0),
new SuccessfulObject(false, 1),
new SuccessfulObject(true, 2));
stream = stream.doOnEach(System.out::println);
Flowable<SuccessfulObject> successful = stream.filter(SuccessfulObject::isSuccess);
Flowable<SuccessfulObject> failed = stream.filter(SuccessfulObject::isFail);
successful.doOnEach(successfulObject -> {/*handle success*/}).subscribe();
failed.doOnEach(successfulObject -> {/*handle fail*/}).subscribe();
Class:
class SuccessfulObject {
private boolean success;
private int id;
public SuccessfulObject(boolean success, int id) {
this.success = success;
this.id = id;
}
public boolean isSuccess() {
return success;
}
public boolean isFail() {
return !success;
}
public void setSuccess(boolean success) {
this.success = success;
}
@Override
public String toString() {
return "SuccessfulObject{" +
"id=" + id +
'}';
}
}
But this code prints all elements twice, whereas I would like the operations before the split to be performed only once.
Output:
OnNextNotification[SuccessfulObject{id=0}]
OnNextNotification[SuccessfulObject{id=1}]
OnNextNotification[SuccessfulObject{id=2}]
OnCompleteNotification
OnNextNotification[SuccessfulObject{id=0}]
OnNextNotification[SuccessfulObject{id=1}]
OnNextNotification[SuccessfulObject{id=2}]
OnCompleteNotification
How can I process the stream to achieve this behaviour?
Use publish to share a subscription to the source:
Flowable<Integer> source = Flowable.range(1, 5);
ConnectableFlowable<Integer> cf = source.publish();
cf.filter(v -> v % 2 == 0).subscribe(v -> System.out.println("Even: " + v));
cf.filter(v -> v % 2 != 0).subscribe(v -> System.out.println("Odd: " + v));
cf.connect();
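Applied to the SuccessfulObject example from the question, a sketch with the same names could look like:
ConnectableFlowable<SuccessfulObject> shared = stream
        .doOnEach(System.out::println) // now runs only once per element
        .publish();
shared.filter(SuccessfulObject::isSuccess).subscribe(obj -> {/* handle success */});
shared.filter(SuccessfulObject::isFail).subscribe(obj -> {/* handle fail */});
shared.connect(); // subscribes to the upstream exactly once, after both consumers are attached
If you would rather not call connect() manually, there is also a publish(Function) overload that shares a single subscription for the duration of the provided selector function.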
I'm trying to retrieve a JSON object from a request using RxJava!
In my example I have a Java RESTful service that works perfectly in the browser.
//Object Data
@XmlRootElement
public class Person {
private String firstName;
private String lastName;
//getters and setters
}
RESTful Java resource:
#Path("/persons")
public class PersonResource {
@GET
@Produces(MediaType.APPLICATION_JSON)
public List<Person> getBandas() {
Person paulo = new Person();
paulo.setFirstName("Paulo Henrique");
paulo.setLastName("Pereira Santana");
//
List<Person> persons = new ArrayList<>();
persons.add(paulo);
return persons;
}
}
As a result (in the browser: http://localhost:8080/learn-java-rs/persons) I have a JSON object:
{"Person": {
"firstName": "Paulo Henrique",
"lastName": "Pereira Santana"
}
}
I tried to make the same request using RxJava but it did not work (or I did not understand the implementation). Here is what I have:
public class Example {
public static void main(String[] args) {
// TODO Auto-generated method stub
try(CloseableHttpAsyncClient client = HttpAsyncClients.createDefault()) {
client.start();
Observable<Map> requestJson = requestJson(client, "http://localhost:8080/learn-java-rs/persons");
Helpers.subscribePrint(requestJson.map(json -> json.get("firstName") + " " + json.get("lastName")), "person");
} catch (IOException e1) {
e1.printStackTrace();
}
}
private static Map<String, Set<Map<String, Object>>> cache = new ConcurrentHashMap<>();
@SuppressWarnings({"rawtypes","unchecked"})
private static Observable<Map> requestJson(HttpAsyncClient client,String url){
Observable<String> rawResponse = ObservableHttp.createGet(url, client).toObservable()
    .flatMap(resp -> resp.getContent().map(bytes -> new String(bytes, java.nio.charset.StandardCharsets.UTF_8)))
    .retry(5)
    .cast(String.class)
    .map(String::trim)
    .doOnNext(resp -> getCache(url).clear());
Observable<String> objects = rawResponse.filter(data -> data.startsWith("{")).map(data -> "[" + data + "]");
Observable<String> arrays = rawResponse.filter(data -> data.startsWith("["));
Observable<Map> response = arrays.ambWith(objects)
    .map(data -> new Gson().fromJson(data, List.class))
    .flatMapIterable(list -> list)
    .cast(Map.class)
    .doOnNext(json -> getCache(url).add((Map<String, Object>) json));
return Observable.amb(fromCache(url),response);
}
private static Observable<Map<String, Object>> fromCache(String url) {
return Observable.from(getCache(url)).defaultIfEmpty(null)
.flatMap(json -> (json == null) ? Observable.never() : Observable.just(json))
.doOnNext(json -> json.put("person", true));
}
private static Set<Map<String, Object>> getCache(String url) {
if (!cache.containsKey(url)) {
cache.put(url, new HashSet<Map<String,Object>>());
}
return cache.get(url);
}
Edit
public static <T> Subscription subscribePrint(Observable<T> observable,
String name) {
return observable.subscribe(
(v) -> System.out.println(Thread.currentThread().getName()
+ "|" + name + " : " + v), (e) -> {
System.err.println("Error from " + name + ":");
System.err.println(e);
System.err.println(Arrays
.stream(e.getStackTrace())
.limit(5L)
.map(stackEl -> " " + stackEl)
.collect(Collectors.joining("\n"))
);
}, () -> System.out.println(name + " ended!"));
}
When I run it, nothing happens. Could someone tell me what I'm missing?
Note: I used RxJava 1.1.0 and rxapache-http 0.21.0.
It so happens that I need to support, in Java, JSON data coming from external data sources. There is one common pattern: an array containing a fixed number of elements of certain different types. We call it a tuple. Here is my example of deserialization for a 3-element tuple, with particular expected element types, using FasterXML Jackson:
public class TupleTest {
public static void main(String[] args) throws Exception {
String person = "{\"name\":\"qqq\",\"age\":35,\"address\":\"nowhere\",\"phone\":\"(555)555-5555\",\"email\":\"super@server.com\"}";
String jsonText = "[[" + person + ",[" + person + "," + person + "],{\"index1\":" + person + ",\"index2\":" + person + "}]]";
ObjectMapper om = new ObjectMapper().registerModule(new TupleModule());
List<FixedTuple3> data = om.readValue(jsonText, new TypeReference<List<FixedTuple3>>() {});
System.out.println("Deserialization result: " + data);
System.out.println("Serialization result: " + om.writeValueAsString(data));
}
}
class Person {
public String name;
public Integer age;
public String address;
public String phone;
public String email;
@Override
public String toString() {
return "Person{name=" + name + ", age=" + age + ", address=" + address
+ ", phone=" + phone + ", email=" + email + "}";
}
}
class FixedTuple3 {
public Person e1;
public List<Person> e2;
public Map<String, Person> e3;
@Override
public String toString() {
return "Tuple[" + e1 + ", " + e2 + ", " + e3 + "]";
}
}
class TupleModule extends SimpleModule {
public TupleModule() {
super(TupleModule.class.getSimpleName(), new Version(1, 0, 0, null, null, null));
setSerializers(new SimpleSerializers() {
@Override
public JsonSerializer<?> findSerializer(SerializationConfig config,
JavaType type, BeanDescription beanDesc) {
if (isTuple(type.getRawClass()))
return new TupleSerializer();
return super.findSerializer(config, type, beanDesc);
}
});
setDeserializers(new SimpleDeserializers() {
@Override
public JsonDeserializer<?> findBeanDeserializer(JavaType type,
DeserializationConfig config, BeanDescription beanDesc) throws JsonMappingException {
Class<?> rawClass = type.getRawClass();
if (isTuple(rawClass))
return new TupleDeserializer(rawClass);
return super.findBeanDeserializer(type, config, beanDesc);
}
});
}
private boolean isTuple(Class<?> rawClass) {
return rawClass.equals(FixedTuple3.class);
}
public static class TupleSerializer extends JsonSerializer<Object> {
public void serialize(Object value, JsonGenerator jgen, SerializerProvider provider) throws IOException, JsonProcessingException {
try {
jgen.writeStartArray();
for (int i = 0; i < 3; i++) {
Field f = value.getClass().getField("e" + (i + 1));
Object res = f.get(value);
jgen.getCodec().writeValue(jgen, res);
}
jgen.writeEndArray();
} catch (Exception ex) {
throw new IllegalStateException(ex);
}
}
}
public static class TupleDeserializer extends JsonDeserializer<Object> {
private Class<?> retClass;
public TupleDeserializer(Class<?> retClass) {
this.retClass = retClass;
}
public Object deserialize(JsonParser p, DeserializationContext ctx) throws IOException, JsonProcessingException {
try {
Object res = retClass.newInstance();
if (!p.isExpectedStartArrayToken()) {
throw new JsonMappingException("Tuple array is expected but found " + p.getCurrentToken());
}
JsonToken t = p.nextToken();
for (int i = 0; i < 3; i++) {
final Field f = res.getClass().getField("e" + (i + 1));
TypeReference<?> tr = new TypeReference<Object>() {
@Override
public Type getType() {
return f.getGenericType();
}
};
Object val = p.getCodec().readValue(p, tr);
f.set(res, val);
}
t = p.nextToken();
if (t != JsonToken.END_ARRAY)
throw new IOException("Unexpected ending token in tuple deserializer: " + t.name());
return res;
} catch (IOException ex) {
throw ex;
} catch (Exception ex) {
throw new IllegalStateException(ex);
}
}
}
}
But this approach means I have to create a new class every time I face a new type configuration in a tuple of a certain size. So I wonder whether there is any way to define a deserializer that deals with generic typing, so that it would be enough to have one tuple class per tuple size. For instance, my generic tuple of size 3 could be defined like:
class Tuple3 <T1, T2, T3> {
public T1 e1;
public T2 e2;
public T3 e3;
@Override
public String toString() {
return "Tuple[" + e1 + ", " + e2 + ", " + e3 + "]";
}
}
And usage of it would look like:
List<Tuple3<Person, List<Person>, Map<String, Person>>> data =
om.readValue(jsonText,
new TypeReference<List<Tuple3<Person, List<Person>, Map<String, Person>>>>() {});
Is it something doable or not?
Ok. So... there may be a simpler way to do "tuple"-style. You can actually annotate POJOs to be serialized as arrays:
@JsonFormat(shape=JsonFormat.Shape.ARRAY)
@JsonPropertyOrder({ "name", "age" }) // or use "alphabetic"
public class POJO {
public String name;
public int age;
}
and if so, they'll be written as arrays and read back from arrays.
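For example, a quick round trip with the annotated POJO above (a sketch) behaves like this:
POJO pojo = new POJO();
pojo.name = "Bob";
pojo.age = 30;
ObjectMapper om = new ObjectMapper();
String json = om.writeValueAsString(pojo); // ["Bob",30] -- array shape, in the declared property order
POJO back = om.readValue(json, POJO.class); // reads back from the same array form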
But if you do want to handle custom generic types, you probably need to get the type parameters resolved. This can be done using TypeFactory, method findTypeParameters(...). While this may seem superfluous, it is needed for the general case where you sub-type (if not, JavaType actually has accessors for direct type parameters).
Yes, you must use reflection to get all fields, rather than looking up known fields by name.
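Putting those two hints together, here is a sketch of a generic deserializer for the Tuple3 class above (my own illustration, not from the original answer). It relies on the fully resolved JavaType that the module's findBeanDeserializer(...) already receives, and on DeserializationContext.readValue(JsonParser, JavaType), available since Jackson 2.4:
public static class Tuple3Deserializer extends JsonDeserializer<Object> {
    // The fully resolved type, e.g. Tuple3<Person, List<Person>, Map<String, Person>>.
    // In TupleModule.findBeanDeserializer(...):
    //   if (type.getRawClass() == Tuple3.class) return new Tuple3Deserializer(type);
    private final JavaType type;
    public Tuple3Deserializer(JavaType type) { this.type = type; }
    @Override
    public Object deserialize(JsonParser p, DeserializationContext ctxt) throws IOException {
        try {
            Object res = type.getRawClass().newInstance();
            if (!p.isExpectedStartArrayToken()) {
                throw new IOException("Tuple array expected but found " + p.getCurrentToken());
            }
            for (int i = 0; i < type.containedTypeCount(); i++) {
                p.nextToken(); // advance to the i-th array element
                Field f = type.getRawClass().getField("e" + (i + 1));
                // containedType(i) is the resolved type parameter Ti
                f.set(res, ctxt.readValue(p, type.containedType(i)));
            }
            p.nextToken(); // consume END_ARRAY
            return res;
        } catch (ReflectiveOperationException ex) {
            throw new IllegalStateException(ex);
        }
    }
}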
When reading up on the play2 documentation I found this:
Because of the way Play 2.0 works, action code must be as fast as
possible (i.e. non blocking). So what should we return as result if we
are not yet able to compute it? The response should be a promise of a
result!
Wow! This of course made me interested in playakka and Akka.
I'm currently building an autocomplete application that integrates with Elasticsearch, so this would be a perfect fit!
Controller:
public class AutoComplete extends Controller {
@BodyParser.Of(value = BodyParser.Json.class)
public static Result complete(final String term) {
F.Promise<List<String>> list = Akka.future(new Callable<List<String>>() {
public List<String> call() throws Exception {
List<String> list = IndexService.find(term);
return list;
}
});
return async(list.map(new F.Function<List<String>, Result>() {
@Override
public Result apply(List<String> list) throws Throwable {
return ok(Json.toJson(list));
}
}));
}
Service:
public static List<String> find(final String term) {
IndexQuery < SearchWord > query = SearchWord.find.query();
query.setQuery("{\n" +
" \"bool\": {\n" +
" \"should\": [\n" +
" {\n" +
" \"text\": {\n" +
" \"search_word.ngrams\": {\n" +
" \"operator\": \"and\",\n" +
" \"query\": \""+term+"\"\n" +
" }\n" +
" }\n" +
" },\n" +
" {\n" +
" \"text\": {\n" +
" \"search_word.full\": {\n" +
" \"boost\": 1,\n" +
" \"query\": \""+term+"\"\n" +
" }\n" +
" }\n" +
" }\n" +
" ]\n" +
" }\n" +
"}");
IndexResults<SearchWord> indexResults = SearchWord.find.search(query);
List<String> list = new ArrayList<String>();
for(SearchWord word : indexResults.getResults()){
list.add(word.getWord());
}
return list;
}
}
SearchWord:
@IndexType(name = "search_word")
public class SearchWord extends Index {
// Find method static for request
public static Index.Finder<SearchWord> find = new Index.Finder<SearchWord>(SearchWord.class);
public enum WordType {
NAME,
STRONG_SEARCH_WORD,
WEAK_SEARCH_WORD,
BANNED
}
private String word;
private WordType wordType;
public SearchWord() {
}
public SearchWord(IndexWord indexWord) {
super.id = ""+indexWord.getId();
this.word = StringUtils.lowerCase(indexWord.getWord());
this.wordType = WordType.valueOf(indexWord.getType());
}
public String getId() {
return super.id;
}
public void setId(String id) {
super.id = id;
}
public String getWord() {
return word;
}
public void setWord(String word) {
this.word = word;
}
public WordType getWordType() {
return wordType;
}
public void setWordType(WordType wordType) {
this.wordType = wordType;
}
@Override
public Map toIndex() {
HashMap map = new HashMap();
map.put("id", super.id);
map.put("word", word);
map.put("word_type", wordType.toString());
return map;
}
@Override
public Indexable fromIndex(Map map) {
if (map == null) {
return this;
}
this.word = (String) map.get("word");
this.wordType = WordType.valueOf((String)map.get("word_type"));
return this;
}
}
The code works very well, but I must say I'm not sure that I have implemented this correctly. I'm really struggling to understand the documentation.
So my questions are basically:
Have I implemented the Future and Promise correctly?
Would it be better to create a custom actor and perform the index search in that actor, like the example in the docs:
return async(
Akka.asPromise(ask(myActor,"hello", 1000)).map(
new Function<Object,Result>() {
public Result apply(Object response) {
return ok(response.toString());
}
}
)
);
Maybe you have some great example that I have not found yet?
AFAIK, your code is totally OK.
I may be wrong, but I think that the second option is strictly equivalent to the first one, since the Akka.future() method is a wrapper around the Akka.asPromise() method.
From the Akka class source code of Play 2.0.4:
/**
* Executes a block of code asynchronously in the application Akka Actor system.
*/
public static <T> Promise<T> future(java.util.concurrent.Callable<T> callable) {
return asPromise(akka.dispatch.Futures.future(callable, system().dispatcher()));
}
Although you have correctly implemented the Promise and Future, I wouldn't consider this code to be "non-blocking"...
It seems that the blocking call is
List<String> list = IndexService.find(term);
and although this is now wrapped in a promise/future, it is still a blocking call...
If you want to be truly non-blocking (with all its benefits), you'll have to make your data access (queries) non-blocking...
Oh, and a non-blocking action method should return a Promise of a Result, not a Result...
This is how I would write your code:
@BodyParser.Of(value = BodyParser.Json.class)
public static F.Promise<Result> complete(final String term) {
scala.concurrent.Future<List<String>> listFuture = IndexService.find(term);
F.Promise<List<String>> listPromise = F.Promise.wrap(listFuture);
return listPromise.map(new F.Function<List<String>, Result>() {
@Override
public Result apply(List<String> list) throws Throwable {
return ok(Json.toJson(list));
}
});
}
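For completeness, IndexService.find would then need to return a Future itself. A minimal sketch of that (my own; doBlockingSearch is a hypothetical stand-in for your existing IndexQuery code):
public static scala.concurrent.Future<List<String>> find(final String term) {
    return akka.dispatch.Futures.future(new java.util.concurrent.Callable<List<String>>() {
        public List<String> call() throws Exception {
            return doBlockingSearch(term); // hypothetical: the existing query code
        }
    }, Akka.system().dispatcher());
}
Note that this merely moves the blocking search onto another thread pool; to be truly non-blocking end to end you would need an asynchronous Elasticsearch client.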
Hope this helps!