How to resolve a promise inside another promise? - java

I have an action which needs to get a list of emails from a remote server. Then I want to use the emails to get a list of emailDomainInformation from another remote server (note that this second piece of info depends on the first). After all this, I want to put the data from both servers into a map and render it onto the page with dust.
I managed to get this to work without the second piece of data by doing it like this:
public static Result index()
{
    F.Promise<Email> emailPromise = getEmailPromise(...);
    F.Promise<Result> results = emailPromise.map(new F.Function<Email, Result>()
    {
        public Result apply(Email email)
        {
            Map<String, Object> data = new HashMap<String, Object>();
            data.put("email", email.getAddress());
            data.put("domain", email.getDomain());
            return ok(dustRenderer.render(data));
        }
    });
    return async(results);
}
Now I want to make an async call to getEmailDomainData(email.getDomain()) inside the emailPromise.map() method. What do I do with the Promise<EmailDomain> object I get back? How do I put that into the data map to pass to the dustRenderer?

Here is an example that essentially does what you need:
public static Result social() {
    final F.Promise<WS.Response> twitterPromise = WS.url("http://search.twitter.com/search.json").setQueryParameter("q", "playframework").get();
    final F.Promise<WS.Response> githubPromise = WS.url("https://api.github.com/legacy/repos/search/playframework").get();
    return async(
        twitterPromise.flatMap(
            new F.Function<WS.Response, F.Promise<Result>>() {
                public F.Promise<Result> apply(final WS.Response twitterResponse) {
                    return githubPromise.map(
                        new F.Function<WS.Response, Result>() {
                            public Result apply(final WS.Response githubResponse) {
                                return ok(views.html.social.render(twitterResponse.asJson().findValuesAsText("text"), githubResponse.asJson().findValuesAsText("name")));
                            }
                        }
                    );
                }
            }
        )
    );
}
In this case the two requests run in parallel, but since your second call depends on the first, you can move the second Promise creation into the handler for the first Promise.
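Applied to the original question, the same pattern would look roughly like this. This is only a sketch: it assumes getEmailDomainData(...) returns an F.Promise<EmailDomain>, and the dustRenderer call and the "domainInfo" key are illustrative rather than taken from a real API.
public static Result index()
{
    final F.Promise<Email> emailPromise = getEmailPromise(...);
    return async(
        emailPromise.flatMap(new F.Function<Email, F.Promise<Result>>()
        {
            public F.Promise<Result> apply(final Email email)
            {
                // dependent call: it can only start once the email is known
                return getEmailDomainData(email.getDomain()).map(new F.Function<EmailDomain, Result>()
                {
                    public Result apply(EmailDomain domainInfo)
                    {
                        Map<String, Object> data = new HashMap<String, Object>();
                        data.put("email", email.getAddress());
                        data.put("domain", email.getDomain());
                        data.put("domainInfo", domainInfo); // hypothetical key for the second server's data
                        return ok(dustRenderer.render(data));
                    }
                });
            }
        })
    );
}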

Related

In Hazelcast Jet, how can I convert an IList to a normal list so I can send it in a Response?

I am new to Hazelcast Jet. In my application I am doing some aggregation over data and getting results, but I want to send them in a REST response, so how can I convert the IList to a normal list?
public class ResponseMessage<T> {
    private T responseClassType;

    private ResponseMessage() {}

    private ResponseMessage(T t) {
        this.responseClassType = t;
    }

    public static <T> ResponseMessage<T> withResponseData(T classType) {
        return new ResponseMessage<T>(classType);
    }

    public static ResponseMessage<Void> empty() {
        return new ResponseMessage<>();
    }

    public T getResponseClassType() {
        return responseClassType;
    }

    public void setResponseClassType(T responseClassType) {
        this.responseClassType = responseClassType;
    }
}
This is my generic response class, and below is how I am sending the response after all calculations:
public ResponseMessage<?> runProcess(Pipeline pl) {
    Map<String, BatchStage<Object>> allBatch = new HashMap<String, BatchStage<Object>>();
    allBatch.put(z.get("id").toString(), new SomeCalulation().readSource(pl));
    BatchStage<Object> h = allBatch.values().iterator().next();
    h.writeTo(Sinks.list("abc"));
    IList<Object> abc = jetInstance.getList("abc");
    List<Object> result = new ArrayList<>(abc);
    abc.destroy();
    return ResponseMessage.withResponseData(result);
}
Now this is working, but every time I call the REST endpoint the list keeps growing, and if I clear the list it shows blank records. Please help: how can I convert it to a normal list, or what is the best way to send the response?
Update: it was not working because I was joining the job after the method call:
runProcess(pl);
job.join(); // because I join after runProcess it does not work, but if I directly return ResponseMessage.withResponseData(jetInstance.getList("abc")) and then join, it works.
I don't see the pipeline being submitted as a job and the result being waited for (job.join()); I suppose you have omitted this from your code sample.
To solve your issue with the empty list, simply copy the result before destroying the list:
job.join();
IList<Object> abc = jetInstance.getList("abc");
List<Object> result = new ArrayList<>(abc);
abc.destroy();
return ResponseMessage.withResponseData(result);
Also, the list should have a unique name for each request; otherwise multiple requests will write to the same list, with unpredictable results.
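A minimal sketch of that suggestion, assuming the job is created from the pipeline with jetInstance.newJob(pl) as in a typical Jet setup; the variable names are illustrative and not from the original code:
// give each request its own sink list so concurrent requests don't interfere
String listName = "result-" + UUID.randomUUID();
h.writeTo(Sinks.list(listName));

Job job = jetInstance.newJob(pl);
job.join();                                  // wait for the pipeline to finish before reading the sink

IList<Object> sink = jetInstance.getList(listName);
List<Object> result = new ArrayList<>(sink); // copy into a plain list for the REST response
sink.destroy();                              // release the distributed list
return ResponseMessage.withResponseData(result);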

Access URL Query Parameters in Lagom

How can I access URL query parameters from the HTTP request in Lagom? I have a requirement where the set of query parameters is open-ended. I want to access the query parameters as a map. Is there any way to do that?
There isn't currently a way to access the query parameters as a map, or to declare a service call that takes indefinite parameters, as of Lagom 1.3.
In situations where the request may be of arbitrary length or complexity, it is better to encode request data in the entity body and use a request message deserializer in Lagom to map that to an immutable data type.
From the docs:
Query string parameters can also be extracted from the path, using a & separated list after a ? at the end of the path. For example, the following service call uses query string parameters to implement paging:
ServiceCall<NotUsed, PSequence<Item>> getItems(long orderId, int pageNo, int pageSize);

default Descriptor descriptor() {
    return named("orders").withCalls(
        pathCall("/order/:orderId/items?pageNo&pageSize", this::getItems)
    );
}
Check this link for more details.
https://github.com/msdhillon8989/lagom-demo-request-header.git
You can use the HeaderServiceCall of Lagom.
@Override
public ServiceCall<NotUsed, String> method1() {
    return readHeader(
        new Function<String, ServerServiceCall<NotUsed, String>>() {
            @Override
            public ServerServiceCall<NotUsed, String> apply(String param) throws Exception {
                return request -> {
                    return completedFuture(Utilities.ok(null, parseQueryString(param).toString()));
                };
            }
        });
}
The definition of the readHeader function is as follows:
public <Request, Response> ServerServiceCall<Request, Response> readHeader(Function<String, ServerServiceCall<Request, Response>> serviceCall) {
    return HeaderServiceCall.composeAsync(new java.util.function.Function<RequestHeader, CompletionStage<? extends ServerServiceCall<Request, Response>>>() {
        @Override
        public CompletionStage<? extends ServerServiceCall<Request, Response>> apply(RequestHeader requestHeader) {
            CompletableFuture<String> uri = CompletableFuture.supplyAsync(() -> requestHeader.uri().getRawQuery().toString());
            return uri.thenApply(query -> {
                try {
                    return serviceCall.apply(query);
                } catch (Exception e) {
                    e.printStackTrace();
                    throw new Forbidden("Bad request " + e.getMessage());
                }
            });
        }
    });
}
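The example above relies on a Utilities.parseQueryString helper that is not shown in the answer. A minimal sketch of such a helper, purely illustrative and assuming UTF-8 encoded parameters, might look like this:
import java.io.UnsupportedEncodingException;
import java.net.URLDecoder;
import java.util.HashMap;
import java.util.Map;

public class Utilities {
    // Splits a raw query string such as "a=1&b=2" into a map of parameter names to values.
    public static Map<String, String> parseQueryString(String rawQuery) {
        Map<String, String> params = new HashMap<>();
        if (rawQuery == null || rawQuery.isEmpty()) {
            return params;
        }
        for (String pair : rawQuery.split("&")) {
            int idx = pair.indexOf('=');
            try {
                String key = URLDecoder.decode(idx > 0 ? pair.substring(0, idx) : pair, "UTF-8");
                String value = idx > 0 ? URLDecoder.decode(pair.substring(idx + 1), "UTF-8") : "";
                params.put(key, value);
            } catch (UnsupportedEncodingException e) {
                throw new IllegalStateException("UTF-8 is always supported", e);
            }
        }
        return params;
    }
}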

RxAndroid - execute another request based on a specific result?

I am using RxAndroid/RxJava for the first time and am trying to figure out how to implement a chain of requests where each request depends on the result of the previous one.
Example:
private Boolean isUserEligible(){
    ..
}
private String registerDevice(){
    ..
}
private String login(){
    ..
}
As far as I know, an Observable can only execute all of the above methods at once, or one by one, like below:
// Fetch from all of them simultaneously
Observable<String> zipped
        = Observable.zip(isUserEligible(), registerDevice(), login(), new Func3<Boolean, String, String, String>() {
});
Observable<String> concatenated = Observable.concat(isUserEligible(), registerDevice(), login());
What if I want to do something like this:
// execute isUserEligible() first and, if eligible, execute registerDevice(), else execute login().
Thanks in advance.
Assuming all of these methods return observables, you could write:
Observable<String> response = isUserEligible()
        .flatMap(isEligible -> isEligible ? registerDevice() : login());
Without retrolambda, you could write:
Observable<String> response = isUserEligible()
        .flatMap(new Func1<Boolean, Observable<String>>() {
            public Observable<String> call(final Boolean isEligible) {
                return isEligible ? registerDevice() : login();
            }
        });
This is a use case for flatMap.
http://reactivex.io/documentation/operators/flatmap.html
Create the mapping from the first result to a second observable; here you can use the result of the first function as input to the second.
final Func1<Boolean, Observable<String>> registerFunc = isEligible -> {
    return registerDevice(isEligible);
};
Now you have to create your chain of calls and flatMaps: do the first call, and flatMap the resulting Observable with the function you just created. This will again return an Observable, so you can keep chaining it with further flatMaps.
isUserEligible().flatMap(registerFunc);
Be aware that all your functions need to return Observables to make this possible.
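For completeness, here is a sketch of consuming the chained result, assuming RxJava 1.x as in the answers above; the success and error handlers are hypothetical:
isUserEligible()
        .flatMap(new Func1<Boolean, Observable<String>>() {
            public Observable<String> call(final Boolean isEligible) {
                // branch on the first result: register new devices, otherwise log in
                return isEligible ? registerDevice() : login();
            }
        })
        .subscribe(new Action1<String>() {
            public void call(String result) {
                // handle the result of whichever branch ran (hypothetical handler)
                handleSuccess(result);
            }
        }, new Action1<Throwable>() {
            public void call(Throwable error) {
                // handle a failure from either step (hypothetical handler)
                handleError(error);
            }
        });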

Join two streams using a count-based window

I am new to the Flink Streaming API and I want to complete the following simple (IMO) task. I have two streams and I want to join them using count-based windows. The code I have so far is the following:
public class BaselineCategoryEquiJoin {
    private static final String recordFile = "some_file.txt";

    private static class ParseRecordFunction implements MapFunction<String, Tuple2<String[], MyRecord>> {
        public Tuple2<String[], MyRecord> map(String s) throws Exception {
            MyRecord myRecord = parse(s);
            return new Tuple2<String[], MyRecord>(myRecord.attributes, myRecord);
        }
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment environment = StreamExecutionEnvironment.createLocalEnvironment();
        ExecutionConfig config = environment.getConfig();
        config.setParallelism(8);
        DataStream<Tuple2<String[], MyRecord>> dataStream = environment.readTextFile(recordFile)
                .map(new ParseRecordFunction());
        DataStream<Tuple2<String[], MyRecord>> dataStream1 = environment.readTextFile(recordFile)
                .map(new ParseRecordFunction());
        DataStreamSink<Tuple2<String[], String[]>> joinedStream = dataStream1
                .join(dataStream)
                .where(new KeySelector<Tuple2<String[], MyRecord>, String[]>() {
                    public String[] getKey(Tuple2<String[], MyRecord> recordTuple2) throws Exception {
                        return recordTuple2.f0;
                    }
                }).equalTo(new KeySelector<Tuple2<String[], MyRecord>, String[]>() {
                    public String[] getKey(Tuple2<String[], MyRecord> recordTuple2) throws Exception {
                        return recordTuple2.f0;
                    }
                }).window(TumblingProcessingTimeWindows.of(Time.seconds(1)))
                .apply(new JoinFunction<Tuple2<String[], MyRecord>, Tuple2<String[], MyRecord>, Tuple2<String[], String[]>>() {
                    public Tuple2<String[], String[]> join(Tuple2<String[], MyRecord> tuple1, Tuple2<String[], MyRecord> tuple2) throws Exception {
                        return new Tuple2<String[], String[]>(tuple1.f0, tuple1.f0);
                    }
                }).print();
        environment.execute();
    }
}
My code works without errors, but it does not produce any results. In fact, the apply method is never called (verified by adding a breakpoint in debug mode). I think the main reason for this is that my data do not have a time attribute, so the windowing (materialized through window) is not done properly. My question, therefore, is how I can indicate that I want the join to take place based on count windows. For instance, I want the join to materialize every 100 tuples from each stream. Is this feasible in Flink? If yes, what should I change in my code to achieve it?
At this point I should mention that I tried to call the countWindow() method, but for some reason it is not offered by Flink's JoinedStreams.
Thank you
Count-based joins are not supported. You could emulate count-based windows by using event-time semantics and applying a unique, increasing sequence id as the timestamp of each record. A time window of "5" would then effectively be a count window of 5.
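A rough sketch of that emulation, assuming a Flink 1.x setup where event time is enabled on the environment; the sequence counter and the use of AscendingTimestampExtractor are illustrative, and the counts are only exact when the timestamps are assigned with parallelism 1:
// Illustrative only: treat the n-th record as if it arrived at "time" n,
// so a tumbling event-time window of 100 behaves like a count window of 100.
environment.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);

DataStream<Tuple2<String[], MyRecord>> streamWithSeqIds = dataStream
        .assignTimestampsAndWatermarks(new AscendingTimestampExtractor<Tuple2<String[], MyRecord>>() {
            private long seq = 0;
            @Override
            public long extractAscendingTimestamp(Tuple2<String[], MyRecord> element) {
                return seq++; // per-parallel-instance counter; use parallelism 1 here for exact counts
            }
        });

// assign sequence ids to the second stream the same way, then join with
// .window(TumblingEventTimeWindows.of(Time.milliseconds(100)))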

RXJava combining multiple subscriptions

So I have a situation I cannot seem to solve at all.
I want to run two network requests in parallel, run some code at the end of each request independently, and then, once both requests have been processed, run additional code.
Modeled like this:
GET -> /users (run code unique to this request, independently, once the request is done)
GET -> /groups (run code unique to this request, independently, once the request is done)
Both requests are done: now run some code independent of the per-request processing.
I've been trying to use Observable.merge, but that seems hopeless, as it won't let me keep the subscription code for each request separate from one massive handler. Does anyone have a suggestion?
One option is to use map to do the extra work on each response and then zip to join the results; see the example:
// this emulates the first network call
Observable<List<String>> o1 = Observable.just(Arrays.asList("user1", "user2"));
// when the data arrives, you may transform it
Observable<List<String>> m1 = o1.map(new Func1<List<String>, List<String>>() {
    @Override
    public List<String> call(List<String> users) {
        return users;
    }
});
// and the same for the second network call
Observable<List<String>> o2 = Observable.just(Arrays.asList("group1", "group2"));
Observable<List<String>> m2 = o2.map(new Func1<List<String>, List<String>>() {
    @Override
    public List<String> call(List<String> groups) {
        return groups;
    }
});
// when both network calls succeed you can merge the results using the zip method
Observable<Map<String, List<String>>> result = Observable.zip(m1, m2, new Func2<List<String>, List<String>, Map<String, List<String>>>() {
    @Override
    public Map<String, List<String>> call(List<String> users, List<String> groups) {
        Map<String, List<String>> result = new HashMap<String, List<String>>();
        for (String user : users) {
            result.put(user, groups);
        }
        return result;
    }
});
// now you can return the result
// finally you have to subscribe to get the results, e.g.:
result.subscribe(new Action1<Map<String, List<String>>>() {
    @Override
    public void call(Map<String, List<String>> stringListMap) {
        for (String user : stringListMap.keySet()) {
            System.out.println("User: " + user + ", groups: " + stringListMap.get(user));
        }
    }
});
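If the per-request code is a side effect rather than a transformation, a variant of the same idea keeps each request's handling next to its own Observable and still joins at the end with zip. This is a sketch assuming RxJava 1.x with lambdas; fetchUsers, fetchGroups, handleUsers and handleGroups are placeholders for the real network calls and per-request handlers:
Observable<List<String>> users = fetchUsers()
        .doOnNext(list -> handleUsers(list));   // code unique to GET /users
Observable<List<String>> groups = fetchGroups()
        .doOnNext(list -> handleGroups(list));  // code unique to GET /groups

Observable.zip(users, groups, (u, g) -> Arrays.asList(u, g))
        .subscribe(both -> {
            // runs once, after both requests finished and their individual handlers ran
        });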
