Sync methods using RxJava - Java

I have the following scenarios:
1. Say I have two async callback methods (imagine I am calling two different APIs), callback1(Data d1) and callback2(Data d2). Based on d1 and d2 (i.e. once both callbacks have been invoked) I have to call a method, say setupUI(). How can I do that efficiently using RxJava?
2. There are two ViewPagers, v1 and v2, which need to be kept in sync: on a v1 page change, v2 should also change its current page (the indices stay the same), and vice versa, using RxJava.

Try putting subjects in the callbacks to convert them into rx streams.
You can then zip the two subjects and subscribe to the zipped observable to get the result of both callbacks at the same time.
Example: let's make two subjects:
PublishSubject<Data> subject1 = PublishSubject.create();
PublishSubject<Data> subject2 = PublishSubject.create();
We can use these to convert our callbacks into something we can subscribe to like this:
public void callback1(Data d1) {
    subject1.onNext(d1);
}

public void callback2(Data d2) {
    subject2.onNext(d2);
}
Now we can get the output when they both emit something like this:
class DataDto {
    Data data1;
    Data data2;

    DataDto(Data data1, Data data2) {
        this.data1 = data1;
        this.data2 = data2;
    }
}

public void main() {
    Observable.zip(subject1, subject2, new BiFunction<Data, Data, DataDto>() {
        @Override
        public DataDto apply(@NonNull Data data1, @NonNull Data data2) throws Exception {
            return new DataDto(data1, data2);
        }
    }).subscribe(new Consumer<DataDto>() {
        @Override
        public void accept(@NonNull DataDto dataDto) throws Exception {
            // do something, e.g. call setupUI()
        }
    });
}
What zip does is wait until both streams have emitted, then emit the combined result as one item. Here we made a DataDto which contains both data1 and data2.
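For intuition, zip's pairing behavior can be sketched without RxJava using two buffers: a combined item is produced only when both sides have something pending. This is a simplified illustration, not how RxJava implements it:

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;
import java.util.function.BiFunction;

public class ZipSketch {
    // Pairs items from two sources in arrival order, like Observable.zip:
    // a combined item is produced only once BOTH sides have emitted.
    static <A, B, R> List<R> zip(List<A> left, List<B> right, BiFunction<A, B, R> combiner) {
        Deque<A> bufL = new ArrayDeque<>(left);
        Deque<B> bufR = new ArrayDeque<>(right);
        List<R> out = new ArrayList<>();
        while (!bufL.isEmpty() && !bufR.isEmpty()) {
            out.add(combiner.apply(bufL.poll(), bufR.poll()));
        }
        return out; // unmatched leftovers stay buffered, exactly like zip
    }

    public static void main(String[] args) {
        // Three emissions on one side, two on the other -> only two pairs.
        List<String> pairs = zip(List.of("d1a", "d1b", "d1c"), List.of("d2a", "d2b"),
                (a, b) -> a + "+" + b);
        System.out.println(pairs); // [d1a+d2a, d1b+d2b]
    }
}
```

Note the consequence for the question's scenario: setupUI() (the combiner here) only fires when a d1 and a d2 are both available.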
Hope this helps


Java 8: Input a list of functional Interfaces and call them dynamically after .stream()

I have the following method:
public void caller() {
    List<Class1> data1 = Arrays.asList(new Class1(), new Class1() ...);
    List<Class2> data2 = Arrays.asList(new Class2(), new Class2() ...);
    // The following is what I'm trying to implement:
    List<BiConsumer<Class1, Double>> peeks1 = Arrays.asList(Class1::setOneNum, Class1::setAnotherNum, Class1::setMoreNum);
    List<BiConsumer<Class2, Double>> peeks2 = Arrays.asList(Class2::setSomeNum1, Class2::setSomeNum2);
    helper(data1, peeks1);
    helper(data2, peeks2);
    ...
}

private <T> List<T> helper(List<T> data, List<BiConsumer<T, Double>> peeks) {
    for (BiConsumer<T, Double> singlePeek : peeks) {
        data = data.stream()
                .peek(a -> singlePeek.accept(a, Math.random()))
                .collect(Collectors.toList());
    }
    return data;
}
There are other implementations in common between Class1 and Class2; the only difference is the methods called after .stream(), which is why I'm trying to "merge" the functions into one helper.
Here BiConsumer is a setter: I want to call a list of setters after stream(). But I cannot pass a list of functional interfaces into helper() (what I tried, Arrays.asList(Class1::setNum, Class1::setAnotherNum, Class1::setMoreNum), won't work as an input, since Arrays.asList() only accepts Object). So is there any work-around? Thanks!
@user7 Thanks for pointing it out. I was careless, but I've fixed the "typo" and added the caller function.
You have to specify the target type when you call the .asList method:
Arrays.<BiConsumer<Object, Double>>asList(Class1::setOneNum, ...)
Update:
According to the updated code of the question the result of Arrays.asList is not directly handed over to the helper method, so no explicit typing is required.
The only possible reasons left why the code is not working are:
At least one of the methods (setOneNum, setSomeNum1, ...) has wrong parameters types
At least one of the methods is not static
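For reference, here is how the unbound method-reference form type-checks: an instance setter's receiver becomes the first parameter of the BiConsumer. This is a minimal sketch with a hypothetical Item bean standing in for Class1:

```java
import java.util.List;
import java.util.function.BiConsumer;

public class MethodRefDemo {
    // Hypothetical bean standing in for Class1 from the question.
    static class Item {
        double num;

        void setNum(double num) { this.num = num; }
    }

    public static void main(String[] args) {
        // Item::setNum is an instance method; as an unbound reference the
        // receiver becomes the first argument, so it fits BiConsumer<Item, Double>
        // (the Double is unboxed to the setter's double parameter).
        List<BiConsumer<Item, Double>> setters = List.of(Item::setNum);
        Item item = new Item();
        setters.get(0).accept(item, 4.2);
        System.out.println(item.num); // 4.2
    }
}
```

So when the target type is known (as in the declared `List<BiConsumer<Class1, Double>>` variable), the compiler can adapt the references without an explicit type witness.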
May I suggest trying to make it a little more functional?
For your code, consider the following helper; it treats functions as first-class citizens and builds some higher-order functions:
private <T, V> Function<Supplier<T>, Supplier<T>> helper(Supplier<V> v, BiConsumer<T, V> bc) {
    return (Supplier<T> r) -> {
        bc.accept(r.get(), v.get());
        return r;
    };
}
This helper function expects a Supplier of some kind of value and a BiConsumer that will be your setter function. The return value is a function over Suppliers of the class you are working with.
With that we can build something like the pipe operator of functional languages, whose premise is that data should be processed as a pipeline.
List<Class1> data1 = Arrays.asList(new Class1(), new Class1());
List<Class2> data2 = Arrays.asList(new Class2(), new Class2());
Supplier<Double> random = () -> Math.random();
This will be our data: you have the same arrays, and now a Supplier for the random value you want.
Now let's compose our pipeline with andThen:
data1.stream()
     .forEach(data -> {
         helper(random, Class1::setOneNum)
                 .andThen(helper(random, Class1::setAnotherNum))
                 .andThen(helper(random, Class1::setMoreNum))
                 .apply(() -> data);
         System.out.println(data.toString());
     });

data2.stream()
     .forEach(data -> {
         helper(random, Class2::setSomeNum1)
                 .andThen(helper(random, Class2::setSomeNum2))
                 .apply(() -> data);
         System.out.println(data.toString());
     });
As you can see, the helper functions can be chained together with the andThen method of the Function interface. This makes Java execute one helper function and use its return value as the parameter of the next Function. The data parameter holds the class instances and is changed at each step of the chain, so by the time we have iterated, all objects have been updated.
And the result:
Class1 [oneNum=0,047, anotherNum=0,482, moreNum=0,339]
Class1 [oneNum=0,131, anotherNum=0,889, moreNum=0,411]
Class2 [someNum1=0,18, someNum2=0,004]
Class2 [someNum1=0,497, someNum2=0,702]
I think this is the result you want. And as you can see, you don't need to pass any generics, as Java infers them well.
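Stripped of the setters and suppliers, the chaining mechanism is just Function.andThen from java.util.function: each function's result feeds the next. A minimal sketch:

```java
import java.util.function.Function;

public class AndThenDemo {
    static final Function<Integer, Integer> ADD_ONE = x -> x + 1;
    static final Function<Integer, Integer> DOUBLE_IT = x -> x * 2;

    // andThen runs left to right: ADD_ONE first, then DOUBLE_IT.
    static final Function<Integer, Integer> PIPELINE = ADD_ONE.andThen(DOUBLE_IT);

    public static void main(String[] args) {
        System.out.println(PIPELINE.apply(3)); // (3 + 1) * 2 = 8
    }
}
```

Order matters: DOUBLE_IT.andThen(ADD_ONE) would compute 3 * 2 + 1 = 7 instead, which is why the helper chain above applies the setters in the listed order.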
The classes that I made for reference:
class Class1 {
    double oneNum;
    double anotherNum;
    double moreNum;

    public double getOneNum() {
        return oneNum;
    }

    public void setOneNum(double oneNum) {
        this.oneNum = oneNum;
    }

    public double getAnotherNum() {
        return anotherNum;
    }

    public void setAnotherNum(double anotherNum) {
        this.anotherNum = anotherNum;
    }

    public double getMoreNum() {
        return moreNum;
    }

    public void setMoreNum(double moreNum) {
        this.moreNum = moreNum;
    }

    @Override
    public String toString() {
        return MessageFormat.format("Class1 [oneNum={0}, anotherNum={1}, moreNum={2}]", oneNum, anotherNum, moreNum);
    }
}

class Class2 {
    double someNum1;
    double someNum2;

    public double getSomeNum1() {
        return someNum1;
    }

    public void setSomeNum1(double someNum1) {
        this.someNum1 = someNum1;
    }

    public double getSomeNum2() {
        return someNum2;
    }

    public void setSomeNum2(double someNum2) {
        this.someNum2 = someNum2;
    }

    @Override
    public String toString() {
        return MessageFormat.format("Class2 [someNum1={0}, someNum2={1}]", someNum1, someNum2);
    }
}

What is the best way to send a collection of objects over the Event Bus in Vertx?

I have a handler that serves HTTP requests at a given endpoint. The handler messages a verticle via the event bus that makes some external paginated REST calls, aggregates the results, and returns the results back to the handler. The result of the paginated REST calls is represented as a List of custom objects. If I just try to send the List itself, Vertx throws an exception complaining that it can't find a codec for java.util.ArrayList.
I'm trying to find the "best" -- meaning the easiest, most efficient, and most readable/maintainable -- way in Vertx to send a list of these objects back across the event bus to my handler. These are the options I know of and have tried so far, are there better ways to achieve this?
Serialize list to JSON and store in a JsonObject. This requires an explicit serialization/deserialization on either end which seems unnecessary:
// Verticle
List<CustomObject> result = method();
JsonObject data = new JsonObject();
data.put("result", Json.encode(result));
msg.reply(data);
// Handler
String serializedList = body.getString("result");
List<CustomObject> list = objectMapper.readValue(serializedList, new TypeReference<List<CustomObject>>(){});
Define a message codec for ArrayList<CustomObject>. In theory I believe this would work, but all the examples I've seen online for message codecs are always about creating a codec for a single object, and I'm not entirely sure if this would work for collections.
Is there a simpler method that fits my use case that I'm unaware of? Thanks!
Sorry for a lengthy example, but here you go:
public class EventBusHolder {
    public static void main(String[] args) {
        Vertx vertx = Vertx.vertx();
        vertx.eventBus().registerDefaultCodec(Holder.class, new HolderCodec());
        vertx.deployVerticle(new SomeVerticle(), (r) -> {
            vertx.eventBus().send("custom", new Holder(new CustomObject("a")));
        });
    }
}

class HolderCodec implements MessageCodec<Holder, Holder> {
    @Override
    public void encodeToWire(Buffer buffer, Holder holder) {
    }

    @Override
    public Holder decodeFromWire(int pos, Buffer buffer) {
        return null;
    }

    @Override
    public Holder transform(Holder holder) {
        return holder;
    }

    @Override
    public String name() {
        return "HolderCodec";
    }

    @Override
    public byte systemCodecID() {
        return -1;
    }
}

class SomeVerticle extends AbstractVerticle {
    @Override
    public void start() {
        vertx.eventBus().consumer("custom", (msg) -> {
            System.out.println(msg.body());
        });
    }
}

class CustomObject {
    public String name;

    public CustomObject(String name) {
        this.name = name;
    }

    @Override
    public String toString() {
        return "CustomObject{" + "name='" + name + '\'' + '}';
    }
}

final class Holder {
    private final List<CustomObject> data;

    public Holder(final CustomObject... data) {
        this.data = Arrays.asList(data);
    }

    public List<CustomObject> getData() {
        return data;
    }

    @Override
    public String toString() {
        return "Holder{" + "data=" + data + '}';
    }
}
Take note that encodeToWire and decodeFromWire are not implemented. They aren't invoked for local messages.
Having this Holder object is an easy way to get around type erasure on the JVM.
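The erasure issue is easy to demonstrate: at runtime every ArrayList has the same Class object regardless of its element type, so a codec registered per class cannot distinguish ArrayList&lt;CustomObject&gt; from any other list, while Holder is its own distinct class:

```java
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    public static void main(String[] args) {
        List<String> strings = new ArrayList<>();
        List<Integer> ints = new ArrayList<>();
        // Both erase to java.util.ArrayList, so a per-class codec registry
        // (like registerDefaultCodec) has no generic type to dispatch on.
        System.out.println(strings.getClass() == ints.getClass()); // true
    }
}
```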

Data Extraction from JMeter SampleResult Response

In the snippet shown below, data and data1 are set from different JMeter SampleResult responses. The challenge I am facing is that during the processing of value1 I require the data from value, which is present in another class.
The value comes from the response of JMeter SampleResult 1, whereas data1 comes from the response of JMeter SampleResult 2.
I am also using a validation file for BeanShell assertions, which only processes the response of JMeter SampleResult 2 for validation purposes.
How can I get the data from value to use it for further calculations of value1?
Class C is an abstract class
class A extends C {
    @Override
    public String processValue() {
        // ... some code here ...
        value = getValue();
        // ... calculation of result done here ...
        return result;
    }

    @Override
    public void setData(Object data) {
        this.data = (typecast) data;
    }

    private String getValue() {
        // ... logic here ...
        return value;
    }
}
value1 requires value from Class A for its processing:
class B extends C {
    @Override
    public String processValue() {
        // ... some code here ...
        return value1;
    }

    @Override
    public void setData(Object data1) {
        this.data1 = (typecast) data1;
    }
}
data1 and data are typecast into different types.
In JMeter you can put different objects into JMeterVariables and retrieve them later. To put:
JMeterVariables vars = JMeterContextService.getContext().getVariables();
vars.putObject("data1", data1);
vars.putObject("data", data);
and to get them back:
vars.getObject("data1");
vars.getObject("data");

Nested network requests with RxJava

I'm very new to RxJava. I have a problem converting a nested async operation into an RxJava structure. Building a single async task that fetches data from the server has not been a problem; however, I'm stuck on an exemplary case of this sort:
List<A> aaa = new ArrayList<>();
List<B> bbb = new ArrayList<>();

new FetchItemA(String id) {
    @Override
    protected void onPostExecute(List<A> items) {
        for (A item : items) {
            new FetchItemB(item.getId()) {
                @Override
                protected void onPostExecute(List<B> newItems) {
                    neededList.addAll(newItems);
                }
            };
        }
    }
};
The problem is with the return types. I've created my observable this way:
Observable.fromArray(String userId)
    .map(new Function<String, List<A>>() {
        @Override
        public List<A> apply(String id) {
            return getListA();
        }
    })
    .map(new Function<String, List<B>>() {
        @Override
        public List<B> apply(String id) {
            someList.add(getItemB(id));
            return someList;
        }
    })
    .observeOn(AndroidSchedulers.mainThread())
    .subscribe(new Observer() {
        @Override
        public void onCompleted(List<B> items) {
            bbb.addAll(items);
            adapter.setItems(bbb);
        }
    });
This, however, is illegal, as this Observable expects the type List&lt;A&gt; whereas I'm returning List&lt;B&gt;. How can I structure my RxJava observable to fetch a list of items (of type A, each having a unique id), then, based on the fetched list, fetch a single item (of type B) for the id of each item of type A in the first list, and only then add the received items (of type B) to a list?
The essential step in fetching the data is to transform each A into a B using a network call. In the RxJava world, that means wrapping the network call in an observable and then using the flatMap() operator.
Observable.fromList(aaa)
    .flatMap(new Func1<A, Observable<B>>() {
        @Override
        public Observable<B> call(A a) {
            // wrap the (blocking) network call in an observable
            return Observable.fromCallable(new Callable<B>() {
                @Override
                public B call() throws Exception {
                    return getNetworkValueAsB(a);
                }
            });
        }
    })
    .toList()
    .subscribe(new Observer<List<B>>() {
        @Override
        public void onNext(List<B> bItemList) {
            bbb.addAll(bItemList);
            adapter.setItems(bbb);
        }
    });
The operations are fromList(), which converts the List&lt;A&gt; into an Observable&lt;A&gt;; flatMap(), which converts each A into a B using the network call; and toList(), which gathers all the B values produced into a list that is then used in the subscription.
The fromList() operator might be called fromIterable() or from() depending on the version of the library you are using.
Edited to remove lambdas.
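For intuition, the same fetch-each-then-collect shape exists in plain java.util.stream. A synchronous sketch, where the hypothetical fetchB stands in for the network call that turns an A into a B:

```java
import java.util.List;
import java.util.stream.Collectors;

public class FlatMapSketch {
    // Stand-in for the network call that turns an A id into a B value.
    static String fetchB(String aId) {
        return "B-for-" + aId;
    }

    // One B per A, gathered into a list like toList() in RxJava.
    static List<String> fetchAll(List<String> aIds) {
        return aIds.stream().map(FlatMapSketch::fetchB).collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(fetchAll(List.of("a1", "a2"))); // [B-for-a1, B-for-a2]
    }
}
```

In the stream version a plain map() suffices because fetchB returns a value directly; RxJava needs flatMap() because the wrapped network call returns an Observable&lt;B&gt; rather than a bare B.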

How can I aggregate data from 3 streams to one single combined object?

I have the following situation in an Apache Flink project.
3 streams with different objects, like
Person -> String id, String firstName, String lastName (i.e. 101, John, Doe)
PersonDetail -> String id, String address, String city, String phoneNumber, long personId (i.e. 99, Stefansplatz 1, +43066012345678, 101)
PersonAddDetail -> String id, String AddDetailType, object AddDetailValue, long personId (i.e. 77, 1, Hansi or 78, 2, 1234 or 80, 3, true)
I'd like to aggregate (not sure if that's the right word here) objects from these streams into a new object that I put on a new stream. The aggregation should be based on the Person id, and as an additional catch I need to keep only PersonAddDetail objects with a specific AddDetailType (say I'm only interested in objects with types 1 and 2).
The aggregated object should look somewhat like:
PersonReport -> long id, String firstName, String lastName, String address, String city, String phoneNumber, ArrayList&lt;PersonAddDetail&gt; details
The question now is if this is possible at all, and if yes, how I can accomplish it. Every input welcome.
Your problem sounds like a join operation. You could do something like:
personDataStream.join(personDetailDataStream).where(new KeySelector<Person, Long>() {
...
}).equalTo(new KeySelector<PersonDetail, Long>() {
...
}).window(TumblingEventTimeWindows.of(Time.seconds(2))).apply(new JoinFunction<Person, PersonDetail, PersonWithDetail>() {
...
});
Note that, in general, a join operation is impossible on unbounded (infinite) collections, so you need to bound it into windows.
Thanks to @Jeremy Grand's comment I came up with a solution myself, and I'd like to share my thoughts and code. I introduced a new class called PersonContainer:
public class PersonContainer {
    private String id;
    private Person person;
    private PersonDetail personDetail;
    private List<PersonAddDetail> personAddDetailList = new ArrayList<>();

    public PersonContainer(Person person) {
        this.id = person.getId();
        this.person = person;
    }

    public PersonContainer(PersonDetail personDetail) {
        this.id = String.valueOf(personDetail.getPersonId());
        this.personDetail = personDetail;
    }

    public PersonContainer(PersonAddDetail personAddDetail) {
        this.id = String.valueOf(personAddDetail.getPersonId());
        this.personAddDetailList.add(personAddDetail);
    }

    public PersonContainer merge(PersonContainer other) {
        if (other.person != null) {
            this.person = other.person;
            return this;
        }
        if (other.personDetail != null) {
            this.personDetail = other.personDetail;
            return this;
        }
        if (!other.personAddDetailList.isEmpty()) {
            this.personAddDetailList.addAll(other.personAddDetailList);
            return this;
        }
        return null;
    }

    public String getId() {
        return id;
    }

    public Person getPerson() {
        return person;
    }

    public PersonDetail getPersonDetail() {
        return personDetail;
    }

    public List<PersonAddDetail> getPersonAddDetailList() {
        return personAddDetailList;
    }

    public boolean isComplete() {
        return person != null && personDetail != null && personAddDetailList.size() > 1;
    }
}
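The merge/isComplete pair is what drives the reduce step, and its semantics can be exercised without Flink. A simplified stand-in (completeness flags only, hypothetical names) behaves like this:

```java
public class MergeSketch {
    // Minimal stand-in for PersonContainer: tracks which pieces have arrived.
    static class Container {
        boolean hasPerson, hasDetail;
        int addDetails;

        Container merge(Container other) {
            hasPerson |= other.hasPerson;
            hasDetail |= other.hasDetail;
            addDetails += other.addDetails;
            return this;
        }

        boolean isComplete() {
            // Complete once the Person, the PersonDetail and both filtered
            // PersonAddDetail objects (types 1 and 2) have arrived.
            return hasPerson && hasDetail && addDetails > 1;
        }
    }

    static Container person()    { Container c = new Container(); c.hasPerson = true; return c; }
    static Container detail()    { Container c = new Container(); c.hasDetail = true; return c; }
    static Container addDetail() { Container c = new Container(); c.addDetails = 1;   return c; }

    public static void main(String[] args) {
        Container merged = person().merge(detail()).merge(addDetail());
        System.out.println(merged.isComplete()); // false: only one add-detail so far
        System.out.println(merged.merge(addDetail()).isComplete()); // true
    }
}
```

This mirrors why incomplete containers must be fed back into the iteration: the reduce keeps merging per key until every expected piece has been seen.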
That's the important part, as I'm going to map the objects of the three input streams to this common object first, in order to union the streams afterwards.
So here is what I do (I described the single steps in the comments). In short, I map the three input streams to new streams of the newly introduced container. Then I union the three streams and use the iterate pattern to key these objects and merge them using my custom merge method. Finally, I use the container's isComplete method to separate fully merged containers, which are mapped to the output, from not-yet-done containers, which are fed back into the merge process.
// Filter PersonAddDetail to keep just the types needed
DataStream<PersonAddDetail> filteredPersonAddDetail = unfilteredPersonAddDetail.filter(new FilterFunction<PersonAddDetail>() {
    @Override
    public boolean filter(PersonAddDetail personAddDetail) throws Exception {
        return personAddDetail.getAddDetailType().matches("1|2");
    }
});

// Map the Person stream to the common container
DataStream<PersonContainer> mappedPersonStream = personInputStream.map(new MapFunction<Person, PersonContainer>() {
    @Override
    public PersonContainer map(Person person) throws Exception {
        return new PersonContainer(person);
    }
});

// Map the PersonDetail stream to the common container
DataStream<PersonContainer> mappedPersonDetailStream = personDetailInputStream.map(new MapFunction<PersonDetail, PersonContainer>() {
    @Override
    public PersonContainer map(PersonDetail personDetail) throws Exception {
        return new PersonContainer(personDetail);
    }
});

// Map the PersonAddDetail stream to the common container
DataStream<PersonContainer> mappedPersonAddDetailStream = filteredPersonAddDetail.map(new MapFunction<PersonAddDetail, PersonContainer>() {
    @Override
    public PersonContainer map(PersonAddDetail personAddDetail) throws Exception {
        return new PersonContainer(personAddDetail);
    }
});

// Union the three input streams into one single stream
DataStream<PersonContainer> combinedInput = mappedPersonStream.union(mappedPersonDetailStream, mappedPersonAddDetailStream);

// The iteration pattern is used here to recursively merge corresponding objects together
IterativeStream<PersonContainer> iteration = combinedInput.iterate();

// Group objects by their shared id, then use reduce to merge them
DataStream<PersonContainer> iterationBody = iteration.keyBy(new KeySelector<PersonContainer, String>() {
    @Override
    public String getKey(PersonContainer personContainer) throws Exception {
        return personContainer.getId();
    }
})
.reduce(new ReduceFunction<PersonContainer>() {
    @Override
    public PersonContainer reduce(PersonContainer personContainer, PersonContainer other) throws Exception {
        return personContainer.merge(other);
    }
});

// Use the container's isComplete method to check whether the merge is finished
// or we need to wait for further objects on the stream
DataStream<PersonContainer> containersNotCompleteYet = iterationBody.filter(new FilterFunction<PersonContainer>() {
    @Override
    public boolean filter(PersonContainer personContainer) throws Exception {
        return !personContainer.isComplete();
    }
});

// Partially merged (or not yet merged) containers are put back on the stream
iteration.closeWith(containersNotCompleteYet);

// Fully merged containers are processed further
DataStream<PersonContainer> completeContainers = iterationBody.filter(new FilterFunction<PersonContainer>() {
    @Override
    public boolean filter(PersonContainer personContainer) throws Exception {
        return personContainer.isComplete();
    }
});

// Finally each container is mapped to the correct output object
DataStream<PersonReport> personReport = completeContainers.map(new MapFunction<PersonContainer, PersonReport>() {
    @Override
    public PersonReport map(PersonContainer personContainer) throws Exception {
        // map personContainer to the final PersonReport (mapping omitted here)
        return personContainer;
    }
});
This approach is working for me. A good thing is that I can handle objects that arrive late on the stream (say, for example, a PersonAddDetail comes in some minutes after the other objects), and I don't need to define any sort of windowing.
Thanks for the input anyway.
