I am trying to create a hot observable with filtering in the register method, taking a list of events that implement a base interface, as below:
public void register(Observer<AppEvent> observer, List<AppEvent> filter) {
Log.d(TAG, "Registering observer");
subject.subscribeOn(Schedulers.io())
.filter(o -> {
return filter.stream().anyMatch(it -> (o instanceof it));
})
.observeOn(AndroidSchedulers.mainThread())
.subscribe(observer);
}
public interface AppEvent extends Serializable {
}
public static class ItemDetails implements AppEvent {
public String info;
public ItemDetails(String malwareInfo) {
info = malwareInfo;
}
}
public static class InitApi implements AppEvent {
public Boolean isSuccess;
public InitApi(boolean isSuccess) {
this.isSuccess = isSuccess;
}
}
What I am trying to do is to have a filter based on type of event class, in our case ItemDetails or InitApi.
Logically the code is correct:
.filter(o -> {
return filter.stream().anyMatch(it -> (o instanceof it));
})
but the compiler reports "unknown class: it".
How can I do filtering over a list of AppEvent?
As mentioned in the comments, you're seeing the compiler error because the instanceof operator requires the right-hand-side argument to be either a class or an interface.
If you're trying to create an "allowed" list of events to pass on to the Observer, you could do something like this:
public void register(Observer<AppEvent> observer, List<AppEvent> filter) {
Log.d(TAG, "Registering observer");
// Collect the names of events to pass on to the observer
final List<String> allowedEventNames = filter.stream()
.map(it -> it.getClass().getName())
.collect(Collectors.toList());
subject.subscribeOn(Schedulers.io())
// Filter based on names
.filter(event -> allowedEventNames.contains(event.getClass().getName()))
.observeOn(AndroidSchedulers.mainThread())
.subscribe(observer);
}
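If you can change the signature, another option is to accept the event classes instead of event instances and use Class.isInstance, which is the runtime equivalent of instanceof; the filter line then becomes filter.stream().anyMatch(c -> c.isInstance(o)). Stripped of the RxJava plumbing, the idea looks like this (a sketch; the class and method names are illustrative):

```java
import java.util.Arrays;
import java.util.List;

public class ClassFilterDemo {
    interface AppEvent {}
    static class ItemDetails implements AppEvent {}
    static class InitApi implements AppEvent {}

    // Returns true when the event's runtime type matches one of the allowed classes.
    static boolean allowed(AppEvent event, List<Class<? extends AppEvent>> filter) {
        return filter.stream().anyMatch(c -> c.isInstance(event));
    }

    public static void main(String[] args) {
        List<Class<? extends AppEvent>> filter = Arrays.asList(ItemDetails.class);
        System.out.println(allowed(new ItemDetails(), filter)); // true
        System.out.println(allowed(new InitApi(), filter));     // false
    }
}
```

This avoids the string comparison, and isInstance also matches subclasses, which getClass().getName() does not.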
I have an annotation processor. I retrieve the list of variables of the class annotated with my annotation this way:
@Override
public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
for (TypeElement annotation : annotations) {
Set<? extends Element> annotatedElements = roundEnv.getElementsAnnotatedWith(annotation);
for (Element el : annotatedElements) {
try {
generate(el);
} catch (IOException e) {
e.printStackTrace();
}
}
}
return true;
}
private void generate(Element annotatedElement) throws IOException {
TypeElement targetClass = (TypeElement) annotatedElement;
List<? extends Element> enclosedElements = targetClass.getEnclosedElements();
List<VariableElement> fields = new ArrayList<>(enclosedElements.size());
for (Element field : enclosedElements) {
if (field.getKind() == ElementKind.FIELD && !field.getModifiers().contains(Modifier.STATIC)) {
fields.add((VariableElement) field);
}
}
processFields(fields);
}
Now the fields list contains all the variables. But I need to extract the TYPE of each variable, so for example if the class has
private String blabla I want to get String.
I am doing it the following way:
void processFields(List<VariableElement> elements){
List<String> types = elements
.stream()
.map( e -> e.asType().toString())
.collect(Collectors.toList());
}
The types list now contains the types of all variables. However, if I have a class like this
@MyAnnotation
public class Test{
@NotBlank(message = "message")
private String name;
}
The type of the variable name is not String, it's
(@javax.validation.constraints.NotBlank(message="message") :: java.lang.String
How can I get just the type without the annotations? Is there any way?
Thanks for the help!
Add members and override init() to get the instances of Elements and Types.
...
import javax.lang.model.util.Elements;
import javax.lang.model.util.Types;
...
/** See {@link javax.annotation.processing.ProcessingEnvironment#getElementUtils()}. */
protected Elements elements = null;
/** See {@link javax.annotation.processing.ProcessingEnvironment#getTypeUtils()}. */
protected Types types = null;
...
@Override
public void init(ProcessingEnvironment processingEnv) {
super.init(processingEnv); // keep AbstractProcessor's own processingEnv reference intact
elements = processingEnv.getElementUtils();
types = processingEnv.getTypeUtils();
}
...
FYI, you can simplify your generate(Element) method:
...
import javax.lang.model.util.ElementFilter;
...
private void generate(Element annotatedElement) throws IOException {
// fieldsIn() filters the given elements by kind, so pass the enclosed elements,
// not the annotated class itself
List<VariableElement> fields =
ElementFilter.fieldsIn(annotatedElement.getEnclosedElements()).stream()
.filter(t -> !t.getModifiers().contains(Modifier.STATIC))
.collect(Collectors.toList());
processFields(fields);
}
Your processFields(List<VariableElement>) method could look something like below. I made all the steps explicit, but you can look at the TypeElement and Name methods to determine which String you want.
void processFields(List<VariableElement> in) {
List<String> out =
in.stream()
.map(e -> e.asType()) /* javax.lang.model.type.TypeMirror */
.map(e -> types.asElement(e)) /* javax.lang.model.element.Element */
.map(e -> (TypeElement) e) /* javax.lang.model.element.TypeElement */
.map(e -> e.getQualifiedName()) /* javax.lang.model.element.Name */
.map(e -> e.toString())
.collect(Collectors.toList());
}
I built a declarative HTTP client in my app, which is built on Micronaut. It needs to consume a service that responds with the text/html content type.
I manage to get a list, but with LinkedHashMap instances inside, and I want them to be Pharmacy objects.
My question is: how can I transform that response into a List of objects?
@Client("${services.url}")
public interface PharmacyClient {
@Get("${services.path}?${services.param}=${services.value}")
Flowable<List<Pharmacy>> retrieve();
}
public class StoreService {
private final PharmacyClient pharmacyClient;
public StoreService(PharmacyClient pharmacyClient) {
this.pharmacyClient = pharmacyClient;
}
public Flowable<List<Store>> all() {
Flowable<List<Pharmacy>> listFlowable = this.pharmacyClient.retrieve();
return listFlowable
.doOnError(throwable -> log.error(throwable.getLocalizedMessage()))
.flatMap(pharmacies ->
Flowable.just(pharmacies.stream() // here is a list of LinkedHashMap and I'd like to use Pharmacy objects
.map(pharmacy -> Store.builder().borough(pharmacy.getBoroughFk()).build())
.collect(Collectors.toList())
)
);
}
}
Code: https://github.com/j1cs/drugstore-demo/tree/master/backend
AFAIK there is no fully-fledged framework that provides HTML-content-to-POJO mapping (usually referred to as scraping) for Micronaut.
Meanwhile, you can easily plug in a converter bean based on jspoon, intercepting your API results and transforming them into the equivalent POJOs:
class Root {
@Selector(value = ".pharmacy") List<Pharmacy> pharmacies;
}
class Pharmacy {
@Selector(value = "span:nth-child(1)") String name;
}
@Client("${services.minsal.url}")
public interface PharmacyClient {
@Get("${services.minsal.path}?${services.minsal.param}=${services.minsal.value}")
Flowable<String> retrieve();
}
@Singleton
public class ConverterService {
public List<Pharmacy> toPharmacies(String htmlContent) {
Jspoon jspoon = Jspoon.create();
HtmlAdapter<Root> htmlAdapter = jspoon.adapter(Root.class);
return htmlAdapter.fromHtml(htmlContent).pharmacies;
}
}
public class StoreService {
private final PharmacyClient pharmacyClient;
private final ConverterService converterService;
public StoreService(PharmacyClient pharmacyClient, ConverterService converterService) {
this.pharmacyClient = pharmacyClient;
this.converterService = converterService;
}
public Flowable<List<Store>> all() {
Flowable<List<Pharmacy>> listFlowable = this.pharmacyClient.retrieve().map(this.converterService::toPharmacies);
return listFlowable
.doOnError(throwable -> log.error(throwable.getLocalizedMessage()))
.flatMap(pharmacies ->
Flowable.just(pharmacies.stream() // now a list of Pharmacy objects
.map(pharmacy -> Store.builder().borough(pharmacy.getBoroughFk()).build())
.collect(Collectors.toList())
)
);
}
}
I ended up with this.
@Client("${services.url}")
public interface PharmacyClient {
@Get(value = "${services.path}?${services.param}=${services.value}")
Flowable<Pharmacy[]> retrieve();
}
public class StoreService {
private final PharmacyClient pharmacyClient;
public StoreService(PharmacyClient pharmacyClient) {
this.pharmacyClient = pharmacyClient;
}
public Flowable<List<Store>> all() {
Flowable<Pharmacy[]> flowable = this.pharmacyClient.retrieve();
return flowable
.switchMap(pharmacies ->
Flowable.just(Arrays.stream(pharmacies)
.map(pharmacyStoreMapping)
.collect(Collectors.toList())
)
).doOnError(throwable -> log.error(throwable.getLocalizedMessage()));
}
}
Still, I want to know whether I can change the array to a list in the declarative client.
Meanwhile, I think this is a good option.
UPDATE
I have been wrong all this time. First of all, I don't need to wrap the elements in a list in the Flowable, because when the framework exposes the service it already responds with a list of elements.
So finally I did this:
@Client("${services.url}")
public interface PharmacyClient {
@Get(value = "${services.path}?${services.param}=${services.value}")
Flowable<Pharmacy> retrieve();
}
public class StoreService {
private final PharmacyClient pharmacyClient;
public StoreService(PharmacyClient pharmacyClient) {
this.pharmacyClient = pharmacyClient;
}
public Flowable<Store> all() {
Flowable<Pharmacy> flowable = this.pharmacyClient.retrieve();
return flowable
.switchMap(pharmacyPublisherFunction)
.doOnError(throwable -> log.error(throwable.getLocalizedMessage()));
}
}
As we can see, the HTTP client automatically transforms the text/html data into JSON and parses it well. I don't really know why. Maybe @JeffScottBrown can give us some hints.
Problem statement:
I have to process request similar to a pipeline.
For example:
When a request comes, it has to undergo a sequence of operations, like (step1,step2,step3...).
So, in order to achieve that, I am using the Template Method design pattern.
Please review and suggest whether I am implementing this correctly, or whether there is a better solution.
I suspect my approach will introduce code smells, as I am changing the values of objects very frequently.
Also, please suggest whether and how I can use Java 8 to accomplish this.
Thanks.
Code:
package com.example.demo.design;
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
public abstract class Template {
@Autowired
private Step1 step1;
@Autowired
private Step2 step2;
@Autowired
private Save save;
List<String> stepOutput = null;
List<String> stepOutputTwo = null;
List<String> stepOutputThree = null;
public void step1(String action1) {
stepOutput = step1.method(action1);
}
public void step2(String action2) {
stepOutputTwo = step2.method(stepOutput, action2);
}
abstract public void step3();
public void save() {
save.persist(stepOutputThree);
}
final public void run(String action1, String action2) {
step1(action1);
step2(action2);
step3(); // step3() is void, so it cannot be assigned; it fills stepOutputThree in the subclass
save();
}
}
In Java 8 streams model, that could look like the following:
final public void run(String action1, String action2) {
Stream.of(action1)                      // Stream<String>
.map(s -> step1.method(s))              // Stream<List<String>>
.map(l -> step2.method(l, action2))     // Stream<List<String>>
.map(l -> step3.method(l))              // Stream<List<String>>
.forEach(l -> save.persist(l));
}
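Another plain Java 8 option, if you don't actually need a Stream, is ordinary function composition with Function.andThen. The sketch below replaces your step beans with trivial stand-in lambdas just to show the chaining:

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;

public class PipelineDemo {
    // Each stage is a Function; andThen chains them in declaration order.
    static List<String> run(String input) {
        Function<String, List<String>> step1 = s -> Arrays.asList(s + "-1");
        Function<List<String>, List<String>> step2 =
                l -> l.stream().map(s -> s + "-2").collect(Collectors.toList());
        Function<List<String>, List<String>> step3 =
                l -> l.stream().map(s -> s + "-3").collect(Collectors.toList());
        return step1.andThen(step2).andThen(step3).apply(input);
    }

    public static void main(String[] args) {
        System.out.println(run("a")); // [a-1-2-3]
    }
}
```

Compared to Stream.of(...), this makes the pipeline reusable as a single Function value you can pass around or extend.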
I had the same issue! You can do something like this; the uncheckCall method is for handling exceptions.
final public void run(String action1, String action2) {
//other stuffs
Stream.of(step1.method(action1))
.map(stepOutput->uncheckCall(() ->step2.method(stepOutput,action2)))
.forEach(stepOutputThree -> uncheckCall(()->save.persist(stepOutputThree)));
//.....
}
for uncheckCall method:
public static <T> T uncheckCall(Callable<T> callable) {
try {
return callable.call();
} catch (Exception e) {
throw new RuntimeException(e); // or BusinessException.wrap(e)
}
}
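As a standalone illustration (with a trivial Callable in place of your steps, so the names here are purely illustrative), the wrapper lets checked exceptions pass through a stream pipeline as unchecked ones:

```java
import java.util.concurrent.Callable;
import java.util.stream.Stream;

public class UncheckDemo {
    // Re-wraps any checked exception as an unchecked one so lambdas stay clean.
    public static <T> T uncheckCall(Callable<T> callable) {
        try {
            return callable.call();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        String out = Stream.of("in")
                // in real code the Callable body may declare checked exceptions
                .map(s -> uncheckCall(() -> s.toUpperCase()))
                .findFirst()
                .orElseThrow(IllegalStateException::new);
        System.out.println(out); // IN
    }
}
```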
Well, when there are "pipelines", "sequences of operations", etc., the first design pattern that comes to mind is Chain of Responsibility, which provides you with these benefits:
allows you to add new handlers when necessary (e.g. at runtime) without modifying other handlers and processing logic (Open/Closed Principle of SOLID)
allows a handler to stop processing a request if necessary
allows you to decouple processing logic of the handlers from each other (Single Responsibility Principle of SOLID)
allows you to define the order of the handlers to process a request outside of the handlers themselves
One example of real-world usage is Servlet filters, where a filter calls chain.doFilter(req, resp) to invoke the next handler:
protected void doFilter(HttpServletRequest req, HttpServletResponse resp, FilterChain chain) {
if (haveToInvokeNextHandler) {
chain.doFilter(req, resp);
}
}
In case of using classical Chain of Responsibility pattern your processing pipeline may look like the following:
API
public class StepContext {
private Map<String, Object> attributes = new HashMap<>();
public <T> T getAttribute(String name) {
return (T) attributes.get(name);
}
public void setAttribute(String name, Object value) {
attributes.put(name, value);
}
}
public interface Step {
void handle(StepContext ctx);
}
public abstract class AbstractStep implements Step {
private Step next;
public AbstractStep() {
}
public AbstractStep(Step next) {
this.next = next;
}
protected void next(StepContext ctx) {
if (next != null) {
next.handle(ctx);
}
}
}
Implementation
public class Step1 extends AbstractStep {
public Step1(Step next) {
super(next);
}
public void handle(StepContext ctx) {
String action1 = ctx.getAttribute("action1");
List<String> output1 = doSomething(action1);
ctx.setAttribute("output1", output1);
next(ctx); // invoke next step
}
}
public class Step2 extends AbstractStep {
public Step2(Step next) {
super(next);
}
public void handle(StepContext ctx) {
String action2 = ctx.getAttribute("action2");
List<String> output1 = ctx.getAttribute("output1");
List<String> output2 = doSomething(output1, action2);
ctx.setAttribute("output2", output2);
next(ctx); // invoke next step
}
}
public class Step3 extends AbstractStep {
public Step3(Step next) {
super(next);
}
public void handle(StepContext ctx) {
String action2 = ctx.getAttribute("action2");
List<String> output2 = ctx.getAttribute("output2");
persist(output2);
next(ctx); // invoke next step
}
}
Client code
Step step3 = new Step3(null);
Step step2 = new Step2(step3);
Step step1 = new Step1(step2);
StepContext ctx = new StepContext();
ctx.setAttribute("action1", action1);
ctx.setAttribute("action2", action2);
step1.handle(ctx);
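Condensed into one runnable sketch (with doSomething and persist reduced to trivial string placeholders), the classical chain wiring above behaves like this:

```java
import java.util.HashMap;
import java.util.Map;

public class ChainDemo {
    static class StepContext {
        private final Map<String, Object> attributes = new HashMap<>();
        @SuppressWarnings("unchecked")
        <T> T getAttribute(String name) { return (T) attributes.get(name); }
        void setAttribute(String name, Object value) { attributes.put(name, value); }
    }

    interface Step { void handle(StepContext ctx); }

    static abstract class AbstractStep implements Step {
        private final Step next;
        AbstractStep(Step next) { this.next = next; }
        protected void next(StepContext ctx) { if (next != null) next.handle(ctx); }
    }

    static String run(String action1) {
        // Step2 consumes Step1's output from the shared context.
        Step step2 = new AbstractStep(null) {
            public void handle(StepContext ctx) {
                String out1 = ctx.getAttribute("output1");
                ctx.setAttribute("output2", out1 + "+2");
            }
        };
        Step step1 = new AbstractStep(step2) {
            public void handle(StepContext ctx) {
                String a1 = ctx.getAttribute("action1");
                ctx.setAttribute("output1", a1 + "+1");
                next(ctx); // hand off to the next handler
            }
        };
        StepContext ctx = new StepContext();
        ctx.setAttribute("action1", action1);
        step1.handle(ctx);
        return ctx.getAttribute("output2");
    }

    public static void main(String[] args) {
        System.out.println(run("a")); // a+1+2
    }
}
```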
Also, all this can be simplified into a chain of handlers decoupled from each other by removing the next references, in case your processing pipeline always has to invoke every available step, without letting the previous step control whether the next one runs:
API
public class StepContext {
private Map<String, Object> attributes = new HashMap<>();
public <T> T getAttribute(String name) {
return (T) attributes.get(name);
}
public void setAttribute(String name, Object value) {
attributes.put(name, value);
}
}
public interface Step {
void handle(StepContext ctx);
}
Implementation
public class Step1 implements Step {
public void handle(StepContext ctx) {
String action1 = ctx.getAttribute("action1");
List<String> output1 = doSomething(action1);
ctx.setAttribute("output1", output1);
}
}
public class Step2 implements Step {
public void handle(StepContext ctx) {
String action2 = ctx.getAttribute("action2");
List<String> output1 = ctx.getAttribute("output1");
List<String> output2 = doSomething(output1, action2);
ctx.setAttribute("output2", output2);
}
}
public class Step3 implements Step {
public void handle(StepContext ctx) {
String action2 = ctx.getAttribute("action2");
List<String> output2 = ctx.getAttribute("output2");
persist(output2);
}
}
Client code
Note that in case of Spring framework (just noticed #Autowired annotation) the client code may be simplified even more as the #Autowired annotation can be used to inject all the beans of the corresponding type into a corresponding collection.
Here what the documentation states:
Autowiring Arrays, Collections, and Maps
In case of an array, Collection, or Map dependency type, the container autowires all beans matching the declared value type. For such purposes, the map keys must be declared as type String which will be resolved to the corresponding bean names. Such a container-provided collection will be ordered, taking into account Ordered and #Order values of the target components, otherwise following their registration order in the container. Alternatively, a single matching target bean may also be a generally typed Collection or Map itself, getting injected as such.
public class StepsInvoker {
// spring will put all the steps into this collection in the order they were declared
// within the spring context (or by means of the `@Order` annotation)
@Autowired
private List<Step> steps;
public void invoke(String action1, String action2) {
StepContext ctx = new StepContext();
ctx.setAttribute("action1", action1);
ctx.setAttribute("action2", action2);
steps.forEach(step -> step.handle(ctx));
}
}
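Without Spring on the classpath, the same invoker logic can be exercised by constructing the list by hand; the list below plays the role of the @Autowired List<Step>, and the placeholder steps are illustrative:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class StepsInvokerDemo {
    interface Step { void handle(Map<String, Object> ctx); }

    // Invokes every step in list order, like the Spring-injected List<Step> would.
    static void invoke(List<Step> steps, Map<String, Object> ctx) {
        steps.forEach(step -> step.handle(ctx));
    }

    public static void main(String[] args) {
        Map<String, Object> ctx = new HashMap<>();
        ctx.put("action1", "a");
        List<Step> steps = Arrays.asList(
                c -> c.put("output1", c.get("action1") + "+1"),
                c -> c.put("output2", c.get("output1") + "+2"));
        invoke(steps, ctx);
        System.out.println(ctx.get("output2")); // a+1+2
    }
}
```

Ordering is the only contract: each step may rely on attributes written by the steps before it.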
I want to wrap a Retrofit API call in another method where I can additionally show/hide a loader, check for network availability, etc. As my API returns an Observable, what I ended up with is below:
private <T> Observable<T> request(final Observable<T> apiCall, final ViewManager viewManager) {
return Observable.create(new Action1<Emitter<T>>() {
@Override
public void call(final Emitter<T> emitter) {
if (!NetworkUtils.isConnected(context)) {
emitter.onError(new ConnectException("network not connected"));
return;
}
viewManager.showLoader();
apiCall.subscribeOn(Schedulers.io())
.observeOn(AndroidSchedulers.mainThread())
.subscribe(new Observer<T>() {
@Override
public void onCompleted() {
viewManager.hideLoader();
emitter.onCompleted();
}
@Override
public void onError(Throwable e) {
viewManager.hideLoader();
emitter.onError(e);
}
@Override
public void onNext(T response) {
emitter.onNext(response);
}
});
}
}, Emitter.BackpressureMode.BUFFER);
}
Is this a standard way of dealing with the problem? How do you wrap an observable inside another observable? Can anyone guide?
The idiomatic way with reactive extensions is to use composition; this is one of the great powers of Rx.
First, let's define the desired behaviors using operators. What you want is something like this:
apiCall
.observeOn(AndroidSchedulers.mainThread())
.startWith(Observable.defer(() -> {
if (!NetworkUtils.isConnected(context)) {
return Observable.error(new ConnectException("network not connected"));
} else {
return Observable.empty();
}
}))
.doOnSubscribe(() -> viewManager.showLoader())
.doOnCompleted(() -> viewManager.hideLoader())
.doOnError(throwable -> viewManager.hideLoader());
Now, to compose this with any network apiCall Observable, you can use the compose() operator and encapsulate the logic in a Transformer:
class CustomTransformer<T> implements Observable.Transformer<T, T> {
private final ViewManager viewManager;
private final Context context;
CustomTransformer(ViewManager viewManager, Context context) {
this.viewManager = viewManager;
this.context = context;
}
@Override
public Observable<T> call(Observable<T> apiCall) {
return apiCall
.observeOn(AndroidSchedulers.mainThread())
.startWith(Observable.defer(() -> {
if (!NetworkUtils.isConnected(context)) {
return Observable.error(new ConnectException("network not connected"));
} else {
return Observable.empty();
}
}))
.doOnSubscribe(() -> viewManager.showLoader())
.doOnCompleted(() -> viewManager.hideLoader())
.doOnError(throwable -> viewManager.hideLoader());
}
}
then you can compose it with any network Observable:
someRetrofitQuery
.compose(new CustomTransformer<>(viewManager, context))
...
.subscribe();
Our processor returns a List<?> (effectively passing a List<List<?>>) to our ItemWriter.
Now, we observed that JdbcBatchItemWriter is not programmed to handle item instanceof List. We also observed that to process item instanceof List, we need to write a custom ItemSqlParameterSourceProvider.
But the sad part is that it returns an SqlParameterSource, which can handle only one item and again is not capable of handling a List.
So, can someone help us understand how to handle a list of lists in the JdbcBatchItemWriter?
Typically, the design pattern is:
Reader -> reads something, returns ReadItem
Processor -> ingests ReadItem, returns ProcessedItem
Writer -> ingests List<ProcessedItem>
If your processor is returning List<Object>, then you need your Writer to expect List<List<Object>>.
You could do this by wrapping your JdbcBatchItemWriter as a delegate in an ItemWriter that looks something like this:
public class ListUnpackingItemWriter<T> implements ItemWriter<List<T>>, ItemStream, InitializingBean {
private ItemWriter<T> delegate;
@Override
public void write(final List<? extends List<T>> lists) throws Exception {
final List<T> consolidatedList = new ArrayList<>();
for (final List<T> list : lists) {
consolidatedList.addAll(list);
}
delegate.write(consolidatedList);
}
@Override
public void afterPropertiesSet() {
Assert.notNull(delegate, "You must set a delegate!");
}
@Override
public void open(ExecutionContext executionContext) {
if (delegate instanceof ItemStream) {
((ItemStream) delegate).open(executionContext);
}
}
@Override
public void update(ExecutionContext executionContext) {
if (delegate instanceof ItemStream) {
((ItemStream) delegate).update(executionContext);
}
}
@Override
public void close() {
if (delegate instanceof ItemStream) {
((ItemStream) delegate).close();
}
}
public void setDelegate(ItemWriter<T> delegate) {
this.delegate = delegate;
}
}
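The consolidation loop in write() is just list flattening; with Java 8 streams the same step can be written as a one-liner (equivalent behavior, a matter of taste), shown here outside the writer:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class FlattenDemo {
    // Flattens a list of lists into a single list, preserving order.
    static <T> List<T> flatten(List<? extends List<T>> lists) {
        return lists.stream()
                .flatMap(List::stream)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<List<Integer>> chunks = Arrays.asList(Arrays.asList(1, 2), Arrays.asList(3));
        System.out.println(flatten(chunks)); // [1, 2, 3]
    }
}
```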
public class ListUnpackingItemWriter<T> extends FlatFileItemWriter<List<T>> {
@Override
public void afterPropertiesSet() {
setLineAggregator(item -> String.join("\n", item.stream().map(T::toString).collect(Collectors.toList())));
}
}
I just added a custom line aggregator to the above solution; this helps write the content to a file using FlatFileItemWriter<List<T>>. You can replace T with the actual class name to avoid a compilation error when calling the toString() method.
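Standalone, the aggregation that lambda performs is just joining each inner list's string forms with newlines (the class and method names here are illustrative):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class AggregatorDemo {
    // Mirrors the LineAggregator lambda: one output line per list element.
    static String aggregate(List<?> item) {
        return String.join("\n",
                item.stream().map(Object::toString).collect(Collectors.toList()));
    }

    public static void main(String[] args) {
        System.out.println(aggregate(Arrays.asList("a", "b"))); // prints "a" and "b" on two lines
    }
}
```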