Sample JSON request using RxJava and RxApacheHttp not working - Java

I'm trying to retrieve a JSON object from a request using RxJava.
In my example I have a Java RESTful service that works perfectly in the browser.
// Data object
@XmlRootElement
public class Person {
    private String firstName;
    private String lastName;
    // getters and setters
}
The RESTful resource:
@Path("/persons")
public class PersonResource {
    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public List<Person> getBandas() {
        Person paulo = new Person();
        paulo.setFirstName("Paulo Henrique");
        paulo.setLastName("Pereira Santana");

        List<Person> persons = new ArrayList<>();
        persons.add(paulo);
        return persons;
    }
}
As a result (in the browser: http://localhost:8080/learn-java-rs/persons) I get a JSON object:
{"Person": {
"firstName": "Paulo Henrique",
"lastName": "Pereira Santana"
}
}
I tried to make the same request using RxJava, but it did not work (or I did not understand the implementation). Here is the code:
public class Example {
    public static void main(String[] args) {
        try (CloseableHttpAsyncClient client = HttpAsyncClients.createDefault()) {
            client.start();
            Observable<Map> requestJson = requestJson(client, "http://localhost:8080/learn-java-rs/persons");
            Helpers.subscribePrint(
                    requestJson.map(json -> json.get("firstName") + " " + json.get("lastName")),
                    "person");
        } catch (IOException e1) {
            e1.printStackTrace();
        }
    }

    private static Map<String, Set<Map<String, Object>>> cache = new ConcurrentHashMap<>();

    @SuppressWarnings({"rawtypes", "unchecked"})
    private static Observable<Map> requestJson(HttpAsyncClient client, String url) {
        Observable<String> rawResponse = ObservableHttp.createGet(url, client)
                .toObservable()
                .flatMap(resp -> resp.getContent()
                        .map(bytes -> new String(bytes, java.nio.charset.StandardCharsets.UTF_8)))
                .retry(5)
                .cast(String.class)
                .map(String::trim)
                .doOnNext(resp -> getCache(url).clear());
        Observable<String> objects = rawResponse
                .filter(data -> data.startsWith("{"))
                .map(data -> "[" + data + "]");
        Observable<String> arrays = rawResponse
                .filter(data -> data.startsWith("["));
        Observable<Map> response = arrays.ambWith(objects)
                .map(data -> new Gson().fromJson(data, List.class))
                .flatMapIterable(list -> list)
                .cast(Map.class)
                .doOnNext(json -> getCache(url).add((Map<String, Object>) json));
        return Observable.amb(fromCache(url), response);
    }

    private static Observable<Map<String, Object>> fromCache(String url) {
        return Observable.from(getCache(url)).defaultIfEmpty(null)
                .flatMap(json -> (json == null) ? Observable.never() : Observable.just(json))
                .doOnNext(json -> json.put("person", true));
    }

    private static Set<Map<String, Object>> getCache(String url) {
        if (!cache.containsKey(url)) {
            cache.put(url, new HashSet<Map<String, Object>>());
        }
        return cache.get(url);
    }
}
Edit
public static <T> Subscription subscribePrint(Observable<T> observable, String name) {
    return observable.subscribe(
            (v) -> System.out.println(Thread.currentThread().getName() + "|" + name + " : " + v),
            (e) -> {
                System.err.println("Error from " + name + ":");
                System.err.println(e);
                System.err.println(Arrays.stream(e.getStackTrace())
                        .limit(5L)
                        .map(stackEl -> "  " + stackEl)
                        .collect(Collectors.joining("\n")));
            },
            () -> System.out.println(name + " ended!"));
}
When I run it, nothing happens.
Could someone tell me what I'm missing?
Note: I used RxJava 1.1.0 and rxapache-http 0.21.0.
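One likely culprit (an assumption, since I can't run your setup): the try-with-resources block closes the CloseableHttpAsyncClient as soon as main reaches the end of the try, which can happen before the asynchronous response is ever delivered, so the subscriber never fires. Here is a minimal stdlib-only sketch of the fix, blocking on a CountDownLatch until the async callback completes; the executor merely stands in for the async HTTP client:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;

public class AwaitAsyncExample {
    // Simulates subscribing to the HTTP observable: the "response" arrives
    // on a background thread some time after the subscription is made.
    static String fetchAndWait() {
        ExecutorService client = Executors.newSingleThreadExecutor();
        CountDownLatch done = new CountDownLatch(1);
        AtomicReference<String> result = new AtomicReference<>("<no response>");
        try {
            client.submit(() -> {
                try {
                    Thread.sleep(200); // simulated network latency
                } catch (InterruptedException ignored) {
                }
                result.set("Paulo Henrique Pereira Santana");
                done.countDown();
            });
            // Without this await, the caller would fall out of the
            // try-with-resources block and close the client before the
            // response arrived -- and "nothing happens".
            done.await(5, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        } finally {
            client.shutdown();
        }
        return result.get();
    }

    public static void main(String[] args) {
        System.out.println("person : " + fetchAndWait());
    }
}
```

In the real code the same effect can be had by calling toBlocking() on the observable, or by awaiting a latch counted down in the subscriber's onCompleted callback, so the client stays open until the stream terminates.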

Related

Publisher/Subscriber in JAX-RS - List not updating

So I'm trying to implement a publisher/subscriber pattern in JAX-RS; however, it seems that after the subscriber subscribes, the publisher cannot find the subscription.
Server Code:
@GET
@Path("{id}/subscribe")
public void subscribe(@Suspended AsyncResponse asyncResponse, @PathParam("id") Long id) {
    if (responses.containsKey(id)) {
        responses.get(id).add(asyncResponse);
    } else {
        List<AsyncResponse> newList = new ArrayList<>();
        newList.add(asyncResponse);
        responses.put(id, newList);
    }
    System.out.println(responses.size());
}
@POST
@Path("{id}/publish")
@Consumes(MediaType.TEXT_PLAIN)
public void publish(String message, @PathParam("id") Long id) {
    System.out.println(responses.size());
    List<AsyncResponse> responseList = responses.get(id);
    if (responseList == null) {
        return;
    }
    for (AsyncResponse response : responseList) {
        response.resume(message);
    }
    responseList.clear();
}
Client Code:
public void subscribeToConcertNews(ConcertDTO concertDTO) {
    Response response = null;
    String url = CONCERT_URL + "/" + concertDTO.getId() + "/subscribe";
    ClientBuilder.newClient().target(url)
            .request()
            .async()
            .get(new InvocationCallback<String>() {
                @Override
                public void completed(String s) {
                    System.out.println(s);
                    _client.target(CONCERT_URL + "/" + concertDTO.getId() + "/subscribe")
                            .request()
                            .async()
                            .get(this);
                }

                @Override
                public void failed(Throwable throwable) {
                    throw new ServiceException("");
                }
            });
}
public void publishToConcertNews(ConcertDTO concertDTO, String message) {
    Response response = _client.target(CONCERT_URL + "/" + concertDTO.getId() + "/publish")
            .request()
            .post(Entity.entity("News!", MediaType.TEXT_PLAIN));
}
Testing Code:
ConcertDTO concertDTO = new ConcertDTO(1L, "123", new HashSet<>(), new HashMap<>(), new HashSet<>());
_service.subscribeToConcertNews(concertDTO);
_service.publishToConcertNews(concertDTO, "213123");
After the subscription, the size of the map is 1; however, when news is published, the map holding the responses has size 0. So the AsyncResponse stored in the map is disappearing. Any help would be appreciated!
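One thing worth checking (an assumption about your setup, since the declaration of responses isn't shown): by default JAX-RS creates a new resource instance for every request, so an instance field holding the subscriptions is empty again when the publish request arrives in a fresh instance. A stdlib-only sketch of the difference, where the hypothetical PerRequestStateDemo class stands in for the resource:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class PerRequestStateDemo {
    // Survives across instances, like a static field or a field on a
    // singleton-scoped resource.
    static final Map<Long, List<String>> SHARED = new ConcurrentHashMap<>();
    // Re-created with every instance, like a field on a per-request resource.
    final Map<Long, List<String>> perInstance = new ConcurrentHashMap<>();

    void subscribe(Long id, String subscriber) {
        SHARED.computeIfAbsent(id, k -> new ArrayList<>()).add(subscriber);
        perInstance.computeIfAbsent(id, k -> new ArrayList<>()).add(subscriber);
    }

    int sharedCount(Long id) {
        return SHARED.getOrDefault(id, new ArrayList<>()).size();
    }

    int perInstanceCount(Long id) {
        return perInstance.getOrDefault(id, new ArrayList<>()).size();
    }

    public static void main(String[] args) {
        new PerRequestStateDemo().subscribe(1L, "client-1"); // the GET /1/subscribe request
        PerRequestStateDemo publishRequest = new PerRequestStateDemo(); // fresh instance for POST /1/publish
        System.out.println("shared sees " + publishRequest.sharedCount(1L)
                + ", per-instance sees " + publishRequest.perInstanceCount(1L));
        // prints: shared sees 1, per-instance sees 0
    }
}
```

If that is what's happening, making responses a static ConcurrentHashMap, or registering the resource as a singleton, keeps the AsyncResponse list visible to the publish call.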

Kafka Streams not sending records to output topics

I have a Kafka Streams application that receives records on an input topic, does some stream processing, and sends the processed records to multiple output topics. It was running perfectly fine until I stopped it.
After stopping the first Streams app, I created a different Streams application with different input and output topics.
Next, I started both applications; my second application worked perfectly, but my first application stopped doing any processing at all.
When I start a console consumer on the input topics I can see records being produced, but when I start a console consumer on the output topics I do not receive any records. I don't know what could have gone wrong. I thought it might be because of the second Streams application, so I stopped it, recreated all the topics, and restarted the first application, but it is still not doing anything.
Note: I am doing this on a local server.
Next, I built the Streams application on my personal machine, and it works as expected.
What could have gone wrong that I am seeing this unexpectedly strange behavior?
Here is my code:
Main Class:
public class Pipe {
static Logger log = Logger.getLogger(Pipe.class.getName());
public static void main(String[] args) throws Exception {
PropertyConfigurator.configure("log4j.properties");
log.info("Starting application");
Map<String, String> env = System.getenv();
Properties props = new Properties();
String BROKER_URL = env.get("BROKER_URL");
String appId = "98aff1c5-7a69-46b7-899c-186851054b43";
String appSecret = "zVyS/V694ffWe99QpCvYqE1sqeqLo36uuvTL8gmZV0A=";
String appTenant = "2f6cb1a6-ecb8-4578-b680-bf84ded07ff4";
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "streams-pipe");
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // pass from env localhost:9092 | BROKER_URL + ":9092"
props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
final StreamsBuilder builder = new StreamsBuilder();
log.info("Creating stream: o365_storage");
KStream<String, String> source_o365_storage = builder.stream("o365_storage");
log.info("Creating stream: scan_result_dlp");
KStream<String, String> source_scan_result_dlp = builder.stream("scan_result_dlp");
log.info("Creating stream: scan_result_malware");
KStream<String, String> source_scan_result_malware = builder.stream("scan_result_malware");
log.info("Creating stream: source_o365_user_contenturl");
//KStream<String, String> source_o365_contenturl = builder.stream("o365_activity_contenturl");
KStream<String, String> source_o365_user_contenturl = builder.stream("o365_activity_contenturl");
log.info("Creating stream: source_o365_contenturl_result");
KStream<String, String> source_o365_contenturl_result = source_o365_user_contenturl.flatMapValues(new ValueMapper<String, Iterable<String>>() {
@Override
public Iterable<String> apply(String value) {
ArrayList<String> keywords = new ArrayList<String>();
ExecutorService executor = new ThreadPoolExecutor(4, 4, 1, TimeUnit.MINUTES, new LinkedBlockingQueue<Runnable>());
try {
String accessToken = O365Util.getAccessToken(appId, appSecret, appTenant);
System.out.println("accessToken : " + accessToken);
System.out.println("Creating futures..");
List<Future<?>> futures = new ArrayList<Future<?>>();
JSONArray contentUrlList = new JSONArray(value);
for (int i = 0; i < contentUrlList.length(); i++) {
JSONObject contentUri = contentUrlList.getJSONObject(i);
//futures.add(executor.submit(new FetchLogService(accessToken, contentUri.getString("contentUri"))));
futures.add(executor.submit(new FetchLogService(accessToken, contentUri, appTenant)));
}
System.out.println("futures size is : " + futures.size());
for (Future<?> f : futures) {
if (f.get() != null) {
//System.out.println("Executing contentUri parallel....................... ");
String futureResult = f.get().toString();
if (String.valueOf(futureResult.charAt(0)).equalsIgnoreCase("[")) {
//System.out.println("futureResult is JSONArray");
JSONArray logList = new JSONArray(futureResult);
for (int k = 0; k < logList.length(); k++) {
JSONObject log = logList.getJSONObject(k);
//System.out.println("Added logs into Events for action : " + log.getString("Operation"));
keywords.add(log.toString());
}
} else {
System.out.println("futureResult is JSONObject");
JSONObject contentUrlObj = new JSONObject(futureResult);
keywords.add(contentUrlObj.toString());
}
} else {
System.out.println("future result is nullllllllllllllllllllllllllllllllllllllll");
}
}
} catch (Exception e) {
System.err.println("Unable to convert to json");
e.printStackTrace();
} finally {
executor.shutdownNow();
}
return keywords;
}
});
log.info("Creating stream: source_o365_user_activity_intermediate");
KStream<String, String> source_o365_user_activity_intermediate = source_o365_contenturl_result.flatMapValues(new ValueMapper<String, Iterable<String>>() {
@Override
public Iterable<String> apply(String value) {
ArrayList<String> keywords = new ArrayList<String>();
try {
if (value.contains("Operation\":")) {
keywords.add(value);
}
} catch (Exception e) {
System.err.println("Unable to convert to json");
e.printStackTrace();
}
return keywords;
}
});
source_o365_user_activity_intermediate.to("o365_user_activity");
log.info("Creating stream: o365_contenturls");
KStream<String, String> o365_contenturls = source_o365_contenturl_result.flatMapValues(new ValueMapper<String, Iterable<String>>() {
@Override
public Iterable<String> apply(String value) {
ArrayList<String> keywords = new ArrayList<String>();
try {
if (value.contains("contentUri\":")) {
keywords.add("["+value+"]");
}
} catch (Exception e) {
System.err.println("Unable to convert to json");
e.printStackTrace();
}
return keywords;
}
});
o365_contenturls.to("o365_activity_contenturl");
log.info("Creating stream: o365_user_activity");
KStream<String, String> source_o365_user_activity = builder.stream("o365_user_activity");
log.info("Creating branch: branches_source_o365_user_activity");
@SuppressWarnings("unchecked")
KStream<String, String>[] branches_source_o365_user_activity = source_o365_user_activity.branch(
(key, value) -> (value.contains("Operation\":\"SharingSet") && value.contains("ItemType\":\"File")), // Sharing Set by Date
(key, value) -> (value.contains("Operation\":\"AddedToSecureLink") && value.contains("ItemType\":\"File")), // Added to secure link
(key, value) -> (value.contains("Operation\":\"AddedToGroup")), // Added to group
(key, value) -> (value.contains("Operation\":\"Add member to role.") || value.contains("Operation\":\"Remove member from role.")),//Role update by date
(key, value) -> (value.contains("Operation\":\"FileUploaded") || value.contains("Operation\":\"FileDeleted")
|| value.contains("Operation\":\"FileRenamed") || value.contains("Operation\":\"FileMoved")), // Upload file by date
(key, value) -> (value.contains("Operation\":\"UserLoggedIn")), // User logged in by date
(key, value) -> (value.contains("Operation\":\"Delete user.") || value.contains("Operation\":\"Add user.")
&& value.contains("ResultStatus\":\"success")) // Manage user by date
);
log.info("Creating branch: branches1_source_o365_user_activity");
@SuppressWarnings("unchecked")
KStream<String, String>[] branches1_source_o365_user_activity = source_o365_user_activity.branch(
(key, value) -> (value.contains("Operation\":\"FileUploaded") || value.contains("Operation\":\"FileModified")
|| value.contains("Operation\":\"FileDeleted")), // File update by date
(key, value) -> (value.contains("Operation\":\"FileAccessed")) // File access by date
);
log.info("Creating branch: branches2_source_o365_user_activity");
@SuppressWarnings("unchecked")
KStream<String, String>[] branches2_source_o365_user_activity = source_o365_user_activity.branch(
(key, value) -> (value.contains("Operation\":\"FileUploaded") || value.contains("Operation\":\"FileModified")
|| value.contains("Operation\":\"FileDeleted") || value.contains("Operation\":\"SharingSet")
&& value.contains("ItemType\":\"File")) // File operation by date
);
log.info("Creating branch: branches3_source_o365_user_activity");
@SuppressWarnings("unchecked")
KStream<String, String>[] branches3_source_o365_user_activity = source_o365_user_activity.branch(
(key, value) -> (value.contains("Workload\":\"AzureActiveDirectory") || value.contains("Workload\":\"OneDrive") || value.contains("Workload\":\"SharePoint")) // Activity log by date
);
log.info("Creating branch: branches4_source_o365_user_activity");
@SuppressWarnings("unchecked")
KStream<String, String>[] branches4_source_o365_user_activity = source_o365_user_activity.branch(
(key, value) -> (value.contains("Operation\":\"FileUploaded") || value.contains("Operation\":\"FileModified")) // Download file for scanning
);
/////////////////////////////////========================= DLP LOGS ========================/////////////////////////////////////////////////////////
AppUtil.pushToTopic(source_scan_result_dlp, Constant.O365_GTB_BREACHED_POLICY_BY_DATE, "o365_gtb_dlp_breached_policy_by_date");
//////////////////////////////////////==================== MALWARE LOGS ================================////////////////////////////////////////////
AppUtil.pushToTopic(source_scan_result_malware, Constant.O365_LAST_LINE_MALWARE, "o365_last_line_malware");
//////////////////////////////////////==================== ALL LOGS ====================================////////////////////////////////////////////
AppUtil.pushToTopic(source_o365_user_activity, Constant.O365_USER_ACTIVITY_BY_DATE, "o365_user_activity_by_date");
////////////////////////////////////====================== STORAGE LOGS ====================================////////////////////////////////////////////
AppUtil.pushToTopic(source_o365_storage, Constant.O365_STORAGE_BY_DATE, "o365_storage_by_date");
//////////////////////////////////////==================== BRANCH LOGS ====================================////////////////////////////////////////////
AppUtil.pushToTopic(branches_source_o365_user_activity[0], Constant.O365_SHARING_SET_BY_DATE, "o365_sharing_set_by_date", Constant.O365_SHARING_SET_BY_DATE_EXCEP_KEYS);
AppUtil.pushToTopic(branches_source_o365_user_activity[1], Constant.O365_ADDED_TO_SECURE_LINK_BY_DATE, "o365_added_to_secure_link_by_date");
AppUtil.pushToTopic(branches_source_o365_user_activity[2], Constant.O365_ADDED_TO_GROUP_BY_DATE, "o365_added_to_group_by_date");
AppUtil.pushToTopic(branches_source_o365_user_activity[3], Constant.O365_ROLE_UPDATE_BY_DATE, "o365_role_update_by_date");
AppUtil.pushToTopic(branches_source_o365_user_activity[4], Constant.O365_UPLOAD_FILE_BY_DATE, "o365_upload_file_by_date", Constant.O365_UPLOAD_FILE_BY_DATE_EXCEP_KEYS);
AppUtil.pushToTopic(branches_source_o365_user_activity[5], Constant.O365_USER_LOGGED_IN_BY_DATE, "o365_user_logged_in_by_date");
AppUtil.pushToTopic(branches_source_o365_user_activity[6], Constant.O365_MANAGE_USER_BY_DATE, "o365_manage_user_by_date");
////////////////////////////////////====================== BRANCH 1 LOGS ====================================////////////////////////////////////////////
AppUtil.pushToTopic(branches1_source_o365_user_activity[0], Constant.O365_FILE_UPDATE_BY_DATE, "o365_file_update_by_date");
AppUtil.pushToTopic(branches1_source_o365_user_activity[1], Constant.O365_FILE_ACCESS_BY_DATE, "o365_file_access_by_date");
////////////////////////////////////====================== BRANCH 2 LOGS ====================================////////////////////////////////////////////
AppUtil.pushToTopic(branches2_source_o365_user_activity[0], Constant.O365_FILE_OPERATION_BY_DATE, "o365_file_operation_by_date");
////////////////////////////////////====================== BRANCH 3 LOGS ====================================////////////////////////////////////////////
AppUtil.pushToTopic(branches3_source_o365_user_activity[0], Constant.O365_ACTIVITY_LOG_BY_DATE, "o365_activity_log_by_date");
////////////////////////////////////====================== BRANCH 4 LOGS ====================================////////////////////////////////////////////
branches4_source_o365_user_activity[0].to("download_file_for_scanning");
final Topology topology = builder.build();
final KafkaStreams streams = new KafkaStreams(topology, props);
final CountDownLatch latch = new CountDownLatch(1);
// attach shutdown handler to catch control-c
Runtime.getRuntime().addShutdownHook(new Thread("streams-shutdown-hook") {
@Override
public void run() {
log.trace("Exiting application.");
streams.close();
latch.countDown();
}
});
try {
streams.start();
latch.await();
} catch (Throwable e) {
System.exit(1);
}
System.exit(0);
}
}
AppUtil:
public final class AppUtil {
static Logger log = Logger.getLogger(Pipe.class.getName());
public static HashMap createHashMap(String[] keys, String[] values) {
HashMap<String, String> hmap = new HashMap<String, String>();
for (int i = 0; i < values.length; i++) {
hmap.put(keys[i], values[i]);
}
return hmap;
}
public static void pushToTopic(KStream<String, String> sourceTopic, HashMap<String, String> hmap, String destTopicName) {
log.info(destTopicName+ " inside function");
System.out.println(destTopicName + " inside function");
sourceTopic.flatMapValues(new ValueMapper<String, Iterable<String>>() {
@Override
public Iterable<String> apply(String value) {
log.info("================================================================================================================================================================================");
log.info("========> " + destTopicName + " Log:\n \n" + value);
System.out.println("================================================================================================================================================================================");
System.out.println("========> " + destTopicName + " Log:\n \n" + value);
ArrayList<String> keywords = new ArrayList<String>();
try {
JSONObject send = new JSONObject();
JSONObject received = processJSON(new JSONObject(value), destTopicName);
send.put("current_date", getCurrentDate().toString());
if (!destTopicName.equals("o365_storage_by_date")) {
send.put("insertion_time", getCurrentDateTime().toString());
}
boolean valid_json = true;
for(String key: hmap.keySet()) {
if (received.has(hmap.get(key))) {
send.put(key, received.get(hmap.get(key)));
}
else {
System.out.println("\n \n Missing Key in JSON: Cannot send log to destination topic = " + destTopicName + " | " + hmap.get(key) + " Key is missing.");
log.error("\n \n Missing Key in JSON: Cannot send log to destination topic = " + destTopicName + " | " + hmap.get(key) + " Key is missing.");
valid_json = false;
}
}
if (valid_json) {
keywords.add(send.toString());
}
// apply regex to value and for each match add it to keywords
} catch (Exception e) {
// TODO: handle exception
log.error("Unable to convert to json");
System.err.println("Unable to convert to json");
e.printStackTrace();
}
return keywords;
}
}).to(destTopicName);
}
//////////////////////////////////////
public static void pushToTopic(KStream<String, String> sourceTopic, HashMap<String, String> hmap, String destTopicName, String[] exceptionalKeys) {
sourceTopic.flatMapValues(new ValueMapper<String, Iterable<String>>() {
@Override
public Iterable<String> apply(String value) {
log.info("================================================================================================================================================================================");
log.info("========> " + destTopicName + " Log:\n \n" + value);
System.out.println("================================================================================================================================================================================");
System.out.println("========> " + destTopicName + " Log:\n \n" + value);
ArrayList<String> keywords = new ArrayList<String>();
try {
JSONObject send = new JSONObject();
JSONObject received = processJSON(new JSONObject(value), destTopicName);
send.put("current_date", getCurrentDate().toString());
if (!destTopicName.equals("o365_storage_by_date")) {
send.put("insertion_time", getCurrentDateTime().toString());
}
boolean valid_json = true;
for(String key: hmap.keySet()) {
if (received.has(hmap.get(key))) {
send.put(key, received.get(hmap.get(key)));
}
else {
System.out.println("\n \n Missing Key in JSON: Sending log to destination topic = " + destTopicName + " with null value | " + hmap.get(key) + " Key is missing.");
log.warn("\n \n Missing Key in JSON: Sending log to destination topic = " + destTopicName + " with null value | " + hmap.get(key) + " Key is missing.");
if(!isExceptionalKey(exceptionalKeys, hmap.get(key))) {
valid_json = false;
}
}
}
if (valid_json) {
keywords.add(send.toString());
}
// apply regex to value and for each match add it to keywords
} catch (Exception e) {
// TODO: handle exception
log.error("Unable to convert to json");
System.err.println("Unable to convert to json");
e.printStackTrace();
}
return keywords;
}
}).to(destTopicName);
}
//////////////////////////////////////
private static boolean isExceptionalKey(String[] exceptionalKeys, String currKey) {
// TODO Auto-generated method stub
boolean isExceptionalKey = false;
for (String string : exceptionalKeys) {
if (string.equals(currKey)) {
isExceptionalKey = true;
break;
}
}
return isExceptionalKey;
}
public static JSONObject processJSON(JSONObject jsonObj, String destTopicName) {
if (jsonObj.has("UserId")) {
String val = jsonObj.get("UserId").toString().toLowerCase();
jsonObj.remove("UserId");
jsonObj.put("UserId", val);
}
if (jsonObj.has("TargetUserOrGroupName")) {
String val = jsonObj.get("TargetUserOrGroupName").toString().toLowerCase();
jsonObj.remove("TargetUserOrGroupName");
jsonObj.put("TargetUserOrGroupName", val);
}
if (jsonObj.has("ObjectId")) {
String val = jsonObj.get("ObjectId").toString().toLowerCase();
jsonObj.remove("ObjectId");
jsonObj.put("ObjectId", val);
}
if (jsonObj.has("EventData")) {
String val = jsonObj.get("EventData").toString().toLowerCase();
jsonObj.remove("EventData");
jsonObj.put("EventData", val);
}
if (destTopicName.equals("o365_last_line_malware")) {
jsonObj.put("MaliciousScore", "-1");
}
if (destTopicName.equals("o365_activity_log_by_date") || destTopicName.equals("o365_gtb_dlp_breached_policy_by_date")) {
jsonObj.put("ActivityDetail", jsonObj.toString());
}
return jsonObj;
}
public static String getCurrentDate() {
Date date = new Date();
SimpleDateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd");
dateFormat.setTimeZone(TimeZone.getTimeZone("UTC"));
String UTCdate = dateFormat.format(date);
return UTCdate;
}
private static String getCurrentDateTime() {
DateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
Date date = new Date();
String datetime = dateFormat.format(date);
return datetime;
}
}
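One possibility worth ruling out (an assumption, not a diagnosis): Kafka Streams keys its committed offsets to application.id, so after a restart the app resumes from the last committed offset and will not reprocess input records it already consumed; records visible to a console consumer reading from the beginning may simply be ones the app has already handled. auto.offset.reset is consulted only when no committed offset exists. A minimal sketch of the relevant properties, using literal keys so it stays dependency-free:

```java
import java.util.Properties;

public class StreamsOffsetNotes {
    static Properties baseConfig() {
        Properties props = new Properties();
        // Committed offsets are stored under this id; reusing it after a
        // restart means "continue where I left off", not "start over".
        props.put("application.id", "streams-pipe");
        props.put("bootstrap.servers", "localhost:9092");
        // Consulted only when the application id has NO committed offsets
        // yet (e.g. a brand-new app id, or after an offset reset).
        props.put("auto.offset.reset", "earliest");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(baseConfig().getProperty("auto.offset.reset")); // prints: earliest
    }
}
```

To genuinely replay the input, the kafka-streams-application-reset tool (or switching to a new application.id) is the usual route.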

CompletableFutures: processing of CompletableFuture chains in parallel

I am designing an asynchronous call with CompletableFutures. It is a batch call in which I need to process several entities at once. At the end of the call I have to collect information about the processing status of every item.
As input I have an array of ids of those entities. Each is a complex entity: I have to make several DAO calls to compile it into a whole object. Each DAO method returns CompletableFuture<PartX>.
I am chaining those DAO calls because if one of the pieces does not exist I won't be able to construct the whole object. Here is what my snippet looks like:
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.stream.Collectors;
import com.google.common.collect.Lists;
public class CfChainsAllOfTest {
private DAO dao = new DAO();
public static void main(String[] args) {
CompletableFuture<Void> resultPrintingCf = new CfChainsAllOfTest().fetchAllInParallelAndCollect(Lists.newArrayList(1l, 2l, 3l)).thenAccept(results -> {
System.out.println("[" + Thread.currentThread().getName() + "]" + results);
});
resultPrintingCf.join();
}
private CompletableFuture<List<Item>> fetchAllInParallelAndCollect(List<Long> ids) {
List<CompletableFuture<Item>> cfs = Lists.newArrayList();
for (Long id : ids) {
// I want this to be an instant non-blocking operation
cfs.add(fetchSingle(id));
System.out.println("[" + Thread.currentThread().getName() + "]" + "After completable future was added to the list, id=" + id);
}
return waitAllOfAndCollect(cfs);
}
private CompletableFuture<Item> fetchSingle(Long id) {
return getPartCAndSetOnItem(new Item(id)).thenCompose(this::getPartBAndSetOnItem).thenCompose(this::getPartAAndSetOnItem);
}
public CompletableFuture<Item> getPartCAndSetOnItem(Item item) {
return dao.getPartC(item.getId()).thenCompose(partC -> {
CompletableFuture<Item> cf = new CompletableFuture<>();
item.setPartC(partC);
cf.complete(item);
return cf;
});
}
public CompletableFuture<Item> getPartBAndSetOnItem(Item item) {
return dao.getPartB(item.getId()).thenCompose(partB -> {
CompletableFuture<Item> cf = new CompletableFuture<>();
item.setPartB(partB);
cf.complete(item);
return cf;
});
}
public CompletableFuture<Item> getPartAAndSetOnItem(Item item) {
return dao.getPartA(item.getId()).thenCompose(partA -> {
CompletableFuture<Item> cf = new CompletableFuture<>();
item.setPartA(partA);
cf.complete(item);
return cf;
});
}
private static <T> CompletableFuture<List<T>> waitAllOfAndCollect(List<CompletableFuture<T>> futures) {
CompletableFuture<Void> allDoneFuture = CompletableFuture.allOf(futures.toArray(new CompletableFuture[futures.size()]));
return allDoneFuture.thenApply(v -> futures.stream().map(future -> future.join()).collect(Collectors.<T> toList()));
}
static class DAO {
public CompletableFuture<PartC> getPartC(Long id) {
return CompletableFuture.supplyAsync(() -> {
System.out.println("[" + Thread.currentThread().getName() + "]" + "Fetching Part C from database for id=" + id);
try {
Thread.sleep(5000);
} catch (InterruptedException e) {
}
System.out.println("[" + Thread.currentThread().getName() + "]" + "Part C fetched from db for id=" + id);
return new PartC();
});
}
public CompletableFuture<PartB> getPartB(Long id) {
return CompletableFuture.supplyAsync(() -> {
System.out.println("[" + Thread.currentThread().getName() + "]" + "Fetching Part B from database for id=" + id);
try {
Thread.sleep(5000);
} catch (InterruptedException e) {
}
System.out.println("[" + Thread.currentThread().getName() + "]" + "Part B fetched from db for id=" + id);
return new PartB();
});
}
public CompletableFuture<PartA> getPartA(Long id) {
return CompletableFuture.supplyAsync(() -> {
System.out.println("[" + Thread.currentThread().getName() + "]" + "Fetching Part A from database for id=" + id);
try {
Thread.sleep(5000);
} catch (InterruptedException e) {
}
System.out.println("[" + Thread.currentThread().getName() + "]" + "Part A fetched from db for id=" + id);
return new PartA();
});
}
}
static class Item {
private final Long id;
private PartA partA;
private PartB partB;
private PartC partC;
public Item(Long id) {
this.id = id;
}
public Long getId() {
return id;
}
public PartA getPartA() {
return partA;
}
public void setPartA(PartA partA) {
this.partA = partA;
}
public PartB getPartB() {
return partB;
}
public void setPartB(PartB partB) {
this.partB = partB;
}
public PartC getPartC() {
return partC;
}
public void setPartC(PartC partC) {
this.partC = partC;
}
@Override
public String toString() {
return "Item [id=" + id + ", partA=" + partA + ", partB=" + partB + ", partC=" + partC + "]";
}
}
static class PartA {
@Override
public String toString() {
return "Part A";
}
}
static class PartB {
@Override
public String toString() {
return "Part B";
}
}
static class PartC {
@Override
public String toString() {
return "Part C";
}
}
}
The problem is that the processing of each item is not really done in parallel because of the chaining. It looks like chaining CompletableFutures makes a blocking call. I would expect the chain of CFs to return a CompletableFuture<Whole> immediately and only start computing the value after that.
That said, what would be the best way to achieve such behavior? Thanks.
The problem is with this method:
private CompletableFuture<Item> fetchSingle(Long id) {
return getPartCAndSetOnItem(new Item(id)).thenCompose(this::getPartBAndSetOnItem).thenCompose(this::getPartAAndSetOnItem);
}
Basically you are saying: get part C, then get part B, then get part A.
Instead, you should start the 3 calls, then merge the results – though merging might not be necessary here, given the way you just store each result on the Item (pay attention to the Java memory model here: your Item is not synchronized, so this might not work properly for more complex examples).
So, basically:
private CompletableFuture<Item> fetchSingle(Long id) {
Item result = new Item(id);
CompletableFuture<?> c = getPartCAndSetOnItem(result);
CompletableFuture<?> b = getPartBAndSetOnItem(result);
CompletableFuture<?> a = getPartAAndSetOnItem(result);
return CompletableFuture.allOf(a, b, c).thenApply(__ -> result);
}
Of course, the drawback is that you perform all 3 calls even if one fails, but you cannot have your cake and eat it…
As a side note, your getPartXAndSetOnItem() methods can be simplified to
public CompletableFuture<Item> getPartXAndSetOnItem(Item item) {
return dao.getPartX(item.getId()).thenApply(partX -> {
item.setPartX(partX);
return item;
});
}
or, considering we don't care about the actual result type in fetchSingle():
public CompletableFuture<?> getPartXAndSetOnItem(Item item) {
return dao.getPartX(item.getId()).thenAccept(item::setPartX);
}
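To see the difference end to end, here is a compact, runnable sketch of the allOf pattern above; the 300 ms sleeps stand in for the DAO calls, and the fixed-size pool guarantees the three parts actually run concurrently:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ParallelPartsDemo {
    static CompletableFuture<String> part(String name, ExecutorService pool) {
        return CompletableFuture.supplyAsync(() -> {
            try {
                Thread.sleep(300); // simulated DAO latency
            } catch (InterruptedException ignored) {
            }
            return name;
        }, pool);
    }

    static String fetchAll() {
        ExecutorService pool = Executors.newFixedThreadPool(3);
        try {
            // All three futures are already running before allOf is called;
            // allOf merely waits for them, so the total is about one sleep
            // (~300 ms) rather than the ~900 ms a thenCompose chain takes.
            CompletableFuture<String> a = part("A", pool);
            CompletableFuture<String> b = part("B", pool);
            CompletableFuture<String> c = part("C", pool);
            CompletableFuture.allOf(a, b, c).join();
            return a.join() + b.join() + c.join();
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) {
        System.out.println(fetchAll()); // prints: ABC
    }
}
```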

couchbase connect closed when upsert

I have two methods that upsert into Couchbase, and two JUnit tests using Spring Boot Test. After one JUnit test completes, the other test throws this exception. How can I resolve it?
Here are the two upsert methods; I don't know which one is better:
public List<RawJsonDocument> upsert2(String generatorId, String idPrefix, List<String> contents)
{
List<RawJsonDocument> rjd = new ArrayList<RawJsonDocument>(contents.size());
Observable.from(contents).flatMap(new Func1<String,Observable<String>>(){
@Override
public Observable<String> call(String t)
{
return bucket.async().counter(generatorId, 1)
.map(jsonLongDocument -> {
String idStr = idPrefix + generatorId + jsonLongDocument.content();
String jsonStr = idStr + "=" + t;
return jsonStr;
});
}}).subscribe(new Action1<String>() {
@Override
public void call(String t)
{
String[] s = t.split("[=]");
LOGGER.debug("\n methord2 generatorId:" + s[0] + "\n content:" + s[1]);
bucket.async().upsert(RawJsonDocument.create(s[0],s[1]));
}});
return rjd;
}
public List<RawJsonDocument> upsert1(String generatorId, String idPrefix, List<String> contents)
{
    if (contents == null)
    {
        return null;
    }
    List<RawJsonDocument> rjd = new ArrayList<RawJsonDocument>(contents.size());
    Observable.from(contents).flatMap(new Func1<String, Observable<RawJsonDocument>>() {
        @Override
        public Observable<RawJsonDocument> call(String t)
        {
            return bucket.async().counter(generatorId, 1)
                .map(jsonLongDocument -> {
                    String idStr = idPrefix + generatorId + jsonLongDocument.content();
                    LOGGER.debug("\n method3 generatorId:" + idStr + "\n content:" + t);
                    return RawJsonDocument.create(idStr, t);
                });
        }
    }).subscribe(new Action1<RawJsonDocument>() {
        @Override
        public void call(RawJsonDocument t)
        {
            rjd.add(bucket.async().upsert(t).toBlocking().single());
        }
    });
    return rjd;
}
This is my Junit Tester:
@Test
public void testIncrementIds3()
{
    assertThat(generatorId.upsert2("counter", "idprefix", Arrays.asList("aabbccdd", "ffddeeaa")).size(), is(2));
    assertThat(generatorId.upsert1("counter", "idprefix", Arrays.asList("aabbccdd", "ffddeeaa")).size(), is(2));
}
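Note that in both methods `subscribe()` is asynchronous: the method can return before the subscriber has run, so the returned list may still be empty (or being mutated) when the test asserts on it. A Couchbase-free sketch of the synchronous shape (the class, counter, and id format here are invented; with the SDK you would typically build the documents with `map` and block once for the whole list via `.toList().toBlocking().single()` instead of mutating a list from a subscriber):

```java
import java.util.List;
import java.util.concurrent.atomic.AtomicLong;
import java.util.stream.Collectors;

public class UpsertSketch {
    // Stand-in for bucket.async().counter(generatorId, 1): an in-memory sequence.
    private final AtomicLong counter = new AtomicLong();

    // Builds every document id synchronously and returns the complete list,
    // so the caller (and the test) sees all results.
    public List<String> upsert(String generatorId, String idPrefix, List<String> contents) {
        return contents.stream()
                .map(content -> idPrefix + generatorId + counter.incrementAndGet() + "=" + content)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        UpsertSketch sketch = new UpsertSketch();
        List<String> docs = sketch.upsert("counter", "idprefix", List.of("aabbccdd", "ffddeeaa"));
        docs.forEach(System.out::println);
    }
}
```

The point of the sketch is only the control flow: produce all results before returning, rather than returning a list that a background subscriber fills in later.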

Iterate ArrayList

I want to know the best way to iterate this ArrayList, which comes from an API response.
The problem is that I don't know how to get the "id" and the "value" inside the loop.
I know the ArrayList's size, but I have no idea how to print the keys and values from this array:
for (int i = 0; i < contacts.size(); i++) {
    // Example: System.out.print(contacts.get(i).id);
    // Example: System.out.print(contacts.get(i).contact_name);
    // Example: System.out.print(contacts.get(i).numbers);
    // Example: System.out.print(contacts.get(i).emails);
    // I want to print id and value
}
In onResponse I call this function, for example:
ServerResponse resp = response.body();
functionExample((ArrayList) resp.getResponse());
functionExample takes an ArrayList as its parameter.
This is my result from my resp.getResponse():
This is my json from the API:
{
"result": "success",
"message": "Lista de Contactos",
"response": [
{
"id": 1,
"contact_name": "EDIFICADORA JUANA",
"numbers": "{24602254,55655545}",
"emails": "{oipoa@gmaio.com,rst008@guan.com}"
},
{
"id": 2,
"contact_name": "LA MEJOR",
"numbers": "{25445877,25845877}",
"emails": "{AMEJOR@GMAIL.COM}"
}
]
}
I appreciate any help.
public void functionExample(ArrayList contacts) {
    for (int i = 0; i < contacts.size(); i++) {
        LinkedTreeMap<String, Object> map = (LinkedTreeMap<String, Object>) contacts.get(i);
        // Gson deserializes JSON numbers as Double, so don't cast "id" to String.
        String id = String.valueOf(map.get("id"));
        String contact_name = (String) map.get("contact_name");
        // replace() returns a new String; the result must be assigned.
        String numbers = ((String) map.get("numbers")).replace("{", "").replace("}", "");
        String emails = ((String) map.get("emails")).replace("{", "").replace("}", "");
        Snackbar.make(getView(), id, Snackbar.LENGTH_LONG).show();
        Snackbar.make(getView(), contact_name, Snackbar.LENGTH_LONG).show();
        Snackbar.make(getView(), numbers, Snackbar.LENGTH_LONG).show();
        Snackbar.make(getView(), emails, Snackbar.LENGTH_LONG).show();
    }
}
Try this. It will give you an ArrayList of ids:
JSONObject object = new JSONObject(response);
JSONArray array = null;
try {
    array = object.getJSONArray("response");
} catch (JSONException e) {
    e.printStackTrace();
}
ArrayList<String> idArray = new ArrayList<>();
for (int i = 0; i < array.length(); i++)
{
    // note: array.getJSONObject(i); "id" is numeric in the JSON, so read it as an int
    idArray.add(String.valueOf(array.getJSONObject(i).getInt("id")));
}
Try this if you are using ArrayList<TreeMap<String, String>> contacts:
for (TreeMap<String, String> contact : contacts) {
    String id = contact.get("id"); // TreeMap has get(), not getValue()
}
I would strongly encourage you to use e.g. Jackson to map your JSON response to a proper object. Consider the following example:
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.junit.Test;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;
public class JacksonTest {
private static final String JSON = "{\n" +
    "\"result\": \"success\",\n" +
    "\"message\": \"Lista de Contactos\",\n" +
    "\"response\": [\n" +
    "  {\n" +
    "    \"id\": 1,\n" +
    "    \"contact_name\": \"EDIFICADORA JUANA\",\n" +
    "    \"numbers\": \"{24602254,55655545}\",\n" +
    "    \"emails\": \"{oipoa@gmaio.com,rst008@guan.com}\"\n" +
    "  },\n" +
    "  {\n" +
    "    \"id\": 2,\n" +
    "    \"contact_name\": \"LA MEJOR\",\n" +
    "    \"numbers\": \"{25445877,25845877}\",\n" +
    "    \"emails\": \"{AMEJOR@GMAIL.COM}\"\n" +
    "  }\n" +
    " ]\n" +
    "}";

@Test
public void testParsingJSONStringWithObjectMapper() throws IOException {
    //given:
    final ObjectMapper objectMapper = new ObjectMapper();
    //when:
    final Response response = objectMapper.readValue(JSON, Response.class);
    //then:
    assert response.getMessage().equals("Lista de Contactos");
    //and:
    assert response.getResult().equals("success");
    //and: id is declared as a String, so Jackson coerces the numeric 1 to "1"
    assert response.getResponse().get(0).getId().equals("1");
    //and:
    assert response.getResponse().get(0).getContactName().equals("EDIFICADORA JUANA");
    //and:
    assert response.getResponse().get(0).getEmails().equals(Arrays.asList("oipoa@gmaio.com", "rst008@guan.com"));
    //and:
    assert response.getResponse().get(0).getNumbers().equals(Arrays.asList(24602254, 55655545));
}
static class Response {
private String result;
private String message;
private List<Data> response = new ArrayList<>();
public String getResult() {
return result;
}
public void setResult(String result) {
this.result = result;
}
public String getMessage() {
return message;
}
public void setMessage(String message) {
this.message = message;
}
public List<Data> getResponse() {
return response;
}
public void setResponse(List<Data> response) {
this.response = response;
}
}
static class Data {
private String id;
@JsonProperty("contact_name")
private String contactName;
private String numbers;
private String emails;
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public String getContactName() {
return contactName;
}
public void setContactName(String contactName) {
this.contactName = contactName;
}
public List<Integer> getNumbers() {
return Stream.of(numbers.replaceAll("\\{", "")
.replaceAll("}", "")
.split(","))
.map(Integer::valueOf)
.collect(Collectors.toList());
}
public void setNumbers(String numbers) {
this.numbers = numbers;
}
public List<String> getEmails() {
return Arrays.asList(emails.replaceAll("\\{", "")
.replaceAll("}", "")
.split(","));
}
public void setEmails(String emails) {
this.emails = emails;
}
}
}
In this example I used the same JSON response you receive, together with Jackson (ObjectMapper lives in the jackson-databind artifact), to map the String to POJOs (instead of a String you can use an InputStream, byte[], etc.). There are two POJOs: Response and Data. Response aggregates a list of Data objects. Additionally, Data's getEmails() and getNumbers() methods parse the raw input String into a list of the expected type. For example, if you call setNumbers("{24602254,55655545}"), then getNumbers() will return a list of Integers (you can use any numeric type instead), like [24602254, 55655545].
Other suggestions are also valid, e.g. iterating over a collection of TreeMaps or JSONObjects. This example instead deals with Java objects of specific types, rather than with untyped values such as Object.
The final choice also depends on your runtime environment: you will have to add the Jackson dependency, which makes the most sense if your project already uses Jackson for other reasons.
If you are using Set<Map<String, String>> set:
set.forEach(map ->
    System.out.print("Id:" + map.get("id") + " ContactName:" + map.get("contact_name")));
Try this loop to extract every value from your ArrayList:
List<LinkedTreeMap> list = new ArrayList<LinkedTreeMap>(); // assign the result from the API to list
for (LinkedTreeMap<String, String> contact : list) {
    for (String key : contact.keySet()) {
        if (key.equalsIgnoreCase("id")) {
            System.out.println("ID: " + contact.get(key));
        } else if (key.equalsIgnoreCase("contact_name")) {
            System.out.println("Contact Name: " + contact.get(key));
        } else { // it is the list of numbers or e-mails
            String result = contact.get(key);
            // removing { } — note the character class: "{|}" is not a valid Java regex
            result = result.replaceAll("[{}]", "");
            String[] array = result.split(",");
            System.out.println(key + ": "); // this will be either numbers or e-mails
            // now iterating to get each value
            for (String s : array) {
                System.out.println(s);
            }
        }
    }
}
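The brace-stripping step is worth isolating, because `replaceAll("{|}", "")` throws a PatternSyntaxException in Java (`{` begins a quantifier). A character class works; here is a small runnable sketch (the class and method names are invented):

```java
import java.util.Arrays;
import java.util.List;

public class BraceListParser {
    // Turns "{24602254,55655545}" into ["24602254", "55655545"].
    public static List<String> parse(String raw) {
        // [{}] is a character class matching either brace, unlike the invalid "{|}".
        return Arrays.asList(raw.replaceAll("[{}]", "").split(","));
    }

    public static void main(String[] args) {
        System.out.println(parse("{24602254,55655545}"));
    }
}
```

The same helper works for the emails field, since both are encoded as brace-wrapped, comma-separated strings.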
