I have a method like this:
public List<SenderResponse> sendAllFiles(String folderName) {
    List<File> allFiles = getListOfFiles();
    List<SenderResponse> finalResponse = new ArrayList<SenderResponse>();
    for (File file : allFiles) {
        finalResponse.getResults().add(sendSingleFile(file));
    }
    return finalResponse;
}
which currently runs on a single thread. I want to run sendSingleFile(file) on multiple threads so I can reduce the total time taken to send the files.
How can I run sendSingleFile(file) on multiple threads for the various files and collect the final response?
I found a few articles using ThreadPoolExecutor, but how do I handle the response returned by each sendSingleFile(file) call and add it to one final SenderResponse list?
I am fairly new to multithreading. Please suggest the best way to process these files.
Define an executor service:
ExecutorService executor = Executors.newFixedThreadPool(MAX_THREAD); // MAX_THREAD is an integer pool size you choose
Then, for each job, you can do something like this:
Callable<SenderResponse> task = () -> {
    try {
        return sendSingleFile(file);
    } catch (InterruptedException e) {
        throw new IllegalStateException("Interrupted", e);
    }
};
Future<SenderResponse> future = executor.submit(task);
SenderResponse response = future.get(MAX_TIME_TO_WAIT, TimeUnit.SECONDS); // blocking call; waits at most MAX_TIME_TO_WAIT seconds for the task to finish
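Putting these pieces together for the question's method, a minimal runnable sketch could look like this. SenderResponse and sendSingleFile are simplified stand-ins for the question's own classes, and the pool size is an arbitrary choice:

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelSender {

    static final int MAX_THREAD = 4;

    // Stand-in for the question's response type.
    static class SenderResponse {
        final String fileName;
        SenderResponse(String fileName) { this.fileName = fileName; }
    }

    // Stand-in for the question's sendSingleFile.
    static SenderResponse sendSingleFile(File file) {
        return new SenderResponse(file.getName());
    }

    public static List<SenderResponse> sendAllFiles(List<File> allFiles)
            throws InterruptedException, ExecutionException {
        ExecutorService executor = Executors.newFixedThreadPool(MAX_THREAD);
        try {
            // Submit one Callable per file; each Future will hold one response.
            List<Future<SenderResponse>> futures = new ArrayList<>();
            for (File file : allFiles) {
                futures.add(executor.submit(() -> sendSingleFile(file)));
            }
            // Collect the results; get() blocks until each task is done.
            List<SenderResponse> finalResponse = new ArrayList<>();
            for (Future<SenderResponse> f : futures) {
                finalResponse.add(f.get());
            }
            return finalResponse;
        } finally {
            executor.shutdown();
        }
    }
}
```

Collecting each Future in a list and calling get() afterwards keeps the responses in submission order while the sends themselves run in parallel.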
You start by writing code that works for the single-thread solution. The code you posted wouldn't even compile: finalResponse is a List<SenderResponse>, which has no getResults() method!
When that stuff works, you continue with this:
You create an instance of ExecutorService, backed by as many threads as you want.
You submit tasks to that service.
Each task knows about the shared result list. The task does its work and adds its result to that list.
The one point to be careful about: make sure that add() is synchronized somehow - having multiple threads update an ordinary ArrayList is not safe.
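A minimal sketch of that recipe, using Collections.synchronizedList for the shared result list and awaitTermination to wait for all tasks; the string "send" here is a placeholder for the real work:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class SharedListDemo {

    public static List<String> runAll(List<String> inputs) throws InterruptedException {
        // Wrap the result list so concurrent add() calls are safe.
        List<String> results = Collections.synchronizedList(new ArrayList<>());
        ExecutorService executor = Executors.newFixedThreadPool(4);
        for (String input : inputs) {
            // Each task does its work and appends its result to the shared list.
            executor.submit(() -> results.add("sent:" + input));
        }
        // Wait for all submitted tasks to finish before reading the list.
        executor.shutdown();
        executor.awaitTermination(1, TimeUnit.MINUTES);
        return results;
    }
}
```

Note that with this pattern the results arrive in completion order, not submission order; if order matters, the Future-based approach from the first answer is simpler.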
For your situation, I would use a work-stealing pool (the ForkJoin executor service) and submit "jobs" to it. If you're using Guava, you can wrap it in a listeningDecorator, which lets you add a listener to the futures it returns.
Example:
// create the executor service
ListeningExecutorService exec = MoreExecutors.listeningDecorator(Executors.newWorkStealingPool());

for (Foo foo : bar) {
    // submit can accept Runnable or Callable<T>
    final ListenableFuture<T> future = exec.submit(() -> doSomethingWith(foo));
    // run something when it is complete
    future.addListener(() -> doSomeStuff(future), exec);
}
Note that the listener will be called whether the future was successful or not.
I'm playing around with threads and I'm wondering if it's possible to force a thread to execute something.
So, the thing is I have some method like this:
public void asyncSleep() {
    Supplier<Boolean> sleeper = () -> {
        try {
            Thread.sleep(4000);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // restore the interrupt flag
        }
        return true;
    };
    // ex is an Executor defined elsewhere in the class
    CompletableFuture<Boolean> promise = CompletableFuture.supplyAsync(sleeper, ex);
    promise.thenAccept(u -> {
        System.out.println("thread=" + Thread.currentThread());
    });
}
And I'd need the original thread (the one executing the asyncSleep() method) to be the one executing the thenAccept. Is that even possible? And if so, how can I do it?
If you want it to work in a non-blocking way, and you still want the original thread (the one that executes asyncSleep) to also execute the method you pass into thenAccept, then the only way is as follows:
You'd need to use an ExecutorService with just a single thread. Execute asyncSleep with this executor, and then pass the same executor into thenAcceptAsync (instead of thenAccept) as the second argument.
Then, as long as your thread didn't crash and wasn't replaced by the executor with a new thread, it will be exactly the same thread executing both methods.
But that's an artificial use case.
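A runnable sketch of that setup; the single-thread executor and the stage bodies are illustrative, and the method returns whether both stages really ran on the same thread:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class SameThreadDemo {

    public static boolean bothStagesOnSameThread() {
        // A pool with exactly one thread: both stages are forced onto it.
        ExecutorService ex = Executors.newSingleThreadExecutor();
        String[] names = new String[2];
        CompletableFuture
                .supplyAsync(() -> {
                    names[0] = Thread.currentThread().getName();
                    return true;
                }, ex)
                // thenAcceptAsync with the SAME executor, as the answer suggests
                .thenAcceptAsync(v -> names[1] = Thread.currentThread().getName(), ex)
                .join();
        ex.shutdown();
        return names[0].equals(names[1]);
    }
}
```

With plain thenAccept instead, the second stage may run on whichever thread completes the first stage, or on the caller's thread, so the guarantee disappears.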
I have to get data from an API, so naturally I have an endpoint handler that is accessed through a lambda which, I assume, spawns several threads to complete each API call that I need. However, after all of the API calls are finished (all of the lambda threads complete), I need to organize my data. Currently, the sort method that I have runs on the main thread and therefore finishes before any of the API calls in the lambda finish. Here is a sample of what I have:
for (String data : dataArray) {
    APIEndpoint apiCall = new APIEndpoint("http://sampleAPI.org/route/" + data);
    apiCall.execute((response, success) -> {
        // format and gather the info from the response
        apiDataArray.add(DataFromAPIObject);
    });
}
System.out.print(apiDataArray.size()); // prints 0
sortData(); // currently doesn't sort anything because the array is empty
Edit: Here is the endpoint executor I am working with:
https://github.com/orange-alliance/TOA-DataSync/blob/master/src/org/theorangealliance/datasync/util/FIRSTEndpoint.java
Using semaphores might be an option, but the code below will deadlock if, for some reason, there is no response for at least one of the data points. (To avoid the deadlock, you would need to release the semaphore on errors as well.)
Semaphore semaphore = new Semaphore(dataArray.length);
for (String data : dataArray) {
    semaphore.acquire();
    APIEndpoint apiCall = new APIEndpoint("http://sampleAPI.org/route/" + data);
    apiCall.execute((response, success) -> {
        // format and gather the info from the response
        apiDataArray.add(DataFromAPIObject);
        semaphore.release();
    });
}
semaphore.acquire(dataArray.length); // blocks until every callback has released its permit
sortData();
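A CountDownLatch gives the same "wait for all callbacks" behavior without the initial-permit arithmetic; this is a swap-in alternative, not the answer's own approach, and APIEndpoint is replaced by a hypothetical execute helper so the sketch is self-contained:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class LatchDemo {

    interface Callback { void onResponse(String response); }

    // Stand-in for the asynchronous API call in the question.
    static void execute(String data, Callback cb, ExecutorService pool) {
        pool.submit(() -> cb.onResponse("result:" + data));
    }

    public static List<String> fetchAll(String[] dataArray) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        List<String> results = Collections.synchronizedList(new ArrayList<>());
        CountDownLatch latch = new CountDownLatch(dataArray.length);
        for (String data : dataArray) {
            execute(data, response -> {
                try {
                    results.add(response);
                } finally {
                    latch.countDown(); // count down even on failure, to avoid a hang
                }
            }, pool);
        }
        latch.await(); // blocks until every callback has fired
        pool.shutdown();
        return results;
    }
}
```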
There's a thread pool with a single thread that is used to perform tasks submitted by multiple threads. Each task actually comprises two parts: a perform step with a meaningful result, and a cleanup step that takes quite some time but returns no meaningful result. At the moment, the (obviously incorrect) implementation looks something like this. Is there an elegant way to ensure that the next perform task will be executed only after the previous cleanup task?
public class Main {

    private static class Worker {
        int perform() {
            return 1;
        }

        void cleanup() {
        }
    }

    private static void perform() throws InterruptedException, ExecutionException {
        ExecutorService pool = Executors.newFixedThreadPool(1);
        Worker w = new Worker();
        Future<Integer> f = pool.submit(() -> w.perform());
        pool.submit(w::cleanup);
        int x = f.get();
        System.out.println(x);
    }
}
Is there an elegant way to ensure that another perform task will be executed only after previous cleanup task?
The most obvious thing to do is to call cleanup() from perform() but I assume there is a reason why you aren't doing that.
You say that your solution is currently "obviously incorrect". Why? Because of race conditions? Then you could add a synchronized block:
synchronized (pool) {
    Future f = pool.submit(() -> w.perform());
    pool.submit(w::cleanup);
}
That would ensure that the cleanup() comes immediately after its perform(). If you are worried about the performance hit from synchronized, don't be.
Another solution might be to use the ExecutorCompletionService class although I'm not sure how that would help with one thread. I've used it before when I had cleanup tasks running in another thread pool.
If you are using Java 8+, you can do this with CompletableFuture. Note that thenApplyAsync expects a function of the previous stage's result, so cleanup() has to be wrapped (as originally posted, the lambda would not compile):
int x = CompletableFuture.supplyAsync(() -> w.perform(), pool)
        .thenApplyAsync(result -> { w.cleanup(); return result; }, pool)
        .join();
I am trying to implement multithreading using ExecutorService for downloading files in parallel. Below is my code:
public void downloadFiles(List<String> filenames, final String fileSavePath) {
    if (filenames != null && !filenames.isEmpty()) {
        List<Callable<Void>> jobs = new ArrayList<>();
        for (final String fileName : filenames) {
            jobs.add(new Callable<Void>() {
                @Override
                public Void call() throws Exception {
                    downloadFile(fileName, fileSavePath);
                    return null;
                }
            });
        }
        performJobs(jobs);
    }
}
My requirement is to return a status from this method after all the files have downloaded successfully. I am not sure how to do this; I cannot access a variable of the inner class from the outer one.
Any advice would be appreciated.
Thanks
A Callable can return a result. When you submit a job to the executor service, you get a future back. Calling get() on it will give you back the result returned by the Callable which can very well be the status of that particular download.
In your particular example, instead of returning null, return the result of downloading the file. Another way would be to use a shared thread-safe queue between the callables and add each status to that queue (though that's a roundabout way of doing it). You can also use this sort of trick to "update" some status on the UI, etc.
From the Javadoc of Callable:
A task that returns a result and may throw an exception. Implementors
define a single method with no arguments called call.
Taking a cue from this, change List<Callable<Void>> jobs to List<Callable<Boolean>> jobs, and likewise change the return type of your call method. After each task completes, you can then check the returned status.
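A sketch of that change, with downloadFile stubbed out (here it just "fails" on empty names) so the example runs standalone:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class DownloadStatusDemo {

    // Stand-in for the question's downloadFile; reports success or failure.
    static boolean downloadFile(String name, String path) {
        return !name.isEmpty();
    }

    public static boolean downloadAll(List<String> filenames, String savePath)
            throws InterruptedException, ExecutionException {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        try {
            // Callable<Boolean> instead of Callable<Void>: each job returns its status.
            List<Callable<Boolean>> jobs = new ArrayList<>();
            for (String name : filenames) {
                jobs.add(() -> downloadFile(name, savePath));
            }
            // invokeAll blocks until every job has finished.
            boolean allOk = true;
            for (Future<Boolean> f : pool.invokeAll(jobs)) {
                allOk &= f.get();
            }
            return allOk;
        } finally {
            pool.shutdown();
        }
    }
}
```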
Use an ExecutorCompletionService.
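That suggestion could be fleshed out like this: a completion service hands results back as each download finishes, rather than in submission order. The string-building "download" is a placeholder for the real work:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletionService;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorCompletionService;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class CompletionServiceDemo {

    public static List<String> process(List<String> filenames)
            throws InterruptedException, ExecutionException {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        CompletionService<String> cs = new ExecutorCompletionService<>(pool);
        for (String name : filenames) {
            cs.submit(() -> "downloaded:" + name); // stand-in for the real download
        }
        // take() returns futures in completion order, not submission order,
        // so statuses can be handled as soon as each download finishes.
        List<String> statuses = new ArrayList<>();
        for (int i = 0; i < filenames.size(); i++) {
            statuses.add(cs.take().get());
        }
        pool.shutdown();
        return statuses;
    }
}
```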
I have a pre-populated set of strings. I want to iterate over the items and while iterating, i need to "do work" which might also remove the item from the set. I want to spawn a new thread for each item's "do work". Please note that only some items are removed from the set during "do work".
Now i have the following question,
Can I achieve this by simply using Collections.synchronizedSet(new HashSet())? I am guessing this will throw a ConcurrentModificationException, since I am removing items from the set while iterating over it. How can I achieve the above behavior efficiently and without consistency issues?
Thanks!
I would use an ExecutorService:
ExecutorService es = Executors.newFixedThreadPool(n);
List<Future<String>> toRemove = new ArrayList<>();
for (String s : set)
    toRemove.add(es.submit(new Task(s)));
for (Future<String> future : toRemove) {
    String s = future.get();
    if (s != null)
        set.remove(s);
}
This avoids needing to access the collection in a multi-threaded way.
Use a master producer thread that will remove the elements from the collection and will feed them to consumer threads. The consumer threads have no need to "personally" remove the items.
Yes, a synchronized Set will still throw a ConcurrentModificationException.
Try this:
Set<String> s = Collections.newSetFromMap(new ConcurrentHashMap<String, Boolean>());
A set backed by ConcurrentHashMap never throws a ConcurrentModificationException, even when multiple threads are accessing and modifying it.
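A small demonstration of that claim: removing elements while iterating such a set does not throw, because its iterator is weakly consistent. The length-based removal rule here is just an arbitrary example of "do work":

```java
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

public class ConcurrentSetDemo {

    public static Set<String> filter(Collection<String> items) {
        // A Set view backed by ConcurrentHashMap, as suggested above.
        Set<String> set = Collections.newSetFromMap(new ConcurrentHashMap<>());
        set.addAll(items);
        // Removing during iteration is safe here; a HashSet (even a
        // synchronized one) would throw ConcurrentModificationException.
        for (String s : set) {
            if (s.length() > 1) {
                set.remove(s);
            }
        }
        return set;
    }
}
```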
The approach depends on the relation between the data in your set and the successful completion of the operation.
Remove from Set is independent of the result of task execution
If you don't care about the actual result of the thread execution, you can just go through the set and remove every item as you dispatch the task (you have some examples of that already)
Remove from Set only if task execution completed successfully
If the deletion from the set should be conditional on the success of the execution, you can use Futures to collect information about the success of each task. That way, only successfully executed items are deleted from the original set. There is no need to access the Set concurrently, as you can separate execution from the check using Futures and an ExecutorService. For example:
// This task will execute the job and,
// if successful, return the string used as context
class Task implements Callable<String> {
    final String target;

    Task(String s) {
        this.target = s;
    }

    @Override
    public String call() throws Exception {
        // do your stuff
        // throw an exception if it failed
        return target;
    }
}
And this is how it's used:
ExecutorService executor = Executors.newFixedThreadPool(4);
Set<Callable<String>> myTasks = new HashSet<Callable<String>>();
for (String s : set) {
    myTasks.add(new Task(s));
}
List<Future<String>> results = executor.invokeAll(myTasks);
for (Future<String> result : results) {
    try {
        set.remove(result.get());
    } catch (ExecutionException ee) {
        // the task failed during execution - handle as required
    } catch (CancellationException ce) {
        // the task was cancelled - handle as required
    }
}