How to return a value from a thread - Java

I am trying to use two threads: the first gets the input and the second processes it. The problem is that I cannot figure out how to return a value from a thread without using a callback, and a callback does not act like a thread (I think). Any good ideas on how to do that? Thanks.
Thread t1 = new Thread() {
    @Override
    public void run() {
        while (true) {
            /*
             * get input using Scanner
             */
        }
    }
};
t1.start();

Thread t2 = new Thread() {
    @Override
    public void run() {
        while (true) {
            /* get the input from above, then
             * switch on something or do something
             */
        }
    }
};
t2.start();

Use a shared BlockingQueue. The first thread (producer) adds the inputs to the queue, and the second one (consumer) gets them from the queue. A BlockingQueue, as its name indicates, is blocking. So the consumer getting the next element from the queue will block until the queue actually contains an element.
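A minimal sketch of that idea for this input/processing scenario (class and variable names are illustrative, not from the original post):

import java.util.Scanner;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class InputPipeline {
    public static void main(String[] args) {
        final BlockingQueue<String> queue = new LinkedBlockingQueue<String>();

        // producer: reads lines from the console and hands them over
        Thread producer = new Thread(new Runnable() {
            public void run() {
                Scanner scanner = new Scanner(System.in);
                while (scanner.hasNextLine()) {
                    try {
                        queue.put(scanner.nextLine());
                    } catch (InterruptedException e) {
                        return;
                    }
                }
            }
        });

        // consumer: take() blocks until the producer has put something
        Thread consumer = new Thread(new Runnable() {
            public void run() {
                while (true) {
                    try {
                        String line = queue.take();
                        System.out.println("processing: " + line);
                    } catch (InterruptedException e) {
                        return;
                    }
                }
            }
        });

        producer.start();
        consumer.start();
    }
}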

Your second thread should raise an event when it has something available for the first thread. When creating the second thread, have the first thread add itself as a listener; the second thread then uses that listener to signal the event.
This link talks about Swing, but you can use generic events and listeners for anything.
http://docs.oracle.com/javase/tutorial/uiswing/events/
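A rough sketch of such a hand-rolled listener (the interface and class names here are illustrative, not a standard API):

// the callback contract between the two threads
interface InputListener {
    void onInput(String line);
}

class Worker implements Runnable {
    private final InputListener listener;

    Worker(InputListener listener) {
        this.listener = listener; // the other thread's object registers itself here
    }

    @Override
    public void run() {
        // ... produce a result, then notify whoever registered
        listener.onInput("some result");
    }
}

Keep in mind that the callback runs on the worker thread, so whatever the listener does with the value must be thread-safe.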

If you only have one thread you may want to use a java.util.concurrent.FutureTask (provided you use Java 1.5 or later).
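A hedged sketch of that approach (the Callable body is just a placeholder):

import java.util.concurrent.Callable;
import java.util.concurrent.FutureTask;

public class FutureTaskDemo {
    public static void main(String[] args) throws Exception {
        FutureTask<String> task = new FutureTask<String>(new Callable<String>() {
            public String call() {
                return "value computed on another thread"; // placeholder work
            }
        });

        new Thread(task).start();

        // get() blocks until call() has finished, then returns its value
        String result = task.get();
        System.out.println(result);
    }
}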

The blocking queue is the design of choice in such scenarios.
Spawn your producer threads and let them insert their data into the queue. Another thread (or the main thread), the consumer, watches for data on the queue: as soon as some data arrives, the consumer grabs it and uses it.
So your threads "return" data by inserting it into this common data structure (the queue).
Don't forget to protect the queue if you roll your own (it's a critical section); otherwise multiple threads could read and write the queue data structure at the same time and cause all sorts of weird behavior. The BlockingQueue implementations in java.util.concurrent already do this locking for you.
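For illustration, here is a minimal hand-rolled queue guarded by an intrinsic lock. In practice you would just use LinkedBlockingQueue; this sketch only shows what that protection looks like:

import java.util.ArrayDeque;
import java.util.Deque;

// a tiny blocking queue built on wait()/notify(); not production code
class SimpleBlockingQueue<T> {
    private final Deque<T> items = new ArrayDeque<T>();

    public synchronized void put(T item) {
        items.addLast(item);
        notifyAll(); // wake up any consumer blocked in take()
    }

    public synchronized T take() throws InterruptedException {
        while (items.isEmpty()) {
            wait(); // releases the lock and blocks until put() notifies
        }
        return items.removeFirst();
    }
}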

Related

Queue print jobs in a separate single Thread for JavaFX

Currently I am experimenting with concurrency in Java/JavaFX. Printing must run in a different thread, otherwise it will make the JavaFX main thread freeze for a couple of seconds. Right now my printing is done with this simplified example:
public void print(PrintContent pt) {
    setPrintContent(pt);
    Thread thread = new Thread(this);
    thread.start();
}

@Override
public void run() {
    // send content to printer
}
With this code I am sending many print jobs to my printer in parallel. Therefore I get an error telling me that my printer can only handle one print job at a time. Since I know that threads cannot be reused, I would like to know if there is a way to queue up the print jobs, so that my printer only handles one at a time.
Thank you very much for your effort and your time.
Use a single threaded executor to execute the print jobs. It will create one (and only one) background thread and queue the jobs:
// it might be better not to make this static, but you need to ensure there is
// only one instance of this executor:
private static final Executor PRINT_QUEUE = Executors.newSingleThreadExecutor();

// ...

public void print(PrintContent pt) {
    PRINT_QUEUE.execute(() -> {
        // send content to printer
    });
}
WAY 1
You can implement your own BlockingQueue (reading up on how they work is very useful) or use one of the default implementations from the Java libraries.
You then add a method to your class like:

public void addJob(Object job) throws InterruptedException {
    queue.put(job);
}

Secondly, you implement a Thread that runs in an infinite while loop. Inside it you call

queue.take();

When the queue is empty this thread blocks, waiting until a new object is added, so you don't have to worry about wasting CPU time.
Finally, you can set an upper bound, so that, for example, the queue can contain at most 27 items.
Note that in case of thread failure you have to recreate the thread manually. A sketch of such a worker is shown below.
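A minimal sketch of that worker, assuming a bounded ArrayBlockingQueue (the class and method names are illustrative):

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

class PrintWorker implements Runnable {
    // bounded queue: put() blocks once 27 jobs are waiting
    private final BlockingQueue<Runnable> queue = new ArrayBlockingQueue<Runnable>(27);

    public void addJob(Runnable job) throws InterruptedException {
        queue.put(job);
    }

    @Override
    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            try {
                Runnable job = queue.take(); // blocks while the queue is empty
                job.run();                   // e.g. send one document to the printer
            } catch (InterruptedException e) {
                return; // allows the worker to be shut down
            }
        }
    }
}

Create one PrintWorker, start it once on its own Thread, and have the JavaFX code call addJob(...) for every print request.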
WAY 2 (better approach)
You can use an ExecutorService:

ExecutorService executorService = Executors.newSingleThreadExecutor();
From documentation:
Creates an Executor that uses a single worker thread operating off an
unbounded queue. (Note however that if this single thread terminates
due to a failure during execution prior to shutdown, a new one will
take its place if needed to execute subsequent tasks.) Tasks are
guaranteed to execute sequentially, and no more than one task will be
active at any given time.
With the method below you retrieve the result once the job has completed:

Future<String> future = executorService.submit(new Callable<String>() {
    public String call() throws Exception {
        System.out.println("Asynchronous Callable");
        return "Callable Result";
    }
});
System.out.println("future.get() = " + future.get());

future.get() blocks until the job has finished and returns the Callable's result (here "Callable Result"). If you submit a Runnable instead of a Callable, get() returns null upon successful completion.
Remember to call executorService.shutdown(), because the active threads inside this ExecutorService may prevent the JVM from shutting down.
Full tutorial here

producer consumer pattern with concurrenthashmap in java

I have the following problem, and I am not sure how to design parts of the solution:
I have a large text file that I read line by line.
I need to process each line and update a HashMap.
AFAIK I need one producer thread to read the lines from the file, and dispatch the lines to a pool of consumer threads. The consumer threads should update the ConcurrentHashMap and then get new lines.
My questions are:
How can the consumer threads access the ConcurrentHashMap?
If I use a fixed thread pool, does the producer need to add the line to a queue first, or can it simply submit or execute a new consumer?
EDIT:
Zim-Zam is correct; I want the consumers to dump their results into the ConcurrentHashMap when they finish.
I create the ConcurrentHashMap in the main thread, and pass references to it to the Consumers in their constructors. The Consumers should either add or increment an AtomicInteger in their run methods. How can I tell in the main thread when all of the lines are read and the consumers are finished?
Thanks again.
You can either have all of the consumers share the same queue that the producer adds to, or else you can give each consumer its own queue that the producer accesses via a circular linked list or a similar data structure so that each consumer's queue receives more or less the same amount of data (e.g. if you have 3 consumers, then the producer would add data to queue1, then queue2, then queue3, then queue1, etc).
You can give each consumer a reference to the same ConcurrentHashMap (e.g. in the consumer's constructor), or else you can make the ConcurrentHashMap accessible via a static getter method.
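To address the edit about counting, a small sketch of consumers sharing the map and incrementing an AtomicInteger (assumes Java 8 for computeIfAbsent; the names are illustrative):

import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.atomic.AtomicInteger;

class CountingConsumer implements Runnable {
    private final ConcurrentMap<String, AtomicInteger> counts;

    CountingConsumer(ConcurrentMap<String, AtomicInteger> counts) {
        this.counts = counts; // the same map instance is passed to every consumer
    }

    @Override
    public void run() {
        String key = "some key derived from a line";
        // atomically create the counter on first use, then increment it
        counts.computeIfAbsent(key, k -> new AtomicInteger()).incrementAndGet();
    }
}

The main thread creates the map once with new ConcurrentHashMap<String, AtomicInteger>() and hands the same reference to every consumer.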
I think you don't really need a producer-consumer queue in the way you suggested.
Simply have the main thread read the file, and for each line create a corresponding Runnable object (treat it as a command) and submit it to the thread pool executor. The content of the Runnable is simply the logic for handling that line and putting the result into the ConcurrentHashMap.
The ThreadPoolExecutor can be created with a bounded or unbounded blocking queue, depending on the behavior you want.
In pseudo code it is something like this:
class LineHandler implements Runnable {
    String line;
    ConcurrentHashMap resultMap;

    public LineHandler(String line, ConcurrentHashMap resultMap) {
        this.line = line;
        this.resultMap = resultMap;
    }

    @Override
    public void run() {
        // work on line
        // update resultMap
    }
}

// logic in your file reader thread, supposed to be in a loop:
while (moreLinesInFile()) {
    String line = readFromFile();
    threadPoolExecutor.submit(new LineHandler(line, concurrentHashMap));
}

threadPoolExecutor.shutdown();
Use a CountDownLatch.
// in main thread
// assume consumers are in some kind of container
List<MyConsumer> consumers = ...;
CountDownLatch latch = new CountDownLatch(consumers.size());

for (MyConsumer c : consumers) {
    c.setLatch(latch);
    c.start(); // starts asynchronously, or submit to an executor, whatever you're doing
}

// block main thread, optionally timing out
latch.await();

// then in a consumer, when it has finished its work:
latch.countDown();
I would suggest you use a BlockingQueue to store the lines to be processed.
After the main thread has finished parsing the file, it puts a poison object as the last object into the queue and waits with awaitTermination(...) for the consumers to finish.
The poison object is handled in a special way in a consumer thread: the consumer that processes the poison object attempts to shutdown() the ExecutorService while the main thread is waiting.
As for the results of the consumers, just add them to some thread-safe container. The producer/consumer hand-off itself is handled by the queue: put(...), poll(...).
Hope I could help.

How do I stop my command queue loop using so much CPU properly?

I have a while loop that checks whether an ArrayList containing commands for the program to execute is empty. Obviously it does things if the list is not empty, but if it is empty, right now I just have a Thread.sleep(1000) in the else branch. That leaves anything that interacts with it rather sluggish. Is there any way to make the thread it runs on block until a new command is added? (It runs in its own thread, so that seems to be the best solution to me.) Or is there a better solution to this?
You can use wait() and notify() to have the threads that add something to the list inform the consumer thread that there is something to be read. However, this requires proper synchronization, etc.
But a better way to solve your problem is to use a BlockingQueue instead. By definition they are synchronized classes, and dequeuing will block appropriately and wake up when stuff is added. The LinkedBlockingQueue is a good class to use if you want your queue to be unbounded. The ArrayBlockingQueue can be used when you want a limited number of items in the queue (or LinkedBlockingQueue with a capacity passed to the constructor). With a bounded queue, queue.put(...) will block if the queue is full, while queue.add(...) would throw an exception instead.
BlockingQueue<Message> queue = new LinkedBlockingQueue<Message>();
...
// producer thread(s) add a message to the queue
queue.add(message);
...
// consumer(s) wait for a message to be added to the queue and then removes it
Message message = queue.take();
...
// you can also wait for certain amount of time, returns null on timeout
Message message = queue.poll(10, TimeUnit.MINUTES);
Use a BlockingQueue<E> for your commands.
There's a very good example of how to use it in the link above.
A better solution is to use an ExecutorService. This combines a queue and a pool of threads.
// or use a thread pool with multiple threads.
ExecutorService executor = Executors.newSingleThreadExecutor();

// call as often as you like.
executor.submit(new Runnable() {
    @Override
    public void run() {
        process(string);
    }
});

// when finished
executor.shutdown();

How can I wait() on one object and then notifyAll on another?

I believe the problem I am facing is a variant of the nested-monitor lockout. Basically I have two groups of threads (not ThreadGroups, just logical groups). One group of threads(let's say the background group) will be waiting on an object while the other group of threads is working (the working group). One by one the working threads complete, until finally the last working thread is in the 'complete' method. What I want to do is figure out some method of telling this last working thread to wait, and then calling notifyAll() to wakeup all the background threads. As you can probably guess, the two groups of threads are being switched back and forth - one group is working while the other is waiting and then the groups switch. Problem is, if I notifyAll() on the currently waiting threads then there is no guarantee the final working thread will make it to the wait() call before the notified threads complete and try to start the next swap.
Sorry if this question is a bit off - seems the more I work on concurrency the more convoluted my code becomes :(
Sounds like you need something like a Gate class that is composed of two CountDownLatch instances. I use something similar in a lot of multi-threaded tests.
Your waiting threads all call gate.ready() and the workers call gate.go() when done.
Note this particular implementation assumes 1 coordinator thread. To support more, simply construct the go latch with the number of waiter threads you require.
/**
 * Simple starting gate for co-ordinating a bunch of threads.
 */
final class Gate {
    final CountDownLatch ready;
    final CountDownLatch go = new CountDownLatch(1);

    Gate(final int threads) {
        ready = new CountDownLatch(threads);
    }

    /**
     * Called from the racing threads when ready. They will then block until all
     * threads are at this point.
     */
    void ready() {
        ready.countDown();
        await(go);
    }

    /**
     * Called from the starter thread. Blocks until everybody is ready, and then
     * signals go.
     */
    void go() {
        await(ready);
        go.countDown();
    }

    static void await(final CountDownLatch latch) {
        try {
            if (!latch.await(5, TimeUnit.SECONDS)) { // arbitrary, parameterise for production use
                throw new TimedOutException();
            }
        } catch (final InterruptedException e) {
            throw new RuntimeException(e);
        }
    }

    static final class TimedOutException extends IllegalStateException {}
}
If you need unknown arbitrary thread counts you probably want something similar to Doug Lea's Phaser class coming in Java7.
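A rough sketch of how a Phaser covers the arbitrary-count case (assumes Java 7+; the worker bodies are placeholders):

import java.util.concurrent.Phaser;

public class PhaserDemo {
    public static void main(String[] args) {
        final Phaser phaser = new Phaser(1); // register the coordinating (main) thread

        for (int i = 0; i < 3; i++) {
            phaser.register(); // one extra party per worker, added as workers are created
            final int id = i;
            new Thread(new Runnable() {
                public void run() {
                    System.out.println("worker " + id + " ready");
                    phaser.arriveAndAwaitAdvance(); // block until every party has arrived
                    System.out.println("worker " + id + " working");
                    phaser.arriveAndDeregister();   // drop out when done
                }
            }).start();
        }

        phaser.arriveAndAwaitAdvance(); // main thread arrives and releases all workers
    }
}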
You could try connecting the thread groups with an Exchanger. It's classically used for transferring work back and forth between two threads that alternate work. Seems like you might be able to make it work for groups of threads too if you can get the transfer to work right.
What if you had a controller thread for each group? You could then have the controller notifyAll on his group when he received an item in the Exchanger, then join on all of his own group. When the joins all return, he could transfer control back over the Exchanger.
Or if the number of threads in each group is fixed, you could create a CyclicBarrier for the group with the fixed number of threads, then specify a barrier action to be run when all of the threads complete and hit the barrier. That action could transfer control via an Exchanger or a SynchronousQueue (which is a 0-length queue that enforces synchronous coordination).
For more information on synchronizers, check out Java Concurrency in Practice or the DZone concurrency refcard.
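For the fixed-size case, a small sketch of a CyclicBarrier with a barrier action (the action body is a placeholder for whatever hand-off you need):

import java.util.concurrent.BrokenBarrierException;
import java.util.concurrent.CyclicBarrier;

public class BarrierDemo {
    public static void main(String[] args) {
        final int workers = 3;

        // the barrier action runs once, in the last thread to arrive,
        // every time all parties reach the barrier
        final CyclicBarrier barrier = new CyclicBarrier(workers, new Runnable() {
            public void run() {
                System.out.println("all workers arrived; hand control to the other group here");
            }
        });

        for (int i = 0; i < workers; i++) {
            final int id = i;
            new Thread(new Runnable() {
                public void run() {
                    System.out.println("worker " + id + " finished its work");
                    try {
                        barrier.await(); // blocks until all workers have arrived
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    } catch (BrokenBarrierException e) {
                        // another party was interrupted or timed out
                    }
                }
            }).start();
        }
    }
}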
Maybe you could use a variable to indicate the number of threads that are still working. So, when a thread completes, it calls this method:

synchronized void completed() {
    threads_working--;
    if (threads_working == 0) {
        synchronized (some_lock) {
            some_lock.notifyAll();
        }
    }
}
And every thread shall increment that number when it starts working.
Is it possible to allow the threads to return and terminate instead of having them wait? If so, have you considered implementing a thread manager to spawn the threads and initiate control to each group?
the threaded process:
public void run() {
    while (workRemaining()) {
        doWork();
    }
    this.manager.workCompleted();
}
and within the thread manager:
void workCompleted() {
    if (--this.runningThreads <= 0) {
        spawnNewGroup();
    }
}

void spawnNewGroup() {
    for (int i = 0; i < groupSize; i++) {
        startIndividualThread();
        this.runningThreads++;
    }
}

How does one stop a thread without a stop() method?

I have a question about Java threads. Here is my scenario:
I have a thread calling a method that could take a while. The thread stays on that method until it gets the result. If I send another request to that method in the same way, there are now two threads running (provided the first has not returned its result yet). But I want to give priority to the last thread and don't want the results from the previously started threads. So how can I get rid of the earlier threads when I do not have a stop method?
The standard design pattern is to use a flag in the thread that can be set to stop it:

public class MyThread extends Thread {
    private volatile boolean running = true;

    // cannot be named stop(), because Thread.stop() is final
    public void requestStop() {
        running = false;
    }

    public void run() {
        while (running) {
            // do your things
        }
    }
}

This way you can gracefully terminate the thread, i.e. without throwing an InterruptedException.
The best way really depends on what that method does. If it waits on something, chances are an interrupt will result in an InterruptedException which you handle and cleanly exit. If it's doing something busy, it won't:
class Scratchpad {
    public static void main(String[] a) {
        Thread t = new Thread(new Runnable() {
            public void run() { doWork(); }
        });
        t.start();
        try {
            Thread.sleep(50);
        } catch (InterruptedException ie) {}
        t.interrupt();
    }

    private static void doWork() {
        for (long i = 1; i != 0; i *= 5);
    }
}
In the case above, the only viable solution really is a flag variable to break out of the loop early on a cancel, a la @inflagranti.
Another option for event-driven architectures is the poison pill: if your method is waiting on a blocking queue for a new item, then you can have a global constant item called the "poison pill" that, when consumed (dequeued), makes the thread exit:
try {
    while (true) {
        SomeType next = queue.take();
        if (next == POISON_PILL) {
            return;
        }
        consume(next);
    }
} catch // ...
EDIT:
It looks like what you really want is an executor service. When you submit a job to an executor service, you get back a Future which you can use to track results and cancel the job.
You can interrupt a Thread; its execution chain will throw an InterruptedException most of the time (see the special cases in the documentation).
If you just want to slow down the other thread and not have it exit, you can take some other approach...
For one thing, just like exiting you can have a de-prioritize variable that, when set, puts your thread to sleep for 100ms on each iteration. This would effectively stop it while your other thread searched, then when you re-prioritize it it would go back to full speed.
However, this is a little sloppy. Since you only ever want one thing running but you want to have it remember to process others when the priority one is done, you may want to place your processing into a class with a .process() method that is called repeatedly. When you wish to suspend processing of that request you simply stop calling .process on that object for a while.
In this way you can implement a stack of such objects and your thread would just execute stack.peek().process(); every iteration, so pushing a new, more important task onto the stack would automatically stop any previous task from operating.
This leads to much more flexible scheduling: for instance, you could have process() return false if there is nothing for it to do, at which point your scheduler might go to the next item on the stack and try its process() method, giving you some serious multi-tasking ability in a single thread without overtaxing your resources (the network, I'm guessing).
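A hypothetical sketch of that stack-of-tasks scheduler; the Task interface and all names here are assumptions made for illustration, not an existing API:

import java.util.ArrayDeque;
import java.util.Deque;

interface Task {
    // do one small unit of work; return false when there is currently nothing to do
    boolean process();
}

class StackScheduler implements Runnable {
    private final Deque<Task> stack = new ArrayDeque<Task>();

    public synchronized void push(Task t) {
        stack.push(t); // the newest task automatically preempts the older ones
    }

    @Override
    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            boolean didWork = false;
            synchronized (this) {
                for (Task t : stack) {      // iterates from the most recently pushed task downwards
                    if (t.process()) {
                        didWork = true;
                        break;
                    }
                }
            }
            if (!didWork) {
                try {
                    Thread.sleep(10);       // nothing to do anywhere: back off briefly
                } catch (InterruptedException e) {
                    return;
                }
            }
        }
    }
}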
There is a setPriority(int) method on Thread. You can set the first thread's priority like this:

Thread t = new Thread(yourRunnable);
t.start();
t.setPriority(Thread.MIN_PRIORITY); // the range goes from 1 (MIN_PRIORITY) to 10 (MAX_PRIORITY)

But this won't kill your thread. If you have only two threads using your runnable, then this is a good solution. But if you create threads in a loop and always set the priority of the previous thread to minimum, you will end up with a lot of threads.
If this is what your application is going to do, take a look at a thread pool. There is no class named ThreadPool in the Java API, so you will have to create one yourself (or use the executors in java.util.concurrent).
A thread pool is another Thread that manages all your other Threads the way you want. You can set a maximum number of running threads. And in that pool, you can implement a system that manages thread priorities automatically. E.g. you can make older threads gain more priority, so you can properly end them.
So, if you know how to work with a thread pool, it can be very useful.
According to the java.lang.Thread API, you should use the interrupt() method and check the isInterrupted() flag while you're doing a time-consuming cancelable operation. This approach allows you to deal with different kinds of "waiting situations":
1. The wait(), join() and sleep() methods will throw an InterruptedException after you invoke interrupt().
2. If the thread is blocked in a java.nio.channels.Selector, it will finish the selector operation.
3. If you're waiting on I/O, the thread will receive a ClosedByInterruptException, but in this case your I/O facility must implement the InterruptibleChannel interface.
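A small sketch of the flag-checking pattern for busy (non-blocking) work; the loop body is a placeholder:

class CancellableWorker implements Runnable {
    @Override
    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            // do one chunk of the time-consuming work here
        }
        // falls through and lets the thread die after someone calls thread.interrupt()
    }
}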
If it's not possible to interrupt this action in a generic way, you can simply abandon the previous thread and get results from a new one. You can do that by means of java.util.concurrent.Future and java.util.concurrent.ExecutorService.
Consider the following code snippet:
public class RequestService<Result> {
    private ExecutorService executor = Executors.newFixedThreadPool(3);
    private Future<Result> result;

    public Future<Result> doRequest() {
        if (result != null) {
            result.cancel(true);
        }
        result = executor.submit(new Callable<Result>() {
            public Result call() throws Exception {
                // do your long-running service call here and return its result
                return null; // placeholder
            }
        });
        return result;
    }
}
The Future object here represents the result of the service call. If you invoke the doRequest method again, it attempts to cancel the previous task and then submits a new request. As long as the thread pool contains more than one thread, you won't have to wait until the previous request is cancelled; the new request is submitted immediately and the method returns a new Future for it.
