I have a class User which writes JSON objects to a file:
import java.io.FileWriter;
import java.io.IOException;
import java.security.SecureRandom;
import org.json.simple.JSONObject;
public class User {
public static void main(String args[]) throws IOException, org.json.simple.parser.ParseException{
writetofile();
Q q= new Q();
Writer write = new Writer("write",q);
// System.out.println(q.queue.poll());
Reader reader = new Reader("read",q);
}
public static void writetofile() throws IOException{
FileWriter file = new FileWriter("file1.txt");
for(int i=0;i<3;++i){
JSONObject obj = new JSONObject();
obj.put("Name", rand_s());
obj.put("Age", rand_i());
file.write(obj.toJSONString());
file.flush();
file.write("\r\n");
// System.out.println("Successfully Copied JSON Object to File...");
// System.out.println("\nJSON Object: " + obj);
}
}
public static String rand_s(){
final String AB = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";
SecureRandom rnd = new SecureRandom();
StringBuilder sb = new StringBuilder( 12 );
for( int i = 0; i < 12; i++ )
sb.append( AB.charAt( rnd.nextInt(AB.length()) ) );
return sb.toString();
}
public static String rand_i(){
final String AB = "0123456789";
SecureRandom rnd = new SecureRandom();
StringBuilder sb = new StringBuilder( 2 );
for( int i = 0; i < 2; i++ )
sb.append( AB.charAt( rnd.nextInt(AB.length()) ) );
return sb.toString();
}
}
I have a Writer class which writes the JSON documents from the file to a queue, and a Reader class which reads from the queue, prints the objects, and deletes them from the queue.
Below is the Writer class:
package org.mmt;
import java.io.File;
import java.io.FileNotFoundException;
import java.text.ParseException;
import java.util.ArrayList;
import java.util.Scanner;
import org.json.simple.JSONObject;
import org.json.simple.parser.JSONParser;
public class Writer implements Runnable {
Thread t;
Q q;
Writer(String name,Q q){
t= new Thread(this,name);
this.q = q;
t.start();
}
@Override
public void run(){
String FileName="file1.txt";
try {
ArrayList<JSONObject> jsons=ReadJSON(new File(FileName),"UTF-8");
for(JSONObject ob1 : jsons){
q.put(ob1);
notifyAll();
// System.out.println(q.queue.poll());
}
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (ParseException e) {
e.printStackTrace();
} catch (org.json.simple.parser.ParseException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
public static synchronized ArrayList<JSONObject> ReadJSON(File MyFile,String Encoding) throws FileNotFoundException, ParseException, org.json.simple.parser.ParseException {
Scanner scn=new Scanner(MyFile,Encoding);
ArrayList<JSONObject> json=new ArrayList<JSONObject>();
//Reading and Parsing Strings to Json
while(scn.hasNext()){
JSONObject obj= (JSONObject) new JSONParser().parse(scn.nextLine());
json.add(obj);
}
return json;
}
}
Below is the Reader class:
package org.mmt;
import java.util.Queue;
import org.json.simple.JSONObject;
public class Reader implements Runnable {
Thread t;
Q q;
Reader(String name,Q q){
t=new Thread(this,name);
this.q=q;
t.start();
}
public void run() {
// TODO Auto-generated method stub
while(!q.empty()){
JSONObject obj = new JSONObject();
obj = q.get();
System.out.println(obj);
}
while(q.empty()){
try {
wait();
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
}
and below is the Q class which contains the queue into which the data has to be written:
package org.mmt;
import java.util.LinkedList;
import java.util.Queue;
import org.json.simple.JSONObject;
public class Q {
public Queue<JSONObject> queue = new LinkedList<JSONObject>();
public synchronized JSONObject get(){
return queue.poll();
}
public synchronized void put(JSONObject obj){
try{
queue.add(obj);
}
catch (Exception e){
System.out.println(e);
}
}
public boolean empty(){
return queue.isEmpty();
}
}
I have started threads in Reader and Writer for simultaneous reading and writing. Whenever the queue is empty the Reader calls wait(), and whenever the Writer adds an element to the queue I call notifyAll() so the Reader can resume, but I am getting an IllegalMonitorStateException. I have searched the internet and found that this occurs whenever a thread tries to take the lock of a monitor it does not own, but I am not able to resolve the issue.
Your Reader and Writer classes need to share a monitor object. In your example, Reader is using itself as the monitor and Writer is using itself as the monitor.
In your case you could use the queue object q itself as the monitor, because that is the state that needs synchronization.
Also, the callers need to own the monitor; they usually take ownership like this:
synchronized (q) { // do stuff on q }
In other words, wait/notify should only be called inside a block that is synchronized on that same object.
More about wait/notify here
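For example, here is a minimal sketch of how the Reader and Writer from the question could coordinate on the shared q, using it as the monitor as suggested above (loop termination handling is omitted):

// Reader.run(): wait on the shared q until the writer signals new data
public void run() {
    try {
        while (true) {                       // termination handling omitted
            JSONObject obj;
            synchronized (q) {
                while (q.empty()) {
                    q.wait();                // releases q's monitor while waiting
                }
                obj = q.get();
            }
            System.out.println(obj);
        }
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    }
}

// Writer.run(): notify on the same monitor after adding each element
for (JSONObject ob1 : jsons) {
    synchronized (q) {
        q.put(ob1);
        q.notifyAll();                       // wakes a reader blocked in q.wait()
    }
}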
You're getting that exception because you aren't in a monitor.
public class User {
public static final Object lock = new Object();
...
synchronized(User.lock) {
while(q.empty()){
try {
User.lock.wait();
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
public class Writer implements Runnable {
...
for(JSONObject ob1 : jsons){
q.put(ob1);
synchronized(User.lock) {
User.lock.notifyAll();
}
// System.out.println(q.queue.poll());
}
Please note I haven't worked with synchronized and concurrency like this in a while, so I'm not sure if this is entirely thread-safe or not.
Mostly because if you end up with
synchronized(lock) {
synchronized(Q) {
and
synchronized(Q) {
synchronized(lock) {
Then you'll get a deadlock at some point, and your app will freeze. That's why I'm personally wary of synchronized methods.
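For illustration, a sketch of the opposite lock ordering that can deadlock, with lock and q standing in for the two monitors:

// Thread A acquires the monitors in one order...
synchronized (lock) {
    synchronized (q) {
        // work
    }
}

// ...while Thread B acquires them in the opposite order.
synchronized (q) {
    synchronized (lock) {
        // work
    }
}
// If A holds lock while B holds q, each blocks forever waiting
// for the monitor the other one owns: a deadlock.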
Related
I am wondering: is there a way to use the take() method of a BlockingQueue and add each element I take to a map? Or would I have to add to the map while I am adding to the queue?
What I am doing is parsing input text into n-grams and adding them to a BlockingQueue; once the BlockingQueue has all the elements, I want to take() from the queue and add them to a map, which I have in another class.
import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.concurrent.BlockingQueue;
public class FileParser implements Runnable{
private BlockingQueue<Task> queue;
private String fileName;
private int k;
private Database dbFile = null;
public FileParser(BlockingQueue<Task> queue,String fileName, int k) {
this.queue=queue;
this.fileName = fileName;
this.k = k;
}
public void setDbFile(Database dbFile) {
this.dbFile = dbFile;
}
@Override
public void run() {
// TODO Auto-generated method stub
try {
file();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
public void file() throws IOException, InterruptedException {
BufferedReader br = null;
try {
br = new BufferedReader(new InputStreamReader(new FileInputStream(fileName)));
String line;
while ((line = br.readLine()) != null) {
nGram(line);
}
} catch (FileNotFoundException e) {
System.out.println("File not found");
}
}
private void nGram(String j) throws IOException, InterruptedException {
for (int i = 0; i <= j.length() - k; i++) {
CharSequence ngrams = j.substring(i, i + k);
Task t=new Task(ngrams);
queue.put(t);
System.out.println("Adding "+t);
}
}
}
//Responsible for removing from the queue
import java.util.concurrent.BlockingQueue;
public class Consumer implements Runnable {
private BlockingQueue<Task> queue;
private volatile boolean keepRunning = true;
private Database dbFile = null;
public Consumer(BlockingQueue<Task> queue) {
super();
this.queue = queue;
}
@Override
public void run() {
while (keepRunning) {
try {
Task t = (queue.take());
if (t instanceof Poison) {
keepRunning = false;
System.out.println("Queue poisoned->" + t);
}
System.out.println("Removing " + t);
//Error here:
//dbFile.addFile(t);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
}
}
Output:
Adding N-gram [ ye]
Removing N-gram [ He]
Adding N-gram [ eh]
Adding N-gram [ hi]
Exception in thread "Thread-2" Adding N-gram [ ih]
Adding N-gram [ hi]
Adding N-gram [ ay]
java.lang.NullPointerException
at ie.gmit.sw.Consumer.run(Consumer.java:27)
at java.lang.Thread.run(Thread.java:748)
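If it helps, a minimal sketch of a consumer that take()s from the queue and builds a frequency map directly; Task and Poison are the classes from the question, while getNgram() is a hypothetical accessor exposing the n-gram text:

import java.util.Map;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ConcurrentHashMap;

public class MapBuildingConsumer implements Runnable {
    private final BlockingQueue<Task> queue;
    // ConcurrentHashMap so other threads can read the counts safely
    private final Map<String, Integer> frequencies = new ConcurrentHashMap<>();

    public MapBuildingConsumer(BlockingQueue<Task> queue) {
        this.queue = queue;
    }

    @Override
    public void run() {
        try {
            while (true) {
                Task t = queue.take();          // blocks until an element is available
                if (t instanceof Poison) {
                    break;                      // stop on the poison pill
                }
                // getNgram() is a hypothetical accessor on Task
                frequencies.merge(t.getNgram(), 1, Integer::sum);
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}

Separately, the NullPointerException in the output is consistent with dbFile never being assigned in Consumer before dbFile.addFile(t) is called: it is initialized to null and there is no setter call.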
I am not able to understand why I am getting an exception with both a class-level lock and an object-level lock in the code below.
It seems object-level locking should work here, since we are changing and accessing the hm object from different threads, but we still get a java.util.ConcurrentModificationException.
I tried all three locking variants shown commented in the code.
I know that using Hashtable or ConcurrentHashMap would resolve this problem, but I want to understand which concept I am missing when using HashMap.
import java.io.IOException;
import java.io.PrintWriter;
import java.io.StringWriter;
import java.util.HashMap;
class A{
StringBuilder str = new StringBuilder("ABCD");
StringBuilder exception = new StringBuilder("");
HashMap hm = new HashMap();
public void change() {
//synchronized(A.class) {
//synchronized (this){
synchronized (hm){
(this.str).append(Thread.currentThread().getName().toString());
System.out.println(Thread.currentThread().getName()+"::::::"+str);
hm.put(Thread.currentThread(), Thread.currentThread());
}
}
public void impact() {
//synchronized(A.class) {
//synchronized(this) {
synchronized(hm) {
System.out.println(Thread.currentThread()+"...Inside impact :::"+hm.get(Thread.currentThread()));
}
}
public void print() {
System.out.println("Inside print :::"+str);
System.out.println("Inside print :::exception--"+exception);
}
}
class B extends Thread{
A a;
B(A a){
this.a=a;
}
public void run() {
try {
System.out.println("Inside B run::"+a.hm);
a.change();
a.impact();
}
catch(Exception e){
StringWriter sw = new StringWriter();
e.printStackTrace(new PrintWriter(sw));
System.out.println(sw.toString());
a.exception.append(sw.toString());
try {
sw.close();
} catch (IOException e1) {
e1.printStackTrace();
}
}
}
}
class C extends Thread{
A a;
C(A a){
this.a=a;
}
public void run() {
try {
System.out.println("Inside C run::"+a.hm);
a.change();
a.impact();
}
catch(Exception e){
StringWriter sw = new StringWriter();
e.printStackTrace(new PrintWriter(sw));
System.out.println(sw.toString());
a.exception.append(sw.toString());
try {
sw.close();
} catch (IOException e1) {
e1.printStackTrace();
}
}
}
}
public class multiTest {
public static void main(String[] args) {
// TODO Auto-generated method stub
A a = new A();
for(int i=0;i<=100;i++) {
B b = new B(a);
C c = new C(a);
b.start();
c.start();
}
try {
Thread.currentThread().sleep(5000);
}catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
a.print();
}
}
The problem is on this line:
System.out.println("Inside B run::"+a.hm);
There is a sneaky implicit invocation of a.hm.toString() here, and that does a sneaky iteration over the map's entries; but you aren't synchronizing on anything, so you don't have exclusive access to the HashMap.
Put it in a synchronized block:
synchronized (a.hm) {
System.out.println("Inside B run::"+a.hm);
}
(And make hm final; and don't use raw types).
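For example, the field declaration in A could become (a minimal sketch of that suggestion):

import java.util.HashMap;
import java.util.Map;

class A {
    // final stops the monitor reference from being reassigned;
    // generics replace the raw HashMap type used in the question
    final Map<Thread, Thread> hm = new HashMap<>();
}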
I am writing data to a file using a queue on a separate thread, but the process consumes around 25% of the CPU, as shown in the test main below.
Is there something I can do to resolve this issue?
Perhaps I should be using flush() somewhere?
The test has the main method start the queue thread and then send the created data to it. The queue thread writes the data to a BufferedWriter, which handles writing the data to a file.
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.logging.Level;
import java.util.logging.Logger;
import uk.co.moonsit.utils.timing.Time;
public class OutputFloatQueueReceiver extends Thread {
private static final Logger LOG = Logger.getLogger(OutputFloatQueueReceiver.class.getName());
private ConcurrentLinkedQueue<List<Float>> queue = null;
private boolean running = true;
private final BufferedWriter outputWriter;
private int ctr = 0;
private final int LIMIT = 1000;
public OutputFloatQueueReceiver(String outputFile, String header, ConcurrentLinkedQueue<List<Float>> q) throws IOException {
queue = q;
File f = new File(outputFile);
FileWriter fstream = null;
if (!f.exists()) {
try {
f.getParentFile().mkdirs();
if (!f.createNewFile()) {
throw new IOException("Exception when trying to create file " + f.getAbsolutePath());
}
fstream = new FileWriter(outputFile, false);
} catch (IOException ex) {
//Logger.getLogger(ControlHierarchy.class.getName()).log(Level.SEVERE, null, ex);
throw new IOException("Exception when trying to create file " + f.getAbsolutePath());
}
}
fstream = new FileWriter(outputFile, true);
outputWriter = new BufferedWriter(fstream);
outputWriter.append(header);
}
public synchronized void setRunning(boolean running) {
this.running = running;
}
@Override
public void run() {
while (running) {
while (queue.peek() != null) {
if (ctr++ % LIMIT == 0) {
LOG.log(Level.INFO, "Output Queue size = {0} '{'ctr={1}'}'", new Object[]{queue.size(), ctr});
}
List<Float> list = queue.poll();
if (list == null) {
continue;
}
try {
StringBuilder sbline = new StringBuilder();
Time t = new Time(list.get(0));
sbline.append(t.HMSS()).append(",");
for (Float f : list) {
sbline.append(f).append(",");
}
sbline.append("\n");
outputWriter.write(sbline.toString());
} catch (IOException ex) {
LOG.info(ex.toString());
break;
}
}
}
if (outputWriter != null) {
try {
outputWriter.close();
LOG.info("Closed outputWriter");
} catch (IOException ex) {
Logger.getLogger(OutputFloatQueueReceiver.class.getName()).log(Level.SEVERE, null, ex);
}
}
}
public static void main(String[] args) {
try {
String outputFile = "c:\\tmp\\qtest.csv";
File f = new File(outputFile);
f.delete();
StringBuilder header = new StringBuilder();
header.append("1,2,3,4,5,6,7,8,9");
header.append("\n");
ConcurrentLinkedQueue<List<Float>> outputQueue = null;
OutputFloatQueueReceiver outputQueueReceiver = null;
outputQueue = new ConcurrentLinkedQueue<>();
outputQueueReceiver = new OutputFloatQueueReceiver(outputFile, header.toString(), outputQueue);
outputQueueReceiver.start();
for (int i = 1; i < 100000; i++) {
List<Float> list = new ArrayList<>();
//list.set(0, (float) i); // causes exception
list.add((float) i);
for (int j = 1; j < 10; j++) {
list.add((float) j);
}
outputQueue.add(list);
}
try {
Thread.sleep(5000);
} catch (InterruptedException ex) {
Logger.getLogger(OutputFloatQueueReceiver.class.getName()).log(Level.SEVERE, null, ex);
}
outputQueueReceiver.setRunning(false);
} catch (IOException ex) {
Logger.getLogger(OutputFloatQueueReceiver.class.getName()).log(Level.SEVERE, null, ex);
}
}
}
This code is the reason why your code is using so much CPU:
while (running) {
while (queue.peek() != null) {
// logging
List<Float> list = queue.poll();
if (list == null) {
continue;
}
// do stuff with list
}
}
Basically, your code is busy-waiting, repeatedly "peeking" until a queue entry becomes available. It is probably spinning there in a tight loop.
You should replace your queue class with a BlockingQueue, and simply use take() ... like this:
while (running) {
List<Float> list = queue.take();
// do stuff with list
}
The take() call blocks indefinitely, only returning once an element is available, and returning that element as the result. If blocking indefinitely is a problem, you could either use poll(...) with a timeout, or arrange for some other thread to interrupt the blocked thread.
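A sketch of how the receiver might look with a LinkedBlockingQueue and a timed poll, so the running flag is still checked periodically; this replaces the ConcurrentLinkedQueue field and the busy-wait loop, and the producer in main() would add to the same BlockingQueue:

import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

// Inside OutputFloatQueueReceiver:
private final BlockingQueue<List<Float>> queue = new LinkedBlockingQueue<>();

@Override
public void run() {
    try {
        while (running) {
            // Block for up to one second instead of spinning; a null result
            // only means the timeout elapsed, so loop and re-check 'running'.
            List<Float> list = queue.poll(1, TimeUnit.SECONDS);
            if (list == null) {
                continue;
            }
            // ... format the line and write it with outputWriter, as before ...
        }
    } catch (InterruptedException ex) {
        Thread.currentThread().interrupt();
    }
    // close outputWriter here, as in the original run()
}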
I have a multi-threaded command line app. It is a web service client with a pool of 10 threads that churns away, sending requests, batch-style, to a server.
But it runs for a few days, and sometimes further down the pipeline, the queues start getting backed up. So I want to go to the client, press - or + and have that increase or decrease a Thread.sleep(waitingTime), to take pressure off the server.
I tried running a Scanner in a separate thread, but it didn't seem to work. Has anyone managed to get non-blocking I/O working in Java? I presume it's possible, but I'm giving up for now.
Edit: Added test code as per request
package test;
import java.io.*;
import java.util.Scanner;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
/**
* Created by djb on 2015/06/03.
*/
public class ThreadTest {
public ThreadTest() {
}
static long rand = 10000;
public static void main(String args[])
{
ExecutorService executor = Executors.newFixedThreadPool(5);
File f = new File("C:\\code\\ThreadTest\\text.csv");
try {
Runnable keyPressThread = new ThreadTest.KeyPressThread();
Thread t = new Thread(keyPressThread);
t.start();
BufferedReader br = new BufferedReader(new FileReader(f));
String line;
while ((line = br.readLine()) != null)
{
try {
final String copy = line;
executor.execute(new Runnable() {
@Override
public void run() {
try {
System.out.println(rand);
Thread.sleep(rand);
System.out.println(copy);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
});
} catch (Exception e)
{
e.printStackTrace();
}
}
} catch (Exception e)
{
e.printStackTrace();
}
}
public static class KeyPressThread implements Runnable {
Scanner inputReader = new Scanner(System.in);
//Method that gets called when the object is instantiated
public KeyPressThread() {
}
public void run() {
String input = inputReader.next();
if (input.equals("["))
{
rand+=100;
System.out.println("Pressed [");
}
if (input.equals("]"))
{
rand-=100;
System.out.println("Pressed ]");
}
}
}
}
Your KeyPressThread only tests for input once. The version below makes it watch for input continuously:
public void run()
{
while(true)
{
if (inputReader.hasNext())
{
String input = inputReader.next();
if (input.equals("["))
{
rand+=100;
System.out.println("Pressed [");
}
if (input.equals("]"))
{
rand-=100;
System.out.println("Pressed ]");
}
if (input.equalsIgnoreCase("Q"))
{
break; // stop KeyPressThread
}
}
}
}
System.in is line buffered, by default. This means that no input is actually passed to the program until you press ENTER.
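A small sketch demonstrating the point: nothing reaches the program until ENTER is pressed, no matter how many keys are typed.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class LineBufferingDemo {
    public static void main(String[] args) throws IOException {
        BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
        System.out.println("Type something and press ENTER...");
        // readLine() returns only after ENTER; individual keystrokes
        // are not delivered to the program as they are typed.
        String line = in.readLine();
        System.out.println("Got: " + line);
    }
}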
I have a program that performs lots of calculations and reports them to a file frequently. I know that frequent write operations can slow a program down a lot, so to avoid it I'd like to have a second thread dedicated to the writing operations.
Right now I'm doing it with this class I wrote (the impatient can skip to the end of the question):
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
public class ParallelWriter implements Runnable {
private File file;
private BlockingQueue<Item> q;
private int indentation;
public ParallelWriter( File f ){
file = f;
q = new LinkedBlockingQueue<Item>();
indentation = 0;
}
public ParallelWriter append( CharSequence str ){
try {
CharSeqItem item = new CharSeqItem();
item.content = str;
item.type = ItemType.CHARSEQ;
q.put(item);
return this;
} catch (InterruptedException ex) {
throw new RuntimeException( ex );
}
}
public ParallelWriter newLine(){
try {
Item item = new Item();
item.type = ItemType.NEWLINE;
q.put(item);
return this;
} catch (InterruptedException ex) {
throw new RuntimeException( ex );
}
}
public void setIndent(int indentation) {
try{
IndentCommand item = new IndentCommand();
item.type = ItemType.INDENT;
item.indent = indentation;
q.put(item);
} catch (InterruptedException ex) {
throw new RuntimeException( ex );
}
}
public void end(){
try {
Item item = new Item();
item.type = ItemType.POISON;
q.put(item);
} catch (InterruptedException ex) {
throw new RuntimeException( ex );
}
}
public void run() {
BufferedWriter out = null;
Item item = null;
try{
out = new BufferedWriter( new FileWriter( file ) );
while( (item = q.take()).type != ItemType.POISON ){
switch( item.type ){
case NEWLINE:
out.newLine();
for( int i = 0; i < indentation; i++ )
out.append(" ");
break;
case INDENT:
indentation = ((IndentCommand)item).indent;
break;
case CHARSEQ:
out.append( ((CharSeqItem)item).content );
}
}
} catch (InterruptedException ex){
throw new RuntimeException( ex );
} catch (IOException ex) {
throw new RuntimeException( ex );
} finally {
if( out != null ) try {
out.close();
} catch (IOException ex) {
throw new RuntimeException( ex );
}
}
}
private enum ItemType {
CHARSEQ, NEWLINE, INDENT, POISON;
}
private static class Item {
ItemType type;
}
private static class CharSeqItem extends Item {
CharSequence content;
}
private static class IndentCommand extends Item {
int indent;
}
}
And then I use it by doing:
ParallelWriter w = new ParallelWriter( myFile );
new Thread(w).start();
/// Lots of
w.append(" things ").newLine();
w.setIndent(2);
w.newLine().append(" more things ");
/// and finally
w.end();
While this works perfectly well, I'm wondering:
Is there a better way to accomplish this?
Your basic approach looks fine. I would structure the code as follows:
import java.io.BufferedWriter;
import java.io.File;
import java.io.IOException;
import java.io.Writer;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;
public interface FileWriter {
FileWriter append(CharSequence seq);
FileWriter indent(int indent);
void close();
}
class AsyncFileWriter implements FileWriter, Runnable {
private final File file;
private final Writer out;
private final BlockingQueue<Item> queue = new LinkedBlockingQueue<Item>();
private volatile boolean started = false;
private volatile boolean stopped = false;
public AsyncFileWriter(File file) throws IOException {
this.file = file;
this.out = new BufferedWriter(new java.io.FileWriter(file));
}
public FileWriter append(CharSequence seq) {
if (!started) {
throw new IllegalStateException("open() call expected before append()");
}
try {
queue.put(new CharSeqItem(seq));
} catch (InterruptedException ignored) {
}
return this;
}
public FileWriter indent(int indent) {
if (!started) {
throw new IllegalStateException("open() call expected before append()");
}
try {
queue.put(new IndentItem(indent));
} catch (InterruptedException ignored) {
}
return this;
}
public void open() {
this.started = true;
new Thread(this).start();
}
public void run() {
while (!stopped) {
try {
Item item = queue.poll(100, TimeUnit.MICROSECONDS);
if (item != null) {
try {
item.write(out);
} catch (IOException logme) {
}
}
} catch (InterruptedException e) {
}
}
try {
out.close();
} catch (IOException ignore) {
}
}
public void close() {
this.stopped = true;
}
private static interface Item {
void write(Writer out) throws IOException;
}
private static class CharSeqItem implements Item {
private final CharSequence sequence;
public CharSeqItem(CharSequence sequence) {
this.sequence = sequence;
}
public void write(Writer out) throws IOException {
out.append(sequence);
}
}
private static class IndentItem implements Item {
private final int indent;
public IndentItem(int indent) {
this.indent = indent;
}
public void write(Writer out) throws IOException {
for (int i = 0; i < indent; i++) {
out.append(" ");
}
}
}
}
If you do not want to write in a separate thread (maybe in a test?), you can have an implementation of FileWriter which calls append on the Writer in the caller thread.
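For completeness, a possible synchronous implementation of the same FileWriter interface that writes in the caller's thread; this is a sketch reusing the imports already shown for AsyncFileWriter, with exceptions simply wrapped:

class SyncFileWriter implements FileWriter {
    private final Writer out;

    public SyncFileWriter(File file) throws IOException {
        this.out = new BufferedWriter(new java.io.FileWriter(file));
    }

    public FileWriter append(CharSequence seq) {
        try {
            out.append(seq);                 // write directly in the caller's thread
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
        return this;
    }

    public FileWriter indent(int indent) {
        try {
            for (int i = 0; i < indent; i++) {
                out.append(" ");
            }
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
        return this;
    }

    public void close() {
        try {
            out.close();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}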
One good way to exchange data with a single consumer thread is to use an Exchanger.
You could use a StringBuilder or ByteBuffer as the buffer to exchange with the background thread. The latency incurred can be around 1 microsecond, it doesn't involve creating any objects, and it is lower than with a BlockingQueue.
The example from the documentation is worth repeating here:
class FillAndEmpty {
Exchanger<DataBuffer> exchanger = new Exchanger<DataBuffer>();
DataBuffer initialEmptyBuffer = ... a made-up type
DataBuffer initialFullBuffer = ...
class FillingLoop implements Runnable {
public void run() {
DataBuffer currentBuffer = initialEmptyBuffer;
try {
while (currentBuffer != null) {
addToBuffer(currentBuffer);
if (currentBuffer.isFull())
currentBuffer = exchanger.exchange(currentBuffer);
}
} catch (InterruptedException ex) { ... handle ... }
}
}
class EmptyingLoop implements Runnable {
public void run() {
DataBuffer currentBuffer = initialFullBuffer;
try {
while (currentBuffer != null) {
takeFromBuffer(currentBuffer);
if (currentBuffer.isEmpty())
currentBuffer = exchanger.exchange(currentBuffer);
}
} catch (InterruptedException ex) { ... handle ...}
}
}
void start() {
new Thread(new FillingLoop()).start();
new Thread(new EmptyingLoop()).start();
}
}
Using a LinkedBlockingQueue is a pretty good idea. Not sure I like some of the style of the code... but the principle seems sound.
I would add a capacity to the LinkedBlockingQueue equal to a certain percentage of your total memory, say 10,000 items; this way, if your writing is going too slow, your worker threads won't keep adding more work until the heap is blown.
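For example, in the ParallelWriter constructor the queue could be bounded; 10,000 here is just the figure suggested above:

// put() in append()/newLine()/setIndent() will then block callers once
// 10,000 items are pending, instead of letting the heap grow without bound.
q = new LinkedBlockingQueue<Item>(10000);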
I know that frequent write operations can slow a program down a lot
Probably not as much as you think, provided you use buffering.
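For illustration, a minimal sketch of buffered writing; the file name and buffer size are arbitrary examples:

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.io.Writer;

public class BufferedWriteExample {
    public static void main(String[] args) throws IOException {
        // Appends accumulate in the 64 KB in-memory buffer and only reach
        // the disk when the buffer fills or flush()/close() is called.
        try (Writer out = new BufferedWriter(new FileWriter("results.txt"), 64 * 1024)) {
            for (int i = 0; i < 100000; i++) {
                out.append("line ").append(String.valueOf(i)).append("\n");
            }
        }
    }
}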