flush System.out with own logger - java

I created my own little logger to use instead of System.out.println:
LogManager.getLogManager().reset();
logger = Logger.getLogger(this.toString());
logger.setLevel(loglevel);
Formatter formatter = new Formatter() {
    public String format(LogRecord record) {
        return new Date().toString().substring(11, 20) + record.getLevel() + " "
                + formatMessage(record) + System.getProperty("line.separator");
    }
};
logger.addHandler(new StreamHandler(System.out, formatter));
LogManager.getLogManager().addLogger(logger);
At the moment, the messages don't get flushed, so they only appear once the application terminates. Is there a way to flush a message after printing it without creating a new class or adding many lines of code? I want to keep this code as short as possible...

The problem is that StreamHandler.setOutputStream wraps the given stream in an OutputStreamWriter, which, according to the Javadoc:
The resulting bytes are accumulated in a buffer before being written to the underlying output stream. The size of this buffer may be specified, but by default it is large enough for most purposes.
So there is no way around calling StreamHandler.flush to force those bytes to the console.
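If you want to stay close to your original code, a minimal sketch (my own illustration, not part of the answer) is to keep a reference to the handler and flush it after logging:
StreamHandler handler = new StreamHandler(System.out, formatter);
logger.addHandler(handler);

logger.info("something happened");
handler.flush(); // forces the buffered bytes out to System.out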
Since you don't want to create a new class, you can use ConsoleHandler, which flushes to the console but by default writes to the error stream. You can work around that, provided you haven't started any other threads, by doing the following:
//Global resource so ensure single thread.
final PrintStream err = System.err;
System.setErr(System.out);
try {
    ConsoleHandler ch = new ConsoleHandler();
    ch.setFormatter(formatter);
    logger.addHandler(ch);
} finally {
    System.setErr(err);
}
A subclass really is your best bet, because otherwise you can end up accidentally closing System.out, either by calling Handler.close() through LogManager.getLogManager().reset() or by calling StreamHandler.setOutputStream, which means you won't see any output at all.
public class OutConsoleHandler extends StreamHandler {

    public OutConsoleHandler(OutputStream out, Formatter f) {
        super(out, f);
    }

    @Override
    public synchronized void publish(LogRecord record) {
        super.publish(record);
        flush();
    }

    @Override
    public void close() throws SecurityException {
        flush();
    }
}

Related

Using ConsoleHandler with own PrintStream suppresses System.err

I want the Java ConsoleHandler to use System.out instead of System.err, so I implemented my own handler that calls the protected setOutputStream(OutputStream) method of the parent StreamHandler class:
public class ConsoleHandler extends java.util.logging.ConsoleHandler {

    public ConsoleHandler() {
        setOutputStream(System.out); // or System.err
        setLevel(Level.ALL);
    }
}
I remove the default console handler from the root logger and add my own:
Logger l = Logger.getLogger("");
for (Handler h : l.getHandlers())
    l.removeHandler(h);
l.addHandler(new ConsoleHandler());

System.out.println("OUT");
System.err.println("ERR");
Problem: "OUT" is always printed, but "ERR" never is, regardless of the output stream I set in my ConsoleHandler constructor.
The stack trace (printed to System.err) is not printed any more either; without my changes it is printed as usual.
This is because setOutputStream closes the previously assigned System.err stream. This is a known issue, filed under JDK-4827381: Invoking ConsoleHandler.setOutputStream(...) closes System.err. What should have come out of that bug report is that StreamHandler.setOutputStream calls Handler.close instead of flushAndClose().
You need to wrap the existing System.err stream with a proxy that doesn't allow the stream to be closed. Or just extend StreamHandler and use the constructor that takes an OutputStream.
public class OutConsoleHandler extends StreamHandler {

    public OutConsoleHandler() {
        super(System.out, new SimpleFormatter());
        //TODO: Read level, filter, and encoding from the LogManager.
    }

    @Override
    public void publish(LogRecord record) {
        super.publish(record);
        super.flush();
    }

    @Override
    public void close() {
        super.flush();
    }
}
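For the first approach, a close-shielding proxy around the error stream might look like the sketch below (the NonClosingOutputStream name and the surrounding swap of System.err are my own illustration, not code from the original answer):
import java.io.FilterOutputStream;
import java.io.IOException;
import java.io.OutputStream;

/** A wrapper whose close() only flushes, so the wrapped stream survives. */
class NonClosingOutputStream extends FilterOutputStream {

    NonClosingOutputStream(OutputStream out) {
        super(out);
    }

    @Override
    public void close() throws IOException {
        flush(); // swallow the close; keep the underlying stream open
    }
}
The idea would be to install new PrintStream(new NonClosingOutputStream(System.err), true) as System.err before constructing the handler, so the close triggered inside setOutputStream hits the proxy rather than the real error stream, and then restore the original System.err afterwards.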

How does the java.util.logging.Handler lifecycle work?

I'm trying to implement a custom handler that logs parsed LogRecord objects to a file (basically what FileHandler or StreamHandler does). My current implementation is shown below:
public final class ErrorHandler extends Handler {

    private static final String OUTPUT_FILE = ".output";
    private final Formatter formatter = new CustomFormatter();
    private BufferedWriter writter;

    @Override
    public void publish(LogRecord record) {
        if (record.getLevel() == SEVERE || record.getLevel() == WARNING) {
            writeToOutput(record);
        }
    }

    void writeToOutput(LogRecord log) {
        try {
            if (writter == null) {
                writter = new BufferedWriter(new FileWriter(OUTPUT_FILE, true));
            }
            writter.write(formatter.format(log));
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    @Override
    public void flush() {
    }

    @Override
    public void close() {
        try {
            writter.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
P.S.: I know that we can achieve the same as the code above just by setting a filter and formatter on a FileHandler or StreamHandler; however, I'll need the hook points later in the future.
My problem is: if I leave flush() with no implementation, although the output file gets created, no log is written to it. If I call writter.flush() inside flush(), the log is duplicated. Any thoughts on why this might be happening?
OK, after two days fighting against this, I came to realize that the process was running as a daemon; therefore, the handler's close() was only called when the daemon was killed. I believe this was leading to multiple calls to flush() at almost the same time. Running the process without the daemon solved the issue.
My problem is: if I leave flush() with no implementation, although the output file gets created, no log is written to it.
This is because the bytes are cached in the BufferedWriter. Flush sends those bytes to the wrapped FileWriter. If you collect enough bytes it will flush to the target file, but you risk losing that information if you have some sort of process crash or disk issue.
If I call writter.flush() inside flush(), the log is duplicated. Any thoughts on why this might be happening?
Perhaps you have added two instances of this handler to the logger and both are appending to the same file. Logger.addHandler works like a List and not like a Set. Add code to print the logger tree, which will tell you how many handler instances are installed.
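A minimal sketch of such a check, using only the standard java.util.logging API (this code is my illustration, not from the original answer):
import java.util.Enumeration;
import java.util.logging.Handler;
import java.util.logging.LogManager;
import java.util.logging.Logger;

public final class PrintLoggerTree {
    public static void main(String[] args) {
        // Walk every logger the LogManager knows about and list its handlers.
        Enumeration<String> names = LogManager.getLogManager().getLoggerNames();
        while (names.hasMoreElements()) {
            String name = names.nextElement();
            Logger logger = Logger.getLogger(name);
            for (Handler h : logger.getHandlers()) {
                System.out.println("'" + name + "' -> " + h.getClass().getName());
            }
        }
    }
}
If the same handler class shows up twice under the same logger, that explains the duplicated output.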
I'm sure I have no process crash or disk issue, and I believe that close calls flush. Yet I don't see why nothing is being logged, and it happens only when the file has not been created yet.
Close is only implicitly called when the Java Virtual Machine shuts down and the handler is visible from the LogManager. If the shutdown is not clean, as described in the documentation, then the contents of the buffered writer are not flushed.

How to get the progress of reading a file

I'm trying to read a large object from a file in my application. Since this can take some time, I'd like to somehow connect the reading of the file to a JProgressBar. Is there any easy way to find the progress of reading a file? (The loading itself is done in a SwingWorker thread, so updating a progress bar should not be a problem.) I've been thinking about overriding the read() method in FileInputStream to return a progress value of sorts, but that seems a rather devious way to do it. Any suggestions on how to realize this are more than welcome.
Here is the code for reading the file:
public class MapLoader extends SwingWorker<Void, Integer> {

    String path;
    WorldMap map;

    public void load(String mapName) {
        this.path = Game.MAP_DIR + mapName + ".map";
        this.execute();
    }

    public WorldMap getMap() {
        return map;
    }

    @Override
    protected Void doInBackground() throws Exception {
        File f = new File(path);
        if (!f.exists())
            throw new IllegalArgumentException(path + " is not a valid map name.");
        try {
            FileInputStream fs = new FileInputStream(f);
            ObjectInputStream os = new ObjectInputStream(fs);
            map = (WorldMap) os.readObject();
            os.close();
            fs.close();
        } catch (IOException | ClassCastException | ClassNotFoundException e) {
            e.printStackTrace();
        }
        return null;
    }

    @Override
    protected void done() {
        firePropertyChange("map", null, map);
    }
}
If it were me, I would not mess with overriding FileInputStream. I think the decorator pattern is a good fit here. The idea is that you create a decorator input stream that you pass to your ObjectInputStream. The decorator takes care of updating the progress of your read, then delegates to the real input stream.
Perhaps the easiest solution is to use CountingInputStream from Apache commons-io. The basic steps would be (a sketch follows the list):
1. Create a subclass of CountingInputStream as a non-static inner class of your map loader.
2. Override the afterRead method. Call super.afterRead, then publish your updated status.
3. Pass an instance of your new decorator input stream to the ObjectInputStream, passing the file input stream to the constructor of your decorator.
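A rough sketch of that decorator, assuming commons-io is on the classpath and a version in which afterRead declares no checked exceptions (the ProgressInputStream name, the percentage math, and the wiring into doInBackground are my own illustration, not part of the answer):
// Hypothetical non-static inner class inside MapLoader (SwingWorker<Void, Integer>);
// assumes imports for java.io.InputStream and org.apache.commons.io.input.CountingInputStream.
private class ProgressInputStream extends CountingInputStream {

    private final long totalBytes;

    ProgressInputStream(InputStream in, long totalBytes) {
        super(in);
        this.totalBytes = Math.max(1, totalBytes);
    }

    @Override
    protected synchronized void afterRead(int n) {
        super.afterRead(n); // keeps the running byte count up to date
        publish((int) (getByteCount() * 100 / totalBytes)); // progress in percent
    }
}

// In doInBackground(), wrap the FileInputStream before handing it to ObjectInputStream:
// ObjectInputStream os = new ObjectInputStream(
//         new ProgressInputStream(new FileInputStream(f), f.length()));
The published percentages would then be appended to the JProgressBar in an overridden process() method.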
Using RandomAccessFile, you may call getFilePointer() to know how many bytes have been read.
The time-consuming operation should be executed in a background thread; remember to use SwingUtilities.invokeLater() to communicate between the background task and the GUI thread.
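A minimal sketch of that approach (the file name and buffer size are made up for illustration):
import java.io.IOException;
import java.io.RandomAccessFile;

public class ReadProgressDemo {
    public static void main(String[] args) throws IOException {
        try (RandomAccessFile file = new RandomAccessFile("world.map", "r")) {
            long length = Math.max(1, file.length());
            byte[] buffer = new byte[8192];
            while (file.read(buffer) != -1) {
                // getFilePointer() reports the current offset, i.e. how much has been read.
                int percent = (int) (file.getFilePointer() * 100 / length);
                System.out.println("Progress: " + percent + "%");
                // ...process the chunk, or hand the percentage to the GUI thread...
            }
        }
    }
}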
If you are considering overriding read() in FileInputStream, what you could legitimately consider is making your own wrapper InputStream class that accepts a progress-monitor callback. However, you'll find that it is not as easy as implementing read(), since it is very inefficient to spend a method invocation on each byte. Instead, you'll need to deal with read(byte[], int, int), which is a bit more involved.
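A sketch of such a wrapper (the class name and the callback shape are illustrative assumptions, not from the answer):
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.function.LongConsumer;

// Reports the running byte count to a callback after every successful read.
public class CallbackInputStream extends FilterInputStream {

    private final LongConsumer onProgress;
    private long bytesRead;

    public CallbackInputStream(InputStream in, LongConsumer onProgress) {
        super(in);
        this.onProgress = onProgress;
    }

    @Override
    public int read() throws IOException {
        int b = super.read();
        if (b != -1) {
            report(1);
        }
        return b;
    }

    @Override
    public int read(byte[] buf, int off, int len) throws IOException {
        // The bulk read is the one ObjectInputStream actually uses most of the time.
        int n = super.read(buf, off, len);
        if (n > 0) {
            report(n);
        }
        return n;
    }

    private void report(long n) {
        bytesRead += n;
        onProgress.accept(bytesRead);
    }
}
FilterInputStream.read(byte[]) delegates to the three-argument read, so overriding these two methods covers all the read paths.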

PrintWriter to JTextArea, nothing displays until called method closes

Context: I am reading data from a serial port at 115.2 Kbaud. The read data is printed using a PrintWriter that I then have appending to a JTextArea.
Everything works well, but the text in the JTextArea does not appear until the method sending the stream from the serial port to my PrintWriter finishes. I'd like it to display closer to real time, as I will at times be receiving upwards of 20-30 MB of text at a time, and seeing how the general flow of text changes as the program executes would be valuable.
I am using the PrintWriter to JTextArea method here. I think the solution probably has to do with Threads and PipedWriter/PipedReader, but every attempt I've made to implement that has failed miserably.
Thank you for your help.
//code calling method; VerifierReader does not inherit from Reader
//or any such class. It's wholly homegrown. I send it the PrintWriter
//as out, telling it to output there
verifierInstance = new VerifierReader("COM3", verifierOutputLocString.getText());
verifierInstance.setSysOutWriter(out);
verifierInstance.readVerifierStream();
// and the relevant code from VerifierReader
public void setSysOutWriter(PrintWriter outWriter) {
    sysOutWriter = new PrintWriter(outWriter);
}

public void readVerifierStream() throws SerialPortException,
        InterruptedException {
    try {
        sysOutWriter.println("Listening for verifier...");
        //sysOutWriter.flush();
        verifierPort.addEventListener(new verifierListener());
        lastReadTimer = System.currentTimeMillis();
        while (verifierPort.isOpened()) {
            Thread.sleep(1000);
            //System.out.println(timeOut);
            if (((long) (System.currentTimeMillis() - lastReadTimer)) > timeOut) {
                sysOutWriter.println("Finished");
                verifierPort.removeEventListener();
                verifierPort.closePort();
            }
        }
    } finally {
        if (verifierPort.isOpened()) {
            verifierPort.closePort();
        }
        bfrFile.close();
    }
}

private class verifierListener implements SerialPortEventListener {

    String outBuffer;

    public void serialEvent(SerialPortEvent event) {
        if (event.isRXCHAR()) { //If data is available
            timeOut = 200;
            lastReadTimer = System.currentTimeMillis();
            if (event.getEventValue() > 0) { //Check bytes count in the input buffer
                try {
                    byte[] buffer = verifierPort.readBytes(event.getEventValue());
                    outBuffer = new String(buffer);
                    bfrFile.print(outBuffer);
                    sysOutWriter.print(outBuffer);
                    //bfrFile.flush();
                    //sysOutWriter.flush();
                } catch (SerialPortException ex) {
                    sysOutWriter.println(ex);
                }
            }
        }
    }
}
Edit:
I've attempted what was recommended below, and have made the following changes:
private class VerifierTask extends SwingWorker<Void, String> {

    public VerifierTask() throws IOException, SerialPortException, InterruptedException {
        verifierInstance = new VerifierReader(streamReader);
        verifierInstance.setReaderIO("COM3", verifierOutputLocString.getText());
        verifierInstance.readVerifierStream();
    }

    @Override
    protected Void doInBackground() throws IOException {
        int charItem;
        char[] charBuff = new char[10];
        String passString;
        while ((charItem = streamReader.read(charBuff, 0, 10)) != -1) {
            passString = new String(charBuff);
            publish(passString);
        }
        return null;
    }

    @Override
    protected void process(List<String> outList) {
        for (String output : outList) {
            outputArea.append(output);
        }
    }
}
This was added, and I changed my button to immediately invoke a new instance of the VerifierTask class, in addition to making VerifierReader use a PipedWriter for output (with all of that being Strings).
I'm not sure what I'm doing wrong here. When this code is executed the Java process just freezes indefinitely.
Am I assuming correctly that a VerifierReader created in any VerifierTask thread is tied to that thread, and thus my Thread.sleep and while(true) statements no longer pose a problem?
Don't call Thread.sleep or do while (true) on the main Swing event thread, the EDT. Ever. Instead do this sort of thing in a background thread such as one provided via a SwingWorker. You would use the publish/process method pair to get intermediate results to your JTextArea.
For more on this, please check out the tutorial: Concurrency in Swing.

Always blocking input stream for testing?

I'm doing some unit tests where essentially I need the input stream to block forever. Right now I'm using this to construct the input stream
InputStream in = new ByteArrayInputStream("".getBytes());
While it works some of the time, other times the input stream is read before the output stream (what I'm testing) is finished, causing all sorts of havoc.
Essentially I need this input stream to block forever when read. The only solution I can think of is to set up the InputStream with a massive buffer so that the other threads finish, but that's a really hackish and brittle solution. I do have Mockito, but I'm very new at it and not sure if I can get away with only mocking read without mocking anything else.
Does anyone know of a better solution?
EDIT:
This is my new attempt. It works most of the time, but other times the input thread dies early, which causes the output thread to die (that behavior is intentional). I can't seem to figure out, though, why this would sometimes fail.
This is the general test under TestNG, simplified for clarity.
protected CountDownLatch inputLatch;

@BeforeMethod
public void botSetup() throws Exception {
    //Setup streams for bot
    PipedOutputStream out = new PipedOutputStream();

    //Create an input stream that we'll kill later
    inputLatch = new CountDownLatch(1);
    in = new AutoCloseInputStream(new ByteArrayInputStream("".getBytes()) {
        @Override
        public synchronized int read() {
            try {
                //Block until we're killed
                inputLatch.await();
            } catch (InterruptedException ex) {
                //Wrap in a RuntimeException so whatever was using this fails
                throw new RuntimeException("Interrupted while waiting for input", ex);
            }
            //No more input
            return -1;
        }
    });
    Socket socket = mock(Socket.class);
    when(socket.getInputStream()).thenReturn(in);
    when(socket.getOutputStream()).thenReturn(out);

    //Setup ability to read from the bot's output
    botOut = new BufferedReader(new InputStreamReader(new PipedInputStream(out)));
    ...
}

@AfterMethod
public void cleanUp() {
    inputLatch.countDown();
    bot.dispose();
}
For the test I use readLine() from botOut to get the appropriate number of lines. The issue, though, is that when the output thread dies, readLine() blocks forever, which hangs up TestNG. I've tried a timeout with mixed results: most of the time it would work, but at other times it would kill tests that just took a little longer than normal.
My only other option is to just not use streams for this kind of work. The output thread relies on an output queue, so I could just run off of that. The issue, though, is that then I'm not actually testing writing to the stream, just what is going to be sent, which does bother me.
Mockito is great; I am personally a huge fan!
With Mockito, you can do something like the code below. You basically set up a stream mock and tell it to sleep for a very long time when the read method is invoked on it. You can then pass this mock into the code you want to test to simulate a stream that hangs.
import static org.mockito.Mockito.*;

//...

@Test
public void testMockitoSleepOnInputStreamRead() throws Exception {
    InputStream is = mock(InputStream.class);
    when(is.read()).thenAnswer(new Answer() {
        @Override
        public Object answer(InvocationOnMock invocation) {
            try {
                Thread.sleep(10000000000L);
                return null;
            } catch (InterruptedException ie) {
                throw new RuntimeException(ie);
            }
        }
    });
    //then use this input stream for your testing.
}
I'd make an InputStream that, when read(), does a wait() on something that's held locked until you're done with the rest of the test. Subclass FilterInputStream to get everything else for free.
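A rough sketch of that suggestion (the class name, lock object, and release() method are illustrative, not from the answer):
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

/** Blocks every read() until release() is called from the test's tear-down. */
public class BlockingTestInputStream extends FilterInputStream {

    private final Object lock = new Object();
    private boolean released;

    public BlockingTestInputStream(InputStream in) {
        super(in);
    }

    @Override
    public int read() throws IOException {
        synchronized (lock) {
            while (!released) {
                try {
                    lock.wait(); // hold the reader here until the test is done
                } catch (InterruptedException e) {
                    throw new RuntimeException(e);
                }
            }
        }
        return -1; // signal end of stream once released
    }

    public void release() {
        synchronized (lock) {
            released = true;
            lock.notifyAll();
        }
    }
}
If the code under test uses the bulk read(byte[], int, int) methods, those would need the same treatment, since FilterInputStream delegates them straight to the wrapped stream.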
There doesn't seem to be any reliable way to do this. My code in the question only works sometimes, @Moe's doesn't work at all, @Ed's suggestion is what I was originally doing, and @SJuan's is sort of what I'm already doing.
There just seems to be too much stuff going on. The input stream I give to the class is wrapped in an InputStreamReader, then a BufferedReader. Suggestions for other streams inside of other streams just further complicate the issue.
To fix the problem I did what I should have done originally: create a factory method for the InputThread (the thread that actually does the reading), then override it in my testing. Simple, effective, and 100% reliable.
I suggest anyone who runs into this problem first try to override the part of the program that does the reading. If you can't, then the code I posted is the only semi-reliable code that works in my situation.
Then you need another InputStream flavour. read() blocks when no more bytes are available, but with a ByteArrayInputStream they are always available until the end of the stream is reached.
I would extend ByteArrayInputStream by changing read() so that it checks a certain boolean flag (if true, read; if false, wait a second and loop). Then change that flag from your unit test code when the time is right.
Hope that helps.
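A minimal sketch of that flag-based approach (names are illustrative; the latch-based class in the next answer is a more robust variant):
import java.io.ByteArrayInputStream;

/** Returns data normally, but at end-of-stream spins until the test flips the flag. */
public class FlagBlockingInputStream extends ByteArrayInputStream {

    private volatile boolean finished;

    public FlagBlockingInputStream(byte[] buf) {
        super(buf);
    }

    @Override
    public synchronized int read() {
        int b;
        while ((b = super.read()) == -1 && !finished) {
            try {
                Thread.sleep(1000); // wait a second and check the flag again
            } catch (InterruptedException e) {
                throw new RuntimeException(e);
            }
        }
        return b;
    }

    /** Call from the test when it is time to let the stream end. */
    public void finish() {
        finished = true;
    }
}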
I created a helper class that extends ByteArrayInputStream for my unit tests. It pipes a given byte[] through, but at the end of the stream, instead of returning -1, it waits until close() is called. If ten seconds pass, it gives up and throws an exception.
If you want it to close earlier, you can call latch.countDown() yourself.
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

public class BlockingByteArrayInputStream extends ByteArrayInputStream {

    private CountDownLatch latch;

    public BlockingByteArrayInputStream(byte[] buf) {
        super(buf);
        latch = new CountDownLatch(1);
    }

    @Override
    public synchronized int read() {
        int read = super.read();
        if (read == -1) {
            waitForUnblock();
        }
        return read;
    }

    @Override
    public int read(byte[] b) throws IOException {
        int read = super.read(b);
        if (read == -1) {
            waitForUnblock();
        }
        return read;
    }

    @Override
    public synchronized int read(byte[] b, int off, int len) {
        int read = super.read(b, off, len);
        if (read == -1) {
            waitForUnblock();
        }
        return read;
    }

    private void waitForUnblock() {
        try {
            latch.await(10, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            throw new RuntimeException("safeAwait interrupted");
        }
    }

    @Override
    public void close() throws IOException {
        super.close();
        latch.countDown();
    }
}
