I'm trying to read a large object from a file in my application. Since this can take some time, I'd like to somehow connect the reading of the file to a JProgressBar. Is there an easy way to find the progress of reading a file? (The loading itself is done in a SwingWorker thread, so updating a progress bar should not be a problem.) I've been thinking about overriding the readByte() method of FileInputStream to return a progress value of sorts, but that seems like such a devious way to do it. Any suggestions on how to realize this are more than welcome.
Here is the code for reading the file:
public class MapLoader extends SwingWorker<Void, Integer> {
String path;
WorldMap map;
public void load(String mapName) {
this.path = Game.MAP_DIR + mapName + ".map";
this.execute();
}
public WorldMap getMap() {
return map;
}
@Override
protected Void doInBackground() throws Exception {
File f = new File(path);
if (! f.exists())
throw new IllegalArgumentException(path + " is not a valid map name.");
try {
FileInputStream fs = new FileInputStream(f);
ObjectInputStream os = new ObjectInputStream(fs);
map = (WorldMap) os.readObject();
os.close();
fs.close();
} catch (IOException | ClassCastException | ClassNotFoundException e) {
e.printStackTrace();
}
return null;
}
@Override
protected void done() {
firePropertyChange("map", null, map);
}
}
If it were me, I would not mess with overriding FileInputStream. I think the decorator pattern might be a good fit here. The idea is that you create a decorator input stream that you pass to your ObjectInputStream. The decorator takes care of updating the progress of your read, then delegates to the real input stream.
Perhaps the easiest solution is to use CountingInputStream from Apache commons-io. The basic steps would be:
Create a subclass of CountingInputStream as a non-static inner class of your map loader
Override the afterRead method. Call super.afterRead, then publish your updated status
Pass an instance of your new decorator to the ObjectInputStream, passing the file input stream to the constructor of your decorator (see the sketch below)
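Putting those steps together, here is a minimal sketch of the question's doInBackground() (assuming commons-io 2.x is on the classpath, where CountingInputStream.afterRead(int) does not declare checked exceptions; path, map and WorldMap come from the question):
import org.apache.commons.io.input.CountingInputStream;

@Override
protected Void doInBackground() throws Exception {
    File f = new File(path);
    final long totalBytes = f.length();
    try (FileInputStream fs = new FileInputStream(f);
         CountingInputStream cis = new CountingInputStream(fs) {
             @Override
             protected void afterRead(int n) {
                 super.afterRead(n); // let commons-io update the byte count
                 if (totalBytes > 0) {
                     publish((int) (100 * getByteCount() / totalBytes)); // percentage read so far
                 }
             }
         };
         ObjectInputStream os = new ObjectInputStream(cis)) {
        map = (WorldMap) os.readObject();
    }
    return null;
}
The published values can then be consumed in process() to update the JProgressBar.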
Using RandomAccessFile you may call getFilePointer() to know how many bytes have been read.
The time-consuming operation should be executed in a background thread; remember to use SwingUtilities.invokeLater() to communicate between the background task and the GUI thread.
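A minimal sketch of that idea (progressBar and the byte-processing step are hypothetical; path comes from the question):
try (RandomAccessFile raf = new RandomAccessFile(path, "r")) {
    long total = raf.length();
    byte[] buf = new byte[8192];
    int n;
    while ((n = raf.read(buf)) != -1) {
        // ... process the bytes in buf ...
        final int percent = (int) (100 * raf.getFilePointer() / Math.max(1, total));
        SwingUtilities.invokeLater(() -> progressBar.setValue(percent)); // update the bar on the EDT
    }
}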
If you are considering overriding read() in FileInputStream, what you could legitimately do instead is make your own wrapper InputStream class that accepts a progress-monitor callback. However, you'll find that it is not as easy as implementing read(), since it is very inefficient to spend a method invocation on each byte. Instead you'll need to deal with read(byte[], int, int), which is a bit more involved.
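For illustration, a minimal sketch of such a wrapper (the ProgressListener callback interface is invented for this example):
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ProgressInputStream extends FilterInputStream {
    public interface ProgressListener { void bytesRead(long totalSoFar); }

    private final ProgressListener listener;
    private long total;

    public ProgressInputStream(InputStream in, ProgressListener listener) {
        super(in);
        this.listener = listener;
    }

    @Override
    public int read() throws IOException {
        int b = super.read();
        if (b != -1) listener.bytesRead(++total);
        return b;
    }

    @Override
    public int read(byte[] b, int off, int len) throws IOException {
        // FilterInputStream.read(byte[]) delegates here, so this covers both bulk-read variants
        int n = super.read(b, off, len);
        if (n > 0) listener.bytesRead(total += n);
        return n;
    }
}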
Hope someone can shed some light on what I'm doing wrong.
I have a DataLoader class that creates a FileInputStream. Since FileInputStream implements Closeable, I create that instance as part of the try block.
I then pass the newly created stream to a DataManager class. This class opens a file channel and reads data into a singleton class, storing all data into memory blocks. Since FileChannel also implements Closeable, I also instantiate it in a try block.
I then invoke this code from a single thread to check every now and then if there are any file changes, and when this happens, a new instance of DataLoader is created to rebuild the memory blocks. But this constantly fails due to file locking. This code is part of a Java 1.8 standard application running on Windows 10. Am I wrong to assume that both the file channel and the file input stream are closed? I added code to invoke the close method in both classes, but with no success. Any help would be greatly appreciated. Thanks in advance.
public class DataManager {
public DataManager(FileInputStream in) throws IOException {
fromInputStream(in);
}
public final void fromInputStream(FileInputStream in) throws IOException {
try (FileChannel ch = in.getChannel()) {
MappedByteBuffer mb = ch.map(MapMode.READ_ONLY, ch.position(), ch.size());
readData(mb); //reads mapped buffer into a byte array, e.g.: mb.get(barray, 0, 1000);
}
}
}
public class DataLoader {
public DataLoader(File binFile) throws FileNotFoundException, IOException {
try (FileInputStream in = new FileInputStream(binFile)) {
DataManager d = new DataManager(in);
} catch (Exception e) {
LOG.error("Something went wrong while loading data.", e);
}
}
}
As suggested in the comments, the issue stems from Windows being somewhat stringent regarding the use of FileChannel. I replaced all FileChannel-related code with InputStream and the locking behavior disappeared.
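For reference, a minimal sketch of the replacement (assuming readData can accept a plain ByteBuffer instead of a MappedByteBuffer):
public final void fromInputStream(FileInputStream in) throws IOException {
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    byte[] buf = new byte[8192];
    int n;
    while ((n = in.read(buf)) != -1) {
        bos.write(buf, 0, n); // accumulate the file contents in memory, as the mapped buffer did
    }
    readData(ByteBuffer.wrap(bos.toByteArray()));
}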
I'm trying to implement a custom handler that logs parsed LogRecord objects into a file (basically what FileHandler or StreamHandler does). My currently implementation is shown below:
public final class ErrorHandler extends Handler {
private static final String OUTPUT_FILE = ".output";
private final Formatter formatter = new CustomFormatter();
private BufferedWriter writter;
@Override
public void publish(LogRecord record) {
if (record.getLevel() == SEVERE || record.getLevel() == WARNING) {
writeToOutput(record);
}
}
void writeToOutput(LogRecord log) {
try {
if (writter == null) {
writter = new BufferedWriter(new FileWriter(OUTPUT_FILE, true));
}
writter.write(formatter.format(log));
} catch (Exception e) {
e.printStackTrace();
}
}
@Override
public void flush() {
}
@Override
public void close() {
try {
writter.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
P.S.: I know that we can achieve the same as the code above just by setting a filter and formatter on a FileHandler or StreamHandler; however, I'll need the hook points later on.
My problem is, if I leave flush() with no implementation, although the output file gets created, no log is written there. If I call writter.flush() inside flush(), the log is duplicated. Any thoughts on why this might be happening?
OK, after two days fighting against this, I came to realize that the process was running as a daemon; therefore, the handler's close() was only called when the daemon was killed. I believe this was leading to multiple calls to flush() at almost the same time. Running the process with no daemon solved the issue.
My problem is, if I leave flush() with no implementation, although the output file gets created, no log is written there.
This is because the bytes are cached in the BufferedWriter. Flush sends those bytes to the wrapped FileWriter. If you collect enough bytes it will flush to the target file, but you risk losing that information if you have some sort of process crash or disk issue.
If I call writter.flush() inside flush(), the log is duplicated. Any thoughts on why this might be happening?
Perhaps you have added two instances of this handler to the logger and both are appending to the same file. Logger.addHandler works like a List and not like a Set. Add code to print the logger tree, which will tell you how many handler instances are installed.
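A minimal sketch of such a check (the logger name is a placeholder):
java.util.logging.Logger logger = java.util.logging.Logger.getLogger("com.example.app");
for (java.util.logging.Logger l = logger; l != null; l = l.getParent()) {
    System.out.println("Logger '" + l.getName() + "' useParentHandlers=" + l.getUseParentHandlers());
    for (java.util.logging.Handler h : l.getHandlers()) {
        System.out.println("    " + h.getClass().getName()); // one line per installed handler
    }
}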
I'm sure there is no process crash or disk issue, and I believe that close() calls flush(). Yet I don't see why nothing is being logged, and it only happens when the file has not been created yet.
Close is only implicitly called when the Java virtual machine shuts down and the handler is visible from the LogManager. If the shutdown is not clean as described in the documentation then the contents of the buffered writer is not flushed.
I am programming a multi-threaded server and I have a thread for every client socket connection.
I want to pass data from the main thread back and forth to the client threads.
How many threads do I have to set up?
Does it work like this:
[Server Class:]
private PipedInputStream serverInputStream = new PipedInputStream();
private PipedOutputStream serverOutputStream = new PipedOutputStream();
public PipedInputStream clientInputStream = new PipedInputStream();
public PipedOutputStream clientOutputStream = new PipedOutputStream();
serverInputStream.connect(clientOutputStream);
clientInputStream.connect(serverOutputStream);
or do I have to set up these four streams for each client?
I apologize if this question is maybe dumb, but I have seen that as a possibility and tried it.
If there is a much better way to handle communication between threads then please educate me!
I have written a small class that waits on a BufferedReader that blocks and adds the messages to the queue:
public class DataListener implements Runnable {
private InputStream is;
private ConcurrentLinkedQueue<String> messages;
private boolean closed = false;
public DataListener(InputStream is, ConcurrentLinkedQueue<String> messages) {
this.is = is;
this.messages = messages;
}
@Override
public void run() {
BufferedReader br = new BufferedReader(new InputStreamReader(is));
while (!closed) {
try {
messages.add(br.readLine());
} catch (IOException e) {
e.printStackTrace();
}
}
}
public void close() {
closed = true;
}
}
Can you tell me if that is a good way of working around a blocking listener?
It is definitely not a good idea to use streams to pass data from one thread to another, because:
Stream IO operations are blocking, which would put the thread on hold.
Streams (in general) are not thread-safe by default, which could lead to all kinds of issues.
If you want to pass data between threads frequently, try to avoid blocking at all costs, because otherwise the server thread will be waiting most of the time. Java offers a wide tool-set of non-blocking data-exchange functionality that you can use instead, which you can find in the java.util.concurrent package.
A very simple yet effective and fast way to exchange data is to use a message-style system:
final ConcurrentLinkedQueue<MyMessageClass> messages = new ConcurrentLinkedQueue<MyMessageClass>();
public void addMessage(final MyMessageClass message) {
messages.add(message);
}
protected void serverLoop() {
//...
final MyMessageClass message = messages.poll(); // does not block, returns null if none is available
if (message != null) {
handleMessage(message);
}
//...
}
You can use other classes like the LinkedBlockingQueue for additional features like temporary blocking.
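For example, a minimal sketch of the blocking variant (same hypothetical MyMessageClass/handleMessage names as above; BlockingQueue, LinkedBlockingQueue and TimeUnit come from java.util.concurrent, and poll with a timeout throws InterruptedException, so the surrounding method must handle it):
final BlockingQueue<MyMessageClass> messages = new LinkedBlockingQueue<MyMessageClass>();

// producer side:
// messages.put(message); // only blocks if the queue was created with a capacity bound

// server loop:
MyMessageClass message = messages.poll(50, TimeUnit.MILLISECONDS); // waits up to 50 ms for a message
if (message != null) {
    handleMessage(message);
}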
However: when it comes to threading, most problems can only be solved by using the correct architecture for the problem at hand. There is no "golden hammer" solution for all threading tasks; if you want your code to run flawlessly, you need to find the appropriate solution.
You can only use any given piped stream pair between two threads, so you would need a pair of piped streams per thread pair. Don't do this. Use a queue.
It turns out that almost nobody closes resources in Java correctly. Programmers either do not use a try-finally block at all, or just put resource.close() in finally, which is also incorrect (because a Throwable from close() can shadow the Throwable from the try block). Sometimes they use something like IOUtils.closeQuietly(), which is only correct for an InputStream, but not for an OutputStream. try-with-resources solves all of these problems, but there are still a huge number of projects written in Java 6.
What is the best way to emulate try-with-resources in Java 6? Now I use Guava Closer, which is better than nothing but still much uglier than try-with-resources. Also, there is a pattern called a loan-pattern, but the absence of lambdas in Java makes this pattern very cumbersome. Is there a better way?
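For reference, the Closer pattern mentioned above looks roughly like this (a sketch; the file names are placeholders, and the enclosing method must be able to throw IOException):
Closer closer = Closer.create();
try {
    InputStream in = closer.register(new FileInputStream("in.txt"));
    OutputStream out = closer.register(new FileOutputStream("out.txt"));
    // ... copy bytes from in to out ...
} catch (Throwable t) {
    throw closer.rethrow(t); // records t so that close() cannot mask it
} finally {
    closer.close(); // closes registered resources, suppressing secondary exceptions
}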
I've found a good replacement for try-with-resources. It uses the Lombok library with annotation processing:
@Cleanup InputStream in = new FileInputStream(args[0]);
@Cleanup OutputStream out = new FileOutputStream(args[1]);
byte[] b = new byte[10000];
while (true) {
int r = in.read(b);
if (r == -1) break;
out.write(b, 0, r);
}
However, it doesn't handle exceptions correctly. This bug is more than a year old and still not closed: https://code.google.com/p/projectlombok/issues/detail?id=384
Though an anonymous class is quite verbose, it's still acceptable in Java land:
new TryWithResource<InputStream>(){
protected InputStream init() throws Exception {
return new FileInputStream("abc.txt");
}
protected void use(InputStream input) throws Exception{
input.read();
}
};
----
abstract class TryWithResource<R>
{
abstract protected R init() throws Exception;
abstract protected void use(R resource) throws Exception;
// caution: invoking virtual methods in constructor!
TryWithResource() throws Exception
{
// ... code before
R r = init();
use(r);
// ... code after
}
}
If your only problem with IOUtils.closeQuietly is that it ignores exceptions on OutputStreams, then you can either simply call close() on them, or create your own utility class which automatically treats the two differently, like this:
public static void close(Closeable resource)
{
try
{
resource.close();
}
catch(Exception e)
{
//swallow exception
}
}
public static void close(OutputStream o) throws IOException
{
//throw any exceptions
o.close();
}
The correct overloaded method will be selected at compile time in all common situations, although if you're passing OutputStreams around as Closeables then you'll have to change this to do a dynamic instanceof check to make sure OutputStreams always throw exceptions.
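A minimal sketch of that instanceof variant:
public static void close(Closeable resource) throws IOException {
    if (resource instanceof OutputStream) {
        resource.close(); // let failures from output streams propagate
    } else {
        try {
            resource.close();
        } catch (IOException e) {
            // swallow exceptions from input-side resources
        }
    }
}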
I'm doing some unit tests where essentially I need the input stream to block forever. Right now I'm using this to construct the input stream
InputStream in = new ByteArrayInputStream("".getBytes());
While it works some of the time, other times the input stream is read before the output stream (what I'm testing) is finished, causing all sorts of havoc.
Essentially I need this input stream to block forever when read. The only solution I can think of is to set up the InputStream with a massive buffer so that the other threads finish, but that's a really hackish and brittle solution. I do have Mockito, but I'm very new at it and not sure if I can get away with only mocking read without mocking anything else.
Does anyone know of a better solution?
EDIT:
This is my new attempt. It works most of the time, but other times the input thread dies early, which causes the output thread to die (that behavior is intentional). I can't figure out, though, why this sometimes fails.
This is the general test under TestNG simplified for clarity.
protected CountDownLatch inputLatch;
@BeforeMethod
public void botSetup() throws Exception {
//Setup streams for bot
PipedOutputStream out = new PipedOutputStream();
//Create an input stream that we'll kill later
inputLatch = new CountDownLatch(1);
in = new AutoCloseInputStream(new ByteArrayInputStream("".getBytes()) {
@Override
public synchronized int read() {
try {
//Block until were killed
inputLatch.await();
} catch (InterruptedException ex) {
//Wrap in a RuntimeException so whatever was using this fails
throw new RuntimeException("Interrupted while waiting for input", ex);
}
//No more input
return -1;
}
});
Socket socket = mock(Socket.class);
when(socket.getInputStream()).thenReturn(in);
when(socket.getOutputStream()).thenReturn(out);
//Setup ability to read from bots output
botOut = new BufferedReader(new InputStreamReader(new PipedInputStream(out)));
...
}
@AfterMethod
public void cleanUp() {
inputLatch.countDown();
bot.dispose();
}
For the test I use readLine() from botOut to get the appropriate number of lines. The issue though is that when the output thread dies, readLine() blocks forever, which hangs up TestNG. I've tried a timeout with mixed results: most of the time it would work, but other times it would kill tests that just took a little longer than normal.
My only other option is to just not use streams for this kind of work. The output thread relies on an output queue, so I could just run off of that. The issue though is that I'm not actually testing writing to the stream, just what is going to be sent, which does bother me.
Mockito is great - I am personally a huge fan!
With Mockito, you can do something like the code below. You basically set up a stream mock, and you tell it to sleep for a very long time when the "read" method is invoked on it. You can then pass this mock into the code you want to test when the stream hangs.
import static org.mockito.Mockito.*;
//...
@Test
public void testMockitoSleepOnInputStreamRead() throws Exception{
InputStream is = mock(InputStream.class);
when(is.read()).thenAnswer(new Answer<Integer>() {
@Override
public Integer answer(InvocationOnMock invocation) {
try {
Thread.sleep(10000000000L);
return -1; // unreachable in practice, but keeps the stub type-correct for int read()
} catch (InterruptedException ie) {
throw new RuntimeException(ie);
}
}
});
//then use this input stream for your testing.
}
I'd make an InputStream that, when read() is called, does a wait() on something that's held locked until you're done with the rest of the test. Subclass FilterInputStream to get everything else for free.
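A minimal sketch of that idea, using a CountDownLatch rather than raw wait()/notify() (the latch would be counted down from the test's tear-down):
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InterruptedIOException;
import java.util.concurrent.CountDownLatch;

class BlockedInputStream extends FilterInputStream {
    private final CountDownLatch released;

    BlockedInputStream(InputStream in, CountDownLatch released) {
        super(in);
        this.released = released;
    }

    @Override
    public int read() throws IOException {
        awaitRelease();
        return -1; // once released, report end of stream
    }

    @Override
    public int read(byte[] b, int off, int len) throws IOException {
        awaitRelease(); // bulk reads must block too, since readers rarely call read() directly
        return -1;
    }

    private void awaitRelease() throws IOException {
        try {
            released.await(); // block until the test counts the latch down
        } catch (InterruptedException e) {
            throw new InterruptedIOException("interrupted while blocking read");
        }
    }
}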
There doesn't seem to be any reliable way to do this. My code in the question only works sometimes, @Moe's doesn't work at all, @Ed's suggestion is what I was originally doing, and @SJuan's is sort of what I'm already doing.
There just seems to be too much stuff going on. The input stream I give to the class is wrapped in an InputStreamReader, then a BufferedReader. Suggestions for other streams inside of other streams just further complicate the issue.
To fix the problem I did what I should have done originally: create a factory method for the InputThread (the thread that actually does the reading), then override it in my testing. Simple, effective, and 100% reliable.
I suggest that anyone who runs into this problem first try to override the part of the program that does the reading. If you can't, then the code I posted is the only semi-reliable approach that worked in my situation.
Then you need another InputStream flavour. read() blocks when no more bytes are available, but with a ByteArrayInputStream the bytes are always available until the end of the stream is reached.
I would extend ByteArrayInputStream by changing read() so it checks a certain boolean flag (if true, read; if false, wait a second and loop). Then flip that flag from your unit-test code when the time is right.
Hope that helps
I created a helper class that extends ByteArrayInputStream for my unit tests. It pipes a given byte[] through, but at the end of the stream, instead of returning -1, it waits until close() is called. If ten seconds pass it gives up and throws an exception.
If you want it to close earlier you can call latch.countDown() yourself.
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
public class BlockingByteArrayInputStream extends ByteArrayInputStream {
private CountDownLatch latch;
public BlockingByteArrayInputStream(byte[] buf) {
super(buf);
latch = new CountDownLatch(1);
}
@Override
public synchronized int read() {
int read = super.read();
if (read == -1) {
waitForUnblock();
}
return read;
}
@Override
public int read(byte[] b) throws IOException {
int read = super.read(b);
if (read == -1) {
waitForUnblock();
}
return read;
}
@Override
public synchronized int read(byte[] b, int off, int len) {
int read = super.read(b, off, len);
if (read == -1) {
waitForUnblock();
}
return read;
}
private void waitForUnblock() {
try {
if (!latch.await(10, TimeUnit.SECONDS)) {
throw new RuntimeException("timed out waiting for close()");
}
} catch (InterruptedException e) {
throw new RuntimeException("interrupted while waiting for close()", e);
}
}
@Override
public void close() throws IOException {
super.close();
latch.countDown();
}
}