I need to check whether a position in a random access file has been written to. The problem is that when the position actually hasn't been written to, I get (as anticipated) an EOFException. I have been reading the RandomAccessFile documentation and researching online to try to solve this.
Things I've tried:
Using a try-catch block and catching every time there is an EOFException (using try-catch as a conditional statement). It works, but it is horrible practice and very inefficient, since in my case the position is past EOF the majority of the time.
Using a BufferedReader to loop through and check the position. I ran into many problems and decided there must be a better way.
I don't want to copy one file over to another or use any other workaround. I know there has to be a direct way of doing this; I just can't seem to find the correct solution.
Are you trying to write a "tailer"?
What you need to do is have one thread which reads only the data that is actually there, using FileChannel.size() to check for more data, and passes it to a piped output stream. This lets a second thread read the piped input stream continuously, e.g. with a BufferedReader, blocking wherever more data is needed.
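A minimal sketch of that pattern, assuming a line-oriented text file (the file name, buffer size, and poll interval are placeholders):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class Tailer {
    public static void main(String[] args) throws Exception {
        Path file = Paths.get("myFile.log"); // placeholder file name

        PipedOutputStream pipeOut = new PipedOutputStream();
        PipedInputStream pipeIn = new PipedInputStream(pipeOut);

        // Second thread: reads the piped stream continuously and blocks
        // whenever more data is needed.
        Thread consumer = new Thread(() -> {
            try (BufferedReader reader = new BufferedReader(new InputStreamReader(pipeIn))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println("tail: " + line);
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        });
        consumer.start();

        // First thread (here, main): only read bytes that are actually in the
        // file, using FileChannel.size() to detect that more data has arrived.
        try (FileChannel channel = FileChannel.open(file, StandardOpenOption.READ)) {
            long position = 0;
            ByteBuffer buffer = ByteBuffer.allocate(8192);
            while (true) {
                long size = channel.size();
                while (position < size) {
                    buffer.clear();
                    int read = channel.read(buffer, position);
                    if (read <= 0) {
                        break;
                    }
                    pipeOut.write(buffer.array(), 0, read);
                    position += read;
                }
                pipeOut.flush();
                Thread.sleep(250); // poll interval, tune as needed
            }
        }
    }
}
```

Because the consumer only ever sees bytes that were already in the file, it never hits EOF; it simply blocks on the pipe until the poller delivers more data.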
Related
The current documentation for StandardOpenOption.APPEND says:
If the file is opened for WRITE access then bytes will be written to the end of the file rather than the beginning.
However, I can't seem to find any further information on how this works internally.
My use case involves appending data to a huge file. I currently use a BufferedWriter, but my understanding is that if I have some way to maintain a pointer to the end of the file, I can append to it easily, without first traversing from the start of the file to the end.
So, my question is: does StandardOpenOption.APPEND actually work in a similar way, or does it also internally move to the end of the file and then perform the append?
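For reference, a minimal sketch of opening a file with APPEND (the file name is a placeholder). The option instructs the implementation to direct every write to the current end of the file; on many platforms this maps to an OS-level append mode rather than an explicit seek per write, but that is an implementation detail. Either way, appending never requires traversing the file contents, since the file system already tracks the file's length.

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class AppendExample {
    public static void main(String[] args) throws IOException {
        Path file = Paths.get("huge.log"); // placeholder file name

        // Open the file in append mode; writes go to the current end of the
        // file without the application having to position a pointer itself.
        try (BufferedWriter writer = Files.newBufferedWriter(
                file, StandardCharsets.UTF_8,
                StandardOpenOption.CREATE, StandardOpenOption.WRITE,
                StandardOpenOption.APPEND)) {
            writer.write("another record");
            writer.newLine();
        }
    }
}
```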
I have a program at the moment that links multiple SQL statements together in the form of a mapping. I then write this data to a file using FileWriter.
I was wondering: is it better to store the information from my SQL statements in a string buffer and then write it directly to the file, or would it be best to write to the file line by line as needed?
I actually already write to the file using one string buffer, but I encountered a problem: the program would crash once the string buffer collected a value exceeding around 125000 characters.
In my opinion, I always like to build the string I'm going to save in a StringBuffer first and then store it in a file, so that I only have to write it out once. That usually leads to performance gains for me.
Another benefit is that there are fewer points where a critical error can occur, and therefore fewer potential points of failure; with the StringBuffer approach there is usually only one point of failure.
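A minimal sketch contrasting the two approaches (the method and variable names are illustrative, not from the original program). Note that a BufferedWriter already buffers output, so the line-by-line variant keeps memory use flat for very large result sets, which matters if building one huge in-memory string is what caused the crash.

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class SqlResultWriter {
    // Option 1: collect the mapped SQL output in memory, then write it
    // to the file in a single operation.
    static void writeAllAtOnce(List<String> rows, Path target) throws IOException {
        StringBuilder sb = new StringBuilder(); // StringBuilder is the unsynchronized StringBuffer
        for (String row : rows) {
            sb.append(row).append(System.lineSeparator());
        }
        try (BufferedWriter writer = Files.newBufferedWriter(target, StandardCharsets.UTF_8)) {
            writer.write(sb.toString());
        }
    }

    // Option 2: stream each row through a BufferedWriter as it is produced,
    // so the whole result never has to fit in memory at once.
    static void writeLineByLine(List<String> rows, Path target) throws IOException {
        try (BufferedWriter writer = Files.newBufferedWriter(target, StandardCharsets.UTF_8)) {
            for (String row : rows) {
                writer.write(row);
                writer.newLine();
            }
        }
    }
}
```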
I'm working with a library to which I have to provide an InputStream and a PrintStream. It uses the InputStream to gather data for processing and the PrintStream to provide results. I'm stuck using this library and its API cannot be altered.
There are two issues with this that I think have related solutions.
First, the data that needs to be read via the InputStream is not available up front. Instead, the data is dynamically created by a different part of the application and given to my code as a String via a method call. My code's job is to let the library read this data through the InputStream I provide, as I receive it.
Second, I need to somehow get the result that is written to the PrintStream and send it to another part of the application as a String. This needs to happen as soon as possible after the data is written to the PrintStream.
What it looks like I need are two stream objects that behave more or less like buffers: an InputStream that I can shove data into whenever I have it, and a PrintStream whose contents I can grab whenever it has some. This seems a little awkward to me, but I'm not sure how else to do it.
I'm wondering if anything already exists that allows this kind of behavior or if there is a different (better) solution that will work in the situation I've described. The only thing I can come up with is to try to implement streams with this behavior, but that can become complicated fast (especially since the InputStream needs to block until data is available).
Any ideas?
Edit: To be clear, I'm not writing the library. I'm writing code that is supposed to provide the library with an InputStream to read data from and a PrintStream to write data to.
Looks like both streams need to be constantly reading/writing, so you'll need two threads independent of each other. The pattern resembles JMS a little bit: you feed information to a "queue" or "topic", wait for it to be processed, and then it is placed on an "output" queue/topic. This may introduce additional moving parts, but you could write a simple client to place info onto a JMS queue, have a listener grab messages and feed them to the input stream continuously, and then another piece of code to read from the output stream and do what you need with it.
Hope this helps.
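If pulling in JMS is more machinery than you want, the standard library's piped streams already give you the two buffer-like endpoints described in the question: a blocking InputStream you can push data into, and a PrintStream whose output you can drain on another thread. A rough sketch (the class, method, and callback names are hypothetical):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.io.PrintStream;
import java.nio.charset.StandardCharsets;

public class LibraryBridge {
    private final PipedOutputStream toLibrary = new PipedOutputStream();
    private final PipedInputStream libraryInput;   // handed to the library
    private final PipedInputStream fromLibrary = new PipedInputStream();
    private final PrintStream libraryOutput;       // handed to the library

    public LibraryBridge() throws IOException {
        libraryInput = new PipedInputStream(toLibrary);
        libraryOutput = new PrintStream(new PipedOutputStream(fromLibrary), true);
    }

    public InputStream getLibraryInput()  { return libraryInput; }
    public PrintStream getLibraryOutput() { return libraryOutput; }

    // Called by the rest of the application whenever a new chunk of data arrives.
    public void pushData(String data) throws IOException {
        toLibrary.write(data.getBytes(StandardCharsets.UTF_8));
        toLibrary.flush(); // the library's blocked reads resume once data is flushed
    }

    // Run this on its own thread to forward results as soon as they appear.
    public void forwardResults(ResultHandler handler) throws IOException {
        BufferedReader reader = new BufferedReader(new InputStreamReader(fromLibrary));
        String line;
        while ((line = reader.readLine()) != null) {
            handler.onResult(line); // hypothetical callback into the application
        }
    }

    public interface ResultHandler {
        void onResult(String result);
    }
}
```

The library's reads on the InputStream block until pushData() supplies bytes, which is exactly the behavior the question says would be complicated to implement by hand.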
I want to write a Java program that supports Unix pipelines. The problem is that my input files are images, and I need some way to separate them from one another.
I thought this would not be a problem, since I can read from an InputStream using ImageIO.read() without resetting the position. But it isn't that simple: ImageIO.read() closes the stream every time an image is read, so I can't read more than one file from stdin. Do you have a solution for this?
The API for read() mentions, "This method does not close the provided InputStream after the read operation has completed; it is the responsibility of the caller to close the stream, if desired." You might also check the result for null and verify that a suitable ImageReader is available.
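Under that assumption, a minimal sketch of reading images from stdin in a loop (whether consecutive images can be decoded cleanly from one stream also depends on each format's reader not buffering past the end of its own image):

```java
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.IOException;

public class PipedImages {
    public static void main(String[] args) throws IOException {
        // ImageIO.read() does not close the provided stream, so System.in
        // can be reused for the next image.
        while (true) {
            BufferedImage img = ImageIO.read(System.in);
            if (img == null) {
                // null means no registered ImageReader could decode the
                // remaining data (or the stream is exhausted).
                break;
            }
            System.err.println("Read image " + img.getWidth() + "x" + img.getHeight());
        }
    }
}
```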
I got the following exception when trying to seek in a file.
Error while seeking to 38128 in myFile, File length: 85742
java.io.EOFException
        at java.io.RandomAccessFile.readInt(RandomAccessFile.java:725)
        at java.io.RandomAccessFile.readLong(RandomAccessFile.java:758)
But as you can see, I am trying to seek to 38128 whereas the file length is 85742, yet it reported an EOFException. I wonder how that is possible? Another process appends content to that file periodically and then closes the file handle; it appends using a DataOutputStream. My process seeks to certain locations and reads from them. One more thing: I got this exception only once. I tried to reproduce it, but it never happened again. The file is on a local disk only, not a network filer.
Thanks
D. L. Kumar
I would be very careful when trying to do random access on a file that is concurrently being written to from another process. It might lead to all kinds of strange synchronisation problems, as you are experiencing right now.
Do you determine the length of the file from the same process as the one doing the seek()? Has the other, modifying process done a flush()?
The process writing the data may have been told to write it, but the data could still be sitting in a buffer. Be sure to call flush() on the output stream before attempting to read the data.
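A sketch of that idea, assuming the appender writes long values with a DataOutputStream (the method names and record layout are illustrative): flush on the writer side, and re-check the length on the reader side immediately before every read rather than trusting an earlier measurement.

```java
import java.io.BufferedOutputStream;
import java.io.DataOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.RandomAccessFile;

public class WriterReaderSketch {
    // Writer process: flush so the buffered bytes actually reach the file
    // before any reader looks at it.
    static void appendRecord(String path, long value) throws IOException {
        try (DataOutputStream out = new DataOutputStream(
                new BufferedOutputStream(new FileOutputStream(path, true)))) {
            out.writeLong(value);
            out.flush();
        }
    }

    // Reader process: verify the whole record fits within the current length
    // of the same RandomAccessFile right before seeking and reading.
    static long readRecordAt(String path, long position) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile(path, "r")) {
            if (position + Long.BYTES > raf.length()) {
                throw new IOException("Record at " + position + " not fully written yet");
            }
            raf.seek(position);
            return raf.readLong();
        }
    }
}
```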