Write strings to the bottom line of a text file - Java

I want to write strings to a text file, each time appending to the bottom of the file. Then, if I search the file for a certain string and find it, I want to replace that line with another.
My idea is this: count the rows in the text file, add 1, and write the string at that index. But is it even possible to write to a certain line number in a text file?
And how would I update a certain row to another string?
Thanks!

You do not want to do that: it is a recipe for disaster. If a write fails while you are modifying the file in place, the original file will be corrupted.
Use a double-write protocol: write the modified content to another file, and only if that write succeeds, rename the new file over the original.
Provided your file is not too big, for some definition of "big", I'd recommend building a List<String> for the destination file: read the original file line by line and add each line to that list; once the list processing is complete (your question is unclear about what exactly should happen), write each String to the other file, flush and close, and if that succeeds, rename it to the original.
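A minimal sketch of that double-write protocol (the class and method names here are my own, not from the question): read all lines, replace the matching one or append at the bottom, write everything to a temporary sibling file, and rename only if the write succeeded.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.util.ArrayList;
import java.util.List;

public class SafeRewrite {
    // Replace every line equal to `target` with `replacement`, or append
    // `replacement` at the bottom if no line matched. The original file is
    // only touched by the final rename, so a failed write cannot corrupt it.
    static void replaceOrAppend(Path file, String target, String replacement)
            throws IOException {
        List<String> out = new ArrayList<>();
        boolean found = false;
        for (String line : Files.readAllLines(file)) {
            if (line.equals(target)) {
                out.add(replacement);
                found = true;
            } else {
                out.add(line);
            }
        }
        if (!found) {
            out.add(replacement);               // append to the bottom
        }
        Path tmp = file.resolveSibling(file.getFileName() + ".tmp");
        Files.write(tmp, out);                  // write the new content elsewhere
        Files.move(tmp, file,                   // swap in only on success
                StandardCopyOption.REPLACE_EXISTING);
    }

    public static void main(String[] args) throws IOException {
        Path f = Files.createTempFile("demo", ".txt");
        Files.write(f, List.of("a", "b", "c"));
        replaceOrAppend(f, "b", "B");
        System.out.println(Files.readAllLines(f)); // [a, B, c]
    }
}
```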

If you want to append strings, FileOutputStream has an alternate constructor taking a boolean that you can set to true to open the file for appending.
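For example (a small sketch; the file path and method name are my own):

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.nio.charset.StandardCharsets;

public class AppendDemo {
    // The second constructor argument `true` opens the file in append
    // mode, so each write lands after the existing content.
    static void appendLine(String path, String line) throws IOException {
        try (Writer w = new OutputStreamWriter(
                new FileOutputStream(path, true), StandardCharsets.UTF_8)) {
            w.write(line);
            w.write(System.lineSeparator());
        }
    }

    public static void main(String[] args) throws IOException {
        String path = java.io.File.createTempFile("append", ".txt").getPath();
        appendLine(path, "first");
        appendLine(path, "second"); // ends up below "first"
    }
}
```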
If you'd like, say, to replace strings in a file without copying it, your best bet is to rely on RandomAccessFile instead. However, if the line length varies, this is unreliable. For fixed-length records, it works like this:
Move to the offset
Write
You can also truncate the file (via setLength), so if there is a trailing block you need to get rid of, you can discard it that way.
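For the fixed-length case, the seek-and-write steps might look like this (the record size and helper names are my own assumptions):

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.charset.StandardCharsets;

public class FixedRecords {
    static final int RECORD_LEN = 8; // assumed fixed record size in bytes

    // Overwrite record `index` in place: move to its offset, then write.
    static void writeRecord(RandomAccessFile raf, int index, String value)
            throws IOException {
        byte[] bytes = String.format("%-" + RECORD_LEN + "s", value)
                .getBytes(StandardCharsets.US_ASCII); // pad to record length
        raf.seek((long) index * RECORD_LEN);          // 1. move to the offset
        raf.write(bytes, 0, RECORD_LEN);              // 2. write
    }

    static String readRecord(RandomAccessFile raf, int index) throws IOException {
        byte[] bytes = new byte[RECORD_LEN];
        raf.seek((long) index * RECORD_LEN);
        raf.readFully(bytes);
        return new String(bytes, StandardCharsets.US_ASCII).trim();
    }
}
```

This only works because every record occupies exactly RECORD_LEN bytes; with variable-length lines the offsets shift and in-place overwrites clobber neighbouring data.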
A third solution would be to rely on mmap, i.e. a memory-mapped ByteBuffer covering the whole file. I'm not vouching for the overall feasibility of the solution (it works in plain C), but it actually 'looks' like the most correct one when you consider both the Java platform and the operating system.

Related

Replace text in a file using Stream - Java 8

I'm trying to write an API to replace all the lines containing a certain substring with a different string in a text file.
I'm using a Java 8 stream to filter the lines which contain the given pattern. I'm having a problem with the file-write part.
Files.lines(targetFile).filter(line -> line.contains(plainTextPattern)).parallel()
        .map(line -> line.replaceAll(plainTextPattern, replaceWith)).parallel();
The above code reads the file line by line, filters the lines that match the pattern, replaces the pattern with the given string, and gives back a stream of strings containing only the replaced lines.
We need to write these lines back to file. Since we lose the stream once the pipeline ends, I appended the following to the pipeline:
.forEach(line -> {
    try {
        Files.write(targetFile, line.toString().getBytes());
    } catch (IOException e) {
        e.printStackTrace();
    }
});
I was hoping it would write to the file only the modified (since it is in the pipeline) line and keep the other lines untouched.
But it seems to truncate the file for each line, keeping only the last processed line and deleting all the lines that did not match in the pipeline.
Is there something I’m missing about handling files using streams?
Using filter eliminates anything that doesn't match the filter from the stream. (Additionally, for what it's worth, a) you only need to use parallel once, b) parallel isn't that effective on streams coming from I/O sources, c) it's almost never a good idea to use parallel until you've actually tried it non-parallel and found it too slow.)
That said: there's no need to filter out the lines that match the pattern if you're going to do a replaceAll. Your code should look like this:
try (Stream<String> lines = Files.lines(targetFile)) {
    List<String> replaced = lines
            .map(line -> line.replaceAll(plainTextPattern, replaceWith))
            .collect(Collectors.toList());
    Files.write(targetFile, replaced);
}
Sorry to tell you, but this is not how files work. If you want to write to the middle of a file, you need random access: open a RandomAccessFile, seek to the desired file pointer, and write from there.
This holds only if the size of the data you want to write equals the size of the data you are overwriting. If not, you have to copy the tail of the file to a temporary buffer and append it after the text you wish to write.
And by the way, parallel streams on I/O-bound tasks are often a bad idea.
You might want to implement a stream, like Jenkov has done here:
http://tutorials.jenkov.com/java-howto/replace-strings-in-streams-arrays-files.html
This simple one specifically replaces tokens of the form ${tokenName}.
There are more general algorithms.

Skip empty lines while CSV parsing

I'm currently working on pulling a CSV file from a URL and modifying its entries. I'm using a StreamReader to read each line of the CSV and split it into an array, where I can modify each entry based on its position.
The CSV is generated by an e-form provider where a particular form entry is a multi-line field in which a user can add multiple notes. When a user enters a new note, the notes are separated by a line return.
CSV Example:
"FName","LName","Email","Note 1: some text
Note 2: some text"
Since my code splits each CSV entry by line, once it reaches these notes it treats them as new CSV entries. This causes my entry-modifying code to break, since the element positions become incorrect. (CSV entries with empty or single-line note fields work fine.)
Any ideas on the best approach to take for this? I've tried adding code to replace carriage returns or to skip empty lines but it doesn't seem to help.
You can check whether the first column value in a row is null or not. If it is null, continue to the next line.
Assuming the CSV example you provided is supposed to be just one entry in the CSV file (with the last field spanning several lines due to newline breaks), you could try something like this, using two loops.
Keep a variable for the current CSV record (of String[] type), currentRecord, and a recordList (a List or an array) to hold all the CSV records.
Read a line of the CSV file.
Split it into an array of strings using the comma as the delimiter. Keep this array in a temporary variable.
If the size of this array is 1, append this string to the last (4th) element of currentRecord (if currentRecord is not null).
Keep reading lines off the CSV file, repeating the previous step, until the array size is 4.
If the size is 4, this indicates that the record is the next record in the CSV file, and you can add currentRecord to recordList.
Keep repeating these steps until you reach the end of the CSV file.
It would be better if you could remove the line breaks in the field and clean the CSV file before parsing it, though. It would make things much simpler.
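The two-loop idea above can be sketched like this. It is a naive sketch: it assumes the fields themselves never contain embedded commas, which is exactly the kind of edge case a real CSV library would handle for you.

```java
import java.util.ArrayList;
import java.util.List;

public class MultiLineCsv {
    static final int FIELDS = 4; // FName, LName, Email, Note

    // Naive sketch: splits on commas only, so it breaks if a field
    // itself contains a comma. Lines with fewer parts than FIELDS are
    // treated as continuations of the previous record's note field.
    static List<String[]> parse(List<String> lines) {
        List<String[]> records = new ArrayList<>();
        String[] current = null;
        for (String line : lines) {
            String[] parts = line.split(",");
            if (parts.length == FIELDS) {      // a new record starts here
                current = parts;
                records.add(current);
            } else if (current != null) {      // continuation of the note field
                current[FIELDS - 1] += "\n" + line;
            }
        }
        return records;
    }
}
```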
Use a proper CSV library to handle the writing and parsing. There are a few edge cases to handle here, not only the newline. Users could also insert commas or quotes in their notes, and it will become very messy to handle this yourself.
Try uniVocity-parsers as it can handle all sorts of situations when parsing and writing CSV.
Disclosure: I am the author of this library. It's open-source and free (Apache V2.0 license).

Sorting an Array with .txt

Quick enquiry..
I have created an array that will be populated by a scanner reading information from a .txt file. The .txt file has a specific structure:
<job role> <years of experience> <name>
(this is an example). This will be input by more than one person, so there will be multiple of these lines in the text file. So, I now need to find a way to gather them into an array in an ordered structure. The order should be based on the first letter of the job role, alphabetically. I was thinking about implementing a comparator; would this be possible/efficient to do?
So my idea would be to use a comparator on the job role specifically and compare it with all the other job role entries.
Thanks, and sorry if it's brief or not very clear; I found it difficult to describe the situation.
In order to accurately read the text file, you need a delimiter: a character that tells the computer you are moving to the next data entry.
I recommend the following: for a new person, use a new line (\n) as the delimiter. To tell the fields job, years, and name apart, use a comma (","), as the data set does not appear to require that character.
Once you have decided on a delimiter, I recommend looking into the classes File, FileInputStream, and FileOutputStream. These classes provide streams for reading from and writing to files saved on the computer.
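And yes, a comparator on the job role works fine for the sorting part. A sketch under the comma-delimiter assumption above (the class and method names are my own):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class JobSort {
    // Each line is assumed to look like "jobRole,yearsOfExperience,name".
    static List<String[]> sortByJobRole(List<String> lines) {
        List<String[]> entries = new ArrayList<>();
        for (String line : lines) {
            entries.add(line.split(","));
        }
        // Order by the first field (job role), ignoring case.
        entries.sort(Comparator.comparing(
                (String[] e) -> e[0], String.CASE_INSENSITIVE_ORDER));
        return entries;
    }
}
```

In practice you would read the lines with a Scanner or BufferedReader and feed them to a method like this; the comparator itself is cheap, and sorting is O(n log n) regardless.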

Most elegant way to read a file and operating on lines as bytes

I have a database dump file I need to operate on in raw form. I need to read the file in, operating on it line by line, but I can't have the whole file in memory (the files can theoretically exceed 10 GB).
I want to be able to read it and operate on each line individually as I go, until the end of the file. It has to be friendly to unusual characters (lines can contain all sorts of bytes).
You could adapt the old NIO grep example and remove the pattern matching if you don't need it.
If the line break itself doesn't interest you, you can use BufferedReader#readLine() and convert the string back to a byte[].
The other way is to use a byte[] as a buffer (it has to be large enough for a line) and use InputStream#read(byte[]) to fill it with bytes. Then you can search the buffer for line feeds and work with that part of the buffer. Once you find no more line feeds, move the remaining data to the left via System#arraycopy(), fill the rest with new data through InputStream#read(byte[], int, int), and go on.
But be careful: depending on the encoding (e.g. Unicode), one byte does not necessarily equal one character.
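The buffer-shifting approach can be sketched like this (assuming no line is longer than the buffer, and treating '\n' as the separator; the byte-level line handling is reduced to counting for brevity):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ByteLines {
    // Scan a stream line by line without ever decoding to String.
    // Assumption: bufSize is larger than the longest line in the input.
    static int countLines(InputStream in, int bufSize) throws IOException {
        byte[] buf = new byte[bufSize];
        int filled = 0, lines = 0, n;
        while ((n = in.read(buf, filled, buf.length - filled)) != -1) {
            filled += n;
            int start = 0;
            for (int i = 0; i < filled; i++) {
                if (buf[i] == '\n') {          // found a line feed
                    // buf[start..i) is one raw line; operate on it here
                    lines++;
                    start = i + 1;
                }
            }
            // shift the unfinished tail to the front of the buffer
            System.arraycopy(buf, start, buf, 0, filled - start);
            filled -= start;
        }
        if (filled > 0) lines++;               // last line without a trailing \n
        return lines;
    }

    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream("a\nb\nc".getBytes());
        System.out.println(countLines(in, 16)); // 3
    }
}
```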

Java File Splitting

What would be the most efficient way to split a file in Java?
I'd like to get it grid-ready...
(Edit)
Modifying the question.
Basically, after scouring the net, I understand that there are generally two methods for file splitting:
Just split them by the number of bytes
I guess the advantage of this method is that it is fast, but say all of my data is on one line, and the split puts half of the data in one chunk and the other half in another; what do I do then?
Read them line by line
This will keep my data intact, which is fine, but I suppose it isn't as fast as the method above.
Well, just read the file line by line and save the lines to a new file. Then, when you decide it's time to split, start saving the lines to a new place.
Don't worry about efficiency too much unless it becomes a real problem later.
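That line-by-line splitting might look like this (chunk naming and method names are my own):

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class LineSplitter {
    // Split `input` into numbered chunk files (input.1, input.2, ...) of at
    // most `linesPerChunk` lines each, never cutting a line in half.
    static int split(Path input, int linesPerChunk) throws IOException {
        int chunk = 0, count = 0;
        BufferedWriter out = null;
        try (BufferedReader in = Files.newBufferedReader(input)) {
            String line;
            while ((line = in.readLine()) != null) {
                if (out == null || count == linesPerChunk) {
                    if (out != null) out.close();   // finish the current chunk
                    chunk++;
                    out = Files.newBufferedWriter(
                            input.resolveSibling(input.getFileName() + "." + chunk));
                    count = 0;
                }
                out.write(line);
                out.newLine();
                count++;
            }
        } finally {
            if (out != null) out.close();
        }
        return chunk; // number of chunk files created
    }
}
```

Splitting on line boundaries like this avoids the half-a-line problem of byte-count splitting, at the cost of scanning every byte for separators.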
My first impression is that you have something like a comma-separated value (CSV) file. The usual way to read/parse those files is to:
read them line by line
skip headers and empty lines
use String#split(String regex) to split a line into values (the regex is chosen to match the delimiter)
