I have a method to write data to a file.
public void writeCSFFileData(List<String> fileData) {
    try {
        CsvListWriter csvWriter = new CsvListWriter(new FileWriter("/path/file.csv"), CsvPreference.STANDARD_PREFERENCE);
        csvWriter.write(fileData);
        csvWriter.close();
    } catch (Exception e) {
        SimpleLogger.getInstance().writeError(e);
    }
}
The above method is called several times to write to the same file, but each time the file is overwritten instead of appended to.
Thanks in advance.
I found the solution myself: I just need to pass true to the FileWriter constructor to append the data.
Ex: new FileWriter("/path/file.csv", true)
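For completeness, here is a sketch of the corrected method with the append flag and try-with-resources (SimpleLogger is the question's own logging class, and the path is the placeholder from the question):
import java.io.FileWriter;
import java.io.IOException;
import java.util.List;
import org.supercsv.io.CsvListWriter;
import org.supercsv.prefs.CsvPreference;

public void writeCSFFileData(List<String> fileData) {
    // Passing true as the second FileWriter argument opens the file in append mode,
    // so each call adds a new row instead of replacing the file.
    try (CsvListWriter csvWriter = new CsvListWriter(
            new FileWriter("/path/file.csv", true), CsvPreference.STANDARD_PREFERENCE)) {
        csvWriter.write(fileData);
    } catch (IOException e) {
        SimpleLogger.getInstance().writeError(e);
    }
}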
I have been given an assignment where we are not allowed to use a DB or libraries, only a text file for data storage.
But it has rather complex requirements, e.g. many validations, so we need to "access the db" (i.e. read the text file) many times.
My question is: should I create a class like this:
class SomeRepository {
    static ArrayList<Users> users = new ArrayList<>();

    public SomeRepository() {
        // Instantiated on program load.
        // In the constructor, we read the text file, instantiate the objects and store everything in the ArrayList.
    }

    // public getOneUser() { ... }   // for get methods, we don't read from the text file at all
    // public save() { ... }         // text-file saving code over here
}
Is this a good approach to solve the above problem? Currently, what we are doing is reading and writing to the text file every time we want to retrieve some data or write something new.
Wouldn't this be too expensive in terms of heap space memory? Or should I just read/write to the text file for every method?
public class IOManager {

    public static void writeObjToTxtFile(String fileName, Object object) {
        File file = new File(fileName + ".txt"); // the file is created in the directory the program runs from
        try (FileOutputStream fos = new FileOutputStream(file);
             ObjectOutputStream oos = new ObjectOutputStream(fos)) {
            oos.writeObject(object);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static Object readObjFromTxtFile(String fileName) {
        Object obj = null;
        File file = new File(fileName + ".txt");
        try (FileInputStream fis = new FileInputStream(file);
             ObjectInputStream ois = new ObjectInputStream(fis)) {
            obj = ois.readObject();
        } catch (ClassNotFoundException | IOException e) {
            e.printStackTrace();
        }
        return obj;
    }
}
Add this class to your project. Since it is generic for all Objects, you can pass and receive objects such as ArrayList<Users> as well. Tinker with it to fit your specific purpose. Hint: you can write other custom methods that call these methods, e.g.:
public static void writeUsersToFile(ArrayList<Users> usersArrayList){
writeObjToTxtFile("users",usersArrayList);
}
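For example, a matching read helper (hypothetical; the unchecked cast assumes the file was written by writeUsersToFile):
@SuppressWarnings("unchecked")
public static ArrayList<Users> readUsersFromFile() {
    return (ArrayList<Users>) readObjFromTxtFile("users");
}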
P.S. Make sure your objects implement Serializable, e.g.:
public class Users implements Serializable {
}
I would suggest reading the contents of your file into a dynamic list such as an ArrayList at the start of your program. Make the required queries/changes against your ArrayList and then write that ArrayList back to your file when the program is about to close. This will save significant time over repeated file reads/writes.
This isn't without its drawbacks, though. You don't want to hog memory in the case of very large files - but since this is an assignment, that may not be an issue. Additionally, should your program terminate before the final write, all changes made to your database during the current execution will be lost.
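A rough sketch of that pattern (the Users.fromLine/toLine helpers and the users.txt path are made up for illustration, assuming one record per line):
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;

public class UserRepository {
    private final Path file = Paths.get("users.txt"); // hypothetical storage file
    private final List<Users> users = new ArrayList<>();

    // Load everything once at program start.
    public void load() throws IOException {
        for (String line : Files.readAllLines(file)) {
            users.add(Users.fromLine(line)); // assumes a parsing factory method on Users
        }
    }

    // Queries and updates only touch the in-memory list.
    public List<Users> getUsers() {
        return users;
    }

    // Write everything back once, e.g. just before the program exits.
    public void save() throws IOException {
        List<String> lines = users.stream().map(Users::toLine).collect(Collectors.toList());
        Files.write(file, lines);
    }
}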
I have a list of objects that has some simple String properties. I want to be able to save those strings to binary so that when you open the file outside the program, you only see 1's and 0's.
I have managed to use FileOutputStream and saved the strings; however, I can't manage to get it to write in binary. The file reads as clean, readable text. I have tried getBytes().
What would be the best approach for this? Keep in mind that I want to be able to read the file later and construct back the objects. Would it be better to use Serializable and save a list of objects?
Here is my FileWriter:
NB: The toString() is custom and returns a String with linebreaks for every property.
public class FileWriter {
    public void write(String fileName, Savable objectToSave) throws IOException {
        File fileToSave = new File(fileName);
        String stringToSave = objectToSave.toString();
        byte[] bytesToSave = stringToSave.getBytes(StandardCharsets.UTF_8);
        try (OutputStream outputStream = new FileOutputStream(fileToSave)) {
            outputStream.write(bytesToSave);
        } catch (IOException e) {
            throw new IOException("error");
        }
    }
}
If your goal is simply serializing, implementing Serializable and writing the objects out would work, but your strings will still be readable inside the file, since serialization stores String fields as (modified UTF-8) text. You can encrypt the stream, but anyone decompiling your code can still devise a way to read the values.
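If you do go the Serializable route, a minimal sketch could look like this (the Item class and the items.bin file name are made up for illustration); the resulting file is Java's binary serialization format rather than plain text, though String values still show up in it:
import java.io.*;
import java.util.ArrayList;
import java.util.List;

public class BinarySaveDemo {

    // The class whose instances are saved must implement Serializable.
    static class Item implements Serializable {
        private static final long serialVersionUID = 1L;
        String name;
        Item(String name) { this.name = name; }
    }

    public static void main(String[] args) throws IOException, ClassNotFoundException {
        List<Item> items = new ArrayList<>();
        items.add(new Item("first"));
        items.add(new Item("second"));

        // Write the whole list in one call.
        try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream("items.bin"))) {
            out.writeObject(items);
        }

        // Read it back and cast; the cast is unchecked.
        try (ObjectInputStream in = new ObjectInputStream(new FileInputStream("items.bin"))) {
            @SuppressWarnings("unchecked")
            List<Item> restored = (List<Item>) in.readObject();
            System.out.println(restored.size() + " items restored");
        }
    }
}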
Hi, I am writing some data to a text file through Java code, but when I run the code again it appends to the older data. I want the new data to overwrite the older version.
Can anyone help?
BufferedWriter out1 = new BufferedWriter(new FileWriter("inValues.txt", true));
for (String key : inout.keySet()) {
    String val = inout.get(key);
    out1.write(key + " , " + val + "\n");
}
out1.close();
Code would help, but it's likely you are telling it to append the data, since the default is to overwrite. Find something like:
file = new FileWriter("outfile.txt", true);
and change it to
file = new FileWriter("outfile.txt", false);
or just
file = new FileWriter("outfile.txt");
Since the default is to overwrite, either should work.
Based on your edit: just change the true to false, or remove it, in the FileWriter. The second parameter is not required; when true it specifies that you want to append data to the file.
You mentioned a problem of incomplete writes. BufferedWriter isn't required; if your file is smallish you can use FileWriter by itself and avoid any such issues. If you do use BufferedWriter, you need to flush() it before you close() it.
BufferedWriter out1 = new BufferedWriter(new FileWriter("inValues.txt"));
for (String key : inout.keySet()) {
    String val = inout.get(key);
    out1.write(key + " , " + val + "\n");
}
out1.flush();
out1.close();
Set the append parameter to false:
new FileWriter(yourFileLocation,false);
You can use the simple File and FileWriter classes.
The FileWriter constructor comes in two relevant varieties: one that takes only the File object, and one that takes the File object plus a boolean indicating whether the content should be appended (true) or overwritten (false).
The following code overwrites the content:
public class WriteFile {
    public static void main(String[] args) throws IOException {
        File file = new File("new.txt");
        FileWriter fw = new FileWriter(file, false); // false (or no second argument) means overwrite
        try {
            fw.write("This is first line");
            fw.write("This is second line");
            fw.write("This is third line");
            fw.write("This is fourth line");
            fw.write("This is fifth line");
            fw.write("hello");
        } catch (Exception e) {
            // handle or log the exception
        } finally {
            fw.flush();
            fw.close();
        }
    }
}
The same idea works with the PrintWriter class; note that PrintWriter itself has no append flag, so to append you wrap it around a FileWriter opened in append mode. You can always refer to the Java Doc API.
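For example, a sketch of the overwrite vs. append choice with PrintWriter wrapping a FileWriter (the file name is arbitrary):
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class PrintWriterExample {
    public static void main(String[] args) throws IOException {
        // Overwrites new.txt on every run.
        try (PrintWriter overwrite = new PrintWriter(new FileWriter("new.txt"))) {
            overwrite.println("This line replaces any previous content");
        }

        // Appends, because the underlying FileWriter is opened with append = true.
        try (PrintWriter append = new PrintWriter(new FileWriter("new.txt", true))) {
            append.println("This line is added to the end");
        }
    }
}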
I've already tried exporting my database tables to CSV using the CSVWriter.
But my tables contain BLOB data. How can I include them in my export?
Then later on, I'm going to import that exported CSV using CSVReader. Can anyone share some concepts?
This is part of my code for the export:
ResultSet res = st.executeQuery("select * from " + db + "." + obTableNames[23]);
int columnCount = getColumnCount(res);
try {
    File filename = new File(dir, "" + obTableNames[23] + ".csv");
    fw = new FileWriter(filename);
    CSVWriter writer = new CSVWriter(fw);
    writer.writeAll(res, false);
    int colType = res.getMetaData().getColumnType(columnCount);
    dispInt(colType);
    fw.flush();
    fw.close();
} catch (IOException e) {
    // TODO Auto-generated catch block
    e.printStackTrace();
}
Did you take a look at encodeBase64String(byte[] data) method from the Base64 provided by Apache?
Encodes binary data using the base64 algorithm but does not chunk the output.
This should allow you to return encoded strings representing your Binary Large Object and incorporate it in your CSV.
People on the other side can then use decodeBase64(String data) to get the BLOB back again.
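A sketch of that idea, assuming opencsv's CSVWriter and commons-codec's Base64; the table layout and the photo BLOB column are made up for illustration (older opencsv versions use the au.com.bytecode.opencsv package instead):
import java.io.FileWriter;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import org.apache.commons.codec.binary.Base64;
import com.opencsv.CSVWriter;

public static void exportWithBlobs(Connection conn, String table, String csvPath) throws Exception {
    try (Statement st = conn.createStatement();
         ResultSet rs = st.executeQuery("select id, name, photo from " + table); // photo is the BLOB column
         CSVWriter writer = new CSVWriter(new FileWriter(csvPath))) {
        while (rs.next()) {
            byte[] blob = rs.getBytes("photo");
            // Base64-encode the BLOB so it fits into a single text field of the CSV.
            String encoded = (blob == null) ? "" : Base64.encodeBase64String(blob);
            writer.writeNext(new String[] { rs.getString("id"), rs.getString("name"), encoded });
        }
    }
}

// On import, decode the field back into bytes before inserting it into the BLOB column:
// byte[] blob = Base64.decodeBase64(encodedField);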
I need to write something at the beginning of a text file. I have a text file with content and I want to write something before this content. Say I have:
Good afternoon sir,how are you today?
I'm fine,how are you?
Thanks for asking,I'm great
After modifying, I want it to be like this:
Page 1-Scene 59
25.05.2011
Good afternoon sir,how are you today?
I'm fine,how are you?
Thanks for asking,I'm great
I just made up the content :) How can I modify a text file in this way?
You can't really modify it that way - file systems don't generally let you insert data in arbitrary locations - but you can (a sketch of these steps follows the list):
Create a new file
Write the prefix to it
Copy the data from the old file to the new file
Move the old file to a backup location
Move the new file to the old file's location
Optionally delete the old backup file
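A sketch of those steps using java.nio.file (the paths and prefix are placeholders; InputStream.transferTo needs Java 9+, otherwise copy with a byte[] buffer loop):
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public static void prepend(Path original, String prefix) throws IOException {
    Path temp = Files.createTempFile("prepend", ".tmp");
    Path backup = original.resolveSibling(original.getFileName() + ".bak");

    // Steps 1-3: create the new file, write the prefix, then copy the old content after it.
    try (OutputStream out = Files.newOutputStream(temp);
         InputStream in = Files.newInputStream(original)) {
        out.write(prefix.getBytes(StandardCharsets.UTF_8));
        in.transferTo(out);
    }

    // Steps 4-6: back up the old file, move the new file into place, then drop the backup.
    Files.move(original, backup, StandardCopyOption.REPLACE_EXISTING);
    Files.move(temp, original);
    Files.deleteIfExists(backup);
}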
Just in case it is useful for someone, here is the full source code of a method that prepends lines to a file using the Apache Commons IO library. The code does not read the whole file into memory, so it will work on files of any size.
public static void prependPrefix(File input, String prefix) throws IOException {
    LineIterator li = FileUtils.lineIterator(input);
    File tempFile = File.createTempFile("prependPrefix", ".tmp");
    BufferedWriter w = new BufferedWriter(new FileWriter(tempFile));
    try {
        w.write(prefix);
        while (li.hasNext()) {
            w.write(li.next());
            w.write("\n");
        }
    } finally {
        IOUtils.closeQuietly(w);
        LineIterator.closeQuietly(li);
    }
    FileUtils.deleteQuietly(input);
    FileUtils.moveFile(tempFile, input);
}
I think what you want is random access. Check out the related Java tutorial. However, I don't believe you can just insert data at an arbitrary point in the file; if I recall correctly, you can only overwrite existing data. If you wanted to insert, you'd have to have your code (a simplified sketch follows the list):
1. copy a block,
2. overwrite it with your new stuff,
3. copy the next block,
4. overwrite it with the previously copied block,
5. return to 3 until there are no more blocks.
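A simplified sketch with RandomAccessFile, assuming the part of the file after the insertion point fits in memory (it reads the tail once instead of shuffling block by block as described above):
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.charset.StandardCharsets;

public static void insertAt(String fileName, long position, String text) throws IOException {
    try (RandomAccessFile raf = new RandomAccessFile(fileName, "rw")) {
        // Read everything after the insertion point into memory.
        raf.seek(position);
        byte[] tail = new byte[(int) (raf.length() - position)];
        raf.readFully(tail);

        // Go back, write the new data, then write the saved tail after it.
        raf.seek(position);
        raf.write(text.getBytes(StandardCharsets.UTF_8));
        raf.write(tail);
    }
}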
As #atk suggested, java.nio.channels.SeekableByteChannel is a good interface, but it is only available from Java 7 onwards.
Update: if you have no issue with using FileUtils, then use
String fileString = FileUtils.readFileToString(file);
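and then write the prefixed content back, for example (assuming commons-io; newer versions expect an explicit charset, and prefix is whatever text you want to insert):
// Prepend the new text and overwrite the file with the combined content.
String fileString = FileUtils.readFileToString(file, StandardCharsets.UTF_8);
FileUtils.writeStringToFile(file, prefix + fileString, StandardCharsets.UTF_8);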
This isn't a direct answer to the question, but often files are accessed via InputStreams. If this is your use case, then you can chain input streams via SequenceInputStream to achieve the same result. E.g.
InputStream inputStream = new SequenceInputStream(new ByteArrayInputStream("my line\n".getBytes()), new FileInputStream(new File("myfile.txt")));
I will leave this here just in case anyone needs it. It prepends the content of fileName2 to fileName1 by buffering both in memory and then rewriting fileName1:
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
try (FileInputStream fileInputStream1 = new FileInputStream(fileName1);
     FileInputStream fileInputStream2 = new FileInputStream(fileName2)) {
    // Buffer fileName2 first, then fileName1, so fileName2's content ends up in front.
    while (fileInputStream2.available() > 0) {
        byteArrayOutputStream.write(fileInputStream2.read());
    }
    while (fileInputStream1.available() > 0) {
        byteArrayOutputStream.write(fileInputStream1.read());
    }
}
// Overwrite fileName1 with the combined content.
try (FileOutputStream fileOutputStream = new FileOutputStream(fileName1)) {
    byteArrayOutputStream.writeTo(fileOutputStream);
}