I want to read a huge CSV file. We are using superCSV to parse through our files in general. In this particular scenario the file is huge, and there is always the problem of running out of memory for obvious reasons.
The initial idea is to read the file in chunks, but I am not sure if this would work with superCSV, because when I chunk the file only the first chunk has the header values and will be loaded into the CSV bean, while the other chunks do not have header values, and I feel they might throw an exception. So:
a) I was wondering if my thought process is right
b) Are there any other ways to approach this problem.
So my main question is: does superCSV have the capability to handle large CSV files? I see that superCSV reads the document through a BufferedReader, but I don't know what the size of that buffer is, or whether we can change it as per our requirement.
@Gilbert Le Blanc: I have tried splitting the file into smaller chunks as per your suggestion, but it is taking a long time to break the huge file down into smaller chunks. Here is the code that I have written to do it.
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.io.LineNumberReader;

public class TestFileSplit {

    public static void main(String[] args) {
        LineNumberReader lnr = null;
        try {
            File file = new File("C:\\Blah\\largetextfile.txt");
            lnr = new LineNumberReader(new FileReader(file), 1024);
            String line = "";
            String header = null;
            int noOfLines = 100000;
            int i = 1;
            boolean chunkedFiles = new File("C:\\Blah\\chunks").mkdir();
            if (chunkedFiles) {
                while ((line = lnr.readLine()) != null) {
                    if (lnr.getLineNumber() == 1) {
                        // remember the header from the first line of the source file
                        header = line;
                        continue;
                    } else {
                        // a new chunk file is created for every 100000 records
                        if ((lnr.getLineNumber() % noOfLines) == 0) {
                            i = i + 1;
                        }
                        File chunkedFile = new File("C:\\Blah\\chunks\\"
                                + file.getName().substring(0, file.getName().indexOf(".")) + "_" + i + ".txt");
                        // if the chunk file does not exist, create it and add the header as the first row
                        if (!chunkedFile.exists()) {
                            chunkedFile.createNewFile();
                            FileWriter fw = new FileWriter(chunkedFile.getAbsoluteFile(), true);
                            BufferedWriter bw = new BufferedWriter(fw);
                            bw.write(header);
                            bw.newLine();
                            bw.close();
                            fw.close();
                        }
                        FileWriter fw = new FileWriter(chunkedFile.getAbsoluteFile(), true);
                        BufferedWriter bw = new BufferedWriter(fw);
                        bw.write(line);
                        bw.newLine();
                        bw.close();
                        fw.close();
                    }
                }
            }
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (lnr != null) {
                try {
                    lnr.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }
}
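One likely reason the split is slow is that a new FileWriter/BufferedWriter is opened and closed for every single line. Below is a minimal sketch (not the original code) of the same chunking idea that keeps one writer open per chunk instead; the file paths and the 100000-line chunk size are carried over from above, while the chunk file naming is simplified here.
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.io.LineNumberReader;

public class TestFileSplitFaster {
    public static void main(String[] args) throws IOException {
        File file = new File("C:\\Blah\\largetextfile.txt");
        int noOfLines = 100000;
        new File("C:\\Blah\\chunks").mkdir();
        try (LineNumberReader lnr = new LineNumberReader(new FileReader(file))) {
            String header = lnr.readLine();   // the first line is the header
            String line;
            int chunk = 0;
            BufferedWriter bw = null;
            while ((line = lnr.readLine()) != null) {
                // start a new chunk file every noOfLines data records
                if (bw == null || (lnr.getLineNumber() - 2) % noOfLines == 0) {
                    if (bw != null) {
                        bw.close();
                    }
                    chunk++;
                    bw = new BufferedWriter(new FileWriter("C:\\Blah\\chunks\\chunk_" + chunk + ".txt"));
                    // each chunk gets its own copy of the header row
                    bw.write(header);
                    bw.newLine();
                }
                bw.write(line);
                bw.newLine();
            }
            if (bw != null) {
                bw.close();
            }
        }
    }
}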
You can define the header in the parser Java class itself. That way you don't need a header row in the CSV files.
// only map the first 3 columns - setting header elements to null means those columns are ignored
final String[] header = new String[] { "customerNo", "firstName", "lastName", null, null, null, null, null, null, null };
beanReader.read(CustomerBean.class, header)
or
You can also use the Dozer extension of the Super CSV API.
I'm not sure what the question is. Reading one line at a time as a bean takes roughly constant memory. If you store all the read objects at once, then yes, you run out of memory, but how is that Super CSV's fault?
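To make that concrete, here is a minimal sketch of reading a large CSV one bean at a time with Super CSV's CsvBeanReader, which keeps memory usage roughly constant. The file path is taken from the question, and the CustomerBean class from the snippet above is assumed to have properties matching the header names.
import java.io.FileReader;
import java.io.IOException;
import org.supercsv.io.CsvBeanReader;
import org.supercsv.io.ICsvBeanReader;
import org.supercsv.prefs.CsvPreference;

public class LargeCsvStreamingExample {
    public static void main(String[] args) throws IOException {
        try (ICsvBeanReader beanReader = new CsvBeanReader(
                new FileReader("C:\\Blah\\largetextfile.txt"), CsvPreference.STANDARD_PREFERENCE)) {
            // read the header row once; it maps each column to a bean property
            final String[] header = beanReader.getHeader(true);
            CustomerBean customer;
            while ((customer = beanReader.read(CustomerBean.class, header)) != null) {
                // process one row at a time instead of collecting them all in a list
                System.out.println(customer);
            }
        }
    }
}
Since each chunk produced by the splitting code above keeps its own header row, the same loop would also work unchanged on each chunk.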
Related
I'm looking for a small code snippet that will find a line in a file and remove that line (not just its content but the whole line), but I could not find one. For example, I have the following in a file:
myFile.txt:
aaa
bbb
ccc
ddd
I need a function like this: public void removeLine(String lineContent), so that if I call
removeLine("bbb"), I get a file like this:
myFile.txt:
aaa
ccc
ddd
This solution may not be optimal or pretty, but it works. It reads in an input file line by line, writing each line out to a temporary output file. Whenever it encounters a line that matches what you are looking for, it skips writing that one out. It then renames the output file. I have omitted error handling, closing of readers/writers, etc. from the example. I also assume there is no leading or trailing whitespace in the line you are looking for. Change the code around trim() as needed so you can find a match.
File inputFile = new File("myFile.txt");
File tempFile = new File("myTempFile.txt");
BufferedReader reader = new BufferedReader(new FileReader(inputFile));
BufferedWriter writer = new BufferedWriter(new FileWriter(tempFile));
String lineToRemove = "bbb";
String currentLine;
while((currentLine = reader.readLine()) != null) {
// trim newline when comparing with lineToRemove
String trimmedLine = currentLine.trim();
if(trimmedLine.equals(lineToRemove)) continue;
writer.write(currentLine + System.getProperty("line.separator"));
}
writer.close();
reader.close();
boolean successful = tempFile.renameTo(inputFile);
public void removeLineFromFile(String file, String lineToRemove) {
try {
File inFile = new File(file);
if (!inFile.isFile()) {
System.out.println("Parameter is not an existing file");
return;
}
//Construct the new file that will later be renamed to the original filename.
File tempFile = new File(inFile.getAbsolutePath() + ".tmp");
BufferedReader br = new BufferedReader(new FileReader(file));
PrintWriter pw = new PrintWriter(new FileWriter(tempFile));
String line = null;
//Read from the original file and write to the new
//unless content matches data to be removed.
while ((line = br.readLine()) != null) {
if (!line.trim().equals(lineToRemove)) {
pw.println(line);
pw.flush();
}
}
pw.close();
br.close();
//Delete the original file
if (!inFile.delete()) {
System.out.println("Could not delete file");
return;
}
//Rename the new file to the filename the original file had.
if (!tempFile.renameTo(inFile))
System.out.println("Could not rename file");
}
catch (FileNotFoundException ex) {
ex.printStackTrace();
}
catch (IOException ex) {
ex.printStackTrace();
}
}
I found this on the internet.
You want to do something like the following:
Open the old file for reading
Open a new (temporary) file for writing
Iterate over the lines in the old file (probably using a BufferedReader)
For each line, check if it matches what you are supposed to remove
If it matches, do nothing
If it doesn't match, write it to the temporary file
When done, close both files
Delete the old file
Rename the temporary file to the name of the original file
(I won't write the actual code, since this looks like homework, but feel free to post other questions on specific bits that you have trouble with)
So, whenever I hear someone mention that they want to filter out text, I immediately think to go to Streams (mainly because there is a method called filter which filters exactly as you need it to). Another answer mentions using Streams with the Apache commons-io library, but I thought it would be worthwhile to show how this can be done in standard Java 8. Here is the simplest form:
public void removeLine(String lineContent) throws IOException
{
File file = new File("myFile.txt");
List<String> out = Files.lines(file.toPath())
.filter(line -> !line.contains(lineContent))
.collect(Collectors.toList());
Files.write(file.toPath(), out, StandardOpenOption.WRITE, StandardOpenOption.TRUNCATE_EXISTING);
}
There isn't too much to explain: Files.lines gets a Stream<String> of the lines of the file, filter takes out the lines we don't want, and collect puts all of the lines of the new file into a List. We then write the list over the top of the existing file with Files.write, using the additional option TRUNCATE_EXISTING so the old contents of the file are replaced.
Of course, this approach has the downside of loading every line into memory as they all get stored into a List before being written back out. If we wanted to simply modify without storing, we would need to use some form of OutputStream to write each new line to a file as it passes through the stream, like this:
public void removeLine(String lineContent) throws IOException
{
File file = new File("myFile.txt");
File temp = new File("_temp_");
PrintWriter out = new PrintWriter(new FileWriter(temp));
Files.lines(file.toPath())
.filter(line -> !line.contains(lineContent))
.forEach(out::println);
out.flush();
out.close();
temp.renameTo(file);
}
Not much has been changed in this example. Basically, instead of using collect to gather the file contents into memory, we use forEach so that each line that makes it through the filter gets sent to the PrintWriter to be written out to the file immediately and not stored. We have to save it to a temporary file, because we can't overwrite the existing file at the same time as we are still reading from it, so then at the end, we rename the temp file to replace the existing file.
Using apache commons-io and Java 8 you can use
List<String> lines = FileUtils.readLines(file);
List<String> updatedLines = lines.stream().filter(s -> !s.contains(searchString)).collect(Collectors.toList());
FileUtils.writeLines(file, updatedLines, false);
public static void deleteLine() throws IOException {
RandomAccessFile file = new RandomAccessFile("me.txt", "rw");
String delete;
String task="";
byte []tasking;
while ((delete = file.readLine()) != null) {
if (delete.startsWith("BAD")) {
continue;
}
task+=delete+"\n";
}
System.out.println(task);
BufferedWriter writer = new BufferedWriter(new FileWriter("me.txt"));
writer.write(task);
file.close();
writer.close();
}
Here you go. This solution uses a DataInputStream to scan for the position of the string you want replaced and uses a FileChannel to replace the text at that exact position. It only replaces the first occurrence of the string that it finds. This solution doesn't store a copy of the entire file anywhere (neither in RAM nor in a temp file); it just edits the portion of the file that it finds.
public static long scanForString(String text, File file) throws IOException {
if (text.isEmpty())
return file.exists() ? 0 : -1;
// First of all, get a byte array off of this string:
byte[] bytes = text.getBytes(/* StandardCharsets.your_charset */);
// Next, search the file for the byte array.
try (DataInputStream dis = new DataInputStream(new FileInputStream(file))) {
List<Integer> matches = new LinkedList<>();
for (long pos = 0; pos < file.length(); pos++) {
byte bite = dis.readByte();
for (int i = 0; i < matches.size(); i++) {
Integer m = matches.get(i);
if (bytes[m] != bite)
matches.remove(i--);
else if (++m == bytes.length)
return pos - m + 1;
else
matches.set(i, m);
}
if (bytes[0] == bite)
matches.add(1);
}
}
return -1;
}
public static void replaceText(String text, String replacement, File file) throws IOException {
// Open a FileChannel with writing ability. You don't really need the read
// ability for this specific case, but there it is in case you need it for
// something else.
try (FileChannel channel = FileChannel.open(file.toPath(), StandardOpenOption.WRITE, StandardOpenOption.READ)) {
long scanForString = scanForString(text, file);
if (scanForString == -1) {
System.out.println("String not found.");
return;
}
channel.position(scanForString);
channel.write(ByteBuffer.wrap(replacement.getBytes(/* StandardCharsets.your_charset */)));
}
}
Example
Input: ABCDEFGHIJKLMNOPQRSTUVWXYZ
Method Call:
replaceText("QRS", "000", new File("path/to/file"));
Resulting File: ABCDEFGHIJKLMNOP000TUVWXYZ
Here is the complete class. In the code below, "somelocation" refers to the actual path of the file.
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
public class FileProcess
{
public static void main(String[] args) throws IOException
{
File inputFile = new File("C://somelocation//Demographics.txt");
File tempFile = new File("C://somelocation//Demographics_report.txt");
BufferedReader reader = new BufferedReader(new FileReader(inputFile));
BufferedWriter writer = new BufferedWriter(new FileWriter(tempFile));
String currentLine;
while((currentLine = reader.readLine()) != null) {
if(null!=currentLine && !currentLine.equalsIgnoreCase("BBB")){
writer.write(currentLine + System.getProperty("line.separator"));
}
}
writer.close();
reader.close();
boolean successful = tempFile.renameTo(inputFile);
System.out.println(successful);
}
}
This solution reads in an input file line by line, writing each line out to a StringBuilder variable. Whenever it encounters a line that matches what you are looking for, it skips writing that one out. It then deletes the file content and writes the StringBuilder's content back to the file.
public void removeLineFromFile(String lineToRemove, File f) throws FileNotFoundException, IOException{
//Reading File Content and storing it to a StringBuilder variable ( skips lineToRemove)
StringBuilder sb = new StringBuilder();
try (Scanner sc = new Scanner(f)) {
String currentLine;
while(sc.hasNext()){
currentLine = sc.nextLine();
if(currentLine.equals(lineToRemove)){
continue; //skips lineToRemove
}
sb.append(currentLine).append("\n");
}
}
//Delete File Content
PrintWriter pw = new PrintWriter(f);
pw.close();
BufferedWriter writer = new BufferedWriter(new FileWriter(f, true));
writer.append(sb.toString());
writer.close();
}
A super simple method using Maven/Gradle + Groovy:
public void deleteConfig(String text) {
File config = new File("/the/path/config.txt")
def lines = config.readLines()
lines.remove(text);
config.write("")
lines.each {line -> {
config.append(line+"\n")
}}
}
public static void deleteLine(String line, String filePath) {
File file = new File(filePath);
File file2 = new File(file.getParent() + "\\temp" + file.getName());
PrintWriter pw = null;
Scanner read = null;
FileInputStream fis = null;
FileOutputStream fos = null;
FileChannel src = null;
FileChannel dest = null;
try {
pw = new PrintWriter(file2);
read = new Scanner(file);
while (read.hasNextLine()) {
String currline = read.nextLine();
if (line.equalsIgnoreCase(currline)) {
continue;
} else {
pw.println(currline);
}
}
pw.flush();
fis = new FileInputStream(file2);
src = fis.getChannel();
fos = new FileOutputStream(file);
dest = fos.getChannel();
dest.transferFrom(src, 0, src.size());
} catch (IOException e) {
e.printStackTrace();
} finally {
pw.close();
read.close();
try {
fis.close();
fos.close();
src.close();
dest.close();
} catch (IOException e) {
e.printStackTrace();
}
if (file2.delete()) {
System.out.println("File is deleted");
} else {
System.out.println("Error occured! File: " + file2.getName() + " is not deleted!");
}
}
}
package com.ncs.cache;
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.File;
import java.io.FileWriter;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.PrintWriter;
public class FileUtil {
public void removeLineFromFile(String file, String lineToRemove) {
try {
File inFile = new File(file);
if (!inFile.isFile()) {
System.out.println("Parameter is not an existing file");
return;
}
// Construct the new file that will later be renamed to the original
// filename.
File tempFile = new File(inFile.getAbsolutePath() + ".tmp");
BufferedReader br = new BufferedReader(new FileReader(file));
PrintWriter pw = new PrintWriter(new FileWriter(tempFile));
String line = null;
// Read from the original file and write to the new
// unless content matches data to be removed.
while ((line = br.readLine()) != null) {
if (!line.trim().equals(lineToRemove)) {
pw.println(line);
pw.flush();
}
}
pw.close();
br.close();
// Delete the original file
if (!inFile.delete()) {
System.out.println("Could not delete file");
return;
}
// Rename the new file to the filename the original file had.
if (!tempFile.renameTo(inFile))
System.out.println("Could not rename file");
} catch (FileNotFoundException ex) {
ex.printStackTrace();
} catch (IOException ex) {
ex.printStackTrace();
}
}
public static void main(String[] args) {
FileUtil util = new FileUtil();
util.removeLineFromFile("test.txt", "bbbbb");
}
}
src : http://www.javadb.com/remove-a-line-from-a-text-file/
This solution requires the Apache Commons IO library to be added to the build path. It works by reading the entire file and writing each line back but only if the search term is not contained.
public static void removeLineFromFile(File targetFile, String searchTerm)
throws IOException
{
StringBuffer fileContents = new StringBuffer(
FileUtils.readFileToString(targetFile));
String[] fileContentLines = fileContents.toString().split(
System.lineSeparator());
emptyFile(targetFile);
fileContents = new StringBuffer();
for (int fileContentLinesIndex = 0; fileContentLinesIndex < fileContentLines.length; fileContentLinesIndex++)
{
if (fileContentLines[fileContentLinesIndex].contains(searchTerm))
{
continue;
}
fileContents.append(fileContentLines[fileContentLinesIndex] + System.lineSeparator());
}
FileUtils.writeStringToFile(targetFile, fileContents.toString().trim());
}
private static void emptyFile(File targetFile) throws FileNotFoundException,
IOException
{
RandomAccessFile randomAccessFile = new RandomAccessFile(targetFile, "rw");
randomAccessFile.setLength(0);
randomAccessFile.close();
}
I refactored Narek's solution to create (in my opinion) slightly more efficient and easier-to-understand code. I used nested automatic resource management (try-with-resources), a relatively recent Java feature, and a Scanner, which in my opinion is easier to understand and use.
Here is the code with edited comments:
public class RemoveLineInFile {
private static File file;
public static void main(String[] args) {
//create a new File
file = new File("hello.txt");
//takes in String that you want to get rid off
removeLineFromFile("Hello");
}
public static void removeLineFromFile(String lineToRemove) {
//if file does not exist, a file is created
if (!file.exists()) {
try {
file.createNewFile();
} catch (IOException e) {
System.out.println("File "+file.getName()+" not created successfully");
}
}
// Construct the new temporary file that will later be renamed to the original
// filename.
File tempFile = new File(file.getAbsolutePath() + ".tmp");
//Two nested try-with-resources blocks are used
// to effectively handle the IO resources
try(Scanner scanner = new Scanner(file)) {
try (PrintWriter pw = new PrintWriter(new FileWriter(tempFile))) {
//a declaration of a String Line Which Will Be assigned Later
String line;
// Read from the original file and write to the new
// unless content matches data to be removed.
while (scanner.hasNextLine()) {
line = scanner.nextLine();
if (!line.trim().equals(lineToRemove)) {
pw.println(line);
pw.flush();
}
}
// Delete the original file
if (!file.delete()) {
System.out.println("Could not delete file");
return;
}
// Rename the new file to the filename the original file had.
if (!tempFile.renameTo(file))
System.out.println("Could not rename file");
}
}
catch (IOException e)
{
System.out.println("IO Exception Occurred");
}
}
}
Try this:
public static void main(String[] args) throws IOException {
File file = new File("file.csv");
CSVReader csvFileReader = new CSVReader(new FileReader(file));
List<String[]> list = csvFileReader.readAll();
for (int i = 0; i < list.size(); i++) {
String[] filter = list.get(i);
if (filter[0].equalsIgnoreCase("bbb")) {
list.remove(i);
i--; // compensate for the shift so the next row is not skipped
}
}
csvFileReader.close();
CSVWriter csvOutput = new CSVWriter(new FileWriter(file));
csvOutput.writeAll(list);
csvOutput.flush();
csvOutput.close();
}
Old question, but an easy way is to:
Iterate through the file, adding each line to a new ArrayList.
Iterate through the list, find the matching String, and call the remove method.
Iterate through the list again, printing each line back to the file; the boolean for append should be false, which basically replaces the file (see the sketch below).
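Here is that sketch, reusing the myFile.txt / "bbb" example from the question (removeIf stands in for the manual find-and-remove step):
public static void removeLine(String lineToRemove) throws IOException {
    File file = new File("myFile.txt");
    // 1. iterate through the file, adding each line to a new list
    List<String> lines = new ArrayList<>();
    try (BufferedReader reader = new BufferedReader(new FileReader(file))) {
        String line;
        while ((line = reader.readLine()) != null) {
            lines.add(line);
        }
    }
    // 2. remove the matching line(s)
    lines.removeIf(line -> line.trim().equals(lineToRemove));
    // 3. write the list back out with append = false, which replaces the file
    try (BufferedWriter writer = new BufferedWriter(new FileWriter(file, false))) {
        for (String line : lines) {
            writer.write(line);
            writer.newLine();
        }
    }
}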
This solution uses a RandomAccessFile to only cache the portion of the file subsequent to the string to remove. It scans until it finds the String you want to remove. Then it copies all of the data after the found string, then writes it over the found string, and everything after. Last, it truncates the file size to remove the excess data.
public static long scanForString(String text, File file) throws IOException {
if (text.isEmpty())
return file.exists() ? 0 : -1;
// First of all, get a byte array off of this string:
byte[] bytes = text.getBytes(/* StandardCharsets.your_charset */);
// Next, search the file for the byte array.
try (DataInputStream dis = new DataInputStream(new FileInputStream(file))) {
List<Integer> matches = new LinkedList<>();
for (long pos = 0; pos < file.length(); pos++) {
byte bite = dis.readByte();
for (int i = 0; i < matches.size(); i++) {
Integer m = matches.get(i);
if (bytes[m] != bite)
matches.remove(i--);
else if (++m == bytes.length)
return pos - m + 1;
else
matches.set(i, m);
}
if (bytes[0] == bite)
matches.add(1);
}
}
return -1;
}
public static void remove(String text, File file) throws IOException {
try (RandomAccessFile rafile = new RandomAccessFile(file, "rw");) {
long scanForString = scanForString(text, file);
if (scanForString == -1) {
System.out.println("String not found.");
return;
}
long remainderStartPos = scanForString + text.getBytes().length;
rafile.seek(remainderStartPos);
int remainderSize = (int) (rafile.length() - rafile.getFilePointer());
byte[] bytes = new byte[remainderSize];
rafile.read(bytes);
rafile.seek(scanForString);
rafile.write(bytes);
rafile.setLength(rafile.length() - (text.length()));
}
}
Usage:
File Contents: ABCDEFGHIJKLMNOPQRSTUVWXYZ
Method Call: remove("ABC", new File("Drive:/Path/File.extension"));
Resulting Contents: DEFGHIJKLMNOPQRSTUVWXYZ
This solution could easily be modified to remove text using a certain, specifiable cacheSize if memory is a concern: that would just involve iterating over the rest of the file, continually shifting portions of size cacheSize (see the sketch below). Regardless, this solution is generally much better than caching an entire file in memory or copying it to a temporary directory.
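A rough sketch of that chunked variant, reusing the scanForString helper above; removeChunked and cacheSize are names introduced here purely for illustration:
public static void removeChunked(String text, File file, int cacheSize) throws IOException {
    try (RandomAccessFile rafile = new RandomAccessFile(file, "rw")) {
        long found = scanForString(text, file);
        if (found == -1) {
            System.out.println("String not found.");
            return;
        }
        long readPos = found + text.getBytes().length;
        long writePos = found;
        byte[] cache = new byte[cacheSize];
        int n;
        // shift the remainder of the file down in cacheSize-sized blocks
        while (true) {
            rafile.seek(readPos);
            n = rafile.read(cache);
            if (n == -1) {
                break;
            }
            readPos += n;
            rafile.seek(writePos);
            rafile.write(cache, 0, n);
            writePos += n;
        }
        // drop the now-duplicated tail
        rafile.setLength(writePos);
    }
}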
A method returns a String in comma separated format. For example, the returned String can be like the one given below.
Tarantino,50,M,USA\n Carey Mulligan,27,F,UK\n Gong Li,45,F,China
I will need to get this String and write it into a CSV file. I'll have to insert a header and a footer for this file as well.
For example, when I open the file, the contents for the above data will be
Name,Age,Gender,Country
Tarantino,50,M,USA
Carey Mulligan,27,F,UK
Gong Li,45,F,China
How do we do that? Are there any open-source libraries to do this task?
The CSV format is not very well defined. You don't have to write headers for the file; it is a pretty simple format in which data values are separated by commas, semicolons, spaces, etc.
You just have to write your own simple method that writes your string to a file on the local computer, using a FileOutputStream or a Writer from the java.io package.
You can use this as a learning example.
I used a BufferedReader because it will take care of the line separators, but you could also use the split method and write the resulting tokens.
import java.io.*;
public class Tests {
public static void main(String[] args) {
File file = new File("out.csv");
BufferedWriter out = null;
try {
out = new BufferedWriter(new FileWriter(file));
String string = "Tarantino,50,M,USA\n Carey Mulligan,27,F,UK\n Gong Li,45,F,China";
BufferedReader reader = new BufferedReader(new InputStreamReader(new ByteArrayInputStream(string.getBytes())));
String line;
while ((line = reader.readLine()) != null) {
out.write(line.trim());
out.newLine();
}
}
catch (IOException e) {
// log something
e.printStackTrace();
}
finally {
if (out != null) {
try {
out.close();
} catch (IOException e) {
// ignored
}
}
}
}
}
This is pretty simple:
String str = "Tarantino,50,M,USA\n Carey Mulligan,27,F,UK\n Gong Li,45,F,China";
PrintWriter pr = new PrintWriter(new FileWriter(new File("test.csv"), true));
String arr[] = str.split("\\n");
// split the string on the newlines it contains
pr.println("Name,Age,Gender,Country");
// header written first and rest of data appended
for(String s : arr){
pr.println(s);
}
pr.close();
Don't forget to close the stream in a finally block and to handle the exception.
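For example, the same snippet wrapped in try-with-resources (Java 7+), so the writer is closed even if an exception is thrown; the file name and header are the same assumptions as above:
String str = "Tarantino,50,M,USA\n Carey Mulligan,27,F,UK\n Gong Li,45,F,China";
try (PrintWriter pr = new PrintWriter(new FileWriter(new File("test.csv"), true))) {
    // header first, then each data row on its own line
    pr.println("Name,Age,Gender,Country");
    for (String s : str.split("\\n")) {
        pr.println(s);
    }
} catch (IOException e) {
    e.printStackTrace();
}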
This question already has answers here:
How to append text to an existing file in Java?
(31 answers)
Closed 9 years ago.
I am trying to generate random numbers as IDs and save them in a file so I can access them easily. I am currently using a BufferedWriter to write them to the file, but the problem is that I am not sure how to find where I should start writing in the file. I am trying to use a BufferedReader to figure out where the next line to write is, but I am not sure how I am supposed to save this offset, or how a new line is represented.
void createIds(){
File writeId = new File("peopleIDs.txt");
try {
FileReader fr = new FileReader(writeId);
BufferedReader in = new BufferedReader(fr);
FileWriter fw = new FileWriter(writeId);
BufferedWriter out = new BufferedWriter(fw);
String line;
while((line = in.readLine()) != null){
//How do I save where the last line of null is?
continue;
}
} catch (IOException ex) {
System.out.println(ex.getMessage());
}
}
If you simply want to add IDs to the end of the file, use the following FileWriter constructor:
FileWriter fw = new FileWriter(writeId, true);
This opens the FileWriter in append mode, allowing you to write output to a pre-existing file.
If you would like to write the IDs to a particular location within an existing file rather than just to the end, I am not sure if this is possible without first parsing the file's contents.
For more information, see the JavaDoc for FileWriter.
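As a quick illustration of append mode, here is a sketch that appends a few random IDs to the question's peopleIDs.txt; the ID format and the loop count are just hypothetical filler:
File writeId = new File("peopleIDs.txt");
Random random = new Random();
// append = true: new IDs are written after whatever is already in the file
try (BufferedWriter out = new BufferedWriter(new FileWriter(writeId, true))) {
    for (int i = 0; i < 5; i++) {
        out.write(Integer.toString(random.nextInt(1000000)));
        out.newLine();
    }
} catch (IOException ex) {
    System.out.println(ex.getMessage());
}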
We need more information about the file itself: what are you searching for with BufferedReader?
If the file is empty/newly created then you don't need BufferedReader at all. Just create the PrintWriter and save your numbers.
I'm just guessing here, but I think the real problem is that you're not sure how to generate random numbers (since this doesn't appear in your example code).
Here's some example code that'll write random numbers into a text file:
import java.io.PrintWriter;
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.util.Random;
public class Example
{
public static void main(String[] args)
{
Random r;
PrintWriter writer;
r = new Random();
try
{
writer = new PrintWriter(new BufferedWriter(new FileWriter("Examplefile.txt")));
for (int i = 0; i < 10; i++)
writer.println(Integer.toString(r.nextInt(10)));
writer.close();
}
catch (IOException e)
{
}
}
}
You can do:
try {
    PrintWriter writer = new PrintWriter(new BufferedWriter(new FileWriter(new File("abc.txt"), true)));
    writer.append("test");
    writer.close(); // close (or at least flush) so the buffered output actually reaches the file
} catch (IOException e) {
    e.printStackTrace();
}
I wrote the code below, but I couldn't tie the ArrayList into the search and replace.
My CSV file looks like this:
1/1/1;7/6/1
1/1/2;7/7/1
I want to search the file 1.cfg for 1/1/1 and change it to 7/6/1, change 1/1/2 to 7/7/1, and so on.
Thank you all in advance.
At the moment it only prints the last line of the old file into the new file.
import java.io.*;
import java.util.ArrayList;
import java.util.List;
public class ChangeConfiguration {
/**
* #param args
* #throws IOException
*/
public static void main(String[] args)
{
try{
// Open the file that is the first
// command line parameter
FileInputStream degistirilecek = new FileInputStream("c:/Config_Changer.csv");
FileInputStream config = new FileInputStream("c:/1.cfg");
// Get the object of DataInputStream
DataInputStream in = new DataInputStream(config);
DataInputStream degistir = new DataInputStream(degistirilecek);
BufferedReader br = new BufferedReader(new InputStreamReader(in));
BufferedReader brdegis = new BufferedReader(new InputStreamReader(degistir));
List<Object> arrayLines = new ArrayList<Object>();
Object contents;
while ((contents = brdegis.readLine()) != null)
{
arrayLines.add(contents);
}
System.out.println(arrayLines + "\n");
String strLine;
//Read File Line By Line
while ((strLine = br.readLine()) != null) {
//Couldn't modify this part error is here :(
BufferedWriter out = new BufferedWriter(new FileWriter("c:/1_new.cfg"));
out.write(strLine);
out.close();
}
in.close();
degistir.close();
}catch (Exception e){//Catch exception if any
System.err.println("Error: " + e.getMessage());
}
}
}
You are opening the file for reading when you declare:
BufferedReader br = new BufferedReader(new InputStreamReader(in));
If you know the entire file will fit in memory, I recommend doing the following (a minimal sketch appears after the steps):
Open the file and read its contents into memory as one giant string, then close the file.
Apply your replacements in one shot to the giant string.
Open the file and write out the contents of the giant string (e.g. using a BufferedWriter), then close the file.
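Here is that sketch, using the file names from the question (Config_Changer.csv holding oldValue;newValue pairs on each line, 1.cfg being rewritten to 1_new.cfg); it illustrates the approach rather than being the poster's corrected code:
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;

public class ReplaceFromMapping {
    public static void main(String[] args) throws IOException {
        Path mapping = Paths.get("c:/Config_Changer.csv");
        Path config = Paths.get("c:/1.cfg");
        // 1. read the whole config file into one giant string
        String contents = new String(Files.readAllBytes(config), StandardCharsets.UTF_8);
        // 2. apply every old;new pair from the mapping file to the string
        List<String> pairs = Files.readAllLines(mapping, StandardCharsets.UTF_8);
        for (String pair : pairs) {
            String[] parts = pair.split(";");
            if (parts.length == 2) {
                contents = contents.replace(parts[0], parts[1]);
            }
        }
        // 3. write the result back out (here to a new file, as in the question)
        Files.write(Paths.get("c:/1_new.cfg"), contents.getBytes(StandardCharsets.UTF_8));
    }
}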
As a side note, the code you posted will not compile. The quality of the responses you receive is correlated with the quality of the question asked. Always include an SSCCE with your question to increase the chance of getting a precise answer.
Can you elaborate on the purpose of the program?
If it is a simple content replacement in a file, then just read each line and store it in a string, and use the String replace method to replace text within the string, e.g.:
newString = oldString.replace(oldValue, newValue);
I want to read a text file containing space separated values. Values are integers.
How can I read it and put it in an array list?
Here is an example of contents of the text file:
1 62 4 55 5 6 77
I want to have it in an arraylist as [1, 62, 4, 55, 5, 6, 77]. How can I do it in Java?
You can use Files#readAllLines() to get all lines of a text file into a List<String>.
for (String line : Files.readAllLines(Paths.get("/path/to/file.txt"))) {
// ...
}
Tutorial: Basic I/O > File I/O > Reading, Writing and Creating text files
You can use String#split() to split a String in parts based on a regular expression.
for (String part : line.split("\\s+")) {
// ...
}
Tutorial: Numbers and Strings > Strings > Manipulating Characters in a String
You can use Integer#valueOf() to convert a String into an Integer.
Integer i = Integer.valueOf(part);
Tutorial: Numbers and Strings > Strings > Converting between Numbers and Strings
You can use List#add() to add an element to a List.
numbers.add(i);
Tutorial: Interfaces > The List Interface
So, in a nutshell (assuming that the file doesn't have empty lines nor trailing/leading whitespace).
List<Integer> numbers = new ArrayList<>();
for (String line : Files.readAllLines(Paths.get("/path/to/file.txt"))) {
for (String part : line.split("\\s+")) {
Integer i = Integer.valueOf(part);
numbers.add(i);
}
}
If you happen to be at Java 8 already, then you can even use Stream API for this, starting with Files#lines().
List<Integer> numbers = Files.lines(Paths.get("/path/to/test.txt"))
.map(line -> line.split("\\s+")).flatMap(Arrays::stream)
.map(Integer::valueOf)
.collect(Collectors.toList());
Tutorial: Processing data with Java 8 streams
Java 1.5 introduced the Scanner class for handling input from files and streams.
It can be used for getting integers from a file and would look something like this:
List<Integer> integers = new ArrayList<Integer>();
Scanner fileScanner = new Scanner(new File("c:\\file.txt"));
while (fileScanner.hasNextInt()){
integers.add(fileScanner.nextInt());
}
Check the API though. There are many more options for dealing with different types of input sources, differing delimiters, and differing data types.
This example code shows you how to read a file in Java.
import java.io.*;
/**
* This example code shows you how to read a file in Java.
*
* In my case RAILWAY.txt is the text file I want to display; replace it with your own.
*/
public class ReadFileExample
{
public static void main(String[] args)
{
System.out.println("Reading File from Java code");
//Name of the file
String fileName="RAILWAY.txt";
try{
//Create object of FileReader
FileReader inputFile = new FileReader(fileName);
//Instantiate the BufferedReader Class
BufferedReader bufferReader = new BufferedReader(inputFile);
//Variable to hold the one line data
String line;
// Read file line by line and print on the console
while ((line = bufferReader.readLine()) != null) {
System.out.println(line);
}
//Close the buffer reader
bufferReader.close();
}catch(Exception e){
System.out.println("Error while reading file line by line:" + e.getMessage());
}
}
}
Look at this example, and try to do your own:
import java.io.*;
public class ReadFile {
public static void main(String[] args){
String string = "";
String file = "textFile.txt";
// Reading
try{
InputStream ips = new FileInputStream(file);
InputStreamReader ipsr = new InputStreamReader(ips);
BufferedReader br = new BufferedReader(ipsr);
String line;
while ((line = br.readLine()) != null){
System.out.println(line);
string += line + "\n";
}
br.close();
}
catch (Exception e){
System.out.println(e.toString());
}
// Writing
try {
FileWriter fw = new FileWriter (file);
BufferedWriter bw = new BufferedWriter (fw);
PrintWriter fileOut = new PrintWriter (bw);
fileOut.println (string+"\n test of read and write !!");
fileOut.close();
System.out.println("the file " + file + " is created!");
}
catch (Exception e){
System.out.println(e.toString());
}
}
}
Just for fun, here's what I'd probably do in a real project, where I'm already using all my favourite libraries (in this case Guava, formerly known as Google Collections).
String text = Files.toString(new File("textfile.txt"), Charsets.UTF_8);
List<Integer> list = Lists.newArrayList();
for (String s : text.split("\\s")) {
list.add(Integer.valueOf(s));
}
Benefit: not much code of your own to maintain (contrast with e.g. this). Edit: although it is worth noting that in this case tschaible's Scanner solution doesn't have any more code!
Drawback: you obviously may not want to add new library dependencies just for this. (Then again, you'd be silly not to make use of Guava in your projects. ;-)
Use Apache Commons (IO and Lang) for simple/common things like this.
Imports:
import org.apache.commons.io.FileUtils;
import org.apache.commons.lang3.ArrayUtils;
Code:
String contents = FileUtils.readFileToString(new File("path/to/your/file.txt"));
String[] array = ArrayUtils.toArray(contents.split(" "));
Done.
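To end up with the List<Integer> the question asks for, the tokens still have to be parsed; a small follow-up sketch using the same contents string:
List<Integer> numbers = new ArrayList<Integer>();
for (String token : contents.split("\\s+")) {
    if (!token.isEmpty()) {
        numbers.add(Integer.valueOf(token));
    }
}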
Using Java 7 to read files with NIO.2
Import these packages:
import java.nio.charset.Charset;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
This is the process to read a file:
Path file = Paths.get("C:\\Java\\file.txt");
if(Files.exists(file) && Files.isReadable(file)) {
try {
// File reader
BufferedReader reader = Files.newBufferedReader(file, Charset.defaultCharset());
String line;
// read each line
while((line = reader.readLine()) != null) {
System.out.println(line);
// tokenize each number
StringTokenizer tokenizer = new StringTokenizer(line, " ");
while (tokenizer.hasMoreElements()) {
// parse each integer in file
int element = Integer.parseInt(tokenizer.nextToken());
}
}
reader.close();
} catch (Exception e) {
e.printStackTrace();
}
}
To read all lines of a file at once:
Path file = Paths.get("C:\\Java\\file.txt");
List<String> lines = Files.readAllLines(file, StandardCharsets.UTF_8);
All the answers so far given involve reading the file line by line, taking the line in as a String, and then processing the String.
There is no question that this is the easiest approach to understand, and if the file is fairly short (say, tens of thousands of lines), it'll also be acceptable in terms of efficiency. But if the file is long, it's a very inefficient way to do it, for two reasons:
Every character gets processed twice, once in constructing the String, and once in processing it.
The garbage collector will not be your friend if there are lots of lines in the file. You're constructing a new String for each line, and then throwing it away when you move to the next line. The garbage collector will eventually have to dispose of all these String objects that you don't want any more. Someone's got to clean up after you.
If you care about speed, you are much better off reading a block of data and then processing it byte by byte rather than line by line. Every time you come to the end of a number, you add it to the List you're building.
It will come out something like this:
private List<Integer> readIntegers(File file) throws IOException {
List<Integer> result = new ArrayList<>();
RandomAccessFile raf = new RandomAccessFile(file, "r");
byte buf[] = new byte[16 * 1024];
final FileChannel ch = raf.getChannel();
int fileLength = (int) ch.size();
final MappedByteBuffer mb = ch.map(FileChannel.MapMode.READ_ONLY, 0,
fileLength);
int acc = 0;
while (mb.hasRemaining()) {
int len = Math.min(mb.remaining(), buf.length);
mb.get(buf, 0, len);
for (int i = 0; i < len; i++)
if ((buf[i] >= 48) && (buf[i] <= 57))
acc = acc * 10 + buf[i] - 48;
else {
result.add(acc);
acc = 0;
}
}
ch.close();
raf.close();
return result;
}
The code above assumes that this is ASCII (though it could be easily tweaked for other encodings), and that anything that isn't a digit (in particular, a space or a newline) represents a boundary between digits. It also assumes that the file ends with a non-digit (in practice, that the last line ends with a newline), though, again, it could be tweaked to deal with the case where it doesn't.
It's much, much faster than any of the String-based approaches also given as answers to this question. There is a detailed investigation of a very similar issue in this question. You'll see there that there's the possibility of improving it still further if you want to go down the multi-threaded line.
Read the file and then do whatever you want with it. In Java 8:
Files.lines(Paths.get("c://lines.txt")).collect(Collectors.toList());