Buffered Reader, Line count, Split, Parse in Java

I have been assigned a task where I have an xyz.txt/CSV file that will contain numeric values, and I am supposed to read it through a BufferedReader, then split those values, and finally parse them.
I have some Java code below; can anybody help me with it?
package javaapplication12;
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.io.LineNumberReader;
public class JavaApplication12 {
public static void main(String[] args) {
String count= "F:\\Gephi\\number.txt";
BufferedReader br = null;
FileReader fr = null;
try {
fr = new FileReader(count);
br = new BufferedReader(fr);
// AT THIS POINT THERE SHOULD BE SOME THING THAT COUNTS NUMBER OF LINES USING COUNT++ OR SOMETHING LIKE THIS//
String sCurrentLine;
while ((sCurrentLine = br.readLine()) != null) {
System.out.println(sCurrentLine);
}
}
catch (IOException e) {
e.printStackTrace();
}
finally {
try {
if (br != null)
br.close();
if (fr != null)
fr.close();
}
catch (IOException ex) {
ex.printStackTrace();
}
}
}
}
// COMING TO THIS POINT THE ABOVE VALUES OF .TXT FILE SHOULD BE SPLIT USING SPLIT PARAMETER//
// AFTER SPLITTING THE SPLIT VALUE SHOULD BE KEPT IN AN ARRAY AND THEN EVENTUALLY PARSED//
Alternatively, if anybody can rewrite the code in another way to solve the above-stated problem, that would also be appreciated.

Here is my solution with Java 8:
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.List;
import java.util.stream.Collectors;
public class BR {
public static void main(String[] args) {
String fileName = "br.txt";
//for the csv format
String regex = ", ";
try (BufferedReader br = new BufferedReader(new FileReader(fileName))) {
List<String[]> lines = br.lines()
.map(line -> line.split(regex))
.collect(Collectors.toList());
parse(lines);
} catch (IOException e) {
e.printStackTrace();
}
}
private static void parse(List<String[]> lines) {
//Do your stuff here
}
}
The BufferedReader is initialized in the try header (this approach is called try-with-resources): because BufferedReader implements the AutoCloseable interface, the reader is closed automatically even if an exception is thrown.
The br.lines() method returns a stream of all lines from the file.
In the map step, each line that is read is passed to a lambda, split using the regex variable (for this CSV format it is ", "), and the resulting array is collected.
The result is a List of String arrays, which you can transform further in the body of the map function.
For more clarification I suggest checking some Java 8 tutorials; then you will fully understand what is going on.
This solution might be ahead of your current knowledge level, but I hope it inspires you to look into more modern approaches.
Have a nice day.
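If you also need the line count and the actual parsing the question asks for, a minimal sketch of the parse method could look like this; the line count simply falls out as lines.size(). It assumes the values are integers, so switch to Double.parseDouble if they are decimals.
private static void parse(List<String[]> lines) {
    // The number of lines read from the file.
    System.out.println("Line count: " + lines.size());
    for (String[] tokens : lines) {
        for (String token : tokens) {
            // Throws NumberFormatException if a token is not a valid integer.
            int value = Integer.parseInt(token.trim());
            System.out.println(value);
        }
    }
}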


How to read text files in Java?

I was tasked with writing and reading text files in Java, and I managed to successfully write a text file and display the contents (first names) in the new file called "FirstNames". I am supposed to use a try-catch block to accomplish this task; however, I am unable to successfully read the file back, as it produces errors that I cannot fix. Any help would be greatly appreciated!
My Code:
// Import file
import java.io.*;
// Import file reader
import java.io.FileReader;
// Import IOException to handle any errors
import java.io.IOException;
// Create class and method
class Main {
public static void main(String[] args) {
// Start a try-catch block
try {
// Initialize the new objects
FileWriter fw = new FileWriter("FirstNames");
BufferedWriter bw = new BufferedWriter(fw);
// Create a String array to store the first names
String names[] = new String[] { "Hussain", "Ronald", "John", "James", "Robert", "Michael", "William", "David",
"Joseph", "Daniel" };
// Output the first names in the textfile
for (int x = 0; x < 10; x++){
bw.write(names[x]);
bw.newLine();
}
bw.close();
fw.close();
// Catch any errors
} catch (Exception e) {
System.out.println("An error occured!");
}
// Experiencing issues starting from here:
// Create another try-catch block to read the file
try {
// Initialize the new objects
FileReader fr = new FileReader("FirstNames.txt");
BufferedReader br = new BufferedReader(fr);
String line = br.readLine();
// Start a while loop to output the line
while (line != null) {
System.out.println(line);
line = br.readLine();
}
br.close();
fr.close();
} catch (NullPointerException e1) { // I have put it to NullPointerException only to see the errors I'm getting for now
// System.out.println("An Error Occured!");
}
}
}
My Output:
Your problem was that you were writing "FirstNames" and then trying to read "FirstNames.txt".
I've made some enhancements below to use try-with-resources, which offers the benefit of not having to close the resource at the end of use.
I've also replaced the file name with a single variable that holds the string name ("extract to variable" under the Refactor menu in IntelliJ).
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
class Main {
public static void main(String[] args) {
// By deduplicating the filename you will remove the chance for errors.
String fileName = "FirstNames";
try (FileWriter fw = new FileWriter(fileName)) {
// Initialize the new objects
BufferedWriter bw = new BufferedWriter(fw);
// should probably be a public static final String[] class field.
String names[] = new String[]{"Hussain",
"Ronald",
"John",
"James",
"Robert",
"Michael",
"William",
"David",
"Joseph",
"Daniel"};
// Output the first names in the textfile
for (String name : names) {
bw.write(name);
bw.newLine();
}
bw.close();
} catch (IOException ex) {
ex.printStackTrace(); // read the stack trace to understand the errors
}
// Now TRY reading the file
try (FileReader fr = new FileReader(fileName)){
// Initialize the new objects
BufferedReader br = new BufferedReader(fr);
String line;
// Start a while loop to output the line
while ((line = br.readLine()) != null) {
System.out.println(line);
}
br.close();
} catch (IOException e) {
System.out.println("Catches errors related to br.readLine(), br.close() and new FileReader");
e.printStackTrace();
}
}
}
Hopefully this is not so far removed from your original code that it no longer makes sense.
As others have mentioned, in a real-world situation you may want to throw all or some errors higher up the call stack so they can be handled by a centralised error handler (so that all error handling logic lives in one place rather than being scattered around); a minimal sketch of that idea follows below.
At the moment I've just put ex.printStackTrace() and allowed execution to continue. This could result in multiple stack traces being printed out, which can be confusing.
Essentially, start with the first error in the output; if you fix that, you may fix all the problems. Otherwise, re-run and look at the next first error in the console... and so on. Eventually, when you've fixed all the errors, the code will run :)
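As a minimal sketch of the "throw it higher up" idea (assuming you are happy for the program to terminate on failure), main can simply declare the exception instead of catching it:
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

class Main {
    // Declaring IOException here pushes error handling up to the caller (here the JVM),
    // which prints the stack trace and stops the program.
    public static void main(String[] args) throws IOException {
        try (BufferedReader br = new BufferedReader(new FileReader("FirstNames"))) {
            String line;
            while ((line = br.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}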

Best way to use CSV files in Java

I'm having trouble deciding on the best approach to reading a CSV file in order to extract and compare certain things in it. The file is made up of strings, and I need to keep track of whether there are duplicated items. Here is what I have so far.
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
public class CSVReader {
public static void main(String[] args) {
String csvFile = "Cchallenge.csv";
String line = "";
String cvsSplitBy = ",";
try (BufferedReader br = new BufferedReader(new FileReader(csvFile))) {
while ((line = br.readLine()) != null) {
// use comma as separator
String[] country = line.split(cvsSplitBy);
}
} catch (IOException e) {
e.printStackTrace();
}
}
}
So I made an array called country with all the data. But when I go to print out the array's length, I get a lot of different arrays with varying sizes. I am having a hard time traversing the arrays and extracting the duplicates. Any ideas would help, thanks.
If you simply wish to get a list of the items without any duplicates, then you could collect the items into a set, as sets do not allow duplicate items:
Set<String> items = new HashSet<>();
try (BufferedReader br = new BufferedReader(new FileReader(csvFile))) {
while ((line = br.readLine()) != null) {
items.addAll(Arrays.asList(line.split(cvsSplitBy)));
}
} catch (IOException e) {
e.printStackTrace();
}
If you also want to keep track of the duplicates, you could use another set and add items into it if they already exist in the first set. This is easy to accomplish, as the add method of Set returns a boolean: it returns false if the set already contained the specified element:
Set<String> items = new HashSet<>();
Set<String> duplicates = new HashSet<>();
try (BufferedReader br = new BufferedReader(new FileReader(csvFile))) {
while ((line = br.readLine()) != null) {
for (String item : line.split(cvsSplitBy)) {
if (items.add(item)) {
continue;
}
duplicates.add(item);
}
}
} catch (IOException e) {
e.printStackTrace();
}
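Putting the pieces together, a self-contained sketch might look like the following; the file name and separator are taken from the question, and printing the two sets at the end is just for illustration:
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashSet;
import java.util.Set;

public class CSVDuplicates {
    public static void main(String[] args) {
        String csvFile = "Cchallenge.csv";
        String cvsSplitBy = ",";
        Set<String> items = new HashSet<>();
        Set<String> duplicates = new HashSet<>();
        String line;
        try (BufferedReader br = new BufferedReader(new FileReader(csvFile))) {
            while ((line = br.readLine()) != null) {
                for (String item : line.split(cvsSplitBy)) {
                    // add returns false if the item was already seen, i.e. it is a duplicate.
                    if (!items.add(item)) {
                        duplicates.add(item);
                    }
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        System.out.println("Unique items: " + items.size());
        System.out.println("Duplicates: " + duplicates);
    }
}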

(Java) how to read a text file block by block

Suppose I have the following text file; how do I read each block of lines, separated by two empty lines, in Java?
Thanks!
Reference Type: Journal Article
Record Number: 153
Author: Yang, W. and Kang, J.
Year: 2005
Title: Acoustic comfort evaluation in urban open public spaces
Journal: Applied Acoustics
Volume: 66
Issue: 2
Pages: 211-229
Short Title: Acoustic comfort evaluation in urban open public spaces
ISSN: 0003682X
DOI: 10.1016/j.apacoust.2004.07.011
'File' Attachments: internal-pdf://0633242026/Acoustic comfort evaluation in urban open public spaces.pdf
Reference Type: Thesis
Record Number: 3318
Author: Wienold, Jan
Year: 2009
Title: Daylight glare in offices
University: Fraunhofer Institute for Solar Energy Systems ISE
Thesis Type: PhD Dissertation
Short Title: Daylight glare in offices
URL: http://publica.fraunhofer.de/eprints/urn:nbn:de:0011-n-1414579.pdf
'File' Attachments: internal-pdf://2172014641/Daylight glare in offices.pdf
It seems that answering questions in this forum is quite picky... I think it's really not necessary. Nevertheless, here's my attempt via Processing, a programming environment built on top of Java:
import java.util.*;
String fileName = "";
String line;
BufferedReader br;
void setup(){
fileName = "My_EndNote_Library_2014-07-04.txt";
br = createReader(fileName);
}
void draw(){
try {
line = br.readLine();
println(line);
println();
} catch (IOException e) {
e.printStackTrace();
line = null;
}
if (line == null) {
// Stop reading because of an error or file is empty
noLoop();
}
}
Since the number of rows in each block is not the same, you can do something like this, using \n\n as the delimiter for each block and \n for each line:
import java.io.*;
public class Main {
public static void main(String[] args) throws IOException {
BufferedReader br = new BufferedReader(new FileReader("file.txt"));
StringBuffer sb = new StringBuffer();
while (true) {
String line = br.readLine();
if (line == null) break;
sb.append(line).append("\n");
}
String[] blocks = sb.toString().split("\n\n");
for (String block : blocks) {
block = block.trim();
// block - individual block from file
String[] data = block.split("\n");
for (String d : data) {
// d - individual line of block
}
}
}
}
There are two flaws in the accepted answer, though the essence of its logic is correct in that you don't need any complex regex:
1. The code is not OS-neutral, since \n is hard-coded.
2. Since a \n is appended after each line, there are three \n characters between two blocks instead of two (two from the two empty lines and one extra from the previous block). Splitting on two characters will also work, but block 1 onwards would then contain an extra newline at the beginning, so you might need trim.
The code below assumes the file is on the classpath rather than at an arbitrary disk path.
import java.io.BufferedReader;
import java.io.IOException;
import java.net.URISyntaxException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
public class ReferenceType {
public static void main(String[] args) {
ReferenceType app = new ReferenceType();
String allLines = null;
String[] blocks = null;
String lineSeparator = System.getProperty("line.separator");
try {
allLines = app.getFileAsString(lineSeparator);
blocks = allLines.split(lineSeparator+lineSeparator+lineSeparator);
} catch (URISyntaxException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
private String getFileAsString(String lineSeparator) throws URISyntaxException, IOException {
Path path = Paths.get(this.getClass().getResource("ReferenceType.txt").toURI());
String textLine = null;
StringBuilder builder = new StringBuilder();
try (BufferedReader br = Files.newBufferedReader(path)) {
while ((textLine = br.readLine()) != null) {
builder.append(textLine);
builder.append(lineSeparator);
}
}
return builder.toString();
}
}
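As an alternative sketch that sidesteps the line-separator issue entirely, java.util.Scanner can split on blank lines directly; the \R pattern (Java 8+) matches any line break, so this stays OS-neutral. The file name here is the one from the earlier answer and is just a placeholder:
import java.io.IOException;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Scanner;

public class BlockScanner {
    public static void main(String[] args) throws IOException {
        Path path = Paths.get("My_EndNote_Library_2014-07-04.txt");
        try (Scanner scanner = new Scanner(path)) {
            // Two or more consecutive line breaks mark the boundary between blocks.
            scanner.useDelimiter("\\R{2,}");
            while (scanner.hasNext()) {
                String block = scanner.next().trim();
                // Each block can then be split into its individual lines.
                String[] lines = block.split("\\R");
                System.out.println("Block with " + lines.length + " lines");
            }
        }
    }
}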

Returning values from a text file as Strings

Okay, so I've seen a few questions about this, but the answers were a bit overwhelming and quite varied. Obviously I'm looking for the simplest solution: I want to be able to read lines from a text file as strings and store them in an ArrayList. I have an addItem(String item) method that adds each imported item to the ArrayList, but I don't know how to read the file correctly and get each line as an individual string.
You're looking for something like BufferedReader, which has functions for reading input from a file line by line.
import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;
public class ItemReader { // wrapper class so the snippet compiles; use your own class
// Stand-in for your ArrayList and your existing addItem(String item) method.
private static final List<String> items = new ArrayList<>();
public static void addItem(String item) {
items.add(item);
}
public static void main(String[] args) {
try {
FileInputStream in = new FileInputStream("inputFile.txt");
BufferedReader br = new BufferedReader(new InputStreamReader(in));
String strLine;
while ((strLine = br.readLine()) != null) {
addItem(strLine);
}
br.close();
} catch (Exception e) {
System.out.println(e);
}
}
}
I'm not at my computer so I didn't compile this answer, but I believe you are looking for something like this:
// file is your txt file.
BufferedReader br = new BufferedReader(new FileReader(file));
String line;
while ((line = br.readLine()) != null) {
// process the line, i.e, add to your list
}
br.close();
Hope it helps
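On Java 8+, java.nio.file.Files can also do this in a couple of lines. Here is a minimal sketch, assuming an addItem method like the one described in the question; the class and file names are placeholders:
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class ItemLoader {
    private final List<String> items = new ArrayList<>();

    // Matches the addItem(String) method described in the question.
    public void addItem(String item) {
        items.add(item);
    }

    // Reads every line of the file as a String and stores it via addItem.
    public void load(String fileName) throws IOException {
        for (String line : Files.readAllLines(Paths.get(fileName))) {
            addItem(line);
        }
    }
}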

Modifying existing file content in Java

I want to replace the second line of the file content; can somebody please help, based on the file format and listener method below?
1324254875443
1313131
Paid
0.0
The 2nd line is a long, and I want to replace it with System.currentTimeMillis().
/************** Pay Button Listener **************/
public class payListener implements ActionListener {
public void actionPerformed(ActionEvent e) {
ArrayList<String> lines = new ArrayList<String>();
String line = null;
try {
FileReader fr = new FileReader("Ticket/" + ticketIDNumber + ".dat");
BufferedReader br = new BufferedReader(fr);
FileWriter fw = new FileWriter("Ticket/" + ticketIDNumber + ".dat");
BufferedWriter bw = new BufferedWriter(fw);
while ((line = br.readLine()) != null) {
if (line.contains("1313131"))
line = line.replace("1313131", "" + System.currentTimeMillis());
lines.add(line);
bw.write(line);
} //end while
} //end try
catch (Exception e) {
} //end catch
} //end method
} //end class
Although this question is very old, I'd like to add that this can be achieved much more easily since Java 1.7 with java.nio.file.Files:
List<String> newLines = new ArrayList<>();
for (String line : Files.readAllLines(Paths.get(fileName), StandardCharsets.UTF_8)) {
if (line.contains("1313131")) {
newLines.add(line.replace("1313131", ""+System.currentTimeMillis()));
} else {
newLines.add(line);
}
}
Files.write(Paths.get(fileName), newLines, StandardCharsets.UTF_8);
As proposed in the accepted answer to a similar question:
open a temporary file in writing mode at the same time, and for each line, read it, modify if necessary, then write into the temporary file. At the end, delete the original and rename the temporary file.
Based on your implementation, something similar to:
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
public class ReplaceFileContents {
public static void main(String[] args) {
new ReplaceFileContents().replace();
}
public void replace() {
String oldFileName = "try.dat";
String tmpFileName = "tmp_try.dat";
BufferedReader br = null;
BufferedWriter bw = null;
try {
br = new BufferedReader(new FileReader(oldFileName));
bw = new BufferedWriter(new FileWriter(tmpFileName));
String line;
while ((line = br.readLine()) != null) {
if (line.contains("1313131"))
line = line.replace("1313131", ""+System.currentTimeMillis());
bw.write(line+"\n");
}
} catch (Exception e) {
return;
} finally {
try {
if(br != null)
br.close();
} catch (IOException e) {
//
}
try {
if(bw != null)
bw.close();
} catch (IOException e) {
//
}
}
// Once everything is complete, delete old file..
File oldFile = new File(oldFileName);
oldFile.delete();
// And rename tmp file's name to old file name
File newFile = new File(tmpFileName);
newFile.renameTo(oldFile);
}
}
I would suggest using the Apache Commons IO library. There you'll find the class org.apache.commons.io.FileUtils. You can use it like this:
File file = new File("... your file...");
List<String> lines = FileUtils.readLines(file);
lines.set(1, ""+System.currentTimeMillis());
FileUtils.writeLines(file, lines);
This code reads entire file contents into a List of Strings and changes the second line's content, then writes the list back to the file.
I'm not sure reading and writing the same file simultaneously is a good idea. I think it would be better to read the file line by line into a String array, replace the second line and then write the String array back into the file.
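A minimal sketch of that read-replace-write approach using java.nio follows; the file path is hypothetical, and note that readAllLines loads the whole file into memory, which is fine for small ticket files:
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;

public class ReplaceSecondLine {
    public static void main(String[] args) throws IOException {
        Path path = Paths.get("Ticket/12345.dat"); // placeholder ticket file
        List<String> lines = Files.readAllLines(path, StandardCharsets.UTF_8);
        if (lines.size() > 1) {
            // Index 1 is the second line of the file.
            lines.set(1, String.valueOf(System.currentTimeMillis()));
        }
        Files.write(path, lines, StandardCharsets.UTF_8);
    }
}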
