Reading, comparing and merging multiple files in Java

Given there are some files Customer-1.txt, Customer-2.txt and Customer-3.txt and these files have the following content:
Customer-1.txt
1|1|MARY|SMITH
2|1|PATRICIA|JOHNSON
4|2|BARBARA|JONES
Customer-2.txt
1|1|MARY|SMITH
2|1|PATRICIA|JOHNSON
3|1|LINDA|WILLIAMS
4|2|BARBARA|JONES
Customer-3.txt
2|1|PATRICIA|JOHNSON
3|1|LINDA|WILLIAMS
5|2|ALEXANDER|ANDERSON
These files have a lot of duplicate data, but it is possible that each file contains some data that is unique.
And given that the actual files are sorted, big (a few GB each), and that there are many files...
Then what is the:
a) memory cheapest
b) cpu cheapest
c) fastest
way in Java to create one file out of these three files that will contain all the unique data of each file, sorted and concatenated like this:
Customer-final.txt
1|1|MARY|SMITH
2|1|PATRICIA|JOHNSON
3|1|LINDA|WILLIAMS
4|2|BARBARA|JONES
5|2|ALEXANDER|ANDERSON
I looked into the following solution https://github.com/upcrob/spring-batch-sort-merge , but I would like to know if it's possible to perhaps do it with FileInputStream and/or a non-Spring-Batch solution.
A solution that uses an in-memory or real database to join them is not viable for my use case, due to the size of the files and the absence of an actual database.

Since the input files are already sorted, a simple parallel iteration of the files, merging their content, is the memory cheapest, cpu cheapest, and fastest way to do it.
This is a multi-way merge join, i.e. a sort-merge join without the "sort", with elimination of duplicates, similar to a SQL DISTINCT.
Here is a version that can handle an unlimited number of input files (well, as many as you can have open at once, anyway). It uses a helper class to stage the next line from each input file, so the leading ID value only has to be parsed once per line.
private static void merge(StringWriter out, BufferedReader... in) throws IOException {
    CustomerReader[] customerReader = new CustomerReader[in.length];
    for (int i = 0; i < in.length; i++)
        customerReader[i] = new CustomerReader(in[i]);
    merge(out, customerReader);
}

private static void merge(StringWriter out, CustomerReader... in) throws IOException {
    List<CustomerReader> min = new ArrayList<>(in.length);
    for (;;) {
        min.clear();
        for (CustomerReader reader : in)
            if (reader.hasData()) {
                int cmp = (min.isEmpty() ? 0 : reader.compareTo(min.get(0)));
                if (cmp < 0)
                    min.clear();
                if (cmp <= 0)
                    min.add(reader);
            }
        if (min.isEmpty())
            break; // all done
        // optional: Verify that lines that compared equal by ID are entirely equal
        out.write(min.get(0).getCustomerLine());
        out.write(System.lineSeparator());
        for (CustomerReader reader : min)
            reader.readNext();
    }
}
private static final class CustomerReader implements Comparable<CustomerReader> {

    private BufferedReader in;
    private String customerLine;
    private int customerId;

    CustomerReader(BufferedReader in) throws IOException {
        this.in = in;
        readNext();
    }

    void readNext() throws IOException {
        if ((this.customerLine = this.in.readLine()) == null)
            this.customerId = Integer.MAX_VALUE;
        else
            this.customerId = Integer.parseInt(this.customerLine.substring(0, this.customerLine.indexOf('|')));
    }

    boolean hasData() {
        return (this.customerLine != null);
    }

    String getCustomerLine() {
        return this.customerLine;
    }

    @Override
    public int compareTo(CustomerReader that) {
        // Order by customerId only. Inconsistent with equals()
        return Integer.compare(this.customerId, that.customerId);
    }
}
TEST
String file1data = "1|1|MARY|SMITH\n" +
                   "2|1|PATRICIA|JOHNSON\n" +
                   "4|2|BARBARA|JONES\n";
String file2data = "1|1|MARY|SMITH\n" +
                   "2|1|PATRICIA|JOHNSON\n" +
                   "3|1|LINDA|WILLIAMS\n" +
                   "4|2|BARBARA|JONES\n";
String file3data = "2|1|PATRICIA|JOHNSON\n" +
                   "3|1|LINDA|WILLIAMS\n" +
                   "5|2|ALEXANDER|ANDERSON\n";
try (
    BufferedReader in1 = new BufferedReader(new StringReader(file1data));
    BufferedReader in2 = new BufferedReader(new StringReader(file2data));
    BufferedReader in3 = new BufferedReader(new StringReader(file3data));
    StringWriter out = new StringWriter();
) {
    merge(out, in1, in2, in3);
    System.out.print(out);
}
OUTPUT
1|1|MARY|SMITH
2|1|PATRICIA|JOHNSON
3|1|LINDA|WILLIAMS
4|2|BARBARA|JONES
5|2|ALEXANDER|ANDERSON
The code merges purely by ID value and doesn't verify that the rest of the line is actually equal. Insert code at the optional comment to check for that, if needed.
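For example, a minimal sketch of that check, inserted at the optional comment (the exception type is just a placeholder for whatever error handling fits your pipeline):

// all readers in min hold lines with the same ID; make sure the lines match entirely
String first = min.get(0).getCustomerLine();
for (CustomerReader reader : min)
    if (!first.equals(reader.getCustomerLine()))
        throw new IllegalStateException("Conflicting rows for the same ID: "
                + first + " vs " + reader.getCustomerLine());

Also note that the test below drives the merge with in-memory readers and a StringWriter; for the multi-GB files in the question, the StringWriter parameter could be widened to Writer (the method only ever calls write on it) and fed BufferedReaders over the input files plus a BufferedWriter for the output file.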

This might help:
public static void main(String[] args) {
    String files[] = {"Customer-1.txt", "Customer-2.txt", "Customer-3.txt"};
    // TreeMap keeps its keys sorted, so the merged file comes out in ID order
    TreeMap<Integer, String> customers = new TreeMap<Integer, String>();
    try {
        String line;
        for (int i = 0; i < files.length; i++) {
            BufferedReader reader = new BufferedReader(new FileReader("data/" + files[i]));
            while ((line = reader.readLine()) != null) {
                // '|' is a regex metacharacter, so it must be escaped in split()
                Integer id = Integer.valueOf(line.split("\\|")[0]);
                customers.put(id, line);
            }
            reader.close();
        }
        BufferedWriter writer = new BufferedWriter(new FileWriter("data/Customer-final.txt"));
        Iterator<String> it = customers.values().iterator();
        while (it.hasNext()) writer.write(it.next() + "\n");
        writer.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
If you have any questions, ask me.

Related

How to replace a string after a specific line in a file using Java

I have a situation where I need to change a line in a batch file if a similar string is not found.
Suppose I have code like the one below in the batch file (I know it is not correct code, it is only a dummy):
public static void main(String[] args) {
    if (user == '1234') {
        ENV DEV
        set DB myDBDEV
        set Excel myExecelDEV
        set API MyAPIURLDEV
    } elseif (user == '5678') {
        ENV UAT
        set DB myDBUAT
        set Excel myExecelUAT
        set API MyAPIURLUAT
    }
}
}
Now I want Java to read the above file, find ENV DEV, and change the values like myDBDEV, myExecelDEV, MyAPIURLDEV, etc.
I am able to find the line number by using the code below:
FileInputStream fis = new FileInputStream("C:\\Users\\owner\\Desktop\\batch\\MYbatch-env.csh");
InputStreamReader input = new InputStreamReader(fis);
BufferedReader br = new BufferedReader(input);
String data;
String result = new String();
int i = 0;
while ((data = br.readLine()) != null) {
    i++;
    if (data.contains("ENV DEV")) {
        System.out.println("line number -> " + i);
    }
    result = result.concat(data + "\n");
}
I tried the approach below, but it did not return the line number, so I used the approach above:
Finding line number of a word in a text file using java
I also tried the approach below, but it does not seem to work:
How to replace an string after a specific line in a file using java
The problem now is that the replaceAll function replaces every occurrence of the key, but I want to replace only the string that follows the key, i.e. its value. And the file is plain text read as a String, not a HashMap-like structure.
In the if block, if the DB string is myDBDEV2, then I want to change the value to myDBDEV.
Example:
If the string below is found:
ENV DEV
then the code should check the value of the key DB below it and replace it if it does not already have the required value:
set DB myDBDEV
set Excel myExecelDEV
set API MyAPIURLDEV
And the main thing is that the code should make the change in the if block only; the variables in the else if block should not be affected, as in the example file I have shown in the URL above. (A sketch of one way to replace only the value is shown right below.)
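As a side note (not from the original post), one way to replace only the value after a key on a line, rather than the whole line, is a regex with a capture group; a minimal sketch, where the key name and the replacement value are placeholders:

// Hypothetical snippet: keep the "set DB " key, swap only the value that follows it.
// Apply this only to lines that sit inside the "ENV DEV" block.
String fixed = line.replaceAll("(set\\s+DB\\s+)\\S+", "$1myDBDEV");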
The solution below worked for me:
public static void main(String[] args) throws Exception {
    String filepath = "C:\\Users\\Demo\\Desktop\\batch\\Demo.sh";
    FileInputStream fis = new FileInputStream(filepath);
    InputStreamReader input = new InputStreamReader(fis);
    BufferedReader br = new BufferedReader(input);
    String data;
    String result = new String();
    int lineNumber = 0;
    int i = 0;
    while ((data = br.readLine()) != null) {
        i++;
        if (data.contains("My String data")) {
            System.out.println("line number -> " + i);
            lineNumber = i;
            break;
        }
        result = result.concat(data + "\n");
    }
    br.close();
    lineNumber = lineNumber + 1;
    System.out.println(lineNumber);
    String Mystring = " Mystring";
    String Mystringline = Files.readAllLines(Paths.get(filepath)).get(lineNumber - 1); // get() counts from 0, so -1
    System.out.println("Line data ->> " + Mystringline);
    if (!Mystringline.equalsIgnoreCase(Mystring)) {
        setVariable(lineNumber, Mystring, filepath);
    } else {
        System.out.println("Mystring is already pointing to correct DB");
    }
    System.out.println("Successfully changed");
}

public static void setVariable(int lineNumber, String data, String filepath) throws IOException {
    Path path = Paths.get(filepath);
    List<String> lines = Files.readAllLines(path, StandardCharsets.UTF_8);
    lines.set(lineNumber - 1, data);
    Files.write(path, lines, StandardCharsets.UTF_8);
}
}

read txt file and store data in a hashtable in java

I am reading a txt file and storing the data in a hashtable, but I can't get the correct output. The txt file looks like this (in part; attached as an image captioned "this is part of my data").
I want to store column 1 and column 2 as the key (String type) in the hashtable, and column 3 and column 4 as the value (ArrayList type).
My code below:
private Hashtable<String, ArrayList<String[]>> readData() throws Exception {
    BufferedReader br = new BufferedReader(new FileReader("MyGridWorld.txt"));
    br.readLine();
    ArrayList<String[]> value = new ArrayList<String[]>();
    String[] probDes = new String[2];
    String key = "";
    //read file line by line
    String line = null;
    while ((line = br.readLine()) != null && !line.equals(";;")) {
        //System.out.println("line ="+line);
        String source;
        String action;
        //split by tab
        String[] splited = line.split("\\t");
        source = splited[0];
        action = splited[1];
        key = source + "," + action;
        probDes[0] = splited[2];
        probDes[1] = splited[3];
        value.add(probDes);
        hashTableForWorld.put(key, value);
        System.out.println("hash table is like this:" + hashTableForWorld);
    }
    br.close();
    return hashTableForWorld;
}
The output looks like this:
It's one very, very long line (attached as an image).
I think maybe the hashtable is broken, but I don't know why. Thank you for reading my problem.
The first thing we need to establish is that you have a really obvious XY-Problem, in that "what you need to do" and "how you're trying to solve it" are completely at odds with each other.
So let's go back to the original problem and try to work out what we need first.
As best as I can determine, source and action are connected, in that they represent queryable "keys" to your data structure, and probability, destination, and reward are queryable "outcomes" in your data structure. So we'll start by creating objects to represent those two concepts:
public class SourceAction implements Comparable<SourceAction> {

    public final String source;
    public final String action;

    public SourceAction() {
        this("", "");
    }

    public SourceAction(String source, String action) {
        this.source = source;
        this.action = action;
    }

    public int compareTo(SourceAction sa) {
        int comp = source.compareTo(sa.source);
        if (comp != 0) return comp;
        return action.compareTo(sa.action);
    }

    public boolean equals(SourceAction sa) {
        return source.equals(sa.source) && action.equals(sa.action);
    }

    public String toString() {
        return source + ',' + action;
    }
}
public class Outcome {

    public String probability; //You can use double if you've written code to parse the probability
    public String destination;
    public String reward; //you can use double if you've written code to parse the reward

    public Outcome() {
        this("", "", "");
    }

    public Outcome(String probability, String destination, String reward) {
        this.probability = probability;
        this.destination = destination;
        this.reward = reward;
    }

    public boolean equals(Outcome o) {
        return probability.equals(o.probability) && destination.equals(o.destination) && reward.equals(o.reward);
    }

    public String toString() {
        return probability + ',' + destination + ',' + reward;
    }
}
So then, given these objects, what sort of Data Structure can properly encapsulate the relationship between these objects, given that a SourceAction seems to have a One-To-Many relationship to Outcome objects? My suggestion is that a Map<SourceAction, List<Outcome>> represents this relationship.
private Map<SourceAction, List<Outcome>> readData() throws Exception {
It is possible to use a Hash Table (in this case, HashMap) to contain these objects, but I'm trying to keep the code as simple as possible, so we're going to stick to the more generic interface.
Then, we can reuse the logic you used in your original code to insert values into this data structure, with a few tweaks.
private Map<SourceAction, List<Outcome>> readData() {
    //I'm using a TreeMap because that makes the implementation simpler. If you absolutely
    //need to use a HashMap, then make sure you override hashCode() and equals(Object) in SourceAction
    Map<SourceAction, List<Outcome>> dataStructure = new TreeMap<>();
    //We're using a try-with-resources block to eliminate the later call to close the reader
    try (BufferedReader br = new BufferedReader(new FileReader("MyGridWorld.txt"))) {
        br.readLine(); //Skip the first line because it's just a header
        //read file line by line
        String line = null;
        while ((line = br.readLine()) != null && !line.equals(";;")) {
            //split by tab
            String[] splited = line.split("\\t");
            SourceAction sourceAction = new SourceAction(splited[0], splited[1]);
            Outcome outcome = new Outcome(splited[2], splited[3], splited[4]);
            if (dataStructure.containsKey(sourceAction)) {
                //Entry already found; we're just going to add this outcome to the already
                //existing list.
                dataStructure.get(sourceAction).add(outcome);
            } else {
                List<Outcome> outcomes = new ArrayList<>();
                outcomes.add(outcome);
                dataStructure.put(sourceAction, outcomes);
            }
        }
    } catch (IOException e) {
        //Do whatever, or rethrow the exception
    }
    return dataStructure;
}
Then, if you want to query for all the outcomes associated with a given source + action, you need only construct a SourceAction object and query the Map for it.
Map<SourceAction, List<Outcome>> actionMap = readData();
List<Outcome> outcomes = actionMap.get(new SourceAction("(1,1)", "Up"));
assert(outcomes != null);
assert(outcomes.size() == 3);
assert(outcomes.get(0).equals(new Outcome("0.8", "(1,2)", "-0.04")));
assert(outcomes.get(1).equals(new Outcome("0.1", "(2,1)", "-0.04")));
assert(outcomes.get(2).equals(new Outcome("0.1", "(1,1)", "-0.04")));
This should yield the functionality you need for your problem.
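As the comment in readData notes, if you do want a HashMap instead of the TreeMap, SourceAction also needs proper hashCode() and equals(Object) overrides (the equals(SourceAction) shown above does not override Object.equals). A minimal sketch of what could be added to SourceAction:

@Override
public boolean equals(Object o) {
    if (this == o) return true;
    if (!(o instanceof SourceAction)) return false;
    SourceAction sa = (SourceAction) o;
    return source.equals(sa.source) && action.equals(sa.action);
}

@Override
public int hashCode() {
    // combine both fields so equal SourceActions land in the same bucket
    return java.util.Objects.hash(source, action);
}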
You should change your logic for adding to your hashtable to check for the key you create. If the key exists, then grab your array list of arrays that it maps to and add your array to it. Currently you will overwrite the data.
Try this
if (hashTableForWorld.containsKey(key))
{
    value = hashTableForWorld.get(key);
    value.add(probDes);
    hashTableForWorld.put(key, value);
}
else
{
    value = new ArrayList<String[]>();
    value.add(probDes);
    hashTableForWorld.put(key, value);
}
Then to print the contents try something like this
for (Map.Entry<String, ArrayList<String[]>> entry : hashTableForWorld.entrySet()) {
    String key = entry.getKey();
    ArrayList<String[]> value = entry.getValue();
    System.out.println("Key: " + key + " Value: ");
    for (int i = 0; i < value.size(); i++)
    {
        System.out.print("Array " + i + ": ");
        for (String val : value.get(i))
            System.out.print(val + " :: ");
        System.out.println();
    }
}
Hashtable and ArrayList (and the other collections) do not make a copy of the key and value, so all the values you are storing are the same probDes array you allocate once at the beginning. (It is normal that a String[] prints in a cryptic form, and you would have to pretty-print it yourself, but you can already see that it is the very same cryptic thing every time.)
What is certain is that you should allocate a new probDes for each element inside the loop.
Based on your data you could work with an array as the value, in my opinion; there is no real use for the ArrayList.
And the same applies to value: it has to be allocated separately upon encountering a new key:
private Hashtable<String, ArrayList<String[]>> readData() throws Exception {
    try (BufferedReader br = new BufferedReader(new FileReader("MyGridWorld.txt"))) {
        br.readLine();
        Hashtable<String, ArrayList<String[]>> hashTableForWorld = new Hashtable<>();
        //read file line by line
        String line = null;
        while ((line = br.readLine()) != null && !line.equals(";;")) {
            //System.out.println("line ="+line);
            String source;
            String action;
            //split by tab
            String[] split = line.split("\\t");
            source = split[0];
            action = split[1];
            String key = source + "," + action;
            String[] probDesRew = new String[3];
            probDesRew[0] = split[2];
            probDesRew[1] = split[3];
            probDesRew[2] = split[4];
            ArrayList<String[]> value = hashTableForWorld.get(key);
            if (value == null) {
                value = new ArrayList<>();
                hashTableForWorld.put(key, value);
            }
            value.add(probDesRew);
        }
        return hashTableForWorld;
    }
}
Besides relocating the variables to their place of actual usage, the return value is also created locally, and the reader is wrapped in a try-with-resources construct, which ensures that it gets closed even if an exception occurs (see the official tutorial here).

save to and load from not working in java [closed]

For the life of me, I can't figure out what is wrong with this code. The save keeps overwriting itself, and the load doesn't load the already existing data. I have searched for this, but it seems like people use different approaches. Please help me end my headache.
// Write to file
static void writeToFile(Customer c[], int number_of_customers) throws IOException {
    // set up file for output
    // pw used to write to file
    File outputFile = new File("Customers.dat");
    FileOutputStream fos = new FileOutputStream(outputFile);
    PrintWriter pw = new PrintWriter(new OutputStreamWriter(fos));
    int i = 0;
    do {
        pw.println(c[i].getName());
        pw.println(c[i].getNumber());
        i++;
    } while (i < number_of_customers);
    pw.println(0);
    pw.println(0);
    pw.close();
}
// Read from file
public static int readFromFile(Customer c[]) throws IOException {
    // set up file for reading
    // br used to read from file
    File inputFile = new File("Customers.dat");
    FileInputStream fis = new FileInputStream(inputFile);
    BufferedReader br = new BufferedReader(new InputStreamReader(fis));
    String cus;
    int l = -1;
    // Subtract AND assignment operator, It subtracts right operand from the
    // left operand and assign the result to left operand
    int all_customers = 0;
    do {
        l++;
        c[l] = new Customer();
        c[l].cus_name = br.readLine();
        cus = br.readLine();
        c[l].cus_no = Integer.parseInt(cus);
        all_customers++;
    } while (c[l].cus_no != 0); // end while
    br.close(); // end ReadFile class
    return all_customers - 1;
}
An alternative way to fix your write method would be to use a FileOutputStream constructor that lets you request that data be appended to the end of the file.
FileOutputStream fos = new FileOutputStream(outputFile, true);
This does assume that you always write a complete final record with an end of line after it, even under error conditions. You'll still have to deal with this type of situation with the other solution (read and merge), but with that one the subsequent run can detect and deal with it if necessary. So the append solution I describe is not as robust.
You have a number of issues with your code.
Looking first at your readFromFile method:
You're passing in an array that your method is filling up with all the records it finds. What happens if there are more customers in the file than there's room for in the array? (hint: ArrayIndexOutOfBoundsException is a thing)
You're parsing an integer read as a string from the file. What happens if the file is corrupt and the line read is not an integer?
The name of the file to read from is hard-coded. This should be a constant or configuration option. For the purpose of writing methods, it is best to make it a parameter.
You're opening the file and reading from it in the method. For purposes of unit testing, you should split this into separate methods.
In general, you should be using a Collections class instead of an array to hold a list of objects.
You're accessing the Customer attributes directly in the readFromFile method. You should be using an accessor method.
Collections-based approach
Here's my proposed rewrite based on using Collections APIs:
public static List<Customer> readFromFile(String filename) throws IOException {
    // set up file for reading
    // br used to read from file
    File inputFile = new File(filename);
    FileInputStream fis = new FileInputStream(inputFile);
    BufferedReader br = new BufferedReader(new InputStreamReader(fis));
    List<Customer> customers = readFromStream(br);
    br.close(); // end ReadFile class
    return customers;
}
This uses this method to actually read the contents:
public static List<Customer> readFromStream(BufferedReader br) throws IOException {
    List<Customer> customerList = new LinkedList<>();
    boolean moreCustomers = true;
    while (moreCustomers) {
        try {
            Customer customer = new Customer();
            customer.setName(br.readLine());
            String sCustNo = br.readLine();
            customer.setNumber(Integer.parseInt(sCustNo));
            if (customer.getNumber() == 0) {
                moreCustomers = false;
            }
            else {
                customerList.add(customer);
            }
        }
        catch (NumberFormatException x) {
            // happens if the line is not a number.
            // handle this somehow, e.g. by ignoring, logging, or stopping execution
            // for now, we just stop reading
            moreCustomers = false;
        }
    }
    return customerList;
}
Using a similar approach for writeToFile, we get:
static void writeToFile(Collection<Customer> customers, String filename) throws IOException {
    // set up file for output
    // pw used to write to file
    File outputFile = new File(filename);
    FileOutputStream fos = new FileOutputStream(outputFile);
    PrintWriter pw = new PrintWriter(new OutputStreamWriter(fos));
    writeToStream(customers, pw);
    pw.flush();
    pw.close();
}
static void writeToStream(Collection<Customer> customers, PrintWriter pw) throws IOException {
    for (Customer customer : customers) {
        pw.println(customer.getName());
        pw.println(customer.getNumber());
    }
    pw.println(0);
    pw.println(0);
}
However, we still haven't addressed your main concern. It seems you want to merge the file content with the customers in memory when you call writeToFile. I suggest that you instead introduce a new method for this purpose. This keeps the existing methods simpler:
static void syncToFile(Collection<Customer> customers, String filename) throws IOException {
    // get a list of existing customers
    List<Customer> customersInFile = readFromFile(filename);
    // use a set to merge
    Set<Customer> customersToWrite = new HashSet<>();
    // first add current in-memory customers
    customersToWrite.addAll(customers);
    // then add the ones from the file. Duplicates will be ignored
    customersToWrite.addAll(customersInFile);
    // then save the merged set
    writeToFile(customersToWrite, filename);
}
Oh... I almost forgot: the magic of using a Set to merge the file and in-memory list relies on you implementing the equals() method in the Customer class. If you override equals(), you should also override hashCode(). For example:
public class Customer {

    @Override
    public boolean equals(Object obj) {
        return (obj != null) && (obj instanceof Customer) && (getNumber() == ((Customer) obj).getNumber());
    }

    @Override
    public int hashCode() {
        return getNumber() + 31;
    }
}
CustomerList-based approach
If you cannot use Collections APIs, the second-best would be to write your own collection type that supports the same operations, but is backed by an array (or linked list, if you have learned that). In your case, it would be a list of customers. I'll call the type CustomerList:
Analyzing our existing code, we'll need a class that implements an add method and a way to traverse the list. Ignoring Iterators, we'll accomplish the latter with a getLength and a getCustomer (by index). For the synchronization, we also need a way to check if a customer is in the list, so we'll add a contains method:
public class CustomerList {

    private static final int INITIAL_SIZE = 100;
    private static final int SIZE_INCREMENT = 100;

    // list of customers. We're keeping it packed, so there
    // should be no holes!
    private Customer[] customers = new Customer[INITIAL_SIZE];
    private int numberOfCustomers = 0;

    /**
     * Adds a new customer at end. Allows duplicates.
     *
     * @param newCustomer the new customer to add
     * @return the updated number of customers in the list
     */
    public int add(Customer newCustomer) {
        if (numberOfCustomers == customers.length) {
            // the current array is full, make a new one with more headroom
            Customer[] newCustomerList = new Customer[customers.length + SIZE_INCREMENT];
            for (int i = 0; i < customers.length; i++) {
                newCustomerList[i] = customers[i];
            }
            // we will add the new customer at end!
            newCustomerList[numberOfCustomers] = newCustomer;
            // replace the customer list with the new one
            customers = newCustomerList;
        }
        else {
            customers[numberOfCustomers] = newCustomer;
        }
        // we've added a new customer!
        numberOfCustomers++;
        return numberOfCustomers;
    }

    /**
     * @return the number of customers in this list
     */
    public int getLength() {
        return numberOfCustomers;
    }

    /**
     * @param i the index of the customer to retrieve
     * @return Customer at index <code>i</code> of this list (zero-based).
     */
    public Customer getCustomer(int i) {
        //TODO: Add boundary check of i (0 <= i < numberOfCustomers)
        return customers[i];
    }

    /**
     * Check if a customer with the same number as the one given exists in this list
     * @param customer the customer to check for (will use customer.getNumber() to check against list)
     * @return <code>true</code> if the customer is found. <code>false</code> otherwise.
     */
    public boolean contains(Customer customer) {
        for (int i = 0; i < numberOfCustomers; i++) {
            if (customers[i].getNumber() == customer.getNumber()) {
                return true;
            }
        }
        // if we got here, it means we didn't find the customer
        return false;
    }
}
With this implemented, the rewrite of the writeToFile method is exactly the same, except we use CustomerList instead of List<Customer>:
static void writeToFile(CustomerList customers, String filename) throws IOException {
    // set up file for output
    // pw used to write to file
    File outputFile = new File(filename);
    FileOutputStream fos = new FileOutputStream(outputFile);
    PrintWriter pw = new PrintWriter(new OutputStreamWriter(fos));
    writeToStream(customers, pw);
    pw.flush();
    pw.close();
}
The writeToStream is also very similar, except since we're not using an Iterator, we have to traverse the list manually:
static void writeToStream(CustomerList customers, PrintWriter pw) throws IOException {
    for (int i = 0; i < customers.getLength(); i++) {
        pw.println(customers.getCustomer(i).getName());
        pw.println(customers.getCustomer(i).getNumber());
    }
    pw.println(0);
    pw.println(0);
}
Similar for readFromFile -- pretty much the same except for the list type:
public static CustomerList readFromFile(String filename) throws IOException {
    // set up file for reading
    // br used to read from file
    File inputFile = new File(filename);
    FileInputStream fis = new FileInputStream(inputFile);
    BufferedReader br = new BufferedReader(new InputStreamReader(fis));
    CustomerList customers = readFromStream(br);
    br.close(); // end ReadFile class
    return customers;
}
The readFromStream is also pretty much the same, except for the type (the methods used on CustomerList have the same signatures as the ones used on List<Customer>):
public static CustomerList readFromStream(BufferedReader br) throws IOException {
    CustomerList customerList = new CustomerList();
    boolean moreCustomers = true;
    while (moreCustomers) {
        try {
            Customer customer = new Customer();
            customer.setName(br.readLine());
            String sCustNo = br.readLine();
            customer.setNumber(Integer.parseInt(sCustNo));
            if (customer.getNumber() == 0) {
                moreCustomers = false;
            }
            else {
                customerList.add(customer);
            }
        }
        catch (NumberFormatException x) {
            // happens if the line is not a number.
            // handle this somehow, e.g. by ignoring, logging, or stopping execution
            // for now, we just stop reading
            moreCustomers = false;
        }
    }
    return customerList;
}
The most different method is syncToFile: since we don't have the Set type that guarantees no duplicates, we have to check each customer from the file manually before inserting it:
static void syncToFile(CustomerList customers, String filename) throws IOException {
    // get a list of existing customers
    CustomerList customersInFile = readFromFile(filename);
    // use a second CustomerList to merge
    CustomerList customersToWrite = new CustomerList();
    // first add current in-memory customers
    for (int i = 0; i < customers.getLength(); i++) {
        customersToWrite.add(customers.getCustomer(i));
    }
    // then add the ones from the file. But skip duplicates
    for (int i = 0; i < customersInFile.getLength(); i++) {
        if (!customersToWrite.contains(customersInFile.getCustomer(i))) {
            customersToWrite.add(customersInFile.getCustomer(i));
        }
    }
    // then save the merged set
    writeToFile(customersToWrite, filename);
}
Something to note here is that we could have optimized the add operations by having an extra constructor for CustomerList that took the new capacity, but I'll leave at least something for you to figure out ;)

Reading from csv files

This is a project I'm working on at college; everything seems good except in the Game class, which initializes the game. Here is a snippet:
public class Game {

    private Player player;
    private World world;
    private ArrayList<NonPlayableFighter> weakFoes;
    private ArrayList<NonPlayableFighter> strongFoes;
    private ArrayList<Attack> attacks;
    private ArrayList<Dragon> dragons;

    public Game() throws IOException {
        player = new Player("");
        world = new World();
        weakFoes = new ArrayList<NonPlayableFighter>();
        strongFoes = new ArrayList<NonPlayableFighter>();
        attacks = new ArrayList<Attack>();
        dragons = new ArrayList<Dragon>();
        loadAttacks("Database-Attacks_20309.csv");
        loadFoes("Database-Foes_20311.csv");
        loadDragons("Database-Dragons_20310.csv");
    }
After that follow some getters and the 4 methods I am supposed to implement.
These methods are loadCSV(String filePath), loadAttacks(String filePath), loadFoes(String filePath), loadDragons(String filePath).
I have created loadCSV(String filePath) so that it returns an ArrayList of String[], here:
private ArrayList<String[]> loadCSV(String filePath) throws IOException {
    String currentLine = "";
    ArrayList<String[]> result = new ArrayList<String[]>();
    FileReader fileReader = new FileReader(filePath);
    BufferedReader br = new BufferedReader(fileReader);
    currentLine = br.readLine();
    while (currentLine != null) {
        String[] split = currentLine.split(",");
        result.add(split);
    }
    br.close();
    return result;
}
Then I would like to load some attacks, foes, and dragons, inserting them into the appropriate ArrayList.
I implemented loadAttacks(String filePath) here:
private void loadAttacks(String filePath) throws IOException {
    ArrayList<String[]> allAttacks = loadCSV(filePath);
    for (int i = 0; i < allAttacks.size(); i++) {
        String[] current = allAttacks.get(i);
        Attack temp = null;
        switch (current[0]) {
            case "SA": temp = new SuperAttack(current[1], Integer.parseInt(current[2]));
                break;
            case "UA": temp = new UltimateAttack(current[1], Integer.parseInt(current[2]));
                break;
            case "MC": temp = new MaximumCharge();
                break;
            case "SS": temp = new SuperSaiyan();
                break;
        }
        attacks.add(temp);
    }
}
I wrote it so that it takes the ArrayList returned by loadCSV(String filePath), switches on the first String of each String[] in the ArrayList, and thus creates the appropriate attack and adds it to attacks.
Then I would like to read another CSV for the foes. That CSV file is structured so that the first line holds some attributes, the second line some attacks of type SuperAttack, and the third line some attacks of type UltimateAttack. Also, within each foe there is a boolean attribute that determines whether it is a strong or weak foe, and thus which ArrayList it goes into. Here is the code for loadFoes(String filePath):
private void loadFoes(String filePath) throws IOException {
    ArrayList<String[]> allFoes = loadCSV(filePath);
    for (int i = 0; i < allFoes.size(); i += 3) {
        String[] current = allFoes.get(i);
        String[] supers = allFoes.get(i + 1);
        String[] ultimates = allFoes.get(i + 2);
        ArrayList<SuperAttack> superAttacks = new ArrayList<SuperAttack>();
        ArrayList<UltimateAttack> ultimateAttacks = new ArrayList<UltimateAttack>();
        NonPlayableFighter temp = null;
        for (int j = 0; i < supers.length; j++) {
            int index = attacks.indexOf(supers[j]);
            if (index != -1) {
                superAttacks.add((SuperAttack) attacks.get(index));
            }
            else break;
        }
        for (int j = 0; i < ultimates.length; j++) {
            int index = attacks.indexOf(ultimates[j]);
            if (index != -1) {
                ultimateAttacks.add((UltimateAttack) attacks.get(index));
            }
            else break;
        }
        if (current[7].equalsIgnoreCase("True")) {
            temp = new NonPlayableFighter(current[0], Integer.parseInt(current[1]),
                Integer.parseInt(current[2]), Integer.parseInt(current[3]),
                Integer.parseInt(current[4]), Integer.parseInt(current[5]),
                Integer.parseInt(current[6]), true, superAttacks, ultimateAttacks);
            strongFoes.add(temp);
        }
        else {
            temp = new NonPlayableFighter(current[0], Integer.parseInt(current[1]),
                Integer.parseInt(current[2]), Integer.parseInt(current[3]),
                Integer.parseInt(current[4]), Integer.parseInt(current[5]),
                Integer.parseInt(current[6]), false, superAttacks, ultimateAttacks);
            weakFoes.add(temp);
        }
    }
}
First I get the next three String[] from the ArrayList returned by loadCSV(String filePath), and use two loops to check whether the attacks are among the previously loaded attacks. Then I check the attribute that determines whether the foe is strong or weak, and accordingly create a new NonPlayableFighter and add it to the appropriate list.
Running the JUnit 4 tests for this assignment gives me a compilation error: Unhandled exception type IOException. And generally speaking, does the code have any notable problems?
It's better to reuse an already existing CSV file reader for Java (e.g. CSVReader from OpenCSV) if writing one isn't part of your task.
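Not part of the original answer, but a rough sketch of what that could look like with OpenCSV's CSVReader (assuming the library is on the classpath; exact checked exception types vary between OpenCSV versions, hence the broad throws clause):

import com.opencsv.CSVReader;
import java.io.FileReader;
import java.util.List;

public class CsvLoadSketch {
    public static void main(String[] args) throws Exception {
        try (CSVReader reader = new CSVReader(new FileReader("Database-Attacks_20309.csv"))) {
            List<String[]> rows = reader.readAll(); // each row is already split into columns
            for (String[] row : rows) {
                System.out.println(String.join(" | ", row));
            }
        }
    }
}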
That is a lot of code, so I'll just answer your compilation error.
When reading a file, you have to put your code in a try/catch in order to avoid this kind of error. In your loadCSV method you have to set up a try/catch block.
Please refer to this site for a complete tutorial.
try (BufferedReader br = new BufferedReader(new FileReader("C:\\testing.txt"))) {
    String sCurrentLine;
    while ((sCurrentLine = br.readLine()) != null) {
        String[] split = sCurrentLine.split(",");
        result.add(split);
    }
} catch (IOException e) {
    e.printStackTrace();
}
To make it short, code that accesses files has to be in a try/catch to avoid the IOException, or in a method that declares the exception (but then it has to be caught elsewhere).
The code above is also a good example of try-with-resources, a very good way to manage your resources and memory.
loadCSV(String filePath) is an infinite loop, isn't it? And as for the IOException, as @RPresle suggested, a try/catch around the BufferedReader would do the trick.
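For reference, a minimal sketch of the read loop without the infinite loop (my addition, not from the answers above), advancing currentLine on every iteration:

String currentLine;
while ((currentLine = br.readLine()) != null) { // readLine() advances, so the loop ends at EOF
    result.add(currentLine.split(","));
}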

How do I read from a File to an array

I am trying to read from a file into an array. I tried two different styles and neither is working. Below are the two styles.
Style 1
public class FileRead {

    int i;
    String a[] = new String[2];

    public void read() throws FileNotFoundException {
        //Z means: "The end of the input but for the final terminator, if any"
        a[i] = new Scanner(new File("C:\\Users\\nnanna\\Documents\\login.txt")).useDelimiter("\\n").next();
        for (i = 0; i <= a.length; i++) {
            System.out.println("" + a[i]);
        }
    }

    public static void main(String args[]) throws FileNotFoundException {
        new FileRead().read();
    }
}
Style 2
public class FileReadExample {

    private int j = 0;
    String path = null;

    public void fileRead(File file) {
        StringBuilder attachPhoneNumber = new StringBuilder();
        try {
            FileReader read = new FileReader(file);
            BufferedReader bufferedReader = new BufferedReader(read);
            while ((path = bufferedReader.readLine()) != null) {
                String a[] = new String[3];
                a[j] = path;
                j++;
                System.out.println(path);
                System.out.println(a[j]);
            }
            bufferedReader.close();
        } catch (IOException exception) {
            exception.printStackTrace();
        }
    }
I need it to read each line of the file and store each line in an array, but neither works. How do I go about it?
Do yourself a favor and use a library that provides this functionality for you, e.g.
Guava:
// one String per File
String data = Files.toString(file, Charsets.UTF_8);
// or one String per Line
List<String> data = Files.readLines(file, Charsets.UTF_8);
Commons / IO:
// one String per File
String data = FileUtils.readFileToString(file, "UTF-8");
// or one String per Line
List<String> data = FileUtils.readLines(file, "UTF-8");
It's not really clear exactly what you're trying to do (partly with quite a lot of code commented out, leaving other code which won't even compile), but I'd recommend you look at using Guava:
List<String> lines = Files.readLines(file, Charsets.UTF_8);
That way you don't need to mess around with the file handling yourself at all.
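Not part of the original answers, but if adding a library is not an option, the plain JDK (Java 7+) can do the same thing; a minimal sketch, with the file path as a placeholder:

import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

public class ReadLinesSketch {
    public static void main(String[] args) throws Exception {
        // read every line into a List, then copy to an array if one is required
        List<String> lines = Files.readAllLines(
                Paths.get("C:\\Users\\nnanna\\Documents\\login.txt"), StandardCharsets.UTF_8);
        String[] asArray = lines.toArray(new String[0]);
        for (String line : asArray) {
            System.out.println(line);
        }
    }
}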
