Write and read txt file - java

I'm working on a project that stores patient information and prints it if the user wants. But it currently saves only one person. I want to be able to add any number of records and I don't know how to fix it.
public static void saveChanges(ArrayList<Human> humans) throws IOException {
    File veritabani = new File("patients.txt");
    System.gc();
    RandomAccessFile raf = new RandomAccessFile(veritabani, "rw");
    raf.close();
    veritabani.delete();
    int ctrWhile = 0;
    for (int yazdir = 0; yazdir < humans.size(); yazdir++) {
        File f = new File("patients.txt");
        PrintWriter pw = new PrintWriter(new FileOutputStream(f, true));
        String tName = humans.get(yazdir).getNameAndSurname();
        int tID = humans.get(yazdir).getTC();
        int tAge = humans.get(yazdir).getAge();
        boolean tInsuance = humans.get(yazdir).isInsurance();
        String tComplain = humans.get(yazdir).getComplain();
        if (ctrWhile == 0) {
            pw.append(tName + "-" + tID + "-" + tAge + "-" + "-" + tInsuance + "-" + tComplain + "-");
            ctrWhile++;
        } else {
            pw.append("\n" + tName + "-" + tID + "-" + tAge + "-" + "-" + tInsuance + "-" + tComplain + "-");
        }
        pw.close();
    }
}

Each time through your loop, you appear to create a new file, write to it, and close it. Therefore each time through your loop you will overwrite the file created before.
Create the file before entering the loop, and close it after you've completed the loop; only write to it within the loop.
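For example, a minimal sketch of that structure, reusing the getters from the question's Human class (one record per line, fields separated by "-"; imports from java.io and java.util as in your class):
public static void saveChanges(ArrayList<Human> humans) throws IOException {
    // open the writer once, before the loop; try-with-resources closes it afterwards
    try (PrintWriter pw = new PrintWriter(new FileWriter("patients.txt"))) {
        for (Human h : humans) {
            pw.println(h.getNameAndSurname() + "-" + h.getTC() + "-" + h.getAge()
                    + "-" + h.isInsurance() + "-" + h.getComplain());
        }
    }
}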

Related

How come this method does not open up a file and write to it?

I made this method and my goal is to populate a txt file named filename with the elements contained in arrayToWrite, but it does not seem to be working. Does the file get deleted once the method ends? That seems to be my main issue: my other method cannot print the content of the file made by this method.
public static void writeFile(String[] arrayToWrite, String filename) throws IOException {
    FileOutputStream fileStream = new FileOutputStream(filename);
    PrintWriter outFS = new PrintWriter(fileStream);
    for (int i = 0; i < arrayToWrite.length; i++) {
        outFS.println(arrayToWrite[i]);
    }
}
Add an outFS.close() to the end of your function.
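For instance, a sketch of the same method with the writer closed at the end:
public static void writeFile(String[] arrayToWrite, String filename) throws IOException {
    FileOutputStream fileStream = new FileOutputStream(filename);
    PrintWriter outFS = new PrintWriter(fileStream);
    for (int i = 0; i < arrayToWrite.length; i++) {
        outFS.println(arrayToWrite[i]);
    }
    outFS.close(); // flushes the buffered output and closes the underlying FileOutputStream
}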
You may declare one or more resources in a try-with-resources statement, and both the FileOutputStream and PrintWriter classes implement the AutoCloseable interface, so to solve your problem you can write:
public static void writeFile(String[] arrayToWrite, String filename) throws IOException {
    try (
        FileOutputStream fileStream = new FileOutputStream(filename);
        PrintWriter outFS = new PrintWriter(fileStream)
    ) {
        for (int i = 0; i < arrayToWrite.length; i++) {
            outFS.println(arrayToWrite[i]);
        }
    }
}
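For example, calling it with a hypothetical array and file name produces one line per element:
String[] lines = { "alpha", "beta", "gamma" };
writeFile(lines, "output.txt"); // output.txt now contains three lines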

save to and load from not working in java [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 6 years ago.
For the life of me I can't figure out what is wrong with this code. The save keeps overwriting itself and the load doesn't load the already existing data. I have searched for this kind of code but it seems like people use different approaches. Please help me end my headache.
// Write to file
static void writeToFile(Customer c[], int number_of_customers) throws IOException {
    // set up file for output
    // pw used to write to file
    File outputFile = new File("Customers.dat");
    FileOutputStream fos = new FileOutputStream(outputFile);
    PrintWriter pw = new PrintWriter(new OutputStreamWriter(fos));
    int i = 0;
    do {
        pw.println(c[i].getName());
        pw.println(c[i].getNumber());
        i++;
    } while (i < number_of_customers);
    pw.println(0);
    pw.println(0);
    pw.close();
}
// Read from file
public static int readFromFile(Customer c[]) throws IOException {
    // set up file for reading
    // br used to read from file
    File inputFile = new File("Customers.dat");
    FileInputStream fis = new FileInputStream(inputFile);
    BufferedReader br = new BufferedReader(new InputStreamReader(fis));
    String cus;
    int l = -1;
    int all_customers = 0;
    do {
        l++;
        c[l] = new Customer();
        c[l].cus_name = br.readLine();
        cus = br.readLine();
        c[l].cus_no = Integer.parseInt(cus);
        all_customers++;
    } while (c[l].cus_no != 0); // end while
    br.close();
    return all_customers - 1;
}
An alternative way to fix your write method would be to use a FileOutputStream constructor that lets you request that data be appended to the end of the file.
FileOutputStream fos = new FileOutputStream(outputFile, true);
This does assume that you always write a complete final record with an end of line after it, even under error conditions. You'll still have to deal with this type of situation with the other solution (read and merge), but with that one the subsequent run can detect and deal with it if necessary. So the append solution I describe is not as robust.
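As a quick side-by-side of the two constructors (a standalone sketch, not your method):
// overwrite mode: the file is truncated to zero length when opened
PrintWriter overwrite = new PrintWriter(new FileOutputStream("Customers.dat"));
// append mode: new output is written after the existing content
PrintWriter append = new PrintWriter(new FileOutputStream("Customers.dat", true));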
You have a number of issues with your code.
Looking first at your readFromFile method:
You're passing in an array that your method is filling up with all the records it finds. What happens if there are more customers in the file than there's room for in the array? (hint: ArrayIndexOutOfBoundsException is a thing)
You're parsing an integer read as a string from the file. What happens if the file is corrupt and the line read is not an integer?
The name of the file to read from is hard-coded. This should be a constant or configuration option. For the purpose of writing methods, it is best to make it a parameter.
You're opening the file and reading from it in the method. For purposes of unit testing, you should split this into separate methods.
In general, you should be using a Collections class instead of an array to hold a list of objects.
You're accessing the Customer attributes directly in the readFromFile method. You should be using an accessor method.
Collections-based approach
Here's my proposed rewrite based on using Collections APIs:
public static List<Customer> readFromFile(String filename) throws IOException {
    // set up file for reading
    // br used to read from file
    File inputFile = new File(filename);
    FileInputStream fis = new FileInputStream(inputFile);
    BufferedReader br = new BufferedReader(new InputStreamReader(fis));
    List<Customer> customers = readFromStream(br);
    br.close();
    return customers;
}
It uses the following method to actually read the contents:
public static List<Customer> readFromStream(BufferedReader br) throws IOException {
    List<Customer> customerList = new LinkedList<>();
    boolean moreCustomers = true;
    while (moreCustomers) {
        try {
            Customer customer = new Customer();
            customer.setName(br.readLine());
            String sCustNo = br.readLine();
            customer.setNumber(Integer.parseInt(sCustNo));
            if (customer.getNumber() == 0) {
                moreCustomers = false;
            } else {
                customerList.add(customer);
            }
        } catch (NumberFormatException x) {
            // happens if the line is not a number
            // handle this somehow, e.g. by ignoring, logging, or stopping execution
            // for now, we just stop reading
            moreCustomers = false;
        }
    }
    return customerList;
}
Using a similar approach for writeToFile, we get:
static void writeToFile(Collection<Customer> customers, String filename) throws IOException {
    // set up file for output
    // pw used to write to file
    File outputFile = new File(filename);
    FileOutputStream fos = new FileOutputStream(outputFile);
    PrintWriter pw = new PrintWriter(new OutputStreamWriter(fos));
    writeToStream(customers, pw);
    pw.flush();
    pw.close();
}
static void writeToStream(Collection<Customer> customers, PrintWriter pw) throws IOException {
    for (Customer customer : customers) {
        pw.println(customer.getName());
        pw.println(customer.getNumber());
    }
    pw.println(0);
    pw.println(0);
}
However, we still haven't addressed your main concern. It seems you want to merge the file content with the customers in memory when you call writeToFile. I suggest that you instead introduce a new method for this purpose. This keeps the existing methods simpler:
static void syncToFile(Collection<Customer> customers, String filename) throws IOException {
    // get a list of existing customers
    List<Customer> customersInFile = readFromFile(filename);
    // use a set to merge
    Set<Customer> customersToWrite = new HashSet<>();
    // first add the current in-memory customers
    customersToWrite.addAll(customers);
    // then add the ones from the file. Duplicates will be ignored
    customersToWrite.addAll(customersInFile);
    // then save the merged set
    writeToFile(customersToWrite, filename);
}
Oh... I almost forgot: The magic of using a Set to merge the file and the in-memory list relies on you implementing the equals() method in the Customer class. If you override equals(), you should also override hashCode(). For example:
public class Customer {
    @Override
    public boolean equals(Object obj) {
        return (obj != null) && (obj instanceof Customer) && (getNumber() == ((Customer) obj).getNumber());
    }

    @Override
    public int hashCode() {
        return getNumber() + 31;
    }
}
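A quick way to convince yourself the merge works (a hypothetical snippet, again assuming the setName/setNumber setters used earlier):
Set<Customer> merged = new HashSet<>();
Customer a = new Customer();
a.setName("Alice");
a.setNumber(42);
Customer b = new Customer();
b.setName("Alice");
b.setNumber(42);
merged.add(a);
merged.add(b); // ignored: equals()/hashCode() say it is the same customer
System.out.println(merged.size()); // prints 1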
CustomerList-based approach
If you cannot use Collections APIs, the second-best would be to write your own collection type that supports the same operations, but is backed by an array (or linked list, if you have learned that). In your case, it would be a list of customers. I'll call the type CustomerList:
Analyzing our existing code, we'll need a class that implements an add method and a way to traverse the list. Ignoring Iterators, we'll accomplish the latter with a getLength and a getCustomer (by index). For the synchronization, we also need a way to check if a customer is in the list, so we'll add a contains method:
public class CustomerList {
    private static final int INITIAL_SIZE = 100;
    private static final int SIZE_INCREMENT = 100;

    // list of customers. We're keeping it packed, so there
    // should be no holes!
    private Customer[] customers = new Customer[INITIAL_SIZE];
    private int numberOfCustomers = 0;

    /**
     * Adds a new customer at the end. Allows duplicates.
     *
     * @param newCustomer the new customer to add
     * @return the updated number of customers in the list
     */
    public int add(Customer newCustomer) {
        if (numberOfCustomers == customers.length) {
            // the current array is full, make a new one with more headroom
            Customer[] newCustomerList = new Customer[customers.length + SIZE_INCREMENT];
            for (int i = 0; i < customers.length; i++) {
                newCustomerList[i] = customers[i];
            }
            // we will add the new customer at the end!
            newCustomerList[numberOfCustomers] = newCustomer;
            // replace the customer list with the new one
            customers = newCustomerList;
        } else {
            customers[numberOfCustomers] = newCustomer;
        }
        // we've added a new customer!
        numberOfCustomers++;
        return numberOfCustomers;
    }

    /**
     * @return the number of customers in this list
     */
    public int getLength() {
        return numberOfCustomers;
    }

    /**
     * @param i the index of the customer to retrieve
     * @return Customer at index <code>i</code> of this list (zero-based).
     */
    public Customer getCustomer(int i) {
        // TODO: Add boundary check of i (0 <= i < numberOfCustomers)
        return customers[i];
    }

    /**
     * Check if a customer with the same number as the one given exists in this list.
     * @param customer the customer to check for (will use customer.getNumber() to check against the list)
     * @return <code>true</code> if the customer is found, <code>false</code> otherwise.
     */
    public boolean contains(Customer customer) {
        for (int i = 0; i < numberOfCustomers; i++) {
            if (customers[i].getNumber() == customer.getNumber()) {
                return true;
            }
        }
        // if we got here, it means we didn't find the customer
        return false;
    }
}
With this implemented, the rewrite of the writeToFile method is exactly the same, except we use CustomerList instead of List<Customer>:
static void writeToFile(CustomerList customers, String filename) throws IOException {
    // set up file for output
    // pw used to write to file
    File outputFile = new File(filename);
    FileOutputStream fos = new FileOutputStream(outputFile);
    PrintWriter pw = new PrintWriter(new OutputStreamWriter(fos));
    writeToStream(customers, pw);
    pw.flush();
    pw.close();
}
The writeToStream is also very similar, except since we're not using an Iterator, we have to traverse the list manually:
static void writeToStream(CustomerList customers, PrintWriter pw) throws IOException {
    for (int i = 0; i < customers.getLength(); i++) {
        pw.println(customers.getCustomer(i).getName());
        pw.println(customers.getCustomer(i).getNumber());
    }
    pw.println(0);
    pw.println(0);
}
Similar for readFromFile -- pretty much the same except for the list type:
public static CustomerList readFromFile(String filename) throws IOException {
    // set up file for reading
    // br used to read from file
    File inputFile = new File(filename);
    FileInputStream fis = new FileInputStream(inputFile);
    BufferedReader br = new BufferedReader(new InputStreamReader(fis));
    CustomerList customers = readFromStream(br);
    br.close();
    return customers;
}
The readFromStream is also pretty much the same, except for the type (the methods used on CustomerList have the same signatures as the ones used on List<Customer>):
public static CustomerList readFromStream(BufferedReader br) throws IOException {
    CustomerList customerList = new CustomerList();
    boolean moreCustomers = true;
    while (moreCustomers) {
        try {
            Customer customer = new Customer();
            customer.setName(br.readLine());
            String sCustNo = br.readLine();
            customer.setNumber(Integer.parseInt(sCustNo));
            if (customer.getNumber() == 0) {
                moreCustomers = false;
            } else {
                customerList.add(customer);
            }
        } catch (NumberFormatException x) {
            // happens if the line is not a number
            // handle this somehow, e.g. by ignoring, logging, or stopping execution
            // for now, we just stop reading
            moreCustomers = false;
        }
    }
    return customerList;
}
The method that changes most is syncToFile: since we don't have the Set type that guarantees no duplicates, we have to check manually each time we insert a customer from the file:
static void syncToFile(CustomerList customers, String filename) throws IOException {
    // get a list of existing customers
    CustomerList customersInFile = readFromFile(filename);
    // build a merged list
    CustomerList customersToWrite = new CustomerList();
    // first add the current in-memory customers
    for (int i = 0; i < customers.getLength(); i++) {
        customersToWrite.add(customers.getCustomer(i));
    }
    // then add the ones from the file, but skip duplicates
    for (int i = 0; i < customersInFile.getLength(); i++) {
        if (!customersToWrite.contains(customersInFile.getCustomer(i))) {
            customersToWrite.add(customersInFile.getCustomer(i));
        }
    }
    // then save the merged list
    writeToFile(customersToWrite, filename);
}
Something to note here is that we could have optimized the add operations by having an extra constructor for CustomerList that took the new capacity, but I'll leave at least something for you to figure out ;)

Java Scanner throws NoSuchElementException: No Line Found

I don't know why it won't work. I've double- and triple-checked that the file in the FileWriter folder has text (written by a PrintWriter earlier in a separate program), but the while statement doesn't seem to run. The commented-out lines were various tests I was running to try to figure out what was going on. What I'm trying to do is iterate through the array and add a group ID to all Persons. If anyone knows what's up, it would be greatly appreciated. I apologize in advance for any formatting errors; any comments on how to make this easier to help with would also be appreciated.
public static void updateWinners(Person[] Players, int n)
        throws FileNotFoundException {
    // n is 2 or 4 depending on round
    File fileS = new File(
            "C:\\Users\\Patrick\\Desktop\\New folder\\FileWriter\\Win");
    File fileP = new File(
            "C:\\Users\\Patrick\\Desktop\\New folder\\Bracket\\Win");
    Scanner fs = new Scanner(fileS);
    PrintWriter writer = new PrintWriter(fileP);
    //int q=0;
    while (fs.hasNextLine()) {
        //System.out.println(Players[q].toString());
        for (int i = 0; i < Players.length; i++) {
            if (fs.nextLine().equals(Players[i].toString())) {
                Players[i].addGroup(alpha[i / n]);
                System.out.println(Players[i].toString());
                writer.println(Players[i].toString());
            }
        }
        //q++;
    }
    writer.close();
    fs.close();
}
Do it like this,
while (fs.hasNextLine()) {
    String s = fs.nextLine();
    //System.out.println(Players[q].toString());
    for (int i = 0; i < Players.length; i++) {
        if (s.equals(Players[i].toString())) {
            Players[i].addGroup(alpha[i / n]);
            System.out.println(Players[i].toString());
            writer.println(Players[i].toString());
        }
    }
    //q++;
}
Explanation:
You were calling fs.nextLine() on every iteration of the players loop, and each call consumes a new line from the file, so all lines of the file are used up before you have gone through all the players.
Scanner throws an exception when you try to read and there is no more data in the file.
Source
Throws:
NoSuchElementException - if no line was found
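A minimal illustration of that failure mode (a standalone sketch, not your code): each hasNextLine() check only guards one nextLine() call.
Scanner sc = new Scanner("only one line");
if (sc.hasNextLine()) {
    sc.nextLine(); // fine: consumes the only line
    sc.nextLine(); // throws NoSuchElementException: No line found
}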

How to remove first line of a text file in java [duplicate]

This question already has answers here:
Closed 10 years ago.
Possible Duplicate:
Replace first line of a text file in Java
Java - Find a line in a file and remove
I am trying to find a way to remove the first line of text in a text file using Java. I would like to use a Scanner to do it. Is there a good way to do it without the need for a temp file?
Thanks.
If your file is huge, you can use the following method, which performs the removal in place, without using a temp file or loading all the content into memory.
public static void removeFirstLine(String fileName) throws IOException {
    RandomAccessFile raf = new RandomAccessFile(fileName, "rw");
    // Initial write position
    long writePosition = raf.getFilePointer();
    raf.readLine();
    // Shift the next lines upwards.
    long readPosition = raf.getFilePointer();
    byte[] buff = new byte[1024];
    int n;
    while (-1 != (n = raf.read(buff))) {
        raf.seek(writePosition);
        raf.write(buff, 0, n);
        readPosition += n;
        writePosition += n;
        raf.seek(readPosition);
    }
    raf.setLength(writePosition);
    raf.close();
}
Note that if your program is terminated while in the middle of the above loop you can end up with duplicated lines or corrupted file.
Scanner fileScanner = new Scanner(myFile);
fileScanner.nextLine();
This will return the first line of text from the file and discard it because you don't store it anywhere.
To overwrite your existing file:
FileWriter fileStream = new FileWriter("my/path/for/file.txt");
BufferedWriter out = new BufferedWriter(fileStream);
while (fileScanner.hasNextLine()) {
    String next = fileScanner.nextLine();
    if (next.equals("\n"))
        out.newLine();
    else
        out.write(next);
    out.newLine();
}
out.close();
Note that you will have to catch and handle some IOExceptions this way. Also, the if ... else statement is there in the while loop to keep any line breaks present in your text file.
Without a temp file you must keep everything in main memory. The rest is straightforward: loop over the lines (ignoring the first) and store them in a collection, then write the lines back to disk:
File path = new File("/path/to/file.txt");
Scanner scanner = new Scanner(path);
ArrayList<String> coll = new ArrayList<String>();
scanner.nextLine(); // skip the first line
while (scanner.hasNextLine()) {
    String line = scanner.nextLine();
    coll.add(line);
}
scanner.close();

FileWriter writer = new FileWriter(path);
for (String line : coll) {
    writer.write(line + System.lineSeparator()); // nextLine() strips the line break, so add it back
}
writer.close();
If the file is not too big, you can read it into a byte array, find the first newline character, and write the rest of the array back to the file starting at position zero. Or you may use a memory-mapped file to do so.
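A minimal sketch of the byte-array variant (using java.nio.file.Files and java.util.Arrays; assumes the file fits in memory and uses \n line endings):
byte[] content = Files.readAllBytes(Paths.get("/path/to/file.txt"));
int firstNewline = 0;
while (firstNewline < content.length && content[firstNewline] != '\n') {
    firstNewline++;
}
// keep everything after the first newline (or nothing if there was no newline at all)
int from = Math.min(firstNewline + 1, content.length);
Files.write(Paths.get("/path/to/file.txt"),
        Arrays.copyOfRange(content, from, content.length));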

java split string[] array to multiple files

I'm having a problem figuring out how to split a list of strings across multiple files. At the moment I should get two files, both with JSON data. The code below writes to the first file but leaves the second one empty. Any ideas why?
public void splitFile(List<String> results) throws IOException {
    int name = 0;
    for (int i = 0; i < results.size(); i++) {
        write = new FileWriter("/home/tom/files/" + name + ".json");
        out = new BufferedWriter(write);
        out.write(results.get(i));
        if (results.get(i).startsWith("}")) {
            name++;
        }
    }
}
Edit: it splits at a line starting with } because that denotes the end of a JSON document.
Enhance the cut-control
Group these together:
write = new FileWriter("/home/tom/files/"+ name +".json");
out = new BufferedWriter(write);
and this:
name++;
Check for the start, not the end
Check for a line starting with {, and execute those three lines to open the file.
Remember to close and flush
If it's not the first file (out != null), close the previous writer (out.close();).
Close the last opened writer
if (!results.isEmpty())
    out.close();
Result
It should look something like this:
public void splitFile(List<String> results) throws IOException {
    int name = 0;
    BufferedWriter out = null;
    for (int i = 0; i < results.size(); i++) {
        String line = results.get(i);
        if (line.startsWith("{")) {
            if (out != null)  // it's not the first file
                out.close();  // closing the BufferedWriter also flushes it
            FileWriter writer = new FileWriter("/home/tom/files/" + name + ".json");
            out = new BufferedWriter(writer);
            name++;
        }
        if (out == null)
            throw new IllegalArgumentException("first line doesn't start with {");
        out.write(line);
    }
    if (out != null) // there was at least one file
        out.close();
}
I would close your BufferedWriter after each completed write sequence, i.e. at the end of each iteration through the loop, before you assign write to a new FileWriter.
Closing the BufferedWriter will close the underlying FileWriter, and consequently force a flush on the data written to the disk.
Note: if you're using a distinct FileWriter per loop iteration, then I'd scope that variable to the loop, e.g.
FileWriter write = new FileWriter("/home/tom/files/"+ name +".json");
The same goes for the BufferedWriter. In fact you can write:
BufferedWriter outer = new BufferedWriter(new FileWriter(...
and just deal with outer.
Try the following code..
public void splitFile(List<String> results) throws IOException {
    int name = 0;
    for (int i = 0; i < results.size(); i++) {
        write = new FileWriter("/home/tom/files/" + name + ".json");
        out = new BufferedWriter(write);
        out.write(results.get(i));
        out.flush();
        out.close(); // you have to close your stream every time in your case
        if (results.get(i).startsWith("}")) {
            name++;
        }
    }
}
