How do I write a constructor that holds the sorted array, and then write it to a file using a method like getDatabase that returns the object holding the sorted array?
Database class:
public Person[] entry; // holds the Person objects; each new entry goes in the next available position
public int size;       // number of entries currently in use
public Database(int capacity) {
entry = new Person[capacity];
size = 0;
}
public Person[] getDatabase() {
return entry;
}
Storage Class:
public Database writeCommaSeparated(Database data) throws IOException {
Database db = new Database(data.size);
PrintStream writer = new PrintStream(file); // PrintStream creates the file, so no exists() check is needed
for(int i = 0; i < data.size; i++) {
writer.println(data.get(i).toFile());
}
writer.close();
return db;
}
public Database read() throws IOException {
Database db = new Database(100); // Database has no no-arg constructor; pick a capacity
Scanner scan = new Scanner(file);
Person person;
//check if file has data print selected data
while(scan.hasNextLine()) {
person = parsePerson(scan.nextLine());
db.add(person);
}
scan.close();
return db;
}
These are just snippets of my code. I know it is sorting by age correctly, but I am not sure how to write the sorted array out to a file.
in main I have:
String fileLocation = File.separator + "Users"
+ File.separator + "USERNAME"
+ File.separator + "Desktop"
+ File.separator + "DataFile.txt";
FileStorage fileStore = new FileStorage(fileLocation);
FileData data = fileStore.read(); // this invokes a method called read that reads the file
data.sort(); // sorts the file by age and prints out to the console the sorted age
fileStore.writeCommaSeparated(data); // writes to the file in comma-separated form
Focusing on just the sorting of a CSV file by age, and given your description, this was about the simplest solution that came to mind.
public class PersonDatabase {
private ArrayList<String[]> people = new ArrayList<>();
// Reads the given input file and loads it into an ArrayList of string arrays.
public PersonDatabase(String inputFile) throws IOException {
BufferedReader in = new BufferedReader(new FileReader(inputFile));
for (String line = null; null != (line=in.readLine()); ) {
people.add(line.split(",")); // convert csv string to an array of strings.
}
in.close();
}
private static final int AGE_COLUMN_INDEX=2; // Identifies the 'age' column
// performs a numeric comparison on the 'age' column values.
int compareAge(String[] a1, String[] a2) {
return Integer.compare(
Integer.parseInt(a1[AGE_COLUMN_INDEX]),
Integer.parseInt(a2[AGE_COLUMN_INDEX]));
}
// Sorts the list of people by age and writes to the given output file.
public void writeSorted(String outputFile) throws IOException {
PrintWriter out = new PrintWriter(new FileWriter(outputFile));
people.stream()
.sorted(this::compareAge) // sort by age
.forEach(a->{
Arrays.stream(a).forEach(s->out.print(s+",")); // print as csv
out.println();
});
out.close();
}
public static void main(String[] args) throws IOException {
PersonDatabase pdb = new PersonDatabase("persondb.in");
pdb.writeSorted("persondb.out");
}
}
Given the following input:
fred,flintstone,43,
barney,rubble,42,
wilma,flintstone,39,
betty,rubble,39,
This program produces the following output:
wilma,flintstone,39,
betty,rubble,39,
barney,rubble,42,
fred,flintstone,43,
It seemed like marshalling these arrays into Person objects just for the sake of sorting was overkill. However, if you wanted to do that, it would be pretty easy to turn an array of field values into a Person object. I'll leave that to you.
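If you did want Person objects, the marshalling could look like this minimal sketch. The field names and the Person shape here are assumptions, since the real Person class isn't shown in the question:

```java
public class PersonMarshalDemo {
    // Hypothetical Person shape; the real class isn't shown in the question.
    static class Person {
        final String firstName, lastName;
        final int age;
        Person(String firstName, String lastName, int age) {
            this.firstName = firstName;
            this.lastName = lastName;
            this.age = age;
        }
        // Convert one CSV row (first,last,age) into a Person.
        static Person fromCsvRow(String[] fields) {
            return new Person(fields[0], fields[1], Integer.parseInt(fields[2]));
        }
        // Render back to the same CSV shape (with the trailing comma from the sample data).
        String toCsvRow() {
            return firstName + "," + lastName + "," + age + ",";
        }
    }

    public static void main(String[] args) {
        Person p = Person.fromCsvRow("fred,flintstone,43,".split(","));
        System.out.println(p.toCsvRow()); // prints fred,flintstone,43,
    }
}
```

You could then sort with `Comparator.comparingInt(p -> p.age)` instead of comparing raw string arrays.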
(Homework:) I want to use an array instead of an ArrayList in this situation. I have an ArrayList of Employee objects, and I have to insert its data into a tree. I load the data line by line from a file, but I want to use an Employee[] array instead of the ArrayList. Is there any way to do that here? The following is my example code using the ArrayList; how can I write the read function so it returns an Employee[]?
public static void main(String[] args) {
List<Employee> employees = read("employees.txt");
BST bst = new BST();
for(Employee e : employees){
bst.insert(e);
}
}
public static List<Employee> read(String file) {
try {
List<Employee> employees = new ArrayList<>();
BufferedReader reader = new BufferedReader(new FileReader(file));
String line;
while((line = reader.readLine()) != null ){
String[] arr = line.split("-");
Employee emp = new Employee();
emp.ccode = Integer.parseInt(arr[0]);
emp.cus_name = arr[1];
emp.phone = arr[2];
employees.add(emp);
}
return employees;
} catch (IOException ex) {
Logger.getLogger(TestMusic.class.getName()).log(Level.SEVERE, null, ex);
}
return null;
}
This approach is not the best one, but it might solve your problem (for Java versions < 8).
The idea is to parse the file twice: once to count the lines so the Employee array can be sized, and once more to read the data of each individual employee.
static Employee[] employees; // shared with read(), which fills it in place

public static void main(String[] args) {
int empSize = getNumberOfEmployees("employees.txt");
employees = new Employee[empSize];
employees = read("employees.txt");
BST bst = new BST();
for(Employee e : employees){
bst.insert(e);
}
}
public static int getNumberOfEmployees (String file) {
int totalEmp = 0;
try {
BufferedReader reader = new BufferedReader(new FileReader(file));
String line;
while((line = reader.readLine()) != null ) {
totalEmp ++;
}
}catch (IOException e) {
e.printStackTrace();
}
return totalEmp;
}
public static Employee[] read(String file) {
try {
BufferedReader reader = new BufferedReader(new FileReader(file));
String line;
int i=0;
while((line = reader.readLine()) != null ){
String[] arr = line.split("-");
Employee emp = new Employee();
emp.ccode = Integer.parseInt(arr[0]);
emp.cus_name = arr[1];
emp.phone = arr[2];
employees[i] = emp;
i++;
}
return employees;
} catch (IOException ex) {
Logger.getLogger(TestMusic.class.getName()).log(Level.SEVERE, null, ex);
}
return null;
}
Without giving you any code (do it by yourself ;-)):
Parse the file twice:
get the number of lines, create an array based on the number of lines
parse the file again, fill the array
And some research (keywords: BufferedReader and arrays) would help you too.
It is unclear from your requirements what you want to do in the following situations:
one line fails to parse;
cannot open the file for reading.
Here is a solution which (eww) will just ignore the unparseable entries and return an empty array if the file cannot be parsed:
public final class TestMusic
{
private static final Employee[] NO_EMPLOYEES = new Employee[0];
public static void main(final String... args)
{
final BST bst = new BST();
for (final Employee emp: getArray())
bst.insert(emp);
}
private static Employee toEmployee(final String input)
{
final String[] arr = input.split("-");
final Employee emp = new Employee();
try {
emp.ccode = Integer.parseInt(arr[0]);
emp.cus_name = arr[1];
emp.phone = arr[2];
return emp;
} catch (NumberFormatException | IndexOutOfBoundsException e) {
return null;
}
}
private static Employee[] getArray()
{
final Path path = Paths.get("employees.txt");
try (
Stream<String> lines = Files.lines(path);
) {
return lines.map(TestMusic::toEmployee)
.filter(Objects::nonNull)
.toArray(Employee[]::new);
} catch (IOException ignored) {
return NO_EMPLOYEES;
}
}
}
Note how this solution does not use an intermediate list at all; instead, it makes use of the Java 8 Stream API.
What is left to do here is to handle errors... That is for you to decide :)
If you want to convert an ArrayList to an array, use the following code:
Employee[] arrayOfEmployees = new Employee[employees.size()];
employees.toArray(arrayOfEmployees);
That is like taking a step backwards. Java collections (for example the List interface and the ArrayList implementation) have various advantages over "plain old" arrays.
The only real advantage of arrays is their reduced overhead - but that is only important when dealing with millions or billions of things to store in a container.
So the real answer is: don't do that. Just keep using List/ArrayList.
But in case you insist, you can of course use arrays - but then you have to add that part that makes ArrayList more convenient: you have to provide code that dynamically "grows" your array once you hit its size limit. That works like this:
you start with an initial array of size 100 for example
while populating that array, you keep track of the number of slots "in use"
when your code wants to add the 101st element, you "grow" the array
Growing works by:
creating a new array, that has like currentArray.length + 100 capacity
using System.arraycopy() to move all entries from the old to the new array
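Those two grow steps can be sketched as a small helper. The +100 increment is just the example value from the steps above:

```java
public class GrowDemo {
    static final int INCREMENT = 100; // example increment from the text

    // Returns a larger array containing all existing entries,
    // copied over with System.arraycopy().
    static String[] grow(String[] current) {
        String[] bigger = new String[current.length + INCREMENT];
        System.arraycopy(current, 0, bigger, 0, current.length);
        return bigger;
    }

    public static void main(String[] args) {
        String[] a = {"x", "y"};
        String[] b = grow(a);
        System.out.println(b.length);    // prints 102
        System.out.println(b[0] + b[1]); // prints xy
    }
}
```

The caller keeps tracking the number of slots in use separately, exactly as described above, since the array length is now only an upper bound.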
Guess the size of the array, for example by taking the size of the file and dividing by 20 (approximately the size of the line in the example you gave). Then read into the array, counting the lines. If the array is full before you have reached the end of the file, allocate a new array double the size, copy everything from the old array to the new array, replace the old array with the new array and continue the same way until done. You can look at the source of ArrayList to see an example of how it is done - basically this is what ArrayList does internally.
Given there are some files Customer-1.txt, Customer-2.txt and Customer-3.txt and these files have the following content:
Customer-1.txt
1|1|MARY|SMITH
2|1|PATRICIA|JOHNSON
4|2|BARBARA|JONES
Customer-2.txt
1|1|MARY|SMITH
2|1|PATRICIA|JOHNSON
3|1|LINDA|WILLIAMS
4|2|BARBARA|JONES
Customer-3.txt
2|1|PATRICIA|JOHNSON
3|1|LINDA|WILLIAMS
5|2|ALEXANDER|ANDERSON
These files have a lot of duplicate data, but it is possible that each file contains some data that is unique.
And given that the actual files are sorted, big (a few GB each file) and there are many files...
Then what is the:
a) memory cheapest
b) cpu cheapest
c) fastest
way in Java to create one file out of these three files that will contain all the unique data of each file sorted and concatenated like such:
Customer-final.txt
1|1|MARY|SMITH
2|1|PATRICIA|JOHNSON
3|1|LINDA|WILLIAMS
4|2|BARBARA|JONES
5|2|ALEXANDER|ANDERSON
I looked into the following solution https://github.com/upcrob/spring-batch-sort-merge , but I would like to know if its possible to perhaps do it with the FileInputStream and/or a non spring batch solution.
A solution to use an in memory or real database to join them is not viable for my use case due to the size of the files and the absence of an actual database.
Since the input files are already sorted, a simple parallel iteration of the files, merging their content, is the memory cheapest, cpu cheapest, and fastest way to do it.
This is a multi-way merge join, i.e. a sort-merge join without the "sort", with elimination of duplicates, similar to a SQL DISTINCT.
Here is a version that can handle an unlimited number of input files (well, as many as you can have open at once, anyway). It uses a helper class to stage the next line from each input file, so the leading ID value only has to be parsed once per line.
private static void merge(StringWriter out, BufferedReader ... in) throws IOException {
CustomerReader[] customerReader = new CustomerReader[in.length];
for (int i = 0; i < in.length; i++)
customerReader[i] = new CustomerReader(in[i]);
merge(out, customerReader);
}
private static void merge(StringWriter out, CustomerReader ... in) throws IOException {
List<CustomerReader> min = new ArrayList<>(in.length);
for (;;) {
min.clear();
for (CustomerReader reader : in)
if (reader.hasData()) {
int cmp = (min.isEmpty() ? 0 : reader.compareTo(min.get(0)));
if (cmp < 0)
min.clear();
if (cmp <= 0)
min.add(reader);
}
if (min.isEmpty())
break; // all done
// optional: Verify that lines that compared equal by ID are entirely equal
out.write(min.get(0).getCustomerLine());
out.write(System.lineSeparator());
for (CustomerReader reader : min)
reader.readNext();
}
}
private static final class CustomerReader implements Comparable<CustomerReader> {
private BufferedReader in;
private String customerLine;
private int customerId;
CustomerReader(BufferedReader in) throws IOException {
this.in = in;
readNext();
}
void readNext() throws IOException {
if ((this.customerLine = this.in.readLine()) == null)
this.customerId = Integer.MAX_VALUE;
else
this.customerId = Integer.parseInt(this.customerLine.substring(0, this.customerLine.indexOf('|')));
}
boolean hasData() {
return (this.customerLine != null);
}
String getCustomerLine() {
return this.customerLine;
}
@Override
public int compareTo(CustomerReader that) {
// Order by customerId only. Inconsistent with equals()
return Integer.compare(this.customerId, that.customerId);
}
}
TEST
String file1data = "1|1|MARY|SMITH\n" +
"2|1|PATRICIA|JOHNSON\n" +
"4|2|BARBARA|JONES\n";
String file2data = "1|1|MARY|SMITH\n" +
"2|1|PATRICIA|JOHNSON\n" +
"3|1|LINDA|WILLIAMS\n" +
"4|2|BARBARA|JONES\n";
String file3data = "2|1|PATRICIA|JOHNSON\n" +
"3|1|LINDA|WILLIAMS\n" +
"5|2|ALEXANDER|ANDERSON\n";
try (
BufferedReader in1 = new BufferedReader(new StringReader(file1data));
BufferedReader in2 = new BufferedReader(new StringReader(file2data));
BufferedReader in3 = new BufferedReader(new StringReader(file3data));
StringWriter out = new StringWriter();
) {
merge(out, in1, in2, in3);
System.out.print(out);
}
OUTPUT
1|1|MARY|SMITH
2|1|PATRICIA|JOHNSON
3|1|LINDA|WILLIAMS
4|2|BARBARA|JONES
5|2|ALEXANDER|ANDERSON
The code merges purely by ID value, and doesn't verify that rest of line is actually equal. Insert code at the optional comment to check for that, if needed.
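If that verification is needed, one way is to factor it into a small helper and call it at the optional comment with the lines from the min group. The helper name and the throw-on-mismatch policy are my own choices for this sketch:

```java
import java.util.List;

public class MergeCheck {
    // Given the lines of one "min" group (all equal by ID), verify they are
    // character-for-character identical; throwing is just one possible policy.
    static String requireAllEqual(List<String> groupLines) {
        String expected = groupLines.get(0);
        for (String line : groupLines)
            if (!expected.equals(line))
                throw new IllegalStateException(
                    "Same ID but different content: " + expected + " vs " + line);
        return expected;
    }

    public static void main(String[] args) {
        System.out.println(requireAllEqual(List.of(
            "2|1|PATRICIA|JOHNSON", "2|1|PATRICIA|JOHNSON")));
    }
}
```

Ignoring, logging, or keeping the longest line would be equally valid policies depending on the data.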
This might help:
public static void main(String[] args) {
String files[] = {"Customer-1.txt", "Customer-2.txt", "Customer-3.txt"};
TreeMap<Integer, String> customers = new TreeMap<Integer, String>(); // TreeMap keeps entries sorted by ID, as the output requires
try {
String line;
for(int i = 0; i < files.length; i++) {
BufferedReader reader = new BufferedReader(new FileReader("data/" + files[i]));
while((line = reader.readLine()) != null) {
Integer uuid = Integer.valueOf(line.split("\\|")[0]); // '|' is a regex metacharacter, so escape it
customers.put(uuid, line);
}
reader.close();
}
BufferedWriter writer = new BufferedWriter(new FileWriter("data/Customer-final.txt"));
Iterator<String> it = customers.values().iterator();
while(it.hasNext()) writer.write(it.next() + "\n");
writer.close();
} catch (Exception e) {
e.printStackTrace();
}
}
If you have any questions, ask me.
For the life of me I can't figure out what is wrong with this code: the save keeps overwriting itself, and the load doesn't read the already existing data. I have searched for similar code, but it seems like everyone does it differently. Please help me end my headache.
// Write to file
static void writeToFile(Customer c[], int number_of_customers) throws IOException {
// set up file for output
// pw used to write to file
File outputFile = new File("Customers.dat");
FileOutputStream fos = new FileOutputStream(outputFile);
PrintWriter pw = new PrintWriter(new OutputStreamWriter(fos));
int i = 0;
do {
pw.println(c[i].getName());
pw.println(c[i].getNumber());
i++;
} while (i < number_of_customers);
pw.println(0);
pw.println(0);
pw.close();
}
// Read from file
public static int readFromFile(Customer c[]) throws IOException {
// set up file for reading
// br used to read from file
File inputFile = new File("Customers.dat");
FileInputStream fis = new FileInputStream(inputFile);
BufferedReader br = new BufferedReader(new InputStreamReader(fis));
String cus;
int l = -1;
int all_customers = 0;
do {
l++;
c[l] = new Customer();
c[l].cus_name = br.readLine();
cus = br.readLine();
c[l].cus_no = Integer.parseInt(cus);
all_customers++;
} while (c[l].cus_no != 0); // end while
br.close(); // end ReadFile class
return all_customers - 1;
}
An alternative way to fix your write method would be to use a FileOutputStream constructor that lets you request that data be appended to the end of the file.
FileOutputStream fos = new FileOutputStream(outputFile, true);
This does assume that you always write a complete final record with an end of line after it, even under error conditions. You'll still have to deal with this type of situation with the other solution (read and merge), but with that one the subsequent run can detect and deal with it if necessary. So the append solution I describe is not as robust.
You have a number of issues with your code.
Looking first at your readFromFile method:
You're passing in an array that your method is filling up with all the records it finds. What happens if there are more customers in the file than there's room for in the array? (hint: ArrayIndexOutOfBoundsException is a thing)
You're parsing an integer read as a string from the file. What happens if the file is corrupt and the line read is not an integer?
The name of the file to read from is hard-coded. This should be a constant or configuration option. For the purpose of writing methods, it is best to make it a parameter.
You're opening the file and reading from it in the method. For purposes of unit testing, you should split this into separate methods.
In general, you should be using a Collections class instead of an array to hold a list of objects.
You're accessing the Customer attributes directly in the readFromFile method. You should be using an accessor method.
Collections-based approach
Here's my proposed rewrite based on using Collections APIs:
public static List<Customer> readFromFile(String filename) throws IOException {
// set up file for reading
// br used to read from file
File inputFile = new File(filename);
FileInputStream fis = new FileInputStream(inputFile);
BufferedReader br = new BufferedReader(new InputStreamReader(fis));
List<Customer> customers = readFromStream(br);
br.close(); // end ReadFile class
return customers;
}
This uses this method to actually read the contents:
public static List<Customer> readFromStream(BufferedReader br) throws IOException {
List<Customer> customerList = new LinkedList<>();
boolean moreCustomers = true;
while (moreCustomers) {
try {
Customer customer = new Customer();
customer.setName(br.readLine());
String sCustNo = br.readLine();
customer.setNumber(Integer.parseInt(sCustNo));
if (customer.getNumber() == 0) {
moreCustomers = false;
}
else {
customerList.add(customer);
}
}
catch (NumberFormatException x) {
// happens if the line is not a number.
// handle this somehow, e.g. by ignoring, logging, or stopping execution
// for now, we just stop reading
moreCustomers = false;
}
}
return customerList;
}
Using a similar approach for writeToFile, we get:
static void writeToFile(Collection<Customer> customers, String filename) throws IOException {
// set up file for output
// pw used to write to file
File outputFile = new File(filename);
FileOutputStream fos = new FileOutputStream(outputFile);
PrintWriter pw = new PrintWriter(new OutputStreamWriter(fos));
writeToStream(customers, pw);
pw.flush();
pw.close();
}
static void writeToStream(Collection<Customer> customers, PrintWriter pw) throws IOException {
for (Customer customer: customers) {
pw.println(customer.getName());
pw.println(customer.getNumber());
}
pw.println(0);
pw.println(0);
}
However, we still haven't addressed your main concern. It seems you want to merge the file content with the customers in memory when you call writeToFile. I suggest that you instead introduce a new method for this purpose. This keeps the existing methods simpler:
static void syncToFile(Collection<Customer> customers, String filename) throws IOException {
// get a list of existing customers
List<Customer> customersInFile = readFromFile(filename);
// use a set to merge
Set<Customer> customersToWrite = new HashSet<>();
// first add current in-memory cutomers
customersToWrite.addAll(customers);
// then add the ones from the file. Duplicates will be ignored
customersToWrite.addAll(customersInFile);
// then save the merged set
writeToFile(customersToWrite, filename);
}
Oh... I almost forgot: the magic of using a Set to merge the file and in-memory lists relies on you implementing the equals() method in the Customer class. If you override equals(), you should also override hashCode(). For example:
public class Customer {
@Override
public boolean equals(Object obj) {
return (obj != null) && (obj instanceof Customer) && (getNumber() == ((Customer)obj).getNumber());
}
@Override
public int hashCode() {
return getNumber()+31;
}
};
CustomerList-based approach
If you cannot use Collections APIs, the second-best would be to write your own collection type that supports the same operations, but is backed by an array (or linked list, if you have learned that). In your case, it would be a list of customers. I'll call the type CustomerList:
Analyzing our existing code, we'll need a class that implements an add method and a way to traverse the list. Ignoring Iterators, we'll accomplish the latter with a getLength and a getCustomer (by index). For the synchronization, we also need a way to check if a customer is in the list, so we'll add a contains method:
public class CustomerList {
private static final int INITIAL_SIZE = 100;
private static final int SIZE_INCREMENT = 100;
// list of customers. We're keeping it packed, so there
// should be no holes!
private Customer[] customers = new Customer[INITIAL_SIZE];
private int numberOfCustomers = 0;
/**
* Adds a new customer at end. Allows duplicates.
*
* @param newCustomer the new customer to add
* @return the updated number of customers in the list
*/
public int add(Customer newCustomer) {
if (numberOfCustomers == customers.length) {
// the current array is full, make a new one with more headroom
Customer[] newCustomerList = new Customer[customers.length+SIZE_INCREMENT];
for (int i = 0; i < customers.length; i++) {
newCustomerList[i] = customers[i];
}
// we will add the new customer at end!
newCustomerList[numberOfCustomers] = newCustomer;
// replace the customer list with the new one
customers = newCustomerList;
}
else {
customers[numberOfCustomers] = newCustomer;
}
// we've added a new customer!
numberOfCustomers++;
return numberOfCustomers;
}
/**
* #return the number of customers in this list
*/
public int getLength() {
return numberOfCustomers;
}
/**
* @param i the index of the customer to retrieve
* @return Customer at index <code>i</code> of this list (zero-based).
*/
public Customer getCustomer(int i) {
//TODO: Add boundary check of i (0 <= i < numberOfCustomers)
return customers[i];
}
/**
* Check if a customer with the same number as the one given exists in this list
* @param customer the customer to check for (will use customer.getNumber() to check against list)
* @return <code>true</code> if the customer is found. <code>false</code> otherwise.
*/
public boolean contains(Customer customer) {
for (int i = 0; i < numberOfCustomers; i++) {
if (customers[i].getNumber() == customer.getNumber()) {
return true;
}
}
// if we got here, it means we didn't find the customer
return false;
}
}
With this implemented, the rewrite of the writeToFile method is exactly the same, except we use CustomerList instead of List<Customer>:
static void writeToFile(CustomerList customers, String filename) throws IOException {
// set up file for output
// pw used to write to file
File outputFile = new File(filename);
FileOutputStream fos = new FileOutputStream(outputFile);
PrintWriter pw = new PrintWriter(new OutputStreamWriter(fos));
writeToStream(customers, pw);
pw.flush();
pw.close();
}
The writeToStream is also very similar, except since we're not using an Iterator, we have to traverse the list manually:
static void writeToStream(CustomerList customers, PrintWriter pw) throws IOException {
for (int i = 0; i < customers.getLength(); i++) {
pw.println(customers.getCustomer(i).getName());
pw.println(customers.getCustomer(i).getNumber());
}
pw.println(0);
pw.println(0);
}
Similar for readFromFile -- pretty much the same except for the list type:
public static CustomerList readFromFile(String filename) throws IOException {
// set up file for reading
// br used to read from file
File inputFile = new File(filename);
FileInputStream fis = new FileInputStream(inputFile);
BufferedReader br = new BufferedReader(new InputStreamReader(fis));
CustomerList customers = readFromStream(br);
br.close(); // end ReadFile class
return customers;
}
The readFromStream is also pretty much the same, except for the type (the methods used on CustomerList have the same signatures as the ones used on List<Customer>):
public static CustomerList readFromStream(BufferedReader br) throws IOException {
CustomerList customerList = new CustomerList();
boolean moreCustomers = true;
while (moreCustomers) {
try {
Customer customer = new Customer();
customer.setName(br.readLine());
String sCustNo = br.readLine();
customer.setNumber(Integer.parseInt(sCustNo));
if (customer.getNumber() == 0) {
moreCustomers = false;
}
else {
customerList.add(customer);
}
}
catch (NumberFormatException x) {
// happens if the line is not a number.
// handle this somehow, e.g. by ignoring, logging, or stopping execution
// for now, we just stop reading
moreCustomers = false;
}
}
return customerList;
}
The most different method is the syncToFile, as we don't have the Set type that guarantees no duplicates, we have to manually check each time we try to insert a customer from the file:
static void syncToFile(CustomerList customers, String filename) throws IOException {
// get a list of existing customers
CustomerList customersInFile = readFromFile(filename);
// use a set to merge
CustomerList customersToWrite = new CustomerList();
// first add current in-memory customers
for (int i = 0; i < customers.getLength(); i++) {
customersToWrite.add(customers.getCustomer(i));
}
// then add the ones from the file. But skip duplicates
for (int i = 0; i < customersInFile.getLength(); i++) {
if (!customersToWrite.contains(customersInFile.getCustomer(i))) {
customersToWrite.add(customersInFile.getCustomer(i));
}
}
// then save the merged set
writeToFile(customersToWrite, filename);
}
Something to note here is that we could have optimized the add operations by having an extra constructor for CustomerList that took the new capacity, but I'll leave at least something for you to figure out ;)
I have an ArrayList of objects that I want to write to a file and later read back into an ArrayList. I can successfully write them to the file using writeObject, but when reading them back I can only read the first object. Here is my code for reading from the serialized file:
public void loadFromFile() throws IOException, ClassNotFoundException {
FileInputStream fis = new FileInputStream(file);
ObjectInputStream ois = new ObjectInputStream(fis);
myStudentList = (ArrayList<Student>) ois.readObject();
}
EDIT:
This is the code for writing list into the file.
public void saveToFile(ArrayList<Student> list) throws IOException {
ObjectOutputStream out;
if (!file.exists()) out = new ObjectOutputStream(new FileOutputStream(file));
else out = new AppendableObjectOutputStream(new FileOutputStream(file, true)); // custom subclass that skips the stream header
out.writeObject(list);
out.close();
}
The rest of my class is:
public class Student implements Serializable {
String name;
String surname;
int ID;
public ArrayList<Student> myStudentList = new ArrayList<Student>();
File file = new File("src/files/students.txt");
public Student(String namex, String surnamex, int IDx) {
this.name = namex;
this.surname = surnamex;
this.ID = IDx;
}
public Student(){}
//Getters and Setters
public void add() {
Scanner input = new Scanner(System.in);
System.out.println("name");
String name = input.nextLine();
System.out.println("surname");
String surname = input.nextLine();
System.out.println("ID");
int ID = input.nextInt();
Student studenttemp = new Student(name, surname, ID);
myStudentList.add(studenttemp);
try {
saveToFile(myStudentList);
}
catch (IOException e){
e.printStackTrace();
}
}
OK, so you are storing the whole list of students every time a new student comes in, so basically your file contains:
a list with one student
a list with two students, including the first one
a list of 3 students
and so on.
You probably thought it would write only the new students in an incremental fashion, but that is not how it works.
You should instead add all the students you want to store into the list first, and then store the complete list in the file, just as you are doing now.
Now, when you read from the file, the first readObject will return list no. 1, which is why you are getting a list with only one student. A second readObject would give you list no. 2, and so on.
So to save your data, you either have to:
create the complete list containing all N students and store it in the file once, or
not use a list, but store the students directly to the file one by one
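The first option can be as simple as overwriting the file with the whole list on every save, so one writeObject call stores everything. A self-contained sketch (plain strings stand in for the Student class here so it runs on its own):

```java
import java.io.*;
import java.util.ArrayList;

public class StudentStore {
    // Overwrite the file with the complete list every time (option 1).
    // No append mode: a single writeObject call stores the whole list.
    static void saveAll(ArrayList<? extends Serializable> list, File file) {
        try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(file))) {
            out.writeObject(list);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Read the whole list back with one readObject call.
    @SuppressWarnings("unchecked")
    static <T> ArrayList<T> loadAll(File file) {
        try (ObjectInputStream in = new ObjectInputStream(new FileInputStream(file))) {
            return (ArrayList<T>) in.readObject();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        } catch (ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) throws Exception {
        File f = File.createTempFile("students", ".bin");
        ArrayList<String> names = new ArrayList<>(java.util.List.of("Ann", "Bob"));
        saveAll(names, f);
        ArrayList<String> back = loadAll(f);
        System.out.println(back); // prints [Ann, Bob]
        f.delete();
    }
}
```

With this shape there is no AppendableObjectOutputStream trickery at all, which is usually the simpler design.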
To read it back:
call readObject once, which gives you the whole List<Student> (matches option 1)
read the students one by one with multiple calls to readObject (matches option 2)
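Reading objects back one by one needs a stop condition; catching EOFException is the usual, if inelegant, way. A self-contained sketch (plain strings stand in for Student, and it assumes a single ObjectOutputStream wrote all entries, with no append trickery):

```java
import java.io.*;
import java.util.ArrayList;
import java.util.List;

public class OneByOne {
    // Read objects one at a time until the stream runs out.
    // readObject throws EOFException at the end of the stream.
    static List<Object> readAll(File file) throws IOException, ClassNotFoundException {
        List<Object> loaded = new ArrayList<>();
        try (ObjectInputStream in = new ObjectInputStream(new FileInputStream(file))) {
            while (true) {
                loaded.add(in.readObject());
            }
        } catch (EOFException end) {
            // normal termination: no more objects in the stream
        }
        return loaded;
    }

    public static void main(String[] args) throws Exception {
        File f = File.createTempFile("objs", ".bin");
        // One ObjectOutputStream writes all entries, one per writeObject call.
        try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(f))) {
            out.writeObject("Ann");
            out.writeObject("Bob");
        }
        System.out.println(readAll(f)); // prints [Ann, Bob]
        f.delete();
    }
}
```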
I think this is because readObject returns one object per call, starting with the first object written to the file.
If you want all of the objects, you can use a loop like this:
FileInputStream fis = new FileInputStream("OutObject.txt");
for(int i=0;i<3;i++) {
ObjectInputStream ois = new ObjectInputStream(fis);
Employee emp2 = (Employee) ois.readObject();
System.out.println("Name: " + emp2.getName());
System.out.println("Surname: " + emp2.getSirName());
System.out.println("ID: " + emp2.getId());
}
I have to write code that will reverse the order of the string and write it in a new file. For example :
Hi my name is Bob.
I am ten years old.
The reversed will be :
I am ten years old.
Hi my name is Bob.
This is what I have so far. Not sure what to write for the outWriter print statement. Any help will be appreciated. Thanks!
import java.io.*;
import java.util.ArrayList;
import java.util.Scanner;
public class FileRewinder {
public static void main(String[] args) {
File inputFile = new File("ascii.txt");
ArrayList<String> list1 = new ArrayList<String>();
Scanner inputScanner;
try {
inputScanner = new Scanner(inputFile);
} catch (FileNotFoundException f) {
System.out.println("File not found :" + f);
return;
}
while (inputScanner.hasNextLine()) {
String curLine = inputScanner.nextLine();
System.out.println(curLine);
}
inputScanner.close();
File outputFile = new File("hi.txt");
PrintWriter outWriter = null;
try {
outWriter = new PrintWriter(outputFile);
} catch (FileNotFoundException e) {
System.out.println("File not found :" + e);
return;
}
outWriter.println(???);
outWriter.close();
}
}
My suggestion is to read the entire file first and store the sentences (you can split on '.') in a LinkedList<String>, which keeps insertion order.
Then use an Iterator to get the sentences in reverse order and write them to a file. Make sure to put a '.' right after each sentence.
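One concrete way to do that reverse traversal is LinkedList's descendingIterator. A sketch, with the file I/O omitted so the reversal itself is visible:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.LinkedList;
import java.util.List;

public class ReverseLines {
    // Walk the LinkedList back-to-front without modifying it.
    static List<String> reversed(LinkedList<String> lines) {
        List<String> out = new ArrayList<>();
        Iterator<String> it = lines.descendingIterator();
        while (it.hasNext()) {
            out.add(it.next());
        }
        return out;
    }

    public static void main(String[] args) {
        LinkedList<String> lines = new LinkedList<>();
        lines.add("Hi my name is Bob.");  // insertion order preserved
        lines.add("I am ten years old.");
        reversed(lines).forEach(System.out::println);
        // prints:
        // I am ten years old.
        // Hi my name is Bob.
    }
}
```

Writing each returned line with outWriter.println instead of System.out.println gives the reversed output file.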
After System.out.println(curLine);, add list1.add(curLine); to place your lines of text into your list.
At the end, loop over list1 backwards:
for (int i = list1.size() - 1; i >= 0; --i) {
outWriter.println(list1.get(i));
}
If the file's lines fit into memory, you can read all the lines into a list, reverse the order of the list, and write the list back to disk.
public class Reverse {
static final Charset FILE_ENCODING = StandardCharsets.UTF_8;
public static void main(String[] args) throws IOException {
List<String> inLines = Files.readAllLines(Paths.get("ascii.txt"), FILE_ENCODING);
Collections.reverse(inLines);
Files.write(Paths.get("hi.txt"), inLines, FILE_ENCODING);
}
}