Creating files in a separate thread - java

I have a method that starts creating JSON files in each of the folders in my tree.
public static void fill(List<String> subFoldersPaths) {
    for (int i = 0; i < subFoldersPaths.size(); i++) {
        String fullFileName = subFoldersPaths.get(i) + FILE_NAME;
        String formatFullFileName = String.format(fullFileName, i) + "%d";
        Runnable runnable = new JsonCreator(formatFullFileName);
        new Thread(runnable).start();
    }
}
List<String> subFoldersPaths is a list that contains paths to each folder in order.
Here is my folder structure:
I want each folder to be filled with files in a separate thread every 0.08 seconds. But my class will not fill every folder.
Here is a class that implements Runnable, which should perform the filling:
import com.epam.lab.model.Author;
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import net.andreinc.mockneat.MockNeat;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import java.io.FileWriter;
import java.io.IOException;
public class JsonCreator implements Runnable {

    private static Logger logger = LogManager.getLogger();
    private static String fileName;
    private static final int FILES_COUNT = 100;

    public JsonCreator(String s) {
        this.fileName = s;
    }

    @Override
    public void run() {
        for (int i = 0; i < FILES_COUNT; i++) {
            try {
                String formatFullFileName = String.format(fileName, i) + ".json";
                FileWriter fileWriter = new FileWriter(formatFullFileName);
                fileWriter.write(createJsonString());
                fileWriter.close();
                Thread.sleep(80);
            } catch (IOException | InterruptedException e) {
                logger.error("File was not created", e);
            }
        }
    }

    private static String createJsonString() {
        MockNeat mockNeat = MockNeat.threadLocal();
        Gson gson = new GsonBuilder()
                .setPrettyPrinting()
                .create();
        String json = mockNeat
                .reflect(Author.class)
                .field("authorName", mockNeat.names().first())
                .field("authorSurname", mockNeat.names().last())
                .map(gson::toJson)
                .val();
        return json;
    }
}
But this class does not fill every folder with files (maybe there is a problem with the file names); I cannot figure it out.
I want each folder below "foo" to be filled, in a separate thread, with FILES_COUNT = 10 JSON files.
Some examples of the algorithm's execution:
The folder structure is created with some randomness, so it is almost always different, but that does not affect the fact that files are not created in all folders.

Your code is buggy; you cannot ever use that FileWriter constructor. Use new FileWriter(formatFullFileName, StandardCharsets.UTF_8), which is only available in JDK 11. If you're not on JDK 11, you can't use FileWriter at all (it uses the platform default encoding, and that is not acceptable; JSON must be in UTF-8 per the JSON spec, and you have no guarantee that UTF-8 is your platform default).
You aren't guarding your FileWriter with an ARM block (try-with-resources); you should add that.
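A minimal sketch of that fix combined with the UTF-8 point above (a drop-in for the body of the loop in run(); it assumes JDK 11 and an import of java.nio.charset.StandardCharsets):
try (FileWriter fileWriter = new FileWriter(formatFullFileName, StandardCharsets.UTF_8)) {
    // try-with-resources closes the writer even if write() throws
    fileWriter.write(createJsonString());
}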
In the initial block, formatFullFileName is a variable that holds a format string. In the run() method, it's the opposite: it holds the result of running a String.format on one. That makes your code very hard to read.
Most likely your file names are incorrect. You should be using List<Path>, which would have removed any doubt. If your List<String> subFoldersPaths contains, for example, /home/misnomer/project/foo/1stLayerSubFolder0, and the constant FILE_NAME (which you did not include in your paste) is, say, example, then the path for the very first file to be created becomes /home/misnomer/project/foo/1stLayerSubFolder0example0.json, which is not what you wanted: you're missing a slash.
NB: If using the newer path API, writing a string to a file becomes vastly simpler: Files.writeString(path, string) (Java 11+) is all you need (and note that the Files API defaults to UTF-8, unlike most other parts of the Java libraries that involve turning strings to bytes or vice versa).
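For illustration, a hedged sketch of how the path handling could look with java.nio.file (the folder and file names below are made up, not taken from the question):
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class PathWriteSketch {
    public static void main(String[] args) throws IOException {
        // resolve() inserts the path separator for you, so the missing-slash bug cannot happen
        Path folder = Paths.get("/home/misnomer/project/foo/1stLayerSubFolder0"); // example folder
        Path file = folder.resolve("example0.json");
        Files.writeString(file, "{\"authorName\":\"Jane\"}"); // Java 11+, writes UTF-8 by default
    }
}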
The paste needs more info, or you should debug this on your own: print when you write a file, preferably including the thread name (you can get it with Thread.currentThread().getName()). That's how programming works: you don't just stare at it, go "heck, I dunno, better ask Stack Overflow!" and then give up. You debug it. Use a debugger, or, if you can't or don't want to, use the poor man's debugger: add a whole bunch of System.out.println statements. Go through your code and imagine (write it down if you have to) what each step is doing. Then add a println statement that confirms it. The very place where what the program says it is doing does not match what you thought it would do? That's where a bug is. Fix it, and keep going until all bugs are eliminated.
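For example, a single trace line (placing it just before the write in run() is an assumption) already tells you which thread writes which file:
// poor man's debugger: log the thread name and the exact file name about to be written
System.out.println(Thread.currentThread().getName() + " -> " + formatFullFileName);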

Related

How to read a text file into an array list of objects in Java

I'm currently working on a project and I'm running into a couple of issues. This project involves working with 2 classes, Subject and TestSubject. Basically, I need my program (in TestSubject class) to read details (subject code and subject name) from a text file and create subject objects using this information, then add those to an array list. The text file looks like this:
ITC105: Communication and Information Management
ITC106: Programming Principles
ITC114: Introduction to Database Systems
ITC161: Computer Systems
ITC204: Human Computer Interaction
ITC205: Professional Programming Practice
The first part is the subject code, i.e. ITC105, and the second part is the name (Communication and Information Management).
I have created the subject object with the code and name as strings with getters and setters to allow access (in the subject class):
private static String subjectCode;
private static String subjectName;

public Subject(String newSubjectCode, String newSubjectName) {
    newSubjectCode = subjectCode;
    newSubjectName = subjectName;
}

public String getSubjectCode() {
    return subjectCode;
}

public String getSubjectName() {
    return subjectName;
}

public void setSubjectCode(String newSubjectCode) {
    subjectCode = newSubjectCode;
}

public void setSubjectName(String newSubjectName) {
    subjectName = newSubjectName;
}
The code I have so far for reading the file and creating the array list is:
public class TestSubject {
    @SuppressWarnings({ "null", "resource" })
    public static void main(String[] args) throws IOException {
        File subjectFile = new File("A:\\Assessment 3 Task 1\\src\\subjects.txt");
        Scanner scanFile = new Scanner(subjectFile);
        System.out.println("The current subjects are as follows: ");
        System.out.println(" ");
        while (scanFile.hasNextLine()) {
            System.out.println(scanFile.nextLine());
        }
        // This array will store the list of subject objects.
        ArrayList<Object> subjectList = new ArrayList<>();
        // Subjects split into code and name and added to a new subject object.
        String[] token = new String[3];
        while (scanFile.hasNextLine()) {
            token = scanFile.nextLine().split(": ");
            String code = token[0] + ": ";
            String name = token[1];
            Subject addSubjects = new Subject(code, name);
            // Each subject is then added to the subject list array list.
            subjectList.add(addSubjects);
        }
        // Check if the array list is being filled by printing it to the console.
        System.out.println(subjectList.toString());
This code isn't working; the array list is just printing as blank. I have tried doing this several ways, including a buffered reader, but I can't get it to work so far. The next section of code allows a user to enter a subject code and name, which is then added to the array list as well. That section of code works perfectly; I'm just stuck on the above part. Any advice on how to fix it would be amazing.
Another small thing:
File subjectFile = new File ("A:\\Assessment 3 Task 1\\src\\subjects.txt"); //this file path
Scanner scanFile = new Scanner(subjectFile);
I'd like to know how I can change the file path so that it will still work if the folder is moved or the files are opened on another computer. The .txt file is in the source folder with the java files. I have tried:
File subjectFile = new File ("subjects.txt");
But that doesn't work and just throws errors.
That is because you have already read through the file:
while (scanFile.hasNextLine()) {
    System.out.println(scanFile.nextLine());
}
The contents are exhausted, so when you do
while (scanFile.hasNextLine()) {
    token = scanFile.nextLine().split(": ");
there is no data left.
Remove the first loop or re-open the file.
Or, as @UsagiMiyamoto mentions:
Read the line into a String variable, print it, then split it... all in one loop.
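A minimal sketch of that single-loop version (a drop-in for the two loops in the question, reusing the asker's scanFile, subjectList and Subject constructor):
// read, print, split and build the Subject in one pass over the file
while (scanFile.hasNextLine()) {
    String line = scanFile.nextLine();
    System.out.println(line);
    String[] token = line.split(": ");
    if (token.length == 2) { // skip blank or malformed lines
        subjectList.add(new Subject(token[0], token[1]));
    }
}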
I assume you are just beginning to learn Java, so the code below is probably way too advanced, but it may help others who are trying to do something similar, and it gives you a glimpse of what you will probably learn in the future.
The below code uses the following (in no particular order):
Streams
Accessing resources
Records
try-with-resources
Multi-catch
Method references
NIO.2
More notes after the code.
import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Collectors;
import java.util.stream.Stream;
public record Subject(String subjectCode, String subjectName) {

    private static final String DELIMITER = ": ";

    private static Path getPath(String filename) throws URISyntaxException {
        URL url = Subject.class.getResource(filename);
        URI uri = url.toURI(); // throws java.net.URISyntaxException
        return Paths.get(uri);
    }

    private static Subject makeSubject(String line) {
        String[] parts = line.split(DELIMITER);
        return new Subject(parts[0].trim(), parts[1].trim());
    }

    /**
     * Reads contents of a text file and converts its contents to a list of
     * instances of this record and displays that list.
     *
     * @param args - not used.
     */
    public static void main(String[] args) {
        try {
            Path path = getPath("subjects.txt");
            try (Stream<String> lines = Files.lines(path)) { // throws java.io.IOException
                lines.map(Subject::makeSubject)
                     .collect(Collectors.toList())
                     .forEach(System.out::println);
            }
        }
        catch (IOException | URISyntaxException x) {
            x.printStackTrace();
        }
    }
}
A Java record is applicable for an immutable object, and it simply saves you from writing code for methods including getters as well as equals, hashCode, and toString. (There are no setters, since a record is immutable.) It's a bit like Project Lombok. I would say that a Subject is immutable, since I don't think the code or name would need to be changed, and that's why I thought making Subject a record was appropriate.
Running the above code produces the following output:
Subject[subjectCode=ITC105, subjectName=Communication and Information Management]
Subject[subjectCode=ITC106, subjectName=Programming Principles]
Subject[subjectCode=ITC114, subjectName=Introduction to Database Systems]
Subject[subjectCode=ITC161, subjectName=Computer Systems]
Subject[subjectCode=ITC204, subjectName=Human Computer Interaction]
Subject[subjectCode=ITC205, subjectName=Professional Programming Practice]
Regarding
I'd like to know how I can change the file path so that it will still work if the folder is moved
I placed file subjects.txt in the same folder as file Subject.class, which allowed me to use method getResource. Refer to the Accessing resources link, above. Note that this can't be used if
the files are opened on another computer
Alternatively, there are several directories whose paths are stored in system properties, including (see the example after this list):
java.home
java.io.tmpdir
user.home
user.dir
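For example (a sketch; whether subjects.txt actually lives in the user's home directory is an assumption):
// anchor the file to a directory that exists on any machine
Path subjects = Paths.get(System.getProperty("user.home"), "subjects.txt");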
What did your debug console say about the exception?
Your code works very well in my editor.
(screenshot of the code result omitted)
And you should code like below if you want to read the file through a relative path.
Before:
new File("A:\\Assessment 3 Task 1\\src\\subjects.txt");
After:
new File(".\\subjects.txt");

Trying to add substrings from newLines in a large file to a list

I downloaded my extended listening history from Spotify and I am trying to make a program to turn the data into a list of artists, without duplicates, that I can easily make sense of. The file is rather huge because it has data on every stream I have done since 2016 (307790 lines of text in total). This is what 2 lines of the file look like:
{"ts":"2016-10-30T18:12:51Z","username":"edgymemes69endmylifepls","platform":"Android OS 6.0.1 API 23 (HTC, 2PQ93)","ms_played":0,"conn_country":"US","ip_addr_decrypted":"68.199.250.233","user_agent_decrypted":"unknown","master_metadata_track_name":"Devil's Daughter (Holy War)","master_metadata_album_artist_name":"Ozzy Osbourne","master_metadata_album_album_name":"No Rest for the Wicked (Expanded Edition)","spotify_track_uri":"spotify:track:0pieqCWDpThDCd7gSkzx9w","episode_name":null,"episode_show_name":null,"spotify_episode_uri":null,"reason_start":"fwdbtn","reason_end":"fwdbtn","shuffle":true,"skipped":null,"offline":false,"offline_timestamp":0,"incognito_mode":false},
{"ts":"2021-03-26T18:15:15Z","username":"edgymemes69endmylifepls","platform":"Android OS 11 API 30 (samsung, SM-F700U1)","ms_played":254120,"conn_country":"US","ip_addr_decrypted":"67.82.66.3","user_agent_decrypted":"unknown","master_metadata_track_name":"Opportunist","master_metadata_album_artist_name":"Sworn In","master_metadata_album_album_name":"Start/End","spotify_track_uri":"spotify:track:3tA4jL0JFwFZRK9Q1WcfSZ","episode_name":null,"episode_show_name":null,"spotify_episode_uri":null,"reason_start":"fwdbtn","reason_end":"trackdone","shuffle":true,"skipped":null,"offline":false,"offline_timestamp":1616782259928,"incognito_mode":false},
It is formatted in the actual text file so that each stream is on its own line. NetBeans is telling me the exception is happening at line 19 and it only fails when I am looking for a substring bounded by the indexOf function. My code is below. I have no idea why this isn't working, any ideas?
import java.io.File;
import java.util.*;

public class MainClass {
    public static void main(String args[]) {
        File dat = new File("SpotifyListeningData.txt");
        List<String> list = new ArrayList<String>();
        Scanner swag = null;
        try {
            swag = new Scanner(dat);
        }
        catch (Exception e) {
            System.out.println("pranked");
        }
        while (swag.hasNextLine())
            if (swag.nextLine().length() > 1)
                if (list.contains(swag.nextLine().substring(swag.nextLine().indexOf("artist_name"), swag.nextLine().indexOf("master_metadata_album_album"))))
                    System.out.print("");
                else
                    try {
                        list.add(swag.nextLine().substring(swag.nextLine().indexOf("artist_name"), swag.nextLine().indexOf("master_metadata_album_album")));
                    }
                    catch (Exception e) {}
        System.out.println(list);
    }
}
Find a JSON parser you like.
Create a class with the fields you care about, marked up to the parser's specs.
Read the file into a collection of objects. Most parsers will stream the contents so you're not storing a massive string.
You can then load the data into objects and store them as you see fit. For your purposes, a TreeSet is probably what you want.
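A hedged sketch of that approach, assuming Gson as the parser (the question doesn't pick one) and keeping only the artist field; the file name is taken from the question:
import com.google.gson.Gson;

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Set;
import java.util.TreeSet;

public class ArtistExtractor {

    // only the field we care about; Gson ignores the rest of each JSON object
    static class StreamEntry {
        String master_metadata_album_artist_name;
    }

    public static void main(String[] args) throws IOException {
        Gson gson = new Gson();
        Set<String> artists = new TreeSet<>(); // sorted, no duplicates
        for (String line : Files.readAllLines(Paths.get("SpotifyListeningData.txt"))) {
            line = line.trim();
            if (line.endsWith(",")) { // each line in the export ends with a comma
                line = line.substring(0, line.length() - 1);
            }
            if (line.isEmpty()) {
                continue;
            }
            StreamEntry entry = gson.fromJson(line, StreamEntry.class);
            if (entry != null && entry.master_metadata_album_artist_name != null) {
                artists.add(entry.master_metadata_album_artist_name);
            }
        }
        System.out.println(artists);
    }
}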
Your code will throw a lot of exceptions, not least because you don't use braces. Please do use braces in every block, whether it is an if, an else, a loop, whatever. It's good practice and prevents unnecessary bugs.
Also, every time scanner.nextLine() is called, it reads the next line from the file, so you need to avoid calling it repeatedly the way you do.
The best way to deal with this is to write a class containing the same fields as the JSON in each line of the file, map the JSON to that class, and get the desired field value from it.
Your approach is quite risky and depends on the structure of the data, even on whitespace. However, I fixed some lines in your code and this will work for your purpose, although I really don't prefer manipulating strings this way.
while (swag.hasNextLine()) {
    String swagNextLine = swag.nextLine();
    if (swagNextLine.length() > 1) {
        String toBeAdded = swagNextLine.substring(swagNextLine.indexOf("artist_name") + "artist_name".length() + 2,
                swagNextLine.indexOf("master_metadata_album_album") - 2);
        if (list.contains(toBeAdded)) {
            System.out.print("Match");
        } else {
            try {
                list.add(toBeAdded);
            } catch (Exception e) {
                System.out.println("Add to list failed");
            }
        }
        System.out.println(list);
    }
}

Java8:Handling a checked exception in java8 lambda's like Stream.forEach() Method [duplicate]

I'd like to read in a file and replace some text with new text. It would be simple using asm and int 21h but I want to use the new java 8 streams.
Files.write(outf.toPath(),
        (Iterable<String>) Files.lines(inf)::iterator,
        CREATE, WRITE, TRUNCATE_EXISTING);
Somewhere in there I'd like a lines.replace("/*replace me*/","new Code()\n");. The new lines are because I want to test inserting a block of code somewhere.
Here's a play example, that doesn't work how I want it to, but compiles. I just need a way to intercept the lines from the iterator, and replace certain phrases with code blocks.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import static java.nio.file.StandardOpenOption.*;
import java.util.Arrays;
import java.util.stream.Stream;

public class FileStreamTest {
    public static void main(String[] args) {
        String[] ss = new String[]{"hi", "pls", "help", "me"};
        Stream<String> stream = Arrays.stream(ss);
        try {
            Files.write(Paths.get("tmp.txt"),
                    (Iterable<String>) stream::iterator,
                    CREATE, WRITE, TRUNCATE_EXISTING);
        } catch (IOException ex) {}

        //// I'd like to hook this next part into the Files.write part. /////
        // reset stream
        stream = Arrays.stream(ss);
        Iterable<String> it = stream::iterator;
        // I'd like to replace some text before writing to the file
        for (String s : it) {
            System.out.println(s.replace("me", "my\nreal\nname"));
        }
    }
}
Edit: I've gotten this far and it works. I was trying with filter, and maybe it isn't really necessary.
Files.write(Paths.get("tmp.txt"),
        (Iterable<String>) (stream.map((s) -> {
            return s.replace("me", "my\nreal\nname");
        }))::iterator,
        CREATE, WRITE, TRUNCATE_EXISTING);
The Files.write(..., Iterable, ...) method seems tempting here, but converting the Stream to an Iterable makes this cumbersome. It also "pulls" from the Iterable, which is a bit odd. It would make more sense if the file-writing method could be used as the stream's terminal operation, within something like forEach.
Unfortunately, most things that write throw IOException, which isn't permitted by the Consumer functional interface that forEach expects. But PrintWriter is different. At least, its writing methods don't throw checked exceptions, although opening one can still throw IOException. Here's how it could be used.
Stream<String> stream = ... ;

try (PrintWriter pw = new PrintWriter("output.txt", "UTF-8")) {
    stream.map(s -> s.replaceAll("foo", "bar"))
          .forEachOrdered(pw::println);
}
Note the use of forEachOrdered, which prints the output lines in the same order in which they were read, which is presumably what you want!
If you're reading lines from an input file, modifying them, and then writing them to an output file, it would be reasonable to put both files within the same try-with-resources statement:
try (Stream<String> input = Files.lines(Paths.get("input.txt"));
     PrintWriter output = new PrintWriter("output.txt", "UTF-8"))
{
    input.map(s -> s.replaceAll("foo", "bar"))
         .forEachOrdered(output::println);
}
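Not part of the suggestion above, but as an alternative sketch: if you prefer to stay with the Files/NIO API (which defaults to UTF-8), you can wrap the checked IOException in an UncheckedIOException inside the lambda (imports from java.io and java.nio.file assumed; the surrounding method still has to handle IOException from opening the files):
try (Stream<String> input = Files.lines(Paths.get("input.txt"));
     BufferedWriter output = Files.newBufferedWriter(Paths.get("output.txt"))) {
    input.map(s -> s.replaceAll("foo", "bar"))
         .forEachOrdered(line -> {
             try {
                 output.write(line);
                 output.newLine();
             } catch (IOException e) {
                 throw new UncheckedIOException(e); // rethrow as unchecked inside the lambda
             }
         });
}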

How to create a dynamic Interface with properties file at compile time?

The problem here is that the property file we use has insanely long names as keys, and most of us run into incorrect key-naming issues. So it got me thinking: is there a way to generate the following interface from the property file? Every change we make to the property file would auto-adjust the Properties interface. Or is there another solution?
Property File
A=Apple
B=Bannana
C=Cherry
It should generate the following interface:
interface Properties {
    public static final String A = "A"; // keys
    public static final String B = "B";
    public static final String C = "C";
}
So in my application code
String a_value = PROP.getString(Properties.A);
There is an old rule about programming (and not only programming): if something looks beautiful, then most probably it is the right way to do it.
This approach does not look good, from my point of view.
The first thing:
Do not declare constants in interfaces. It violates encapsulation. Please check this article: http://en.wikipedia.org/wiki/Constant_interface
The second thing:
Use a prefix for the names of the properties that are somehow special, let's say key_.
And when you load your properties file, iterate over the keys, extract the keys whose names start with key_, and use the values of these keys as you planned to use those constants in your question.
UPDATE
Assume we generate a huge properties file during the compilation process, using our Apache Ant script.
For example, let's say the properties file (myapp.properties) looks like this:
key_A = Apple
key_B = Banana
key_C = Cherry
anotherPropertyKey1 = blablabla1
anotherPropertyKey2 = blablabla2
The special properties we want to handle have key names starting with the key_ prefix.
So we write the following code (please note, it is not optimized; it is just a proof of concept):
package propertiestest;

import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import java.util.Arrays;
import java.util.Enumeration;
import java.util.HashSet;
import java.util.Properties;
import java.util.Set;

public class PropertiesTest {
    public static void main(String[] args) throws IOException {
        final String PROPERTIES_FILENAME = "myapp.properties";
        SpecialPropertyKeysStore spkStore =
                new SpecialPropertyKeysStore(PROPERTIES_FILENAME);
        System.out.println(Arrays.toString(spkStore.getKeysArray()));
    }
}

class SpecialPropertyKeysStore {

    private final Set<String> keys;

    public SpecialPropertyKeysStore(String propertiesFileName)
            throws FileNotFoundException, IOException {
        // prefix of the name of a special property key
        final String KEY_PREFIX = "key_";
        Properties propertiesHandler = new Properties();
        keys = new HashSet<>();
        try (InputStream input = new FileInputStream(propertiesFileName)) {
            propertiesHandler.load(input);
            Enumeration<?> enumeration = propertiesHandler.propertyNames();
            while (enumeration.hasMoreElements()) {
                String key = (String) enumeration.nextElement();
                if (key.startsWith(KEY_PREFIX)) {
                    keys.add(key);
                }
            }
        }
    }

    public boolean isKeyPresent(String keyName) {
        return keys.contains(keyName);
    }

    public String[] getKeysArray() {
        String[] strTypeParam = new String[0];
        return keys.toArray(strTypeParam);
    }
}
Class SpecialPropertyKeysStore filters and collects all special keys into its instance.
And you can get an array of these keys, or check whether a key is present or not.
If you run this code, you will get:
[key_C, key_B, key_A]
It is the string representation of the returned array of special key names.
Change this code as you want to meet your requirements.
I would not generate a class or interface from properties, because you would lose the ability to:
document those properties: written by hand, they would be represented by a Java element plus Javadoc
reference those properties in your code: written by hand, they would be plain old Java constants, and the compiler would have full knowledge of them. Refactoring them would also be possible, while it would not be possible with automatically generated names.
You can also use enums, or create some special Property class with a name as its only (final) field. Then you only need a get method that takes a Properties, a Map, or whatever.
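A small sketch of the enum variant (the key names and the helper method are illustrative, not from the question):
// enum constants double as property keys, without a constant interface
enum PropKey {
    A, B, C;

    String getString(java.util.Properties props) {
        return props.getProperty(name()); // name() is the constant's own name, e.g. "A"
    }
}

// usage: String a_value = PropKey.A.getString(loadedProperties);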
As for your request, you can execute code with the maven-exec-plugin.
You should simply create a main that reads your properties file and, for each key:
convert the key to a valid Java identifier (you can use isJavaIdentifierStart and isJavaIdentifierPart to replace invalid chars with a _; see the sketch after this list)
write your class/interface/whatever you like using plain old Java (and don't forget to escape eventual double quotes or backslashes!)
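A hedged sketch of that conversion step (the helper name is made up):
// turn an arbitrary property key into a legal Java identifier
static String toJavaIdentifier(String key) {
    StringBuilder sb = new StringBuilder();
    for (int i = 0; i < key.length(); i++) {
        char c = key.charAt(i);
        boolean valid = (i == 0) ? Character.isJavaIdentifierStart(c)
                                 : Character.isJavaIdentifierPart(c);
        sb.append(valid ? c : '_');
    }
    return sb.toString();
}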
Since it would be a part of your build, say before building the other classes that depend on those constants, I would recommend that you create a specific Maven project to isolate that build step.
Still, I really would not do that; I would use a POJO loaded by whatever you need (CDI, Spring, static initialization, etc.).

File I/O bottleneck found via VisualVM

I've found a bottleneck in my app that keeps growing as data in my files grow (see attached screenshot of VisualVM below).
Below is the getFileContentsAsList code. How can this be made better performance-wise? I've read several posts on efficient file I/O and some have suggested Scanner as a way to efficiently read from a file. I've also tried Apache Commons readFileToString, but that's not running fast either.
The data file that's causing the app to run slower is 8 KB...that doesn't seem too big to me.
I could convert to an embedded database like Apache Derby if that seems like a better route. Ultimately looking for what will help the application run faster (It's a Java 1.7 Swing app BTW).
Here's the code for getFileContentsAsList:
public static List<String> getFileContentsAsList(String filePath) throws IOException {
    if (ReceiptPrinterStringUtils.isNullOrEmpty(filePath)) throw new IllegalArgumentException("File path must not be null or empty");
    Scanner s = null;
    List<String> records = new ArrayList<String>();
    try {
        s = new Scanner(new BufferedReader(new FileReader(filePath)));
        s.useDelimiter(FileDelimiters.RECORD);
        while (s.hasNext()) {
            records.add(s.next());
        }
    } finally {
        if (s != null) {
            s.close();
        }
    }
    return records;
}
The capacity of an ArrayList is multiplied by 1.5 when necessary, so the number of resizes is O(log(N)). (Doubling was used in Vector.) I would certainly use a LinkedList here, with its O(1) appends, and BufferedReader.readLine() rather than a Scanner if I were trying to speed it up. It's hard to believe that the time to read one 8 KB file is seriously a concern. You can read millions of lines in a second.
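A sketch of that variant (it assumes the record delimiter is a line break; if FileDelimiters.RECORD is something else, keep the Scanner; imports needed: java.io.BufferedReader, java.io.FileReader, java.io.IOException, java.util.LinkedList, java.util.List):
public static List<String> getFileContentsAsList(String filePath) throws IOException {
    List<String> records = new LinkedList<>();
    try (BufferedReader reader = new BufferedReader(new FileReader(filePath))) {
        String line;
        while ((line = reader.readLine()) != null) { // readLine() avoids the Scanner overhead
            records.add(line);
        }
    }
    return records;
}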
So, file I/O gets to be REALLY expensive if you do it a lot. As seen in my screenshot and original code, getFileContentsAsList, which contains file I/O calls, gets invoked quite a bit (18,425 times). VisualVM is a real gem of a tool for pointing out bottlenecks like these!
After contemplating various ways to improve performance, it dawned on me that possibly the best way is to do file I/O calls as little as possible. So I decided to use private static variables to hold the file contents and to only do file I/O in the static initializer and when a file is written to. As my application is (fortunately) not doing excessive writing (but excessive reading), this makes for a much better performing application.
Here's the source for the entire class that contains the getFileContentsAsList method. I took a snapshot of that method and it now runs in 57.2 ms (down from 3116 ms). Also, it was my longest-running method and is now my 4th longest-running method. The top 5 longest-running methods now run for a total of 498.8 ms, as opposed to the ones in the original screenshot that ran for a total of 3812.9 ms. That's a decrease of about 87%
[100 * (3812.9 - 498.8) / 3812.9].
package com.mbc.receiptprinter.util;

import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.logging.Level;

import org.apache.commons.io.FileUtils;

import com.mbc.receiptprinter.constant.FileDelimiters;
import com.mbc.receiptprinter.constant.FilePaths;

/*
 * Various File utility functions. This class uses the Apache Commons FileUtils class.
 */
public class ReceiptPrinterFileUtils {

    private static Map<String, String> fileContents = new HashMap<String, String>();
    private static Map<String, Boolean> fileHasBeenUpdated = new HashMap<String, Boolean>();

    static {
        for (FilePaths fp : FilePaths.values()) {
            File f = new File(fp.getPath());
            try {
                FileUtils.touch(f);
                fileHasBeenUpdated.put(fp.getPath(), false);
                fileContents.put(fp.getPath(), FileUtils.readFileToString(f));
            } catch (IOException e) {
                ReceiptPrinterLogger.logMessage(ReceiptPrinterFileUtils.class,
                        Level.SEVERE,
                        "IOException while performing FileUtils.touch in static block of ReceiptPrinterFileUtils", e);
            }
        }
    }

    public static String getFileContents(String filePath) throws IOException {
        if (ReceiptPrinterStringUtils.isNullOrEmpty(filePath)) throw new IllegalArgumentException("File path must not be null or empty");
        File f = new File(filePath);
        if (fileHasBeenUpdated.get(filePath)) {
            fileContents.put(filePath, FileUtils.readFileToString(f));
            fileHasBeenUpdated.put(filePath, false);
        }
        return fileContents.get(filePath);
    }

    public static List<String> convertFileContentsToList(String fileContents) {
        List<String> records = new ArrayList<String>();
        if (fileContents.contains(FileDelimiters.RECORD)) {
            records = Arrays.asList(fileContents.split(FileDelimiters.RECORD));
        }
        return records;
    }

    public static void writeStringToFile(String filePath, String data) throws IOException {
        fileHasBeenUpdated.put(filePath, true);
        FileUtils.writeStringToFile(new File(filePath), data);
    }

    public static void writeStringToFile(String filePath, String data, boolean append) throws IOException {
        fileHasBeenUpdated.put(filePath, true);
        FileUtils.writeStringToFile(new File(filePath), data, append);
    }
}
ArrayLists have good performance at reading and also at writing IF the length does not change very often. In your application the length changes very often (the capacity is increased when the list is full and an element is added), and your application then needs to copy the array into a new, longer array.
You could use a LinkedList, where new elements are appended and no copy actions are needed:
List<String> records = new LinkedList<String>();
Or you could initialize the ArrayList with the approximate final number of elements. This will reduce the number of copy actions:
List<String> records = new ArrayList<String>(2000);
