I have to write code in Java with the following structure:
Read a string from a file
// Perform some string processing
Write the output string to a file
Now, for reading/writing strings to/from the files, I am using:
BufferedReader br = new BufferedReader(new FileReader("Text.txt"), 32768);
BufferedWriter out = new BufferedWriter(new FileWriter("AnotherText.txt"), 32768);
String line;
while ((line = br.readLine()) != null) {
    String output = line; // perform some string processing
    out.write(output);
    out.newLine();
}
br.close();
out.close();
However, reading and writing seem quite slow. Is there a faster way to read/write strings to/from a file in Java?
Additional Info:
1) The input file is 144 MB.
2) I can allocate a large amount of memory (50 MB) for reading or writing.
3) I have to write it as a string, not as bytes.
It sounds slower than it should be.
You can try increasing the buffer size.
Maybe also try FileOutputStream instead of FileWriter.
You mentioned 50MB. Are you modifying the memory parameters of the program at all when you run it using a -X switch?
Ignoring the fact that you have not posted your performance requirements: try reading/writing the file as bytes and converting the bytes to characters/strings internally, as sketched below.
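A minimal sketch of that byte-based approach, assuming the file is UTF-8 and fits in memory (Files.readAllBytes is Java 7+; the processing step here is a stand-in):

byte[] data = Files.readAllBytes(Paths.get("Text.txt"));   // one bulk read
String text = new String(data, StandardCharsets.UTF_8);    // convert to a String internally
String output = text;                                      // perform some string processing
Files.write(Paths.get("AnotherText.txt"), output.getBytes(StandardCharsets.UTF_8)); // one bulk write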
This question might be helpful: Number of lines in a file in Java
I see some posts on StackOverflow that contradict each other, and I would like to get a definite answer.
I started with the assumption that using a Java InputStream would allow me to stream bytes out of a file, and thus save on memory, as I would not have to consume the whole file at once. And that is exactly what I read here:
Loading all bytes to memory is not a good practice. Consider returning the file and opening an input stream to read it, so your application won't crash when handling large files. – andrucz
Download file to stream instead of File
But then I used an InputStream to read a very large Microsoft Excel file (using the Apache POI library) and I ran into this error:
java.lang.outofmemory exception while reading excel file (xlsx) using POI
I got an OutOfMemory error.
And this crucial bit of advice saved me:
One thing that'll make a small difference is when opening the file to start with. If you have a file, then pass that in! Using an InputStream requires buffering of everything into memory, which eats up space. Since you don't need to do that buffering, don't!
I got rid of the InputStream and just used a bare java.io.File, and then the OutOfMemory error went away.
So using java.io.File is better than an InputStream, when it comes to memory use? That doesn't make any sense.
What is the real answer?
So you are saying that an InputStream would typically help?
It entirely depends on how the application (or library) >>uses<< the InputStream
With what kind of follow up code? Could you offer an example of memory efficient Java?
For example:
// Efficient use of memory
try (InputStream is = new FileInputStream(largeFileName);
        BufferedReader br = new BufferedReader(new InputStreamReader(is))) {
    String line;
    while ((line = br.readLine()) != null) {
        // process one line
    }
}
// Inefficient use of memory
try (InputStream is = new FileInputStream(largeFileName);
        BufferedReader br = new BufferedReader(new InputStreamReader(is))) {
    StringBuilder sb = new StringBuilder();
    String line;
    while ((line = br.readLine()) != null) {
        sb.append(line).append("\n");
    }
    String everything = sb.toString();
    // process the entire string
}
// Very inefficient use of memory
try (InputStream is = new FileInputStream(largeFileName);
        BufferedReader br = new BufferedReader(new InputStreamReader(is))) {
    String everything = "";
    String line;
    while ((line = br.readLine()) != null) {
        everything += line + "\n";
    }
    // process the entire string
}
(Note that there are more efficient ways of reading a file into memory. The above examples are purely to illustrate the principles.)
The general principles here are:
avoid holding the entire file in memory, all at the same time
if you have to hold the entire file in memory, then be careful about how you "accumulate" the characters.
The posts that you linked to above:
The first one is not really about memory efficiency. Rather, it is talking about a limitation of the AWS client-side library. Apparently, the API doesn't provide an easy way to stream an object while reading it. You have to save the object to a file, then open the file as a stream. Whether that is memory efficient or not depends on what the application does with the stream; see above.
The second one is specific to the POI APIs. Apparently, the POI library itself reads the stream contents into memory if you use a stream. That would be an implementation limitation of that particular library. (But there could be a good reason; e.g. maybe POI needs to be able to "seek" or "rewind" the stream.)
It seems that there are many, many ways to read text files in Java (BufferedReader, DataInputStream etc.) My personal favorite is Scanner with a File in the constructor (it's just simpler, works with mathy data processing better, and has familiar syntax).
Boris the Spider also mentioned Channel and RandomAccessFile.
Can someone explain the pros and cons of each of these methods? To be specific, when would I want to use each?
(edit) I think I should be specific and add that I have a strong preference for the Scanner method. So the real question is, when wouldn't I want to use it?
Let's start at the beginning. The question is: what do you want to do?
It's important to understand what a file actually is. A file is a collection of bytes on a disk; these bytes are your data. Java provides various levels of abstraction above that:
File(Input|Output)Stream - read these bytes as a stream of bytes.
File(Reader|Writer) - read from a stream of bytes as a stream of chars.
Scanner - read from a stream of char and tokenise it.
RandomAccessFile - read these bytes as a searchable byte[].
FileChannel - read these bytes in a safe multithreaded way.
On top of each of those there are the Decorators, for example you can add buffering with BufferedXXX. You could add linebreak awareness to a FileWriter with PrintWriter. You could turn an InputStream into a Reader with an InputStreamReader (currently the only way to specify character encoding for a Reader).
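For instance, a minimal sketch of stacking those decorators to get a buffered, UTF-8-aware Reader over a file (the file name is a placeholder):

try (BufferedReader br = new BufferedReader(
        new InputStreamReader(new FileInputStream("myFile.txt"), "UTF-8"))) {
    String line;
    while ((line = br.readLine()) != null) {
        // bytes -> chars (explicit encoding) -> buffered line-by-line reads
    }
}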
So - when wouldn't I want to use it [a Scanner]?
You would not use a Scanner if you wanted to (these are some examples):
Read in data as bytes
Read in a serialized Java object
Copy bytes from one file to another, maybe with some filtering.
It is also worth noting that the Scanner(File file) constructor takes the File and opens a FileInputStream with the platform default encoding - this is almost always a bad idea. It is generally recognised that you should specify the encoding explicitly to avoid nasty encoding-based bugs. Further, the stream isn't buffered.
So you may be better off with
try (final Scanner scanner = new Scanner(new BufferedInputStream(new FileInputStream("myFile.txt")), "UTF-8")) {
    // do stuff
}
Ugly, I know.
It's worth noting that Java 7 provides a further layer of abstraction to remove the need to loop over files - these are in the Files class:
byte[] Files.readAllBytes(Path path)
List<String> Files.readAllLines(Path path, Charset cs)
Both these methods read the entire file into memory, which might not be appropriate. In Java 8 this is further improved by adding support for the new Stream API:
Stream<String> Files.lines(Path path, Charset cs)
Stream<Path> Files.list(Path dir)
For example to get a Stream of words from a Path you can do:
final Stream<String> words = Files.lines(Paths.get("myFile.txt"))
        .flatMap((in) -> Arrays.stream(in.split("\\b")));
SCANNER:
can parse primitive types and strings using regular expressions.
A Scanner breaks its input into tokens using a delimiter pattern, which by default matches whitespace. The resulting tokens may then be converted into values of different types. More can be read at http://docs.oracle.com/javase/7/docs/api/java/util/Scanner.html
DATA INPUT STREAM:
Lets an application read primitive Java data types from an underlying input stream in a machine-independent way. An application uses a data output stream to write data that can later be read by a data input stream. DataInputStream is not necessarily safe for multithreaded access. Thread safety is optional and is the responsibility of users of methods in this class. More can be read at http://docs.oracle.com/javase/7/docs/api/java/io/DataInputStream.html
BufferedReader:
Reads text from a character-input stream, buffering characters so as to provide for the efficient reading of characters, arrays, and lines. The buffer size may be specified, or the default size may be used. The default is large enough for most purposes. In general, each read request made of a Reader causes a corresponding read request to be made of the underlying character or byte stream. It is therefore advisable to wrap a BufferedReader around any Reader whose read() operations may be costly, such as FileReaders and InputStreamReaders. For example,
BufferedReader in = new BufferedReader(new FileReader("foo.in"));
will buffer the input from the specified file. Without buffering, each invocation of read() or readLine() could cause bytes to be read from the file, converted into characters, and then returned, which can be very inefficient. Programs that use DataInputStreams for textual input can be localized by replacing each DataInputStream with an appropriate BufferedReader. More details are at http://docs.oracle.com/javase/7/docs/api/java/io/BufferedReader.html
NOTE: This approach is outdated, as Boris points out in his comment. I will leave it here for history, but you should use the methods available in the JDK.
It depends on what kind of operation you are doing and the size of the file you are reading.
In most cases, I recommend using commons-io for small files.
byte[] data = FileUtils.readFileToByteArray(new File("myfile"));
You can read it as a string or a character array...
Now, if you are handling big files, or changing parts of a file directly on the filesystem, then the best option is to use a RandomAccessFile, and potentially even a FileChannel to do "nio"-style I/O; a minimal sketch follows.
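For example, an in-place overwrite with RandomAccessFile (the offset and replacement bytes are hypothetical; note you can only overwrite existing bytes, not insert new ones):

try (RandomAccessFile raf = new RandomAccessFile("myfile", "rw")) {
    raf.seek(1024);                                         // jump to the byte offset to patch
    raf.write("NEW DATA".getBytes(StandardCharsets.UTF_8)); // overwrites exactly these bytes
}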
Using BufferedReader
BufferedReader reader;
char[] buffer = new char[10];
reader = new BufferedReader(new FileReader("FILE_PATH"));
// or
reader = Files.newBufferedReader(Paths.get("FILE_PATH"));

int read;
while ((read = reader.read(buffer)) != -1) {
    // use only the chars actually read; the tail of the buffer may be stale
    System.out.print(new String(buffer, 0, read));
}
// or, line by line
String line;
while ((line = reader.readLine()) != null) {
    System.out.println(line);
}
reader.close();
Using FileInputStream – Read Binary Files as Bytes
InputStream fis;
byte[] buffer = new byte[10];
fis = new FileInputStream("FILE_PATH");
// or
fis = Files.newInputStream(Paths.get("FILE_PATH"));

int read;
while ((read = fis.read(buffer)) != -1) {
    System.out.print(new String(buffer, 0, read));
}
fis.close();
Using Files – Read Small File to List of Strings
List<String> allLines = Files.readAllLines(Paths.get("FILE_PATH"));
for (String line : allLines) {
    System.out.println(line);
}
Using Scanner – Read Text File as Iterator
Scanner scanner = new Scanner(new File("FILE_PATH"));
while (scanner.hasNextLine()) {
    System.out.println(scanner.nextLine());
}
scanner.close();
Using RandomAccessFile – Read Files in Read-Only Mode
RandomAccessFile file = new RandomAccessFile("FILE_PATH", "r");
String str;
while ((str = file.readLine()) != null) {
    System.out.println(str);
}
file.close();
Using Files.lines – Read Lines as a Stream
Stream<String> lines = Files.lines(Paths.get("FILE_PATH"));
lines.forEach(s -> System.out.println(s));
Using FileChannel – Higher Performance via Off-Heap Memory (a MappedByteBuffer can improve this further)
FileInputStream fis = new FileInputStream("FILE_PATH");
FileChannel channel = fis.getChannel();
ByteBuffer buffer = ByteBuffer.allocateDirect(16 * 1024);
while (channel.read(buffer) != -1) {
    buffer.flip();
    while (buffer.hasRemaining()) {
        // note: casting byte to char like this only works for single-byte encodings
        System.out.print((char) buffer.get());
    }
    buffer.clear();
}
fis.close();
My code reads through an XML file encoded with UTF-8 until a specified string has been found. It finds the specified string fine, but I wish to write at this point in the file.
I would much prefer to do this through a stream as only small tasks need to be done.
I cannot find a way to do this. Any alternative methods are welcome.
Code so far:
final String RESOURCE = "/path/to/file.xml";
BufferedReader in = new BufferedReader(new InputStreamReader(ClassLoader.class.getResourceAsStream(RESOURCE), "UTF-8"));
BufferedWriter out = new BufferedWriter(new OutputStreamWriter(new FileOutputStream(ClassLoader.class.getResource(RESOURCE).getPath()),"UTF-8"));
String fileLine = in.readLine();
while (fileLine != null && !fileLine.contains("some string")) {
    fileLine = in.readLine();
}
// File writing code here
You can't really write into the middle of a file, except by overwriting existing bytes (using something like RandomAccessFile). That would only work, however, if what you needed to write was exactly the same byte length as what you were replacing, which I highly doubt.
Instead, you need to re-write the file to a new file, copying the input to the output and replacing the parts you need to replace in the process. There are a variety of ways you could do this. I would recommend using a StAX event reader and writer, as the StAX API is fairly user-friendly (compared to SAX) as well as fast and memory-efficient; a sketch follows.
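A minimal StAX sketch of that copy-and-replace approach, using the javax.xml.stream API (the file names and replacement text are hypothetical; exception handling omitted):

XMLEventReader reader = XMLInputFactory.newInstance()
        .createXMLEventReader(new FileInputStream("in.xml"));
XMLEventWriter writer = XMLOutputFactory.newInstance()
        .createXMLEventWriter(new FileOutputStream("out.xml"), "UTF-8");
XMLEventFactory events = XMLEventFactory.newInstance();

while (reader.hasNext()) {
    XMLEvent event = reader.nextEvent();
    // swap in new text where the target string occurs; copy everything else unchanged
    if (event.isCharacters() && event.asCharacters().getData().contains("some string")) {
        writer.add(events.createCharacters("replacement text"));
    } else {
        writer.add(event);
    }
}
writer.close();
reader.close();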
I have a program in which I have to load a PNG as a String and then save it again, but after I save it, it becomes unreadable. If I open both the loaded PNG and the saved String in an editor, I can see that Java created line breaks all over the file. If this is the problem, how can I avoid it?
public static void main(String[] args)
{
    try
    {
        File file1 = new File("C://andim//testFile.png");
        StringBuffer content = new StringBuffer();
        BufferedReader reader = null;
        reader = new BufferedReader(new FileReader(file1));
        String s = null;
        while ((s = reader.readLine()) != null)
        {
            content.append(s).append(System.getProperty("line.separator"));
        }
        reader.close();
        String loaded = content.toString();
        File file2 = new File("C://andim//testString.png");
        FileWriter filewriter = new FileWriter(file2);
        filewriter.write(loaded);
        filewriter.flush();
        filewriter.close();
    }
    catch (Exception exception)
    {
        exception.printStackTrace();
    }
}
I have program in which I have to load a PNG as a String and then save it again, but after I save it it becomes unreadable.
Yes, I'm not surprised. You're treating arbitrary binary data as if it's text data (in whatever your platform default encoding is, to boot). It's not. Don't do that. It's possible that in some encodings you'll get away with it - until you start trying to pass the string elsewhere in a way that strips unprintable characters etc.
If you must convert arbitrary binary data to text, use base64 or hex. If possible, avoid the conversion to text in the first place though. If you just want to copy a file, use InputStream and OutputStream - not Reader and Writer.
This is a big general point: keep data in its "native" representation as long as you possibly can. Only convert data to a different representation when you absolutely have to, and be very careful about it.
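If text really is required, a minimal Base64 sketch (java.util.Base64, Java 8+; the file names are placeholders):

byte[] bytes = Files.readAllBytes(Paths.get("testFile.png"));
String text = Base64.getEncoder().encodeToString(bytes); // printable, line-break-safe text
byte[] restored = Base64.getDecoder().decode(text);      // byte-for-byte identical to the input
Files.write(Paths.get("testString.png"), restored);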
Don't use text-based APIs to read binary files. In this case, you don't want a BufferedReader, and you certainly don't want readLine, which may well treat more than just one thing as a line separator. Use an InputStream (for instance, FileInputStream) and an OutputStream (for instance, FileOutputStream), not readers and writers.
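For instance, a minimal binary-safe copy sketch (file names are placeholders; java.nio.file.Files.copy is an even shorter alternative):

try (InputStream in = new FileInputStream("testFile.png");
        OutputStream out = new FileOutputStream("testCopy.png")) {
    byte[] buffer = new byte[8192];
    int n;
    while ((n = in.read(buffer)) != -1) {
        out.write(buffer, 0, n); // write exactly the bytes read; no text conversion anywhere
    }
}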
Don't do that
PNGs are not textual data.
If you try to read arbitrary bytes into a string, Java will mangle the bytes into actual text, corrupting the data you read.
You need to use byte[]s, not strings.
I'm reading a file line by line. The file is encoded by CipherOutputStream and then later compressed by DeflaterOutputStream. The file can consist of UTF-8 characters, like Russian letters, etc.
I want to obtain the offset in the actually read file, or the number of bytes read by the br.readLine() command. The problem is that the file is both encrypted and deflated, so the length of the read String is larger than the number of bytes read from the file.
InputStream fis=tempURL.openStream(); //in tempURL I've got an URL to download
CipherInputStream cis=new CipherInputStream(fis,pbeCipher); //CipherStream
InflaterInputStream iis=new InflaterInputStream(cis); //InflaterInputStream
BufferedReader br = new BufferedReader(
new InputStreamReader(iis, "UTF8")); //BufferedReader
br.readLine();
int fSize=tempURL.openConnection().getContentLength(); //Catch FileSize
Use a CountingInputStream from the Apache Commons IO project:
InputStream fis=tempURL.openStream();
CountingInputStream countStream = new CountingInputStream(fis);
CipherInputStream cis=new CipherInputStream(countStream,pbeCipher);
...
Later you can obtain the file position with countStream.getByteCount().
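Putting it together with the stream chain from the question (tempURL and pbeCipher as defined there):

InputStream fis = tempURL.openStream();
CountingInputStream countStream = new CountingInputStream(fis);
CipherInputStream cis = new CipherInputStream(countStream, pbeCipher);
InflaterInputStream iis = new InflaterInputStream(cis);
BufferedReader br = new BufferedReader(new InputStreamReader(iis, "UTF-8"));

br.readLine();
long consumed = countStream.getByteCount(); // compressed+encrypted bytes pulled from the URL so far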
For compressed files, you can find that a String doesn't correspond to a whole number of bytes, so the question cannot be answered exactly; e.g. a character can take less than a byte when compressed (otherwise there would be no point trying to compress it).
BTW: It is usually best to compress the data before encrypting it, as the result will usually be much more compact. Compressing the data after it has been encrypted will only help if its output is base64 or something similar. Compression works best when the contents are predictable (e.g. repeating sequences, common characters), whereas the purpose of encryption is to make the data appear unpredictable. A sketch of that ordering follows.
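To get that ordering with the stream classes from the question (pbeCipher as in the question; the payload is hypothetical), write to a DeflaterOutputStream that wraps a CipherOutputStream, so data is deflated first and then encrypted on its way to the file:

byte[] payload = "example data".getBytes("UTF-8");
try (OutputStream out = new DeflaterOutputStream(
        new CipherOutputStream(new FileOutputStream("data.enc"), pbeCipher))) {
    out.write(payload); // compressed first, then encrypted
}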