I know this is a simple question, and I could google for the answer, but your answers can give me different ideas to work with. I am trying to understand the new features introduced in Java 8. As part of that, I have written some code that reads the files in a directory and puts them into a List of InputStream. How can I simplify the following code with Java 8?
File[] gred_files = gred_directory.listFiles();
List<InputStream> gredInputStreamList = new ArrayList<InputStream>();
for (File gred_file : gred_files) {
    if (gred_file.isFile()) {
        InputStream gredStream = new FileInputStream(gred_file);
        if (gredStream != null) {
            gredInputStreamList.add(gredStream);
        }
    }
}
Help would be appreciated.
List<InputStream> gredInputStreamList = Arrays.stream(gred_directory.listFiles())
        .filter(File::isFile)
        .map(FileInputStream::new)
        .collect(Collectors.toList());
I am not sure about the part of .map(FileInputStream::new), however, as there is a checked exception. Maybe we'd need a helper method here which does
InputStream openStream(File file) {
    try {
        return new FileInputStream(file);
    } catch (IOException e) {
        return null;
    }
}
and you'd do .map(WhateverClassThisIsIn::openStream) here.
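Spelled out, that pipeline would be a sketch along these lines (WhateverClassThisIsIn is just a placeholder for whichever class holds the helper; the Objects::nonNull filter from java.util drops the files that failed to open):
List<InputStream> gredInputStreamList = Arrays.stream(gred_directory.listFiles())
        .filter(File::isFile)
        .map(WhateverClassThisIsIn::openStream)
        .filter(Objects::nonNull) // drop files that failed to open
        .collect(Collectors.toList());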
Or maybe even better
Optional<InputStream> openStream(File file) {
    try {
        return Optional.of(new FileInputStream(file));
    } catch (IOException e) {
        return Optional.empty();
    }
}
and
List<InputStream> gredInputStreamList = Arrays.stream(gred_directory.listFiles())
        .filter(File::isFile)
        .map(WhateverClassThisIsIn::openStream)
        .filter(Optional::isPresent)
        .map(Optional::get)
        .collect(Collectors.toList());
to avoid unnecessary null values. (Although, in such a tight loop, it won't matter.)
You can use
List<InputStream> list = Files.list(gred_directory.toPath())
        .filter(Files::isRegularFile)
        .map(path -> {
            try { return Files.newInputStream(path); }
            catch (IOException ex) { throw new UncheckedIOException(ex); }
        })
        .collect(Collectors.toList());
though, it might be worth using the NIO API in the first place instead of starting with a File.
But even more questionable is the desire to get a List<InputStream> in the first place. A List is no end in itself, so if you plan to read the files, you should do that directly in the Stream operation. Depending on the underlying system, there might be a limit on how many files can be open simultaneously. If you process and close each stream within the operations of a single Stream pipeline, each file will be processed completely before the next one is opened.
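For instance, here is a minimal sketch (counting the lines of all files, assuming they are text files and reusing the gred_directory from the question); flatMap closes each per-file stream after it has been consumed, so only one file is open at a time:
try (Stream<Path> paths = Files.list(gred_directory.toPath())) {
    long totalLines = paths
            .filter(Files::isRegularFile)
            .flatMap(path -> {
                try { return Files.lines(path); }
                catch (IOException ex) { throw new UncheckedIOException(ex); }
            })
            .count();
    System.out.println(totalLines);
}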
Depending on the actual operation, you might even skip dealing with InputStreams:
List<byte[]> list = Files.list(gred_directory.toPath())
        .filter(Files::isRegularFile)
        .map(path -> {
            try { return Files.readAllBytes(path); }
            catch (IOException ex) { throw new UncheckedIOException(ex); }
        })
        .collect(Collectors.toList());
You can do it like this:
List<InputStream> list = Arrays.stream(gred_files)
        .filter(File::isFile)
        .map(file -> {
            try {
                return new FileInputStream(file);
            } catch (FileNotFoundException e) {
                throw new RuntimeException(e);
            }
        })
        .collect(Collectors.toList());
For some reason my String is written partially by PrintWriter. As a result I am getting partial text in my file. Here's the method:
public void new_file_with_text(String text, String fname) {
    File f = null;
    try {
        f = new File(fname);
        f.createNewFile();
        System.out.println(text);
        PrintWriter out = new PrintWriter(f, "UTF-8");
        out.print(text);
    } catch (IOException e) {
        e.printStackTrace();
    }
}
When I print the text to the console, I can see that the data is all there, nothing is lost, but apparently part of the text is lost when PrintWriter does its job... I am clueless.
You should always close (Writer#close) your streams before you discard them. This frees some rather expensive system resources that your JVM must acquire when opening a file on the file system. If you do not want to close your stream yet, you can use Writer#flush, which makes your changes visible on the file system without closing the stream. When a stream is closed, all its data is flushed implicitly.
Streams buffer data in order to write to the file system only when there is enough data to be worth writing. The stream flushes its buffer automatically every now and then, whenever it considers the data worth writing. Writing to the file system is an expensive operation (it costs time and system resources) and should therefore only be done when really necessary. Consequently, you need to flush the stream's buffer manually if you want an immediate write.
In general, make sure that you always close streams, since they hold quite some system resources. Java has mechanisms for closing streams on garbage collection, but these should only be seen as a last resort, since streams can live for quite some time before they are actually garbage collected. Therefore, always use try {} finally {} to ensure that streams get closed, even when an exception is thrown after the stream was opened. If you do not pay attention to this, you will eventually end up with an IOException signaling that you have opened too many files.
You want to change your code like this:
public void new_file_with_text(String text, String fname) {
    File f = null;
    try {
        f = new File(fname);
        f.createNewFile();
        System.out.println(text);
        PrintWriter out = new PrintWriter(f, "UTF-8");
        try {
            out.print(text);
        } finally {
            out.close();
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Try to use out.flush(); right after the line out.print(text);
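A minimal sketch of where the flush would go; note that, as the other answers point out, the writer should still be closed eventually:
PrintWriter out = new PrintWriter(f, "UTF-8");
out.print(text);
out.flush(); // push the buffered text to the file immediately
out.close(); // release the file handle when done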
Here is a proper way to write to a file:
public void new_file_with_text(String text, String fname) {
    try (FileWriter f = new FileWriter(fname)) {
        f.write(text);
        f.flush();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
I tested your code. You forgot to close the PrintWriter object, i.e. out.close():
try {
    f = new File(fname);
    f.createNewFile();
    System.out.println(text);
    PrintWriter out = new PrintWriter(f, "UTF-8");
    out.print(text);
    out.close(); // <--------------
} catch (IOException e) {
    System.out.println(e);
}
You must always close your streams (which will also flush them), in a finally block, or using the Java 7 try-with-resources facility:
PrintWriter out = null;
try {
    ...
} finally {
    if (out != null) {
        out.close();
    }
}
or
try (PrintWriter out = new PrintWriter(...)) {
    ...
}
If you don't close your streams, not only won't everything be flushed to the file, but at some point your OS will run out of available file descriptors.
You should close your file:
PrintWriter out = new PrintWriter(f, "UTF-8");
try {
    out.print(text);
} finally {
    try {
        out.close();
    } catch (Throwable t) {
        t.printStackTrace();
    }
}
When we concatenate streams using Stream.concat, is it possible to run a function whenever one of the concatenated streams is exhausted?
e.g.
I'm creating streams out of multiple files using Files.lines. Now whenever a file is read completely, I need to delete it.
The close handlers of a stream composed via Stream.concat are executed when the resulting stream is closed. Note that close handlers in general require that the code using the stream closes the stream, e.g.
try (Stream<String> s = Stream.concat(Files.lines(path1), Files.lines(path2))) {
    s.forEach(System.out::println);
}
for proper closing and
try (Stream<String> s = Stream.concat(
        Files.lines(path1).onClose(() -> {
            try { Files.delete(path1); }
            catch (IOException ex) { throw new UncheckedIOException(ex); }
        }),
        Files.lines(path2).onClose(() -> {
            try { Files.delete(path2); }
            catch (IOException ex) { throw new UncheckedIOException(ex); }
        }))) {
    s.forEach(System.out::println);
}
for deleting the files afterwards. But in this case, the resulting stream's close handler will invoke the source streams' close handlers, so this doesn't delete the files immediately after use, but only after the entire operation, making it not much different from
try (Closeable c1 = () -> Files.deleteIfExists(path1);
     Closeable c2 = () -> Files.deleteIfExists(path2);
     Stream<String> s = Stream.concat(Files.lines(path1), Files.lines(path2))) {
    s.forEach(System.out::println);
}
If you want a timely deletion of the files, you have to use flatMap. The sub-streams will be closed immediately after use, regardless of whether the “outer” stream will be closed:
Stream.of(path1, path2)
      .flatMap(path -> {
          try {
              return Files.lines(path).onClose(() -> {
                  try { Files.delete(path); }
                  catch (IOException ex) { throw new UncheckedIOException(ex); }
              });
          } catch (IOException ex) {
              throw new UncheckedIOException(ex);
          }
      })
      .forEach(System.out::println);
To demonstrate the difference,
try (Stream<String> s = Stream.concat(
        Stream.of("foo").onClose(() -> System.out.println("foo closed")),
        Stream.of("bar").onClose(() -> System.out.println("bar closed")))) {
    s.forEach(System.out::println);
}
will print
foo
bar
foo closed
bar closed
whereas
Stream.of("foo", "bar")
.flatMap(x -> Stream.of(x).onClose(()->System.out.println(x+" closed")) )
.forEach(System.out::println);
will print
foo
foo closed
bar
bar closed
Did you try using Stream.onClose()? Like Stream.concat(...).onClose(...).
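A sketch of that suggestion; note that, as explained in the answer above, a handler attached to the concatenated stream only runs when the whole stream is closed, not when the first source is exhausted:
try (Stream<String> s = Stream.concat(Files.lines(path1), Files.lines(path2))
        .onClose(() -> System.out.println("concatenated stream closed"))) {
    s.forEach(System.out::println);
}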
I am not able to wrap a stream operation in a try/catch block. I tried this:
reponseNodes.stream().parallel().collect(Collectors.toMap(responseNode -> responseNode.getLabel(), responseNode -> processImage(responseNode)));
Eclipse started complaining, underlining processImage(responseNode), and suggested surrounding it with try/catch.
Then I updated to:
return reponseNodes.stream().parallel().collect(Collectors.toMap(responseNode -> responseNode.getLabel(), responseNode -> try { processImage(responseNode) } catch (Exception e) { throw new UncheckedException(e); }));
The updated code did not work either.
Because the lambda is no longer a single expression, its body must be wrapped in {}, each statement (including processImage(responseNode)) must be terminated with a ;, and the lambda requires an explicit return statement (return processImage(responseNode)).
Thus:
return reponseNodes.stream().parallel()
        .collect(Collectors.toMap(responseNode -> responseNode.getLabel(), responseNode -> {
            try {
                return processImage(responseNode);
            } catch (Exception e) {
                throw new UncheckedException(e);
            }
        }));
There is no direct way to handle checked exceptions in lambdas; the only option I came up with is to move the logic to another method that handles them with try-catch.
For example,
List<FileReader> fr = Arrays.asList("a.txt", "b.txt", "c.txt").stream()
        .map(a -> createFileReader(a))
        .collect(Collectors.toList());
//....

private static FileReader createFileReader(String file) {
    FileReader fr = null;
    try {
        fr = new FileReader(file);
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    }
    return fr;
}
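If returning null on failure is acceptable, the nulls can also be dropped in the pipeline itself; a small variant of the code above, using Objects::nonNull from java.util:
List<FileReader> readers = Arrays.asList("a.txt", "b.txt", "c.txt").stream()
        .map(a -> createFileReader(a))
        .filter(Objects::nonNull) // skip files that failed to open
        .collect(Collectors.toList());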
I have two file-to-string processes in my app (one actually deals with an asset file).
If I repeat either of these processes a few times on the same file, I get OutOfMemoryErrors.
I suspect it might be because I'm not closing the streams properly, which may cause multiple streams to pile up and the app to run out of memory.
Here is the code of the two processes:
My asset-file-to-string process. As you can see, I have something in place to close the stream, but I don't know if it's done properly.
try {
    myVeryLargeString = IOUtils.toString(getAssets().open(myAssetsFilePath), "UTF-8");
    IOUtils.closeQuietly(getAssets().open(myAssetsFilePath));
} catch (IOException e) {
    e.printStackTrace();
} catch (OutOfMemoryError e) {
    Log.e(TAG, "Ran out of memory 01");
}
My file-to-string process.
I have no idea how to close this stream (if there is even a stream to close at all).
myFile01 = new File(myFilePath);
try {
    myVeryLargeString = FileUtils.readFileToString(myFile01, "UTF-8");
} catch (IOException e) {
    e.printStackTrace();
} catch (OutOfMemoryError e) {
    Log.e(TAG, "Ran out of memory 02");
}
It's difficult to say what may cause the OOME, but closing should be done like this:
InputStream is = getAssets().open(myAssetsFilePath);
try {
    myVeryLargeString = IOUtils.toString(is, "UTF-8");
} finally {
    IOUtils.closeQuietly(is);
}
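If your platform supports Java 7's try-with-resources (on Android, API level 19 and above), an equivalent sketch without the explicit finally block:
try (InputStream is = getAssets().open(myAssetsFilePath)) {
    myVeryLargeString = IOUtils.toString(is, "UTF-8");
} catch (IOException e) {
    e.printStackTrace();
}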
I am writing to a file using this code.
protected void writeFile(String text) {
    DataOutputStream os = null;
    FileConnection fconn = null;
    try {
        fconn = (FileConnection) Connector.open("file:///store/home/user/documents/file.txt", Connector.READ_WRITE);
        if (!fconn.exists())
            fconn.create();
        os = fconn.openDataOutputStream();
        os.write(text.getBytes());
    } catch (IOException e) {
        System.out.println(e.getMessage());
    } finally {
        try {
            if (null != os)
                os.close();
            if (null != fconn)
                fconn.close();
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}
The code is working fine. My problem: the first time, I write "Banglore", and when I read it back, I get "Banglore". But the second time, I write "India", and when I read it back, I get "Indialore". So basically the content is not being replaced according to the text I am giving. Please tell me how to fix this.
Writing to a file doesn't remove the existing content; it just overwrites it from the start. So writing 'India' over 'Bangalore' replaces 'Banga' with 'India' and leaves the rest untouched, giving 'Indialore'. If you want to completely replace the old content with the new, you need to truncate() the file at the offset where the new data ends: truncate(text.getBytes().length).
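Applied to the writeFile method above, a minimal sketch; the flush ensures the buffered bytes are written before the file is cut off at the new length:
os = fconn.openDataOutputStream();
byte[] data = text.getBytes();
os.write(data);
os.flush();
// remove any leftover bytes from a previous, longer write
fconn.truncate(data.length);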