This question already has answers here:
Java 8 Lambda function that throws exception?
(27 answers)
Closed 7 years ago.
I'm not able to wrap a stream operation in a try/catch block.
I tried the following:
reponseNodes.stream().parallel()
    .collect(Collectors.toMap(
        responseNode -> responseNode.getLabel(),
        responseNode -> processImage(responseNode)));
Eclipse complained, underlining processImage(responseNode), and suggested that I surround it with try/catch.
Then I updated to:
return reponseNodes.stream().parallel().collect(Collectors.toMap(responseNode -> responseNode.getLabel(), responseNode -> try { processImage(responseNode) } catch (Exception e) { throw new UncheckedException(e); }));
The updated code did not work either.
Because the lambda body is no longer a single expression, it must be wrapped in {}, each statement (including processImage(responseNode)) must be terminated with a ;, and the lambda needs an explicit return statement (return processImage(responseNode)).
Thus:
return reponseNodes.stream().parallel()
.collect(Collectors.toMap(responseNode -> responseNode.getLabel(), responseNode -> {
try {
return processImage(responseNode);
} catch (Exception e) {
throw new UncheckedException(e);
}
}));
There is no direct way to handle checked exceptions in lambdas; the only option I came up with is to move the logic into another method that handles the exception with try/catch.
For example,
List<FileReader> fr = Arrays.asList("a.txt", "b.txt", "c.txt").stream()
.map(a -> createFileReader(a)).collect(Collectors.toList());
//....
private static FileReader createFileReader(String file) {
FileReader fr = null;
try {
fr = new FileReader(file);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
return fr;
}
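Another option (only a sketch; the names ThrowingFunction and Unchecked below are made up for illustration) is a small adapter that turns a lambda throwing a checked exception into a plain Function by rethrowing the exception as an unchecked one:

import java.util.function.Function;

@FunctionalInterface
interface ThrowingFunction<T, R> {
    R apply(T t) throws Exception;
}

class Unchecked {
    static <T, R> Function<T, R> wrap(ThrowingFunction<T, R> f) {
        return t -> {
            try {
                return f.apply(t);             // delegate to the checked-exception lambda
            } catch (Exception e) {
                throw new RuntimeException(e); // rethrow as unchecked
            }
        };
    }
}

With that in place the stream stays a one-liner: .map(Unchecked.wrap(FileReader::new)).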
Related
This question already has answers here:
What's the purpose of try-with-resources statements?
(7 answers)
Closed 8 months ago.
I noticed that if I don't call myBufferedWriter.close(), my content will not appear in the target file. What if the program ends accidentally before reaching myBufferedWriter.close()? How can I avoid losing data that is already in the buffer but not yet written to the file?
Edit:
I have found the simple use case of try-with-resources, but my code looks like the following:
public class myClass{
Map<String, BufferedWriter> writerMap = new HashMap<>();
public void write(···){
//call this.create() here
···
//normally, the writer will close here
}
public void create(···){
//BufferedWriter is created here, and saved into writerMap
···
}
}
Where is the best place to use the try-with-resources statement?
You can handle it with a try-catch-finally statement.
Something like:
try {
// Do things
} catch (Exception e) {
e.printStackTrace();
} finally {
if (myBufferedWriter != null) {
try {
myBufferedWriter.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
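If you also need the buffered data to reach the file before the writer is eventually closed, you can flush the writer explicitly (myBufferedWriter and content are placeholders for your own names):

myBufferedWriter.write(content); // content stands for whatever you are writing
myBufferedWriter.flush();        // pushes the buffered data to the file immediately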
Your best bet is The try-with-resources Statement.
The try-with-resources statement ensures that each resource is closed at the end of the statement.
[...] the try-with-resources statement contains [...] declarations that are separated by a semicolon[...]. When the block of code that directly follows it terminates, either normally or because of an exception, the close methods of the [...] objects are automatically called in this order. Note that the close methods of resources are called in the opposite order of their creation.
public static void writeToFileZipFileContents(String zipFileName,
String outputFileName)
throws java.io.IOException {
java.nio.charset.Charset charset =
java.nio.charset.StandardCharsets.US_ASCII;
java.nio.file.Path outputFilePath =
java.nio.file.Paths.get(outputFileName);
// Open zip file and create output file with
// try-with-resources statement
try (
java.util.zip.ZipFile zf =
new java.util.zip.ZipFile(zipFileName);
java.io.BufferedWriter writer =
java.nio.file.Files.newBufferedWriter(outputFilePath, charset)
) {
// Enumerate each entry
for (java.util.Enumeration entries =
zf.entries(); entries.hasMoreElements();) {
// Get the entry name and write it to the output file
String newLine = System.getProperty("line.separator");
String zipEntryName =
((java.util.zip.ZipEntry)entries.nextElement()).getName() +
newLine;
writer.write(zipEntryName, 0, zipEntryName.length());
}
}
}
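Applied to the class in your question (just a sketch, assuming the writers have to outlive a single write(···) call): one option is to make the owning class itself implement AutoCloseable and close every writer from the map there, so callers can put the whole object into a try-with-resources block:

public class myClass implements AutoCloseable {
    Map<String, BufferedWriter> writerMap = new HashMap<>();

    // write(···) and create(···) as in the question

    @Override
    public void close() throws IOException {
        // close every writer that was created and stored in the map
        for (BufferedWriter w : writerMap.values()) {
            w.close();
        }
    }
}

Callers then write try (myClass m = new myClass()) { m.write(···); } and all writers in the map are closed when the block ends (note that if one close() fails, the remaining writers are skipped; collect the exceptions if that matters to you).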
I know this is a simple question, and I could google for the answer, but your answers may give me different ideas to work with. I am trying to understand the new features introduced in Java 8. As part of that, I have written some code to read the files in a directory and put them into a List of InputStream. How can I simplify the following code with Java 8?
File[] gred_files = gred_directory.listFiles();
List<InputStream> gredInputStreamList = new ArrayList<InputStream>();
for(File gred_file: gred_files) {
if(gred_file.isFile()) {
InputStream gredStream = new FileInputStream(gred_file);
if (gredStream != null)
{
gredInputStreamList.add(gredStream);
}
}
}
Help would be appreciated.
List<InputStream> gredInputStreamList = Arrays.stream(gred_directory.listFiles())
.filter(File::isFile)
.map(FileInputStream::new)
.collect(Collectors.toList());
I am not sure about the .map(FileInputStream::new) part, however, as the constructor throws a checked exception. Maybe we'd need a helper method here which does
InputStream openStream(File file) {
try {
return new FileInputStream(file);
} catch (IOException e) {
return null;
}
}
and you'd do .map(WhateverClassThisIsIn::openStream) here.
Or maybe even better
Optional<InputStream> openStream(File file) {
try {
return Optional.of(new FileInputStream(file));
} catch (IOException e) {
return Optional.empty();
}
}
and
List<InputStream> gredInputStreamList = Arrays.stream(gred_directory.listFiles())
.filter(File::isFile)
.map(WhateverClassThisIsIn::openStream)
.filter(Optional::isPresent)
.map(Optional::get)
.collect(Collectors.toList());
to avoid unnecessary null values. (Although, in such a tight loop, it won't matter.)
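As a side note, assuming Java 9 or newer: Optional.stream() lets the filter/map pair collapse into a single flatMap step:

List<InputStream> gredInputStreamList = Arrays.stream(gred_directory.listFiles())
    .filter(File::isFile)
    .map(WhateverClassThisIsIn::openStream)
    .flatMap(Optional::stream)      // Java 9+: empty Optionals are simply dropped
    .collect(Collectors.toList());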
You can use
List<InputStream> list = Files.list(gred_directory.toPath())
.filter(Files::isRegularFile)
.map(path -> {
try { return Files.newInputStream(path); }
catch(IOException ex) { throw new UncheckedIOException(ex); }
})
.collect(Collectors.toList());
though, it might be worth using the NIO API in the first place instead of starting with a File.
But even more questionable is the desire to get a List<InputStream> in the first place. A List is not an end in itself, so if you plan to read the files, you should do that directly in the Stream operation. Depending on the underlying system, there might be a limit on how many files can be open simultaneously. If you process and close the streams in subsequent operations of a single Stream pipeline, each file will be processed completely before the next one is opened.
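For example, a minimal sketch that reads each file line by line inside the pipeline; flatMap closes each per-file stream once its contents have been consumed, so only one file is open at a time (the println is a stand-in for the real per-line processing):

try (Stream<Path> files = Files.list(gred_directory.toPath())) {
    files.filter(Files::isRegularFile)
         .flatMap(path -> {
             try {
                 return Files.lines(path); // closed by flatMap after this file is consumed
             } catch (IOException ex) {
                 throw new UncheckedIOException(ex);
             }
         })
         .forEach(System.out::println);
}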
Depending on the actual operation, you might even skip dealing with InputStreams:
List<byte[]> list = Files.list(gred_directory.toPath())
.filter(Files::isRegularFile)
.map(path -> {
try { return Files.readAllBytes(path); }
catch(IOException ex) { throw new UncheckedIOException(ex); }
})
.collect(Collectors.toList());
You can do it like this.
Arrays.stream(gred_files)
.filter(File::isFile).map(file -> {
try {
return new FileInputStream(file);
} catch (FileNotFoundException e) {
throw new RuntimeException(e);
}
})
.collect(Collectors.toList());
When we concatenate streams using Stream.concat, is it possible to run a function whenever one of the underlying streams has been read completely?
e.g.
I'm creating streams out of multiple files using Files.lines. Now whenever a file is read completely, I need to delete it.
The close handlers of a stream composed via Stream.concat are executed when the resulting stream is closed. Note that close handlers in general require that the code using the stream closes the stream, e.g.
try(Stream<String> s=Stream.concat(Files.lines(path1), Files.lines(path2))) {
s.forEach(System.out::println);
}
for proper closing and
try(Stream<String> s=Stream.concat(
Files.lines(path1).onClose(()->{
try { Files.delete(path1); }
catch (IOException ex) { throw new UncheckedIOException(ex); }
}),
Files.lines(path2).onClose(()->{
try { Files.delete(path2); }
catch (IOException ex) { throw new UncheckedIOException(ex); }
}))
) {
s.forEach(System.out::println);
}
for deleting the files afterwards. But in this case, the resulting stream’s close handler will invoke the source stream’s close handlers, so this doesn’t delete the files immediately after use, but after the entire operation, so it’s not much different to
try(Closeable c1=() -> Files.deleteIfExists(path1);
Closeable c2=() -> Files.deleteIfExists(path2);
Stream<String> s=Stream.concat(Files.lines(path1), Files.lines(path2)); ) {
s.forEach(System.out::println);
}
If you want a timely deletion of the files, you have to use flatMap. The sub-streams will be closed immediately after use, regardless of whether the “outer” stream will be closed:
Stream.of(path1, path2)
.flatMap(path -> {
try { return Files.lines(path).onClose(()->{
try { Files.delete(path); }
catch (IOException ex) { throw new UncheckedIOException(ex); }
}); }
catch (IOException ex) { throw new UncheckedIOException(ex); }
})
.forEach(System.out::println);
To demonstrate the difference,
try(Stream<String> s=Stream.concat(
Stream.of("foo").onClose(()->System.out.println("foo closed")),
Stream.of("bar").onClose(()->System.out.println("bar closed")) )) {
s.forEach(System.out::println);
}
will print
foo
bar
foo closed
bar closed
whereas
Stream.of("foo", "bar")
.flatMap(x -> Stream.of(x).onClose(()->System.out.println(x+" closed")) )
.forEach(System.out::println);
will print
foo
foo closed
bar
bar closed
Did you try using Stream.onClose()? Like Stream.concat(...).onClose(...).
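For example, a minimal sketch (path1 and path2 as in the answer above); note that the handler only runs when the concatenated stream itself is closed, not when the individual source streams are exhausted:

try (Stream<String> s = Stream.concat(Files.lines(path1), Files.lines(path2))
        .onClose(() -> System.out.println("concatenated stream closed"))) {
    s.forEach(System.out::println);
}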
As far as I know, we use try/catch as follows:
try {
//Some code that may generate exception
}
catch(Exception ex) {
    //handle exception
}
finally {
//close any open resources etc.
}
But in a code I found following
try(
ByteArrayOutputStream byteArrayStreamResponse = new ByteArrayOutputStream();
HSLFSlideShow pptSlideShow = new HSLFSlideShow(
new HSLFSlideShowImpl(
Thread.currentThread().getContextClassLoader()
.getResourceAsStream(Constants.PPT_TEMPLATE_FILE_NAME)
));
){
}
catch (Exception ex) {
//handle exception
}
finally {
//close any open resource
}
I am not able to understand why these parentheses () appear right after try.
What are they used for? Is this new in Java 1.7? What kind of syntax can I write there?
Please also point me to some API documentation.
It is the try-with-resources syntax, which is new in Java 1.7. It is used to declare resources that will be closed automatically. Here is the link to the official documentation.
https://docs.oracle.com/javase/tutorial/essential/exceptions/tryResourceClose.html
static String readFirstLineFromFile(String path) throws IOException {
try (BufferedReader br =
new BufferedReader(new FileReader(path))) {
return br.readLine();
}
}
In this example, the resource declared in the try-with-resources statement is a BufferedReader. The declaration statement appears within parentheses immediately after the try keyword. The class BufferedReader, in Java SE 7 and later, implements the interface java.lang.AutoCloseable. Because the BufferedReader instance is declared in a try-with-resource statement, it will be closed regardless of whether the try statement completes normally or abruptly (as a result of the method BufferedReader.readLine throwing an IOException).
This question already has answers here:
Why is declaration required in Java's try-with-resource
(3 answers)
Closed 9 years ago.
try(PrintWriter f = new PrintWriter(new BufferedWriter(new FileWriter("abc.txt")));)
{}
catch(IOException ex)
{
ex.printStackTrace();
}
Above works fine. But when I do
PrintWriter f;
try(f = new PrintWriter(new BufferedWriter(new FileWriter("abc.txt")));)
{}
catch(IOException ex)
{
ex.printStackTrace();
}
It throws compile errors. Why is that? I was testing this new feature: my idea was to use the second form and, after the try-catch statement, print the resource PrintWriter f, which I expected to be null if the try-with-resources statement works as expected. Why is the second way not allowed?
Also, how could I test this with the first method?
Because try-with-resources actually adds the finally block for you in order to close the resources after use, they are not meant to be usable after you leave the try block anyway.
So this code
try(PrintWriter f = new PrintWriter(new BufferedWriter(new FileWriter("abc.txt")));) {
} catch(IOException ex) {
ex.printStackTrace();
}
actually translates into
PrintWriter f = null;
try {
    f = new PrintWriter(new BufferedWriter(new FileWriter("abc.txt")));
    // now do something
} catch(IOException ex) {
    ex.printStackTrace();
} finally {
    try {
        if (f != null) f.close();
    } catch (Exception ex) {
        // ignore exceptions thrown while closing
    }
}
So this was the original purpose: to save you from the bloated code and let you care only about the try block, leaving the rest to the compiler. Also see what the Oracle docs have to say about this.
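A side note not captured by the hand-written finally above: if both the try block and close() throw, try-with-resources keeps the exception from the try block as the primary one and attaches the exception from close() as a suppressed exception:

try {
    try (AutoCloseable r = () -> { throw new IllegalStateException("from close"); }) {
        throw new RuntimeException("from body");
    }
} catch (Exception e) {
    System.out.println(e.getMessage());                    // from body
    System.out.println(e.getSuppressed()[0].getMessage()); // from close
}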
The code below, I believe, answers your question, with an unexpected result.
PrintWriter t = null;
try( PrintWriter f = new PrintWriter( new BufferedWriter(
new FileWriter( "abc.txt" ) ) ) ) {
f.println( "bar" );
t = f;
} catch( IOException ex ) {
ex.printStackTrace();
}
System.out.println( t );
t.println( "foo" );
t.close();
Output:
java.io.PrintWriter#1fc4bec
But the later println("foo") adds nothing to the file, as the writer was already closed by the try.
Edit: If you want to play with TWR, write a class that implements AutoCloseable, for example:
public class Door implements AutoCloseable {
public Door() {
System.out.println( "I'm opening" );
}
public void close() {
System.out.println( "I'm closing" );
}
public static void main( String[] args ) {
try( Door door = new Door() ) { }
}
}
Output:
I'm opening
I'm closing
I'm not perfectly sure, but here are some educated guesses:
The value of f after the catch block is potentially undefined. Therefore you would have to add all kinds of checks to verify whether the Object has been created, used, and/or is closed. But if you need all those checks, it would be simpler to not use that idiom in the first place.
The JIT can happily optimize code with a block-local variable.
The AutoCloseable variable must not be reassigned inside the try block, but could be afterwards. Maybe that's just too complicated for the compiler to check.
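As a side note (a sketch, assuming Java 9 or newer; the method name writeIt is made up): the restriction has since been relaxed, and a final or effectively final variable declared before the try can be used directly in the resource list:

void writeIt() throws IOException {
    PrintWriter f = new PrintWriter(new BufferedWriter(new FileWriter("abc.txt")));
    try (f) {            // Java 9+: f must be final or effectively final
        f.println("bar");
    }                    // f stays in scope here, but is already closed
}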