As far as I can tell I am always calling FileReader#close() and FileWriter#close(), but some of my files remain locked by my own code.
How do I
1) close a file fully?
2) find out where in the code it was opened and not closed?
The question is vague and missing context, which makes it difficult to answer and encourages assumptions, never a good place to start from...
However, if you are doing something similar to...
try {
    FileReader fr = new FileReader(new File("..."));
    // Read file...
    fr.close();
} catch (IOException exp) {
    exp.printStackTrace();
}
Then if an exception occurs for some reason (or the code returns before it reaches the close statement), close will never be called...
Prior to Java 7, one would typically do something like...
FileReader fr = null;
try {
    fr = new FileReader(new File("..."));
    // Read file...
} catch (IOException exp) {
    exp.printStackTrace();
} finally {
    try {
        // Avoid NullPointerExceptions
        if (fr != null) {
            fr.close();
        }
    } catch (Exception exp) {
    }
}
This ensures that regardless of what happens within the try block, finally will always be called and you can take steps to ensure that the resource is closed.
With Java 7, you can now take advantage of the "try-with-resources" feature...
try (FileReader fr = new FileReader(new File("..."))) {
    // Read file...
} catch (IOException exp) {
    exp.printStackTrace();
}
Which is basically a shorthand version of the try-catch-finally example above.
If you are using the FileLock functionality, then you also need to ensure that you release the FileLock when you are done with it, in a similar fashion to the try-catch-finally example above. Bear in mind that file locking only ensures that different processes can't read/write the file simultaneously; it doesn't protect you against multi-threaded access...
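Purely as an illustrative sketch (the RandomAccessFile/FileChannel setup here is my own example, not from the question), releasing a FileLock with the same try-finally discipline might look like this:
RandomAccessFile raf = null;
FileLock lock = null;
try {
    raf = new RandomAccessFile(new File("..."), "rw");
    FileChannel channel = raf.getChannel();
    lock = channel.lock(); // blocks until the lock can be acquired
    // Read/write the file while holding the lock...
} catch (IOException exp) {
    exp.printStackTrace();
} finally {
    try {
        if (lock != null) {
            lock.release(); // always release the lock
        }
        if (raf != null) {
            raf.close();    // closing the file also releases any locks held on it
        }
    } catch (IOException exp) {
    }
}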
Firstly, I know I should be using try-with-resources; however, I don't currently have the most up-to-date JDK on my system.
I have the following code below and am trying to ensure the resource reader is closed using the finally block. However, the code below doesn't compile for two reasons: firstly, reader may not have been initialized, and secondly, close() should be caught within its own try-catch. Don't both of these reasons defeat the purpose of the initial try-catch block?
I can solve the issue with the close() statement in the finally block by putting it in its own try-catch. However, this still leaves the compile error about reader not being initialized.
I'm presuming I have gone wrong somewhere? Help appreciated!
Cheers,
public Path [] getPaths()
{
    // Create and initialise ArrayList for paths to be stored in when read
    // from file.
    ArrayList<Path> pathList = new ArrayList();
    BufferedReader reader;
    try
    {
        // Create new buffered reader to read lines from file
        reader = Files.newBufferedReader(importPathFile);
        String line = null;
        int i = 0;
        // for each line from the file, add to the array list
        while((line = reader.readLine()) != null)
        {
            pathList.add(0, Paths.get(line));
            i++;
        }
    }
    catch(IOException e)
    {
        System.out.println("exception: " + e.getMessage());
    }
    finally
    {
        reader.close();
    }
    // Move contents from ArrayList into Path [] and return it.
    Path pathArray [] = new Path[(pathList.size())];
    for(int i = 0; i < pathList.size(); i++)
    {
        pathArray[i] = Paths.get(pathList.get(i).toString());
    }
    return pathArray;
}
There is no other way than to initialize your reader to null and catch the exception. The compiler is always right.
BufferedReader reader = null;
try {
    // do stuff
} catch(IOException e) {
    // handle
} finally {
    if(reader != null) {
        try {
            reader.close();
        } catch(IOException e1) {
            // handle or forget about it
        }
    }
}
The close method will always need a try-catch block, since it declares that it could throw an IOException. It doesn't matter whether the call is in a finally block or somewhere else; it just needs to be handled, because it is a checked exception.
The reader must also be initialized, even if only to null. IMHO this is pretty pointless, but that's Java. That is how it works.
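Putting both fixes together, the getPaths() method from the question might look something like the sketch below (it assumes the same imports and the importPathFile field from the question, keeps the insert-at-index-0 behaviour of the original, and drops the unused counter i):
public Path[] getPaths()
{
    // Create and initialise ArrayList for paths to be stored in when read from file.
    ArrayList<Path> pathList = new ArrayList<Path>();
    BufferedReader reader = null; // initialise to null so the compiler accepts the finally block
    try
    {
        // Create new buffered reader to read lines from file
        reader = Files.newBufferedReader(importPathFile);
        String line;
        // For each line from the file, add to the array list
        while ((line = reader.readLine()) != null)
        {
            pathList.add(0, Paths.get(line)); // inserts at the front, as in the original
        }
    }
    catch (IOException e)
    {
        System.out.println("exception: " + e.getMessage());
    }
    finally
    {
        if (reader != null)
        {
            try
            {
                reader.close();
            }
            catch (IOException e)
            {
                // nothing useful can be done here
            }
        }
    }
    // Move contents from ArrayList into Path[] and return.
    return pathList.toArray(new Path[pathList.size()]);
}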
Instead, check whether reader is null and only then close it, as below (you should call close() on reader only if it is not null, i.e. only if it has actually been instantiated, otherwise you will end up with a NullPointerException).
finally
{
    if(reader != null)
    {
        reader.close();
    }
}
I'm pretty new to Java and I still have a lot to learn. I'm trying to output the data within a variable to a text file, and I'm not sure why this will not work. Could anyone help me out?
if ("Y".equals(output_to_file)) {
    System.out.print("You selected Yes");
    PrintStream out = null;
    try {
        out = new PrintStream(new FileOutputStream("filename.txt"));
        out.print(first_input);
    }
    finally {
        if (out != null) out.close();
    }
}
else System.out.print("You selected No");
"(new FileOutputStream("filename.txt"))" is underlined red, and it says: Unhandled exception: java.io.FileNotFoundException
Thanks for your help!
Any time you're doing file operations, there is the possibility that a FileNotFoundException will be thrown, so Java wants you to tell it what to do in the event that one is thrown. You therefore need to add a catch clause for the possible FileNotFoundException. You already have a try block, so you simply need to add the catch clause before your finally clause:
try {
    out = new PrintStream(new FileOutputStream("filename.txt"));
    out.print(first_input);
}
catch(FileNotFoundException e) {
    //do something in the event that a FNFE is thrown
}
finally {
    if (out != null) out.close();
}
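As a side note, if you happen to be on Java 7 or later, try-with-resources closes the stream for you; a minimal sketch (assuming first_input is the String from your snippet):
try (PrintStream out = new PrintStream(new FileOutputStream("filename.txt"))) {
    out.print(first_input);
}
catch(FileNotFoundException e) {
    //do something in the event that a FNFE is thrown
}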
I'm reading a URL with the following code:
URL myURL = new URL("http://path_to_my_file");
try {
    BufferedReader reader = new BufferedReader(new InputStreamReader(myURL.openStream()));
    while (reader.ready()) {
        String line = reader.readLine();
        ...
    }
} catch (IOException e) {
    throw new RuntimeException("Parsing of file failed: " + myURL, e);
}
Could it happen that the file is not read completely (because of network problems or something else)? If yes, is there a way to test for it or even to avoid it?
The background: I'm working on an application (not written by me up to this point) and users report that parts of files are sometimes missing. It happens sporadically, so my only guess was that something sometimes fails when the file is read in, but I have too little Java background to be sure...
Yes, you'll know it's happened when you get an IOException as per the Reader.readLine docs.
So you'll want to catch the Exception, something like this:
try {
    while (reader.ready()) {
        String line = reader.readLine();
    }
}
catch(IOException e) {
    // Bah! Humbug!
    // Should really log this too. So if you're using Log4j:
    log.error("Error reading from URL " + myURL.toString(), e);
} finally {
    try { if (reader != null) reader.close(); } catch(Exception e){}
}
Somewhere here, I found the following comment:
ready() != has more
ready() does not indicate that there is more data to be read. It only shows whether a read would block the thread. It is likely that it will return false before you have read all the data.
To find out whether there is no more data, check if readLine() returns null.
It sounds like the implementation with reader.ready() is causing my problem. Am I wrong with this assumption?
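For reference, a read loop driven by readLine() returning null rather than ready(), as the quoted comment suggests, might look like this sketch (myURL as in the snippet above):
BufferedReader reader = null;
try {
    reader = new BufferedReader(new InputStreamReader(myURL.openStream()));
    String line;
    while ((line = reader.readLine()) != null) {
        // process the line...
    }
} catch (IOException e) {
    throw new RuntimeException("Parsing of file failed: " + myURL, e);
} finally {
    if (reader != null) {
        try { reader.close(); } catch (IOException ignored) {}
    }
}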
I'm trying to delete a file, after writing something in it, with FileOutputStream. This is the code I use for writing:
private void writeContent(File file, String fileContent) {
    FileOutputStream to;
    try {
        to = new FileOutputStream(file);
        to.write(fileContent.getBytes());
        to.flush();
        to.close();
    } catch (FileNotFoundException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
As you can see, I flush and close the stream, but when I try to delete, file.delete() returns false.
I checked before deletion to see if the file exists, and: file.exists(), file.canRead(), file.canWrite(), file.canExecute() all return true. Just after calling these methods I try file.delete() and returns false.
Is there anything I've done wrong?
Another bug in Java. I seldom find them; this is only my second in my 10-year career. As others have mentioned, this is my solution. I had never used System.gc() before, but here, in my case, it was absolutely crucial. Weird? YES!
finally
{
    try
    {
        in.close();
        in = null;
        out.flush();
        out.close();
        out = null;
        System.gc();
    }
    catch (IOException e)
    {
        logger.error(e.getMessage());
        e.printStackTrace();
    }
}
The trick that worked was pretty odd. The thing is, when I previously read the content of the file, I used a BufferedReader. After reading, I closed the buffer.
Meanwhile I switched, and now I'm reading the content using a FileInputStream. Also, after finishing reading, I close the stream. And now it's working.
The problem is that I don't have an explanation for this.
I didn't know BufferedReader and FileOutputStream to be incompatible.
I tried this simple thing and it seems to be working.
file.setWritable(true);
file.delete();
It works for me.
If this does not work, try to run your Java application with sudo if on Linux, and as administrator when on Windows, just to make sure Java has the rights to change the file properties.
Before trying to delete/rename any file, you must ensure that all the readers and writers on it (for example BufferedReader / InputStreamReader / BufferedWriter) are properly closed.
When you try to read/write your data from/to a file, the file is held by the process and not released until the program execution completes. If you want to perform the delete/rename operations before the program ends, then you must use the close() method that comes with the java.io.* classes.
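A minimal sketch of that order of operations (the file name here is made up for illustration):
File file = new File("some-file.txt"); // hypothetical file name
BufferedReader reader = null;
try {
    reader = new BufferedReader(new FileReader(file));
    // Read from the file...
} catch (IOException e) {
    e.printStackTrace();
} finally {
    if (reader != null) {
        try { reader.close(); } catch (IOException ignored) {}
    }
}
// Only after the reader is closed is the handle released,
// so delete/rename can succeed.
System.out.println(file.delete());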
As Jon Skeet commented, you should close your file in the finally {...} block, to ensure that it's always closed. And instead of swallowing the exceptions with e.printStackTrace, simply don't catch them and add the exception to the method signature. If you can't for some reason, at least do this:
catch(IOException ex) {
    throw new RuntimeException("Error processing file XYZ", ex);
}
Now, question #2:
What if you do this:
...
to.close();
System.out.println("Please delete the file and press <enter> afterwards!");
System.in.read();
...
Would you be able to delete the file?
Also, files are flushed when they're closed. Since I use IOUtils.closeQuietly(...), which doesn't throw exceptions, I call the flush method explicitly to ensure that the contents of the file are there before I try to close it. Something like this:
...
try {
    ...
    to.flush();
} catch(IOException ex) {
    throw new CannotProcessFileException("whatever", ex);
} finally {
    IOUtils.closeQuietly(to);
}
So I know that the contents of the file are in there. What usually matters to me is that the contents are written, not whether the file could be closed, so it really doesn't matter to me if close fails. In your case, as it does matter, I would recommend closing the file yourself and handling any exceptions accordingly.
There is no reason you should not be able to delete this file. I would look to see who has a hold on this file. In unix/linux, you can use the lsof utility to check which process has a lock on the file. In windows, you can use process explorer.
For lsof, it's as simple as saying:
lsof /path/and/name/of/the/file
For Process Explorer, you can use the Find menu and enter the file name; it will show you the handle, which points you to the process locking the file.
here is some code that does what I think you need to do:
FileOutputStream to;
try {
String file = "/tmp/will_delete.txt";
to = new FileOutputStream(file );
to.write(new String("blah blah").getBytes());
to.flush();
to.close();
File f = new File(file);
System.out.print(f.delete());
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
It works fine on OS X. I haven't tested it on Windows, but I suspect it should work there too. I will also admit to seeing some unexpected behavior on Windows w.r.t. file handling.
If you are working in the Eclipse IDE, this could mean that you haven't closed the file in a previous launch of the application. When I got the same error message while trying to delete a file, that was the reason. It seems the Eclipse IDE doesn't close all files after termination of an application.
Hopefully this will help. I came across a similar problem where I couldn't delete my file after my Java code made a copy of its content to another folder. After extensive googling, I explicitly declared every file-operation-related variable, called the close() method of each file-operation object, and set them to null. Then there is a function called System.gc(), which will clear up the file I/O mapping (I'm not sure; I'm just repeating what is given on the web sites).
Here is my example code:
public void start() {
    File f = new File(this.archivePath + "\\" + this.currentFile.getName());
    this.Copy(this.currentFile, f);
    if(!this.currentFile.canWrite()){
        System.out.println("Write protected file " +
                this.currentFile.getAbsolutePath());
        return;
    }
    boolean ok = this.currentFile.delete();
    if(ok == false){
        System.out.println("Failed to remove " + this.currentFile.getAbsolutePath());
        return;
    }
}

private void Copy(File source, File dest) {
    FileInputStream fin;
    FileOutputStream fout;
    FileChannel cin = null, cout = null;
    try {
        fin = new FileInputStream(source);
        cin = fin.getChannel();
        fout = new FileOutputStream(dest);
        cout = fout.getChannel();
        long size = cin.size();
        MappedByteBuffer buf = cin.map(FileChannel.MapMode.READ_ONLY, 0, size);
        cout.write(buf);
        buf.clear();
        buf = null;
        cin.close();
        cin = null;
        fin.close();
        fin = null;
        cout.close();
        cout = null;
        fout.close();
        fout = null;
        System.gc();
    } catch (Exception e){
        this.message = e.getMessage();
        e.printStackTrace();
    }
}
The answer is that when you open the file, you need to call the close() method on it somewhere in your code once you are done with it. That worked for me.
There was a problem once in Ruby where files on Windows needed an "fsync" to actually be able to turn around and re-read the file after writing it and closing it. Maybe this is a similar manifestation (and if so, I think it's really a Windows bug).
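In Java, the closest analogue to that fsync would be syncing the file descriptor before closing. A sketch of the question's writeContent with such a call added (whether it actually helps here is a guess, not a confirmed fix):
private void writeContent(File file, String fileContent) {
    FileOutputStream to = null;
    try {
        to = new FileOutputStream(file);
        to.write(fileContent.getBytes());
        to.flush();
        to.getFD().sync(); // force the bytes to the physical device, analogous to fsync
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        if (to != null) {
            try { to.close(); } catch (IOException ignored) {}
        }
    }
}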
None of the solutions listed here worked in my situation. My solution was to use a while loop, attempting to delete the file, with a 5 second (configurable) limit for safety.
File f = new File("/path/to/file");
int limit = 20; // Only try for 5 seconds, for safety
while(!f.delete() && limit > 0){
    synchronized(this){
        try {
            this.wait(250); // Wait for 250 milliseconds
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
    limit--;
}
Using the above loop worked without having to do any manual garbage collecting or setting the stream to null, etc.
The problem could be that the file is still seen as open and locked by a program; or maybe it is open in a component of your own program, in which case you have to make sure you call the dispose() method on that component to release it.
i.e. JFrame frame;
....
frame.dispose();
You have to close all of the streams, or use a try-with-resources block:
static public String head(File file) throws FileNotFoundException, UnsupportedEncodingException, IOException
{
    final String readLine;
    try (FileInputStream fis = new FileInputStream(file);
         InputStreamReader isr = new InputStreamReader(fis, "UTF-8");
         LineNumberReader lnr = new LineNumberReader(isr))
    {
        readLine = lnr.readLine();
    }
    return readLine;
}
If file.delete() is returning false, then in most cases your BufferedReader handle has not been closed. Just close it; that seems to work for me normally.
I had the same problem on Windows. I used to read the file in Scala line by line with
Source.fromFile(path).getLines()
Now I read it as a whole with
import org.apache.commons.io.FileUtils._
// encoding is null for platform default
val content=readFileToString(new File(path),null.asInstanceOf[String])
which closes the file properly after reading and now
new File(path).delete
works.
FOR Eclipse/NetBeans
Restart your IDE and run your code again. This is the only trick that worked for me after an hour-long struggle.
Here is my code:
File file = new File("file-path");
if(file.exists()){
    if(file.delete()){
        System.out.println("Delete");
    }
    else{
        System.out.println("not delete");
    }
}
Output:
Delete
Another corner case where this can happen: if you read/write a JAR file through a URL and later try to delete the same file within the same JVM session.
File f = new File("/tmp/foo.jar");
URL j = f.toURI().toURL();
URL u = new URL("jar:" + j + "!/META-INF/MANIFEST.MF");
URLConnection c = u.openConnection();
// open a Jar entry in auto-closing manner
try (InputStream i = c.getInputStream()) {
    // just read some stuff; for demonstration purposes only
    byte[] first16 = new byte[16];
    i.read(first16);
    System.out.println(new String(first16));
}
// ...
// i is now closed, so we should be good to delete the jar; but...
System.out.println(f.delete()); // says false!
The reason is that Java's internal JAR file handling logic tends to cache JarFile entries:
// inner class of `JarURLConnection` that wraps the actual stream returned by `getInputStream()`
class JarURLInputStream extends FilterInputStream {
    JarURLInputStream(InputStream var2) {
        super(var2);
    }
    public void close() throws IOException {
        try {
            super.close();
        } finally {
            // if `getUseCaches()` is set, `jarFile` won't get closed!
            if (!JarURLConnection.this.getUseCaches()) {
                JarURLConnection.this.jarFile.close();
            }
        }
    }
}
And each JarFile (rather, the underlying ZipFile structure) would hold a handle to the file, right from the time of construction up until close() is invoked:
public ZipFile(File file, int mode, Charset charset) throws IOException {
    // ...
    jzfile = open(name, mode, file.lastModified(), usemmap);
    // ...
}
// ...
private static native long open(String name, int mode, long lastModified,
                                boolean usemmap) throws IOException;
There's a good explanation on this NetBeans issue.
Apparently there are two ways to "fix" this:
You can disable the JAR file caching - for the current URLConnection, or for all future URLConnections (globally) in the current JVM session:
URL u = new URL("jar:" + j + "!/META-INF/MANIFEST.MF");
URLConnection c = u.openConnection();
// for only c
c.setUseCaches(false);
// globally; for some reason this method is not static,
// so we still need to access it through a URLConnection instance :(
c.setDefaultUseCaches(false);
[HACK WARNING!] You can manually purge the JarFile from the cache when you are done with it. The cache manager sun.net.www.protocol.jar.JarFileFactory is package-private, but some reflection magic can get the job done for you:
class JarBridge {
    static void closeJar(URL url) throws Exception {
        // JarFileFactory jarFactory = JarFileFactory.getInstance();
        Class<?> jarFactoryClazz = Class.forName("sun.net.www.protocol.jar.JarFileFactory");
        Method getInstance = jarFactoryClazz.getMethod("getInstance");
        getInstance.setAccessible(true);
        Object jarFactory = getInstance.invoke(jarFactoryClazz);
        // JarFile jarFile = jarFactory.get(url);
        Method get = jarFactoryClazz.getMethod("get", URL.class);
        get.setAccessible(true);
        Object jarFile = get.invoke(jarFactory, url);
        // jarFactory.close(jarFile);
        Method close = jarFactoryClazz.getMethod("close", JarFile.class);
        close.setAccessible(true);
        //noinspection JavaReflectionInvocation
        close.invoke(jarFactory, jarFile);
        // jarFile.close();
        ((JarFile) jarFile).close();
    }
}

// and in your code:
// i is now closed, so we should be good to delete the jar
JarBridge.closeJar(j);
System.out.println(f.delete()); // says true, phew.
Please note: all of this is based on the Java 8 codebase (1.8.0_144); it may not work with other / later versions.