Read/Write File in blackberry - java

I am writing to a file using this code.
protected void writeFile(String text) {
    DataOutputStream os = null;
    FileConnection fconn = null;
    try {
        fconn = (FileConnection) Connector.open("file:///store/home/user/documents/file.txt", Connector.READ_WRITE);
        if (!fconn.exists())
            fconn.create();
        os = fconn.openDataOutputStream();
        os.write(text.getBytes());
    } catch (IOException e) {
        System.out.println(e.getMessage());
    } finally {
        try {
            if (null != os)
                os.close();
            if (null != fconn)
                fconn.close();
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}
The code is working fine.
My problem is: the first time I write "Bangalore" and read it back, I get "Bangalore".
But the second time, when I write "India" and read it back, I get "Indialore".
So basically the file's content is not being replaced by the text I am giving it.
Please tell me how to fix this.

Writing to a file doesn't remove the existing content; it just overwrites it from the start, so writing 'India' over 'Bangalore' replaces 'Banga' with 'India' and the rest remains the same. If you want to completely replace the old content with the new content, you need to truncate()
the file at the point where the new data ends: truncate(text.getBytes().length)
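For illustration, here is a minimal sketch of the question's write method with the truncate() call added, assuming the JSR-75 FileConnection API used in the question (only the flush() and truncate() lines are new):
protected void writeFile(String text) {
    DataOutputStream os = null;
    FileConnection fconn = null;
    try {
        fconn = (FileConnection) Connector.open("file:///store/home/user/documents/file.txt", Connector.READ_WRITE);
        if (!fconn.exists())
            fconn.create();
        os = fconn.openDataOutputStream();
        byte[] data = text.getBytes();
        os.write(data);
        os.flush();
        // Cut the file off at the end of the new data so no old bytes remain.
        fconn.truncate(data.length);
    } catch (IOException e) {
        System.out.println(e.getMessage());
    } finally {
        try {
            if (null != os) os.close();
            if (null != fconn) fconn.close();
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}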

Related

Writing from Stream<String> to PrintWriter drops lines [duplicate]

For some reason my String is written partially by PrintWriter. As a result I am getting partial text in my file. Here's the method:
public void new_file_with_text(String text, String fname) {
    File f = null;
    try {
        f = new File(fname);
        f.createNewFile();
        System.out.println(text);
        PrintWriter out = new PrintWriter(f, "UTF-8");
        out.print(text);
    } catch (IOException e) {
        e.printStackTrace();
    }
}
When I print the text to the console, I can see that the data is all there and nothing is lost, but apparently part of the text gets lost when PrintWriter does its job... I am clueless.
You should always Writer#close your streams before you discard them. This frees the rather expensive system resources that your JVM must acquire when opening a file on the file system. If you do not want to close your stream yet, you can use Writer#flush. This makes your changes visible on the file system without closing the stream. When a stream is closed, all of its data is flushed implicitly.
Streams buffer data and only write to the file system once there is enough data to be written. A stream flushes its data automatically every now and then, whenever it considers the data worth writing. Writing to the file system is an expensive operation (it costs time and system resources) and should therefore only be done when really necessary. Consequently, you need to flush the stream's buffer manually if you want an immediate write.
In general, make sure you always close streams, since they hold quite a few system resources. Java has mechanisms for closing streams on garbage collection, but these should only be seen as a last resort, since a stream can live for quite some time before it is actually garbage collected. Therefore, always use try {} finally {} to ensure that streams get closed, even when an exception is thrown after the stream was opened. If you do not pay attention to this, you will eventually end up with an IOException signaling that you have opened too many files.
You want to change your code like this:
public void new_file_with_text(String text, String fname) {
    File f = null;
    try {
        f = new File(fname);
        f.createNewFile();
        System.out.println(text);
        PrintWriter out = new PrintWriter(f, "UTF-8");
        try {
            out.print(text);
        } finally {
            out.close();
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Try to use out.flush(); right after the line out.print(text);
Here is a proper way to write to a file:
public void new_file_with_text(String text, String fname) {
    try (FileWriter f = new FileWriter(fname)) {
        f.write(text);
        f.flush();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
I tested your code. You forgot to close the PrintWriter object, i.e. out.close():
try {
    f = new File(fname);
    f.createNewFile();
    System.out.println(text);
    PrintWriter out = new PrintWriter(f, "UTF-8");
    out.print(text);
    out.close(); // <--------------
} catch (IOException e) {
    System.out.println(e);
}
You must always close your streams (which also flushes them), either in a finally block or using the Java 7 try-with-resources facility:
PrintWriter out = null;
try {
    ...
}
finally {
    if (out != null) {
        out.close();
    }
}
or
try (PrintWriter out = new PrintWriter(...)) {
    ...
}
If you don't close your streams, not only will everything not be flushed to the file, but at some point your OS will run out of available file descriptors.
You should close your file:
PrintWriter out = new PrintWriter(f, "UTF-8");
try
{
    out.print(text);
}
finally
{
    try
    {
        out.close();
    }
    catch (Throwable t)
    {
        t.printStackTrace();
    }
}

File isn't saved correctly

I am making a save/load feature for the settings in my application. Upon launching the program, it tries to find the file. If it fails, it tries to create a file with default settings (code below)
try (FileWriter fileWriter = new FileWriter(absolutePath))
{
    fileWriter.write("theme=light\n");
    fileWriter.write("resolution=1280x720\n");
    fileWriter.write("printfps=false\n");
    System.out.println("Reset settings");
    load();
}
catch (FileNotFoundException e)
{
    System.out.println("Settings File not found.");
}
catch (IOException e)
{
    e.printStackTrace();
}
After it has written this, it goes on to load the file (calling the load() method).
In the load() method, the application reads the contents of the file (code below).
try (BufferedReader bufferedReader = new BufferedReader(new FileReader(absolutePath)))
{
    String line = bufferedReader.readLine();
    System.out.println(line);
    while (line != null)
    {
        if (line.contains("="))
        {
            String key = line;
            String value = line;
            while (key.contains("="))
            {
                key = key.substring(0, key.length() - 1);
            }
            while (value.contains("="))
            {
                value = value.substring(1);
            }
            settings.put(key, value);
        }
        System.out.println(line);
        line = bufferedReader.readLine();
    }
    System.out.println(settings);
}
However, the load reports that the file is empty. After messing with breakpoints, I can confirm that the file is indeed not updated at that point. The rather weird thing is that if I pause the application at a later time, the file does contain the text that was written to it, even though the file is not touched later in the program.
This makes me believe that it takes some time for the file to update, thus not updating in time for the load() method. Is this correct, or am I missing something? And is there a workaround?
All help is appreciated :)
You're calling load() before you actually saved the file.
To save the file, call fileWriter.close(), or just move the load() call out of the try-with-resources block with the FileWriter:
try (FileWriter fileWriter = new FileWriter(absolutePath))
{
    fileWriter.write("theme=light\n");
    fileWriter.write("resolution=1280x720\n");
    fileWriter.write("printfps=false\n");
}
catch (FileNotFoundException e)
{
    System.out.println("Settings File not found.");
}
catch (IOException e)
{
    e.printStackTrace();
}
// FileWriter closed now and the file contents saved
System.out.println("Reset settings");
load();
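A variation on this (a sketch, not part of the original answer): if you prefer to keep load() inside the block, flushing the writer before calling load() also makes the written data visible, since flush() pushes the buffered characters through to the file:
try (FileWriter fileWriter = new FileWriter(absolutePath))
{
    fileWriter.write("theme=light\n");
    fileWriter.write("resolution=1280x720\n");
    fileWriter.write("printfps=false\n");
    fileWriter.flush(); // push the buffered data to the file before reading it back
    System.out.println("Reset settings");
    load();
}
catch (IOException e)
{
    e.printStackTrace();
}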

Attempting to overwrite files results in blank files

In the app I am working on right now, part of the functionality is to write data saved on the device to a flash drive connected via a USB-OTG adapter. Specifically, the device is a rooted Motorola Xoom running 4.2.2. I can successfully write files to the drive and read them on my computer. That part works fine. However, when I try to replace existing files with new information, the resulting files come out empty. I even delete the existing files before writing new data. What's weird is that after copying the contents of my internal file to the flash drive, I log the length of the resulting file. It always matches the input file and is always a non-0 number, yet the file still shows up as blank on my computer. Can anyone help with this problem? Relevant code from the AsyncTask that I have doing this work is below.
@Override
protected Void doInBackground(Void... params) {
    File[] files = context.getFilesDir().listFiles();
    for (File file : files) {
        if (file.isFile()) {
            List<String> nameSegments = Arrays.asList(file.getName().split("_"));
            Log.d("source file", "size: " + file.length());
            String destinationPath = "/storage/usbdisk0/"
                    + nameSegments.get(0) + "/" + nameSegments.get(1) + "/";
            File destinationPathFile = new File(destinationPath);
            if (!destinationPathFile.mkdirs()) {
                destinationPathFile.mkdirs();
            }
            File destinationFile = new File(destinationPathFile, nameSegments.get(2));
            FileReader fr = null;
            FileWriter fw = null;
            try {
                fr = new FileReader(file);
                fw = new FileWriter(destinationFile, false);
                int c = fr.read();
                while (c != -1) {
                    fw.write(c);
                    c = fr.read();
                }
                fw.flush();
            } catch (IOException e) {
                e.printStackTrace();
            } finally {
                try {
                    fr.close();
                    fw.close();
                } catch (IOException e) {
                    // TODO Auto-generated catch block
                    e.printStackTrace();
                }
            }
            Log.d("destination file", "size: " + new File(destinationFile.getPath()).length());
        }
    }
    return null;
}
EDIT:
Per #Simon's suggestion, I added output.flush() to my code. This does not change the result.
EDIT #2:
I did some further testing with this and found something interesting. If I go to Settings->Storage->Unmount USB Storage after writing to the flash drive but before removing it from the OTG adapter, everything works perfectly. However, failing to eject the drive after writing results in the data not being written. What's strange is that the folder structure and file itself are created on the drive, but the file is always empty. One more thing: if I go to a file manager application and open up the file prior to removing the drive, the files all exist as they should. However, even removing the device, plugging it straight back in to the tablet and opening any of the files results in the file looking empty. I can't make heads or tails of this, and this is incredibly frustrating. Can anyone help with this?
EDIT #3:
I also changed to using FileReaders and FileWriters just to see what would happen. I don't care about efficiency at this point; I simply want file writing to work reliably. This change did not affect the issue. The updated code is posted above.
Try using the FileReader.ready() method before your FileReader.read() call,
to ensure that your FileReader really has some bytes to read.
Try this, using a BufferedWriter for the writing:
try {
    fw = new FileWriter(destinationFile);
    BufferedWriter writer = new BufferedWriter(fw);
    writer.append(yourText); // append can be changed to write if you want to overwrite
    writer.close();
}
catch (Exception e) {
    throw new RuntimeException(e);
}
finally {
    if (fw != null) {
        try {
            fw.flush();
            fw.close();
        }
        catch (IOException e) {
        }
    }
}
I found the solution to my problem. It appears that the Android system buffers some files off of the SD card/flash drive, and then writes them to the flash drive upon eject. The following code after my file operations synchronizes the buffer with the filesystem and allows the flash drive to be immediately removed from the device without data loss. It's worth noting that this DOES require root access; it will not work on a non-rooted device.
try {
    Process p = Runtime.getRuntime().exec("su");
    DataOutputStream os = new DataOutputStream(p.getOutputStream());
    os.writeBytes("sync; sync\n");
    os.writeBytes("exit\n");
    os.flush();
} catch (Exception e) {
    e.printStackTrace();
}
Source of my solution: Android 2.1 programatically unmount SDCard
It sounds like the filesystem is caching your changes, but not actually writing them to the flash drive until you eject it. I don't think there's a way to flush the filesystem cache, so the best solution seems to be just to unmount and then remount the flash drive.

How do I update a File created by openFileOutput

I'm currently building an application where the user will generate data over time and, should he/she have an internet connection, transmit it to the web. However, if there is no web access, I need to store this data on the phone until the user regains access, at which point I'll need to recover the data and transmit it. I'm facing lots of trouble doing this, as described below.
Note: before anything else, I'm using a local Java-created file because I know of no other way to save/restore this data on the device. If you happen to know any other way to store/access this data from within the device, please feel free to comment here.
Just for reference:
phantoms is an ArrayList containing objects with the data I need to store,
Arquivador is the class that I'm using to make my data persistent and to recover it,
Funcionario is the class with the data generated by the program (just a few strings and numbers).
I am able to write a file to the file system through the code below, on my Activity:
try {
    arq = new Arquivador();
    arq.addFirstObjectInFile(
            openFileOutput("dados.jlog", MODE_WORLD_WRITEABLE),
            phantoms.get(0));
    phantoms.remove(phantoms.get(0));
    for (Funcionario func : phantoms) {
        arq.addObjectInFile(openFileOutput("dados.jlog", MODE_APPEND), func);
    }
} catch (FileNotFoundException e) {
    // TODO Auto-generated catch block
}
Here is the code inside Arquivador that adds the data to a file:
public void addObjectInFile(FileOutputStream arquivo,
        Object objetoAAdicionar) {
    try {
        ObjectOutputStream aoos = new ObjectOutputStream(arquivo);
        aoos.writeObject(objetoAAdicionar);
        aoos.close();
    } catch (IOException ioe) {
        Log.d(TAG_NAME, "Erro no Appendable OOS.");
    }
}

public void addFirstObjectInFile(FileOutputStream arquivo,
        Object objetoAAdicionar) {
    try {
        AppendableObjectOutputStream aoos = new AppendableObjectOutputStream(arquivo);
        aoos.writeObject(objetoAAdicionar);
        aoos.close();
    } catch (IOException ioe) {
        Log.d(TAG_NAME, "Erro no Appendable OOS.");
    }
}
You will notice that I'm adding data to the persistent file in two steps: the first Object, and then the rest of them. This was an idea I saw in a post here on Stack Overflow to allow appending data to a Java-generated file. I have no problem with this code; it works perfectly.
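The AppendableObjectOutputStream class itself is not shown in the question; the Stack Overflow pattern it refers to usually looks roughly like the sketch below, where writeStreamHeader() is overridden so that appended writes don't emit a second stream header (this is an assumed reconstruction, not code from the question):
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.OutputStream;

// Used when writing to a file that already contains a serialized stream:
// instead of writing a new stream header, it just resets the stream state.
class AppendableObjectOutputStream extends ObjectOutputStream {
    public AppendableObjectOutputStream(OutputStream out) throws IOException {
        super(out);
    }

    @Override
    protected void writeStreamHeader() throws IOException {
        reset();
    }
}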
Later on, back on my Activity, the internet connection is detected and I try to recover the file saved on the disk:
phantoms = new ArrayList<Funcionario>();
Object obj = arq.readObjectFromFile(openFileInput("dados.jlog"));
Funcionario func = null;
if (obj instanceof Funcionario) {
    func = (Funcionario) obj;
}
while (func != null) {
    phantoms.add(func);
    arq.removeObjectFromFile(openFileInput("dados.jlog"), func,
            getApplicationContext());
    func = (Funcionario) arq.readObjectFromFile(openFileInput("dados.jlog"));
}
The original idea was to read 1 object at a time, then attempt to transmit it and, if successful, erase the object from the file (so it didn't get retransmitted). However, I was having too many error messages with this. Instead, I decided to load all the objects at once, one by one, to see where my problem was more clearly.
Back to the Arquivador class:
public Object readObjectFromFile(FileInputStream arquivo) {
    Object retorno = null;
    if (arquivo.equals(null)) {
        Log.e(TAG_NAME, "FIS is null!");
    }
    ObjectInputStream ois = null;
    try {
        ois = new ObjectInputStream(arquivo);
        retorno = ois.readObject();
    } catch (IOException ioex) {
    } catch (ClassNotFoundException e) {
    } finally {
        try {
            if (ois != null) ois.close();
        } catch (IOException e) {
        }
    }
    return retorno;
}
public void removeObjectFromFile(FileInputStream arqPrincipal,
        Object objetoARemover, Context contexto) {
    try {
        // Construct the new file that will later be renamed to the original
        // filename.
        ObjectOutputStream oos = new ObjectOutputStream(
                contexto.openFileOutput("dados.jlog.temp",
                        contexto.MODE_APPEND));
        ObjectInputStream ois = new ObjectInputStream(arqPrincipal);
        Object obj = null;
        // Read from the original file and write to the new
        // unless content matches data to be removed.
        try {
            while ((obj = ois.readObject()) != null) {
                if (!(objetoARemover.equals(obj))) {
                    oos.writeObject(obj);
                    oos.flush();
                }
            }
        } catch (EOFException eof) {
        } finally {
            oos.close();
            ois.close();
            // Delete the original file
            File aDeletar = contexto.getFileStreamPath("dados.jlog");
            File aRenomear = contexto.getFileStreamPath("dados.jlog.tmp");
            if (!aDeletar.delete()) {
                return;
            } else {
                // Rename the new file to the filename the original file had.
                if (!aRenomear.renameTo(aDeletar)) Log.d(TAG_NAME, "Error renaming file");
                else Log.d(TAG_NAME, "Renaming successful");
            }
        }
    } catch (FileNotFoundException ex) {
        ex.printStackTrace();
        Log.d(TAG_NAME, "Arquivo não encontrado");
    } catch (IOException ex) {
        ex.printStackTrace();
        Log.d(TAG_NAME, "Erro de entrada/saída");
    } catch (ClassNotFoundException e) {
        Log.d(TAG_NAME, "Classe Não Encontrada.");
    }
}
The method readObjectFromFile() seems to work just fine. I can even cast the Object that was read to the Funcionario class and read its data.
My problems appear when I use removeObjectFromFile(). The idea is to create a temporary file to store the objects from the "dados.jlog" file other than the one that has already been loaded in the main program; once this temp file is created, the file "dados.jlog" should be deleted and the temporary file should be renamed to replace it.
The first thing I found strange here is that ois.readObject() keeps throwing an EOFException. While this makes sense, the tutorial I read on the internet doesn't mention this error. In fact, their code suggests that when the readObject() method reaches EOF, it would return a null reference, but instead the class throws an EOFException. I handled this exception in the code, though I'm not sure this is the right way to do it.
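(As an aside, the usual pattern for reading every object from such a stream is to treat EOFException as the end-of-stream signal; a minimal sketch, with illustrative variable names:)
List<Object> all = new ArrayList<Object>();
ObjectInputStream ois = new ObjectInputStream(fis);
try {
    while (true) {
        all.add(ois.readObject()); // throws EOFException once the stream is exhausted
    }
} catch (EOFException endOfStream) {
    // normal termination: no more objects to read
} catch (ClassNotFoundException e) {
    e.printStackTrace();
} finally {
    ois.close();
}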
Another thing I find strange is that this code fails to recognize the object that it should NOT copy. When I compare the object read from the file to the one received as an argument, no matter what I try (==, equals(), etc.) they look like different objects. The Funcionario class is serializable and has a serialVersionUID, so the object read from the file should be identical to the one I stored. Worse than that, the two Objects being compared are read from the same file. They should be identical, right?
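(A possible explanation, assuming Funcionario does not override equals(): deserialization always creates a new instance, so == never matches, and the default equals() compares identity only. A field-based override would look roughly like this, with an illustrative field name:)
@Override
public boolean equals(Object other) {
    if (this == other) return true;
    if (!(other instanceof Funcionario)) return false;
    Funcionario that = (Funcionario) other;
    // compare the fields that identify a Funcionario ("matricula" is illustrative)
    return this.matricula == null ? that.matricula == null
                                  : this.matricula.equals(that.matricula);
}

@Override
public int hashCode() {
    return matricula == null ? 0 : matricula.hashCode();
}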
After creating the temporary file, I try to delete the original file and rename the temporary one. Though this seems to work, once removeObjectFromFile() ends the first time, the program is unable to read the data from the file "dados.jlog" again. I can't read the remaining data from the file and the program enters an endless loop, since the first object is never removed from the list in the file.
Please enlighten me on this matter.
Personally I'd use an SQLite database. Store each object in a row in the database. Once you've successfully transmitted it, you can remove the row from the database.
You can even reuse most of the code you've already written. The easiest way to get there from where you are is to use a separate file for each object and store only the filename of the object in the database. You can then iterate over the rows in the database. Each time you transmit an object to your server, simply delete that row from the database (and remove the file from the filesystem!). No rows in the database means no objects remain to be transmitted.
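As a rough illustration of the row-per-pending-item idea (a sketch only; the table and column names are made up, and the standard Android SQLiteOpenHelper API is assumed):
import android.content.ContentValues;
import android.content.Context;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;

public class PendingQueueHelper extends SQLiteOpenHelper {
    public PendingQueueHelper(Context ctx) {
        super(ctx, "pending.db", null, 1);
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        db.execSQL("CREATE TABLE pending (id INTEGER PRIMARY KEY AUTOINCREMENT, filename TEXT)");
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        db.execSQL("DROP TABLE IF EXISTS pending");
        onCreate(db);
    }

    // Remember a file that still has to be transmitted.
    public void enqueue(String filename) {
        ContentValues values = new ContentValues();
        values.put("filename", filename);
        getWritableDatabase().insert("pending", null, values);
    }

    // Forget a file once it has been transmitted successfully.
    public void dequeue(long id) {
        getWritableDatabase().delete("pending", "id = ?", new String[] { String.valueOf(id) });
    }
}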

file.delete() returns false even though file.exists(), file.canRead(), file.canWrite(), file.canExecute() all return true

I'm trying to delete a file, after writing something in it, with FileOutputStream. This is the code I use for writing:
private void writeContent(File file, String fileContent) {
    FileOutputStream to;
    try {
        to = new FileOutputStream(file);
        to.write(fileContent.getBytes());
        to.flush();
        to.close();
    } catch (FileNotFoundException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
As you can see, I flush and close the stream, but when I try to delete it, file.delete() returns false.
I checked before deletion to see if the file exists, and: file.exists(), file.canRead(), file.canWrite(), file.canExecute() all return true. Just after calling these methods I try file.delete() and returns false.
Is there anything I've done wrong?
Another bug in Java. I seldom find them; this is only my second in my 10-year career. This is my solution, as others have mentioned: I have never used System.gc() before, but here, in my case, it is absolutely crucial. Weird? YES!
finally
{
    try
    {
        in.close();
        in = null;
        out.flush();
        out.close();
        out = null;
        System.gc();
    }
    catch (IOException e)
    {
        logger.error(e.getMessage());
        e.printStackTrace();
    }
}
The trick that worked was pretty odd. The thing is, when I previously read the content of the file, I used a BufferedReader. After reading, I closed the buffer.
Meanwhile I switched, and now I'm reading the content using a FileInputStream. Also, after finishing reading I close the stream. And now it's working.
The problem is I don't have an explanation for this.
I didn't know BufferedReader and FileOutputStream to be incompatible.
I tried this simple thing and it seems to be working.
file.setWritable(true);
file.delete();
It works for me.
If this does not work, try running your Java application with sudo on Linux or as administrator on Windows, just to make sure Java has the rights to change the file properties.
Before trying to delete/rename any file, you must ensure that all the readers or writers (for example: BufferedReader/InputStreamReader/BufferedWriter) are properly closed.
When you read/write your data from/to a file, the file is held by the process and not released until the program's execution completes. If you want to perform delete/rename operations before the program ends, then you must use the close() method that comes with the java.io.* classes.
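As a small sketch of that point (with an illustrative file name, not taken from the question): closing the reader, here via try-with-resources, before calling delete() is what releases the handle:
File file = new File("settings.txt"); // illustrative path
try (BufferedReader reader = new BufferedReader(new FileReader(file))) {
    String firstLine = reader.readLine();
    System.out.println(firstLine);
} catch (IOException e) {
    e.printStackTrace();
}
// The reader is closed here, so the delete can succeed.
System.out.println("deleted: " + file.delete());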
As Jon Skeet commented, you should close your file in the finally {...} block to ensure that it's always closed. And instead of swallowing the exceptions with e.printStackTrace(), simply don't catch them and add the exception to the method signature. If you can't for some reason, at least do this:
catch (IOException ex) {
    throw new RuntimeException("Error processing file XYZ", ex);
}
Now, question #2:
What if you do this:
...
to.close();
System.out.println("Please delete the file and press <enter> afterwards!");
System.in.read();
...
Would you be able to delete the file?
Also, files are flushed when they're closed. I use IOUtils.closeQuietly(...), so I use the flush method to ensure that the contents of the file are there before I try to close it (IOUtils.closeQuietly doesn't throw exceptions). Something like this:
...
try {
    ...
    to.flush();
} catch (IOException ex) {
    throw new CannotProcessFileException("whatever", ex);
} finally {
    IOUtils.closeQuietly(to);
}
So I know that the contents of the file are in there. As what usually matters to me is that the contents of the file are written, not whether the file could be closed, it really doesn't matter to me if the file was closed or not. In your case, as it does matter, I would recommend closing the file yourself and treating any exceptions accordingly.
There is no reason you should not be able to delete this file. I would look to see who has a hold on it. In Unix/Linux, you can use the lsof utility to check which process has a lock on the file. In Windows, you can use Process Explorer.
For lsof, it's as simple as saying:
lsof /path/and/name/of/the/file
For Process Explorer, you can use the Find menu and enter the file name; it will show you the handle, which will point you to the process locking the file.
Here is some code that does what I think you need to do:
FileOutputStream to;
try {
    String file = "/tmp/will_delete.txt";
    to = new FileOutputStream(file);
    to.write(new String("blah blah").getBytes());
    to.flush();
    to.close();
    File f = new File(file);
    System.out.print(f.delete());
} catch (FileNotFoundException e) {
    // TODO Auto-generated catch block
    e.printStackTrace();
} catch (IOException e) {
    // TODO Auto-generated catch block
    e.printStackTrace();
}
It works fine on OS X. I haven't tested it on Windows, but I suspect it should work there too. I will also admit to having seen some unexpected behavior on Windows w.r.t. file handling.
If you are working in the Eclipse IDE, this could mean that you didn't close the file in a previous launch of the application. When I got the same error message while trying to delete a file, that was the reason. The Eclipse IDE, it seems, doesn't close all files after termination of an application.
Hopefully this will help. I came across a similar problem where I couldn't delete my file after my Java code made a copy of its content to another folder. After extensive googling, I explicitly declared every file-operation-related variable, called the close() method of each file-operation object, and set them to null. Then there is the method System.gc(), which will clear up the file I/O mapping (I'm not sure; I'm just repeating what is given on the web sites).
Here is my example code:
public void start() {
    File f = new File(this.archivePath + "\\" + this.currentFile.getName());
    this.Copy(this.currentFile, f);
    if (!this.currentFile.canWrite()) {
        System.out.println("Write protected file " + this.currentFile.getAbsolutePath());
        return;
    }
    boolean ok = this.currentFile.delete();
    if (ok == false) {
        System.out.println("Failed to remove " + this.currentFile.getAbsolutePath());
        return;
    }
}
private void Copy(File source, File dest) throws IOException {
    FileInputStream fin;
    FileOutputStream fout;
    FileChannel cin = null, cout = null;
    try {
        fin = new FileInputStream(source);
        cin = fin.getChannel();
        fout = new FileOutputStream(dest);
        cout = fout.getChannel();
        long size = cin.size();
        MappedByteBuffer buf = cin.map(FileChannel.MapMode.READ_ONLY, 0, size);
        cout.write(buf);
        buf.clear();
        buf = null;
        cin.close();
        cin = null;
        fin.close();
        fin = null;
        cout.close();
        cout = null;
        fout.close();
        fout = null;
        System.gc();
    } catch (Exception e) {
        this.message = e.getMessage();
        e.printStackTrace();
    }
}
The answer is that when you load the file, you need to apply the close() method somewhere in your code. That works for me.
There was a problem once in Ruby where files on Windows needed an "fsync" to actually be able to turn around and re-read the file after writing and closing it. Maybe this is a similar manifestation (and if so, I think it's really a Windows bug).
None of the solutions listed here worked in my situation. My solution was to use a while loop that keeps attempting to delete the file, with a 5-second (configurable) limit for safety.
File f = new File("/path/to/file");
int limit = 20; // only try for 5 seconds, for safety
while (!f.delete() && limit > 0) {
    synchronized (this) {
        try {
            this.wait(250); // wait for 250 milliseconds
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
    limit--;
}
Using the above loop worked without having to do any manual garbage collecting or setting the stream to null, etc.
The problem could be that the file is still seen as open and locked by a program; or maybe it is open in a component of your own program, in which case you have to make sure you call the dispose() method to solve the problem.
i.e. JFrame frame;
....
frame.dispose();
You have to close all of the streams, or use a try-with-resources block:
static public String head(File file) throws FileNotFoundException, UnsupportedEncodingException, IOException
{
    final String readLine;
    try (FileInputStream fis = new FileInputStream(file);
         InputStreamReader isr = new InputStreamReader(fis, "UTF-8");
         LineNumberReader lnr = new LineNumberReader(isr))
    {
        readLine = lnr.readLine();
    }
    return readLine;
}
If file.delete() is returning false, then in most cases your BufferedReader handle has not been closed. Just close it; that seems to work normally for me.
I had the same problem on Windows. I used to read the file in Scala line by line with
Source.fromFile(path).getLines()
Now I read it as a whole with
import org.apache.commons.io.FileUtils._
// encoding is null for platform default
val content=readFileToString(new File(path),null.asInstanceOf[String])
which closes the file properly after reading and now
new File(path).delete
works.
For Eclipse/NetBeans:
Restart your IDE and run your code again; this was the only trick that worked for me after an hour-long struggle.
Here is my code:
File file = new File("file-path");
if (file.exists()) {
    if (file.delete()) {
        System.out.println("Delete");
    }
    else {
        System.out.println("not delete");
    }
}
Output:
Delete
Another corner case where this can happen: if you read/write a JAR file through a URL and later try to delete the same file within the same JVM session.
File f = new File("/tmp/foo.jar");
URL j = f.toURI().toURL();
URL u = new URL("jar:" + j + "!/META-INF/MANIFEST.MF");
URLConnection c = u.openConnection();

// open a Jar entry in auto-closing manner
try (InputStream i = c.getInputStream()) {
    // just read some stuff; for demonstration purposes only
    byte[] first16 = new byte[16];
    i.read(first16);
    System.out.println(new String(first16));
}
// ...
// i is now closed, so we should be good to delete the jar; but...
System.out.println(f.delete()); // says false!
The reason is that Java's internal JAR file handling logic tends to cache JarFile entries:
// inner class of `JarURLConnection` that wraps the actual stream returned by `getInputStream()`
class JarURLInputStream extends FilterInputStream {
    JarURLInputStream(InputStream var2) {
        super(var2);
    }

    public void close() throws IOException {
        try {
            super.close();
        } finally {
            // if `getUseCaches()` is set, `jarFile` won't get closed!
            if (!JarURLConnection.this.getUseCaches()) {
                JarURLConnection.this.jarFile.close();
            }
        }
    }
}
And each JarFile (rather, the underlying ZipFile structure) would hold a handle to the file, right from the time of construction up until close() is invoked:
public ZipFile(File file, int mode, Charset charset) throws IOException {
    // ...
    jzfile = open(name, mode, file.lastModified(), usemmap);
    // ...
}

// ...
private static native long open(String name, int mode, long lastModified,
                                boolean usemmap) throws IOException;
There's a good explanation on this NetBeans issue.
Apparently there are two ways to "fix" this:
You can disable the JAR file caching - for the current URLConnection, or for all future URLConnections (globally) in the current JVM session:
URL u = new URL("jar:" + j + "!/META-INF/MANIFEST.MF");
URLConnection c = u.openConnection();
// for only c
c.setUseCaches(false);
// globally; for some reason this method is not static,
// so we still need to access it through a URLConnection instance :(
c.setDefaultUseCaches(false);
[HACK WARNING!] You can manually purge the JarFile from the cache when you are done with it. The cache manager sun.net.www.protocol.jar.JarFileFactory is package-private, but some reflection magic can get the job done for you:
class JarBridge {
    static void closeJar(URL url) throws Exception {
        // JarFileFactory jarFactory = JarFileFactory.getInstance();
        Class<?> jarFactoryClazz = Class.forName("sun.net.www.protocol.jar.JarFileFactory");
        Method getInstance = jarFactoryClazz.getMethod("getInstance");
        getInstance.setAccessible(true);
        Object jarFactory = getInstance.invoke(jarFactoryClazz);

        // JarFile jarFile = jarFactory.get(url);
        Method get = jarFactoryClazz.getMethod("get", URL.class);
        get.setAccessible(true);
        Object jarFile = get.invoke(jarFactory, url);

        // jarFactory.close(jarFile);
        Method close = jarFactoryClazz.getMethod("close", JarFile.class);
        close.setAccessible(true);
        //noinspection JavaReflectionInvocation
        close.invoke(jarFactory, jarFile);

        // jarFile.close();
        ((JarFile) jarFile).close();
    }
}

// and in your code:
// i is now closed, so we should be good to delete the jar
JarBridge.closeJar(j);
System.out.println(f.delete()); // says true, phew.
Please note: all of this is based on the Java 8 codebase (1.8.0_144); it may not work with other or later versions.
