I am trying to zip directories from one area (sdCard/someFolder) into a second directory (sdCard/Download), until the .zip file size reaches 5 MB. Then I want to create a new .zip file, fill that one up to 5 MB, and so on.
Currently, my code compresses files into .zip files successfully, but one of the .zip files always ends up corrupt. I see this when my for loop finishes the first File[] of 22 objects and starts the next directory with a File[] of 4 objects. I believe I am missing some cleanup of old OutputStreams. The out.putNextEntry() call fails after the second pass of the for loop. Any help will suffice.
private static void addDirToArchive(ZipOutputStream out, FileOutputStream destinationDir, File sdCardMNDLogs)
{
    File[] listOfFiles = sdCardMNDLogs.listFiles();
    BufferedInputStream origin = null;
    Log.i(TAG3, "Reading directory: " + sdCardMNDLogs.getName());
    try {
        byte[] buffer = new byte[BUFFER];
        for (int i = 0; i < listOfFiles.length; i++)
        {
            if (listOfFiles[i].isDirectory())
            {
                addDirToArchive(out, destinationDir, listOfFiles[i]);
                continue;
            }
            try
            {
                FileInputStream fis = new FileInputStream(listOfFiles[i]);
                origin = new BufferedInputStream(fis, BUFFER);
                ZipEntry ze = new ZipEntry(listOfFiles[i].getName());
                if (currentZipFileSize >= EMAIL_SIZE)
                {
                    out.close();
                    Log.d(emailTAG, "Creating new zipfile: /Download/MND/nwdLogs_" + i);
                    out = new ZipOutputStream(new FileOutputStream(new File(sdCard.getAbsolutePath() + "/Download/MND/nwdLogs_" + i + ".zip")));
                    currentZipFileSize = 0;
                }
                out.putNextEntry(ze);
                int length;
                Log.i(TAG3, "Adding file: " + listOfFiles[i].getName());
                while ((length = origin.read(buffer, 0, BUFFER)) != -1)
                {
                    out.write(buffer, 0, length);
                }
                out.closeEntry();
                origin.close();
                currentZipFileSize = currentZipFileSize + ze.getCompressedSize();
            }
            catch (IOException ioe)
            {
                Log.e(TAG3, "IOException: " + ioe);
            }
        }
    }
    finally
    {
        try {
            out.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
FileOutputStream destinationDir = new FileOutputStream(sdCard.getAbsolutePath() + "/Download/Dir/nwdLogs.zip");
ZipOutputStream out = new ZipOutputStream(destinationDir);
currentZipFileSize = 0;
addDirToArchive(out, destinationDir, dirName);
out.close();
destinationDir.close();
I suspect that the problem is that you aren't calling out.close() before opening the next ZIP file. My understanding is that a ZIP's index is only written when the ZIP is closed, so if you neglect to close it, the index will be missing: hence the corruption.
Also, note that you don't need to close both fis and origin. Just close origin ... and it closes fis.
UPDATE - While you have fixed the original close bug, there are more:
You have added a finally block to close out. That is wrong. You don't want addDirToArchive to close out. That's the likely cause of your exceptions.
There are a couple of problems that happen after you have done this:
if (currentZipFileSize >= EMAIL_SIZE)
{
    out.close();
    out = new ZipOutputStream(new FileOutputStream(...));
    currentZipFileSize = 0;
}
Since out is a parameter (a local variable), the caller does not see the reassignment you make. Therefore:
when you call out.close() in the caller, you could be closing the original ZIP (already closed) ... not the current one
if you called addDirToArchive(out, destinationDir, dirName) multiple times, subsequent calls could be passing a closed ZIP file.
Your exception handling is misguided (IMO). If there is an I/O error writing a file to the ZIP, you do NOT want to log a message and keep going. You want to bail out: either crash the app entirely, or stop doing what you are doing. In this case, the "stream is closed" exception is clearly a bug in your code, and your exception handling is effectively telling the app to ignore it.
Some advice:
If you are splitting the responsibility for opening and closing resources across multiple methods you need to be VERY careful about what code has responsibility for closing what. You need to understand what you are doing.
Blindly applying (so-called) "solutions" (like the finally stuff) ... 'cos someone says "XXX is best practice" or "always do XXX" ... is going to get you into trouble. You need to 1) understand what the "solution" does, and 2) think about whether the solution actually does what you need.
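One way to make a single piece of code responsible for opening and closing the zips (so no parameter reassignment is needed) is a small holder object. This is a hypothetical sketch, not the OP's code: the class name RollingZipWriter is made up, it uses java.nio.file instead of the Android sdCard paths, and only the "nwdLogs_" naming is borrowed from the question for illustration.

```java
import java.io.Closeable;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

// A holder object owns the "current" ZipOutputStream, so rolling over to a
// new file is visible to every caller -- unlike reassigning a method parameter.
class RollingZipWriter implements Closeable {
    private final Path outDir;
    private final long maxBytes;   // threshold before rolling to a new zip
    private int index = 0;
    private long written = 0;
    private ZipOutputStream out;

    RollingZipWriter(Path outDir, long maxBytes) throws IOException {
        this.outDir = outDir;
        this.maxBytes = maxBytes;
        openNext();
    }

    private void openNext() throws IOException {
        // "nwdLogs_" mirrors the question's naming; any scheme works.
        out = new ZipOutputStream(Files.newOutputStream(outDir.resolve("nwdLogs_" + index++ + ".zip")));
        written = 0;
    }

    void addFile(Path file) throws IOException {
        if (written >= maxBytes) {
            out.close();   // closing writes the central directory, so the zip stays valid
            openNext();
        }
        ZipEntry ze = new ZipEntry(file.getFileName().toString());
        out.putNextEntry(ze);
        Files.copy(file, out);
        out.closeEntry();
        written += ze.getCompressedSize();   // known only after closeEntry()
    }

    @Override
    public void close() throws IOException {
        out.close();
    }
}
```

The caller simply loops over files and calls addFile; the writer decides internally when to roll over, and close() always closes the latest zip, not a stale one.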
Related
Sample code is below. It copies the target files and directories from one location to another. What's considered best practice for handling IOExceptions while copying files across a network?
I used printStackTrace() but feel like this is just a placeholder for a better solution. Is logging the answer, and should there be another step beyond logging to actually "handle" an error?
Thank you for your feedback.
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

/**
 * This is a test program to copy a directory(s) & file(s) from one location to another.
 */
public class CopyTest {
    public static void main(String[] args) {
        // Declarations
        String sourcePath = "I:\\MB\\PO";
        String destPath = "C:\\testPO\\";
        System.out.println("Source path: " + sourcePath);
        System.out.println("Destination path: " + destPath);
        File source = new File(sourcePath);
        File dest = new File(destPath);
        // Process
        // Call to method copyUsingStream
        long start = System.nanoTime(); // start recording how much time the copy takes
        copyUsingStream(source, dest);  // method to copy the directory/files
        System.out.println("Time taken to copy the file: " + (System.nanoTime() - start) + " nanoseconds");
    } // end main method

    /**
     * The copyUsingStream method is a recursive method to copy folders and files from one location to another.
     */
    private static void copyUsingStream(File source, File dest) {
        if (!source.isDirectory()) {
            // If source is a file -> copy it to the new folder
            InputStream inStream = null;
            OutputStream outStream = null;
            try {
                inStream = new FileInputStream(source);
                outStream = new FileOutputStream(dest);
                byte[] buffer = new byte[1024];
                int length;
                while ((length = inStream.read(buffer)) > 0) {
                    outStream.write(buffer, 0, length);
                }
            } catch (IOException ioe) {
                ioe.printStackTrace();
            } finally {
                try {
                    inStream.close();
                    outStream.close();
                    System.out.println("File copied from " + source + " to " + dest + " successfully");
                } catch (IOException ioe2) {
                    ioe2.printStackTrace();
                }
            }
        } else {
            // If a directory -> create the directory inside the new destination
            // List all contents
            if (!dest.exists()) {
                dest.mkdir();
                System.out.println("Directory copied from " + source + " to " + dest + " successfully");
            }
            String folder_contents[] = source.list();
            for (String file : folder_contents) {
                File srcFile = new File(source, file);
                File destFile = new File(dest, file);
                copyUsingStream(srcFile, destFile);
            }
        }
    } // end method copyUsingStream
} // end class CopyTest
Method without the catches:
private static void copyUsingStream(File source, File dest) throws IOException {
    if (!source.isDirectory()) {
        // If source is a file -> copy it to the new folder
        InputStream inStream = null;
        OutputStream outStream = null;
        try {
            inStream = new FileInputStream(source);
            outStream = new FileOutputStream(dest);
            byte[] buffer = new byte[1024];
            int length;
            while ((length = inStream.read(buffer)) > 0) {
                outStream.write(buffer, 0, length);
            }
        } finally {
            inStream.close();
            outStream.close();
            System.out.println("File copied from " + source + " to " + dest + " successfully");
        }
    } else {
        // If a directory -> create the directory inside the new destination
        // List all contents
        if (!dest.exists()) {
            dest.mkdir();
            System.out.println("Directory copied from " + source + " to " + dest + " successfully");
        }
        String folder_contents[] = source.list();
        for (String file : folder_contents) {
            File srcFile = new File(source, file);
            File destFile = new File(dest, file);
            copyUsingStream(srcFile, destFile);
        }
    }
} // end method copyUsingStream
That depends highly on your application.
Applications that continue running anyway (e.g. web servers, daemons and batch processors) do usually log such errors in a file together with timestamp, thread ID and possibly other helpful information.
I had very good experience with a combination of two log files.
myapp.log receives only important messages, usually warnings and errors. This file is for the regular user and system operator.
debug.log is for the developer. It provides debug messages from the time before an error occurred, but no messages as long as everything works fine. To enable this, a memory buffer is required.
If you are interested in that buffer, you may take a look at http://stefanfrings.de/bfUtilities/index.html . The website is in German, but the library and its documentation are in English.
On a desktop GUI application, when an error aborts the requested operation, it might be nice to show a short error message in a popup window and hide the details (stack trace) in an expandable box. Don't forget to tell the user clearly what operation failed. The exception itself may be clear enough to you, the developer, but regular users expect a less technical text. For example: "Loading weather info from service weather.com failed: Connection failed", followed by the stack trace.
For console applications that stop immediately, I prefer to see the stack trace directly on screen, as written by printStackTrace().
As Stefan said, it depends on the application.
A good rule of thumb is: Don’t catch an exception unless you are prepared to take a specific action, beyond printing or logging it, or there is no caller to whom you can propagate it.
If you have a general method for copying a file, that method should not make assumptions about why it’s being called. Its job is to copy a file. It should return only if it succeeds in that task. If it does not succeed, it should throw an exception rather than returning.
So for a general copying method, you would want to add throws IOException to the method signature, and have zero try/catch blocks in the method itself. This lets the callers decide how to handle a failure. A GUI application might display an error dialog. A service might just log the exception and try again later.
You yourself should only catch and log an exception at the highest possible level. A GUI application would log it right before displaying the error dialog. (You might also want to include the text of the stack trace in the dialog, in an expandable “Show details” section.) A service might have a main loop or main execution method, where there is no higher caller to whom the exception can be propagated, so there is nothing to do but log it.
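To make that division of responsibility concrete, here is a minimal sketch (the class and method names are made up for illustration): the general-purpose copy method declares throws IOException and contains no try/catch, and only the top-level caller catches and reports.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

class CopyPolicyDemo {
    // General-purpose method: it either succeeds or throws.
    // It makes no policy decision about what a failure means.
    static void copyFile(Path source, Path dest) throws IOException {
        Files.copy(source, dest, StandardCopyOption.REPLACE_EXISTING);
    }

    // Top level: the only place that catches, because there is
    // no higher caller to propagate the exception to.
    static boolean tryCopyAndReport(Path source, Path dest) {
        try {
            copyFile(source, dest);
            return true;
        } catch (IOException e) {
            // A GUI app would show a dialog here; a service would log and retry.
            System.err.println("Copy failed: " + source + " -> " + dest + ": " + e);
            return false;
        }
    }
}
```

A GUI application would put the dialog where the System.err line is; a batch service would log and schedule a retry. The low-level method stays reusable either way.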
I have created an application which shall extract single files from a tar archive. The application reads the *.tar properly, but when I try to extract files, it just creates new files with the correct filename... The files are empty (0 KB). So... I probably just create new files instead of extracting...
I'm a total beginner at this point...
for (TarArchiveEntry tae : tarEntries) {
    System.out.println(tarEntries.size());
    try {
        fOutput = new FileOutputStream(new File(tae.getFile(), tae.getName()));
        byte[] buf = new byte[(int) tae.getSize()];
        int len;
        while ((len = tarFile.read(buf)) > 0) {
            fOutput.write(buf, 0, len);
        }
        fOutput.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Assuming tarFile is a TarArchiveInputStream, you can only read an entry's content right after calling tarFile.getNextTarEntry().
The stream is processed sequentially, so when you invoke getNextTarEntry you skip over the content of the current entry right to the next entry. It looks as if you had read the whole archive in order to fill tarEntries in which case you've already read past the last entry and the stream is exhausted.
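The same sequential contract is easy to demonstrate with java.util.zip, whose ZipInputStream behaves like Commons Compress's TarArchiveInputStream in this respect: you must consume an entry's bytes before advancing to the next entry. A minimal sketch (the class and method names are illustrative):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

class SequentialExtract {
    // Reads every entry of a zip stream into memory, in archive order.
    static Map<String, byte[]> readAll(InputStream archive) throws IOException {
        Map<String, byte[]> contents = new LinkedHashMap<>();
        try (ZipInputStream zin = new ZipInputStream(archive)) {
            ZipEntry entry;
            while ((entry = zin.getNextEntry()) != null) {
                // Consume THIS entry's bytes now; calling getNextEntry()
                // first would skip past them and leave the files empty.
                ByteArrayOutputStream bos = new ByteArrayOutputStream();
                byte[] buf = new byte[4096];
                int len;
                while ((len = zin.read(buf)) != -1) {
                    bos.write(buf, 0, len);
                }
                contents.put(entry.getName(), bos.toByteArray());
            }
        }
        return contents;
    }
}
```

If you first iterate the whole archive to collect entries and only then try to read content, the stream is already exhausted, which matches the 0 KB files in the question.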
I have a directory where I programmatically (in Java) do recursive unzipping (which seems to work), but in the end I'm left with a directory that has a lot of subdirectories and files. Every time I run this method I want to start with a clean slate, so I always delete the folder and its left-over files and subdirectories present in the temp directory.
root = new File(System.getProperty("java.io.tmpdir") + File.separator + "ProductionTXOnlineCompletionDataPreProcessorRoot");
if (root.exists()) {
    try {
        FileUtils.deleteDirectory(root);
    } catch (IOException e) {
        e.printStackTrace();
    }
}
if (root.mkdir()) {
    rawFile = createRawDataFile();
}
I'm getting a really strange error from FileUtils.deleteDirectory though.
14:55:27,214 ERROR [stderr] (Thread-3 (HornetQ-client-global-threads-2098205981)) java.io.IOException: Unable to delete directory C:\Users\Admin\AppData\Local\Temp\ProductionTXOnlineCompletionDataPreProcessorRoot\ProductionTXOnlineCompletionDataPreProcessor8718674704286818303.
It seems to think that I have a period at the end of my directory (it doesn't, so it's no surprise that it can't delete it). Sometimes this error appears on folders in the subdirectory. Has anyone seen this before?
I'm using the Commons IO 2.4 jar.
EDIT I've confirmed that the directories do not have periods, so unless they're invisible, I don't know why the method would think that there are periods. And the File's path that I give the method is set right before feeding it as an argument, and as anyone can see - it doesn't have a period at the end.
I'm running the program on Windows 7.
EDIT This is the code I used for recursively unzipping:
private void extractFolder(String zipFile) throws IOException
{
    int BUFFER = 2048;
    File file = new File(zipFile);
    ZipFile zip = null;
    String newPath = zipFile.substring(0, zipFile.length() - 4);
    BufferedOutputStream dest = null;
    BufferedInputStream is = null;
    try {
        zip = new ZipFile(zipFile);
        Enumeration<? extends ZipEntry> zipFileEntries = zip.entries();
        while (zipFileEntries.hasMoreElements())
        {
            ZipEntry entry = (ZipEntry) zipFileEntries.nextElement();
            String currentEntry = entry.getName();
            File destFile = new File(newPath, currentEntry);
            File destinationParent = destFile.getParentFile();
            destinationParent.mkdirs();
            if (!entry.isDirectory())
            {
                is = new BufferedInputStream(zip.getInputStream(entry));
                int currentByte;
                byte data[] = new byte[BUFFER];
                FileOutputStream fos = new FileOutputStream(destFile);
                dest = new BufferedOutputStream(fos, BUFFER);
                // read and write until last byte is encountered
                while ((currentByte = is.read(data, 0, BUFFER)) != -1) {
                    dest.write(data, 0, currentByte);
                }
                dest.flush();
            }
            if (currentEntry.endsWith(".zip")) {
                // found a zip file, try to open
                extractFolder(destFile.getAbsolutePath());
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        if (dest != null) { dest.close(); }
        if (is != null) { is.close(); }
        zip.close();
    }
}
I put the original zip into the root directory, and recursively unzip from there.
This is the relevant code showing that:
downloadInputStreamToFileInRootDir(in, rawFile);
try {
    extractFolder(rawFile.getCanonicalPath());
} catch (IOException e) {
    e.printStackTrace();
} catch (Exception e) {
}
I just noticed that I use rawFile.getCanonicalPath() (rawFile is set in the first code excerpt) as the argument for extractFolder initially and then switch to destFile.getAbsolutePath() ... Maybe that has something to do with it. The problem with testing this is that the issue isn't deterministic; it sometimes happens and sometimes doesn't.
The period is part of the error message. It's not trying to delete a file path with a period at the end. See the FileUtils source:
if (!directory.delete()) {
    final String message =
        "Unable to delete directory " + directory + ".";
    throw new IOException(message);
}
I get some very odd errors when using org.apache.commons.compress to read embedded archive files, and I suspect it's my inexperience that is haunting me.
When running my code I get a variety of truncated zip file errors (along with other truncated-file errors). I suspect it's my use of ArchiveInputStream:
private final void handleArchive(String fileName, ArchiveInputStream ais) {
    ArchiveEntry archiveEntry = null;
    try {
        while ((archiveEntry = ais.getNextEntry()) != null) {
            byte[] buffer = new byte[1024];
            while (ais.read(buffer) != -1) {
                handleFile(fileName + "/" + archiveEntry.getName(), archiveEntry.getSize(), new ByteArrayInputStream(buffer));
            }
        }
    } catch (IOException ioe) {
        ioe.printStackTrace();
    }
}
When I do this, does archiveEntry = ais.getNextEntry() effectively close my ais? And is there any way to read the bytes of embedded archive files using Commons Compress?
You're doing some weird stuff, it seems. For each archive entry, while you're reading the archive, you're recursively calling your archive-reading method, which results in opening the next archive while your parent code is still handling the previous one.
You should loop entirely through an archive entry before handling any new archive entry in your compressed file. Something like
ArArchiveEntry entry = (ArArchiveEntry) arInput.getNextEntry();
byte[] content = new byte[entry.getSize()];
LOOP UNTIL entry.getSize() HAS BEEN READ {
    arInput.read(content, offset, content.length - offset);
}
as stated in the examples on the apache site
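The "loop until the size has been read" step matters because a single read() call may return fewer bytes than requested. A sketch of such a read-fully helper in plain Java (the class and method names are made up for illustration):

```java
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

class ReadFully {
    // Accumulates exactly `size` bytes from the stream, tolerating
    // short reads; throws if the stream ends early.
    static byte[] readFully(InputStream in, int size) throws IOException {
        byte[] content = new byte[size];
        int offset = 0;
        while (offset < size) {
            int n = in.read(content, offset, size - offset);
            if (n == -1) {
                throw new EOFException("Stream ended after " + offset + " of " + size + " bytes");
            }
            offset += n;
        }
        return content;
    }
}
```

With an entry's bytes fully buffered like this, any recursive handling of an embedded archive can work on the byte array without disturbing the outer stream's position.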
I have the following situation: within a servlet I create a file and then have to delete it.
When executing the code, I found that the file is still on the server, so I tried to remove it manually. I can't; I get the following message:
This file is opened by another program: javaw.exe
Here is my code :
public class GenerateFile extends Action {
    public ActionForward execute(ActionMapping mapping, ActionForm form,
            HttpServletRequest request, HttpServletResponse response) throws IOException {
        System.out.println("ok");
        String fileName = request.getParameter("fileName");
        Integer nbrParam = Integer.parseInt(request.getParameter("nbrParam"));
        String[] valueParam = new String[nbrParam + 1];
        for (int i = 1; i <= nbrParam; i++) {
            System.out.println(request.getParameter("param" + i));
            valueParam[i] = request.getParameter("param" + i);
        }
        FileInputStream in = new FileInputStream("C:\\Users\\free\\Desktop\\myworkspace\\gestionRH\\WebRoot\\fiches\\" + fileName + ".doc");
        POIFSFileSystem fs = new POIFSFileSystem(in);
        HWPFDocument doc = new HWPFDocument(fs);
        Range r = doc.getRange();
        for (int i = 1; i <= nbrParam; i++) {
            System.out.println("<param" + i + ">");
            System.out.println(valueParam[i]);
            r.replaceText("<param" + i + ">", valueParam[i]);
        }
        File file = new File("C:\\Users\\free\\Desktop\\myworkspace\\gestionRH\\WebRoot\\fiches\\temp");
        File temp = File.createTempFile("monfile", ".doc", file);
        String tempName = temp.getName();
        doc.write(new FileOutputStream(temp));
        OutputStream out = response.getOutputStream();
        response.setContentType("application/rtf");
        response.setHeader("Content-Disposition", "attachment; filename=Decision");
        FileInputStream in1 = new FileInputStream(temp);
        byte[] buffer = new byte[4096];
        int length;
        while ((length = in1.read(buffer)) > 0) {
            out.write(buffer, 0, length);
        }
        in1.close();
        out.flush();
        System.out.println("C:\\Users\\free\\Desktop\\myworkspace\\gestionRH\\WebRoot\\fiches\\temp\\" + tempName);
        File f = new File("C:\\Users\\free\\Desktop\\myworkspace\\gestionRH\\WebRoot\\fiches\\temp\\" + tempName);
        f.delete();
        return null;
    }
}
You should close all the file-reading object instances. Besides, if you want to delete the file manually, you should close Java first and then delete it; javaw is the process that launches Java outside the console.
The problem is that you are creating a new FileOutputStream(temp) to write to that file, but never closing that output stream (or another output stream wrapped around it).
Do this:
FileOutputStream fos = new FileOutputStream(tempName);
// use it
fos.close(); // CLOSE IT!!
// then you can delete the file
Simplify
Maybe you could do the work another way, without temp files.
For example, doc.write(new FileOutputStream(tempName)) could be replaced by:
doc.write(response.getOutputStream());
This way doc sends its bytes directly to where you need them, not to a temp file, eliminating the need for it.
The idea behind input/output streams is composing them. InputStream and OutputStream are the abstract base classes, and there are a lot of implementations:
based on memory: ByteArrayInputStream/ByteArrayOutputStream
based on files: FileInputStream/FileOutputStream
compressing/decompressing to another stream: GZIPInputStream/GZIPOutputStream
and so on
The beauty of it is applying the decorator pattern to add functionality. For example:
new GZipOutputStream(new ByteArrayOutputStream());
// creates an outputstreams that compress data received and send it to the other stream
// the BAOS then writes the received bytes to memory
new GZipOutputStream(new FileOutputStream());
// it's the same but sending compressed bytes to a file.
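That composition can be shown end to end with standard-library classes. A minimal round-trip sketch (the class and method names are illustrative): the GZIP decorator compresses into an in-memory sink, and the mirror-image decorator decompresses back out.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

class StreamComposition {
    static byte[] gzip(byte[] plain) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        // GZIPOutputStream decorates the memory sink: bytes written here
        // are compressed and passed through to `sink`.
        try (GZIPOutputStream gz = new GZIPOutputStream(sink)) {
            gz.write(plain);
        }
        return sink.toByteArray();
    }

    static byte[] gunzip(byte[] compressed) throws IOException {
        // Same composition in reverse: decompress while reading from memory.
        try (GZIPInputStream gz = new GZIPInputStream(new ByteArrayInputStream(compressed))) {
            return gz.readAllBytes();
        }
    }
}
```

Swapping ByteArrayOutputStream for FileOutputStream changes only the destination; the compression decorator stays the same, which is exactly the point of the pattern.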
It seems you are not closing the file (out), so it remains held by this action's thread, which prevents it from being deleted.
Hope it helps.
Maybe you should try ProcMon to find out which process exactly holds the file open.
For IO features, I would suggest using libraries already provided by the community, for example commons-io-x.x.jar or spring-core.jar.
E.g. org.apache.commons.io.FileUtils:
FileUtils.copyDirectory(from, to);
FileUtils.deleteDirectory(childDir);
FileUtils.forceDelete(springConfigDir);
FileUtils.writeByteArrayToFile(file, data);
org.springframework.util.FileSystemUtils;
FileSystemUtils.copyRecursively(from, to);
FileSystemUtils.deleteRecursively(dir);
good luck!
Whenever you open a file handler, you should close it. In a Java application that you want to run for a long period of time, you are strongly recommended to close all unused file handlers as soon as you finish working with them.
Examples of common file handlers are FileOutputStream and FileInputStream. Here is a good example of how you open and close a FileOutputStream:
FileOutputStream fos = null;
try {
    fos = new FileOutputStream(tempName);
    // do something
} catch (IOException ex) {
    // deal with exceptions
} finally {
    // close if fos is not null
    if (fos != null) {
        fos.close();
    }
}
You should never do this:
doc.write(new FileOutputStream(temp));
because you can never close the file handler if you have no reference to it.
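On Java 7 and later, try-with-resources expresses the same rule with less ceremony: the stream is closed automatically whether the write succeeds or throws, so the handle cannot leak and the delete cannot be blocked. A small illustrative sketch (the class and method names are made up):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

class WriteThenDelete {
    static void writeAndDelete(File temp, byte[] data) throws IOException {
        // The stream is closed automatically at the end of this block,
        // on both the success and the exception path.
        try (FileOutputStream fos = new FileOutputStream(temp)) {
            fos.write(data);
        }
        // By here no handle is open, so the delete can succeed.
        if (!temp.delete()) {
            throw new IOException("Could not delete " + temp);
        }
    }
}
```

This removes the need for the null check and the nested close-in-finally entirely, because the compiler generates the equivalent cleanup code.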