Get size of folder or file - java

How can I retrieve the size of a folder or a file in Java?

java.io.File file = new java.io.File("myfile.txt");
file.length();
This returns the length of the file in bytes, or 0 if the file does not exist. There is no built-in way to get the size of a folder; you will have to walk the directory tree recursively (using the listFiles() method of a File object that represents a directory) and accumulate the size yourself:
public static long folderSize(File directory) {
    long length = 0;
    for (File file : directory.listFiles()) {
        if (file.isFile())
            length += file.length();
        else
            length += folderSize(file);
    }
    return length;
}
WARNING: This method is not sufficiently robust for production use. directory.listFiles() may return null and cause a NullPointerException. It also doesn't consider symlinks and may have other failure modes. Prefer the more robust NIO-based approach below.

Using the Java 7 NIO API, calculating the folder size can be done a lot more quickly.
Here is a ready-to-run example that is robust and won't throw an exception. It logs directories it can't enter or had trouble traversing. Symlinks are ignored, and concurrent modification of the directory won't cause more trouble than necessary.
/**
 * Attempts to calculate the size of a file or directory.
 *
 * <p>
 * Since the operation is non-atomic, the returned value may be inaccurate.
 * However, this method is quick and does its best.
 */
public static long size(Path path) {
    final AtomicLong size = new AtomicLong(0);
    try {
        Files.walkFileTree(path, new SimpleFileVisitor<Path>() {
            @Override
            public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) {
                size.addAndGet(attrs.size());
                return FileVisitResult.CONTINUE;
            }

            @Override
            public FileVisitResult visitFileFailed(Path file, IOException exc) {
                System.out.println("skipped: " + file + " (" + exc + ")");
                // Skip folders that can't be traversed
                return FileVisitResult.CONTINUE;
            }

            @Override
            public FileVisitResult postVisitDirectory(Path dir, IOException exc) {
                if (exc != null)
                    System.out.println("had trouble traversing: " + dir + " (" + exc + ")");
                // Ignore errors traversing a folder
                return FileVisitResult.CONTINUE;
            }
        });
    } catch (IOException e) {
        throw new AssertionError("walkFileTree will not throw IOException if the FileVisitor does not");
    }
    return size.get();
}

You need FileUtils#sizeOfDirectory(File) from commons-io.
Note that you will need to manually check whether the file is a directory as the method throws an exception if a non-directory is passed to it.
WARNING: This method (as of commons-io 2.4) has a bug and may throw IllegalArgumentException if the directory is concurrently modified.
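For example, a minimal sketch of that guard (the path here is hypothetical; note that FileUtils.sizeOf also exists and performs this dispatch itself):
File target = new File("/some/path"); // hypothetical path
long bytes = target.isDirectory()
        ? FileUtils.sizeOfDirectory(target)
        : target.length();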

In Java 8:
long size = Files.walk(path).mapToLong( p -> p.toFile().length() ).sum();
It would be nicer to use Files::size in the map step but it throws a checked exception.
UPDATE:
You should also be aware that this can throw an exception if some of the files/folders are not accessible. A Guava-based alternative is shown further down.
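A sketch of one way around both caveats, wrapping the walk in try-with-resources and treating unreadable entries as empty (assuming the enclosing method declares throws IOException for the initial Files.walk call):
long size;
try (Stream<Path> walk = Files.walk(path)) {
    size = walk.filter(Files::isRegularFile)
            .mapToLong(p -> {
                try {
                    return Files.size(p);
                } catch (IOException e) {
                    return 0L; // treat unreadable entries as empty
                }
            })
            .sum();
}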

public static long getFolderSize(File dir) {
    long size = 0;
    for (File file : dir.listFiles()) {
        if (file.isFile()) {
            System.out.println(file.getName() + " " + file.length());
            size += file.length();
        } else {
            size += getFolderSize(file);
        }
    }
    return size;
}

For Java 8 this is one right way to do it:
Files.walk(new File("D:/temp").toPath())
    .map(f -> f.toFile())
    .filter(f -> f.isFile())
    .mapToLong(f -> f.length())
    .sum()
It is important to filter out all directories, because the length method isn't guaranteed to return 0 for directories.
At least this code delivers the same size information as Windows Explorer itself does.

Here's the best way to get a general File's size (works for directory and non-directory):
public static long getSize(File file) {
    long size;
    if (file.isDirectory()) {
        size = 0;
        for (File child : file.listFiles()) {
            size += getSize(child);
        }
    } else {
        size = file.length();
    }
    return size;
}
Edit: Note that this is probably going to be a time-consuming operation. Don't run it on the UI thread.
Also, here (taken from https://stackoverflow.com/a/5599842/1696171) is a nice way to get a user-readable String from the long returned:
public static String getReadableSize(long size) {
    if (size <= 0) return "0";
    final String[] units = new String[] { "B", "KB", "MB", "GB", "TB" };
    int digitGroups = (int) (Math.log10(size) / Math.log10(1024));
    return new DecimalFormat("#,##0.#").format(size / Math.pow(1024, digitGroups))
            + " " + units[digitGroups];
}

File.length() (Javadoc).
Note that this is not guaranteed to work for directories; the return value is unspecified in that case.
For a directory, what do you want? If it's the total size of all files underneath it, you can recursively walk children using File.list() and File.isDirectory() and sum their sizes.

The File object has a length method:
File f = new File("your/file/name");
f.length();

If you want to use Java 8 NIO API, the following program will print the size, in bytes, of the directory it is located in.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Stream;

public class PathSize {

    public static void main(String[] args) {
        Path path = Paths.get(".");
        long size = calculateSize(path);
        System.out.println(size);
    }

    /**
     * Returns the size, in bytes, of the specified <tt>path</tt>. If the given
     * path is a regular file, trivially its size is returned. Else the path is
     * a directory and its contents are recursively explored, returning the
     * total sum of all files within the directory.
     * <p>
     * If an I/O exception occurs, it is suppressed within this method and
     * <tt>0</tt> is returned as the size of the specified <tt>path</tt>.
     *
     * @param path path whose size is to be returned
     * @return size of the specified path
     */
    public static long calculateSize(Path path) {
        try {
            if (Files.isRegularFile(path)) {
                return Files.size(path);
            }
            // Close the directory stream to avoid leaking file handles
            try (Stream<Path> entries = Files.list(path)) {
                return entries.mapToLong(PathSize::calculateSize).sum();
            }
        } catch (IOException e) {
            return 0L;
        }
    }
}
The calculateSize method is universal for Path objects, so it also works for plain files.
Note that if a file or directory is inaccessible, the returned size for that path will be 0.

Works for Android and Java
Works for both folders and files
Checks for null pointers wherever needed
Ignores symbolic links (a.k.a. shortcuts)
Production ready!
Source code:
public long fileSize(File root) {
    if (root == null) {
        return 0;
    }
    if (root.isFile()) {
        return root.length();
    }
    try {
        if (isSymlink(root)) {
            return 0;
        }
    } catch (IOException e) {
        e.printStackTrace();
        return 0;
    }
    long length = 0;
    File[] files = root.listFiles();
    if (files == null) {
        return 0;
    }
    for (File file : files) {
        length += fileSize(file);
    }
    return length;
}
private static boolean isSymlink(File file) throws IOException {
    File canon;
    if (file.getParent() == null) {
        canon = file;
    } else {
        File canonDir = file.getParentFile().getCanonicalFile();
        canon = new File(canonDir, file.getName());
    }
    return !canon.getCanonicalFile().equals(canon.getAbsoluteFile());
}

I've tested du -c <folderpath> and it is 2x faster than nio.Files or recursion:
private static long getFolderSize(File folder) {
    if (folder != null && folder.exists() && folder.canRead()) {
        try {
            // Requires the du utility (Linux/macOS)
            Process p = new ProcessBuilder("du", "-c", folder.getAbsolutePath()).start();
            BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()));
            String total = "";
            for (String line; null != (line = r.readLine());)
                total = line;
            r.close();
            p.waitFor();
            if (total.length() > 0 && total.endsWith("total"))
                return Long.parseLong(total.split("\\s+")[0]) * 1024;
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }
    return -1;
}

For Windows, using java.io, this recursive function is useful:
public static long folderSize(File directory) {
    long length = 0;
    if (directory.isFile()) {
        length += directory.length();
    } else {
        for (File file : directory.listFiles()) {
            if (file.isFile())
                length += file.length();
            else
                length += folderSize(file);
        }
    }
    return length;
}
This is tested and working properly on my end.

private static long getFolderSize(Path folder) {
    // try-with-resources closes the stream; the original also lacked the method's closing brace
    try (Stream<Path> walk = Files.walk(folder)) {
        return walk.filter(p -> p.toFile().isFile())
                .mapToLong(p -> p.toFile().length())
                .sum();
    } catch (IOException e) {
        e.printStackTrace();
        return 0L;
    }
}

public long folderSize(String directory) {
    File curDir = new File(directory);
    long totalLength = 0;
    for (File f : curDir.listFiles()) {
        if (f.isDirectory()) {
            long length = 0;
            for (File child : f.listFiles()) {
                length += child.length();
            }
            System.out.println("Directory: " + f.getName() + " " + length + " bytes");
            totalLength += length;
        } else {
            long length = f.length();
            System.out.println("File: " + f.getName() + " " + length + " bytes");
            totalLength += length;
        }
    }
    return totalLength;
}

After a lot of research and looking into the different solutions proposed here on StackOverflow, I finally decided to write my own solution. My purpose is to have a no-throw mechanism, because I don't want to crash if the API is unable to fetch the folder size. This method is not suitable for multi-threaded scenarios.
First of all, I want to check for valid directories while traversing down the file system tree.
private static boolean isValidDir(File dir) {
    return dir != null && dir.exists() && dir.isDirectory();
}
Second I do not want my recursive call to go into symlinks (softlinks) and include the size in total aggregate.
public static boolean isSymlink(File file) throws IOException {
    File canon;
    if (file.getParent() == null) {
        canon = file;
    } else {
        canon = new File(file.getParentFile().getCanonicalFile(),
                file.getName());
    }
    return !canon.getCanonicalFile().equals(canon.getAbsoluteFile());
}
Finally, my recursion-based implementation to fetch the size of the specified directory. Notice the null check for dir.listFiles(); according to the javadoc, this method can return null.
public static long getDirSize(File dir) {
    if (!isValidDir(dir))
        return 0L;
    File[] files = dir.listFiles();
    // Guard for null pointer exception on files
    if (files == null) {
        return 0L;
    }
    long size = 0L;
    for (File file : files) {
        if (file.isFile()) {
            size += file.length();
        } else {
            try {
                if (!isSymlink(file)) size += getDirSize(file);
            } catch (IOException ioe) {
                // digest exception
            }
        }
    }
    return size;
}
As icing on the cake, an API to get the total size of a list of files (which might be all of the files and folders under a root):
public static long getDirSize(List<File> files) {
    long size = 0L;
    for (File file : files) {
        if (file.isDirectory()) {
            size += getDirSize(file);
        } else {
            size += file.length();
        }
    }
    return size;
}

On Linux, if you want to sort directories by size: du -hs * | sort -h

You can use Apache Commons IO to find the folder size easily.
If you are on Maven, please add the following dependency to your pom.xml file.
<!-- https://mvnrepository.com/artifact/commons-io/commons-io -->
<dependency>
    <groupId>commons-io</groupId>
    <artifactId>commons-io</artifactId>
    <version>2.6</version>
</dependency>
If you're not a fan of Maven, download the following jar and add it to the classpath:
https://repo1.maven.org/maven2/commons-io/commons-io/2.6/commons-io-2.6.jar
public long getFolderSize() {
    File folder = new File("src/test/resources");
    long size = FileUtils.sizeOfDirectory(folder);
    return size; // in bytes
}
To get a file's size via Commons IO:
File file = new File("ADD YOUR PATH TO FILE");
long fileSize = FileUtils.sizeOf(file);
System.out.println(fileSize); // bytes
It is also achievable via Google Guava
For Maven, add the following:
<!-- https://mvnrepository.com/artifact/com.google.guava/guava -->
<dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>28.1-jre</version>
</dependency>
If not using Maven, add the following to the classpath:
https://repo1.maven.org/maven2/com/google/guava/guava/28.1-jre/guava-28.1-jre.jar
public long getFolderSizeViaGuava() {
    File folder = new File("src/test/resources");
    // Files.fileTraverser() replaces the older fileTreeTraverser() API,
    // which was removed from Guava before 28.1
    Iterable<File> files = Files.fileTraverser().breadthFirst(folder);
    long size = StreamSupport.stream(files.spliterator(), false)
            .filter(f -> f.isFile())
            .mapToLong(File::length).sum();
    return size;
}
To get a file's size:
File file = new File("PATH TO YOUR FILE");
long s = file.length();
System.out.println(s);

fun getSize(context: Context, uri: Uri?): Float? {
    var fileSize: String? = null
    val cursor: Cursor? = context.contentResolver
        .query(uri!!, null, null, null, null, null)
    try {
        if (cursor != null && cursor.moveToFirst()) {
            // get file size
            val sizeIndex: Int = cursor.getColumnIndex(OpenableColumns.SIZE)
            if (!cursor.isNull(sizeIndex)) {
                fileSize = cursor.getString(sizeIndex)
            }
        }
    } finally {
        cursor?.close()
    }
    // safe call avoids the NPE that !! would throw when no size was found
    return fileSize?.toFloat()?.div(1024 * 1024)
}

Related

How to find specific directory and its files according to the keyword passed in java and loading in memory approach

I have a project structure like below:
Now, my problem statement is: I have to iterate over the resources folder and, given a key, find that specific folder and its files.
For that, I have written the below code with a recursive approach, but I am not getting the output as intended:
public class ConfigFileReader {

    public static void main(String[] args) throws Exception {
        System.out.println("Print L");
        String path = "C:\\...\\ConfigFileReader\\src\\resources\\";
        //FileReader reader = new FileReader(path + "\\Encounter\\Encounter.properties");
        //Properties p = new Properties();
        //p.load(reader);
        File[] files = new File(path).listFiles();
        String resourceType = "Encounter";
        System.out.println(navigateDirectoriesAndFindTheFile(resourceType, files));
    }

    public static String navigateDirectoriesAndFindTheFile(String inputResourceString, File[] files) {
        String entirePathOfTheIntendedFile = "";
        for (File file : files) {
            if (file.isDirectory()) {
                navigateDirectoriesAndFindTheFile(inputResourceString, file.listFiles());
                System.out.println("Directory: " + file.getName());
                if (file.getName().startsWith(inputResourceString)) {
                    entirePathOfTheIntendedFile = file.getPath();
                }
            } else {
                System.out.print("Inside...");
                entirePathOfTheIntendedFile = file.getPath();
            }
        }
        return entirePathOfTheIntendedFile;
    }
}
Output:
The output should return C:\....\Encounter\Encounter.properties as the path.
First of all, if it finds the string while traversing, it should return the file inside that folder without navigating further. Also, what is the best way to iterate over, say, 1k files? I can't follow this method every time because it doesn't seem an effective way of doing it. So, how can I use an in-memory approach for this problem? Please guide me through it.
You will need to check the output of the recursive call and pass that back when a match is found.
Always use File or Path to handle filenames.
Assuming that I've understood the logic of the search, try this, which scans for files of the form XXX\XXXyyyy:
public class ConfigReader {

    public static void main(String[] args) throws Exception {
        System.out.println("Print L");
        File path = new File(args[0]).getAbsoluteFile();
        String resourceType = "Encounter";
        System.out.println(navigateDirectoriesAndFindTheFile(resourceType, path));
    }

    public static File navigateDirectoriesAndFindTheFile(String inputResourceString, File path) {
        File[] files = path.listFiles();
        File found = null;
        for (int i = 0; found == null && files != null && i < files.length; i++) {
            File file = files[i];
            if (file.isDirectory()) {
                found = navigateDirectoriesAndFindTheFile(inputResourceString, file);
            } else if (file.getName().startsWith(inputResourceString) && file.getParentFile().getName().equals(inputResourceString)) {
                found = file;
            }
        }
        return found;
    }
}
If this is slow, especially for 1K of files, rewrite it with Files.walkFileTree, which would be much faster than File.list() in recursion.
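A sketch of such a rewrite (assuming the same matching rule as above, i.e. the file's name starts with the key and its parent directory's name equals the key; imports from java.nio.file are assumed):
public static Path findResource(String key, Path root) throws IOException {
    final Path[] found = { null };
    Files.walkFileTree(root, new SimpleFileVisitor<Path>() {
        @Override
        public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) {
            if (file.getFileName().toString().startsWith(key)
                    && file.getParent().getFileName().toString().equals(key)) {
                found[0] = file;
                return FileVisitResult.TERMINATE; // stop at the first match
            }
            return FileVisitResult.CONTINUE;
        }
    });
    return found[0];
}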

Apache commons compress 7z file size way bigger than p7zip compression

When I zip 500 MB of html files, p7zip does it in a couple of seconds and the file size is 7 MB (without any custom settings, just 7z a filename.7z /folder).
Thus I expected Apache commons-compress to also compress, using 7z, to a comparable size. This is, however, not the case, even though I enabled the max presets for commons-compress. The resulting file is also huge, close to 100 MB.
Am I doing something wrong, or do I need to tune my presets? I have read the Apache commons-compress wiki but have not found my answers.
Relevant code for the Java implementation:
public static Path compress(String name, List<Path> files) throws IOException {
    try (SevenZOutputFile out = new SevenZOutputFile(new File(name))) {
        List<SevenZMethodConfiguration> methods = new ArrayList<>();
        LZMA2Options lzma2Options = new LZMA2Options();
        lzma2Options.setPreset(LZMA2Options.PRESET_MAX);
        SevenZMethodConfiguration lzmaConfig =
                new SevenZMethodConfiguration(SevenZMethod.LZMA, lzma2Options);
        methods.add(lzmaConfig);
        out.setContentMethods(methods);
        for (Path file : files) {
            addToArchiveCompression(out, file, ".");
        }
    }
    return Paths.get(name);
}

private static void addToArchiveCompression(SevenZOutputFile out, Path file,
        String dir) throws IOException {
    String name = dir + File.separator + file.getFileName();
    if (Files.isRegularFile(file)) {
        SevenZArchiveEntry entry = out.createArchiveEntry(file.toFile(), name);
        out.putArchiveEntry(entry);
        FileInputStream in = new FileInputStream(file.toFile());
        byte[] b = new byte[1024];
        int count = 0;
        while ((count = in.read(b)) > 0) {
            out.write(b, 0, count);
        }
        out.closeArchiveEntry();
    } else if (Files.isDirectory(file)) {
        File[] children = file.toFile().listFiles();
        if (children != null) {
            for (File child : children) {
                addToArchiveCompression(out, Paths.get(child.toURI()), name);
            }
        }
    } else {
        System.out.println(file.getFileName() + " is not supported");
    }
}
Could you please try removing these lines (they pair LZMA2Options with SevenZMethod.LZMA; without them, SevenZOutputFile falls back to its default content method, LZMA2):
List<SevenZMethodConfiguration> methods = new ArrayList<>();
LZMA2Options lzma2Options = new LZMA2Options();
lzma2Options.setPreset(LZMA2Options.PRESET_MAX);
SevenZMethodConfiguration lzmaConfig =
        new SevenZMethodConfiguration(SevenZMethod.LZMA, lzma2Options);
methods.add(lzmaConfig);
out.setContentMethods(methods);
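For reference, the try block with those lines removed would look like this (a sketch; the library then applies its default LZMA2 settings):
try (SevenZOutputFile out = new SevenZOutputFile(new File(name))) {
    for (Path file : files) {
        addToArchiveCompression(out, file, ".");
    }
}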

Delete files older than x days

How can I find out when a file was created using Java? I wish to delete files older than a certain time period; currently I am deleting all files in a directory, but this is not ideal:
public void DeleteFiles() {
    File file = new File("D:/Documents/NetBeansProjects/printing~subversion/fileupload/web/resources/pdf/");
    System.out.println("Called deleteFiles");
    DeleteFiles(file);
    File file2 = new File("D:/Documents/NetBeansProjects/printing~subversion/fileupload/Uploaded/");
    DeleteFilesNonPdf(file2);
}

public void DeleteFiles(File file) {
    System.out.println("Now will search folders and delete files,");
    if (file.isDirectory()) {
        for (File f : file.listFiles()) {
            DeleteFiles(f);
        }
    } else {
        file.delete();
    }
}
Above is my current code; I am now trying to add an if statement that will only delete files older than, say, a week.
EDIT:
@ViewScoped
@ManagedBean
public class Delete {

    public void DeleteFiles() {
        File file = new File("D:/Documents/NetBeansProjects/printing~subversion/fileupload/web/resources/pdf/");
        System.out.println("Called deleteFiles");
        DeleteFiles(file);
        File file2 = new File("D:/Documents/NetBeansProjects/printing~subversion/fileupload/Uploaded/");
        DeleteFilesNonPdf(file2);
    }

    public void DeleteFiles(File file) {
        System.out.println("Now will search folders and delete files,");
        if (file.isDirectory()) {
            System.out.println("Date Modified : " + file.lastModified());
            for (File f : file.listFiles()) {
                DeleteFiles(f);
            }
        } else {
            file.delete();
        }
    }
}
Adding a loop now.
EDIT
I have noticed while testing the code above that I get the last modified time as:
INFO: Date Modified : 1361635382096
How should I code the if statement to say "if it is older than 7 days, delete it" when the timestamp is in the above format (milliseconds since the epoch)?
You can use File.lastModified() to get the last modified time of a file/directory.
Can be used like this:
long diff = new Date().getTime() - file.lastModified();
if (diff > x * 24 * 60 * 60 * 1000) {
    file.delete();
}
Which deletes files older than x (an int) days.
Commons IO has built-in support for filtering files by age with its AgeFileFilter. Your DeleteFiles could just look like this:
import java.io.File;
import org.apache.commons.io.FileUtils;
import org.apache.commons.io.filefilter.AgeFileFilter;
import static org.apache.commons.io.filefilter.TrueFileFilter.TRUE;

// a Date defined somewhere for the cutoff date
Date thresholdDate = <the oldest age you want to keep>;

public void DeleteFiles(File file) {
    Iterator<File> filesToDelete =
            FileUtils.iterateFiles(file, new AgeFileFilter(thresholdDate), TRUE);
    // an Iterator cannot be used in a for-each loop
    while (filesToDelete.hasNext()) {
        filesToDelete.next().delete();
    }
}
Update: To use the value as given in your edit, define the thresholdDate as:
Date thresholdDate = new Date(1361635382096L);
Using Apache utils is probably the easiest. Here is the simplest solution I could come up with.
public void deleteOldFiles() {
    Date oldestAllowedFileDate = DateUtils.addDays(new Date(), -3); // minus days from current date
    File targetDir = new File("C:\\TEMP\\archive\\");
    Iterator<File> filesToDelete = FileUtils.iterateFiles(targetDir, new AgeFileFilter(oldestAllowedFileDate), null);
    // if deleting subdirs, replace null above with TrueFileFilter.INSTANCE
    while (filesToDelete.hasNext()) {
        FileUtils.deleteQuietly(filesToDelete.next());
    } // I don't want an exception if a file is not deleted. Otherwise use filesToDelete.next().delete() in a try/catch
}
Example using Java 8's Time API
LocalDate today = LocalDate.now();
LocalDate earlier = today.minusDays(30);
Date threshold = Date.from(earlier.atStartOfDay(ZoneId.systemDefault()).toInstant());
AgeFileFilter filter = new AgeFileFilter(threshold);

File path = new File("...");
File[] oldFolders = FileFilterUtils.filter(filter, path);
for (File folder : oldFolders) {
    System.out.println(folder);
}
Using lambdas (Java 8+)
Non recursive option to delete all files in current folder that are older than N days (ignores sub folders):
public static void deleteFilesOlderThanNDays(int days, String dirPath) throws IOException {
    // 24L forces long arithmetic, avoiding int overflow for large day counts
    long cutOff = System.currentTimeMillis() - (days * 24L * 60 * 60 * 1000);
    Files.list(Paths.get(dirPath))
        .filter(path -> {
            try {
                return Files.isRegularFile(path) && Files.getLastModifiedTime(path).to(TimeUnit.MILLISECONDS) < cutOff;
            } catch (IOException ex) {
                // log here and move on
                return false;
            }
        })
        .forEach(path -> {
            try {
                Files.delete(path);
            } catch (IOException ex) {
                // log here and move on
            }
        });
}
Recursive option, that traverses sub-folders and deletes all files that are older than N days:
public static void recursiveDeleteFilesOlderThanNDays(int days, String dirPath) throws IOException {
    long cutOff = System.currentTimeMillis() - (days * 24L * 60 * 60 * 1000);
    Files.list(Paths.get(dirPath))
        .forEach(path -> {
            if (Files.isDirectory(path)) {
                try {
                    recursiveDeleteFilesOlderThanNDays(days, path.toString());
                } catch (IOException e) {
                    // log here and move on
                }
            } else {
                try {
                    if (Files.getLastModifiedTime(path).to(TimeUnit.MILLISECONDS) < cutOff) {
                        Files.delete(path);
                    }
                } catch (IOException ex) {
                    // log here and move on
                }
            }
        });
}
Here's a Java 8 version using the Time API. It's been tested and used in our project:
public static int deleteFiles(final Path destination,
        final Integer daysToKeep) throws IOException {
    final Instant retentionFilePeriod = ZonedDateTime.now()
            .minusDays(daysToKeep).toInstant();
    final AtomicInteger countDeletedFiles = new AtomicInteger();
    Files.find(destination, 1,
            (path, basicFileAttrs) -> basicFileAttrs.lastModifiedTime()
                    .toInstant().isBefore(retentionFilePeriod))
            .forEach(fileToDelete -> {
                try {
                    if (!Files.isDirectory(fileToDelete)) {
                        Files.delete(fileToDelete);
                        countDeletedFiles.incrementAndGet();
                    }
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
    return countDeletedFiles.get();
}
For a JDK 8 solution using both NIO file streams and JSR-310:
long cut = LocalDateTime.now().minusWeeks(1).toEpochSecond(ZoneOffset.UTC);
Path path = Paths.get("/path/to/delete");
Files.list(path)
    .filter(n -> {
        try {
            return Files.getLastModifiedTime(n)
                    .to(TimeUnit.SECONDS) < cut;
        } catch (IOException ex) {
            // handle exception
            return false;
        }
    })
    .forEach(n -> {
        try {
            Files.delete(n);
        } catch (IOException ex) {
            // handle exception
        }
    });
The sucky thing here is the need for handling exceptions within each lambda. It would have been great for the API to have UncheckedIOException overloads for each IO method. With helpers to do this one could write:
public static void main(String[] args) throws IOException {
    long cut = LocalDateTime.now().minusWeeks(1).toEpochSecond(ZoneOffset.UTC);
    Path path = Paths.get("/path/to/delete");
    Files.list(path)
        .filter(n -> Files2.getLastModifiedTimeUnchecked(n)
                .to(TimeUnit.SECONDS) < cut)
        .forEach(n -> {
            System.out.println(n);
            Files2.delete(n, (t, u)
                    -> System.err.format("Couldn't delete %s: %s%n",
                            t, u.getMessage())
            );
        });
}

private static final class Files2 {

    public static FileTime getLastModifiedTimeUnchecked(Path path,
            LinkOption... options) throws UncheckedIOException {
        try {
            return Files.getLastModifiedTime(path, options);
        } catch (IOException ex) {
            throw new UncheckedIOException(ex);
        }
    }

    public static void delete(Path path, BiConsumer<Path, Exception> e) {
        try {
            Files.delete(path);
        } catch (IOException ex) {
            e.accept(path, ex);
        }
    }
}
JavaSE Canonical Solution.
Delete files older than expirationPeriod days.
private void cleanUpOldFiles(String folderPath, int expirationPeriod) {
    File targetDir = new File(folderPath);
    if (!targetDir.exists()) {
        throw new RuntimeException(String.format("Log files directory '%s' " +
                "does not exist in the environment", folderPath));
    }
    File[] files = targetDir.listFiles();
    for (File file : files) {
        long diff = new Date().getTime() - file.lastModified();
        // Granularity = DAYS;
        long desiredLifespan = TimeUnit.DAYS.toMillis(expirationPeriod);
        if (diff > desiredLifespan) {
            file.delete();
        }
    }
}
E.g., to remove all files older than 30 days in the folder "/sftp/logs", call:
cleanUpOldFiles("/sftp/logs", 30);
You can get the creation date of the file using NIO; this is the way:
BasicFileAttributes attrs = Files.readAttributes(file, BasicFileAttributes.class);
System.out.println("creationTime: " + attrs.creationTime());
More about it can be found here : http://docs.oracle.com/javase/tutorial/essential/io/fileAttr.html
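A sketch combining that attribute with a cutoff (assuming `path` is a java.nio.file.Path, and noting that some filesystems do not record creation times and report the modification time instead):
BasicFileAttributes attrs = Files.readAttributes(path, BasicFileAttributes.class);
Instant cutoff = Instant.now().minus(7, ChronoUnit.DAYS);
if (attrs.creationTime().toInstant().isBefore(cutoff)) {
    Files.delete(path); // older than a week: remove it
}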
Another approach with Apache commons-io and joda:
private void deleteOldFiles(String dir, int daysToRemainFiles) {
    Collection<File> filesToDelete = FileUtils.listFiles(new File(dir),
            new AgeFileFilter(DateTime.now().withTimeAtStartOfDay().minusDays(daysToRemainFiles).toDate()),
            TrueFileFilter.TRUE); // include sub dirs
    for (File file : filesToDelete) {
        boolean success = FileUtils.deleteQuietly(file);
        if (!success) {
            // log...
        }
    }
}
Using Java NIO Files with lambdas & Commons IO
final long time = System.currentTimeMillis();
// Only show files & directories older than 2 days
final long maxdiff = TimeUnit.DAYS.toMillis(2);
List all found files and directories:
Files.newDirectoryStream(Paths.get("."), p -> (time - p.toFile().lastModified()) > maxdiff)
    .forEach(System.out::println);
Or delete found files with FileUtils:
Files.newDirectoryStream(Paths.get("."), p -> (time - p.toFile().lastModified()) > maxdiff)
    .forEach(p -> FileUtils.deleteQuietly(p.toFile()));
Here is code to delete files which have not been modified for six months, and to create a log file.
package deleteFiles;

import java.io.File;
import java.util.ArrayList;
import java.util.Calendar;
import java.util.Date;
import java.util.logging.FileHandler;
import java.util.logging.Logger;
import java.util.logging.SimpleFormatter;

public class Delete {

    public static void deleteFiles() {
        int numOfMonths = -6;
        String path = "G:\\Files";
        File file = new File(path);
        FileHandler fh;
        Calendar sixMonthAgo = Calendar.getInstance();
        Logger logger = Logger.getLogger("MyLog");
        sixMonthAgo.add(Calendar.MONTH, numOfMonths);
        File[] files = file.listFiles();
        ArrayList<String> arrlist = new ArrayList<String>();
        try {
            fh = new FileHandler("G:\\Files\\logFile\\MyLogForDeletedFile.log");
            logger.addHandler(fh);
            SimpleFormatter formatter = new SimpleFormatter();
            fh.setFormatter(formatter);
            for (File f : files) {
                if (f.isFile() && f.exists()) {
                    Date lastModDate = new Date(f.lastModified());
                    if (lastModDate.before(sixMonthAgo.getTime())) {
                        arrlist.add(f.getName());
                        f.delete();
                    }
                }
            }
            for (int i = 0; i < arrlist.size(); i++)
                logger.info("deleted files are ===>" + arrlist.get(i));
        } catch (Exception e) {
            e.printStackTrace();
            logger.info("error is-->" + e);
        }
    }

    public static void main(String[] args) {
        deleteFiles();
    }
}
Need to point out a bug in the first solution listed: x * 24 * 60 * 60 * 1000 will overflow int's maximum value if x is large, so the computation needs to be done in long arithmetic:
long diff = new Date().getTime() - file.lastModified();
if (diff > (long) x * 24 * 60 * 60 * 1000) {
    file.delete();
}
Perhaps this Java 11 & Spring solution will be useful to someone:
private void removeOldBackupFolders(Path folder, String name) throws IOException {
    var current = System.currentTimeMillis();
    var difference = TimeUnit.DAYS.toMillis(7);
    BiPredicate<Path, BasicFileAttributes> predicate =
            (path, attributes) ->
                    path.getFileName().toString().contains(name)
                            && (current - attributes.lastModifiedTime().toMillis()) > difference;
    try (var stream = Files.find(folder, 1, predicate)) {
        stream.forEach(
                path -> {
                    try {
                        FileSystemUtils.deleteRecursively(path);
                        log.warn("Deleted old backup {}", path.getFileName());
                    } catch (IOException lambdaEx) {
                        log.error("", lambdaEx);
                    }
                });
    }
}
The BiPredicate is used to filter files (i.e. files & folder in Java) by name and age.
FileSystemUtils.deleteRecursively() is a Spring method that recursively removes files & folders. You can change that to something like NIO.2's Files.walkFileTree() if you don't want to use Spring dependencies.
I've set the maxDepth of Files.find() to 1 based on my use case. You can set it to unlimited (Integer.MAX_VALUE), at the risk of irreversibly deleting your dev FS if you are not careful.
Example logs based on var difference = TimeUnit.MINUTES.toMillis(3):
2022-05-20 00:54:15.505 WARN 24680 --- [ single-1] u.t.s.service.impl.BackupServiceImpl : Deleted old backup backup_20052022_1652989557462
2022-05-20 00:54:15.506 WARN 24680 --- [ single-1] u.t.s.service.impl.BackupServiceImpl : Deleted old backup backup_20052022_1652989558474
2022-05-20 00:54:15.507 WARN 24680 --- [ single-1] u.t.s.service.impl.BackupServiceImpl : Deleted old backup backup_20052022_1652989589723
2022-05-20 00:54:15.508 WARN 24680 --- [ single-1] u.t.s.service.impl.BackupServiceImpl : Deleted old backup backup_20052022_1652989674083
Notes:
The stream of Files.find() must be wrapped inside of a try-with-resource (utilizing AutoCloseable) or handled the old-school way inside of a try-finally to close the stream.
A good example of Files.walkFileTree() for copying (can be adapted for deletion): https://stackoverflow.com/a/60621544/3242022
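For completeness, a plain-NIO sketch of such a recursive delete via Files.walkFileTree (no Spring required), deleting files on the way down and directories on the way back up:
public static void deleteRecursively(Path root) throws IOException {
    Files.walkFileTree(root, new SimpleFileVisitor<Path>() {
        @Override
        public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) throws IOException {
            Files.delete(file);
            return FileVisitResult.CONTINUE;
        }

        @Override
        public FileVisitResult postVisitDirectory(Path dir, IOException exc) throws IOException {
            if (exc != null) throw exc;
            Files.delete(dir); // the directory is empty by now
            return FileVisitResult.CONTINUE;
        }
    });
}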
Using Apache commons-io and joda:
if (FileUtils.isFileOlder(f, DateTime.now().minusDays(30).toDate())) {
    f.delete();
}

In Java, how to delete a folder/dir with a non-recursive algorithm?

How can I delete a folder/directory with a non-recursive algorithm in Java? I want to use a non-recursive algorithm in order to avoid StackOverflowErrors when a folder has a very deep path.
Could someone please offer some advice in this area?
In crappy pseudo-code, as I don't have a Java compiler handy to test this:
queue = [ rootDir ]
stack = []
while ( !queue.isEmpty() ) {
    currentDir = queue.take()
    stack.push( currentDir )
    files = currentDir.list()
    for ( f : files ) {
        if ( f.isDirectory() ) {
            queue.add( f )
        } else {
            f.delete()
        }
    }
}
while ( !stack.isEmpty() ) {
    f = stack.pop()
    f.delete()
}
Basically this code should scan a directory, deleting files or queueing subdirectories for further scanning. It places scanned directories in a stack, so that the second while loop deletes them in the correct order (deepest first).
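A runnable Java rendering of that pseudo-code (a sketch; java.util.ArrayDeque and java.io.File imports are assumed, and unreadable directories, where listFiles() returns null, are simply skipped):
public static void deleteNonRecursively(File rootDir) {
    Deque<File> queue = new ArrayDeque<>();
    Deque<File> stack = new ArrayDeque<>();
    queue.add(rootDir);
    while (!queue.isEmpty()) {
        File currentDir = queue.remove();
        stack.push(currentDir);
        File[] files = currentDir.listFiles();
        if (files == null) continue; // unreadable directory: skip
        for (File f : files) {
            if (f.isDirectory()) {
                queue.add(f);
            } else {
                f.delete();
            }
        }
    }
    while (!stack.isEmpty()) {
        stack.pop().delete(); // deepest directories are popped first
    }
}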
Here's a general way to delete a file/folder :
/** Deletes a file/folder (and its whole tree) without recursion; returns true iff it succeeded. */
public static boolean deleteQuietly(File file) {
    if (file == null || !file.exists())
        return true;
    if (!file.isDirectory())
        return file.delete();
    LinkedList<File> dirs = new LinkedList<>();
    dirs.add(0, file);
    boolean succeededDeletion = true;
    while (!dirs.isEmpty()) {
        file = dirs.remove(0);
        File[] children = file.listFiles();
        if (children == null || children.length == 0)
            succeededDeletion &= file.delete();
        else {
            dirs.add(0, file);
            for (File child : children)
                if (child.isDirectory())
                    dirs.add(0, child);
                else
                    succeededDeletion &= child.delete();
        }
    }
    return succeededDeletion;
}
This is just a starting point for you to improve on.
The critical part is finding out which directories to delete.
This piece of pseudo-code should help you find all directories under a certain directory:
Set<File> allDirectories = new HashSet<File>();
Deque<File> toScan = new ArrayDeque<File>();
toScan.add(yourStartingDirectory);
while (!toScan.isEmpty()) {
    File current = toScan.remove();
    allDirectories.add(current);
    File[] children = current.listFiles();
    if (children != null) {
        for (File f : children) {
            if (f.isDirectory() && !allDirectories.contains(f)) {
                toScan.add(f);
            }
        }
    }
}
This is just a starting point, but you should be able to finish the rest: avoid revisiting directories in allDirectories that were processed in previous iterations; perform the delete based on allDirectories; make the delete more efficient by deleting in the "correct" order (deepest first); etc.
// Deletes all files and subdirectories under dir.
// Returns true if all deletions were successful.
// If a deletion fails, the method stops attempting to delete and returns false.
public static boolean deleteDir(File dir) {
    if (dir.isDirectory()) {
        String[] children = dir.list();
        for (int i = 0; i < children.length; i++) {
            boolean success = deleteDir(new File(dir, children[i]));
            if (!success) {
                return false;
            }
        }
    }
    // The directory is now empty so delete it
    return dir.delete();
}
To remove recursion, you replace the call stack with an explicit stack to hold the items you still need to process. In your case, you keep track of all the parent folders you need to delete after you are done with the current one. Here's an example using a LinkedList as a stack:
public static void rmdir(File dir) {
    LinkedList<File> dirs = new LinkedList<File>();
    dirs.push(dir);
    while (dirs.peek() != null) {
        dir = dirs.pop();
        File[] contents = dir.listFiles();
        // the null check guards against unreadable directories
        if (contents == null || contents.length == 0) {
            dir.delete();
        } else {
            dirs.push(dir);
            for (File content : contents) {
                if (content.isDirectory()) {
                    dirs.push(content);
                } else {
                    content.delete();
                }
            }
        }
    }
}
My interpretation of your question is that you want to delete a directory without recursing into the directories within it. In this case, you can implement the deletion using a pretty simple loop...
File directory = new File("c:\\directory_path");
if (!directory.exists()) {
    return;
}
File[] files = directory.listFiles();
for (int i = 0; i < files.length; i++) {
    if (files[i].isFile()) {
        boolean deleted = files[i].delete();
        if (!deleted) {
            System.out.println("Problem deleting file " + files[i].getAbsolutePath());
        }
    }
}
This will list all the Files of the directory in an array, and then loop through them. If the file is a normal file, it will be deleted. Non-normal files, such as directories, will be skipped.
Of course, there are other alternatives, such as adding a FileFilter to the listFiles() call so that the array is only populated with normal files, but it's effectively pretty similar.
If you want to delete the directory tree, you will have to use some kind of recursion. You could approach it differently though, which might not cause you so many problems, such as building an ArrayList of directories, and then iterating through the ArrayList deleting them one at a time. This would help to reduce the recursion.
public static final void delete(File file) throws IOException
{
    if (!file.exists())
        throw new IllegalArgumentException("File does not exist: " + file);
    if (file.isFile())
    {
        simpleDelete(file);
        return;
    }
    Deque<File> dirsQueue = new ArrayDeque<File>();
    dirsQueue.push(file);
    for (File dir; (dir = dirsQueue.peekLast()) != null;)
    {
        File[] children = dir.listFiles();
        if (children == null)
            throw new IOException("Unable to read directory: " + dir);
        if (children.length == 0)
        {
            simpleDelete(dir);
            dirsQueue.removeLast();
            continue;
        }
        for (File child : children)
        {
            if (child.isDirectory())
                dirsQueue.addLast(child);
            else
                simpleDelete(child);
        }
    }
}

private static final void simpleDelete(File file) throws IOException
{
    if (!file.delete())
        throw new IOException("Unable to delete " + (file.isDirectory() ? "directory" : "file") + ": " + file);
}

Sending only 1mb of files from folder through web service

My question is: I want to send pdf files through a web service, with the condition that only 1 MB of files is taken from a folder containing many files.
Please help me resolve this question; I am new to web services.
Ask me again if it is not clear.
Thanks in advance.
The following method will return a list of all the files whose total size is <= 1Mb
public List<File> getFilesList() {
    File dirLoc = new File("C:\\Temp");
    List<File> validFilesList = new ArrayList<File>();
    File[] fileList;
    final int fileSizeLimit = 1024000; // Bytes
    try {
        // select all the files whose size <= 1Mb
        fileList = dirLoc.listFiles(new FilenameFilter() {
            public boolean accept(final File dirLoc, final String fileName) {
                return (new File(dirLoc + "\\" + fileName).length() <= fileSizeLimit);
            }
        });
        long sizeCtr = fileSizeLimit;
        for (File file : fileList) {
            if (file.length() <= sizeCtr) {
                validFilesList.add(file);
                sizeCtr = sizeCtr - file.length();
                if (sizeCtr <= 0) {
                    break;
                }
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
        validFilesList = new ArrayList<File>();
    } finally {
        fileList = null;
    }
    return validFilesList;
}
Well, I don't know if I have understood your requirements correctly and if this would help your problem, but you can try this Java solution for filtering the files from a directory.
You will get a list of files, and then you can use the web-service-specific code to send these files:
File dirLoc = new File("C:\\California");
File[] fileList = null;
final int fileSize = 1024000;
try {
    fileList = dirLoc.listFiles(new FilenameFilter() {
        public boolean accept(final File dirLoc, final String fileName) {
            return (new File(dirLoc + "\\" + fileName).length() > fileSize);
        }
    });
    // note: do not null out fileList in a finally block here, or the result is discarded
} catch (Exception e) {
    e.printStackTrace();
}
This should work.
If you just require filenames, replace the File[] with String[] and .listFiles() with .list().
I cannot say much about the performance, though; for a small list of files it should work pretty fast.
I am not sure if this is what you want, but you can pick the files and check their size with:
java.io.File file = new java.io.File("myfile.txt");
file.length();
File.length() (Javadoc)
Then send only the files whose total size stays within 1 MB.
