Is it possible to use a regular expression to get the names of the files in a directory that match a given pattern, without having to manually loop through all of the files?
You could use File.listFiles(FileFilter):
public static File[] listFilesMatching(File root, String regex) {
    if (!root.isDirectory()) {
        throw new IllegalArgumentException(root + " is not a directory.");
    }
    final Pattern p = Pattern.compile(regex); // careful: could also throw an exception!
    return root.listFiles(new FileFilter() {
        @Override
        public boolean accept(File file) {
            return p.matcher(file.getName()).matches();
        }
    });
}
EDIT
So, to match files that look like TXT-20100505-XXXX.trx, where XXXX can be any four consecutive digits, do something like this:
listFilesMatching(new File("/some/path"), "TXT-20100505-\\d{4}\\.trx")
EDIT
Starting with Java 8, the complete return statement can be written with a lambda expression:
return root.listFiles((File file) -> p.matcher(file.getName()).matches());
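If a glob is enough (rather than a full regex), NIO can list a directory and do the matching in one step; a minimal sketch (the directory and pattern are just placeholders):
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public static void printMatching(String dir, String glob) throws IOException {
    // e.g. printMatching("/some/path", "TXT-20100505-????.trx")
    try (DirectoryStream<Path> stream = Files.newDirectoryStream(Paths.get(dir), glob)) {
        for (Path entry : stream) {
            System.out.println(entry.getFileName());
        }
    }
}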
Implement FileFilter, which just requires that you override the method
public boolean accept(File f)
Then, every time you request the list of files, the JVM will run each file through your accept() method. Note that the file system will not do the pattern matching for you: since Java is cross-platform, it cannot rely on OS-specific wildcard handling, so the filtering happens in your own code.
package regularexpression;

import java.io.File;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class RegularFile {

    public static void main(String[] args) {
        new RegularFile();
    }

    public RegularFile() {
        String fileName = null;
        boolean bName = false;
        int iCount = 0;
        File dir = new File("C:/regularfolder");
        File[] files = dir.listFiles();
        // Compile the pattern once; no need to recompile it for every file.
        Pattern uName = Pattern.compile(".*l.zip.*");
        System.out.println("List Of Files ::");
        for (File f : files) {
            fileName = f.getName();
            System.out.println(fileName);
            Matcher mUname = uName.matcher(fileName);
            bName = mUname.matches();
            if (bName) {
                iCount++;
            }
        }
        System.out.println("File Count In Folder ::" + iCount);
    }
}
Related
I would like to search for files recursively. Based on other solutions, I have already written a big portion of the code:
public static File[] getFiles(String path) {
    File file = new File(path);
    // Get the subdirectories.
    String[] directories = file.list(new FilenameFilter() {
        @Override
        public boolean accept(File current, String name) {
            return new File(current, name).isDirectory();
        }
    });
    for (String dir : directories) {
        // Doing recursion
    }
    // Get the files inside the directory.
    FileFilter fileFilter = new FileFilter();
    File[] files = file.listFiles(fileFilter);
    return files;
}
FileFilter is just a custom filter of mine. My problem is that I don't know how to do the recursion in this case. Of course I could call getFiles() again for each subdirectory, with the subdirectory path as the argument, but somehow the returned File arrays must be merged.
Does somebody have a solution?
Use the Files.find() method.
/* Your filter can be initialized however you need... */
YourCustomFilter filter = new YourCustomFilter(extension, maxSize);
try (Stream<Path> s = Files.find(dir, Integer.MAX_VALUE, filter::test)) {
    return s.map(Path::toFile).toArray(File[]::new);
}
This assumes your custom filter has a method called test() that accepts the file and its attributes; you'll need to rework your current file filter a bit to accommodate this.
boolean test(Path path, BasicFileAttributes attrs) {
...
}
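For illustration, a filter accepting regular files with a given extension might look roughly like this (the class and field names here are made up, not taken from your code):
import java.nio.file.Path;
import java.nio.file.attribute.BasicFileAttributes;

class ExtensionFilter {

    private final String extension; // e.g. ".txt"

    ExtensionFilter(String extension) {
        this.extension = extension;
    }

    // Matches the BiPredicate<Path, BasicFileAttributes> shape that Files.find() expects
    boolean test(Path path, BasicFileAttributes attrs) {
        return attrs.isRegularFile()
                && path.getFileName().toString().endsWith(extension);
    }
}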
Working example: http://screencast.com/t/buiyV9UiEa
You can try something like this:
//add these imports
import java.io.File;
import java.io.FilenameFilter;
import java.util.List;
import java.util.ArrayList;
import java.util.Arrays;

public static File[] getFiles(String path) {
    File file = new File(path);
    // Get the subdirectories.
    String[] directories = file.list(new FilenameFilter() {
        @Override
        public boolean accept(File current, String name) {
            return new File(current, name).isDirectory();
        }
    });
    //Use a list to save the files returned from the recursive call
    List<File> filesList = new ArrayList<File>();
    if (directories != null) {
        for (String dir : directories) {
            // Doing recursion
            filesList.addAll(Arrays.asList(getFiles(path + File.separator + dir)));
        }
    }
    // Get the files inside the directory.
    FileFilter fileFilter = new FileFilter();
    File[] files = file.listFiles(fileFilter);
    //Merge the rest of the files with the files
    //in the current dir
    if (files != null)
        filesList.addAll(Arrays.asList(files));
    return filesList.toArray(new File[filesList.size()]);
}
Code tested and working. Hope this helps.
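For what it's worth, on Java 8+ the recursion and the merging can also be delegated to Files.walk; this is only a sketch and assumes your filter can be rephrased as a Predicate<Path>:
import java.io.File;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.function.Predicate;
import java.util.stream.Stream;

public static File[] getFilesWithWalk(String path, Predicate<Path> filter) {
    try (Stream<Path> stream = Files.walk(Paths.get(path))) {
        return stream.filter(Files::isRegularFile) // directories are skipped
                     .filter(filter)               // your custom condition
                     .map(Path::toFile)
                     .toArray(File[]::new);
    } catch (IOException e) {
        throw new UncheckedIOException(e);
    }
}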
import java.util.Arrays;
import java.util.ArrayList;
Put a fail-safe right after you initialize file (in case of a bad path on the first call).
if (!file.isDirectory()) return new File[0];
And change the last part of your code to:
FileFilter fileFilter = new FileFilter();
ArrayList<File> files = new ArrayList<>(Arrays.asList(file.listFiles(fileFilter)));
for (String dir : directories) {
    // dir is just a name, so build the full path for the recursive call
    files.addAll(Arrays.asList(getFiles(path + File.separator + dir)));
}
return files.toArray(new File[0]);
(the toArray method allocates a new, correctly sized array if the one you pass is too small) Ref
You should do something like this:
import java.io.File;
import java.io.FilenameFilter;
import java.util.ArrayList;
import java.util.Arrays;

public static File[] getFiles(String path) {
    File file = new File(path);
    String[] directories = file.list(new FilenameFilter() {
        @Override
        public boolean accept(File current, String name) {
            return new File(current, name).isDirectory();
        }
    });
    // Wrap in a new ArrayList, because Arrays.asList() returns a fixed-size list.
    ArrayList<File> files = new ArrayList<>(Arrays.asList(file.listFiles(new FileFilter())));
    for (String dir : directories) {
        files.addAll(Arrays.asList(getFiles(path + File.separator + dir)));
    }
    return files.toArray(new File[files.size()]);
}
The new File[files.size()] is required because otherwise files.toArray() would return an Object[].
Also, you could use a lambda expression instead of the anonymous FilenameFilter class, like so:
String[] directories = file.list((File current, String name) -> {
    return new File(current, name).isDirectory();
});
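With type inference the lambda can be trimmed a little further, for example:
String[] directories = file.list((current, name) -> new File(current, name).isDirectory());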
I want to include a copyright comment at the top of every Java file, and I want to do it via the Eclipse formatter.
If that is not possible, please suggest another way to add the copyright header to existing Java files.
Although it may not perfectly suit the intended approach (that is, using an Eclipse plugin for this): I once created a small utility class that does exactly this.
It collects a list of all .java files in a given source directory. For each of the resulting files, it looks for the first line starting with the word "package". If the part above this line is not empty, then it will assume that a header is already present, and skip this file. Otherwise, it will insert a header (which is contained in some template file) before the line starting with "package".
This could be improved and generalized in many ways, but I wrote it only to quickly insert copyright headers into the files of an existing code base, and it worked well for that. Maybe you (or others) will find it helpful.
import java.io.File;
import java.io.FileWriter;
import java.io.FilenameFilter;
import java.io.IOException;
import java.nio.charset.Charset;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
public class HeaderInserter
{
public static void main(String[] args)
{
String headerTemplateFileName = "HeaderTemplate.txt";
String path = "C:/Workspace/HeaderInserter/src";
insertHeaders(path, headerTemplateFileName);
}
private static void insertHeaders(String path, String headerTemplateFileName)
{
FilenameFilter filenameFilter = new FilenameFilter()
{
@Override
public boolean accept(File dir, String name)
{
return name.toLowerCase().endsWith(".java");
}
};
List<String> headerLines = readLines(headerTemplateFileName);
List<File> files = listFiles(new File(path), filenameFilter);
for (File file : files)
{
System.out.println("Inserting header into "+file);
handle(file, headerLines);
}
System.out.println("Done");
}
private static void handle(File inputFile, List<String> headerLines)
{
List<String> lines = readLines(inputFile.getPath());
int index = lineIndexStartingWith(lines, "package");
if (index == -1)
{
System.err.println("No 'package' line found in "+inputFile);
return;
}
if (index > 0)
{
List<String> removedLines = lines.subList(0, index);
String removedPart = createString(removedLines);
String removedContents = removedPart.replaceAll("\n", "");
if (removedContents.trim().length() > 0)
{
System.err.println("Non-empty header found in "+inputFile);
System.err.println(removedPart);
System.err.println("Skipping");
return;
}
}
List<String> keptLines = lines.subList(index, lines.size());
List<String> writtenLines = new ArrayList<String>();
writtenLines.addAll(headerLines);
writtenLines.addAll(keptLines);
String writtenContent = createString(writtenLines);
File outputFile = new File(inputFile.getName()+"_header");
boolean written = writeContent(outputFile, writtenContent);
if (written)
{
boolean deleted = inputFile.delete();
if (!deleted)
{
System.err.println(
"Could not delete old input file: "+inputFile);
return;
}
boolean renamed = outputFile.renameTo(inputFile);
if (!renamed)
{
System.err.println("Could not rename "+outputFile);
System.err.println(" to "+inputFile);
return;
}
System.out.println("Inserted header into "+inputFile);
}
}
private static int lineIndexStartingWith(
List<String> lines, String prefix)
{
for (int i=0; i<lines.size(); i++)
{
String line = lines.get(i);
if (line.trim().startsWith(prefix))
{
return i;
}
}
return -1;
}
private static String createString(List<String> lines)
{
StringBuilder sb = new StringBuilder();
for (String line : lines)
{
sb.append(line).append("\n");
}
return sb.toString();
}
private static boolean writeContent(
File outputFile, String writtenContent)
{
try (FileWriter fw = new FileWriter(outputFile))
{
fw.write(writtenContent);
fw.close();
return true;
}
catch (IOException e)
{
e.printStackTrace();
}
return false;
}
private static List<String> readLines(String fileName)
{
try
{
return Files.readAllLines(
Paths.get(fileName), Charset.defaultCharset());
}
catch (IOException e)
{
e.printStackTrace();
return null;
}
}
private static List<File> listFiles(
File rootDirectory, FilenameFilter filenameFilter)
{
List<File> result = new ArrayList<File>();
listFiles(rootDirectory, filenameFilter, result);
return result;
}
private static void listFiles(
File file, FilenameFilter filenameFilter, List<File> result)
{
if (!file.isDirectory())
{
if (filenameFilter.accept(file.getParentFile(), file.getName()))
{
result.add(file);
}
}
else
{
File files[] = file.listFiles();
for (File f : files)
{
listFiles(f, filenameFilter, result);
}
}
}
}
Try the Eclipse Releng Tools.
Alternatively, write a script to update each file and prepend the header, or use Eclipse's find-and-replace with regular expressions:
http://java.dzone.com/articles/using-regular-expressions
I'm looking for a library that provides a method giving me a list of files matching a given Ant-like pattern.
For *foo/**/*.txt I'd get
foo/x.txt
foo/bar/baz/.txt
myfoo/baz/boo/bar.txt
etc. I know it's achievable with DirWalker and
PathMatcher mat = FileSystems.getDefault().getPathMatcher("glob:" + filesPattern);
, but I'd rather use some maintained library. I expected Commons IO to have it, but it doesn't.
Update: I'm happy with reusing Ant's code, but would prefer something smaller than whole Ant.
So I sacrificed a few MB of the app's size for the sake of speed and used Ant's DirectoryScanner in the end.
Also, there's Spring's PathMatchingResourcePatternResolver.
//files = new PatternDirWalker( filesPattern ).list( baseDir );
files = new DirScanner( filesPattern ).list( baseDir );

public class DirScanner {

    private String pattern;

    public DirScanner( String pattern ) {
        this.pattern = pattern;
    }

    public List<File> list( File dirToScan ) throws IOException {
        DirectoryScanner ds = new DirectoryScanner();
        String[] includes = { this.pattern };
        //String[] excludes = {"modules\\*\\**"};
        ds.setIncludes(includes);
        //ds.setExcludes(excludes);
        ds.setBasedir( dirToScan );
        //ds.setCaseSensitive(true);
        ds.scan();

        String[] matches = ds.getIncludedFiles();
        List<File> files = new ArrayList<File>(matches.length);
        for (int i = 0; i < matches.length; i++) {
            // getIncludedFiles() returns paths relative to the basedir
            files.add( new File(dirToScan, matches[i]) );
        }
        return files;
    }

}// class
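For completeness, the Spring alternative mentioned above could look roughly like this (just a sketch, assuming spring-core is on the classpath; the method name listWithSpring is made up):
import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import org.springframework.core.io.Resource;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;

public static List<File> listWithSpring(File baseDir, String antPattern) throws IOException {
    PathMatchingResourcePatternResolver resolver = new PathMatchingResourcePatternResolver();
    // The "file:" prefix keeps the lookup on the file system, e.g. antPattern = "**/*.txt"
    Resource[] resources = resolver.getResources("file:" + baseDir.getAbsolutePath() + "/" + antPattern);
    List<File> files = new ArrayList<File>(resources.length);
    for (Resource r : resources) {
        files.add(r.getFile());
    }
    return files;
}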
And here's the implementation I started to write. It's not finished, just in case someone would like to finish it. The idea was to keep a stack of pattern segments, traverse the directory tree, and compare the contents against the segment at the current depth (and against the remaining segments in the case of **).
But I resorted to PathMatcher and then to Ant's impl.
public class PatternDirWalker {
//private static final Logger log = LoggerFactory.getLogger( PatternDirWalker.class );
private String pattern;
private List segments;
private PathMatcher mat;
public PatternDirWalker( String pattern ) {
this.pattern = pattern;
this.segments = parseSegments(pattern);
this.mat = FileSystems.getDefault().getPathMatcher("glob:" + pattern);
}
public List<File> list( File dirToScan ) throws IOException{
return new DirectoryWalker() {
List<File> files = new LinkedList();
@Override protected void handleFile( File file, int depth, Collection results ) throws IOException {
if( PatternDirWalker.this.mat.matches( file.toPath()) )
results.add( file );
}
public List<File> findMatchingFiles( File dirToWalk ) throws IOException {
this.walk( dirToWalk, this.files );
return this.files;
}
}.findMatchingFiles( dirToScan );
}// list()
private List<Segment> parseSegments( String pattern ) {
String[] parts = StringUtils.split(pattern, "/"); // note the (string, separatorChars) argument order
List<Segment> segs = new ArrayList(parts.length);
for( String part : parts ) {
Segment seg = new Segment(part);
segs.add( seg );
}
return segs;
}
class Segment {
public final String pat; // TODO: Tokenize
private Segment( String pat ) {
this.pat = pat;
}
}
}// class
As of Java 7 there is built-in support for a recursive directory scan; Java 8 can improve it a bit syntactically.
Path start = FileSystems.getDefault().getPath("...");
walk(start, "**.java");
One needs a glob-matching class, ideally one that also works at the directory level, so that whole directories can be skipped.
class Glob {
public boolean matchesFile(Path path) {
return ...;
}
public boolean matchesParentDir(Path path) {
return ...;
}
}
Then the walking would be:
public static void walk(Path start, String searchGlob) throws IOException {
final Glob glob = new Glob(searchGlob);
Files.walkFileTree(start, new SimpleFileVisitor<Path>() {
@Override
public FileVisitResult visitFile(Path file,
BasicFileAttributes attrs) throws IOException {
if (glob.matchesFile(file)) {
...; // Process file
}
return FileVisitResult.CONTINUE;
}
@Override
public FileVisitResult preVisitDirectory(Path dir,
BasicFileAttributes attrs) throws IOException {
return glob.matchesParentDir(dir)
? FileVisitResult.CONTINUE : FileVisitResult.SKIP_SUBTREE;
}
});
}
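The Glob stubs above are intentionally left open; one possible way to fill them in is to delegate to a PathMatcher (a sketch; the parent-directory check here is deliberately simplistic and never prunes):
import java.nio.file.FileSystems;
import java.nio.file.Path;
import java.nio.file.PathMatcher;

class Glob {

    private final PathMatcher matcher;

    Glob(String searchGlob) {
        this.matcher = FileSystems.getDefault().getPathMatcher("glob:" + searchGlob);
    }

    public boolean matchesFile(Path path) {
        return matcher.matches(path);
    }

    // A real implementation would prefix-match the glob against the directory
    // so whole subtrees can be skipped; returning true is correct but slower.
    public boolean matchesParentDir(Path path) {
        return true;
    }
}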
Google Guava has a TreeTraverser for files that lets you do depth-first and breadth-first enumeration of files in a directory. You could then filter the results based on a regex of the filename, or anything else you need to do.
Here's an example (requires Guava):
import java.io.File;
import java.util.List;
import java.util.regex.Pattern;
import com.google.common.base.Function;
import com.google.common.base.Predicates;
import com.google.common.io.Files;
import com.google.common.collect.Iterables;
import com.google.common.collect.TreeTraverser;
public class FileTraversalExample {
private static final String PATH = "/path/to/your/maven/repo";
private static final Pattern SEARCH_PATTERN = Pattern.compile(".*\\.jar");
public static void main(String[] args) {
File directory = new File(PATH);
TreeTraverser<File> traverser = Files.fileTreeTraverser();
Iterable<String> allFiles = Iterables.transform(
traverser.breadthFirstTraversal(directory),
new FileNameProducingPredicate());
Iterable<String> matches = Iterables.filter(
allFiles,
Predicates.contains(SEARCH_PATTERN));
System.out.println(matches);
}
private static class FileNameProducingPredicate implements Function<File, String> {
public String apply(File input) {
return input.getAbsolutePath();
}
}
}
Guava will let you filter by any Predicate, using Iterables.filter, so you don't have to use a Pattern if you don't want to.
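For instance, with Java 8 and a recent Guava version, the filtering step could be a plain lambda instead of the regex (reusing traverser and directory from the example above; just a sketch):
Iterable<File> jars = Iterables.filter(
        traverser.breadthFirstTraversal(directory),
        file -> file.getName().endsWith(".jar"));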
Hi, I want to write a Java program that can delete all the files on my computer that have a specific extension or character pattern in their name. I also want to be able to use wildcard characters in the file name.
Thanks in advance
For your program to be really useful you will need to do some more thinking, but as a starter:
import java.io.File;
import java.util.regex.Pattern;
private static void walkDir(final File dir, final Pattern pattern) {
    final File[] files = dir.listFiles();
    if (files != null) {
        for (final File file : files) {
            if (file.isDirectory()) {
                walkDir(file, pattern);
            } else if (pattern.matcher(file.getName()).matches()) {
                System.out.println("file to delete: " + file.getAbsolutePath());
            }
        }
    }
}

public static void main(String[] args) {
    walkDir(new File("/home/user/something"), Pattern.compile(".*\\.mp3"));
}
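Since the question also mentions wildcard characters, here is a minimal sketch of translating a shell-style wildcard (such as *.mp3) into a regex before compiling it; the helper name wildcardToRegex is made up for this example:
static String wildcardToRegex(String wildcard) {
    StringBuilder sb = new StringBuilder();
    for (char c : wildcard.toCharArray()) {
        if (c == '*') {
            sb.append(".*");   // '*' matches any sequence of characters
        } else if (c == '?') {
            sb.append('.');    // '?' matches exactly one character
        } else {
            sb.append(Pattern.quote(String.valueOf(c))); // escape everything else
        }
    }
    return sb.toString();
}

// e.g. walkDir(new File("/home/user/music"), Pattern.compile(wildcardToRegex("*.mp3")));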
Sidenote (not a complete answer, since you didn't ask about this): be aware of recursion. The snippet below only handles a single directory and does not descend into subdirectories.
public void deleteFilesWithExtension(final String directoryName, final String extension) {
    final File dir = new File(directoryName);
    final String[] allFiles = dir.list();
    if (allFiles != null) {
        for (final String file : allFiles) {
            if (file.endsWith(extension)) {
                new File(dir, file).delete();
            }
        }
    }
}
I have a directory path being passed as an argument to my Java program, and the directory contains various types of files. I want to retrieve the paths of the text files and then process each text file further.
I am new to Java; any recommendation on how to go about it?
Even though this is not an optimal solution, you can use it as a starting point.
import java.io.File;
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
public class DirectoryWalker {
/**
* @param args
*/
private String extPtr = "^.+\\.txt$";
private Pattern ptr;
public DirectoryWalker(){
ptr = Pattern.compile(extPtr);
}
public static void main(String[] args) {
String entryPoint = "c:\\temp";
DirectoryWalker dw = new DirectoryWalker();
List<String> textFiles = dw.extractFiles(entryPoint);
for(String txtFile : textFiles){
System.out.println("File: "+txtFile);
}
}
public List<String> extractFiles(String startDir) {
List<String> textFiles = new ArrayList<String>();
if (startDir == null || startDir.length() == 0) {
throw new RuntimeException("Directory entry can't be null or empty");
}
File f = new File(startDir);
if (!f.isDirectory()) {
throw new RuntimeException("Path " + startDir + " is invalid");
}
File[] files = f.listFiles();
for (File tmpFile : files) {
if (tmpFile.isDirectory()) {
textFiles.addAll(extractFiles(tmpFile.getAbsolutePath()));
} else {
String path = tmpFile.getAbsolutePath();
Matcher matcher = ptr.matcher(path);
if(matcher.find()){
textFiles.add(path);
}
}
}
return textFiles;
}
}
Create a File object representing the directory, then use one of the list() or listFiles() methods to obtain the children. You can pass a filter to these to control what is returned.
For example, the listFiles() method below will return an array of files in the directory accepted by a filter.
public File[] listFiles(FileFilter filter)
Start by reading the File API. You can create a File from a String and determine whether it exists() or isDirectory(), as well as list the children of that directory.
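For example, a minimal sketch that prints the .txt files in a directory passed as the first program argument (the class name is just for illustration):
import java.io.File;

public class TextFileLister {

    public static void main(String[] args) {
        File dir = new File(args[0]); // the directory path passed to the program
        // listFiles(FilenameFilter) returns null if the path is not a readable directory
        File[] textFiles = dir.listFiles((d, name) -> name.toLowerCase().endsWith(".txt"));
        if (textFiles != null) {
            for (File f : textFiles) {
                System.out.println(f.getAbsolutePath());
            }
        }
    }
}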