Working with Directories with a Large Number of Files

Today I tried to delete a directory with 485,000+ files on a Linux server, and got the error message below.

/bin/rm: Argument list too long.

If you’re trying to delete a very large number of files in a single directory at once, you will probably run into this error. The problem is that when you type something like rm -f /data/*, the shell expands the wildcard into one argument per file, and the resulting command line exceeds the kernel’s argument-length limit. So how do we work with directories that contain a large number of files? Let’s talk about it below.
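You can check the kernel’s limit on the total length of command-line arguments with getconf; a sketch, assuming a POSIX-ish system where this value is exposed:

```shell
# print the maximum combined size (in bytes) of arguments plus environment
# that can be passed to a new process; "rm /data/*" fails when the expanded
# glob exceeds this limit
getconf ARG_MAX
```

On many modern Linux systems this is around 2 MB, which a directory with hundreds of thousands of files easily exceeds.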

Show the first files

To show the first n files in the directory instead of listing all of them (here, the first 10; -U skips sorting, so ls starts printing immediately):

ls -U | head -10

Count the number of files

ls -f | wc -l
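Note that ls -f (which disables sorting, so it returns quickly) also lists the "." and ".." entries, so the count is off by two. A sketch of an alternative that counts only regular files:

```shell
# count regular files in the current directory only (no recursion, no sorting)
find . -maxdepth 1 -type f | wc -l
```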

Remove files

Using for Loop

Because the glob is expanded by the shell itself rather than passed to an external command, the loop avoids the argument-limit error. The first line is a dry run that only prints the commands; the second actually deletes.

for f in *.pdf; do echo rm "$f"; done   # dry run: print what would be removed
for f in *.pdf; do rm "$f"; done        # actually remove the files

Using Find

Using find -delete. (In my test this deleted about 2,000 files/second, roughly three times faster than rm.)

find . -maxdepth 1 -type f -name "$.split-*" -print -delete

Using find and xargs. The first command passes files one by one to rm via -exec, which is slower. The second batches them through xargs; note that xargs is dangerous with filenames containing spaces or newlines unless you pair find’s -print0 with xargs -0, as shown.

find . -type f -exec rm -v {} \;
find . -maxdepth 1 -name "$.split-*" -print0 | xargs -0 rm
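As a self-contained check of the null-delimited pattern above (using a temporary directory and a made-up "split-" file prefix for illustration), you can verify that it handles awkward names safely:

```shell
# demo: create files, one with a space in its name, then delete them
# with the null-delimited find/xargs pipeline
d=$(mktemp -d)
touch "$d/split-1" "$d/split 2"
find "$d" -maxdepth 1 -type f -name "split*" -print0 | xargs -0 rm
ls -A "$d" | wc -l   # the directory is now empty
```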