$ find /home/peter -name '*~' |xargs rm
This works reasonably well, but some of the targeted backup files are not actually deleted, namely:
- backup files in a sub-directory that is symbolically linked.
- backup files that have spaces in their file name or path name.
In addition, the xargs command handles the zero-argument case poorly. If the find command finds no match at all, xargs does not simply exit; it still runs the target command (rm) once, with no arguments.
$ find /home/peter -name 'no-such-thing*' |xargs rm
rm: missing operand
Symbolic Links
By default, find does not follow symbolic links. To make it follow them, add the -L parameter.
$ find -L /home/peter -name '*~' |xargs rm
Names with spaces
xargs splits up its input into arguments at spaces (and newlines). If a file name (or path name) has spaces in it, e.g., "can not do this.pdf", xargs will misinterpret it and think it is several different files.
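You can see the splitting with a throwaway test (echo stands in for rm, so nothing is deleted):
$ printf '%s\n' 'can not do this.pdf~' |xargs -n1 echo
can
not
do
this.pdf~
xargs hands echo four separate "files" instead of one.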
The solution is to invoke xargs with the -0 (zero) parameter, which makes xargs separate filenames at NUL characters instead of whitespace. Also, the find command needs the -print0 parameter: this puts a NUL between filenames instead of a newline.
Note that the -print0 parameter must come after the other find expressions (such as -name), at the end of the command line.
$ find /home/peter -name '*~' -print0 |xargs -0 rm
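To check that the NUL-separated pipeline keeps such a name intact, repeat the toy test with NUL-terminated input (again using echo instead of rm):
$ printf '%s\0' 'can not do this.pdf~' |xargs -0 -n1 echo
can not do this.pdf~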
Zero argument
Use the -r flag with xargs. If its input is empty, xargs will exit without running the command at all.
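With -r, the earlier no-match example exits quietly instead of complaining:
$ find /home/peter -name 'no-such-thing*' |xargs -r rm
$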
Putting it all together, the command I will use for my task is:
$ find -L /home/peter -name '*~' -print0 |xargs -0 -r rm
13 comments:
Or... you could just use the -exec argument for the find command:
find -L /home/peter -name '*~' \
-exec rm '{}' \;
That way, find passes the filenames on to rm in place of the {}. Just be sure to add the quotes and the backslash. See man find for more info...
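A handy way to preview what this would delete is to put echo in front of rm; the file names in the output below are invented purely for illustration:
$ find -L /home/peter -name '*~' -exec echo rm '{}' \;
rm /home/peter/notes.txt~
rm /home/peter/docs/report.tex~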
Thanks, Jarrod.
I do use the -exec sometimes, especially for small jobs. I'll add it to the blog entry ...
Great, glad you like it.
You noted the xargs method is faster than the -exec method. You can make xargs even faster by adding the -n switch so it processes arguments in batches. For example, the following line tells xargs to process 1000 arguments at a time; I find it to be much faster than leaving it off and letting xargs process at its own rate:
find -L /home/peter -name '*~' -print0 |xargs -0 -r -n1000 rm
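To see how -n batches arguments, here is a toy run that groups them two at a time:
$ printf '%s\n' a b c d e |xargs -n2 echo
a b
c d
e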
I never thought that xargs is faster than exec. Thanks for sharing.
Great, thanks. Now I can find files in Linux.
"find" has a -delete switch which does just that without using the exec.
The problem I had was that "find -delete" on roughly 1 million files used up all the CPU for a very long time. So I limited the number of files per run with "find | head -n1000 | xargs -r rm" to avoid extreme situations. Perfect for a clean-up script running every 10 minutes.
Thanks for the "-r", that was exactly what I was looking for!
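A NUL-safe version of that rate-limited clean-up is possible if your head has GNU's -z (--zero-terminated) option; the path below is made up purely for illustration:
$ find /var/tmp/cache -name '*.tmp' -print0 |head -z -n1000 |xargs -0 -r rm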
There is one more option.
find -L /home/peter -name '*~' -exec rm '{}' +
does what xargs does: the + makes find batch many filenames into each rm invocation.
Your use of find may include filenames made by other users, in which case careless use of xargs is a security flaw.
http://www.zen19351.zen.co.uk/article_series/find_xargs_rm.html
The absence of the '-r' parameter just caused my 7z command to compress a whole file-system instead of, well, nothing! LOL. What a strange default action.
find and grep are the bread and butter of the Linux command line. Here are some more examples of the xargs command in UNIX.
Good Info. Thanks for sharing
Very useful information.