[CentOS] This "find" command

Les Mikesell lesmikesell at gmail.com
Wed Dec 28 17:38:08 UTC 2005


On Wed, 2005-12-28 at 11:13, rado wrote:

> although I did enjoy playing w/this as I had never any experience w/the
> "-exec" command...well, it produced about the same amt of files to send
> to rsync w/no clamscan errors that would stop it but it took approx 1 hr
> to complete.

For programs that take multiple filenames on the command line it is
much more efficient to pipe the list to xargs instead of using
-exec, which starts the program over again for every file.
However, if you have filenames with embedded spaces, shell
metacharacters, or newlines, you can have problems, because xargs
splits its input on whitespace before handing the names to the
program.  On GNU-based systems you can use the -print0 argument
to find and -0 to xargs so the filenames are passed
null-terminated and arrive on the command line intact.  When I
saw your first post I wondered if you had filenames with *'s or
spaces that made clamscan see directories after the shell
parsing and then waste time with its own recursion.
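
For example, something along these lines should do it (untested
here, and /home is just an illustrative starting point):

  find /home -type f -print0 | xargs -0 clamscan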

> also, it seems that no matter what I tried I cannot get find to stop
> looking in /proc  lol

One way is to use the -mount argument and make a separate run for
each filesystem.  That also avoids the problem of wandering into
ISO images, DVDs, NFS mounts, etc.
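
For instance, if / and /home are separate filesystems (adjust for
your actual mount points):

  find / -mount -type f -print0 | xargs -0 clamscan
  find /home -mount -type f -print0 | xargs -0 clamscan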

> oh well at least I have the statement to a state where it produces no
> errors to block the backup from taking place.

You can also redirect the find output to a file and look at
or edit the results before feeding it to xargs.
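
Roughly like this (the scratch file name is arbitrary):

  find /home -type f > /tmp/filelist.txt
  # look over or edit /tmp/filelist.txt, then:
  xargs clamscan < /tmp/filelist.txt

Note that a plain newline-separated list brings back the whitespace
caveat above, so this works best when the filenames are ordinary.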

-- 
  Les Mikesell
    lesmikesell at gmail.com