[CentOS] The directory that I am trying to clean up is huge

Les Mikesell lesmikesell at gmail.com
Mon Jan 25 18:40:46 UTC 2010


James B. Byrne wrote:
> On Mon, January 25, 2010 10:31, Robert Nichols wrote:
>
>> Now if the "{}" string appears more than once then the command line
>> contains that path more than once, but it is essentially impossible
>> to exceed the kernel's MAX_ARG_PAGES this way.
>>
>> The only issue with using "-exec command {} ;" for a huge number of
>> files is one of performance.  If there are 100,000 matched files,
>> the command will be invoked 100,000 times.
>>
>> --
>> Bob Nichols         RNichols42 at comcast.net
>>
> 
> Since the OP reported that the command he used:
> 
>   find -name "*.access*" -mtime +2 -exec rm {} \;
> 
> in fact failed, one may infer that more than performance is at issue.
> 
> The OP's problem lies not with the -exec construction but with the
> unstated, but nonetheless present, './' starting point of his find
> invocation, which begins a recursive descent into that directory
> tree.  Since neither the depth of that tree nor its contents is
> given to us, we may only infer that some number of files therein is
> causing the MAX_ARG_PAGES limit to be exceeded before the recursion
> returns.

Find just emits the filenames as they are encountered, so _no_ number of 
files should be able to cause an error.  An infinitely deep directory tree 
might, or recursively linked directories, but only after a considerable 
amount of time and churning as it exhausts the machine's real and virtual 
memory.
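
To make the contrast concrete, here is a minimal sketch of the variants 
under discussion, reusing the OP's pattern (the '+' form of -exec is 
POSIX; -print0 and xargs -0 are GNU extensions, but both ship with 
CentOS):

   # One rm process per matched file -- slow for 100,000 files,
   # but immune to any argument-length limit:
   find . -name "*.access*" -mtime +2 -exec rm {} \;

   # Batch as many names as fit into each rm invocation:
   find . -name "*.access*" -mtime +2 -exec rm {} +

   # The same batching via xargs; the NUL separators cope with
   # whitespace in filenames:
   find . -name "*.access*" -mtime +2 -print0 | xargs -0 rm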

> I deduce that he could instead provide the -prune option or the
> -maxdepth 0 option to avoid this recursion.  I have not tried
> either, but I understand that one or both should work.

I'd say it is more likely that the command that produced the error wasn't 
exactly what was posted, or that there is a filesystem problem.
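
A couple of harmless checks would narrow it down (assuming the GNU 
userland that ships with CentOS): see what the kernel's argument-space 
limit actually is, and re-run the find with -print so nothing is removed 
while you watch the traversal for errors:

   # The kernel's limit on space for exec arguments, in bytes:
   getconf ARG_MAX

   # Dry run: list what would have been removed, and see whether
   # the traversal itself reports any errors:
   find . -name "*.access*" -mtime +2 -print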

-- 
   Les Mikesell
    lesmikesell at gmail.com
