The directory that I am trying to clean up is huge. Every time I try, I get this error message:
-bash: /usr/bin/find: Argument list too long
Please advise
Anas
I tried to run this command
find -name "*.access*" -mtime +2 -exec rm {} \;
and I get the same error message.
Anas
-----Original Message-----
From: centos-bounces@centos.org [mailto:centos-bounces@centos.org] On Behalf Of Marcelo M. Garcia
Sent: Saturday, January 23, 2010 3:34 PM
To: CentOS mailing list
Subject: Re: [CentOS] The directory that I am trying to clean up is huge
Anas Alnaffar wrote:
The directory that I am trying to clean up is huge. Every time I try, I get this error message:
-bash: /usr/bin/find: Argument list too long
Please advise
*Anas *
Hi
Could you post the complete command? Please provide more details.
Regards
mg.
_______________________________________________ CentOS mailing list CentOS@centos.org http://lists.centos.org/mailman/listinfo/centos
From: Anas Alnaffar a.alnaffar@tijaritelecom.com
I tried to run this command: find -name "*.access*" -mtime +2 -exec rm {} \; and I got the same error message.
How many "*.access*" are there...?
JD
On Mon, Jan 25, 2010 at 03:14:54AM -0800, John Doe wrote:
From: Anas Alnaffar a.alnaffar@tijaritelecom.com
I tried to run this command: find -name "*.access*" -mtime +2 -exec rm {} \; and I got the same error message.
How many "*.access*" are there...?
JD
if there are so many that you're finding the previously suggested techniques difficult to use, you can try the brute-force approach I sometimes use...
run: ls > list
then edit the file (list) with a decent text editor, one in which you can use one command to place text at the beginning of every line such that every line then turns out to read:
rm file1
rm file2
etc, as well as removing any lines for files you do NOT want to remove.
if you have 'vi', this command will do the edits for you: ":1,$s/^/rm /"
then make the file executable:
chmod a+x list
then run it:
./list
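The recipe above can be sketched non-interactively (a sketch only: it uses sed for the edit step instead of vi, "sh list" instead of the chmod step, and invented file names in a scratch directory):

```shell
set -e
dir=$(mktemp -d)                 # stand-in for the huge directory
cd "$dir"
touch old1.access.log old2.access.log keep.txt

ls > list                        # capture the file names, one per line
sed -i 's/^/rm /' list           # same edit as the vi command ":1,$s/^/rm /"
sed -i '/keep\.txt/d' list       # drop lines for files we do NOT want removed
sh list                          # run it (sh avoids the chmod a+x step)
ls                               # only keep.txt should remain
```

Note that "ls > list" captures the file named "list" itself, so the generated script also removes itself when run.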
fred smith wrote:
On Mon, Jan 25, 2010 at 03:14:54AM -0800, John Doe wrote:
From: Anas Alnaffar a.alnaffar@tijaritelecom.com
I tried to run this command: find -name "*.access*" -mtime +2 -exec rm {} \; and I got the same error message.
How many "*.access*" are there...?
JD
if there are so many that you're finding the previously suggested techniques difficult to use, you can try the brute-force approach I sometimes use...
It actually shouldn't matter. As long as the wildcards are quoted on the command line, you shouldn't get an error from too many files. I suspect the command that was typed wasn't exactly what is shown above.
fred smith wrote:
On Mon, Jan 25, 2010 at 03:14:54AM -0800, John Doe wrote:
From: Anas Alnaffar a.alnaffar@tijaritelecom.com
I tried to run this command: find -name "*.access*" -mtime +2 -exec rm {} \; and I got the same error message.
How many "*.access*" are there...?
if there are so many that you're finding the previously suggested techniques difficult to use, you can try the brute-force approach I sometimes use...
It actually shouldn't matter. As long as the wildcards are quoted on the command line, you shouldn't get an error from too many files. I suspect the command that was typed wasn't exactly what is shown above.
First, I don't see the path there, which *must* be after the command....
Also, I don't believe that "" will work - the shell will interpret that. I think you need '', or, what I always use, \, so that if I were typing it, I'd have: $ find . -name \*.access\* -mtime +2 -exec rm {} \;
mark
On 1/25/2010 8:49 AM, Chan Chung Hang Christopher wrote:
Anas Alnaffar wrote:
I tried to run this command
find -name "*.access*" -mtime +2 -exec rm {} ;
Should have been: find ./ -name \*.access\* -mtime +2 -exec rm -f {} \;
No difference. If the path is omitted, current versions of find assume the current directory, and double quotes are fine for avoiding shell expansion of wildcards. (But, I'm guessing the quotes were omitted on the command that generated the error).
Les Mikesell wrote:
On 1/25/2010 8:49 AM, Chan Chung Hang Christopher wrote:
Anas Alnaffar wrote:
I tried to run this command
find -name "*.access*" -mtime +2 -exec rm {} ;
Should have been: find ./ -name \*.access\* -mtime +2 -exec rm -f {} \;
No difference. If the path is omitted, current versions of find assume the current directory, and double quotes are fine for avoiding shell expansion of wildcards. (But, I'm guessing the quotes were omitted on the command that generated the error).
Well, like you said, I cannot imagine the above command line generating a "too many arguments" error. That only makes sense if find was fed too many arguments.
On Jan 26, 2010, at 6:06 PM, Les Mikesell wrote:
On 1/25/2010 8:49 AM, Chan Chung Hang Christopher wrote:
Anas Alnaffar wrote:
I tried to run this command
find -name "*.access*" -mtime +2 -exec rm {} ;
Should have been: find ./ -name \*.access\* -mtime +2 -exec rm -f {} \;
No difference. If the path is omitted, current versions of find assume the current directory, and double quotes are fine for avoiding shell expansion of wildcards. (But, I'm guessing the quotes were omitted on the command that generated the error).
In my defense, I didn't realize that there were versions of find that didn't require a starting location. And I've tended to stick with more standard versions of commands like this, since I've had to use too many stripped-down systems over the years, plus I still use several different Unix-like systems. CentOS 5 does work without the path, but I wonder now when that was added to Linux? OS X doesn't support that variant. I don't know yet about Solaris.
On Wednesday, January 27, 2010 11:35 AM, Kevin Krieser wrote:
On Jan 26, 2010, at 6:06 PM, Les Mikesell wrote:
On 1/25/2010 8:49 AM, Chan Chung Hang Christopher wrote:
Anas Alnaffar wrote:
I tried to run this command
find -name "*.access*" -mtime +2 -exec rm {} ;
Should have been: find ./ -name \*.access\* -mtime +2 -exec rm -f {} \;
No difference. If the path is omitted, current versions of find assume the current directory, and double quotes are fine for avoiding shell expansion of wildcards. (But, I'm guessing the quotes were omitted on the command that generated the error).
In my defense, I didn't realize that there were versions of find that didn't require a starting location. And I've tended to stick with more standard versions of commands like this, since I've had to use too many stripped-down systems over the years, plus I still use several different Unix-like systems. CentOS 5 does work without the path, but I wonder now when that was added to Linux? OS X doesn't support that variant. I don't know yet about Solaris.
GNU find and anything GNU has always been a bit different from UNIX/POSIX versions. GNU is NOT UNIX after all.
However, there are cases where you would want to use GNU find over the local UNIX version of find, like on Solaris 8. Way, way faster. Of course, the Larry Lackeys, er, Sun Engineers would point out that GNU find is not doing things 'correctly.'
Now that I have gone way off topic and started bashing other operating systems, I shall make this my last post on this thread.
At Sat, 23 Jan 2010 15:23:58 +0300 CentOS mailing list centos@centos.org wrote:
The directory that I am trying to clean up is huge. Every time I try, I get this error message:
-bash: /usr/bin/find: Argument list too long
'man xargs'
find <mumble> -print | xargs rm
Please advise
Anas
Robert Heller wrote:
At Sat, 23 Jan 2010 15:23:58 +0300 CentOS mailing list centos@centos.org wrote:
The directory that I am trying to clean up is huge. Every time I try, I get this error message:
-bash: /usr/bin/find: Argument list too long
'man xargs'
find <mumble> -print | xargs rm
Hi
Just curious. What is the difference between the command above and "find <mumble> -exec rm -f {} \;" ?
Thanks
mg.
On Sat, 23 Jan 2010, Marcelo M. Garcia wrote:
Robert Heller wrote:
-bash: /usr/bin/find: Argument list too long
'man xargs'
find <mumble> -print | xargs rm
Hi
Just curious. What is the difference between the command above and "find <mumble> -exec rm -f {} \;" ?
the find ... -exec variation will invoke a new "rm" command for every single file it finds, which will simply take more time to run. beyond that, the effect should be the same.
rday --
======================================================================== Robert P. J. Day Waterloo, Ontario, CANADA
Linux Consulting, Training and Kernel Pedantry.
Web page: http://crashcourse.ca Twitter: http://twitter.com/rpjday ========================================================================
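The per-file cost rday describes can be made visible by substituting echo for rm (scratch directory and file names here are invented for illustration):

```shell
dir=$(mktemp -d)
touch "$dir/f1.access" "$dir/f2.access" "$dir/f3.access"

# -exec ... \; forks one child per matched file: three echo runs,
# so three lines of output.
find "$dir" -name '*.access' -exec echo would-remove {} \; | wc -l

# Piping to xargs batches all three names into a single echo run,
# so one line of output.
find "$dir" -name '*.access' -print | xargs echo would-remove | wc -l
```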
On Jan 23, 2010, at 6:45 AM, Robert P. J. Day wrote:
On Sat, 23 Jan 2010, Marcelo M. Garcia wrote:
Robert Heller wrote:
-bash: /usr/bin/find: Argument list too long
'man xargs'
find <mumble> -print | xargs rm
Hi
Just curious. What is the difference between the command above and "find <mumble> -exec rm -f {} \;" ?
the find ... -exec variation will invoke a new "rm" command for every single file it finds, which will simply take more time to run. beyond that, the effect should be the same.
Unless there are files or directories with spaces in them, in which case the xargs variant can fail.
It is likely the original poster either did "find * ..." or "find . -name *" and the bash shell still expanded the arguments. He was on the right track using the find command, but it wasn't used correctly.
Am 23.01.2010 14:12, schrieb Kevin Krieser:
On Jan 23, 2010, at 6:45 AM, Robert P. J. Day wrote:
On Sat, 23 Jan 2010, Marcelo M. Garcia wrote:
Robert Heller wrote:
-bash: /usr/bin/find: Argument list too long
'man xargs'
find <mumble> -print | xargs rm
Hi
Just curious. What is the difference between the command above and "find <mumble> -exec rm -f {} \;" ?
the find ... -exec variation will invoke a new "rm" command for every single file it finds, which will simply take more time to run. beyond that, the effect should be the same.
Unless there are files or directories with spaces in them, in which case the xargs variant can fail.
find on CentOS 5.4 supports
find <path> -exec rm -f {} +
which avoids the negative effect of spawning a new subprocess for every file, as "-exec rm -f {} \;" does.
find on CentOS 4.8 does not support that.
It is likely the original poster either did "find * ..." or "find . -name *" and the bash shell still expanded the arguments. He was on the right track using the find command, but it wasn't used correctly.
Alexander
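A sketch of the '+' form described above (note that '+', unlike ';', needs no backslash because it is not special to the shell; directory and file names are stand-ins):

```shell
dir=$(mktemp -d)
touch "$dir/a.access" "$dir/b.access" "$dir/keep.txt"

# The '+' terminator appends as many matched names as fit onto a
# single rm command line, xargs-style, instead of forking rm per file.
find "$dir" -name '*.access' -exec rm -f {} +

ls "$dir"    # keep.txt is the only survivor
```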
find on CentOS 5.4 supports
find <path> -exec rm -f {} +
which avoids the negative effect of spawning a new subprocess for every file, as "-exec rm -f {} \;" does.
find on CentOS 4.8 does not support that.
I'll have to give that a try sometime. A person gets used to a subset of a command, and doesn't necessarily look for new options being added.
In article C31ED75A-0115-44FC-940D-C2956C46EA05@sbcglobal.net, Kevin Krieser k_krieser@sbcglobal.net wrote:
On Jan 23, 2010, at 6:45 AM, Robert P. J. Day wrote:
On Sat, 23 Jan 2010, Marcelo M. Garcia wrote: the find ... -exec variation will invoke a new "rm" command for every single file it finds, which will simply take more time to run. beyond that, the effect should be the same.
Unless there are files or directories with spaces in them, in which case the xargs variant can fail.
That's what -print0 is for, together with the -0 option to xargs:
find dir1 dir2 -name '*.foo' -print0 | xargs -0 rm
Cheers Tony
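A self-contained illustration of why the NUL-delimited form matters (the file names here are invented):

```shell
dir=$(mktemp -d)
touch "$dir/plain.foo" "$dir/name with spaces.foo"

# Plain xargs would split "name with spaces.foo" into three separate
# words and rm would fail; -print0 and -0 delimit names with NUL
# bytes, which cannot appear in a file name.
find "$dir" -name '*.foo' -print0 | xargs -0 rm

ls "$dir"    # empty: both files, spaces and all, were removed
```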
At Sat, 23 Jan 2010 12:43:40 +0000 CentOS mailing list centos@centos.org wrote:
Robert Heller wrote:
At Sat, 23 Jan 2010 15:23:58 +0300 CentOS mailing list centos@centos.org wrote:
The directory that I am trying to clean up is huge. Every time I try, I get this error message:
-bash: /usr/bin/find: Argument list too long
'man xargs'
find <mumble> -print | xargs rm
Hi
Just curious. What is the difference between the command above and "find <mumble> -exec rm -f {} \;" ?
The command "find <mumble> -exec rm -f {} \;" collects ALL of the names "find <mumble>" finds into a single command line, which in your case is too large for the shell to deal with. The command "find <mumble> -print | xargs rm" uses a pipeline. "find <mumble> -print" *prints* the names it finds to stdout. xargs reads stdin, line by line, and collects those lines as words up to some reasonable string length (within the shell's command line length limits) and passes this argument list to the command given as xargs's arguments. If necessary, xargs will call the command repeatedly with suitable subsets of the complete list, keeping each subset below the shell's command line string length limit.
The '-exec ...' option to find is fine for small sets of results. The "find ... -print | xargs ..." will handle arbitrarily large result sets. xargs can also be used anyplace you happen to have a list of names (one per line) that you need to pass as words to a command:
tar tzf foo.tar.gz | xargs -r ls -d
Will list those files that are in the tar file that are also on disk. The '-r' option to xargs will prevent xargs from calling ls with no arguments, if xargs happens not to get any input.
Thanks
mg.
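xargs's batching can be demonstrated directly by shrinking the batch size with -n (here 2), a stand-in for the real command-line length limit:

```shell
# Five names, at most two per invocation: echo runs three times.
printf '%s\n' f1 f2 f3 f4 f5 | xargs -n 2 echo rm
# rm f1 f2
# rm f3 f4
# rm f5
```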
Robert Heller wrote:
At Sat, 23 Jan 2010 12:43:40 +0000 CentOS mailing list centos@centos.org wrote:
Just curious. What is the difference between the command above and "find <mumble> -exec rm -f {} \;" ?
The command "find <mumble> -exec rm -f {} \;" collects ALL of the names "find <mumble>" finds into a single command line, which in your case is too large for the shell to deal with.
Gosh, then I guess the manpage for 'find' must be totally wrong where it says:
-exec command ; ... The specified command is run once for each matched file.
http://www.google.com/search?as_epq=Argument+list+too+long
Kai