Is there a switch in "find" (or some other command besides find) that'll let you find files larger than a specified size?
My file system is 88% full and I'd like to see where the biggest space hoggers are.
PG
On Thursday 10 January 2008 23:21:55 techlists@comcast.net wrote:
Is there a switch in "find" (or some other command besides find) that'll let you find files larger than a specified size?
My file system is 88% full and I'd like to see where the biggest space hoggers are.
I also found this on the net: du /path/to/anywhere/* -hs | grep '[0-9]M' | sort -rn | head -20
(Note the quotes around the grep pattern, so the shell doesn't glob-expand it.) It will sort the space usage of each directory. HTH,
On Sun, Jan 13, 2008, Fajar Priyanto wrote:
<snip>
I usually use something like:
find /mountpoint -xdev -size +10000 > someplacenotfull
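One unit note on that command: with GNU find, a bare number after -size counts 512-byte blocks, so +10000 matches files over roughly 5 MB. A quick sketch with throwaway files (the scratch directory and file names are made up for illustration):

```shell
# Scratch directory with one file above and one below the threshold
tmp=$(mktemp -d)
dd if=/dev/zero of="$tmp/big"   bs=1M count=6 2>/dev/null   # ~6 MB
dd if=/dev/zero of="$tmp/small" bs=1K count=10 2>/dev/null  # ~10 KB

# +10000 means "more than 10000 512-byte blocks" (~5 MB);
# -xdev keeps find from descending into other mounted filesystems
find "$tmp" -xdev -size +10000   # prints only the big file

rm -rf "$tmp"
```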
Bill
--
INTERNET: bill@celestial.com            Bill Campbell; Celestial Software LLC
URL: http://www.celestial.com/          PO Box 820; 6641 E. Mercer Way
FAX: (206) 232-9186                     Mercer Island, WA 98040-0820; (206) 236-1676

"Our Foreign dealings are an Open Book, generally a Check Book." -- Will Rogers
On Sunday, 13.01.2008, at 10:16 +0700, Fajar Priyanto wrote:
<snip>
This only shows you usage for directories between 1 MB and 1 GB, because the grep keeps only human-readable sizes ending in M. To see all of them:
du /path/to/anywhere/* -s | sort -rn | head -20
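To make the sort/head stage concrete, here is what it does on some made-up du-style input (block count, tab, directory name; the names are invented):

```shell
# Simulate `du -s` output and keep the biggest entries first:
# sort -rn orders numerically, largest first; head trims the list to 20
printf '12\tsmall_dir\n2048\tbig_dir\n300\tmid_dir\n' | sort -rn | head -20
```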
On Mon, 2008-01-14 at 08:21 +0100, Andreas Kuntzagk wrote:
<snip>
May I suggest (g)awk? That way you'll get all of what you want, not just the first 20.
du -s * | sort -rn | gawk --re-interval '/^[[:digit:]]{4,}\t/' -
This shows dirs with block counts of 1000 or more. And then there is perl etc. Usually these threads get long as everyone jumps in with their personal favorite, including me here. :-)
And smaller dirs can be identified with
du -s * | sort -rn | gawk --re-interval '/^[[:digit:]]{1,3}\t/' -
BTW, I was surprised that the 4.* implementation defaults required the "--re-interval" switch. Hmmm.
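An alternative that sidesteps interval regexes (and the --re-interval switch) entirely is a plain numeric comparison in awk. This is a sketch of the same ">= 1000 blocks" filter, not the original poster's exact command:

```shell
# Filter du output numerically instead of counting digits:
# print only lines whose first field (block count) is at least 1000
du -s * | sort -rn | awk '$1 >= 1000'
```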
HTH
On Mon, 2008-01-14 at 09:56 -0500, William L. Maltby wrote:
<snip>
BTW, sort can be eliminated if order is unimportant.
<snip>
On Mon, 2008-01-14 at 09:56 -0500, William L. Maltby wrote:
<snip>
Why not just use find to test for file size, since that's what he asked for in the first place? :)
find ./ -size +5M
Finds, recursively from the directory you are standing in, all files larger than 5 MB. From the man page:
-size n[cwbkMG]  File uses n units of space
Numeric arguments can be specified as +n for greater than n, -n for less than n, and n for exactly n.
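The +n/-n/n convention can be checked quickly in a scratch directory (the directory and file names here are invented for the demonstration):

```shell
# Two files on either side of the 5 MB threshold
tmp=$(mktemp -d)
dd if=/dev/zero of="$tmp/six_meg" bs=1M count=6 2>/dev/null
dd if=/dev/zero of="$tmp/one_meg" bs=1M count=1 2>/dev/null

find "$tmp" -type f -size +5M   # greater than 5 MB: matches six_meg
find "$tmp" -type f -size -5M   # less than 5 MB: matches one_meg
find "$tmp" -type f -size 6M    # exactly 6 megabyte units: matches six_meg

rm -rf "$tmp"
```

(-type f keeps the directory entry itself out of the results; sizes with a M suffix are rounded up to whole megabyte units before comparison.)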
/ C
On Mon, 2008-01-14 at 16:59 +0100, Carl Boberg wrote:
<snip>
YES! I got caught up in, and side-tracked by, the "du" replies and didn't carefully re-read the OP.
-- Bill