[CentOS] file system defragmentation

Fri Aug 26 14:26:41 UTC 2005
Maciej Żenczykowski <maze at cela.pl>

speaking of many continuously growing files...
like wgetting a few (3-4) big files (gigabytes each) at a slow rate (100 kB/s).
Is there some way to do preallocation (we know how big the files are gonna 
be...)?  Does wget (or other such programs) support this?  What about cp, 
etc.?  It would seem to be one of those useful things :)
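
For what it's worth, here's the kind of thing I mean: a minimal sketch,
assuming a POSIX system where posix_fallocate(3) is available (the filename
and size are made up, and on filesystems or older glibc without real
allocation support the call may just be emulated by writing zeroes):

#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/types.h>
#include <unistd.h>

int main(void)
{
    const char *path = "download.part";     /* hypothetical output file */
    off_t size = (off_t)700 * 1024 * 1024;  /* e.g. a 700 MB ISO */
    int fd, err;

    fd = open(path, O_CREAT | O_WRONLY, 0644);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    /* Reserve all the blocks up front, so the slow incremental
     * writes that follow can land in one contiguous run. */
    err = posix_fallocate(fd, 0, size);
    if (err != 0) {                         /* returns an errno value */
        fprintf(stderr, "posix_fallocate: %s\n", strerror(err));
        close(fd);
        return 1;
    }

    close(fd);
    return 0;
}

The same call would let cp or wget reserve the full size before the first
write, since cp can stat the source and wget knows the size up front
(when the server sends Content-Length).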

Cheers,
MaZe.

On Fri, 26 Aug 2005, Les Mikesell wrote:

> On Fri, 2005-08-26 at 08:19, Bryan J. Smith wrote:
>
>> Instead of rehashing this for the 23rd time on a list, I'll create a
>> blog entry later this evening from my past posts.  I'll send you the
>> link when I put it up.
>>
>> Short version:
>> - Strict separation of binaries, data and temporary files
>> - Reservation of disk space (filesystems never completely fill)
>> - Allocation approaches of inode design filesystems
>
> Please cover the common and difficult cases of many continuously
> growing files and the situation where many small files have
> been created and removed and those non-contiguous blocks are
> now the only remaining space on the drive when you need to write
> a large file.  And how to fix this after the fact...