MHR wrote:
> On Jan 30, 2008 8:26 AM, William L. Maltby <CentOS4Bill at triad.rr.com> wrote:
> As long as the majority of the files are not plain text - I have had
> really bad results using bzip2 on text files - specifically, massive
> file corruption. I have had to go back to pre-bzipped archives to
> rebuild these files - not a fun task.

I've been using pigz (parallel gzip) for a while to compress 100+GB tar
files; it works well if you have multiple CPUs. I've never encountered
corruption with bzip2 myself. There is a parallel bzip2 as well, but it's
about 8x slower.

From my notes:
--
To compile:

gcc pigz17.c -lpthread -lz -o pigz

Sample command line:

pigz -p 10 -v (filename)

The default of 32 threads seems kind of high - drive load goes up quite a
bit - while 10 threads, at least in a simple test on a 2GB file, kept the
load a lot lower but still kept the CPUs busy at 100% utilization on a
dual core system. YMMV.

original source: http://zlib.net/pigz17.c.gz

If that doesn't exist there may be a newer version; try pigz18.c.gz,
pigz19.c.gz, etc.
---

To be safe, since I deployed it a few months ago I've been running
gzip -t afterwards on the files, and all of them have passed.

nate
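
For anyone scripting this, here is a minimal sketch of the
compress-and-verify workflow described above. The directory name, output
file, and thread count are placeholders, not taken from the original post.

  # stream a tar of the data through pigz with 10 compression threads
  tar -cf - /path/to/data | pigz -p 10 > backup.tar.gz

  # pigz writes standard gzip output, so gzip -t can verify the archive
  gzip -t backup.tar.gz && echo "archive OK"

Because the output is ordinary gzip format, the archive can also be
unpacked on machines that only have gzip installed (tar -xzf backup.tar.gz).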