I need to tar up a good 100 GiB of files, but tar is progressing at a rate of about 1 MiB per second. Is there something, anything, faster?
Thanks!
On 01/06/2011 05:47 AM, Dotan Cohen wrote:
> I need to tar up a good 100 GiB of files, but tar is progressing at a rate of about 1 MiB per second. Is there something, anything, faster?
tar is normally screaming fast unless you use bzip2 compression (or gzip compression on an underpowered CPU).
Provide details: What are you tarring, how are you invoking tar, and what hardware are you running on (hard drive types, CPU type, etc.)?
On Thu, Jan 6, 2011 at 15:54, Jerry Franz jfranz@freerun.com wrote:
> tar is normally screaming fast unless you use bzip2 compression (or gzip compression on an underpowered CPU).
> Provide details: What are you tarring, how are you invoking tar, and what hardware are you running on (hard drive types, CPU type, etc.)?
Thanks, Jerry, I was in fact using bzip2: $ tar -cjf dcl-2010-12-07.tbz dcl-2010-12-07/
I don't really need compression, just archiving (I'm moving Linux files via a FAT-formatted external hard drive), so I ditched the j option and it's now screaming along at almost 80 MiB/sec. Thanks!
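For anyone following along, the plain invocation is just the same command without j. One caveat, though: if the drive is FAT32 (rather than some other FAT variant), single files max out at 4 GiB, so a 100 GiB archive has to be split. Roughly, with the names from above and an example chunk size:

  # No compression, just archive:
  tar -cf dcl-2010-12-07.tar dcl-2010-12-07/

  # If the drive is FAT32 (4 GiB max file size), split the stream
  # into pieces that fit; 3900M is just an example chunk size:
  tar -cf - dcl-2010-12-07/ | split -b 3900M - dcl-2010-12-07.tar.part-

  # Reassemble later with:
  #   cat dcl-2010-12-07.tar.part-* | tar -xf -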
On Thu, Jan 6, 2011 at 7:30 PM, Dotan Cohen dotancohen@gmail.com wrote:
> On Thu, Jan 6, 2011 at 15:54, Jerry Franz jfranz@freerun.com wrote:
>> tar is normally screaming fast unless you use bzip2 compression (or gzip compression on an underpowered CPU).
>> Provide details: What are you tarring, how are you invoking tar, and what hardware are you running on (hard drive types, CPU type, etc.)?
> Thanks, Jerry, I was in fact using bzip2: $ tar -cjf dcl-2010-12-07.tbz dcl-2010-12-07/
bzip2 will slow down the operation. If you don't really need compression, then simply do "tar cf <tar file> <dir/file list>"
-- Arun Khan
On Thu, Jan 6, 2011 at 16:08, Arun Khan knura9@gmail.com wrote:
>> Thanks, Jerry, I was in fact using bzip2: $ tar -cjf dcl-2010-12-07.tbz dcl-2010-12-07/
> bzip2 will slow down the operation. If you don't really need compression, then simply do "tar cf <tar file> <dir/file list>"
Yup, that's what I'm doing now! Thanks.
Hello Dotan,
On Thu, 2011-01-06 at 16:19 +0200, Dotan Cohen wrote:
> On Thu, Jan 6, 2011 at 16:08, Arun Khan knura9@gmail.com wrote:
>> bzip2 will slow down the operation. If you don't really need compression, then simply do "tar cf <tar file> <dir/file list>"
> Yup, that's what I'm doing now! Thanks.
Gzip is pretty fast and should still give you decent compression. In most cases the highest compression level will hardly do better than the default level of 6, so just go with the default (tar cz). Use bzip2 only if space is a big concern.
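For example (the archive name is just a placeholder):

  # tar's z option uses gzip's default level (6):
  tar -czf archive.tgz dcl-2010-12-07/

  # To choose a level explicitly, pipe through gzip yourself;
  # -1 is fastest, -9 is smallest:
  tar -cf - dcl-2010-12-07/ | gzip -6 > archive.tgz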
Regards, Leonard.
On Thu, Jan 06, 2011 at 08:15:16PM +0100, Leonard den Ottolander wrote:
> Hello Dotan,
> On Thu, 2011-01-06 at 16:19 +0200, Dotan Cohen wrote:
>> On Thu, Jan 6, 2011 at 16:08, Arun Khan knura9@gmail.com wrote:
>>> bzip2 will slow down the operation. If you don't really need compression, then simply do "tar cf <tar file> <dir/file list>"
>> Yup, that's what I'm doing now! Thanks.
> Gzip is pretty fast and should still give you decent compression. In most cases the highest compression level will hardly do better than the default level of 6, so just go with the default (tar cz). Use bzip2 only if space is a big concern.
pigz is a valuable tool for anyone who needs gzip compression and has more than one CPU. It runs the gzip compression in parallel across cores for a huge speedup. I've been using it for months - very stable.
To use with GNU tar: "--use-compress-program /usr/local/bin/pigz".
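For example (archive name and thread count are only illustrative, and this assumes pigz is on your PATH):

  # Let tar drive pigz directly:
  tar --use-compress-program=pigz -cf archive.tgz dcl-2010-12-07/

  # Or pipe explicitly and cap the thread count with -p:
  tar -cf - dcl-2010-12-07/ | pigz -p 4 > archive.tgz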
Download from: http://pkgs.org/centos-5-rhel-5/rpmforge-i386/pigz-2.1.6-1.el5.rf.i386.rpm.h...
Homepage: http://www.zlib.net/pigz/
Review: http://andrew.tumblr.com/post/344920968
On Thu, Jan 6, 2011 at 9:08 AM, Arun Khan knura9@gmail.com wrote:
> On Thu, Jan 6, 2011 at 7:30 PM, Dotan Cohen dotancohen@gmail.com wrote:
>> On Thu, Jan 6, 2011 at 15:54, Jerry Franz jfranz@freerun.com wrote:
>>> tar is normally screaming fast unless you use bzip2 compression (or gzip compression on an underpowered CPU).
>>> Provide details: What are you tarring, how are you invoking tar, and what hardware are you running on (hard drive types, CPU type, etc.)?
>> Thanks, Jerry, I was in fact using bzip2: $ tar -cjf dcl-2010-12-07.tbz dcl-2010-12-07/
> bzip2 will slow down the operation. If you don't really need compression, then simply do "tar cf <tar file> <dir/file list>"
It would be interesting to see how compression would do when pushing across a slow link :).
There are many times when I run across the OP's scenario, but often I need to push a tar across a relatively slow link. bzip2 is slower to compress, but may make up the difference in transfer time. Maybe time to fire up Maxima and find that sweet spot :D
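No CAS needed for a first cut, though - the total is just compression time plus compressed size divided by link speed. A rough empirical sketch (the sample directory and the 1 MB/s link speed are made up):

  # Time each compressor on a representative sample:
  time tar -cjf sample.tbz sample/   # bzip2: smaller, slower
  time tar -czf sample.tgz sample/   # gzip: larger, faster

  # Compare the resulting sizes:
  ls -l sample.tbz sample.tgz

  # For each: total = compression seconds + (bytes / 1000000)
  # over a 1 MB/s link; pick whichever total is smaller.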
On 1/6/2011 1:04 PM, Kwan Lowe wrote:
> It would be interesting to see how compression would do when pushing across a slow link :).
> There are many times when I run across the OP's scenario, but often I need to push a tar across a relatively slow link. bzip2 is slower to compress, but may make up the difference in transfer time. Maybe time to fire up Maxima and find that sweet spot :D
If you already have part of the content, or an older version of the files with small changes, 'rsync -P -z...' is the way to go.
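For example (the host and paths are hypothetical):

  # -a archive mode, -z compress in transit, -P progress + resumable:
  rsync -azP dcl-2010-12-07/ user@backuphost:/backups/dcl-2010-12-07/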
On Thu, 2011-01-06 at 15:47 +0200, Dotan Cohen wrote:
> I need to tar up a good 100 GiB of files, but tar is progressing at a rate of about 1 MiB per second. Is there something, anything, faster?
Yes, star.
http://cdrecord.berlios.de/private/star.html
And it is in the CentOS repos. The "-fifo" option can help a lot [and it backs up ACLs & xattrs too!].
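A sketch from memory - star takes key=value options, the fifo size here is just an example, and the exact option spellings should be checked against star(1):

  # -c create, f= the archive file, fs= the fifo buffer size:
  star -c -fifo fs=64m f=dcl-2010-12-07.tar dcl-2010-12-07/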
On Thu, Jan 6, 2011 at 16:06, Adam Tauno Williams awilliam@whitemice.org wrote:
> On Thu, 2011-01-06 at 15:47 +0200, Dotan Cohen wrote:
>> I need to tar up a good 100 GiB of files, but tar is progressing at a rate of about 1 MiB per second. Is there something, anything, faster?
> Yes, star.
> http://cdrecord.berlios.de/private/star.html
> And it is in the CentOS repos. The "-fifo" option can help a lot [and it backs up ACLs & xattrs too!].
Thanks, I'll take a look at that.