[CentOS] Deduplicated archives via hardlinks [Was: XFS or EXT3 ?]

Adam Tauno Williams awilliam at whitemice.org
Fri Dec 3 21:14:21 UTC 2010


On Fri, 2010-12-03 at 12:51 -0800, John R Pierce wrote: 
> On 12/03/10 12:25 PM, Les Mikesell wrote:
> > Whenever anyone mentions backups, I like to plug the backuppc program
> > (http://backuppc.sourceforge.net/index.html and packaged in EPEL).  It
> > uses compression and hardlinks all duplicate files to keep much more
> > history than you'd expect on line with a nice web interface - and does
> > pretty much everything automatically.
> I'm curious how you backup backuppc, like for disaster recovery, 

I know nothing about backuppc;  I don't use it.  But we use rsync with
the same concept for a deduplicated archive.

> archival, etc?   since all the files are in a giant mess of symlinks

No, they are not symbolic links - they are *hard links*.   That they are
hard links is the actual magic: when an expired snapshot is deleted, any
file whose last link goes away is deallocated automatically.  Symbolic
links would not give you that.

> (for deduplication) with versioning, I'd have to assume the archive 
> volume gets really messy after awhile, and further, something like that 
> is pretty darn hard to make a replica of it.

I don't see why;  only the archive is deduplicated in this manner, and
it certainly isn't "messy".  One simply makes a backup [for us that
means to tape - a disk is not a backup] of the most current snapshot.
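
For example, dumping the most recent snapshot to tape could look roughly
like the sketch below (the tape device /dev/st0 and the LAST.STAMP file
are assumptions carried over from the script that follows):

export ROOT="/srv/cifs/Arabis-Red"
export LASTSTAMP=`cat $ROOT/LAST.STAMP`
# only the newest snapshot needs to go to tape; older snapshots are just
# additional names for the same data
tar -cf /dev/st0 -C $ROOT $LASTSTAMP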

The script just looks like this -

export ROOT="/srv/cifs/Arabis-Red"
export STAMP=`date +%Y%m%d%H`
export LASTSTAMP=`cat $ROOT/LAST.STAMP`
mkdir $ROOT/$STAMP
mkdir $ROOT/$STAMP/home

# --link-dest makes rsync hard-link any file that is unchanged since the
# previous snapshot instead of transferring and storing it again
nice rsync --verbose --archive --delete --acls \
      --link-dest $ROOT/$LASTSTAMP/home/ \
      --numeric-ids \
      -e ssh \
        archivist@arabis-red:/home/ \
          $ROOT/$STAMP/home/ \
          > $ROOT/$STAMP/home.log 2>&1   # capture stdout and stderr in the log

echo $STAMP > $ROOT/LAST.STAMP
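
Expiring old snapshots is then just a matter of deleting the stamp
directories you no longer want: blocks still referenced from a newer
snapshot survive, everything else is deallocated.  A rough sketch,
assuming the same $ROOT layout and a 30-day retention (the retention
period is my own example, not part of our script):

export ROOT="/srv/cifs/Arabis-Red"
# remove snapshot directories older than 30 days; LAST.STAMP is a file,
# so -type d leaves it alone
find $ROOT -mindepth 1 -maxdepth 1 -type d -mtime +30 -exec rm -rf {} \;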
