On Mon, 2006-11-06 at 15:59 -0800, Mark Schoonover wrote:
BackupPC is even more extreme in the space savings. It first compresses the files, then detects duplicates using an efficient hashing scheme and links all duplicates to one pooled copy, whether they came from the same source or not. It includes a custom rsync on the server side that understands the compressed storage format but works with a stock rsync on the remote side, so you don't need any special client software. And it has a nice web interface for browsing the backup archive and doing restores.
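The pooling idea is roughly like this (a toy Python sketch, not BackupPC's actual code, which is Perl; the real thing hashes more cleverly and handles hash collisions, which this glosses over):

    import hashlib, os, zlib

    def pool_file(src, pool_dir, backup_path):
        # Hash the content to spot duplicates, wherever they came from.
        data = open(src, 'rb').read()
        key = hashlib.md5(data).hexdigest()
        pooled = os.path.join(pool_dir, key)
        if not os.path.exists(pooled):
            # First time this content has been seen: compress and store once.
            with open(pooled, 'wb') as f:
                f.write(zlib.compress(data))
        # Every backup entry is just another hardlink to the pooled copy.
        os.link(pooled, backup_path)

So a file backed up from a hundred machines still costs you one compressed copy plus a hundred directory entries.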
Sounds like a good product! Since I had plenty of terabytes of storage, compressing the files wasn't a requirement. One of my requirements was not having a GUI to configure things or do restores. Thanks for pointing out some info on BackupPC; I'd never heard of it.
BackupPC doesn't need the web interface; it is just handy for some operations, like grabbing a file or a zip archive directly through a browser download. It also has a concept of machine 'owners', so if people connect through the browser, each one sees only their own backups. There are command-line tools as well, and it can construct what looks like a full tar image out of the compressed files at any point where you made an incremental or full run.
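For example, if I remember the syntax right, the BackupPC_tarCreate script on the server does that (hostname and share name here are just placeholders):

    BackupPC_tarCreate -h somehost -n -1 -s /home . > somehost-home.tar

where -n -1 asks for the most recent backup; it merges the full and any later incrementals into one tar stream, so you never have to restore them in layers by hand.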