Hi,
I'm thinking about buying a NAS server from Sun. To back up this server I want to use our central to-tape backup. For whatever reason, people are asking me to make one compressed copy to disk and back up only that copy.
So to reduce load I'd like to have a script that:
identifies only the changed files (using MD5?), copies them, and compresses them.
storeBackup.pl does something similar, but it keeps versions and creates hard links between them. (Is storeBackup in any CentOS repo? I know it from my SuSE box.) Any ideas? Do I have to adapt storeBackup to my needs? (I would really hate that, because I'm not a Perl man.)
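The steps above could be sketched roughly like this. This is a minimal sketch, not a finished tool: the throwaway directories stand in for the real NAS export and the on-disk backup copy, and it assumes GNU coreutils (md5sum, gzip) are available:

```shell
#!/bin/sh
# Sketch: copy only changed files into a compressed mirror.
# SRC/DST are throwaway demo directories; in practice SRC would be
# the NAS mount and DST the compressed copy that the tape job reads.
SRC=$(mktemp -d)   # stand-in for the NAS export
DST=$(mktemp -d)   # stand-in for the compressed on-disk copy
echo "some data" > "$SRC/report.txt"

backup_changed() {
    (cd "$SRC" && find . -type f) | while read -r f; do
        sum=$(cd "$SRC" && md5sum "$f" | cut -d' ' -f1)
        out="$DST/$f.gz"
        # Unchanged file: the stored checksum still matches, so skip it.
        if [ -f "$out.md5" ] && [ "$(cat "$out.md5")" = "$sum" ]; then
            continue
        fi
        mkdir -p "$(dirname "$out")"
        gzip -c "$SRC/$f" > "$out"          # compressed copy
        printf '%s\n' "$sum" > "$out.md5"   # checksum for next run
    done
}

backup_changed   # first run copies and compresses everything
backup_changed   # second run finds no changes and copies nothing
```

A real version would also have to handle deletions and renames, which is exactly the bookkeeping that tools like storeBackup already do.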
regards, Andreas
----- "Andreas Kuntzagk" andreas.kuntzagk@mdc-berlin.de wrote:
> So to reduce load I'd like to have a script that identifies only the changed files (using MD5?), copies them, and compresses them.
Well, I'm using "rsync --link-dest" to do this. This article, http://www.rootprompt.org/article.php3?article=8976, describes the principle but doesn't use "--link-dest".
Antonio.
> Well, I'm using "rsync --link-dest" to do this. This article, http://www.rootprompt.org/article.php3?article=8976, describes the principle but doesn't use "--link-dest".
Well, but rsync doesn't compress, does it? So I would need to compress first and then rsync, meaning I would need to keep the compressed files around twice.
Andreas
----- "Andreas Kuntzagk" andreas.kuntzagk@mdc-berlin.de wrote:
> Well, but rsync doesn't compress, does it? So I would need to compress first and then rsync, meaning I would need to keep the compressed files around twice.
Well, rsync can "compress on the wire" (the data travels compressed), but rsync will do the first two items: identify the changed files and copy them. The compress-to-tape part will need to be done after that. :) On the other hand, with the prices of tapes and hard disks today, I chose to buy two servers with a lot of disk on each, put each one in a different building, and make backups on them.
Antonio.
Antonio da Silva Martins Junior wrote:
> Well, rsync can "compress on the wire" (the data travels compressed), but rsync will do the first two items: identify the changed files and copy them. The compress-to-tape part will need to be done after that. :)
Oh yeah. Since it is a Sun box, you might as well run Solaris 10 or OpenSolaris and use ZFS, which comes with compression and snapshot capabilities, among others.
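For reference, compression and snapshots on ZFS are each a one-line command. The pool/dataset name "tank/nas" below is purely hypothetical (these are admin commands meant as a sketch, not to be run as-is):

```shell
zfs set compression=on tank/nas      # transparent compression from now on
zfs snapshot tank/nas@2008-01-15     # cheap point-in-time snapshot
zfs list -t snapshot                 # show existing snapshots
```

Compression only applies to data written after the property is set, and snapshots share unchanged blocks, which is the same space-saving effect as the hardlink approaches discussed above.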
Andreas Kuntzagk wrote:
> I'm thinking about buying a NAS server from Sun. To back up this server I want to use our central to-tape backup. For whatever reason, people are asking me to make one compressed copy to disk and back up only that copy.
> So to reduce load I'd like to have a script that identifies only the changed files (using MD5?), copies them, and compresses them.
Backuppc (http://backuppc.sourceforge.net/) will back up a number of hosts, compressing all files and hardlinking all duplicates (whether from different hosts or different backup runs) to reduce the storage needed and permit keeping a longer history online. It also provides a nice web interface for browsing, restoring, and archiving to tape. The tape-archive part is manual and something of an afterthought, but the rest is completely automatic, and some users have devised ways to use external disks for the archive or to do partition-level copies to an external disk for offsite storage.
> Backuppc (http://backuppc.sourceforge.net/) will back up a number of hosts, compressing all files and hardlinking all duplicates to reduce the storage needed and permit keeping a longer history online.
Backuppc looks like too much for my problem. But somebody proposed just using dump for this. To my shame, I must confess I have never used dump, so I'll be doing some man-page reading now.
Andreas
Andreas Kuntzagk wrote:
> Backuppc looks like too much for my problem. But somebody proposed just using dump for this.
I'm not sure I understand what you mean by 'too much'. Backuppc is not difficult to set up, and it is fully automatic except for the tape part once you get it working. And as far as I know, it is the only thing that can do an rsync update of an uncompressed target directly against its own highly compressed storage archive.