[CentOS] Backing up remote system

Thu Feb 14 17:13:51 UTC 2008
Scott Ehrlich <scott at MIT.EDU>

I have an Overland Arcvault tape library, a CentOS 5 box, a Windows XP 
system, and a RAID box that supports NFS and CIFS.

The RAID box is remotely located and acts as central file storage.

I might normally use dump to perform backups, but as I learned here, and 
from dump's man page, dump doesn't support remote file systems such as NFS 
or CIFS.

So, for now, I've connected the library to a Windows XP system running 
Service Pack 2.  I have a shell script that tars and gzips the directories 
I want to preserve and places the archives in /backup; Samba is configured 
to share /backup, so it is mounted as a drive letter on the XP system.  
I'm then using XP's built-in backup/restore program to write the contents 
of /backup to tape, and that is working fine for now.
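
Roughly, the script boils down to something like the following (the 
directory list and archive names here are only examples, not my exact 
setup):

    #!/bin/bash
    # Create dated gzipped tarballs of the chosen directories and
    # drop them in /backup, which Samba then shares to the XP box.
    DATE=$(date +%Y%m%d)
    BACKUPDIR=/backup
    for dir in /etc /home /var/www; do    # example directories only
        name=${dir#/}                     # strip the leading slash
        name=${name//\//_}                # turn other slashes into _
        tar czpf "$BACKUPDIR/${name}-${DATE}.tar.gz" "$dir"
    done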

Is there a reliable Linux/CentOS-based way to do this, too?  I know 
people keep saying to use Amanda and/or Bacula and be done with it, but if 
something goes wrong and the MySQL (or similar) catalog database gets 
corrupted just when people want their data back, it would be much easier 
to use a readily available command like cpio or tar to do the job.
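
That way a restore never needs more than tar itself, e.g. (the archive 
name is just an example):

    # Pull one archive back out, preserving permissions and ownership;
    # run as root so ownership can actually be restored.
    tar xzpf /backup/etc-20080214.tar.gz -C /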

I've also elected to tar and gzip the contents ahead of time, because 
when I've tried to archive some of the files directly over a CIFS mount, 
I've run into permission problems with various files and folders.  
Tarring the files locally gets around that; the Windows box then just 
copies the tar files to tape.
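
A quick way to confirm the modes and ownership really made it into the 
archive, before it ever leaves the Linux box:

    # -t lists the contents, -v shows the stored mode/owner/group
    tar tvzf /backup/etc-20080214.tar.gz | head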

Now I need to see what options tar has for producing incremental backups 
from a cron'ed tar job, unless someone has a better, comparable approach 
using existing OS tools (tar, cpio, dump, etc.).
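
From the docs, GNU tar (the tar that ships with CentOS 5) does 
incrementals via --listed-incremental, which keeps a snapshot file 
between runs; something along these lines is what I have in mind (paths, 
script name, and schedule are only examples):

    # Level 0 (full) backup: starts a fresh snapshot file.
    tar czpf /backup/home-full.tar.gz \
        --listed-incremental=/backup/home.snar /home

    # Subsequent runs against the same snapshot file archive only
    # what changed since the previous run.
    tar czpf /backup/home-inc-$(date +%Y%m%d).tar.gz \
        --listed-incremental=/backup/home.snar /home

    # Example crontab entry: incremental every night at 02:30.
    # 30 2 * * * /usr/local/sbin/backup-incremental.sh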

I want to retain as many permissions as possible.  I've tried rsync 
before, but that also requires enough disk space to hold the copied 
files, and some of the options I've used have produced complaints about 
permissions not being preserved during the process.
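
For completeness, the rsync invocation I'd expect to preserve the most 
metadata is roughly this (the destination path is just an example); -a 
keeps permissions, ownership, and timestamps, and --numeric-ids avoids 
UID/GID remapping, but ownership only survives if the receiving side runs 
as root and the target file system actually supports it (NFS generally 
does, CIFS generally doesn't):

    # Mirror /home to the RAID box over an NFS mount, keeping as much
    # metadata as the destination file system will hold.
    rsync -aH --numeric-ids --delete /home/ /mnt/raid/backup/home/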

Thanks again.

Scott