Matt Hyclak wrote:
On Tue, Oct 24, 2006 at 04:01:45PM +0100, Peter Crighton enlightened us:
We are running backup software for incrementals/differentials and full backups with various tools, currently using dirvish scripts + amanda. What is everyone's view on other open-source backup software? Is there anything better, or other options we have missed? We are looking at Bacula as an option - any thoughts?
I am looking for the answer to the same question. I have got amanda going (but not used in anger - I'm just doing my first CentOS install for my home server). Just today I got amanda to write and restore some data.
So my requirements:
- cheap/free
- multiple backups per tape
- fully automated backup each day
- easy recovery (happy with either a KDat-type GUI or the amanda approach) - it needs to know which tape to use to recover the latest (or chosen) version.
- ideal for a home network (mixed Linux/Windows)
The only thing I don't like about amanda (so far) is that it needs a new tape for each backup - mainly because I'd like the backup to be completely automatic, only requiring the tape to be changed when full (or maybe each month).
I typically don't back up much data each day (because it's a home network), so I'd like to be able to store multiple backups on each tape. I have a 20GB Travan tape drive, so that's enough for a complete full backup and several incrementals.
If you have enough holding disk, just leave the tape out until you hit about 20GB worth of data. I do this here at work on a weekly basis - the holding disk is a pair of RAID 1 disks, and once a week I pop a tape in and it flushes the entire week's worth of data.
Matt
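For anyone wanting to try the holding-disk-then-weekly-flush pattern Matt describes, a minimal sketch might look like the crontab below. The configuration name "daily", the paths and the schedule are assumptions for illustration, not Matt's actual setup, and the holdingdisk stanza shown in the comment has to exist in amanda.conf already.

```bash
# /etc/cron.d/amanda -- sketch only; config name "daily" and paths assumed.
#
# amanda.conf is assumed to declare a holding disk, e.g.:
#   holdingdisk hd1 {
#       directory "/var/amanda/holding"
#       use 40 Gb
#   }
#
# Mon-Sat: dump to the holding disk. With no tape loaded the dumps simply
# accumulate there (exactly what gets dumped in that case depends on the
# "reserve" setting).
0 1 * * 1-6  amanda  /usr/sbin/amdump daily
# Sunday, after loading a fresh tape: flush the week's dumps to tape in
# batch mode (no prompting).
0 1 * * 0    amanda  /usr/sbin/amflush -b daily
```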
If you have 20GB of data, using tapes is OK. In my case, I have about 3TB of data that needs to be backed up and taken offsite, so the only real option is rsync going out to disks. We started out using recipes #38, #41 and #42 from the Linux Server Hacks book to essentially build a poor man's SAN. Using CentOS installed on systems with 3Ware cards, I have two onsite 4TB NASes. The first one is for network use; the second is for hourly, daily and weekly snapshots of the main NAS. There's a third 4TB NAS located offsite in a colo facility that's fed with dual T1s. We can have anywhere from 2-5GB of data change every day. We're a company of about 50 employees, and we do legal work - so nothing can be thrown away.
This system runs 7 days a week, and it's fully automated with email alerts, etc. The big benefit is restores. We've had our graphics dept accidentally delete 250GB of data, and it was trivial to scp the missing data back to the main NAS. It all happened at network speeds, over a gigabit switch. All the NASes have dual NICs, and the second NICs are connected to their own private gigabit switch - hence the poor man's SAN. When hourly snapshots run, all the data that changes moves over a separate gigabit network, leaving the office network alone. No user can tell that backups are happening throughout the day.
Maybe this is something I should write up in more detail. The entire system runs on just a couple of shell scripts, rsync, and a Perl program to mail out logs....
HTH Mark
Mark
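For readers who want to experiment before any write-up appears, here is a rough sketch of the kind of hard-linked rsync snapshot the Linux Server Hacks recipes cover. It uses rsync's --link-dest option; the hostname, paths and retention count are invented for illustration and are not taken from Mark's actual scripts.

```bash
#!/bin/bash
# Hourly snapshot of the main NAS onto the backup NAS via hard-linked
# rsync copies. Hostname, paths and retention below are illustrative
# guesses only.

SRC="mainnas-priv:/export/data/"   # pull over the private gigabit link
DEST="/snapshots/hourly"           # snapshot tree on the backup NAS
KEEP=24                            # hourly snapshots to retain

STAMP=$(date +%Y%m%d-%H%M)
LATEST="$DEST/latest"

mkdir -p "$DEST"

# Files unchanged since the last snapshot are hard-linked from it, so every
# snapshot looks like a full copy but only changed data costs disk space.
# (On the very first run --link-dest points at nothing; rsync just warns
# and does a full copy.)
rsync -a --delete --link-dest="$LATEST" "$SRC" "$DEST/$STAMP" \
  && ln -snf "$DEST/$STAMP" "$LATEST"

# Drop snapshots beyond the retention count (names sort chronologically).
ls -1d "$DEST"/20* 2>/dev/null | head -n -"$KEEP" | xargs -r rm -rf
```

Because each snapshot directory then looks like a complete copy of the data, restores can be as simple as the scp Mark describes above.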
Mark Schoonover wrote:
Maybe this is something I should write up in more detail. The entire system runs on just a couple of shell scripts, rsync, and a Perl program to mail out logs....
HTH Mark
If you do by chance write that up, I would be interested in seeing it.
Dustin
So would I! Sounds like exactly what we're looking for.
Dan Bulmer Owner / Operator / Sr. Technician Fibre Fast / Reliable Hosting Services
CRUSE Technologies Ltd.
Maybe this is something I should write up in more detail. The entire system runs on just a couple of shell scripts, rsync, and a Perl program to mail out logs....
HTH Mark
If you do by chance write that up, I would be interested in seeing it.
Dustin
Agreed! Hints about where you put the snapshot files, how you are mounting the NAS, the commands you use to generate the snapshots, etc. would be most helpful. I know that each of these pieces is documented separately, but gathering what you do into a nice little recipe would be very helpful to other folks.
Dan Stoner Network Administrator Florida Museum of Natural History University of Florida (352)392-1721 ext. 233 http://www.flmnh.ufl.edu
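Until Mark posts the details, here is one plausible way the remaining pieces Dan asks about could fit together. Every hostname, path, schedule and the snapshot.sh helper below are hypothetical; it simply layers a cron schedule and an offsite copy on top of the snapshot sketch shown earlier in the thread.

```bash
# All names, paths and times below are guesses for illustration only.

# /etc/fstab on the backup NAS -- a read-only NFS mount of the main NAS
# is handy for browsing and copying restores back:
#   mainnas-priv:/export/data   /mnt/mainnas   nfs   ro,soft,intr   0 0

# root crontab on the backup NAS -- snapshot.sh is a hypothetical,
# parameterised version of the rsync --link-dest sketch earlier in
# the thread (tier name and retention count as arguments):
#   0 * * * *     /usr/local/sbin/snapshot.sh hourly 24
#   30 23 * * *   /usr/local/sbin/snapshot.sh daily   7
#   45 23 * * 0   /usr/local/sbin/snapshot.sh weekly  8

# Overnight offsite copy across the dual T1s, rate-limited (KB/s) so the
# lines stay usable during the transfer:
rsync -a --delete --bwlimit=150 \
    /snapshots/daily/latest/ offsite-nas:/snapshots/daily/latest/
```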
I think Bacula (http://www.bacula.org/) is a very good option.
It has a lot of the features of commercial solutions.
On Tue, Oct 24, 2006 at 09:25:48AM -0700, Mark Schoonover wrote:
Maybe this is something I should write up in more detail. The entire system runs on just a couple of shell scripts, rsync, and a Perl program to mail out logs....
Looks like the makings of an article for Linux Journal!
Mark Schoonover wrote:
Maybe this is something I should write up in more detail. The entire system runs on just a couple of shell scripts, rsync, and a Perl program to mail out logs....
HTH Mark
I would be interested in seeing your writeup also if you decide to do it.
Ed
Mark Schoonover wrote:
Maybe this is something I should write up in more detail. The entire system runs on just a couple of shell scripts, rsync, and a Perl program to mail out logs....
I think a lot of folks might stand to benefit from such a write-up. If time permits, go for it!
Cheers,