I want to create a bash script to make a zipped copy of a website running on Linux (/var/www/htdocs/*) locally, on the same box, in a different directory. I am thinking of doing a local backup using crontab (a snapshot of my web content):

tar -cvzf /tmp/website-$(date +%Y%m%d-%H%M).tgz /var/www/htdocs/*

This command will create a file like /tmp/website-20110101-1459.tgz. I want it to run on a daily basis, keep the last 5 days of backups on the box, and remove versions older than 5 days. Can you point me in the right direction?
Thanks madunix
Hi,
Try the following:
On 1/25/11 4:31 PM, madunix@gmail.com wrote:
tar -cvzf /tmp/website-$(date +%Y%m%d-%H%M).tgz /var/www/htdocs/* ... I want it to run on a daily basis
Yes, just use crontab for that. Something like,
30 6 * * * command-or-path-to-script-here
to run the command or script every day at 6:30 a.m.
and to keep the last 5 days of backups on the box and remove versions older than 5 days.
I think the easiest way is to use logrotate. man logrotate for details.
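Roughly, a logrotate drop-in could look like this (just an untested sketch; it assumes the script writes to a fixed name such as /tmp/website.tgz and lets logrotate keep the last 5 copies):

/tmp/website.tgz {
    daily
    rotate 5
    nocompress
    missingok
    nocreate
}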
HTH,
You could create a script that sets a variable from date --date="5 days ago", appends it to your tar file name, and then combines it with an if test: if a file matches, rm it.
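For example, something along these lines (an untested sketch; it assumes the archives land in /tmp as above and that the job runs once a day):

#!/bin/bash
# Build the date stamp from 5 days ago and remove any archive carrying it.
old=$(date --date="5 days ago" +%Y%m%d)
rm -f /tmp/website-"${old}"-*.tgz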
HTH
I am thinking of having this in my script:
#!/bin/bash
tar -cvzf /tmp/website-$(date +%Y%m%d-%H%M).tgz /var/www/htdocs/*
find /tmp/website/website*.tgz -ctime +5 -exec rm {} \;   # removes archives older than 5 days
and crontab it:

30 6 * * * /mypath/myscript
On 25/01/11 21:56, madunix@gmail.com wrote:
find /tmp/website/website*.tgz -ctime +5 -exec rm {} \;   # removes archives older than 5 days
That should do in your case. In general, though, you would prefer the following, because that glob could match a _lot_ of things (in _your_ case it should be fine):
find /tmp/website/ -name 'website*.tgz' -ctime +5 -exec rm {} \;
Also, from a security standpoint (especially if your website contains private materials the webserver would not serve), you should use umask to change the default permissions the archive is created with. You can set this temporarily as follows:
(umask 077; tar ....)
The (...) construct defines a _subshell_. A umask specifies the mode bits to clear on a new file, so 077 causes new files to be created as rw------- (owner only). The umask is a property inherited from parent process to child processes, and is in effect until either changed or the parent process (the shell, typically) ends.
The umask _command_ (actually a _shell-internal_ command) affects the umask of the shell process, which causes the tar child process to see the change. To prevent subsequent commands from also getting that same restrictive umask, I've used a sub-shell (the round brackets) to limit the scope of the umask change to just the tar command.
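Concretely, the tar line from above wrapped that way would be (a sketch using the same paths as before):

# archive is created mode 600; the umask change dies with the subshell
(umask 077; tar -czf "/tmp/website-$(date +%Y%m%d-%H%M).tgz" /var/www/htdocs/*)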
PS. You're not really keeping your website backups in /tmp, are you?
From: "madunix@gmail.com" madunix@gmail.com
I want to create bash script to have a zip copy from a website running on linux /var/www/htdocs/* local on the same box on different directory I am thinking to do a local backup using crontab (snapshot my web) tar -cvzf /tmp/website-$(date +%Y%m%d-%H%M).tgz /var/www/htdocs/* This command will create a file /tmp/website-20110101-1459.tgz I want it run on daily basis and to keep the last 5days backup on the box and remove older version than 5days.
A quick way to do it is to use the day of the week: website-$(date +%u).tgz. It will automatically keep the last 7 days... Otherwise, you will have to do date calculations...
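A rough sketch of that (untested; still writing to /tmp as in the original command):

# %u is 1..7 (Mon..Sun), so each weekday simply overwrites last week's file
tar -czf "/tmp/website-$(date +%u).tgz" /var/www/htdocs/*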
JD
Should I add the following option to my tar: -p, --preserve-permissions (extract all protection information)? tar -cvzpf ......
Thanks
I hope I'm not duplicating something someone has already said -- /tmp may not be the best possible choice for backups. A reboot could potentially "help" by cleansing that directory. Off-host copies (e.g., scp website-20110101-1459.tgz fred@otherhost:/home/fred/backups/) would address a number of risks. -- Charles Polisher
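In the nightly script that could be as simple as the following (a sketch; it assumes passwordless SSH keys are set up, and otherhost and the target directory are just the placeholders from the example above):

archive="/tmp/website-$(date +%Y%m%d-%H%M).tgz"
tar -czf "$archive" /var/www/htdocs/*
# copy the fresh archive off-host; no password prompt thanks to SSH keys
scp "$archive" fred@otherhost:/home/fred/backups/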
I have relocated it to /home, thanks.
Hi Charles.
You might find this php script I wrote handy:
http://forums.fedoraforum.org/showthread.php?t=248436
I use a separate 500GB drive just for storing backups of various things I don't want to lose. Then at certain intervals (i.e. when I think it's needed), I burn the backups to CD or DVD - just to be extra safe!
Most of my backup scripts are run by cron jobs overnight.
Kind Regards,
Keith Roberts
home folder for backup /backup
On Fri, Jan 28, 2011 at 7:49 PM, m.roth@5-cent.us wrote:
Please stop top posting.
Relocated it to /home, as in /home/backup? Don't clutter your base directories, that's very bad practice.
mark
Do you actually understand what we're talking about when *many* of us here ask people to STOP TOP POSTING?
mark
Furthermore, do you understand the need to make clear, fact-rich, helpful posts? I have no personal gripe against top-posting that I don't also have against people who quote an entire message running 2+ pages just to add "that works for me" or "Package yada does it better" at the bottom.
You, madunix, are both top-posting and making uselessly short, ambiguous posts. Please stop the one practice, or the other.
On Fri, Jan 28, 2011 at 1:03 PM, madunix@gmail.com wrote:
home folder for backup /backup
This is a tactical problem. If you actually read the "Filesystem Hierarchy Standard" guidelines, you'll see that it should be in "/var" as dynamic, volatile content, probably under "/var/backup".
If that backup repository is network mounted for whatever reason, it also keeps mounting problems off of the "/" directory, which is very desirable.
Recall: I now run the following task every day:

tar -cvzf /rescue/website-$(date +%u).tgz /var/www/htdocs/*

I now want to move these files from the local server to a remote server via FTP. Any help?
Thanks
Pinter Tibor wrote:
man lftp
Or replace with rsync.
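For example, untested sketches of both (user, password, host, and remote directory are placeholders; with lftp, a ~/.netrc entry avoids putting the password on the command line):

# push today's archive over FTP with lftp
lftp -u backupuser,PASSWORD -e "cd /backups; put /rescue/website-$(date +%u).tgz; bye" ftp.example.com

# or, if SSH access is available, use rsync instead
rsync -avz "/rescue/website-$(date +%u).tgz" backupuser@remote.example.com:/backups/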
mark