On Tue, November 11, 2014 9:10 am, Fran Garcia wrote:
On Mon, Nov 10, 2014 at 8:25 PM, Rodrigo Pichiñual Norin rodrigo.pichinual@gmail.com wrote:
Hi all.
I usually make backups of MySQL databases.
I make backups of the whole database, for example:
mysqldump -u user -ppassword name_db > backups.sql
I also make backups of just the schema, for example:
mysqldump -u user -ppassword --no-data name_db > schema.sql
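To restore such a dump, feed it back through the mysql client (a minimal sketch, reusing the same placeholder user and database names):

mysql -u user -ppassword name_db < backups.sql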
Hello,
If size is a concern, just export your DBs gzipped:
mysqldump -u user database | gzip > backup_database.sql.gz
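To restore, decompress on the fly (same placeholder names; gunzip -c writes the decompressed stream to stdout):

gunzip -c backup_database.sql.gz | mysql -u user database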
I would stay away from compression. Compression produces a binary file, and binary files neither diff nor deduplicate well: even though each compressed dump is smaller, when you keep several versions back, every compressed version hits your backup storage in full. The original poster's intent is better: keep plain ASCII dump files that can be diffed. If one commits the dumps into a version control system, then even though the repository's internal files may be treated as binary, you only need the latest state of the repository on backup, since it already contains every version of the database.
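For instance, a minimal sketch of that workflow, assuming a git repository has already been initialized in /var/backups/db (a hypothetical path), and using mysqldump's --skip-dump-date option so an unchanged database produces a byte-identical file:

cd /var/backups/db
# dump without the timestamp comment, so diffs show only real schema/data changes
mysqldump -u user --skip-dump-date name_db > name_db.sql
git add name_db.sql
git commit -m "nightly dump $(date +%F)"

Backing up the repository then preserves every previous dump.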
Just my $0.02.
Valeri
Also, if you are concerned about compression time, you can enable multithreaded parallel gzip compression with pigz (available in EPEL):
mysqldump -u user database | pigz > backup_database.sql.gz
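pigz writes standard gzip streams, so gunzip (or pigz -d) restores them unchanged. If you don't want the dump to monopolize every core, the thread count can be capped with pigz's -p flag (a sketch):

mysqldump -u user database | pigz -p 4 > backup_database.sql.gz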
TBH, 500 MB databases aren't big enough to warrant a more complex approach (unless you have more ambitious RTO/RPO requirements).
++++++++++++++++++++++++++++++++++++++++
Valeri Galtsev
Sr System Administrator
Department of Astronomy and Astrophysics
Kavli Institute for Cosmological Physics
University of Chicago
Phone: 773-702-4247
++++++++++++++++++++++++++++++++++++++++