Backing Up a Full Database

mysqldump doesn’t cut it. If you have huge tables with billions of rows, the backup process can become a nightmare, especially when you need to restore.

This method performs a full backup of a large MySQL database, such as Zabbix’s (any version), with a focus on fast disaster recovery. I chose XtraBackup for the task, a backup tool from Percona that takes hot (online) copies, so MySQL keeps running during the backup.

First, you need to download and install XtraBackup:
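The original install commands are not shown, so here is a minimal sketch assuming a yum-based system with the Percona repository; on Debian/Ubuntu you would use the `percona-release` .deb and apt instead. The package name depends on your MySQL version (e.g. `percona-xtrabackup-24` for MySQL 5.6/5.7, `percona-xtrabackup-80` for 8.0):

```shell
# Install the Percona repository, then XtraBackup itself.
# Pick the xtrabackup package that matches your MySQL server version.
yum install -y https://repo.percona.com/yum/percona-release-latest.noarch.rpm
percona-release enable tools release
yum install -y percona-xtrabackup-24
```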

XtraBackup offers a lot of parameters, so this script is intended to be as simple as possible.

Make sure that you’re using InnoDB, at least for the history tables. In my case, I have a 300 GB Zabbix database, and a full backup takes about 3 hours.
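You can verify the storage engine per table with a query against `information_schema`; this sketch assumes the schema is named `zabbix` and that you have a client login available:

```shell
# List the engine of every history* table; all of them should say InnoDB.
mysql -u zabbix -p -e "
  SELECT TABLE_NAME, ENGINE
  FROM information_schema.TABLES
  WHERE TABLE_SCHEMA = 'zabbix'
    AND TABLE_NAME LIKE 'history%';"
```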

Create the script in /var/lib/xtrabackup/:
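The original script is not shown, so this is a minimal sketch. The filename `backup.sh`, the target directory `/backup/mysql`, and the 7-day retention are assumptions; credentials are expected to come from `/root/.my.cnf`. On XtraBackup 2.x the same steps can also be run through the `innobackupex` wrapper:

```shell
#!/bin/bash
# Hypothetical /var/lib/xtrabackup/backup.sh -- full hot backup with XtraBackup.
set -euo pipefail

BACKUP_ROOT=/backup/mysql               # assumed backup destination
TARGET_DIR="$BACKUP_ROOT/$(date +%F)"   # one directory per day

# Take a hot (online) copy of the datadir; InnoDB tables stay writable.
xtrabackup --backup --target-dir="$TARGET_DIR"

# Apply the redo log so the copy is consistent and ready for a fast restore.
xtrabackup --prepare --target-dir="$TARGET_DIR"

# Drop daily backups older than 7 days (retention is an assumption).
find "$BACKUP_ROOT" -maxdepth 1 -mindepth 1 -type d -mtime +7 -exec rm -rf {} \;
```

Running `--prepare` right after the backup is what makes the restore fast: at disaster time you only copy files back, with no crash recovery to wait for.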

Adjust the permissions:
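The original commands are not shown; assuming the hypothetical `backup.sh` filename, something like this keeps the script executable by root only (it may reference credentials):

```shell
# Root-only ownership and execute permission for the backup script.
chown root:root /var/lib/xtrabackup/backup.sh
chmod 700 /var/lib/xtrabackup/backup.sh
```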

Configure your crontab to run the backup every day at 04:15 am:
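A sketch of the crontab entry, in `/etc/crontab` format (which includes the user field; drop `root` if you use `crontab -e` instead). The script path and log file are assumptions:

```shell
# m  h  dom mon dow user  command -- every day at 04:15
15 4 * * * root /var/lib/xtrabackup/backup.sh >> /var/log/xtrabackup.log 2>&1
```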

And if you ever need a restore, it’s very simple:
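The restore steps are not shown in the original, so this is a sketch using XtraBackup’s standard `--copy-back` flow; the datadir path, backup path, and service name are assumptions for a typical Linux install:

```shell
# Restore a prepared backup. MySQL must be stopped and the datadir empty.
TARGET_DIR=/backup/mysql/2024-01-01     # the prepared backup to restore (example date)

systemctl stop mysqld
mv /var/lib/mysql /var/lib/mysql.old    # keep the broken datadir aside, just in case
mkdir /var/lib/mysql

xtrabackup --copy-back --target-dir="$TARGET_DIR"

chown -R mysql:mysql /var/lib/mysql
systemctl start mysqld
```

Because the backup was already prepared, MySQL starts immediately on the restored files, which is what makes recovery fast even for a 300 GB database.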

3 Responses to “Backing Up a Full Database”

  1. Nicolassdiaz says:

    Thank you Ricardo.

  2. Sergeylo says:

    Actually, mysqldump works perfectly.
    I worked on a Zabbix DB backup task a year ago and tried to solve its low performance. In the end, I dropped all these dirty methods, and hotcopy too.
    On the test server I had a single SSD as storage (later that company bought a couple more for RAID10, but I left them), a SATA3 HDD as backup storage, and *default* mysqldump with a pair of options (but they didn’t matter much).
    I didn’t stop the mysql process – it caused no problems – all the backups worked, and monitoring didn’t stop during the backup. And it ran on an AMD Phenom CPU with no problems.
    What a pity that I don’t remember the size of the DB.
    (Sorry for my bad English)

  3. João Lopes says:

    I’m trying to implement a full backup, one table at a time, based on mysqldump (`mysqldump -R --opt --extended-insert=FALSE`), and during the backups the Zabbix server stops collecting data and I get alarms. How can I work around this?
    Thanks in advance
