
A common issue for administrators: I have a database that I want to back up daily, and I want to keep only the last n copies (roughly a week's worth).

Do you have any recommendations based on essential Linux commands?

2 Answers


Use this command combined with a cron job. Let's do it step by step:

mysqldump -u root db_name | gzip > /srv/backup/db_name-$(date "+%Y.%m.%d-%H.%M.%S").sql.gz && ls -t /srv/backup/* | sed -e "1,7d" | xargs rm
  1. mysqldump creates the backup.
  2. gzip compresses the output.
  3. ls, sed and xargs remove everything except the 7 newest files.

(I prefer this solution to ones based on "older than x days" because, if the dumps ever stop being created, you'll still keep the last 7 copies instead of eventually deleting them all.)

Run the command once to verify that it works. If so, put it in a bash script such as /root/bin/backup_db.sh.
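A minimal sketch of what /root/bin/backup_db.sh could contain; it is just the one-liner from above wrapped in a script (db_name and /srv/backup are the same example values used earlier):

#!/bin/bash
# Dump the database, compress it, and keep only the 7 newest backups.
# The -r flag (GNU xargs) is a small addition: it stops xargs from running rm
# when there is nothing to delete, e.g. while you still have 7 or fewer backups.
mysqldump -u root db_name | gzip > /srv/backup/db_name-$(date "+%Y.%m.%d-%H.%M.%S").sql.gz \
  && ls -t /srv/backup/* | sed -e "1,7d" | xargs -r rm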

Now, edit your crontab (crontab -e) and add this new line. At first, I would run it every 2 minutes just to make sure the command works properly from cron:

*/2 * * * * bash /root/bin/backup_db.sh

Wait a few minutes and check that the files are being created as expected:

watch ls /srv/backup

It works! Now change your cron job to run on the desired schedule (in my case, every day at 01:10 a.m.):

10 1 * * * bash /root/bin/backup_db.sh

Hope this helps!

  • That doesn't seem to take care of live reads/writes and might leave your database backup in a corrupt state (this might not be a problem for you). Here is a slightly more sophisticated answer over at dba.SE. Commented Oct 19, 2022 at 17:09
  • You're right. The mysqldump command could be extended (--single-transaction --routines --triggers --all-databases...) or even preceded by a FLUSH TABLES command, to get a more consistent state of the database, but in this example I wanted to keep it simple. The databases I currently use this for belong to simple web applications without heavy read/write traffic. Anyway, thanks for your comment; it will be helpful for other administrators. Commented Oct 21, 2022 at 10:49
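For reference, a sketch of the extended dump described in the reply above; the flags come from that comment, while the file name and the cleanup pipeline are carried over from the answer (adjust both to your setup):

# Consistent InnoDB snapshot of all databases, including stored routines and triggers
mysqldump -u root --single-transaction --routines --triggers --all-databases \
  | gzip > /srv/backup/all-databases-$(date "+%Y.%m.%d-%H.%M.%S").sql.gz \
  && ls -t /srv/backup/* | sed -e "1,7d" | xargs rm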

Here is a simple backup script with age-based retention: it removes backups older than a given number of days. Make sure to replace the placeholders (your_database_user, your_database_password, your_database_name, and /path/to/backup/directory) with your actual database credentials and the desired backup directory.

#!/bin/bash

# Set your database credentials and other parameters
DB_USER="your_database_user"
DB_PASSWORD="your_database_password"
DB_NAME="your_database_name"
BACKUP_DIR="/path/to/backup/directory"
DAYS_TO_KEEP=7

# Create a timestamp for the backup file
TIMESTAMP=$(date +"%Y%m%d_%H%M%S")

# Dump the database to a SQL file
mysqldump -u $DB_USER -p$DB_PASSWORD $DB_NAME > $BACKUP_DIR/backup_$TIMESTAMP.sql

# Compress the backup file
gzip $BACKUP_DIR/backup_$TIMESTAMP.sql

# Remove backups older than the specified number of days
find $BACKUP_DIR -name "backup_*" -type f -mtime +$DAYS_TO_KEEP -exec rm {} \;
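The script above doesn't cover scheduling. Assuming it is saved somewhere like /root/bin/backup_db.sh (a hypothetical path, mirroring the other answer), it could be made executable and run daily via cron:

chmod +x /root/bin/backup_db.sh
# crontab -e, then add a daily entry (here 01:10, as in the other answer)
10 1 * * * bash /root/bin/backup_db.sh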
  • You'd have safer code if you double-quoted your variables each time you used them, so that the shell doesn't word-split them. For example, consider a password containing a space, such as one two. You'd get mysqldump -u myusername -pone two databasename…, and MySQL would see one as the password and (at best) two as a database name, with the consequent failure to log in due to an invalid password. A similar issue arises for directory names referenced through $BACKUP_DIR. Commented Nov 20, 2023 at 14:30
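Following that advice, a quoted variant of the relevant lines from the script above might look like this (same variables, only the quoting changes):

mysqldump -u "$DB_USER" -p"$DB_PASSWORD" "$DB_NAME" > "$BACKUP_DIR/backup_$TIMESTAMP.sql"
gzip "$BACKUP_DIR/backup_$TIMESTAMP.sql"
find "$BACKUP_DIR" -name "backup_*" -type f -mtime +"$DAYS_TO_KEEP" -exec rm {} \;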
