Automating backups on a dedicated server that hosts multiple websites through virtual hosts (vhosts) is essential for data security and availability. In this article, you’ll learn how to write a Bash script that automates the backup process for each hosted site, with code examples you can adapt to your own setup.

Step 1: Setting Up the Environment

Before you start writing the script, make sure your server has SSH access configured and that you have root or sudo privileges. Additionally, you’ll need a directory where the backups will be stored.
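
For example, assuming you want to keep the backups under /var/backups (the path used throughout this article), you could create the directory with restrictive permissions:

$ sudo mkdir -p /var/backups   # create the backup directory
$ sudo chmod 700 /var/backups  # only root can read the archives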

Step 2: Creating the Backup Script

Below is an example of a Bash script that automates backups of the vhosts. This script will compress the files of each website and store them in a backup directory.

#!/bin/bash
# Directory where backups will be stored
backup_dir="/var/backups"
# Current date
date=$(date +'%Y-%m-%d')
# Vhosts directory
vhosts_dir="/etc/apache2/sites-available"
# Create backup directory if it doesn’t exist
mkdir -p "$backup_dir/$date"

# Loop through each vhost config and back up its document root
for vhost in "$vhosts_dir"/*.conf; do
    site_name=$(basename "$vhost" .conf)
    # Skip configs (e.g. 000-default.conf) with no matching docroot
    if [ -d "/var/www/$site_name" ]; then
        tar -czf "$backup_dir/$date/$site_name.tar.gz" "/var/www/$site_name"
    fi
done
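
Once this is working, you may also want to prune old archives so the backup directory doesn’t grow without bound. Here’s a minimal sketch, assuming a 14-day retention period and the dated /var/backups layout created above:

# Remove dated backup directories older than 14 days
find /var/backups -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec rm -rf {} +

You could append this line to the script itself so pruning happens on every run.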

Step 3: Scheduling the Script

To have the script run automatically, you can schedule it with cron. To edit your crontab, run the following command:

$ crontab -e

Make the script executable (chmod +x /path/to/script/backup_script.sh), then add the following line to have it run daily at 2 AM:

0 2 * * * /path/to/script/backup_script.sh
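
Cron jobs run silently, so it’s worth capturing the script’s output for later troubleshooting. As a variation, assuming /var/log/backup_script.log as the (hypothetical) log file, you could redirect stdout and stderr in the crontab entry:

0 2 * * * /path/to/script/backup_script.sh >> /var/log/backup_script.log 2>&1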

Conclusion

Automating backups of your vhosts is an excellent way to ensure your data is protected without the need for manual intervention. This script provides a simple and effective solution for maintaining regular backups of your websites.