Backing up the giants: DirectAdmin Edition

I recently decided it was time to upgrade my server, and I wanted to give my workflow a face-lift along the way. I’ve been using DirectAdmin a lot recently, and it’s shaped up to be quite a resourceful panel that lets me get stuff done. But coming from systems like Plesk and cPanel, with their decent built-in backup systems and plugins like Acronis, it’s a big shift. Since Acronis isn’t available for DirectAdmin yet, I figured it was time to hack up my own solution for fully-encrypted offsite backups!

For those who are new to this blog: I use Restic heavily in my workflow for encrypting and backing up files. I panic at the thought of losing my files, so I get a little over-zealous about backing up from time to time. Recently, I’ve begun backing up my files from servers to a local copy at home, then using BackBlaze B2 and Wasabi for two more off-site copies. I’m not a fan of transferring my data across borders, or even transit lines, without some form of encryption, so I use Restic to encrypt the files locally and ship them as encrypted blobs. My encrypted backups include my emails (self-hosting emails, woo!) and website and database backups, along with family websites/databases/emails. All in all, I designed this system to let me back up everything multiple times and keep each tree in sync. Yes, it’s excessive!

I prefer to be excessive in backing up rather than relying on just one copy. I’ve had situations where I lost both copies through a simple mistake or corruption. Having numerous copies lets me try to avoid that!

Let’s Back Up!

In order to set up our backups, we’ll need to initialize a repository for Restic. The repository is the encrypted store that holds our snapshots at the backup provider of choice (in this case, BackBlaze B2). I like to keep all the backup tooling in the /home/ directory under a “resticbackups” user. The account has no sign-in permissions and can only execute a limited set of commands, and I dump all the backup logic into scripts in its home directory and crontab them for consistent backups.
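
If you want to replicate that locked-down user, creating it is a one-liner. This is just a sketch: the nologin path varies by distro (/usr/sbin/nologin on Debian-likes, /sbin/nologin on RHEL-likes), and any further command restrictions are up to you:

# create a home directory for scripts and staging, but no usable login shell
useradd --create-home --shell /usr/sbin/nologin resticbackups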

I’ve created a script that backs up all the important content, placed at /home/resticbackups/backup.sh:

#!/bin/bash
set -e
cd /home/resticbackups/

##############################
# BackBlaze B2 Configuration #
##############################
export B2_ACCOUNT_ID=""                   # B2 ACCOUNT ID
export B2_ACCOUNT_KEY=""                  # B2 ACCOUNT KEY
export RESTIC_REPOSITORY=""               # B2 REPO NAME
export RESTIC_PASSWORD_FILE="b2pw.txt"    # PASSWORD FILE FOR RESTIC

if [ "$1" == "init" ]; then
	restic -r b2:$RESTIC_REPOSITORY init
else
	# backup mysql data
	mysqldump --databases wp_tgb_blog | restic -r b2:$RESTIC_REPOSITORY backup --stdin --stdin-filename my_blog.sql
	mysqldump --databases wp_wb_blog | restic -r b2:$RESTIC_REPOSITORY backup --stdin --stdin-filename wife_blog.sql
	mysqldump --databases wd_wp_business | restic -r b2:$RESTIC_REPOSITORY backup --stdin --stdin-filename business.sql

	# backup web data
	cp -r /home/personaladm/domains/* ./domains/

	restic -r b2:$RESTIC_REPOSITORY backup ./domains/
fi

In this script, I copy my entire domains tree (mirroring how DirectAdmin structures user accounts) from my main production account (personaladm in this case) into a local staging folder and back it up. I also run mysqldump on each database, piping the output straight through restic so the dumps are encrypted on the fly.
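
Before the first scheduled run, the repository needs to be initialized once via ./backup.sh init. After a backup it’s worth sanity-checking that the snapshots actually landed; restic’s snapshots and restore commands cover that, with the same environment variables exported as in the script (the ./restore-test/ target is just an example path):

./backup.sh init                                         # one-time repository setup
restic -r "b2:$RESTIC_REPOSITORY" snapshots              # list stored snapshots
restic -r "b2:$RESTIC_REPOSITORY" restore latest --target ./restore-test/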

Now, a simple crontab entry runs the script every night at midnight to push the files up:

0 0 * * * /home/resticbackups/backup.sh

Now I have a perfectly simple backup system for B2. The next step is extending it to use any storage backend I want, leading to my master backup CLI program. Here’s what I’ve got so far:

#!/bin/bash
set -e
cd /home/resticbackups/

if [ "$EXP_REPO" == "" ]; then
    read -p "What storage mechanism would you like to use? (b2/wasabi/s3) " EXP_REPO
fi
if [ "$EXP_REPO" == "" ]; then
    echo "Error: Storage mechanism was not set, please select a valid storage backend (b2/wasabi/s3)."
    exit 1
fi

# load the individual configuration file for the chosen backend; each file exports
# the object-storage API credentials plus the full RESTIC_REPOSITORY string,
# including its backend prefix (Wasabi, for example, goes through restic's s3 backend)
if [ "$EXP_REPO" == "wasabi" ]; then
    . repo_wasabi
fi
if [ "$EXP_REPO" == "s3" ]; then
    . repo_s3
fi
if [ "$EXP_REPO" == "b2" ]; then
    . repo_b2
fi

# one-time repository initialization; bail out before anything tries to write to it
if [ "$1" == "init" ]; then
    restic -r "$RESTIC_REPOSITORY" init
    exit 0
fi

# backup mysql data (kept outside the file lock so every backend gets its own dumps)
mysqldump --databases wp_tgb_blog | restic -r "$RESTIC_REPOSITORY" backup --stdin --stdin-filename my_blog.sql
mysqldump --databases wp_wb_blog | restic -r "$RESTIC_REPOSITORY" backup --stdin --stdin-filename wife_blog.sql
mysqldump --databases wd_wp_business | restic -r "$RESTIC_REPOSITORY" backup --stdin --stdin-filename business.sql

# copy web data only once per night
if [ ! -f ".filelock" ]; then # this will get rm'd by crontab just before executing first time
    touch .filelock
    mkdir -p ./domains/
    cp -r /home/personaladm/domains/* ./domains/
fi

restic -r "$RESTIC_REPOSITORY" backup ./domains/
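
The repo_* files themselves aren’t shown here, but each one just exports the credentials and the full repository string for its backend. As an illustration, a hypothetical repo_wasabi might look like this (the bucket name and endpoint are placeholders; Wasabi speaks S3, so it uses restic’s s3 backend and the standard AWS credential variables):

#!/bin/bash
# hypothetical repo_wasabi (bucket and endpoint are placeholders)
export AWS_ACCESS_KEY_ID=""                                      # WASABI ACCESS KEY
export AWS_SECRET_ACCESS_KEY=""                                  # WASABI SECRET KEY
export RESTIC_REPOSITORY="s3:https://s3.wasabisys.com/my-bucket" # FULL REPO STRING
export RESTIC_PASSWORD_FILE="wasabipw.txt"                       # PASSWORD FILE FOR RESTIC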

And that’s my current script for backing up data to multiple storage backends. I tried to keep it as simple, clean, and generic as possible so I can add or remove storage engines when necessary, and it turned out really clean this time around!
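
The crontab side referenced in that .filelock comment could look something like the sketch below; the exact times are my own assumption, with the lock cleared shortly before the first run and each backend kicked off in sequence by setting EXP_REPO:

# clear the once-per-night lock, then run each backend in turn
55 23 * * * rm -f /home/resticbackups/.filelock
0  0  * * * EXP_REPO=b2 /home/resticbackups/backup.sh
30 0  * * * EXP_REPO=wasabi /home/resticbackups/backup.sh
0  1  * * * EXP_REPO=s3 /home/resticbackups/backup.sh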

That’s all I’ve got for right now on backing up my DirectAdmin servers with Restic. What do you think?