Need to set up a quick file backup solution for your website and not sure where to start? You have come to the right place; this guide walks you through getting something quick and simple running.
Step 1: Generate your SSH key pair.
The first thing you will want to do is set up SSH keys so that your offsite backups can run without prompting for a password.
In a terminal, type ssh-keygen -t rsa, then follow the prompts and do not enter a passphrase (an empty passphrase is what allows the backup to run unattended).
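If both your local machine and the server run a reasonably recent version of OpenSSH, an Ed25519 key is a more modern alternative:
ssh-keygen -t ed25519
This produces id_ed25519 and id_ed25519.pub instead of the id_rsa pair; the rest of this guide works the same either way.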
Once you are done creating your SSH key pair, you will find it in the .ssh directory inside your home directory. The files should be called id_rsa and id_rsa.pub. Ensure that id_rsa is kept secure and never sent to anyone for any reason, since this is your private key that only you should have access to. The id_rsa.pub file is your public key, which you will copy to the server(s) you want to pull backups down from.
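As a quick sanity check, confirm both files exist and make sure the private key is only readable by you (SSH will refuse to use a private key with overly permissive permissions):
ls -l ~/.ssh/id_rsa ~/.ssh/id_rsa.pub
chmod 600 ~/.ssh/id_rsa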
Step 2: Copy your public key to the server you will be backing up files from.
Now that you have your key pair, you will need to copy the public key to the server you want to pull backups from. You can do this by typing in cat ~/.ssh/id_rsa.pub | ssh username@xxx.xxx.xxx.xxx "mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys"
and pressing enter. If you have never connected to this server before, you will be asked to verify the host you are connecting to. After you have confirmed that the server is the one you want to add your public key to, type yes; you may see a notice that the host has been permanently added to the list of known hosts. Now you should be able to log in to your server with ssh username@xxx.xxx.xxx.xxx without having to enter a password. Give it a try to ensure that it works.
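On most Linux systems (and many other platforms), the ssh-copy-id utility does the same thing in a single step, and you can then confirm that the passwordless login works:
ssh-copy-id username@xxx.xxx.xxx.xxx
ssh username@xxx.xxx.xxx.xxx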
Step 3: Create the backup script
Create the following file: ~/Backups/websitebackup
Also be sure to review the following notes:
You may have to change BACKUPDIR to match your local user directory.
YOURUSERNAME = the username on your local Linux/Unix/Mac machine.
/path/to/your/website/directory/ = the path on the remote server that contains the files you want to back up.
REMOTESERVERIP = the IP address or hostname you use to SSH into the server.
REMOTESSHUSERNAME = the username you use to SSH into your server (i.e. the username part of yourusername@xxx.xxx.xxx.xxx).
If you do not know the IP address of your server, you can normally find it by typing ping yourdomain.tld into the terminal; the IP address will show up in the output. If you are still having trouble finding it, ask your system administrator or web host.
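For example, a ping that resolves the hostname might look like this (the address shown is hypothetical):
ping -c 3 yourdomain.tld
PING yourdomain.tld (93.184.216.34): 56 data bytes
64 bytes from 93.184.216.34: icmp_seq=0 ttl=56 time=12.3 ms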
In the new websitebackup file, add the following script:
#!/bin/bash

# Local backup locations
BACKUPDIR="/Users/YOURUSERNAME/Backups"
BACKUPDIRDST="$BACKUPDIR/mywebsitebackups"
BACKUPARCHIVES="$BACKUPDIR/archives"
BACKUPLOGS="$BACKUPDIR/archive_logs"
BACKUPDATE=$(date +"%Y-%m-%d_%H%M")

# Remote server details
BACKUPSRC="/path/to/your/website/directory/"
BACKUPADDR="REMOTESERVERIP"
BACKUPUSER="REMOTESSHUSERNAME"

if [ ! -d "$BACKUPDIR" ]; then
  echo "Creating backups directory"
  mkdir -p "$BACKUPDIR"
fi

if [ ! -d "$BACKUPARCHIVES" ]; then
  echo "Creating backup archives directory"
  mkdir -p "$BACKUPARCHIVES"
fi

if [ ! -d "$BACKUPLOGS" ]; then
  echo "Creating backup archive logs directory"
  mkdir -p "$BACKUPLOGS"
fi

if [ ! -d "$BACKUPDIRDST" ]; then
  echo "Creating backup destination directory for storing downloaded files"
  mkdir -p "$BACKUPDIRDST"
fi

if [ -d "$BACKUPDIR" ]; then
  echo "Starting website backup at $BACKUPDATE"
  # Pull down new or changed files, logging rsync's output
  echo "Copying down new or changed files and directories"
  /usr/bin/rsync -acvi --stats --progress "$BACKUPUSER@$BACKUPADDR:$BACKUPSRC" "$BACKUPDIRDST" > "$BACKUPLOGS/latestwebsitebackup_rsync_$BACKUPDATE.log"
  cd "$BACKUPDIR"
  # Archive the downloaded copy, then compress logs and archives
  echo "Creating archive"
  /usr/bin/tar -cvf "$BACKUPARCHIVES/websitebackup_$BACKUPDATE.tar" "$BACKUPDIRDST" 2>> "$BACKUPLOGS/latestwebsitebackup_archive_$BACKUPDATE.log"
  echo "Compressing logs"
  /usr/bin/find "$BACKUPLOGS" -type f -iname "*.log" | /usr/bin/xargs /usr/bin/gzip -9
  echo "Compressing archives"
  /usr/bin/find "$BACKUPARCHIVES" -type f -iname "*.tar" | /usr/bin/grep -v ".tar.gz" | /usr/bin/xargs /usr/bin/gzip -9
  echo "Backup completed at $BACKUPDATE"
fi
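Before running the full script, you can optionally dry-run the rsync step on its own; the -n flag makes rsync list what it would transfer without actually copying anything (substitute your own values for the placeholders):
/usr/bin/rsync -acvin REMOTESSHUSERNAME@REMOTESERVERIP:/path/to/your/website/directory/ ~/Backups/mywebsitebackups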
After you have added the above text to the file and saved it, run:
chmod +x ~/Backups/websitebackup
from the terminal to give the file executable permissions.
Give the file a test run to ensure everything is working as expected:
~/Backups/websitebackup
Example output should appear as:
Starting website backup at 2015-06-04_2113
Copying down new or changed files and directories
Creating archive
Compressing logs
Compressing archives
Backup completed at 2015-06-04_2113
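If you ever need to restore from one of the compressed archives, you can extract it with tar; this is a sketch using the script's default paths and an example archive name:
mkdir -p /tmp/restore
cd ~/Backups/archives
tar -xzf websitebackup_2015-06-04_2113.tar.gz -C /tmp/restore
Because tar strips the leading slash when it stores absolute paths, the files will land under something like /tmp/restore/Users/YOURUSERNAME/Backups/mywebsitebackups/.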
Step 4: Automate the backups
Now that you have passwordless login set up, you can set up regular automated backups. The usual way to accomplish this is by pulling backups down from your server to your local system or a local server, rather than pushing backups from the server to where you want them to go. Start off by thinking about when you would like backups to run; a good interval is daily or hourly, depending on how frequently the data on your website changes and how many backups you want to keep. Let's set up a backup that runs daily at 2 AM.
Execute crontab -e in terminal to edit your crontab.
Type i to go into insert mode (crontab typically opens in the vi editor).
Then type in the following, with the correct full path to the websitebackup file:
0 2 * * * /path/to/websitebackup
Note that the first field is the minute; a * there would run the backup every minute between 2:00 and 2:59 AM, so it is set to 0 to run once at 2:00 AM.
When you are finished, press the esc key, then type :wq to save the file and quit.
You should now be able to check back daily to see the new backup archives created at 2 AM.
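For reference, the five fields in a crontab entry are minute, hour, day of month, month, and day of week, in that order. Here is a commented version of the entry above with optional logging added (the log path is just an example):
# minute hour day-of-month month day-of-week command
0 2 * * * /path/to/websitebackup >> /tmp/websitebackup_cron.log 2>&1
Redirecting output to a log file as shown is optional, but it makes it much easier to see why a scheduled run failed.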