Does anybody know a way to perform a periodic remote backup of a full environment (including the application servers and the SQL databases) in Jelastic?
I wanted to use Google Drive to store the backups, as I was already using it with Plesk.
Thanks.
The solution I found uses rclone to perform a remote backup of a single node. It works with various cloud storage services, but I'm using it with Google Drive. I'm on Linux CentOS, and this solution doesn't require root access.
Connect to your node via SSH. If you use the WebSSH terminal included in the Jelastic administration console, the rclone configuration might not work, since it requires you to open a browser to authorize the remote; I'm using VS Code with the Remote SSH extension, but PuTTY should work fine as well.
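For reference, a connection through the Jelastic SSH Gate typically looks something like the line below; the user ID, node ID, and gate host are placeholders, so copy the exact command from your dashboard:
ssh 3112-4731@gate.your-jelastic-provider.com -p 3022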
Run the following commands in a folder where you have read/write permissions to download rclone and install it into a "backup-tools" folder under your home directory:
curl -O https://downloads.rclone.org/rclone-current-linux-amd64.zip
unzip rclone-current-linux-amd64.zip
cd rclone-*-linux-amd64
mkdir -p ~/backup-tools/
cp rclone ~/backup-tools/
cd ..
rm -r rclone-*-linux-amd64
rm rclone-*-linux-amd64.zip
cd ~/backup-tools
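At this point you can confirm the binary runs:
./rclone version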
Run ./rclone config and configure your chosen cloud storage service; refer to the rclone documentation for the details of each provider.
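As a quick sanity check, assuming you named the remote "gdrive" like I do below, you can list the configured remotes and the folders at the root of the Drive:
./rclone listremotes
./rclone lsd gdrive: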
Create a backup.sh file and edit it with your favourite editor. For Apache nodes I did something like this:
#!/bin/bash
# set environment variables
export PATH=$PATH:~/backup-tools
export REMOTE_PATH="gdrive:JelasticBackups"
export FILE_NAME="$(date +"%Y-%m-%d_%H-%M-%S").zip"
export PASSWORD="YOUR_PASSWORD"
# create temporary directory for backups
mkdir -p /tmp/backups
# create a password-protected archive of the environment files
cd ~/webroot/ROOT/
zip -r -P "$PASSWORD" "/tmp/backups/$FILE_NAME" *
# upload the backup to Google Drive using rclone
~/backup-tools/rclone copy "/tmp/backups/$FILE_NAME" "$REMOTE_PATH"
# delete the local copy
rm -rf /tmp/backups/*
# delete the remote backups older than 30 days
~/backup-tools/rclone delete "$REMOTE_PATH" --min-age 30d
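Before scheduling anything, it's worth running the script once by hand (with bash backup.sh if you haven't made it executable yet) and checking that the archive actually landed on the remote:
bash backup.sh
~/backup-tools/rclone ls gdrive:JelasticBackups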
For MySQL nodes you first need to run the backup script that Jelastic ships by default to dump the databases, so I did something like this:
#!/bin/bash
# set environment variables
export PATH=$PATH:~/backup-tools
export REMOTE_PATH="gdrive:JelasticBackups"
export FILE_NAME="$(date +"%Y-%m-%d_%H-%M-%S").zip"
export ZIP_PASS="ZIP_PASSWORD"
export DB_USR="DB_USERNAME"
export DB_PASS="DB_PASSWORD"
# dump all databases with the backup script Jelastic ships with the node
/var/lib/jelastic/bin/backup_script.sh -m dumpall -c 1 -u "$DB_USR" -p "$DB_PASS"
# create a password-protected archive of the dumps
cd /var/lib/jelastic/backup/
zip -r -P "$ZIP_PASS" "/var/lib/jelastic/backup/$FILE_NAME" *
# upload the backup to Google Drive using rclone
~/backup-tools/rclone copy "/var/lib/jelastic/backup/$FILE_NAME" "$REMOTE_PATH"
# delete the local dumps and archive
rm -rf /var/lib/jelastic/backup/*
# delete the remote backups older than 30 days
~/backup-tools/rclone delete "$REMOTE_PATH" --min-age 30d
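For completeness, restoring works the other way around: download the archive and unzip it with the same password (the file name below is just an example):
~/backup-tools/rclone copy gdrive:JelasticBackups/2024-01-01_00-00-00.zip /tmp/restore/
cd /tmp/restore && unzip -P ZIP_PASSWORD 2024-01-01_00-00-00.zip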
Then don't forget to make the file executable by running chmod +x backup.sh
If you want, you can create cron jobs to schedule periodic backups. Use the absolute path to the script, since cron won't reliably expand ~ (on my Apache node the home folder resolves to /var/www); for example, this runs every day at midnight:
0 0 * * * /var/www/backup-tools/backup.sh
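To install the job, either run crontab -e and paste the line in, or append it non-interactively:
(crontab -l 2>/dev/null; echo "0 0 * * * /var/www/backup-tools/backup.sh") | crontab -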
That's it.