I have started the process of transitioning all of my web hosting from my current provider to Google Cloud. This entails moving my databases over to Google Cloud as well. Backups are important, and I was looking for a way to have backups happen automagically — I think I found my (not so) elegant solution.
The code I came up with is the following:
rm -r backups/*
mysqldump -u root --password=<PASSWORD_HERE> --all-databases > backups/backup-$(date +"%Y-%m-%d-%H%M").sql
gsutil cp backups/* gs://<BUCKET HERE>
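To make the backups happen automagically, the commands above can be scheduled with cron. A sketch, assuming they are saved as `/usr/local/bin/db-backup.sh` (a hypothetical path) and marked executable:

```shell
# Run the backup every day at 03:00; append output to a log (add via `crontab -e`)
0 3 * * * /usr/local/bin/db-backup.sh >> /var/log/db-backup.log 2>&1
```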
This code is also available as a gist. Please suggest changes to this code; I would like to simplify and streamline this process, as it currently feels rather clunky.
Basically, the gsutil utility makes this job super simple. After clearing out the backup directory and creating a new backup file, it's one line to move these backups into Google Cloud Storage. There is probably an elegant way to one-line all the interesting parts of this script and I am looking forward to hearing suggestions!
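On the one-lining front, gsutil can read from stdin when the source is `-`, so the dump could be streamed straight into the bucket with no local `backups/` directory (and no `rm -r`) at all. A sketch, where the bucket name and the `MYSQL_PWD` environment variable are assumptions of mine, and the `DRY_RUN` guard just makes it safe to try without a live database:

```shell
#!/bin/bash
set -euo pipefail

# Hypothetical bucket name; replace with your own.
BUCKET="${BUCKET:-gs://my-backup-bucket}"
STAMP="$(date +%Y-%m-%d-%H%M)"
OBJECT="${BUCKET}/backup-${STAMP}.sql"

# "gsutil cp -" reads the object's contents from stdin, so the dump is
# streamed directly into Cloud Storage without touching local disk.
if [ "${DRY_RUN:-1}" = 0 ]; then
  mysqldump -u root --password="${MYSQL_PWD:?set MYSQL_PWD}" --all-databases \
    | gsutil cp - "$OBJECT"
else
  echo "would run: mysqldump --all-databases | gsutil cp - $OBJECT"
fi
```

Because each object gets a timestamped name, nothing needs to be deleted locally, and old backups can be expired with a bucket lifecycle rule instead of `rm -r`.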