I am trying to do a daily backup of a MySQL DB hosted on Stackhero (on Heroku) to a Google Cloud Storage bucket. We need to back up tables individually so that if data is corrupted / lost in a specific table, we can restore it without restoring the entire database.
I've looked into third-party tools (such as SimpleBackups), but they do not allow per-table backups; they only generate one SQL file for the entire database. This is quite tedious to work with, as we'd need to restore the entire database in a staging environment and then extract the data from the table we want to restore.
At the moment the only solution I've found is to run a scheduled process myself on Cloud Run that invokes mysqldump to extract the data from each table and upload it to GCS.
I was wondering if there are any other approaches that might be more effective / not require a custom implementation?
3 Answers
Look, most backup tools will always give you full database dumps. That is safe for recovery, but it does not help if you only want one table back.
Some options to try out:
- Stick with mysqldump per table. It works fine, it's just slow if the DB is big. Each table ends up as its own file in GCS. It's straightforward.
- Use mydumper/myloader - this is what I would recommend. It's like mysqldump, but faster, parallelized, and by default it gives you one file per table.
- Full backup + point-in-time recovery - this is the option at scale: you restore the DB somewhere else and grab the table you need. That's extra overhead, but it is reliable.
If it were me: I would set up Cloud Scheduler -> Cloud Run -> mydumper -> push to GCS. That way you get clean per-table dumps, and restoring is also easy: just myloader the table's file back into production. No crazy code, just glue.
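Not the answerer's exact setup, but a minimal sketch of that Cloud Run job, assuming mydumper and gsutil are installed and credentials come from the environment; the host, user, database, and bucket names below are placeholders. DRY_RUN defaults to 1 so the script only prints the commands it would run:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Placeholders -- substitute your Stackhero credentials and your bucket.
DB_HOST="${DB_HOST:-example.stackhero-network.com}"
DB_USER="${DB_USER:-backup}"
DB_NAME="${DB_NAME:-mydb}"
BUCKET="${BUCKET:-gs://my-backup-bucket}"
STAMP="$(date +%F)"       # dated prefix so older backups are kept
OUTDIR="$(mktemp -d)"

# With DRY_RUN=1 (the default) commands are only recorded and printed,
# so the script can be sanity-checked without a live database.
CMDS=""
run() { CMDS+="$*"$'\n'; if [ "${DRY_RUN:-1}" = 0 ]; then "$@"; fi; }

# mydumper writes one compressed dump file per table into OUTDIR.
run mydumper --host "$DB_HOST" --user "$DB_USER" \
    --database "$DB_NAME" --threads 4 --compress --outputdir "$OUTDIR"

# Push the per-table files to GCS under today's prefix.
run gsutil -m cp "$OUTDIR/*" "$BUCKET/$STAMP/"

printf '%s' "$CMDS"
```

Restoring one table is then just myloader (or mysql) fed the single file pulled back from the bucket.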
Fun fact: you do not need to restore the whole database from the dump.
# Full backup of the whole database, compressed:
mysqldump -u username -p database | gzip > dumpfile.sql.gz

# Extract just one table's section from the dump -- mysqldump wraps each
# table's INSERTs in a LOCK TABLES ... WRITE / UNLOCK TABLES pair, which
# sed can use as range markers:
tablename="mybrokentable"
( echo "TRUNCATE TABLE \`${tablename}\`;" ; \
gunzip -c dumpfile.sql.gz \
| sed -n -e "/LOCK TABLES \`${tablename}\` WRITE/,/UNLOCK TABLES/p" ) \
> single.table.dump.sql

# Replay it to restore only that table:
mysql -u username -p database < single.table.dump.sql
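As a sanity check, the sed filter can be exercised against a tiny synthetic dump (the table names and rows here are made up) to confirm it isolates a single table's block:

```shell
#!/usr/bin/env bash
set -euo pipefail

# A minimal two-table dump in mysqldump's LOCK/UNLOCK layout (synthetic).
cat > dumpfile.sql <<'EOF'
LOCK TABLES `users` WRITE;
INSERT INTO `users` VALUES (1,'alice');
UNLOCK TABLES;
LOCK TABLES `orders` WRITE;
INSERT INTO `orders` VALUES (10,1);
UNLOCK TABLES;
EOF

tablename="orders"
# Keep only the lines between this table's LOCK ... WRITE and the next
# UNLOCK TABLES statement.
sed -n -e "/LOCK TABLES \`${tablename}\` WRITE/,/UNLOCK TABLES/p" dumpfile.sql \
  > single.table.dump.sql

cat single.table.dump.sql
```

The extracted file contains only the `orders` rows, so replaying it touches nothing else.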
- Thanks for the helpful tip! – Nat Geman, Sep 23 at 9:23
You're backing up a MySQL DB on Stackhero (Heroku) to Google Cloud Storage and want per-table backups for easier recovery. Most tools like SimpleBackups only support full DB dumps. Your current solution, using Cloud Run with mysqldump per table, is solid.
Alternatives:
- Use mydumper for faster, per-table dumps.
- Trigger Cloud Functions via Cloud Scheduler for lightweight automation.
- Consider hybrid backups: full weekly + table-level daily.
- Explore newer tools like UpBack! for granular restore options.
- Sounds like AI generated. Please read the help. – Rohit Gupta, Sep 5 at 22:09