
I am trying to do a daily backup of a MySQL DB hosted on Stackhero (on Heroku) to a Google Cloud Storage bucket. We need to back up tables individually so that if data is corrupted or lost in a specific table, we can restore it without restoring the entire database.

I've looked into third-party tools (such as SimpleBackups), but they do not allow table-level backups; they only generate one SQL file for the entire database. This is quite tedious to work with, as we'd need to restore the entire database in a staging environment and then extract the data from the table we want to restore.

At the moment, the only solution I've found is to run my own scheduled process on Cloud Run that invokes mysqldump per table and uploads the output to GCS.
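
Roughly, the job loops over the tables with mysqldump and streams each dump into the bucket; something like this (the table list, bucket name, and connection variables below are placeholders):

TABLES="users orders invoices"   # placeholder table list
BUCKET="gs://my-backup-bucket"   # placeholder bucket
STAMP=$(date +%F)

for t in $TABLES; do
  # dump one table, compress it, and stream it straight into GCS
  mysqldump -h "$DB_HOST" -u "$DB_USER" -p"$DB_PASSWORD" "$DB_NAME" "$t" \
    | gzip \
    | gsutil cp - "$BUCKET/$STAMP/$t.sql.gz"
done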

I was wondering if there are any other approaches that might be more effective / not require a custom implementation?

asked Sep 5 at 8:35
  • The reason most backup tools only deal with whole databases is that rows and tables aren't isolated entities; think of things like foreign keys. So restoring a whole database is the safest option. For your specific purpose, restoring the database in a staging environment and then copying the data over manually sounds like a good solution. Commented Sep 5 at 8:53
  • You're on the right track — mysqldump --tables is still the most reliable way to back up individual tables. Most third-party tools focus on full DB backups. One workaround is to script mysqldump per table and run it via a scheduled job (Cloud Run or Cloud Scheduler + Cloud Function). You could also explore using mydumper, which supports parallel per-table dumps, but you'd need to self-host it. Commented Sep 5 at 9:45
  • Thanks for the feedback! I think I will go with a full backup to be safe and use mysqldump for the restore. Commented Sep 23 at 9:22

3 Answers


Most backup tools will always give you full database dumps. That is safe for recovery, but it does not help if you only want one table back.

Some options to try out:

  1. Stick with mysqldump per table. It works fine, it's just slow if the DB is big. Each table ends up as its own file in GCS, and it's straightforward.

  2. Use mydumper/myloader - this is what I would recommend. It's like mysqldump, but faster, parallelized, and by default it gives you one file per table.

  3. Full backup + point-in-time recovery => this works at scale; you restore the DB somewhere else and grab the table you need. That's extra overhead, but it is reliable.

If it were me: I would set up Cloud Scheduler -> Cloud Run -> mydumper -> push to GCS. That way you get clean per-table dumps, and restoring is as easy as running myloader back into production. No crazy code, just glue.
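
A rough sketch of what that Cloud Run job could look like - host, credentials, and bucket below are placeholders, so tune the flags for your DB:

#!/usr/bin/env bash
set -euo pipefail

OUTDIR="/tmp/dump-$(date +%F)"

# mydumper writes one (compressed) file per table and dumps tables in parallel
mydumper --host "$DB_HOST" --user "$DB_USER" --password "$DB_PASSWORD" \
         --database "$DB_NAME" --outputdir "$OUTDIR" --threads 4 --compress

# sync the per-table files into the bucket, one folder per day
gsutil -m rsync -r "$OUTDIR" "gs://my-backup-bucket/$(date +%F)/"

To restore a single table later, you only need to feed that table's file from the dump directory to myloader instead of replaying the whole dump.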

answered Sep 5 at 11:44

Fun fact: you do not need to restore the whole database from the dump.

# Make a compressed dump of the whole database
mysqldump -u username -p database | gzip > dumpfile.sql.gz

# Extract just one table's data from the dump
# (relies on the LOCK TABLES ... WRITE / UNLOCK TABLES markers mysqldump writes by default)
tablename="mybrokentable"
( echo "TRUNCATE TABLE \`${tablename}\`;" ; \
gunzip -c dumpfile.sql.gz \
| sed -n -e "/LOCK TABLES \`${tablename}\` WRITE/,/UNLOCK TABLES/p" ) \
> single.table.dump.sql

# Replay only that table's INSERTs into the live database
mysql -u username -p database < single.table.dump.sql
answered Sep 5 at 13:11
  • Thanks for the helpful tip! Commented Sep 23 at 9:23

You're backing up a MySQL DB on Stackhero (Heroku) to Google Cloud Storage and want per-table backups for easier recovery. Most tools like SimpleBackups only support full DB dumps. Your current solution (Cloud Run with mysqldump per table) is solid.

Alternatives:

  • Use mydumper for faster, per-table dumps.
  • Trigger Cloud Functions via Cloud Scheduler for lightweight automation (see the sketch below).
  • Consider hybrid backups: full weekly + table-level daily.
  • Explore newer tools like UpBack! for granular restore options.
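
For the scheduling piece, a minimal sketch with gcloud (job name, URL, and service account are placeholders for whatever runs your backup code):

# Hypothetical daily trigger hitting a Cloud Run / Cloud Functions backup endpoint
gcloud scheduler jobs create http nightly-table-backup \
  --schedule="0 2 * * *" \
  --time-zone="UTC" \
  --uri="https://backup-service-xxxxx.a.run.app/run" \
  --http-method=POST \
  --oidc-service-account-email="backup-invoker@my-project.iam.gserviceaccount.com"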

answered Sep 5 at 16:43
  • Sounds like AI generated. Please read the help. Commented Sep 5 at 22:09
