
I’m trying to set up a backup strategy for a production PostgreSQL database. It will hold a large amount of data and must run 24x7. Could you recommend backup & recovery strategies that meet the following criteria?

  • Large amount of data (over 200 GB).
  • Hot backup (online backup).
  • Minimal impact on database performance.
  • Minimal restore time.
  • Support for PITR (point-in-time recovery).
    1. Can we execute a backup meeting the above criteria on a slave server in a replication setup?
    2. If you know a backup strategy using storage snapshots, could you share it?
asked Feb 20, 2014 at 12:00
    The usual setup is (if I'm not mistaken) to create a slave using streaming replication and then do the backups on the slave (to minimize the impact on the master). You might want to look at pgBarman and repmgr. Commented Feb 20, 2014 at 12:02

1 Answer


It sounds like you'd be best suited by a physical base backup + WAL archiving, with regularly refreshed copies of the base backup. I strongly recommend taking regular logical dumps (pg_dump) as well, as a safety net.
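A minimal sketch of that setup; the archive path, backup directory, and user name below are placeholders, not part of the original answer:

```shell
# -- postgresql.conf: enable continuous WAL archiving --
# wal_level = archive       # "hot_standby" if the server also feeds replicas
# archive_mode = on
# archive_command = 'test ! -f /backup/wal/%f && cp %p /backup/wal/%f'

# take a compressed physical base backup into /backup/base
# (run as a superuser or a role with the REPLICATION privilege)
pg_basebackup -U postgres -D /backup/base -Ft -z -P
```

For a point-in-time restore, you would extract the base backup into a fresh data directory and point `restore_command` at the WAL archive (e.g. `restore_command = 'cp /backup/wal/%f %p'` together with a `recovery_target_time` in `recovery.conf`).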

With newer PostgreSQL versions (9.2 and up, IIRC) you can take fresh base backups from a replica server, so you don't have to disrupt the master.
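If the replica is reachable over the network, the same tool can simply be pointed at it instead of the master. The host name and role below are examples; the role needs the REPLICATION privilege and a matching pg_hba.conf entry:

```shell
# stream a base backup from a standby rather than the master (9.2+)
pg_basebackup -h replica1.example.com -U replicator -D /backup/base -Ft -z -P
```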

File-system or logical volume level snapshot backups work fine with PostgreSQL, so long as your snapshots are atomic. Restoring one is like starting PostgreSQL back up after an unexpected power loss or OS reboot; not a big deal.
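As an illustration of the snapshot approach, here is a sketch using LVM. The volume group, logical volume, and mount point names are placeholders; note the snapshot must cover the entire data directory, including pg_xlog and any tablespaces, or it is not atomic:

```shell
# create an atomic copy-on-write snapshot of the LV holding the data directory
lvcreate --size 10G --snapshot --name pgdata_snap /dev/vg0/pgdata

# mount it read-only and copy the frozen state off-host
mount -o ro /dev/vg0/pgdata_snap /mnt/pgdata_snap
rsync -a /mnt/pgdata_snap/ backuphost:/backup/pgdata/
umount /mnt/pgdata_snap

# release the snapshot so copy-on-write overhead stops
lvremove -f /dev/vg0/pgdata_snap
```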


answered Feb 20, 2014 at 13:39
