I have a large (3TB+) single database on an Azure Postgres Flexible Server. Since this is a managed service, it takes backups automatically, and the frequency can be adjusted. As a disaster recovery/emergency option, I'd like to periodically take a snapshot of the database separately from these incremental backups and store it somewhere else.

I don't need ongoing connectivity for incremental backups (current managed system does that), I just want a periodic backup that'd be restorable to a new server. I would just choose Azure's own Vaulted Backup option, which provides enough separation for this purpose. But according to the docs:

Vaulted backups are supported for server size <= 1 TB. If backup is configured on server size larger than 1 TB, the backup operation fails.

So I'm looking for other options. I checked out Barman, which looks like it should be perfect. But the managed Postgres Flexible Server doesn't support ssh/rsync access, and I'm getting conflicting information on whether pg_basebackup is supported; it seems unlikely, as do other physical replication options.

I can't use Azure's own tools to back up my server because it's too big, and I can't use external tools because they can't get shell access. The last option is plain pg_dump, which in my experience will take days to finish and may need to be run against a point-in-time restore so that production performance isn't affected. I'd prefer to create and restore from a physical backup rather than a logical one, so this is a last resort.
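For what it's worth, if pg_dump does end up being the fallback, its directory format (-Fd) is the only output format that supports parallel workers via -j, which can shorten a multi-day dump considerably. A minimal sketch, assuming the connection details below are placeholders and that it runs against a point-in-time restore rather than production:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Placeholder connection details -- point these at a point-in-time
# restore of the server, not the production instance.
export PGHOST="restored-copy.postgres.database.azure.com"
export PGDATABASE="mydb"

# Dated output directory, e.g. pgdump-20240530
OUTDIR="pgdump-$(date -u +%Y%m%d)"

# Only run when explicitly asked, so the script can be sourced safely.
if [[ "${1:-}" == "--run" ]]; then
  # -Fd dumps several tables concurrently (-j 8 here); -Z sets the
  # per-file compression level inside the directory.
  pg_dump -Fd -j 8 -Z 5 -f "$OUTDIR"
fi
```

The matching pg_restore also accepts -j against a directory-format dump, so the restore can be parallelized the same way.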

Is there a better option for a managed server this size than pg_dump?

asked May 28 at 23:05
  • You can either dump the entire postgres setup with pg_dumpall --clean --if-exists and pipe it through your favorite compression utility (e.g. zstd, xz, or bzip2), or use pg_dump and check out the -Fc option. See pg_dump - Description at top of page. Concur with suggestion to migrate to database administration StackExchange site. Commented May 29 at 0:27
  • I don't know what Microsoft provides, but I expect that pg_dump will be the only option. Hosting providers make it easy to move to the cloud, but they make it as hard as they can to move back out. Commented May 29 at 5:38
  • pg_dump [flags] -Ft | [p]bzip2 -9 | azcopy copy "https://yourstorageaccount.dfs.core.windows.net/yourcontainer/pgdump-$(date -u +%Y-%m-%dT%H:%M:%S%Z).tar.bz2" --from-to PipeBlob Commented May 30 at 10:00

1 Answer

I know, it's a pain; I found this out at my own cost.

Time has passed, but the only way to back up PostgreSQL on Azure, or anywhere else, is still pg_dump.

And Microsoft documents this approach without any shame.

You're in the cloud, but you still have to use that archaic command, and there's no built-in automation. I don't know how this is possible.

They want you to hate PostgreSQL and use Azure SQL Database.

If you want to automate it, use Azure Logic Apps or Azure Data Factory to run the dump on a schedule (e.g. hourly) and save each backup, stamped with its date and time, to an Azure Blob Storage container.
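The core step such a job would schedule can be sketched as follows. This is a minimal sketch, not a definitive implementation: the server, database, and storage account names are placeholders, and it assumes azcopy's stdin upload mode (--from-to PipeBlob) is available so the multi-terabyte dump never has to touch local disk.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Placeholder names -- substitute your own server, database, and container.
PGHOST="myserver.postgres.database.azure.com"
PGDATABASE="mydb"
CONTAINER_URL="https://mystorageaccount.blob.core.windows.net/pgbackups"

# Timestamped blob name so each scheduled run keeps its own backup,
# e.g. pgdump-2024-05-30T080600Z.dump
BLOB_NAME="pgdump-$(date -u +%Y-%m-%dT%H%M%SZ).dump"

backup_to_blob() {
  # -Fc produces a compressed archive restorable with pg_restore;
  # piping it straight into azcopy streams it to Blob Storage.
  pg_dump -h "$PGHOST" -d "$PGDATABASE" -Fc \
    | azcopy copy "${CONTAINER_URL}/${BLOB_NAME}" --from-to PipeBlob
}

# Only run when explicitly asked, so the script can be sourced safely.
if [[ "${1:-}" == "--run" ]]; then
  backup_to_blob
fi
```

The scheduler (Logic App, Data Factory, or plain cron on a VM) then just invokes the script with --run; authentication for both pg_dump and azcopy is left to the environment.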

answered May 30 at 8:06
