I have a large (3TB+) single database on an Azure Postgres Flexible Server. This is a managed service, so it takes backups automatically and the frequency can be adjusted. As a disaster recovery/emergency option, I'd like to periodically take a snapshot of the database separately from these incremental backups and store it somewhere else.
I don't need ongoing connectivity for incremental backups (current managed system does that), I just want a periodic backup that'd be restorable to a new server. I would just choose Azure's own Vaulted Backup option, which provides enough separation for this purpose. But according to the docs:
> Vaulted backups are supported for server size <= 1 TB. If backup is configured on server size larger than 1 TB, the backup operation fails.
So I'm looking for other options. I checked out Barman, which looks like it should be perfect, but the managed Postgres Flexible Server doesn't support SSH/rsync access, and I'm getting conflicting info on whether `pg_basebackup` is supported; it seems unlikely, along with other physical replication options.
I can't use Azure's own tools to back up my server, because it's too big. I can't use external tools, because they can't get shell access. The last remaining option is plain `pg_dump`, which in my experience will take days to finish and may need to be run against a point-in-time restore so that production performance isn't affected. I'd prefer to create and restore from a physical backup rather than a logical one, so this is a last resort.
Is there a better option than `pg_dump` for a managed server of this size?
1 Answer
I know, it's a pain. I learned that at my own cost.
Time has passed, but the only way to back up PostgreSQL on Azure, or anywhere else, is still `pg_dump`.
And Microsoft provides documentation for it without any shame. You are in the cloud, but you still have to use that archaic command, and there is no built-in automation. I don't know how this is possible.
They want you to hate PostgreSQL and use Azure SQL Database.
If you want to automate it, use Azure Logic Apps or Azure Data Factory to execute that script each hour and save each backup, stamped with date and time, to an Azure Blob Storage container.
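As a minimal sketch of that idea: the script below builds a timestamped `pg_dump` pipeline that compresses with `zstd` and hands the stream to `azcopy`. The hostname, database name, and container URL are placeholders for your environment, and the `--from-to PipeBlob` stdin-upload flag is an assumption about your azcopy version; the pipeline is printed rather than executed by default, so set `DRY_RUN=0` only once you've filled in real values.

```shell
#!/usr/bin/env sh
# Sketch: periodic logical backup of an Azure Postgres Flexible Server.
# All names below are placeholders; pg_dump, zstd, and azcopy must be on PATH
# for a real run, and azcopy's stdin upload (--from-to PipeBlob) should be
# verified against your azcopy version.
set -eu

PGHOST="myserver.postgres.database.azure.com"                       # placeholder
PGDATABASE="mydb"                                                   # placeholder
CONTAINER_URL="https://mystorage.blob.core.windows.net/pg-backups"  # placeholder

# Timestamped blob name so each scheduled run lands in a distinct object.
STAMP=$(date -u +%Y-%m-%dT%H-%M-%SZ)
BLOB="pgdump-${PGDATABASE}-${STAMP}.dump.zst"

# Custom-format dump (-Fc) piped through zstd, then streamed to Blob Storage.
CMD="pg_dump -h $PGHOST -d $PGDATABASE -Fc | zstd -T0 | azcopy copy \"$CONTAINER_URL/$BLOB\" --from-to PipeBlob"

# Dry run by default so the sketch is safe to execute anywhere.
if [ "${DRY_RUN:-1}" = "1" ]; then
  echo "$CMD"
else
  eval "$CMD"
fi
```

A Logic App or Data Factory pipeline would then only need to invoke this script on a schedule; retention (deleting blobs older than N days) can be handled by a Blob Storage lifecycle management rule rather than in the script.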
Use `pg_dumpall --clean --if-exists` and pipe it through your favorite compression utility (e.g. `zstd` or `xz` or `bzip2`, etc.), or use `pg_dump` and check out the `-Fc` option. See pg_dump - Description at top of page. Concur with suggestion to migrate to the Database Administrators Stack Exchange site.

`pg_dump` will be the only option. Hosting providers make it easy to move to the cloud, but they make it as hard as they can to move back out. `pg_dump [flags] -Ft | [p]bzip2 -9 | azcopy https://yourstorageaccount.dfs.core.windows.net/yourcontainer/pgdump-$(date -u +%Y-%m-%dT%H:%M:%S%Z).tar.bz2`