Consider a production database with a few dozen normal-size tables and a handful of huge tables. I am looking for a convenient way to `pg_dump` the database content to my own local laptop, excluding the few huge tables. If I occasionally need data from those tables, I can easily dump a small sample of them using `\copy`.
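For reference, sampling a huge table with `\copy` might look like this (a sketch; `big_events`, the host, and the row limit are made-up placeholders):

```shell
# Assumption: big_events is a hypothetical huge table; adjust connection flags to taste.
# \copy accepts a query in parentheses, so we can export a random sample to a local CSV.
psql -h prod-host -U myuser mydb \
  -c "\copy (SELECT * FROM big_events ORDER BY random() LIMIT 10000) TO 'big_events_sample.csv' CSV HEADER"
```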
If I use the `--exclude-table=table` argument, both the schema and the data of the huge table are omitted, which breaks queries that expect these tables to exist in the local development environment.
Is there a way to dump a database with the complete schema, but excluding the content of some given tables?
1 Answer
I think you need to use the `--exclude-table-data=table` option. From the docs:
> `--exclude-table-data=table`
>
> Do not dump data for any tables matching the table pattern. The pattern is interpreted according to the same rules as for `-t`. `--exclude-table-data` can be given more than once to exclude tables matching any of several patterns. This option is useful when you need the definition of a particular table even though you do not need the data in it. To exclude data for all tables in the database, see `--schema-only`.
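Putting that option to use, a dump that keeps every table's schema but skips the rows of the huge tables could look like this (a sketch; `big_events`, `audit_log`, the host, and database names are hypothetical):

```shell
# Dump the full schema, but no rows for the two huge tables (names are placeholders).
# Repeat --exclude-table-data once per table to skip.
pg_dump -h prod-host -U myuser \
  --exclude-table-data=big_events \
  --exclude-table-data=audit_log \
  -Fc -f mydb.dump mydb

# Restore locally; the excluded tables are created, just empty.
pg_restore -d mydb_local mydb.dump
```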
The table provided to `--exclude-table-data` (and `--exclude-table`) is a pattern (postgresql.org/docs/9.4/app-psql.html#APP-PSQL-PATTERNS). If you want to exclude everything prefixed with something like `ks_`, you write it as `ks_*`, where `*` matches a sequence of any characters. – Dan Andreasson, Mar 12, 2019 at 21:33