
Consider a production database with a few dozen normal-size tables and a handful of huge tables.

I am looking for a convenient way to pg_dump the database content to my local laptop, excluding the few huge tables. If I occasionally need those tables, I can easily dump a small sample of them using \COPY.
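For concreteness, the \COPY sampling mentioned above could look roughly like the sketch below. The database name, table name, and row limit are hypothetical placeholders; the script only assembles and prints the meta-command rather than connecting to a server.

```shell
#!/bin/sh
# Hedged sketch: build a psql \copy meta-command that exports a small
# sample of one huge table to a local CSV file.
# "mydb", "big_events", and the 10000-row limit are made-up placeholders.
DB=mydb
TABLE=big_events
LIMIT=10000
CMD="\\copy (SELECT * FROM $TABLE LIMIT $LIMIT) TO '${TABLE}_sample.csv' WITH (FORMAT csv, HEADER)"
# Against a real server this would be run as:
#   psql --dbname="$DB" --command "$CMD"
printf '%s\n' "$CMD"
```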

If I use the --exclude-table=table argument, both the schema and the data of the huge table are omitted, which breaks queries that expect these tables in the local development environment.

Is there a way to dump a database with its complete schema while excluding the content of some given tables?

asked May 8, 2016 at 10:11

1 Answer


I think you need to use the --exclude-table-data=table option. From the docs:

--exclude-table-data=table

Do not dump data for any tables matching the table pattern. The pattern is interpreted according to the same rules as for -t. --exclude-table-data can be given more than once to exclude tables matching any of several patterns. This option is useful when you need the definition of a particular table even though you do not need the data in it.

To exclude data for all tables in the database, see --schema-only.
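A minimal sketch of the resulting invocation, assuming two hypothetical huge tables named big_events and big_logs in a database called mydb (the script assembles and prints the command rather than running it, since it would need a live server):

```shell
#!/bin/sh
# Hedged sketch: dump the full schema plus data for every table, except
# that the huge tables contribute their schema only.
# Database and table names here are made-up placeholders.
DB=mydb
CMD="pg_dump --dbname=$DB"
for t in big_events big_logs; do
  # --exclude-table-data may be repeated, once per huge table.
  CMD="$CMD --exclude-table-data=$t"
done
CMD="$CMD --file=${DB}_small.dump"
printf '%s\n' "$CMD"
```

Restoring the resulting dump recreates every table, so queries that merely expect the huge tables to exist keep working; only their rows are missing.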

answered May 8, 2016 at 10:21
  • The table provided to --exclude-table-data (and --exclude-table) is a pattern (postgresql.org/docs/9.4/app-psql.html#APP-PSQL-PATTERNS). If you want to exclude everything prefixed with something like ks_, you write it as ks_*, where * matches any sequence of characters. Commented Mar 12, 2019 at 21:33
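The pattern form from the comment above could be sketched as follows; the ks_ prefix comes from the comment, while the database and file names are hypothetical. Note that the pattern is quoted so the shell does not glob the * before pg_dump sees it:

```shell
#!/bin/sh
# Hedged sketch: exclude data for every table whose name starts with ks_.
# Single quotes keep the shell from expanding the * as a filename glob.
PATTERN='ks_*'
CMD="pg_dump --dbname=mydb --exclude-table-data=$PATTERN --file=mydb_small.dump"
printf '%s\n' "$CMD"
```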
