I am creating an ETL process where I rely on the database to refuse duplicate values for a business key. In SQL Server this works like a charm: the database raises an error, the ETL ignores it, continues, and the next row gets inserted just fine. With PostgreSQL, my UNIQUE constraint (or unique index, they show the same behaviour) aborts the transaction on the first violation and blocks any further inserts. That makes this approach useless and PostgreSQL a less useful candidate for the data vault I'm building for my client.
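To illustrate what happens (the table and column names below are hypothetical, just a minimal sketch of my setup):

    -- hypothetical hub table with a unique business key
    CREATE TABLE hub_customer (
        customer_id   serial PRIMARY KEY,
        business_key  text NOT NULL,
        CONSTRAINT uq_hub_customer_bk UNIQUE (business_key)
    );

    BEGIN;
    INSERT INTO hub_customer (business_key) VALUES ('C-001');  -- ok
    INSERT INTO hub_customer (business_key) VALUES ('C-001');  -- ERROR: duplicate key value violates unique constraint
    INSERT INTO hub_customer (business_key) VALUES ('C-002');  -- ERROR: current transaction is aborted, commands ignored
    COMMIT;                                                    -- effectively a rollback; nothing is kept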
I found a workaround using a rule, but that turns out to be extremely slow; performance is several times worse. So a rule is not a good way to have the database deal with duplicates either.
Is there any way to have PostgreSQL keep on inserting new rows after a UNIQUE constraint violation?
1 Answer
Check the PostgreSQL documentation for the detailed syntax of the INSERT statement. In particular, there is an optional ON CONFLICT clause:
    [ ON CONFLICT [ conflict_target ] conflict_action ]

where conflict_target can be one of:

    ( { index_column_name | ( index_expression ) } [ COLLATE collation ] [ opclass ] [, ...] ) [ WHERE index_predicate ]
    ON CONSTRAINT constraint_name

and conflict_action is one of:

    DO NOTHING
    DO UPDATE SET { column_name = { expression | DEFAULT } |
                    ( column_name [, ...] ) = ( { expression | DEFAULT } [, ...] ) |
                    ( column_name [, ...] ) = ( sub-SELECT )
                  } [, ...]
              [ WHERE condition ]
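For example, a conflict_target combined with DO UPDATE gives upsert behaviour. A minimal sketch, assuming a hypothetical hub table with a unique business_key and a load_date column:

    -- update the existing row when the business key already exists
    INSERT INTO hub_customer (business_key, load_date)
    VALUES ('C-001', now())
    ON CONFLICT (business_key)
    DO UPDATE SET load_date = EXCLUDED.load_date;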
In your case, you probably want to append at the end of your INSERT statement:

    ON CONFLICT DO NOTHING

Conflicting rows will then be skipped, and further inserts will continue.
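A minimal sketch of what that looks like, reusing the hypothetical hub_customer table from the question (the conflict target can also be named explicitly, e.g. ON CONFLICT (business_key)):

    -- the duplicate is silently skipped; the transaction is not aborted
    BEGIN;
    INSERT INTO hub_customer (business_key) VALUES ('C-001') ON CONFLICT DO NOTHING;
    INSERT INTO hub_customer (business_key) VALUES ('C-001') ON CONFLICT DO NOTHING;  -- skipped
    INSERT INTO hub_customer (business_key) VALUES ('C-002') ON CONFLICT DO NOTHING;  -- inserted
    COMMIT;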
This answer is based on @Akina's comment. Thanks!
See also: INSERT .. ON CONFLICT DO NOTHING/UPDATE syntax in the PostgreSQL documentation.