I'm trying to dump a CSV of a distinct varchar(43) column to a file. The table has around a billion rows. SELECT DISTINCT either runs out of memory and the command is aborted, or the server just closes all connections. Does PostgreSQL have built-in methods for such a task?
psql (10.9 (Ubuntu 10.9-0ubuntu0.18.10.1)), 16 GB RAM
2 Answers
The solution suggested by jjanes worked: run

set enable_hashagg = off;

in the psql terminal first.
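For reference, a minimal sketch of the full export, assuming a table y with a varchar(43) column x (both names are placeholders, not from the original post):

set enable_hashagg = off;   -- force a sort-based plan for DISTINCT
\copy (SELECT DISTINCT x FROM y) TO 'distinct_x.csv' WITH (FORMAT csv)

With hash aggregation disabled, the planner falls back to a Sort + Unique plan. Sorts can spill to disk when they exceed work_mem, whereas a hash aggregate on PostgreSQL 10 has to fit in memory (spilling hash aggregation only arrived in PostgreSQL 13), which is why the original query blew up.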
You can dump it without DISTINCT and then deduplicate with Linux tools. Note that uniq only removes adjacent duplicates, so the dump has to be sorted first (or use sort -u, which sorts and deduplicates in one pass). Just make sure that you have a swap file defined, which is a good idea in any case if your system doesn't use one constantly.
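A minimal sketch of that approach, assuming a database mydb and a table y with column x (all hypothetical names):

# export the raw column, one value per line
psql -c "\copy (SELECT x FROM y) TO 'x_all.txt'" mydb
# sort and deduplicate; GNU sort does an external merge sort,
# spilling to temporary files, so it handles data larger than RAM
LC_ALL=C sort -u x_all.txt > x_distinct.csv

LC_ALL=C makes sort compare raw bytes instead of doing locale-aware collation, which is usually much faster and makes no difference for deduplication.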
Run

set enable_hashagg = off;

first. Inappropriate hash aggregates are what usually causes this, in my experience. The planner should know better than to use them with very large result sizes, but if the estimates are way off, it might not know the result will be very large. What does

EXPLAIN select distinct x from y

show?
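For illustration, the two plan shapes you would typically see (hypothetical output for a table y with column x; real output includes costs and row estimates):

EXPLAIN SELECT DISTINCT x FROM y;
-- problematic plan, builds the whole hash table in memory:
--   HashAggregate
--     ->  Seq Scan on y
-- after set enable_hashagg = off, a plan that can spill to disk:
--   Unique
--     ->  Sort
--           Sort Key: x
--           ->  Seq Scan on y

If the first EXPLAIN already shows Unique over Sort, hash aggregation is not the culprit.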