I am exporting layers to a PostgreSQL 11 database with the PostGIS extension. The export takes a while for big datasets (say 50,000 points). As far as I can see in the pgAdmin dashboard, only batches of about 200 features are pushed to the database per transaction (screenshot of the pgAdmin dashboard). How can I increase the transaction size in QGIS so that I can send, say, 1,000 or more inserts per transaction? I have another Python-based import script which pushes about 1,800 features per transaction.
-
If you are happy using Python, just use psycopg2 and prepared transactions. Or, even better, if it is just a point dataset, use the Postgres COPY command. You should be able to import 50,000 points in less than a second, though you might have to recreate the geometry from x, y after the import. Using GUIs for anything like this is always slow and obscures the real functionality under the hood. – John Powell, Commented Feb 22, 2019 at 16:46
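The COPY approach the comment describes can be sketched in Python with psycopg2's `copy_expert`, which streams a tab-separated buffer into a table in a single command instead of issuing row-by-row inserts. This is only an illustrative sketch: the table name `points_xy`, the columns `x`, `y`, `geom`, and the SRID 4326 are assumptions, not anything stated in the question, and the connection details are yours to fill in.

```python
import io


def points_to_copy_buffer(points):
    """Serialize (x, y) tuples into a tab-separated in-memory buffer
    in the format PostgreSQL's COPY ... FROM STDIN expects."""
    buf = io.StringIO()
    for x, y in points:
        buf.write(f"{x}\t{y}\n")
    buf.seek(0)
    return buf


def bulk_load(conn, points, table="points_xy"):
    """Load all points in one COPY, then rebuild the geometry from x/y.

    `conn` is an open psycopg2 connection; the target table and its
    columns (x, y, geom) and the SRID are hypothetical examples.
    """
    buf = points_to_copy_buffer(points)
    with conn.cursor() as cur:
        # One COPY statement instead of thousands of INSERTs
        cur.copy_expert(f"COPY {table} (x, y) FROM STDIN", buf)
        # Recreate the geometry after import, as suggested above
        cur.execute(
            f"UPDATE {table} "
            f"SET geom = ST_SetSRID(ST_MakePoint(x, y), 4326)"
        )
    conn.commit()
```

Because COPY bypasses per-row statement overhead and everything runs in one transaction, 50,000 points typically load in well under a second on local hardware.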