
I am exporting layers to a PostgreSQL 11 database with the PostGIS extension. The export takes a while for big datasets (say 50,000 points). As I can see in the pgAdmin dashboard, only batches of 200 features are pushed to the database per transaction. How can I increase the transaction size in QGIS so that it sends, say, 1,000 or more inserts per transaction? I have another Python-based input script which pushes about 1,800 features per transaction.

underdark
asked Feb 22, 2019 at 14:47
    If you are happy using Python, just use psycopg2 and prepared transactions. Or, even better, if it is just a point dataset, use the Postgres COPY command. You should be able to import 50,000 points in less than a second, though you might have to recreate the geometry from x, y after the import. Using GUIs for anything like this is always slow and obscures the real functionality under the hood. Commented Feb 22, 2019 at 16:46
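
For reference, here is a minimal psycopg2 sketch of the COPY approach described in the comment above. The connection details, the table "points" and its columns are placeholders, and the geometry is rebuilt from x/y after the bulk load:

    # Minimal sketch: bulk-load points with COPY, then build geometries.
    # Assumes a table points(id int, x float, y float, geom geometry(Point, 4326))
    # already exists; connection parameters are placeholders.
    import io
    import psycopg2

    conn = psycopg2.connect(host="localhost", dbname="gisdb",
                            user="gisuser", password="secret")

    # Build a tab-separated in-memory buffer of 50,000 sample rows.
    buf = io.StringIO()
    for i in range(50000):
        buf.write(f"{i}\t{i * 0.001}\t{i * 0.002}\n")
    buf.seek(0)

    with conn, conn.cursor() as cur:
        # COPY streams all rows in a single command instead of row-by-row INSERTs.
        cur.copy_from(buf, "points", columns=("id", "x", "y"))
        # Recreate the geometry from x/y after the bulk load.
        cur.execute(
            "UPDATE points "
            "SET geom = ST_SetSRID(ST_MakePoint(x, y), 4326) "
            "WHERE geom IS NULL"
        )
    conn.close()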

1 Answer


Looking at the code, 200 features is a hard-coded constant in the layer export function. It seems to be applied to any data provider, not just Postgres.

You can recompile QGIS with a larger value, or use an alternative tool such as ogr2ogr.
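
For example, a sketch of an ogr2ogr call (connection parameters, input file and layer name are placeholders): the -gt option groups that many features per transaction, and PG_USE_COPY makes the PostgreSQL driver use COPY instead of INSERTs:

    ogr2ogr -f PostgreSQL \
      PG:"host=localhost dbname=gisdb user=gisuser password=secret" \
      points.gpkg -nln points -gt 65536 --config PG_USE_COPY YES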

answered Feb 22, 2019 at 16:16
