
Our project needs to load 100k+ rows from a huge CSV file into a Postgres table. We need to load this data efficiently and quickly, ideally in batches, so that any batch that fails to load can be revisited, corrected, and loaded again.

Our project is a Java application, and we initially thought of using JDBC, but reading each row and committing it to the database individually slows the application down.
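To make the problem concrete, this is roughly the batched JDBC pattern we have in mind; the connection string, the table name my_table, and the two-column layout are placeholders, not our real schema:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class CsvBatchLoader {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details and table layout -- adjust to the real schema.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password")) {
            conn.setAutoCommit(false); // commit once per batch instead of once per row

            String sql = "INSERT INTO my_table (col1, col2) VALUES (?, ?)";
            int batchSize = 1000;

            try (PreparedStatement ps = conn.prepareStatement(sql);
                 BufferedReader reader = new BufferedReader(new FileReader("data.csv"))) {
                String line;
                int count = 0;
                while ((line = reader.readLine()) != null) {
                    String[] fields = line.split(","); // naive split; a CSV library is needed for quoted fields
                    ps.setString(1, fields[0]);
                    ps.setString(2, fields[1]);
                    ps.addBatch();
                    if (++count % batchSize == 0) {
                        ps.executeBatch();
                        conn.commit(); // a failed batch can be corrected and reloaded later
                    }
                }
                ps.executeBatch(); // flush the remaining rows
                conn.commit();
            }
        }
    }
}
```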

Please suggest an approach to get this job done.

Comment: Use COPY or the PgJDBC CopyManager interface to it, or pg_loader / pg_bulkload, though a couple of hundred thousand records isn't much and shouldn't need those. Did you search before posting? This is asked all the time.
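For reference, a minimal sketch of the CopyManager approach suggested in the comment; the connection string, the table name my_table, and the column names are placeholders:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;

import org.postgresql.PGConnection;
import org.postgresql.copy.CopyManager;

public class CsvCopyLoader {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details and table/column names -- adjust to the real schema.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password");
             BufferedReader reader = new BufferedReader(new FileReader("data.csv"))) {

            CopyManager copyManager = conn.unwrap(PGConnection.class).getCopyAPI();

            // COPY streams the whole file to the server in a single command,
            // avoiding the per-row round trips and commits that slow down plain JDBC inserts.
            long rowsInserted = copyManager.copyIn(
                    "COPY my_table (col1, col2) FROM STDIN WITH (FORMAT csv, HEADER true)",
                    reader);

            System.out.println("Rows inserted: " + rowsInserted);
        }
    }
}
```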
