Our project needs to load 100k+ rows into a Postgres table. We have a large .csv file containing all of these rows, and we need to load the data efficiently and quickly, ideally in batches so that any batch that fails to load can be revisited, corrected, and loaded again.
Our project is a Java application, and we initially considered plain JDBC, but reading each row and committing it to the database individually slows the application down.
Please suggest an approach to get this job done.
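For context, here is a rough sketch of the batched JDBC approach we had in mind; the connection URL, table name, and CSV layout are just placeholders, not our real schema:

```java
// Sketch of "load in batches": read the CSV, add rows to a PreparedStatement
// batch, and commit every BATCH_SIZE rows so a failed batch can be corrected
// and retried. Table, column names, and CSV parsing are placeholders.
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchCsvLoader {
    private static final int BATCH_SIZE = 1000;

    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost:5432/mydb", "user", "password");
             BufferedReader reader = new BufferedReader(new FileReader("data.csv"));
             PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO my_table (col_a, col_b) VALUES (?, ?)")) {

            conn.setAutoCommit(false);              // commit per batch, not per row
            String line;
            int count = 0;
            while ((line = reader.readLine()) != null) {
                String[] fields = line.split(",");  // naive CSV parsing for illustration
                ps.setString(1, fields[0]);
                ps.setString(2, fields[1]);
                ps.addBatch();
                if (++count % BATCH_SIZE == 0) {
                    ps.executeBatch();
                    conn.commit();                  // one commit per batch
                }
            }
            ps.executeBatch();                      // flush the final partial batch
            conn.commit();
        }
    }
}
```

This is still noticeably slower than a bulk load, which is what prompted the question.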
1 Answer
You can use the COPY command for this. See:
http://www.postgresql.org/docs/9.3/static/sql-copy.html
https://stackoverflow.com/questions/2987433/how-to-import-csv-file-data-into-a-postgres-table
Use COPY directly, or PgJDBC's CopyManager interface to it (see the sketch below). You could also use pg_loader or pg_bulkload, but a couple of hundred thousand records isn't much and shouldn't need that. Did you search before posting? This is asked all the time.
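To drive COPY from the Java side, a minimal sketch using PgJDBC's CopyManager might look like this; the connection URL, table name, and CSV options are assumptions, so adjust them to your schema and file:

```java
// Stream the CSV through the COPY protocol via PgJDBC's CopyManager,
// avoiding per-row INSERT and commit overhead.
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import org.postgresql.PGConnection;
import org.postgresql.copy.CopyManager;

public class CopyCsvLoader {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost:5432/mydb", "user", "password");
             BufferedReader reader = new BufferedReader(new FileReader("data.csv"))) {

            // Obtain the COPY API from the underlying PostgreSQL connection.
            CopyManager copyManager = conn.unwrap(PGConnection.class).getCopyAPI();

            // COPY ... FROM STDIN reads the rows we stream from the Reader.
            long rowsCopied = copyManager.copyIn(
                    "COPY my_table FROM STDIN WITH (FORMAT csv)", reader);

            System.out.println("Copied " + rowsCopied + " rows");
        }
    }
}
```

If you still want batch-level error handling, you can split the CSV into chunks and run one COPY per chunk, retrying only the chunks that fail.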