I have a text file that contains data in the following format:
char char char char
#1 a b c
char char char dateTime
#2 d e 20-12-2012
#3 g h 8-12-2013
I have created two tables in PostgreSQL: one with the datatype structure (varchar, varchar, varchar, varchar), for the #1 a b c record.
The second table, which is supposed to hold the rest of the data, has the datatype structure (varchar, varchar, varchar, dateTime). What I'd like to do now is load the data given above into my PostgreSQL database. However, since I am new to PostgreSQL, I am not sure how to load data from a text file into the database. Could anybody please give me suggestions on how to go about doing this?
You'll need a DB API adapter for PostgreSQL. Psycopg2 is a popular choice for this. The documentation there should get you going. – Pedro Romano, Apr 29, 2013 at 7:48
1 Answer
Use Python's csv module or a similar tool to read the file in as rows of data.
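Here is a minimal sketch of that reading step. It assumes the file is called data.txt, that fields are separated by spaces as in the sample above, and that rows whose last field looks like a DD-MM-YYYY date belong to the second table; all of those are assumptions you'd adjust to your real layout.

    import csv
    import re

    # Rows whose last field looks like a DD-MM-YYYY date go to the second table.
    DATE_RE = re.compile(r"^\d{1,2}-\d{1,2}-\d{4}$")

    table1_rows = []  # e.g. ('#1', 'a', 'b', 'c')
    table2_rows = []  # e.g. ('#2', 'd', 'e', '20-12-2012')

    with open("data.txt", newline="") as f:
        for row in csv.reader(f, delimiter=" ", skipinitialspace=True):
            if not row or row[0] == "char":
                continue  # skip blank lines and the "char char char ..." header lines
            if DATE_RE.match(row[-1]):
                table2_rows.append(tuple(row))
            else:
                table1_rows.append(tuple(row))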
Once you have the data as rows (lists or tuples) in Python, establish a connection to PostgreSQL using a DB-API driver such as psycopg2, prepare an INSERT statement, and execute it once for each data row.
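A sketch of that loading step with psycopg2 follows. The connection parameters, table names (records_plain and records_dated) and column names (c1, c2, c3, c4, dt) are placeholders for whatever you actually created:

    import psycopg2

    # Placeholder connection details; replace with your own.
    conn = psycopg2.connect(dbname="mydb", user="postgres",
                            password="secret", host="localhost")

    with conn:  # commits the transaction if everything succeeds
        with conn.cursor() as cur:
            for row in table1_rows:
                cur.execute(
                    "INSERT INTO records_plain (c1, c2, c3, c4) "
                    "VALUES (%s, %s, %s, %s)",
                    row,
                )
            for row in table2_rows:
                # to_date() converts the DD-MM-YYYY text into a proper date value
                cur.execute(
                    "INSERT INTO records_dated (c1, c2, c3, dt) "
                    "VALUES (%s, %s, %s, to_date(%s, 'DD-MM-YYYY'))",
                    row,
                )

    conn.close()

For large files, psycopg2's copy_from() (which uses PostgreSQL's COPY) is much faster than row-by-row INSERTs, but the loop above is the simplest place to start.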
There's lots more info all over the 'net; search for "psycopg tutorial" to start. Here's one article, though I haven't done more than skim it.