Please consider me a complete novice with psycopg2. My aim is to insert a 1D numpy array of dtype object (where the elements are all strings) into a PostgreSQL table. My main program saves the fields as strings in the numpy array, and I then want to add each element to its own column in the PostgreSQL table (or, if you prefer, the 1D array is one row). Please note that the actual array has 36 elements, so I need a method that handles all of them.
I am using cur.execute, but I believe there is some problem with the string conversion.
Array=np.empty(3,dype=object)
Array[0]='Hello'
Array[1]='Tea?'
Array[2]='Bye'
statement= "INSERT INTO testlog (field1,field2,field3) VALUES (%s)" #Etc.
cur.execute(statement,Array)
I get this error:
cur.execute(statement,Array)
TypeError: not all arguments converted during string formatting
Also tried:
cur.executemany('INSERT INTO testlog VALUES ( %s )', [[v] for v in Array ])
Thanks
1 Answer
Your statement should contain a placeholder for each value:
statement= "INSERT INTO testlog (field1,field2,field3) VALUES (%s, %s, %s)"
For example, create the table:
=# create table testlog (field1 varchar(50), field2 varchar(50), field3 varchar(50));
Then, in the Python shell (note dtype, not dype):
import numpy as np

# conn and cur are an open psycopg2 connection and cursor
Array = np.empty(3, dtype=object)
Array[0] = 'Hello'
Array[1] = 'Tea?'
Array[2] = 'Bye'

sql = "INSERT INTO testlog (field1, field2, field3) VALUES (%s, %s, %s)"
cur.execute(sql, [f for f in Array])  # one parameter per placeholder
conn.commit()
And in DB:
select * from testlog;
field1 | field2 | field3
--------+--------+--------
Hello | Tea? | Bye
(1 row)
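The same pattern scales to the 36-column case in the question: build the list of placeholders programmatically so it always matches the number of values you pass. Below is a minimal sketch, assuming the columns are named field1 through field36 (adjust to your schema) and using placeholder connection details:

import numpy as np
import psycopg2

# Assumed connection details -- replace with your own
conn = psycopg2.connect("dbname=test user=postgres")
cur = conn.cursor()

# The real array has 36 string elements; filled with dummy values here
Array = np.empty(36, dtype=object)
Array[:] = ['value %d' % i for i in range(36)]

# Assumed column names field1 .. field36 -- adjust to match your table
columns = ['field%d' % (i + 1) for i in range(len(Array))]

# One %s placeholder per column, so the parameter count always matches
placeholders = ', '.join(['%s'] * len(Array))
query = 'INSERT INTO testlog ({}) VALUES ({})'.format(', '.join(columns), placeholders)

cur.execute(query, list(Array))  # elements are passed as ordinary Python strings
conn.commit()

Only the values go through the %s placeholders; the column names are built with plain Python string formatting here, which is fine as long as they are not user-supplied.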