I have 100 CSV files named sequentially 1.csv, 2.csv, ..., 100.csv.
I load each CSV file with this command:
COPY location FROM 'C:\Program Files\PostgreSQL9円.5\data1円.csv' DELIMITER ',';
I would like to execute the COPY with a single script that covers all the files (1.csv, 2.csv, ..., 100.csv).
I use pgAdmin III on MS Windows.
5 Answers
You can use pgScript:
DECLARE @i, @filename;
SET @i = 1;
WHILE @i <= 100
BEGIN
    SET @filename = 'C:\\Program Files\\PostgreSQL\\9.5\\data\\' + CAST(@i AS STRING) + '.csv';
    COPY location FROM '@filename' DELIMITER ',';
SET @i = @i + 1;
END
NOTE: You shouldn't keep your CSV files under the PostgreSQL data directory. That directory is for the database engine to use, not for users; accidentally overwriting or deleting any of PostgreSQL's files could render your database unusable.
You should keep the files somewhere under C:\Users\user\Documents\CSV\...
or a similar location in "userland", and give the PostgreSQL process read permission on that directory. For a standard install of this version, you normally need to grant read permission to the LocalSystem account (some installations use the postgres account).
NOTE 2: pgScript never gained much traction, and it has not been ported to pgAdmin 4.
The kosher way to do this is to use your shell; the whole pgScript thing always seemed like too much functionality to bake into pgAdmin, and it appears to have been yanked from pgAdmin 4 anyway, so buyer beware.
for i in $(seq 1 100); do
    psql -d database -c 'COPY ...'
done
If you're on Windows you can either
- do the analog in PowerShell,
- install MinGW, or
- install Bash on Windows.
That said, I agree with @joanolo: it's comically bad to put your CSVs in the data directory. Under normal circumstances the CSVs never go to the server at all; they stay on the client, and then you can just use psql's \copy. Granted, sometimes the locked-down FTP server is the SQL server and server-side loading makes sense, but generally you should be using \copy, not COPY, and doing this kind of looping in the shell.
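For reference, a client-side load of a single file looks like this (run inside psql, on one line; the path here is illustrative):

\copy location FROM 'C:/Users/user/Documents/CSV/1.csv' WITH (FORMAT csv)

Unlike COPY, \copy reads the file on the machine where psql runs, so it needs no superuser privileges and no server-side file access.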
PostgreSQL 9.0 added DO, which provides for scripting on the server. If you really want a single script, you can write something that runs natively on your server; see the pg_ls_dir function for a way to loop over a directory, as in the sketch below.
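A minimal sketch of that idea, assuming the CSVs live in a directory the server can read (the path is illustrative; pg_ls_dir and server-side COPY both require superuser-level privileges):

DO $$
DECLARE
    f text;
BEGIN
    -- pg_ls_dir() lists the file names in a server-side directory
    FOR f IN SELECT pg_ls_dir('C:/Users/user/Documents/CSV') LOOP
        IF f LIKE '%.csv' THEN
            EXECUTE format('COPY location FROM %L WITH (FORMAT csv)',
                           'C:/Users/user/Documents/CSV/' || f);
        END IF;
    END LOOP;
END $$;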
You should also be able to do this with a plpgsql anonymous block:
do $block$
declare
    r record;
begin
    for r in select generate_series(1,100) n
    loop
        -- parameters (1,ドル ...) can't be used in utility commands such as COPY,
        -- so build the file name into the statement with format() instead
        execute format(
            $$COPY location FROM %L DELIMITER ','$$,
            'C:\Program Files\PostgreSQL9円.5\data\' || r.n || '.csv'
        );
    end loop;
end$block$;
I haven't tested this with COPY, but here is a dbfiddle with the loop working.
DO $$
DECLARE
r RECORD;
filePath TEXT;
BEGIN
FOR r IN SELECT generate_series(1, 10) AS n
LOOP
filePath := format(
'E:\path_to_your_csv_folder\%s.csv',
r.n
);
EXECUTE format(
'COPY ims FROM %L WITH (FORMAT CSV, HEADER TRUE)',
filePath
);
END LOOP;
END $$;
This is tested and working fine.
DuckDB can also be used as an efficient ETL tool to feed CSV files into a PostgreSQL database, in just a few lines:
INSTALL postgres;  -- DuckDB's postgres extension; recent versions autoload it on ATTACH
LOAD postgres;
ATTACH 'dbname=mydatabase user=username' AS pg_db (TYPE postgres);
COPY pg_db.location FROM 'C:\...\*.csv';