I have multiple CSV files. They are log files created by running a Windows batch file. The CSV files are located on a server, and I want to copy all of them from the server into a particular table (say import_error_table, which I have already created in PostgreSQL).
How can I accomplish this using PostgreSQL?
I tried running the Windows batch file shown below.
[screenshot of the batch file]
In the above snapshot there is an exe file that converts a shapefile to pgsql. I would like to know whether there is a specific exe for converting CSV to pgsql.
I ran the following code as a batch file:
for %%f in (*.csv) do \\192.158.5.170\working\PostGIS\psql.exe -p -k -s 32643 %%f CSV_Logs.%%~nf > prep_%%~nf.sql
set PGPASSWORD=rpc123
for %%f in (prep_*.sql) do \\192.158.5.170\working\PostGIS\psql -h 192.158.5.170 -p 5432 -d NPCL_test -U postgres -f %%f
The result is that the prep.sql files are created, but they are 0 KB in size and no schema is created in PostgreSQL.
1 Answer
Here are two examples of how this can be done:
Just put the batch file in the same folder as the CSV files.
To copy from the local machine to a local database:
for %%f in (*.csv) do psql -d your_database_name -h localhost -U postgres -p 5432 -c "\COPY public.yourtablename FROM '%%~dpnxf' DELIMITER ',' CSV;"
pause
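If the CSV log files start with a header row, COPY can be told to skip it via the HEADER option. This is only a sketch of a variant of the command above, using the same placeholder database and table names:
rem HEADER makes COPY ignore the first line of each CSV file
for %%f in (*.csv) do psql -d your_database_name -h localhost -U postgres -p 5432 -c "\COPY public.yourtablename FROM '%%~dpnxf' DELIMITER ',' CSV HEADER;"
pause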
You can also run the batch file on your computer and send the content of the CSV files to the remote database. Just put the CSV files on your local computer and the batch file in the same folder. Content of the batch file:
@echo off
setlocal
set PGPASSWORD=yourpassword
for %%f in (*.csv) do psql -d your_database -h your_server_ip -U postgres -p 5432 -c "\COPY public.yourtablename FROM '%%~dpnxf' DELIMITER ',' CSV;"
pause
endlocal
I just tested it on my machine and both variants work: local PostgreSQL on Windows 7 and remote PostgreSQL on Ubuntu Linux.
PS: Using \COPY instead of plain COPY is important if you want to read the local CSVs and copy them to the remote database: \COPY is executed by psql and reads files on the client machine, while COPY is executed by the server and can only read files that are visible to the database server.
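For completeness: if the CSV files happen to live on the database server itself, a server-side COPY can read them directly (this needs superuser rights). A minimal sketch, assuming a hypothetical path on the server and the import_error_table from the question:
-- server-side COPY: the file path must be readable by the PostgreSQL server process
COPY public.import_error_table FROM 'C:\csv_logs\log1.csv' DELIMITER ',' CSV;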
Great! My problem is solved; your code works. – User123, Feb 1, 2015
I have edited my post, since there is a new problem in running the batch file. – User123, Feb 2, 2015
You don't have to create SQL files. All you need to do is adapt my batch file template so it fits your server IP, database name and so on. Don't insert any other commands. – Thomas B, Feb 2, 2015
Before loading my CSV files into a table, I need to create a table first, right? That's what the above code snippet (prep.sql) does. – User123, Feb 2, 2015
Well, in your question you said "to a particular table (say import_error_table, which I had already created in PostgreSQL)". You can't create a table from a CSV. You have to create the table and its columns manually, because a CSV does not contain information about which data type should be used for each column. – Thomas B, Feb 2, 2015
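For example, such a table could be created manually before running the batch file. This is only a sketch; the column names and types below are assumptions, since the actual layout of the log CSVs is not shown in the question:
-- hypothetical columns: adjust names and data types to match the CSV files
CREATE TABLE public.import_error_table (
    log_time   timestamp,
    file_name  text,
    error_text text
);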
Alternatively, run
psql -c "copy some_table from some_csv csv" db_name
in the shell.