I'm trying to import an SQL file of around 8 GB using these commands:
sudo -u postgres psql yourdb -f my_file.sql
sudo -u postgres psql yourdb < my_file.sql
sudo -u postgres cat my_file.sql | psql yourdb
sudo -u postgres psql
\i my_file.sql
but all of them return an out of memory error message.
I also tried the configuration given here.
1 Answer
psql generally doesn't need much memory when replaying large SQL files, since it doesn't buffer the whole file: it reads only one query at a time, or uses a COPY stream.
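For context, a plain-text dump from pg_dump typically loads table data through a COPY ... FROM stdin block, which psql streams row by row rather than holding in memory. A minimal sketch of what such a section looks like (the table and column names are made up; values are tab-separated, and \. ends the data):

COPY public.my_table (id, payload) FROM stdin;
1	first row
2	second row
\.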
The main situation where it may run out of memory is not when importing, but when SELECT'ing a large result set, especially on 32-bit systems. This is generally solved by setting FETCH_COUNT.
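FETCH_COUNT can be set inside psql or on the command line; with it set, psql retrieves results through a cursor in batches of that many rows instead of collecting the entire result set first. The value 1000 below is an arbitrary batch size:

\set FETCH_COUNT 1000

or, when launching psql from the shell:

psql -v FETCH_COUNT=1000 yourdb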
On import, the dump would need to contain unusually large rows to cause client-side memory issues. The maximum size of a column is 1 GB, so it's theoretically possible for that to cause trouble on import, if the database contains that kind of content.
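If you suspect oversized values, PostgreSQL's pg_column_size() can report the largest stored value in a column; my_table and big_column here are placeholders for your own names:

SELECT max(pg_column_size(big_column)) FROM my_table;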
If the system doesn't have enough RAM, maybe you could just add swap space.
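On Linux, a swap file can be added along these lines (the 4G size and the /swapfile path are arbitrary choices; run as root, and note that fallocate may not work on all filesystems):

fallocate -l 4G /swapfile
chmod 600 /swapfile
mkswap /swapfile
swapon /swapfile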
Thanks for your answer, I'll try to add additional swap space. – Said Kaldybaev, Oct 24, 2014 at 9:57
Is it psql that's throwing the out of memory error, or is it the backend server that has the problem?