The context: I am writing a script that sends load/insert queries to a PostgreSQL server to insert rows into an existing table. I have read many resources on which approach to choose between COPY and a multi-valued INSERT query. I want to know the maximum query length accepted for a PostgreSQL multi-valued INSERT query. Since I am building the SQL query dynamically before sending it to the server, I cannot know beforehand how many rows will follow the VALUES keyword.
Another question: if I store my table in a file and use COPY to load the file's contents, is there any limit on the file, i.e., a limit in terms of lines, bytes, etc.?
I tried both approaches, the multi-valued INSERT query and the COPY query, and both work for me, but I need to know more about the limits of each. I searched and found that the limit for a simple query is 1 GB. Does this limit apply to a multi-valued INSERT?
I couldn't find any documented limits for the COPY query, so I am looking for that information too.
Thank you in advance!
EDIT:
What I meant by a "multi-valued INSERT" query is:
INSERT INTO "TABLE" VALUES (column1, column2), (column1, column2), (column1, column2), ... ;
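For illustration, here is a minimal sketch of how such a statement can be built and executed from a script, assuming Python with psycopg2 (the table name, column names, and rows are hypothetical placeholders, not from the question itself):

    import psycopg2
    from psycopg2.extras import execute_values

    rows = [(1, "alice"), (2, "bob"), (3, "carol")]  # length unknown in advance

    conn = psycopg2.connect("dbname=test user=postgres")
    with conn, conn.cursor() as cur:
        # execute_values expands the single %s placeholder into a VALUES list
        # and batches the rows (page_size rows per statement), so no single
        # statement grows toward the server's per-message size limit.
        execute_values(
            cur,
            "INSERT INTO my_table (id, name) VALUES %s",
            rows,
            page_size=1000,
        )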
The maximum message length that the PostgreSQL frontend/backend protocol supports is 1 GB. As the documentation says:
The first byte of a message identifies the message type, and the next four bytes specify the length of the rest of the message (this length count includes itself, but not the message-type byte).
But you are likely to get into trouble before you reach that limit.
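To make that framing concrete, here is a small sketch (in Python; my illustration, not part of the answer) of how a simple Query message is laid out on the wire. The 4-byte length field counts itself plus the payload, which is what bounds a single statement:

    import struct

    def simple_query_message(sql: str) -> bytes:
        # Wire layout: 1-byte message type ('Q' = simple Query), then a
        # 4-byte big-endian length counting itself and the payload (but
        # not the type byte), then the SQL text as a NUL-terminated string.
        payload = sql.encode("utf-8") + b"\x00"
        return b"Q" + struct.pack("!I", 4 + len(payload)) + payload

    msg = simple_query_message("INSERT INTO my_table VALUES (1, 'a');")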
If you intend to load lots of data into PostgreSQL, don't even consider using a multi-line INSERT statement. The way to go for that is COPY. With COPY ... FROM STDIN, you can load arbitrarily much data in a single statement, since you can send the data over the network in chunks. With COPY ... FROM 'file', there is no limit on the size of the file.
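As a concrete illustration of the chunked approach, here is a minimal sketch that streams a file through COPY ... FROM STDIN using psycopg2's copy_expert (the table name and file path are hypothetical):

    import psycopg2

    conn = psycopg2.connect("dbname=test user=postgres")
    with conn, conn.cursor() as cur, open("rows.csv") as f:
        # copy_expert streams the file to the backend in chunks over the
        # COPY sub-protocol, so the total amount of data is not bounded
        # by the 1 GB limit on a single protocol message.
        cur.copy_expert(
            "COPY my_table (id, name) FROM STDIN WITH (FORMAT csv)",
            f,
        )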