
I'm having problems exporting my PostgreSQL output from the shell to a CSV file.
My SQL script is called script.sql.

I typed the following command in my shell:

psql congress -af script.sql &> filename.csv

But when I opened filename.csv, the values of all the columns were squeezed into one column when viewed in Excel (see the attached screenshot).

Then I tried another way. I edited my script.sql to be:

Copy (Select * From ...) To '/tmp/filename.csv' With CSV;

Then I ran the following command in psql while connected to the database:

\i script.sql

The output is:

COPY 162

Well, my output query has 162 rows.

So the 162 rows of my output table have been copied in the shell. How can I paste or move them to a csv file?

Or, if I'm going to use the filename.csv (screenshot is attached), how can I fix the format of that csv/Excel file?

Screenshot of filename.csv


4 Answers


Modern syntax:

COPY (SELECT * FROM ...) TO '/tmp/filename.csv' (FORMAT csv);

So the 162 rows of my output table have been copied in the shell. How can I paste or move them to a csv file?

The result is the CSV file. Open it with any spreadsheet program using matching delimiters. The manual:

The default is a tab character in text format, a comma in CSV format

The psql meta-command \copy is a wrapper around the SQL COPY command. It writes and reads files local to the client (while COPY uses files local to the server) and does not require superuser privileges.
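A minimal client-side export with \copy might look like this (some_table and the output filename are placeholders; substitute your own query and path):

```sql
-- Run inside psql; the file is written on the *client* machine,
-- so the path does not need to be writable by the server process.
\copy (SELECT * FROM some_table) TO 'filename.csv' WITH (FORMAT csv, HEADER)
```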


  • COPY requires you to be a superuser. \copy can be used by regular users and seemed to have the same syntax
    – Patrick
    Commented Jan 11, 2017 at 19:19
  • Will this ignore headers in the output by default?
    – RBB
    Commented Jun 10, 2018 at 11:16
  • 9
    @Spike yes. If you want the header, just use (format CSV, HEADER)
    – HelLViS69
    Commented Jul 20, 2018 at 9:07

Most previous answers are correct; here are more details, for Linux only.

1. Create a file, e.g. my_query.sql:

\COPY (select * from some_table) TO '/tmp/some_file.csv' (format CSV);

2. Run the command below after updating the connection variables for your DB and the file path.

psql "host=YOUR_HOST port=YOUR_PORT dbname=YOUR_DB user=YOUR_USER password=YOUR_PASS" -af my_query.sql

Tip: be cautious of the free disk space you have available; if you do a select * from your_table on a large table, you might run out of space.


First, put your connection info into ~/.pgpass:

echo 'ip:port:dbname:user:pass' >> ~/.pgpass
chmod 0600 ~/.pgpass

Then run:

psql -h serverip -U userid dbname -af test.sql | gzip > result.txt.gz

Going off on a bit of a tangent, there is another way too.

I use the following in a Windows batch script:

psql.exe -U %cUser% -h %cHost% -p %cPort% -d %cDB% -t -o "%dumpDir%\tables.txt" -A -c "SELECT table_schema, table_name   FROM information_schema.tables WHERE table_schema = '%qSchema%';"

The trick is to remember the -A option. It suppresses whitespace padding of the data.

I do this to avoid permission errors from the COPY command shown above, because the account running postgres does not have the same permissions as the account running my scheduled batch file.

This gets me a list of tables in a given schema, with results like:

myschema|mytable1

myschema|mytable2

I then use the FOR batch command to process each line. If you REALLY wanted a CSV file, all you would need to do is this:

ECHO table_schema,table_name > %dumpDir%\tables.csv
FOR /F "tokens=1,2 delims=|" %%p in (%dumpDir%\tables.txt) DO echo %%p,%%q >> %dumpDir%\tables.csv

Probably not the most efficient of mechanisms, but works fine on small numbers of output rows.
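On Linux, the same pipe-to-CSV conversion can be sketched with awk instead of a batch FOR loop. The printf below just simulates psql's -A output using the sample rows above; in practice you would pipe psql itself into awk:

```shell
# Rewrite pipe-separated fields as comma-separated ones.
# In real use: psql ... -tA -c "SELECT ..." | awk -F'|' ...
printf 'myschema|mytable1\nmyschema|mytable2\n' |
  awk -F'|' 'BEGIN { OFS = "," } { $1 = $1; print }'
```

Like the batch version, this assumes the data itself contains no | or , characters.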

  • I like psql -c 'select * from table limit 5' -tAF, which automatically produces csv.
    – benjimin
    Commented Aug 21, 2017 at 4:31
  • Where are you using the password?
    – RU Ahmed
    Commented Jan 24, 2021 at 7:20
  • Unless you have | in your data in which case good luck
    – boatcoder
    Commented Mar 23, 2022 at 17:08
