I need to see the queries submitted to a PostgreSQL server. Normally I would use SQL Server Profiler for this in SQL Server land, but I have yet to find how to do this in PostgreSQL. There appear to be quite a few pay-for tools; I am hoping there is an open-source variant.
6 Answers
You can use the log_statement config setting to get a list of all the queries sent to a server:
https://www.postgresql.org/docs/current/static/runtime-config-logging.html#guc-log-statement
Just set that and the log file path, and you'll have the list. You can also configure it to log only long-running queries.
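For example, a minimal postgresql.conf sketch (the setting names are from the linked docs; the values here are illustrative):

```
# postgresql.conf -- illustrative values; reload/restart the server after editing
log_statement = 'all'            # 'none' | 'ddl' | 'mod' | 'all'
logging_collector = on           # capture log output into files
log_directory = 'log'            # relative to the data directory
# Alternatively, log only statements slower than a threshold (in milliseconds):
# log_min_duration_statement = 500
```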
You can then take those queries and run EXPLAIN on them to find out what's going on with them.
https://www.postgresql.org/docs/9.2/static/using-explain.html
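For example (the table and predicate here are hypothetical):

```sql
-- EXPLAIN shows the planner's chosen plan without running the query;
-- EXPLAIN ANALYZE actually executes it and adds real timings and row counts.
EXPLAIN SELECT * FROM users WHERE id = 123;
EXPLAIN ANALYZE SELECT * FROM users WHERE id = 123;
```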
Comment: Well, it's hard to call .csv log files an "equivalent of SQL Server profiler"... Commented Nov 6, 2019 at 8:40
Adding to Joshua's answer: to see which queries are currently running, simply issue the following statement at any time (e.g. in pgAdmin III's query window):
SELECT datname,procpid,current_query FROM pg_stat_activity;
Sample output:
datname | procpid | current_query
---------------+---------+---------------
mydatabaseabc | 2587 | <IDLE>
anotherdb | 15726 | SELECT * FROM users WHERE id=123 ;
mydatabaseabc | 15851 | <IDLE>
(3 rows)
Comment: With my version of PG (9.3), I used the following query: SELECT datname, pid, usename, application_name, client_addr, query FROM pg_stat_activity; (pg_stat_activity is a system view.) Commented Aug 26, 2015 at 23:10
Comment: SELECT client_addr, state_change, query FROM pg_stat_activity; Commented May 31, 2016 at 9:37
Comment: The "query" column length is too short to display long queries. Commented May 6, 2020 at 13:34
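For reference, on PostgreSQL 9.2 and later the columns were renamed (procpid became pid, current_query became query) and idle sessions are flagged in a separate state column instead of showing <IDLE> in the query text, so the modern equivalent is something like:

```sql
-- Currently active sessions, excluding idle ones (PostgreSQL 9.2+)
SELECT pid, datname, usename, state, query
FROM pg_stat_activity
WHERE state <> 'idle';
```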
I discovered pgBadger (https://pgbadger.darold.net/) and it is a fantastic tool that has saved my life many times. Here is an example of a report. If you open it and go to the 'top' menu, you can see the slowest queries and the most time-consuming queries. You can then drill into the details and see nice graphs that show the queries by hour, and the detail button shows the SQL text in a pretty form. The tool is free and works perfectly for this.
Comment: Pretty nice tool. I used this tutorial to install it, as the official doc is pretty verbose: dhis2.org/analysing-postgresql-logs-using-pgbadger Commented Aug 27, 2015 at 3:26
Comment: Just a note that the tool is for *nix systems only, which is unfortunate for Windows users. Commented Jan 2, 2018 at 8:23
Comment: +1, as the OP asked for a tool like SQL Server Profiler, not config options for manually extracting the needed performance info. – EAmez, Commented Jul 15, 2019 at 6:00
Easy to install and easy to use!
sudo apt install pgbadger
pgbadger /var/log/postgresql/postgresql-11-main.log
Comment: Thanks for the recommendation! It's the profiler tool I was looking for: github.com/darold/pgbadger#postgresql-configuration – Sangar82, Commented Mar 7, 2021 at 16:42
Comment: @EAmez You can run pgBadger on Windows. See github.com/darold/pgbadger/issues/480 – Flimtix, Commented Jul 13, 2022 at 14:56
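Note that pgBadger can only parse logs the server writes in a format it understands; the project's README suggests logging settings along these lines (values illustrative, check the pgBadger documentation for your version):

```
# postgresql.conf -- logging settings pgBadger can parse (illustrative)
log_min_duration_statement = 0   # log every statement together with its duration
log_line_prefix = '%t [%p]: user=%u,db=%d,app=%a,client=%h '
lc_messages = 'C'                # pgBadger expects English log messages
```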
"I need to see the queries submitted to a PostgreSQL server"
As an option, if you use pgAdmin (in my case, pgAdmin 4 v2.1), you can observe queries via the "Dashboard" tab.
Update, June 2022: answering the questions in the comments.
Question 1: My long SQL query gets truncated. Is there any workaround?
Answer:
1. Close pgAdmin.
2. Find the postgresql.conf file. On my PC it is located in c:\Program Files\PostgreSQL\13\data\postgresql.conf. If you can't find it, try this answer.
3. Open postgresql.conf and find the property track_activity_query_size. The default value is 1024; if an SQL query is longer, it will be truncated. Uncomment it and increase the limit, for example: track_activity_query_size = 32768
4. Restart the PostgreSQL service on your PC.
P.S.: Now everything is ready. From a development/debugging standpoint you won't see any difference, but for a production environment it is better to revert that property, as it might slightly decrease performance. More details here.
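After the restart, you can confirm that the new value took effect, since any configuration parameter can be inspected with SHOW:

```sql
-- Should report the value you set (e.g. 32kB) rather than the 1kB default
SHOW track_activity_query_size;
```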
Question 2: I ran my function/method that triggers an SQL query, but I still can't see it in pgAdmin; or sometimes I see it, but it runs so quickly that I can't even expand the session on the 'Dashboard' tab.
Answer: Try to debug your application and set a breakpoint right before closing the database connection. At the same time (while debugging), click the 'refresh' button of the 'Dashboard' tab in pgAdmin.
Comment: This can't show long SQL statements; the SQL gets truncated. Commented May 6, 2020 at 13:32
Comment: And it also can't show short-running queries. Is there a way to see queries that have just finished? Commented Aug 11, 2021 at 11:53
You can use the pg_stat_statements extension.
If running the database in Docker, just add this command in docker-compose.yml; otherwise, look at the installation instructions for your setup:
command: postgres -c shared_preload_libraries=pg_stat_statements -c pg_stat_statements.track=all -c max_connections=200
And then in the db run this query:
CREATE EXTENSION pg_stat_statements;
Now to see the operations that took more time run:
SELECT * FROM pg_stat_statements ORDER BY total_time/calls DESC LIMIT 10;
Or play with other queries over that view to find what you are looking for.
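Note that on PostgreSQL 13 and later the total_time column was split into total_exec_time and total_plan_time, and a mean_exec_time column is available directly, so there the query becomes something like:

```sql
-- Top 10 statements by mean execution time (PostgreSQL 13+)
SELECT query, calls, mean_exec_time
FROM pg_stat_statements
ORDER BY mean_exec_time DESC
LIMIT 10;
```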
Comment: Also, similarly: SELECT query_start, query, datname FROM pg_stat_activity WHERE datname='your database name' ORDER BY query_start DESC – cansu, Commented Mar 24, 2021 at 11:16
All those tools like pgBadger or pg_stat_statements require access to the server and/or altering the server settings or log settings, which is not such a good idea, especially when it requires a server restart, and because logging slows everything down, including production use.
In addition, extensions such as pg_stat_statements don't really show the queries, let alone in chronological order, and pg_stat_activity doesn't show you anything that isn't running right now, nor queries from users other than you.
Instead of running any of that, you can add a TCP proxy between your application and the PostgreSQL server.
The TCP proxy then reads all the SQL statements from what goes over the wire from your application to the server and outputs them to the console (or wherever). It also forwards everything to PostgreSQL and returns the answer(s) to your application.
This way, you don't need to stop/start/restart your db server, you don't need admin/root rights on the db server to change the config file, and you don't need any access to the db server at all. All you need to do is change the db connection string in your application (e.g. in your dev environment) to point to the proxy server instead of the SQL server (the proxy server then needs to point to the SQL server). Then you can see, in chronological order, what your <insert_profanity_here> application does on the database; other people's queries don't show up, which makes it even better than SQL Server Profiler. (Of course, you can also see what other people do if you put the proxy on the db server on the old db port and assign the db a new port.)
I have implemented this with pg_proxy_net (it runs on Windows, Linux and Mac and doesn't require OS dependencies, as it is a .NET-Core self-contained deployment).
That way, you get approximately "the same" as you get with SQL Server Profiler.
In fact, if you aren't disturbed by missing other people's queries, what you get with pg_proxy_net is actually better than what you get with SQL Server Profiler.
Also, on github, I have a command-line MS-SQL-Server profiler that works on Linux/Mac.
And a GUI MS-SQL-Express-Profiler for Windows.
The funny thing is, once you have written one such tool, writing some more is just a piece of cake and done in under a day.
Also, if you want to get pg_stat_statements to work, you need to alter the config file (postgresql.conf), adding tracking and preloading the library, then restart the server, and only then create the extension:
# postgresql.conf (e.g. D:\Programme\LessPortableApps\SQL_PostGreSQL\PostgreSQLPortable\Data\data\postgresql.conf)
shared_preload_libraries = 'pg_stat_statements'
pg_stat_statements.track = all
-- after the restart, in the database:
CREATE EXTENSION pg_stat_statements;
You find the documentation for the PostgreSQL protocol here:
https://www.postgresql.org/docs/current/protocol-overview.html
You can see how the data is written into the TCP-buffer by looking at the source code of a postgresql-client, e.g. FrontendMessages of Npgsql on github: https://github.com/npgsql/npgsql/blob/main/src/Npgsql/Internal/NpgsqlConnector.FrontendMessages.cs
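To illustrate the proxy idea, here is a minimal Python sketch of the parsing part (pg_proxy_net itself is .NET; this is just the idea). It assumes the stream starts after the startup handshake, where every frontend message is a 1-byte type code, a 4-byte big-endian length (which counts itself but not the type byte), then the payload; a 'Q' (simple query) payload is a NUL-terminated SQL string:

```python
# Sketch: extract simple-query ('Q') messages from the bytes a PostgreSQL
# client sends after the startup handshake. A real proxy would run this on
# the client->server half of a forwarding TCP socket pair.
import struct

def extract_queries(stream: bytes) -> list[str]:
    queries = []
    pos = 0
    while pos + 5 <= len(stream):
        msg_type = stream[pos:pos + 1]
        # 4-byte big-endian length, counting itself but not the type byte
        (length,) = struct.unpack_from("!I", stream, pos + 1)
        payload = stream[pos + 5:pos + 1 + length]
        if msg_type == b"Q":
            # 'Q' payload is a NUL-terminated SQL string
            queries.append(payload.rstrip(b"\x00").decode("utf-8"))
        pos += 1 + length
    return queries

# Example: a hand-crafted 'Q' message carrying "SELECT 1;"
sql = b"SELECT 1;\x00"
msg = b"Q" + struct.pack("!I", 4 + len(sql)) + sql
print(extract_queries(msg))  # ['SELECT 1;']
```

Extended-protocol queries (prepared statements) arrive as 'P' (Parse) messages instead, so a complete proxy would decode those too.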
Also, just in case you have a .NET application (with source code) that uses Npgsql, you might want to have a look at Npgsql.OpenTelemetry.
PS:
To configure the logs, see ChartIO Tutorial and TablePlus.
Cheers!
Happy "profiling"!