
I've got a Django setup using Django 1.6.7 and Postgres 9.3 on Ubuntu 14.04 LTS.

At any given time, the site has roughly 250 simultaneous connections to the PostgreSQL database. The database server is a quad-core Xeon E5-2670 at 2.5 GHz with 16 GB of RAM, and its load average throughout the day is around 20 to 30.

Occasionally I get emails from Sentry about connections to the database timing out, and I figure enabling some sort of connection pooling will help mitigate this issue, as well as lower the load on the database a bit.

Since we are using Django 1.6, we have the built-in persistent connections available to us. However, when I set CONN_MAX_AGE to 10 or 60 seconds, the number of simultaneous connections almost immediately jumps to the maximum allowed setting (about double what we usually see), and connections start getting rejected.
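For reference, CONN_MAX_AGE is set per-database in the DATABASES setting. A minimal sketch (database name, credentials, and host are placeholders, not our real values):

```python
# settings.py fragment (sketch; NAME/USER/PASSWORD/HOST are placeholders)
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql_psycopg2",
        "NAME": "mydb",
        "USER": "myuser",
        "PASSWORD": "secret",
        "HOST": "127.0.0.1",
        "PORT": "5432",
        # Keep each connection open for up to 60 seconds instead of
        # closing it at the end of every request. 0 means close after
        # each request; None means keep connections open indefinitely.
        "CONN_MAX_AGE": 60,
    }
}
```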

So it appears that, for whatever reason, the connections ARE persisting, but they are NOT being reused.

What could be the cause of this?

PS. We are also using gunicorn with --worker-class=eventlet. Perhaps this is the source of our woes?
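For context, our gunicorn setup is along these lines (a sketch; the bind address, worker count, and connection limit are illustrative, not our exact values):

```python
# gunicorn_conf.py (sketch; values are illustrative)
bind = "0.0.0.0:8000"
workers = 4
# Each eventlet greenlet handles a request concurrently; with
# persistent connections enabled, each greenlet can end up holding
# its own database connection.
worker_class = "eventlet"
worker_connections = 1000
```

Started with something like `gunicorn -c gunicorn_conf.py myproject.wsgi`.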

2 Answers


Doing some more experimenting, I found that the cause of our problem was indeed gunicorn's eventlet worker class. Each microthread made its own persistent connection, and there was no way to reuse any of them.

Disabling eventlet has made the load on our web servers go up (though not by much), but the Postgres load average is now down to around 3, from 30.

  • You've just saved us a ton of time! We observe exactly the same behaviour, and we are using eventlet. We will try switching to connection pooling and see how it works.
    – silentser
    Commented Oct 18, 2014 at 23:19
  • Update: pooling database connections with pgBouncer seems to have solved the problem (we are still using eventlet).
    – silentser
    Commented Nov 13, 2014 at 21:07
  • Apparently there's also psycogreen: pypi.python.org/pypi/psycogreen/1.0 (I haven't tried it; once I set CONN_MAX_AGE to zero, it takes our system 20 ms to make a DB connection, so we simply don't need pooling).
    – Darren
    Commented Jun 30, 2016 at 10:37
  • It took me some time googling to turn up this answer to the exact same problem we were having.
    – Alper
    Commented Aug 5, 2019 at 10:15
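The pgBouncer approach mentioned in the comments can be sketched as follows: Django disables its own connection persistence (CONN_MAX_AGE = 0) and connects to pgBouncer rather than to Postgres directly, letting pgBouncer do the pooling. This is a sketch, not a tested configuration; the database name, credentials, and pgBouncer's default port 6432 are assumptions:

```python
# settings.py fragment (sketch; NAME/USER/PASSWORD are placeholders,
# and HOST/PORT assume a local pgBouncer on its default port 6432)
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql_psycopg2",
        "NAME": "mydb",
        "USER": "myuser",
        "PASSWORD": "secret",
        "HOST": "127.0.0.1",
        "PORT": "6432",     # pgBouncer, not Postgres directly
        "CONN_MAX_AGE": 0,  # close after each request; pgBouncer pools
    }
}
```

With this arrangement each greenlet still opens and closes its "connection" per request, but that is cheap because it only reaches the local pooler, which hands back an already-open server connection.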

Can you help me with some doubts, as I am facing the same issue?

  1. Are you suggesting that CONN_MAX_AGE does not work if we are using gunicorn with --worker-class=eventlet?
  2. Is there any solution to it? I don't want to open and close a connection on each request; is there a way to reuse connections (can we use a singleton connection in Django)?
  • I think you will need to ask a new question, otherwise no one will see this. If you want to use greenlets, I think you have to set CONN_MAX_AGE to 0 and use an external pooling solution like pgbouncer. See stackoverflow.com/a/77654006/238849
    – synic
    Commented Feb 14 at 22:40
  • Thanks @synic. Do you have any idea whether, if I use gthread instead of gevent, CONN_MAX_AGE will make sense and the connection will be reused and closed properly in each thread? Commented Feb 15 at 5:55
