127

I've installed a self-signed root CA cert into Debian's /usr/share/ca-certificates/local and activated it with sudo dpkg-reconfigure ca-certificates. At this point true | gnutls-cli mysite.local is happy, and true | openssl s_client -connect mysite.local:443 is happy, but the Python 2 and Python 3 requests modules insist they are not happy with the cert.
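For reference, the failing call is essentially just this minimal reproduction (same host as above):

import requests

# Raises requests.exceptions.SSLError ("certificate verify failed")
requests.get("https://mysite.local")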

python2:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/local/lib/python2.7/site-packages/requests/api.py", line 70, in get
    return request('get', url, params=params, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/requests/api.py", line 56, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 488, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 609, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/requests/adapters.py", line 497, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: ("bad handshake: Error([('SSL routines', 'ssl3_get_server_certificate', 'certificate verify failed')],)",)

python3:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/local/bin/python3.5/site-packages/requests/api.py", line 70, in get
    return request('get', url, params=params, **kwargs)
  File "/usr/local/bin/python3.5/site-packages/requests/api.py", line 56, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/local/bin/python3.5/site-packages/requests/sessions.py", line 488, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/bin/python3.5/site-packages/requests/sessions.py", line 609, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/bin/python3.5/site-packages/requests/adapters.py", line 497, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: ("bad handshake: Error([('SSL routines', 'ssl3_get_server_certificate', 'certificate verify failed')],)",)

Why does Python ignore the system ca-certificates bundle, and how do I integrate it?

1
  • 3
    don't you just pass requests.get('https://example.com', verify='/usr/share/ca-certificates/local')?
    – user3064538
    Commented Mar 15, 2022 at 9:46

8 Answers

253

From https://stackoverflow.com/a/33717517/1695680

To make Python requests use the system ca-certificates bundle, it needs to be told to use it instead of its own embedded bundle:

export REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt
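If exporting the variable in the shell is awkward (for example inside a service), the same thing can be done from within the script; a minimal sketch, reusing the Debian bundle path and the host name from the question:

import os
import requests

# requests consults REQUESTS_CA_BUNDLE for each request (when trust_env is
# enabled, which is the default), so setting it before the call is enough.
os.environ["REQUESTS_CA_BUNDLE"] = "/etc/ssl/certs/ca-certificates.crt"

response = requests.get("https://mysite.local")
print(response.status_code)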

Requests embeds its bundles here, for reference:

/usr/local/lib/python2.7/site-packages/requests/cacert.pem
/usr/lib/python3/dist-packages/requests/cacert.pem

Newer versions of requests instead obtain their certificates from the separate certifi package: https://github.com/certifi/python-certifi

To verify from which file certificates are loaded, you can try:

Python 3.8.5 (default, Jul 28 2020, 12:59:40) 
>>> import certifi
>>> certifi.where()
'/etc/ssl/certs/ca-certificates.crt'
5
  • 12
    After I delete the default cacert.pem bundled by requests, requests seems to pick up the system ca-certificates bundle without setting the environment variable. Commented Aug 8, 2017 at 13:54
  • 9
    Setting the environment variable REQUESTS_CA_BUNDLE works. However, it does not change the crt path in the certifi module. The answer implies that it does, but my tests in Python 3.7 and 3.8 show otherwise. I recommend using os.getenv to check the path instead.
    – nafooesi
    Commented Apr 27, 2021 at 4:53
  • you can also use the path in the verify keyword argument like so: verify='/etc/ssl/certs/ca-certificates.crt'
    – Baza86
    Commented Jan 11, 2023 at 0:43
  • 2
    This environment variable does not work on all systems. On CentOS, for example, I needed to use SSL_CERT_FILE variable as noted in another answer: stackoverflow.com/a/75352343/2779147
    – jakebeal
    Commented Feb 21, 2023 at 15:19
  • @jakebeal are you sure you're interacting with the PyPI requests package and not openssl? It is very possible that in your case requests is using openssl and will respect this argument, but there is a subtle difference; e.g. I wouldn't expect SSL_CERT_FILE to impact requests linked to gnutls -- unless gnutls is mocking openssl environment-variable compatibility in the same way requests considered and possibly incorporated here: github.com/psf/requests/issues/2899 Thanks for a good note! Commented Mar 3, 2023 at 6:44
38

I struggled with this for a week or so recently. I finally found the way to verify a self-signed, or privately signed, certificate in Python: you need to create your own certificate bundle file. There is no need to update obscure certificate bundles every time you update a library, or to add anything to the system certificate store.

Start by running the openssl command that you ran before, but add -showcerts: openssl s_client -connect mysite.local:443 -showcerts. This will give you a long output, and at the top you'll see the entire certificate chain. Usually this means three certs: the website's certificate, the intermediate certificate, and the root certificate, in that order. We need to put just the root and intermediate certificates into a new file, in the opposite order.

Copy the last cert, the root certificate, to a new text file. Grab just the stuff between, and including:

-----BEGIN CERTIFICATE-----
...
-----END CERTIFICATE-----

Copy the middle cert (aka the intermediate certificate) to the new text file under the root cert. Again, grab the Begin and End Certificate lines and everything in between.

Save this text file to the directory where your Python script resides. My recommendation is to call it CertBundle.pem. (If you give it a different name, or put it somewhere else in your folder structure, make sure that the verify line reflects that.) Update your script to reference the new certificate bundle:

response = requests.post("https://www.example.com/", headers=headerContents, json=bodyContents, verify="CertBundle.pem")

And that's it. If you have only the root or only the intermediate certificate, then Python can't validate the entire certificate chain. But, if you include both of the certificates in the certificate bundle that you created, then Python can validate that the intermediate was signed by the root, and then when it accesses the website it can validate that the website's certificate was signed by the intermediate certificate.
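Before wiring the bundle into requests, you can sanity-check it with the standard ssl module; a quick sketch, using www.example.com as a stand-in for your site:

import socket
import ssl

# Build a context that trusts only the bundle we just created, then perform a
# TLS handshake against the site; a verification failure raises ssl.SSLError.
context = ssl.create_default_context(cafile="CertBundle.pem")
with socket.create_connection(("www.example.com", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="www.example.com") as tls:
        print(tls.getpeercert()["subject"])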

edit: Fixed the file extension for the cert bundle. Also, fixed a couple of grammatical mistakes.

3
  • 2
    I'm not confident .p7b is the right semantic extension for said bundle. (Albeit I am not really an expert) just used to seeing .pem and .crt used for CA Bundles. I know the debian ca-certificates package is picky about certificates being .crt extensions to be added to the system-provided certificate store. Commented Feb 6, 2018 at 19:24
  • I almost said something about that in my post. I went with p7b because I think that's the right extension, but for this purpose it doesn't matter. It can be .txt, or no extension at all. The important thing is that your Python script can find the file.
    – fryad
    Commented Feb 9, 2018 at 3:08
  • 8
    @fryad is correct; this file ought to have the .pem extension, and some tools will mishandle it because it has the wrong extension. .pem is the de facto standard extension for this base64-encoded certificate format, while the binary version of the same format is .der, and .p7b is a different base64-encoded format. A handy reference on how to convert among them using openssl CLI tools: knowledge.digicert.com/solution/SO26449.html
    – Dan Lenski
    Commented Jul 17, 2018 at 2:58
23

My two cents:

Thanks to this other answer, which had me check the actual requests code, I figured out that you don't have to use the env variable but can just set the "verify" param in your request:

requests.get("https://whatever", verify="/my/path/to/cacert.crt", ...)

It is also documented, although I could only find the documentation after having made the discovery (and the pypi project points to a dead link for doc) :D
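If you make many calls, it may be more convenient to set this once on a Session rather than on every call; a small sketch, reusing the same bundle path:

import requests

# session.verify applies to every request made through this session, unless a
# call overrides it with its own verify= argument.
session = requests.Session()
session.verify = "/my/path/to/cacert.crt"

response = session.get("https://whatever")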

20

requests uses certifi as its default root-certificate package; certifi builds in a lot of good CAs, but the bundle itself cannot be modified through the system's certificate tools.

Debian (and Ubuntu) maintainers patched certifi so that its behavior differs from the default:

def where():
    return "/etc/ssl/certs/ca-certificates.crt"

So if you use apt-installed requests and certifi there is no problem.

But a pip3-installed certifi inside a virtual env uses the built-in CAs, so the update-ca-certificates mechanism has no effect there. Besides manually specifying the root cert in app code (which may not be possible if requests is called indirectly through 3rd-party interfaces), you can also override it with the environment variable REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt to emulate the Debianized behavior.
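To see which bundle a given interpreter or virtual env will actually end up with, a quick check along these lines can help (certifi.where() is what requests falls back to when REQUESTS_CA_BUNDLE is not set):

import os
import certifi

# Where requests will look if REQUESTS_CA_BUNDLE is not set...
print("certifi bundle:    ", certifi.where())
# ...and the override, if any.
print("REQUESTS_CA_BUNDLE:", os.environ.get("REQUESTS_CA_BUNDLE"))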

4
  • I'm in a docker container with Debian Buster. Simply setting REQUESTS_CA_BUNDLE to a cert file does not make aiohttp use those certs. Perhaps it works for 'requests'. I'm on Python 3.10
    – Lee Meador
    Commented Oct 3, 2022 at 16:33
  • 4
    I found that setting SSL_CERT_FILE=/etc/ssl/certs/ca-certificates.crt does work with aiohttp.
    – Lee Meador
    Commented Oct 3, 2022 at 17:08
  • Setting REQUESTS_CA_BUNDLE to my private cert solved my requests issue in a docker container. Commented Aug 10, 2023 at 20:19
  • If this helps anyone: for those of you using the Python requests module in a docker container in WSL, who have their corporate local CA certificate (e.g. zscaler) in /usr/local/share/ca-certificates in WSL, adjust your docker run command to include the following: docker run ... -e REQUESTS_CA_BUNDLE=/etc/ssl/certs/zscaler.crt -v /usr/local/share/ca-certificates/zscaler.crt:/etc/ssl/certs/zscaler.crt ...
    – Simon C
    Commented Jan 15 at 21:02
10

After trying everything, I found that this worked for me on Ubuntu:

export SSL_CERT_FILE=/etc/ssl/certs/ca-certificates.crt

I had to do this even though certifi showed the same path.
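If you want to see what the standard-library ssl module considers its defaults (this is the layer that honours SSL_CERT_FILE, and it's what clients built on the ssl module, such as aiohttp, pick up), something like this can help:

import os
import ssl

# The stdlib ssl module's default CA locations, plus the environment
# variables (SSL_CERT_FILE / SSL_CERT_DIR) that override them.
print(ssl.get_default_verify_paths())
print("SSL_CERT_FILE =", os.environ.get("SSL_CERT_FILE"))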

2

@fryad's answer worked best for me. Being behind a company VPN that does HTTPS interception, I had similar problems with local Python scripts. The following script, written with the help of GPT after instructing it properly :), simply does the job.

#!/bin/bash

# Description:
# This script fetches the entire certificate chain for a given domain using openssl.
# It then removes the server's certificate and reverses the order of the remaining certificates.
# The output is saved to a file named `certbundle.pem`.
# 
# Usage:
# ./scriptname.sh <domain_name>
#
# Example:
# ./scriptname.sh example.com

# Check if a URL has been provided as an argument
if [ "$#" -ne 1 ]; then
    echo "Usage: $0 <domain_name>"
    exit 1
fi

# Extract the domain name from the argument
DOMAIN="$1"

# Fetch only the certificate chain using openssl.
# We use sed to filter out everything except the certificates.
# Redirecting stdin from /dev/null makes s_client exit instead of waiting for input.
openssl s_client -connect "$DOMAIN":443 -showcerts </dev/null | sed -ne '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/p' > certbundle.pem

# Remove the first certificate (from the server itself) from the certbundle.pem file.
# In macOS, the -i option for sed requires an extension, but we're providing an empty string to modify the file in-place without creating a backup.
sed -i '' '1,/-END CERTIFICATE-/d' certbundle.pem

# Reverse the order of the certificates and save it back to certbundle.pem.
awk '/-----BEGIN CERTIFICATE-----/ { if (cert != "") certs[n++] = cert; cert = "" } { cert = cert $0 RS } END { certs[n++] = cert; for (i = n - 1; i >= 0; i--) printf "%s", certs[i] }' certbundle.pem > reversed_certbundle.pem && mv reversed_certbundle.pem certbundle.pem

echo "Certificate chain has been saved in certbundle.pem."

In Python, then, simply do:


import requests
response = requests.get("your url goes here", verify='certbundle.pem')
1

I didn't want to use a static file or an additional pip package I don't understand to solve the exact same problem. Luckily the standard ssl package, especially its load_default_certs() function, can solve the issue:

import ssl
import requests
from requests.adapters import HTTPAdapter

class LocalSSLContext(HTTPAdapter):
    # Transport adapter that verifies against the OS trust store instead of
    # the certifi bundle shipped with requests.
    def init_poolmanager(self, *args, **kwargs):
        # load_default_certs() pulls in the system CAs: the Windows
        # certificate store on Windows, OpenSSL's default paths elsewhere.
        context = ssl.create_default_context()
        context.load_default_certs()
        kwargs['ssl_context'] = context
        return super(LocalSSLContext, self).init_poolmanager(*args, **kwargs)

session = requests.Session()
sslContext = LocalSSLContext()
session.mount('https://www.example.com/', sslContext)
response = session.get(url='https://www.example.com/')

print(response.status_code)

Worked for me in Windows and Linux environments.

0

There can be another reason for getting the error: namely, when two certificates that are too similar (e.g. self-signed) have been added to /usr/share/ca-certificates/local.

In that case requests finds one of them without problems, but when trying to connect to the host of the other, it gives an error.

Then the SAN (subject alternative name) and/or the name can cause confusion (I am still not sure which). For example, the names can be the same, or the SAN hosts can intersect. Name here also means the Issuer, which you can check with:

openssl x509 -in my.crt  -text -noout
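To compare the two certificates' names and SANs side by side from Python, recent versions of the third-party cryptography package can decode them; a sketch, assuming that package is installed and using the file name from the command above:

from cryptography import x509

with open("my.crt", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

print("Subject:", cert.subject.rfc4514_string())
print("Issuer: ", cert.issuer.rfc4514_string())
try:
    san = cert.extensions.get_extension_for_class(x509.SubjectAlternativeName)
    print("SANs:   ", san.value.get_values_for_type(x509.DNSName))
except x509.ExtensionNotFound:
    print("SANs:    none")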
