Python requests fails when trying to connect to .onion site - python-3.x

I'm trying to fetch a webpage that is hosted on the Tor network. I'm using the following code:
import requests

def get_tor_session():
    session = requests.session()
    session.proxies = {'http': 'socks5://127.0.0.1:9150',
                       'https': 'socks5://127.0.0.1:9150'}
    return session

session = get_tor_session()
When I try to get a normal website it works fine, for example: print(session.get("http://httpbin.org/ip").text) prints {"origin": "80.67.172.162"}
But when I try it on a .onion site, it fails with this error:
Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/socks.py", line 813, in connect
negotiate(self, dest_addr, dest_port)
File "/usr/local/lib/python3.6/site-packages/socks.py", line 477, in _negotiate_SOCKS5
CONNECT, dest_addr)
File "/usr/local/lib/python3.6/site-packages/socks.py", line 540, in _SOCKS5_request
resolved = self._write_SOCKS5_address(dst, writer)
File "/usr/local/lib/python3.6/site-packages/socks.py", line 592, in _write_SOCKS5_address
addresses = socket.getaddrinfo(host, port, socket.AF_UNSPEC, socket.SOCK_STREAM, socket.IPPROTO_TCP, socket.AI_ADDRCONFIG)
File "/usr/local/Cellar/python3/3.6.3/Frameworks/Python.framework/Versions/3.6/lib/python3.6/socket.py", line 745, in getaddrinfo
for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno 8] nodename nor servname provided, or not known
During handling of the above exception, another exception occurred:
...
Traceback (most recent call last):
File "spider.py", line 13, in <module>
print(session.get("http://zqktlwi4fecvo6ri.onion/").text)
File "/usr/local/lib/python3.6/site-packages/requests/sessions.py", line 521, in get
return self.request('GET', url, **kwargs)
File "/usr/local/lib/python3.6/site-packages/requests/sessions.py", line 508, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python3.6/site-packages/requests/sessions.py", line 618, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python3.6/site-packages/requests/adapters.py", line 508, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: SOCKSHTTPConnectionPool(host='zqktlwi4fecvo6ri.onion', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.contrib.socks.SOCKSConnection object at 0x106fd62e8>: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known',))

When using the socks5 scheme, domains are resolved locally by the client's DNS resolver. Regular DNS servers can't resolve .onion domains, so your request fails.
From docs.python-requests.org:
Using the scheme socks5 causes the DNS resolution to happen on the client, rather than on the proxy server. This is in line with curl, which uses the scheme to decide whether to do the DNS resolution on the client or proxy. If you want to resolve the domains on the proxy server, use socks5h as the scheme.
So, in order to connect to .onion sites you should let Tor resolve the domain. You can do this by using the socks5h scheme in the proxies dictionary:
import requests
session = requests.session()
session.proxies = {'http': 'socks5h://127.0.0.1:9150', 'https': 'socks5h://127.0.0.1:9150'}
response = session.get("https://3g2upl4pq6kufc4m.onion/")
print(response)
#<Response [200]>
Note that you may have to install extra dependencies.
pip install requests[socks]

Related

Unable to send e-mail using python

I'm running Raspbian Buster and use msmtp to send emails from the command line, and it works just fine.
When I try to send emails using Python it fails miserably. I've tried various Python examples from the net, e.g.:
# Sending Email Alerts via Zoho
import smtplib

server = smtplib.SMTP_SSL('smtp.zoho.com', port=465)  # server for sending the email
server.ehlo()  # simple starting of the connection
server.login('test_email@zoho.com', 'pwd_12345')  # login credentials and password
msg = """From: test_email@zoho.com
Subject: Test Email \n
To: recipient_email@gmail.com \n"""
# This is where the email content goes. It could be information about the error, time of day, where in the script, etc.
server.sendmail('test_email@zoho.com', 'recipient_email@gmail.com', msg)  # this is where the email is sent to the recipient
server.quit()  # exit the connection
... but unfortunately I always get the following error:
Traceback (most recent call last):
File "/usr/lib/python3.7/smtplib.py", line 387, in getreply
line = self.file.readline(_MAXLINE + 1)
File "/usr/lib/python3.7/socket.py", line 589, in readinto
return self._sock.recv_into(b)
ConnectionResetError: [Errno 104] Connection reset by peer
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/pi/test_email_python_06.py", line 6, in <module>
server = smtplib.SMTP('smtpauths.bluewin.ch',port=465) #server for sending the email
File "/usr/lib/python3.7/smtplib.py", line 251, in __init__
(code, msg) = self.connect(host, port)
File "/usr/lib/python3.7/smtplib.py", line 338, in connect
(code, msg) = self.getreply()
File "/usr/lib/python3.7/smtplib.py", line 391, in getreply
+ str(e))
smtplib.SMTPServerDisconnected: Connection unexpectedly closed: [Errno 104] Connection reset by peer
As a newbie, any hint would be appreciated.
Thanks!
This issue has been solved!
My ISP uses SSL on port 465, and my command-line email client msmtp works just fine with that.
Out of desperation I started to play around and simply used port 25 instead, and bingo: sending emails now works just fine. The funny part is that my ISP suggests using port 465.
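For reference, here is a minimal sketch of the plain-SMTP-on-port-25 variant that ended up working; the hostname and addresses below are placeholders, not the real ones:
import smtplib

# Plain SMTP on port 25 instead of SMTP_SSL on 465.
# Hostname and addresses are placeholders.
server = smtplib.SMTP('smtp.example.com', port=25)
server.ehlo()
msg = ("From: sender@example.com\r\n"
       "To: recipient@example.com\r\n"
       "Subject: Test Email\r\n"
       "\r\n"
       "This is where the email content goes.")
server.sendmail('sender@example.com', 'recipient@example.com', msg)
server.quit()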

Is there an Eclipse PyDev setting or something in my Python GET request preventing me from connecting to a host?

I am trying to read data from an ESA server in a PyDev Eclipse 3.8.1 environment using Python 3.5.3. An example of a product link is given here:
https://scihub.copernicus.eu/dhus/odata/v1/Products('c7208694-dedb-4f47-96c0-c8fb03512ff5')
You will need authentication credentials to access the website.
In my Eclipse environment I have manually added proxy settings by going to Windows -> General -> Network Connections. Authentication is not required for the proxy.
To get the content of the webpage, I use a Python GET request to send my query, as below:
url = r"https://scihub.copernicus.eu/dhus/odata/v1/Products('c5dc59f0-b041-4f76-a685-49be63491270')"
r = requests.get(url, verify=False, proxies=proxies, auth=auth)
I need verify=False to disable the certificate check; proxies is a dictionary storing the proxy addresses with the relevant ports, and the server credentials are given with auth, which is a tuple of a username and password.
When I send the request to the server, I get the following error:
Traceback (most recent call last):
File "/lib/python3.5/site-packages/urllib3/connection.py", line 159, in _new_conn
(self._dns_host, self.port), self.timeout, **extra_kw)
File "/lib/python3.5/site-packages/urllib3/util/connection.py", line 80, in create_connection
raise err
File "/lib/python3.5/site-packages/urllib3/util/connection.py", line 70, in create_connection
sock.connect(sa)
TimeoutError: [Errno 110] Connection timed out
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/lib/python3.5/site-packages/urllib3/connectionpool.py", line 600, in urlopen
chunked=chunked)
File "/lib/python3.5/site-packages/urllib3/connectionpool.py", line 343, in _make_request
self._validate_conn(conn)
File "/lib/python3.5/site-packages/urllib3/connectionpool.py", line 839, in _validate_conn
conn.connect()
File "/lib/python3.5/site-packages/urllib3/connection.py", line 301, in connect
conn = self._new_conn()
File "/lib/python3.5/site-packages/urllib3/connection.py", line 168, in _new_conn
self, "Failed to establish a new connection: %s" % e)
urllib3.exceptions.NewConnectionError: <urllib3.connection.VerifiedHTTPSConnection object at 0x7fecf00acbe0>: Failed to establish a new connection: [Errno 110] Connection timed out
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/lib/python3.5/site-packages/requests/adapters.py", line 449, in send
timeout=timeout
File "/lib/python3.5/site-packages/urllib3/connectionpool.py", line 638, in urlopen
_stacktrace=sys.exc_info()[2])
File "/lib/python3.5/site-packages/urllib3/util/retry.py", line 398, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='scihub.copernicus.eu', port=443): Max retries exceeded with url: /dhus/odata/v1/Products('c5dc59f0-b041-4f76-a685-49be63491270') (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7fecf00acbe0>: Failed to establish a new connection: [Errno 110] Connection timed out',))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "download_lib2/download_lib/scihub/download.py", line 325, in <module>
download_url(dl_link, auth, temp_dir)
File "download_lib2/download_lib/scihub/download.py", line 153, in download_url
online = check_url_online(url_to_download, auth)
File "download_lib2/download_lib/scihub/download.py", line 51, in check_url_online
r = requests.get(url, verify=False, proxies=proxies, auth=auth)
File "/lib/python3.5/site-packages/requests/api.py", line 75, in get
return request('get', url, params=params, **kwargs)
File "/lib/python3.5/site-packages/requests/api.py", line 60, in request
return session.request(method=method, url=url, **kwargs)
File "/lib/python3.5/site-packages/requests/sessions.py", line 533, in request
resp = self.send(prep, **send_kwargs)
File "/lib/python3.5/site-packages/requests/sessions.py", line 646, in send
r = adapter.send(request, **kwargs)
File "/lib/python3.5/site-packages/requests/adapters.py", line 516, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='scihub.copernicus.eu', port=443): Max retries exceeded with url: /dhus/odata/v1/Products('c5dc59f0-b041-4f76-a685-49be63491270') (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7fecf00acbe0>: Failed to establish a new connection: [Errno 110] Connection timed out',))
I have tried the same request outside of Eclipse, in IDLE, and I am able to successfully connect and read the content. Is there something wrong with my current Eclipse/PyDev environment that will not allow me to send a GET request?
Why is Python not able to connect? I am able to send GET requests to this and other websites from other projects in Eclipse (e.g. https://podaac-opendap.jpl.nasa.gov/opendap/allData/oscar/preview/L4/oscar_third_deg/) with no problem whatsoever.
Edit: added version information.
The connection timeout indicates that your problem is actually with the connection to your proxy rather than with the ESA server itself. Additionally, I believe the proxy settings in Eclipse are only used for plugins and software updates, not by any code you run.
As a first step, check that the keys in your proxies dictionary are correct and verify the proxy URLs if you can.
You mention you have other Eclipse projects that can access the internet; it's worth checking what proxy settings they implement and seeing if there are any discrepancies with your current setup.
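For reference, a minimal sketch of what a well-formed proxies dictionary for requests looks like; the proxy host, port, and credentials are placeholders:
import requests

# Keys are the URL scheme of the target ('http'/'https'); values are the proxy URLs.
# The proxy address and credentials below are placeholders.
proxies = {
    'http': 'http://proxy.example.com:8080',
    'https': 'http://proxy.example.com:8080',
}
auth = ('username', 'password')

url = "https://scihub.copernicus.eu/dhus/odata/v1/Products('c5dc59f0-b041-4f76-a685-49be63491270')"
r = requests.get(url, proxies=proxies, auth=auth, verify=False, timeout=30)
print(r.status_code)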

Accessing HTTPS pages with Python 3

I do not understand how to use urllib3 or requests to connect to an HTTPS web site; this is driving me nuts. I have installed certifi and I see the default .pem file it provides. I have tried setting the verify option of requests to every .pem and .crt file on the machine my script runs on (I am not an admin on this device). I get nothing but errors.
I switched to using urllib3 and am now getting:
H:\Projects\MyScraper\venv\Scripts\python.exe H:/Projects/MyScraper/MyScraper.py
Traceback (most recent call last):
File "H:\Projects\MyScraper\venv\lib\site-packages\urllib3\connectionpool.py", line 600, in urlopen
chunked=chunked)
File "H:\Projects\MyScraper\venv\lib\site-packages\urllib3\connectionpool.py", line 343, in _make_request
self._validate_conn(conn)
File "H:\Projects\MyScraper\venv\lib\site-packages\urllib3\connectionpool.py", line 839, in _validate_conn
conn.connect()
File "H:\Projects\MyScraper\venv\lib\site-packages\urllib3\connection.py", line 344, in connect
ssl_context=context)
File "H:\Projects\MyScraper\venv\lib\site-packages\urllib3\util\ssl_.py", line 342, in ssl_wrap_socket
return context.wrap_socket(sock, server_hostname=server_hostname)
File "C:\Program Files (x86)\Python36-32\lib\ssl.py", line 407, in wrap_socket
_context=self, _session=session)
File "C:\Program Files (x86)\Python36-32\lib\ssl.py", line 814, in __init__
self.do_handshake()
File "C:\Program Files (x86)\Python36-32\lib\ssl.py", line 1068, in do_handshake
self._sslobj.do_handshake()
File "C:\Program Files (x86)\Python36-32\lib\ssl.py", line 689, in do_handshake
self._sslobj.do_handshake()
ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:777)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "H:/Projects/MyScraper/MyScraper.py", line 15, in <module>
raw_html = HTTP.request('GET', 'https://portal.xsede.org/course-calendar/')
File "H:\Projects\MyScraper\venv\lib\site-packages\urllib3\request.py", line 68, in request
**urlopen_kw)
File "H:\Projects\MyScraper\venv\lib\site-packages\urllib3\request.py", line 89, in request_encode_url
return self.urlopen(method, url, **extra_kw)
File "H:\Projects\MyScraper\venv\lib\site-packages\urllib3\poolmanager.py", line 323, in urlopen
response = conn.urlopen(method, u.request_uri, **kw)
File "H:\Projects\MyScraper\venv\lib\site-packages\urllib3\connectionpool.py", line 667, in urlopen
**response_kw)
File "H:\Projects\MyScraper\venv\lib\site-packages\urllib3\connectionpool.py", line 667, in urlopen
**response_kw)
File "H:\Projects\MyScraper\venv\lib\site-packages\urllib3\connectionpool.py", line 667, in urlopen
**response_kw)
[Previous line repeated 6 more times]
File "H:\Projects\MyScraper\venv\lib\site-packages\urllib3\connectionpool.py", line 638, in urlopen
_stacktrace=sys.exc_info()[2])
File "H:\Projects\MyScraper\venv\lib\site-packages\urllib3\util\retry.py", line 398, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='portal.xsede.org', port=443): Max retries exceeded with url: /course-calendar/ (Caused by SSLError(SSLError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:777)'),))
Process finished with exit code 1
My code looks like:
#!/home/me/virtualenv/python3.6/3.6/bin/python
import certifi
import urllib3
from bs4 import BeautifulSoup
HTTP = urllib3.PoolManager(
    cert_reqs='CERT_REQUIRED',
    ca_certs=certifi.where(),
    retries=10
)
raw_html = HTTP.request('GET', 'https://portal.xsede.org/course-calendar/')
html = BeautifulSoup(raw_html, 'html.parser')
It blows up on the raw_html = HTTP.request(... line. Ideas?
Edit
Huh, this has something to do with my target host. If I go to google.com then several of my pem/crt files work.
The problem is that you are using the wrong certificate to make the request.
You can run this command to see which certificates are presented when a connection is made, and then use that certificate in your request:
openssl s_client -showcerts -connect google.com:443
Also make sure that you pass verify the path to a CA_BUNDLE file, or to a directory containing certificates of trusted CAs.
This list of trusted CAs can also be specified through the REQUESTS_CA_BUNDLE environment variable.
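For example, a minimal sketch of both options (the bundle path is a placeholder):
import os
import requests

# Option 1: point verify at a CA bundle file (placeholder path).
r = requests.get('https://portal.xsede.org/course-calendar/',
                 verify='/path/to/ca_bundle.pem')

# Option 2: set the environment variable instead; requests picks it up
# as long as trust_env is enabled (the default).
os.environ['REQUESTS_CA_BUNDLE'] = '/path/to/ca_bundle.pem'
r = requests.get('https://portal.xsede.org/course-calendar/')
print(r.status_code)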
If this doesn't work out, you can explicitly merge the environment settings into your session. From the requests documentation:
When you are using the prepared request flow, keep in mind that it does not take into account the environment. This can cause problems if you are using environment variables to change the behaviour of requests. For example: self-signed SSL certificates specified in REQUESTS_CA_BUNDLE will not be taken into account. As a result an SSL: CERTIFICATE_VERIFY_FAILED is thrown. You can get around this behaviour by explicitly merging the environment settings into your session:
from requests import Request, Session

url = 'https://portal.xsede.org/course-calendar/'  # the page you want to fetch

s = Session()
req = Request('GET', url)
prepped = s.prepare_request(req)

# Merge environment settings (proxies, stream, verify, cert) into the session
settings = s.merge_environment_settings(prepped.url, {}, None, None, None)
resp = s.send(prepped, **settings)
print(resp.status_code)

Python requests module exception SSLError

I am getting a requests module exception when I try to access the http://www.acastipharma.com/ website. I am not having problems with any other website, so I believe this is a website-specific issue. Here is some example code:
import requests
initialURL = 'http://www.acastipharma.com/'
r = requests.get(initialURL)
When I run this code I get an error message that terminates with
requests.exceptions.SSLError: HTTPSConnectionPool(host='www.acastipharma.com', port=443): Max retries exceeded with url: /investors/ (Caused by SSLError(SSLError(1, '[SSL: SSLV3_ALERT_HANDSHAKE_FAILURE] sslv3 alert handshake failure (_ssl.c:645)'),))
An internet search indicates that the problem might be with acastipharma's SSL certificate. I tried installing pyOpenSSL to make sure I had the latest version of the module that checks SSL certificates, but that did not solve the problem. I also tried running the requests.get call with the verify=False option, but that was also unsuccessful.
r = requests.get(initialURL, verify=False)
If anybody has any ideas on how to resolve this issue I would appreciate the assistance. I also tried using the older urllib.request package but ran into the same error.
This is an update to my original question: the error message I posted earlier came from running the requests command on one of acastipharma's website subpages. Here is the complete error message I get when I run the code exactly as shown in this question:
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/urllib3/connectionpool.py", line 601, in urlopen
chunked=chunked)
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/urllib3/connectionpool.py", line 346, in _make_request
self._validate_conn(conn)
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/urllib3/connectionpool.py", line 850, in _validate_conn
conn.connect()
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/urllib3/connection.py", line 326, in connect
ssl_context=context)
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/urllib3/util/ssl_.py", line 329, in ssl_wrap_socket
return context.wrap_socket(sock, server_hostname=server_hostname)
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py", line 376, in wrap_socket
_context=self)
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py", line 747, in __init__
self.do_handshake()
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py", line 983, in do_handshake
self._sslobj.do_handshake()
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py", line 628, in do_handshake
self._sslobj.do_handshake()
ssl.SSLError: [SSL: SSLV3_ALERT_HANDSHAKE_FAILURE] sslv3 alert handshake failure (_ssl.c:645)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/adapters.py", line 440, in send
timeout=timeout
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/urllib3/connectionpool.py", line 639, in urlopen
_stacktrace=sys.exc_info()[2])
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/urllib3/util/retry.py", line 388, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='www.acastipharma.com', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLError(1, '[SSL: SSLV3_ALERT_HANDSHAKE_FAILURE] sslv3 alert handshake failure (_ssl.c:645)'),))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<input>", line 1, in <module>
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/api.py", line 72, in get
return request('get', url, params=params, **kwargs)
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/api.py", line 58, in request
return session.request(method=method, url=url, **kwargs)
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/sessions.py", line 508, in request
resp = self.send(prep, **send_kwargs)
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/sessions.py", line 640, in send
history = [resp for resp in gen] if allow_redirects else []
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/sessions.py", line 640, in <listcomp>
history = [resp for resp in gen] if allow_redirects else []
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/sessions.py", line 218, in resolve_redirects
**adapter_kwargs
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/sessions.py", line 618, in send
r = adapter.send(request, **kwargs)
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/adapters.py", line 506, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='www.acastipharma.com', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLError(1, '[SSL: SSLV3_ALERT_HANDSHAKE_FAILURE] sslv3 alert handshake failure (_ssl.c:645)'),))
I am using Python 3.5.1 with requests 2.18.4 on a Mac running High Sierra 10.13.2. Since posting the question I believe the problem lies with my IDE (PyCharm) environment: if I use my Python 3.5 environment I get the error shown in this question; if I switch the project interpreter to a Python 3.6 Anaconda environment, requests works, but unfortunately mysql won't import. Thanks
The OpenSSL version is OpenSSL 0.9.8zh 14 Jan 2016
This (very old and long unsupported) version of OpenSSL does not support the newer ciphers required by this specific web server. The server only supports ECDHE key exchange, which OpenSSL 0.9.8 does not support. This means the client only offers ciphers that the server will not accept, and because there are no common ciphers the server closes the connection with an SSL handshake alert.
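As a quick check, you can see which OpenSSL version your Python build is linked against:
import ssl
print(ssl.OPENSSL_VERSION)  # e.g. 'OpenSSL 0.9.8zh 14 Jan 2016'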
A related InsecureRequestWarning (which urllib3 emits when you make requests with verify=False) can be silenced with the following:
import warnings
from urllib3.exceptions import InsecureRequestWarning
warnings.simplefilter('ignore', InsecureRequestWarning)

SSL Handshake Error Python (SSLError(SSLError(336265225, '[SSL] PEM lib (_ssl.c:2959)') [duplicate]

I would like to authenticate to the server from my client using a certificate that is generated by the server. I have a server-ca.crt file, and below is the curl command that works. How do I send a similar request using the Python requests module?
$ curl -X GET -u sat_username:sat_password \
-H "Accept:application/json" --cacert katello-server-ca.crt \
https://satellite6.example.com/katello/api/organizations
I have tried the following, and it raises an exception; can someone help me resolve this issue?
python requestsCert.py
Traceback (most recent call last):
File "requestsCert.py", line 2, in <module>
res=requests.get('https://satellite6.example.com/katello/api/organizations', cert='/certificateTests/katello-server-ca.crt', verify=True)
File "/usr/lib/python2.7/site-packages/requests/api.py", line 68, in get
return request('get', url, **kwargs)
File "/usr/lib/python2.7/site-packages/requests/api.py", line 50, in request
response = session.request(method=method, url=url, **kwargs)
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 464, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 576, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python2.7/site-packages/requests/adapters.py", line 431, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: [SSL] PEM lib (_ssl.c:2554)
res=requests.get('https://...', cert='/certificateTests/katello-server-ca.crt', verify=True)
The cert argument in requests.get is used to specify the client certificate and key which should be used for mutual authentication. It is not used to specify the trusted CA as the --cacert argument in curl does. Instead you should use the verify argument:
res=requests.get('https://...', verify='/certificateTests/katello-server-ca.crt')
For more information see SSL Cert Verification and Client Side Certificates in the documentation for requests.
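Putting it together, a sketch that mirrors the whole curl command from the question (hostname, path, and credentials are the question's placeholders):
import requests

# verify= plays the role of curl's --cacert (trusted CA bundle);
# auth= plays the role of -u (basic authentication).
res = requests.get(
    'https://satellite6.example.com/katello/api/organizations',
    auth=('sat_username', 'sat_password'),
    headers={'Accept': 'application/json'},
    verify='/certificateTests/katello-server-ca.crt',
)
print(res.status_code)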
