Python certificate verify failed: unable to get local issuer certificate - python-3.x

I am a novice programmer, so pardon my mistakes. I have written the code below to verify that a list of websites is still active; all my work is based on this problem statement.
The script is able to check most sites but stumbles with the error below for https://precisionit.net/:
<urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1108)>
The URL opens fine in Firefox and Chrome but fails to open from the Python code. I have updated certifi and used it in my code as many folks suggested, but the error would not go away.
I am using a Conda Python environment, and I also executed the command below:
conda install -c conda-forge certifi
There were multiple posts that suggested running "Install Certificates.command", which does not apply to Conda Python, so I downloaded the Python 3.9 installer, executed "Install Certificates.command", and ran the script with Python 3.9, yet no luck. I feel the issue is that even with the latest version of certifi the site's certificate is not validated. Although the certifi page says the list is based on Mozilla's root certificates, I guess it's not an exact replica, which is why Firefox is able to open the site. Not sure if my understanding makes sense; I will be glad to be corrected.
Pasting my script below. I am not sure what else needs to be done to fix the issue; kindly advise.
import urllib.request
import sys
import certifi
import ssl

def checkURL(url):
    try:
        hdr = {'User-Agent': 'Mozilla/79.0 (Windows NT 6.1; Win64; x64)'}
        req = urllib.request.Request(url, headers=hdr)
        r = urllib.request.urlopen(req, timeout=100,
                                   context=ssl.create_default_context(cafile=certifi.where()))
    except Exception as e:
        # print(r.read())
        print('Failed Connecting to Website')
        print(e)
        return 1
    print(r.status)
    finalurl = r.geturl()
    if r.status == 200:
        print(finalurl)
        return 0
    else:
        print("Website Not Found")
        return 2

checkURL('https://precisionit.net/')

I had a similar problem, and this is how I solved it.
First, check who the issuer of the site's certificate is. You can do this in many ways (check in the browser, connect using openssl s_client, ...).
Easiest is probably to just go to https://www.digicert.com/help/ and search for https://precisionit.net.
You are likely missing "Sectigo RSA Domain Validation Secure Server CA". Just go to their site (https://support.sectigo.com/Com_KnowledgeDetailPage?Id=kA01N000000rfBO) and download it.
Then get the location of the cacert.pem file where your certificates are stored with certifi.where(), and simply append the contents of the certificate you downloaded to that file.
The certificate should be of the form:
-----BEGIN CERTIFICATE-----
... some base64 encoded stuff ...
-----END CERTIFICATE-----
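The append step can be scripted. A minimal sketch, assuming the downloaded intermediate was saved in PEM form as sectigo-intermediate.crt (the helper name and file names are mine, not from the answer):

```python
def append_cert_to_bundle(cert_path, bundle_path=None):
    """Append a PEM-encoded certificate to a CA bundle (certifi's by default)."""
    if bundle_path is None:
        import certifi  # imported lazily; an explicit bundle_path needs no certifi
        bundle_path = certifi.where()
    with open(cert_path) as cert_file:
        pem = cert_file.read()
    # Read the whole bundle first so we only append the cert once
    with open(bundle_path, "r+") as bundle:
        if pem not in bundle.read():
            bundle.write("\n" + pem)

# append_cert_to_bundle("sectigo-intermediate.crt")
```

Note that upgrading certifi (via pip or conda) replaces cacert.pem, so the append has to be redone after each upgrade.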

First: save the site's certificate in Base64 (PEM) form.
Second: add code to verify against your saved file.
import requests

with requests.Session() as s:
    CC_host = 'https://precisionit.net'
    first_page = s.get(CC_host, verify='./theSiteCert.cer')
    html = first_page.text
    print(html)

Related

elastic_enterprise_search.AppSearch client fails in python sdk on GCloud Dataflow with urllib3 certificate error

I'm working on a DoFn that writes to Elastic App Search (elastic_enterprise_search.AppSearch). It works fine when I run my pipeline using the DirectRunner.
But when I deploy to Dataflow, the Elasticsearch client fails because, I suppose, it can't access a certificate store:
File "/usr/local/lib/python3.8/site-packages/urllib3/util/ssl_.py", line 402, in ssl_wrap_socket
context.load_verify_locations(ca_certs, ca_cert_dir, ca_cert_data)
FileNotFoundError: [Errno 2] No such file or directory
Any advice on how to overcome this sort of problem? I'm finding it difficult to get any traction on how to solve this on Google.
Obviously urllib3 is set up properly on my local machine for the DirectRunner. I have "elastic-enterprise-search" in the REQUIRED_PACKAGES key of setup.py for my package, along with all my other dependencies:
REQUIRED_PACKAGES = ['PyMySQL', 'sqlalchemy', 'cloud-sql-python-connector',
                     'google-cloud-pubsub', 'elastic-enterprise-search']
Can I package certificates up with my pipeline? How? Should I look into creating a custom docker image? Any hints on what it should look like?
Yes, creating a custom container that has the necessary credentials in it would work well here.
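For the custom-container route, a minimal Dockerfile sketch, assuming a Beam Python 3.8 SDK base image (the image tag, package list, and certifi path are assumptions to adapt):

```dockerfile
# Base image must match the pipeline's Beam SDK and Python versions
FROM apache/beam_python3.8_sdk:2.41.0

RUN pip install --no-cache-dir elastic-enterprise-search certifi

# Point OpenSSL (and therefore urllib3) at certifi's CA bundle explicitly
ENV SSL_CERT_FILE=/usr/local/lib/python3.8/site-packages/certifi/cacert.pem
```

The image is then pushed to a registry and handed to Dataflow via the --sdk_container_image pipeline option.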

PyMongo [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate

I'm using Python 3.9.5 and PyMongo 3.11.4. The version of my MongoDB database is 4.4.6. I'm using Windows 8.1
I'm learning MongoDB and I have a cluster set up in Atlas that I connect to. Whenever I try to insert a document into a collection, a ServerSelectionTimeoutError is raised, and inside its parentheses there are several [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate.
"Troubleshooting TLS Errors" in the PyMongo docs wasn't much help, as it only provides tips for Linux and macOS users.
It's worth mentioning that if I set tlsAllowInvalidCertificates=True when initializing my MongoClient, everything works fine. That sounds insecure, and while I am working on a small project, I would still like to develop good habits and not override any security measures in place, so I'm hoping there is an alternative to that.
From all the searching I've done, I'm guessing that I'm missing certain certificates, or that Python can't find them. I've looked into the certifi package, but this part of the docs makes it seem like that should only be necessary if I'm using Python 2.x, which I'm not.
So yeah, I'm kind of stuck right now.
Well, I eventually decided to install certifi and it worked.
client = MongoClient(CONNECTION_STRING, tlsCAFile=certifi.where())
Wish the docs were a bit clearer on this, but maybe I just didn't look hard enough.
In a Flask server I solved it by using:

from flask import Flask
from flask_pymongo import PyMongo
import certifi

app = Flask(__name__)
app.config['MONGO_URI'] = 'mongodb+srv://NAME:<PWD>@<DBNAME>.9xxxx.mongodb.net/<db>?retryWrites=true&w=majority'
mongo = PyMongo(app, tlsCAFile=certifi.where())
collection_name = mongo.db.collection_name
By default, PyMongo relies on the operating system's root certificates.
You need to install certifi:
pip install certifi
It could be that Atlas itself updated its certificates, or it could be that something on your OS changed. "certificate verify failed" often occurs because OpenSSL does not have access to the system's root certificates or the certificates are out of date. For how to troubleshoot, see "TLS/SSL and PyMongo" in the PyMongo documentation.
So try:
client = pymongo.MongoClient(connection, tlsCAFile=certifi.where())
This happens in Django as well; just add the code above to your settings.py:
DATABASE = {
    'default': {
        'ENGINE': 'djongo',
        'CLIENT': {
            'name': <your_database_name>,
            'host': <your_connection_string>,
            'username': <your_database_username>,
            'password': <your_database_password>,
            'authMechanism': 'SCRAM-SHA-1',
        },
    }
}
But in the host you may get this issue:
"pymongo.errors.ServerSelectionTimeoutError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:997)"
So for this you can add:
"mongodb+srv://sampleUser:samplePassword@cluster0-gbdot.mongodb.net/sampleDB?ssl=true&ssl_cert_reqs=CERT_NONE&retryWrites=true&w=majority"
Adding
ssl=true&ssl_cert_reqs=CERT_NONE
after the db name of your URL string works fine:
"mongodb+srv://username:Password@cluster0-gbdot.mongodb.net/DbName?ssl=true&ssl_cert_reqs=CERT_NONE&retryWrites=true&w=majority"
I saw an answer that worked for me. It appears I had not yet installed the Python certificates on my Mac, so I went to the following path and ran the installer:
/Applications/Python 3.10/Install Certificates.command
Only change the version to match your Python; after that, everything worked fine for me.
PS: I had been trying to solve the problem for half a day, I even asked ChatGPT
Step 1:
pip install certifi
Step 2:
client = pymongo.MongoClient(connection, tlsCAFile=certifi.where())

Python requests equivalent of '--proxy-header' in curl with SSL certification

Reference: How does one specify the equivalent of `--proxy-headers` curl argument into requests?
I am a newbie vis-à-vis Python.
I have a requirement where a request to a destination (webpage) must go through a proxy server.
I need to pass headers to the proxy server (same as --proxy-header of curl).
I need to add an SSL certificate (a '.cer' file) for the proxy server (a 'man in the middle' scenario) that reads the passed headers on CONNECT.
The curl equivalent of my requirement is as follows:
curl -k --verbose --cacert /proxy/cert/folder/proxy-certificate.cer --proxy-header "header1: value1" --proxy 'http://localhost:8080/' 'https://destination.com'
I did come across a similar example How does one specify the equivalent of `--proxy-headers` curl argument into requests?. But I am unsure how to incorporate this with an SSL certificate.
My Code:
import requests

proxyheaders = {'http://localhost:9090/': {'header1': 'value1'}}

class ProxyHeaderAwareHTTPAdapter(requests.adapters.HTTPAdapter):
    def proxy_headers(self, proxy):
        if proxy in proxyheaders:
            return proxyheaders[proxy]
        else:
            return None

s = requests.Session()
s.mount('http://', ProxyHeaderAwareHTTPAdapter())
s.mount('https://', ProxyHeaderAwareHTTPAdapter())

URL = "https://stackoverflow.com/"
cert_file_path = "/Path/to/certificate/proxy-certificate.cer"

try:
    s.get(URL, verify=cert_file_path)
except Exception as e:
    print(e)
I get the following error:
HTTPSConnectionPool(host='stackoverflow.com', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1056)')))
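One thing the snippet above never does is tell the session to use the proxy at all. A sketch of the combination, assuming the proxy runs at localhost:8080 as in the curl command and that the certificate file is PEM-encoded (header names and paths are placeholders, not from the question):

```python
import requests

# Hypothetical proxy URL and the headers to send it on CONNECT
PROXY = "http://localhost:8080"
proxyheaders = {PROXY: {"header1": "value1"}}

class ProxyHeaderAwareHTTPAdapter(requests.adapters.HTTPAdapter):
    def proxy_headers(self, proxy):
        # requests calls this hook with the proxy URL; return {} when none match
        return proxyheaders.get(proxy, {})

s = requests.Session()
s.mount("http://", ProxyHeaderAwareHTTPAdapter())
s.mount("https://", ProxyHeaderAwareHTTPAdapter())

# verify= must point at a PEM-encoded copy of the proxy's CA certificate;
# a DER-encoded .cer file needs converting first, e.g.:
#   openssl x509 -inform der -in proxy-certificate.cer -out proxy-certificate.pem
# resp = s.get("https://destination.com",
#              proxies={"http": PROXY, "https": PROXY},
#              verify="/proxy/cert/folder/proxy-certificate.pem")
```

Without the proxies= argument (or HTTP_PROXY/HTTPS_PROXY environment variables), requests connects directly and the custom adapter's proxy_headers hook is never consulted.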
THIS IS NOT A SOLUTION:
When I encounter certificate/verification errors, I just force conda not to verify the certificate using the command below:
conda config --set ssl_verify false
Note that this is not usually recommended, and I usually do it temporarily until I finish either running a specific script or downloading a library or so. If you try this and it works for you, remember to turn verification back on once done, using the command below:
conda config --set ssl_verify true

Connecting to FTPS with Python

I am trying to connect to an FTPS server which requires anonymous login with a .pfx certificate.
I have been given instructions for how to access it through the GUI application SmartFTP, which does work, so I know I don't have any firewall issues etc. However, for this workflow, getting access through Python would be ideal. Below are the settings I have been given:
Protocol: FTPS (Explicit)
Host: xxx.xxx.xxx.xxx
Port: 21
login type: Anonymous
Client Certificate: Enabled (providing a .pfx file)
Send FEAT: Send before and after login
I am having trouble picking the Python module best suited to this, with a full example using a .pfx certificate. Currently I have only tried the standard ftplib module using the code below. Does anyone have a worked example?
from ftplib import FTP_TLS

ftps = FTP_TLS(host='xxx.xxx.xxx.xxx',
               keyfile=r"/path/to.pfx")
ftps.login()
ftps.prot_p()
ftps.retrlines('LIST')
ftps.quit()
Using the above code I get:
ValueError: certfile must be specified
Client versions:
Ubuntu == 14.04,
Python == 3.6.2
Update
I think I am a little closer with the code below, but I'm getting a new error:
from ftplib import FTP_TLS
import tempfile
import OpenSSL.crypto

def pfx_to_pem(pfx_path, pfx_password):
    """ Decrypts the .pfx file to be used with requests. """
    with tempfile.NamedTemporaryFile(suffix='.pem') as t_pem:
        f_pem = open(t_pem.name, 'wb')
        pfx = open(pfx_path, 'rb').read()
        p12 = OpenSSL.crypto.load_pkcs12(pfx, pfx_password)
        f_pem.write(OpenSSL.crypto.dump_privatekey(OpenSSL.crypto.FILETYPE_PEM, p12.get_privatekey()))
        f_pem.write(OpenSSL.crypto.dump_certificate(OpenSSL.crypto.FILETYPE_PEM, p12.get_certificate()))
        ca = p12.get_ca_certificates()
        if ca is not None:
            for cert in ca:
                f_pem.write(OpenSSL.crypto.dump_certificate(OpenSSL.crypto.FILETYPE_PEM, cert))
        f_pem.close()
        yield t_pem.name

pfx = pfx_to_pem(r"/path/to.pfx", 'password')
ftps = FTP_TLS(host='xxx.xxx.xxx.xxx',
               context=pfx)
ftps.login()
ftps.prot_p()
# ftps.prot_c()
print(ftps.retrlines('LIST'))
ftps.quit()
Error:
ftplib.error_perm: 534 Local policy on server does not allow TLS secure connections.
Any Ideas?
Cheers
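Two issues in the update above, independent of the server's 534 response: pfx_to_pem is a generator, so context=pfx hands ftplib a generator object rather than an ssl.SSLContext, and the temporary .pem is deleted as soon as the with block exits. A sketch of the usual shape, assuming the .pfx has been converted to a PEM file up front (e.g. with openssl pkcs12 -in client.pfx -out client.pem -nodes; host and paths are placeholders):

```python
import ssl
from ftplib import FTP_TLS

def make_ftps(host, pem_path):
    """Open an explicit FTPS connection that presents a client certificate."""
    ctx = ssl.create_default_context()
    ctx.load_cert_chain(pem_path)  # client cert + key from the converted .pfx
    return FTP_TLS(host=host, context=ctx)

# ftps = make_ftps("xxx.xxx.xxx.xxx", "/path/to/client.pem")
# ftps.login()            # anonymous login
# ftps.prot_p()           # switch the data channel to TLS
# ftps.retrlines("LIST")
# ftps.quit()
```

This alone won't cure a 534 "local policy" rejection, which is decided server-side, but it is the standard way to hand ftplib a client certificate.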
It sounds like you are trying to do SFTP. FTP over SSL (FTPS) is not the same as SFTP. As far as I know, SFTP (which is related to SSH) is not possible with the standard library.
See this for more about SFTP in Python: SFTP in Python? (platform independent)

Getting a WCF service (both host/client) to work on https on Linux with Mono

I have a small test console application that serves as the WCF host and another console application that serves as the client.
The client can reach the host via http, everything works fine so far.
But when switching to https, I get the following error:
Error: System.Net.WebException: Error: SendFailure (Error writing headers) --->
System.Net.WebException: Error writing headers --->
System.IO.IOException: The authentication or decryption has failed. --->
Mono.Security.Protocol.Tls.TlsException: The authentication or decryption has failed.
...
The steps I have attempted so far to solve the issue:
I have verified that the ca-certificates-mono package is installed.
I have imported the CA certs to the machine store with (why do I need this if I work with a self-signed cert?):
sudo mozroots --import --machine --sync
I created a self-signed cert for testing with (as described in the Mono Security FAQ):
makecert -r -eku 1.3.6.1.5.5.7.3.1 -n "CN=Cert4SSL" -sv cert.pvk cert.cer
I added it to the Mono cert store:
sudo certmgr -add -c -m Trust cert.cer
I have also done tests with other stores (Root, My), and also using not the machine's but the user's store; none worked, with the same error on each attempt.
I assigned the port my service uses to the cert:
httpcfg -add -port 6067 -cert cert.cer -pvk cert.pvk
I added ignoring of certificate validation:
ServicePointManager.ServerCertificateValidationCallback += (o, certificate, chain, errors) => true;
This did not help either (but it got called, and the cert object looked all right in the debugger).
The client uses this code to call the WebService:
IService svcClient2 = null;
string address2 = "https://localhost:6067/TestService";
BasicHttpBinding httpBinding2 = new BasicHttpBinding();
httpBinding2.TransferMode = TransferMode.Buffered;
httpBinding2.Security.Mode = BasicHttpSecurityMode.Transport;
httpBinding2.Security.Transport.ClientCredentialType = HttpClientCredentialType.None;
httpBinding2.MessageEncoding = WSMessageEncoding.Text;
httpBinding2.UseDefaultWebProxy = true;
ChannelFactory<IService> channelFac2 = new ChannelFactory<IService>( httpBinding2, new EndpointAddress( address2 ) );
svcClient2 = channelFac2.CreateChannel();
string res2 = svcClient2.TestHello( "Bob" ); // <----- this is where I get the exception
Any help is appreciated; I feel like I'm running in circles.
A few notes about the environment:
I am using Ubuntu 14.04 LTS and Mono 4.0.2; the IDE is MonoDevelop.
Edit: I have now built the very same projects with Visual Studio and C#; there it works as expected. The client can connect to the host on both http and https.
If I copy the Mono version over to my Windows machine, I run into the same issue and error message as on Ubuntu.
Could this be a Mono-related issue?
