Python ldap3 lib unable to use Tls unknown error - python-3.x

I currently have working code in Linux which does an ldapsearch. Here is the code:
export LDAPTLS_CACERT=ldap.pem
ldapsearch -D 'CN=admuser,OU=Service Accounts,DC=InfoDir,DC=Dev,DC=AAA' -H ldaps://server:1234 -b OU=People,DC=InfoDir,DC=Dev,DC=AAA -W "(cn=staffid)"
This prompts for the admuser bind password and returns results. I need to do the same in Python. Here is my Python code:
from ldap3 import Server, Connection, Tls
import ssl
cacertfile='ldap.pem'
server = Server(host='hostname',port=1234)
conn = Connection(server, 'CN=admuser,OU=Service Accounts,DC=InfoDir,DC=Dev,DC=AAA', 'password', auto_bind=True)
print(conn)
This works and gives me default SSL in the conn object because the LDAP URL uses ldaps. However, it does not validate the server certificate, which is not safe. Hence I further updated my code to force TLS. Here is the code:
from ldap3 import Server, Connection, Tls
import ssl
cacertfile='ldap.pem'
tls_conf = Tls(validate=ssl.CERT_REQUIRED, ca_certs_file=cacertfile)
server = Server(host='hostname', port=1234, tls=tls_conf)
conn = Connection(server, 'CN=admuser,OU=Service Accounts,DC=InfoDir,DC=Dev,DC=AAA', 'password', auto_bind=True)
print(conn)
When I run this I get an exception:
raise LDAPSocketOpenError('unable to open socket', exception_history)
ldap3.core.exceptions.LDAPSocketOpenError: ('unable to open socket', ... 'socket ssl wrapping error: unknown error (_ssl.c:3517...)'
I am following this link for my code: https://ldap3.readthedocs.io/en/latest/tutorial_intro.html
Please advise.
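For reference, here is a minimal sketch of the shape I would expect the TLS setup to take, assuming the server speaks LDAPS directly on port 1234 and that ldap.pem contains the issuing CA (hostname, bind DN, and password are placeholders):

import ssl
from ldap3 import Server, Connection, Tls

tls_conf = Tls(validate=ssl.CERT_REQUIRED, ca_certs_file='ldap.pem')
# use_ssl=True tells ldap3 to wrap the socket in TLS immediately, as an ldaps:// URL would
server = Server(host='hostname', port=1234, use_ssl=True, tls=tls_conf)
conn = Connection(server,
                  'CN=admuser,OU=Service Accounts,DC=InfoDir,DC=Dev,DC=AAA',
                  'password',
                  auto_bind=True)
print(conn)

Note that if the name in the server certificate does not match 'hostname', CERT_REQUIRED validation can also fail, which may surface as a socket wrapping error.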

Related

Python requests equivalent of '--proxy-header' in curl with SSL certification

Reference: How does one specify the equivalent of `--proxy-headers` curl argument into requests?
I am a newbie vis-à-vis Python.
I have a requirement where a request to a destination (webpage) must go through a proxy server.
I need to pass headers to the proxy server (same as --proxy-header of curl).
I also need to add an SSL certificate (a '.cer' file) so that the proxy server (a 'Man in the Middle' scenario) can read the passed headers on CONNECT.
The curl equivalent of my requirement is as follows:
curl -k --verbose --cacert /proxy/cert/folder/proxy-certificate.cer --proxy-header "header1: value1" --proxy 'http://localhost:8080/' 'https://destination.com'
I did come across a similar example: How does one specify the equivalent of `--proxy-headers` curl argument into requests? But I am unsure how to incorporate this with an SSL certificate.
My Code:
import requests

proxyheaders = { 'http://localhost:9090/': { 'header1': 'value1' } }

class ProxyHeaderAwareHTTPAdapter(requests.adapters.HTTPAdapter):
    def proxy_headers(self, proxy):
        if proxy in proxyheaders:
            return proxyheaders[proxy]
        else:
            return None

s = requests.Session()
s.mount('http://', ProxyHeaderAwareHTTPAdapter())
s.mount('https://', ProxyHeaderAwareHTTPAdapter())

URL = "https://stackoverflow.com/"
cert_file_path = "/Path/to/certificate/proxy-certificate.cer"

try:
    s.get(URL, verify=cert_file_path)
except Exception as e:
    print(e)
I get the following error:
HTTPSConnectionPool(host='stackoverflow.com', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1056)')))
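For what it's worth, a minimal sketch of combining the adapter with the proxy and the certificate; the proxy URL, header, and certificate path below are placeholders, and requests expects the verify file to be PEM-encoded, so a DER-format .cer may need converting first:

import requests

class ProxyHeaderAwareHTTPAdapter(requests.adapters.HTTPAdapter):
    def proxy_headers(self, proxy):
        # Extra headers sent to the proxy on the CONNECT request.
        return {'header1': 'value1'}

s = requests.Session()
s.mount('https://', ProxyHeaderAwareHTTPAdapter())
s.proxies = {'https': 'http://localhost:8080/'}          # route traffic through the proxy
s.verify = '/Path/to/certificate/proxy-certificate.cer'  # trust the MITM proxy's CA (PEM)

r = s.get('https://destination.com')
print(r.status_code)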
THIS IS NOT A SOLUTION:
When I encounter certificate/verification errors, I usually just force it to not verify the certificate using the command below:
conda config --set ssl_verify false
Note that this is not usually recommended, and I only do it temporarily until I finish running a specific script or downloading a library. If you want to try this and it works for you, remember to turn verification back on once done using the command below:
conda config --set ssl_verify true

Python 3: Authenticate using requests-kerberos package

I'm trying to authenticate to a server using the requests_kerberos package, following the instructions here:
https://github.com/requests/requests-kerberos
import requests
from requests_kerberos import HTTPKerberosAuth
kerberos_auth = HTTPKerberosAuth()
r = requests.get(<myserver>, auth=kerberos_auth)
r.text
And here is the response:
'Apache Tomcat/6.0.53 - Error report HTTP Status 401 - Authentication requiredtype Status reportmessage Authentication requireddescription This request requires HTTP authentication.Apache Tomcat/6.0.53'
klist shows that I have a valid TGT.
I have tried setting the principal directly, but that didn't help. I can authenticate using curl:
curl -i -L --negotiate -u : "<server>"
I'm not sure what else to try; everything is happening "behind the scenes" so I don't know what I'm doing wrong.
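One avenue that might be worth trying is relaxing mutual authentication and forcing a preemptive Negotiate header, which is roughly what curl --negotiate does on its retry; whether these options apply to this particular server is an assumption, and the URL below is a placeholder:

import requests
from requests_kerberos import HTTPKerberosAuth, OPTIONAL

# force_preemptive sends the Negotiate token on the first request instead of
# waiting for the 401 challenge; OPTIONAL tolerates servers that do not
# authenticate themselves back to the client.
kerberos_auth = HTTPKerberosAuth(mutual_authentication=OPTIONAL, force_preemptive=True)
r = requests.get("https://myserver.example.com/", auth=kerberos_auth)  # placeholder URL
print(r.status_code)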

Error psycopg2.OperationalError: fe_sendauth: no password supplied even after postgres authorized connection

My Python script is raising a 'psycopg2.OperationalError: fe_sendauth: no password supplied' error, even though the Postgres server is authorizing the connection.
I am using Python 3.5, psycopg2, Postgres 9.5, and the password is stored in a .pgpass file. The script is part of a RESTful Flask application, using flask-restful. The script is running on the same host as the Postgres server.
I am calling the connect function as follows:
conn_admin = psycopg2.connect("dbname=database user=username")
When I execute the script I get the following stack trace:
File "/var/www/flask/content_provider.py", line 84, in get_report
conn_admin = psycopg2.connect("dbname=database user=username")
File "/usr/local/lib/python3.5/dist-packages/psycopg2/__init__.py", line 130, in connect
conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
psycopg2.OperationalError: fe_sendauth: no password supplied
However, when I look at the Postgres server log I see the following (I enabled the logger to also show all connection requests):
2019-01-04 18:28:35 SAST [17736-2] username#database LOG: connection authorized: user=username database=database
This code runs fine on my development PC; however, when I put it onto the Ubuntu server, I start getting this problem.
To try and find the issue, I have hard-coded the password into the connection string, but I still get the same error.
If I execute the above line directly into my Python terminal on the host, it works fine, with and without the password in the connection string.
EDIT:
One thing I did notice is that on my desktop I use Python 3.6.2, while on the server I use Python 3.5.2.
Try adding the host:
conn_admin = psycopg2.connect("dbname=database user=username host=localhost")
Try adding the password, i.e.:
conn = psycopg2.connect("dbname=database user=username host=localhost password=password")
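For context, libpq only reads .pgpass when the host, port, database, and user of the connection all match a line in the file, so being explicit can help; a minimal sketch assuming the default port 5432 and the names used above:

import psycopg2

# ~/.pgpass (must be chmod 0600) would need a matching line, for example:
# localhost:5432:database:username:secret
conn_admin = psycopg2.connect("dbname=database user=username host=localhost port=5432")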

Connecting to FTPS with Python

I am trying to connect to an FTPS server which requires anonymous login with a .pfx certificate.
I have been given instructions for how to access it through the GUI application SmartFTP, which does work, so I know I haven't got any firewall issues etc. However, for this workflow, getting access to it through Python would be ideal. Below are the settings I have been given:
Protocol: FTPS (Explicit)
Host: xxx.xxx.xxx.xxx
Port: 21
login type: Anonymous
Client Certificate: Enabled (providing a .pfx file)
Send FEAT: Send before and after login
I am having trouble picking the Python module best suited to this, with a full example using a .pfx certificate. Currently I have only tried the standard library's ftplib module, using the code below. Does anyone have a worked example?
from ftplib import FTP_TLS
ftps = FTP_TLS(host='xxx.xxx.xxx.xxx',
keyfile=r"/path/to.pfx"
)
ftps.login()
ftps.prot_p()
ftps.retrlines('LIST')
ftps.quit()
Using the above code I get:
ValueError: certfile must be specified
Client versions:
Ubuntu == 14.04,
Python == 3.6.2
Update
I think I am a little closer with the code below, but I am getting a new error:
from ftplib import FTP_TLS
import tempfile
import OpenSSL.crypto
def pfx_to_pem(pfx_path, pfx_password):
    """ Decrypts the .pfx file to be used with requests. """
    with tempfile.NamedTemporaryFile(suffix='.pem') as t_pem:
        f_pem = open(t_pem.name, 'wb')
        pfx = open(pfx_path, 'rb').read()
        p12 = OpenSSL.crypto.load_pkcs12(pfx, pfx_password)
        f_pem.write(OpenSSL.crypto.dump_privatekey(OpenSSL.crypto.FILETYPE_PEM, p12.get_privatekey()))
        f_pem.write(OpenSSL.crypto.dump_certificate(OpenSSL.crypto.FILETYPE_PEM, p12.get_certificate()))
        ca = p12.get_ca_certificates()
        if ca is not None:
            for cert in ca:
                f_pem.write(OpenSSL.crypto.dump_certificate(OpenSSL.crypto.FILETYPE_PEM, cert))
        f_pem.close()
        yield t_pem.name

pfx = pfx_to_pem(r"/path/to.pfx", 'password')
ftps = FTP_TLS(host='xxx.xxx.xxx.xxx', context=pfx)
ftps.login()
ftps.prot_p()
# ftps.prot_c()
print(ftps.retrlines('LIST'))
ftps.quit()
Error:
ftplib.error_perm: 534 Local policy on server does not allow TLS secure connections.
Any Ideas?
Cheers
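In case it helps to narrow things down, FTP_TLS expects an ssl.SSLContext for context=, not a generator; a minimal sketch of loading the extracted PEM into a context (the paths are placeholders, and a 534 reply is ultimately server policy, which a client-side change may not overcome):

import ssl
from ftplib import FTP_TLS

# Placeholders: client.pem holds the key and cert dumped from the .pfx,
# ca.pem is the CA used to verify the server.
ctx = ssl.create_default_context(cafile='/path/to/ca.pem')
ctx.load_cert_chain('/path/to/client.pem')

ftps = FTP_TLS(host='xxx.xxx.xxx.xxx', context=ctx)
ftps.login()       # anonymous login
ftps.prot_p()      # protect the data channel with TLS
print(ftps.retrlines('LIST'))
ftps.quit()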
It sounds like you are trying to do SFTP. FTP over SSL is not the same as SFTP. As far as I know, SFTP (which is related to SSH) is not possible with the standard library.
See this for more about SFTP in Python: SFTP in Python? (platform independent)

Getting a WCF service (both host/client) to work on https on Linux with Mono

I have a small test console application that serves as the WCF host and another console application that serves as the client.
The client can reach the host via http, everything works fine so far.
But when switching to https, I get the following error:
Error: System.Net.WebException: Error: SendFailure (Error writing headers) --->
System.Net.WebException: Error writing headers --->
System.IO.IOException: The authentication or decryption has failed. --->
Mono.Security.Protocol.Tls.TlsException: The authentication or decryption has failed.
...
The steps I have attempted so far to solve the issue:
I have verified that the ca-certificates-mono package is installed
I have imported the CA certs into the machine store with (why do I need this if I work with a self-signed cert?)
sudo mozroots --import --machine --sync
I created a self-signed cert for testing with (as described in the Mono Security FAQ)
makecert -r -eku 1.3.6.1.5.5.7.3.1 -n "CN=Cert4SSL" -sv cert.pvk cert.cer
I added it to the mono cert store
sudo certmgr -add -c -m Trust cert.cer
I have also done tests with other stores (Root, My) and also using not the machine but the user's store; none worked, with the same error on each attempt.
I assigned the port my service uses to the cert:
httpcfg -add -port 6067 -cert cert.cer -pvk cert.pvk
I added a callback to ignore certificate validation:
ServicePointManager.ServerCertificateValidationCallback += (o, certificate, chain, errors) => true;
This did not help either (but it got called, and the cert object looked all right in the debugger).
The client uses this code to call the WebService:
IService svcClient2 = null;
string address2 = "https://localhost:6067/TestService";
BasicHttpBinding httpBinding2 = new BasicHttpBinding();
httpBinding2.TransferMode = TransferMode.Buffered;
httpBinding2.Security.Mode = BasicHttpSecurityMode.Transport;
httpBinding2.Security.Transport.ClientCredentialType = HttpClientCredentialType.None;
httpBinding2.MessageEncoding = WSMessageEncoding.Text;
httpBinding2.UseDefaultWebProxy = true;
ChannelFactory<IService> channelFac2 = new ChannelFactory<IService>( httpBinding2, new EndpointAddress( address2 ) );
svcClient2 = channelFac2.CreateChannel();
string res2 = svcClient2.TestHello( "Bob" ); // <----- this is where I get the exception
Any help is appreciated; I feel like I am running in circles.
A few infos about the environment:
I am using Ubuntu 14.04 LTS and Mono 4.0.2, IDE is MonoDevelop
Edit: I have now built the very same projects with Visual Studio and C#; there it works as expected. The client can connect to the host on both http and https.
If I copy the Mono build over to my Windows machine, I run into the same issue and error message as on Ubuntu.
Could this be a Mono-related issue?

Resources