Connecting to FTPS with Python - ubuntu-14.04

I am trying to connect to an FTPS server which requires anonymous login with a .pfx certificate.
I have been given instructions for how to access it through the GUI application SmartFTP, and that does work, so I know I don't have any firewall issues etc. However, for this workflow, access through Python would be ideal. Below are the settings I have been given:
Protocol: FTPS (Explicit)
Host: xxx.xxx.xxx.xxx
Port: 21
Login type: Anonymous
Client Certificate: Enabled (providing a .pfx file)
Send FEAT: Send before and after login
I am having trouble picking the Python module best suited to this, and I cannot find a full example using a .pfx certificate. So far I have only tried the standard ftplib module, using the code below. Does anyone have a worked example?
from ftplib import FTP_TLS
ftps = FTP_TLS(host='xxx.xxx.xxx.xxx',
               keyfile=r"/path/to.pfx")
ftps.login()
ftps.prot_p()
ftps.retrlines('LIST')
ftps.quit()
Using the above code I get:
ValueError: certfile must be specified
Client versions:
Ubuntu == 14.04,
Python == 3.6.2
Update
I think I am a little closer with the code below, but I am getting a new error:
from ftplib import FTP_TLS
import tempfile
import OpenSSL.crypto
def pfx_to_pem(pfx_path, pfx_password):
    """ Decrypts the .pfx file to be used with requests. """
    with tempfile.NamedTemporaryFile(suffix='.pem') as t_pem:
        f_pem = open(t_pem.name, 'wb')
        pfx = open(pfx_path, 'rb').read()
        p12 = OpenSSL.crypto.load_pkcs12(pfx, pfx_password)
        f_pem.write(OpenSSL.crypto.dump_privatekey(OpenSSL.crypto.FILETYPE_PEM, p12.get_privatekey()))
        f_pem.write(OpenSSL.crypto.dump_certificate(OpenSSL.crypto.FILETYPE_PEM, p12.get_certificate()))
        ca = p12.get_ca_certificates()
        if ca is not None:
            for cert in ca:
                f_pem.write(OpenSSL.crypto.dump_certificate(OpenSSL.crypto.FILETYPE_PEM, cert))
        f_pem.close()
        yield t_pem.name
pfx = pfx_to_pem(r"/path/to.pfx", 'password')
ftps = FTP_TLS(host='xxx.xxx.xxx.xxx',
               context=pfx)
ftps.login()
ftps.prot_p()
# ftps.prot_c()
print(ftps.retrlines('LIST'))
ftps.quit()
Error:
ftplib.error_perm: 534 Local policy on server does not allow TLS secure connections.
Any ideas?
Cheers

It sounds like you are trying to do SFTP. FTP over SSL (FTPS) is not the same as SFTP. As far as I know, SFTP (which is related to SSH) is not possible with the standard library.
See this for more about SFTP in Python: SFTP in Python? (platform independent)
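If the server really is speaking explicit FTPS (as the SmartFTP settings above suggest) rather than SFTP, ftplib can handle it; what it cannot take is a .pfx file directly. Below is a minimal sketch, assuming the .pfx has already been converted to a combined PEM file (private key plus certificate, hypothetically named client.pem, e.g. via the pfx_to_pem approach from the update):
import ssl
from ftplib import FTP_TLS

# Build a context that presents the client certificate; 'client.pem' is an
# assumed file name holding the key and cert extracted from the .pfx
ctx = ssl.create_default_context()
ctx.load_cert_chain('client.pem')

ftps = FTP_TLS(host='xxx.xxx.xxx.xxx', context=ctx)  # explicit FTPS on port 21
ftps.login()            # anonymous login, per the server settings
ftps.prot_p()           # switch the data channel to TLS
ftps.retrlines('LIST')
ftps.quit()
Note that the 534 reply in the update comes from the server refusing TLS at its end, so even a correct context may still require a policy change on the server side.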

Related

Python ldap3 lib unable to use Tls unknown error

I currently have working code in Linux which does an ldapsearch. Here is the code:
export LDAPTLS_CACERT=ldap.pem
ldapsearch -D 'CN=admuser,OU=Service Accounts,DC=InfoDir,DC=Dev,DC=AAA' \
    -H ldaps://server:1234 -b OU=People,DC=InfoDir,DC=Dev,DC=AAA -W "(cn=staffid)"
This asks me for the password of the admuser bind user and gives the result. I need to code the same in Python. Here is my Python code:
from ldap3 import Server, Connection, Tls
import ssl

cacertfile = 'ldap.pem'
server = Server(host='hostname', port=1234)
conn = Connection(server, 'CN=admuser,OU=Service Accounts,DC=InfoDir,DC=Dev,DC=AAA',
                  'password', auto_bind=True)
print(conn)
This runs fine and gives me default SSL in the conn object because the LDAP URL has ldaps. However, this does not validate the server certificate, which is not safe. Hence I further updated my code to force TLS. Here is the code:
from ldap3 import Server, Connection, Tls
import ssl

cacertfile = 'ldap.pem'
tls_conf = Tls(validate=ssl.CERT_REQUIRED, ca_certs_file=cacertfile)
server = Server(host='hostname', port=1234, use_ssl=True, tls=tls_conf)
conn = Connection(server, 'CN=admuser,OU=Service Accounts,DC=InfoDir,DC=Dev,DC=AAA',
                  'password', auto_bind=True)
print(conn)
When I run this I get an exception:
raise LDAPSocketOpenError('unable to open socket', exception_history)
ldap3.core.exceptions.LDAPSocketOpenError: ('unable to open socket', ... 'socket ssl wrapping error: unknown error (_ssl.c:3517...)'
I am following this link for my code: https://ldap3.readthedocs.io/en/latest/tutorial_intro.html
Please advise.
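Not an answer to the ldap3 call itself, but one way to isolate the "ssl wrapping" failure is to attempt the same handshake with the standard library alone; a minimal diagnostic sketch, with hostname and port as placeholders from the question:
import socket
import ssl

# Apply the same verification policy the Tls object asks for: require and
# verify the server certificate against the ldap.pem CA bundle
ctx = ssl.create_default_context(cafile='ldap.pem')

with socket.create_connection(('hostname', 1234)) as sock:
    with ctx.wrap_socket(sock, server_hostname='hostname') as tls:
        print(tls.version())                 # negotiated protocol, e.g. 'TLSv1.2'
        print(tls.getpeercert()['subject'])  # verified peer certificate subject
If this fails with a similar error, the problem lies with the certificate or the protocol negotiation rather than with ldap3.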

Python certificate verify failed: unable to get local issuer certificate

I am a novice programmer, so pardon my mistakes. I have written the code below to verify that a list of websites is still active, and all my work is based off this problem statement.
The script is able to check most sites but stumbled with the below error for https://precisionit.net/:
<urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1108)>
The above URL opens fine in Firefox and Chrome but fails to open in the Python code. I have updated certifi and used it in my code as suggested by many folks, but the error would not go away.
I am using Conda Python Env and I also executed the below command
conda install -c conda-forge certifi
There were multiple posts that suggested running "Install Certificates.command", which does not apply to Conda Python, so I downloaded the Python 3.9 installer, executed "Install Certificates.command", and ran the script with Python 3.9, yet no luck. I feel the issue is that even with the latest version of certifi the site's certificate is not validated. Although the certifi page says the list is based off Mozilla's root certificates, I guess it's not an exact replica, which is why Firefox is able to open the site. Not sure if my understanding makes sense; I will be glad to be corrected.
Pasting my script below. I am not sure what else needs to be done to fix the issue, kindly advise.
import urllib.request
import sys
import certifi
import ssl

def checkURL(url):
    try:
        hdr = {'User-Agent': 'Mozilla/79.0 (Windows NT 6.1; Win64; x64)'}
        req = urllib.request.Request(url, headers=hdr)
        r = urllib.request.urlopen(req, timeout=100,
                                   context=ssl.create_default_context(cafile=certifi.where()))
    except Exception as e:
        # print(r.read())
        print('Failed Connecting to Website')
        print(e)
        return(1)
    print(r.status)
    finalurl = r.geturl()
    if r.status == 200:
        print(finalurl)
        return(0)
    else:
        print("Website Not Found")
        return(2)

checkURL('https://precisionit.net/')
I had a similar problem, and this is how I solved it.
First, check who the issuer of the site certificate is. You can do this in many ways (check in the browser, connect using openssl ...).
Easiest is probably to just go to https://www.digicert.com/help/ and search for https://precisionit.net.
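If you'd rather stay in Python than use openssl, the leaf certificate (and hence its issuer) can be fetched like this; a minimal sketch in which verification is deliberately disabled, since the whole point is that verification currently fails:
import socket
import ssl

# Fetch the site's certificate without verifying it, then print it as PEM
# so the issuer can be inspected (e.g. with 'openssl x509 -noout -issuer')
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

with socket.create_connection(('precisionit.net', 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname='precisionit.net') as tls:
        der = tls.getpeercert(binary_form=True)

print(ssl.DER_cert_to_PEM_cert(der))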
You are likely missing Sectigo RSA Domain Validation Secure Server CA. Just go to their site (https://support.sectigo.com/Com_KnowledgeDetailPage?Id=kA01N000000rfBO) and download it.
Then get the location of the cacert.pem file where your certificates are saved with certifi.where(), and simply add the contents of the certificate you downloaded to said file.
The certificate should be of the form:
-----BEGIN CERTIFICATE-----
... some base64 encoded stuff ...
-----END CERTIFICATE-----
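A minimal sketch of that append step, assuming the downloaded intermediate was saved in PEM form under the hypothetical name sectigo_intermediate.crt:
import certifi

# Read the downloaded intermediate certificate (assumed to be PEM-encoded)
with open('sectigo_intermediate.crt') as f:
    intermediate = f.read()

# Append it to the CA bundle that certifi points urllib at
with open(certifi.where(), 'a') as bundle:
    bundle.write('\n' + intermediate)
Keep in mind that reinstalling or upgrading certifi replaces this bundle, so the edit has to be repeated after an upgrade.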
First: save the site certificate as base64.
Second: add code to verify with your saved file.
import requests

with requests.Session() as s:
    CC_host = 'https://precisionit.net'
    first_page = s.get(CC_host, verify='./theSiteCert.cer')
    html = first_page.text
    print(html)

Unable to use hostkey from registry to connect to EFT server using pySFTP [duplicate]

This question already has answers here:
Verify host key with pysftp
(9 answers)
Closed 5 days ago.
I'm at a total loss with this code. I'm trying to set up a script to transfer files from an EFT server to a local folder on a regular basis. I'm using pySFTP, and by the looks of it the file transfer code is very simple.
However, I can't connect to the EFT server due to issues with the hostkey. I'm not familiar with how hostkeys work, and despite reading a lot about it I'm still not sure I understand it. I've tried this first code, from Martin Prikryl. I've replaced the actual login details with placeholders as this is a work server, so I can't share them here; however, I'm 100% certain I'm using the right ones:
import pysftp
import paramiko
from base64 import decodebytes
keydata = b"""0x11,0xc83f438e85c279c64150c44db874ab091267f38a69843dbdfb0d5b729109b4db64c706af00a68f243740149afa1c3022ebb5435904256229f5820050678361a2880bdb8934d4876c8383d4bd457b74397178880c9ae669645d778510e3ff1dcc1ac91ab43701fda075afaab49a0c3526bd98e848895221791c68b4aa98fe196f2e7ba998a713d48608f38c2699dc8b8d1bd70bedee143a7aad1fd6b2409c77588bc0d9dd2fecae9cf272a0242f2080f5054e78a100a94fda577bdab18ba75676aa999d2dd31c3df56c62cbd6e45aa5bffcb44de2ef129dfe97f6bf6d6d51032fe138950409168c003d3d316588b40ba97b5cc8122d0a323bd809bc3c53074bdb"""
key = paramiko.RSAKey(data=decodebytes(keydata))
cnopts = pysftp.CnOpts()
cnopts.hostkeys.add('HOST', 'rsa2022', key)
host = "HOST"
username = "USER"
password = "PASS"
with pysftp.Connection(host, username, password, cnopts=cnopts) as sftp:
    print("Connection established...")
This fails due to a padding issue. I got this key by using Putty to connect to the server, which saved the key to my registry. I've also tried using the key provided by the server owner:
import pysftp
import paramiko
from base64 import decodebytes
keydata = b"""e1:98:d0:0e:85:f8:51:23:87:fa:24:4b:7e:81:88:e8"""
key = paramiko.RSAKey(data=decodebytes(keydata))
cnopts = pysftp.CnOpts()
cnopts.hostkeys.add('HOST', 'ssh-rsa', key)
host = "HOST"
username = "USER"
password = "PASS"
with pysftp.Connection(host, username, password, cnopts=cnopts) as sftp:
    print("Connection established...")
This gives me an error saying "UnicodeDecodeError: 'utf-8' codec can't decode bytes in position 2-3: invalid continuation byte", which I'm not sure about. I've tried removing the colons, but I get the same error.
I've tried 5 or 6 different solutions, and my biggest issue is that Putty has stored the hostkey in my Registry, and pySFTP or Paramiko (to my knowledge) cannot access it there.
I've looked into creating a known_hosts file myself for pySFTP to reference but can't find any clear way of doing this.
Admittedly this is more advanced Python than I'm used to, but how it wants the hostkey is completely confusing me. If anyone can suggest a solution or an alternative to try, I'm open to all suggestions.
Thank you so much for your help, Martin Prikryl. I used ssh-keyscan (despite my permissions, it let me use this in the cmd prompt) and got the true hostkey. This then worked in your code for verifying hostkeys with pySFTP!
So, to answer my own question, here is my working code, with HOSTKEY being the key gathered through ssh-keyscan in the command prompt:
import pysftp
import paramiko
from base64 import decodebytes
keydata = b"""HOSTKEY"""
key = paramiko.RSAKey(data=decodebytes(keydata))
cnopts = pysftp.CnOpts()
cnopts.hostkeys.add('HOST', 'ssh-rsa', key)
myHost = "HOST"
myUsername = "USER"
myPassword = "PASSWORD"
with pysftp.Connection(host=myHost, username=myUsername, password=myPassword, cnopts=cnopts) as sftp:
    print("Connection established...")
I still can't work out how to create or populate a known_hosts file, but trying the '> known_hosts' command in the cmd prompt gets me an access denied, so it's entirely possible I don't have access to the location where it would be stored anyway. This code will be run on a secure machine only I have access to, so it's not a massive issue.
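For the known_hosts part: paramiko can write one from code, so a file in any user-writable location works; a hedged sketch reusing the placeholder key from the code above (the file name is arbitrary):
import paramiko
import pysftp
from base64 import decodebytes

# Same placeholder key as above, obtained via ssh-keyscan
key = paramiko.RSAKey(data=decodebytes(b"""HOSTKEY"""))

# Build a known_hosts-style file somewhere we are allowed to write
hostkeys = paramiko.HostKeys()
hostkeys.add('HOST', 'ssh-rsa', key)
hostkeys.save('my_known_hosts')

# Point pysftp at that file instead of the default ~/.ssh/known_hosts
cnopts = pysftp.CnOpts(knownhosts='my_known_hosts')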

Getting a WCF service (both host/client) to work on https on Linux with Mono

I have a small test console application that serves as the WCF host and another console application that serves as the client.
The client can reach the host via http, everything works fine so far.
But when switching to https, I get the following error:
Error: System.Net.WebException: Error: SendFailure (Error writing headers) --->
System.Net.WebException: Error writing headers --->
System.IO.IOException: The authentication or decryption has failed. --->
Mono.Security.Protocol.Tls.TlsException: The authentication or decryption has failed.
...
The steps so far I have attempted to solve the issue:
I have verified that the ca-certificates-mono package is installed
I have imported the CA certs to the machine store with (why do I need this if I work with a selfsigned cert?)
sudo mozroots --import --machine --sync
I created a selfsigned cert for testing with (as described in the Mono Security FAQ)
makecert -r -eku 1.3.6.1.5.5.7.3.1 -n "CN=Cert4SSL" -sv cert.pvk cert.cer
I added it to the mono cert store
sudo certmgr -add -c -m Trust cert.cer
I have also done tests with other stores (Root, My), and also using not the machine but the user's store; none worked, with the same error on each attempt.
I assigned the cert to the port my service uses:
httpcfg -add -port 6067 -cert cert.cer -pvk cert.pvk
I added a callback to ignore certificate validation:
ServicePointManager.ServerCertificateValidationCallback += (o, certificate, chain, errors) => true;
This did not help either (but it got called, and the cert object looked all right in the debugger).
The client uses this code to call the WebService:
IService svcClient2 = null;
string address2 = "https://localhost:6067/TestService";
BasicHttpBinding httpBinding2 = new BasicHttpBinding();
httpBinding2.TransferMode = TransferMode.Buffered;
httpBinding2.Security.Mode = BasicHttpSecurityMode.Transport;
httpBinding2.Security.Transport.ClientCredentialType = HttpClientCredentialType.None;
httpBinding2.MessageEncoding = WSMessageEncoding.Text;
httpBinding2.UseDefaultWebProxy = true;
ChannelFactory<IService> channelFac2 = new ChannelFactory<IService>( httpBinding2, new EndpointAddress( address2 ) );
svcClient2 = channelFac2.CreateChannel();
string res2 = svcClient2.TestHello( "Bob" ); // <----- this is where I get the exception
Any help is appreciated; I feel like I'm running in circles.
A few infos about the environment:
I am using Ubuntu 14.04 LTS and Mono 4.0.2, IDE is MonoDevelop
edit: I have now built the very same projects with Visual Studio and C#; there it works as expected. The client can connect to the host over both http and https.
If I copy the Mono version over to my Windows machine, I run into the same issue and error message as on Ubuntu.
Could this be a Mono-related issue?

fabric keeps asking for password using SSH connection

I'm trying to connect to a Windows Azure instance using fabric, but despite configuring an SSH connection to execute commands, fabric keeps asking for a password.
This is my fabric file:
from fabric.api import env, run

def azure1():
    env.hosts = ['host.cloudapp.net:60770']
    env.user = 'adminuser'
    env.key_filename = './azure.key'

def what_is_my_name():
    run('whoami')
I run it as:
fab -f fabfile.py azure1 what_is_my_name
or
fab -k -f fabfile.py -i azure.key -H adminuser@host.cloudapp.net:60770 -p password what_is_my_name
But nothing worked; it keeps asking for the user password despite me entering it correctly.
Executing task 'what_is_my_name'
run: whoami
Login password for 'adminuser':
Login password for 'adminuser':
Login password for 'adminuser':
Login password for 'adminuser':
If I try to connect directly with ssh, it works perfectly.
ssh -i azure.key -p 60770 adminuser@host.cloudapp.net
I've tried the advice given in other questions (q1 q2 q3) but nothing works.
Any idea what I am doing wrong?
Thank you
Finally I found that the problem is due to the public/private key pair generation.
I followed the steps provided in the Windows Azure guide; there the keys are generated using openssl, so the process produces a public key stored in a pem file that you must upload to your instance during the creation process.
The problem is that the private key obtained this way is not correctly recognized by paramiko, so fabric won't work. If you try to open an SSH connection using paramiko from the Python interpreter:
>>> import paramiko, os
>>> paramiko.common.logging.basicConfig(level=paramiko.common.DEBUG)
>>> ssh = paramiko.SSHClient()
>>> ssh.load_host_keys('private_key_file.key') # private key file generated using openssl
>>> ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
>>> ssh.connect("web1.cloudapp.net",port=56317)
Gives me the error:
DEBUG:paramiko.transport:Trying SSH agent key a9d8dd41609191ebeedbe8df768ad8c9
DEBUG:paramiko.transport:userauth is OK
INFO:paramiko.transport:Authentication (publickey) failed.
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File ".. /paramiko/client.py", line 337, in connect
self._auth(username, password, pkey, key_filenames, allow_agent, look_for_keys)
File ".. /paramiko/client.py", line 528, in _auth
raise saved_exception
paramiko.PasswordRequiredException: Private key file is encrypted
This is even though the key file isn't actually encrypted.
To solve this, I created the key pair using openssh and then convert the public key to pem to upload it to azure:
# Create key with openssh
ssh-keygen -t rsa -b 2048 -f private_key_file.key
# extract public key and store as x.509 pem format
openssl req -x509 -days 365 -new -key private_key_file.key -out public_key_file.pem
# upload public_key_file.pem file during instance creation
# check connection to instance
ssh -i private_key_file.key -p 63534 adminweb#host.cloudapp.net
This solved the problem.
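As an extra check (not part of the original answer), the regenerated key can be loaded directly with paramiko before handing it to fabric; if the format is still wrong, this raises the same PasswordRequiredException as above:
import paramiko

# Load the key exactly the way fabric/paramiko will during connection
key = paramiko.RSAKey.from_private_key_file('private_key_file.key')
print('Key loaded OK, size in bits:', key.get_bits())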
To debug fabric's ssh connections, add these lines to your fabfile:
import paramiko, os
paramiko.common.logging.basicConfig(level=paramiko.common.DEBUG)
This will print all of paramiko's debug messages. Paramiko is the ssh library that fabric uses.
Note that since Fabric 1.4 you have to specifically enable using ssh config:
env.use_ssh_config = True
(Note: I'm absolutely certain that my fabfile used to work with Fabric > 1.5 without this option, but it doesn't now that I've upgraded to 1.10.)
