Trouble passing HTTPS communication through mitmproxy - python-3.x

I'm trying to debug an HTTPS connection using mitmproxy and Python 3. The idea is the following: the proxy runs locally, and I tell Python to use it for an HTTPS connection; Python must accept the self-signed certificate created by mitmproxy for this to work, and if all goes well, the mitmproxy console will show me the decoded request/response pair.
I am able to do what I want with Python 3 and requests, but, alas, I have to do it with the standard urllib, and there I'm failing. Here is the code:
#!/usr/bin/env python3
import urllib.request
import ssl
proxy = urllib.request.ProxyHandler({'https': 'localhost:8080'})
opener = urllib.request.build_opener(proxy)
urllib.request.install_opener(opener)
ssl_ctx = ssl.create_default_context()
ssl_ctx.check_hostname = False
ssl_ctx.verify_mode = ssl.CERT_NONE
#ssl_ctx = ssl._create_unverified_context()
req = urllib.request.Request('https://github.com')
with urllib.request.urlopen(req) as res:
#with urllib.request.urlopen(req, context=ssl_ctx) as res:
    print(res.read().decode('utf8'))
When the above code is executed, I get a
ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)
error, which means the proxy is used, but its certificate doesn't validate. This is exactly what I expect to happen at this point.
When I use the commented-out line to pass the ssl_ctx (created either way), I get the contents of the page I request, but the proxy console never shows the decoded request information; it stays blank. It is as though using the SSL context circumvents the proxy altogether.
Could someone please help me out and tell me what I'm doing wrong?
EDIT: just to make sure, I changed the proxy port to 8081, and now the code not using the ssl_ctx variable fails with 'connection refused' (as expected), while the code using ssl_ctx works fine, which confirms my suspicion that the proxy is not used at all.

Thanks for asking this and posting the code you tried. I'm not sure why the documentation claims that fetching HTTPS through a proxy is not supported, because it does work.
Instead of explicitly passing the SSL context, I created and installed an HTTPSHandler in addition to the ProxyHandler. This worked for me (Python 3.5 + mitmproxy):
import urllib.request
import ssl

ssl_ctx = ssl.create_default_context()
ssl_ctx.check_hostname = False
ssl_ctx.verify_mode = ssl.CERT_NONE

ssl_handler = urllib.request.HTTPSHandler(context=ssl_ctx)
proxy_handler = urllib.request.ProxyHandler({'https': 'localhost:8080'})
opener = urllib.request.build_opener(ssl_handler, proxy_handler)
urllib.request.install_opener(opener)

if __name__ == '__main__':
    req = urllib.request.Request('https://...')
    with urllib.request.urlopen(req) as res:
        print(res.read().decode('utf8'))
Once this opener is installed, it's used as the default for all requests using urllib, even in third-party libraries -- which is what I needed.
As for why your commented-out line didn't work: when urlopen is given a context (or cafile/capath), it builds a fresh opener containing just an HTTPSHandler for that one call and uses it instead of the installed opener, so the ProxyHandler is never consulted.

Related

Using socks5 proxies with smtplib returns Connection not allowed by ruleset

I'm trying to use the smtplib Python library in combination with proxies. The way I tried is to use the PySocks library, like so:
socks.setdefaultproxy(socks.PROXY_TYPE_SOCKS5, 'hostname', port, True, 'login','password')
socks.wrapmodule(smtplib)
smtp = smtplib.SMTP('smtp.server.com', 465)
However, after the authorization, it returns this error:
socks.GeneralProxyError: Socket error: 0x02: Connection not allowed by ruleset
The proxies are working as otherwise it'd return this error:
socks.GeneralProxyError: Socket error: SOCKS5 authentication failed
Is there anything I'm missing, or is it the proxy's issue? I've already contacted their sysadmin to see if it's truly on my side.
Or is there any other way to connect the current Python script to a proxy and tunnel all of its requests through it?
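No answer was posted here, but one alternative to socks.wrapmodule (which patches smtplib globally) is to override the socket factory of a single SMTP instance, so only that one connection is proxied. This is a sketch under the assumption that PySocks is installed; ProxySMTP and the proxy parameters are hypothetical names:

```python
import smtplib

class ProxySMTP(smtplib.SMTP):
    """smtplib.SMTP whose TCP connection is opened through a SOCKS5 proxy.

    Only this class is affected; the rest of the process keeps using
    plain sockets (no socks.wrapmodule / monkey-patching needed).
    """

    def __init__(self, host, port, proxy_addr, proxy_port,
                 proxy_user=None, proxy_pass=None, **kwargs):
        # Stash proxy settings first: SMTP.__init__ connects immediately.
        self._proxy = (proxy_addr, proxy_port, proxy_user, proxy_pass)
        super().__init__(host, port, **kwargs)

    def _get_socket(self, host, port, timeout):
        import socks  # PySocks; imported lazily, only when connecting
        addr, pport, user, password = self._proxy
        s = socks.socksocket()
        # rdns=True resolves the SMTP hostname on the proxy side
        s.set_proxy(socks.SOCKS5, addr, pport, rdns=True,
                    username=user, password=password)
        if isinstance(timeout, (int, float)):
            s.settimeout(timeout)
        s.connect((host, port))
        return s

# Usage (hypothetical endpoints):
# smtp = ProxySMTP('smtp.server.com', 587, 'proxyhost', 1080,
#                  'login', 'password')
```

Note also that port 465 conventionally speaks implicit TLS; smtplib.SMTP expects a plaintext greeting there, so smtplib.SMTP_SSL (with the same _get_socket idea, plus wrapping the returned socket in TLS) is usually what's needed on 465.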

Sending request body to fastapi server while debugging

I have a simple FastAPI server which is supposed to accept a request body consisting of a single string. It should pass that string to another module, get the return value, and send that back to the client.
Things on the client side seem to be working fine, but for some reason the call from the server to the other module isn't working. I suspect that I am not actually sending to the module what I think I am, and that I have to modify the call in some way. I want to stop it in the debugger so I can see exactly what the request and response objects look like.
I know how to set it up to debug the server, but not how to then send it a request body while in the debugger.
The server looks like this, with a main block added so I can debug it directly:
import uvicorn
from fastapi import FastAPI
from pydantic import BaseModel
from annotate import AnnotationModule
from fastapi.testclient import TestClient

app = FastAPI()

class Sentence(BaseModel):
    text: str

print('Spinning up resources.')
am = AnnotationModule(wsd_only=True)
print('Everything loaded. Ready.')

@app.post("/annotate/")
async def read_item(sentence: Sentence):
    return am.annotate(sentence.text)

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)
I can set a breakpoint and run this, but how do I send it a request body so I can see it working?
I tried using this client code, but it didn't work:
import requests

# API endpoint
URL = "http://0.0.0.0:8000/annotate"
PARAMS = {"text": "This is a test sentence."}

# sending the POST request and saving the response
annotation_dict = requests.post(url=URL, json=PARAMS)
Running this while the server is sitting in the debugger, I get:
requests.exceptions.ConnectionError: HTTPConnectionPool(host='0.0.0.0', port=8000): Max retries exceeded with url: /annotate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fd926ed5f90>: Failed to establish a new connection: [Errno 61] Connection refused'))
So, how can I debug it while observing it take and process a request? If it matters, I am doing this in VS Code.

Fetch a cert using python3 from a server that requires client auth without authenticating?

Is it possible to only fetch a cert using python3 from a server that requires client auth without authenticating?
There's a server in my environment requiring a client ssl cert to authenticate. If you don't have one, when you connect with openssl you'll be able to fetch the certificate but get an error: Post with details. The error occurs late in the handshake.
If I try with Python, using the ssl module:
import socket
import ssl

hostname = 'serverwithclientauth'
# PROTOCOL_TLS_CLIENT requires a valid cert chain and hostname
context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
context.load_verify_locations('path/to/cabundle.pem')

with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as ssock:
        print(ssock.version())
(which is basically copy/paste from Python.org), wrap_socket fails. I am wondering if the SSL sequence can be broken down into smaller chunks such that I can terminate after the server sends its certificate to the client.
I have tried just going for:
ssl.get_server_certificate((host, port))
but even that fails for this host (other hosts without the client-auth requirement work fine!).
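No answer was posted, but one workaround worth sketching (an assumption on my part, not from the thread): pyOpenSSL keeps the peer certificate on the connection object even when the handshake is later aborted, so you can attempt the handshake, swallow the failure, and read the certificate back. Assumes pyOpenSSL (`pip install pyopenssl`) is available; fetch_server_cert is a hypothetical helper name:

```python
import socket
from OpenSSL import SSL, crypto  # third-party: pyOpenSSL

def fetch_server_cert(host, port, timeout=5.0):
    """Return the server's certificate as PEM, even if the server later
    rejects the handshake (e.g. because it demands a client cert)."""
    ctx = SSL.Context(SSL.TLS_METHOD)
    sock = socket.create_connection((host, port), timeout=timeout)
    sock.settimeout(None)  # pyOpenSSL handshakes are simplest on a blocking socket
    try:
        conn = SSL.Connection(ctx, sock)
        conn.set_tlsext_host_name(host.encode())  # SNI
        conn.set_connect_state()
        try:
            conn.do_handshake()
        except SSL.Error:
            pass  # handshake rejected late, but the cert may already be stored
        cert = conn.get_peer_certificate()
        if cert is None:
            return None
        return crypto.dump_certificate(crypto.FILETYPE_PEM, cert).decode()
    finally:
        sock.close()
```

This works because the server sends its certificate before it discovers the client has none to offer, so by the time the handshake fails the certificate has already been received and parsed.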

Can't get SSL to work for a Secure Websocket connection

I'm new to Python but have to build a WebSocket API for work.
The website of the websockets module says that this code should work for secure websocket connections (https://websockets.readthedocs.io/en/stable/intro.html).
However, I cannot get the provided code to work.
import websockets
import asyncio
import pathlib
import ssl
ssl_context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ssl_context.load_verify_locations(pathlib.Path(__file__).with_name('localhost.pem'))
I get the error:
Traceback (most recent call last):
  File "/Applications/Python 3.7/apii.py", line 7, in <module>
    ssl_context.load_verify_locations(pathlib.Path(__file__).with_name('localhost.pem'))
FileNotFoundError: [Errno 2] No such file or directory
Could you help me out?
PS. I do not get the idea of this ssl_context code at all. Could somebody explain the logic behind it, please?

I test websockets from Docker containers, where I have NAT IPs (so I can't use Let's Encrypt) and no wildcard domain certificate. If I test directly in Firefox, it works fine without any certificate, so disabling SSL verification for testing purposes inside a firewall should be allowed.
Extract:
import asyncio
import websockets
import ssl
:
ssl_context = ssl.create_default_context()
ssl_context.check_hostname = False
ssl_context.verify_mode = ssl.CERT_NONE

# without passing the context (fails on a self-signed cert):
async def logIn(uri, myjson):
    async with websockets.connect(uri) as websocket:
        await websocket.send(myjson)

# with the non-verifying context:
async def logIn(uri, myjson):
    async with websockets.connect(uri, ssl=ssl_context) as websocket:
        await websocket.send(myjson)
        resp = await websocket.recv()
        print(resp)

myurl = f"wss://{args.server}:{args.port}"
print("connection url: {}".format(myurl))

# logdef is the JSON string I send in
asyncio.get_event_loop().run_until_complete(
    logIn(myurl, logdef)
)
Late answer but maybe still interesting for others.
The documentation for this specific example says:
This client needs a context because the server uses a self-signed certificate. A client connecting to a secure WebSocket server with a valid certificate (i.e. signed by a CA that your Python installation trusts) can simply pass ssl=True to connect() instead of building a context.
So if the server you want to access has a valid certificate, just do the following:
uri = "wss://server_you_want_to_connect_to_endpoint"
async with websockets.connect(uri, ssl=True) as websocket:
    # and so on ...
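Regarding the "PS" in the question above: an SSL context just bundles the client's TLS policy, i.e. which CAs to trust and whether to verify the certificate and hostname. A minimal sketch contrasting the two kinds of client context (nothing connects here):

```python
import ssl

# Verifying context: trusts the system CA store and checks the hostname.
# This is effectively what ssl=True gives you in websockets.connect()
# when the server has a CA-signed certificate.
verifying = ssl.create_default_context()
print(verifying.verify_mode == ssl.CERT_REQUIRED)  # True
print(verifying.check_hostname)                    # True

# Non-verifying context: accepts any certificate, including self-signed.
# check_hostname must be disabled before verify_mode can be CERT_NONE.
insecure = ssl.create_default_context()
insecure.check_hostname = False
insecure.verify_mode = ssl.CERT_NONE
```

The question's own snippet shows the middle ground: keep verification on but trust the server's specific self-signed certificate via load_verify_locations; the FileNotFoundError there just means localhost.pem isn't sitting next to the script.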

How to only send certain requests with Tor in python?

Right now I am using the following code to route my Python requests through Tor:
socks.set_default_proxy(socks.SOCKS5, "127.0.0.1", 9450)
socket.socket = socks.socksocket
I put this at the front of my code and then start sending requests, and all of them go through Tor.
In my code I need to send a lot of requests, but only one of them actually needs to go through Tor; the rest don't.
Is there any way I can configure my code so that I can choose which requests are sent through Tor and which are sent directly?
Thanks
Instead of monkey-patching sockets, you can use requests (with SOCKS support installed, e.g. pip install requests[socks]) for just the Tor request.
import requests

proxies = {
    'http': 'socks5h://127.0.0.1:9050',
    'https': 'socks5h://127.0.0.1:9050',
}
data = requests.get("http://example.com", proxies=proxies).text
Or, if you must, save the socket.socket class before replacing it with the SOCKS socket, so you can restore it when you're done using Tor.
socks.set_default_proxy(socks.SOCKS5, "127.0.0.1", 9450)
default_socket = socket.socket
socket.socket = socks.socksocket
# do stuff with Tor
socket.socket = default_socket
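Building on the answer above, the same idea can be packaged as two requests.Session objects, so each call site simply picks a session instead of toggling global state. The Tor port below mirrors the question's 9450 and is an assumption about your local setup:

```python
import requests

# One session routed through Tor (needs `pip install requests[socks]`) ...
tor = requests.Session()
tor.proxies = {
    "http": "socks5h://127.0.0.1:9450",
    "https": "socks5h://127.0.0.1:9450",
}

# ... and one that connects directly.
direct = requests.Session()

# Pick per request:
# direct.get("https://example.com")            # normal route
# tor.get("https://check.torproject.org")      # through Tor
```

Sessions also reuse connections, which matters when "a lot of requests" go to the same hosts.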
