Router blocking response from external networks - security

I'm working on a company network right now, and I've come across a problem where my scripts cannot connect to external networks. I'm just wondering if anyone knows of common network-security practices that might cause this?
For example, I can visit www.example.com in Firefox, but my Python script gets a timeout error when it tries to connect.
These scripts work perfectly fine on another network or if I change the URL to something on the local network.
import urllib.request
f = urllib.request.urlopen('http://www.python.org/')
print(f.read(300))
ANSWER: the browser uses the network's proxy; scripts have to be configured to use that same proxy.
import urllib.request
# Point this at your network's proxy; 127.0.0.1:8080 is only a placeholder host:port.
proxy = urllib.request.ProxyHandler({'http': 'http://127.0.0.1:8080'})
opener = urllib.request.build_opener(proxy)
urllib.request.install_opener(opener)  # every subsequent urlopen() call now goes through the proxy
req = urllib.request.urlopen('http://www.google.com')
print(req.read())

It is very likely that your browser is configured to use a proxy. If that is true, then you will need to augment your Python script with a ProxyHandler (see Proxy with urllib2).
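If you'd rather not hard-code the proxy address, a minimal sketch along these lines (assuming the proxy is exposed through environment variables or, on Windows/macOS, through system settings that urllib can read) lets urllib detect it for you:
import urllib.request

# A sketch: getproxies() reads proxy settings from environment variables,
# or (when none are set) from the Windows registry / macOS System Configuration.
system_proxies = urllib.request.getproxies()
opener = urllib.request.build_opener(urllib.request.ProxyHandler(system_proxies))

with opener.open('http://www.python.org/') as f:
    print(f.read(300))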

Related

WSGIServer not working with https and python3

We have an old package written in Python 2 and are working on upgrading it to Python 3. It's a web app and we are using WSGIServer.
from gevent import pywsgi
from geventwebsocket.handler import WebSocketHandler

def apphandler(request, start_response):
    log.info("ATTENTION: request is {}".format(request))
    # do something

server = pywsgi.WSGIServer(('', port), apphandler, handler_class=WebSocketHandler)
However, when I try to use the web app with https://website, it would show this error:
This site can’t provide a secure connection
website.com sent an invalid response.
ERR_SSL_PROTOCOL_ERROR
And in my server log, I would see things like:
Invalid HTTP method: '\x16\x03\x01\x02\x00\x01\x00\x01ü\x03\x03Èæ\x01\x92É\x16ÁW»P½\x1aBÐa\x83ÆÊa]ãíDp¥¥¥\x12Ç\x82|\x1f E5\x03aAv\x99¢Nª\x93ÅÏ:Ð\x9d÷3£\x80çÍïÌÑÿC\rÏUÄ\x8b\x00 ªª\x13\x01\x13\x02\x13\x03À+À/À,À0̨̩À\x13À\x14\x00\x9c\x00\x9d\x00/\x005\x01\x00\x01\x93ZZ\x00\x00\x00\x00\x005\x003\x00\x000website.com\x00\x17\x00\x00ÿ\x01\x00\x01\x00\x00\n'
When I access the site as http://website.com, it works. And under Python 2, the server responded correctly to https.
I'm guessing it might be an encoding issue, but execution never even reaches the log call in the apphandler function. Does anyone know how I can fix this? Do I need to change the server or the Python 3 encoding setup?
Looks like you are starting up a plain TCP web server sitting on port 443 and then hitting it from a web browser using an https: url.
If you check out the documentation:
https://www.gevent.org/api/gevent.pywsgi.html#gevent.pywsgi.WSGIServer
You need to supply ssl_args in order to get the socket to run under SSL instead of TCP. Just supplying a port number of 443 is not sufficient.
You can find a few examples of how to set ssl_args here:
https://www.programcreek.com/python/example/78007/gevent.pywsgi.WSGIServer
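For illustration, here is a minimal sketch of supplying ssl_args; the certificate and key paths are placeholders you would swap for your own files:
from gevent import pywsgi
from geventwebsocket.handler import WebSocketHandler

def apphandler(environ, start_response):
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'hello over https\n']

# certfile/keyfile are forwarded as ssl_args to the SSL socket wrapper,
# so the server speaks TLS instead of plain TCP on port 443.
server = pywsgi.WSGIServer(
    ('', 443),
    apphandler,
    handler_class=WebSocketHandler,
    certfile='/path/to/server.crt',  # placeholder path
    keyfile='/path/to/server.key',   # placeholder path
)
server.serve_forever()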

Python Requests module - does it use system level (on windows) proxy settings?

Background
I've got an app using the requests module to handle connecting to a remote webserver. This works perfectly, but I want to deploy it within an organisation that uses an enterprise proxy server. The machines in the organisation have the proxy configured at the operating-system level (i.e. the Windows system proxy setting).
I'd prefer to have my app automatically use the already configured OS proxy settings, rather than have to ask them for the info (especially as they use basic authentication, so I'd have to securely store a username/password, not just the proxy host/port).
Question
Does Requests automatically use the operating system's proxy settings if you do not specify a proxy directly yourself?
I couldn't find a definitive answer to this after reading Requests' documentation, or that of the underlying urllib3.
On my dev machine I don't have a proxy to test with, and so would like to know the answer before I go and code manual proxy handling in my app that might not actually be necessary...
Some more info
As a bit of comparison, urllib does do this - see https://docs.python.org/3/library/urllib.request.html#urllib.request.ProxyHandler ...if no proxy is specified, it will use the system-configured one.
It seemed, on my initial review of Requests' documentation, that it didn't use the system configuration, instead only using environment variables if they were set: https://2.python-requests.org/en/master/user/advanced/#proxies
But, after a bit more digging, I found a way to at least obtain the OS proxy configuration, using urllib.request.getproxies(): https://stackoverflow.com/a/16311657/9423009
At this point I thought I'd at least be able to use the above at run time to get the OS proxy config, and pass that to requests...
...but then I found this post, which states that requests will use the OS level configuration if nothing is specified: How to use requests library without system-configured proxies
So, at this point, I can't find a definitive answer in the documentation for either requests or urllib3, but I do have an SO post stating that requests will use the OS-level config, by calling urllib.request.getproxies() itself.
...so can anyone confirm/deny this is the case?
thanks!
There are two aspects to your question:
1. Does requests use urllib.request.getproxies?
As of requests 2.25.1, looking at the Session.request source: if not provided, proxy information is obtained from self.merge_environment_settings:
if self.trust_env:
    # Set environment's proxies.
    no_proxy = proxies.get('no_proxy') if proxies is not None else None
    env_proxies = get_environ_proxies(url, no_proxy=no_proxy)
And get_environ_proxies uses getproxies, which is imported either from urllib (Python 2) or from urllib.request (Python 3).
So the answer is YES
2. Is urllib.request.getproxies able to pick up the OS proxy configuration on Windows?
As far as I know, "the OS-configured one" is not reliable on Windows. At least on my corporate machine, urllib.request.getproxies does not pick up the proxy. From its documentation, or from the one for ProxyHandler, it states:
If no proxy environment variables are set, then in a Windows environment proxy settings are obtained from the registry’s Internet Settings section, and in a Mac OS X environment proxy information is retrieved from the OS X System Configuration Framework.
From the source code I see that it reads the values of ProxyEnable and ProxyServer under HKEY_CURRENT_USER > 'Software\Microsoft\Windows\CurrentVersion\Internet Settings'. On my machine, which does have a proxy configured, these are empty - the settings seem to be stored somewhere in Internet Explorer / the .NET stack instead.
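For reference, a small sketch of roughly what that registry lookup amounts to (Windows only, and assuming both values exist under the key):
import winreg  # Windows only

# A rough sketch of what getproxies_registry() inspects (see CPython's urllib/request.py).
with winreg.OpenKey(winreg.HKEY_CURRENT_USER,
                    r'Software\Microsoft\Windows\CurrentVersion\Internet Settings') as key:
    proxy_enable, _ = winreg.QueryValueEx(key, 'ProxyEnable')  # DWORD: 1 when a manual proxy is set
    proxy_server, _ = winreg.QueryValueEx(key, 'ProxyServer')  # e.g. 'proxy.corp.local:8080'
print(proxy_enable, proxy_server)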
Note that very often in corporate environments the proxy is set from a .pac file.
So to conclude: on Windows, at least as of today, we cannot reliably trust urllib.request.getproxies. This is why I developed envswitch, to make it extremely easy for me and my colleagues to switch all the proxy-related environment variables in one click, back and forth (home-train-plane/office). At least urllib (and requests) use them reliably when they are set. (Note: the tool works fine even if there is a "build failed" badge on the tool's doc page :) )
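As a practical sketch, assuming getproxies() does return something usable on your machine (which, as noted above, is not guaranteed on Windows), you can also read whatever urllib sees and hand it to requests explicitly instead of relying on trust_env:
import urllib.request
import requests

# Whatever proxy mapping urllib can see, e.g. {'http': 'http://proxy:8080', ...};
# on many corporate Windows machines this may well come back empty (see above).
proxies = urllib.request.getproxies()
print("detected proxies:", proxies)

resp = requests.get('https://www.example.com', proxies=proxies, timeout=10)
print(resp.status_code)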

Is there any way to force a program/software to use system proxy in Linux?

I am working on a Java project in IntelliJ IDEA (Linux) that needs to access websites through a proxy. I have a personal proxy subscription and can make requests through it programmatically with something like this:
HttpHost proxy = new HttpHost("PROXY_SERVER", PORT);
String res = Executor.newInstance()
        .auth(proxy, "USER_NAME", "PASSWORD")
        .execute(Request.Get("https://example.com").viaProxy(proxy))
        .returnContent().asString();
System.out.println(res);
However, if I set the proxy in /etc/environment with http(s)_proxy, or through Ubuntu's network-proxy settings, my browsers and some system programs such as Chrome and Firefox use the proxy when making requests, but IntelliJ IDEA doesn't follow the system proxy. I've tried to set it manually in IDEA's settings, but it doesn't work; the requests always go out from my current IP. So I was curious whether it is possible to force a piece of software on Linux to use the system proxy somehow. I should mention that I have tried proxychains, but it didn't work; my server wasn't recognized. Any kind of help/suggestion will be appreciated, as I have little to no experience in networking.

node js send html to network rather than only localhost server

I'm using Node.js and trying to serve my web page to my network. I can successfully open localhost:port on my computer using Express as the server; the web page loads fine and triggers my webcam, which I use for streaming in the page. Now I'm working on a simple app for my phone to access my server directly, so my questions are:
1. How am I able to access my server from different devices on the same wireless network? By calling IP + port, e.g. 192.168.1.104:9001? I've tried that and it didn't work.
2. I've found HTTPS with .pem files and things like that - is that the answer? Is there also any other way?
3. Any advice before I start making my web app work on devices? Should I use Koa? I don't really know what that is, but I'll happily take any advice.
EDIT: I've read How could others, on a local network, access my NodeJS app while it's running on my machine?
Let's say I'm simply using an ordinary router, so I can't configure the router's ports; my server is on my PC, my phone is on the same network, and I'm trying to access the server from my phone.
1. How am I able to access my server from different devices on the same wireless network?
All you need to do is find your server's IP address on that wireless network, and the port your Node.js application listens on. Then access the following URL from the other devices:
http://{server_IP}:{port}
However, there are a few points to check:
Check the firewall and confirm that the port is not blocked, that the server's IP is not blocked by the test device, and that the test device's IP is not blocked by the server.
Check whether there is any proxy setting on the server or the test device. If there is, disable the proxy.
A computer may have many IP addresses at the same time; you need to find the correct one on the wireless network. For example, if you install virtual-machine software such as VMware and run a virtual system inside it, your computer will also get an address like 192.168.*.* on the virtual adapter -- it looks like an intranet IP on the wireless network, but it is not, and it can never be reached by the test device.
2. I've found HTTPS with .pem files and things like that - is that the answer?
No, HTTPS has nothing to do with this problem. HTTPS just adds security on top of HTTP; it does not affect basic HTTP connectivity. In fact, to simplify the problem, it is better to use only HTTP in your scenario.
There is only one very special case in which HTTPS could cause your problem: the test machine is configured to block any non-HTTPS connection for security reasons.
3. Any advice before I start making my web app work on devices? Should I use Koa?
My suggestion is: as there is an HTTP connectivity issue, the first step is to find the root cause of that issue. It is therefore better to build the simplest possible HTTP server using native Node.js, with no Koa and no Express. That reduces the complexity of the server, which makes the root-cause investigation easier.
After the HTTP connectivity issue is fixed, you can pick up Koa, Express, or any other mature Node.js web framework to build out the web app.
4. Let's say I'm simply using an ordinary router, so I can't...
Do you mean your server gets a dynamic IP address via DHCP? As long as that IP is not blocked by the test device, it does not matter.

Python 3 Specifying Socks5 Proxy In Specified Request Only

The answer here states how to use a SOCKS5 proxy in Python 3. Unfortunately that answer doesn't work when one wants to send some requests over the SOCKS proxy and others without it at virtually the same time.
I want to do something like the following, where the first request uses the proxy and the second uses my own IP, at virtually the same time and without switching sockets around:
import concurrent.futures
import requests
threadPoolTasks1 = []
with concurrent.futures.ThreadPoolExecutor(max_workers=7) as executor:
    # this request should use the proxy; I should see the IP reflecting that
    threadPoolTasks1.append(executor.submit(lambda: print(requests.get("http://checkip.amazonaws.com/").text)))
    # this request won't use the proxy
    threadPoolTasks1.append(executor.submit(lambda: print(requests.get("http://checkip.amazonaws.com/").text)))
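One way to get that per-request behaviour is sketched below, assuming requests is installed with SOCKS support (pip install requests[socks]) and a hypothetical SOCKS5 proxy listening on 127.0.0.1:1080: pass a proxies mapping to the individual call instead of changing any global socket.
import concurrent.futures
import requests

# Hypothetical SOCKS5 proxy address; requires the PySocks extra (requests[socks]).
socks_proxies = {
    'http': 'socks5://127.0.0.1:1080',
    'https': 'socks5://127.0.0.1:1080',
}

def fetch(proxies=None):
    # proxies= affects only this single request, so other requests running
    # at the same time keep using a direct connection.
    return requests.get('http://checkip.amazonaws.com/', proxies=proxies, timeout=10).text

with concurrent.futures.ThreadPoolExecutor(max_workers=2) as executor:
    via_proxy = executor.submit(fetch, socks_proxies)  # should print the proxy's IP
    direct = executor.submit(fetch)                    # should print your own IP
    print(via_proxy.result(), direct.result())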
