I am exploring the pymetasploit3 module, but I am having trouble connecting to the background MSGRPC process.
import random, os
from pymetasploit3.msfrpc import MsfRpcClient

passwrd = ''.join([str(random.randrange(0, 99)) for i in range(10)])
print(f"Current password: {passwrd}")
os.system(f'msfrpcd -P {passwrd} -S')
client = MsfRpcClient(passwrd, port=55553)
I tried changing the port number and using msfadmin as the password, but it does not seem to work.
[*] MSGRPC starting on 0.0.0.0:55553 (NO SSL):Msg...
[*] MSGRPC backgrounding at 2021-07-20 07:54:03 -0400...
[*] MSGRPC background PID 1374
Traceback (most recent call last):
File "/home/kali/.local/lib/python3.9/site-packages/urllib3/connection.py", line 158, in _new_conn
conn = connection.create_connection(
File "/home/kali/.local/lib/python3.9/site-packages/urllib3/util/connection.py", line 80, in create_connection
raise err
File "/home/kali/.local/lib/python3.9/site-packages/urllib3/util/connection.py", line 70, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/kali/.local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 597, in urlopen
httplib_response = self._make_request(conn, method, url,
File "/home/kali/.local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 354, in _make_request
conn.request(method, url, **httplib_request_kw)
File "/usr/lib/python3.9/http/client.py", line 1255, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/usr/lib/python3.9/http/client.py", line 1301, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/usr/lib/python3.9/http/client.py", line 1250, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/usr/lib/python3.9/http/client.py", line 1010, in _send_output
self.send(msg)
File "/usr/lib/python3.9/http/client.py", line 950, in send
self.connect()
File "/home/kali/.local/lib/python3.9/site-packages/urllib3/connection.py", line 181, in connect
conn = self._new_conn()
File "/home/kali/.local/lib/python3.9/site-packages/urllib3/connection.py", line 167, in _new_conn
raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7fe0816afcd0>: Failed to establish a new connection: [Errno 111] Connection refused
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/kali/.local/lib/python3.9/site-packages/requests/adapters.py", line 439, in send
resp = conn.urlopen(
File "/home/kali/.local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 637, in urlopen
retries = retries.increment(method, url, error=e, _pool=self,
File "/home/kali/.local/lib/python3.9/site-packages/urllib3/util/retry.py", line 399, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='127.0.0.1', port=55553): Max retries exceeded with url: /api/ (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fe0816afcd0>: Failed to establish a new connection: [Errno 111] Connection refused'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/kali/Desktop/ooo.py", line 13, in <module>
client = MsfRpcClient(passwrd, port=55553 )
File "/home/kali/.local/lib/python3.9/site-packages/pymetasploit3/msfrpc.py", line 195, in __init__
self.login(kwargs.get('username', 'msf'), password)
File "/home/kali/.local/lib/python3.9/site-packages/pymetasploit3/msfrpc.py", line 229, in login
auth = self.call(MsfRpcMethod.AuthLogin, [user, password])
File "/home/kali/.local/lib/python3.9/site-packages/pymetasploit3/msfrpc.py", line 215, in call
r = self.post_request(url, payload)
File "<decorator-gen-2>", line 2, in post_request
File "/home/kali/.local/lib/python3.9/site-packages/retry/api.py", line 73, in retry_decorator
return __retry_internal(partial(f, *args, **kwargs), exceptions, tries, delay, max_delay, backoff, jitter,
File "/home/kali/.local/lib/python3.9/site-packages/retry/api.py", line 33, in __retry_internal
return f()
File "/home/kali/.local/lib/python3.9/site-packages/pymetasploit3/msfrpc.py", line 226, in post_request
return requests.post(url, data=payload, headers=self.headers, verify=False)
File "/home/kali/.local/lib/python3.9/site-packages/requests/api.py", line 116, in post
return request('post', url, data=data, json=json, **kwargs)
File "/home/kali/.local/lib/python3.9/site-packages/requests/api.py", line 60, in request
return session.request(method=method, url=url, **kwargs)
File "/home/kali/.local/lib/python3.9/site-packages/requests/sessions.py", line 533, in request
resp = self.send(prep, **send_kwargs)
File "/home/kali/.local/lib/python3.9/site-packages/requests/sessions.py", line 646, in send
r = adapter.send(request, **kwargs)
File "/home/kali/.local/lib/python3.9/site-packages/requests/adapters.py", line 516, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='127.0.0.1', port=55553): Max retries exceeded with url: /api/ (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fe0816afcd0>: Failed to establish a new connection: [Errno 111] Connection refused'))
Specify the IP address for the server to bind to; the client is trying to connect to 127.0.0.1, but the background process is using 0.0.0.0:

os.system(f'msfrpcd -P {passwrd} -S -a 127.0.0.1')
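If the client is created before msfrpcd has finished starting, the same ConnectionRefusedError can also appear, because os.system returns as soon as the daemon backgrounds itself. A minimal sketch that polls the RPC port before connecting (the 30-second timeout and one-second polling interval are arbitrary choices, not part of pymetasploit3):

import os
import random
import socket
import time

from pymetasploit3.msfrpc import MsfRpcClient

passwrd = ''.join(str(random.randrange(0, 99)) for _ in range(10))
os.system(f'msfrpcd -P {passwrd} -S -a 127.0.0.1')

def wait_for_port(host, port, timeout=30):
    # Poll until a TCP connection to host:port succeeds or the timeout expires.
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1):
                return True
        except OSError:
            time.sleep(1)
    return False

if wait_for_port('127.0.0.1', 55553):
    client = MsfRpcClient(passwrd, port=55553)
    print('Connected to msfrpcd')
else:
    raise RuntimeError('msfrpcd did not come up on 127.0.0.1:55553')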
Related
I'm trying to find an existing parser (preferably in Python) to write and read SysML v2.
The official GitHub repo by the SysML v2 Submission Team (SST) already has some examples online (for instance https://github.com/Systems-Modeling/SysML-v2-Release/blob/master/sysml/src/examples/Vehicle%20Example/VehicleUsages.sysml), but I'm unable to get the Eclipse plugin running properly (nothing is shown in the preview windows, and there is no error message either).
There's a repo for the SysML-v2-API-Python-Client (https://github.com/Systems-Modeling/SysML-v2-API-Python-Client); I installed it and executed the given sample code, but all I'm getting is this message:
$ python3.8 Test1.py
Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 159, in _new_conn
conn = connection.create_connection(
File "/usr/lib/python3/dist-packages/urllib3/util/connection.py", line 84, in create_connection
raise err
File "/usr/lib/python3/dist-packages/urllib3/util/connection.py", line 74, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 665, in urlopen
httplib_response = self._make_request(
File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 387, in _make_request
conn.request(method, url, **httplib_request_kw)
File "/usr/lib/python3.8/http/client.py", line 1256, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/usr/lib/python3.8/http/client.py", line 1302, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/usr/lib/python3.8/http/client.py", line 1251, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/usr/lib/python3.8/http/client.py", line 1011, in _send_output
self.send(msg)
File "/usr/lib/python3.8/http/client.py", line 951, in send
self.connect()
File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 187, in connect
conn = self._new_conn()
File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 171, in _new_conn
raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7f9492d1b9a0>: Failed to establish a new connection: [Errno 111] Connection refused
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "Test1.py", line 38, in <module>
api_response = api_instance.delete_branch_by_project_and_id(
File "/home/qohelet/.local/lib/python3.8/site-packages/sysml_v2_api_client-2021.post9-py3.8.egg/sysml_v2_api_client/api/branch_api.py", line 62, in delete_branch_by_project_and_id
File "/home/qohelet/.local/lib/python3.8/site-packages/sysml_v2_api_client-2021.post9-py3.8.egg/sysml_v2_api_client/api/branch_api.py", line 144, in delete_branch_by_project_and_id_with_http_info
File "/home/qohelet/.local/lib/python3.8/site-packages/sysml_v2_api_client-2021.post9-py3.8.egg/sysml_v2_api_client/api_client.py", line 364, in call_api
File "/home/qohelet/.local/lib/python3.8/site-packages/sysml_v2_api_client-2021.post9-py3.8.egg/sysml_v2_api_client/api_client.py", line 181, in __call_api
File "/home/qohelet/.local/lib/python3.8/site-packages/sysml_v2_api_client-2021.post9-py3.8.egg/sysml_v2_api_client/api_client.py", line 431, in request
File "/home/qohelet/.local/lib/python3.8/site-packages/sysml_v2_api_client-2021.post9-py3.8.egg/sysml_v2_api_client/rest.py", line 256, in DELETE
File "/home/qohelet/.local/lib/python3.8/site-packages/sysml_v2_api_client-2021.post9-py3.8.egg/sysml_v2_api_client/rest.py", line 163, in request
File "/usr/lib/python3/dist-packages/urllib3/request.py", line 75, in request
return self.request_encode_url(
File "/usr/lib/python3/dist-packages/urllib3/request.py", line 97, in request_encode_url
return self.urlopen(method, url, **extra_kw)
File "/usr/lib/python3/dist-packages/urllib3/poolmanager.py", line 330, in urlopen
response = conn.urlopen(method, u.request_uri, **kw)
File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 747, in urlopen
return self.urlopen(
File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 747, in urlopen
return self.urlopen(
File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 747, in urlopen
return self.urlopen(
File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 719, in urlopen
retries = retries.increment(
File "/usr/lib/python3/dist-packages/urllib3/util/retry.py", line 436, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=80): Max retries exceeded with url: /projects/project_id_example/branches/branch_id_example (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f9492d1b9a0>: Failed to establish a new connection: [Errno 111] Connection refused'))
Am I supposed to have anything running? Is there a tutorial or manual anywhere on how to use it?
Or are there any other parsers out there?
There are no parsers currently available for Python. The API Python Client is an example implementation for interacting with the pilot server API implementation; you would need to run the server to interact with the API using the Python client.
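For illustration, a minimal sketch of pointing the generated client at a locally running pilot server; the host URL and port are assumptions about a local deployment, and the Branch API call simply mirrors the one from the sample script in the traceback above:

import sysml_v2_api_client
from sysml_v2_api_client.rest import ApiException

# Point the generated client at a locally running pilot server; the host URL
# and port are placeholders -- adjust them to wherever your server listens.
configuration = sysml_v2_api_client.Configuration(host="http://localhost:9000")
api_client = sysml_v2_api_client.ApiClient(configuration)

branch_api = sysml_v2_api_client.BranchApi(api_client)
try:
    # Same call as in the sample script, now against the local server.
    branch_api.delete_branch_by_project_and_id('project_id_example', 'branch_id_example')
except ApiException as exc:
    print(f"API call failed: {exc}")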
I'd like to use visdom to visualize the results of a deep learning algorithm that is trained on a remote cluster server. I found a link that tried to describe the correct way to set everything up in a SLURM script.
python -u Script.py --visdom_server http://176.97.99.618 --visdom_port 8097
I use my IP and port 8097 to connect to the remote cluster server:
ssh -L 8097:176.97.99.618:8097 my_userid@my_server_address
I have the following lines of code:
import visdom
import numpy as np
cfg = {"server": "176.97.99.618",
"port": 8097}
vis = visdom.Visdom('http://' + cfg["server"], port = cfg["port"])
win = None
#Plotting on remote server
def update_viz(epoch, loss, title):
global win
if win is None:
title = title
win = viz.line(
X=np.array([epoch]),
Y=np.array([loss]),
win=title,
opts=dict(
title=title,
fillarea=True
)
)
else:
viz.line(
X=np.array([epoch]),
Y=np.array([loss]),
win=win,
update='append'
)
update_viz(epoch, elbo2.item(), 'ELBO2 Loss of beta distributions')
I got this error:
Setting up a new session...
Traceback (most recent call last):
File "/anaconda3/lib/python3.8/site-packages/urllib3/connection.py", line 174, in _ne
w_conn
conn = connection.create_connection(
File "/anaconda3/lib/python3.8/site-packages/urllib3/util/connection.py", line 96, in
create_connection
raise err
File "/anaconda3/lib/python3.8/site-packages/urllib3/util/connection.py", line 86, in
create_connection
sock.connect(sa)
TimeoutError: [Errno 110] Connection timed out
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/anaconda3/lib/python3.8/site-packages/urllib3/connectionpool.py", line 699, in
urlopen
httplib_response = self._make_request(
File "/anaconda3/lib/python3.8/site-packages/urllib3/connectionpool.py", line 394, in
_make_request
conn.request(method, url, **httplib_request_kw)
File "/anaconda3/lib/python3.8/site-packages/urllib3/connection.py", line 239, in req
uest
super(HTTPConnection, self).request(method, url, body=body, headers=headers)
File "/anaconda3/lib/python3.8/http/client.py", line 1255, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/anaconda3/lib/python3.8/http/client.py", line 1301, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.8/http/client.py", line 1250, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.8/http/client.py", line 1010, in _send_output
self.send(msg)
File "/anaconda3/lib/python3.8/http/client.py", line 950, in send
self.connect()
File "/anaconda3/lib/python3.8/site-packages/urllib3/connection.py", line 205, in con
nect
conn = self._new_conn()
File "/anaconda3/lib/python3.8/site-packages/urllib3/connection.py", line 186, in _ne
w_conn
raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7ff292f14d00
>: Failed to establish a new connection: [Errno 110] Connection timed out
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/anaconda3/lib/python3.8/site-packages/requests/adapters.py", line 439, in send
resp = conn.urlopen(
File "/anaconda3/lib/python3.8/site-packages/urllib3/connectionpool.py", line 755, in
urlopen
retries = retries.increment(
File "/anaconda3/lib/python3.8/site-packages/urllib3/util/retry.py", line 574, in inc
rement
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='176.97.99.618', port=8097): Max retries
exceeded with url: /env/main (Caused by NewConnectionError('<urllib3.connection.HTTPConnection obj
ect at 0x7ff292f14d00>: Failed to establish a new connection: [Errno 110] Connection timed out'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/anaconda3/lib/python3.8/site-packages/visdom/__init__.py", line 708, in _send
return self._handle_post(
File "/anaconda3/lib/python3.8/site-packages/visdom/__init__.py", line 677, in _handl
e_post
r = self.session.post(url, data=data)
File "/anaconda3/lib/python3.8/site-packages/requests/sessions.py", line 590, in post
return self.request('POST', url, data=data, json=json, **kwargs)
File "/anaconda3/lib/python3.8/site-packages/requests/sessions.py", line 542, in requ
est
resp = self.send(prep, **send_kwargs)
File "/anaconda3/lib/python3.8/site-packages/requests/sessions.py", line 655, in send
r = adapter.send(request, **kwargs)
File "/anaconda3/lib/python3.8/site-packages/requests/adapters.py", line 516, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='192.168.2.10', port=8097): Max retri
es exceeded with url: /env/main (Caused by NewConnectionError('<urllib3.connection.HTTPConnection
object at 0x7ff292f14d00>: Failed to establish a new connection: [Errno 110] Connection timed out'
))
Visdom python client failed to establish socket to get messages from the server. This feature is o
ptional and can be disabled by initializing Visdom with `use_incoming_socket=False`, which will pr
event waiting for this request to timeout.
Script.py:41: UserWarning: To copy construct from a tensor, it is recommended to us
e sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than
torch.tensor(sourceTensor).
params['w'].append(nn.Parameter(torch.tensor(Normal(torch.zeros(n_in, n_out), std * torch.ones(n
_in, n_out)).rsample(), requires_grad=True, device=device)))
Script.py:42: UserWarning: To copy construct from a tensor, it is recommended to us
e sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than
torch.tensor(sourceTensor).
params['b'].append(nn.Parameter(torch.tensor(torch.mul(bias_init, torch.ones([n_out,])), require
s_grad=True, device=device)))
Script.py:292: UserWarning: To copy construct from a tensor, it is recommended to u
se sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather tha
n torch.tensor(sourceTensor).
return torch.exp(torch.lgamma(torch.tensor(a, dtype=torch.float, requires_grad=True).to(device=l
ocal_device)) + torch.lgamma(torch.tensor(b, dtype=torch.float, requires_grad=True).to(device=loca
l_device)) - torch.lgamma(torch.tensor(a+b, dtype=torch.float, requires_grad=True).to(device=local
_device)))
Script.py:679: UserWarning: This overload of add_ is deprecated:
add_(Number alpha, Tensor other)
Consider using one of the following signatures instead:
add_(Tensor other, *, Number alpha) (Triggered internally at /opt/conda/conda-bld/pytorch
_1631630815121/work/torch/csrc/utils/python_arg_parser.cpp:1025.)
exp_avg.mul_(beta1).add_(1 - beta1, grad)
[Errno 110] Connection timed out
on_close() takes 1 positional argument but 3 were given
Traceback (most recent call last):
File "Script.py", line 873, in <module>
update_viz(epoch, elbo2.item(), 'ELBO2 Loss of beta distributions')
File "Script.py", line 736, in update_viz
win = viz.line(
NameError: name 'viz' is not defined
How can I run my plotting script on a remote server? What should the Python command line look like in my SLURM script? And how can I store the plot and move it to my laptop later using scp?
Try using `global viz` after the `global win` line.
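Whether you add `global viz` or simply keep to one name, the function and the client have to refer to the same object; the NameError in the last traceback comes from the client being created as vis but referenced as viz. A minimal sketch of the same plotting helper with a single module-level name (the server address is the placeholder from the question):

import numpy as np
import visdom

# One module-level client, created under the same name the function uses.
viz = visdom.Visdom(server='http://176.97.99.618', port=8097)
win = None

def update_viz(epoch, loss, title):
    # Append (epoch, loss) to a line plot, creating the window on first call.
    global win
    if win is None:
        win = viz.line(
            X=np.array([epoch]),
            Y=np.array([loss]),
            opts=dict(title=title, fillarea=True),
        )
    else:
        viz.line(
            X=np.array([epoch]),
            Y=np.array([loss]),
            win=win,
            update='append',
        )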
The Requests library is giving me an error I've never encountered before while trying to download an image from my website.
Here is the download function:
def dl_jpg(url, filePath, fileName):
    fullPath = filePath + fileName + '.jpg'
    r = requests.get(url, allow_redirects=True)
    open(fullPath, 'wb').write(r.content)
The URL I'm entering is:
http://www.deepfrybot.ga/uploads/c4c7936ef4218dbe7014cb543049168b.jpg
Here's the error message:
[ec2-user@ip ~]$ cd dfb-master
[ec2-user@ip dfb-master]$ sudo python3 mainr.py
enter url (n) if auto: http://www.deepfrybot.ga/uploads/c4c7936ef4218dbe7014cb543049168b.jpg
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/urllib3/connection.py", line 160, in _new_conn
(self._dns_host, self.port), self.timeout, **extra_kw)
File "/usr/local/lib/python3.7/site-packages/urllib3/util/connection.py", line 57, in create_connection
for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
File "/usr/lib64/python3.7/socket.py", line 748, in getaddrinfo
for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno -2] Name or service not known
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 603, in urlopen
chunked=chunked)
File "/usr/local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 355, in _make_request
conn.request(method, url, **httplib_request_kw)
File "/usr/lib64/python3.7/http/client.py", line 1244, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/usr/lib64/python3.7/http/client.py", line 1290, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/usr/lib64/python3.7/http/client.py", line 1239, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/usr/lib64/python3.7/http/client.py", line 1026, in _send_output
self.send(msg)
File "/usr/lib64/python3.7/http/client.py", line 966, in send
self.connect()
File "/usr/local/lib/python3.7/site-packages/urllib3/connection.py", line 183, in connect
conn = self._new_conn()
File "/usr/local/lib/python3.7/site-packages/urllib3/connection.py", line 169, in _new_conn
self, "Failed to establish a new connection: %s" % e)
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7fdce843ed50>: Failed to establish a new connection: [Errno -2] Name or service not known
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/requests/adapters.py", line 449, in send
timeout=timeout
File "/usr/local/lib/python3.7/site-packages/urllib3/connectionpool.py", line 641, in urlopen
_stacktrace=sys.exc_info()[2])
File "/usr/local/lib/python3.7/site-packages/urllib3/util/retry.py", line 399, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='www.deepfrybot.ga', port=80): Max retries exceeded with url: /uploads/c4c7936ef4218dbe7014cb543049168b.jpg (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fdce843ed50>: Failed to establish a new connection: [Errno -2] Name or service not known'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "mainr.py", line 32, in wrapper
return job_func(*args, **kwargs)
File "mainr.py", line 138, in main_p
dl_jpg(url, get_abs_file('images/'), file_name)
File "mainr.py", line 67, in dl_jpg
r = requests.get(url,allow_redirects=True)
File "/usr/local/lib/python3.7/site-packages/requests/api.py", line 75, in get
return request('get', url, params=params, **kwargs)
File "/usr/local/lib/python3.7/site-packages/requests/api.py", line 60, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python3.7/site-
packages/requests/sessions.py", line 533, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python3.7/site-packages/requests/sessions.py", line 646, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python3.7/site-packages/requests/adapters.py", line 516, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='www.deepfrybot.ga', port=80): Max retries exceeded with url: /uploads/c4c7936ef4218dbe7014cb543049168b.jpg (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fdce843ed50>: Failed to establish a new connection: [Errno -2] Name or service not known'))
I'm clueless as to why this is happening; a little help would be much appreciated.
lxop said "
The script is failing to resolve the name 'www.deepfrybot.ga'. Does
your EC2 instance have DNS access?
"
After checking and researching a bit, I came to learn that some EC2 instances (or urllib3 setups) are sometimes unable to resolve free top-level domains (like .ga, .ml, .tk), as these have a general tendency to be used maliciously and usually lack an SSL certificate.
Domains with SSL certificates work fine, though!
If you are hosting on, say, heliohost.org (which I am), simply change the domain in the script from yourwebsite.ga to yourwebsite.yourhost.domain.
That should solve it!
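To confirm whether name resolution is really the problem (rather than anything in Requests), here is a quick standard-library check to run from the EC2 instance; the hostnames are just examples:

import socket

# The .ga hostname is the one from the question; the other is a control.
for host in ("www.deepfrybot.ga", "www.example.com"):
    try:
        addr = socket.getaddrinfo(host, 80)[0][4][0]
        print(f"{host} resolves to {addr}")
    except socket.gaierror as exc:
        # [Errno -2] Name or service not known -> DNS is failing for this host
        print(f"{host} failed to resolve: {exc}")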
Hello, this is my first question on Stack Overflow...
I have written a Python script which uses the latest Python 3.7 and the latest requests module.
My request queries work; however, after multiple (thousands of) requests in a day, I sometimes get this stack trace and the program exits. It is a large stack trace and I am not sure which pieces I should investigate first. Any help would be greatly appreciated.
Here is the stack trace:
File "/usr/lib/python3/dist-packages/urllib3/contrib/pyopenssl.py", line 453, in wrap_socket
cnx.do_handshake()
File "/usr/lib/python3/dist-packages/OpenSSL/SSL.py", line 1915, in do_handshake
self._raise_ssl_error(self._ssl, result)
File "/usr/lib/python3/dist-packages/OpenSSL/SSL.py", line 1639, in _raise_ssl_error
raise SysCallError(errno, errorcode.get(errno))
OpenSSL.SSL.SysCallError: (104, 'ECONNRESET')
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 600, in urlopen
chunked=chunked)
File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 343, in _make_request
self._validate_conn(conn)
File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 841, in _validate_conn
conn.connect()
File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 344, in connect
ssl_context=context)
File "/usr/lib/python3/dist-packages/urllib3/util/ssl_.py", line 344, in ssl_wrap_socket
return context.wrap_socket(sock, server_hostname=server_hostname)
File "/usr/lib/python3/dist-packages/urllib3/contrib/pyopenssl.py", line 459, in wrap_socket
raise ssl.SSLError('bad handshake: %r' % e)
ssl.SSLError: ("bad handshake: SysCallError(104, 'ECONNRESET')",)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/requests/adapters.py", line 449, in send
timeout=timeout
File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 638, in urlopen
_stacktrace=sys.exc_info()[2])
File "/usr/lib/python3/dist-packages/urllib3/util/retry.py", line 398, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='block.io', port=443): Max retries exceeded with url: /api/v2/get_address_balance/?api_key=0d76-078a-c2c9-e524&addresses=172MQBZyt2UGfCPRwUpKCH4cmB4sRrhywy (Caused by SSLError(SSLError("bad handshake: SysCallError(104, 'ECONNRESET')")))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "Birds.py", line 291, in <module>
main()
File "Birds.py", line 287, in main
loop()
File "Birds.py", line 260, in loop
checkBTC()
File "Birds.py", line 186, in checkBTC
currentBTCBalance = getBitcoinBalance() # Check Snapy for BTC balance
File "Birds.py", line 111, in getBitcoinBalance
r = requests.get(query_url)
File "/usr/lib/python3/dist-packages/requests/api.py", line 75, in get
return request('get', url, params=params, **kwargs)
File "/usr/lib/python3/dist-packages/requests/api.py", line 60, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/lib/python3/dist-packages/requests/sessions.py", line 533, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python3/dist-packages/requests/sessions.py", line 646, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python3/dist-packages/requests/adapters.py", line 514, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='block.io', port=443): Max retries exceeded with url: /api/v2/get_address_balance/?api_key=0d76-078a-c2c9-e524&addresses=172MQBZyt2UGfCPRwUpKCH4cmB4sRrhywy (Caused by SSLError(SSLError("bad handshake: SysCallError(104, 'ECONNRESET')")))
I have answered my own question via these websites:
https://www.peterbe.com/plog/best-practice-with-retries-with-requests
https://realpython.com/python-requests/
They explain how to use transport adapters and smart backoff. I am currently testing queries once per second, and the program has been running for 24 hours without a problem.
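For reference, a minimal sketch of the retry-with-backoff setup those articles describe; the retry count, backoff factor, and URL below are arbitrary choices, not values from the original script:

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Retry transient failures (connection resets, 429/5xx responses) with backoff.
retry_strategy = Retry(
    total=5,
    backoff_factor=1,
    status_forcelist=[429, 500, 502, 503, 504],
)
adapter = HTTPAdapter(max_retries=retry_strategy)

session = requests.Session()
session.mount('https://', adapter)
session.mount('http://', adapter)

# A ConnectionError/SSLError can still surface once retries are exhausted,
# so keep the call wrapped.
try:
    r = session.get('https://block.io/api/v2/get_address_balance/', timeout=10)
    r.raise_for_status()
except requests.exceptions.RequestException as exc:
    print(f'Request failed after retries: {exc}')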
When I try to use the Wikipedia API, I get a certificate error message.
This is for an assistant I am coding on a school computer, and I think it is something that has been put there by the administration, as I got a certificate error when using NPM previously.
Here is the code I am using:
wikiSearch = query.strip("wiki ")
outputs = wikipedia.summary(wikiSearch, sentences=3)
I thought this would return the first three sentences of the article, as it does in Python 2.7, but instead it returns this lengthy error message:
Traceback (most recent call last):
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/urllib3/connectionpool.py", line 603, in urlopen
chunked=chunked)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/urllib3/connectionpool.py", line 344, in _make_request
self._validate_conn(conn)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/urllib3/connectionpool.py", line 843, in _validate_conn
conn.connect()
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/urllib3/connection.py", line 350, in connect
ssl_context=context)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/urllib3/util/ssl_.py", line 355, in ssl_wrap_socket
return context.wrap_socket(sock, server_hostname=server_hostname)
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/ssl.py", line 412, in wrap_socket
session=session
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/ssl.py", line 853, in _create
self.do_handshake()
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/ssl.py", line 1117, in do_handshake
self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1056)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/requests/adapters.py", line 449, in send
timeout=timeout
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/urllib3/connectionpool.py", line 641, in urlopen
_stacktrace=sys.exc_info()[2])
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/urllib3/util/retry.py", line 399, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='en.wikipedia.org', port=443): Max retries exceeded with url: /w/api.php?list=search&srprop=&srlimit=1&limit=1&srsearch=alex&srinfo=suggestion&format=json&action=query (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1056)')))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "dave.py", line 54, in <module>
index()
File "dave.py", line 24, in index
outputs = wikipedia.summary(wikiSearch, sentences=3)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/wikipedia/util.py", line 28, in __call__
ret = self._cache[key] = self.fn(*args, **kwargs)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/wikipedia/wikipedia.py", line 231, in summary
page_info = page(title, auto_suggest=auto_suggest, redirect=redirect)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/wikipedia/wikipedia.py", line 270, in page
results, suggestion = search(title, results=1, suggestion=True)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/wikipedia/util.py", line 28, in __call__
ret = self._cache[key] = self.fn(*args, **kwargs)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/wikipedia/wikipedia.py", line 103, in search
raw_results = _wiki_request(search_params)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/wikipedia/wikipedia.py", line 737, in _wiki_request
r = requests.get(API_URL, params=params, headers=headers)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/requests/api.py", line 75, in get
return request('get', url, params=params, **kwargs)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/requests/api.py", line 60, in request
return session.request(method=method, url=url, **kwargs)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/requests/sessions.py", line 533, in request
resp = self.send(prep, **send_kwargs)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/requests/sessions.py", line 668, in send
history = [resp for resp in gen] if allow_redirects else []
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/requests/sessions.py", line 668, in <listcomp>
history = [resp for resp in gen] if allow_redirects else []
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/requests/sessions.py", line 247, in resolve_redirects
**adapter_kwargs
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/requests/sessions.py", line 646, in send
r = adapter.send(request, **kwargs)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/requests/adapters.py", line 514, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='en.wikipedia.org', port=443): Max retries exceeded with url: /w/api.php?list=search&srprop=&srlimit=1&limit=1&srsearch=alex&srinfo=suggestion&format=json&action=query (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1056)')))
Edit:
I now get this error:
File "/Users/alexander.hawking/Desktop/Dave/dave.py", line 84, in <module>
index()
File "/Users/alexander.hawking/Desktop/Dave/dave.py", line 26, in index
outputs = wikipedia.summary(wikiSearch, sentences=3)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/wikipedia/util.py", line 28, in __call__
ret = self._cache[key] = self.fn(*args, **kwargs)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/wikipedia/wikipedia.py", line 231, in summary
page_info = page(title, auto_suggest=auto_suggest, redirect=redirect)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/wikipedia/wikipedia.py", line 270, in page
results, suggestion = search(title, results=1, suggestion=True)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/wikipedia/util.py", line 28, in __call__
ret = self._cache[key] = self.fn(*args, **kwargs)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/wikipedia/wikipedia.py", line 103, in search
raw_results = _wiki_request(search_params)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/wikipedia/wikipedia.py", line 737, in _wiki_request
r = requests.get(API_URL, params=params, headers=headers)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/requests/api.py", line 75, in get
return request('get', url, params=params, **kwargs)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/requests/api.py", line 60, in request
return session.request(method=method, url=url, **kwargs)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/requests/sessions.py", line 533, in request
resp = self.send(prep, **send_kwargs)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/requests/sessions.py", line 646, in send
r = adapter.send(request, **kwargs)
File "/Users/alexander.hawking/Library/Python/3.7/lib/python/site-packages/requests/adapters.py", line 510, in send
raise ProxyError(e, request=request)
requests.exceptions.ProxyError: HTTPConnectionPool(host='myproxy.proxy.com', port=1234): Max retries exceeded with url: http://en.wikipedia.org/w/api.php?list=search&srprop=&srlimit=1&limit=1&srsearch=alex&srinfo=suggestion&format=json&action=query (Caused by ProxyError('Cannot connect to proxy.', NewConnectionError('<urllib3.connection.HTTPConnection object at 0x10e2d3eb8>: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known')))
As you have mentioned that you encountered a similar error when working with NPM, it's most likely due to the proxy set up by your administration.
You can configure the proxy so that any third-party module can connect to the internet. By default, Requests uses the http_proxy and https_proxy environment variables to determine which HTTP proxy to use.
Run this command in your shell before running your Python application:
export http_proxy='http://myproxy.proxy.com:1234'
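Equivalently, the proxy can be set from inside the script before any request is made; the proxy URL here is the placeholder from the error message, so substitute the one your administration provides:

import os

# Set the proxy environment variables before making any HTTP requests.
os.environ['http_proxy'] = 'http://myproxy.proxy.com:1234'
os.environ['https_proxy'] = 'http://myproxy.proxy.com:1234'

import wikipedia
print(wikipedia.summary("Google", sentences=3))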
============================================================
If you don't know your proxy, you can try this workaround.
Run this command in your shell before running the Python application:
export CURL_CA_BUNDLE=''
Or add these lines to your code to avoid setting the environment variable manually every time:
import os
os.environ['CURL_CA_BUNDLE'] = ""
If you are getting an InsecureRequestWarning and you want to suppress it, you can do the following:
import wikipedia
import os
import urllib3
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
os.environ['CURL_CA_BUNDLE'] = ""
os.environ['PYTHONWARNINGS']="ignore:Unverified HTTPS request"
print(wikipedia.wikipedia.summary("Google"))
I hope this helps.