The urllib3 library installed on my OS:
pip list |grep urllib
urllib3 1.25.11
I want to upload a local file to Dropbox through a proxy:
import dropbox

access_token = "xxxxxx"
file_from = "local_file"
file_to = "/directory_in_dropbox"

proxyDict = {
    "http": "http://127.0.0.1:8123",
    "https": "https://127.0.0.1:8123"
}

mysesh = dropbox.create_session(1, proxyDict)
dbx = dropbox.Dropbox(access_token, session=mysesh)

with open(file_from, 'rb') as f:
    dbx.files_upload(f.read(), file_to)
It encounters this error:
Traceback (most recent call last):
File "<stdin>", line 2, in <module>
File "/home/debian/.local/lib/python3.9/site-packages/dropbox/base.py", line 3208, in files_upload
r = self.request(
File "/home/debian/.local/lib/python3.9/site-packages/dropbox/dropbox_client.py", line 326, in request
res = self.request_json_string_with_retry(host,
File "/home/debian/.local/lib/python3.9/site-packages/dropbox/dropbox_client.py", line 476, in request_json_string_with_retry
return self.request_json_string(host,
File "/home/debian/.local/lib/python3.9/site-packages/dropbox/dropbox_client.py", line 589, in request_json_string
r = self._session.post(url,
File "/usr/lib/python3/dist-packages/requests/sessions.py", line 590, in post
return self.request('POST', url, data=data, json=json, **kwargs)
File "/usr/lib/python3/dist-packages/requests/sessions.py", line 542, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python3/dist-packages/requests/sessions.py", line 655, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python3/dist-packages/requests/adapters.py", line 439, in send
resp = conn.urlopen(
File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 696, in urlopen
self._prepare_proxy(conn)
File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 966, in _prepare_proxy
conn.connect()
File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 359, in connect
conn = self._connect_tls_proxy(hostname, conn)
File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 500, in _connect_tls_proxy
return ssl_wrap_socket(
File "/usr/lib/python3/dist-packages/urllib3/util/ssl_.py", line 453, in ssl_wrap_socket
ssl_sock = _ssl_wrap_socket_impl(sock, context, tls_in_tls)
File "/usr/lib/python3/dist-packages/urllib3/util/ssl_.py", line 495, in _ssl_wrap_socket_impl
return ssl_context.wrap_socket(sock)
File "/usr/lib/python3.9/ssl.py", line 500, in wrap_socket
return self.sslsocket_class._create(
File "/usr/lib/python3.9/ssl.py", line 997, in _create
raise ValueError("check_hostname requires server_hostname")
ValueError: check_hostname requires server_hostname
Writing the proxy dict as below does not help either:
proxyDict = {
    "http": "http://127.0.0.1:8123",
    "https": "http://127.0.0.1:8123"
}
The proxy 127.0.0.1:8123 works fine; I can download resources from the web through a proxy with the youtube-dl command:
youtube-dl --proxy http://127.0.0.1:8118 $url
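For comparison, here is a minimal check that the same proxy is usable from Python's requests library (a sketch; the target URL is just an example):

import requests

proxyDict = {
    "http": "http://127.0.0.1:8123",
    "https": "http://127.0.0.1:8123",
}

# A 200 response here means the proxy itself can tunnel HTTPS traffic.
r = requests.get("https://www.dropbox.com", proxies=proxyDict, timeout=10)
print(r.status_code)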
Updated for Paulo's advice:
Updated for Markus' advice:
import ssl
ssl._create_default_https_context = ssl._create_unverified_context
ssl.SSLContext.verify_mode = property(lambda self: ssl.CERT_NONE, lambda self, newval: None)

import dropbox

access_token = "xxxxxxxx"
file_from = "/home/debian/sample.sql"
file_to = "/mydoc"

proxyDict = {
    "http": "http://127.0.0.1:8123",
    "https": "https://127.0.0.1:8123"
}

mysesh = dropbox.create_session(1, proxyDict)
dbx = dropbox.Dropbox(access_token, session=mysesh)

with open(file_from, 'rb') as f:
    dbx.files_upload(f.read(), file_to)
It encounters the error below:
/home/debian/.local/lib/python3.9/site-packages/urllib3/connectionpool.py:981: InsecureRequestWarning: Unverified HTTPS request is being made to host '127.0.0.1'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
warnings.warn(
Traceback (most recent call last):
File "<stdin>", line 2, in <module>
File "/home/debian/.local/lib/python3.9/site-packages/dropbox/base.py", line 3208, in files_upload
r = self.request(
File "/home/debian/.local/lib/python3.9/site-packages/dropbox/dropbox_client.py", line 326, in request
res = self.request_json_string_with_retry(host,
File "/home/debian/.local/lib/python3.9/site-packages/dropbox/dropbox_client.py", line 476, in request_json_string_with_retry
return self.request_json_string(host,
File "/home/debian/.local/lib/python3.9/site-packages/dropbox/dropbox_client.py", line 596, in request_json_string
self.raise_dropbox_error_for_resp(r)
File "/home/debian/.local/lib/python3.9/site-packages/dropbox/dropbox_client.py", line 639, in raise_dropbox_error_for_resp
raise AuthError(request_id, err)
dropbox.exceptions.AuthError: AuthError('xxxxxxxxxxxxxxxxxxxxxx', AuthError('invalid_access_token', None))
Update for Life is complex's advice:
I tried many times to get mysesh = dropbox.create_session(1,proxyDict) to work correctly.
I decided to look at the code for dropbox-sdk-python and noted that it calls requests.Session(), so I decided to use that directly instead of dropbox.create_session().
import requests
from dropbox import Dropbox
from dropbox.files import WriteMode

access_token = "my_access_token"
file_from = 'test.docx'
file_to = '/test.docx'

# https://free-proxy-list.net
proxyDict = {
    "http": "http://50.218.57.65:80",
    "https": "https://83.229.73.175:80"
}

s = requests.Session()
s.proxies = proxyDict

dbx = Dropbox(access_token, session=s)

with open(file_from, 'rb') as f:
    file_content = f.read()
    dbx.files_upload(f=file_content, path=file_to, mode=WriteMode.overwrite, mute=False)
Here is a screenshot of the file being written to Dropbox.
I have tried this code with multiple proxy servers and it works each time.
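As a side note, requests honors the standard proxy environment variables when session.trust_env is left at its default, so an equivalent setup could look like this (a sketch, not verified against Dropbox here; the token and proxy address are placeholders):

import os
import requests
from dropbox import Dropbox

# requests picks these up automatically while session.trust_env is True (the default)
os.environ["HTTP_PROXY"] = "http://127.0.0.1:8123"
os.environ["HTTPS_PROXY"] = "http://127.0.0.1:8123"

s = requests.Session()          # no explicit s.proxies needed
dbx = Dropbox("my_access_token", session=s)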
TL;DR:
So far, my understanding is that it may be:
- misuse of urllib, or
- bad HTTPS certificates.
Solution (maybe):
urllib format
If I remember correctly, urllib changed its proxy format at some point from
proxyDict = {
    'http': '8.88.888.8:8888',
    'https': '8.88.888.8:8888'
}
to
proxyDict = {
    'https': 'https://8.88.888.8:8888',
    'http': 'http://8.88.888.8:8888',
}
Have you tried both formats?
You must have a problem with one of the following:
- your proxy is not forwarding some traffic the right way, or
- your access token is wrong, or
- the Dropbox app has the wrong permissions set,
because this code (which is basically what you have in your question, even without disabling the SSL certificate check!) works just fine with my access token put into the environment variable DROPBOX_ACCESS_TOKEN.
import dropbox
import sys
import os

DROPBOX_ACCESS_TOKEN = os.getenv('DROPBOX_ACCESS_TOKEN')

def uploadFile(fromFilePath, toFilePath):
    proxy = '127.0.0.1:3128'  # locally installed squid proxy server
    proxyDict = {
        "http": "http://" + proxy,
        "https": "http://" + proxy  # connection to proxy is http!!
    }
    session = dropbox.create_session(1, proxyDict)
    client = dropbox.Dropbox(DROPBOX_ACCESS_TOKEN, session)
    client.files_upload(open(fromFilePath, "rb").read(), toFilePath)
    print("Done uploading {} to {}".format(fromFilePath, toFilePath))

if __name__ == "__main__":
    uploadFile(sys.argv[1], sys.argv[2])
Be aware, though, that the access token, once it is generated, has the permissions that were in effect at the time of token generation. If you change the app's permissions AFTER generating the token, the token will still have the original permissions!
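A quick way to sanity-check the token itself is to call a simple authenticated endpoint (a minimal sketch; it assumes the token has at least the account_info.read scope, and the token string is a placeholder):

import dropbox

dbx = dropbox.Dropbox("my_access_token")
# Raises dropbox.exceptions.AuthError if the token is invalid, expired, or lacks the scope.
print(dbx.users_get_current_account().email)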
EDIT: It looks like the Dropbox API is clever enough to NOT use the proxy if it can reach the target directly. Thus this code works with ANYTHING you put into proxyDict, and it is not at all clear whether the code would still work if it really had to go through the proxy. I am working on verifying that and will update the answer accordingly.
Update: I installed squid on my MacBook and used http://127.0.0.1:3128 as the proxy in the above code, but the logs showed that the code never even tried to go through the proxy. Once I set the environment variables http_proxy and https_proxy to "http://127.0.0.1:3128", the request WOULD go through the proxy and proceed successfully. So... either there is something going on that I don't fully understand, or the Dropbox API has some problem with the proxy definitions in the create_session call. Time to look at the API source code, I guess...
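One way to narrow that down is to inspect the session object before handing it to Dropbox (a sketch; it assumes create_session returns a plain requests.Session, which is what the SDK source suggests):

import dropbox

proxyDict = {
    "http": "http://127.0.0.1:3128",
    "https": "http://127.0.0.1:3128",
}

mysesh = dropbox.create_session(1, proxyDict)
# If the proxies never show up on the session, the SDK is not applying them,
# which would explain why the squid logs stay empty.
print(type(mysesh), getattr(mysesh, "proxies", None))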
Thanks for Life is complex's code. I added the Files and folders permission, re-generated the Dropbox token, and executed the same code (nothing changed) with the new token: done!
It had nothing to do with the proxy settings, just the Dropbox app settings!
Related
I am trying to create a simple Python bot for my project. Everything works fine on my localhost, but the same code stops working behind the network firewall, which requires an environment proxy to be set.
import socket
from slack import RTMClient

proxy = 'http://XXXX:NNNN'
token = 'XXXX'

class Botso():
    def __init__(self):
        self.proxy = self.get_proxy()
        self.rt = RTMClient(
            token=token,
            connect_method='rtm.start',
            proxy=self.proxy
        )

    def get_proxy(self):
        host = socket.gethostname()
        if "internal" in host:
            return None
        elif "XXX" in host:
            return proxy

@RTMClient.run_on(event="message")
def say_hello(**payload):
    data = payload['data']
    web_client = payload['web_client']
    if 'text' in data and 'hii' in data['text']:
        channel_id = data['channel']
        thread_ts = data['ts']
        user = data['user']  # This is not the username but the user ID (the format is either U*** or W***)
        web_client.chat_postMessage(
            channel=channel_id,
            text=f"Hi <#{user}>!"
            # thread_ts=thread_ts
        )

if __name__ == '__main__':
    botso = Botso()
    botso.rt.start()
The error I am getting while initializing the RTMClient is
Traceback (most recent call last):
File "botso.py" , in <module>
botso.rt.start()
File "/usr/lib64/python3.6/http/client.py", line 974, in send
self.connect()
File "/usr/lib64/python3.6/http/client.py", line 1407, in connect
super().connect()
File "/usr/lib64/python3.6/http/client.py", line 950, in connect
self._tunnel()
File "/usr/lib64/python3.6/http/client.py", line 929, in _tunnel
message.strip()))
OSError: Tunnel connection failed: 403 Forbidden
I have other code in the same environment which uses the same proxy to send Slack messages, and it works fine by using the requests API.
params = {
    'token': self.slack_token,
    'types': ['public_channel', 'private_channel']
}
slack_url = 'https://slack.com/api/conversations.list'
response = requests.get(url=slack_url, params=params, proxies=self.proxy).json()
How can we make the RTMClient work with a proxy and Python 3?
I couldn't find much help in the Slack API documentation.
I need to access a Twitter user's timeline as a JSON string and return the first 250 chars.
Twitter1.py:
import urllib.request, urllib.parse, urllib.error
import twurl
import ssl

TWITTER_URL = 'https://api.twitter.com/1.1/statuses/user_timeline.json'

ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

while True:
    print('')
    acct = input('Enter Twitter Account:')
    if (len(acct) < 1): break
    url = twurl.augment(TWITTER_URL,
                        {'screen_name': acct, 'count': '2'})
    print('Retrieving', url)
    connection = urllib.request.urlopen(url, context=ctx)
    data = connection.read().decode()
    print(data[:250])
    headers = dict(connection.getheaders())
    print('Remaining', headers['x-rate-limit-remaining'])
An error related to urllib occurs in output:
Enter Twitter Account:jack
...
Traceback (most recent call last):
File "C:\Users\User\...\twitter1.py", line 18, in <module>
connection = urllib.request.urlopen(url, context=ctx)
File "C:\Users\User\anaconda3\lib\urllib\request.py", line 222, in urlopen
return opener.open(url, data, timeout)
File "C:\Users\User\anaconda3\lib\urllib\request.py", line 531, in open
response = meth(req, response)
File "C:\Users\User\anaconda3\lib\urllib\request.py", line 641, in http_response
'http', request, response, code, msg, hdrs)
File "C:\Users\User\anaconda3\lib\urllib\request.py", line 569, in error
return self._call_chain(*args)
File "C:\Users\User\anaconda3\lib\urllib\request.py", line 503, in _call_chain
result = func(*args)
File "C:\Users\User\anaconda3\lib\urllib\request.py", line 649, in http_error_default
raise HTTPError(req.full_url, code, msg, hdrs, fp)
HTTPError: Bad Request
I cannot figure out the source of the issue. The syntax appears correct, and the correct API information was entered into a separate Python file, hidden.py. twurl and oauth are imported from twurl.py and oauth.py to access the data (twurl.py is included below). hidden.py simply returns my API info from a function oauth(), and oauth.py is well known, so it is also excluded here. Any guidance would be greatly appreciated.
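For reference, hidden.py is expected to look roughly like this (a sketch; the key values are placeholders):

# hidden.py -- returns the Twitter API credentials consumed by twurl.py
def oauth():
    return {
        "consumer_key": "xxxxxxxxxxxxxxxxxxxxx",
        "consumer_secret": "xxxxxxxxxxxxxxxxxxxxx",
        "token_key": "xxxxxxxxxxxxxxxxxxxxx",
        "token_secret": "xxxxxxxxxxxxxxxxxxxxx",
    }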
twurl.py:
import urllib.request, urllib.parse, urllib.error
import oauth
import hidden

def augment(url, parameters):
    secrets = hidden.oauth()
    consumer = oauth.OAuthConsumer(secrets['consumer_key'],
                                   secrets['consumer_secret'])
    token = oauth.OAuthToken(secrets['token_key'], secrets['token_secret'])

    oauth_request = oauth.OAuthRequest.from_consumer_and_token(consumer,
        token=token, http_method='GET', http_url=url,
        parameters=parameters)
    oauth_request.sign_request(oauth.OAuthSignatureMethod_HMAC_SHA1(),
                               consumer, token)
    return oauth_request.to_url()
Follow-up: this was resolved soon after I posted; the issue was a domain being blocked by an antivirus filter.
I'm calling a REST API with requests in Python and so far have been successful when I set verify=False.
Now I have to use a client-side cert that I need to import for authentication, and I'm getting this error every time I use the cert (.pfx). cert.pfx is password protected.
r = requests.post(url, params=payload, headers=headers,
                  data=payload, verify='cert.pfx')
This is the error I'm getting:
Traceback (most recent call last):
File "C:\Users\me\Desktop\test.py", line 65, in <module>
r = requests.post(url, params=payload, headers=headers, data=payload, verify=cafile)
File "C:\Python33\lib\site-packages\requests\api.py", line 88, in post
return request('post', url, data=data, **kwargs)
File "C:\Python33\lib\site-packages\requests\api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Python33\lib\site-packages\requests\sessions.py", line 346, in request
resp = self.send(prep, **send_kwargs)
File "C:\Python33\lib\site-packages\requests\sessions.py", line 449, in send
r = adapter.send(request, **kwargs)
File "C:\Python33\lib\site-packages\requests\adapters.py", line 322, in send
raise SSLError(e)
requests.exceptions.SSLError: unknown error (_ssl.c:2158)
I've also tried using openssl to get a .pem and key, but with the .pem I get SSL: CERTIFICATE_VERIFY_FAILED.
Can someone please tell me how to import the certs and where to place them? I tried searching but am still faced with the same issue.
I had this same problem. The verify parameter refers to the server's certificate. You want the cert parameter to specify your client certificate.
import requests
cert_file_path = "cert.pem"
key_file_path = "key.pem"
url = "https://example.com/resource"
params = {"param_1": "value_1", "param_2": "value_2"}
cert = (cert_file_path, key_file_path)
r = requests.get(url, params=params, cert=cert)
I had the same problem. To resolve it, I came to know that we have to send the root CA along with the certificate and its key, as shown below:
response = requests.post(url, data=your_data, cert=('path_client_certificate_file', 'path_certificate_key_file'), verify='path_rootCA')
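If the same certificates are needed for many calls, they can also be set once on a requests.Session (a sketch; the URL and file paths are placeholders):

import requests

s = requests.Session()
s.cert = ("path_client_certificate_file", "path_certificate_key_file")  # client cert + private key
s.verify = "path_rootCA"  # CA bundle used to verify the server's certificate

response = s.post("https://example.com/resource", data={"key": "value"})
print(response.status_code)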
I am new to Python and am trying to use the Google Drive APIs, but I am facing this issue. This is the error I am getting after running python quickstart.py:
Traceback (most recent call last):
File "quickstart.py", line 9, in <module>
creds = store.get()
File "/usr/local/lib/python3.6/site-packages/oauth2client/client.py", line 407, in get
return self.locked_get()
File "/usr/local/lib/python3.6/site-packages/oauth2client/file.py", line 54, in locked_get
credentials = client.Credentials.new_from_json(content)
File "/usr/local/lib/python3.6/site-packages/oauth2client/client.py", line 302, in new_from_json
module_name = data['_module']
KeyError: '_module'
I have generated the client_secret.json file as per the Python Quickstart tutorial.
All the files are in the same directory as quickstart.py.
Here is how my quickstart.py file looks.
from __future__ import print_function
from apiclient.discovery import build
from httplib2 import Http
from oauth2client import file, client, tools

# Setup the Drive v3 API
SCOPES = 'https://www.googleapis.com/auth/drive.metadata.readonly'
store = file.Storage('credentials.json')
creds = store.get()
if not creds or creds.invalid:
    flow = client.flow_from_clientsecrets('client_secret.json', SCOPES)
    creds = tools.run_flow(flow, store)
service = build('drive', 'v3', http=creds.authorize(Http()))

# Call the Drive v3 API
results = service.files().list(
    pageSize=10, fields="nextPageToken, files(id, name)").execute()
items = results.get('files', [])

if not items:
    print('No files found.')
else:
    print('Files:')
    for item in items:
        print('{0} ({1})'.format(item['name'], item['id']))
UPDATE:
I checked, and it turns out that the credentials.json file is supposed to be auto-generated on the first run, and for some reason this is not happening.
KeyError: '_module'
The key _module is supposed to be present in the credentials.json file, which is why this error is thrown. I am not sure what is missing. Can someone please tell me how to resolve this issue?
I had a similar issue. Try removing both files from your directory, credentials.json and client_secret.json. Then re-generate your credentials and re-create client_secret.json; this worked for me.
I am implementing the Python client for TestRail in my project (http://docs.gurock.com/testrail-api2/bindings-python).
I am running an API call, "get_test", and I am receiving the error below:
File "playground.py", line 10, in <module>
case = client.send_get('get_test/53948')
File "/Users/bhdev/Work/Python/TBH/testrail.py", line 36, in send_get
return self.__send_request('GET', uri, None)
File "/Users/bhdev/Work/Python/TBH/testrail.py", line 70, in __send_request
response = urllib.request.urlopen(request).read()
File "/usr/local/Cellar/python/3.6.5/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py", line 223, in urlopen
return opener.open(url, data, timeout)
File "/usr/local/Cellar/python/3.6.5/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py", line 526, in open
response = self._open(req, data)
File "/usr/local/Cellar/python/3.6.5/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py", line 544, in _open
'_open', req)
File "/usr/local/Cellar/python/3.6.5/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py", line 504, in _call_chain
result = func(*args)
File "/usr/local/Cellar/python/3.6.5/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py", line 1361, in https_open
context=self._context, check_hostname=self._check_hostname)
File "/usr/local/Cellar/python/3.6.5/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py", line 1320, in do_open
raise URLError(err)
urllib.error.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:833)>
My code
from testrail import *
client = APIClient('https://******')
client.user = '*****'
client.password = '******'
case = client.send_get('get_test/xxxx')
print(case)
How can I bypass the SSL certificate issue?
Thanks.
The root cause of your error is that the machine you are running the request on does not trust the certificate your TestRail server is using.
You can either:
- fix the SSL certificate on your TestRail server, or
- modify the bindings (testrail.py) to disable certificate verification as per the instructions in PEP 476 :: Opting Out.
Based on the current Python bindings source code on GitHub, you should be able to disable certificate verification as follows:
1. Change import json, base64 (line 14) to import json, base64, ssl.
2. Add the following line to the __init__(self, base_url) method:
   self.context = ssl._create_unverified_context()
3. Modify the following line:
   response = urllib.request.urlopen(request).read()
   to pass in the unverified context:
   response = urllib.request.urlopen(request, context=self.context).read()
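Outside the binding, the same technique looks roughly like this (a minimal sketch; the URL is a placeholder, and the real testrail.py also adds the Authorization and Content-Type headers):

import ssl
import urllib.request

# An SSL context that skips certificate verification (see PEP 476).
context = ssl._create_unverified_context()

request = urllib.request.Request("https://testrail.example.com/index.php?/api/v2/get_test/53948")
response = urllib.request.urlopen(request, context=context).read()
print(response[:200])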
This addition solved the problem in my case:
import ssl
ssl._create_default_https_context = ssl._create_unverified_context
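For example, applied before the client from the question is created (a sketch; the host and credentials are placeholders):

import ssl
ssl._create_default_https_context = ssl._create_unverified_context

from testrail import APIClient

client = APIClient('https://testrail.example.com')
client.user = 'user@example.com'
client.password = 'api_key_or_password'

# With the unverified default context in place, the call no longer fails on the certificate.
print(client.send_get('get_test/53948'))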