I have been trying to send a GET request to a website, but I get the following error:
requests.exceptions.SSLError: HTTPSConnectionPool(host='courses.fit.hcmus.edu.vn', port=443): Max retries exceeded with url: /course/view.php?id=2040 (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1108)')))
This is my code up to now:
import requests

# Fetch the course page; this is the line that raises the SSLError above
response = requests.get('https://courses.fit.hcmus.edu.vn/course/view.php?id=2040')
print(response.text)
The site requires an account and password. I've already tried using cookies, but I got the same error. I've also tried upgrading pip and similar things, but that didn't work.
Thanks,
Tien Dung
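A common cause of this error is that Python cannot find a root certificate that validates the site. A minimal sketch, assuming the problem is the local CA bundle rather than the server's certificate chain: pass an explicit bundle (here certifi's, which requests itself depends on) through the verify parameter.

import certifi
import requests

# Point requests at an explicit CA bundle instead of the system default
response = requests.get(
    'https://courses.fit.hcmus.edu.vn/course/view.php?id=2040',
    verify=certifi.where(),
)
print(response.status_code)

If the server sends an incomplete certificate chain instead, you would need to obtain its intermediate certificate and append it to that bundle (the path is deployment-specific, so it is not shown here).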
Related
I'm trying to download a repo from Hugging Face using this code:
from huggingface_hub import snapshot_download

snapshot_download(repo_id="openclimatefix/era5-land", repo_type="dataset",
                  cache_dir="/home/saben1/scratch/o/slurms/data_4")
After 3 hours of running, the repo wasn't completely downloaded and I got this error:
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='cdn-lfs.huggingface.co', port=443): Read timed out.
I added the parameter resume_download=True (to resume downloading from where it stopped) and increased the etag_timeout like this:
from huggingface_hub import snapshot_download

snapshot_download(repo_id="openclimatefix/era5-land", repo_type="dataset",
                  cache_dir="/home/saben1/scratch/o/slurms/data_4",
                  etag_timeout=120, resume_download=True)
But I still get an error, now a different one:
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /api/datasets/openclimatefix/era5-land/revision/main (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x2aba8ecc7070>: Failed to establish a new connection: [Errno 101] Network is unreachable'))
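The second traceback ([Errno 101] Network is unreachable) suggests the machine lost network access entirely, which is common on cluster compute nodes with no internet route; in that case the download has to run on a login or data-transfer node instead. If the failures are merely transient, a simple retry loop is often enough, since snapshot_download reuses files already complete in cache_dir. A minimal sketch under that assumption (recent huggingface_hub versions resume interrupted files automatically, so resume_download should not be needed):

import time
from huggingface_hub import snapshot_download

# Retry a flaky download; files already complete in cache_dir are skipped
for attempt in range(5):
    try:
        snapshot_download(repo_id="openclimatefix/era5-land", repo_type="dataset",
                          cache_dir="/home/saben1/scratch/o/slurms/data_4",
                          etag_timeout=120)
        break  # finished without a network error
    except Exception as err:  # e.g. requests.exceptions.ConnectionError
        print(f"Attempt {attempt + 1} failed: {err}")
        time.sleep(60)  # give the connection time to recover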
We are facing an issue in Docker: we removed the proxy configuration, but we still get a proxy error.
Connection Error occurred. ProxyError: HTTPSConnectionPool(host='', port=443): Max retries exceeded with url: (Caused by ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 403 Forbidden',)))
We removed the proxy configuration from Docker's config.json file as well as from the docker.service.d folder.
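Note that config.json and docker.service.d configure the Docker client and daemon; a running container can still carry HTTP_PROXY/HTTPS_PROXY environment variables baked into the image or injected at docker run time (check with env inside the container). A minimal sketch of how Python code inside the container can ignore those variables, assuming requests is the client in use (the target URL is a placeholder):

import os
import requests

# Drop any proxy variables inherited from the image or from `docker run -e`
for var in ("HTTP_PROXY", "HTTPS_PROXY", "http_proxy", "https_proxy"):
    os.environ.pop(var, None)

session = requests.Session()
session.trust_env = False  # ignore proxy settings from the environment entirely
response = session.get("https://pypi.org")  # hypothetical target URL
print(response.status_code)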
I am a complete beginner, but I need to use KModes.
I found the documentation for KModes.
As described there, I started with this command to install kmodes in a Jupyter Notebook (via Anaconda):
!pip install kmodes
but I get this error message:
Could not fetch URL https://pypi.org/simple/kmodes/: There was a problem confirming the ssl certificate: HTTPSConnectionPool(host='pypi.org', port=443): Max retries exceeded with url: /simple/kmodes/ (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1129)'))) - skipping
How should I install KModes, please?
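This usually means pip cannot validate pypi.org's certificate, often because a corporate proxy intercepts TLS. A common workaround, sketched below, is to mark the PyPI hosts as trusted; note that this skips certificate verification for those hosts, so the cleaner fix is to install the proxy's root certificate and point pip at it (pip config set global.cert /path/to/ca.pem, path hypothetical).

!pip install --trusted-host pypi.org --trusted-host files.pythonhosted.org kmodes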
I'm developing with Python 3.9.6 and I encounter this error:
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1129)
Can someone help me?
Thanks
Your question needs a bit more context, but try this:
import ssl

# Warning: this disables certificate verification for every HTTPS connection
# opened through Python's default SSL context in this process.
ssl._create_default_https_context = ssl._create_unverified_context
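Be aware that this monkey-patch turns off verification for the whole process. A narrower sketch, assuming only one request needs the workaround (the URL and bundle path are placeholders):

import requests

# Scoped workaround: skip verification for this call only (still insecure)
response = requests.get("https://example.com", verify=False)

# Preferred: keep verification on and supply a CA bundle that validates the host
response = requests.get("https://example.com", verify="/path/to/ca-bundle.pem")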
I am trying to crawl Twitter with the Twython and Tweepy modules, following their documentation. Each time I get stuck with a "Max retries exceeded" error:
# Python 3.8.2
# Twython 3.8.2
from twython import Twython

twitter = Twython(APP_KEY, APP_SECRET,
                  OAUTH_TOKEN, OAUTH_TOKEN_SECRET)
twitter.verify_credentials(verify=False)
twitter.get_home_timeline()
Error
twython.exceptions.TwythonError: HTTPSConnectionPool(host='api.twitter.com', port=443): Max retries exceeded with url: /1.1/account/verify_credentials.json (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fa1b4d86100>: Failed to establish a new connection: [Errno 101] Network is unreachable'))
Any suggestions on what I am missing?
You get this error simply because Twitter is filtered/blocked in your area!
Try connecting through a proxy!
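If a proxy is indeed required, Twython can route its traffic through one via the client_args dict, which it forwards to the underlying requests session. A minimal sketch with a hypothetical proxy address:

from twython import Twython

# client_args is passed through to the requests session Twython uses internally
twitter = Twython(APP_KEY, APP_SECRET, OAUTH_TOKEN, OAUTH_TOKEN_SECRET,
                  client_args={
                      "proxies": {
                          "http": "http://127.0.0.1:3128",   # hypothetical proxy
                          "https": "http://127.0.0.1:3128",
                      },
                      "timeout": 30,  # seconds
                  })
twitter.verify_credentials()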