AIOHTTP, can't read multiple parameters from request - python-3.x

I'm trying to read two parameters, lat and lon, from my request using request.rel_url.query["lat"] and request.rel_url.query["lon"], respectively (following this SO post).
When debugging, query has only one element; the second parameter is not read at all.
The call:
$ curl -i -X GET 127.0.0.1:8000/test?lat=47.659049&lon=9.166003
HTTP/1.1 500 Internal Server Error
Content-Type: text/plain; charset=utf-8
Content-Length: 55
Date: Tue, 22 Sep 2020 06:49:15 GMT
Server: Python/3.8 aiohttp/3.6.2
Connection: close
500 Internal Server Error
Server got itself in trouble
The python error:
Error handling request
Traceback (most recent call last):
File "/home/lorenz/.local/lib/python3.8/site-packages/aiohttp/web_protocol.py", line 418, in start
resp = await task
File "/home/lorenz/.local/lib/python3.8/site-packages/aiohttp/web_app.py", line 458, in _handle
resp = await handler(request)
File "/home/lorenz/Documents/copernicus-service/topography/topography_server.py", line 199, in handle_copernicus
lon = request.rel_url.query["lon"]
KeyError: 'lon'
What's the reason for this, and how can I read the 2nd parameter correctly?
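For reference, a minimal sketch of a handler that reads both query parameters (the handler name, route, and response format are illustrative assumptions, not taken from the original code):
from aiohttp import web

async def handle_test(request: web.Request) -> web.Response:
    query = request.rel_url.query
    # .get() avoids the KeyError / 500 when a parameter is missing
    lat = query.get("lat")
    lon = query.get("lon")
    if lat is None or lon is None:
        return web.Response(status=400, text="lat and lon are required")
    return web.Response(text=f"lat={lat}, lon={lon}")

app = web.Application()
app.add_routes([web.get("/test", handle_test)])

if __name__ == "__main__":
    web.run_app(app, port=8000)
Note also that in most shells an unquoted & backgrounds the command, so lon=9.166003 never reaches the server; quoting the URL, as in curl -i '127.0.0.1:8000/test?lat=47.659049&lon=9.166003', normally sends both parameters.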

Related

No MESSAGE-ID and get imap_tools work for imap.mail.yahoo.com

The question is twofold: about getting MESSAGE-ID, and about using imap_tools. For a handmade email client in Python I need to reduce the amount of data read from the server (presently it takes 2 minutes to read the whole mbox folder of ~170 messages for Yahoo), and I believe that having MESSAGE-ID will help with that.
imap_tools has an IDLE command, which is essential to keep the Yahoo server connection alive, and other features that I believe will simplify the code.
To learn about MESSAGE-ID I started with the following code (file fetch_ssl.py):
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import imaplib
import email
import os
import ssl
import conf
# Why UID==1 has no MESSAGE-ID ?
if __name__ == '__main__':
    args = conf.parser.parse_args()
    host, port, env_var = conf.config[args.host]
    if 0 < args.verbose:
        print(host, port, env_var)
    with imaplib.IMAP4_SSL(host, port,
                           ssl_context=ssl.create_default_context()) as mbox:
        user, pass_ = os.getenv('USER_NAME_EMAIL'), os.getenv(env_var)
        mbox.login(user, pass_)
        mbox.select()
        typ, data = mbox.search(None, 'ALL')
        for num in data[0].split():
            typ, data = mbox.fetch(num, '(RFC822)')
            msg = email.message_from_bytes(data[0][1])
            print(f'num={int(num)}, MESSAGE-ID={msg["MESSAGE-ID"]}')
            ans = input('Continue[Y/n]? ')
            if ans.upper() in ('', 'Y'):
                continue
            else:
                break
Where conf.py is:
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import argparse
HOST = 'imap.mail.yahoo.com'
PORT = 993
config = {'gmail': ('imap.gmail.com', PORT, 'GMAIL_APP_PWD'),
          'yahoo': ('imap.mail.yahoo.com', PORT, 'YAHOO_APP_PWD')}
parser = argparse.ArgumentParser(description="""\
Fetch MESSAGE-ID from imap server""")
parser.add_argument('host', choices=config)
parser.add_argument('-verbose', '-v', action='count', default=0)
fetch_ssl.py outputs:
$ python fetch_ssl.py yahoo
num=1, MESSAGE-ID=None
Continue[Y/n]?
num=2, MESSAGE-ID=<83895140.288751@communications.yahoo.com>
Continue[Y/n]? n
I'd like to understand why the message with UID == 1 has no MESSAGE-ID. Does that happen from time to time (i.e., are there messages with no MESSAGE-ID)? How should I handle these cases? I haven't found such cases for Gmail.
Then I attempted to do the same with imap_tools (version 0.56.0) (file fetch_tools.py):
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import os
import ssl
from imap_tools import MailBoxTls
import conf
# https://github.com/ikvk/imap_tools/blob/master/examples/tls.py
# advices
# ctx.load_cert_chain(certfile="./one.crt", keyfile="./one.key")
if __name__ == '__main__':
    args = conf.parser.parse_args()
    host, port, env_var = conf.config[args.host]
    if 0 < args.verbose:
        print(host, port, env_var)
    user, pass_ = os.getenv('USER_NAME_EMAIL'), os.getenv(env_var)
    ctx = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
    ctx.options &= ~ssl.OP_NO_SSLv3
    # imaplib.abort: socket error: EOF
    with MailBoxTls(host=host, port=port, ssl_context=ctx) as mbox:
        mbox.login(user, pass_, 'INBOX')
        for msg in mbox.fetch():
            print(msg.subject, msg.date_str)
Command
$ python fetch_tools.py yahoo
outputs:
Traceback (most recent call last):
File "/home/vlz/Documents/python-scripts/programming_python/Internet/Email/ymail/imap_tools_lab/fetch_tools.py", line 20, in <module>
with MailBoxTls(host=host, port=port, ssl_context=ctx) as mbox:
File "/home/vlz/Documents/.venv39/lib/python3.9/site-packages/imap_tools/mailbox.py", line 322, in __init__
super().__init__()
File "/home/vlz/Documents/.venv39/lib/python3.9/site-packages/imap_tools/mailbox.py", line 35, in __init__
self.client = self._get_mailbox_client()
File "/home/vlz/Documents/.venv39/lib/python3.9/site-packages/imap_tools/mailbox.py", line 328, in _get_mailbox_client
client = imaplib.IMAP4(self._host, self._port, self._timeout) # noqa
File "/usr/lib/python3.9/imaplib.py", line 205, in __init__
self._connect()
File "/usr/lib/python3.9/imaplib.py", line 247, in _connect
self.welcome = self._get_response()
File "/usr/lib/python3.9/imaplib.py", line 1075, in _get_response
resp = self._get_line()
File "/usr/lib/python3.9/imaplib.py", line 1185, in _get_line
raise self.abort('socket error: EOF')
imaplib.abort: socket error: EOF
Command
$ python fetch_tools.py gmail
produces identical results. What are my mistakes?
Using Python 3.9.2, Debian GNU/Linux 11 (bullseye), imap_tools 0.56.0.
EDIT
Headers from the message with no MESSAGE-ID
X-Apparently-To: vladimir.zolotykh@yahoo.com; Sun, 25 Oct 2015 20:54:21 +0000
Return-Path: <mail@product.communications.yahoo.com>
Received-SPF: fail (domain of product.communications.yahoo.com does not designate 216.39.62.96 as permitted sender)
...
X-Originating-IP: [216.39.62.96]
Authentication-Results: mta1029.mail.bf1.yahoo.com from=product.communications.yahoo.com; domainkeys=neutral (no sig); from=product.communications.yahoo.com; dkim=pass (ok)
Received: from 127.0.0.1 (EHLO n3-vm4.bullet.mail.gq1.yahoo.com) (216.39.62.96)
by mta1029.mail.bf1.yahoo.com with SMTPS; Sun, 25 Oct 2015 20:54:21 +0000
DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=product.communications.yahoo.com; s=201402-std-mrk-prd; t=1445806460; bh=5PTgF8Jghm92xeMD5mSHp6A3eRVV70PWo1oQ15K7Tfk=; h=Date:From:Reply-To:To:Subject:From:Subject; b=D7ItgOiuLbiexJGHvORgbpRi22X+sYso6gwZKDXVca79DxMMy2R1dUtZTIg7tcft1lovVJUDw/7fC51orDltRidlfnpayeY8lT+94DRlSBwopuxgOqqR9oTTjTBZ0oEvdxUcXl/q54N2GxuBFvmg8UO0OZoCnFPpUVYo9x4arMjt/0TOW1Q5d/yjdmO7iwiued/rliP/Bsq0TaZYcb0oCAT7Q50tb1fB7wcXLYNSC1OCQ1l1LajbUqmU1LWWNse36mUUTBieO2sZT0ERFrHaCTaTNQSXKQG2AxYF7Dd/8i0Iq3xqdcS0bDpjmWE25uoKvCdtXtUbylsuQSChuLFMTw==
Received: from [216.39.60.185] by n3.bullet.mail.gq1.yahoo.com with NNFMP; 25 Oct 2015 20:54:20 -0000
Received: from [98.137.101.84] by t1.bullet.mail.gq1.yahoo.com with NNFMP; 25 Oct 2015 20:54:20 -0000
Date: 25 Oct 2015 20:54:20 +0000
Received: from [127.0.0.1] by nu-repl01.direct.gq1.yahoo.com with NNFMP; 25 Oct 2015 20:54:20 -0000
X-yahoo-newman-expires: 1445810060
From: "Yahoo Mail" <mail#product.communications.yahoo.com>
Reply-To: replies#communications.yahoo.com
To: <ME>#yahoo.com
Subject: Welcome to Yahoo! Vladimir
X-Yahoo-Newman-Property: ydirect
Content-Type: text/html
Content-Length: 25180
I skipped only X-YMailISG.
EDIT II
Of 167 messages, 21 have no MESSAGE-ID header.
fetch_ssl.py takes 4m12.342s, and fetch_tools.py takes 3m41.965s.
It simply looks like your email without a Message-ID legitimately does not have one; the welcome email Yahoo sent you apparently lacks it. Since it's a system-generated email, that's not unexpected. You'd just have to skip over it, for example as sketched below.
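For instance, a minimal sketch (adapting the fetch loop from fetch_ssl.py above; not code from the original post) that skips such messages:
import email
import imaplib

def print_message_ids(mbox: imaplib.IMAP4) -> None:
    """Print the MESSAGE-ID of every message in the selected folder, skipping ones without it."""
    typ, data = mbox.search(None, 'ALL')
    for num in data[0].split():
        typ, msg_data = mbox.fetch(num, '(RFC822)')
        msg = email.message_from_bytes(msg_data[0][1])
        if msg['Message-ID'] is None:
            continue  # e.g. a system-generated mail that genuinely lacks the header
        print(f'num={int(num)}, MESSAGE-ID={msg["Message-ID"]}')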
The second problem is that you need to use imap_tools.MailBox.
Looking at the documentation and source at the repo it appears that the relevant classes to use are:
MailBox - for a normal encrypted connection. This is what most email servers use these days, aka IMAPS (imap with SSL/TLS)
MailBoxTls - For a STARTTLS connection: this creates a plaintext connection then upgrades it later by using a STARTTLS command in the protocol. The internet has mostly gone to the "always encrypted" rather than "upgrade" paradigm, so this is not the class to use.
MailBoxUnencrypted - Standard IMAP without SSL/TLS. You should not use this on the public internet.
The naming is a bit confusing. MailBox corresponds to imaplib.IMAP4_SSL; MailBoxTls corresponds to imaplib.IMAP4 followed by starttls() on the resulting connection; and MailBoxUnencrypted corresponds to imaplib.IMAP4 with no security applied. I imagine it's this way so the most common one (MailBox) is a safe default.
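For example, a minimal sketch of the MailBox variant, reusing conf.py from above (this follows the documented imap_tools usage pattern but is a sketch, not a drop-in guarantee):
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import os

from imap_tools import MailBox

import conf

if __name__ == '__main__':
    args = conf.parser.parse_args()
    host, port, env_var = conf.config[args.host]
    user, pass_ = os.getenv('USER_NAME_EMAIL'), os.getenv(env_var)
    # MailBox wraps imaplib.IMAP4_SSL, so no hand-built ssl context is needed here
    with MailBox(host=host, port=port).login(user, pass_, 'INBOX') as mbox:
        for msg in mbox.fetch():
            print(msg.subject, msg.date_str)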

Loading a GRIB from the web in Python without saving the file locally

I would like to download a GRIB file from the web:
Opt1: https://noaa-gfs-bdp-pds.s3.amazonaws.com/gfs.20210801/12/atmos/gfs.t12z.pgrb2.1p00.f000
Opt2: https://www.ncei.noaa.gov/data/global-forecast-system/access/grid-003-1.0-degree/analysis/202108/20210801/gfs_3_20210801_1200_000.grb2
(FYI, data is identical)
A similar question is asked here:
How can a GRIB file be opened with pygrib without first downloading the file?
However, ultimately a final solution isn't given there, as that data source returns multiple responses (causing issues). When I tried to recreate the code:
import urllib.request

import pygrib

url = 'https://noaa-gfs-bdp-pds.s3.amazonaws.com/gfs.20210801/12/atmos/gfs.t12z.pgrb2.1p00.f000'
opener = urllib.request.build_opener()  # opener constructed as in the linked question
urllib.request.install_opener(opener)
req = urllib.request.Request(url, headers={'User-Agent': 'Mozilla/5.0'})
data = urllib.request.urlopen(req, timeout=300)
grib = pygrib.fromstring(data.read())
My error is different:
ECCODES ERROR : grib_handle_new_from_message: No final 7777 in message!
Traceback (most recent call last):
File "grib_data_test.py", line 139, in <module>
grib = pygrib.fromstring(data.read())
File "src\pygrib\_pygrib.pyx", line 627, in pygrib._pygrib.fromstring
File "src\pygrib\_pygrib.pyx", line 1390, in pygrib._pygrib.gribmessage._set_projparams
File "src\pygrib\_pygrib.pyx", line 1127, in pygrib._pygrib.gribmessage.__getitem__
RuntimeError: b'Key/value not found'
I have also tried to work directly with the osgeo.gdal library (preferable for my project, as it is already in use). Documentation: https://gdal.org/user/virtual_file_systems.html#vsimem-in-memory-files:
Attempt1: url = "/vsicurl/https://noaa-gfs-bdp-pds.s3.amazonaws.com/gfs.20210801/12/atmos/gfs.t12z.pgrb2.0p50.f000"
Attempt2: url = "/vsis3/https://noaa-gfs-bdp-pds.s3.amazonaws.com/gfs.20210801/12/atmos/gfs.t12z.pgrb2.0p50.f000"
Attempt3: url = "/vsicurl/https://www.ncei.noaa.gov/data/global-forecast-system/access/grid-003-1.0-degree/analysis/202108/20210801/gfs_3_20210801_0000_000.grb2"
grib = gdal.Open(url)
Errors:
Attempt1: ERROR 11: CURL error: schannel: next InitializeSecurityContext failed: Unknown error (0x80092012) - The revocation function was unable to check revocation for the certificate.
Attempt2: ERROR 11: HTTP response code: 0
Attempt3: ERROR 4: /vsicurl/https://www.ncei.noaa.gov/data/global-forecast-system/access/grid-003-1.0-degree/analysis/202108/20210801/gfs_3_20210801_0000_000.grb2 is a grib file, but no raster dataset was successfully identified.
For Attempts 1 and 2: I feel like both should work here; normally you do not need any specific AWS credentials or connection to access the publicly available S3 bucket data (as it is accessed using urllib above).
For Attempt3, there is a similar issue here:
https://gis.stackexchange.com/questions/395867/opening-a-grib-from-the-web-with-gdal-in-python-using-vsicurl-throws-error-on-m
However, I do not experience the same issue; my output:
gdalinfo /vsicurl/https://www.ncei.noaa.gov/data/global-forecast-system/access/grid-003-1.0-degree/analysis/202108/20210801/gfs_3_20210801_0000_000.grb2 --debug on --config CPL_CURL_VERBOSE YES
HTTP: libcurl/7.78.0 Schannel zlib/1.2.11 libssh2/1.9.0
HTTP: GDAL was built against curl 7.77.0, but is running against 7.78.0.
* Couldn't find host www.ncei.noaa.gov in the (nil) file; using defaults
* Trying 205.167.25.177:443...
* Connected to www.ncei.noaa.gov (205.167.25.177) port 443 (#0)
* schannel: disabled automatic use of client certificate
> HEAD /data/global-forecast-system/access/grid-003-1.0-degree/analysis/202108/20210801/gfs_3_20210801_0000_000.grb2 HTTP/1.1
Host: www.ncei.noaa.gov
Accept: */*
* schannel: server closed the connection
* Mark bundle as not supporting multiuse
< HTTP/1.1 200 OK
< Date: Mon, 20 Sep 2021 14:38:59 GMT
< Server: Apache
< Strict-Transport-Security: max-age=31536000
< Last-Modified: Sun, 01 Aug 2021 04:46:49 GMT
< ETag: "287c05a-5c87824012f22"
< Accept-Ranges: bytes
< Content-Length: 42451034
< Access-Control-Allow-Origin: *
< Access-Control-Allow-Headers: X-Requested-With, Content-Type
< Connection: close
<
* Closing connection 0
* schannel: shutting down SSL/TLS connection with www.ncei.noaa.gov port 443
VSICURL: GetFileSize(https://www.ncei.noaa.gov/data/global-forecast-system/access/grid-003-1.0-degree/analysis/202108/20210801/gfs_3_20210801_0000_000.grb2)=42451034 response_code=200
VSICURL: Downloading 0-16383 (https://www.ncei.noaa.gov/data/global-forecast-system/access/grid-003-1.0-degree/analysis/202108/20210801/gfs_3_20210801_0000_000.grb2)...
* Couldn't find host www.ncei.noaa.gov in the (nil) file; using defaults
* Hostname www.ncei.noaa.gov was found in DNS cache
* Trying 205.167.25.177:443...
* Connected to www.ncei.noaa.gov (205.167.25.177) port 443 (#1)
* schannel: disabled automatic use of client certificate
> GET /data/global-forecast-system/access/grid-003-1.0-degree/analysis/202108/20210801/gfs_3_20210801_0000_000.grb2 HTTP/1.1
Host: www.ncei.noaa.gov
Accept: */*
Range: bytes=0-16383
* schannel: failed to decrypt data, need more data
* Mark bundle as not supporting multiuse
< HTTP/1.1 206 Partial Content
< Date: Mon, 20 Sep 2021 14:39:00 GMT
< Server: Apache
< Strict-Transport-Security: max-age=31536000
< Last-Modified: Sun, 01 Aug 2021 04:46:49 GMT
< ETag: "287c05a-5c87824012f22"
< Accept-Ranges: bytes
< Content-Length: 16384
< Content-Range: bytes 0-16383/42451034
< Access-Control-Allow-Origin: *
< Access-Control-Allow-Headers: X-Requested-With, Content-Type
< Connection: close
<
* schannel: failed to decrypt data, need more data
* schannel: failed to decrypt data, need more data
* schannel: failed to decrypt data, need more data
* Closing connection 1
* schannel: shutting down SSL/TLS connection with www.ncei.noaa.gov port 443
VSICURL: Got response_code=206
VSICURL: Downloading 81920-98303 (https://www.ncei.noaa.gov/data/global-forecast-system/access/grid-003-1.0-degree/analysis/202108/20210801/gfs_3_20210801_0000_000.grb2)...
* Couldn't find host www.ncei.noaa.gov in the (nil) file; using defaults
* Hostname www.ncei.noaa.gov was found in DNS cache
* Trying 205.167.25.177:443...
* Connected to www.ncei.noaa.gov (205.167.25.177) port 443 (#2)
* schannel: disabled automatic use of client certificate
* schannel: failed to receive handshake, SSL/TLS connection failed
* Closing connection 2
VSICURL: DownloadRegion(https://www.ncei.noaa.gov/data/global-forecast-system/access/grid-003-1.0-degree/analysis/202108/20210801/gfs_3_20210801_0000_000.grb2): response_code=0, msg=schannel: failed to receive handshake, SSL/TLS connection failed
VSICURL: Got response_code=0
GRIB: ERROR: Ran out of file in Section 7
ERROR: Problems Jumping past section 7
ERROR 4: /vsicurl/https://www.ncei.noaa.gov/data/global-forecast-system/access/grid-003-1.0-degree/analysis/202108/20210801/gfs_3_20210801_0000_000.grb2 is a grib file, but no raster dataset was successfully identified.
gdalinfo failed - unable to open '/vsicurl/https://www.ncei.noaa.gov/data/global-forecast-system/access/grid-003-1.0-degree/analysis/202108/20210801/gfs_3_20210801_0000_000.grb2'.
Basically, I can currently download the files using urllib, and then:
file_out = open(<file.path>, 'wb')
file_out.write(data.read())
file_out.close()
grib = gdal.Open(<file.path>)
does what I require. But there is a desire not to save the files locally during this temporary processing step, because of the system we are working within.
Python Versions:
gdal 3.3.1 py38hacca965_1 defaults
pygrib 2.1.4 py38hceae430_0 defaults
python 3.8.10 h7840368_1_cpython defaults
Cheers.
This worked for me, though I was using the GEFS data hosted on AWS instead. I believe GFS is on AWS also. No account should be needed, so it would just be a matter of changing the bucket_name and s3_object values to point to GFS data instead:
import pygrib
import boto3
from botocore import UNSIGNED
from botocore.config import Config
s3 = boto3.client('s3', config=Config(signature_version=UNSIGNED))
bucket_name = 'noaa-gefs-pds'
s3_object = 'gefs.20220425/00/atmos/pgrb2ap5/gec00.t00z.pgrb2a.0p50.f000'
obj = s3.get_object(Bucket=bucket_name, Key=s3_object)['Body'].read()
grbs = pygrib.fromstring(obj)
print(type(grbs))
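Since the question prefers osgeo.gdal, here is a hedged sketch of the same idea that keeps the download in memory via GDAL's /vsimem virtual filesystem instead of writing to disk (bucket and key reuse the GEFS example above; the /vsimem path name is arbitrary):
import boto3
from botocore import UNSIGNED
from botocore.config import Config
from osgeo import gdal

s3 = boto3.client('s3', config=Config(signature_version=UNSIGNED))
bucket_name = 'noaa-gefs-pds'
s3_object = 'gefs.20220425/00/atmos/pgrb2ap5/gec00.t00z.pgrb2a.0p50.f000'
data = s3.get_object(Bucket=bucket_name, Key=s3_object)['Body'].read()

# Expose the bytes as an in-memory "file" that gdal.Open can read like a normal path
vsi_path = '/vsimem/temp.grb2'
gdal.FileFromMemBuffer(vsi_path, data)
dataset = gdal.Open(vsi_path)
print(dataset.RasterCount if dataset is not None else 'open failed')
gdal.Unlink(vsi_path)  # free the in-memory file when done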

Dataflow job fails with HttpError, NotImplementedError

I'm running a Dataflow job which I think should work, but it fails after 1.5 hours with what look like network errors. It works fine when run against a subset of the data.
The first trouble sign is a whole string of warnings like this:
Refusing to split <dataflow_worker.shuffle.GroupedShuffleRangeTracker object at 0x7f2bcb629950> at b'\xa4r\xa6\x85\x00\x01': proposed split position is out of range [b'\xa4^E\xd2\x00\x01', b'\xa4r\xa6\x85\x00\x01'). Position of last group processed was b'\xa4r\xa6\x84\x00\x01'.
Then there are four errors which seem to be about writing CSV files to GCS:
Error in _start_upload while inserting file gs://(redacted).csv: Traceback (most recent call last): File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/gcsio.py", line 565, in _start_upload self._client.objects.Insert(self._insert_request, upload=self._upload) File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py", line 1156, in Insert upload=upload, upload_config=upload_config) File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 731, in _RunMethod return self.ProcessHttpResponse(method_config, http_response, request) File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 737, in ProcessHttpResponse self.__ProcessHttpResponse(method_config, http_response, request)) File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 604, in __ProcessHttpResponse http_response, method_config=method_config, request=request) apitools.base.py.exceptions.HttpError: HttpError accessing <https://www.googleapis.com/resumable/upload/storage/v1/b/(redacted).csv&uploadType=resumable&upload_id=(redacted)>: response: <{'content-type': 'text/plain; charset=utf-8', 'x-guploader-uploadid': '(redacted)', 'content-length': '0', 'date': 'Wed, 08 Jul 2020 22:17:28 GMT', 'server': 'UploadServer', 'status': '503'}>, content <>
Error in _start_upload while inserting file gs://(redacted).csv: Traceback (most recent call last): File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/gcsio.py", line 565, in _start_upload self._client.objects.Insert(self._insert_request, upload=self._upload) File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py", line 1156, in Insert upload=upload, upload_config=upload_config) File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 715, in _RunMethod http_request, client=self.client) File "/usr/local/lib/python3.7/site-packages/apitools/base/py/transfer.py", line 908, in InitializeUpload return self.StreamInChunks() File "/usr/local/lib/python3.7/site-packages/apitools/base/py/transfer.py", line 1020, in StreamInChunks additional_headers=additional_headers) File "/usr/local/lib/python3.7/site-packages/apitools/base/py/transfer.py", line 971, in __StreamMedia self.RefreshResumableUploadState() File "/usr/local/lib/python3.7/site-packages/apitools/base/py/transfer.py", line 873, in RefreshResumableUploadState self.stream.seek(self.progress) File "/usr/local/lib/python3.7/site-packages/apache_beam/io/filesystemio.py", line 301, in seek offset, whence, self.position, self.last_block_position)) NotImplementedError: offset: 0, whence: 0, position: 411, last: 411
The Dataflow job ID is 2020-07-07_13_08_31-7649894576933400587 -- if anyone from Google Cloud Support is able to look at this I'd be very grateful. Thanks very much.
P.S. I asked a similar question last year (Dataflow job fails at BigQuery write with backend errors); the resolution was to use --experiments=use_beam_bq_sink, which I am already doing.
You can safely ignore the "Refusing to split" errors. This just means that the split position the Dataflow service provided was probably received by the worker after that position had already been read. Hence the worker has to ignore the split request.
Error "Error in _start_upload while inserting" seems more problematic and seems to be similar to https://issues.apache.org/jira/browse/BEAM-7014. I suspect this to be a rare flake though so I'm not sure if this was the reason for your job failure (the job only fails of the same workitem failed four times).
Can you contact Google Cloud Support so that they can look into your job?
I will mention this in the JIRA.
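Regarding the --experiments=use_beam_bq_sink flag mentioned in the P.S.: for reference, a minimal sketch of how such a flag is typically passed programmatically (project, region, and bucket names are placeholders):
from apache_beam.options.pipeline_options import PipelineOptions

# Equivalent to passing --experiments=use_beam_bq_sink on the command line
options = PipelineOptions(
    runner='DataflowRunner',
    project='my-project',
    region='us-central1',
    temp_location='gs://my-bucket/tmp',
    experiments=['use_beam_bq_sink'],
)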

Pyzmail/IMAPclient: Can't figure out what key to use

I'm following this guide: https://automatetheboringstuff.com/chapter16/#calibre_link-45
to scrape emails and I am having issues using pyzmail.PyzMessage.factory(). I keep getting a KeyError.
I took the advice from here: Python email bot Pyzmail/IMAPclient error
but I continued to get the same error.
import pprint

import imapclient
import pyzmail

imapObj = imapclient.IMAPClient("imap.gmail.com", ssl=True)
imapObj.login("MY_EMAIL_ADDRESS", "MY_PASSWORD")
imapObj.select_folder("INBOX", readonly=False)
UIDs = imapObj.gmail_search("test1")
print(UIDs)
rawMessages = imapObj.fetch(UIDs, ["BODY[]"])
pprint.pprint(rawMessages)
message = pyzmail.PyzMessage.factory(rawMessages[40041][b'BODY[]'])
I am getting this error:
[7156]
Traceback (most recent call last):
File "C:/Users/Logan/PycharmProjects/email_sending_test/venv/main1.py", line 17, in <module>
message = pyzmail.PyzMessage.factory(rawMessages[0][b'BODY[]'])
KeyError: b'BODY[]'
defaultdict(<class 'dict'>,
{7156: {b'BODY[]': b'MIME-Version: 1.0\r\nDate: Thu, 3 Jan 2019 16:'
b'51:54 -0500\r\nMessage-ID: <CAB4Lt1swQPJvCL3ot'
b'8E7q2Pc9_C26hZxMdUgcZd9LbJUyhZbvw@mail.gmail'
b'.com>\r\nSubject: test1\r\nFrom: Rob Roberts'
b' <swimmingonanarwhal@gmail.com>\r\nTo: Rob Rob'
b'erts <swimmingonanarwhal@gmail.com>\r\nContent'
b'-Type: multipart/alternative; boundary="0000'
b'000000006f5b28057e94c5de"\r\n\r\n--000000000'
b'0006f5b28057e94c5de\r\nContent-Type: text/plai'
b'n; charset="UTF-8"\r\n\r\ntrying this ou'
b't\r\n\r\n--0000000000006f5b28057e94c5de\r\nCon'
b'tent-Type: text/html; charset="UTF-8"\r\n\r'
b'\n<div dir="ltr">trying this out</div>\r\n\r'
b'\n--0000000000006f5b28057e94c5de--',
b'SEQ': 6962}})
Process finished with exit code 1
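The pprint output above shows that the fetch result is keyed by the message UID (7156 here) rather than a fixed index, so one plausible fix (a sketch, not an answer from the original thread) is to iterate over the entries returned for the UIDs from gmail_search and skip any without a BODY[] section:
import pyzmail

def parse_messages(raw_messages):
    """Yield (uid, PyzMessage) pairs, skipping fetch entries without a BODY[] section."""
    for uid, raw in raw_messages.items():
        if b'BODY[]' not in raw:
            continue  # guard in case an entry only carries metadata such as b'SEQ'
        yield uid, pyzmail.PyzMessage.factory(raw[b'BODY[]'])
Used with the snippet above, for uid, message in parse_messages(rawMessages): print(uid, message.get_subject()) would print the subject of each fetched message.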

stripe.error.APIConnectionError connected to openssl

I am dealing with the following error
Traceback (most recent call last):
File "once.py", line 1757, in <module>
once()
File "once.py", line 55, in once
stripe.Charge.all()
File "/Library/Python/2.7/site-packages/stripe/resource.py", line 438, in all
return cls.list(*args, **params)
File "/Library/Python/2.7/site-packages/stripe/resource.py", line 450, in list
response, api_key = requestor.request('get', url, params)
File "/Library/Python/2.7/site-packages/stripe/api_requestor.py", line 150, in request
method.lower(), url, params, headers)
File "/Library/Python/2.7/site-packages/stripe/api_requestor.py", line 281, in request_raw
method, abs_url, headers, post_data)
File "/Library/Python/2.7/site-packages/stripe/http_client.py", line 139, in request
self._handle_request_error(e)
File "/Library/Python/2.7/site-packages/stripe/http_client.py", line 159, in _handle_request_error
raise error.APIConnectionError(msg)
stripe.error.APIConnectionError: Unexpected error communicating with Stripe. If this problem persists,
let us know at support@stripe.com.
I get this error when running a simple test program that Stripe suggested:
import stripe
stripe.api_key = "blah bla"
stripe.api_base = "https://api-tls12.stripe.com"
print "stripe.VERSION = ", stripe.VERSION
if stripe.VERSION in ("1.13.0", "1.14.0", "1.14.1", "1.15.1", "1.16.0", "1.17.0", "1.18.0", "1.19.0"):
    print "Bindings update required."
try:
    stripe.Charge.all()
    print "TLS 1.2 supported, no action required."
except stripe.error.APIConnectionError:
    print "TLS 1.2 is not supported. You will need to upgrade your integration."
    raise
I do not understand why I get this error, since my stripe version is high enough:
stripe.VERSION = 1.55.2
and my OpenSSL version does support TLS 1.2:
>>$ openssl version
OpenSSL 1.0.2k 26 Jan 2017
>>$ which openssl
/usr/bin/openssl
Any ideas on how to debug this further? I am lost...
OK, I don't know what exactly caused the problem, but I got it working by changing the HTTP client:
client = stripe.http_client.PycurlClient()
stripe.default_http_client = client
I think the requests package is the default. pycurl seems to work in my case...
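As a side note on the debugging question above: /usr/bin/openssl is not necessarily the OpenSSL that Python's ssl module is linked against (notably with the system Python on macOS), so a hedged sketch of checking from inside the interpreter (api.stripe.com is Stripe's public API host; requires Python 2.7.9+ or 3.x for ssl.create_default_context):
import socket
import ssl

# Which OpenSSL is this interpreter actually linked against?
print(ssl.OPENSSL_VERSION)

# Which protocol and cipher does it negotiate with Stripe?
ctx = ssl.create_default_context()
sock = socket.create_connection(('api.stripe.com', 443))
tls = ctx.wrap_socket(sock, server_hostname='api.stripe.com')
print(tls.cipher())  # (cipher_name, protocol_version, secret_bits); protocol should be TLSv1.2 or newer
tls.close()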

Resources