Building an asynchronous websocket iterator - python-3.x

I have a class I created that is a websocket client, and it connects to my data endpoint with no issues. However, I want my socket to run forever. I am using the websockets Python library. Here's a sample:
from abc import ABCMeta

from websockets import connect

class Socket(metaclass=ABCMeta):
    def __init__(self, url: str):
        self.url = url

    async def __aenter__(self):
        self._conn = connect(self.url, ping_interval=None)
        self.websocket = await self._conn.__aenter__()
        return self

    async def __aexit__(self, *args, **kwargs):
        await self._conn.__aexit__(*args, **kwargs)
Now, I am able to write an async with statement with no problems. My issue arises when I want my socket to remain connected.
Reading the library documentation, one suggested way is to do the following:
async for socket in websockets.connect(url, ping_interval=None):
    try:
        ...  # your logic
    except websockets.ConnectionClosed:
        continue
This allows me to keep trying to reconnect if there is an issue. How would I incorporate this into my class as an iterator? I tried the following but got this error:
TypeError: 'async for' received an object from __aiter__ that does not implement __anext__: coroutine
after I added the following code to the class above:
async def __aiter__(self):
    return self

async def __anext__(self):
    async for websocket in connect(self.url, ping_interval=None):
        try:
            self.websocket = await websocket
        except StopIteration:
            raise StopAsyncIteration
I am not posting my entire code here, as the goal is to encapsulate this socket class in another class so that I can write:
async for object in MyCustomClassSocketIterator(url):
    try:
        await object.send()
        await object.receive()
    except websockets.ConnectionClosed as e:
        ...  # etc.
where the encapsulating class has implemented receive() and send() functions. Each time the program starts, the object is instantiated asynchronously. If anything breaks, it attempts to connect again when a ConnectionClosed is raised. Thanks
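A minimal sketch of one way the iterator protocol could be satisfied (an assumption for illustration, not code from the post; it assumes websockets 10+, where the object returned by connect() itself supports async for, and the class name ReconnectingSocket is made up): __aiter__ must be a regular method that returns an object implementing __anext__, and __anext__ awaits one (re)connection per step.
from websockets import connect

class ReconnectingSocket:
    def __init__(self, url: str):
        self.url = url
        self.websocket = None
        self._connections = None

    def __aiter__(self):
        # Plain def: 'async for' expects __aiter__ to return the iterator
        # itself, not a coroutine; that mismatch caused the TypeError above.
        self._connections = connect(self.url, ping_interval=None).__aiter__()
        return self

    async def __anext__(self):
        # Wait for the next (re)connection from websockets' own reconnecting
        # iterator; it raises StopAsyncIteration if iteration ever ends.
        self.websocket = await self._connections.__anext__()
        return self
With that, async for obj in ReconnectingSocket(url): would yield the wrapper again after every reconnect, so send()/receive() always see a live connection.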

Related

SQLAlchemy async engine with ORM unable to execute basic queries

I have switched my SQLAlchemy database code to use an async engine and am having trouble establishing basic functionality.
I have a class that starts the database like this:
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.orm import sessionmaker

class PostgresDb:
    def __init__(self):
        self._session = None
        self._engine = None

    def __getattr__(self, name):
        return getattr(self._session, name)

    def init(self):
        self._engine = create_async_engine(
            ENGINE,
            echo=True,
            future=True)
        self._session = sessionmaker(
            self._engine, expire_on_commit=False, class_=AsyncSession
        )()

    async def create_all(self):
        async with self._engine.begin() as conn:
            await conn.run_sync(Base.metadata.create_all)

    # Other methods...
Example of how create_all gets called:
async def init_db_tables():
    self.init()
    await self.create_all()

asyncio.run(init_db_tables())
When I want to achieve basic functionality, like getting all the tables, I can do something like:
def get_tables(self):
    with create_engine(SYNCHRONOUS_ENGINE).connect() as conn:
        meta = MetaData(conn, schema=SCHEMA)
        meta.reflect(views=True)
        table_list = meta.tables
        return table_list
This is not ideal, as I need to pass in a synchronous engine connection instead of the async engine I am using in the class. It is also very verbose and shouldn't need to be set up like this for every query.
I have tried doing something like this to select the table 'appuser' from the database:
async def get_tables(self):
    self.init()
    async with self._session() as session:
        q = select('appuser')
        result = await session.execute(q)
        curr = result.scalars()
        for i in curr:
            print(i)
which I've tried calling like this:
db = PostgresDb()

asyncio.run(db.get_tables())
asyncio.get_event_loop().run_until_complete(db.get_tables())
These both give this error:
async with self._session() as session:
TypeError: 'AsyncSession' object is not callable
Calling it plainly with db.get_tables() instead warns:
RuntimeWarning: coroutine 'PostgresDb.get_tables' was never awaited
  db.get_tables()
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
Trying to use inspector with run_sync like this:
async def get_tables(self):
    await self.init()
    async with self._engine.begin() as conn:
        inspector = conn.run_sync(inspect(conn))
        table_names = await conn.run_sync(inspector.get_table_names())
        print(table_names)
returns this error:
sqlalchemy.exc.NoInspectionAvailable: Inspection on an AsyncConnection is currently not supported. Please use ``run_sync`` to pass a callable where it's possible to call ``inspect`` on the passed connection.
I have read the documentation at https://docs.sqlalchemy.org/en/14/orm/extensions/asyncio.html#sqlalchemy.ext.asyncio.AsyncConnection.run_sync but I am still unclear about how to work cleanly with async engines.
Thanks for any and all insight you're able to offer on how to execute a simple query to get all the tables in SQLAlchemy using the async engine!
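For reference, a minimal sketch of the run_sync pattern the documentation describes (an assumption, not code from the post): the callable handed to run_sync receives the underlying synchronous Connection, and inspect() is called on that connection inside the callable.
from sqlalchemy import inspect
from sqlalchemy.ext.asyncio import create_async_engine

async def get_table_names(engine):
    def _table_names(sync_conn):
        # run_sync passes this callable the wrapped synchronous Connection,
        # so inspection runs there rather than on the AsyncConnection.
        return inspect(sync_conn).get_table_names()

    async with engine.connect() as conn:
        return await conn.run_sync(_table_names)

# e.g. asyncio.run(get_table_names(create_async_engine(ENGINE)))
This keeps everything on the one async engine instead of falling back to a separate synchronous engine for reflection.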

Call from method in class to @class.method using Python

I am trying to make a class that will eventually be turned into a library. To do this, I am trying to do something like what discord.py does; the idea comes from it.
The code that discord.py uses is:
@bot.event
async def on_ready():
    print('discord bot is ready')
where '@bot' is just an object that I created beforehand by doing
bot = discord()
and '.event' is a preprogrammed, ready-to-use method. on_ready() is a function that then gets called for you.
I want to have a way to create this from my own class, and from there manage the entire code using async functions.
How do I do this in my own code?
You need to implement a class whose public methods are decorators. For example, this class implements a scheduler that exposes scheduling through a decorator:
import asyncio

class Scheduler:
    def __init__(self):
        self._torun = []

    def every_second(self, fn):
        self._torun.append(fn)
        return fn  # return fn so the decorated name still refers to the function

    async def _main(self):
        while True:
            for fn in self._torun:
                asyncio.create_task(fn())
            await asyncio.sleep(1)

    def run(self):
        asyncio.run(self._main())
You'd use it like this:
sched = Scheduler()

@sched.every_second
async def hello():
    print('hello')

@sched.every_second
async def world():
    print('world')

sched.run()
The class mimics discord.py in that it has a run method that calls asyncio.run for you. I would prefer to expose an async main instead, so the user can call asyncio.run(bot.main()) and have control over which event loop is used. But the example follows discord.py's convention, to make the API more familiar to its users.
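For completeness, a small sketch of that alternative (my addition, not part of the original answer): drop run() and expose an async entry point so the caller owns the event loop.
import asyncio

class Scheduler:
    def __init__(self):
        self._torun = []

    def every_second(self, fn):
        self._torun.append(fn)
        return fn

    async def main(self):
        # Public async entry point instead of run(); the caller decides
        # how and where the event loop runs.
        while True:
            for fn in self._torun:
                asyncio.create_task(fn())
            await asyncio.sleep(1)

sched = Scheduler()

@sched.every_second
async def hello():
    print('hello')

asyncio.run(sched.main())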

Python 3.7 Non-Blocking Request?

I'd like to do a non-blocking http request in Python 3.7. What I'm trying to do is described well in this SO post, but it doesn't yet have an accepted answer.
Here's my code so far:
import asyncio
from aiohttp import ClientSession

[.....]

async def call_endpoint_async(endpoint, data):
    async with ClientSession() as session, session.post(url=endpoint, data=data) as result:
        response = await result.read()
        print(response)
        return response

class CreateTestScores(APIView):
    permission_classes = (IsAuthenticated,)

    def post(self, request):
        [.....]
        asyncio.run(call_endpoint_async(url, data))
        print('cp #1')  # <== asyncio.run BLOCKS -- PRINT STATEMENT DOESN'T RUN UNTIL asyncio.run RETURNS
What is the correct way to do an Ajax-style non-blocking http request in Python?
Asyncio makes it easy to make a non-blocking request if your program runs in asyncio. For example:
async def doit():
    task = asyncio.create_task(call_endpoint_async(url, data))
    print('cp #1')
    await asyncio.sleep(1)
    print('is it done?', task.done())
    await task
    print('now it is done')
But this requires that the "caller" be async as well. In your case you want the whole asyncio event loop to run in the background. This can be achieved by running it in a separate thread, e.g.:
import concurrent.futures

pool = concurrent.futures.ThreadPoolExecutor()

# ...

def post(self, request):
    fut = pool.submit(asyncio.run, call_endpoint_async(url, data))
    print('cp #1')
However, in that case you're not gaining anything by using asyncio. Since you're using threads anyway, you could just as well call a sync function such as requests.get() to begin with.
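If the view later needs the response, the Future returned by pool.submit can be polled or waited on; a hypothetical continuation of the snippet above (not in the original answer):
# fut comes from pool.submit(...) above.
if fut.done():
    response = fut.result()  # re-raises any exception from call_endpoint_async
else:
    print('request still in flight')  # or block with fut.result(timeout=...)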

Python async CancelledError() with no details

The following code fails, and I'm not able to get the actual error; I just get numerous CancelledError messages:
import aiobotocore, asyncio

async def replicate_to_region(chunks, region):
    session = aiobotocore.get_session()
    client = session.create_client('dynamodb', region_name=region)
    start = 0
    while True:
        chunk = chunks[start]
        item = {'my_table': chunk}
        response = await client.batch_write_item(RequestItems=item)

async def main():
    asyncio.gather(*(replicate_to_region(payloads, region) for region in regions))

asyncio.run(main())
I get the following errors:
client_session: <aiohttp.client.ClientSession object at 0x7f6fb65a34a8>
Unclosed client session
client_session: <aiohttp.client.ClientSession object at 0x7f6fb64c82b0>
_GatheringFuture exception was never retrieved
future: <_GatheringFuture finished exception=CancelledError()>
concurrent.futures._base.CancelledError
_GatheringFuture exception was never retrieved
future: <_GatheringFuture finished exception=CancelledError()>
I've tried quite a number of variations of the replicate_to_region function but they all fail with the same error above. It would be useful just to be able to see what the actual error is.
async def main():
    asyncio.gather(...)
asyncio.gather() is an awaitable itself:
awaitable asyncio.gather(*aws, loop=None, return_exceptions=False)
This means you should use await when dealing with it:
async def main():
    await asyncio.gather(*(replicate_to_region(payloads, region) for region in regions))
Off-topic:
I haven't worked with aiobotocore and am not sure if it's important, but it's better to do as the documentation says. In particular, you should probably use async with when creating a client, as the example shows.
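A rough sketch of that async with pattern (assumed from aiobotocore's documented usage, not part of the original answer; exact import paths vary between aiobotocore versions):
import asyncio

import aiobotocore

async def replicate_to_region(chunks, region):
    session = aiobotocore.get_session()
    # Treating the client as an async context manager closes it cleanly,
    # which also avoids the "Unclosed client session" warnings above.
    async with session.create_client('dynamodb', region_name=region) as client:
        for chunk in chunks:
            await client.batch_write_item(RequestItems={'my_table': chunk})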

How to mock "async with" statements?

I'm trying to write tests for a method that uses "async with" statements (in this case, aioredis's connection pool). I want to mock the connection to Redis, but I'm having trouble figuring out how.
Here's what I have so far:
from asyncio import Future
from unittest.mock import MagicMock

import pytest

# The thing I'm trying to test
async def set_value(redis, value):
    # Do things
    async with redis.get() as conn:
        await conn.set("key", value)

# My mock classes
class MockRedis():
    def get(self):
        return MockAsyncPool()

class MockAsyncPool(MagicMock):
    async def __aenter__(self):
        conn = MagicMock()
        f = Future()
        f.set_result(True)
        conn.set = MagicMock(return_value=f)
        return conn

    def __aexit__(self, exc_type, exc_val, exc_tb):
        pass

# The actual test
@pytest.mark.asyncio
async def test_get_token():
    redis = MockRedis()
    token = await set_value(redis, 'something')
    assert token is not None
I run it with:
py.test path/to/file.py
And I'm getting this error:
> await conn.set("key", value)
E TypeError: object NoneType can't be used in 'await' expression
__aexit__ also needs to be asynchronous (it needs to return an awaitable):
async def __aexit__(self, exc_type, exc_val, exc_tb):
    pass
Without being async, it returns None instead of a coroutine, so awaiting the context manager's exit raises an error. As for the very misleading error message, I have created this issue to point out that it could be greatly improved.
