Asynchronously writing to console from stdin and other sources - python-3.x

I'm trying to write some kind of renderer for the command line that should be able to print data from stdin and from another data source, using asyncio and blessed, which is an improved version of python-blessings.
Here is what I have so far:
import asyncio
from blessed import Terminal

@asyncio.coroutine
def render(term):
    while True:
        received = yield
        if received:
            print(term.bold + received + term.normal)

async def ping(renderer):
    while True:
        renderer.send('ping')
        await asyncio.sleep(1)

async def input_reader(term, renderer):
    while True:
        with term.cbreak():
            val = term.inkey()
            if val.is_sequence:
                renderer.send("got sequence: {0}.".format((str(val), val.name, val.code)))
            elif val:
                renderer.send("got {0}.".format(val))

async def client():
    term = Terminal()
    renderer = render(term)
    render_task = asyncio.ensure_future(renderer)
    pinger = asyncio.ensure_future(ping(renderer))
    inputter = asyncio.ensure_future(input_reader(term, renderer))
    done, pending = await asyncio.wait(
        [pinger, inputter, renderer],
        return_when=asyncio.FIRST_COMPLETED,
    )
    for task in pending:
        task.cancel()

if __name__ == '__main__':
    asyncio.get_event_loop().run_until_complete(client())
    asyncio.get_event_loop().run_forever()
For learning and testing purposes there is just a dumb ping that sends 'ping' each second, and another routine that should grab key input and also send it to my renderer.
But with this code, ping only appears once in the command line, while the input_reader works as expected. When I replace input_reader with a pong similar to ping, everything is fine.
This is how it looks when typing 'pong', even if it takes ten seconds to type it:
$ python async_term.py
ping
got p.
got o.
got n.
got g.

It seems like blessed is not built to work correctly with asyncio: inkey() is a blocking method. This will block any other coroutine.
You can use a loop with kbhit() and await asyncio.sleep() to yield control to other coroutines - but this is not a clean asyncio solution.
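A rough sketch of that workaround (only a sketch, assuming blessed's kbhit() accepts a timeout argument so it can poll without blocking):

async def input_reader(term, renderer):
    # Keep the terminal in cbreak mode for the whole loop instead of
    # re-entering it on every iteration.
    with term.cbreak():
        while True:
            if term.kbhit(timeout=0):    # poll: returns immediately
                val = term.inkey()       # a key is waiting, so this won't block
                if val.is_sequence:
                    renderer.send("got sequence: {0}.".format((str(val), val.name, val.code)))
                elif val:
                    renderer.send("got {0}.".format(val))
            # Give control back to the event loop so ping() can run.
            await asyncio.sleep(0.05)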

Related

How to stop awaiting the ainput function from another coroutine/task

I want to stop awaiting the ainput function at the 6th iteration of the for loop:
import asyncio
from aioconsole import ainput

class Test():
    def __init__(self):
        self.read = True

    async def read_from_concole(self):
        while self.read:
            command = await ainput('$>')
            if command == 'exit':
                self.read = False
            if command == 'greet':
                print('greetings :J')

    async def count(self):
        console_task = asyncio.create_task(self.read_from_concole())
        for c in range(10):
            await asyncio.sleep(.5)
            print(f'number: {c}')
            if c == 5:  # 6th iteration
                # What shoud I do here?
                # Following code doesn't meet my expectations
                self.read = False
                console_task.cancel()
                await console_task

    # async def run_both(self):
    #     await asyncio.gather(
    #         self.read_from_concole(),
    #         self.count()
    #     )

if __name__ == '__main__':
    o1 = Test()
    loop = asyncio.new_event_loop()
    loop.run_until_complete(o1.count())
Of course, this code is simplified, but it covers the idea: write a program where one coroutine can cancel another that is awaiting something (in this example, ainput).
asyncio.Task.cancel() is not the solution, because it won't make the coroutine stop awaiting (so I need to put an arbitrary character into the console and press Enter, which is not what I want).
I don't even know whether my approach makes sense; I'm a fresh asyncio user and for now I know only the basics. In my real project the situation is very similar: I have a GUI application and a console window. By clicking the 'X' button I want to close the window and terminate ainput (which reads commands from the console) to completely finish the program. The console part runs on a different thread, and because of that I can't close my program completely - this thread will run until ainput receives some input from the user.

Real time stdout redirect from a python function call to an async method

So I have a heavy, time-consuming function, my_heavy_function, and I need to redirect its output to the web interface that is calling it. I have a method to send messages to the web interface; let's call that method async push_message_to_user().
Basically, it's something like:
import time

def my_heavy_function():
    time_on = 0
    for i in range(20):
        time.sleep(1)
        print(f'time spend {time_on}')
        time_on = time_on + 1

async def push_message_to_user(message: str):
    # some lib implementation
    pass

if __name__ == "__main__":
    my_heavy_function()  # how push prints to the UI ?
Maybe there is a way of passing my_heavy_function(stdout_obj) and using that stdout_obj (a StringIO) to do something like stdout_obj.push(f'time spend {time_on}'). I can do that, but what I can't do is replace my_heavy_function() with an async version in order to call push_message_to_user() directly instead of print (it's used by other non-async routines).
What I would want is something like this (pseudocode):
with contextlib.redirect_output(my_prints):
    my_heavy_function()
while my_prints.readable():
    # realtime push
    await push_message_to_user(my_prints)
Thanks!
Thanks to the comment of @HTF, I finally managed to solve the problem with janus. I copied the example from the repo and modified it to receive a variable number of messages (because I don't know how many iterations my_heavy_function() will use):
import asyncio
import janus
import time

def my_heavy_function(sync_q):
    for i in range(10):
        sync_q.put(i)
        time.sleep(1)
    sync_q.put('end')  # is there a more elegant way to do this ?
    sync_q.join()

async def async_coro(async_q):
    while True:
        val = await async_q.get()
        print(f'arrived {val}')
        # send val to UI
        # await push_message_to_user(val)
        async_q.task_done()
        if val == 'end':
            break

async def main():
    queue = janus.Queue()
    loop = asyncio.get_running_loop()
    fut = loop.run_in_executor(None, my_heavy_function, queue.sync_q)
    await async_coro(queue.async_q)
    await fut
    queue.close()
    await queue.wait_closed()

asyncio.run(main())
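If my_heavy_function() really can't be touched at all, a variation on the same idea (only a sketch, not from the original post; QueueWriter and run_with_capture are hypothetical helpers) is to capture its print() output with contextlib.redirect_stdout and forward complete lines into the janus queue. Note that redirect_stdout swaps sys.stdout for the whole process, so the consumer side should write to the real stdout while the capture is active:

import asyncio
import contextlib
import sys
import time

import janus

def my_heavy_function():
    # unchanged legacy function that only knows how to print()
    for i in range(10):
        time.sleep(1)
        print(f'time spend {i}')

class QueueWriter:
    """Minimal file-like object that pushes each complete line to a sync queue."""
    def __init__(self, sync_q):
        self.sync_q = sync_q
        self._buf = ''
    def write(self, text):
        self._buf += text
        while '\n' in self._buf:
            line, self._buf = self._buf.split('\n', 1)
            self.sync_q.put(line)
    def flush(self):
        pass

def run_with_capture(sync_q):
    # runs in a worker thread; everything printed inside the block goes to the queue
    with contextlib.redirect_stdout(QueueWriter(sync_q)):
        my_heavy_function()
    sync_q.put(None)  # sentinel: no more output

async def consume(async_q):
    while True:
        line = await async_q.get()
        async_q.task_done()
        if line is None:
            break
        # write to the real stdout here; print() would go through the patched
        # sys.stdout and feed back into the queue while the capture is active
        sys.__stdout__.write(f'arrived {line}\n')
        # in the real app: await push_message_to_user(line)

async def main():
    queue = janus.Queue()
    loop = asyncio.get_running_loop()
    fut = loop.run_in_executor(None, run_with_capture, queue.sync_q)
    await consume(queue.async_q)
    await fut
    queue.close()
    await queue.wait_closed()

asyncio.run(main())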

running asyncio task concurrently and in background

So apologies, because I've seen this question asked a bunch, but having looked through all of those questions, none seem to fix my problem. My code looks like this:
TDSession = TDClient()
TDSession.grab_refresh_token()
q = queue.Queue(10)

asyncio.run(listener.startStreaming(TDSession, q))

while True:
    message = q.get()
    print('oh shoot!')
    print(message)
    orderEntry.placeOrder(TDSession=TDSession)
I have tried doing asyncio.create_task(listener.startStreaming(TDSession, q)); the problem is that I get
RuntimeError: no running event loop
sys:1: RuntimeWarning: coroutine 'startStreaming' was never awaited
which confused me, because this seemed to work in Can an asyncio event loop run in the background without suspending the Python interpreter?, which is what I'm trying to do.
The listener.startStreaming function looks like this:
async def startStreaming(TDSession, q):
    streamingClient = TDSession.create_streaming_session()
    streamingClient.account_activity()
    await streamingClient.build_pipeline()
    while True:
        message = await streamingClient.start_pipeline()
        message = parseMessage(message)
        if message != None:
            print('putting message into q')
            print( dict(message) )
            q.put(message)
Is there a way to make this work where I can run the listener in the background?
EDIT: I've tried this as well, but it only runs the consumer function instead of running both at the same time:
TDSession.grab_refresh_token()
q = queue.Queue(10)
loop = asyncio.get_event_loop()
loop.create_task(listener.startStreaming(TDSession, q))
loop.create_task(consumer(TDSession, q))
loop.run_forever()
As you found out, the asyncio.run function runs the given coroutine until it is complete. In other words, it waits for the coroutine returned by listener.startStreaming to finish before proceeding to the next line.
Using asyncio.create_task, on the other hand, requires the caller to already be running inside an asyncio loop. From the docs:
The task is executed in the loop returned by get_running_loop(), RuntimeError is raised if there is no running loop in current thread.
What you need is to combine the two, by creating a function that's async, and then call create_task inside that async function.
For example:
async def main():
    TDSession = TDClient()
    TDSession.grab_refresh_token()
    q = asyncio.Queue(10)
    streaming_task = asyncio.create_task(listener.startStreaming(TDSession, q))
    while True:
        message = await q.get()
        print('oh shoot!')
        print(message)
        orderEntry.placeOrder(TDSession=TDSession)
    await streaming_task  # If you want to wait for `startStreaming` to complete after the while loop

if __name__ == '__main__':
    asyncio.run(main())
Edit: From your comment I realized you want to use the producer-consumer pattern, so I also updated the example above to use asyncio.Queue instead of queue.Queue, so that the thread can jump between the producer (startStreaming) and the consumer (the while loop).
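One detail not spelled out above: asyncio.Queue.put is a coroutine, so if startStreaming keeps the producer role, its q.put(message) call also needs to be awaited, roughly like this (a sketch based on the question's loop):

    while True:
        message = await streamingClient.start_pipeline()
        message = parseMessage(message)
        if message != None:
            await q.put(message)  # asyncio.Queue.put is a coroutine and must be awaited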

Proper way to start a Trio server that manages multiple TCP Connections

I recently finished a project using a mix of Django and Twisted and realized it's overkill for what I need, which is basically just a way for my servers to communicate via TCP sockets. I turned to Trio, and so far I'm liking what I see, as it's way more direct (for what I need). That said, I just wanted to be sure I was doing this the right way.
I followed the tutorial, which taught the basics, but I need a server that can handle multiple clients at once. To this end, I came up with the following code:
import trio
from itertools import count

PORT = 12345
BUFSIZE = 16384
CONNECTION_COUNTER = count()

class ServerProtocol:
    def __init__(self, server_stream):
        self.ident = next(CONNECTION_COUNTER)
        self.stream = server_stream

    async def listen(self):
        while True:
            data = await self.stream.receive_some(BUFSIZE)
            if data:
                print('{} Received\t {}'.format(self.ident, data))
                # Process data here

class Server:
    def __init__(self):
        self.protocols = []

    async def receive_connection(self, server_stream):
        sp: ServerProtocol = ServerProtocol(server_stream)
        self.protocols.append(sp)
        await sp.listen()

async def main():
    await trio.serve_tcp(Server().receive_connection, PORT)

trio.run(main)
My issue here seems to be that each ServerProtocol runs listen on every cycle instead of waiting for data to be available.
I get the feeling I'm using Trio wrong, in which case, are there Trio best practices that I'm missing?
Your overall structure looks fine to me. The issue that jumps out at me is:
while True:
    data = await self.stream.receive_some(BUFSIZE)
    if data:
        print('{} Received\t {}'.format(self.ident, data))
        # Process data here
The guarantee that receive_some makes is: if the other side has closed the connection already, then it immediately returns an empty byte-string. Otherwise, it waits until there is some data to return, and then returns it as a non-empty byte-string.
So your code should work fine... until the other end closes the connection. Then it starts doing an infinite loop, where it keeps checking for data, getting an empty byte-string back (data = b""), so the if data: ... block doesn't run, and it immediately loops around to do it again.
One way to fix this would be (last 3 lines are new):
while True:
    data = await self.stream.receive_some(BUFSIZE)
    if data:
        print('{} Received\t {}'.format(self.ident, data))
        # Process data here
    else:
        # Other side has gone away
        break
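For completeness, a minimal Trio client to exercise the server above could look like this (a hypothetical test snippet, not part of the original question or answer):

import trio

async def test_client():
    # connect to the server from the question on localhost
    stream = await trio.open_tcp_stream('127.0.0.1', 12345)
    async with stream:
        await stream.send_all(b'hello')
        await trio.sleep(1)  # leave the connection open briefly, then close it

trio.run(test_client)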

how to make this code Non-Blocking with Asyncio?

I'm trying to create non-blocking code that allows me to create multiple clients to make requests to a server. However, I can't create more than one client simultaneously!
CLIENT.PY
import asyncio
PYTHONASYNCIODEBUG = 1

# ECHO CLIENT PROTOCOL
async def tcp_echo_client(message, loop):
    # Send request to server
    reader, writer = await asyncio.open_connection('127.0.0.1', 8888, loop=loop)
    print('Send: %r' % message)
    writer.write(message.encode())

    # Receive the information
    if message == '1':
        await asyncio.Task(read_position(reader))
    else:
        await asyncio.ensure_future(read_server(reader))

    # Close the connection
    print('Close the socket')
    writer.close()

# ASYNCIO COROUTINES TO REQUEST INFORMATION
async def read_server(reader):
    server_message = await reader.read()
    print(type(server_message))
    print('Received: %r' % server_message.decode())

async def read_position(reader):
    while True:
        print("I'm Here")
        server_message = await reader.read(50)
        position = server_message.split()
        print(position)
        print(type(position))
        print('Received: %r' % server_message.decode())

# FUNCTION THAT CREATES THE CLIENT
def main(message):
    '''This function creates the client'''
    loop = asyncio.get_event_loop()
    try:
        loop.run_until_complete(tcp_echo_client(message, loop))
    finally:
        pass

# This is how I create a new client
if __name__ == '__main__':
    message = '2'
    main(message)
    message = '3'
    main(message)
I want to create multiple clients; however, the code blocks in the first main when I send the message '1'. I don't know why the code is blocking if I'm using asyncio. My server accepts multiple connections, because if I run this code separately I can do everything. The purpose of this is to create a client every time I click a button in my Kivy app to send a request to the server.
This problem exists because I want to control a robot and do a lot of things simultaneously, but with blocking code I can't, because I get stuck.
Maybe it's a stupid question, but I only started coding 2 months ago and I haven't had any help.
Your main function doesn't "create the client", as its docstring claims. It creates the client and runs it to completion. This is why multiple invocations of main() result in serial execution. main() being a regular function, that's exactly what you'd expect, asyncio doesn't change that. It's useful to remember that asyncio is single-threaded, so it can't do some "run in the background" magic, unless you cooperate.
To cooperate, you need to tell asyncio to start both clients, and then await them in parallel:
async def main(messages):
    loop = asyncio.get_event_loop()
    # launch the coroutines in parallel
    tasks = [loop.create_task(tcp_echo_client(msg, loop)) for msg in messages]
    # don't exit until all of them are done
    await asyncio.gather(*tasks)

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main(['2', '3']))
Note that when awaiting your coroutines, you don't need to wrap them in asyncio.ensure_future() or asyncio.Task() - asyncio will handle that automatically. await read_position(reader) and await read_server(reader) would work just fine and have the same meaning as the longer versions.