Could somebody explain to me why my tasks are not executed if I start the loop without adding any tasks first? (Python 3.7)
import asyncio
import threading

def run_forever(loop):
    loop.run_forever()

async def f(x):
    print("%s executed" % x)

# init is called first
def init():
    print("init started")
    loop = asyncio.new_event_loop()
    # loop.create_task(f("a1"))  # <--- first commented task
    thread = threading.Thread(target=run_forever, args=(loop,))
    thread.start()
    loop.create_task(f("a2"))  # <--- this is not being executed
    print("init finished")
If I leave # loop.create_task(f("a1")) commented out, the output is:
init started
init finished
With it uncommented, the output is:
init started
init finished
a1 executed
a2 executed
Why is that? I want to create a loop now and add tasks to it later.
Unless explicitly stated otherwise, the asyncio API is not thread-safe. This means that calling loop.create_task() from a thread other than the one that runs the event loop will not properly synchronize with the loop.
To submit the task to the event loop from a foreign thread, you need to invoke asyncio.run_coroutine_threadsafe instead:
asyncio.run_coroutine_threadsafe(f("a2"), loop)
This will wake up the loop to alert it that a new task has arrived, and it also returns a concurrent.futures.Future which you can use to obtain the result of the coroutine.
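Applied to the question's code, init() could look like this (a minimal sketch; the daemon flag and the blocking future.result() call are illustrative additions, not requirements):

import asyncio
import threading

def run_forever(loop):
    loop.run_forever()

async def f(x):
    print("%s executed" % x)

def init():
    print("init started")
    loop = asyncio.new_event_loop()
    thread = threading.Thread(target=run_forever, args=(loop,), daemon=True)
    thread.start()
    # Thread-safe: wakes the loop and schedules the coroutine on it.
    future = asyncio.run_coroutine_threadsafe(f("a2"), loop)
    future.result()  # optionally block until the coroutine finishes
    print("init finished")

init()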
Related
I'm trying to understand the pattern for indefinitely running asyncio Tasks
and the difference that a custom loop signal handler makes.
I create workers using loop.create_task() so that they run concurrently.
In my workers' regular code I poll for data and act on it when it arrives.
I'm trying to handle the shutdown process gracefully on a signal.
When a signal is delivered, I again use create_task() with the shutdown function, so that the currently running tasks continue and shutdown gets executed on the next iteration of the event loop.
Now, when a single worker's while loop doesn't actually do any I/O or awaiting, it prevents the signal handler's task from being executed: the coroutine never ends and never gives control back, so no other task can run.
When I don't attach a custom signal handler to the loop and run this program, the signal is delivered and the program stops. I assume it's the main thread that stops the loop itself.
This is obviously different from trying to schedule a (new) shutdown task on a running loop, because that running loop is stuck in a single coroutine which is blocked in a while loop and doesn't give back any control or time for other tasks.
Is there any standard pattern for such cases?
Do I need to asyncio.sleep() if there's no work to do, or do I replace the while loop with something else (e.g. rescheduling the work function itself)?
If range(5) is replaced with range(1, 5), then every worker reaches await asyncio.sleep;
but if even one of them does not (here the worker created with intval == 0), everything gets blocked. How do I handle this case; is there any standard approach?
The code below illustrates the problem (a possible fix is sketched after it).
import asyncio
import signal

async def shutdown(loop, sig=None):
    print("SIGNAL", sig)
    # Cancel every task except this shutdown task itself.
    tasks = [t for t in asyncio.all_tasks()
             if t is not asyncio.current_task()]
    for t in tasks:
        t.cancel()
    results = await asyncio.gather(*tasks, return_exceptions=True)
    # handle_task_results(results)
    loop.stop()

async def worker(intval):
    print("start", intval)
    while True:
        if intval:
            print("#", intval)
            await asyncio.sleep(intval)

loop = asyncio.get_event_loop()
for sig in {signal.SIGINT, signal.SIGTERM}:
    loop.add_signal_handler(
        sig,
        lambda s=sig: asyncio.create_task(shutdown(loop, sig=s)))
workers = [loop.create_task(worker(i)) for i in range(5)]  # this range
loop.run_forever()
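A coroutine only yields control at an await, so the worker created with intval == 0 never lets the loop run anything else, including the scheduled shutdown task. One common pattern (a sketch, not part of the original post) is to yield explicitly with await asyncio.sleep(0) when there is no work to do:

async def worker(intval):
    print("start", intval)
    while True:
        if intval:
            print("#", intval)
            await asyncio.sleep(intval)
        else:
            # No work right now: yield to the event loop anyway so
            # other tasks (and the shutdown task) get a chance to run.
            await asyncio.sleep(0)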
Apologies for my poor phrasing but here goes.
I need to execute a function every thirty minutes whilst other tasks are running, however I have no idea how to do this or how to phrase it for Google. My goal is to modify my script so that it operates (without a UI) like a task manager program with background services, programs, utilities, etc.
I have tried to create this by timing each function and creating functions that execute other functions, however no matter what I try everything runs one after the other, like any ordinary script would.
An example of this would include the following.
def function_1():
    """Perform operations"""
    pass

def function_2():
    """Perform operations"""
    pass

def executeAllFunctions():
    function_1()
    function_2()
How can I initialize function_1 as a background task whilst function_2 is executed in a normal manner?
There is an excellent answer here.
The main idea is to run an async coroutine in a forever loop inside a thread.
In your case, you have to define function_1 as a coroutine, use a wrapper function to run it in the thread, and create the thread.
Here is a sample, heavily inspired by the answer in the link but adapted to your question.
import asyncio
import threading

@asyncio.coroutine
def function_1():
    while True:
        # do_stuff
        yield from asyncio.sleep(1800)  # wait thirty minutes between runs

def wrapper(loop):
    asyncio.set_event_loop(loop)
    loop.create_task(function_1())
    loop.run_forever()

def function_2():
    # do_stuff
    pass

def launch():
    loop = asyncio.new_event_loop()
    t = threading.Thread(target=wrapper, args=(loop,))  # create the thread
    t.start()  # launch the thread
    function_2()
    loop.call_soon_threadsafe(loop.stop)  # stop the background loop when function_2 returns
    t.join()
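Note that @asyncio.coroutine / yield from is the Python 3.4-era syntax; on Python 3.5+ the same coroutine is normally written with async def / await (a minimal modernized sketch):

import asyncio

async def function_1():
    while True:
        # do_stuff
        await asyncio.sleep(1800)  # wait thirty minutes between runs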
I have a function that computes something and then puts it on a PDF. However, this process takes time, so we implemented a stop button for the user to stop the process.
I tried using threading.Event(), but the function I am working with is a non-looping function, which means I can't regularly check whether the event is set. (Hard-coding the application by simply sprinkling multiple checks throughout would work, but nope -- that idea was not accepted.)
def generate_data(self, event):
    self.thread = threading.Thread(target=self.check_event)
    self.thread.start()
    ...

def check_event(self):
    while True:
        if self.event.is_set():
            self.controller.enable_run_btn_and_tab()
            return
        time.sleep(1)
My idea is to create another thread that regularly checks whether the event is set, so that I can return and exit the function. However, the above check_event(self) only exits the child thread, not the thread running generate_data(self).
Is there some code or modification to my code that can stop the main thread from a child thread?
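For context, CPython offers no safe way to kill a thread from the outside, which is why the watcher thread cannot stop generate_data. A common workaround, assuming the computation can be moved into a separate process (the names below are hypothetical), is multiprocessing, because a process can be terminated:

import multiprocessing

def generate_pdf_worker():
    # hypothetical stand-in for the long, non-looping computation
    pass

proc = multiprocessing.Process(target=generate_pdf_worker)
proc.start()

# Later, in the stop-button handler:
proc.terminate()  # forcibly ends the worker process
proc.join()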
Is it possible to use the Python 3 asyncio package with the Boost.Python library?
I have a CPython C++ extension built with Boost.Python, and the functions written in C++ can run for a really long time. I want to use asyncio to call these functions, but res = await cpp_function() doesn't work.
What happens when cpp_function is called inside a coroutine?
How do I avoid getting blocked by a C++ function that runs for a very long time?
NOTE: the C++ code doesn't do any I/O operations, just calculations.
What happens when cpp_function is called inside a coroutine?
If you call a long-running Python/C function inside any of your coroutines, it freezes your event loop (freezes all coroutines everywhere).
You should avoid this situation.
How do I avoid getting blocked by a C++ function that runs for a very long time?
You should use run_in_executor to run your function in a separate thread or process. run_in_executor returns an awaitable that you can await.
You'll probably need ProcessPoolExecutor because of the GIL (I'm not sure whether ThreadPoolExecutor is an option in your situation, but I advise you to check it: a C++ extension can release the GIL around long computations, in which case threads work too).
Here's an example of awaiting long-running code:
import asyncio
from concurrent.futures import ProcessPoolExecutor
import time

def blocking_function():
    # Function with long-running C/Python code.
    time.sleep(3)
    return True

async def main():
    # Awaiting execution in another process
    # doesn't block your event loop:
    loop = asyncio.get_event_loop()
    res = await loop.run_in_executor(executor, blocking_function)

if __name__ == '__main__':
    executor = ProcessPoolExecutor(max_workers=1)  # Prepare your executor somewhere.
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    try:
        loop.run_until_complete(main())
    finally:
        loop.run_until_complete(loop.shutdown_asyncgens())
        loop.close()
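As an aside, on Python 3.7+ the manual loop setup and teardown above can be replaced by asyncio.run, which creates the loop, runs the coroutine, shuts down async generators, and closes the loop:

if __name__ == '__main__':
    executor = ProcessPoolExecutor(max_workers=1)
    asyncio.run(main())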
Python 3.4: I'm trying to make a server using the websockets module (I was previously using regular sockets but wanted to make a JavaScript client) when I ran into an issue, because it expects async code (at least if the examples are to be trusted), which I haven't used before. Threading simply does not work: if I run the following code, bar will never be printed, whereas if I comment out the line with yield from, it works as expected. So yield is probably doing something I don't quite understand, but why is it never even executed? Should I install Python 3.5?
import threading

class SampleThread(threading.Thread):
    def __init__(self):
        super(SampleThread, self).__init__()
        print("foo")

    def run(self):
        print("bar")
        yield from var2

thread = SampleThread()
thread.start()
This is not the correct way to handle multithreading: run must not be a generator or a coroutine. Because its body contains yield from, calling run() merely creates a generator object without executing any of its code, which is why bar is never printed. It should also be noted that the asyncio event loop is only set up for the main thread by default. Any call to asyncio.get_event_loop() in a new thread (without first setting one with asyncio.set_event_loop()) will throw an exception.
Before looking at running the event loop in a new thread, you should first analyze whether you really need the event loop running in its own thread. The loop has a built-in executor hook, loop.run_in_executor(): it takes a pool from concurrent.futures (either a ThreadPoolExecutor or a ProcessPoolExecutor) and provides a non-blocking way of running blocking functions in threads or processes directly from the loop object. As such, the results can be await-ed (with Python 3.5 syntax).
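For illustration, a minimal sketch (some_blocking_call is a placeholder for the real blocking work):

import asyncio

def some_blocking_call():
    # placeholder for the real blocking work
    return 42

async def main():
    loop = asyncio.get_event_loop()
    # Passing None as the executor uses the loop's default ThreadPoolExecutor.
    result = await loop.run_in_executor(None, some_blocking_call)
    return result

loop = asyncio.get_event_loop()
print(loop.run_until_complete(main()))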
That being said, if you want to run your event loop in another thread, you can do it like this:
import asyncio
import threading

class LoopThread(threading.Thread):
    def __init__(self):
        super().__init__()
        self.loop = asyncio.new_event_loop()

    def run(self):
        asyncio.set_event_loop(self.loop)
        self.loop.run_forever()

    def stop(self):
        self.loop.call_soon_threadsafe(self.loop.stop)
From here, you still need to devise a thread-safe way of creating tasks, etc. Some of the code in this thread is usable, although I did not have a lot of success with it: python asyncio, how to create and cancel tasks from another thread
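For scheduling coroutines on that loop from outside, asyncio.run_coroutine_threadsafe (mentioned at the top of this thread) is the thread-safe primitive. A sketch using the LoopThread above:

import asyncio

lt = LoopThread()
lt.start()

async def job():
    return 42

# Thread-safe: wakes the loop and returns a concurrent.futures.Future.
future = asyncio.run_coroutine_threadsafe(job(), lt.loop)
print(future.result())
lt.stop()
lt.join()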