I have approximately the following code:
import asyncio
...

async def query_loop():
    connected = True
    while connected:
        result = await asyncio.gather(get_value1, get_value2, get_value3)
        if True in result:
            connected = False

async def main():
    await query_loop()

asyncio.run(main())
The get_value functions query a device, receive values, and publish them to a server. If no problems occur they return False, otherwise True.
Now I need get_value2 to check whether it received the value 7. In that case the program should wait 3 minutes before sending a special command to the device. But in the meantime, and also afterwards, the query_loop should continue.
Does anybody have an idea how to do that?
Thanks in advance!
If I understand you correctly, you want to modify get_value2 so that it reacts to a value received from the device by spawning additional work in the background, i.e. it does something without the loop in query_loop having to wait for that new work to finish.
You can use asyncio.create_task() to spawn a background task. In fact, you can always combine create_task() and await to run things in the background; asyncio.gather is just a utility function that does it for you. In this case query_loop remains unchanged, and get_value2 gets modified like this:
async def get_value2():
    ...
    value = await receive_value_from_device()
    if value == 7:
        # schedule special_command() to run, but don't wait for it
        asyncio.create_task(special_command())
    ...
    return False

async def special_command():
    # wait 3 minutes, then send the command
    await asyncio.sleep(180)
    await send_command_to_device(...)
Note that if get_value1 and the others are async functions, the correct invocation of gather must call them, so it should be await asyncio.gather(get_value1(), get_value2(), get_value3()) (note the extra parentheses).
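Putting it together, a sketch of the corrected loop (assuming the same connected flag and get_value functions as in the question):

async def query_loop():
    connected = True
    while connected:
        # note the parentheses: gather needs coroutine objects, not functions
        result = await asyncio.gather(get_value1(), get_value2(), get_value3())
        if True in result:
            connected = False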
My code has 2 functions:

async def blabla():
    await asyncio.sleep(5)

and

async def blublu():
    await asyncio.sleep(2)
asyncio.wait_for, as far as I know, can wait for one function like this:

asyncio.wait_for(blabla(), timeout=6) or asyncio.wait_for(blublu(), timeout=6)

What I want to do is make asyncio wait for both of them, and if one of them finishes faster, proceed without waiting for the second one.
Is it possible to do that?
Edit: a timeout is needed.
Use asyncio.wait with the return_when kwarg:
# directly passing coroutine objects to `asyncio.wait`
# is deprecated since Python 3.8, so wrap them in tasks
blabla_task = asyncio.create_task(blabla())
blublu_task = asyncio.create_task(blublu())

done, pending = await asyncio.wait(
    {blabla_task, blublu_task},
    return_when=asyncio.FIRST_COMPLETED,
)
# do something with the `done` set
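Since the edit says a timeout is needed: asyncio.wait also accepts a timeout argument, and tasks still pending when it expires are simply returned in the pending set. A minimal runnable sketch, assuming the two coroutines from the question:

import asyncio

async def blabla():
    await asyncio.sleep(5)

async def blublu():
    await asyncio.sleep(2)

async def main():
    blabla_task = asyncio.create_task(blabla())
    blublu_task = asyncio.create_task(blublu())
    done, pending = await asyncio.wait(
        {blabla_task, blublu_task},
        timeout=6,
        return_when=asyncio.FIRST_COMPLETED,
    )
    # blublu finishes first (2 s), so blabla is still in `pending` here
    for task in pending:
        task.cancel()

asyncio.run(main())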
I'm noticing that when I spawn an asyncio task using create_task, the rest of my logic runs first rather than the task starting. I'm forced to add an await asyncio.sleep(0) to get the task started, which seems a bit hacky and unclean to me.
Here is some example code:
async def make_rpc_calls(...some args...):
    val_1, val_2 = await asyncio.gather(rpc_call_1(...), rpc_call_2(...))
    return process(val_1, val_2)

def some_very_cpu_intensive_function(...some args...):
    # does a lot of computation, can take 20 seconds to run
    ...

task_1 = asyncio.get_running_loop().create_task(make_rpc_calls(...))
intensive_result = some_very_cpu_intensive_function(...)
await task_1
process(intensive_result, task_1.result())
Anytime I run the above, it runs some_very_cpu_intensive_function before kicking off the expensive RPCs. The only way I've gotten this to work is to do:
async def make_rpc_calls(...some args...):
    val_1, val_2 = await asyncio.gather(rpc_call_1(...), rpc_call_2(...))
    return process(val_1, val_2)

def some_very_cpu_intensive_function(...some args...):
    # does a lot of computation, can take 20 seconds to run
    ...

task_1 = asyncio.get_running_loop().create_task(make_rpc_calls(...))
await asyncio.sleep(0)
intensive_result = some_very_cpu_intensive_function(...)
await task_1
process(intensive_result, task_1.result())
This feels like a hack to me: I'm forcing the event loop to context switch, and it doesn't feel like I'm using the asyncio framework correctly. Is there another way I should be approaching this?
sleep() always suspends the current task, allowing other tasks to run.
Setting the delay to 0 provides an optimized path to allow other tasks to run. This can be used by long-running functions to avoid blocking the event loop for the full duration of the function call.
Source: https://docs.python.org/3/library/asyncio-task.html
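Keep in mind that await asyncio.sleep(0) only yields once: as soon as some_very_cpu_intensive_function starts running, it blocks the event loop again for the full 20 seconds, and the RPC tasks make no further progress. A sketch of an alternative, assuming the function is safe to run in a worker thread: offload the CPU-bound call with asyncio.to_thread (Python 3.9+, or loop.run_in_executor on older versions), so the loop stays free to drive the RPCs:

task_1 = asyncio.create_task(make_rpc_calls(...))
# the blocking computation runs in a worker thread; the event loop keeps
# running, so task_1 starts and makes progress concurrently
intensive_result = await asyncio.to_thread(some_very_cpu_intensive_function, ...)
await task_1
process(intensive_result, task_1.result())

If the function is pure Python and holds the GIL for most of its runtime, a ProcessPoolExecutor passed to loop.run_in_executor may be a better fit than a thread.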
I have some code that runs multiple tasks in a loop like this:
done, running = await asyncio.wait(running, timeout=timeout_seconds,
                                   return_when=asyncio.FIRST_COMPLETED)
I need to be able to determine which of these timed out. According to the documentation:
Note that this function does not raise asyncio.TimeoutError. Futures or Tasks that aren’t done when the timeout occurs are simply returned in the second set.
I could use wait_for() instead, but that function only accepts a single awaitable, whereas I need to specify multiple. Is there any way to determine which one from the set of awaitables I passed to wait() was responsible for the timeout?
Alternatively, is there a way to use wait_for() with multiple awaitables?
You can try this trick; it is probably not the cleanest solution:
import asyncio

async def foo():
    return 42

async def need_some_sleep():
    await asyncio.sleep(1000)
    return 42

async def coro_wrapper(coro):
    # wait_for raises asyncio.TimeoutError inside the wrapper,
    # so each coroutine gets its own timeout
    result = await asyncio.wait_for(coro(), timeout=10)
    return result

async def main():
    done, running = await asyncio.wait(
        [asyncio.create_task(coro_wrapper(foo)),
         asyncio.create_task(coro_wrapper(need_some_sleep))],
        return_when=asyncio.FIRST_COMPLETED,
    )
    for item in done:
        print(item.result())
    print(done, running)

asyncio.run(main())
Here is how I do it:
done, pending = await asyncio.wait(
    {
        asyncio.create_task(coro, name=str(index))
        for index, coro in enumerate([
            my_coroutine(),
            my_coroutine(),
            my_coroutine(),
        ])
    },
    return_when=asyncio.FIRST_COMPLETED,
)
# Task.get_name() returns a string, so compare against a string
num = next(t.get_name() for t in done)
if num == "2":
    pass
Use enumerate to name the tasks as they are created.
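The same naming trick identifies the tasks that timed out: when asyncio.wait is given a timeout, whatever is left in the pending set after it returns did not finish in time, and the names tell you which ones those are. A small sketch under that assumption:

# after: done, pending = await asyncio.wait(named_tasks, timeout=timeout_seconds)
timed_out_names = {t.get_name() for t in pending}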
Hello there. I am trying to use asyncio to take an input if one is given; if it is not given, cancel the task and use the default input.
import asyncio
from contextlib import suppress

state = 'routine'

async def start():
    while True:
        state = input('Enter state')
        print(state)

async def main():
    task = asyncio.Task(start())
    await asyncio.sleep(5)
    task.cancel()
    print(state)
    with suppress(asyncio.CancelledError):
        await task
There is an infinite loop; the user is given 5 seconds to provide an input, and if the user has not given any input, the default one (state) is used. It's stuck on taking input.
input is a blocking function, so it must not be called from a coroutine. A hint that your start coroutine is not correctly written is that it doesn't await anything. Take a look at aioconsole for an async equivalent of input and others.
Also, create tasks using asyncio.create_task (or loop.create_task prior to Python 3.7), not by calling the Task constructor directly.
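A minimal sketch of how this could look with aioconsole (a third-party package, pip install aioconsole); its ainput is an async counterpart of input, and wait_for supplies the 5-second deadline:

import asyncio
from aioconsole import ainput  # third-party: pip install aioconsole

async def main():
    state = 'routine'  # default
    try:
        # give the user 5 seconds to type a new state
        state = await asyncio.wait_for(ainput('Enter state: '), timeout=5)
    except asyncio.TimeoutError:
        pass  # no input in time, keep the default
    print(state)

asyncio.run(main())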
I'm trying to fetch some data from OpenSubtitles using asyncio and then download a file who's information is contained in that data. I want to fetch that data and download the file at the same time using asyncio.
The problem is that I want to wait for 1 task from the list tasks to finish before commencing with the rest of the tasks in the list or the download_tasks. The reason for this is that in self._perform_query() I am writing information to a file and in self._download_and_save_file() I am reading that same information from that file. So in other words, the download_tasks need to wait for at least one task in tasks to finish before starting.
I found out I can do that with asyncio.wait(return_when=FIRST_COMPLETED) but for some reason it is not working properly:
payloads = [create_payloads(entry) for entry in retreive(table_in_database)]
tasks = [asyncio.create_task(self._perform_query(payload, proxy)) for payload in payloads]
download_tasks = [asyncio.create_task(self._download_and_save_file(url, proxy)) for url in url_list]

done, pending = await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)
print(done)
print(len(done))
print(pending)
print(len(pending))
await asyncio.wait(download_tasks)
The output is completely different than expected. It seems that all 3 tasks in the list tasks are being completed, despite me passing asyncio.FIRST_COMPLETED. Why is this happening?
{<Task finished coro=<SubtitleDownloader._perform_query() done, defined at C:\Users\...\subtitles.py:71> result=None>, <Task finished coro=<SubtitleDownloader._perform_query() done, defined at C:\Users\...\subtitles.py:71> result=None>, <Task finished coro=<SubtitleDownloader._perform_query() done, defined at C:\Users\...\subtitles.py:71> result=None>}
3
set()
0
Exiting
As far as I can tell, the code in self._perform_query() shouldn't affect this problem. Here it is anyway just to make sure:
async def _perform_query(self, payload, proxy):
    try:
        query_result = proxy.SearchSubtitles(self.opensubs_token, [payload], {"limit": 25})
    except Fault as e:
        raise RuntimeError("A fault has occurred:\n{}".format(e))
    except ProtocolError as e:
        raise RuntimeError("A ProtocolError has occurred:\n{}".format(e))
    else:
        if query_result["status"] == "200 OK":
            with open("dl_links.json", "w") as dl_links_json:
                result = query_result["data"][0]
                subtitle_name = result["SubFileName"]
                download_link = result["SubDownloadLink"]
                download_data = {"download link": download_link,
                                 "file name": subtitle_name}
                json.dump(download_data, dl_links_json)
        else:
            print("Wrong status code: {}".format(query_result["status"]))
For now, I've been testing this without running download_tasks but I have included it here for context. Maybe I am going about this problem in a completely wrong manner. If so, I would much appreciate your input!
Edit:
The problem was very simple, as answered below: _perform_query never suspended, so it ran synchronously. I changed that by making the file-writing part of _perform_query asynchronous with aiofiles:
async def _perform_query(self, payload, proxy):
    query_result = proxy.SearchSubtitles(self.opensubs_token, [payload], {"limit": 25})
    if query_result["status"] == "200 OK":
        async with aiofiles.open("dl_links.json", mode="w") as dl_links_json:
            result = query_result["data"][0]
            download_link = result["SubDownloadLink"]
            await dl_links_json.write(download_link)
return_when=FIRST_COMPLETED doesn't guarantee that only a single task will complete. It guarantees that the wait will complete as soon as a task completes, but it is perfectly possible that other tasks complete "at the same time", which for asyncio means in the same iteration of the event loop. Consider, for example, the following code:
import asyncio

async def noop():
    pass

async def main():
    done, pending = await asyncio.wait(
        [asyncio.create_task(noop()) for _ in range(3)],
        return_when=asyncio.FIRST_COMPLETED)
    print(len(done), len(pending))

asyncio.run(main())
This prints 3 0, just like your code. Why?
create_task submits the coroutines to the event loop, and asyncio.wait sets up callbacks to notify it when any of them completes. However, the noop coroutine doesn't contain an await, so none of the calls to noop() suspends: each just does its thing and immediately returns. As a result, all three coroutine instances finish within the same pass of the event loop. wait is then informed that all three coroutines have finished, a fact it dutifully reports.
If you change noop to await a random sleep, e.g. change pass to await asyncio.sleep(0.1 * random.random()), you get the expected behavior. With an await the coroutines no longer complete at the same time, and wait will report the first one as soon as it detects it.
This reveals the true underlying issue with your code: _perform_query doesn't await. This indicates that you are not using an async underlying library, or that you are using it incorrectly. The call to SearchSubtitles likely simply blocks the event loop, which appears to work in trivial tests, but breaks essential asyncio features such as concurrent execution of tasks.
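If the library behind SearchSubtitles has no async variant, one common workaround (a sketch, not necessarily how the OP's proxy object behaves) is to run the blocking call in a worker thread with asyncio.to_thread (Python 3.9+, or loop.run_in_executor on older versions), so the event loop keeps running:

async def _perform_query(self, payload, proxy):
    # run the blocking XML-RPC call in a worker thread; the event loop stays
    # free, so other tasks can make progress while this one waits
    query_result = await asyncio.to_thread(
        proxy.SearchSubtitles, self.opensubs_token, [payload], {"limit": 25})
    ...

With the call off the loop, FIRST_COMPLETED behaves as expected, because the tasks can actually overlap instead of running to completion one after another.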