How to write a simple Python multi-threaded task - multithreading

I have the below simple functions:
import time

def foo():
    print(f'i am working come back in 5mins')
    time.sleep(300)

def boo():
    print(f' boo!')

def what_ever_function():
    print(f'do whatever function user input at run time.')
What I wish to do is execute foo() and then immediately execute boo() or what_ever_function() without having to wait for 300 seconds for foo() to finish.
Imagine a workflow in IPython:
>>> foo()
i am working come back in 5mins
>>> boo()
boo!
The idea is that after executing foo(), I can use the prompt to run another function immediately, whatever that function may be, without having to wait 300 seconds for foo() to finish.
I already tried googling:
https://docs.python.org/3/library/asyncio.html
and
https://docs.python.org/3/library/threading.html#
But I still couldn't achieve the above task.
Any pointers or help please?
Thanks

If you use asyncio, you should use asyncio.sleep instead of time.sleep, because time.sleep would block the asyncio event loop. Here is a working example:
import asyncio

async def foo():
    print("Waiting...")
    await asyncio.sleep(5)
    print("Done waiting!")

async def bar():
    print("Hello, world!")

async def main():
    t1 = asyncio.create_task(foo())
    await asyncio.sleep(1)
    t2 = asyncio.create_task(bar())
    await asyncio.gather(t1, t2)

if __name__ == "__main__":
    asyncio.run(main())
In this example, foo and bar run concurrently: bar executes while foo is still sleeping.
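Since the question describes an interactive IPython workflow rather than a script, a plain background thread is another way to get the same effect without rewriting the functions as coroutines. A minimal sketch, reusing the foo and boo names from the question; daemon=True is my choice so a forgotten thread does not keep the interpreter alive, and you can drop it if you would rather join the thread later:

import threading
import time

def foo():
    print('i am working come back in 5mins')
    time.sleep(300)

def boo():
    print('boo!')

# start foo() in a background thread; the prompt returns immediately
worker = threading.Thread(target=foo, daemon=True)
worker.start()

# boo(), or any other function, can run right away in the main thread
boo()

# worker.join()  # uncomment if you want to wait for foo() to finish

At an IPython prompt you would simply run the threading.Thread(...).start() line after defining foo, and the next prompt is available immediately.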

Related

Python threading not running 2 processes at once

I have 2 functions that both have a while True loop in them, and when I try running both of them at once, only the first one runs.
I have tried doing this with threading.Thread(target=hello()).start() as well as with multiprocessing.Process(target=hello()).start(), and neither worked.
import threading

def hello():
    while True:
        print("hello")

def world():
    while True:
        print("world")

threading.Thread(target=hello()).start()
threading.Thread(target=world()).start()
This happens because target=hello() calls hello() right there in the main thread instead of handing the function to the thread: the infinite loop runs before the first Thread object is even constructed, so the second thread is never created, let alone started. Pass the callable itself, i.e. threading.Thread(target=hello).start(). (CPython's Global Interpreter Lock prevents two threads from executing Python bytecode at the same instant, but it does not stop the second thread from being scheduled, so it is not the culprit here.)
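For completeness, here is a minimal sketch of the threading version with the function objects passed instead of called. The time.sleep(1) inside each loop and the daemon flag are my additions so the example terminates and the output stays readable; they are not part of the original code:

import threading
import time

def hello():
    while True:
        print("hello")
        time.sleep(1)

def world():
    while True:
        print("world")
        time.sleep(1)

# pass the callables themselves; calling them here would block the main thread
threading.Thread(target=hello, daemon=True).start()
threading.Thread(target=world, daemon=True).start()

time.sleep(5)  # let both threads print for a few seconds, then exit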
Alternatively, I'd suggest using asyncio, which handles this kind of cooperative looping nicely:
# python3.7+
import asyncio

async def hello():
    while True:
        print("Hello")
        await asyncio.sleep(1)

async def world():
    while True:
        print("World")
        await asyncio.sleep(1)

async def main():
    await asyncio.gather(hello(), world())

asyncio.run(main())
Note the asyncio.sleep(1) call in each coroutine. It passes control back to the event loop and lets the other coroutine be executed.
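To make that point concrete, here is a small illustration of my own (not from the original answer): if a coroutine loops without ever awaiting, it never yields control, so the other coroutine is starved just like the second thread above:

import asyncio

async def greedy():
    while True:
        pass               # never awaits, so control never returns to the event loop

async def polite():
    print("you will never see this")

async def main():
    await asyncio.gather(greedy(), polite())

# asyncio.run(main())     # left commented out on purpose: this would hang forever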

How to call asynchronous functions without expecting returns from them?

In the code below I'd like to call task1 and task2, but WITHOUT expecting return values from them. Is that possible?
import asyncio

async def say(something, delay):
    await asyncio.sleep(delay)
    print(something)

loop = asyncio.get_event_loop()
task1 = loop.create_task(say('hi', 1))
task2 = loop.create_task(say('hoi', 2))
loop.run_until_complete(asyncio.gather(task1, task2))
I would like to process items from a queue that arrive in main's while loop, without waiting, because I do not need the functions' return values. For example, as pseudocode:
import asyncio
import time

async def say(something, delay):
    await asyncio.sleep(delay)
    print(something)

def main():
    while True:
        # search for database news
        # call say asynchronously; I do not need any return value, I just want it to run independently
        time.sleep(1)
If I understood you correctly, you already get what you want when you create a task. The created task will be executed "in the background": you don't have to await it.
import asyncio

async def say(something, delay):
    await asyncio.sleep(delay)
    print(something)

async def main():
    # run tasks without awaiting their results
    for i in range(5):
        asyncio.create_task(say(i, i))

    # do something while the tasks run "in the background"
    while True:
        print('Do something different')
        await asyncio.sleep(1)

asyncio.run(main())
Result:
Do something different
0
Do something different
1
2
Do something different
3
Do something different
4
Do something different
Do something different
Do something different
Do something different
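One caveat worth adding as an editorial note (it is not in the original answer): the event loop only keeps weak references to tasks created with asyncio.create_task, so it is safer to keep your own references and await them before the loop shuts down; otherwise tasks can be garbage-collected or cut off when main returns. A sketch along the same lines as the answer above:

import asyncio

async def say(something, delay):
    await asyncio.sleep(delay)
    print(something)

async def main():
    # keep references so the tasks cannot be garbage-collected mid-flight
    tasks = [asyncio.create_task(say(i, i)) for i in range(5)]

    # do something else while they run "in the background"
    for _ in range(3):
        print('Do something different')
        await asyncio.sleep(1)

    # make sure everything has finished before the loop shuts down
    await asyncio.gather(*tasks)

asyncio.run(main())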

ipyparallel parallel function calls example in Jupyter Lab

I'm finding it difficult to figure out how to use ipyparallel from jupyter lab to execute two functions in parallel. Could someone please give me an example of how this should be done? For example, running these two functions at the same time:
import time

def foo():
    print('foo')
    time.sleep(5)

def bar():
    print('bar')
    time.sleep(10)
First you will need to ensure that ipyparallel is installed and an ipcluster is running - instructions here.
Once you have done that, here is some adapted code that will run your two functions in parallel:
from ipyparallel import Client

rc = Client()

def foo():
    import time
    time.sleep(5)
    return 'foo'

def bar():
    import time
    time.sleep(10)
    return 'bar'

res1 = rc[0].apply(foo)
res2 = rc[1].apply(bar)
results = [res1, res2]

while not all(map(lambda ar: ar.ready(), results)):
    pass

print(res1.get(), res2.get())
N.B. I removed the print statements, since you can't call back from the child process into the parent Jupyter session in order to print, but we can of course return a result. I block here until both results are completed, but you could instead print each result as it becomes available.
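As a side note of mine (not part of the original answer), the busy-wait loop is not strictly necessary: AsyncResult.get() already blocks until the result is available, so you can simply call it on each result in turn. A minimal sketch, assuming the same rc client and the foo and bar functions defined above:

# assuming rc, foo and bar are defined as in the answer above
res1 = rc[0].apply(foo)
res2 = rc[1].apply(bar)

# get() blocks until each result is ready, so no polling loop is needed
print(res1.get(), res2.get())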

Calling coroutine and getting future in asyncio.Protocol.data_received()?

I need to get a future's result inside the asyncio loop; it is similar to Calling a coroutine from asyncio.Protocol.data_received.
But asyncio in PY35 and PY34 are completely different. Here is code which runs correctly in PY34, but in PY35 it pauses at the yield from and never returns.
# PY34
import asyncio

class RelayClient(asyncio.Protocol):
    pass

class Server(asyncio.Protocol):
    def data_received(self, data):
        # I need to connect to another address and get the future's result inside this function.
        # I also cannot call loop.run_until_complete() here.
        loop = asyncio.get_event_loop()
        result = yield from loop.create_connection(RelayClient, 'www.google.com', 443)
        do_some_thing_with_result(result)
So, how do I do this in Python 3.5?
Any advice is appreciated.
You cannot await a coroutine from a function that is not a coroutine. data_received is not a coroutine, so as was mentioned in the comments, you need to use the ensure_future helper to create a "background" task from your coroutine.
No need to start using callbacks however:
async def do_stuff(data):
    loop = asyncio.get_event_loop()
    result = await loop.create_connection(RelayClient, 'www.google.com', 443)
    await do_some_thing_with_result(result)

class Server(asyncio.Protocol):
    def data_received(self, data):
        asyncio.ensure_future(do_stuff(data))
I would point out, however, that asyncio gives no guarantee whatsoever that data_received will be called with the complete data you are expecting. Usually the pattern you see in a Protocol looks a lot like this:
async def process_line(line):
    ...

class Server(asyncio.Protocol):
    def __init__(self):
        self.buffer = b''

    def data_received(self, data):
        self.buffer += data
        if b'\n' not in self.buffer:
            return
        # split off the first complete line; keep the rest in the buffer
        line, self.buffer = self.buffer.split(b'\n', 1)
        fut = asyncio.ensure_future(process_line(line))
        fut.add_done_callback(self._handle_exception)

    def _handle_exception(self, fut):
        if fut.exception() is not None:
            print('Processing failed', fut.exception())
(this is just an example; it copies the buffer far too much and would be very inefficient in most production use cases)
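As an alternative worth a brief mention (my addition, not part of the original answer): the higher-level streams API does this line buffering for you, since StreamReader.readline() only returns once a full line has arrived. A sketch under that assumption, reusing the process_line coroutine from above and a hypothetical port 8888:

import asyncio

async def handle_client(reader, writer):
    while True:
        line = await reader.readline()   # waits until a full b'\n'-terminated line arrives
        if not line:                     # empty bytes means the peer closed the connection
            break
        await process_line(line.rstrip(b'\n'))
    writer.close()
    await writer.wait_closed()

async def main():
    server = await asyncio.start_server(handle_client, '127.0.0.1', 8888)
    async with server:
        await server.serve_forever()

# asyncio.run(main())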

Test calling a python coroutine (async def) from a regular function

Let's say I have some asyncio coroutine which fetches some data and returns it. Like this:
async def fetch_data(*args):
    result = await some_io()
    return result
Basically this coroutine is called from a chain of coroutines, and the initial coroutine is run by creating a task.
But what if, for test purposes, I want to run only this one coroutine when running the file directly:
if __name__ == '__main__':
    result = await fetch_data(*args)
    print(result)
And obviously I can't do this, since I'm trying to run and await a coroutine from a function that is not a coroutine.
So the question is: is there a correct way to get data from a coroutine without calling it from another coroutine?
I can make a Future object for the result and await it, but maybe there are simpler and clearer ways?
You will need to create an event loop to run your coroutine:
import asyncio

async def async_func():
    return "hello"

loop = asyncio.get_event_loop()
result = loop.run_until_complete(async_func())
loop.close()
print(result)
Or as a function:
def run_coroutine(f, *args, **kwargs):
    loop = asyncio.get_event_loop()
    result = loop.run_until_complete(f(*args, **kwargs))
    loop.close()
    return result
Use it like this:
print(run_coroutine(async_func))
Or:
assert "expected" == run_coroutine(fetch_data, "param1", param2="foo")
For Python 3.7+, you may use asyncio.run():
import asyncio

async def fetch_data():
    return "sample_data"

result = asyncio.run(fetch_data())
print(result)  # sample_data
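If you prefer the helper-function style shown earlier, the same wrapper can be written on top of asyncio.run; this variation is mine, not part of the original answer:

import asyncio

def run_coroutine(f, *args, **kwargs):
    # asyncio.run creates a fresh event loop, runs the coroutine, and closes the loop
    return asyncio.run(f(*args, **kwargs))

async def fetch_data(param1, param2=None):
    await asyncio.sleep(0.1)   # stand-in for real I/O
    return f"{param1}-{param2}"

print(run_coroutine(fetch_data, "param1", param2="foo"))   # prints: param1-foo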
