Python threading not running 2 processes at once - multithreading

I have 2 functions that both have a while True loop in them, and when I try running both of them at once, only the first one runs.
I have tried doing this with threading.Thread(target=hello()).start() as well as with multiprocessing.Process(target=hello()).start(), and neither worked.
import threading

def hello():
    while True:
        print("hello")

def world():
    while True:
        print("world")

threading.Thread(target=hello()).start()
threading.Thread(target=world()).start()

This isn't actually the GIL at work: the problem is that target=hello() calls hello right there, in the main thread. Since hello loops forever, the call never returns, the Thread object is never even constructed, and the line starting world is never reached. Pass the function object itself instead: threading.Thread(target=hello).start(). With that fix both threads run and the prints interleave (the GIL is released around I/O such as print).
That said, if you don't need real threads, asyncio is a good fit for interleaving two infinite loops:
# python3.7+
import asyncio

async def hello():
    while True:
        print("Hello")
        await asyncio.sleep(1)

async def world():
    while True:
        print("World")
        await asyncio.sleep(1)

async def main():
    await asyncio.gather(hello(), world())

asyncio.run(main())
Note the asyncio.sleep(1) call in each coroutine. It hands control back to the event loop so the other coroutine gets a chance to run.
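For completeness, here is the threading version with the target fix applied; a minimal sketch with the loops bounded (and the output collected in a list) so that it terminates:

```python
import threading

out = []

def hello(n):
    for _ in range(n):
        out.append("hello")

def world(n):
    for _ in range(n):
        out.append("world")

# pass the function object (hello), not the result of calling it (hello())
t1 = threading.Thread(target=hello, args=(3,))
t2 = threading.Thread(target=world, args=(3,))
t1.start()
t2.start()
t1.join()
t2.join()
print(sorted(set(out)))  # both functions ran
```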

Related

How to write a simple Python multi-thread task

I have the below simple functions:
import time

def foo():
    print(f'i am working come back in 5mins')
    time.sleep(300)

def boo():
    print(f' boo!')

def what_ever_function():
    print(f'do whatever function user input at run time.')
What I wish to do is execute foo() and then immediately execute boo() or what_ever_function(), without waiting 300 seconds for foo() to finish.
Imagine a workflow in Ipython:
>>> foo()
i am working come back in 5mins
>>> boo()
boo!
The idea is that after executing foo(), I can use the prompt to run another function immediately, whatever function that may be, without waiting 300 seconds for foo() to finish.
I already tried googling:
https://docs.python.org/3/library/asyncio.html
and
https://docs.python.org/3/library/threading.html#
But still couldn't achieve the above task.
Any pointer or help please?
Thanks
If you use asyncio, use asyncio.sleep instead of time.sleep, because time.sleep would block the asyncio event loop. Here is a working example:
import asyncio

async def foo():
    print("Waiting...")
    await asyncio.sleep(5)
    print("Done waiting!")

async def bar():
    print("Hello, world!")

async def main():
    t1 = asyncio.create_task(foo())
    await asyncio.sleep(1)
    t2 = asyncio.create_task(bar())
    await t1
    await t2  # note: "await t1, t2" would build a tuple and never await t2

if __name__ == "__main__":
    asyncio.run(main())
In this example, foo and bar run concurrently: bar executes while foo is still waiting.
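If asyncio feels heavy for this, a plain background thread achieves the same "run foo, use the prompt immediately" flow. A sketch with the 300-second sleep shortened and the prints replaced by an events list so the ordering is visible:

```python
import threading
import time

events = []

def foo():
    events.append("foo started")
    time.sleep(0.3)              # stands in for the original time.sleep(300)
    events.append("foo finished")

def boo():
    events.append("boo!")

# run foo in a background thread so the main thread (the prompt) is free at once
t = threading.Thread(target=foo)
t.start()
time.sleep(0.05)                 # give foo a moment to start
boo()                            # runs immediately, while foo is still sleeping
t.join()                         # wait for foo before exiting
print(events)
```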

How to call asynchronous functions without expecting returns from them?

In the code below I'd like to call task1 and task2 but WITHOUT expecting returns from these methods, is it possible?
import asyncio

async def say(something, delay):
    await asyncio.sleep(delay)
    print(something)

loop = asyncio.get_event_loop()
task1 = loop.create_task(say('hi', 1))
task2 = loop.create_task(say('hoi', 2))
loop.run_until_complete(asyncio.gather(task1, task2))
I would like to process items from a queue that reaches main in a while loop, without waiting on say, because I do not need its return value. For example, pseudo-code:
import asyncio
import time

async def say(something, delay):
    await asyncio.sleep(delay)
    print(something)

def main():
    while True:
        # search for database news
        # call say asynchronously; I do not need any return value,
        # I just want it to run independently
        time.sleep(1)
If I understood you correctly, you already have what you want when you create a task. A created task is executed "in the background": you don't have to await it.
import asyncio

async def say(something, delay):
    await asyncio.sleep(delay)
    print(something)

async def main():
    # run tasks without awaiting their results
    for i in range(5):
        asyncio.create_task(say(i, i))

    # do something while the tasks run "in the background"
    while True:
        print('Do something different')
        await asyncio.sleep(1)

asyncio.run(main())
Result:
Do something different
0
Do something different
1
2
Do something different
3
Do something different
4
Do something different
Do something different
Do something different
Do something different
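One caveat worth adding: asyncio.run cancels any tasks still pending when main returns, and a task held only by the loop can even be garbage-collected mid-flight. If the fire-and-forget tasks must finish, keep references and gather them before exiting. A sketch (prints replaced by a results list so the ordering is visible):

```python
import asyncio

results = []

async def say(something, delay):
    await asyncio.sleep(delay)
    results.append(something)

async def main():
    # keep references: a task held only by the loop can be garbage-collected
    tasks = [asyncio.create_task(say(i, i / 20)) for i in range(3)]
    results.append("main keeps working")
    # gather before returning, or asyncio.run cancels whatever is still pending
    await asyncio.gather(*tasks)

asyncio.run(main())
print(results)
```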

Python Asyncio: RuntimeError: This event loop is already running

I am working on an asyncio module and having issues terminating the program. I am running the program in a terminal and Ctrl + C does not stop it. However, if I close the terminal and try to run the program again, I get this issue:
INFO:root:In main
ERROR:root:This event loop is already running
Below is my sample code for understanding.
# all_tasks.py
import asyncio
import logging

# module imports
import settings

# configure logging into a file
logging.basicConfig(filename=settings.LOG_FILENAME, level=logging.DEBUG)

class AsyncTest(object):
    async def taskOne(self):
        while True:
            print("Task One")  # print is just an example, I am doing a lot of stuff inside
            await asyncio.sleep(60)

    async def taskTwo(self):
        while True:
            print("Task Two")  # print is just an example, I am doing a lot of stuff inside
            await asyncio.sleep(60)

    async def main(self):
        try:
            loop = asyncio.get_event_loop()
            tasks = [
                asyncio.ensure_future(self.taskOne()),
                asyncio.ensure_future(self.taskTwo()),
            ]
            loop.run_until_complete(asyncio.wait(tasks))
        except RuntimeError as error:
            logging.info("In main")
            logging.error(error)

if __name__ == '__main__':
    asynctest = AsyncTest()
    asyncio.run(asynctest.main())
Config: Windows 10, python 3.7.0
File Name: all_tasks.py
Command: python all_tasks.py
Any help is much appreciated.
Thanks
asyncio.run creates and runs an event loop for you. You shouldn't create and run one yourself, especially inside a coroutine (a function defined with async def). In a coroutine you should only await things.
Modify the code accordingly:
# ...
    async def main(self):
        tasks = [
            asyncio.ensure_future(self.taskOne()),
            asyncio.ensure_future(self.taskTwo()),
        ]
        await asyncio.wait(tasks)

if __name__ == '__main__':
    asynctest = AsyncTest()
    asyncio.run(asynctest.main())
It'll work.
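As an aside, on Python 3.11+ asyncio.wait no longer accepts bare coroutines, and create_task has superseded ensure_future inside a running loop. An equivalent of the fixed code on current versions might look like this sketch, with the infinite loops bounded and prints recorded so it terminates:

```python
import asyncio

class AsyncTest:
    def __init__(self):
        self.calls = []

    async def taskOne(self):
        for _ in range(2):            # bounded here; the original loops forever
            self.calls.append("Task One")
            await asyncio.sleep(0.01)

    async def taskTwo(self):
        for _ in range(2):
            self.calls.append("Task Two")
            await asyncio.sleep(0.01)

    async def main(self):
        # create_task replaces ensure_future, gather replaces asyncio.wait
        await asyncio.gather(
            asyncio.create_task(self.taskOne()),
            asyncio.create_task(self.taskTwo()),
        )

asynctest = AsyncTest()
asyncio.run(asynctest.main())
print(asynctest.calls)
```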

Calling coroutine and getting future in asyncio.Protocol.data_received()?

I need to get a future's result inside the asyncio loop; this is similar to Calling a coroutine from asyncio.Protocol.data_received.
But asyncio in PY35 and PY34 are completely different. Here is code which runs correctly in PY34, but in PY35 it pauses at yield from and never returns.
# PY34
class RelayClient(asyncio.Protocol):
    pass

class Server(asyncio.Protocol):
    def data_received(self, data):
        # I need to connect to another address and get the future result
        # inside the current function.
        # I also cannot run loop.run_until_complete().
        loop = asyncio.get_event_loop()
        result = yield from loop.create_connection(RelayClient, 'www.google.com', 443)
        do_some_thing_with_result(result)
So, how to do this in python 3.5?
Any advice is appreciated.
You cannot await a coroutine from a function that is not a coroutine. data_received is not a coroutine, so as was mentioned in the comments, you need to use the ensure_future helper to create a "background" task from your coroutine.
No need to start using callbacks however:
async def do_stuff(data):
    result = await loop.create_connection(RelayClient, 'www.google.com', 443)
    await do_some_thing_with_result(result)

class Server(asyncio.Protocol):
    def data_received(self, data):
        asyncio.ensure_future(do_stuff(data))
I would point out, however, that asyncio gives no guarantee whatsoever that data_received will be called with the complete data you are expecting. Usually the pattern you see in a Protocol looks a lot like this:
async def process_line(line):
    ...

class Server(asyncio.Protocol):
    def __init__(self):
        self.buffer = b''

    def data_received(self, data):
        self.buffer += data
        if b'\n' not in self.buffer:
            return
        # maxsplit=1: a chunk with two newlines would otherwise fail to unpack
        line, self.buffer = self.buffer.split(b'\n', 1)
        fut = asyncio.ensure_future(process_line(line))
        fut.add_done_callback(self._handle_exception)

    def _handle_exception(self, fut):
        if fut.exception() is not None:
            print('Processing failed', fut.exception())
(this is just an example, it copies the buffer way too much and would be very inefficient in most production use-cases)
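A sketch of a less copy-heavy variant: a bytearray trimmed in place instead of rebuilding the bytes buffer, and a loop so that multiple lines in one chunk are all handled. process_line here is a stand-in handler that just records each line:

```python
import asyncio

processed = []

async def process_line(line):
    # stand-in handler; records each complete line
    processed.append(line)

class Server(asyncio.Protocol):
    def __init__(self):
        # bytearray lets us delete the consumed prefix in place
        self.buffer = bytearray()

    def data_received(self, data):
        self.buffer += data
        # handle every complete line in the chunk, not just the first
        while (idx := self.buffer.find(b'\n')) != -1:
            line = bytes(self.buffer[:idx])
            del self.buffer[:idx + 1]   # drop the consumed line in place
            asyncio.ensure_future(process_line(line))

async def main():
    # feed the protocol directly to show the buffering, no real transport needed
    srv = Server()
    srv.data_received(b'foo\nba')
    srv.data_received(b'r\nbaz\n')
    await asyncio.sleep(0)  # let the scheduled tasks run
    return processed

print(asyncio.run(main()))
```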

Multi-threaded asyncio in Python

I'm currently doing my first steps with asyncio in Python 3.5 and there is one problem that's bugging me. Obviously I haven't fully understood coroutines...
Here is a simplified version of what I'm doing.
In my class I have an open() method that creates a new thread. Within that thread I create a new event loop and a socket connection to some host. Then I let the loop run forever.
def open(self):
    # create thread
    self.thread = threading.Thread(target=self._thread)
    self.thread.start()
    # wait for connection
    while self.protocol is None:
        time.sleep(0.1)

def _thread(self):
    # create loop, connection and run forever
    self.loop = asyncio.new_event_loop()
    coro = self.loop.create_connection(lambda: MyProtocol(self.loop),
                                       'somehost.com', 1234)
    self.loop.run_until_complete(coro)
    self.loop.run_forever()
Stopping the connection is quite simple: I just stop the loop from the main thread:
loop.call_soon_threadsafe(loop.stop)
Unfortunately I need to do some cleanup, especially I need to empty a queue before disconnecting from the server. So I tried something like this stop() method in MyProtocol:
class MyProtocol(asyncio.Protocol):
    def __init__(self, loop):
        self._loop = loop
        self._queue = []

    async def stop(self):
        # wait for all queues to empty
        while self._queue:
            await asyncio.sleep(0.1)
        # disconnect
        self.close()
        self._loop.stop()
The queue gets emptied from within the protocol's data_received() method, so I just want to wait for that to happen using the while loop with the asyncio.sleep() call. Afterwards I close the connection and stop the loop.
But how do I call this method from the main thread and wait for it?
I tried the following, but none of them seem to work (protocol is the currently used instance of MyProtocol):
loop.call_soon_threadsafe(protocol.stop)
loop.call_soon_threadsafe(functools.partial(asyncio.ensure_future, protocol.stop(), loop=loop))
asyncio.ensure_future(protocol.stop(), loop=loop)
Can anyone please help me here? Thanks!
Basically you want to schedule a coroutine on the loop of a different thread. You can use run_coroutine_threadsafe; note that it takes a coroutine object, so stop must be called:
future = asyncio.run_coroutine_threadsafe(protocol.stop(), loop)
future.result()  # block until the coroutine finishes
Or the old generator-based style, as in https://stackoverflow.com/a/32084907/681044 (the original used asyncio.async, which is a syntax error on Python 3.7+ since async became a keyword; asyncio.ensure_future is its replacement):
import asyncio
from threading import Thread

loop = asyncio.new_event_loop()

def f(loop):
    asyncio.set_event_loop(loop)
    loop.run_forever()

t = Thread(target=f, args=(loop,))
t.start()

@asyncio.coroutine
def g():
    yield from asyncio.sleep(1)
    print('Hello, world!')

loop.call_soon_threadsafe(asyncio.ensure_future, g())
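Putting run_coroutine_threadsafe together with the loop-in-a-background-thread setup, a self-contained sketch on current Python (stop_and_report is a hypothetical stand-in for protocol.stop):

```python
import asyncio
import threading

# start an event loop in a background thread
loop = asyncio.new_event_loop()

def run_loop(loop):
    asyncio.set_event_loop(loop)
    loop.run_forever()

t = threading.Thread(target=run_loop, args=(loop,), daemon=True)
t.start()

async def stop_and_report():
    # hypothetical stand-in for protocol.stop(): do cleanup on the loop's thread
    await asyncio.sleep(0.05)
    return "cleaned up"

# schedule the coroutine on the other thread's loop; note the call: stop_and_report()
future = asyncio.run_coroutine_threadsafe(stop_and_report(), loop)
result = future.result()   # blocks this thread until the coroutine finishes
print(result)

loop.call_soon_threadsafe(loop.stop)
t.join()
```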
