Thread is blocked in Django (Python 3.x)

For the last 4 hours I have been trying to understand threading with Django, and nothing seems to work. I want the website to keep running in the foreground while the backend communicates with some other devices on a thread. The thread should start when the website starts up, but the program blocks at the point where I call the thread and stays blocked until the thread finishes.
Do you know a way to fix this?
The urls.py file
import threading

def add(x, y):
    i = 0
    while i < 100000000:
        x += y
        i += 1

def postpone(function):
    t = threading.Thread(target=function, args=(1,))
    t.setDaemon(True)
    t.start()
    return 0

print("Before thread")
postpone(add(4, 4))
print("After thread")
The server will not start until the while loop is finished.
Thanks for reading, I hope someone knows an answer.

The add function is called before the thread is started; you need to pass add as a reference instead.
# decomposition of the original call
# first, add gets called
r = add(4, 4)
# then its result is passed to postpone
postpone(r)

# postpone should accept a function plus its args, which eventually get passed on to the thread
import threading

def postpone(function, *args):
    t = threading.Thread(target=function, args=args)
    t.daemon = True  # setDaemon() is deprecated in newer Python; set the attribute instead
    t.start()
    return 0

print("Before thread")
# pass the function as a reference, and hand its args to postpone as well
postpone(add, 4, 4)
print("After thread")

Related

Python Asyncio and Multithreading

I have created a greatly simplified version of an application below that intends to use Python's asyncio and threading modules. The general structure is as follows:
import asyncio
import threading

class Node:
    def __init__(self, loop):
        self.loop = loop
        self.tasks = set()

    async def computation(self, x):
        print("Node: computation called with input ", x)
        await asyncio.sleep(1)

    def schedule_computation(self, x):
        print("Node: schedule_computation called with input ", x)
        task = self.loop.create_task(self.computation(x))
        self.tasks.add(task)

class Router:
    def __init__(self, loop):
        self.loop = loop
        self.nodes = {}

    def register_node(self, id):
        self.nodes[id] = Node(self.loop)

    def schedule_computation(self, node_id, x):
        print("Router: schedule_computation called with input ", x)
        self.nodes[node_id].schedule_computation(x)

class Client:
    def __init__(self, router):
        self.router = router
        self.counter = 0

    def run(self):
        while True:
            if self.counter == 1000000:
                self.router.schedule_computation(1, 5)
            self.counter += 1

def main():
    loop = asyncio.get_event_loop()
    # construct Router instance and register a node
    router = Router(loop)
    router.register_node(1)
    # construct Client instance
    client = Client(router)
    client_thread = threading.Thread(target=client.run)
    client_thread.start()
    loop.run_forever()

main()
In practice the Node.computation method is doing some network I/O, so I'd like to perform that work asynchronously. The Client.run method is synchronous and blocking, and I'd like to give this function its own thread to execute in (in fact, I'd like the ability to run this method in a separate process if possible).
Upon executing this application we get the following output:
Router: schedule_computation called with input 5
Node: schedule_computation called with input 5
However, I expect that "Node: computation called with input 5" should print as well because the Node.schedule_computation method creates a task to run on loop. In summary, why does it seem that Node.computation is never scheduled?
Use loop.call_soon_threadsafe
In general, asyncio isn't thread safe:
"Almost all asyncio objects are not thread safe, which is typically not a problem unless there is code that works with them from outside of a Task or a callback. If there's a need for such code to call a low-level asyncio API, the loop.call_soon_threadsafe() method should be used."
https://docs.python.org/3/library/asyncio-dev.html#concurrency-and-multithreading
In Router.schedule_computation, hand the call over to the loop's thread:
self.loop.call_soon_threadsafe(self.nodes[node_id].schedule_computation, x)
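A sketch of how that fix might look in the Router class from the question (same names as above; only schedule_computation changes):

class Router:
    def __init__(self, loop):
        self.loop = loop
        self.nodes = {}

    def register_node(self, id):
        self.nodes[id] = Node(self.loop)

    def schedule_computation(self, node_id, x):
        print("Router: schedule_computation called with input ", x)
        # create_task() must happen on the loop's own thread, so hand it over
        self.loop.call_soon_threadsafe(
            self.nodes[node_id].schedule_computation, x
        )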
Node.computation runs on the main thread
Not sure if you are aware, but even though you can use call_soon_threadsafe to initiate a coroutine from another thread, the coroutine always runs in the thread the loop was created in. If you want to run coroutines on another thread, then your background thread will need its own event loop as well.
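The answer stops there, but for illustration, one common way to give a background thread its own loop and submit coroutines to it is asyncio.run_coroutine_threadsafe; a minimal, self-contained sketch (not taken from the thread above):

import asyncio
import threading

def start_background_loop():
    # create a fresh event loop and run it forever in a daemon thread
    loop = asyncio.new_event_loop()
    threading.Thread(target=loop.run_forever, daemon=True).start()
    return loop

async def computation(x):
    await asyncio.sleep(1)
    return x * 2

bg_loop = start_background_loop()
# submit the coroutine from the main thread; it executes on the background loop
future = asyncio.run_coroutine_threadsafe(computation(5), bg_loop)
print(future.result())  # blocks until the coroutine finishes; prints 10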

Dart: Store heavy object in an isolate and access its method from main isolate without reinstantiating it

Is it possible in Dart to instantiate a class in an isolate and then send messages to this isolate to receive return values from its methods (instead of spawning a new isolate and re-instantiating the same class every time)? I have a class with a long initialization and heavy methods. I want to initialize it once and then access its methods without compromising the performance of the main isolate.
Edit: I mistakenly answered this question thinking Python rather than Dart. Snakes on the brain / snakes on a plane.
I am not familiar with Dart programming, but it would seem the concurrency model has a lot of similarities (isolated memory, message passing, etc.). I was able to find an example of two-way message passing with a Dart isolate. There's a little difference in how it gets set up, and the streams are a bit simpler than Python Queues, but in general the idea is the same.
Basically:
1. Create a port to receive data from the isolate.
2. Create the isolate, passing it the port it will send data back on.
3. Within the isolate, create the port it will listen on, and send the other end of it back to main (so main can send messages).
4. Determine and implement a simple messaging protocol for remote method calls on an object contained within the isolate.
This is basically duplicating what a multiprocessing.Manager class does; however, it can be helpful to have a simplified example of how it can work:
from multiprocessing import Process, Lock, Queue
from time import sleep

class HeavyObject:
    def __init__(self, x):
        self._x = x
        sleep(5)  # heavy init

    def heavy_method(self, y):
        sleep(.2)  # medium weight method
        return self._x + y

def HO_server(in_q, out_q):
    ho = HeavyObject(5)
    # msg format for remote method call: ("method_name", (arg1, arg2, ...), {"kwarg1": 1, "kwarg2": 2, ...})
    # pass None to exit worker cleanly
    for msg in iter(in_q.get, None):  # get a remote call message from the queue
        out_q.put(getattr(ho, msg[0])(*msg[1], **msg[2]))  # call the method with the args, and put the result back on the queue

class RMC_helper:  # remote method caller, for convenience
    def __init__(self, in_queue, out_queue, lock):
        self.in_q = in_queue
        self.out_q = out_queue
        self.l = lock
        self.method = None

    def __call__(self, *args, **kwargs):
        if self.method is None:
            raise Exception("no method to call")
        with self.l:  # isolate access to the queues so results don't pile up and get popped off in the wrong order
            print("put to queue: ", (self.method, args, kwargs))
            self.in_q.put((self.method, args, kwargs))
            return self.out_q.get()

    def __getattr__(self, name):
        if not name.startswith("__"):
            self.method = name
            return self
        else:
            super().__getattr__(name)

def child_worker(remote):
    print("child", remote.heavy_method(5))  # prints 10
    sleep(3)  # child works on something else
    print("child", remote.heavy_method(2))  # prints 7

if __name__ == "__main__":
    in_queue = Queue()
    out_queue = Queue()
    lock = Lock()  # the lock is used so replies don't get matched to the wrong request
    remote = RMC_helper(in_queue, out_queue, lock)
    Server = Process(target=HO_server, args=(in_queue, out_queue))
    Server.start()
    Worker = Process(target=child_worker, args=(remote, ))
    Worker.start()
    print("main", remote.heavy_method(3))  # this will *probably* start first due to startup time of the child
    Worker.join()
    with lock:
        in_queue.put(None)
    Server.join()
    print("done")

How to have my defined refresh function running in the background of my twisted server

I have a simple Twisted TCP server running absolutely fine. It basically deals with database requests and displays the right things; it's just an echo client with a bunch of functions. The database being read also updates, and I have a refresh function that opens the database and refreshes it. However, if I add this to the message functions it takes too long to respond, since the refresh function takes around 6-7 seconds to complete. My initial idea was to have this function in a while loop running constantly, refreshing every 5-10 minutes, but after reading about the global interpreter lock I'm not sure that is possible. Any suggestions on how to run this function in the background of my code would be greatly appreciated.
I've tried having it in a thread, but it doesn't seem to run at all when I start the thread. I put it under the if __name__ == '__main__': block and had no luck!
Here is my refresh function:
import win32com.client
import pandas as pd
from twisted.internet import reactor
from twisted.internet.protocol import Factory, Protocol

def refreshit():
    Application = win32com.client.Dispatch("Excel.Application")
    Workbook = Application.Workbooks.open(database)
    Workbook.RefreshAll()
    Workbook.Save()
    Application.Quit()
    xlsx = pd.ExcelFile(database)
    global datess
    global refss
    df = pd.read_excel(xlsx, sheet_name='Sheet1')
    datess = df.groupby('documentDate')
    refss = df.groupby('reference')

class Echo(Protocol):
    global Picked_DFS
    Picked_DFS = None
    label = None
    global errors
    global picked
    errors = []
    picked = []

    def dataReceived(self, data):
        """
        As soon as any data is received, write it back.
        """
        response = self.handle_message(data)
        print('responding with this')
        print(response)
        self.transport.write(response)

def main():
    f = Factory()
    f.protocol = Echo
    reactor.listenTCP(8000, f)
    reactor.run()

if __name__ == '__main__':
    main()
I had tried this to no avail
if __name__ == '__main__':
    main()
    thread = Thread(target = refreshit())
    thread.start()
    thread.join()
You have an important error on this line:
thread = Thread(target = refreshit())
Judging by its name (perhaps a function to consider renaming), I assume refreshit is the function that performs your refresh.
In this case, what you are doing here is calling refreshit and waiting for it to return a value. Then, the value it returns is used as the target of the Thread you create here. This is probably not what you meant. Instead:
thread = Thread(target = refreshit)
That is, refreshit itself is what you want the target of the thread to be.
You also need to be sure to sequence your operations so that everything gets to run concurrently:
if __name__ == '__main__':
    # Start your worker/background thread.
    thread = Thread(target = refreshit)
    thread.start()
    # Run Twisted
    main()
    # Cleanup/wait on your worker/background thread.
    thread.join()
You may also just want to use Twisted's thread support instead of using the threading module directly (but this is not mandatory).
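For illustration, a rough sketch of that route using Twisted's own helpers, LoopingCall and deferToThread, assuming the Factory/Echo/refreshit definitions from the question (the 300-second interval is just an example value):

from twisted.internet import reactor, task
from twisted.internet.threads import deferToThread

def main():
    f = Factory()
    f.protocol = Echo
    reactor.listenTCP(8000, f)
    # run refreshit in Twisted's thread pool every 300 seconds so the
    # slow Excel/pandas work never blocks the reactor loop
    refresher = task.LoopingCall(lambda: deferToThread(refreshit))
    refresher.start(300, now=True)
    reactor.run()

if __name__ == '__main__':
    main()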

How to periodically log a variable in the background

I would like to log a variable (local scope) every second in Python 3.5.
The call should be asynchronous (meaning the main thread should keep running).
I have tried the following code:
from threading import Thread
from time import sleep

def functionX(self):
    ...  # do stuff

class BackgroundTimer(Thread):
    def run(self):
        while True:
            print("sum_data ", sum)
            sleep(60)

timer = BackgroundTimer()
timer.start()
# keep doing stuff
Thank you.
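For reference, a minimal sketch of one common approach (not taken from the thread above): a daemon thread that reads the value through a callable and logs it at a fixed interval.

import logging
import threading
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")

def log_periodically(get_value, interval=1.0):
    # log whatever get_value() returns every `interval` seconds
    def worker():
        while True:
            logging.info("sum_data %s", get_value())
            time.sleep(interval)
    # daemon=True so the logger thread dies with the main thread
    threading.Thread(target=worker, daemon=True).start()

sum_data = 0
log_periodically(lambda: sum_data)  # main thread keeps running; the lambda always reads the latest value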

Kivy - need to wait until the thread is finished

I have a thread in my app that appends something to a list, and then I want to print it on another screen. But the program runs the thread after the print, so it gives me an error that there is nothing in my list. I need to make the program wait until the thread is done; how can I do this? I tried to use .join() but it didn't work... thanks for the help.
my app:
import socket
import threading

from kivy.uix.gridlayout import GridLayout
from kivy.uix.label import Label
from kivy.uix.textinput import TextInput
from kivy.uix.button import Button
from kivy.uix.screenmanager import Screen

alist = []  # shared list referenced below (not shown in the original post)

class LoginScreen(GridLayout):
    def __init__(self, **kwargs):
        super(LoginScreen, self).__init__(**kwargs)
        self.cols = 2
        self.add_widget(Label(text='username'))
        self.username = TextInput(multiline=False)
        self.add_widget(self.username)
        self.add_widget(Label(text='Password'))
        self.password = TextInput(multiline=False, password=True)
        self.add_widget(self.password)
        self.submit_button = Button(text='sumbit', size_hint=(.5, .25), font_size=20)
        self.submit_button.bind(on_press=self.submit_username)
        self.add_widget(self.submit_button)

    def submit_username(self, *args):
        self.msg = threading.Thread(target=send_data(self.username.text))
        self.msg.start()
        self.msg.join()
        sm.current = 'searchi'
        sm.transition.direction = 'left'

def send_data(name):
    my_socket = socket.socket()
    my_socket.connect(('127.0.0.1', 8093))
    my_socket.send(name.encode('utf-8'))
    name, address = my_socket.recvfrom(1024)
    msg = name.decode('utf-8')
    alist.append(msg)
    my_socket.close()

# Declare both screens
class Searchi(Screen):
    def __init__(self, **kwargs):
        super(Searchi, self).__init__(**kwargs)
        self.add_widget(Label(text=alist[0]))
The list does receive the "msg": if I delete the line self.add_widget(Label(text=alist[0])), then there is no problem with the recv part. I just need to wait until the thread is finished.
Why bother with a Thread if you are going to wait for it anyway? You could just call send_data(self.username.text) and be done with it.
But doing this is usually bad practice (doing blocking calls without a thread, or waiting for the thread to finish in a blocking way, which is equivalent). What you want, instead of waiting for the task to be done before proceeding, is to react to the task being done; that is, at the end of your thread, do something that will allow your app to proceed.
You could have a callback that moves your user to the new screen, called at the end of the thread.
def submit_username(self, *args):
    def callback():
        sm.current = 'searchi'
        sm.transition.direction = 'left'
    # pass the function and its arguments separately so the thread runs it,
    # rather than calling send_data() here and using its result as the target
    threading.Thread(target=send_data, args=(self.username.text, callback)).start()

def send_data(name, callback):
    my_socket = socket.socket()
    my_socket.connect(('127.0.0.1', 8093))
    my_socket.send(name.encode('utf-8'))
    name, address = my_socket.recvfrom(1024)
    msg = name.decode('utf-8')
    alist.append(msg)
    my_socket.close()
    callback()
If you want to prevent the user from touching anything while the action happens, I would advise putting up a Popup with auto_dismiss=False and content indicating that data is being processed, and closing it in the callback.
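One caveat not covered above: Kivy widgets should only be touched from the main thread, so it is safer to have the worker schedule the callback back onto the main thread with Clock.schedule_once. A small sketch of that variation:

from kivy.clock import Clock

def send_data(name, callback):
    # ... same socket work as above ...
    # run the screen-changing callback on Kivy's main thread
    Clock.schedule_once(lambda dt: callback())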
