Python start multiprocessing without print/logging statements from processes - python-3.x

I am starting two processes via multiprocessing and this is working fine. The only problem I have is the print and debug statements coming from these two processes.
The hope is to use the REPL and run the processes in the background. However, I cannot get this to work: I always see the debug statements and therefore can't use the REPL anymore. This is how I start the processes:
processes = [
    Process(target=start_viewer, args=()),
    Process(target=start_server, args=(live, amount, fg))
]
for p in processes:
    p.start()
Any idea on how to "mute" the processes, or run them in the background?

If I understand you correctly, you want to suppress the printing from one of the processes.
You can achieve this by redirecting the output of the Python interpreter.
Add sys.stdout = open("/dev/null", 'w') to the process which you want to "mute".
A full working example is below.
from multiprocessing import Process
from time import sleep
import sys

def start_viewer():
    sys.stdout = open("/dev/null", 'w')
    while True:
        print("start_viewer")
        sleep(1)

def start_server():
    while True:
        print("start_server")
        sleep(1)

if __name__ == '__main__':
    processes = [
        Process(target=start_viewer, args=()),
        Process(target=start_server, args=())
    ]
    for p in processes:
        p.start()
Be aware that /dev/null simply discards everything written to it; if you want to keep the output, redirect to a text file instead. Also, for cross-platform support you should use os.devnull.
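For reference, a minimal cross-platform variant of the same idea, using os.devnull (which resolves to "/dev/null" on POSIX and "nul" on Windows):

import os
import sys
from multiprocessing import Process
from time import sleep

def start_viewer():
    sys.stdout = open(os.devnull, 'w')  # portable null device
    while True:
        print("start_viewer")  # discarded
        sleep(1)

if __name__ == '__main__':
    Process(target=start_viewer).start()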

Related

Get data from process

Suppose I have a Python script program.py that looks as follows:
# program.py
import random

class Program:
    def __init__(self):
        self.data = []

    def run(self):
        while True:
            self.data.append(random.random())

if __name__ == "__main__":
    program = Program()
    program.run()
Now suppose that I have another script script.py that calls program.py as a separate process.
# script.py
import subprocess
import time

process = subprocess.Popen(["py", "program.py"])
while True:
    data = get_program_data(process)  # how?
    time.sleep(10)
The purpose of script.py is to illustrate the fact that I don't have access to the class Program. In my case this is because I will be triggering program.py from a .NET application. I thought I'd try to understand how to deal with this problem from a Python script first, then apply it to the .NET application. So here is my question (keep in mind that I can alter the code in program.py and script.py, but script.py cannot access program.py):
How should I go about accessing self.data from the process? I have been searching all over and I'm not quite sure what direction I should be going in. In my case, I will need to trigger different commands for different kinds of data generated in Program, i.e. get_program_data1(), get_program_data2(), ....
The way I have been "solving" this issue is to have a file controller.json that script.py modifies and program.py reads and acts on accordingly. It does not feel quite right doing it this way, so I want your opinion about it. Remember that ultimately, script.py is a .NET application. Thanks
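One common pattern for this kind of problem (a sketch added here, not from the original thread) is to have program.py write each datum as a JSON line on stdout and have the caller read the pipe; a .NET Process with RedirectStandardOutput can consume the same stream. All names below are illustrative:

# program.py (sketch): emit each datum as one JSON line on stdout
import json
import random
import time

class Program:
    def __init__(self):
        self.data = []

    def run(self):
        while True:
            value = random.random()
            self.data.append(value)
            print(json.dumps({"value": value}), flush=True)  # flush so the reader sees it immediately
            time.sleep(1)

if __name__ == "__main__":
    Program().run()

# script.py (sketch): read the JSON lines from the child's stdout
import json
import subprocess
import sys

process = subprocess.Popen([sys.executable, "program.py"],
                           stdout=subprocess.PIPE, text=True)
for line in process.stdout:  # blocks until the child emits a line
    print("received:", json.loads(line)["value"])

Commands in the other direction (your get_program_data1(), get_program_data2() idea) could be written to the child's stdin the same way, which avoids polling a shared file.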

Printing from other thread when waiting for input()

I am trying to write a shell that needs to run socket connections on a separate thread. In my testing, when print() is used while cmd.Cmd.cmdloop() is waiting for input, the printed output is displayed incorrectly.
from core.shell import Shell
import time
import threading

def test(shell):
    time.sleep(2)
    shell.write('Doing test')

if __name__ == '__main__':
    shell = Shell(None, None)
    testThrd = threading.Thread(target=test, args=(shell,))
    testThrd.start()
    shell.cmdloop()
When the above command runs, here is what happens:
python test.py
Welcome to Test shell. Type help or ? to list commands.
>>asd
*** Unknown syntax: asd
>>[17:59:25] Doing test
As you can see, printing from another thread adds the output after the prompt >>, not on a new line. How can I make it appear on a new line, followed by a fresh prompt?
What you can do is redirect stdout from your core.shell.Shell to a file-like object such as StringIO. You would also redirect the output from your thread into a different file-like object.
Now, you can have some third thread read both of these objects and print them out in whatever fashion you want.
You said core.shell.Shell inherits from cmd.Cmd, which allows redirection as a parameter to the constructor:
import io
import time
import threading
from core.shell import Shell

def test(output_obj):
    time.sleep(2)
    print('Doing test', file=output_obj)

cmd_output = io.StringIO()
thr_output = io.StringIO()

shell = Shell(stdout=cmd_output)
testThrd = threading.Thread(target=test, args=(thr_output,))
testThrd.start()

# in some other process/thread
cmd_line = cmd_output.readline()
thr_line = thr_output.readline()
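A note on this sketch (an addition, not part of the original answer): io.StringIO keeps a single position shared by reads and writes, so readline() called right after the shell has written returns an empty string, and seeking from a reader thread would disturb the writer's position. For streaming text between threads, a small file-like wrapper around queue.Queue is more robust; a sketch using the asker's Shell class:

import queue

class QueueWriter:
    """Minimal file-like object: every write() lands on a thread-safe queue."""
    def __init__(self):
        self.lines = queue.Queue()
    def write(self, text):
        if text:
            self.lines.put(text)
    def flush(self):
        pass

cmd_output = QueueWriter()
shell = Shell(stdout=cmd_output)  # cmd.Cmd only needs write() and flush()
# shell.cmdloop() would run in its own thread; the reader side then blocks
# until the shell writes something:
chunk = cmd_output.lines.get()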
That's quite difficult. Both your threads share the same stdout, so the output from each of them is sent concurrently to the stdout buffer, where it is printed in some arbitrary order.
What you need to do is coordinate the output from both threads, and that's a tough nut to crack. Even bash doesn't do that!
That said, maybe you can try using a lock to make sure your threads access stdout in a controlled manner. Check out: http://effbot.org/zone/thread-synchronization.htm
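A minimal sketch of that suggestion: share one Lock and take it around every print. This serializes whole lines so they don't interleave mid-line, though it does not redraw the prompt:

import threading
import time

print_lock = threading.Lock()

def worker(name):
    for _ in range(3):
        time.sleep(0.5)
        with print_lock:  # only one thread writes at a time
            print("message from", name)

threads = [threading.Thread(target=worker, args=(n,)) for n in ("a", "b")]
for t in threads:
    t.start()
for t in threads:
    t.join()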

Python: Kill the parent process after creating child processes

I have a polling job written in Python that is executed every 15 minutes to check if the status of an entry in the job table is True. If the status is true then I need to take the values from the table and pass them as arguments to another script that executes something.
I am creating the child processes using Process from Python's multiprocessing module, but I am unable to exit the polling job (the parent script) after starting these processes. The polling job keeps waiting until the children complete, even if there is a sys.exit() call after creating the children.
# pollingjob.py
import sys
import multiprocessing
from multiprocessing import Process
from secondscript import secondscriptfunction

def createParallelBatches(a, b, c):
    for i in [1, 2, 3]:
        p1 = Process(target=secondscriptfunction, args=(a, b, c))
        p1.start()
    sys.exit()

if __name__ == '__main__':
    # Check the table for *status=True* rows
    # If such rows exist, call createParallelBatches with the column values
    pass
What I am failing to understand is why sys.exit() won't let me exit the program, leaving the spawned processes as orphans. I tried the subprocess module, but it behaves in the same way. I don't want the parent process waiting on its children to complete. Any help would be appreciated. Thanks.
You need to launch an independent subprocess externally. This is one way to do it:
Put secondscriptfunction into its own executable Python file.
File runscript.py
import sys
from secondscript import secondscriptfunction

if __name__ == '__main__':
    secondscriptfunction(sys.argv[1:])  # passing the arguments to the function
Then use subprocess.Popen in your script:
File pollingjob.py
import subprocess
import shlex
import sys

def createParallelBatches(a, b, c):
    for i in [1, 2, 3]:
        command = "python runscript.py %s %s %s" % (a, b, c)
        cmd_args = shlex.split(command)
        subprocess.Popen(cmd_args)
    sys.exit()
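A portability note worth adding: hard-coding "python" assumes the right interpreter is on PATH. Passing the arguments as a list with sys.executable avoids both that assumption and the shlex/quoting step; the loop body could instead be:

import subprocess
import sys

def createParallelBatches(a, b, c):
    for i in [1, 2, 3]:
        subprocess.Popen([sys.executable, "runscript.py", str(a), str(b), str(c)])
    sys.exit()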
Just remove the process object from the _children set of the current process object, and the parent process will exit immediately.
The multiprocessing module manages child processes in a private set and joins them when the current process exits. You can remove children from the set if you don't care about them.
process = multiprocessing.Process(target=proc_main)
process.start()
# _children is a private attribute of the current process object
multiprocessing.current_process()._children.discard(process)
exit(0)
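The snippet above matches Python 2, where the bookkeeping set hangs off current_process(). In Python 3 the set moved to module level in multiprocessing.process; a self-contained sketch for that case (still a private implementation detail that may change between versions):

import multiprocessing
import multiprocessing.process
import sys
import time

def proc_main():
    for i in range(5):  # stand-in child workload
        print("child still running", i)
        time.sleep(1)

if __name__ == '__main__':
    process = multiprocessing.Process(target=proc_main)
    process.start()
    # Drop the child from multiprocessing's private bookkeeping so the
    # exit handler does not join it.
    multiprocessing.process._children.discard(process)
    sys.exit(0)  # parent exits immediately; the orphaned child keeps printing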

Python multithreading from multiple files

This is a sample demo of how I want to use threading.
import threading
import time

def run():
    print("started")
    time.sleep(5)
    print("ended")

thread = threading.Thread(target=run)
thread.start()

for i in range(4):
    print("middle")
    time.sleep(1)
How can I make this threading demo work across multiple files?
Example:
# Main.py
import background

""" Here I will have a main program, and
I want the command from the background file to constantly run,
not just once at the start of the program. """
The second file:
# background.py
while True:
    print("This text should always be printing "
          "even when my code in the main function is running")
Put all the lines before your for loop into background.py. When it is imported, it will start the thread running. Change the run method to do your infinite while loop.
You may also want to set daemon=True when starting the thread, so that it will exit when the main program exits.
main.py
import time
import background

for i in range(4):
    print("middle")
    time.sleep(1)
background.py
import threading
import time

def run():
    while True:
        print("background")
        time.sleep(.5)

thread = threading.Thread(target=run, daemon=True)
thread.start()
Output
background
middle
background
middle
background
background
background
middle
background
background
middle
background
background
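One caveat worth adding: a daemon thread is killed abruptly when the main program exits, so nothing after the loop body (cleanup, flushing files) ever runs. If the background work needs a graceful shutdown, a non-daemon thread plus a threading.Event works, for example:

import threading
import time

stop = threading.Event()

def run():
    while not stop.is_set():
        print("background")
        time.sleep(.5)

thread = threading.Thread(target=run)
thread.start()
time.sleep(2)   # main program does its work here
stop.set()      # ask the loop to finish its current pass
thread.join()   # wait for it; any cleanup after the loop would now run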

Closing Pipe in Python

import multiprocessing as mp
import time

"""
1. Send items via the pipe.
2. Receive them on the other end with a generator.
3. If the pipe is closed on the sending side, retrieve
   all items left and then quit.
"""

def foo(conn):
    for i in range(7):
        time.sleep(.3)
        conn.send(i)
    conn.close()

def bar(conn):
    while True:
        try:
            yield conn.recv()
        except EOFError:
            break

if __name__ == '__main__':
    """Choose which start method is used"""
    recv_conn, send_conn = mp.Pipe(False)
    p = mp.Process(target=foo, args=(send_conn,))  # foo can only send msgs
    p.start()
    # send_conn.close()
    for i in bar(recv_conn):
        print(i)
I'm using Python 3.4.1 on Ubuntu 14.04, and the code is not working: at the end of the program there is no EOFError, which should terminate the code, even though the Pipe has been closed. Closing the Pipe inside a function does not close the Pipe. Why is this the case?
Uncomment your send_conn.close() line. You should be closing pipe ends in processes that don't need them. The issue is that once you launch the subprocess, the kernel is tracking two open references to the send connection of the pipe: one in the parent process, and one in your subprocess.
The send connection object is only being closed in your subprocess, leaving it open in the parent process, so your conn.recv() call never raises EOFError: the pipe is still open.
This answer may be useful to you as well.
I verified that this code works in Python 2.7.6 if you uncomment the send_conn.close() call.
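For reference, the corrected main block with this fix applied (reusing foo and bar from the question):

if __name__ == '__main__':
    recv_conn, send_conn = mp.Pipe(False)
    p = mp.Process(target=foo, args=(send_conn,))
    p.start()
    send_conn.close()  # drop the parent's reference to the send end
    for i in bar(recv_conn):
        print(i)       # prints 0..6, then EOFError ends the generator
    p.join()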
