How to run and exit Python scripts from another Python program? - python-3.x

I wish to launch a Python file when I send an "on" request and kill that Python process once I send the "off" request. I am able to send the on and off requests, but I am not able to run the other Python file or kill it from the program I have written.
I could make a subprocess call, but I think there should be a way to call other Python scripts inside a Python script, and also a way to kill those scripts once their purpose is fulfilled.

I suggest using a thread.
Write all the code in your Python script in a function doit (except the import statements), and then import it.
Contents of thescript.py:
import time

def doit(athread):
    while not athread.stopped():
        print("Hello World")
        time.sleep(1)
Your program should then look like this:
import threading
import time
import thescript

class FuncThread(threading.Thread):
    def __init__(self, target):
        self.target = target
        super(FuncThread, self).__init__()
        self._stop_event = threading.Event()

    def stop(self):
        self._stop_event.set()

    def stopped(self):
        return self._stop_event.is_set()

    def run(self):
        self.target(self)

t1 = FuncThread(thescript.doit)
t1.start()
time.sleep(5)
t1.stop()
t1.join()
You can exit the thread at any time; here I just waited 5 seconds and then called the stop() method.
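If you specifically want a separate process rather than a thread (the question mentions a subprocess call), a minimal sketch along those lines could look like the following; the file name otherscript.py is only a placeholder:

import subprocess

# Launch the other script as its own process when the "on" request arrives.
proc = subprocess.Popen(["python", "otherscript.py"])

# ...later, when the "off" request arrives...
proc.terminate()  # ask the child process to stop
proc.wait()       # reap it so it does not linger as a zombie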

Related

Get data from process

Suppose I have a Python script program.py that looks as follows:
# program.py
import random

class Program:
    def __init__(self):
        self.data = []

    def run(self):
        while True:
            self.data.append(random.random())

if __name__ == "__main__":
    program = Program()
    program.run()
Now suppose that I have another script script.py that calls program.py as a separate process.
# script.py
import subprocess
import time

process = subprocess.Popen(["py", "program.py"])

while True:
    data = get_program_data(process)  # how?
    time.sleep(10)
The purpose of script.py is to illustrate the fact that I don't have access to the class Program. In my case this is because I will be triggering program.py from a .NET application. I thought I'd try to understand how to deal with this problem from a Python script first, then apply it to the .NET application. So here is my question (keep in mind that I can alter the code in program.py and script.py, but script.py cannot access program.py):
How should I go about accessing self.data from the process? I have been searching all over and I'm not quite sure in what direction I should be going. In my case, I will need to trigger different commands for different kinds of data generated in Program, e.g. get_program_data1(), get_program_data2(), ...
The way I have been "solving" this issue is to have a file controller.json that script.py modifies and that program.py reads and acts on accordingly. It does not feel quite right doing it this way, so I want your opinion on it. Remember that ultimately, script.py is a .NET application. Thanks.
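For reference, a minimal sketch of the controller.json handshake described above; the file names, the "command" field, and the helper poll_once are illustrative assumptions, not part of the original code:

import json
import time

# script.py (or the .NET app) writes a request file:
with open("controller.json", "w") as f:
    json.dump({"command": "get_program_data1"}, f)

# program.py periodically reads the request and writes its reply:
def poll_once(data):
    try:
        with open("controller.json") as f:
            request = json.load(f)
    except (FileNotFoundError, ValueError):
        return
    if request.get("command") == "get_program_data1":
        with open("result.json", "w") as f:
            json.dump({"data": data}, f)

poll_once([0.1, 0.2, 0.3])
time.sleep(1)  # in program.py this polling would sit inside the run() loop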

How to force os.stat to re-read file stats for the same path

I have code that is architecturally close to what is posted below (unfortunately I can't post the full version because it's proprietary). I have a self-updating executable and I'm trying to test this feature. We assume that the full path to this file will be in A.some_path after executing input. My problem is that the assertion fails, because on the second call os.stat still returns the previous file stats (I suppose it thinks that nothing could have changed, so re-reading is unnecessary). I have tried to launch this manually: the self-update works completely fine, and the file really is removed and recreated with its stats changing. Is there any guaranteed way to force os.stat to re-read the file stats for the same path, or an alternative option to make it work (other than recreating the A object)?
from pathlib import Path
import unittest
import os

class A:
    some_path = Path()

    def __init__(self, _some_path):
        self.some_path = Path(_some_path)

    def get_path(self):
        return self.some_path

class TestKit(unittest.TestCase):
    def setUp(self):
        pass

    def check_body(self, a):
        some_path = a.get_path()
        modification_time = os.stat(some_path).st_mtime
        # Launching self-updating executable
        self.assertTrue(modification_time < os.stat(some_path).st_mtime)

    def check(self):
        a = A(input('Enter the file path\n'))
        self.check_body(a)

def Tests():
    suite = unittest.TestSuite()
    suite.addTest(TestKit('check'))
    return suite

def main():
    tests_suite = Tests()
    unittest.TextTestRunner().run(tests_suite)

if __name__ == "__main__":
    main()
I have found the origin of the problem: I launched the self-update via os.system, which waits until the process is done. But first, during the self-update we launch several detached processes and should actually wait until all of them have ended; and second, even the signal that the process has ended doesn't mean the OS has completely released the file, so at the assertTrue we are not yet done with all our routines. For my task I simply used sleep, but a proper solution should analyze the existing processes in the system and wait for them to finish, or at least make several attempts with a delay between them.
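A small sketch of the "several attempts with a delay" idea mentioned above, rather than a fixed sleep (the helper name and the timeout values are arbitrary assumptions):

import os
import time

def wait_for_mtime_change(path, old_mtime, attempts=30, delay=1.0):
    # Re-stat the file until its modification time moves past old_mtime,
    # giving the detached updater processes time to finish and release the file.
    for _ in range(attempts):
        try:
            if os.stat(path).st_mtime > old_mtime:
                return True
        except FileNotFoundError:
            pass  # the file may be mid-replacement
        time.sleep(delay)
    return False

The assertion in check_body could then become self.assertTrue(wait_for_mtime_change(some_path, modification_time)).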

Python start multiprocessing without print/logging statements from processes

I am starting two processes via multiprocessing and this is working fine. The only problem I have is the print and debug statements from these two processes.
The hope is to use the REPL and start the processes in the background. However, I cannot get this to work: I always get the debug statements and therefore can't use the REPL anymore. This is how I start the processes:
processes = [
    Process(target=start_viewer, args=()),
    Process(target=start_server, args=(live, amount, fg))
]
for p in processes:
    p.start()
Any idea how to "mute" the processes, or run them in the background?
If I understand you correctly, you want to hide the printing from one of the processes.
You can achieve this by redirecting the output of the Python interpreter.
Add sys.stdout = open("/dev/null", 'w') to the process you want to "mute".
A full working example is below.
from multiprocessing import Process
from time import sleep
import sys

def start_viewer():
    sys.stdout = open("/dev/null", 'w')
    while True:
        print("start_viewer")
        sleep(1)

def start_server():
    while True:
        print("start_server")
        sleep(1)

if __name__ == '__main__':
    processes = [
        Process(target=start_viewer, args=()),
        Process(target=start_server, args=())
    ]
    for p in processes:
        p.start()
Be aware that /dev/null sends the prints nowhere; if you want to keep them, you can use a text file instead. Also, to support multiple operating systems you should use os.devnull.
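For instance, a portable variant of the redirect could look like this (a small sketch, not part of the answer's code):

import os
import sys

# Send prints to the platform's null device instead of hard-coding /dev/null;
# use a regular text file instead if you want to keep the output.
sys.stdout = open(os.devnull, "w")
# sys.stdout = open("viewer.log", "w")  # alternative: keep the output in a file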

Python: Kill the parent process after creating child processes

I have a polling job written in Python that is executed every 15 minutes to check if the status of an entry in the job table is True. If the status is true, then I need to take the values from the table and pass them as arguments to another script that executes something.
I am creating the child processes using Process from Python's multiprocessing module, but I am unable to exit the polling job (parent script) after starting these processes. The polling job keeps waiting until the children complete, even if there is a sys.exit() written after creating the children.
# pollingjob.py
import sys
import multiprocessing
from multiprocessing import Process
from secondscript import secondscriptfunction

def createParallelBatches(a, b, c):
    for i in [1, 2, 3]:
        p1 = Process(target=secondscriptfunction, args=(a, b, c))
        p1.start()
    sys.exit()

if __name__ == '__main__':
    # Check the table for *status=True* rows.
    # If such rows exist, call createParallelBatches with the column values.
    pass
What I am failing to understand is why sys.exit() won't let me exit the program, leaving the spawned processes as orphans. I tried the subprocess module but it behaves the same way. I don't want the parent process to wait on its children to complete. Any help would be appreciated. Thanks.
You need to launch an independent subprocess externally. This is one way to do it:
Put secondscriptfunction into its own executable Python file.
File runscript.py:
import sys
from secondscript import secondscriptfunction

if __name__ == '__main__':
    secondscriptfunction(sys.argv[1:])  # passing arguments to the func
Use subprocess.Popen in your script.
File pollingjob.py:
import subprocess
import shlex
import sys

def createParallelBatches(a, b, c):
    for i in [1, 2, 3]:
        command = "python runscript.py %s %s %s" % (a, b, c)
        cmd_args = shlex.split(command)
        subprocess.Popen(cmd_args)
    sys.exit()
Just remove the process object from the _children set of the current process object, and the parent process will exit immediately.
The multiprocessing module keeps child processes in a private set and joins them when the current process exits. You can remove children from the set if you don't care about them.
import multiprocessing

process = multiprocessing.Process(target=proc_main)  # proc_main is the function the child should run
process.start()  # the started child is tracked in the parent's private _children set
multiprocessing.current_process()._children.discard(process)
exit(0)

Python multithreading from multiple files

This is a sample demo of how I want to use threading.
import threading
import time

def run():
    print("started")
    time.sleep(5)
    print("ended")

thread = threading.Thread(target=run)
thread.start()

for i in range(4):
    print("middle")
    time.sleep(1)
How can I make this threading demo work even across multiple files?
Example:
# Main.py
import background
""" Here I will have a main program and \
I want the command from the background file to constantly run. \
Not just once at the start of the program """
The second file:
# background.py
while True:
    print("This text should always be printing "
          "even when my code in the main function is running")
Put all the lines before your for loop into background.py. When it is imported, it will start the thread. Change the run function to do your infinite while loop.
You may also want to set daemon=True when creating the thread so that it exits when the main program exits.
main.py
import time
import background

for i in range(4):
    print("middle")
    time.sleep(1)
background.py
import threading
import time

def run():
    while True:
        print("background")
        time.sleep(.5)

thread = threading.Thread(target=run, daemon=True)
thread.start()
Output
background
middle
background
middle
background
background
background
middle
background
background
middle
background
background
