Running a function or BASH command after exiting npyscreen - python-3.x

I'm trying to run a function after exiting npyscreen. I've tried a few things and am still stuck: the app just exits npyscreen and returns to a bash prompt. The function is supposed to start a watchdog/rsync watch folder that waits for files to back up.
#!/usr/bin/env python
# encoding: utf-8
import npyscreen as np
from nextscript import want_to_run_this_function

class WuTangClan(np.NPSAppManaged):
    def onStart(self):
        self.addForm('MAIN', FormMc, name="36 Chambers")

class FormMc(np.ActionFormExpandedV2):
    def create(self):
        self.rza_gfk = self.add(np.TitleSelectOne, max_height=4, name="Better MC:", value=[0], values=["RZA", "GhostFace Killah"], scroll_exit=True)

    def after_editing(self):
        if self.rza_gfk.value == [0]:
            want_to_run_this_function()
            self.parentApp.setNextForm(None)
        else:
            self.parentApp.setNextForm(None)

if __name__ == "__main__":
    App = WuTangClan()
    App.run()

I'm not sure if I understood correctly what you want.
For executing any kind of bash command I like to use the subprocess module; its Popen constructor lets you run anything you could run from a shell.
For example, on Windows:
import subprocess
process = subprocess.Popen(['ipconfig','/all'])
On a Unix-like system:
import subprocess
process = subprocess.Popen(['ip','a'])
If you have a ".py" file you can pass the parameters like if you where running it from the terminal
e.g
import subprocess
process = subprocess.Popen(['python3','sleeper.py'])
You can even retrieve the process's pid and kill it whenever you want; see the subprocess module documentation here.
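For example (a minimal sketch, using sleep as a stand-in for a long-running command):

# Keeping the Popen handle lets you inspect the pid and stop the child later.
import subprocess

process = subprocess.Popen(['sleep', '60'])
print(process.pid)    # OS process id of the child
process.terminate()   # sends SIGTERM (TerminateProcess on Windows)
process.wait()        # reap the child and collect its exit code

As for running the function specifically after npyscreen exits: one hedged option (assuming the WuTangClan app and want_to_run_this_function from the question) is to call it after App.run() returns, once the TUI has fully shut down:

# A sketch: widget state is still readable after the event loop exits,
# so the selection can be checked once run() returns.
if __name__ == "__main__":
    App = WuTangClan()
    App.run()
    if App.getForm('MAIN').rza_gfk.value == [0]:
        want_to_run_this_function()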

Related

v2ray (vmess|vless|trojan|ss) test in python

Friends, how can I test (vmess|vless|trojan|ss) configurations with Python?
I need a function to test the speed of given v2ray configs.
There is a project, vmessping by v2fly, that supports only vmess, but there is also LiteSpeedTest for trojan/ss.
Sample:
from subprocess import Popen, PIPE

def speedtest(vmesslink):
    process = Popen(["./vmessspeed", vmesslink], stdout=PIPE)
    stdout = process.communicate()[0]
    return stdout
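Usage would then look something like this (the vmess link below is only a placeholder, and ./vmessspeed must exist in the working directory):

# Hypothetical usage of the speedtest() helper above.
output = speedtest("vmess://example-config-goes-here")
print(output.decode('utf-8', errors='replace'))  # communicate() returns bytes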

Multiprocessing processes don't start

When I try to make a process start, no matter what, it just fails to do so. This simple code doesn't work:
import multiprocessing

def function():
    print("function started")

function_process = multiprocessing.Process(target=function)
function_process.start()
function_process.join()
The output of this code is simply nothing. If I print function_process after this, it returns <Process name='Process-1' pid=13432 parent=7564 stopped exitcode=0>. Adding an if __name__ == "__main__": guard changes nothing. Is there something I'm missing here?
Works for me:
$ python mp.py
function started
Maybe the code you show is part of a larger program, and this code is never called?
This is the code:
import multiprocessing

def function():
    print("function started")

function_process = multiprocessing.Process(target=function)
function_process.start()
function_process.join()
Some IDEs redirect STDIN, STDERR and STDOUT, and data gets lost. The problem does not happen with Visual Studio Code or PyCharm.
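A hedged variant of the same code: under the "spawn" start method (the default on Windows and recent macOS), the if __name__ == "__main__": guard is required, and flush=True makes the child's print visible even when stdout is redirected.

# A sketch, assuming the same code as in the question: the guard keeps the
# child's re-import from re-launching the Process, and flush=True pushes
# the child's output through a redirected stdout.
import multiprocessing

def function():
    print("function started", flush=True)

if __name__ == "__main__":
    function_process = multiprocessing.Process(target=function)
    function_process.start()
    function_process.join()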

How can I restrict the number of processes running in parallel?

I have this script, but I only want to start 3 Perl processes at a time. Once those 3 are done, the script should start the next three.
At the moment all processes are started in parallel.
Unfortunately I don't know what to do; can someone help me?
My script:
import json, os
import subprocess
from subprocess import Popen, PIPE

list = open('list.txt', 'r')
procs = []
for dirs in list:
    args = ['perl', 'test.pl', '-a', dirs]
    proc = subprocess.Popen(args)
    procs.append(proc)

for proc in procs:
    proc.wait()
list.txt:
dir1
dir2
dir3
dir4
dir5
dir6
dir7
dir8
dir9
dir10
dir11
test.pl
$com=$ARGV[0];
$dirs=$ARGV[1];
print "$com $dirs";
sleep(5);
Use Python's concurrent.futures module: it provides a 'process pool' (and thread pool) abstraction that automatically keeps only the given number of workers busy and starts new tasks as the older ones complete.
As the target, write a simple Python function that starts your external process and waits synchronously for the result, i.e. a function containing the lines currently inside your for loop.
Using concurrent.futures, your code might look like this:
import json, os
import subprocess
from concurrent.futures import ThreadPoolExecutor, as_completed
from subprocess import Popen, PIPE

mylist = open('list.txt', 'r')

def worker(dirs):
    args = ['perl', 'test.pl', '-a']
    proc = subprocess.run(args + [dirs])

executor = ThreadPoolExecutor(3)  # 3 is max_workers.
# ProcessPoolExecutor could be an option, but you don't need
# it - the `perl` process will run in another process anyway.

procs = []
for dirs in mylist:
    proc = executor.submit(worker, dirs)
    procs.append(proc)

for proc in as_completed(procs):
    try:
        result = proc.result()
    except Exception as exc:
        # handle any error that may have been raised in the worker
        pass
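If you would rather avoid concurrent.futures entirely, a hedged alternative matches the question's wording literally: start three processes, wait for all three, then start the next three.

# An alternative sketch without concurrent.futures: launch the processes
# in batches of three and wait for each whole batch before starting the
# next one. Assumes the same list.txt and test.pl as above.
import subprocess

with open('list.txt') as f:
    all_dirs = [line.strip() for line in f]

batch_size = 3
for i in range(0, len(all_dirs), batch_size):
    procs = [subprocess.Popen(['perl', 'test.pl', '-a', d])
             for d in all_dirs[i:i + batch_size]]
    for proc in procs:
        proc.wait()  # block until this batch of three has finished

Note that the pool version keeps all three slots busy continuously, while this batch version waits for the slowest member of each batch before moving on.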

Calling functions from a script using argparse without using subprocess

I have been given an existing script (let's call it existing.py) that in its MVCE form has the following structure.
import argparse

FLAGS = None

def func():
    print(FLAGS.abc)

if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument(
        '--abc',
        type=str,
        default='',
        help='abc.'
    )
    FLAGS, unparsed = parser.parse_known_args()
    func()
As this is part of a tool that gets constantly updated, I cannot change existing.py. Normally, existing.py is invoked with command-line arguments:
python existing.py --abc "Ok"
which prints the output 'Ok'.
I wish to call the functions (not the whole script) in existing.py from another script. How can I feed in the FLAGS object that is used by the functions of the script? I do not wish to use subprocess, as it would just run the script in its entirety.
I know that argparse creates FLAGS as a Namespace object and that I can construct one in calling.py (see code below), but I cannot then push it back into the function that is imported from existing.py into calling.py. The following is the calling.py that I've tried.
from existing import func
import argparse
args = argparse.Namespace()
args.abc = 'Ok'
FLAGS = args
func()
which throws an error
AttributeError: 'NoneType' object has no attribute 'abc'
This is different from other StackOverflow questions as this question explicitly forbids subprocess and the existing script cannot be changed.
Import the existing module itself and set:
existing.FLAGS = args
Now functions defined in existing's namespace will see the desired FLAGS object.
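Put together, a minimal sketch of calling.py (using the same names as in the question):

# calling.py - rebind the module-level FLAGS that func() reads,
# instead of creating a local copy in this script.
import argparse
import existing

args = argparse.Namespace(abc='Ok')
existing.FLAGS = args
existing.func()  # prints: Ok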

Print in real time the result of a bash command launched with subprocess in Python

I'm using the subprocess module to run a bash command. I want to display the result in real time, including when there is no new line added but the output is still modified.
I'm using Python 3. My code runs with subprocess, but I'm open to any other module. I have some code that returns a generator yielding every new line added:
import subprocess
import shlex

def run(command):
    process = subprocess.Popen(shlex.split(command), stdout=subprocess.PIPE)
    while True:
        line = process.stdout.readline().rstrip()
        if not line:
            break
        yield line.decode('utf-8')

cmd = 'ls -al'
for l in run(cmd):
    print(l)
The problem comes with commands of the form rsync -P file.txt file2.txt for example, which shows a progress bar.
For example, we can start by creating a big file in bash:
base64 /dev/urandom | head -c 1000000000 > file.txt
Then try to use python to display the rsync command:
cmd = 'rsync -P file.txt file2.txt'
for l in run(cmd):
print(l)
With this code, the progress bar is only printed at the end of the process, but I want to print the progress in real time.
From this answer, you can disable buffering for print in Python:
You can skip buffering for a whole python process using "python -u"
(or #!/usr/bin/env python -u etc) or by setting the environment
variable PYTHONUNBUFFERED.
You could also replace sys.stdout with some other stream-like wrapper
which does a flush after every call.
Something like this (not really tested) might work...but there are
probably problems that could pop up. For instance, I don't think it
will work in IDLE, since sys.stdout is already replaced with some
funny object there which doesn't like to be flushed. (This could be
considered a bug in IDLE though.)
>>> class Unbuffered:
.. def __init__(self, stream):
.. self.stream = stream
.. def write(self, data):
.. self.stream.write(data)
.. self.stream.flush()
.. def __getattr__(self, attr):
.. return getattr(self.stream, attr)
..
>>> import sys
>>> sys.stdout=Unbuffered(sys.stdout)
>>> print 'Hello'
Hello
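A related hedged sketch for the rsync case (assuming the same setup as above): readline() only returns once a newline arrives, while rsync redraws its progress bar with carriage returns, so reading the child's stdout in single bytes and forwarding them immediately shows the progress in real time.

# A sketch, not from the quoted answer: forward the child's output byte
# by byte so carriage-return progress updates appear as they happen.
import subprocess
import shlex
import sys

def run_realtime(command):
    process = subprocess.Popen(shlex.split(command), stdout=subprocess.PIPE)
    while True:
        chunk = process.stdout.read(1)
        if not chunk:  # EOF: the child closed its stdout
            break
        sys.stdout.write(chunk.decode('utf-8', errors='replace'))
        sys.stdout.flush()  # flush so partial lines ending in \r show up
    process.wait()

run_realtime('rsync -P file.txt file2.txt')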
