Pexpect - Read from constantly outputting shell - python-3.x

I'm attempting to have pexpect run a command that continually outputs some information every few milliseconds until cancelled with Ctrl + C.
I've attempted getting pexpect to log to a file, but the output is simply ignored and never logged.
child = pexpect.spawn(command)
child.logfile = open('mylogfile.txt', 'w')
This results in the command being logged with an empty output.
I have also attempted letting the process run for a few seconds and then sending an interrupt to see if that logs the data, but this again results in an almost empty log.
child = pexpect.spawn(command)
child.logfile = open('mylogfile.txt', 'w')
time.sleep(5)
child.send('\003')
child.expect('$')
This is the data in question:
[image: the data constantly printing to the terminal]
I've attempted the solution described here: Parsing pexpect output, though it hasn't worked for me and results in a timeout.

Managed to get it working by using Python's subprocess module for this; I'm not sure of a way to do it with pexpect, but this got what I described.
def echo(self, n_lines):
    output = []
    if self.running is False:
        # Start the shell (cmd is defined elsewhere in the class)
        self.current_shell = Popen(cmd, stdout=PIPE, shell=True)
        self.running = True
    i = 0
    # Read lines from stdout; break after reading the desired number of lines.
    # stdout is a byte stream here, so the sentinel must be b''.
    for line in iter(self.current_shell.stdout.readline, b''):
        output.append(line.decode('utf-8').strip())
        if i == n_lines:
            break
        i += 1
    return output
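For reference, a minimal sketch of how the same continuous read might be attempted with pexpect itself (an untested assumption, not part of the original answer; it reuses the command and n_lines names from above and passes encoding='utf-8' so readline() returns text):

import pexpect

# Hypothetical sketch: spawn the command in text mode, read it line by line
# until enough lines have been collected, then send the equivalent of Ctrl+C.
child = pexpect.spawn(command, encoding='utf-8', timeout=10)
child.logfile_read = open('mylogfile.txt', 'w')  # log only what the child prints

lines = []
for _ in range(n_lines):
    lines.append(child.readline().strip())

child.sendintr()  # send SIGINT (Ctrl+C) to the child
child.close()

Using logfile_read rather than logfile avoids echoing anything the parent sends back into the log.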

Related

subprocess hangs when executing a batch file and checking its live output to kill the process

I have this code, where I have to check a string in the batch file's output and kill the process if it matches. Unfortunately it gets stuck after printing None as the output of poll() and returns nothing; I'm expecting it to return something.
def install_prereq():
    # subprocess.call([r'date.bat'])
    cmd = r'date.bat'
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, universal_newlines=True, shell=False)
    print(proc.poll())
    stdout, stderr = proc.communicate()  # .stdout.readline()
    lines = stdout.split()
    print(lines)
    proc.kill()
Here is my bat file -
@echo off
timeout 5
color a
echo "this is timings"
pause
Can someone please guide me in the right direction as to where I'm going off track, and how I can kill the process in the end?
Note: I have already gone through plenty of problems on SO but none of them answers my question.
TIA
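A minimal sketch of one way to approach this (an assumption, not an answer from the original thread): read the output line by line instead of calling communicate(), which blocks until the batch file exits (and here it never exits on its own, because pause waits for a key press), then kill the process as soon as the expected string appears.

import subprocess

# Hypothetical sketch: stream the batch file's output and kill the process
# once a matching line is seen. communicate() is avoided because it waits
# for the process to exit, and `pause` keeps it alive indefinitely.
def install_prereq():
    proc = subprocess.Popen(r'date.bat',
                            stdout=subprocess.PIPE,
                            universal_newlines=True,
                            shell=False)
    for line in proc.stdout:
        print(line, end='')
        if 'this is timings' in line:  # string taken from the echo in the .bat file
            proc.kill()
            break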

Reset timeout duration on command output

I have a problem where, when running a command, it very irregularly fails to run.
Currently I am using timeout, but the problem is that when the command does work it takes a long time to finish (several minutes).
Ideally I want to set the timeout to infinity if the command shows some signs of life and keep it 15sec otherwise.
Any suggestions?
In the end I solved it by wrapping it in a Python script.
The usage is
python script.py [your command]
The script will decide whether the command needs a re-run.
import sys
import subprocess
import signal
import time

def handler(signum, frame):
    global atm
    if atm < 3:  # attempt to re-run 2 times
        print('re-running')
        atm = atm + 1
        p.kill()
        signal.alarm(0)
        time.sleep(15)  # not sure that is needed, but just in case
        run()
    else:
        raise OSError("Number of re-run attempts exceeded")

def run():  # run the command which is passed as a parameter to this script
    global p
    p = subprocess.Popen(args, stdout=subprocess.PIPE)
    signal.signal(signal.SIGALRM, handler)  # will not work on Windows
    signal.alarm(5)  # give 5 seconds to produce some output, rerun otherwise
    for line in iter(p.stdout.readline, b''):
        signal.alarm(0)  # if output is produced, remove the alarm
        topr = line.decode('UTF-8')
        print(topr, end='')

args = sys.argv[1:]  # command passed here
atm = 0
run()

tqdm progress bar not updating when using paramiko

When using Paramiko to execute commands remotely, I can't see an updating progress bar using tqdm. I'm guessing this is because it isn't printing a new line when tqdm updates the bar.
Here's a simple code example I've been using, but you'll need to supply your own SSH credentials:
import paramiko

ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.connect('8.tcp.ngrok.io', username=username)
command = 'python3 -c "import tqdm; import time; [time.sleep(1) for i in tqdm.tqdm(range(5))]"'
stdin, stdout, stderr = ssh.exec_command('sudo -S ' + command)
stdin.write(password + '\n')
stdin.flush()
### new method
for l, approach in line_buffered(stdout):
    if approach == 'print':
        print(l)
    if approach == 'overlay':
        print(l, end='\r')
ssh.close()
Is there a way I can print the tqdm bar as it updates?
Based on Martin Prikryl's suggestion, I tried to incorporate the solution from:
Paramiko with continuous stdout
And adapted the code to print regardless of whether a newline has arrived:
def line_buffered(f):
    line_buf = ""
    while not f.channel.exit_status_ready():
        # f.read(1).decode("utf-8")
        line_buf += f.read(1).decode("utf-8", 'ignore')
        if line_buf.endswith('\n'):
            yield line_buf, 'print'
            line_buf = ''
        # elif len(line_buf) > 40:
        elif line_buf.endswith('\r'):
            yield line_buf, 'overlay'
This does successfully print the output as it is generated, and reprints on the tqdm line, but when I run this code I get the following output:
100%|| 5/5 [00:05<00:00, 1.00s/it]1.00s/it]1.00s/it]1.00s/it]1.00s/it]?, ?it/s]
Not very pretty, and it's getting swamped by the iteration time. It doesn't seem to be printing the actual progress bar. Any ideas?
It's probably because you are (correctly) using a non-interactive session to automate your command execution.
Most decently designed commands do not print output intended for interactive human use, like progress bars, when executed non-interactively.
If you really want to see the progress display, try setting the get_pty argument of SSHClient.exec_command to True.
stdin, stdout, stderr = ssh.exec_command('sudo -S ' + command, get_pty=True)
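To actually watch the bar redraw locally, the carriage returns tqdm emits need to be forwarded as they arrive. A minimal sketch of reading the PTY output in raw chunks rather than line by line (an assumption building on the answer above, reusing the ssh, command and password variables from the question):

import time

# Sketch: with get_pty=True the remote command behaves interactively, and
# streaming raw chunks preserves tqdm's carriage returns so the bar redraws
# in place in the local terminal.
stdin, stdout, stderr = ssh.exec_command('sudo -S ' + command, get_pty=True)
stdin.write(password + '\n')
stdin.flush()

while not stdout.channel.exit_status_ready():
    if stdout.channel.recv_ready():
        chunk = stdout.channel.recv(1024).decode('utf-8', 'ignore')
        print(chunk, end='', flush=True)
    else:
        time.sleep(0.05)  # avoid busy-waiting between chunks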

Continuous communication between parent and child subprocess in Python (Windows)?

I have this script:
import subprocess
p = subprocess.Popen(["myProgram.exe"],
                     stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE)

while True:
    out, _ = p.communicate(input().encode())
    print(out.decode())
which works fine until the second input where I get:
ValueError: Cannot send input after starting communication
Is there a way to have multiple messages sent between the parent and child process in Windows?
[EDIT]
I don't have access to the source code of myProgram.exe
It is an interactive command line application returning results from queries
Running myProgram.exe < in.txt > out.txt works fine, with in.txt containing:
query1;
query2;
query3;
Interacting with another running process via stdin/stdout
To simulate the use case where a Python script starts a command line interactive process and sends/receives text over stdin/stdout, we have a primary script that starts another Python process running a simple interactive loop.
This can also be applied to cases where a Python script needs to start another process and just read its output as it comes in without any interactivity beyond that.
primary script
import subprocess
import threading
import queue
import time

if __name__ == '__main__':

    def enqueue_output(outp, q):
        for line in iter(outp.readline, ''):
            q.put(line)
        outp.close()

    q = queue.Queue()

    p = subprocess.Popen(["/usr/bin/python", "/test/interact.py"],
                         stdin=subprocess.PIPE,
                         stdout=subprocess.PIPE,
                         # stderr=subprocess.STDOUT,
                         bufsize=1,
                         encoding='utf-8')

    th = threading.Thread(target=enqueue_output, args=(p.stdout, q))
    th.daemon = True
    th.start()

    for i in range(4):
        print("dir()", file=p.stdin)
        print(f"Iteration ({i}) Parent received: {q.get()}", end='')
        # p.stdin.write("dir()\n")
        # while q.empty():
        #     time.sleep(0)
        # print(f"Parent: {q.get_nowait()}")
interact.py script
if __name__ == '__main__':
    for i in range(2):
        cmd = raw_input()
        print("Iteration (%i) cmd=%s" % (i, cmd))
        result = eval(cmd)
        print("Iteration (%i) result=%s" % (i, str(result)))
output
Iteration (0) Parent received: Iteration (0) cmd=dir()
Iteration (1) Parent received: Iteration (0) result=['__builtins__', '__doc__', '__file__', '__name__', '__package__', 'cmd', 'i']
Iteration (2) Parent received: Iteration (1) cmd=dir()
Iteration (3) Parent received: Iteration (1) result=['__builtins__', '__doc__', '__file__', '__name__', '__package__', 'cmd', 'i', 'result']
This Q&A was leveraged to simulate non-blocking reads from the target process: https://stackoverflow.com/a/4896288/7915759
This method provides a way to check for output without blocking in the main thread; q.empty() will tell you if there's no data. You can play around with blocking calls too, using q.get(), or with a timeout using q.get(timeout=2) - the timeout is a number of seconds and can be a fractional value.
Text based interaction between processes can be done without the thread and queue, but this implementation gives more options on how to retrieve the data coming back.
The Popen() parameters bufsize=1 and encoding='utf-8' make it possible to use <stdout>.readline() from the primary script and set the encoding to an ASCII-compatible codec understood by both processes (1 is not the size of the buffer; it's a symbolic value indicating line buffering).
With this configuration, both processes can simply use print() to send text to each other. This configuration should be compatible for a lot of interactive text based command line tools.
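For example, a minimal polling sketch (my own illustration, not part of the original answer) of checking the queue without blocking, using the q defined in the primary script:

import queue

try:
    line = q.get_nowait()  # or q.get(timeout=2) to wait up to 2 seconds
except queue.Empty:
    pass  # no output yet; do other work and poll again later
else:
    print(f"Parent received: {line}", end='')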

Python - Using timeout while printing line by line in a subprocess with Popen

(in Python 3.5)
I am having difficulty printing stdout line by line (while the program is running) while also maintaining the timeout (to stop the program after some time).
I have:
import subprocess as sub
import io
file_name = 'program.exe'
dir_path = r'C:/directory/'
p = sub.Popen(file_name, cwd = dir_path, shell=True, stdout = sub.PIPE, stderr = sub.STDOUT)
And while running "p", do these 2 things:
for line in io.TextIOWrapper(p.stdout, encoding="utf-8"):
    print(line)
And do:
try:
    outs = p.communicate(timeout=15)  # Just to use timeout
except Exception as e:
    print(str(e))
    p.kill()
The program should print every output line but should not run the simulation for more than 15 seconds.
If I use the p.communicate before the p.stdout loop, it will wait for the timeout or for the program to finish. If I use them the other way around, the program will not honour the timeout.
I would like to do it without threading, and if possible without io too; it seems to be possible, but I don't know how (need more practice and study). :-(
PS: The program I am running was written in Fortran and is used to simulate water flow. If I run the exe from Windows, it opens a cmd window and prints a line on each timestep. And I am doing a sensitivity analysis, changing the inputs to the exe file.
That's because your process/child processes are not getting killed correctly.
Just modify your try/except as below:
try:
    pid_id = p.pid
    outs = p.communicate(timeout=15)  # Just to use timeout
except Exception as e:
    print(str(e))
    import subprocess
    # This will forcefully kill all the processes and child processes associated with p
    subprocess.Popen('taskkill /F /T /PID %i' % pid_id)
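Putting the two pieces together, here is a minimal sketch (my own assumption, not part of the original answer) that prints each line as it arrives and force-kills the process tree once roughly 15 seconds have elapsed:

import subprocess as sub
import io
import time

# Hypothetical sketch: stream the program's output line by line and enforce a
# ~15 second limit by checking a deadline after each line. The check only runs
# when a line arrives, so a completely silent program would not be killed until
# it prints something or exits.
p = sub.Popen('program.exe', cwd=r'C:/directory/', shell=True,
              stdout=sub.PIPE, stderr=sub.STDOUT)
deadline = time.monotonic() + 15

for line in io.TextIOWrapper(p.stdout, encoding="utf-8"):
    print(line, end='')
    if time.monotonic() > deadline:
        # Kill the whole process tree (with shell=True, p.pid is the shell's PID)
        sub.Popen('taskkill /F /T /PID %i' % p.pid)
        break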
