Python subprocess.Popen result stored in a variable - python-3.x

I've seen various other posts about this, but unfortunately I still haven't been able to figure this out:
If I do something like this:
temp = subprocess.Popen("whoami", shell=True, stdout=subprocess.PIPE)
out = temp.communicate()
print(out)
then I get something of the form
(b'username\n', None)
With other attempts (such as adding a .wait()), I've been getting the username on one line and a 0 as the return code on the next; however, only the 0 was being stored in my variable.
Is there an easy way I can format that to store only the username in a variable? I tried something like out[3:11] but that didn't work.
Thanks

The easiest way is to use subprocess.check_output():
username = subprocess.check_output("whoami").strip()
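Note that check_output() returns bytes, so .strip() alone still leaves you with something like b'username'. A minimal sketch of decoding it to a plain str first:
# decode() converts the bytes result to str before stripping the trailing newline
username = subprocess.check_output(['whoami']).decode().strip()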

username = subprocess.check_output(['whoami']).strip()
Or better:
import getpass
username = getpass.getuser()

Adding the universal_newlines=True argument tells subprocess calls to return strings. I've been using this instead of explicitly decoding bytestreams.
temp = subprocess.Popen("whoami",
shell=True,
stdout=subprocess.PIPE,
universal_newlines=True)
out = temp.communicate()
print(out)
# prints: ('username\n', None)
Subprocess docs:
If universal_newlines is True, the file objects stdin, stdout and stderr will be opened as text streams in universal newlines mode using the encoding returned by locale.getpreferredencoding(False).
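Since communicate() returns an (stdout, stderr) tuple, the username alone is just the first element with the newline stripped; for example, building on the snippet above:
out, err = temp.communicate()
username = out.strip()  # 'username' as a plain str, no b'' prefix
(On Python 3.7 and later, text=True is a more readable alias for universal_newlines=True.)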

After communicate, you can read the return code from temp.returncode.
From http://docs.python.org/dev/library/subprocess.html#subprocess.Popen.returncode:
Popen.returncode
The child return code, set by poll() and wait() (and indirectly by communicate()). A None value indicates that the process hasn’t terminated yet.
If all you care about is whether the call succeeded, use subprocess.check_output; a non-zero return code will raise CalledProcessError.
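For example, a minimal sketch of that pattern (text=True assumes Python 3.7+):
import subprocess
try:
    # check_output raises CalledProcessError on a non-zero return code
    username = subprocess.check_output(['whoami'], text=True).strip()
except subprocess.CalledProcessError as e:
    print(f"whoami failed with return code {e.returncode}")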

Related

Python Subprocess The filename, directory name, or volume label syntax is incorrect

I am trying to download an audio dataset. I have all the audio links stored in a CSV; I read the CSV and get all the links, and now I have to download the audio files one by one. Here's the code.
if not os.path.exists(audio_path_orig):
    line = f"wget {episode_url}"
    print('command:', line)
    process = subprocess.Popen([line], shell=True)
    process.wait()
For a sample, the line variable contains:
wget https://stutterrockstar.files.wordpress.com/2012/08/male-episode-14-with-grant.mp3
Note that the URL works (you can check for yourself), but when I try to download it using Python it gives me the error below.
error: The filename, directory name, or volume label syntax is incorrect
Look at the documentation for Popen:
args should be a sequence of program arguments or else a single string or path-like object. By default, the program to execute is the first item in args if args is a sequence.
And:
Unless otherwise stated, it is recommended to pass args as a sequence.
Also:
The recommended approach to invoking subprocesses is to use the run() function for all use cases it can handle.
So use:
if not os.path.exists(audio_path_orig):
    args = ["wget", f"{episode_url}"]
    print('command:', " ".join(args))
    result = subprocess.run(args, capture_output=True, text=True)
    if result.returncode != 0:
        print(f"wget returned error {result.returncode}")
    print("Standard output:")
    print(result.stdout)
    print("Error output:")
    print(result.stderr)
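As an aside (not part of the original answer): if you are starting from a single command string anyway, shlex.split() is the standard-library way to turn it into the argument sequence that run() expects:
import shlex
# splits using shell-like quoting rules, so quoted arguments stay intact
args = shlex.split(f"wget {episode_url}")  # episode_url as in the question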

Subprocess stdout: remove unnecessary char

I'm using the subprocess module and it works fine; the only thing is that stdout returns a value prefixed with "b'", or in some cases longer text like "user config - ignore ...". Is it possible to remove this first part of the stdout without using str.substring() or similar methods?
output = subprocess.run(['ls', '-l'], stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
In the above example the stdout.decode() function can be used, and the result will be saved as a str:
decoded_output = output.stdout.decode()
And if a command supports JSON output (for example pvesh in Proxmox), you can take the string and load it as JSON.
json_output = json.loads(decoded_output)
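Alternatively, a minimal sketch that skips the manual decode step entirely by passing text=True (Python 3.7+), reusing the ls -l example from above:
import json
import subprocess

# text=True makes run() capture stdout as str instead of bytes
result = subprocess.run(['ls', '-l'], stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=True)
print(result.stdout)
# if the command emits JSON (e.g. pvesh in Proxmox), parse it directly:
# data = json.loads(result.stdout)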

Python: how to write to stdin of a subprocess and read its output in real time

I have 2 programs.
The first (which could actually be written in any language, and therefore cannot be altered at all) looks like this:
#!/bin/env python3
import random

while True:
    s = input()  # get input from stdin
    i = random.randint(0, len(s))  # process the input
    print(f"New output {i}", flush=True)  # prints processed input to stdout
It runs forever: it reads something from stdin, processes it, and writes the result to stdout.
I am trying to write a second program in Python using the asyncio library.
It executes the first program as a subprocess and attempts to feed it input via its stdin and retrieve the result from its stdout.
Here is my code so far:
#!/bin/env python3
import asyncio
import asyncio.subprocess as asp

async def get_output(process, input):
    out, err = await process.communicate(input)
    print(err)  # shows that the program crashes
    return out
    # other attempt to implement
    process.stdin.write(input)
    await process.stdin.drain()  # flush input buffer
    out = await process.stdout.read()  # program is stuck here
    return out

async def create_process(cmd):
    process = await asp.create_subprocess_exec(
        cmd, stdin=asp.PIPE, stdout=asp.PIPE, stderr=asp.PIPE)
    return process

async def run():
    process = await create_process("./test.py")
    out = await get_output(process, b"input #1")
    print(out)  # b'New output 4'
    out = await get_output(process, b"input #2")
    print(out)  # b''
    out = await get_output(process, b"input #3")
    print(out)  # b''
    out = await get_output(process, b"input #4")
    print(out)  # b''

async def main():
    await asyncio.gather(run())

asyncio.run(main())
I struggle to implement the get_output function. It takes a bytestring (as required by the input parameter of the .communicate() method) as a parameter, writes it to the stdin of the program, reads the response from its stdout and returns it.
Right now, only the first call to get_output works properly. This is because the implementation of the .communicate() method calls the wait() method, effectively causing the program to terminate (which it isn't meant to do). This can be verified by examining the value of err in the get_output function, which shows that the first program reached EOF. Thus, the other calls to get_output return an empty bytestring.
I have tried another way, even less successfully, since the program gets stuck at the line out = await process.stdout.read(), and I haven't figured out why.
My question is: how do I implement the get_output function to capture the program's output in (near) real time and keep it running? It doesn't have to use asyncio, but I have found this library to be the best one so far for that.
Thank you in advance!
If the first program is guaranteed to print only one line of output in response to the line of input that it has read, you can change await process.stdout.read() to await process.stdout.readline() and your second approach should work.
The reason it didn't work for you is that your run function has a bug: it never sends a newline to the child process. Because of that, the child process is stuck in input() and never responds. If you add \n at the end of the bytes literals you're passing to get_output, the code works correctly.
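Putting both fixes together, a minimal sketch of get_output under that one-line-in, one-line-out assumption:
async def get_output(process, input):
    # terminate the input with a newline so the child's input() call returns
    process.stdin.write(input + b"\n")
    await process.stdin.drain()
    # read exactly one line of output instead of waiting for EOF
    return await process.stdout.readline()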

Pexpect - Read from constantly outputting shell

I'm attempting to have pexpect begin running a command which basically continually outputs some information every few milliseconds until cancelled with Ctrl + C.
I've attempted getting pexpect to log to a file, though these outputs are simply ignored and are never logged.
child = pexpect.spawn(command)
child.logfile = open('mylogfile.txt', 'w')
This results in the command being logged with an empty output.
I have also attempted letting the process run for a few seconds, then sending an interrupt to see if that logs the data, but this again, results in an almost empty log.
child = pexpect.spawn(command)
child.logfile = open('mylogfile.txt', 'w')
time.sleep(5)
child.send('\003')
child.expect('$')
This is the data in question:
[image showing the data constantly printing to the terminal]
I've attempted the solution described here: Parsing pexpect output, though it hasn't worked for me and results in a timeout.
I managed to get it working by using the Python subprocess module for this. I'm not sure of a way to do it with pexpect, but this does what I described.
def echo(self, n_lines):
    output = []
    if self.running is False:
        # start shell
        self.current_shell = Popen(cmd, stdout=PIPE, shell=True)
        self.running = True
    i = 0
    # Read lines from stdout; break after reading the desired number of lines.
    # Note: stdout yields bytes here, so the iter() sentinel must be b'', not ''.
    for line in iter(self.current_shell.stdout.readline, b''):
        output.append(line.decode('utf-8').strip())
        if i == n_lines:
            break
        i += 1
    return output

How can I redirect a file in a subprocess using "<"?

I have looked at other similar questions but was not able to find an answer to my question.
This is what I want to execute:
gagner -arg1 < file1
This is my code so far:
filePath = tkinter.filedialog.askopenfilename(filetypes=[("All files", "*.*")])
fileNameStringForm = basename(filePath)
fileNameByteForm = fileNameStringForm.encode(encoding='utf-8')
process = subprocess.Popen(['gagner', '-arg1'], shell=True, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout, stderr = process.communicate(fileNameByteForm)
stringOutput = stdout.decode('utf-8')
print(stringOutput)
Currently if I run this code, nothing happens. I get no errors, but no output is printed either.
Could someone please show me how I can execute the Linux command above using Python?
You shouldn't pass the name to the process (it expects to read file data from stdin, not a file name); instead, pass the file handle itself (or using PIPE, the raw file data).
So you could just do:
# open the full filePath, not just the basename, so the file is found regardless of the working directory
with open(filePath, 'rb') as f:
    process = subprocess.Popen(['gagner', '-arg1'], stdin=f, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout, stderr = process.communicate()
or somewhat less efficiently (since Python has to read it, then write it, instead of letting the process read directly):
process = subprocess.Popen(['gagner', '-arg1'], stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
with open(filePath, 'rb') as f:
    stdout, stderr = process.communicate(f.read())
Note that I removed shell=True from both invocations; using the list form of the command removes the need for it, and avoiding the shell wrapper is faster, safer and more stable.
