Equivalent of bash "|" in python3 [duplicate] - python-3.x

I want to use subprocess.check_output() with ps -A | grep 'process_name'.
I tried various solutions but so far nothing worked. Can someone guide me on how to do it?

To use a pipe with the subprocess module, you have to pass shell=True.
However, this isn't really advisable for various reasons, not least of which is security. Instead, create the ps and grep processes separately, and pipe the output from one into the other, like so:
import subprocess

# Run ps with its stdout connected to a pipe ...
ps = subprocess.Popen(('ps', '-A'), stdout=subprocess.PIPE)
# ... and let grep read that pipe as its stdin.
output = subprocess.check_output(('grep', 'process_name'), stdin=ps.stdout)
ps.wait()
In your particular case, however, the simple solution is to call subprocess.check_output(('ps', '-A')) and then str.find on the output.
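For instance, a minimal sketch of that simpler approach (process_name is the placeholder from the question):
import subprocess

# Capture the whole process table once, then search it in Python.
output = subprocess.check_output(('ps', '-A'), text=True)
if output.find('process_name') != -1:
    print('process_name is running')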

Or you can always use the communicate method on the subprocess objects.
cmd = "ps -A|grep 'process_name'"
ps = subprocess.Popen(cmd,shell=True,stdout=subprocess.PIPE,stderr=subprocess.STDOUT)
output = ps.communicate()[0]
print(output)
The communicate method returns a tuple of the standard output and the standard error.

Using the input argument of subprocess.run, you can pass the output of one command into a second one.
import subprocess
ps = subprocess.run(['ps', '-A'], check=True, capture_output=True)
processNames = subprocess.run(['grep', 'process_name'],
                              input=ps.stdout, capture_output=True)
print(processNames.stdout.decode('utf-8').strip())

See the documentation on setting up a pipeline using subprocess: https://docs.python.org/3/library/subprocess.html#replacing-shell-pipeline
I haven't tested the following code example but it should be roughly what you want:
from subprocess import Popen, PIPE

query = "process_name"
ps_process = Popen(["ps", "-A"], stdout=PIPE)
grep_process = Popen(["grep", query], stdin=ps_process.stdout, stdout=PIPE)
ps_process.stdout.close()  # Allow ps_process to receive a SIGPIPE if grep_process exits.
output = grep_process.communicate()[0]

Also, try using the pgrep command instead of ps -A | grep 'process_name'.
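For example, a small sketch using pgrep; note that pgrep exits with a non-zero status when nothing matches:
import subprocess

try:
    # pgrep prints one PID per line for each matching process.
    pids = subprocess.check_output(('pgrep', 'process_name'), text=True).split()
except subprocess.CalledProcessError:
    pids = []  # no matching processes
print(pids)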

You can try the pipe functionality in sh.py:
import sh
print(sh.grep(sh.ps("-ax"), "process_name"))

command = "ps -A | grep 'process_name'"
output = subprocess.check_output(["bash", "-c", command])

Related

Python subprocess performance for multiple pipelined commands

I was writing Python code using the subprocess module and I got stuck in a situation where I need to use pipes to pass the result of one command to another to obtain the specific data I need.
However, this can also be achieved through pure Python code.
Ex)
from subprocess import Popen, PIPE

cmd_result = Popen("ls -l ./ | awk -F ' ' '{if ($5 > 10000) print $0}' | grep $USER",
                   shell=True, stdout=PIPE, text=True).communicate()[0].split('\n')
Or
cmd_result = Popen('ls -l ./', shell=True, stdout=PIPE, text=True).communicate()[0].split('\n')
result_lst = []
for result in cmd_result:
    result_items = result.split()
    if len(result_items) > 4 and int(result_items[4]) > 10000 and result_items[2] == "user_name":
        result_lst.append(result)
And I am wondering which of the two methods is more efficient.
I found that the one with pure Python code is slower than the one with pipelines, but I am not sure whether that means using pipes is more efficient.
Thank you in advance.
The absolute best solution to this is to avoid using a subprocess at all:
import os

# List entries in the current directory owned by us and larger than 10000 bytes.
myuid = os.getuid()
for file in os.scandir("."):
    st = file.stat()
    if st.st_size > 10000 and st.st_uid == myuid:
        print(file.name)
In general, if you want to run and capture the output of a command, the simplest by far is subprocess.check_output; but really, don't parse ls output, and, of course, try to avoid superfluous subprocesses like useless greps if efficiency is important.
files = subprocess.check_output(
    """ls -l . | awk -v me="$USER" '$5 > 10000 && $3 == me { print $9 }'""",
    text=True, shell=True)
This has several other problems; $4 could contain spaces (it does, on my system) and $9 could contain just the beginning of the file name if it contains spaces.
If you need to run a process which could produce a lot of output and fetch that output as it arrives, not just when the process has finished, the Stack Overflow subprocess tag info page has a couple of links to questions about how to do that. I am guessing it is not worth the effort for the simple task you are asking about, though it could be useful for more complex ones.
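For reference, a rough sketch of that read-as-it-arrives pattern, assuming a line-oriented command:
import subprocess

# Iterate over the child's stdout line by line while it is still running.
with subprocess.Popen(['ps', '-A'], stdout=subprocess.PIPE, text=True) as proc:
    for line in proc.stdout:
        print(line.rstrip())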

Can I pass a variable from Python to a bash file?

I have a bash file with a bunch of sed commands like this:
sed -i 's/hello my name is Thibault/hello my name is Louis/g' "$1"
so for now I'm doing all of this "by hand". However, I have a Python script with a tkinter GUI and several input fields for the user. I would like to find a trick so that if the user inputs "hello my name is Olivia" in the text field, the regex would look like this:
sed -i 's/hello my name is Thibault/hello my name is Olivia/g' "$1"
So I was thinking that I could store the Python text input in a variable to have the regex look like this:
sed -i 's/hello my name is Thibault/$my_variable/g' "$1"
but I don't know how, or if this is even possible. Lastly, I want to mention that I know I could just ask for the user input in the bash script, but this is for my first internship and I have to go through the Python GUI.
Edit: I'm on Windows 10, if that is important.
Try it like this:
import os
original_text = 'hello my name is Thibault'
new_text = 'hello my name is Louis'
filename = 'test.txt'
os.system(f'sed -i "s/{original_text}/{new_text}/g" {filename}')
For passing data (in your case: some string) from your Python program to a subprocess running a bash script, you have first of all the same options as when calling one bash script from another. One possibility is to design the called script to expect positional parameters (and use it as $1, for example) and pass the string as a parameter. For instance, if the string is stored in the Python variable parameter, it would look like:
import subprocess

subprocess.call(['bash', './script_to_be_called', parameter])
The other possibility is to design the bash script so that it expects the string to be stored in a variable of a certain name (and use it as $PARSTRING, for instance), and pass the data via the environment:
import os
import subprocess

os.environ['PARSTRING'] = parameter
subprocess.call(['bash', './script_to_be_called'])
If the "script" executes only a single command, you could alternatively synthesize the command line in your Python program. Assume that you have a string bashcommand, which already holds the complete command which is supposed to be executed by bash, you could do a
import subprocess

subprocess.call(['bash', '-c', bashcommand])
While this should answer your question, I can't help pointing out that for executing a single external command, I would not create a shell process but invoke the program directly as a child process. Also, don't forget that spawning a child process takes time; if you have many such invocations, it might make sense to redesign your approach, for instance by doing everything inside Python, or by having only one child process that receives the data for all the substitutions to be performed (typically via a file).
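For example, since the asker is on Windows 10 (where sed is often unavailable), here is a minimal pure-Python sketch of a single substitution; replace_in_file is a hypothetical helper, and the strings come from the question:
from pathlib import Path

def replace_in_file(filename, old, new):
    # Pure-Python equivalent of: sed -i 's/old/new/g' filename
    # (for literal, non-regex patterns).
    path = Path(filename)
    path.write_text(path.read_text().replace(old, new))

replace_in_file('test.txt', 'hello my name is Thibault', 'hello my name is Olivia')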

How to execute multiple commands in the command line with Python 3.x

Thanks, everyone. I am writing a script to execute multiple commands in the command line. It is one part of my whole script.
I have checked many answers, but none of them solved my problem. Some of them are too old to use.
My commands are like this
cd C:/Users/Bruce/Desktop/test
set I_MPI_ROOT=C:\Program Files\firemodels\FDS6\bin\mpi
set PATH=%I_MPI_ROOT%;%PATH%
fds_local -o 1 -p 1 test.fds
python test.py
I tried to use subprocess.run or os.system, etc. But they do not work. I don't know what happened. Here is an example I have used.
import subprocess

file_path = "C:/Users/Bruce/Desktop/test"
file_name = "test.fds"  # from the commands above
cmd1 = 'cd ' + file_path
cmd2 = "set I_MPI_ROOT=C:/Program Files/firemodels/FDS6/bin/mpi"
cmd3 = "set PATH=%I_MPI_ROOT%;%PATH%"
nMPI = '-p {}'.format(1)
nOpenMP = '-o {}'.format(1)
cmd4 = "fds_local {} {} ".format(nMPI, nOpenMP) + file_name
cmd = '{} && {} && {} && {}'.format(cmd1, cmd2, cmd3, cmd4)
subprocess.Popen(cmd, shell=True)
I am not quite familiar with subprocess, but I have worked for a week to solve this problem and it is driving me crazy. Any suggestions?
cmd needs to be a list of strings: whatever you would type on the shell, split on blanks. E.g.
"ls -l /var/www" should be cmd = ['ls', '-l', '/var/www']
That said, cd is better done with os.chdir, and set is better handled by passing an env dictionary into the subprocess call. Multiple lines are better put into a shell (or batch) script, which can take parameters, so you do not have to assemble them in Python; see the sketch after the example below.
Here is an example. If a command is not in the OS's $PATH, you can fully qualify its path:
from subprocess import Popen

# On Windows, a list passed with shell=True is joined into a single command string.
cmd = ['cd', r'C:\Program Files (x86)\Notepad++', '&&',
       'notepad', 'LICENSE', '&&',
       r'D:\Program\Tools\Putty.exe', '-v']
d = Popen(cmd, shell=True)
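Applied to the commands from the question, a rough sketch of the os.chdir/env-dictionary approach might look like this (the paths and the fds_local invocation are copied from the question and untested here):
import os
import subprocess

env = os.environ.copy()
env['I_MPI_ROOT'] = r'C:\Program Files\firemodels\FDS6\bin\mpi'
env['PATH'] = env['I_MPI_ROOT'] + os.pathsep + env['PATH']

# cwd= replaces the cd command; env= replaces the two set commands.
subprocess.run(['fds_local', '-o', '1', '-p', '1', 'test.fds'],
               cwd=r'C:\Users\Bruce\Desktop\test', env=env, check=True)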

How can I have subprocess or Popen show the command line that it has just run?

Currently, I have a command that looks something like the following:
my_command = Popen([activate_this_python_virtualenv_file,
                    "-m", "my_command", "-l",
                    directory_where_ini_file_for_my_command_is + "/" + my_ini_file_name],
                   stderr=subprocess.STDOUT, stdout=subprocess.PIPE, shell=False,
                   universal_newlines=False, cwd=directory_where_my_module_is)
I have figured out how to access and process the output, deal with subprocess.PIPE, and make subprocess do a few other neat tricks.
However, it seems odd to me that the standard Python documentation for subprocess doesn't mention a way to just get the actual command line as subprocess.Popen puts it together from arguments to the Popen constructor.
For example, perhaps my_command.get_args() or something like that?
Is it just assumed that reconstructing the command line run by Popen is easy enough?
I can just put the arguments together on my own, without accessing the command subprocess runs with Popen, but if there's a better way, I'd like to know it.
The Popen.args attribute was added in Python 3.3. According to the docs:
The following attributes are also available:
Popen.args The args argument as it was passed to Popen – a sequence of
program arguments or else a single string.
New in version 3.3.
So sample code would be:
import subprocess

my_args_list = ['ls', '-l']  # your list of program arguments
p = subprocess.Popen(my_args_list)
assert p.args == my_args_list
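If you want the arguments rendered as a single copy-pasteable string rather than a list, shlex.join (available since Python 3.8) can do that; a minimal sketch:
import shlex
import subprocess

p = subprocess.Popen(['ps', '-A'], stdout=subprocess.DEVNULL)
p.wait()
print(shlex.join(p.args))  # prints: ps -A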

python script to capture output of top command

I was trying to capture the output of the top command using the following Python script:
import os
process = os.popen('top')
preprocessed = process.read()
process.close()
output = 'show_top.txt'
fout = open(output,'w')
fout.write(preprocessed)
fout.close()
However, the script does not work for top: it gets stuck for a long time, although it works well with commands like ls. I have no clue why this is happening.
Since you're waiting for the process to finish, you need to tell top to only print its output once, and then quit.
You can do that by running:
top -n 1
The -b (batch mode) argument is also required when top's stdout is read from Python rather than a terminal:
os.popen('top -b -n 1')
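Putting both flags together with subprocess.check_output instead of os.popen, a minimal sketch:
import subprocess

# -b (batch mode) plus -n 1 (one iteration) make top print once and exit.
snapshot = subprocess.check_output(['top', '-b', '-n', '1'], text=True)
with open('show_top.txt', 'w') as fout:
    fout.write(snapshot)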
