Python subprocess call through crontab not working - node.js

I am using a python 3.6 script in a Raspberry Pi Zero W that contains the following lines:
import subprocess
result = subprocess.run(['which', 'node'],
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE)
nodeCmd = result.stdout.decode("utf-8").replace('\n', '')
print(nodeCmd)
result = subprocess.run([nodeCmd, './script.js'],
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE)
The script tries to find the node binary and then call a JS script with it. When run manually, the program works OK, but when I schedule the call through crontab, the nodeCmd variable comes up blank (instead of /usr/local/bin/node) and I get the following error:
[Errno 13] Permission denied: ''
What is going on here? Is this a permissions issue?

So the reason seems to be that crontab has its $PATH variable set to a different value from the user's $PATH. To fix it, I just had to set the desired value in the cron file, just above the schedule lines:
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
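An alternative that sidesteps cron's environment entirely is to resolve the binary from Python with shutil.which, passing an explicit search path. This is only a sketch; the search path list is an assumption for a typical Raspberry Pi install:

```python
import shutil
import subprocess

def run_js(script, search_path='/usr/local/bin:/usr/bin:/bin'):
    # Resolve node explicitly instead of relying on cron's minimal PATH.
    # The default search_path here is an assumption, not taken from the post.
    node_cmd = shutil.which('node', path=search_path)
    if node_cmd is None:
        raise FileNotFoundError('node binary not found on ' + search_path)
    return subprocess.run([node_cmd, script],
                          stdout=subprocess.PIPE,
                          stderr=subprocess.PIPE)
```

This way the script behaves the same whether it is launched from a login shell or from cron, because the lookup path is spelled out in code.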

Related

Running a string command using exec with popen

I have a simple cmd_str containing a few lines of Python. Using exec, I can run those lines just fine. However, running them in a separate process with shell=True is failing. Is this due to missing quotes? What is happening under the hood?
import subprocess
cmd_str = """
import sys
for r in range(10):
    print('rob')
"""
exec(cmd_str)  # works OK
full_cmd = f'python3 -c "exec( "{cmd_str}" )"'
process = subprocess.Popen([full_cmd],
                           shell=True,
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE)
(output, error) = process.communicate()
exit_code = process.wait()
output_msg = output.decode("utf-8", 'ignore')
error_msg = error.decode("utf-8", 'ignore').strip()
Your approach is slightly inaccurate. I believe the problem you're having has to do with the subprocess usage. The first thing you must realise is that exec is a way to send Python code to the interpreter and have it executed directly. This is why it works inside Python programs (and it is generally not a good approach). Subprocesses, on the other hand, handle commands as if they were being called directly from the terminal or shell. This means you no longer need to include exec, because you are already interacting with the Python interpreter when you call python -c.
To get this to run in a subprocess environment, all you need to do is:
full_cmd = f'python3 -c "{cmd_str}"'
process = subprocess.Popen(full_cmd,
                           shell=True,
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE)
Also, notice the absence of square brackets in this subprocess.Popen call; that form works slightly differently. If you want to use square brackets, your command has to be:
full_cmd = ['python3', '-c', cmd_str]
process = subprocess.Popen(full_cmd,
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE)
And with these few changes, everything should work OK.
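For comparison, here is a sketch of the same call using subprocess.run (Python 3.5+), which wraps the Popen/communicate/wait sequence into one call. The three-iteration loop is just an illustrative stand-in for the original snippet:

```python
import subprocess

# Code to execute in the child interpreter; python3 -c takes it as a string,
# so no exec() wrapper is needed.
cmd_str = """
for r in range(3):
    print('rob')
"""

# List form avoids shell quoting problems entirely.
result = subprocess.run(['python3', '-c', cmd_str],
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE)
print(result.stdout.decode('utf-8'))  # rob printed three times
print(result.returncode)              # 0 on success
```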

How to get command's standard output with subprocess?

For some reason, I want to gather the help messages of some commands. To do so, I use the subprocess module in Python 3. My code works fine for regular Linux commands but not for Bash built-in commands. Typically, I want it to work on the cd Bash built-in.
Here is the snippet of code I use for now:
import subprocess
instruction = ["cat", "--help"]
proc = subprocess.run(instruction, stdout=subprocess.PIPE,
                      stderr=subprocess.PIPE, universal_newlines=True)
return proc.stdout
As said before, it works fine and it returns the help message of the command cat.
Here is what it returns when I try to adapt my code in order to handle BASH commands:
>>> import subprocess
>>> subprocess.run("cd --help", shell=True, stderr=subprocess.PIPE, stdout=subprocess.PIPE).stdout
b''
My question is simple: is it possible to get Bash built-ins' help messages using Python 3? If so, how?
You can take a look at what subprocess.run returns:
>>> import subprocess
>>> result = subprocess.run("cd --help", shell=True, stderr=subprocess.PIPE, stdout=subprocess.PIPE)
>>> result
CompletedProcess(args='cd --help', returncode=1, stdout=b'', stderr=b'/bin/sh: line 0: cd: --: invalid option\ncd: usage: cd [-L|-P] [dir]\n')
Turns out, cd --help is an error:
$ cd --help
-bash: cd: --: invalid option
cd: usage: cd [-L|-P] [dir]
So you should look for it in result.stderr.
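Since cd is a shell builtin, its help text has to come from the shell itself. A sketch that asks bash's help builtin directly (assuming bash is installed; the shell used by shell=True may be /bin/sh, which lacks help):

```python
import subprocess

# cd is a builtin, so there is no /usr/bin/cd with a --help option.
# bash's own `help` builtin prints the usage text for builtins.
result = subprocess.run(['bash', '-c', 'help cd'],
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE,
                        universal_newlines=True)
print(result.stdout)  # prints cd's usage message
```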

Unable to send sudo password to subprocess.Popen() successfully in Python for bash

I'm attempting to create a python script to compile github surface kernel using their recommended steps.
https://github.com/dmhacker/arch-linux-surface
So far I'm stuck at a couple of sections.
Per the compilation instructions, setup.sh must be run using sudo.
I've tried sending in the password before calling the process using:
preproc = subprocess.Popen(password, shell=True, stdout=subprocess.PIPE)
process = subprocess.Popen(["sudo", 'sh setup.sh'], shell=True, stdin=preproc.stdout, encoding='utf8')
I've tried sudo -S which doesn't seem to work at all. I've also tried lowercase -s.
I've tried changing subprocess.Popen to subprocess.call
password = getpass.getpass()
process = subprocess.Popen(["sudo", 'sh setup.sh'], shell=True,
                           stdin=subprocess.PIPE, encoding='utf8')
print(process.communicate(password + "\n"))
process.wait()
I expected the shell to be run at sudo level but it's not.
I'm not exactly sure what the difference is, as I've gone through many iterations since, but I finally got it to work and simplified it. Hope this helps someone in the future.
import getpass
from subprocess import Popen, PIPE

password = getpass.getpass()
command = "./setup.sh"
# stdin=PIPE is required so communicate() can feed the password to sudo -S.
process = Popen(['sudo', '-S', command], stdin=PIPE, stdout=PIPE,
                encoding='utf8')
process.communicate(password + '\n')
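The detail that makes this work is that sudo -S reads the password from standard input, so the password must be piped into the child's stdin. The same stdin-feeding pattern can be demonstrated with a harmless stand-in command (cat here plays the role of sudo -S, so no root is needed):

```python
from subprocess import Popen, PIPE

# cat reads from stdin just as `sudo -S` reads the password from stdin,
# so this shows the stdin=PIPE + communicate() pattern without needing root.
process = Popen(['cat'], stdin=PIPE, stdout=PIPE, encoding='utf8')
out, _ = process.communicate('secret\n')
print(out)  # secret
```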

How to write command using subprocess in python3

import subprocess
proc = subprocess.Popen(['systemctl', 'reload', 'postgresql-9.6.service'], stdout=subprocess.PIPE, shell=True)
(db_cmd, err) = proc.communicate()
print (db_cmd)
I am trying to run systemctl reload postgresql-9.6.service using Python 3, but I am not able to get any output. Instead I get output such as:
reload: systemctl: command not found
b''
First of all: read the docs: Subprocess module Python 3.
Make sure the import statement reads exactly import subprocess.
Use sudo to execute the script you wrote: sudo python /full/path/to/your/script.
Then: it is more Pythonic to write db_cmd = proc.communicate()[0], because that way you create only the variable you actually use.
Finally, your error indicates that something went wrong while processing the systemctl command. Note that with shell=True and a list argument, only the first list element is passed to the shell as the command; the remaining elements become positional parameters, so systemctl runs without reload and the service name. Drop shell=True, or pass the whole command as a single string.
In addition: this question is a duplicate of How to use subprocess.
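The shell=True-with-a-list pitfall can be reproduced with a harmless command; on POSIX systems only the first list element reaches the shell as the command to run:

```python
import subprocess

# With shell=True and a list, only the first element is treated as the shell
# command; the remaining elements become positional parameters of the shell
# itself and are silently ignored (POSIX behaviour).
out_wrong = subprocess.run(['echo', 'hello'], shell=True,
                           stdout=subprocess.PIPE).stdout
print(out_wrong)  # b'\n' -- echo ran with no arguments

# Correct: drop shell=True and pass the argument list directly.
out_right = subprocess.run(['echo', 'hello'],
                           stdout=subprocess.PIPE).stdout
print(out_right)  # b'hello\n'
```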

How can I have subprocess or Popen show the command line that it has just run?

Currently, I have a command that looks something like the following:
my_command = Popen([activate_this_python_virtualenv_file,
                    "-m", "my_command", "-l",
                    directory_where_ini_file_for_my_command_is + "/" + my_ini_file_name],
                   stderr=subprocess.STDOUT, stdout=subprocess.PIPE, shell=False,
                   universal_newlines=False, cwd=directory_where_my_module_is)
I have figured out how to access and process the output, deal with subprocess.PIPE, and make subprocess do a few other neat tricks.
However, it seems odd to me that the standard Python documentation for subprocess doesn't mention a way to just get the actual command line as subprocess.Popen puts it together from arguments to the Popen constructor.
For example, perhaps my_command.get_args() or something like that?
Is it just that getting the command line run in Popen should be easy enough?
I can just put the arguments together on my own, without accessing the command subprocess runs with Popen, but if there's a better way, I'd like to know it.
It was added in Python 3.3. According to the docs:
The following attributes are also available:
Popen.args The args argument as it was passed to Popen – a sequence of
program arguments or else a single string.
New in version 3.3.
So sample code would be:
my_args_list = ['echo', 'hello']  # your argument list
p = subprocess.Popen(my_args_list, stdout=subprocess.PIPE)
p.wait()
assert p.args == my_args_list
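Building on that, a small runnable sketch: shlex.join (Python 3.8+) turns the Popen.args list back into a copy-pasteable shell line, with quoting applied where needed:

```python
import shlex
import subprocess

p = subprocess.Popen(['echo', 'hello world'], stdout=subprocess.PIPE)
p.wait()

# Popen.args holds exactly what was passed to the constructor.
print(p.args)              # ['echo', 'hello world']

# shlex.join renders the list as a single shell-safe command line.
print(shlex.join(p.args))  # echo 'hello world'
```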
