How can I run another process in Heroku? - python-3.x

I have two files in Heroku, "app.py" and "notify.py", and I run "app.py".
In some cases I'd like to run "notify.py", so I use subprocess.Popen to call it, and I'd like the "notify.py" process to keep running in a loop even after the main process ends.
How can I do that?
By the way, I get an error like this:
/bin/sh: 1: ./notify.py: Permission denied
I have tried the calls below:
child = subprocess.Popen(['./notify.py'], shell=True, cwd='/app', stdin=None, stdout=subprocess.PIPE)
child = subprocess.call(['./notify.py'], shell=True, cwd='/app', stdin=None, stdout=subprocess.PIPE)
but neither works.
I also tried another approach: I made a function notify() in app.py and changed the code to
child = subprocess.Popen([notify(text)], shell=True)
It works, but it runs notify(text), then prints "test" and returns.
Please tell me why, and how I can solve this problem.
if event.message.text[0:6] == 'push**':
    t = event.message.text
    text = t.split('**')
    child = subprocess.Popen(['./notify.py'], shell=True, cwd='/app', stdin=None, stdout=subprocess.PIPE)
    print('test')
    return 0
/bin/sh: 1: ./notify.py: Permission denied
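The "Permission denied" error means /app/notify.py lacks the executable bit, and the Popen([notify(text)], ...) variant only appeared to work because notify(text) is evaluated in the parent process before Popen ever starts, so the function runs synchronously. You can either mark the file executable before deploying (e.g. git update-index --chmod=+x notify.py) or run it through the interpreter so no executable bit is needed. A minimal sketch of the latter, assuming notify.py accepts the text as a command-line argument:

import subprocess
import sys

# sys.executable is the same interpreter running app.py; invoking it
# directly avoids the need for chmod +x on notify.py.
# Passing `text` as an argv entry is an assumption about notify.py's interface.
child = subprocess.Popen([sys.executable, 'notify.py', text], cwd='/app')

Note that on Heroku all processes in a dyno are terminated when the dyno's main process exits, so a child spawned this way will not outlive app.py; a separate worker dyno declared in the Procfile is the usual way to keep a long-running loop alive.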

Related

Python 3 piping Ghostscript output to internal variable

I'm losing my head trying to make this piece of code work. I want to pipe the data output of Ghostscript's
'-sDEVICE=ink_cov' to an internal variable instead of using an external file that I must read back (as I'm doing right now), but I can't get it to work. Here are some of my attempts:
__args = ['gswin64', f'-sOutputFile={salida_temp}', '-dBATCH', '-dNOPAUSE',
          '-dSIMPLE', '-sDEVICE=ink_cov', '-dShowAnnots=false', '-dTextFormat=3', fichero_pdf]
# __args = ['gswin64', '-dBATCH', '-dNOPAUSE', '-dSIMPLE', '-sDEVICE=ink_cov',
#           '-dShowAnnots=false', '-dTextFormat=3', fichero_pdf]
# __args = ['gswin64', '-sOutputFile=%%pipe%%', '-q', '-dQUIET', '-dBATCH', '-dNOPAUSE',
#           '-dSIMPLE', '-sDEVICE=ink_cov', '-dTextFormat=3', fichero_pdf]
ghost_output = subprocess.run(__args, capture_output=True, text=True)
# ghost_output = subprocess.check_output(__args)
# ghost_output = subprocess.run(__args, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
# ghost_output = subprocess.Popen(__args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
# with subprocess.Popen(__args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) as process:
#     for line in process.stdout:
#         print(line.decode('utf8'))
print('GS stdout:', ghost_output.stdout)
I have tried a bunch of parameters and of subprocess.run, check_output, Popen, and a context manager, but I can't get anything into ghost_output.stdout, only an empty string, or sometimes b'' if I use decode().
(BTW, if I use the '-dQUIET' option, Ghostscript doesn't show any data but still opens an output window. I haven't found a way to avoid opening a window, either.)
Does anybody know how to do it properly?
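A sketch of one likely fix, under two assumptions: that the console build of Ghostscript is installed, and that sending the device output to stdout is acceptable. On Windows, gswin64 is the windowed executable (which explains the extra window), while gswin64c is the console build whose stdout subprocess can actually capture; '-sOutputFile=-' tells Ghostscript to write the device output to stdout.

import subprocess

# gswin64c (console build) instead of gswin64 (windowed build);
# '-sOutputFile=-' sends the ink_cov coverage data to stdout,
# and '-q' keeps startup messages from mixing into it.
args = ['gswin64c', '-q', '-dBATCH', '-dNOPAUSE',
        '-sDEVICE=ink_cov', '-sOutputFile=-', fichero_pdf]
ghost_output = subprocess.run(args, capture_output=True, text=True)
print('GS stdout:', ghost_output.stdout)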

How to exit python subprocess if it fails

I am reading some output of a remote server using SSH via bash(command). I need to do it constantly, so I spawn a subprocess. It works fine, unless the server becomes unavailable for a brief period.
How do I restart the SSH command (the whole subprocess) if it fails?
I have the following piece of code:
(...)
process = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True, universal_newlines=True)
while True:
    line = process.stdout.readline()
    if lines == "" and process.poll() is not None:
        break
(...)
I would have thought that process.poll() is not None should do the trick, but it seems to hang on
1000 21431 0.0 0.0 0 0 pts/0 Z+ Oct22 0:00 [ssh] <defunct>
and does not break out of while True:.
I had a silly typo in my code ("lines" instead of "line") preventing if lines == "" and process.poll() is not None: from ever being true.
Another thing to look at is ssh_config: it is wise to set values that disconnect after roughly 60 seconds / 1 attempt (e.g. ServerAliveInterval 60 and ServerAliveCountMax 1).
And avoid process.communicate(), as it blocks the whole thread.
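For the restart itself, a minimal sketch under the question's setup (command is the ssh invocation): wrap the whole spawn-and-read cycle in an outer loop so the subprocess is re-created whenever it exits, with a short pause to avoid hammering a briefly unreachable server.

import subprocess
import time

while True:
    process = subprocess.Popen(command, stdout=subprocess.PIPE,
                               stderr=subprocess.PIPE, shell=True,
                               universal_newlines=True)
    while True:
        line = process.stdout.readline()
        if line == "" and process.poll() is not None:
            break  # ssh exited; fall through and restart it
        # ... process line ...
    time.sleep(5)  # brief pause before reconnecting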

subprocess.Popen does not return complete output when run through crontab

I am calling a Java binary in a Unix environment, wrapped inside a Python script.
When I call the script from bash, the output comes out clean and is stored in the desired variable. However, when I run the same script from cron, the output stored (in a variable) is incomplete.
My code:
command = '/opt/HP/BSM/PMDB/bin/abcAdminUtil -abort -streamId ETL_' \
          'SystemManagement_PA#Fact_SCOPE_OVPAGlobal'
proc = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE)
(output, err) = proc.communicate()  # Storing output in the output variable
Value of the output variable when running from shell:
Abort cmd output:PID:8717
Executing abort function
hibernateConfigurationFile = /OBRHA/HPE-OBR/PMDB/lib/hibernate-core-4.3.8.Final.jar
Starting to Abort Stream ETL_SystemManagement_PA#Fact_SCOPE_OVPAGlobal
Aborting StreamETL_SystemManagement_PA#Fact_SCOPE_OVPAGlobal
Value of the output variable when running from cron:
PID:830
It seems that output produced after the new process is created is not being stored in the variable; I don't know why.
Kintul.
Your question seems to be very similar to this one: Capture stdout stderr of python subprocess, when it runs from cron or rc.local
See if that helps you.
This happened because the Java utility was throwing an exception which is not caught by subprocess.Popen.
The exception is, however, caught by subprocess.check_output, which raises CalledProcessError on a non-zero exit.
Updated code:
try:
    output = subprocess.check_output(command, shell=True, stderr=subprocess.STDOUT, stdin=subprocess.PIPE)
except subprocess.CalledProcessError as exc:
    print("Status : FAIL", exc.returncode, exc.output)
else:
    print("Output of Resume cmd: \n{}\n".format(output))
    file.write("Output of Resume cmd: \n{}\n".format(output) + "\n")
Output of the code:
('Status : FAIL', -11, 'PID:37319\n')
('Status : FAIL', -11, 'PID:37320\n')
Hence, the command is failing in a way that is caught by subprocess.check_output but not by subprocess.Popen. (A negative returncode from subprocess means the child was killed by a signal; -11 is SIGSEGV, a segmentation fault.)
Extract from the official documentation of subprocess.check_output:
If the return code was non-zero it raises a CalledProcessError. The CalledProcessError object will have the return code in the returncode attribute and any output in the output attribute.

How do I make my Python program wait for the subprocess to complete?

I have a Python program which should execute a command line (the command line is a psexec command that calls a batch file on a remote server).
I used Popen to call the command line. The batch file on the remote server produces a return code of 0.
Now I have to wait for this return code, and on the basis of the return code I should continue my program execution.
I tried to use .wait() or .check_output(), but for some reason neither worked for me.
cmd = """psexec -u CORPORATE\user1 -p force \\\sgeinteg27 -s cmd /c "C:\\Planview\\Interfaces\\ProjectPlace_Sree\\PP_Run.bat" """
p = subprocess.Popen(cmd, bufsize=2048, shell=True,
stdin=subprocess.PIPE, stdout=subprocess.PIPE)
p.wait()
print(p.returncode)
##The below block should wait until the above command runs completely.
##And depending on the return code being ZERO i should continue the rest of
##the execution.
if p.returncode ==0:
result = tr.test_readXid.readQuery(cert,planning_code)
print("This is printed depending if the return code is zero")
Here is the end of the batch file execution and the return code.
Can anybody help me with this?
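One caveat worth checking (a general subprocess pitfall, not something stated in the question): p.wait() combined with stdout=subprocess.PIPE can deadlock if the child writes enough output to fill the pipe buffer, because nothing drains the pipe while waiting. A sketch using communicate(), which reads the output and waits for exit in one step (cmd, tr.test_readXid, cert and planning_code are taken from the question's code):

import subprocess

# communicate() drains stdout and waits for the process to exit,
# avoiding the pipe-buffer deadlock that wait() can run into.
p = subprocess.Popen(cmd, shell=True,
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE)
out, _ = p.communicate()
print(p.returncode)

if p.returncode == 0:
    # continue the rest of the execution
    result = tr.test_readXid.readQuery(cert, planning_code)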

How to execute a python or bash script through an ssh connection and get the return code

I have a Python file at the location /tmp/; this file prints something and returns with exit code 22. I'm able to run this script perfectly with PuTTY, but not with the paramiko module.
This is my execution code:
import paramiko

def main():
    remote_ip = '172.xxx.xxx.xxx'
    remote_username = 'root'
    remote_password = 'xxxxxxx'
    remote_path = '/tmp/ab.py'
    sub_type = 'py'
    commands = ['echo $?']
    ssh_client = paramiko.SSHClient()
    ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh_client.connect(remote_ip, username=remote_username, password=remote_password)
    i, o, e = ssh_client.exec_command('/usr/bin/python /tmp/ab.py')
    print o.read(), e.read()
    i, o, e = ssh_client.exec_command('echo $?')
    print o.read(), e.read()

main()
This is my Python script to be executed on the remote machine:
#!/usr/bin/python
import sys
print "hello world"
sys.exit(20)
I'm not able to understand what is actually wrong in my logic. Also, when I do cd /tmp and then ls, I'm still in the root folder.
The following example runs a command via ssh and then gets the command's stdout, stderr and return code:
import paramiko
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(hostname='hostname', username='username', password='password')
channel = client.get_transport().open_session()
command = "import sys; sys.stdout.write('stdout message'); sys.stderr.write(\'stderr message\'); sys.exit(22)"
channel.exec_command('/usr/bin/python -c "%s"' % command)
channel.shutdown_write()
stdout = channel.makefile().read()
stderr = channel.makefile_stderr().read()
exit_code = channel.recv_exit_status()
channel.close()
client.close()
print 'stdout:', stdout
print 'stderr:', stderr
print 'exit_code:', exit_code
Hope it helps.
Each time you run exec_command, a new bash subprocess is initiated.
That's why, when you run something like:
exec_command("cd /tmp");
exec_command("mkdir hello");
the directory "hello" is created in the login directory, not inside /tmp.
Try running several commands in the same exec_command call, as in the sketch below.
A different way is to use Python's os.chdir().
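A minimal sketch of the combined-command approach, reusing the client object from the answer above: chaining with && keeps both commands in the same remote shell, so the cd actually affects the mkdir.

# Both commands run in one remote shell session.
stdin, stdout, stderr = client.exec_command('cd /tmp && mkdir hello')
print 'exit_code:', stdout.channel.recv_exit_status()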
