asyncio call works in cmd but not in Python

I am using sox to retrieve audio file information. There are two Windows cmd commands which return the information correctly:
C:\Users\Me> "path\to\sox.exe" "--i" "path\to\audiofile.wav"
C:\Users\Me> "path\to\sox.exe" "path\to\audiofile.wav" "-n" "stat"
I'm using an asyncio script to run these two commands and collect the data for processing, with the code below:
async def async_subprocess_command(*args):
    # Create subprocess
    process = await asyncio.create_subprocess_exec(
        *args,
        # stdout must be a pipe to be accessible as process.stdout
        stdout=asyncio.subprocess.PIPE)
    # Wait for the subprocess to finish
    stdout, stderr = await process.communicate()
    # Return stdout
    return stdout.decode().strip()

data1 = await async_subprocess_command(soxExecutable, "--i", audiofilepath)
data2 = await async_subprocess_command(soxExecutable, audiofilepath, "-n", "stat")
As both cmd commands behave as expected, I am confused that data2 from the Python script is always blank, while data1 is as expected (with data from sox).
Can anyone help me understand why?

For some reason the second command returns its result via stderr. I added an additional parameter to the asyncio.create_subprocess_exec call to connect stderr to asyncio.subprocess.PIPE.
async def async_subprocess_command(*args):
    # Create subprocess
    process = await asyncio.create_subprocess_exec(
        *args,
        # stdout and stderr must be pipes to be accessible as
        # process.stdout and process.stderr
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE)
    # Wait for the subprocess to finish
    stdout, stderr = await process.communicate()
    # Return stdout and stderr
    return stdout.decode().strip(), stderr.decode().strip()

data1a, data1b = await async_subprocess_command(soxExecutable, "--i", audiofilepath)
data2a, data2b = await async_subprocess_command(soxExecutable, audiofilepath, "-n", "stat")
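As a side note, if a single return value is preferable, stderr can instead be merged into stdout with asyncio.subprocess.STDOUT; a minimal sketch (a variant of the code above, not part of the original answer):

async def async_subprocess_command_merged(*args):
    # Merge stderr into stdout, so output from tools like sox's "stat"
    # effect (which writes to stderr) lands in one stream
    process = await asyncio.create_subprocess_exec(
        *args,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.STDOUT)
    stdout, _ = await process.communicate()
    return stdout.decode().strip()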

Related

Python 3 piping Ghostscript output to internal variable

I'm losing my head trying to make this piece of code work. I want to pipe Ghostscript's data output from '-sDEVICE=ink_cov' into an internal variable instead of using an external file that I must read afterwards (like I'm doing right now), but I can't get it to work. Here are some of my tries:
__args = ['gswin64', f'-sOutputFile={salida_temp}', '-dBATCH', '-dNOPAUSE',
          '-dSIMPLE', '-sDEVICE=ink_cov', '-dShowAnnots=false', '-dTextFormat=3', fichero_pdf]
# __args = ['gswin64', '-dBATCH', '-dNOPAUSE', '-dSIMPLE', '-sDEVICE=ink_cov',
#           '-dShowAnnots=false', '-dTextFormat=3', fichero_pdf]
# __args = ['gswin64', '-sOutputFile=%%pipe%%', '-q', '-dQUIET', '-dBATCH', '-dNOPAUSE',
#           '-dSIMPLE', '-sDEVICE=ink_cov', '-dTextFormat=3', fichero_pdf]

ghost_output = subprocess.run(__args, capture_output=True, text=True)
# ghost_output = subprocess.check_output(__args)
# ghost_output = subprocess.run(__args, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
# ghost_output = subprocess.Popen(__args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
# with subprocess.Popen(__args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) as process:
#     for line in process.stdout:
#         print(line.decode('utf8'))

print('GS stdout:', ghost_output.stdout)
I have tried a bunch of parameters and approaches (subprocess.run, check_output, Popen, a context manager), but I can't get anything in ghost_output.stdout, only an empty string, or sometimes b'' if I use decode().
(By the way, if I use the '-dQUIET' option, Ghostscript doesn't show any data but still opens an output window. I haven't found a way to keep it from opening any window, either.)
Does anybody know how to do this properly?
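One approach worth sketching, assuming the console binary gswin64c is on the PATH: Ghostscript accepts -sOutputFile=- to send the device output to stdout, and gswin64c (unlike gswin64) is the console executable, so it opens no extra window. An untested sketch along those lines:

import subprocess

# gswin64c is the console version of Ghostscript: no GUI window.
# -sOutputFile=- writes the ink_cov device output to stdout, and
# -q suppresses startup messages so stdout holds only the coverage data.
# 'fichero_pdf' is the asker's PDF path.
args = ['gswin64c', '-q', '-dBATCH', '-dNOPAUSE',
        '-sDEVICE=ink_cov', '-sOutputFile=-', fichero_pdf]
ghost_output = subprocess.run(args, capture_output=True, text=True)
print('GS stdout:', ghost_output.stdout)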

subprocess.Popen does not return complete output when run through crontab

I am calling a Java binary in a Unix environment, wrapped inside a Python script.
When I call the script from bash, the output comes out clean and is stored in the desired variable. However, when I run the same script from cron, the output stored (in a variable) is incomplete.
My code:
command = '/opt/HP/BSM/PMDB/bin/abcAdminUtil -abort -streamId ETL_' \
          'SystemManagement_PA#Fact_SCOPE_OVPAGlobal'
proc = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE)
(output, err) = proc.communicate()  # Storing output in the output variable
Value of output variable when running from shell:
Abort cmd output:PID:8717
Executing abort function
hibernateConfigurationFile = /OBRHA/HPE-OBR/PMDB/lib/hibernate-core-4.3.8.Final.jar
Starting to Abort Stream ETL_SystemManagement_PA#Fact_SCOPE_OVPAGlobal
Aborting StreamETL_SystemManagement_PA#Fact_SCOPE_OVPAGlobal
Value of output variable when running from cron:
PID:830
It seems the output produced after creating the new process is not being stored in the variable, and I don't know why.
Kintul.
Your question seems to be very similar to this one: Capture stdout stderr of python subprocess, when it runs from cron or rc.local
See if that helps you.
This happened because the Java utility was failing, and the failure is not raised by subprocess.Popen.
However, the failure is raised by subprocess.check_output.
Updated Code :
try:
    output = subprocess.check_output(command, shell=True, stderr=subprocess.STDOUT, stdin=subprocess.PIPE)
except subprocess.CalledProcessError as exc:
    print("Status : FAIL", exc.returncode, exc.output)
else:
    print("Output of Resume cmd: \n{}\n".format(output))
    file.write("Output of Resume cmd: \n{}\n".format(output) + "\n")
Output of code:
('Status : FAIL', -11, 'PID:37319\n')
('Status : FAIL', -11, 'PID:37320\n')
Hence, the command is failing, and the failure is caught by subprocess.check_output but not by subprocess.Popen. (A negative return code such as -11 means the child process was terminated by that signal number, here SIGSEGV.)
Extract from the official documentation of subprocess.check_output:
If the return code was non-zero it raises a CalledProcessError. The CalledProcessError object will have the return code in the returncode attribute and any output in the output attribute.
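To illustrate the difference with a self-contained sketch (the failing command here is made up for illustration): Popen.communicate() returns whatever output was produced and leaves the return code for you to inspect, while check_output() raises CalledProcessError on a non-zero exit.

import subprocess

# Hypothetical command that prints something and then fails
cmd = "echo PID:123; exit 1"

# Popen never raises on a non-zero exit; check returncode yourself
proc = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE)
output, err = proc.communicate()
print(proc.returncode, output)         # 1 b'PID:123\n'

# check_output raises, carrying the partial output on the exception
try:
    subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)
except subprocess.CalledProcessError as exc:
    print(exc.returncode, exc.output)  # 1 b'PID:123\n'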

How do I make my Python program wait for a subprocess to complete?

I have a Python program which should execute a command line (the command line is a psexec command that calls a batch file on the remote server).
I used Popen to call the command line. The batch file on the remote server produces a return code of 0.
Now I have to wait for this return code, and on the basis of the return code I should continue my program execution.
I tried to use .wait() and .check_output(), but for some reason they did not work for me.
cmd = """psexec -u CORPORATE\user1 -p force \\\sgeinteg27 -s cmd /c "C:\\Planview\\Interfaces\\ProjectPlace_Sree\\PP_Run.bat" """
p = subprocess.Popen(cmd, bufsize=2048, shell=True,
stdin=subprocess.PIPE, stdout=subprocess.PIPE)
p.wait()
print(p.returncode)
##The below block should wait until the above command runs completely.
##And depending on the return code being ZERO i should continue the rest of
##the execution.
if p.returncode ==0:
result = tr.test_readXid.readQuery(cert,planning_code)
print("This is printed depending if the return code is zero")
Here is the end of the batch file execution and the return code.
Can anybody help me with this ?
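One note on the code above: with stdout=subprocess.PIPE, p.wait() can deadlock once the child fills the pipe buffer; the subprocess documentation recommends communicate(), which both consumes the output and blocks until the process exits. A minimal sketch (the psexec command is the asker's; the rest is illustrative):

import subprocess

cmd = '...'  # the psexec command from above

p = subprocess.Popen(cmd, shell=True,
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE)
# communicate() waits for the process to finish and avoids the
# pipe-buffer deadlock that plain wait() risks with stdout=PIPE
output, _ = p.communicate()
print(p.returncode)

if p.returncode == 0:
    print("Return code is zero; continue")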

Paramiko run script and exit

I am trying to make paramiko run a script on an external machine and exit, but I'm having problems making it run the script. Does anyone know why it's not running the script? I have tried running the command manually on the VM and that worked.
command = "/home/test.sh > /dev/null 2>&1 &"
def start_job(host):
try:
client = paramiko.SSHClient()
client.load_system_host_keys()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(hostname, port=22, username=username, password=password)
return client.exec_command(command)
finally:
client.close()
start_job(hostname)
First, there's a typo in the start_job function: the parameter is named host but client.connect() uses hostname. Beyond that, just return the stdout/stderr contents before closing the connection:
def start_job(hostname):
    try:
        client = paramiko.SSHClient()
        client.load_system_host_keys()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(hostname, port=22, username=username, password=password)
        (_, stdout, stderr) = client.exec_command(command)
        return (stdout.read(), stderr.read())
    finally:
        client.close()
In your case the command redirects stderr to stdout and runs in the background, so I wouldn't expect anything but two empty strings to come back:
start_job(hostname)
('', '')
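If the goal is to confirm the script actually ran, one option (an illustrative variant, not part of the original answer) is to drop the trailing '&' and wait for the exit status on the underlying channel:

def start_job(hostname):
    try:
        client = paramiko.SSHClient()
        client.load_system_host_keys()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(hostname, port=22, username=username, password=password)
        # Without the trailing '&' the command blocks until the script
        # exits, and recv_exit_status() returns its return code
        _, stdout, _ = client.exec_command("/home/test.sh > /dev/null 2>&1")
        return stdout.channel.recv_exit_status()
    finally:
        client.close()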

how to execute python or bash script through ssh connection and get the return code

I have a Python file at the location /tmp/; this file prints something and returns exit code 22. I'm able to run this script perfectly with PuTTY, but not with the paramiko module.
This is my execution code:
import paramiko

def main():
    remote_ip = '172.xxx.xxx.xxx'
    remote_username = 'root'
    remote_password = 'xxxxxxx'
    remote_path = '/tmp/ab.py'
    sub_type = 'py'
    commands = ['echo $?']
    ssh_client = paramiko.SSHClient()
    ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh_client.connect(remote_ip, username=remote_username, password=remote_password)
    i, o, e = ssh_client.exec_command('/usr/bin/python /tmp/ab.py')
    print o.read(), e.read()
    i, o, e = ssh_client.exec_command('echo $?')
    print o.read(), e.read()

main()
This is my Python script to be executed on the remote machine:
#!/usr/bin/python
import sys
print "hello world"
sys.exit(20)
I'm not able to understand what is actually wrong in my logic. Also, when I do cd /tmp and then ls, I'm still in the root folder.
The following example runs a command via SSH and then gets the command's stdout, stderr and return code:
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(hostname='hostname', username='username', password='password')

# Open a raw channel so the exit status can be read directly
channel = client.get_transport().open_session()
command = "import sys; sys.stdout.write('stdout message'); sys.stderr.write('stderr message'); sys.exit(22)"
channel.exec_command('/usr/bin/python -c "%s"' % command)
channel.shutdown_write()  # signal that no stdin will be sent

stdout = channel.makefile().read()
stderr = channel.makefile_stderr().read()
exit_code = channel.recv_exit_status()  # blocks until the command finishes

channel.close()
client.close()

print 'stdout:', stdout
print 'stderr:', stderr
print 'exit_code:', exit_code
Hope it helps.
Each time you run exec_command, a new bash subprocess is initiated.
That's why, when you run something like:
exec_command("cd /tmp")
exec_command("mkdir hello")
the directory "hello" is created in the home directory, not inside /tmp.
Try running several commands in the same exec_command call, as in the sketch below.
A different way is to use Python's os.chdir().
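For instance, chaining the commands in a single call keeps them in one shell session (illustrative sketch):

# Both commands run in the same shell, so the cd takes effect
ssh_client.exec_command("cd /tmp && mkdir hello")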
