Paramiko run script and exit - linux

I am trying to make paramiko run a script on an external machine and exit, but I'm having problems making it run the script. Does anyone know why it's not running the script? I have tried running the command manually on the VM and that worked.
command = "/home/test.sh > /dev/null 2>&1 &"

def start_job(host):
    try:
        client = paramiko.SSHClient()
        client.load_system_host_keys()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(hostname, port=22, username=username, password=password)
        return client.exec_command(command)
    finally:
        client.close()

start_job(hostname)

First, there's a typo in the start_job function: the argument is named host, but client.connect() is called with hostname, so one of the two needs to change. Beyond that, just return the stdout/stderr contents before closing the connection:
def start_job(hostname):
    try:
        client = paramiko.SSHClient()
        client.load_system_host_keys()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(hostname, port=22, username=username, password=password)
        (_, stdout, stderr) = client.exec_command(command)
        return (stdout.read(), stderr.read())
    finally:
        client.close()
In your case the command redirects all output to /dev/null and runs in the background, so I wouldn't expect anything but two empty strings to be returned:
start_job(hostname)
('', '')
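If the point is to know when the script finishes, don't background it with `&`; a minimal sketch, assuming `client` is an already-connected paramiko.SSHClient (run_and_wait is a hypothetical helper name, not part of the question's code):

```python
def run_and_wait(client, command="/home/test.sh"):
    # Foreground run: recv_exit_status() blocks until the remote script exits
    _, stdout, stderr = client.exec_command(command)
    exit_code = stdout.channel.recv_exit_status()
    return exit_code, stdout.read(), stderr.read()
```

The channel stays open for the lifetime of the command, so the exit code and any buffered output are available once it returns.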

wait till command completed in paramiko invoke_shell() [duplicate]

I want to wait until the given command has completed on the remote machines. In this case it just executes and returns, without waiting for completion.
import paramiko
import re
import time

def scp_switch(host, username, PasswdValue):
    ssh = paramiko.SSHClient()
    try:
        # Logging into the remote host with my credentials
        ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        ssh.connect(host, username=username, password=PasswdValue, timeout=30)
        try:
            # Switching to powerbroker/root mode
            command = "pbrun xyz -u root\n"
            channel = ssh.invoke_shell()
            channel.send(command)
            time.sleep(3)
            while not re.search('Password', str(channel.recv(9999), 'utf-8')):
                time.sleep(1)
                print('Waiting...')
            channel.send("%s\n" % PasswdValue)
            time.sleep(3)
            # Executing the command on the remote host as root (after logging in as root).
            # I don't have any specific keyword to search for in the output, hence no while loop here.
            cmd = "/tmp/slp.sh cool >/tmp/slp_log.txt \n"
            print('Executing %s' % cmd)
            channel.send(cmd)  # it's not waiting here till the process completes
            time.sleep(3)
            res = str(channel.recv(1024), 'utf-8')
            print(res)
            print('process completed')
        except Exception as e:
            print('Error while switching:', str(e))
    except Exception as e:
        print('Error while SSH: %s' % str(e))
    ssh.close()

""" Provide the host and credentials here """
HOST = 'abcd.us.domain.com'
username = 'heyboy'
password = 'passcode'
scp_switch(HOST, username, password)
As per my research, it will not return any status code. Is there any logic to get the return code and wait until the process has completed?
I know this is an old post, but leaving this here in case someone has the same problem.
You can use an echo that will run if your command executes successfully. For example, if you are doing an scp ... && echo 'transfer complete', then you can catch this output with a loop:
while True:
    s = chan.recv(4096)
    s = s.decode()
    if 'transfer complete' in s:
        break
    time.sleep(1)
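Applied to the question's invoke_shell code, the sentinel gets chained onto the command before sending it over the shell channel. A sketch, where send_with_sentinel is a hypothetical helper and `channel` is the question's shell channel:

```python
def send_with_sentinel(channel, command, sentinel="transfer complete"):
    # Chain an echo onto the command so the read loop can detect completion;
    # the echo only runs after the command itself has finished successfully
    channel.send("%s && echo '%s'\n" % (command, sentinel))
    buf = ""
    while sentinel not in buf:
        buf += channel.recv(4096).decode()
```

For example, `send_with_sentinel(channel, "/tmp/slp.sh cool >/tmp/slp_log.txt")` would block until the script has finished (or forever if it fails, so a timeout would be sensible in real code).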

Multiple printing to file

I'm a newbie with Python 3.6 and I have a question about printing to a file multiple times using SSH commands.
I'm trying to print multiple show commands after I create the SSH session.
The SSH session is established and the first show command works fine (printed to the created file).
No other command works like the first one.
My code:
import paramiko
host = '192.168.100.1'
user = 'MyUser'
secret = 'MyPass'
port = 22
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy()) #Set policy to use when connecting to servers without a known host key
ssh.connect(hostname=host, username=user, password=secret, port=port)
file = open("output/" + host + ".txt", "w")
stdin, stdout, stderr = ssh.exec_command('sh ver')
output = stdout.readlines()
file.write(''.join(output))
stdin.flush()
stdin, stdout, stderr = ssh.exec_command('sh arp')
output = stdout.readlines()
file.write(''.join(output))
file.close()
I would appreciate some assistance, thanks.
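A loop keeps the repeated exec_command/readlines/write sequence in one place. A minimal sketch, assuming `ssh` is an already-connected paramiko.SSHClient and the output/ directory exists (save_show_output is a hypothetical name):

```python
def save_show_output(ssh, host, commands):
    # Each exec_command opens a fresh channel on the same SSH connection,
    # so independent show commands can be run back to back
    with open("output/" + host + ".txt", "w") as f:
        for cmd in commands:
            _, stdout, _ = ssh.exec_command(cmd)
            f.write(''.join(stdout.readlines()))
```

For example: `save_show_output(ssh, host, ['sh ver', 'sh arp'])`. Note that some network devices only allow one exec channel per connection, in which case an interactive shell (invoke_shell) is needed instead.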

asyncio call works in cmd but not in python

I am using sox to retrieve audio file information. There are two Windows cmd commands which return information correctly:
C:\Users\Me> "path\to\sox.exe" "--i" "path\to\audiofile.wav"
C:\Users\Me> "path\to\sox.exe" "path\to\audiofile.wav" "-n" "stat"
I'm using an asyncio script to run these two commands and collect the data for processing. I use the below code:
async def async_subprocess_command(*args):
    # Create subprocess
    process = await asyncio.create_subprocess_exec(
        *args,
        # stdout must be a pipe to be accessible as process.stdout
        stdout=asyncio.subprocess.PIPE)
    # Wait for the subprocess to finish
    stdout, stderr = await process.communicate()
    # Return stdout
    return stdout.decode().strip()

data1 = await async_subprocess_command(soxExecutable, "--i", audiofilepath)
data2 = await async_subprocess_command(soxExecutable, audiofilepath, "-n", "stat")
As both cmd commands act as expected, I am confused that data2 from the Python script is always blank. data1 is as expected (with data from sox).
Can anyone help me understand why?
For some reason the second command returns its result via stderr. I added an additional parameter to the asyncio.create_subprocess_exec call to connect stderr to asyncio.subprocess.PIPE:
async def async_subprocess_command(*args):
    # Create subprocess
    process = await asyncio.create_subprocess_exec(
        *args,
        # stdout and stderr must be pipes to be accessible on the process object
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE)
    # Wait for the subprocess to finish
    stdout, stderr = await process.communicate()
    # Return stdout and stderr
    return stdout.decode().strip(), stderr.decode().strip()

data1a, data1b = await async_subprocess_command(soxExecutable, "--i", audiofilepath)
data2a, data2b = await async_subprocess_command(soxExecutable, audiofilepath, "-n", "stat")
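Since sox may not be installed everywhere, the fixed coroutine can be checked in a self-contained way by using Python itself as a child process that writes to stderr, the way "sox ... -n stat" does (asyncio.run requires Python 3.7+):

```python
import asyncio
import sys

async def async_subprocess_command(*args):
    process = await asyncio.create_subprocess_exec(
        *args,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE)
    stdout, stderr = await process.communicate()
    return stdout.decode().strip(), stderr.decode().strip()

# Stand-in child that writes only to stderr, like "sox ... -n stat"
out, err = asyncio.run(async_subprocess_command(
    sys.executable, "-c", "import sys; sys.stderr.write('stat output')"))
print(repr(out))
print(repr(err))
```

Here `out` comes back empty and `err` carries the text, mirroring the sox behaviour described above.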

how do I make my python program to wait for the subprocess to be completed

I have a Python program which should execute a command line (the command line is a psexec command that calls a batch file on the remote server).
I used Popen to call the command line. The batch file on the remote server produces a return code of 0.
Now I have to wait for this return code, and based on the return code I should continue my program execution.
I tried to use .wait() and .check_output(), but for some reason they did not work for me.
cmd = """psexec -u CORPORATE\user1 -p force \\\sgeinteg27 -s cmd /c "C:\\Planview\\Interfaces\\ProjectPlace_Sree\\PP_Run.bat" """
p = subprocess.Popen(cmd, bufsize=2048, shell=True,
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE)
p.wait()
print(p.returncode)

# The below block should wait until the above command runs completely,
# and depending on the return code being zero I should continue the rest
# of the execution.
if p.returncode == 0:
    result = tr.test_readXid.readQuery(cert, planning_code)
    print("This is printed depending on the return code being zero")
Here is the end of the batch file execution and the return code.
Can anybody help me with this ?
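As a side note, subprocess.run (Python 3.5+) blocks until the child finishes and exposes returncode directly, which avoids the Popen/wait pairing. A sketch, using a harmless stand-in command in place of the psexec line:

```python
import subprocess
import sys

# Stand-in for the psexec command line; subprocess.run() blocks until the
# child process exits and returns a CompletedProcess with its exit code
completed = subprocess.run(
    [sys.executable, "-c", "import sys; sys.exit(0)"],
    stdout=subprocess.PIPE,
)
if completed.returncode == 0:
    print("return code is zero, continuing")
```

One caveat specific to psexec: it reports the remote command's exit code as its own, so this only works if the batch file itself sets a meaningful exit code.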

how to execute python or bash script through ssh connection and get the return code

I have a Python file at the location /tmp/. This file prints something and returns with exit code 22. I'm able to run this script perfectly with PuTTY, but not with the paramiko module.
This is my execution code:
import paramiko

def main():
    remote_ip = '172.xxx.xxx.xxx'
    remote_username = 'root'
    remote_password = 'xxxxxxx'
    remote_path = '/tmp/ab.py'
    sub_type = 'py'
    commands = ['echo $?']
    ssh_client = paramiko.SSHClient()
    ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh_client.connect(remote_ip, username=remote_username, password=remote_password)
    i, o, e = ssh_client.exec_command('/usr/bin/python /tmp/ab.py')
    print o.read(), e.read()
    i, o, e = ssh_client.exec_command('echo $?')
    print o.read(), e.read()

main()
This is the Python script to be executed on the remote machine:
#!/usr/bin/python
import sys
print "hello world"
sys.exit(20)
I'm not able to understand what is actually wrong with my logic. Also, when I do cd /tmp and then ls, I'm still in the root folder.
The following example runs a command via SSH and then gets the command's stdout, stderr and return code:
import paramiko
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(hostname='hostname', username='username', password='password')
channel = client.get_transport().open_session()
command = "import sys; sys.stdout.write('stdout message'); sys.stderr.write(\'stderr message\'); sys.exit(22)"
channel.exec_command('/usr/bin/python -c "%s"' % command)
channel.shutdown_write()
stdout = channel.makefile().read()
stderr = channel.makefile_stderr().read()
exit_code = channel.recv_exit_status()
channel.close()
client.close()
print 'stdout:', stdout
print 'stderr:', stderr
print 'exit_code:', exit_code
hope it helps
Each time you run exec_command, a new shell subprocess is initiated.
That's why when you run something like:
exec_command("cd /tmp")
exec_command("mkdir hello")
the directory "hello" is created in your home directory, and not inside /tmp.
Try to run several commands in the same exec_command call.
A different way is to use python's os.chdir()
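A minimal sketch of the chained-command approach, assuming `client` is a connected paramiko.SSHClient (run_in_dir is a hypothetical helper name):

```python
def run_in_dir(client, directory, command):
    # exec_command starts a fresh shell each time, so chain the cd
    # together with the dependent command in one call
    chained = 'cd %s && %s' % (directory, command)
    _, stdout, stderr = client.exec_command(chained)
    return stdout.channel.recv_exit_status(), stdout.read(), stderr.read()
```

For example, `run_in_dir(client, '/tmp', 'mkdir hello')` creates /tmp/hello, because both commands run in the same remote shell.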
