Python os.system - set max time execution - python-3.x

I have a simple function which is meant to parse IPs and run ipwhois on a Windows machine, printing the output to several txt files:
for ip in unique_ip:
    os.system('c:\\whois64 -v ' + ip + ' > ' + 'C:\\ipwhois\\' + ip + '.txt')
It happens that this os.system call gets stuck and the entire process freezes.
Question: is it possible to set a maximum execution time on an os.system command?
EDIT
This works:
import shlex
import subprocess

def timeout_test():
    command_line = 'whois64 -v xx.xx.xx.xx'
    args = shlex.split(command_line)
    print(args)
    try:
        with open('c:\\test\\iptest.txt', 'w') as fp:
            subprocess.run(args, stdout=fp, timeout=5)
    except subprocess.TimeoutExpired:
        print('process ran too long')
        return True
test = timeout_test()

You can pass a timeout argument to subprocess.call, which works almost the same way as os.system. The timeout is measured in seconds from the start of the process.
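For instance, a minimal sketch using `sleep` as a stand-in for the long-running whois command (the whois binary itself is not assumed here):

```python
import subprocess

# subprocess.call accepts a timeout in seconds; when it expires,
# the child process is killed and TimeoutExpired is raised.
try:
    rc = subprocess.call(['sleep', '10'], timeout=1)
    timed_out = False
except subprocess.TimeoutExpired:
    timed_out = True
    print('process ran too long')
```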

Related

Run linux shell commands in Python3

I am creating a service manager to manage services such as apache, tomcat, etc.
I can enable/disable services with srvmanage.sh enable <service_name> in a shell.
I want to do this from a Python script. How can I do it?
service_info = ServiceDB.query.filter_by(service_id=service_id).first()
service_name = service_info.service
subprocess.run(['/home/service_manager/bin/srvmanage.sh enable', service_name],shell=True)
what is the problem with this code ?
I'm guessing that if you want to do this in Python you may want more functionality; if not, #programandoconro's answer would do. However, you could also use the subprocess module to get more functionality. It lets you run a command with arguments and returns a CompletedProcess instance. For example:
import subprocess
# call to shell script
process = subprocess.run(['/path/to/script', 'arg1', 'arg2'], capture_output=True, text=True)
You can add in additional functionality by capturing the stderr/stdout and return code. For example:
# call to shell script
process = subprocess.run(['/path/to/script', 'arg1', 'arg2'], capture_output=True, text=True)
# Return code test case: a successful run should return 0
if process.returncode != 0:
    print('There was a problem')
    exit(1)
The docs for subprocess describe this in more detail.
You can use the os module to run system commands.
import os
os.system("srvmanage.sh enable <service_name>")
I fixed this issue:
operation = 'enable'
service_operation_script = '/home/service_manager/bin/srvmanage.sh'
service_status = subprocess.check_output(
    "sudo " + "/bin/bash " + service_operation_script + " " + operation + " " + service_name,
    shell=True)
response = service_status.decode("utf-8")
print(response)
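As a side note on why the earlier list-plus-shell=True attempt failed: when shell=True is combined with a list, only the first element is used as the command and the remaining elements become arguments to the shell itself, so the service name never reaches the script. Passing a plain list without shell=True hands each argument to the program directly. A minimal sketch with /bin/echo standing in for the real srvmanage.sh:

```python
import subprocess

# Each argument is its own list element; no shell involved.
# '/bin/echo' is a stand-in for the real srvmanage.sh script.
service_name = 'tomcat'
service_status = subprocess.check_output(['/bin/echo', 'enable', service_name])
response = service_status.decode('utf-8').strip()
print(response)  # enable tomcat
```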

wait till command completed in paramiko invoke_shell() [duplicate]

This question already has answers here:
Execute multiple dependent commands individually with Paramiko and find out when each command finishes
(1 answer)
Executing command using "su -l" in SSH using Python
(1 answer)
I want to wait until the given command has finished executing on the remote machine. In this case it just executes, returns, and does not wait for completion.
import paramiko
import re
import time
def scp_switch(host, username, PasswdValue):
    ssh = paramiko.SSHClient()
    try:
        # Log into the remote host with my credentials
        ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        ssh.connect(host, username=username, password=PasswdValue, timeout=30)
        try:
            # Switch to powerbroker/root mode
            command = "pbrun xyz -u root\n"
            channel = ssh.invoke_shell()
            channel.send(command)
            time.sleep(3)
            while not re.search('Password', str(channel.recv(9999), 'utf-8')):
                time.sleep(1)
                print('Waiting...')
            channel.send("%s\n" % PasswdValue)
            time.sleep(3)
            # Execute the command on the remote host as root (after the switch)
            # I don't have any specific keyword to search for in the output, hence no while loop here.
            cmd = "/tmp/slp.sh cool >/tmp/slp_log.txt \n"
            print('Executing %s' % cmd)
            channel.send(cmd)  # It does not wait here until the process has completed
            time.sleep(3)
            res = str(channel.recv(1024), 'utf-8')
            print(res)
            print('process completed')
        except Exception as e:
            print('Error while switching:', str(e))
    except Exception as e:
        print('Error while SSH: %s' % str(e))
    ssh.close()

""" Provide the host and credentials here """
HOST = 'abcd.us.domain.com'
username = 'heyboy'
password = 'passcode'
scp_switch(HOST, username, password)
As per my research it does not return any status code, so is there any way to get the return code and wait until the process has completed?
I know this is an old post, but leaving this here in case someone has the same problem.
You can append an echo that runs only if your command executes successfully, for example scp ... && echo 'transfer done', and then catch that output with a loop:
while True:
    s = chan.recv(4096)
    s = s.decode()
    if 'transfer done' in s:
        break
    time.sleep(1)
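The same marker idea can carry the exit status too: have the remote shell echo something like RC=$? right after the command, then parse it out of the accumulated channel output. A sketch of just the parsing side (the channel-reading loop is assumed to be the one above; the RC= marker name is an arbitrary choice):

```python
import re

def extract_exit_code(buffer):
    """Find an 'RC=<n>' marker echoed by the remote shell after the command."""
    match = re.search(r'RC=(\d+)', buffer)
    return int(match.group(1)) if match else None

# Example: output accumulated from channel.recv() ends with the marker
output = "starting job...\njob finished\nRC=0\n"
print(extract_exit_code(output))  # 0
```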

Setting timeout when using os.system function

Firstly, I'd like to say that I have just begun to learn Python, and I want to execute a Maven command inside my Python script (see the partial code below):
os.system("mvn surefire:test")
Unfortunately, this command sometimes hangs, so I want to know how to set a timeout threshold to control it.
That is to say, if the execution time goes beyond X seconds, the program should skip the command.
Also, are there other useful solutions to this problem? Thanks in advance!
Use the subprocess module instead. By passing a list and sticking with the default shell=False, we can simply kill the process when the timeout hits:
import subprocess

p = subprocess.Popen(['mvn', 'surefire:test'])
try:
    p.wait(my_timeout)
except subprocess.TimeoutExpired:
    p.kill()
Also, you can use the terminal's timeout command, like this:
import os
os.system('timeout 5s [Type Command Here]')
You can use s, m, h, d for seconds, minutes, hours, days.
You can also send a different signal to the command. To learn more, see:
https://linuxize.com/post/timeout-command-in-linux/
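For instance, GNU coreutils timeout (assumed to be available, as on most Linux systems) reports exit status 124 when the limit was hit:

```shell
# Limit a 10-second sleep to 1 second; timeout sends SIGTERM by default,
# and -s picks another signal (e.g. timeout -s KILL 1 ...).
timeout 1 sleep 10
echo "exit status: $?"   # prints: exit status: 124
```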
Simple answer
os.system does not support a timeout.
You can use Python 3's subprocess module instead, which supports a timeout parameter, such as:
import subprocess

yourCommand = "mvn surefire:test"
timeoutSeconds = 5
subprocess.check_output(yourCommand, shell=True, timeout=timeoutSeconds)
Detailed Explanation
Going further, I have encapsulated this into a getCommandOutput function for you:
def getCommandOutput(consoleCommand, consoleOutputEncoding="utf-8", timeout=2):
    """Get command output from the terminal.

    Args:
        consoleCommand (str): console/terminal command string
        consoleOutputEncoding (str): console output encoding, default is utf-8
        timeout (int): max seconds to wait for the console command
    Returns:
        (bool, str): whether the command succeeded, and its output
    """
    isRunCmdOk = False
    consoleOutput = ""
    try:
        consoleOutputByte = subprocess.check_output(consoleCommand, shell=True, timeout=timeout)
        consoleOutput = consoleOutputByte.decode(consoleOutputEncoding)  # e.g. '640x360\n'
        consoleOutput = consoleOutput.strip()  # '640x360'
        isRunCmdOk = True
    except subprocess.CalledProcessError as callProcessErr:
        cmdErrStr = str(callProcessErr)
        print("Error %s for run command %s" % (cmdErrStr, consoleCommand))
    return isRunCmdOk, consoleOutput
Demo:
isRunOk, cmdOutputStr = getCommandOutput("mvn surefire:test", timeout=5)

how do I make my python program to wait for the subprocess to be completed

I have a Python program which should execute a command line (the command line is a psexec command that calls a batch file on the remote server).
I used Popen to call the command line. The batch file on the remote server produces a return code of 0.
Now I have to wait for this return code, and based on the return code I should continue my program execution.
I tried to use .wait() or .check_output(), but for some reason they did not work for me.
cmd = """psexec -u CORPORATE\user1 -p force \\\sgeinteg27 -s cmd /c "C:\\Planview\\Interfaces\\ProjectPlace_Sree\\PP_Run.bat" """
p = subprocess.Popen(cmd, bufsize=2048, shell=True,
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE)
p.wait()
print(p.returncode)

## The block below should wait until the above command has run completely,
## and depending on the return code being ZERO I should continue the rest
## of the execution.
if p.returncode == 0:
    result = tr.test_readXid.readQuery(cert, planning_code)
    print("This is printed depending if the return code is zero")
Here is the end of the batch file execution and the return code.
Can anybody help me with this ?
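A hedged note on one likely cause: with stdout=subprocess.PIPE, wait() can deadlock once the OS pipe buffer fills up, because nothing is reading the output; communicate() both drains the pipes and waits for the process to exit. A minimal sketch with a stand-in child process:

```python
import subprocess
import sys

# communicate() reads stdout to EOF and waits for the child,
# avoiding the pipe-buffer deadlock that plain wait() can hit.
p = subprocess.Popen([sys.executable, '-c', 'print("done")'],
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE)
out, _ = p.communicate()
print(p.returncode)          # 0 on success
print(out.decode().strip())
```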

Execute python scripts from another python script opening another shell

I'm using Python 3. I need one script to call the other and run it in a different shell, without passing arguments. I'm using Mac OS X, but I need it to be cross-platform.
I tried:
os.system('script2.py')
subprocess.Popen('script2.py', shell=True)
os.execl(sys.executable, "python3", 'script2.py')
But none of them accomplish what I need.
I use the second script to get inputs, while the first one handles the outputs...
EDIT
This is the code on my second script:
import sys
import os
import datetime
os.remove('Logs/consoleLog.txt')
try:
    os.remove('Temp/commands.txt')
except:
    ...
stopSim = False
command = ''
okFile = open('ok.txt', 'w')
okFile.write('True')
consoleLog = open('Logs/consoleLog.txt', 'w')
okFile.close()
while not stopSim:
    try:
        sysTime = datetime.datetime.now()
        stringT = str(sysTime)
        split1 = stringT.split(" ")
        split2 = split1[0].split("-")
        split3 = split1[1].split(":")
        for i in range(3):
            split2.append(split3[i])
        timeString = "{0}-{1}-{2} {3}:{4}".format(split2[2], split2[1], split2[0], split2[3], split2[4])
    except:
        timeString = "Time"
    commandFile = open('Temp/commands.txt', 'w')
    command = input(timeString + ": ")
    command = command.lower()
    consoleLog.write(timeString + ': ' + command + "\n")
    commandFile.write(command)
    commandFile.close()
    if command == 'stop simulation' or command == 'stop sim':
        stopSim = True
consoleLog.close()
os.remove('Temp/commands.txt')
and this is where I call the other script and wait for it to become operative in script 1:
# Open console
while not consoleOpent:
    try:
        okFile = open('ok.txt', 'r')
        c = okFile.read()
        if c == 'True':
            consoleOpent = True
    except:
        ...
Sorry for the long question...
Any suggestion to improve the code is welcomed.
Probably the easiest solution is to make the contents of your second script a function in the first script, and execute it as a multiprocessing Process. Note that you can use e.g. multiprocessing.Pipe or multiprocessing.Queue to exchange data between the different processes. You can also share values and arrays via multiprocessing.sharedctypes.
This will be platform-dependent. Here a solution for Mac OS X.
Create new file run_script2 with this content:
/full/path/to/python /full/path/to/script2.py
Make it executable: chmod +x run_script2
Run from Python with:
os.system('open -a Terminal run_script2')
Alternatively, you can use subprocess.call:
subprocess.call(['open -a Terminal run_script2'], shell=True)
On Windows you can do something similar with (untested):
os.system('start cmd /D /C "python script2.py && pause"')
