I am trying to execute commands stored in one file against hosts whose credentials are stored in another file. The goal is to pull the list of commands from the command file and execute each of them on the hosts listed in the credential file.
The output should be redirected to another file. I wrote some code and got it working for one device and one command, but the output prints to the screen with \n shown literally on every line instead of the desired formatting.
For now I have placed the command in the same file, but my goal is to process the commands line by line: execute the first command, then the next one, and so on.
Please help me resolve this; I am writing it in Python.
HOST_CRED.txt:
Hostname <IP> <USERID> <PASSWORD> <command>
import socket
import subprocess
from subprocess import Popen, PIPE, STDOUT
with open("HOST_CRED.txt") as f:
    for line in f:
        x = line.split()
        cmd = 'plink.exe -ssh ' + x[2] + '@' + x[1] + ' -pw ' + x[3] + ' ' + x[4]
        p = Popen(cmd, stdout=PIPE)
        output = p.stdout.read()
        print(output)
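The literal \n in the printed output is characteristic of printing a bytes object under Python 3: p.stdout.read() returns bytes, and print shows the bytes repr with escape sequences instead of real line breaks. Decoding first fixes it. A minimal stand-alone sketch, using illustrative sample bytes in place of the real plink output:

```python
# Stand-in for p.stdout.read(), which returns bytes under Python 3
raw = b"eth0: up\neth1: down\n"

print(raw)    # prints the bytes repr, with \n as two literal characters

text = raw.decode('utf-8')  # decode the bytes into a str
print(text)   # now each \n renders as an actual line break
```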
I have written a Python script to search through Shodan; it produces a file containing an IP list, one IP per line.
Here is my code:
import shodan
SHODAN_API="YOUR_SHODAN_API"
api = shodan.Shodan(SHODAN_API)
try:
    # Search using Shodan
    results = api.search('EXAMPLE')
    # Show the results
    print('Results found: %s' % results['total'])
    for result in results['matches']:
        print('%s' % result['ip_str'])
        # The following lines can be uncommented for more information.
        # Don't uncomment them if you are using scanning methods on the results.
        # print(result['data'])
        # print('')
except shodan.APIError as e:
    print('Error: %s' % e)
I was wondering if there is any way to automate the task of running my code and then scanning the IP list with an external script, or something similar that works on OS X and Linux?
You can simply use a bash script like the following one:
#!/bin/bash
python ShodanSearch.py >> IPResult.txt
while read -r line
do
    sudo nmap -n -Pn -sV -p 80,8080 -oG - "$line" >> NResult.txt
done < IPResult.txt
As an alternative to the solution above, you could also execute nmap from within your Python script, either with the os module to run shell commands or with the now-preferred subprocess module. I haven't personally used the latter, but it can definitely do what you want.
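The same loop can be sketched with the subprocess module. This assumes nmap is installed and on the PATH, and the helper name below is my own, not part of any library:

```python
import subprocess

def build_nmap_cmd(ip):
    # List form mirrors the bash flags and avoids shell quoting entirely
    return ['nmap', '-n', '-Pn', '-sV', '-p', '80,8080', '-oG', '-', ip]

def scan_ips(ip_file, out_file):
    """Scan each IP (one per line) in ip_file, appending nmap's
    grepable output to out_file."""
    with open(ip_file) as ips, open(out_file, 'a') as out:
        for line in ips:
            ip = line.strip()
            if ip:
                # -oG - sends grepable output to stdout, which is
                # redirected into the results file
                subprocess.run(build_nmap_cmd(ip), stdout=out)
```

You would call it as scan_ips('IPResult.txt', 'NResult.txt'). Unlike the bash version there is no sudo here, so run the Python script itself with the needed privileges.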
I am trying to use the subprocess module of Python 3 to call a command (i.e. netstat -ano > output.txt), but when I run the script, the output file gets created, yet nothing gets written into it; in other words, it's just blank.
I've tried looking into the subprocess module API for how the subprocess.call() method works, and searching Google for a solution. I tried the subprocess.check_output() method, but it printed the result as one unformatted string rather than the column layout that entering netstat -ano into the Windows command prompt usually gives.
This is my current code:
import subprocess as sp
t = open('output.txt', 'w')
command = 'netstat -ano > output.txt'
cmd = command.split(' ')
sp.call(cmd) # sp.call(['netstat', '-ano', '>', 'output.txt'])
t.close()
I thought it was maybe because I didn't use the write() method. But when I changed my code to be
t.write(sp.call(cmd))
I would get the error that the write() method expects a string input, but received an int.
I expected the output to give me what I would normally see if I were to open the command prompt (in Windows 10) and type netstat -ano > output.txt, which would generate a file called "output.txt" containing the output of my netstat command.
However, when I run that command through my current script, it creates the 'output.txt' file, but nothing is written in it.
The redirection operator (> output.txt) is a feature of the shell, not of netstat. subprocess runs the program directly, without a shell, so > and output.txt are handed to netstat as literal arguments and no redirection ever happens; meanwhile open('output.txt', 'w') truncates the file, which is why it exists but stays empty. Drop the redirection from the command and let Python connect the file to the process's standard output instead:
import subprocess as sp

with open('output.txt', 'w') as t:
    sp.call(['netstat', '-ano'], stdout=t)
As for the error you saw: sp.call() returns the process's exit code, which is an int. That is why t.write(sp.call(cmd)) complained about receiving an int instead of a string.
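If you want the output back in Python rather than in a file, check_output() with text mode enabled returns a properly decoded string; the "unformatted" result described in the question is what the raw bytes repr looks like. Since netstat is platform-specific, this sketch uses the Python interpreter itself as a stand-in command:

```python
import subprocess as sp
import sys

# Any command works here; sys.executable stands in for netstat
out = sp.check_output([sys.executable, '-c', 'print("Proto  Local Address")'],
                      universal_newlines=True)  # decode bytes to str
print(out)  # prints with real line breaks, not \n escapes
```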
I have a Python program which calls a shell script through the subprocess module. I am looking for a way to pass a simple file as an input to the shell script. Does this happen through subprocess and Popen?
I have tried this code for an AWS Lambda function.
It would be nice/helpful if you could share some excerpt of your code in your question, but assuming bits of it, here is a way to achieve this:
import shlex
import logging
from subprocess import PIPE, Popen

logger = logging.getLogger(__name__)

def run_script(script_path, script_args):
    """
    Run a shell script.
    :param script_path: String: the path of the script that needs to be called
    :param script_args: String: the arguments needed by the shell script
    :return: True on success
    """
    logger.info("Running bash script {script} with parameters: {params}".format(script=script_path, params=script_args))
    # Join the path and args with a whitespace before splitting, because
    # the path gets distorted if the args are appended without it
    session = Popen(shlex.split(script_path + " " + script_args), stderr=PIPE, stdout=PIPE, shell=False)
    stdout, stderr = session.communicate()
    # stdout and stderr are bytes, so decode them to get proper Python strings
    logger.debug(stdout.decode('utf-8'))
    if stderr:
        logger.error(stderr)
        raise Exception("Error " + stderr.decode('utf-8'))
    return True
Now a couple of things to note here:
Your bash script should handle its arguments properly, whether they are positional ($1) or named parameters like --file or -f.
Pass all the parameters you want in the single string handed to shlex.split().
Also note the comments mentioned in the code above.
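To the original question about passing a file to the shell script: hand the file's path to the script as an ordinary argument. A self-contained sketch for a POSIX system; the throwaway script and input file here are generated purely for illustration:

```python
import subprocess
import tempfile

# A throwaway shell script that reads the file given to it as $1
script = tempfile.NamedTemporaryFile('w', suffix='.sh', delete=False)
script.write('#!/bin/sh\ncat "$1"\n')
script.close()

# The input file we want the script to consume
data = tempfile.NamedTemporaryFile('w', suffix='.txt', delete=False)
data.write('hello from the input file')
data.close()

# The file path is passed as a plain argument; the script sees it as $1
result = subprocess.run(['sh', script.name, data.name],
                        capture_output=True, text=True)
print(result.stdout)
```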
This is a simple script that logs into a Linux box and runs a grep against a resource on the box. I just need to be able to view the output of the command I execute, but nothing is returned. The code doesn't report any error, yet the desired output is not written.
Below is the code:
import socket
import re
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('linux_box', port=22, username='abc', password='xyz')
stdin, stdout, stderr = ssh.exec_command('grep xyz *.set')
output2 = stdout.readlines()
type(output2)
This is the output I get:
C:\Python34\python.exe C:/Python34/Paramiko.py
Process finished with exit code 0
You never actually print anything to standard output.
Changing the last line to print(output2) should print the value correctly.
Your code was likely based on experiments in the interactive Python shell, where the value of the last evaluated expression is implicitly printed to standard output. In non-interactive mode no such behavior occurs; that's why you need to call the print function explicitly.
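To illustrate the point, here is a stand-alone sketch using io.StringIO as a stand-in for the stdout channel that exec_command() returns (the sample grep output is invented):

```python
import io

# Stand-in for the stdout file-like object from ssh.exec_command()
fake_stdout = io.StringIO('config.set:xyz=1\nbackup.set:xyz=2\n')

output2 = fake_stdout.readlines()  # a list of lines; evaluating it prints nothing in a script
print(''.join(output2))            # an explicit print is what actually shows the grep output
```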
I want to execute the following command in a Linux terminal using a Python script:
hg log -r "((last(tag())):(first(last(tag(),2))))" work
This command gives the changesets between the last two tags that affected files in the "work" directory.
I tried:
import subprocess
releaseNotesFile = 'diff.txt'
with open(releaseNotesFile, 'w') as f:
    f.write(subprocess.call(['hg', 'log', '-r', '"((last(tag())):(first(last(tag(),2))))"', 'work']))
error:
abort: unknown revision '((last(tag())):(first(last(tag(),2))))'!
Traceback (most recent call last):
File "test.py", line 4, in <module>
f.write(subprocess.call(['hg', 'log', '-r', '"((last(tag())):(first(last(tag(),2))))"', 'work']))
TypeError: expected a character buffer object
It works with os.popen():
import os

with open(releaseNotesFile, 'w') as file:
    f = os.popen('hg log -r "((last(tag())):(first(last(tag(),2))))" work')
    file.write(f.read())
How can I execute that command using subprocess?
To solve your problem, change the f.write(subprocess... line to:
subprocess.call(['hg', 'log', '-r', '((last(tag())):(first(last(tag(),2))))', 'work'], stdout=f)
Two things changed: the embedded double quotes are gone from the revision argument, and the hg output now goes to the file via stdout=f. (subprocess.call() returns the exit code, which is an int; passing that to f.write() is what caused the TypeError.)
Explanation
When you call a program from a command line like bash, the shell strips the " characters before the program ever sees them. The two commands below are equivalent:
hg log -r something
hg "log" "-r" "something"
In your specific case the revision expression has to be enclosed in double quotes in the shell because the parentheses have a special meaning in bash. In Python that is not necessary: subprocess passes each list element to hg verbatim, with no shell in between.
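The shell's quote handling can be reproduced with Python's shlex module, which follows the same tokenizing rules; this shows exactly which argument strings hg receives:

```python
import shlex

# shlex.split applies shell-style quoting rules: the double quotes
# group the expression into a single token and are then removed,
# just as bash does before invoking hg.
args = shlex.split('hg log -r "((last(tag())):(first(last(tag(),2))))" work')
print(args)
```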