Get the output of the lsof command in Python 3.5 - Linux

I am trying to write a script that listens to a directory, waiting for new files, and then sends them to a Nextcloud instance. The files may be big ones, so I want to check that they are complete before sending. I thought about using lsof +D path/to/directory and checking whether each file appears in the output of the command, sending it only when it does not. The code would be something like:
import subprocess

command = list()
command.append("lsof")
command.append("+D")
command.append("/path/to/dir")
lsof = subprocess.check_output(command, stderr=subprocess.STDOUT)
But I get a subprocess.CalledProcessError saying the command returned non-zero exit status 1.
Can someone help me execute the command and get its output into a variable?
EDIT:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python3.5/subprocess.py", line 626, in check_output
**kwargs).stdout
File "/usr/lib/python3.5/subprocess.py", line 708, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['lsof', '+D', '/home/CLI2Cache/sync']' returned non-zero exit status 1

There are a couple of workarounds for this. You can use shell=True in check_output (in that case, pass the command as a single string rather than a list):
lsof = subprocess.check_output("lsof +D /path/to/dir", shell=True, stderr=subprocess.STDOUT)
Please note that shell=True is not safe, as it gives the spawned process access to the full shell, which can lead to shell-injection vulnerabilities if the command is user-supplied or not sanitized properly. Please go through this to understand the risks.
A better way would be to use subprocess.Popen -
lsof = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
try:
    output, errs = lsof.communicate(timeout=20)
except subprocess.TimeoutExpired:
    lsof.kill()
    output, errs = lsof.communicate()
communicate is also useful if you want to send input to the spawned process and read the corresponding output at each step.
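Putting that together for the original use case, here is a minimal sketch. The open_files helper and the 20-second timeout are my own illustrative choices, and it assumes lsof's non-zero exit simply means it found nothing open under the directory (a common reason for the exit status 1 seen above):

import subprocess

def open_files(directory):
    # List files under `directory` that some process currently has open.
    proc = subprocess.Popen(["lsof", "+D", directory],
                            stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out, _ = proc.communicate(timeout=20)
    paths = set()
    for line in out.decode(errors="replace").splitlines()[1:]:  # skip the header row
        parts = line.split()
        if parts:
            paths.add(parts[-1])  # NAME is the last column (paths without spaces assumed)
    return paths

# Example: only hand a file to the Nextcloud upload once nothing holds it open.
# busy = open_files("/path/to/dir")
# if "/path/to/dir/new_file.bin" not in busy:
#     upload("/path/to/dir/new_file.bin")  # upload() is a placeholder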

Related

Python's subprocess hangs (no use of PIPE)

I am calling several bash scripts in a loop:
import subprocess
for script in scripts:
    cmd = './{}'.format(script)
    subprocess.call(cmd, shell=True)
My issue is that a few of the scripts hang, and I don't understand why. It seems as though subprocess is just waiting after the script has finished executing. Note that I am not using stdout=subprocess.PIPE, etc., which the documentation specifically warns can overflow a buffer and cause a related issue.
Any suggestions? I can set a timeout on the process, but would prefer not to resort to that measure.
When I send SIGTERM, the Python traceback reads:
File "/home/ubuntu/anaconda3/lib/python3.6/subprocess.py", line 269, in call
return p.wait(timeout=timeout)
File "/home/ubuntu/anaconda3/lib/python3.6/subprocess.py", line 1457, in wait
(pid, sts) = self._try_wait(0)
File "/home/ubuntu/anaconda3/lib/python3.6/subprocess.py", line 1404, in _try_wait
(pid, sts) = os.waitpid(self.pid, wait_flags)
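For reference, the timeout fallback mentioned above would look roughly like this (the 60-second value is arbitrary, and it kills the shell that runs the script rather than fixing the underlying hang):

import subprocess

for script in scripts:
    cmd = './{}'.format(script)
    try:
        subprocess.call(cmd, shell=True, timeout=60)  # arbitrary limit
    except subprocess.TimeoutExpired:
        # The shell process is killed on timeout; log it and move on.
        print('{} timed out'.format(script))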

Timeout using python subprocess

I've got a very strange error. I load large amounts of data into a PG database using psql. One function in my code does ALL the loads. One part of my codebase calls the load function and it works JUST fine. Another part calls the same function (with different data), and the psql subprocess call hangs until the timeout kills it. Executing the SAME command from the command line works just fine.
Here is my code:
myEnv = os.environ.copy()
myEnv["PGPASSWORD"] = <<db password>>
output = None
output = subprocess.check_output(popenArgs, stderr=subprocess.STDOUT, timeout=120, env=myEnv)
The timeout error is:
Traceback (most recent call last):
File "C:\Data\Dropbox\Engagements\<Client>\Src\prod_db.py", line 102, in _copyFrom
output = subprocess.check_output(popenArgs, stderr=subprocess.STDOUT, timeout=120, env=myEnv)
File "C:\Tools\WinPython-64bit-3.5.1.3\python-3.5.1.amd64\lib\subprocess.py", line 629, in check_output
**kwargs).stdout
File "C:\Tools\WinPython-64bit-3.5.1.3\python-3.5.1.amd64\lib\subprocess.py", line 703, in run
stderr=stderr)
subprocess.TimeoutExpired: Command '"C:/Program Files/PostgreSQL/9.4/bin/psql.exe" -h <<DB LOCATION>> -p 5432 -d forecast_dev -U forecast -v ON_ERROR_STOP=1 -AtXwa -c "\copy di_entities_load from C:\Users\Marc\AppData\Local\Temp\copytempdi_entities_load7.csv with csv"' timed out after 120 seconds
If I set the PGPASSWORD variable manually in a command-line window and then copy and paste the command from the error into the command line, it runs fine and quickly, exiting and returning upon completion.
As I said, this code works when called from a different part of my application, to a different table, with different data.
Any idea what would cause this to fail when called from python, while it still works on the command line?
Windows 10 Pro, Py3.5, PG 9.4, pg8000 db module.
So, it turns out that I was locking the table prior to the copy, and since the copy runs in a different process, it was blocked waiting on the locked table and hung.
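A minimal sketch of the resulting fix, assuming the lock was taken through a pg8000 connection called conn in the same process (the name is illustrative):

# The open transaction in this process holds the table lock; psql connects in a
# separate session, so its \copy blocks on that lock. Release it first.
conn.commit()

output = subprocess.check_output(popenArgs, stderr=subprocess.STDOUT,
                                 timeout=120, env=myEnv)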

Python 3.4 - Nmap requires root privileges

Running on Mac OS 10.10.5.
Running this script to scan for hosts on the network:
import nmap
nm = nmap.PortScanner()
nm.scan('192.168.5.1/24', arguments='-O')
for h in nm.all_hosts():
    if 'mac' in nm[h]['addresses']:
        print(nm[h]['addresses'], nm[h]['vendor'])
When I run it, it prints:
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/nmap/nmap.py", line 290, in analyse_nmap_xml_scan
dom = ET.fromstring(self._nmap_last_output)
File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/xml/etree/ElementTree.py", line 1326, in XML
return parser.close()
File "<string>", line None
xml.etree.ElementTree.ParseError: no element found: line 1, column 0
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/*/Documents/*.py", line 3, in <module>
nm.scan('192.168.0.0/24', arguments='-O')
File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/nmap/nmap.py", line 235, in scan
nmap_err_keep_trace = nmap_err_keep_trace)
File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/nmap/nmap.py", line 293, in analyse_nmap_xml_scan
raise PortScannerError(nmap_err)
nmap.nmap.PortScannerError: 'TCP/IP fingerprinting (for OS scan) requires root privileges.\nQUITTING!\n'
I tried going to that directory and running the script with sudo in the terminal:
sudo python *.py
That works and prints:
({'mac': '02:62:31:41:6D:84', 'ipv4': '192.168.5.1'}, {})
Any suggestions for running the script from the Python IDLE?
Running IDLE as root (sudo idle) might work, but it might not be a great idea.
Option 1 (recommended):
Put the code requiring elevated privileges in a Python file which you run with sudo. I assume you want to play with the results, so you could have the script save them to a file, which you then read in IDLE.
The following code works in Python 2.7 and 3.4:
import nmap
import json

nm = nmap.PortScanner()
nm.scan('192.168.5.1/24', arguments='-O')  # Note that I tested with -sP to save time
output = []
with open('output.txt', 'a') as outfile:
    for h in nm.all_hosts():
        if 'mac' in nm[h]['addresses']:
            item = nm[h]['addresses']
            if nm[h]['vendor'].values():
                item['vendor'] = list(nm[h]['vendor'].values())[0]
            output.append(item)
    json.dump(output, outfile)
Run sudo python nmaproot.py
Since the file is written by root, you need to change ownership back to yourself:
sudo chown myusername output.txt
In IDLE:
import json
input = open('output.txt', 'r')
json_data = json.load(input)
json_data[0] # first host
Option 2 (not recommended at all):
Use subprocess to run the file with the elevated code as root and return the output. It gets kind of messy and requires you to hardcode your password...but it's possible.
from subprocess import Popen, PIPE
cmd = ['sudo', '-S', 'python', 'nmaproot.py']
sudopass = 'mypassword'
p = Popen(cmd, stdin=PIPE, stdout=PIPE, stderr=PIPE, universal_newlines=True)
output = p.communicate(sudopass + '\n')
I'm unsure of how you can run a given portion of your python code as root without saving it to a file and running it separately. I recommend you go with option 1 as option 2 isn't very good (but it was fun to figure out).
Copy the IDLE desktop shortcut, name it rootidle, then right-click and change its properties. Go to the Desktop Entry section and add gksu before /usr/bin/idle3. Then load and run the program.
Maybe this might help someone here; I found it on another site. Pass sudo=True to the scan call:
scanner.scan(ip_addr, '1-1024', '-v -sS', sudo=True)
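Applied to the script from the question, that would look roughly like the sketch below (assuming your python-nmap version accepts the sudo keyword and that sudo can run without an interactive password prompt in this context):

import nmap

nm = nmap.PortScanner()
# sudo=True makes python-nmap prefix the nmap invocation with sudo,
# which the -O (OS detection) scan needs.
nm.scan('192.168.5.1/24', arguments='-O', sudo=True)
for h in nm.all_hosts():
    if 'mac' in nm[h]['addresses']:
        print(nm[h]['addresses'], nm[h]['vendor'])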

Execute a command on the Linux terminal using subprocess in Python

I want to execute the following command on a Linux terminal using a Python script:
hg log -r "((last(tag())):(first(last(tag(),2))))" work
This command gives the changesets between the last two tags that affect files in the "work" directory.
I tried:
import subprocess
releaseNotesFile = 'diff.txt'
with open(releaseNotesFile, 'w') as f:
    f.write(subprocess.call(['hg', 'log', '-r', '"((last(tag())):(first(last(tag(),2))))"', 'work']))
error:
abort: unknown revision '((last(tag())):(first(last(tag(),2))))'!
Traceback (most recent call last):
File "test.py", line 4, in <module>
f.write(subprocess.call(['hg', 'log', '-r', '"((last(tag())):(first(last(tag(),2))))"', 'work']))
TypeError: expected a character buffer object
It works with os.popen():
with open(releaseNotesFile, 'w') as file:
    f = os.popen('hg log -r "((last(tag())):(first(last(tag(),2))))" work')
    file.write(f.read())
How to execute that command using subprocess ?
To solve your problem, drop the extra double quotes around the revset and capture the output with check_output (subprocess.call only returns the exit status as an int, which is why f.write raised TypeError):
f.write(subprocess.check_output(['hg', 'log', '-r', '((last(tag())):(first(last(tag(),2))))', 'work']).decode())
Explanation
When you call a program from a command line (like bash), the shell strips the " characters before the program sees its arguments. The two commands below are equivalent:
hg log -r something
hg "log" "-r" "something"
In your specific case, the revset has to be enclosed in double quotes in the shell because it contains parentheses, and those have a special meaning in bash. In Python that is not necessary, since subprocess passes each list element to hg verbatim, with no shell processing in between.
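A quick way to check what the shell actually hands to hg is shlex.split, which applies the same quoting rules; it shows the double quotes disappearing while the parentheses stay intact:

import shlex

print(shlex.split('hg log -r "((last(tag())):(first(last(tag(),2))))" work'))
# ['hg', 'log', '-r', '((last(tag())):(first(last(tag(),2))))', 'work']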

Subprocess library won't execute compgen

I am trying to make a list of every command available on my Linux (Lubuntu) machine, and I would like to work with that list in Python. Normally, to list the commands in the console, I would run compgen -c and it would print the results to stdout.
I would like to execute that command using the Python subprocess library, but it gives me an error and I don't know why.
Here is the code:
#!/usr/bin/python
import subprocess
#get list of available linux commands
l_commands = subprocess.Popen(['compgen', '-c'])
print l_commands
Here is the error I'm getting:
Traceback (most recent call last):
File "commands.py", line 6, in <module>
l_commands = subprocess.Popen(['compgen', '-c'])
File "/usr/lib/python2.7/subprocess.py", line 679, in __init__
errread, errwrite)
File "/usr/lib/python2.7/subprocess.py", line 1249, in _execute_child
raise child_exception
OSError: [Errno 2] No such file or directory
I'm stuck. Could you guys help me with this? How do I execute the compgen command using subprocess?
compgen is a bash builtin, so run it through the shell:
from subprocess import check_output
output = check_output('compgen -c', shell=True, executable='/bin/bash')
commands = output.splitlines()
You could also write it as:
output = check_output(['/bin/bash', '-c', 'compgen -c'])
But it puts the essential part (compgen) last, so I prefer the first variant.
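As a small follow-on (my own illustration, not part of the original answer): in Python 3 check_output returns bytes, so decode before comparing, and compgen may list some names more than once, so de-duplicating can help:

from subprocess import check_output

output = check_output('compgen -c', shell=True, executable='/bin/bash')
commands = sorted(set(output.decode().splitlines()))  # decode bytes, drop duplicates
print(len(commands), 'distinct commands found')
print('ls' in commands)  # quick sanity check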
I'm not sure what compgen is, but that path needs to be absolute. When I use subprocess, I spell out the exact path: /absolute/path/to/compgen
