I need to run an external C program using subprocess from a Python script.
Here are the tricky parts:
The external program needs to run with superuser privileges, so I need to ask the user for their password to make sure they are allowed to run it.
The external program might run for an extremely long time (sometimes days), and the user might need to terminate it manually with CTRL+C.
The output of the C program should be printed to the screen, so the subprocess's stdout has to end up on the terminal.
This is what I've tried:
import subprocess
import getpass

try:
    proc = subprocess.Popen(c_program, shell=True,
                            stdin=subprocess.PIPE, stderr=subprocess.STDOUT)
    # needs the sudo password to run this program
    proc.communicate(getpass.getpass())
except KeyboardInterrupt:
    print "you stopped c_program, script will now go on..."
    # do other things
Unfortunately, this causes two problems which I cannot seem to fix:
Pressing CTRL+C while c_program is executing does absolutely nothing: it doesn't stop the execution of c_program or even the Python script itself, it is simply ignored.
If the script does finish its run, the terminal starts misbehaving: it no longer displays what the user types, but it still responds to it (for example, typing ls -l shows nothing on the screen, but after pressing Enter the output of the command is printed).
I'm using Python 2.7 on Ubuntu.
Please help :)
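For illustration, here is a minimal sketch of one way the same idea could be written so that Ctrl+C reaches the child and the terminal echo is restored afterwards. This is an assumption-laden sketch, not a verified fix: the c_program path, the sudo -S invocation, the prompt text, and the termios restore are all guesses, and it targets Python 2.7 to match the question.

import getpass
import signal
import subprocess
import sys
import termios

c_program = "./c_program"  # hypothetical path standing in for the real binary

# Remember the terminal settings so echo can be restored even if the
# child (or sudo) leaves the terminal in a strange state.
saved_tty = termios.tcgetattr(sys.stdin)

# "sudo -S" makes sudo read the password from stdin instead of the TTY.
proc = subprocess.Popen(["sudo", "-S", c_program],
                        stdin=subprocess.PIPE, stderr=subprocess.STDOUT)
try:
    proc.stdin.write(getpass.getpass("sudo password: ") + "\n")
    proc.stdin.flush()
    proc.wait()
except KeyboardInterrupt:
    proc.send_signal(signal.SIGINT)  # forward Ctrl+C; sudo relays it to the command
    proc.wait()
    print "you stopped c_program, script will now go on..."
finally:
    # Restore echo and the other terminal modes the child may have changed.
    termios.tcsetattr(sys.stdin, termios.TCSADRAIN, saved_tty)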
Related
I would like to run a batch of bash commands (all together) in a server shell from a python3 script on my local machine.
The reason I'm not running the python3 script on the server itself is that I can't recreate the same environment there, and I want to keep the settings I have on my machine while executing the script.
What I would like to do is:
- Run Python commands locally
- At a certain point, run those commands on the server
- Wait for the server-side execution to finish
- Continue running the Python script
(This will be done in a loop.)
What I'm trying is to put all the commands in a bash script, ssh_commands.sh, and use the following command:
subprocess.call('cat ssh_commands.sh | ssh -T -S {} -p {} {}'.format(socket, port, user_host).split(),shell=True)
But when the execution of the script reaches that line, it gets stuck until subprocess.call times out. The execution of the commands themselves shouldn't take anywhere near that long. The only way to stop the script earlier is with Ctrl+C.
I've also tried setting up the ssh connection in the ~/.ssh/config file, but I get the same result.
I know the ssh connection itself works fine, and if I run ssh_commands.sh on the server manually it runs without any problem.
Can somebody suggest:
- A way to fix what I'm trying to do
- A better way to achieve the final result described above
- A way to debug what the problem could be
Thank you in advance
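For reference, here is a minimal sketch of two ways the call above could be written so that it does not hang; the connection values are placeholders, not real settings. A likely reason for the hang: when shell=True is combined with .split(), the shell receives only "cat" as its command (the remaining words become arguments to the shell itself), and cat with no arguments sits waiting on stdin forever.

import subprocess

# Placeholder connection details standing in for socket, port, user_host above.
socket, port, user_host = "/tmp/ssh-ctrl.sock", 22, "user@server"

# Option 1: keep the shell pipeline, but pass it as a single string.
cmd = 'cat ssh_commands.sh | ssh -T -S {} -p {} {}'.format(socket, port, user_host)
subprocess.call(cmd, shell=True)

# Option 2: skip the shell and cat entirely, feeding the script to ssh directly.
with open('ssh_commands.sh') as script:
    subprocess.call(['ssh', '-T', '-S', socket, '-p', str(port), user_host],
                    stdin=script)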
To expand on my comment (I haven't tested your specific case with ssh; there could be other complications there): this is copy/pasted from my own code in a situation that I already know works.
from subprocess import Popen, PIPE, DEVNULL
from shlex import split as sh_split
# first command: run from an argument list, its stdout captured in a pipe
proc1 = Popen(sh_split(file_cmd1), stdout=PIPE)
# second command: run through the shell, reading proc1's output as its stdin
proc2 = Popen(file_cmd2, shell=True, stdin=proc1.stdout, stdout=PIPE)
# close the parent's copy of the pipe so proc1 gets SIGPIPE if proc2 exits early
proc1.stdout.close()
I have a specific reason to use shell=True in the second call, but you should probably be able to use shlex.split there too, I'm guessing.
Basically you're running one command, sending its output to PIPE, then using that as input for the second command.
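A self-contained toy version of the same pattern, with placeholder commands (echo and tr here are just stand-ins for file_cmd1 / file_cmd2), showing how to collect the second command's output at the end:

from shlex import split as sh_split
from subprocess import PIPE, Popen

# Placeholder commands standing in for file_cmd1 / file_cmd2.
file_cmd1 = "echo hello world"
file_cmd2 = "tr a-z A-Z"

proc1 = Popen(sh_split(file_cmd1), stdout=PIPE)
proc2 = Popen(file_cmd2, shell=True, stdin=proc1.stdout, stdout=PIPE)
proc1.stdout.close()          # so proc1 can get SIGPIPE if proc2 exits early

out, _ = proc2.communicate()  # wait for the pipeline and collect its output
print(out.decode())           # prints: HELLO WORLD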
I'm running a Python script on AWS (Amazon Linux AMI). The script is meant to run 24/7 and prints out to the interpreter or command terminal.
It's still in development, so I like to check on how it's behaving, or whether it has stopped due to an error. But I want to be able to close my ssh connection or turn off my local machine without interrupting the script, and then ssh back in and watch it in real time.
First I used:
[me@aws-x.x.x.x dir]$ nohup python3 myscript.py &
That worked fine: I could close the ssh connection, come back in, see that it was still running, and find all the print statements written to nohup.out.
Is there a way for me to bring this to the foreground and see the print statements in real-time, and then send it back to the background to disconnect from ssh without interrupting the program?
The code has the general form:
import time

count = 0
while count < 100000:
    print('Im working!')
    time.sleep(10)
    count += 1
print('All finished')
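One detail worth noting when the output goes to nohup.out rather than a terminal: Python block-buffers stdout in that case, so lines may only show up in the file in large chunks. A small variation of the loop above (assuming Python 3.3+ for the flush keyword) pushes each line out promptly; running the script with python3 -u has the same effect:

import time

count = 0
while count < 100000:
    # flush=True writes each line out immediately instead of waiting
    # for the block buffer to fill up.
    print('Im working!', flush=True)
    time.sleep(10)
    count += 1
print('All finished')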
You can use tmux or screen (depending on which is available on the system) to run your program in a terminal multiplexer, detach from it and close the connection. When you return, you can attach to the same session and see your program running.
For tmux
$ tmux
# run your program in the tmux shell:
$ python3 myscript.py
Detach from the tmux session with Ctrl + b and then d
You can now safely exit your ssh session
Next time you log in, just tmux attach and you can see your script running.
Addition:
For screen the detach command is Ctrl + a and d, reattaching is done with screen -r.
Keep the task running in the background, and just use tail -f nohup.out to follow the output.
Since this morning, Terminal has been acting strangely. It runs login and bash (the shell), but the process ID keeps counting up (it should equal the ID of login and remain constant), and it opens additional processes like sort, cat and env and then closes them. This prevents me from entering any commands.
In trying to solve the problem I did a Shell -> Send Reset and a Shell -> Send Hard Reset, which didn't solve the problem, and now my command line is gone. I can still enter commands via Shell -> New Command.
I've created another admin user and tested Terminal there, and it works.
I'm looking to stop the endless loop and restore my command line.
My OS is OS X 10.9.5.
I tried running a Python script, but the console window immediately closes when the script hits an error. Is there a way to stop the console window from closing after an error, without using a batch file and without typing this command:
C:\WINDOWS\system32\cmd.exe /K <command>
By the way, adding try and except still doesn't stop the console window from closing. Even using:
except:
    sys.exit(0)
Well, the console closes as soon as the Python program is finished. So if you call sys.exit(0) in your except block, the program finishes and the console closes.
Either write a .bat file next to your Python file that makes sure to call cmd.exe properly, and run that one; or wait in the except block for some user input/confirmation and only call sys.exit() then.
A last option, if you only want the console in order to see the messages: print the messages to a file instead, either using print("abc", file=opened_file) in the Python code, or by adding > filename at the end of the shell command to redirect standard output to a file.
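A minimal sketch of the wait-for-confirmation option mentioned above (Python 3; on Python 2 use raw_input instead of input; main and the raised error are just stand-ins):

import sys
import traceback

def main():
    # ... the real work of the script goes here ...
    raise RuntimeError("stand-in for a real error")

if __name__ == '__main__':
    try:
        main()
    except Exception:
        traceback.print_exc()              # show the error instead of hiding it
        input("Press Enter to close...")   # keep the console window open
        sys.exit(1)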
Personally, I just run the program directly from cmd.exe's shell and never close the terminal.
I have written a Fortran program (let's call it program.exe) which does some simulation for me. Via ssh I log into some far-away computers to start runs there, whose results I collect after a few days. To stay up to date on how the program is proceeding, I want to also write the shell output into a text file output.txt (since I can't be logged into the far-away computers all the time). The command should be something like
nohup program.exe | tee output.txt > /dev/null &
This lets me look at output.txt to check the current status even though the program hasn't finished its run yet. The above command works fine on my local machine. I first tried a plain '>' redirect, but the problem there was that nothing was written to the text file until the whole program had finished (maybe related to the pipe buffer?). So I used the workaround with 'tee'.
The problem now is that when I log into the computer via ssh (ssh -X user@machine), execute the above command, and look at output.txt with the vi editor, nothing appears until the program has finished. If I omit the 'nohup' and '&', I don't get any shell output at all until it has finished. My thought was that it might have something to do with data being buffered by ssh, but I'm rather a Linux newbie. I would be very grateful for any ideas or workarounds!
I would use the screen utility (http://www.oreillynet.com/linux/cmd/cmd.csp?path=s/screen) instead of nohup. That way I can put my program into a detached state (^A ^D), reconnect to the host, retrieve my screen session (screen -r), and monitor my output as if I had never logged out.