I have some fairly time-consuming Python scripts to run, each taking ~3 hours or so on my machine. I don't want to run them concurrently since that might crash my machine. One alone fits in memory with plenty of room to spare, but running 5 or so at once might cause an issue. I am running them remotely, so I ssh into my server and run them like this:
nohup python my_script.py > my_output.txt &
That way, if my connection gets interrupted I can re-establish it and my result is right there. I want to run the same Python script several times with different command-line arguments, sequentially, so that everything I need runs without me setting up the next run every few hours. I could hard-code all of the arguments into a Python script and do it that way, but it seems inelegant, and I don't want to have to fiddle with my Python script every time I do this. Is there some sort of listener I could use to trigger the next run when one of them finishes?
I'd suggest writing a bash script that runs the python jobs sequentially:
#!/bin/bash
python3 my_script.py arg1 > my_output1.txt
python3 my_script.py arg2 > my_output2.txt
Save that as driver.sh, make it executable, then nohup it:
nohup ./driver.sh &
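Since the runs differ only in their arguments, the driver can equally be a loop, so you never have to edit anything between batches. A minimal sketch, assuming placeholder argument values arg1/arg2/arg3:
#!/bin/bash
# Run the same script strictly in sequence, once per argument.
for arg in arg1 arg2 arg3; do
    python3 my_script.py "$arg" > "my_output_${arg}.txt"
done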
You really want to read up on utilities like tmux or screen and just script the whole thing.
I'm working on optimizing some of my workflow and was wondering if anyone has run into an issue similar to mine. I've been struggling to figure out how to start multiple "nohup" shell scripts at the same time. For example, I have several scripts that look like this:
start.sh
rm nohup.out
nohup python -u script.py args
I've tried running them with a script like this:
start_option_1.sh
process_directory_1/start.sh & process_directory_2/start.sh ... (3-5 more of these)
And like this:
start_option_2.sh
process_directory_1/start.sh && process_directory_2/start.sh ... (3-5 more of these)
but no dice... the scripts won't even start. Any ideas/help would be greatly appreciated!! Using python3.6 if that's important too (but seems like it's more of a nohup issue).
There is a big difference between using '&' and using '&&'. The first will run each of the scripts in the background. The second will execute them in sequence, as long as each script returns success ('exit 0', or equivalent).
From the content of 'start.sh', it looks like you want the first option (start all scripts together). Each script executes a python program 'script.py'. The post does not specify whether there is a single 'script.py' in the initial working directory or multiple ones, one in each folder; probably the latter.
If that is the case, you want to launch your scripts from the process_directory_* folders. Consider making a change to:
( cd process_directory_1 && exec ./start.sh) &
( cd process_directory_2 && exec ./start.sh) &
...
Notes:
All scripts are launched at the same time.
Each script is executed in a different folder, to access the script.py in that folder.
Each job will leave its log in its own run folder as 'nohup.out'.
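If the folders all follow the process_directory_* naming from the question, a glob loop avoids repeating that line once per folder; a sketch under that assumption:
#!/bin/bash
# Launch every start.sh in its own folder, all in parallel.
for dir in process_directory_*; do
    ( cd "$dir" && exec ./start.sh ) &
done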
I would like to run a batch of bash commands (all together) in a server shell from a python3 script on my local machine.
The reason I'm not running the python3 script on the server itself is that I can't recreate the same environment there, and I want to keep the settings I have on my machine while executing the script.
What I would like to do is:
- Run Python commands locally
- At a certain point, run those commands on the server
- Wait for the server execution to finish
- Continue running the Python script
(This will be done in a loop)
What I'm trying is to put all the commands in a bash script, ssh_commands.sh, and use the following command:
subprocess.call('cat ssh_commands.sh | ssh -T -S {} -p {} {}'.format(socket, port, user_host).split(),shell=True)
But when the execution of the script reaches that line, it gets stuck until the subprocess.call timeout, even though the script itself shouldn't take anywhere near that long. The only way to stop the script earlier is Ctrl+C.
I've also tried setting up the ssh connection in the ~/.ssh/config file, but I get the same result.
I know that the ssh connection itself works fine: if I run ssh_commands.sh on the server manually, it runs without any problem.
Can somebody suggest:
- A way to fix what I'm trying to do
- A better way to achieve the final result described above
- A way to debug and find out what the problem could be
Thank you in advance
To expand on my comment (I haven't tested your specific case with ssh, so there could be other complications there): this is actually copy/pasted from my own code in a situation that I already know works.
from subprocess import Popen, PIPE, DEVNULL
from shlex import split as sh_split

# The first command writes its output to a pipe.
proc1 = Popen(sh_split(file_cmd1), stdout=PIPE)
# The second command reads the first command's output as its stdin.
proc2 = Popen(file_cmd2, shell=True, stdin=proc1.stdout, stdout=PIPE)
# Close our copy of the pipe so proc1 gets SIGPIPE if proc2 exits early.
proc1.stdout.close()
I have a specific reason to use shell=True in the second, but you should probably be able to use shlex.split there too I'm guessing.
Basically you're running one command, sending its output to PIPE, then using that as input for the second command.
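A different approach worth trying for the ssh case itself: instead of piping cat into ssh through a shell, feed the script to ssh's stdin and let the call block until the remote commands finish. A sketch, with placeholder host and port:
# Runs the local ssh_commands.sh line by line on the remote host;
# the command returns only when the remote execution completes.
ssh -T -p 2222 user@example.com 'bash -s' < ssh_commands.sh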
I am running my code from the shell (SSH) in Google Cloud as
python3 mycode.py
If I close the shell, the computation stops. How can I start a computation, close the shell (the computation takes a long time :) ), and then come back later and see how it is doing?
My code keeps printing results after a certain number of iteration.
Well, in general what you can do is run the code in a way that lets you detach from the interactive environment, using a tool such as screen or tmux. However, Google Cloud Shell is not made for running background tasks, and if I recall correctly, it will terminate after an hour.
You might need to provision a virtual machine to run it on instead. I can recommend using tmux. With tmux it is as simple as running tmux and then, in the new shell, running your script: python3 mycode.py. You can then detach using Ctrl+b d, or simply disconnect. When you reconnect, run tmux attach -d to get back to your script.
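A minimal session could look like this (the session name compute is an arbitrary example):
tmux new -s compute            # start a named session (drops you into it)
python3 mycode.py              # inside the session, start the long job
# detach with Ctrl+b d, disconnect, come back later, then:
tmux attach -d -t compute      # reattach and check progress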
I will offer the following possibility as well:
A Bash builtin command that you could also use is disown. First, run your process/script in the background (using &, or by stopping it with ^Z and then restarting it with bg):
$ long_operation_command &
[1] 1156
Note that at this point the process is still linked to the session, and if the session is closed the process will be killed.
With the process still attached to the session, you can check the jobs running in the background:
$ jobs
[1]+ Running long_operation_command
Therefore you can run disown in order to detach the processes from the session:
$ disown
You can confirm this by logging in again and checking the output of your script or command, or by checking with top that the process is still running.
Also worth checking, as it could be interesting, is the difference between nohup foo, foo &, and foo & disown.
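Combined into a single line, with output redirected to a file so nothing is lost when the terminal closes (the command and log file name are placeholders):
long_operation_command > run.log 2>&1 & disown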
I have 2 programs that I want to run, programA.py and programB.py. When I run them manually, I have to open separate terminals and type the following commands:
terminal1
python programA.py
terminal2
python programB.py
Each of these programs then outputs some data on the command line. Lastly, programA.py has to be fully started and waiting before programB.py can start (it takes ~2s for programA.py to start and be ready to accept data).
If I am running these programs in Ubuntu, how can I write a bash script that accomplishes that? Right now, I have the following:
#!/bin/bash
python programA.py
python programB.py
This starts programA.py, but because programA.py then waits for input, programB.py doesn't start until you close out of programA.py. How can I change my script to run the two programs simultaneously?
Edit:
Using the advice given by Andreas Neumann below, changing the script to the following successfully launches the two programs:
#!/bin/bash
python programA.py &
sleep 5
python programB.py &
However, when both programs are launched, the code doesn't work properly. Basically, programA.py sets up a listening socket and then creates an interface that the user works with. programB.py starts afterwards and runs a process, talking to programA.py over the sockets. When running the above script, programA starts, waits, programB starts, and then programA and B connect and form the interface, but programB isn't running its background processes correctly.
Updated Answer
If you find my original answer below doesn't work, yet you still want to solve your question with a single script, you could do something like this:
#!/bin/bash
xterm -e "python programA.py" &
sleep 5
python programB.py
Original Answer
If programA is creating a user interface, you probably need that to be in the foreground, so start programB in the background:
{ sleep 5; python programB.py; } &
python programA.py
#!/bin/bash
python programA.py &
sleep 5 # give enough time to start
python programB.py &
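If you want the driver itself to block until both programs have finished, for example so that nohup ./driver.sh & leaves you a single job to watch, you can append wait; a sketch built on the answer above:
#!/bin/bash
python programA.py &
sleep 5        # give programA time to open its listening socket
python programB.py &
wait           # return only when both background jobs have exited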
I wrote an init script, executed last, that starts a Python script. The Python script just runs and never terminates, which makes my little Linux box (getty terminal on tty) only output the script and never show the login prompt. I made the mistake of not assigning a fixed IP, so I basically had to start over again (re-download the initial build onto the flash). However, now I'm wondering what options I have: is it enough to launch the script in my init script with a & at the end, or do I need a nohup? What's the best way to resolve this?
Thank you!
Ron
I got this resolved by directing the output of the script to /dev/null and adding an ampersand at the end, kinda like
myscript.sh > /dev/null &
This returns control to the shell and keeps executing the script in the background without reporting the results to stdout.
Yes, you need to make sure your init startup script returns. It's good to follow the LSB documentation and have a script that accepts the standard actions like start, stop, and status, and returns proper return codes.
Launching the Python script in the background with an & at the end of the command should be sufficient. You may also want to take a look at the start-stop-daemon command to start your process.
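A minimal sketch of such an init script using start-stop-daemon and the standard actions; all paths and names below are placeholders, not from the original post:
#!/bin/sh
DAEMON=/usr/local/bin/myscript.py
PIDFILE=/var/run/myscript.pid

case "$1" in
  start)
    # Daemonize the python script and record its PID.
    start-stop-daemon --start --background --make-pidfile \
        --pidfile "$PIDFILE" --exec /usr/bin/python3 -- "$DAEMON"
    ;;
  stop)
    start-stop-daemon --stop --pidfile "$PIDFILE"
    ;;
  status)
    if [ -f "$PIDFILE" ] && kill -0 "$(cat "$PIDFILE")" 2>/dev/null; then
      echo "running"
    else
      echo "not running"
      exit 3    # LSB: program is not running
    fi
    ;;
  *)
    echo "Usage: $0 {start|stop|status}" >&2
    exit 1
    ;;
esac
exit 0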