How to create a background process for the Phoenix shell in CentOS? - linux

So I know about creating named pipes using mkfifo.
I use this:
mkfifo yourfifo
python sqlline.py < yourfifo &
cat "a.sql" > yourfifo
The problem is that I have to run fg once before the process actually runs. How can I truly make it run in the background?
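One way to approach this (a sketch, not a verified fix): a background job that touches the controlling terminal gets stopped until fg hands it the terminal, and sqlline is an interactive shell that does exactly that. Detaching it from the terminal with nohup and keeping a writer on the FIFO avoids both the stop and an early EOF; sqlline.log and the file-descriptor number 3 are just placeholders:
mkfifo yourfifo
# Start sqlline fully detached from the terminal so it is never stopped
# waiting for tty access (the usual reason a background job needs 'fg').
nohup python sqlline.py < yourfifo > sqlline.log 2>&1 &
# Hold one writer open so sqlline does not see EOF as soon as the first cat finishes.
exec 3> yourfifo
cat a.sql > yourfifo
# When completely done, close the held writer with: exec 3>&-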

Related

Is there a way in node or shell-scripting to know if a child process starts its own subprocess?

Say I have a simple 1-line shell script test.sh that starts up some other process, like npm start or python server.py.
Now say I run that shell script from within a parent node program, i.e.:
let child = require("child_process").spawn("./test.sh");
Is there any way for the node process to keep track of the processes started by that shell script? Say I want to get the pid of the python or npm process and monitor its life cycle.
It's equally useful to know if there's a way for a shell script to do it, because I could write a shell script that runs the other shell script and have my node project run my script as a man-in-the-middle, so if either the shell or node can do it, either way works.
Is it possible?
You can list all the children of a process using something like this in Python, providing the PID of the parent as the first argument:
#! /usr/bin/env python
import sys
import psutil
for child in psutil.Process(int(sys.argv[1])).children(recursive=True):
    print(child.pid)
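Saved as, say, children.py (a name chosen here only for illustration) and made executable, it can be pointed at the PID of the spawned shell script:
./children.py 1234    # 1234 being the PID of the spawned test.sh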
On Linux, you can strace the target process and look for fork system calls.
Similar techniques are probably possible on most other platforms, though the details will differ. On some platforms, you need root or equivalent privileges to inspect the internals of a process, even if it belongs to yourself.
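For example (a sketch; the exact syscall names worth tracing depend on the platform and runtime, since many programs create children via clone rather than fork):
# Attach to the running shell script, follow its children, and print
# only the process-creation system calls.
strace -f -e trace=fork,vfork,clone -p <pid-of-test.sh>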

How to free a Python process from its console

I have a Python program where I create a new process using multiprocessing, but the function doesn't seem to run past FreeConsole(): "check" is printed, but none of the code in sendme() after that point runs, while the main function afterwards works fine. I actually want sendme() to run in the background without a console.
import multiprocessing as mp

def sendme():
    import win32console as con
    print("check")
    con.FreeConsole()
    f = open("hello2.txt", 'w')
    f.close()

if __name__ == "__main__":
    p = mp.Process(target=sendme)
    p.start()
    print("main")
The easiest way is to use a shebang line in your Python script. Make it executable using the command:
chmod +x python_script.py
Use nohup ("no hangup") to keep the program running in the background even if you close your terminal:
nohup /path/to/python_script.py &
OR
python /path/to/python_script.py &
Do not forget the & to put it in the background.
To see the process again, run in a terminal:
ps ax | grep python_script.py
OR
$ jobs
[1]+ Running nohup python_script.py &
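If the script later needs to be stopped, the PID reported by ps can be used (a small example, assuming the script name above):
# Stop the detached script; pgrep -f matches against the full command line.
kill $(pgrep -f python_script.py)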
If you are using Windows:
On Windows this can be achieved with the pythonw.exe executable. This will run your program in the background, with no visible console or way to interact with it. You also cannot terminate it without a system monitor. You can check the details here: pythonw.exe or python.exe?

Trigger a script when nohup process terminates

I have some fairly time-consuming Python scripts to run, ~3 hours or so each on my machine. I don't want to run them concurrently since that might crash my machine. For any one alone I have more than enough memory, but running 5 or so might cause an issue. I am running them remotely, so I ssh into my server and run them like this:
nohup python my_script.py > my_output.txt &
That way, if my connection gets interrupted, I can re-establish it and my result is right there. I want to run the same Python script a couple of times with different command-line arguments, sequentially, so I can run everything I need without having to set up the next run every few hours. I could manually code all of the arguments into a Python script and do it that way, but that seems inelegant; I don't want to have to fiddle with my Python script every time I do this. Is there some sort of listener I could use to trigger the next one when one of them finishes?
I'd suggest writing a bash script that runs the python jobs sequentially:
#!/bin/bash
python3 my_script1.py > my_output1.txt
python3 my_script2.py > my_output2.txt
Then nohup that:
nohup ./driver.sh &
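Since the goal is the same script with different command-line arguments, the driver can also just loop over them (a sketch; the argument values are placeholders):
#!/bin/bash
# Run the same script once per argument, strictly one after another.
for arg in alpha beta gamma; do
    python3 my_script.py "$arg" > "my_output_${arg}.txt"
done
Then nohup that driver the same way as above.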
You really want to read up on utilities like tmux or screen and just script the whole thing.

How to start 2 programs that need separate terminals from a bash script?

I have 2 programs that I want to run, programA.py and programB.py. When I run them manually, I have to open separate terminals and type the following commands:
terminal1
python programA.py
terminal2
python programB.py
Each of these programs then output some data on the command line. Lastly, programA.py has to be fully started and waiting before programB.py can start (takes ~2s for programA.py to start and be ready to accept data).
If I am running these programs in Ubuntu, how can I write a bash script that accomplishes that? Right now, I have the following:
#!/bin/bash
python programA.py
python programB.py
This starts programA.py, but because programA.py then waits for input, programB.py doesn't start until you close out of programA.py. How can I change my script to run the two programs simultaneously?
Edit:
Using the advice given by Andreas Neumann below, changing the script to the following successfully launches the two programs:
#!/bin/bash
python programA.py &
sleep 5
python programB.py &
However, when both programs are launched, the code doesn't work properly. Basically, programA.py sets up a listening socket and then creates an interface that the user works with. programB.py starts afterwards and runs a process, talking to programA.py over the sockets. When running the above script, programA starts, waits, programB starts, then programA and B connect and form the interface, but programB isn't running its background processes correctly.
Updated Answer
If you find my original answer below doesn't work, yet you still want to solve your question with a single script, you could do something like this:
#!/bin/bash
xterm -e "python ProgramA.py" &
sleep 5
python ProgramB.py
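If the fixed sleep turns out to be the fragile part, one variation (a sketch, assuming nc from netcat is installed and that programA listens on localhost:5000 — substitute the real port) is to wait until programA's socket is actually accepting connections before starting programB:
#!/bin/bash
xterm -e "python ProgramA.py" &
# Poll until programA's listening socket is ready instead of guessing a delay.
until nc -z localhost 5000 2>/dev/null; do
    sleep 0.5
done
python ProgramB.py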
Original Answer
If programA is creating a user interface, you probably need that to be in the foreground, so start programB in the background:
{ sleep 5; python programB.py; } &
python ProgramA.py
#!/bin/bash
python programA.py &
sleep 5 # give enough time to start
python programB.py &

How to run a normal C program using monit

I'm trying to monitor a normal C program with Monit, but I don't know how to run the program or what configuration should be set in Monit's control file.
You need the PID of the program to be able to monitor it with Monit. Some programs accept a command-line argument giving the location of a file they will write their PID to. Otherwise, you can try starting the program from a wrapper script that writes the PID to a known location, e.g. /usr/bin/myprogram & echo $! > /var/run/myprogram.pid in bash, as in the sketch below.
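A minimal wrapper along those lines might look like this (a sketch; the program path and PID-file location are placeholders, and Monit's control file would then typically be pointed at that PID file):
#!/bin/bash
# Start the program in the background and record its PID where Monit can find it.
/usr/bin/myprogram &
echo $! > /var/run/myprogram.pid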
