How do I stop the previous command in a bash script? [duplicate] - node.js

This question already has answers here:
linux: kill background task
I'm trying to build a bash script that runs unit tests on a web server.
I wish to run a simple bash script that could be (for example) in a makefile that does the following:
Start a web server
Run the unit tests
Stop the web server
Assuming that the web server is written for NodeJS, here's what it could look like:
npm start &
sleep 5
npm test
The issue with this approach is that the web server will keep on running. How do I stop it after the npm test stage?

The PID of the last started background process is stored in $!, so pick it up after starting npm and kill it when needed:
npm start &
npm_pid=$!
...
kill $npm_pid
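Put together with the question's snippet, a minimal sketch could look like this (the sleep 5 is the same crude wait as in the question, and the trap is just an extra safeguard so the server dies even when the tests fail):
#!/bin/bash
# start the server in the background and remember its PID
npm start &
npm_pid=$!

# make sure the server is killed even if the tests exit with an error
trap 'kill "$npm_pid" 2>/dev/null' EXIT

# give the server a moment to come up, then run the tests
sleep 5
npm test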

You might consider running the process in a screen.
screen -dmS npm npm start #Starts a detached screen named npm, and runs "npm start" inside
Later, when you need to kill the process, you can just kill the screen. When the screen dies, it will take its child processes with it.
screen -X -S npm quit
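Stitched into the question's workflow, that might look roughly like this (the sleep is only a placeholder for a proper readiness check):
screen -dmS npm npm start    # start the server in a detached screen named "npm"
sleep 5                      # crude wait for the server to come up
npm test                     # run the tests against the running server
screen -X -S npm quit        # tear down the screen (and the server) afterwards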

Perhaps a duplicate of this solution
ps -ef | grep your_process_name | grep -v grep | awk '{print $2}' | xargs kill
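Where pkill is available, a shorter equivalent of that pipeline (your_process_name standing in for the real command, as above) would be:
# -f matches against the full command line instead of just the process name
pkill -f your_process_name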

Related

centos | perl processes are not shown

I have a CentOS server on which I have installed the perl package to run some perl scripts. Today I ran some perl scripts, and when I run ps -ef | grep perl it shows nothing, although the scripts are working properly.
When I use pkill -f (name_of_script) the perl processes stop, yet they are never shown at all.
Note that yesterday I deleted a user ( X ) that was assigned to the folder /home/scripts. What do you think the problem is?
The reason for not showing the process in ps -ef might be due to the process being run as a background process. In that case, the process won't be associated with the terminal where you started it from and thus won't be shown in the output of ps -ef. To see all processes running on the system, including background processes, you can use ps aux instead.
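For example, either of these should show the running scripts (pgrep -a needs a reasonably recent procps; the [p] trick keeps grep from matching itself):
ps aux | grep '[p]erl'   # list perl processes without matching the grep itself
pgrep -af perl           # or print the PID and full command line of every perl process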

Make Nodejs script run in background in gitlab CI

Our dev project starts with the command npm run serve. Is it possible to run it in background mode? I tried using nohup and & at the end of the string. It works properly in a shell, but when it is started by CI on GitLab, the pipeline state is always "running" because the npm output keeps showing on screen.
The clean way would be to run a container whose run command is "npm run serve".
I'm not certain that running a non-blocking command through your pipeline is the right way, but you could try appending "&": "npm run serve &" will run the command in "detached" (background) mode.
I've faced the same problem using nohup and &. It was working well in a shell, but not on GitLab CI; it looked like npm start was not being detached.
What worked for me was to call npm start inside a bash script and run it in the before_script hook.
test:
  stage: test
  before_script:
    - ./serverstart.sh
  script:
    - npm test
  after_script:
    - kill -9 $(ps aux | grep '\snode\s' | awk '{print $2}')
In the bash script serverstart.sh:
#!/bin/bash
# start the server and send the console and error logs on nodeserver.log
npm start > nodeserver.log 2>&1 &
# keep waiting until the server is started
# (in my case wait for mongodb://localhost:27017/app-test to be logged)
while ! grep -q "mongodb://localhost:27017/app-test" nodeserver.log
do
sleep .1
done
echo -e "server has started\n"
exit 0
This allowed me to detach npm start and move on to the next command while keeping the npm start process alive.
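A variation on the same idea, if you'd rather not grep ps in after_script: have serverstart.sh write the PID of npm start to a file and kill exactly that PID afterwards (server.pid is just a name picked for this sketch):
#!/bin/bash
# start the server, log its output and remember its PID
npm start > nodeserver.log 2>&1 &
echo $! > server.pid

# keep waiting until the server is started (same log check as above)
while ! grep -q "mongodb://localhost:27017/app-test" nodeserver.log
do
    sleep .1
done
Then after_script becomes kill "$(cat server.pid)". Note this kills the npm process itself; depending on the npm version the node child may survive, which is why the answer above greps for node instead.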

Node Restart in Shell Script

I want to rerun the node server whenever there is a change in a file, and I want to use fswatch. I am using fswatch with a shell script like
nohup node server.js &
but I can't run the same script again since it will throw EADDRINUSE.
What's the best way to restart node via a script with the help of fswatch (or an alternative, without installing anything new)?
Why doesn't node have something like node restart?
What I could think of is to get the PID from ps and kill it with the script, but I guess there should be a better solution.
I am able to do that with the help of fswatch:
fswatch -o mydir | xargs -n1 -I{} ps | grep '[n]ode server.js$' | awk '{system("kill "$1)}'
or by putting the same code in a separate shell file and using it as
xargs -n1 './location-of-shell-file.sh'
When I run grep, that process is also included in ps, so to exclude it I had to use
grep '[n]ode server.js'
EADDRINUSE happens because something is already bound to the same port. Node.js has no built-in restart mechanism, so yes, you should use a bash script (or a framework) to properly restart a background app.
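To actually restart (not only kill) the server on every change, a PID-file based sketch along these lines should also work; restart.sh and server.pid are names made up for this example:
#!/bin/bash
# restart.sh - kill the previous server (if any) and start a fresh one

if [ -f server.pid ]; then
    kill "$(cat server.pid)" 2>/dev/null
fi

nohup node server.js > server.log 2>&1 &
echo $! > server.pid
Run ./restart.sh once to start the server, then let fswatch call it on every change:
fswatch -o mydir | xargs -n1 -I{} ./restart.sh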

Upstart resulting in two processes - why?

I am using upstart to start a NodeJS process which is using NVM (node version manager).
The upstart command is like this:
description "Service to start node app"
author "Barry Steyn"
setuid devuser
setgid devuser
env DIR=/home/devuser/nodejs/authentication
script
chdir $DIR
exec bash -c 'source /home/devuser/nvm/nvm.sh && node app'
end script
respawn
This starts node fine, but when I do a ps wax | grep node, I get these two processes:
4284 ? Ss 0:00 bash -c source /home/devuser/nvm/nvm.sh && node app
4316 ? Sl 1:09 node app
Why do I get two processes? Is this in any way less efficient?
The first process is the instance of bash that started node.js. The second process is the actual node.js process.
The bash process is using a few resources (mostly memory), but is just sitting around waiting on node.js to exit.
I believe you can get rid of the extra bash by doing this:
Replace:
exec bash -c 'source /home/devuser/nvm/nvm.sh && node app'
With:
exec bash -c 'source /home/devuser/nvm/nvm.sh && exec node app'
This makes the bash process replace itself with node.js (via exec) instead of forking a child first, so there is no extra bash process sitting around waiting for node.js to exit.

How to run Node.js as a background process and never die?

I connect to the linux server via putty SSH. I tried to run it as a background process like this:
$ node server.js &
However, after 2.5 hrs the terminal becomes inactive and the process dies. Is there any way I can keep the process alive even with the terminal disconnected?
Edit 1
Actually, I tried nohup, but as soon as I close the Putty SSH terminal or unplug my internet, the server process stops right away.
Is there anything I have to do in Putty?
Edit 2 (Feb 2012)
There is a node.js module, forever. It will run a node.js server as a daemon service.
nohup node server.js > /dev/null 2>&1 &
nohup means: Do not terminate this process even when the stty is cut off.
> /dev/null means: stdout goes to /dev/null (which is a dummy device that does not record any output).
2>&1 means: stderr also goes to stdout (which is already redirected to /dev/null). You may replace &1 with a file path to keep a log of errors, e.g.: 2>/tmp/myLog
& at the end means: run this command as a background task.
Simple solution (if you are not interested in coming back to the process, just want it to keep running):
nohup node server.js &
There's also the jobs command to see an indexed list of those backgrounded processes. And you can kill a backgrounded process by running kill %1 or kill %2 with the number being the index of the process.
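A quick illustration (the job numbers refer to the index shown by jobs, not the PID):
node server.js &   # start in the background; the shell prints the job number and PID
jobs               # list background jobs with their index, e.g. [1]
kill %1            # kill background job number 1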
Powerful solution (allows you to reconnect to the process if it is interactive):
screen
You can then detach by pressing Ctrl+a followed by d, and attach back later by running screen -r.
Also consider the newer alternative to screen, tmux.
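For reference, the tmux equivalent of the screen workflow looks roughly like this (the session name "node" is arbitrary):
tmux new-session -d -s node 'node server.js'   # start the server in a detached tmux session
tmux attach -t node                            # reattach later to look at the output
tmux kill-session -t node                      # stop the session (and the server) when done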
You really should try using screen. It is a bit more complicated than just doing nohup long_running &, but once you understand screen you will never go back.
Start your screen session at first:
user#host:~$ screen
Run anything you want:
wget http://mirror.yandex.ru/centos/4.6/isos/i386/CentOS-4.6-i386-binDVD.iso
Press Ctrl+A and then d. Done. Your session keeps going in the background.
You can list all sessions with screen -ls, and attach to one with the screen -r 20673.pts-0.srv command, where 20673.pts-0.srv is an entry from that list.
This is an old question, but it ranks high on Google. I almost can't believe the highest-voted answers, because running a node.js process inside a screen session, with &, or even with nohup -- all of them -- are just workarounds.
Especially the screen/tmux solution, which should really be considered an amateur solution. Screen and Tmux are not meant to keep processes running, but to multiplex terminal sessions. It's fine when you are running a script on your server and want to disconnect, but for a node.js server you don't want your process to be attached to a terminal session. That is too fragile. To keep things running you need to daemonize the process!
There are plenty of good tools to do that.
PM2: http://pm2.keymetrics.io/
# basic usage
$ npm install pm2 -g
$ pm2 start server.js
# you can even define how many processes you want in cluster mode:
$ pm2 start server.js -i 4
# you can start various processes, with complex startup settings
# using an ecosystem.json file (with env variables, custom args, etc):
$ pm2 start ecosystem.json
One big advantage I see in favor of PM2 is that it can generate the system startup script to make the process persist between restarts:
$ pm2 startup [platform]
Where platform can be ubuntu|centos|redhat|gentoo|systemd|darwin|amazon.
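Once the startup script is installed, a few everyday PM2 commands are worth knowing ("server" here is the default process name PM2 derives from server.js):
$ pm2 save             # remember the currently running processes for the next reboot
$ pm2 list             # show the status of all managed processes
$ pm2 logs server      # tail the logs of a given process
$ pm2 restart server   # restart or stop by name or id
$ pm2 stop server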
forever.js: https://github.com/foreverjs/forever
# basic usage
$ npm install forever -g
$ forever start app.js
# you can run from a json configuration as well, for
# more complex environments or multi-apps
$ forever start development.json
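And the matching forever commands for inspecting and stopping what it runs:
$ forever list          # show everything forever is currently running
$ forever stop app.js   # stop a single script
$ forever stopall       # stop all scripts managed by forever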
Init scripts:
I'm not going to go into detail about how to write an init script, because I'm not an expert on the subject and it would be too long for this answer, but basically init scripts are simple shell scripts triggered by OS events.
Docker:
Just run your server in a Docker container with the -d option and, voilà, you have a daemonized node.js server!
Here is a sample Dockerfile (from node.js official guide):
FROM node:argon
# Create app directory
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
# Install app dependencies
COPY package.json /usr/src/app/
RUN npm install
# Bundle app source
COPY . /usr/src/app
EXPOSE 8080
CMD [ "npm", "start" ]
Then build your image and run your container:
$ docker build -t <your username>/node-web-app .
$ docker run -p 49160:8080 -d <your username>/node-web-app
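Because the container runs detached (-d), the usual Docker commands are how you check on it afterwards:
$ docker ps                        # find the container id or name
$ docker logs -f <container id>    # follow the app's output
$ docker stop <container id>       # stop the daemonized server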
Always use the proper tool for the job. It'll save you a lot of headaches and overtime hours!
Another solution: disown the job.
$ nohup node server.js &
[1] 1711
$ disown -h %1
nohup will allow the program to continue even after the terminal dies. I have actually had situations where nohup prevents the SSH session from terminating correctly, so you should redirect input as well:
$ nohup node server.js </dev/null &
Depending on how nohup is configured, you may also need to redirect standard output and standard error to files.
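Putting those redirections together, a fully detached invocation might look like this (app.log and app.err are arbitrary file names):
nohup node server.js </dev/null >app.log 2>app.err &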
Nohup and screen offer great lightweight solutions for running Node.js in the background. The Node.js process manager (PM2) is a handy tool for deployment. Install it globally with npm:
npm install pm2 -g
to run a Node.js app as a daemon:
pm2 start app.js
You can optionally link it to Keymetrics.io, a monitoring SaaS made by Unitech.
$ node server.js & disown
It will remove the job from the shell's active job list and keep the command running in the background.
I have this function in my shell rc file, based on @Yoichi's answer:
nohup-template () {
    [[ "$1" = "" ]] && echo -e "Example usage:\nnohup-template urxvtd" && return 0
    nohup "$@" > /dev/null 2>&1 &
}
You can use it this way:
nohup-template command-to-run arg1 arg2
Have you read about the nohup command?
To run a command as a system service on Debian with sysv init:
Copy the skeleton script and adapt it for your needs; probably all you have to do is set a few variables. Your script will inherit sensible defaults from /lib/init/init-d-script; if something does not fit your needs, override it in your script. If something goes wrong you can look at the details in the source of /lib/init/init-d-script. The mandatory vars are DAEMON and NAME. The script will use start-stop-daemon to run your command, and in START_ARGS you can define additional parameters for start-stop-daemon to use.
cp /etc/init.d/skeleton /etc/init.d/myservice
chmod +x /etc/init.d/myservice
nano /etc/init.d/myservice
/etc/init.d/myservice start
/etc/init.d/myservice stop
That is how I run some python stuff for my wikimedia wiki:
...
DESC="mediawiki articles converter"
DAEMON='/home/mss/pp/bin/nslave'
DAEMON_ARGS='--cachedir /home/mss/cache/'
NAME='nslave'
PIDFILE='/var/run/nslave.pid'
START_ARGS='--background --make-pidfile --remove-pidfile --chuid mss --chdir /home/mss/pp/bin'
export PATH="/home/mss/pp/bin:$PATH"
do_stop_cmd() {
    start-stop-daemon --stop --quiet --retry=TERM/30/KILL/5 \
        $STOP_ARGS \
        ${PIDFILE:+--pidfile ${PIDFILE}} --name $NAME
    RETVAL="$?"
    [ "$RETVAL" = 2 ] && return 2
    rm -f $PIDFILE
    return $RETVAL
}
Besides setting the vars, I had to override do_stop_cmd because Python substitutes the executable, so the service did not stop properly.
Apart from the cool solutions above, I'd also mention the supervisord and monit tools, which allow you to start a process, monitor its presence, and start it again if it dies. With monit you can also run active checks, such as checking whether the process responds to an HTTP request.
For Ubuntu I use this:
(exec PROG_SH &> /dev/null &)
Try this for a simple solution:
cmd & exit
