Run commands in parallel and fail the run if any command fails using wait - linux

I can see there is a parallel command which can do something similar to what I want, and the answer is here: run commands in parallel with exit fail if any command fails
But I am using a very minimal VM image, so I can't use parallel.
So is it possible to run commands in parallel and return a failing exit status if any of the commands in the batch fails?
Ex.
(npm install --global bower ng-cli) & (cd "$1" && npm install) & (cd "$2" && bower install); wait
In the above command, if the 2nd command fails, the whole thing should return a failing exit status.
Please let me know if I should provide any more information.
(Worst case) if someone can help me convert the above command to a parallel command, that would also be useful.

Using GNU Parallel:
parallel --halt now,fail=1 ::: \
  "npm install --global bower ng-cli" \
  "cd $1 && npm install" \
  "cd $2 && bower install" && echo All is OK
It will return with failure as soon as one of the jobs fails.
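Since the asker cannot install GNU Parallel, the same effect is possible in plain POSIX shell by remembering each background job's PID and calling `wait` on each one individually, because `wait PID` returns that job's exit status. A sketch (the helper name `run_all_or_fail` is mine, not from the thread):

```shell
# run every argument as a command in the background, then fail if any failed
run_all_or_fail() {
    pids=""
    for cmd in "$@"; do
        sh -c "$cmd" &          # launch each command in the background
        pids="$pids $!"         # remember its PID
    done
    status=0
    for pid in $pids; do
        wait "$pid" || status=1 # `wait PID` returns that job's exit status
    done
    return "$status"
}

# usage (with the asker's $1/$2 placeholders):
# run_all_or_fail "npm install --global bower ng-cli" \
#                 "cd $1 && npm install" \
#                 "cd $2 && bower install"
```

Unlike a bare `wait`, which always succeeds once the jobs have exited, this propagates a failure from any one of the batch.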


Is it possible to suppress NPM's echo of the commands it is running?

I've got a bash script that starts up a server and then runs some functional tests. It's got to happen in one script, so I'm running the server in the background. This all happens via 2 npm commands: start:nolog and test:functional.
All good. But there's a lot of cruft in the output that I don't care about:
✗ ./functional-tests/runInPipeline.sh
(... "good" output here)
> @co/foo@2.2.10 pretest:functional /Users/jcol53/Documents/work/foo
> curl 'http://localhost:3000/foo' -s -f -o /dev/null || (echo 'Website must be running locally for functional tests.' && exit 1)
> @co/foo@2.2.10 test:functional /Users/jcol53/Documents/work/foo
> npm run --prefix functional-tests test:dev:chromeff
> @co/foo-functional-tests@1.0.0 test:dev:chromeff /Users/jcol53/Documents/work/foo/functional-tests
> testcafe chrome:headless,firefox:headless ./tests/**.test.js -r junit:reports/functional-test.junit.xml -r html:reports/functional-test.html --skip-js-errors
That's a lot of lines that I don't need there. Can I suppress the @co/foo-functional-tests etc. lines? They aren't telling me anything worthwhile...
npm run -s kills all output from the command, which is not what I'm looking for.
This is probably not possible but that's OK, I'm curious, maybe I missed something...
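For what it's worth, one workaround not mentioned above is to filter the echoed lines out yourself, assuming they all start with "> ". A sketch (`run_quiet` is a made-up helper name; legitimate output that starts with "> " would also be dropped):

```shell
# drop npm's echoed "> package@version script" lines, keep the rest;
# note the pipe hides npm's exit status unless pipefail is enabled
run_quiet() {
    npm run "$1" 2>&1 | grep -v '^> '
}

# usage: run_quiet test:functional
```

In bash, `set -o pipefail` keeps npm's exit status visible despite the pipe.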

Start multiple commands in parallel in Linux and wait for all of them to finish [duplicate]

This question already has answers here:
Run multiple commands at once in the same terminal
(6 answers)
Closed 4 years ago.
I am running a shell script that basically does a "cd dir && git pull" in multiple directories that are part of one big app. After each of them is successfully updated, the script runs a "npm run build", but given that the update is on 30+ directories, the process takes quite some time.
I figured it would be easier if each of the 30 different "git pull" operations were parallelized, but I have no idea where to start. I was thinking of spawning separate threads for the "git pull" commands, and I'm familiar with a couple of options for running a "detached terminal" command on Linux, but they all prevent the main terminal from understanding when the detached command has finished its work (i.e. when all the asynchronous "git pull" operations are done).
Here's a simplified version of my script:
cd ./app/dir1 && git pull
cd ./app/dir2 && git pull
...
cd ./app && npm run build
The question is not related to Git or NPM in any way.
To summarize: I would like to spawn multiple commands asynchronously, wait for all of them to complete, and continue with further commands afterwards.
It sounds like you are looking for the parallel tool: http://www.gnu.org/software/parallel/
They have a quick intro video: http://www.youtube.com/playlist?list=PL284C9FF2488BC6D1
Something like this could work for your example:
# run in parallel
parallel 'cd {} && git pull' ::: ./app/dir*
# the last command without parallel
cd ./app && npm run build
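If GNU Parallel is not available, the same pattern works with plain background jobs and `wait` (a sketch following the question's directory layout; `pull_all_then_build` is a made-up name):

```shell
pull_all_then_build() {
    # start one background "git pull" per directory
    for d in ./app/dir*/; do
        (cd "$d" && git pull) &
    done
    wait    # block until every background pull has finished
    # then build once, sequentially
    cd ./app && npm run build
}
```

Note that a bare `wait` does not propagate the pulls' exit statuses; it only guarantees they have all finished before the build starts.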

Make Nodejs script run in background in gitlab CI

Our dev project starts with the command npm run serve. Is it possible to run it in background mode? I tried using nohup and & at the end of the line. It works properly in a shell, but when it is started by CI on GitLab, the pipeline state is always "running" because npm output permanently shows on screen.
The clean way would be to run a container whose run command is "npm run serve".
I'm not certain running a non-blocking command through your pipeline is the right way, but you could try using &:
"npm run serve &" will run the command in "detached" mode.
I faced the same problem using nohup and &. It worked well in a shell, but not on GitLab CI; it looks like npm start was not detached.
What worked for me was to call npm start inside a bash script and run it in the before_script hook:
test:
  stage: test
  before_script:
    - ./serverstart.sh
  script:
    - npm test
  after_script:
    - kill -9 $(ps aux | grep '\snode\s' | awk '{print $2}')
In the bash script serverstart.sh:
#!/bin/bash
# start the server and send the console and error logs on nodeserver.log
npm start > nodeserver.log 2>&1 &
# keep waiting until the server is started
# (in my case wait for mongodb://localhost:27017/app-test to be logged)
while ! grep -q "mongodb://localhost:27017/app-test" nodeserver.log
do
    sleep .1
done
echo -e "server has started\n"
exit 0
This allowed me to detach npm start and move on to the next command while keeping the npm start process alive.

Run script command on parallel

I have a bash script in which I need to run two commands in parallel.
For example, I am executing npm install, which takes some time (20-50 seconds), and I run it in two different folders in sequence: first npm install in the books folder, and second in the orders folder. Is there a way to run both in parallel in a shell script?
For example assume the script is like following:
#!/usr/bin/env bash
dir=$(pwd)
cd $tmpDir/books/
npm install
grunt
npm prune production
cd $tmpDir/orders/
npm install
grunt
npm prune production
You could use & to run the process in the background, for example:
#!/bin/sh
cd $HOME/project/books/
npm install &
cd $HOME/project/orders/
npm install &
# if want to wait for the processes to finish
wait
To run and wait for nested/multiple processes you could use a subshell () for example:
#!/bin/sh
(sleep 10 && echo 10 && sleep 1 && echo 1) &
cd $HOME/project/books/
(npm install && grunt && npm prune production ) &
cd $HOME/project/orders/
(npm install && grunt && npm prune production ) &
# waiting ...
wait
In this case, notice that the commands are within () and use &&, which means that the right side will only be evaluated if the left side succeeds (exits 0). So for the example:
(sleep 10 && echo 10 && sleep 1 && echo 1) &
Putting things between () creates a subshell.
It runs sleep 10; if that succeeds, && runs echo 10; if that succeeds, && runs sleep 1; and if that succeeds, && runs echo 1.
Ending the command with & runs all of this in the background.
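That short-circuit behavior is easy to check with a pair of throwaway subshells:

```shell
# the right side of && runs only when the left side exits 0
out=$( (false && echo "never printed"); (true && echo "printed") )
echo "$out"    # prints "printed"
```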

How to make Jenkins move to the next stage if its "terminal" has been blocked?

I'm trying to run HTTP calls to test a live web API that is going to run on the Jenkins machine.
This is the pipeline script being used:
stage 'build'
node {
    git url: 'https://github.com/juliomarcos/sample-http-test-ci/'
    sh "npm install"
    sh "npm start"
}
stage 'test'
node {
    sh "npm test"
}
But Jenkins won't move to the test stage. How can I run npm test after the web app has fully started?
One approach is to start the web app with an & at the end so it will run in the background, i.e.:
npm start &
You can try to redirect the output of npm start to a text file like this:
npm start > output.txt &
Then in the next step, loop until the "started" message is available, something like:
tail -f output.txt | while read LOGLINE
do
    [[ "${LOGLINE}" == *"listening on port"* ]] && pkill -P $$ tail
done
Code not tested :)
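One caveat with the tail -f loop: it blocks forever if the message never appears. A bounded variant (a sketch; the 30-second timeout and the name `wait_for_start` are assumptions layered on the answer above):

```shell
# poll output.txt for up to 30 seconds, then give up instead of hanging
wait_for_start() {
    for _ in $(seq 1 30); do
        grep -q "listening on port" output.txt 2>/dev/null && return 0
        sleep 1
    done
    echo "server did not start in time" >&2
    return 1
}
```

Failing fast here means the Jenkins build errors out instead of sitting in the test stage indefinitely.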
