Start multiple commands in parallel in Linux and wait for all of them to finish [duplicate]

This question already has answers here:
Run multiple commands at once in the same terminal
(6 answers)
Closed 4 years ago.
I am running a shell script that basically does a "cd dir && git pull" in multiple directories that are part of one big app. After each of them is successfully updated, the script runs "npm run build", but since the update spans 30+ directories, the process takes quite some time.
I figured it would be faster if the 30 different "git pull" operations were parallelized, but I have no idea where to start. I was thinking of spawning separate threads for the "git pull" commands, and I'm familiar with a couple of options for running a "detached terminal" command on Linux, but they all prevent the main terminal from understanding when the detached command has finished its work (i.e. when all the asynchronous "git pull" operations are done).
Here's a simplified version of my script:
cd ./app/dir1 && git pull
cd ./app/dir2 && git pull
...
cd ./app && npm run build
The question is not related to Git or NPM in any way.
To summarize: I would like to spawn multiple commands asynchronously, wait for all of them to complete, and continue with further commands afterwards.

It sounds like you are looking for the parallel tool: http://www.gnu.org/software/parallel/
They have a quick intro video: http://www.youtube.com/playlist?list=PL284C9FF2488BC6D1
Something like this could work for your example:
# run in parallel
parallel 'cd {} && git pull' ::: ./app/dir*
# the last command without parallel
cd ./app && npm run build
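If installing GNU parallel is not an option, the same effect is available in plain bash: background each pull with "&", then use the wait builtin, which blocks until every background job has finished. A minimal sketch, assuming the dir* glob matches the app's directories:

```shell
#!/bin/bash
# Background each pull in a subshell (so the cd doesn't leak into
# the parent shell), then block on 'wait' until every job is done.
for dir in ./app/dir*/; do
    ( cd "$dir" && git pull ) &
done
wait    # resumes only after all background pulls have finished

cd ./app && npm run build
```

Note that a bare wait ignores the jobs' exit statuses; if a failed pull should abort the build, collect the PIDs and wait on each one individually.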

Related

Will the rsync command end once it has finished syncing?

I'm running a command to sync a folder.
I would like to know whether the rsync command will keep running even after all files are synced, since my script is:
#!/bin/bash
function simSync {
    ssh zsclxengcc1d mkdir -p $RESULTS
    rsync -avzh --include=d3plot* --include=binout* $SCRATCH/ HOST1:$RESULTS
}
# Sync the Files
simSync
#Run simulation
mpirun -report-bindings $SOLVER ncpus=$NCPU i=$IFILE memory=${MEMORY}m memory2=$(($MEMORY/$NCPU))m && cleanup
Does this code start the sync process and then run the simulation immediately?
I need everything to end once the run-simulation command has completed.
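Note that as written, simSync runs synchronously, so mpirun only starts after the rsync has completed. If the intent is instead to let the sync run alongside the simulation and stop it when the simulation ends, a hedged sketch (reusing the variables and the simSync function from the question):

```shell
#!/bin/bash
# Sketch: background the sync, remember its PID, and terminate it
# once the simulation command has completed.
simSync &
SYNC_PID=$!

mpirun -report-bindings $SOLVER ncpus=$NCPU i=$IFILE \
    memory=${MEMORY}m memory2=$(($MEMORY/$NCPU))m && cleanup

kill "$SYNC_PID" 2>/dev/null    # end the sync once the run is done
```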

How to run application in background in shell script

I have a dotnet application and I want to publish it in a container. I have one requirement: when the application is started, I have to push some data into it.
To do that I created a shell script and set it as the entrypoint command. Now I am running my application via that shell script.
The problem is that when the application is started, the script never runs the commands below it.
This is my shell script:
cd /app/WebAPI
dotnet WebAPI.dll
cd /app/data
for f in `ls *.txt`; do
// some other command
done
I want to run this for loop after the application is started.
Summarizing the comments above:
cd /app/WebAPI
dotnet WebAPI.dll &
cd /app/data
for f in *.txt; do
# some other command
done
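If the data push must wait until the application is actually accepting requests, the script can poll the app after backgrounding it. The URL and port below are assumptions; adjust them to whatever WebAPI actually listens on:

```shell
#!/bin/bash
cd /app/WebAPI
dotnet WebAPI.dll &

# Poll until the app answers; the URL/port are placeholders.
until curl -sf http://localhost:5000/ >/dev/null; do
    sleep 1
done

cd /app/data
for f in *.txt; do
    : # push "$f" into the application here
done
```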

Is it possible to suppress NPM's echo of the commands it is running?

I've got a bash script that starts up a server and then runs some functional tests. It's got to happen in one script, so I'm running the server in the background. This all happens via 2 npm commands: start:nolog and test:functional.
All good. But there's a lot of cruft in the output that I don't care about:
✗ ./functional-tests/runInPipeline.sh
(... "good" output here)
> @co/foo@2.2.10 pretest:functional /Users/jcol53/Documents/work/foo
> curl 'http://localhost:3000/foo' -s -f -o /dev/null || (echo 'Website must be running locally for functional tests.' && exit 1)
> @co/foo@2.2.10 test:functional /Users/jcol53/Documents/work/foo
> npm run --prefix functional-tests test:dev:chromeff
> @co/foo-functional-tests@1.0.0 test:dev:chromeff /Users/jcol53/Documents/work/foo/functional-tests
> testcafe chrome:headless,firefox:headless ./tests/**.test.js -r junit:reports/functional-test.junit.xml -r html:reports/functional-test.html --skip-js-errors
That's a lot of lines that I don't need. Can I suppress the @co/foo-functional-tests etc. lines? They aren't telling me anything worthwhile...
npm run -s kills all output from the command, which is not what I'm looking for.
This is probably not possible but that's OK, I'm curious, maybe I missed something...

Run command in parallel and fail the run if any command fails using wait

I can see there is a parallel command which can do something similar to what I want; an answer is here: run commands in parallel with exit fail if any command fails.
But I am using a very minimal VM image, so I can't use parallel.
So: is it possible to run commands in parallel and return a failing exit status if any of the commands in the batch fails?
Ex.
(npm install --global bower ng-cli) & (cd $1 npm install) & (cd $2 bower install); wait
In the above command, if the 2nd command fails, it should return a failing exit status.
Please let me know if I should provide any more information.
(Worst case) if someone can help me convert the above command to a parallel command, that would also be useful.
Using GNU Parallel:
parallel --halt now,fail=1 ::: \
"npm install --global bower ng-cli" \
"cd $1 && npm install" \
"cd $2 && bower install" && echo All is OK
It will return with failure as soon as one of the jobs fails.
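Since the question rules out parallel, the same fail-on-any behavior is possible in plain bash: "wait PID" returns that job's exit status, so waiting on each PID individually exposes failures. A sketch using the commands from the question:

```shell
#!/bin/bash
pids=()
npm install --global bower ng-cli  & pids+=($!)
( cd "$1" && npm install )         & pids+=($!)
( cd "$2" && bower install )       & pids+=($!)

# 'wait PID' returns that job's exit status, so any failure
# in the batch turns into a failing overall exit status.
status=0
for pid in "${pids[@]}"; do
    wait "$pid" || status=1
done
exit "$status"
```

Unlike --halt now,fail=1, this waits for all jobs even after one fails; bash 4.3+ also offers "wait -n" to react to the first finished job sooner.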

Make Nodejs script run in background in gitlab CI

Our dev project starts with the command npm run serve. Is it possible to run it in background mode? I tried using nohup and & at the end of the line. It works properly in a shell, but when it is started by CI on GitLab, the pipeline state is always "running", because the npm output shows on screen permanently.
The clean way would be to run a container whose run command is "npm run serve".
I'm not certain running a non-blocking command through your pipeline is the right way, but you could try using "&": "npm run serve &" will run the command in detached mode.
I've faced the same problem using nohup and &. It was working well in a shell, but not on GitLab CI; it looks like npm start was not detached.
What worked for me was to call npm start inside a bash script and run it in the before_script hook:
test:
  stage: test
  before_script:
    - ./serverstart.sh
  script:
    - npm test
  after_script:
    - kill -9 $(ps aux | grep '\snode\s' | awk '{print $2}')
In the bash script serverstart.sh:
#!/bin/bash
# start the server and send the console and error logs on nodeserver.log
npm start > nodeserver.log 2>&1 &
# keep waiting until the server is started
# (in my case wait for mongodb://localhost:27017/app-test to be logged)
while ! grep -q "mongodb://localhost:27017/app-test" nodeserver.log
do
sleep .1
done
echo -e "server has started\n"
exit 0
This allowed me to detach npm start and move on to the next command while keeping the npm start process alive.
