program self loops in shell script, but want script to control looping - linux

I have a shell script (plotter.sh) that runs a program which currently loops infinitely on its own (see below). The problem is that when I want to tweak the program's options, I typically wait until a loop is complete, stop the script, make my changes, and start the script again. The program does have an option to run only once. With that in mind, how do I set it up so that the shell script controls the looping, I can change the program's options on the fly, and the program automatically picks up the new options on the next iteration, all without having to stop anything?
#!/bin/bash
#VARIABLES
poolcontactaddress=gjghkjhj
farmerpublickey=gjhgjhgjhgjh
farmid=${farmerpublickey: -6}
timestamp=$(date +'%Y%m%d-%H%M%S')
loopid=$(uuid)
#runid=$(uuid)
~/chia/chia-plotter/build/chia_plot -c $poolcontactaddress -f $farmerpublickey -t /mnt/u2-0052/ -2 /mnt/ramdisk/ -n -1 -r 48 -u 128 2>&1 | tee ~/chia/logs/plotter-$farmid-$loopid-$timestamp.log;
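One way to approach this, sketched below and untested: move the loop into the script itself, have the plotter produce a single plot per invocation, and re-read the options from a small config file at the top of every iteration so that edits take effect on the next pass without stopping anything. The config filename here is made up, and it assumes that passing -n 1 (instead of -n -1) makes chia_plot run once and exit.
#!/bin/bash
while true; do
    # Re-read options each pass; edit this file at any time and the next
    # iteration picks up the changes. (Hypothetical file defining
    # poolcontactaddress, farmerpublickey, and any other tunables.)
    source ~/chia/plotter-options.conf
    farmid=${farmerpublickey: -6}
    timestamp=$(date +'%Y%m%d-%H%M%S')
    loopid=$(uuid)
    # -n 1 (assumed) = plot once, then return control to this loop
    ~/chia/chia-plotter/build/chia_plot -c "$poolcontactaddress" -f "$farmerpublickey" \
        -t /mnt/u2-0052/ -2 /mnt/ramdisk/ -n 1 -r 48 -u 128 2>&1 \
        | tee ~/chia/logs/plotter-"$farmid"-"$loopid"-"$timestamp".log
done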

Related

BASH: simultaneous execution of a multiloop function without waiting

Use case:
I need to transfer binary files (1 GB) to an array of IPs and start executing each one as soon as it arrives at its destination, without waiting for all the binaries to be transferred or executed. A sort of parallel mode.
Situation:
I have two functions, transfer and execution (depending on the approach, they can be shortened to one function with two loops).
for N in "${NODES[@]}"; do
    rsync -Pcz -e "ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null" --timeout=10 $FILE user@$N
done
and
for N in "${NODES[@]}"; do
    ssh user@$N "cd ~/; ./exec.sh"
done
The point is that in this case I have to wait for all transfers to finish first (and there can sometimes be tens of addresses) and only then start the execution.
If I combine the loops into a single one, I have to wait again, this time for transfer plus execution per node.
Expectation:
I'd like to transfer a file to the first node, start its execution, and switch to the second node with the same process, and so on. So timing would count for the transfers only, whereas each node executes the file on its own in parallel.
Obstacles:
1- I need to be able to capture the execution output from each node.
2- Additional packages, like screen, are not an option.
What did I try:
I was thinking about injecting a script into the remote nodes via the loop to control the execution from there, but I'm sure there must be a less barbaric option.
What can be done here?
You should be able to use a single loop and run the ssh command with a & suffix, which runs it in the background (i.e. without waiting for it to finish), then use wait after the loop to wait for all of them to finish. Collecting output is more interesting: I think you'll need to collect each run's output into a file and print the files at the end. Something like this (note that I have not tested this properly):
tmpdir="$(mktemp -qd -t "$(basename "$0")")" || {
    echo "Error creating temporary directory" >&2
    exit 1
}
for nodenum in "${!NODES[@]}"; do
    # The ${!array[@]} idiom gets a list of array *indexes*, not elements; get the element by index:
    N=${NODES[nodenum]}
    # Copy file, and wait for copy to finish:
    rsync -Pcz -e "ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null" --timeout=10 $FILE user@$N
    # Start the script, and *don't* wait for it to finish:
    ssh user@$N "cd ~/ sh exec.sh" >"$tmpdir/$nodenum.out" 2>&1 &
done
# Wait for all of the scripts to finish
wait
# Print all of the outputs (in order)
for nodenum in "${!NODES[@]}"; do
    echo
    echo "Output from ${NODES[nodenum]}:"
    cat "$tmpdir/$nodenum.out"
done
# Clean up the temp directory
rm -R "$tmpdir"
BTW, the remote command "cd ~/ sh exec.sh" doesn't make sense. Is there supposed to be a semicolon in there? Also, I recommend using lower- or mixed-case variable names to avoid conflicts with the many all-caps variables that have some sort of special meaning, and putting double quotes around variable references (i.e. rsync ... "$FILE" "user@$N" instead of rsync ... $FILE user@$N).
EDIT: this assumes you want to start the script on each host as soon as that particular copy is done. If you want to wait until all copies are done and only then fire off all the scripts at once, use two loops: one that does the copies, then a second that runs the ssh commands in the background (collecting output as above); then wait for those to all finish and print all of the outputs.
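A rough, untested sketch of that two-loop variant, reusing the $tmpdir setup from the snippet above and the asker's original remote command:
# Phase 1: copy the file to every node, waiting for each copy to finish
for nodenum in "${!NODES[@]}"; do
    rsync -Pcz -e "ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null" --timeout=10 "$FILE" "user@${NODES[nodenum]}"
done
# Phase 2: only now start every remote script, all in the background
for nodenum in "${!NODES[@]}"; do
    ssh "user@${NODES[nodenum]}" "cd ~/; ./exec.sh" >"$tmpdir/$nodenum.out" 2>&1 &
done
# Wait for all remote scripts, then print the outputs as before
wait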
You could do the transfer and script as a single background task, so that the script on a particular host starts as soon as its transfer is complete:
for N in "${NODES[@]}"; do
    (rsync -Pcz -e "ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null" --timeout=10 $FILE user@$N
     ssh user@$N "cd ~/; ./exec.sh") > ${N}.log 2>&1 &
done
You then collect all of the per-host ${N}.log files once the background tasks have finished.
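For example, if the parent script should block until everything is done, a wait (not in the snippet above) followed by a loop over the logs would do, roughly:
wait
for N in "${NODES[@]}"; do
    echo "=== $N ==="
    cat "${N}.log"
done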

How Can I Run an Infinite Loop from a Bash Script with Output to Foreground

So, I want to run the command below from a bash script and have it output to the shell; however, all my attempts result in the script running in the background:
while [ 1 ]; do timeout -k9 21600 sngrep -c -O "/var/log/sngrep/sngrep_capture_$(date +%F-%H-%M-%S).pcap"; sleep 1; done
When I run the command directly in the shell prompt, it outputs as expected. The application, SNGREP, launches with the specified parameters and works well.
I have experimented with sending the command to Screen, but it still ends up in the background. I have also tried modifying the command by sleeping first (as follows):
while sleep 1; do timeout -k9 21600 sngrep -c -O "/var/log/sngrep/sngrep_capture_$(date +%F-%H-%M-%S).pcap"; done
It, too, goes to the background but then runs fine if I type it directly into the shell prompt. What else can I try to get the command to output to the foreground when run from a bash script? Any help is appreciated, thanks.
PS: My end goal is to launch sngrep in a PuTTY window from a Windows batch file. I've got everything working but this last bit.
It is not clear from your command why it is running in the background; as written, it should run in the foreground. You can try the version below, which always redirects stderr (2>&1) to standard output:
while [ 1 ]; do timeout -k9 21600 sngrep -c -O "/var/log/sngrep/sngrep_capture_$(date +%F-%H-%M-%S).pcap" 2>&1; sleep 1; done
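If it is easier to manage, the same loop can also live in a small standalone script that is run directly (not with & and not detached), so sngrep stays attached to the terminal. A minimal sketch; the filename is just an example:
#!/bin/bash
# capture.sh (example name) - capture SIP traffic in 6-hour chunks,
# writing each capture to a timestamped pcap file
while true; do
    timeout -k9 21600 sngrep -c -O "/var/log/sngrep/sngrep_capture_$(date +%F-%H-%M-%S).pcap" 2>&1
    sleep 1
done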

Bash redirection in a script in parallel

I have a bash script with a loop of processes that I want to run in parallel:
for i in {1..5}
do
    echo Running for simulation $i
    python script.py $i > ./outlogs/$i.log 2>&1 &
done
But when I do this the file redirection doesn't work, so $i.log just stays empty. The redirection only works when I do not use the & at the end, but then the script waits for each process to finish before starting the next one, which I don't want.
I tried a solution using script -c, but this does not update in real time, only once the process ends. Does anyone have a better suggestion where the file redirection works in this script but still updates in real time?
You simply need to add the -u option, so it will look like this:
python -u script.py $i > ./outlogs/$i.log 2>&1 &
The -u option forces Python's stdout and stderr streams to be unbuffered, so output reaches the log file as it is produced.
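Putting it together, the loop from the question would look like this; the trailing wait is an addition not in the original, and keeps the script alive until all background runs have exited:
for i in {1..5}
do
    echo Running for simulation $i
    python -u script.py $i > ./outlogs/$i.log 2>&1 &
done
wait   # block until all background simulations have finished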

create screen session that doesn't terminate with the program

I'm working on a startup script that is initiated from rc.local. I start up several programs with
screen -d -m my-prog
and that works great. However, if one of the programs has problems and exits, so does the session. I'd like to be able to have the session stick around so I can attach to it and see the output from the program before it crashed.
Is there a way to do this? I thought about
screen -d -m bash -c my-prog
But again, if my-prog terminates then so does bash and then so does screen.
You can follow the answer at https://unix.stackexchange.com/questions/47271/prevent-gnu-screen-from-terminating-session-once-executed-script-ends
They suggest something like what you were trying in your second attempt, but instead of using bash to invoke the command (which terminates with the command, as you noted), invoke bash after the command finishes, like this:
screen -dmS session_name sh -c 'my-prog; exec bash'
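Because the session stays alive in that trailing bash, you can later find it and reattach by name to inspect the program's output before it exited, for example:
screen -ls               # list running sessions
screen -r session_name   # reattach to the named session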

Redirecting Output of Bash Child Scripts

I have a basic script that outputs various status messages. e.g.
~$ ./myscript.sh
0 of 100
1 of 100
2 of 100
...
I wanted to wrap this in a parent script in order to run a sequence of child scripts and send an email upon overall completion, e.g. topscript.sh:
#!/bin/bash
START=$(date +%s)
/usr/local/bin/myscript.sh
/usr/local/bin/otherscript.sh
/usr/local/bin/anotherscript.sh
RET=$?
END=$(date +%s)
echo -e "Subject:Task Complete\nBegan on $START and finished at $END and exited with status $RET.\n" | sendmail -v group@mydomain.com
I'm running this like:
~$ topscript.sh >/var/log/topscript.log 2>&1
However, when I run tail -f /var/log/topscript.log to inspect the log I see nothing, even though running top shows myscript.sh is currently being executed, and therefore, presumably outputting status messages.
Why isn't the stdout/stderr from the child scripts being captured in the parent's log? How do I fix this?
EDIT: I'm also running these on a remote machine, connected via ssh using pseudo-tty allocation, e.g. ssh -t user#host. Could the pseudo-tty be interfering?
I just tried the following: I have three files t1.sh, t2.sh, and t3.sh, all with the following content:
#!/bin/bash
for ((i=0; i<10; i++)); do
    echo $i of 9
    sleep 1
done
And a script called myscript.sh with the following content:
#!/bin/bash
./t1.sh
./t2.sh
./t3.sh
echo "All Done"
When I run ./myscript.sh > topscript.log 2>&1 and then in another terminal run tail -f topscript.log I see the lines being output just fine in the log file.
Perhaps the things being run in your subscripts use a large output buffer? I know that when I've run Python scripts before, they have a pretty big output buffer, so you don't see any output for a while. Do you actually see the entire output in the email that gets sent at the end of topscript.sh? Is it just that you're not seeing the output while the processes run?
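If buffering does turn out to be the culprit, one option (my suggestion, not something the original poster confirmed) is to force line buffering on a child script with stdbuf from GNU coreutils, for example:
# Force line-buffered stdout so the log updates as each line is printed
stdbuf -oL /usr/local/bin/myscript.sh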
try
unbuffer topscript.sh >/var/log/topscript.log 2>&1
Note that unbuffer is not always available as a standard binary on older Unix platforms; it is typically provided by the expect package, which may need to be found and installed.
I hope this helps.
