How Can I Run an Infinite Loop from a Bash Script with Output to Foreground

So, I want to run the command below from a bash script and have it output to the shell; however, all my attempts result in the script running in the background:
while [ 1 ]; do timeout -k9 21600 sngrep -c -O "/var/log/sngrep/sngrep_capture_$(date +%F-%H-%M-%S).pcap"; sleep 1; done
When I run the command directly in the shell prompt, it outputs as expected. The application, SNGREP, launches with the specified parameters and works well.
I have experimented with sending the command to Screen, but it still ends up in the background. I have also tried modifying the command by sleeping first (as follows):
while sleep 1; do timeout -k9 21600 sngrep -c -O "/var/log/sngrep/sngrep_capture_$(date +%F-%H-%M-%S).pcap"; done
It, too, goes to the background but then runs fine if I type it directly into the shell prompt. What else can I try to get the command to output to the foreground when run from a bash script? Any help is appreciated, thanks.
PS: My end goal is to launch SNGREP in a PuTTY window from a Windows batch file. I've got everything working but this last bit.

It is not clear from your command why it runs in the background; it should run in the foreground. You can try the command below, which redirects stderr to stdout (2>&1):
while [ 1 ]; do timeout -k9 21600 sngrep -c -O "/var/log/sngrep/sngrep_capture_$(date +%F-%H-%M-%S).pcap" 2>&1; sleep 1; done
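If it helps, here is a minimal self-contained wrapper as a sketch (note that sngrep draws a curses UI, so the script must be run directly in the terminal, not with a trailing & and not inside a detached screen session):
#!/bin/bash
# capture_loop.sh (hypothetical name) - invoke as ./capture_loop.sh so it stays in the foreground
while true; do
# restart sngrep every 21600 s (6 h); -k 9 sends SIGKILL 9 s after the timeout if it hangs
timeout -k 9 21600 sngrep -c -O "/var/log/sngrep/sngrep_capture_$(date +%F-%H-%M-%S).pcap"
sleep 1
done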

Related

How do I set up two curl commands to execute at different times forever?

For example, I want to run one command every 10 seconds and the other command every 5 minutes. I can only get the first one to log properly to a text file. Below is the shell script I am working on:
echo "script Running. Press CTRL-C to stop the process..."
while sleep 10;
do
curl -s -I --http2 https://www.ubuntu.com/ >> new.txt
echo "------------1st command--------------------" >> logs.txt;
done
||
while sleep 300;
do
curl -s -I --http2 https://www.google.com/
echo "-----------------------2nd command---------------------------" >> logs.txt;
done
I would advise you to go with @Marvin Crone's answer, but researching cron jobs and background processes doesn't seem like the kind of hassle I would go through for this little script. Instead, try putting both loops into separate scripts, like so:
script1.sh
echo "job 1 Running. Type fg 1 and press CTRL-C to stop the process..."
while sleep 10;
do
echo $(curl -s -I --http2 https://www.ubuntu.com/) >> logs.txt;
done
script2.sh
echo "job 2 Running. Type fg 2 and press CTRL-C to stop the process..."
while sleep 300;
do
echo $(curl -s -I --http2 https://www.google.com/) >> logs.txt;
done
Adding executable permissions:
chmod +x script1.sh
chmod +x script2.sh
And, last but not least, running them:
./script1.sh & ./script2.sh &
This creates two separate jobs in the background that you can bring to the foreground by typing:
fg 1 or fg 2
and stop with CTRL-C, or send to the background again by typing CTRL-Z.
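Alternatively, if you would rather keep everything in one script, a single loop can drive both intervals with a tick counter. A sketch, assuming both commands should append to logs.txt:
#!/bin/bash
# tick every 10 seconds; every 30th tick (300 s) also run the second command
tick=0
while sleep 10; do
curl -s -I --http2 https://www.ubuntu.com/ >> logs.txt
tick=$((tick + 1))
if [ $((tick % 30)) -eq 0 ]; then
curl -s -I --http2 https://www.google.com/ >> logs.txt
fi
done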
I think what is happening is that you start the first loop, and it needs to complete before the second loop will start; but the first loop is designed to be infinite.
I suggest you put each curl loop in a separate batch file.
Then, you can run each batch file separately, in the background.
I offer two suggestions for you to investigate for your solution.
One, research the use of crontab and set up a cron job to run the batch files.
Two, research the use of nohup as a means of running the batch files.
I strongly suggest you also research how to monitor the jobs and how to terminate them if anything goes wrong. You are setting up infinite loops, and a simple CTRL-C will not terminate jobs running in the background. You are treading in areas that can get out of control; you need to know what you are doing.
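For reference, a sketch of both suggestions, assuming the loops live in script1.sh and script2.sh as above (the paths and the @reboot spec are assumptions to adapt):
# nohup: keep the jobs alive after the terminal closes
nohup ./script1.sh > /dev/null 2>&1 &
nohup ./script2.sh > /dev/null 2>&1 &
# crontab -e: restart both scripts automatically at boot
@reboot /home/user/script1.sh
@reboot /home/user/script2.sh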

How to keep a bash script running in the background

I wrote a simple bash script:
while :
do
sleep 2;
# my code
done
Now I want this bash script to always be running.
bash mybash.sh > /dev/null &
When I run the above command, my script works fine, but when I close my terminal I think it gets killed, because it stops creating the files it makes while running.
Run the script with bash script.sh in the terminal, press CTRL-Z, and then use the bg command to put the script in the background.
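Note that a job backgrounded with bg is still tied to the terminal and may be killed by SIGHUP when the terminal closes; disown can detach it. A sketch, assuming the script is job 1:
bash script.sh
# press CTRL-Z to suspend, then:
bg
disown -h %1 # the job no longer receives SIGHUP when the terminal closes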
#!/bin/bash
while true; do
# your code
sleep 5
done
Write a bash script and add it to cron; check that it has started, then comment out the cron entry and it will keep running in the background.
Instead of sleep 5, you can use however many seconds you want.
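For reference, a sketch of the cron entry this describes (the path is an assumption); add it with crontab -e and comment it out once the script is running:
* * * * * /home/user/mybash.sh > /dev/null 2>&1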
To check your process, use the command below to get the details:
ps -ef | grep script_file_name
If you find more than one process running, keep one and kill the rest.
Hope this resolves your issue!

Keep a script running through ssh after logout

This is the first question that I post here. I tried to do a thorough search, but if I haven't (and the answer is obvious somewhere else), please just let me know.
I have a script that runs a program for me, here it is:
csv_file=../data/teste_nohup.csv
trace_file=../data/gnp.trace
declare -i n=100
declare -i p=1
declare -i counter=0
while [ $counter -lt 3 ];
do
n=100
while true
do
nice -19 sage gnptest.py ${n} ${p} | tee -a ${csv_file}
notify-send "finished test gnp ${n} ${p}"
done
done
So, what I'm trying to do is run the gnptest.py program a few times, and have the result be written to the csv_file.
The problem is, that depending on the input, the program may take a long time to complete. So I'd like to connect to the server over ssh, start the program, close the terminal, and check the output file from time to time.
I've tried nohup and disown. Nohup creates a huge nohup.out file, full of errors that I don't get while normally running the script (it complains about the -lt operator, for example). But the biggest problem I'm facing is that neither command (nohup or disown -h) runs the program and sends the output to the file I've specified in the csv_file variable, which is done using the tee command. Also, neither of them seems to keep running after I log out...
Any help will be much appreciated.
Thanks in advance!!
I have just joined, so I cannot add a comment.
Please try using redirection instead of tee in the script.
And to get rid of nohup.out, use the following to run the script:
nohup script.sh > /dev/null 2>&1 &
If above produces error use
nohup script.sh > /dev/null 2>&1 </dev/null &
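For the tee change, a sketch of the redirection inside the script, using the question's own variables:
# replace the tee pipeline with plain appending redirection
nice -19 sage gnptest.py ${n} ${p} >> ${csv_file} 2>&1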
Hope this will help.

Read command in bash script not waiting for user input when piped to bash?

Here is what I'm entering in Terminal:
curl --silent https://raw.githubusercontent.com/githubUser/repoName/master/installer.sh | bash
The WordPress installing bash script contains a "read password" command that is supposed to wait for users to input their MySQL password. But, for some reason, that doesn't happen when I run it with the "curl githubURL | bash" command. When I download the script via wget and run it via "sh installer.sh", it works fine.
What could be the cause of this? Any help is appreciated!
If you want to run a script on a remote server without saving it locally, you can try this.
#!/bin/bash
RunThis=$(lynx -dump http://127.0.0.1/example.sh)
if [ $? = 0 ] ; then
bash -c "$RunThis"
else
echo "There was a problem downloading the script"
exit 1
fi
In order to test it, I wrote an example.sh:
#!/bin/bash
# File /var/www/example.sh
echo "Example read:"
read line
echo "You typed: $line"
When I run Script.sh, the output looks like this.
$ ./Script.sh
Example read:
Hello World!
You typed: Hello World!
Unless you absolutely trust the remote script, I would avoid doing this without examining it before executing it.
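Another possibility (a sketch) is process substitution, which hands the downloaded script to bash as a file and leaves stdin attached to the terminal, so read still works:
bash <(curl --silent https://raw.githubusercontent.com/githubUser/repoName/master/installer.sh)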
It doesn't stop for read because of how the pipe is wired up: bash is reading the script itself from stdin (the pipe), so when read executes, it consumes the next bytes of that same stream (the rest of the script, or EOF) rather than waiting for keyboard input. At no point in this process is read connected to your terminal.
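If you control the script, a common fix (a sketch) is to make read take its input from the terminal explicitly:
# inside installer.sh: read from the controlling terminal instead of stdin
read -r password < /dev/tty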

Redirecting Output of Bash Child Scripts

I have a basic script that outputs various status messages, e.g.:
~$ ./myscript.sh
0 of 100
1 of 100
2 of 100
...
I wanted to wrap this in a parent script in order to run a sequence of child scripts and send an email upon overall completion, e.g. topscript.sh:
#!/bin/bash
START=$(date +%s)
/usr/local/bin/myscript.sh
/usr/local/bin/otherscript.sh
/usr/local/bin/anotherscript.sh
RET=$?
END=$(date +%s)
echo -e "Subject:Task Complete\nBegan on $START and finished at $END and exited with status $RET.\n" | sendmail -v group@mydomain.com
I'm running this like:
~$ topscript.sh >/var/log/topscript.log 2>&1
However, when I run tail -f /var/log/topscript.log to inspect the log I see nothing, even though running top shows myscript.sh is currently being executed, and therefore, presumably outputting status messages.
Why isn't the stdout/stderr from the child scripts being captured in the parent's log? How do I fix this?
EDIT: I'm also running these on a remote machine, connected via ssh using pseudo-tty allocation, e.g. ssh -t user@host. Could the pseudo-tty be interfering?
I just tried the following: I have three files, t1.sh, t2.sh, and t3.sh, all with the following content:
#!/bin/bash
for((i=0;i<10;i++)) ; do
echo $i of 9
sleep 1
done
And a script called myscript.sh with the following content:
#!/bin/bash
./t1.sh
./t2.sh
./t3.sh
echo "All Done"
When I run ./myscript.sh > topscript.log 2>&1 and then in another terminal run tail -f topscript.log I see the lines being output just fine in the log file.
Perhaps the things being run in your subscripts use a large output buffer? I know that when I've run Python scripts before, they have a pretty big output buffer, so you don't see any output for a while. Do you actually see the entire output in the email that gets sent out at the end of topscript.sh? Is it just that you're not seeing the output while the processes run?
try
unbuffer topscript.sh >/var/log/topscript.log 2>&1
Note that unbuffer is not always available as a standard binary on older Unix platforms and may require finding and installing a package that provides it.
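If unbuffer is unavailable, stdbuf from GNU coreutils is a possible alternative (a sketch); it forces line-buffered stdout and stderr for the command and the stdio-based programs it spawns:
stdbuf -oL -eL topscript.sh > /var/log/topscript.log 2>&1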
I hope this helps.
