Run while loop in background and execute the entire script - Bash - Linux

I am trying to run a check at a specific interval inside a while loop, but the issue is that only the test function gets executed. I want the test function to keep running at that interval while the script also moves on to the next function, i.e. test should keep running in the background. Any help would be appreciated.
test(){
    while true; do
        echo "Hello"
        sleep 5
    done
}

function2(){
    echo "I am inside function2"
}

test
function2

You can tell bash to run a command in a separate subshell, asynchronously, with a single trailing ampersand &. So you can make test run separately like this:
test(){
    while true; do
        echo "Hello"
        sleep 5
    done
}

function2(){
    echo "I am inside function2"
}

test &
function2
However, this can cause the script to terminate before test has finished running! You can make the script wait for the backgrounded test to finish at a later point by using wait:
test(){
    while true; do
        echo "Hello"
        sleep 5
    done
}

function2(){
    echo "I am inside function2"
}

test & p1=$!
function2
wait $p1
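Note that test as written loops forever, so wait $p1 will block indefinitely. If you eventually want the script to exit, a minimal sketch (reusing the names above; the sleep 20 is only a stand-in for the rest of your work) is to kill the background job instead of waiting for it:
test(){
    while true; do
        echo "Hello"
        sleep 5
    done
}
function2(){
    echo "I am inside function2"
}

test & p1=$!     # start the loop in the background and remember its PID
function2        # the script continues immediately
sleep 20         # stand-in for whatever other work the script does
kill "$p1"       # stop the background loop so the script can exit cleanly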

Related

Can I call only a single function from the terminal rather than the whole bash script?

I have a bash script that looks like this:
#!/bin/bash
function one {
    echo "I am function one!!"
}
function two {
    echo "I am function two!!"
}
one
two
If I simply do bash test.sh, both functions are executed.
What I'd like to do is to call the script from the terminal while also specifying one of the two functions, and executing only it.
Maybe something like: bash test.sh$one() and it should only print out
I am function one!!
Is this possible and if so, how will I go about achieving it?
Thanks!
=========================
EDIT: As per @Waqas' suggestion I ended up implementing the below, which did the trick for me:
function main {
    if [ -z "$1" ]
    then
        some commands
    # else run the given function only
    else
        $1
    fi
}

main "$@"
Thanks!!!
There are many ways to write the code to fulfill your requirement. The way I would write it is the following:
#!/bin/bash

function main {
    # If the argument is empty then run both functions, else only run the function given as argument $1.
    [ -z "$1" ] && { one; two; } || $1
}

function one {
    echo "I am function one!!"
}

function two {
    echo "I am function two!!"
}

main "$@"
If you execute the script without passing an argument, both functions will run; if you pass an argument, only that single function will run.
Example1 (Both functions will run): bash script_name
Example2 (Only function one will run): bash script_name one
Example3 (Only function two will run): bash script_name two
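As a side note (not part of the answer above), the [ -z "$1" ] && { one; two; } || $1 one-liner has the usual A && B || C caveat: the || branch also runs if one or two exits with a non-zero status (harmless here because $1 is then empty, but a well-known pitfall of the pattern in general). If you prefer to avoid it, an explicit if/else sketch of the same main would be:
function main {
    if [ -z "$1" ]; then
        # no argument given: run both functions
        one
        two
    else
        # run only the function named by the first argument
        "$1"
    fi
}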
Better to separate the files: move the function definitions into a separate file, say ~/lib/testlib.src. Your test.sh then becomes:
#!/bin/bash
. ~/lib/testlib.src
one
two
If you need the definitions in your interactive shell, either run . ~/lib/testlib.src there manually, or, if you want them always available, put this statement into your ~/.bashrc.
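For completeness, ~/lib/testlib.src would then hold nothing but the function definitions from the question, for example:
# ~/lib/testlib.src - function definitions only, no top-level calls
function one {
    echo "I am function one!!"
}
function two {
    echo "I am function two!!"
}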
You could do this by writing a case statement after defining the functions but before any other lines of code.
#!/bin/bash
function one {
    echo "I am function one!!"
}
function two {
    echo "I am function two!!"
}

case $1 in
    one)
        one
        ;;
    two)
        two
        ;;
    *)
        one
        two
        ;;
esac
Which could then be used as:
$ ./test.sh one
# I am function one!!
$ ./test.sh two
# I am function two!!
In the above example I put the body of your script under the *) option, but if it better suits your needs, you could instead have the one) and two) options "exit" after calling their single function:
case $1 in
    one)
        one
        exit 0
        ;;
    two)
        two
        exit 0
        ;;
esac

one
two
This is all assuming you are not passing any other arguments to the script and that $1 would be used for the desired function. The case statement would become more complex otherwise.
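For illustration only, if you did want to accept several function names in one call, one sketch is to loop over the arguments and reuse the same dispatch for each, falling back to running both functions when no argument is given:
if [ $# -eq 0 ]; then
    one
    two
fi
for arg in "$@"; do
    case $arg in
        one) one ;;
        two) two ;;
        *)   echo "unknown function: $arg" >&2 ;;
    esac
done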

Gracefully exit from a PowerShell background wait-event loop on Linux

I have a PowerShell script I am kicking off in the background on Linux.
> ./test_wait.ps1 &
I am trying to exit the background process so that the finally block is executed, but none of the signals (or Stop-Process) works; the finally block is always skipped. If I start it in the foreground and use Ctrl-C, then everything works. But when it is a background process it doesn't respond to SIGINT, and other signals just cause it to exit immediately. Am I missing something? Or am I going to have to approach this another way?
#!/usr/bin/pwsh
$Start_Time = (Get-Date).Second
$n = 1
Try
{
    While ($true) {
        $n++
        Wait-Event 1
    }
}
Finally
{
    $End_Time = (Get-Date).Second
    $Time_Diff = $End_Time - $Start_Time
    "Total time in seconds $Time_Diff" > out.log
}

ZSH - wrap all commands with a function

I'd like to print, at the beginning, the PID of any command I run (even if it's not a background process).
I came up with this:
fn() {
    eval $1
}
Now whenever I run a command, I want it to be processed as
fn "actual_command_goes_here"&; wait
So that this will trigger a background process (which will print the PID) and then run the actual command. Example:
$ fn "sleep 5; date +%s; sleep 5; date +%s; sleep 2"&; wait
[1] 29901
1479143885
1479143890
[1] + 29901 done fn "..."
My question is: is there any way to create a wrapper function for all commands in Bash/ZSH? That is, whenever I run ls, it should actually run fn "ls"&; wait.

Bash get output from class

I've got a script, and I want to do something like this:
text1() {
something here
}
show(){
echo test1()
and some text here
}
Basically I want to use the output of the first function in the second function. How can I do this?
If you want to put into a variable a value that a function writes to stdout, use $():
foo() {
    printf '%s\n' 'ququ'
}

bar() {
    VAR="$(foo)"
    echo "$VAR"
}
I.e., functions in GNU Bash (and other shells as well) behave like external utilities in this respect.
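For example, calling bar from a shell where both functions are defined prints the value captured from foo:
sh$ bar
ququ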
I don't know if this is really what you want, but ...
You must know that bash functions, internal commands and standard tools don't return their output. Instead they write it to stdout. When you don't use any redirection, stdout is the terminal screen where you launched the command.
function text1() {
    echo "In text1()"
}

function show() {
    text1
    echo "In show()"
}
If I call text1 from my terminal:
sh$ text1
In text1()
During its execution, the function text1 invokes echo, which sends its output to stdout. I see the result on the console.
sh$ show
In text1()
In show()
Calling show executes text1 (producing the same output as previously) followed by the output of the second echo.
If you want to store the intermediate result of a function or command in a variable, you might use the VAR=$(...) notation. Think of it as "capturing" the output:
function text1() {
    echo "In text1()"
}

function show() {
    MYVAR=$(text1)
    echo "In show() where MYVAR = ${MYVAR}"
}
Please compare the output now with the previous case:
sh$ show
In show() where MYVAR = In text1()

How to get the exit code of spawned process in expect shell script?

I am trying to execute a script that runs an Expect script, which in turn spawns a process whose exit code I need. But I'm unable to get the exit code of the spawned process back to the main script; I always get zero (success).
The expect script is:
[Linux Dev:anr ]$ cat testexit.sh
#!/bin/bash
export tmp_script_file="/home/anr/tmp_script_temp.sh"
cp /home/anr/tmp_script $tmp_script_file
chmod a+x $tmp_script_file
cat $tmp_script_file
expect << 'EOF'
set timeout -1
spawn $env(tmp_script_file)
expect {
"INVALID " { exit 4 }
timeout { exit 4 }
}
EOF
echo "spawned process status" $?
rm -f $tmp_script_file
echo "done"
Spawned script:
[Linux Dev:anr ]$ cat tmp_script
exit 3
Execution of Expect script:
[Linux Dev:anr ]$ ./testexit.sh
exit 3
spawn /home/anr/tmp_script_temp.sh
spawned process status 0
done
The problem is that I am unable to get the spawned process's exit code from the expect script. I want the exit code 3 of the spawned script propagated to the main script, and the main script should exit with exit code 3.
Please help me get the spawned exit code back to the main script.
You get the exit status of the spawned process with the wait command:
expect <<'END'
    log_user 0
    spawn sh -c {echo hello; exit 42}
    expect eof
    puts $expect_out(buffer)
    lassign [wait] pid spawnid os_error_flag value
    if {$os_error_flag == 0} {
        puts "exit status: $value"
    } else {
        puts "errno: $value"
    }
END
hello
exit status: 42
From the expect man page
wait [args]
delays until a spawned process (or the current process if none is named) terminates.
wait normally returns a list of four integers. The first integer is the pid of the process that was waited upon. The second integer is the corresponding spawn id. The third integer is -1 if an operating system error occurred, or 0 otherwise. If the third integer was 0, the fourth integer is the status returned by the spawned process. If the third integer was -1, the fourth integer is the value of errno set by the operating system. The global variable errorCode is also set.
Change
expect {
    "INVALID " { exit 4 }
    timeout { exit 4 }
}
to
expect {
    "INVALID " { exit 4 }
    timeout { exit 4 }
    eof
}
Then add the lassign and if commands.
With the help of glenn, I got the solution, and my final script is below.
The expect script:
[Linux Dev:anr ]$ cat testexit.sh
#!/bin/bash
export tmp_script_file="/home/anr/tmp_script_temp.sh"
cp /home/anr/tmp_script $tmp_script_file
chmod a+x $tmp_script_file
cat $tmp_script_file
expect << 'EOF'
set timeout -1
spawn $env(tmp_script_file)
expect {
"INVALID " { exit 4 }
timeout { exit 4 }
eof
}
foreach {pid spawnid os_error_flag value} [wait] break
if {$os_error_flag == 0} {
puts "exit status: $value"
exit $value
} else {
puts "errno: $value"
exit $value
}
EOF
echo "spawned process status" $?
rm -f $tmp_script_file
echo "done"
Spawned script:
[Linux Dev:anr ]$ cat tmp_script
exit 3
Execution of Expect script:
[Linux Dev:anr ]$ ./testexit.sh
exit 3
spawn /home/anr/tmp_script_temp.sh
exit status: 3
spawned process status 3
done
Thanks Glenn once again.
After struggling for a few days with expanding a variable inside the expect heredoc, I finally came across another approach that I thought may be helpful to someone in need. My requirement was to pass a command and a password to a shell function, execute the command on a remote host as part of the expect heredoc, and get back the exit code.
Example:
function shell_function {
    # Get the command and password as arguments
    # Run command using expect
    # Return the exit code
}
shell_function <cmd> <password>
echo $?
Like everyone else, I found that expanding a variable inside the heredoc was a problem; the usual fix is to export the value into an environment variable and read it back with env inside the heredoc. Since the password was one of the arguments, I didn't want to store it in an environment variable. So, instead of quoting the heredoc delimiter, the expect variables inside the heredoc have been escaped. This allows the arguments passed to the function to be used directly.
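For contrast, the export-and-env approach mentioned above would look roughly like the sketch below inside the same function (SSH_PASWD is an illustrative name); it works, but it puts the password into the environment, which is exactly the drawback described:
export SSH_PASWD="$1"   # the password now sits in the environment
expect << 'END'
    set timeout 60
    spawn ssh userid@hostname ls .sawfish
    expect {
        "yes/no" { send "yes\r" }
        "*assword*" { send -- "$env(SSH_PASWD)\r" }
    }
    expect eof
    catch wait result
    exit [lindex $result 3]
END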
Following is the final script:
#! /bin/bash

# This function runs a command like 'ssh' and provides the password
function run_with_password {
    cmd="$2"
    paswd="$1"
    expect << END
        set timeout 60
        spawn $cmd
        expect {
            "yes/no" { send "yes\r" }
            "*assword*" { send -- $paswd\r }
        }
        expect eof
        catch wait result
        exit [lindex \$result 3]
END
}

my_password="AnswerIS42Really?"
cmd_to_run="ssh userid@hostname"
cmd_to_run="$cmd_to_run ls .sawfish"
run_with_password $my_password "$cmd_to_run"
echo "Command run code: $?"
In the above code, the expect variable that needs escaping is $result: after changing it to \$result so that the shell leaves it alone, the script started working like a charm.
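A quick way to see why the escape matters (illustrative only): with an unquoted heredoc delimiter the shell expands $result to an empty string before expect ever sees the script, whereas \$result survives as a literal $result for Tcl:
cat << END
unescaped: $result
escaped:   \$result
END
# Output (assuming no shell variable named 'result' is set):
# unescaped:
# escaped:   $result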
My sincere thanks to the users who provided answers to the following questions, which served as stepping stones to reach my solution.
Douglas Leeder: help with expect script, run cat on remote comp and get output of it to the variable
glenn jackman: How to return spawned process exit code in Expect script?
