How can I execute more than one command in exec command of linux expect - linux

I'm trying to detect whether a host is available using expect on Linux.
Currently, I can use the following commands via bash to get the return value and check the host's status.
#!/usr/bin/bash
nc -z $DUT_IP 22 && echo $?
I want to do the same thing via expect, but it seems I failed.
#!/usr/bin/expect --
set DUT_IP "192.168.1.9"
set result [exec nc -z $DUT_IP 22 && echo $?]
send_user "$result\n\n"
I got the following error message:
invalid port &&
while executing
"exec nc -z $DUT_IP 22 && echo $?"
invoked from within
"set result [exec nc -z $DUT_IP 22 && echo $?] "
(file "reboot.exp" line 44)

Use of && to separate commands is a shell construct, not an expect construct. You can explicitly launch a shell for that command list:
set result [exec sh -c "nc -z $DUT_IP 22 && echo $?"]
Note that this will only print the exit status if the command succeeded, so result will be either "0" or empty. Use ; instead of && if you want the status printed regardless of success.
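The difference is easy to see in plain sh, no expect needed (a minimal sketch using false as a stand-in for a failing nc probe):

```shell
# '&&' runs 'echo $?' only when the probe succeeded, so on failure nothing
# is printed; ';' runs it unconditionally, so you always get a status back.
with_and=$(sh -c 'false && echo $?') || true   # empty: false failed, echo skipped
with_semi=$(sh -c 'false; echo $?')            # "1": echo always runs
echo "and='$with_and' semi='$with_semi'"
```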

Related

Getting exit code from SSH command running in background in Linux (Ksh script)

I'm trying to run a few commands in parallel on a couple of remote servers using SSH, and I need to get the correct exit codes from those commands, but with no success.
I am using the command below
(ssh -o LogLevel=Error "remote_server1" -n " . ~/.profile 1>&- 2>&-;echo "success" 2>&1 > /dev/null" ) & echo $? > /tmp/test1.txt
(ssh -o LogLevel=Error "remote_server2" -n " . ~/.profile 1>&- 2>&-;caname 2>&1 > /dev/null" ) & echo $? > /tmp/test2.txt
The result is always "0" (echo $?) even when I force a failure, as in the second example above. Any ideas?
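The pattern (cmd) & echo $? reports the status of *starting* the background job, which is always 0, not the status of cmd itself. A minimal sketch of the usual fix, using sh -c 'exit 3' as a stand-in for the ssh command: record the PID with $! and collect the real exit code with wait.

```shell
# Background the command, remember its PID, and reap its real status later.
sh -c 'exit 3' &          # stand-in for the backgrounded ssh command
pid=$!
rc=0
wait "$pid" || rc=$?      # wait returns the background job's exit code
echo "$rc"                # 3, not the 0 that 'echo $?' after '&' would print
```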

SSH Remote command exit code

I know there are lots of discussions about this, but I need your help with ssh remote command exit codes. I have this code:
(scan is a script which scans for viruses in the given file)
for i in $FILES
do
RET_CODE=$(ssh $SSH_OPT $HOST "scan $i; echo $?")
if [ $? -eq 0 ]; then
SOME_CODE
The scan works and returns either 0, 1 for errors, or 2 if a virus is found. But somehow my return code is always 0, even if I scan a virus.
Here is set -x output:
++ ssh -i /home/USER/.ssh/id host 'scan Downloads/eicar.com; echo 0'
+ RET_CODE='File Downloads/eicar.com: VIRUS: Virus found.
code of the Eicar-Test-Signature virus
0'
Here is the output if I run those commands on the "remote" machine without ssh:
[user@ws ~]$ scan eicar.com; echo $?
File eicar.com: VIRUS: Virus found.
code of the Eicar-Test-Signature virus
2
I just want the return code; I don't need all the other output of scan.
UPDATE:
It seems like echo is the problem.
The reason your ssh always returns 0 is that the final echo command always succeeds! If you want the return code from scan, either remove the echo or assign it to a variable and use exit. On my system:
$ ssh host 'false'
$ echo $?
1
$ ssh host 'false; echo $?'
1
$ echo $?
0
$ ssh host 'false; ret=$?; echo $ret; exit $ret'
1
$ echo $?
1
ssh returns the exit status of the entire pipeline that it runs - in this case, that's the exit status of echo $?.
What you want to do is simply use the ssh result directly (since you say that you don't want any of the output):
for i in $FILES
do
if ssh $SSH_OPT $HOST "scan $i >/dev/null 2>&1"
then
SOME_CODE
If you really feel you must print the return code, you can do that without affecting the overall result by using an EXIT trap:
for i in $FILES
do
if ssh $SSH_OPT $HOST "trap 'echo \$?' EXIT; scan $i >/dev/null 2>&1"
then
SOME_CODE
Demo:
$ ssh $host "trap 'echo \$?' EXIT; true"; echo $?
0
0
$ ssh $host "trap 'echo \$?' EXIT; false"; echo $?
1
1
BTW, I recommend avoiding uppercase variable names in your scripts - those are conventionally reserved for environment variables that change the behaviour of programs.

Using expect to run "nc -w 3 -z $IP 22" command on macOS but get long waiting timeout

I'm using the Linux expect command to execute "nc -w 3 -z $IP 22" on macOS.
I expected a timeout after 3 seconds, but I got 1m 15s (the nc command's timeout should be settable via the "-w" argument).
The same code runs normally on other Linux machines (Debian 7.8, Ubuntu 14.04).
Can anyone tell me why I get different results between Linux and macOS?
Here is my source code:
#!/usr/bin/expect -f
set DUT_IP "192.168.1.2"
if {0!=[catch {exec sh -c "nc -w 3 -z $DUT_IP 22 && echo \$?"}]} {
send_user "Cannot connect to DUT $DUT_IP\n"
exit -1
} else {
puts "IP exist"
}
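One likely cause: the nc shipped with macOS is the BSD variant, where -w mainly governs the idle timeout of an established connection, while the connect phase is capped by a separate -G flag (where available). If you don't want to depend on which flags each platform's nc honors, you can enforce the deadline from outside nc, e.g. with coreutils timeout (gtimeout on macOS via Homebrew). A minimal sketch, with the actual nc probe shown as a comment since it depends on your network:

```shell
# timeout kills the child at the deadline and exits 124, regardless of
# which flags this platform's nc honors.
# timeout 3 nc -z "$DUT_IP" 22   # hypothetical probe, hard-capped at 3s
rc=0
timeout 1 sleep 5 || rc=$?       # stand-in for a probe that hangs
echo "$rc"                       # 124: killed at the 1-second deadline
```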

get return value of command run with script -c

Is there a way to capture the return value of a program run using script -c?
For example (in bash)
/bin/false; echo $? # outputs 1
/usr/bin/script -c "/bin/false" /dev/null; echo $?
# outputs 0 as script exited successfully.
I need to get the return value from /bin/false instead of from /usr/bin/script. Is this possible? I'm using script to trick a program into thinking it is running in a real tty even though it isn't. Thanks!
According to man script, the -e option makes script return the exit code of the child process.
-e, --return
Return the exit code of the child process.
Uses the same format as bash termination
on signal termination exit code is 128+n.
Here are some examples:
$ /usr/bin/script -e -c "/bin/false" /dev/null; echo $?
1
$ /usr/bin/script -e -c "/bin/true" /dev/null; echo $?
0
$ /usr/bin/script -e -c "exit 123" /dev/null; echo $?
123

Execute Shell script after other script got executed successfully

Problem Statement:-
I have four shell scripts that I want to execute such that each runs only if the previous one succeeded. I currently run them like this:
./verify-export-realtime.sh
sh -x lca_query.sh
sh -x liv_query.sh
sh -x lqu_query.sh
So, to make each script run only after the previous one succeeded, do I need to do something like the code below? I'm not sure whether I'm right. If any script fails for any reason, will it print "Failed due to some reason"?
./verify-export-realtime.sh
RET_VAL_STATUS=$?
echo $RET_VAL_STATUS
if [ $RET_VAL_STATUS -ne 0 ]; then
echo "Failed due to some reason"
exit
fi
sh -x lca_query.sh
RET_VAL_STATUS=$?
echo $RET_VAL_STATUS
if [ $RET_VAL_STATUS -ne 0 ]; then
echo "Failed due to some reason"
exit
fi
sh -x liv_query.sh
RET_VAL_STATUS=$?
echo $RET_VAL_STATUS
if [ $RET_VAL_STATUS -ne 0 ]; then
echo "Failed due to some reason"
exit
fi
sh -x lqu_query.sh
The shell provides the && operator to do exactly this. So you could write:
./verify-export-realtime.sh && \
sh -x lca_query.sh && \
sh -x liv_query.sh && \
sh -x lqu_query.sh
or you could get rid of the line continuations (\) and write it all on one line:
./verify-export-realtime.sh && sh -x lca_query.sh && sh -x liv_query.sh && sh -x lqu_query.sh
If you want to know how far it got, you can add extra commands that just set a variable:
done=0
./verify-export-realtime.sh && done=1 &&
sh -x lca_query.sh && done=2 &&
sh -x liv_query.sh && done=3 &&
sh -x lqu_query.sh && done=4
The value of $done at the end tells you how many commands completed successfully. $? is set to the exit status of the last command run (the one that failed), or 0 if all succeeded.
You can simply run a chain of scripts on the command line (or from another script) using the && operator; the first failing command breaks the chain:
$ script1.sh && echo "First done, running the second" && script2.sh && echo "Second done, running the third" && script3.sh && echo "Third done, cool!"
And so on. The chain stops as soon as one of the steps fails.
That should be right. You can also print the error code if necessary by echoing $?. You can also define your own return codes by exiting with your own values in those scripts and checking them in this main one. That might be more helpful than "The script failed for some reason".
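A minimal sketch of that idea, with a hypothetical code (the value 7 and the "database unreachable" meaning are invented for illustration, not anything the poster's scripts define):

```shell
# Suppose lca_query.sh ends with 'exit 7' when its database is unreachable.
rc=0
sh -c 'exit 7' || rc=$?   # stand-in for: sh -x lca_query.sh
case "$rc" in
  0) echo "lca_query ok" ;;
  7) echo "lca_query failed: database unreachable" ;;
  *) echo "lca_query failed with code $rc" ;;
esac
```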
If you want more flexible error handling:
script1.sh
rc=$?
if [ ${rc} -eq 0 ]; then
    echo "script1 pass, starting script2"
    script2.sh
    rc=$?
    if [ ${rc} -eq 0 ]; then
        echo "script2 pass"
    else
        echo "script2 failed"
    fi
else
    echo "script1 failed"
fi
The standard way to do this is to add a shell option that causes the script to abort if any simple command fails. Write the interpreter line as:
#!/bin/sh -e
or add the command:
set -e
(It is also common to do cmd1 && cmd2 && cmd3 as mentioned in other solutions.)
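A quick sketch of the effect: with set -e, execution stops at the first failing command, so anything after the failure never runs.

```shell
# 'echo after' is never reached because 'false' fails under set -e.
out=$(sh -c 'set -e; echo before; false; echo after') || true
echo "$out"   # prints only "before"
```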
You absolutely should not attempt to print an error message. The command should print a relevant error message before it exits if it encounters an error. If the commands are not well behaved and do not write useful error messages, you should fix them rather than trying to guess what error they encountered. If you do write an error message, at the very least write it to the correct place. Errors belong on stderr:
echo "Some error occurred" >&2
As @William Pursell said, your scripts really should report their own errors. If you also need error reporting in the calling script, the easiest way to do it is like this:
if ! ./verify-export-realtime.sh; then
echo "Error running verify-export-realtime.sh; rest of script cancelled." >&2
elif ! sh -x lca_query.sh; then
echo "Error running lca_query.sh; rest of script cancelled." >&2
elif ! sh -x liv_query.sh; then
echo "Error running liv_query.sh; rest of script cancelled." >&2
elif ! sh -x lqu_query.sh; then
echo "Error running lqu_query.sh." >&2
fi
