Can't use a variable out of while and pipe in bash - linux

I have code like this:
var="before"
echo "$someString" | sed '$someRegex' | while read line
do
if [ $condition ]; then
var="after"
echo "$var" #first echo
fi
done
echo "$var" #second echo
Here the first echo prints "after", but the second prints "before". How can I make the second echo print "after"? I think it is because of the pipe, but I don't know how to work around it.
Thanks for any solutions...
Edit with the accepted fix applied:
I corrected it and it works fine. Thanks, eugene, for your useful answer.
var="before"
while read line
do
if [ $condition ]; then
var="after"
echo "$var" #first echo
fi
done < <(echo "$someString" | sed '$someRegex')
echo "$var" #second echo

The reason for this behaviour is that a while loop runs in a subshell when it's part of a pipeline. For the while loop above, a new subshell with its own copy of the variable var is created.
See this article for possible workarounds: I set variables in a loop that's in a pipeline. Why do they disappear after the loop terminates? Or, why can't I pipe data to read?.
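If restructuring the pipeline is not an option, bash 4.2 and newer also offer shopt -s lastpipe, which runs the last stage of a pipeline in the current shell; it only takes effect when job control is off, which is the default in scripts. A minimal sketch, reusing the question's placeholder variables $someString and $someRegex:
#!/bin/bash
shopt -s lastpipe                  # bash >= 4.2; needs job control off (the default in scripts)
var="before"
echo "$someString" | sed "$someRegex" | while read -r line
do
    var="after"                    # the loop now runs in the parent shell, not a subshell
done
echo "$var"                        # prints "after"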

Related

bash until not meeting condition [duplicate]

Wondering if it's possible to finagle this logic (checking a variable for changes over time and running a loop while true) into a bash if statement or while loop condition. I was hoping for something like:
var=$(du -h *flat*.vmdk)
var2=$(sleep 1 ; du -h *flat*.vmdk)
if [[ $var != $var2 ]]; then
while true
do
echo -ne $(du -h *flat*.vmdk)\\r
sleep 1
done
else
echo "Transfer complete"
fi
I've also played with a while loop, rather than an if/then, with no luck.
while [ $var != $var2 ] ; do echo -ne $(du -h *flat*.vmdk)\\r ; sleep 1 ; done
But I'm seeing that's not possible? Or I'm having issues where things are incorrectly getting expanded. I'm open to any solution, although I am limited by a very basic shell (ESXi Shell) where many common unix/shell tools may not be present.
You are doing while [ $var != $var2 ] but never updating either of these variables.
I would do something like:
function get_size() {
echo $(du -h *flat*.vmdk)
}
var="$(get_size)"
sleep 1
var2="$(get_size)"
while [ "$var" != "$var2" ]; do
var=$var2
var2="$(get_size)"
echo -ne "$(get_size)\\r"
sleep 1
done
echo "Transfer complete"
What it does:
Use a function: when you have to write the same line two or more times, that should trigger an "I should make this a function" reflex.
Update $var and $var2 within the while loop, so you don't compare the exact same values each time, but the previous value against the current one.
Add newlines to your code: code is meant to be read by humans, not machines, and humans don't like one-liners :)
I haven't tested it.
Not a generic solution, but if what you need is to wait while the file keeps changing, you can simply monitor its modification timestamp with find (assuming that command is available), like this:
while find . -name '*flat*.vmdk' -newermt "$(date --date "-1 second" +@%s)" | read
do
sleep 1
done
echo "Transfer Completed !"
without using any variables at all.
I like @zeppelin's approach and I think I would have used it, but the date command in my environment was limited and I wasn't looking to invest any more time trying to figure that out. I did go with Arount's solution with a few modifications, as seen below:
get_size() {
echo $(du -h *flat*.vmdk)
}
update() {
var="$(get_size)"
sleep 2
var2="$(get_size)"
}
update
while [ "$var" != "$var2" ]; do
update
echo -ne "$(get_size)\\r"
sleep 1
done
echo "Transfer complete"
The changes I needed:
ESXi Shell uses sh/dash, so I wasn't able to use the proposed function get_size() { syntax.
For whatever reason, the variables always matched until I created the update function and ran it both inside and outside the while loop.
Works well/as expected now. Thank you everyone for your help... hope it helps someone else.

Unable to array values outside of function in shell script [duplicate]

Please explain to me why the very last echo statement is blank? I expect that XCODE is incremented in the while loop to a value of 1:
#!/bin/bash
OUTPUT="name1 ip ip status" # normally output of another command with multi line output
if [ -z "$OUTPUT" ]
then
echo "Status WARN: No messages from SMcli"
exit $STATE_WARNING
else
echo "$OUTPUT"|while read NAME IP1 IP2 STATUS
do
if [ "$STATUS" != "Optimal" ]
then
echo "CRIT: $NAME - $STATUS"
echo $((++XCODE))
else
echo "OK: $NAME - $STATUS"
fi
done
fi
echo $XCODE
I've tried using the following statement instead of the ++XCODE method
XCODE=`expr $XCODE + 1`
and it too won't print outside of the while statement. I think I'm missing something about variable scope here, but the ol' man page isn't showing it to me.
Because you're piping into the while loop, a sub-shell is created to run the while loop.
Now this child process has its own copy of the environment and can't pass any
variables back to its parent (as in any unix process).
Therefore you'll need to restructure so that you're not piping into the loop.
Alternatively you could run in a function, for example, and echo the value you
want returned from the sub-process.
http://tldp.org/LDP/abs/html/subshells.html#SUBSHELL
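For instance, here is a minimal sketch of that "echo the value back from the subshell" idea applied to the script above, using a brace group rather than a function; the CRIT/OK diagnostics are sent to stderr so they don't end up in the captured value:
XCODE=$(echo "$OUTPUT" | {
    count=0
    while read NAME IP1 IP2 STATUS; do
        if [ "$STATUS" != "Optimal" ]; then
            echo "CRIT: $NAME - $STATUS" >&2   # diagnostics to stderr
            count=$((count + 1))
        else
            echo "OK: $NAME - $STATUS" >&2
        fi
    done
    echo "$count"                              # the only thing written to stdout
})
echo "$XCODE"                                  # prints 1 for the sample OUTPUT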
The problem is that processes put together with a pipe are executed in subshells (and therefore have their own environment). Whatever happens within the while does not affect anything outside of the pipe.
Your specific example can be solved by rewriting the pipe to
while ... do ... done <<< "$OUTPUT"
or perhaps
while ... do ... done < <(echo "$OUTPUT")
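Applied to the script in the question, the here-string form keeps the while loop in the current shell, so the increment survives; a minimal sketch:
XCODE=0
while read NAME IP1 IP2 STATUS; do
    if [ "$STATUS" != "Optimal" ]; then
        echo "CRIT: $NAME - $STATUS"
        XCODE=$((XCODE + 1))
    else
        echo "OK: $NAME - $STATUS"
    fi
done <<< "$OUTPUT"
echo "$XCODE"        # now prints 1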
This should work as well (because echo and while are in the same subshell):
#!/bin/bash
cat /tmp/randomFile | (while read line
do
LINE="$LINE $line"
done && echo $LINE )
One more option:
#!/bin/bash
cat /some/file | while read line
do
var="abc"
echo $var | xsel -i -p # redirect stdin to the X primary selection
done
var=$(xsel -o -p) # redirect back to stdout
echo $var
EDIT:
Here, xsel is a requirement (install it).
Alternatively, you can use xclip:
xclip -i -selection clipboard
instead of
xsel -i -p
I got around this when I was making my own little du:
ls -l | sed '/total/d ; s/ */\t/g' | cut -f 5 |
( SUM=0; while read SIZE; do SUM=$(($SUM+$SIZE)); done; echo "$(($SUM/1024/1024/1024))GB" )
The point is that I make a subshell with ( ) containing my SUM variable and the while, but I pipe into the whole ( ) instead of into the while itself, which avoids the gotcha.
#!/bin/bash
OUTPUT="name1 ip ip status"
+export XCODE=0;
if [ -z "$OUTPUT" ]
----
echo "CRIT: $NAME - $STATUS"
- echo $((++XCODE))
+ export XCODE=$(( $XCODE + 1 ))
else
echo $XCODE
see if those changes help
Another option is to output the results into a file from the subshell and then read it back in the parent shell, something like:
#!/bin/bash
EXPORTFILE=/tmp/exportfile${RANDOM}
cat /tmp/randomFile | while read line
do
LINE="$LINE $line"
echo $LINE > $EXPORTFILE
done
LINE=$(cat $EXPORTFILE)
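A small variation on the same idea: if mktemp is available, it creates the scratch file more safely than /tmp/exportfile${RANDOM}, and the file can be removed afterwards. A sketch:
#!/bin/bash
EXPORTFILE=$(mktemp)                  # unique temporary file, no ${RANDOM} collisions
cat /tmp/randomFile | while read line
do
    LINE="$LINE $line"
    echo "$LINE" > "$EXPORTFILE"      # overwritten each iteration with the latest value
done
LINE=$(cat "$EXPORTFILE")
rm -f "$EXPORTFILE"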

Searching for a substring in a bash script will not work

I have been writing a bash script, called from my .bashrc file, that prints the results of whatis for a random command in my /usr/bin folder, and I wanted to exclude commands that return "nothing appropriate" in the result; whether I use grep, wc, expr, or ==, nothing seems to work. I have pretty much used every example here and here with no progress. This is what I have so far, but it fails to do what I want when it finds something that contains "nothing appropriate." If anyone could figure out how to get it to work, or suggest a good solution for this situation, I would be grateful.
#! /bin/bash
echo "Did you know that:";
while :
do
RESULT=$(whatis $(ls /usr/bin | shuf -n 1))
if [[ $RESULT != *"nothing appropriate"* ]]
then
echo $RESULT
break
fi
done
whatis prints the "nothing appropriate" message on the standard error stream. That stream is not captured by $( ), which is the reason for your issue.
Here is one way to fix it:
#! /bin/bash
echo "Did you know that:";
while :
do
RESULT=$(whatis $(ls /usr/bin | shuf -n 1) 2>&1 | cat - )
if [[ $RESULT != *"nothing appropriate"* ]]
then
echo $RESULT
break
fi
done
The 2>&1 | cat - addition does the trick
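For what it's worth, the 2>&1 alone is what matters: it sends whatis's stderr into the output that $( ) captures, so the trailing | cat - can be dropped without changing the behaviour:
RESULT=$(whatis "$(ls /usr/bin | shuf -n 1)" 2>&1)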

Bash script does not continue to read the next line of file

I have a shell script that saves the output of a command that is executed to a CSV file. It reads the command it has to execute from a shell script which is in this format:
ffmpeg -i /home/test/videos/avi/418kb.avi /home/test/videos/done/418kb.flv
ffmpeg -i /home/test/videos/avi/1253kb.avi /home/test/videos/done/1253kb.flv
ffmpeg -i /home/test/videos/avi/2093kb.avi /home/test/videos/done/2093kb.flv
You can see each line is an ffmpeg command. However, the script just executes the first line. Just a minute ago it was doing nearly all of the commands. It was missing half for some reason. I edited the text file that contained the commands and now it will only do the first line. Here is my bash script:
#!/bin/bash
# Shell script utility to read a file line by line.
# Once line is read it will run processLine() function
#Function processLine
processLine(){
line="$#"
START=$(date +%s.%N)
eval $line > /dev/null 2>&1
END=$(date +%s.%N)
DIFF=$(echo "$END - $START" | bc)
echo "$line, $START, $END, $DIFF" >> file.csv 2>&1
echo "It took $DIFF seconds"
echo $line
}
# Store file name
FILE=""
# get file name as command line argument
# Else read it from standard input device
if [ "$1" == "" ]; then
FILE="/dev/stdin"
else
FILE="$1"
# make sure file exist and readable
if [ ! -f $FILE ]; then
echo "$FILE : does not exists"
exit 1
elif [ ! -r $FILE ]; then
echo "$FILE: can not read"
exit 2
fi
fi
# read $FILE using the file descriptors
# Set loop separator to end of line
BAKIFS=$IFS
IFS=$(echo -en "\n\b")
exec 3<&0
exec 0<$FILE
while read line
do
# use $line variable to process line in processLine() function
processLine $line
done
exec 0<&3
# restore $IFS which was used to determine what the field separators are
BAKIFS=$ORIGIFS
exit 0
Thank you for any help.
UPDATE 2
It's the ffmpeg commands rather than the shell script that aren't working. But I should have been using just "\b", as Paul pointed out. I am also making use of Johannes's shorter script.
I think that should do the same and seems to be correct:
#!/bin/bash
CSVFILE=/tmp/file.csv
cat "$#" | while read line; do
echo "Executing '$line'"
START=$(date +%s)
eval $line &> /dev/null
END=$(date +%s)
let DIFF=$END-$START
echo "$line, $START, $END, $DIFF" >> "$CSVFILE"
echo "It took ${DIFF}s"
done
no?
ffmpeg reads STDIN and exhausts it. The solution is to call ffmpeg with:
ffmpeg </dev/null ...
See the detailed explanation here: http://mywiki.wooledge.org/BashFAQ/089
Update:
Since ffmpeg version 1.0, there is also the -nostdin option, so this can be used instead:
ffmpeg -nostdin ...
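Applied to the processLine function from the question, the redirect fix might look like this (a sketch; alternatively, -nostdin could be added to each ffmpeg line in the commands file, assuming ffmpeg 1.0 or newer):
processLine(){
    line="$@"
    START=$(date +%s.%N)
    eval "$line" </dev/null > /dev/null 2>&1   # /dev/null on stdin keeps ffmpeg from eating the loop's input
    END=$(date +%s.%N)
    DIFF=$(echo "$END - $START" | bc)
    echo "$line, $START, $END, $DIFF" >> file.csv
    echo "It took $DIFF seconds"
}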
I just had the same problem.
I believe ffmpeg is responsible for this behaviour.
My solution for this problem:
1) Call ffmpeg with an "&" at the end of your ffmpeg command line
2) Since the script will now not wait for the ffmpeg process to complete,
we have to prevent our script from starting several ffmpeg processes.
We achieve this goal by delaying the loop pass while there is at least
one running ffmpeg process.
#!/bin/bash
cat FileList.txt |
while read VideoFile; do
<place your ffmpeg command line here> &
FFMPEGStillRunning="true"
while [ "$FFMPEGStillRunning" = "true" ]; do
Process=$(ps -C ffmpeg | grep -o -e "ffmpeg" )
if [ -n "$Process" ]; then
FFMPEGStillRunning="true"
else
FFMPEGStillRunning="false"
fi
sleep 2s
done
done
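A simpler variant of the same idea: when job control is off (as it is in a script), a command started with & gets its standard input from /dev/null anyway, so you can background ffmpeg and wait for that one job instead of polling ps. A sketch, with the ffmpeg command line left as a placeholder as above:
#!/bin/bash
cat FileList.txt |
while read VideoFile; do
    ffmpeg -i "$VideoFile" "${VideoFile%.*}.flv" &   # placeholder command line
    wait $!                                          # block until this ffmpeg job has finished
done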
I would add echoes before and after the eval: before, to see what it's about to eval (in case it's treating the whole file as one big long line), and after (in case one of the ffmpeg commands is taking forever).
Unless you are planning to read something from standard input after the loop, you don't need to preserve and restore the original standard input (though it is good to see you know how).
Similarly, I don't see a reason for dinking with IFS at all. There is certainly no need to restore the value of IFS before exit - this is a real shell you are using, not a DOS BAT file.
When you do:
read var1 var2 var3
the shell assigns the first field to $var1, the second to $var2, and the rest of the line to $var3. In the case where there's just one variable - your script, for example - the whole line goes into the variable, just as you want it to.
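A quick way to see that splitting in action (using a here-string so the variables are set in the current shell):
read var1 var2 var3 <<< "alpha beta gamma delta"
echo "$var1"    # alpha
echo "$var2"    # beta
echo "$var3"    # gamma delta   (the rest of the line)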
Inside the process line function, you probably don't want to throw away error output from the executed command. You probably do want to think about checking the exit status of the command. The echo with error redirection is ... unusual, and overkill. If you're sufficiently sure that the commands can't fail, then go ahead with ignoring the error. Is the command 'chatty'; if so, throw away the chat by all means. If not, maybe you don't need to throw away standard output, either.
The script as a whole should probably diagnose when it is given multiple files to process since it ignores the extraneous ones.
You could simplify your file handling by using just:
cat "$#" |
while read line
do
processline "$line"
done
The cat command automatically reports errors (and continues after them) and processes all the input files, or reads standard input if there are no arguments left. The use of double quotes around the variable means that it is passed as a single unit (and therefore unparsed into separate words).
The use of date and bc is interesting - I'd not seen that before.
All in all, I'd be looking at something like:
#!/bin/bash
# Time execution of commands read from a file, line by line.
# Log commands and times to CSV logfile "file.csv"
processLine(){
START=$(date +%s.%N)
eval "$#" > /dev/null
STATUS=$?
END=$(date +%s.%N)
DIFF=$(echo "$END - $START" | bc)
echo "$line, $START, $END, $DIFF, $STATUS" >> file.csv
echo "${DIFF}s: $STATUS: $line"
}
cat "$#" |
while read line
do
processLine "$line"
done

How to properly handle wildcard expansion in a bash shell script?

#!/bin/bash
hello()
{
SRC=$1
DEST=$2
for IP in `cat /opt/ankit/configs/machine.configs` ; do
echo $SRC | grep '*' > /dev/null
if test `echo $?` -eq 0 ; then
for STAR in $SRC ; do
echo -en "$IP"
echo -en "\n\t ARG1=$STAR ARG2=$2\n\n"
done
else
echo -en "$IP"
echo -en "\n\t ARG1=$SRC ARG2=$DEST\n\n"
fi
done
}
hello $1 $2
The above is the shell script, to which I provide a source (SRC) and destination (DEST) path. It worked fine when I did not put a wildcard '*' in the SRC path. When I run this shell script and give '*.pdf' or '*' as follows:
root@ankit1:~/as_prac# ./test.sh /home/dev/Examples/*.pdf /ankit_test/as
I get the following output:
192.168.1.6
ARG1=/home/dev/Examples/case_Contact.pdf ARG2=/home/dev/Examples/case_howard_county_library.pdf
The DEST is /ankit_test/as, but DEST also gets mangled because of the '*'. The expected output is
ARG1=/home/dev/Examples/case_Contact.pdf ARG2=/ankit_test/as
So, if you understand what I am trying to do, please help me out and solve this bug. I'll be grateful to you. Thanks in advance!!!
I need to know exactly how to use '*.pdf' in my program, one file at a time, without disturbing DEST.
Your script needs more work.
Even after escaping the wildcard, you won't get your expected answer. You will get:
ARG1=/home/dev/Examples/*.pdf ARG2=/ankit__test/as
Try the following instead:
for IP in `cat /opt/ankit/configs/machine.configs`
do
for i in $SRC
do
echo -en "$IP"
echo -en "\n\t ARG1=$i ARG2=$DEST\n\n"
done
done
Run it like this:
root@ankit1:~/as_prac# ./test.sh "/home/dev/Examples/*.pdf" /ankit__test/as
The shell will expand wildcards unless you escape them, so for example if you have
$ ls
one.pdf two.pdf three.pdf
and run your script as
./test.sh *.pdf /ankit__test/as
it will be the same as
./test.sh one.pdf two.pdf three.pdf /ankit__test/as
which is not what you expect. Doing
./test.sh \*.pdf /ankit__test/as
should work.
If you can, change the order of the parameters passed to your shell script as follows:
./test.sh /ankit_test/as /home/dev/Examples/*.pdf
That would make your life a lot easier since the variable part moves to the end of the line. Then, the following script will do what you want:
#!/bin/bash
hello()
{
SRC=$1
DEST=$2
for IP in `cat /opt/ankit/configs/machine.configs` ; do
echo -en "$IP"
echo -en "\n\t ARG1=$SRC ARG2=$DEST\n\n"
done
}
arg2=$1
shift
while [[ "$1" != "" ]] ; do
hello $1 $arg2
shift
done
You are also missing a final "done" to close your outer for loop.
OK, this appears to do what you want:
#!/bin/bash
hello() {
SRC=$1
DEST=$2
while read IP ; do
for FILE in $SRC; do
echo -e "$IP"
echo -e "\tARG1=$FILE ARG2=$DEST\n"
done
done < /tmp/machine.configs
}
hello "$1" $2
You still need to escape any wildcard characters when you invoke the script.
The double quotes are necessary when you invoke the hello function; otherwise the mere fact of evaluating $1 causes the wildcard to be expanded, but we don't want that to happen until $SRC is assigned in the function.
Here's what I came up with:
#!/bin/bash
hello()
{
# DEST will contain the last argument
eval DEST=\$$#
while [ $1 != $DEST ]; do
SRC=$1
for IP in `cat /opt/ankit/configs/machine.configs`; do
echo -en "$IP"
echo -en "\n\t ARG1=$SRC ARG2=$DEST\n\n"
done
shift || break
done
}
hello $*
Instead of passing only two parameters to the hello() function, we'll pass in all the arguments that the script got.
Inside the hello() function, we first assign the final argument to the DEST var. Then we loop through all of the arguments, assigning each one to SRC, and run whatever commands we want using the SRC and DEST arguments. Note that you may want to put quotation marks around $SRC and $DEST in case they contain spaces. We stop looping when SRC is the same as DEST because that means we've hit the final argument (the destination).
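As an aside, if the script is guaranteed to run under bash, the last positional parameter can also be fetched without eval; either of these lines is equivalent to the eval trick above:
DEST=${!#}        # indirect expansion: $# names the last positional parameter
DEST=${@: -1}     # slice of "$@"; the space before -1 is required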
For multiple input files using a wildcard such as *.txt, I found this to work perfectly, no escaping required. It should work just like a native tool such as "ls" or "rm". This was not documented just about anywhere, so since I spent the better part of 3 days trying to figure it out, I decided to post it for future readers.
The directory contains the following files (output of ls):
file1.txt file2.txt file3.txt
Run script like
$ ./script.sh *.txt
Or even like
$ ./script.sh file{1..3}.txt
The script
#!/bin/bash
# store default IFS, we need to temporarily change this
sfi=$IFS
# set IFS to $'\n' - newline
IFS=$'\n'
if [[ $# -eq 0 ]]
then
echo "Error: Missing required argument"
echo
exit 1
fi
# Put the file glob into an array
file=("$#")
# Now loop through them
for (( i=0 ; i < ${#file[*]} ; i++ ));
do
if [ -w ${file[$i]} ]; then
echo ${file[$i]} " writable"
else
echo ${file[$i]} " NOT writable"
fi
done
# Reset IFS to its default value
IFS=$sfi
The output
file1.txt writable
file2.txt writable
file3.txt writable
The key was switching the IFS (Internal Field Separator) temporarily. You have to be sure to store this before switching and then switch it back when you are done with it as demonstrated above.
Now you have a list of expanded files (with spaces escaped) in the file[] array which you can then loop through. I like this solution the best, easiest to program for and easiest for the users.
There's no need to spawn a shell to look at the $? variable, you can evaluate it directly.
It should just be:
if [ $? -eq 0 ]; then
You're running
./test.sh /home/dev/Examples/*.pdf /ankit_test/as
and your interactive shell is expanding the wildcard before the script gets it. You just need to quote the first argument when you launch it, as in
./test.sh "/home/dev/Examples/*.pdf" /ankit_test/as
and then, in your script, quote "$SRC" anywhere you literally want the thing with the wildcards (i.e., when you do echo $SRC, use echo "$SRC" instead), and leave it unquoted when you want the wildcards expanded. Basically, always put quotes around things which might contain shell metacharacters unless you want the metacharacters interpreted. :)
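A tiny illustration of the difference, assuming the one.pdf/two.pdf/three.pdf files from the earlier listing are in the current directory:
SRC="*.pdf"
echo "$SRC"       # quoted: prints the literal pattern  *.pdf
echo $SRC         # unquoted: the shell expands it to the matching files (one.pdf three.pdf two.pdf here)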
