Prevent logging of clear command - linux

Suppose the following simple script:
#!/bin/bash
log="${HOME}/bin/test.log"
if [ -r "${log}" ]; then
rm -f "${log}"
fi
{
echo "Start of test"
clear
echo "End of test"
} 2>&1 | tee -a "${log}"
The contents of the generated log file look like the following:
Start of test
<unprintable>[H<unprintable>[2JEnd of test
Is there any way to avoid the extra characters resulting from issuing a clear command using this style of logging?

One possibility is to just filter them out of the stream that goes to the log file.
{
echo "Start of test"
clear
echo "End of test"
} 2>&1 | tee -a >(sed 's/.\[H.\[2J//' > "${log}")
(I'm not sure how to match a literal escape character using portable sed alone. Here, I just use . to match any character and assume that this regular expression will only match the intended sequence. One could "cheat" and use bash to generate a literal escape character in the sed command:
sed $'s/\e\\[H\e\\[2J//'
although it's not cheating too much since we're already using bash-specific process substitution.)
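Another option, if you'd rather not lean on $'…', might be to generate a literal escape character with printf and splice it into the sed script; this is only a sketch, but it should keep the sed expression itself within POSIX:
esc=$(printf '\033')   # a literal ESC character
{
echo "Start of test"
clear
echo "End of test"
} 2>&1 | tee -a >(sed "s/${esc}\[H${esc}\[2J//" > "${log}")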


How to use grep with single brackets?

I was looking at an answer in another thread about which bracket pair to use with if in a bash script. [[ is less surprising and has more features, such as pattern matching (=~), whereas [ and test are built-in and POSIX compliant, making them portable.
Recently, I was attempting to test the result of a grep command and it was failing with [: too many arguments. I was using [, but when I switched to [[ it worked. How would I do such a test with [ in order to maintain the portability?
This is the test that failed:
#!/bin/bash
cat > slew_pattern << EOF
g -x"$
EOF
if [ $(grep -E -f slew_pattern /etc/sysconfig/ntpd) ]; then
echo "slew mode"
else
echo "not slew mode"
fi
And the test that succeeded:
#!/bin/bash
cat > slew_pattern << EOF
g -x"$
EOF
if [[ $(grep -E -f slew_pattern /etc/sysconfig/ntpd) ]]; then
echo "slew mode"
else
echo "not slew mode"
fi
if [ $(grep -E -f slew_pattern /etc/sysconfig/ntpd) ]; then
This command will certainly fail for multiple matches. It throws an error because the unquoted grep output undergoes word splitting.
The matches from grep are separated by newlines, so the test command ends up looking like:
[ match1 match2 match3 ... ]
which doesn't make much sense. You will get a different error message depending on the number of matches grep returns (i.e. the number of arguments handed to the test command [).
For example:
2 matches will give you a "unary operator expected" error,
3 matches will give you a "binary operator expected" error, and
more than 3 matches will give you a "too many arguments" error (or similar) in Bash.
You need to quote variables inside [ to prevent word splitting.
On the other hand, the Bash-specific [[ does not perform word splitting on unquoted expansions. Thus the grep output doesn't get split on newlines and stays a single string, which is a valid single operand for the test.
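A quick way to see the difference (a hypothetical two-line demonstration, not from the original answer):
out=$'match1\nmatch2'
[ $out ] && echo yes    # word splitting yields two arguments: "unary operator expected"
[[ $out ]] && echo yes  # no word splitting: one non-empty string, so this prints "yes"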
So the solution is to look only at the exit status of grep:
if grep -E -f slew_pattern /etc/sysconfig/ntpd; then
Or use quote when capturing output:
if [ "$(grep -E -f slew_pattern /etc/sysconfig/ntpd)" ]; then
Note:
You don't really need to capture the output here, simply looking at the exit status will suffice.
Additionally, you can suppress grep's output with the -q option and its error messages with the -s option.
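Putting those pieces together, the portable test from the question might look like this (a small sketch):
if grep -qs -E -f slew_pattern /etc/sysconfig/ntpd; then
echo "slew mode"
else
echo "not slew mode"
fi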

Forcing bash to expand variables in a string loaded from a file

I am trying to work out how to make bash (force?) expand variables in a string (which was loaded from a file).
I have a file called "something.txt" with the contents:
hello $FOO world
I then run
export FOO=42
echo $(cat something.txt)
this returns:
hello $FOO world
It didn't expand $FOO even though the variable was set. I can't eval or source the file - as it will try and execute it (it isn't executable as it is - I just want the string with the variables interpolated).
Any ideas?
I stumbled on what I think is THE answer to this question: the envsubst command:
echo "hello \$FOO world" > source.txt
export FOO=42
envsubst < source.txt
This outputs: hello 42 world
If you would like to continue working with the data in a file destination.txt, push it back to a file like this:
envsubst < source.txt > destination.txt
In case it's not already available in your distro, it's in the
GNU package gettext.
@Rockallite
I wrote a little wrapper script to take care of the '$' problem.
(BTW, there is a "feature" of envsubst, explained at
https://unix.stackexchange.com/a/294400/7088
for expanding only some of the variables in the input, but I
agree that escaping the exceptions is much more convenient.)
Here's my script:
#! /bin/bash
## -*-Shell-Script-*-
CmdName=${0##*/}
Usage="usage: $CmdName runs envsubst, but allows '\$' to keep variables from
being expanded.
With option -sl '\$' keeps the back-slash.
Default is to replace '\$' with '$'
"
if [[ $1 = -h ]] ;then echo -e >&2 "$Usage" ; exit 1 ;fi
if [[ $1 = -sl ]] ;then sl='\' ; shift ;fi
sed 's/\\\$/\${EnVsUbDolR}/g' | EnVsUbDolR=$sl\$ envsubst "$@"
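For reference, the partial-expansion feature mentioned in the link above works roughly like this ($FOO and $OTHER are just placeholder names):
export FOO=42
printf 'hello $FOO world, but $OTHER is left alone\n' | envsubst '$FOO'
This prints: hello 42 world, but $OTHER is left alone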
Many of the answers using eval and echo kind of work, but break on various things, such as multiple lines, attempts to escape shell meta-characters, escapes inside the template not intended to be expanded by bash, etc.
I had the same issue, and wrote this shell function, which, as far as I can tell, handles everything correctly. It will still strip only trailing newlines from the template, because of bash's command substitution rules, but I've never found that to be an issue as long as everything else remains intact.
apply_shell_expansion() {
declare file="$1"
declare data=$(< "$file")
declare delimiter="__apply_shell_expansion_delimiter__"
declare command="cat <<$delimiter"$'\n'"$data"$'\n'"$delimiter"
eval "$command"
}
For example, you can use it like this with a parameters.cfg which is really a shell script that just sets variables, and a template.txt which is a template that uses those variables:
. parameters.cfg
printf "%s\n" "$(apply_shell_expansion template.txt)" > result.txt
In practice, I use this as a sort of lightweight template system.
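As a concrete (made-up) illustration of that setup, parameters.cfg can hold plain shell assignments and template.txt can reference them:
# parameters.cfg (hypothetical)
greeting="Hello"
name="world"
# template.txt (hypothetical)
$greeting, $name!
After running the two commands shown above, result.txt would contain: Hello, world!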
You can try:
echo $(eval echo $(cat something.txt))
You don't want to print each line, you want to evaluate it so that Bash can perform variable substitutions.
FOO=42
while read; do
eval echo "$REPLY"
done < something.txt
See help eval or the Bash manual for more information.
Another approach (which seems icky, but I am putting it here anyway):
Write the contents of something.txt to a temp file, with an echo statement wrapped around it:
something=$(cat something.txt)
echo "echo \"" > temp.out
echo "$something" >> temp.out
echo "\"" >> temp.out
then source it back in to a variable:
RESULT=$(source temp.out)
and the $RESULT will have it all expanded. But it seems so wrong!
Single-line solution that doesn't need a temporary file:
RESULT=$(source <(echo "echo \"$(cat something.txt)\""))
#or
RESULT=$(source <(echo "echo \"$(<something.txt)\""))
If you only want the variable references to be expanded (an objective that I had for myself), you could do the below.
contents="$(cat something.txt)"
echo $(eval echo \"$contents\")
(The escaped quotes around $contents are key here.)
If something.txt has only one line, here is a bash method (a shorter version of Michael Neale's "icky" answer)
using process & command substitution:
FOO=42 . <(echo -e echo $(<something.txt))
Output:
hello 42 world
Note that export isn't needed.
If something.txt has one or more lines, a GNU sed evaluate method:
FOO=42 sed 's/"/\\\"/g;s/.*/echo "&"/e' something.txt
The following solution:
allows replacement of variables which are defined
leaves the placeholders of undefined variables unchanged; this is especially useful during automated deployments
supports replacement of variables in the following formats:
${var_NAME}
$var_NAME
reports which variables are not defined in the environment and returns an error code in such cases
TARGET_FILE=someFile.txt;
ERR_CNT=0;
for VARNAME in $(grep -P -o -e '\$[\{]?(\w+)*[\}]?' ${TARGET_FILE} | sort -u); do
VAR_VALUE=${!VARNAME};
VARNAME2=$(echo $VARNAME| sed -e 's|^\${||g' -e 's|}$||g' -e 's|^\$||g' );
VAR_VALUE2=${!VARNAME2};
if [ "xxx" = "xxx$VAR_VALUE2" ]; then
echo "$VARNAME is undefined ";
ERR_CNT=$((ERR_CNT+1));
else
echo "replacing $VARNAME with $VAR_VALUE2" ;
sed -i "s|$VARNAME|$VAR_VALUE2|g" ${TARGET_FILE};
fi
done
if [ ${ERR_CNT} -gt 0 ]; then
echo "Found $ERR_CNT undefined environment variables";
exit 1
fi
foo=45
file=something.txt # the file contains: Hello $foo world!
eval echo $(cat $file)
$ eval echo $(cat something.txt)
hello 42 world
$ bash --version
GNU bash, version 3.2.57(1)-release (x86_64-apple-darwin17)
Copyright (C) 2007 Free Software Foundation, Inc.
envsubst is a great solution (see LenW's answer) if the content you're substituting is of "reasonable" length.
In my case, I needed to substitute a file's contents in place of the variable name. envsubst requires that the content be exported as an environment variable, and bash has a problem exporting environment variables that are larger than a megabyte or so.
awk solution
Using cuonglm's solution from a different question:
needle="doc1_base64" # The "variable name" in the file. (A $ is not needed.)
needle_file="doc1_base64.txt" # Will be substituted for the needle
haystack=$requestfile1 # File containing the needle
out=$requestfile2
awk "BEGIN{getline l < \"${needle_file}\"}/${needle}/{gsub(\"${needle}\",l)}1" $haystack > $out
This solution works for even large files.
expenv () {
LF=$'\n'
echo "cat <<END_OF_TEXT${LF}$(< "$1")${LF}END_OF_TEXT" | bash
return $?
}
expenv "file name"
The following works: bash -c "echo \"$(cat something.txt)"\"

Bash shell `if` command returns something `then` do something

I am trying to do an if/then statement, where if there is non-empty output from an ls | grep something command then I want to execute some statements. I do not know the syntax I should be using. I have tried several variations of this:
if [[ `ls | grep log ` ]]; then echo "there are files of type log";
Well, that's close, but you need to finish the if with fi.
Also, if just runs a command and executes the conditional code if the command succeeds (exits with status code 0), which grep does only if it finds at least one match. So you don't need to check the output:
if ls | grep -q log; then echo "there are files of type log"; fi
If you're on a system with an older or non-GNU version of grep that doesn't support the -q ("quiet") option, you can achieve the same result by redirecting its output to /dev/null:
if ls | grep log >/dev/null; then echo "there are files of type log"; fi
But since ls also returns nonzero if it doesn't find a specified file, you can do the same thing without the grep at all, as in D.Shawley's answer:
if ls *log* >&/dev/null; then echo "there are files of type log"; fi
You also can do it using only the shell, without even ls, though it's a bit wordier:
for f in *log*; do
# even if there are no matching files, the body of this loop will run once
# with $f set to the literal string "*log*", so make sure there's really
# a file there:
if [ -e "$f" ]; then
echo "there are files of type log"
break
fi
done
As long as you're using bash specifically, you can set the nullglob option to simplify that somewhat:
shopt -s nullglob
for f in *log*; do
echo "There are files of type log"
break
done
Or without if; then; fi:
ls | grep -q log && echo 'there are files of type log'
Or even:
ls *log* &>/dev/null && echo 'there are files of type log'
The if built-in executes a shell command and selects the block based on the return value of the command. ls returns a distinct status code if it does not find the requested files, so there is no need for the grep part. The [[ construct is actually a bash keyword that evaluates conditional expressions; arithmetic evaluation is handled by the separate (( )) construct. I rarely stray far from Bourne shell syntax, so check the bash manual for the details.
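To illustrate the distinction (a small sketch, not part of the original answer):
[[ -n "$(ls | grep log)" ]] && echo "grep found something"   # [[ ]] evaluates conditional expressions
(( 3 * 7 == 21 )) && echo "arithmetic holds"                 # (( )) handles arithmetic evaluation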
Anyway, if you put all of this together, then you end up with the following command:
if ls *log* > /dev/null 2>&1
then
echo "there are files of type log"
fi

How to open an editor from a bash function?

I have a simple function to open an editor:
open_an_editor()
{
nano "$1"
}
If called like open_an_editor file.ext, it works. But if I need to get some output from the function — smth=$(open_an_editor file.ext) — I cannot see the editor; the script just gets stuck. What am I missing here?
Update: I am trying to write a function which would ask the user to write a value in editor, if it wasn't given in script arguments.
#!/bin/bash
open_an_editor()
{
if [ "$1" ]
then
echo "$1"
return 0
fi
tmpf=$(mktemp -t pref)
echo "default value, please edit" > "$tmpf"
# and here the editor should show up,
# allowing user to edit the value and save it
# this will stuck without showing the editor:
#nano "$tmpf"
# but this, with the help of Kimvais, works perfectly:
nano "$tmpf" 3>&1 1>&2 2>&3
cat "$tmpf"
rm "$tmpf"
}
something=$(open_an_editor "$1")
# and then I can do something useful with that value,
# for example count chars in it
echo -n "$something" | wc -c
So, if the script was called with an argument ./script.sh "A value", the function would just use that and immediately echo 7 bytes. But if called without arguments ./script.sh — nano should pop up.
If the input you need is the edited file, then you obviously need to cat filename after you do the open_an_editor filename
If you actually need the output of the editor, then you need to swap stdout and stderr, i.e.:
nano "$1" 3>&1 1>&2 2>&3
If you need 'friendly' user input, see this question on how to use whiptail.
If you need to get output from the function and store it in a variable, just have the function display what's in the file.
open_an_editor()
{
cat "$1"
}
smth=$(open_an_editor file.txt)
If all you want is for a user to enter a value then read is enough:
OLDIFS="$IFS"
IFS=$'\n'
read -p "Enter a value: " -e somevar
IFS="$OLDIFS"
echo "$somevar"

Bash script does not continue to read the next line of file

I have a shell script that saves the output of a command that is executed to a CSV file. It reads the commands it has to execute from a file in this format:
ffmpeg -i /home/test/videos/avi/418kb.avi /home/test/videos/done/418kb.flv
ffmpeg -i /home/test/videos/avi/1253kb.avi /home/test/videos/done/1253kb.flv
ffmpeg -i /home/test/videos/avi/2093kb.avi /home/test/videos/done/2093kb.flv
You can see each line is an ffmpeg command. However, the script just executes the first line. Just a minute ago it was doing nearly all of the commands. It was missing half for some reason. I edited the text file that contained the commands and now it will only do the first line. Here is my bash script:
#!/bin/bash
# Shell script utility to read a file line by line.
# Once line is read it will run processLine() function
#Function processLine
processLine(){
line="$#"
START=$(date +%s.%N)
eval $line > /dev/null 2>&1
END=$(date +%s.%N)
DIFF=$(echo "$END - $START" | bc)
echo "$line, $START, $END, $DIFF" >> file.csv 2>&1
echo "It took $DIFF seconds"
echo $line
}
# Store file name
FILE=""
# get file name as command line argument
# Else read it from standard input device
if [ "$1" == "" ]; then
FILE="/dev/stdin"
else
FILE="$1"
# make sure file exist and readable
if [ ! -f $FILE ]; then
echo "$FILE : does not exists"
exit 1
elif [ ! -r $FILE ]; then
echo "$FILE: can not read"
exit 2
fi
fi
# read $FILE using the file descriptors
# Set loop separator to end of line
BAKIFS=$IFS
IFS=$(echo -en "\n\b")
exec 3<&0
exec 0<$FILE
while read line
do
# use $line variable to process line in processLine() function
processLine $line
done
exec 0<&3
# restore $IFS which was used to determine what the field separators are
BAKIFS=$ORIGIFS
exit 0
Thank you for any help.
UPDATE 2
It's the ffmpeg commands rather than the shell script that aren't working. But I should have been using just "\b", as Paul pointed out. I am also making use of Johannes's shorter script.
I think that should do the same and seems to be correct:
#!/bin/bash
CSVFILE=/tmp/file.csv
cat "$#" | while read line; do
echo "Executing '$line'"
START=$(date +%s)
eval $line &> /dev/null
END=$(date +%s)
let DIFF=$END-$START
echo "$line, $START, $END, $DIFF" >> "$CSVFILE"
echo "It took ${DIFF}s"
done
no?
ffmpeg reads STDIN and exhausts it. The solution is to call ffmpeg with:
ffmpeg </dev/null ...
See the detailed explanation here: http://mywiki.wooledge.org/BashFAQ/089
Update:
Since ffmpeg version 1.0, there is also the -nostdin option, so this can be used instead:
ffmpeg -nostdin ...
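Applied to the loop in the question, the fix might look like this (a sketch; adding -nostdin to each ffmpeg line in the command file works just as well):
while read line
do
# /dev/null on stdin keeps ffmpeg from swallowing the rest of the command file
eval $line </dev/null > /dev/null 2>&1
done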
I just had the same problem.
I believe ffmpeg is responsible for this behaviour.
My solution for this problem:
1) Call ffmpeg with an "&" at the end of your ffmpeg command line.
2) Since the script will now not wait for the ffmpeg process to complete,
we have to prevent our script from starting several ffmpeg processes at once.
We achieve this by delaying the next loop pass while there is at least
one running ffmpeg process.
#!/bin/bash
cat FileList.txt |
while read VideoFile; do
<place your ffmpeg command line here> &
FFMPEGStillRunning="true"
while [ "$FFMPEGStillRunning" = "true" ]; do
Process=$(ps -C ffmpeg | grep -o -e "ffmpeg" )
if [ -n "$Process" ]; then
FFMPEGStillRunning="true"
else
FFMPEGStillRunning="false"
fi
sleep 2s
done
done
I would add an echo before the eval to see what it's about to run (in case it's treating the whole file as one big long line) and another after it (in case one of the ffmpeg commands is taking forever).
Unless you are planning to read something from standard input after the loop, you don't need to preserve and restore the original standard input (though it is good to see you know how).
Similarly, I don't see a reason for dinking with IFS at all. There is certainly no need to restore the value of IFS before exit - this is a real shell you are using, not a DOS BAT file.
When you do:
read var1 var2 var3
the shell assigns the first field to $var1, the second to $var2, and the rest of the line to $var3. In the case where there's just one variable - your script, for example - the whole line goes into the variable, just as you want it to.
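For example (a hypothetical one-liner showing that behaviour):
echo "alpha beta gamma delta" | { read var1 var2 var3; echo "$var3"; }
This prints: gamma delta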
Inside the processLine function, you probably don't want to throw away error output from the executed command. You probably do want to think about checking the exit status of the command. The echo with error redirection is ... unusual, and overkill. If you're sufficiently sure that the commands can't fail, then go ahead with ignoring the error. Is the command 'chatty'? If so, throw away the chat by all means. If not, maybe you don't need to throw away standard output, either.
The script as a whole should probably diagnose when it is given multiple files to process since it ignores the extraneous ones.
You could simplify your file handling by using just:
cat "$#" |
while read line
do
processline "$line"
done
The cat command automatically reports errors (and continues after them) and processes all the input files, or reads standard input if there are no arguments left. The use of double quotes around the variable means that it is passed as a single unit (and therefore unparsed into separate words).
The use of date and bc is interesting - I'd not seen that before.
All in all, I'd be looking at something like:
#!/bin/bash
# Time execution of commands read from a file, line by line.
# Log commands and times to CSV logfile "file.csv"
processLine(){
START=$(date +%s.%N)
eval "$#" > /dev/null
STATUS=$?
END=$(date +%s.%N)
DIFF=$(echo "$END - $START" | bc)
echo "$line, $START, $END, $DIFF, $STATUS" >> file.csv
echo "${DIFF}s: $STATUS: $line"
}
cat "$#" |
while read line
do
processLine "$line"
done
