Evaluation of curly braces in Linux

I’ve noticed that we can use curly braces to make some commands much shorter, since the braces are expanded into a list of arguments.
Input:
echo a{,b,c}
Output:
a ab ac
How do I force the same behaviour when the arguments are passed from a file?
Input:
cat file.txt | xargs echo
Output:
a{,b,c}
Expected output - same as in the previous example.

That {} expansion is a bash/zsh feature, so you need to explicitly run the command through one of those shells. In your case that would be (using -I<STRING> to let xargs substitute each input line into the string before running it):
cat file.txt | xargs -I# bash -c 'echo #'
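Assuming file.txt holds the single line shown in the question, that pipeline should produce the brace-expanded result:
$ cat file.txt
a{,b,c}
$ cat file.txt | xargs -I# bash -c 'echo #'
a ab ac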

xargs runs the echo found in $PATH, not the shell's builtin echo.
Check the list of bash expansions: brace expansion happens first, before any other expansion, so text that arrives from a file or variable never gets a chance to be brace-expanded in that pipeline.
You'll have to do something like
while read -r line; do eval echo "$line"; done < file.txt
which exposes you to all kinds of nasty attacks if someone puts something malicious in that file.
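For the file.txt from the question, that loop does produce the expected expansion (shown only as a quick check, the eval caveat above still applies):
$ while read -r line; do eval echo "$line"; done < file.txt
a ab ac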

Other than asking why you would want to do this... I offer the following:
add the string to a file:
echo 'a{,b,c}' > /tmp/foo
put the string in a variable:
export thing=`cat /tmp/foo`
eval the string:
eval $thing
If you had a bunch of these in a file, then run the file through a loop and eval each value:
echo 'a{,b,c}' >> /tmp/foo
echo 'a{,b,c}' >> /tmp/foo
echo 'a{,b,c}' >> /tmp/foo
for i in `cat /tmp/foo`; do eval echo $i; done
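For what it's worth, with those three copies of the string in /tmp/foo the loop should print the expanded list three times:
a ab ac
a ab ac
a ab ac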

Related

Use bash to write a bash script? [duplicate]

echo "#!/bin/bash\nls -l /home/" > /home/myscript.sh
bash: !/bin/bash\nls: event not found
My script should be:
#!/bin/bash
ls -l /home/
Why does it ignore the quoted string and think that there is some sort of "event"? Why does it not recognize #!/bin/bash as a special word?
The same thing happens when I run
echo "#!/bin/bash" > /home/myscript.sh
so it's not the newline!
echo -e "#\!/bin/bash" > /home/myscript.sh
writes the file content as:
#\!/bin/bash
Why is this simple action going miserably wrong?
From the bash manpage:
Enclosing characters in double quotes preserves the literal value of all characters within the quotes, with the exception of $, `, \, and, when history expansion is enabled, !.
So either use single quotes, or disable history expansion with set +H (or set +o histexpand).
But don't use echo. Instead, do :
printf '%s\n' '#!/bin/bash' 'ls -l /home/' > /home/myscript
or
cat > /home/myscript << 'EOF'
#!/bin/bash
ls -l /home/
EOF
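As a quick sanity check, with either of the two commands above the resulting file should contain exactly the intended script:
$ cat /home/myscript
#!/bin/bash
ls -l /home/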
echo -e '#!/bin/bash\nls -l /home/' > /home/myscript.sh
A combination of -e and single quotes fixed it.

While loop with sed

I have the following code, but it doesn't work: when I execute it, the file th2.csv is empty.
The sed command is supposed to replace two words. I don't know how to make the script work correctly.
It must be done with a while loop.
bash th1.csv > th2.csv
The bash script:
#!/bin/bash
while read -r line; do
echo "$line" | sed -E "s/,True,/,ll,/g;s/,False,/,th,/" th1.csv
done < th1.csv
Given the requirements that you must loop and apply regex, line by line, then consider:
#!/bin/bash
while read -r line; do
echo "$line" | sed -E "s/,True,/,ll,/g;s/,False,/,th,/" >> th2.csv
done < th1.csv
This reads the file line by line via a while loop. Each line is passed on stdin to sed. Note that we remove the th1.csv at the end of your original sed command: that filename overrides reading from stdin, so sed was ignoring the piped line and re-processing the whole file on every iteration. Lastly, we append (>>) to your th2.csv file on each iteration.
Guessing a step ahead, that you may want to pass the two files in as parameters to the script (just based on your first code snippet) then you can change this to:
#!/bin/bash
while read -r line; do
echo "$line" | sed -E "s/,True,/,ll,/g;s/,False,/,th,/" >> "$2"
done < "$1"
And, assuming this script is called myscript.sh you can call it like:
/bin/bash myscript.sh 'th1.csv' 'th2.csv'
Or, if you make it executable with chmod +x myscript.sh then:
./myscript.sh 'th1.csv' 'th2.csv'.
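For illustration only, with a made-up th1.csv (hypothetical sample data, not from the question), a run would look like this:
$ cat th1.csv
1,True,x
2,False,y
$ ./myscript.sh th1.csv th2.csv
$ cat th2.csv
1,ll,x
2,th,y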

Bash script in bash variable

I have a bash script whose content arrives like this:
SCRIPT=$(curl .... | parsing...)
echo $SCRIPT > myfile
But when I try to echo it into a file, some parts get evaluated: variables are substituted if any are defined, the * character is replaced by the list of files in the working directory, and so on.
Can I prevent bash from evaluating any of the variable's content, while still echoing it?
Yes, use double quotes for that. I'll demonstrate:
$ x='*'
$ echo $x
..list of files..
$ echo '$x'
$x
$ echo "$x"
*
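Applied to the question's variable, a minimal sketch; quoting the expansion is what stops bash from touching the content:
echo "$SCRIPT" > myfile            # double quotes prevent globbing and word splitting
printf '%s\n' "$SCRIPT" > myfile   # safer still if the content could start with -e/-n/-E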

How do I echo "-e"?

I want to echo a string that might contain the same parameters as echo. How can I do it without modifying the string?
For instance:
$ var="-e something"
$ echo $var
something
... didn't print -e
A surprisingly deep question. Since you tagged bash, I'll assume you mean bash's internal echo command, though the GNU coreutils' standalone echo command probably works similarly enough.
The gist of it is: if you really need to use echo (which would be surprising, but that's the way the question is currently written), it all depends on what exactly your string can contain.
The easy case: -e plus non-empty string
In that case, all you need to do is quote the variable before passing it to echo.
$ var="-e something"
$ echo "$var"
-e something
If the string isn't exactly an echo option (or combination of options), for example because it carries a non-option suffix, echo won't recognize it as an option and will print it out.
Harder: string can be -e only
If your case can reduce to just "-e", it gets trickier. One way to do it would be:
$ echo -e '\055e'
-e
(escaping the dash so it doesn't get interpreted as an option but as an octal escape sequence)
That's rewriting the string. It can be done automatically and non-destructively, so it feels acceptable:
$ var="-e something"
$ echo -e ${var/#-/\\055}
-e something
You'll notice I'm actually using the -e option to interpret the octal sequence, so this won't work if the string you intended to echo is -E. It will work for other options, though.
The right way
Seriously, you're not restricted to echo, are you?
printf '%s\n' "$var"
The proper bash way is to use printf:
printf "%s\n" "$var"
By the way, your echo didn't work because when you run:
var="-e something"
echo $var
(without quoting $var), echo sees two arguments: -e and something. When echo's first argument is -e, it treats it as an option (the same goes for -n and -E) and processes it as such. If you had quoted $var, as shown in other answers, it would have worked.
Quote it:
$ var="-e something"
$ echo "$var"
-e something
If what you want is to get echo -e's behaviour (enable interpretation of backslash escapes), then you have to leave the $var reference without quotes:
$ var="-e hi\nho"
$ echo $var
hi
ho
Or use eval:
$ var="hi\nho"
$ eval echo \${var}
hi\nho
$ var="-e hi\nho"
$ eval echo \${var}
hi
ho
Since we're using bash, another alternative to echo is to simply cat a "here string":
$ var="-e something"
$ cat <<< "$var"
-e something
$ var="-e"
$ cat <<< "$var"
-e
$
printf-based solutions will almost certainly be more portable though.
Try the following:
$ env POSIXLY_CORRECT=1 echo -e
-e
Due to shell aliases and built-in echo command, using an unadorned
echo interactively or in a script may get you different functionality
than that described here. Invoke it via env (i.e., env echo ...)
to avoid interference from the shell.
The environment variable POSIXLY_CORRECT was introduced to allow the user to force the standards-compliant behaviour. See: POSIX at Wikipedia.
Or use printf:
$ printf '%s\n' "$var"
Source: Why is bash swallowing -e in the front of an array at stackoverflow SE
Use printf instead:
var="-e bla"
printf "%s\n" "$var"
Using just echo "$var" will still fail if var contains just a -e or similar. If you need to be able to print that as well, use printf.
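A quick demonstration of that edge case with bash's builtin echo:
$ var="-e"
$ echo "$var"            # prints only a blank line: the lone -e is taken as an option
$ printf '%s\n' "$var"
-e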

Forcing bash to expand variables in a string loaded from a file

I am trying to work out how to make bash (force?) expand variables in a string (which was loaded from a file).
I have a file called "something.txt" with the contents:
hello $FOO world
I then run
export FOO=42
echo $(cat something.txt)
this returns:
hello $FOO world
It didn't expand $FOO even though the variable was set. I can't eval or source the file - as it will try and execute it (it isn't executable as it is - I just want the string with the variables interpolated).
Any ideas?
I stumbled on what I think is THE answer to this question: the envsubst command:
echo "hello \$FOO world" > source.txt
export FOO=42
envsubst < source.txt
This outputs: hello 42 world
If you would like to continue working with the data in a file destination.txt, push the output back to a file like this:
envsubst < source.txt > destination.txt
In case it's not already available in your distro, it's in the
GNU package gettext.
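If only certain variables should be expanded, envsubst also accepts a shell-format string naming them; everything else is left untouched. A small sketch:
$ export FOO=42
$ echo 'hello $FOO world, leave $HOME alone' | envsubst '$FOO'
hello 42 world, leave $HOME alone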
@Rockallite
I wrote a little wrapper script to take care of the '$' problem.
(BTW, there is a "feature" of envsubst, explained at
https://unix.stackexchange.com/a/294400/7088
for expanding only some of the variables in the input, but I
agree that escaping the exceptions is much more convenient.)
Here's my script:
#! /bin/bash
## -*-Shell-Script-*-
CmdName=${0##*/}
Usage="usage: $CmdName runs envsubst, but allows '\$' to keep variables from
being expanded.
With option -sl '\$' keeps the back-slash.
Default is to replace '\$' with '$'
"
if [[ $1 = -h ]] ;then echo -e >&2 "$Usage" ; exit 1 ;fi
if [[ $1 = -sl ]] ;then sl='\' ; shift ;fi
sed 's/\\\$/\${EnVsUbDolR}/g' | EnVsUbDolR=$sl\$ envsubst "$@"
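For example, if the wrapper is saved as envsubst-esc (a hypothetical name) and made executable, an escaped \$ survives while ordinary references are expanded:
$ export FOO=42
$ echo 'hello $FOO world, keep \$FOO literal' | ./envsubst-esc
hello 42 world, keep $FOO literal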
Many of the answers using eval and echo kind of work, but break on various things, such as multiple lines, attempts to escape shell meta-characters, escapes inside the template that are not intended to be expanded by bash, etc.
I had the same issue, and wrote this shell function, which as far as I can tell, handles everything correctly. This will still strip only trailing newlines from the template, because of bash's command substitution rules, but I've never found that to be an issue as long as everything else remains intact.
apply_shell_expansion() {
    declare file="$1"
    declare data=$(< "$file")    # slurp the whole template
    declare delimiter="__apply_shell_expansion_delimiter__"
    # build a here-document; the unquoted delimiter means $vars and $(...) are
    # expanded, while quote characters and globs in the template stay literal
    declare command="cat <<$delimiter"$'\n'"$data"$'\n'"$delimiter"
    eval "$command"
}
For example, you can use it like this with a parameters.cfg which is really a shell script that just sets variables, and a template.txt which is a template that uses those variables:
. parameters.cfg
printf "%s\n" "$(apply_shell_expansion template.txt)" > result.txt
In practice, I use this as a sort of lightweight template system.
You can try:
echo $(eval echo $(cat something.txt))
You don't want to print each line, you want to evaluate it so that Bash can perform variable substitutions.
FOO=42
while read; do
eval echo "$REPLY"
done < something.txt
See help eval or the Bash manual for more information.
Another approach (which seems icky, but I am putting it here anyway):
Write the contents of something.txt to a temp file, with an echo statement wrapped around it:
something=$(cat something.txt)
echo "echo \"" > temp.out
echo "$something" >> temp.out
echo "\"" >> temp.out
then source it back in to a variable:
RESULT=$(source temp.out)
and the $RESULT will have it all expanded. But it seems so wrong !
Single line solution that doesn't need temporary file :
RESULT=$(source <(echo "echo \"$(cat something.txt)\""))
#or
RESULT=$(source <(echo "echo \"$(<something.txt)\""))
If you only want the variable references to be expanded (an objective that I had for myself) you could do the below.
contents="$(cat something.txt)"
echo $(eval echo \"$contents\")
(The escaped quotes around $contents are key here.)
If something.txt has only one line, here is a bash method (a shorter version of Michael Neale's "icky" answer) using process and command substitution:
FOO=42 . <(echo -e echo $(<something.txt))
Output:
hello 42 world
Note that export isn't needed.
If something.txt has one or more lines, a GNU sed evaluate method:
FOO=42 sed 's/"/\\\"/g;s/.*/echo "&"/e' something.txt
The following solution:
allows replacing variables which are defined
leaves placeholders for undefined variables unchanged (this is especially useful during automated deployments)
supports replacement of variables in the following formats:
${var_NAME}
$var_NAME
reports which variables are not defined in the environment and returns an error code in that case
TARGET_FILE=someFile.txt;
ERR_CNT=0;
for VARNAME in $(grep -P -o -e '\$[\{]?(\w+)*[\}]?' ${TARGET_FILE} | sort -u); do
VAR_VALUE=${!VARNAME};
VARNAME2=$(echo $VARNAME| sed -e 's|^\${||g' -e 's|}$||g' -e 's|^\$||g' );
VAR_VALUE2=${!VARNAME2};
if [ "xxx" = "xxx$VAR_VALUE2" ]; then
echo "$VARNAME is undefined ";
ERR_CNT=$((ERR_CNT+1));
else
echo "replacing $VARNAME with $VAR_VALUE2" ;
sed -i "s|$VARNAME|$VAR_VALUE2|g" ${TARGET_FILE};
fi
done
if [ ${ERR_CNT} -gt 0 ]; then
echo "Found $ERR_CNT undefined environment variables";
exit 1
fi
foo=45
file=something.txt # the file contains: Hello $foo world!
eval echo $(cat $file)
$ eval echo $(cat something.txt)
hello 42 world
$ bash --version
GNU bash, version 3.2.57(1)-release (x86_64-apple-darwin17)
Copyright (C) 2007 Free Software Foundation, Inc.
envsubst is a great solution (see LenW's answer) if the content you're substituting is of "reasonable" length.
In my case, I needed to substitute an entire file's contents for the variable name. envsubst requires that the content be exported as an environment variable, and bash has problems exporting environment variables that are larger than a megabyte or so.
awk solution
Using cuonglm's solution from a different question:
needle="doc1_base64" # The "variable name" in the file. (A $ is not needed.)
needle_file="doc1_base64.txt" # Will be substituted for the needle
haystack=$requestfile1 # File containing the needle
out=$requestfile2
awk "BEGIN{getline l < \"${needle_file}\"}/${needle}/{gsub(\"${needle}\",l)}1" $haystack > $out
This solution works for even large files.
expenv () {
    LF=$'\n'
    # feed a generated here-document to a child bash, which expands $variables in the file's contents
    echo "cat <<END_OF_TEXT${LF}$(< "$1")${LF}END_OF_TEXT" | bash
    return $?
}
expenv "file name"
The following works: bash -c "echo \"$(cat something.txt)\""
