Format multiline command output in bash using printf - linux

My Bash script uses printf to print output of some other commands with formatting applied, in the following manner (note the two leading spaces):
printf " %-16s %s\n" "foo:" "$(bar)"
The two leading spaces are there, because the first line in the file is a comment, and I like to keep things nicely aligned:
# foo
  foo:             bar
  foo:             bar
  ...
This works perfectly fine for commands with one-line output. However, when the output is multiline, the subsequent lines lose the formatting.
For example:
printf " %-16s %s\n" "Contents:" "$(ls -a)"
Results in something like this:
  Contents:        .
..
foo
bar
Instead, what I am trying to achieve is multiline output, with each line formatted (in columns, if you like), which would look like this (note the lack of "header" in the subsequent lines):
  Contents:        .
                   ..
                   foo
                   bar
My understanding so far is that printf with two arguments (the first one being a string, and the other one the result of the given command) treats the second argument as a single string, which can include newline characters within. Therefore, the output is actually correctly formatted, according to what I asked printf to do, but it is not what I am looking for.
I am aware of some of the pitfalls related to parsing the output of commands like ls, which exhausts my current ideas for solving this problem. Also, it is possible that printf is not the best facility to do this.
I am considering converting the multiline string into a comma-separated list if no solution to this problem is possible, but that would be a last resort.
What would be the best method to achieve the formatting I need?
Thanks for your help.

printf " %-16s %s\n" "foo:" "$(bar | sed '2,$s/^/ /g')"

You may find it difficult to apply the same print instructions to a range of commands. Also output from ls can be tedious to examine because it selects an output format depending upon environment variables.
I found the following worked well for ls on my Mac:
x=$(ls -a);
x=${x//$'\n'/$'\n' };
printf " %-16s %s" "Contents:" "$x" $'\n'

Related

formatting issue in printf script

I have a file stvv.txt containing some names.
For example, stvv.txt is as follows:
hello
world
I want to generate another file by using these names and adding some extra text to them. I have written a script as follows:
for i in `cat stvv.txt`;
do printf 'if(!strcmp("$i",optarg))' > my_file;
done
output
if(!strcmp("$i",optarg))
desired output
if(!strcmp("hello",optarg))
if(!strcmp("world",optarg))
how can I get the correct result?
This is a working solution.
1. All symbols inside single quotes are treated as a literal string.
2. When using printf, do not surround the variable with quotes (in this example).
The code below should fix it,
for i in `cat stvv.txt`;
do printf 'if(!strcmp('$i',optarg))' > my_file;
done
basically, break the printf statement into three parts.
1: the string 'if(!strcmp('
2: $i (no quotes)
3: the string ',optarg))'
hope that helps!
To insert a string into a printf format, use %s in the format string:
$ for line in $(cat stvv.txt); do printf 'if(!strcmp("%s",optarg))\n' "$line"; done
if(!strcmp("hello",optarg))
if(!strcmp("world",optarg))
The code $(cat stvv.txt) will perform word splitting and pathname expansion on the contents of stvv.txt. You probably don't want that. It is generally safer to use a while read ... done <stvv.txt loop such as this one:
$ while read -r line; do printf 'if(!strcmp("%s",optarg))\n' "$line"; done <stvv.txt
if(!strcmp("hello",optarg))
if(!strcmp("world",optarg))
Aside on cat
If you are using bash, then $(cat stvv.txt) could be replaced with the more efficient $(<stvv.txt). This question, however, is tagged shell not bash. The cat form is POSIX and therefore portable to all POSIX shells while the bash form is not.
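For illustration only (an editorial sketch), the bash-only form drops into the same loop; it behaves like the $(cat ...) version, including the word-splitting caveat, so the while read loop above remains the safer choice:
# bash-only: $(<stvv.txt) expands to the file contents without running cat
for line in $(<stvv.txt); do printf 'if(!strcmp("%s",optarg))\n' "$line"; done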

While using printf how to escape special characters in shell script?

I am trying to format a string with printf in shell. I will get the input string from a file, and it can have special characters like %, ', "", \user, \tan, etc.
How do I escape the special characters that are in the input string?
Eg
#!/bin/bash
#
string='';
function GET_LINES() {
    string+="The path to K:\Users\ca, this is good";
    string+="\n";
    string+="The second line";
    string+="\t";
    string+="123"
    string+="\n";
    string+="It also has to be 100% nice than %99";
    printf "$string";
}
GET_LINES;
I am expecting this to print in the format I want, like:
The path to K:\Users\ca, this is good
The second line 123
It also has to be 100% nice than %99
But it's giving unexpected output:
./script: line 14: printf: missing unicode digit for \U
The path to K:\Users\ca, this is good
The second line 123
./script: line 14: printf: `%99': missing format character
It also has to be 100ice than
So how can I get rid of the special characters while printing? echo -e also has the issue.
Try
printf "%s\n" "$string"
See printf(1)
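The key point (an editorial note): when the variable itself is used as the format string, printf tries to interpret every % and backslash sequence inside it; with a fixed '%s\n' format the variable is passed through as plain data. A minimal illustration:
s='It also has to be 100% nice than %99'
printf '%s\n' "$s"    # prints the string unchanged
# printf "$s" would instead try to interpret %99 as a format specification, as in the error above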
For the benefit of people who got here by clicking on the first search result after Googling "bash printf escaped", the %q formatter is used by printf to produce bash-escaped text.
For example:
$ printf "<<%q>>\n" 'foe fum' "fee fie"$'\n'
<<foe\ fum>>
<<$'fee fie\n'>>
From man printf:
%q ARGUMENT is printed in a format that can be reused as shell
input, escaping non-printable characters with the
proposed POSIX $'' syntax.
May I remark that "man printf" clearly shows that a "%" character has to be escaped by means of another "%",
so printf "%%" results in a single "%".
You can use $' ' to enclose the newlines and tab characters, then a plain echo will suffice:
#!/bin/bash
get_lines() {
    local string
    string+='The path to K:\Users\ca, this is good'
    string+=$'\n'
    string+='The second line'
    string+=$'\t'
    string+='123'
    string+=$'\n'
    string+='It also has to be 100% nice than %99'
    echo "$string"
}
get_lines
I have also made a couple of other minor changes to your script. As well as making your FUNCTION_NAME lowercase, I have also used the more widely compatible function syntax. In this case, there's not a great deal of advantage (as $' ' strings are a bash extension anyway) but there's no reason to use the function func() syntax as far as I'm aware. Also, the scope of string may as well be local to the function in which it is used, so I changed that too.
Output:
The path to K:\Users\ca, this is good
The second line 123
It also has to be 100% nice than %99

Protecting arguments containing spaces from eval

In order to get eval to work on commands that contain spaces inside one of the parameters, I have only found this to work so far:
eval 'sed 's/foo/foo'" "'bar/g' filename'
In a hypothetical program where users would enter a command and then the command and arguments to be fed to eval, this isn't a very elegant or robust solution. Are there any other ways to run the eval command so that the interface for my_command can be a little more user friendly? The following is an example of how the program accepts arguments now.
my_command 'sed 's/foo/foo'" "'bar/g' filename'
I would like the interface to work something like this:
my_command sed 's/foo/foo bar/g' filename
edit:
I'll try asking a different question:
How do I get bash to read input from the command line literally? I want the exact input to be preserved, so if there are quotes I want to keep them. I can accomplish what I want to do by using egrep to read from file and then sanitizing the input, like so:
egrep '/.*/' filename |
sed 's/\(.*\)['"'"']\(.*\) \(.*\)['"'"']\(.*\)/\1'"\'"'\2" "\3'"\'"'\4/g'
with "filename" containing this line
sed 's/foo/foo bar/g' file
this gives me the desired output of:
sed 's/foo/foo" "bar/g' file
The problem here is that I can't echo "$@" because bash interprets the quotes. I want the literal input without having to read from a file.
Original question
For your preferred use-case, you'd simply write (inside my_command):
"$#"
to execute the command as given.
Your eval line is odd:
eval 'sed 's/foo/foo'" "'bar/g' filename'
Because of the way single quotes don't nest, it is equivalent to:
eval 'sed s/foo/foo" "bar/g filename'
Revised question
Possible solution:
egrep '/.*/' filename | sh
This feeds what is in filename directly to the shell for interpretation. Given file containing:
Some text containing foo; and bar.
More foo bar?
More text; more foo and bar; more foo bar beyond the possibility of unfooing.
The output is:
Some text containing foo bar; and bar.
More foo bar bar?
More text; more foo bar and bar; more foo bar bar beyond the possibility of unfoo baring.
Fixing quotes is hard!
Note that your complex sed script is not complex enough. Given filename containing:
sed 's/foo/foo bar/g' file
sed 's/foo bar/foo bar baz/g' file
the output from:
egrep '/.*/' filename |
sed 's/\(.*\)['"'"']\(.*\) \(.*\)['"'"']\(.*\)/\1'"\'"'\2" "\3'"\'"'\4/g'
is:
sed 's/foo/foo" "bar/g' file
sed 's/foo bar/foo bar" "baz/g' file
which has not solved all the problems for the eval.
I've spent a lot of time, on and off, working on such issues over quite a long period of time (a quarter century is no exaggeration), and it isn't trivial. You can find one discussion in extenso in How to iterate over arguments in bash script. Somewhere, I have another answer which goes through gyrations about this stuff, but I can't immediately find it (where 'immediately' means an hour or so of distracted searching, where the distractions were sets of duplicate questions, etc). It may have been deleted, or I may have looked in the wrong place.
Your design is flawed. Create a user interface that doesn't let users input commands directly. Give options, or let them enter only the parameters.
At the back end, you do your sanitization check on the parameters before calling sed or whatever other tools are desired. You don't have to use eval.
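A minimal sketch of that idea, with hypothetical names not taken from the original answer: accept only the pattern, the replacement, and the filename as separate parameters and build the sed call yourself, so no eval is needed:
# my_replace PATTERN REPLACEMENT FILE - hypothetical wrapper around sed
my_replace() {
    local pat=$1 rep=$2 file=$3
    # very basic sanitization: refuse the sed delimiter inside the arguments
    case "$pat$rep" in
        */*) echo "slashes are not allowed" >&2; return 1;;
    esac
    sed "s/$pat/$rep/g" "$file"
}
my_replace 'foo' 'foo bar' file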
Array quoting
The following keeps spaces in arguments by quoting each element of array:
function token_quote {
    local quoted=()
    for token; do
        quoted+=( "$(printf '%q' "$token")" )
    done
    printf '%s\n' "${quoted[*]}"
}
Example usage:
$ token_quote token 'single token' token
token single\ token token
Above, note that the space in 'single token' is escaped as "\ ".
$ set $(token_quote token 'single token' token)
$ eval printf '%s\\n' "$@"
token
single token
token
$
This shows that the tokens are indeed kept separate.
Given some untrusted user input:
% input="Trying to hack you; date"
Construct a command to eval:
% cmd=(echo "User gave:" "$input")
Eval it, with seemingly correct quoting:
% eval "$(echo "${cmd[#]}")"
User gave: Trying to hack you
Thu Sep 27 20:41:31 +07 2018
Note you were hacked. date was executed rather than being printed literally.
Instead with token_quote():
% eval "$(token_quote "${cmd[#]}")"
User gave: Trying to hack you; date
%
eval isn't evil - it's just misunderstood :)
It can actually work as you desire. Use "$@" - this will pass all the arguments exactly as they were given on the command line.
If my_command.sh contains:
sed "$#"
Then my_command.sh 's/foo/foo bar/g' filename will do exactly what you expect.
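To get exactly the interface the question asks for (my_command sed 's/foo/foo bar/g' filename), the wrapper does not even need to hard-code sed; a sketch along the same lines:
#!/bin/bash
# my_command: run whatever command and arguments were passed, quoting preserved, no eval
exec "$@"
Invoked as ./my_command sed 's/foo/foo bar/g' filename, it runs sed with the space-containing argument intact.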

Adding newline characters to unix shell variables

I have a variable in a shell script in which I'd like to format the data. The variable stores new data during every iteration of a loop. Each time the new data is stored, I'd like to insert a new line character. Here is how I'm trying to store the data into the variable.
VARIABLE="$VARIABLE '\n' SomeData"
Unfortunately, the output includes the literal '\n'. Any help would be appreciated.
Try $'\n':
VAR=a
VAR="$VAR"$'\n'b
echo "$VAR"
gives me
a
b
A common technique is:
nl='
'
VARIABLE="PreviousData"
VARIABLE="$VARIABLE${nl}SomeData"
echo "$VARIABLE"
PreviousData
SomeData
Also common, to prevent inadvertently having your string start with a newline:
VARIABLE="$VARIABLE${VARIABLE:+$nl}SomeData"
(The expression ${VARIABLE:+$nl} will expand to a newline if and only if VARIABLE is set and non-empty.)
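A quick check of how that guard behaves (an editorial sketch): the first append adds no leading newline because VARIABLE starts out empty, while the second one does:
nl='
'
VARIABLE=""
VARIABLE="$VARIABLE${VARIABLE:+$nl}first"    # VARIABLE is empty, so no newline is inserted
VARIABLE="$VARIABLE${VARIABLE:+$nl}second"   # VARIABLE is non-empty now, so a newline separates the lines
echo "$VARIABLE"
# first
# second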
VAR="one"
VAR="$VAR.\n.two"
echo -e $VAR
Output:
one.
.two
Other than $'\n' you can use printf also like this:
VARIABLE="Foo Bar"
VARIABLE=$(printf "${VARIABLE}\nSomeData")
echo "$VARIABLE"
OUTPUT:
Foo Bar
SomeData
I had a problem with all the other solutions: when using a # followed by SPACE (quite common when writing in Markdown) both would get split onto a new line.
So, another way of doing it involves using $'...' quoting so that the \n gets rendered.
FOO=$'# Markdown Title #\n'
BAR=$'Be *brave* and **bold**.'
FOOBAR="$FOO$BAR"
echo "$FOOBAR"
Output:
# Markdown Title #
Be *brave* and **bold**.
Single quote: "All special characters between these quotes lose their special meaning." (https://www.tutorialspoint.com/unix/unix-quoting-mechanisms.htm)
So the syntax you use does something different from what you want to achieve.
This is what you need:
The $'\X' construct makes the -e option in echo unnecessary.
https://linux.die.net/abs-guide/escapingsection.html
echo -e "something\nsomething"
or
echo "something"$'\n'"something"
It's a lot simpler than you think:
VARIABLE="$VARIABLE
SomeData"
Building upon the first two solutions, I'd do it as shown below. Concatenating strings with the '+=' operator somehow looks clearer to me.
Also, remember to use printf as opposed to echo; you will save yourself so much trouble.
sometext="This is the first line"
sometext+=$'\n\n'
sometext+="This is the second line AFTER the inserted new lines"
printf '%s' "${sometext}"
Outputs:
This is the first line

This is the second line AFTER the inserted new lines
Your problem is in the echo command: in ash you have to use the -e option to expand special characters. This should work for you:
VAR="First line"
VAR="$VAR\nSecond line"
echo -e $VAR
This outputs
First line
Second line

How to pass the value of a variable to the standard input of a command?

I'm writing a shell script that should be somewhat secure, i.e., does not pass secure data through parameters of commands and preferably does not use temporary files. How can I pass a variable to the standard input of a command?
Or, if it's not possible, how can I correctly use temporary files for such a task?
Passing a value to standard input in Bash is as simple as:
your-command <<< "$your_variable"
Always make sure you put quotes around variable expressions!
Be aware that this will probably work only in bash and not in sh.
Simple, but error-prone: using echo
Something as simple as this will do the trick:
echo "$blah" | my_cmd
Do note that this may not work correctly if $blah contains -n, -e, -E, etc., or if it contains backslashes (bash's copy of echo preserves literal backslashes in the absence of -e by default, but will treat them as escape sequences and replace them with the corresponding characters even without -e if the optional XSI extensions are enabled).
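A short illustration of that echo pitfall (an editorial sketch), contrasted with the printf form described below:
blah='-n'
echo "$blah" | wc -c            # 0 in bash: echo treats -n as an option and prints nothing
printf '%s\n' "$blah" | wc -c   # 3: the two literal characters plus a newline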
More sophisticated approach: using printf
printf '%s\n' "$blah" | my_cmd
This does not have the disadvantages listed above: all possible C strings (strings not containing NULs) are printed unchanged.
(cat <<END
$passwd
END
) | command
The cat is not really needed, but it helps to structure the code better and allows you to use more commands in parentheses as input to your command.
Note that the echo "$var" | command operations mean that standard input is limited to the line(s) echoed. If you also want the terminal to be connected, then you'll need to be fancier:
{ echo "$var"; cat - ; } | command
( echo "$var"; cat - ) | command
This means that the first line(s) will be the contents of $var but the rest will come from cat reading its standard input. If the command does not do anything too fancy (try to turn on command line editing, or run like vim does) then it will be fine. Otherwise, you need to get really fancy - I think expect or one of its derivatives is likely to be appropriate.
The command line notations are practically identical - but the second semi-colon is necessary with the braces whereas it is not with parentheses.
This robust and portable way has already appeared in comments. It should be a standalone answer.
printf '%s' "$var" | my_cmd
or
printf '%s\n' "$var" | my_cmd
Notes:
It's better than echo, reasons are here: Why is printf better than echo?
printf "$var" is wrong. The first argument is format where various sequences like %s or \n are interpreted. To pass the variable right, it must not be interpreted as format.
Usually variables don't contain trailing newlines. The former command (with %s) passes the variable as it is. However tools that work with text may ignore or complain about an incomplete line (see Why should text files end with a newline?). So you may want the latter command (with %s\n) which appends a newline character to the content of the variable. Non-obvious facts:
Here string in Bash (<<<"$var" my_cmd) does append a newline.
Any method that appends a newline results in non-empty stdin of my_cmd, even if the variable is empty or undefined.
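A small check of those two facts (an editorial sketch), counting the bytes that actually reach the command:
var=hello
printf '%s' "$var" | wc -c     # 5: no trailing newline is added
printf '%s\n' "$var" | wc -c   # 6: a newline is appended
wc -c <<< "$var"               # 6: the here string appends a newline too
empty=
wc -c <<< "$empty"             # 1: even an empty variable yields a newline on stdin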
I liked Martin's answer, but it has some problems depending on what is in the variable. This
your-command <<< """$your_variable"""
is better if your variable contains " or !.
As per Martin's answer, there is a Bash feature called Here Strings (which itself is a variant of the more widely supported Here Documents feature):
3.6.7 Here Strings
A variant of here documents, the format is:
<<< word
The word is expanded and supplied to the command on its standard
input.
Note that Here Strings would appear to be Bash-only, so, for improved portability, you'd probably be better off with the original Here Documents feature, as per PoltoS's answer:
( cat <<EOF
$variable
EOF
) | cmd
Or, a simpler variant of the above:
(cmd <<EOF
$variable
EOF
)
You can omit ( and ), unless you want to have this redirected further into other commands.
Try this:
echo "$variable" | command
If you came here from a duplicate, you are probably a beginner who tried to do something like
"$variable" >file
or
"$variable" | wc -l
where you obviously meant something like
echo "$variable" >file
echo "$variable" | wc -l
(Real beginners also forget the quotes; usually use quotes unless you have a specific reason to omit them, at least until you understand quoting.)
