Bash: Variable substitution in quoted string looks like something from the Twilight Zone - linux

I've got a bash script that's reading essentially the output of telnet (actually socat to a unix domain socket).
while read -r LINE; do
if [ "$USER_DATA_FLAG" == "true" ]; then #Evaluates to false for the moment
...
else
printf "%s\n" "Variable> $LINE" >>$DEBUG_LOG
printf "%s\n" "Line xxxxz $LINE yxxxxx" >>$DEBUG_LOG
...
fi
done
This yields the following incomprehensible output:
Variable> >INFO:OpenVPN Management Interface Version 1 -- type 'help' for more info
yxxxxxxxxz >INFO:OpenVPN Management Interface Version 1 -- type 'help' for more info
Variable> OpenVPN CLIENT LIST
yxxxxxxxxz OpenVPN CLIENT LIST
The first printf statement works fine and shows exactly what I expect based on my experience at the terminal. But the second printf statement goes nuts, reverses the order of some of the characters, and prints $LINE entirely in the wrong place!

$LINE has a \r in it, which is sending the cursor back to column 1. Use parameter substitution or tr to remove it.
$ foo=$'1\r23'
$ echo "$foo"
23
$ echo "${foo/$'\r'/}"
123
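A minimal sketch of the same fix applied inside a read loop like the one in the question (assuming, as here, that the stray \r always sits at the end of the line):
while read -r LINE; do
LINE=${LINE%$'\r'}    # strip one trailing carriage return, if present
printf "%s\n" "Variable> $LINE" >>"$DEBUG_LOG"
done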

This will ramble on a bit. The first thing I noticed was this:
printf "%s\n" "Variable> $LINE" >>$DEBUG_LOG
And I immediately think, "this is C code." In shell programming, printf is a somewhat odd tool which is sometimes useful, unlike the printf in C, which is the first thing you reach for when you're printing something out. Try this instead:
echo "Variable> $LINE" >>$DEBUG_LOG
The other problem is potentially with telnet. If you parse telnet output in a shell, you're going to get some characters you don't want -- carriage returns, for example. The default value of IFS for bash is <space><tab><newline>, which will NOT catch carriage returns and the carriage returns WILL mess with the location of characters in your terminal.
Ideally, you would get a proper telnet client and use it non-interactively. However, in a pinch, you can do this:
tr -d '\r' | while read -r LINE; do
...
This will strip out carriage returns -- but there could still be other telnet junk in there, including WILL/WONT/DO/DONT commands, and NUL bytes.
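If NUL bytes are present as well, tr can drop them in the same pass; this is still only a best-effort cleanup, not a real telnet protocol handler, so WILL/WONT/DO/DONT negotiation bytes would still slip through:
tr -d '\r\000' | while read -r LINE; do
...
done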

Bash: New line in echo string fails when output is piped to crontab [duplicate]

How do I print a newline? This merely prints \n:
$ echo -e "Hello,\nWorld!"
Hello,\nWorld!
Use printf instead:
printf "hello\nworld\n"
printf behaves more consistently across different environments than echo.
Make sure you are in Bash.
$ echo $0
bash
All five of these ways work for me:
echo -e "Hello\nworld"
echo -e 'Hello\nworld'
echo Hello$'\n'world
echo Hello ; echo world
echo $'hello\nworld'
each printing the two words on separate lines; the last one, for instance, prints
hello
world
$'' strings use ANSI C Quoting:
Words of the form $'string' are treated specially. The word expands to string, with backslash-escaped characters replaced as specified by the ANSI C standard.
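Other C escapes work the same way inside $''; for example, a tab:
$ printf '%s\n' $'col1\tcol2'
col1	col2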
You could always do echo "".
For example,
echo "Hello,"
echo ""
echo "World!"
On the off chance that someone finds themselves beating their head against the wall trying to figure out why a coworker's script won't print newlines, look out for this:
#!/bin/bash
function GET_RECORDS()
{
echo -e "starting\n the process";
}
echo $(GET_RECORDS);
As in the above, the actual running of the method is itself wrapped in an echo; since that command substitution is unquoted, the shell word-splits its output and the embedded newline collapses into a space. Obviously, I watered this down for brevity. It was not so easy to spot!
You can then inform your comrades that a better way to execute functions would be like so:
#!/bin/bash
function GET_RECORDS()
{
echo -e "starting\n the process";
}
GET_RECORDS;
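If the command substitution really is needed, double-quoting it preserves the embedded newlines; a minimal illustration:
$ f() { echo -e "a\nb"; }
$ echo $(f)
a b
$ echo "$(f)"
a
b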
Simply type
echo
to get a new line
POSIX 7 on echo
http://pubs.opengroup.org/onlinepubs/9699919799/utilities/echo.html
-e is not defined and backslashes are implementation defined:
If the first operand is -n, or if any of the operands contain a <backslash> character, the results are implementation-defined.
unless you have an optional XSI extension.
So I recommend using printf instead, which is well specified:
format operand shall be used as the format string described in XBD File Format Notation [...]
the File Format Notation:
\n <newline> Move the printing position to the start of the next line.
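The question's example, rewritten with printf:
printf 'Hello,\nWorld!\n'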
Also keep in mind that Ubuntu 15.10 and most distros implement echo both as:
a Bash built-in: help echo
a standalone executable: which echo
which can lead to some confusion.
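A quick way to see the divergence, assuming dash is installed (its builtin echo takes no -e and expands backslashes by default):
$ bash -c "echo -e 'a\nb'"
a
b
$ dash -c "echo -e 'a\nb'"
-e a
b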
Another trick is to let sed's insert command interpret the escapes (GNU sed):
str='hello\nworld'
$ echo | sed "i$str"
hello
world
You can also do:
echo "hello
world"
This works both inside a script and from the command line.
On the command line, press Enter inside the open quotes; the shell shows its continuation prompt (PS2) and the literal line break becomes part of the string.
This works for me on my macOS and my Ubuntu 18.04 (Bionic Beaver) system.
For only the exact case asked about (no other special characters involved), just change the double quotes to single quotes:
echo -e 'Hello,\nWorld!'
Results in:
Hello,
World!
There is a new parameter expansion added in Bash 4.4 that interprets escape sequences:
${parameter@operator} - E operator
The expansion is a string that is the value of parameter with
backslash escape sequences expanded as with the $'…' quoting
mechanism.
$ foo='hello\nworld'
$ echo "${foo#E}"
hello
world
I just use echo without any arguments:
echo "Hello"
echo
echo "World"
To print a new line with echo, use:
echo
or
echo -e '\n'
This could better be done as
x="\n"
echo -ne $x
The -e option will make echo interpret backslash escape sequences
The -n option will remove the trailing newline from the output
PS: echo always appends a trailing newline to its output, so -n is required to turn that off (and make the example less confusing)
My script:
echo "WARNINGS: $warningsFound WARNINGS FOUND:\n$warningStrings
Output:
WARNING : 2 WARNINGS FOUND:\nWarning, found the following local orphaned signature file:
In my Bash script I was going as mad as you until I just tried:
echo "WARNING : $warningsFound WARNINGS FOUND:
$warningStrings"
Just hit Enter where you want to insert that jump. The output now is:
WARNING : 2 WARNINGS FOUND:
Warning, found the following local orphaned signature file:
If you're writing scripts and will be echoing newlines as part of other messages several times, a nice cross-platform solution is to put a literal newline in a variable like so:
newline='
'
echo "first line${newline}second line"
echo "Error: example error message n${newline}${usage}" >&2 #requires usage to be defined
If the previous answers don't work, and there is a need to get a return value from your function:
function foo()
{
local v="Dimi";
local s="";
.....
s+="Some message here $v $1\n"
.....
echo $s
}
r=$(foo "my message");
echo -e $r;
Only this trick worked on a Linux system I was working on with this Bash version:
GNU bash, version 4.2.25(1)-release (x86_64-redhat-linux-gnu)
You could also use echo with parentheses (a subshell group),
$ (echo hello; echo world)
hello
world
This got me there....
outstuff="RESOURCE_GROUP=[$RESOURCE_GROUP]\nAKS_CLUSTER_NAME=[$AKS_CLUSTER_NAME]\nREGION_NAME=[$REGION_NAME]\nVERSION=[$VERSION]\nSUBNET-ID=[$SUBNET_ID]"
printf "$outstuff\n"
Yields:
RESOURCE_GROUP=[akswork-rg]
AKS_CLUSTER_NAME=[aksworkshop-804]
REGION_NAME=[eastus]
VERSION=[1.16.7]
SUBNET-ID=[/subscriptions/{subidhere}/resourceGroups/makeakswork-rg/providers/Microsoft.Network/virtualNetworks/aks-vnet/subnets/aks-subnet]
Sometimes you can pass multiple strings separated by a space and it will be interpreted as \n.
For example, when using a shell script for multi-line notifications:
#!/bin/bash
notify-send 'notification success' 'another line' 'time now '`date +"%s"`
With jq:
$ jq -nr '"Hello,\nWorld"'
Hello,
World
Additional solution:
In cases where you have to echo multiline content of some length (such as code or configuration), for example from a Bash script that generates code or configuration files, echo -e and printf might have some limitations.
You can use some special character as a placeholder for the line break (such as ~) and replace it with tr after the content is assembled:
echo "${content}" | tr '~' '\n' > "$targetFile"
It needs to invoke another program (tr), which should be fine, IMO.

Why am I getting command not found error on numeric comparison?

I am trying to parse each line of a file and look for a particular string. The script seems to be doing its intended job; however, it also complains when it tries to execute the if command on line 6:
#!/bin/bash
for line in $(cat $1)
do
echo $line | grep -e "Oct/2015"
if($?==0); then
echo "current line is: $line"
fi
done
and I get the following (my script is readlines.sh)
./readlines.sh: line 6: 0==0: command not found
First: As Mr. Llama says, you need more spaces. Right now your script tries to look for a file named something like /usr/bin/0==0 to run. Instead:
[ "$?" -eq 0 ] # POSIX-compliant numeric comparison
[ "$?" = 0 ] # POSIX-compliant string comparison
(( $? == 0 )) # bash-extended numeric comparison
Second: Don't test $? at all in this case. In fact, you don't even have good cause to use grep; the following is both more efficient (because it uses only functionality built into bash and requires no invocation of external commands) and more readable:
if [[ $line = *"Oct/2015"* ]]; then
echo "Current line is: $line"
fi
If you really do need to use grep, write it like so:
if echo "$line" | grep -q "Oct/2015"; then
echo "Current line is: $line"
fi
That way if operates directly on the pipeline's exit status, rather than running a second command testing $? and operating on that command's exit status.
@Charles Duffy has a good answer which I have up-voted as correct (and it is), but here's a detailed, line-by-line breakdown of your script and the correct thing to do for each part of it.
for line in $(cat $1)
As I noted in my comment elsewhere this should be done as a while read construct instead of a for cat construct.
This construct will word-split each line, making spaces in the file separate "lines" in the output.
All empty lines will be skipped.
In addition when you cat $1 the variable should be quoted. If it is not quoted spaces and other less-usual characters appearing in the file name will cause the cat to fail and the loop will not process the file.
The complete line would read:
while IFS= read -r line
An illustrative example of the tradeoffs follows as a test script. I tried to include an indication of why IFS= and -r are important.
#!/bin/bash
mkdir -p /tmp/testcase
pushd /tmp/testcase >/dev/null
printf '%s\n' '' two 'three three' '' ' five with leading spaces' 'c:\some\dos\path' '' > testfile
printf '\nwc -l testfile:\n'
wc -l testfile
printf '\n\nfor line in $(cat) ... \n\n'
let n=1
for line in $(cat testfile) ; do
echo line $n: "$line"
let n++
done
printf '\n\nfor line in "$(cat)" ... \n\n'
let n=1
for line in "$(cat testfile)" ; do
echo line $n: "$line"
let n++
done
let n=1
printf '\n\nwhile read ... \n\n'
while read line ; do
echo line $n: "$line"
let n++
done < testfile
printf '\n\nwhile IFS= read ... \n\n'
let n=1
while IFS= read line ; do
echo line $n: "$line"
let n++
done < testfile
printf '\n\nwhile IFS= read -r ... \n\n'
let n=1
while IFS= read -r line ; do
echo line $n: "$line"
let n++
done < testfile
rm -- testfile
popd >/dev/null
rmdir /tmp/testcase
Note that this is a bash-heavy example; let, for instance, is not portable. On to the next line of your script.
do
As a matter of style I prefer do on the same line as the for or while declaration, but there's no convention on this.
echo $line | grep -e "Oct/2015"
The variable $line should be quoted here. In general, meaning always unless you specifically know better, you should double-quote all expansions--and that means command substitutions as well as variables. This insulates you from most unexpected shell weirdness.
You declared your shell as bash, which means you will have the "here string" operator <<< available to you. When available it can be used to avoid the pipe; each element of a pipeline executes in a subshell, which incurs extra overhead and can lead to unexpected behavior if you try to modify variables. This would be written as
grep -e "Oct/2015" <<<"$line"
Note that I have quoted the line expansion.
You have called grep with -e, which is not incorrect but is needless, since your pattern does not begin with -. In addition you have double-quoted a string, but you don't attempt to expand a variable or use other shell interpolation inside of it. When you don't expect and don't want the contents of a quoted string to be treated as special by the shell, you should single-quote it. Furthermore, your use of grep is inefficient: because your pattern is a fixed string and not a regular expression, you could have used fgrep or grep -F, which does a "string contains" check rather than regular expression matching (and is far faster because of this). So this could be
grep -F 'Oct/2015' <<<"$line"
Without altering the behavior.
if($?==0); then
This is the source of your original problem. In shell scripts commands are separated by whitespace; when you write if($?==0), the $? expands, probably to 0, and bash parses the if keyword followed by a subshell, ( $?==0 ), inside which it tries to run a command named 0==0, which is a legal command name. What you wanted to do was hand if a test command with some parameters, which requires more whitespace. I believe others have covered this sufficiently.
You should never need to test the value of $? in a shell script. The if command exists for branching behavior based on the return code of whatever command you pass to it, so you can inline your grep call and have if check its return code directly, thus:
if grep -F 'Oct/2015' <<<"$line" ; then
Note the generous whitespace around the ; delimiter. I do this because in shell whitespace is usually required and can only sometimes be omitted. Rather than try to remember when you can do which, I recommend an extra one-space padding around everything. It's never wrong and can make other mistakes easier to notice.
As others have noted this grep will print matched lines to stdout, which is probably not something you want. If you are using GNU grep, which is standard on Linux, you will have the -q switch available to you. This will suppress the output from grep
if grep -q -F 'Oct/2015' <<<"$line" ; then
If you are trying to be strictly standards-compliant, or are in any environment with a grep that doesn't know -q, the standard way to achieve this effect is to redirect stdout to /dev/null:
if printf "$line" | grep -F 'Oct/2015' >/dev/null ; then
In this example I also removed the here string bashism just to show a portable version of this line.
echo "current line is: $line"
There is nothing wrong with this line of your script, except that although echo is standard, implementations vary to such an extent that it's not possible to absolutely rely on its behavior. You can use printf anywhere you would use echo, and you can be fairly confident of what it will print. Even printf has some caveats: some uncommon escape sequences are not evenly supported. See mascheck for details.
printf 'current line is: %s\n' "$line"
Note the explicit newline at the end; printf doesn't add one automatically.
fi
No comment on this line.
done
In the case where you did as I recommended and replaced the for line with a while read construct this line would change to:
done < "$1"
This directs the contents of the file in the $1 variable to the stdin of the while loop, which in turn passes the data to read.
In the interests of clarity I recommend copying the value from $1 into another variable first. That way when you read this line the purpose is more clear.
I hope no one takes great offense at the stylistic choices made above, which I have attempted to note; there are many ways to do this (but not a great many correct ones).
Be sure to always run interesting snippets through the excellent shellcheck and explainshell when you run into difficulties like this in the future.
And finally, here's everything put together:
#!/bin/bash
input_file="$1"
while IFS= read -r line ; do
if grep -q -F 'Oct/2015' <<<"$line" ; then
printf 'current line is: %s\n' "$line"
fi
done < "$input_file"
If you like one-liners, you may use AND operator (&&), for example:
echo "$line" | grep -e "Oct/2015" && echo "current line is: $line"
or:
grep -qe "Oct/2015" <<<"$line" && echo "current line is: $line"
Spacing is important in shell scripting.
Also, double-parens is for numerical comparison, not single-parens.
if (( $? == 0 )); then

Looping through lines in a file in bash, without using stdin

I am foxed by the following situation.
I have a file list.txt that I want to run through line by line, in a loop, in bash. A typical line in list.txt has spaces in. The problem is that the loop contains a "read" command. I want to write this loop in bash rather than something like perl. I can't do it :-(
Here's how I would usually write a loop to read from a file line by line:
while read p; do
echo $p
echo "Hit enter for the next one."
read x
done < list.txt
This doesn't work though, because of course "read x" will be reading from list.txt rather than the keyboard.
And this doesn't work either:
for i in `cat list.txt`; do
echo $i
echo "Hit enter for the next one."
read x
done
because the lines in list.txt have spaces in.
I have two proposed solutions, both of which stink:
1) I could edit list.txt, and globally replace all spaces with "THERE_SHOULD_BE_A_SPACE_HERE" . I could then use something like sed, within my loop, to replace THERE_SHOULD_BE_A_SPACE_HERE with a space and I'd be all set. I don't like this for the stupid reason that it will fail if any of the lines in list.txt contain the phrase THERE_SHOULD_BE_A_SPACE_HERE (so malicious users can mess me up).
2) I could use the while loop with stdin and then in each loop I could actually launch e.g. a new terminal, which would be unaffected by the goings-on involving stdin in the original shell. I tried this and I did get it to work, but it was ugly: I want to wrap all this up in a shell script and I don't want that shell script to be randomly opening new windows. What would be nice, and what might somehow be the answer to this question, would be if I could figure out how to somehow invoke a new shell in the command and feed commands to it without feeding stdin to it, but I can't get it to work. For example this doesn't work and I don't really know why:
while read p; do
bash -c "echo $p; echo ""Press enter for the next one.""; read x;";
done < list.txt
This attempt seems to fail because "read x", despite being in a different shell somehow, is still seemingly reading from list.txt. But I feel like I might be close with this one -- who knows.
Help!
You must open the file on a different file descriptor:
while read p <&3; do
echo "$p"
echo 'Hit enter for the next one'
read x
done 3< list.txt
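In bash you can spell the same thing with read's -u option, which takes the file descriptor to read from and is equivalent to the <&3 redirection:
while read -u 3 p; do
echo "$p"
echo 'Hit enter for the next one'
read x
done 3< list.txt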
I would probably count the lines in the file and iterate over each of them using e.g. sed. It is also possible to read indefinitely from stdin by changing the while condition to while true; and to exit reading with Ctrl+C.
line=0 lines=$(sed -n '$=' in.file)
while [ $line -lt $lines ]
do
let line++
sed -n "${line}p" in.file
echo "Hit enter for the next ${line} of ${lines}."
read -s x
done
AWK is also a great tool for this. A simple way to iterate through the input would be:
awk '{ print $0; printf "%s", "Hit enter for the next"; getline < "-" }' file
As an alternative, you can read from stderr, which by default is connected to the tty as well. The following then also includes a test for that assumption:
(
tty -s <&2 || exit 1
while read -r line; do
echo "$line"
echo 'Hit enter'
read x <& 2
done < file
)

Printf two vars into a file

I have a list of IP's I want to snmpget into a file. I'm having issues writing output to the file.
OID=1.3.6.1.2.1.25.3.2.1.3.1
cat printers.csv | while read IP ; do
OUT=$(snmpget -v1 -c public $IP $OID)
printf '%s, %s\n' $IP $OUT >> printerNames.csv
done
I'm new to the printf command. I'm guessing that's where it's messing up, because the output is being written incorrectly. Also, when there is no response it echoes to the console, and I'd like that written to the output file too. Any help will be much appreciated.
Try this:
OID=1.3.6.1.2.1.25.3.2.1.3.1
while read IP ; do
OUT=$(snmpget -v1 -c public "$IP" "$OID") && printf '%s, %s\n' "$IP" "$OUT"
done < printers.csv > printerNames.csv 2>&1
It's a good idea to always quote parameter expansions unless you have a good reason not to. The redirections are applied to the while loop as a whole. read will read a line at a time from the input file (no cat needed); standard output is redirected to the output file, and anything written to standard error is then sent to the same place. The printf is only executed if the snmpget command succeeds (I'm assuming it has a non-zero exit status if the lookup fails; that may not be the case).
It sounds like printers.csv has DOS line endings (\r\n). The carriage return is included as the last character of each line. When you print $IP, it prints the address, then the carriage return, which moves the cursor back to the beginning of the line. This causes the remainder of the line (, $OUT) to overwrite the address. To remove the carriage return, run the input file through dos2unix, or use some other method to turn the DOS line endings into UNIX line endings (\n alone).
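If you'd rather not modify the input file, a minimal in-loop fix (bash syntax, assuming the carriage return is always the last character of the line) is to strip it as you read:
while IFS= read -r IP ; do
IP=${IP%$'\r'}    # drop the trailing CR left by DOS line endings
OUT=$(snmpget -v1 -c public "$IP" "$OID") && printf '%s, %s\n' "$IP" "$OUT"
done < printers.csv > printerNames.csv 2>&1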

Read values into a shell variable from a pipe

I am trying to get bash to process data from stdin that gets piped into it, but no luck. What I mean is that none of the following work:
echo "hello world" | test=($(< /dev/stdin)); echo test=$test
test=
echo "hello world" | read test; echo test=$test
test=
echo "hello world" | test=`cat`; echo test=$test
test=
where I want the output to be test=hello world. I've tried putting "" quotes around "$test" that doesn't work either.
Use
IFS= read var << EOF
$(foo)
EOF
You can trick read into accepting from a pipe like this:
echo "hello world" | { read test; echo test=$test; }
or even write a function like this:
read_from_pipe() { read "$@" <&0; }
But there's no point - your variable assignments may not last! A pipeline may spawn a subshell, where the environment is inherited by value, not by reference. This is why read doesn't bother with input from a pipe - it's undefined.
FYI, http://www.etalabs.net/sh_tricks.html is a nifty collection of the cruft necessary to fight the oddities and incompatibilities of Bourne shells, sh.
if you want to read in lots of data and work on each line separately you could use something like this:
cat myFile | while read x ; do echo $x ; done
if you want to split the lines up into multiple words you can use multiple variables in place of x like this:
cat myFile | while read x y ; do echo $y $x ; done
alternatively:
while read x y ; do echo $y $x ; done < myFile
But as soon as you start wanting to do anything really clever with this sort of thing, you're better off going with a scripting language like Perl, where you could try something like this:
perl -ane 'print "$F[0]\n"' < myFile
There's a fairly steep learning curve with Perl (or I guess any of these languages), but you'll find it a lot easier in the long run if you want to do anything but the simplest of scripts. I'd recommend the Perl Cookbook and, of course, Programming Perl by Larry Wall et al.
This is another option
$ read test < <(echo hello world)
$ echo $test
hello world
read won't read from a pipe (or possibly the result is lost because the pipe creates a subshell). You can, however, use a here string in Bash:
$ read a b c <<< $(echo 1 2 3)
$ echo $a $b $c
1 2 3
But see @chepner's answer for information about lastpipe.
I'm no expert in Bash, but I wonder why this hasn't been proposed:
stdin=$(cat)
echo "$stdin"
One-liner proof that it works for me:
$ fortune | eval 'stdin=$(cat); echo "$stdin"'
bash 4.2 introduces the lastpipe option, which allows your code to work as written, by executing the last command in a pipeline in the current shell, rather than a subshell.
shopt -s lastpipe
echo "hello world" | read test; echo test=$test
A smart script that can read data both from a pipe and from command-line arguments:
#!/bin/bash
if [[ -p /dev/stdin ]]
then
PIPE=$(cat -)
echo "PIPE=$PIPE"
fi
echo "ARGS=$#"
Output:
$ bash test arg1 arg2
ARGS=arg1 arg2
$ echo pipe_data1 | bash test arg1 arg2
PIPE=pipe_data1
ARGS=arg1 arg2
Explanation: When a script receives any data via a pipe, then /dev/stdin (or /proc/self/fd/0) will be a symlink to a pipe.
/proc/self/fd/0 -> pipe:[155938]
If not, it will point to the current terminal:
/proc/self/fd/0 -> /dev/pts/5
The bash test [[ -p can check whether it is a pipe or not.
cat - reads from stdin.
If we use cat - when there is no stdin, it will wait forever, that is why we put it inside the if condition.
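You can watch this from the shell itself; the pts device and pipe number below are illustrative, not fixed values:
$ readlink /proc/self/fd/0
/dev/pts/5
$ echo pipe_data1 | readlink /proc/self/fd/0
pipe:[155938]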
The syntax for an implicit pipe from a shell command into a bash variable is
var=$(command)
or
var=`command`
In your examples, you are piping data to an assignment statement, which does not expect any input.
In my eyes the best way to read from stdin in bash is the following one, which also lets you work on the lines before the input ends:
while read LINE; do
echo $LINE
done < /dev/stdin
The first attempt was pretty close. This variation should work:
echo "hello world" | { test=$(< /dev/stdin); echo "test=$test"; };
and the output is:
test=hello world
You need braces after the pipe to enclose the assignment to test and the echo.
Without the braces, the assignment to test (after the pipe) is in one shell, and the echo "test=$test" is in a separate shell which doesn't know about that assignment. That's why you were getting "test=" in the output instead of "test=hello world".
Because I fell for it, I would like to drop a note.
I found this thread because I have to rewrite an old sh script
to be POSIX-compatible.
This basically means to circumvent the pipe/subshell problem introduced by POSIX by rewriting code like this:
some_command | read a b c
into:
read a b c << EOF
$(some_command)
EOF
And code like this:
some_command |
while read a b c; do
# something
done
into:
while read a b c; do
# something
done << EOF
$(some_command)
EOF
But the latter does not behave the same on empty input.
With the old notation the while loop is not entered on empty input,
but in the POSIX notation it is!
I think it's due to the newline before EOF,
which cannot be omitted.
The POSIX code which behaves more like the old notation
looks like this:
while read a b c; do
case $a in ("") break; esac
# something
done << EOF
$(some_command)
EOF
In most cases this should be good enough.
But unfortunately this still does not behave exactly like the old notation
if some_command prints an empty line.
In the old notation the while body is executed
and in POSIX notation we break in front of the body.
An approach to fix this might look like this:
while read a b c; do
case $a in ("something_guaranteed_not_to_be_printed_by_some_command") break; esac
# something
done << EOF
$(some_command)
echo "something_guaranteed_not_to_be_printed_by_some_command"
EOF
Piping something into an expression involving an assignment doesn't behave like that.
Instead, try:
test=$(echo "hello world"); echo test=$test
The following code:
echo "hello world" | ( test=($(< /dev/stdin)); echo test=$test )
will work too, but it will open another new sub-shell after the pipe, where
echo "hello world" | { test=($(< /dev/stdin)); echo test=$test; }
won't.
I had to disable job control to make use of @chepner's method (I was running this command from the terminal):
set +m; shopt -s lastpipe
echo "hello world" | read test; echo test=$test
echo "hello world" | test="$(</dev/stdin)"; echo test=$test
Bash Manual says:
lastpipe
If set, and job control is not active, the shell runs the last command
of a pipeline not executed in the background in the current shell
environment.
Note: job control is turned off by default in a non-interactive shell and thus you don't need the set +m inside a script.
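A minimal script-context sketch (bash 4.2 or newer; inside a script no set +m is needed):
#!/bin/bash
shopt -s lastpipe
echo "hello world" | read test
echo "test=$test"    # prints test=hello world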
I think you were trying to write a shell script which could take input from stdin, but while you were trying to do it inline, you got lost trying to create that test= variable.
I think it does not make much sense to do it inline, and that's why it does not work the way you expect.
I was trying to reduce
$( ... | head -n $X | tail -n 1 )
to get a specific line from various input.
so I could type...
cat program_file.c | line 34
so I needed a small shell program able to read from stdin, like you do.
22:14 ~ $ cat ~/bin/line
#!/bin/sh
if [ $# -ne 1 ]; then echo enter a line number to display; exit; fi
cat | head -n $1 | tail -n 1
22:16 ~ $
there you go.
The question is how to catch output from a command to save in variable(s) for use later in a script. I might repeat some earlier answers, but I'll try to line up all the answers I can think of to compare and comment, so bear with me.
The intuitive construct
echo test | read x
echo x=$x
is valid in the Korn shell because ksh has implemented it so that the last command in a piped series is part of the current shell, i.e. the previous pipe commands are subshells. In contrast, other shells define all piped commands as subshells, including the last.
This is the exact reason I prefer ksh.
But having to cope with other shells, bash for example, another construct must be used.
To catch 1 value this construct is viable:
x=$(echo test)
echo x=$x
But that only caters for 1 value to be collected for later use.
To catch more values this construct is useful and works in bash and ksh:
read x y <<< $(echo test again)
echo x=$x y=$y
There is a variant which I have noticed works in bash but not in ksh:
read x y < <(echo test again)
echo x=$x y=$y
The <<< $(...) is a here-document variant which gives all the meta handling of a standard command line. < <(...) is an input redirection of a file-substitution operator.
I use "<<< $(" in all my scripts now because it seems the most portable construct between shell variants. I have a tools set I carry around on jobs in any Unix flavor.
Of course there is the universally viable but crude solution:
command-1 | { command-2; echo "x=test; y=again" > file.tmp; chmod 700 file.tmp; }
. ./file.tmp
rm file.tmp
echo x=$x y=$y
I wanted something similar - a function that parses a string that can be passed as a parameter or piped.
I came up with a solution as below (works as #!/bin/sh and as #!/bin/bash)
#!/bin/sh
set -eu
my_func() {
local content=""
# if the first param is not set at all (an empty string still counts as set)
if [ -z "${1+x}" ]; then
# read content from a pipe if passed or from a user input if not passed
while read line; do content="${content}$line"; done < /dev/stdin
# first param was set (it may be an empty string)
else
content="$1"
fi
echo "Content: '$content'";
}
printf "0. $(my_func "")\n"
printf "1. $(my_func "one")\n"
printf "2. $(echo "two" | my_func)\n"
printf "3. $(my_func)\n"
printf "End\n"
Outputs:
0. Content: ''
1. Content: 'one'
2. Content: 'two'
typed text
3. Content: 'typed text'
End
For the last case (3.) you need to type the text, hit Enter, and then Ctrl+D to end the input.
How about this:
echo "hello world" | echo test=$(cat)
