I have a bash script that employs the read command to read arguments to commands interactively, for example yes/no options. Is there a way to call this script in a non-interactive script passing default option values as arguments?
It's not just one option that I have to pass to the interactive script.
Many ways
pipe your input
echo "yes
no
maybe" | your_program
redirect from a file
your_program < answers.txt
use a here document (this can be very readable)
your_program << ANSWERS
yes
no
maybe
ANSWERS
use a here string
your_program <<< $'yes\nno\nmaybe\n'
For more complex tasks there is expect (http://en.wikipedia.org/wiki/Expect).
It essentially simulates a user: you script how to react to specific program output.
This also works in cases such as ssh, which refuses to have passwords piped to it.
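For illustration, a minimal expect sketch, assuming your_program prints prompts containing "(Y/N)?" (the pattern and the program name are placeholders):
expect -c '
    spawn ./your_program
    expect {
        "(Y/N)?" { send "y\r"; exp_continue }
        eof
    }
'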
You can put the data in a file and re-direct it like this:
$ cat file.sh
#!/bin/bash
read x
read y
echo $x
echo $y
Data for the script:
$ cat data.txt
2
3
Executing the script:
$ file.sh < data.txt
2
3
Just want to add one more way. I found it elsewhere, and it is quite simple.
Say I want to answer yes to all the prompts of a command "execute_command"; then I would simply pipe yes to it.
yes | execute_command
This will use yes as the answer to all yes/no prompts.
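Note that yes repeats whatever argument you give it, so answering no to every prompt works the same way:
yes n | execute_command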
You can also use printf to pipe the input to your script.
var=val
printf "yes\nno\nmaybe\n$var\n" | ./your_script.sh
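Since printf reapplies its format string to each remaining argument, the same input can be written one answer per argument:
printf '%s\n' yes no maybe "$var" | ./your_script.sh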
I have a program which accepts 2 prompts (y/n). For example:
stopprogram
do you want to stop the program (Y/N)? y
do you want to send an email to the admin about it (Y/N)? y
Now I'd like to automate that using the 'at' command. The following works on Solaris but not on Linux (RHEL):
at now +5 minutes << EOF
> for i in {1..2}
> do
> echo 'y'
> done | stopprogram
> EOF
commands will be executed using /usr/bin/bash
...
...
Any idea? Thanks!
Your problem may be because of the space between << and EOF.
Note that there is a special program, yes, for repeatedly outputting a line composed of all of its arguments. By default it outputs 'y'. It was created specifically for forcing a scripted flow through such prompts.
Thus the short version of your command will look like this:
at now +5 minutes <<EOF
yes | stopprogram
EOF
I found the solution. This will work:
at now+5 min <<EOF
bash -l -c 'yes | stopprogram'
EOF
That's it!
I'm trying to write a pretty basic script in Linux shell but I'm still learning. Basically, everything is good to go except one part. I direct two outputs into the same file, e.g.:
echo `losetup -a` > partitionfile
echo "p1" >> partition final
Basically, I need to add the letter/number "p1" to the end of whatever is written in the file.
The problem is, it ends up being read (cat partitionfile) as:
/dev/loop0
p1
I need it on the same line to it reads out as:
/dev/loop0p1
There has to be a way to fix this, I just don't know it. Any help would be much appreciated!
Thanks!
I would go for:
echo "$(losetup -a)p1" > partitionfile
For an example, see the following transcript:
pax> echo "$(echo xyzzy_)p1"
xyzzy_p1
The xyzzy_ is the output of the inner echo command (which in your case would be losetup) and the outer echo command appends p1.
Actually, the echo option to achieve this is "\c":
\c keeps the cursor on the same line.
However, you cannot use \c unless you have enabled escape interpretation with
-e
Thus your code should be something like this:
echo -e "`losetup -a` \c" > partitionfile
echo "p1" >> partitionfile
This will write into partitionfile:
<output of losetup -a> p1
with everything on the same line.
You can pass the -n flag to the first echo statement so that it does not print the trailing newline.
Ref: http://linux.die.net/man/1/echo
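For example, a sketch of the same fix using -n (note that the command substitution also strips the trailing newline by itself):
echo -n "$(losetup -a)" > partitionfile
echo "p1" >> partitionfile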
I am trying to get bash to process data from stdin that gets piped into it, but no luck. What I mean is that none of the following work:
echo "hello world" | test=($(< /dev/stdin)); echo test=$test
test=
echo "hello world" | read test; echo test=$test
test=
echo "hello world" | test=`cat`; echo test=$test
test=
where I want the output to be test=hello world. I've tried putting quotes around "$test", but that doesn't work either.
Use
IFS= read var << EOF
$(foo)
EOF
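For example, the IFS= prefix is what preserves leading and trailing whitespace in the captured line:
IFS= read var << EOF
$(echo "  hello world  ")
EOF
echo "[$var]"    # prints [  hello world  ]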
You can trick read into accepting from a pipe like this:
echo "hello world" | { read test; echo test=$test; }
or even write a function like this:
read_from_pipe() { read "$@" <&0; }
But there's no point - your variable assignments may not last! A pipeline may spawn a subshell, where the environment is inherited by value, not by reference. This is why read doesn't bother with input from a pipe - it's undefined.
FYI, http://www.etalabs.net/sh_tricks.html is a nifty collection of the cruft necessary to fight the oddities and incompatibilities of bourne shells, sh.
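A quick demonstration of why the assignment may not last: the braces keep read and echo together, but the variable is gone once the pipeline ends (in bash, unless lastpipe is in effect):
echo "hello world" | { read test; echo "inside:  test=$test"; }
echo "outside: test=$test"    # prints an empty value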
If you want to read in lots of data and work on each line separately, you could use something like this:
cat myFile | while read x ; do echo $x ; done
If you want to split the lines up into multiple words, you can use multiple variables in place of x, like this:
cat myFile | while read x y ; do echo $y $x ; done
Alternatively:
while read x y ; do echo $y $x ; done < myFile
But as soon as you start wanting to do anything really clever with this sort of thing, you're better off going for a scripting language like perl, where you could try something like this:
perl -ane 'print "$F[0]\n"' < myFile
There's a fairly steep learning curve with perl (or I guess any of these languages) but you'll find it a lot easier in the long run if you want to do anything but the simplest of scripts. I'd recommend the Perl Cookbook and, of course, Programming Perl by Larry Wall et al.
This is another option
$ read test < <(echo hello world)
$ echo $test
hello world
read won't read from a pipe (or possibly the result is lost because the pipe creates a subshell). You can, however, use a here string in Bash:
$ read a b c <<< $(echo 1 2 3)
$ echo $a $b $c
1 2 3
But see @chepner's answer for information about lastpipe.
I'm no expert in Bash, but I wonder why this hasn't been proposed:
stdin=$(cat)
echo "$stdin"
One-liner proof that it works for me:
$ fortune | eval 'stdin=$(cat); echo "$stdin"'
bash 4.2 introduces the lastpipe option, which allows your code to work as written, by executing the last command in a pipeline in the current shell, rather than a subshell.
shopt -s lastpipe
echo "hello world" | read test; echo test=$test
A smart script that can read data both from a pipe and from command-line arguments:
#!/bin/bash
if [[ -p /dev/stdin ]]
then
PIPE=$(cat -)
echo "PIPE=$PIPE"
fi
echo "ARGS=$#"
Output:
$ bash test arg1 arg2
ARGS=arg1 arg2
$ echo pipe_data1 | bash test arg1 arg2
PIPE=pipe_data1
ARGS=arg1 arg2
Explanation: When a script receives data via a pipe, /dev/stdin (or /proc/self/fd/0) will be a symlink to a pipe.
/proc/self/fd/0 -> pipe:[155938]
If not, it will point to the current terminal:
/proc/self/fd/0 -> /dev/pts/5
The bash [[ -p test can check whether it is a pipe or not.
cat - reads from stdin.
If we used cat - when nothing is piped in, it would wait forever; that is why we put it inside the if condition.
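You can observe the symlink yourself on Linux (the pipe inode number will differ):
ls -l /proc/self/fd/0              # at a terminal: ... -> /dev/pts/5
echo hi | ls -l /proc/self/fd/0    # in a pipeline: ... -> pipe:[155938]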
The syntax for an implicit pipe from a shell command into a bash variable is
var=$(command)
or
var=`command`
In your examples, you are piping data to an assignment statement, which does not expect any input.
In my eyes the best way to read from stdin in bash is the following one, which also lets you work on each line as it arrives, before the input ends:
while IFS= read -r LINE; do
    echo "$LINE"
done < /dev/stdin
The first attempt was pretty close. This variation should work:
echo "hello world" | { test=$(< /dev/stdin); echo "test=$test"; };
and the output is:
test=hello world
You need braces after the pipe to enclose the assignment to test and the echo.
Without the braces, the assignment to test (after the pipe) is in one shell, and the echo "test=$test" is in a separate shell which doesn't know about that assignment. That's why you were getting "test=" in the output instead of "test=hello world".
Because I fell for it, I would like to drop a note.
I found this thread because I had to rewrite an old sh script
to be POSIX compatible.
This basically means to circumvent the pipe/subshell problem introduced by POSIX by rewriting code like this:
some_command | read a b c
into:
read a b c << EOF
$(some_command)
EOF
And code like this:
some_command |
while read a b c; do
# something
done
into:
while read a b c; do
# something
done << EOF
$(some_command)
EOF
But the latter does not behave the same on empty input.
With the old notation the while loop is not entered on empty input,
but in POSIX notation it is!
I think it's due to the newline before EOF,
which cannot be omitted.
The POSIX code which behaves more like the old notation
looks like this:
while read a b c; do
case $a in ("") break; esac
# something
done << EOF
$(some_command)
EOF
In most cases this should be good enough.
But unfortunately this still does not behave exactly like the old notation
if some_command prints an empty line:
in the old notation the while body is executed,
whereas in POSIX notation we break in front of the body.
An approach to fix this might look like this:
while read a b c; do
case $a in ("something_guaranteed_not_to_be_printed_by_some_command") break; esac
# something
done << EOF
$(some_command)
echo "something_guaranteed_not_to_be_printed_by_some_command"
EOF
Piping something into an expression involving an assignment doesn't behave like that.
Instead, try:
test=$(echo "hello world"); echo test=$test
The following code:
echo "hello world" | ( test=($(< /dev/stdin)); echo test=$test )
will work too, but it will open another new sub-shell after the pipe, where
echo "hello world" | { test=($(< /dev/stdin)); echo test=$test; }
won't.
I had to disable job control to make use of chepner's method (I was running this command from a terminal):
set +m;shopt -s lastpipe
echo "hello world" | read test; echo test=$test
echo "hello world" | test="$(</dev/stdin)"; echo test=$test
Bash Manual says:
lastpipe
If set, and job control is not active, the shell runs the last command
of a pipeline not executed in the background in the current shell
environment.
Note: job control is turned off by default in a non-interactive shell and thus you don't need the set +m inside a script.
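So inside a script the shopt alone suffices; a minimal sketch:
#!/bin/bash
shopt -s lastpipe    # no set +m needed: job control is already off in a script
echo "hello world" | read test
echo "test=$test"    # test=hello world (bash 4.2+)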
I think you were trying to write a shell script which could take input from stdin, but while trying to do it inline, you got lost creating that test= variable.
I think it does not make much sense to do it inline, and that's why it does not work the way you expect.
I was trying to reduce
$( ... | head -n $X | tail -n 1 )
to get a specific line from various input.
so I could type...
cat program_file.c | line 34
So I needed a small shell program able to read from stdin, like you do.
22:14 ~ $ cat ~/bin/line
#!/bin/sh
if [ $# -ne 1 ]; then echo enter a line number to display; exit; fi
cat | head -n $1 | tail -n 1
22:16 ~ $
there you go.
The question is how to catch output from a command to save in variable(s) for later use in a script. I might repeat some earlier answers, but I try to line up all the answers I can think of to compare and comment, so bear with me.
The intuitive construct
echo test | read x
echo x=$x
is valid in the Korn shell because ksh runs the last command of a piped series as part of the current shell, i.e. only the preceding pipe commands are subshells. In contrast, other shells run all piped commands in subshells, including the last.
This is the exact reason I prefer ksh.
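A quick way to see the difference, assuming both shells are installed:
ksh -c 'echo test | read x; echo "ksh:  x=$x"'    # ksh:  x=test
bash -c 'echo test | read x; echo "bash: x=$x"'   # bash: x=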
But to cope with other shells, e.g. bash, another construct must be used.
To catch 1 value this construct is viable:
x=$(echo test)
echo x=$x
But that only caters for 1 value to be collected for later use.
To catch more values this construct is useful and works in bash and ksh:
read x y <<< $(echo test again)
echo x=$x y=$y
There is a variant which I have noticed works in bash but not in ksh:
read x y < <(echo test again)
echo x=$x y=$y
The <<< $(...) is a here-string (a here-document variant) which gives all the meta handling of a standard command line. < <(...) is an input redirection from a process substitution.
I use "<<< $(" in all my scripts now because it seems the most portable construct between shell variants. I have a tool set I carry around on jobs in any Unix flavor.
Of course there is the universally viable but crude solution:
command-1 | { command-2; echo "x=test; y=again" > file.tmp; chmod 700 file.tmp; }
. ./file.tmp
rm file.tmp
echo x=$x y=$y
I wanted something similar - a function that parses a string that can be passed as a parameter or piped.
I came up with a solution as below (works as #!/bin/sh and as #!/bin/bash)
#!/bin/sh
set -eu
my_func() {
local content=""
# if the first param is not set (an empty string still counts as set)
if [ -z "${1+x}" ]; then
# read content from a pipe if passed or from a user input if not passed
while read line; do content="${content}$line"; done < /dev/stdin
# first param was set (it may be an empty string)
else
content="$1"
fi
echo "Content: '$content'";
}
printf "0. $(my_func "")\n"
printf "1. $(my_func "one")\n"
printf "2. $(echo "two" | my_func)\n"
printf "3. $(my_func)\n"
printf "End\n"
Outputs:
0. Content: ''
1. Content: 'one'
2. Content: 'two'
typed text
3. Content: 'typed text'
End
For the last case (3.) you need to type something, hit Enter, and then press Ctrl+D to end the input.
How about this:
echo "hello world" | echo test=$(cat)