predefined input in a nohup shell script [duplicate] - linux

I have a script that calls an application that requires user input, e.g. run app that requires user to type in 'Y' or 'N'.
How can I get the shell script not to ask the user for the input but rather use the value from a predefined variable in the script?
In my case there will be two questions that require input.

You can pipe in whatever text you'd like on stdin and it will be just the same as having the user type it themselves. For example, to simulate typing "Y", just use:
echo "Y" | myapp
or using a shell variable:
echo $ANSWER | myapp
There is also a unix command called "yes" that outputs a continuous stream of "y" for apps that ask lots of questions that you just want to answer in the affirmative.
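For example, paired with head (standing in for a hypothetical app that reads a few answers), yes looks like this:

```shell
# emit an endless stream of "y"; head consumes the first three
yes | head -n 3

# yes also accepts an argument, e.g. when the answer should be "n"
yes n | head -n 2
```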

If the app reads from stdin (as opposed to from /dev/tty, as e.g. the passwd program does), then multiline input is the perfect candidate for a here-document.
#!/bin/sh
the_app [app options here] <<EOF
Yes
No
Maybe
Do it with $SHELL
Quit
EOF
As you can see, here-documents even allow parameter substitution. If you don't want this, use <<'EOF'.
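A quick way to see the difference, with cat standing in for the app:

```shell
# unquoted delimiter: $HOME is expanded inside the here-document
cat <<EOF
home is $HOME
EOF

# quoted delimiter: the body is taken literally, no expansion
cat <<'EOF'
home is $HOME
EOF
```

The first invocation prints your actual home directory; the second prints the literal text `home is $HOME`.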

For more complicated situations, there is the expect command; your system should have it. I haven't used it much myself, but I suspect it's what you're looking for.
$ man expect
http://oreilly.com/catalog/expect/chapter/ch03.html

I prefer this way: if you want multiple inputs, you put in multiple echo statements, like so:
{ echo Y; echo Y; } | sh install.sh >> install.out
In the example above, I am feeding two inputs into the install.sh script. Then, at the end, I am appending the script's output to a log file to be archived and viewed later.


Piping Text into a bash script

When my user logs in, I need to enter the following manually, so I am trying to create a script to do it for me:
. oraenv
The app asks me for input so I enter "M40" (same text every time)
Then I have to run a linux app to launch my work environment.
So how do I automatically enter M40 followed by the Enter key?
The oraenv script is prompting for a value for ORACLE_SID, so you can set that yourself in a .profile or elsewhere.
export ORACLE_SID=M40
It also has a flag you can set to make it non-interactive:
ORAENV_ASK=NO
Regarding piped input specifically, the script would have to be written to handle it, for example using read or commands such as cat without a filename. See Pipe input into a script for more details. However, this is not how the standard oraenv is coded (assuming that is the script you are using).
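For illustration only, a script written to accept piped input might look like the sketch below (askme.sh is a hypothetical name; this is not how the standard oraenv works):

```shell
#!/bin/sh
# askme.sh (hypothetical): reads its answer from stdin with read,
# so both interactive typing and piped input work
printf 'ORACLE_SID = ? '
read -r sid
echo "using SID: $sid"
```

Run interactively it prompts you; run as `echo M40 | sh askme.sh` it picks up M40 from the pipe.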
I am not sure whether any one of these approaches helps you.
echo M40 | . oraenv
This one pipes the output of echo into the script.
printf M40 | . oraenv
This one uses printf instead. The main practical difference here is that echo appends a trailing newline while printf outputs only what you give it, so a program reading a line with read may need the newline that echo supplies; printf also interprets % format specifiers in its first argument.
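You can see the trailing-newline difference with a byte count:

```shell
# echo appends a newline, printf does not unless you ask for one
echo M40 | wc -c        # 4 bytes: M, 4, 0, newline
printf M40 | wc -c      # 3 bytes: no trailing newline
printf 'M40\n' | wc -c  # 4 bytes again, same as echo here
```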
. oraenv <<< M40
This one uses Here String (Sorry for using ABS as reference), a stripped-down form of Heredoc.
. oraenv < <(echo M40)
This one uses Process Substitution, you may see https://superuser.com/questions/1059781/what-exactly-is-in-bash-and-in-zsh for the difference between this one and the above one.
expect -c "spawn . oraenv; expect \"nput\"; send \"M40\r\n\"; interact"
This one uses expect to do automatic input; it offers more flexibility in many situations. Note: change the expect \"nput\" part to match your actual prompt.

Automatic enter input in shell command line [duplicate]

This question already has answers here:
Have bash script answer interactive prompts [duplicate]
(6 answers)
Closed 5 years ago.
I am running a script (I can't edit it), and there are three yes/no questions. How can I automatically respond to these questions? I need to answer yes, yes, no (in that order).
Try this:
echo -e "yes\nyes\nno" | /path/to/your/script
From help echo:
-e: enable interpretation of the following backslash escapes
Pipe to Standard Input
Some scripts can take replies from standard input. One of the many ways to do this would be:
$ printf "%s\n" yes yes no | ./foo.sh
yes yes no
This is simple and easy to read, but relies on how your script internals handle standard input, and if you can't edit the target script that can sometimes be a problem.
Use Expect for Interactive Prompts
While you can sometimes get away with using standard input, interactive prompts are generally better handled by tools like Expect. For example, given a script foo.sh, you can write foo.exp to automate it.
Note: You can also use autoexpect to create a script from an interactive session, which you can then edit if necessary. I'd highly recommend this for people new to Expect.
Bash Script: foo.sh
This is the script you might want to automate.
#!/usr/bin/env bash
for question in Foo Bar Baz; do
read -p "${question}? "
replies=("${replies[@]}" "$REPLY")
done
echo "${replies[@]}"
Expect Script: foo.exp
Here is a simplistic Expect script to automate the Bash script above. Expect loops, branching, and regular expressions can provide much more flexibility than this oversimplified example shows, but it does show how easy a minimal Expect script can be!
#!/usr/bin/env expect
spawn -noecho /tmp/foo.sh
expect "Foo? " { send -- "1\r" }
expect "Bar? " { send -- "2\r" }
expect "Baz? " { send -- "3\r" }
interact
Sample Interactive Session
This is what your interactive session will look like when you run the Expect script. It will spawn your Bash script, and respond as instructed to each different prompt.
$ /tmp/foo.exp
Foo? 1
Bar? 2
Baz? 3
1 2 3

call bash script and fill input data from another bash script

I have a bash script, a.sh
When I run a.sh, I need to answer several read prompts. It goes like this:
./a.sh
Please input a comment for script usage
test (I need to type this line manually when running the script a.sh, then press Enter to continue)
Now I call a.sh in my new script b.sh. Can I have b.sh fill in the "test" string automatically?
One other question: a.sh prints a lot to the console. Can I mute the output from a.sh by doing something in b.sh, without changing a.sh?
Thanks.
Within broad limits, you can have one script supply the standard input to another script.
However, you'd probably still see the prompts, even though you'd not see anything that satisfies those prompts. That would look bad. Also, depending on what a.sh does, you might need it to read more information from standard input — but you'd have to ensure the script calling it supplies the right information.
Generally, though, you try to avoid this. Scripts that prompt for input are bad for automation. It is better to supply the inputs via command line arguments. That makes it easy for your second script, b.sh, to drive a.sh.
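A sketch of that idea (the argument handling here is a hypothetical rework, not part of the original a.sh): accept the comment as $1 and only prompt when it is missing:

```shell
#!/bin/sh
# a.sh, reworked: take the comment from the first argument,
# fall back to an interactive prompt only when none is given
comment=$1
if [ -z "$comment" ]; then
    printf 'Please input a comment for script usage\n'
    read -r comment
fi
echo "comment: $comment"
```

Then b.sh can simply call `./a.sh "test"` with no prompting at all.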
a.sh
#!/bin/bash
read myvar
echo "you typed ${myvar}"
b.sh
#!/bin/bash
echo "hello world"
You can do this in two ways:
$ ./b.sh | ./a.sh
you typed hello world
$ ./a.sh <<< `./b.sh`
you typed hello world
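As for the second question (muting a.sh's console output): b.sh can redirect a.sh's stdout (and stderr, if desired) to /dev/null without touching a.sh, at the cost of also hiding the prompts:

```shell
#!/bin/sh
# b.sh (sketch): feed the answer to a.sh and discard its console output
echo "test" | ./a.sh > /dev/null 2>&1
```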

How to get output from bash shell [duplicate]

This question already has answers here:
Closed 10 years ago.
Possible Duplicate:
bash - automatically capture output of last executed command into a variable
For example
In Bash shell
user:/ & echo "Hello"
Hello
user:/ & date
Sat Mar 17 01:48:45 ICT 2012
How can I capture the "Hello" or "Sat Mar 17 01:48:45 ICT 2012" output?
I mean every output from the bash shell.
The point is, I want to capture those outputs and do something with them.
For example, when I echo Hello, I want to reverse the string to olleH and then print it out.
Something like this
echo "Hello"
olleH
More Information
In the command line interface, not in a shell script.
When I type a command such as echo or date in the command line interface, the output shows on the next line.
user:/ & echo "Hello"
Hello
The "Hello" is the output of the bash shell. I want to store the output Hello into a variable and pass it into a "Dictionary Script".
The Hello will be matched and translated according to the Dictionary Script:
Hello = Hi (just an example)
Now Hello is translated into Hi, sent back to the shell, and printed out.
Result
user:/ & echo "Hello"
Hi
How can I grab the output?
Could I take the Bash shell source, edit it, and recompile?
Could I grab from the /proc/BASHPID?
Attempt at answering edited question
From your edit, it sounds as if you want to have bash executing commands, but the outputs from bash should be passed to a 'transmogrifier' which changes the output according to your whims.
This is not going to be trivial; I'm not sure it will be usable for more than a few seconds of fun.
At one level, you want an interactive shell with its standard output (and maybe standard error) going to your modifier program:
bash -i 2>&1 | transmogrifier
For basic command line work, this will just about be usable, though line editing (even with just backspace) becomes a little problematic. Command line editing becomes intolerable if transmogrifier is changing what you appear to be typing as you type it. (I tried bash -i 2>&1 | tee, which worked surprisingly well, but tee just copies its input to its output; and I'm not sure why I chose tee instead of cat. An interrupt, though, killed the pipeline; just another thing you'd have to worry about.)
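A non-interactive taste of the idea, with rev standing in for the transmogrifier:

```shell
# run a command in a child bash and reverse every line of its output;
# non-interactive, so the line-editing problems described above don't bite
bash -c 'echo Hello' 2>&1 | rev
# prints: olleH
```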
Attempt at answering unedited question
For the Hello, you would be best off using:
var1="Hello"
You could also use:
var1=$(echo "Hello")
For the date, you more or less must use the latter notation:
var2=$(date)
You can now do things with those variables, such as:
$ echo "$var1" | rev
olleH
$
Note that simple prompts such as '$' (for Bash and relatives) or '%' (for C shell and relatives) or '#' (for root) are conventional.
I had decided not to mention the use of backticks, but since mgilson commented about them: they do indeed exist, and you may come across them in older scripts, but you should not write them in your own scripts.
The $(...) notation for command substitution, as it is called, is simpler in nested scenarios, such as:
cd $(dirname $(dirname $(which gcc)))/lib
compared with the backtick notation for the same operation:
cd `dirname \`dirname \\\`which gcc\\\`\``/lib
(Don't even think of trying to get that second line into MarkDown for comments! That's why I've added this to the answer.)

Linux shell bug? Variable assignment in pipe does not work

How come FILE_FOUND is 0 at the end of this bugger :
FILE_FOUND=0
touch /tmp/$$.txt
ls -1 /tmp/$$.* 2>/dev/null | while read item; do
FILE_FOUND=1
echo "FILE_FOUND = $FILE_FOUND"
done
echo "FILE_FOUND = $FILE_FOUND"
rm -f /tmp/$$.txt 2>/dev/null
??!!
On Unix FILE_FOUND stays at 1 (as it should), but on Linux (RedHat, Cygwin, ..) it jumps back to 0!!
Is it a Linux shell feature, not a bug? :)
Please help.
This is a common issue, caused by the fact that you're piping into the while, which therefore runs in a subshell and can't pass variables back to its parent shell. I'm guessing that "Unix" behaves differently because you're running a different shell there (ksh?).
Piping into the while loop may not be required. Could you use this idiom instead?
for item in /tmp/$$.*; do
....
done
If you must use a subshell, then you'll have to do something external to the processes like:
touch /tmp/file_found
This is a "feature" of the "bash" shell. The "bash" manual entry states this clearly:
Each command in a pipeline is executed as a separate process (i.e., in a subshell).
The same construct executed with the Korn shell ("ksh") runs the "while" in the same process (not in a subshell) and hence gives the expected answer. So I checked the spec.
The POSIX shell specification is not crystal clear on this, but it does not say anything about changing the "shell execution environment", so I think that the UNIX / Korn shell implementations are compliant and the Bourne Again shell implementation is not. But then, "bash" does not claim to be POSIX compliant!
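As an aside (not covered in the answers here): bash 4.2 and later offers a middle ground with `shopt -s lastpipe`, which runs the last element of a pipeline in the current shell, provided job control is off, as it is in non-interactive scripts:

```shell
#!/bin/bash
# requires bash >= 4.2; lastpipe only takes effect when job control
# is off, which is the default for non-interactive scripts
shopt -s lastpipe

FILE_FOUND=0
echo "something" | while read -r item; do
    FILE_FOUND=1
done
echo "FILE_FOUND = $FILE_FOUND"    # prints FILE_FOUND = 1
```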
It has already been mentioned, but since you're piping into the while, the entire while loop runs in a subshell. I'm not exactly sure which shell you're using on 'Unix', which I suppose means Solaris, but bash should behave consistently regardless of platform.
To solve this problem you can do a lot of things; the most common is to examine the result of the while loop somehow, like so:
result=`mycommand 2>/dev/null | while read item; do echo "FILE_FOUND"; done`
and look for data in $result. Another common approach is to have the while loop produce valid variable assignments, and eval it directly.
eval `mycommand | while read item; do echo "FILE_FOUND=1"; done`
which will be evaluated by your 'current' shell to assign the given variables.
I'm assuming you don't want to just iterate over files, in which case you should be doing
for item in /tmp/$$.*; do
# whatever you want to do
done
As others have mentioned, it's the extra shell you're creating by using the pipe notation. Try this:
while read item; do
FILE_FOUND=1
echo "FILE_FOUND = $FILE_FOUND"
done < <(ls -1 /tmp/$$.* 2>/dev/null)
In this version, the while loop runs in your script's shell, while the ls runs in a subshell (the opposite of what your script currently does).
Another way: bring the result back.
FILE_FOUND=0
result=$(echo "something" | while read item; do
FILE_FOUND=1
echo "$FILE_FOUND"
done )
echo "FILE_FOUND outside while = $result"
Ummmm... is that ls -1 (dash one), not ls -l (dash ell), in your script?
