How to append several lines of text to a file using a shell script - Linux

I want to write several lines (5 or more) to a file I'm going to create in a script. I can do this with echo >> filename, but I'd like to know the best way to do it.

You can use a here document:
cat <<EOF >> outputfile
some lines
of text
EOF
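Note that the shell expands variables and command substitutions inside an unquoted here document. If you want the lines written literally, quote the delimiter; a minimal sketch (outputfile is just a placeholder name):
cat <<'EOF' >> outputfile
these $variables and $(commands) are written literally
EOF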

I usually use the here document suggested above. An alternative is:
(echo first line; echo second line) >> outputfile
This should have comparable performance in bash: (...) starts a subshell, but echo is a builtin, so bash does not run /bin/echo; it does the echo itself.
It might even be faster because it involves no exec().
This style is even more useful if you want to use output from another command somewhere in the text.
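For example, a small sketch that mixes fixed text with the output of another command (the date command is only an illustration):
(echo "Log written on:"; date; echo "-- end of header --") >> outputfile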

Related

How to pre-specify a selection when executing a program on Linux [duplicate]

I have a bash script that employs the read command to read arguments to commands interactively, for example yes/no options. Is there a way to call this script in a non-interactive script passing default option values as arguments?
It's not just one option that I have to pass to the interactive script.
There are many ways.
Pipe your input:
echo "yes
no
maybe" | your_program
Redirect from a file:
your_program < answers.txt
Use a here document (this can be very readable):
your_program << ANSWERS
yes
no
maybe
ANSWERS
Use a here string:
your_program <<< $'yes\nno\nmaybe\n'
For more complex tasks there is expect (http://en.wikipedia.org/wiki/Expect).
It basically simulates a user: you can script how to react to specific program output and related events.
This also works in cases like ssh, which refuses to read passwords from a pipe.
You can put the data in a file and redirect it like this:
$ cat file.sh
#!/bin/bash
read x
read y
echo $x
echo $y
Data for the script:
$ cat data.txt
2
3
Executing the script:
$ ./file.sh < data.txt
2
3
Just want to add one more way; I found it elsewhere, and it is quite simple.
Say I want to answer yes to all the prompts of a command execute_command; then I would simply pipe yes to it.
yes | execute_command
This will use yes as the answer to all yes/no prompts.
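yes can also repeat a different answer; for example, to answer n to every prompt (execute_command is still just a placeholder):
yes n | execute_command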
You can also use printf to pipe the input to your script.
var=val
printf "yes\nno\nmaybe\n$var\n" | ./your_script.sh

File redirection fails in Bash script, but not Bash terminal

I am having a problem where cmd1 works, but not cmd2 in my Bash script ending in .sh. I have made the Bash script executable.
Additionally, I can execute cmd2 just fine from my Bash terminal. I have tried to make a minimally reproducible example, but my larger goal is to run a complicated executable with command line arguments and pass output to a file that may or may not exist (rather than displaying the output in the terminal).
Replacing > with >> also gives the same error in the script, but not the terminal.
My Bash script:
#!/bin/bash
cmd1="cat test.txt"
cmd2="cat test.txt > a"
echo $cmd1
$cmd1
echo $cmd2
$cmd2
test.txt has the words "dog" and "cat" on two separate lines without quotes.
Short answer: see BashFAQ #50: I'm trying to put a command in a variable, but the complex cases always fail!
Long answer: the shell expands variable references (like $cmd1) toward the end of the process of parsing a command line, after it's done parsing redirects (like > a is supposed to be) and quotes and escapes and... In fact, the only thing it does with the expanded value is word splitting (e.g. treating cat test.txt > a as "cat" followed by "test.txt", ">", and finally "a", rather than a single string) and wildcard expansion (e.g. if $cmd expanded to cat *.txt, it'd replace the *.txt part with a list of matching files). (And it skips word splitting and wildcard expansion if the variable is in double-quotes.)
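To see this in action, a quick sketch using the variable from the question (the exact error text may vary with your cat version):
cmd2="cat test.txt > a"
$cmd2
# cat receives three arguments: test.txt, > and a; it prints test.txt,
# then complains that files named '>' and 'a' do not exist. No
# redirection ever happens.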
Partly as a result of this, the best way to store commands in variables is: don't. That's not what they're for; variables are for data, not commands. What you should do instead, though, depends on why you were storing the command in a variable.
If there's no real reason to store the command in a variable, then just use the command directly. For conditional redirects, just use a standard if statement:
if [ -f a ]; then
    cat test.txt > a
else
    cat test.txt
fi
If you need to define the command at one point and use it later, or want to use the same command over and over without writing it out in full each time, use a function:
cmd2() {
    cat test.txt > a
}
cmd2
It sounds like you may need to define the command differently depending on some condition; you can do that with a function as well:
if [ -f a ]; then
    cmd() {
        cat test.txt > a
    }
else
    cmd() {
        cat test.txt
    }
fi
cmd
Alternately, you can wrap the command (without redirect) in a function, then use a conditional to control whether it redirects:
cmd() {
    cat test.txt
}
if [ -f a ]; then
    cmd > a
else
    cmd
fi
It's also possible to wrap a conditional redirect into a function itself, then pipe output to it:
maybe_redirect_to() {
    if [ -f "$1" ]; then
        cat > "$1"
    else
        cat
    fi
}
cat test.txt | maybe_redirect_to a
(This creates an extra cat process that isn't really doing anything useful, but if it makes the script cleaner, I'd consider that worth it. In this particular case, you could minimize the stray cats by using maybe_redirect_to a < test.txt.)
As a last resort, you can store the command string in a variable, and use eval to parse it. eval basically re-runs the shell parsing process from the beginning, meaning that it'll recognize things like redirects in the string. But eval has a well-deserved reputation as a bug magnet, because it's easy for it to treat parts of the string you thought were just data as command syntax, which can cause some really weird (& dangerous) bugs.
If you must use eval, at least double-quote the variable reference, so it runs through the parsing process just once, rather than sort-of-once-and-a-half as it would unquoted. Here's an example of what I mean:
cmd3="echo '5 * 3 = 15'"
eval "$cmd3"
# prints: 5 * 3 = 15
eval $cmd3
# prints: 5 [list of files in the current directory] 3 = 15
# ...unless there are any files with shell metacharacters in their names, in
# which case something more complicated might happen.
BashFAQ #50 discusses some other possible reasons and solutions. Note that the array approach will not work here, since arrays also get expanded after redirects are parsed.
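For illustration, a sketch of why the array form still fails here (same test.txt as above):
cmd=( cat test.txt ">" a )
"${cmd[@]}"
# expands to the four words cat, test.txt, > and a, but only after
# redirection parsing is already done, so '>' and 'a' are again passed
# to cat as literal file names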
If you pop an 'eval' in front of $cmd2 it should work as expected:
#!/bin/bash
cmd2="cat test.txt > a"
eval "$cmd2"
If you're not sure what a script is doing, you can always run it in debug mode to see if you can pin down the error:
bash -x scriptname
This runs the script and prints each command after it has been expanded, just before it is executed. Hopefully this will reveal any issues with the syntax.
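For example, running the script from the question under bash -x might produce a trace along these lines (script.sh stands for your script; exact messages vary by version):
$ bash -x script.sh
+ cmd1='cat test.txt'
+ cmd2='cat test.txt > a'
+ echo cat test.txt
cat test.txt
+ cat test.txt
dog
cat
+ echo cat test.txt '>' a
cat test.txt > a
+ cat test.txt '>' a
dog
cat
cat: '>': No such file or directory
cat: a: No such file or directory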

Writing variables to file with bash

I'm trying to configure a file with a bash script, but the variables in the bash script are not written to the file the way they are written in the script.
Ex:
#!/bin/bash
printf "%s" "template("$DATE\t$HOST\t$PRIORITY\t$MSG\n")" >> /file.txt
exit 0
This results in template('tttn') in the file instead of template("$DATE\t$HOST\t$PRIORITY\t$MSG\n").
How do I write the script so that the configured file ends up containing template("$DATE\t$HOST\t$PRIORITY\t$MSG\n")?
Is it possible to write a variable to the file exactly as it looks in the script?
Enclose the strings you want to write within single quotes to avoid variable replacement.
> FOO=bar
> echo "$FOO"
bar
> echo '$FOO'
$FOO
>
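Applied to the line from the question, a minimal sketch (keeping the /file.txt path from the question):
printf '%s\n' 'template("$DATE\t$HOST\t$PRIORITY\t$MSG\n")' >> /file.txt
# the single quotes keep $DATE, \t and the rest exactly as written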
Using printf in a shell script is uncommon; just use echo with the -e option.
It lets you use ANSI C escape sequences like \t or \n. The trailing \n isn't necessary, however, as echo adds one itself.
echo -e "template(${DATE}\t${HOST}\t${PRIORITY}\t${MSG})" >> file.txt
The problem with what you've written is that escape sequences like \t are only interpreted in the first (format) parameter to printf.
So it would have to be something like:
printf 'template(%s\t%s\t%s\t%s)\n' ${DATE} ${HOST} ${PRIORITY} ${MSG} >> file.txt
But I hope we both agree that this is rather hard on the eyes.
There are several escaping issues, and the power of printf has not been used; try:
printf 'template(%s\t%s\t%s\t%s)\n' "${DATE}" "${HOST}" "${PRIORITY}" "${MSG}" >> file.txt
Reasons for this separate answer:
The accepted answer does not fit the title of the question (see comment).
The post with the otherwise right answer contains wrong claims about echo vs printf (as of this writing) and is not robust against whitespace in the values (see the sketch below).
The edit queue is full at the moment.
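For example, a quick sketch of the whitespace problem with the unquoted version (the sample value is made up; DATE, HOST and PRIORITY are assumed to be set):
MSG='disk is full'
printf 'template(%s\t%s\t%s\t%s)\n' ${DATE} ${HOST} ${PRIORITY} ${MSG}
# unquoted, MSG is split into three words, the format string is reused,
# and a second garbled template(...) line is printed; quoting the
# variables as in the command above avoids this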

Do a complete flow of work in a bash script

I'm trying to automate a process that I have to do over and over again, in which I have to parse the output of a shell function, look for 5 different things, and then put them in a file.
I know I can match patterns with grep; however, I don't know how to store the result in a variable so I can use it afterwards.
I also have to parse this very same output to get the other 5 values.
I have no idea how to use the same output for the 5 greps I need to do and then store the results in 5 different variables for later use.
I know I have to create a nice and tidy .sh, but I don't know how to do it.
Currently I'm trying this:
#!/bin/bash
data=$(cat file)
lol=$(echo data|grep red)
echo $lol
It's not working; any ideas?
You should show some examples of what you want to do next time.
Assuming your shell function is called func1:
func1() {
    echo "things i want to get are here"
}
func1 | grep -E "things|want|are|here|get" > outputfile.txt
Update:
Your code
#!/bin/bash
data=$(cat file)
lol=$(echo data|grep red)
echo $lol
practically just means this:
lol=$(grep "red" file)
or
lol=$(awk '/red/' file)
Also, if you are using bash, this is one way you can do it:
while read -r myline
do
    case "$myline" in
        *"red"*) echo "$myline" >> output.txt ;;
    esac
done < file
You can use the following syntax:
VAR=$(grep foo bar)
or alternatively:
VAR=`grep foo bar`
The easiest thing to do would be to redirect the output of the function to a file. You can then run multiple greps on it and only delete the file once you are done with it.
To save the output, you want to use command substitution. This runs a command and substitutes its output into the command line. Combined with variable assignment you get:
variable=$(grep expression file)
The line that sets lol is wrong: it echoes the literal word data instead of the variable's value. Change it to this:
lol=$(echo "$data"|grep red)
Use grep -E (or the older egrep) instead of plain grep to match several patterns at once:
variable=$(grep -E "exp1|exp2|exp3|exp4|exp5" file)
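Putting the pieces together, a rough sketch of the whole flow (my_function and the patterns red, green, blue are placeholders for your own):
#!/bin/bash
output=$(my_function)                 # capture the function's output once
red=$(grep red <<< "$output")
green=$(grep green <<< "$output")
blue=$(grep blue <<< "$output")
# ...two more greps for the remaining values...
printf '%s\n' "$red" "$green" "$blue" > outputfile.txt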
