Generating a bash script from a bash script - linux

I need to generate a script from within a script, but I'm having problems because some of the commands going into the new script are being interpreted rather than written to the new file. For example, I want to create a file called start.sh; in it, I want to set a variable to the current IP address:
echo "localip=$(ip addr | grep 'state UP' -A2 | tail -n1 | awk '{print $2}' | cut -f1 -d'/')" > /start.sh
what gets written to the file is:
localip=192.168.1.78
But what I wanted was the following text in the new file:
localip=$(ip addr | grep 'state UP' -A2 | tail -n1 | awk '{print $2}' | cut -f1 -d'/')
so that the IP is determined when the generated script is run.
What am I doing wrong?

You're making this unnecessarily hard. Use a heredoc with a quoted delimiter to pass the literal contents through without any kind of expansion:
cat >/start.sh <<'EOF'
localip=$(ip addr | grep 'state UP' -A2 | tail -n1 | awk '{print $2}' | cut -f1 -d'/')
EOF
Using <<'EOF' or <<\EOF, as opposed to just <<EOF, is essential; the latter will perform expansion just as your original code does.
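A quick side-by-side illustration of the difference (hostname is just a stand-in command, not part of the question):
# Unquoted delimiter: the command substitution runs while the file is written
cat >expanded.sh <<EOF
host=$(hostname)
EOF
# expanded.sh now contains something like: host=mymachine

# Quoted delimiter: the line is copied literally
cat >literal.sh <<'EOF'
host=$(hostname)
EOF
# literal.sh contains exactly: host=$(hostname)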
If anything you're writing to start.sh needs to be based on current variables, by the way, be sure to use printf %q to safely escape their contents. For instance, to set your current $1, $2, etc. to be active during start.sh execution:
# open start.sh for output on FD 3
exec 3>/start.sh
# build a shell-escaped version of your argument list
printf -v argv_str '%q ' "$@"
# add to the file we previously opened a command to set the current arguments to that list
printf 'set -- %s\n' "$argv_str" >&3
# pass another variable through safely, just to be sure we demonstrate how:
printf 'foo=%q\n' "$foo" >&3
# ...go ahead and add your other contents...
cat >&3 <<'EOF'
# ...put constant parts of start.sh here, which can use $1, $2, etc.
EOF
# close the file
exec 3>&-
This is far more efficient than using >>/start.sh on every line that needs to append: Using exec 3>file and then >&3 only opens the file once, rather than opening it once per command that generates output.
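As a rough sketch of the result: if the generating script were invoked as ./generate.sh 'hello world' foo (a hypothetical name and argument list, purely for illustration) with foo='bar baz' set beforehand, the generated /start.sh would begin with something like:
set -- hello\ world foo
foo=bar\ baz
# ...followed by the constant parts from the heredoc...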

Related

.sh script doesn't return variable correctly

I have created a file called alias.sh in which I have put this code:
#!/bin/bash
OUTPUT="$(alias | awk -F'[ =]' '{print $2}')"
echo "${OUTPUT}"
Whenever I run the command alias | awk -F'[ =]' '{print $2}' in a terminal, it correctly returns a list of set aliases in my preferred format.
However, when I execute the script like $ ./alias.sh, it simply returns an empty line.
The script works if I replace the alias | awk -F'[ =]' '{print $2}' command with the ls command. It even preserves line breaks.
Can anybody help me understand why the script does not return the expected result?
The actual reason for the error is that aliases are not expanded when the shell is not interactive.
From the man bash page:
[..] Aliases are not expanded when the shell is not interactive, unless the expand_aliases shell option is set using shopt [..]
Add the line below to the top of the script to force the expansion manually:
shopt -s expand_aliases
and then source the script rather than executing it. The script becomes:
#!/bin/bash
shopt -s expand_aliases
output=$(alias | awk -F'[ =]' '{print $2}')
echo "$output"
and now source the script as
. ./myScript.sh
You use $2 in the AWK script. However, that is replaced by the shell as the second argument to the shell script, which is nothing. You need to escape the dollar sign, as in \$2, or not use double quotes (") around the sub-shell.

Why is my shell command working at the prompt, but not as a bash script?

New to bash scripting, though I'm getting fairly familiar with shell scripting. I wrote this text-transform script for a client's feed. It extracts the URLs I want, and the titles of the articles. Awesome.
echo $(var=$(curl -L website.com/news)) |
grep -Po '<h3 class="article-link"><a href="\K[^<]+' <<< $var |
result=$(sed 's/"/\n/g' | sed 's/ \//\n\//g' | sed 's/>//g') ; let this=0 ; echo "$result" | while read line ; do if ((this % 2 == 0 )) ; then echo website.com/news$line ; else echo $line ; fi ; let this+=1 ; done
When I try to extract it to a file and run it with bash OR sh myThing.sh, it doesn't work at all. The only thing that echoes is 'website.com/news', and when I try to echo $this, all I get is 1. What am I doing wrong?
#!/bin/bash
echo $(var=$(curl -L website.com/news)) |
grep -Po '<h3 class="article-link"><a href="\K[^<]+' <<< $var |
result=$(sed 's/"/\n/g' | sed 's/ \//\n\//g' | sed 's/>//g')
let this=0
echo "$result" | while read line
do
if ((this % 2 == 0 ))
then
echo website.com/news$line
else
echo $line
fi
let this+=1
done
edit:
#!/bin/bash
var=$(curl -L linux.com/news)
select=$(grep -Po '<h3 class="article-list__title"><a href="\K[^<]+' <<< $var)
result=$(sed 's/"/\n/g' | sed 's/ \//\n\//g' | sed 's/>//g')
let this=0
echo "$result" | while read line
do
if ((this % 2 == 0 ))
then
echo website.com/news$line
else
echo $line
fi
let this+=1
done
This answer solves the OP's specific problem, but to address the question "Why is my shell command working at the prompt, but not as a bash script?" generally, Etan Reisner provides an excellent answer in the comments:
"You are either not running that exact command or it "works" because you have shell state that is affecting things in ways you take to be "working" and your script doesn't have that state. Try launching an entirely new shell session and see if that command, on its own, works for you there."
echo $(var=...) will assign a value to variable $var, but will not output anything, so the echo command will simply print a newline.
Furthermore, because the assignment to $var happens inside $(...) (a command substitution), it is confined to the subshell that the command inside the substitution ran in, so $var will not be defined in the calling shell.
(A subshell is a child process that contains a duplicate of the current shell's environment, without being able to modify the current shell's environment).
More generally, you cannot meaningfully define variables inside a pipeline - they will neither be visible to other pipeline segments, nor after the pipeline finishes.[1]
The only reason your [original] command could ever have worked is if $var had a preexisting value in your shell.
In fact, given that you provide input to grep via a here-string (<<<), the first segment of your pipeline (echo ...) is entirely ignored.
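A toy demonstration of both points (the variable name is illustrative, not from the OP's script):
var=''
echo "[$(var=hello)]"   # prints [] - the assignment produces no output
echo "[$var]"           # prints [] - the assignment only happened in a subshell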
To pass the output of curl through the pipeline to grep and then to sed, no intermediate variables are needed at all.
Furthermore, your sed command is lacking input: you probably meant to feed it $var in your first attempt, and $select in the 2nd (your 2nd attempt came close to a correct solution).
What you were probably ultimately looking for:
result=$(curl -L website.com/news |
grep -Po '<h3 class="article-link"><a href="\K[^<]+' |
sed 's/"/\n/g' | sed 's/ \//\n\//g' | sed 's/>//g')
# ... processing of "$result"
Some additional notes:
You could combine the 3 sed calls into a single one.
You could feed the pipeline output directly into your while loop, without the need for intermediate variable $result (see the combined sketch after these notes).
You should generally double-quote variable references (e.g., use "$line" instead of $line) to protect them from interpretation by the shell (word-splitting, globbing).
let this+=1 is better expressed as (( ++this )) in modern Bash.
This answer of mine contains links to resources for learning about bash.
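Putting those notes together, a possible (untested) sketch that keeps the OP's website.com/news placeholder, combines the three sed calls into one, and feeds the while loop directly from the pipeline:
curl -L website.com/news |
  grep -Po '<h3 class="article-link"><a href="\K[^<]+' |
  sed 's/"/\n/g; s/ \//\n\//g; s/>//g' |
  while IFS= read -r line; do
    if (( this % 2 == 0 )); then
      echo "website.com/news$line"
    else
      echo "$line"
    fi
    (( ++this ))
  done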
[1] All commands involved in a pipeline by default run in a subshell in bash, so they all see copies of the parent shell's variables. Bash 4.2+ offers the lastpipe option (off by default) to allow you to create variables in the current shell instead of in a subshell, by running the last pipeline segment (only) in the current shell instead of in a subshell, to facilitate scenarios such as ... | while read -r line ... and have $line continue to exist after the pipeline finishes.
Note that this still doesn't enable defining a variable in an earlier pipeline segment in the hopes that a later segment will see it - this can never work, because the commands that make up a pipeline are launched at the same time, and it is only through coordination of the input and output streams that effective left-to-right processing happens.
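A small sketch of lastpipe in action (Bash 4.2+; it only takes effect when job control is off, which is the default in a non-interactive script):
#!/bin/bash
shopt -s lastpipe
printf '%s\n' one two three | while read -r line; do last=$line; done
echo "$last"    # prints: three (without lastpipe, $last would be empty here)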
This line is totally wrong. You are attempting to pass the standard output of each process through pipes, when none of them ever prints anything except standard error.
echo $(var=$(curl -L website.com/news)) | grep -Po '<h3 class="article-link"><a href="\K[^<]+' <<< $var | result=$(sed 's/"/\n/g' | sed 's/ \//\n\//g' | sed 's/>//g')
I'll break down what I believe you are attempting to do.
echo $(var=$(curl -L website.com/news))
The above code will only print the standard error, which is a separate stream from standard output. The standard output is assigned to $var. However, you are attempting to pass the standard output to the next process, and at this point it is nothing but a newline.
grep -Po '<h3 class="article-link"><a href="\K[^<]+' <<< $var
The here-string <<< takes precedence over the pipe. But the variable $var is lost, as it was defined inside a sub-shell and not in the parent shell. Thanks to @mklement0.
The proper way to accomplish all this is to not use $var at all. All you wanted was the value stored in $result.
result=$(curl -L website.com/news | grep -Po '<h3 class="article-link"><a href="\K[^<]+'| sed 's/"/\n/g' | sed 's/ \//\n\//g' | sed 's/>//g')
I don't intend to optimize your script; this is more of a suggested solution. Your broader question, "Why is my shell command working at the prompt, but not as a bash script?", is answered more comprehensively by mklement0 here.

Bash tries to execute commands in heredoc

I am trying to write a simple bash script that will print multiline output to another file. I am doing it with a heredoc:
#!/bin/sh
echo "Hello!"
cat <<EOF > ~/Desktop/what.txt
a=`echo $1 | awk -F. '{print $NF}'`
b=`echo $2 | tr '[:upper:]' '[:lower:]'`
EOF
I was expecting to see a file in my desktop with these contents:
a=`echo $1 | awk -F. '{print $NF}'`
b=`echo $2 | tr '[:upper:]' '[:lower:]'`
But instead, I am seeing these as the contents of my what.txt file:
a=
b=
Somehow, even though it is part of a heredoc, bash is trying to execute it line by line. How do I prevent this, and print the contents to the file as it is?
Quote EOF so that bash takes the input literally:
cat <<'EOF' > what.txt
a=`echo $1 | awk -F. '{print $NF}'`
b=`echo $2 | tr '[:upper:]' '[:lower:]'`
EOF
Also, start using $(...) for command substitution instead of the old and problematic backticks.
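For example, the same two assignments from the heredoc written with $(...) instead of backticks:
a=$(echo "$1" | awk -F. '{print $NF}')
b=$(echo "$2" | tr '[:upper:]' '[:lower:]')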

Issues passing AWK output to BASH Variable

I'm trying to parse lines from an error log in BASH and send a certain part out to a BASH variable to be used later in the script, but I'm having issues once I try to pass it to a BASH variable.
What the log file looks like:
1446851818|1446851808.1795|12|NONE|DID|8001234
I need the number in the third field of the line (in this case, the number is 12).
Here's an example of the command I'm running:
tail -n5 /var/log/asterisk/queue_log | grep 'CONNECT' | awk -F '[|]' '{print $3}'
The line of code is trying to accomplish this:
Grab the last lines of the log file
Search for a phrase (in this case CONNECT; I'm using the same command to trigger different items)
Separate the number in the third set of the line out so it can be used elsewhere
If I run the above full command, it runs successfully like so:
tail -n5 /var/log/asterisk/queue_log | grep 'CONNECT' | awk -F '[|]' '{print $3}'
12
Now, if I try to assign it to a variable in the same line/command, I'm unable to have it echo the variable back.
My command when assigning to a variable looks like:
tail -n5 /var/log/asterisk/queue_log | grep 'CONNECT' | brand=$(awk -F '[|]' '{print $3}')
It is being run in the same script as the echo command, so the variable should be fine. The test script looks like:
#!/bin/bash
tail -n5 /var/log/asterisk/queue_log | grep 'CONNECT' | brand=$(awk -F '[|]' '{print $3}')
echo "$brand";
I'm aware this is most likely not the most efficient/elegant way to do this, so if there are other ideas/ways to accomplish it I'm open to them as well (my BASH skills are basic but improving).
You need to capture the output of the entire pipeline, not just the final section of it:
brand=$(tail -n5 /var/log/asterisk/queue_log | grep 'CONNECT' | awk -F '|' '{print $3}')
You may also want to consider what will happen if there is more than one line containing CONNECT in the final five lines of the file (or indeed, if there are none). That's going to cause brand to have multiple (or no) values.
If your intent is to get the third field from the latest line in the file containing CONNECT, awk can pretty much handle the entire thing without needing tail or grep:
brand=$(awk -F '|' '/CONNECT/ {latest = $3} END {print latest}' /var/log/asterisk/queue_log)

passing grep into a variable in bash

I have a file named email.txt like this one:
Subject:My test
From:my email <myemail#gmail.com>
this is third test
I want to take out only the email address in this file using a bash script. So I put this code in my bash script, named myscript:
#!/bin/bash
file=$(myscript)
var1=$(awk 'NR==2' $file)
var2=$("$var1" | (grep -Eio '\b[A-Z0-9._%+-]+#[A-Z0-9.-]+\.[A-Z]{2,4}\b'))
echo $var2
But I failed to run this script. When I run this command manually in bash I can obtain the email address:
echo $var1 | grep -Eio '\b[A-Z0-9._%+-]+#[A-Z0-9.-]+\.[A-Z]{2,4}\b'
I need to store the email address in a variable so I can use it in another function. Can someone show me how to solve this problem?
Thanks.
I think this is an overly complicated way to go about things, but if you just want to get your script to work, try this:
#!/bin/bash
file="email.txt"
var1=$(awk 'NR==2' $file)
var2=$(echo "$var1" | grep -Eio '\b[A-Z0-9._%+-]+#[A-Z0-9.-]+\.[A-Z]{2,4}\b')
echo $var2
I'm not sure what file=$(myscript) was supposed to do, but on the next line you want a file name as argument to awk, so you should just assign email.txt as a string value to file, not execute a command called myscript. $var1 isn't a command (it's just a line from your text file), so you have to echo it to give grep anything useful to work with. The additional parentheses around grep are redundant.
What is happening is this:
var2=$("$var1" | (grep -Eio '\b[A-Z0-9._%+-]+#[A-Z0-9.-]+\.[A-Z]{2,4}\b'))
^^^^^^^ Execute the program named (what is in variable var1).
You need to do something like this:
var2=$(echo "$var1" | grep -Eio '\b[A-Z0-9._%+-]+#[A-Z0-9.-]+\.[A-Z]{2,4}\b')
or even
var2=$(awk 'NR==2' $file | grep -Eio '\b[A-Z0-9._%+-]+#[A-Z0-9.-]+\.[A-Z]{2,4}\b')
There are very helpful flags for bash: -xv
The line with
var2=$("$var1" | (grep...
should be
var2=$(echo "$var1" | (grep...
Also, my version of grep doesn't have the -o flag.
And, since grep patterns are "greedy", even when the following code runs, its output is not exactly what you want.
#!/bin/bash -xv
file=test.txt
var1=$(awk 'NR==2' $file)
var2=$(echo "$var1" | (grep -Ei '\b[A-Z0-9._%+-]+#[A-Z0-9.-]+.[A-Z]{2,4}\b'))
echo $var2
Use Bash parameter expansion:
var2="${var1#*:}"
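Given the sample second line of email.txt, that expansion strips everything up to and including the first colon:
var1='From:my email <myemail#gmail.com>'
var2="${var1#*:}"
echo "$var2"    # my email <myemail#gmail.com>
Note that this keeps everything after the colon, display name and angle brackets included, rather than isolating the bare address.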
There's a cruder way:
cat $file | grep '#' | tr '<>' '\012\012' | grep '#'
That is, extract the line(s) with # signs, turn the angle brackets into newlines, then grep again for anything left with a # sign.
Refine as needed...
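Applied to the sample email.txt from the question, the pipeline ends up printing just the address-like token:
$ cat email.txt | grep '#' | tr '<>' '\012\012' | grep '#'
myemail#gmail.com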
