Extracting a string in csh - linux

Would you please explain why the following shell command wouldn't work:
sh-3.1$ echo $MYPATH
/opt/Application/DATA/CROM/my_application
sh-3.1$ awk '{print substr($MYPATH,3)}'
Thanks
Best Regards

MYPATH is not going to be substituted by the shell since the string uses single quotes. Consider the following:
csh$ echo '{print substr($USER,3)}'
{print substr($USER,3)}
csh$ echo "{print substr($USER,3)}"
{print substr(dshawley,3)}
The usage of single quotes instructs the shell to pass the string argument to the program as-is. Double quotes tell the shell to perform variable expansion on the argument before passing it to the program. This is a basic shell feature that is common amongst shells and some programming languages (e.g., perl).
The next problem that you are going to run into is that awk will want quotes around the first parameter to substr or the parse will fail. You will probably see an "Illegal variable name" warning in this case. This is where I get lost with csh since I have no clue how to properly escape a double-quote within a quoted string. In bash/sh/ksh, you would do the following:
sh$ awk "{print substr(\"$USER\",3)}"
input
^D
hawley
sh$
Just in case you do not already know this, awk will require an input stream before it is going to do anything. I had to type "input" and the EOF character for the little example.
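Applied to the original question (still in sh/bash rather than csh, and assuming MYPATH is set as shown at the top), the same escaping technique looks like this; the echo simply supplies the single input line awk needs before it prints anything:
sh$ echo "" | awk "{print substr(\"$MYPATH\",3)}"
pt/Application/DATA/CROM/my_application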

Quoting and escaping
"string" is a weak quote. Enclosed whitespace and wildcards are taken as literals, but variable and command substitutions are still performed.
'string' is a strong quote. The entire enclosed string is taken as a literal.
You can use the -v option to pass a variable to awk:
awk -v mypath="$MYPATH" 'BEGIN{print substr(mypath, 3)}'
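For the path in the question, a minimal session sketch (assuming MYPATH is still set as shown at the top), capturing the result in a shell variable via command substitution:
sh$ trimmed=$(awk -v mypath="$MYPATH" 'BEGIN{print substr(mypath, 3)}')
sh$ echo "$trimmed"
pt/Application/DATA/CROM/my_application
Because all the work happens in a BEGIN block, awk exits without waiting for any input.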

Related

bash echo environment variable containing escaped characters

I have a script that echoes the input given to it into a file, as follows:
echo $@ > file.txt
When I pass a string like "\"" I want it to print exactly "\"" to the file; however, it prints ".
My question is: how can I print all characters of a variable containing a string, without considering escapes?
When I use echo in bash like echo "\"" it only prints ", while when I use echo '"\""' it prints it correctly. I thought maybe the solution would be to use single quotes around the variable; however, I cannot get the value of a variable inside single quotes.
First, note that
echo $@ > file.txt
can fail in several ways. Shellcheck identifies one problem (missing quotes on $@). See the accepted, and excellent, answer to Why is printf better than echo? for others.
Second, as others have pointed out, there is no practical way for a Bash program to know exactly how parameters were specified on the command line. For instance, for all of these invocations
prog \"
prog "\""
prog '"'
the code in prog will see a $1 value that consists of one double-quote character. Any quoting characters that are used in the invocation of prog are removed by the quote removal part of the shell expansions done by the parent shell process.
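A quick way to confirm this (the prog script here is a hypothetical one-line inspector, created just for the test):
cat > prog <<'EOF'
#!/bin/bash
printf '%s\n' "$1"
EOF
chmod +x prog
./prog \"     # prints: "
./prog "\""   # prints: "
./prog '"'    # prints: "
All three invocations print the same single double-quote character.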
Normally that doesn't matter. If variables or parameters contain values that would need to be quoted when entered as literals (e.g. "\"") they can be used safely, including when passing them as parameters to other programs, by quoting uses of the variable or parameter (e.g. "$1", "$@", "$x").
There is a problem with variables or parameters that require quoting when entered literally if you need to write them in a way that they can be reused as shell input (e.g. by using eval or source/.). Bash supports the %q format specification to the printf builtin to handle this situation. It's not clear what the OP is trying to do, but one possible solution to the question is:
if (( $# > 0 )) ; then
    printf -v quoted_params '%q ' "$@"   # Add all parameters to 'quoted_params'
    printf '%s\n' "${quoted_params% }"   # Remove trailing space when printing
fi >file.txt
That creates an empty 'file.txt' when no positional parameters are provided. The code would need to be changed if that is not what is required.
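To see why the %q form matters, here is a hypothetical round trip, assuming the snippet above is saved in a script called saveargs (the name is made up; exact %q output can vary slightly between Bash versions):
./saveargs \" 'a b'
cat file.txt                    # something like: \" a\ b
eval "set -- $(cat file.txt)"   # the saved form can be re-read as shell input
echo $#                         # 2: both original parameters are back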
If you run echo \", the function of the backslash in bash is to escape the character after it. This actually enables you to use the double quote as an argument. You cannot use a backslash by itself; if you want to pass a backslash as an argument you need another backslash to escape it: echo \\
Now if you want to create a string where these things are not escaped, use single quotes: echo '\'
For a better explanation, see this post: Difference between single and double quotes in Bash

Double/single quote syntax

When querying a list, putting the value in a variable, and trying to use the variable in another script, it doesn't get the format needed.
script 1:
cilist=$(opr-ci-list.sh -view_name TN_UD_REFRESH_MRE | sed -e '/^[TL-]/d' -e '/^\s*$/d' -e 's/^....//' | awk -vORS=, '{ print $1 }' | sed 's/,$/\n/')
The output of this script is a comma-separated string of IDs, like: 7c553435c1376c8f5f020fcee0b8ef51,7d427dd75235bf513286d3210e1bd787
echo $cilist
7c553435c1376c8f5f020fcee0b8ef51,7d427dd75235bf513286d3210e1bd787
=> no quotes to be seen when doing an echo
script 2:
opr-downtime.sh -cis "\"$cilist\""
I receive an error because there are single quotes surrounding the variable:
-cis '"7c553435c1376c8f5f020fcee0b8ef51,7d427dd75235bf513286d3210e1bd787 "'
I tried several syntax variations but keep getting the wrong input for the second script: either I have no quotes at all, or quotes like '" in front and behind.
Any help or feedback on the correct syntax would be appreciated.
The shell treats the quote characters as special characters. For double quote ("), it treats the enclosed data as a single argument to the command. This would be useful if the input had a space (or other shell separator token) within it. However, when the argument is provided to the command, the quote is removed.
You can try using a backslash (\) to escape the double quote. But you may still want to enclose everything in double quotes in case $cilist contains input that requires quoting.
script.sh -cis "\"$cilist\""
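One way to check what the second script actually receives is a small inspection stub (args.sh here is purely hypothetical):
cat > args.sh <<'EOF'
#!/bin/bash
printf '<%s>\n' "$@"
EOF
chmod +x args.sh
cilist=7c553435c1376c8f5f020fcee0b8ef51,7d427dd75235bf513286d3210e1bd787
./args.sh -cis "\"$cilist\""
# <-cis>
# <"7c553435c1376c8f5f020fcee0b8ef51,7d427dd75235bf513286d3210e1bd787">
The embedded double quotes survive because they were escaped; the outer pair is removed by the shell as usual.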

How to increment a shell variable in an awk action

My shell script is something like this:
#!/bin/bash
global_var=0
func() {
    awk '$1 ~/^pattern/ {global_var=$((global_var+1))}' $1
}
func input_file_name
I want to increment the global (shell) variable global_var inside the awk action. How to do so? Normal shell style incrementing does not seem to be working.
Try this:
func() {
    awk '$1~/^pattern/ {++awk_var} END {print awk_var+0}' "$1"
}
shell_var=$(func input_file_name)
The shell and awk are separate worlds, and you should treat them as such (*) (which, in effect, you're already doing by enclosing your awk program in single quotes, which prevents the shell from expanding any shell variable references in your awk program string).
Thus, use an awk-internal variable to perform your counting (awk_var) and output it after having finished processing the input file (in the END block, using print to send the awk variable to stdout; the +0 part ensures that the output defaults to 0 in case no match was found).
Note that, generally, awk variables need no explicit initialization, because they default to 0 in numeric and Boolean contexts, and to "" (the empty string) in string contexts.
Also note that awk has its own syntax, and shell constructs such as $((...)) for arithmetic expansion do not apply. Generally, awk variables are referred to just by name (no $ prefix), and arithmetic operations such as ++ can be applied directly.
Using command substitution - $(...) - in the shell then allows you to capture output from the awk command.
In your specific case you have no need to pass variable values into the awk program, but if you needed to do that, you'd use one or more instances of awk's -v option; e.g.: awk -v awk_var="$shell_var" ...
On the shell (bash) side, if you wanted to add awk's output to the shell variable instead of just assigning it:
declare -i shell_var # make sure variable is an integer
shell_var+=$(func input_file_name) # add function's output to existing value
(*) The shell and awk have completely separate namespaces that have no direct way of interacting with one another: awk has no concept of shell variables, and the shell has no concept of awk variables.
It is technically feasible, but ill-advised to integrate shell variable VALUES into an awk program - by using a double-quoted string to represent the awk program in which you reference shell variable VALUES, which are then expanded by the shell ONCE, BEFORE the string gets passed as a program to awk.
What you CANNOT do is to modify a shell variable from inside an awk program.
Since it gets complicated quickly as to which parts of the awk program are interpreted by the shell up front vs. which parts are interpreted by awk later (where $ has special meaning too, for instance), the best approach is to:
use a single-quoted string to represent the awk program, so as to protect it from interpretation by the shell
if values need to be passed in, use instances of the -v option
if something needs to be passed out, print to stdout from awk and use command substitution or redirection to capture it via the shell.
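Putting those points together, a minimal end-to-end sketch (the function name, prefix value, and file name are all hypothetical):
count_matches() {
    # count input lines whose first field starts with the given prefix
    awk -v prefix="$2" '$1 ~ "^"prefix {++n} END {print n+0}' "$1"
}
matches=$(count_matches input_file_name pattern)
echo "found $matches matching lines"
Because the prefix is matched with a dynamic regex ("^"prefix), it is interpreted as a regular expression rather than a literal string.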

BASH: A variable inside a defined variable?

I have the following function in a bash script which does not work:
do_get() {
    cmd='<command version="33" cmd="GETINFO" $3</command>'
    echo $cmd
}
Now, if I echo $3 right before the cmd variable, it echoes 1234, which I am passing as the third argument when executing this. But it shows just $3 when I echo $cmd.
I tried a couple of things like the ones below, thinking it was getting stripped out:
'$3' but then it shows blank
'"$3"' same as above
The variable doesn't expand when inside single quotes. You need to use double quotes instead, but since you have double quotes on the inside, you need to make sure you remember to escape those as well.
do_get() {
    cmd="<command version=\"33\" cmd=\"GETINFO\" $3</command>"
    echo "$cmd"
}
Newer versions of bash add a -v flag to the printf command that makes assignments like this a little easier on the eye, in that quoting is reduced.
printf -v cmd '<command version="33" cmd="GETINFO" %d </command>' "$3"
The single quote in Bash prevents variable substitution. In order for the third parameter to be substituted, you should enclose your string in double quotes. Of course, you then have the problem of the double quotes that are part of your string, so they need to be escaped with a backslash:
cmd="<command version=\"33\" cmd=\"GETINFO\" $3 </command>"

Replace a phrase in a file with a string which contains special characters

I am using sed -e "s/foo/$bar/" -e "s/some/$text/" file.whatever to replace a phrase in a certain file. The problem is that the $bar string contains multiple special characters like /. So when I try to replace something in a text file using the following code...
#!/bin/bash
bar="http://stackoverflow.com/"
sed -e "s/foo/$bar/" -e "s/some/$text/ file.whatever
...then I get an error saying sed: unknown option to `s'. Is there anything I can do about it?
You can use any delimiter: s#some#SOME#, for example. Another good delimiter is the vertical bar (|). Other characters can work, but some have special significance in certain contexts, such as regular expressions.
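For the question's example, any delimiter that does not occur in the replacement text works; a sketch, assuming neither $bar nor $text contains a # character ($text is never shown in the question, so the value below is made up):
bar="http://stackoverflow.com/"
text="placeholder text"   # hypothetical value
sed -e "s#foo#$bar#" -e "s#some#$text#" file.whatever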
You can get this difficulty in sed regardless of what delimiters you use, especially if you don't know what the string contains. I'd pick a different method for passing the shell variables into the helper interpreter.
awk -v rep1="$bar" -v rep2="$text" '{sub(/foo/, rep1); sub(/some/, rep2); print}'
or
perl -spe 's/foo/$rep1/; s/some/$rep2/' -- -rep1="$bar" -rep2="$text"
Correctness trumps brevity in this case.
(reference for Perl example)
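If the result needs to replace the original file, the awk variant could be wired up like this; writing to a temporary file and moving it back is just one common convention, not something taken from the answers above:
awk -v rep1="$bar" -v rep2="$text" \
    '{sub(/foo/, rep1); sub(/some/, rep2); print}' file.whatever > file.tmp &&
    mv file.tmp file.whatever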

Resources