Why doesn't the command execute successfully if I use variables? - linux

I have the following script
WSO2_SCRIPT="JAVA_HOME=$JAVA_HOME /opt/autopilot/wso2is/bin/wso2server.sh"
WSO2_LOG="/var/log/autopilot/wso2is/autopilot-wso2is-initd.log"
${WSO2_SCRIPT} start >> ${WSO2_LOG} 2>&1 || echo failed
JAVA_HOME=$JAVA_HOME /opt/autopilot/wso2is/bin/wso2server.sh start >> /var/log/autopilot/wso2is/autopilot-wso2is-initd.log 2>&1 || echo failedagain
The third line of the code fails: "failed" is echoed.
However, the fourth line succeeds, and "failedagain" is not echoed.
Lines 3 and 4 should do exactly the same thing; the only difference is that line 3 uses variables while line 4 is explicit.
Why does using variables result in a failure?

When variables are expanded without quotes, they undergo word splitting and pathname expansion, but not shell grammar parsing.
This means that you can put the following in variables:
Multiple arguments (including the command name) to be split up on whitespace
Globs like *.txt to be expanded
It also means that you cannot put any of the following in variables:
Redirections
Quotes
Pipes
Backgrounding &
Conditionals and loops, including if and [[ .. ]]
Brace and parenthesis groups
Parameter expansion
Command substitutions
Process substitutions
and as you've discovered: variable assignments
If you want to pass any of the above around as a variable, you should use a function and refer to the function instead. If you don't care about security and good practice, you can also use eval to evaluate a text string as shell code.
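For the script in the question, the function approach could look like this (a minimal sketch using the paths from the question; quoting of $WSO2_LOG added for safety):
#!/bin/bash
WSO2_LOG="/var/log/autopilot/wso2is/autopilot-wso2is-initd.log"
# A function body can contain the environment assignment together with
# the command, which a plain variable cannot.
wso2_script() {
    JAVA_HOME="$JAVA_HOME" /opt/autopilot/wso2is/bin/wso2server.sh "$@"
}
wso2_script start >> "$WSO2_LOG" 2>&1 || echo failed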

Related

How do I pass ">>" or "<<" to my script without the terminal trying to interpret it as me either appending to something or getting stdin?

My python script can take a series of bitwise operators as one of its arguments. They all work fine except for "=<<" which is roll left, and "=>>" which is roll right. I run my script like ./script.py -b +4,-4,=>>10,=<<1, where anything after -b can be any combination of similar operations. As soon as the terminal sees "<<" though, it just drops the cursor to a new line after the command and asks for more input instead of running the script. When it sees ">>", my script doesn't process the arguments correctly. I know it's because bash uses these characters for a specific purpose, but I'd like to get around it while still using "=>>" and "=<<" in my arguments for my script. Is there any way to do it without enclosing the argument in quotation marks?
Thank you for your help.
You should enclose the parameters that contain special symbols in single quotation marks (here, echo represents your script):
> echo '+4,-4,=>>10,=<<1'
+4,-4,=>>10,=<<1
Alternatively, save the parameters to a file (say, params.txt) and read them from the file onto the command line using the backticks:
> echo `cat params.txt`
+4,-4,=>>10,=<<1
Lastly, you can escape some offending symbols:
> echo +4,-4,=\>\>10,=\<\<1
+4,-4,=>>10,=<<1
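To confirm that your script receives the operators verbatim with any of these approaches, a quick check (args.sh is a hypothetical stand-in for your script):
#!/bin/bash
# args.sh: print each received argument on its own line
printf '%s\n' "$@"
Running ./args.sh -b '+4,-4,=>>10,=<<1' prints -b and +4,-4,=>>10,=<<1 on separate lines, and no redirection or here-document is triggered.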

How to get the complete calling command of a BASH script from inside the script (not just the arguments)

I have a BASH script that has a long set of arguments and two ways of calling it:
my_script --option1 value --option2 value ... etc
or
my_script val1 val2 val3 ..... valn
This script in turn compiles and runs a large FORTRAN code suite that eventually produces a netcdf file as output. I already have all the metadata in the netcdf output global attributes, but it would be really nice to also include the full run command one used to create that experiment. Thus another user who receives the netcdf file could simply reenter the run command to rerun the experiment, without having to piece together all the options.
So that is a long way of saying, in my BASH script, how do I get the last command entered from the parent shell and put it in a variable? i.e. the script is asking "how was I called?"
I could try to piece it together from the option list, but the very long option list and two interface methods would make this long and arduous, and I am sure there is a simple way.
I found this helpful page:
BASH: echoing the last command run
but this only seems to work to get the last command executed within the script itself. The asker also refers to use of history, but the answers seem to imply that the history will only contain the command after the programme has completed.
Many thanks if any of you have any idea.
You can try the following:
myInvocation="$(printf %q "$BASH_SOURCE")$( (($#)) && printf ' %q' "$@" )"
$BASH_SOURCE refers to the running script (as invoked), and "$@" is the array of arguments; (($#)) && ensures that the following printf command is only executed if at least 1 argument was passed; printf %q is explained below.
While this won't always be a verbatim copy of your command line, it'll be equivalent - the string you get is reusable as a shell command.
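As a quick illustration, assume my_script (a hypothetical name) contains nothing but the line above plus an echo:
#!/bin/bash
myInvocation="$(printf %q "$BASH_SOURCE")$( (($#)) && printf ' %q' "$@" )"
echo "$myInvocation"
Then ./my_script --option1 'a value' prints ./my_script --option1 a\ value, which can be pasted back into a shell to re-run the command.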
chepner points out in a comment that this approach will only capture what the original arguments were ultimately expanded to:
For instance, if the original command was my_script $USER "$(date +%s)", $myInvocation will not reflect these arguments as-is, but will rather contain what the shell expanded them to; e.g., my_script jdoe 1460644812
chepner also points out that getting the actual raw command line as received by the parent process will be (next to) impossible. Do tell me if you know of a way.
However, if you're prepared to ask users to do extra work when invoking your script or you can get them to invoke your script through an alias you define - which is obviously tricky - there is a solution; see bottom.
Note that use of printf %q is crucial to preserving the boundaries between arguments - if your original arguments had embedded spaces, something like $0 $* would result in a different command.
printf %q also protects against other shell metacharacters (e.g., |) embedded in arguments.
printf %q quotes the given argument for reuse as a single argument in a shell command, applying the necessary quoting; e.g.:
$ printf %q 'a |b'
a\ \|b
a\ \|b is equivalent to single-quoted string 'a |b' from the shell's perspective, but this example shows how the resulting representation is not necessarily the same as the input representation.
Incidentally, ksh and zsh also support printf %q, and ksh actually outputs 'a |b' in this case.
If you're prepared to modify how your script is invoked, you can pass $BASH_COMMAND as an extra argument: $BASH_COMMAND contains the raw[1]
command line of the currently executing command.
For simplicity of processing inside the script, pass it as the first argument (note that the double quotes are required to preserve the value as a single argument):
my_script "$BASH_COMMAND" --option1 value --option2
Inside your script:
# The *first* argument is what "$BASH_COMMAND" expanded to,
# i.e., the entire (alias-expanded) command line.
myInvocation=$1 # Save the command line in a variable...
shift # ... and remove it from "$@".
# Now process "$@", as you normally would.
Unfortunately, there are only two options when it comes to ensuring that your script is invoked this way, and they're both suboptimal:
The end user has to invoke the script this way - which is obviously tricky and fragile (you could, however, check in your script whether the first argument contains the script name and error out if not).
Alternatively, provide an alias that wraps the passing of $BASH_COMMAND as follows:
alias my_script='/path/to/my_script "$BASH_COMMAND"'
The tricky part is that this alias must be defined in all end users' shell initialization files to ensure that it's available.
Also, inside your script, you'd have to do extra work to re-transform the alias-expanded version of the command line into its aliased form:
# The *first* argument is what "$BASH_COMMAND" expanded to,
# i.e., the entire (alias-expanded) command line.
# Here we also re-transform the alias-expanded command line to
# its original aliased form, by replacing everything up to and including
# "$BASH_COMMAND" with the alias name.
myInvocation=$(sed 's/^.* "\$BASH_COMMAND"/my_script/' <<<"$1")
shift # Remove the first argument from "$@".
# Now process "$@", as you normally would.
Sadly, wrapping the invocation in a script or function is not an option, because $BASH_COMMAND only ever reports the current command's command line, which in the case of a script or function wrapper would be the line inside that wrapper.
[1] The only things that get expanded are aliases, so if you invoked your script via an alias, you'll still see the underlying script in $BASH_COMMAND, but that's generally desirable, given that aliases are user-specific.
All other arguments and even input/output redirections, including process substitutions <(...), are reflected as-is.
"$0" contains the script's name, "$@" contains the parameters.
Do you mean something like echo $0 $*?
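As noted above, $0 $* loses the argument boundaries; a side-by-side sketch (naive.sh is a hypothetical name):
#!/bin/bash
# naive.sh: compare the naive and the printf %q reconstructions
echo "naive:  $0 $*"
printf 'robust: %q' "$0"; (($#)) && printf ' %q' "$@"; printf '\n'
Invoked as ./naive.sh 'a b' c, the naive version prints ./naive.sh a b c, where the original two arguments are indistinguishable from three, while the printf %q version prints the reusable ./naive.sh a\ b c.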

Passing \* as a parameter

Using ksh. Trying to reuse a current script without modifying it, which basically boils down to something like this:
`expr 5 $1 $2`
How do I pass in a multiplication operator (*) as parameter $1?
I first attempted using "*" and even \*, but that isn't working.
I've tried multiple backslash-escape and quote combinations, but I think I'm doing it wrong.
Without modifying the script, I don't think this can be done:
On calling, you can pass a literal * as '*', "*" or \* (any will do): this will initially protect the * from shell expansions (interpretation by the shell).
The callee (the script) will then receive literal * (as $1), but due to unquoted use of $1 inside the script, * will invariably be subject to filename expansion (globbing), and will expand to all (non-hidden) filenames in the current folder, breaking the expr command.
Trying to add an extra layer of escaping - such as "'*'" or \\\* - will NOT work, because the extra escaping will become an embedded, literal part of the argument - the target script will see literal '*' or \* and pass it as-is to expr, which will fail, because neither is a valid operator.
Here's a workaround:
Change to an empty directory.
By default, ksh will return any glob (pattern) as-is if there are no matching filenames. Thus, * (or any other glob) will be left unmodified in an empty directory, because there's nothing to match (thanks, @Peter Cordes).
For the calling script / interactive shell, you could disable globbing altogether by running set -f, but note that this will not affect the called script.
It's now safe to invoke your script with '*' (or any other glob), because it will simply be passed through; e.g., script '*' 2 will now yield 10, as expected.
If both the shell you invoke from and the script's shell are ksh (or bash) with their default configuration, you can even get away with script * 2; i.e., you can make do without quoting * altogether.
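A transcript of the workaround might look like this (paths are illustrative, and the script is assumed to print the expr result):
$ mkdir /tmp/empty && cd /tmp/empty
$ /path/to/script '*' 2
10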
Glob expansion happens very late, after parameter expansion and word-splitting (in that order). Quote-removal doesn't happen on the results of earlier expansions, just on what was on the command line to start with. This rules out passing in a quoted \* or similar by using an extra layer of quoting (see mklement0's answer).
It also rules out passing in space-padded *: Word-splitting removes the spaces before pathname (glob) expansion, so it still ends up expanding * to all the filenames in the directory.
foo(){ printf '"%s"\n' "$@"; set -x; expr 5 $1 $2; set +x; }
$ foo ' * ' 4
" * "
"4"
+ expr 5 ...contents of my directory... 4
expr: syntax error
+ set +x
You should fix this buggy script before someone runs it with an arg that breaks it in a way that's dangerous, rather than just inconvenient.
If you don't need to support exactly the same operators as expr, you might want to use arithmetic expansion to do it without running an external command:
result=$((5 $1 $2)) # arithmetic expansion for the right-hand side
# or
((result=5 "$1" "$2")) # whole command is an arithmetic expression.
Double-quotes around parameters are optional inside an arithmetic expression, but you must not use them in an arithmetic expansion (in bash; apparently they do work in ksh).
Normally it's a good habit to just always quote unless you specifically want word-splitting and glob expansion.
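A minimal fixed script using the arithmetic-expansion approach (calc.sh is a hypothetical name; works in ksh and bash):
#!/bin/ksh
# calc.sh: evaluate "5 <operator> <operand>" without running expr.
# Inside $(( )) the expanded $1 is not subject to globbing, so a
# quoted '*' at the call site is all that's needed.
result=$((5 $1 $2))
echo "$result"
Now ./calc.sh '*' 2 prints 10 from any directory, empty or not.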

Create variables inside a Unix script

I am trying to create a variable in a script, based on another variable.
I just don't know what needs to be adjusted in my code, if it is possible.
I am simplifying my code for your understanding, so this is not the original code.
The code goes like that:
#!/bin/csh -f
set list_names=(Albert Bela Corine David)
set Albert_house_number=1
set Bela_house_number=2
set Corine_house_number=3
set David_house_number=4
foreach name ($list_names)
#following line does not work....
set house_number=$($name\_house_number)
echo $house_number
end
the desired output should be:
1
2
3
4
Thanks for your help.
Unfortunately, the bashism ${!varname} is not available to us in csh, so we'll have to go the old-fashioned route using backticks and eval. csh's quoting rules are different from those of POSIX-conforming shells, so all of this is csh specific. Here we go:
set house_number = `eval echo \" \$${name}_house_number\"`
echo "$house_number"
${name} is expanded into the backticked command, so this becomes equivalent to, say,
set house_number = `eval echo \" \$Albert_house_number\"`
which then evaluates
echo " $Albert_house_number"
and because of the backticks, the output of that is then assigned to house_number.
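Applied to the original loop, the whole script becomes (a sketch; csh quoting is fragile, so test with your csh):
#!/bin/csh -f
set list_names = (Albert Bela Corine David)
set Albert_house_number = 1
set Bela_house_number = 2
set Corine_house_number = 3
set David_house_number = 4
foreach name ($list_names)
    # indirect lookup: ${name} expands first, then eval runs the echo
    set house_number = `eval echo \" \$${name}_house_number\"`
    echo "$house_number"
end
This prints 1, 2, 3 and 4 on separate lines, as desired.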
The space before \$$ is necessary in case the value of the expanded variable has special meaning to echo (such as -n). We could not simply use echo "-n" (it wouldn't print anything), but echo " -n" is fine.[1]
The extra space is stripped by csh when the backtick expression is expanded. This leads us to the remaining caveat: Spaces in variable values are going to be stripped; csh's backticks do that. This means that if Albert_house_number were defined as
set Albert_house_number = "3 4"
house_number would end up with the value 3 4 (with only one space). I don't know a way to prevent this.
[1] Note that in this case, the echo "$house_number" line would have to be amended as well, or it would run echo "-n" and not print anything even though house_number has the correct value.

Bash variable defaulting doesn't work if followed by pipe (bash bug?)

I've just discovered a strange behaviour in bash that I don't understand. The expression
${variable:=default}
sets variable to the value default if it isn't already set. Consider the following examples:
#!/bin/bash
file ${foo:=$1}
echo "foo >$foo<"
file ${bar:=$1} | cat
echo "bar >$bar<"
The output is:
$ ./test myfile.txt
myfile.txt: ASCII text
foo >myfile.txt<
myfile.txt: ASCII text
bar ><
You will notice that the variable foo is assigned the value of $1 but the variable bar is not, even though the result of its defaulting is presented to the file command.
If you remove the innocuous pipe into cat from line 4 and re-run it, then both foo and bar get set to the value of $1.
Am I missing something here, or is this potentially a bash bug?
(GNU bash, version 4.3.30)
In the second case, file is a member of a pipeline, and like every pipeline member it runs in its own subshell. When file and its subshell exit, $bar with its new value from $1 no longer exists.
Workaround:
#!/bin/bash
file ${foo:=$1}
echo "foo >$foo<"
: "${bar:=$1}" # Parameter Expansion before subshell
file $bar | cat
echo "bar >$bar<"
It's not a bug. Parameter expansion happens when the command is evaluated, not parsed, but a command that is part of a pipeline is not evaluated until the new process has been started. Changing this, aside from likely breaking some existing code, would require an extra level of expansion before evaluation occurs.
A hypothetical bash session:
> foo=5
> bar='$foo'
> echo "$bar"
$foo
# $bar expands to '$foo' before the subshell is created, but then `$foo` expands to 5
# during the "normal" round of parameter expansion.
> echo "$bar" | cat
5
To avoid that, bash would need some way of marking pieces of text that result from the new first round of pre-evaluation parameter expansion, so that they do not undergo a second round of evaluation. This type of bookkeeping would quickly lead to unmaintainable code as more and more corner cases are found that need handling. Far simpler is to just accept that parameter expansions are deferred until after the subshell starts.
The other alternative is to allow each component to run in the current shell, something that is allowed by the POSIX standard, but is not required, either. bash made the choice long ago to execute each component in a subshell, and reversing that would break too much existing code that relies on the current behavior. (bash 4.2 did introduce the lastpipe option, allowing the last component of a pipeline to execute in the current shell if explicitly enabled.)
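For completeness, a sketch of the lastpipe behavior mentioned above (note it would not rescue the file ... | cat example, since that assignment happens in the first pipeline component, not the last; lastpipe also only takes effect when job control is off, as it is in scripts):
#!/bin/bash
shopt -s lastpipe    # run the last pipeline component in the current shell
echo "myfile.txt" | read bar
echo "bar >$bar<"    # prints: bar >myfile.txt<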
