Made-up example (Perl):
my $x = read_input_from_file();
# $x now contains the string $ENV{SOMETHING}/dir/$ENV{SOMETHING_ELSE}
my $y = eval($x); # doesn't work
How can I get the value of the string contained in $x in the script?
So far I have tried using eval, which doesn't generate any output. I am hoping that something already exists in Perl so that these string expressions do not need to be parsed and evaluated manually.
The "string" form of eval is a little specific:
eval in all its forms is used to execute a little Perl program.
...
In a string eval, the value of the expression (which is itself determined within scalar context) is first parsed, and if there were no errors, executed as a block within the lexical context of the current Perl program.
So this evaluates code, and when the variable to be evaluated contains a plain string we end up with a "bareword" ("Unquoted string"), which is generally no good. In your case, the / characters in $x cause additional trouble.
If the content of the variable to evaluate is a string literal (not code), it needs to be quoted:
my $y = eval q(") . $x . q("); # double-quote so that it interpolates
I use the operator form of a single quote, q(). Quoted inside it is a double-quote, since $x itself seems to contain variables that need to be evaluated (interpolated).
Keep in mind that running code from external sources can be a serious security problem.
Related: I've read this and it doesn't solve my problem.
I have a space-separated string, let's say $MyString = "arg1 arg2". Suppose I have a command-line program called MyProgram, which accepts an arbitrary number of positional arguments, so it can be run like MyProgram arg1 arg2. However, MyProgram $MyString doesn't work, and neither does MyProgram ($MyString -split ' ') nor MyProgram $($MyString -split ' '). I get the same error, which basically says that it doesn't recognise the argument "arg1 arg2"; I guess that's because it still thinks it's one argument containing a space rather than two arguments. In practice, $MyString may be quite huge and is read from a file. How do I make this work?
Oh, I just found out how, LOL. I should have thought of this sooner; basically, just use splatting. The following worked for me:
$MyArray = $($MyString -split " ")
MyProgram @MyArray
Explanation: the first line converts the string into an array of strings split by space (" "); the $(...) notation around a command captures the output of the command, which I then assign to $MyArray. Then, instead of using $MyArray with a dollar sign $, I use it with @ to splat the array of strings into arguments for MyProgram.
tl;dr
For calling PowerShell commands you indeed need splatting in order to pass the elements of an array as individual, positional arguments; this requires defining the array in an auxiliary variable that can then be passed with sigil @ in lieu of $ to request splatting:
$myArray = -split $myString # See below for limitations, bottom section for fix
MyPowerShellCommand @myArray # Array elements are passed as indiv. arguments.
While this technique also works with external programs, it isn't strictly necessary there, and you can pass an array directly to achieve the same effect:
MyExternalProgram (-split $myString) # Array elements are passed as indiv. args.
Note that (...) rather than $(...) is used to pass the expression as an argument. (...) is usually sufficient and generally preferable, because $(...) can have side effects - see this answer for details.
Just to bring the post you link to in your question and your answer here together:
First, to be clear: neither answer, due to splitting by spaces only, will deal properly with arguments inside the argument-list string that have embedded spaces (and therefore, of necessity, use embedded quoting), e.g., $myString = "arg1 `"arg2 with spaces`" arg3" would not work as expected - see the bottom section for a solution.
Leaving that aside, the difference is:
When calling an external program, as in the linked post, passing an array causes each element to become its own argument.
That is, myExternalProgram (-split $MyString) would work.
Note that I'm using the unary form of the -split operator for more flexible tokenization, which splits by any non-empty run of whitespace while ignoring leading and trailing whitespace (same as awk's default behavior).
When calling a PowerShell command, as in your case, an array is by default passed as-is, as a whole, as a single argument.
To achieve the same effect as with external programs, i.e. to pass the array's elements as individual, positional arguments, you indeed have to use splatting, i.e. you have to:
save the array in a variable first: $myArray = -split $myString,
which you can then pass as a splatted argument by using @ instead of $ as the sigil: MyPowerShellCommand @myArray
Do note that when calling PowerShell commands it is more common - and more robust - to use hashtable-based rather than array-based splatting, as it allows you to explicitly bind to parameters by name rather than by position - and PowerShell commands often have parameters that can only be bound by name.
E.g., if MyPowerShellCommand accepts parameters -Foo and -Bar, you could use:
$myArgs = @{ Foo='foo value'; Bar='bar value' }; MyPowerShellCommand @myArgs
If you do want to handle argument-list strings that have arguments with embedded quoting:
$myString = 'arg1 "arg2 with spaces" arg3'
$myArray = (Invoke-Expression ('Write-Output -- ' + $myString -replace '\$', "`0")) -replace "`0", '$$'
Note: Invoke-Expression (iex) should generally be avoided, but the extra precautions taken in this particular command make its use safe.
$myArray is then a 3-element array with verbatim elements arg1, arg2 with spaces and arg3, which can again be used as shown above.
See this answer for an explanation of the technique.
These work for me ($args is reserved). Unary -split (with the string on its right) splits on whitespace. Or you can use Get-Content to read from a file where each argument is on a separate line. You might run into a limit on how long a command line can be; piping the list in or loading it from a file might be a better approach.
echo hi > file.txt
$args2 = 'hi','file.txt'
findstr $args2
# hi
$args2 = 'hi','file.txt'
& findstr $args2
# hi
$args2 = 'hi file.txt'
findstr (-split $args2)
# hi
findstr ($args2 -split ' ')
# hi
For debugging my scripts, I would like to add the internal variables $FUNCNAME and $LINENO at the beginning of each of my outputs, so I know which function and line number each output comes from.
foo(){
  local bar="something"
  echo "$FUNCNAME $LINENO: I just set bar to $bar"
}
But since there will be many debugging outputs, it would be cleaner if I could do something like the following:
foo(){
  local trace='$FUNCNAME $LINENO'
  local bar="something"
  echo "$trace: I just set bar to $bar"
}
But the above literally outputs:
"$FUNCNAME $LINENO: I just set bar to something"
I think it does this because double quotes only expand variables once.
Is there a syntactically clean way to expand variables twice in the same line?
You cannot safely evaluate expansions twice when handling runtime data.
There are means to do re-evaluation, but they require trusting your data -- in the NSA system design sense of the word: "A trusted component is one that can break your system when it fails".
See BashFAQ #48 for a detailed discussion. Keep in mind that if you could be logging filenames, any character except NUL can be present in a UNIX filename. $(rm -rf ~)'$(rm -rf ~)'.txt is a legal name. * is a legal name.
Consider a different approach:
#!/usr/bin/env bash

trace() { echo "${FUNCNAME[1]}:${BASH_LINENO[0]}: $*" >&2; }

foo() {
  bar=baz
  trace "I just set bar to $bar"
}

foo
...which, when run with bash 4.4.19(1)-release, emits:
foo:7: I just set bar to baz
Note the use of ${BASH_LINENO[0]} and ${FUNCNAME[1]}; this is because BASH_LINENO is defined as follows:
An array variable whose members are the line numbers in source files where each corresponding member of FUNCNAME was invoked.
Thus, FUNCNAME[0] is trace and FUNCNAME[1] is foo, whereas BASH_LINENO[0] is the line from which trace was called -- a line that is inside the function foo.
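If it helps to see the indexing, here is a minimal sketch (the function names are made up for illustration) that dumps both arrays from a helper called two levels deep:
#!/usr/bin/env bash

# Hypothetical helper: show the call stack as seen from inside it.
where_am_i() {
  echo "FUNCNAME:    ${FUNCNAME[*]}"     # where_am_i inner outer main
  echo "BASH_LINENO: ${BASH_LINENO[*]}"  # call-site line numbers, innermost call first
}

inner() { where_am_i; }
outer() { inner; }
outer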
Although eval has its dangers, getting a second expansion is what it does:
foo(){
  local trace='$FUNCNAME $LINENO'
  local bar="something"
  eval echo "$trace: I just set bar to $bar"
}
foo
Gives:
foo 6: I just set bar to something
Just be careful not to eval anything that has come from external sources, since you could get a command injected into the string.
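To make that risk concrete, here is a sketch (with a deliberately harmless payload) of how data reaching that eval can smuggle in an extra command:
#!/usr/bin/env bash
foo(){
  local trace='$FUNCNAME $LINENO'
  # Imagine this value arrived from user input or a file:
  local bar='something; echo INJECTED'
  eval echo "$trace: I just set bar to $bar"
}
foo
# Prints the trace line as before, then runs the smuggled command:
#   foo <line>: I just set bar to something
#   INJECTED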
Yes to double expansion; but no, it won't do what you are hoping for.
Yes, bash offers a way to do "double expansion" of a variable: first a variable is expanded, the result is taken as the name of some other variable, and that other variable is what actually gets expanded. This is called "indirection". With indirection, bash allows a shell variable to reference another shell variable, with the final value coming from the referenced variable. So, a bash variable can be passed by reference.
The syntax is just the normal braces style expansion, but with an exclamation mark prepended to the name.
${!VARNAME}
It is used like this:
BAR="my final value";
FOO=BAR
echo ${!FOO};
...which produces this output...
my final value
No, you can't use this mechanism to do the same as $( eval "echo $VAR1 $VAR2" ). The result of the first interpretation must be exactly the name of a shell variable. It does not accept a string, and does not understand the dollar sign. So this won't work:
BAR="my final value";
FOO='$BAR'; # The dollar sign confuses things
echo ${!FOO}; # Fails because there is no variable named '$BAR'
So, it does not solve your ultimate quest. Nonetheless, indirection can be a powerful tool.
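For instance (a made-up configuration lookup), indirection lets you pick one of several similarly named variables by building the name at run time:
#!/usr/bin/env bash
# Hypothetical per-environment settings:
dev_url="http://localhost:8080"
prod_url="https://example.com"

env="prod"              # chosen at run time
varname="${env}_url"    # build the *name* of the variable to read
echo "${!varname}"      # indirection: prints https://example.com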
I am fairly new to Unix bash scripting and need to know if this is possible. I want to ask the user for input multiple times and then store all of that input in one variable.
userinputs= #nothing at the start
read string
<code to add $string to $userinputs>
read string
<code to add $string to $userinputs> #this should add this input along with the other input
so if the user enters "abc" when asked the first time, it adds "abc" to $userinputs
then when asked again for input and the user enters "123", the script should store it in the same $userinputs
this would make $userinputs=abc123
The usual way to concatenate two strings in Bash is:
new_string="$string1$string2"
The braces {} are needed around the variable name only when it is immediately followed by literal text that could otherwise be read as part of the name:
new_string="${string1}literal$string2"
rather than
new_string="$string1literal$string2"
You can also use the += operator:
userinputs=
read string
userinputs+="$string"
read string
userinputs+="$string"
Double quoting $string is optional in this case.
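For what it's worth, a quick demonstration (with a made-up value) that the assignment itself does not word-split, which is why the quotes are optional there:
string='a  b  c'
userinputs=
userinputs+=$string    # unquoted is fine here: no word splitting in an assignment
echo "$userinputs"     # prints: a  b  c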
See also:
How to concatenate string variables in Bash?
You can concatenate variables and store multiple strings in the same one like so:
foo=abc
echo $foo # prints 'abc'
bar=123
foo="${foo}${bar}"
echo $foo # prints 'abc123'
You can use the other variables, or the same variable, when assigning to a variable, e.g. a="${a}123${b}". See this question for more info.
You don't have to quote the strings you're assigning, or use the ${var} syntax, but learning when to quote and when not to quote is a surprisingly nuanced art. It's often better to be safe than sorry, and the "${var}" syntax in double quotes is usually the safest approach (see any of these links for more than you ever wanted to know: 1 2 3).
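As a tiny illustration of why the quoting matters (the value is contrived to trigger word splitting and globbing):
var="a   *   b"
echo $var     # unquoted: runs of spaces collapse and * may expand to filenames
echo "$var"   # quoted: prints the value verbatim: a   *   b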
Anyway, you should read into a temporary variable (read, by default, reads into $REPLY) and concatenate that onto your main variable, like so:
allinput=
read # captures user input into $REPLY
allinput="${REPLY}"
read
allinput="${allinput}${REPLY}"
Beware that the read command behaves very differently depending on supplied switches and the value of the IFS global variable, especially in the face of unusual input with special characters. A common "just do what I mean" choice is to empty out IFS via IFS= and use read -r to capture input. See the read builtin documentation for more info.
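Putting those hints together, a slightly more careful version of the snippet above might look like this (reading into a named variable instead of $REPLY):
allinput=
IFS= read -r line            # -r keeps backslashes literal, IFS= keeps surrounding whitespace
allinput="$line"
IFS= read -r line
allinput="${allinput}${line}"
printf '%s\n' "$allinput"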
In a zsh script,
echo ${X:-4711}
outputs the value of the variable X, or 4711 if there is none.
echo ${X:u}
outputs the value of the variable X, converted to upper case.
I wonder whether there is a way to combine the two, i.e. to have the effect of
tmp=${X:-4711}
echo ${tmp:u}
without introducing an auxiliary variable.
$ echo ${${X:-4711}:u}
4711
$ X=hello
$ echo ${${X:-4711}:u}
HELLO
From man zshexpn:
If a `${...}` type parameter expression or a `$(...)` type command
substitution is used in place of name above, it is expanded first and
the result is used as if it were the value of name. Thus it is possible
to perform nested operations: `${${foo#head}%tail}` substitutes the value
of `$foo` with both 'head' and 'tail' deleted.
I'm not sure how to do this, but I figured I would ask here. I'm trying to build a single string from specific environment variables such that:
$A = "foo"
$B = "bar"
$C = "baz"
would give "foo, bar, baz"
Unfortunately, it doesn't seem that the Bourne shell supports arrays, which would have made this easy to solve. The other way I'm trying to solve this is by directly inserting my own variable called $COMMA after each environment variable; however, I am getting syntax errors, so I'm not sure how to do this correctly. I would appreciate any advice here, thanks!
Your variables shouldn't start with $ unless you want their value (this isn't Perl or PHP...)
A=foo
B=bar
C=baz
echo "$A, $B, $C"
or, as a one-liner:
A=foo; B=bar; C=baz; echo "$A, $B, $C"
Either way will give you a comma-separated list of the values you defined. (Note that A=foo B=bar C=baz echo $A,$B,$C would not work: the shell expands $A, $B and $C before those per-command assignments take effect, so echo would only see the commas.)
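If the number of values isn't fixed (say, you're collecting them in a loop), the same idea works with a separator variable, much like the $COMMA approach mentioned in the question; a portable Bourne/POSIX sh sketch:
#!/bin/sh
A=foo
B=bar
C=baz

list=
for val in "$A" "$B" "$C"; do
  if [ -z "$list" ]; then
    list=$val                # first value: no separator
  else
    list="$list, $val"       # every later value: prepend the ", " separator
  fi
done
echo "$list"                 # prints: foo, bar, baz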