Can't create third parameter in bash script (after $*) - Linux

I want to create a bash script that gets 3 parameters, but the second needs to be $*, because I need it later for lines like this one. The other two parameters (first and third) don't need this.
for x in $*; do
The first and second parameters aren't the problem; this works:
parameter1="$1"
shift
parameter2="$*"
But I need the third parameter at the end, and something like this
parameter1="$1"
parameter3="$3"
shift
parameter2="$*"
won't work. My command at the end should look like this:
bash myscript parameter1 parameter2 parameter3

For specifically three parameters, you can use substring parameter expansion in a simple way:
parameter1=$1
parameter2="${#:2:1}" # One parameter, starting with #2
parameter3=$3
Of course, that's unnecessary, since you can just use $2 instead of ${@:2:1}, but I point it out as a simple introduction to the syntax (and not at all because I overlooked the fact you would use $2, really....)
(You can also use it as a substitute for indirect parameter expansion; "${@:n:1}" and "${!n}" are basically equivalent when n is a variable with an integer value.)
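For example, a quick sketch you can paste into a shell to see the two forms agree (set -- just fakes three positional parameters for the demo):
set -- one two three
n=2
echo "${@:n:1}" # prints "two"; the offset is an arithmetic expression, so a bare variable works
echo "${!n}"    # prints "two" via indirect expansion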
For the more general case, where you want an arbitrary number of arguments between the first and last, it gets a little more complicated, although the principle is the same:
parameter1=$1
middleParameters=( "${@:2:$#-2}" ) # n - 2 parameters, starting with #2, i.e., all but $1 and ${!n} for n=$#
lastParameter="${@:$#}"
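For instance, if the snippet above were saved in a script (a hypothetical splitargs) and run as bash splitargs a b c d, printing the three pieces would show the split:
echo "first:  $parameter1"            # a
echo "middle: ${middleParameters[*]}" # b c
echo "last:   $lastParameter"         # d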

shift removes an argument from the left. If you want to remove an argument from the right, you can do that with:
set -- "${#:1:$# - 1}"
Thus:
parameter1=$1 # capture leftmost argument
shift # remove leftmost argument
parameter3=${*:$#:1} # capture rightmost argument
set -- "${#:1:$# - 1}" # remove rightmost argument
parameter2=$* # concatenate remaining arguments and store in a string
Note that $* is almost certainly the Wrong Thing. If you want to keep your arguments separate, respecting their quoting, instead use an array:
parameter2=( "$#" )
for item in "${parameter2[#]}"; do
echo "Processing item: $item"
done
If your script is run as yourscript arg1 "item A" "item B" arg3, then the above will ensure that item A and item B are treated as individual arguments, rather than treating item as an argument, A as another, etc.
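Putting the pieces together, a minimal sketch of a complete script (hypothetical file name myscript) based on the approach above:
#!/bin/bash
first=$1               # capture leftmost argument
shift                  # remove it
last=${*:$#:1}         # capture rightmost argument
set -- "${@:1:$# - 1}" # remove it
middle=( "$@" )        # keep everything in between as separate words

echo "first: $first"
echo "last:  $last"
for item in "${middle[@]}"; do
    echo "middle item: $item"
done
Run as bash myscript a "b c" d, it reports a as the first argument, d as the last, and b c as a single middle item.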

Related

Powershell, use (space-separated) string as arguments to a program

I've read this and it doesn't solve my problem.
I have a space-separated string, let's say $MyString = "arg1 arg2". Suppose I have a command line program called MyProgram, which accepts an arbitrary number of positional arguments, so it can be run like MyProgram arg1 arg2. However doing MyProgram $MyString doesn't work, and neither does MyProgram ($MyString -split ' ') nor MyProgram $($MyString -split ' '). I get the same error which basically says that it doesn't recognise the argument "arg1 arg2", which I guess is because it still thinks it's one argument containing a space rather than two arguments. In practice, $MyString may be quite huge and is read from a file. How do I make this work?
Oh I just found out how LOL. I should have thought of this sooner; basically, just use splatting. The following worked for me:
$MyArray = $($MyString -split " ")
MyProgram @MyArray
Explanation: The first line converts the string into an array of strings split by space (" "); The $(...) notation around a command captures the output of the command, which I then assign to $MyArray. Then, instead of using $MyArray with a dollar sign $, I use it with @ to splat the array of strings into arguments for MyProgram.
tl;dr
For calling PowerShell commands you indeed need splatting in order to pass the elements of an array as individual, positional arguments; this requires defining the array in an auxiliary variable that can then be passed with sigil @ in lieu of $ to request splatting:
$myArray = -split $myString # See below for limitations, bottom section for fix
MyPowerShellCommand @myArray # Array elements are passed as indiv. arguments.
While this technique also works with external programs, it isn't strictly necessary there, and you can pass an array directly to achieve the same effect:
MyExternalProgram (-split $myString) # Array elements are passed as indiv. args.
Note that (...) rather than $(...) is used to pass the expression as an argument. (...) is usually sufficient and generally preferable, because $(...) can have side effects - see this answer for details.
Just to bring the post you link to in your question and your answer here together:
First, to be clear: neither answer, due to splitting by spaces only, will deal properly with arguments inside the argument-list string that have embedded spaces (and therefore, of necessity, use embedded quoting), e.g., $myString = "arg1 `"arg2 with spaces`" arg3" would not work as expected - see the bottom section for a solution.
Leaving that aside, the difference is:
When calling an external program, as in the linked post, passing an array causes each element to become its own argument.
That is, myExternalProgram (-split $MyString) would work.
Note that I'm using the unary form of the -split operator for more flexible tokenization, which splits by any non-empty run of whitespace while ignoring leading and trailing whitespace (same as awk's default behavior).
When calling a PowerShell command, as in your case, an array is by default passed as-is, as a whole, as a single argument.
To achieve the same effect as with external programs, i.e. to pass the array's elements as individual, positional arguments, you indeed have to use splatting, i.e. you have to:
save the array in a variable first: $myArray = -split $myString,
which you can then pass as a splatted argument by using @ instead of $ as the sigil: MyPowerShellCommand @myArray
Do note that when calling PowerShell commands it is more common - and more robust - to use hashtable-based rather than array-based splatting, as it allows you to explicitly bind to parameters by name rather than by position - and PowerShell commands often have parameters that can only be bound by name.
E.g., if MyPowerShellCommand accepts parameters -Foo and -Bar, you could use:
$myArgs = @{ Foo='foo value'; Bar='bar value' }; MyPowerShellCommand @myArgs
If you do want to handle argument-list strings that have arguments with embedded quoting:
$myString = 'arg1 "arg2 with spaces" arg3'
$myArray = (Invoke-Expression ('Write-Output -- ' + $myString -replace '\$', "`0")) -replace "`0", '$$'
Note: Invoke-Expression (iex) should generally be avoided, but the extra precautions taken in this particular command make its use safe.
$myArray is then a 3-element array with verbatim elements arg1, arg2 with spaces and arg3, which can again be used as shown above.
See this answer for an explanation of the technique.
These work for me ($args is reserved). -split on the left side splits on whitespace. Or you can get-content from a file where each argument is on a separate line. You might run into a limit with how long a commandline can be. Piping that list in or loading it from a file might be a better approach.
echo hi > file.txt
$args2 = 'hi','file.txt'
findstr $args2
# hi
$args2 = 'hi','file.txt'
& findstr $args2
# hi
$args2 = 'hi file.txt'
findstr (-split $args2)
# hi
findstr ($args2 -split ' ')
# hi

How to modify command line arguments inside bash script using set

I'm executing a shell script and passing few command line arguments to it.
I want to modify the arguments inside the script using set. Not all at once depending upon some conditions.
How can I do that?
Copy unmodified arguments at their respective location within set --
Say you want to modify value of argument 2:
set -- "${#::2}" 'new arg2 value' "${#:3}"
Explanation:
"${#::2}": Expands 2 arguments from index 0 (arguments 0 and 1)
new arg2 value: Becomes the value for argument 2.
"${#:3}": Expands all argument values starting at index 3.
Opinion:
Anyway, having mutable arguments is considered a code smell in modern programming. So I'd recommend you reconsider your approach to the problem you are trying to solve.

How can I know if a string contains only one or several words in Bash? [duplicate]

This question already has answers here:
A confusion about ${array[*]} versus ${array[@]} in the context of a bash completion
(2 answers)
Closed 6 years ago.
When I get the content of an array in a string, I have the 2 solutions below:
$ a=('one' 'two')
$ str1="${a[*]}" && str2="${a[#]}"
Afterwards, of course, I can reuse my string in the code,
but how can I know if my variable has only one or several words?
In both cases, the contents of the array are concatenated to a single string and assigned to the variable. The only difference is what is used to join the elements. With ${a[*]}, the first character of IFS is used. With ${a[@]}, a single space is always used.
$ a=(one two)
$ IFS="-"
$ str1="${a[*]}"
$ str2="${a[#]}"
$ echo "$str1"
one-two
$ echo "$str2"
one two
When expanding $str1 or $str2 without quoting, the number of resulting words is entirely dependent on the current value of IFS, regardless of how the variables were originally defined. "$str1" and "$str2" each expand, of course, to a single word.
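To answer the literal question, here is a small sketch (assuming "words" means whitespace-separated fields under the default IFS): let word splitting build an array and count its elements.
str="one two"
words=( $str )   # intentionally unquoted so the string is split on IFS
                 # (caveat: glob characters in str would also be expanded here)
if (( ${#words[@]} > 1 )); then
    echo "several words"
else
    echo "one word (or none)"
fi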
To add to @chepner's great answer: the difference between ${arr[*]} and ${arr[@]} is very similar to the difference between $* and $@. You may want to refer to this post which talks about $* and $@:
What's the difference between $@ and $* in UNIX?
As a rule of thumb, it is always better to use "$@" and "${arr[@]}" than their unquoted or * counterparts.
"${a[*]}" expands to one string for all entries together and "${a[#]}" expands to one string per entry.
Assume we had a program printParameters, which prints for each parameter ($1, $2, and so on) the string my ... parameter is ....
>_ a=('one' 'two')
>_ printParameters "${a[*]}"
my 1. parameter is one two
>_ printParameters "${a[#]}"
my 1. parameter is one
my 2. parameter is two
If you would expand the array manually, you would write
${a[*]} as "one two" and
${a[@]} as "one" "two".
There are also differences regarding IFS and so on (see other answers). Usually @ is the better option, but * is faster – use the latter in cases where you have to deal with large arrays and don't need separate arguments.
By the way: The script printParameters can be written as
#! /bin/bash
argIndex=0
for argValue in "$@"; do
echo "my $((++argIndex)). parameter is $argValue"
done
It's very useful for learning more about expansion by trial and error.

zsh parameter expansion syntax: combining default value and conversion to upper case

In a zsh script,
echo ${X:-4711}
outputs the value of the variable X, or 4711 if there is none.
echo ${X:u}
outputs the value of the variable X, converted to upper case.
I wonder, whether there is a way to combine the two, i.e. to have the effect of
tmp=${X:-4711}
echo ${tmp:u}
without introducing an auxiliary variable.
$ echo ${${X:-4711}:u}
4711
$ X=hello
$ echo ${${X:-4711}:u}
HELLO
From man zshexpn:
If a `${...}` type parameter expression or a `$(...)` type command
substitution is used in place of name above, it is expanded first and
the result is used as if it were the value of name. Thus it is possible
to perform nested operations: `${${foo#head}%tail}` substitutes the value
of `$foo` with both 'head' and 'tail' deleted.
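The nesting combines with other modifiers as well; a quick sketch in the same style:
$ unset X
$ echo ${${X:-Some Default}:l}
some default
$ X=MiXeD
$ echo ${${X:-Some Default}:l}
mixed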

bash getopts - difference between ${OPTARG} and $OPTARG [duplicate]

In shell scripts, when do we use {} when expanding variables?
For example, I have seen the following:
var=10 # Declare variable
echo "${var}" # One use of the variable
echo "$var" # Another use of the variable
Is there a significant difference, or is it just style? Is one preferred over the other?
In this particular example, it makes no difference. However, the {} in ${} are useful if you want to expand the variable foo in the string
"${foo}bar"
since "$foobar" would instead expand the variable identified by foobar.
Curly braces are also unconditionally required when:
expanding array elements, as in ${array[42]}
using parameter expansion operations, as in ${filename%.*} (remove extension)
expanding positional parameters beyond 9: "$8 $9 ${10} ${11}"
Doing this everywhere, instead of just in potentially ambiguous cases, can be considered good programming practice. This is both for consistency and to avoid surprises like $foo_$bar.jpg, where it's not visually obvious that the underscore becomes part of the variable name.
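A quick sketch of that ambiguity (hypothetical variable values):
foo=photo
bar=42
echo "$foo_$bar.jpg"      # expands the unset variable foo_, then bar: prints "42.jpg"
echo "${foo}_${bar}.jpg"  # prints "photo_42.jpg"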
Variables are declared and assigned without $ and without {}. You have to use
var=10
to assign. In order to read from the variable (in other words, 'expand' the variable), you must use $.
$var # use the variable
${var} # same as above
${var}bar # expand var, and append "bar" too
$varbar # same as ${varbar}, i.e. expand a variable called varbar, if it exists.
This has confused me sometimes - in other languages we refer to the variable in the same way, regardless of whether it's on the left or right of an assignment. But shell-scripting is different, $var=10 doesn't do what you might think it does!
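For example, with var already set to 10, the shell expands $var first and then tries to run the result as a command:
$ var=10   # assignment: no dollar sign on the left-hand side
$ $var=10  # expansion happens first, so this tries to execute "10=10"
bash: 10=10: command not found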
You use {} for grouping. The braces are required to dereference array elements. Example:
dir=(*) # store the contents of the directory into an array
echo "${dir[0]}" # get the first entry.
echo "$dir[0]" # incorrect
You are also able to do some text manipulation inside the braces:
STRING="./folder/subfolder/file.txt"
echo ${STRING} ${STRING%/*/*}
Result:
./folder/subfolder/file.txt ./folder
or
STRING="This is a string"
echo ${STRING// /_}
Result:
This_is_a_string
You are right that the braces are not needed for "regular variables"... but they do make a script easier to debug and read.
Curly braces are always needed for accessing array elements and carrying out brace expansion.
There is no need to be over-cautious and use {} for shell variable expansion even when there is no scope for ambiguity.
For example:
dir=log
prog=foo
path=/var/${dir}/${prog} # excessive use of {}, not needed since / can't be a part of a shell variable name
logfile=${path}/${prog}.log # same as above, . can't be a part of a shell variable name
path_copy=${path} # {} is totally unnecessary
archive=${logfile}_arch # {} is needed since _ can be a part of shell variable name
So, it is better to write the three lines as:
path=/var/$dir/$prog
logfile=$path/$prog.log
path_copy=$path
which is definitely more readable.
Since a variable name can't start with a digit, the shell doesn't need {} around numbered variables (like $1, $2 etc.) unless such an expansion is followed by a digit. That's too subtle, and it makes sense to explicitly use {} in such contexts:
set app # set $1 to app
fruit=$1le # sets fruit to apple, but confusing
fruit=${1}le # sets fruit to apple, makes the intention clear
See:
Allowed characters in Linux environment variable names
The end of the variable name is usually signified by a space or newline. But what if we don't want a space or newline after printing the variable value? The curly braces tell the shell interpreter where the end of the variable name is.
Classic Example 1) - shell variable without trailing whitespace
TIME=10
# WRONG: no such variable called 'TIMEsecs'
echo "Time taken = $TIMEsecs"
# What we want is $TIME followed by "secs" with no whitespace between the two.
echo "Time taken = ${TIME}secs"
Example 2) Java classpath with versioned jars
# WRONG - no such variable LATEST_VERSION_src
CLASSPATH=hibernate-$LATEST_VERSION_src.zip:hibernate_$LATEST_VERSION.jar
# RIGHT
CLASSPATH=hibernate-${LATEST_VERSION}_src.zip:hibernate_$LATEST_VERSION.jar
(Fred's answer already states this but his example is a bit too abstract)
Following SierraX and Peter's suggestion about text manipulation, curly brackets {} are used to pass a variable to a command, for instance:
Let's say you have a sposi.txt file containing the first line of a well-known Italian novel:
> sposi="somewhere/myfolder/sposi.txt"
> cat $sposi
Output: quel ramo del lago di como che volge a mezzogiorno
Now create two variables:
# Search the 2nd word found in the file that "sposi" variable points to
> word=$(cat $sposi | cut -d " " -f 2)
# This variable will replace the word
> new_word="filone"
Now substitute the word variable content with the one of new_word, inside sposi.txt file
> sed -i "s/${word}/${new_word}/g" $sposi
> cat $sposi
Output: quel filone del lago di como che volge a mezzogiorno
The word "ramo" has been replaced.
