Passing quoted strings as arguments to a function - linux

I would like to find the answer to a probably quite simple question: I would like to pass quoted strings, with whitespace inside, as standalone arguments to a function.
There is the following file with data (for example):
one
two three
four five six
seven
And there is a script with 2 simple functions:
params_checker()
{
    local first_row="$1"
    local second_row="$2"
    local third_row="$3"
    echo "Expected args are:${first_row} ; ${second_row} ; ${third_row}"
    echo "All args are:"
    for arg in "$@"; do
        echo "${arg}"
    done
}
read_from_file()
{
    local args_string
    while read line; do
        args_string="${args_string} \"${line}\""
        echo "Read row: ${line}"
    done < ./test_input
    params_checker ${args_string}
}
read_from_file
In other words, I would like to pass the rows of the text file as arguments to the function params_checker (each row of the file as a separate parameter; I need to keep the whitespace within the rows). My attempt to build a combined string with quoted "substrings" failed, and the output was:
~/test_sh$ sh test_process.sh
Read row: one
Read row: two three
Read row: four five six
Read row: seven
Expected args are:"one" ; "two ; three"
All args are:
"one"
"two
three"
"four
five
six"
"seven"
The expectation is $1="one", $2="two three", $3="four five six" ...
Quoting ${args_string} when passing it to params_checker gave a different result: the whole string is passed as a single argument.
Could you please help me find the correct way to pass such whitespace-containing strings from a file as separate standalone function arguments?
Thanks a lot for your help!

In bash/ksh/zsh you'd use an array. In sh, you can use the positional parameters "$1", "$2" etc.:
read_from_file()
{
    set --                   # Clear the positional parameters
    while read line; do
        set -- "$@" "$line"  # Append to the parameters
        echo "Read row: ${line}"
    done < ./test_input
    params_checker "$@"      # Pass all parameters
}

There you go, this should give you what you are looking for:
#!/bin/bash
params_checker()
{
    local first_row="$1"
    local second_row="$2"
    local third_row="$3"
    local fourth_row="$4"
    echo "Expected args are: ${first_row} ; ${second_row} ; ${third_row} ; ${fourth_row}"
    echo "All args are:"
    for i in "$@"
    do
        echo "$i"
    done
}
read_from_file()
{
    ARRAY=()
    while read line; do
        echo "Read row: ${line}"
        ARRAY+=("$line")
    done < ./test_input
    params_checker "${ARRAY[@]}"
}
read_from_file
That should work fine in bash. If your file is named test.sh, you can run it like this: ./test.sh
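With the test_input from the question, the script should print something like this (the output is reconstructed here from the code, not taken from an actual run):
Read row: one
Read row: two three
Read row: four five six
Read row: seven
Expected args are: one ; two three ; four five six ; seven
All args are:
one
two three
four five six
seven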

Related

How To Pass An Array To A Function In Bash [duplicate]

As we know, in bash programming the way to pass arguments is $1, ..., $N. However, I found it not easy to pass an array as an argument to a function which receives more than one argument. Here is one example:
f(){
    x=($1)
    y=$2
    for i in "${x[@]}"
    do
        echo $i
    done
    ....
}
a=("jfaldsj jflajds" "LAST")
b=NOEFLDJF
f "${a[#]}" $b
f "${a[*]}" $b
As described, function f receives two arguments: the first is assigned to x, which is an array; the second to y.
f can be called in two ways. The first way uses "${a[@]}" as the first argument, and the result is:
jfaldsj
jflajds
The second way uses "${a[*]}" as the first argument, and the result is:
jfaldsj
jflajds
LAST
Neither result is as I wished. So, does anyone have any idea how to pass arrays between functions correctly?
You cannot pass an array, you can only pass its elements (i.e. the expanded array).
#!/bin/bash
function f() {
    a=("$@")
    ((last_idx=${#a[@]} - 1))
    b=${a[last_idx]}
    unset a[last_idx]
    for i in "${a[@]}" ; do
        echo "$i"
    done
    echo "b: $b"
}
x=("one two" "LAST")
b='even more'
f "${x[@]}" "$b"
echo ===============
f "${x[*]}" "$b"
The other possibility would be to pass the array by name:
#!/bin/bash
function f() {
    name=$1[@]
    b=$2
    a=("${!name}")
    for i in "${a[@]}" ; do
        echo "$i"
    done
    echo "b: $b"
}
x=("one two" "LAST")
b='even more'
f x "$b"
You can pass an array by name reference to a function in bash (since version 4.3+), by setting the -n attribute:
show_value () # array index
{
    local -n myarray=$1
    local idx=$2
    echo "${myarray[$idx]}"
}
This works for indexed arrays:
$ shadock=(ga bu zo meu)
$ show_value shadock 2
zo
It also works for associative arrays:
$ declare -A days=([monday]=eggs [tuesday]=bread [sunday]=jam)
$ show_value days sunday
jam
See also nameref or declare -n in the man page.
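Since the nameref aliases the caller's variable, assignments made through it also modify the original array. A minimal sketch (the function name set_value is invented here for illustration):
set_value () # array index value
{
    local -n myarray=$1
    myarray[$2]=$3    # writes through the nameref into the caller's array
}

shadock=(ga bu zo meu)
set_value shadock 2 ZO
echo "${shadock[2]}"  # prints: ZO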
You could pass the "scalar" value first. That would simplify things:
f(){
    b=$1
    shift
    a=("$@")
    for i in "${a[@]}"
    do
        echo "$i"
    done
    ....
}
a=("jfaldsj jflajds" "LAST")
b=NOEFLDJF
f "$b" "${a[#]}"
At this point, you might as well use the array-ish positional params directly
f(){
    b=$1
    shift
    for i in "$@"   # or simply: for i; do
    do
        echo "$i"
    done
    ....
}
f "$b" "${a[@]}"
This will solve the issue of passing an array to a function:
#!/bin/bash
foo() {
    string=$1
    array=("$@")
    echo "array is ${array[@]}"
    echo "array is ${array[1]}"
    return
}
array=( one two three )
foo "${array[@]}"
colors=( red green blue )
foo "${colors[@]}"
Try it like this:
function parseArray {
    array=("$@")
    for data in "${array[@]}"
    do
        echo "${data}"
    done
}
array=("value" "value1")
parseArray "${array[@]}"
Pass the array as a function
array() {
    echo "apple pear"
}
printArray() {
    local argArray="${1}"
    local array=($($argArray)) # where the magic happens. careful of the surrounding brackets.
    for arrElement in "${array[@]}"; do
        echo "${arrElement}"
    done
}
printArray array
Here is an example where I receive 2 bash arrays into a function, as well as additional arguments after them. This pattern can be continued indefinitely for any number of bash arrays and any number of additional arguments, accommodating any input argument order, so long as the length of each bash array comes just before the elements of that array.
Function definition for print_two_arrays_plus_extra_args:
# Print all elements of a bash array.
# General form:
#   print_one_array array1
# Example usage:
#   print_one_array "${array1[@]}"
print_one_array() {
    for element in "$@"; do
        printf " %s\n" "$element"
    done
}
# Print all elements of two bash arrays, plus two extra args at the end.
# General form (notice the length MUST come before the array in order
# to be able to parse the args!):
#   print_two_arrays_plus_extra_args array1_len array1 array2_len array2 \
#       extra_arg1 extra_arg2
# Example usage:
#   print_two_arrays_plus_extra_args "${#array1[@]}" "${array1[@]}" \
#       "${#array2[@]}" "${array2[@]}" "hello" "world"
print_two_arrays_plus_extra_args() {
    i=1
    # Read array1_len into a variable
    array1_len="${@:$i:1}"
    ((i++))
    # Read array1 into a new array
    array1=("${@:$i:$array1_len}")
    ((i += $array1_len))
    # Read array2_len into a variable
    array2_len="${@:$i:1}"
    ((i++))
    # Read array2 into a new array
    array2=("${@:$i:$array2_len}")
    ((i += $array2_len))
    # You can now read the extra arguments all at once and gather them into a
    # new array like this:
    extra_args_array=("${@:$i}")
    # OR you can read the extra arguments individually into their own variables
    # one-by-one like this
    extra_arg1="${@:$i:1}"
    ((i++))
    extra_arg2="${@:$i:1}"
    ((i++))
    # Print the output
    echo "array1:"
    print_one_array "${array1[@]}"
    echo "array2:"
    print_one_array "${array2[@]}"
    echo "extra_arg1 = $extra_arg1"
    echo "extra_arg2 = $extra_arg2"
    echo "extra_args_array:"
    print_one_array "${extra_args_array[@]}"
}
Example usage:
array1=()
array1+=("one")
array1+=("two")
array1+=("three")
array2=("four" "five" "six" "seven" "eight")
echo "Printing array1 and array2 plus some extra args"
# Note that `"${#array1[@]}"` is the array length (number of elements
# in the array), and `"${array1[@]}"` is the array (all of the elements
# in the array)
print_two_arrays_plus_extra_args "${#array1[@]}" "${array1[@]}" \
    "${#array2[@]}" "${array2[@]}" "hello" "world"
Example Output:
Printing array1 and array2 plus some extra args
array1:
one
two
three
array2:
four
five
six
seven
eight
extra_arg1 = hello
extra_arg2 = world
extra_args_array:
hello
world
For further examples and detailed explanations of how this works, see my longer answer on this topic here: Passing arrays as parameters in bash
You can also create a json file with an array, and then parse that json file with jq
For example:
my-array.json:
{
"array": ["item1","item2"]
}
script.sh:
ARRAY=$(jq -r '."array"' "$1" | tr -d '[],"')
And then call the script like:
script.sh ./path-to-json/my-array.json
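If the array items may themselves contain spaces, the tr-based stripping above will mangle them. A sketch of a more robust variant, assuming bash 4+ and the same my-array.json layout:
# '.array[]' prints one element per line; readarray -t keeps each
# line as one array element, preserving internal whitespace.
readarray -t ARRAY < <(jq -r '.array[]' "$1")
printf '%s\n' "${ARRAY[@]}"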

Is there a way to pass multiple arguments as a single string in bash?

I'm running a program that takes variable numbers of arguments with the same flag. For example
myprogram -args 'var1' 'var2' 'var3' 'var4'
myprogram -args 'var5' 'var6'
I have to launch this program several times with different sets of arguments provided in a test.txt file.
arg1 arg2 arg3
arg5 arg6
arg7
arg8 arg9 arg9 arg10
The program must be inside its own script to request resources in our HPCC.
while read p; do
    launchmyprogram.sh "$p"
done < test.txt
I know I can use the var1=$1 syntax inside launchmyprogram.sh to collect and allocate the variables, but this cannot handle a variable number of arguments, and I'd have to create a script for each line. Is there a way to create a bash script that takes a variable number of arguments?
Use arrays to store dynamically sized sequences of strings. Bash has two ways of reading input into an array:
readarray -t somearray turns lines of an entire input file into array elements.
read -a somearray turns tokens of a single line of input into array elements.
In this case you can use the latter. Here’s a runnable MWE:
myprogram() {
    local -i i
    echo "Got ${#} arguments."
    for ((i = 1; i <= $#; ++i)); do
        echo "Argument No. ${i} is '${!i}'."
    done
}
while read -ra args; do
    myprogram -args "${args[@]}"
done <<-INPUT
arg1 arg2 arg3
arg5 arg6
arg7
arg8 arg9 arg9 arg10
INPUT
That way the arguments from each line are kept separate, as the output suggests:
Got 4 arguments.
Argument No. 1 is '-args'.
Argument No. 2 is 'arg1'.
Argument No. 3 is 'arg2'.
Argument No. 4 is 'arg3'.
Got 3 arguments.
Argument No. 1 is '-args'.
Argument No. 2 is 'arg5'.
Argument No. 3 is 'arg6'.
Got 2 arguments.
Argument No. 1 is '-args'.
Argument No. 2 is 'arg7'.
Got 5 arguments.
Argument No. 1 is '-args'.
Argument No. 2 is 'arg8'.
Argument No. 3 is 'arg9'.
Argument No. 4 is 'arg9'.
Argument No. 5 is 'arg10'.
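For completeness, here is a sketch of the readarray variant mentioned above, under the same assumptions: readarray collects whole lines, so each line still has to be split into tokens afterwards.
readarray -t lines < test.txt
for line in "${lines[@]}"; do
    read -ra args <<< "$line"    # split one line into tokens
    myprogram -args "${args[@]}"
done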
You can use $# to query the number of arguments passed to launchmyprogram.sh. Something like
if [ $# -eq 1 ]; then
    echo "one argument"
elif [ $# -eq 2 ]; then
    echo "two arguments"
elif [ $# -eq 3 ]; then
    echo "three arguments"
else
    echo "too many arguments"
    exit 1
fi
All bash scripts take a variable number of arguments; your question is about how to access them. The simplest method is:
for arg; do
    my-cmd "$arg"
done
This repeats my-cmd with each argument, individually. You can use this loop in launchmyprogram.sh. You can also put the relevant code in a function, and use just the function inside the loop.
Parsing arguments from a file is more complicated. If the arguments aren't quoted for the shell, and don't contain spaces or wildcard characters ([]?*), you could just unquote $p in your example. It will be split on whitespace into multiple arguments.
In this case you could also just parse the whole file in launchmyprogram.sh:
for arg in $(<test.txt); do
    my-cmd "$arg"
done
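If the file might contain wildcard characters, one way to keep the word splitting but suppress globbing is set -f (a sketch; set -f is standard POSIX shell):
set -f                       # disable pathname expansion (globbing)
for arg in $(<test.txt); do
    my-cmd "$arg"
done
set +f                       # re-enable it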
This is basically Andrej's answer, simplified and made a little more directly related to the structure of the question:
while read -ra p; do          # read and parse each line into an array
    myprogram -args "${p[@]}" # pass the elements as separate values
done < test.txt               # after reading them in as one line

Expanding a string with a variable reference later, after the variable is assigned

I'm trying to combine two lists containing names (if available) and emails with a standard email text in bash (shell).
(I had to delete the irrelevant code as it contains some private info, so some of the code might look unusual.)
The first half of the code checks if there is a name list along with the email list.
The second half combines only the email address and text if no name is available, if the name list is available it also 'tries' to combine the name, email and text.
f1 = email list and f2 = name list.
As you can see in the first half of the code below, $f2 should show the names if the list is available but it does not show anything in the log file.
I've been trying to sort this problem out for two days but nothing has worked. When names are available the output is always "Hello ..." when it should be "Hello John D..."
#FIRST HALF
if [ "$names" = "no" ]
then
    text="Hello..."
elif [ "$names" = "yes" ]
then
    text="Hello $f2..."
fi
#SECOND HALF
if [ "$names" = "no" ]
then
    for i in $(cat $emaillist); do
        echo "$text" >> /root/log
        echo "$i" >> /root/log
    done
elif [ "$names" = "yes" ]
then
    paste $emaillist $namelist | while IFS="$(printf '\t')" read -r f1 f2
    do
        echo "$text" >> /root/log
        echo "$f1" >> /root/log
    done
fi
When you run text="Hello $f2", $f2 is looked up at the time of the assignment; an exact string is assigned to text, and only that exact string is used later, on echo "$text".
This is very desirable behavior: If shell variables' values could run arbitrary code, it would be impossible to write shell scripts that handled untrusted data safely... but it does mean that implementing your program requires some changes.
If you want to defer evaluation (looking up the value of $f2 at expansion time rather than assignment), don't use a shell variable at all: Use a function instead.
case $names in
    yes) write_greeting() { echo "Hello $name..."; };;
    *)   write_greeting() { echo "Hello..."; };;
esac

while read -r name <&3 && read -r email <&4; do
    write_greeting
    echo "$email"
done 3<"$namelist" 4<"$emaillist" >>/root/log
Some enhancements in the code above:
You don't need paste to read from two streams in lockstep; you can simply open them on different file descriptors (above, FDs 3 and 4 are chosen; only 0, 1 and 2 are reserved, so larger numbers could have been selected as well) with a separate read command for each.
Opening your output sink only once for the entire loop (by putting the redirection after the done) is far more efficient than re-opening it every time you want to write a single line.
Expansions, such as "$namelist" and "$emaillist", are always quoted; this makes code more reliable if dealing with filenames with unusual characters (including spaces and glob expressions), or if IFS is at a non-default value.
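An equivalent sketch that passes the name as an explicit parameter instead of having write_greeting read the global $name (the function name greeting is invented here for illustration):
greeting() {
    if [ "$names" = yes ]; then
        echo "Hello $1..."
    else
        echo "Hello..."
    fi
}
# inside the loop: greeting "$name", followed by echo "$email"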

Shell Script that performs different functions based on input from file

I am trying to merge two very different scripts together for consolidation and ease of use purposes. I have an idea of how I want these scripts to look and operate, but I could use some help getting started. Here is the flow and look of the script:
The input file would be a standard text file with this syntax:
#Vegetables
Broccoli|Green|14
Carrot|Orange|9
Tomato|Red|7
#Fruits
Apple|Red|15
Banana|Yellow|5
Grape|Purple|10
The script would take this file as input. It would ignore the commented portions, but use them to dictate the output. So, based on the fact that an entry is a Vegetable, it would perform a specific function with the values listed between the delimiters (|). Then it would go to the Fruits and do something different with the values, based on that delimiter. Perhaps I would add Vegetable/Fruit to one of the values and, depending on that value, it would perform the function while in this loop reading the file. Thank you for your help in getting this started.
UPDATE:
So I am trying to implement the IFS setup and thought of a more logical arrangement. The input file will have the "categories" displayed within the parameters. So the setup will be like this:
Vegetable|Carrot|Yellow
Fruit|Apple|Red
Vegetable|Tomato|Red
From there, the script will read in the lines and perform the function. So basically this type of setup in shell:
while IFS='|' read -r category item color
do
    if [[ $category == "Vegetable" ]] ; then
        echo "The $item is $color"
    elif [[ $category == "Fruit" ]] ; then
        echo "The $item is $color"
    else
        echo "Bad input"
    fi
done < "$input_file"
Something along those lines...I am just having trouble putting it all together.
Use read to input the lines. Do a case statement on their prefix:
{
    while read DATA; do
        case "$DATA" in
            \#*) ... switch function ... ;;
            *)   eval "$FUNCTION" ;;
        esac
    done
} <inputfile
Depending on your problem, you might want to experiment with setting $IFS before reading, and read multiple variables in one go.
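For example, with the '|'-delimited lines shown in the question's update, setting IFS lets read split each line into fields directly (a sketch, assuming the Vegetable|Carrot|Yellow layout):
while IFS='|' read -r category item color; do
    case "$category" in
        \#*) ;;                         # comment line: switch handling here
        *)   echo "The $item is $color" ;;
    esac
done < inputfile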
You can redefine the processing function each time you meet a # directive:
#! /bin/bash
while read line ; do
    if [[ $line == '#Vegetables' ]] ; then
        process () {
            echo Vegetables: "$@"
        }
    elif [[ $line == '#Fruits' ]] ; then
        process () {
            echo Fruits: "$@"
        }
    else
        process $line
    fi
done < "$1"
Note that the script does not skip empty lines.
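If blank lines should be skipped, a one-line guard at the top of the read loop is enough (a sketch):
while read line ; do
    [[ -z $line ]] && continue   # skip empty lines
    ...
done < "$1"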

Shell Programming: Access Element of List

It is my understanding that when writing a Unix shell program you can iterate through a string like a list with a for loop. Does this mean you can access elements of the string by their index as well?
For example:
foo="fruit vegetable bread"
How could I access the first word of this sentence? I've tried using brackets like in the C-based languages, to no avail, and the solutions I've read online require regular expressions, which I would like to avoid for now.
Pass $foo as arguments to a function. Then you can use $1, $2 and so on to access the corresponding word inside the function.
function try {
    echo $1
}
a="one two three"
try $a
EDIT: another better version is:
a="one two three"
b=( $a )
echo ${b[0]}
EDIT(2): have a look at this thread.
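If you only need the first word, here is a sketch using plain parameter expansion (no arrays, no regular expressions):
foo="fruit vegetable bread"
echo "${foo%% *}"    # strips everything from the first space on -> fruit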
Using arrays is the best solution.
Here's a tricky way using indirect variables
get() { local idx=${!#}; echo "${!idx}"; }
foo="one two three"
get $foo 1 # one
get $foo 2 # two
get $foo 3 # three
Notes:
$# is the number of parameters given to the function (4 in all these cases)
${!#} is the value of the last parameter
${!idx} is the value of the idx'th parameter
You must not quote $foo so the shell can split the string into words.
With a bit of error checking:
get() {
    local idx=${!#}
    if (( $idx < 1 || $idx >= $# )); then
        echo "index out of bounds" >&2
        return 1
    fi
    echo "${!idx}"
}
Please don't actually use this function. Use an array.
