Delete final positional argument from command in bash - linux

I have a script called dosmt where I input a couple args and then print something:
if [ "${#: -1}" == "--ut" ]; then
echo "Hi"
fi
What I'm trying to do is delete the last positional argument which is --ut if the statement is true. So if my input were to be $ dosmt hello there --ut, it would echo Hi, but if I were to print the args after, I just want to have hello there. So basically I'm trying to delete the last argument for good and I tried using shift but that's only temporary so that doesn't work...

First, let's set the parameters that you want:
$ set -- hello there --ut
We can verify that the parameters are correct:
$ echo "$#"
hello there --ut
Now, let's remove the last value:
$ set -- "${#: 1: $#-1}"
We can verify that the last value was successfully removed:
$ echo "$#"
hello there
Demonstration in a script
To demonstrate this as part of a script:
$ cat script
#!/bin/bash
echo Initial values="$@"
set -- "${@: 1: $#-1}"
echo Final values="$@"
We can run with your arguments:
$ script hello there --ut
Initial values=hello there --ut
Final values=hello there
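
Putting the pieces together, here is a sketch of what dosmt itself could look like, combining the --ut test from the question with the removal shown above (the $# guard is an addition, so the test is skipped when no arguments are given):
#!/bin/bash
# Sketch of dosmt: strip a trailing --ut flag for good
if [ "$#" -gt 0 ] && [ "${@: -1}" == "--ut" ]; then
echo "Hi"
set -- "${@: 1: $#-1}"
fi
echo "$@"
Called as dosmt hello there --ut, this prints Hi followed by hello there.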

Related

Using a for loop input to rename an object within the loop in bash

I am currently trying to rename an input argument by the variable "i" in the following for loop:
cd $1
num=$(echo $#)
echo $num
echo $#
echo "This is the next part where I print stuff"
for i in $(seq 2 $num)
do
echo $i
echo ${!i}
Args_array+=$(printf '${arg_%s[@]}' ${i})
echo $Args_array
arg_${i}=$(ls ${!i})
done
The output is as follows:
4
output_folder /path/to/my.tsv /path/to/my2.tsv /path/to/my3.tsv
2
/path/to/my.tsv
${arg_2[@]}
/var/spool/slurm/d/job6985121/slurm_script: line 23: arg_2=/path/to/my.tsv: No such file or directory
But it will not allow me to rename the $2, $3 arguments with "i" like this. Any help would be appreciated.
I want to pass these arguments into R and have to put them in arg_1, arg_2, etc. format.
Not sure I understand what's being attempted with Args_array, so I'll focus solely on OP's comment ('have to put them in arg_1, arg_2') and skip arg_1, since OP's code doesn't appear to store $1 anywhere. Then again, is R not capable of processing input parameters from the command line?
One bash idea:
$ cat testme
#!/usr/bin/bash
num=$#
for ((i=2;i<=$num;i++))
do
declare args_$i=${!i}
done
for ((i=2;i<=$num;i++))
do
typeset -p args_$i
done
Taking for a test drive:
$ testme output_folder /path/to/my.tsv /path/to/my2.tsv /path/to/my3.tsv
declare -- args_2="/path/to/my.tsv"
declare -- args_3="/path/to/my2.tsv"
declare -- args_4="/path/to/my3.tsv"
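
If the numbered names are only needed as a hand-off format, a plain bash array may be simpler than dynamically created variables; a rough sketch of that alternative (just an option, not what was asked for):
#!/usr/bin/bash
# Collect $2 .. $N into one array instead of creating args_2, args_3, ...
args=("${@:2}")
# Array indices start at 0, so element i corresponds to positional parameter i+2
for i in "${!args[@]}"
do
printf 'args_%d=%s\n' "$((i + 2))" "${args[i]}"
done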

How to create a string=$* without arguments $1 and $2

I have a script that takes in several arguments.
I need everything but $1 and $2 in a string.
I have tried this:
message="$*"
words= $(grep -v "$2"|"$3" $message)
but it doesn't work, it gives me the error:
./backup: line 26: First: command not found
Use shift 2 to shift the arguments along (it drops the first n arguments).
If you need "$1" and "$2" for later, save them in variables first.
Note that in shell, assignments to variables cannot have whitespace either side of the =.
First=$1
Second=$2
shift 2
Message=$*
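For illustration, a minimal sketch of a script built that way, with an echo just to show what ends up where:
#!/bin/bash
First=$1
Second=$2
shift 2
Message=$*
echo "First=$First Second=$Second Message=$Message"
Run as ./backup one two three four five, this prints First=one Second=two Message=three four five.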
Maybe something like this?
[root@tsekmanrhel771 ~]# cat ./skip1st2.sh
#!/bin/bash
COUNT=0
for ARG in "$#"
do
COUNT=$((COUNT + 1))
if [ ${COUNT} -gt 2 ]; then
RESULT="${RESULT} ${ARG}"
fi
done
echo ${RESULT}
[root@tsekmanrhel771 ~]# ./skip1st2.sh first second third 4 5 6 7
third 4 5 6 7
You can use a subarray:
$ set -- arg1 arg2 arg3 arg4
$ str=${*:3}
$ echo "$str"
arg3 arg4
More often than not, it's good practice to preserve the arguments as separate elements, though, which you can do by using $@ and assigning to a new array:
$ arr=("${@:3}")
$ declare -p arr
declare -a arr=([0]="arg3" [1]="arg4")
Notice that in str=${*:3}, quoting isn't necessary, but in arr=("${@:3}"), it is (or the arguments would be split on whitespace).
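To see that splitting with an argument that contains a space (a made-up example):
$ set -- arg1 arg2 'arg three' arg4
$ arr=("${@:3}")
$ declare -p arr
declare -a arr=([0]="arg three" [1]="arg4")
$ arr=(${@:3})
$ declare -p arr
declare -a arr=([0]="arg" [1]="three" [2]="arg4")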
As for your error message: your command
words= $(grep -v "$2"|"$3" $message)
does the following:
It sets a variable words to the empty string for the environment of the command (because there is a blank after =).
It tries to set up a pipeline consisting of two commands, grep -v "$2" and "$3" $message. The first of these commands would just hang and wait for input; the second one tries to run the contents of $3 as a command; presumably, based on your error message, $3 contains First.
If the pipeline actually ran, its output would then be run as a command (again because of the blank to the right of the =).
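The immediate syntax fix is to remove the blank after the =, so that the output of the command substitution is assigned to words instead of being executed; a minimal illustration (the grep pattern is only a placeholder):
# the command substitution's output is assigned to words, not run as a command
words=$(printf '%s\n' $message | grep -v "$2")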

Define bash variable to be evaluated every time it is used

I want to define a bash variable which will be evaluated every time it is used.
My goal is to define two variables:
A=/home/userA
B=$A/my_file
So whenever I update A, B will be updated with the new value of A
I know how to do it with prompt variables, but is there a way to do it for regular variables?
If you have Bash 4.4 or newer, you could (ab)use the ${parameter@P} parameter expansion, which expands parameter as if it were a prompt string:
$ A='/home/userA'
$ B='$A/my_file' # Single quotes to suppress expansion
$ echo "${B#P}"
/home/userA/my_file
$ A='/other/path'
$ echo "${B#P}"
/other/path/my_file
However, as pointed out in the comments, it's much simpler and more portable to use a function instead:
$ appendfile() { printf '%s/%s\n' "$1" 'my_file'; }
$ A='/home/user'
$ B=$(appendfile "$A")
$ echo "$B"
/home/user/my_file
$ A='/other/path'
$ B=$(appendfile "$A")
$ echo "$B"
/other/path/my_file
No. Use a simple and robust function instead:
b() {
echo "$a/my_file"
}
a="/home/userA"
echo "b outputs $(b)"
a="/foo/bar"
echo "b outputs $(b)"
Result:
b outputs /home/userA/my_file
b outputs /foo/bar/my_file
That said, here's one ugly way of fighting the system to accomplish your goal verbatim:
# Trigger a re-assignment after every single command
trap 'b="$a/my_file"' DEBUG
a="/home/userA"
echo "b is $b"
a="/foo/bar"
echo "b is $b"
Result:
b is /home/userA/my_file
b is /foo/bar/my_file

"printf -v" inside function not working with redirected output

With bash 4.1.2 and 4.3.48, the following script gives the expected output:
#!/bin/bash
returnSimple() {
local __resultvar=$1
printf -v "$__resultvar" '%s' "ERROR"
echo "Hello World"
}
returnSimple theResult
echo ${theResult}
echo Done.
Output as expected:
$ ./returnSimple
Hello World
ERROR
Done.
However, when stdout from the function is piped to another process, the assignment of the __resultvar variable does not work anymore:
#!/bin/bash
returnSimple() {
local __resultvar=$1
printf -v "$__resultvar" '%s' "ERROR"
echo "Hello World"
}
returnSimple theResult | cat
echo ${theResult}
echo Done.
Unexpected Output:
$ ./returnSimple
Hello World
Done.
Why does printf -v not work in the second case? Should printf -v not write the value into the result variable independent of whether the output of the function is piped to another process?
See man bash, section on Pipelines:
Each command in a pipeline is executed as a separate process (i.e., in a subshell).
That's why, when you write cmd | cat, cmd works on a copy of the variable and cannot modify the original in the parent shell.
A simple demo:
$ test() ((a++))
$ echo $a
$ test
$ echo $a
1
$ test | cat
$ echo $a
1
Interestingly enough, the same also happens when using eval $__resultvar="'ERROR'" instead of the printf -v statement. Thus, this is not a printf related issue.
Instead, adding an echo $BASH_SUBSHELL to both the main script and the function shows that the shell spawns a subshell in the second case, since it needs to pipe the output from the function to another process. Hence the function runs in a subshell:
#!/bin/bash
returnSimple() {
local __resultvar=$1
echo "Sub shell level: $BASH_SUBSHELL"
printf -v "$__resultvar" '%s' "ERROR"
}
echo "Sub shell level: $BASH_SUBSHELL"
returnSimple theResult | cat
echo ${theResult}
echo Done.
Output:
% ./returnSimple.sh
Sub shell level: 0
Sub shell level: 1
Done.
This is the reason why any variable assignments from within the function are not passed back to the calling script.
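
If the function's stdout really does have to go through another command, one possible workaround (a sketch, not part of the original question or answer) is to redirect into a process substitution instead of piping, so that the function itself keeps running in the current shell:
#!/bin/bash
returnSimple() {
local __resultvar=$1
printf -v "$__resultvar" '%s' "ERROR"
echo "Hello World"
}
# cat runs in a separate process, but returnSimple runs in this shell,
# so the printf -v assignment is still visible afterwards
returnSimple theResult > >(cat)
echo "${theResult}"
echo Done.
Note that cat now writes asynchronously, so its "Hello World" may interleave with the script's own output.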

Unset an argument variable

I have created a script in sh shell.
#script.sh
echo $1
if [ x$1 = 'x' ]
then
echo CODE1
else
echo CODE2
fi
1) If I run it using . ./script.sh
OUTPUT: CODE1
2) If I run it like . ./script.sh arg1
OUTPUT: arg1
CODE2
3) If I run it again using . ./script.sh
then it gives me
OUTPUT: arg1
CODE2
I think the 3rd has the same output as the 2nd because I am running the 3rd in the same shell as the 2nd, so $1 is not deallocated and the 3rd is actually using the value of $1 set by the 2nd.
But if I try to deallocate it using unset 1, the shell gives an 'unknown identifier' error.
How can I deallocate this variable $1?
OR
How can I set it to null?
By sourcing your script with ., you're running it in the context of your current shell. If you just don't do that, none of these problems will happen:
$ ./script.sh
CODE1
$ ./script.sh arg1
arg1
CODE2
$ ./script.sh
CODE1
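
If the script really does have to be sourced, the positional parameters of the current shell can also be cleared explicitly with set --, for example:
$ . ./script.sh arg1
arg1
CODE2
$ set --
$ echo "$#"
0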
