I have a settings.sh file with this line:
HOME="/var/lib/${USER}"
Then I have a shell.sh with these lines (in order):
. ./settings.sh
USER="test"
Can I set $HOME without first setting $USER?
If I use a default USER in settings.sh and then assign it a new value in shell.sh, will that update the HOME variable?
No to both. Statements are executed in order; changing $USER later will not retroactively update $HOME. Makefiles have lazy variable expansion, but shell scripts do not.
You could reorder the statements:
USER="test"
. ./settings.sh
Answers to your questions:
1. No
2. No
You can ($USER then expands to the empty string, so $HOME becomes "/var/lib/")
It will not update your $HOME variable
These are typical computer programming concepts:
int a = 5; // a = 5
int c = a; // c = 5
a = 7; // c is still 5
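If you actually want something like Make's lazy expansion, one option is a shell function, whose body is re-evaluated on every call. A minimal sketch (the function name home is just illustrative, not part of the original scripts):

```shell
#!/bin/sh
# Sketch: emulating lazy expansion with a function instead of a variable.
# The body is re-evaluated on each call, so it always sees the current USER.
home() { printf '/var/lib/%s\n' "$USER"; }

USER="alice"
home    # prints /var/lib/alice
USER="test"
home    # prints /var/lib/test
```

The trade-off is that you must write $(home) instead of $HOME wherever you need the value.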
Not sure if this was worth a question. You can easily try this and answer your own question:
$ A="B IS ${B}"
$ echo $A
B IS
$ B=FOO
$ echo $B
FOO
$ echo $A
B IS
In other words, the right-hand side expression is evaluated on assignment. Which means that answers to both of your questions are no.
What you can do, however, is to perform lazy evaluation. For example:
$ A="B IS \${B}"
$ echo $A
B IS ${B}
$ eval echo $A
B IS
$ B=FOO
$ echo $A
B IS ${B}
$ eval echo $A
B IS FOO
$
Good Luck!
Related
In Racket Scheme, there is a data structure called a "string port" that you can read data from. Is there anything similar in Perl 6? For example, I want to achieve these outcomes:
my $a = "(1,2,3,4,5)"; # if you read from $a, you get a list that you can use;
my $aList=readStringPort($a);
say $aList.WHAT; # (List)
say $aList.elems; # 5 elements
my $b = "[1,2,3]"; # you get an array to use if you read from it;
my $c = "sub ($one) {say $one;}";
$c("Big Bang"); # says Big Bang
The EVAL function does not quite cover the full spectrum of tasks:
> EVAL "1,2,3"
(1 2 3)
> my $a = EVAL "1,2,3"
(1 2 3)
> $a.WHAT
(List)
> my $b = EVAL "sub ($one) {say $one;}";
===SORRY!=== Error while compiling:
Variable '$one' is not declared. Did you mean '&one'?
------> my $b = EVAL "sub (⏏$one) {say $one;}";
Thanks a lot!
lisprog
EVAL does this.
The problem in your last example is that double-quoted strings interpolate $ variables, { blocks, and similar. To represent such things in a string literal, either escape them with backslashes...
my $b = EVAL "sub (\$one) \{say \$one;}";
...or use a non-interpolating string literal:
my $b = EVAL 'sub ($one) {say $one;}';
my $b = EVAL Q[sub ($one) {say $one;}];
I am new to shell scripting. I am working with hex values and writing a simple script for subtraction. Here is my script:
#!/bin/bash
var1=“0x0001”
var2=“0x0005”
var3=“$(( 16#$var2 - 16#$var1 ))”
echo “Diference $var3”
I am getting this error:
line 6: 16#?: value too great for base (error token is "16#?")
Could you please let me know where my mistake is?
$ var1=0x0001
$ var2=0x0005
$ var3=$(( $var2 - $var1 ))
$ echo "Diference $var3"
Diference 4
Assign the hex values without double quotes (i.e. not as strings).
Since you have already put a 0x prefix, there is no need for 16#.
To convert the answer back to hex you can use:
printf '%x' $num
Here is an example:
$ var1=0x19
$ var2=0xA
$ var3=$(( $var1 - $var2 ))
$ echo $var3
15
$ printf '%x\n' $var3
f
$ var3=$(printf '%x' $var3)
$ echo $var3
f
16# and 0x are redundant, and mutually exclusive. The problem is that, due to the 16#, Bash thinks the x is trying to be a digit in a base-16 number (whereas it's only valid in base 34 or higher). Just drop either the 16# or the 0x, and it'll work.
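A quick sketch of the two notations side by side (either works on its own, assuming bash):

```shell
#!/bin/bash
# Either base notation works alone; "16#0x..." combines them and fails.
a=$(( 0x1F ))    # C-style hex literal
b=$(( 16#1F ))   # explicit base-16 prefix
echo "$a $b"     # prints: 31 31
echo $(( 0x0005 - 0x0001 ))    # prints: 4
```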
I have a .txt input file that is the product of a printf, defining each line as POV(n)="sequenceX,yearY":
cat output.PA
POV01="SEQ010,FY15"
POV02="SEQ010,FY16"
POV03="SEQ020,FY15"
POV04="SEQ020,FY16"
How can I source this file so that I can export each POV as the variable value of sequence and fy, respectively for the given line?
export POV(n)="$seq,$fy"
The printf I have used to get to this point is as follows:
cat step1
while read -r seq fy; do
printf 'POV%02d="%s,%s"\n' ${counter} ${seq} ${fy}
(( counter = counter + 1 ))
done <test_scenario_02.txt > output.PA
If I source output.PA I get the following:
./step2
POV00=YEAR,
POV01=SEQ010,FY15
POV02=SEQ010,FY16
POV03=SEQ020,FY15
POV04=SEQ020,FY16
POV05=SEQ030,FY15
POV06=SEQ030,FY16
POV07=SEQ030,FY15
POV08=SEQ030,FY16
POV09=SEQ040,FY15
POV10=SEQ040,FY16
POV11=SEQ050,FY15
POV12=SEQ050,FY16
$ cat step2
. ./output.PA
set | grep "^POV"
It is not at all clear what you want, but it seems like you are trying to create an array variable that holds all the values in output.PA. You probably don't need to do that, but this should work:
$ pov=($(sed -e 's/[^"]*"//' -e 's/"$//' output.PA))
$ echo ${pov[0]}
SEQ010,FY15
$ echo ${pov[1]}
SEQ010,FY16
$ echo ${pov[2]}
SEQ020,FY15
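If you would rather avoid the sed pipeline, another sketch is a while-read loop that strips the wrapper with parameter expansion. This assumes each line has exactly the POVnn="..." shape shown above; the sample-data printf here just stands in for your real output.PA:

```shell
#!/bin/bash
# Stand-in for the real output.PA, in the format shown in the question.
printf '%s\n' 'POV01="SEQ010,FY15"' 'POV02="SEQ010,FY16"' \
              'POV03="SEQ020,FY15"' 'POV04="SEQ020,FY16"' > output.PA

pov=()
while IFS= read -r line; do
    v=${line#*=\"}   # drop everything up to and including ="
    v=${v%\"}        # drop the trailing quote
    pov+=("$v")
done < output.PA

echo "${pov[0]}"     # prints: SEQ010,FY15
echo "${pov[3]}"     # prints: SEQ020,FY16
```

Unlike sourcing the file, this never executes its contents, so a stray line cannot run as code.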
When I run commands in my shell as below, it returns an expr: non-integer argument error. Can someone please explain this to me?
$ x=20
$ y=5
$ expr x / y
expr: non-integer argument
Those are shell variables. To expand them as arguments to another program (i.e. expr), you need to use the $ prefix:
expr $x / $y
The reason it complained is that it thought you were trying to operate on alphabetic characters (i.e. non-integers).
If you are using the Bash shell, you can achieve the same result with arithmetic expansion:
echo $((x / y))
Or:
z=$((x / y))
echo $z
I believe this was already mentioned in other threads:
calc(){ awk "BEGIN { print $* }"; }
then you can simply type:
calc 7.5/3.2
2.34375
In your case it will be:
x=20; y=3;
calc $x/$y
Or, if you prefer, add this as a separate script and make it available in your $PATH so you will always have it in your shell:
#!/bin/bash
calc(){ awk "BEGIN { print $* }"; }
Why not use let? I find it much easier.
Here's an example you may find useful:
start=`date +%s`
# ... do something that takes a while ...
sleep 71
end=`date +%s`
let deltatime=end-start
let hours=deltatime/3600
let minutes=(deltatime/60)%60
let seconds=deltatime%60
printf "Time spent: %d:%02d:%02d\n" $hours $minutes $seconds
Another simple example - calculate number of days since 1970:
let days=$(date +%s)/86400
Referencing Bash Variables Requires Parameter Expansion
The default shell on most Linux distributions is Bash. In Bash, variables must use a dollar sign prefix for parameter expansion. For example:
x=20
y=5
expr $x / $y
Of course, Bash also has arithmetic operators and a special arithmetic expansion syntax, so there's no need to invoke the expr binary as a separate process. You can let the shell do all the work like this:
x=20; y=5
echo $((x / y))
To get the numbers after the decimal point, you can do this:
read num1 num2
div=`echo $num1 / $num2 | bc -l`
echo $div
Let's suppose:
x=50
y=5
Then:
z=$((x/y))
This will work properly.
But if you want to use the / operator in a case statement, it may not resolve the way you expect; in that case use simple strings like div or divide instead.
I am having some issues with word-splitting in bash variable expansion. I want to be able to store an argument list in a variable and run it, but any quoted multiword arguments aren't evaluating how I expected them to.
I'll explain my problem with an example. Let's say I had a function decho that printed each positional parameter on its own line:
#!/bin/bash -u
while [ $# -gt 0 ]; do
echo "$1"
shift
done
Ok, if I go decho a b "c d" I get:
[~]$ decho a b "c d"
a
b
c d
Which is what I expect and want. But on the other hand if I get the arguments list from a variable I get this:
[~]$ args='a b "c d"'
[~]$ decho $args
a
b
"c
d"
Which is not what I want. I can go:
[~]$ echo decho $args | bash
a
b
c d
But that seems a little clunky. Is there a better way to make the expansion of $args in decho $args be word-split the way I expected?
You can use:
eval decho $args
You can move the eval inside the script:
#!/bin/bash -u
eval set -- $*
for i; do
    echo "$i"
done
Now you can do:
$ args='a b "c d"'
$ decho $args
a
b
c d
but you'll have to quote the arguments if you pass them on the CL:
$ decho 'a b "c d"'
a
b
c d
Have you tried:
i=1
for arg in "$@"
do
    echo "arg $i:$arg:"
    let "i+=1"
done
Should yield something like:
arg 1:a:
arg 2:b:
arg 3:c d:
in your case.
Straight from memory, no guarantee :-)
hmmm.. eval decho $args works too:
[~]$ eval decho $args
a
b
c d
And I may be able to do something with bash arrays using "${array[@]}" (which works like "$@"), but then I would have to write code to load the array, which would be a pain.
It is fundamentally flawed to attempt to pass an argument list stored in a plain variable to a command.
Presumably, if you have code somewhere that creates a variable containing the intended arguments for a command, you can change it to store the arguments in an array variable instead:
decho_argv=(a b 'c d') # <-- easy!
Then, rather than changing the command "decho" to accommodate args taken from a plain variable (which would break its ability to handle normal args), you can do:
decho "${decho_argv[@]}" # USE DOUBLE QUOTES!!!
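Put together, a minimal self-contained run of that approach might look like this (decho is reimplemented here as a function purely for illustration):

```shell
#!/bin/bash
# Sketch: passing an argument list via a bash array instead of a string.
decho() {                  # same behavior as the decho in the question
    while [ $# -gt 0 ]; do
        echo "$1"
        shift
    done
}

decho_argv=(a b 'c d')     # quoting is honored when the array is built
decho "${decho_argv[@]}"   # expands to exactly three words
# prints:
# a
# b
# c d
```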
However, if you are in the situation where you are taking arbitrary input, expected to be string fields corresponding to intended positional arguments, and you want to pass those arguments to a command, then instead of using a variable you should read the data into an array.
Note that suggestions which offer the use of eval to set positional parameters from the contents of an ordinary variable are extremely dangerous: exposing the contents of a variable to quote removal and word splitting on the command line affords no way to stop shell metacharacters in the string from causing havoc.
E.g., imagine in the following example if the word "man" were replaced with the two words "rm" and "-rf", and the final arg word were "*":
Do Not Do This:
> args='arg1 ; man arg4'
> eval set -- $args
No manual entry for arg4
> eval set -- "$args" # This is no better
No manual entry for arg4
> eval "set -- $args" # Still hopeless
No manual entry for arg4
> eval "set -- '$args'" # making it safe also makes it not work at all!
> echo "$1"
arg1 ; man arg4
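By contrast, a sketch of the array form handling the same kind of hostile input: the metacharacters stay literal data and nothing gets executed.

```shell
#!/bin/bash
# Sketch: an array keeps shell metacharacters literal.
argv=('arg1' ';' 'rm -rf *')    # scary-looking strings, stored safely
set -- "${argv[@]}"
echo "$#"    # prints: 3  (three literal words; no command was run)
echo "$3"    # prints: rm -rf *
```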