Why does this simple script behave differently inline and in script? - linux

I have a simple bash script for file listing:
$ cat process.sh
for i in *; do echo $i; done
$
and then I run:
$ ./process.sh
a
b
c
d
process.sh
$
and
$ . ./process.sh
$
and
$ for i in *; do echo $i; done
$
I've read Why does Bash behave differently when called as sh?, which made me think inline commands might use sh instead of bash - is the wildcard non-POSIX in this case?
Why do I get different behaviours when executing the same code?
How to make this example work?
Are there any other cases to look out for?

Solved: the cause was a broken alias on for in my interactive shell. Thanks for your replies.
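For the record, here is a sketch of how such an alias reproduces the symptom. The alias body below is an assumption (the asker's exact alias is unknown); the key facts are that bash expands aliases before recognizing reserved words, and that scripts run in a non-interactive shell where aliases are off by default, which is why ./process.sh still worked:

```shell
# Hypothetical broken alias: runs an empty loop, then a trailing '#'
# comments out the rest of the line.
shopt -s expand_aliases              # aliases are disabled in scripts by default
alias for='for _ in; do :; done #'

for i in a b c; do echo "$i"; done   # expanded via the alias: prints nothing

unalias for                          # remove the alias...
for i in a b c; do echo "$i"; done   # ...and the loop prints a, b, c again
```

Running alias (with no arguments) in the misbehaving shell is the quickest way to spot this kind of problem.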

Related

Is there a built-in Linux command whose return code matches an integer input parameter?

Of course I could write a script, but I wanted to know if something was built in.
It should work like this:
$ cmd 42
$ echo $?
42
and the only purpose of cmd should be to exit 42.
You could spawn a shell and use exit:
$ value=42
$ bash -c "exit ${value}"
$ echo $?
42
Note: awk -v "val=${value}" 'BEGIN{exit val}' might be a little more lightweight compared to starting a shell.
You can use a Perl one-liner:
perl -e 'exit($ARGV[0])' 42
echo $?
42
There are many options for this, and they all boil down to the same thing: run a subprocess that exits with the desired code. Any scripting language can be used. The least visible subprocess is a plain subshell:
$ exitcode=42
$ ( exit $exitcode )
Otherwise, any of the following will do:
$ exitcode=42
$ awk -v e=$exitcode 'BEGIN{exit e}'
$ perl -e "exit $exitcode"
$ bash -c "exit $exitcode"
$ python -c "exit($exitcode)"
and there are many more
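If you want something that behaves like the hypothetical cmd from the question without spawning any subprocess at all, a small shell function will do (cmd is the asker's placeholder name, not a real utility):

```shell
# A function whose only purpose is to set $? to its first argument.
# Note that exit statuses are 0-255; larger values wrap around.
cmd() { return "${1:-0}"; }

cmd 42
echo $?    # prints 42
```

Unlike the subprocess variants, this costs no fork at all.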

What's the point of eval/bash -c as opposed to just evaluating a variable?

Suppose you have the following command stored in a variable:
COMMAND='echo hello'
What's the difference between
$ eval "$COMMAND"
hello
$ bash -c "$COMMAND"
hello
$ $COMMAND
hello
? Why is the last version almost never used if it is shorter and (as far as I can see) does exactly the same thing?
The third form is not at all like the other two -- but to understand why, we need to go into the order of operations bash follows when interpreting a command, and look at which of those steps are performed by each method.
Bash Parsing Stages
1. Quote Processing
2. Splitting Into Commands
3. Special Operator Parsing
4. Expansions
5. Word Splitting
6. Globbing
7. Execution
Using eval "$string"
eval "$string" follows all the above steps starting from #1. Thus:
Literal quotes within the string become syntactic quotes
Special operators such as >() are processed
Expansions such as $foo are honored
Results of those expansions are split into separate words on whitespace (the characters in IFS)
Those words are expanded as globs if they contain glob characters and have matches, and finally the command is executed.
Using sh -c "$string"
...performs the same steps as eval does, but in a new shell launched as a separate process; thus, changes to variable state, the current directory, etc. will expire when this new process exits. (Note, too, that the new shell may be a different interpreter supporting a different language; i.e., sh -c "foo" will not support the same syntax that bash, ksh, zsh, etc. do).
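A quick sketch of that expiring state: the directory change and the assignment below happen in the child process and vanish with it:

```shell
# Neither the cd nor the variable assignment leaks back into the parent.
before=$(pwd)
bash -c 'cd /; probe=inside'
[ "$(pwd)" = "$before" ] && echo "directory unchanged"
echo "probe is ${probe-unset} in the parent"    # prints: probe is unset in the parent
```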
Using $string
...starts at step 5, "Word Splitting".
What does this mean?
Quotes are not honored.
printf '%s\n' "two words" will thus parse as printf %s\n "two words", as opposed to the usual/expected behavior of printf %s\n two words (with the quotes being consumed by the shell).
Splitting into multiple commands (on ;s, &s, or similar) does not take place.
Thus:
s='echo foo && echo bar'
$s
...will emit the following output:
foo && echo bar
...instead of the following, which would otherwise be expected:
foo
bar
Special operators and expansions are not honored.
No $(foo), no $foo, no <(foo), etc.
Redirections are not honored.
>foo or 2>&1 is just another word created by string-splitting, rather than a shell directive.
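That redirection point is easy to check; in this sketch the >/dev/null survives word splitting as a literal argument instead of silencing the output:

```shell
s='echo hi >/dev/null'
$s    # word-splits into: echo, hi, >/dev/null
      # prints: hi >/dev/null
```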
$ bash -c "$COMMAND"
This version starts up a new bash interpreter, runs the command, and then exits, returning control to the original shell. You don't need to be running bash at all in the first place to do this, you can start a bash interpreter from tcsh, for example. You might also do this from a bash script to start with a fresh environment or to keep from polluting your current environment.
EDIT:
As Charles Duffy points out, starting a new bash shell in this way will clear shell variables, but environment variables will be inherited by the spawned shell process.
Using eval causes the shell to parse your command twice. In the example you gave, executing $COMMAND directly or doing an eval are equivalent, but have a look at the answer here to get a more thorough idea of what eval is good (or bad) for.
There are at least times when they are different. Consider the following:
$ cmd="echo \$var"
$ var=hello
$ $cmd
$var
$ eval $cmd
hello
$ bash -c "$cmd"
$ var=world bash -c "$cmd"
world
which shows the different points at which variable expansion is performed. It's even clearer if we do set -x first:
$ set -x
$ $cmd
+ echo '$var'
$var
$ eval $cmd
+ eval echo '$var'
++ echo hello
hello
$ bash -c "$cmd"
+ bash -c 'echo $var'
$ var=world bash -c "$cmd"
+ var=world
+ bash -c 'echo $var'
world
We can see here much of what Charles Duffy talks about in his excellent answer. For example, attempting to execute the variable directly prints $var because parameter expansion and those earlier steps had already been done, and so we don't get the value of var, as we do with eval.
A shell started with bash -c inherits only exported variables from the parent shell, and since I didn't export var, it's not available to the new shell.
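To close the loop on that point, exporting the variable puts it in the environment, where the bash -c child can see it:

```shell
var=hello
export var                        # now part of the environment
out=$(bash -c 'echo "$var"')
echo "$out"                       # prints hello
```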

"exec not found" when using variable in exec

I've run into a strange problem.
mkfifo "spch2008"
exec 100<>"spch2008"
That works. But when I use a variable in place of 100, an error occurs:
PIPE_ID=100
mkfifo "spch2008"
exec ${PIPE_ID}<>"spch2008"
exec: 100: not found
I don't know the reason. Please help me, thanks.
It is caused by the shell requiring a literal number on the left side of the redirection operator: the file descriptor must be part of the token at parse time, so after expansion 100 is handed to exec as a command name instead. You can use a workaround:
eval exec "${PIPE_ID}"'<>"spch2008"'
It will force the shell to do variable expansion, producing
eval exec 100'<>"spch2008"'
Then the eval built-in will feed the command to the shell, which will effectively execute
exec 100<>"spch2008"
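If your bash is 4.1 or newer, there is also a redirection form designed for exactly this: {varname}<>file makes the shell allocate a free descriptor (10 or greater) and store its number in varname, no eval required:

```shell
# bash 4.1+ only: let the shell pick the descriptor for us.
fifo=$(mktemp -u)               # temporary pathname for the fifo
mkfifo "$fifo"
exec {PIPE_ID}<>"$fifo"         # fd number lands in PIPE_ID
echo "allocated fd $PIPE_ID"
exec {PIPE_ID}>&-               # close it again via the variable
rm "$fifo"
```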
In general, I/O redirection doesn't allow variables to specify file descriptors; this is not specific to the <> operator.
Consider:
$ cat > errmsg # Create script
echo "$@" >&2 # Echo arguments to standard error
$ chmod +x errmsg # Make it executable
$ x=2
$ ./errmsg Hi # Writing on standard error
Hi
$ ./errmsg Hi ${x}>&1 # ${x} expands to an argument, not a descriptor
Hi 2
$ ./errmsg Hi 2>&1 # Redirect standard error to standard output
Hi
$ ./errmsg Hi 2>/dev/null # Standard error to /dev/null
$ ./errmsg Hi ${x}>/dev/null # Standard output to /dev/null; again ${x} is just an argument
Hi 2
$

Linux script: Reinterpret environment variable

I am creating a Bash script which reads some other environment variables:
echo "Exporting configuration variables..."
while IFS="=" read -r k v; do
    key=$k
    value=$v
    if [[ ${#key} -gt 0 && ${#value} -gt 0 ]]; then
        export "$key=$value"
    fi
done < "$HOME/myfile"
and myfile contains a line like:
a=$b/c/d/e
and I want to use $a as in:
cp myOtherFile $a
The result for the destination folder for the copy is "$b/c/d/e", and an error is shown:
"$b/c/d/e" : No such file or directory
because it is interpreted literally as a folder path.
Can this path be reinterpreted before being used in the cp command?
You need eval to do this:
$ var=foo
$ x=var
$ eval $x=another_value
$ echo $var
another_value
I recommend reading this doc before using eval: http://mywiki.wooledge.org/BashFAQ/048
And a safer approach is to use declare instead of eval:
declare "$x=another_value"
Thanks to chepner for the latter.
It sounds like you want $HOME/myfile to support Bash notations, such as parameter-expansion. I think the best way to do that is to modify $HOME/myfile to be, in essence, a Bash script:
export a=$b/c/d/e
and use the source builtin to run it as part of the current Bash script:
source $HOME/myfile
... commands ...
cp myOtherFile "$a"
... commands ...
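A compact sketch of that approach, with a temporary file standing in for $HOME/myfile:

```shell
b=/base                                  # value referenced inside the file
conf=$(mktemp)                           # stand-in for $HOME/myfile
printf 'export a=$b/c/d/e\n' > "$conf"   # single quotes keep $b literal in the file
. "$conf"                                # same as: source "$conf"
echo "$a"                                # prints /base/c/d/e
rm "$conf"
```

Since the file is executed as shell code, only source it if you trust its contents.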
Try this; plain echo will not re-expand the string, so eval is needed to force a second round of expansion (with the usual eval caveats):
cp myOtherFile `eval echo $a`

Concatenate strings inside bash script (different behaviour from shell)

I'm trying some stuff that works perfectly when I type it in the regular shell, but when I include it in a bash script file, it doesn't.
First example:
m=`date +%m`
m_1=$((m-1))
echo $m_1
This gives me the value of the last month (current minus one), but doesn't work if it's executed from a script.
Second example:
m=6
m=$m"t"
echo m
This returns "6t" in the shell (concatenates $m with "t"), but just gives me "t" when executing from a script.
I assume all these may be answered easily by an experienced Linux user, but I'm just learning as I go.
Thanks in advance.
Re-check your syntax.
Your first code snippet works from the command line, from bash, and from sh, since your syntax is valid sh. In my opinion you probably have typos in your script file:
~$ m=`date +%m`; m_1=$((m-1)); echo $m_1
4
~$ cat > foo.sh
m=`date +%m`; m_1=$((m-1)); echo $m_1
^C
~$ bash foo.sh
4
~$ sh foo.sh
4
The same can apply to the other snippet with corrections:
~$ m=6; m=$m"t"; echo $m
6t
~$ cat > foo.sh
m=6; m=$m"t"; echo $m
^C
~$ bash foo.sh
6t
~$ sh foo.sh
6t
Make sure the first line of your script is
#!/bin/bash
rather than
#!/bin/sh
Bash will only enable its extended features if explicitly run as bash. If run as sh, it will operate in POSIX compatibility mode.
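A hedged illustration of the difference; the sh line assumes /bin/sh is a strictly POSIX shell such as dash (on systems where sh is a link to bash, both lines print the same thing):

```shell
# Brace expansion is a bash extension, not POSIX:
bash -c 'echo {1..3}'   # prints: 1 2 3
sh -c 'echo {1..3}'     # dash prints the braces literally: {1..3}
```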
First of all, it works fine for me both in a script and in the terminal.
Second of all, your last line, echo m, will just output "m". I think you meant echo "$m".
