My system is CentOS 6.5
When I try to use backticks to run the commands in a file, I get the result below.
The file's content is:
[liu-uil#~ 15:54:16]$cat test
echo 1;
echo 2;
echo 3;
[liu-uil#~ 15:54:18]$`cat test`
1; echo 2; echo 3;
[liu-uil#~ 15:54:24]$
The commands after the first echo are all treated as plain text, and I don't know why. Could somebody kindly explain this to me? Thank you very much!
Command substitution is one of the expansions. Expansions happen after the command line has already been split into commands, so it's too late to create new commands.
You can use
eval `cat test`
to call the shell parser again to split the string into commands and run them.
Only the first word in the result of a backtick command is interpreted as a command. Remaining text is the argument list.
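A minimal illustration (a made-up command line, nothing from the question's file): the result of the substitution only goes through word splitting, it is never re-parsed into new commands, so even a semicolon in it ends up as part of an argument.
`echo 'echo hello; echo world'`
# output: hello; echo world
# the shell actually ran:  echo 'hello;' echo world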
If you want to run commands in a file, you don't need backticks, you need the dot (.) command:
[liu-uil#~ 15:54:16]$. test
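For completeness, here is a rough comparison of the usual ways to run the file's contents (assuming the file is named test, as in the question):
. test              # sources the file in the current shell
bash test           # runs it in a separate child shell
eval "$(cat test)"  # re-parses the file's text and runs it in the current shell
All three execute the three echo commands as separate commands.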
I am having a problem where cmd1 works, but not cmd2 in my Bash script ending in .sh. I have made the Bash script executable.
Additionally, I can execute cmd2 just fine from my Bash terminal. I have tried to make a minimally reproducible example, but my larger goal is to run a complicated executable with command line arguments and pass output to a file that may or may not exist (rather than displaying the output in the terminal).
Replacing > with >> also gives the same error in the script, but not the terminal.
My Bash script:
#!/bin/bash
cmd1="cat test.txt"
cmd2="cat test.txt > a"
echo $cmd1
$cmd1
echo $cmd2
$cmd2
test.txt has the words "dog" and "cat" on two separate lines without quotes.
Short answer: see BashFAQ #50: I'm trying to put a command in a variable, but the complex cases always fail!.
Long answer: the shell expands variable references (like $cmd1) toward the end of the process of parsing a command line, after it's done parsing redirects (like > a is supposed to be) and quotes and escapes and... In fact, the only thing it does with the expanded value is word splitting (e.g. treating cat test.txt > a as "cat" followed by "test.txt", ">", and finally "a", rather than a single string) and wildcard expansion (e.g. if $cmd expanded to cat *.txt, it'd replace the *.txt part with a list of matching files). (And it skips word splitting and wildcard expansion if the variable is in double-quotes.)
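To make that concrete, here is roughly what the unquoted expansion of cmd2 turns into (illustrative; the exact error text depends on your cat):
cmd2="cat test.txt > a"
$cmd2
# word splitting produces the words:  cat  test.txt  >  a
# so cat receives three file arguments: test.txt, a literal ">", and a;
# it prints test.txt's contents, complains that ">" doesn't exist, and nothing is redirected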
Partly as a result of this, the best way to store commands in variables is: don't. That's not what they're for; variables are for data, not commands. What you should do instead, though, depends on why you were storing the command in a variable.
If there's no real reason to store the command in a variable, then just use the command directly. For conditional redirects, just use a standard if statement:
if [ -f a ]; then
    cat test.txt > a
else
    cat test.txt
fi
If you need to define the command at one point, and use it later; or want to use the same command over and over without having to write it out in full each time, use a function:
cmd2() {
    cat test.txt > a
}
cmd2
It sounds like you may need to define the command differently depending on some condition; you can actually do that with a function as well:
if [ -f a ]; then
    cmd() {
        cat test.txt > a
    }
else
    cmd() {
        cat test.txt
    }
fi
cmd
Alternatively, you can wrap the command (without the redirect) in a function, then use a conditional to control whether it redirects:
cmd() {
    cat test.txt
}
if [ -f a ]; then
    cmd > a
else
    cmd
fi
It's also possible to wrap a conditional redirect into a function itself, then pipe output to it:
maybe_redirect_to() {
    if [ -f "$1" ]; then
        cat > "$1"
    else
        cat
    fi
}
cat test.txt | maybe_redirect_to a
(This creates an extra cat process that isn't really doing anything useful, but if it makes the script cleaner, I'd consider that worth it. In this particular case, you could minimize the stray cats by using maybe_redirect_to a < test.txt.)
As a last resort, you can store the command string in a variable, and use eval to parse it. eval basically re-runs the shell parsing process from the beginning, meaning that it'll recognize things like redirects in the string. But eval has a well-deserved reputation as a bug magnet, because it's easy for it to treat parts of the string you thought were just data as command syntax, which can cause some really weird (& dangerous) bugs.
If you must use eval, at least double-quote the variable reference, so it runs through the parsing process just once, rather than sort-of-once-and-a-half as it would unquoted. Here's an example of what I mean:
cmd3="echo '5 * 3 = 15'"
eval "$cmd3"
# prints: 5 * 3 = 15
eval $cmd3
# prints: 5 [list of files in the current directory] 3 = 15
# ...unless there are any files with shell metacharacters in their names, in
# which case something more complicated might happen.
BashFAQ #50 discusses some other possible reasons and solutions. Note that the array approach will not work here, since arrays also get expanded after redirects are parsed.
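For example, here is a sketch of that failure (not a recommendation):
cmd=( cat test.txt ">" a )
"${cmd[@]}"
# runs: cat test.txt '>' a  -- the ">" is handed to cat as a literal argument,
# not treated as a redirect, so nothing is ever written to the file a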
If you pop an 'eval' in front of $cmd2 it should work as expected:
#!/bin/bash
cmd2="cat test.txt > a"
eval $cmd2
If you're not sure what a script is doing, you can always use debug mode to see if you can determine the error.
bash -x scriptname
This will run the script and print each command after its variables have been expanded. Hopefully this will reveal any issues with the syntax.
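For the script in the question, the trace looks roughly like this (assuming the file a does not exist yet; the quoting in the trace and the wording of cat's errors can vary between versions):
$ bash -x ./script.sh
+ cmd1='cat test.txt'
+ cmd2='cat test.txt > a'
+ echo cat test.txt
cat test.txt
+ cat test.txt
dog
cat
+ echo cat test.txt '>' a
cat test.txt > a
+ cat test.txt '>' a
dog
cat
cat: '>': No such file or directory
cat: a: No such file or directory
The '>' shown as an ordinary quoted word in the last traced command is the giveaway that it was never treated as a redirect.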
I have a bit of an issue, and I've tried several ways to fix it, but I can't seem to.
So I have two shell scripts.
background.sh: This runs a given command in the background and redirects output.
#!/bin/bash
if test -t 1; then
    exec 1>/dev/null
fi
if test -t 2; then
    exec 2>/dev/null
fi
"$#" &
main.sh: This file simply starts the emulator (genymotion) as a background process.
#!/bin/bash
GENY_DIR="/home/user/Documents/MyScript/watchdog/genymotion"
BK="$GENY_DIR/background.sh"
DEVICE="164e959b-0e15-443f-b1fd-26d101edb4a5"
CMD="$BK player --vm-name $DEVICE"
$CMD
This works fine when I have NO spaces in my directory. However, it does not when I try to use:
GENY_DIR="/home/user/Documents/My Script/watchdog/genymotion"
which I have no choice about at the moment. I get an error saying that the file or directory cannot be found. I tried to put $CMD in quotes, but it didn't work.
You can test this by trying to run anything as a background process; it doesn't have to be an emulator.
Any advice or feedback would be appreciated. I also tried:
BK="'$BK'"
or
BK="\"$BK\""
or
BK=$( echo "$BK" | sed 's/ /\\ /g' )
Don't try to store commands in strings. Use arrays instead:
#!/bin/bash
GENY_DIR="$HOME/Documents/My Script/watchdog/genymotion"
BK="$GENY_DIR/background.sh"
DEVICE="164e959b-0e15-443f-b1fd-26d101edb4a5"
CMD=( "$BK" "player" --vm-name "$DEVICE" )
"${CMD[#]}"
Arrays properly preserve your word boundaries, so that one argument with spaces remains one argument with spaces.
Due to the way word splitting works, adding a literal backslash in front of or quotes around the space will not have a useful effect.
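A quick sketch of why, using a made-up path:
BK="/tmp/My\ Script/background.sh"
$BK
# word splitting yields two words:  /tmp/My\   and   Script/background.sh
# the backslash is just another character in the data, not an escape, so the
# shell tries (and fails) to execute a program literally named /tmp/My\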
John1024 suggests a good source for additional reading: I'm trying to put a command in a variable, but the complex cases always fail!
try this:
GENY_DIR="home/user/Documents/My\ Script/watchdog/genymotion"
You can escape the space with a backslash.
Hi… Need a little help here…
I tried to emulate DOS's dir command in Linux using a bash script. Basically it's just a wrapped ls command with some parameters plus summary info. Here's the script:
#!/bin/bash
# default to current folder
if [ -z "$1" ]; then var=.;
else var="$1"; fi
# check file existence
if [ -a "$var" ]; then
# list contents with color, folder first
CMD="ls -lgG $var --color --group-directories-first"; $CMD;
# sum all files size
size=$(ls -lgGp "$var" | grep -v / | awk '{ sum += $3 }; END { print sum }')
if [ "$size" == "" ]; then size="0"; fi
# create summary
if [ -d "$var" ]; then
folder=$(find $var/* -maxdepth 0 -type d | wc -l)
file=$(find $var/* -maxdepth 0 -type f | wc -l)
echo "Found: $folder folders "
echo " $file files $size bytes"
fi
# error message
else
echo "dir: Error \"$var\": No such file or directory"
fi
The problem is that when the argument contains an asterisk (*), the ls within the script acts differently compared to the direct ls command given at the prompt. Instead of returning the whole file list, the script only returns the first file. See the video below to see the comparison in action. I don't know why it behaves like that.
Does anyone know how to fix it? Thank you.
Video: problem in action
UPDATE:
The problem has been solved. Thank you all for the answers. Now my script works as expected. See the video here: http://i.giphy.com/3o8dp1YLz4fIyCbOAU.gif
The asterisk * is expanded by the shell when it parses the command line. In other words, your script doesn't get a parameter containing an asterisk, it gets a list of files as arguments. Your script only works with $1, the first argument. It should work with "$@" instead.
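As a rough sketch, adapting only the listing line from the script (the rest of the script would need similar treatment):
# handle every argument the shell passed, not just the first one
for var in "$@"; do
    ls -lgG --color --group-directories-first "$var"
done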
This is because when you retrieve $1 you assume the shell does NOT expand *.
In fact, when * (or another glob) matches, it is expanded by the shell, and the matches are passed to the script as $1, $2, etc.
You're lucky that you simply got the first file. And because your script expands $var unquoted in several places, file paths containing spaces will break things even further.
Seriously, read this and especially this. Really.
And please don't do things like
CMD=whatever you get from user input; $CMD;
You are begging for trouble. Don't execute arbitrary strings from the user.
Both of the above answers already answered your question, so I'm just going to be a bit more verbose.
Your terminal is (probably) running the bash interpreter. This is the program which parses your input line(s) and does "things" based on your input.
When you enter a line, bash starts the following workflow:
parsing and lexical analysis
expansion:
  brace expansion
  tilde expansion
  variable expansion
  arithmetic and other substitutions
  command substitution
  word splitting
  filename generation (globbing)
quote removal
Only after all of the above will bash
execute some external command, like ls or dir.sh, etc.,
or perform some "internal" action for the known keywords and builtins like echo, for, if, etc.
As you can see, the second to last step is filename generation (globbing). So, in your case, if test* matches some files, bash expands the wildcard characters (i.e. it does the globbing).
So,
when you enter dir.sh test*,
and test* matches some files,
bash does the expansion first,
and only afterwards executes the command dir.sh with the already expanded filenames,
e.g. the script gets executed (in your case) as: dir.sh test.pas test.swift
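A quick way to see this for yourself, using echo as a stand-in (the matching filenames are simply whatever happens to be in your directory):
echo dir.sh test*      # prints the command with the already expanded arguments
echo dir.sh 'test*'    # quoting the pattern suppresses the expansion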
BTW, it works exactly the same way for your ls example:
bash expands ls test* to ls test.pas test.swift,
then executes ls with the above two arguments,
and ls prints the result for the two arguments it got.
In other words, ls doesn't even see the test* argument; whenever possible, bash expands the wildcard characters (* and ?).
Now back to your script: add the following line after the shebang:
echo "the $0 got these arguments: $@"
and you will immediately see the real arguments with which your script got executed.
Also, in such cases it is good practice to try executing the script in debug mode, e.g.
bash -x dir.sh test*
and you will see exactly what the script does.
Also, you can do the same for your current interpreter, e.g. just enter into the terminal
set -x
and try running dir.sh test*, and you will see how bash executes the dir.sh command. (To stop the debug mode, just enter set +x.)
Everybody is giving you valuable advice which you definitely should follow!
But here is the real answer to your question.
To pass unexpanded arguments to any executable you need to single quote them:
./your_script '*'
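For example (your_script is the placeholder from above; nothing specific to the question's script is assumed):
./your_script '*'    # $1 inside the script is the literal *
./your_script \*     # backslash-escaping the glob has the same effect
The script itself then decides when, and whether, to expand the pattern.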
The best solution I have is to use the eval command, in this way:
#!/bin/bash
cmd="some command \"with_quetes_and_asterisk_in_it*\""
echo "$cmd"
eval $cmd
The eval command takes its arguments and evaluates them as a command line, the same way the shell does.
This solves my problem when I need to call a command with an asterisk (*) in it from a script.
There is a command I don't understand:
custom_command << EOF!!
I want to ask what EOF!! means in a bash script. I did find EOF with Google, but Google ignores the "!!" automatically, so I cannot find anything about EOF!!.
I know about the end-of-file token, but I don't know exactly what it means with the "!!" in the script. Is this a marker to force something, like the ! in vim's wq!?
Plus, why and when should we use EOF!! instead of EOF?
On the command line, !! would be expanded to the last command executed. Bash will print the line for you:
$ ls
a.txt b.txt
$ cat <<EOF!!
cat <<EOFls
>
In a script, though, history expansion is disabled by default, so the exclamation marks are part of the word.
#! /bin/bash
ls
cat <<EOF!!
echo 1
EOFls
echo 2
Produces:
a.txt b.txt
script.sh: line 7: warning: here-document at line 3 delimited by end-of-file (wanted `EOF!!')
echo 1
EOFls
echo 2
To enable history and history expansion in a script, add the following lines:
set -o history
set -H
You can use just about any string as a here-document terminator.
EOF!! is just what the person writing the script decided to use.
It's probably just a weird heredoc.
Example:
cat << EOF!!
blabla
EOF!!
Note: this only works in script files; on the command line, the parser interprets the !! as history expansion.
As others already wrote, this is a here-document.
The token used for it should be chosen carefully; since the probability that the here-document contains EOF!! is lower than for EOF itself, they chose that.
I suppose they checked that it does no harm before using it; !! in a script does NOT refer to the history, but stays as it is.
The bash manual lists this under "Event designators", saying:
!!
Refer to the previous command. This is a synonym for !-1.
I simply searched for "bash manual double exclamation".
x="a=b"
`echo $x`
echo $a
I expect the second line to generate "a=b", and execute it in the context of the main shell, resulting in a new variable a with value b.
However, what I really get (if I enter the commands manually) is the error message after the second line, bash: a=b: command not found
Why is that so?
Try
eval $x
Your first echo line runs in a subshell and returns its output to the caller. The same result is achieved using $(), which is, by the way, easier to use than backticks.
So, what you are doing is first running echo $x (which outputs a=b). And, because of the backticks, a=b is returned to the shell, which tries to run that line as a command, which obviously won't work.
Try this in a shell:
$(echo ls)
And you will clearly see what is happening.
It's because of the order in which bash parses the command line. It looks for variable definitions (e.g. a=b) before performing variable and command substitution (e.g. commands in backticks). Because of this, by the time echo $x is replaced by a=b, it's too late for bash to see this as a variable definition and it's parsed as a command instead. The same thing would've happened if you'd just used $x as the command (instead of echo in backticks). As in mvds's answer, the eval command can be used to force the command to be reparsed from the beginning, meaning that it'll be recognized as a variable definition:
$ x="a=b"
$ `echo $x`
-bash: a=b: command not found
$ $(echo $x) # Exact same thing, but with cleaner syntax
-bash: a=b: command not found
$ $x # This also does the same thing, but without some extra steps
-bash: a=b: command not found
$ eval "$x" # This will actually work
$ echo $a
b
$ a= # Start over
$ eval "$(echo "$x")" # Another way of doing the same thing, with extra steps
$ echo $a
b
Note that when using eval I've put all of the references to $x in double-quotes -- this is to prevent the later phases of bash parsing (e.g. word splitting) from happening twice, since bash will finish its regular parsing process, then recognize the eval command, and then redo the entire parsing process again. It's really easy to get unexpected results from using eval, and this removes at least some of the potential for trouble.
Did you try $x in those funny apostrophes (backticks), without the echo? echo is only for displaying strings, not executing commands.