I was playing around in my terminal earlier and discovered that I can execute single-word commands ('ls', 'cat', 'python2.7', 'exit') by making a file or directory with the same name as the command. However, I can't execute 'multi-word' commands ('rm\ -rf *', 'ls -a', 'python2.7\ test.py') (which also led me to discover that you can't remove directories named '-rf' with 'rm -rf *') UNLESS the arguments are in alphabetical order. It seems that when you pass just * to bash, it reads the names of the files/directories in alphabetical order, taking the first name as the command and each successive name as an argument to it. Example:
$ mkdir cat
$ touch dog
$ vim dog # at this point I put 'Hello!' into dog
$ *
Hello!
Why does the wildcard character not let me execute a command from a file name containing spaces, but does let me build a command out of multiple file names? Thanks!
EDIT: Also aliases don't work for some reason. Example:
$ alias lsa='ls -a'
$ mkdir lsa
$ *
No command 'lsa' found, did you mean:
...
lsa: command not found
Anyone know why this is?
Every command bash processes is subject to several varieties of expansions before bash interprets and acts on the result. The last of those is pathname expansion, wherein bash examines the words of the command (as produced by previous expansions and word splitting) for any that contain any of the unquoted special characters *, ?, and [, and interprets those words as patterns representing file names. After all expansions are performed, any redirections and variable assignments in the expanded command are performed and removed from the command. If any words remain, the first is taken as the name of a command to run. The provenance of that word does not matter.
In pathname expansion, the * matches any string, including the empty string. When bash performs pathname expansion on the word consisting of only *, it expands to the names of all the files in the working directory, except those beginning with a dot (.).
Thus, if your working directory contains just a single file named ls, and you execute the command *, bash expands that command to ls and executes it, producing the output ls. Similarly, if the contents of the directory are echo, Hello,, and World!, then when you execute the command * bash expands it to the equivalent of echo Hello, 'World!', and the command outputs Hello, World!.
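The question's own example can be reproduced in a scratch directory (a minimal sketch; cat and dog are just names that sort into a convenient order):

```shell
cd "$(mktemp -d)"      # work in an empty scratch directory
mkdir cat              # supplies the word "cat" to the glob
echo 'Hello!' > dog    # supplies the word "dog"
*                      # expands to: cat dog -- runs cat on dog, printing Hello!
```

The directory named cat plays no role in finding the command; it merely makes the glob produce the word cat, which bash then looks up in $PATH as usual.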
That doesn't work as you thought it would for filenames with spaces in them because each file name matching a pattern is expanded to a single word. These are not subsequently split. (Bash does perform "word splitting" as part of the expansion process, but that happens before pathname expansion.) This is normally what you want, for otherwise commands such as rm * would not reliably do what you expect.
That doesn't work with aliases because bash expands aliases when it reads commands, not when it executes them. That has several implications, but one of them is that aliases are interpreted before any expansions are performed. Another is that the replacement text of an alias is subject to all the normal expansions.
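You can see the read-time behaviour of aliases even in a script (a sketch; shopt -s expand_aliases is needed because aliases are off by default in non-interactive bash):

```shell
shopt -s expand_aliases      # aliases are disabled in non-interactive bash
alias lsa='ls -a'
cmd=lsa
$cmd 2>/dev/null || echo 'lsa: not found'
# Alias expansion happened when each line was read; the word produced by
# expanding $cmd afterwards is never checked against the alias table.
```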
Related
I want to delete the contents of the file main_copy.json.
I am confused: if I use the command below, will it remove files in all of these directories?
[user#yyy ~]$ > /AAA/BBB/CCC/DDD/main_copy.json
The '>' character is the shell (often Bash) redirection character. You are at a command prompt of the shell, probably Bash, but you don't specify. The command as you have written it basically says "redirect <nothing> to the file /AAA/BBB/CCC/DDD/main_copy.json". The net result is to truncate the file to zero length, effectively deleting its contents.
Since there are no spaces in the argument to '>', bash treats it as a single argument, so there is no possibility that your command will delete the contents of any files in the directories along that path. In any case, the '>' redirection operator does not work with multiple arguments. So if you were to say:
[user#yyy ~]$ > /AAA /BBB/CCC/DDD/main_copy.json
The net effect would issue an error because you can't redirect to a directory, only to a file. If you were to say:
[user#yyy ~]$ > /AAA/myfile.txt BBB/CCC/DDD/main_copy.json
the shell would truncate the file myfile.txt to zero length, (or create a zero-length file if it did not exist) and then treat the second argument as an executable command, which of course would fail with something like "Permission Denied" because it's not an executable.
Hope that helps. Bash (and other shells) is a complicated beast and takes years to really learn well.
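The truncating behaviour is easy to check on a throwaway file (a minimal sketch using mktemp, so no real data is touched):

```shell
f="$(mktemp)"
echo 'some contents' > "$f"
> "$f"                 # redirect nothing into the file: truncates it
wc -c < "$f"           # prints 0 -- the file still exists but is empty
rm -f "$f"
```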
I am trying to delete any folder or files in my account inside the cluster using the rm command. But somehow I am not able to do this, and I am getting an error:
bash: warning: shell level (1000) too high, resetting to 1
bash: warning: shell level (1000) too high, resetting to 1
/home/.../bin/rm: fork: Cannot allocate memory
The problem is simple, yet it might look complex because it shows itself in a rather unexpected way, and it is caused by the not-so-simple process of command name resolution.
Let me remind you what happens when the shell executes a command like
command_name arg1 ... argn
1. The shell splits the command line into words separated by spaces and other metacharacters.
2. It takes the first word and treats it as a command name.
3. It checks whether the command name contains slashes; if it does, it treats it as a path to a command file and goes to #10.
4. If the check in #3 fails, it checks whether the command name is the name of a built-in command, and executes that command if so.
5. If the check in #4 fails, it tries to find the command on the file system:
6. It reads the $PATH variable and splits its value into segments separated by colons (:).
7. It treats each segment as a path to a directory in the filesystem.
8. If the directory exists, it scans it for an executable file whose name matches the command name. If one is found, it appends the command name to the directory name, treats the result as the path to the command file, and goes to #10.
9. If the file is not found in any of the segments of $PATH, the shell reports an error.
10. The shell executes the command
command_file arg1 ... argn
Note the difference is that instead of command_name we have got command_file now.
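The $PATH search described above can be sketched in a few lines of shell (a rough illustration, not what bash literally does internally):

```shell
name=ls                       # the command to resolve
found=
IFS=:                         # split $PATH on colons
for seg in $PATH; do          # each segment is a directory to try
  if [ -x "$seg/$name" ]; then
    found="$seg/$name"        # first match wins
    break
  fi
done
unset IFS
echo "${found:-$name: command not found}"
```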
Now, what does this have to do with your problem?
First, your $PATH has /home/.../bin before /bin, so rm myfile resolves to /home/.../bin/rm myfile instead of /bin/rm myfile.
Second, your /home/.../bin/rm file is a shell script that calls rm $* or something like it. Executing this file results in a non-terminating sequence of calls to /home/.../bin/rm myfile, which is why the shell level climbs to 1000 and the system eventually cannot allocate memory for another fork.
To check which command is actually being executed, use the command which rm.
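The shadowing is easy to reproduce with a throwaway directory prepended to $PATH (a sketch; the stand-in wrapper here just prints a message instead of recursing):

```shell
dir="$(mktemp -d)"                # stand-in for /home/.../bin
printf '#!/bin/sh\necho "custom rm called"\n' > "$dir/rm"
chmod +x "$dir/rm"
PATH="$dir:$PATH" command -v rm   # prints the wrapper's path, not /bin/rm
PATH="$dir:$PATH" rm              # runs the wrapper: custom rm called
```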
Option 1
So one option would be to edit /home/.../bin/rm and replace all calls to rm with calls to /bin/rm in the file. Then it would not cause a non terminating sequence of calls.
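Such a fixed wrapper might look like this (a sketch; the real script's path and options are unknown, so this one simply forwards its arguments to /bin/rm):

```shell
bindir="$(mktemp -d)"         # stand-in for the directory holding the wrapper
cat > "$bindir/rm" <<'EOF'
#!/bin/sh
# Forward to the real rm by absolute path, so the wrapper can never
# resolve back to itself through $PATH.
exec /bin/rm "$@"
EOF
chmod +x "$bindir/rm"
f="$(mktemp)"
"$bindir/rm" "$f"             # deletes the file; no recursion, no fork storm
```

Note the quoted "$@" rather than $*, which also keeps file names containing spaces intact.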
Option 2
In general it is not a good idea to override standard commands unless you know very well what you are doing, so you could rename /home/.../bin/rm to /home/.../bin/myrm
I'm using WSL (Ubuntu 18.04) on Windows 10 and bash.
I have a file filename.gpg with the content:
export SOME_ENV_VAR='123'
Now I run the following commands:
$ $(gpg -d filename.gpg)
$ echo $SOME_ENV_VAR
'123' <-- with quotes
However, if I run it directly in the shell:
$ export SOME_ENV_VAR='123'
$ echo $SOME_ENV_VAR
123 < -- without quotes
Why does it behave like this? Why is there a difference between running a command using $() and running it directly?
Aside: I got it working using eval $(gpg -d filename), but I have no idea why this works.
Quotes in shell scripts do not behave differently from quotes in shell commands.
With the $(gpg -d filename.gpg) syntax, you are not executing a shell script, but a regular single command.
What your command does
It executes gpg -d filename.gpg
From the result, it takes the first (IFS-separated) word as the command to execute
It takes every other (IFS-separated) word, including words from additional lines, as its parameters
It executes the command
From the following practical examples, you can see how it differs from executing a shell script:
Remove the word export from filename.gpg: the command is then SOME_ENV_VAR='123' which is not understood as a variable assignment (you will get SOME_ENV_VAR='123': command not found).
If you add several lines, they won't be understood as separated command lines, but as parameters to the very first command (export).
If you change export SOME_ENV_VAR='123' to export SOME_ENV_VAR=$PWD, SOME_ENV_VAR will not contain the content of the variable PWD, but the literal string $PWD
Why is it so?
See how bash performs expansion when analyzing a command.
There are many steps. $(...) is called "command substitution" and is the fourth step. When it is done, none of the previous steps will be performed again. This explains why your command does not work when you remove the export word, and why variables are not substituted in the result.
Moreover "quote Removal" is the last step and the manual reads:
all unquoted occurrences of the characters ‘\’, ‘'’, and ‘"’ that did
not result from one of the above expansions are removed
Since the single quotes resulted from the "command substitution" expansion, they were not removed. That's why the content of SOME_ENV_VAR is '123' and not 123.
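You can reproduce the quote-survival behaviour without gpg by letting a plain cat stand in for gpg -d (the temp file here is just a stand-in for filename.gpg):

```shell
f="$(mktemp)"                     # stand-in for the encrypted file's content
echo "export SOME_ENV_VAR='123'" > "$f"
$(cat "$f")                       # like $(gpg -d filename.gpg)
echo "$SOME_ENV_VAR"              # prints '123' -- the quotes survive
```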
Why does eval work?
Because eval triggers another complete parsing of its parameters. The whole set of expansions is run again.
From the manual:
The arguments are concatenated together into a single command, which is then read and executed
Note that this means that you are still running one single command, and not a shell script. If your filename.gpg script has several lines, subsequent lines will be added to the argument list of the first (and only) command.
What should I do then?
Just use source along with process substitution.
source <(gpg -d filename.gpg)
Contrary to eval, source is used to execute a shell script in the current context. Process substitution provides a pseudo-filename that contains the result of the substitution (i.e. the output of gpg).
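With cat again standing in for gpg -d, the difference from plain $(...) is visible directly:

```shell
f="$(mktemp)"
echo "export SOME_ENV_VAR='123'" > "$f"
source <(cat "$f")                # parsed as a normal script line
echo "$SOME_ENV_VAR"              # prints 123 -- quote removal happened
```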
I have two files.
~/directory1/7120599_S1.txt
hello world!
~/directory2/7120599_S7.txt
bye world!
I'm looking for Perl code that will append the contents of 7120599_S1.txt to the end of 7120599_S7.txt. The following command works in Linux:
$ cat ~/directory1/7120599_S1.txt >> ~/directory2/7120599_S*.txt
New ~/directory2/7120599_S7.txt
hello world!
bye world!
But for some reason this doesn't work in Perl
system "cat ~/directory1/7120599_S1.txt >> ~/directory2/7120599_S*.txt"
Instead it creates a new file in directory2 called 7120599_S*.txt. How do I get Perl to recognize the Linux wildcard character?
If we look at the POSIX shell:
For the other redirection operators, the word that follows the redirection operator shall be subjected to tilde expansion, parameter expansion, command substitution, arithmetic expansion, and quote removal. Pathname expansion shall not be performed on the word by a non-interactive shell; an interactive shell may perform it, but shall do so only when the expansion would result in one word.
Pathname expansion refers to globs like *.txt.
The Bash shell does not follow POSIX precisely and tries to be more convenient and sensible, including applying pathname expansion to redirection targets even in non-interactive mode. Usually, your interactive shell will be Bash.
However, when you run system commands this usually runs a simpler shell that is POSIX-compliant (though that depends entirely on your operating system). Often this shell is installed as /bin/sh.
If you want Bash, you must run Bash explicitly, e.g.
system "bash", "-c", "cat ~/directory1/7120599_S1.txt >> ~/directory2/7120599_S*.txt"
By the way, you can force Bash to follow POSIX if you set the POSIXLY_CORRECT environment variable.
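A quick way to see bash expanding a glob in a redirection target (a sketch with throwaway paths; the glob must expand to exactly one file, or bash reports an error):

```shell
dir="$(mktemp -d)"
echo 'bye world!' > "$dir/7120599_S7.txt"
# bash performs pathname expansion on the redirection target:
bash -c "echo 'hello world!' >> $dir/7120599_S*.txt"
cat "$dir/7120599_S7.txt"         # bye world! then hello world!
```

Under /bin/sh on many systems, the same command would instead create a file literally named 7120599_S*.txt, which is exactly what the questioner observed from Perl's system.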
I've run into a really silly problem with a Linux shell script. I want to delete all files with the extension ".bz2" in a directory. In the script I call
rm "$archivedir/*.bz2"
where $archivedir is a directory path. Should be pretty simple, shouldn't it? Somehow, it manages to fail with this error:
rm: cannot remove `/var/archives/monthly/April/*.bz2': No such file or directory
But there is a file in that directory called test.bz2 and if I change my script to
echo rm "$archivedir/*.bz2"
and copy/paste the output of that line into a terminal window the file is removed successfully. What am I doing wrong?
TL;DR
Quote only the variable, not the whole expected path with the wildcard
rm "$archivedir"/*.bz2
Explanation
In Unix, programs generally do not interpret wildcards themselves. The shell interprets unquoted wildcards, and replaces each wildcard argument with a list of matching file names.
You can disable this process by quoting the wildcard character, using double or single quotes, or a backslash before it. However, that's not what you want here - you do want the wildcard expanded to the list of files that it matches.
Be careful about writing rm $archivedir/*.bz2 (without quotes). The word splitting (i.e., breaking the command line up into arguments) happens after $archivedir is substituted. So if $archivedir contains spaces, then you'll get extra arguments that you weren't intending. Say archivedir is /var/archives/monthly/April to June. Then you'll get the equivalent of writing rm /var/archives/monthly/April to June/*.bz2, which tries to delete the files "/var/archives/monthly/April", "to", and all files matching "June/*.bz2", which isn't what you want.
The correct solution is to write:
rm "$archivedir"/*.bz2
Your original line
rm "$archivedir/*.bz2"
Can be re-written as
rm "$archivedir"/*.bz2
to achieve the intended effect. With the entire path inside the double quotes, the wildcard is not expanded at all. By closing the double-quote just after the variable, before the wildcard (which is perfectly legitimate), you let the * be expanded while still protecting any spaces in $archivedir.
Just to expand on this a bit, bash has fairly complicated rules for dealing with metacharacters in quotes. In general
almost nothing is interpreted in single-quotes:
echo '$foo/*.c' => $foo/*.c
echo '\\*' => \\*
shell substitution is done inside double quotes, but file metacharacters aren't expanded:
foo=hello; echo "$foo/*.c" => hello/*.c
everything inside backquotes is passed to a subshell, which interprets it. The substitution is evaluated while the shell expands the command's words, before a temporary assignment like BAR=bye (which only affects the environment of the command being run) takes effect. So the first command below echoes a blank line, but the second and third, where BAR is set in the current shell first, echo "bye":
BAR=bye echo `echo $BAR`
BAR=bye; echo `echo $BAR`
export BAR=bye; echo `echo $BAR`
The quotes cause the string to be interpreted as a string literal; try removing them.
I've seen similar errors when calling a shell script like
./shell_script.sh
from another shell script. This can be fixed by invoking it as
sh shell_script.sh
Why not just rm -rf */*.bz2? Works for me on OSX.