I want to make a shortcut for long commands with default arguments, so that I can bring the command into the command line and then add or change arguments myself.
For example, with the command wget:
print "wget -O downloaded.file"
will result in:
user@hostname$ wget -O downloaded.file
and then I add the URL I want to download from:
user@hostname$ wget -O downloaded.file http://example.com/
With
alias my_personal_wget_shortcut="wget -O downloaded.file"
additional arguments can only be appended at the end:
my_personal_wget_shortcut http://example.com/
With a function or a shell script, you can place the arguments anywhere you like:
Function:
my_personal_wget_shortcut()
{
wget -O downloaded.file "$@"
}
Script:
#!/bin/bash
wget -O downloaded.file "$@"
Don't forget to set the x-bit on the script!
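For example, assuming you saved the script under a directory in your PATH, e.g. ~/bin (a hypothetical location):
chmod +x ~/bin/my_personal_wget_shortcut        # set the x-bit
my_personal_wget_shortcut http://example.com/   # the URL lands where "$@" appears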
My answer is based on solutions from https://unix.stackexchange.com/a/82716/330217 and https://unix.stackexchange.com/a/325220/330217.
You might also get some ideas from https://unix.stackexchange.com/a/391698/330217
A clean way for bash, though it requires pressing cursor-up:
You can put "fake" commands onto bash's command history using
history -s 'some command'
(see https://www.gnu.org/savannah-checkouts/gnu/bash/manual/bash.html#Bash-History-Builtins)
After putting the command into the history, you can press the cursor-up key to get it into the command buffer and edit it as you like.
Of course you can use an alias as a shortcut.
alias myalias="history -s 'wget -O downloaded.file '"
myalias
After pressing cursor-up you should get
wget -O downloaded.file _
Note: The underscore (_) is meant to show the cursor position after the trailing space.
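As a small sketch, you can also wrap this in a function so that any command template works (the name preload is my own choice):
preload() {
    # store the given words in the history as a single entry
    history -s "$*"
}
preload wget -O downloaded.file
# now press cursor-up to load and edit the command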
A more hacky solution, which directly fills the command buffer:
writecmd () {
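# 0x5412 is TIOCSTI on Linux: each character is injected into the terminal's input queue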
perl -e 'ioctl STDOUT, 0x5412, $_ for split //, do{ chomp($_ = <>); $_ }' ;
}
# Example usage
echo 'my test cmd' | writecmd
Combined with an alias
writecmd () {
perl -e 'ioctl STDOUT, 0x5412, $_ for split //, do{ chomp($_ = <>); $_ }' ;
}
alias myalias="echo 'wget -O downloaded.file '|writecmd"
myalias
should directly result in
wget -O downloaded.file _
Note: This solution has the drawback that it also prints the command to stdout. That means the command also appears before the prompt. (I don't know if it is possible to suppress this. When I redirect stdout to /dev/null, the command will no longer be written to the command buffer.)
In zsh you could use print -z to put something into the command buffer.
alias myalias="print -z 'wget -O downloaded.file '"
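Wrapped in a function, this could look like the following minimal sketch (mywget is a hypothetical name):
mywget() {
    # pre-fill the zsh command buffer; edit the line and press Enter to run it
    print -z "wget -O downloaded.file "
}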
I need to write a script that checks some 20k+ files for some 2k+ search strings, and it needs to be flexible, so I came up with this script:
#!/bin/bash
# This script checks all files in a given directory against a list of criteria
shopt -s expand_aliases
source ~/.bashrc
TIMESTAMP=$(date "+%Y-%m-%d-%T")
ROOT_DIR=/data
PROJECT_NAME=$1
FILE_DIR=$ROOT_DIR/projects/$1/$2
RESULT_DIR=$ROOT_DIR/projects/$1/check_result
SEARCHTEXT_FILE=$ROOT_DIR/scripts/$3
OIFS="$IFS"
IFS=$'\n'
files=$(find $FILE_DIR -type f -name '*.json')
for file in $files; do
while read line; do
grep -H -o $line "$file" >> $RESULT_DIR/check_result_$TIMESTAMP.log
done < $SEARCHTEXT_FILE
done
IFS="$OIFS"
This script only produces the empty $RESULT_DIR/check_result_$TIMESTAMP.log log file, with the correct name.
Because the file names sometimes contain spaces, I added the IFS... statements and enclosed $file in double quotes (copied from another post).
The content of the $SEARCHTEXT_FILE is for example:
'Tel alt........'
'City ..........'
If I place an echo before the grep like this
echo grep -H -o $line "$file"
then the output I get is
grep -H -o 'Tel alt........' /data/projects/DNAR/input/report-157538.json
and I can execute this line as is and get the correct result.
I tried to put various combinations of " or ' or ` or () or {} around every part of this grep command, but nothing changed.
Somewhere I read about aliases, and the alias set for grep is
alias grep='grep --color=auto'
After many hours of searching on the internet I couldn't find any post that helped me as most of them are covering issues around wrong quotes or inline bash issues.
What am I missing here?
The simple and obvious workaround is to remove all that complexity and simply use the features of the commands you are running anyway.
find "$FILE_DIR" -type f -name '*.json' \
-exec grep -H -o -f "$SEARCHTEXT_FILE" {} + > "$RESULT_DIR/check_result_$TIMESTAMP.log"
Notice also the quoting fixes; see When to wrap quotes around a shell variable; to avoid mishaps, you should switch to lower case for your private variables (see Correct Bash and shell script variable capitalization).
The shopt -s expand_aliases and source ~/.bashrc lines merely look superfluous, but they could contribute to whatever problem you are trying to troubleshoot; they should basically never be part of a script you plan to use in production.
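Putting both recommendations together, a minimal sketch of the rewritten script (assuming the same three positional arguments as the original) might look like this:
#!/bin/bash
# sketch: check all *.json files under the project directory against the pattern file
timestamp=$(date "+%Y-%m-%d-%T")
root_dir=/data
file_dir=$root_dir/projects/$1/$2
result_dir=$root_dir/projects/$1/check_result
searchtext_file=$root_dir/scripts/$3
# one grep per batch of files; -f reads all patterns from the file
find "$file_dir" -type f -name '*.json' \
    -exec grep -H -o -f "$searchtext_file" {} + \
    > "$result_dir/check_result_$timestamp.log"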
I have a big script (call it test) that, after stripping out the unrelated parts, comes down to just this, which I can use to explain my question:
#!/bin/bash
bash -c "$@"
This doesn't work as expected. E.g. ./test echo hi executes only the echo, and the hi argument disappears!
Testing with various inputs, I can see that only $1 is passed to bash -c ... and the rest are discarded.
But if I use a variable like:
#!/bin/bash
cmd="$@"
bash -c "$cmd"
it works as expected for all inputs.
Questions:
1) I would like to understand why the double quotes don't "pass" the entire command line arguments to bash -c .... What am I missing here (that it works perfectly fine when using an intermediate variable)?
2) Why does bash discard the rest of the arguments (except $1) without any error messages?
For example:
bash -c "ls" -l -a hi hello blah
simply runs ls, and -l -a hi hello blah don't result in any errors at all?
(If possible, please refer to the bash grammar where this behaviour is documented).
1) I would like to understand why the double quotes don't "pass" the entire command line arguments to bash -c .... What am I missing here (that it works perfectly fine when using an intermediate variable)?
From info bash @:
@
($@) Expands to the positional parameters, starting from one. When the expansion occurs within double quotes, each parameter expands to a separate word. That is, "$@" is equivalent to "$1" "$2" ....
Thus, bash -c "$@" is equivalent to bash -c "$1" "$2" .... In the case of the ./test echo hi invocation, the expression is expanded to
bash -c "echo" "hi"
2) Why does bash discard the rest of the arguments (except $1) without any error messages?
Bash actually doesn't discard anything. From man bash:
If the -c option is present, then commands are read from the first non-option argument command_string. If there are arguments after the command_string, they are assigned to the positional parameters, starting with $0.
Thus, for the command bash -c "echo" "hi", Bash passes "hi" as $0 for the "echo" script.
bash -c "ls" -l -a hi hello blah
simply runs ls and -l -a hi hello blah doesn't result in any errors at all?
According to the rules mentioned above, Bash executes the "ls" script and passes the following positional parameters to it:
$0: "-l"
$1: "-a"
$2: "hi"
$3: "hello"
$4: "blah"
Thus, the command actually executes ls, and the positional parameters are unused in the script. You can use them by referencing the positional parameters, e.g.:
$ set -x
$ bash -c "ls \$0 \$1 \$3" -l -a hi hello blah
+ bash -c 'ls $0 $1 $3' -l -a hi hello blah
ls: cannot access hello: No such file or directory
You should be using $* instead of $@ to pass the command line as a string. "$@" expands to multiple quoted arguments, while "$*" combines multiple arguments into a single argument.
#!/bin/bash
bash -c "$*"
The problem is that with your $@ it executes:
bash -c echo hi
But with $* it executes:
bash -c 'echo hi'
When you use:
cmd="$@"
and then bash -c "$cmd", it does the same thing for you.
Read: What is the difference between “$@” and “$*” in Bash?
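To see the difference side by side, here is a minimal demo script (save it as e.g. ./demo, a name chosen for illustration, and run ./demo echo hi there):
#!/bin/bash
printf 'with "$*": '
bash -c "$*"    # one string: bash -c 'echo hi there' prints "hi there"
printf 'with "$@": '
bash -c "$@"    # separate words: bash -c echo hi there; hi and there become $0 and $1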
I'm trying to write a database call from within a bash script and I'm having problems with a sub-shell stripping my quotes away.
This is the bones of what I am doing.
#---------------------------------------------
#! /bin/bash
export COMMAND='psql ${DB_NAME} -F , -t --no-align -c "${SQL}" -o ${EXPORT_FILE} 2>&1'
PSQL_RETURN=`${COMMAND}`
#---------------------------------------------
If I use an 'echo' to print out the ${COMMAND} variable the output looks fine:
echo ${COMMAND}
screen output:
#---------------
psql drupal7 -F , -t --no-align -c "SELECT DISTINCT hostname FROM accesslog;" -o /DRUPAL/INTERFACES/EXPORTS/ip_list.dat 2>&1
#---------------
Also if I cut and paste this screen output it executes just fine.
However, when I try to execute the command as a variable within a sub-shell call, it gives an error message.
The error is from the psql client to the effect that the quotes have been removed from around the ${SQL} string.
The error suggests psql is trying to interpret the terms in the sql string as parameters.
So it seems the string and quotes are composed correctly but the quotes around the ${SQL} variable/string are being interpreted by the sub-shell during the execution call from the main script.
I've tried to escape them using various methods: \", \\", \\\", "", \"" '"', \'"\', ... ...
As you can see from my 'try it all' approach I am no expert and it's driving me mad.
Any help would be greatly appreciated.
Instead of storing the command in a string variable, it is better to use a Bash array here:
cmd=(psql "${DB_NAME}" -F , -t --no-align -c "${SQL}" -o "${EXPORT_FILE}")
PSQL_RETURN=$( "${cmd[@]}" 2>&1 )
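If you want to verify exactly what will be executed, a small sketch: print the array elements with printf's %q format, which shell-quotes them so the output is safe to copy and paste:
printf '%q ' "${cmd[@]}"; echo    # each element printed shell-quoted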
Rather than evaluating the contents of a string, why not use a function?
call_psql() {
# optional, if variables are already defined in global scope
DB_NAME="$1"
SQL="$2"
EXPORT_FILE="$3"
psql "$DB_NAME" -F , -t --no-align -c "$SQL" -o "$EXPORT_FILE" 2>&1
}
then you can just call your function like:
PSQL_RETURN=$(call_psql "$DB_NAME" "$SQL" "$EXPORT_FILE")
It's entirely up to you how elaborate you make the function. You might like to check for the correct number of arguments (using something like (( $# == 3 ))) before calling the psql command.
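Such a check could look like this sketch (the usage message is my own wording):
call_psql() {
    (( $# == 3 )) || { echo "usage: call_psql db_name sql export_file" >&2; return 1; }
    psql "$1" -F , -t --no-align -c "$2" -o "$3" 2>&1
}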
Alternatively, perhaps you'd prefer just to make it as short as possible:
call_psql() { psql "$1" -F , -t --no-align -c "$2" -o "$3" 2>&1; }
In order to capture the command that is being executed for debugging purposes, you can use set -x in your script. This will print the contents of the function, including the expanded variables, when the function (or any other command) is called. You can switch this behaviour off using set +x, or if you want it on for the whole duration of the script, you can change the shebang to #!/bin/bash -x. This saves you explicitly echoing throughout your script to find out what commands are being run; you can just turn on set -x for a section.
A very simple example script using the shebang method:
#!/bin/bash -x
ec() {
echo "$1"
}
var=$(ec 2)
Running this script, either directly after making it executable or calling it with bash -x, gives:
++ ec 2
++ echo 2
+ var=2
Removing the -x from the shebang or the invocation results in the script running silently.
I'm trying to execute a wget command with a variable inside it, but it just ignores the variable. Any idea what I am doing wrong?
#!/bin/bash
URL=http:://www.myurl.com
echo $(date) 'Running wget...'
wget -O - -q "$URL/something/something2"
Four things:
Add quotes around your URL: http:://www.myurl.com ==> "http:://www.myurl.com"
Remove the double colon: "http:://www.myurl.com" ==> "http://www.myurl.com"
Get rid of the extra flags and hyphen on the wget command: wget -O - -q "$URL/something/something2" ==> wget "$URL/something/something2"
Add curly braces around your variable: wget "$URL/something/something2" ==> wget "${URL}/something/something2"
This works:
#!/bin/bash
URL="http://www.google.com"
echo $(date) 'Running wget...'
wget "${URL}"
I use this code in IPython (Colab):
URL = 'http://www.myurl.com'
!wget {URL}
I wrote this answer because I was searching for it!
Another handy option in Bash (or other shells) is to create a simple helper function that calls wget with the common options required by nearly all sites and any specific options you generally use. This reduces the typing involved and can also be useful in your scripts. I place the following in my ~/.bashrc to make it available to all shells/subshells. It validates input, checks that wget is available, and then passes all command line arguments to wget with the default options set in the script:
wgnc () {
if [ -z "$1" ]; then
printf " usage: wg <filename>\t\t(runs wget --no-check-certificate --progress=bar)\n"
elif ! type wget &>/dev/null; then
printf " error: 'wget' not found on system\n"
else
printf " wget --no-check-certificate --progress=bar %s\n" "$#"
wget --no-check-certificate --progress=bar "$#"
fi
}
You can cut down typing even more by aliasing the function further. I use:
alias wg='wgnc'
This reduces the normal wget --no-check-certificate --progress=bar URL to simply wg URL. Obviously, you can set the options to suit your needs, but this is a further way to utilize wget in your scripts.
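For example, with a hypothetical URL:
wg https://example.com/file.tar.gz
# first echoes the expanded command line, then runs it:
#   wget --no-check-certificate --progress=bar https://example.com/file.tar.gz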
I'm trying to write a shell script that calls another script that then executes a rsync command.
The second script should run in its own terminal, so I use a gnome-terminal -e "..." command. One of the parameters of this script is a string containing the parameters that should be given to rsync. I put those into single quotes.
Everything worked fine until one of the rsync parameters was a directory path that contained a space. I tried numerous combinations of ', ", \", and \', but the script either doesn't run at all or only the first part of the path is taken.
Here's a slightly modified version of the code I'm using
gnome-terminal -t 'Rsync scheduled backup' -e "nice -10 /Scripts/BackupScript/Backup.sh 0 0 '/Scripts/BackupScript/Stamp' '/Scripts/BackupScript/test' '--dry-run -g -o -p -t -R -u --inplace --delete -r -l '\''/media/MyAndroid/Internal storage'\''' "
Within Backup.sh this command is run
rsync $5 "$path"
where the destination $path is calculated from text in Stamp.
How can I achieve these three levels of nested quotations?
These are some questions I looked at just now (I've tried other sources earlier as well):
https://unix.stackexchange.com/questions/23347/wrapping-a-command-that-includes-single-and-double-quotes-for-another-command
how to make nested double quotes survive the bash interpreter?
Using multiple layers of quotes in bash
Nested quotes bash
I was unsuccessful in applying the solutions to my problem.
Here is an example. caller.sh uses gnome-terminal to execute foo.sh, which in turn prints all the arguments and then calls rsync with the first argument.
caller.sh:
#!/bin/bash
gnome-terminal -t "TEST" -e "./foo.sh 'long path' arg2 arg3"
foo.sh:
#!/bin/bash
echo $# arguments
for i; do # same as: for i in "$@"; do
echo "$i"
done
rsync "$1" "some other path"
Edit: If $1 contains several parameters to rsync, some of which are long paths, the above won't work, since bash either passes "$1" as one parameter, or $1 as multiple parameters, splitting it without regard to contained quotes.
There is (at least) one workaround, you can trick bash as follows:
caller2.sh:
#!/bin/bash
gnome-terminal -t "TEST" -e "./foo.sh '--option1 --option2 \"long path\"' arg2 arg3"
foo2.sh:
#!/bin/bash
rsync_command="rsync $1"
eval "$rsync_command"
This will do the equivalent of typing rsync --option1 --option2 "long path" on the command line.
WARNING: This hack introduces a security vulnerability, $1 can be crafted to execute multiple commands if the user has any influence whatsoever over the string content (e.g. '--option1 --option2 \"long path\"; echo YOU HAVE BEEN OWNED' will run rsync and then execute the echo command).
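A somewhat safer sketch (my own variation, not part of the original answer) avoids eval entirely by passing each rsync option as a separate argument and forwarding them with "$@"; gnome-terminal splits the -e string on the embedded quotes just as in caller.sh above:
caller3.sh:
#!/bin/bash
gnome-terminal -t "TEST" -e "./foo3.sh --dry-run --inplace 'long path'"
foo3.sh:
#!/bin/bash
# every argument, including the quoted long path, arrives intact
rsync "$@" "some other path"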
Did you try escaping the space in the path with "\ " (no quotes)?
gnome-terminal -t 'Rsync scheduled backup' -e "nice -10 /Scripts/BackupScript/Backup.sh 0 0 '/Scripts/BackupScript/Stamp' '/Scripts/BackupScript/test' '--dry-run -g -o -p -t -R -u --inplace --delete -r -l ''/media/MyAndroid/Internal\ storage''' "