define a function-like macro in bash - linux

Is it possible to define a function-like macro in bash so that when I write:
F(sth);
bash runs this:
echo "sth" > a.txt;

Arbitrary syntax can't be made to do what you want: parentheses are metacharacters which have special meaning to the parser, so there's no way to use them as part of a valid name. The best way to extend the shell is to define functions.
This would be a basic echo wrapper that always writes to the same file:
f() {
    echo "$@"
} >a.txt
This does about the same but additionally handles stdin - sacrificing echo's -e and -n options:
f() {
    [[ ${1+_} || ! -t 0 ]] && printf '%s\n' "${*-$(</dev/fd/0)}"
} >a.txt
Which can be called as
f arg1 arg2...
or
f <file
Functions are passed arguments in the same way as any other commands.
The second echo-like wrapper first tests for either a set first argument or stdin coming from a non-tty, and conditionally calls printf with either the positional parameters, if set, or stdin. The test expression avoids the case of both zero arguments and no redirection from a file, in which case the $(</dev/fd/0) expansion would try to read from the terminal and hang the shell.
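For example (a transcript of my own, assuming the stdin-handling wrapper above is defined; each call truncates a.txt, because the >a.txt redirection reopens the file on every invocation):
$ f hello world
$ cat a.txt
hello world
$ printf 'from stdin\n' | f
$ cat a.txt
from stdin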

F () {
    echo "$1" > a.txt
}
You don't use parentheses when you call it:
F "text to save"

Yes, but you should call it with F sth:
F() {
    echo "$1" > a.txt
}

This was answered long ago, but to provide an answer that satisfies the original request (even though that is likely not what is actually desired):
This is based on Magic Aliases: A Layering Loophole in the Bourne Shell by Simon Tatham.
F() { str="$(history 1)"; str=${str# *F(}; echo "${str%)*}"; } >a.txt
alias F='\F #'
$ F(sth)
$ cat a.txt
sth
The trick: the alias expands F to \F #, so everything after it, parentheses included, becomes a comment and is never parsed; the function then recovers the text from the shell's history entry for that line. See also ormaaj's better magic alias.

Related

Bash call variables in for loop variables

I am curious to know whether it is possible in bash to run a for loop over a bunch of variables and retrieve their values within the loop. Example:
a="hello"
b="world"
c="this is bash"
for f in a b c; do {
echo $( $f )
OR
echo $ ( "$f" )
} done
I know this does not work, but can we retrieve the values saved in the a, b and c variables in a for loop by printing f? I have tried multiple ways but have been unable to resolve it.
You need the ! like this:
for f in a b c; do
    echo "${!f}"
done
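Output:
hello
world
this is bash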
You can also use a nameref:
#!/usr/bin/env bash
a="hello"
b="world"
c="this is bash"
declare -n f
for f in a b c; do
    printf "%s\n" "$f"
done
From the documentation:
If the control variable in a for loop has the nameref attribute, the list of words can be a list of shell variables, and a name reference will be established for each word in the list, in turn, when the loop is executed.
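A side note (a minimal sketch of my own, not from the original answer): because f is a nameref, assigning to it inside the loop writes through to the referenced variables themselves:
declare -n f
for f in a b c; do
    f="changed"        # assigns to a, then b, then c via the reference
done
echo "$a" "$b" "$c"    # prints: changed changed changed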
Notes on the OP's code (scroll to the bottom for a corrected version):
for f in a b c; do {
echo $( $f )
} done
Problems:
The purpose of { and } is usually to put the separate outputs of separate unpiped commands into one stream. Example of separate commands:
echo foo; echo bar | tac
Output:
foo
bar
The tac command puts lines of input in reverse order, but in the code above it only gets one line, so there's nothing to reverse.
But with curly braces:
{ echo foo; echo bar; } | tac
Output:
bar
foo
A do ... done already acts just like curly braces, so writing do { instead of do is unnecessary and redundant; it won't harm anything, but it has no effect either.
If f=hello and we write:
echo $f
The output will be:
hello
But the code $( $f ) runs a subshell on $f, which only works if $f is a command. So:
echo $( $f )
...tries to run the command hello, but there probably is no such command, so the subshell will output to standard error:
hello: command not found
...but no data is sent to standard output, so echo will print nothing.
To fix:
a="hello"
b="world"
c="this is bash"
for f in "$a" "$b" "$c"; do
    echo "$f"
done
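which prints:
hello
world
this is bash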

Bash: find function in files with the same content

I'm trying to solve a problem that behaves as follows. Here is the situation: in a directory I have a few scripts with some content (what exactly they do doesn't matter):
example1.sh
example2.sh
example3.sh
...etc
Altogether there are 50 scripts
Some of these scripts contain the same function, for example
function foo1
{
    echo "Hello"
}
and in some scripts a function can have the same name but different or modified content, for example
function foo1
{
    echo "$PWD"
}
or
function foo1
{
    echo "Hello"
    ls -la
}
I have to find the same function with the same name and the same content in these scripts
For example,
foo1 the same or modified content in example1.sh and example2.sh -> what I want
foo1 other content in example1.sh and example3.sh -> not interested
My question is what is the best idea to solve this problem? What do you think?
My idea was to sort the content from all the scripts and grep the names of repeated functions. I managed to do that, but it's still not what I want, because I then have to open every file containing a given function and check its content... and that's a pain in the neck, because for some functions there are 10 scripts...
I was wondering about extracting the content of the repeated functions, but I don't know how to do it. What do you think? Or maybe you have some other suggestions?
Thank you in advance for your answer!
what is the best idea to solve this problem?
Write a shell language tokenizer and implement enough syntax parsing to extract function definitions from a file. The sources of existing shell implementations will be an inspiration. Then build a database of file -> function+body and list all files with the same function+body.
For simple enough functions, an awk, perl or python script would cover most cases, but the best would be a full shell language tokenizer.
Do not use function name {. Instead use name() {. See bash obsolete and deprecated syntax.
With the following files:
# file1.sh
function foo1
{
    echo "Hello"
}
# file2.sh
function foo1
{
    echo "Hello"
}
# file3.sh
function foo1
{
    echo "$PWD"
}
# file4.sh
function foo1
{
    echo "$PWD"
}
The following script:
printf "%s\n" *.sh |
while IFS= read -r file; do
    sed -zE '
        s/(function[[:space:]]+([[:print:]]+)[[:space:]]*\{|(function[[:space:]]+)?([[:print:]]+)[[:space:]]*\([[:space:]]*\)[[:space:]]*\{)([^}]*)}/\x01\2\4\n\5\x02/g;
        /\x01/!d;
        s/[^\x01\x02]*\x01([^\x01\x02]*)\x02[^\x01\x02]*/\1\n\x00/g
    ' "$file" |
    sed -z 's~^~'"$file"'\x01~';
done |
awk -v RS='\0' -v FS='\1' '
    {cnt[$2]++; a[$2]=a[$2]" "$1}
    END{ for (i in cnt) if (cnt[i] > 1) print a[i], i }
'
outputs:
file1.sh file2.sh foo1
echo "Hello"
file3.sh file4.sh foo1
echo "$PWD"
Indicating there is the same function foo1 in file1.sh and file2.sh and the same function foo1 in file3.sh and file4.sh.
Also note that a script can, and some scripts do, define functions conditionally:
if condition; then
    func() { echo something; }
else
    func() { echo something else; }
fi
A real tokenizer will have to also take that into account.
Create a message digest of the content of each function and use it as a key in an associative array, appending each file that contains the same function digest to that key's entry; duplicates then group together.
You may want to normalize space in the function content and tweak the regex address range.
#!/usr/bin/env bash
# the 1st argument is the function name
func_name="$1"
func_pattern="^function $func_name[[:blank:]]*$"
shift
declare -A dupe_groups
while read -r func_dgst file; do # collect results in an associative array
dupe_groups[$func_dgst]+="$file "
done < <( # the remaining arguments are scripts
for f in "${@}"; do
if grep --quiet "$func_pattern" "$f"; then
dgst=$( # use an address range in sed to print function contents,
# pipe to openssl to create a message digest,
# and keep only the hex hash (dropping openssl's "(stdin)= " prefix)
sed -n "/$func_pattern/,/^}/p" "$f" |
openssl dgst -sha1 | awk '{print $NF}' )
echo "$dgst $f"
fi
done )
# print the results
for key in "${!dupe_groups[@]}"; do
echo "$key ${dupe_groups[$key]}"
done
I tested with your example{1..3}.sh files and added the following example4.sh for a duplicate function.
example4.sh
function foo1
{
    echo "Hello"
    ls -la
}
function another
{
    echo "there"
}
To run
./group-func.sh foo1 example1.sh example2.sh example3.sh example4.sh
Results
155853f813e944a7fcc5ae73ee2d959e300d217a example1.sh
7848af9b8b9d48c5cb643f34b3e5ca26cb5bfbdd example2.sh
4771de27523a765bb0dbf070691ea1cbae841375 example3.sh example4.sh
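As mentioned above, you may want to normalize whitespace in the function content before hashing; one possible tweak (a sketch of mine, not part of the original answer) squeezes runs of blanks in the pipeline that feeds openssl:
sed -n "/$func_pattern/,/^}/p" "$f" | tr -s '[:blank:]' ' ' | openssl dgst -sha1 | awk '{print $NF}'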

Define bash variable to be evaluated every time it is used

I want to define a bash variable which will be evaluated every time it is used.
My goal is to define two variables:
A=/home/userA
B=$A/my_file
So whenever I update A, B will be updated with the new value of A
I know how to do it in prompt variables, but, is there a way to do it for regular variables?
If you have Bash 4.4 or newer, you could (ab)use the ${parameter@P} parameter expansion, which expands parameter as if it were a prompt string:
$ A='/home/userA'
$ B='$A/my_file' # Single quotes to suppress expansion
$ echo "${B@P}"
/home/userA/my_file
$ A='/other/path'
$ echo "${B@P}"
/other/path/my_file
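One reason this counts as abuse (a minimal illustration of my own, not from the original answer): @P also interprets prompt escape sequences, so a value containing something like \u is transformed as well:
$ B='\u/my_file'
$ echo "${B@P}"    # \u is expanded to the current username
alice/my_file      # "alice" is a hypothetical username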
However, as pointed out in the comments, it's much simpler and more portable to use a function instead:
$ appendfile() { printf '%s/%s\n' "$1" 'my_file'; }
$ A='/home/user'
$ B=$(appendfile "$A")
$ echo "$B"
/home/user/my_file
$ A='/other/path'
$ B=$(appendfile "$A")
$ echo "$B"
/other/path/my_file
No. Use a simple and robust function instead:
b() {
    echo "$a/my_file"
}
a="/home/userA"
echo "b outputs $(b)"
a="/foo/bar"
echo "b outputs $(b)"
Result:
b outputs /home/userA/my_file
b outputs /foo/bar/my_file
That said, here's one ugly way of fighting the system to accomplish your goal verbatim:
# Trigger a re-assignment after every single command
trap 'b="$a/my_file"' DEBUG
a="/home/userA"
echo "b is $b"
a="/foo/bar"
echo "b is $b"
Result:
b is /home/userA/my_file
b is /foo/bar/my_file

How to parse a string which contains spaces as an argument in Bash Script [duplicate]

In bash one can escape arguments that contain whitespace.
foo "a string"
This also works for arguments to a command or function:
bar() {
    foo "$@"
}
bar "a string"
So far so good, but what if I want to manipulate the arguments before calling foo?
This does not work:
bar() {
    for arg in "$@"
    do
        args="$args \"prefix $arg\""
    done
    # Everything looks good ...
    echo $args
    # ... but it isn't.
    foo $args
    # foo "$args" would just be silly
}
bar a b c
So how do you build argument lists when the arguments contain whitespace?
There are (at least) two ways to do this:
Use an array and expand it using "${array[@]}":
bar() {
    local i=0 args=()
    for arg in "$@"
    do
        args[$i]="prefix $arg"
        ((++i))
    done
    foo "${args[@]}"
}
So, what have we learned? "${array[@]}" is to ${array[*]} what "$@" is to $*.
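A quick illustration of that analogy (a sketch of my own, not part of the original answer):
args=("a b" "c")
printf '<%s>\n' "${args[@]}"   # prints <a b> then <c>: each element stays a separate word
printf '<%s>\n' "${args[*]}"   # prints <a b c>: all elements joined into a single word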
Or if you do not want to use arrays you need to use eval:
bar() {
    local args=()
    for arg in "$@"
    do
        args="$args \"prefix $arg\""
    done
    eval foo $args
}
Here is a shorter version which does not require the use of a numeric index:
(example: building arguments to a find command)
dir=$1
shift
for f in "$@" ; do
    args+=(-iname "*$f*")
done
find "$dir" "${args[@]}"
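Called as, say, ./build-find.sh ~/docs report draft (a hypothetical script name), this runs find ~/docs -iname '*report*' -iname '*draft*'.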
Use arrays (one of the hidden features in Bash).
You can use the arrays just as you suggest, with a small detail changed. The line calling foo should read
foo "${args[#]}"
I had a problem with this as well. I was writing a bash script to back up the important files on my Windows computer (cygwin). I tried the array approach too, and still had some issues. I'm not sure exactly how I fixed it, but here are the parts of my code that matter, in case they help you.
WORK="d:\Work Documents\*"
# prompt and 7zip each file
for x in $SVN $WEB1 $WEB2 "$WORK" $GRAPHICS $W_SQL
do
    echo "Add $x to archive? (y/n)"
    read DO
    if [ "$DO" == "y" ]; then
        echo "compressing $x"
        7zip a $W_OUTPUT "$x"
    fi
    echo ""
done

How to pass command line parameters with quotes stored in single variable?

I want to call an external application from a shell script, but this shell script gets its parameters (from another script) in a single variable. All was OK until I had to use double quotes around a single parameter made of words separated by spaces.
Here is a simplified example of my problem (sh_param just prints all passed parameters):
#!/bin/sh
pass() {
    echo "Result with \$@"
    ./sh_param $@
    echo "Result with \"\$@\""
    ./sh_param "$@"
    echo "Result with \$*"
    ./sh_param $*
    echo "Result with \"\$*\""
    ./sh_param "$*"
}
pass '"single param" separate params'
and the results:
Result with $@
Param: "single
Param: param"
Param: separate
Param: params
Result with "$@"
Param: "single param" separate params
Result with $*
Param: "single
Param: param"
Param: separate
Param: params
Result with "$*"
Param: "single param" separate params
And I want:
Param: single param
Param: separate
Param: params
script
pass() {
    echo 'Result with "$@"'
    sh_param "$@"
}
sh_param() {
    for i in "$@"
    do
        echo "Param: $i"
    done
}
pass "single param" separate param
result
Result with "$@"
Param: single param
Param: separate
Param: param
Answering my own question. BIG thanks goes to pzanoni.
xargs seems to correctly parse anything you throw at it :-)
"$@", "$*", $@ and $* all work well with it. So my code now looks like:
#!/bin/sh
pass() {
    echo $* | xargs ./sh_param
}
pass '"single param" separate params'
And result is what I wanted:
Param: single param
Param: separate
Param: params
If you're stuck with a single variable, you'll have to use eval:
$ show() { i=0; for param; do ((i++)); echo "$i>$param"; done; }
$ show '"single param" separate params'
1>"single param" separate params
$ eval show '"single param" separate params'
1>single param
2>separate
3>params
Note that the double quotes are eaten by the shell.
Avoid passing the whole thing as a single argument (as you do with the single quotes). It's hard enough to make shell scripts that carefully preserve the number of arguments passed, without making it harder by flattening them into strings and expanding them again.
If you do need to though, there are some best practices to follow. I had to write a script a while back that serves as a su/sudo wrapper: su takes a single argument that it passes to sh to evaluate; sudo takes any number of arguments that it passes on unmodified to an execv(e).
You have to be pretty careful about getting it portable and not running into problems in obscure shells. The whole script isn't very useful to post, but the general gist is to write some escaping functions and very carefully perform the quoting, building up a bullet-proof, unreadably-cautiously escaped string that's safe to pass to eval.
bash_escape() {
    # backtick indirection strictly necessary here: we use it to strip the
    # trailing newline from sed's output, which Solaris/BSD sed *always* output
    # (unlike GNU sed, which outputs "test": printf %s test | sed -e s/dummy//)
    out=`echo "$1" | sed -e s/\\'/\\''\\\\'\\'\\'/g`
    printf \'%s\' "$out"
}
append_bash_escape() {
    printf "%s " "$1"
    bash_escape "$2"
}
sed_escape() {
    out=`echo "$1" | sed -e 's/[\\/&]/\\\\&/g'`
    printf %s "$out"
}
These useful functions let you do something like this to portably build command strings:
COMMAND=
while [ $# -gt 0 ] ; do
    COMMAND=`append_bash_escape "$COMMAND" "$1"` ; shift
done
You can then manipulate the command string, for example by running bash_escape on it and using sed_escape to substitute into the string "su - root -c SUB_HERE", or just substitute it directly into a string like "sudo -- SUB_HERE". You then have something you can safely eval without worrying about metacharacters, argument splitting, unescaped globs, and so on.
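For instance, a round trip through bash_escape and eval preserves an argument intact (a minimal sketch of my own; the test string is arbitrary):
escaped=`bash_escape "it's got spaces and a quote"`
eval "printf '%s\n' $escaped"   # prints: it's got spaces and a quote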
Be paranoid, and unit-test your script with every nasty input you can think of, to make sure your argument splitting and preserving really is correct!
This might work, although it's hard to tell without knowing how sh_param handles its arguments.
#!/bin/sh
pass() {
    echo "Result with \"\$@\""
    ./sh_param "$@"
}
pass "single param" separate params
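Assuming sh_param prints each of its arguments on its own line as in the question, this should output:
Result with "$@"
Param: single param
Param: separate
Param: params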
