Understanding !# and $@ in bash - linux

I've just recently started programming in Scala, and in the book "Programming in Scala" (www.artima.com/pins1ed) the following method of executing Scala scripts on Linux is presented:
#!/bin/sh
exec scala "$0" "$@"
!#
// Say hello to the first argument
println("Hello, "+ args(0) +"!")
Now I've been using Linux for a long time, but bash scripting is not my speciality. I can guess how this type of script works (and it works beautifully), but I was wondering: what do !# and $@ do exactly?
Thanks in advance for all the help!

Beautiful indeed. $0 and "$@" are positional parameters ($0 is the command itself, just like argv[0] in C, and "$@" expands to argv[1] onward), whereas the #! line tells the shell, and sometimes the kernel if it recognizes it, which program to execute for the file.
What actually happens here is that the shell opens the script for reading, but at the point of exec it hands the input over to scala. Since the file descriptor is still open, scala wouldn't have to read the file again from the beginning, and so it continues reading at the next line.
Rarely do I see scripts that do this with such a clear and simple presentation of how it works.
Note that exec replaces the process of the shell running the script, so it's as if the shell becomes scala, but scala inherits the same environment variables and open file descriptors.
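The replacement behaviour of exec is easy to see with a tiny sketch (the echoed strings are just for illustration; the exec is wrapped in a subshell so the demo itself can continue):

```shell
#!/bin/sh
# exec replaces the current shell process; nothing after it ever runs.
# (wrapped in a subshell here so the surrounding script survives the exec)
(
  echo "before exec, PID $$"
  exec sh -c 'echo "replaced, but same environment: HOME=$HOME"'
  echo "this line is never reached"
)
echo "back in the parent shell"
```

The line after exec never prints, because by then the subshell process has already become the new program.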
UPDATE
Looks like I was wrong. Scala itself reads the whole file but skips what it recognizes as header lines. So this is the real purpose of !#:
Script files may have an optional header that is ignored if present. There are two ways to format the header: either beginning with #! and ending with !#, or beginning with ::#! and ending with ::!#.

!# doesn't have anything to do with Bash. It's part of the Scala language. It separates a non-Scala header from Scala code in script mode.

"$#" represents all the script arguments.

You asked what "$@" does exactly. It passes the arguments to the script without word-splitting them. Let's see some examples:
$ cat echowrap
#!/bin/sh
set -x
echo $*
echo $@
echo "$@"
$ ./echowrap oneword 'two words'
+ echo oneword two words
oneword two words
+ echo oneword two words
oneword two words
+ echo oneword 'two words'
oneword two words
In the first example, $* has split the input args so that echo sees three words. In the second example, $@ behaves identically. In the third example, "$@" does not undergo word-splitting, so echo sees the same two arguments that were originally passed.
Consider a more useful example; if you called your script as
$ ./scalascript 'Joe Bloggs'
then try changing "$@" into $@ or $*: the shell will pass two arguments to scala, scala will see args(0) and args(1), and the output of the test program will be different.
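You can see the same effect without scala at all; here a shell function stands in for the interpreter (interp is a made-up name for illustration):

```shell
#!/usr/bin/env bash
# interp is a stand-in for "exec scala ..." -- it just reports its arguments
interp() { echo "got $# arg(s), first: $1"; }

set -- 'Joe Bloggs'     # simulate calling the script with one quoted argument
interp "$@"             # got 1 arg(s), first: Joe Bloggs
interp $*               # got 2 arg(s), first: Joe
```

With "$@" the quoted name arrives as one argument; with $* (or unquoted $@) it is split on the space.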

Related

How to put array in command-string so that I can eval it?

I've been struggling with this problem for a while. Let's assume I have two scripts.
test1.sh
test2.sh
The code in test1.sh is the following:
array1="/dir/file1.txt /dir/file2.txt /dir/file3.txt"
array2="/dir/file4.txt /dir/file5.txt /dir/file6.txt"
./test2.sh "$array1" "$array2"
The code in test2.sh is the following:
echo $1
echo $2
This works fine, and prints the two arrays correctly:
/dir/file1.txt /dir/file2.txt /dir/file3.txt
/dir/file4.txt /dir/file5.txt /dir/file6.txt
For the project I am working on I have to put the execution code in a variable so that I can run it with the eval-command. I've tried it as follows:
array1="/dir/file1.txt /dir/file2.txt /dir/file3.txt"
array2="/dir/file4.txt /dir/file5.txt /dir/file6.txt"
com="./test2.sh "$array1" "$array2" "
eval $com
However, this returns:
/dir/file1.txt
/dir/file2.txt
How do I get it to give the same input? I've been struggling with this for a while now and I'm honestly pretty stuck. I believe it is caused by the many quotation marks in the com-variable, but I am not sure.
Many thanks,
Patrick
Make com an array, and you don't need eval.
#!/usr/bin/env bash
# ^^^^- NOT /bin/sh. Run with "bash yourscript", not "sh yourscript"
# none of these are actually arrays; they're just misleadingly-named strings
array1="/dir/file1.txt /dir/file2.txt /dir/file3.txt"
array2="/dir/file4.txt /dir/file5.txt /dir/file6.txt"
# This is an actual array.
com=( ./test2.sh "$array1" "$array2" )
# Expand each element of the array into a separate word of a simple command
"${com[#]}"

Dynamically generate command in bash

I want to dynamically generate a pretty long bash command depending on the command line options. Here is what I tried:
CONFIG_PATH=""
#Reading CONFIG_PATH from getopts if supplied
SOME_OPT=""
if [ ! -z "$CONFIG_PATH" ]; then
SOME_OPT="-v -s -cp $CONFIG_PATH"
fi
some_bash_command $SOME_OPT
The point here is that I want to pass 0 arguments to the some_bash_command if no arguments were passed to the script. In case there were some arguments I want to pass them.
It works fine, but the problem is that this approach looks rather unnatural to me.
What would be a better yet practical way to do this?
Your approach is more-or-less the standard one; the only significant improvement that I'd recommend is to use an array, so that you can properly quote the arguments. (Otherwise your command can horribly misbehave if any of the arguments happen to include special characters such as spaces or asterisks.)
So:
SOME_OPT=()
if [ ! -z "$CONFIG_PATH" ]; then
SOME_OPT=(-v -s -cp "$CONFIG_PATH")
fi
some_bash_command "${SOME_OPT[@]}"
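The useful property of "${SOME_OPT[@]}" is that an empty array expands to zero words, so the command really does receive no extra arguments. A minimal sketch (count is a made-up helper):

```shell
#!/usr/bin/env bash
count() { echo "$#"; }   # made-up helper: prints its argument count

SOME_OPT=()
count "${SOME_OPT[@]}"   # 0 -- an empty array contributes no words at all

SOME_OPT=(-v -s -cp "/path/with spaces/conf")
count "${SOME_OPT[@]}"   # 4 -- the path stays one argument despite the space
```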

How to store command arguments which contain double quotes in an array?

I have a Bash script which generates, stores and modifies values in an array. These values are later used as arguments for a command.
For an MCVE I thought of an arbitrary command bash -c 'echo 0="$0" ; echo 1="$1"' which illustrates my problem. I will call my command with two arguments -option1=withoutspace and -option2="with space". So it would look like this
> bash -c 'echo 0="$0" ; echo 1="$1"' -option1=withoutspace -option2="with space"
if the call to the command would be typed directly into the shell. It prints
0=-option1=withoutspace
1=-option2=with space
In my Bash script, the arguments are part of an array. However
#!/bin/bash
ARGUMENTS=()
ARGUMENTS+=('-option1=withoutspace')
ARGUMENTS+=('-option2="with space"')
bash -c 'echo 0="$0" ; echo 1="$1"' "${ARGUMENTS[@]}"
prints
0=-option1=withoutspace
1=-option2="with space"
which still shows the double quotes (because they are interpreted literally?). What works is
#!/bin/bash
ARGUMENTS=()
ARGUMENTS+=('-option1=withoutspace')
ARGUMENTS+=('-option2=with space')
bash -c 'echo 0="$0" ; echo 1="$1"' "${ARGUMENTS[@]}"
which prints again
0=-option1=withoutspace
1=-option2=with space
What do I have to change to make ARGUMENTS+=('-option2="with space"') work as well as ARGUMENTS+=('-option2=with space')?
(Maybe it's even entirely wrong to store arguments for a command in an array? I'm open for suggestions.)
Get rid of the single quotes. Write the options exactly as you would on the command line.
ARGUMENTS+=(-option1=withoutspace)
ARGUMENTS+=(-option2="with space")
Note that this is exactly equivalent to your second option:
ARGUMENTS+=('-option1=withoutspace')
ARGUMENTS+=('-option2=with space')
-option2="with space" and '-option2=with space' both evaluate to the same string. They're two ways of writing the same thing.
(Maybe it's even entirely wrong to store arguments for a command in an array? I'm open for suggestions.)
It's the exact right thing to do. Arrays are perfect for this. Using a flat string would be a mistake.
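The underlying rule: quotes are shell syntax, removed when the word is parsed; quotes that arrive inside an already-quoted string are ordinary characters. A quick sketch:

```shell
#!/usr/bin/env bash
# quotes inside single quotes are literal characters, not shell syntax
a='-option2="with space"'   # the double quotes are part of the value
b=-option2="with space"     # the double quotes are removed by the shell
echo "$a"    # -option2="with space"
echo "$b"    # -option2=with space
```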

"read" command not executing in "while read line" loop [duplicate]

This question already has answers here:
Read user input inside a loop
(6 answers)
Closed 5 years ago.
First post here! I really need help on this one. I looked the issue up on Google, but can't manage to find a useful answer. So here's the problem.
I'm having fun coding something like a framework in bash. Everyone can create their own module and add it to the framework. BUT. To know what arguments the script requires, I created an "args.conf" file that must be in every module, that kinda looks like this:
LHOST;true;The IP the remote payload will connect to.
LPORT;true;The port the remote payload will connect to.
The first column is the argument name, the second defines if it's required or not, the third is the description. Anyway, long story short, the framework is supposed to read the args.conf file line by line to ask the user a value for every argument. Here's the piece of code:
info "Reading module $name argument list..."
while read line; do
echo $line > line.tmp
arg=`cut -d ";" -f 1 line.tmp`
requ=`cut -d ";" -f 2 line.tmp`
if [ $requ = "true" ]; then
echo "[This argument is required]"
else
echo "[This argument isn't required, leave a blank space if you don't wan't to use it]"
fi
read -p " $arg=" answer
echo $answer >> arglist.tmp
done < modules/$name/args.conf
tr '\n' ' ' < arglist.tmp > argline.tmp
argline=`cat argline.tmp`
info "Launching module $name..."
cd modules/$name
$interpreter $file $argline
cd ../..
rm arglist.tmp
rm argline.tmp
rm line.tmp
succes "Module $name execution completed."
As you can see, it's supposed to ask the user a value for every argument... But:
1) The read command seems to not be executing. It just skips it, and the argument gets no value
2) Despite the fact that the args.conf file contains 3 lines, the loop seems to execute just a single time. All I see on the screen is "[This argument is required]" just once, and then the module just launches (and crashes because it doesn't have the required arguments...).
Really don't know what to do here... I hope someone has an answer ^^'.
Thanks in advance!
(and sorry for any mistakes, I'm French)
Alpha.
As @that other guy pointed out in a comment, the problem is that all of the read commands in the loop are reading from the args.conf file, not from the user. The way I'd handle this is by redirecting the conf file over a different file descriptor than stdin (fd #0); I like to use fd #3 for this:
while read -u3 line; do
...
done 3< modules/$name/args.conf
(Note: if your shell's read command doesn't understand the -u option, use read line <&3 instead.)
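A self-contained sketch of the pattern (the conf contents are made up to match the question):

```shell
#!/usr/bin/env bash
# the loop reads the conf file on fd 3, so the inner read still uses stdin
conf=$(mktemp) || exit 1
printf 'LHOST;true;The IP to connect to.\nLPORT;true;The port.\n' > "$conf"

while IFS=';' read -u3 arg requ desc; do
    read -r answer          # reads from stdin (the user), not from fd 3
    echo "$arg=$answer"
done 3< "$conf"
rm -f "$conf"
```

Run as `printf '10.0.0.1\n4444\n' | ./script` it prints LHOST=10.0.0.1 and LPORT=4444; each inner read now consumes a line of user input rather than the next conf line.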
There are a number of other things in this script I'd recommend against:
Variable references without double-quotes around them, e.g. echo $line instead of echo "$line", and < modules/$name/args.conf instead of < "modules/$name/args.conf". Unquoted variable references get split into words (if they contain whitespace) and any wildcards that happen to match filenames will get replaced by a list of matching files. This can cause really weird and intermittent bugs. Unfortunately, your use of $argline depends on word splitting to separate multiple arguments; if you're using bash (not a generic POSIX shell) you can use arrays instead; I'll get to that.
You're using relative file paths everywhere, and cd'ing in the script. This tends to be fragile and confusing, since file paths are different at different places in the script, and any relative paths passed in by the user will become invalid the first time the script cds somewhere else. Worse, you aren't checking for errors when you cd, so if any cd fails for any reason, the entire rest of the script will run in the wrong place and fail bizarrely. You'd be far better off figuring out where your system's root directory is (as an absolute path), then referencing everything from it (e.g. < "$module_root/modules/$name/args.conf").
Actually, you're not checking for errors anywhere. It's generally a good idea, when writing any sort of program, to try to think of what can go wrong and how your program should respond (and also to expect that things you didn't think of will also go wrong). Some people like to use set -e to make their scripts exit if any simple command fails, but this doesn't always do what you'd expect. I prefer to explicitly test the exit status of the commands in my script, with something like:
command1 || {
echo 'command1 failed!' >&2
exit 1
}
if command2; then
echo 'command2 succeeded!' >&2
else
echo 'command2 failed!' >&2
exit 1
fi
You're creating temp files in the current directory, which risks random conflicts (with other runs of the script at the same time, any files that happen to have names you're using, etc). It's better to create a temp directory at the beginning, then store everything in it (again, by absolute path):
module_tmp="$(mktemp -dt module-system)" || {
echo "Error creating temp directory" >&2
exit 1
}
...
echo "$answer" >> "$module_tmp/arglist.tmp"
(BTW, note that I'm using $() instead of backticks. They're easier to read, and don't have some subtle syntactic oddities that backticks have. I recommend switching.)
Speaking of which, you're overusing temp files; a lot of what you're doing can be done just fine with shell variables and built-in shell features. For example, rather than reading lines from the config file, storing them in a temp file, and using cut to split them into fields, you can simply echo to cut:
arg="$(echo "$line" | cut -d ";" -f 1)"
...or better yet, use read's built-in ability to split fields based on whatever IFS is set to:
while IFS=";" read -u3 arg requ description; do
(Note that since the assignment to IFS is a prefix to the read command, it only affects that one command; changing IFS globally can have weird effects, and should be avoided whenever possible.)
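To see the scoping, compare before and after (a sketch using a here-string in place of fd 3):

```shell
#!/usr/bin/env bash
# the IFS=';' prefix applies only to this one read command
line='LHOST;true;The IP the remote payload will connect to.'
IFS=';' read -r arg requ desc <<< "$line"
echo "$arg / $requ"       # LHOST / true

# afterwards IFS is unchanged, so normal word splitting still works:
set -- $(echo one two three)
echo "$#"                 # 3 -- still split on whitespace, not on ';'
```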
Similarly, storing the argument list in a file, converting newlines to spaces into another file, then reading that file... you can skip any or all of these steps. If you're using bash, store the arg list in an array:
arglist=()
while ...
arglist+=("$answer") # or ("$arg=$answer")? Not sure of your syntax.
done ...
"$module_root/modules/$name/$interpreter" "$file" "${arglist[#]}"
(That messy syntax, with the double-quotes, curly braces, square brackets, and at-sign, is the generally correct way to expand an array in bash).
If you can't count on bash extensions like arrays, you can at least do it the old messy way with a plain variable:
arglist=""
while ...
arglist="$arglist $answer" # or "$arglist $arg=$answer"? Not sure of your syntax.
done ...
"$module_root/modules/$name/$interpreter" "$file" $arglist
... but this runs the risk of arguments being word-split and/or expanded to lists of files.
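A sketch of that risk (count is a made-up helper):

```shell
#!/usr/bin/env bash
count() { echo "$#"; }    # made-up helper: prints its argument count

arglist=""
arglist="$arglist first"
arglist="$arglist second answer"
count $arglist            # 3 -- "second answer" was split into two words

arr=("first" "second answer")
count "${arr[@]}"         # 2 -- each answer stays one argument
```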

What language runs after I start a Konsole window, and what can it do?

How can I store the result of an expression in a variable?
echo "hello" > var1
Can I also do something like this?
var1.substring(10,15);
var1.replace('hello', '2');
var1.indexof('hello')
PS. I had tried Googling, but was not successful.
As @larsmans comments, Konsole is the terminal emulator, which in turn runs a shell.
On Linux, this is typically bash, but it could be something else.
Find out what shell you're using, and print its man page:
echo $SHELL # shows the full path to the shell
man ${SHELL##*/} # use the rightmost part (typically bash, in linux)
For a general introduction, see the Wikipedia entry on the Unix shell or the GNU Bash reference.
Some specific answers:
var1="hello"
echo ${var1:0:4} # prints "hell"
echo ${var1/hello/2} # prints "2" -- replace "hello" with "2"
And at the risk of showing off:
index_of() { (t=${1%%$2*} && echo ${#t}); } # define function index_of
index_of "I say hello" hello
6
But this goes beyond simple shell programming.
Konsole basically runs bash. So it's technically bash that you are looking for.
Suppose:
s="hello"
For var1.substring(1,3);
you would do:
$ echo ${s:1:2}
el
For var1.replace('e', 'u');
you can:
$ echo ${s/e/u} #replace only the first instance
hullo
$ echo ${s//e/u} #this will replace all instances of e with u
For var1.indexof('l')
You can (I don't know of any bash-ish method, but anyway):
$ echo $(expr index hello l)
3
In bash (the standard shell on linux) the syntax for storing the result of an expression in a variable is
VAR=$( EXPRESSION )
so, for example:
$ var=$(echo "hello")
$ echo $var
hello
For your second question: yes, these kinds of things are possible using only the shell - but you're probably better off using a scripting language like Python.
For what it's worth: here is a document describing how to do string manipulations in bash.
As you can see, it's not exactly beautiful.
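Putting the answers above together, the three Java-ish methods map onto bash parameter expansion like this (a sketch; the indexof trick strips everything from the match onward and measures what's left):

```shell
#!/usr/bin/env bash
var1="hello world"
echo "${var1:0:5}"       # substring(0,5)        -> hello
echo "${var1/hello/2}"   # replace('hello','2')  -> 2 world
t=${var1%%world*}        # indexof('world'): strip from the match on,
echo "${#t}"             #   the remaining prefix length is the index -> 6
```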
