"For" loop in bash script only run once - linux

The script's goal is simple.
I have many directories, each containing some captured traffic files.
I want to run a command for each directory, so I came up with a script. But I don't know why the script runs only for the first match.
#!/bin/bash
# Collect throughput from a group of directory containing capture files
# Group of directory can be specify by pattern
# Usage: ./collectThroughputList [regex]
# [regex] is the name pattern of the group of directory
for DIR in $( ls -d $1 ); do
    if test -d "$DIR"; then
        echo Collecting throughputs from directory: "$DIR"
        ( sh collectThroughput.sh $DIR > $DIR.txt )
    fi
done
echo Done\!
I tried it with:
for DIR in $1; do
or
for DIR in `ls -d $1`; do
or
for DIR in $( ls -d "$1" ); do
or
for DIR in $( ls -d $1 ); do
But the result is the same: the for loop runs only once.
Finally I found the command below and, with some tricks, got it to work. However, I would like to know why my first script doesn't work.
find *Delay50ms* -type d -exec bash -c "cd '{}' && echo enter '{}' && ../collectThroughput.sh ../'{}' > ../'{}'.txt" \;
"*Delay*" is the directory pattern name that I want to run the command with.
Thanks for pointing out the issues.

Since you want to find all sub-directories under $1, use it like this:
for DIR in $(find $1 -type d)

Problem
Most probably the problem you are encountering is that you are passing a pattern like * as the argument to your script.
Running it with something like:
my_script *
What happens here is that the shell expands * before calling your script.
Thus, after word splitting has been performed, $1 in your script will reference only the first entry of that expansion.
Example
Given the following directory layout:
directory_a
directory_b
directory_c
Calling my_script * will result in:
my_script directory_a directory_b directory_c
being called, so your loop just iterates over $(ls -d directory_a), which in fact is nothing but directory_a alone.
Solution
To have the program run with $1=* you would have to escape the * prior to calling your script.
Try running:
my_script \*
to see that it effectively does what it is intended to do. This way $1 in your script will contain * instead of directory_a, which most probably is the way you wanted your script to work.
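As a quick check (a minimal sketch; show_args.sh is a hypothetical name, not part of the original post), a script that only prints what it receives makes the difference visible:
#!/bin/bash
# show_args.sh - print how many arguments were received and the first one
echo "got $# argument(s)"
echo "first argument: $1"
Called as ./show_args.sh * it reports one argument per matching entry, with directory_a first; called as ./show_args.sh \* it reports a single argument, the literal *.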

As mikyra has pointed out, the shell expands your argument * to all entries in your directory before passing it to your script.
If you want shell expansion of your wildcards (e.g. * matches everything except hidden files), you can simply leave the expansion to the shell and use the result, by iterating over all arguments rather than just the first one:
for DIR in "$@"; do
# ...
done
If you want to do the expansion yourself (e.g. because the pattern should be applied only to a pre-filtered list or to files in a different directory, or because you want regex expansion rather than shell globbing), you have to protect the argument from being expanded by the shell, either with backslash notation (like mikyra's \*) or with quotes (which are often easier to use):
my_script "*"
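For reference, the loop from the question rewritten for the first approach (iterating over "$@") might look like this (a sketch with quoting added, not the poster's final code):
for DIR in "$@"; do
    if test -d "$DIR"; then
        echo Collecting throughputs from directory: "$DIR"
        sh collectThroughput.sh "$DIR" > "$DIR.txt"
    fi
done
Invoked as ./collectThroughputList *Delay50ms*, the shell expands the pattern and the loop visits every matching directory.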

Related

Shell Script Issue with Multiple Filetypes

Some of my files are separated into different directories, such as /apps, /games, /docs, etc.
Within each directory is a subdirectory called _CHECKSUM. Inside this directory is a script called openssl.sh.
For example:
openssl sha1 /path/to/apps/*.iso | sed 's/\/.*.\///' > /path/to/apps/_CHECKSUM/sum.sha1
This outputs to a file called sum.sha1 within the _CHECKSUM directory, of which the contents could look like this:
SHA1(anApp.iso)= b398c8b175411e6174942d7b4acbc5c90473a852
SHA1(anotherApp.iso)= cc150483feed3d4b607749f31eddccefd0ba5478
SHA1(yetAnotherApp.iso)= d9682a2eca25b70dddf7a906374c27ee35614c7d
However, some directories contain multiple filetypes, so the script would have to look like this:
openssl sha1 /path/to/games/*.{7z,iso} | sed 's/\/.*.\///' > /path/to/games/_CHECKSUM/sum.sha1
producing something like this:
SHA1(myFaveGame.7z)= b398c8b175411e6174942d7b4acbc5c90473a852
SHA1(anotherGoodGame.iso)= cc150483feed3d4b607749f31eddccefd0ba5478
I don't want to always run these scripts manually, so I created the following script, /path/to/scripts/openssl_recursive.sh:
#!/bin/bash
# finds every openssl.sh recursively and executes it.
IFS=$'\n'
for file in $(find /path/to -name "openssl.sh" | sort -n)
do
    echo "executing $file ..."
    sh $file
    echo "done.";
done
This seems to work fine for all directories where just one file type exists. However, for the openssl.sh scripts that contain multiple extensions, an empty sum.sha1 file is created.
Why is it that if I run the openssl.sh directly, it will create the correct result in sum.sha1 for directories with multiple filetypes, yet if I run the openssl_recursive.sh, this results in an empty sum.sha1?
As stated here, modern Debian and Ubuntu systems symlink sh to dash by default, which is a lighter shell that lacks some advanced features.
So sh may not be the same shell as bash, and dash does not perform brace expansion, so constructs like *.{7z,iso} are not expanded. You must have fallen into that category.
bash, on the other hand, handles those patterns happily.
So a working solution is to force the scripts to run under /bin/bash explicitly:
#!/bin/bash
# finds every openssl.sh recursively and executes it.
IFS=$'\n'
for file in $(find /path/to -name "openssl.sh" | sort -n)
do
    echo "executing $file ..."
    /bin/bash $file
    echo "done.";
done
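A quick way to see the difference (assuming sh is dash on your system) is to run the same pattern through both shells:
sh -c 'echo *.{7z,iso}'     # dash: no brace expansion, the whole pattern is treated as one (probably non-matching) glob
bash -c 'echo *.{7z,iso}'   # bash: the braces expand to *.7z *.iso, which are then globbed against the files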

how to pass asterisk into ls command inside bash script

Hi… Need a little help here…
I tried to emulate the DOS' dir command in Linux using bash script. Basically it's just a wrapped ls command with some parameters plus summary info. Here's the script:
#!/bin/bash
# default to current folder
if [ -z "$1" ]; then var=.;
else var="$1"; fi
# check file existence
if [ -a "$var" ]; then
    # list contents with color, folder first
    CMD="ls -lgG $var --color --group-directories-first"; $CMD;
    # sum all files size
    size=$(ls -lgGp "$var" | grep -v / | awk '{ sum += $3 }; END { print sum }')
    if [ "$size" == "" ]; then size="0"; fi
    # create summary
    if [ -d "$var" ]; then
        folder=$(find $var/* -maxdepth 0 -type d | wc -l)
        file=$(find $var/* -maxdepth 0 -type f | wc -l)
        echo "Found: $folder folders "
        echo " $file files $size bytes"
    fi
# error message
else
    echo "dir: Error \"$var\": No such file or directory"
fi
The problem is that when the argument contains an asterisk (*), the ls inside the script behaves differently compared to the same ls command typed directly at the prompt. Instead of returning the whole file list, the script only returns the first file. See the video below for a comparison in action. I don't know why it behaves like that.
Anyone knows how to fix it? Thank you.
Video: problem in action
UPDATE:
The problem has been solved. Thank you all for the answers. Now my script works as expected. See the video here: http://i.giphy.com/3o8dp1YLz4fIyCbOAU.gif
The asterisk * is expanded by the shell when it parses the command line. In other words, your script doesn't get a parameter containing an asterisk; it gets a list of files as arguments. Your script only works with $1, the first argument. It should work with "$@" instead.
This is because when you retrieve $1 you assume the shell does NOT expand *.
In fact, when * (or another glob) matches, it is expanded into one word per matching name, and those words are passed as $1, $2, etc.
You were lucky to simply get the first file. If a file's path contains spaces and you later use it unquoted, it gets split at the spaces and you'll get an error.
Seriously, read this and especially this. Really.
And please don't do things like
CMD=whatever you get from user input; $CMD;
You are begging for trouble. Don't execute arbitrary strings that come from user input.
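A minimal sketch of a safer pattern (my own illustration of the advice above, not code from the thread): build the command as an array and quote every expansion, so nothing gets re-split or re-globbed:
# store the options in an array instead of a flat string
cmd=(ls -lgG --color --group-directories-first --)
# run it; "$@" passes every script argument through as-is, spaces and all
"${cmd[@]}" "$@"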
Both answers above already answered your question, so I'm going to be a bit more verbose.
Your terminal is (probably) running the bash interpreter. This is the program that parses your input line(s) and does "things" based on your input.
When you enter a line, bash starts the following workflow:
parsing and lexical analysis
expansion
  brace expansion
  tilde expansion
  variable expansion
  arithmetic and other substitutions
  command substitution
  word splitting
  filename generation (globbing)
removing quotes
Only after all of the above will bash
execute external commands, like ls or dir.sh etc.,
or perform "internal" actions for the known keywords and builtins like echo, for, if etc.
As you can see, the second-to-last step is filename generation (globbing). So, in your case, if test* matches some files, bash expands the wildcard characters (i.e. it does the globbing).
So,
when you enter dir.sh test*,
and test* matches some files,
bash does the expansion first
and only afterwards executes the command dir.sh with the already expanded filenames,
e.g. the script gets executed (in your case) as: dir.sh test.pas test.swift
BTW, it works exactly the same way for your ls example:
bash expands ls test* to ls test.pas test.swift,
then executes ls with the above two arguments,
and ls prints the result for those two arguments.
In other words, ls doesn't even see the test* argument: whenever possible, bash expands the wildcard characters (* and ?).
Now back to your script: add the following line after the shebang:
echo "the $0 got these arguments: $@"
and you will immediately see the real arguments your script was executed with.
Also, in such cases it is good practice to run the script in debug mode, e.g.
bash -x dir.sh test*
and you will see exactly what the script does.
You can also do the same for your current interpreter: just enter into the terminal
set -x
and try running dir.sh test* - you will see how bash executes the dir.sh command. (To stop the debug mode, just enter set +x.)
Everybody is giving you valuable advice which you definitely should follow!
But here is the real answer to your question.
To pass unexpanded arguments to any executable you need to single quote them:
./your_script '*'
The best solution I have is to use the eval command, in this way:
#!/bin/bash
cmd="some command \"with_quetes_and_asterisk_in_it*\""
echo "$cmd"
eval $cmd
The eval command takes its arguments and evaluates them into the command as the shell does.
This solves my problem when I need to call a command with an asterisk '*' in it from a script.

Bash: execute a multi-command line string in a script

In a file, there is a multi-command line like this:
cd /home/user; ls
In a bash script, I would like to execute these commands, adding some arguments to the last one. For example:
cd /home/user; ls -l *.png
I thought it would be enough to do something like this:
#!/bin/bash
commandLine="$(cat theFileWithCommandInside) -l *.png"
$commandLine
exit 0
But it says:
/home/user;: No such file or directory
In other words, the ";" character no longer means "end of the command": the shell is trying to find a directory called "user;" in the home folder...
I tried to replace ";" with "&&", but the result is the same.
The point of your question is to execute a command stored in a string. There are thousands of ways to execute that indirectly, but eventually bash has to be involved.
So why not explicitly invoke bash to do the job?
bash -c "$commandLine"
from doc:
-c string
If the -c option is present, then commands are read from string. If there are arguments after the string, they are assigned to the positional parameters, starting with $0.
http://linux.die.net/man/1/bash
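For instance (a small illustration of that documented behavior, not taken from the thread), arguments placed after the string become $0, $1, ... inside it:
bash -c 'echo "invoked as $0 with first argument: $1"' mylabel hello
# prints: invoked as mylabel with first argument: hello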
Why don't you execute the commands themselves in the script, instead of "importing" them?
#!/bin/bash
cd /home/user; ls -l *.png
exit 0
Wrap the command into a function:
function doLS() {
    cd user; ls "$@"
}
"$@" expands to all arguments passed to the function. If you (or the snippet authors) add functions expecting a predefined number of arguments, you may find the positional parameters $1, $2, ... useful instead.
As the maintainer of the main script, you will have to make sure that everyone providing such a snippet provides that "interface" your code uses (i.e. their code defines the functions your program calls and their functions process the arguments your program passes).
Use source or . to import the function into your running shell:
#!/bin/bash
source theFileWithCommandInside
doLS -l *.png
exit 0
I'd like to add a few thoughts on the ; topic:
In other words, the ";" character no longer means "end of the
command": the shell is trying to find a directory called "user;" in
the home folder...
; is not used to terminate a statement as in C-style languages. Instead it is used to separate commands that should be executed sequentially inside a list. Example: executing two commands in a subshell:
( command1 ; command2 )
If the list is part of a group, it must be succeeded by a ;:
{ command1 ; command2 ; }
In your example, tokenization and globbing (replacing the *) will not be performed the way you may have expected, so your code will not run successfully.
The key is: eval
Here, the fixed script (look at the third line):
#!/bin/bash
commandLine="$(cat theFileWithCommandInside) -l *.png"
eval $commandLine
exit 0
Or, using process substitution with the <(...) form:
sh <(sed 's/$/ *.png/' theFileWithCommandInside)

Bash shell script function gives "find: missing argument to `-exec'" error

I wrote a function in a Bash shell script to search a Linux tree for filenames matching a pattern containing a regular expression, with colour highlighting:
function ggrep {
LS_="ls --color {}|sed s~./~~"
[ -n "$1" -a "$1" != "*" ] && NAME_="-iname $1" || NAME_=
[ -n "$2" ] && EXEC_="egrep -q \"$2\" \"{}\" && $LS_ && egrep -n \"$2\" --color=always \"{}\"|sed s~^B~\ B~" || EXEC_=$LS_
FIND_="find . -type f $NAME_ -exec sh -c \"$EXEC_\" \\;"
echo -e \\e[7m $FIND_ \\e[0m
$FIND_
}
e.g. ggrep a* lists all files starting with a under the current directory tree,
and ggrep a* x lists files starting with a and containing x
When I run it, I get:
find: missing argument to `-exec'
even though I get the correct output when I copy and paste the line output by "echo" into the terminal. Can anyone please tell me what I've done wrong?
Secondly, it would be great if ggrep * x listed all files containing x, but * expands to a list of filenames and I need to use \* or '*' instead. Is there a way around this? Thanks!
Terminate the find command with \; instead of \\; .
find . -type f $NAME_ -exec sh -c \"$EXEC_\" \;
eval $FIND_
as the last line of the function body works fine for me.
Expansions in bash are generally not recursive, so if you load a command from a variable, you should always use eval to force the expanded variable to be reprocessed as if it were fresh input. Normally quotes are not handled properly within a string that has already been expanded.
As for your second problem, I think there is no satisfactory solution. The shell will always expand * before passing it to anything controlled by you. You can disable this expansion, but that is a global setting. Anyway, I think this expansion could actually work in favor of your function. Consider rewriting it in a way that takes advantage of it. (I did not analyze whether the current version is close to that or not.)
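A stripped-down sketch of why eval matters here (my own simplification, not code from the original answer):
cmd='find . -type f -exec sh -c "echo {}" \;'
$cmd          # fails: word splitting leaves the literal tokens "echo, {}" and \; so find reports: missing argument to `-exec'
eval "$cmd"   # works: the string is re-parsed, so the quotes and the \; are handled as on a normal command line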

Linux bash shell scripts - spaces in file names

It has been a long time since I did much bash script writing.
This is a bash script to copy and rename files by deleting everything before the first period delimiter:
#!/bin/bash
mkdir fullname
mv *.audio fullname
cd fullname
for x in * ;
do
    cp $x ../`echo $x | cut -d "." -f 2-`
done
cd ..
ls
It works well for file names with no embedded spaces but not for those with spaces.
How can I change the code to fix this simple Linux bash script? Any suggestions for improving the code for other reasons would also be welcome.
Example filenames, some with embedded spaces and some not (from the link below):
http://www.homenetvideo.com/demo/index.php?/Radio%20%28VLC%29
Ambient.A6.SOMA Space Station.audio
Blues.B9.Blues Radio U.K.audio
Classical.K3.Radio Stephansdom - Vienna.audio
College.CI.KDVS U of California, Davis.audio
Country.Q1.K-FROG.audio
Easy.G4.WNYU.audio
Eclectic.M2.XPN.audio
Electronica.E2.Rinse.audio
Folk.F1.Radionomy.audio
Hiphop.H1.NPR.audio
Indie.I4.WAUG.audio
Jazz.J6.KCSM.audio
Latin.L3.Mega.audio
Misc.X7.Gaydio.audio
News.N9.KQED.audio
Oldies.O1.Lonestar.audio
OldTime.Y1.Roswell.audio
Progressive.P1.Aural Moon.audio
Rock.R8.WXRT.audio
Scanner.Z3.Montreal.audio
Soul.S1.181.FM.audio
Talk.T2.TWiT.audio
World.W3.Persian.audio
http://lh5.googleusercontent.com/-QjLEiAtT4cw/U98_UFcWvvI/AAAAAAAABv8/gyPhbg8s7Bw/w681-h373-no/homenet-radio.png
Whenever you deal with file names that might have spaces in them, you must reference them as "$x" rather than just $x. That's what's causing your cp command to fail.
Your echo command is also problematic. Although echo appears to do the right thing for simple spaces - the unquoted $x is split into words and echo re-joins them with single spaces, so a file named A B C still comes out as A B C - it will fail if the name contains more than one consecutive space, or whitespace that isn't a simple space character.
Instead of passing the file names to external programs for processing, which always requires getting them through the whitespace-hostile command line, you should use bash's built-in string manipulation wherever possible, e.g. ${x%%foo}, ${x#bar} and similar expansions. The man page describes them under "Parameter Expansion".
Here's my suggestion:
#!/bin/bash
shopt -s nullglob
mkdir fullname
mv *.audio fullname
(
    cd fullname || exit
    for x in *; do
        cp "$x" "../${x#*.}"
    done
)
ls
nullglob makes * expand to nothing instead of itself if no file matches. This is optional.
() opens a subshell and saves you from having to change back to the original directory.
|| exit terminates the subshell if cd fails to change directory.
${x#*.} expands to $x with everything up to and including the first . removed.
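For example, with one of the filenames listed above:
x="Ambient.A6.SOMA Space Station.audio"
echo "${x#*.}"   # prints: A6.SOMA Space Station.audio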
