How to loop through subdirectories in a bash script? - linux

Suppose I'm in a directory, dir1. Within that directory I have myscript.sh and a directory subdir1. subdir1 has several subsub directories: subsub1, subsub2, subsub3. Within each of those subsub directories is a bash script named script2.sh, and I want to run each one of them.
First, I just want to make sure I can print all the subsub directories.
I have:
for dir in /subdir1/*/ ; do
echo $dir
done
Can anyone tell me what I'm doing wrong?

I guess you could run them this way:
find subdir1 -type f -name "NAME_OF_YOUR_SCRIPT.sh" -exec {} \;
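If the scripts aren't marked executable, a variant of the same idea (just a sketch) invokes them through bash explicitly instead of relying on the execute bit:
find subdir1 -type f -name "script2.sh" -exec bash {} \;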

You are going to run into issues if any of the directory names contain spaces or other special characters.
Also note that /subdir1/*/ is an absolute path starting at the filesystem root; since subdir1 is inside your current directory, you want ./subdir1/*/ (or just subdir1/*/). Based on what you provided,
for scriptfile in ./subdir1/*/script2.sh
do
bash "$scriptfile"
done
may work best for you.
To deal with spaces or other special characters in your directory names, you'll want to use find and pass the output to xargs.
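For example, a null-delimited pipeline handles such names safely (a sketch; -print0 and -0 assume GNU or BSD find and xargs):
find subdir1 -type f -name "script2.sh" -print0 | xargs -0 -n1 bash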

Related

Recursively find a directory and rename it in Shell Script

I'm putting together a simple shell script to run on a Linux machine that would:
1) Look for specific sub-directories within a main directory. These sub-dirs have a very specific naming convention (see below) and they are always at most 2 levels below the main directory.
2) Rename those sub-dirs to part of their original name.
For example,
The sub directories are named:
andrew-1111
andrew-1112
andrew-1113
andrew-1114
The path to get to these sub dirs would look something like this:
myphotos/sailing/photos/andrew-1111
myphotos/sailing/photos/andrew-1112
myphotos/biking/photos/andrew-1113
myphotos/hiking/photos/andrew-1114
I'd like to take out the 'andrew-' from each of these sub dirs:
myphotos/sailing/photos/1111
myphotos/sailing/photos/1112
myphotos/biking/photos/1113
myphotos/hiking/photos/1114
I've gotten as far as "finding" the sub dirs and listing them. I also understand how to copy and rename on the command line. But putting it together at my level of shell scripting knowledge has been taking much more time than I can afford. Just a disclaimer: I am more than willing to learn, and have written a handful of shell scripts, but I'm still new to this. Any help or examples are much appreciated!
Use wildcards to match the files in the nested directories
You can use bash parameter expansion operators to manipulate the filenames.
for file in myphotos/*/photos/*; do
name=${file##*/} # remove everything up to last /
dir=${file%/*} # remove everything from last /
newname=${name##*-} # remove everything up to last -
mv "$file" "$dir/$newname"
done
If you have the perl-based rename command, you can do:
rename 's#[^/]*-##' myphotos/*/photos/*
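If you want to preview the changes first, the perl-based rename usually accepts -n for a dry run (worth confirming on your system, since other rename implementations differ):
rename -n 's#[^/]*-##' myphotos/*/photos/*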
You can do it with this one-liner:
find -type d -name andrew\* -exec sh -c 'mv {} $(dirname {})/$(basename {} | cut -d"-" -f2)' \;
Explanation:
-type d find only directories
-name andrew\* self-explaining, you have to escape the * though
-exec sh -c '...' execute it in a subshell, so you can do the command substitution ($(...)) without problems
mv {} the {} holds whatever find finds
dirname strips the last component of a path, giving you the parent directory
basename gives you the last component of a given path (here, the directory's own name)
cut -d"-" -f2 uses cut to strip off "andrew-": set the delimiter to - and select field number 2

Loop through a directory with any level of depth

I want to execute a command on all files present at all levels in the directory. It may have any number of files and subdirectories, and those subdirectories may themselves contain any number of files and subdirectories. I want to do this using a shell script. As I am new to this field, can anyone suggest a way?
You can use the find command and pipe its output to xargs.
Example: suppose I want to remove all files that have the ".txt" extension in the "Documents" directory:
find Documents -iname '*.txt' | xargs rm -f
Does that help?
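A variant that avoids the pipe (and any filename-splitting issues) is find's -delete action, assuming your find supports it (GNU and BSD do):
find Documents -iname '*.txt' -delete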
You can use a loop with a wildcard (*) to visit each immediate subdirectory, like so:
for dir in ~/dev/myproject/*; do (cd "$dir" && git status); done
If you want to apply commands on the individual files you should use the find command and execute commands on it like so:
find yourdirectory -type f -exec echo "File found: '{}'" \;
What this does:
finds all the items under the directory yourdirectory
keeps those of type f, i.e. regular files
runs the -exec command on each file
Use find:
find -type f -exec COMMAND {} \;
-type f applies the command only to files, not to directories. find descends into subdirectories by default.
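For example, to count lines in every file under the current directory (the + form batches many filenames into a single invocation and is widely supported):
find . -type f -exec wc -l {} +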

Shell Script to Recursively Loop Through Directory and print location of important files

So I am trying to write a command-line one-liner or a shell script that will recursively loop through a directory, all its files, and sub-directories, look for certain files, and then print the location of those files to a text file.
I know that this is possible using BASH commands such as find, locate, exec, and >.
This is what I have so far. find <top-directory> -name '*.class' -exec locate {} > location.txt \;
This does not work, though. Can any bash/shell scripting experts help me out, please?
Thank you for reading this.
The default behavior of find (if you don't specify any other action) is to print the filename. So you can simply do:
find <top-directory> -name '*.class' > location.txt
Or if you want to be explicit about it:
find <top-directory> -name '*.class' -print > location.txt
You can save the redirection by using find's -fprint option:
find <top-directory> -name '*.class' -fprint location.txt
From the man page:
-fprint file
[...] print the full file name into file file. If file does not exist when find is run, it is created; if it does exist, it is truncated.
A less preferred way to do it is to use ls:
ls -d $PWD/**/* | grep class
let's break it down:
ls -d # lists the directory (returns `.`)
ls -d $PWD # lists the directory - but this time $PWD will provide full path
ls -d $PWD/** # list the directory with full-path and every file under this directory (not recursively) - an effect which is due to `/**` part
ls -d $PWD/**/* # same as the previous one, only now it also descends into the folders below (achieved by adding the `/*` at the end)
A better way of doing it:
After reading this on the recommendation of Charles Duffy, it appears to be a bad idea to use either ls or find this way (the article also says: "find is just as bad as ls in this context"). The reason it's a bad idea is that you can't control the output of ls: for example, you can't configure ls to terminate filenames with NUL. That matters because Unix allows all kinds of weird characters in a file name (newline, pipe, etc.) that will "break" ls in ways you can't anticipate.
Better to use a shell script for the task, and it's a pretty simple one too:
Create a file my_script.sh, edit the file to contain:
#!/usr/bin/env bash
shopt -s globstar   # make ** match at any depth (bash 4+); without it ** behaves like *
for i in **/*; do
echo "$PWD/$i"
done
Give it execute permissions (by running: chmod +x my_script.sh).
Run it from the same directory with:
./my_script.sh
and you're good to go!
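If you need the listing to be robust against the newlines and other odd characters discussed above, a null-delimited find loop is one common pattern (a sketch; assumes bash and a find with -print0, such as GNU or BSD find):
find . -name '*.class' -print0 | while IFS= read -r -d '' f; do
printf '%s\n' "$PWD/${f#./}"
done > location.txt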

unix bash - save environment variable and loop

Let's say you have a first.sh file in a directory: "/home/userbob/scripts/foo/". Basically I would like to know how to loop through specific directories, each time going back up to a higher level directory and repeating.
The .sh file has something like this pseudocode:
#!/bin/bash
curdi={$PATH} #where the first.sh file sits on the server
FOLDERS="$curdi/waffles/inner/
$curdi/pancakes/inner/
$curdi/bagels/inner/"
for f in $FOLDERS
do
cd $f
cp innerofinner/* .
cd $curdi
done
The idea is to somehow copy all the contents of /home/userbob/scripts/foo/waffles/inner/innerofinner to /home/userbob/scripts/foo/waffles/inner/
(and basically repeating, just with the path containing pancakes, bagels, etc.)
Can't do it for all directories (*) under /home/userbob/scripts/foo/ because there are some that I don't want to copy.
This should do it:
for name in waffles pancakes bagels
do
cp "$curdi/$name/inner/innferofinner/"* "$curdi/waffles/inner"
done
Walking file trees? Sounds like a job for find!
#!/usr/bin/env bash
# only environment variables should be all-caps
dirs=({bagels,pancakes}/inner)
find "${dirs[#]}" -type d -maxdepth 1 -mindepth 1 -name innerofinner -execdir bash -c 'cp "$1"/* .' -- {} \;
I did a partial path and assumed a working directory of /home/userbob/scripts/foo. An absolute path would work, too, and would look like
dirs=(/home/userbob/scripts/foo/{bagels,pancakes}/inner)
This finds all directories exactly one level below the listed directories that are named "innerofinner" and, in their parent directories, executes bash with a simple cp script.
If you're wondering how this works, read below.
The dirs=() syntax creates an empty array named dirs. dirs=(a b) creates an array with a at index 0 and b at index 1. Any whitespace-delimited string will work here. In a shell script {a,b,c} expands to a b c, but A{a,b,c}B expands to AaB AbB AcB. So specifying {bagels,pancakes}/inner is just a way to say both bagels/inner and pancakes/inner without having to type as much.
A variable in bash can be expanded with $foo or with ${foo}; these are the same. An array in shell can be expanded to all of its elements with ${foo[@]}, delimited by spaces (if you know perl or php this will make some sense), and quoting the expansion (always a good idea in shell!) prevents spaces inside the elements from being processed again by the shell. Thus, "${dirs[@]}" becomes bagels/inner pancakes/inner.
Knowing this, we see that the find command has become find bagels/inner pancakes/inner -mindepth 1 -maxdepth 1 -type d -name innerofinner, and if you execute this it will return exactly two lines: the full path to each innerofinner directory. All we want now is to do something for each one, which -execdir does nicely.
Use a recursive function or invoke the script recursively.
I am not sure if I understand your problem statement correctly. Your pseudocode seems good, but I see a problem with the following line.
curdi={$PWD}
It does not give you the directory where the script resides; it gives you the directory you are running the script from. If you run the script from your home directory, $curdi will point to your home directory and not to the directory where your script resides, which will lead to undesired results. (A common way to get the script's own directory is shown below.)
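As a sketch (assuming the script is executed rather than sourced), the usual idiom for the script's own directory is:
script_dir=$(cd "$(dirname "$0")" && pwd)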
Incidentally, if you really wanted to do it in the way that your pseudo-script attempts it, you'd do it like this
#!/usr/bin/env bash
for f in "$PWD"/{waffles,pancakes,bagels}/inner ; do
cd "$f"
cp innerofinner/* .
# if you know for sure that it's one level up
cd ..
done
Presuming that $PWD is a good enough indicator of "current" directory for you. Me, I'd pass it in to the script.
#!/usr/bin/env bash
base="${1-$PWD}"
for f in "$base"/{waffles,pancakes,bagels}/inner ; do
cd "$f"
cp innerofinner/* .
cd ..
done
and call it like
breakfast.sh /home/userbob/scripts/foo/
find . \( -ipath '*waffles*innerofinner*' -o \
          -ipath '*pancakes*innerofinner*' -o \
          -ipath '*bagels*innerofinner*' \) \
     -type f \
     -exec sh -c 'cp "$1" "$(dirname "$(dirname "$1")")"' _ {} \;
Should do. It finds every file in the desired innerofinner subdirs, then copies each one two levels up, into its inner directory.
HTH

Need to touch a file in several directories using one liner in linux

I have lots of directories. I need to cd to all the directories and create 2 files. I tried doing this using xargs, but I couldn't do it. Can you please tell me how to achieve this?
If you don't want or need to run find but have a list of directories, something like this:
xargs -I{} touch {}/a {}/b < directories.txt
If the directory paths are completely regular (e.g. all subdirectories two levels down), it might be as easy as
touch */*/a */*/b
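If the directories sit at varying depths, bash's globstar option gives you a recursive glob instead (a sketch; needs bash 4+):
shopt -s globstar
for d in **/; do
touch "${d}a" "${d}b"
done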
find <path> -type d -exec touch {}/a {}/b \;
path may be . if you are already in the top directory you want to work on.
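Note that the {}/a form relies on find substituting {} inside a larger argument; GNU find does this, but POSIX leaves it implementation-defined. A more portable variant (a sketch) hands each directory to a small shell:
find <path> -type d -exec sh -c 'touch "$1/a" "$1/b"' _ {} \;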
