launch several scripts located in subdirectories - linux

Here is my problem. I have a directory that contains multiple sub-directories. In each sub-directory, there is at least one .sh script.
I want to write a script that executes all these scripts sequentially.
I am pretty new to Linux.
Thanks for your help,

find . -name "*.sh" -exec {} \;
This is a shell command which, starting in the directory it is run from (specified by .), finds file names that end in .sh and then executes those files (each found file is substituted for the {}). The backslash prevents the semicolon from being interpreted by the shell (here, bash).

Try doing it using find and for:
for file in `find . -type f -name "*.sh"`; do sh $file; done
You can also store them in an array and loop over it:
array=($(find . -type f -name "*.sh"))
for file in ${array[@]}; do sh $file; done

From the top directory, run the following command:
for f in `find . -type f -name \*.sh`; do $f; done
The find command will locate all .sh files. Its output (a whitespace-separated list of pathnames to the scripts) becomes the input to the for loop, which assigns each entry to the variable f. The $f then executes each script.
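Note that the backtick-based loops above split on whitespace, so filenames containing spaces break them. A whitespace-safe sketch, assuming bash and a find that supports -print0 (GNU and BSD find both do):

```shell
# Whitespace-safe variant: -print0 emits NUL-separated names,
# and read -d '' (a bash feature) consumes them without word splitting.
find . -type f -name "*.sh" -print0 |
while IFS= read -r -d '' script; do
    sh "$script"
done
```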

Loop through a directory with any level of depth

I want to execute a command on all files present at all levels of a directory tree. It may have any number of files and sub-directories, and those sub-directories may themselves contain any number of files and sub-directories. I want to do this using a shell script. As I am new to this field, can anyone suggest a way?
You can use the find command with xargs after a | (pipe).
Example: suppose I want to remove all files with a ".txt" extension in the "Documents" directory:
find Documents -iname "*.txt" | xargs rm -f
Does that help?
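Be aware that xargs also splits on whitespace by default, so filenames with spaces can go wrong. A safer sketch using the -print0 / -0 pair (supported by GNU and BSD tools):

```shell
# Delete every .txt file under Documents, handling names with
# spaces; the pattern is quoted so the shell does not expand it
# before find sees it.
find Documents -iname "*.txt" -print0 | xargs -0 rm -f
```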
You can loop over the sub-directories using a wildcard character (*) like so:
for dir in ~/dev/myproject/*; do (cd "$dir" && git status); done
If you want to apply commands on the individual files you should use the find command and execute commands on it like so:
find yourdirectory -type f -exec echo "File found: '{}'" \;
What this does:
finds all the items in the directory yourdirectory
that have the type f - so are regular files
runs the -exec command on each file
Use find:
find -type f -exec COMMAND {} \;
-type f applies the command only to files, not to directories. The search is recursive by default.

Copy specific files recursively

This problem has been discussed extensively but I couldn't find a solution that would help me.
I'm trying to selectively copy files from a directory tree into a specific folder. After reading some Q&A, here's what I tried:
cp `find . -name "*.pdf" -type f` ../collect/
I am in the right parent directory and there indeed is a collect directory a level above. Now I'm getting the error: cp: invalid option -- 'o'
What is going wrong?
To handle difficult file names:
find . -name "*.pdf" -type f -exec cp {} ../collect/ \;
By default, find will print the file names that it finds. If one uses the -exec option, it will instead pass the file names on to a command of your choosing, in this case a cp command which is written as:
cp {} ../collect/ \;
The {} tells find where to insert the file name. The end of the command given to -exec is marked by a semicolon. Normally, the shell would eat the semicolon. So, we escape the semicolon with a backslash so that it is passed as an argument to the find command.
Because find gives the file name to cp directly without interference from the shell, this approach works for even the most difficult file names.
More efficiency
The above runs cp on every file found. If there are many files, that would be a lot of processes started. If one has GNU tools, that can be avoided as follows:
find . -name '*.pdf' -type f -exec cp -t ../collect {} +
In this variant of the command, find will supply many file names for each single invocation of cp, potentially greatly reducing the number of processes that need to be started.
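The difference between \; and + can be made visible with echo (a small illustration, assuming GNU or BSD find): the first form starts one process per file and so prints one name per line, while the second passes many names to a single invocation:

```shell
# One process per file: each name appears on its own line.
find . -name '*.pdf' -exec echo {} \;

# One process for many files: the names share a single line
# (as long as they fit within the argument-list limit).
find . -name '*.pdf' -exec echo {} +
```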

Find -exec and Bash scripts

I'm writing a script in bash.
I invoke it with
find *.zip -type f -exec ./myscript.sh {} \;
At the top of my script I invoke another script like this:
#!/bin/bash
. ticktick.sh
I get the following error
.: ticktick.sh: file not found
If I invoke the script like this
./myscript.sh somefile.zip
it works
If I put the ticktick.sh script in another directory on my path, it breaks, so that isn't an option. Is there some special kind of context that scripts called from find have? I'm obviously new to Bash scripting. Any help would be appreciated.
I think there are two problems.
1. If you want to search for all zip files in the current directory, you have to write the following command:
find . -type f -name "*.zip" -exec ...
2. You execute myscript.sh with ./ before it, so myscript.sh has to be in the current working directory. If your script is in /home/jd/ and you execute it from /home/, your myscript.sh will not be found.
First you have to determine the directory of your script:
install_path=$(dirname "$(readlink -f "$0")")
So your complete find command is:
find . -type f -name "*.zip" -exec "$install_path/myscript.sh" {} \;
The myscript.sh file has to be in the same directory as ticktick.sh.
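Putting the pieces together, myscript.sh itself can resolve its own location and source ticktick.sh from there, so it works no matter which directory find runs it from. A sketch, assuming GNU readlink and, purely for illustration, that ticktick.sh defines a variable TICK=ok:

```shell
#!/bin/bash
# myscript.sh - resolve this script's own directory so that
# ticktick.sh is found regardless of the caller's working directory.
script_dir=$(dirname "$(readlink -f "$0")")
. "$script_dir/ticktick.sh"
echo "processing: $1 ($TICK)"
```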

How to find -exec cd in linux / unix

I'm searching for a config folder, and trying to change to that directory:
find . -name "config" -exec cd {} \;
There is one match, ./my-applications/config, but after I try this it says:
find: `cd': No such file or directory
What am I doing wrong?
The command cd is a shell built-in, not found in /bin or /usr/bin.
Of course, you can't change directory to a file and your search doesn't limit itself to directories. And the cd command would only affect the executed command, not the parent shell that executes the find command.
Use:
cd $(find . -name config -type d | sed 1q)
Note that if your directory is not found, you'll be back in your home directory when the command completes. (The sed 1q ensures you only pass one directory name to cd; the Korn shell cd takes two values on the command and does something fairly sensible, but Bash ignores the extras.)
In case you have more than one config directory:
select config in $(find . -name config -type d)
do
cd $config
break
done
find runs -exec programs as subprocesses and subprocesses cannot affect their parent process. So, it cannot be done. You may want to try
cd `find . -name "config"`
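Since cd only takes effect in the current shell, one practical pattern is to wrap the find in a shell function (the name cdfind here is hypothetical) and put it in your ~/.bashrc; a function runs in the interactive shell itself, not in a subprocess:

```shell
# Jump to the first directory whose name matches the argument.
cdfind() {
    cd "$(find . -type d -name "$1" | head -n 1)" || return
}
```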

Shell command to find files in a directory pattern

With a shell command I need to list all files on my server matching the following directory pattern:
/home/*/public_html/images/*.php
There's a few and it's taking a long time to do this manually. I really have no idea when it comes to these commands.
find /path/to/directory/. -path "*/match/this/path/*" -type f -name "*.php"
Shell Script:
find /home/*/public_html/images -iname "*php" -exec echo {} \;
You can then change the -exec command to do whatever actions you want to the returned files. In this case, we echo them, but you could easily perform other actions as well.
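For example, a hedged sketch that swaps the echo for a different action, here tightening permissions on each matched file (chmod 644 is only an illustration, not something the question asked for):

```shell
# Same search, different -exec action: {} is replaced by
# each matched file in turn.
find /home/*/public_html/images -iname "*php" -exec chmod 644 {} \;
```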
Let bash expand the files for you and use ls to list them:
ls /home/*/public_html/images/*.php
Example output:
/home/grant/public_html/images/bar.php
/home/grant/public_html/images/foo.php
/home/marcog/public_html/images/helloworld.php
Use the PHP glob function
glob('/home/*/public_html/images/*.php')
It will return an array of the matching path strings. You can also just use:
ls /home/*/public_html/images/*.php
or:
for i in /home/*/public_html/images/*.php;
do
some_command "$i"
done
from the shell.