Find -exec and Bash scripts - linux

I'm writing a script in bash.
I invoke it with
find *.zip -type f -exec ./myscript.sh {} \;
At the top of my script I invoke another script like this:
#!/bin/bash
. ticktick.sh
I get the following error
.: ticktick.sh: file not found
If I invoke the script like this
./myscript.sh somefile.zip
it works
If I put the ticktick.sh script in my path in another directory it breaks, so that isn't an option. Is there some special kind of context that scripts called from find have? I'm obviously new to Bash scripting. Any help would be appreciated.

I think there are two problems.
1. If you want to search for all zip files in the current directory, you have to write the following command:
find . -type f -name "*.zip" -exec ...
2. You execute myscript.sh with ./ in front of it, so myscript.sh has to be in the current working directory. If your script is in /home/jd/ and you execute it from /home/, your myscript.sh will not be found.
First determine the directory your script is in:
install_path=$(dirname "$(readlink -f "$0")")
So your complete find command is:
find . -type f -name "*.zip" -exec "$install_path"/myscript.sh {} \;
The myscript.sh file has to be in the same directory as ticktick.sh.
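Since a plain ". ticktick.sh" with no slash is looked up via PATH and the current working directory, not relative to myscript.sh, another option is to have myscript.sh work out its own location and source ticktick.sh from there. A minimal sketch of such a myscript.sh, assuming ticktick.sh sits next to it (the echo line is only a placeholder for your real processing):
#!/bin/bash
# Resolve the directory this script lives in, then source ticktick.sh
# from there instead of relying on the caller's working directory.
script_dir=$(dirname "$(readlink -f "$0")")
. "$script_dir/ticktick.sh"

# "$1" is the zip file that find passed in via {}.
echo "Processing $1"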

Related

bash - find all .bashrc files and append to them

I need to find all .bashrc files and append "MYSQL_HISTFILE=/dev/null" to them, to remediate an issue. There are a lot of .bashrc files, so can I do something like:
find / -type f -name ".bashrc" -exec echo "export MYSQL_HISTFILE=/dev/null" >> {} \;
The >> is handled by your original shell before find ever runs, so it can't use the {} substitution from find. And find doesn't run its command through a shell, so it can't do output redirection itself.
You need to execute bash explicitly so you can use redirection in the command.
find / -type f -name '.bashrc' -exec bash -c 'echo export MYSQL_HISTFILE=/dev/null >> "{}"' \;
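Embedding {} inside the bash -c string works, but it breaks on file names containing quotes or other special characters. A slightly safer sketch of the same command passes the file name to the shell as a positional argument instead of splicing it into the command string:
find / -type f -name '.bashrc' -exec bash -c 'echo "export MYSQL_HISTFILE=/dev/null" >> "$1"' bash {} \;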

How to find all .sh files and make them executable using bash in linux?

Also, when I launch it, I want to pass the path to the folder where these .sh files are located.
I started with this
#!/bin/bash
find /home/user_name -name "*.sh"
And afterwards the script has to write a log listing the files that were made executable.
The safest way, both in terms of security and in terms of weird file names (spaces, special characters, and so forth), is to use find directly:
find /home/user -name "*.sh" -execdir chmod u+x {} +
You can check the comments and the find manual for why this is safe, but in short, it makes sure each file name is handed to chmod intact, with no shell quoting issues. -execdir (rather than -exec) is an extra security feature: it makes sure the command is executed in the directory the file was found in, avoiding race conditions (elaborated in the manual).
Another way:
find . -name "*.sh" -exec chmod u+x {} \;
You can first check what the command will match by using
find . -name "*.sh" -print
If you want to make all the files executable for the current user, you can use the following command (assuming you have permission for all files in the target home folder):
find /home/user_name -name "*.sh" -print0 | xargs -0 chmod u+x
Based on @kabanus' command:
#!/bin/bash
# chmod u+x $(find $1 -name "*.sh")
# ls -1 $1/*.sh
find "$1" -name "*.sh" -print -exec chmod u+x {} +
And use as
$ ./script.sh /your_directory
/your_directory - first argument ($1) in the script.
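Since the original question also asks for a list of the files that were made executable, here is a minimal sketch building on the same command; made_executable.log is just an example file name:
#!/bin/bash
# Make every .sh under the given directory executable and
# record the affected files in made_executable.log.
find "$1" -name "*.sh" -print -exec chmod u+x {} + > made_executable.log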

Run an Executable Program File in Multiple Subdirectories Using Shell

I have a main directory with 361 subdirectories. Within each subdirectory, there is a parameter file and one executable program file. The executable file is coded to look for the parameter file in the directory where the executable is located. (The same executable file is in all subdirectories. The parameter files all have the same file name in all subdirectories.)
Instead of executing the program file individually, is there a cshell command for terminal to run them all at once?
UPDATED
If your Linux is so old it doesn't have -execdir, you could try this:
find $(pwd) -name YourProgram -exec dirname {} \; | while read d; do cd "$d" && pwd; done
If that correctly prints the names of the directories where your program needs to be run, just remove the pwd and replace it with whatever you want done in that directory - presumably something like this:
find $(pwd) -name YourProgram -exec dirname {} \; | while read d; do cd "$d" && ./YourProgram; done
ORIGINAL ANSWER
Like this maybe:
find . -type f -name YourProgramName -execdir ./YourProgramName YourParameterFile \;
But backup first and check it looks right before using.
The -execdir causes find to change into the directory where it found each match before running the command there.
If your command is more complicated, you can do this:
find . -type f -name YourProgramName -execdir sh -c "command1; command2; command3" \;
Check it does what you want like this:
find . -type f -name YourProgramName -execdir pwd \;
Maybe this will help. Suppose you have in each folder a file named params_file and an executable named exec_file, then:
for dir in $(find . -maxdepth 1 -mindepth 1 -type d); do
  cd "$dir"
  cat params_file | xargs ./exec_file
  cd ..
done
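A variant of the same loop that runs each step in a subshell, so a failed cd cannot leave you in the wrong directory for the next iteration (it assumes the same hypothetical params_file and exec_file names and only looks one level deep):
for dir in */; do
  ( cd "$dir" && xargs ./exec_file < params_file )
done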

Find files in a dir, executing a command with execdir and redirecting

It seems like I am unable to find a direct answer to this question.
I appreciate your help.
I'm trying to find all files with a specific name in a directory, read the last 1000 lines of the file and copy it in to a new file in the same directory. As an example:
Find all files names xyz.log in the current directory, copy the last 1000 lines to file abc.log (which doesn't exist).
I tried to use the following command with no luck:
find . -name "xyz.log" -execdir tail -1000 {} > abc.log \;
The problem I'm having is that for all the files in the current directory, they all write to abc.log in the CURRENT directory and not in the directory where xyz.log resides. Clearly the find with execdir is first executed and then the output is redirected to abc.log.
Can you guys suggest a way to fix this? I appreciate any information/help.
EDIT: I tried find . -name "xyz.log" -execdir sh -c "tail -1000 {} > abc.log" \; as some of you suggested, but it gives me this error: sh: ./tail: No such file or directory. Do you have any idea what the problem is?
Luckily the solution using -printf is working fine.
The simplest way is this:
find . -name "xyz.log" -execdir sh -c 'tail -1000 "{}" >abc.log' \;
A more flexible alternative is to first print out the commands and then execute them all with sh:
find . -name "xyz.log" -printf 'tail -1000 "%p" >"%h/abc.log"\n' | sh
You can remove the | sh from the end when you're trying it out/debugging.
There is a bug in some versions of findutils (4.2 and 4.3, though it was fixed in some 4.2.x and 4.3.x versions) that causes -execdir arguments containing {} to be prefixed with ./ (the prefix is applied to the whole quoted string instead of only to {}). To work around this you can use:
find . -name "xyz.log" -execdir sh -c 'tail -1000 "$1" >abc.log' sh {} \;
sh -c 'script' arg0 arg1 runs the sh script with arg0, arg1, etc. passed to it. By convention, arg0 is the name of the executable (here, "sh"). From the script you can access the arguments using $0 (corresponding to "sh"), $1 (corresponding to find's expansion of {}), etc.
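A quick standalone way to see that argument mapping, unrelated to find:
sh -c 'echo "arg0 is $0, arg1 is $1"' sh somefile.log
# prints: arg0 is sh, arg1 is somefile.log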
The redirect isn't passed into -execdir, so abc.log shows up in the directory you run the command in. -execdir also doesn't like embedded redirects, but you can work around the problem by passing -execdir a shell command with the redirect embedded, like this:
find . -name "xyz.log" -execdir sh -c '/usr/bin/tail -1000 {} > abc.log' \;
Much credit to this blog post (not mine):
http://www.microhowto.info/howto/act_on_all_files_in_a_directory_tree_using_find.html
Edit
I put the full path to tail in the command (assuming it's in /usr/bin on your system), since sh may load a .profile with a PATH that differs from your current shell.
Here's another non-find approach (well, sorta - it still uses find but doesn't try to shoehorn find into doing the whole thing):
while read f
do
  d=$(dirname "${f}")
  tail -n 1000 "${f}" > "${d}/abc.log"
done < <(find . -type f -name xyz.log -print)
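If the directories or file names may contain unusual characters (including newlines), a null-delimited sketch of the same loop is more robust:
while IFS= read -r -d '' f
do
  d=$(dirname "${f}")
  tail -n 1000 "${f}" > "${d}/abc.log"
done < <(find . -type f -name xyz.log -print0)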

How to find -exec cd in linux / unix

I'm searching for a config folder, and trying to change to that directory:
find . -name "config" -exec cd {} \;
There is one match, ./my-applications/config, but after I try this it says:
find: `cd': No such file or directory
What am I doing wrong?
The command cd is a shell built-in, not found in /bin or /usr/bin.
Of course, you can't change directory to a file, and your search doesn't limit itself to directories. And even if it did, the cd would only affect the executed command, not the parent shell that runs the find command.
Use:
cd $(find . -name config -type d | sed 1q)
Note that if your directory is not found, you'll be back in your home directory when the command completes. (The sed 1q ensures you only pass one directory name to cd; the Korn shell cd takes two values on the command and does something fairly sensible, but Bash ignores the extras.)
In case you have more than one config directory:
select config in $(find . -name config -type d)
do
  cd "$config"
  break
done
find runs -exec programs as subprocesses, and a subprocess cannot change the working directory of its parent process. So it cannot be done that way. You may want to try
cd `find . -name "config"`
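If your find supports -quit, a slightly tighter variant of the same idea, limited to directories and stopping at the first match, is:
cd "$(find . -type d -name config -print -quit)"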
