I am trying to find and rename a directory on a Linux system.
The folder name is something like: thefoldername-23423-431321
thefoldername is consistent, but the numbers change every time.
I tried this:
find . -type d -name 'thefoldername*' -exec mv {} newfoldername \;
The command actually works and renames the directory, but I get an error in the terminal saying that there is no such file or directory.
How can I fix it?
It's a harmless error which you can get rid of with the -depth option.
find . -depth -type d -name 'thefoldername*' -exec mv {} newfoldername \;
Find's normal behavior is to process directories and then recurse into them. Since you've renamed the directory, find complains when it tries to recurse into the old path. The -depth option tells find to recurse first, then process the directory afterwards.
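For illustration, without -depth the sequence looks something like this (the folder name is from the question; the output is approximate):
$ find . -type d -name 'thefoldername*' -exec mv {} newfoldername \;
find: ‘./thefoldername-23423-431321’: No such file or directory
$ ls
newfoldername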
It's missing the -execdir option! As stated in the find man page:
-execdir command {};
Like -exec, but the specified command is run from the subdirectory containing the matched file, which is not normally the directory in which you started find.
find . -depth -type d -name 'thefoldername*' -execdir mv {} newfoldername \;
With the previous answer, my folder's contents disappeared.
This is my solution. It works well:
for i in $(find . -type d -name 'oldFolderName'); do
    dirname=$(dirname "$i")
    mv "$dirname/oldFolderName" "$dirname/newFolderName"
done
To rename .../ABC to .../BCD:
find . -depth -type d -name 'ABC' -execdir mv {} BCD \;
Replace 1100 with the old value and 2200 with the new value that you want to use.
example
for i in $(find . -type d -iname '1100');do echo "mv "$i" "$i"__" >> test.txt; sed 's/1100__/2200/g' test.txt > test_1.txt; bash test_1.txt ; rm test*.txt ; done
Proof
[user@server test]$ ls -la check/
drwxr-xr-x. 1 user user 0 Jun 7 12:16 1100
[user@server test]$ for i in $(find . -type d -iname '1100');do echo "mv "$i" "$i"__" >> test.txt; sed 's/1100__/2200/g' test.txt > test_1.txt; bash test_1.txt ; rm test*.txt ; done
[user@server test]$ ls -la check/
drwxr-xr-x. 1 user user 0 Jun 7 12:16 2200
Here, __ is used only as a marker for sed to change the name; it has no other significance.
I have a script which deletes files that are older than 2 days. Usually it works properly, but for some reason it isn't working today.
I would like the script to report an error explaining why the files were not deleted.
Could you tell me whether there is such an option?
script:
#!/bin/bash
#script for cleaning logs from files older than two days
dir_name=/home/albert/scripts/files
file_log=/home/albert/scripts/info.log
{
find ${dir_name} -type f -name '*.log' -mtime +2 -exec rm -v {} \;
} >> ${file_log}
You probably want to add a redirection for standard error too, i.e. in case you want to send it to the same file:
{
find ${dir_name} -type f -name '*.log' -mtime +2 -exec rm -v {} \;
} >> ${file_log} 2>&1
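With 2>&1 in place, any errors from rm end up in the log as well, e.g. lines of this form (illustrative; the file name is made up):
rm: cannot remove '/home/albert/scripts/files/old.log': Permission denied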
You could use the find option -delete:
-delete - If the removal failed, an error message is issued.
find "${dir_name}" -type f -name '*.log' -mtime +2 -delete >> ${file_log} 2>&1
Example:
$ find /etc -name passwd -delete
find: cannot delete ‘/etc/pam.d/passwd’: Permission denied
find: cannot delete ‘/etc/passwd’: Permission denied
Suppose I have these directories:
dir_1/ dir_2/ dir_3/
How can I create a directory of the same name under each of these directories using a single command?
Here is one command for you:
If you want the subdirectory to have the same name as the parent directory:
for i in ./dir_*; do mkdir -p "${i}/${i}"; done
If you want the subdirectories to share the same new name:
for i in ./dir_*; do mkdir -p "${i}/new_dir_name"; done
You should use Brace Expansion:
mkdir dir_{1..3}/newDir
This only works if you know the names of the dirs in advance, of course.
It doesn't work with sh, though.
You can use this find:
find . -maxdepth 1 -type d -name 'dir_*' -exec mkdir {}/{} \;
Test:
$ ls
dir_1 dir_2 dir_3 file1 file2 file3
$ find . -maxdepth 1 -type d -name 'dir_*' -exec mkdir {}/{} \;
$ ls dir_1/
dir_1
Using find you could do:
find . -maxdepth 1 -type d -execdir mkdir -p "{}/{}" \;
This will create directory/directory if it doesn't already exist.
Trying to use find to copy a bunch of shared objects. Almost there, but I would like to remove all version numbers except the major version.
An example would be somesharedobject.so.30.0.4 copied to somesharedobject.so.30.
find . -maxdepth 1 -type f -name '*.so.*' -exec cp '{}' test/'{}' \;
I'm guessing I'm going to have to pipe to xargs and sed, but I'm just hitting a mental block.
find . -maxdepth 1 -type f -name '*.so.*'|xargs -I '{}' cp '{}' test/'{}'
I think I'm just going to go with something like this:
find . -maxdepth 1 -type f -name '*.so.*' -exec cp '{}' test/'{}' \;
for f in test/*.so.* ; do mv "$f" "${f%.*.*}" ; done
It seems to work OK from my tests.
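For reference, ${f%.*.*} strips the last two dot-separated components, which is what reduces the copied name to the major version. A quick sketch with the filename from the question:
f=test/somesharedobject.so.30.0.4
echo "${f%.*.*}"    # prints test/somesharedobject.so.30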
I would write a function + script to make the job easy:
#!/bin/bash
specialised_copy(){
    # extract the version part alone
    version="${1##*so.}"
    # cut the major version part from the version and use it for the copy
    # note: the folder test should be relative to where the script is saved
    cp "$1" "test/${1%%.so*}.so.${version%%.*}"
}
export -f specialised_copy
find . -maxdepth 1 -type f -name '*.so.*' -exec bash -c 'specialised_copy "$1"' _ {} \;
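Walking through the expansions with the filename from the question (illustrative only):
f=./somesharedobject.so.30.0.4
version="${f##*so.}"                  # 30.0.4
echo "${f%%.so*}.so.${version%%.*}"   # ./somesharedobject.so.30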
I'm trying to find the executable files in a folder and their total count. The files are shown, but the total is not. My code is below; can someone help me figure out where I am making mistakes? I am just a newbie trying to learn some bash scripting, and I hope this is the right way of doing it. Thanks.
#!/bin/bash
To="home/magie/d2"
cd "$To"
find . -type f -perm 755
if
find . -type f -perm 755
then
echo | echo wc -l
fi
If you want to find all the executable files then use this command:
find home/magie/d2 -type f -perm -u+rx | wc -l
OR
find home/magie/d2 -type f -perm +111 | wc -l
All the answers here find files with permission 755 only; however, keep in mind that files with mode 744 or 700 are also executable by the user.
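A quick way to see the difference (an illustrative transcript; the file names are made up):
$ touch a b; chmod 755 a; chmod 700 b
$ find . -maxdepth 1 -type f -perm 755
./a
$ find . -maxdepth 1 -type f -perm -u+rx
./a
./b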
Just remove the if structure and the echos:
#!/bin/bash
To="home/magie/d2"
cd "$To"
find . -type f -perm 755
find . -type f -perm 755 | wc -l
Use /111 to find any file that has any of the execute bits set.
find . -type f -perm /111 | wc -l
I think I'd do something like this:
#!/bin/bash
dir=$1
files="$(find "$dir" -perm 755)"
total=$(wc -l <<< "$files")
echo "$files"
echo "Total: $total"
where the desired directory has to be passed as an argument on the command line, and the quotes are used to preserve the line breaks needed later by wc to correctly count the number of lines.
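A quick demonstration of why the quotes matter for the line count (hypothetical values):
$ files=$(printf 'a\nb\nc')
$ echo "$files" | wc -l    # 3, newlines preserved
$ echo $files | wc -l      # 1, unquoted, the newlines collapse into spaces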
From the command line, a simple one-liner should do the trick:
wc -l < <(find /home/magie/d2 -type f -perm 755)
<(..) is process substitution.
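In other words, <(command) lets the output of a command be read as if it were a file, so the line above is equivalent to piping find into wc -l. A generic sketch (file1 and file2 are hypothetical):
diff <(sort file1) <(sort file2)    # compare two files after sorting, without temp files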
How do I find all the files that were created only today, and not in a 24-hour period, in Unix/Linux?
On my Fedora 10 system, with findutils-4.4.0-1.fc10.i386:
find <path> -daystart -ctime 0 -print
The -daystart flag tells it to calculate from the start of today instead of from 24 hours ago.
Note however that this will actually list files created or modified in the last day. find has no options that look at the true creation date of the file.
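To make the distinction concrete (a sketch; <path> is a placeholder as above):
find <path> -daystart -ctime 0    # status changed since the start of today
find <path> -ctime 0              # status changed within the last 24 hours, regardless of calendar day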
find . -mtime -1 -type f -print
To find all files that were modified today only (since the start of the day, i.e. 12 AM), in the current directory and its sub-directories:
touch -t `date +%m%d0000` /tmp/$$
find . -type f -newer /tmp/$$
rm /tmp/$$
I use this with some frequency:
$ ls -altrh --time-style=+%D | grep $(date +%D)
After going through many posts, I found the best one, which really works:
find $file_path -type f -name "*.txt" -mtime -1 -printf "%f\n"
This prints only the file name, like abc.txt, not /path/tofolder/abc.txt.
Also, play around with or customize -mtime -1.
This worked for me. Lists the files created on May 30 in the current directory.
ls -lt | grep 'May 30'
Use ls or find to list all the files that were created today.
Using ls : ls -ltr | grep "$(date '+%b %e')"
Using find : cd $YOUR_DIRECTORY; find . -ls 2>/dev/null| grep "$(date '+%b %e')"
find ./ -maxdepth 1 -type f -execdir basename '{}' ';' | grep `date +'%Y%m%d'`
You can use find and ls to accomplish with this:
find . -type f -exec ls -l {} \; | egrep "Aug 26";
It will find all files in this directory, display useful information (-l), and filter the lines for the date you want... It may be a little bit slow, but still useful in some cases.
Just keep in mind that there are 2 spaces between Aug and 26. Otherwise your find command will not work.
find . -type f -exec ls -l {} \; | egrep "Aug 26";
If you did something like accidentally rsync to the wrong directory, the above suggestions work for finding new files, but for me the easiest was connecting with an SFTP client like Transmit, then ordering by date and deleting.
To get files from the previous 24-hour period (between 24 and 48 hours ago), execute the command below:
find . -type f -mtime 1 -exec ls -l {} \;
To get files created today (within the last 24 hours), execute the command below:
find . -type f -mtime -1 -exec ls -l {} \;
To get files created more than n days ago, where +2 in the command below means older than 2 days:
find . -type f -mtime +2 -exec ls -l {} \;