Find files in multiple directories taken from list in a file? - linux

FreeBSD 9.2 RELEASE p2
I have a file fromDirs.txt. It contains a newline-separated list of directories, like so:
/etc
/home
/home/goods/
I need to first find, in all of these directories, the files whose names contain "good" or whose contents contain the string "(NODES_'TASK')", and then copy all of those files into the directory /tmp.
The file 2.sh has been chmod +x'd and is mode 755.
The file fromDirs.txt has also been chmod +x'd and is mode 755.
This code gives me an error:
IFS=$'\n' read -d '' -r -a dirs < fromDirs.txt
find "${dirs[#]}" -type f \( -name '*good*' -o -exec grep -F "(NODES_'TASK')" {} \; \) -exec cp {} /tmp/ \;
2.sh: cannot open fromDirs.txt : No such file or directory
2.sh: ${dirs[...}: Bad substitution
But the file fromDirs.txt exists and 2.sh is run from the same directory; I also tried providing the full path instead of fromDirs.txt and got the same error.
This code also gives me an error:
FILE=fromDirs.txt
IFS='\n'
while read -r dirs
do
find "$dirs" -type f \( -name '*good*' -o -exec grep -F "(NODES_'TASK')" {} \; \) -exec cp {} /tmp/ \;
done < "$FILE"
2.sh: 6: Syntax error: "done" unexpected (expecting "do")
This code gives me an error too:
FILENAME=fromDirs.txt
awk '{kount++;print kount, $0}
END{print "\nTotal " kount " lines read"}' $FILENAME
2.sh: : not found
awk: can't open file fromDirs.txt
 source line number 2
So how do I read the file line by line and do what I need?

This works for me:
for line in `cat fromDirs.txt`; do find "$line" -type f \( -name '*good*' -o -exec grep -F "(NODES_'TASK')" {} \; \) -exec cp {} /tmp/ \;; done
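A more defensive variant (a sketch, assuming a POSIX sh and that fromDirs.txt has plain Unix line endings) reads the file line by line instead of word-splitting the output of cat, so directory names containing spaces survive:
while IFS= read -r dir; do
[ -n "$dir" ] || continue    # skip blank lines
find "$dir" -type f \( -name '*good*' -o -exec grep -F "(NODES_'TASK')" {} \; \) -exec cp {} /tmp/ \;
done < fromDirs.txt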

Related

Shell cp: cannot stat no such file or directory

I was trying to use cp to copy files from one directory to another by globbing:
for files in index/*
do
file=$(echo "$files" | cut -d'/' -f2)
cp -r "$files" ".target/$file"
done
However, cp will give this warning if the directory is empty. I tried 2>/dev/null to mute this message but it did not work. I wonder how I could fix it.
What about this: (not tested)
find /index -maxdepth 1 -type f -exec cp {} .target/ \;
-maxdepth 1 : only look in this directory
-type f : only take the files
-exec cp {} .target/ \; : execute a "file copy" action
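If you want to keep the original loop instead, bash's nullglob option (an assumption here, since the question does not say which shell is used) makes the glob expand to nothing when index/ is empty, so cp is never called and no warning is printed:
#!/bin/bash
shopt -s nullglob    # an unmatched glob expands to nothing instead of staying literal
for files in index/*; do
file=${files##*/}    # strip the leading index/ part
cp -r "$files" ".target/$file"
done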

find directory and make symbolic link into hard link in directory

I want to find a folder inside a directory and change the symbolic links in those folders into hard links.
I can find all the symlinks with the following command:
find ${DIRECTORY_0} -type d -name "${DIRECTORY_1}" -exec bash -c 'find "$0" -type l -exec echo {\} \;' {} \;
The result lists all the symlinks that are found.
If I change echo to readlink, it shows the file each link points to.
find ${DIRECTORY_0} -type d -name "${DIRECTORY_1}" -exec bash -c 'find "$0" -type l -exec readlink {\} \;' {} \;
Once I tried the command:
find ${DIRECTORY_1} -type l -execdir bash -c ' cp --remove-destination -fR "$(readlink {} && rm {})" {} ' \;
and that converts all the symlinks into hard links.
But I want to merge them together: find ${DIRECTORY_1} inside ${DIRECTORY_0} and, in the same run, change the symlinks in ${DIRECTORY_1} into hard links.
My try:
find directory -type d -name "special_folder" -exec bash -c '\
for i do
find "$i" -type l -execdir bash cp --remove-destination -fvR "$(readlink {\} && rm {\})" {\} +
done' bash {} +
but it shows this error message:
/bin/cp: /bin/cp: cannot execute binary file
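A sketch of one way to merge the two steps, assuming GNU find, bash and GNU coreutils, and using the same cp --remove-destination trick as the single command above:
find "${DIRECTORY_0}" -type d -name "${DIRECTORY_1}" -exec bash -c '
for dir in "$@"; do
# walk every symlink below the matched directory, NUL-separated to survive odd names
find "$dir" -type l -print0 |
while IFS= read -r -d "" link; do
target=$(readlink -f "$link") || continue    # resolve the link target
cp --remove-destination "$target" "$link"    # replace the link with a copy of its target
done
done
' bash {} +
The cannot execute binary file error in your attempt comes from -execdir bash cp ...: without -c, bash looks up cp in PATH and tries to run the /bin/cp binary as a shell script.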

How to rename multiple files at once

I have lots of files, directories and sub-directories on my file system.
For example:
/path/to/file/test-poster.jpg
/anotherpath/my-poster.jpg
/tuxisthebest/ohyes/path/exm/bold-poster.jpg
I want to rename every *-poster.jpg file to folder.jpg.
I have tried with sed and awk with no success.
A little help?
You can do it with find:
find -name "*poster.jpg" -exec sh -c 'mv "$0" "${0%/*}/folder.jpg"' '{}' \;
Explanation
Here, find executes the following for each filename matched:
sh -c 'mv "$0" "${0%/*}/folder.jpg"' '{}'
Where '{}' is the filename passed as an argument to the command_string:
mv "$0" "${0%/*}/folder.jpg"
So, at the end, $0 will have the filename.
Finally, ${0%/*}/folder.jpg expands to the directory part of the old filename with /folder.jpg appended.
Example
Notice I'm replacing mv with echo
$ find -name "*poster.jpg" -exec sh -c 'echo "$0" "${0%/*}/folder.jpg"' '{}' \;
./anotherpath/my-poster.jpg ./anotherpath/folder.jpg
./path/to/file/test-poster.jpg ./path/to/file/folder.jpg
./tuxisthebest/ohyes/path/exm/bold-poster.jpg ./tuxisthebest/ohyes/path/exm/folder.jpg
Try this script; it should rename all the files as required.
for i in $(find . -name "*-poster.jpg") ; do folder=$(dirname "$i"); mv -iv "$i" "$folder/folder.jpg"; done
You can replace the . in the find . -name "*-poster.jpg" command with the directory where these files are located. Let me know if it works for you.
You can try it like this:
find -name '*poster*' -type f -exec sh -c 'mv "{}" "$(dirname "{}")"/folder.jpg' \;
find all files whose names contain poster == find -name '*poster*' -type f
take the directory path of each file with dirname and append "folder.jpg" to it == -exec sh -c 'mv "{}" "$(dirname "{}")"/folder.jpg' \;
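Note that embedding {} inside the sh -c string can break on unusual filenames; a variant of the same idea (a sketch) passes the name as an argument instead, much like the first answer does:
find . -name '*-poster.jpg' -type f -exec sh -c 'mv "$1" "${1%/*}/folder.jpg"' sh {} \;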

linux find no such file or directory but exists

Then I tried to use this script:
for line in `cat dirs.txt`;
do
find "$line" -type f \( -name '*good*' -o -exec grep -F "badbad" {} \; \) -exec echo {} \;;
done
I get an error for each of the directories, even though they exist and match the find criteria:
find: /home/goods/ : No such file or directory
find: /home/bads/ : No such file or directory
find: /home/fill/ : No such file or directory
But when I look manually, these directories exist and I can read them all.
Why does this happen?
You should check the file for ^M$ (carriage returns).
You can do that with the command cat -vET dirs.txt
Then you can strip them all with the command cat dirs.txt | tr -d '\r' > 1.txt
The issue is that you have DOS (^M) line endings in the file. Running dos2unix dirs.txt should solve the problem. Ideally, you also shouldn't use for line in $(cat ...); use something like:
while IFS= read -r line; do
find "$line" -type f \( -name '*good*' -o -exec grep -F "badbad" {} \; \) -exec echo {} \;
done < dirs.txt
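If you would rather not touch dirs.txt itself, a variant of that loop (a sketch, assuming bash for the $'\r' quoting) strips a trailing carriage return from each line before handing it to find:
while IFS= read -r line; do
line=${line%$'\r'}    # drop the CR left over from DOS line endings
[ -n "$line" ] || continue
find "$line" -type f \( -name '*good*' -o -exec grep -F "badbad" {} \; \) -exec echo {} \;
done < dirs.txt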

dos2unix command

I have this script
#!/bin/sh
for i in `ls -R`
do
echo "Changing $i"
fromdos $i
done
I want to remove "^M" characters from many files spread over several subdirectories. I got this:
fromdos: Unable to access file
Is there something I'm missing?
Thanks in advance.
ls -R lists everything, including directories. So you're telling fromdos to act on actual directories in some cases.
Try something like this:
find . -type f -exec fromdos {} \;
I guess you don't need a for loop.
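If your fromdos build accepts several file names per call (an assumption worth checking in its man page), the + form of -exec batches files and saves one process start per file:
find . -type f -exec fromdos {} +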
Here is a quick panorama of solutions for files with the extension ".ext" (it is wise to keep such commands somewhat restrictive).
Note: the literal ^M is typed with "CTRL-V" + "CTRL-M".
# PORTABLE SOLUTION
find /home -type f -name "*.ext" -exec sed -i -e 's/^M$//' {} \;
# GNU-sed
find /home -type f -name "*.ext" -exec sed -i -e "s/\x0D$//g" {} \;
# SED on a more recent *nix
find /home -type f -name "*.ext" -exec sed -i -e "s/\r$//g" {} \;
# DOS2UNIX
find /home -type f -name "*.ext" -print0 | while read -r -d "$(printf "\000")" -r path; do dos2unix $path $path"_new"; done
# AWK
find /home -type f -name "*.ext" -print0 | while read -r -d "$(printf "\000")" -r path; do awk '{ sub("\r$", ""); print }' $path > $path"_new"; done
# TR
find /home -type f -name "*.ext" -print0 | while read -r -d "$(printf "\000")" -r path; do cat $path | tr -d '\r' > $path"_new"; done
# PERL
find /home -type f -name "*.ext" -exec perl -pi -e 's/\r//g' {} \;
