Find and delete file but not specific path - linux

I am writing a script to clean up user directories on "/srv". At present every user keeps some temp files in "/srv/$USER".
Here is my script:
for x in $(cut -d: -f1 /etc/passwd); do
    if [ -d "/srv/${x}" ]; then
        echo "/srv/${x}"
        find "/srv/${x}" -mindepth 1 -type f -not -amin -10080 -exec rm {} \;
    fi
done
I tried the script with rm replaced by ls, and this is the output:
/srv/abc
/srv/abc/2015-04-20-11-multi-interval.json
/srv/abc/2015-04-20-10-mimic.json
/srv/xyz
/srv/xyz/magnetic_hadoop/fabfile.py
Here I want to exclude /srv/abc, which is the parent dir, and delete only files, so I added -mindepth 1, but I still didn't get what I wanted.
Then I added -not -path /srv/${x}, but it made no difference.
Does anyone know what I am missing here?
Thanks

The '-type f' means that you will get only files, and your output shows exactly that: after each folder name, which comes from the echo command, only files are listed; find never prints the directories themselves.
Unless you also want to remove the user folders, the '-mindepth 1' option changes nothing here: '-type f' already restricts the matches to regular files, so the starting directory can never be selected.
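For reference, a minimal sketch of the loop with the redundant option dropped and the paths quoted (assuming GNU find and the 10080-minute window from the question); swap -print for -exec rm -- {} + once the listing looks right:
for x in $(cut -d: -f1 /etc/passwd); do
    if [ -d "/srv/${x}" ]; then
        # -type f already restricts matches to regular files,
        # so the directory /srv/$x itself can never be selected
        find "/srv/${x}" -type f -not -amin -10080 -print
    fi
done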

Related

Moving files with a pattern in their name to a folder with the same pattern as its name

My directory contains mix of hundreds of files and directories similar to this:
508471/
ae_lstm__ts_ 508471_detected_anomalies.pdf
ae_lstm__508471_prediction_result.pdf
mlp_508471_prediction_result.pdf
mlp__ts_508471_detected_anomalies.pdf
vanilla_lstm_508471_prediction_result.pdf
vanilla_lstm_ts_508471_detected_anomalies.pdf
598690/
ae_lstm__ts_598690_detected_anomalies.pdf
ae_lstm__598690_prediction_result.pdf
mlp_598690_prediction_result.pdf
mlp__ts_598690_detected_anomalies.pdf
vanilla_lstm_598690_prediction_result.pdf
vanilla_lstm_ts_598690_detected_anomalies.pdf
There are folders with an ID number as their names, like 508471 and 598690.
In the same path as these folders, there are pdf files that have this ID number as part of their name. I need to move all the pdf files with the same ID in their name to their related directories.
I tried the following shell script, but it doesn't do anything. What am I doing wrong?
I'm trying to loop over all the directories, find the files that have the id in their name, and move them to the matching dir:
for f in $(ls -d */); do
    id=${f%?} # f's value is '598690/'; I'm removing the last character, `/`, to get only the id part
    find . -maxdepth 1 -type f -iname "*.pdf" -exec grep $id {} \; -exec mv -i {} $f \;
done
#!/bin/sh
find . -mindepth 1 -maxdepth 1 -type d -exec sh -c '
    for d in "$@"; do
        id=${d#./}
        for file in *"$id"*.pdf; do
            [ -f "$file" ] && mv -- "$file" "$d"
        done
    done
' findshell {} +
This finds every directory inside the current one (finding, for example, ./598690). Then, it removes ./ from the relative path and selects each file that contains the resulting id (598690), moving it to the corresponding directory.
If you are unsure of what this will do, put an echo between && and mv; it will list the mv actions the script would make.
And remember, do not parse ls.
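A quick illustration of why parsing ls is fragile (a hypothetical directory name with spaces; word splitting breaks it apart):
mkdir 'dir with spaces'
# command substitution on ls splits one name into three words
for f in $(ls -d */); do echo "[$f]"; done   # [dir] [with] [spaces/]
# the plain glob keeps each directory name intact
for f in */; do echo "[$f]"; done            # [dir with spaces/]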
The code below should do the required job.
for dir in */; do find . -mindepth 1 -maxdepth 1 -type f -name "*${dir%/}*.pdf" -exec mv {} "${dir}" \;; done
where */ will match only the directories present in the given directory, find will search only for files in the given directory whose name matches *${dir%/}*.pdf, i.e. a file name containing the directory name as a substring, and finally mv will move the matching files into that directory.
On Unix you can use the command below; note that the echo prints each mv command instead of executing it:
find . -name '*508471*' -exec bash -c 'echo mv $0 ${0/508471/598690}' {} \;
You may use this for loop from the parent directory of these pdf files and directories:
for d in */; do
compgen -G "*${d%/}*.pdf" >/dev/null && mv *"${d%/}"*.pdf "$d"
done
compgen -G is used to check whether there is a match for the given glob.
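A small transcript showing the guard in isolation (the ID 999999 is made up for the no-match case):
$ compgen -G "*508471*.pdf" >/dev/null && echo "files exist for 508471"
files exist for 508471
$ compgen -G "*999999*.pdf" >/dev/null || echo "nothing for 999999"
nothing for 999999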

delete specific files and retain some

So I have this directory that includes these .js and .yml files and one folder named config:
pogi#gwapo-pah:~$ ls
index.cat.js
index.bird.js
index.dog.js
index.monkey.js
function.yml
config
I would like to execute a one-liner bash command that would do the following:
find whether "index.dog.js" exists, and if not, exit
find whether "index.dog.js" exists, and if present, remove only the other *.js files and retain index.dog.js, function.yml and the folder config
If the command succeeds, the folder should look like this:
index.dog.js
function.yml
config
This is what I have tried so far; however, I'm not able to work out the missing logic:
if [ -f index.dog.js ] ; then echo 'exists' ; fi
shopt -s extglob
[[ -f index.dog.js ]] && rm !(index.dog).js
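For reference, a sketch of what the extglob pattern expands to against the listing above:
$ shopt -s extglob
$ echo !(index.dog).js
index.bird.js index.cat.js index.monkey.js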
Another way using find command:
[ -f "index.dog.js" ] && find . -maxdepth 1 -name \*.js -not -name index.dog.js -delete
The find command searches the current directory for any file with the js extension except index.dog.js and removes it.
Replace . with the folder name if you are not inside the directory where the files are.
Test if "index.dog.js" exists, if it does, use find to yield all *.js files (but not index.dog.js), and delete them.
EDIT As John Kugelman correctly advises, best to avoid ls due to possible inconsistencies with it.
[ -f "index.dog.js" ] && \
find . -type f -not -name "index.dog.js" -name \*.js -exec rm {} +
test -f index.dog.js && find . -name \*.js -not -name index.dog.js -exec rm {} +
Explanation:
test is a way to do if without all the extra syntax, if you don't need the else.
&& is the "short circuit" (exit) you want if there is no dog file.
find looks for files using multiple criteria. In this case files whose name match *.js but are not the dog file.
find can then execute a command against the found files. The {} is a stand-in for the found files. The + means put all the filenames on one rm command, rather than running one command per file.
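If in doubt, do a dry run first; this sketch only prints what would be deleted:
test -f index.dog.js && find . -name '*.js' -not -name index.dog.js -print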

Copy multiple file from multiple directories with new filename

I want to make a specific kind of copy. Let me explain.
Here is my main folder:
Sub-Directory-name-01\filename-01.jpg
Sub-Directory-name-01\filename-02.jpg
Sub-Directory-name-01\filename-03.jpg
Sub-Directory-name-01\special-filename-01.jpg
Sub-Directory-name-02\filename2-01.jpg
Sub-Directory-name-02\filename2-02.jpg
Sub-Directory-name-02\filename2-03.jpg
Sub-Directory-name-02\special-filename2-01.jpg
I want to copy all files from all dirs and:
- keep the original file
- copy the original file 2 times
- add a prefix to each new name
- prefix-01 for the first copy
- prefix-02 for the second copy
- keep the new files in the same dir as the original file
I already succeeded with a command that makes 1 copy with 1 prefix.
It works inside a sub-directory:
for file in *.jpg; do cp "$file" "prefix-$file"; done
I tried to do it for all sub-dirs, but I got an error:
find . -type f \( -iname "*.jpg" ! -iname "special-*.jpg" \) | xargs cp -v "$file" "prefix-$file"
(Yes, I exclude a special name.)
But I got this error:
cp: target `./Sub-Directory-name-01/filename-01.jpg' is not a directory
I don't know how to solve this or how to add the 2nd copy to the command.
Thanks
Edit: I haven't found any similar question, so any answer that solves this problem is welcome.
Note that above, $file is set only by the for file in ...; do ...; done loop, i.e. in your xargs command line you were just using the last leftover value from the loop.
Some things to consider:
you need to process each file separately => use xargs -l1 (process 1 line at a time)
you need to separate DIR/FILENAME, as the needed command is something like 'cp $DIR/$FILENAME $DIR/prefix-01-$FILENAME' (and likewise for prefix-02) => use find ... -printf "%h %f\n" for this
for each line you need to do a couple of things (prefix-01, prefix-02) => use a scriptlet via sh -c '<scriptlet>'
better to have find skip prefix-0?-*.jpg files, to be able to re-run it without "accumulating" copies
A possible implementation would be:
find . -type f \( -iname "*.jpg" ! -iname "special-*.jpg" ! -name "prefix-0?-*.jpg" \) -printf "%h %f\n" | \
xargs -l1 sh -c 'cp -v "$1/$2" "$1/prefix-01-$2"; cp -v "$1/$2" "$1/prefix-02-$2"' --
As xargs runs sh -c '<scriptlet>' -- DIR FILE for each line, the scriptlet will properly evaluate $1 and $2 respectively.
--jjo
PS: directory separator in Unix-like systems is / :)
[Update: fixed to use %f instead of %P, as per comments below]
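As a hedge before running it for real, the same pipeline with echo in front of each cp prints the commands it would execute:
find . -type f \( -iname "*.jpg" ! -iname "special-*.jpg" ! -name "prefix-0?-*.jpg" \) -printf "%h %f\n" | \
    xargs -l1 sh -c 'echo cp -v "$1/$2" "$1/prefix-01-$2"; echo cp -v "$1/$2" "$1/prefix-02-$2"' --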

linux command line recursively check directories for at least 1 file with the same name as the directory

I have a directory containing a large number of directories. Each directory contains some files and in some cases another directory.
parent_directory
    sub_dir_1
        sub_dir_1.txt
        sub_dir_1_1.txt
    sub_dir_2
        sub_dir_2.txt
        sub_dir_2_1.txt
    sub_dir_3
        sub_dir_3.txt
        sub_dir_3_1.txt
    sub_dir_4
        sub_dir_4.txt
        sub_dir_4_1.txt
    sub_dir_5
        sub_dir_5.txt
        sub_dir_5_1.txt
I need to check that each sub_dir contains at least one file with the exact same name. I don't need to check any further down if there are sub-directories within the sub_dirs.
I was thinking of using for d in ./*/ ; do (command here); done, but I don't know how to access the sub_dir name inside the for loop:
for d in ./*/ ; do
    (if the directory does not contain 1 file with the same name as the directory, then echo the directory name)
done
What is the best way to do this or is there a simpler way?
From the parent directory:
find . -mindepth 1 -maxdepth 1 -type d -printf "%f\n" |
    xargs -I {} find {} -maxdepth 1 -type f -name {}.txt
will give you the name/name.txt pairs that exist. Compare with the full list of dir names to find the missing ones.
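One possible way to do that comparison is with comm (a sketch; assumes bash for the process substitutions and directory names without newlines):
comm -23 <(find . -mindepth 1 -maxdepth 1 -type d -printf "%f\n" | sort) \
         <(find . -mindepth 1 -maxdepth 1 -type d -printf "%f\n" |
           xargs -I {} find {} -maxdepth 1 -type f -name {}.txt -printf "%h\n" | sort -u)
comm -23 prints only the lines unique to the first list, i.e. the directories with no matching name.txt.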
UPDATE
This might be simpler: instead of scanning, you can check directly whether the file exists or not:
for f in $(find . -mindepth 1 -maxdepth 1 -type d -printf "%f\n"); do
    if [ ! -e "$f/$f.txt" ]; then
        echo "$f not found"
    fi
done
Maybe I don't understand fully, but
find . -print | grep -P '/(.*?)/\1\.txt'
this will print any file which is inside a same-named directory, e.g.:
./a/b/b.txt
./a/c/d/d.txt
etc...
Similarly:
find . -print | sed -n '/\(.*\)\/\1\.txt/p'
And this:
find . -print | grep -P '/(.*?)/\1\.'
will list all files, regardless of extension, that sit in same-named dirs.
You can craft other regexes following the backreference logic.
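For instance, a hypothetical variant that reports only .pdf files sitting in a same-named directory (the [^/]+ keeps the backreference inside one path component):
find . -print | grep -P '/([^/]+)/\1\.pdf$'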

LINUX - shell script finding and listing all files with rights to write in directory tree

Here is the code that I have so far:
echo $(pwd > adress)
var=$(head -1 adress)
rm adress
found=0 # flag

fileshow()
{
    cd $1
    for i in *
    do
        if [ -d $i ]
        then
            continue
        elif [ -w $i ]
        then
            echo $i
            found=1
        fi
    done
    cd ..
}

fileshow $1

if [ $found -eq 0 ]
then
    clear
    echo "$(tput setaf 1)There aren't any writable files !!!$(tput sgr0)"
fi
It's working, but it finds files only in the current directory.
I was told that I need to use some kind of recursive method to loop through all sub-directories, but I don't know how to do it.
If anyone can help me, I will be very grateful.
Thanks!
The effect of your script is to find the files below the current working directory that are not directories and are writable by the current user. This can be achieved with the command:
find ./ -type f -writable
The advantage of using -type f is that it also excludes symbolic links and other special kinds of file, if that's what you want. If you want all files that are not directories (as suggested by your script), then you can use:
find ./ ! -type d -writable
If you want to sort these files (added question, assuming lexicographic ascending order), you can use sort:
find ./ -type f -writable | sort
If you want to use these sorted filenames for something else, the canonical pattern would be (to handle filenames with embedded newlines and other seldom-used characters):
while read -r -d $'\0'; do
echo "File '$REPLY' is an ordinary file and is writable"
done < <(find ./ -type f -writable -print0 | sort -z)
If you're using a very old version of find that does not support the handy -writable predicate (added to v.4.3 in 2005), then you only have file permissions to go on. You then have to be clear about what you mean by “writable” in the specific context (writable to whom?), and you can replace the -writable predicate with the -perm predicates described in #gregb's answer. If you decide that you mean “writable by anyone” you could use -perm /u=w,g=w,o=w or -perm /222, but there's actually no way of getting all the benefits of -writable just using permissions. Also note that the + form of permission tests to -perm is deprecated and should no longer be used; the / form should be used instead.
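For example, a rough pre-4.3 approximation of “writable by anyone”, under the caveats above (permission bits only; ACLs and the calling user's actual access are not consulted):
find ./ ! -type d -perm /222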
You could use find:
find /path/to/directory/ -type f -perm -o=w
Where the -o=w implies that each file has the "other write-permission" set.
or,
find /path/to/directory/ -type f -perm /u+w,g+w,o+w
Where /u+w,g+w,o+w implies that each file either has user, group, or other write-permissions set.
