Unix shell loop to check if a directory exists inside multiple directories - linux

I have a folder structure like this:

/home/
    /folder1/
        /backup/
    /folder2/
        /backup/
    /folder3/
    /folder4/
        /backup/
    /folder5/

(As you can see, not all "folder" directories contain a "backup" directory.)
I need to check whether a "backup" directory exists in each folder and, if so, delete it.
I am using this command:
for d in /home/*; do
    [ -d "$d/backup" ] &&
        echo "/backup exists in $d" &&
        rm -rf "$d/backup" &&
        echo "/backup deleted in $d"
done
But it is not working. Please help.

find . -type d -name "backup" -prune -print -exec rm -rf {} +
Obviously, all content under the backup directories will be lost. (find's own -delete action only removes empty directories, so rm -rf is used instead; -prune keeps find from descending into a directory that is about to be removed.)
This will recurse down into your directories. If you need to limit it to the first level of folders, as in your layout where each backup sits directly under a folderN, you can do:
find . -maxdepth 2 -type d -name "backup" -prune -print -exec rm -rf {} +
Both commands print the deleted directories. No output == no directory found, nothing done.
Lastly, you want to avoid looping on files or directory names like you attempted, since you might have files or directories with spaces in their names. A complete discussion and solutions are available here: https://mywiki.wooledge.org/BashFAQ/001
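A minimal sandbox run of the find-based cleanup, in case you want to verify it before touching real data. The folder names mirror the question; the temp directory and the old.log file are invented for the demo. -prune plus rm -rf handles non-empty backup directories, which -delete alone would not:

```shell
# Throwaway sandbox so nothing real is touched (paths are demo-only).
tmp=$(mktemp -d)
mkdir -p "$tmp/folder1/backup" "$tmp/folder2/backup" "$tmp/folder3" \
         "$tmp/folder4/backup" "$tmp/folder5"
touch "$tmp/folder1/backup/old.log"   # a non-empty backup directory

# -prune stops find descending into a directory we are deleting;
# rm -rf (unlike -delete) also removes non-empty directories.
find "$tmp" -mindepth 2 -maxdepth 2 -type d -name backup -prune -exec rm -rf {} +

ls "$tmp"   # the five folderN directories remain, their backups are gone
```

Drop the -mindepth/-maxdepth pair to recurse to any depth.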

Related

Moving files with a pattern in their name to a folder with the same pattern as its name

My directory contains a mix of hundreds of files and directories similar to this:
508471/
ae_lstm__ts_508471_detected_anomalies.pdf
ae_lstm__508471_prediction_result.pdf
mlp_508471_prediction_result.pdf
mlp__ts_508471_detected_anomalies.pdf
vanilla_lstm_508471_prediction_result.pdf
vanilla_lstm_ts_508471_detected_anomalies.pdf
598690/
ae_lstm__ts_598690_detected_anomalies.pdf
ae_lstm__598690_prediction_result.pdf
mlp_598690_prediction_result.pdf
mlp__ts_598690_detected_anomalies.pdf
vanilla_lstm_598690_prediction_result.pdf
vanilla_lstm_ts_598690_detected_anomalies.pdf
There are folders with an ID number as their names, like 508471 and 598690.
In the same path as these folders, there are pdf files that have this ID number as part of their name. I need to move all the pdf files with the same ID in their name, to their related directories.
I tried the following shell script but it doesn't do anything. What am I doing wrong?
I'm trying to loop over all the directories, find the files that have id in their name, and move them to the same dir:
for f in $(ls -d */); do
    id=${f%?} # f value is '598690/', I'm removing the last character, `/`, to get only the id part
    find . -maxdepth 1 -type f -iname *.pdf -exec grep $id {} \; -exec mv -i {} $f \;
done
#!/bin/sh
find . -mindepth 1 -maxdepth 1 -type d -exec sh -c '
    for d in "$@"; do
        id=${d#./}
        for file in *"$id"*.pdf; do
            [ -f "$file" ] && mv -- "$file" "$d"
        done
    done
' findshell {} +
This finds every directory inside the current one (finding, for example, ./598690). Then, it removes ./ from the relative path and selects each file that contains the resulting id (598690), moving it to the corresponding directory.
If you are unsure of what this will do, put an echo between && and mv; it will list the mv actions the script would run.
And remember, do not parse ls.
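Here is a sandbox run of the find/sh -c mover above (note the inner loop must read "$@"). The temp directory and the two file names are invented for the demo, reusing the IDs from the question:

```shell
# Build a throwaway copy of the question's layout (demo paths only).
tmpA=$(mktemp -d); cd "$tmpA"
mkdir 508471 598690
touch mlp_508471_prediction_result.pdf ae_lstm__598690_prediction_result.pdf

# For each top-level directory, move every PDF whose name contains its ID.
find . -mindepth 1 -maxdepth 1 -type d -exec sh -c '
    for d in "$@"; do
        id=${d#./}
        for file in *"$id"*.pdf; do
            [ -f "$file" ] && mv -- "$file" "$d"
        done
    done
' findshell {} +
```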
The below code should do the required job.
for dir in */; do find . -mindepth 1 -maxdepth 1 -type f -name "*${dir%/}*.pdf" -exec mv {} "$dir" \; ; done
where */ matches only the directories present in the current directory, find searches only for files in the current directory whose names match *${dir%/}*.pdf, i.e. contain the directory name as a substring, and finally mv moves the matching files into that directory.
On Unix you can use the command below (the echo makes this a dry run that only prints the mv commands; remove it to actually rename):
find . -name '*508471*' -exec bash -c 'echo mv "$0" "${0/508471/598690}"' {} \;
You may use this for loop from the parent directory of these pdf files and directories:
for d in */; do
    compgen -G "*${d%/}*.pdf" >/dev/null && mv *"${d%/}"*.pdf "$d"
done
compgen -G is used to check whether there is a match for the given glob.
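A quick sandbox check of the compgen-guarded loop (bash, since compgen is a bash builtin; the temp directory and file names are invented, reusing the question's IDs):

```shell
# Demo layout: two ID directories plus sibling PDFs (demo paths only).
tmp2=$(mktemp -d); cd "$tmp2"
mkdir 508471 598690
touch mlp_508471_prediction_result.pdf vanilla_lstm_ts_598690_detected_anomalies.pdf

for d in */; do
    # compgen -G succeeds only if the glob matches, so mv never receives
    # a literal unexpanded '*...*' pattern.
    compgen -G "*${d%/}*.pdf" >/dev/null && mv *"${d%/}"*.pdf "$d"
done
```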

delete specific files and retain some

So I have this directory that includes these .js and .yml files and one folder named config:
pogi#gwapo-pah:~$ ls
index.cat.js
index.bird.js
index.dog.js
index.monkey.js
function.yml
config
I would like to execute a one-liner bash command that performs these steps:

1. Check if "index.dog.js" exists; if not, exit.
2. If it exists, remove only the other *.js files, retaining index.dog.js, function.yml and the config folder.

If the command succeeds, the folder should look like this:
index.dog.js
function.yml
config
This is what I've tried so far; however, I can't work out the missing logic:
if [ -f index.dog.js ] ; then echo 'exists' ; fi
shopt -s extglob
[[ -f index.dog.js ]] && rm !(index.dog).js
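A sandbox run of the extglob version (bash only, since !(...) patterns and [[ are bash features; the temp directory is invented, the file names come from the question):

```shell
# Recreate the question's directory in a throwaway location.
tmpB=$(mktemp -d); cd "$tmpB"
touch index.cat.js index.bird.js index.dog.js index.monkey.js function.yml
mkdir config

shopt -s extglob
# !(index.dog).js matches every name ending in .js whose stem is not
# "index.dog", so index.dog.js, function.yml and config/ survive.
[[ -f index.dog.js ]] && rm !(index.dog).js

ls
```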
Another way using find command:
[ -f "index.dog.js" ] && find . -maxdepth 1 -name \*.js -not -name index.dog.js -delete
The find command searches the current directory for any file with the .js extension except index.dog.js and removes it.
Replace . with the folder path if you are not inside the directory containing the files.
Test if "index.dog.js" exists, if it does, use find to yield all *.js files (but not index.dog.js), and delete them.
EDIT As John Kugelman correctly advises, best to avoid ls due to possible inconsistencies with it.
[ -f "index.dog.js" ] && \
find . -type f -not -name "index.dog.js" -name \*.js -exec rm {} +
test -f index.dog.js && find . -name \*.js -not -name index.dog.js -exec rm {} +
Explanation:
test is a way to do if without all the extra syntax, if you don't need the else.
&& is the "short circuit" (exit) you want if there is no dog file.
find looks for files using multiple criteria. In this case files whose name match *.js but are not the dog file.
find can then execute a command against the found files. The {} is a stand-in for the found files. The + means put all the filenames on one rm command, rather than running one command per file.
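A sandbox run of the test && find combination described above (the temp directory is invented, the file names come from the question):

```shell
# Recreate the question's directory in a throwaway location.
tmp3=$(mktemp -d); cd "$tmp3"
touch index.cat.js index.bird.js index.dog.js index.monkey.js function.yml
mkdir config

# Short-circuit: find only runs if the dog file exists.
test -f index.dog.js && find . -name '*.js' -not -name index.dog.js -exec rm {} +

ls   # only index.dog.js, function.yml and config remain
```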

Asterisk in for loop not working as expected?

I don't understand why this is not working. I am trying to loop through files and folders and delete some of them depending on the name. In the example below, I delete all folders except the one named in the if statement.
Here's the code:
#!/bin/bash
workdir=/var/www/
for dir in $workdir/custom/*; do
    if ! [ "$dir" == "$workdir/custom/somefolder" ]; then
        rm -rf $dir
        echo "remove $dir $?"
    fi
    echo "$dir"
done
The problem is that there are several files and folders in /custom/, but echo "$dir" outputs /var/www/custom/* once instead of running through every file and folder in that directory. I know this means that * didn't match anything, but this is impossible.
The folder exists, has several files and folders in it and the path is correct, also the user has all needed permissions to rm files, I checked that twice.
What am I missing?
find /var/www/custom/ -mindepth 1 -type d -prune -exec rm -rf {} +
this will remove all the directories inside your custom directory, if that is what is required. (-mindepth must come before the other tests, and -prune keeps find from trying to descend into a directory it has just deleted.)
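As to the original symptom: when a bash glob matches nothing, the pattern is left unexpanded, so $dir holds the literal string with the *. That usually means the path being globbed is not what you think it is (a stray slash, a typo, a permission problem). The nullglob option makes an unmatched glob expand to zero words instead, so the loop body is simply skipped. A sandbox demonstration (bash; the temp directory simulates an empty custom/):

```shell
# Empty sandbox: the glob below matches nothing (demo paths only).
tmp4=$(mktemp -d)
mkdir "$tmp4/custom"

# Default behaviour: the unmatched pattern is passed through literally.
matches=()
for dir in "$tmp4"/custom/*; do matches+=("$dir"); done
echo "without nullglob: ${#matches[@]} entry (the literal pattern)"

# With nullglob the loop runs zero times instead.
shopt -s nullglob
matches=()
for dir in "$tmp4"/custom/*; do matches+=("$dir"); done
echo "with nullglob: ${#matches[@]} entries"
```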

Archive old directories and delete them

How can we archive old directories and delete them afterwards? If we suppose an old directory is one that was last modified at least three days ago, I can get my list of directories with
find . -mindepth 2 -type d -mtime +2
which returns a list like
./dir1/1394547493
./dir2/1394525075
./dir2/1394531732
./dir3/1394546562
Now, for each directory in this list, we need to create a .ZIP archive in the corresponding dirX containing ONLY the files from it (there are no other directories inside them), and delete the subdirectory right after. In the end our structure should look like this
./dir1/1394547493.zip
./dir2/1394525075.zip
./dir2/1394531732.zip
./dir3/1394546562.zip
Resulting archives must not contain any paths.
Try this one-liner:
for dir in $(find . -mindepth 2 -type d -mtime +2); do cd "$dir" && zip ../"$(basename "$dir")" * && cd - >/dev/null && rm -rf "$dir" || cd - >/dev/null; done
This one-liner enters each directory, zips its contents without parent directories and removes the directory on success, but leaving the directory in place in case of failure.
Good luck :)
Edit: This requires directory names without spaces, tabs, or newlines, since the unquoted $(find …) word-splits on whitespace.
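A whitespace-safe sketch of the same archive-and-delete idea, letting find hand each directory to a shell instead of looping over $(find …). The temp directory and file names are invented, and tar is used here because zip may not be installed; for actual .zip output, swap the tar line for (cd "$d" && zip -q -j "../$(basename "$d").zip" *). The -mtime filter is dropped so the freshly created demo directories match:

```shell
# Demo tree, including a directory name with a space (paths invented).
tmp5=$(mktemp -d)
mkdir -p "$tmp5/dir 1/1394547493" "$tmp5/dir2/1394525075"
touch "$tmp5/dir 1/1394547493/a.txt" "$tmp5/dir2/1394525075/b.txt"

# For each depth-2 directory: archive its contents without any leading
# paths (-C changes into it first), then remove it on success.
# -prune keeps find from descending into a directory we just deleted.
find "$tmp5" -mindepth 2 -type d -prune -exec sh -c '
    for d in "$@"; do
        tar -czf "$d.tar.gz" -C "$d" . && rm -rf "$d"
    done
' _ {} +

find "$tmp5" -name '*.tar.gz'
```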

Merging Sub-Folders together, Linux

I have a main folder "Abc" which has about 800 sub-folders. Each of these sub-folders contains numerous files (all of the same format, say ".doc"). How do I create one master folder with all these files (and not being distributed into subfolders). I am doing this on a Windows 7 machine, using cygwin terminal.
The cp -r command copies it but leaves the files in the sub-folders, so it doesn't really help much. I'd appreciate assistance with this. Thank you!
Assuming there could be name collisions and multiple extensions, this will create unique names, changing directory paths to dashes (e.g. a/b/c.doc would become a-b-c.doc). Run this from within the folder you want to collapse:
# if globstar is not enabled, you'll need it.
shopt -s globstar
for file in */**; do [ -f "$file" ] && mv -i "$file" "${file//\//-}"; done
# get rid of the now-empty subdirectories.
find . -type d -empty -delete
If you can guarantee unique names, this will move the files and remove the subdirectories. You can change the two .s to the name of a folder and run it from outside said folder:
find . -depth \( -type f -exec mv -i {} . \; \) -o \( -type d -empty -delete \)
This may not be the most elegant or efficient way to do it, but I believe it'd accomplish what you want:
for file in `find abc`
do
    if [ -f $file ]
    then
        mv $file `basename $file`
    fi
done
Iterate through everything in abc, check if it's a file (not a directory) and if it is, move it from its current location (eg abc/d/example.txt) into the directory you run the script from, under its base name. Note this breaks on file names containing spaces, and name collisions overwrite silently.
Edit: This would leave all the subfolders in place (but they'd be empty now)
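A sandbox run of the collision-safe globstar approach from the first answer (bash; the temp directory, the Abc subtree and the .doc names are invented, including a deliberate duplicate base name x.doc):

```shell
# Demo tree with a name collision: a/b/x.doc and c/x.doc (paths invented).
tmp6=$(mktemp -d)
mkdir -p "$tmp6/Abc/a/b" "$tmp6/Abc/c"
touch "$tmp6/Abc/a/b/x.doc" "$tmp6/Abc/c/x.doc"

cd "$tmp6/Abc"
shopt -s globstar nullglob
# Each slash in the path becomes a dash, so the two x.doc files get
# distinct flattened names: a-b-x.doc and c-x.doc.
for file in */**; do [ -f "$file" ] && mv -i "$file" "${file//\//-}"; done

# Get rid of the now-empty subdirectories.
find . -type d -empty -delete
ls
```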