Linux - Match a filename passed to a script against a wildcard list in a text file and move based on that

I am trying to write a script that will perform the following. Here is the detail.
1) Files will land in a staging folder:
/tmp/staging
2) We need to move the files to one of two folders:
/tmp/folder1 (files we don't want but can archive)
/tmp/folder2 (files we want to process)
To do this, we need to match each landed file against a list of exclusions given as wildcards, such as TST_ and PMP_. If it matches one of them, we move the file to folder1; if it does not match, we move it to folder2.
I have tried to set up a find command (below) that loops through the list and moves files based on what it finds, but what I really want is to match only the file that I pass as a parameter against the list.
FILTER=/tmp/filter.txt
STG=$1 # this is passed as a parameter, which is the filename
FOL1=/tmp/folder1
FOL2=/tmp/folder2
FILE=$2
while read -r line
do
    find "$STG" -type f -name "$line" -exec mv -t "${STG%/*}$FOL2" {} \;
done < "$FILTER"
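Since only the single filename passed as a parameter needs to be tested, find isn't needed at all: bash's `[[ $name == pattern ]]` can glob-match the name against each line of the filter file. A runnable sketch; all paths, the sample patterns, and the `classify` helper name are assumptions based on the question:

```shell
#!/usr/bin/env bash
# Sketch: classify one file against a list of glob patterns and move it.
# The folders, filter contents, and sample files below are made up for the demo.
mkdir -p /tmp/staging /tmp/folder1 /tmp/folder2
printf 'TST_*\nPMP_*\n' > /tmp/filter.txt
touch /tmp/staging/TST_report.csv /tmp/staging/DATA_report.csv

classify() {
    local file=$1 dest=/tmp/folder2   # default: process the file
    while IFS= read -r pattern; do
        # unquoted right-hand side makes [[ == ]] do glob matching
        if [[ $file == $pattern ]]; then
            dest=/tmp/folder1         # matched an exclusion: archive it
            break
        fi
    done < /tmp/filter.txt
    mv -- "/tmp/staging/$file" "$dest/"
}

classify TST_report.csv    # matches TST_*   -> moved to /tmp/folder1
classify DATA_report.csv   # matches nothing -> moved to /tmp/folder2
```

The glob comparison happens entirely in the shell, so no subprocess is spawned per pattern.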

Related

Search and replace string in file names without affecting folder names which contains same string

I have a folder which contains multiple subfolders and files with different names.
I want to rename files whose names contain a specific string. It works when the string I want to change is not the same as a subfolder name; when it is, my code fails.
Example directory and files:
./cat/cat_1.py
./cat/small_cat_2.py
./small_cat_3.py
./example/cat_4_small.py
My working command:
find -name '*small*' -type f -exec bash -c 'mv "$0" "${0/small/${1}}"' {} ${NEW_NAME} \;
It works without problem.
My failing command:
find -name '*cat*' -type f -exec bash -c 'mv "$0" "${0/cat/${1}}"' {} ${NEW_NAME} \;
For NEW_NAME=dog
It doesn't work because this command tries to replace folder names from ./cat/cat_1.py as ./dog/dog_1.py which is a situation I don't want.
I want to rename ./cat/cat_1.py as ./cat/dog_1.py.
I set NEW_NAME parameter with user input.
Folders and files are not static; they change every day, so I can't simply change my working directory to the "cat" folder and do the replacement in its subfolders statically.
New directories and files may be created every day, so my code has to handle new situations.
Can anyone help with this?
You need a different shell script to process the renames.
Since you want to rename only the base name of the file and not the directory path leading to it, you will have to split the base name from the base directory,
then perform the rename:
#!/usr/bin/env bash

# Capture the replacement name as the first argument
newname="$1"

# Shift the first argument out of the argument array
shift

# Iterate over the remaining arguments: the file paths provided by find
for filepath; do
    # Capture the base file name of the path
    # by stripping out everything up to and including the last /
    basename=${filepath##*/}

    # Capture the base directory of the path
    # by stripping out everything from the last / onward
    basedir=${filepath%/*}

    # Replace the cat string with the new name within the
    # old base name of the file to build its new base name
    newbasename=${basename/cat/"$newname"}

    # Rename: the old file path becomes the reassembled
    # base directory with the new base name
    echo mv -- "$filepath" "$basedir/$newbasename"
done
With the inline bash inside find:
find . -name '*cat*' -type f -exec bash -c 'r=$1; shift; for p; do o=${p##*/}; d=${p%/*}; echo mv -- "$p" "$d/${o/cat/"$r"}"; done' _ dog {} +
If you are satisfied with the output, remove the echo so the commands are actually executed, or pipe the output to a shell.

Recursive find that will append directory name to any file

Any help would be greatly appreciated!
I need a way to append the parent directory name to any file in any path.
An example current directory tree
/Hawaii/Surfing/800x600/picture1.jpg
/Hawaii/Surfing/800x600/picture2.jpg
/Hawaii/Surfing/800x600/picture3.jpg
/RockClimbing/SouthAfrica/TableMountain/4096x2160/Picture1.jpg
/RockClimbing/SouthAfrica/TableMountain/4096x2160/Picture2.jpg
/RockClimbing/SouthAfrica/TableMountain/4096x2160/Picture3.jpg
The goal
/Hawaii/Surfing/800x600/picture1.800x600.jpg
/Hawaii/Surfing/800x600/picture2.800x600.jpg
/Hawaii/Surfing/800x600/picture3.800x600.jpg
/RockClimbing/SouthAfrica/TableMountain/4096x2160/Picture1.4096x2160.jpg
/RockClimbing/SouthAfrica/TableMountain/4096x2160/Picture2.4096x2160.jpg
/RockClimbing/SouthAfrica/TableMountain/4096x2160/Picture3.4096x2160.jpg
I have found some examples of this, but they all assume a fixed directory depth; unfortunately I have files at many different levels.
find dir -name '*.jpg' -exec rename -nv -- 's|/(.*)/(.*)$|/$1/$1.jpg|' {} +
Your first capture group is matching everything before the last /, not just the last directory name. Use /([^/]*)/ instead of /(.*)/ so it won't match across / delimiters. You're also not splitting up the filename into the name and extension, so you're not inserting the directory name between them.
find dir -name '*.jpg' -exec rename -nv -- 's|([^/]*)/([^/]*)\.jpg$|$1/$2.$1.jpg|' {} +
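If the perl-based rename utility isn't available, the same insertion can be sketched with plain bash parameter expansion inside find; the /tmp/pics tree below is created just for the demo:

```shell
#!/usr/bin/env bash
# Sketch: append the parent directory name before the .jpg extension,
# using bash parameter expansion instead of perl rename.
# The sample tree is made up for the demo.
mkdir -p /tmp/pics/Hawaii/Surfing/800x600
touch /tmp/pics/Hawaii/Surfing/800x600/picture1.jpg

find /tmp/pics -name '*.jpg' -type f -exec bash -c '
    for p; do
        d=${p%/*}          # directory path, e.g. .../Surfing/800x600
        parent=${d##*/}    # last directory component, e.g. 800x600
        base=${p##*/}      # file name, e.g. picture1.jpg
        mv -- "$p" "$d/${base%.jpg}.$parent.jpg"
    done
' _ {} +
```

The `%.jpg` strip and re-append is what inserts the directory name between the base name and the extension, mirroring the `$2.$1.jpg` replacement in the rename expression.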

Moving a file and renaming it after the directory which contains it on Bash

I'm trying to learn bash on Linux, just for fun. I thought it would be pretty useful to have a .sh that would group together similar files. For example, let's say we have the directory
/home/docs/
Inside the directory we have /mathdocs/, /codingdocs/, etc.
Inside each of those sub-directories there is a doc.txt file: the same name in every subdirectory.
Let's say I want to group them together, and I want to move all the files to /home/allthedocs/ and rename them after the directories they were in. (mathdocs.txt, codingdocs.txt, etc.)
How could I do that?
I've tried to create a script based on the ls and cp commands, but I don't know how to take the names of the directories so I can rename the files after moving them. I guess it needs some sort of loop (for X in Y directories), but I don't know how to write it.
You can move and rename your file in one shot with mv, with a loop that grabs all your files through a glob:
#!/bin/bash

dest_dir=/home/allthedocs

cd /home/docs || exit 1
for file in */doc.txt; do
    [[ -f "$file" ]] || continue   # skip if not a regular file
    dir="${file%/*}"               # get the dir name from the path
    mv "$file" "$dest_dir/$dir.txt"
done
See this post for more info:
Copying files from multiple directories into a single destination directory
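To illustrate the effect of the loop, here is a self-contained demo against a throwaway layout (all the /tmp paths are invented for the example):

```shell
#!/usr/bin/env bash
# Demo of the move-and-rename loop on a hypothetical layout under /tmp
mkdir -p /tmp/docs/mathdocs /tmp/docs/codingdocs /tmp/allthedocs
touch /tmp/docs/mathdocs/doc.txt /tmp/docs/codingdocs/doc.txt

cd /tmp/docs || exit 1
for file in */doc.txt; do
    [[ -f "$file" ]] || continue
    dir="${file%/*}"                       # e.g. mathdocs
    mv "$file" "/tmp/allthedocs/$dir.txt"  # becomes mathdocs.txt
done

ls /tmp/allthedocs    # lists codingdocs.txt and mathdocs.txt
```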
Here is a one-liner solution that handles whitespace in filenames, just as #codeforester's solution does with the glob.
Note that whitespace is handled by the -print0 option passed to find, the internal field separator (IFS) in the while loop, and the quoting of the file3 variable.
The parameter substitution from file2 into file3 strips the leading "./".
The parameter substitution inside the move command turns the path into a filename (run under /home/docs/):
find . -maxdepth 2 -mindepth 2 -print0 | while IFS= read -r -d '' file; \
do file2=$(printf '%s\n' "$file"); file3=${file2#*\/*}; \
mv "$file2" ../allsil/"${file3//\//}"; done

I want to cat a file for a list of file names, then search a directory for each of the results and move any files it finds

I'm having a really hard time finding an answer for this, because most examples go the other way: they find a list of files first, then feed it to xargs.
I have a file with a list of file names. I would like to go through the list of file names in that file and one by one search a single directory (not recursive) and any time it finds the file, move it to a sub folder.
cat test.txt | xargs find . -type f -iname
At this point, I get this error:
find: paths must precede expression: file1
Why don't you use something like:
while IFS= read -r i
do
    test -e "$i" && mv -- "$i" <dst>
done < test.txt
Reading the list line by line (instead of for i in `cat test.txt`) keeps filenames containing spaces intact, and quoting "$i" avoids word splitting.
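Putting the pieces together for the original question (non-recursive, one directory, move hits into a subfolder), here is a self-contained sketch; the /tmp/inbox directory and the processed/ subfolder name are made up:

```shell
#!/usr/bin/env bash
# Sketch: move every file listed in test.txt into a subfolder, if it exists.
# The working directory /tmp/inbox, its contents, and "processed" are
# invented for the demo.
mkdir -p /tmp/inbox/processed
cd /tmp/inbox || exit 1
printf 'a.txt\nmissing.txt\n' > test.txt
touch a.txt

while IFS= read -r name; do
    [ -n "$name" ] || continue     # skip blank lines
    if [ -f "$name" ]; then        # only move files that actually exist
        mv -- "$name" processed/
    fi
done < test.txt
```

Files named in the list but absent from the directory (missing.txt here) are silently skipped, which avoids the "paths must precede expression" problem entirely since find is never involved.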

List file names from a directory and place output in a text file

I am trying to find all the text files in my directory and write the list to a text file. I am using a for loop to get the names, and the output is then placed in the text file. However I am getting a syntax error near unexpected token `$'do\r''. What would be the best way to do this? Also, how could I name the text file with a date/timestamp?
FILE_PATH="/my_folder/"
for f in $(find $FILE_PATH -type f -name '*.txt'); do
echo "file '$f'" >> "$FILE_PATH"mylist.txt;
done
Don't use a for loop. For that for loop to execute, the shell first has to run the find and collect every file that meets your criteria, then substitute that whole list into the loop. That means waiting for the find to finish before your for loop can even start processing names. Plus, if there are lots of files, you can overflow the command line and lose files you wanted to see. (Incidentally, the $'do\r' in your error message means the script itself has Windows CRLF line endings; convert it with dos2unix before anything else.)
Why not:
FILE_PATH="/my_folder"
find "$FILE_PATH" -type f -name '*.txt' > "$FILE_PATH/mylist.txt"
No need for a loop. No need to echo. Simple and clean.
Plus, you don't have issues with file names that may contain spaces.
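The question also asked how to put a date/timestamp in the output file's name; command substitution with date covers that. A sketch (the /tmp/lists folder and sample file are made up; the list is written outside FILE_PATH so find doesn't pick up its own output):

```shell
#!/usr/bin/env bash
# Sketch: write the find output to a timestamped list file.
# /tmp/lists and a.txt are invented for the demo.
FILE_PATH=/tmp/lists
mkdir -p "$FILE_PATH"
touch "$FILE_PATH/a.txt"

# e.g. /tmp/mylist_20240102_093015.txt; kept outside FILE_PATH so the
# list file itself never matches the '*.txt' search
out="/tmp/mylist_$(date +%Y%m%d_%H%M%S).txt"
find "$FILE_PATH" -type f -name '*.txt' > "$out"
cat "$out"
```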
You can avoid find using pathname expansion:
FILE_PATH="/my_folder"
for f in "$FILE_PATH"/* ; do
    test -f "$f" || continue   # skip anything that is not a regular file
    echo "$f"
done >| "$FILE_PATH/mylist.txt"   # >| forces overwrite
