Looping through find results in bash [duplicate]

This question already has answers here:
How to loop through file names returned by find?
(17 answers)
Closed 4 years ago.
I want to loop through jpg files, using find to locate them.
The result is the full path name, including ./ in front. I want to replace ./ with ./newsubdir/ so I can use that as the output file name in a process, creating a modified copy of the original in newsubdir with the same folder structure.
This is what I tried.
#!/bin/bash
find . -type f -name '*jpg'
for file do
echo ${file:1}
done
However, the substring extraction didn't seem to work at all. Is there a reason for that, or a different way to do this? I'm very new to Bash.
I was going for something like this as an end result. I'm trying to square a bunch of pictures but keep the folder structure.
#!/bin/bash
find . -type f -name '*jpg'
for file do
convert '$file[2048x2048]' -gravity center -extent 2048x2048 "./newsubdir${file:1}"
done

You were close! Sticking a little closer to the original code (and thus avoiding starting more shells than necessary):
#!/bin/bash
find . -type f -name '*.jpg' -exec bash -c '
  for file do
    convert "$file[2048x2048]" -gravity center -extent 2048x2048 "./newsubdir${file:1}"
  done
' _ {} +
...or, using your original shell and avoiding -exec entirely:
#!/bin/bash
while IFS= read -r -d '' file; do
  convert "$file[2048x2048]" -gravity center -extent 2048x2048 "./newsubdir${file:1}"
done < <(find . -type f -name '*.jpg' -print0)
These patterns and more are part of UsingFind.
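One caveat with both versions: convert will not create missing output directories, so if your source tree has subdirectories you will need to mirror them under ./newsubdir first. A minimal sketch of that, assuming GNU find and ImageMagick (the 2048x2048 geometry is just the size from the question):
#!/bin/bash
# Create the destination directory for each file before converting,
# so "./newsubdir/some/sub/dir/pic.jpg" has somewhere to land.
while IFS= read -r -d '' file; do
  dest="./newsubdir${file:1}"
  mkdir -p "$(dirname "$dest")"
  convert "$file[2048x2048]" -gravity center -extent 2048x2048 "$dest"
done < <(find . -type f -name '*.jpg' -print0)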

find . -type f -name '*jpg' -exec bash -c '
file=$1; convert "${file}[2048x2048]" -gravity \
center -extent 2048x2048 "./newsubdir/${file:1}"' _ {} \;
Frankly, I think you're much better off writing a script to do the conversion and just calling it with find . -type f -name '*.jpg' -exec script {} \;. Doing that will help to avoid inevitable quoting problems.
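For example, such a helper script might look like this (a sketch only; the name square.sh, the ./newsubdir destination and the 2048x2048 geometry are placeholders taken from the question):
#!/bin/bash
# square.sh -- square one image passed as $1, writing the copy under ./newsubdir
# while keeping the original folder structure.
file=$1
dest="./newsubdir${file:1}"
mkdir -p "$(dirname "$dest")"
convert "${file}[2048x2048]" -gravity center -extent 2048x2048 "$dest"
Make it executable with chmod +x square.sh and call it as find . -type f -name '*.jpg' -exec ./square.sh {} \;.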

Linux FIND searching files with names within single quotes [duplicate]

This question already has answers here:
How can I store the "find" command results as an array in Bash
(8 answers)
Closed 1 year ago.
I am trying to save the results of the find command into an array so that I can then run some awk commands on them.
My actual code is: files_arr=( "$(find "$1" -type f \( -name "\'*[[:space:]]*\'" -o -name "*" \) -perm -a=r -print )
This code should find all files, with and without spaces in their names (and that are readable), and put them into my array.
The PROBLEM is: when I have a directory named 'not easy', and inside this directory are the files 'file one' and 'file two', what I get is: not easy/file one
What I want to get is: 'not easy'/'file one'. I was thinking about using sed to add quotes, but it would add quotes even when the file name is a simple single word that doesn't need them.
Thank you for your advice.
Try this out:
mapfile -d '' files_arr < <(find . -type f -name "'*[[:space:]]*'" -perm -a=r -print0)
declare -p files_arr # To see what's in the array
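Once the array is filled, you can loop over it safely even when the names contain spaces; for example (the awk program below is only a placeholder for whatever processing you have in mind):
for f in "${files_arr[@]}"; do
  # run your awk command on each readable file found above
  awk 'END { print FILENAME ": " NR " lines" }' "$f"
done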

Bash script to find .jpgs created within a certain time frame and then rename them

I have the following find command working pretty well; it walks through a directory tree looking for any .jpg with a modification time within the last 600 minutes:
find /some/directory/ -depth -mmin -600 -name *.jpg
What I need to do now is rename each .jpg it finds to the actual date the .jpg was created, with some random numbers appended to the name before adding .jpg back onto it. I've used this in the past: (date -r "$f" +%Y-%m-%d_%H-%M-%S-%N).jpg, but I can't seem to figure out how to tie the find to the mv.
Am I missing a simple way to do this with -exec?
This should achieve what you want:
find /some/directory/ -depth -mmin -600 -name "*.jpg" \
-exec bash -c 'echo mv "$1" "$(dirname "$1")/$(date -r "$1" +%Y-%m-%d_%H-%M-%S-)$(date +%N).jpg"' bash {} \;
Remove echo to do the renaming once you are satisfied with the result.
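If you literally want random digits rather than nanoseconds, a variant of the same command (a sketch, not verified against your tree) could use bash's $RANDOM in place of date +%N:
find /some/directory/ -depth -mmin -600 -name "*.jpg" \
-exec bash -c 'echo mv "$1" "$(dirname "$1")/$(date -r "$1" +%Y-%m-%d_%H-%M-%S)-$RANDOM.jpg"' bash {} \;
As above, drop the echo once the printed mv commands look right.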

How to make ImageMagick identify accept input from a pipe?

Trying to pipe a list of images from find to identify, I get no output.
Using this command, I get no results.
find . -iname "*.jpg" -type f | identify -format '%w:%h:%i'
However, if I use this command, which doesn't use a pipe but instead uses find's -exec option it works normally.
find . -iname "*.jpg" -type f -exec identify -format '%w:%h:%i\n' '{}' \;
Does anyone know why this is happening and how to use pipe properly instead of find -exec?
Figured it out; I needed to use xargs:
find . -iname "*.jpg" -type f | xargs -I '{}' identify -format '%w:%h:%i\n' {}
The braces {} are a placeholder that xargs replaces with each file name it reads from the pipe.
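Note that plain xargs splits its input on whitespace and quotes, so file names with spaces can still break; a more robust variant (assuming GNU find and xargs) pairs find's -print0 with xargs -0:
find . -iname "*.jpg" -type f -print0 | xargs -0 -I {} identify -format '%w:%h:%i\n' {}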
This works for me:
identify -format '%w:%h:%i\n' $(find . -iname "*.jpg")
Note: I have added \n so that each image is listed on a new line.
Your first command, namely this:
find . -iname "*.jpg" -type f | identify -format '%w:%h:%i'
doesn't work because identify expects the filenames as parameters, not on its stdin.
If you want to make identify read filenames from a file, you would use:
identify -format '%w:%h:%i\n' @filenames.txt
If you want to make identify read filenames from stdin, (this is your use case) use:
find . -iname "*.jpg" -type f | identify -format '%w:%h:%i\n' @-
If you want to get lots of files done fast and in parallel, use GNU Parallel:
find . -iname "*.jpg" -print0 | parallel -0 magick identify -format '%w:%h:%i\n' {}

What is the best way to delete files by extension?

I am looking for the best way to delete files from a directory by extension.
I am planning to do it by date, but for now I am testing how it works.
This:
dir=/tmp/backup/
mask="jpeg jpg png gif bmp pdf"
for i in $mask; do
find $dir -name "*.$i" -type f -delete
done
Or this?
find $dir \( -name "*.jpeg" -o -name "*.jpg" -o -name "*.png" \
-o -name "*.gif" -o -name "*.bmp" -o -name "*.pdf" \) -type f -delete
I want to do it with minimal machine and operating system resources. Maybe you know other ways to do it, because I will be deleting one-year-old files, and that can cause lag. Thanks.
You can just use:
# to ensure it doesn't return *.jpg if there is no .jpg file
shopt -s nullglob
# list all files with a matching extension
echo *.{jpeg,jpg,png,gif,bmp,pdf}
When you are satisfied with the output, just replace echo by rm
However, if you want to make use of a variable, store all the extensions in it and then use it with find like this:
mask="jpeg jpg png gif bmp pdf"
find . -type f -regextype posix-extended -regex ".*\.(${mask// /|})"
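Once the listing looks right, the matches can be removed by appending find's -delete action, and since you mention one-year-old files an age test can be added as well (a sketch; adjust the path and the -mtime threshold to your needs):
find "$dir" -type f -regextype posix-extended -regex ".*\.(${mask// /|})" -mtime +365 -delete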

Want to find any reference in any file to a certain string in linux [duplicate]

This question already has answers here:
how to find files containing a string using egrep
(7 answers)
Closed 8 years ago.
I am trying to search all .php files and all .sh files for any reference that contains:
'into tbl_free_minutes_mar'
I have command line access to the server but the files may be scattered in different directories.
For all directories everywhere,
find / -type f \( -name '*.php' -o -name '*.sh' \) \
-exec fgrep 'into tbl_free_minutes_mar' {} \+
For fewer directories elsewhere, just give a list of paths instead of /. To just list the matching files, try fgrep -l. If your file names might not always match the wildcards in the -name conditions, maybe scan all files.
find / -type f \( -name \*.php -o -name \*.sh \) -exec grep 'into tbl_free_minutes_mar' {} /dev/null \;
Change find / ... to something less all-encompassing if you know the general area that you want to look in, e.g. find /home ...
Provided /base/path is the path where you want to start looking this will get you a list of files:
find /base/path -type f -iregex '.*\.\(php\|sh\)$' -exec grep -l 'into tbl_free_minutes_mar' '{}' \;
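If your grep is GNU grep, a recursive grep with --include filters is another way to get the same list of matching files without find (a sketch; /base/path is the same starting directory as above):
grep -rl --include='*.php' --include='*.sh' 'into tbl_free_minutes_mar' /base/path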
