I am trying to concatenate jpg and png images using ffmpeg. Before doing so, I am trying to append the file names and paths to a txt file, but I am getting an error. The entries have to follow the format file '/media/test.jpg' when appended to the txt file. What would be the best way to achieve this?
Print files:
find . \( -name '*.jpg' -o -name '*.png' \) -print
Error when trying this:
find . \( -name '*.jpg' -o -name '*.png' \) printf "file '%s'\n" > mylist.txt
find . \( -name '*.jpg' -o -name '*.png' \) -printf "file '%p'\n" >> mylist.txt
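A likely cause of the error is that printf in the first command is missing its leading dash: find's action is -printf, so the bare word printf is treated as a path and find rejects it. The second command, using -printf "file '%p'\n", already produces the required format with GNU find (where -printf is available). If absolute paths like /media/test.jpg are wanted, one possible sketch is to start find from "$PWD" so that %p expands to a full path:
# write one "file '<absolute path>'" line per jpg/png (use >> instead of > to append)
find "$PWD" \( -name '*.jpg' -o -name '*.png' \) -printf "file '%p'\n" > mylist.txt
The resulting list can then be fed to ffmpeg's concat demuxer, for example with something like ffmpeg -f concat -safe 0 -i mylist.txt plus the desired output options (-safe 0 is needed because the paths are absolute).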
Related
I'm looking through many subdirectories, finding all the files ending in .JPG, .jpg and .png, and copying them to a separate directory; however, right now it's only finding .JPG files.
Could someone explain what I'm doing wrong?
find /root/TEST/Images -name '*.png' -o -name '*.jpg' -o -name '*.JPG' -exec cp -t /root/TEST/CopiedImages {} +
You have to group the -o conditions, because -a, the implied AND between the last -name '*.JPG' and -exec, has higher precedence than -o:
find /root/TEST/Images \( -name '*.png' -o -name '*.jpg' -o -name '*.JPG' \) -exec cp -t /root/TEST/CopiedImages {} +
Grouping is done with parentheses, but they have to be escaped (or quoted) because of their special meaning in the shell.
Unrelated to this, you can shorten the overall expression by combining the filters for jpg and JPG with the case-insensitive -iname (as noted in the comments):
find /root/TEST/Images \( -name '*.png' -o -iname '*.jpg' \) -exec cp -t /root/TEST/CopiedImages {} +
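To see why the original command only copied the .JPG files: the implied -a binds more tightly than -o, so find parses the ungrouped expression roughly as this explicit equivalent (shown here only to illustrate the precedence):
find /root/TEST/Images -name '*.png' -o -name '*.jpg' -o \( -name '*.JPG' -a -exec cp -t /root/TEST/CopiedImages {} + \)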
I am looking for the best way to delete files from a directory by extension.
I am planning to do it by date, but for now I am just testing how it works.
This:
dir=/tmp/backup/
mask="jpeg jpg png gif bmp pdf"
for i in $mask; do
find $dir -name "*.$i" -type f -delete
done
Or this?
find $dir \( -name "*.jpeg" -o -name "*.jpg" -o -name "*.png" \
-o -name "*.gif" -o -name "*.bmp" -o -name "*.pdf" \) -type f -delete
I want to do this with minimal machine and operating system resources. Maybe you know other ways to do it, because I will be deleting files that are a year old and it could cause lag. Thanks.
You can just use:
# to ensure it doesn't return *.jpg if there is no .jpg file
shopt -s nullglob
# list all files matching the extensions
echo *.{jpeg,jpg,png,gif,bmp,pdf}
When you are satisfied with the output, just replace echo with rm.
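Note that a plain glob like the one above only matches files in the current directory. If the deletion should also descend into subdirectories, as find does, one sketch (assuming bash 4 or later for globstar) is:
# ** matches files in subdirectories as well (requires bash 4+ with globstar)
shopt -s nullglob globstar
echo **/*.{jpeg,jpg,png,gif,bmp,pdf}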
However, if you want to make use of a variable, store all the extensions in it and then use it with find like this:
mask="jpeg jpg png gif bmp pdf"
find . -type f -regextype posix-extended -regex ".*\.("${mask// /|}")"
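Since the question mentions deleting files that are about a year old, the same single find pass could also apply an age filter. This is only a sketch, assuming GNU find and with $dir set as in the question; -mtime +365 selects files last modified more than 365 days ago, and -delete removes them:
mask="jpeg jpg png gif bmp pdf"
find "$dir" -type f -regextype posix-extended -regex ".*\.(${mask// /|})" -mtime +365 -delete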
How can I remove all files that do not have the .txt or .exe extension, recursively, in the current working directory? I need a one-liner.
I tried:
find . ! -name "*.txt" "*.exe" -exec rm -r {} \
find -type f -regextype posix-extended -iregex '.*\.(txt|exe)$'
Try this.
find . -type f ! -name "*.exe" ! -name "*.txt" -exec rm {} \;
The above command will remove all files other than those with the .exe and .txt extensions, recursively, in the current directory and its subdirectories.
If you have GNU find with the -delete action:
find . -type f ! \( -name '*.txt' -o -name '*.exe' \) -delete
And if not:
find . -type f ! \( -name '*.txt' -o -name '*.exe' \) -exec rm -f {} +
Using -exec ... {} + executes rm as few times as possible, with the file names batched together as arguments.
Try the following:
rm -f $(find . -type f ! \( -name "*.txt" -o -name "*.exe" \))
This will first recursively find all files that do not end with the .txt or .exe extension, and then delete all of them. (Note that the unquoted command substitution breaks on file names containing whitespace, so the -exec variants above are safer in that case.)
I have a source directory with several files. Some of them are symlinks to other files.
I created a cscope.files file, but when I execute cscope, it complains about the files that are symlinks:
cscope: cannot find file /home/bla/source/file.cc
I don't think this is very good, but maybe the correct way to go is to change the "find" command to just write the target of the symlink instead?
Currently I'm using:
# Write only the files which are NOT symlinks
find `pwd` \( \( -iname "*.c" -o -iname "*.cc" -o -iname "*.h" \) -and \( -not -type l \) \) -print > cscope.files
# Add the symlink target for all files that match the right extensions and are symlinks
find `pwd` \( \( -iname "*.c" -o -iname "*.cc" -o -iname "*.h" \) -and -type l \) -printf "%l\n" >> cscope.files
But this seems like a terrible solution. I'm still looking for a better one.
I think you can use the following command to find the real paths of everything in the folder you searched:
find -L [your searched folder] -name [your searched pattern] -exec realpath {} \; >> cscope.files
For example, if I want to add my development folder and the Linux kernel headers to cscope.files, I will use these commands:
find -L `pwd` -iname "*.c" -o -iname "*.h" > cscope.files
find -L /usr/src/linux-headers-3.19.0-15-generic/ -iname '*.h' -exec realpath {} \; >> cscope.files
I hope the answer can help you.
For example, if you want to give / as your path for cscope, and want cscope to index files with the extensions .c/.h/.x/.s/.S, you can run find like this:
find / -type f -name "*.[chxsS]" -print -exec readlink -f {} \; > cscope.files
This will include regular files, including targets of symbolic links.
I just do the following to avoid symbolic links and also get absolute paths in cscope.files. With absolute paths you can search from any directory in your sandbox when cscope is integrated with the vim editor.
find /"path-to-your-sandbox" -path '*/.git' -prune -o -name "*.[ch]" -exec readlink -f {} \; > cscope.files
Note: if you omit -print from the find, it does not put the symbolic link path in your cscope.files, only the resolved path.
Better in a bash script:
#!/bin/bash
#
# find_cscope_files.sh
extension_list=(c cpp cxx cc h hpp hxx hh)
for x in "${extension_list[@]}"; do
find . -name "*.$x" -print -exec readlink -f {} \;
done
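One possible way to use this script (assuming it is saved as find_cscope_files.sh, as in the header comment, and made executable) is to redirect its output into cscope.files and then build the cscope database:
./find_cscope_files.sh > cscope.files
cscope -b -q -i cscope.files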
For reference, here is what I'm currently using:
find "$(pwd)" \( -name "*.[chCS]" -o -name "*.[ch][ci]" -o -name "*.[ch]pp" -o -name "*.[ch]++" -o -name "*.[ch]xx" \) -not \( -ipath "*unittest*" -or -ipath "*regress*" \) \( \( -type l -xtype f -exec readlink -f {} \; \) -o \( -type f -print \) \) >cscope.files
cscope -q -R -b -i cscope.files
I would like to find all php and js files inside a directory and exclude one of its subdirectories.
I may have to exclude more than one subdirectory in the future.
I tried:
find /home/jul/here -type f -iname "*.php" -o -iname "*.js" ! -path "/home/jul/here/exclude/*"
The problem is that it is excluding only the js files from /home/jul/here/exclude.
Is there a way to put some kind of parentheses?
find (something OR something else) AND exclude THIS
Yes: group the two -iname tests with escaped parentheses so that the ! -path exclusion applies to both of them:
find /home/jul/here -type f \( -iname "*.php" -o -iname "*.js" \) ! -path "/home/jul/here/exclude/*"
You need to add the exclusion pattern after each group of files. So something like this should work:
find /home/jul/here -type f -iname "*.php" ! -path "/home/jul/here/exclude/*" -o -iname "*.js" ! -path "/home/jul/here/exclude/*"
Or maybe better with a variable:
EXCLUDE=/home/jul/here/exclude
find /home/jul/here -type f -iname "*.php" ! -path "$EXCLUDE/*" -o -iname "*.js" ! -path "$EXCLUDE/*"
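Since the question mentions possibly excluding more subdirectories later, additional ! -path tests can simply be chained after the grouped name tests (exclude2 below is just a hypothetical second directory name):
find /home/jul/here -type f \( -iname "*.php" -o -iname "*.js" \) ! -path "/home/jul/here/exclude/*" ! -path "/home/jul/here/exclude2/*"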