Usage of "find" linux command - linux

My question is about Linux commands. I need to find all ".sh" files and automatically delete every file that ends with the ".sh" extension. Could anyone please help me out by giving a suitable Linux command for it?

find . -type f -name '*.sh' -exec rm {} \;
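A couple of hedged variations on the same idea: it is usually worth previewing the matches before deleting anything, and if your find supports -delete (GNU and BSD find both do) you can skip spawning rm entirely. Both sketches assume the current directory is the intended starting point.
find . -type f -name '*.sh' -print    # preview what would be removed, no changes made
find . -type f -name '*.sh' -delete   # delete the matches directly (GNU/BSD find)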

Related

Find by -name and -mtime, returns nothing

I am trying to use the find command to delete some old files from a backup folder, but the find command returns nothing, so nothing is being removed! This is the code (the find part); my system is Ubuntu 18.04 LTS:
find -name "*.sql" -type f -mtime +30
[screenshot: output of the find command, which is empty]
and the output of the ls -l command is:
[screenshot: output of the ls -l command]
I googled and searched the web but found nothing to solve the problem. Any help appreciated.
You are missing the starting point for your find command (in this case .), because you are already executing the command in the target directory:
find . -name "*.sql" -type f -mtime +30
The rest can stay the same.
First make sure it gives you the correct result, and afterwards you can tack on -exec to execute a command for each file found.
find . -name "*.sql" -type f -mtime +30 -exec rm '{}' ';'
You can usually find such answers on the UNIX stackexchange: How to execute ln on find results
Please see the comment from David: in this particular case it might be a misunderstanding of the -mtime parameter.
I have tested exactly the commands that were listed here; below you can see my preparation and several runs. The files show up as expected every time the -mtime value is decreased:
[screenshot: test session in a VirtualBox Ubuntu 18.04 LTS VM]
which is no surprise, given the man page of the find command:
[screenshot: find man page, -mtime parameter]
Please check for any typos... this should work.
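For reference, a short illustration of the -mtime semantics that often cause this kind of confusion: find measures time in complete 24-hour periods and discards fractional parts, so +30 only matches files at least 31 days old.
find . -name "*.sql" -type f -mtime +30   # modified more than 30 complete 24-hour periods ago
find . -name "*.sql" -type f -mtime 30    # modified exactly 30 complete 24-hour periods ago
find . -name "*.sql" -type f -mtime -30   # modified less than 30 complete 24-hour periods ago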

Stuck trying to copy files to a new directory under specific guidelines

I'm trying to copy all files from a folder that start with a capital letter into another folder.
So far I've used the find command to actually find the files
find /examplefolder -type f -name "[[:upper:]]*"
and I'm able to find them, no problem.
I tried to just replace the find command with cp, and that does not work; then I tried to pipe into the cp command and I failed yet again.
Use the -exec option for find
find /examplefolder -type f -name "[[:upper:]]*" -exec cp {} /my/new/shiny/folder/ \;
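If the system has GNU coreutils, a hedged variant batches the copies into fewer cp invocations by combining cp's -t option with find's + terminator (the destination path below is just the example folder from above):
find /examplefolder -type f -name "[[:upper:]]*" -exec cp -t /my/new/shiny/folder/ {} +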
I was able to do it by piping into cp with xargs and the -I argument
find /examplefolder -type f -name "[[:upper:]]*" | xargs -I % cp % /copied_directory
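Note that a plain pipe like this can break on filenames containing spaces or newlines; a null-delimited sketch of the same approach (assuming GNU find and xargs) would be:
find /examplefolder -type f -name "[[:upper:]]*" -print0 | xargs -0 -I % cp % /copied_directory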

bash find -exec sometimes works and sometimes doesn't

I'm probably missing something, but this one-liner in a bash script, used to cycle through some scripts that dump data from different sources:
find . -name 'dump-*.sh' -exec {} "$DUMP_LOG" &>>"$DUMP_LOG" \;
will work when I execute the bash script containing this one-liner directly, but it doesn't work when I invoke it as the cmd_preexec in rsnapshot. It doesn't produce any errors; it just doesn't do anything.
I tried adding '(/bin/)bash -c', like this:
find . -name 'dump-*.sh' -exec bash -c '{} "$DUMP_LOG" &>>"$DUMP_LOG"' \;
but then I get an error about '(/bin/)bash' not existing, even if I run the script directly.
OK, silly me. Of course the first parameter of the find command needs to be the directory to search.
find /usr/local/sbin -name 'dump-*.sh' -exec {} "$DUMP_LOG" &>>"$DUMP_LOG" \;
has solved the problem.
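A plausible explanation (assuming rsnapshot invokes cmd_preexec from a different working directory, such as /): with a relative starting point, find searches wherever the script happens to be started from, so the dump scripts are simply never found. A minimal sketch of the difference:
cd / && find . -name 'dump-*.sh'          # searches the wrong tree if the scripts live in /usr/local/sbin
find /usr/local/sbin -name 'dump-*.sh'    # absolute starting point, independent of the caller's cwd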

How to find files recursively by file type and copy them to a directory?

I would like to find all the pdf files in a folder. It contains pdf files as well as more directories that contain more of them. The folder is located on a remote server I have ssh access to. I am using the Mac terminal, but I believe the server I am connecting to is CentOS.
I need to find all the pdfs and copy them all to one directory on the remote server. I've tried about 10 variations with no luck. Both my system and the remote one do not seem to recognise -exec as a command, though exec is fine, so that's a problem.
I'm not sure what the problem is here, but the command does not fail; it just sits there and stalls forever, so I do not have any useful errors to post.
cp $(find -name "*.pdf" -type f; exec ./pdfsfolder {} \; | sed 1q)
find: ./tcs/u25: Permission denied
find: ./tcs/u68: Permission denied
-bash: /var/www/html/tcs_dev/sites/default/files/pdfsfolder: is a directory
-bash: exec: /var/www/html/tcs_dev/sites/default/files/pdfsfolder: cannot execute: Success
cp: target `./runaways_parents_guide_2013_final.pdf' is not a directory
This is the last one I tried. I think I can ignore the permission denied errors for now, but I'm not sure about the rest.
Try this:
find . -name "*.pdf" -type f -exec cp {} ./pdfsfolder \;
Paul Dardeau's answer is perfect; the only thing is, what if not all the files inside those folders are PDF files and you want to grab them all no matter the extension? Well, just change it to
find . -name "*.*" -type f -exec cp {} ./pdfsfolder \;
Just to sum up!
Something like this should work.
ssh user@ip.addr 'find -type f -name "*.pdf" -exec cp {} ./pdfsfolder \;'

Can the find command's "exec" feature start a program in the background?

I would like to do something like:
find . -iname "*Advanced*Linux*Program*" -exec kpdf {} & \;
Possible? Some other comparable method available?
Firstly, it won't work as you've typed, because the shell will interpret it as
find . -iname "*Advanced*Linux*Program*" -exec kpdf {} &
\;
which is an invalid find run in the background, followed by a command that doesn't exist.
Even escaping it doesn't work, since find -exec actually execs the argument list given, instead of giving it to a shell (which is what actually handles & for backgrounding).
Once you know that that's the problem, all you have to do is start a shell to give these commands to:
find . -iname "*Advanced*Linux*Program*" -exec sh -c '"$0" "$#" &' kpdf {} \;
On the other hand, given what you're trying to do, I would suggest one of
find ... -exec kfmclient exec {} \; # KDE
find ... -exec gnome-open {} \; # Gnome
find ... -exec xdg-open {} \; # any modern desktop
which will open the file in the default program as associated by your desktop environment.
If your goal is just not having to close one pdf in order to see the next one, as opposed to displaying each pdf in its own separate instance, you might try
find . -iname "*Advanced*Linux*Program*" -exec kpdf {} \+ &
With the plussed variant, -exec builds the command line like xargs would, so all the files found are handed to the same instance of kpdf. The & at the end then affects the whole find. With very large numbers of files found it might still open them in batches because command lines grow too long, but with respect to resource consumption on your system this may even be a good thing. ;)
kpdf has to be able to take a list of files on the command line for this to work; as I don't use it myself, I can't say whether it does.
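Purely as an illustration of the difference between the two -exec terminators (echo stands in for kpdf here, and the pattern is just the example from above):
find . -iname "*Advanced*Linux*Program*" -exec echo {} \;   # one invocation per file found
find . -iname "*Advanced*Linux*Program*" -exec echo {} +    # all files passed to a single invocation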
