Linux find command explanation

Can someone explain to me what this command does, and if I want to do the same thing for git, how should I modify it?
find . -name CVS -print -exec rm -fr {} \;

This command looks in your current working directory for any files or directories named "CVS" and prints each matching path. It then executes a forced recursive removal (rm -fr) on each result returned by find.
Since the name carries no file extension restriction, this removes any directory named CVS anywhere under your current working directory, including all subdirectories and files housed within it. (find may also warn that a matched directory no longer exists when it tries to descend into it after the removal; that warning is harmless.)
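As for the git part of the question: git keeps all of its metadata in a single top-level .git directory rather than a CVS directory in every folder, so the analogous command targets directories named .git. The sketch below runs in a throwaway tree with made-up names so it is safe to execute:

```shell
# Sketch: the git analogue of the CVS cleanup, run in a throwaway
# tree (the proj/ layout is invented for the demo).
tmp=$(mktemp -d)
mkdir -p "$tmp/proj/.git/objects" "$tmp/proj/src"
touch "$tmp/proj/src/main.c"
cd "$tmp"
# -type d restricts matches to directories; -prune keeps find from
# trying to descend into a directory that is about to be removed.
find . -type d -name .git -print -prune -exec rm -rf {} +
```

Only strip .git directories like this when you really mean to discard repository history, e.g. when de-vendoring a checkout.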

Related

Copying all the .tar and .tar.bz2 files from all the sub-directories into another directory

Imagine that I have many nested sub-directories inside a directory. I want to copy all the files with the .tar and .tar.bz2 extensions from all of the sub-directories into another directory.
I used
$find /home/apple/mango -name *.tar -exec cp {} ./kk \;
but it copies only one file from a sub-directory and stops; it doesn't find files in the other sub-directories or descend into them.
I want it to work recursively.
You may use:
find /home/apple/mango -name '*.tar*' -execdir cp {} /full/path/to/kk \;
Note how the name pattern is quoted so the shell cannot expand it before find executes (the pattern '*.tar*' also matches the .tar.bz2 files).
Without quoting, *.tar is expanded by the shell to whatever it matches in the current directory, say file.tar, so find then searches the sub-directories for that literal name, finds nothing, and appears to stop after a single match. Quoting the glob pattern ensures find receives the literal pattern and matches it against files in every sub-directory.
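The effect of quoting can be seen directly in a throwaway tree (the file names here are invented for the demo):

```shell
# Demonstrating the quoting pitfall in a sandbox directory.
tmp=$(mktemp -d)
mkdir -p "$tmp/sub"
touch "$tmp/a.tar" "$tmp/sub/b.tar"
cd "$tmp"
# Unquoted: the shell expands *.tar to a.tar before find runs,
# so find searches for the literal name "a.tar" only.
unquoted=$(find . -name *.tar | wc -l)
# Quoted: find receives the pattern itself and matches both files.
quoted=$(find . -name '*.tar' | wc -l)
echo "$unquoted vs $quoted"   # prints: 1 vs 2
```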

Linux backup all files with known extensions with timestamps

I want to backup all files with a given extension in a directory but I want them to be with timestamps.
Given a directory:
Sample/, with multiple subdirectories and a subfolder named BACKUPS.
cd Sample
find . -name '*.xml' -exec cp {} BACKUPS \;
Say I have multiple xml files in this Sample folder and I want them copied to the BACKUPS folder, but with a timestamp appended, for example:
text.xml.20171107
conf.xml.20171107
I am able to backup the files but I could not figure out how to append a timestamp to the files using the find command.
You could try this:
find . -name '*.xml' -execdir cp {} "$PWD/BACKUPS/{}.$(date +%Y%m%d)" \;
As before, we use find . -name '*.xml' to locate all the files. However, in order to get rid of the subdirectory components in the names, we use -execdir instead of -exec. This causes the specified command to be run from inside the subdirectory the current file is in, and replaces {} with ./ followed by the file's base name.
This means we have to modify cp's second argument (the target filename). We now pass "$PWD/BACKUPS" to create an absolute path ($PWD is the current working directory). This way cp always targets the right directory, even when invoked from a subdirectory of Sample.
Finally, the filename we use is constructed from {}.$(date +%Y%m%d). $( ) runs the specified command and substitutes its output (the current date, in this case). This is done by the shell before find is invoked, so find just sees .../{}.20171107. The {} part is replaced by find itself just before it runs each cp.
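Putting the pieces together in a sandbox (the xml file names are invented; GNU find is assumed, since both -execdir and {} embedded inside a larger argument are GNU extensions):

```shell
# End-to-end sketch of the timestamped backup in a throwaway tree.
tmp=$(mktemp -d)
mkdir -p "$tmp/Sample/conf" "$tmp/Sample/BACKUPS"
touch "$tmp/Sample/text.xml" "$tmp/Sample/conf/conf.xml"
cd "$tmp/Sample"
# $PWD and $(date ...) are expanded by the shell before find runs;
# find substitutes {} per match just before each cp.
find . -name '*.xml' -execdir cp {} "$PWD/BACKUPS/{}.$(date +%Y%m%d)" \;
ls BACKUPS   # e.g. conf.xml.20171107  text.xml.20171107
```

The copies land in BACKUPS with today's date appended; they no longer end in .xml, so find will not match and re-copy them on the same traversal.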

Linux command to create the empty file called 'test1'

Enter a Linux command to create the empty file called 'test1' in the directory 'systems' (you are still in your home directory).
Assuming 'systems' is a subdirectory of the current directory:
touch systems/test1
Assuming that you only know that 'systems' is some subdirectory within the current directory tree, then:
find . -name systems -type d -exec touch "{}/test1" \;
will create such a file. Alternatively, so will:
find . -name systems -type d -execdir touch systems/test1 \;
However, both will do so in every subdirectory named 'systems' in the current directory tree. We could limit the action to only the first match, the last, or some other criterion, but the list of possible permutations is too long.
You really have not provided enough information for us to provide a complete answer.
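For the "only the first match" case, GNU find's -quit primary is one option: it makes find exit as soon as it is reached, so placing it after the -exec limits the action to the first 'systems' directory found. A sandbox sketch with an invented layout:

```shell
# Sketch: touch test1 only in the first directory named "systems"
# (GNU find assumed; the a/ and b/ layout is made up for the demo).
tmp=$(mktemp -d)
mkdir -p "$tmp/a/systems" "$tmp/b/systems"
cd "$tmp"
# -quit runs after the -exec succeeds, so find stops after the
# first match instead of visiting every "systems" directory.
find . -name systems -type d -exec touch "{}/test1" \; -quit
```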

Search recursively for files in a parent directory in Linux

I am trying to list all the files in a parent directory and its subdirectories. However, I am running this command from another location, so first I need to change to the directory from which I want to run it.
Please note that I am using the find command instead of ls because I also want to list the absolute path in front of each file, which is not possible with the ls command.
Here is what I am doing:
cd ../../../;cd level1_dir1;find $(pwd) . -name *.* -printf "%TY-%Tm-%Td\t%p\n"
This command does not show any output.
Here is the directory structure:
level1_dir1
this has multiple subdirectories:
level2_dir1
level2_dir2
....
level2_dir10
each of these subdirectories again have subdirectories and files inside them.
however, now if I do:
cd ../../../;cd level1_dir1/level2_dir1;find $(pwd) . -name *.* -printf "%TY-%Tm-%Td\t%p\n"
it will do the recursion properly for all the subdirectories in level2_dir1 and show output like:
date level1_dir1/level2_dir1/path/to/file/filename
so, I wanted to print this out for all the level2 directories by using a wildcard:
cd ../../../;cd level1_dir1/*;find $(pwd) . -name *.* -printf "%TY-%Tm-%Td\t%p\n"
but it prints out the results only for the first directory in level2 (that is level2_dir1)
how can I make it list down the files for all the subdirectories?
thanks.
How about this?
find ../../../level1_dir1 -printf "%TY-%Tm-%Td\t%p\n"
If you want all the files, you don't even need -name in the find command. If you don't want to see the directories and only the files, just add "-type f" before -printf.
Hope this helps...
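Another way to get the absolute paths the question asks for is to hand find an absolute starting point, since %p echoes whatever prefix the starting point had. A sandbox sketch (directory names mirror the question but are otherwise invented):

```shell
# Sketch: an absolute start directory makes %p print absolute paths;
# -type f keeps only regular files (GNU find's -printf assumed).
tmp=$(mktemp -d)
mkdir -p "$tmp/level1_dir1/level2_dir1" "$tmp/level1_dir1/level2_dir2"
touch "$tmp/level1_dir1/level2_dir1/f1" "$tmp/level1_dir1/level2_dir2/f2"
cd "$tmp/level1_dir1"
find "$(pwd)" -type f -printf "%TY-%Tm-%Td\t%p\n"
```

Every line of output carries the full path from / down to the file, covering all level2 directories in one invocation.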

Bash script to recursively step through folders and delete files

Can anyone give me a bash script or one-line command I can run on Linux to recursively go through each folder from the current folder and delete all files or directories starting with '._'?
Change directory to the root directory you want (or change . to the directory) and execute:
find . -name "._*" -print0 | xargs -0 rm -rf
xargs passes many file names to a single rm invocation, so it will be faster than the find -exec ... \; syntax, which spawns one rm per file. Also, you can run the find part once without the | xargs ... to preview the files it will delete and make sure it is safe.
find . -name '._*' -exec rm -Rf {} \;
I had a similar problem a while ago (I assume you are trying to clean up a drive that was connected to a Mac, which leaves a lot of these files behind), so I wrote a simple python script which deletes these and other useless files; maybe it will be useful to you:
http://github.com/houbysoft/short/blob/master/tidy
find /path -name "._*" -exec rm -fr "{}" +
The + terminator batches many file names into a single rm invocation, much like xargs, so it avoids spawning one process per file.
Instead of deleting the AppleDouble files, you could merge them with the corresponding files. You can use dot_clean.
dot_clean -- Merge ._* files with corresponding native files.
For each dir, dot_clean recursively merges all ._* files with their corresponding native files according to the rules specified with the given arguments. By default, if there is an attribute on the native file that is also present in the ._ file, the most recent attribute will be used.
If no operands are given, a usage message is output. If more than one directory is given, directories are merged in the order in which they are specified.
Because dot_clean works recursively by default, use:
dot_clean <directory>
If you want to turn off the recursive merge, use -f for a flat merge.
dot_clean -f <directory>
find . -name '._*' -delete
A bit shorter, and it performs better with extremely long lists of files. (The pattern must be '._*', not '.*', which would also match every other dotfile.)
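The difference between the two patterns is easy to check in a sandbox (the file names below are invented for the demo):

```shell
# Sketch: '._*' removes only AppleDouble files, leaving both the
# native files and ordinary dotfiles such as .config untouched.
tmp=$(mktemp -d)
mkdir -p "$tmp/music"
touch "$tmp/music/._song.mp3" "$tmp/music/song.mp3" "$tmp/.config"
find "$tmp" -name '._*' -delete
```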
