Copy files matching a name in different folders - Linux

I am using
find ../../ -type f -name '<filename>*.PDF' -print0 | xargs -0 cp --target-directory=<directory name with path>
but it copies only one file; it does not copy all of the files that share that name. I need to search for and copy a number of files that have the same name but were created on different dates and in different folders. How do I solve this? I have tried a lot of variations already, but I am still facing this problem.

This will give you the duplicate files. Once you have the name, you can find them and delete them using your script:
for i in `find . | grep pom.xml`; do
    basename "$i"
done | sort | uniq -c | sort -n | cut -b9-
PS: everyone's in a hurry. Adding urgency to your posts is usually frowned upon on Stack Overflow, and you might prompt the opposite reaction.
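A likely reason only one file survives the copy is that cp overwrites same-named files in the target directory, so each copy clobbers the previous one. GNU cp can keep every version with numbered backups; a minimal sketch, with made-up paths and a made-up report.PDF name:

```shell
# Two same-named files in different folders; copying both into one
# target directory would normally leave only the last one copied.
mkdir -p src/a src/b dest
echo one > src/a/report.PDF
echo two > src/b/report.PDF

# --backup=numbered (GNU cp) renames the would-be-overwritten copy
# to report.PDF.~1~ instead of clobbering it.
find src -type f -name 'report*.PDF' -print0 |
    xargs -0 cp --backup=numbered --target-directory=dest
```

After this, dest holds both report.PDF and report.PDF.~1~ rather than a single survivor.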


Customized deleting files from a folder

I have a folder where different files can be located. I would like to check whether it contains any files other than .gitkeep and delete them, while keeping .gitkeep itself. How can I do this? (I'm a newbie when it comes to bash.)
As always, there are multiple ways to do this; I am just sharing what little I know of Linux:
1) find <path-to-the-folder> -maxdepth 1 -type f ! -iname '.gitkeep' -delete
-maxdepth 1 restricts the search to the given directory itself. If you remove -maxdepth, find will recursively match all files other than '.gitkeep' in every directory under your path; you can increase the value to control how deep find descends.
-type f specifies that we are only looking for regular files. If you want to match directories as well (or links and other types), omit this option.
-iname '.gitkeep' specifies a case-insensitive match for '.gitkeep'. Note that -name/-iname take shell glob patterns, not regular expressions, so the '.' needs no escaping here.
Use -name instead of -iname for a case-sensitive match.
The '!' before -iname inverts the match, i.e. it finds all files that are not named '.gitkeep'; remove the '!' and you get only the files that do match '.gitkeep'.
Finally, -delete deletes the files that match this specification.
If you want to see which files would be deleted before running with -delete, drop that flag and find will list them:
find <path-to-the-folder> -maxdepth 1 -type f ! -iname '.gitkeep'
(you can also append -print at the end, which is simply the default behaviour here)
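The dry-run-then-delete workflow above can be tried safely in a scratch directory (the directory and file names here are invented for the example):

```shell
# Scratch directory with a .gitkeep plus two other files.
mkdir -p scratch
touch scratch/.gitkeep scratch/a.txt scratch/b.log

# Dry run: list what would be deleted (prints a.txt and b.log).
find scratch -maxdepth 1 -type f ! -iname '.gitkeep'

# The same expression with -delete actually removes the matches,
# leaving only .gitkeep behind.
find scratch -maxdepth 1 -type f ! -iname '.gitkeep' -delete
```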
2) for i in `ls -a | grep -v '\.gitkeep'`; do rm -f "$i"; done
Not really the recommended way, since parsing the output of ls breaks on filenames containing spaces, and rm -rf is always a bad idea (IMO); rm -f at least ensures it only acts on files, not directories.
To be on the safe side, echo the file list first to check that you are ready to delete everything shown:
for i in `ls -a | grep -v '\.gitkeep'`; do echo "$i"; done
This iterates through all the files whose names don't match '.gitkeep' and deletes them one by one... not the best way, I suppose, to delete files.
3) rm -rf $(ls -a | grep -v '\.gitkeep')
Again, be careful with rm -rf; as above, you can run the same command with echo instead of rm -rf first to see which files would be deleted.
I am sure there are more ways, but just a glimpse of the array of possibilities :)
Good Luck,
Ash
================================================================
EDIT :
=> man pages are your friend when you are trying to learn something new. If you don't understand how a command works or what options it takes, always look it up in man for details.
e.g.: man find
=> I understand that you are trying to learn something outside your comfort zone, which is always commendable, but Stack Overflow doesn't like people asking questions without researching first.
If you did research, you are expected to mention it in your question, letting people know what you have done to find answers on your own.
A simple Google search or a deep dive into Stack Overflow questions would have given you a similar or even better answer. So be careful :)
Forewarned is forearmed :)
You can use find:
find /path/to/folder -maxdepth 1 ! -name .gitkeep -delete

find text in files in subfolders

So this question might have been asked before, but after some hours of searching (or searching the wrong way) I decided to ask it.
If it's already been answered before, please link me the question and close this one.
here's my issue.
I have a folder on my filesystem, e.g. "files". This folder has a lot of subfolders, which have subfolders of their own. Some levels deep, they all contain a file with the same name. That file contains a lot of text, but it is not all the same. I need a list of the files that contain a certain string.
I KNOW I can do this with
find ./ -type f -exec grep -H 'text-to-find-here' {} \;
but the main problem is that it will visit every single file on that filesystem. As the filesystem contains MILLIONS of files, this would take a LONG time, especially since I know the exact file name this piece of text should be in.
visually it looks like this:
foobar/foo/bar/file.txt
foobar/foobar/bar/file.txt
foobar/barfoo/bar/file.txt
foobar/raboof/bar/file.txt
foobar/oof/bar/file.txt
I need to find a specific string in file.txt (if that string exists...).
(And yes: the file in /bar/ is ALWAYS called file.txt...)
Can anyone help me with this? I'm breaking my head over an "easy" solution :o
Thnx,
Daniel
Use the -name option to filter by name:
find . -type f -name file.txt -exec grep -H 'text-to-find-here' {} +
And if it's always in a directory named bar, you can use -path with a wildcard:
find . -type f -path '*/bar/file.txt' -exec grep -H 'text-to-find-here' {} +
With a single GNU grep command:
grep -rl 'pattern' --include='*file.txt'
--include=glob
    Search only files whose base name matches glob (using wildcard matching).
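Both answers can be tried on a slice of the directory layout from the question (file contents invented for the example); each reports only the file.txt that actually contains the string:

```shell
# Recreate part of the layout from the question.
mkdir -p foobar/foo/bar foobar/oof/bar
echo 'text-to-find-here' > foobar/foo/bar/file.txt
echo 'something else'    > foobar/oof/bar/file.txt

# find + grep: only files at paths ending in bar/file.txt are scanned.
find foobar -type f -path '*/bar/file.txt' -exec grep -l 'text-to-find-here' {} +

# Single GNU grep: -r recurses, --include restricts by base name.
grep -rl 'text-to-find-here' --include='file.txt' foobar
```

Both commands print foobar/foo/bar/file.txt and skip everything else.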

BASH: Checking if files are duplicates within a directory?

I am writing a house-keeping script and have files within a directory that I want to clean up.
I want to move files from a source directory to another; there are many sub-directories, so some of the files could be identical. What I want to do is either use the cmp command or md5sum each file: if a file has no duplicates, move it; if several are the same, move only one of them.
So far I have the move part working correctly, as follows:
find /path/to/source -name "IMAGE_*.JPG" -exec mv '{}' /path/to/destination \;
I am assuming that I will have to loop through my directory, so I am thinking something like:
for f in /path/to/source/IMAGE_*.JPG
do
    md5sum "$f"   # (or cmp)
    # ...stuck here (I am worried about how this method can compare all
    # the files against each other, and how I would filter them out)...
    # then just do the mv to finish.
done
Thanks in advance.
Thanks in advance.
find . -type f -exec md5sum {} + | sort | uniq -w32 -D
That'll spit out every file whose MD5 hash occurs more than once: sorting groups identical hashes together, -w32 makes uniq compare only the first 32 characters (the hash itself, not the filename), and -D prints all members of each duplicate group, so you can see directly which files produced the duplicate hashes.
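Note that a plain uniq -d on this pipeline compares whole lines (hash plus filename), so files with identical content but different names would never count as duplicates; restricting the comparison to the 32-character hash field is what makes it work. A throwaway demo with invented file names:

```shell
# Two files with identical content, one different.
mkdir -p pics
printf 'same' > pics/a.jpg
printf 'same' > pics/b.jpg
printf 'diff' > pics/c.jpg

# -w32 compares only the hash field; -D prints every member of each
# duplicate group, filenames included.
find pics -type f -exec md5sum {} + | sort | uniq -w32 -D
```

This prints the lines for a.jpg and b.jpg and omits c.jpg.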
There's a tool designed for this purpose: fdupes.
fdupes -r dir/
dupmerge is another such tool...

bash on Linux, delete files with certain file extension

I want to delete all files with a specific extension, ".fal", in a folder and its subfolders, except the one named "*Original.fal". The tricky part is that I do want to delete the other files that build on that same name:
*Original.fal.ds
*Original.fal.ds.erg
*Original.fal.ds.erg.neu
There are other ".fal"s that I want to delete as well, that don't have "Original" in them.
Names vary all the time, so I can't delete specific names. The *Original.fal doesn't vary.
I can only get up to here:
$find /disk_2/people/z183464/DOE-Wellen -name "*.fal" \! -name "*Original.fal" -type f -exec echo rm {} \;
It would be great if the command can delete only in the folder (and it's subfolders) where it has been called (executed)
When I run the code it gives me an error:
/disk_2/people/z183464/DOE-Wellen: is a directory
If you do not want find to dive too deep, you can restrict it with -maxdepth.
You can use a simple for loop for that. This command shows all the files you might want to delete; change echo to rm to delete them:
cd /disk_2/people/z183464/DOE-Wellen && for I in `find . -name "*.fal" ! -name "*Original.fal"`; do echo "$I"; done
(The "is a directory" error most likely comes from the stray $ in front of find in your command: the shell expands the empty variable $find and then tries to execute the path /disk_2/people/z183464/DOE-Wellen itself. Drop the leading $.)
With "find ... | grep ..." you can use regexes too, if you need more flexibility.
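On the pattern side, note that -name "*.fal" matches only names that end in .fal, so the .ds/.erg variants from the question need a broader glob such as '*.fal*'. A sketch with invented file names, using find's -delete (test with echo or a plain listing first, as discussed above):

```shell
# Invented layout: keep Test_Original.fal, delete every other *.fal* file.
mkdir -p waves
touch waves/run1.fal waves/Test_Original.fal \
      waves/Test_Original.fal.ds waves/Test_Original.fal.ds.erg

# '*.fal*' also catches the .ds/.erg variants; the '!' exclusion keeps
# only names ending exactly in Original.fal.
find waves -type f -name '*.fal*' ! -name '*Original.fal' -delete
```

Afterwards only Test_Original.fal remains.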

extracting nested different types of archives from different folders

I have got an archive of many fonts, but I am having trouble extracting them all into one folder. I have been trying to write a long script for three hours now; it somehow breaks on a path issue. I tried piping, like find . -name *.zip | unzip -d ~/fonts, but it doesn't work. I changed so much in the script I wrote that it is not really presentable :(.
Each font file is supposedly (I didn't check them all, there are really many) inside a rar archive, which together with a readme is inside a zip archive, which together with another readme sits in its own folder. Can this be done in one line?
Try changing that line to use xargs, since unzip cannot read archive names from a pipe. Note that unzip treats any argument after the first as a pattern to extract from within the first archive, so pass one zip at a time with -n1:
find . -name "*.zip" | xargs -n1 unzip -d ~/fonts
Try this
find . -name "*.zip" -exec unzip -d ~/fonts {} \;
