Copy or move from nth level subfolder to another directory or folder - linux

We have software that records a wav file and puts it in a folder named by date, which is buried several subfolders deep.
For example:
home/user/music/group1/person1/todays date/wav file
home/user/music/group1/person1/yesterdays date/wav file
home/user/music/group1/person2/todays date/wav file
home/user/music/group1/person2/yesterdays date/wav file
Also, the person(n) folder is dynamic, meaning it is created automatically when the software finds someone new using the device. For example, if a new user starts using the software, it will create home/user/music/group1/person3/.
How do I copy or move everything starting from the person(n) folder into a new folder like home/user/new/person1.. home/user/new/person2..
Since the person(n) folders are dynamic, I can't just run a command like cp person1 newdirectory.
What I did was find all the wav files under the group1 folder and cp them to the new folder, but it copies the full path.
find /home/user/music/group1 -name "*.wav" -type f -exec cp --parents \{\} /home/user/new \;
If I remove --parents, it only copies the files into the new folder. How do I copy starting from the person(n) folder into the new folder?

Thanks @RamanSailopal, I got it to work with find /home/user/music/group1/ -name "person*" -type d -exec cp -pR {}/ /home/user/new \;
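For reference, here is a variant of the same idea that does not depend on the folders being named person*. It is just a sketch, assuming every first-level subdirectory of group1 is a person folder, so find can select them by depth instead of by name:
find /home/user/music/group1/ -mindepth 1 -maxdepth 1 -type d -exec cp -pR {} /home/user/new/ \;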

Related

Copy or move all files in a directory regardless of folder depth or number

Let's say I have a folder named Pictures and I want to move or copy all files out of this folder.
However, I also want to harvest all of the files that are in subfolders, so:
Pictures/1.png
Pictures/yolo/2.png
Pictures/yolo/swag/sand/3.png
Pictures/extra/fire/4.png
I want to move or copy all these files to another folder like results so I get:
results/1.png
results/2.png
results/3.png
results/4.png
The catch is that I have no idea in advance what subfolders will be in the Pictures folder.
How can I accomplish this in bash/shell scripts?
I would also appreciate making it file-type neutral so that any files are harvested from their directories (not only .png as in my example), and I have no idea what the file names will be (I only used 1...4 because I didn't know what else to call them).
You can do it like this:
find /absolute/path/to/Pictures -type f -name '*.png' -exec mv -i {} /absolute/path/to/results \;
Another option is to use xargs:
find /absolute/path/to/Pictures -name '*.png' | xargs -I files mv files /absolute/path/to/results
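Note that both commands can trip over file names containing spaces or newlines. A sketch of a more robust variant, assuming GNU find and xargs, uses null-delimited output:
find /absolute/path/to/Pictures -type f -name '*.png' -print0 | xargs -0 -I{} mv -i {} /absolute/path/to/results/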
You can simply copy all files and subdirectories along with their contents using cp's recursive option:
cp -pr <source_path>/* <destination_path>/
Moving them recursively is a bit trickier; you would need to create tar files of the subdirectories, move them, and then untar them in the destination path. Since that is a complex process, a simpler workaround is to copy the files/directories recursively and then delete them from the original path:
cp -pr <source_path>/* <destination_path>/ && rm -rf <source_path>/*
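If rsync is available, a rough equivalent of that copy-then-delete workaround is its --remove-source-files option, which deletes each source file once it has been transferred; the emptied directories left behind can then be cleaned up with find. This is a sketch, assuming rsync and GNU find:
rsync -a --remove-source-files <source_path>/ <destination_path>/
find <source_path> -mindepth 1 -type d -empty -delete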

Deleting particular named directories from parent and all child folders

My folder hierarchy is like A/B/C/D.
Each folder contains a directory named CVS.
I want to delete all directories named CVS from all folders.
I tried running rm -rf "CVS" from the parent folder (the A folder), but it deletes the CVS folder only from A, which doesn't meet my needs.
In total I want to delete 1200 folders named CVS.
If you can let me know the appropriate command to delete the CVS directories recursively from the parent down through all subfolders, it would be a great help.
You can use the find command.
find pathname -type d -iname "CVS" -delete
For pathname, give the path of the directory you want to delete from.
Alternatively, try this:
find pathname -type d -iname "CVS" -exec rm -rf \{\} \;
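Two practical notes, offered as a sketch: you can preview what would be removed by swapping the action for -print, and keep in mind that find's -delete only removes empty directories, so for CVS folders that still contain files the -exec rm -rf form is the one that will actually work:
find pathname -type d -iname "CVS" -print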

Rsync make flat copy

I'm trying to write a script that copies all the files of one directory (with subdirectories) to the root of another directory.
So Imagine I have this file structure:
/
pic.JPG
PIC5.JPG
FOLDER
pic2.JPG
pic3.JPG
FOLDER2
pic4.JPG
I want to take all the .JPG files from that directory and copy them over to another destination, but I don't want the directory structure, just the files.
This is what I've got:
"sudo rsync -aq --include '*/' --include '*.JPG' --exclude '*\' /source/picturesRoot/ /destination/flatView/
But it also copies the directories :(
I found this link on stackoverflow:
rsync : Recursively sync all files while ignoring the directory structure
I looked at the solution and didn't see much difference from my command, apart from the * and . in the path. I tried it but it didn't work.
I hope somebody can help me, thanks.
That answer cannot work for you because your pictures are not all at the same directory level. There is no rsync option to skip creating the directory structure. In the link you gave, it works because the user explicitly selects the source files with *.
You can try combining find and rsync: find locates the files and rsync copies them.
Here is a solution:
find /source/picturesRoot -type f -name "*.JPG" -exec rsync -a {} /destination/flatView/ \;
Be careful: if two files have the same name, only one of them will end up in the destination directory.
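If such collisions are a concern, one option (a sketch, assuming GNU cp is available) is to copy with numbered backups so a duplicate name doesn't silently overwrite an earlier file:
find /source/picturesRoot -type f -name "*.JPG" -exec cp --backup=numbered {} /destination/flatView/ \;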

Copy modified files with directory structure in linux

How can I copy a list of files modified today, along with their directory structure, into a new directory? As shown in the following commands, I want to copy all the files modified today from /dev1/Java/src into /dev2/Java/src. The src folder has many subdirectories.
find /dev1/Java/src -newermt 2014-06-10 > 1.txt
for f in $(cat 1.txt) ; do cp $f /dev2/Java/src; done
You can take advantage of the find and cpio utilities:
cd /dev1/Java/src; find . -mindepth 1 -mtime -1 | cpio -pdmuv /dev2/Java/src
The above command changes into the source directory and finds the list of new files relative to it.
cpio reads that output and copies the files into the target directory with the same structure as the source, which is why relative pathnames are needed.
This alternative finds the files modified within the last day and copies them (without the directory structure) to the desired path:
find . -type f -mtime -1 -exec cp {} /path \;
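If you would rather stick with cp but still keep the directory structure, GNU cp's --parents flag can reproduce the relative paths under the target. A sketch, assuming it is run from the source directory:
cd /dev1/Java/src && find . -type f -newermt 2014-06-10 -exec cp --parents {} /dev2/Java/src \;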

Extract all the files with name containing a keyword

I have thousands of HTML files in a directory. I want to extract the files that contain Chennai in the file name and put them into another folder. I am sure it is possible, but I haven't gotten close to copying the files to another folder.
Use globbing:
mv *Chennai* target/
If the file names might start with a dot, use
mv .*Chennai* *Chennai* target/
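One caveat with plain globbing: if either pattern matches nothing, the shell passes it to mv literally and the command fails. A small bash sketch that avoids this with nullglob:
shopt -s nullglob
files=( .*Chennai* *Chennai* )
(( ${#files[@]} )) && mv -- "${files[@]}" target/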
Try:
find directory_with_htmls -type f -name "*Chennai*.html" -exec cp {} some_other_folder \;
This copies the HTML files in the directory_with_htmls directory whose names contain Chennai to the directory some_other_folder.
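If you want to move rather than copy the files, the same find can call mv instead; the -t form below is a GNU mv option that lets find batch many files into a single invocation (a sketch, assuming GNU tools):
find directory_with_htmls -maxdepth 1 -type f -name "*Chennai*.html" -exec mv -t some_other_folder {} +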
