Copy/move multiple files with the same name in different folders - Linux

I have multiple files in different folders with the same name and extension. For example: there are 460 folders, and each folder contains one file named snps.vcf. I want to copy/move these files into one folder for some analysis I need to do later.
I have tried:
find . -type f -name "*.vcf" -exec cp {} /home/AWAN/try ';'
but this code overwrites the files and only one file remains there in the end.
I have tried rename, but I don't know how to select multiple files with find and then rename them. Even with mmv I couldn't find a solution.

You need to write an external script and pass it to -exec.
Your script may use mktemp to generate a random file name. Example:
mktemp /your/directory/try-XXX
The XXX part will be replaced by mktemp with a different value for each call.
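A minimal sketch of such a script (the destination /home/AWAN/try is from the question; the script name copy-unique.sh is just an example):

#!/bin/bash
# copy-unique.sh: copy the file given as $1 into /home/AWAN/try under a
# unique name, so identically named snps.vcf files don't overwrite each other.
# Note: putting ".vcf" after the X's relies on GNU mktemp treating it as a suffix.
dest=$(mktemp /home/AWAN/try/snps-XXXXXX.vcf) || exit 1
cp "$1" "$dest"

You would then call it as:
find . -type f -name "*.vcf" -exec ./copy-unique.sh {} ';'
The mktemp names are opaque; if you'd rather keep track of where each file came from, you could embed the parent directory name in the destination instead.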

Related

Copy files within multiple directories to one directory

We have an Ubuntu Server that is only accessed via terminal, and users transfer files to directories within one parent directory (e.g. /storage/DiskA/userA/doc1.doc, /storage/DiskA/userB/doc1.doc). I need to copy all the specific files within the user folders to another directory, and I'm specifically targeting the .doc extension.
I've tried running the following:
cp -R /storage/diskA/*.doc /storage/diskB/monthly_report/
However, it keeps telling me there is no such file/dir.
I want to be able to just pull the .doc files from all the user dirs and transfer to that dir, /storage/monthly_report/.
I know this is an easy task, but apparently, I'm just daft enough to not be able to figure this out. Any assistance would be wonderful.
EDIT: I updated the original to show that I have 2 Disks. Moving from Disk A to Disk B.
I would go for find -exec for such a task, something like:
find /storage/DiskA -name "*.doc" -exec cp {} /storage/DiskB/monthly_report/ \;
That should do the trick.
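Note that if several user directories contain identically named files, plain cp will overwrite earlier copies in the destination. GNU cp can keep them all with numbered backups (a variant of the command above):
find /storage/DiskA -name "*.doc" -exec cp --backup=numbered {} /storage/DiskB/monthly_report/ \;
Earlier copies are then renamed to doc1.doc.~1~, doc1.doc.~2~, and so on instead of being lost.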
Use rsync with include/exclude filters:
rsync -zarv --include="*/" --include="*.doc" --exclude="*" /storage/diskA/ /storage/diskB/monthly_report/
The --include="*/" rule lets rsync descend into the subdirectories, --include="*.doc" keeps the .doc files, and the final --exclude="*" filters out everything else. Note that the source must be the directory itself, not a *.doc glob, and that rsync recreates the relative directory structure under the destination rather than flattening it.

How do I copy differing content files from one directory to another?

There exist two directories: a/ and b/.
I'd like to copy all the files (recursively) from a/ into b/.
However, I only want to copy over an a file if its content differs from the already existing b file. If the corresponding b file does not exist, the a file should still be copied.
*By "corresponding file", I mean a file with the same name and relative path from its parent directory.
note:
The reason I don't want to overwrite a b file with the exact same contents is that the b directory is being monitored by another program, and I don't want the file dates to change, causing the program to do more work than required.
I'm essentially looking for a way to perform a cp -rf a/ b/ while performing a diff check on each file. If the files are different, perform the copy; otherwise skip it.
I see that cp has an update flag:
-u, --update
copy only when the SOURCE file is newer than the destination file or when the
destination file is missing
but this will not work because I'm not concerned about newer files; I'm concerned about different file contents.
Any shell language will do.
I've been attempting to get this to work by injecting my diff check into a find command:
find a/ ??? -exec cp {} b \;
This doesn't seem like an uncommon thing to do between two directories, so I'm hoping there is an elegant command-line solution as opposed to me having to write a Python script.
You can achieve this with rsync's --checksum (-c) option, which decides what to transfer by comparing file contents rather than timestamps, so only files that are missing or differ in content get copied:
rsync -rcv a/ b/
Deliberately leaving out -t (and -a, which implies it) matters here: with time preservation enabled, rsync would still adjust the timestamps of content-identical files, which is exactly what you want to avoid.
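If you'd rather stay with the find approach you sketched, here is a minimal bash sketch of the same idea (assuming the a/ and b/ layout from the question, and file names without newlines):

#!/bin/bash
# Copy files from a/ into b/, but only when the b/ copy is missing
# or has different content.
cd a/ || exit 1
find . -type f | while IFS= read -r f; do
    dest="../b/$f"
    mkdir -p "$(dirname "$dest")"      # create missing subdirectories in b/
    if ! cmp -s "$f" "$dest"; then     # cmp -s succeeds only if contents match
        cp "$f" "$dest"                # missing or different: copy
    fi
done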

I need to "cat" a bunch of files in different locations

I have around 120 folders with subdirectories, but they all contain one file, a download.txt file that is created in its own directory by 120 scripts executing in parallel.
I want to make all these scripts share the same file, but first I need to merge or cat them (I believe cat can be used for this purpose).
What would be the best way do to this in bash/shell?
After the files are closed by writers:
find root_dir -name download.txt -exec cat {} \; > merged_download.txt
where the root_dir is the path to the parent of those 120 directories.
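With many files, find can also batch them into fewer cat invocations by terminating -exec with + instead of \; (the output order is still find's traversal order):
find root_dir -name download.txt -exec cat {} + > merged_download.txt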
If these txt files sit at the same depth in the folder structure, such as always folder1/folder2/download.txt, then you can run a simple one-liner:
cat */*/download.txt > merged_download.txt

How to send list of file in a folder to a txt file in Linux

I'm fairly new to Linux (CentOS in this case). I have a folder with about 2000 files in it. Ideally, I'd like to execute a command at the command prompt that would write the names of all the files into a single txt file.
If I have to, I could write an actual program to do it too, I was just thinking there might be a way to simply do it from the command prompt.
You can just use
ls > filenames.txt
(Run this in a shell, e.g. a terminal.) You may need to cd into that folder first, or give the path directly: ls ~/docs > filenames.txt. Note that filenames.txt itself will show up in the listing if you create it inside the folder you are listing.
If only the names of regular files immediately contained within a directory (say ~/docs) are needed, you can do
find ~/docs -maxdepth 1 -type f > filenames.txt
(-maxdepth goes before the tests; GNU find warns otherwise.)
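If you want just the bare file names without the ~/docs/ prefix, GNU find (which CentOS ships) can print the basename directly:
find ~/docs -maxdepth 1 -type f -printf '%f\n' > filenames.txt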

Find folders with specific name and no symlink pointing to them

I'm trying to write a shell script under Linux which lists all folders (recursively) with a certain name that have no symlink pointing to them.
For example, I have:
/home/htdocs/cust1/typo3_src-4.2.11
/home/htdocs/cust2/typo3_src-4.2.12
/home/htdocs/cust3/typo3_src-4.2.12
Now I want to go through all subdirectories of /home/htdocs and find those typo3_* folders that no symlink points to.
Should be possible with a shellscript or a command, but I have no idea how.
Thanks for your help
Stefan
I think none of the common file systems record in a file's inode whether any symlinks point to it, so you would have to scan all other files to see whether each is a symlink to this one. If you don't limit the depth of the search, this might take a very long time. If you want to perform that search in /home/htdocs, for example, it would work something like this:
# find specified folders:
find /home/htdocs -name 'typo3_*' -type d | while IFS= read -r folder; do
    # list all symlinks pointing to $folder
    find -L /home/htdocs -samefile "$folder" | grep -v "$folder\$"
done
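That loop lists the symlinks that do point to each folder; to get what the question asks for (folders with no symlink pointing to them), you can test whether the inner find produces any output. A sketch, using the paths from the question and assuming GNU find:

#!/bin/bash
# Print typo3_* directories under /home/htdocs that no symlink
# (anywhere under /home/htdocs) resolves to.
find /home/htdocs -name 'typo3_*' -type d | while IFS= read -r folder; do
    # With -L, -samefile matches anything resolving to $folder's inode;
    # -xtype l then keeps only the symlinks among those matches.
    links=$(find -L /home/htdocs -samefile "$folder" -xtype l 2>/dev/null)
    [ -z "$links" ] && echo "$folder"    # no symlink points here
done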
