Linux: find and copy files with the same name to a destination folder without overwriting

I want to find all *.jpg files in one folder, including its subfolders, and copy them to another folder.
I use
find /tempL/6453/ -name "*.jpg" | xargs -I '{}' cp {} /tempL/;
but it overwrites files with the same name.
For example, /tempL/6453/ contains test (1).jpg, test (2).jpg and a folder 1; /tempL/6453/1/ also contains files with the same names, test (1).jpg and test (2).jpg.
If I use the above command, only two files test (1).jpg and test (2).jpg end up in /tempL/; it cannot copy all the files to /tempL/.
What I want is to copy all files to /tempL/, when there are same file name, just rename them, how to?

1) If you just don't want to overwrite, cp --backup will make a backup of each existing file; with cp's --suffix option you can also specify the suffix to be used for the backup.
2) The --parents option of cp will keep the directory tree, i.e. files in folder 1 will be copied into a newly created 1 folder.
3) If you want to customize the renaming, you cannot use the cp command alone; write a script for it and call it on the result of find, as in the sketch below.
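For option 1, a minimal sketch assuming GNU cp, where clashing names get numbered backups instead of being overwritten:
find /tempL/6453/ -name "*.jpg" -exec cp --backup=numbered {} /tempL/ \;
For option 3, one possible rename script (not from the original answer; the _$n suffix scheme is just an illustration):
# copy every .jpg, appending a counter until the destination name is free
find /tempL/6453/ -name '*.jpg' -print0 | while IFS= read -r -d '' f; do
  base=$(basename "$f")
  dest="/tempL/$base"
  n=1
  while [ -e "$dest" ]; do
    dest="/tempL/${base%.jpg}_$n.jpg"   # e.g. test (1).jpg -> test (1)_1.jpg
    n=$((n+1))
  done
  cp "$f" "$dest"
done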

Install "GNU parallel" and use:
find /tempL/6453/ -name "*.jpg" | parallel 'cp {} ./dest-dir/`stat -c%i {}`_{/}'
{/} is a GNU parallel replacement string that gives the filename without its path.
I think the same approach should be possible with xargs, but learning about parallel was amazing for me; it offers many beautiful solutions.
I recommend putting echo before cp in order to test your command.
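For example, a dry run of the command above (nothing is copied; the cp invocations are only printed):
find /tempL/6453/ -name "*.jpg" | parallel 'echo cp {} ./dest-dir/`stat -c%i {}`_{/}'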

Related

Copy or move all files in a directory regardless of folder depth or number

Let's say I have a folder named Pictures and I want to move or copy all files out of this folder.
However, I also want to harvest all of the files that are in subfolders, so:
Pictures/1.png
Pictures/yolo/2.png
Pictures/yolo/swag/sand/3.png
Pictures/extra/fire/4.png
I want to move or copy all these files to another folder like results so I get:
results/1.png
results/2.png
results/3.png
results/4.png
Only I have no idea in advance what sub folders will be in the Pictures folder.
How can I accomplish this in bash/shell scripts?
I would also appreciate making it file-type neutral, so any files are harvested from their directories (not only .png as in my example), and I have no idea what the file names will be (I only used 1...4 because I did not have any idea how to name them).
You can do it like this:
find /absolute/path/to/Pictures -type f -name '*.png' -exec mv -i {} /absolute/path/to/results \;
Another option is to use xargs:
find /absolute/path/to/Pictures -name '*.png' | xargs -I files mv files /absolute/path/to/results
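Since the question asked for a file-type-neutral version, dropping the -name filter and matching on -type f does that (a variant sketch, not from the original answer):
find /absolute/path/to/Pictures -type f -exec mv -i {} /absolute/path/to/results \;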
You can simply copy all files and subdirectories along with their contents using cp's recursive option:
cp -pr <source_path>/* <destination_path>/
But moving them recursively is trickier: you would need to tar up the subdirectories, move the tar files, and untar them at the destination. As that is a complex process, a simpler workaround is to copy the files/directories recursively and then delete them from the original path:
cp -pr <source_path>/* <destination_path>/ && rm -rf <source_path>/*
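If you do want the tar route mentioned above, a classic tar pipe does it in one shot (a sketch using the same placeholder paths; the rm only runs if the extraction succeeds):
(cd <source_path> && tar cf - .) | (cd <destination_path> && tar xf -) && rm -rf <source_path>/*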

Eliminating subfolders to move all files into one folder

I have a folder that contains 32 folders, each with several image files. I would like to move all of these image files into one main folder. I know how to do that manually, folder by folder. Is there an automated command-line way to do that? I have Crunchbang Waldorf, and usually use PCmanFM as a file manager.
/*/ stands for directories.
mv /path/from/*/*.jpg /path/main/
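Note that /*/*.jpg only reaches one level of subfolders. If the tree may be deeper and you are on bash 4 or later (an assumption), globstar handles arbitrary depth:
shopt -s globstar
mv /path/from/**/*.jpg /path/main/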
If all these images have one extension, for instance .jpg:
find /directory/You/Want/To/Search -name "*.jpg" -exec cp -t /destination/directory {} +
Note: just make sure all these images have unique names, otherwise files with the same name will overwrite each other at the destination.
UPDATE:
If you don't know what the image extensions are, you could just use this one:
find /directory/You/Want/To/Search -regex ".*\.\(jpg\|gif\|png\|jpeg\)" -exec cp -t /destination/directory {} +
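If the extensions may also appear in upper case (JPG, PNG, ...), GNU find's -iregex is a case-insensitive drop-in for -regex (an assumption that GNU find is available):
find /directory/You/Want/To/Search -iregex ".*\.\(jpg\|gif\|png\|jpeg\)" -exec cp -t /destination/directory {} +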

Write a script to delete files if they exist in a different folder in Linux

I'm trying to write a script in Linux. I have some CSV files in two different folders (A and B); after some processing, copies of the rejected files are moved to a Bad folder.
So I want the bad files that have been copied to the Bad folder to be deleted from folders A and B.
Can you help me to write this script for linux?
Best
Let's say the name of the Bad folder is 'badFolder', and that 'A', 'B' and 'badFolder' are in the same directory.
Steps to delete the files from folders A and B:
Step 1: change the current directory to your 'badFolder':
cd badFolder
Step 2: delete the identical files:
find . -type f -exec rm -f ../A/{} \;
find . -type f -exec rm -f ../B/{} \;
The -type f argument tells find to look for files, not directories.
The -exec ... \; argument tells find that, once it finds a file in 'badFolder', it should run rm -f on the counterpart in the A (or B) subdirectory.
Because rm is given the -f option, it will silently ignore files that don't exist.
Also, it will not prompt before deleting files. This is very handy when deleting a large number of files. However, be sure that you really want to delete the files before running this script.
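In line with that caution, a dry run first (an echo-guarded variant, as suggested elsewhere on this page) shows what would be removed without touching anything:
find . -type f -exec echo rm -f ../A/{} \;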
#!/bin/bash
#Set the working folder in which you want to delete the files
Working_folder=/<Folder>/<path>
cd "$Working_folder"
#Command to delete the named files present in that folder
rm <filenames separated by spaces>
echo "files are deleted"
#If you want to delete all files, you can use a wildcard, e.g. rm *.*
#If you want to delete particular files, say all .csv files, you can use: rm *.csv
Set variables containing the paths of your A, B and BAD directories.
Then you can do something along the lines of
for file in $(ls "${PATH_TO_BAD}")
do
  rm "${PATH_TO_A}/$file"
  rm "${PATH_TO_B}/$file"
done
This iterates over the BAD directory; any file it finds there, it deletes from the A and B directories.

Rsync make flat copy

I'm trying to write a script that copies all the files of one dir (with subdirs) to the root of another dir.
So Imagine I have this file structure:
/
    pic.JPG
    PIC5.JPG
    FOLDER
        pic2.JPG
        pic3.JPG
    FOLDER2
        pic4.JPG
I want to take all the .JPG files from that directory tree and copy them over to another destination. But I don't want the directory structure, just the files.
This is what I've got:
"sudo rsync -aq --include '*/' --include '*.JPG' --exclude '*\' /source/picturesRoot/ /destination/flatView/
But it also copies the directories :(
I found this link on stackoverflow:
rsync : Recursively sync all files while ignoring the directory structure
I looked at the solution and didn't see much difference with my command, apart from the * and . in the path. I tried it but it didn't work.
I hope somebody can help me, thanks.
That answer cannot work for you, because your pictures are not all at the same directory level. There is no rsync option to skip creating the directory structure. In the link you gave it works because the user explicitly selects the source files with *.
You can try something with find and rsync: find locates the files and rsync copies them.
Here is a solution :
find /source/picturesRoot -type f -name "*.JPG" -exec rsync -a {} /destination/flatView/ \;
Be careful: if two files have the same name, only one of them will end up in the destination directory.
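A possible refinement (not from the original answer): rsync's -b option backs up a clashing destination file instead of silently overwriting it, and --suffix controls the backup name:
find /source/picturesRoot -type f -name "*.JPG" -exec rsync -ab --suffix=.dup {} /destination/flatView/ \;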

Zipping and deleting files over a certain age

I'm trying to put together a command that will find files that haven't been modified in over 6 months and zip them, all in one command. Afterwards I want to delete all the files I just archived.
My current command to find the directories with the files is
find /var/www -type d -mtime -400 ! -mtime -180 | xargs ls -l > testd.txt
This gave me all the directories, including the files, that are older than 6 months.
Now I was wondering if there is a way of zipping all the results and deleting them afterwards. Something along the lines of
find /var/www -type f -mtime -400 ! -mtime -180 | gzip -c archive.gz
If anyone knows the proper syntax to achieve this, I'd love to know. Thanks!
Edit: after a few tests this command results in a corrupted file
find /var/www -mtime -900 ! -mtime -180 | xargs tar -cf test4.tar
Any ideas?
Break this into several distinct steps that you can implement and thoroughly test separately:
1. Build a list of files to be archived and then deleted, saved to a temp file.
2. Use the list from step 1 to add the files to .tar.gz archives. Give the archive file a name following a specific pattern that won't appear in the files to be archived, and put it in a directory outside the hierarchy of files being archived.
3. Read back the files from the .tar.gz and compare them (or their hashes) to the original files to ENSURE that you got them all without corruption.
4. Use the list from step 1 to delete the files. Do not use a wildcard for deletion. Put in some guard code to prevent deletion of any file matching the name pattern of the archive .tar.gz file(s) created in step 2.
When testing a script that can do irreversible damage, always code the dangerous command with a leading echo and leave it that way until you are sure everything works. Only then remove the echo.
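A minimal sketch of those steps (the paths and archive name are assumptions; per the advice above, the deletion is still echo-guarded):
# step 1: list files not modified in the last ~6 months
list=/tmp/archive-list.txt
find /var/www -type f -mtime +180 > "$list"
# step 2: archive them, outside the /var/www hierarchy
archive=/var/backups/www-archive-$(date +%Y%m%d).tar.gz
tar -czf "$archive" -T "$list"
# step 3: a light integrity check; comparing hashes, as suggested, is stronger
tar -tzf "$archive" > /dev/null || exit 1
# step 4: delete from the same list, echo guard left in until verified
xargs -a "$list" -d '\n' echo rm --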
Consider zip, it should meet your requirements.
find ... | zip -m -@ archive.zip
-m (move) deletes the input directories/files after making the specified zip archive.
-@ takes the list of input files from standard input.
You may find more options which are useful to you in the zip manual, e.g.
-r (recurse) travels the directory structure recursively.
-sf (show-files) shows the files that would be operated on, then exits.
-t or --from-date operates on files not modified prior to the specified date.
-tt or --before-date operates on files not modified after or at the specified date.
This could possibly make find expendable.
zip -mr --from-date 2012-09-05 --before-date 2013-04-13 archive /var/www
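Following the testing advice given earlier, -sf gives a dry run of that command, listing what would be archived (and moved out) without writing anything:
zip -sf -r --from-date 2012-09-05 --before-date 2013-04-13 archive /var/www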
