Moving files without overwriting - Linux

I'm using the following command to move all files in subfolders to a destination folder, without overwriting files with the same name:
find folder-target -type f -exec cp --backup=numbered \{\} folder-final \;
This appends ~1~ to the file name if a file with the same name already exists. The problem is that this makes the file unusable: I need to collect all my PDFs, and I can't open them when they have these numbers appended. Is this fixable? Could I use a prefix instead?
Thanks.

Try cp -n.
See the man page here: http://man7.org/linux/man-pages/man1/cp.1.html
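For example, the original command could become (a sketch; note that -n silently skips any file whose name already exists in folder-final):
find folder-target -type f -exec cp -n {} folder-final \;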

Related

Copy or move all files in a directory regardless of folder depth or number

Let's say I have a folder named Pictures and I want to move or copy all files out of this folder.
However, I also want to harvest all of the files that are in subfolders, so:
Pictures/1.png
Pictures/yolo/2.png
Pictures/yolo/swag/sand/3.png
Pictures/extra/fire/4.png
I want to move or copy all these files to another folder like results so I get:
results/1.png
results/2.png
results/3.png
results/4.png
Only I have no idea in advance what subfolders will be in the Pictures folder.
How can I accomplish this in bash/shell scripts?
I would also appreciate making it file-type neutral, so that any files are harvested from their directories (not only .png as in my example), and I have no idea what the file names will be (I only used 1...4 because I had no better idea how to name them).
You can do it like this:
find /absolute/path/to/Pictures -type f -name '*.png' -exec mv -i {} /absolute/path/to/results \;
Another option is to use xargs
find /absolute/path/to/Pictures -name '*.png' | xargs -I files mv files /absolute/path/to/results
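If the file names may contain spaces, and you want the command file-type neutral as asked, a null-delimited variant is safer (a sketch, assuming GNU find and xargs):
# move every regular file, whatever its extension, flattening it into results
find /absolute/path/to/Pictures -type f -print0 | xargs -0 -I{} mv -i {} /absolute/path/to/results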
You can simply copy all files and subdirectories along with their contents using cp's recursive option:
cp -pr <source_path>/* <destination_path>/
But moving them recursively is a bit tricky: you would need to create tar files of the subdirectories, move those, and then untar them in the destination path. Since that is a more involved process, a workaround is to copy the files/directories recursively and then delete them from the original path.
cp -pr <source_path>/* <destination_path>/ && rm -rf <source_path>/*
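A minimal sketch of the tar-based move described above (the paths and the archive name are placeholders):
# pack the source tree, unpack it at the destination, then remove the originals
tar -C <source_path> -cf /tmp/move.tar . && tar -C <destination_path> -xf /tmp/move.tar && rm -rf <source_path>/*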

find -exec unzip multiple .zip files, each into their own directory where source and destination different

I have a directory where new .zip files get placed every day. I need to find the files from the last day and unzip each of them into its own directory in a different location. What I have found with a lot of searching almost does this for me:
find /source1/source2/source3 -maxdepth 1 -type f -mtime -1 \
-exec sh -c 'unzip -d /dest1/dest2/"${1%.*}" "$1"' _ {} \;
The problem with the above line is that the destination directory it is trying to create is /dest1/dest2/source1/source2/source3/{dir that is the filename of the zip}/{unzipped files}. I need it to just be /dest1/dest2/{filename}.
Is there a way to strip the source directories out of the ${1%.*} variable? Or if there is a better way to get this done, I'm open to any suggestions.
You can strip the source directories with basename. Just replace "${1%.*}" with $(basename "${1%.*}").
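So the full command would look something like this (a sketch, using the same example paths):
find /source1/source2/source3 -maxdepth 1 -type f -mtime -1 \
-exec sh -c 'unzip -d /dest1/dest2/"$(basename "${1%.*}")" "$1"' _ {} \;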

Eliminating subfolders to move all files into one folder

I have a folder that contains 32 folders, each with several image files. I would like to move all of these image files into one main folder. I know how to do that manually, folder by folder. Is there an automated command-line way to do that? I have Crunchbang Waldorf, and usually use PCmanFM as a file manager.
The /*/ below stands for the subdirectories:
mv /path/from/*/*.jpg /path/main/
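If some images sit more than one level deep, bash's globstar option extends the same idea (a sketch, assuming bash 4 or later):
# ** matches any depth of subdirectories once globstar is enabled
shopt -s globstar
mv /path/from/**/*.jpg /path/main/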
If all these images have one extension, for instance .jpg:
find /directory/You/Want/To/Search -name "*.jpg" -exec cp -t /destination/directory {} +
Note: just make sure that all these images have unique names, otherwise files with the same name will overwrite one another in the destination.
UPDATE:
If you don't know what the image extensions are, you could just use this one:
find /directory/You/Want/To/Search -regex ".*\.\(jpg\|gif\|png\|jpeg\)" -exec cp -t /destination/directory {} +
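Since the question asks about moving rather than copying, the same command should also work with mv, which likewise supports -t in GNU coreutils (a sketch):
find /directory/You/Want/To/Search -regex ".*\.\(jpg\|gif\|png\|jpeg\)" -exec mv -t /destination/directory {} +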

How to replace a file in several sub-folders

I have a series of directories, each containing a set of files. There is a new copy of one of these files with which I would like to replace all instances. How can I do this with the find command?
The latest file is in /var/www/html and is called update_user.php.
There are 125 directories with several other files, including a copy of update_user.php. I want to replace these copies with the one in /var/www/html, excluding that file itself.
This should do the job:
find /path/to/old/files -type f -name update_user.php -exec cp /path/to/new/update_user.php {} \;
You should check whether the new file is inside /path/to/old; if so, first copy it outside and use that copy. But it will do no harm if you don't: one cp will simply fail with an 'are the same file' error.
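Alternatively, if the new file does live under the same tree, GNU find's -not -samefile test can exclude it directly (a sketch, using the paths from the question):
# copy the new file over every other copy, skipping the source file itself
find /var/www/html -type f -name update_user.php -not -samefile /var/www/html/update_user.php -exec cp -v /var/www/html/update_user.php {} \;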
You can use:
cp -v to see what it does
cp -u to update only when the source file is newer
echo cp to perform a dry run
I would suggest first checking whether all destination files are identical with:
find /path/to/old/files -type f -name update_user.php -exec md5sum {} \;|awk '{print $1}'|sort|uniq

Command Line to find and delete Similar files in Linux

I want to find and delete all files similar to the ones shown below, recursively, from a specified directory on my Linux server, using the command line.
"file.php?p=10&file=load%2Fnew Animations 1%2FAwesome_flowers-(text1).gif&sort=0"
"file.php?p=10&file=load%2Fnew Games 1%2FAwesome_flowers_-(text1).gif&sort=1"
Note: all these files have different file sizes, and the matched text in all of the file names is "file.php?p***".
This should help you delete the files recursively:
find /path/to/dir -type f -name 'file.php\?p*' -exec rm -i {} \;
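It can be safer to preview the matches first and only delete once the list looks right (a sketch):
# list what the pattern matches before running the rm version
find /path/to/dir -type f -name 'file.php\?p*' -print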
