Copy files within multiple directories to one directory - linux

We have an Ubuntu Server that is only accessed via terminal, and users transfer files to directories within one parent directory (e.g. /storage/DiskA/userA/doc1.doc, /storage/DiskA/userB/doc1.doc). I need to copy all the specific files within the user folders to another dir, and I'm trying to specifically target the .doc extension.
I've tried running the following:
cp -R /storage/diskA/*.doc /storage/diskB/monthly_report/
However, it keeps telling me there is no such file/dir.
I want to be able to just pull the .doc files from all the user dirs and transfer to that dir, /storage/monthly_report/.
I know this is an easy task, but apparently, I'm just daft enough to not be able to figure this out. Any assistance would be wonderful.
EDIT: I updated the original to show that I have 2 Disks. Moving from Disk A to Disk B.

I would go for find -exec for such a task, something like:
find /storage/DiskA -name "*.doc" -exec cp {} /storage/DiskB/monthly_report/ \;
That should do the trick.
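One caveat, since the example in the question has both userA and userB owning a doc1.doc: copying everything into one flat directory means files with the same name overwrite each other. If that matters, GNU cp's --backup option can keep numbered copies instead of clobbering (a sketch using the same paths as above):
find /storage/DiskA -name "*.doc" -exec cp --backup=numbered {} /storage/DiskB/monthly_report/ \;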

Use
rsync -zarv --include="*/" --include="*.doc" --exclude="*" /storage/diskA/ /storage/diskB/monthly_report/
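Note that the source here has to be the directory itself, not a *.doc glob (the glob is exactly what failed with cp). Unlike the find answer above, this recreates the per-user subdirectories under monthly_report/ rather than flattening everything into one folder. To preview what would be transferred before committing, rsync's -n / --dry-run flag can be added:
rsync -zarvn --include="*/" --include="*.doc" --exclude="*" /storage/diskA/ /storage/diskB/monthly_report/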

Related

Copying files from multiple directories to another directory using linux command line

I have a bunch of files in separate folders, and all of the folders are in one directory.
/var/www/folder1/file1.txt
/var/www/folder1/file2.txt
/var/www/folder1/file3.txt
/var/www/folder2/file4.jpg
/var/www/folder2/file5.jpg
/var/www/folder2/file6.jpg
/var/www/folder3/file7.pdf
/var/www/folder3/file8.doc
/var/www/folder3/file9.gif
I need everything inside of the folders that are inside of /var/www/ to be copied to another directory (say, /var/my-directory/), but not the actual folders. Based on the example above, I need /var/my-directory/ to look as follows:
/var/my-directory/file1.txt
/var/my-directory/file2.txt
/var/my-directory/file3.txt
/var/my-directory/file4.jpg
/var/my-directory/file5.jpg
/var/my-directory/file6.jpg
/var/my-directory/file7.pdf
/var/my-directory/file8.doc
/var/my-directory/file9.gif
I can't seem to figure out the command to do this. I've tried the following:
sudo cp -R /var/www/./. /var/my-directory/
But, that still copies all of the folders.
Is there any way to do what I'm trying to do?
Use find.
find /var/www/ -type f -exec cp '{}' /var/my-directory/ \;
The trick is -type f, which selects only files.
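With GNU cp the same copy can be done with far fewer cp invocations, by handing find's matches to cp in batches via the + terminator and cp's -t (target-directory) flag; a small variant of the answer above:
find /var/www/ -type f -exec cp -t /var/my-directory/ '{}' +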

How to delete all files in a directory except one folder and one file?

I have an application which has one folder called vendor and one file called .env. Whenever I automatically publish my source code files to the folder, all old files should get deleted except these two.
How can I do this in Linux using the shell?
PS: I am trying to implement a rollback mechanism in Jenkins. I will copy artifacts from an old build and transfer them to the server using ssh. But this will be a copy operation, so I want to delete the previous files before starting the copy over SSH.
You can use find:
find . -mindepth 1 -maxdepth 1 ! \( -name 'name1' -o -name 'name2' \) -exec rm -r {} +
The -mindepth 1 -maxdepth 1 part matters here: without it, find would descend into the folder you want to keep and delete its contents, and would also try to remove . itself.
Or, with bash's extglob option enabled, try:
rm !(<filename>)
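For the vendor/.env case from the question, that extglob form would look something like the following (run inside the application directory; !() needs shopt -s extglob in a script, and plain globs don't match hidden files, so .env and other dotfiles are left alone anyway):
shopt -s extglob
rm -rf !(vendor|.env)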

How can I delete files that are not used in code files in linux?

I am running Fedora 18 Linux and I have a PHP project that I have been working on for some time. I am trying to clean things up for a production deploy of a web application. I have a folder of icon images that has, over time, collected files that are no longer used in my code, either because I changed to a different icon in code or because the image file was only used to create other icons. What I am looking to do is make a backup copy of the entire code project and then, hopefully using a combination of find, rm and grep on the command line, scan the entire folder of images and delete any image that is not used anywhere in my code files. I did some searching on the web and found things that locate a line of text in a file and delete it, but nothing quite like what I am trying to do.
Any help is appreciated...
So here is what I came up with. I put together a shell script that does what I need. For the benefit of those who stumble upon this, and for those who want to critique my solution, here it is. I chose to skip .xcf files because these are only used to create many of the icon files, and some of the .png image names would grep to these .xcf files.
#!/bin/bash
# Icon directory to check, and the code tree to search for references.
FILES=/var/www/html/support_desk/templates/default/images/icons/*
codedir=/var/www/html/support_desk_branch/
for f in $FILES
do
  bn=$(basename "$f")
  ext="${bn##*.}"
  echo "Processing $bn file..."
  # Delete the image only if its name appears nowhere in the code tree
  # and it is not a .xcf source file.
  if ! fgrep --quiet -R "$bn" "$codedir"; then
    if [ "$ext" != 'xcf' ]; then
      rm "$f"
    fi
  fi
done
Now I have ONLY the image files that are used in the PHP script files. Just so as not to miss any of the icon files used in the menu, which is defined in a table in a mysql database, I created an sql dump file of the data for that table, and put it in the path of the application files prior to running the shell script.
The simplest way to find unused icon files would be to do a build of your complete project and then look at the access-times of the icon-files. Those that were not read recently (including with grep, of course) would show up readily.
For instance, supposing that you did a backup an hour ago, and did a build ten minutes ago — the access times would be distinct. Then
find . -amin +15 -type f
should give a nice list of "unused" files. If you're sure of the list (you did do a backup, right?) then you could purge the unused files:
find . -amin +15 -type f -exec rm -i {} \;
If you are really certain, you can remove the -i option.
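If deleting in one pass feels risky even with -i, another option is to first move the candidates into a holding directory and only purge that after re-testing the site; a sketch using GNU mv's -t flag (the holding directory name is made up here):
mkdir -p /tmp/unused_icons
find . -amin +15 -type f -exec mv -t /tmp/unused_icons {} +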

How to archive files and sub folders in a location to another place in linux

I am trying to create a shell script to copy folders, and the files within those folders, from one Linux machine to another Linux machine. After copying I would like to delete only the files that were copied. I want to retain the folder structure as is.
Eg.
Machine X has a main folder named F with subfolders A, B and C, each of which contains 10 files.
I would like to make a copy in such a way that machine Y will have a folder named F with subfolders A, B and C containing the same files. Once the copy of all folders and files is complete, it should delete all the files in the source folder but retain the folders.
The code below is untested. Use with care and backup first.
Something like this should get you started:
#!/bin/bash
srcdir=...
set -ex
rsync \
--verbose \
--recursive \
"${srcdir}/" \
user@host:/dstdir/
find "${srcdir}" -type f -delete
Set the srcdir variable and the remote argument to rsync to taste.
The rsync options are just from memory, so they may need tweaking. Read the documentation, especially options regarding deletion, backup, permissions and links.
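In particular, rsync has a --remove-source-files option that does almost exactly what is being asked: it deletes each source file after it has been successfully transferred, while leaving the directory structure in place. A sketch along the same lines as the script above (same placeholder srcdir and destination):
rsync --verbose --recursive --remove-source-files "${srcdir}/" user@host:/dstdir/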
(I'd rather not answer requests that show no signs of effort, but my fingers were itching, so there you go.)
scp the files, check the exit code of the scp and then delete the files locally.
Something like scp files user@remotehost:/path/ && rm files
If scp fails, the second part of the command won't execute.
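Spelled out for the F/A/B/C layout from the question (the paths are only illustrative), that could look like:
scp -r /data/F user@machineY:/data/ && find /data/F -type f -delete
The trailing find ... -type f -delete removes only the files afterwards, so the A, B and C folders stay in place on machine X.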

Symlinking files with specific extension into another folder in linux

I currently have a number of drives mounted under "/media/". I want to recursively scan all the mounted drives looking for files with a specific extension "*.foo". Once found, I want to symlink these files into a directory elsewhere. One requirement is that I keep the basename of the file the same when creating the symlink. I wasn’t able to come up with an easy solution using "find -exec" on my own. Is there an easy way to do this?
find /media/ -name "*.foo" | xargs ln -s -t DIRECTORYYOUWANTLINKSIN
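If any of the drives can contain filenames with spaces, a find -exec form avoids the word-splitting problems of piping into xargs; -t here is GNU ln's target-directory flag, the same as above:
find /media/ -name "*.foo" -exec ln -s -t DIRECTORYYOUWANTLINKSIN '{}' +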
