I need to delete all non-PHP files from a Linux server, or download only the PHP files - linux

I have thousands of files, maybe hundreds of thousands, on a Linux server, spread across directories and subdirectories.
The files are all located in /home/sas/httpdocs.
I want to get a copy of the entire directory with just the PHP files, preserving the same directory structure.
I have two options:
either remove ALL of the non-PHP files, then tar it up and download it,
or extract all of the PHP files into a new directory while keeping the same directory structure.
Any ideas on how to do this?
Sas

This will copy only the PHP files into a separate directory:
cd /home/sas/httpdocs
tar -cf - `find . -name "*.php" -print` | ( cd /destination_dir && tar xBf - )
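If any file names contain spaces, the backtick substitution above can mangle them; a more robust sketch (assuming GNU find and GNU tar) passes NUL-terminated names instead:
cd /home/sas/httpdocs
find . -name '*.php' -print0 | tar --null -cf - -T - | ( cd /destination_dir && tar xf - )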
There is another method, deleting the non-PHP files instead. It is detailed elegantly here:
https://superuser.com/questions/168130/unix-delete-files-and-folders-excluding-pattern

Using rsync could be an option:
rsync -av --include "*/" --include "*.php" --exclude "*" /home/sas/httpdocs/. /copy/dir/
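If you would rather not create directories that end up containing no PHP files, rsync's -m (--prune-empty-dirs) option can drop them; a sketch along the same lines:
rsync -avm --include "*/" --include "*.php" --exclude "*" /home/sas/httpdocs/ /copy/dir/
Run it once with -n added to preview what would be transferred.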

To delete the files not ending in .php:
find /dir -type f ! -name "*.php" -print
When you are happy with the output, replace the -print with -delete.
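Applied to the paths from the question, the delete-then-download route might look like this sketch (httpdocs-php.tar.gz is just an example name; note that -delete only removes files, so now-empty directories stay behind):
find /home/sas/httpdocs -type f ! -name '*.php' -delete
tar -czf httpdocs-php.tar.gz -C /home/sas httpdocs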

Related

Use rsync to move DB2 Archlog files including path

I am trying to move DB2 archive log files to a /backup filesystem before I do additional actions on them. It is important here that we preserve the full path.
tar -czvf $OUTFILE db2/???/log_archive/db2???/???/NODE*/LOGSTREAM*/C*/* --remove-files
At the moment I use tar from the root, but it has some downsides; I would prefer that the files are simply moved.
So I am playing with rsync, for instance:
rsync -nrv --include='/db2/???/log_archive/*' --include '*.LOG' --exclude '*' --prune-empty-dirs / /backup
But whatever I try, I either get (nearly) all files and folders from root, or nothing at all.
Does anyone have a good idea?
When I noticed that rsync can work with --files-from, I managed it in a completely different way.
So, to move the DB2 log files to a different /backup filesystem while preserving the full path, I can use find and then rsync:
find /db2/???/log_archive/* -name '*.LOG' -type f -mmin +1 | rsync -rv --files-from=- --remove-source-files / /backup/
sending incremental file list
db2/
db2/BWP/
db2/BWP/log_archive/
db2/BWP/log_archive/db2bwp/
db2/BWP/log_archive/db2bwp/BWP/
db2/BWP/log_archive/db2bwp/BWP/NODE0000/
db2/BWP/log_archive/db2bwp/BWP/NODE0000/LOGSTREAM0000/
db2/BWP/log_archive/db2bwp/BWP/NODE0000/LOGSTREAM0000/C0000000/
db2/BWP/log_archive/db2bwp/BWP/NODE0000/LOGSTREAM0000/C0000000/S0006869.LOG
db2/BWP/log_archive/db2bwp/BWP/NODE0000/LOGSTREAM0000/C0000000/S0006870.LOG
db2/BWP/log_archive/db2bwp/BWP/NODE0000/LOGSTREAM0000/C0000000/S0006871.LOG
db2/BWP/log_archive/db2bwp/BWP/NODE0000/LOGSTREAM0000/C0000000/S0006872.LOG
db2/BWP/log_archive/db2bwp/BWP/NODE0000/LOGSTREAM0000/C0000000/S0006873.LOG
etc etc etc
Here I use 'find' to select the files (older than 1 minute, to ensure DB2 has finished archiving the file) and then pipe this list of files to 'rsync', which moves them to the new location.
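If any of the paths could ever contain whitespace (unlikely for DB2 log names), a NUL-terminated variant is a safer sketch, assuming GNU find and an rsync that supports --from0:
find /db2/???/log_archive/ -name '*.LOG' -type f -mmin +1 -print0 | rsync -rv --files-from=- --from0 --remove-source-files / /backup/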

Copying folders but not tar files in linux

I have a folder which contains many folders and many tar files (around 1000 in total).
I want to write a script to copy all folders with their contents to another directory, but I do not want to copy tar files.
I already know that by writing
cp -a /source/ /path/
I can copy a directory with its contents to another, but in this case I do not know how to do it.
As the number of directories is large, I cannot copy them one at a time.
I would appreciate it if someone could help me with this.
I think this might be what you're looking for.
You want to use the rsync command, and in the --exclude flag you want to put *.tar.
So your answer will look something like this:
rsync -r --exclude='*.tar' [source] [destination]
This is also a helpful little tutorial on how to use rsync.
You can combine cp with find to exclude *.tar files:
dest='/path/'
mkdir "$dest" &&
find /source -mindepth 1 -maxdepth 1 -not -name '*.tar' -exec cp -a {} "$dest" \;
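If you want the copy to preserve ownership and timestamps the way cp -a does, an rsync sketch with archive mode would be (add -n first for a dry run):
rsync -a --exclude='*.tar' /source/ /path/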

How to compress files with specified suffix in subdirectory?

For example, the current directory is /A/B/, and there are scripts with suffixes .py and .sh in /A/B/C/, /A/B/C/D/ and /A/B/E/.
How do I generate a compressed file that preserves the directory structure and contains only the Python/shell scripts?
Use find with your compression, e.g.:
zip outfile -r `find . -name '*.py'` `find . -name '*.sh'`
find ./someDir -name "*.php" -o -name "*.html" | tar -cf my_archive -T -
as seen here in a question very similar to yours.
How to tar certain file types in all subdirectories?
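To pick up both suffixes from the question in a single compressed archive while keeping the directory structure, a sketch (scripts.tar.gz is just an example name; GNU find and tar assumed) is:
find . -type f \( -name '*.py' -o -name '*.sh' \) -print0 | tar --null -czf scripts.tar.gz -T -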

Rsync make flat copy

I'm trying to write a script that copies all the files of one dir (with subdirs) to the root of another dir.
So imagine I have this file structure:
/
  pic.JPG
  PIC5.JPG
  FOLDER
    pic2.JPG
    pic3.JPG
  FOLDER2
    pic4.JPG
I want to take all the .JPG files from that directory tree and copy them over to another destination, but I don't want the directory structure, just the files.
This is what I've got:
sudo rsync -aq --include '*/' --include '*.JPG' --exclude '*' /source/picturesRoot/ /destination/flatView/
But it also copies the directories :(
I found this link on stackoverflow:
rsync : Recursively sync all files while ignoring the directory structure
I looked at the solution and didn't see much difference with my command, apart from the * and . in the path. I tried it but it didn't work.
I hope somebody can help me, thanks.
That answer cannot work for you because your pictures are not all at the same directory level. There is no option in rsync to skip the creation of the directory structure. In the link you gave it works because the user explicitly selects the source files with *.
You can try something with find and rsync: find selects the files and rsync copies them.
Here is a solution:
find /source/picturesRoot -type f -name "*.JPG" -exec rsync -a {} /destination/flatView/ \;
Be careful: if two files have the same name, only one will end up in the destination directory.
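If colliding names are a concern, a hedged workaround (assuming GNU cp) is to keep numbered backups instead of silently overwriting:
find /source/picturesRoot -type f -name '*.JPG' -exec cp --backup=numbered {} /destination/flatView/ \;
Duplicate names then end up as pic.JPG, pic.JPG.~1~ and so on in the flat destination.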

How can I copy only the header files in an entire nested directory to another directory, keeping the same hierarchy after copying to the new folder?

I have a directory which has a lot of header files (.h) as well as .o, .c and other files.
This directory has many nested directories inside. I want to copy only the header files to a separate directory, preserving the same structure in the new directory.
cp -rf oldDirectory newDirectory will copy all files.
I want to copy only header files.
(cd src && find . -name '*.h' -print | tar --create --files-from -) | (cd dst && tar xvfp -)
You can do something similar with cpio if you just want to hard link the files instead of copying them, but it's likely to require a little mv'ing afterward. If you have lots of data and don't mind (or need!) the sharing, this can be much faster. It gets confused if dst needs to have a src directory in it, that is, if it isn't just a side effect:
find src -name '*.h' -print | cpio -pdlv dst
mv dst/src/* dst/.
rmdir dst/src
This one worked for me:
rsync -avm --include='*.h' -f 'hide,! */' . /destination_dir
from here
cp -rf oldDirectory/*.h newDirectory
or something like this (depending on the actual paths):
find oldDirectory -type f -name "*.h" -print0 | xargs -0 -I file cp file newDirectory
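Another option that does preserve the hierarchy is GNU cp's --parents flag, which recreates each file's relative path under the destination; a sketch, assuming newDirectory is given as an absolute path (shown here as a placeholder) and already exists:
cd oldDirectory && find . -type f -name '*.h' -exec cp --parents {} /absolute/path/to/newDirectory \;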
