How to have cron delete files over a certain size - cron

Can someone please tell me what command line to put into cron to delete all files over a certain size in a certain directory? Thank you
(I'm on an apache server... and I'm using the cpanel cron program)

Try using
find path/to/directory -type f -size +150k -delete
for specifying the file size in KB. If you need the limit in MB some other day, use 150M instead.
The command above deletes matching files in that directory and all of its subdirectories, so you may want to add the -maxdepth option to delete files in the directory itself but not in its subdirectories:
find path/to/directory -maxdepth 1 -type f -size +150k -delete
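To run this from cron, put the full command in a crontab entry. A minimal sketch that runs daily at 2:00 AM; the schedule, path, and size threshold are placeholders, and -delete assumes GNU find:
0 2 * * * /usr/bin/find /path/to/directory -maxdepth 1 -type f -size +150k -delete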

Related

Copy files within multiple directories to one directory

We have an Ubuntu server that is only accessed via terminal, and users transfer files to directories within one parent directory (e.g. /storage/DiskA/userA/doc1.doc, /storage/DiskA/userB/doc1.doc). I need to copy specific files from within the user folders to another dir, and I'm specifically targeting the .doc extension.
I've tried running the following:
cp -R /storage/diskA/*.doc /storage/diskB/monthly_report/
However, it keeps telling me there is no such file/dir.
I want to be able to just pull the .doc files from all the user dirs and transfer to that dir, /storage/monthly_report/.
I know this is an easy task, but apparently, I'm just daft enough to not be able to figure this out. Any assistance would be wonderful.
EDIT: I updated the original to show that I have 2 Disks. Moving from Disk A to Disk B.
I would go for find -exec for such a task, something like:
find /storage/DiskA -name "*.doc" -exec cp {} /storage/DiskB/monthly_report/ \;
That should do the trick.
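If the user directories contain many files, a variant that batches filenames into fewer cp invocations may be faster. A sketch assuming GNU cp, which provides the -t (target directory) option:
find /storage/DiskA -name '*.doc' -exec cp -t /storage/DiskB/monthly_report/ {} +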
Use
rsync -zarv --include="*/" --include="*.doc" --exclude="*" /storage/diskA/ /storage/diskB/monthly_report/
Note that the source is the directory itself, not a *.doc glob: the include/exclude rules already descend into every subdirectory and pick up only the .doc files, whereas a glob in the source would fail with "no such file" just like the cp attempt.

deletion of file after 7 days

I have to delete multiple files after 7 days regularly, and the deletion dates and locations are different for each file. Yes, I could apply a cron job for each folder separately, but that would involve many cron jobs (at least 15).
In order to avoid this, I want to create a script which will go to each folder and delete the data.
For example:
-rw-r--r-- 1 csbackup other 20223605295 Jun 12 06:40 IO.tgz
As you can see, IO.tgz was created on 12/06/2015 at 06:40; now I want to delete this file at 17/06/2015 00:00 hours. This is one reason I'm unable to use mtime: it would delete the file exactly 7*24 hours after creation.
I was thinking of comparing the timestamps of the files; however, the stat utility is not present on my machine, and it's not even allowing me to install it.
Can anyone please guide me to a script which I can use to delete files after n days?
You can make a list of directories you want to search in a file.
# cat file
/data
/d01
/u01/files/
Now you can loop over those directories and remove the old files one by one. Note -mtime +7 (older than 7 days); a bare 7 would match only files modified exactly 7 days ago:
while IFS= read -r dir; do
  find "$dir" -type f -mtime +7 -exec rm -f {} +
done < file
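Since the requirement is a midnight cutoff rather than exactly 7*24 hours, one approach that needs neither stat nor date arithmetic inside find is to stamp a reference file with the cutoff time and let find compare against it. A minimal sketch, assuming GNU date and touch; /tmp/cutoff is a placeholder path:
# stamp a reference file with midnight seven days ago
touch -d "$(date -d '7 days ago' +%F) 00:00" /tmp/cutoff
# inside the loop above: delete anything not newer than the reference
find "$dir" -type f ! -newer /tmp/cutoff -exec rm -f {} +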

Linux Shell Script to change file mode

I am entirely new to shell scripts. I know how to change the mode for a single file.
But, I need to meet the following requirement and my operating system is red hat enterprise linux
To find all files in a directory that have mode 640 and then change them to 644.
Likewise, I have 10 directories where I need to recursively find all the files and change their mode to 644.
Then, send an email with the names of the files whose mode has been changed.
Expecting your kind assistance to complete this requirement.
Some research points to get you going.
The find command can be used to find files of varying properties under a given point.
You can use it with -type to select specific file types (regular files, directories, etc).
You can use it with -maxdepth to restrict how deep it will go down the tree.
You can use it with -perm to select only files with specific permissions.
You can use it with -print to output the filenames, including capturing them to a text file for later mailing (with a tool like mailx).
You can use it with -exec to carry out arbitrary commands on each file matching the conditions.
chmod is the command to change permissions.
For example, the following command will find all regular files of the form *.dat, in the current directory (no subdirectories) and with permission 640, then change all those permissions to 644 (note that -maxdepth belongs before the other tests, or GNU find will warn):
find . -maxdepth 1 -type f -perm 640 -name '*.dat' -exec chmod 644 {} ';'
All these options, and more, can be found in the manpage for find with the command:
man find
or by looking for some good explanations, such as the GNU find documentation.
However, find is not a tool for the faint of heart; it will bite you at every opportunity. Expect to ask at least another ten questions here before you get what you need :-)
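Putting those pieces together, a minimal sketch of the whole requirement; the directory list, log file, and mail address are hypothetical, and a configured mailx is assumed:
#!/bin/sh
# hypothetical list of the 10 directories to process
DIRS="/srv/app/dir1 /srv/app/dir2"
CHANGED=/tmp/changed_files.txt
: > "$CHANGED"
for dir in $DIRS; do
    # print each file that currently has mode 640, then switch it to 644
    find "$dir" -type f -perm 640 -print -exec chmod 644 {} ';' >> "$CHANGED"
done
# send the list only if something was actually changed
[ -s "$CHANGED" ] && mailx -s "File modes changed to 644" admin@example.com < "$CHANGED"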

Linux find command questions

I do not have a working Linux system to try these commands on, so I am asking here whether what I am planning to do is correct. (Doing this while I am downloading an ISO via a connection that I think dial-up would beat.)
1, I am trying to find all files with the .log extension in the /var/log directory and its sub-directories, writing standard out to logdata.txt and standard error to logerrors.txt.
I believe the command would be:
$ find /var/log/ -name *.log 1>logdata.txt 2>/home/username/logs/logerrors.txt
2, Find all files with .conf in the /etc directory. Standard out will go to a file called etcdata and standard error to etcerrors.
$ find /etc -name *.conf 1>etcdata 2>etcerrors
3, Find all files that have been modified in the last 30 minutes in the /var directory. Standard out is to go into vardata and errors into varerrors.
Would that be:
$ find /var -mmin 30 1>vardata 2>varerrors
Are these correct? If not what am I doing wrong?
1, I am trying to find all files with the .log extension in the /var/log directory and its sub-directories, writing standard out to logdata.txt and standard error to logerrors.txt.
Here you go:
find /var/log/ -name '*.log' >logdata.txt 2>/home/username/logs/logerrors.txt
Notes:
You need to quote '*.log', otherwise the shell will expand it before passing it to find (see the example after these notes).
No need to write 1>file, >file is enough
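To see why the quoting matters: if the directory you run the command from happened to contain a matching file, say app.log (a hypothetical name), the shell would rewrite the pattern before find ever saw it:
$ find /var/log/ -name *.log     # the shell turns this into: find /var/log/ -name app.log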
2, Find all files with .conf in the /etc directory. Standard out will go to a file called etcdata and standard error to etcerrors.
As earlier:
find /etc -name \*.conf >etcdata 2>etcerrors
Here I escaped the * another way, for the sake of an example. This is equivalent to '*.conf'.
3, Find all files that have been modified in the last 30 minutes in the /var directory. Standard out is to go into vardata and errors into varerrors.
find /var -mmin -30 >vardata 2>varerrors
I changed -mmin 30 to -mmin -30. This way it matches files modified within the last 30 minutes. Otherwise it matches only files that were modified exactly 30 minutes ago.
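The same sign convention applies to find's other numeric tests. For illustration:
find /var -mmin -30    # modified less than 30 minutes ago
find /var -mmin 30     # modified exactly 30 minutes ago (rounded)
find /var -mmin +30    # modified more than 30 minutes ago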
When using wildcards in the command, you need to make sure that they do not get interpreted by the shell. So, it is better to include the expression with wildcards in quotes. Thus, the first one will be:
find /var/log/ -name "*.log" 1>logdata.txt 2>/home/username/logs/logerrors.txt
Same comment on the second one where you should have "*.conf".

Find folders with specific name and no symlink pointing to them

I'm trying to write a shell script under Linux which lists all folders (recursively) with a certain name that have no symlink pointing to them.
For example, I have:
/home/htdocs/cust1/typo3_src-4.2.11
/home/htdocs/cust2/typo3_src-4.2.12
/home/htdocs/cust3/typo3_src-4.2.12
Now I want to go through all subdirectories of /home/htdocs and find those folders typo3_*, that are not pointed to from somewhere.
Should be possible with a shellscript or a command, but I have no idea how.
Thanks for your help
Stefan
I think none of the common file systems record in a file's inode whether any symlinks point to it, so you would have to scan all other files to see whether each one is a symlink to the folder in question. If you don't limit the depth of the search, this can take a very long time. If you want to perform that search in /home/htdocs, for example, it would work something like this:
# find matching folders:
find /home/htdocs -name 'typo3_*' -type d | while IFS= read -r folder; do
    # list all symlinks pointing to $folder (grep -vxF drops the folder itself)
    find -L /home/htdocs -samefile "$folder" | grep -vxF "$folder"
done
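The loop above prints the symlinks that do exist. To get what the question asks for, the folders with no symlink pointing at them, print a folder only when that inner search comes back empty. A sketch under the same assumptions:
find /home/htdocs -name 'typo3_*' -type d | while IFS= read -r folder; do
    links=$(find -L /home/htdocs -samefile "$folder" | grep -vxF "$folder")
    [ -z "$links" ] && echo "$folder"
done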
