moving monthly linux files to another folder script - linux

Sorry, but I'm a beginner with Linux shell scripting.
I'm trying to automatically move files to another folder monthly, on the 26th day of the current month (Linux system).
I would appreciate it if somebody could help!
Thank you in advance!
I have files in this format: xxxxx_181025.txt. I want them to be moved to another folder on every 26th day of the month (but only files generated during the current month, in this case October). I need some help on how to specify that only files of the current month should be moved.
cd /actual_folder
_y='%y'
_m='%m'
_d='%d'
# Filenames carry a two-digit year, month and day, e.g. xxxxx_181025.txt
TIMESTAMP=$(date "+$_y$_m$_d")
mv xxxxx_"$TIMESTAMP".txt /new_folder/xxxxx_"$TIMESTAMP".txt

find /actual_folder -type f -mtime -26 -exec mv {} /new_folder/ \;
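Since the filenames already carry the date, another option is to match the current month in the name instead of relying on mtime. A minimal sketch, assuming the names follow xxxxx_YYMMDD.txt and that /actual_folder and /new_folder stand in for the real paths:
#!/bin/bash
# Current two-digit year and month, e.g. 1810 for October 2018
month=$(date +%y%m)
# Move only files whose embedded date starts with the current year+month
mv /actual_folder/*_"${month}"??.txt /new_folder/
Saved as a script, it can be scheduled from cron on the 26th of every month, e.g. 0 2 26 * * /path/to/move_monthly.sh (the script path is a placeholder).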

Related

Copy files within multiple directories to one directory

We have an Ubuntu Server that is only accessed via terminal, and users transfer files to directories within one parent directory (e.g. /storage/DiskA/userA/doc1.doc, /storage/DiskA/userB/doc1.doc). I need to copy all of the relevant files within the user folders to another directory, and I'm specifically targeting the .doc extension.
I've tried running the following:
cp -R /storage/diskA/*.doc /storage/diskB/monthly_report/
However, it keeps telling me there is no such file/dir.
I want to be able to just pull the .doc files from all the user dirs and transfer to that dir, /storage/monthly_report/.
I know this is an easy task, but apparently, I'm just daft enough to not be able to figure this out. Any assistance would be wonderful.
EDIT: I updated the original to show that I have 2 Disks. Moving from Disk A to Disk B.
I would go for find -exec for such a task, something like:
find /storage/DiskA -name "*.doc" -exec cp {} /storage/DiskB/monthly_report/ \;
That should do the trick.
Use rsync with include/exclude filters:
rsync -zarv --include="*/" --include="*.doc" --exclude="*" /storage/diskA/ /storage/diskB/monthly_report/
Note that the source must be the directory itself (not a *.doc glob) so rsync can recurse and apply the filters; it will recreate the per-user directory structure under monthly_report/, whereas the find approach above copies everything flat into one directory.
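A dry run with rsync's -n/--dry-run flag shows what would be transferred before anything is written, which is worth doing once with filter rules like these:
# Preview the transfer; nothing is copied with -n
rsync -zarvn --include="*/" --include="*.doc" --exclude="*" /storage/diskA/ /storage/diskB/monthly_report/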

using an exclude list for an rsync cronjob

I have a cron job on my computer set to run on an external volume each night. I would like to add an exclude list as part of this. However, I can't figure out where to place the list in order to have it read by the rsync command. I've tried placing it in my /usr/local/bin/ which is where the script that my cron job uses is, and putting just the name of the exclude file in the script. I've also tried putting the script on the external volume and providing the entire path. I know the list itself works because I tested it with files on my desktop. I would appreciate any help! The rsync script itself works as expected, I just need to figure out this one issue.
This is the code:
DATE=$(date +%Y%m%d-%H%M%S)
ICE="/Volumes/ice"
RTG="/Volumes/ice/rtg/"
LOGFILE="/Users/diunt-02/Desktop/${DATE}_ICEnightmoves.log"
find "${ICE}" -type d -iname 'X*X' -exec rsync -rthvP --exclude-from= 'exclude.txt' --exclude=".*" --exclude="jpegs" --exclude="*.jpg" --log-file="${LOGFILE}" --log-file-format="${DATE} '%f' %l " --remove-source-files "{}/" "${RTG}" \;

Cron Job to auto delete folder older than 7 days Linux

I am having an issue storing my server backups on a storage VPS. My server is not deleting old backup folders, so the storage fills up and the backup fails midway. My backup runs once every week.
Can anyone help me create a cron job script that deletes folders older than 7 days and runs one day before the backup?
Any help appreciated.
For example, the crontab entry for deleting files older than 7 days under /path/to/backup/ every day at 4:02 AM is as follows.
02 4 * * * find /path/to/backup/* -mtime +7 -exec rm {} \;
Before executing rm, please make sure the targets are the intended files. You can check the targets by specifying -ls as the argument of find instead.
find /path/to/backup/* -mtime +7 -ls
mtime is the last-modification timestamp, so depending on how the backups are written, the results of find may not be the files you expect.
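Since the question is about whole backup folders, a variant of the same idea removes directories rather than individual files. A sketch, assuming each backup is its own directory directly under /path/to/backup/ (paths and schedule are placeholders):
# Remove backup directories (and their contents) older than 7 days, daily at 4:02 AM
02 4 * * * find /path/to/backup/ -mindepth 1 -maxdepth 1 -type d -mtime +7 -exec rm -rf {} +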

deletion of file after 7 days

I have to delete multiple files after 7 days on a regular basis, and the deletion dates and locations are different for each file. Yes, I could apply a cron job to each folder separately, but that would involve many cron jobs (at least 15).
In order to avoid this, I want to create a script which will go to each folder and delete the data.
For example:
-rw-r--r-- 1 csbackup other 20223605295 Jun 12 06:40 IO.tgz
As you can see, IO.tgz was created on 12/06/2015 at 06:40. Now I want to delete this file at 17/06/2015 00:00 hours. This is one reason I'm unable to use mtime, as it would delete the file exactly 7*24 hours after creation.
I was thinking of comparing the timestamps of the files, but the stat utility is not present on my machine, and it's not even letting me install it.
Can anyone please guide me with a script which I can use to delete files after n days?
You can make a list of directories you want to search in a file.
# cat file
/data
/d01
/u01/files/
Now you can use a for loop to remove the files in those directories one by one.
for dir in $(cat file); do
    # -mtime +7 matches files last modified more than 7 days ago
    find "$dir" -type f -mtime +7 -exec rm -f {} +
done
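If the worry is that deletion should happen on a calendar-day boundary (midnight) rather than exactly 7*24 hours after creation, GNU find's -daystart option measures file age from the start of today. A sketch, assuming GNU find and the same list-of-directories file, with the threshold adjusted to the retention you actually want:
for dir in $(cat file); do
    # With -daystart, -mtime counts whole days from midnight, so a job run
    # from cron at 00:00 removes files on day boundaries.
    find "$dir" -type f -daystart -mtime +6 -exec rm -f {} +
done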

Batch rename log files from one time format to another

I'm looking for a way to batch rename almost 1,000 log files created by an Eggdrop bot. A few years ago, I had to set up my bot from scratch and neglected to set the log format properly, so all of these files now have the format:
channelname.log.%d%b%Y (channelname.log.14Jan2014)
I want to rename all those files to match all my old log files, which are in the format of:
channelname.log.%Y%m%d (channelname.log.20140101)
I've already made the change in my eggdrop.conf file, but I would like to rename all the newer log files to match the format of the old ones.
This is on a Linux shell, so some sort of bash command would be ideal. Thanks!
find . -type f -name '*.log.*[^0-9-]*' -print0 | while read -d '' -r logfile; do
    # Parse the old %d%b%Y suffix (e.g. 14Jan2014) and rewrite it as %Y%m%d
    mv "${logfile}" "${logfile/.log.*/.log.$(date -d "${logfile#*.log.}" +%Y%m%d)}"
done
Assuming it's in a locale date knows how to handle.
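One way to sanity-check it first is to prefix the mv with echo so the renames are only printed; once the output looks right, drop the echo and run it for real:
find . -type f -name '*.log.*[^0-9-]*' -print0 | while read -d '' -r logfile; do
    # Print the mv commands instead of executing them
    echo mv "${logfile}" "${logfile/.log.*/.log.$(date -d "${logfile#*.log.}" +%Y%m%d)}"
done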
