using an exclude list for an rsync cronjob - cron

I have a cron job on my computer set to run on an external volume each night. I would like to add an exclude list as part of this, but I can't figure out where to place the list in order to have it read by the rsync command. I've tried placing it in /usr/local/bin/, which is where the script my cron job runs lives, and putting just the name of the exclude file in the script. I've also tried putting the script on the external volume and providing the entire path. I know the list itself works because I tested it with files on my desktop. The rsync script itself works as expected; I just need to figure out this one issue. I would appreciate any help!
This is the code:
DATE=$(date +%Y%m%d-%H%M%S)
ICE="/Volumes/ice"
RTG="/Volumes/ice/rtg/"
LOGFILE="/Users/diunt-02/Desktop/${DATE}_ICEnightmoves.log"
find "${ICE}" -type d -iname 'X*X' -exec rsync -rthvP --exclude-from= 'exclude.txt' --exclude=".*" --exclude="jpegs" --exclude="*.jpg" --log-file="${LOGFILE}" --log-file-format="${DATE} '%f' %l " --remove-source-files "{}/" "${RTG}" \;

Related

Shell script does not run completely when run by cron

The script in the file modBackup.sh does not run completely when started by cron; the result is a corrupted tar.gz file that is many times smaller than the one produced when I run the script manually. It still creates some content, but the archive is damaged and cannot be opened normally.
file modBackup.sh:
#!/bin/sh
find /home/share/ -mmin -720 -type f -exec tar -rvf /mnt/archives/`date +%F`-modified.tar.gz "{}" +
The automatic run seems to be interrupted and does not finish.
When I run it manually, the script creates a valid archive named [current date]-modified.tar.gz.
Here is the crontab -e:
00 18 * * 1-5 /home/myScripts/modBackup.sh
Edit:
There is no information in the logs except that crond has started: nothing in the mail log, the cron log, or messages.
(I use a very old CentOS :( but I don't think this is the reason for the error.)
For testing only, I added %H%M to the file name in the script and did the following:
I ran it manually: sh /home/myScripts/modBackup.sh
and set crontab -e to run the same command two minutes later.
After a few minutes, two files appeared and grew at the same time, but then the one created by the cron job stopped growing.
I used the same GUI tool (Archive Manager) to try to open both.
The file created by manually starting the script opens, but the one from the cron job cannot be opened, even after I changed the extension; the error is 'unexpected EOF in archive'.
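Before changing the script, one simple way to see why the cron run dies is to capture its output and errors, since cron itself only logs that the job started; for example (the log path here is only an illustration):
00 18 * * 1-5 /home/myScripts/modBackup.sh >> /tmp/modBackup.log 2>&1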
Suggestion: include the user's environment context, so that $PATH and other environment variables critical for the application are set:
modBackup.sh:
#!/bin/sh
source ~/.profile
find /home/share/ -mmin -720 -type f -exec tar -rvf /mnt/archives/`date +%F`-modified.tar.gz "{}" +
I found that in the cron environment the "find" command misinterprets filenames containing certain characters, even after explicitly setting the encoding by adding "export LANG=en_US.UTF-8; LC_CTYPE=..." at the beginning of the script. Many other combinations and attempts brought no success.
That's why I dropped the "find" command and instead use tar's option to archive recently modified files. This works perfectly now:
fromDate=$(date --date='15 hours ago')
/bin/tar -N "$fromDate" -zcf /mnt/archives/`date +%F-%H%M`-share.modified.tar.gz /home/share/
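Putting the pieces together, a rough sketch of the reworked modBackup.sh (assuming GNU tar and GNU date, as the options above imply):
#!/bin/sh
# Pull in the user's environment so PATH and locale match an interactive shell.
. ~/.profile
# Archive everything under /home/share/ modified within roughly the last 15 hours.
fromDate=$(date --date='15 hours ago')
/bin/tar -N "$fromDate" -zcf /mnt/archives/`date +%F-%H%M`-share.modified.tar.gz /home/share/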

Copy files within multiple directories to one directory

We have an Ubuntu Server that is only accessed via terminal, and users transfer files to directories within one parent directory (e.g. /storage/DiskA/userA/doc1.doc, /storage/DiskA/userB/doc1.doc). I need to copy specific files from within the user folders to another directory, and I'm specifically targeting the .doc extension.
I've tried running the following:
cp -R /storage/diskA/*.doc /storage/diskB/monthly_report/
However, it keeps telling me there is no such file/dir.
I want to be able to just pull the .doc files from all the user dirs and transfer to that dir, /storage/monthly_report/.
I know this is an easy task, but apparently, I'm just daft enough to not be able to figure this out. Any assistance would be wonderful.
EDIT: I updated the original to show that I have 2 Disks. Moving from Disk A to Disk B.
I would go for find -exec for such a task, something like:
find /storage/DiskA -name "*.doc" -exec cp {} /storage/DiskB/monthly_report/ \;
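One caveat: in the example layout both userA and userB have a doc1.doc, and plain cp will silently overwrite one with the other in the single target directory. A hedged variant that keeps every copy, assuming GNU cp and its --backup option:
find /storage/DiskA -name "*.doc" -exec cp --backup=numbered {} /storage/DiskB/monthly_report/ \;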
That should do the trick.
Alternatively, use rsync with include/exclude filters (note that this recreates the per-user subdirectory structure under the target):
rsync -zarv --include="*/" --include="*.doc" --exclude="*" /storage/diskA/ /storage/diskB/monthly_report/

how to delete all files in directory except one folder and one file?

I have an application which has one folder called vendor and one file called .env. Whenever I automatically publish my source code files to the folder, all old files should be deleted except these two.
How can I do this in Linux using the shell?
PS: I am trying to implement a rollback mechanism in Jenkins. I will copy artifacts from an old build and transfer them to the server using SSH. But this is only a copy operation, so I want to delete the previous files before starting the copy over SSH.
You can use find:
find . -mindepth 1 -maxdepth 1 ! \( -name 'name1' -o -name 'name2' \) -exec rm -rf {} +
(-maxdepth 1 keeps find from descending into the directories you want to keep and deleting their contents; -mindepth 1 stops it from matching . itself.)
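Applied to the vendor directory and .env file from the question, run from inside the application folder, that becomes:
find . -mindepth 1 -maxdepth 1 ! \( -name 'vendor' -o -name '.env' \) -exec rm -rf {} +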
Or try this command (the !(pattern) syntax needs bash with the extglob option enabled):
rm !(<filename>)

I have to write an automation script

I have to back up and delete old log files every month. I will delete files older than 6 months and back up files older than 2 months as a zip file.
I am trying to write a script that will automate this and run every month, instead of me doing it manually every time.
I have the UNIX commands to do it, but I need to put them into a script file which will run automatically on the specified day.
You can schedule a daily cron job which runs commands inside a script, such as:
find foldername -mtime +120 -name "*.log" -exec gzip {} \;
The above will compress all .log files older than 120 days. The pattern inside the quotes after -name can be changed to suit your requirement, as can the +120.
find foldername -mtime +180 -type f -exec rm {} \;
The above will remove all files inside foldername older than 180 days.
For the automation part, you can look at the wiki link provided in the answer below, though I will include it in my answer too.
You can use crontab to schedule commands (https://en.wikipedia.org/wiki/Cron)
You can use crontab to schedule commands (https://en.wikipedia.org/wiki/Cron)
You can add an entry by typing crontab -e and use it to schedule jobs, after adding your UNIX commands to a script.
For example, if you have a /home/test/test.sh file, you can run it every day by adding the below to your crontab:
0 0 * * * /home/test/test.sh
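Tying the two find commands above to such a schedule, a rough sketch of the monthly script (foldername and the script path are placeholders to adjust; 60 days stands in for the question's two months):
#!/bin/sh
# Compress .log files older than roughly two months.
find foldername -mtime +60 -name "*.log" -exec gzip {} \;
# Remove anything older than roughly six months.
find foldername -mtime +180 -type f -exec rm {} \;
and a crontab entry to run it at 01:00 on the first day of each month:
0 1 1 * * /home/test/monthly_cleanup.sh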

How can I delete files that are not used in code files in linux?

I am running Fedora 18 Linux and I have a PHP project that I have been working on for some time. I am trying to clean things up for a production deploy of a web application. I have a folder of icon images that over time has collected files that are no longer used in my code, either because I changed to a different icon in the code or because the image file was only used to create other icons. What I would like to do is make a backup copy of the entire code project and then, hopefully using a combination of find, rm and grep on the command line, scan the entire folder of images and delete any image that is not used anywhere in my code files. I did some searching on the web and found things that locate a line of text in a file and delete it, but nothing quite like what I am trying to do.
Any help is appreciated...
So here is what I came up with: a shell script that does what I need. For the benefit of those who stumble upon this, and for those who want to critique my solution, here it is. I chose to skip the .xcf files because these are only used to create many of the icon files, and some of the .png image names would grep to these .xcf files.
#!/bin/bash
FILES=/var/www/html/support_desk/templates/default/images/icons/*
codedir=/var/www/html/support_desk_branch/
for f in $FILES
do
  bn=$(basename "$f")
  ext="${bn##*.}"
  echo "Processing $bn file..."
  # Delete the image only if its name is not referenced anywhere in the code tree,
  # and never delete the .xcf source files.
  if ! fgrep --quiet -R "$bn" "$codedir"; then
    if [ "$ext" != 'xcf' ]; then
      rm "$f"
    fi
  fi
done
Now I have ONLY the image files that are used in the PHP script files. So as not to miss any of the icon files used in the menu, which is defined in a table in a MySQL database, I created an SQL dump of that table's data and put it in the application files' path before running the shell script.
The simplest way to find unused icon files would be to do a build of your complete project and then look at the access-times of the icon-files. Those that were not read recently (including with grep, of course) would show up readily.
For instance, supposing that you did a backup an hour ago, and did a build ten minutes ago — the access times would be distinct. Then
find . -amin +15 -type f
should give a nice list of "unused" files. If you're sure of the list (you did do a backup, right?) then you could purge the unused files:
find . -amin +15 -type f -exec rm -i {} \;
If you are really certain, you can remove the -i option.
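One caveat to this approach: many Linux systems mount filesystems with the relatime or noatime options, which can keep access times from updating on every read and make the -amin test unreliable. It may be worth checking the mount options for the project's filesystem first, for example:
findmnt -no OPTIONS -T /var/www/html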
