Delete all files in a directory, except those matching specific criteria - Linux

I need to automate a clean-up of a Linux-based FTP server that only holds backup files.
In our "/var/DATA" directory is a collection of directories. Any directory here used for backup begins with "DEV". In each "DEVxxx*" directory are the actual backup files, plus any user files that may have been needed in the course of maintenance on these devices.
We only want to retain the following files - anything else found in these "DEVxxx*" directories is to be deleted:
The newest two backups: ls -t1 | grep -E -m2 '^[[:digit:]]{8}_Config'
The newest backup done on the first of the month: ls -t1 | grep -E -m1 '^[[:digit:]]{6}01_Config'
Any file that was modified less than 30 days ago: find -mtime -30
Our good configuration file: ls verification_cfg
Anything that doesn't match the above should be deleted.
How can we script this?
I'm guessing a Bash script can do this, and that we can create a cron job to run it daily.

Something like this perhaps?
{ ls -t1 | grep -E -m2 '^[[:digit:]]{8}_Config' ;
  ls -t1 | grep -E -m1 '^[[:digit:]]{6}01_Config' ;
  find . -mtime -30 ;
  ls -1 verification_cfg ;
} | rsync -a --include-from=- --exclude='*' /var/DATA/ /var/DATA.bak/
rm -rf /var/DATA
mv /var/DATA.bak /var/DATA

For what it's worth, here is the bash script I created to accomplish my task. Comments are welcome.
#!/bin/bash
# This script follows these rules:
#
# - Only process directories beginning with "DEV"
# - Do not process directories within the device directory
# - Keep files that match the following criteria:
# - Keep the two newest automated backups
# - Keep the six newest automated backups generated on the first of the month
# - Keep any file that is less than 30 days old
# - Keep the file "verification_cfg"
#
# - An automated backup file is identified as eight digits, followed by "_Config"
# e.g. 20120329_Config
# Remember the current directory
CurDir=$(pwd)

# FTP home directory
DatDir='/var/DATA/'
cd "$DatDir"

# Only process directories beginning with "DEV"
for i in $(find . -maxdepth 1 -type d | grep -E '^\./DEV' | sort) ; do
    cd "$DatDir"
    echo "Doing $i"
    cd "$i"

    # Set the GROUP EXECUTE bit on all files
    find . -type f -exec chmod g+x {} \;

    # Find the two newest automated config backups
    for j in $(ls -t1 | grep -E -m2 '^[0-9]{8}_Config$') ; do
        chmod g-x "$j"
    done

    # Find the six newest automated config backups generated on the first of the month
    for j in $(ls -t1 | grep -E -m6 '^[0-9]{6}01_Config$') ; do
        chmod g-x "$j"
    done

    # Find all files that are less than 30 days old
    for j in $(find . -mtime -30 -type f) ; do
        chmod g-x "$j"
    done

    # Find the "verification_cfg" file
    for j in $(find . -name verification_cfg) ; do
        chmod g-x "$j"
    done

    # Remove any files that still have the GROUP EXECUTE bit set
    find . -type f -perm -g=x -exec rm -f {} \;
done

# Back to the user's current directory
cd "$CurDir"
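For the daily cron job mentioned in the question, a minimal sketch, assuming the script above is saved as a hypothetical /usr/local/bin/clean_dev_backups.sh and made executable:
# /etc/cron.d/clean-dev-backups (hypothetical file) - run the clean-up every day at 02:30 as root
30 2 * * * root /usr/local/bin/clean_dev_backups.sh >> /var/log/clean_dev_backups.log 2>&1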

Related

How to compress multiple folders separately into another folder as tar.gz

How can I compress files in this scenario?
I have folder structure like this:
User1:
/home/user1/websites/website1
/home/user1/websites/website2
User2:
/home/user2/websites/website1
/home/user2/websites/website2
/home/user2/websites/website3
And now I need to do backups like this:
Folders for backups per user:
/backup/date/websites/user1/
/backup/date/websites/user2/
And I need a separate tar backup per website in each user's directory, like this:
/backup/date/websites/user1/website1.tar.gz
/backup/date/websites/user1/website2.tar.gz
/backup/date/websites/user2/website1.tar.gz
/backup/date/websites/user2/website2.tar.gz
/backup/date/websites/user2/website3.tar.gz
I have a script like this one that does half of the work:
#VARIABLES
BKP_DATE=`date +"%F"`
BKP_DATETIME=`date +"%H-%M"`
#BACKUPS FOLDERS
BKP_DEST=/backup/users
BKP_DEST_DATE=$BKP_DEST/$BKP_DATE
BKP_DEST_TIME=$BKP_DEST_DATE/$BKP_DATETIME
BACKUP_DIR=$BKP_DEST_TIME
#NUMBER OF DAYS TO KEEP ARCHIVES IN BACKUP DIRECTORY
KEEP_DAYS=7
#Create folders
mkdir -p $BKP_DEST_DATE
mkdir -p $BKP_DEST_TIME
mkdir -p $BACKUP_DIR
#DELETE FILES OLDER THAN {*} DAYS IN BACKUP SERVER DIRECTORY
#echo 'Deleting backup folder older than '${KEEP_DAYS}' days'
find $BKP_DEST/* -type d -ctime +${KEEP_DAYS} -exec rm -rf {} \;
#Do backups
#List only names available users data directories
usersdirectories=`cd /home && find * -maxdepth 0 -type d | grep -Ev "(tmp|root)"`
#Creating directories per user name
for i in $usersdirectories; do
mkdir -p $BACKUP_DIR/$i/websites
done
But as you can see, I don't know how to create the separate tar archives. In my half-finished script I have done the following:
Create the folder structure for backups by datetime (/backup/users/day/hour-minutes)
Create the folder structure for backups by user name (/backup/users/day/hour-minutes/user1)
Thanks to everyone who tries to help me!
I will try to complete your script, but I can't debug it because your environment is hard to reproduce. In the future it would be better to provide a minimal reproducible example.
#VARIABLES
BKP_DATE=$(date +"%F")
BKP_DATETIME=$(date +"%H-%M")
#BACKUPS FOLDERS
BKP_DEST=/backup/users
BKP_DEST_DATE=$BKP_DEST/$BKP_DATE
BKP_DEST_TIME=$BKP_DEST_DATE/$BKP_DATETIME
BACKUP_DIR=$BKP_DEST_TIME
#NUMBER OF DAYS TO KEEP ARCHIVES IN BACKUP DIRECTORY
KEEP_DAYS=7
#Create folders
mkdir -p $BKP_DEST_DATE
mkdir -p $BKP_DEST_TIME
mkdir -p $BACKUP_DIR
#DELETE FILES OLDER THAN {*} DAYS IN BACKUP SERVER DIRECTORY
#echo 'Deleting backup folder older than '${KEEP_DAYS}' days'
find $BKP_DEST/* -type d -ctime +${KEEP_DAYS} -exec rm -rf {} \;
#Do backups
#List only names available users data directories
usersdirectories=$(cd /home && find * -maxdepth 0 -type d | grep -Ev "(tmp|root)")
#Creating directories per user name
for i in $usersdirectories; do
    for w in /home/$i/websites/*; do
        ws=$(basename $w)
        mkdir -p $BACKUP_DIR/$i/websites
        tar -czvf $BACKUP_DIR/$i/websites/$ws.tar.gz /home/$i/websites/$ws
    done
done
I suppose there are no blanks inside the directory names website1, etc...
I also replaced the deprecated backtick operators in your code with $(...).
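If blanks ever do appear in those names, a space-safe sketch of the same per-website loop (reusing the BACKUP_DIR variable defined above) could look like this:
for userdir in /home/*/ ; do
    i=$(basename "$userdir")
    case $i in tmp|root) continue ;; esac      # roughly the same exclusions as the find | grep above
    for w in "$userdir"websites/*/ ; do
        [ -d "$w" ] || continue                # skip if the glob matched nothing
        ws=$(basename "$w")
        mkdir -p "$BACKUP_DIR/$i/websites"
        tar -czf "$BACKUP_DIR/$i/websites/$ws.tar.gz" -C "${userdir}websites" "$ws"
    done
done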
I wish I could comment to ask for more clarification but I will attempt to help you.
#!/bin/bash
# example for user1
archive_path=/backup/data/websites/user1
file_path=/home/user1/websites/*
for i in $file_path
do
    tar czf $archive_path/$(basename $i).tar.gz $i
done
Okay, after a small fix, the version from Pierre is working now. If you want, you can adjust it for your needs.
#VARIABLES
BKP_DATE=$(date +"%F")
BKP_DATETIME=$(date +"%H-%M")
#BACKUPS FOLDERS
BKP_DEST=/backup
BKP_DEST_DATE=$BKP_DEST/$BKP_DATE
BKP_DEST_TIME=$BKP_DEST_DATE/$BKP_DATETIME
BACKUP_DIR=$BKP_DEST_TIME
#NUMBER OF DAYS TO KEEP ARCHIVES IN BACKUP DIRECTORY
KEEP_DAYS=7
#Create folders
mkdir -p $BKP_DEST_DATE
mkdir -p $BKP_DEST_TIME
mkdir -p $BACKUP_DIR
#DELETE FILES OLDER THAN {*} DAYS IN BACKUP SERVER DIRECTORY
#echo 'Deleting backup folder older than '${KEEP_DAYS}' days'
find $BKP_DEST/* -type d -ctime +${KEEP_DAYS} -exec rm -rf {} \;
#Do backups
#List only names available users data directories
usersdirectories=$(cd /home/user && find * -maxdepth 0 -type d | grep -Ev "(tmp|root)")
#Creating directories per user name
for i in $usersdirectories; do
    for w in /home/user/$i/*; do
        #echo ok
        ws=$(basename $w)
        echo mkdir -p $BACKUP_DIR/$i
        echo tar -czvf $BACKUP_DIR/$i/$ws.tar.gz /home/user/$i/$ws
    done
done
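Note that, as posted, the inner loop only echoes the mkdir and tar commands (a dry run). To actually create the archives, drop the leading echo:
mkdir -p $BACKUP_DIR/$i
tar -czvf $BACKUP_DIR/$i/$ws.tar.gz /home/user/$i/$ws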

Linux: Search for old files, copy the oldest ones to a location (and verify the copy), then delete them

I need help with file handling on Raspbian Stretch Lite running on a Raspberry Pi Zero (fresh install, updated).
The following script is run periodically as a cron job:
partition=/dev/root
imagedir=/etc/opt/kerberosio/capture/
if [[ $(df -h | grep $partition | head -1 | awk -F' ' '{ print $5/1 }' | tr ['%'] ["0"]) -gt 90 ]];
then
echo "Cleaning disk"
find $imagedir -type f | sort | head -n 100 | xargs -r rm -rf;
fi;
Essentially when the SD card is >90% full the oldest 100 files in a directory are deleted.
I want to add some functionality:
1) Copy the 100 oldest files to a NAS drive mounted on the file system and
2) Verify successful copy and
3) Delete the files that were copied.
I have found the following command, which may be helpful in modifying the script above:
find /data/machinery/capture/ -type f -name '*.*' -mtime +1 -exec mv {} /data/machinery/nas/ \;
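Not tested on that setup, but a minimal sketch of those three steps, assuming a GNU userland, that the NAS share is mounted at a hypothetical /mnt/nas/capture, and that file names contain no newlines:
partition=/dev/root
imagedir=/etc/opt/kerberosio/capture/
nasdir=/mnt/nas/capture        # hypothetical mount point for the NAS share

if [[ $(df -h | grep $partition | head -1 | awk -F' ' '{ print $5/1 }' | tr ['%'] ["0"]) -gt 90 ]]; then
    echo "Cleaning disk"
    # Oldest 100 files by modification time, oldest first (GNU find)
    find "$imagedir" -type f -printf '%T@ %p\n' | sort -n | head -n 100 | cut -d' ' -f2- |
    while IFS= read -r f ; do
        # Copy, verify byte-for-byte, and only then delete the original
        if cp -p "$f" "$nasdir/" && cmp -s "$f" "$nasdir/$(basename "$f")" ; then
            rm -f "$f"
        else
            echo "Copy of $f failed; keeping original" >&2
        fi
    done
fi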

Copy two most recent files to another directory using bash script

I'm trying to create a bash script to create daily backups of a MySQL db and a web directory. It should tar them, then copy the two most recent .tar.gz files to a weekly directory on day 0 of each week, a monthly directory on day 1 of each month, and a yearly directory on day 1 of each year.
I'm having issues trying to get the 'copy the two most recent files' part to work.
What I've got so far (I used the script from https://github.com/dlabey/Simple-Linux-Bash-Rotating-Backup-Script as a base):
#!/bin/sh
# https://github.com/dlabey/Simple-Linux-Bash-Rotating-Backup-Script
# Local Source
SOURCE=/path/to/source
# Create directories etc here
DIR=/path/to/backups
# Local Destination
DESTINATION=/path/to/network/share
# Direct all output to logfile found here
#LOG=$$.log
#exec > $LOG 2>&1
# Database Backup User
DATABASE='wordpress'
DATABASE_USER='dbuser'
DATABASE_PASSWORD='password'
DATABASE_HOST='localhost'
# DO NOT EDIT ANYTHING BELOW THIS
# Date Variables
DAY_OF_YEAR=$(date '+%j')
DAY_OF_MONTH=$(date '+%d')
DAY_OF_WEEK_RAW=$(date '+%w')
WEEK_OF_YEAR=$(date '+%W')
DAY_OF_WEEK=$((DAY_OF_WEEK_RAW + 1))
DAY=$(date '+%a')
DOW=$(date '+%u')
NOW=$(date +"%Y-%m-%d-%H%M")
MONTH=$(date '+%m')
YEAR=$(date '+%Y')
#LATEST=$(ls -t | head -1)
#LATEST_DAILY=$(find $DIR/tmp/daily/ -name '*.tar.gz' | sort -n | tail -3)
#DAILY=$(find $DIR/tmp/daily/ -name *.tar.gz | sort -n | head -2)
#DAILY=$(ls -1tr $DIR/tmp/daily/ | tail -2 )
DAILY=$(find $DIR/tmp/daily/ -name *.tar.gz | sort -n | head -2)
# Direct all output to logfile found here
# LOG=$DIR/logs/$$.log
# exec > $LOG 2>&1
# Make Temporary Folder
if [ ! -d "$DIR/tmp" ]; then
mkdir "$DIR/tmp"
echo 'Created tmp directory...'
fi
# Make Daily Folder
if [ ! -d "$DIR/tmp/daily" ]; then
mkdir "$DIR/tmp/daily"
echo 'Created daily directory...'
fi
# Make Weekly Folder
if [ ! -d "$DIR/tmp/weekly" ]; then
mkdir "$DIR/tmp/weekly"
echo 'Created weekly directory...'
fi
# Make Folder For Current Year
if [ ! -d "$DIR/tmp/${YEAR}" ]; then
mkdir "$DIR/tmp/${YEAR}"
echo 'Directory for current year created...'
fi
# Make Folder For Current Month
if [ ! -d "$DIR/tmp/${YEAR}/$MONTH" ]; then
mkdir "$DIR/tmp/${YEAR}/$MONTH"
echo 'Directory for current month created...'
fi
# Make The Daily Backup
tar -zcvf $DIR/tmp/daily/${NOW}_files.tar.gz $SOURCE
mysqldump -h $DATABASE_HOST -u $DATABASE_USER -p$DATABASE_PASSWORD $DATABASE > $DIR/tmp/database.sql
tar -zcvf $DIR/tmp/daily/${NOW}_database.tar.gz $DIR/tmp/database.sql
rm -rf $DIR/tmp/database.sql
echo 'Made daily backup...'
# Check whether it's Sunday (0), if so, then copy most recent daily backup to weekly dir.
if [ $DOW -eq 2 ] ; then
cp $DAILY $DIR/tmp/weekly/
fi
echo 'Made weekly backup...'
# Check whether it's the first day of the year then copy two most recent daily backups to $YEAR folder
if [ $DAY_OF_YEAR -eq 146 ] ; then
cp $DAILY $DIR/tmp/${YEAR}/
fi
echo 'Made annual backup...'
# Check if it's the first day of the month, if so, copy the latest daily backups to the monthly folder
if [ $DAY_OF_MONTH -eq 26 ] ; then
cp $DAILY $DIR/tmp/${YEAR}/${MONTH}/
fi
echo 'Made monthly backup...'
# Merge The Backup To The Local Destination's Backup Folder
# cp -rf $DIR/tmp/* $DESTINATION
# Delete The Temporary Folder
# rm -rf $DIR/tmp
# Delete daily backups older than 7 days
# find $DESTINATION -mtime +7 -exec rm {} \;
echo 'Backup complete. Log can be found under $DIR/logs/.'
I've commented out some parts for now whilst I'm trying to get this working, and I've set the day/month/year values to today's so I can see files being copied. I've also left in my commented-out previous attempts at the $DAILY variable.
The issue I'm getting is that upon executing the script, it returns the following:
./backup-rotation-script.sh
cp: cannot stat `2015-05-26-1152_files.tar.gz': No such file or directory
cp: cannot stat `2015-05-26-1152_database.tar.gz': No such file or directory
Made weekly backup...
cp: cannot stat `2015-05-26-1152_files.tar.gz': No such file or directory
cp: cannot stat `2015-05-26-1152_database.tar.gz': No such file or directory
Made annual backup...
cp: cannot stat `2015-05-26-1152_files.tar.gz': No such file or directory
cp: cannot stat `2015-05-26-1152_database.tar.gz': No such file or directory
Made monthly backup...
Backup complete. Log can be found under /path/to/backups/logs/.
But when I check /path/to/backups/tmp/daily/ the files are there and it's clearly seeing them because it's returning the file names in the error.
From what I can gather, it's because $DAILY (find $DIR/tmp/daily/ -name *.tar.gz | sort -n | head -2) is returning two results on one line? I'm assuming the easiest way to get this to work would probably be to create a for loop that copies the two results over to the weekly/monthly/yearly directories?
I tried adding variations on:
for file in `ls -1t /path/to/backups/tmp/daily/ | head -n2`
do
cp $file /path/to/backups/tmp/weekly/
done
But it didn't go so well. :S
Ideally, I'd also like it to report if it fails but I'm not that far yet. :)
Any help would be much appreciated!
Nevermind! Figured it out.
I removed the 'daily' variable entirely and used the following for the copy instead:
find $DIR/tmp/daily/ -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIR/tmp/weekly/
So script now looks like:
#!/bin/sh
# Original script: https://github.com/dlabey/Simple-Linux-Bash-Rotating-Backup-Script
# Edited/hacked/chopped/stuff by Khaito
# Redirect all script output to log file located in log directory with date in name.
exec 3>&1 4>&2
trap 'exec 2>&4 1>&3' 0 1 2 3 RETURN
exec 1>/path/to/logs/$(date +"%Y-%m-%d-%H%M")_intranet.log 2>&1
# Local Source
SOURCE=/path/to/source
# Create directories etc here
LOCAL=/path/to/backups
DIR=/path/to/backups/intranet
DIRD=/path/to/backups/intranet/daily
DIRW=/path/to/backups/intranet/weekly
DIRM=/path/to/backups/intranet/monthly
# Local Destination
DESTINATION=/path/to/network/share
# Database Backup User
DATABASE='dbname'
DATABASE_USER='dbuser'
DATABASE_PASSWORD='password'
DATABASE_HOST='localhost'
# DO NOT EDIT ANYTHING BELOW THIS
# Date Variables
DAY_OF_YEAR=$(date '+%j')
DAY_OF_MONTH=$(date '+%d')
DAY_OF_WEEK_RAW=$(date '+%w')
WEEK_OF_YEAR=$(date '+%W')
DAY_OF_WEEK=$((DAY_OF_WEEK_RAW + 1))
DAY=$(date '+%a')
NOW=$(date +"%Y-%m-%d-%H%M")
MONTH=$(date '+%m')
YEAR=$(date '+%Y')
DOW=$(date '+%u')
YEARMONTH=$(date +"%Y-%m-%B")
# Make Intranet Folder
if [ ! -d "$LOCAL/intranet" ]; then
mkdir "$LOCAL/intranet"
echo 'Intranet directory created...'
fi
# Make Daily Folder
if [ ! -d "$DIR/daily" ]; then
mkdir "$DIR/daily"
echo 'Daily directory created...'
fi
# Make Weekly Folder
if [ ! -d "$DIR/weekly" ]; then
mkdir "$DIR/weekly"
echo 'Weekly directory created...'
fi
# Make Folder For Current Month
if [ ! -d "$DIR/monthly" ]; then
mkdir "$DIR/monthly"
echo 'Monthly directory created...'
fi
# Make Folder For Current Year
if [ ! -d "$DIR/${YEAR}" ]; then
mkdir "$DIR/${YEAR}"
echo 'Directory for current year created...'
fi
# Tar the intranet files then dump the db, tar it then remove the original dump file.
tar -cvzf $DIRD/${NOW}_files.tar.gz $SOURCE
mysqldump -h $DATABASE_HOST -u $DATABASE_USER -p$DATABASE_PASSWORD $DATABASE > $DIR/database.sql
tar -cvzf $DIRD/${NOW}_database.tar.gz $DIR/database.sql
rm -rf $DIR/database.sql
echo 'Made daily backup...'
# Check if it's Sunday, if so, copy the two most recent daily files to the weekly folder (date '+%u' gives 7 for Sunday).
if [ $DOW -eq 7 ] ; then
find $DIRD -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIRW
fi
echo 'Made weekly backup...'
# Check if it's the first day of the month, if so, copy the two most recent daily files to the monthly folder
if [ $DAY_OF_MONTH -eq 1 ] ; then
find $DIRD -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIRM
fi
echo 'Made monthly backup...'
# Check if it's the first day of the year, if so, copy the two most recent daily files to the current year folder
if [ $DAY_OF_YEAR -eq 1 ] ; then
find $DIRD -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIR/${YEAR}/
fi
echo 'Made annual backup...'
# Rsync the new files to the network share for backup to tape
rsync -hvrPt $DIR/* $DESTINATION
# Delete local backups
# find $DIRD -mtime +8 -exec rm {} \;
# find $DIRW -mtime +15 -exec rm {} \;
# find $DIRM -mtime +2 -exec rm {} \;
# find $DIR/${YEAR} -mtime +2 -exec rm {} \;
# Delete daily backups older than 7 days on network share
# find $INTRANETDESTINATION/daily -mtime +8 -exec rm {} \;
# Delete weekly backups older than 31 days on network share
# find $INTRANETDESTINATION/weekly -mtime +32 -exec rm {} \;
# Delete monthly backups older than 365 days on network share
# find $INTRANETDESTINATION/monthly -mtime +366 -exec rm {} \;
echo 'Backup complete. Log can be found under /path/to/logs/.'

List all SVN repository URLs from a folder in recursive mode

We are looking for a script that will traverse all subfolders recursively and list every SVN repository URL together with the path where it was found.
It will be used on /home folder of a user.
Recursively find directories, and for each of them try to get the SVN info. If it is successful, then don't descend into the directory and print the directory name.
find -type d -exec bash -c "svn info {} > /dev/null 2> /dev/null" \; -prune -print
This will list the directories.
If you want the repository info, you can add it in the middle of the find exec command.
find -type d -exec bash -c "svn info {} 2> /dev/null | grep URL" \; -prune -print
Edit:
I found much better results by only testing for the presence of an .svn subdirectory. Then, svn info is called once at the end and grepped for Path and URL. (Plus using -print0/-0 to guard against spaces in file names.)
find -type d -exec test -d "{}/.svn" \; -prune -print0 | xargs -0 svn info | grep -e '\(Path\|URL\)'
Depending on what kind of limitations you have, there could be different ways. The easiest would be to do svn ls -R | grep -v "\." to grab all the sub-folders from the repository location you're at, and feed that into a for loop that prepends the URI of the ls root to each line. This will, however, not be adequate if you have files that do not contain a "." as they will be detected as folders. Unfortunately svn ls doesn't allow you to filter by file/folder, so if you need to deal with filenames without extensions you'd have to do something different, such as checking out the source and using find to get the folder names.
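A rough sketch of that svn ls approach, assuming a hypothetical repository URL and that every file name contains a dot:
REPO=https://svn.example.com/repos/project    # hypothetical repository root URL
# Lines without a dot are treated as folders (or extension-less files);
# prefix each with the repository root to get a full URL.
svn ls -R "$REPO" | grep -v '\.' | while IFS= read -r entry ; do
    echo "${REPO}/${entry}"
done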
user_home=... # fill in the user's home dir
old_dir=/../
find $user_home -name .svn | rev | cut -f2- -d/ | rev | while read line ; do
    echo -n "$line"$'\t'
    wc -c <<<"$line"
done | sort -t$'\t' -k1,1 -k2,2n | while read dir size ; do
    if [[ $dir != $old_dir* ]] ; then
        old_dir=$dir
        svn info $dir | grep URL
        echo PATH: $dir
        echo
    fi
done
Just hope users do not store SVN under directories with spaces in names.
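If spaces do turn up, a hedged variant of the same idea using -print0 (as in the earlier answer) avoids the word-splitting, though it does not reproduce the nested-checkout filtering that the sort trick above provides:
user_home=...   # fill in the user's home dir, as above
find "$user_home" -type d -name .svn -print0 | while IFS= read -r -d '' svndir ; do
    wc=${svndir%/.svn}          # the working copy is the parent of the .svn directory
    svn info "$wc" | grep URL
    echo "PATH: $wc"
    echo
done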

Script to distribute a large number of files into smaller groups

I have folders containing large numbers of files (e.g. 1000+) of various sizes which I want to move into smaller groups of, say, 100 files per folder.
I wrote an AppleScript which counted the files, created a numbered subfolder, and then moved 100 files (the number could be specified) into the new folder, looping until there were fewer than the specified number of files left, which it moved into the last folder it created.
The problem was that it ran horrendously slowly. I'm looking for either an AppleScript or a shell script I can run on my MacBook and/or Linux box which will efficiently move the files into smaller groups.
How the files are grouped is not particularly significant, I just want fewer files in each folder.
This should get you started:
DIR=$1
BATCH_SIZE=$2
SUBFOLDER_NAME=$3
COUNTER=1
while [ $(find "$DIR" -maxdepth 1 -type f | wc -l) -gt "$BATCH_SIZE" ] ; do
    NEW_DIR=$DIR/${SUBFOLDER_NAME}${COUNTER}
    mkdir -p "$NEW_DIR"
    find "$DIR" -maxdepth 1 -type f | head -n "$BATCH_SIZE" | xargs -I {} mv {} "$NEW_DIR"
    let COUNTER++
    # Scoop up the last partial batch into its own numbered folder
    if [ $(find "$DIR" -maxdepth 1 -type f | wc -l) -le "$BATCH_SIZE" ] ; then
        NEW_DIR=$DIR/${SUBFOLDER_NAME}${COUNTER}
        mkdir -p "$NEW_DIR"
        find "$DIR" -maxdepth 1 -type f | head -n "$BATCH_SIZE" | xargs -I {} mv {} "$NEW_DIR"
    fi
done
The nested if statement gets the last remaining files. You can add some additional checks as you see needed after you modify for your use.
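For example, if you save this as split_into_batches.sh (a hypothetical name), splitting a folder into subfolders of 100 files named group1, group2, and so on would look something like:
bash split_into_batches.sh /path/to/big/folder 100 group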
This is a tremendous kludge, but it shouldn't be too terribly slow:
rm /tmp/counter*
touch /tmp/counter1
find /source/dir -type f -print0 |
xargs -0 -n 100 \
sh -c 'n=$(echo /tmp/counter*); \
n=${n#/tmp/counter}; \
counter="/tmp/counter$n"; \
mv "$counter" "/tmp/counter$((n+1))"; \
mkdir "/dest/dir/$n"; \
mv "$@" "/dest/dir/$n"' _
It's completely indiscriminate as to which files go where.
The most common way to solve the problem of directories with too many files in them is to subdivide by the first couple of characters of the name. For example:
Before:
aardvark
apple
architect
...
zebra
zork
After:
a/aardvark
a/apple
a/architect
b/...
...
z/zebra
z/zork
If that isn't subdividing well enough, then go one step further:
a/aa/aardvark
a/ap/apple
a/ar/architect
...
z/ze/zebra
z/zo/zork
This should work quite quickly, because the move command that your script executes can use simple glob expansion to select all the files to move, a la mv aa* a/aa, as opposed to having to run a move command on each file individually (which would be my first guess as to why the original script was slow).
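A minimal sketch of that glob-based first-level split, assuming bash, lowercase ASCII or digit file names, and hypothetical source and destination paths:
src=/path/to/big/dir      # hypothetical source directory
dst=/path/to/split        # hypothetical destination root
cd "$src" || exit 1
for prefix in {a..z} {0..9} ; do
    mkdir -p "$dst/$prefix"
    # One mv per prefix; the shell's glob expansion selects the files.
    # If nothing matches, mv complains about the literal pattern and we ignore it.
    mv -- "$prefix"* "$dst/$prefix"/ 2>/dev/null
done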
