How to compress multiple folders separately into another folder as tar.gz - Linux

How can I compress files in this scenario?
I have a folder structure like this:
User1:
/home/user1/websites/website1
/home/user1/websites/website2
User2:
/home/user2/websites/website1
/home/user2/websites/website2
/home/user2/websites/website3
And now I need to do a backup like this:
Folders for backups per user:
/backup/date/websites/user1/
/backup/date/websites/user2/
And I need a separate tar archive per website inside each user directory, like this:
/backup/date/websites/user1/website1.tar.gz
/backup/date/websites/user1/website2.tar.gz
/backup/date/websites/user2/website1.tar.gz
/backup/date/websites/user2/website2.tar.gz
/backup/date/websites/user2/website3.tar.gz
I have a script like this one that does half of the work:
#VARIABLES
BKP_DATE=`date +"%F"`
BKP_DATETIME=`date +"%H-%M"`
#BACKUPS FOLDERS
BKP_DEST=/backup/users
BKP_DEST_DATE=$BKP_DEST/$BKP_DATE
BKP_DEST_TIME=$BKP_DEST_DATE/$BKP_DATETIME
BACKUP_DIR=$BKP_DEST_TIME
#NUMBER OF DAYS TO KEEP ARCHIVES IN BACKUP DIRECTORY
KEEP_DAYS=7
#Create folders
mkdir -p $BKP_DEST_DATE
mkdir -p $BKP_DEST_TIME
mkdir -p $BACKUP_DIR
#DELETE FILES OLDER THAN {*} DAYS IN BACKUP SERVER DIRECTORY
#echo 'Deleting backup folder older than '${KEEP_DAYS}' days'
find $BKP_DEST/* -type d -ctime +${KEEP_DAYS} -exec rm -rf {} \;
#Do backups
#List only names available users data directories
usersdirectories=`cd /home && find * -maxdepth 0 -type d | grep -Ev "(tmp|root)"`
#Creating directories per user name
for i in $usersdirectories; do
    mkdir -p $BACKUP_DIR/$i/websites
done
But as you can see, I don't know how to create the separate tar archives. In my half-finished script I have done the following:
Create folder structure for backup by datetime (/backup/users/day/hour-minutes)
Create folder structure for backup by users names (/backup/users/day/hour-minutes/user1)
Thanks to all users who try to help me!

I will try to complete your script, but I can't debug it because your environment is hard to reproduce. In the future, it would be better to provide a minimal reproducible example.
#VARIABLES
BKP_DATE=$(date +"%F")
BKP_DATETIME=$(date +"%H-%M")
#BACKUPS FOLDERS
BKP_DEST=/backup/users
BKP_DEST_DATE=$BKP_DEST/$BKP_DATE
BKP_DEST_TIME=$BKP_DEST_DATE/$BKP_DATETIME
BACKUP_DIR=$BKP_DEST_TIME
#NUMBER OF DAYS TO KEEP ARCHIVES IN BACKUP DIRECTORY
KEEP_DAYS=7
#Create folders
mkdir -p $BKP_DEST_DATE
mkdir -p $BKP_DEST_TIME
mkdir -p $BACKUP_DIR
#DELETE FILES OLDER THAN {*} DAYS IN BACKUP SERVER DIRECTORY
#echo 'Deleting backup folder older than '${KEEP_DAYS}' days'
find $BKP_DEST/* -type d -ctime +${KEEP_DAYS} -exec rm -rf {} \;
#Do backups
#List only names available users data directories
usersdirectories=$(cd /home && find * -maxdepth 0 -type d | grep -Ev "(tmp|root)")
#Creating directories per user name
for i in $usersdirectories; do
    for w in /home/$i/websites/*; do
        ws=$(basename $w)
        mkdir -p $BACKUP_DIR/$i/websites/$ws
        tar -czvf $BACKUP_DIR/$i/websites/$ws.tar.gz /home/$i/websites/$ws
    done
done
I assume there are no blanks inside the directory names (website1, etc.).
I also replaced the deprecated backtick operators in your code with $(...).
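If the directory names could ever contain blanks after all, a more defensive variant of the same loop (only a sketch, assuming the /home/<user>/websites/<site> layout from your question) would quote every expansion and use tar's -C option so the paths inside the archives stay relative:
BACKUP_DIR=/backup/users/$(date +"%F")/$(date +"%H-%M")
for userdir in /home/*/; do
    user=$(basename "$userdir")
    # skip the same directories the original script excluded
    case "$user" in tmp|root) continue ;; esac
    for w in "$userdir"websites/*/; do
        [ -d "$w" ] || continue
        ws=$(basename "$w")
        mkdir -p "$BACKUP_DIR/$user/websites"
        # -C makes the paths inside the archive relative to the websites folder
        tar -czf "$BACKUP_DIR/$user/websites/$ws.tar.gz" -C "${userdir}websites" "$ws"
    done
done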

I wish I could comment to ask for more clarification, but I will attempt to help you.
#!/bin/bash
# example for user1
archive_path=/backup/data/websites/user1
file_path=/home/user1/websites/*
for i in $file_path
do
    tar czf $archive_path/$(basename $i).tar.gz $i
done

Okay, after a small fix, the version from Pierre is working now. If you want, you can adjust it to your needs.
#VARIABLES
BKP_DATE=$(date +"%F")
BKP_DATETIME=$(date +"%H-%M")
#BACKUPS FOLDERS
BKP_DEST=/backup
BKP_DEST_DATE=$BKP_DEST/$BKP_DATE
BKP_DEST_TIME=$BKP_DEST_DATE/$BKP_DATETIME
BACKUP_DIR=$BKP_DEST_TIME
#NUMBER OF DAYS TO KEEP ARCHIVES IN BACKUP DIRECTORY
KEEP_DAYS=7
#Create folders
mkdir -p $BKP_DEST_DATE
mkdir -p $BKP_DEST_TIME
mkdir -p $BACKUP_DIR
#DELETE FILES OLDER THAN {*} DAYS IN BACKUP SERVER DIRECTORY
#echo 'Deleting backup folder older than '${KEEP_DAYS}' days'
find $BKP_DEST/* -type d -ctime +${KEEP_DAYS} -exec rm -rf {} \;
#Do backups
#List only names available users data directories
usersdirectories=$(cd /home/user && find * -maxdepth 0 -type d | grep -Ev "(tmp|root)")
#Creating directories per user name
for i in $usersdirectories; do
    for w in /home/user/$i/*; do
        #echo ok
        ws=$(basename $w)
        echo mkdir -p $BACKUP_DIR/$i
        echo tar -czvf $BACKUP_DIR/$i/$ws.tar.gz /home/user/$i/$ws
    done
done
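Note that the mkdir and tar lines above are still prefixed with echo, so the loop only prints the commands (a dry run). Once the printed paths look right, the inner loop body would presumably become:
mkdir -p $BACKUP_DIR/$i
tar -czvf $BACKUP_DIR/$i/$ws.tar.gz -C /home/user/$i $ws
(the -C option is optional; it just keeps the paths inside each archive relative to the user's directory instead of storing the full source path)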

Related

rsync - create daily backups that are deleted after 30 days?

I am attempting to write a script using rsync to save daily backups in new directories named after the date they are created, and to delete them 30 days after creation. The code below works, but it will quickly fill up my disk because the -u option will not see that several files in the directory structure already exist in a previous backup. Is there a better way to do this to preserve disk space/bandwidth? The --delete and --backup-dir options have been mentioned to me, but I have no idea how they would apply to this specific scenario.
#!/bin/bash
#User whose files are being backed up
BNAME=username
#directory to back up
BDIR=/home/username/BackThisUp
#directory to backup to
BackupDir=/var/home/username_local/BackupTo
#user
RUSER=$USER
#SSH Key
KEY=/var/home/username_local/.ssh
#Backupname
RBackup=`date +%F`
#Backup Server
BServ=backup.server
#Path
LPATH='Data for backup'
#date
DATE=`date +%F`
#make parent directory for backup
mkdir $BackupDir/$BNAME > /dev/null 2>&1
#Transfer new backups
rsync -avpHrz -e "ssh -i $KEY" $BNAME@$BServ:$BDIR $BackupDir/$BNAME/$DATE
find $BackupDir/$BNAME -type d -ctime +30 -exec rm -rf {} \;
I might do something simpler. Create a hash that only has the date's day in it. For example, 8/11/2015 would hash to 11.
Then do something like:
# this number changes based on date.
hash=`date +%d`
rm -rf backup_folder/$hash
# then recreate backup_folder/$hash
You'll have around 30 days of backups. You may want to zip/compress these folders, assuming you have 30 times the size of the folder available on the disk.
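A minimal sketch of that rotation, reusing the variable names from the script in the question (hypothetical wiring, not tested against your setup):
# day of month becomes the slot name, e.g. 2015-08-11 -> 11
slot=$(date +%d)
dest="$BackupDir/$BNAME/$slot"
# drop whatever occupied this slot roughly 30 days ago, then refill it
rm -rf "$dest"
mkdir -p "$dest"
rsync -avpHrz -e "ssh -i $KEY" "$BNAME@$BServ:$BDIR" "$dest"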

copy entire directory excluding a file

As we know, cp -r source_dir intended_new_directory creates a copy of the source directory with a new name. Now I want to do the same, but I want to exclude a particular file. I have found some related answers here using tar and rsync, but in those solutions I need to create the destination directory first (using mkdir).
I honestly searched a lot, but didn't find exactly what I want.
So far the best I got is this:
tar -c --exclude=\*.dll --exclude=\*.exe sourceDir | tar -x -C destDir
(from http://www.linuxquestions.org/questions/programming-9/how-to-copy-an-entire-directory-structure-except-certain-files-385321/)
If you have cpio installed, you can use find to filter and then cpio to copy (and create the directories):
find <sourceDir> \( ! -name '*.dll' \) -a \( ! -name '*.exe' \) | cpio -dumpv <destDir>
Try this, excluding the file using grep -v:
cp -r `ls | grep -v <exclude-file>` <dest-dir>
If the directory is not very large, I would write something like this:
src=path/to/source/directory
dst=path/to/destination/directory
find $src -type f | while read -r f ; do mkdir -p "$dst/$(dirname "$f")"; cp "$f" "$dst/$f" ; done
Here we list all regular files in $src, iterate over this list and for each file make a directory in $dst if it does not exist yet (-p option of mkdir), then copy the file to that directory.
The above command will copy all the files. Finally, just use
find $src -type f | grep -v whatever | while ...... # same as above
to filter out the files you don't need (e.g. \.bak$, \.orig$, or whatever files you don't want to copy).
Move all excluded files into your home directory (or another directory), copy the directory containing the remaining files to the destination folder, then move the excluded files back:
#cd mydirectory
#mv exclude1 exclude2 /home/
#cd ..
#cp -r mydirectory destination_folder/
#cd /home/
#mv exclude1 exclude2 mydirectory/

Move files and directories older than specific time with the same folder structure

I want to move all files and directories located in /etc/ that are older than 90 days to the /old-etc directory, but with the same structure as in the source directory.
Thanks
Try doing this:
find /etc -mtime +90 -type f -exec bash -c 'install -D "$1" "/old-etc/$1" && rm -f "$1"' -- {} \;
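install -D creates any missing parent directories of the destination, which is what keeps the /etc structure under /old-etc. Note that install normally resets the file mode and ownership; if those need to be preserved as well, a variant of the same idea (just a sketch, relying on GNU cp's --parents and -p options) would be:
find /etc -mtime +90 -type f -exec bash -c 'cp --parents -p "$1" /old-etc/ && rm -f "$1"' -- {} \;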

Delete all files in a directory, except those listed matching specific criteria

I need to automate a clean-up of a Linux based FTP server that only holds backup files.
In our "\var\DATA" directory is a collection of directories. Any directory here used for backup begins with "DEV". In each "DEVxxx*" directory are the actual backup files, plus any user files that may have been needed in the course of maintenance on these devices.
We only want to retain the following files - anything else found in these "DEVxxx*" directories is to be deleted:
The newest two backups: ls -t1 | grep -E -m2 '^[[:digit:]]{6}_Config'
The newest backup done on the first of the month: ls -t1 | grep -E -m1 '^[[:digit:]]{4}01_Config'
Any file that was modified less than 30 days ago: find -mtime -30
Our good configuration file: ls verification_cfg
Anything that doesn't match the above should be deleted.
How can we script this?
I'm guessing a BASH script can do this, and that we can create a cron job to run daily to perform the task.
Something like this perhaps?
{ ls -t1 | grep -E -m2 '^[[:digit:]]{6}_Config' ;
ls -t1 | grep -E -m1 '^[[:digit:]]{4}01_Config' ;
find -mtime -30 ;
ls -1 verification_cfg ;
} | rsync -a --include-from=- --exclude='*' /var/DATA/ /var/DATA.bak/
rm -rf /var/DATA
mv /var/DATA.bak /var/DATA
For what it's worth, here is the bash script I created to accomplish my task. Comments are welcome.
#!/bin/bash
# This script follows these rules:
#
# - Only process directories beginning with "DEV"
# - Do not process directories within the device directory
# - Keep files that match the following criteria:
# - Keep the two newest automated backups
# - Keep the six newest automated backups generated on the first of the month
# - Keep any file that is less than 30 days old
# - Keep the file "verification_cfg"
#
# - An automated backup file is identified as eight digits, followed by "_Config"
# e.g. 20120329_Config
# Remember the current directory
CurDir=`pwd`
# FTP home directory
DatDir='/var/DATA/'
cd $DatDir
# Only process directories beginning with "DEV"
for i in `find . -maxdepth 1 -type d | egrep '\.\/DEV' | sort` ; do
    cd $DatDir
    echo Doing "$i"
    cd $i
    # Set the GROUP EXECUTE bit on all files
    find . -type f -exec chmod g+x {} \;
    # Find the two newest automated config backups
    for j in `ls -t1 | egrep -m2 ^[0-9]{8}_Config$` ; do
        chmod g-x $j
    done
    # Find the six newest automated config backups generated on the first of the month
    for j in `ls -t1 | egrep -m6 ^[0-9]{6}01_Config$` ; do
        chmod g-x $j
    done
    # Find all files that are less than 30 days old
    for j in `find -mtime -30 -type f` ; do
        chmod g-x $j
    done
    # Find the "verification_cfg" file
    for j in `find -name verification_cfg` ; do
        chmod g-x $j
    done
    # Remove any files that still have the GROUP EXECUTE bit set
    find . -type f -perm -g=x -exec rm -f {} \;
done
# Back to the users current directory
cd $CurDir

Copy folder structure (without files) from one location to another

I want to create a clone of the structure of our multi-terabyte file server. I know that cp --parents can copy a file along with its parent directory structure, but is there any way to copy the directory structure intact?
I want to copy to a linux system and our file server is CIFS mounted there.
You could do something like:
find . -type d > dirs.txt
to create the list of directories, then
xargs mkdir -p < dirs.txt
to create the directories on the destination.
cd /path/to/directories &&
find . -type d -exec mkdir -p -- /path/to/backup/{} \;
Here is a simple solution using rsync:
rsync -av -f"+ */" -f"- *" "$source" "$target"
one line
no problems with spaces
preserve permissions
I found this solution there
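The short -f options are just rsync filter rules; an equivalent long-option form of the same command (same behaviour, perhaps easier to read) is:
rsync -av --include='*/' --exclude='*' "$source" "$target"
The "+ */" rule matches every directory and is checked first, so the directory tree is recreated, while the catch-all "- *" rule keeps any files from being transferred.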
1 line solution:
find . -type d -exec mkdir -p /path/to/copy/directory/tree/{} \;
I dunno if you are looking for a solution on Linux. If so, you can try this:
$ mkdir destdir
$ cd sourcedir
$ find . -type d | cpio -pdvm destdir
This copies the directories and file attributes, but not the file data:
cp -R --attributes-only SOURCE DEST
Then you can delete the files attributes if you are not interested in them:
find DEST -type f -exec rm {} \;
This works:
find ./<SOURCE_DIR>/ -type d | sed 's/\.\/<SOURCE_DIR>//g' | xargs -I {} mkdir -p <DEST_DIR>"/{}"
Just replace SOURCE_DIR and DEST_DIR.
The following solution worked well for me in various environments:
sourceDir="some/directory"
targetDir="any/other/directory"
find "$sourceDir" -type d | sed -e "s?$sourceDir?$targetDir?" | xargs mkdir -p
This solves even the problem with whitespaces:
In the original/source dir:
find . -type d -exec echo "'{}'" \; > dirs2.txt
then recreate them in the newly created destination dir:
xargs mkdir -p < ../<SOURCEDIR>/dirs2.txt
Substitute target_dir and source_dir with the appropriate values:
cd target_dir && (cd source_dir; find . -type d ! -name .) | xargs -i mkdir -p "{}"
Tested on OSX+Ubuntu.
If you can get access from a Windows machine, you can use xcopy with /T and /E to copy just the folder structure (the /E includes empty folders)
http://ss64.com/nt/xcopy.html
[EDIT!]
This one uses rsync to recreate the directory structure but without the files.
http://psung.blogspot.com/2008/05/copying-directory-trees-with-rsync.html
Might actually be better :)
A python script from Sergiy Kolodyazhnyy
posted on Copy only folders not files?:
#!/usr/bin/env python
import os,sys
dirs=[ r for r,s,f in os.walk(".") if r != "."]
for i in dirs:
    os.makedirs(os.path.join(sys.argv[1],i))
or from the shell:
python -c 'import os,sys;dirs=[ r for r,s,f in os.walk(".") if r != "."];[os.makedirs(os.path.join(sys.argv[1],i)) for i in dirs]' ~/new_destination
FYI:
Copy top level folder structure without copying files in linux
How do I copy a directory tree but not the files in Linux?
Another approach is to use tree, which is pretty handy for navigating directory trees and has strong options. There are options for directories only, excluding empty directories, excluding names matching a pattern, including only names matching a pattern, etc. Check out man tree.
Advantage: you can edit or review the list, which is useful if you do a lot of scripting and create batches of empty directories frequently.
Approach: create a list of directories using tree, then use that list as argument input to mkdir.
tree -dfi --noreport > some_dir_file.txt
-dfi lists only directories, prints the full path for each name, and makes tree not print the indentation lines.
--noreport omits the file and directory report at the end of the tree listing, so the output file does not contain any fluff.
Then go to the destination where you want the empty directories and execute
xargs mkdir < some_dir_file.txt
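If any directory names contain spaces, plain xargs will split them on the whitespace; with GNU xargs the list can be consumed one entry per line instead (a small variation on the same idea):
xargs -d '\n' mkdir -p < some_dir_file.txt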
find source/ -type f -printf '%P\n' | rsync -a --exclude-from=- source/ target/
Copy dir only with associated permission and ownership
Simple way:
for i in `find . -type d`; do mkdir /home/exemplo/$i; done
cd oldlocation
find . -type d -print0 | xargs -0 -I{} mkdir -p newlocation/{}
You can also create top directories only:
cd oldlocation
find . -maxdepth 1 -type d -print0 | xargs -0 -I{} mkdir -p newlocation/{}
Here is a solution in php that:
copies the directories (not recursively, only one level)
preserves permissions
unlike the rsync solution, is fast even with directories containing thousands of files as it does not even go into the folders
has no problems with spaces
should be easy to read and adjust
Create a file like syncDirs.php with this content:
<?php
foreach (new DirectoryIterator($argv[1]) as $f) {
    if($f->isDot() || !$f->isDir()) continue;
    mkdir($argv[2].'/'.$f->getFilename(), $f->getPerms());
    chown($argv[2].'/'.$f->getFilename(), $f->getOwner());
    chgrp($argv[2].'/'.$f->getFilename(), $f->getGroup());
}
Run it as user that has enough rights:
sudo php syncDirs.php /var/source /var/destination
