Removing archived directories after packing using tar and find - linux

I am trying to find directories, archive them, and then delete the directories that were packed.
I use the following command:
find $DIRECTORY -maxdepth 1 -type d -name "`date --date="$i month ago" +%m`" -not -name \*.bz2 -print -exec tar --remove-files -cvjf {}.tar.bz2 {} \;
It works very well with tar (GNU tar) 1.23: the input directory is deleted. But when I run it with tar (GNU tar) 1.15.1 there is strange behaviour: the directory that should be deleted still exists; only its contents (files) are removed.
I think this could be caused by different behaviour of the --remove-files option.
I would be very grateful for a solution to this problem.
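One hedged workaround (a sketch, not from the original thread): instead of relying on tar's --remove-files, let find run tar and remove the directory yourself only after tar exits successfully. The scratch directory and month name "09" below are invented for the demo:

```shell
# Scratch setup (hypothetical month directory "09")
tmp=$(mktemp -d)
mkdir "$tmp/09"
echo hello > "$tmp/09/app.log"

# Archive each matching directory, then remove it only if tar succeeded.
# -C keeps the archive paths relative to the parent directory.
find "$tmp" -maxdepth 1 -type d -name "09" -exec sh -c '
    dir=$1
    tar -cjf "$dir.tar.bz2" -C "${dir%/*}" "${dir##*/}" && rm -rf "$dir"
' _ {} \;
```

This sidesteps version differences in --remove-files entirely, since the removal is an ordinary rm -rf guarded by tar's exit status.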

Related

Pruning directories and current directory's files for use with tar

I've looked at numerous articles and I can't seem to figure it out; I suppose I'm a noob.
Anyway, I have a directory that I would like to tar, but I want to exclude the top-level directory's files, as well as the folders
"plugins", "backups", and "logs" that are located at the top level.
#!/bin/bash
mkdir -p /path/to/backup/directory/`date +%d%m%y`
cd /path/to/backup/directory/`date +%d%m%y`
cd .. | find . -not \( -path plugins -prune -o -path backups -prune -o -path logs -prune \) -mindepth 1 -print0 | xargs -0 tar cpj --directory=$(cd -) -f `date +%H`.tar.gz
The find section is what's wrong: it doesn't exclude anything. This is roughly my 30th attempt at pruning, with each attempt looking more ridiculous than the last.
If someone could just show me a working find section, that'd be great. Thanks!
Use --exclude=PATTERN, and pass */ as the file list; the glob */ matches only directories, so top-level files are skipped:
tar --exclude=plugins/ --exclude=logs/ --exclude=backups/ -cf /path/to/backup/directory/`date +%d%m%y`/whatever.tar [other options] */
As for excluding directories with find, try:
find . -type d \( -path plugins -o -path backups -o -path logs \) -prune -o -print
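To feed the pruned list straight into tar, NUL-separated names are the safe route. A minimal sketch (the directory layout and file contents below are invented for the demo):

```shell
# Hypothetical project layout
tmp=$(mktemp -d)
cd "$tmp"
mkdir plugins backups logs data
echo keep > data/keep.txt
echo skip > plugins/skip.txt

# Prune the three directories; everything else goes to tar via -T -
find . -mindepth 1 \( -path ./plugins -o -path ./backups -o -path ./logs \) -prune \
    -o -print0 | tar -cf "$tmp.tar" --null -T -
```

Note that --null must precede -T so that tar reads NUL-terminated names.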

How do you delete files older than specific date in Linux?

I used the below command to delete files older than a year.
find /path/* -mtime +365 -exec rm -rf {} \;
But now I want to delete all files whose modified time is older than 01 Jan 2014. How do I do this in Linux?
This works for me:
find /path ! -newermt "YYYY-MM-DD HH:MM:SS" | xargs rm -rf
You can touch your timestamp as a file and use that as a reference point:
e.g. for 01-Jan-2014:
touch -t 201401010000 /tmp/2014-Jan-01-0000
find /path -type f ! -newer /tmp/2014-Jan-01-0000 | xargs rm -rf
This works because of find's -newer switch, which compares modification times against the reference file.
From man find:
-newer file
       File was modified more recently than file. If file is a symbolic link and the -H option or the -L option is in effect, the modification time of the file it points to is always used.
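A quick self-contained sketch of the reference-file approach (all paths and dates below are invented; touch -d backdates the files for the demo, and the pipe uses -print0/-0 to stay safe with odd filenames):

```shell
tmp=$(mktemp -d)
ref=$(mktemp)                        # reference file, outside the search path
touch -t 201401010000 "$ref"         # mtime = 01 Jan 2014
touch -d "2013-06-01" "$tmp/old.log" # older than the reference: should go
touch -d "2014-06-01" "$tmp/new.log" # newer than the reference: should stay

# Delete files not newer than the reference file
find "$tmp" -type f ! -newer "$ref" -print0 | xargs -0 rm -f
```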
The touch approach above pollutes the file system, and find itself offers a -delete option, so we don't have to pipe the results to xargs and then issue an rm. This is more efficient:
find /path -type f -not -newermt "YYYY-MM-DD HH:MM:SS" -delete
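A self-contained sketch of that -delete variant (the file names and dates are invented; touch -d backdates them so the cutoff has something to act on):

```shell
tmp=$(mktemp -d)
touch -d "2013-06-01" "$tmp/old.log"   # before the cutoff: should be deleted
touch -d "2014-06-01" "$tmp/new.log"   # after the cutoff: should survive

# Delete files whose mtime is not newer than 01 Jan 2014
find "$tmp" -type f -not -newermt "2014-01-01 00:00:00" -delete
```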
find ~ -type f -atime +4 | xargs ls -lrt
This lists files last accessed more than 4 days ago, searching from the home directory.

script to tar up multiple log files separately

I'm on a RedHat Linux 6 machine, running Elasticsearch and Logstash. I have a bunch of log files that were rotated daily from back in June until August. I am trying to figure out the best way to tar them up to save some disk space, without manually tarring up each one. I'm a bit of a newbie at scripting, so I was wondering if someone could help me out? The files are named elasticsearch-cluster.log.datestamp. Ideally each would end up in its own tar file, so it'd be easier to go back and look at a particular day's logs if needed.
You could use a loop:
for file in elasticsearch-cluster.log.*
do
tar zcvf "$file".tar.gz "$file"
done
Or if you prefer a one-liner (this is recursive):
find . -name 'elasticsearch-cluster.log.*' -print0 | xargs -0 -I {} tar zcvf {}.tar.gz {}
or, as @chepner mentions, with the -exec option:
find . -name 'elasticsearch-cluster.log.*' -exec tar zcvf {}.tar.gz {} \;
or, if you want to exclude already-archived files:
find . -name 'elasticsearch-cluster.log.*' -not -name '*.tar.gz' -exec tar zcvf {}.tar.gz {} \;
If you don't mind all the files being in a single tar.gz file, you can do:
tar zcvf backups.tar.gz elasticsearch-cluster.log.*
All these commands leave the original files in place. After you validate the tar.gz files, you can delete the originals manually.
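If you do want the originals removed automatically, a cautious sketch (file names invented for the demo) is to verify each archive before deleting its source:

```shell
tmp=$(mktemp -d)
cd "$tmp"
printf 'log line\n' > elasticsearch-cluster.log.2014-06-01
printf 'log line\n' > elasticsearch-cluster.log.2014-06-02

for file in elasticsearch-cluster.log.*; do
    [ -e "$file" ] || continue                  # no matches: glob left unexpanded
    tar -czf "$file.tar.gz" "$file" \
        && tar -tzf "$file.tar.gz" > /dev/null \
        && rm "$file"                           # delete only after a successful listing
done
```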

Recursively recode all project files excluding some directories and preserving permissions

How to recursively recode all project files excluding some directories and preserving permissions?
Based on this question, but its solution does not preserve permissions, so I had to modify it.
WARNING: since recursive removal is part of the solution, use it at your own risk.
Task:
Recursively recode all project files (iso8859-8 -> utf-8) excluding '.git' and '.idea' dirs and preserving permissions.
Solution (worked well in my case):
Backup your project's dir, then cd there. Run:
find . -not -path "./.git/*" -not -path "./.idea/*" -type f -print -exec iconv -f iso8859-8 -t utf-8 -o {}.converted {} \; -exec sh -c 'cat {}.converted > {}' \; -exec rm {}.converted \;
Binary and image files will fail to recode since they aren't text, so files like 'image.jpeg.converted' will be left alongside 'image.jpeg'. To clean up this mess:
find . -not -path "./.git/*" -not -path "./.idea/*" -type f -regex '.*\.converted' -exec rm {} \;
Before you do that, you may want to just print (drop the -exec rm) to check that only the files you really want to remove are listed.
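A variant of the same idea with safer quoting (the original embeds {} inside the sh -c string, which breaks on unusual filenames). The scratch file, its contents, and its mode below are invented, and iconv must be installed; ASCII is valid ISO-8859-8, so the conversion succeeds:

```shell
tmp=$(mktemp -d)
printf 'plain ascii\n' > "$tmp/a.txt"
chmod 640 "$tmp/a.txt"

cd "$tmp"
find . -not -path "./.git/*" -not -path "./.idea/*" -type f -exec sh -c '
    for f; do
        if iconv -f iso8859-8 -t utf-8 -o "$f.converted" "$f" 2>/dev/null; then
            cat "$f.converted" > "$f"   # overwrite in place: keeps inode, owner, mode
        fi
        rm -f "$f.converted"            # clean up, converted or not
    done' _ {} +
```

Because the overwrite happens via cat into the existing file rather than mv, permissions and ownership are preserved automatically.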

Trying to FIND and then TAR.GZ found files

I found what I thought was a solution in this forum for finding my specific LOG files and then making a TAR.GZ of them for a backup. However, when I execute the command I get an error. The part before the pipe works great and finds the files I need, but creating the backup file blows up. Any suggestions/direction would be appreciated. Thanks.
Here is the command:
find /var/log/provenir -type f -name "*2014-09-08.log" | tar -cvzf backupProvLogFiles_20140908.tar.gz
Here is the error I'm getting:
find /var/log/provenir -type f -name "*2014-09-08.log" | tar -czvf backupProvLogFiles_20140908.tar.gz --null -T -
tar: Removing leading `/' from member names
tar: /var/log/provenir/BureauDE32014-09-08.log\n/var/log/provenir/DE_HTTP2014-09-08.log\n/var/log/provenir/BureauDE22014-09-08.log\n/var/log/provenir/DE_HTTP22014-09-08.log\n: Cannot stat: No such file or directory
tar: Exiting with failure status due to previous errors
A better solution is to use command substitution:
tar -cvzf backupProvLogFiles_20140908.tar.gz $(find /var/log/provenir -type f -name "*2014-09-08.log")
(Note: this breaks on filenames containing spaces, and a very long file list can exceed the argument-length limit.)
I think you mean something like this:
find . -name "*XYZ*" -type f -print | tar -cvz -T - -f SomeFile.tgz
I was finally able to find a solution, in case someone else is looking for another option:
find /var/log/provenir -type f -name "*2014-09-08.log" -print0 | tar -czvf /var/log/provenir/barchive/backupProvLogFile_20140908.tar.gz --null -T -
This worked great. The answer came from this post: Find files and tar them (with spaces)
Thanks again for the help I received.
Regards.
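A compact, self-contained version of that final command (the log names are invented, including one with a space to show why -print0/--null matters):

```shell
tmp=$(mktemp -d)
touch "$tmp/BureauDE3 2014-09-08.log"   # matching name, with a space
touch "$tmp/other.txt"                  # should not be archived

find "$tmp" -type f -name "*2014-09-08.log" -print0 \
    | tar -czf "$tmp/backup.tar.gz" --null -T -
```

The NUL-separated pipe means a filename containing spaces (or even newlines) survives intact, which is exactly what broke the original newline-separated attempt.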
