unix script archive log files older than 15 days - linux

I have a list of log files in a directory that have been piling up for more than a year now. I've written the below script to archive the log files that are older than 15 days.
Script:
#!/bin/bash
files=($(find /opt/Informatica/9.5.1/server/infa_shared/SessLogs -type f -mtime +15))
file=SessLog_bkup_`date +"%y-%m-%d"`.tar.gz
Backup=/opt/Informatica/9.5.1/server/infa_shared/SessLogs/Backup
tar -zcf $file --remove-files "${files[@]}"
mv $file $Backup
But when I run the script, it throws the below error.
Error:
./backuplogs.sh: line 5: /bin/tar: Argument list too long.
Please advise if I'm missing something in the script.
Thanks for the help,
Kiran

Your error message is due to failure of execve(2) of /bin/tar by your shell with E2BIG.
Read the man page of tar(1). You could use
-T, --files-from=FILE
Get names to extract or create from FILE.
and combine that with other parts of your script (e.g. redirect the find output to a temporary file and pass that file to tar with -T).
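A minimal sketch of that approach, reusing the paths from your script (the -maxdepth 1 and the exclusion of the Backup directory are assumptions about your layout):
#!/bin/bash
logdir=/opt/Informatica/9.5.1/server/infa_shared/SessLogs
backup=$logdir/Backup
archive=SessLog_bkup_$(date +%y-%m-%d).tar.gz
list=$(mktemp) || exit 1
# Write one file name per line to a temp file instead of the command line,
# so the kernel's argv size limit (the cause of E2BIG) no longer applies.
find "$logdir" -maxdepth 1 -type f -mtime +15 > "$list"
# GNU tar reads the names to archive from the file given with -T.
tar -zcf "$backup/$archive" --remove-files -T "$list"
rm -f "$list"
Note that -T reads one name per line, so this sketch assumes no newlines in your log file names; with GNU find and tar you could use -print0 together with --null -T for full safety.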
But as commented by hek2mgl, what you really want is logrotate(8).
You could also use other archivers, e.g. afio(1).


tar: Removing leading `/' from member names ("it is not a duplicate")

#!/bin/bash
source="/home/user/work/tar/deneme"
source2="/home/user/work/tar/deneme1"
for i in {1..5}
do
tar -czvf $source2/$i/$i.tar.gz $source/$i/
done
I get this error message.
tar: Removing leading `/' from member names
This is my script and the error. There are a lot of similar questions here, but none of them solved my problem. When I run the script, it creates the .tar.gz file, but if I extract it with tar -xzvf 1.tar.gz, my files are created with the full path, like home/user/work/tar/deneme/1/1-1.txt.
Do you have any idea?
I have tried some approaches, for example:
Find /SED to convert absolute path to relative path within a single line tar statement for crontab
https://unix.stackexchange.com/questions/59243/tar-removing-leading-from-member-names/59244
This is because GNU tar removes the leading / by default. To avoid storing absolute paths, you can rewrite your script this way:
#!/bin/bash
cd /home/user/work/tar
source="deneme"
source2="deneme1"
for i in {1..5}
do
mkdir -p ${source2}/${i}
tar -czvf ${source2}/${i}/${i}.tar.gz ${source}/${i}/
done
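An alternative that avoids the cd is tar's -C option, which makes tar change directory before reading the paths to archive; a minimal sketch with the same layout:
#!/bin/bash
base=/home/user/work/tar
for i in {1..5}
do
    mkdir -p "$base/deneme1/$i"
    # -C makes tar resolve the following paths relative to $base,
    # so members are stored as deneme/$i/... rather than absolute paths.
    tar -czvf "$base/deneme1/$i/$i.tar.gz" -C "$base" "deneme/$i/"
done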
Thank you for all your comments and answers.
I found the solution by changing some of the code inside the for loop:
mkdir $source2/$i
cd $source/
tar -czvf $source2/$i/$i.tar.gz $i/*
(Note: this relies on $source and $source2 being the absolute paths from the original script, since the loop changes directory with cd.)

How to use OR operator with wildcards while extracting files with tar

I can find all the .tgz files within a folder and then extract only the PDF, EPUB and MOBI files from each one, if they are present in the archive:
find '/home/pi/Downloads/complete/' -type f -name "*.tgz"| while read i ; do tar -xvzf "$i" -C /home/pi/Downloads/complete/ebook/ --strip=1 --wildcards --no-anchored '*.pdf' '*.mobi' '*.epub'; done
This line of code works perfectly when at least one pdf, mobi or epub file is present in the archive. However, whenever an archive contains none of them, it returns an error as shown below.
tar: *.pdf: Not found in archive
tar: *.mobi: Not found in archive
tar: Exiting with failure status due to previous errors
How can I prevent this error? I believe there should be a way to combine multiple wildcards with an 'OR' operator, as is available in other scripting languages.
tar isn't a scripting language.
To hide the error message, just redirect the stderr of tar to a bit bucket:
tar ... 2> /dev/null
Note that you might miss other errors, though.
The safe way would be to list the files first, select the ones to extract, and only do that if there were any.
tar --list -f ...tgz | grep '\.\(pdf\|mobi\|epub\)$'
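A sketch of that safe approach, assuming GNU tar and that $i holds an archive name as in your loop (-T - makes tar read the selected member names from standard input):
matches=$(tar --list -f "$i" | grep '\.\(pdf\|mobi\|epub\)$')
if [ -n "$matches" ]; then
    # Extract only when grep actually selected something.
    printf '%s\n' "$matches" | tar -xvzf "$i" -C /home/pi/Downloads/complete/ebook/ --strip=1 -T -
fi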
Thanks to @choroba, the code below works perfectly, with no errors reported. Posting it as an answer so that others have better visibility of the final working code.
find '/home/pi/Downloads/complete/' -type f -name "*.tgz" | while read -r i; do
    tar --list -f "$i" | grep '\.\(pdf\|mobi\|epub\)$' | while read -r line; do
        tar -kxvzf "$i" -C "/home/pi/Downloads/complete/ebook/" --strip=1 "$line"
    done
done

Using 'tar' command in for loop

I know this is a basic question, but please help me. I compressed and archived my server log files using the tar command:
for i in server.log.2016-07-05 server.log.2016-07-06 ; do tar -zcvf server2.tar.gz $i; done
The output of the above loop is:
server.log.2016-07-05
server.log.2016-07-06
But when listing the files using tar -tvf server2.tar.gz, the output obtained is:
-rw-r--r-- root/root 663643914 2016-07-06 23:59 server.log.2016-07-06
That is, I archived two files but only one is listed, which means the archive doesn't contain both files, right? Please help with this.
I just tested with these two files, but my folder has multiple files. Since I didn't get the expected output, I haven't proceeded with all the files in my folder. The exact loop I am going to use is:
Previousmonth=$(date "+%b" --date '1 month ago')
for i in $(ls -l | awk '/'$Previousmonth'/ && /server.log./ {print $NF}'); do tar -zcvf server2.tar.gz $i; done
I am trying to compress and archive multiple files, but when listing the files with tar -tvf, it doesn't show all of them.
Each iteration of your loop recreates server2.tar.gz from scratch, so only the file from the last iteration survives. You don't need a loop here; just list all the files you want to add as command line parameters:
tar -zcvf server2.tar.gz server.log.2016-07-05 server.log.2016-07-06
The same goes for your other example too:
tar -zcvf server2.tar.gz $(ls -l | awk '/'$Previousmonth'/ && /server.log./ {print $NF}')
Except that parsing the output of ls -l is awful and strongly not recommended.
But since the filenames to back up contain the year and month, a much simpler and better solution is to get last month's year and month with the date command and then use shell globbing:
prefix=$(date +%Y-%m -d 'last month')
tar -zcvf server2.tar.gz server.log.$prefix-??
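For completeness: if a loop really were required, the archive would have to be appended to with -r instead of being recreated with -c on every iteration, and appending only works on an uncompressed tar, so compression has to happen at the end. A sketch of that variant:
rm -f server2.tar
for i in server.log.2016-07-05 server.log.2016-07-06
do
    # -r appends to the archive instead of recreating it
    # (GNU tar creates the archive if it does not exist yet).
    tar -rvf server2.tar "$i"
done
gzip server2.tar   # yields server2.tar.gz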

Linux - Find command and tar command Failure

I am using a combination of the find and cp commands in my backup script.
It is used on a fairly large amount of data.
First, out of 25 files it needs to find all the files older than 60 minutes,
then copy these files to a temp directory; each of these files is 1.52GB to 2GB.
One of these 25 files will have data being appended to it continuously.
I have learnt from googling that a tar operation will fail if the file being archived is updated while tar is reading it; is the same true for find and cp?
I am trying something like this,
/usr/bin/find $logPath -mmin +60 -type f -exec /bin/cp {} $logPath/$bkpDirectoryName \;
After this I have a step where I tar the files copied to the temp directory mentioned above ($bkpDirectoryName), using the command below:
/bin/tar -czf $bkpDir/$bkpDirectoryName.tgz $logPath/$bkpDirectoryName
and this also fails.
The same backup script had been running for many days, and it has suddenly started failing, which is causing me a headache! Can someone please help me with this?
Can you try these steps, please:
Instead of copying the files older than 60 minutes, move them.
Run tar on the moved files.
If you do the above, the file which is continuously appended to will not be moved.
In case any of your other 24 files might be updated after 60 minutes, you can do the following:
Once you move a file, touch a file with the same name in case there are asynchronous updates which are not continuous.
When tarring the files, give the tar file a timestamped name. This way you have a rolling tar of your logs.
If nothing works due to some custom requirement on your side, try doing an rsync and then perform the same operations on the rsynced files (i.e. find and tar, or just tar).
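A minimal sketch of the move-then-tar idea; $logPath, $bkpDir and $bkpDirectoryName are assumed to be set as in the original script, and -maxdepth 1 is an assumption about the layout:
#!/bin/bash
stamp=$(date +%Y%m%d-%H%M%S)
dest=$logPath/$bkpDirectoryName
mkdir -p "$dest"
# Move (not copy) files untouched for 60+ minutes; the continuously
# appended file keeps a fresh mtime, so -mmin +60 never matches it.
find "$logPath" -maxdepth 1 -mmin +60 -type f -exec mv {} "$dest" \;
# A timestamped archive name gives a rolling set of backups.
tar -czf "$bkpDir/$bkpDirectoryName-$stamp.tgz" -C "$logPath" "$bkpDirectoryName"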
Try this:
output=`find $logPath -mmin 60 -type f`
if [ "temp$output" != "temp" ]; then
    cp -rf $output $other_than_logPath/$bkpDirectoryName/
else
    echo sorry
fi
I think you are using +60 instead of 60.
I also want to know at what interval your script gets called.
#!/bin/bash
for f in `find / -name "*" -mmin 60`
do
    cp "$f" / ## Choose a destination directory
done
That's basically what you need; just change the directory, I guess.

Zipping and deleting files with certain age

I'm trying to put together a command that will find files that haven't been modified in over 6 months and zip them in one go. Afterwards I want to delete all the files I just archived.
My current command to find the directories with the files is
find /var/www -type d -mtime -400 ! -mtime -180 | xargs ls -l > testd.txt
This gave me all the directories, including the files, that are older than 6 months.
Now I was wondering if there is a way of zipping all the results and deleting them afterwards, something along the lines of:
find /var/www -type f -mtime -400 ! -mtime -180 | gzip -c archive.gz
If anyone knows the proper syntax to achieve this, I'd love to know. Thanks!
Edit: after a few tests, this command results in a corrupted file:
find /var/www -mtime -900 ! -mtime -180 | xargs tar -cf test4.tar
Any ideas?
Break this into several distinct steps that you can implement and thoroughly test separately:
Build a list of files to be archived and then deleted, saved to a temp file
Use the list from step 1 to add the files to .tar.gz archives. Give the archive file a name following a specific pattern that won't appear in the files to be archived, and put it in a directory outside the hierarchy of files being archived.
Read back the files from the .tar.gz and compare them (or their hashes) to the original files to ENSURE that you got them all without corruption
Use the list from step 1 to delete the files. Do not use a wildcard for deletion. Put in some guard code to prevent deletion of any file matching the name pattern of the archive .tar.gz file(s) created in step 2.
When testing a script that can do irreversible damage, always code the dangerous command with a leading echo and leave it that way until you are sure everything works. Only then remove the echo.
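A minimal sketch of those steps, with the deletion still guarded by a leading echo; the archive path, its name pattern and the age bounds are assumptions (the question's -mtime -400 ! -mtime -180 corresponds to roughly 180 to 400 days):
#!/bin/bash
list=$(mktemp) || exit 1
archive=/var/backups/www-archive-$(date +%Y%m%d).tar.gz   # outside /var/www
# Step 1: build the list of files between ~180 and ~400 days old.
find /var/www -type f -mtime +180 -mtime -400 > "$list"
# Step 2: archive from the list.
tar -czf "$archive" -T "$list"
# Step 3: verify the archive reads back cleanly before deleting anything
# (a stand-in for the fuller compare-the-hashes check described above).
tar -tzf "$archive" > /dev/null || exit 1
# Step 4: delete from the same list, still guarded by echo until proven.
while IFS= read -r f; do
    echo rm -- "$f"
done < "$list"
rm -f "$list"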
Consider zip; it should meet your requirements.
find ... | zip -m@ archive.zip
-m (move) deletes the input directories/files after making the specified zip archive.
-@ takes the list of input files from standard input.
You may find more options which are useful to you in the zip manual, e.g.
-r (recurse) travels the directory structure recursively.
-sf (show-files) shows the files that would be operated on, then exits.
-t or --from-date operates on files not modified prior to the specified date.
-tt or --before-date operates on files not modified after or at the specified date.
This could possibly make find expendable:
zip -mr --from-date 2012-09-05 --before-date 2013-04-13 archive /var/www
