Crontab - How do I name my backup file after a weekday? - cron

I'm currently running 5 identical cron jobs, one for each weekday, to back up my public_html folder on a cPanel-based web server.
My cron looks like this:
tar -zcf /home/mywebsite/public_html/backups/monday_backup.tgz ./public_html
tar -zcf /home/mywebsite/public_html/backups/tuesday_backup.tgz ./public_html
.
.
tar -zcf /home/mywebsite/public_html/backups/friday_backup.tgz ./public_html
I'd like to know if there is a way to write the cron job only once (instead of 5 times) so that the backup file automatically gets the weekday's name. Something like:
tar -zcf /home/mywebsite/public_html/backups/$weekday_backup.tgz ./public_html
Thanks!

date +%A will give you the day of the week. You can pipe the output through tr '[A-Z]' '[a-z]' to convert it to lowercase if you want to match the names you are already using:
tar -zcf /home/mywebsite/public_html/backups/`date +%A | tr '[A-Z]' '[a-z]'`_backup.tgz ./public_html
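Note that in a crontab entry the % character is special (unescaped, it marks the start of standard input), so it has to be escaped with a backslash. A sketch of the full crontab line, assuming a 2 a.m. weekday schedule (the schedule itself is an assumption):
0 2 * * 1-5 tar -zcf /home/mywebsite/public_html/backups/`date +\%A | tr '[A-Z]' '[a-z]'`_backup.tgz ./public_html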

Use:
tar -zcf /home/mywebsite/public_html/backups/`date +%A`_backup.tgz ./public_html
date +%A gives you the current weekday (try it in your shell or man date to see this and other options).
I have never tried embedding a command like this within the cron line itself, so you might need to move it to a separate script and call that script instead.
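A minimal sketch of such a wrapper script (the script name and the cd are assumptions; cron jobs usually start in your home directory, so the relative ./public_html path needs a known working directory):
#!/bin/sh
# weekday_backup.sh -- hypothetical wrapper script
cd /home/mywebsite || exit 1                # make sure ./public_html resolves
day=`date +%A | tr '[A-Z]' '[a-z]'`         # e.g. "monday"
tar -zcf /home/mywebsite/public_html/backups/${day}_backup.tgz ./public_html
The crontab entry then simply calls the script, with no % to escape:
0 2 * * 1-5 /home/mywebsite/weekday_backup.sh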

Related

How to get stdout of tar command

I am trying to tar a file and store its output in a variable.
I tried this but it is not working:
resulting_tar=$(tar -zcf "$(date '+%Y-%m-%d').tar.gz" folder)
Any idea how do I go about it?
By default, tar does not report the name of the file created. In fact, it doesn't say anything unless you tell it to, and the options given don't tell it to say anything.
Note that tar doesn't tell you what file it created. You tell tar what file to create.
You'll need to capture the name of the file in a variable and report it yourself:
file="$(date '+%Y-%m-%d').tar.gz"
tar -czf "$file" folder
echo "$file"
Try running tar -czf /dev/null folder; you won't see anything from (most implementations of) tar — and that's not because I specified /dev/null. Specify a name if you prefer: tar -czf junk.tar.gz folder and watch the (lack of) output — and remember to remove junk.tar.gz.
You might want to think about including the folder name in the tar file name, too.
folder="…whatever…"
file="$folder-$(date +'%Y-%m-%d').tar.gz"
tar -czf "$file" "$folder"
echo "$file"
EDIT: No longer applicable after further clarification. Leaving for posterity.
You're likely looking for both stdout and stderr. You can combine the two output streams by appending 2>&1 to your command:
resulting_tar=$(tar -zcf "$(date '+%Y-%m-%d').tar.gz" folder 2>&1)

Using 'tar' command in for loop

I know this is a basic question, but please help me. I compressed and archived my server log files using the tar command:
for i in server.log.2016-07-05 server.log.2016-07-06 ; do tar -zcvf server2.tar.gz $i; done
The output of the above loop is:
server.log.2016-07-05
server.log.2016-07-06
But while listing the file using tar -tvf server2.tar.gz the output obtained is:
-rw-r--r-- root/root 663643914 2016-07-06 23:59 server.log.2016-07-06
i.e., I archived two files but only one file is listed, which means the archive doesn't have both files, right? Please help with this.
I just tested with these two files, but my folder has multiple files. Since I didn't get the expected output, I haven't proceeded with all the files in my folder. The exact loop I am going to use is:
Previousmonth=$(date "+%b" --date '1 month ago')
for i in $(ls -l | awk '/'$Previousmonth'/ && /server.log./ {print $NF}'); do tar -zcvf server2.tar.gz $i; done
I am trying to compress and archive multiple files, but when listing the files with tar -tvf it doesn't show all of them.
Each pass of your loop recreates server2.tar.gz from scratch, so every iteration overwrites the previous one and only the last file ends up in the archive. You don't need a loop here. Just list all the files you want to add as command-line parameters:
tar -zcvf server2.tar.gz server.log.2016-07-05 server.log.2016-07-06
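If a loop were really wanted, the archive would have to be appended to rather than recreated on each pass; a sketch, noting that tar's append mode -r cannot be combined with -z, so compression becomes a separate final step:
for i in server.log.2016-07-05 server.log.2016-07-06; do
    tar -rvf server2.tar "$i"    # -r appends to the (uncompressed) archive
done
gzip server2.tar                 # compress once, at the end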
The same no-loop approach works for your other example too:
tar -zcvf server2.tar.gz $(ls -l | awk '/'$Previousmonth'/ && /server.log./ {print $NF}')
Except that parsing the output of ls -l is fragile and strongly discouraged.
But since the filenames to back up contain the year and month, a much simpler and better solution is to build that prefix with the date command and then use shell globbing:
prefix=$(date +%Y-%m -d 'last month')
tar -zcvf server2.tar.gz server.log.$prefix-??

unix script archive log files older than 15 days

I have a list of log files in a directory, which have piled up for more than a year now. I've written the below script to archive the log files that are older than 15 days.
Script:
#!/bin/bash
files=($(find /opt/Informatica/9.5.1/server/infa_shared/SessLogs -type f -mtime +15))
file=SessLog_bkup_`date +"%y-%m-%d"`.tar.gz
Backup=/opt/Informatica/9.5.1/server/infa_shared/SessLogs/Backup
tar -zcf $file --remove-files "${files[@]}"
mv $file $Backup
But, when I run the script it throws below error
Error:
./backuplogs.sh: line 5: /bin/tar: Argument list too long.
Please advise if I'm missing something in the script
thanks for the help
Kiran
Your error message is due to failure of execve(2) of /bin/tar by your shell with E2BIG.
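You can inspect that limit on your system:
getconf ARG_MAX    # upper bound, in bytes, on the argument list plus environment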
Read the man page of tar(1). You could use
-T, --files-from=FILE
Get names to extract or create from FILE.
and combine that with some other parts of your script (e.g. redirecting the find command output to some temporary file, to be passed via -T to tar).
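A sketch of that approach, reusing the paths from the question (the mktemp name list is an assumption, and the Backup subdirectory is pruned so earlier archives aren't swept up again):
#!/bin/bash
logdir=/opt/Informatica/9.5.1/server/infa_shared/SessLogs
file=SessLog_bkup_$(date +%y-%m-%d).tar.gz
list=$(mktemp)                                    # temporary file holding the name list
find "$logdir" -path "$logdir/Backup" -prune -o -type f -mtime +15 -print > "$list"
tar -zcf "$file" --remove-files -T "$list"        # tar reads the names from the file, not argv
rm -f "$list"
mv "$file" "$logdir/Backup"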
But as commented by hek2mgl you really want logrotate(8)
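For reference, a hypothetical logrotate stanza for this directory might look like the following (the *.log pattern and the retention of 15 rotations are assumptions):
/opt/Informatica/9.5.1/server/infa_shared/SessLogs/*.log {
    daily
    rotate 15          # keep 15 rotated copies
    compress           # gzip rotated logs
    missingok          # no error if nothing matches
    notifempty         # skip empty logs
}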
You could also use other archivers, e.g. afio(1)

How to zip files using last modified date and time using shell command in linux

I am trying to zip files by last modified date and time with the following shell command in Linux.
zip -rt $(date +"%Y-%m-%d:%H:%M:%S") destination.zip source_documents
E.g
zip -rt 2015-03-24:17:14:39 destination.zip source_documents
It does not work; it picks up the whole day's files.
Following the man page, -t takes a date in the format mmddyyyy. The command should look like this:
zip -rt $(date +"%m%d%Y") destination.zip source_documents
which gives (today):
zip -rt 03242015 destination.zip source_documents
Since zip -t doesn't take the time of day into consideration, let find do the job instead; zip's -@ option makes it read the file names to archive from standard input, e.g.
find source_documents -newermt "2015-03-24 17:14:38" | zip -@ destination

uncompressing a large number of files on the fly

I have a script that I need to run on a large number of files with the extension *.tar.gz.
Instead of uncompressing them and then running the script, I want to be able to uncompress them as I run the command and then work on the uncompressed folder, all with a single command.
I think a pipe is a good solution for this, but I haven't used one before. How would I do this?
The -v flag orders tar to print file names as it extracts each file:
tar -xzvf file.tar.gz | xargs -I {} -d\\n myscript "{}"
This way your script only needs to handle a single file, whose name xargs passes to it as a parameter ($1 in the script context).
Edit: the -I {} -d\\n part will make it work with spaces in filenames.
The following three lines of bash...
for archive in *.tar.gz; do
tar zxvf "${archive}" 2>&1 | sed -e 's!x \([^/]*\)/.*!\1!' | sort -u | xargs some_script.sh
done
...will iterate over each gzipped tarball in the current directory, decompress it, grab the top-most directories of the decompressed contents and pass those as arguments to some_script.sh. This probably uses more pipes than you were expecting, but it seems to do what you are asking for.
N.B.: tar xf can only take one archive per invocation.
You can use a for loop:
for file in *.tar.gz; do tar -xf "$file"; your commands here; done
Or expanded:
for file in *.tar.gz; do
tar -xf "$file"
# your commands here
done
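If "work on the uncompressed folder" means each archive should be extracted into its own directory, a sketch of that variant (myscript is a placeholder for your actual command):
for file in *.tar.gz; do
    dir=${file%.tar.gz}          # name the directory after the archive
    mkdir -p "$dir"
    tar -xzf "$file" -C "$dir"   # -C extracts into that directory
    myscript "$dir"              # placeholder
done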
