tar exclude is not working inside the bash script - linux

I am trying to create a tar file of a folder that has a lot of files which need to be excluded, so I wrote a script (mytar):
#!/usr/bin/env bash
# more files to be included
IGN=""
IGN="$IGN --exclude='notme.txt'"
tar --ignore-failed-read $IGN -cvf "$1" "$2"
# following command is working perfectly
# bash -c "tar --ignore-failed-read $IGN -cvf '$1' '$2'"
Test folder:
test/
notme.txt
test.txt
test2.txt
If I execute the script, it creates a tar file but doesn't exclude the files I have listed in IGN.
The command it should end up running is:
tar --ignore-failed-read --exclude='notme.txt' -cvf test1.tar test
The command works perfectly fine if it is executed directly in the shell. I have also found a workaround for the script: using bash -c in the script file:
bash -c "tar --ignore-failed-read $IGN -cvf '$1' '$2'"
I am wondering and trying to figure out:
Why is this simple command not working without bash -c?
Why does it work with bash -c?
Output:
The first output should not contain the notme.txt file, as the later one (with bash -c) does not.
UPDATE 1: script updated.

This has to do with the way bash expands variables in its shell.
When you set:
IGN="--exclude='notme.txt'"
it will be expanded as:
tar --ignore-failed-read '--exclude='\''notme.txt'\''' -cvf test1.tar test
and as such tar will look to exclude a file literally named 'notme.txt', single quotes included, which it won't find.
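You can see what word the shell actually produces by printing it (a small demonstration, reusing the same assignment):
IGN="--exclude='notme.txt'"
printf '<%s>\n' $IGN    # deliberately unquoted, as in the script
# prints: <--exclude='notme.txt'>  ... the single quotes are part of the pattern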
You may use:
IGN=--exclude='notme.txt'
which will be interpreted correctly after shell expansion (the quotes are removed when the assignment is parsed), and tar will understand it. But I would rather suggest you use your variable to store only the file name to be excluded:
IGN="notme.txt"
tar --exclude="$IGN" -cvf ./test1.tar ./*
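Since the original script has many files to exclude, another option worth mentioning is GNU tar's --exclude-from option, which reads one exclude pattern per line from a file (a sketch; excludes.txt and the second pattern are assumed names):
printf '%s\n' 'notme.txt' 'alsonotme.txt' > excludes.txt
tar --ignore-failed-read --exclude-from=excludes.txt -cvf test1.tar test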

In the following command the single quotes are syntactical (not literal; the filename argument is not literally surrounded by quotes) and prevent the shell from splitting the argument in case it contains a space or a tab:
tar --ignore-failed-read --exclude='notme.txt' -cvf test1.tar test
The closest equivalent in a script is to use an array instead of a string variable:
ign=( --exclude='notme.txt' )
tar --ignore-failed-read "${ign[@]}" -cvf test1.tar test
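Applied to the original mytar script, that would look roughly like this (a sketch, keeping the script's own option names and positional arguments):
#!/usr/bin/env bash
ign=( --ignore-failed-read )
ign+=( --exclude='notme.txt' )    # append more --exclude=... entries as needed
tar "${ign[@]}" -cvf "$1" "$2"
Each array element stays a separate word when expanded with "${ign[@]}", so the exclude patterns reach tar without any extra quotes.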

Related

How to get stdout of tar command

I am trying to tar a file and get its output stored in a variable.
I tried this but it is not working:
resulting_tar=$(tar -zcf "$(date '+%Y-%m-%d').tar.gz" folder)
Any idea how I should go about it?
By default, tar does not report the name of the file created. In fact, it doesn't say anything unless you tell it to, and the options given don't tell it to say anything.
Note that tar doesn't tell you what file it created. You tell tar what file to create.
You'll need to capture the name of the file in a variable and report it yourself:
file="$(date '+%Y-%m-%d').tar.gz"
tar -czf "$file" folder
echo "$file"
Try running tar -czf /dev/null folder; you won't see anything from (most implementations of) tar — and that's not because I specified /dev/null. Specify a name if you prefer: tar -czf junk.tar.gz folder and watch the (lack of) output — and remember to remove junk.tar.gz.
You might want to think about including the folder name in the tar file name, too.
folder="…whatever…"
file="$folder-$(date +'%Y-%m-%d').tar.gz"
tar -czf "$file" "$folder"
echo "$file"
EDIT: No longer applicable after further clarification. Leaving for posterity.
You're likely looking for both stdout and stderr. You can combine the two output streams by appending 2>&1 to your command:
resulting_tar=$(tar -zcf "$(date '+%Y-%m-%d').tar.gz" folder 2>&1)
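Relatedly, if what you want to capture is the list of files that went into the archive, the -v option prints that list (with GNU tar it normally goes to stdout when the archive is written to a named file), so a sketch with an assumed folder name could look like:
file="$(date '+%Y-%m-%d').tar.gz"
listing=$(tar -czvf "$file" folder)    # capture the verbose member listing
echo "Created $file containing:"
echo "$listing"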

tar: Removing leading `/' from member names "it is not duplicate"

#!/bin/bash
source="/home/user/work/tar/deneme"
source2="/home/user/work/tar/deneme1"
for i in {1..5}
do
tar -czvf $source2/$i/$i.tar.gz $source/$i/
done
I get this error message.
tar: Removing leading `/' from member names
This is my script and the error. There are a lot of similar questions here, but they don't solve my problem. When I run the script it creates the .tar.gz file, but if I extract it with tar -xzvf 1.tar.gz, my files are created under the full path, like home/user/work/tar/deneme/1/1-1.txt.
Do you have any idea?
I have tried some approaches, for example:
Find /SED to convert absolute path to relative path within a single line tar statement for crontab
https://unix.stackexchange.com/questions/59243/tar-removing-leading-from-member-names/59244
This is because GNU tar removes the leading / by default. To avoid it you can rewrite your script this way:
#!/bin/bash
cd /home/user/work/tar
source="deneme"
source2="deneme1"
for i in {1..5}
do
mkdir -p ${source2}/${i}
tar -czvf ${source2}/${i}/${i}.tar.gz ${source}/${i}/
done
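If you prefer not to cd inside the script, GNU tar's -C option changes directory before reading the files to archive, which also keeps the stored member names relative (a minimal sketch using the same assumed paths):
#!/bin/bash
base="/home/user/work/tar"
for i in {1..5}
do
    mkdir -p "$base/deneme1/$i"
    # -C makes tar read deneme/$i relative to $base, so no leading / is stored
    tar -czvf "$base/deneme1/$i/$i.tar.gz" -C "$base" "deneme/$i/"
done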
Thank you for all your comments and answers.
I found the solution by changing the code inside the for loop:
mkdir $source2/$i
cd $source/
tar -czvf $source2/$i/$i.tar.gz $i/*

Using 'tar' command in for loop

I know it is a basic question, but please do help me. I compressed and archived my server log files using the tar command:
for i in server.log.2016-07-05 server.log.2016-07-06 ; do tar -zcvf server2.tar.gz $i; done
The output of the above loop is:
server.log.2016-07-05
server.log.2016-07-06
But when listing the archive using tar -tvf server2.tar.gz, the output obtained is:
rw-r--r-- root/root 663643914 2016-07-06 23:59 server.log.2016-07-06
i.e., I archived two files but only one file was displayed, which means the archive doesn't contain both files, right? Please help with this.
I just tested with these two files, but my folder has many more. Since I didn't get the expected output, I did not proceed with all the files in my folder. The exact loop I am going to use is:
Previousmonth=$(date "+%b" --date '1 month ago')
for i in $(ls -l | awk '/'$Previousmonth'/ && /server.log./ {print $NF}'); do tar -zcvf server2.tar.gz $i; done
I am trying to compress and archive multiple files, but when listing the archive with tar -tvf it doesn't show all of them.
Each pass of your loop recreates server2.tar.gz from scratch (that is what -c does), so the archive only ever contains the last file. You don't need a loop here. Just list all the files you want to add as command line parameters:
tar -zcvf server2.tar.gz server.log.2016-07-05 server.log.2016-07-06
The same goes for your other example too:
tar -zcvf server2.tar.gz $(ls -l | awk '/'$Previousmonth'/ && /server.log./ {print $NF}')
Except that parsing the output of ls -l is awful and strongly not recommended.
But since the filenames to back up contain the year and month, a much simpler and better solution is to get that prefix with the date command and then use shell globbing:
prefix=$(date +%Y-%m -d 'last month')
tar -zcvf server2.tar.gz server.log.$prefix-??
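If you do want to build the file list in a loop (for example to apply extra filtering), collect the names into an array and call tar once at the end, since every tar -c invocation recreates the archive (a sketch reusing the same assumed naming):
prefix=$(date +%Y-%m -d 'last month')
files=()
for f in server.log."$prefix"-??; do
    files+=( "$f" )
done
tar -zcvf server2.tar.gz "${files[@]}"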

Execute multiple commands on target files from find command

Let's say I have a bunch of *.tar.gz files located in a hierarchy of folders. What would be a good way to find those files and then execute multiple commands on them?
I know if I just need to execute one command on the target file, I can use something like this:
$ find . -name "*.tar.gz" -exec tar xvzf {} \;
But what if I need to execute multiple commands on the target file? Must I write a bash script here, or is there any simpler way?
Sample commands that need to be executed on a file A.tar.gz:
$ tar xvzf A.tar.gz # assume it untars to folder logs
$ mv logs logs_A
$ rm A.tar.gz
Here's what works for me (thanks to Etan Reisner's suggestions):
#!/bin/bash
# the target folder (to search for tar.gz files) is passed on the command line
find "$1" -name "*.tar.gz" -print0 | while IFS= read -r -d '' file; do
    # this does the magic of getting each tar.gz file and assigning it to the shell variable `file`
    echo "$file"    # then we can do everything with the `file` variable
    tar xvzf "$file"
    # mv untar_folder "$file".suffix    # untar_folder is the name of the folder after untarring
    rm "$file"
done
As suggested, the array way below is unsafe if a file name contains spaces, and it also doesn't seem to work properly in this case.
Writing a shell script is probably easiest. Take a look at sh for loops. You could use the output of a find command in an array, and then loop over that array to perform a set of commands on each element.
For example,
arr=( $(find . -name "*.tar.gz" -print0) )
for i in "${arr[@]}"; do
# $i now holds each of the filenames output by find
tar xvzf $i
mv $i $i.suffix
rm $i
# etc., etc.
done
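If you would rather avoid both the pipe and the array, find can also hand each batch of files to a small inline shell (a sketch under the question's assumption that every archive unpacks into a folder named logs; the basename handling is illustrative):
find . -name "*.tar.gz" -exec sh -c '
    for f in "$@"; do
        base=$(basename "$f" .tar.gz)
        tar xvzf "$f"          # assumed to untar into a folder named "logs"
        mv logs "logs_$base"
        rm "$f"
    done
' sh {} +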

uncompressing a large number of files on the fly

I have a script that I need to run on a large number of files with the extension *.tar.gz.
Instead of uncompressing them and then running the script, I want to be able to uncompress them as I run the command and then work on the uncompressed folder, all with a single command.
I think a pipe is a good solution for this, but I haven't used one before. How would I do this?
The -v option tells tar to print filenames as it extracts each file:
tar -xzvf file.tar.gz | xargs -I {} -d\\n myscript "{}"
This way the script will contain commands to deal with a single file, passed as a parameter (thanks to xargs) to your script ($1 in the script context).
Edit: the -I {} -d\\n part will make it work with spaces in filenames.
The following three lines of bash...
for archive in *.tar.gz; do
tar zxvf "${archive}" 2>&1 | sed -e 's!x \([^/]*\)/.*!\1!' | sort -u | xargs some_script.sh
done
...will iterate over each gzipped tarball in the current directory, decompress it, grab the top-most directories of the decompressed contents and pass those as arguments to some_script.sh. This probably uses more pipes than you were expecting, but it seems to do what you are asking for.
N.B.: tar xf can only take one archive file per invocation.
You can use a for loop:
for file in *.tar.gz; do tar -xf "$file"; your commands here; done
Or expanded:
for file in *.tar.gz; do
tar -xf "$file"
# your commands here
done
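For instance, if each archive is assumed to unpack into a directory named after itself (minus .tar.gz), the loop body could run your script on that folder; myscript below is a placeholder for your own script:
for file in *.tar.gz; do
    dir=${file%.tar.gz}    # assumed: the archive extracts into this folder
    tar -xzf "$file"
    myscript "$dir"        # work on the uncompressed folder
done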
