How to untar all .tar.gz files with a shell script? - linux

I tried this:
DIR=/path/tar/*.gz
if [ "$(ls -A $DIR 2> /dev/null)" == "" ]; then
echo "not gz"
else
tar -zxvf /path/tar/*.gz -C /path/tar
fi
If the folder has one tar, it works. If the folder has many tars, I get an error.
How can I do this?
My idea is to run a loop that untars each file, but I don't know how to solve this problem.
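(The multi-archive error happens because the shell expands the glob before tar runs, and tar treats every argument after the first as a member name to extract from the first archive. A hypothetical illustration with two archives:
# The glob /path/tar/*.gz expands to both archives:
tar -zxvf /path/tar/a.tar.gz /path/tar/b.tar.gz -C /path/tar
# tar now searches *inside* a.tar.gz for a member named
# "/path/tar/b.tar.gz" and fails with "Not found in archive".
That is why a loop, which runs tar once per archive, is the right fix.)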

for f in *.tar.gz
do
tar zxvf "$f" -C /path/tar
done

I find the find exec syntax very useful:
find . -name '*.tar.gz' -exec tar -xzvf {} \;
{} gets replaced with each file found and the line is executed.
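If the paths may contain spaces, a null-delimited variant of the same idea is safer; -n1 runs one tar per archive, since tar extracts only one archive per invocation:
find . -name '*.tar.gz' -print0 | xargs -0 -n1 tar -xzvf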

for a in /path/tar/*.gz
do
tar -xzvf "$a" -C /path/tar
done
Notes
This presumes that files ending in .gz are gzipped tar files. Usually .tgz or .tar.gz is used to signify this; in any case, tar will fail if something is not right.
You may find it easier to cd /path/tar first, then you can drop the -C /path/tar from the untar command, and the /path/tar/ in the loop.
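A sketch of that simplification, assuming /path/tar from the question:
cd /path/tar || exit 1    # bail out if the directory does not exist
for a in *.gz
do
    tar -xzvf "$a"        # extracts into the current directory
done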

The accepted answer worked for me with a slight modification:
for f in *.tar.gz
do
tar zxvf "$f" -C \name_of_destination_folder_inside_current_path
done
I had to change the leading forward slash to a backslash before it worked. (The reason: /name... is an absolute path from the filesystem root, while the shell simply strips the backslash, leaving a path relative to the current directory; omitting the leading slash entirely has the same effect.)
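A minimal sketch of the same fix without the escape trick (the destination name here is hypothetical; tar's -C just needs a path relative to the current directory):
mkdir -p extracted        # hypothetical destination inside the current path
for f in *.tar.gz
do
    tar zxvf "$f" -C extracted
done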

Related

Execute multiple commands on target files from find command

Let's say I have a bunch of *.tar.gz files located in a hierarchy of folders. What would be a good way to find those files and then execute multiple commands on each of them?
I know if I just need to execute one command on the target file, I can use something like this:
$ find . -name "*.tar.gz" -exec tar xvzf {} \;
But what if I need to execute multiple commands on the target file? Must I write a bash script here, or is there any simpler way?
Sample commands that need to be executed on an A.tar.gz file:
$ tar xvzf A.tar.gz # assume it untars to folder logs
$ mv logs logs_A
$ rm A.tar.gz
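A common way to run several commands per match is to hand each file to a small inline shell with find -exec; a minimal sketch, assuming (as the question does) that each archive untars to a folder named logs:
find . -name "*.tar.gz" -exec sh -c '
    tar xvzf "$1" &&                              # extract; assumed to create ./logs
    mv logs "logs_$(basename "$1" .tar.gz)" &&    # rename per archive, e.g. logs_A
    rm "$1"                                       # delete the archive only on success
' sh {} \;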
Here's what works for me (thanks to Etan Reisner's suggestions):
#!/bin/bash
# the target folder (to search for tar.gz files) is parsed from the command line
find "$1" -name "*.tar.gz" -print0 | while IFS= read -r -d '' file; do
    # this does the magic of getting each tar.gz file into the shell variable `file`
    echo "$file"        # then we can do everything with the `file` variable
    tar xvzf "$file"
    # mv untar_folder "$file".suffix   # untar_folder is the name of the folder after untar
    rm "$file"
done
As suggested, the array way is unsafe if a file name contains spaces, and it also doesn't seem to work properly in this case.
Writing a shell script is probably easiest. Take a look at sh for loops. You could store the output of a find command in an array, and then loop over that array to perform a set of commands on each element.
For example,
arr=( $(find . -name "*.tar.gz") )
for i in "${arr[@]}"; do
    # $i now holds each of the filenames output by find
    tar xvzf "$i"
    mv "$i" "$i".suffix
    rm "$i"
    # etc., etc.
done
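For reference, a null-safe version of the array idea (not part of the original answer; this needs bash 4.4+ for mapfile -d ''):
mapfile -d '' arr < <(find . -name "*.tar.gz" -print0)   # NUL-delimited, space-safe
for i in "${arr[@]}"; do
    tar xvzf "$i"    # run whatever per-file commands you need here
done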

How to find all tar files in various sub-folders, then extract them in the same folder they were found?

I have lots of sub-folders, with only some containing a tar file. i.e.:
folder1/
folder2/this-is-a.tar
folder3/
folder4/this-is-another.tar
I can find which dirs have the tar by simply doing ls */*.tar.
What I want to achieve is somehow find all .tar files, then extract them in the same directory they are found, then delete the .tars.
I've tried ls */*.tar | xargs -n1 tar xvf but that extracts the tars in the directory I'm in, not the directory the tars were found in.
Any help would be greatly appreciated.
for i in */*.tar ; do pushd "$(dirname "$i")" ; tar xf "$(basename "$i")" && rm "$(basename "$i")" ; popd ; done
Edit: this is probably a better way:
find . -type f -iname "*.tar" -print0 -execdir tar xf {} \; -delete
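The same extract-next-to-the-archive behavior can also be written as an explicit null-delimited loop, which is easier to extend with extra steps:
find . -type f -iname "*.tar" -print0 | while IFS= read -r -d '' f; do
    dir=$(dirname "$f")                  # the directory the tar was found in
    tar -xf "$f" -C "$dir" && rm "$f"    # delete only after a successful extraction
done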
for file in */*.tar; do
(cd "$(dirname "$file")" && tar xvf "$(basename "$file")")
unlink "$file"
done

bash: /bin/tar: Argument list too long when compressing many files with tar

I am trying to compress files into an archive with the command
tar -czvf compress_file.tar.gz $(cat file_list.txt)
And I have an error
-bash: /bin/tar: Argument list too long
The number of files is too large; how can I resolve this?
Use the "-T" option to pass a file to tar that contains the filenames to tar up.
tar -czv -T file_list.txt -f tarball.tar.gz
And here is one way to make the list of files to tar up. First create the list:
ls > temp
then (-X takes a file of patterns to exclude; FILE is a placeholder for such an exclusion list, which should include temp itself, since ls > temp lists the temp file too):
tar cvzf dicionario_ultra.tgz -X FILE -T temp
and finally
rm temp
You can use find to avoid the issue: it lists the files under the current folder, and -print writes one name per line, which tar then reads from stdin via -T -
find . -type f -print | tar -cvf somefile.tar -T -
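If any file names could contain newlines, GNU find and GNU tar can pass the list null-delimited instead (--null tells tar that the -T list is NUL-separated):
find . -type f -print0 | tar --null -cvf somefile.tar -T -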

Extract and delete all .gz in a directory- Linux

I have a directory. It has about 500K .gz files.
How can I extract all .gz in that directory and delete the .gz files?
This should do it (note that gunzip removes each .gz file once it has extracted it):
gunzip *.gz
With around 500K files, though, the expanded glob may itself exceed the argument-list limit, in which case the find-based answers below avoid the problem.
@techedemic is correct, but this is missing the '.' naming the current directory, and the command goes through all subdirectories:
find . -name '*.gz' -exec gunzip '{}' \;
There's more than one way to do this, obviously.
# This will find files recursively (you can limit it with additional 'find'
# parameters; see the man pages). The final backslash is required for the
# exec example to work.
find . -name '*.gz' -exec gunzip '{}' \;
# This will do it only in the current directory
for a in *.gz; do gunzip "$a"; done
I'm sure there are other ways as well, but this is probably the simplest.
And to remove them, just do a rm -rf *.gz in the applicable directory (though gunzip will already have removed each .gz it successfully extracted).
Extract all gz files in current directory and its subdirectories:
find . -name "*.gz" | xargs gunzip
If you want to extract a single file use:
gunzip file.gz
It will extract the file and remove the .gz file.
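With half a million files, some names may contain spaces; the null-delimited form of the same pipeline handles them (standard GNU find/xargs options):
find . -name "*.gz" -print0 | xargs -0 gunzip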
for foo in *.gz
do
tar xf "$foo" && rm "$foo"    # delete only if extraction succeeded
done
Try:
ls -1 | grep -E "\.tar\.gz$" | xargs -n 1 tar xvfz
Then Try:
ls -1 | grep -E "\.tar\.gz$" | xargs -n 1 rm
This will untar all .tar.gz files in the current directory and then delete them. If you want an explanation: the "|" takes the stdout of the command before it and uses it as the stdin of the command after it. Use "man command" (without the quotes) to figure out what those commands and arguments do, or research online.
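Since parsing ls output breaks on unusual file names, a plain glob loop does the same two steps more safely, and only deletes an archive after it extracted cleanly:
for f in *.tar.gz; do
    tar xvzf "$f" && rm "$f"    # extract, then delete only on success
done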

for each dir create a tar file

I have a bunch of directories that need to be restored, but they have to first be packaged into a .tar. Is there a script that would allow me to package all 100+ directories into their own tar so dir becomes dir.tar.
My attempt so far:
for i in *; do tar czf $i.tar $i; done
The script that you wrote will not work if a directory name contains spaces, because the name will be split; it will also tar plain files that exist at this level, not just directories.
You can use this command to list directories not recursively:
find . -maxdepth 1 -mindepth 1 -type d
and this one to perform a tar on each one:
find . -maxdepth 1 -mindepth 1 -type d -exec tar cvf {}.tar {} \;
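The same per-directory tar can be written as a null-delimited loop; ${d#./} strips the leading ./ that find adds, so the archives are named folder1.tar rather than ./folder1.tar (they land in the same place either way):
find . -maxdepth 1 -mindepth 1 -type d -print0 | while IFS= read -r -d '' d; do
    tar cvf "${d#./}.tar" "$d"    # e.g. ./folder1 -> folder1.tar
done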
Do you have any directory names with spaces in them at that level? If not, your script will work just fine.
What I usually do is write a script with the command I want to execute echoed out:
$ for i in *
do
echo tar czf $i.tar $i
done
Then you can look at the output and see if it's doing what you want. After you've determined that the program will work, edit the command line and remove the echo command.
If there are spaces in the directory names, then just put the variables inside double quotes:
for i in *
do
tar czf "$i.tar" "$i"
done
Get them all done simply and in parallel with GNU Parallel:
parallel tar -cf {}.tar {} ::: *
If you want to check what it is going to do without actually doing anything, add --dry-run like this:
parallel --dry-run tar -cf {}.tar {} ::: *
Sample Output
tar -cf ab.tar ab
tar -cf cd.tar cd
If the number of directories is very large and their names are long, the echo loop shown above (statement number one) can fail with a "string too long" error.
