I tarred a folder and split it into 200 MB tar.gz pieces when zipping. How can I go about unzipping them? Is there a way to do this in one command, or do I have to do each one separately?
You cannot even do them separately; the split pieces are not valid archives on their own.
Just undo what you did, in reverse order:
first concatenate them
then unzip them
then untar
So you do
cat *.tar.gz.* | zcat | tar xvf -
or, even shorter,
cat *.tar.gz.* | tar xvfz -
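For reference (an addition, not part of the original answers), a sketch of how such pieces are typically produced and then reassembled, assuming GNU split and the 200 MB piece size from the question; the folder and prefix names here are illustrative:
tar czf - folder/ | split -b 200M - archive.tar.gz.
cat archive.tar.gz.* | tar xzvf -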
You can use the below:
$ cat *.tar | tar -xvf - -i
The cat command concatenates the listed .tar files, and the combined stream is then extracted by the tar -xvf - -i command (the -i / --ignore-zeros option keeps tar from stopping at the end-of-archive blocks between the concatenated archives).
Requirement: Archive files using UNIX shell script into .gz format without directory structure
I am using the below command:
tar -C source_dir -zcvf target_dir/xyz.gz source_dir
example:
tar -C /home/log -zcvf /home/archive/xyz.gz /home/log
here xyz.gz contains /home/log
It creates the xyz.gz file, but it maintains the directory structure. I want only the files to be archived, without the directory structure.
You can try the following command:
$ cd /home/log
$ tar zcvf /home/archive/xyz.gz *
You can use the --transform option to strip leading path components from the archived file names using a sed expression:
tar -C /home/log -zcvf /home/archive/xyz.gz --transform 's_.*/__' /home/log
This, however, will also write an entry for each directory it encounters. If you don't want that, you can use find to select only regular files and pass them to tar on stdin like this:
cd /home/log
find -type f -print0 | tar -zcvf /home/archive/xyz.gz --transform 's_.*/__' --verbatim-files-from --null -T -
Note that this may create multiple entries with the same name in the tar archive, if files with the same name exist in different subdirectories. Also you should probably use the conventional .tar.gz or .tgz extension for the compressed tar archive.
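To double-check the result (a quick verification, not part of the original answer), list the archive and confirm that its entries carry no leading path:
tar -tzf /home/archive/xyz.gz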
Can anyone tell me how to extract a tar file using wildcards? For example:
$ tar -xvf file1_*.tar dir1/
Thanks in advance
You can execute the following in the same dir as the tars.
for filename in ./file1_*.tar; do tar -xvf "$filename" -C ./dir1/; done
To extract multiple tar files in a single directory, try the following (from the directory containing the files):
ls file1_*.tar | xargs -I{} tar -xvf {} dir1/
The command lists the tar files using your pattern in the current directory, piping them to xargs, which will execute the tar command on each file using the pattern tar -xvf {filename} dir1/.
To see exactly what will be performed, modify the above command to
ls file1_*.tar | xargs -I{} echo tar -xvf {} dir1/
xargs is an incredibly powerful tool worth learning to use from the command line whenever a single command needs to be performed on multiple inputs, and it will often save you a lot of time.
This post also has another alternative.
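As a side note (an addition, assuming GNU xargs), xargs can also run several extractions in parallel with the -P option:
ls file1_*.tar | xargs -P 4 -I{} tar -xvf {} dir1/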
I have the following archived directory:
itunes20140618.tbz
I want to extract single file from it called:
itunes20140618/video
How would I do this?
So far, I am doing
$ bzip2 -d /tmp/itunes20140618.tbz
But it just seems to produce a tar archive of everything. How would I extract just the single video file?
There are a few different versions of tar around, but on my machine I can do this:
tar xjf archive.tbz filename
To extract filename from archive.
If that doesn't work you can use:
bzip2 -dc archive.tbz | tar xvf - filename
Which uses bzip2 to extract to stdout and then pipe to tar.
In both cases you can replace the x option with t to get a list of files. Eg:
tar tjf archive.tbz
You can use the tar command and pass the path of the desired file or folder as an argument to it:
tar xjf test.tbz /path/to/file/in/archive
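Applied to the archive and file named in the question, that would be:
tar xjf /tmp/itunes20140618.tbz itunes20140618/video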
I'm working on a backup script and want to tar up a file directory:
tar czf ~/backup.tgz /home/username/drupal/sites/default/files
This tars it up, but when I untar the resulting file, it includes the full file structure: the files are in home/username/drupal/sites/default/files.
Is there a way to exclude the parent directories, so that the resulting tar just knows about the last directory (files)?
Use the --directory option:
tar czf ~/backup.tgz --directory=/home/username/drupal/sites/default files
Hi, I have a better solution for when entering the specified directory is impossible (Makefiles, etc.):
tar -cjvf files.tar.bz2 -C directory/contents/to/be/compressed .
Do not forget the dot (.) at the end !!
cd /home/username/drupal/sites/default/files
tar czf ~/backup.tgz *
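One caveat (not mentioned in the original answer): * does not match hidden files, so if you need those included, in bash you can enable dotglob first:
shopt -s dotglob
tar czf ~/backup.tgz *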
Create a tar archive
tar czf $sourcedir/$backup_dir.tar --directory=$sourcedir WEB-INF en
Un-tar files on a local machine
tar -xvf $deploydir/med365/$backup_dir.tar -C $deploydir/med365/
Upload to a server
scp -r -i $privatekey $sourcedir/$backup_dir.tar $server:$deploydir/med365/
echo "File uploaded.. deployment folders"
Un-tar on server
ssh -i $privatekey $server tar -xvf $deploydir/med365/$backup_dir.tar -C $deploydir/med365/
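For completeness, a sketch of the shell variables these steps assume; every value below is an illustrative placeholder:
sourcedir=/path/to/build            # directory containing WEB-INF and en
backup_dir=backup_$(date +%Y%m%d)   # archive name without the .tar extension
deploydir=/path/to/deploy           # deployment directory (used locally and on the server)
privatekey=$HOME/.ssh/id_rsa        # SSH key used by scp and ssh
server=user@example.com             # deployment host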
To gzip all txt (*.txt) files from /home/myuser/workspace/zip_from/
into /home/myuser/workspace/zip_to/ without the directory structure of the source files, use the following command:
tar -P -cvzf /home/myuser/workspace/zip_to/mydoc.tar.gz --directory="/home/myuser/workspace/zip_from/" *.txt
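One caveat worth noting (an observation, not from the original answer): the shell expands *.txt in your current directory, not inside the --directory path, so the command is most predictable when run from the source directory itself, for example:
cd /home/myuser/workspace/zip_from
tar -cvzf /home/myuser/workspace/zip_to/mydoc.tar.gz *.txt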
If you want to tar files while keeping the structure, but ignore it partially or completely when extracting, use the --strip-components argument.
In this case, where the full path is /home/username/drupal/sites/default/files, the following command would extract the tar.gz content without the full parent directory structure, keeping only the last directory of the path (e.g. files/file1).
tar -xzv --strip-components=5 -f backup.tgz
I've found this tip on https://www.baeldung.com/linux/tar-archive-without-directory-structure#5-using-the---strip-components-option.
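If you are not sure how many components to strip, a quick check (assuming GNU tar, as above) is to list the archive first and count the leading directories:
tar -tzf backup.tgz | head -n 3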
To build on nbt's and MaikoID's solutions:
tar -czf destination.tar.gz -C source/directory $(ls source/directory)
This solution:
Includes all files and folders in the directory
Does not include any of the directory structure (or .) in the final product
Does not require you to change directories.
However, it requires the directory to be given twice, so it may be most useful in another script. It may also be less efficient if there are a lot of files/folders in source/directory. Adjust the subcommand as necessary.
So for instance for the following structure:
|- source
| |- one
| `- two
`- working
the following command:
working$ tar -czf destination.tar.gz -C ../source $(ls ../source)
will produce destination.tar.gz where both one and two (and sub-files/-folders) are the first items.
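If source/directory may contain file names with spaces, a safer sketch (an addition, assuming GNU find and GNU tar) feeds NUL-separated names to tar instead of parsing ls:
find source/directory -mindepth 1 -maxdepth 1 -printf '%f\0' | tar -czf destination.tar.gz -C source/directory --null -T -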
This worked for me:
gzip -dc "<your_file>.tgz" | tar x -C <location>
For me, -C or --directory did not work, so I use this:
cd source/directory/or/file
tar -cvzf destination/packaged-app.tgz *.jar
# this returns you to the directory you were in before
cd -
Use the below command to generate a tar file without the directory structure:
tar -C <directoryPath> -cvzf <Path of the tar.gz file> filename1 filename2 ... filenameN
eg:
tar -C /home/project/files -cvzf /home/project/files/test.tar.gz text1.txt text2.txt
tar czf ~/backup.tgz -C /home/username/drupal/sites/default files
-C does the cd for you, but it must be given as a separate option with the directory as its argument (it cannot be bundled as -Cczf)
I have a script that I need to run on a large number of files with the extension *.tar.gz.
Instead of uncompressing them and then running the script, I want to be able to uncompress them as I run the command and then work on the uncompressed folder, all with a single command.
I think a pipe is a good solution for this, but I haven't used it before. How would I do this?
The -v option orders tar to print filenames as it extracts each file:
tar -xzvf file.tar.gz | xargs -I {} -d\\n myscript "{}"
This way the script only needs to contain commands that deal with a single file, which is passed as a parameter (thanks to xargs) to your script ($1 in the script context).
Edit: the -I {} -d\\n part will make it work with spaces in filenames.
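For illustration only, myscript (the name used above) could then look something like this; the body is a placeholder:
#!/bin/sh
# $1 is the path of one file extracted by tar, as printed by -v
file="$1"
echo "processing $file"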
The following three lines of bash...
for archive in *.tar.gz; do
tar zxvf "${archive}" 2>&1 | sed -e 's!x \([^/]*\)/.*!\1!' | sort -u | xargs some_script.sh
done
...will iterate over each gzipped tarball in the current directory, decompress it, grab the top-most directories of the decompressed contents and pass those as arguments to some_script.sh. This probably uses more pipes than you were expecting, but it seems to do what you are asking for.
N.B.: tar xf can only take one file per invocation.
You can use a for loop:
for file in *.tar.gz; do tar -xf "$file"; your commands here; done
Or expanded:
for file in *.tar.gz; do
tar -xf "$file"
# your commands here
done
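If each archive unpacks into a folder named after the archive (an assumption about your layout), the "your commands here" part could be sketched as:
for file in *.tar.gz; do
  tar -xzf "$file"
  dir="${file%.tar.gz}"   # assumes a top-level folder with this name inside the archive
  myscript "$dir"         # myscript stands in for your actual script
done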