Can anyone tell me how to extract a tar file using wildcards? For example:
$ tar -xvf file1_*.tar dir1/
Thanks in advance
You can execute the following in the same dir as the tars.
for filename in ./file1_*.tar; do tar -xvf "$filename" -C ./dir1/; done
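If dir1/ might not exist yet, or the glob might match nothing, a slightly more defensive sketch of the same loop is:
mkdir -p ./dir1
for filename in ./file1_*.tar; do
    [ -e "$filename" ] || continue   # skip the literal pattern if nothing matched
    tar -xvf "$filename" -C ./dir1/
done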
To extract multiple tar files in a single directory, try the following (from the directory containing the files):
ls file1_*.tar | xargs -I{} tar -xvf {} dir1/
The command lists the tar files matching your pattern in the current directory and pipes them to xargs, which runs tar on each file using the pattern tar -xvf {filename} dir1/.
To see exactly what will be performed, modify the above command to
ls file1_*.tar | xargs -I{} echo tar -xvf {} dir1/
xargs is an incredibly powerful command-line tool worth learning for any case where a single command needs to be performed on multiple inputs; it will often save you a lot of time.
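If the archive names may contain spaces or other odd characters, a more robust sketch of the same idea uses find with NUL-terminated output (GNU find and xargs assumed; extracting into dir1/ via -C, as in the loop answer above):
find . -maxdepth 1 -name 'file1_*.tar' -print0 | xargs -0 -I{} tar -xvf {} -C dir1/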
When I try to tar all the files in a folder using the following command:
tar cvf mailpdfs.tar *.pdf
The shell complains:
ksh: /usr/bin/tar: 0403-027 The parameter list is too long.
How do I deal with this? My folder contains 25000 pdf files, each 2MB in size; how can I copy them quickly?
You can copy/move all the pdf files to a new folder (newfolder below) and then tar that folder.
mv *.pdf newfolder
tar cvf mailpdfs.tar newfolder
Referenced from unix.com
The tar option -T is what you need
-T, --files-from=FILE
get names to extract or create from FILE
You are blowing the limit for file globbing in ksh, so generate the list of files like this instead:
ls | grep '\.pdf$' >files.txt
Then use that file with tar
tar cvf mailpdfs.tar -T files.txt
Finally, you can do away with creating a temporary file to hold the filenames by getting tar to read them from stdin (by giving the -T option the special filename -).
So we end up with this
ls | grep '\.pdf$' | tar cvf mailpdfs.tar -T -
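If the file names might contain spaces or newlines, a NUL-terminated variant is safer (a sketch assuming GNU tar, whose --null option makes -T read NUL-separated names):
find . -maxdepth 1 -name '*.pdf' -print0 | tar cvf mailpdfs.tar --null -T -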
Requirement: archive files into .gz format using a UNIX shell script, without the directory structure.
I am using the below command:
tar -C source_dir -zcvf target_dir/xyz.gz source_dir
example:
tar -C /home/log -zcvf /home/archive/xyz.gz /home/log
Here xyz.gz contains /home/log.
It's creating the xyz.gz file with the directory structure maintained; I want only the files to be archived, without the directory structure.
You can try the following command:
$ cd /home/log
$ tar zcvf /home/archive/xyz.gz *
You can use the --transform option to strip leading path components from the archived file names using a sed expression:
tar -C /home/log -zcvf /home/archive/xyz.gz --transform 's_.*/__' /home/log
This, however, will also write an entry for each encountered directory. If you don't want that, you can use find to select only regular files and pass them to tar on stdin like this:
cd /home/log
find -type f -print0 | tar -zcvf /home/archive/xyz.gz --transform 's_.*/__' --verbatim-files-from --null -T -
Note that this may create multiple entries with the same name in the tar archive if files with the same name exist in different subdirectories. Also, you should probably use the conventional .tar.gz or .tgz extension for the compressed tar archive.
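To confirm the archive really is flat, list it afterwards; each entry should show a bare file name with no leading path:
tar -tzf /home/archive/xyz.gz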
I am trying to compress files into an archive with the command
tar -czvf compress_file.tar.gz $(cat file_list.txt)
And I have an error
-bash: /bin/tar: Argument list too long
The list of files is too long; how can I resolve this?
Use the "-T" option to pass a file to tar that contains the filenames to tar up.
tar -czv -T file_list.txt -f tarball.tar.gz
And here is how to make the list of files to tar up:
first create the list of files to tar up
ls > temp
then
tar cvzf dicionario_ultra.tgz -X FILE -T temp
and finally
rm temp
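To sanity-check the result of either variant, you can count the entries in the archive (assuming GNU tar; tarball.tar.gz is the archive created above):
tar -tzf tarball.tar.gz | wc -l    # should match the number of names in your list file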
You can use find to avoid the issue: it lists the files under the current folder, and -print writes each name on its own line, which tar then reads from stdin via -T -.
find . -type f -print | tar -cvf somefile.tar -T -
I'm working on a backup script and want to tar up a file directory:
tar czf ~/backup.tgz /home/username/drupal/sites/default/files
This tars it up, but when I untar the resulting file, it includes the full file structure: the files are in home/username/drupal/sites/default/files.
Is there a way to exclude the parent directories, so that the resulting tar just knows about the last directory (files)?
Use the --directory option:
tar czf ~/backup.tgz --directory=/home/username/drupal/sites/default files
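You can verify the result by listing the archive; every entry should now start with files/:
tar tzf ~/backup.tgz | head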
Hi, I have a better solution for when entering the specified directory is impossible (Makefiles, etc.):
tar -cjvf files.tar.bz2 -C directory/contents/to/be/compressed .
Do not forget the dot (.) at the end!
cd /home/username/drupal/sites/default/files
tar czf ~/backup.tgz *
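Note that * does not match hidden files by default, so any dotfiles would be left out. In bash you can include them with a sketch like:
cd /home/username/drupal/sites/default/files
shopt -s dotglob    # make * match dotfiles too
tar czf ~/backup.tgz *
shopt -u dotglob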
Create a tar archive
tar czf $sourcedir/$backup_dir.tar --directory=$sourcedir WEB-INF en
Un-tar files on a local machine
tar -xvf $deploydir/med365/$backup_dir.tar -C $deploydir/med365/
Upload to a server
scp -r -i $privatekey $sourcedir/$backup_dir.tar $server:$deploydir/med365/
echo "File uploaded.. deployment folders"
Un-tar on server
ssh -i $privatekey $server tar -xvf $deploydir/med365/$backup_dir.tar -C $deploydir/med365/
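For context, the variables above are not defined in the snippet; a minimal end-to-end sketch with hypothetical values (the paths, key, and server are placeholders) might look like:
#!/bin/sh
# Hypothetical values; substitute your own.
sourcedir=/home/user/app
backup_dir=backup_$(date +%Y%m%d)
deploydir=/opt/deploy
privatekey=$HOME/.ssh/id_rsa
server=user@example.com

tar czf "$sourcedir/$backup_dir.tar" --directory="$sourcedir" WEB-INF en
scp -i "$privatekey" "$sourcedir/$backup_dir.tar" "$server:$deploydir/med365/"
ssh -i "$privatekey" "$server" tar -xf "$deploydir/med365/$backup_dir.tar" -C "$deploydir/med365/"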
To tar and gzip all txt (*.txt) files from /home/myuser/workspace/zip_from/
into /home/myuser/workspace/zip_to/ without the directory structure of the source files, use the following command:
tar -P -cvzf /home/myuser/workspace/zip_to/mydoc.tar.gz --directory="/home/myuser/workspace/zip_from/" *.txt
If you want to tar files while keeping the structure but ignore it partially or completely when extracting, use the --strip-components argument when extracting.
In this case, where the full path is /home/username/drupal/sites/default/files, the following command would extract the tar.gz content without the full parent directory structure, keeping only the last directory of the path (e.g. files/file1).
tar -xzv --strip-components=5 -f backup.tgz
I've found this tip on https://www.baeldung.com/linux/tar-archive-without-directory-structure#5-using-the---strip-components-option.
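If you'd rather not count the path components by hand, a small sketch that derives the number to strip (everything before the final files directory) is:
path=/home/username/drupal/sites/default/files
strip=$(echo "${path#/}" | awk -F/ '{print NF-1}')   # 5 here
tar -xzv --strip-components="$strip" -f backup.tgz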
To build on nbt's and MaikoID's solutions:
tar -czf destination.tar.gz -C source/directory $(ls source/directory)
This solution:
Includes all files and folders in the directory
Does not include any of the directory structure (or .) in the final product
Does not require you to change directories.
However, it requires the directory to be given twice, so it may be most useful in another script. It may also be less efficient if there are a lot of files/folders in source/directory. Adjust the subcommand as necessary.
So for instance for the following structure:
|- source
| |- one
| `- two
`- working
the following command:
working$ tar -czf destination.tar.gz -C ../source $(ls ../source)
will produce destination.tar.gz where both one and two (and sub-files/-folders) are the first items.
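One caveat: $(ls source/directory) relies on word splitting, so it breaks if any names contain whitespace. A whitespace-safe sketch of the same idea (GNU find and tar assumed, for -printf and --null):
find source/directory -mindepth 1 -maxdepth 1 -printf '%P\0' | tar -czf destination.tar.gz -C source/directory --null -T -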
This worked for me:
gzip -dc "<your_file>.tgz" | tar -xf - -C <location>
For me, -C or --directory did not work, so I use this:
cd source/directory/or/file
tar -cvzf destination/packaged-app.tgz *.jar
# cd - returns you to the directory you were in previously
cd -
Kindly use the below command to generate a tar file without the directory structure:
tar -C <directoryPath> -cvzf <Path of the tar.gz file> filename1 filename2... filename N
eg:
tar -C /home/project/files -cvzf /home/project/files/test.tar.gz text1.txt text2.txt
tar -C /home/username/drupal/sites/default -czf ~/backup.tgz files
-C does the cd for you
I have a script that I need to run on a large number of files with the extension .tar.gz.
Instead of uncompressing them and then running the script, I want to be able to uncompress them as I run the command and then work on the uncompressed folder, all with a single command.
I think a pipe is a good solution for this, but I haven't used one before. How would I do this?
The -v orders tar to print filenames as it extracts each file:
tar -xzvf file.tar.gz | xargs -I {} -d\\n myscript "{}"
This way your script only needs to contain commands for dealing with a single file, which xargs passes as a parameter ($1 in the script context).
Edit: the -I {} -d\\n part will make it work with spaces in filenames.
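For reference, myscript above is a placeholder for your own script; it only needs to handle one extracted path per invocation, e.g.:
#!/bin/sh
# myscript (hypothetical): process a single extracted file
file=$1
echo "processing: $file"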
The following three lines of bash...
for archive in *.tar.gz; do
tar zxvf "${archive}" 2>&1 | sed -e 's!x \([^/]*\)/.*!\1!' | sort -u | xargs some_script.sh
done
...will iterate over each gzipped tarball in the current directory, decompress it, grab the top-most directories of the decompressed contents, and pass those as arguments to some_script.sh. This probably uses more pipes than you were expecting, but it seems to do what you are asking for.
N.B.: tar xf can only take one file per invocation.
You can use a for loop:
for file in *.tar.gz; do tar -xf "$file"; your commands here; done
Or expanded:
for file in *.tar.gz; do
tar -xf "$file"
# your commands here
done
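One caveat: if no *.tar.gz files are present, the loop body runs once with the literal pattern as $file. In bash you can guard against that with a sketch like:
shopt -s nullglob    # an unmatched glob expands to nothing
for file in *.tar.gz; do
  tar -xf "$file"
  # your commands here
done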