How big can tar command options be - linux

I want to use tar commands to archive multiple files in multiple directories. I just want one tar file as output.
tar -cvf file.tar /path1/inputfile1 /path2/inputfile2 ...
I want to know how long the arguments to the tar create command can be. Does tar itself impose a restriction, or is there a Unix limit on how long a command can be?

There's no limit in tar itself, but if you put the filenames on the command line you'll be limited by the operating system's command length limit. To get around this, you can use the -T option to get the list of filenames to archive from a file. So put all the filenames in a file filenames.txt (one filename per line), and then do:
tar -cvf file.tar -T filenames.txt
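As a sketch, the list file can be generated with find, which sidesteps the command-line length limit entirely because the names never appear on a command line. The /tmp/tardemo paths below are made up for the demo:

```shell
# Hypothetical demo tree: two files in two directories.
mkdir -p /tmp/tardemo/path1 /tmp/tardemo/path2
echo a > /tmp/tardemo/path1/inputfile1
echo b > /tmp/tardemo/path2/inputfile2
# find writes the names to a file, so no long command line is ever built.
find /tmp/tardemo -type f -name 'inputfile*' > /tmp/tardemo/filenames.txt
# tar reads the names from the file via -T.
tar -cf /tmp/tardemo/file.tar -T /tmp/tardemo/filenames.txt
```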

Related

Unix tar returns The parameter list is too long [duplicate]

This question already has answers here:
Argument list too long error for rm, cp, mv commands
When I try to tar all the files in a folder using the following command:
tar cvf mailpdfs.tar *.pdf
The shell complains:
ksh: /usr/bin/tar: 0403-027 The parameter list is too long.
How do I deal with this? My folder contains 25,000 PDF files, each 2 MB in size; how can I archive them quickly?
You can move all the PDF files into a new folder (newfolder below) and then tar that folder.
mv *.pdf newfolder
tar cvf mailpdfs.tar newfolder
Referenced from unix.com
The tar option -T is what you need
-T, --files-from=FILE
get names to extract or create from FILE
You are exceeding the argument-list limit when ksh expands the glob, so you can generate the list of files like this
ls | grep '\.pdf$' >files.txt
Then use that file with tar
tar cvf mailpdfs.tar -T files.txt
Finally, you can do away with creating a temporary file to hold the filenames by getting tar to read them from stdin (by giving the -T option the special filename -).
So we end up with this
ls | grep '\.pdf$' | tar cvf mailpdfs.tar -T -
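Note that parsing ls output breaks on filenames containing spaces or newlines. A safer sketch, assuming GNU tar (which supports --null for NUL-delimited name lists); the /tmp/pdfdemo directory and file names are invented:

```shell
# Create two PDF-named files with spaces in their names.
mkdir -p /tmp/pdfdemo
cd /tmp/pdfdemo
printf x > 'report 1.pdf'
printf y > 'report 2.pdf'
# find -print0 emits NUL-terminated names; GNU tar's --null reads them safely.
find . -maxdepth 1 -name '*.pdf' -print0 | tar -cf mailpdfs.tar --null -T -
```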

Unzip a single file in a tbz archive

I have the following archived directory:
itunes20140618.tbz
I want to extract a single file from it called:
itunes20140618/video
How would I do this?
So far, I am doing
$ bzip2 -d /tmp/itunes20140618.tbz
But it seems to produce a tar file of everything. How would I extract just the single video file?
There are a few different versions of tar around, but on my machine I can do this:
tar xjf archive.tbz filename
to extract filename from the archive.
If that doesn't work you can use:
bzip2 -dc archive.tbz | tar xvf - filename
which uses bzip2 to decompress to stdout and then pipes that to tar.
In both cases you can replace the x option with t to get a list of files, e.g.:
tar tjf archive.tbz
You can use the tar command and pass the path of the desired file or folder, as stored in the archive, as an argument:
tar xjf test.tbz /path/to/file/in/archive
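The member name must match what tar stored, so it helps to list first and then extract. A minimal round trip, assuming bzip2 is installed; the /tmp/tbzdemo layout is invented to mirror the question:

```shell
# Build a hypothetical archive with the same layout as the question.
mkdir -p /tmp/tbzdemo/itunes20140618
echo vid  > /tmp/tbzdemo/itunes20140618/video
echo meta > /tmp/tbzdemo/itunes20140618/info
cd /tmp/tbzdemo
tar -cjf itunes20140618.tbz itunes20140618
rm -r itunes20140618
# List first to confirm the member name exactly as stored...
tar -tjf itunes20140618.tbz | grep video
# ...then extract only that member; the sibling "info" file stays archived.
tar -xjf itunes20140618.tbz itunes20140618/video
```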

How to gzip all files in all sub-directories into one compressed file in bash

Possible Duplicate:
gzipping up a set of directories and creating a tar compressed file
This post describes how to gzip each file individually within a directory structure. However, I need to do something slightly different. I need to produce one big gzip file for all files under a certain directory. I also need to be able to specify the output filename for the compressed file (e.g., files.gz) and overwrite the old compressed file if one already exists.
tar -zcvf compressFileName.tar.gz folderToCompress
everything in folderToCompress will go to compressFileName
Edit: after reviewing the comments I realized that compressFileName without an extension may be confusing. If you want, you can use the .tar.gz extension (as suggested) for compressFileName.
There are lots of compression methods that work recursively from the command line, and it's good to know who the end audience is.
If it is to be sent to someone running Windows, then zip would probably be best:
zip -r file.zip folder_to_zip
unzip filename.zip
For other Linux users or yourself, tar is great:
tar -cvzf filename.tar.gz folder
tar -cvjf filename.tar.bz2 folder # even more compression
# change -c to -x above to extract
One must be careful with tar and how things are tarred up/extracted, for example if I run
$ cd ~
$ tar -cvzf passwd.tar.gz /etc/passwd
tar: Removing leading `/' from member names
/etc/passwd
$ pwd
/home/myusername
$ tar -xvzf passwd.tar.gz
this will create:
/home/myusername/etc/passwd
I'm unsure if all versions of tar do this:
Removing leading `/' from member names
@amitchhajer's post works for GNU tar. If someone finds this post and needs it to work on a non-GNU system, they can do this:
tar cvf - folderToCompress | gzip > compressFileName
To expand the archive:
zcat compressFileName | tar xvf -
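As a minimal round trip of this portable approach (the /tmp/gzdemo paths and folderToCompress contents are made up; zcat here is gzip's zcat, as on Linux):

```shell
# Build a hypothetical folder, compress it with plain tar piped to gzip.
mkdir -p /tmp/gzdemo/folderToCompress
echo hello > /tmp/gzdemo/folderToCompress/a.txt
cd /tmp/gzdemo
tar -cf - folderToCompress | gzip > compressFileName
# Expand it again into a separate directory to verify the round trip.
mkdir -p extracted
cd extracted
zcat ../compressFileName | tar -xf -
```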

GZip on Linux to archive files specified in the text file

I have a text file with paths to the list of files I want to compress into a single archive. How can I pass this file to gzip so it can create that archive with all files specified in the list?
gzip can only handle a single file at a time. You'll need to archive the files using tar first. Tar can do the compression at the same time (using the "z" argument).
tar cfz archive.tar.gz `cat file`
Well, in the first place, gzip doesn't compress multiple files into a single one, so you'll need to tar first. At least the GNU tar I checked has the option
-T, --files-from F
get names to extract or create from file F
so I suppose tar cfzvT target.tar.gz sourcelist would work.
gzip only compresses a single file. Use:
tar czf target.tar.gz `cat listoffile`
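Note that the backtick form puts every name back on the command line, so it hits the same argument-length limit as before; the -T form does not. A sketch of the -T variant with compression, using invented /tmp/listdemo file names:

```shell
# Two hypothetical files plus a list file naming them, one per line.
mkdir -p /tmp/listdemo
echo x > /tmp/listdemo/a.log
echo y > /tmp/listdemo/b.log
printf '%s\n' /tmp/listdemo/a.log /tmp/listdemo/b.log > /tmp/listdemo/listoffile
# tar reads the names from the list file and gzips in one step.
tar -czf /tmp/listdemo/target.tar.gz -T /tmp/listdemo/listoffile
```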

How do I exclude absolute paths for tar?

I am running a PHP script that gets me the absolute paths of files I want to tar up. This is the syntax I have:
tar -cf tarname.tar -C /www/path/path/file1.txt /www/path/path2/path3/file2.xls
When I untar it, it creates the absolute path to the files. How do I get just /path with everything under it to show?
If you want to remove the first n leading components of the file name, you need --strip-components. So in your case, on extraction, do:
tar xvf tarname.tar --strip-components=2
The man page has a list of tar's many options, including this one. Some earlier versions of tar use --strip-path for this operation instead.
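A small self-contained sketch of what --strip-components=2 does, with made-up /tmp/stripdemo paths mirroring the question's /www/path layout:

```shell
# Archive stores the member as www/path/file1.txt (two leading components).
mkdir -p /tmp/stripdemo/www/path /tmp/stripdemo/out
echo hi > /tmp/stripdemo/www/path/file1.txt
(cd /tmp/stripdemo && tar -cf abs.tar www/path/file1.txt)
# Stripping 2 components drops www/path/, leaving just file1.txt.
tar -xf /tmp/stripdemo/abs.tar -C /tmp/stripdemo/out --strip-components=2
```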
You are incorrectly using the -C switch, which is used for changing directories. So what you need to do is:
tar -cf tarname.tar -C /www/path path/file1.txt path2/path3/file2.xls
or if you want to package everything under /www/path do:
tar -cf tarname.tar -C /www/path .
You can use the -C switch multiple times.
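A sketch of the -C approach showing that the stored member names come out relative, using a hypothetical /tmp/cdemo tree shaped like the question's /www/path:

```shell
# Build a tree shaped like the question's /www/path.
mkdir -p /tmp/cdemo/www/path/path3
echo 1 > /tmp/cdemo/www/path/file1.txt
echo 2 > /tmp/cdemo/www/path/path3/file2.xls
# -C changes directory first, so members are stored relative to /www/path.
tar -cf /tmp/cdemo/tarname.tar -C /tmp/cdemo/www/path .
tar -tf /tmp/cdemo/tarname.tar   # lists ./file1.txt etc., no absolute paths
```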
For me the following works the best:
tar xvf some.tar --transform 's?.*/??g'
The --transform argument is a sed replacement expression to which every extracted file path is fed. Unlike --strip-components, it removes all path information, not just a fixed number of components.
If you don't know how many components are in the path, you could try this:
DIR_TO_PACK=/www/path/
cd "$DIR_TO_PACK/.."
tar -cf tarname.tar "$(basename "$DIR_TO_PACK")"
