-C option in tar changes parent directory permission - linux

Related to "Tar without retaining directory" and "tar creation without directory to be retained"
Hi All,
Thanks for helping me understand how -C is used to create a tar (tar -cf a/b/c/tarfile.tar -C a/b/c .) without retaining the directory structure.
But when I un-tar it, the parent directory's permissions change.
Please help me understand more about this.
Extracting the tar (tar -xvf) should not change the parent directory's permissions.

The command
tar -cf a/b/c/tarfile.tar -C a/b/c .
has an entry for . (the directory which you are tarring). When you extract the tar, that directory's permissions apply. If you do not want this behavior, you can either
specify the names (other than just ".") that you put into the tar file, e.g., tar -cf a/b/c/tarfile.tar -C a/b/c foo
specify the names of the items you want to extract. You can see what is available using tar tvf with your tarfile as the parameter.
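For example, a minimal sketch (the foo and bar member names are hypothetical): inspect the archive, then extract only named members so the "./" entry's metadata is never applied. If your tar is GNU tar, --no-overwrite-dir should also help, since it preserves the metadata of directories that already exist.
tar -tvf a/b/c/tarfile.tar                      # note the leading "./" entry carrying the directory metadata
tar -xvf a/b/c/tarfile.tar ./foo ./bar          # extract only named members; "." itself is not extracted
tar -xvf a/b/c/tarfile.tar --no-overwrite-dir   # GNU tar: keep existing directories' permissions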

Related

Tar a directory, but don't store full absolute paths in the archive

I have the following command as part of a backup shell script:
tar -cjf site1.bz2 /var/www/site1/
When I list the contents of the archive, I get:
tar -tf site1.bz2
var/www/site1/style.css
var/www/site1/index.html
var/www/site1/page2.html
var/www/site1/page3.html
var/www/site1/images/img1.png
var/www/site1/images/img2.png
var/www/site1/subdir/index.html
But I would like to remove the /var/www/site1 part from the directory and file names within the archive, to simplify extraction and avoid a useless, constant directory structure. You never know: I might need to extract backed-up websites on a machine where web data isn't stored under /var/www.
For the example above, I would like to have :
tar -tf site1.bz2
style.css
index.html
page2.html
page3.html
images/img1.png
images/img2.png
subdir/index.html
That way, when I extract, the files land in the current directory, I don't need to move them afterwards, and the sub-directory structure is preserved.
There are already many questions about tar and backups on Stack Overflow and elsewhere on the web, but most of them ask for dropping the entire sub-directory structure (flattening), or just for adding or removing the initial / in the names (I don't know exactly what that changes when extracting), but no more.
After having read some of the solutions found here and there as well as the manual, I tried :
tar -cjf site1.bz2 -C . /var/www/site1/
tar -cjf site1.bz2 -C / /var/www/site1/
tar -cjf site1.bz2 -C /var/www/site1/ /var/www/site1/
tar -cjf site1.bz2 --strip-components=3 /var/www/site1/
But none of them worked the way I want: some do nothing, others no longer archive the sub-directories.
It's inside a backup shell script launched by cron, so I don't know which user runs it, what the PATH is, or what the current directory is; absolute paths are therefore required for everything, and I'd rather not change the current directory, to avoid breaking something further on in the script (it doesn't only back up websites, it also dumps databases, then sends everything to FTP, etc.)
How to achieve this?
Have I just misunderstood how the option -C works?
tar -cjf site1.tar.bz2 -C /var/www/site1 .
In the above example, tar will change to directory /var/www/site1 before doing its thing because the option -C /var/www/site1 was given.
From man tar:
OTHER OPTIONS
-C, --directory DIR
change to directory DIR
The option -C works; just for clarification I'll post 2 examples:
creation of a tarball without the full path:
The full path is /home/testuser/workspace/project/application.war and what we want in the archive is just project/application.war, so:
tar -cvf output_filename.tar -C /home/testuser/workspace project
Note: there is a space between workspace and project; tar will replace the full path with just project.
extraction of the tarball while changing the target path (the default is ., i.e. the current directory)
tar -xvf output_filename.tar -C /home/deploy/
tar will extract the tarball into the given path, preserving the paths stored at creation time; in our example the file application.war will be extracted to /home/deploy/project/application.war.
/home/deploy: given on extract
project: given on creation of tarball
Note: if you want to place the created tarball in a target directory, just add the target path before the tarball name, e.g.:
tar -cvf /path/to/place/output_filename.tar -C /home/testuser/workspace project
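If in doubt, a quick listing confirms what was stored (a sketch using the same file names as above):
tar -tvf /path/to/place/output_filename.tar     # entries should start with "project/", e.g. project/application.war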
It seems the -C option, at least up to tar v2.8.3, does not work consistently on all platforms (OSes). -C is said to add the directory to the archive without the leading path, but on Mac and Ubuntu it still added an absolute path prefix inside the generated tar.gz file for me:
tar -czf target_path/file.tar.gz -C source_path source_dir
Therefore the consistent and robust solution is to cd into source_path (the parent directory of source_dir) and run
tar -czf target_path/file.tar.gz source_dir
in your script. This removes the absolute path prefix from the directory structure inside the generated tar.gz file.
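If, like the asker, you don't want the script's working directory to change, a subshell keeps the cd local (a sketch with the same placeholder paths; target_path should be absolute here, since the cd happens first):
( cd source_path && tar -czf target_path/file.tar.gz source_dir )   # the cd only affects the subshell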
One minor detail:
tar -cjf site1.tar.bz2 -C /var/www/site1 .
adds the files as
tar -tf site1.tar.bz2
./style.css
./index.html
./page2.html
./page3.html
./images/img1.png
./images/img2.png
./subdir/index.html
If you really want
tar -tf site1.tar.bz2
style.css
index.html
page2.html
page3.html
images/img1.png
images/img2.png
subdir/index.html
You should either
cd into the directory first
or run
tar -cjf site1.tar.bz2 -C /var/www/site1 $(ls -A /var/www/site1)
Note, though, that this does not support file names containing spaces. Thanks @dragon788 and @Fonic.
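A spaces-safe variant (a sketch) is to feed tar a NUL-separated file list instead of relying on word splitting; this assumes GNU find (for -printf '%P\0') and GNU tar (for --null -T -):
find /var/www/site1 -mindepth 1 -maxdepth 1 -printf '%P\0' |
  tar -cjf site1.tar.bz2 -C /var/www/site1 --null -T -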
The following command will create a root directory "." and put all the files from the specified directory into it.
tar -cjf site1.tar.bz2 -C /var/www/site1 .
If you want to put all files at the root of the tar file, @chinthaka is right. Just cd into the directory and do:
tar -czf target_path/file.tar.gz *
This will put all the files in the cwd into the tar file at its root.
Using the "point" leads to the creation of a folder named "point" (on Ubuntu 16).
tar -tf site1.bz2 -C /var/www/site1/ .
I dealt with this in more detail and prepared an example. Multi-line recording, plus an exception.
tar -tf site1.bz2\
-C /var/www/site1/ style.css\
-C /var/www/site1/ index.html\
-C /var/www/site1/ page2.html\
-C /var/www/site1/ page3.html\
--exclude=images/*.zip\
-C /var/www/site1/ images/
-C /var/www/site1/ subdir/
/
If you want to archive a subdirectory and trim subdirectory path this command will be useful:
tar -cjf site1.bz2 -C /var/www/ site1
I found tar -cvf site1-$seqNumber.tar -C /var/www/ site1 to be a friendlier solution than tar -cvf site1-$seqNumber.tar -C /var/www/site1 . (note the . in the second solution), for the following reasons:
The tar file name can be insignificant, since the original folder is now an archive entry.
Because the file name no longer needs to describe the content, it can be used for other purposes, such as sequence numbers or periodical backups, as sketched below.
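For instance, a periodic-backup sketch along those lines (the $seqNumber naming is only an illustration):
seqNumber=$(date +%Y%m%d-%H%M)                          # or an incrementing counter
tar -cvf "/backup/site1-$seqNumber.tar" -C /var/www/ site1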

Tarring files exclude the path before the folder i want to tar

I'm trying to tar a set of subfolders and then tar their parent folder afterwards via a Ruby script.
The structure is as follows:
/x/y/z/ParentFolder/Subfolder1
/x/y/z/ParentFolder/Subfolder2
/x/y/z/ParentFolder/Subfolder3
/x/y/z/ParentFolder/Subfolder4
So what I want to end up with is Subfolder1.tar.gz, Subfolder2.tar.gz, Subfolder3.tar.gz, Subfolder4.tar.gz, all contained in ParentFolder.tar.gz.
My problem at the moment is that I'm able to tar the parent folder with its subfolders, but the structure remains /x/y/z/ParentFolder/Subfolder1...4.
tarParentFolder = "tar -zcvf /x/y/z/ParentFolder.tar.gz /x/y/z/ParentFolder 2>/dev/null"
`#{tarParentFolder}`
I have searched around but cannot seem to find a solution to this,
Anybody got any ideas?
Thanks
The answer to getting the stored paths relative to the right directory is the -C tar option. That's a capital C. The parameter you pass is the directory that the archived paths should be relative to.
so you would do:
tar -zcvf /x/y/z/ParentFolder.tar.gz -C /x/y/z ParentFolder
But ... you should also probably think twice about putting tars in tars. You should be fine just tarring up the containing dir.
For creating tar archives containing multiple files/folders use this:
$ mkdir f1 f2 f3
$ tar -czf tar.tgz f1 f2 f3   # creates the tar
$ tar -tzf tar.tgz            # lists tar contents
f1/
f2/
f3/
$
So you should write something like:
tar -zcvf /x/y/z/ParentFolder.tar.gz -C /x/y/z/ParentFolder Subfolder{1,2,3,4}
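And if you do want the nested layout from the question (one .tar.gz per subfolder, then everything wrapped into ParentFolder.tar.gz), here is a shell sketch of the idea, using -C so no /x/y/z prefix is stored (paths as in the question):
for d in /x/y/z/ParentFolder/Subfolder*/; do
  name=$(basename "$d")
  tar -zcf "/x/y/z/ParentFolder/$name.tar.gz" -C /x/y/z/ParentFolder "$name"
done
tar -zcf /x/y/z/ParentFolder.tar.gz -C /x/y/z ParentFolder
Note that the final archive will still contain the original subfolders too, unless you remove or exclude them after tarring them individually.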

How do I tar a directory without retaining the directory structure?

I'm working on a backup script and want to tar up a file directory:
tar czf ~/backup.tgz /home/username/drupal/sites/default/files
This tars it up, but when I untar the resulting file, it includes the full file structure: the files are in home/username/drupal/sites/default/files.
Is there a way to exclude the parent directories, so that the resulting tar just knows about the last directory (files)?
Use the --directory option:
tar czf ~/backup.tgz --directory=/home/username/drupal/sites/default files
Hi, I have a better solution for when entering the specified directory is impossible (Makefiles, etc.):
tar -cjvf files.tar.bz2 -C directory/contents/to/be/compressed .
Do not forget the dot (.) at the end!
cd /home/username/drupal/sites/default/files
tar czf ~/backup.tgz *
Create a tar archive
tar czf $sourcedir/$backup_dir.tar --directory=$sourcedir WEB-INF en
Un-tar files on a local machine
tar -xvf $deploydir/med365/$backup_dir.tar -C $deploydir/med365/
Upload to a server
scp -r -i $privatekey $sourcedir/$backup_dir.tar $server:$deploydir/med365/
echo "File uploaded.. deployment folders"
Un-tar on server
ssh -i $privatekey $server tar -xvf $deploydir/med365/$backup_dir.tar -C $deploydir/med365/
To tar and gzip all txt (*.txt) files from /home/myuser/workspace/zip_from/
to /home/myuser/workspace/zip_to/ without the directory structure of the source files, use the following command:
tar -P -cvzf /home/myuser/workspace/zip_to/mydoc.tar.gz --directory="/home/myuser/workspace/zip_from/" *.txt
Be aware that the shell expands *.txt in the current directory, not in the --directory path, so this only works as intended when run from zip_from.
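A variant that avoids that gotcha (a sketch): do the glob in a subshell that has already changed into the source directory, so -P is not needed either.
( cd /home/myuser/workspace/zip_from && tar -cvzf /home/myuser/workspace/zip_to/mydoc.tar.gz *.txt )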
If you want to tar files while keeping the structure but ignore it partially or completely when extracting, use the --strip-components argument when extracting.
In this case, where the full path is /home/username/drupal/sites/default/files, the following command would extract the tar.gz content without the full parent directory structure, keeping only the last directory of the path (e.g. files/file1).
tar -xzv --strip-components=5 -f backup.tgz
I've found this tip on https://www.baeldung.com/linux/tar-archive-without-directory-structure#5-using-the---strip-components-option.
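To pick the right number, it can help to list the stored paths first; each path element counts as one component. A sketch against the backup.tgz from this question (assuming every entry sits under .../files/):
tar -tzf backup.tgz | head -n 3                 # e.g. home/username/drupal/sites/default/files/file1
tar -xzv --strip-components=6 -f backup.tgz     # also strips "files/", extracting file1 directly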
To build on nbt's and MaikoID's solutions:
tar -czf destination.tar.gz -C source/directory $(ls source/directory)
This solution:
Includes all files and folders in the directory
Does not include any of the directory structure (or .) in the final product
Does not require you to change directories.
However, it requires the directory to be given twice, so it may be most useful in another script. It may also be less efficient if there are a lot of files/folders in source/directory. Adjust the subcommand as necessary.
So for instance for the following structure:
|- source
| |- one
| `- two
`- working
the following command:
working$ tar -czf destination.tar.gz -C ../source $(ls ../source)
will produce destination.tar.gz where both one and two (and sub-files/-folders) are the first items.
This worked for me:
gzip -dc "<your_file>.tgz" | tar xf - -C <location>
For me -C or --directory did not work, so I use this instead:
cd source/directory/or/file
tar -cvzf destination/packaged-app.tgz *.jar
# cd back to the previous working directory
cd -
Use the command below to generate a tar file without the directory structure:
tar -C <directoryPath> -cvzf <Path of the tar.gz file> filename1 filename2... filename N
eg:
tar -C /home/project/files -cvzf /home/project/files/test.tar.gz text1.txt text2.txt
tar czf ~/backup.tgz -C /home/username/drupal/sites/default files
-C does the cd for you. (Note that -C takes the directory as its own argument, so it cannot be bundled into -Cczf.)

Make Tar + gzip ignore directory paths

Is it possible, when making a tar + gzip through the 'tar c ...' command, to have the relative paths ignored upon expanding?
For example,
tar czvf test.tgz foo ../../files/bar
And then expanding the test.tgz with
tar xzvf test.tgz
gives a directory containing:
foo files/bar
I want the directory to contain the files:
foo bar
Is this possible?
If all the paths begin with the same initial list of directories then you can use e.g. tar cvf test.tgz -C ../.. other/dir. Beware that the shell won't expand wildcards in pathnames "properly" however because -C asks tar to change directory.
Otherwise, the only way I've ever come up with is to make a temporary directory filled with appropriate symlinks and use the -h option to dereference through symlinks. Of course that won't work if some of the files you want to store are actually symlinks themselves.
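A sketch of that symlink approach for the question's foo and bar (the staging directory name is arbitrary); -h (--dereference) makes tar archive what the links point to:
mkdir /tmp/staging
ln -s "$PWD/foo" /tmp/staging/foo
ln -s "$PWD/../../files/bar" /tmp/staging/bar
tar chzvf test.tgz -C /tmp/staging foo bar
tar tzf test.tgz     # foo, bar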

Shell command to tar directory excluding certain files/folders

Is there a simple shell command/script that supports excluding certain files/folders from being archived?
I have a directory that needs to be archived, with a sub-directory containing a number of very large files I do not need to back up.
Not quite solutions:
The tar --exclude=PATTERN command matches the given pattern and excludes those files, but I need specific files & folders to be ignored (full file path), otherwise valid files might be excluded.
I could also use the find command to create a list of files, exclude the ones I don't want to archive, and pass the list to tar, but that only works for a small number of files. I have tens of thousands.
I'm beginning to think the only solution is to create a file with a list of files/folders to be excluded, then use rsync with --exclude-from=file to copy all the files to a tmp directory, and then use tar to archive that directory.
Can anybody think of a better/more efficient solution?
EDIT: Charles Ma's solution works well. The big gotcha is that the --exclude='./folder' MUST be at the beginning of the tar command. Full command (cd first, so backup is relative to that directory):
cd /folder_to_backup
tar --exclude='./folder' --exclude='./upload/folder2' -zcvf /backup/filename.tgz .
You can have multiple exclude options for tar so
$ tar --exclude='./folder' --exclude='./upload/folder2' -zcvf /backup/filename.tgz .
etc will work. Make sure to put --exclude before the source and destination items.
You can exclude directories with --exclude for tar.
If you want to archive everything except /usr you can use:
tar -zcvf /all.tgz / --exclude=/usr
In your case perhaps something like
tar -zcvf archive.tgz arc_dir --exclude=dir/ignore_this_dir
Possible options to exclude files/directories from backup using tar:
Exclude files using multiple patterns
tar -czf backup.tar.gz --exclude=PATTERN1 --exclude=PATTERN2 ... /path/to/backup
Exclude files using an exclude file filled with a list of patterns
tar -czf backup.tar.gz -X /path/to/exclude.txt /path/to/backup
Exclude files using tags by placing a tag file in any directory that should be skipped
tar -czf backup.tar.gz --exclude-tag-all=exclude.tag /path/to/backup
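As a concrete sketch of the second and third variants (file names and paths here are hypothetical):
# /path/to/exclude.txt holds one pattern per line, same syntax as --exclude (e.g. "cache" or "*.log")
tar -czf backup.tar.gz -X /path/to/exclude.txt /path/to/backup
# mark a directory to be skipped by dropping a tag file into it
touch /path/to/backup/tmp/exclude.tag
tar -czf backup.tar.gz --exclude-tag-all=exclude.tag /path/to/backup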
Old question with many answers, but I found that none were quite clear enough for me, so I would like to add my attempt.
If you have the following structure
/home/ftp/mysite/
with the following files/folders:
/home/ftp/mysite/file1
/home/ftp/mysite/file2
/home/ftp/mysite/file3
/home/ftp/mysite/folder1
/home/ftp/mysite/folder2
/home/ftp/mysite/folder3
So, you want to make a tar file that contains everything inside /home/ftp/mysite (to move the site to a new server), but file3 is just junk, and everything in folder3 is also not needed, so we will skip those two.
We use the format
tar -czvf <name of tar file> <what to tar> <any excludes>
where c = create, z = gzip, and v = verbose (you can see the files as they are added, useful for making sure none of the files you meant to exclude sneak in), and f = file.
So my command looks like this:
cd /home/ftp/
tar -czvf mysite.tar.gz mysite --exclude='file3' --exclude='folder3'
Note that the excluded files/folders are given relative to the root of your tar (I have tried full paths here, relative to /, but I cannot make that work).
Hope this will help someone (and me, the next time I google it).
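A quick sanity check after creating the archive (a sketch):
tar -tzf mysite.tar.gz | grep -E 'file3|folder3' || echo "excluded as intended"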
You can use standard "ant notation" to exclude directories with relative patterns.
This works for me and excludes any .git or node_module directories:
tar -cvf myFile.tar --exclude=**/.git/* --exclude=**/node_modules/* -T /data/txt/myInputFile.txt 2> /data/txt/myTarLogFile.txt
myInputFile.txt contains:
/dev2/java
/dev2/javascript
This exclude pattern handles file-name suffixes like png or mp3 as well as directory names like .git and node_modules:
tar --exclude={*.png,*.mp3,*.wav,.git,node_modules} -Jcf ${target_tarball} ${source_dirname}
I've experienced that, at least with the Cygwin version of tar I'm using ("CYGWIN_NT-5.1 1.7.17(0.262/5/3) 2012-10-19 14:39 i686 Cygwin" on a Windows XP Home Edition SP3 machine), the order of options is important.
While this construction worked for me:
tar cfvz target.tgz --exclude='<dir1>' --exclude='<dir2>' target_dir
that one didn't work:
tar cfvz --exclude='<dir1>' --exclude='<dir2>' target.tgz target_dir
This, even though tar --help reveals the following:
tar [OPTION...] [FILE]
So the second command should also work, but apparently it doesn't. One possible explanation (not verified here) is that with the old-style option bundle cfvz, tar takes the next word on the command line as the argument to f, so the archive name gets misread in the second form.
Best rgds,
I found this somewhere else so I won't take credit, but it worked better than any of the solutions above for my mac specific issues (even though this is closed):
tar zc --exclude __MACOSX --exclude .DS_Store -f <archive> <source(s)>
After reading all these good answers for different versions, and having solved the problem for myself, I think there are very small details that are very important, unusual for general GNU/Linux use, and that aren't stressed enough and deserve more than comments.
So I'm not going to try to answer the question for every case, but instead try to note where to look when things don't work.
IT IS VERY IMPORTANT TO NOTICE:
THE ORDER OF THE OPTIONS MATTERS: putting --exclude before or after the file option and the directories to back up is not the same thing. This is unexpected, at least to me, because in my experience the order of options usually doesn't matter in GNU/Linux commands.
Different tar versions expect these options in a different order: for instance, @Andrew's answer indicates that in GNU tar 1.26 and 1.28 the excludes come last, whereas in my case, with GNU tar 1.29, it's the other way around.
TRAILING SLASHES MATTER: at least in GNU tar 1.29, there shouldn't be any.
In my case, for GNU tar 1.29 on Debian stretch, the command that worked was
tar --exclude="/home/user/.config/chromium" --exclude="/home/user/.cache" -cf file.tar /dir1/ /home/ /dir3/
The quotes didn't matter, it worked with or without them.
I hope this will be useful to someone.
If you are trying to exclude Version Control System (VCS) files, tar already has two interesting options for that! :)
Option : --exclude-vcs
This option excludes files and directories used by the following version control systems: CVS, RCS, SCCS, SVN, Arch, Bazaar, Mercurial, and Darcs.
As of version 1.32, the following files are excluded:
CVS/, and everything under it
RCS/, and everything under it
SCCS/, and everything under it
.git/, and everything under it
.gitignore
.gitmodules
.gitattributes
.cvsignore
.svn/, and everything under it
.arch-ids/, and everything under it
{arch}/, and everything under it
=RELEASE-ID
=meta-update
=update
.bzr
.bzrignore
.bzrtags
.hg
.hgignore
.hgtags
_darcs
Option : --exclude-vcs-ignores
When archiving directories that are under some version control system (VCS), it is often convenient to read exclusion patterns from the VCS's ignore files (e.g. .cvsignore, .gitignore, etc.). This option provides that possibility.
Before archiving a directory, tar checks whether it contains any of the following files: .cvsignore, .gitignore, .bzrignore, or .hgignore. If so, it reads ignore patterns from these files.
The patterns are treated much as the corresponding VCS would treat them, i.e.:
.cvsignore
Contains shell-style globbing patterns that apply only to the directory where this file resides. No comments are allowed in the file. Empty lines are ignored.
.gitignore
Contains shell-style globbing patterns. Applies to the directory where the .gitignore file is located and all its subdirectories.
Any line beginning with a # is a comment. Backslash escapes the comment character.
.bzrignore
Contains shell globbing patterns and regular expressions (if prefixed with RE:). Patterns affect the directory and all its subdirectories.
Any line beginning with a # is a comment.
.hgignore
Contains POSIX regular expressions. The line "syntax: glob" switches to shell globbing patterns; the line "syntax: regexp" switches back. Comments begin with a #. Patterns affect the directory and all its subdirectories.
Example
tar -czv --exclude-vcs --exclude-vcs-ignores -f path/to/my-tar-file.tar.gz path/to/my/project/
I'd like to show another option I used to get the same result as the answers above. I had a similar case where I wanted to back up Android Studio projects all together in a tar file to upload to MediaFire. Using the du command to find the large files, I found that I didn't need some directories, like:
build, linux and .dart_tool
Starting from Charles Ma's answer, I modified it a little bit so I could run the command from the parent directory of my Android directory.
tar --exclude='*/build' --exclude='*/linux' --exclude='*/.dart_tool' -zcvf androidProjects.tar Android/
It worked like a charm.
PS: Sorry if this kind of answer is not allowed; if that's the case I will remove it.
For Mac OSX I had to do
tar -zcv --exclude='folder' -f theOutputTarFile.tar folderToTar
Note the -f after the --exclude=
For those who have issues with it: some versions of tar only exclude properly without the './' prefix in the exclude value.
tar --version
tar (GNU tar) 1.27.1
Command syntax that works:
tar -czvf ../allfiles-butsome.tar.gz * --exclude=acme/foo
These will not work:
$ tar -czvf ../allfiles-butsome.tar.gz * --exclude=./acme/foo
$ tar -czvf ../allfiles-butsome.tar.gz * --exclude='./acme/foo'
$ tar --exclude=./acme/foo -czvf ../allfiles-butsome.tar.gz *
$ tar --exclude='./acme/foo' -czvf ../allfiles-butsome.tar.gz *
$ tar -czvf ../allfiles-butsome.tar.gz * --exclude=/full/path/acme/foo
$ tar -czvf ../allfiles-butsome.tar.gz * --exclude='/full/path/acme/foo'
$ tar --exclude=/full/path/acme/foo -czvf ../allfiles-butsome.tar.gz *
$ tar --exclude='/full/path/acme/foo' -czvf ../allfiles-butsome.tar.gz *
I agree the --exclude flag is the right approach.
$ tar --exclude='./folder_or_file' --exclude='file_pattern' --exclude='fileA' -zcvf backup.tgz .
A word of warning for a side effect that I did not find immediately obvious:
The exclusion of 'fileA' in this example will search for 'fileA' RECURSIVELY!
Example: a directory with a single subdirectory containing a file of the same name (data.txt):
data.txt
config.txt
--+dirA
| data.txt
| config.docx
If using --exclude='data.txt' the archive will not contain EITHER data.txt file. This can cause unexpected results if archiving third party libraries, such as a node_modules directory.
To avoid this issue make sure to give the entire path, like --exclude='./dirA/data.txt'
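A minimal demonstration of that recursive matching (hypothetical paths):
mkdir -p demo/dirA
echo top > demo/data.txt
echo nested > demo/dirA/data.txt
tar -czf all.tgz -C demo --exclude='data.txt' .
tar -tzf all.tgz     # neither data.txt appears: the bare pattern matched at every depth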
After reading this thread, I did a little testing on RHEL 5 and here are my results for tarring up the abc directory:
This will exclude the directories error and logs and all files under the directories:
tar cvpzf abc.tgz abc/ --exclude='abc/error' --exclude='abc/logs'
Adding a wildcard after the excluded directory will exclude the files but preserve the directories:
tar cvpzf abc.tgz abc/ --exclude='abc/error/*' --exclude='abc/logs/*'
To avoid possible 'xargs: Argument list too long' errors due to the use of find ... | xargs ... when processing tens of thousands of files, you can pipe the output of find directly to tar using find ... -print0 | tar --null ....
# archive a given directory, but exclude various files & directories
# specified by their full file paths
find "$(pwd -P)" -type d \( -path '/path/to/dir1' -or -path '/path/to/dir2' \) -prune \
-or -not \( -path '/path/to/file1' -or -path '/path/to/file2' \) -print0 |
gnutar --null --no-recursion -czf archive.tar.gz --files-from -
#bsdtar --null -n -czf archive.tar.gz -T -
You can also use one of the "--exclude-tag" options depending on your needs:
--exclude-tag=FILE
--exclude-tag-all=FILE
--exclude-tag-under=FILE
These differ slightly: --exclude-tag excludes the contents of a directory containing FILE (keeping the directory itself and FILE), --exclude-tag-under excludes everything under such a directory, and --exclude-tag-all excludes the directory entirely.
Use the find command in conjunction with the tar append (-r) option. This way you can add files to an existing tar in a single step, instead of a two pass solution (create list of files, create tar).
find /dir/dir -prune ... -o etc etc.... -exec tar rvf ~/tarfile.tar {} \;
You can use cpio(1) to create tar files. cpio takes the files to archive on stdin, so if you've already figured out the find command you want to use to select the files the archive, pipe it into cpio to create the tar file:
find ... | cpio -o -H ustar | gzip -c > archive.tar.gz
In GNU tar 1.26, the --exclude needs to come after the archive file and backup directory arguments, should have no leading or trailing slashes, and prefers no quotes (single or double). So, relative to the PARENT directory to be backed up, it's:
tar cvfz /path_to/mytar.tgz ./dir_to_backup --exclude=some_path/to_exclude
tar -cvzf destination_file.tar.gz source_folder -X /home/folder/excludes.txt
-X indicates a file containing a list of patterns to be excluded from the backup. For instance, you can put *~ in this file to skip any filenames ending with ~.
Success Case:
1) If you give a full path for the backup, the excludes must also use the full path.
tar -zcvf /opt/ABC/BKP_27032020/backup_27032020.tar.gz --exclude='/opt/ABC/csv/' --exclude='/opt/ABC/log/' /opt/ABC
2) If you give a relative path for the backup, the excludes must also use the same relative path.
tar -zcvf backup_27032020.tar.gz --exclude='ABC/csv/' --exclude='ABC/log/' ABC
Failure Case:
If you give a relative path for the backup but a full path for the excludes, it won't work:
tar -zcvf /opt/ABC/BKP_27032020/backup_27032020.tar.gz --exclude='/opt/ABC/csv/' --exclude='/opt/ABC/log/' ABC
Note: mentioning exclude before/after backup directory is fine.
It seems to be impossible to exclude directories with absolute paths.
As soon as ANY of the paths are absolute (source and/or exclude), the exclusion will not work. That's my experience after trying all possible combinations.
For example:
tar cvpzf zip_folder.tgz . --exclude=./public --exclude=./tmp --exclude=./log --exclude=fileName
I want to have a fresh front-end version (the angular folder) on localhost.
Also, the .git folder is huge in my case, and I want to exclude it.
I need to download it from the server and unpack it in order to run the application.
Compress the angular folder from /var/lib/tomcat7/webapps into the /tmp folder with the name angular.23.12.19.tar.gz.
Command:
tar --exclude='.git' -zcvf /tmp/angular.23.12.19.tar.gz /var/lib/tomcat7/webapps/angular/
Your best bet is to use find with tar, via xargs (to handle the large number of arguments). For example:
find / -print0 | xargs -0 tar cjf tarfile.tar.bz2
Bear in mind, though, that if the file list exceeds the argument limit, xargs will invoke tar more than once and each later invocation will overwrite the archive; piping find straight into tar with --null -T - (as shown above) avoids that.
Possible redundant answer but since I found it useful, here it is:
While root on FreeBSD (i.e. using csh) I wanted to copy my whole root filesystem to /mnt, but without /usr and (obviously) /mnt. This is what worked (I am at /):
tar --exclude ./usr --exclude ./mnt --create --file - . | (cd /mnt && tar xvf -)
My whole point is that it was necessary (by putting the ./) to specify to tar that the excluded directories were part of the greater directory being copied.
My €0.02
I had no luck getting tar to exclude a 5 gigabyte subdirectory a few levels deep. In the end, I just used the Unix zip command. It was a lot easier for me.
So for this particular example from the original post
(tar --exclude='./folder' --exclude='./upload/folder2' -zcvf /backup/filename.tgz . )
The equivalent would be:
zip -r /backup/filename.zip . -x upload/folder/**\* upload/folder2/**\*
(NOTE: Here is the post I originally used that helped me https://superuser.com/questions/312301/unix-zip-directory-but-excluded-specific-subdirectories-and-everything-within-t)
The following bash script should do the trick. It uses the answer given here by Marcus Sundman.
#!/bin/bash
echo -n "Please enter the name of the tar file you wish to create with out extension "
read nam
echo -n "Please enter the path to the directories to tar "
read pathin
echo tar -czvf $nam.tar.gz
excludes=`find $pathin -iname "*.CC" -exec echo "--exclude \'{}\'" \;|xargs`
echo $pathin
echo tar -czvf $nam.tar.gz $excludes $pathin
This will print out the command you need and you can just copy and paste it back in. There is probably a more elegant way to provide it directly to the command line.
Just change *.CC to any other common extension, file name or pattern you want to exclude and this should still work.
EDIT
Just to add a little explanation: find generates a list of files matching the chosen pattern (in this case *.CC). This list is passed via xargs to the echo command, which prints --exclude 'one entry from the list'. The backslashes (\) are escape characters for the ' marks.
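For what it's worth, if the goal is only to skip one extension, a single glob exclude may be enough (a sketch; $nam and $pathin as in the script above):
tar -czvf "$nam.tar.gz" --exclude='*.CC' "$pathin"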
