Having an issue with rsync. I'm using rsync as a glorified cp command. I have the following line in a script:
rsync -aL --exclude /path/to/exclude/ --exclude='.*' /source/ /destination
I can get rsync to exclude any hidden files (hence the '.*'), but I cannot get it to exclude the directory. I've tried using an '=' sign, and surrounding the directory with double quotes and with single quotes. Any help would be greatly appreciated. Thanks in advance.
Actually, neither Erik's nor Antoni's answer is fully accurate.
Erik is halfway right in saying that
As test/a is the base directory synced from, the exclude pattern is specified by starting with a/
It is true that the exclude pattern's root is test/a (i.e. the pattern /some/path binds to test/a/some/path), but that's not the whole story.
From the man page:
if the pattern starts with a / then it is anchored to a particular spot in the hierarchy of files, otherwise it is matched against the end of the pathname. This is similar to a leading ^ in regular expressions. Thus "/foo" would match a file named "foo" at either the "root of the transfer" (for a global rule) or in the merge-file's directory (for a per-directory rule).
We can ignore the per-directory bit as it doesn't apply to us here.
Therefore, rsync -nvraL test/a test/dest --exclude=a/b/c/d will most definitely exclude test/a/b/c/d (and children), but it'll also exclude test/a/other/place/a/b/c/d.
rsync -nvraL test/a test/dest --exclude=/b/c/d, on the other hand, will exclude only test/a/b/c/d (and children) (test/a being the point to which / is anchored).
This is why you still need the anchoring initial slash if you want to exclude that specific path from being backed up. This might seem like a minor detail, and it matters less the more specific your exclude pattern becomes (e.g. Pictures vs. home/daniel/Pictures), but it might just come around to bite you in the butt.
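To make this concrete with the command from the original post (a sketch; some/dir stands in for whatever directory under /source/ you actually want to skip):
rsync -aL --exclude='/some/dir/' --exclude='.*' /source/ /destination
# the pattern is anchored at the transfer root (/source/, thanks to the trailing slash),
# not at the filesystem root: --exclude='/source/some/dir/' would never match anything here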
mkdir -p test/a/b/c/d/e
mkdir -p test/dest
rsync -nvraL test/a test/dest --exclude=a/b/c/d
This works. As test/a is the base directory synced from, the exclude pattern is specified by starting with a/
Show us the real paths/excludes if this doesn't help.
Running rsync with -vn will list dirs/files - the pattern is matched against the format that rsync prints.
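For instance, with the test layout above, a plain dry run prints the names the patterns are matched against (a/, a/b/, a/b/c/d/ and so on):
rsync -nvraL test/a test/dest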
Following Erik's example you want to do this:
rsync -nvraL test/a/ test/dest --exclude=/b/c/d
Either of these would work:
rsync -nvraL test/a test/dest --exclude=/a/b/c/d
rsync -nvraL test/a/ test/dest --exclude=/b/c/d
Note that the ending / in the source path makes a critical difference to how --exclude should be specified. This assumes we have:
mkdir -p test/a/b/c/d/e
mkdir -p test/a/other/place/a/b/c/d
mkdir -p test/dest
The original post has difficulty when the exclude path starts with a /. Daniel's answer is correct that this initial / in the exclude path might be desirable to exclude a specific path, and that this initial / should be understood like a leading ^ in regular expressions. However, his answer has a critical typo regarding the ending / in the source path.
If you want to specify absolute paths you can convert them to relative paths using realpath:
--exclude="$(realpath --relative-to="$PWD" /home/file)"
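For instance (hypothetical paths, and with a leading / added to anchor the result at the transfer root, as discussed above):
cd /home
rsync -aL --exclude="/$(realpath --relative-to="$PWD" /home/file)" ./ /destination
# realpath prints "file", so the exclude becomes /file, anchored at /home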
I'm trying to write a personal backup command-line utility on OSX. Let's say I have two folders:
foo/bar/
foo/baz/
foo/bar contains, among other things, OSX aliases to files in foo/baz:
foo/bar/file_alias@ -> foo/baz/file
I want to copy both foo/bar and foo/baz to an external hard drive, but for various reasons I do not just want to copy the entire folder foo. I can't figure out a way to copy these folders separately and make the aliases come out right in the end:
cp -r foo/bar /external_hd/foo/bar follows the aliases, replacing them with the original files.
cp -R foo/bar /external_hd/foo/bar preserves the aliases, but they (not surprisingly) continue to point to the original files (e.g. foo/baz/file, not external_hd/foo/baz/file).
rsync -avE foo/bar /external_hd/foo/bar (see this question) seems to do the same thing as cp -R.
Is there any way to accomplish this without copying the entire parent folder foo?
I know of no way to automatically copy folders and relink symbolic links to a new destination without some manual intervention. If you know the new paths, it's quite simple to script, though.
For your specific example; the following should do the trick to relink:
cd /external_hd/foo
find . -type l | while read -r x; do y=$(readlink "$x" | sed 's|/foo|/external_hd/foo|'); ln -sf "$y" "$x"; done
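To spot-check the result, list where each link now points:
find . -type l -exec ls -l {} +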
rsync will get you close, the command:
rsync -avHER --safe-links foo/{bar,baz} /external_hd/
will copy the two folders, preserve "safe" relative symlinks between them, and ignore "unsafe" symlinks, i.e. those that may reference files outside of the copied tree. Change it to:
rsync -avHER --copy-unsafe-links foo/{bar,baz} /external_hd/
and "safe" relative symlinks are preserved while "unsafe" symlinks are replaced by their destination files.
If you only have "safe" relative symlinks, the first option will do; the second option may do if some extra copying is OK.
However, the definition of "safe" is over-restrictive. Any absolute symlink is "unsafe" even if its target is within the copied tree. Furthermore, even a relative link which goes too far towards the root, or is just too complicated, is also "unsafe".
If you need to fix this, it should be possible: as the above options show, rsync is pretty close to what you need, and the source code is available from Apple's Open Source site. Examine the code around the options --links, --copy-links, --copy-unsafe-links & unsafe-links and you may find that fixing the definition of "safe" is fairly easy (and you could re-write the symlinks to use the shortest possible relative path at the same time).
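Alternatively, if the only problem is absolute symlinks that happen to point inside the copied tree, a pre-pass can rewrite them as relative links so that --safe-links keeps them. This is just a sketch and assumes GNU ln (which has -r/--relative); on OSX that usually means gln from the coreutils package:
root=$(cd foo && pwd)                   # absolute path of the tree being copied
find foo -type l | while read -r l; do
  t=$(readlink "$l")
  case $t in
    "$root"/*) ln -sfr "$t" "$l" ;;     # absolute target inside the tree: relink it relatively
  esac
done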
HTH
I have looked at and tried to use Exuberant Ctags, with no luck doing what I want. I am on a Mac, trying to work in a project where I want to exclude directories such as .git, node_modules, test, etc. When I try something like ctags -R --exclude=[.git, node_modules, test] I get nothing in return. I really only need to have it run in my core directory. Any ideas on how to accomplish this?
The --exclude option does not expect a list of files. According to ctags's man page, "This option may be specified as many times as desired." So, it's like this:
ctags -R --exclude=.git --exclude=node_modules --exclude=test
Read The Fantastic Manual should always be the first step of any attempt to solve a problem.
From $ man ctags:
--exclude=[pattern]
Add pattern to a list of excluded files and directories. This option may be specified as many times as desired. For each file name considered by both the complete path (e.g. some/path/base.ext) and the base name (e.g. base.ext) of the file, thus allowing patterns which match a given file name irrespective of its path, or match only a specific path. If appropriate support is available from the runtime library of your C compiler, then pattern may contain the usual shell wildcards (not regular expressions) common on Unix (be sure to quote the option parameter to protect the wildcards from being expanded by the shell before being passed to ctags; also be aware that wildcards can match the slash character, '/'). You can determine if shell wildcards are available on your platform by examining the output of the --version option, which will include "+wildcards" in the compiled feature list; otherwise, pattern is matched against file names using a simple textual comparison.
If pattern begins with the character '@', then the rest of the string is interpreted as a file name from which to read exclusion patterns, one per line. If pattern is empty, the list of excluded patterns is cleared.
Note that at program startup, the default exclude list contains "EIFGEN", "SCCS", "RCS", and "CVS", which are names of directories for which it is generally not desirable to descend while processing the --recurse option.
From the first two sentences you get:
$ ctags -R --exclude=dir1 --exclude=dir2 --exclude=dir3 .
which may be a bit verbose but that's what aliases and mappings and so on are for. As an alternative, you get this from the second paragraph:
$ ctags -R --exclude=@.ctagsignore .
with the following in .ctagsignore:
dir1
dir2
dir3
which works out to excluding those 3 directories without as much typing.
You can encapsulate a comma-separated list with curly braces to handle multiple exclusions with one --exclude option:
ctags -R --exclude={folder1,folder2,folder3}
This appears to work only for folders in the root of the directory where you're issuing the command. Excluding nested folders requires a separate --exclude option.
The other answers were straight to the point, and I thought a little example may help:
You should add a Unix-style asterisk to exclude the whole directory.
ctags -R --exclude={.git/*,.env/*,.idea/*} ./
A bit late, but following on romainl's response, you could use your .gitignore file as a basis; you only need to remove any leading slashes from the file, like so:
sed 's|^/||' .gitignore > .ctagsignore
ctags -R --exclude=@.ctagsignore
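If your .gitignore also has comments or blank lines, a slightly more defensive variant of the same idea (still a rough translation; .gitignore negation patterns starting with ! are not understood by ctags) is:
sed -e '/^#/d' -e '/^[[:space:]]*$/d' -e 's|^/||' .gitignore > .ctagsignore
ctags -R --exclude=@.ctagsignore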
I really only need to have it run in my core directory.
Simply remove the -R (recursion) flag!!!
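For example, to index only the top-level sources (hypothetical file extension):
ctags *.js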
I want a simple and working way to use multiple exclude options in my rsync command. Let's say I want to exclude a file and a directory:
/var/www/html/test.txt
/var/www/html/images/
What I did is:
rsync -avz --exclude="/var/www/html/test.txt" --exclude="/var/www/html/images/" /var/www/html root@xx.xx.xx.xx:/var/www
or
rsync -avz --exclude=/var/www/html/test.txt --exclude=/var/www/html/images/ /var/www/html root@xx.xx.xx.xx:/var/www
or
rsync -avz --exclude /var/www/html/test.txt --exclude /var/www/html/images/ /var/www/html root@xx.xx.xx.xx:/var/www
..
However, the --exclude is NOT working!
Everything is being transferred anyway!
How can I do it in this simple format, please?
Note: I also don't want to use an external exclusion list file. I just want it all in one simple command.
I got it solved by myself after learning and testing many times. The real problem was my misunderstanding of the --exclude option's path format.
I don't know how others do it, but I just found out that:
the --exclude path CANNOT be the full absolute path!
I was using paths like --exclude /var/www/html/test.txt, which DOES NOT work. So I used:
--exclude test.txt --exclude images/
.. and it WORKS!
I personally like the --exclude={test.txt,images/} format.
Remember that with rsync, all exclude (or include) paths beginning with / are anchored to the root of the transfer, not to the root of the filesystem. In your example the source /var/www/html has no trailing slash, so the transfer root is /var/www and the anchored forms would be /html/test.txt and /html/images/; with a trailing slash on the source (/var/www/html/) they would be /test.txt and /images/. In other words, /test.txt only ever matches a file at the root of your transfer, not the file at that absolute path on the filesystem. You can find more info and examples here
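A sketch of both working forms, consistent with that rule (host placeholder as in the question):
# no trailing slash on the source: the transfer root is /var/www, so the anchors keep html/
rsync -avz --exclude=/html/test.txt --exclude=/html/images/ /var/www/html root@xx.xx.xx.xx:/var/www
# trailing slash on the source: the transfer root is /var/www/html, so the anchors drop html/
rsync -avz --exclude=/test.txt --exclude=/images/ /var/www/html/ root@xx.xx.xx.xx:/var/www/html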
Currently I am only rsync-ing directories, like this:
* * * * * rsync -avz /var/www/public_html/images root@<remote-ip>:/var/www/public_html
So how do I rsync one single file, like /var/www/public_html/.htaccess?
You do it the same way as you would a directory, but you specify the full path to the filename as the source. In your example:
rsync -avz --progress /var/www/public_html/.htaccess root@<remote-ip>:/var/www/public_html/
As mentioned in the comments: since -a includes recursion, one little typo can kick off a full directory-tree transfer, so a more fool-proof approach might be to just use -vz, or to replace -a with -lptgoD.
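In other words, something like this (same single-file transfer, recursion left out):
rsync -lptgoDvz --progress /var/www/public_html/.htaccess root@<remote-ip>:/var/www/public_html/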
Basic syntax
rsync options source destination
Example
rsync -az /var/www/public_html/filename root@<remote-ip>:/var/www/public_html
Read more
Michael Place's answer works great if, relative to the root directory for both the source and target, all of the directories in the file's path already exist.
But what if you want to sync a file with this source path:
/source-root/a/b/file
to a file with the following target path:
/target-root/a/b/file
and the directories a and b don't exist?
You need to run an rsync command like the following:
rsync -r --include="/a/" --include="/a/b/" --include="/a/b/file" --exclude="*" [source] [target]
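An alternative sketch uses --relative (-R, supported in this form since rsync 2.6.7): the /./ in the source path marks where the reproduced path should start, so the missing parent directories are created automatically:
rsync -aR /source-root/./a/b/file /target-root/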
To date, two of the answers aren't quite right (they'll get more than one file), and the other isn't as simple as it could be. Here's a simpler answer, IMO.
The following gets exactly one file, but you have to create the dest directory with mkdir. This is probably the fastest option:
mkdir -p ./local/path/to/file
rsync user@remote:/remote/path/to/file/ -zarv --include "filename" --exclude "*" ./local/path/to/file/
If there is only one instance of the file in /remote/path, rsync can create the directories for you if you do the following. This will probably take a little more time because it searches more directories. Plus, it will create empty directories for directories in /remote/path that are not in ./local:
cd ./local
rsync user@remote:/remote/path -zarv --include "*/" --include "filename" --exclude "*" .
Keep in mind that the order of --include and --exclude matters.
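For example, reversing the order breaks it, because rsync acts on the first rule that matches each name:
# wrong: the catch-all exclude is checked first, so nothing is transferred
rsync -zarv --exclude "*" --include "*/" --include "filename" user@remote:/remote/path .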
Aside from the good answers above: rsync expects the destination to be a directory and not a filename. Suppose you are copying the word-list file words to /tmp; don't do this:
rsync -az /usr/share/dict/words /tmp/words # does not work
'cp' is tolerant of this form, but rsync isn't - it will fail because it doesn't see a directory at /tmp/words. Snip off the destination filename and it works:
rsync -az /usr/share/dict/words /tmp
Note that rsync won't let you change the filename during the copy, and cp will.
I am trying to rsync directory A of server1 with directory B of server2.
Sitting in the directory A of server1, I ran the following commands.
rsync -av * server2::sharename/B
but the interesting thing is, it synchronizes all files and directories except .htaccess and any other hidden files directly in directory A. Hidden files within subdirectories do get synchronized.
I also tried the following command:
rsync -av --include=".htaccess" * server2::sharename/B
but the results are the same.
Any ideas why the hidden files in directory A are not getting synchronized, and how to fix it? I am running as the root user.
thanks
This is due to the fact that * is by default expanded to all files in the current working directory except the files whose name starts with a dot. Thus, rsync never receives these files as arguments.
You can pass . (denoting the current working directory) to rsync:
rsync -av . server2::sharename/B
This way rsync will look for files to transfer in the current working directory as opposed to looking for them in what * expands to.
Alternatively, you can use the following command to make * expand to all files including those which start with a dot:
shopt -s dotglob
See also shopt manpage.
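You can see the difference by letting echo show what the shell would actually hand to rsync:
echo rsync -av * server2::sharename/B   # without dotglob: .htaccess is missing from the expansion
shopt -s dotglob
echo rsync -av * server2::sharename/B   # with dotglob: .htaccess now appears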
For anyone who's just trying to sync directories between servers (including all hidden files) -- e.g., syncing somedirA on source-server to somedirB on a destination server -- try this:
rsync -avz -e ssh --progress user@source-server:/somedirA/ somedirB/
Note the slashes at the end of both paths. Any other syntax may lead to unexpected results!
Also, for me it's easiest to perform rsync commands from the destination server, because it's easier to make sure I've got proper write access (i.e., I might need to add sudo to the command above).
Probably goes without saying, but obviously your remote user also needs read access to somedirA on your source server. :)
I had the same issue.
For me, when I ran the following command, the hidden files did not get rsync'ed:
rsync -av /home/user1 server02:/home/user1
But when I added the slashes at the end of the paths, the hidden files were rsync'ed.
rsync -av /home/user1/ server02:/home/user1/
Note the slashes at the end of the paths; as Brian Lacy said, the slashes are the key. I don't have the reputation to comment on his post or I would have done that.
I think the problem is due to shell wildcard expansion. Use . instead of star.
Consider the following example directory content
$ ls -a .
. .. .htaccess a.html z.js
The shell's wildcard expansion translates the argument list that the rsync program gets from
-av * server2::sharename/B
into
-av a.html z.js server2::sharename/B
before the command starts getting executed.
The * tells rsync not to sync hidden files. You should not omit it.
On a related note, in case anyone is coming in from Google etc. trying to find out why rsync is not copying hidden subfolders: I found one additional reason why this can happen and figured I'd pay it forward for the next person running into the same thing. It happens if you are using the -C option (obviously --exclude could do it too, but I figure that one's a bit easier to spot).
In my case, I had a script that was copying several folders across computers, including a directory with several git projects, and I noticed that I couldn't run any of the normal git commands in the copied repos (yes, normally one should use git clone, but this was part of a larger backup that included other things). After looking at the script, I found that it was calling rsync with 7 or 8 options.
After googling didn't turn up any obvious answers, I started going through the switches one by one. After dropping the -C option, it worked correctly. In the case of the script, the -C flag appears to have been added by mistake, likely because sftp was originally used and -C is a compression-related option under that tool.
Per man rsync, the option is described as:
--cvs-exclude, -C auto-ignore files in the same way CVS does
Since CVS is an older version control system, and given the man page description, it makes perfect sense that it would behave this way.
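If you suspect -C (or any other switch) is silently dropping files, comparing two dry runs makes the difference explicit (src/ and dest/ are placeholders):
rsync -avn    src/ dest/ > /tmp/list-without-C
rsync -avn -C src/ dest/ > /tmp/list-with-C
diff /tmp/list-without-C /tmp/list-with-C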