I have a filter file '.filter' for rsync with the following contents:
merge '../.pattern'
The included file '.pattern' in turn includes another pattern file from its parent directory, as it has the same contents:
merge '../.pattern'
When files are included this way, rsync apparently goes into infinite recursion and exits with a "too many open files" error. Is there a way to use multiple levels of included filter files with relative paths in rsync?
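For illustration, the layout is something like this (the directory names are hypothetical, not from my actual setup):

/top/sync/.filter    contains: merge '../.pattern'
/top/.pattern        contains: merge '../.pattern'
/.pattern            (and so on, up the tree)

Each merge is intended to pull in the pattern file one level up; instead, rsync seems to keep re-opening files until it hits the open-file limit.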
I know there has been a huge discussion about this, but I have not found anything this specific.
I'm trying to copy all .key files in the /home// directory.
This does not work:
/usr/bin/rsync -auPA --include="*/*.key" --exclude="*" /home/* /tmp/test
This works, but it copies over unwanted empty directories like /home/uname/Documents:
/usr/bin/rsync -auPA --include="*/" --include="*.key" --exclude="*" /home /tmp/test
Basically, what I need rsync to do is copy only files with the .key extension, and create only those folders that are needed to hold the .key files.
I think you are looking for the -m option. From the man page:
-m, --prune-empty-dirs
This option tells the receiving rsync to get rid of empty directories from the file-list, including nested directories that
have no non-directory children. This is useful for avoiding the creation of a bunch of useless directories when the sending
rsync is recursively scanning a hierarchy of files using include/exclude/filter rules.
Note that the use of transfer rules, such as the --min-size option, does not affect what goes into the file list, and thus
does not leave directories empty, even if none of the files in a directory match the transfer rule.
Because the file-list is actually being pruned, this option also affects what directories get deleted when a delete is active.
However, keep in mind that excluded files and directories can prevent existing items from being deleted due to an exclude both
hiding source files and protecting destination files. See the perishable filter-rule option for how to avoid this.
You can prevent the pruning of certain empty directories from the file-list by using a global "protect" filter. For instance,
this option would ensure that the directory "emptydir" was kept in the file-list:
--filter 'protect emptydir/'
Here’s an example that copies all .pdf files in a hierarchy, only creating the necessary destination directories to hold the
.pdf files, and ensures that any superfluous files and directories in the destination are removed (note the hide filter of
non-directories being used instead of an exclude):
rsync -avm --del --include='*.pdf' -f 'hide,! */' src/ dest
If you didn’t want to remove superfluous destination files, the more time-honored options of "--include='*/' --exclude='*'"
would work fine in place of the hide-filter (if that is more natural to you).
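Applied to your .key case, adding -m to the second command you tried (the one that worked apart from the empty directories) should prune the unwanted folders. A sketch, untested:

rsync -auPAm --include="*/" --include="*.key" --exclude="*" /home /tmp/test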
Using a shell script, I wish to delete all files and folders from /folder2/ that do not exist in /folder1/. Files need only be matched by name.
I must add that the contents of the two folders won't necessarily match after this operation, because /folder1/ may contain files that do not exist in /folder2/. So after the script runs, all files and folders found in /folder2/ can also be found in /folder1/, but not vice versa.
The following works for me:
rsync -r --delete --existing --ignore-existing /path/to/folder1/ /path/to/folder2/
rsync will recursively delete all files and folders from folder2 that are not found in folder1. Thanks to --existing and --ignore-existing, it also skips creating or updating any files on the destination. This answer was found here: https://serverfault.com/a/713577
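Since this deletes files, it is worth previewing the result first; adding -n (--dry-run) together with -v prints what would be removed without changing anything:

rsync -rvn --delete --existing --ignore-existing /path/to/folder1/ /path/to/folder2/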
Currently I am trying to use:
wget --user=xxx --password=xxx -r ftp://www.domain.com/htdocs/
But this saves the output files to the current directory in this fashion:
curdir/www.domain.com/htdocs/*
I need it to be:
curdir/*
Is there a way to do this? I only see a way to set an output prefix, but I think that would just let me define a directory outside the current one.
You can combine your --directory-prefix option with --no-directories, if you want all files inside one directory, or with --no-host-directories, to keep subdirectories but not create a separate directory per host.
2.6 Directory Options
‘-nd’
‘--no-directories’
Do not create a hierarchy of directories when retrieving recursively. With this option turned on, all files will get saved to the current directory, without clobbering (if a name shows up more than once, the filenames will get extensions ‘.n’).
‘-nH’
‘--no-host-directories’
Disable generation of host-prefixed directories. By default, invoking Wget with ‘-r http://fly.srk.fer.hr/’ will create a structure of directories beginning with fly.srk.fer.hr/. This option disables such behavior.
‘-P prefix’
‘--directory-prefix=prefix’
Set directory prefix to prefix. The directory prefix is the directory where all other files and subdirectories will be saved to, i.e. the top of the retrieval tree. The default is ‘.’ (the current directory).
(From the wget manual.)
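For the URL in the question specifically, -nH alone would still leave the htdocs/ component. wget also has a --cut-dirs option that strips the given number of remote directory components, so something like this should yield curdir/*:

wget --user=xxx --password=xxx -r -nH --cut-dirs=1 ftp://www.domain.com/htdocs/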
In Perforce, I want to list all the files in the current directory but the result should not include the files from the subdirectories.
For example, if I have,
//depot/X/first.c
//depot/X/second.c
//depot/Y/third.c
//depot/Z/fourth.c
The result of the command, when run for //depot/X, would contain first.c and second.c only.
The command,
p4 files //depot/X/...
will list all the files recursively, so it is of no use.
I tried with other wildcards like *, but couldn't find an answer.
I think the question is: how do you list all files and directories in a specified directory, without including the contents of subdirectories?
Any one of the following commands:
p4 files //xxx/xxxx/"*"
p4 files //xxx/xxxx/'*'
p4 files //xxx/xxxx/*
will list just the files in the directory, but will leave out the subdirectories.
If you want to get the names of all the subdirectories as well, you can use p4 dirs:
p4 dirs //xxx/xxxx/*
The subdirectories will then be printed to the screen.
Normally, the command would be p4 files //depot/X/*. However, it seems that in your case you are using the csh shell; in that case, the * wildcard has to be quoted, e.g. p4 files //depot/X/'*'.
Using rsync, how can I rename files when copying with the --files-from argument? I have about 190,000 files, each of which needs to be renamed when copying from source to destination. I plan to keep the list of files in a text file to pass to the --files-from argument.
Not entirely true... you CAN rename files en route with rsync, but only if you rsync one file at a time, set the --no-R and --no-implied-dirs options, and explicitly set the destination name in the destination path.
But at that point, you may just want to use some other tool.
This, for example, would work:
rsync --no-R --no-implied-dirs \
    1.2.3.4::module/$FILENAME \
    /$PATH/$TOFILE/$NEWFILENAME
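Building on that per-file approach, a wrapper loop can drive it from a rename list. A minimal sketch, assuming a hypothetical tab-separated file renames.tsv with one source-path / new-name pair per line and a made-up destination root /dest/path:

while IFS=$'\t' read -r src new; do
    # copy a single file, giving it its new name at the destination
    rsync --no-R --no-implied-dirs "1.2.3.4::module/$src" "/dest/path/$new"
done < renames.tsv

Note that spawning one rsync process per file will be slow for 190,000 files, so copying everything first and renaming afterwards with a second tool is likely faster.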
There is no way to arbitrarily rename files with rsync. All rsync can do is move files to a different directory.
You must use a second tool either on the sending or receiving side to rename the files.