I'm trying to get a report of file changes for development reasons and just want a simple cron command. With some Googling I managed to discover that you can get a list of files based on time-stamp changes emailed to you.
I'm trying to use this:
find /home/username/public_html/dev --exclude ".admin/cache/*" --exclude ".cache/*" -mtime -1 \! -type d -ls
But I get this error:
find: unknown predicate `--exclude'
Also, is there such a thing as an include option, something that does the opposite: instead of excluding, it includes?
--exclude is not listed in my copy of the find(1) man page. I'd try something like this:
find /home/username/public_html/dev -path '*/.admin/cache' -prune -o -path '*/.cache' -prune -o -mtime -1 \! -type d -ls
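One catch: find's -path test compares against the whole pathname it constructs, starting-point prefix included, so a bare pattern like ".admin/cache" never matches anything under a longer starting directory; a leading */ (or the full prefix) fixes that. A throwaway-directory sketch (paths illustrative):

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/dev/.admin/cache" "$tmp/dev/.cache" "$tmp/dev/src"
touch "$tmp/dev/.admin/cache/skip.txt" "$tmp/dev/.cache/skip2.txt" "$tmp/dev/src/keep.txt"

# -path matches the whole pathname find builds, so a relative pattern
# like ".cache" alone would never match; "*/.cache" does.
result=$(find "$tmp/dev" \( -path '*/.admin/cache' -o -path '*/.cache' \) -prune -o -mtime -1 ! -type d -print)

rm -rf "$tmp"
```

Only keep.txt survives; both cache directories are pruned before find descends into them.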
Related
I'd like to use find (or something else if that is more appropriate) to generate a file listing of an entire filesystem, except for /proc. I will then install some software and run this command again to see which files have changed.
This is what I have tried, with no success:
find / -type f -xdev -exec md5sum {} + > md5sum.txt
That hangs on /proc.
find / \( -path /proc -o -path /var/run \) -prune -o -print -type f -exec md5sum {} + > md5sum.txt
I get a partial file listing, then a listing of files with an md5sum. (??)
Is there a way to use the updatedb command to do this? Thanks.
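For what it's worth, the partial listing in the second attempt comes from predicate order: -print sits before -type f, so it fires for every non-pruned name, and the -exec md5sum then runs separately on the files. Reordering fixes it; a small sketch with a stand-in layout (names illustrative):

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/proc" "$tmp/etc"
touch "$tmp/etc/passwd"

# Buggy ordering: -print fires for every non-pruned name, directories
# included, because it comes before the -type f test.
buggy=$(find "$tmp" -path "$tmp/proc" -prune -o -print -type f)

# Fixed ordering: prune, then filter to regular files, then act.
fixed=$(find "$tmp" -path "$tmp/proc" -prune -o -type f -print)

rm -rf "$tmp"
```

With the fixed ordering, something like find / \( -path /proc -o -path /var/run \) -prune -o -type f -exec md5sum {} + > md5sum.txt should then emit checksums only, with no stray directory listing.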
I've looked at numerous articles and I can't seem to figure it out; I suppose I'm a noob.
Anyway, I have a directory that I would like to tar; however, I want to exclude the files at the top level of that directory, as well as the folders
"plugins", "backups", and "logs" that sit at that top level.
#!/bin/bash
mkdir -p /path/to/backup/directory/`date +%d%m%y`
cd /path/to/backup/directory/`date +%d%m%y`
cd .. | find . -not \( -path plugins -prune -o -path backups -prune -o -path logs -prune \) -mindepth 1 -print0 | xargs -0 tar cpj --directory=$(cd -) -f `date +%H`.tar.gz
The find section is what's wrong; it doesn't exclude anything. This is my 30th (not literally but probably higher than that actually xD ) attempt to prune and what not, with each attempt looking more ridiculous than the last.
If someone could just show me a solution for the find section, that'd be great - thanks
Use --exclude=PATTERN and pass */ as the file argument; the glob expands only to directories, which leaves out the top-level files (note: no trailing slash on the excludes, since tar matches the names as they are traversed):
tar --exclude=plugins --exclude=logs --exclude=backups -cf /path/to/backup/directory/`date +%d%m%y`/whatever.tar [other options] */
As for excluding directories with find, try (the ./ prefix matters, since -path matches the whole pathname that find builds):
find . -type d \( -path ./plugins -o -path ./backups -o -path ./logs \) -prune -o -print
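Both forms can be sanity-checked in a scratch directory. This sketch exercises the tar variant (excludes without trailing slashes, which is what GNU tar matches during creation; all names here are made up):

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/site/plugins" "$tmp/site/backups" "$tmp/site/logs" "$tmp/site/data"
touch "$tmp/site/plugins/p.txt" "$tmp/site/data/d.txt" "$tmp/site/top.txt"

cd "$tmp/site"
# The */ glob expands only to directories, so top.txt never reaches tar;
# excluded directories are skipped entirely during traversal, contents included.
tar --exclude=plugins --exclude=logs --exclude=backups -cf "$tmp/backup.tar" */
listing=$(tar -tf "$tmp/backup.tar")
cd /
rm -rf "$tmp"
```

The archive should end up holding only data/ and data/d.txt.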
I would like to find all of the files within a directory and its subdirectories except for any settings files and anything in settings or dependency directories.
For example, I want to exclude from my results entire directories like .git, .idea, and node_modules as well as files like .DS_Store and config.codekit, but I want to include .gitignore.
What I want is something like the results of the following Git command, but including any untracked files and able to be easily and safely operated upon (e.g., to change permissions).
git ls-tree -r master --name-only
Here is what I have so far and although it is rather unwieldy it seems to mostly do what I want except for leaving out .gitignore:
find . -type f -not -name ".*" -not -name config.codekit -not -path "./.*" -not -path "./node_modules/*"
I have experimented with -prune without much success.
Is there a way to specify exceptions to exclusions in the find command via Bash—to say something like exclude all the things that match this pattern EXCEPT this thing or these things?
By the way, I am presently using OS X, but I also use Ubuntu and I plan to try Ubuntu on Windows when the Windows 10 Anniversary Update is generally available, so ideally I would like to have a command that works across all of those.
Thank you in advance for any solutions, insights, or optimizations!
Update
Thanks to help from gniourf-gniourf, I have revised my command. This seems to do what I wanted:
find . -type f \( \! -name ".*" \! -name config.codekit \! -path "./.*" \! -path "./node_modules/*" -o -name .gitignore \)
A quick example first: to find all files, pruning the .git directories and ignoring the .DS_Store files:
find . -name .git -type d \! -prune -o \! -name .DS_Store -type f
For example, I want to exclude from my results entire directories like .git, .idea, and node_modules as well as files like .DS_Store and config.codekit, but I want to include .gitignore.
find . \( -name .git -o -name .idea -o -name node_modules \) -type d \! -prune -o \! -name .DS_Store \! -name config.codekit -type f
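The \! -prune deserves a note: -prune still stops the descent even when negated, but the negation makes the whole expression false for the pruned directory, so the implicit -print skips printing it too. A quick check in a scratch tree (names made up):

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/proj/.git" "$tmp/proj/src"
touch "$tmp/proj/.git/HEAD" "$tmp/proj/src/main.c" "$tmp/proj/.DS_Store"

# .git is pruned (not descended into) and, thanks to the negation,
# not printed either; .DS_Store fails the name test on the other branch.
out=$(cd "$tmp/proj" && find . -name .git -type d ! -prune -o ! -name .DS_Store -type f)

rm -rf "$tmp"
```

Only ./src/main.c should come out: no .git, no .DS_Store, no directories.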
When building your command, make sure you stick with the POSIX standard: it's a guarantee that your command will work on any (POSIX compliant) system. For example, -not is not POSIX compliant: use ! instead (you'll have to escape it so as to not clash with your shell history expansion).
Is there a way to specify exceptions to exclusions in the find command via Bash—to say something like exclude all the things that match this pattern EXCEPT this thing or these things?
Find files, excluding everything (pattern *) except the files one and two:
find . \( \! -name '*' -o -name one -o -name two \) -type f
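Since ! -name '*' is false for every name, only the whitelisted -o branches can make the parenthesised test true. A scratch-directory check:

```shell
tmp=$(mktemp -d)
touch "$tmp/one" "$tmp/two" "$tmp/three"

# Everything is excluded by ! -name '*'; one and two are re-admitted
# by the -o branches.
kept=$(find "$tmp" \( ! -name '*' -o -name one -o -name two \) -type f)

rm -rf "$tmp"
```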
How do I search (using the find command) for directories and copy all of their files, and the directories themselves, to another directory in Linux?
Here is what I have so far:
find -type d -name "*.ABC" -exec {} /Desktop/NewFile \;
I get this as output:
find: './GAE/.ABC: PERMISSION DENIED
Please Help, Thanks!
Your error above has nothing to do with file read permission. You're trying to execute the directories you find! Avoid running commands as root or sudo unless: (1) you really need it and (2) you really know what you're doing. Quite often, people asking for root or sudo privileges are exactly the ones who should not have them.
That said... there are several ways to copy a directory tree under *nix. This is just one possible approach:
$ find <start> -type d -name \*.ABC -exec cp -av {} <target> \;
Where:
<start> is a directory name. It's used to tell find where to start its search (for example /usr/local or $HOME)
<target> is another directory name to define the final destination of your copied directories
UPDATE
In case you want to search for multiple name patterns...
$ find <start> -type d \( -name \*.ABC -o -name \*.DEF \) -exec cp -av {} <target> \;
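A scratch-tree check of the same pattern (directory and target names are made up here):

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/src/proj.ABC" "$tmp/src/other" "$tmp/dst"
touch "$tmp/src/proj.ABC/f.txt"

# cp -a copies each matched directory itself, recursively, preserving
# attributes; -v logs every copy as it happens.
find "$tmp/src" -type d -name '*.ABC' -exec cp -av {} "$tmp/dst" \;
copied=$(find "$tmp/dst" -type f)

rm -rf "$tmp"
```

The target ends up with proj.ABC/f.txt inside it; the non-matching "other" directory is untouched.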
This should work:
find ./source_dir -name \*.png -print0 | xargs -0 cp -t path/to/destination
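Note that cp -t is a GNU coreutils extension: it names the target directory once so xargs can append many source files after it. A quick scratch check, with a space in one filename to show why -print0/-0 matter:

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/src/sub" "$tmp/dst"
touch "$tmp/src/a.png" "$tmp/src/sub/b b.png" "$tmp/src/c.txt"

# The NUL-delimited pipe survives the embedded space; c.txt fails -name.
find "$tmp/src" -name '*.png' -print0 | xargs -0 cp -t "$tmp/dst"
copied=$(ls "$tmp/dst")

rm -rf "$tmp"
```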
For more info, you can look up here.
How do I copy only symbolic links (not the files they point to, nor any other files) using rsync?
I tried
rsync -uvrl input_dir output_dir
but I need to copy the symbolic links exclusively. Any trick using the include/exclude options?
Per this question+answer, you can script this as a pipe. Pipes are an integral part of shell programming and shell scripting.
find /path/to/files -type l -print | \
rsync -av --files-from=- /path/to/files user@targethost:/path
What's going on here?
The find command starts at /path/to/files and steps recursively through everything "under" that point. The options to find are conditions that limit what gets output by the -print option. In this case, only things of -type l (symbolic link, according to man find) will be printed to find's "standard output".
These file names become the "standard input" of the rsync command's --files-from option.
Give it a shot. I haven't actually tested this, but it seems to me that it should work.
You can generate a list of files excluding links with find input_dir -not -type l; rsync has an option --exclude-from=exclude_file.txt.
You can do it in two steps:
find input_dir -not -type l > /tmp/rsync-exclude.txt
rsync -uvrl --exclude-from=/tmp/rsync-exclude.txt input_dir output_dir
Or as a one-liner in bash:
rsync -urvl --exclude-from=<(find input_dir -not -type l | sed 's/^..//') input_dir output_dir
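The sed 's/^..//' is there to strip the leading "./" that find prints, so the lines match the relative form rsync's exclude list expects; that only lines up when find is started from "." (i.e. from inside input_dir). A sketch of just the list-building half:

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/dir"
touch "$tmp/dir/file"
ln -s dir/file "$tmp/link"

# From inside the tree, find prints "./dir/file"; the sed drops the "./".
# The symlink is filtered out by -not -type l.
stripped=$(cd "$tmp" && find . -not -type l | sed 's/^..//')

rm -rf "$tmp"
```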
You can do it more easily like:
find /path/to/dir/ -type l -exec rsync -avP {} ssh_server:/path/to/server/ \;
EDIT:
If you want to copy symbolic links from the current directory only, without recursing, you can do:
find /path/to/dir/ -maxdepth 1 -type l -exec rsync -avP {} ssh_server:/path/to/server/ \;
I prefer this:
find ./ -type l -print > /tmp/list_of_links.txt
rsync -av --files-from=/tmp/list_of_links.txt /path/to/files user@targethost:/path
The reason is simple: with the previously suggested version, I had to enter my password for every file. This way I can send all symlinks at once, entering the password just once.