How to use 'mv' command to move files except those in a specific directory? - linux

I am wondering - how can I move all the files in a directory except those files in a specific directory (as 'mv' does not have a '--exclude' option)?

Let's assume the dir structure is like this:
|parent
|--child1
|--child2
|--grandChild1
|--grandChild2
|--grandChild3
|--grandChild4
|--grandChild5
|--grandChild6
And we need to move the files so that the result looks like this:
|parent
|--child1
| |--grandChild1
| |--grandChild2
| |--grandChild3
| |--grandChild4
| |--grandChild5
| |--grandChild6
|--child2
In this case, you need to exclude the two directories child1 and child2, and move the rest of the directories into child1.
Use:
mv !(child1|child2) child1
This will move all of the rest of the directories into the child1 directory.
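Note that !( ) is bash's extended-glob syntax, which is disabled by default in scripts (and in some interactive shells); a minimal sketch, assuming bash:
shopt -s extglob    # enable extended globbing first
mv !(child1|child2) child1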

Since find does have an exclude option, use find + xargs + mv:
find /source/directory -mindepth 1 -maxdepth 1 -name ignore-directory-name -prune -o -print0 | xargs -0 mv --target-directory=/target/directory
Note that this is almost copied from the find man page (I think using mv --target-directory is better than cpio).
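For the question's one-level layout, a simpler sketch of the same idea (run from inside parent/, using ! -name instead of -prune, since nothing below the top level needs excluding):
find . -mindepth 1 -maxdepth 1 ! -name child1 ! -name child2 -print0 | xargs -0 mv --target-directory=child1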

First get the names of files and folders and exclude whichever you want:
ls --ignore=file1 --ignore=folder1 --ignore=regular-expression1 ...
Then pass filtered names to mv as the first parameter and the second parameter will be the destination:
mv $(ls --ignore=file1 --ignore=folder1 --ignore=regular-expression1 ...) destination/

This isn't exactly what you asked for, but it might do the job:
mv the-folder-you-want-to-exclude somewhere-outside-of-the-main-tree
mv the-tree where-you-want-it
mv the-excluded-folder original-location
(Essentially, move the excluded folder out of the larger tree to be moved.)
So, if I have a/ and I want to exclude a/b/c/*:
mv a/b/c ../c
mv a final_destination
mkdir -p a/b
mv ../c a/b/c
Or something like that. Otherwise, you might be able to get find to help you.

This will move all files at or below the current directory that are not in the ./exclude/ directory to /wherever (the echo makes it a dry run that just prints the mv commands; remove it to actually move the files; -E is BSD find's flag for extended regular expressions):
find -E . -not -type d -and -not -regex '\./exclude/.*' -exec echo mv {} /wherever \;

ls | grep -v exclude-dir | xargs -t -I '{}' mv {} exclude-dir

Rename your directory to make it hidden, so the * wildcard does not see it:
mv specific_dir .specific_dir
mv * ../other_dir
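Once everything else has been moved, rename it back:
mv .specific_dir specific_dir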

#!/bin/bash
touch apple banana carrot dog cherry
mkdir fruit
F="apple banana carrot dog cherry"
mv ${F/dog/} fruit
# this removes 'dog' from the list F, so it remains in the
# current directory and is not moved to 'fruit'

Inspired by @user13747357's answer.
First you can ls the file and filter them by:
ls | egrep -v '(dir_name|file_name.ext)'
Then you can run the following command to move the files except the specific ones:
mv $(ls | egrep -v '(dir_name|file_name.ext)') target_dir
* Note that I tested this inside a specific directory. Cross-directory operation should be more carefully executed :)

Suppose your directory is:
.
├── dir1
│   └── a.txt
├── dir2
│   ├── b.txt
│   └── hello.c
├── file1.txt
├── file2.txt
└── file3.txt
and you want to put file1, file2, and file3 into dir2.
You can use
mv $(ls -p | grep -v /) dir2
to finish it, because
ls -p | grep -v /
will print all files (but no directories) in the cwd.

For example, if I want to move all files/directories - except a specified file or directory - inside "var/www/html" to a sub-folder named "my_sub_domain", then I use "mv" with the command "!(what_to_exclude)":
$ cd /var/www/html
$ mv !(my_sub_domain) my_sub_domain
To exclude more I use "|" to separate file/directory names:
$ mv !(my_sub_domain|test1.html) my_sub_domain
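This is the same extglob syntax as in the first answer above; if bash reports a syntax error near "!", enable extended globbing first:
$ shopt -s extglob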

mv * exclude-dir
was the perfect solution for me (mv prints an error about not being able to move exclude-dir into itself, but it still moves everything else).

Related

Find directories where a text is found in a specific file

How can I find the directories where a text is found in a specific file? E.g. I want to get all the directories in "/var/www/" that contain the text "foo-bundle" in the composer.json file. I have a command that already does it:
find ./ -maxdepth 2 -type f -print | grep -i 'composer.json' | xargs grep -i '"foo-bundle"'
However I want to make an sh script that gets all those directories and do things with them. Any idea?
find
Your current command is almost there. Instead of using xargs with grep, let's:
Move the grep to an -exec
Use xargs to pass the result to dirname to show only the parent folder
find ./ -maxdepth 2 -type f -exec grep -l "foo-bundle" {} /dev/null \; | xargs dirname
If you only want to search for composer.json files, we can include the -iname option like so:
find ./ -maxdepth 2 -type f -iname '*composer.json' -exec grep -l "foo-bundle" {} /dev/null \; | xargs dirname
If the | xargs dirname doesn't give enough data, we can extend it so we can loop over the results of find using a while read like so:
find ./ -maxdepth 2 -type f -iname '*composer.json' -exec grep -l "foo-bundle" {} /dev/null \; | while read -r line ; do
    parent="$(dirname "${line%%:*}")"
    echo "$parent"
done
grep
We can use grep to search for all files containing a specific text.
After looping over each line, we can
Remove everything after the : to get the file path
Use dirname to get the parent folder path
Consider this file setup, where test/b/composer.json contains foo-bundle
➜ /tmp tree
.
├── test
│   ├── a
│   │   └── composer.json
│   └── b
│       └── composer.json
└── test.sh
When running the following test.sh:
#!/bin/bash
grep -rw '/tmp/test' --include '*composer.json' -e 'foo-bundle' | while read -r line ; do
    parent="$(dirname "${line%:*}")"
    echo "$parent"
done
The result is as expected, the path to folder b:
/tmp/test/b
In order to find all files containing a particular piece of text, you can use:
find ./ -maxdepth 2 -type f -exec grep -l "composer.json" {} /dev/null \;
The result is a list of filenames. Now all you need to do is to get a way to launch the command dirname on all of them. (I tried using a simple pipe, but that would have been too easy :-) )
Thanks to @0stone0 for leading the way. I finally got it with:
#!/bin/sh
find /var/www -maxdepth 2 -type f -print | grep -i 'composer.json' | xargs grep -i 'foo-bundle' | while read -r line ; do
    parent="$(dirname "${line%%:*}")"
    echo "$parent"
done

How to list directories and files in a Bash by script?

I would like to list a directory tree, but I have to write a script for it, and as a parameter the script should take the path of a base directory. The listing should start from this base directory.
The output should look like this:
Directory: ./a
File: ./a/A
Directory: ./a/aa
File: ./a/aa/AA
Directory: ./a/ab
File: ./a/ab/AB
So I need to print the path from the base directory for every directory and file inside it.
UPDATED
Running the script, I should type in the terminal: "./test.sh /home/usr/Desktop/myDirectory" or "./test.sh myDirectory" (since I run test.sh from the Desktop level).
And right now the script should be run from the level of /home/usr/Desktop/myDirectory.
I have the following command in my test.sh file:
find . | sed -e "s/[^-][^\/]*\// |/g"
But it is a bare command, not a script, and it prints the output like this:
DIR: dir1
DIR: dir2
fileA
DIR: dir3
fileC
fileB
How to print the path from base directory for every dir or file from the base dir? Could someone help me to work it out?
It's not entirely clear what you want; maybe:
find . -type d -printf 'Directory: %p\n' -o -type f -printf 'File: %p\n'
However, to see the subtree of a directory, I find this more useful:
find "$dirname" -type f
To answer a comment: it can also be done in pure bash (builtins only, no external commands), using a recursive function.
rec_find() {
    local f
    for f in "$1"/*; do
        [[ -d $f ]] && echo "Directory: $f" && rec_find "$f"
        [[ -f $f ]] && echo "File: $f"
    done
}
rec_find "$1"
You can use the tree command. The -L flag sets the max depth. Examples:
tree
.
├── 1
│   └── test
├── 2
│   └── test
└── 3
    └── test
3 directories, 3 files
Or
tree -L 1
.
├── 1
├── 2
└── 3
3 directories, 0 files
Create your test.sh with the code below. Here you read the command-line parameter from the positional variable $1 and pass it to the find command.
#!/bin/bash
# the shebang above selects the shell that executes this script
find "$1" | sed -e "s/[^-][^\/]*\// |/g"
Now, how will it work?
./test.sh /home/usr/Desktop/myDirectory   # you execute this command
Here the command-line parameter will be assigned to $1. For more than one parameter you can use $1 through $9; beyond that, bash also accepts ${10} and up with braces, or you can use the shift command. (You will find more detailed information online.)
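A small sketch of both approaches (the parameter values are hypothetical):
set -- a b c d e f g h i j   # ten positional parameters
echo "${10}"                 # braces address the tenth parameter directly
shift 9                      # or discard $1..$9 ...
echo "$1"                    # ... so the old ${10} is now $1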
So your command will now be:
#!/bin/bash
find /home/usr/Desktop/myDirectory | sed -e "s/[^-][^\/]*\// |/g"
Hope this will help you.

Delete all files except the newest 3 in bash script

Question: How do you delete all files in a directory except the newest 3?
Finding the newest 3 files is simple:
ls -t | head -3
But I need to find all files except the newest 3 files. How do I do that, and how do I delete these files in the same line without making an unnecessary for loop for that?
I'm using Debian Wheezy and bash scripts for this.
This will list all files except the newest three:
ls -t | tail -n +4
This will delete those files:
ls -t | tail -n +4 | xargs rm --
This will also list dotfiles:
ls -At | tail -n +4
and delete with dotfiles:
ls -At | tail -n +4 | xargs rm --
But beware: parsing ls can be dangerous when the filenames contain funny characters like newlines or spaces. If you are certain that your filenames do not contain funny characters then parsing ls is quite safe, even more so if it is a one time only script.
If you are developing a script for repeated use then you should most certainly not parse the output of ls and use the methods described here: http://mywiki.wooledge.org/ParsingLs
Solution without the "ls" parsing problems (strangely named files)
This is a combination of ceving's and anubhava's answer.
Neither solution worked for me. Because I was looking for a script that should run every day to back up files into an archive, I wanted to avoid problems with ls (someone could have saved a funny-named file in my backup folder). So I modified the mentioned solutions to fit my needs.
My solution deletes all files, except the three newest files.
find . -type f -printf '%T@\t%p\n' |
sort -t $'\t' -g |
head -n -3 |
cut -d $'\t' -f 2- |
xargs -d '\n' rm --
Some explanation:
find lists all files (not directories) in the current folder. They are printed out with timestamps.
sort sorts the lines based on timestamp (oldest on top).
head -n -3 prints all lines except the last three (the three newest files).
cut removes the timestamps.
xargs runs rm for every selected file (-d '\n' keeps names containing spaces intact).
For you to verify my solution:
(
touch -d "6 days ago" test_6_days_old
touch -d "7 days ago" test_7_days_old
touch -d "8 days ago" test_8_days_old
touch -d "9 days ago" test_9_days_old
touch -d "10 days ago" test_10_days_old
)
This creates 5 files with different timestamps in the current folder. Run this script first and then the code for deleting old files.
The following looks a bit complicated, but is very cautious to be correct, even with unusual or intentionally malicious filenames. Unfortunately, it requires GNU tools:
count=0
while IFS= read -r -d ' ' && IFS= read -r -d '' filename; do
    (( ++count > 3 )) && printf '%s\0' "$filename"
done < <(find . -maxdepth 1 -type f -printf '%T@ %P\0' | sort -g -z) \
    | xargs -0 rm -f --
Explaining how this works:
Find emits <mtime> <filename><NUL> for each file in the current directory.
sort -g -z does a general (floating-point, as opposed to integer) numeric sort based on the first column (times) with the lines separated by NULs.
The first read in the while loop strips off the mtime (no longer needed after sort is done).
The second read in the while loop reads the filename (running until the NUL).
The loop increments, and then checks, a counter; if the counter's state indicates that we're past the initial skipping, then we print the filename, delimited by a NUL.
xargs -0 then appends that filename into the argv list it's collecting to invoke rm with.
If you want a one-liner:
ls -t | tail -n +4 | xargs -I {} rm {}
In zsh:
rm /files/to/delete/*(Om[1,-4])
If you want to include dotfiles, replace the parenthesized part with (Om[1,-4]D).
I think this works correctly with arbitrary chars in the filenames (just checked with newline).
Explanation: The parentheses contain Glob Qualifiers. O means "order by, descending", m means mtime (See man zshexpn for other sorting keys - large manpage; search for "be sorted"). [1,-4] returns only the matches at one-based index 1 to (last + 1 - 4) (note the -4 for deleting all but 3).
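To preview which files the glob would select before deleting anything, zsh's print builtin can list them (a sketch):
print -rl -- /files/to/delete/*(Om[1,-4])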
Don't use ls -t as it is unsafe for filenames that may contain whitespaces or special glob characters.
You can do this using all gnu based utilities to delete all but 3 newest files in the current directory:
find . -maxdepth 1 -type f -printf '%T@\t%p\0' |
sort -z -nrk1 |
tail -z -n +4 |
cut -z -f2- |
xargs -0 rm -f --
ls -t | tail -n +4 | xargs -I {} rm {}
works better for me than Michael Ballent's answer
ls -t | tail -n +4 | xargs rm --
which throws an error if I have fewer than 3 files (with no input, xargs -I runs nothing, while plain xargs still runs rm once with no operands).
Recursive script with an arbitrary number of files to keep per directory.
Also handles files/dirs with spaces, newlines and other odd characters.
#!/bin/bash
if (( $# != 2 )); then
    echo "Usage: $0 </path/to/top-level/dir> <num files to keep per dir>"
    exit
fi
while IFS= read -r -d $'\0' dir; do
    # Find the nth oldest file
    nthOldest=$(find "$dir" -maxdepth 1 -type f -printf '%T@\0%p\n' | sort -t '\0' -rg \
        | awk -F '\0' -v num="$2" 'NR==num+1{print $2}')
    if [[ -f "$nthOldest" ]]; then
        find "$dir" -maxdepth 1 -type f ! -newer "$nthOldest" -exec rm {} +
    fi
done < <(find "$1" -type d -print0)
Proof of concept
$ tree test/
test/
├── sub1
│   ├── sub1_0_days_old.txt
│   ├── sub1_1_days_old.txt
│   ├── sub1_2_days_old.txt
│   ├── sub1_3_days_old.txt
│   └── sub1\ 4\ days\ old\ with\ spaces.txt
├── sub2\ with\ spaces
│   ├── sub2_0_days_old.txt
│   ├── sub2_1_days_old.txt
│   ├── sub2_2_days_old.txt
│   └── sub2\ 3\ days\ old\ with\ spaces.txt
└── tld_0_days_old.txt
2 directories, 10 files
$ ./keepNewest.sh test/ 2
$ tree test/
test/
├── sub1
│   ├── sub1_0_days_old.txt
│   └── sub1_1_days_old.txt
├── sub2\ with\ spaces
│   ├── sub2_0_days_old.txt
│   └── sub2_1_days_old.txt
└── tld_0_days_old.txt
2 directories, 5 files
As an extension to the answer by flohall: if you want to remove all folders except the newest three folders, use the following:
find . -maxdepth 1 -mindepth 1 -type d -printf '%T@\t%p\n' |
sort -t $'\t' -g |
head -n -3 |
cut -d $'\t' -f 2- |
xargs -d '\n' rm -rf --
The -mindepth 1 excludes the parent folder itself, and -maxdepth 1 keeps find from descending into subfolders.
This uses find instead of ls with a Schwartzian transform.
find . -type f -printf '%T@\t%p\n' |
sort -t $'\t' -g |
tail -3 |
cut -d $'\t' -f 2-
find searches the files and decorates them with a time stamp and uses the tabulator to separate the two values. sort splits the input by the tabulator and performs a general numeric sort, which sorts floating point numbers correctly. tail should be obvious and cut undecorates.
The problem with decorations in general is finding a suitable delimiter that is not part of the input, the file names; answers elsewhere on this page use the NUL character for exactly that reason.
The below worked for me (note that ll is typically an alias for ls -l, which prints a "total" header line; that is why tail skips four lines here instead of three):
rm -rf $(ll -t | tail -n +5 | awk '{ print $9}')

diff to output only the file names

I'm looking to run a Linux command that will recursively compare two directories and output only the file names of what is different. This includes anything that is present in one directory and not the other or vice versa, and text differences.
From the diff man page:
-q Report only whether the files differ, not the details of the differences.
-r When comparing directories, recursively compare any subdirectories found.
Example command:
diff -qr dir1 dir2
Example output (depends on locale):
$ ls dir1 dir2
dir1:
same-file different only-1
dir2:
same-file different only-2
$ diff -qr dir1 dir2
Files dir1/different and dir2/different differ
Only in dir1: only-1
Only in dir2: only-2
You can also use rsync
rsync -rv --size-only --dry-run /my/source/ /my/dest/ > diff.out
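One caveat: --size-only compares sizes alone, so an edit that leaves a file exactly the same size will not show up. Dropping that flag, or comparing checksums with -c (as a later answer here does), is stricter; a sketch:
rsync -rvcn --delete /my/source/ /my/dest/ > diff.out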
If you want a list of files that are only in one directory (not in its subdirectories), showing only their file names:
diff -q /dir1 /dir2 | grep /dir1 | grep -E "^Only in" | sed -n 's/[^:]*: //p'
If you want to recursively list all the files and directories that are different with their full paths:
diff -rq /dir1 /dir2 | grep -E "^Only in /dir1" | sed -n 's/://p' | awk '{print $3"/"$4}'
This way you can apply different commands to all the files.
For example I could remove all the files and directories that are in dir1 but not dir2:
diff -rq /dir1 /dir2 | grep -E "^Only in /dir1" | sed -n 's/://p' | awk '{print $3"/"$4}' | xargs -I {} rm -r {}
The approach of running diff -qr old/ new/ has one major drawback: it may miss files in newly created directories. E.g. in the example below the file data/pages/playground/playground.txt is not in the output of diff -qr old/ new/ whereas the directory data/pages/playground/ is (search for playground.txt in your browser to quickly compare). I also posted the following solution on Unix & Linux Stack Exchange, but I'll copy it here as well:
To create a list of new or modified files programmatically the best solution I could come up with is using rsync, sort, and uniq:
(rsync -rcn --out-format="%n" old/ new/ && rsync -rcn --out-format="%n" new/ old/) | sort | uniq
Let me explain with this example: we want to compare two dokuwiki releases to see which files were changed and which ones were newly created.
We fetch the tars with wget and extract them into the directories old/ and new/:
wget http://download.dokuwiki.org/src/dokuwiki/dokuwiki-2014-09-29d.tgz
wget http://download.dokuwiki.org/src/dokuwiki/dokuwiki-2014-09-29.tgz
mkdir old && tar xzf dokuwiki-2014-09-29.tgz -C old --strip-components=1
mkdir new && tar xzf dokuwiki-2014-09-29d.tgz -C new --strip-components=1
Running rsync one way might miss newly created files as the comparison of rsync and diff shows here:
rsync -rcn --out-format="%n" old/ new/
yields the following output:
VERSION
doku.php
conf/mime.conf
inc/auth.php
inc/lang/no/lang.php
lib/plugins/acl/remote.php
lib/plugins/authplain/auth.php
lib/plugins/usermanager/admin.php
Running rsync only in one direction misses the newly created files, and the other way round would miss deleted files. Compare the output of diff:
diff -qr old/ new/
yields the following output:
Files old/VERSION and new/VERSION differ
Files old/conf/mime.conf and new/conf/mime.conf differ
Only in new/data/pages: playground
Files old/doku.php and new/doku.php differ
Files old/inc/auth.php and new/inc/auth.php differ
Files old/inc/lang/no/lang.php and new/inc/lang/no/lang.php differ
Files old/lib/plugins/acl/remote.php and new/lib/plugins/acl/remote.php differ
Files old/lib/plugins/authplain/auth.php and new/lib/plugins/authplain/auth.php differ
Files old/lib/plugins/usermanager/admin.php and new/lib/plugins/usermanager/admin.php differ
Running rsync both ways and sorting the output to remove duplicates reveals that the directory data/pages/playground/ and the file data/pages/playground/playground.txt were missed initially:
(rsync -rcn --out-format="%n" old/ new/ && rsync -rcn --out-format="%n" new/ old/) | sort | uniq
yields the following output:
VERSION
conf/mime.conf
data/pages/playground/
data/pages/playground/playground.txt
doku.php
inc/auth.php
inc/lang/no/lang.php
lib/plugins/acl/remote.php
lib/plugins/authplain/auth.php
lib/plugins/usermanager/admin.php
rsync is run with these arguments:
-r to "recurse into directories",
-c to also compare files of identical size and only "skip based on checksum, not mod-time & size",
-n to "perform a trial run with no changes made", and
--out-format="%n" to "output updates using the specified FORMAT", which is "%n" here for the file name only
The output (list of files) of rsync in both directions is combined and sorted using sort, and this sorted list is then condensed by removing all duplicates with uniq.
On my Linux system, to get just the filenames:
diff -q /dir1 /dir2 | cut -f2 -d' '
I have a directory.
$ tree dir1
dir1
├── a
│   └── 1.txt
├── b
│   └── 2.txt
└── c
    ├── 3.txt
    ├── 4.txt
    └── d
        └── 5.txt
4 directories, 5 files
I have another directory.
$ tree dir2
dir2
├── a
│   └── 1.txt
├── b
└── c
    ├── 3.txt
    ├── 5.txt
    └── d
        └── 5.txt
4 directories, 4 files
I can diff two directories.
$ diff <(cd dir1; find . -type f | sort) <(cd dir2; find . -type f| sort)
--- /dev/fd/11 2022-01-21 20:27:15.000000000 +0900
+++ /dev/fd/12 2022-01-21 20:27:15.000000000 +0900
@@ -1,5 +1,4 @@
./a/1.txt
-./b/2.txt
./c/3.txt
-./c/4.txt
+./c/5.txt
./c/d/5.txt
rsync -rvc --delete --size-only --dry-run source-dir/ target-dir/

How to copy a file to multiple directories using the gnu cp command

Is it possible to copy a single file to multiple directories using the cp command?
I tried the following, which did not work:
cp file1 /foo/ /bar/
cp file1 {/foo/,/bar}
I know it's possible using a for loop, or find. But is it possible using the gnu cp command?
You can't do this with cp alone but you can combine cp with xargs:
echo dir1 dir2 dir3 | xargs -n 1 cp file1
Will copy file1 to dir1, dir2, and dir3. xargs will call cp 3 times to do this; see the man page for xargs for details.
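If any of the directory names could contain spaces, a NUL-delimited variant of the same idea is safer (a sketch; printf '%s\0' and xargs -0 are the only changes):
printf '%s\0' dir1 dir2 dir3 | xargs -0 -n 1 cp file1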
No, cp can copy multiple sources but will only copy to a single destination. You need to arrange to invoke cp multiple times - once per destination - for what you want to do; using, as you say, a loop or some other tool.
Wildcards also work with Robert's code:
echo ./fs*/* | xargs -n 1 cp test
I would use cat and tee based on the answers I saw at https://superuser.com/questions/32630/parallel-file-copy-from-single-source-to-multiple-targets instead of cp.
For example:
cat inputfile | tee outfile1 outfile2 > /dev/null
As far as I can see, you can use the following:
ls | xargs -n 1 cp -i file.dat
The -i option of the cp command means that you will be asked whether to overwrite a file in the current directory with file.dat. Though it is not a completely automatic solution, it worked out for me.
These answers all seem more complicated than the obvious:
for i in /foo /bar; do cp "$file1" "$i"; done
ls -db di*/subdir | xargs -n 1 cp File
The -b is in case there is a space in a directory name; otherwise it would be split into separate items by xargs (I had this problem with the echo version).
Not using cp per se, but...
This came up for me in the context of copying lots of Gopro footage off of a (slow) SD card to three (slow) USB drives. I wanted to read the data only once, because it took forever. And I wanted it recursive.
$ tar cf - src | tee >( cd dest1 ; tar xf - ) >( cd dest2 ; tar xf - ) | ( cd dest3 ; tar xf - )
(And you can add more of those >() sections if you want more outputs.)
I haven't benchmarked that, but it's definitely a lot faster than cp-in-a-loop (or a bunch of parallel cp invocations).
If you want to do it without a forked command:
tee <inputfile file2 file3 file4 ... >/dev/null
To copy with xargs to directories matched by wildcards on Mac OS, the only solution that worked for me with spaces in the directory names is:
find ./fs*/* -type d -print0 | xargs -0 -n 1 cp test
Where test is the file to copy
And ./fs*/* the directories to copy to
The problem is that xargs treats spaces as argument separators; the solutions that change the delimiter character using -d or -E unfortunately do not work properly on Mac OS.
Essentially equivalent to the xargs answer, but in case you want parallel execution:
parallel -q cp file1 ::: /foo/ /bar/
So, for example, to copy file1 into all subdirectories of the current folder (including recursion):
parallel -q cp file1 ::: `find -mindepth 1 -type d`
N.B.: This probably only conveys any noticeable speed gains for very specific use cases, e.g. if each target directory is a distinct disk.
It is also functionally similar to the '-P' argument for xargs.
No - you cannot.
I've found on multiple occasions that I could use this functionality so I've made my own tool to do this for me.
http://github.com/ddavison/branch
pretty simple -
branch myfile dir1 dir2 dir3
ls -d */ | xargs -iA cp file.txt A
Suppose you want to copy fileName.txt to all sub-directories within the present working directory.
Get all sub-directory names through ls and save them to some temporary file, say allFolders.txt (list only directories, so the temporary file itself does not end up in the list):
ls -d */ > allFolders.txt
Print the list and pass it to the command xargs.
cat allFolders.txt | xargs -n 1 cp fileName.txt
Another way is to use cat and tee as follows:
cat <source file> | tee <destination file 1> | tee <destination file 2> [...] > <last destination file>
I think this would be pretty inefficient though, since the job would be split among several processes (one per destination) and the hard drive would be writing several files at once over different parts of the platter. However if you wanted to write a file out to several different drives, this method would probably be pretty efficient (as all copies could happen concurrently).
Using a bash script:
DESTINATIONPATH[0]="xxx/yyy"
DESTINATIONPATH[1]="aaa/bbb"
..
DESTINATIONPATH[5]="MainLine/USER"
NumberOfDestinations=6
for (( i=0; i<NumberOfDestinations; i++ ))
do
    cp SourcePath/fileName.ext "${DESTINATIONPATH[$i]}"
done
exit
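The count does not have to be hardcoded; iterating over the array directly is equivalent (a sketch assuming the same DESTINATIONPATH array as above):
for dest in "${DESTINATIONPATH[@]}"; do
    cp SourcePath/fileName.ext "$dest"
done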
If you want to copy multiple folders to multiple folders, one can do something like this:
echo dir1 dir2 dir3 | xargs -n 1 cp -r /path/toyourdir/{subdir1,subdir2,subdir3}
If all your target directories match a path expression — like they're all subdirectories of path/to — then just use find in combination with cp like this:
find ./path/to/* -type d -exec cp [file name] {} \;
That's it.
If you need to be specific about which folders to copy the file into, you can combine find with one or more greps. For example, to replace any occurrences of favicon.ico in any subfolder, you can use:
find . | grep favicon\.ico | xargs -n 1 cp -f /root/favicon.ico
This will copy to the immediate sub-directories; if you want to go deeper, adjust the -maxdepth parameter.
find . -mindepth 1 -maxdepth 1 -type d| xargs -n 1 cp -i index.html
If you don't want to copy to all directories, hopefully you can filter out the directories you are not interested in. Example: copying to all folders starting with a:
find . -mindepth 1 -maxdepth 1 -type d| grep \/a |xargs -n 1 cp -i index.html
If copying to an arbitrary/disjoint set of directories, you'll need Robert Gamble's suggestion.
I like to copy a file into multiple directories as such:
cp file1 /foo/; cp file1 /bar/; cp file1 /foo2/; cp file1 /bar2/
And copying a directory into other directories:
cp -r dir1/ /foo/; cp -r dir1/ /bar/; cp -r dir1/ /foo2/; cp -r dir1/ /bar2/
I know it's like issuing several commands, but it works well for me when I want to type 1 line and walk away for a while.
For example, if you are in the parent directory of your destination folders, you can do:
for i in $(ls); do cp sourcefile $i; done
