Move multiple files to multiple directories - Linux

I have 5 files called file1.txt, file2.txt ... file5.txt and I would like to move each one into a respective directory called dir1, dir2 ... dir5.
So file1.txt is moved into dir1, file2.txt is moved into dir2 and so on.
Is there a way to do this in one line at the command line, using mv and xargs perhaps?
I'm only suggesting xargs because I quite like this answer provided by Robert Gamble to a question asking how to copy one file to multiple directories.
echo dir1 dir2 dir3 | xargs -n 1 cp file1

I would personally prefer a solution that relies on a for loop, e.g.:
for n in {1..5}; do echo mv -- "file$n.txt" "dir$n/"; done
#                   ^^^^ remove the echo once the output looks right
This can be done with xargs, but I find that solution less elegant:
seq 1 5 | xargs -n1 -I{} echo mv -- "file{}.txt" "dir{}/"
#                        ^^^^ remove the echo once the output looks right

Another way to do it, in case the suffixes aren't necessarily consecutive integers and the dir* directories don't necessarily exist yet:
for f in *.txt; do n=${f: -5:1}; mkdir -p "dir$n"; mv -- "$f" "dir$n/"; done
# ${f: -5:1} extracts the single character just before ".txt" (the digit)

Using GNU Parallel you would do:
parallel mv {} {=s/file/dir/;s/\.txt$//=} ::: *.txt
The {=...=} perl expression rewrites each file name (file1.txt becomes dir1) to produce the destination directory.

Related

How to move n files from inside a directory called directory1 to a new directory

I have a list of files inside directory1 and I want to move n of them to directory2. When I tried xargs like this, it did not work:
ls -ltr | head -20 | mv xargs /directory2
Why can't we use xargs in the middle of a pipe? How do I move n files to another directory on the command line?
First, you should not use ls in scripts; ls is for humans, not for scripting. Second, the last command of your pipe tries to move a file named xargs to /directory2. Third, xargs appends its input to the end of the command, so even if you swapped mv and xargs you would end up executing mv /directory2 file1 file2 file3... file20 instead of what you want: mv file1 file2 file3... file20 /directory2. Finally, the -l option of ls prints more than just the file name (permissions, owner, group...), so you cannot use its output as mv arguments.
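If you do want an xargs pipeline, GNU mv has a -t (--target-directory) option that takes the destination first, which sidesteps the argument-order problem. A minimal sketch, assuming GNU coreutils and placeholder file names without newlines or other special characters:
# mv -t puts the target directory up front, so xargs can
# safely append the file names at the end of the command.
printf '%s\n' file1 file2 file3 | xargs mv -t /directory2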
Your ls options suggest that you want to move the 20 oldest files. Try:
while read -d '' -r time file; do
    mv -- "$file" "/directory2"
done < <( stat --printf '%Y %n\0' * | sort -zn | head -zn20 )
stat --printf '%Y %n\0' * prints the last modification time as seconds since epoch, followed by the file name, of all files in the current directory. Each record is terminated by the NUL character, the only character that cannot be found in a file name. The -z option of sort and head and the -d '' option of read instruct these utilities to use NUL as the record separator instead of the default (newline). This way, the script should work even if some of your file names contain newlines.
If you prefer xargs:
stat --printf '%Y\t%n\0' * | sort -zn | head -zn20 | cut -zf2- |
    xargs -0I{} mv -- {} /directory2
You can try this:
n=20
cd /path/to/directory1 || exit
files=(*)
mv -- "${files[@]:0:n}" /path/to/directory2
Note that files=(*) expands in lexicographic order, so this moves the first n names alphabetically, not the n oldest files.
Also, you may consider reading this article: Why you shouldn't parse the output of ls

Searching through every file in a directory (and in any sub-directories) one by one

I'm trying to loop through every file in a directory (including files in its subdirectories) and perform some action if the file meets an if-condition.
Part of my code is as follows:
for f in $direc/*
do
    if grep -q 'search_term' $f; then
        #action on this file
    fi
done
However, this fails in the case of subdirectories. I would be very grateful if someone could help me out.
Thank you!
The -R option to grep will read all files in the directory tree, including subdirectories. Combined with the -l option, which prints only the names of matching files, you can use it to perform an action on each file that matches.
grep -Rl pattern directory | while IFS= read -r path; do echo "$path" && mv "$path" /tmp; done
For example, that would print each matching file name and move the file to a different directory.
find | xargs is the usual pattern I use; it has the advantage of not getting hung up on special characters in file names (spaces etc.) if you use the -print0 option of find.
find . -type f -print0 | xargs -0 -I{} sh -c 'if grep -q "search string" "$1"; then cmd-to-run "$1"; fi' sh {}
Passing the file name to sh -c as a positional parameter, rather than substituting {} into the script text, keeps names containing quotes from being interpreted as shell code.
Yes, because with this syntax grep expects to process files, not directories. The minimal change to your script would be to test whether $f is a regular file:
...
if [ -f "$f" ] && grep -q 'search_term' "$f"; then
...
In reality you would probably want to get the list of files with a pattern match and act on those:
while IFS= read -r f; do
    : # action on file "$f"
done < <(grep -rl 'search_term' "$direc"/)
I've opted for getting the list of files through <(list) because piping it into while would cause the inside of your loop to run in another process (which could be a problem in particular if you expect any variable changes made inside the loop to be visible outside it). And unlike a simple for loop over command output in backticks, it's not as sensitive to what file names you encounter (I have spaces in mind; it would still get confused by newlines, though). Speaking of which:
while IFS= read -rd '' f; do
    : # action on file "$f"
done < <(grep -rZl 'search_term' "$direc"/)
Nothing should be able to confuse that, as entries are NUL-delimited and NUL is the one character that cannot appear in a file name.
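To see why the < <(...) form matters for variables, here's a minimal sketch (the count variable is a hypothetical example): a pipe runs the loop body in a subshell, so assignments made there are lost.
count=0
grep -rl 'search_term' "$direc"/ | while IFS= read -r f; do ((count++)); done
echo "$count"   # still 0: the loop ran in a subshell
count=0
while IFS= read -r f; do ((count++)); done < <(grep -rl 'search_term' "$direc"/)
echo "$count"   # number of matching files: the loop ran in the current shell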
Assuming no newlines in your file names:
find "$direc" -type f -exec grep -q 'search_term' {} \; -print |
while IFS= read -r f; do
#action on this file
done

grep filenames matching a pattern and move to desired folder

I have a list of patterns in a .txt file [list.txt]. For each line in list.txt, I want to find all the files at a location which begin with the specified pattern, and then move these files to another location.
Consider an example case.
At /home/ana/folder_a I have list.txt, which looks like this...
list.txt
1abc
2def
3xyz
At this location, i.e. /home/ana/folder_a/, there are multiple files beginning with the patterns in list.txt. So there are files like 1abc_a.txt, 1abc_c.txt, 1abc_f.txt, 2def_g.txt, 3xyz_a.txt
So what I want to achieve is this:
for i in cat list.txt; do
ls | grep '^$i' [that's the pattern] |
mv [files containing the pattern] to /home/ana/folder_b/
Please note that at the other location, i.e. /home/ana/folder_b/, I have already created directories specific to each pattern.
So /home/ana/folder_b/ contains subdirectories like 1abc/, 2def/, 3xyz/
In effect, I wish to move all the files matching patterns '1abc', '2def' and '3xyz' from /home/ana/folder_a/ to their respective sub-directories in /home/ana/folder_b/, such that /home/ana/folder_b/1abc/ will have 1abc_a.txt, 1abc_c.txt, and 1abc_f.txt; /home/ana/folder_b/2def/ will have 2def_g.txt; and /home/ana/folder_b/3xyz/ will have 3xyz_a.txt
Grep's -f option matches patterns from a file so you don't have to loop over each line in the file in shell:
$ ls # List all files in dir, some match, some don't
1abc_a.txt 1abc_c.txt 1abc_f.txt 2def_g.txt 3xyz_a.txt file1 file2 list.txt
$ cat list.txt # List patterns to match against
1abc
2def
3xyz
$ ls | grep -f list.txt # grep for files that only match pattern
1abc_a.txt
1abc_c.txt
1abc_f.txt
2def_g.txt
3xyz_a.txt
Pipe to xargs to do the move:
ls | grep -f list.txt | xargs -i -t mv {} ../folder_B
mv 1abc_a.txt ../folder_B
mv 1abc_c.txt ../folder_B
mv 1abc_f.txt ../folder_B
mv 2def_g.txt ../folder_B
mv 3xyz_a.txt ../folder_B
Edit: Realised I missed the subdirectory part of the question; @Thor's answer is the best approach for that, but you might still find some use in this answer.
I think glob expansion is the way to go here:
while IFS= read -r pattern; do
    mv "${pattern}"* ../folder_b/"$pattern"
done < list.txt
Start with an echo in front of the mv command, and remove it when you're happy with the output.
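The dry run would look like this; once the printed commands are correct, drop the echo:
while IFS= read -r pattern; do
    echo mv "${pattern}"* ../folder_b/"$pattern"
done < list.txt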
I'd suggest using the -exec action of find to call mv in your loop.
beginning file structure: (as you can see, I'm calling this from the parent of folder_a and folder_b)
$ find
.
./folder_a
./folder_a/1abc_a.txt
./folder_a/1abc_c.txt
./folder_a/1abc_f.txt
./folder_a/2def_g.txt
./folder_a/3xyz_a.txt
./folder_b
./folder_b/1abc
./folder_b/2def
./folder_b/3xyz
./list.txt
$ cat list.txt
1abc
2def
3xyz
command:
while read -r pattern
do
    find ./folder_a -type f -name "$pattern*" -exec mv "{}" "./folder_b/$pattern" \;
done <list.txt
alternate command (same thing, just all on one line):
while read -r pattern; do find ./folder_a -type f -name "$pattern*" -exec mv "{}" "./folder_b/$pattern" \;; done <list.txt
resulting file structure:
$ find
.
./folder_a
./folder_b
./folder_b/1abc
./folder_b/1abc/1abc_a.txt
./folder_b/1abc/1abc_c.txt
./folder_b/1abc/1abc_f.txt
./folder_b/2def
./folder_b/2def/2def_g.txt
./folder_b/3xyz
./folder_b/3xyz/3xyz_a.txt
./list.txt

Copy the three newest files under one directory (recursively) to another specified directory

I'm using bash.
Suppose I have a log file directory /var/myprogram/logs/.
Under this directory I have many sub-directories and sub-sub-directories that include different types of log files from my program.
I'd like to find the three newest files (modified most recently), whose name starts with 2010, under /var/myprogram/logs/, regardless of sub-directory and copy them to my home directory.
Here's what I would do manually
1. Go through each directory and do ls -lt 2010* to see which files starting with 2010 were modified most recently.
2. Once I've gone through all directories, I know which three files are the newest, so I copy them manually to my home directory.
This is pretty tedious, so I wondered if maybe I could somehow pipe some commands together to do this in one step, preferably without using shell scripts?
I've been looking into find, ls, head, and awk, which I might be able to use, but I haven't figured out the right way to glue them together.
Let me know if I need to clarify. Thanks.
Here's how you can do it:
find -type f -name '2010*' -printf "%C@\t%P\n" | sort -r -k1,1 | head -3 | cut -f 2-
This outputs a list of files prefixed by their last change time, sorts them on that value, takes the top 3, and removes the timestamp prefix.
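A NUL-safe variant that also performs the copy might look like this (a sketch assuming GNU find and coreutils, using the question's /var/myprogram/logs path; %T@ is the modification time, which is what the question asks about):
find /var/myprogram/logs -type f -name '2010*' -printf '%T@\t%p\0' |
    sort -zrn | head -zn3 | cut -zf2- | xargs -0 cp -t ~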
Your answers feel very complicated, how about
for FILE in `find . -type d`; do ls -t -1 -F "$FILE" | grep -v "/" | grep "^2010" | head -n3 | xargs -I{} mv "$FILE/{}" ~; done;
or laid out nicely
for FILE in `find . -type d`;
do
    ls -t -1 -F "$FILE" | grep -v "/" | grep "^2010" | head -n3 | xargs -I{} mv "$FILE/{}" ~;
done;
My "shortest" answer after quickly hacking it up.
for file in $(find . -iname '*.php' -mtime -1 | xargs ls -l | awk '{ print $6" "$7" "$8" "$9 }' | sort | sed -n '1,3p' | awk '{ print $4 }'); do cp "$file" ../; done
The main command stored in $() does the following:
Find all files recursively in current directory matching (case insensitive) the name *.php and having been modified in the last 24 hours.
Pipe to ls -l, required to be able to sort by modification date, so we can have the first three
Extract the modification date and file name/path with awk
Sort these files based on datetime
With sed print only the first 3 files
With awk print only their name/path
Used in a for loop and as action copy them to the desired location.
Or use @Hasturkun's variant, which popped up as a response while I was editing this post :)

How to copy a file to multiple directories using the gnu cp command

Is it possible to copy a single file to multiple directories using the cp command ?
I tried the following, which did not work:
cp file1 /foo/ /bar/
cp file1 {/foo/,/bar}
I know it's possible using a for loop, or find. But is it possible using the gnu cp command?
You can't do this with cp alone but you can combine cp with xargs:
echo dir1 dir2 dir3 | xargs -n 1 cp file1
This will copy file1 to dir1, dir2, and dir3. xargs will call cp three times to do this; see the xargs man page for details.
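To watch the three separate cp invocations, add the -t flag, which makes xargs echo each command to stderr before running it:
echo dir1 dir2 dir3 | xargs -n 1 -t cp file1
cp file1 dir1
cp file1 dir2
cp file1 dir3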
No, cp can copy multiple sources but will only copy to a single destination. You need to arrange to invoke cp multiple times - once per destination - for what you want to do; using, as you say, a loop or some other tool.
Wildcards also work with Robert's code:
echo ./fs*/* | xargs -n 1 cp test
I would use cat and tee based on the answers I saw at https://superuser.com/questions/32630/parallel-file-copy-from-single-source-to-multiple-targets instead of cp.
For example:
cat inputfile | tee outfile1 outfile2 > /dev/null
As far as I can see, you can use the following:
ls | xargs -n 1 cp -i file.dat
The -i option of the cp command means that you will be asked whether to overwrite each file in the current directory with file.dat. Though it is not a completely automatic solution, it worked out for me.
These answers all seem more complicated than the obvious:
for i in /foo /bar; do cp "$file1" "$i"; done
ls -db di*/subdir | xargs -n 1 cp File
-b is there in case there is a space in a directory name; otherwise it would be broken into separate items by xargs. I had this problem with the echo version.
Not using cp per se, but...
This came up for me in the context of copying lots of Gopro footage off of a (slow) SD card to three (slow) USB drives. I wanted to read the data only once, because it took forever. And I wanted it recursive.
$ tar cf - src | tee >( cd dest1 ; tar xf - ) >( cd dest2 ; tar xf - ) | ( cd dest3 ; tar xf - )
(And you can add more of those >() sections if you want more outputs.)
I haven't benchmarked that, but it's definitely a lot faster than cp-in-a-loop (or a bunch of parallel cp invocations).
If you want to do it without a forked command:
tee <inputfile file2 file3 file4 ... >/dev/null
To copy with xargs to directories matched by wildcards on Mac OS, the only solution that worked for me with spaces in directory names is:
find ./fs*/* -type d -print0 | xargs -0 -n 1 cp test
Where test is the file to copy
And ./fs*/* the directories to copy to
The problem is that xargs sees spaces as argument separators; the solutions that change the delimiter character using -d or -E unfortunately do not work properly on Mac OS.
Essentially equivalent to the xargs answer, but in case you want parallel execution:
parallel -q cp file1 ::: /foo/ /bar/
So, for example, to copy file1 into all subdirectories of current folder (including recursion):
parallel -q cp file1 ::: `find -mindepth 1 -type d`
N.B.: This probably only conveys any noticeable speed gains for very specific use cases, e.g. if each target directory is a distinct disk.
It is also functionally similar to the '-P' argument for xargs.
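For comparison, the parallel form above maps onto xargs like this (a sketch; -P is supported by both GNU and BSD xargs, and its value caps the number of concurrent cp processes):
echo /foo/ /bar/ | xargs -n 1 -P 2 cp file1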
No - you cannot.
I've found on multiple occasions that I could use this functionality so I've made my own tool to do this for me.
http://github.com/ddavison/branch
pretty simple -
branch myfile dir1 dir2 dir3
ls -d */ | xargs -iA cp file.txt A
Suppose you want to copy fileName.txt to all sub-directories within the present working directory.
Get all the sub-directory names through ls and save them to some temporary file, say allFolders.txt:
ls -d */ > allFolders.txt
Print the list and pass it to the xargs command:
cat allFolders.txt | xargs -n 1 cp fileName.txt
Another way is to use cat and tee as follows:
cat <source file> | tee <destination file 1> | tee <destination file 2> [...] > <last destination file>
I think this would be pretty inefficient though, since the job would be split among several processes (one per destination) and the hard drive would be writing several files at once over different parts of the platter. However if you wanted to write a file out to several different drives, this method would probably be pretty efficient (as all copies could happen concurrently).
Using a bash script
DESTINATIONPATH[0]="xxx/yyy"
DESTINATIONPATH[1]="aaa/bbb"
..
DESTINATIONPATH[5]="MainLine/USER"
NumberOfDestinations=6
for (( i=0; i<NumberOfDestinations; i++ ))
do
    cp SourcePath/fileName.ext "${DESTINATIONPATH[$i]}"
done
exit
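A shorter equivalent iterates the array directly, which removes the manual counter and the NumberOfDestinations bookkeeping (a sketch reusing the same hypothetical paths):
for dest in "${DESTINATIONPATH[@]}"
do
    cp SourcePath/fileName.ext "$dest"
done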
If you want to copy multiple folders to multiple folders, you can do something like this:
echo dir1 dir2 dir3 | xargs -n 1 cp -r /path/toyourdir/{subdir1,subdir2,subdir3}
If all your target directories match a path expression — like they're all subdirectories of path/to — then just use find in combination with cp like this:
find ./path/to/* -type d -exec cp [file name] {} \;
That's it.
If you need to be specific about which folders to copy the file into, you can combine find with one or more greps. For example, to replace any occurrences of favicon.ico in any subfolder, you can use:
find . | grep favicon\.ico | xargs -n 1 cp -f /root/favicon.ico
This will copy to the immediate sub-directories; if you want to go deeper, adjust the -maxdepth parameter.
find . -mindepth 1 -maxdepth 1 -type d | xargs -n 1 cp -i index.html
If you don't want to copy to all directories, you can filter out the ones you are not interested in. For example, copying to all folders starting with "a":
find . -mindepth 1 -maxdepth 1 -type d | grep '/a' | xargs -n 1 cp -i index.html
If copying to an arbitrary/disjoint set of directories, you'll need Robert Gamble's suggestion.
I like to copy a file into multiple directories as such:
cp file1 /foo/; cp file1 /bar/; cp file1 /foo2/; cp file1 /bar2/
And copying a directory into other directories:
cp -r dir1/ /foo/; cp -r dir1/ /bar/; cp -r dir1/ /foo2/; cp -r dir1/ /bar2/
I know it's like issuing several commands, but it works well for me when I want to type 1 line and walk away for a while.
For example, if you are in the parent directory of your destination folders, you can do:
for i in */; do cp sourcefile "$i"; done
