Copy files specified in a text file from a directory (Linux)

I have a directory called images
and it contains many images.
For example:
images/
    img001.png
    img002.png
    img003.png
    img004.png
And I have a text file that lists the files I want to copy somewhere else. Say the test.txt file has:
img001.png
img003.png
How do I copy the files specified in test.txt from the images folder to some other place?

Try this one-liner from your images directory (note that it breaks on filenames containing spaces or shell metacharacters):
awk '{print "cp "$0" /target/path"}' test.txt | sh

There are probably many solutions to this problem. I would do it by using xargs:
cd images/
cat path/to/test.txt | xargs -I FILES cp FILES path/to/dest/

I think in the bash shell you can do:
for image in $(cat copylist.txt); do
    cp "$image" destination
done
(Note that the $(cat ...) expansion word-splits, so this fails on filenames containing spaces.)

You can write:
while IFS= read -r image_filename ; do
    cp images/"$image_filename" some_other_place/
done < test.txt

cat copylist.txt | xargs -n1 -I % echo cp % destination/.
# remove echo after testing
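If the list may contain names with spaces, a GNU xargs variant avoids the word-splitting pitfalls of the loop approaches above. A minimal sketch, with invented filenames:

```shell
# Sketch: copy files listed one-per-line in test.txt into dest/,
# handling names that contain spaces (paths here are illustrative).
mkdir -p images dest
printf 'a file.png\nb.png\n' > test.txt
touch 'images/a file.png' images/b.png
# GNU xargs: -d '\n' treats each whole line as one argument;
# -- guards against filenames that start with a dash
(cd images && xargs -d '\n' -I{} cp -- {} ../dest/ < ../test.txt)
```

Note that -d is a GNU extension; on BSD/macOS xargs you would need a different approach (e.g. a while read loop).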

Related

Renaming folders and files in subdirectories using text file linux

I am trying to rename the files and directories using a text file separated by space.
The text file looks like this:
dir1-1 dir1_1
dir2-1 dir223_1
My command is as follows:
xargs -r -a files.txt -L1 mv
This command renames only the folders, from dir1-1 to dir1_1, dir2-1 to dir223_1, and so on, but it doesn't rename the files in the subdirectories. The files in the corresponding directories also carry the prefixes of those directories.
Looking forward to your assistance.
Assuming you don't have special characters (space, tab, ...) in your file/dir names, try:
perl_script=$(
echo 'chop($_); $orig=$_;'
while read -r src tgt; do
echo 'if (s{(.*)/'"$src"'([^/]*)}{$1/'"$tgt"'\2}) { print "$orig $_\n";next;}'
done < files.txt)
find . -depth | perl -ne "$perl_script" | xargs -r -L1 echo mv
Remove echo once you see it does what you wanted.
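A plain-bash alternative to the generated Perl script, assuming the names contain no whitespace or regex metacharacters; the rename pair and paths below are invented for the demo:

```shell
# Sketch: for every "src tgt" pair in files.txt, rename matching
# directories and files (demo tree and pair are made up).
mkdir -p demo/dir1-1 && touch demo/dir1-1/dir1-1_notes.txt
printf 'dir1-1 dir1_1\n' > files.txt
while read -r src tgt; do
  # -depth visits children before their parent directories,
  # so renaming a directory never invalidates later paths
  find demo -depth -name "*$src*" | while read -r path; do
    mv "$path" "${path%/*}/$(basename "$path" | sed "s/$src/$tgt/")"
  done
done < files.txt
```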

Copy the newest two files and append the date

I'm looking for a way to copy the newest .gz files I have in Dir A to Dir B, and append the date to that files.
Example:
Dir A
cinema.gz
radio.gz
cars.txt
camera.gz
Dir B
cinema.gz.20200310
radio.gz.20200310
Using the following command I can copy the newest .gz files to dirb:
cp $(ls -1t dira/*.gz | head -2) "dirb"
However, I can't find a way to append the date to the filename.
I was trying something like this:
cp $(ls -1t dira/*.gz | head -2) "dirb/*.$(date +%F_%R)"
But it doesn't work at all.
Your help please :)
for TO_MOVE in $(ls -t *.gz | head -n2); do
    cp "$TO_MOVE" dirb/
    mv "dirb/$TO_MOVE" "dirb/$TO_MOVE.$(date +%Y%m%d)"
done
You can't compact this into one cp command, sadly. However, you can do this:
for f in $(ls -1t dira/*.gz | head -2); do
    cp "$f" "dirb/$(basename "$f").$(date +%F_%R | sed 's;/;\\/;g')"
done
Edit: I pipe the date through sed to escape the '/' characters in the date string (Otherwise cp will interpret them as directory names).
Try the below code from parent directory of dira and dirb;
ls -1t dira/*.gz | head -2 | while read -r line; do cp $line dirb/${line#*/}.$(date +%F_%R); done
I'm using while to loop over the files and ${line#*/} trims the directory name. Let me know if you have any query.
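Putting the pieces together, a runnable sketch of the copy-and-datestamp idea (directory names and files are assumptions for the demo; names must not contain spaces for the word-split loop to be safe):

```shell
# Sketch: copy the two newest .gz files from dira to dirb,
# appending a datestamp to each copy.
mkdir -p dira dirb
touch dira/cinema.gz dira/radio.gz dira/cars.txt
stamp=$(date +%Y%m%d)
# word-splitting is fine here only because the names contain no spaces
for f in $(ls -1t dira/*.gz | head -2); do
  cp "$f" "dirb/$(basename "$f").$stamp"
done
```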

Copy files from a list to a folder

I have a text file abc.txt and its contents are:
/lag/cnn/org/one.txt
/lag/cnn/org/two.txt
/lag/cnn/org/three.txt
If I use
tar -cvf allfiles.tar -T abc.txt
I get a tar of the files in the list. Similarly, is it possible to copy the files listed in abc.txt to a folder?
I tried:
cp --files-from test1.txt ./Folder
but it is not working. Please help.
You could use xargs:
cat test1.txt | xargs -I{} cp {} ./Folder
In order to avoid Useless use of cat, you could say:
xargs -a test1.txt -I{} cp {} ./Folder
You can use xargs for example, or do it in a loop:
while read FILE; do cp "$FILE" ./Folder; done <test1.txt
You can write:
while IFS= read -r FILENAME ; do
cp -- "$FILENAME" ./Folder || break
done < test1.txt
which loops over the lines of the file, reading each line into the variable FILENAME, and running cp for each one.
You could use tar cvf - to write the tar to stdout and pipe it right into a tar xvfC - Folder.
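That tar-through-a-pipe idea can be sketched end to end; the file names below are invented for the demo:

```shell
# Sketch: copy the files listed in abc.txt into Folder via a tar pipe.
mkdir -p org Folder
printf 'hello\n' > org/one.txt
printf 'org/one.txt\n' > abc.txt
# the first tar writes the listed files to stdout;
# the second unpacks them into Folder (-C changes directory first)
tar cf - -T abc.txt | tar xf - -C Folder
```

Note that, unlike plain cp, this preserves the listed paths, so the file lands at Folder/org/one.txt rather than Folder/one.txt.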

mkdir command for a list of filenames paths

I have a txt file with content like this:
/home/username/Desktop/folder/folder3333/IMAGw488.jpg
/home/username/Desktop/folder/folder3333/IMAG04f88.jpg
/home/username/Desktop/folder/folder3333/IMAGe0488.jpg
/home/username/Desktop/folder/folder3333/IMAG0r88.jpg
/home/username/Desktop/folder/folder3333/
/home/username/Desktop/folder/
/home/username/Desktop/folder/IMAG0488.jpg
/home/username/Desktop/folder/fff/fff/feqw/123.jpg
/home/username/Desktop/folder/fffa/asd.png
....
These are paths of files, but also paths of folders.
The problem I want to solve is to create all folders that don't exist.
I want to call the mkdir command for every folder that does not exist.
How can I do this in an easy way?
Thanks
This can be done in native bash syntax without any calls to external binaries:
while read line; do mkdir -p "${line%/*}"; done < infile
Or perhaps with just a single call to mkdir if you have bash 4.x:
mapfile -t arr < infile; mkdir -p "${arr[@]%/*}"
How about...
for p in $(xargs < somefile.txt); do
    mkdir -p "$(dirname "$p")"
done
xargs -n 1 dirname <somefile.txt | xargs mkdir -p
It can be done without a loop also (provided the input file is not huge):
mkdir -p $(perl -pe 's#/(?!.*/).*$##' file.txt)
If you have a file "file1" with filenames, you could try this one-liner:
cat file1 |xargs -I {} dirname "{}"| sort -u | xargs -I{} mkdir -p "{}"
Use of:
xargs -I{} mkdir -p "{}"
ensures that even path names with spaces will be created
Using a perl one-liner and File::Path qw(make_path):
perl -MFile::Path=make_path -lne 'make_path $_' dirlist.txt
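The dirname-plus-xargs pipeline above can be exercised like this; the paths are invented for the demo:

```shell
# Sketch: create every parent directory named in somefile.txt.
printf '%s\n' base/a/b/img1.jpg base/a/img2.jpg base/c/ > somefile.txt
# dirname strips the last path component (and a trailing slash),
# sort -u removes duplicates before a single mkdir -p call
xargs -n 1 dirname < somefile.txt | sort -u | xargs mkdir -p
```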

How to copy a file to multiple directories using the gnu cp command

Is it possible to copy a single file to multiple directories using the cp command ?
I tried the following , which did not work:
cp file1 /foo/ /bar/
cp file1 {/foo/,/bar}
I know it's possible using a for loop, or find. But is it possible using the gnu cp command?
You can't do this with cp alone but you can combine cp with xargs:
echo dir1 dir2 dir3 | xargs -n 1 cp file1
Will copy file1 to dir1, dir2, and dir3. xargs will call cp 3 times to do this, see the man page for xargs for details.
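As a self-contained demo of that pattern (the directory and file names here are made up):

```shell
# Sketch: fan one file out to three directories with xargs.
mkdir -p dir1 dir2 dir3
printf 'data\n' > file1
# -n 1 runs cp once per directory name
echo dir1 dir2 dir3 | xargs -n 1 cp file1
```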
No, cp can copy multiple sources but will only copy to a single destination. You need to arrange to invoke cp multiple times - once per destination - for what you want to do; using, as you say, a loop or some other tool.
Wildcards also work with Robert's code:
echo ./fs*/* | xargs -n 1 cp test
I would use cat and tee based on the answers I saw at https://superuser.com/questions/32630/parallel-file-copy-from-single-source-to-multiple-targets instead of cp.
For example:
cat inputfile | tee outfile1 outfile2 > /dev/null
As far as I can see, you can use the following:
ls | xargs -n 1 cp -i file.dat
The -i option of cp command means that you will be asked whether to overwrite a file in the current directory with the file.dat. Though it is not a completely automatic solution it worked out for me.
These answers all seem more complicated than the obvious:
for i in /foo /bar; do cp "$file1" "$i"; done
ls -db di*/subdir | xargs -n 1 cp File
-b is used in case there is a space in a directory name; otherwise xargs would split it into separate items. I had this problem with the echo version.
Not using cp per se, but...
This came up for me in the context of copying lots of GoPro footage off of a (slow) SD card to three (slow) USB drives. I wanted to read the data only once, because it took forever. And I wanted it recursive.
$ tar cf - src | tee >( cd dest1 ; tar xf - ) >( cd dest2 ; tar xf - ) | ( cd dest3 ; tar xf - )
(And you can add more of those >() sections if you want more outputs.)
I haven't benchmarked that, but it's definitely a lot faster than cp-in-a-loop (or a bunch of parallel cp invocations).
If you want to do it without a forked command:
tee <inputfile file2 file3 file4 ... >/dev/null
To copy with xargs into directories matched by wildcards on macOS, the only solution that worked for me with spaces in the directory name is:
find ./fs*/* -type d -print0 | xargs -0 -n 1 cp test
Where test is the file to copy
And ./fs*/* the directories to copy to
The problem is that xargs treats spaces as argument separators; the workarounds that change the delimiter character using -d or -E unfortunately do not work properly on macOS.
Essentially equivalent to the xargs answer, but in case you want parallel execution:
parallel -q cp file1 ::: /foo/ /bar/
So, for example, to copy file1 into all subdirectories of current folder (including recursion):
parallel -q cp file1 ::: `find -mindepth 1 -type d`
N.B.: This probably only conveys any noticeable speed gains for very specific use cases, e.g. if each target directory is a distinct disk.
It is also functionally similar to the '-P' argument for xargs.
No - you cannot.
I've found on multiple occasions that I could use this functionality so I've made my own tool to do this for me.
http://github.com/ddavison/branch
pretty simple -
branch myfile dir1 dir2 dir3
ls -d */ | xargs -iA cp file.txt A
Suppose you want to copy fileName.txt to all sub-directories within the present working directory.
Get all sub-directory names through ls and save them to some temporary file, say allFolders.txt:
ls > allFolders.txt
Print the list and pass it to command xargs.
cat allFolders.txt | xargs -n 1 cp fileName.txt
Another way is to use cat and tee as follows:
cat <source file> | tee <destination file 1> | tee <destination file 2> [...] > <last destination file>
I think this would be pretty inefficient though, since the job would be split among several processes (one per destination) and the hard drive would be writing several files at once over different parts of the platter. However if you wanted to write a file out to several different drives, this method would probably be pretty efficient (as all copies could happen concurrently).
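A quick way to convince yourself that the tee fan-out produces identical copies (the file names here are invented):

```shell
# Sketch: one read of src.bin, three written copies.
printf 'payload\n' > src.bin
cat src.bin | tee copy1.bin copy2.bin > copy3.bin
```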
Using a bash script
DESTINATIONPATH[0]="xxx/yyy"
DESTINATIONPATH[1]="aaa/bbb"
..
DESTINATIONPATH[5]="MainLine/USER"
NumberOfDestinations=6
for (( i=0; i<NumberOfDestinations; i++ )); do
    cp SourcePath/fileName.ext "${DESTINATIONPATH[$i]}"
done
exit
If you want to copy multiple folders to multiple folders, you can do something like this:
echo dir1 dir2 dir3 | xargs -n 1 cp -r /path/toyourdir/{subdir1,subdir2,subdir3}
If all your target directories match a path expression, like they're all subdirectories of path/to, then just use find in combination with cp like this:
find ./path/to/* -type d -exec cp [file name] {} \;
That's it.
If you need to be specific about which folders to copy the file into, you can combine find with one or more greps. For example, to replace any occurrences of favicon.ico in any subfolder you can use:
find . | grep favicon\.ico | xargs -n 1 cp -f /root/favicon.ico
This will copy to the immediate sub-directories; if you want to go deeper, adjust the -maxdepth parameter.
find . -mindepth 1 -maxdepth 1 -type d| xargs -n 1 cp -i index.html
If you don't want to copy to all directories, you can filter out the directories you are not interested in. For example, copying to all folders starting with a:
find . -mindepth 1 -maxdepth 1 -type d| grep \/a |xargs -n 1 cp -i index.html
If copying to a arbitrary/disjoint set of directories you'll need Robert Gamble's suggestion.
I like to copy a file into multiple directories as such:
cp file1 /foo/; cp file1 /bar/; cp file1 /foo2/; cp file1 /bar2/
And copying a directory into other directories:
cp -r dir1/ /foo/; cp -r dir1/ /bar/; cp -r dir1/ /foo2/; cp -r dir1/ /bar2/
I know it's like issuing several commands, but it works well for me when I want to type 1 line and walk away for a while.
For example, if you are in the parent directory of your destination folders, you can do:
for i in $(ls); do cp sourcefile $i; done
