Copy the newest two files and append the date - Linux

I'm looking for a way to copy the newest .gz files I have in Dir A to Dir B, and append the date to those files.
Example:
Dir A
cinema.gz
radio.gz
cars.txt
camera.gz
Dir B
cinema.gz.20200310
radio.gz.20200310
Using the following command I can copy the two newest .gz files to dirb:
cp $(ls -1t /dira/*.gz | head -2) "dirb"
However, I can't find a way to append the date to the filename.
I was trying something like this:
cp $(ls -1t /dira/*.gz | head -2) "dirb/*.$(date +%F_%R)"
But it doesn't work at all.
Your help please :)

for TO_MOVE in `ls -t *.gz | head -n2`; do
cp "$TO_MOVE" dirb/
mv "dirb/$TO_MOVE" "dirb/$TO_MOVE.`date +%Y%m%d`"
done

You can't compact this into one cp command, sadly. However, you can do this:
for f in $(ls -1t /dira/*.gz | head -2); do
cp "$f" "dirb/$(basename "$f").$(date +%F_%R)"
done
Note: basename strips the /dira/ prefix so the destination lands inside dirb (otherwise cp would interpret the embedded '/' as directory names); the output of date +%F_%R itself contains no '/' characters.

Try the code below from the parent directory of dira and dirb:
ls -1t dira/*.gz | head -2 | while read -r line; do cp "$line" "dirb/${line#*/}.$(date +%F_%R)"; done
I'm using while to loop over the files, and ${line#*/} trims the directory name. Let me know if you have any questions.
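For completeness, here's a variant of the same idea that avoids parsing ls output entirely; a minimal sketch assuming GNU find, sort, and date, with dira and dirb being the example names from the question (the first few lines just recreate the question's layout):

```shell
# Build the example layout from the question.
mkdir -p dira dirb
printf x > dira/cinema.gz; sleep 0.1
printf x > dira/radio.gz;  sleep 0.1
printf x > dira/cars.txt;  sleep 0.1
printf x > dira/camera.gz

# List .gz files with their mtimes, newest first, take the top two,
# strip the timestamp column, and copy each with today's date appended.
find dira -maxdepth 1 -name '*.gz' -printf '%T@ %p\n' |
  sort -rn | head -2 | cut -d' ' -f2- |
  while IFS= read -r f; do
    cp -- "$f" "dirb/$(basename "$f").$(date +%Y%m%d)"
  done
```

Unlike the ls-based versions, this handles filenames containing spaces (though not newlines).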

Related

How could I rename the first 5 files in a folder with a different extension while keeping the old ones the same?

I have about 20 files in a folder. I want to rename the extension of the first 5 from .txt to .html. I want to keep the first 5 files with the .txt extension though. Here is what I have so far. It is a bash script
cp 'ls | head -5 "files would go here I think"
files=(*.txt)
for ((i=0; i<5; i++)); do
cp -v "${files[i]}" "${files[i]%.txt}.html"
done
You should be able to pipe these commands to get what you need.
ls *.txt | head -5 | sed -e 's/.txt$//' | xargs -n1 -I% cp %.txt %.html
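A quick sandbox run of that pipeline; the demo directory and file names are invented here, and it assumes the filenames contain no whitespace, since xargs splits the ls output on words:

```shell
# Create 7 sample .txt files in a throwaway directory.
mkdir -p demo
for i in 1 2 3 4 5 6 7; do printf 'body %s\n' "$i" > "demo/file$i.txt"; done

# Copy the first 5 (by ls sort order) to .html, keeping the .txt originals.
( cd demo && ls *.txt | head -5 | sed -e 's/\.txt$//' | xargs -n1 -I% cp %.txt %.html )
```

Note the escaped dot in the sed expression; an unescaped `.txt` would also match names like `atxt`.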

Copy files from a list to a folder

I have a text file abc.txt and its contents are:
/lag/cnn/org/one.txt
/lag/cnn/org/two.txt
/lag/cnn/org/three.txt
If I use:
tar -cvf allfiles.tar -T abc.txt
I get a tar of the files in the list. Similarly, is it possible to copy the files listed in abc.txt to a folder?
I tried:
cp --files-from test1.txt ./Folder
but it is not working. Please help.
You could use xargs:
cat test1.txt | xargs -I{} cp {} ./Folder
In order to avoid a Useless Use of cat, you could say:
xargs -a test1.txt -I{} cp {} ./Folder
You can use xargs for example, or do it in a loop:
while read FILE; do cp "$FILE" ./Folder; done <test1.txt
You can write:
while IFS= read -r FILENAME ; do
cp -- "$FILENAME" ./Folder || break
done < test1.txt
which loops over the lines of the file, reading each line into the variable FILENAME, and running cp for each one.
You could use tar cvf - to write the tar to stdout and pipe it right into a tar xvfC - Folder.
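The tar round-trip suggested above can be sketched like this (assuming GNU tar; the src layout and file names are invented). Note that, unlike the cp approaches, tar recreates the directory components from the list inside Folder:

```shell
# Invented example layout: two files whose paths are listed in test1.txt.
mkdir -p src Folder
printf one > src/one.txt
printf two > src/two.txt
printf 'src/one.txt\nsrc/two.txt\n' > test1.txt

# Stream the listed files to stdout and unpack them under Folder.
tar cf - -T test1.txt | tar xf - -C Folder
```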

delete file other than particular extension file format

I have a lot of different types of files in one folder. I need to delete all of the files except the PDF files.
I managed to display only the PDF files, but I need to delete everything other than the PDF files:
ls -1 | xargs file | grep 'PDF document,' | sed 's/:.*//'
You could do the following - I've used echo rm instead of rm for safety:
for i in *
do
[ x"$(file --mime-type -b "$i")" != xapplication/pdf ] && echo rm "$i"
done
The --mime-type -b options to file make the output of file easier to deal with in a script.
$ ls
aa.txt a.pdf bb.cpp b.pdf
$ ls | grep -v '\.pdf$' | xargs rm -rf
$ ls
a.pdf b.pdf
:) !
ls |xargs file|awk -F":" '!($2~/PDF document/){print $1}'|xargs rm -rf
Try inverting the grep match:
ls -1 | xargs file | grep -v 'PDF document,' | sed 's/:.*//'
It's rare in my experience to encounter PDF files which don't have a .pdf extension. You don't state why "file" is necessary in the example, but I'd write this as:
# find . -not -name '*.pdf' -delete
Note that this will recurse into subdirectories; use "-maxdepth 1" to limit to the current directory only.
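A sandboxed run of that find variant, restricted to one level; the sandbox directory and file names are invented, and -type f keeps find from trying to -delete the directory itself:

```shell
# Throwaway directory with a mix of file types.
mkdir -p sandbox
touch sandbox/aa.txt sandbox/a.pdf sandbox/bb.cpp sandbox/b.pdf

# Delete every regular file at this level whose name doesn't end in .pdf.
find sandbox -maxdepth 1 -type f -not -name '*.pdf' -delete
```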

Copy the three newest files under one directory (recursively) to another specified directory

I'm using bash.
Suppose I have a log file directory /var/myprogram/logs/.
Under this directory I have many sub-directories and sub-sub-directories that include different types of log files from my program.
I'd like to find the three newest files (modified most recently), whose name starts with 2010, under /var/myprogram/logs/, regardless of sub-directory and copy them to my home directory.
Here's what I would do manually
1. Go through each directory and do ls -lt 2010*
to see which files starting with 2010 are modified most recently.
2. Once I go through all directories, I'd know which three files are the newest. So I copy them manually to my home directory.
This is pretty tedious, so I wondered if maybe I could somehow pipe some commands together to do this in one step, preferably without using shell scripts?
I've been looking into find, ls, head, and awk that I might be able to use but haven't figured the right way to glue them together.
Let me know if I need to clarify. Thanks.
Here's how you can do it:
find -type f -name '2010*' -printf "%C@\t%P\n" |sort -rn -k1,1 |head -3 |cut -f 2-
This outputs a list of files prefixed by their last change time, sorts them based on that value, takes the top 3 and removes the timestamp.
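To go one step further and actually copy the three newest matches, the same pipeline can feed a read loop. A sketch with an invented logs/ tree standing in for /var/myprogram/logs/ and home/ standing in for the home directory; %T@ (mtime) is used here since the question asks for the most recently modified files, and it assumes filenames contain no newlines:

```shell
# Invented log tree; the last file deliberately doesn't match 2010*.
mkdir -p logs/sub1 logs/sub2 home
printf a > logs/sub1/2010-a.log; sleep 0.1
printf b > logs/sub2/2010-b.log; sleep 0.1
printf c > logs/sub1/2010-c.log; sleep 0.1
printf d > logs/sub2/2010-d.log
printf e > logs/sub1/other.log

# Newest three files matching 2010*, anywhere under logs/, copied flat.
find logs -type f -name '2010*' -printf '%T@\t%P\n' |
  sort -rn -k1,1 | head -3 | cut -f2- |
  while IFS= read -r rel; do cp -- "logs/$rel" home/; done
```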
These answers feel very complicated; how about:
for FILE in `find . -type d`; do ls -t -1 -F $FILE | grep -v "/" | grep "^2010" | head -n3 | xargs -I{} mv {} ~; done;
or laid out nicely
for FILE in `find . -type d`;
do
ls -t -1 -F $FILE | grep -v "/" | grep "^2010" | head -n3 | xargs -I{} mv {} ~;
done;
My "shortest" answer after quickly hacking it up.
for file in $(find . -iname '*.php' -mtime -1 | xargs ls -l | awk '{ print $6" "$7" "$8" "$9 }' | sort | sed -n '1,3p' | awk '{ print $4 }'); do cp "$file" ../; done
The main command stored in $() does the following:
Find all files recursively in current directory matching (case insensitive) the name *.php and having been modified in the last 24 hours.
Pipe to ls -l, required to be able to sort by modification date, so we can have the first three
Extract the modification date and file name/path with awk
Sort these files based on datetime
With sed print only the first 3 files
With awk print only their name/path
Used in a for loop and as action copy them to the desired location.
Or use @Hasturkun's variant, which popped up as a response while I was editing this post :)

How to copy a file to multiple directories using the gnu cp command

Is it possible to copy a single file to multiple directories using the cp command ?
I tried the following, which did not work:
cp file1 /foo/ /bar/
cp file1 {/foo/,/bar}
I know it's possible using a for loop, or find. But is it possible using the gnu cp command?
You can't do this with cp alone but you can combine cp with xargs:
echo dir1 dir2 dir3 | xargs -n 1 cp file1
Will copy file1 to dir1, dir2, and dir3. xargs will call cp 3 times to do this, see the man page for xargs for details.
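A quick sandbox demonstration of that xargs invocation (directory and file names invented here):

```shell
# Throwaway destination directories and a source file.
mkdir -p dir1 dir2 dir3
printf 'hello\n' > file1

# xargs runs cp once per directory name: cp file1 dir1, cp file1 dir2, ...
echo dir1 dir2 dir3 | xargs -n 1 cp file1
```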
No, cp can copy multiple sources but will only copy to a single destination. You need to arrange to invoke cp multiple times - once per destination - for what you want to do; using, as you say, a loop or some other tool.
Wildcards also work with Robert's code:
echo ./fs*/* | xargs -n 1 cp test
I would use cat and tee based on the answers I saw at https://superuser.com/questions/32630/parallel-file-copy-from-single-source-to-multiple-targets instead of cp.
For example:
cat inputfile | tee outfile1 outfile2 > /dev/null
As far as I can see it you can use the following:
ls | xargs -n 1 cp -i file.dat
The -i option of the cp command means that you will be asked whether to overwrite a file in the current directory with file.dat. Though it is not a completely automatic solution, it worked out for me.
These answers all seem more complicated than the obvious:
for i in /foo /bar; do cp "$file1" "$i"; done
ls -db di*/subdir | xargs -n 1 cp File
-b is there in case there is a space in a directory name; otherwise xargs would split it into separate items. I had this problem with the echo version.
Not using cp per se, but...
This came up for me in the context of copying lots of Gopro footage off of a (slow) SD card to three (slow) USB drives. I wanted to read the data only once, because it took forever. And I wanted it recursive.
$ tar cf - src | tee >( cd dest1 ; tar xf - ) >( cd dest2 ; tar xf - ) | ( cd dest3 ; tar xf - )
(And you can add more of those >() sections if you want more outputs.)
I haven't benchmarked that, but it's definitely a lot faster than cp-in-a-loop (or a bunch of parallel cp invocations).
If you want to do it without a forked command:
tee <inputfile file2 file3 file4 ... >/dev/null
To use copying with xargs to directories using wildcards on Mac OS, the only solution that worked for me with spaces in the directory name is:
find ./fs*/* -type d -print0 | xargs -0 -n 1 cp test
Where test is the file to copy
And ./fs*/* the directories to copy to
The problem is that xargs treats spaces as argument separators; the solutions that change the delimiter character using -d or -E unfortunately do not work properly on macOS.
Essentially equivalent to the xargs answer, but in case you want parallel execution:
parallel -q cp file1 ::: /foo/ /bar/
So, for example, to copy file1 into all subdirectories of current folder (including recursion):
parallel -q cp file1 ::: `find -mindepth 1 -type d`
N.B.: This probably only conveys any noticeable speed gains for very specific use cases, e.g. if each target directory is a distinct disk.
It is also functionally similar to the '-P' argument for xargs.
No - you cannot.
I've found on multiple occasions that I could use this functionality so I've made my own tool to do this for me.
http://github.com/ddavison/branch
pretty simple -
branch myfile dir1 dir2 dir3
ls -d */ | xargs -iA cp file.txt A
Suppose you want to copy fileName.txt to all sub-directories within present working directory.
Get all sub-directories names through ls and save them to some temporary file say, allFolders.txt
ls > allFolders.txt
Print the list and pass it to command xargs.
cat allFolders.txt | xargs -n 1 cp fileName.txt
Another way is to use cat and tee as follows:
cat <source file> | tee <destination file 1> | tee <destination file 2> [...] > <last destination file>
I think this would be pretty inefficient though, since the job would be split among several processes (one per destination) and the hard drive would be writing several files at once over different parts of the platter. However if you wanted to write a file out to several different drives, this method would probably be pretty efficient (as all copies could happen concurrently).
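With a tee that accepts multiple file arguments (as in the earlier answer in this thread), the chain collapses to a single tee. A minimal sketch with invented file names:

```shell
# Invented source file.
printf 'payload\n' > source.txt

# One read of the source, three writes: two via tee's file arguments,
# one via the redirection of tee's stdout.
tee copy1.txt copy2.txt < source.txt > copy3.txt
```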
Using a bash script
DESTINATIONPATH[0]="xxx/yyy"
DESTINATIONPATH[1]="aaa/bbb"
..
DESTINATIONPATH[5]="MainLine/USER"
NumberOfDestinations=6
for (( i=0; i<NumberOfDestinations; i++))
do
cp SourcePath/fileName.ext ${DESTINATIONPATH[$i]}
done
exit
If you want to copy multiple folders to multiple folders, you can do something like this:
echo dir1 dir2 dir3 | xargs -n 1 cp -r /path/toyourdir/{subdir1,subdir2,subdir3}
If all your target directories match a path expression — like they're all subdirectories of path/to — then just use find in combination with cp like this:
find ./path/to/* -type d -exec cp [file name] {} \;
That's it.
If you need to be specific about which folders to copy the file into, you can combine find with one or more greps. For example, to replace any occurrences of favicon.ico in any subfolder you can use:
find . | grep favicon\.ico | xargs -n 1 cp -f /root/favicon.ico
This will copy to the immediate sub-directories; if you want to go deeper, adjust the -maxdepth parameter.
find . -mindepth 1 -maxdepth 1 -type d| xargs -n 1 cp -i index.html
If you don't want to copy to all directories, you can filter out the directories you are not interested in. Example: copying to all folders starting with a:
find . -mindepth 1 -maxdepth 1 -type d| grep \/a |xargs -n 1 cp -i index.html
If copying to an arbitrary/disjoint set of directories, you'll need Robert Gamble's suggestion.
I like to copy a file into multiple directories as such:
cp file1 /foo/; cp file1 /bar/; cp file1 /foo2/; cp file1 /bar2/
And copying a directory into other directories:
cp -r dir1/ /foo/; cp -r dir1/ /bar/; cp -r dir1/ /foo2/; cp -r dir1/ /bar2/
I know it's like issuing several commands, but it works well for me when I want to type 1 line and walk away for a while.
For example, if you are in the parent directory of your destination folders, you can do:
for i in $(ls); do cp sourcefile "$i"; done
