Linux commands to get the latest file depending on file name

I am new to Linux. I have a folder with many files in it, and I need to get the latest file based on the file name. Example: I have three files, RAT_20190111.txt, RAT_20190212.txt, and RAT_20190321.txt. I need a Linux command to move the latest file, RAT_20190321.txt, to a specific directory.

If the file name pattern stays the same, you can try the command below:
mv $(ls RAT* | sort -r | head -1) /path/to/directory/
As pointed out by @wwn, there is no need for sort: since the file names are lexicographically sortable, ls already lists them in sorted order, so the command becomes:
mv $(ls RAT* | tail -1) /path/to/directory
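Because the date stamp is zero-padded YYYYMMDD, name order equals date order, so the newest file can also be picked with a plain glob loop instead of parsing ls. A minimal, self-contained sketch (the temp directory and touched files below are just a stand-in for your folder):

```shell
#!/bin/bash
# Pick the lexicographically greatest RAT_*.txt and move it.
# Assumes zero-padded YYYYMMDD stamps, so name order == date order.
set -eu
dir=$(mktemp -d)          # demo directory standing in for your folder
cd "$dir"
touch RAT_20190111.txt RAT_20190212.txt RAT_20190321.txt
mkdir dest

latest=
for f in RAT_*.txt; do
    [ "$f" \> "$latest" ] && latest=$f   # keep the greatest name seen so far
done
mv -- "$latest" dest/
ls dest                                   # -> RAT_20190321.txt
```

Unlike the ls-based one-liners, this handles file names containing spaces or newlines, since the names never pass through a pipe.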

The following command works:
ls -p | grep -v '/$' | sort | tail -n 1 | xargs -d '\n' -r mv -t /path/to/directory --
ls -p appends / to directory names, so grep -v '/$' filters directories out and leaves only files. sort orders the names, tail -n 1 takes the last one, and xargs hands it to mv; -t names the destination up front so the filename can come last, and -r skips running mv if there is no input.
Hope it helps.

Use the command below:
cp "$(ls | tail -n 1)" /data...

Related

Linux: Reverse Sort files in directory and get second file

I am trying to get the second file when the file list is sorted in reverse (descending) order, and copy it to my local directory using scp.
Here's what I got:
scp -r uname@host:./backups/dir1/$(ls -r | head -2 | tail -1) /tmp/data_sync/dir1/
I still seem to copy all the files when I run this script. What am I missing? TIA.
The $(...) is being interpreted locally. If you want the commands to run on the remote, you'll need to use ssh and have the remote side use scp to copy files to your local system.
Since parsing ls's output has a number of problems, I'll use find to accomplish the same thing as ls, telling it to use NUL between each filename rather than newline. sort sorts that list of filenames, and sed -n 2p prints the second element of the sorted list of filenames. xargs runs the scp command, inserting the filename as the first argument.
ssh uname@host "find ./backups/dir1/ -mindepth 1 -maxdepth 1 -name '[^.]*' -print0 | \
sort -r -z | sed -z -n 2p | \
xargs -0 -I {} scp {} yourlocalhost:/tmp/data_sync/dir1/"
If I understood your question, your command is OK apart from one thing:
you ran scp -r, which recursively copies the whole directory, so you fetched every file rather than just the second one in the reverse-sorted list.
Try without -r:
scp uname@host:./backups/dir1/$(ls -r | head -2 | tail -1) /tmp/data_sync/dir1/
The basic syntax for scp is:
scp username@source:/location/to/file username@destination:/where/to/put
Don't forget that -r recursively copies entire directories. Also note that scp follows symbolic links encountered in the tree traversal.

How to: compare command output to text file with bash

I'm trying to compare the output of the command
`ls -l directory`
with a file that I've created using the same command. (I'm trying a poor man's way of making sure no files have been modified.)
The trouble is, I don't know how to diff the output of the ls command with the file that I've created. I've tried the following, and each time it doesn't work:
diff file.ls <(ls -l directory)
ls -l directory | xargs diff file.ls
ls -l directory | diff file.ls
diff file.ls < `ls -l directory`
What is the magic command to compare the output of ls to a file that I've already saved?
The answer (for posterity) is to do the following
diff file.ls <(ls -l directory)
When I did this previously, the output was blank. I thought I had done it wrong; in actuality there was no difference between the contents of the directory and my file.
</facepalm>
diff is easiest when you compare files.
$ ls $DIR > original.ls
do some stuff
$ ls $DIR > new.ls
$ diff original.ls new.ls
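For the stated goal (spotting modified files), a checksum manifest is sturdier than diffing ls -l output, since ls -l also changes when only timestamps, ownership, or sizes shift. A sketch using sha256sum (the file names and contents here are invented for the demo):

```shell
#!/bin/sh
# Record checksums, then verify them later; any content change is reported.
set -eu
dir=$(mktemp -d)
cd "$dir"
printf 'hello\n' > a.txt
printf 'world\n' > b.txt

sha256sum *.txt > original.sums    # baseline manifest

printf 'changed\n' > b.txt         # simulate a modification

# -c re-hashes each listed file and exits non-zero if any differ
if ! sha256sum -c original.sums; then
    echo "at least one file was modified"
fi
```

sha256sum -c prints an OK or FAILED line per file, so it also tells you which file changed, not just that something did.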

ls in a directory for a list of files

I have a C codebase, all resides in the same directory.
I want to find all the header files that have a code file with the same name.
Right now, I have the following command:
ls *.h | sed s/.h/.c/
This returns a 'list' of filenames that I want to search for. How can I pass this list to another command so that I can see which header files have code files sharing the same name?
Without any external command:
$ for i in *.h
> do
> [ -f "${i/.h/.c}" ] && echo "$i"
> done
The first line loops through every .h file.
The third line is a test construct. The -f flag to test (see man test, aka man [) checks whether the file exists; if it does, the test returns 0 (which the shell treats as true). The && runs the following command only when the previous one succeeded.
${i/.h/.c} is an in-shell pattern substitution (a bash feature, not a regex) that turns the tested name from the .h file into the corresponding .c file.
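A self-contained run of the loop above (bash, since ${i/.h/.c} is a bash substitution; the file names are invented for the demo):

```shell
#!/bin/bash
# List each .h file that has a matching .c file.
set -u
dir=$(mktemp -d)
cd "$dir"
touch a.c a.h b.c c.c d.h      # only a.h has a matching a.c

for i in *.h; do
    if [ -f "${i/.h/.c}" ]; then
        echo "$i"              # prints: a.h
    fi
done
```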
You could use xargs, which transforms its input:
a
b
c
to an argument list:
a b c
So this should print "a b c":
echo -e "a\nb\nc" | xargs echo
ls `ls *.h|sed s/.h/.c/` 2>/dev/null
should do the trick
ls -1 *.c* *.h* | awk -F. '{a[$1]++} END {for (i in a) {if (a[i] == 2) print i".h"}}'
ls *.h | sed s/.h/.c/ | xargs ls 2>/dev/null
The remainder of the command runs ls again with the new *.c filenames. ls will complain about every file that doesn't exist, so we redirect stderr to nowhere.
Example without 2>/dev/null:
$ ls
a.c a.h b.c c.c d.h
$ ls *.h | sed s/.h/.c/ | xargs ls
ls: d.c: No such file or directory
a.c

grep command working in testdir but not in "real" directory

I just thought I had found my solution because the command works in my test directory.
grep -H -e 'author="[^"].*' *.xml | cut -d: -f1 | xargs -I '{}' mv {} mydir/.
But when I used the command in the non-test directory, it did not work.
This is the error message:
grep: unknown option -- O
Usage: grep [OPTION]... PATTERN [FILE]...
Try `grep --help' for more information.
Not even this worked:
$ grep -H author *.xml
or this:
$ grep -H 'author' *.xml
(same error message)
I suspect it is related to the file names or the number of files.
I have almost 3000 files in the non-test-directory and only 20 in my test directory.
In both directories almost all file names contain spaces and " - ".
Some more info:
I'm using Cygwin.
I am not allowed to change the filenames.
Try this (updated):
grep -HlZ 'author="[^"].*' -- *.xml | xargs -0 -I {} mv -- {} mydir/
EXPLANATION (updated)
In your "real" directory you have a file with name starting with -O.
Your shell expands the file list *.xml and grep takes your - starting filename as an option (not valid). Same thing happens with mv. As explained in the Common options section of info coreutils, you can use -- to delimit the option list. What comes after -- is considered as an operand, not an option.
Using the -l (lowercase L) option, grep outputs only the filename of matching files, so you don't need to use cut.
To correctly handle every strange filename, you have to use the pair -Z in grep and -0 in xargs.
No need to use -e because your pattern does not begin with -.
Hope this will help!
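Here is a self-contained reproduction of the failure and the fix (the file name -Ofile.xml and its contents are invented to mimic the problem):

```shell
#!/bin/sh
# A filename starting with "-" is parsed as options unless "--" ends option parsing.
set -u
dir=$(mktemp -d)
cd "$dir"
mkdir mydir
printf '<doc author="x"/>\n' > ./-Ofile.xml

# Without --, the expanded *.xml is read as grep options and grep fails:
grep -l 'author=' *.xml 2>/dev/null || echo "grep read -Ofile.xml as options"

# With -l, -Z/-0, and --, option-like names survive grep, xargs, and mv:
grep -lZ 'author=' -- *.xml | xargs -0 -I {} mv -- {} mydir/
ls mydir                                  # -> -Ofile.xml
```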

Linux: cat matching files in date order?

I have a few files in a directory with names similar to
_system1.log
_system2.log
_system3.log
other.log
but they are not created in that order.
Is there a simple, non-hardcoded, way to cat the files starting with the underscore in date order?
Quick 'n' dirty:
cat `ls -t _system*.log`
Safer:
ls -1t _system*.log | xargs -d'\n' cat
Use ls:
ls -1t | xargs cat
ls -1 | xargs cat
You can concatenate the files in order of their modification time (which ls -t sorts by) and store the result in a single file, and you can also restrict which files are included. I find this very useful. The following command concatenates the files whose names contain the string 'xyz', ordered newest first, and writes them all to outputfile:
cat $(ls -t | grep xyz) > outputfile
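A NUL-safe variant that avoids parsing ls, assuming GNU find/sort/sed/xargs. The file contents and timestamps below are fabricated for the demo; output is newest first, matching ls -t:

```shell
#!/bin/sh
# cat _system*.log newest-first by mtime, safe for any filename.
set -eu
dir=$(mktemp -d)
cd "$dir"
printf 'one\n'   > _system1.log; touch -d '2001-01-01' _system1.log  # oldest
printf 'two\n'   > _system2.log; touch -d '2003-01-01' _system2.log  # newest
printf 'three\n' > _system3.log; touch -d '2002-01-01' _system3.log

find . -maxdepth 1 -name '_system*.log' -printf '%T@ %p\0' |
    sort -z -rn |                 # newest first, like ls -t
    sed -z 's/^[^ ]* //' |        # strip the timestamp, keep the path
    xargs -0 cat > combined.log
cat combined.log                  # -> two, three, one
```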
