How to open the latest file in a subdirectory in Linux? - linux

I have a logs/ directory which contains many log files. I would like to open the latest log file. My environment is Ubuntu 16.
I know the name of the latest file, but the file is in the logs/ dir. In such a case, how do I pass the relative path to xargs?
$ ls -t logs/ | head -1
20180615-184233.log
$ ls -t logs/ | head -1 | xargs less
failed

I figured it out myself: I need the glob logs/* so that ls prints the whole relative path.
$ ls -t logs/* | head -1
logs/20180615-184233.log
$ ls -t logs/* | head -1 | xargs less
success
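Since parsing the output of ls breaks on unusual file names, a more robust sketch uses GNU find's -printf to emit each file's modification time and path (the scratch logs/ directory below just makes the example runnable end to end):

```shell
#!/bin/sh
# Scratch demo of a safer approach: sort files by mtime without parsing ls.
# Assumes GNU find (-printf), sort, tail, and cut.
cd "$(mktemp -d)" && mkdir logs
touch -d '2018-06-14 12:00' logs/20180614-120000.log
touch -d '2018-06-15 18:42' logs/20180615-184233.log

latest=$(find logs/ -maxdepth 1 -type f -printf '%T@ %p\n' |
         sort -n | tail -n 1 | cut -d' ' -f2-)
printf '%s\n' "$latest"    # prints: logs/20180615-184233.log
# now open it: less "$latest"
```

Because find emits full relative paths, no glob trick is needed, and the mtime prefix (stripped by cut) makes the sort independent of file naming.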

Related

Using one command line, count the lines in the last file located in /etc in Ubuntu

ls /etc | tail -1 | wc -l
So basically I used this command, but it counts the number of lines coming out of tail (which is 1, the last file name in the directory) rather than the number of lines in the file.
I also tried the cat command to open the file and count the lines, but it didn't work:
ls /etc | cat tail -1 | wc -l
ls /etc | tail -1 | cat |wc -l
You could use xargs to pass the result of tail as an argument to wc, although I'd recommend using find instead of ls so you get the full path and don't need to mess around with relative paths:
$ find /etc -type f | tail -1 | xargs wc -l
You should never parse the output of ls (use the glob /etc/* instead).
$ wc -l < "$(find /etc -maxdepth 1 -type f | tail -n 1)"
or
$ find /etc -maxdepth 1 -type f | tail -n 1 | xargs wc -l
What this does is find the last regular file directly under /etc and feed its contents to wc -l.
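To make the pitfall concrete, here is a sketch in a throwaway directory (a hypothetical stand-in for /etc): piping the file name straight into wc -l counts one line, while xargs hands the name to wc so the file's contents are counted.

```shell
#!/bin/sh
# Demonstration in a scratch directory (stand-in for /etc).
dir=$(mktemp -d)
printf 'one\ntwo\nthree\n' > "$dir/zz-last-file"

find "$dir" -maxdepth 1 -type f | tail -n 1 | wc -l        # counts the name: 1
find "$dir" -maxdepth 1 -type f | tail -n 1 | xargs wc -l  # counts the lines: 3
```

The first pipeline never opens the file at all; wc only sees one line of text on stdin, the file name itself.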

Find Most Recent File in a Directory That Matches Certain File Size

I need to find the most recently modified file in a directory that matches 3.0 MB.
First Attempt
ls -t /home/weather/some.cool*.file | head -n +1 | grep "3.0M"
Second Attempt
find /home/weather/ -maxdepth 1 -type f -name "some.cool*.file" -size 3M -exec ls -t "{}" +; | head -n +1
Am I close?
I hope this is of some use -
ls -ltr --block-size=MB | grep 3MB
The most recently modified files will be displayed at the bottom of the output.
The -r flag reverses the sort order and --block-size=MB shows file sizes in megabytes.
This should work:
ls -lh --sort=time /path/to/directory/*.file | grep "3.0M" | head -n 1
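A sketch that avoids parsing ls entirely, assuming GNU find: -size 3M matches files whose size, rounded up to 1 MiB units, equals 3, and -printf '%T@ %p\n' lets sort pick the newest match. The scratch directory and file names below are hypothetical, standing in for /home/weather.

```shell
#!/bin/sh
# Sketch: most recently modified ~3 MiB file matching the question's pattern.
# Assumes GNU find, sort, tail, cut; scratch dir stands in for /home/weather.
cd "$(mktemp -d)"
truncate -s 3M some.cool.alpha.file && touch -d 2020-01-01 some.cool.alpha.file
truncate -s 3M some.cool.beta.file  && touch -d 2020-01-02 some.cool.beta.file
truncate -s 1M some.cool.tiny.file  && touch -d 2020-01-03 some.cool.tiny.file

find . -maxdepth 1 -type f -name 'some.cool*.file' -size 3M \
     -printf '%T@ %p\n' | sort -n | tail -n 1 | cut -d' ' -f2-
# prints: ./some.cool.beta.file (the newest ~3 MiB match)
```

Note that find's size test works on exact byte counts, whereas grep "3.0M" depends on how ls -h happens to round and format the size.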

linux command: show content of all files

I tried the two following commands to show the content of all files under the current directory. I want to know why one works and the other does not.
ls | xargs cat # does not work, No such file or directory
find . | xargs cat # works
cat is just an example; it can be any command that takes a file name as a parameter.
---------------------------------Update---------------------------------
Here are some observations from my PC.
$ echo 1 > test1.txt
$ echo 2 > test2.txt
$ echo 3 > test3.txt
$ ls
test1.txt test2.txt test3.txt
$ ls *.txt | xargs cat
cat: test1.txt: No such file or directory
cat: test2.txt: No such file or directory
cat: test3.txt: No such file or directory
$ find . -name '*.txt' | xargs cat
2
1
3
For others that might see this, we found the issue in the comments. Hao's problem was that ls was an alias, which broke piping its output through xargs to cat.
Running type ls showed it was aliased, and using \ls to bypass the alias solved the problem.

Bash script to delete files in a directory if there are more than 5

This is a backup script that copies files from one directory to another. I use a for loop to check if there are more than five files. If there are, the loop should delete the oldest entries first.
I tried ls -tr | head -n -5 | xargs rm from the command line and it works successfully to delete older files if there are more than 5 in the directory.
However, when I put it into my for loop, I get an error rm: missing operand
Here is the full script. I don't think I am using the for loop correctly in the script, but I'm really not sure how to use the commands ls -tr | head -n -5 | xargs rm in a loop that iterates over the files in the directory.
timestamp=$(date +"%m-%d-%Y")
dest=${HOME}/mybackups
src=${HOME}/safe
fname='bu_'
ffname=${HOME}/mybackups/${fname}${timestamp}.tar.gz
# for loop for deletion of file
for f in ${HOME}/mybackups/*
do
ls -tr | head -n -5 | xargs rm
done
if [ -e $ffname ];
then
echo "The backup for ${timestamp} has failed." | tee ${HOME}/mybackups/Error_${timestamp}
else
tar -vczf ${dest}/${fname}${timestamp}.tar.gz ${src}
fi
Edit: I took out the for loop, so it's now just:
[...]
ffname=${HOME}/mybackups/${fname}${timestamp}.tar.gz
ls -tr | head -n -5 | xargs rm
if [ -e $ffname ];
[...]
The script WILL work if it is run from the mybackups directory; however, I keep getting the same error when it is run from anywhere else. The script gets the file names but tries to remove them from the current directory, I think. I tried several modifications, but nothing has worked so far.
I get an error rm: missing operand
The cause of that error is that there are no files left to be deleted. To avoid that error, use the --no-run-if-empty option:
ls -tr | head -n -5 | xargs --no-run-if-empty rm
In the comments, mklement0 notes that this issue is peculiar to GNU xargs. BSD xargs will not run with an empty argument. Consequently, it does not need and does not support the --no-run-if-empty option.
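A quick sketch of that GNU behaviour (-r is the short form of --no-run-if-empty):

```shell
#!/bin/sh
# With empty input, GNU xargs still runs its command once;
# -r (--no-run-if-empty) suppresses that run.
printf '' | xargs echo ran       # GNU xargs prints: ran
printf '' | xargs -r echo ran    # prints nothing
```

This is why rm receives no operands when head filters everything out: xargs still invokes it, just with an empty argument list.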
More
Quoting from a section of code in the question:
for f in ${HOME}/mybackups/*
do
ls -tr | head -n -5 | xargs rm
done
Note that (1) f is never used for anything, and (2) this runs ls -tr | head -n -5 | xargs rm once per file in the directory when it needs to be run only once.
Obligatory Warning
Your approach parses the output of ls. This makes for a simple and easily understood command. It can work if all your files are sensibly named. It will not work in general. For more on this, see: Why you shouldn't parse the output of ls(1).
Safer Alternative
The following will work with all manner of file names, whether they contains spaces, tabs, newlines, or whatever:
find . -maxdepth 1 -type f -printf '%T@ %i\n' | sort -n | head -n -5 | while read tstamp inode
do
find . -inum "$inode" -delete
done
SMH. I ended up with the simplest solution in the world, by just cd-ing into the directory before running ls -tr | head -n -5 | xargs rm . Thanks for everyone's help!
timestamp=$(date +"%m-%d-%Y")
dest=${HOME}/mybackups
src=${HOME}/safe
fname='bu_'
ffname=${HOME}/mybackups/${fname}${timestamp}.tar.gz
cd ${HOME}/mybackups
ls -tr | head -n -5 | xargs rm
if [ -e $ffname ];
then
echo "The backup for ${timestamp} has failed." | tee ${HOME}/mybackups/Error_${timestamp}
else
tar -vczf ${dest}/${fname}${timestamp}.tar.gz ${src}
fi
This line ls -tr | head -n -5 | xargs rm came from here
ls -tr displays all the files, oldest first (-t sorts newest first, -r reverses).
head -n -5 displays all but the last 5 lines (i.e. it omits the 5 newest files).
xargs rm calls rm for each selected file.
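For completeness, a NUL-delimited variant of that pipeline that also survives file names with spaces or newlines, as a sketch assuming GNU find, sort, and head (coreutils 8.25+ for head -z); the scratch backup_dir stands in for $HOME/mybackups:

```shell
#!/bin/bash
# NUL-safe sketch: keep the 5 newest files, delete the rest.
# backup_dir is a scratch stand-in for $HOME/mybackups.
backup_dir=$(mktemp -d)
for i in 1 2 3 4 5 6 7; do
  touch -d "2020-01-0$i" "$backup_dir/bu_$i.tar.gz"
done

find "$backup_dir" -maxdepth 1 -type f -printf '%T@ %p\0' |
  sort -z -n |
  head -z -n -5 |
  while IFS= read -r -d '' entry; do
    rm -- "${entry#* }"        # strip the leading "mtime " prefix
  done

ls "$backup_dir"    # the two oldest backups are gone, 5 remain
```

Because every record is terminated by a NUL rather than a newline, no legal file name can confuse the sort or the deletion loop, and there is no need to cd first: find emits full paths.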

linux: most recent file in a directory, excluding directories and . files

I would like to find the most recently changed file in a directory, excluding hidden files (the ones that start with .) and also excluding directories.
This question is headed in the right direction, but not exactly what I need:
Linux: Most recent file in a directory
The key here is to exclude directories...
Like the answer there except without -A
ls -rt | tail -n 1
Look at man ls for more info.
To make it exclude directories, we use the -F option to add a "/" to each directory, and then filter for those that don't have the "/":
ls -Frt | grep "[^/]$" | tail -n 1
This does what you want, excluding directories (note it is tail, not head, that picks the most recent after an ascending sort):
stat --printf='%Y %F %n\n' * | grep -v directory | sort -n | tail -n 1
Same idea, not very clean, but: ls -c1 | head -1 (ls -c sorts by status-change time, newest first, so head picks the most recent).
$ touch a .b
$ ls -c1
a
$ ls -c1a
a
.b
$ touch d
$ ls -c1
d
a
$ ls -c1a
.
d
a
.b
..
$ touch .b
$ ls -c1a
.b
.
d
a
..
As you can see, without the -a flag, only visible files are listed.
Probably the same as the answer in the other post, but with a small difference (excluding directories):
ls --group-directories-first -rt | tail -n 1
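The same result without parsing ls, as a sketch with GNU find, where dotfiles and directories are excluded explicitly (the scratch directory below just makes the sketch runnable):

```shell
#!/bin/sh
# Sketch: most recently modified visible regular file in a directory.
# Assumes GNU find (-printf), sort, tail, cut.
cd "$(mktemp -d)"
touch -d 2020-01-01 old.txt
touch -d 2020-01-02 new.txt
touch -d 2020-01-03 .hidden       # hidden file: excluded by ! -name '.*'
mkdir subdir                      # directory: excluded by -type f

find . -maxdepth 1 -type f ! -name '.*' -printf '%T@ %p\n' |
  sort -n | tail -n 1 | cut -d' ' -f2-
# prints: ./new.txt
```

Here -type f drops directories (and symlinks), ! -name '.*' drops hidden entries, and the numeric mtime prefix makes the recency sort explicit.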
