Pipe find command output to ls command - linux

I need to pipe the output of find to ls so that ls is limited to files modified within a certain date range. Essentially I want to filter by modification date but keep the format of ls -lth --full-time. The find command gives me the files modified within the last 365 days, but ls shows me everything:
find . -mtime -365 | ls -lth --full-time
find gives me:
$ find . -mtime -365
.
./en_mapp
./main.py
./file1.txt
./file2.csv
And ls -lth --full-time gives me:
$ ls -lth --full-time
total 8.0K
-rw-r--r--. 1 root root 174 2020-09-21 10:59:26.858601430 +0200 main.py
-rw-r--r--. 1 root root 0 2020-09-21 09:36:17.072137720 +0200 file2.csv
drwxr-xr-x. 2 root root 4.0K 2020-09-21 09:35:48.296169162 +0200 en_mapp
-rw-r--r--. 1 root root 0 2020-09-21 09:35:28.502502950 +0200 file1.txt
-rw-r--r--. 1 root root 0 2012-01-01 00:00:00.000000000 +0100 goldenfile.xls

xargs takes input from stdin and passes it as arguments to some command.
find . -mtime -365 | xargs ls -lth --full-time
# or better: skip directories and handle spaces and quotes in filenames
find . -type f -mtime -365 | xargs -d '\n' ls -lth --full-time
# or better: also handle newlines in filenames
find . -mtime -365 -print0 | xargs -0 ls -lth --full-time
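One caveat: when find prints a directory name (including .), ls lists that directory's contents rather than the entry itself. Adding -d keeps ls on the matched names; a minimal sketch, assuming GNU ls:
# -d lists directories themselves instead of their contents
find . -mtime -365 -print0 | xargs -0 ls -ldth --full-time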

Use the -exec option of find with a terminating plus:
find . -mtime -365 -exec ls -lth --full-time '{}' +
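If the goal is to limit the listing to an explicit date rather than an age in days, GNU find also accepts -newermt. A small sketch, assuming GNU find; the date below is only a placeholder:
# list files modified after the given date (placeholder date)
find . -type f -newermt '2020-01-01' -exec ls -lth --full-time '{}' +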

Related

printing directory with simple ls and grep command Linux

So I have this command ls -al -R | grep libbpf.h and it just prints
-rw-r--r-- 1 root root 53107 Jan 27 12:05 libbpf.h
I also need the exact subdirectories that contain this file. Is there a way I can use the above command with some option for grep or ls so that it also prints something like
-rw-r--r-- 1 root root ./libbpf/src/include/libbpf.h 53107 Jan 27 12:05 libbpf.h
All I know is that libbpf.h exists somewhere under the root directory; searching recursively, I just need the path. Does anyone know how to do this?
You can use the find command:
find "$(pwd -P)" -type f -name "libbpf.h" -ls
If you want only the paths:
find "$(pwd -P)" -type f -name "libbpf.h"
or
find . -type f -name "libbpf.h" -exec realpath {} \;

How to get the number of records along with file size in Unix

I am trying to get the file details and the number of records for each file, along with its size.
I tried ls -lhtr 234*201406*.log.gz and it gives all the details except the record count. If I try ls -lhtr 234*201406*.log.gz | wc -l it shows the number of files.
Present output:
-rw-r--r-- 1 jenkins tomcat 120M Jun 30 18:25 234_1404165601_20140630220001.log.gz
-rw-r--r-- 1 jenkins tomcat 144M Jun 30 19:24 234_1404169201_20140630230001.log.gz
I need output like:
-rw-r--r-- 1 jenkins tomcat 120M Jun 30 18:25 234_1404165601_20140630220001.log.gz 20000
Can you please help me with this? Thanks in advance.
You can use zcat (or gunzip -c) to print the number of lines in .gz files:
find . -name '*.gz' -exec bash -c 'f="$1"; du -h "$f"; zcat "$f" | wc -l' - '{}' \;
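If you want exactly the layout asked for (the ls -lh line with the record count appended), a small loop works too. A minimal sketch, assuming bash and that the glob matches the files of interest:
for f in 234*201406*.log.gz; do
    # append the decompressed line count to the ls line for each archive
    printf '%s %s\n' "$(ls -lh "$f")" "$(zcat "$f" | wc -l)"
done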

Linux combine sort files by date created and given file name

I need to combine these two commands in order to get a list of files matching the specified "filename", sorted by date created.
I know that sorting files by date can be achieved with:
ls -lrt
and finding a file by name with
find . -name "filename*"
I don't know how to combine these two. I tried with a pipeline but I don't get the right result.
[EDIT] Not sorted:
find . -name "filename" -printf '%TY:%Tm:%Td %TH:%TM %h/%f\n' | sort
Forget xargs. find and sort are all the tools you need.
My best guess would be to use xargs:
find . -name 'filename*' -print0 | xargs -0 /bin/ls -ltr
There's an upper limit on the total size of the arguments, but it shouldn't be a problem unless they occupy more than 32 kB; if they do, xargs will run ls more than once and you will get blocks of sorted files :)
find . -name "filename" -exec ls --full-time \{\} \; | cut -d' ' -f7- | sort
You might have to adjust the cut command depending on what your version of ls outputs.
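Another option, assuming GNU find: print a numeric modification-time key, sort on it, and then drop the key, which avoids depending on the column layout of ls:
# %T@ is seconds since the epoch, so a numeric sort gives chronological order
find . -name 'filename*' -printf '%T@ %p\n' | sort -n | cut -d' ' -f2-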
Check the command below:
1) List files in a directory with the last modified date/time
To list files with the most recently modified ones at the top, use the -lt option with the ls command.
$ ls -lt /run
output
total 24
-rw-rw-r--. 1 root utmp 2304 Sep 8 14:58 utmp
-rw-r--r--. 1 root root 4 Sep 8 12:41 dhclient-eth0.pid
drwxr-xr-x. 4 root root 100 Sep 8 03:31 lock
drwxr-xr-x. 3 root root 60 Sep 7 23:11 user
drwxr-xr-x. 7 root root 160 Aug 26 14:59 udev
drwxr-xr-x. 2 root root 60 Aug 21 13:18 tuned
https://linoxide.com/linux-how-to/how-sort-files-date-using-ls-command-linux/

How can I list files modified within a directory yesterday via command line?

I'd like to list out all files with modification dates in the last n days (or even simply after Y-m-d) in a directory. It must work recursively through all subdirectories as well.
How can I do this?
Ideal output:
file.txt Mar 26 15:15
file2.txt Mar 27 01:15
Acceptable output:
file.txt
file2.txt
Answered! (Thanks for all the help)
$ find . -type f -mtime -1 -exec ls -lah {} \;
-rw-r--rw- 1 apache apache 18K Mar 26 08:22 ./file1.txt
-rw-r--rw- 1 apache apache 12K Mar 26 09:23 ./dir1/file2.txt
-rw-r--rw- 1 apache apache 16K Mar 26 10:24 ./dir1/dir2/file3.txt
find . -type f -mtime -1 -exec ls -l {} \;
will list all files modified within the last 24 hours, with a long listing just to confirm the modification date
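If you prefer something close to the "ideal output" above (just the name and a short timestamp), GNU find can print that directly; a sketch, assuming GNU find:
# basename, abbreviated month, day and time for each file modified in the last day
find . -type f -mtime -1 -printf '%f %Tb %Td %TH:%TM\n'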
Use:
find . -mtime +1
For more information, see
man find
find dir -mtime +1 -print
That will find all files in dir and subdirectories that were modified 1 day ago or before that.
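For reference, the sign on -mtime changes the meaning, and find rounds the file age down to whole 24-hour periods; a short sketch of the three forms:
# -mtime -1  modified less than 24 hours ago
# -mtime  1  modified between 24 and 48 hours ago ("yesterday" in whole days)
# -mtime +1  modified more than 48 hours ago
find . -type f -mtime 1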

Recursively traverse Samba shares?

With bash on Linux, how would I write a command to recursively traverse mounted shares, run commands on each file to get the file type, size, permissions, etc., and then output all of this to a file?
A CIFS share mount looks like a regular directory tree in the Linux shell, so the command to search it is generic.
From the base directory:
find . -type f -exec ls -lsrt {} \; > file.txt
This does not give you the file-type detail; that can be done by running file on each file, as in the sketch below.
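To capture the long listing and the file type in one pass, find can run more than one -exec per file; a minimal sketch of that combination:
# for each regular file: long listing, then file-type detection, all into file.txt
find . -type f -exec ls -ls {} \; -exec file {} \; > file.txt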
mount -v | grep smbfs | awk '{print $3}' | xargs ls -lsR
which you can redirect to a file.
mount -v | awk '/smbfs/{
    cmd="ls -lsR "$3
    while((cmd | getline d)>0){
        print d "->file "$3
    }
    close(cmd)
}'
find $(mount -t smbfs | awk '{print $3}') -mount -type f -ls -execdir file {} \;
...
33597911 4 -rw-rw-r-- 2 peter peter 5 Dec 6 00:09 ./test.d\ ir/base
./base: ASCII text
3662 4 -rw-rw-r-- 2 peter peter 4 Dec 6 02:26 ./test.txt...
./test.txt...: ASCII text
3661 0 -rw-rw-r-- 2 peter peter 0 Dec 6 02:45 ./foo.txt
./foo.txt: empty
...
If you used -exec file {} +, it would run file once with multiple arguments, but then the output wouldn't be nicely interleaved with find's -ls output. (GNU find's -execdir {} + currently behaves the same as -execdir {} \;, due to a bug workaround. Use -exec file {} \; if you want the full path in the file output as well as in the -ls output above it.)
find's -ls output is not quite the same as ls -l, since it includes the inode and the number of blocks as the first two fields.
