I have modified some files in various folders under my webroot on a development environment. Now I need to find all the files modified yesterday so I can migrate them to production.
Is there a Linux command to list only the files modified yesterday in my webroot tree?
find ./ -mtime -1
This finds everything under the current directory that was modified within the last 24 hours.
find . -daystart -mtime 1 -print
This gets just the files modified YESTERDAY; i.e. if today is Jun 21, only files from Jun 20 are found.
(-mtime takes a '-' prefix, a '+' prefix, or a bare number of exact days.)
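For illustration, here is how the three forms behave with GNU find when -daystart is not given (ages are measured in whole 24-hour periods from the current time):
find . -mtime -1   # modified less than 24 hours ago
find . -mtime 1    # modified between 24 and 48 hours ago
find . -mtime +1   # modified more than 48 hours ago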
If you want a long listing, substitute -exec ls -ld \; for the -print.
find . -mtime +2 -prune -o -mtime +1 -print
This prunes (excludes) anything that was modified more than two days ago, then prints anything remaining that was modified more than one day ago.
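To actually stage yesterday's changes for the migration to production, one possible sketch (the webroot and staging paths below are placeholders, and cp --parents assumes GNU coreutils):
cd /var/www/webroot
mkdir -p /tmp/to-deploy
find . -daystart -mtime 1 -type f -exec cp --parents {} /tmp/to-deploy/ \;
This reuses the -daystart -mtime 1 test from above and recreates each file's relative path under the staging directory, ready to be copied to production.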
I'm using tail on an Ubuntu device to fetch information from several files at once. I'm fairly new to Linux, so the command I use is
root@mydevice:/# tail /path/to/directory/*/*
so that I can read many files in one go.
However, some of the subdirectories also contain subdirectories, so the output looks like this:
==> /path/to/directory/number_one/subdirectory <==
tail: error reading '/path/to/directory/number_one/subdirectory': Is a directory
==> /path/to/directory/number_one/data_one <==
23300000
==> /path/to/directory/number_one/data_two <==
23953
==> /path/to/directory/number_one/data_three <==
667
etc...
Is there a way to use tail while ignoring subdirectories so I don't get this error?
Thanks a lot in advance.
You can use find with -type f so it only matches regular files, not directories. -maxdepth 1 keeps it from recursing into the subdirectories.
find /path/to/directory/*/* -maxdepth 1 -type f -exec tail {} +
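Alternatively, a small plain-bash sketch that filters the glob down to regular files and still hands them to tail in a single call, so the ==> file <== headers are kept:
regular_files=()
for f in /path/to/directory/*/*; do
    [ -f "$f" ] && regular_files+=("$f")   # keep only regular files
done
tail "${regular_files[@]}"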
I want to remove only the files inside a directory and its subdirectories, not the directory itself and not the subdirectories. Only the files should be deleted.
To remove all the files under dir/subdir recursively (both dir/subdir/file and dir/subdir/subsubdir/file will be removed):
find dir/subdir -type f -delete
To remove files at most one level below dir/subdir (this removes dir/subdir/file but not dir/subdir/subsubdir/file):
find dir/subdir -maxdepth 1 -type f -delete
To remove dir/file and dir/subdir/file but not dir/subdir/subsubdir/file:
find dir dir/subdir -maxdepth 1 -type f -delete
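In any of these, it can be worth previewing the matches before deleting anything; since -print is find's default action, simply dropping -delete shows what would be removed, for example:
find dir/subdir -type f
Once the list looks right, add -delete back.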
I need to remove a large number of symbolic links from a folder that has other files which I don't want to remove. Is there any easy way to remove only symbolic links?
You can use the find(1) command
find . -maxdepth 1 -type l -exec rm {} \;
-maxdepth 1 restricts the search to the current directory (no recursion).
-type l matches only symbolic links.
-exec runs rm on each match, with {} replaced by find with the matched path and \; terminating the sub-command run by find.
See also the man page of rm(1) (and of ls(1), mv(1), cp(1), ln(1), stat(1) if you want to use them in a variant of that find command).
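With GNU find you can also skip the external rm entirely and use the built-in -delete action:
find . -maxdepth 1 -type l -delete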
I have to do this task on Linux:
find recursively only files in /etc that are larger than 200kb and redirect stdout to a file named FLfindout and redirect stderr to a file named FLfinderr
I typed in
find /etc 200k > FLfindout 2> FLstderr
but I don't know what the output is supposed to look like. Is this command right?
If I understand your question right, you just want the list of files larger than 200 KB. You can try
find /etc -type f -size +200k
(the k suffix matters: without it, -size +200 means 200 512-byte blocks, i.e. roughly 100 KB). If you want to print this into a file, you could try
find /etc -type f -size +200k > file.txt
Try this:
find /etc -type f -size +200k -print > FLfindout 2> FLfinderr
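To sanity-check the result, a small variant (assuming GNU find and ls) that lists the matches with human-readable sizes instead of bare paths:
find /etc -type f -size +200k -exec ls -lh {} + > FLfindout 2> FLfinderr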
I have a lot of files in a specific folder.
I want to delete all files except *.html files in that folder.
Is there any way to do this on the command line? I am using Linux.
I'll assume you mean the Linux command line; please update your question if not.
find ./folder/to/look/in -type f -not -iname '*.html' -exec rm {} \;
Here find walks the folder, -type f restricts the matches to regular files, -not -iname '*.html' excludes anything whose name ends in .html (case-insensitively), and -exec rm {} \; runs rm on each match.
Edit: if you have a lot of files, you might want to make find execute a single rm command for all of them instead of one per file. You can do that by using + instead of \;:
find ./folder/to/look/in -type f -not -iname '*.html' -exec rm {} +
(With +, find appends as many matched paths as fit onto each rm invocation, similar to xargs, instead of running rm once per file.)
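If you want to check first, a minimal dry run that only prints what would be removed, without deleting anything:
find ./folder/to/look/in -type f -not -iname '*.html' -print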