Open all files in Sublime Text from a directory except ones explicitly specified - linux

I know that we can use the command find . -type f ! -name '[PATTERN]' to find all files in the current directory except the ones that match the specified pattern, but how do I feed the files so found as command-line arguments into the Sublime Text editor so it opens them?
misha@hp-laptop:~/work/cpp/class$ ls -l
total 28
-rwxrwxr-x 1 misha misha 14252 Mar 24 00:49 out
-rw-rw-r-- 1 misha misha 236 Mar 24 00:48 Person.cpp
-rw-rw-r-- 1 misha misha 255 Mar 24 00:49 Person.h
-rw-rw-r-- 1 misha misha 200 Mar 24 00:49 test.cpp
misha@hp-laptop:~/work/cpp/class$ find . -type f ! -name 'out'
./Person.h
./test.cpp
./Person.cpp
misha@hp-laptop:~/work/cpp/class$

For this, xargs is your friend! It allows you to turn lines from STDIN into parameters.
Assuming you can open files with sublime some_file some_other_file ..., you can use:
find . -type f ! -name 'out' | xargs sublime
In your case, it will take the output from find
./Person.h
./test.cpp
./Person.cpp
And append them to sublime to build and run a command:
sublime ./Person.h ./test.cpp ./Person.cpp
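A caveat on the pipe: file names containing spaces or quotes get split by xargs. A safer sketch, assuming GNU find/xargs; 'echo' below stands in for your Sublime launcher (often subl, sometimes sublime), so you can dry-run it first:

```shell
# Demo in a scratch directory: pass the names NUL-terminated so names
# with spaces survive. Replace 'echo' with your Sublime CLI launcher
# (often 'subl') in real use.
dir=$(mktemp -d)
cd "$dir"
touch out Person.cpp Person.h "has space.cpp"
find . -type f ! -name 'out' -print0 | xargs -0 -n1 echo
cd / && rm -rf "$dir"
```

In real use you would drop -n1 so all names go to the editor in one invocation: find . -type f ! -name 'out' -print0 | xargs -0 subl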

Related

how to get previous date files and pass ls output to array in gawk

I have log files like the ones below being generated, and I need a script that runs daily, lists them, and then does two things:
1- gets the previous day's (yesterday's) files and transfers them to the x server
2- gets files older than one day and transfers them to the y server
The files look like the listing below, and I am trying the code below, but it is not working.
How can we pass ls -altr output to gawk? Can we build an associative array like the one below?
array[index]=ls -altr | awk '{print $6,$7,$8}'
Here is the code I am trying, to retrieve the previous day's files, but it is not working:
previous_dates=$(date -d "-1 days" '+-%d')
ls -altr |gawk '{if ( $7!=previous_dates ) print $9 }'
-r-------- 1 root root 6291563 Jun 22 14:45 audit.log.4
-r-------- 1 root root 6291619 Jun 24 09:11 audit.log.3
drwxr-xr-x. 14 root root 4096 Jun 26 03:47 ..
-r-------- 1 root root 6291462 Jun 26 04:15 audit.log.2
-r-------- 1 root root 6291513 Jun 27 23:05 audit.log.1
drwxr-x---. 2 root root 4096 Jun 27 23:05 .
-rw------- 1 root root 5843020 Jun 29 14:57 audit.log
To select files modified yesterday, you could use
find . -daystart -type f -mtime 1
and to select older files, you could use
find . -daystart -type f -mtime +1
possibly adding a -name test to select only files like audit.log*, for example. You could then use xargs to process the files, e.g.
find . -daystart -type f -mtime 1 | xargs -n 1 -I{} scp {} user@server
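A self-contained sketch of how -daystart shifts the windows (GNU find and touch -d assumed; the file names are invented):

```shell
# -daystart measures ages from the start of today, so with GNU find
# '-mtime 1' selects yesterday's files and '-mtime +1' older ones.
d=$(mktemp -d)
touch -d 'yesterday 12:00' "$d/audit.log.1"   # yesterday -> x server
touch -d '3 days ago'      "$d/audit.log.4"   # older     -> y server
touch "$d/audit.log"                          # today     -> neither
find "$d" -daystart -type f -mtime 1    # yesterday's files
find "$d" -daystart -type f -mtime +1   # prints only audit.log.4
rm -rf "$d"
```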

Deleting multiple files in Linux?

How can I delete multiple files in Linux created at the same date and time? How can I manage this without using date? The files have different names.
I have these .txt files:
-rw-r--r-- 1 root root 54 Jan 6 17:28 file1.txt
-rw-r--r-- 1 root root 33 Jan 6 17:28 file2.txt
-rw-r--r-- 1 root root 24 Jan 6 18:05 file3.txt
-rw-r--r-- 1 root root 0 Jan 6 17:28 file4.txt
-rw-r--r-- 1 root root 0 Jan 6 17:28 file5.txt
How can I delete all the files with one command?
You can use the find command and specify a time range. In your example, if you would like to find all files with a modification timestamp of Jan 6 17:28, you can do something like:
find . -type f -newermt '2016-01-06 17:28' ! -newermt '2016-01-06 17:29'
If you would like to delete them, just use find's -exec parameter:
find . -type f -newermt '2016-01-06 17:28' ! -newermt '2016-01-06 17:29' -exec rm {} \;
You can also include -name '*.txt' if you want to process only *.txt files, and look at the -maxdepth parameter as well if you would like to avoid processing subdirectories.
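Before pointing -exec rm at real data, the window logic is easy to verify on throwaway files (GNU find and touch -d assumed; file names invented):

```shell
# Two files inside the 17:28 minute, one outside; only the first two
# should be selected by the one-minute -newermt window.
d=$(mktemp -d)
touch -d '2016-01-06 17:28:30' "$d/file1.txt" "$d/file2.txt"
touch -d '2016-01-06 18:05:00' "$d/file3.txt"
find "$d" -type f -newermt '2016-01-06 17:28' ! -newermt '2016-01-06 17:29'
rm -rf "$d"
```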
Simply use rm -f file*.txt to delete all files whose names start with file and end with the extension .txt.
If you know how many minutes ago the files were modified, you can delete them all using the find command. Suppose the files were last modified ten minutes ago; then you can use:
find -iname "*.txt" -mmin 10 -ok rm {} \;
If you don't need to prompt before deleting then use -exec.
find -iname "*.txt" -mmin 10 -exec rm {} \;
If you need to select the files by access time instead, you can use -amin.

How to find files modified in last x minutes (find -mmin does not work as expected)

I'm trying to find files modified in last x minutes, for example in the last hour. Many forums and tutorials on the net suggest to use the find command with the -mmin option, like this:
find . -mmin -60 |xargs ls -l
However, this command did not work for me as expected. As you can see from the following listing, it also shows files modified earlier than 1 hour ago:
-rw------- 1 user user 9065 Oct 28 23:13 1446070435.V902I67a5567M283852.harvester
-rw------- 1 user user 1331 Oct 29 01:10 1446077402.V902I67a5b34M538793.harvester
-rw------- 1 user user 1615 Oct 29 01:36 1446078983.V902I67a5b35M267251.harvester
-rw------- 1 user user 72365 Oct 29 02:27 1446082022.V902I67a5b36M873811.harvester
-rw------- 1 user user 69102 Oct 29 02:27 1446082024.V902I67a5b37M142247.harvester
-rw------- 1 user user 2611 Oct 29 02:34 1446082482.V902I67a5b38M258101.harvester
-rw------- 1 user user 2612 Oct 29 02:34 1446082485.V902I67a5b39M607107.harvester
-rw------- 1 user user 2600 Oct 29 02:34 1446082488.V902I67a5b3aM465574.harvester
-rw------- 1 user user 10779 Oct 29 03:27 1446085622.V902I67a5b3bM110329.harvester
-rw------- 1 user user 5836 Oct 29 03:27 1446085623.V902I67a5b3cM254104.harvester
-rw------- 1 user user 8970 Oct 29 04:27 1446089232.V902I67a5b3dM936339.harvester
-rw------- 1 user user 165393 Oct 29 06:10 1446095400.V902I67a5b3eM290158.harvester
-rw------- 1 user user 105054 Oct 29 06:10 1446095430.V902I67a5b3fM265065.harvester
-rw------- 1 user user 1615 Oct 29 06:24 1446096244.V902I67a5b40M55701.harvester
-rw------- 1 user user 1620 Oct 29 06:24 1446096292.V902I67a5b41M337769.harvester
-rw------- 1 user user 10436 Oct 29 06:36 1446096973.V902I67a5b42M707215.harvester
-rw------- 1 user user 7150 Oct 29 06:36 1446097019.V902I67a5b43M415731.harvester
-rw------- 1 user user 4357 Oct 29 06:39 1446097194.V902I67a5b56M446687.harvester
-rw------- 1 user user 4283 Oct 29 06:39 1446097195.V902I67a5b57M957052.harvester
-rw------- 1 user user 4393 Oct 29 06:39 1446097197.V902I67a5b58M774506.harvester
-rw------- 1 user user 4264 Oct 29 06:39 1446097198.V902I67a5b59M532213.harvester
-rw------- 1 user user 4272 Oct 29 06:40 1446097201.V902I67a5b5aM534679.harvester
-rw------- 1 user user 4274 Oct 29 06:40 1446097228.V902I67a5b5dM363553.harvester
-rw------- 1 user user 20905 Oct 29 06:44 1446097455.V902I67a5b5eM918314.harvester
Actually, it just listed all files in the current directory. We can take one of these files as an example and check if its modification time is really as displayed by the ls command:
stat 1446070435.V902I67a5567M283852.harvester
File: ‘1446070435.V902I67a5567M283852.harvester’
Size: 9065 Blocks: 24 IO Block: 4096 regular file
Device: 902h/2306d Inode: 108680551 Links: 1
Access: (0600/-rw-------) Uid: ( 1001/ user) Gid: ( 1027/ user)
Access: 2015-10-28 23:13:55.281515368 +0100
Modify: 2015-10-28 23:13:55.281515368 +0100
Change: 2015-10-28 23:13:55.313515539 +0100
As we can see, this file was definitely last modified earlier than 1 hour ago! I also tried find -mmin 60 or find -mmin +60, but it did not work either.
Why is this happening and how to use the find command correctly?
I can reproduce your problem if there are no files in the directory that were modified in the last hour. In that case, find . -mmin -60 returns nothing. The command find . -mmin -60 |xargs ls -l, however, returns every file in the directory which is consistent with what happens when ls -l is run without an argument.
To make sure that ls -l is only run when a file is found, try:
find . -mmin -60 -type f -exec ls -l {} +
The problem is that
find . -mmin -60
outputs:
.
./file1
./file2
Note the line with one dot?
That makes ls list the whole directory exactly the same as when ls -l . is executed.
One solution is to list only files (not directories):
find . -mmin -60 -type f | xargs ls -l
But it is better to use directly the option -exec of find:
find . -mmin -60 -type f -exec ls -l {} \;
Or just:
find . -mmin -60 -type f -ls
Which, by the way is safe even including directories:
find . -mmin -60 -ls
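The difference is easy to reproduce in a scratch directory (GNU find/xargs assumed; BSD xargs behaves differently on empty input):

```shell
d=$(mktemp -d)
cd "$d"
touch -d '2 hours ago' old.harvester    # the only file; not recent
echo '--- pipe to xargs (surprising: lists the whole directory):'
find . -mmin -60 -type f | xargs ls -l
echo '--- find -exec (correct: prints nothing):'
find . -mmin -60 -type f -exec ls -l {} +
cd / && rm -rf "$d"
```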
To search for files in /target_directory and all its sub-directories, that have been modified in the last 60 minutes:
$ find /target_directory -type f -mmin -60
To find the most recently modified files, sorted in the reverse order of update time (i.e., the most recently updated files first):
$ find /etc -type f -printf '%TY-%Tm-%Td %TT %p\n' | sort -r
Manual of find:
Numeric arguments can be specified as
+n for greater than n,
-n for less than n,
n for exactly n.
-amin n
File was last accessed n minutes ago.
-anewer file
File was last accessed more recently than file was modified. If file is a symbolic link and the -H option or the -L option is in effect, the access time of the file it points to is always
used.
-atime n
File was last accessed n*24 hours ago. When find figures out how many 24-hour periods ago the file was last accessed, any fractional part is ignored, so to match -atime +1, a file has to
have been accessed at least two days ago.
-cmin n
File's status was last changed n minutes ago.
-cnewer file
File's status was last changed more recently than file was modified. If file is a symbolic link and the -H option or the -L option is in effect, the status-change time of the file it points
to is always used.
-ctime n
File's status was last changed n*24 hours ago. See the comments for -atime to understand how rounding affects the interpretation of file status change times.
Example:
find /dir -cmin -60 # status change time (not creation)
find /dir -mmin -60 # modification time
find /dir -amin -60 # access time
I was working through the same need, and I believe your timeframe is incorrect.
Try these (this relies on GNU find accepting fractional day values):
15 min change: find . -mtime -.01
1 hr change: find . -mtime -.04
12 hr change: find . -mtime -.5
You should be using 24 hours as your base. The number after -mtime is relative to 24 hours; thus -.5 is the equivalent of 12 hours, because 12 hours is half of 24 hours.
Actually, there's more than one issue here. The main one is that xargs by default executes the command you specified, even when no arguments have been passed. To change that you might use a GNU extension to xargs:
--no-run-if-empty
-r
If the standard input does not contain any nonblanks, do not run the command. Normally, the command is run once even if there is no input. This option is a GNU extension.
Simple example:
find . -mmin -60 | xargs -r ls -l
But this might match all subdirectories, including . (the current directory), and ls will list each of them individually. So the output will be a mess. Solution: pass -d to ls, which prevents it from listing directory contents:
find . -mmin -60 | xargs -r ls -ld
Now you don't like . (the current directory) in your list? Solution: exclude the first directory level (0) from find output:
find . -mindepth 1 -mmin -60 | xargs -r ls -ld
Now you'd need only the files in your list? Solution: exclude the directories:
find . -type f -mmin -60 | xargs -r ls -l
Now you have some files with names containing white space, quote marks, or backslashes? Solution: use null-terminated output (find) and input (xargs) (these are also GNU extensions, afaik):
find . -type f -mmin -60 -print0 | xargs -r0 ls -l
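A minimal check of the final NUL-safe form, using an invented file name with a space (GNU find/xargs assumed):

```shell
d=$(mktemp -d)
cd "$d"
touch 'name with spaces.txt'
# The name arrives at ls as a single argument rather than three:
find . -type f -mmin -60 -print0 | xargs -r0 ls -l
cd / && rm -rf "$d"
```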
This may work for you. I used it for cleaning folders during deployments, deleting old deployment files.
clean_anyfolder() {
    local temp2="$1/**"                # path glob
    temp3=( $(ls -d $temp2 -t | grep "`date | awk '{print $2" "$3}'`") )
    j=0
    while [ $j -lt ${#temp3[@]} ]
    do
        echo "to be removed ${temp3[$j]}"
        delete_file_or_folder "${temp3[$j]}" 0    # delete here
        j=`expr $j + 1`
    done
}
This command may help you:
find -type f -mmin -60
(Note -mmin, which counts minutes; -mtime -60 would instead match files modified within the last 60 days.)

Linux combine sort files by date created and given file name

I need to combine these two commands in order to get a list of files matching the specified "filename", sorted by date created.
I know that sorting files by date can be achieved with:
ls -lrt
and finding a file by name with
find . -name "filename*"
I don't know how to combine these two. I tried with a pipeline but I don't get the right result.
[EDIT]
Not sorted
find . -name "filename" -printf '%TY:%Tm:%Td %TH:%Tm %h/%f\n' | sort
Forget xargs. "Find" and "sort" are all the tools you need.
My best guess would be to use xargs:
find . -name 'filename*' -print0 | xargs -0 /bin/ls -ltr
There's an upper limit on the number of arguments, but it shouldn't be a problem unless they occupy more than 32 kB, in which case you will get blocks of sorted files :)
find . -name "filename" -exec ls --full-time \{\} \; | cut -d' ' -f7- | sort
You might have to adjust the cut command depending on what your version of ls outputs.
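As an aside, GNU find's -printf can emit a sortable timestamp directly. Note that %TM (capital M) is the minute, while %Tm (lowercase, as in the question's attempt) is the month. A sketch with invented files:

```shell
d=$(mktemp -d)
touch -d '2015-03-02 10:00' "$d/filename_b"
touch -d '2015-01-15 09:30' "$d/filename_a"
# Timestamp-first output makes a plain textual sort a date sort:
find "$d" -name 'filename*' -printf '%TY-%Tm-%Td %TH:%TM %f\n' | sort
rm -rf "$d"
```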
Check the command shared below:
1) List files in a directory with last-modified date/time
To list files and show the most recently modified ones at the top, use the -lt option with the ls command.
$ ls -lt /run
output
total 24
-rw-rw-r--. 1 root utmp 2304 Sep 8 14:58 utmp
-rw-r--r--. 1 root root 4 Sep 8 12:41 dhclient-eth0.pid
drwxr-xr-x. 4 root root 100 Sep 8 03:31 lock
drwxr-xr-x. 3 root root 60 Sep 7 23:11 user
drwxr-xr-x. 7 root root 160 Aug 26 14:59 udev
drwxr-xr-x. 2 root root 60 Aug 21 13:18 tuned
https://linoxide.com/linux-how-to/how-sort-files-date-using-ls-command-linux/

counting number of directories in a specific directory

How do I count the number of folders in a specific directory? I am using the following command, but it always reports one extra.
find /directory/ -maxdepth 1 -type d -print| wc -l
For example, if I have 3 folders, this command provides 4. If it contains 5 folders, the command provides 6. Why is that?
find is also printing the directory itself:
$ find .vim/ -maxdepth 1 -type d
.vim/
.vim/indent
.vim/colors
.vim/doc
.vim/after
.vim/autoload
.vim/compiler
.vim/plugin
.vim/syntax
.vim/ftplugin
.vim/bundle
.vim/ftdetect
You can instead test the directory's children and do not descend into them at all:
$ find .vim/* -maxdepth 0 -type d
.vim/after
.vim/autoload
.vim/bundle
.vim/colors
.vim/compiler
.vim/doc
.vim/ftdetect
.vim/ftplugin
.vim/indent
.vim/plugin
.vim/syntax
$ find .vim/* -maxdepth 0 -type d | wc -l
11
$ find .vim/ -maxdepth 1 -type d | wc -l
12
You can also use ls:
$ ls -l .vim | grep -c ^d
11
$ ls -l .vim
total 52
drwxrwxr-x 3 anossovp anossovp 4096 Aug 29 2012 after
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 autoload
drwxrwxr-x 13 anossovp anossovp 4096 Aug 29 2012 bundle
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 colors
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 compiler
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 doc
-rw-rw-r-- 1 anossovp anossovp 48 Aug 29 2012 filetype.vim
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 ftdetect
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 ftplugin
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 indent
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 plugin
-rw-rw-r-- 1 anossovp anossovp 2505 Aug 29 2012 README.rst
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 syntax
$ ls -l .vim | grep ^d
drwxrwxr-x 3 anossovp anossovp 4096 Aug 29 2012 after
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 autoload
drwxrwxr-x 13 anossovp anossovp 4096 Aug 29 2012 bundle
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 colors
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 compiler
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 doc
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 ftdetect
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 ftplugin
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 indent
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 plugin
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 syntax
Get a count of only the directories in the current directory
echo */ | wc
you will get output like 1 309 4594; the second number (wc's word count) is the number of directories.
or
tree -L 1 | tail -1
find . -mindepth 1 -maxdepth 1 -type d | wc -l
For find, -mindepth 1 means skip the starting directory itself,
-maxdepth 1 means do not recurse into subdirectories,
-type d means match only directories,
and wc -l counts the lines of its input.
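A quick sanity check with invented directory names:

```shell
d=$(mktemp -d)
mkdir -p "$d/alpha/nested" "$d/beta"
touch "$d/file.txt"
# Counts alpha and beta only: no start dir, no nested dir, no plain file.
find "$d" -mindepth 1 -maxdepth 1 -type d | wc -l
rm -rf "$d"
```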
If the folder contains only directories and no files, this does it:
ls | wc -l
Run stat -c %h folder and subtract 2 from the result. This employs only a single subprocess as opposed to the 2 (or even 3) required by most of the other solutions here (typically find or ls plus wc).
Using sh/bash:
echo $((`stat -c %h folder` - 2))   # 'echo' is a shell builtin
Using csh/tcsh:
@ cnt = `stat -c %h folder` - 2; echo $cnt   # 'echo' is a shell builtin
Explanation: stat -c %h folder prints the number of hardlinks to folder, and each subfolder under folder contains a ../ entry which is a hardlink back to folder. You must subtract 2 because there are two additional hardlinks in the count:
folder's own self-referential ./ entry, and
folder's parent's link to folder
Note that this relies on the filesystem keeping the traditional directory link counts; some filesystems (btrfs, for example) report a link count of 1 for every directory, so the trick does not work there.
Navigate to your drive and simply execute
ls -lR | grep ^d | wc -l
To find all folders in total, including subdirectories:
find /mount/point -type d | wc -l
...or to find all folders in the top level only (not including subdirectories):
find /mount/point -maxdepth 1 -type d | wc -l
Cheers!
I think the easiest is
ls -ld images/*/ | wc -l
where images is your target directory. The trailing /*/ makes the glob match only directories, the -d flag lists the directories themselves rather than their contents, and -l performs a per-line listing, compatible with the very familiar wc -l for the line count.
The number of directories can be found using the command below:
ls -l | grep "^d" | wc -l
Some useful examples:
count files in current dir
/bin/ls -lA | egrep -c '^-'
count dirs in current dir
/bin/ls -lA | egrep -c '^d'
count files and dirs in current dir
/bin/ls -lA | egrep -c '^-|^d'
count files and dirs in in one subdirectory
/bin/ls -lA subdir_name/ | egrep -c '^-|^d'
I have noticed a strange thing (at least in my case): when I tried ls instead of /bin/ls, the -A parameter did not suppress the implied . and .. entries as expected; ls still showed ./ and ../, which gives a wrong count (ls was most likely aliased). Solution: use /bin/ls instead of ls.
To get the number of directories, go to the directory and execute:
ls -l | grep -c ^d
A pure bash solution:
shopt -s nullglob
dirs=( /path/to/directory/*/ )
echo "There are ${#dirs[@]} (non-hidden) directories"
If you also want to count the hidden directories:
shopt -s nullglob dotglob
dirs=( /path/to/directory/*/ )
echo "There are ${#dirs[@]} directories (including hidden ones)"
Note that this will also count links to directories. If you don't want that, it's a bit more difficult with this method.
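A quick check of the glob approach with invented names (bash required for shopt and arrays):

```shell
#!/bin/bash
d=$(mktemp -d)
mkdir "$d/one" "$d/two" "$d/.hidden"
touch "$d/regular_file"
shopt -s nullglob
dirs=( "$d"/*/ )
echo "non-hidden: ${#dirs[@]}"     # 2
shopt -s dotglob
dirs=( "$d"/*/ )
echo "with hidden: ${#dirs[@]}"    # 3
rm -rf "$d"
```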
Using find:
find . -type d ! -name . -prune -exec printf x \; | wc -c
Run it from inside the target directory; with an absolute starting point such as /path/to/directory, the starting directory itself would also be counted. The trick is to output an x to stdout each time a directory is found, and then use wc to count the number of characters. This counts the number of all directories (including hidden ones), excluding symlinks.
The methods presented here are all safe with respect to funny characters that can appear in file names (spaces, newlines, glob characters, etc.).
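A throwaway fixture for the printf x counter, with an invented symlink to confirm it is excluded (run from inside the directory):

```shell
d=$(mktemp -d)
mkdir "$d/visible" "$d/.hidden"
ln -s "$d/visible" "$d/link_to_dir"   # symlink: should NOT be counted
touch "$d/some_file"
cd "$d"
# Hidden directory counts; the symlink and the plain file do not:
find . -type d ! -name . -prune -exec printf x \; | wc -c
cd / && rm -rf "$d"
```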
Using zsh:
a=(*(/N)); echo ${#a}
The N is a nullglob, / makes it match directories, # counts. It will neatly cope with spaces in directory names as well as returning 0 if there are no directories.
The best answer to what you want is
echo `find . -maxdepth 1 -type d | wc -l`-1 | bc
this subtracts one to remove the unwanted '.' directory that find lists (as patel deven mentioned above).
If you want to count subfolders recursively, then just leave off the maxdepth option, so
echo `find . -type d | wc -l`-1 | bc
PS If you find command substitution ugly, subtracting one can be done as a pure stream using sed and bc.
Subtracting one from count:
find . -maxdepth 1 -type d | wc -l | sed 's/$/-1\n/' | bc
or, adding count to minus one:
find . -maxdepth 1 -type d | wc -l | sed 's/^/-1+/' | bc
Count all files and subfolders, windows style:
dir=/YOUR/PATH;f=$(find $dir -type f | wc -l); d=$(find $dir -mindepth 1 -type d | wc -l); echo "$f Files, $d Folders"
If you want to use regular expressions, then try:
ls -l | grep "^d" | wc -l
(The long -l listing is needed so that directory entries start with a d for grep to match.)
If you want to count folders that have similar names like folder01,folder02,folder03, etc then you can do
ls -l | grep ^d | grep -c folder
A quick approximation:
ls -la | grep -v total | wc -l
Note that this counts every entry, files as well as directories, including the . and .. entries, so it only gives the directory count when the folder contains nothing but directories.
