I'm on Ubuntu, and I'd like to find all files in the current directory and subdirectories whose name contains the string "John". I know that grep can match the content of the files, but I have no idea how to use it with file names.
Use the find command:
find . -type f -name "*John*"
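If case shouldn't matter, find also supports -iname for case-insensitive matching, e.g. this sketch of the same search:
find . -type f -iname "*john*"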
The find command can take a long time because it scans the actual files in the file system.
The quickest way is to use the locate command, which returns results almost immediately:
locate "John"
If the command is not found, install the mlocate package and run the updatedb command once to build the search database.
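On Ubuntu, that first-time setup might look like this (a sketch, assuming the mlocate package is available via apt):
sudo apt-get install mlocate
sudo updatedb
locate "John"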
More detail here: https://medium.com/@thucnc/the-fastest-way-to-find-files-by-filename-mlocate-locate-commands-55bf40b297ab
This is a very simple solution using the tree command, run in the directory you want to search. -f shows the full path of each file, and the output is piped to grep to keep only the entries that contain the string filename in the name.
tree -f | grep filename
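Applied to the original question, that might look like the following sketch (-i just makes the match case-insensitive):
tree -f | grep -i john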
Use ack; it's simple. To match file names rather than file contents, use its -g option:
ack -g <string to be searched>
I am writing a script in bash and I need to count how many files starting with ddd are in a remote directory, using SFTP. Afterwards the script downloads each file, so that I can compare how many files were in the remote directory with how many files were downloaded, and check that they match.
I was doing something like this:
echo ls -l | sftp "user@123.45.67.8:/home/user/datafolder/ddd*" | wc -l
The one above works, but when I run it, it downloads all the files to my local folder, which I do not want.
How can I count the number of files without downloading them? I want to download them in another part of the code.
As said in the comments, the best way to do this is with ssh. This outputs what I wanted (the glob is quoted so that it is expanded on the remote side rather than locally):
ssh user@123.45.67.8 'ls /home/user/datafolder/ddd*' | wc -l
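To then compare the remote count with what actually arrived, a minimal sketch (the host and paths are the question's examples; the current directory is assumed to be the download target):
remote_count=$(ssh user@123.45.67.8 'ls /home/user/datafolder/ddd* | wc -l')
local_count=$(ls ddd* 2>/dev/null | wc -l)
# Both are plain numbers, so a numeric test compares them
if [ "$remote_count" -eq "$local_count" ]; then
    echo "all files downloaded"
else
    echo "counts differ: remote=$remote_count local=$local_count"
fi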
rsync --list-only provides a succinct way to list the files in a remote directory. Simply passing the result to wc -l takes care of the count (excluding the . and .. (dot) files), e.g.
rsync --list-only server:/path/to/dir/ | wc -l
(Note the trailing '/' to count the contents rather than the directory itself. Add -r for a recursive count. All the usual rsync options are available to tailor which files are counted, e.g. --exclude="stuff".)
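For this question's ddd* files specifically, one hedged sketch is to let grep do the counting, so that only matching names are included (this assumes rsync's listing format, which puts the file name last on each line):
rsync --list-only user@123.45.67.8:/home/user/datafolder/ | grep -c ' ddd'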
I have a folder containing a sequence of files whose names have the form filename-white.png, e.g.
images
arrow-down-white.png
arrow-down-right-white.png
...
bullets-white.png
...
...
video-white.png
I want to strip out the -white bit so the names are simply filename.png. I have played around with the Linux rename command (doing dry runs with -n), but my knowledge of regexes is rather limited, so I have been unable to find the right way to do this.
If you are in the directory above images, the command is
rename "s/-white\.png$/.png/" images/*
If your current directory is images, then run rename "s/-white\.png$/.png/" ./* instead. (Escaping the dot and anchoring with $ ensures only the literal -white.png suffix is matched.) To do a dry run, just attach a -n like you said:
rename -n "s/-white\.png$/.png/" images/*
or
rename -n "s/-white\.png$/.png/" ./*
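If the perl-style rename is not available, a bash-only sketch with parameter expansion does the same job (run from the directory above images):
for f in images/*-white.png; do
    mv -- "$f" "${f%-white.png}.png"
done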
Presently I am using Linux (Fedora 15), and I am trying to search for a folder across the entire file system with the command below:
find / -name "apache-tomcat*"
The above command takes so long that a user can't reasonably wait for it, and the results look something like this:
[root@user fedrik]# find / -name "apache-tomcat*"
find: `/proc/6236/task/6236/ns/net': No such file or directory
find: `/proc/6236/task/6236/ns/uts': No such file or directory
find: `/proc/6236/task/6236/ns/ipc': No such file or directory
find: `/proc/6236/ns/net': No such file or directory
find: `/proc/6236/ns/uts': No such file or directory
find: `/proc/6236/ns/ipc': No such file or directory
find: `/proc/6462/task/6462/ns/net': No such file or directory
.................
.................
But as I mentioned, it takes a long time to run and sometimes appears to get stuck. Can anyone let me know how to search for a particular folder by name, across the entire file system (starting from / as above), with a command that is fast?
Edit
Actually my intention is to search for a folder named something like apache-tomcat-7.0.37 anywhere in the file system.
For example, there may be many folders like apache-tomcat-6.0.45, apache-tomcat-5.1.7, apache-tomcat-5.0.37, ... in different locations on the file system.
As you can observe, only the last (numeric) part changes while the rest of the folder name stays the same. So is there a way to search for these kinds of folders irrespective of the numeric part, for example by using a regular expression or something like that?
Ultimately I want to find folders of the form apache-tomcat-xxxxxxx across the entire file system, because searching for just apache-tomcat can return hundreds or even thousands of results, which are difficult to analyze and search through.
Try this:
locate apache-tomcat
It uses a database (updated by the hilariously-named updatedb, which you can run with sudo updatedb to refresh the search index).
locate apache-tomcat | grep -E '/apache-tomcat-[[:digit:]]+\.[[:digit:]]+\.[[:digit:]]+$'
(locate prints full paths, so the pattern matches the final path component instead of being anchored with ^.)
or just use [0-9] instead of [[:digit:]]. That's probably more readable. Or
locate apache-tomcat | perl -ne 'print if m{/apache-tomcat-\d+\.\d+\.\d+$}'
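locate can also filter with a regular expression by itself, which avoids the extra grep or perl step (a sketch, assuming the mlocate implementation and its --regex option):
locate --regex '/apache-tomcat-[0-9]+\.[0-9]+\.[0-9]+$'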
Whatever you do, you definitely want to use locate instead of find, as it will be much faster.
I prefer to search with the locate command, but I don't know how to perform a partial search with it.
Suppose I want to search for a file containing the word libevent. How can I do that?
locate searches file names, not file contents.
The ugly way is to use grep; it'll search everything starting from the / directory:
grep -irn 'libevent' /
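On a live system this will also descend into pseudo-filesystems such as /proc. A hedged refinement using GNU grep's --exclude-dir option skips them:
grep -irn --exclude-dir=proc --exclude-dir=sys --exclude-dir=dev 'libevent' /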
The better way is to narrow the search down to the directories where the files could exist. Suppose those directories' full paths are /path/to/dir1, /path/to/dir2, etc. Then invoke the following command:
for dir in /path/to/dir1 /path/to/dir2
do
grep -irn 'libevent' "$dir"
done
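Since grep accepts multiple directory arguments, the loop can also be collapsed into a single invocation:
grep -irn 'libevent' /path/to/dir1 /path/to/dir2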
The locate command does not search inside the content of files the way grep (and other commands) do. It simply searches file paths.
locate works from a cached index of file paths, and this index is usually updated by the updatedb utility.
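Because locate matches substrings of the whole path, a partial search is simply the following (a sketch; -b, in the mlocate implementation, restricts the match to the base name):
locate libevent
locate -b libevent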
addenda
A useful way to search for some pattern inside (the content of) some files is to use the ability of zsh, or of bash 4 or later with the globstar option enabled, to expand the ** file pattern, e.g.
grep foo ~/gee/**/*.[ch]
With zsh this searches for foo inside all files named *.c or *.h under $HOME/gee/. I find this feature tremendously useful; it alone justifies adopting zsh as my interactive shell. With other shells you might type the much longer
find $HOME/gee -name '*.[ch]' | xargs grep foo
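In bash the globstar behaviour is off by default; a minimal sketch of enabling it first:
shopt -s globstar   # bash 4 or later
grep foo ~/gee/**/*.[ch]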
I primarily program on Linux, using the tcsh shell. By default, my current directory is the root of my code base. I use find to locate whichever file I'm interested in modifying, and once find shows the file's location, I edit it in Vim.
The problem is that, due to the size of the code base, every time I ask find to locate a file it takes at least 4-5 seconds to complete the search: too long to wait comfortably, yet too short to spend on anything else! So, since new files are added to the code base only rarely, I'm looking for a way to:
1) Generate the list of all files in my code base
2) Have find look in only those locations/files to answer my query
I've seen how opening files in cscope is lightning fast, because it stores the list of files in advance. I'd like to use the same mechanism with find, just not from within the cscope window but from the ordinary command line.
Any ideas?
Install the locate, mlocate, or slocate package from your distribution, and either wait for cron to run the update task :) or run the update manually via /etc/cron.daily/mlocate or a similar script.
$ time locate kernel.txt
/home/sarnold/Local/linux-2.6/Documentation/sysctl/kernel.txt
/home/sarnold/Local/linux-2.6-config-all/Documentation/sysctl/kernel.txt
/home/sarnold/Local/linux-apparmor/Documentation/sysctl/kernel.txt
/usr/share/doc/libfuse2/kernel.txt.gz
real 0m0.595s
Yes. See slocate (or updatedb & locate).
The -U flag is particularly interesting because it lets you index just the directory that contains your code (and thus, updating or creating the database will be quick).
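A hedged sketch of that per-project setup with the mlocate tools (~/code and ~/.code.db are hypothetical paths; -l 0 disables the visibility checks that otherwise require root):
updatedb -l 0 -U ~/code -o ~/.code.db
locate -d ~/.code.db foo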
You could write a list of directories to a file and use them in your find command:
$ find /path/to/src -type d > dirs
$ find $(cat dirs) -type f -name "foo"
Alternatively, write a list of files to a file and use grep on it. The list of files is more likely to change than the list of dirs though.
$ find /path/to/src -type f > files
$ vi $(grep foo files)
Using find in conjunction with xargs (instead of -exec) can differ significantly in execution time:
http://forrestrunning.wordpress.com/2011/08/01/find-exec-xargs/
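For reference, the two forms being compared there look like this (a sketch; the name pattern and grep target are placeholders):
find . -name '*.c' -exec grep foo {} \;
find . -name '*.c' -print0 | xargs -0 grep foo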