'ls | grep -c' and full path - linux

Can I use ls | grep -c /full/path/to/file to count the occurrences of a file, but while executing the command from a different directory than the one the files I'm looking for are in?
Let's say I want to see how many .txt files I have in my "results" directory. Can I do something like ls | grep -c /full/path/to/results/*.txt while I'm in another directory?
Although I have .txt files in that directory, I always get a zero when I run the command from another directory :( What's happening? Can I only use ls for the current directory?

You have to use ls <dirname>. Plain ls defaults only to the current directory.
What you are trying to do can be accomplished with find <dir> -name "*.txt" | grep -c txt or find <dir> -name "*.txt" | wc -l
But you can do ls <dir> | grep -c '\.txt$' as well (quote the regex, or the shell strips the backslash before grep sees it). Please read the manuals to see the differences.

grep accepts regular expressions, not globs. /foo/bar/*.txt is a glob; try /foo/bar/.*\.txt instead.
Also, ls lists the files and directories under your current directory, and it does not print full paths, so a pattern containing /full/path/... can never match. Do some tests and you will see it easily.
Finally, when its output goes to a terminal, ls arranges names in columns, possibly several per line, and grep does line-based matching, so that layout would throw off grep -c. (When ls writes to a pipe it prints one name per line, which is what makes grep -c usable here at all.)
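Putting those points together, a minimal sketch of counting .txt files in a directory other than the current one (the /tmp/demo_results path is a throwaway directory invented for this demo):

```shell
# Create a disposable directory with sample files for the demo.
rm -rf /tmp/demo_results
mkdir -p /tmp/demo_results
touch /tmp/demo_results/a.txt /tmp/demo_results/b.txt /tmp/demo_results/c.log

# Pass the directory to ls; when piped, ls prints one name per line,
# so grep -c counts matching lines. Anchoring with $ keeps "txt"
# elsewhere in a name from matching.
ls /tmp/demo_results | grep -c '\.txt$'
```

The pattern is a regular expression matched against bare names, not a glob matched against full paths, which is why this works where grep -c /full/path/... did not.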

Related

How to count all the files using ls -l statements separated by && in a single line?

I'm trying to count all the files from several ls -l statements for certain file types, separated by the double ampersand symbol like so:
ls -l *.xml && ls -l *.json && ls -l *.md
The technique I've seen for a single file type (or for all files) simply counts the lines that begin with - using an egrep command: egrep -c '^-'
Here is a link for a single ls -l command to find the count for all files : Link to a question about ls -l count using egrep -c command in stack overflow.
If I chain several ls -l statements for different file types on a single line like this, how do I count each statement's total in Linux using an sh or bash shell script?
I tried this and it doesn't seem to work:
ls -l *.xml && ls -l *.json && ls -l *.md | egrep -c '^-'
I also tried:
ls -l *.xml && ls -l *.json && ls -l *.md | grep -v /$ | wc -l
Unfortunately, it doesn't like the '&&' symbols that concatenate the results, and it also doesn't work with the '|' (pipe symbol) for some reason. The pipe symbol must not work with the '&&' symbol the way I'm using it.
I'm not quite sure if I understood the objective correctly. (Note that | binds more tightly than &&, so in both of your attempts only the last ls -l was piped into the counter; the earlier ones printed straight to the terminal.)
If you're wanting a total number of all three types combined:
ls *.xml *.json *.md | wc -l
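If instead you want a separate total per file type, which the && chain suggests, a small loop works. This is only a sketch, run against a throwaway /tmp/demo_counts directory invented for the demo:

```shell
# Disposable directory with a known mix of files.
rm -rf /tmp/demo_counts
mkdir -p /tmp/demo_counts
cd /tmp/demo_counts
touch a.xml b.xml c.json d.md e.md f.md

# One count per pattern. $pat is deliberately unquoted so the shell
# expands the glob; 2>/dev/null hides the error when nothing matches,
# in which case wc -l correctly reports 0.
for pat in '*.xml' '*.json' '*.md'; do
    printf '%s: %s\n' "$pat" "$(ls -d $pat 2>/dev/null | wc -l)"
done
```

Each iteration prints the pattern and its own count, which is what chaining separate ls -l statements could not give you through a single pipe.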
First, you need to use the -d option for ls, to keep it from expanding the items that are directories and showing all the files inside them. Second, you would need to cut the first line (the one that shows total 45 at the top), although -d suppresses that line as well. Third, using
ls -l *.xml && ls -l *.json && ls -l *.md
is equivalent to
ls -l *.xml *.json *.md
so you can avoid two calls to ls and two processes.
There's still an issue with ls: if there are no *.xml files at all, you will get *.xml: No such file or directory (on stderr, that is). This is because the * wildcard expansion is done by the shell, and the pattern is passed verbatim to ls if the shell is unable to find any file matching it.
So, finally, a way to do it would be:
ls -ld *.xml *.json *.md | wc -l
(no tail is needed here, because with -d there is no total line to strip).
Note:
I wouldn't use the -l option; it only makes the job harder. ls takes more time, as it has to stat(2) each file; it outputs the annoying total blocks line at the top, which you then have to eliminate; and you get a blank line between directory listings (suppressed by -d) that you would also have to remove. It may also skip unreadable files, since they cannot be stat(2)ed for the extra information, and you are not using that information for anything anyway. You can use just
ls -d *.xml *.json *.md | wc -l
to get only the names that match the patterns you put on the command line (with -d there is no used blocks line at the top either). When the output of ls is piped to another command, it is not grouped into columns the way it is when writing to a terminal.
Anyway, if I had to use some tool to count files, I would use find(1) instead (it allows far more flexibility in selecting files, and lets you search a directory structure recursively), as in:
find . \( -name "*.xml" -o -name "*.json" -o -name "*.md" \) -print | wc -l
or, if you want it to search only the current directory, limit the depth and find will not recurse into subdirectories (GNU find spells this -maxdepth 1; BSD find uses -depth 1):
find . -maxdepth 1 \( -name "*.xml" -o -name "*.json" -o -name "*.md" \) -print | wc -l
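One caveat with piping find into wc -l: wc counts newlines, so a filename that itself contains a newline inflates the total. A sketch of a newline-proof variant, assuming GNU find (its -printf action) and a throwaway /tmp/demo_find directory made up for the demo:

```shell
# Disposable directory with one file of each type plus a decoy.
rm -rf /tmp/demo_find
mkdir -p /tmp/demo_find
cd /tmp/demo_find
touch a.xml b.json c.md d.txt

# Print exactly one dot per matching file and count characters,
# which is immune to newlines embedded in filenames.
find . -maxdepth 1 \( -name '*.xml' -o -name '*.json' -o -name '*.md' \) -printf '.' | wc -c
```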

linux : listing files that contain several words

I'm trying to find a way to list all the files in a directory tree (recursively) that contain several words.
While searching I found examples such as egrep -R -l 'toto|tata' ., but | induces OR. I would like AND...
Thank you for your help
Using GNU grep with GNU xargs,
grep -Rl 'toto' . | xargs -r grep -l 'tata'
The first grep lists the files containing the pattern toto; that list is fed to xargs, and the second grep (note the -l there too) prints only the names of the files that also contain tata. The -r flag ensures the second grep doesn't run on an empty input.
The -r flag in xargs from the man page,
-r, --no-run-if-empty
If the standard input does not contain any nonblanks, do not run the command.
Normally, the command is run once even if there is no input. This option is a GNU
extension.
The agrep tool is designed to provide AND for grep, with the usage:
agrep 'pattern1;pattern2' file
In your case you could run
find . -type f -exec agrep 'toto;tata' {} \;   # add -l to display only the file names
PS1: For the current directory you can just run agrep 'pattern1;pattern2' *.*
PS2: Unfortunately, agrep does not support the -R option.
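If agrep isn't available, the two-grep pipeline above can be hardened against filenames containing spaces or newlines by using NUL separators (GNU grep's -Z with GNU xargs' -0). A sketch against a throwaway /tmp/demo_and directory invented for the demo:

```shell
# Disposable directory; only both.txt contains both words.
rm -rf /tmp/demo_and
mkdir -p /tmp/demo_and
cd /tmp/demo_and
printf 'toto\ntata\n' > both.txt
printf 'toto\n'       > "spaced name.txt"
printf 'tata\n'       > only_tata.txt

# -lZ prints NUL-terminated file names; xargs -0 splits on NUL so the
# space in "spaced name.txt" survives, and -r skips the second grep
# entirely when the first one matched nothing.
grep -rlZ 'toto' . | xargs -0 -r grep -l 'tata'
```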

How to know which file holds grep result?

There is a directory which contains 100 text files. I used grep to search for a given text in the directory as follows:
cat *.txt | grep Ya_Mahdi
and grep shows Ya_Mahdi.
I need to know which file holds the text. Is it possible?
Just get rid of cat and provide the list of files to grep:
grep Ya_Mahdi *.txt
While this would generally work, depending on the number of .txt files in that folder, the argument list for grep might get too large.
You can use find for a bulletproof solution:
find . -maxdepth 1 -name '*.txt' -exec grep -H Ya_Mahdi {} +
(With {} +, find passes the file names to grep in batches, so the argument list never gets too long; -H forces grep to print the file name even if it ends up being invoked with a single file.)
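A quick sketch of that command in action, using a throwaway /tmp/demo_grep directory made up for the demo:

```shell
# Disposable directory: one file contains the text, one does not.
rm -rf /tmp/demo_grep
mkdir -p /tmp/demo_grep
cd /tmp/demo_grep
printf 'Ya_Mahdi\n' > found.txt
printf 'nothing\n'  > other.txt

# Each matching line is prefixed with the file that holds it,
# answering "which file holds the text?" directly.
find . -maxdepth 1 -name '*.txt' -exec grep -H Ya_Mahdi {} +
```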

How to get my expected search result using grep on Linux

I am trying to find out which files contain the text 'roads' on the server, so I use this command: grep -rl 'roads'. But it shows lots of files like these:
./res/js/.svn/entries
./res/js/.svn/all-wcprops
./res/styles/.svn/entries
./res/styles/.svn/all-wcprops
./res/images/.svn/entries
I do not want the .svn folders to show up in the search result; they exist only for version control and mean nothing to me. Is there a way to drop every result containing .svn? E.g. if these three files contain the text 'roads':
check.php
./res/js/.svn/entries
./res/js/.svn/all-wcprops
Then the result only shows:
check.php
One simple way is to grep away your false positives:
grep -rl roads . | grep -v '/\.svn/'
If you want to be more efficient and not spend time searching through the SVN files, you can filter them away before grepping through them:
find -type f | grep -v '/\.svn/' | xargs grep -l roads
grep has an option to exclude a particular directory, --exclude-dir, so you can simply pass .svn as its value, like below:
grep -rl --exclude-dir=.svn -e roads .
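The option can be repeated to skip several version-control directories at once. A sketch against a throwaway /tmp/demo_excl tree invented for the demo (in bash/zsh, --exclude-dir={.svn,.git} is brace-expanded by the shell into the same two options):

```shell
# Disposable tree: the match inside .svn should be suppressed.
rm -rf /tmp/demo_excl
mkdir -p /tmp/demo_excl/.svn /tmp/demo_excl/src
printf 'roads\n' > /tmp/demo_excl/.svn/entries
printf 'roads\n' > /tmp/demo_excl/src/check.php
cd /tmp/demo_excl

# --exclude-dir may be given multiple times; matching directories
# are pruned from the recursive search entirely.
grep -rl --exclude-dir=.svn --exclude-dir=.git roads .
```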

How to list files on directory shell script

I need to list the files in a directory, but only the actual files, not the folders.
I just couldn't find a way to test whether an entry is a file or a directory...
Could someone provide a piece of script for that?
Thanks
How about using find?
To find regular files in the current directory and output a sorted list:
$ find -maxdepth 1 -type f | sort
To find anything that is not a directory (Note: there are more things than just regular files and directories in Unix-like systems):
$ find -maxdepth 1 ! -type d | sort
In the bash shell, test -f "$file" will tell you whether $file is a regular file (quote the variable so names containing spaces work):
if test -f "$file"; then echo "File"; fi
You can use ls -l | grep -v '^d' to get what you want. I tested it on Red Hat 9.0; it lists only the actual files (add -a if you also want hidden files, and note that the total line at the top is not filtered out by this).
If you want to list only the folders in a directory, not the files, you can use ls -l | grep '^d'
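A plain-shell alternative that avoids parsing ls output entirely, built on the same -f test mentioned above; the /tmp/demo_ls directory is a throwaway made up for the demo:

```shell
# Disposable directory with two files and one subdirectory.
rm -rf /tmp/demo_ls
mkdir -p /tmp/demo_ls/subdir
cd /tmp/demo_ls
touch file1 file2

# Glob over every entry and keep only regular files; quoting "$f"
# keeps names with spaces intact.
for f in *; do
    if [ -f "$f" ]; then printf '%s\n' "$f"; fi
done
```

This skips subdir but prints file1 and file2, with no risk from the total line or column layout that complicates filtering ls -l output.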
