How does the word count program count hidden files? - linux

After creating a few directories and hidden files and running the following commands
ls -al | wc -l
ls -a1 | wc -l
I get a difference in the total returned by the word count program. The
ls -al | wc -l
command returns a count that is one higher. Why is this?

$ ls -al | head -n 1
total 57600
This line is not shown with -1, since the total header comes from the long (-l) listing format.
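For illustration, in a hypothetical directory containing just two regular files, the two pipelines would report counts like these:
$ ls -al | wc -l        # total line + . + .. + 2 files
5
$ ls -a1 | wc -l        # . + .. + 2 files
4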

| is the pipe that connects the commands: the output of the left command, ls -al, becomes the input of the right command, wc -l.
The output of ls -al is just text, so wc -l counts it as if it were the contents of a file; the file names in that text are not passed to wc -l as arguments.
If you want the names treated as arguments, the xargs command is useful.
For example:
ls -a | xargs wc -l
# find command to find files
find ./* | xargs wc -l
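Note that ls -a also lists . and .., which wc -l cannot read as regular files, so you will see errors for those entries. A variant that feeds only regular files to wc (a sketch, assuming GNU find and xargs) is:
find . -maxdepth 1 -type f -print0 | xargs -0 wc -l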

Related

How to count all the files using ls -l statements separated by && in a single line?

I'm tyring to count all the files from several ls -l statements for certain file types separated by the double amperand symbol like so:
ls -l *.xml && ls -l *.json && ls -l *.md
The technique I've seen for a single file type (or all files) simply counts the lines that start with - using an egrep command: egrep -c '^-'
Here is a link for a single ls -l command that finds the count of all files: Link to a question about ls -l count using the egrep -c command on Stack Overflow.
If I run several ls -l statements for different file types on a single line, how do I count each statement's total in Linux using an sh or bash shell script?
I tried this and it doesn't seem to work:
ls -l *.xml && ls -l *.json && ls -l *.md | egrep -c '^-'
I also tried:
ls -l *.xml && ls -l *.json && ls -l *.md | grep -v /$ | wc -l
Unfortunately, it doesn't like the '&&' symbols that concatenate the results, and it also doesn't work with the '|' (pipe symbol) for some reason. The pipe symbol must not work with the '&&' symbol the way I'm using it.
I'm not quite sure I understood the objective correctly.
If you want a total number of all three types combined:
ls *.xml *.json *.md | wc -l
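If you instead want a separate total for each type, a small loop is one possible sketch (assuming a bash shell and file names without whitespace):
for pat in '*.xml' '*.json' '*.md'; do
    printf '%s: %s\n' "$pat" "$(ls -d $pat 2>/dev/null | wc -l)"   # count matches for this pattern
done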
First, you need the -d option for ls, so that items that are directories are not expanded into a listing of the files they contain. Second, watch out for the first line (the one that shows total 45 at the top): ls -l prints it when it lists a directory's contents, but not for explicit file operands, so with -d it does not appear at all. Third, using
ls -l *.xml && ls -l *.json && ls -l *.md
is equivalent to
ls -l *.xml *.json *.md
so you can avoid two calls to ls and two processes.
There's still an issue with ls: if there are no *.xml files at all, you will get (on stderr, that is) a *.xml: No such file or directory message. This is because the * wildcard expansion is done by the shell, and the pattern is passed to ls verbatim if the shell is unable to find any file matching it.
So, finally, a way to do so would be:
ls -ld *.xml *.json *.md | wc -l
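If some of the patterns may not match anything, redirecting stderr keeps those No such file or directory messages out of the way; unmatched patterns then contribute nothing to the count (a sketch):
ls -ld *.xml *.json *.md 2>/dev/null | wc -l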
Note:
You shouldn't use the -l option; it makes your task more difficult: ls takes more time because it has to stat(2) each file, it prints the annoying total blocks line at the top, which you then have to eliminate, and you get a blank line between directory listings (avoided if you specify option -d) that you would also have to strip. It will also skip unreadable files, since they cannot be stat(2)ed for their info, and you are not using that extra information for anything anyway. You can use just
ls -d *.xml *.json *.md | wc -l
to get only the names that match the patterns you put on the command line (and also avoid the used-blocks line at the top). If the output of ls is piped to another command, it doesn't group the output into columns, as it does when writing to a terminal.
Anyway, if I had to use some tool to count files, I would use find(1) instead (it allows far more flexibility in file selection and lets you search a directory structure recursively), as in:
find . \( -name "*.xml" -o -name "*.json" -o -name "*.md" \) -print | wc -l
or, if you want it to look only in the current directory, just add the option -maxdepth 1 and it will not descend recursively into subdirectories.
find . -maxdepth 1 \( -name "*.xml" -o -name "*.json" -o -name "*.md" \) -print | wc -l
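If file names could contain newlines, a count that does not rely on one name per line is safer; with GNU find you can print one character per match and count characters instead (a sketch):
find . -maxdepth 1 \( -name "*.xml" -o -name "*.json" -o -name "*.md" \) -printf '.' | wc -c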

Using one command line count the lines in the last file located in /etc in ubuntu

ls /etc | tail -1 | wc -l
So basically I used this command, but it counts the number of file names coming out of the tail command (which is just the last file in the directory, so 1), and I didn't get the number of lines that are in the file.
I also tried the cat command to open the file and count the lines, but it didn't work:
ls /etc | cat tail -1 | wc -l
ls /etc | tail -1 | cat |wc -l
You could use xargs to use the result of the tail as an argument for wc, although I'd recommend using find instead of ls so you get the full path and don't need to mess around with relative paths:
$ find /etc -type f | tail -1 | xargs wc -l
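If the reported file name could contain spaces, a null-delimited variant is safer (a sketch relying on the GNU -print0, tail -z and xargs -0 options):
$ find /etc -type f -print0 | tail -z -n 1 | xargs -0 wc -l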
You should never parse the output of ls (expand /etc/* in the shell instead; a sketch of that idea follows this answer).
$ wc -l < `find /etc -maxdepth 1 -type f | tail -n 1`
or
$ find /etc -maxdepth 1 -type f | tail -n 1 | xargs wc -l
What this does is find the last file in /etc and pass its contents to wc -l.
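As a rough sketch of the "expand /etc/* instead of parsing ls" idea (note that the last entry produced by the glob could be a directory rather than a regular file):
for f in /etc/*; do last="$f"; done   # let the shell expand and sort the names itself
wc -l < "$last"                       # count the lines of the last entry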

Counting files in a huge directory [duplicate]

Related to this question.
How do I count the number of files in a directory so huge that ls returns too many characters for the command line to handle?
$ ls 150_sims/combined/ | wc -l
bash: /bin/ls: Argument list too long
Try this:
$ find 150_sims/combined/ -maxdepth 1 -type f | wc -l
If you're sure there are no directories inside your directory, you can reduce the command to just:
$ find 150_sims/combined/ | wc -l
If there are no newlines in file names, a simple ls -A | wc -l tells you how many files there are in the directory. Note that if you have an alias for ls, it may trigger calls to stat (for example, ls --color and ls -F need to know the file type, which requires a stat call), so from the command line use command ls -A | wc -l or \ls -A | wc -l to bypass the alias.
ls -A 150_sims/combined | wc -l
If you are interested in counting both the files and directories you can try something like this:
\ls -afq 150_sims/combined | wc -l
This includes . and .., so you need to subtract 2 from the count:
echo $(\ls -afq 150_sims/combined | wc -l) - 2 | bc
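The same subtraction can be done with shell arithmetic instead of an extra bc process (a small sketch):
echo $(( $(\ls -afq 150_sims/combined | wc -l) - 2 ))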

Return the number of directories

I have this piped command that tells me how many directories are inside the current directory:
ls -lR | grep ^d | wc -l
But is there a way to check for a given directory? Something like:
ls -lR | grep ^d | wc -l /folder1/?
I think you're just passing /folder1 to the wrong command:
ls -lR /folder1 | grep ^d | wc -l
I suggest you use find. With -type d you can tell find to search only for directories. Like so
find /folder1 -type d | wc -l
The advantage is that you can easily change this to retrieve the names of the directories and also act on them with -exec.
The drawback is that this command also counts the directory /folder1 (or ./) itself, but that's easily circumvented:
find /folder1 -mindepth 1 -type d | wc -l
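If you only want the immediate subdirectories of /folder1 rather than the whole tree, a hedged variant also limits the depth:
find /folder1 -mindepth 1 -maxdepth 1 -type d | wc -l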

Problems with ls -l | grep combination

I want to see all files that end with .sh through an ls -l | grep combination. The problem is that it has to show only the filename, no other attributes. How do I do this with an ls -l | grep combination?
You can solve this in two ways, one with grep and one without grep:
ls -a | grep "\.sh$"
or
ls *.sh
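Note that if an entry ending in .sh happens to be a directory, ls will list its contents; adding -d keeps the output to the matching names themselves (a small sketch):
ls -d *.sh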

Resources