Show first 5 lines of every file without the name - linux

I need to show the first 5 lines of every file inside my home folder, but without showing the name of the file. I know it has something to do with the head -n 5 command, and I know I can list files using ls -al | grep ^-, but I don't know how to combine that knowledge to solve my problem. Any tips?

This uses find to find all regular files in the home directory (without recursing into subdirectories) and passes them on to head; the -q flag suppresses the ==> filename <== headers that head would normally print when given multiple files:
find ~ -maxdepth 1 -type f -exec head -q -n 5 '{}' '+'
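Note that -q is a GNU extension; some BSD variants of head don't support it. In that case, a sketch of the same idea using sed, which prints the first five lines of each file and never prints a header:
find ~ -maxdepth 1 -type f -exec sed -n '1,5p' '{}' \;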

Related

how to efficiently find if a Linux directory including subdirectories has at least 1 file

In my project, various jobs are created as files in directories inside subdirectories.
But usually I find that the jobs are concentrated in a few directories and absent from most of the others.
Currently I use
find $DIR -type f | head -n 1
to check whether a directory has at least one file, but this seems wasteful.
How can I efficiently find whether a Linux directory, including its subdirectories, has at least one file?
Your code is already efficient, but perhaps the reason is not obvious. When you pipe the output of find to head -n 1, you probably assume that find lists all the files and then head discards everything after the first one. But that's not quite what happens.
When find lists the first file, head will print it and exit. When find then writes the second file name, it receives SIGPIPE, because the read end of the pipe between them is closed. That stops find, because the default signal handler for SIGPIPE terminates the program which receives it.
So the cost of your pipelined commands is only the cost of finding two files, not the cost of finding all files. For most obvious use cases this should be good enough.
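If you'd rather not rely on SIGPIPE at all, GNU find can stop by itself after the first match with -quit. A minimal sketch, assuming GNU find and a $DIR variable holding the directory to test:
# -print -quit makes find exit right after printing the first regular file it sees
if [ -n "$(find "$DIR" -type f -print -quit)" ]; then
    echo "at least one file found"
fi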
Try this:
find -type f -printf '%h\n' | uniq
The find part finds all files but prints only each file's directory. The uniq part eliminates adjacent duplicates (for a fully deduplicated list you could use sort -u instead).
Pitfall: like your example, it doesn't work for files whose directory path contains a newline.
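If you have GNU find and coreutils, a NUL-separated sketch sidesteps that newline pitfall (note that sort -zu also catches non-adjacent duplicates, which plain uniq would miss):
# print each file's directory NUL-terminated, dedupe, then convert back to lines for display
find . -type f -printf '%h\0' | sort -zu | tr '\0' '\n'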
This command finds the first subdirectory containing at least one file and then stops:
find . -mindepth 1 -type d -exec bash -c 'c=$(find {} -maxdepth 1 -type f -print -quit);test "x$c" != x' \; -print -quit
The first find iterates through all subdirectories, and the second find finds the first file and then stops.

How to search for files ending/starting/containing a certain letter in terminal?

I have been looking all over the internet for help with this. I want to list all files that start/end/contain a certain letter, but the results I found on the internet do not seem to work for me. I need to use the ls command for this (it's an assignment).
I tried this code from another question:
ls abc*   # list all files starting with abc
ls *abc*  # list all files containing abc
ls *abc   # list all files ending with abc
but whenever I try any of those it comes back with "ls: cannot access '*abc': No such file or directory".
Use find for finding files:
find /path/to/folder -maxdepth 1 -type f -name 'abc*'
This will give you all regular filenames within /path/to/folder which start with abc.
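(The error you saw simply means the glob matched nothing, so the shell passed the literal string *abc on to ls.) The same -name pattern handles all three of your cases; a sketch, using abc as the stand-in pattern:
find /path/to/folder -maxdepth 1 -type f -name 'abc*'   # starting with abc
find /path/to/folder -maxdepth 1 -type f -name '*abc'   # ending with abc
find /path/to/folder -maxdepth 1 -type f -name '*abc*'  # containing abc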

Counting number of files in a directory with an OSX terminal command

I'm looking for a command that returns a specific directory's file count as a number. I would type it into the terminal and it would give me the specified directory's file count.
I've already tried echo find "'directory' | wc -l" but that didn't work. Any ideas?
You seem to have the right idea. I'd use -type f to find only files:
$ find some_directory -type f | wc -l
If you only want files directly under this directory and not to search recursively through subdirectories, you could add the -maxdepth flag:
$ find some_directory -maxdepth 1 -type f | wc -l
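One caveat: wc -l counts newlines, so a file name that itself contains a newline would be counted more than once. If that edge case matters, a NUL-based sketch that counts the separators instead of the lines:
$ find some_directory -type f -print0 | tr -cd '\0' | wc -c
Here tr -cd '\0' deletes everything except the NUL separators, and wc -c counts them, one per file.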
Open the terminal and switch to the location of the directory.
Type in:
find . -type f | wc -l
This searches inside the current directory (that's what the . stands for) for all files, and counts them.
The fastest way to obtain the number of files within a directory is by obtaining the value of that directory's kMDItemFSNodeCount metadata attribute.
mdls -name kMDItemFSNodeCount directory_name -raw | xargs
The above command has a major advantage over find . -type f | wc -l in that it returns the count almost instantly, even for directories which contain millions of files.
Please note that the command obtains the count of every node within the directory (subdirectories, symlinks, and so on included), not just regular files.
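A hypothetical run, with the directory name and the resulting count made up for illustration:
$ mdls -name kMDItemFSNodeCount some_directory -raw | xargs
2842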
I don't understand why folks are using find, because for me it's a lot easier to just pipe ls, like so:
ls *.png | wc -l
to find the number of png images in the current directory.
I'm using tree; this is the way:
tree ph
(tree ends its output with a summary line giving the directory and file counts.)

Regarding searching for a keyword in all files in a particular directory in Linux

I want to search for a word, say "abcd", in all the files (including hidden files and every other kind of file) in a directory, say /home/john/.
This is what I tried; I am running the command below, and it has been stuck for more than 24 hours:
find /home/john -type f -exec grep -iH 'abcd' {} \;
The result should show all the files which contain this particular word, or any file whose name is the search word.
Thanks
What about using grep's recursive option?
grep -r abcd /home/john
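If you also want to keep the case-insensitive behaviour of your original grep -iH, and perhaps list only the matching file names rather than every matching line, grep's -i and -l flags combine with -r:
grep -ril abcd /home/john   # case-insensitive; print only the names of matching files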

find -exec doesn't recognize argument

I'm trying to count the total lines in the files within a directory. To do this I am trying to use a combination of find and wc. However, when I run find . -exec wc -l {}\;, I receive the error find: missing argument to -exec. I can't see any apparent issues; any ideas?
You simply need a space between {} and \;
find . -exec wc -l {} \;
Note that if there are any sub-directories under the current location, wc will generate an error message for each of them that looks something like this:
wc: ./subdir: Is a directory
To avoid that problem, you may want to tell find to restrict the search to files:
find . -type f -exec wc -l {} \;
One more note: it was a good idea to use the -exec option. Too often, people pipe commands together expecting the same result; here, for instance, that would be:
find . -type f | xargs wc -l
The problem with piping commands in such a manner is that it breaks as soon as a file name contains spaces. For instance, if a file were named "a b", wc would receive "a" and then "b" separately, and you would get two error messages: a: no such file and b: no such file.
Unless you know for a fact that your file names never contain spaces (or other non-printable characters), when you do need to pipe commands together, tell all the tools in the pipeline to use the NUL character (\0) as a separator instead of whitespace. The previous command then becomes:
find . -type f -print0 | xargs -0 wc -l
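A side benefit of batching the arguments (either via xargs or via -exec ... +): when wc receives several file names at once it also prints a total line, so you get the per-file counts plus a grand total (one total per batch, if the list is long enough for find to split it):
find . -type f -exec wc -l {} +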
With version 4.0 or later of bash, you don't need your find command at all:
shopt -s globstar
wc -l **/*
There's no simple way to skip directories, which, as pointed out by Gui Rava, you might want to do, unless you can differentiate files and directories by name alone. For example, maybe directories never have . in their name, while all the files have at least one extension:
wc -l **/*.*
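If that naming trick doesn't hold for your tree, a more verbose sketch that skips directories explicitly with a test inside a bash loop (noticeably slower, since it runs wc once per file):
shopt -s globstar
total=0
for f in **/*; do
    # only count regular files; globstar matches directories too
    [ -f "$f" ] && total=$(( total + $(wc -l < "$f") ))
done
echo "$total"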
