Under the Mac terminal: list all of the users who have at least one running process (Linux)

How do I list all of the users who have at least one running process?
The user names should not be duplicated.
The user names should be sorted.

$ ps xau | cut -f1 -d " " | sort | uniq | tail -n +2
You may want to weed out names starting with _ as well, like so:
ps xau | cut -f1 -d " " | sort | uniq | grep -v ^_ | tail -n +2
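If your ps supports the standard -o format option, a variant worth knowing (a sketch, not from the original answers) sidesteps the header line and the fixed-width column parsing entirely:

```shell
# -e selects every process; `-o user=` prints only the user column
# with an empty header, so no tail/cut gymnastics are needed.
ps -eo user= | sort -u
```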

users does what is requested. From the man page:
users lists the login names of the users currently on the system, in
sorted order, space separated, on a single line.
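Since users emits everything on a single space-separated line, a small follow-up sketch turns it into the deduplicated one-name-per-line form the question asks for:

```shell
# users prints all login names on one line, space separated;
# tr splits them onto separate lines and sort -u dedupes.
users | tr ' ' '\n' | sort -u
```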

Try this:
w -h | cut -d' ' -f1 | sort | uniq
w -h displays all logged-in users, without the header line. The cut part strips everything except the username, and uniq removes duplicate lines.

Related

How to use the sort, cut, and uniq commands in a pipe

I was wondering how you use the cut, sort, and uniq commands in a pipeline to build a command line that indicates how many users are using each of the shells mentioned in /etc/passwd.
I'm not sure if this is right, but:
cut -f1 -d':' /etc/passwd | sort -n | uniq
?
Summarizing the answers excruciatingly hidden in comments:
You were close, only:
- as tripleee noticed, the shell is in the seventh field
- as shellter noticed, since the shells are not numbers, -n is useless
- as shellter noticed, for the counting, there's uniq -c
That gives
cut -f7 -d: /etc/passwd | sort | uniq -c
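To see what that produces, here is the same pipeline run against a few made-up /etc/passwd-style lines (the user names and shells are purely illustrative):

```shell
# cut -f7 -d: extracts the shell field; uniq -c prefixes each
# distinct shell with its number of occurrences.
cut -f7 -d: <<'EOF' | sort | uniq -c
root:x:0:0:root:/root:/bin/bash
daemon:x:1:1:daemon:/usr/sbin:/usr/sbin/nologin
alice:x:1000:1000::/home/alice:/bin/bash
bob:x:1001:1001::/home/bob:/bin/zsh
EOF
```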

WHM server access logs for all accounts

I've had a lot of issues with hacks and DDoS attacks on a few servers, though this is usually caused by some very simple things. However, I've found it invaluable to be able to look through an account's access logs and list the hit pages in order from lowest to highest, using the following over SSH:
cat example.co.uk | cut -d\" -f2 | awk '{print $1 " " $2}' | cut -d? -f1 | sort | uniq -c | sort -n
However, this means I need to run it against every single account's access log. Is there a server-wide version or script out there to scan all access logs for activity?
You can use your command in a for loop to check all of the domain access log files:
for i in `cat /etc/trueuserdomains | awk {'print $1'} | cut -d":" -f1`;do echo "Pages list Of $i" ; cat /usr/local/apache/domlogs/$i* | grep GET | cut -d\" -f2 | awk '{print $1 " " $2}' | cut -d? -f1 | sort | uniq -c | sort -n;done > /root/report.txt
Once it's done, please check your /root/report.txt file.

How to grep only one of each address (Linux)

Okay, so let's say I have a list of addresses in a text file like this:
https://www.amazon.com
https://www.google.com
https://www.msn.com
https://www.google.com
https://www.netflix.com
https://www.amazon.com
...
There is a whole bunch of other stuff there but basically the issue I am having is that after running this:
grep "https://" addresses.txt | cut -d"/" -f3
I get amazon.com and google.com twice. I want to only get them once. I don't know how to make the search only grep for things that are unique.
Pipe your output to sort and uniq:
grep "https://" addresses.txt | cut -d"/" -f3 | sort | uniq
You can use sort for this purpose.
Just add another pipe to your command and use sort's unique feature to remove duplicates:
grep 'https://' addresses.txt | cut -d"/" -f3 | sort -u
EDIT: you can use sed instead of grep and cut which would reduce your command to
sed -n 's#https://\([^/]*\).*#\1#p' < addresses.txt | sort -u
I would filter the results post-grep.
e.g. using sort -u to sort and then produce a set of unique entries.
You can also use uniq for this, but the input has to be sorted in advance.
This is the beauty of being able to pipe these utilities together. Rather than have a single grepping/sorting/uniq(ing) tool, you get the distinct executables, and you can chain them together how you wish.
grep "https://" addresses.txt | cut -d"/" -f3 | sort | uniq is what you want
With awk you can use just one Unix command instead of four commands with three pipes:
awk 'BEGIN {FS="://"}; { myfilter = match($1,/https/); if (myfilter) loggeddomains[$2]=0} END {for (mydomains in loggeddomains) {print mydomains}}' addresses.txt
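One caveat: awk's for (... in ...) iteration order is unspecified, so pipe the result through sort when you need deterministic output. A quick check of the one-liner against a fabricated sample (note that the plain-http line is correctly excluded):

```shell
# Same one-liner, fed from a here-doc; sort gives a stable order
# since awk's array iteration order is unspecified.
awk 'BEGIN {FS="://"}; { myfilter = match($1,/https/);
     if (myfilter) loggeddomains[$2]=0}
     END {for (mydomains in loggeddomains) {print mydomains}}' <<'EOF' | sort
https://www.amazon.com
https://www.google.com
https://www.amazon.com
http://insecure.example
EOF
```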

The number of processes a user is running using bash

I would like to know how I could get the number of processes for each user that is currently logged in.
You could try some variation of this:
ps haux Ou | cut '-d ' -f1 | uniq -c
It gives you the number of processes for each user (logged in or not). You can then filter those results using the output of the w command, or another way of determining who is logged in.
Give this a try:
ps -u "$(echo $(w -h | cut -d ' ' -f1 | sort -u))" o user= | sort | uniq -c | sort -rn
In order to properly handle usernames that may be longer than eight characters, use users instead of w. The latter truncates usernames.
ps -u "$(echo $(printf '%s\n' $(users) | sort -u))" o user= | sort | uniq -c | sort -rn
ps -u aboelnour | awk 'END {print NR}'
will show the number of processes that the user aboelnour is running (note that NR includes ps's header line, so subtract one for an exact count).
If you are ever concerned about nearing the user process limit shown by ulimit -a, then you want to count ALL the processes, including LWPs (threads). In such a case you should use:
ps h -Led -o user | sort | uniq -c | sort -n
On one system doing this:
ps haux Ou | cut '-d ' -f1 | uniq -c
yields:
# ps haux Ou | cut '-d ' -f1 | uniq -c
30 user1
1 dbus
3 user2
1 ntp
1 nut
1 polkitd
2 postfix
124 root
2 serv-bu+
whereas the ps h -Led version yields the true process count:
# ps h -Led -o user | sort | uniq -c | sort -n
1 ntp
1 nut
2 dbus
2 postfix
2 serv-builder
3 user2
6 polkitd
141 root
444 user1
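The counts differ because -L emits one line per thread (LWP) rather than per process. A small sketch for comparing the two totals on a live system (the exact numbers will of course vary):

```shell
# One line per process vs. one line per thread; every process has
# at least one thread, so the thread total is normally the larger.
procs=$(ps -e -o pid= | wc -l)
threads=$(ps -eL -o lwp= | wc -l)
echo "$procs processes, $threads threads"
```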
Just try:
lslogins -o USER,PROC
If you just want a count of processes, you can use procfs directly (requires Linux 2.2 or greater).
You can use wc:
number_of_processes=`echo /proc/[0-9]* | wc -w`
Or do it in pure bash (no external commands) like this:
procs=( /proc/[0-9]* )
number_of_processes=${#procs[*]}
If you only want processes owned by the current user (only your own /proc/PID/fd directories are accessible):
procs=( /proc/[0-9]*/fd/. )
number_of_processes=${#procs[*]}
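One caveat with the glob approach: if the pattern matched nothing, the array would hold the literal pattern text. On Linux /proc/[0-9]* always matches something, but enabling nullglob makes the sketch robust anyway:

```shell
#!/usr/bin/env bash
# nullglob makes an unmatched glob expand to nothing rather than
# to the literal pattern text.
shopt -s nullglob
procs=( /proc/[0-9]* )
echo "${#procs[@]}"
```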
userlist=$(w|awk 'BEGIN{ORS=","}NR>2{print $1}'|sed 's/,$//' )
ps -u "$userlist"
The following links contain useful ps command options covering your requirements:
Displaying all processes owned by a specific user
Show All Running Processes in Linux
Here is my solution, for Linux:
$ find /proc -maxdepth 1 -user "$USER" -name '[0-9]*' | wc -l
This solution will not fail when the number of processes is larger than the command line limit.

Sorting in bash

I have been trying to get the unique values in each column of a tab delimited file in bash. So, I used the following command.
cut -f <column_number> <filename> | sort | uniq -c
It works fine and I can get the unique values in a column and its count like
105 Linux
55 MacOS
500 Windows
What I want to do is, instead of sorting by the column value names (which in this example are OS names), sort them by count, and possibly have the count in the second column of the output. So it will have to look like:
Windows 500
MacOS 105
Linux 55
How do I do this?
Use:
cut -f <col_num> <filename> \
    | sort \
    | uniq -c \
    | sort -r -k1 -n \
    | awk '{print $2" "$1}'
The sort -r -k1 -n sorts in reverse order, using the first field as a numeric value. The awk simply reverses the order of the columns. You can test the added pipeline commands thus (with nicer formatting):
pax> echo '105 Linux
55 MacOS
500 Windows' | sort -r -k1 -n | awk '{printf "%-10s %5d\n",$2,$1}'
Windows 500
Linux 105
MacOS 55
Mine:
cut -f <column_number> <filename> | sort | uniq -c | awk '{ print $2" "$1}' | sort
This will alter the column order (awk) and then sort the output alphabetically by name; pipe it to sort -k2 -rn instead if you want it ordered by count.
Hope this will help you
Using sed with tagged regular expressions (note the ^ * added to skip the leading spaces that uniq -c emits):
cut -f <column_number> <filename> | sort | uniq -c | sort -r -k1 -n | sed 's/^ *\([0-9]*\)[ ]*\(.*\)/\2 \1/'
Doesn't produce output in a neat format though.
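A quick demonstration of the field swap on fabricated uniq -c-style input, with its leading spaces included and a ^ * anchor to consume them:

```shell
# Swap "count name" lines into "name count", tolerating the
# leading spaces that uniq -c puts before each count.
printf '%s\n' '    500 Windows' '    105 Linux' '     55 MacOS' |
  sed 's/^ *\([0-9]*\)[ ]*\(.*\)/\2 \1/'
```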
