I'm trying to build a script that checks that the windows under XFCE are minimized before displaying a window I've chosen (it's part of a bigger project).
I tried to get the count of open windows with wmctrl, but that doesn't tell me which of them are minimized:
CURRWORKSPACE=$(wmctrl -d | grep '*' | cut -d ' ' -f1)
OPENWINDOWS=$(wmctrl -l | cut -d ' ' -f3 | grep $CURRWORKSPACE | wc -l)
I also tried with xdotool, but without success :(
I was wondering if you knew of any way to get this information.
I'm on XFCE, but a way using any other tool would be great too.
Many thanks!
Given a window and its id as listed by wmctrl you can use the following function to determine whether that window is minimized. Note that minimized windows are called iconic in X.
# usage: isMinimized <windowId>
# returns status 0 if and only if window with given id is minimized
isMinimized() {
    xprop -id "$1" | grep -Fq 'window state: Iconic'
}
In order to count open windows you can loop over the list of window ids.
openWindows() {
    count=0
    for id in $(wmctrl -l | cut -f1 -d' '); do
        isMinimized "$id" || ((count++))
    done
    echo "$count"
}
At least on my desktop environment (Cinnamon) some "windows" are always open, for instance the Desktop itself. I adjusted the function by filtering out these windows before looping over them. Since they were sticky and I normally don't use sticky windows, I simply ignored all sticky windows: $(wmctrl -l | grep -vE '^0x\w* -1' | cut -f1 -d' ').
You can adjust the filtering to your needs. As written, all open, non-sticky windows on all workspaces/desktops are counted.
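Putting it together for the use case in the question: a minimal sketch, relying on the isMinimized helper above, that counts only the non-minimized windows on the current workspace and activates your chosen window once that count reaches zero. CHOSEN_WIN is a placeholder for the id of the window you want to display.

# prints the number of non-minimized windows on the current workspace,
# skipping sticky windows (desktop -1) and windows on other workspaces
CURRWORKSPACE=$(wmctrl -d | awk '/\*/ {print $1}')

openOnCurrent() {
    count=0
    while read -r id desk _; do
        [ "$desk" = "$CURRWORKSPACE" ] || continue
        isMinimized "$id" || ((count++))
    done < <(wmctrl -l)
    echo "$count"
}

if [ "$(openOnCurrent)" -eq 0 ]; then
    wmctrl -i -a "$CHOSEN_WIN"   # raise/activate the chosen window by id
fi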
How can I use a single ssh session to run two commands and save their output in two different files? For example, one command is ps -ef | grep Consumer | cut -f6 -d' ' and its output should go to file.log; the second command is ps -ef | grep Test | cut -f7 -d' ' and its output should go to test.log.
Only ps -ef needs to be run on the remote system. Parsing the output can happen at local system.
It's easier with awk, and needs just a single ssh session and a single ps -ef snapshot:
ssh user@host ps -ef | awk -F' ' '/Consumer/{print $6 > "file.log"}; /Test/{print $7 > "test.log"}'
The grep+cut steps can happen inside awk: '/pattern/{print $n}'.
File redirection can also happen easily within awk; check the syntax in the answer above.
I would rather do the parsing on the remote system only: ps -ef produces a lot of output, and if we don't parse and cut it there, the entire output is transferred from the remote system to the local one over the network. That takes more time as the output grows, and we don't even need the full output locally, so it is better to parse it on the remote side.
ssh user@host "ps -ef | grep Consumer | cut -f6 -d' ' > file.log; ps -ef | grep Test | cut -f7 -d' ' > test.log"
(Note the quotes: without them the pipelines and the second ps -ef would run on the local machine. With them everything runs remotely, and the log files are created on the remote host.)
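If the log files have to end up on the local machine while the parsing still happens remotely, one possible sketch (host name, patterns and field numbers are simply the ones from the question) is to tag each line on the remote side and demultiplex the single stream locally:

# remote awk prefixes each matching line with a tag (C or T);
# the local awk splits the stream back into the two files
ssh user@host "ps -ef | awk '/Consumer/{print \"C\", \$6}; /Test/{print \"T\", \$7}'" \
    | awk '$1 == "C" {print $2 > "file.log"} $1 == "T" {print $2 > "test.log"}'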
I need to create a shell script that lists processes grouped by status type.
The output must be something like:
Process running:
[process]
Process sleeping:
[process]
ETC
I did this, but the ps aux | awk '$8 ~ PROCESS' part doesn't work:
for PROCESS in `ps -v | awk 'NR!=1 {print $2}' | sort -u`; do
    echo "Processes as $PROCESS:"
    ps aux | awk '$8 ~ PROCESS'
done
That script outputs all the processes instead of filtering them by status.
Any help?
A simple solution would be to use ps and sort:
ps u | sort -rk 8
-r reverses the sort (so that the list header remains above), and -k 8 selects the 8th field (STAT).
You can then select processes in a specific state using anything from head to awk, and print out whatever you like.
You can also use top in non-interactive mode (the -S option to display and sort by state):
top -b -n 1 -S
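If you would rather keep the loop structure from the question, the usual fix is to pass the shell variable into awk explicitly with -v; inside single quotes awk otherwise sees PROCESS as an (empty, uninitialized) awk variable, and matching against an empty pattern lets every line through. A rough sketch, assuming the STAT column is field 8 of ps aux:

# group processes by the first letter of the STAT column (field 8 of `ps aux`)
for STATE in $(ps aux | awk 'NR > 1 {print substr($8, 1, 1)}' | sort -u); do
    echo "Processes with status $STATE:"
    ps aux | awk -v st="$STATE" 'NR > 1 && substr($8, 1, 1) == st'
done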
Is there any way to specify a field delimiter of "one or more spaces" for the cut command (something like " "+)?
For example: in the following output, I'd like to reach the value '3744'; what field delimiter should I use?
$ ps axu | grep jboss
jboss 2574 0.0 0.0 3744 1092 ? S Aug17 0:00 /bin/sh /usr/java/jboss/bin/run.sh -c example.com -b 0.0.0.0
cut -d' ' is not what I want, since it only handles a single space.
awk is not what I am looking for either; how can this be done with cut?
Thanks.
Actually awk is exactly the tool you should be looking into:
ps axu | grep '[j]boss' | awk '{print $5}'
or you can ditch the grep altogether since awk knows about regular expressions:
ps axu | awk '/[j]boss/ {print $5}'
But if, for some bizarre reason, you really can't use awk, there are simpler things you can do, like collapsing all whitespace to a single space first:
ps axu | grep '[j]boss' | sed 's/\s\s*/ /g' | cut -d' ' -f5
That grep trick, by the way, is a neat way to get only the jboss processes and not the grep jboss one (the same goes for the awk variant).
The grep process will have a literal grep [j]boss in its process command so will not be caught by the grep itself, which is looking for the character class [j] followed by boss.
This is a nifty way to avoid the | grep xyz | grep -v grep paradigm that some people use.
The awk version is probably the best way to go, but you can also use cut if you first squeeze the repeated spaces with tr:
ps axu | grep jbos[s] | tr -s ' ' | cut -d' ' -f5
#        ^^^^^^^^^^^^   ^^^^^^^^^   ^^^^^^^^^^^^^
#             |             |             |
#             |             |       get 5th field
#             |             |
#             |       squeeze spaces
#             |
#  avoid grep itself appearing in the list
I like to use the tr -s command for this:
ps aux | tr -s '[:blank:]' | cut -d' ' -f3
This squeezes all whitespace down to a single space. That way, telling cut to use a space as a delimiter is honored as expected.
I am going to nominate tr -s '[:blank:]' as the best answer.
Why do we want to use cut? It has the magic option that says "we want the third field and every field after it, omitting the first two fields":
cat log | tr -s '[:blank:]' | cut -d' ' -f 3-
I do not believe there is an equivalent for awk or perl's split where we do not know how many fields there will be, i.e. output the 3rd field through field X.
Shorter/simpler solution: use cuts ("cut on steroids" I wrote):
ps axu | grep '[j]boss' | cuts 4
Note that cuts field indexes are zero-based, so the 5th field is specified as 4.
http://arielf.github.io/cuts/
And even shorter (not using cut at all) is:
pgrep jboss
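If what you are ultimately after is the VSZ value from the question (3744) rather than the PID, one possible sketch is to feed pgrep's output back into ps -o (this assumes at least one jboss process exists; otherwise ps -p is handed an empty list and complains):

# print the VSZ of every jboss process, without a header (the = suppresses it)
ps -o vsz= -p "$(pgrep -d, jboss)"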
One way around this is to go:
$ ps axu | grep jboss | sed 's/\s\+/ /g' | cut -d' ' -f3
to replace multiple consecutive spaces with a single one.
Personally, I tend to use awk for jobs like this. For example:
ps axu | grep jboss | grep -v grep | awk '{print $5}'
As an alternative, there is always perl:
ps aux | perl -lane 'print $F[3]'
Or, if you want to get all fields starting at field #3 (as stated in one of the answers above):
ps aux | perl -lane 'print @F[3 .. scalar @F]'
If you want to pick columns from a ps output, any reason to not use -o?
e.g.
ps ax -o pid,vsz
ps ax -o pid,cmd
Minimum column width allocated, no padding, only a single-space field separator:
ps ax --no-headers -o pid:1,vsz:1,cmd
3443 24600 -bash
8419 0 [xfsalloc]
8420 0 [xfs_mru_cache]
8602 489316 /usr/sbin/apache2 -k start
12821 497240 /usr/sbin/apache2 -k start
12824 497132 /usr/sbin/apache2 -k start
Pid and vsz given 10 char width, 1 space field separator.
ps ax --no-headers -o pid:10,vsz:10,cmd
      3443      24600 -bash
      8419          0 [xfsalloc]
      8420          0 [xfs_mru_cache]
      8602     489316 /usr/sbin/apache2 -k start
     12821     497240 /usr/sbin/apache2 -k start
     12824     497132 /usr/sbin/apache2 -k start
Used in a script:
oldpid=12824
echo "PID: ${oldpid}"
echo "Command: $(ps -ho cmd ${oldpid})"
Another way, if you must use the cut command:
ps axu | grep [j]boss | awk '$1=$1' | cut -d' ' -f5
(Assigning $1=$1 makes awk rebuild the line with its output field separator, a single space, so cut -d' ' then works reliably.)
In Solaris, replace awk with nawk or /usr/xpg4/bin/awk.
I still like the way Perl handles fields with white space.
First field is $F[0].
$ ps axu | grep dbus | perl -lane 'print $F[4]'
My approach is to store the PID in a file under /tmp and to find the right process using the -S option of ssh. That might be a misuse, but it works for me.
#!/bin/bash
TARGET_REDIS=${1:-redis.someserver.com}
PROXY="proxy.somewhere.com"
LOCAL_PORT=${2:-6379}
if [ "$1" == "stop" ] ; then
kill `cat /tmp/sshTunel${LOCAL_PORT}-pid`
exit
fi
set -x
ssh -f -i ~/.ssh/aws.pem centos@$PROXY -L $LOCAL_PORT:$TARGET_REDIS:6379 -N -S /tmp/sshTunel$LOCAL_PORT ## AWS DocService dev, DNS alias
# SSH_PID=$! ## Only works with &
SSH_PID=`ps aux | grep sshTunel${LOCAL_PORT} | grep -v grep | awk '{print $2}'`
echo $SSH_PID > /tmp/sshTunel${LOCAL_PORT}-pid
A better approach might be to query for SSH_PID right before killing it, since the file might be stale and we would kill the wrong process.
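An alternative sketch, assuming you let ssh itself act as a multiplexing master (-M): the tunnel can then be checked and torn down through the control socket, so there is no need to grep ps for a PID at all. The variables are the same placeholders as in the script above.

SOCKET=/tmp/sshTunel${LOCAL_PORT}

# start the tunnel as a multiplexing master (-M) with a control socket (-S)
ssh -f -N -M -S "$SOCKET" -i ~/.ssh/aws.pem \
    -L "${LOCAL_PORT}:${TARGET_REDIS}:6379" "centos@${PROXY}"

# check that the master is still alive
ssh -S "$SOCKET" -O check "centos@${PROXY}"

# tear the tunnel down cleanly (replaces the kill-by-PID logic)
ssh -S "$SOCKET" -O exit "centos@${PROXY}"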
How can I list all of the users who have at least one running process?
The user names should not be duplicated.
The user names should be sorted.
$ ps xau | cut -f1 -d " " | sort | uniq | tail -n +2
You may want to weed out names starting with _ as well, like so:
ps xau | cut -f1 -d " " | sort | uniq | grep -v '^_' | tail -n +2
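An alternative sketch that skips the header handling entirely, by asking ps for just the user column (the trailing = suppresses the column header):

# every user owning at least one process, sorted, without duplicates
ps -eo user= | sort -u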
users does what is requested. From the man page:
users lists the login names of the users currently on the system, in
sorted order, space separated, on a single line.
Try this:
w -h | cut -d' ' -f1 | sort | uniq
w -h displays all users on the system, without the header. The cut part removes everything except the username, sort orders the names, and uniq ignores duplicate lines.