How to overwrite previous output in bash - linux

I have a bash script that outputs the most CPU-intensive processes to the terminal every second.
tmp=$(ps -e -eo pid,cmd,%mem,%cpu,user --sort=-%cpu | head -n 11)
printf "\n%s\n" "$tmp"
I know that I can move the cursor to a predeclared position, but that fails whenever the terminal is not cleared.
I could also just go to the beginning of the line and write over it, but that again causes a problem when the current output is shorter than the previous one, or when the number of lines differs from the previous output.
Is there a way to completely erase the previous output and write from there?

Yes, you can clear part of the screen before each iteration (see https://unix.stackexchange.com/questions/297502/clear-half-of-the-screen-from-the-command-line), but the watch utility does this for you. Try:
watch -n 1 "ps -e -eo pid,cmd,%mem,%cpu,user --sort=-%cpu | head -n 11"
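If you would rather stay inside your own bash loop instead of watch (for example, to keep custom printf formatting), here is a rough sketch of the clear-part-of-the-screen approach using tput. It assumes a terminfo-capable terminal and reuses the ps command from the question; the loop itself is my addition, not from the thread:
#!/bin/bash
# Sketch: redraw in place. "tput cup 0 0" homes the cursor and "tput ed"
# erases from the cursor to the end of the screen, so shorter output
# does not leave stale lines behind.
tput clear
while true; do
    tput cup 0 0
    tput ed
    ps -eo pid,cmd,%mem,%cpu,user --sort=-%cpu | head -n 11
    sleep 1
done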

Related

Bash script- number of processes sorted

I'm trying to improve my bash on my road to becoming a DevOps engineer.
One of my exercises states that I should be able to:
1-Write a bash script using Vim editor that checks all the processes running for the current user
2-Extend the previous script to ask for a user input for sorting the processes output either by memory or CPU consumption, and print the sorted list.
3-Extend the previous script to ask additionally for user input about how many processes to print. Hint: use head program to limit the number of outputs.
The error message is a syntax error; it doesn't seem to work. Any ideas or tips?
#!/bin/bash
read -p "Press 1- to sort by memory OR 2 to sort by CPU consumption" sorting
read -p "how much output should be displayed, choose a number between 1-9 ?" output
if[$sorting = 1];
ps auck-%mem | head -n $output | grep kami
else
ps auck-%cpu | head -n $output | grep kami
fi
Cheers
Kami
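For reference, here is a minimal corrected sketch of the script above (not from the original thread). The syntax error comes from the missing space after if and the missing then; the kami filter and the 1/2 menu are kept from the question, and the BSD-style ps sort flags are swapped for --sort. Filtering before head keeps the count meaningful, which is a judgment call:
#!/bin/bash
read -p "Press 1 to sort by memory OR 2 to sort by CPU consumption: " sorting
read -p "How many processes should be displayed (1-9)? " output
if [ "$sorting" = 1 ]; then
    # sort all processes by memory usage, highest first
    ps -eo pid,user,%mem,%cpu,cmd --sort=-%mem | grep kami | head -n "$output"
else
    # sort all processes by CPU usage, highest first
    ps -eo pid,user,%mem,%cpu,cmd --sort=-%cpu | grep kami | head -n "$output"
fi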

Bash while read loop: output never changes; being cached?

***FINAL ANSWER:
dbus-monitor --profile "interface='net.sacredchao.QuodLibet',member='SongStarted'" |
while read -r line; do
sleep 1
echo -e "$(grep '~filename=' $HOME/.quodlibet/current | sed 's/~filename=//')\n$(python2 -c "import sys, urllib as ul; print ul.unquote_plus(sys.argv[1])" "$(quodlibet --print-queue | sed 's|file://||g')")" > $HOME/Dropbox/Playlists/queue
done
There were issues with race conditions and with clobbering the file by writing to it more than once, and I also replaced the echo -e with a Python command since, quodlibet being written in Python, the encoding I didn't want was probably produced by Python in the first place. It all works now. I've finished giving quodlibet functionality that most music players already have: a queue that actually survives the program crashing and doesn't require constant tinkering and babysitting just to play each song exactly once. Sigh.
------------------------Original Question:
I'm using Arch Linux, and I've made a script that's supposed to save quodlibet's queue every time a song finishes/starts playing. Everything works EXCEPT for one thing (so you don't have to check all those sed commands for errors or anything). If I run all this stuff manually in a terminal, without any while loop or DBus--just manually writing to the files every time I notice a song ending--it works fine. Once I put it in the while loop, it still writes to the file every time a song starts, but it writes the exact same file every single time until I restart the script. I think it's caching the outputs of these commands and not ever updating that cache.
So, let's spell this out: quodlibet has its currently playing song and the queue of what to play next. Here's the script:
while read -r line; do
grep '~filename=' $HOME/.quodlibet/current \
| sed 's/~filename=//' > $HOME/Dropbox/Playlists/queue
echo -e "$(quodlibet --print-queue | sed 's|%|\\\x|g' | sed 's|file://||g')" \
>> $HOME/Dropbox/Playlists/queue
done <<< "$(dbus-monitor --profile "interface='net.sacredchao.QuodLibet',member='SongStarted'")"
Every time a song changes, what's supposed to happen is those two commands write the now-playing song and the queue to a file. What happens instead is every time a song changes, those two commands write what WAS the now-playing song and queue the FIRST time that while loop wrote to the file.
So let's say the now-playing is track 1 and tracks 2 thru 10 are in the queue. The script writes that to the file.
Then, track 1 finishes. Now, now-playing is track 2 and the queue is tracks 3 thru 10. However, even though the loop notices the song change and writes to the file, what it writes instead is track 1 as now-playing and tracks 2 thru 10 as the queue. Again. And on and on, and before you know it, it's playing track 10 and the queue is empty, but the file still has all 10 tracks in it.
I tried running the exact same commands inside the loop manually, myself, in a terminal outside the loop, immediately after a song change, while the loop script was running. The file would reflect what it was supposed to. But then when the song changed again, the loop would catch that and rewrite all 10 tracks, again. In other words, these exact commands only don't do what I want when they're inside the loop. SOMEthing is definitely being cached and never updated, here.
EDIT: It looks like I need to clarify some things.
1) Despite how this shouldn't be, my script behaves the exact same way whether it's:
while read -r line; do
grep '~filename=' $HOME/.quodlibet/current \
| sed 's/~filename=//' > $HOME/Dropbox/Playlists/queue
echo -e "$(quodlibet --print-queue | sed 's|%|\\\x|g' | sed 's|file://||g')" \
>> $HOME/Dropbox/Playlists/queue
done <<< "$(dbus-monitor --profile "interface='net.sacredchao.QuodLibet',member='SongStarted'")"
Or:
dbus-monitor --profile "interface='net.sacredchao.QuodLibet',member='SongStarted'" |
while read -r line; do
grep '~filename=' $HOME/.quodlibet/current \
| sed 's/~filename=//' > $HOME/Dropbox/Playlists/queue
echo -e "$(quodlibet --print-queue | sed 's|%|\\\x|g' | sed 's|file://||g')" \
>> $HOME/Dropbox/Playlists/queue
done
2) I tested a simpler version out before bringing the more complex stuff in. Here's a script I made:
while true; do
echo "$(cat /tmp/foo | sed 's/b/n/g')"
done
I tried changing the file /tmp/foo in the middle of that loop running, just as the output of the quodlibet commands changes on its own, and it updated just fine. The output changed like it should.
This wasn't what I started with, but it's the last one I made before moving on to the actual script. You'll see it incorporates everything I'm doing except for 4 things: quodlibet, the dbus command, >saving or >>appending to a file, and the while-read combo. One of those is making the output constantly the same no matter what changes in the environment, and I think we can rule out quodlibet, since, as I said before, running those commands manually works fine.
EDIT 2: Welp, I haven't been scrolling down on the file, but I did just now. This issue just got more complicated but probably easier to solve. It's not writing the exact same output every time at all. It's somehow skipping the line that overwrites the file--the one that starts with grep--and JUST appending the output of the second line, the echo -e.
EDIT 3: And now I'm stumped again. When I copy and paste the exact two lines right out of the while loop and into a bash Terminal, they do what I want. But in the while-loop, the first grep command never actually writes to that file. I thought maybe it was inexplicably eating the first command, so I tried adding an empty echo beforehand:
dbus-monitor --profile "interface='net.sacredchao.QuodLibet',member='SongStarted'" |
while read -r line; do
echo
grep '~filename=' $HOME/.quodlibet/current \
| sed 's/~filename=//' > $HOME/Dropbox/Playlists/queue
echo -e "$(quodlibet --print-queue | sed 's|%|\\\x|g' | sed 's|file://||g')" \
>> $HOME/Dropbox/Playlists/queue
done
but it's still eating that first grep. Why won't it save the file?
EDIT 4: I've confirmed that the grep command is definitely outputting the correct output, but just won't write it to the file.
In an effort to decouple the specific thing you're trying to do from the problem you perceive, I'm going to suggest trying out some independent commands before trying to use anything fancy.
while read -r line ; do
echo $line
done <<< $(yes)
Note that this will never output anything: the command substitution $(yes) never completes, so the loop never even starts. This was pointed out by at least one of the comments to your main post.
Instead of using a capture variable, try piping the output directly to your loop. I can't say this will work for you, but it at least fixes one of the obvious problems of streaming output to a while loop.
yes | while read -r line ; do
echo $line
done
Based on the other debugging efforts here, the problem appears to be a race condition between the signal to dbus about the song transition and the writing of the new "current" song to disk. In an effort to provide a more complete solution (untested of course), here is what I might start with:
function urldecode() {
python -c "import sys, urllib as ul; print ul.unquote_plus(sys.argv[1])" "$@"
}
dbus-monitor --profile "interface='net.sacredchao.QuodLibet',member='SongStarted'" | \
while read -r line; do
sleep 1 # Adjust as necessary to prevent race conditions
CurrentSongFile=$(grep '~filename=' $HOME/.quodlibet/current | sed 's/~filename=//')
SongFileQueue=$(urldecode "$(quodlibet --print-queue | sed 's_file://__g')")
printf '%s\n%s\n' "$CurrentSongFile" "$SongFileQueue" > $HOME/Dropbox/Playlists/queue
done

Continuous grep, output at same spot on console

I use
tail -f file | grep pattern
all the time for continuous grep.
However, is there a way I can make grep print its matches at the same spot, say at the top of the screen, so that the screen doesn't scroll all the time?
My case is something like this: tail -f log_file | grep Status -A 2 will show the current status and what changed it to that status. The problem is the screen scrolls and it becomes annoying. I'd rather have the output stuck on the first 3 lines in the screen.
Thank you!
You could use the watch command, which repeatedly executes the same command while the position on the screen stays the same. The process might eat some more CPU or memory though:
watch "tail file | grep pattern"
By default, watch executes that command every 2 seconds. You can adjust this down to 0.1 seconds using:
watch -n 0.1
NOTE
As noted by @etanReisner: this is not exactly the same as tail -f. tail -f will change immediately when something is added to your logfile; the watch command will only notice that when it executes, i.e. every 2 (or 0.1) seconds.
Assuming you are using a vt100 compatible emulator...
This command will tail a file, pipe it into grep, read it a line at a time and then display it in reverse on the top line of the screen:
TOSL=$(tput sc;tput cup 0 0;tput rev;tput el)
FROMSL=$(tput sgr0; tput rc)
tail -f file | grep --line-buffered pattern | while read line
do
echo -n "$TOSL${line}$FROMSL"
done
It assumes your output appears a line at a time. If you want more than one line, you can read more than a line, but you have to decide how you want to buffer the output. You could also use the csr terminfo command to set up an entire separate scrolling region instead of just having one line.
Here is the scrolling region version with a ten line status area at the top:
TOSL=$(tput sc; tput csr 0 10; tput cup 10 0;tput rev;tput el)
FROMSL=$(tput sgr0; tput rc;tput csr 10 50;tput rc)
tail -f file | grep --line-buffered pattern | while read line
do
echo -n "$TOSL${line}
$FROMSL"
done
Note that it is not impossible that your display will be corrupted from time-to-time as it could be that the output from your main shell and your background task get mixed up.
Simply replace the newlines with carriage returns.
tail -f file | grep --line-buffered whatever | tr '\012' '\015'
The line buffering is to avoid jumpy output; see http://mywiki.wooledge.org/BashFAQ/009
This is quick and dirty. As noted in comments, this will leave the previous contents of the line underneath, so a shorter line will not completely overlay a longer line. You could add some control codes to address that, but then you might as well use Curses for the formatting too, like in rghome's answer.
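For example, one such control code is the terminfo "erase to end of line" sequence. A quick sketch along those lines (assuming a terminfo-aware terminal; file and pattern are placeholders as above):
el=$(tput el)   # "erase to end of line" escape sequence
tail -f file | grep --line-buffered pattern | while IFS= read -r line; do
    # return to column 0, print the new match, then erase any leftover
    # characters from a longer previous line
    printf '\r%s%s' "$line" "$el"
done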

How can you read the most recent line from the linux program screen?

I use screen to run a minecraft server .jar, and I would like to write a bash script to see if the most recent line has changed every five minutes or so. If it has, then the script would start from the beginning and make the check again in another five minutes. If not, it should kill the java process.
How would I go about getting the last line of text from a screen via a bash script?
If I have understood correctly, you can redirect the output of your program to a file and work on it, using the > operator.
Try running:
ls -l > myoutput.txt
and open the file created.
You want to use the tail command. tail -n 1 will give you the last line of the file or redirected standard output, while tail -f will keep the tail program going until you cancel it yourself.
For example:
echo -e "Jello\nPudding\nSkittles" | tail -n 1 | if grep -q Skittles ; then echo yes; fi
The first section simply prints three lines of text:
Jello
Pudding
Skittles
The tail -n 1 finds the last line of text ("Skittles") and passes that to the next section.
grep -q simply returns TRUE if your pattern was found or FALSE if not, without actually dumping or outputting anything to screen.
So the if grep -q Skittles section will check the result of that grep -q Skittles pattern and, if it found Skittles, prints 'yes' to the screen. If not, nothing gets printed (try replacing Skittles with Pudding, and even though it was in the original input, it never made it out the other end of the tail -n 1 call).
Maybe you can use that logic and output your .jar to standard output, then search that output every 5 minutes?
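As a rough sketch of that idea for the original screen/minecraft case (assumptions, untested: the server is started under screen with logging enabled, e.g. screen -L, so its output lands in screenlog.0; the log path, the five-minute interval, and the pkill pattern are placeholders):
#!/bin/bash
logfile=screenlog.0              # written by "screen -L"
prev=$(tail -n 1 "$logfile")
while true; do
    sleep 300                    # wait five minutes between checks
    last=$(tail -n 1 "$logfile")
    if [ "$last" = "$prev" ]; then
        # last line has not changed: assume the server hung and kill it
        pkill -f 'java.*minecraft'
        break
    fi
    prev=$last
done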

How to capture the output of a top command in a file in linux?

I want to write the output of a specific 'top' command to a file. I did some googling and found out that it can be done using the following command.
top -n 10 -b > top-output.txt
where -n specifies the number of iterations and -b is for batch mode. This works very well if I let top run for the 10 iterations. But if I break the command with Ctrl-C, the output file turns out to be empty.
I won't know the number of iterations beforehand, so I need to break it manually. How can I capture the output of top in a file without specifying iterations?
The command which I am trying to use precisely is
top -b | grep init > top-output.txt
and break it whenever I want. But it doesn't work.
EDIT: To give more context to the question, I have Java code which invokes a tool with an input file: the tool takes a file as input and runs for some time, then takes the next file, and so on. I have a set of 100,000 files which need to be fed to the tool. So now I am trying to monitor that specific tool (it runs as a process in Linux). I cannot capture the whole of top's data, as the file would be too huge and full of unwanted data. How do I capture the system stats of just that process and write them to a file using top?
For me, top -b > test.txt stores all output from top fine, even if I break it with Ctrl-C. I suggest you dump first, and then grep the resulting file.
How about using a while loop and -n 1:
while sleep 3; do
top -b -n1 | grep init > top-output.txt
done
It looks like the output is not written to the file until all iterations are finished. You could solve this by wrapping it in an external loop like this:
touch top-output.txt
while true; do
top -b | grep init >> top-output.txt
done
Here is the 1-liner I like to use on my mac:
top -o -pid -l 1 | grep "some regexp"
Cheers.
As pointed out by @Thor in a comment, you just need to ensure that grep is not buffering arbitrarily but per-line, with the --line-buffered option:
top -bn 10 | grep 'init' --line-buffered | tee top-output.txt
Without grep-ing, redirecting the output of top to a file works just fine, interrupt included.
Solved this issue. This works even if you press Ctrl+C. I was facing the same issue when I wanted to log CPU %.
Execute this shell script:
#!/bin/sh
while true; do
echo "$(top -b -n 1 | grep init)" | tee -a top-output.log
sleep 1
done
You can grep anything you want to extract out of the top command; use this script to store it in a file.
-b : Batch mode operation. Starts top in Batch mode, which could be useful for sending output from top to other programs or to a file. In this mode, top will not accept input and runs until the iterations limit you've set with the -n command-line option or until killed.
-n number, this option specifies the maximum number of iterations, or frames, top should produce before ending. Here I've used -n 1.
Do man top for more details
tee -a enables the output to be visible on the console and also stores the output onto the file. -a option appends the output to the file.
Here, I have given an interval of 1 second. You can mention any other interval.
Source for explanations of -b and -n: manpages
man top
Kruthika
Ctrl+C is not an ideal solution because control stays in the CLI. You can use the command below, which dumps one iteration of top output to a file:
top -n 1 -b > top-output.txt
I had the exact same problem...
here was my line:
top -b -u myUser | grep -v Prog.sh | grep Prog > myFile.txt
It would create myFile.txt, but it would be empty when I Ctrl+C'd it. So after I kicked off my top command, I started a SECOND top process. When I found the first top's PID (it took some trial and error) and killed it through the second top, the first top wrote to the file as expected.
Hope that helps!
If you wish to run the top command in the background (so as not to worry about logout/sleep, etc.), you can make use of nohup, a batch job, cron, or screen.
Using nohup (stands for: No Hang Up):
Suppose you save the top command in a file called top-exec.sh with the following content:
top -p <PID> -b > /tmp/top.log
You can adjust the top command for whatever process you are interested in.
Then you can execute top-exec.sh using nohup as follows:
$> nohup top-exec.sh &
This will redirect all the output of the top command to a file named "top.log".
Set the -n argument to 1; it tells top how many frames it will produce before exiting.
top -b -n 1 > ~/mytopview.txt
or even
myvar=`top -b -n 1`
echo $myvar
From the top command, we can see all the processes with their PID (Process ID).
To print top output for only one process, use the following command:
$ top -p PID
To save the top output of any process to a file, use the following command:
top -p $PROCESS_ID -b > top.log
where > redirects standard output to a file.
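To tie this back to the edit in the question (monitoring one specific tool rather than all of top), a small sketch: look up the PID by name with pgrep and append one batch-mode line per interval. The process name mytool and the 1-second interval are placeholders, not from the thread:
#!/bin/bash
pid=$(pgrep -o -f mytool) || exit 1     # oldest process matching the full command line
while kill -0 "$pid" 2>/dev/null; do    # loop while the process is still alive
    # keep only the per-process line of each snapshot, not the summary header
    top -p "$pid" -b -n 1 | grep "^ *$pid " >> top-output.txt
    sleep 1
done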
