My bash script uses too much memory - linux
I was trying to find out which program is using up my memory, and where the leak is.
I found that the leak is in a bash script.
But how is that possible? Does bash allocate new space for every variable assignment?
My bash script looks like the following; please let me know how I can fix this problem.
CONF="/conf/my.cfg"
HIGHRES="/data/high.dat"
getPeriod()
{
meas=`head -n 1 $CONF`
statperiod=`echo $meas`
}
(while true
do
lastline=`tail -n 1 $HIGHRES |cut -d"," -f2`
linenumber=`grep -n $lastline $HIGHRES | cut -f1 -d:`
/bin/stat $linenumber
getPeriod
sleep $statperiod
done)
EDIT #1:
The last line of high.dat
2013-02-11,10:59:13,1,0,0,0,0,0,0,0,0,12.340000,0.330000,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,24.730000,24.709990,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
I was unable to verify a memory leak with a close approximation of that script, so maybe the leak isn't actually where you think it is. Consider updating your question with much more info, including a complete working example along with what you did to figure out that you had a memory leak.
That said, you have chosen quite an odd way to find out how many lines a file has. The most usual way would be to use the standard wc tool:
$ wc -l < test.txt
19
$
Note: Use < file instead of passing the file name, since the latter will cause the file name to be written to stdout, and you'll then have to edit it away:
$ wc -l test.txt
19 test.txt
$
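Applied to the original loop, the last line's number is simply the file's line count, so wc -l can replace the grep -n lookup entirely. A minimal sketch (the temp file stands in for the question's /data/high.dat so the example is self-contained):

```shell
# Since the last line's number equals the line count, wc -l can
# replace the grep -n lookup from the question's script.
HIGHRES=$(mktemp)
printf 'row1\nrow2\nrow3\n' > "$HIGHRES"
linenumber=$(wc -l < "$HIGHRES")
echo "$linenumber"
rm -f "$HIGHRES"
```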
Related
Store result of "ps -ax" for later iterating through it
When I do ps -ax | grep myApp I get one line with the PID and other info about my app. Now I'd like to process the whole result of ps -ax (without grep, so the full output): either store it in a variable and grep from it later, or go through the results in a for loop, e.g. like this:

for a in $(ps -ax)
do
    echo $a
done

Unfortunately, this splits on every space, not on newlines the way | grep does. Any ideas how I can accomplish one or the other (grep from a variable, or a for loop)? Important: no bash please, only POSIX, so #!/bin/sh. Thanks in advance
As stated above, a while loop can be helpful here. Another useful thing is the --no-headers argument, which makes ps skip the header. Or, even better, specify the exact columns you need to process, like:

ps --no-headers -o pid,command ax

The overall code would look like:

processes=`ps --no-headers -o pid,command ax`
echo "$processes" | while read pid command; do
    echo "we have process with pid $pid and command line $command"
done

The only downside to this approach is that the commands inside the while loop are executed in a subshell, so if you need to export some variable to the parent process you'll have to do it using inter-process communication. I usually dump the results into a temp file created before the while loop and read them after the loop has finished.
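The temp-file workaround mentioned at the end can be sketched like this (the printf simulating ps output is an assumption to keep the example self-contained):

```shell
# Collect values inside the while loop's subshell via a temp file,
# then read them back in the parent shell after the loop ends.
# The printf stands in for `ps --no-headers -o pid,command ax`.
tmpfile=$(mktemp)
printf '101 sshd\n202 bash\n' | while read -r pid command; do
    echo "$pid" >> "$tmpfile"
done
pidcount=$(wc -l < "$tmpfile")
rm -f "$tmpfile"
echo "collected $pidcount pids"
```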
I found a solution by replacing the spaces while executing the command:

result=$(ps -aux | sed 's/ /_/g')

You can also make it more filter-friendly by squeezing duplicated spaces first:

result=$(ps -aux | tr -s ' ' | sed 's/ /_/g')
How to overwrite previous output in bash
I have a bash script that outputs the top most-CPU-intensive processes to the terminal every second:

tmp=$(ps -e -eo pid,cmd,%mem,%cpu,user --sort=-%cpu | head -n 11)
printf "\n%s\n" "$tmp[pid]"

I know that I can move the cursor to a predeclared position, but that fails whenever the terminal has not been cleared. I could also go to the beginning of the line and write over it, but that again is a problem when the current output is shorter than the previous one, or when the number of lines differs from the previous output. Is there a way to completely erase the previous output and write from there?
Yes, you can clear part of the screen before each iteration (see https://unix.stackexchange.com/questions/297502/clear-half-of-the-screen-from-the-command-line), but the watch utility does it for you. Try:

watch -n 1 "ps -e -eo pid,cmd,%mem,%cpu,user --sort=-%cpu | head -n 11"
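If you'd rather stay inside a plain script than use watch, one common alternative (my own suggestion, not part of the answer above) is to redraw after an ANSI clear:

```shell
# Move the cursor home and erase the display before each redraw,
# so longer output from the previous iteration cannot linger.
clear_screen() {
    printf '\033[H\033[2J'
}
# One refresh cycle; in practice wrap this in `while true; ...; sleep 1`.
clear_screen
ps -e -eo pid,cmd,%mem,%cpu,user --sort=-%cpu | head -n 3
```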
How to view syslog entries since last time I looked
I want to view the entries in Linux /var/log/syslog, but I only want to see the entries since the last time I looked (and preferably create a bash script to do this). The solution I thought of was to take a copy of syslog and diff it against the copy from last time, but this seems unclean because syslog can be big and diff adds artifacts to its output. I'm thinking maybe I could somehow use tail directly on syslog, but I don't know how to do this when I don't know how many lines have been added since the last time I tried. Any better thoughts? I would like to be able to redirect the result to a file so I can later interactively grep for specific parts of interest.
Linux has a wc command which can count the number of lines in a file, for example wc -l /var/log/syslog. The bash script below stores the output of wc -l in a file called ./prevlinecount. Whenever you want just the new lines in a file, it reads the value in ./prevlinecount and subtracts it from a fresh wc -l /var/log/syslog called newlinecount, then tails the last (newlinecount - prevlinecount) lines.

#!/bin/bash
prevlinecount=`cat ./prevlinecount`
if [ -z "$prevlinecount" ]; then
    wc -l $1 | awk '{ print $1 }' > ./prevlinecount
    tail -n +1 $1
else
    newlinecount=`wc -l $1 | awk '{print $1}'`
    tail -n `expr $newlinecount - $prevlinecount` $1
    echo $newlinecount > ./prevlinecount
fi

Beware: this is a very rudimentary script which can only keep track of one file. If you would like to extend it to multiple files, look into associative arrays; you could keep track of multiple files by using the filename as the key and the previous line count as the value. Beware too that syslog files can be archived once the file reaches a predetermined size (maybe 10 MB), and this script does not account for the archival process.
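The associative-array extension suggested above could be sketched as follows (bash 4+; the function and state-file names are my own invention, not from the answer):

```shell
#!/bin/bash
# Track previous line counts for many files at once, keyed by filename.
declare -A prevcount
statefile=./prevlinecounts

# Load saved counts; each line of the state file is "<file> <count>".
if [[ -f $statefile ]]; then
    while read -r f c; do prevcount[$f]=$c; done < "$statefile"
fi

# Print only the lines added to $1 since the last call.
new_lines() {
    local file=$1 now prev
    now=$(wc -l < "$file")
    prev=${prevcount[$file]:-0}
    tail -n "$(( now - prev ))" "$file"
    prevcount[$file]=$now
}

# Persist the counts for the next run.
save_counts() {
    for f in "${!prevcount[@]}"; do
        echo "$f ${prevcount[$f]}"
    done > "$statefile"
}
```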
Redirecting linux cout to a variable and the screen in a script
I am currently trying to make a script that runs multiple other scripts on a server. I would like to display the output of these scripts on the screen IN ADDITION to passing it into grep so I can do error testing. Currently I have written this:

status=$(SOMEPROCESS | grep -i "SOMEPROCESS started completed correctly")

I do further error handling below this using the variable status, so I would like to display SOMEPROCESS's output on the screen for error reference. This is a read-only server and I cannot save the output to a log file.
You need to use the tee command. It will be slightly fiddly, since tee outputs to a file handle; however, you could create a file descriptor using a pipe. Or, simpler, for your use case: start the script without grep and pipe it through tee:

SOMEPROCESS | tee /my/safely/generated/filename

Then run

tail -f /my/safely/generated/filename | grep -i "my grep pattern"

separately.
You can use process substitution together with tee:

SOMEPROCESS | tee >(grep ...)

This will use an anonymous pipe and pass /dev/fd/... as the file name to tee (or a named pipe on platforms that don't support /dev/fd/...). Because SOMEPROCESS is likely to buffer its output when not talking to a terminal, you might see significant lag in the screen output.
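A self-contained sketch of the tee/process-substitution approach (the printf stands in for SOMEPROCESS; for a real program that buffers, GNU coreutils' stdbuf -oL in front of it can reduce the lag mentioned above):

```shell
#!/bin/bash
# Duplicate the stream: one copy goes to stdout (the screen), the
# other into the process substitution running grep.
# For a real buffering program: stdbuf -oL SOMEPROCESS | tee >(...)
printf 'starting up\nSOMEPROCESS started completed correctly\n' |
    tee >(grep -i "completed correctly" > status.txt)
sleep 1  # give the asynchronous process substitution time to finish
```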
I'm not sure whether I understood your question exactly. I think you want to capture the output of SOMEPROCESS, test it, and print it out when there are errors. If so, the code below may help you:

s=$(SOMEPROCESS)
grep -q 'SOMEPROCESS started completed correctly' <<< "$s"
if [[ $? -ne 0 ]]; then
    # specified string not found in the output, so SOMEPROCESS failed to start
    echo "$s"
fi

Note that this code stores all of the output in memory; if the output is big enough, there is an OOM risk.
Grep filtering output from a process after it has already started?
Normally when one wants to look at specific output lines from running something, one can do something like:

./a.out | grep IHaveThisString

but what if IHaveThisString is something which changes every time, so you need to first run it, watch the output to catch what IHaveThisString is on that particular run, and then grep for it? I can just dump to a file and grep it later, but is it possible to do something like backgrounding it and then bringing it back to the foreground, now piped to some grep? Something akin to:

./a.out
Ctrl-Z
fg | grep NowIKnowThisString

Just wondering...
No, it is only in your screen buffer if you didn't save it in some other way.
Short form: you can do this, but you need to know ahead of time that you'll need to do it; it's not something that can be put into place interactively after the fact.

Write your script to determine what the string is. We'd need a more detailed example of the output format to give a better example of usage, but here's one for the trivial case where the entire first line is the filter target:

run_my_command | { read string_to_filter_for; fgrep -e "$string_to_filter_for"; }

Replace the read string_to_filter_for with as many commands as necessary to read enough input to determine what the target string is; this could be a loop if necessary. For instance, let's say that the output contains the following:

Session id: foobar

and thereafter you want to grep for lines containing foobar. Then you can pipe through the following script:

re='Session id: (.*)'
while read; do
    if [[ $REPLY =~ $re ]]; then
        target=${BASH_REMATCH[1]}
        break
    else
        # if you want to print the preamble; leave this out otherwise
        printf '%s\n' "$REPLY"
    fi
done
[[ $target ]] && grep -F -e "$target"

If you want to manually specify the filter target, this can be done by having the loop check for a file being created with the filter contents, and using that when starting up grep afterwards.
What you need is a little unusual, but you can do it this way: first start a script session; then use the shell as usual; then start and interrupt your program; then run grep over the typescript file. Example:

$ script
$ ./a.out
Ctrl-Z
$ fg
$ grep NowIKnowThisString typescript
You could use a stream editor such as sed instead of grep. Here's an example of what I mean:

$ cat list
Name to look for: Mike
Dora 1
John 2
Mike 3
Helen 4

Here we find the name to look for in the first line and want to grep for it. Now piping the command to sed:

$ cat list | sed -ne '1{s/Name to look for: //;h}' \
> -e ':r;n;G;/^.*\(.\+\).*\n\1$/P;s/\n.*//;br'
Mike 3

Note: sed itself can take a file as a parameter, but you're not working with text files here, so this is how you'd use it. Of course, you'd need to modify the command for your case.