I'm trying to make a simple and reliable script, preferably in bash, that is executed every minute using crontab. The script simply has to read the contents of the last couple of lines of an open screen session and store them in a variable so that I can search for a sub-string. Does anyone know of an easy way to do this? Thanks.
You can send the hardcopy command to the screen and read the last line using the tail command:
screen -X hardcopy ~/test.log
tail -n 1 ~/test.log
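For the cron part of the question, a minimal sketch could look like the following; the session name "mysession" and the search string are placeholders you would adapt to your setup:
#!/bin/bash
# Dump the visible contents of a named screen session's current window,
# keep the last two lines in a variable, and look for a sub-string.
tmp=$(mktemp)
screen -S mysession -X hardcopy "$tmp"
last_lines=$(tail -n 2 "$tmp")
rm -f "$tmp"

if printf '%s\n' "$last_lines" | grep -q "some substring"; then
    echo "substring found"
fi
Saved as a script, it can then be run from crontab every minute with an entry along the lines of: * * * * * /path/to/check_screen.sh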
I need to read the second-to-last line of all the files within a specific directory.
These files are log files, and that specific line contains the status of the tasks that ran: 'successful', 'fail', 'warning'.
I need to pull this out so I can dump it into reports afterwards.
At this stage I am only looking to pull the data (the entire line) and will worry about the handling afterwards.
As the line numbers are not fixed (they vary from file to file), I was looking at doing it with a 'while' loop so it goes through the whole file, but I am not actually getting the last 2 lines read, and I can also only read 1 file, not all of them.
Any ideas on a nice little script to do this?
And does anyone know if this can be done with just a Linux command?
Use the tail command to get the last 2 lines, and then the head command to get the first of these:
for file in "$DIR"/*; do
    tail -n 2 "$file" | head -n 1
done
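If you also want to know which file each status line came from, a small variant under the same assumptions about $DIR would be:
for file in "$DIR"/*; do
    # prefix the second-to-last line with its file name
    printf '%s: %s\n' "$file" "$(tail -n 2 "$file" | head -n 1)"
done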
Edit: This question was originally bash-specific. I'd still rather have a bash solution, but if there's a good way to do this in another shell then that would be useful to know as well!
Okay, top-level description of the problem. I would like to be able to add a hook to bash such that, when a user enters, for example, $ cat foo | sort -n | less, this is intercepted and translated into wrapper 'cat foo | sort -n | less'. I've seen ways to run commands before and after each command (using DEBUG traps or PROMPT_COMMAND or similar), but nothing about how to intercept each command and allow it to be handled by another process. Is there a way to do this?
For an explanation of why I'd like to do this, in case people have other suggestions of ways to approach it:
Tools like script let you log everything you do in a terminal to a log (as, to an extent, does bash history). However, they don't do it very well - script mixes input with output into one big string and gets confused with applications such as vi which take over the screen, history only gives you the raw commands being typed in, and neither of them work well if you have commands being entered into multiple terminals at the same time. What I would like to do is capture much richer information - as an example, the command, the time it executed, the time it completed, the exit status, the first few lines of stdin and stdout. I'd also prefer to send this to a listening daemon somewhere which could happily multiplex multiple terminals. The easy way to do this is to pass the command to another program which can exec a shell to handle the command as a subprocess whilst getting handles to stdin, stdout, exit status etc. One could write a shell to do this, but you'd lose much of the functionality already in bash, which would be annoying.
The motivation for this comes from trying to make sense of exploratory-data-analysis-like procedures after the fact. With richer information like this, it would be possible to generate decent reporting on what happened, squashing multiple invocations of one command into one where the first few gave non-zero exits, asking where files came from by searching for everything that touched the file, and so on.
Run this bash script:
#!/bin/bash
while read -r -e line
do
    wrapper "$line"
done
In its simplest form, wrapper could consist of eval "$line". You mentioned wanting to have timings, so maybe instead have time eval "$line". You wanted to capture exit status, so this should be followed by the line save=$?. And you wanted to capture the first few lines of stdout, so some redirecting is in order. And so on.
MORE: Jo So suggests that handling for multiple-line bash commands be included. In its simplest form, if eval returns with "syntax error: unexpected end of file", then you want to prompt for another line of input before proceeding. Better yet, to check for proper bash commands, run bash -n <<<"$line" before you do the eval. If bash -n reports the end-of-file error, then prompt for more input to add to "$line". And so on.
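Putting those pieces together, one possible wrapper might look like the sketch below. It is only an illustration under the assumptions above: the log file name is arbitrary, stdout capture is left out, and any parse failure is naively treated as "needs more input".
wrapper() {
    local line=$1
    # keep reading until the accumulated text parses as complete bash
    while ! bash -n <<<"$line" 2>/dev/null; do
        read -e -p '> ' more || break
        line+=$'\n'"$more"
    done
    local start=$SECONDS
    eval "$line"
    local status=$?
    # record timestamp, duration, exit status and the command itself
    printf '%s\t%s\t%s\t%s\n' "$(date +%s)" "$((SECONDS - start))" "$status" "$line" >> ~/.wrapper_log
    return $status
}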
Binfmt_misc comes to mind. The Linux kernel can recognize arbitrary executable file formats and hand them to a user-space application.
You could use this capability to register your wrapper, but instead of handling one particular executable format it would handle all executables.
Is it possible to trigger a command on every new line added to a file?
For example: I have a log file, say maillog. I want to get every new entry in the log file as a mail.
If a new entry like "Mail Sent" is added to the maillog file, then my script should grep the new entry and send me a mail with the entry (data).
I know it's crazy but I want to automate my Linux box with these kinds of things.
Regards,
Not so crazy. Check the file periodically (once per hour, once per day, whatever you like) for new parts by storing the length of the file, comparing it to the previous length and, in case the file has grown, handling the part which was appended:
length=0
while sleep 3600   # use the wanted delay here
do
    new_length=$(find "$file" -printf "%s")
    if [ "$length" -lt "$new_length" ]
    then
        tail --bytes=$((new_length - length)) "$file" | handle_part
    fi
    length=$new_length
done
Now you only have to write that handle_part function, which could for instance mail its input somewhere.
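For instance, a minimal handle_part could simply hand the appended bytes to a local mail command; the address and subject here are placeholders, and a working mail setup is assumed:
handle_part() {
    # mail whatever arrives on stdin; $file comes from the surrounding script
    mail -s "New entries in $file" you@example.com
}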
Using this approach (instead of the obvious tail -f) has the advantage that you can store the current length in a file and, when your script is restarted, read that length back. So you won't process the whole file again after a restart of your script (e.g. due to a machine reboot).
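A sketch of that persistent variant, with an arbitrary state-file location, could look like this:
state=/var/tmp/maillog.offset          # arbitrary choice of state file
length=$(cat "$state" 2>/dev/null || echo 0)
while sleep 3600
do
    new_length=$(find "$file" -printf "%s")
    if [ "$length" -lt "$new_length" ]
    then
        tail --bytes=$((new_length - length)) "$file" | handle_part
    fi
    length=$new_length
    echo "$length" > "$state"          # remember the offset across restarts
done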
If you want a faster response you could have a look at inotify, a Linux facility for monitoring file actions, so that the polling could be replaced.
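For example, with the inotify-tools package installed, a sketch along these lines blocks until the file is written to instead of sleeping. Note that if several lines arrive between events only the last one is mailed, so for completeness combine it with the length-tracking shown above:
file=/var/log/maillog
while inotifywait -qq -e modify "$file"
do
    # a write happened; mail the newest line (address is a placeholder)
    tail -n 1 "$file" | mail -s "New maillog entry" you@example.com
done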
Use tail -f, which watches a file and sends whatever is appended to it to stdout. If you have a script that performs the desired action, say mail_per_line, then you can set it up as:
tail -f maillog | mail_per_line
In this case, mail_per_line runs once and gets all the lines. If you want to spawn a separate process each time a line comes in, use the shell built-in read:
tail -f maillog | while IFS= read -r line; do
    send_a_message "$line"
done
To counter the effect described by Alfe, that a restart of this program will cause all the previous logs to be processed again, consider using logrotate.
Is it possible to have the history of a specific user in one more file, other than the default file named by HISTFILE?
I would like to have a second file that acts as a backup in case the main file is removed.
Regards,
Sriharsha Kalluru.
You can create a hardlink to the file:
cp --link --verbose /home/$USER/.bash_history /somewhere/else/users_history
When the original file in the user's home is removed, the hard-linked copy is still there and preserves the content from being lost.
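You can verify the link with GNU stat by checking the link count and inode number (both names must live on the same filesystem; the second path is just the placeholder from the command above):
# a link count of 2 and identical inode numbers confirm both names share one file
stat -c '%h %i %n' /home/$USER/.bash_history /somewhere/else/users_history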
Many times I've found myself using Ctrl-R in Bash to get an old command four times the terminal width, just to find out that too many days have passed and it's no longer in the .bash_history file. Here are two lines that will keep track of every command line you type at the bash prompt and use no external processes at all, just plain bash.
My first approach to this problem was increasing the maximum number of lines in the history to a very large quantity. But no matter how large it was, there was always a moment when I needed a long command I typed many months ago and it had already left the history. The current solution came to my mind when I learned about the PROMPT_COMMAND variable, a command that bash executes before showing each prompt. Here are the two lines:
export HISTTIMEFORMAT="%s "
PROMPT_COMMAND="${PROMPT_COMMAND:+$PROMPT_COMMAND ; }"'echo $$ $USER \
"$(history 1)" >> ~/.bash_eternal_history'
One goal I set for myself was to achieve this without using any external process, so bash wouldn't have to fork a new process after every ENTER pressed at its prompt. The first line sets the format of history lines to include the date as a Unix timestamp, so you can know when you typed every command. The second line, which is the core of the solution, first ensures that, if a previous PROMPT_COMMAND was set, it gets executed before our stuff, and then appends a line of the format:
PID USER INDEX TIMESTAMP COMMAND
to a file called .bash_eternal_history in the current user's home directory.
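Reading the file back is straightforward; for example, a small sketch (assuming GNU date) that prints the last 20 recorded commands with human-readable timestamps, following the field order above:
tail -n 20 ~/.bash_eternal_history |
while read -r pid user index ts cmd
do
    # ts is the Unix timestamp written by HISTTIMEFORMAT="%s "
    printf '%s  %s  %s\n' "$user" "$(date -d @"$ts" '+%F %T')" "$cmd"
done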
Adding the username, which at first seemed unnecessary, became useful later to distinguish between "sudo -s" sessions and normal sessions which retain the same value for "~/", and so append lines to the same .bash_eternal_history file.
I hope some of you find these two lines as useful as I do. :-)
Would hard links solve your problem?
Also you can read the man page here.
I would write a cron job that copies the original histfile to a backup location every minute or so. Then you don't have to worry about defining a second histfile. For example:
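# crontab entry run every minute; the paths are placeholders for the real user
* * * * * cp /home/username/.bash_history /home/username/.bash_history.bak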
Otherwise you could write every command the user enters to an alternate file.
For this approach take a look here:
bash, run some command before or after every command entered from console
I've been using screen for quite some time now and I agree, it improves my productivity. But one thing that I really miss is the command history. Anything I type in a screen session doesn't get logged in command history. When I googled for the same I found something related to this issue:
http://www.linuxquestions.org/questions/slackware-14/aliases-lost-when-using-screen-723624/
But surprisingly in my case all the aliases are intact and I'm able to use them without any issues. As far as I know, opening a new screen session actually opens a new sub-shell. If this is true, could someone tell me how to get the commands typed in a screen session logged in the command history, so that if I open a new terminal/screen later on I'll be able to access them using CTRL+R? Any solution that helps me make screen log commands in command history would be very much helpful. Appreciate your time. Thank you.
Assuming a bash shell is being used within the screen.
Insert these 2 statements into ~/.bashrc:
shopt -s histappend
PROMPT_COMMAND="${PROMPT_COMMAND:+$PROMPT_COMMAND; }history -a"
The first command makes bash append to the history file rather than overwrite it, while the second saves each command right after it has been executed, not at the end of the session.
To expand on my answer: the history for each bash session that you have open is stored in memory until you log out or close the session, and only then is it written out, overwriting the bash history file.
These commands make bash append to the history file instead, and flush to the file after every command.
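A common variant (not part of the answer above, so treat it as optional) also re-reads the file before each prompt, so new terminals and screen windows immediately see commands typed elsewhere:
# append the last command, clear the in-memory list, re-read the shared file
shopt -s histappend
PROMPT_COMMAND="${PROMPT_COMMAND:+$PROMPT_COMMAND; }history -a; history -c; history -r"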
It's easy to use shared history between sessions in Zsh, and this blog post by Derek Reeve explains how to do it. In short, add this to your ~/.zshrc:
setopt share_history
HISTSIZE=1000
SAVEHIST=1000
HISTFILE=~/.history
setopt APPEND_HISTORY
I also found instructions for doing the same thing on Bash, but I've only tried this on Zsh.