Assigning the output of the "at" command in a bash script - linux

I'm trying to capture the results of the "at" command inside a Bash script. The various ways of capturing command output don't seem to work, but I'm not sure if it's the pipe in the command or something else.
echo $cmd | at $deployat
produces the output
job 42 at 2014-04-03 12:00
And I'm trying to get at the time the job was set for.
However, I expected something like
v=$($cmd | at $deployat)
echo $v
Would work, or
v=$(echo $cmd | at $deployat)
echo $v
Or
v=`$cmd | at $deployat`
echo $v
But all of those leave the script hung, looking like it's waiting for some input.
What is the proper way to do this to end up with a variable like:
2014-04-03 12:00
============================
Edit:
One possible complication is that the $cmd has flags with it:
ls -l
for example.
The expanded command could be something like:
echo ls -l | at noon tomorrow
Solution:
v=$(echo $cmd | at $deployat 2>&1)
echo $v

at prints its output to stderr, not stdout. Use 2>&1 to redirect at's stderr into stdout. Example:
~$ out=$(echo cat Hello | at -v 2014-04-03 2>&1 | head -n 1)
~$ echo $out
Thu Apr 3 01:21:00 2014
With -v, at prints the execution time on the first line, which is then picked out by head -n 1.
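If only the scheduled time itself is wanted (as in the original question), the job prefix can be stripped too. A minimal sketch, assuming at emits a single "job N at <time>" line as shown above; the sed pattern is illustrative:
v=$(echo $cmd | at $deployat 2>&1 | sed -n 's/^job [0-9]* at //p')
echo $v
# e.g. 2014-04-03 12:00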

Related

How to store elapsed time of a variable assignment to another variable in bash script?

I want to run a Maven command and store its console output in a variable, and in turn store the real time of that operation in another variable. I wrote the following command:
x1=`( time t1=$( mvn test -Drat.skip)) 2>&1 | grep real`
When I echo variable x1 I get 0m17.430s, which is the desired output, but when I echo variable t1 it prints nothing! How can I store the console output of mvn test -Drat.skip in t1?
Everything inside of () or backticks happens in a subshell. Variable values aren't exported from a subshell back to the parent shell.
You can capture both the output of the command and the output of time in a single variable and then extract the pieces from it:
#!/bin/bash
all=$( (time mvn test -Drat.skip) 2>&1 )
time=$(tail -n 3 <<< "$all" | grep real)
output=$(head -n -3 <<< "$all")
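Here is a minimal, runnable sketch of the same split, with sh -c 'echo build output; sleep 1' standing in for the Maven invocation (GNU head/tail assumed for the negative line count):
all=$( (time sh -c 'echo build output; sleep 1') 2>&1 )   # stand-in for mvn
elapsed=$(tail -n 3 <<< "$all" | grep real)               # time's "real" line
output=$(head -n -3 <<< "$all")                           # the command's own output
echo "$elapsed"
echo "$output"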
As @choroba said, t1 is created in a different subshell and can't be exported back.
You can test it like this:
t1=test
x1=`(time t1=$(echo ok); echo $t1) 2>&1`
echo $t1
echo $x1
The output will be:
$ echo $t1
test
$ echo $x1
real 0m0,001s user 0m0,001s sys 0m0,001s ok
But this little hack may help:
fun () { t1=$(mvn test -Drat.skip); }
x1=$( (time fun) 2>&1 | grep real )
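Note that the command substitution around (time fun) is itself a subshell, so t1 from the hack above still won't reach the calling shell. A sketch that keeps the assignment in the current shell by sending time's report to a temporary file instead (the mktemp usage is illustrative):
fun () { t1=$(mvn test -Drat.skip); }
tmp=$(mktemp)
{ time fun; } 2> "$tmp"     # fun runs in the current shell, so t1 survives
x1=$(grep real "$tmp")
rm -f "$tmp"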

Bash output to screen and logfile differently

I have been trying to get a bash script to output different things to the terminal and to the logfile, but am unsure of what command to use.
For example,
#!/bin/bash
freespace=$(df -h / | grep -E "/" | awk '{print $4}')
greentext="\033[32m"
bold="\033[1m"
normal="\033[0m"
logdate=$(date +"%Y%m%d")
logfile="$logdate"_report.log
exec > >(tee -i $logfile)
echo -e $bold"Quick system report for "$greentext"$HOSTNAME"$normal
printf "\tSystem type:\t%s\n" $MACHTYPE
printf "\tBash Version:\t%s\n" $BASH_VERSION
printf "\tFree Space:\t%s\n" $freespace
printf "\tFiles in dir:\t%s\n" $(ls | wc -l)
printf "\tGenerated on:\t%s\n" $(date +"%m/%d/%y") # US date format
echo -e $greentext"A summary of this info has been saved to $logfile"$normal
I want to omit the last line of output (the echo "A summary...") from the logfile while still displaying it in the terminal. Is there a command to do so? It would be great if a general solution could be provided rather than a specific one, because I want to apply this to other scripts.
EDIT 1 (output after applying >&6; note that the final message appears out of order):
Files in dir: 7
A summary of this info has been saved to 20160915_report.log
Generated on: 09/15/16
One option:
exec 6>&1 # save the existing stdout
exec > >(tee -i $logfile) # like you had it
#... all your outputs
echo -e $greentext"A summary of this info has been saved to $logfile"$normal >&6
# writes to the original stdout, saved in file descriptor 6 ------------^^^
The >&6 sends echo's output to the saved file descriptor 6 (the terminal, if you're running this from an interactive shell) rather than to the output path set up by tee (which is on file descriptor 1). Tested on bash 4.3.46.
References: "Using exec" and "I/O Redirection"
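Putting the pieces together, a minimal self-contained sketch of the pattern (demo.log is just an example name; the last line restores stdout and closes the spare descriptor):
#!/bin/bash
logfile=demo.log
exec 6>&1                      # save the original stdout
exec > >(tee -i "$logfile")    # fd 1 now also goes to the logfile
echo "goes to the terminal and to $logfile"
echo "goes to the terminal only" >&6
exec 1>&6 6>&-                 # restore stdout, close the spare descriptor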
Edit: As OP found, the >&6 message is not guaranteed to appear after the lines printed by tee off stdout. One option is to use script, e.g. as in the answers to this question, instead of tee, and then print the final message outside of the script. Per the docs, the stdbuf answers to that question won't work with tee.
Try a dirty hack:
#... all your outputs
echo >&6 # <-- New line
echo -e $greentext ... >&6
Or, equally hackish (note that, per OP, this worked):
#... all your outputs
sleep 0.25s # or whatever time you want <-- New line
echo -e ... >&6

while loop in ssh going infinite in shell scripting [duplicate]

I'm a shell script newbie, so I must be doing something stupid, why won't this work:
#!/bin/sh
myFile=$1
while read line
do
ssh $USER@$line <<ENDSSH
ls -d foo* | wc -l
count=`ls -d foo* | wc -l`
echo $count
ENDSSH
done <$myfile
Two lines should be printed, and each should have the same value... but they don't. The first print statement [the result of ls -d foo* | wc -l] has the correct value; the second print statement is incorrect, always printing blank. Do I need to do something special to assign the value to $count?
What am I doing wrong?
Thanks
#!/bin/sh
while read line; do
echo Begin $line
ssh $USER@$line << \ENDSSH
ls -d foo* | wc -l
count=`ls -d foo* | wc -l`
echo $count
ENDSSH
done < $1
The only problem with your script was that when the heredoc token is not quoted, the shell does variable expansion, so $count was being expanded by your local shell before the remote commands were shipped off...
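A quick local illustration of that quoting difference, no ssh needed (the variable name is arbitrary):
name=local
# unquoted delimiter: the local shell expands $name before cat reads the heredoc
cat << EOF
$name
EOF
# quoted delimiter: $name is passed through literally
cat << \EOF
$name
EOF
The first heredoc prints local, the second prints $name unchanged.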

How can I get the command that was executed at the command line?

If I call a script this way:
myScript.sh -a something -b anotherSomething
Within my script is there a way to get the command that called the script?
In my script on the first line I'm trying to use:
lastCommand=!!
echo $lastCommand
But the result is always null.
If I do echo !! in the script, the only thing that prints to the console is !!, but from the command line echo !! prints the last command.
I've also tried:
echo $BASH_COMMAND
but I'm getting null here as well. Is it because the script is called in a subshell and thus there is no previous command stored in memory for the subshell?
The full command which called the script would be "$0" "$@", that is, the command itself followed by all the arguments quoted. This may not be the exact command which was run, but if the script is idempotent it can be run to get the same result:
$ cat myScript.sh
#!/usr/bin/env bash
printf '%q ' "$0" "$#"
printf '\n'
$ ./myScript.sh -a "foo bar" -b bar
./myScript.sh -a foo\ bar -b bar
Here's my script myScript.sh
#!/bin/bash
temp=`mktemp`
ps --pid $BASHPID -f > $temp
lastCommand=`tail -n 1 $temp | xargs | cut -d ' ' -f 8-`
rm $temp
echo $lastCommand
or
#!/bin/sh
last=`cat /proc/$$/cmdline | xargs -0`
echo $last
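An example run of the /proc version (the exact output depends on how the script was invoked and on the interpreter named in the shebang):
$ ./myScript.sh -a something -b anotherSomething
/bin/sh ./myScript.sh -a something -b anotherSomething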

Why part of the script cannot execute in the crontab

I have a script stopping the application and zipping some files:
/home/myname/project/stopWithZip.sh
With the properties below:
-rwxrwxr-x. 1 myname myname 778 Jun 25 13:48 stopWithZip.sh
Here is the content of the script:
ps -ef | grep project | grep -v grep | awk '{print $2}' |xargs kill -15
month=`date +%m`
year=`date +%Y`
fixLogs=~/project/log/fix/$year$month/*.log.*
errorLogs=~/project/log/error/$year$month/log.*
for log in $fixLogs
do
if [ ! -f "$log.gz" ];
then
gzip $log
echo "Archived:"$log
else
echo "skipping" $log
fi
done
echo "Archived fix log files done"
for log in $errorLogs
do
if [ ! -f "$log.gz" ]; then
gzip $log
echo "Archived:"$log
else
echo "skipping" $log
fi
done
echo "Archived errorlog files done"
The problem is that, apart from the ps -ef | grep project | grep -v grep | awk '{print $2}' | xargs kill -15 command, the gzip commands are not executed. I totally don't understand why.
I cannot see any compression of the logs in the directory.
BTW, when I execute stopWithZip.sh explicitly on the command line, it works perfectly fine.
In crontab:
00 05 * * 2-6 /home/myname/project/stopWithZip.sh >> /home/myname/project/cronlog/$(date +"\%F")-stop.log 2>&1 (does NOT work)
On the command line:
/home/myname/project>./stopWithZip.sh (works)
Please help
The script fails when run under cron because your script is invoked with project in its path, so the kill pipeline kills the script too.
You could prove (or disprove) this by adding some tracing. Log the output of ps and of awk to log files:
ps -ef |
tee /tmp/ps.log.$$ |
grep project |
grep -v grep |
awk '{print $2}' |
tee /tmp/awk.log.$$ |
xargs kill -15
Review the logs and see that your script is one of the processes being killed.
The crontab entry contains:
/home/myname/project/stopWithZip.sh >> /home/myname/project/cronlog/$(date +"\%F")-stop.log 2>&1
When ps lists that, it contains 'project' and does not contain 'grep', so the kill in the script kills the script itself.
When you run it from the command line (using a conventional '$' as the prompt), you run:
$ ./stopWithZip.sh
and when ps lists that, it does not contain 'project' so it is not killed.
If you ran:
$ /home/myname/project/stopWithZip.sh >> /home/myname/project/cronlog/$(date +"\%F")-stop.log 2>&1
from the command line, just as cron does, you would find that it fails too.
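One possible fix, sketched below: exclude the script's own PID (and that of its parent, the shell cron spawned, whose command line may also contain "project") before the PIDs reach kill. The grep pattern is illustrative and xargs -r is a GNU extension; adjust both for the actual application you want to stop:
ps -ef | grep project | grep -v grep |
    awk -v self="$$" -v parent="$PPID" '$2 != self && $2 != parent {print $2}' |
    xargs -r kill -15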
