How to get logs in a file when using the timeout command?

I am trying to capture the output of the command below in a text file, like this:
timeout 10 glxheads &> test.txt
But unfortunately, no output is written to the text file with this approach.
In fact, any simple command run under timeout fails to write its output to a file this way.
Note:
The command below works on its own:
glxheads &> test.txt
Could anyone suggest a way around this issue?
Thanks!

Following the link Dmitri pointed to, I was able to resolve this issue with either of the following:
stdbuf -oL -eL timeout 10 glxheads &> test.txt
or using
unbuffer timeout 10 glxheads &> test.txt
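For anyone hitting the same thing: the likely cause is stdio buffering. When stdout is redirected to a file, many programs switch from line buffering to block buffering, and the SIGTERM that timeout sends kills the process before the buffer is ever flushed. Here is a minimal sketch that reproduces the effect with GNU grep (which block-buffers when its stdout is not a terminal); the file name and timings are arbitrary:
# grep's stdout is a file, so it block-buffers; timeout signals the whole
# pipeline after 3 seconds and the buffered matches are lost:
timeout 3 sh -c 'while :; do echo tick; sleep 1; done | grep tick' > out.txt
cat out.txt    # empty
# Forcing line buffering makes each match reach the file immediately:
timeout 3 sh -c 'while :; do echo tick; sleep 1; done | grep --line-buffered tick' > out.txt
cat out.txt    # a few "tick" lines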

Related

I don't want to save the "warning" logs in the crontab log

I don't want to save the "warning" messages in the log file that my crontab job creates; I only want the "error" messages. Does anyone know how I can exclude the warnings?
I have tried doing a grep -v but it doesn't work:
45 5 * * * /home/username/barc/backupsql.sh 2>&1 | grep -v 'Warning: Using a password on the command line interface can be insecure.'
Thanks in advance for anyone trying to help me.
Crontab runs one scheduled command or script per entry. Note also that cron does not display output anywhere: by default, anything a job prints is mailed to the owning user (or discarded if no mail agent is configured), so you won't see it unless you save it to a file. The grep filtering is easier to manage inside the script itself than on the crontab line.
I suggest something like this:
Edit your script to pipe its output through your grep filter and into a file, for example by adding:
DATE=$(date +"%m_%d_%Y")
some command | grep -v Warning >> /tmp/$DATE.log # Here
Edit your cron job to run the script every day as you did before, removing everything after the pipe:
45 5 * * * /home/username/barc/backupsql.sh
To monitor the output, you can use the tail command as follows:
tail -f /tmp/$DATE.log
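Putting those pieces together, here is a sketch of what the edited script might look like. The mysqldump invocation is only a guess at what backupsql.sh actually wraps (the quoted warning is MySQL's), so adapt it to the real command:
#!/bin/bash
# backupsql.sh (sketch)
DATE=$(date +"%m_%d_%Y")
# "2>&1 > file" sends stderr to the pipe and stdout (the dump itself) to the
# backup file, so only diagnostics reach grep and the log:
/usr/bin/mysqldump --all-databases 2>&1 > /backups/"$DATE".sql \
  | grep -v 'Warning: Using a password on the command line interface can be insecure.' \
  >> /tmp/"$DATE".log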

bash redirect output to file but result is incomplete

The question of redirecting a command's output has been asked many times, however I am seeing strange behavior. I am using a bash shell (Debian), version
4.3.30(1)-release, and tried to redirect output to a file, but not everything is logged in the file.
The binary I am trying to run is sauce-connect v4.4.1 for Linux (the Sauce Labs client, publicly available on the internet).
If I run
#sudo ./bin/sc --doctor
it shows me complete lines; it prints:
INFO: resolved to '23.42.27.27'
INFO: resolving 'g2.symcb.com' using
DNS server '10.0.0.5'...
(followed by other lines)
INFO: 'google.com' is not in hosts file
INFO: URL https://google.com can be reached
However, if I redirect the same command to a file with the following command
#sudo ./bin/sc --doctor > alloutput.txt 2>&1
and do
#cat alloutput.txt
the same command output is logged, but truncated, as follows:
INFO: resolved to '23.42.2me@mymachine:/opt/$
The line is incomplete, and the lines that should follow are not logged at all (missing).
I have tried >> for appending; it has the same problem. Using command &> alloutput.txt does not capture everything either. Can anyone point out how to get all lines of the above command logged completely to the text file?
UPDATE
In the end I managed to use the binary's native logging with --log alloutput.txt, which provides me with the complete, correct output.
However, I am leaving this question open, as I am still wondering why some lines go missing when using plain output redirection.
You should try stdbuf -o0, like this:
stdbuf -o0 ./bin/sc --doctor 2>&1 | tee -a alloutput.txt
That is a funny problem; I've never seen that happen before. I am going to go out on a limb here and suggest this; see how it works:
sudo ./bin/sc --doctor 2>&1 | tee -a alloutput.txt
#commandtorun &> alloutput.txt
This command redirects both stderr and stdout to the same file.
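If stdbuf and tee still lose the tail end of the output, another option, sketched here rather than tested against sc, is to run the command under a pseudo-terminal with util-linux script(1). stdio then line-buffers exactly as it would interactively, output still appears on screen, and the typescript file gets a copy:
sudo script -c './bin/sc --doctor' alloutput.txt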

How to redirect output of cu to a file?

I am using the cu utility to connect my Cubieboard 1 to my laptop. As the Cubieboard boots, it sends its boot log to my terminal.
What I want is for the output to be displayed on my screen as well as sent to a log file that I specify.
I can't find an option to do this. Any ideas on how I could do that?
You could pipe into tee. You should then see output on screen and have it logged to a file of your choice.
<command> | tee log.txt
Alternatively, if standard error should also go to the file:
<command> 2>&1 | tee log.txt
or
<command> |& tee log.txt
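For cu specifically, that might look like the line below. The device path and baud rate are assumptions, so adjust them to match the Cubieboard's serial console (115200 8N1 is a common default):
cu -l /dev/ttyUSB0 -s 115200 | tee cubieboard-boot.log
The boot log scrolls on screen as usual while tee writes a copy to cubieboard-boot.log.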

Keep a script running through ssh after logout

This is the first question I have posted here. I tried to do a thorough search, but if I missed something (and the answer is obvious somewhere else), please just let me know.
I have a script that runs a program for me, here it is:
csv_file=../data/teste_nohup.csv
trace_file=../data/gnp.trace
declare -i n=100
declare -i p=1
declare -i counter=0
while [ $counter -lt 3 ];
do
n=100
while true
do
nice -19 sage gnptest.py ${n} ${p} | tee -a ${csv_file}
notify-send "finished test gnp ${n} ${p}"
done
done
So, what I'm trying to do is run the gnptest.py program a few times, and have the result be written to the csv_file.
The problem is, that depending on the input, the program may take a long time to complete. So I'd like to connect to the server over ssh, start the program, close the terminal, and check the output file from time to time.
I've tried nohup and disown. nohup creates a huge nohup.out file, full of errors that I don't get when running the script normally (it complains about the -lt operator, for example). But the biggest problem I'm facing is that neither command (nohup or disown -h) runs the program and sends the output to the file specified in the csv_file variable, which is written via the tee command. Also, neither seems to keep running after I log out...
Any help will be much appreciated.
Thanks in advance!!
I have just joined, so I can't add a comment.
Please try using redirection instead of tee in the script.
To get rid of nohup.out, run the script as follows:
nohup script.sh > /dev/null 2>&1 &
If the above produces an error, use:
nohup script.sh > /dev/null 2>&1 </dev/null &
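Concretely, that means changing the tee line inside the loop to a plain append and then detaching the whole script with nohup. A sketch, with the script name assumed:
# inside the script: append to the CSV directly instead of going through tee
nice -19 sage gnptest.py ${n} ${p} >> ${csv_file}
# then launch it so it survives logout, discarding the now-unneeded stdout:
nohup ./gnp_runner.sh > /dev/null 2>&1 < /dev/null &
The trailing & puts the script in the background, and the < /dev/null guards against it blocking on a read from the vanished terminal.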
Hope this will help.

Redirecting Output of Bash Child Scripts

I have a basic script that outputs various status messages. e.g.
~$ ./myscript.sh
0 of 100
1 of 100
2 of 100
...
I wanted to wrap this in a parent script, in order to run a sequence of child scripts and send an email upon overall completion, e.g. topscript.sh:
#!/bin/bash
START=$(date +%s)
/usr/local/bin/myscript.sh
/usr/local/bin/otherscript.sh
/usr/local/bin/anotherscript.sh
RET=$?
END=$(date +%s)
echo -e "Subject:Task Complete\nBegan on $START and finished at $END and exited with status $RET.\n" | sendmail -v group#mydomain.com
I'm running this like:
~$ topscript.sh >/var/log/topscript.log 2>&1
However, when I run tail -f /var/log/topscript.log to inspect the log I see nothing, even though running top shows myscript.sh is currently being executed, and therefore, presumably outputting status messages.
Why isn't the stdout/stderr from the child scripts being captured in the parent's log? How do I fix this?
EDIT: I'm also running these on a remote machine, connected via ssh using pseudo-tty allocation, e.g. ssh -t user@host. Could the pseudo-tty be interfering?
I just tried the following: I have three files, t1.sh, t2.sh, and t3.sh, all with this content:
#!/bin/bash
for ((i=0; i<10; i++)); do
    echo $i of 9
    sleep 1
done
And a script called myscript.sh with the following content:
#!/bin/bash
./t1.sh
./t2.sh
./t3.sh
echo "All Done"
When I run ./myscript.sh > topscript.log 2>&1 and then in another terminal run tail -f topscript.log I see the lines being output just fine in the log file.
Perhaps the programs your subscripts run use a large output buffer? I know that when I've run Python scripts before, they have a fairly big output buffer, so you don't see any output for a while. Do you actually see the entire output in the email sent at the end of topscript.sh? Is it just that you don't see the output while the processes run?
Try:
unbuffer topscript.sh >/var/log/topscript.log 2>&1
Note that unbuffer is not always available as a standard binary on older Unix platforms and may require installing an extra package.
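On most Linux distributions, unbuffer ships with the expect package (for example, apt-get install expect on Debian or Ubuntu). If it is unavailable, stdbuf from GNU coreutils is a rough substitute; it exports its settings through the environment, so programs run by the child scripts inherit the line-buffered stdio as well:
stdbuf -oL -eL topscript.sh > /var/log/topscript.log 2>&1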
I hope this helps.
