Store output of every command with timestamp in a file in Linux

Is there any way to store the output of every command in a log file with a timestamp?
I have tried this script but it did nothing.
mkdir /home/my_name/demo |& tee /home/my_name/My_log.log

mkdir produces no output on success, so there is nothing for tee to capture. Also, you need to use ts to add the timestamp:
echo hello | ts '[%Y-%m-%d %H:%M:%S]' | tee /home/my_name/My_log.log
ts might not be installed on your system, but it can be found in the package moreutils.
If you have multiple commands you want to log, you can put them in a script and then pipe the output of the script through the pipeline above:
myscript | ts '[%Y-%m-%d %H:%M:%S]' | tee /home/my_name/My_log.log
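If ts is not available and installing moreutils is not an option, a plain while read loop with date can stand in for it (a sketch; log_ts and the log file name are made-up):

```shell
# A stand-in for ts when moreutils is unavailable: prepend a
# timestamp to every line of stdin, then append to a log via tee -a.
log_ts() {
  while IFS= read -r line; do
    printf '[%s] %s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$line"
  done
}

echo hello | log_ts | tee -a my_log.log
```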

Use the >> operator to write your output to a file. You can use the tee command as well; the only difference is that >> doesn't also write the output to stdout.
Have your script or command executed something like this:
customScript | ts '[%Y-%m-%d %H:%M:%S]' | tee -a /home/my_name/My_log.log
or
customScript | ts '[%Y-%m-%d %H:%M:%S]' >> /home/my_name/My_log.log
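The difference between the two is easy to check with a toy command (demo.log is just an example name):

```shell
# tee -a appends to the file AND echoes each line to stdout;
# >> appends to the file only, printing nothing.
echo "via tee" | tee -a demo.log   # line also appears on the terminal
echo "via append" >> demo.log      # nothing appears on the terminal
```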

Related

Bash Multiple Process Substitution Redirect Order

I am trying to use multiple process substitutions in a BASH command but I seem to be misunderstanding the order in which they resolve and redirect to each other.
The System
Ubuntu 18.04
BASH version - GNU bash, version 4.4.20(1)-release (x86_64-pc-linux-gnu)
The Problem
I am trying to redirect an output of a command into tee, have that redirect into ts (adding a timestamp) and then have that redirect into split (splitting the output into separate files). I can get the output to redirect into tee and ts but when redirecting into split I run into a problem.
My Attempts
command >(tee -a >(ts '[%Y-%m-%d %H:%M:%S]' > tempfile.txt)) - this redirects the output into the process substitution of tee, then redirects to the process substitution ts, which adds the timestamp and redirects to tempfile.txt. This is what I would expect.
command >(tee -a >(ts '[%Y-%m-%d %H:%M:%S]' >(split -d -b 10 -))) - this does nothing, even though I had hoped the result would be a bunch of 10-byte files with timestamps on the different rows.
To continue testing I tried with echo instead to see what happens
command >(tee -a >(ts '[%Y-%m-%d %H:%M:%S]' >(echo))) - the print from the initial tee prints (as it should), but the echo prints an empty line. Apparently this is irrelevant because of a new result I got - see edit at the bottom.
command >(tee -a >(ts '[%Y-%m-%d %H:%M:%S]') >(split -d -b 10 -)) - This prints the command with the timestamp (as tee and ts should) and in addition creates 10 byte files with the command output (no timestamp on them). - this is what I expected and makes sense as the tee gets redirected to both process substitutions separately, it was mostly a sanity check
What I think is happening
From what I can tell, >(ts '[%Y-%m-%d %H:%M:%S]' >(split -d -b 10 -)) resolves first as a complete, separate command of its own. Thus split (and echo) receive empty output from ts, which has no output on its own. Only after this does the actual command resolve and send its output to its substitution tee.
This doesn't explain why command >(tee -a >(ts '[%Y-%m-%d %H:%M:%S]' > tempfile.txt)) does work, as by this theory tee by itself has no output, so ts should be receiving no input and should also output a blank.
All this is to say I am not really sure what is happening.
What I want
Basically I just want to understand how to make command >(tee -a >(ts '[%Y-%m-%d %H:%M:%S]' >(split -d -b 10 -))) work the way it seems it should. I need the command's output to go to the process substitution tee, which sends it to the process substitution ts to add the timestamps, which sends it to split to split the output into several small files.
I have tried command > >(echo) and saw the output is blank, which is not what I expected (I expected echo to receive and then output the command's output). I think I am just very much misunderstanding how process substitution works at this point.
You can send the error stream from the command into a different pipeline than the output, if that is desired:
{ { cmd 2>&3 | ts ... | split; } 3>&1 >&4 | ts ... | split; } 4>&1
This sends the output of cmd to the first pipeline, while the error stream from cmd goes into the 2nd pipe. File descriptor 3 is introduced to keep the error streams from ts and split separate, but that may be undesirable. fd 4 is introduced to prevent the output of split from being consumed by the second pipeline, and that may be unnecessary (if split does not produce any output, for example.)
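A concrete, runnable instance of that pattern, with sed stages standing in for ts ... | split so it can be tried without moreutils:

```shell
# stdout of the inner command flows through the first sed; stderr
# flows through the second; fd 4 carries the first pipeline's
# result past the second pipe so the two streams stay separate.
{ { sh -c 'echo out; echo err >&2' 2>&3 |
      sed 's/^/STDOUT: /'
  } 3>&1 >&4 |
      sed 's/^/STDERR: /'
} 4>&1
```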
One thing you could do, if you really want one command to redirect stdout/stderr each to a separate ts|tee|split, is this:
command 1> >(ts '[%Y-%m-%d %H:%M:%S]' | tee -a >(split -d -b 10 -)) 2> >(ts '[%Y-%m-%d %H:%M:%S]' | tee -a >(split -d -b 10 -))
But the downside is tee only prints after the prompt gets printed. There is probably a way to avoid this by duplicating file descriptors, but this is the best I could think of.
This:
ts '[%Y-%m-%d %H:%M:%S]' >(split -d -b 10 -)
expands the file name generated by the process substitution on the command line of ts, so what gets run is something like ts '[%Y-%m-%d %H:%M:%S]' /dev/fd/63. ts then tries to open the fd that goes to split to read input from there, instead of reading from the original stdin.
That's probably not what you want, and on my machine, I got some copies of ts and split stuck in the background while testing. Possibly successfully connected to each other, which may explain the lack of error messages.
You probably meant to write
ts '[%Y-%m-%d %H:%M:%S]' > >(split -d -b 10 -)
^
with a redirection to the process substitution.
That said, you could just use a pipe there between ts and split.
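You can watch that expansion happen directly: on Linux, bash turns a bare >(...) into a /dev/fd path on the command line, which is exactly what ts received as a file name here:

```shell
# echo just prints its arguments, so this shows the path that
# >(true) expands to (something like /dev/fd/63 on Linux).
bash -c 'echo >(true)'
```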

How to reference the output of the previous command twice in Linux command?

For instance, if I'd like to reference the output of the previous command once, I can use the command below:
ls *.txt | xargs -I % ls -l %
But how to reference the output twice? Like how can I implement something like:
ls *.txt | xargs -I % 'some command' % > %
PS: I know how to do it in shell script, but I just want a simpler way to do it.
You can pass this argument to bash -c:
ls *.txt | xargs -I % bash -c 'ls -l "$1" > "out.$1"' - %
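The lone - matters: bash -c assigns the first argument after the command string to $0 and only later ones to $1, $2, ... - the - is just a throwaway $0. A quick check:

```shell
# '-' lands in $0 (a throwaway), 'hello' lands in $1.
bash -c 'echo "zero=$0 one=$1"' - hello
# prints: zero=- one=hello
```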
You can look up 'tpipe' on SO; it will also lead you to 'pee' (which is not a good search term elsewhere on the internet). Basically, they're variants of the tee command which write to multiple processes instead of writing to files like tee does.
However, with Bash, you can use Process Substitution:
ls *.txt | tee >(cmd1) >(cmd2)
This will write the input to tee to each of the commands cmd1 and cmd2.
You can arrange to lose standard output in at least two different ways:
ls *.txt | tee >(cmd1) >(cmd2) >/dev/null
ls *.txt | tee >(cmd1) | cmd2
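For example, counting lines and upper-casing the same stream at once (file names are illustrative; the sleep gives the asynchronous process substitutions time to finish writing):

```shell
# Run under bash: process substitution is not POSIX sh.
bash -c '
  printf "a\nb\nc\n" |
    tee >(wc -l > count.txt) >(tr a-z A-Z > upper.txt) > /dev/null
  sleep 1   # the substituted processes finish asynchronously
'
```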

Shell script to log output of console

I want to grep the output of my script, which itself contains calls to different binaries...
Since the script runs multiple binaries, I can't simply put exec and dump the output to a file (it does not copy output from the binaries)...
And to let you know, I am monitoring the script output to determine if the system has got stuck!
Why don't you append instead?
mybin1 | grep '...' >> mylog.txt
mybin2 | grep '...' >> mylog.txt
mybin3 | grep '...' >> mylog.txt
Does this not work?
#!/bin/bash
exec 11>&1 12>&2 > >(exec tee /var/log/somewhere) 2>&1 ## Or add -a option to tee to append.
# call your binaries here
exec >&- 2>&- >&11 2>&12 11>&- 12>&-
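A minimal version of that save/point/restore dance, without the tee (script.log is an example name):

```shell
# Save stdout on fd 11, aim stdout at a log file, run commands,
# then restore stdout and close the spare descriptor.
exec 11>&1 > script.log
echo "this goes to script.log"
exec >&11 11>&-
echo "this goes to the terminal again"
```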

How to redirect ubuntu terminal output to a file?

I have tried redirecting the terminal output to a file using tee and >, as in the examples here and in the question. It worked for echo test | tee log.txt and ls -l | tee log.txt.
But it does not work (nothing is added to log.txt) when I run a command like divine verify file.dve | tee log.txt,
where divine is an installed tool. Any ideas or alternatives?
Try divine verify file.dve 2>&1 | tee log.txt. If the program is outputting to stderr instead of stdout, this redirects stderr to stdout.
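A stand-in that prints to stderr (simulating a tool like divine here) shows the difference:

```shell
# Without 2>&1 the message bypasses the pipe and the log stays
# empty; with 2>&1, tee captures it.
sh -c 'echo "to stderr" >&2' | tee plain.log
sh -c 'echo "to stderr" >&2' 2>&1 | tee fixed.log
```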
This works on ffmpeg output too:
{ echo ffmpeg -i [rest of command]; ffmpeg -i [rest of command]; } 2>&1 | tee ffmpeg.txt
Use tee -a to append if the file already exists.
======
Also, if you want to see mediainfo on all files in a folder and make sure the command itself is also visible in mediainfo.txt:
{ echo mediainfo *; mediainfo *; } 2>&1 | tee mediainfo.txt
NB: { echo cmd; cmd; } keeps the command text in the .txt file; without it, the command is not printed.
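The grouping trick in miniature, with date standing in for the longer commands:

```shell
# The first line of the log is the command text itself; the second
# is its output, both captured through the same pipe.
{ echo "date -u"; date -u; } 2>&1 | tee session.txt
```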

Write into .log file

I'm making a script and every time something is done I would like to write into my custom .log file. How do I do that?
And in the end, I'd just like to read it with Bash... do I just use cat?
Thanks.
The simplest syntax I always use is 2>&1 | tee -a file_name.log.
It can be used after a command or when executing a file, e.g.:
find . -type f 2>&1 | tee -a file_name.log
or
./test.sh 2>&1 | tee -a file_name.log
Just echo "log message here" >> custom.log.
The >> appends to the bottom of the file, whereas > would delete the contents of the file and then write the message. And yes, cat custom.log is fine for reading it back.
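For instance (custom.log is whatever path your script uses):

```shell
echo "step 1 done" >> custom.log   # creates the file if it doesn't exist
echo "step 2 done" >> custom.log   # appends below the first line
cat custom.log                     # reading it back with cat is fine
```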
