Avoid writing progress bars to a file in Linux

I have a process that prints output, but this output also includes loading bars.
I would like to both write this output to a file and display this output.
Normally I could just do:
./my_process.sh | tee -a my_log_file.txt
or
./my_process.sh >> my_log_file.txt
tail -f my_log_file.txt
This prints everything to my terminal; however, it also writes EVERYTHING to the log file, including each step of the progress bar!
I would like to exclude progress bar iterations from getting printed to the log file.
For my purposes, any line with a carriage return can be excluded from the log file. How can I exclude carriage return lines from getting appended to the log file while still printing them to stdout on the terminal?

You can filter what tee writes to the log file, for example:
$ echo -e "progress_ignore\r\nlog this\nprogress_ignore\r" | tee >(awk '!/\r/' >> output.log)
progress_ignore
log this
progress_ignore
$ cat output.log
log this
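Applied to the original command, that approach would look something like this (a sketch, reusing the script and log file names from the question):
./my_process.sh | tee >(awk '!/\r/' >> my_log_file.txt)
Everything still reaches the terminal, while any line containing a carriage return is kept out of my_log_file.txt.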

Related

Continuously print stdout and write to file

I have a program that will intermittently print to stdout. I want to see the output on the screen AND redirect it to a file.
I can use tee as follows:
foo | tee ./log.txt
However, this would print the output to the screen only when foo has terminated, not allowing me to observe the progress of my program.
Is there a way to continuously show the output of the program and redirect it to a log file?
Is it acceptable to write output to a file and display it live?
$> foo > ./log.txt & tail -f ./log.txt
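With GNU tail you can also make the tail stop automatically when foo exits (a sketch; $! is the PID of the backgrounded foo):
$> foo > ./log.txt & tail --pid=$! -f ./log.txt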

How to get printf to write a new file, append an existing file, and write to stdout?

I have a printf command that will write a file but won't print to stdout. I would like to have both so I can let the user see what's happening, and at the same time, write a record to a log file.
printf "%s\n" "This is some text" "That will be written to a file" "There will be several lines" | tee -a bin/logfile.log > bin/newfile.conf
That command appends to the log file and writes to the new file, but writes no output to the screen :(
OS: Centos 7
It's because you're redirecting the screen output with > bin/newfile.conf in addition to what you're doing with tee. Just drop the > and everything after it. If you want to output to both of those files at once in addition to the screen, you can use tee twice, e.g.:
printf ... | tee -a bin/logfile.log | tee bin/newfile.conf
That appends to logfile.log and overwrites newfile.conf, and also writes to the screen. Use or omit the -a option on each tee as needed.
As John1024 points out, you can also use tee once, since it accepts multiple filenames. In that case -a applies to all of the filenames, which is useful when you want the append vs. overwrite behavior to be the same for every file.
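For example, a single tee writing to two files might look like this (a sketch; with -a both files are appended to, without it both are overwritten):
printf "%s\n" "This is some text" | tee -a bin/logfile.log bin/newfile.conf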

Redirect output of command to file and stdout [Only once, not at each command]

I want to have a snapshot of all the commands I executed and output (stdout, stderr) in a file to view later. Often I need to view it for debugging purposes.
This link gives the output for single command:
How to redirect output to a file and stdout
However, I do not want to write it for each command since it is time consuming. Also, this does not log the command I executed.
Example functionality:
bash> start-logging logger.txt
bash> echo "Hi"
Hi
bash> cat 1.txt
Contents of 1.txt
bash> stop-logging
Contents of logger.txt
bash> echo "Hi"
Hi
bash> cat 1.txt
Contents of 1.txt
Ideally, I want the above behavior.
Any other command with some of this functionality missing (for example, not capturing the commands executed) would also help.
You can use
$ bash | tee log
to start logging and then
$ ^D
(that's Ctrl-D)
to stop logging.
Create a new bash session that will log, then terminate it when it's not needed.
But unfortunately, bash writes its prompt and the commands you type to stderr rather than stdout, so you will only catch command output.

Bash standard output display and redirection at the same time

In the terminal, sometimes I would like to display the standard output and also save it as a backup, but if I use redirection (>, &>, etc.), it no longer displays the output in the terminal.
I think I could do something like ls > localbackup.txt | cat localbackup.txt, but it just doesn't feel right. Is there any shortcut to achieve this?
Thank you!
tee is the command you are looking for:
ls | tee localbackup.txt
In addition to using tee to duplicate the output (and it's worth mentioning that tee can append to the file instead of overwriting it with tee -a, so you can run several commands in sequence and retain all of the output; a short sketch of this follows the example below), you can also use tail -f to "follow" the output file from a parallel process (e.g. a separate terminal):
command1 >localbackup.txt # create output file
command2 >>localbackup.txt # append to output
and from a separate terminal, at the same time:
tail -f localbackup.txt # this will keep outputting as text is appended to the file
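As a short sketch of the append behavior mentioned above (same placeholder names as in the example), running commands in sequence while keeping all of their output:
command1 | tee localbackup.txt     # create/overwrite the backup, still showing output on screen
command2 | tee -a localbackup.txt  # append to it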

How to redirect output to a file and stdout

In bash, calling foo would display any output from that command on the stdout.
Calling foo > output would redirect any output from that command to the file specified (in this case 'output').
Is there a way to redirect output to a file and have it display on stdout?
The command you want is named tee:
foo | tee output.file
For example, if you only care about stdout:
ls -a | tee output.file
If you want to include stderr, do:
program [arguments...] 2>&1 | tee outfile
2>&1 redirects channel 2 (stderr/standard error) into channel 1 (stdout/standard output), so that both are written to stdout; tee then copies that combined stream to the screen and to the given output file.
Furthermore, if you want to append to the log file, use tee -a as:
program [arguments...] 2>&1 | tee -a outfile
$ program [arguments...] 2>&1 | tee outfile
2>&1 merges the stderr stream into the stdout stream.
tee outfile takes the stream it gets and writes it to the screen and to the file "outfile".
This is probably what most people are looking for. The likely situation is some program or script is working hard for a long time and producing a lot of output. The user wants to check it periodically for progress, but also wants the output written to a file.
The problem (especially when mixing stdout and stderr streams) is that there is reliance on the streams being flushed by the program. If, for example, all the writes to stdout are not flushed, but all the writes to stderr are flushed, then they'll end up out of chronological order in the output file and on the screen.
It's also bad if the program only outputs 1 or 2 lines every few minutes to report progress. In such a case, if the output was not flushed by the program, the user wouldn't see any output on the screen for hours, because none of it would get pushed through the pipe until the buffer fills or the program exits.
Update: The program unbuffer, part of the expect package, will solve the buffering problem. This will cause stdout and stderr to write to the screen and file immediately and keep them in sync when being combined and redirected to tee. E.g.:
$ unbuffer program [arguments...] 2>&1 | tee outfile
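If unbuffer is not available, GNU coreutils' stdbuf can often achieve a similar effect for programs that rely on default stdio buffering (a sketch; it won't help programs that set their own buffering):
$ stdbuf -oL -eL program [arguments...] 2>&1 | tee outfile
-oL and -eL switch stdout and stderr to line buffering, so lines appear in the pipe as soon as they are printed.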
Another way that works for me is,
<command> |& tee <outputFile>
as documented in the GNU Bash manual.
Example:
ls |& tee files.txt
If ‘|&’ is used, command1’s standard error, in addition to its standard output, is connected to command2’s standard input through the pipe; it is shorthand for 2>&1 |. This implicit redirection of the standard error to the standard output is performed after any redirections specified by the command.
For more information, refer to the Bash manual's section on redirection.
You can use Zoredache's solution, but if you don't want to overwrite the output file, use tee with the -a option as follows:
ls -lR / | tee -a output.file
Something to add...
The unbuffer package has support issues with some packages under Fedora and Red Hat releases.
Setting those troubles aside, the following worked for me:
bash myscript.sh 2>&1 | tee output.log
Thank you ScDF & matthew, your inputs saved me a lot of time.
Using tail -f on the output file should also work.
In my case I had a Java process that produced output logs. The simplest solution to display the output logs and also redirect them into a file (named logfile here) was:
my_java_process_run_script.sh |& tee logfile
The result was the Java process running with its output logs displayed on screen and written to the file named logfile.
You can do that for your entire script by using something like this at the beginning of your script:
#!/usr/bin/env bash
test "x$1" = x$'\x00' && shift || { set -o pipefail ; ( exec 2>&1 ; $0 $'\x00' "$@" ) | tee mylogfile ; exit $? ; }
# do whatever you want
This redirects both stderr and stdout to the file called mylogfile, while still letting everything go to stdout at the same time.
It uses a few tricks:
use exec without a command to set up the redirections,
use tee to duplicate the outputs,
restart the script with the wanted redirections,
use a special first parameter (a single NUL character, written with bash's $'string' notation) to mark that the script has been restarted (no legitimate argument to your script should ever equal it),
try to preserve the original exit status when restarting the script by using the pipefail option.
Ugly, but useful for me in certain situations.
Bonus answer since this use-case brought me here:
In the case where you need to do this as some other user
echo "some output" | sudo -u some_user tee /some/path/some_file
Note that the echo runs as you, while the file write happens as "some_user". What will NOT work is running the echo as "some_user" and redirecting the output with >> "some_file", because the file redirection would still happen as you.
Hint: tee also supports appending with the -a flag. If you need to replace a line in a file as another user, you could run sed as the desired user; a sketch follows.
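For the sed hint, a minimal sketch (assuming GNU sed; the pattern and replacement are just placeholders, user and file names as above):
sudo -u some_user sed -i 's/old line/new line/' /some/path/some_file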
<command> |& tee filename # this creates the file "filename" with the command's output (stdout and stderr) as its content; if the file already exists, its existing content is overwritten.
<command> | tee >> filename # this appends the output to the file, but prints nothing to standard output (the screen), because tee's own stdout is redirected into the file.
I want to print something using "echo" on the screen and append that echoed data to a file:
echo "hi there, Have to print this on screen and append to a file"
tee is perfect for this, but something like this will also do the job:
ls -lr / > output; cat output
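With tee, the echo example above would be shown on screen and appended to a file like this (the log file name is just a placeholder):
echo "hi there, Have to print this on screen and append to a file" | tee -a mylog.txt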
