Write into .log file - linux

I'm making a script, and every time something is done I would like to write into my custom .log file. How do I do that?
And in the end, I'd just like to read it back with Bash. Do I just use cat?
Thanks.

The simplest syntax I always use is 2>&1 | tee -a file_name.log.
The syntax can be used after a command or execution of a file. e.g.
find . -type f 2>&1 | tee -a file_name.log
or
./test.sh 2>&1 | tee -a file_name.log

Just echo "<log message here>" >> custom.log.
The >> appends to the bottom of the file, whereas > would delete the contents of the file first and then write the message.
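A quick sketch of the difference, using a temporary file as a stand-in for your custom log:

```shell
# Sketch: >> appends, > truncates. $log stands in for custom.log.
log=$(mktemp)
echo "step 1 done" >> "$log"   # appended
echo "step 2 done" >> "$log"   # appended below step 1
cat "$log"                     # both lines are present
echo "starting over" > "$log"  # > truncates first: earlier lines are gone
cat "$log"                     # only "starting over" remains
rm "$log"
```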

Related

How to write the output of a command twice into a file

I run a command that generates IP addresses as output, but I am developing a workflow where I need the addresses written twice. Below is the sample command and its output.
$ some command >> out.txt
$ cat out.txt
10.241.1.85
hdfs://10.241.1.236/
hdfs://10.241.1.237/
What I want is to duplicate the output so it looks like this.
10.241.1.85
hdfs://10.241.1.236/
hdfs://10.241.1.237/
10.241.1.85
hdfs://10.241.1.236/
hdfs://10.241.1.237/
Any help please?
The solution given by @ott in a comment seems fine:
var=$(some cmd); echo -e "$var\n$var"
This is not assigning the command to a variable but assigning the output of the command to a variable.
When you do not want this, you can use tee (though this may give some ordering problems) or duplicate it differently:
some_command > out.txt.tmp
cat out.txt.tmp out.txt.tmp > out.txt
rm out.txt.tmp
This way all the lines of the first copy come before all the lines of the second. When you want to double each line in place instead, you can use
some_command | sed 'p' > out.txt
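Note that sed 'p' duplicates line by line, not the whole block. A sketch of the difference, with printf standing in for the real command:

```shell
# sed 'p' prints every line twice as it streams through, so the duplication
# is per line (a a b b), not a repeat of the whole block (a b a b).
printf '10.241.1.85\nhdfs://10.241.1.236/\n' | sed 'p'
# To repeat the whole block, capture it first:
out=$(printf '10.241.1.85\nhdfs://10.241.1.236/\n')
printf '%s\n%s\n' "$out" "$out"
```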
some command | tee -a out.txt out.txt
Or
some command | tee -a out.txt >> out.txt
Or
some command | tee -a out.txt out.txt >/dev/null
- Run command
- Pipe to tee
- Enable append mode (-a)
- Append to the same file twice
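A sketch of the same-file-twice variant. An assumption to note: tee writes each chunk it reads to every output in turn, so small outputs land as two whole copies; output larger than one read buffer may interleave.

```shell
# tee with the same file named twice: both descriptors are opened in
# append mode, so the captured output lands in the file twice.
log=$(mktemp)
printf '10.241.1.85\nhdfs://10.241.1.236/\n' | tee -a "$log" "$log" > /dev/null
cat "$log"    # the two lines appear twice
rm "$log"
```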
Generate the output to a temporary file, then duplicate the temporary into the destination and remove the temporary:
some command > /tmp/out.txt; cat /tmp/out.txt /tmp/out.txt > out.txt; rm /tmp/out.txt
Here are some more options you could play around with. Seeing as the output is too large to store in a variable, I'd probably go with tee, a temp file, and gzip if the disk write speed is a bottleneck.
someCommand > tmp.txt && cat tmp.txt tmp.txt > out.txt && rm tmp.txt
Now, if the disk read/write speed is a bottleneck, you can tee the output of someCommand and redirect one of the pipelines through gzip initially.
someCommand | tee >(gzip > tmp.gz) > out.txt && gunzip -c tmp.gz >> out.txt && rm tmp.gz
Additionally, if you don't need random access abilities for out.txt and plan on processing it through some other pipeline, you could always keep it stored gzipped until you need it.
someCommand | gzip > tmp.gz && cat tmp.gz tmp.gz > out.txt.gz && rm tmp.gz
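This works because concatenated gzip streams are themselves a valid gzip stream, so the doubled .gz file decompresses to the doubled output. A minimal sketch with printf standing in for someCommand:

```shell
# cat-ing a gzip file onto itself yields a multi-member gzip stream;
# gunzip decompresses the members in sequence, doubling the output.
tmp=$(mktemp)
printf '10.241.1.85\n' | gzip > "$tmp"
cat "$tmp" "$tmp" | gunzip    # prints the line twice
rm "$tmp"
```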
I would suggest this:
(someCommand | tee tmp.txt; cat tmp.txt) > out.txt; rm tmp.txt
Not sure there's a way to safely do this without resorting to a temporary file. You could capture it to a variable, as some have suggested, but then you have to be careful about quoting to make sure whitespace doesn't get mangled, and you might also run into problems if the output is particularly large.
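The tee + temp-file suggestion above, sketched end to end with printf standing in for someCommand (temporary files replace tmp.txt and out.txt):

```shell
# The subshell streams the output once (tee keeps a copy in $tmp), then
# cat replays the copy; the whole group's stdout goes to $out.
tmp=$(mktemp); out=$(mktemp)
( printf '10.241.1.85\n' | tee "$tmp"; cat "$tmp" ) > "$out"
cat "$out"    # the output appears twice
rm "$tmp" "$out"
```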

cat file_name | grep "something" results "cat: grep: No such file or directory" in shell scripting

I have written a shell script which reads commands from an input file and executes them. I have a command like:
cat linux_unit_test_commands | grep "dmesg"
in the input file. I am getting the below error messages while executing the shell script:
cat: |: No such file or directory
cat: grep: No such file or directory
cat: "dmesg": No such file or directory
Script:
#!/bin/bash
while read line
do
output=`$line`
echo $output >> logs
done < $1
Below is the input file (example_commands):
ls
date
cat linux_unit_test_commands | grep "dmesg"
Execute: ./linux_unit_tests.sh example_commands
Please help me to resolve this issue.
Special characters like | and " are not parsed after expanding variables; the only processing done after variable expansion is word splitting and wildcard expansions. If you want the line to be parsed fully, you need to use eval:
while read line
do
output=`eval "$line"`
echo "$output" >> logs
done < $1
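A self-contained sketch of the eval-based runner, with a hypothetical one-line commands file: eval re-parses each line, so pipes and quotes work as they would at a prompt.

```shell
# Build a commands file, run each line through eval, and log the output.
cmds=$(mktemp); logs=$(mktemp)
printf 'echo one two | tr a-z A-Z\n' > "$cmds"
while read -r line
do
    output=$(eval "$line")   # the pipe inside the line is honored here
    echo "$output" >> "$logs"
done < "$cmds"
cat "$logs"    # ONE TWO
rm "$cmds" "$logs"
```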
You might be wondering why it's not working with the cat command. Here is the answer.
output=`$line` i.e. output=`cat linux_unit_test_commands | grep "dmesg"`
Here the cat command takes all of (linux_unit_test_commands | grep "dmesg") as arguments, i.e. as file names.
From Man page:
SYNTAX : cat [OPTION]... [FILE]...
Script is OK!
#!/bin/bash
while read line;
do
output=`$line`
echo $output >> logs
done < $1
To make it work you need to change the line cat linux_unit_test_commands | grep "dmesg" in the input file to grep "dmesg" linux_unit_test_commands. It will work!
cat linux_unit_test_commands
ls
date
grep "dmesg" linux_unit_test_commands

Shell script to log output of console

I want to grep the output of my script, which itself contains calls to different binaries...
Since the script runs multiple binaries, I can't simply put exec at the top and dump the output to a file (that does not capture the output from the binaries)...
And to let you know, I am monitoring the script output to determine whether the system has got stuck!
Why don't you append instead?
mybin1 | grep '...' >> mylog.txt
mybin2 | grep '...' >> mylog.txt
mybin3 | grep '...' >> mylog.txt
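The append approach above, sketched with echo standing in for the hypothetical mybin1/mybin2 binaries: each filtered stream lands in the same log, one after another.

```shell
# Each command's grep-filtered output is appended to a shared log.
log=$(mktemp)
echo "bin1: progress ok"  | grep 'ok'    >> "$log"
echo "bin2: system stuck" | grep 'stuck' >> "$log"
cat "$log"    # both matched lines, in order
rm "$log"
```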
Does this not work?
#!/bin/bash
exec 11>&1 12>&2 > >(exec tee /var/log/somewhere) 2>&1 ## Or add -a option to tee to append.
# call your binaries here
exec >&- 2>&- >&11 2>&12 11>&- 12>&-

Command output redirect to file and terminal [duplicate]

This question already has answers here:
How to redirect output to a file and stdout
(11 answers)
Closed 4 years ago.
I am trying to send command output to a file and to the console as well, because I want to keep a record of the output in the file. I am doing the following; it appends to the file but does not print the ls output on the terminal.
$ls 2>&1 > /tmp/ls.txt
Yes, if you redirect the output, it won't appear on the console. Use tee.
ls 2>&1 | tee /tmp/ls.txt
It is worth mentioning that 2>&1 means that standard error will be redirected too, together with standard output. So
someCommand | tee someFile
gives you just the standard output in the file, but not the standard error: standard error will appear in console only. To get standard error in the file too, you can use
someCommand 2>&1 | tee someFile
(source: In the shell, what is " 2>&1 "?). Finally, both the above commands will truncate the file and start fresh. If you run a sequence of commands, you may want to collect the output and errors of all of them, one after another. In this case use the -a flag with the tee command:
someCommand 2>&1 | tee -a someFile
In case somebody needs to append the output instead of overwriting it, use the -a or --append option of the tee command:
ls 2>&1 | tee -a /tmp/ls.txt
ls 2>&1 | tee --append /tmp/ls.txt
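A sketch of tee -a across several commands (echo stands in for ls): each run's output reaches the terminal (discarded here) and accumulates in the log instead of replacing it.

```shell
# With -a, each tee invocation appends, so the log grows run by run.
log=$(mktemp)
echo "first run"  | tee -a "$log" > /dev/null
echo "second run" | tee -a "$log" > /dev/null
wc -l < "$log"    # 2
rm "$log"
```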

How to log output in bash and see it in the terminal at the same time?

I have some scripts where I need to see the output and log the result to a file, with the simplest example being:
$ update-client > my.log
I want to be able to see the output of the command while it's running, but also have it logged to the file. I also log stderr, so I would want to be able to log the error stream while seeing it as well.
update-client 2>&1 | tee my.log
2>&1 redirects standard error to standard output, and tee sends its standard input to standard output and the file.
Just use tail to watch the file as it's updated. Background your original process by adding & after the command, then run:
$ tail -f my.log
It will continuously update. (Note it won't tell you when the process has finished, so you can write something to the log to mark completion. Press Ctrl-C to exit tail.)
You can use the tee command for that:
command | tee /path/to/logfile
The equivalent without writing to the terminal would be:
command > /path/to/logfile
If you want to append (>>) and show the output in the shell, use the -a option:
command | tee -a /path/to/logfile
Please note that the pipe will catch stdout only, errors to stderr are not processed by the pipe with tee. If you want to log errors (from stderr), use:
command 2>&1 | tee /path/to/logfile
This means: run command and redirect the stderr stream (2) to stdout (1). That will be passed to the pipe with the tee application.
Learn more about this at the Ask Ubuntu site.
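A sketch of the stdout/stderr difference described above: without 2>&1 only stdout enters the pipe, and stderr bypasses tee straight to the terminal.

```shell
# With 2>&1 in place, tee records both streams in the log file.
log=$(mktemp)
{ echo "normal output"; echo "error output" >&2; } 2>&1 | tee "$log" > /dev/null
cat "$log"    # both lines were captured
rm "$log"
```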
Another option is to use block-based output capture from within the script (not sure if that is the correct technical term).
Example
#!/bin/bash
{
echo "I will be sent to screen and file"
ls ~
} 2>&1 | tee -a /tmp/logfile.log
echo "I will be sent to just terminal"
I like to have more control and flexibility - so I prefer this way.
