How to redirect docker logs with grep command to a text file? [duplicate] - linux

This question already has answers here:
How to 'grep' a continuous stream?
(13 answers)
Closed 1 year ago.
To find specific entries in the docker logs, I am combining docker logs with grep as below:
docker logs -f docker_container 2>&1 | grep "KafkaRecordGenerator:197"
This gives the correct result in the console. I need to redirect these logs to a text file, so I used the command below:
docker logs -f docker_container 2>&1 | grep "KafkaRecordGenerator:197" >> test.txt
Here the new file test.txt is created, but the output is not written to it. How can I redirect the docker logs, filtered through grep, to a text file?

As per anemyte's comment, adding --line-buffered to the grep command fixed my problem. The final command is:
docker logs -f docker_container 2>&1 | grep --line-buffered "KafkaRecordGenerator:197" >> test.txt
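The reason the first attempt produced an empty file is that grep buffers its output when stdout is not a terminal. A minimal sketch of the same pipeline with the docker part replaced by a plain subshell, so it can be run anywhere:

```shell
# A self-contained stand-in for the pipeline above: the subshell plays
# the role of `docker logs -f`, and --line-buffered makes grep flush
# each matching line to test.txt immediately instead of waiting for
# its (roughly 4 KiB) output buffer to fill.
( for i in 1 2 3; do
    echo "KafkaRecordGenerator:197 event $i"
    echo "other noise $i"
  done ) | grep --line-buffered "KafkaRecordGenerator:197" >> test.txt
```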

Related

All combined docker logs with container name

So I am trying to get the combined output of all the container logs, each line prefixed with its container name, into a log file, something like:
docker logs --tail 3 <container name> >> logs.log
docker logs takes a single container, so you would have to run it for each one. I guess you could do something along the lines of:
docker ps --format='{{.Names}}' | xargs -P0 -d '\n' -n1 sh -c 'docker logs "$1" | sed "s/^/$1: /"' _
docker ps --format='{{.Names}}' - print container names
xargs - for input
-d '\n' - for each line
-P0 - execute in parallel with any count of parallel jobs
remove this option if you don't intend to use docker logs --follow
it may cause problems, consider adding stdbuf -oL and sed -u to unbuffer the streams
-n1 - pass one argument to the underlying process
sh -c 'script' _ - execute script for each line with line passed as first positional argument
docker logs "$1" - get the logs
sed "s/^/$1: /" - prepend the container name to each log line (double quotes, so that $1 expands)
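For the --follow case the notes above hint at, a sketch that keeps one docker logs process per container running and unbuffers each stage (assuming GNU stdbuf and sed -u are available):

```shell
# Follow-mode variant: -P0 keeps one `docker logs --follow` per
# container running in parallel; stdbuf -oL line-buffers the log
# stream and sed -u flushes each prefixed line as it arrives.
docker ps --format='{{.Names}}' \
  | xargs -P0 -d '\n' -n1 sh -c \
      'stdbuf -oL docker logs --follow "$1" 2>&1 | sed -u "s/^/$1: /"' _
```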
But a way better and industrial grade solution would be to forward docker logs to journalctl or other logging solution and use that utility to aggregate and filter logs.
Got it.
for i in $(docker ps -a -q); do { docker ps -a --format "table {{.ID}}\t{{.Names}}" | grep "$i"; docker logs --timestamps --tail 1 "$i"; } >> logs.log; done
logs.log is a generic file name.
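A variant of the same loop that avoids grepping the docker ps table for each ID: docker inspect looks the name up per container, and logs.log remains a generic file name.

```shell
# Iterate over bare container IDs and group the name header with the
# last log line so they land in logs.log together.
for id in $(docker ps -a -q); do
  name=$(docker inspect --format '{{.Name}}' "$id")  # note: name carries a leading /
  { echo "== $name =="; docker logs --timestamps --tail 1 "$id"; } >> logs.log
done
```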

Preserve colors of heroku logs output when piping to other command (e.g. grep)

I am using grep to remove a lot of log noise generated e.g. by NewRelic. I do so using the following command:
heroku logs --force-colors -t -a myApp -s app | grep --color=never web.1
Unfortunately the useful coloring of the logs gets lost somewhere, and the output is uncolored.
The --force-colors flag should force the heroku logs command to output colors even when piping the output elsewhere. The --color=never flag is supposed to keep grep from applying its own coloring scheme.
I have tried all possible combinations with absence or presence of these two color flags, to no avail. Does anybody have a suggestion on how to solve this issue?
I have found a solution here:
script -q /dev/null heroku logs --force-colors -t -a myApp -s app | grep --color=never web.1
The color flags are not even necessary, so this works as well:
script -q /dev/null heroku logs -t -a myApp -s app | grep web.1
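Why script helps: most tools emit color only when stdout is a terminal, and script -q /dev/null runs the command inside a pseudo-terminal, so it still believes it is writing to a TTY even though the output is piped. Note that the positional-command form above is the BSD/macOS syntax; with util-linux script on Linux the command is passed via -c instead. A heroku-free check of the effect:

```shell
# Inside script's pseudo-terminal the command sees a TTY on stdout
# even though the overall pipeline is redirected (util-linux syntax;
# script's output carries \r\n line endings, hence the tr).
script -q -c 'if [ -t 1 ]; then echo tty; else echo pipe; fi' /dev/null | tr -d '\r'
```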

Need response time and download time for the URLs and write shell scripts for same

I have use command to get response time :
curl -s -w "%{time_total}\n" -o /dev/null https://www.ziffi.com/suggestadoc/js/ds.ziffi.https.v308.js
and I also need the download time of the js file linked below, so I used the wget command to download it, but I get multi-line output with several parameters. I just need the download time from it:
$ wget --output-document=/dev/null https://www.ziffi.com/suggestadoc/js/ds.ziffi.https.v307.js
Please suggest.
I think what you are looking for is this:
wget --output-document=/dev/null https://www.ziffi.com/suggestadoc/js/ds.ziffi.https.v307.js 2>&1 >/dev/null | grep = | awk '{print $5}' | sed 's/^.*\=//'
Explanation:
2>&1 >/dev/null | --> Makes sure stderr gets piped instead of stdout
grep = --> select the line that contains the '=' symbol
awk '{print $5}' --> print the fifth whitespace-separated field of that line
sed 's/^.*\=//' --> deletes everything from line start to the '=' symbol
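A simpler alternative, if curl is acceptable: the -w "%{time_total}" write-out already used for the response time in the question also reports the total transfer time of a full download, so the wget parsing can be skipped entirely.

```shell
# Download the file, discard the body, and print only the total
# time curl spent on the transfer.
curl -s -w "%{time_total}\n" -o /dev/null \
  https://www.ziffi.com/suggestadoc/js/ds.ziffi.https.v307.js
```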

tail command with dynamic file parameter

I am now using the tail command as below
show_log.sh:
LOGFILE=`ls -1 -r ./myservice.log.????????.?????? | head -n 1`
tail -v -f -s 1 -n 100 ${LOGFILE}
to monitor the log file.
The problem with it is that after each service restart, a new log file will be created, and the prior log file will be compressed. So the tail command stops working.
I need to change the script so that to continue tailing with the new file
Found a way. ojblass's suggestion of the capital-F parameter helped.
Actually I created a link to the latest log file by the following command after each service restart:
ln -n service-blabla.log log_lnk
and changed the tail command like this:
tail -v -F -s 1 -n 100 log_lnk
Note the capital F in the tail command. Lowercase f doesn't work in this situation.
done.
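The difference between -f and -F (shorthand for --follow=name --retry in GNU tail) can be demonstrated without the service itself: rotate a file away mid-tail and recreate it under the same name, as happens at a restart. With -F tail reopens the new file; plain -f would keep reading the old, renamed inode. A self-contained sketch with made-up file names:

```shell
cd "$(mktemp -d)"
echo "first entry" > myservice.log
tail -n +1 -F myservice.log > tailed.txt 2>/dev/null &  # start following
tpid=$!
sleep 1
mv myservice.log myservice.log.old    # the rotation at service restart
echo "second entry" > myservice.log   # fresh log under the same name
sleep 2
kill "$tpid"
cat tailed.txt                        # both entries were captured
```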

Command output redirect to file and terminal [duplicate]

This question already has answers here:
How to redirect output to a file and stdout
(11 answers)
Closed 4 years ago.
I am trying to send command output to a file and to the console at the same time, because I want to keep a record of the output in the file. I am doing the following; it writes to the file but does not print the ls output on the terminal.
$ls 2>&1 > /tmp/ls.txt
Yes, if you redirect the output, it won't appear on the console. Use tee.
ls 2>&1 | tee /tmp/ls.txt
It is worth mentioning that 2>&1 means that standard error will be redirected too, together with standard output. So
someCommand | tee someFile
gives you just the standard output in the file, but not the standard error: standard error will appear in console only. To get standard error in the file too, you can use
someCommand 2>&1 | tee someFile
(source: In the shell, what is "2>&1"?). Finally, both of the above commands will truncate the file and start from scratch. If you run a sequence of commands and want the output and errors of all of them, one after another, you can use the -a flag of the tee command:
someCommand 2>&1 | tee -a someFile
In case somebody needs to append the output instead of overwriting, it is possible to use the "-a" or "--append" option of the tee command:
ls 2>&1 | tee -a /tmp/ls.txt
ls 2>&1 | tee --append /tmp/ls.txt
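One caveat worth knowing when tee is inserted into a pipeline: the exit status you see is tee's, not the original command's, so failures can go unnoticed in scripts. In bash, set -o pipefail (or inspecting PIPESTATUS) restores failure detection; older POSIX shells may lack pipefail.

```shell
# With pipefail the pipeline's status reflects the failing command,
# not just tee's (bash-specific).
set -o pipefail
ls /nonexistent 2>&1 | tee -a /tmp/ls.txt
echo "pipeline exit status: $?"   # non-zero, because ls failed
```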
