I am trying to configure my PHP errors to output to docker logs. All documentation I have read indicates that docker logs is tied to the container's stdout and stderr, which come from /proc/self/fd/1 and /proc/self/fd/2. I created a symlink from my PHP error log file, /var/log/php_errors.log, to /proc/self/fd/1 with the command:
ln -sf /proc/self/fd/1 /var/log/php_errors.log
After linking the error log, I tested it by running this PHP script:
<?php
error_log("This is a custom error message constructed to test the php error logging functionality of this site.\n");
?>
The error message is echoed to the console, so I can see that PHP error logging is now redirected to stdout inside the container. But when I run docker logs -f <containername>, I never see the error message in the logs. Echoing from inside the container doesn't show up in the logs either, which is confusing, because my understanding is that echo writes to stdout.
Further reading informed me that docker logs only shows output from PID 1, which could be the issue. If that is the case, how can I correctly configure my PHP error logging to show up in docker logs outside the container?
I have also checked that I am using the default json-file Docker logging driver, and I have tried this both in my local environment and on a web server.
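For what it's worth, one possible culprit (an assumption, not confirmed by the question): /proc/self/fd/1 is re-resolved at open() time by whichever process opens it, so the symlink points at the writer's own stdout, not at the stdout of PID 1 that docker logs captures. A PHP-FPM worker's stdout is usually not the container's stdout. The resolution behaviour can be seen outside Docker:

```shell
# /proc/self/fd/1 resolves at open() time to the *opening* process's stdout,
# not to PID 1's stdout; inside a container, PID 1's stream would be
# /proc/1/fd/1 instead (a hypothetical fix, untested here).
ln -sf /proc/self/fd/1 /tmp/php_errors.log
captured=$(echo "test error message" > /tmp/php_errors.log)
echo "captured on our own stdout: $captured"
rm -f /tmp/php_errors.log
```

Because "self" is evaluated by the writer, each process that opens the symlink writes to its own stdout, which explains why output can appear in the console yet never reach PID 1's stream.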
Related
I have a Java application that uses log4j2. When I run the application, it creates the log files in the logs folder and writes debug statements to them.
However, when I build a Docker image and run it, I can see the logs folder being created inside the container and the log written to the file, but when I run the docker logs command I don't see any logs.
I have several modules, each with its corresponding log4j file, but when I run the docker logs command not all of the logs are printed, and I want all of them to appear.
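A likely cause (an assumption, since the configuration isn't shown): log4j2 is configured with only file appenders, so nothing ever reaches the container's stdout, which is all docker logs reads. A minimal log4j2.xml sketch that also logs to the console (the appender name and pattern here are illustrative):

<Configuration>
  <Appenders>
    <Console name="Stdout" target="SYSTEM_OUT">
      <PatternLayout pattern="%d %p %c - %m%n"/>
    </Console>
  </Appenders>
  <Loggers>
    <Root level="debug">
      <AppenderRef ref="Stdout"/>
    </Root>
  </Loggers>
</Configuration>

Existing file appenders can be kept alongside the Console appender by adding a second AppenderRef under the same logger.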
I run a simple Flask application in a Docker container.
To run it I do:
docker run --name my_container_name my_image_name
The logs are redirected to stdout, so after this command I see as output:
* Serving Flask app 'main'
* Debug mode: off
WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
* Running on all addresses (0.0.0.0)
* Running on http://127.0.0.1:8080
* Running on http://172.17.0.2:8080
Press CTRL+C to quit
When I want to get the logs with a separate command, I do:
docker container logs my_container_name
It returns the logs correctly: exactly the same output as written above.
But if I try to redirect the output to a file:
docker container logs my_container_name > mylogfile.log
I don't get all the logs! I get only:
* Serving Flask app 'main'
* Debug mode: off
Why is that?
Running the container with a dedicated pseudo-TTY solves the problem.
docker run -t --name my_container_name my_image_name
The -t parameter solved my issue, but I don't understand why.
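A likely explanation (consistent with how Docker attaches streams, though not verified against this exact image): without -t, the container keeps stdout and stderr as two separate streams, and werkzeug writes its startup warning and URLs to stderr; `> mylogfile.log` redirects only stdout, so the stderr lines keep going to the terminal. With -t, both streams are merged into the single pseudo-TTY stream, so the redirect captures everything. The same effect, reproduced without Docker:

```shell
# stand-in for `docker container logs my_container_name > mylogfile.log`:
# only stdout reaches the file; the stderr line goes to the terminal instead
sh -c 'echo "* Serving Flask app"; echo "WARNING: development server" 1>&2' \
  > /tmp/mylogfile.log 2>/dev/null
cat /tmp/mylogfile.log   # contains only the stdout line
```

Without -t, appending 2>&1 to the docker container logs redirection should also capture the missing lines.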
No need to redirect logs; just use:
tail -f `docker inspect --format='{{.LogPath}}' my_container_name`
or, if you don't like that, you can try:
docker logs -f my_container_name &> my_container.log
the trick is the &>, which redirects both output streams (stdout and stderr) to the file
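Note that &> is a bash extension; in a POSIX shell the equivalent spelling is `docker logs -f my_container_name > my_container.log 2>&1`. A quick local check (using a stand-in command, since no container is assumed here) that this form really captures both streams:

```shell
# stand-in for docker logs: a command that writes to both streams;
# `> file 2>&1` sends stdout to the file, then points stderr at the same fd
sh -c 'echo "stdout line"; echo "stderr line" 1>&2' > /tmp/my_container.log 2>&1
cat /tmp/my_container.log   # both lines are in the file
```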
I have configured Alertmanager to send a mail every time an alert is triggered. However, for an unknown reason, I'm not receiving any mail.
How can I debug this? Is there a log file stored somewhere?
How have you started Alertmanager? The tool shows warnings/errors in the terminal. You can start Alertmanager redirecting its output to a log file, as in the following example:
ALERTMANAGER-INSTALL-PATH/alertmanager >> ALERTMANAGER-LOG-PATH/alertmanager.log 2>&1 &
If you're running Alertmanager inside a Docker container, try using docker logs.
For example, in a Node.js container I do:
throw new Error('lol'); or console.error('lol');
But when I open container logs: docker-compose logs -f nodejs
there are no statuses or colors; all logs appear with info status.
I use Datadog to collect logs from the container, and it also marks all logs as 'info'.
docker logs and similar commands just collect the stdout and stderr streams from the main process running inside the container. There is no "log level" associated with them, though some systems may treat or highlight the two streams differently.
As a basic example, you could run
docker run -d --name lister --rm busybox ls /
docker logs lister
The resulting file listing isn't especially "error" or "debug" level.
The production-oriented setups I'm used to include the log level in log messages (in a Node context, I've used the Winston logging library), and then use a tool like fluentd to collect and parse those messages.
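A sketch of that approach (the function name and JSON shape here are mine, not Winston's): put the level inside the log line itself, so whatever reaches stdout stays parseable by a collector such as fluentd or Datadog:

```shell
# emit one JSON object per line with an explicit "level" field;
# docker logs passes the text through untouched, and the collector
# parses the level back out instead of guessing from the stream
log_json() { printf '{"level":"%s","message":"%s"}\n' "$1" "$2"; }
log_json info  "server started"
log_json error "lol"
```

Winston's JSON format does essentially this in Node, with timestamps and metadata added.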
I need to debug node-apn (some pushes are getting lost), so I need to analyze the node-apn logs. Therefore, I want to store the node-apn logs in a file.
What I have tried
I have enabled the node-apn logs and they appear in my console. Now, I am running the following commands to start the server, but I cannot see the node-apn logs in the file. I can see the application logs (generated by winston) in the file.
sudo node app.js test >> /home/gaurav/temp.txt
sudo node app.js test | tee /home/gaurav/temp.txt
Can anybody suggest how to achieve this?
The logs are probably being sent to stderr instead of stdout, so try this:
sudo node app.js test >> /home/gaurav/temp.txt 2>&1
sudo node app.js test 2>&1 | tee /home/gaurav/temp.txt
The 2>&1 redirects everything sent to stderr to stdout.