log4j logs not printed in Docker Container Logs - linux

I have a Java application that uses Log4j 2. When I run the application directly, it creates log files in the logs folder and writes debug statements into them.
However, when I build a Docker image and run it, I can see the logs folder being created inside the container and the log entries being written to the files, but when I run the docker logs command I see nothing.
I have several modules, each with its own log4j configuration file, but none of their output appears when I run docker logs. I want all of the logs to show up in the container's log output.
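A common cause is that every appender writes to a file only: `docker logs` shows just what the main process prints to stdout/stderr, so file appenders alone never reach it. Below is a minimal sketch of a log4j2.xml that adds a Console appender alongside the file output (the appender names, file path, and pattern are illustrative, not taken from the question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
  <Appenders>
    <!-- Console appender: this is what `docker logs` captures -->
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout pattern="%d{ISO8601} [%t] %-5level %logger{36} - %msg%n"/>
    </Console>
    <!-- Keep a file appender too if file output is still wanted -->
    <File name="File" fileName="logs/app.log">
      <PatternLayout pattern="%d{ISO8601} [%t] %-5level %logger{36} - %msg%n"/>
    </File>
  </Appenders>
  <Loggers>
    <Root level="debug">
      <AppenderRef ref="Console"/>
      <AppenderRef ref="File"/>
    </Root>
  </Loggers>
</Configuration>
```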

Spring-boot application logs eating up the space in mounted location in docker

I'm running a Docker container (with a Spring Boot application inside) and mounting a host location for the logs:
docker run -d --name myContainer -p 8080:80 -v /server/appLogs:/var/log myContainer:latest
Here I'm mounting /server/appLogs onto /var/log. My Spring Boot application writes its logs under /var/log, and I need the logs available on the host machine.
But over time the logs accumulate in /server/appLogs and fill up all the space on the server.
I know that logback-spring.xml can cap the max size and max history, but those settings don't seem to apply to the mounted location.
My plan is to write a shell script and add a cron job to auto-delete old logs in that location.
Is there a better way to clear the logs in /server/appLogs?
And why don't the settings in logback-spring.xml apply here?
You can use the ELK stack (https://www.elastic.co/) to ship and manage application logs.
Here is a complete guide for reference: https://logz.io/learn/complete-guide-elk-stack/#elasticsearch
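The cron-based cleanup mentioned in the question can also stay very small. A sketch (the directory, file pattern, and retention period are assumptions) that deletes rotated log files older than a given number of days:

```shell
#!/bin/sh
# prune_logs DIR [DAYS] - delete *.log* files in DIR last modified
# more than DAYS days ago (DAYS defaults to 7).
prune_logs() {
  dir="$1"
  days="${2:-7}"
  find "$dir" -type f -name '*.log*' -mtime +"$days" -delete
}

# Example cron entry (assumption: the function is saved as a script
# and run daily at 01:00):
#   0 1 * * * /usr/local/bin/prune_logs.sh /server/appLogs 7
```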

Logs file is empty in spring boot application docker container

I have a Spring Boot application running inside a Docker container, and it works fine. The problem is that the application log file inside the container is empty.
In logback-spring.xml the log path is configured as /var/log.
When I go to the /var/log directory inside the container, I can see that a log file named "myservice.log" has been created,
but when I "cat" the file to see its content, it is completely empty.
Also, when I execute
docker logs <container-id>
it returns nothing.
I also checked the Docker root directory on the server:
/apps/docker/containers/<container-id>/<container-id>-json.log
and that file is empty as well.
my Dockerfile has the following structure.
FROM private-docker-repo/openjdk:11-jre-slim
WORKDIR /opt/services
COPY target/my-service-0.0.1-SNAPSHOT.jar /opt/services/my-service.jar
CMD java -Dspring.profiles.active=dev -Dserver.port=61016 -jar my-service.jar
EXPOSE 61016
What could be the reason for the log file being empty here? I would highly appreciate it if anyone could point me in the right direction.
Edit: when I deploy the same jar as a Linux systemd service, the logs are written just fine. I want to know why the same jar doesn't print any logs inside the Docker container.
Thanks in advance!
Are you sure your application is running? Get into the Docker container and check whether it is running; it seems to me it hasn't started.
I solved this just by replacing the CMD instruction with ENTRYPOINT. Now the logs print just fine. I did some searching on the difference between CMD and ENTRYPOINT, but I still can't understand how that affects a container's logging. If anyone can add a comment explaining what happened, that would be great, not only for me but for others who see this question in the future.
Thank you :)
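Not a definitive explanation, but one relevant difference: the shell form `CMD java ...` starts the JVM under `/bin/sh -c`, so the shell, not Java, runs as PID 1; the exec form makes the JVM PID 1 itself, with its stdout/stderr attached directly to the container's log stream and signals delivered to it directly. A sketch of the same Dockerfile using the exec form:

```dockerfile
FROM private-docker-repo/openjdk:11-jre-slim
WORKDIR /opt/services
COPY target/my-service-0.0.1-SNAPSHOT.jar /opt/services/my-service.jar
EXPOSE 61016
# Exec form: no intermediate shell; java runs as PID 1
ENTRYPOINT ["java", "-Dspring.profiles.active=dev", "-Dserver.port=61016", "-jar", "my-service.jar"]
```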

Docker container STDOUT not showing in Docker Logs

I am trying to get my PHP errors to show up in docker logs. All the documentation I have read indicates that Docker logs are tied to the container's stdout and stderr, which come from /proc/self/fd/1 and /proc/self/fd/2. I created a symlink from my PHP error log file /var/log/php_errors.log to /proc/self/fd/1 with the command:
ln -sf /proc/self/fd/1 /var/log/php_errors.log
After linking the error log, I tested it by running this PHP script:
<?php
error_log("This is a custom error message constructed to test the php error logging functionality of this site.\n");
?>
The error message is echoed to the console, so I can see that PHP error logging is now redirected to stdout inside the container. But when I run docker logs -f <containername>, the error message never appears in the logs. Echoing from inside the container doesn't show up in the logs either, which is confusing, because my understanding is that echo writes to stdout.
Further reading told me that docker logs only shows output from PID 1, which could be the issue. If that is the case, how can I configure my PHP error logging so it shows up in docker logs when viewed from outside the container?
Also, I have checked that I am using the default json-file logging driver, and I have tried this both in my local environment and on a web server.
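One approach along the lines of what the official php images do (a sketch; the exact config file depends on your SAPI) is to point error_log at the process's own stderr in the PHP configuration instead of symlinking the log file. Note that /proc/self/fd/2 resolves to the stderr of whichever process does the writing, so the output only reaches `docker logs` if that process is PID 1 or inherits its streams from it:

```ini
; php.ini / FPM pool sketch (assumption: php-fpm or php-cli is the
; container's main process, or inherits its stdio from it)
log_errors = On
error_log = /proc/self/fd/2
```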

Does Docker have log statuses, e.g. error, warn, info?

For example, in a Node.js container I do:
throw new Error('lol'); or console.error('lol');
But when I open the container logs with docker-compose logs -f nodejs,
there are no statuses or colors; every log line looks like it has the info status.
I use Datadog to collect logs from the container, and it also marks all logs as 'info'.
docker logs and similar just collect the stdout and stderr streams from the main process running inside the container. There's not a "log level" associated with that, though some systems might treat or highlight the two streams differently.
As a basic example, you could run
docker run -d --name lister --rm busybox ls /
docker logs lister
The resulting file listing isn't especially "error" or "debug" level.
The production-oriented setups I'm used to include the log level in log messages (in a Node context, I've used the Winston logging library), and then use a tool like fluentd to collect and parse those messages.
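As a minimal sketch of the "put the level in the message" approach (the function name and JSON shape here are made up for illustration; a real setup would use a logging library):

```shell
# log LEVEL MESSAGE... - emit one JSON line with an explicit level
# field; Docker itself only records which stream a line came from.
log() {
  level="$1"; shift
  printf '{"level":"%s","msg":"%s"}\n' "$level" "$*"
}

log info "server started"          # goes to stdout
log error "connection refused" >&2 # goes to stderr
```

A collector such as fluentd or Datadog can then parse the JSON and recover the level, instead of guessing from the stream.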

Download file in jenkins job

I have a jenkins job which will execute node application. This job is configured to run on docker only during execution.
Is it possible to download file from node application everytime when job gets executed?
I tried using nodejs plugins to save and download file. File is getting saved in local but not able to download.
If your Docker container runs a job and creates a file as its output, and you want that file available outside the container after the job is done, my suggestion is to create the file in a location that is mapped to a host folder via the volume option. Run your Docker container as follows:
sudo docker run -d -v /my/host/folder:/my/location/inside/container mynodeapp:latest
Ensure that your Node application writes the output file to /my/location/inside/container. When the job completes, the output file can be accessed on the host machine at /my/host/folder.
