Accessing log files within a Linux container in Azure App Service

This may be a simple question, but I find the Azure documentation vast and a bit vague, so advice would be appreciated.
So I've got a Docker container running in Azure App Service. It's a Linux container which is pushed to Azure Container Registry from our pipelines and then used by the web app. I can view the log stream, which automatically displays the Docker logs and, I assume, anything sent to standard out.
There are various log files within the container at certain file paths. How can I access these logs (other than using SSH in Kudu to get into the container)? Is there a way of mapping these file paths to one of the Azure log analytics tools?
Thanks - Please let me know if you need more information on any part of this setup.

Sending arbitrary file logs to Log Analytics from Azure App Service is not natively supported.
You could try setting up your container with the Log Analytics agent, which allows you to configure a custom logs data source, as long as your file has a compliant format.
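As a rough sketch (assuming the agent can be baked into the container image, and with placeholder workspace values), the Linux agent is installed with the documented onboarding script; the custom logs data source itself (file path, record delimiter) is then configured on the workspace side:

    # Install the Log Analytics (OMS) agent for Linux inside the image or container.
    # <WORKSPACE_ID> and <WORKSPACE_PRIMARY_KEY> are placeholders for your workspace values.
    wget https://raw.githubusercontent.com/Microsoft/OMS-Agent-for-Linux/master/installer/scripts/onboard_agent.sh
    sh onboard_agent.sh -w <WORKSPACE_ID> -s <WORKSPACE_PRIMARY_KEY>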

There are two ways of managing log files in a container.
Docker best practice is to redirect ALL logs to STDOUT. This allows commands like docker logs or az webapp log to pick them up (see the sketch below).
Note that there is an open request to have log files stored to a storage account: https://github.com/Azure/azure-cli/issues/10043
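In the meantime, a minimal sketch of the STDOUT route (the container ID, app name, and resource group are placeholders):

    # Inside the image, a common trick is to symlink the app's log file to stdout, e.g. in the Dockerfile:
    #   RUN ln -sf /dev/stdout /var/log/myapp/app.log

    # Locally, read the container's output
    docker logs <container-id>

    # On App Service, enable container logging and stream it
    az webapp log config --name <app-name> --resource-group <rg-name> --docker-container-logging filesystem
    az webapp log tail --name <app-name> --resource-group <rg-name>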
Alternatively, you could mount a folder in your web app container to a storage account and ensure that those internal log files are stored on the storage account: https://learn.microsoft.com/en-us/cli/azure/webapp/config/storage-account?view=azure-cli-latest#az-webapp-config-storage-account-add
https://learn.microsoft.com/en-us/azure/app-service/containers/how-to-serve-content-from-azure-storage
This would allow you to expose a storage account to your container.
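A hedged sketch of that mount with the CLI (all names, the key, and the /var/log/myapp path are placeholders for your own values):

    # Mount an Azure Files share into the web app container at the path where the logs are written
    az webapp config storage-account add \
      --resource-group <rg-name> \
      --name <app-name> \
      --custom-id applogs \
      --storage-type AzureFiles \
      --account-name <storage-account-name> \
      --share-name <file-share-name> \
      --access-key <storage-account-key> \
      --mount-path /var/log/myapp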

Related

Check how many times an App Service pulls an image from an Azure Container Registry

I have an Azure App Service that runs a Linux container which is pulled from a tag in an Azure Container Registry. If I enable continuous deployment in the App Service and overwrite the tag associated with it, the App Service will automatically update itself by pulling the image from the associated tag. Since this deployment from the tag happens automatically, I want to know when and how many times the App Service has pulled from the tag and updated itself, all the way from the very first time it happened. The deployment and Activity Logs of the App Service do not show what I've been looking for, nor do the Azure Monitor logs and metrics.
Have you looked at this article?
https://learn.microsoft.com/en-us/azure/container-registry/container-registry-diagnostics-audit-logs
You should be able to configure diagnostic logs on the ACR and store those in Log Analytics to view the different pull counts.
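As a rough sketch (the setting name acr-audit is arbitrary and the resource IDs are placeholders), the diagnostic setting can be created with the CLI; pulls should then appear in the ContainerRegistryRepositoryEvents table of the workspace:

    # Route ACR repository events (push/pull) and login events to a Log Analytics workspace
    az monitor diagnostic-settings create \
      --name acr-audit \
      --resource <acr-resource-id> \
      --workspace <log-analytics-workspace-resource-id> \
      --logs '[{"category":"ContainerRegistryRepositoryEvents","enabled":true},{"category":"ContainerRegistryLoginEvents","enabled":true}]'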

I have an API deployed to an Azure VM. I'd like to send all console output from this API to some cloud text storage for reading. How do I do this?

As per the title, I just want all stdout/stderr output from this service to be sent to a cloud resource like CloudWatch Logs or Azure Monitor so I can read it over a time span. All output lines should be there.
The documentation for this seems nonexistent.
This really depends on what tech the API is built on and what OS your VM is running. But the simplest way is IMO to enable the monitoring agent on your VM. Create a Log Analytics workspace and attach your VM straight from the portal. This way you can subscribe to different sources of logs and make them appear in your LA workspace, to be queried and filtered for further analysis.
Docs here: https://learn.microsoft.com/en-us/azure/azure-monitor/learn/quick-collect-azurevm
The output from your API can then, within the VM, be directed to either the Windows Event Log or Linux syslog, both of which are supported out of the box by the Log Analytics workspace and the monitoring agent. If your API runs in a container, say with Docker, you can enable a special container monitoring solution on your LA workspace - or you can configure Docker to direct container logs to either syslog or the Event Log directly.
If you run docker containers, here's a guide for configuring the logging driver: https://docs.docker.com/config/containers/logging/syslog/
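For instance, a minimal sketch (the syslog address is a placeholder and assumes a local syslog daemon that the monitoring agent already collects from):

    # Per container: send this container's stdout/stderr to syslog
    docker run --log-driver syslog --log-opt syslog-address=udp://localhost:514 <image>

    # Or make it the default for all containers in /etc/docker/daemon.json:
    #   { "log-driver": "syslog", "log-opts": { "syslog-address": "udp://localhost:514" } }
    # and then restart the daemon
    sudo systemctl restart docker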
If you run your API on IIS, you can simply enable IIS log fetching on your LA workspace from the portal. However, this will only send HTTP logs and not stdout, as far as I know.
If it's anything else, please add more detail to your question.

Unable to sort files by modified date/time in Azure Storage Explorer

We have logs on our application server hosted in the Azure cloud. We want to show the logs to a customer who does not have access to the application server directly. We decided to use Azure File Sync to synchronize the logs from the app server to Azure Files storage and let them view those logs from Azure Storage Explorer. The sync all works fine, but I am unable to sort the logs based on modified date-time. Our production server has thousands of log files and it is not easy to search through them to check logs. Any idea how to bring the modified date-time into Storage Explorer? Or is there another approach?
In the Azure Storage Explorer app, file shares indeed don't have the date column; only blob containers do.
However, if you mount the fileshare as a drive on your computer, you'll get the date info and will be able to sort.
The PowerShell command script to mount the file share as a drive on Windows is available in the Azure portal.
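If you don't want to use the portal-generated script, a rough equivalent from a Windows command prompt (the storage account name, share name, and key are placeholders) would be something like:

    # Map the Azure file share as drive Z: so Explorer can sort it by date modified
    net use Z: \\<storage-account>.file.core.windows.net\<share-name> <storage-account-key> /user:AZURE\<storage-account>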

How to mount a volume (Azure File Share) to a bitnami-based docker image on Azure (Web App for Container)?

I have the Matomo Docker Image from https://github.com/bitnami/bitnami-docker-matomo that I run in a Web App for Container on Azure with my own Azure Container Registry (ACR).
Also, I have an Azure Storage Account with a File Share available.
What I would like to achieve is to mount persistent storage (a File Share from the Azure Storage Account) to it so I don't lose Matomo's config and installed plugins.
I tried using the Mount Storage (Preview), but I couldn't get it to work.
Name: matomo_data
Storage Type: Azure Files
Mount path: /bitnami
As described in: https://github.com/bitnami/bitnami-docker-matomo#persisting-your-application
This didn't work.
I also tried the setting WEBSITES_ENABLE_APP_SERVICE_STORAGE = true on the Web App for Containers, but it doesn't appear to do anything either.
I would appreciate any hints here, as otherwise I would have to build a custom Docker image with a custom Docker Compose file and push it to the registry, which I would like to avoid.
Thanks a lot in advance for any hints on this!
Mounting an Azure File Share to a Web App for Containers is, as I understand it, not simple persistent storage; it's a share action. See the caution below:
Linking an existing directory in a web app to a storage account will delete the directory contents. If you are migrating files for an existing app, make a backup of your app and its content before you begin.
So, if you want to mount the file share to the web app to persist the storage, you need to upload all the files needed to the file share first. The steps to mount the Azure File Share to the web app are here. They are shown for Windows, but it's the same way for Linux.
But I would suggest you use persistent storage following the steps here instead. That way the persistent storage exists from the beginning and the directory contents will not be deleted.
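If you go that route, the relevant app setting can also be set from the CLI (names are placeholders); note that this persists the /home directory, so the container would need to keep its data under /home rather than /bitnami:

    # Enable the built-in persistent /home storage for the Linux container
    az webapp config appsettings set \
      --resource-group <rg-name> \
      --name <webapp-name> \
      --settings WEBSITES_ENABLE_APP_SERVICE_STORAGE=true

    # Verify any Azure Files mounts configured on the app
    az webapp config storage-account list --resource-group <rg-name> --name <webapp-name>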

Azure Webapp (using Nodejs) logging to Blob storage

I'm using an Azure Web App (Windows) to host a Node.js app. Inside my app, I'm using bunyan as my logger library.
I have configured app logging to Azure Storage Blob through the web portal. I selected the storage account, blob container, etc.
If I go to Log Stream, I do see the logs from my app in the portal. However, those logs are not being stored in the selected blob container. There's a folder (inside the container) that was created by the App Service, and it does log things like when a new version of my app is deployed, using a CSV file, but that's all.
I've read many posts indicating that Node.js is not supported for this environment (logging to blob storage). Is this true? Am I missing something?
For Node.js websites, the way to write application logs is by writing to the console using console.log('message') and console.error('message'), which go to Information and Error level log entries respectively. Currently the only supported target for the Node.js log files is the file system.
It looks as though you can't use blob storage for application logs in Node.js Azure Web Apps. However, there are Azure Blob Storage adapters for popular Node.js loggers such as winston, e.g. winston-azure-blob-transport.
