Can we get logs from the Azure SFTP service in a storage account? I've enabled the SFTP feature on the storage account, which makes it work as an SFTP service, and I'm able to push files into it using clients like FileZilla and WinSCP. But it doesn't work when I try to push files from our client app, a reporting system that can push reports to an SFTP server.
My intention is to check the logs for the SFTP service and see why it isn't serving requests from the client app.
If the client can't reach the server for whatever reason, having logs on the Azure side won't help in any way. Check the client logs. What error are you getting?
Here's a list of common issues that might help:
https://learn.microsoft.com/en-us/azure/storage/blobs/secure-file-transfer-protocol-known-issues
As per the current documentation, you can pull the SFTP logs by applying filters in the activity log or by enabling diagnostic settings on the storage account.
You can refer to the known-issues documentation linked above for more information about the SFTP protocol in an Azure Blob Storage account; note that at the time of writing this feature was in preview.
You can enable logs on your storage account from its diagnostic settings.
Go to the SFTP-enabled storage account resource, then open Diagnostic settings under Monitoring in the side menu.
From there, select the storage type (blob, for example); you can then add a diagnostic setting.
Then select the log categories and whichever destination you desire; for example, you can send them to a Log Analytics workspace.
Then you can query the logs, for example:
StorageBlobLogs
| where Category contains "StorageWrite"
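If you're specifically after failing SFTP traffic, you can narrow this down further, since the blob resource logs record the protocol used. A minimal sketch, assuming the logs are flowing into a Log Analytics workspace (Protocol, StatusText, and CallerIpAddress are columns of the StorageBlobLogs schema, but verify the status values against your own logs):
StorageBlobLogs
| where TimeGenerated > ago(24h)
| where Protocol == "SFTP"
| where StatusText != "Success"
| project TimeGenerated, OperationName, StatusCode, StatusText, CallerIpAddress, AuthenticationType
| order by TimeGenerated desc
Failed attempts from the reporting system should show up here with a status text, which is usually enough to tell whether the client is reaching the service at all.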
For storing Azure SQL database logs, is it necessary to explicitly create a blob container for log storage, or is it created implicitly? I've read a couple of related posts but I'm still not sure about it.
Also, for a Java/Spring application deployed to App Service that uses App Insights, Log Analytics, and Azure Monitor for application logs, HTTP/access logs, and DB logs, do I need to explicitly set up blob storage for the logs?
No, you do not need to create blob storage for Azure SQL database logs; they are kept in the Azure SQL database transaction logs and can be viewed using Azure Monitor or audited using Azure SQL Auditing.
Approach 1: Using the Monitoring section of the SQL database
After creating the Azure SQL database and server:
Go to the database's Monitoring tab, where the logs can be viewed.
Approach 2: Using Log Analytics
Create a Log Analytics workspace in Azure.
Go to the SQL database and choose Diagnostic settings from the Monitoring section of the left pane.
Add a diagnostic setting, select the Log Analytics workspace you created as the destination, and choose the log categories you want.
You can then find the logs in the workspace and query them, as sketched below.
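With the default (non-resource-specific) destination mode, the entries land in the shared AzureDiagnostics table. A minimal sketch of browsing them, assuming that mode; the Category values you'll see depend on which log categories you enabled in the diagnostic setting:
AzureDiagnostics
| where ResourceProvider == "MICROSOFT.SQL"
| project TimeGenerated, Category, OperationName, Resource
| order by TimeGenerated desc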
To store the Azure SQL logs explicitly in a storage account:
Create a storage account for storing the logs.
Enable Azure Monitor logs from your SQL server: select 'Diagnostic settings' from the Azure Monitor menu, turn on the logs, and select the storage account you created as the destination.
Configure retention by selecting the Logs tab in the Azure Monitor menu and choosing 'Retention policy' to set how long logs are kept.
To verify logs in the storage account, go to the storage account and select 'Containers.' You should see a container named 'insights-logs-sqlserverauditlog.' You can then browse the logs stored in this container.
Someone has deleted a file share folder from a storage account in Azure. It can be recovered since soft delete is enabled, but how do we find out who deleted the file?
It is possible to view operations within an Azure resource using resource logs. This is done by monitoring Azure Storage, which is a feature of Azure Monitor.
You would first start by creating a diagnostic setting: https://learn.microsoft.com/en-us/azure/storage/blobs/monitor-blob-storage?tabs=azure-portal#creating-a-diagnostic-setting
Then view the logged activity using a Log Analytics query, or go to whichever destination the diagnostic setting forwards the logs to, and look for the respective API operation, e.g. "DeleteBlob" or "DeleteContainer" (see the sketch below).
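Since the question is about a file share, the entries would be in the StorageFileLogs table rather than StorageBlobLogs. A minimal sketch, assuming diagnostic settings for the file service are already sending logs to a Log Analytics workspace (the operation names here are assumptions; verify them against your own logs):
StorageFileLogs
| where OperationName in ("DeleteFile", "DeleteDirectory")
| project TimeGenerated, OperationName, CallerIpAddress, AuthenticationType, UserAgentHeader
| order by TimeGenerated desc
The caller IP and authentication details on the matching rows are usually enough to identify who issued the delete.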
However, if you had not already set up a diagnostic setting forwarding data to a destination before the deletion happened, it may not be possible to retrieve this information now. Hope this helps!
I see log messages in the Azure IoT device client source code like this:
log.debug("Connection already opened by TransportClient.");
log.info("Device client opened successfully");
My question is: where are these log messages going, and how can I get them for debugging purposes?
Thanks
In general, Blob Storage is added as a 'logging endpoint', which encompasses a storage account, a container in the account, and blobs in the container. Block blobs are used for storing text and binary data.
All logs get stored in block blobs in a container named $logs, which is automatically created when Storage Analytics is enabled for a storage account. The $logs container is located in the blob namespace of the storage account, for example: http://<accountname>.blob.core.windows.net/$logs
To view and analyze your log data, download the blobs that contain the logs you are interested in to a local machine. There are many tools, such as AzCopy, the Azure Storage Data Movement library, and the Azure Import/Export service, for moving data to and from your storage account. To view the logs, you can also use any tool that can access Azure Blob Storage, such as Visual Studio or Cerebrata Azure Management Studio.
In the case of the azure-iot-sdk, each IoT hub exposes a set of service endpoints for the solution's back end to communicate with the devices. An IoT hub has a default built-in endpoint (messages/events), and by default messages are routed to this service-facing endpoint, which is compatible with Event Hubs. You can refer to the link below for the various methods of reading from the built-in endpoint: https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-messages-read-builtin
You can also create custom endpoints to route messages to, by linking other services in your subscription to the IoT hub. If custom endpoints are created, a message is routed to every endpoint whose routing query it matches. IoT Hub can route messages to two storage services: Azure Blob Storage and Azure Data Lake Storage (ADLS) Gen2 accounts. You can refer to the link below for the various methods of reading from custom endpoints: https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-messages-read-custom
As for logs from the IoT SDK itself, they are written to stdout or stderr depending on the log type and the deployment environment, and can be redirected as required. The Node.js SDK uses the debug library for detailed logs, which is enabled through the DEBUG environment variable. The link below can be helpful for additional details: https://github.com/Azure/azure-iot-sdk-node/wiki/Troubleshooting-Guide-Devices
Our project is a Java Spring Boot application with a logging system based on Log4j, whose output we push into Azure storage accounts.
Question: I want to query these custom logs in OMS. Is this possible, and if yes, how?
What I have tried so far:
1. Pushed the logs into blob storage using Logback.
2. Pushed logs into table storage.
3. Configured the storage accounts under Log Analytics in the Azure workspace.
But I am unable to see any data to query in OMS.
Please help.
If you can't use Application Insights, you can read the log files from storage and use the HTTP Data Collector API to push them into a Log Analytics workspace. Samples and reference: https://learn.microsoft.com/en-us/azure/log-analytics/log-analytics-data-collector-api
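Once pushed through the Data Collector API, the records show up in the workspace as a custom log table named after the LogType header you send, with a _CL suffix appended, and string fields get an _s suffix. A minimal sketch, assuming a hypothetical LogType of Log4jLogs and a custom field named message:
Log4jLogs_CL
| where TimeGenerated > ago(1h)
| project TimeGenerated, message_s
| order by TimeGenerated desc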
I have some VMs running on Azure. I'd like to redirect logs from them (Windows event logs and MS SQL Server logs) to a specific log concentrator (like Graylog). For Windows logs I'm using NXLog (https://nxlog.co/docs/nxlog-ce/nxlog-reference-manual.html#quickstart_windows). However, for specific (PaaS) applications such as SQL Server, NXLog does not apply.
Is there a way to redirect logs (VMs and PaaS) just using Azure (web) tools?
Most services keep their logs in a storage account, so you can tap into that source and forward logs to your own centralized log database. You generally define the storage account in the same place where you enable diagnostics for the service.
I don't know what kind of logs you are looking for in SQL DB, but the audit logs, for example, are saved to a storage account.
Azure Operations Management Suite (OMS) can ingest from dozens of services as well as custom logs. As itaysk mentioned, most services in Azure write service-related diagnostic information to a storage account, and it's really easy to ingest these from within OMS.
https://azure.microsoft.com/en-us/services/log-analytics/
For Azure Web Sites, you can use Application Insights and store custom metrics as well. There's also an option to continuously write these metrics to a storage account.
Here's a similar option for Azure SQL:
https://azure.microsoft.com/en-us/documentation/articles/sql-database-auditing-get-started/
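Once the Windows event logs from the VMs are connected to the workspace, they land in the Event table. A minimal sketch of pulling recent errors in the current Log Analytics query language, assuming the Windows Event Log data source is enabled for the connected machines:
Event
| where EventLevelName == "Error"
| project TimeGenerated, Computer, Source, EventID, RenderedDescription
| order by TimeGenerated desc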