Unable to link storage account to Log Analytics workspace - Azure

We are using Fluent Bit to ship application logs to an Azure Log Analytics workspace. The application log does appear in the workspace as a table under the Logs blade, Custom Logs category. So far so good.
Because the maximum retention period of a Log Analytics workspace is limited to 730 days, I thought linking a storage account of type Custom logs & IIS logs under Linked storage accounts would solve the problem for me. My understanding is that once a storage account is linked with type Custom logs & IIS logs, all Custom Logs will be written into the nominated storage account instead of the default storage account that comes with the creation of the Log Analytics workspace. Is this understanding correct?
Secondly, after clicking the Custom logs & IIS logs item and selecting a storage account from the pop-up blade on the left-hand side, the Azure portal reported the message Successfully linked storage account. However, the Linked storage accounts view still reports No linked storage accounts.
Browsing the target storage account, no logs appear to have been written to it.
Update 1
Storage account network configuration.
Update 2
The answer is accepted as it is technically correct. However, a few steps/details are missing from the documentation. In order to map a customer-managed storage account to a LA workspace, one must build resources to match the following steps.
Create an AMPLS (Azure Monitor Private Link Scope) resource.
Link the AMPLS resource to your LA workspace.
Create a private endpoint on the target VNet for the AMPLS resource.
Create the storage account.
Create private endpoints (blob type) on the target VNet.
Link the storage account to the LA workspace (sketched below).
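For the final step, a minimal Python sketch using the azure-mgmt-loganalytics management SDK might look like the following. All names and IDs are placeholders, and the exact operation names should be verified against the SDK version you use:

```python
# Sketch: link a storage account to a Log Analytics workspace for the
# "CustomLogs" data source type ("Custom logs & IIS logs" in the portal).
# All names/IDs below are placeholders -- substitute your own.
from azure.identity import DefaultAzureCredential
from azure.mgmt.loganalytics import LogAnalyticsManagementClient

subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
workspace_name = "<workspace-name>"
storage_account_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)

client = LogAnalyticsManagementClient(DefaultAzureCredential(), subscription_id)

client.linked_storage_accounts.create_or_update(
    resource_group_name=resource_group,
    workspace_name=workspace_name,
    data_source_type="CustomLogs",
    parameters={"storage_account_ids": [storage_account_id]},
)

# Verify the link actually registered (the portal view can lag behind).
for linked in client.linked_storage_accounts.list_by_workspace(
    resource_group_name=resource_group, workspace_name=workspace_name
):
    print(linked.name, linked.storage_account_ids)
```

The trailing list call is useful here precisely because of the symptom in the question: the portal reported success while the Linked storage accounts view still showed nothing.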

We need to follow a few prerequisites before linking the storage account to the workspace.
The storage account should be in the same region as the Log Analytics workspace.
You need to grant permissions so that other services are allowed to access the storage account.
Allow Azure Monitor to access the storage account. If you chose to allow only selected networks to access your storage account, you should select the exception: "Allow trusted Microsoft services to access this storage account" (a scripted version is sketched below).
For the rest of the configuration information, refer to MS Docs.
By following the above documentation, I can link the storage account successfully.
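As an illustration of that trusted-services exception, here is a hedged sketch using the azure-mgmt-storage Python SDK; the names are placeholders, and it assumes your account already restricts access to selected networks:

```python
# Sketch: enable "Allow trusted Microsoft services to access this storage
# account" (bypass="AzureServices") while keeping default_action="Deny".
# All names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import NetworkRuleSet, StorageAccountUpdateParameters

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

client.storage_accounts.update(
    resource_group_name="<resource-group>",
    account_name="<storage-account>",
    parameters=StorageAccountUpdateParameters(
        network_rule_set=NetworkRuleSet(
            bypass="AzureServices",   # the trusted-services exception
            default_action="Deny",    # only selected networks otherwise
        )
    ),
)
```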

Related

How to Access Azure Storage NFS File Share from another Subscription

I have a storage account in a subscription with a VNet that the storage account is set up to use. This works well for the Kubernetes cluster in that subscription that is attached to that VNet; NFS access to the storage account in question works fine.
But we have a secondary subscription for failover in a paired region (East US and West US), and I'd like the k8s cluster there to also be able to mount the NFS share.
I've tried creating a peering and adding the secondary subscription's VNet (which doesn't overlap) to the storage account, but the k8s cluster in the secondary subscription times out connecting to the share.
I didn't configure any routing options when creating the peering, but I would have assumed that this would just work.
Does anyone have any instructions on how to get this working so that the secondary cluster can access the NFS share?
The storage sync service and/or storage account can be moved to a different resource group, subscription, or Azure AD tenant. After the storage sync service or storage account is moved, you need to give the Microsoft.StorageSync application access to the storage account.
Click Access control (IAM) in the left-hand table of contents.
Click the Role assignments tab to list the users and applications (service principals) that have access to your storage account.
Verify that Microsoft.StorageSync or Hybrid File Sync Service (the old application name) appears in the list with the Reader and Data Access role.
This GitHub document on Azure file share can give you better insights.
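If you'd rather script that last verification step, a rough Python sketch with azure-mgmt-authorization could look like this; the scope string is a placeholder, and resolving principal IDs to display names is left out:

```python
# Sketch: list role assignments at a storage account scope to check that
# the Microsoft.StorageSync service principal still has access after a move.
# The scope and subscription ID are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient

scope = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)

client = AuthorizationManagementClient(DefaultAzureCredential(), "<subscription-id>")

for assignment in client.role_assignments.list_for_scope(scope):
    # principal_id is an Azure AD object ID; resolve it against Azure AD to
    # see whether it belongs to Microsoft.StorageSync / Hybrid File Sync Service.
    print(assignment.principal_id, assignment.role_definition_id)
```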

Connecting a Data Lake Gen2 storage account to a Log Analytics workspace

I have a Data Lake Gen2 storage account.
I need to connect my storage account logs to a Log Analytics workspace.
But there is no Diagnostic settings menu, so I don't know how to do it.
I think this was supported by Data Lake Gen1, but is there a workaround for Data Lake Gen2?
Thank you
There is a Diagnostic settings option at the end of the left sidebar, but you have to scroll quite a bit to find it.
Sadly, I believe there is currently no option to automatically send diagnostic logs to a Log Analytics workspace. The active logs are generated inside a folder named "$logs" located at the root of your storage account; it is only visible through Azure Storage Explorer.
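Beyond Storage Explorer, you can also enumerate that hidden container programmatically. A minimal sketch with the azure-storage-blob Python package, where the connection string is a placeholder:

```python
# Sketch: enumerate the hidden "$logs" container where Storage Analytics
# writes its log blobs. The connection string is a placeholder.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
logs_container = service.get_container_client("$logs")

# Log blobs are laid out as <service>/YYYY/MM/DD/hhmm/NNNNNN.log
for blob in logs_container.list_blobs():
    print(blob.name, blob.size)
```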
Microsoft provides a PowerShell script, located in the Azure GitHub repository, which aims to upload the generated log files to a Log Analytics workspace of your choice.
You can refer to this official guide from Microsoft to build this workflow to send your logs to log analytics: Querying Azure Storage logs in Azure Monitor Log Analytics

Azure Storage Explorer - Unable to list resources

Granted the Reader & Storage Blob Data Reader roles on an Azure Data Lake Gen2 storage account to the user DataLakeTester.
Also, under Manage Access, granted full rights in the Access / Default section.
But when logged into Azure Storage Explorer as the above user, it successfully connects to the data lake but cannot list the containers and throws the error below. Is there some other role assignment to be done?
The latest version of Storage Explorer now available is 1.11.1. Please update and try again: https://github.com/Microsoft/AzureStorageExplorer/releases
In response to your query:
"But when logged into Azure Storage Explorer as the above user, it successfully connects to the data lake but cannot list the containers and throws the error below. Is there some other role assignment to be done?"
It works fine on my side; could you try to sign out and sign in again?
The RBAC roles you have appear to be sufficient. RBAC changes can take some time to propagate, so access in Storage Explorer might not work as expected for a few minutes.
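One quick way to confirm the role assignments have propagated, independent of Storage Explorer, is a small Python check with azure-identity and azure-storage-blob; the account URL is a placeholder:

```python
# Sketch: list containers with the same Azure AD identity to confirm that
# Reader + Storage Blob Data Reader have propagated.
# The account URL is a placeholder.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",
    credential=DefaultAzureCredential(),  # signs in as the current user/app
)

# If RBAC has propagated, this prints container names instead of raising
# an authorization error.
for container in service.list_containers():
    print(container.name)
```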

How to monitor read write activities on Azure Blob Storage

I need to figure out how to log/retrieve information about who (which Azure AD user) has read/written blobs in our Azure Blob Storage.
I know you can turn on logging at the storage account level.
I can see in the logs the different API calls that have been performed on the blobs, but when I myself opened some of the blobs via the Azure portal, I could not see this activity recorded in the logs. Any ideas how to monitor this? I need it for auditing purposes.
When you enable Storage Analytics in the portal, you will have a $logs folder on your Blob service containing the storage logs.
When you are using Azure AD authentication, you need to configure version 2.0 logs and use the UserPrincipalName column to identify the user, and parse the JSON AuthorizationDetail.action column to identify the user's action on storage, e.g. Microsoft.Storage/storageAccounts/blobServices/containers/read for listing the blobs in a container.
You will not capture OAuth/Azure AD authenticated requests with log format 1.0.
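As a rough illustration only (this is not the exact 2.0 schema; the field positions and the UPN heuristic are assumptions you should verify against the documented Storage Analytics log format), a naive Python pass over downloaded $logs lines might look like:

```python
# Naive sketch: scan Storage Analytics log lines (semicolon-delimited) for
# OAuth/Azure AD authenticated requests and pull out anything that looks
# like a user principal name. Verify field positions against the documented
# 2.0 schema before relying on this.
def oauth_requests(log_lines):
    for line in log_lines:
        fields = [f.strip('"') for f in line.split(";")]
        if not fields or fields[0] != "2.0":  # only 2.0 entries carry AAD identity
            continue
        if "OAuth" not in line:               # crude authentication-type filter
            continue
        # Heuristic: a UPN is the field that looks like an email address.
        upns = [f for f in fields if "@" in f and " " not in f]
        operation = fields[2] if len(fields) > 2 else "?"  # operation-type field
        yield operation, upns

# Example usage with a downloaded log file:
# with open("000001.log") as fh:
#     for op, users in oauth_requests(fh):
#         print(op, users)
```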
On Azure Storage UserVoice there is also a request for integration with Log Analytics to simplify log monitoring; the private preview should start this month.

Why HDInsight cluster cannot add Blob storage account as data source in Azure portal

As a newbie of Azure, I plan to build a cloud computing service with a free trial account.
I first created a Storage account. The deployment model was Resource Manager, as recommended, and I chose Blob storage as the Account kind.
Then I created an HDInsight cluster. But in the Data source configuration, the aforementioned Blob storage account cannot be selected; there is a warning: Could not reach the storage! However, if I create the Storage account with Classic as the deployment model, the created Storage account can be selected as the data source.
Does anyone have any idea why this is so?
Thanks in advance! I've been stuck here for a long time.
If you selected 'Resource Manager' as the deployment model, then the storage account should be a 'general-purpose' Azure storage account; you might have created a blob-only storage account type.
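To avoid the issue from the start, here is a hedged sketch of creating a general-purpose (v2) account with the azure-mgmt-storage Python SDK; all names are placeholders:

```python
# Sketch: create a general-purpose v2 storage account (kind="StorageV2"),
# which HDInsight can use as a data source, instead of a blob-only account
# (kind="BlobStorage"). All names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import Sku, StorageAccountCreateParameters

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

poller = client.storage_accounts.begin_create(
    resource_group_name="<resource-group>",
    account_name="<storageaccountname>",  # 3-24 lowercase letters/digits
    parameters=StorageAccountCreateParameters(
        sku=Sku(name="Standard_LRS"),
        kind="StorageV2",                  # general-purpose v2, not "BlobStorage"
        location="eastus",
    ),
)
account = poller.result()
print(account.provisioning_state)
```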
