Azure Synapse Analytics - Synapse Link authentication - Azure

The Synapse Link for Dataverse is running fine when the storage account access key is disabled. We are able to create new records; there is no problem here.
But setting up a new Synapse Link for Dataverse fails when the storage account key is disabled. Has anyone seen this issue before?
Expected behavior: Synapse Link should work when the storage account access key is disabled.

From my analysis, it looks like storage account key access needs to be enabled at the time the Synapse Link is created. Once it has been created successfully, you can disable key access again, and the behavior should be the same as your existing Synapse Link service.
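If it helps, the key-access toggle can be scripted. A minimal Az PowerShell sketch, assuming the Az.Storage module and placeholder resource group / account names (my-rg, mydatalake):

# Temporarily re-enable shared key access before creating the Synapse Link.
Set-AzStorageAccount -ResourceGroupName "my-rg" -Name "mydatalake" -AllowSharedKeyAccess $true

# ... create the Synapse Link for Dataverse while key access is enabled ...

# Once the link is created, turn shared key access back off.
Set-AzStorageAccount -ResourceGroupName "my-rg" -Name "mydatalake" -AllowSharedKeyAccess $false

# Verify the current setting.
(Get-AzStorageAccount -ResourceGroupName "my-rg" -Name "mydatalake").AllowSharedKeyAccess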

Related

Microsoft Purview data access policy not applying

I am working on this tutorial from the Microsoft Azure team to implement Access provisioning by data owner to Azure Storage datasets. As shown in the image below, the Data Owner Policy is supposed to allow Grady Archie Read permission on an Azure Data Lake Gen2 storage account called acct4dlsgen2. But for some reason, when Grady Archie logs into the Azure portal from the same network, he is unable to access the acct4dlsgen2 storage.
Question: What may I be doing wrong, and how can we fix the issue?
Remarks:
I have satisfied all the prerequisites of the same article mentioned above.
I have also given Grady Archie Read permissions on the Purview collection where this storage account is registered in Purview.
When I give Grady Archie Read permission directly on that storage account via the Azure portal, Grady Archie can access that storage after he logs in. But this defeats the purpose of implementing data access using Purview as described here by the Microsoft team.
One of the prerequisites you have completed is configuring the subscription for Purview policies using a PowerShell script.
But this configuration is only applied to newly created storage accounts, and maybe your storage account already existed when you configured the subscription for Purview policies.
If you create a new storage account inside your subscription, I believe your Purview policies will work on this account.
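For reference, a rough Az PowerShell sketch of what I mean, assuming the tutorial's subscription configuration registers the AllowPurviewPolicyEnforcement feature (my assumption; check the script you actually ran) and using placeholder names:

# Confirm the subscription-level feature the tutorial's script registers is active.
Get-AzProviderFeature -FeatureName "AllowPurviewPolicyEnforcement" -ProviderNamespace "Microsoft.Storage"

# Create a brand-new ADLS Gen2 storage account in the same subscription; Purview
# data owner policies should apply to accounts created after the subscription was configured.
New-AzStorageAccount -ResourceGroupName "my-rg" -Name "acctdlsgen2new" -Location "eastus" `
    -SkuName "Standard_LRS" -Kind "StorageV2" -EnableHierarchicalNamespace $true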

ADF Unable to connect to Synapse Link SQL Pool External Tables

I am trying to create an ADF linked service connection to a Synapse Link serverless SQL pool connected to ADLS storage. I can successfully get a connection, but when I try to use a dataset to access the data I get a permission issue.
I can successfully access the data via Synapse Studio:
This is the error I get when I use the dataset in ADF.
I can also look at the schemas in SSMS, where they appear as external tables, but I get a similar credential error at the same point.
Has anyone come across this issue please?
There are a few pieces of information you haven't supplied in your question, but I believe I know what happened. The external table worked in Synapse Studio because you were connected to the serverless SQL pool with your AAD account, and it passed your AAD credentials through to the data lake and succeeded.
However, when you set up the linked service to the serverless SQL pool, I'm guessing you used a SQL auth account for the credentials. With SQL auth it doesn't know how to authenticate with the data lake, so it looked for a server-scoped credential but couldn't find one.
The same thing happened when you connected from SSMS with a SQL auth account, I'm guessing.
You have several options. If it's important to be able to access the external table with SQL auth, you can execute the following to tell it how to access the data lake. This assumes the Synapse workspace Managed Service Identity has the Storage Blob Data Reader or Storage Blob Data Contributor role on the data lake.
-- Server-scoped credential: the serverless SQL pool uses the workspace managed identity for this data lake endpoint
CREATE CREDENTIAL [https://<YourDataLakeName>.dfs.core.windows.net]
WITH IDENTITY = 'Managed Identity';
Or you could change the authentication on the linked service to use the Managed Service Identity.
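As a rough sketch of that second option, assuming the data factory's managed identity has already been granted access on the serverless pool and the data lake (all names below are placeholders), the linked service can be redefined without SQL credentials so ADF connects with its managed identity:

# Redefine the linked service without a SQL login; with no credential in the
# connection string, ADF uses its managed identity for authentication.
$definition = @'
{
  "name": "SynapseServerlessLinkedService",
  "properties": {
    "type": "AzureSqlDW",
    "typeProperties": {
      "connectionString": "Server=tcp:<yourworkspace>-ondemand.sql.azuresynapse.net,1433;Database=<yourdatabase>;"
    }
  }
}
'@
Set-Content -Path .\SynapseServerlessLinkedService.json -Value $definition

Set-AzDataFactoryV2LinkedService -ResourceGroupName "my-rg" -DataFactoryName "my-adf" `
    -Name "SynapseServerlessLinkedService" -DefinitionFile .\SynapseServerlessLinkedService.json

Note that the serverless pool also needs a database user created for the data factory's managed identity (CREATE USER ... FROM EXTERNAL PROVIDER) before this will authenticate.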

Unable to link storage account to Log Analytics workspace

We are using Fluent Bit to output application logs to an Azure Log Analytics workspace. The application log does appear in the workspace as a table under the Logs blade, Custom Logs category. So far so good.
Because the maximum retention period of a Log Analytics workspace is limited to 730 days, I thought linking a storage account of type Custom logs & IIS logs under Linked storage accounts would solve the problem for me. My understanding is that once a storage account is linked for Custom logs & IIS logs, all Custom Logs will be written into the nominated storage account instead of the default storage account that comes with the creation of the Log Analytics workspace. Is this understanding correct?
Secondly, after clicking on the Custom logs & IIS logs item and selecting a storage account from the pop-up blade on the left-hand side, the Azure portal reported the message "Successfully linked storage account". However, the Linked storage accounts view still reports "No linked storage accounts".
Browsing the target storage account, no log seems to be written to it.
Update 1
Storage account network configuration.
Update 2
The answer is accepted as it is technically correct. However, a few steps/details are missing in the documentation. In order to map a customer-managed storage account to a Log Analytics workspace, one must build the resources described in the following steps.
Create an AMPLS (Azure Monitor Private Link Scope) resource.
Link the AMPLS resource to your LA workspace (a rough scripted sketch of these first two steps follows this list).
Create a private endpoint on the target VNet for the AMPLS resource.
Create the storage account.
Create private endpoints (blob type) on the target VNet.
Link the storage account to the LA workspace.
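A rough Az PowerShell sketch of the first two steps above, assuming the Az.Monitor and Az.OperationalInsights modules and placeholder names (I created the private endpoints themselves through the portal):

# 1. Create the Azure Monitor Private Link Scope (AMPLS).
New-AzInsightsPrivateLinkScope -ResourceGroupName "my-rg" -Name "my-ampls" -Location "global"

# 2. Link the Log Analytics workspace to the AMPLS.
$workspace = Get-AzOperationalInsightsWorkspace -ResourceGroupName "my-rg" -Name "my-la-workspace"
New-AzInsightsPrivateLinkScopedResource -ResourceGroupName "my-rg" -ScopeName "my-ampls" `
    -Name "my-la-link" -LinkedResourceId $workspace.ResourceId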
We need to follow a few prerequisites before linking the storage account to the workspace.
The storage account should be in the same region as the Log Analytics workspace.
Permissions need to be granted so that other services are allowed to access the storage account.
Allow Azure Monitor to access the storage account. If you chose to allow only select networks to access your storage account, you should select the exception "Allow trusted Microsoft services to access this storage account".
For the rest of the configuration information, refer to the MS Docs.
By following the above documentation, I was able to link the storage account successfully.
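If you prefer to script that final linking step, something like the following should also work; this is a sketch assuming the Az.OperationalInsights module and placeholder names:

# Link a customer-managed storage account to the workspace for the CustomLogs data source type.
$storage = Get-AzStorageAccount -ResourceGroupName "my-rg" -Name "mylinkedstorage"
New-AzOperationalInsightsLinkedStorageAccount -ResourceGroupName "my-rg" `
    -WorkspaceName "my-la-workspace" -DataSourceType "CustomLogs" -StorageAccountIds $storage.Id

# Confirm the link is reported back.
Get-AzOperationalInsightsLinkedStorageAccount -ResourceGroupName "my-rg" `
    -WorkspaceName "my-la-workspace" -DataSourceType "CustomLogs"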

Attaching additional storage accounts with SAS key while creating HDInsight cluster from the Azure portal

How do I specify an additional storage account with a SAS key from the Azure portal while creating an HDInsight cluster? It's expecting an actual storage key, not a SAS key. Ideally I want to do that and export a template out of it. My goal is to get an ARM template example for attaching storage with a SAS key to an HDInsight cluster. But I am not able to find this template anywhere. I just need an example that I can use.
Unfortunately, you don't have the option to attach additional storage accounts with a SAS key while creating an HDInsight cluster from the Azure portal.
I would request you to provide the feedback here:
https://feedback.azure.com/forums/217335-hdinsight
All of the feedback you share in these forums will be monitored and reviewed by the Microsoft engineering teams responsible for building Azure.

Azure Cross Directory Data Access

I'm currently developing an Azure solution for one of my managed service clients.
We are developing a Power BI service for their Azure Backup / Azure Recovery.
We are looking to host the whole process in our own Azure environment; however, we cannot get the data from (a) their Recovery Services vault logs into (b) our Azure environment.
Does anyone have any ideas on how to move data from their environment into our environment's storage?
Thank you.
Power BI-based reporting gets its data from the storage accounts that store Azure Backup data. Once the customer has configured diagnostic settings to send data to a storage account (ask them to create a dedicated storage account for this), they can share the access keys with you so that you can connect to the customer's storage account, pull the required data, and run the Power BI report in your environment.
This doc has all the details; the only change in this case is that the customer stores the data in their own storage account and provides you access to that storage account via an access key.
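To illustrate the connection from your side, a minimal Az PowerShell sketch, assuming the customer has shared the account name and one of its access keys (placeholder values below; the container name depends on the diagnostic category that was enabled):

# Connect to the customer's storage account from your own tenant using the shared access key.
$ctx = New-AzStorageContext -StorageAccountName "customerbackuplogs" -StorageAccountKey "<access-key-shared-by-customer>"

# Browse the containers and blobs written by the Azure Backup diagnostic settings.
Get-AzStorageContainer -Context $ctx
Get-AzStorageBlob -Container "insights-logs-azurebackupreport" -Context $ctx | Select-Object Name -First 10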
