ADF Unable to connect to Synapse Link SQL Pool External Tables - azure

I am trying to create an ADF Linked Service connection to a Synapse Link Serverless SQL Pool connected to ADLS storage. I can successfully get a connection, but when I try to use a dataset to access the data I get a permission issue.
I can successfully access the data via Synapse Studio:
This is the error I get when I use the dataset in ADF.
I can also look at the schemas in SSMS, where they appear as external tables, but I get a similar credential error at the same point.
Has anyone come across this issue, please?

There are a few pieces of information you haven't supplied in your question, but I believe I know what happened. The external table worked in Synapse Studio because you were connected to the Serverless SQL pool with your AAD account, and it passed your AAD credentials through to the data lake and succeeded.
However, when you set up the linked service to the Serverless SQL pool, I'm guessing you used a SQL auth account for the credentials. With SQL auth it doesn't know how to authenticate with the data lake, so it looked for a server-scoped credential but couldn't find one.
I'm guessing the same happened when you connected from SSMS with a SQL auth account.
You have several options. If it's important to be able to access the external table with SQL auth, you can execute the following to tell it how to access the data lake. This assumes the Synapse workspace Managed Service Identity has the Storage Blob Data Reader or Storage Blob Data Contributor role on the data lake.
CREATE CREDENTIAL [https://<YourDataLakeName>.dfs.core.windows.net]
WITH IDENTITY = 'Managed Identity';
Or you could change the authentication on the linked service to use the Managed Service Identity.
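If you go the credential route, here is a quick, minimal check you can run afterwards (a sketch only; the login name is a placeholder, and granting REFERENCES is only needed if a non-admin SQL login should be allowed to use the credential):
-- List server-scoped credentials; the name must exactly match the storage URL.
SELECT name, credential_identity FROM sys.credentials;
-- Optionally allow a specific SQL login to use the credential.
GRANT REFERENCES ON CREDENTIAL::[https://<YourDataLakeName>.dfs.core.windows.net]
TO [YourSqlLogin];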

Related

Azure Synapse Analytics - Synapse Link authentication

The Synapse Link for Dataverse is running fine when the storage account access key is disabled. We are able to create new records; there is no problem here.
But setting up a new Synapse Link for Dataverse fails when the storage account access key is disabled. Has anyone seen this issue before?
I am expecting the Synapse Link to work when the storage account access key is disabled.
As per my analysis, it looks like storage account key access has to be enabled at the time of Synapse Link creation. Once the link is created successfully, you can disable the storage account access key and the behavior should be the same as for your existing Synapse Link service.

Received Azure data share not mapping to SQL db target - permissions error

We are testing the process for Azure Data Share and have set up source shares from a SQL DB on our primary Azure domain and shared them to accounts on our secondary Azure domain.
We can accept the share invite and successfully map the source SQL data to blob storage, but when we attempt to map the same share to a target SQL DB we get the error:
Mapping failed. Please check Troubleshoot Azure Data Share for help
and try again.
Data Share account's Managed identity is missing required permissions
on database. For more details, please refer to
https://learn.microsoft.com/en-us/azure/data-share/subscribe-to-data-share
We have followed all the troubleshooting tips and can confirm the Data Share external user has the 3 roles on the target DB (db_datareader, db_datawriter, db_ddladmin).
We have also made the Data Share identity a SQL Db Contributor on the target SQL server.
Is there anything else we need to update, or checks we can run, to find out why the data share is not mapping to the SQL target? We really need to prove this process before we offer it to our clients, so any help that can be provided would be gratefully received.
Thanks
There are several permissions and conditions needed for the Azure Data Share resource's managed identity to access the database:
First, set the Azure Active Directory admin to yourself on the SQL server. After becoming admin, connect to the Azure SQL Database through Azure Active Directory authentication using SSMS, then grant the db_datareader permission to the Data Share resource's managed identity. While granting this permission, make sure you're connected to the database with Azure Active Directory authentication, not SQL authentication.
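A minimal sketch of those grants, run against the database while connected with Azure Active Directory authentication ([YourDataShareAccount] is a placeholder for the name of your Data Share resource):
-- Create a user for the Data Share resource's managed identity.
CREATE USER [YourDataShareAccount] FROM EXTERNAL PROVIDER;
-- Reader role as described above; mapping to a target database also needs write and DDL rights.
ALTER ROLE db_datareader ADD MEMBER [YourDataShareAccount];
ALTER ROLE db_datawriter ADD MEMBER [YourDataShareAccount];
ALTER ROLE db_ddladmin ADD MEMBER [YourDataShareAccount];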

In Azure Synapse, how do I set up a SQL Server that can access Data Lake Storage?

I have set up a Synapse environment and filled my storage account with some sample Parquet files. I have then created a serverless SQL database and created some external tables over the Parquet files. All this works fine and I can query these tables from both the Synapse UI and SSMS using AD authentication.
The problem is I want to connect an app that doesn't support AD authentication to the serverless SQL database. Therefore I want to connect it using a standard SQL account. I have set up a SQL account (username and password) and I'm able to connect through SSMS, but not query any tables, due to this error...
External table 'TableName' is not accessible because content of directory cannot be listed.
I assume this is a double-hop authentication problem because the SQL user doesn't have access to the storage account? I can't seem to find any guides on how to do this. Does anyone know?
I've written a blog post where this issue is tackled, as I encountered this problem myself a few days ago. You can read it here.
Basically, it comes down to the fact that you have to:
create a SQL login for your user
create a credential in SQL that has the same name as the URL that points to the container in your datalake that contains the files you want to query
grant reference rights on that credential to your SQL login
create a user on your database for that login
In addition to that, you also need to create some specific role assignments.
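A minimal sketch of the steps above, assuming the Synapse workspace managed identity already has access to the storage account (the login, password, user, and storage URL below are all placeholders):
-- 1. In the master database: create a SQL login.
CREATE LOGIN AppLogin WITH PASSWORD = '<StrongPassword>';
-- 2. In the master database: create a credential named exactly after the container URL,
--    here using the workspace managed identity.
CREATE CREDENTIAL [https://<YourDataLakeName>.dfs.core.windows.net/<YourContainer>]
WITH IDENTITY = 'Managed Identity';
-- 3. Grant the login the right to reference that credential.
GRANT REFERENCES ON CREDENTIAL::[https://<YourDataLakeName>.dfs.core.windows.net/<YourContainer>] TO AppLogin;
-- 4. In the database that contains the external tables: create a user for the login.
CREATE USER AppUser FOR LOGIN AppLogin;
-- You may also need to GRANT SELECT on the external tables to AppUser.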

Unable to connect to Azure data lake store via SSIS

I have been trying to get my SSIS package (on-premises) to connect to my Data Lake Store. I have installed the Azure Feature Pack, which has worked fine.
But when I create a Data Lake connection in my SSIS package, I need the following:
Image of SSIS Azure Data Lake connector Manager
ADLS Host - which is fine, I know how to get that.
Authentication (Azure AD User Identity)
Username & Password - which I am having issues with.
My question is how do I define a username and password for my data lake?
You can find them in Azure AD; the user is within the same subscription as your Azure Data Lake. Usually it is the email address and password you use to log in to the Azure portal.
For more details, you can refer to this documentation.

Debug SQL database scoped credential failure

I created a scoped credential in an Azure SQL Data Warehouse database to create an external table over some files in an Azure Data Lake Store.
When I try creating the external table I get the following message:
Msg 105061, Level 16, State 1, Line 35 Unable to find any valid
credential associated with the specified data source. Credential is
required to connect to Azure Data Lake Store.
How do I troubleshoot this? My Azure AD application has access to the storage. I use the same AD application (with a different key) for my Azure Data Factory pipeline that stores the files in the Azure Data Lake Store.
I haven't found any commands that let you test your credentials and see what credentials the database tries to use or why it fails. Any ideas?
https://learn.microsoft.com/en-us/sql/t-sql/statements/create-database-scoped-credential-transact-sql
It turned out I had missed adding my scoped credential when I created the external data source. Create the scoped credential first, then reference it when you create the external data source.
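A minimal sketch of that ordering for Azure Data Lake Store with an Azure AD application (the credential name, client ID, OAuth token endpoint, key, and data lake name are all placeholders):
-- 1. Create the database scoped credential first (a database master key must already exist).
CREATE DATABASE SCOPED CREDENTIAL ADLSCredential
WITH IDENTITY = '<client_id>@<OAuth_2.0_token_endpoint>',
     SECRET = '<azure_ad_application_key>';
-- 2. Then create the external data source and reference the credential.
CREATE EXTERNAL DATA SOURCE AzureDataLakeStore
WITH (
    TYPE = HADOOP,
    LOCATION = 'adl://<YourDataLakeName>.azuredatalakestore.net',
    CREDENTIAL = ADLSCredential
);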
