I created a scoped credential in an Azure SQL Data Warehouse database to create an external table over some files in an Azure Data Lake Store.
When I try creating the external table I get the following message:
Msg 105061, Level 16, State 1, Line 35 Unable to find any valid
credential associated with the specified data source. Credential is
required to connect to Azure Data Lake Store.
How do I troubleshoot this? My Azure AD application has access to the storage. I use the same AD application (with a different key) for my Azure Data Factory pipeline that stores the files in the Azure Data Lake Store.
I haven't found any commands that let you test your credentials, see which credentials the database tries to use, or see why it fails. Any ideas?
https://learn.microsoft.com/en-us/sql/t-sql/statements/create-database-scoped-credential-transact-sql
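The closest I have found is querying the system catalog views, which at least shows what the database has registered (secrets are not readable):

-- List existing database scoped credentials and their identities.
SELECT name, credential_identity FROM sys.database_scoped_credentials;

-- List external data sources and which credential, if any, each one references.
SELECT name, location, credential_id FROM sys.external_data_sources;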
It turned out I had missed referencing my scoped credential when I created the external data source. So: create the scoped credential first, then reference it when you create the external data source.
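For anyone hitting the same error, the working order looks roughly like this (a sketch only; the names, key, and OAuth token endpoint are placeholders, not my actual values):

-- A database master key must already exist before creating the credential.
-- 1. Create the scoped credential first (AAD application id @ OAuth 2.0 token endpoint).
CREATE DATABASE SCOPED CREDENTIAL ADLSCredential
WITH IDENTITY = '<client_id>@<OAuth_2.0_token_endpoint>',
SECRET = '<aad_application_key>';

-- 2. Then create the external data source, referencing the credential.
CREATE EXTERNAL DATA SOURCE AzureDataLakeStore
WITH (
    TYPE = HADOOP,
    LOCATION = 'adl://<datalakename>.azuredatalakestore.net',
    CREDENTIAL = ADLSCredential -- omitting this reference causes Msg 105061
);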
I am trying to create an ADF linked service connection to a Synapse Link serverless SQL pool connected to ADLS storage. I can successfully get a connection, but when I try to use a dataset to access the data I get a permission issue.
I can successfully access the data via Synapse Studio, but when I use the dataset in ADF I get a credential error. I can also look at the schemas in SSMS, where they appear as external tables, but I get a similar credential error at the same point.
Has anyone come across this issue, please?
There are a few pieces of information you haven't supplied in your question, but I believe I know what happened. The external table worked in Synapse Studio because you were connected to the serverless SQL pool with your AAD account, and it passed your AAD credentials through to the data lake and succeeded.
However, when you set up the linked service to the serverless SQL pool, I'm guessing you used a SQL auth account for the credentials. With SQL auth it doesn't know how to authenticate with the data lake, so it looked for a server-scoped credential but couldn't find one.
The same presumably happened when you connected from SSMS with a SQL auth account.
You have several options. If it's important to be able to access the external table with SQL auth, you can execute the following to tell it how to access the data lake. This assumes the Synapse workspace managed service identity has the Storage Blob Data Reader or Storage Blob Data Contributor role on the data lake.
CREATE CREDENTIAL [https://<YourDataLakeName>.dfs.core.windows.net]
WITH IDENTITY = 'Managed Identity';
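Note that the credential name in square brackets is not arbitrary: the serverless pool matches it against the URL of the storage being accessed, so it needs to correspond to your data lake endpoint (use the blob endpoint form instead if that is what your external tables point at).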
Or you could change the authentication on the linked service to use the Managed Service Identity.
I have been trying to connect my SSIS package (on-prem) to my data lake store. I have installed the Azure Feature Pack, which has worked fine.
But when I create a Data Lake connection in my SSIS package, I need the following:
[Image of the SSIS Azure Data Lake connection manager]
ADLS Host – which is fine, I know how to get that.
Authentication (Azure AD User Identity)
UserName & Password – which I am having issues with.
My question is: how do I define a username and password for my data lake?
These are the credentials of an Azure AD user within the same subscription as your Azure Data Lake. Usually it is the email address and password you use to log in to the Azure portal.
For more details, you can refer to this documentation.
My application will write several records to an Azure SQL Database on a regular basis. I found that the most efficient way to do this was BULK INSERT, so I was building a blob-triggered function that executes every time a file is uploaded to my blob container and loads its contents into the database. I read here that a scoped credential should be created in the database with a SAS from the blob storage account. However, I understand a SAS has an expiration date, so I guess I would have to update the scoped credential at some point. Is there a way to create a permanent credential so I don't have to update it?
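For context, the setup I am describing looks roughly like this (a sketch; the object names and SAS token are placeholders):

-- Scoped credential holding the SAS token (without the leading '?').
CREATE DATABASE SCOPED CREDENTIAL BlobSasCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<sas_token>';

-- External data source over the container, using the credential.
CREATE EXTERNAL DATA SOURCE UploadContainer
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://<storageaccount>.blob.core.windows.net/<container>',
    CREDENTIAL = BlobSasCredential
);

-- The function then runs something like:
BULK INSERT dbo.Records
FROM 'incoming/records.csv'
WITH (DATA_SOURCE = 'UploadContainer', FORMAT = 'CSV', FIRSTROW = 2);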
Have you tried using a function output binding instead to connect to the SQL database? This would keep the function code simple.
See here for an example of how to achieve this: http://davidpallmann.blogspot.com/2019/02/new-sql-database-binding-for-azure.html
I am trying to restore a SQL Server database in Azure from a database backup file stored in a blob. I have followed this link but got this error:
TITLE: Microsoft SQL Server Management Studio
An error occurred while loading data.
ADDITIONAL INFORMATION:
The type of a blob in the container is unrecognized by this version. (Microsoft.SqlServer.StorageClient)
The remote server returned an error: (409) Conflict. (System)
I have also tried this:
CREATE CREDENTIAL mycredential1
WITH IDENTITY = 'jjt', -- the name of the storage account specified when the storage account was created
SECRET = 'storage account key'
I then tried to use the credential to restore the SQL database from the Azure blob, but it failed at that step with the following error:
Msg 40514, Level 16, State 1, Line 1
'CREATE CREDENTIAL' is not supported in this version of SQL Server.
What is the correct way?
You cannot use CREATE CREDENTIAL on Azure SQL Database; you need to create a DATABASE SCOPED CREDENTIAL instead, as shown below:
CREATE DATABASE SCOPED CREDENTIAL UploadInvoices
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'QLYMgmSXMklt%2FI1U6DcVrQixnlU5Sgbtk1qDRakUBGs%3D';
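Note that the SECRET is the SAS token itself with the leading '?' removed, and the token must of course still be unexpired at the time the credential is used.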
For more information, please read this article.
Additionally, you cannot restore a native SQL Server backup into an Azure SQL Database. The article you are referencing restores a bacpac; you can import a bacpac into an Azure SQL Database.
Please create a bacpac of your on-premises SQL Server database using SQL Server Management Studio (SSMS): right-click the database, choose Tasks, then choose Export Data-tier Application. Once the bacpac has been created, you can import it into Azure SQL Database as a new database.
A better method to migrate your database from a SQL Server instance to Azure SQL Database is the latest version of the Data Migration Assistant, which you can download from here. This tool performs an assessment of your SQL Server database and lets you know about any adjustments you need to make for the database to be compatible with Azure SQL Database. If the database is compatible, the tool can migrate it to an Azure SQL Database logical server.
I have written a console application to monitor and analyze the files in an Azure Data Lake Store, and I have created an application in Azure Active Directory to access the Azure resources.
I have followed all the steps given here to grant the application access to the Azure Data Lake Store, covering the parent and all the child folders/files.
Now I am able to access the files through my code. I am trying to get the modification time and expiration time of a file produced by a U-SQL job in the Data Lake Store by using the DataLakeStoreFileSystemManagement client from the Microsoft-provided .NET API for Data Lake Analytics and Data Lake Store.
I get all of that information for the files I have explicitly granted access to. But when the U-SQL job adds a new folder or file to the Azure Data Lake Store, I do not get the modification time and expiration in my code; instead I get a 403 Forbidden error. The U-SQL jobs create plenty of folders every day, and I can't go and grant access manually to every newly created file and folder; they should inherit the access role of their parent folders.
What should I do? Or is this a bug in Azure Data Lake Store? Please help.
You are not allowed to call REST endpoints from within user code in U-SQL (the reasons are explained here). The DataLakeStoreFileSystem management client is attempting to (recursively) call into ADLS through REST endpoints and is being blocked by the container boundary protection, so the 403 (Forbidden) is by design.
We are working on adding file properties to our U-SQL APIs as meta properties in one of the upcoming refreshes. Would that help?
We need more information to debug the issue you are facing. Please file a support ticket (from the Azure portal) and then email me the ticket number (cpalmer@microsoft.com). In the support ticket, identify the ADLS account name, the timestamp of the access, the name of the file/folder you were accessing that got the 403, and the (approximate) timestamp when you believe you created that file/folder.