Link Azure SQL Database to Data Factory using managed identity

I have been trying to use a managed identity to connect to Azure SQL Database from Azure Data Factory.
The steps are as follows:
Created a linked service and selected Managed Identity as the authentication type
On the SQL server, added the managed identity created for Azure Data Factory as the Active Directory admin
These steps let me do all data operations on the database, and that is exactly the problem: I want to restrict the privileges granted to Azure Data Factory on my SQL database.
First, let me know whether I have followed the correct steps to set up the managed identity. Then, how do I limit the privileges? I don't want Data Factory to run any DDL on the SQL database.

As Raunak comments, you should change the role to db_datareader.
In your SQL database, run this SQL:
CREATE USER [your Data Factory name] FROM EXTERNAL PROVIDER;
and this SQL:
ALTER ROLE db_datareader ADD MEMBER [your Data Factory name];
You can find your Data Factory name on the factory's overview page in the Azure portal.
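To confirm that the user was created and holds the role, you can query the database's catalog views; a minimal sketch (replace the name with your own factory's):
-- List the database roles held by the Data Factory's user.
SELECT m.name AS member_name, r.name AS role_name
FROM sys.database_role_members drm
JOIN sys.database_principals m ON m.principal_id = drm.member_principal_id
JOIN sys.database_principals r ON r.principal_id = drm.role_principal_id
WHERE m.name = 'your Data Factory name';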
Then, when Data Factory attempts any write or DDL operation, you will get an error like this:
"errorCode": "2200",
"message": "ErrorCode=SqlOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=A database operation failed. Please search error to get more details.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Data.SqlClient.SqlException,Message=The INSERT permission was denied on the object
Update:
1. Search for and select your SQL server in the Azure portal.
2. Select your own account and save it as the Active Directory admin.
3. Connect to the SQL database and run the two SQL statements above.
For more details, you can refer to the documentation.
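As a side note, if the pipeline later needs to load data (for example as a copy-activity sink) while still being blocked from DDL, the db_datawriter role can be added alongside db_datareader; a minimal sketch:
-- Assumption: the factory should read and write rows, but never alter the schema.
ALTER ROLE db_datawriter ADD MEMBER [your Data Factory name];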

Related

Azure SQL database with MFA login to connect from Azure ADF

I have an Azure SQL server and database with MFA login, and I am the admin. But when I try to establish a connection from ADF to this database via a new linked service using the System Managed Identity option, it throws this error:
"Cannot connect to SQL Database. Please contact SQL server team for further support. Server: 'Server details', Database: 'database name', User: ''. Check the linked service configuration is correct, and make sure the SQL Database firewall allows the integration runtime to access.
I have already given the ADF system managed identity the Contributor role on the SQL database. I have also tried to access the database using both the AutoResolve and an Azure integration runtime, but the error persists.
It sounds like you are missing the user creation and role assignment within the SQL database:
Connect to the database with your account and create an account for the data factory:
CREATE USER [<datafactory-name>] FROM EXTERNAL PROVIDER;
Then grant it the required role for your task:
ALTER ROLE [<roleName>] ADD MEMBER [<datafactory-name>];
Some available role names are listed below (a combined example follows the list):
db_accessadmin
db_backupoperator
db_datareader
db_datawriter
db_ddladmin
db_denydatareader
db_denydatawriter
db_owner
db_securityadmin
public
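For instance, a copy pipeline that needs to read and write rows but not change the schema could be set up as in this sketch (the factory name adf-demo is a placeholder; substitute your own):
-- Hypothetical factory name; replace with your Data Factory's actual name.
CREATE USER [adf-demo] FROM EXTERNAL PROVIDER;
ALTER ROLE db_datareader ADD MEMBER [adf-demo];
ALTER ROLE db_datawriter ADD MEMBER [adf-demo];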
I created an Azure SQL database in the portal and a linked service in Azure Data Factory with managed identity authentication, and I got the same error.
I followed the procedure below to resolve it:
I turned on the managed identity of the data factory.
I set an admin for the Azure SQL server.
I logged in to the SQL database as that admin and created a user named after the data factory:
CREATE USER [DATA FACTORY NAME] FROM EXTERNAL PROVIDER;
I added a role to that user:
ALTER ROLE db_datareader ADD MEMBER [DATA FACTORY NAME];
I tested the linked service again, and it connected successfully.
It worked for me; please check from your end.

DataBricks UnityCatalog create table fails with "Failed to acquire a SAS token UnauthorizedAccessException: PERMISSION_DENIED: request not authorized"

I'm new to DataBricks Unity Catalog and I'm trying to follow the quickstart notebook on https://docs.databricks.com/_static/notebooks/unity-catalog-example-notebook.html.
It seems to me I did everything I had to do:
I created a Databricks access connector in Azure (which becomes a managed identity)
I created an ADLS Gen2 storage account (a data lake with hierarchical namespace) plus a container
On my data lake container, I assigned the Storage Blob Data Contributor role to the managed identity above
I created a new Databricks Premium workspace
I created a new metastore in Unity Catalog that "binds" the access connector to the data lake
I bound the metastore to the premium Databricks workspace
I gave my Databricks user Admin permission on the above Databricks workspace
I created a new cluster in the same premium workspace, choosing runtime 11.1 and "single user" access mode
I ran the notebook, which correctly created a new catalog, assigned proper rights to it, created a schema, and confirmed that I am the owner of that schema
The only (but most important) SQL command of the same notebook that fails is the one that tries to create a managed Delta table and insert two records:
CREATE TABLE IF NOT EXISTS quickstart_catalog_mauromi.quickstart_schema_mauromi.quickstart_table
(columnA Int, columnB String) PARTITIONED BY (columnA);
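The quickstart follows the table creation with an insert of two records, roughly like the sketch below (the exact values here are illustrative, not taken from the notebook):
INSERT INTO quickstart_catalog_mauromi.quickstart_schema_mauromi.quickstart_table
VALUES (1, 'one'), (2, 'two');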
When I run the CREATE TABLE statement, it starts working, and in fact it begins creating the folder structure for this delta table in my storage account; however, it then fails with the following error:
java.util.concurrent.ExecutionException: Failed to acquire a SAS token for list on /data/a3b9da69-d82a-4e0d-9015-51646a2a93fb/tables/eab1e2cc-1c0d-4ee4-9a57-18f17edcfabb/_delta_log due to java.util.concurrent.ExecutionException: com.databricks.sql.managedcatalog.acl.UnauthorizedAccessException: PERMISSION_DENIED: request not authorized
Please consider that I didn't have any folders under the "unity-catalog" container before running the table creation command. So it seems that it can successfully create the folder structure, but after it creates the "table" folder it can't acquire the SAS token.
I can't understand this: I am an admin in this workspace, the Databricks managed identity is assigned the Contributor role on the storage container, and Databricks actually starts creating the folders. What else should I configure?
I found it: it is not enough to assign the Storage Blob Data Contributor role to the Azure Databricks access connector at the container level; you need to assign the same role to the same connector at the STORAGE ACCOUNT level as well.
I couldn't find this information in the documentation, and I frankly can't understand why it is needed, since the delta table path was created.
However, this way it works.
I solved this issue by doing the following:
Grant the "Access Connector for Azure Databricks" the permission "Storage Blob Data Reader" at the Storage Account level.
Grant the "Access Connector for Azure Databricks" the permission "Storage Blob Data Contributor" at the container level used by the workspace.
That keeps the permissions a bit more restrictive, without having to go all the way to the 'Owner' level.

Azure Data Factory Exception while reading table from Synapse and using staging for Polybase

I'm using Data Flow in Data Factory and I need to join a table from Synapse with my flow of data.
When I added the new source in Azure Data Flow I had to add a Staging linked service (as the label said: "For SQL DW, please specify a staging location for PolyBase.")
So I specified a path in Azure Data Lake Gen2 in which PolyBase can create its temp dir.
Nevertheless I'm getting this error:
{"StatusCode":"DFExecutorUserError","Message":"Job failed due to reason: at Source 'keyMapCliente': shaded.msdataflow.com.microsoft.sqlserver.jdbc.SQLServerException: CREATE EXTERNAL TABLE AS SELECT statement failed as the path name 'abfss://MyContainerName#mystorgaename.dfs.core.windows.net/Raw/Tmp/e3e71c102e0a46cea0b286f17cc5b945/' could not be used for export. Please ensure that the specified path is a directory which exists or can be created, and that files can be created in that directory.","Details":"shaded.msdataflow.com.microsoft.sqlserver.jdbc.SQLServerException: CREATE EXTERNAL TABLE AS SELECT statement failed as the path name 'abfss://MyContainerName#mystorgaename.dfs.core.windows.net/Raw/Tmp/e3e71c102e0a46cea0b286f17cc5b945/' could not be used for export. Please ensure that the specified path is a directory which exists or can be created, and that files can be created in that directory.\n\tat shaded.msdataflow.com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:262)\n\tat shaded.msdataflow.com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1632)\n\tat shaded.msdataflow.com.microsoft.sqlserver.jdbc.SQLServerStatement.doExecuteStatement(SQLServerStatement.java:872)\n\tat shaded.msdataflow.com.microsoft.sqlserver.jdbc.SQLServerStatement$StmtExecCmd.doExecute(SQLServerStatement.java:767)\n\tat shaded.msdataflow.com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:7418)\n\tat shaded.msdataflow.com.microsoft.sqlserver.jd"}
(Screenshots: the Azure Data Flow staging settings, and the source added inside the data flow.)
Any help is appreciated
I have reproduced this and was able to use an Azure Data Lake Gen2 storage account as the staging location for PolyBase and connect to the Synapse table data successfully.
Create your database scoped credential with the Azure storage account key as the secret.
Create an external data source and an external table using the scoped credential you created.
In Azure Data Factory:
Enable staging and connect to the Azure Data Lake Gen2 storage account with the Account Key authentication type.
In the data flow, connect your source to the Synapse table and enable the staging property in the source options.
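For reference, the credential and data source steps might look like the sketch below in the Synapse dedicated pool (the names adls_cred and adls_src, the password, and the container URL are all placeholders):
-- One-time setup per database; use a strong password of your own.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPassword123!>';
-- Credential backed by the storage account key.
CREATE DATABASE SCOPED CREDENTIAL adls_cred
WITH IDENTITY = 'user', SECRET = '<storage-account-key>';
-- External data source pointing at the ADLS Gen2 container.
CREATE EXTERNAL DATA SOURCE adls_src
WITH (TYPE = HADOOP,
      LOCATION = 'abfss://<container>@<account>.dfs.core.windows.net',
      CREDENTIAL = adls_cred);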

Permission Issue on Bulk Insert in Azure SQL Server

A user is getting the error below while running a BULK INSERT command. I am using Azure SQL Database, not SQL Server, and most of the GRANT commands related to BULK INSERT do not work in Azure SQL Database.
Error
You do not have permission to use the bulk load statement.
Commands Tried in Azure SQL Database to Add User
EXEC sp_addrolemember 'db_ddladmin', 'testuser';
ALTER SERVER ROLE [bulkadmin] ADD MEMBER testuser
GRANT ADMINISTER BULK OPERATIONS TO testuser
Error
Msg 40520, Level 16, State 1, Line 5
Securable class 'server' not supported in this version of SQL Server.
Your help is highly appreciated.
In Azure SQL Database, grant ADMINISTER DATABASE BULK OPERATIONS to the principal in the context of the desired database:
GRANT ADMINISTER DATABASE BULK OPERATIONS TO testuser;
The user will also need INSERT permissions on the target table. These Azure SQL Database permissions are detailed in the BULK INSERT documentation under the permissions section.
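Putting both together, a least-privilege setup might look like this sketch (dbo.TargetTable is a placeholder for your actual target table):
-- Allow the user to run bulk load statements in this database.
GRANT ADMINISTER DATABASE BULK OPERATIONS TO testuser;
-- Allow inserts into the specific target table only.
GRANT INSERT ON OBJECT::dbo.TargetTable TO testuser;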
On Azure this works only on tables in the database in question; it does not work on temp tables. So if you are bulk loading in parallel and want to use temp tables, you are in a corner.
GRANT CONTROL TO testuser;
Nothing else is needed; just execute this in the user database (not master).
Full steps:
In master:
CREATE LOGIN login1 WITH password='1231!#ASDF!a';
In the user database:
CREATE USER user1 FROM LOGIN login1;
GRANT CONTROL TO user1; -- this is what makes bulk load work

Create Database Permission on Azure SQL DW

I'm trying to assign the Create Database permission to a user account in Azure SQL DW, but it gives an error on execution.
T-SQL:
grant alter any database to [test_user1]
Error:
Securable class 'server' not supported in this version of SQL Server.
Please let me know whether this permission (Create Database) can be used in Azure SQL DW.
