Loading file from Azure Blob storage into Azure SQL DB: error code 86 The specified network password is not correct - azure

I've been trying to run the following script to read the file from azure blob storage.
--------------------------------------------
-- CREATING CREDENTIAL
--------------------------------------------
--------------------------------------------
-- Shared access signature
--------------------------------------------
CREATE DATABASE SCOPED CREDENTIAL dlcred
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=2018-03-28&ss=bfqt&srt=sco&sp=rwdlacup&se=2019-12-01T07:28:58Z&st=2019-08-31T23:28:58Z&spr=https,http&sig=<signature from storage account>';
--------------------------------------------
-- CREATING SOURCE
--------------------------------------------
CREATE EXTERNAL DATA SOURCE datalake
WITH (
TYPE =  BLOB_STORAGE,
LOCATION='https://<storageaccount>.blob.core.windows.net/<blob>',
CREDENTIAL = dlcred
);
Originally the script worked just fine, but later on it started giving the following error when running the last query below: Cannot bulk load because the file "test.txt" could not be opened. Operating system error code 86 (The specified network password is not correct.)
--------------------------------------------
-- TEST
--------------------------------------------
SELECT CAST(BulkColumn AS XML)
FROM OPENROWSET
(
 BULK 'test.xml',
 DATA_SOURCE = 'datalake', 
 SINGLE_BLOB
) as xml_import
The same error happens if I create the credential with a service principal or an access key.
I've tried literally everything and logged a ticket with Azure support, but they are struggling to replicate the error.
I feel like the issue lies outside of the storage account and the SQL server - Azure has a whole bunch of services that can be activated/deactivated against a subscription, and I suspect one of these is preventing us from successfully mapping the storage account.
Has anyone encountered this error? And if so, how did you solve it?

I was able to get this issue resolved with Microsoft support. I granted the Storage Blob Data Contributor role to the managed identity of the SQL Server instance, then ran the SQL statements from section F ("Importing data from a file in Azure Blob Storage") of this article: https://learn.microsoft.com/en-us/sql/t-sql/statements/bulk-insert-transact-sql?view=sql-server-ver15#f-importing-data-from-a-file-in-azure-blob-storage.
Preserving the code solution below:
CREATE DATABASE SCOPED CREDENTIAL msi_cred WITH IDENTITY = 'Managed Identity';
GO
CREATE EXTERNAL DATA SOURCE MyAzureBlobStorage
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://****************.blob.core.windows.net/curriculum',
    CREDENTIAL = msi_cred
);
BULK INSERT Sales.Invoices
FROM 'inv-2017-12-08.csv'
WITH (DATA_SOURCE = 'MyAzureBlobStorage');
In order for this to work, the SQL Server instance needs a managed identity assigned to it. This can be done at creation time with the --assign-identity flag.
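To sanity-check the fix, here is a minimal sketch that re-runs the test query from the original question against the managed-identity data source created above; the blob name test.xml is an assumption carried over from the question, so substitute your own blob.
-- Hedged sketch: re-run the original OPENROWSET test against the
-- managed-identity data source. Assumes a blob named 'test.xml'
-- exists in the container that MyAzureBlobStorage points to.
SELECT CAST(BulkColumn AS XML) AS xml_payload
FROM OPENROWSET
(
    BULK 'test.xml',
    DATA_SOURCE = 'MyAzureBlobStorage',
    SINGLE_BLOB
) AS xml_import;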

Related

Error when creating view on pipeline (problem with BULK path)

Good morning everybody!
My team and I managed to create part of an Azure Synapse pipeline which selects the database and creates a data source named 'files'.
Now we want to create a view in the same pipeline using a Script activity. However, this error comes up:
Error message here
Even if we hardcode the folder names and the file name in the path, the pipeline won't recognise the existence of the file in question.
This is our query. If we run it manually on a script in the Develop section everything works smoothly:
CREATE VIEW query here
We expected to get every file with the ".parquet" extension inside every folder available in our data source named 'files'. However, running this query from the Azure Synapse pipeline doesn't work, while running it from a script in the Develop section works perfectly. We want to achieve the same result from the pipeline.
Could anyone help us out?
Thanks in advance!
I tried to reproduce the same thing in my environment and got the same error.
The cause of the error can be that your Synapse service principal, or the user accessing the storage account, does not have the Storage Blob Data Contributor role assigned to it, or that your external data source has some issue. Try creating a new external data source with a SAS token.
Sample code:
CREATE DATABASE SCOPED CREDENTIAL SasToken
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<SAS token>';
GO
CREATE EXTERNAL DATA SOURCE mysample1
WITH (
    LOCATION = '<storage account URL>',
    CREDENTIAL = SasToken
);
CREATE VIEW [dbo].[View4] AS
SELECT
    [result].filepath(1) AS [YEAR],
    [result].filepath(2) AS [MONTH],
    [result].filepath(3) AS [DAY],
    *
FROM OPENROWSET(
    BULK 'fsn2p/*-*-*.parquet',
    DATA_SOURCE = 'mysample1',
    FORMAT = 'PARQUET'
) AS [result]
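As a usage sketch, assuming the data source and view above were created successfully and the fsn2p path really contains parquet files, the view can then be queried from the pipeline's Script activity like any other object:
-- Hedged usage sketch: query the view created above.
-- Assumes [dbo].[View4] and the 'mysample1' data source exist in this database.
SELECT TOP (10) *
FROM [dbo].[View4];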

Empty error while executing SSIS package in Azure Data Factory

I have created a simple SSIS project, and in this project I have a package that deletes a particular file in the Downloads folder.
I deployed this project to Azure, and when I try to execute this package using Azure Data Factory, the pipeline fails with an empty error (I am attaching the screenshot here).
What I have tried so far to fix this error:
I have added self-hosted IR to Azure-SSIS IR as the proxy to access the data on-premise.
Set the ConnectByProxy as True.
Converted the project to Project Deployment Model.
Please help me out to fix this error and if you need more details then just leave a comment.
Windows Authentication:
To access data stores such as SQL Server instances/file shares on premises or Azure Files, check the Windows authentication check box.
If this check box is selected, fill in the Domain, Username, and Password fields with the values for your package execution credentials. For example, to access Azure Files, the domain is Azure, the username is <storage account name>, and the password is <storage account key>.
Using the secrets stored in your Azure Key Vault
As a substitute, you can use secrets stored in your Azure Key Vault as the values. To do so, select the AZURE KEY VAULT check box next to them. Create a new key vault linked service, or select or edit an existing one, and then choose the secret name and version for your value. When creating or editing the key vault linked service, you can pick or edit an existing key vault or create a new one. If you haven't previously done so, grant the Data Factory managed identity access to your key vault. You can also enter your secret directly in the format <key vault linked service name>/<secret name>/<secret version>.
Note: If you are using Windows authentication, there are four methods to access data stores with Windows authentication from SSIS packages running on your Azure-SSIS IR: Access data stores and file shares with Windows authentication from SSIS packages in Azure | Docs
Make sure your scenario falls under one of those methods, otherwise the package could fail at run time.

Azure ML studio export data Azure Storage V2

I already posted my problem here and they suggested that I post it here.
I am trying to export data from Azure ML to Azure Storage but I have this error:
Error writing to cloud storage: The remote server returned an error: (400) Bad Request.. Please check the url. . ( Error 0151 )
My blob storage configuration is Storage v2 / Standard and Require secure transfer set as enabled.
If I set Require secure transfer to disabled, the export works fine.
How can I export data to my blob storage with Require secure transfer enabled?
According to the official tutorial Export to Azure Blob Storage, there are two authentication types for exporting data to Azure Blob Storage: SAS and Account. They are described below.
For Authentication type, choose Public (SAS URL) if you know that the storage supports access via a SAS URL.
A SAS URL is a special type of URL that can be generated by using an Azure storage utility, and is available for only a limited time. It contains all the information that is needed for authentication and download.
For URI, type or paste the full URI that defines the account and the public blob.
For private accounts, choose Account, and provide the account name and the account key, so that the experiment can write to the storage account.
Account name: Type or paste the name of the account where you want to save the data. For example, if the full URL of the storage account is http://myshared.blob.core.windows.net, you would type myshared.
Account key: Paste the storage access key that is associated with the account.
I tried to use a simple module combination, shown in the figure and Python code below, to reproduce the issue you got.
import pandas as pd

def azureml_main(dataframe1 = None, dataframe2 = None):
    # Build a small test dataframe to feed into the Export Data module
    dataframe1 = pd.DataFrame(data={'col1': [1, 2], 'col2': [3, 4]})
    return dataframe1,
When I used the Account authentication type with my Blob Storage V2 account, I got the same Error 0151 issue as yours, shown below after clicking the View error log button under the View output log link.
Error 0151
There was an error writing to cloud storage. Please check the URL.
This error in Azure Machine Learning occurs when the module tries to write data to cloud storage but the URL is unavailable or invalid.
Resolution
Check the URL and verify that it is writable.
Exception Messages
Error writing to cloud storage (possibly a bad url).
Error writing to cloud storage: {0}. Please check the url.
Based on the error description above, the error seems to be caused by the Export Data module generating an incorrect SAS blob URL from the account information. I suspect that code is old and not compatible with the new Storage V2 API or API version. You can report it to feedback.azure.com.
However, when I switched to the SAS authentication type and entered a blob URL with a SAS query string for my container, which I generated via the Azure Storage Explorer tool as shown below, it worked fine.
Fig 1: Right-click the container of your Blob Storage account and click Get Shared Access Signature.
Fig 2: Enable the Write permission (using the UTC timezone is recommended) and click the Create button.
Fig 3: Copy the Query string value and build a blob URL with the container SAS query string, like https://<account name>.blob.core.windows.net/<container name>/<blob name><query string>
Note: The blob must not already exist in the container, otherwise an Error 0057 will be raised.

Azure blob storage networking rules (Ip) for Azure data warehouse

I need to load external data (in blob storage) to my Azure data warehouse using Polybase. I had it working fine when I was using Classic Azure Storage.
Recently, I had to migrate our storage to ARM and I could not figure out how to set up a firewall rule on the ARM storage account for my Azure data warehouse. If I set the firewall to "All networks", everything works seamlessly. However, I cannot leave the blob storage wide open.
I tried using nslookup to find the outbound IP for our Azure data warehouse and put that value into the storage firewall, but I got a "This request is not authorized to perform this operation." error.
Is there a way to find the IP address of an Azure data warehouse, or should I use a different approach to make this work?
Any Suggestions are appreciated.
Kevin
Under section "1.1 Create a Credential" of the loading tutorial, it states:
Don't skip this step if you are using this tutorial as a template for loading your own data. To access data through a credential, use the following script to create a database-scoped credential, and then use it when defining the location of the data source.
-- A: Create a master key.
-- Only necessary if one does not already exist.
-- Required to encrypt the credential secret in the next step.
CREATE MASTER KEY;
-- B: Create a database scoped credential
-- IDENTITY: Provide any string, it is not used for authentication to Azure storage.
-- SECRET: Provide your Azure storage account key.
CREATE DATABASE SCOPED CREDENTIAL AzureStorageCredential
WITH
    IDENTITY = 'user',
    SECRET = '<azure_storage_account_key>';
-- C: Create an external data source
-- TYPE: HADOOP - PolyBase uses Hadoop APIs to access data in Azure blob storage.
-- LOCATION: Provide Azure storage account name and blob container name.
-- CREDENTIAL: Provide the credential created in the previous step.
CREATE EXTERNAL DATA SOURCE AzureStorage
WITH (
    TYPE = HADOOP,
    LOCATION = 'wasbs://<blob_container_name>@<azure_storage_account_name>.blob.core.windows.net',
    CREDENTIAL = AzureStorageCredential
);
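To show how that external data source is then consumed by a PolyBase load, here is a minimal, hedged sketch of the follow-on steps; the file format options, the /data/ folder, the column list, and the table names are hypothetical placeholders rather than part of the tutorial snippet above.
-- Hedged sketch of the next PolyBase steps (names, path, and columns are assumptions).
-- D: Define the external file format of the files in blob storage.
CREATE EXTERNAL FILE FORMAT TextFileFormat
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',', FIRST_ROW = 2)
);
-- E: Create an external table over the files in the container.
CREATE EXTERNAL TABLE dbo.ExternalSales
(
    SaleId INT,
    Amount DECIMAL(18, 2)
)
WITH (
    LOCATION = '/data/',
    DATA_SOURCE = AzureStorage,
    FILE_FORMAT = TextFileFormat
);
-- F: Load the data into the data warehouse with CTAS.
CREATE TABLE dbo.Sales
WITH (DISTRIBUTION = ROUND_ROBIN)
AS
SELECT * FROM dbo.ExternalSales;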
Edit (an additional way to access blobs from ADW through the use of SAS):
You can also create a storage linked service by using a shared access signature. It provides Data Factory with restricted, time-bound access to all or specific resources (blob/container) in the storage account.
A shared access signature provides delegated access to resources in your storage account. You can use a shared access signature to grant a client limited permissions to objects in your storage account for a specified time. You don't have to share your account access keys. The shared access signature is a URI that encompasses in its query parameters all the information necessary for authenticated access to a storage resource. To access storage resources with the shared access signature, the client only needs to pass in the shared access signature to the appropriate constructor or method. For more information about shared access signatures, see Shared access signatures: Understand the shared access signature model.
The full document can be found here.

Restoring Database from Azure Blob Storage failing from SSMS while using RESTORE FILELISTONLY

I am trying to restore a SQL Server 2016 database backup file, which is in Azure Blob Storage, from SSMS using the below T-SQL command:
RESTORE FILELISTONLY
FROM URL = 'https://<storageaccount>.blob.core.windows.net/<container>/<backupfile>.bak'
GO
It works fine with my normal Azure subscription, but when I use a CSP account I get the below error:
Cannot open backup device 'https://<storageaccount>.blob.core.windows.net/<container>/<backupfile>.bak'. Operating system error 86 (The specified network password is not correct.).
Any help on fixing this issue is greatly appreciated.
Following the steps below, you should be able to get the file list.
First you need to create a 'credential': e.g.
CREATE CREDENTIAL [cmbackupprd-sqlbackup]
WITH
    IDENTITY = '<storageaccountname>',
    SECRET = '<long-and-lengthy-storageaccountkey>';
Now you can use this credential to connect to your storage-account.
RESTORE FILELISTONLY
FROM URL = 'https://yourstorageaccount.blob.core.windows.net/path/to/backup.bak'
WITH CREDENTIAL = 'cmbackupprd-sqlbackup';
Note: I'm assuming the backup was made directly from SQL Server to Azure blob storage. Otherwise you might need to check the blob type (the storage-key credential above works with page blobs, which is what SQL Server writes when backing up through such a credential).
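If the backup was written as a block blob instead (for example, by a tool that uploaded it with a SAS), a container-level SAS credential is the alternative; the account name, container, and token below are placeholders, so treat this as a sketch only.
-- Hedged alternative sketch: a SAS-based credential whose name is the container URL.
-- When the credential name matches the container URL, RESTORE needs no WITH CREDENTIAL clause.
CREATE CREDENTIAL [https://yourstorageaccount.blob.core.windows.net/yourcontainer]
WITH
    IDENTITY = 'SHARED ACCESS SIGNATURE',
    SECRET = '<SAS token without the leading question mark>';

RESTORE FILELISTONLY
FROM URL = 'https://yourstorageaccount.blob.core.windows.net/yourcontainer/backup.bak';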
