Azure Data Factory Event Trigger - Storage Account Key in JSON? - azure

We have a storage account that is locked down. My pipeline has connections that reference a Key Vault to get the access key for the storage account.
When I create an event trigger in ADF, ADF lets me find and connect to the storage account (without asking for a key or prompting me to select the linked service connection). It tells me which files it will include based on my 'begins with' and 'ends with' values (it found 2 files). It saves successfully.
When I publish it, I get this error between publishing to adf-publish and generating the ARM templates:
The attempt to configure storage notifications for the provided storage account ****failed. Please ensure that your storage account meets the requirements described at https://aka.ms/storageevents. The error is Failed to retrieve credentials for request=RequestUri=https://management.azure.com/subscriptions/********/resourceGroups/<resource group name>/providers/Microsoft.Storage/storageAccounts/<storage account name here to gen 2 data lake>/listAccountSas, Method=POST, response=StatusCode=400, StatusDescription=Bad Request, IsSuccessStatusCode=False, Content=System.Net.HttpWebResponse, responseContent={"error":{"code":"InvalidValuesForRequestParameters","message":"Values for request parameters are invalid: keyToSign."}}
I believe this is because the ADF trigger creation process (and therefore its JSON) does not allow you to point to a Key Vault to get the access key for the storage account you are connecting to. Is this the issue? Is there a fix for it?
Appreciate any help, thanks - April

I think the storage account is attached to a VNet and running behind the firewall. I faced a similar issue because of this. You may remove the firewall once, configure the trigger, and then bring the firewall back.

It's not strictly necessary to disable the firewall. You can also use the firewall exception on your storage account (under Firewalls and virtual networks) that allows trusted Microsoft services to access the account.
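For example, a minimal sketch of setting that exception with the Python management SDK (all resource names are placeholders; the same setting is available in the portal under Firewalls and virtual networks):

```python
# Hypothetical sketch: keep the storage firewall on (default action Deny)
# but allow trusted Microsoft services, such as the Event Grid integration
# that ADF event triggers rely on, to reach the account.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import NetworkRuleSet, StorageAccountUpdateParameters

credential = DefaultAzureCredential()
client = StorageManagementClient(credential, "<subscription-id>")

client.storage_accounts.update(
    "<resource-group>",
    "<storage-account-name>",
    StorageAccountUpdateParameters(
        network_rule_set=NetworkRuleSet(bypass="AzureServices", default_action="Deny")
    ),
)
```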

Related

Service account in Key Vault got locked

We noticed that our service account in one of our key vaults was locked, and we do not know why or who locked it. Is there a way for us to check logs or investigate why it got locked?
Hopefully you have enabled logging on the key vault; if so, you should be able to access the logs from the storage account they are written to. Some information about Key Vault activity is also available within Azure Monitor. Check the following link for details: Azure Key Vault Logging.
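If logging isn't enabled yet, here is a hedged sketch of turning it on with the azure-mgmt-monitor package (names are placeholders, and the exact model shapes can vary between SDK versions):

```python
# Hypothetical sketch: send Key Vault AuditEvent logs to a storage account
# through a diagnostic setting.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.monitor.models import DiagnosticSettingsResource, LogSettings

credential = DefaultAzureCredential()
client = MonitorManagementClient(credential, "<subscription-id>")

vault_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.KeyVault/vaults/<vault-name>"
)
storage_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Storage/storageAccounts/<account-name>"
)

# AuditEvent is the Key Vault log category that records who did what.
client.diagnostic_settings.create_or_update(
    resource_uri=vault_id,
    name="keyvault-audit",
    parameters=DiagnosticSettingsResource(
        storage_account_id=storage_id,
        logs=[LogSettings(category="AuditEvent", enabled=True)],
    ),
)
```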
If you want, you can use tools like Log Analytics to interrogate the data you have stored within the storage account.
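If the vault also sends diagnostics to a Log Analytics workspace, a query sketch with the azure-monitor-query package could look like this (the workspace ID is a placeholder):

```python
# Sketch: pull recent Key Vault operations from Log Analytics to see who
# touched what, and from where.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

query = """
AzureDiagnostics
| where ResourceProvider == "MICROSOFT.KEYVAULT"
| project TimeGenerated, OperationName, ResultSignature, CallerIPAddress
| order by TimeGenerated desc
"""

response = client.query_workspace(
    workspace_id="<workspace-id>",
    query=query,
    timespan=timedelta(days=7),
)
for table in response.tables:
    for row in table.rows:
        print(row)
```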

How to change where azure storage emulator stores its files

I am attempting to debug an application that is comprised of several microservices. Part of the cross-service messaging is carried out by one service storing information in Azure Blob storage to be read by another. For local testing we use the Azure Storage Emulator.
Recently my AD logon had to be recreated by our IT team. My username has gone from <myname> to <myname.COMPANYNAME>, and since then the Azure Storage Emulator has failed me.
Attempting to view all local blob storage results in an error, "Unable to retrieve child resources.", though I can confirm manually that each container still exists. Hunting online suggests the problem is due to the period in my AD logon name (changing this is non-trivial as it needs to be done by another department).
Unable to retrieve child resources.
Details:
{
"name": "RestError",
"message": "The specifed resource name contains invalid
characters.\nRequestId:b305591f-acf0-4e2e-8cc6-e3305fa18fab\nTime:2021-09-
My current thinking is to try to configure the emulator not to store its files under my user account, but I have yet to find anywhere that this can be done; the config file mentioned in this question doesn't appear to have what I need.
A successful answer would be guidance on how to relocate the emulator's storage without IT having to create a new logon, or a workaround that will allow Storage Explorer and the services to retrieve my various blob stores.
Please check this thread: Azure Storage Emulator store data on specific path - Stack Overflow, to see if it helps with the Azure Storage Emulator.
NOTE: The Azure Storage Emulator is now deprecated. Microsoft recommends that you use the Azurite emulator for local development with Azure Storage.
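If you do switch to Azurite, you can point its data files anywhere, which sidesteps the user-profile problem entirely. A small sketch, assuming Azurite is installed globally via npm (the path is a placeholder):

```python
# Sketch: launch Azurite with its persisted data relocated outside the
# user profile. On Windows the npm shim may need "azurite.cmd" or shell=True.
import subprocess

subprocess.run(
    ["azurite", "--location", r"C:\azurite-data", "--silent"],
    check=True,
)
```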
In most cases a change in logon name doesn't affect blob storage, but it can in a few cases because of permissions or a SID tied to the name.
After changing the username, check whether any permissions or roles assigned previously were carried over to the new one, and make sure the DN and SID were not modified; also recheck any configuration done previously that depends on the DN. Note that the Storage Emulator supports only a single fixed account and a well-known authentication key (a connection sketch using that fixed account follows the step below).
1. Try restarting the emulator, and check whether it works with a new port or any new configuration. See this thread.
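As noted above, the emulator exposes one fixed, publicly documented development account that does not depend on your Windows logon. A quick health check with the azure-storage-blob package (the account name, key, and endpoint below are the published development defaults, not secrets):

```python
# Sketch: connect to the emulator's well-known devstoreaccount1 account and
# list containers; if this works, the emulator itself is healthy.
from azure.storage.blob import BlobServiceClient

DEV_CONN_STR = (
    "DefaultEndpointsProtocol=http;"
    "AccountName=devstoreaccount1;"
    "AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;"
    "BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;"
)

client = BlobServiceClient.from_connection_string(DEV_CONN_STR)
for container in client.list_containers():
    print(container.name)
```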
The "invalid characters" error in most cases comes from the container name (container names must be all lowercase, with no special characters).
Check that first, and refer to the threads below for possible resolutions if it is a container issue; a quick validation sketch follows them.
SO ref 1
SO ref 2
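To rule the container name in or out quickly, the documented rules are: 3-63 characters, lowercase letters, digits, and hyphens only, starting and ending with a letter or digit, and no consecutive hyphens. A small hypothetical check:

```python
# Sketch: validate a container name against Azure's documented naming rules.
import re

CONTAINER_NAME = re.compile(r"^(?!.*--)[a-z0-9][a-z0-9-]{1,61}[a-z0-9]$")

def is_valid_container_name(name: str) -> bool:
    return bool(CONTAINER_NAME.match(name))

assert is_valid_container_name("my-container")
assert not is_valid_container_name("My.Container")    # uppercase and period
assert not is_valid_container_name("double--hyphen")  # consecutive hyphens
```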
Storage Explorer has several options for how and where it sources the information needed to connect through your proxy. To change which option is used, go to Settings (the gear icon on the left vertical toolbar) > Application > Proxy. See Network Connections in Azure Storage Explorer | Microsoft Docs.

Unable to delete Storage Account : Azure

I have a storage account in my Azure subscription which I can't delete. When I try to do so, it says:
Failed to delete storage account <storage acc name>. Error: An operation is currently performing on this storage account that requires exclusive access.
I have tried multiple times, even after waiting, but nothing is working. Any solution?
If you have waited and are still facing the issue, you will have to open a support ticket with Azure Support to get it fixed.
Was your storage account encrypted?
Was your storage account used as VDI blob storage?
Thanks and regards,
Abdel
Please try the following:
Log in as (or ask your) Global Administrator and attempt to delete the storage account.
Unlikely to be needed, but if the above fails, ask the Global Administrator to add themselves as a Storage Blob Data Owner and then try to delete the storage account.
Remove any resource lock (see the sketch after this list).
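For the resource-lock step, a hedged sketch using the azure-mgmt-resource package (resource names are placeholders; note that locks can also exist at subscription or individual-resource scope):

```python
# Hypothetical sketch: list management locks on the resource group and
# delete them so the storage account can be removed.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ManagementLockClient

credential = DefaultAzureCredential()
client = ManagementLockClient(credential, "<subscription-id>")

for lock in client.management_locks.list_at_resource_group_level("<resource-group>"):
    print(lock.name, lock.level)
    client.management_locks.delete_at_resource_group_level("<resource-group>", lock.name)
```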
In my case Microsoft support are attempting to delete the storage account. All other workflows failed. I will post the RCA when it is isolated. Note this is under a Professional Direct support contract.
For reference, I logged into the Azure portal with an account with no support contract to test your condition. I was able to create a support ticket, but with a very limited set of options; under the TECHNICAL heading, only the AZURE INFORMATION PROTECTION option was available, for example.
My storage account deletion issue has highlighted an error condition where a Global Admin user is unable to delete a storage resource containing no data or resource lock. In this case Microsoft are actioning the removal of the problem resource following several weeks of calls, threads, and remote sessions.
I suggest you attempt to raise a ticket via any means and highlight your issue; hopefully Microsoft will internally route it to the correct department.
Root Cause Analysis
An operation to delete the storage account was never finished on the Azure side.
Every operation holds a lock on the storage account to prevent conflicts between different operations.
In my case the lock prevented all other operations we performed: deleting, updating, deploying etc.
Microsoft have yet to determine exactly which operation caused the issue, but they described similar issues where another user tried to modify the tags of a storage account with misused escape characters.
In my case it's likely that a DevOps pipeline was modifying the storage account and the request to Azure Resource Manager (ARM) was somehow left incomplete due to, for example, a network fluctuation or transient error.
Resolution
Microsoft manually reset the status of the lock on the storage account and released the lock from the storage account; delete operations were then possible. Microsoft are taking steps to improve the process.

Azure Data Factory connecting to Blob Storage via Access Key

I'm trying to build a very basic data flow in Azure Data Factory pulling a JSON file from blob storage, performing a transformation on some columns, and storing in a SQL database. I originally authenticated to the storage account using Managed Identity, but I get the error below when attempting to test the connection to the source:
com.microsoft.dataflow.broker.MissingRequiredPropertyException:
account is a required property for [myStorageAccountName].
com.microsoft.dataflow.broker.PropertyNotFoundException: Could not
extract value from [myStorageAccountName] - RunId: xxx
I also see the following message in the Factory Validation Output:
[MyDataSetName] AzureBlobStorage does not support SAS,
MSI, or Service principal authentication in data flow.
With this I assumed that all I would need to do is switch my Blob Storage linked service to the Account Key authentication method. However, after I switched to Account Key authentication and selected my subscription and storage account, testing the connection gives the following error:
Connection failed Fail to connect to
https://[myBlob].blob.core.windows.net/: Error Message: The
remote server returned an error: (403) Forbidden. (ErrorCode: 403,
Detail: This request is not authorized to perform this operation.,
RequestId: xxxx), make sure the
credential provided is valid. The remote server returned an error:
(403) Forbidden.StorageExtendedMessage=, The remote server returned an
error: (403) Forbidden. Activity ID:
xxx.
I've tried selecting from Azure directly and also entering the key manually, and I get the same error either way. One thing to note: the storage account only allows access from specified networks. I tried connecting to a different, public storage account and could access it fine. The ADF account has the Storage Account Contributor role, and I've added the IP address of where I am currently working as well as the IP ranges of Azure Data Factory that I found here: https://learn.microsoft.com/en-us/azure/data-factory/azure-integration-runtime-ip-addresses
Also note, I have about 5 copy data tasks working perfectly fine with Managed Identity currently, but I need to start doing more complex operations.
This seems like a similar issue as Unable to create a linked service in Azure Data Factory but the Storage Account Contributor and Owner roles I have assigned should supersede the Reader role as suggested in the reply. I'm also not sure if the poster is using a public storage account or private.
Thank you in advance.
At the very bottom of the article linked above about whitelisting IP ranges of the integration runtime, Microsoft says the following:
When connecting to Azure Storage account, IP network rules have no
effect on requests originating from the Azure integration runtime in
the same region as the storage account. For more details, please refer
this article.
I spoke to Microsoft support about this, and the issue is that whitelisting public IP addresses does not work for resources within the same region: because the resources are on the same network, they connect to each other using private IPs rather than public ones.
There are four options to resolve the original issue:
Allow access from all networks under Firewalls and Virtual Networks in the storage account (obviously this is a concern if you are storing sensitive data). I tested this and it works.
Create a new Azure-hosted integration runtime that runs in a different region. I tested this as well: my ADF data flow runs in the East region, I created a runtime that runs in East 2, and it worked immediately. The issue for me here is that I would have to have this reviewed by security before pushing to prod, because we'd be sending data across the public network; even though it's encrypted, etc., it's still not as secure as having two resources talking to each other on the same network.
Use a separate activity, such as an HDInsight activity like Spark or an SSIS package. I'm sure this would work, but the issue with SSIS is cost, as we would have to spin up an SSIS DB and then pay for the compute. You also need to execute multiple activities in the pipeline to start and stop the SSIS pipeline before and after execution. Also, I don't feel like learning Spark just for this.
Finally, the solution that worked, and that I used: I created a new connection that replaced the Blob Storage connection with a Data Lake Storage Gen2 connection for the dataset. It worked like a charm. Unlike the Blob Storage connector, Managed Identity is supported for Azure Data Lake Storage Gen2, as per this article. In general, the more specific the connection type, the more likely the features will work for the specific need. A rough sketch of such a linked service follows this list.
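For reference, here is a hedged sketch of what that replacement linked service could look like when defined through the azure-mgmt-datafactory SDK (names are placeholders; leaving out explicit credentials is what makes the service fall back to the factory's managed identity):

```python
# Hypothetical sketch: an ADLS Gen2 (AzureBlobFS) linked service that
# authenticates with the data factory's managed identity.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import AzureBlobFSLinkedService, LinkedServiceResource

credential = DefaultAzureCredential()
client = DataFactoryManagementClient(credential, "<subscription-id>")

linked_service = LinkedServiceResource(
    properties=AzureBlobFSLinkedService(
        url="https://<account-name>.dfs.core.windows.net"
    )
)
client.linked_services.create_or_update(
    "<resource-group>", "<factory-name>", "AdlsGen2ViaMsi", linked_service
)
```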
This is what you are facing now: from the description we know it is a storage connection error. I also set the Contributor role on the data factory, but still got the problem.
The problem comes from the network and firewall settings of your storage account. Please check them.
Make sure you have added the client ID and the 'Trusted Microsoft services' exception.
Have a look of this doc:
https://learn.microsoft.com/en-us/azure/storage/common/storage-network-security#trusted-microsoft-services
Then, go to your ADF linked service and choose the corresponding settings.
After that, it should be OK.

Azure Storage account consistently only adding Blob storage, missing Table/Queue/Files

Whenever I create a new Storage (classic) account through the Azure portal, I consistently have issues whereby the Table/Queue/File storage is not created at all, leaving the account with only Blob storage, unlike a separate account I have where all four services appear.
I have tried this multiple times and always get the same result. I don't see how I can be getting this wrong, as there are only 4 options on the form to create the account, and none of them govern the content of the account.
When I then attempt to create a new Table or Queue in this new account I get a 502 Bad Gateway error.
Am I missing something here? Can anyone tell me how I can add the required storage types to the account?
Not sure what's up with the portal, but a storage account always comprises blob, table, queue, and file storage (unless you create a Premium storage account - that's strictly blobs).
You should be able to confirm this by creating an app to, say, create, write, and read from a queue or table.
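For example, a minimal queue round-trip with the azure-storage-queue package (the connection string is a placeholder):

```python
# Sketch: create a queue, write a message, and read it back; this fails fast
# if the account genuinely has no queue service.
from azure.storage.queue import QueueClient

queue = QueueClient.from_connection_string(
    "<storage-connection-string>", queue_name="smoke-test"
)
queue.create_queue()
queue.send_message("hello from the SDK")

for message in queue.receive_messages():
    print(message.content)
    queue.delete_message(message)
```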
EDIT I see you edited your question, showing that you did try to create a table/queue. If this is a non-premium account, I suggest reaching out to support, as this makes no sense.
EDIT 4/2017 Aside from Premium storage accounts (which only have page blobs), there is another type of general (non-premium) storage account, specific to blobs only, where you won't be able to create Tables and Queues; it's not available via the "Classic" deployment model, only via the "Resource Manager" deployment model.
In my case the issue was due to selecting Zone Redundant Storage (ZRS).
Since ZRS accounts only support Block Blobs, you will not see the table, queue, or file endpoints listed on the portal for the new account.
https://blogs.msdn.microsoft.com/windowsazurestorage/2014/08/01/introducing-zone-redundant-storage/
Recreating the storage account using Geo-Redundant Storage (GRS) worked.
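For completeness, a hedged sketch of recreating the account programmatically as general-purpose v2 with GRS, using the azure-mgmt-storage package (names and region are placeholders):

```python
# Hypothetical sketch: create a StorageV2 account with geo-redundant storage,
# which exposes blob, table, queue, and file endpoints.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import Sku, StorageAccountCreateParameters

credential = DefaultAzureCredential()
client = StorageManagementClient(credential, "<subscription-id>")

poller = client.storage_accounts.begin_create(
    "<resource-group>",
    "<account-name>",
    StorageAccountCreateParameters(
        sku=Sku(name="Standard_GRS"),
        kind="StorageV2",
        location="eastus",
    ),
)
account = poller.result()
print(account.primary_endpoints)
```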
