Azure Data Explorer OneClick Ingest from blob container (UI)

I'm trying to configure and use the Azure Data Explorer OneClick Ingest from a blob container (continuous ingest).
Whatever I try the URL is never accepted, I always end up with this error:
Invalid URL. Either the URL leads to a blob instead of a container, or the permissions are incorrect. If you just grant permission, please wait couple of minutes and try again.
The URL I'm using follows this pattern:
https://mystorageaccount.blob.core.windows.net/mycontainer?sp=rl&st=2022-04-26T22:01:42Z&se=2032-04-27T06:01:42Z&spr=https&sv=2020-08-04&sr=c&sig=Z4Mlh7s5%2Fm1890kdfzlkYLSIHHDdGJmTSyYXVYsHdn01o%3D
I'm probably missing something, either in the URL syntax or in the SAS generation.
Has anyone successfully used it? Any idea what could be wrong?
Thanks

I finally found out what the issue was.
Probably due to the security settings on my Storage account, I had to create a Managed private endpoint in the Azure Data Explorer Networking panel, pointing to my storage resource (and then approve that endpoint in the storage account's Networking panel).
https://learn.microsoft.com/en-us/azure/data-explorer/security-network-managed-private-endpoint-create
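If you script your infrastructure, the same setup can be done from the Azure CLI. This is only a sketch: the cluster name, resource group, and connection name are placeholders, and it assumes the kusto CLI extension is installed (az extension add -n kusto).
# Create the managed private endpoint on the ADX cluster
az kusto managed-private-endpoint create \
  --cluster-name mycluster \
  --resource-group myrg \
  --managed-private-endpoint-name mpe-to-storage \
  --group-id blob \
  --private-link-resource-id "/subscriptions/<sub-id>/resourceGroups/myrg/providers/Microsoft.Storage/storageAccounts/mystorageaccount" \
  --request-message "Please approve"
# Then approve the pending connection on the storage account side
az storage account private-endpoint-connection approve \
  --account-name mystorageaccount \
  --resource-group myrg \
  --name <pending-connection-name>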

Related

ADF Pipeline Errors - RequestContentTooLarge and InvalidContentLink

The ADF pipeline release to the test Data Factory instance is failing with a RequestContentTooLarge error.
To overcome that issue, I modified the pipeline by adding an additional Azure Blob File Copy step that stores the linked templates in a storage account, and referenced them from the pipeline for the deployment. However, after making this change I am getting another error: InvalidContentLink: Unable to download deployment content from 'https://xxx.blob.core.windows.net/adf-arm-templates/ArmTemplate_0.json?***Sanitized Azure Storage Account Shared Access Signature***'. The tracking Id is 'xxxxx-xxxx-x-xxxx-xx'. Please see https://aka.ms/arm-deploy for usage details.
I have tried using the SAS token both at the container level and at the storage account level. I have also ensured that the agent and the storage account are under the same VNet, and I have tried removing the firewall restrictions, but it still gives me the same InvalidContentLink error.
The modified pipeline with the Azure Storage Account step:
How do I resolve this issue?
InvalidContentLink: Unable to download deployment content from 'https://xxx.blob.core.windows.net/adf-arm-templates/ArmTemplate_0.json?Sanitized Azure Storage Account Shared Access Signature'. The tracking Id is 'xxxxx-xxxx-x-xxxx-xx'. Please see https://aka.ms/arm-deploy for usage details.
This error can occur because you are trying to link a template that is not present in the storage account.
Make sure you provide a correct URL for the nested template, and that it is accessible.
Also, if your storage account has a firewall rule, you can't link a nested template from it.
Make sure your storage account, container, and blob are publicly accessible. To achieve this:
Provide a blob-level Shared Access Signature URL: select the file, click "...", and then click Generate SAS.
Refer to the documentation on nested templates for more background.
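If you prefer the CLI to the portal's Generate SAS button, a read-only blob-level SAS URL can be produced like this. A sketch only: the account name and expiry are placeholders; the container and blob names come from the error above.
az storage blob generate-sas \
  --account-name <account> \
  --container-name adf-arm-templates \
  --name ArmTemplate_0.json \
  --permissions r \
  --expiry 2025-01-01T00:00:00Z \
  --https-only \
  --full-uri
# Prints the full blob URL with the SAS token appended; use that as the linked template URI.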

Azure Data Factory to Azure Blob Storage Permissions

I'm connecting ADF to blob storage v2 using a managed identity following this doc: Doc1
When it comes to testing the connection with my first dataset, I am successful when I test the connection to the linked service. When I try the file path and enter "testfolder" (which exists in the blob container), it fails, returning the generic forbidden error displayed at the end of this post.
However, when I opt to "browse" the folders in the dataset portal, the folder "testfolder" does show up. But when I select it, it will not show me anything within that folder.
The Data Factory managed identity is given the Contributor role, granting full access to manage all resources. Is there some other hidden issue, or a way to narrow this down? My instinct is that this is something within the blob container, since I can view the containers but not their contents.
Error message: 403 Forbidden (screenshot).
It seems that you haven't granted a data role on Azure Blob Storage. Contributor is a management-plane role; to actually read blob contents, the Data Factory identity also needs a data-plane role such as Storage Blob Data Reader or Storage Blob Data Contributor.
Please follow these steps:
1. Click Access control (IAM) in the Azure blob storage account, navigate to Role assignments, and add a role assignment.
2. Choose a role according to your need and select your data factory.
3. A few minutes later, you can retry choosing the file path.
Hope this can help you.
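The same assignment can be scripted. A sketch only: the names are placeholders, and it assumes the datafactory CLI extension for looking up the factory's managed identity.
# Find the data factory's managed identity object id
principalId=$(az datafactory show --name myadf --resource-group myrg --query identity.principalId -o tsv)
# Grant a data-plane role on the storage account
az role assignment create \
  --assignee "$principalId" \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<sub-id>/resourceGroups/myrg/providers/Microsoft.Storage/storageAccounts/<account>"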

Azure Storage Explorer: Unable to retrieve child resources

Getting this error ONLY while accessing Blob storage.
No issues with Queues, File Shares, or Tables.
Any idea?
Unable to retrieve child resources.
Details:
["FetchError:request to https://fssaicessunsetsbxv1sa.blob.core.windows.net/?include=metadata&comp=list failed, reason: unable to get local issuer certificate"]
Error: Self-signed certificate in certificate chain, unable to retrieve child resources.
Issue for me: I am behind an office proxy server, but Azure Storage Explorer was not using that proxy.
Solution:
In Azure Storage Explorer, go to Edit -> Configure Proxy and change Source from "No proxy" to "Use system proxy (preview)".
After making this change, I am able to access the resources.
Also, verify what permissions you have on the connection string.
You can generate a connection string either through the Azure Portal or from some apps. When you generate it, you need to set the allowed permissions: besides Read/Write you also need the List permission so Storage Explorer can list the blobs. These permissions can be checked or unchecked in the Azure portal.
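For instance, an account SAS that includes List alongside Read/Write could be generated like this. A sketch only: the account name and expiry are placeholders.
az storage account generate-sas \
  --account-name <account> \
  --services b \
  --resource-types sco \
  --permissions rwl \
  --expiry 2025-01-01T00:00:00Z \
  --https-only
# --permissions rwl = read, write, and list; without "l" Storage Explorer cannot enumerate blobs.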
Have you set any RBAC policies?
If you are connected to Azure through a proxy, verify that your proxy settings are correct. If you were granted access to a resource from the owner of the subscription or account, verify that you have read or list permissions for that resource.
If possible, try uninstalling and reinstalling the latest version, then check whether the issue persists.
See Azure Storage Explorer troubleshooting: "unable to retrieve child resources" or "The request action could not be completed".
If the issue still persists after trying the above steps, I would like to work on it more closely. Let me know the status.
Warning: for the noobs!
If you're lucky, you can also fix it by closing and re-opening Visual Studio.
Reason: authorization is tightly coupled with Azure.
Motivation: to err is human! Even software devs working at Microsoft are human.

Azcopy error "This request is not authorized to perform this operation."

I copied a container to another storage account (Data Lake Storage Gen2) based on the document linked below.
When trying, I got the following error:
This request is not authorized to perform this operation using this permission.
https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10
If you are using an AAD token, this error is telling you that you need to add a role assignment to the user. Please go to Storage account -> Access Control (IAM) -> Add -> Add role assignment, then add Storage Blob Data Owner to your login account.
If this problem persists, please provide more details.
I also faced the same problem. To get it working, I just logged out and logged in again with the azcopy CLI after applying the #BowmanZhu solution:
azcopy logout
azcopy login --tenant-id xxxx-xxxx-xxxx
If you don't want to log in that way, there is always the option of adding a SAS token at the end of the URL. If you don't want to attach the token every time, you can set up permanent access by going through any one of the steps in the official documentation page.
After granting myself the Storage Blob Data Owner role on the container, AzCopy behaved itself and succeeded in copying a file to the blob storage container.
Go to storage account -> container -> Access control (IAM) -> Add role assignment -> Storage Blob Data Owner.
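Or, from the CLI, scoped to just that container (a sketch with placeholder names):
az role assignment create \
  --role "Storage Blob Data Owner" \
  --assignee "user@example.com" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>/blobServices/default/containers/<container>"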
In my case, the VNet/firewall settings on my Azure storage account were blocking AzCopy from copying the data to the storage account.
I added my client IP to the firewall's allowed addresses.
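Roughly like this with the Azure CLI (placeholder names):
# Add your client IP to the storage account firewall's allow list
az storage account network-rule add \
  --account-name <account> \
  --resource-group <rg> \
  --ip-address <your-client-ip>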
The SAS token has probably expired.
When I had this, I discovered it was because I'd used Azure Storage Explorer to generate a SAS that didn't have read permission, and I think it was trying to read the size/existence of a blob before writing it.
I got a clue from https://github.com/Azure/azure-storage-azcopy/issues/790, but ultimately I just regenerated a new SAS with read permission and it worked out.
I probably could have looked at modifying the C# code using the Azure Data Movement library so it doesn't perform a length check, but the spec was later changed to "don't overwrite", so the read permissions are probably needed anyway.
Give appropriate permissions (read, write, create) while generating SAS tokens, as described here.
Had a similar issue. Here's how it was resolved:
Command used was .\azcopy.exe copy "C:\Users\kriof\Pictures" "https://test645676535storageaccount.blob.core.windows.net/images?sp=rw&st=2022-02-23T11:03:50Z&se=2022-02-23T19:03:50Z&spr=https&sv=2020-08-04&sr=c&sig=QRN%2SMFtU3zaUdd4adRddNFjM2K4ik7tNPSi2WRL0%3D"
The SAS token had the default (Read) permission only. Adding the Write permission in the Azure portal resolved the issue.
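If you'd rather mint the token from the CLI, a container SAS with both Read and Write could look like this. A sketch reusing the names from the command above; the expiry is taken from that URL.
az storage container generate-sas \
  --account-name test645676535storageaccount \
  --name images \
  --permissions rw \
  --expiry 2022-02-23T19:03:50Z \
  --https-only \
  --output tsv
# Append the returned token to the container URL passed to azcopy.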

Azure SQL database export to storage blob failed

I tried to export an Azure SQL database to a storage blob, but the operation failed. I have been doing this task daily for the last month, and this issue is new. It shows as follows:
Error encountered during the service operation. Blob https://blob link/dbname-2019-1-16-14-24.bacpac is not writeable. The remote server returned an error: (403) Forbidden. The remote server returned an error: (403) Forbidden.
I had the same problem. I have contacted Azure support and this is their response.
We recently identified a regression in the import/export service that is generating incorrect SAS tokens to the storage accounts.
The engineering team has rolled out the fix, but it might take some time for the fix to get applied worldwide.
Please try the following link to access Azure portal and then perform the export operations:
https://portal.azure.com/?feature.canmodifystamps=true&microsoft_azure_storage=stage1
The portal shows an orange title bar if you open it via the above link, this is expected.
So it should already be fixed by now.
If you're using SQL scripts for backup following this link, then I suspect that the expiry date of the SHARED ACCESS SIGNATURE has been reached.
Please re-generate the SHARED ACCESS SIGNATURE, and then use the new key for the backup.
Please let me know if there are any more issues.
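For reference, the export can also be run from the Azure CLI with a fresh storage key. A sketch only: every name and credential below is a placeholder.
az sql db export \
  --server myserver \
  --name mydb \
  --resource-group myrg \
  --admin-user sqladmin \
  --admin-password "<password>" \
  --storage-key-type StorageAccessKey \
  --storage-key "<storage-account-key>" \
  --storage-uri "https://<account>.blob.core.windows.net/<container>/dbname.bacpac"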
Make sure a firewall rule is not getting in the way. In the Azure portal, go to Storage Accounts -> "YourStorageAccountName" -> Firewalls and Virtual Networks (left vertical panel) and set it to "Allow access" from "All networks". You can also configure the storage account with exceptions for trusted Microsoft services, as explained here.
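The CLI equivalents would be roughly the following (placeholder names; note that allowing all networks loosens security):
# Option A: allow access from all networks
az storage account update --name <account> --resource-group <rg> --default-action Allow
# Option B: keep the firewall but exempt trusted Microsoft services
az storage account update --name <account> --resource-group <rg> --default-action Deny --bypass AzureServices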
