I am upgrading Azure Blob Storage to Data Lake Gen2. I had already created a few pipelines in ADF that included a Delete activity with logging enabled, and I had given a path to a blob container for the log files.
When I tried to upgrade to Data Lake Gen2, validation failed with the message "Incompatible feature soft delete". I disabled "Enable logging" in the ADF pipeline, removed the path to the blob folder, and deleted that folder from blob storage. However, I am still getting the same validation failure.
I cannot figure out what changes are still needed. See the error:
This question was answered on the Microsoft Q&A platform. Thank you #NandanHegde-7720 for the solution; I am posting it as an answer here to help other community members.
Disable soft delete for containers in your storage account.
• In the Azure portal, navigate to your storage account.
• Locate the Data protection settings under Data management.
• Uncheck soft delete for containers.
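The same setting can also be changed from the Azure CLI. The sketch below uses placeholder account and resource-group names, and also shows how to turn off blob soft delete, in case the upgrade validation complains about that feature as well:

```shell
# Sketch: disable container soft delete (and, if needed, blob soft delete)
# before retrying the Data Lake Gen2 upgrade. Names below are placeholders.
ACCOUNT=mystorageaccount        # hypothetical account name
GROUP=my-resource-group         # hypothetical resource group

if command -v az >/dev/null 2>&1; then
  # Turn off container soft delete
  az storage account blob-service-properties update \
    --account-name "$ACCOUNT" --resource-group "$GROUP" \
    --enable-container-delete-retention false

  # The upgrade may also reject blob soft delete; disable it too if so
  az storage account blob-service-properties update \
    --account-name "$ACCOUNT" --resource-group "$GROUP" \
    --enable-delete-retention false
else
  echo "Azure CLI (az) not installed; use the portal steps above instead"
fi
```

Note that, as the docs quote below says, already soft-deleted containers are only permanently removed once their retention period expires.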
Any soft-deleted containers will be permanently deleted at the expiration of the retention period that was in effect at the time that the container was deleted.
Refer to this link for more information on how container soft delete works.
Alternatively, a quick workaround is to copy the problematic files to a new storage account, delete them from the original storage account, and then retry the Data Lake Gen2 upgrade.
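If you go the workaround route, AzCopy can do the copy-then-delete. A sketch only; the URLs and SAS tokens below are placeholders:

```shell
# Sketch of the workaround: copy the offending files to a second account,
# delete them from the original, then retry the Gen2 upgrade.
# The URLs and the <sas> tokens are placeholders.
SRC="https://oldaccount.blob.core.windows.net/container/path?<sas>"
DST="https://newaccount.blob.core.windows.net/container/path?<sas>"

azcopy copy "$SRC" "$DST" --recursive    # copy to the new account
azcopy remove "$SRC" --recursive         # remove from the original account
```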
Related
I am developing a data factory that downloads a CSV file from a source and writes it to an Azure Storage account on which I have read/write rights. Everything looks good and it validates, but when I (test) run it, I keep getting the error:
This endpoint does not support BlobStorageEvents or SoftDelete. Please disable these account features if you would like to use this endpoint.\", 409, HEAD
I checked: the source and sink are DIFFERENT files in different locations, and I have a successful connection on both endpoints. What else can I check to fix this?
From the error, it seems the storage type is incorrect. Could you please double-check whether your storage account is an Azure Blob Storage account or an Azure Data Lake Storage Gen2 account? Go to the storage account's Settings, and under the Configuration section you can find these details. Depending on your storage account's type, use the matching connector for the copy activity in your pipeline: the Azure Blob Storage connector if the account type is Blob Storage, or the ADLS Gen2 connector if it is a general-purpose v2 account with the hierarchical namespace enabled.
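To check this without the portal, a quick Azure CLI sketch (placeholder names) that prints the account kind and whether the hierarchical namespace is enabled:

```shell
# Show the account kind and hierarchical-namespace flag; if isHnsEnabled is
# true, use the ADLS Gen2 connector, otherwise the Blob Storage connector.
# Account and resource-group names are placeholders.
az storage account show \
  --name mystorageaccount --resource-group my-resource-group \
  --query "{kind:kind, hnsEnabled:isHnsEnabled}" --output table
```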
I am trying to add an Azure Blob Storage rule so that files older than a day are deleted.
I am following this:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-lifecycle-management-concepts?tabs=azure-portal However, the portal steps say: "Under Blob service, select Lifecycle Management to view or change your rules." There is no such menu under Blob service, and I cannot find it anywhere to add a new rule. What am I doing wrong here?
Lifecycle management is only supported on the following storage account types (see this section of the doc):
General-purpose v2 (GPv2) accounts, Blob storage accounts, and premium block blob storage accounts.
If your account is not one of these, please consider upgrading to one of them. If you don't want to upgrade, you will have to write code to delete the blobs yourself.
Here are the steps to check the account type:
Navigate to the Azure portal -> your storage account -> Overview page -> check the Account kind. Screenshot below:
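If your account is (or after upgrading becomes) one of the supported kinds, the original goal, deleting blobs more than a day old, can be expressed as a lifecycle policy. A hedged sketch, with placeholder account and resource-group names:

```shell
# Sketch: a lifecycle rule that deletes block blobs more than 1 day after
# their last modification. Apply only on a GPv2 / Blob storage / premium
# block blob account. Account and resource-group names are placeholders.
cat > policy.json <<'EOF'
{
  "rules": [
    {
      "name": "delete-after-one-day",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": { "blobTypes": [ "blockBlob" ] },
        "actions": {
          "baseBlob": { "delete": { "daysAfterModificationGreaterThan": 1 } }
        }
      }
    }
  ]
}
EOF

# Sanity-check the JSON locally, then apply it (the az step needs a login)
python3 -m json.tool policy.json >/dev/null && echo "policy.json is valid JSON"

if command -v az >/dev/null 2>&1; then
  az storage account management-policy create \
    --account-name mystorageaccount --resource-group my-resource-group \
    --policy @policy.json
else
  echo "Azure CLI (az) not installed; set the rule via the portal instead"
fi
```

The rule runs on the service side once applied; no scheduled job of your own is needed.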
I just started with Data Lakes in Azure and encountered an issue with the ADLS Gen2 screens in the Azure portal.
Using the Azure portal, I created a new storage account to set up Azure Data Lake Gen2 storage by following the online instructions. When creating the storage account, I enabled the Hierarchical namespace option and set the account kind to StorageV2 (general purpose). This created the Data Lake. However, the name of the container still appears as Container. In the videos I've seen, the services panel displays the container with the label 'Data Lake Gen2 File System', but the one I created still shows the label Container. Furthermore, the collapsible panel on the left also shows Container rather than File System. Please refer to the screen grab below.
Can anyone tell me whether I am missing something, or has Azure changed the names recently without my being aware of it?
You're not missing anything. Some time ago, we also noticed that the ADLS Gen2 portal UI made this change: the label Data Lake Gen2 File System was renamed to Containers.
It does not break any feature; it is just a UI change.
I am getting the error "The storage account for this workspace has been deleted" for the workspace I have been using with Machine Learning Studio. What should I do? When I try to save my experiment, it says that no workspace is found.
See the image for reference, which shows that the storage account has been deleted.
Deepak, this issue may be related to your browser cache or to a deleted storage account for the workspace.
For scenario one (which you are getting):
After the new Machine Learning workspace is created, you can sign in to Machine Learning Studio with the Microsoft account you used to create the workspace. If you encounter the error message "Workspace Not Found" (similar to the following screenshot), use the following steps to delete your browser cookies. More details can be found here: https://learn.microsoft.com/en-us/azure/machine-learning/studio/troubleshooting-creating-ml-workspace. You can also use a private/incognito window to confirm.
Scenario 2:
If the storage account was deleted, double-check under Storage accounts in the left blade of the Azure portal: select the subscription you created the workspace with, and the account should be listed there. If it is not, it was indeed deleted. In that case:
"It's not possible to restore a deleted storage account or retrieve any of the content that it contained before deletion. Be sure to back up anything you want to save before you delete the account. This also holds true for any resources in the account—once you delete a blob, table, queue, or file, it is permanently deleted." More details can be found here: https://learn.microsoft.com/en-us/azure/storage/common/storage-create-storage-account
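You can also confirm from the Azure CLI whether the account still exists; a sketch with a placeholder account name:

```shell
# List all storage accounts visible in the current subscription; if the
# workspace's account is not in the list, it has been deleted.
az storage account list --query "[].name" --output tsv

# Or probe a specific (placeholder) name directly: if nameAvailable comes
# back true, no account with that name currently exists.
az storage account check-name --name mlworkspacestorage
```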
I mistakenly created a VM without linking it to an existing storage account. When I realized my mistake, I deleted the VM and then tried to delete the storage account. I found the auto-created storage account and attempted to remove the container. However, even 24 hours later I am told that the container contains resources in use by the (now deleted) VM and so cannot be deleted.
Clearly there is some kind of dependency, not apparent from the management portal, that needs to be removed. I am looking for advice on a PowerShell approach to investigating and resolving this issue.
You have to disassociate the disk from the VM image (VHD) before you can delete the blob or container.
You can do so from the Windows Azure management portal: go to the VMs tab, choose DISKS from the top menu, and remove the disk. If I remember correctly, it will also ask whether you'd like to delete the blob (VHD) from the storage account as well.
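If the portal route no longer works (for example, the disk object is already gone), the usual root cause is the infinite lease the platform keeps on the VHD blob. A hedged sketch using the modern Azure CLI rather than the classic PowerShell cmdlets; the container, blob, and account names are all placeholders:

```shell
# Break the platform-held lease on the orphaned VHD blob, then delete it.
# Only do this once you are sure no VM or disk object still uses the VHD.
az storage blob lease break \
  --container-name vhds --blob-name myvm-disk.vhd \
  --account-name mystorageaccount

az storage blob delete \
  --container-name vhds --name myvm-disk.vhd \
  --account-name mystorageaccount
```

Once the blob's lease is broken and the blob is deleted, the container (and then the storage account) can be removed normally.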