Unable to retrieve child resources - Azure Storage Explorer

I am using Azure Storage Explorer to read data from my Azure storage emulator.
I can create containers just fine and even use them, until I turn off my machine. When I turn my machine back on and start using the explorer again, I keep getting this error:
If I try to create a container with the same name as one of the ones I can't see, I get an error saying the container already exists, so the container still exists somewhere; my explorer just can't retrieve it. And the container names are all lowercase, so I'm not sure what's happening!

Please check and make sure that your container names are between 3 and 63 characters (blob names can be up to 1,024 characters).
If you are using proxy settings in Storage Explorer to connect to your account, verify that the proxy settings are properly formatted, as they can be case sensitive.
If the proxy source is set to Use environment variables, make sure the HTTPS_PROXY or HTTP_PROXY environment variables are set as expected. If these are not correct, Storage Explorer may not connect to the server properly.
After that, restart the explorer.
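For illustration, if Storage Explorer is set to use environment variables, the proxy variables might be exported like this before launching it (proxy.example.com:8080 is a placeholder for your actual proxy address):
export HTTPS_PROXY=http://proxy.example.com:8080
export HTTP_PROXY=http://proxy.example.com:8080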
Also check the similar issue: Unable to retrieve child resources - "The specifed resource name contains invalid characters" · GitHub


"Azure Blob Source 400 Bad Request" when using Azure Blob Source in SSIS to pull a file from Azure Storage container

My package is very simple. It loads data from a CSV file that I have stored in an Azure storage container and inserts that data into an Azure SQL database. The issue stems from the connection to my Azure storage container. Here is an image of the output:
Making this even more odd, while the data flow task is failing:
The individual components within the data flow task all indicate success:
Setting up the package, it seems that the connection to the container is fine (after all, it was able to extract all the column names from the desired file and map them to their destination). Here is an image showing the connection is fine:
So the issue is only realized upon execution.
I will also note that I found this post describing the exact same issue I am experiencing now. As the top response there instructed, I added the new registry keys, but no cigar.
Any thoughts would be helpful.
First, make sure your blob can be accessed publicly:
And if you don't have a requirement to restrict networking, please make sure of the following:
Then set the container access level:
And make sure the container name is correct.
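As a minimal sketch (assuming the Azure CLI, with placeholder account and container names), the container access level can also be set from the command line:
az storage container set-permission --account-name mystorageaccount --name mycontainer --public-access blob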

Azure Storage Explorer : Unable to retrieve child resources

Getting the error ONLY while accessing Blob storage.
No issues with Queues, File Shares, or Tables.
Any ideas?
Unable to retrieve child resources.
Details:
["FetchError:request to https://fssaicessunsetsbxv1sa.blob.core.windows.net/?include=metadata&comp=list failed, reason: unable to get local issuer certificate"]
Error: Self-Signed Certificate in Certificate Chain, Unable to retrieve child resources.
Issue for me: I am behind an office proxy server, but Azure Storage Explorer was not using that proxy.
Solution:
In Azure Storage Explorer, go to Edit -> Configure Proxy and change Source from "No proxy" to "Use system proxy (preview)".
After making these changes, I was able to access the resources.
Moreover, verify which permissions you have on the connection string.
You can generate your connection string through the Azure Portal or certain apps. When you generate the connection string, you need to grant the appropriate permissions. Besides Read/Write, you also need the List permission so Storage Explorer can list the blobs. Here is a screenshot from the Azure portal showing where to check/uncheck the permissions:
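As a hedged example (assuming the Azure CLI and a placeholder account name), an account-level SAS that includes List alongside Read and Write could be generated like this:
az storage account generate-sas --account-name mystorageaccount --services b --resource-types co --permissions rwl --expiry 2025-12-31T00:00Z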
Have you set any RBAC policies?
If you are connected to Azure through a proxy, verify that your proxy settings are correct. If you were granted access to a resource by the owner of the subscription or account, verify that you have read or list permissions for that resource.
If possible, try uninstalling and reinstalling the latest version, then check whether the issue persists.
Azure Storage Explorer Troubleshooting: "unable to retrieve child resources" or "The request action could not be completed".
If the issue still persists after trying the steps above, I would like to work more closely with you on it. Let me know the status.
Warning: for the noobs!
If you get lucky, you can also fix it by simply closing and re-opening Visual Studio.
Reason: authorization is tightly coupled with Azure.
Motivation: to err is human! Even software developers working at Microsoft are human.

Azure form recognizer app invalid resource name

I'm trying to deploy an instance of the Form Recognizer app in Azure. For that I'm following the instructions in the documentation: https://learn.microsoft.com/en-us/azure/cognitive-services/form-recognizer/deploy-label-tool
I have created the docker instance and the connection, but the step to create the app is failing.
These are the parameters I'm using:
Display Name: Test-form
Source Connection: <previously created connection>
Folder Path: None
Form Recognizer Service Uri: https://XXX-test.cognitiveservices.azure.com/
API Key: XXXXX
Description: None
And this is the error I'm getting:
I had the same error. It turned out to be due to incorrect SAS URI formatting because I generated and copied the SAS token via the Storage Accounts interface. It's much easier to get the correct format for the SAS URI if you generate it through the Storage Explorer (currently in Preview) as opposed to through the Storage Accounts.
If you read the documentation carefully, it gives you a step-by-step guide:
"To retrieve the SAS URL, open the Microsoft Azure Storage Explorer, right-click your container, and select Get shared access signature. Set the expiry time to some time after you'll have used the service. Make sure the Read, Write, Delete, and List permissions are checked, and click Create. Then copy the value in the URL section. It should have the form: https://<storage account>.blob.core.windows.net/<container name>?<SAS value>"
Form Recognizer Documentation
The error messages point to a configuration issue with the AzureBlobStorageTemplate Thing. Most likely the containerName field for the Blob Storage Thing is empty or contains invalid characters.
Ensure the containerName is a valid Azure storage container name.
Check https://learn.microsoft.com/en-us/rest/api/storageservices/Naming-and-Referencing-Containers--Blobs--and-Metadata for more information.
A container name must be a valid DNS name
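As a rough illustration of those rules (3-63 characters; lowercase letters, numbers, and single interior hyphens; starting and ending with a letter or number), a candidate name can be sanity-checked in a shell before configuring the Connector; the name value below is a placeholder:
# check length is 3-63 and pattern: lowercase alphanumerics with single interior hyphens
name="my-container"
len=${#name}
if [ "$len" -ge 3 ] && [ "$len" -le 63 ] && echo "$name" | grep -Eq '^[a-z0-9](-?[a-z0-9])*$'; then
  echo "valid container name"
else
  echo "invalid container name"
fi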
The Connector loads and caches all configuration settings during startup. Any changes that you make to the configuration when troubleshooting are ignored until the Connector is restarted.
When creating the container connection, you must include the container in the SAS URI, such as:
https://<storage-account>.blob.core.windows.net/<Enter-My-Container-Here>?<SAS Key>
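A hedged sketch of producing such a URI with the Azure CLI (account, container, and expiry values below are placeholders):
az storage container generate-sas --account-name mystorageaccount --name mycontainer --permissions rwdl --expiry 2025-12-31T00:00Z
The command prints a SAS token; append it to the container URL, e.g. https://mystorageaccount.blob.core.windows.net/mycontainer?<returned-token>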
You can also directly use the open source labeling tool, please see the section further down in the doc:
The OCR Form Labeling Tool is also available as an open-source project on GitHub. The tool is a web application built using React + Redux, and is written in TypeScript. To learn more or contribute, see OCR Form Labeling Tool.

Unable to find SYSTEM_KEY_NAME in SAP HANA Azure VM backup

I am trying to back up an SAP HANA database that is in an Azure VM by using the Recovery Services vault. While running the "msawb-plugin-config-com-sap-hana.sh" script file I am getting the error:
Failed to determine SYSTEM_KEY_NAME: Please specify with the '--system-key' option.
Need a valid system key to create the backup key.
Please help me to resolve this error.
According to the prerequisites https://learn.microsoft.com/en-us/azure/backup/tutorial-backup-sap-hana-db#prerequisites, you have to create a key in the default hdbuserstore.
You can create it by logging in as ndbadm:
su - ndbadm
and add the key:
/hana/shared/NDB/hdbclient/hdbuserstore set BACKUP YOUR_HOSTNAME:30013 SYSTEM YOUR_PASSWORD
Then, as root, run the script.
After running the script, you can check again as the ndbadm user whether the key AZUREWLBACKUPHANAUSER is there:
/hana/shared/NDB/hdbclient/hdbuserstore list
and delete your previously created key:
/hana/shared/NDB/hdbclient/hdbuserstore delete BACKUP
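For reference, once the script has run, the relevant entry in the hdbuserstore list output should look roughly like this (host name and port are illustrative):
KEY AZUREWLBACKUPHANAUSER
  ENV : yourhostname:30013
  USER: AZUREWLBACKUPHANAUSER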
The script uses the command "runuser" (in my case as ndbadm). When hdbuserstore is executed under the ndbadm profile, no keys are returned. You can copy the files SSFS_HDB.DAT and SSFS_HDB.KEY, located at the path returned by hdbuserstore LIST, from a profile with valid files.
Refer to SAP Note 2853601 - Why is Nameserver Port Used in HDBUSERSTORE for SAP Application Installation.
In an MDC, the nameserver port (e.g. 30013) is used in hdbuserstore instead of the indexserver port (e.g. 30015) for a tenant DB.

Azure pipeline 'WinRMCustomScriptExtension' underlying connection was closed in non-public VM

In an Azure pipeline, when creating a VM through a deployment template, we have the option to 'Configure with WinRM agent', as shown below.
This acts as a custom extension behind the scenes, but the download of this custom extension can be blocked by an internal vnet in Azure. This is the error we are getting:
<datetime> Adding extension 'WinRMCustomScriptExtension' on virtual machine <vmname>
<datetime> Failed to add the extension to the vm: <vmname>. Error: "VM has reported a failure when processing extension 'WinRMCustomScriptExtension'. Error message: \"Failed to download all specified files. Exiting. Error Message: The underlying connection was closed: An unexpected error occurred on a send.\"\r\n\r\nMore information on troubleshooting is available at https://aka.ms/VMExtensionCSEWindowsTroubleshoot "
Since the files cannot be downloaded, I am thinking of a couple of solutions:
1. How can I know which PowerShell files Azure is using to set up WinRM?
2. The location to store the files would be a storage account (in the same vnet as the VM).
3. Perhaps not use WinRM at all and instead use a custom script extension to handle everything (with all files coming from the storage account). I hope an error from the extension stops the pipeline if it happens.
Is there a better solution? To me it looks like a design gap on Azure's part, as it does not cover non-public VMs.
EDIT:
Found the answer to #1: https://aka.ms/vstsconfigurewinrm. This was shown in the raw logs of the pipeline when diagnostics were enabled.
Even if you know, how does that help you? It won't be able to download them anyway, and you can't really tell it to use local files.
If you enable service endpoints and allow your subnet to talk to the storage account, it should work; see the sketch below.
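A hedged sketch of that setup with the Azure CLI (resource group, vnet, subnet, and storage account names are placeholders):
# enable the Microsoft.Storage service endpoint on the subnet
az network vnet subnet update --resource-group my-rg --vnet-name my-vnet --name my-subnet --service-endpoints Microsoft.Storage
# allow that subnet through the storage account firewall
az storage account network-rule add --resource-group my-rg --account-name mystorageaccount --vnet-name my-vnet --subnet my-subnet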
There is also a way to configure WinRM when you create the VM. Key Vault example
You could use the script extension as you wanted as well, but the script extension has to download things to the VM too. Example
