I have an HDInsight cluster in Azure Government and want to add an additional storage account that resides in Azure Government.
I’m attempting to do this via the portal’s Script Actions > + Submit New > Add an Azure Storage account and providing my Azure Government storage account’s name and key.
This fails with the error (from the cluster’s output file in /var/lib/ambari-agent/data/output-XXXX.txt):
Key encryption is enabled STORAGE ACCOUNT IS: testgovwebiaasdiag
Validate storage account creds: Invalid Credentials provided for
storage account ('Start downloading script locally: ',
u'https://hdiconfigactions.blob.core.windows.net/linuxaddstorageaccountv01/add-storage-account-v01.sh')
Fromdos line ending conversion successful ('Unexpected error:',
"('Execution of custom script failed with exit code', 139)")
Looking at the documentation for “Add additional storage accounts to HDInsight”, there is no indication that this script supports Azure Government.
What is the recommended path forward? Should I download the script and modify it? If so, what modifications are needed to support Azure Government?
At this time, the out-of-the-box "Add additional storage accounts to HDInsight" script does not support Azure Government, because it does not allow the storage endpoints to be overridden with the Azure Government endpoints.
I have created this script, which supports Azure Government. It is a modified version of the out-of-the-box one that overrides the endpoints with those for Azure Government.
You can use this script via Script Actions > + Submit New > Custom and provide this URI in the Bash script URI field. You can then provide the same parameters, storage account name and storage account key, and run your script.
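The core of the modification is the storage endpoint suffix: the stock script registers the account key in core-site.xml under the public-cloud blob endpoint (blob.core.windows.net), while Azure Government accounts live under blob.core.usgovcloudapi.net. A minimal sketch of the property name the modified script has to write, using the account name from the error log above (the key itself is a placeholder):

```shell
#!/bin/sh
# Illustrative values only; substitute your own account name and key.
ACCOUNT_NAME="testgovwebiaasdiag"

# Suffix assumed by the stock script (public Azure):
PUBLIC_SUFFIX="blob.core.windows.net"
# Suffix required for Azure Government:
GOV_SUFFIX="blob.core.usgovcloudapi.net"

# core-site.xml property under which the account key must be stored
# so HDInsight can authenticate to the Azure Government account:
PROPERTY="fs.azure.account.key.${ACCOUNT_NAME}.${GOV_SUFFIX}"
echo "$PROPERTY"
```

With the public suffix in place of the Government one, credential validation fails exactly as shown in the output file above.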
To get support for Azure Government in the out-of-the-box script, please vote for this in the Azure Government feedback forum: Support for Azure Government storage accounts in HDInsight
I am trying to redeploy an Azure Cloud Service (classic) to an "extended support" one, since the former is being deprecated. Following this guide and its prerequisites, I have created a virtual network and a new storage account. I set up a bunch of permissions, and the Connectivity Check for my storage account indicates no problems. However, when I try to create and deploy a new Cloud Service (Extended Support) using my (updated) .cscfg, .csdef and .cspkg files, I get this error:
Error:AuthorizationFailure, message:This request is not authorized to perform this operation. (Code: PackageUriForbidden)
I've tried setting the container and blob access to public for the deploy files, and I have added Network Contributor and Storage Blob Data Contributor to both the subscription and the cloud storage resources for my user account. What am I missing?
I tried deploying Cloud Services (extended support) via the Azure portal, and the deployment succeeded. Refer below:
I uploaded all of my cloud service packages to my storage account, used those packages from my storage blobs, and created the Cloud Services (extended support) instance.
I enabled connections from all networks for my storage account, and thus did not receive any authorization error.
It looks like your storage account has its firewall and VNet enabled for selected networks, or there is an IP address restriction on the storage account.
I created a service endpoint in my VNet for Microsoft.Storage, like below:
Added this VNet to the selected networks in the storage account's firewall:
Checked the option to allow Azure services to access the storage account, like below:
Now, when I try to deploy another cloud service with the same storage account with the firewall and VNet enabled, I get the same error as yours; refer below:
I allowed my client machine's IP in the storage account and was able to add the packages without any error while deploying the Cloud Service.
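For reference, the portal steps above can also be scripted with the Azure CLI. A sketch of the same sequence; the resource group, storage account, VNet, subnet, and IP address are all placeholders to replace with your own:

```shell
# Placeholder names; replace with your own resources.
RG="my-rg"
SA="mystorageaccount"
VNET="my-vnet"
SUBNET="my-subnet"
CLIENT_IP="203.0.113.25"

# 1. Enable the Microsoft.Storage service endpoint on the subnet
az network vnet subnet update --resource-group "$RG" \
  --vnet-name "$VNET" --name "$SUBNET" \
  --service-endpoints Microsoft.Storage

# 2. Add the VNet/subnet to the storage account's firewall
az storage account network-rule add --resource-group "$RG" \
  --account-name "$SA" --vnet-name "$VNET" --subnet "$SUBNET"

# 3. Keep the firewall on, but let trusted Azure services through
az storage account update --resource-group "$RG" --name "$SA" \
  --bypass AzureServices --default-action Deny

# 4. Allow the deploying client machine's public IP
az storage account network-rule add --resource-group "$RG" \
  --account-name "$SA" --ip-address "$CLIENT_IP"
```

Step 4 is the one that resolved the PackageUriForbidden error in my test.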
I'm nearly done migrating our cloud service (classic) deployments to cloud service (extended support). I'm working now on updating deployment pipelines. My package blob is located in a storage account. I create a SAS for the blob and use an API call to management.azure.com to create/update the deployment, passing ARM template as the body of the request.
This works correctly as long as the storage account with the package blob has its network set to "allow access from all networks". I want to restrict this access, so I set it to allow access from:
specific IP addresses of our devops servers
our own IP addresses
private vnet/subnets for the cloud service
I also tick the "Allow Azure services on the trusted services list to access this storage account" checkbox.
Yet the API call fails with an error message indicating access to the blob is not allowed. When I change the storage account network configuration to "allow access from all networks", everything works correctly.
After lots of searching, I found only one hit describing the same problem - https://github.com/Azure/azure-powershell/issues/20299 - yet no solution has been suggested other than allowing access from all networks.
I must be missing some trick - but what is it? How can I restrict access to the storage account?
I need an SFTP server hosted on Azure. The important thing is that it should support multiple user accounts and their management at runtime (via an interface or API), i.e. password reset and account blocking/unblocking (preferably user groups as well).
I have found some guides on how to set up an SFTP server on Azure:
https://learn.microsoft.com/en-us/samples/azure-samples/sftp-creation-template/sftp-on-azure/
https://charbelnemnom.com/how-to-deploy-sftp-service-on-microsoft-azure/
Their major drawback is that every change (addition of a new user, password update, etc.) requires a new deployment, which is not acceptable.
Also, there is an SFTP functionality for the Azure Blob Storage:
https://learn.microsoft.com/en-us/azure/storage/blobs/secure-file-transfer-protocol-support
This functionality is still in preview and only allows adding local users in the Azure portal. Unfortunately, other, more advanced features like account blocking are missing. It is also not possible to manage the user accounts from code.
There are some products on the Azure Marketplace provided by external companies, like Azure SFTP Gateway, which supports all the functionalities listed above. However, I am not sure about the long-term maintenance of these products, and I couldn't find any information about an SLA or similar guarantees for them.
Is there a reliable resource, or a way to set one up, that could serve as an SFTP server and meet the requirements listed above? If there is no ready-made solution, maybe there is a way to integrate an SFTP server with a database, or some other approach?
Presently, we only support SFTP for Azure Blob Storage.
SFTP for Azure Blob Storage supports multiple local user accounts and container-level permissions. You can use the portal, ARM, PowerShell, or the CLI to manage users, permissions, and passwords/keys. You can also disable individual users by disabling their SSH key and password authentication.
Connect to Azure Blob Storage using SFTP (preview) | Microsoft Docs
az storage account local-user | Microsoft Docs
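As a sketch, the lifecycle operations asked about (creation, password reset, blocking) map onto `az storage account local-user` subcommands roughly as below. All resource names are placeholders:

```shell
# Placeholder names; replace with your own resources.
RG="my-rg"
SA="mystorageaccount"
USER="contoso1"
CONTAINER="uploads"

# Create a local user scoped to one container (read/write/delete on blobs)
az storage account local-user create \
  --account-name "$SA" --resource-group "$RG" --user-name "$USER" \
  --home-directory "$CONTAINER" \
  --permission-scope permissions=rwd service=blob resource-name="$CONTAINER" \
  --has-ssh-password true

# "Password reset": regenerate the user's SSH password
az storage account local-user regenerate-password \
  --account-name "$SA" --resource-group "$RG" --user-name "$USER"

# "Account blocking": disable both authentication methods for the user
az storage account local-user update \
  --account-name "$SA" --resource-group "$RG" --user-name "$USER" \
  --has-ssh-password false --has-ssh-key false
```

Since these are plain CLI calls, they can be run from code or a pipeline rather than a new deployment, which addresses the drawback of the ARM-template approaches above.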
I am trying to integrate my odoo with azure blob storage, how can I do that?
You should check out the capabilities of Odoo on Bitnami's documentation page: https://docs.bitnami.com/azure/faq/
There is a section on the supported capabilities in Azure, as below:
The Bitnami Launchpad for Microsoft Azure creates the following resources in your Microsoft Azure account:
A storage account for each region in which the user launches an instance.
Application images are added to that storage account when the user launches instances.
Machine disks are created in that storage account when the user launches instances.
A resource group with the rest of the resources needed by the instance.
A resource group for the various storage accounts.
Source: https://docs.bitnami.com/azure/faq/
I'm using Azure Storage Explorer to connect to storage accounts that I've created by hand on Azure. However when I go to browse the storage account that was created by Azure when I created a Media Services account, I'm unable to connect to it.
I'm using blob.core.windows.net as the storage endpoint domain, and setting the storage account name and storage account key to be the same as Azure has defined it in the dashboard, but attempts to connect (with or without HTTPS) result in a 502-Bad Gateway HTTP error.
I'd like an easy way to browse all media files I've created without having to write special code. Has anyone been able to get this to work?
All storage accounts, regardless of how they were created, are browsable with Storage Explorer!
For storage accounts created when you create a Media Services account, you have to use the Storage Account Name and Storage Account Key, not the Media Services Account Name and Media Services Account Key! You will not be able to access the Storage service with the Media key, and vice versa.
When you create a Media Services account, one or more storage accounts can be attached to it. Let's say your storage account name is "MediaStorage123". I believe you need to pass the following data to Storage Explorer:
Account name/key: this can be found at the bottom of your storage account page in the Azure portal: press the Manage Keys button and you will see the data.
Storage endpoint domain: Not sure why you need this, but if so, you can see the information in the Dashboard of your media services account: https://xxx.blob.core.windows.net/.
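If entering the pieces separately keeps failing, it can help to verify the same name/key pair as a standard storage connection string, which Storage Explorer also accepts. The account name below is the hypothetical one from above, and the key is a placeholder:

```shell
#!/bin/sh
# Illustrative values; use the name and key from the Manage Keys page.
ACCOUNT_NAME="mediastorage123"
ACCOUNT_KEY="<storage-account-key>"

# Standard Azure Storage connection string format
CONNECTION_STRING="DefaultEndpointsProtocol=https;AccountName=${ACCOUNT_NAME};AccountKey=${ACCOUNT_KEY};EndpointSuffix=core.windows.net"
echo "$CONNECTION_STRING"
```

If this string works in Storage Explorer but the separate fields do not, the problem is in how the endpoint domain field is being filled in, not in the credentials.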
Hope this helps.
Just for the record: in my case (behind a proxy), I had to install a previous version of Azure Storage Explorer.