HashiCorp Vault Server With Azure Storage Blob - azure

I am trying to set up the HashiCorp Vault server on an Azure VM and connect it to a storage blob. I tried uploading files from the VM and listing the blobs using the CLI from the VM and was successful. However, when I try to run my Vault server I am getting the following error:
```
Error initializing storage of type azure: failed to get properties for container "CONTAINER NAME": -> github.com/hashicorp/vault/vendor/github.com/Azure/azure-storage-blob-go/azblob.newStorageError, /gopath/src/github.com/hashicorp/vault/vendor/github.com/Azure/azure-storage-blob-go/azblob/zc_storage_error.go:42
===== RESPONSE ERROR (ServiceCode=AuthorizationPermissionMismatch) =====
Description=This request is not authorized to perform this operation using this permission.
RequestId: sdfsdfsdf-601e-00df-87897-f34329000000
Time:2022-11-08T19:57:49.5256170Z, Details:
Code: AuthorizationPermissionMismatch
GET https://MANAGEDIDENTITY.blob.core.windows.net/CONTAINERNAME?restype=container&timeout=5
Authorization: REDACTED
User-Agent: [Azure-Storage/0.11 (go1.15.11; linux)]
X-Ms-Client-Request-Id: [345345345-ee29-428c-7d92-bhjgjhuyssd]
X-Ms-Version: [2019-12-12]
--------------------------------------------------------------------------------
RESPONSE Status: 403 This request is not authorized to perform this operation using this permission.
Content-Length: [279]
Content-Type: [application/xml]
Date: [Tue, 08 Nov 2022 19:57:48 GMT]
Server: [Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0]
X-Ms-Client-Request-Id: [345345345-ee29-428c-7d92-bhjgjhuyssd]
X-Ms-Error-Code: [AuthorizationPermissionMismatch]
X-Ms-Request-Id: [345345345-ee29-428c-7d92-bhjgjhuyssd]
X-Ms-Version: [2019-12-12]
```
Here is my Vault config file
```
{
  "listener": [{
    "tcp": {
      "address": "127.0.0.1:8200",
      "tls_disable": 1
    }
  }],
  "disable_mlock": "true",
  "api_addr": "http://127.0.0.1:8200",
  "storage": {
    "azure": {
      "accountName": "AccountName",
      "accountKey": "",
      "container": "ContainerName",
      "max_parallel": 512
    }
  },
  "ui": true
}
```
I tried listing the blobs from the VM and was able to:
```
az storage blob list \
    --account-name "accountName" \
    --container-name containerName \
    --output table \
    --auth-mode login

Name        Blob Type    Blob Tier    Length    Content Type    Last Modified              Snapshot
----------  -----------  -----------  --------  --------------  -------------------------  ----------
helloworld  BlockBlob    Hot          13        text/plain      2022-11-08T21:14:44+00:00
```

403 This request is not authorized to perform this operation using this permission.
The above 403 error occurs when the identity making the request has not been given the proper permissions on your storage account, i.e. no data-plane role has been assigned on the storage account.
For service principal or managed identity authentication you need to assign one of the following roles on your storage account (an example CLI command follows below):
Storage Blob Data Contributor
Storage Blob Data Reader
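For example, the role can be assigned with the Azure CLI. This is only a sketch; the identity object ID, subscription, resource group, and storage account names below are placeholders:
```
# Grant the VM's managed identity (or the service principal) data-plane access to the account.
# All values in angle brackets are placeholders.
az role assignment create \
    --assignee-object-id "<principal-object-id>" \
    --assignee-principal-type ServicePrincipal \
    --role "Storage Blob Data Contributor" \
    --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
```
Note that role assignments can take a few minutes to propagate before the 403 goes away.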
Also check the firewall settings under Networking: if you are accessing over the public endpoint, either enable "Allow access from all networks", or, if you enabled "Selected networks", add the virtual network the VM is in.
If you use the firewall, also add your client IP address and enable "Allow trusted Microsoft services to access this storage account" so those services can still reach the storage account.
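The same firewall changes can be scripted with the Azure CLI; again a sketch with placeholder names:
```
# Allow the VM's subnet and your client IP through the storage account firewall (placeholder names).
az storage account network-rule add \
    --resource-group "<resource-group>" \
    --account-name "<storage-account>" \
    --subnet "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Network/virtualNetworks/<vnet>/subnets/<subnet>"

az storage account network-rule add \
    --resource-group "<resource-group>" \
    --account-name "<storage-account>" \
    --ip-address "<your-client-ip>"

# Let trusted Microsoft services bypass the firewall.
az storage account update \
    --resource-group "<resource-group>" \
    --account-name "<storage-account>" \
    --bypass AzureServices
```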
Reference:
Azure permission : not authorized to perform this operation - Stack Overflow

Related

Azure Data Storage Access from Databricks

I cannot access Azure Data Lake Storage from Databricks.
I do not have the premium Azure Databricks service. I am trying to access ADLS Gen2 directly as per the latest documentation: https://learn.microsoft.com/en-us/azure/databricks/data/data-sources/azure/adls-gen2/azure-datalake-gen2-sp-access#access-adls-gen2-directly
I have granted the service principal "Contributor" permissions on this account.
This is the error message from the notebook:
Operation failed: "This request is not authorized to perform this operation using this permission.", 403, GET, https://geolocationinc.dfs.core.windows.net/instruments?upn=false&resource=filesystem&maxResults=500&timeout=90&recursive=false, AuthorizationPermissionMismatch, "This request is not authorized to perform this operation using this permission. ...;
This is my Spark config setup:
```
spark.conf.set("fs.azure.account.oauth.provider.type.<storage-account-name>.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set("fs.azure.account.oauth2.client.id.<storage-account-name>.dfs.core.windows.net", "<application-id>")
spark.conf.set("fs.azure.account.oauth2.client.secret.<storage-account-name>.dfs.core.windows.net", dbutils.secrets.get(scope="<scope-name>",key="<service-credential-key-name>"))
spark.conf.set("fs.azure.account.oauth2.client.endpoint.<storage-account-name>.dfs.core.windows.net", "https://login.microsoftonline.com/<directory-id>/oauth2/token")
```
The correct role is "Storage Blob Data Contributor" not "Contributor".
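To verify what the service principal actually has on the account, a quick check with the Azure CLI might look like this (the application ID and scope are placeholders):
```
# List role assignments for the service principal at the storage account scope (placeholder values).
az role assignment list \
    --assignee "<application-id>" \
    --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>" \
    --output table

# If "Storage Blob Data Contributor" is missing, add it at the same scope.
az role assignment create \
    --assignee "<application-id>" \
    --role "Storage Blob Data Contributor" \
    --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
```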

Limiting access to Storage Account from Azure Function Subnet

I have an Azure Function hosted on an (S1) App Service Plan. The Azure Function is integrated with a VNet subnet. This subnet has the Microsoft.Storage and Microsoft.Web service endpoints enabled, and it is also delegated to Microsoft.Web/serverFarms.
On the other hand, the storage account is configured to accept requests only from the same subnet the Azure Function is part of.
Unfortunately, that doesn't work. When I try to communicate with the storage account from the Azure Function, I get the error below:
```
2020-02-18T02:03:03.505 [Error] Faliure Occured
Azure.RequestFailedException : This request is not authorized to perform this operation.
RequestId:0b034a99-701e-002c-09ff-e5bd0a000000
Time:2020-02-18T02:03:03.1177265Z
Status: 403 (This request is not authorized to perform this operation.)
ErrorCode: AuthorizationFailure
Headers:
Server: Microsoft-HTTPAPI/2.0
x-ms-request-id: 0b034a99-701e-002c-09ff-e5bd0a000000
x-ms-client-request-id: 0bbe8185-4657-47f3-8566-5bcbd16c4274
x-ms-error-code: AuthorizationFailure
Date: Tue, 18 Feb 2020 02:03:02 GMT
Content-Length: 246
Content-Type: application/xml
at Azure.Storage.Blobs.BlobRestClient.Container.GetPropertiesAsync_CreateResponse(ClientDiagnostics clientDiagnostics,Response response)
at async Azure.Storage.Blobs.BlobRestClient.Container.GetPropertiesAsync(ClientDiagnostics clientDiagnostics,HttpPipeline pipeline,Uri resourceUri,String version,Nullable`1 timeout,String leaseId,String requestId,Boolean async,String operationName,CancellationToken cancellationToken)
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at async Azure.Storage.Blobs.BlobContainerClient.GetPropertiesInternal(BlobRequestConditions conditions,Boolean async,CancellationToken cancellationToken)
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at Azure.Storage.TaskExtensions.EnsureCompleted[T](Task`1 task)
at Azure.Storage.Blobs.BlobContainerClient.GetProperties(BlobRequestConditions conditions,CancellationToken cancellationToken)
at SharedLib.Utils.TestStorageAccountAccess() at D:\poc-code\NetworkSecurityPoc\SharedLib\Utils.cs : 13
at async MessengerFunction.Trigger.Run(HttpRequest req,ILogger log) at D:\poc-code\NetworkSecurityPoc\MessengerFunction\Trigger.cs : 25
```
But when I disable the vnet restriction on the storage account, everything works.
What could I be doing wrong?
Thank you.
The documentation below might help explain why this is happening.
From MS documentation:
When you create a function app, you must create or link to a general-purpose Azure Storage account that supports Blob, Queue, and Table storage. You can't currently use any virtual network restrictions on this account. If you configure a virtual network service endpoint on the storage account you're using for your function app, that configuration will break your app.
I would say it's a networking problem, as described in the Azure Functions networking documentation. Set WEBSITE_VNET_ROUTE_ALL to 1 and it should work.
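A sketch of setting that app setting with the Azure CLI (the function app and resource group names are placeholders):
```
# Route all outbound traffic from the function app through the integrated VNet (placeholder names).
az functionapp config appsettings set \
    --name "<function-app-name>" \
    --resource-group "<resource-group>" \
    --settings WEBSITE_VNET_ROUTE_ALL=1
```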

Azure DevOps Pipeline Azure Blob Storage upload file 403 Forbidden Exception

Summary
I'm creating a CI/CD provisioning pipeline for a new Azure Storage Account within an Azure DevOps pipeline and attempting to upload some files into Blob Storage using AzCopy, run from an Azure PowerShell task in the pipeline.
The Error
The script runs successfully from my local machine but when running in the Azure DevOps pipeline I get the following error (ErrorDateTime is just an obfuscated ISO 8601 formatted datetime):
```
System.Management.Automation.RemoteException: [ErrorDateTime][ERROR] Error parsing destination location
"https://newStorageAccount.blob.core.windows.net/config/import": Failed to validate
destination: One or more errors occurred. The remote server returned an error: (403) Forbidden.
[error][ErrorDateTime][ERROR] Error parsing destination location "https://newStorageAccount.blob.core.windows.net/config/import": Failed to validate destination: One or more errors occurred. The remote server returned an error: (403) Forbidden.
[debug]Processed: ##vso[task.logissue type=error][ErrorDateTime][ERROR] Error parsing destination location "https://newStorageAccount.blob.core.windows.net/config/import": Failed to validate destination: One or more errors occurred. The remote server returned an error: (403) Forbidden.
Error record:
This request is not authorized to perform this operation.
```
Assumptions
The storage account has been set up to allow access only from specific VNets and IP addresses.
It looks like the firewall or credentials are somehow configured wrongly, but the service principal running the script has been used successfully in other sibling pipeline tasks. To rule these problems out I've temporarily given the service principal Subscription Owner permissions, and the storage account's Firewall rules tab has "Allow trusted Microsoft Services to access this storage account" enabled.
What I've tried...
I've successfully run the script from my local machine with my IP Address being in the allowed list.
If I enable "Allow access from All networks" on the Storage Account Firewall rules then the script runs and the file is uploaded successfully.
It appears as if the Azure Pipelines agents running in their own VNet don't have access to my storage account, but I would have thought that requirement would be satisfied by setting "Allow trusted Microsoft Services to access this storage account" in the firewall settings.
I'm using the following line within the Azure PowerShell task. I'm happy with the values, because everything works when "All networks" or my IP address is enabled and I run locally.
```
.\AzCopy.exe /Source:$SourcePath /Dest:$blobUrlDir /DestKey:$key /Pattern:$FilenamePattern /Y
```
Any thoughts or guidance would be appreciated.
Thanks,
SJB
People seem to be getting mixed results in this GitHub issue, but the AzureFileCopy@4 task works (at least for us) after adding the "Storage Blob Data Contributor" role for the ARM connection's service principal on the storage account itself. The step below is the only one needed in a pipeline that deploys a repo as a static website in a blob container:
```
- task: AzureFileCopy@4
  displayName: 'Copy files to blob storage: $(storageName)'
  inputs:
    SourcePath: '$(build.sourcesDirectory)'
    Destination: AzureBlob
    storage: $(storageName)
    ContainerName: $web
    azureSubscription: 'ARM Connection goes here' # needs a role assignment before it'll work
```
(Of course, if you're using Azure CDN like we are, the next step is to clear the CDN endpoint's cache, but that has nothing to do with the blob storage error)
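If it helps, that CDN purge can also be scripted; a rough sketch with placeholder profile and endpoint names:
```
# Purge all cached content on the CDN endpoint after the copy (placeholder names).
az cdn endpoint purge \
    --resource-group "<resource-group>" \
    --profile-name "<cdn-profile>" \
    --name "<cdn-endpoint>" \
    --content-paths '/*'
```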
After doing further research I noticed the following raised issue: Azure DevOps isn't considered a trusted Microsoft service from a storage account perspective.
https://github.com/MicrosoftDocs/azure-docs/issues/19456
My temporary workaround is to:
Set the DefaultAction to Allow before the copy, thereby allowing access from all networks.
Set the DefaultAction back to Deny after the copy action, which ensures my VNet rules are enforced again.
```
Try
{
    Update-AzureRmStorageAccountNetworkRuleSet -ResourceGroupName "$ResourceGroupName" -Name "$StorageAccountName" -DefaultAction Allow
    .\AzCopy.exe /Source:$SourcePath /Dest:$blobUrlDir /DestKey:$key /Pattern:$FilenamePattern /Y
}
Catch
{
    # Handle errors...
}
Finally
{
    Update-AzureRmStorageAccountNetworkRuleSet -ResourceGroupName "$ResourceGroupName" -Name "$StorageAccountName" -DefaultAction Deny
}
```
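If you'd rather avoid the older AzureRM cmdlets, roughly the same toggle can be done with the Azure CLI (a sketch; resource names are placeholders):
```
# Temporarily open the storage account firewall, run the copy, then lock it down again (placeholder names).
az storage account update \
    --resource-group "<resource-group>" \
    --name "<storage-account>" \
    --default-action Allow

# ... run AzCopy here ...

az storage account update \
    --resource-group "<resource-group>" \
    --name "<storage-account>" \
    --default-action Deny
```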
Thanks,
SJB
Have you considered using the Azure DevOps task "Azure File Copy" instead of a PowerShell script?
see: https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/deploy/azure-file-copy?view=azure-devops

Getting NameOrService not known error when trying to load data to Data Lake Storage Gen1

I am trying to load data to Data Lake Storage Gen1 using Python but I am getting a "Name or service not known" error.
I have created an AD application and got the client key and tenant ID as mentioned in the docs:
```
# Requires the azure-datalake-store package
from azure.datalake.store import core, lib

adlsAccountName = '*******'
adlCreds = lib.auth(tenant_id='*****', client_secret='*****', client_id='******')

## Create a filesystem client object
adlsFileSystemClient = core.AzureDLFileSystem(adlCreds, store_name=adlsAccountName)
adlsFileSystemClient.ls('/')
```
The error I am getting is:
```
azure.datalake.store.exceptions.DatalakeRESTException: HTTP error: ConnectionError(MaxRetryError("HTTPSConnectionPool(host='junipertest.azuredatalakestore.net', port=443): Max retries exceeded with url: /webhdfs/v1/.?api-version=2018-09-01&listSize=4000&OP=LISTSTATUS (Caused by NewConnectionError(': Failed to establish a new connection: [Errno -2] Name or service not known',))",),)
```
I have tried both mkdir and ls but get the same error.
"Name or service not known" is a network error that indicates the host cannot be resolved junipertest.azuredatalakestore.net or there is no service on port 443.
Check the name again in the Azure portal.
Check the name resolution (DNS):
```
> nslookup junipertest.azuredatalakestore.net   # Windows
$ dig junipertest.azuredatalakestore.net        # Linux
```
Ensure you have a network route to reach the Data Lake.
In the Azure portal, select the Data Lake Storage account and then select Firewall and virtual networks.
For Gen1 storage, follow the documentation on securing data in Data Lake Storage Gen1.

The MAC signature found in the HTTP request is not same as computed from azure server

I am trying to hit the Azure Blob service from Postman. I have the Azure account name and shared key, which we are using from Java code to hit Azure Media Services.
I have read through the Azure documentation for reference, but it's not working when I call the service from Postman even though I passed all the required credentials (Azure accountName, shared key). Please find below the details I am using from Postman.
```
URL: https://xxxx.blob.core.windows.net/yyy/myblob
(Note: xxxx = accountName, yyy = container)

Headers:
x-ms-date: value
x-ms-blob-type: BlockBlob
x-ms-version: 2017-11-09
Authorization: SharedKey xxxx:hascodeKey   (the one we are using for the media service, which works)
```
I want to know: is the shared key the same for Azure Media Services and the Azure Blob service?
Can somebody help me with this?
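As far as I know, the Blob service SharedKey signature has to be computed with one of the storage account's access keys, which are not the same credential as a Media Services account key. The storage keys can be retrieved with the Azure CLI (account and resource group names are placeholders):
```
# List the storage account access keys used for SharedKey authorization (placeholder names).
az storage account keys list \
    --resource-group "<resource-group>" \
    --account-name "<storage-account>" \
    --output table
```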
