Can't access Storage Queue receiveMessages due to AuthorizationPermissionMismatch - node.js

I can call Azure Storage Queue methods using ClientSecretCredential, but when calling receiveMessages, peekMessages, or deleteMessages on the queue I get this error:
RestError: This request is not authorized to perform this operation using this permission.
RequestId:c92577923e-a603-0004-61c0-f70a19000
Here is my Node.js code:
const { QueueServiceClient } = require("@azure/storage-queue");
const { ClientSecretCredential } = require("@azure/identity");

async function getQueueMessages() {
  try {
    // tenantId, app_id and SecretKey come from the app registration (defined elsewhere)
    let myStorageAccount = "hellostorage";
    const credential = new ClientSecretCredential(tenantId, app_id, SecretKey);
    const queueServiceClient = new QueueServiceClient(
      `https://${myStorageAccount}.queue.core.windows.net`,
      credential
    );
    const queueName = "hello-queue";
    const queueClient = queueServiceClient.getQueueClient(queueName);
    const response = await queueClient.receiveMessages({ numberOfMessages: 10 });
    console.log("response: ", response);
  } catch (error) {
    console.log("error: ", error);
  }
}
getQueueMessages();
Here are my app permissions:

The screenshot you shared essentially allows your Service Principal to acquire a token for your Storage Accounts. It does not grant permissions to perform operations on a Storage Account, and this is why you are getting this error.
What you need to do is give the appropriate data-related permissions to your Service Principal on the Storage Account. Please see this link for the RBAC roles you must assign to your Service Principal to perform data operations: https://learn.microsoft.com/en-us/rest/api/storageservices/authorize-with-azure-active-directory#manage-access-rights-with-rbac.
You can try the Storage Queue Data Message Processor or Storage Queue Data Contributor role.
After you apply the appropriate role, you should be able to perform the operations.
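As a minimal sketch (the subscription and resource group values below are placeholders), the role assignment can be made with the Azure CLI, scoped to the storage account from the question:
az role assignment create \
  --assignee "<service-principal-app-id>" \
  --role "Storage Queue Data Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/hellostorage"
Role assignments can take a few minutes to propagate before the queue operations start succeeding.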

Related

Azure blob read SAS token throws AuthorizationPermissionMismatch exception

I'm trying to generate a SAS token for a blob so that any user with the token can read the blob. Below is the code I have. I get an exception when I try to read the blob. If I grant "Storage Blob Data Reader" access to the user, then it works. My understanding is that a user with a SAS token should be able to read the blob without being granted specific permissions. What am I missing here?
BlobServiceClient blobServiceClient = new BlobServiceClient(new Uri("https://accountname.blob.core.windows.net/"), new DefaultAzureCredential());
UserDelegationKey key = await blobServiceClient.GetUserDelegationKeyAsync(DateTimeOffset.UtcNow,
    DateTimeOffset.UtcNow.AddDays(1));

BlobSasBuilder sasBuilder = new BlobSasBuilder()
{
    BlobContainerName = "containerName",
    BlobName = "file.json",
    Resource = "b",
    StartsOn = DateTimeOffset.UtcNow,
    ExpiresOn = DateTimeOffset.UtcNow.AddHours(1)
};
sasBuilder.SetPermissions(BlobSasPermissions.Read);
string sasToken = sasBuilder.ToSasQueryParameters(key, "accountname").ToString();

UriBuilder fullUri = new UriBuilder()
{
    Scheme = "https",
    Host = string.Format("{0}.blob.core.windows.net", "accountname"),
    Path = string.Format("{0}/{1}", "containerName", "file.json"),
    Query = sasToken
};

var blobClient = new Azure.Storage.Blobs.BlobClient(fullUri.Uri);
using (var stream = await blobClient.OpenReadAsync()) // throws exception
{ }
Exception : Service request failed.
Status: 403 (This request is not authorized to perform this operation using this permission.)
ErrorCode: AuthorizationPermissionMismatch
I believe you are getting this error because the user for which you are getting the user delegation key does not have permission to access the data in the storage account.
Assigning the Owner permission enables the user to manage the storage account itself; it does not give them permission to manage the data.
Please try assigning the user one of the data roles described here: https://learn.microsoft.com/en-us/azure/storage/blobs/authorize-access-azure-active-directory#azure-built-in-roles-for-blobs.
To learn more about the RBAC roles for managing data, please see this link: https://learn.microsoft.com/en-us/azure/storage/blobs/assign-azure-role-data-access?tabs=portal.
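As a rough example (subscription, resource group, and user values are placeholders), the signing user can be given a data role at the storage-account scope with the Azure CLI:
az role assignment create \
  --assignee "<user-principal-name-or-object-id>" \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/accountname"
The user delegation key is issued on behalf of this user, so the SAS can only convey permissions the signing user actually holds.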

Azure Function - Managed IDs to write to storage table - failing with 403 AuthorizationPermissionMismatch

I have an Azure Function application (HTTP trigger) that writes to a storage queue and a storage table. Both fail when I try to change to a managed identity. This post/question is about just the storage table part.
Here's the code that does the actual writing to the table:
GetStorageAccountConnectionData();

try
{
    WorkspaceProvisioningRecord provisioningRecord = new PBIWorkspaceProvisioningRecord();
    provisioningRecord.status = requestType;
    provisioningRecord.requestId = requestId;
    provisioningRecord.workspace = request;

#if DEBUG
    Console.WriteLine(Environment.GetEnvironmentVariable("AZURE_TENANT_ID"));
    Console.WriteLine(Environment.GetEnvironmentVariable("AZURE_CLIENT_ID"));
    DefaultAzureCredentialOptions options = new DefaultAzureCredentialOptions()
    {
        Diagnostics =
        {
            LoggedHeaderNames = { "x-ms-request-id" },
            LoggedQueryParameters = { "api-version" },
            IsLoggingContentEnabled = true
        },
        ExcludeVisualStudioCodeCredential = true,
        ExcludeAzureCliCredential = true,
        ExcludeManagedIdentityCredential = true,
        ExcludeAzurePowerShellCredential = true,
        ExcludeSharedTokenCacheCredential = true,
        ExcludeInteractiveBrowserCredential = true,
        ExcludeVisualStudioCredential = true
    };
#endif

    DefaultAzureCredential credential = new DefaultAzureCredential();
    Console.WriteLine(connection.storageTableUri);
    Console.WriteLine(credential);

    var serviceClient = new TableServiceClient(new Uri(connection.storageTableUri), credential);
    var tableClient = serviceClient.GetTableClient(connection.tableName);
    await tableClient.CreateIfNotExistsAsync();

    var entity = new TableEntity();
    entity.PartitionKey = provisioningRecord.status;
    entity.RowKey = provisioningRecord.requestId;
    entity["requestId"] = provisioningRecord.requestId.ToString();
    entity["status"] = provisioningRecord.status.ToString();
    entity["workspace"] = JsonConvert.SerializeObject(provisioningRecord.workspace);

    //this is where I get the 403
    await tableClient.UpsertEntityAsync(entity);

    //other stuff...
}
catch (AuthenticationFailedException e)
{
    Console.WriteLine($"Authentication Failed. {e.Message}");
    WorkspaceResponse response = new PBIWorkspaceResponse();
    response.requestId = null;
    response.status = "failure";
    return response;
}
catch (Exception ex)
{
    Console.WriteLine($"whoops! Failed to create storage record:{ex.Message}");
    WorkspaceResponse response = new WorkspaceResponse();
    response.requestId = null;
    response.status = "failure";
    return response;
}
I have the client id / client secret for this security principal defined in my local.settings.json as AZURE_TENANT_ID / AZURE_CLIENT_ID / AZURE_CLIENT_SECRET.
The code dies trying to do the upsert, and it never hits the AuthenticationFailedException catch - just the general exception.
The security principal defined in the AZURE_* variables was used to create this entire application, including the storage account.
To manage data inside a storage account (like creating a table, etc.), you need to assign a different set of permissions. The Owner role is a control-plane role that lets you manage storage accounts themselves, not the data inside them.
From this link:
Only roles explicitly defined for data access permit a security principal to access blob data. Built-in roles such as Owner, Contributor, and Storage Account Contributor permit a security principal to manage a storage account, but do not provide access to the blob data within that account via Azure AD.
Even though the text above is for blobs, the same applies to tables as well.
Please assign the Storage Table Data Contributor role to your Managed Identity and then you should not get this error.
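As a sketch (the principal ID and scope are placeholders), the assignment can be made with the Azure CLI using the managed identity's principal (object) ID:
az role assignment create \
  --assignee-object-id "<managed-identity-principal-id>" \
  --assignee-principal-type ServicePrincipal \
  --role "Storage Table Data Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
Keep in mind that when the function runs locally with the AZURE_* variables set, DefaultAzureCredential authenticates as that service principal rather than the managed identity, so whichever principal it resolves to is the one that needs the role.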

Getting Error while Creating SAS Token for Azure Storage Blob with MSI

I'm trying to create a SAS token for a storage blob. I use a StorageCredentials that was created with MSI (Managed Service Identity) to create the CloudBlobClient. When creating the SAS I'm getting "Cannot create Shared Access Signature unless Account Key credentials are used". Is there support for SAS with MSI?
var container = blobClient.GetContainerReference(containerName);
var blockBlob = container.GetBlockBlobReference(snapname);
var sas = string.Concat(blockBlob.Uri.ToString(), blockBlob.GetSharedAccessSignature(sasConstraints));
This is how I create the StorageCredentials:
tokenCallback = CreateMsiCallback();
var initToken = await tokenCallback(audience);
return new StorageCredentials(
    new TokenCredential(initToken, async (state, token) =>
    {
        var accessToken = await _tokenCallback(audience);
        return new NewTokenAndFrequency(accessToken, TimeSpan.FromMinutes(1));
    }, null, TimeSpan.FromMinutes(1))
);
To create the token callback, I use HttpClient:
public Func<string, Task<string>> CreateMsiCallback()
{
    var handler = new HttpClientHandler
    {
        ServerCertificateCustomValidationCallback =
            (httpRequestMessage, cert, certChain, policyErrors) =>
            {
                if (policyErrors == SslPolicyErrors.None)
                {
                    return true;
                }
                return 0 == string.Compare(cert.GetCertHashString(), FabricThumbprint, StringComparison.OrdinalIgnoreCase);
            }
    };
    var client = new HttpClient(handler)
    {
        DefaultRequestHeaders =
        {
            { "secret", FabricAuthenticationCode }
        }
    };
    return async (resource) =>
    {
        var requestUri = $"{FabricMsiEndpoint}?api-version={FabricApiVersion}&resource={HttpUtility.UrlEncode(resource)}";
        var requestMessage = new HttpRequestMessage(HttpMethod.Get, requestUri);
        var response = await client.SendAsync(requestMessage);
        response.EnsureSuccessStatusCode();
        var tokenResponseString = await response.Content.ReadAsStringAsync();
        var tokenResponseObject =
            JsonConvert.DeserializeObject<ManagedIdentityTokenResponse>(tokenResponseString);
        return tokenResponseObject.AccessToken;
    };
}
Based on this GitHub issue, you will need to assign Storage data roles to your MSI in order to generate a SAS token. From this thread:
The error is because your OAuth account doesn't have permission to generateUserDelegationKey. To get a SAS with an OAuth storage context (New-AzStorageContext -UseConnectedAuth), we need to first generate a UserDelegationKey from the server, then use that key to generate the SAS token.
Please check that you have assigned the correct roles to the OAuth login user (with Connect-AzAccount), i.e. at least one of the following 4 roles on the specific storage account:
Storage Blob Data Owner
Storage Blob Data Contributor
Storage Blob Data Reader
Storage Blob Delegator
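Once the identity has one of these roles, a user delegation SAS can also be produced without account keys, for example with the Azure CLI (the account, container, blob, and expiry values below are placeholders):
az storage blob generate-sas \
  --account-name "<storage-account>" \
  --container-name "<container>" \
  --name "<blob-name>" \
  --permissions r \
  --expiry "<yyyy-mm-ddTHH:MMZ>" \
  --as-user \
  --auth-mode login \
  --full-uri
--auth-mode login with --as-user makes the CLI request a user delegation key for the signed-in identity and sign the SAS with it, which is the same two-step flow described in the quoted thread.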

Is there a preset query for Azure Service Connections to retrieve certificate info with the azure-devops-api?

I am working on a front end application feature to get the details from all the key vault certificates and when they expire. The app uses the service connection to authenticate and the azure-devops-sdk and azure-devops-api to pull back data for api calls. I am currently getting data back on the contents within the key vault but it only lists the permissions for the certificates - https://learn.microsoft.com/en-us/rest/api/keyvault/getcertificate/getcertificates. I've tried appending /certificates on the end of the api call but it is not a valid endpoint.
I am aware of the azure rest api which returns this data - https://learn.microsoft.com/en-us/rest/api/keyvault/vaults/get but I am not sure how to implement this with the service connection.
Does anyone have any ideas on how I can get this data back with the service connection?
export async function getCertInfoFromAzure(endpoint) {
  requestInitialData = await getRequestInitialData();
  const { project, settings, client } = requestInitialData;
  const subscriptionId = endpoint.data.subscriptionId;
  const apiVersion = providerMap['Microsoft.KeyVault'.toLowerCase()];
  const resourceId = '/subscriptions/<subscription name> /resourceGroups/<resource group name>/providers/Microsoft.KeyVault/vaults/<key vault name>/';
  const seRequest: ServiceEndpointRequest = createRequestObject(
    `{{{endpoint.url}}}${resourceId}?api-version=${`2019-09-01`}`,
    'jsonpath:$');
  const response = await client.executeServiceEndpointRequest(seRequest, project.id, endpoint.id);
  console.log('response', response);
  console.log(JSON.parse(response.result));
  console.log('seRequest', seRequest);
  return {
    response: JSON.parse(response.result[0]),
  };
}

Authorize a user for azure blob access

I am authenticating the users to my web page using the microsoft login via adal-node library.
adal-node has an AuthenticationContext from which we can get a JWT token using acquireTokenWithAuthorizationCode.
So, the users of my active directory app can now successfully login with their Microsoft accounts.
Now, the question is how to get their RBAC roles for a specific storageaccount/container/blob using the above received JWT Token? Is that even possible?
Or should I be using a library like azure-arm-authorization for this purpose? I have set the RBAC roles for each storageaccount/container/blob but I am not finding any online documentation on how to get these roles for every logged in user of my app.
Any help would be invaluable.
TL;DR How do I authorize azure blobs?
According to my research, if we want to use Azure AD authentication to access Azure Blob storage, we need to assign an Azure RBAC role on the Azure storage account or container. For more details, please refer to here. Besides, we can use the Azure CLI to assign a role and get the role assignments.
For example
# assign role
az role assignment create \
--role "Storage Blob Data Contributor" \
--assignee <email> \
--scope "/subscriptions/<subscription>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>/blobServices/default/containers/<container>"
# list role assignment of the resource
az role assignment list --scope "/subscriptions/<subscription>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>/blobServices/default/containers/<container>"
For further information, please read the article
Update1
If you want to use the Node.js SDK to get the role assignments, please refer to the following code:
const authorizationManagement = require('azure-arm-authorization');
const msrestAzure = require('ms-rest-azure');

const scope = '/subscriptions/<subscription>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>/blobServices/default/containers/<container>';
const subscriptionId = 'e5b0fcfa-e859-43f3-8d84-5e5fe29f4c68';

msrestAzure.interactiveLogin().then(credentials => {
  const client = new authorizationManagement(credentials, subscriptionId);
  client.roleAssignments.listForScope(scope).then(result => {
    result.forEach(element => {
      client.roleDefinitions.getById(element.roleDefinitionId).then(result => {
        console.log("principal ID: " + element.principalId + "\nrole name: " + result.roleName);
      });
    });
  });
});
For more details, please refer to Get access control list (IAM) of a resource group in Node.js
Update2
According to my test, if you want to use azure-arm-authorization with adal-node, please refer to the following code:
const authorizationManagement = require('azure-arm-authorization');
const TokenCredentials = require('ms-rest').TokenCredentials;
const adal = require('adal-node').AuthenticationContext;

const scope = '/subscriptions/<subscription>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>/blobServices/default/containers/<container>';
const subscriptionId = 'e5b0fcfa-e859-43f3-8d84-5e5fe29f4c68';

// use service principal to get access token with adal-node
/*
If you do not have a service principal, you can use the following Azure CLI command
(https://learn.microsoft.com/en-us/cli/azure/ad/sp?view=azure-cli-latest#az-ad-sp-create-for-rbac) to create it:
az ad sp create-for-rbac -n "MyApp" --role contributor
*/
const tenant = 'your-tenant-id';
const authorityUrl = "https://login.microsoftonline.com/" + tenant;
const clientId = 'your-client-id';
const clientSecret = 'your-client-secret';
const resource = 'https://management.azure.com/';

const context = new adal(authorityUrl);
context.acquireTokenWithClientCredentials(
  resource,
  clientId,
  clientSecret,
  (err, tokenResponse) => {
    if (err) {
      console.log(`Token generation failed due to ${err}`);
    } else {
      const credentials = new TokenCredentials(tokenResponse.accessToken);
      const client = new authorizationManagement(credentials, subscriptionId);
      client.roleAssignments.listForScope(scope).then(result => {
        result.forEach(element => {
          client.roleDefinitions.getById(element.roleDefinitionId).then(result => {
            console.log("principal ID: " + element.principalId + "\nrole name: " + result.roleName);
          });
        });
      });
    }
  }
);
Besides, if you want to get role assignments with passport-azure-ad, you can use passport-azure-ad to get an AD access token and then call the Azure REST API. Regarding how to implement it, you can refer to the sample.
