(Azure Storage - nodeJS) Getting SAS policies that are applied on blob container and queue - node.js

I'm trying to get the expiration date of the SAS policies that are applied on a blob container and a queue.
I'm able to get the information via PowerShell with Get-AzStorageQueueStoredAccessPolicy and Get-AzStorageContainerStoredAccessPolicy, but I cannot find a way to do the same via Node.js.
I went through the MS Node SDK for storage; I was able to find a way to set up a SAS policy, but not to retrieve an existing one.
Do I need to go through MS Graph?
Thank you for your help.

To get the access policies for a blob container, the method you want is getAccessPolicy() on the ContainerClient class.
import { BlobServiceClient } from '@azure/storage-blob';
const connectionString = "your-storage-account-connection-string";
const containerName = "your-container-name";
const blobServiceClient = BlobServiceClient.fromConnectionString(connectionString);
const containerClient = blobServiceClient.getContainerClient(containerName);
const accessPolicyResult = await containerClient.getAccessPolicy();
console.log(accessPolicyResult.signedIdentifiers); // access policies are returned in "signedIdentifiers"
Similarly, to get the access policies for a queue, the method you want is getAccessPolicy() on the QueueClient class.
import { QueueServiceClient } from '@azure/storage-queue';
const connectionString = "your-storage-account-connection-string";
const queueName = "your-queue-name";
const queueServiceClient = QueueServiceClient.fromConnectionString(connectionString);
const queueClient = queueServiceClient.getQueueClient(queueName);
const accessPolicyResult = await queueClient.getAccessPolicy();
console.log(accessPolicyResult.signedIdentifiers); // access policies are returned in "signedIdentifiers"
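Since the original goal was the expiration date, note that each entry in signedIdentifiers carries an accessPolicy object with an expiresOn field. A minimal sketch of pulling the expiry out of the result (the sample array at the bottom is hypothetical, only mirroring the shape the SDK returns):

```javascript
// Sketch: extract the expiry date from each stored access policy.
// `signedIdentifiers` has the shape returned by getAccessPolicy() in
// @azure/storage-blob and @azure/storage-queue: { id, accessPolicy }.
function getPolicyExpirations(signedIdentifiers) {
  return signedIdentifiers.map((si) => ({
    id: si.id,
    expiresOn: si.accessPolicy ? si.accessPolicy.expiresOn : undefined,
  }));
}

// Hypothetical sample mirroring the SDK's response shape:
const sample = [
  {
    id: "readonly-policy",
    accessPolicy: { permissions: "r", expiresOn: new Date("2024-12-31T00:00:00Z") },
  },
];
console.log(getPolicyExpirations(sample));
```

The same helper works for both the container and the queue results, since both SDKs expose the policies under signedIdentifiers.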

Related

How to check existence of soft deleted file in Azure blob container with node js?

I have a file that was stored in an Azure blob directory "folder1/folder2/file.txt". This file was soft-deleted; I can see it in the Azure web console. I need a function that checks for this file's existence.
I tried the library "azure-storage". It works perfectly with NOT-removed files:
const blobService = azure.createBlobService(connectingString);
blobService.doesBlobExist(container, blobPath, callback)
Does anyone know how to use the same approach with soft-deleted files?
I also tried the lib "@azure/storage-blob", but I got stuck with its endless entities there (BlobServiceClient, ContainerItem, BlobClient, ContainerClient, etc.) and couldn't find a way to see a particular file in a particular blob directory.
Following this MS doc, I was able to restore the soft-deleted blobs and get their names with the code snippet below.
const { BlobServiceClient } = require('@azure/storage-blob');
const connstring = "DefaultEndpointsProtocol=https;AccountName=kvpstorageaccount;AccountKey=<Storage_Account_Key>;EndpointSuffix=core.windows.net";
if (!connstring) throw Error('Azure Storage Connection string not found');
const blobServiceClient = BlobServiceClient.fromConnectionString(connstring);
async function main() {
  const containerName = 'kpjohncontainer';
  const blobName = 'TextFile05.txt';
  const containerClient = blobServiceClient.getContainerClient(containerName);
  await undeleteBlob(containerClient, blobName);
}
main()
  .then(() => console.log('done'))
  .catch((ex) => console.log(ex.message));
async function undeleteBlob(containerClient, blobName) {
  const blockBlobClient = containerClient.getBlockBlobClient(blobName); // getBlockBlobClient is synchronous, no await needed
  await blockBlobClient.undelete(); // restore the soft-deleted blob
  console.log(`undeleted blob ${blobName}`);
}
To check whether the blob exists, and whether it exists in a soft-deleted state, I found relevant code, but it's in C#, provided by @Gaurav Mantri. To achieve the same in Node.js, refer here.
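As a sketch of what that existence check can look like in Node.js (untested against a live account): list blobs with includeDeleted and look for the exact name. The helper below accepts any (async) iterable of blob items, so the actual SDK call appears only in the commented usage:

```javascript
// Sketch: classify a blob as "active", "soft-deleted", or "not-found".
// `listing` is any (async) iterable of blob items, e.g. the result of
// containerClient.listBlobsFlat({ prefix: blobName, includeDeleted: true }).
async function findBlobState(listing, blobName) {
  for await (const item of listing) {
    if (item.name === blobName) {
      return item.deleted ? "soft-deleted" : "active";
    }
  }
  return "not-found";
}

// Hypothetical usage with @azure/storage-blob:
// const state = await findBlobState(
//   containerClient.listBlobsFlat({ prefix: "folder1/folder2/file.txt", includeDeleted: true }),
//   "folder1/folder2/file.txt");
```

Listing with a prefix keeps the call cheap even in large containers, and the deleted flag on each item distinguishes the soft-deleted case from a live blob with the same name.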

Azure Data Storage: Unable to upload file (Not authorized to perform this operation using this permission)

I'm trying to follow the example to upload a file to Azure Data Storage as mentioned in the documentation : https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet?tabs=visual-studio%2Cmanaged-identity%2Croles-azure-portal%2Csign-in-azure-cli%2Cidentity-visual-studio
Following is my code:
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using System;
using System.IO;
using Azure.Identity;
// TODO: Replace <storage-account-name> with your actual storage account name
var blobServiceClient = new BlobServiceClient(
new Uri("https://[some azure storage]"),
new DefaultAzureCredential());
// Set container name
string containerName = "data";
// Get container
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);
// Create a local file in the ./data/ directory for uploading and downloading
string localPath = "data";
Directory.CreateDirectory(localPath);
string fileName = "testupload" + Guid.NewGuid().ToString() + ".txt";
string localFilePath = Path.Combine(localPath, fileName);
// Write text to the file
await File.WriteAllTextAsync(localFilePath, "Hello, World!");
// Get a reference to a blob
BlobClient blobClient = containerClient.GetBlobClient(fileName);
Console.WriteLine("Uploading to Blob storage as blob:\n\t {0}\n", blobClient.Uri);
// Upload data from the local file
await blobClient.UploadAsync(localFilePath, true);
But I'm getting an error message that the request is not authorized.
Error message:
Azure.RequestFailedException: 'This request is not authorized to perform this operation using this permission.
I have the Contributor role (which, based on its description, grants full access to manage all resources). Is this role still not enough to perform the operation?
I tried the same code in my environment and initially got the same error.
Console:
Azure.RequestFailedException: 'This request is not authorized to perform this operation using this permission.
The above error occurs when your principal doesn't have data access to Azure Blob Storage.
For accessing blob storage through an identity, Gaurav Mantri's comment is correct: you need a role that grants data-plane access to blob storage. Contributor is a management-plane role and is not enough.
The roles are:
Storage Blob Data Contributor (or)
Storage Blob Data Owner
Go to portal -> storage accounts -> Access Control (IAM) -> Add -> Add role assignment -> assign the Storage Blob Data Contributor or Storage Blob Data Owner role on the storage account.
After assigning the role to my storage account, I executed the same code and it successfully uploaded the file to Azure Blob Storage.
Code:
using Azure.Storage.Blobs;
using System;
using System.IO;
using Azure.Identity;
// TODO: Replace <storage-account-name> with your actual storage account name
var blobServiceClient = new BlobServiceClient(
new Uri("https://[some azure storage]"),
new DefaultAzureCredential());
// Set container name
string containerName = "test";
// Get container
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);
// Create a local file in the ./data/ directory for uploading and downloading
string localPath = "data";
Directory.CreateDirectory(localPath);
string fileName = "testupload" + Guid.NewGuid().ToString() + ".txt";
string localFilePath = Path.Combine(localPath, fileName);
// Write text to the file
await File.WriteAllTextAsync(localFilePath, "Hello, World!");
// Get a reference to a blob
BlobClient blobClient = containerClient.GetBlobClient(fileName);
Console.WriteLine("Uploading to Blob storage as blob:\n\t {0}\n", blobClient.Uri);
// Upload data from the local file
await blobClient.UploadAsync(localFilePath, true);
Also make sure to change the storage account's network access to allow public access from all networks if you're not using a VPN or a dedicated network to reach the Azure environment.

How to Get Azure Event Hub Connection String in C#?

Given an Event Hub name, how can I get its connection string in C#?
I googled a bit, but found nothing useful so far.
Thanks
Using AAD authentication for an EventHub
var credential = new DefaultAzureCredential();
// or use
// var credential = new Azure.Identity.ClientSecretCredential("tenantId", "clientId", "clientSecret");
EventHubProducerClient producerClient = new EventHubProducerClient(txtNamespace.Text, txtEventHub.Text, credential);
var consumerClient = new EventHubConsumerClient(EventHubConsumerClient.DefaultConsumerGroupName, txtNamespace.Text, txtEventHub.Text, credential);
Full example and docs
Acquiring the Connection Strings of configured Access Policies
You can use these two Nuget packages:
Azure.ResourceManager.EventHubs
Azure.Identity
Then you can use the resource group name and the Event Hub namespace name to retrieve the connection string. You will need to iterate over the subscriptions and resource groups if you don't have this information.
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.EventHubs;
ArmClient client = new ArmClient(new DefaultAzureCredential());
// Or use
// ArmClient client = new ArmClient(new Azure.Identity.ClientSecretCredential("tenantId", "clientId", "clientSecret"));
var subscription = await client.GetDefaultSubscriptionAsync();
var resourceGroup = await subscription.GetResourceGroupAsync("myresourcegroup");
var eventhubNamespace = await resourceGroup.Value.GetEventHubsNamespaceAsync("namespacename");
var rules = eventhubNamespace.Value.GetEventHubsNamespaceAuthorizationRules();
foreach (var rule in rules)
{
var keys = await rule.GetKeysAsync();
Console.WriteLine(keys.Value.PrimaryConnectionString);
Console.WriteLine(keys.Value.SecondaryConnectionString);
}
Not sure if this is what you mean, but if you want to access an Event Hub through C#, you need to provide the Event Hub connection string in your code. It can be retrieved by adding a shared access policy for the Event Hub you are trying to access.
Edit: If you are trying to actually construct the connection string yourself, you could follow this sample, where you create the SAS token yourself. But you would still need to provide the primary key that is set on the policy in Azure.

How to read Azure Blob Url?

I am creating a list of blog posts in React and Express with an Azure SQL DB. I am able to use Azure Blob Storage to store the image associated with a post. I can also get the blob URL, which I store in my SQL DB. However, when I try to read the URL directly, it throws a "resource not found" error. After searching the docs and other Stack Overflow answers, I could infer that it has something to do with a SAS token. Can anyone explain a better way to approach this?
https://yourdomain.blob.core.windows.net/imagecontainer/yourimage.png
Below is the nodejs code.
router.post('/image', async function (req, res) {
  try {
    console.log(req.files.files.data);
    const blobName = 'test' + uuidv1() + '.png';
    const containerClient = blobServiceClient.getContainerClient(containerName);
    const blockBlobClient = containerClient.getBlockBlobClient(blobName);
    const uploadBlobResponse = await blockBlobClient.upload(req.files.files.data, req.files.files.data.length);
    res.send({ tempUrl: blockBlobClient.url });
  } catch (e) {
    console.log(e);
  }
});
However when I want to read the url directly it threw an error
resource not found.
Most likely you're getting this error because the blob container containing the blob has a Private ACL and because of that anonymous access is disabled. To enable anonymous access, please change the blob container's ACL to Blob or Public and that will solve this problem.
If you can't (or don't want to) change the blob container's ACL, other option would be to create a Shared Access Signature (SAS) on a blob. A SAS essentially gives time and permission bound access to a blob. For your needs, you would need to create a short-lived SAS token with just Read permission.
To generate a SAS token, you will need to use generateBlobSASQueryParameters method. Once you create a SAS token, you will need to append it to your blob's URL to get a SAS URL.
Here's the sample code to do so. It makes use of the @azure/storage-blob node package.
const { BlobSASPermissions, generateBlobSASQueryParameters, StorageSharedKeyCredential } = require('@azure/storage-blob');
const permissions = new BlobSASPermissions();
permissions.read = true; // set read permission only
const currentDateTime = new Date();
const expiryDateTime = new Date(currentDateTime.setMinutes(currentDateTime.getMinutes() + 5)); // expire the SAS token in 5 minutes
const blobSasModel = {
  containerName: 'your-blob-container-name',
  blobName: 'your-blob-name',
  permissions: permissions,
  expiresOn: expiryDateTime
};
const sharedKeyCredential = new StorageSharedKeyCredential('your-storage-account-name', 'your-storage-account-key');
const sasToken = generateBlobSASQueryParameters(blobSasModel, sharedKeyCredential);
const sasUrl = blockBlobClient.url + "?" + sasToken.toString(); // return this SAS URL to the client

Copy blob from one storage account to another using #azure/storage-blob

What would be the best way to copy a blob from one storage account to another storage account using #azure/storage-blob?
I would imagine using streams would be best instead of downloading and then uploading, but would like to know if the code below is the correct/optimal implementation for using streams.
const srcCredential = new ClientSecretCredential(<src-ten-id>, <src-client-id>, <src-secret>);
const destCredential = new ClientSecretCredential(<dest-ten-id>, <dest-client-id>, <dest-secret>);
const srcBlobClient = new BlobServiceClient(<source-blob-url>, srcCredential);
const destBlobClient = new BlobServiceClient(<dest-blob-url>, destCredential);
const sourceContainer = srcBlobClient.getContainerClient("src-container");
const destContainer = destBlobClient.getContainerClient("dest-container");
const sourceBlob = sourceContainer.getBlockBlobClient("blob");
const destBlob = destContainer.getBlockBlobClient(sourceBlob.name)
// copy blob
await destBlob.uploadStream((await sourceBlob.download()).readableStreamBody);
Your current approach downloads the source blob and then re-uploads it, which is not optimal.
A better approach would be to make use of async copy blob. The method you would want to use is beginCopyFromURL(string, BlobBeginCopyFromURLOptions). You would need to create a Shared Access Signature URL on the source blob with at least Read permission. You can use generateBlobSASQueryParameters SDK method to create that.
const sourceBlob = sourceContainer.getBlockBlobClient("blob");
const destBlob = destContainer.getBlockBlobClient(sourceBlob.name);
const sourceBlobSasUrl = GenerateSasUrlWithReadPermissionOnSourceBlob(sourceBlob);
// copy blob
await destBlob.beginCopyFromURL(sourceBlobSasUrl);
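beginCopyFromURL returns a poller, so you can wait for the server-side copy to finish before reporting success. A small sketch of that wiring (untested against a live account; assumes sourceBlobSasUrl is a valid SAS URL with Read permission):

```javascript
// Sketch: start a server-side copy and wait for it to complete.
// destBlob is a BlockBlobClient; sourceBlobSasUrl is a SAS URL on the source blob.
async function copyBlobAndWait(destBlob, sourceBlobSasUrl) {
  const poller = await destBlob.beginCopyFromURL(sourceBlobSasUrl);
  const result = await poller.pollUntilDone(); // resolves when copyStatus reaches a terminal state
  return result.copyStatus; // "success" when the copy completed
}
```

Because the copy runs server-side between the two storage accounts, no blob data passes through your machine, unlike the download/upload approach.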
