How do I authenticate with the @azure/identity module in Node.js?

I am using a node.js application (v12.18.2, not in the browser) to access an Azure blob store. My existing code using @azure/storage-blob v10.5.0 is working and the authentication code looks like this:
const Azure = require( '@azure/storage-blob' );
let containerUriWithSAS = `${credentials.container_uri}?${credentials.sas_token}`;
let pipeline = Azure.StorageURL.newPipeline( new Azure.AnonymousCredential() );
let ContainerURL = new Azure.ContainerURL( containerUriWithSAS, pipeline );
Using this code to authenticate and then using, for example, ContainerURL.listBlobFlatSegment() to list objects works perfectly. I can create, get, delete, and list objects.
When I upgraded to @azure/storage-blob v12.1.2, there were some breaking changes. Now my code looks like:
//const { DefaultAzureCredential } = require( '@azure/identity' ); // tried this instead of AnonymousCredential
const { BlobServiceClient, AnonymousCredential } = require( '@azure/storage-blob' );
let containerUriWithSAS = `${credentials.container_uri}?${credentials.sas_token}`;
//let defaultAzureCredential = new DefaultAzureCredential();
let anonymousCredential = new AnonymousCredential();
let blobServiceClient = new BlobServiceClient( containerUriWithSAS, anonymousCredential );
const containerName = 'MyContainer';
const containerClient = blobServiceClient.getContainerClient( containerName );
const createContainerResponse = await containerClient.create();
On one (Linux) machine, I cannot connect to the server at all (the create() call times out). On another (Windows), the create() call throws an error which tells me that "The requested URI does not represent any resource on the server".
I've verified that the URI is exactly the same as one used by the working code but obviously I'm missing something in my understanding of the authentication process. How do I make my new code do what my old code did?
Also, it seems that I have to create a container before I can create objects, which I didn't have to do before. Is that part of my confusion?

BlobServiceClient should be created as shown below (not with the container URI, as you are doing). Also note that you don't need AnonymousCredential.
const { BlobServiceClient } = require("@azure/storage-blob");

const account = "<account name>";
const sas = "?<service SAS token>"; // include the leading "?"
const blobServiceClient = new BlobServiceClient(
    `https://${account}.blob.core.windows.net${sas}`
);
const containerName = 'MyContainer';
const containerClient = blobServiceClient.getContainerClient(containerName);
// and go on doing your stuff

Related

How to check existence of soft deleted file in Azure blob container with node js?

I have a file which was stored in some Azure blob directory "folder1/folder2/file.txt". This file was soft-deleted; I can see it in the Azure web console. I need a function which checks whether this file exists.
I tried the library "azure-storage". It works perfectly with NOT removed files:
const blobService = azure.createBlobService(connectingString);
blobService.doesBlobExist(container, blobPath, callback)
Maybe someone knows how to use the same approach with soft-removed files?
I also tried the "@azure/storage-blob" library.
But I got stuck among the endless entities there (BlobServiceClient, ContainerItem, BlobClient, ContainerClient, etc.) and couldn't find a way to check for a particular file in a particular blob directory.
Following this MS doc, I was able to restore the soft-deleted blobs and get their names with the code snippet below.
const { BlobServiceClient } = require('@azure/storage-blob');
const connstring = "DefaultEndpointsProtocol=https;AccountName=kvpstorageaccount;AccountKey=<Storage_Account_Key>;EndpointSuffix=core.windows.net";
if (!connstring) throw Error('Azure Storage Connection string not found');
const blobServiceClient = BlobServiceClient.fromConnectionString(connstring);

async function main() {
    const containerName = 'kpjohncontainer';
    const blobName = 'TextFile05.txt';
    const containerClient = blobServiceClient.getContainerClient(containerName);
    await undeleteBlob(containerClient, blobName);
}

main()
    .then(() => console.log(`done`))
    .catch((ex) => console.log(ex.message));

async function undeleteBlob(containerClient, blobName) {
    const blockBlobClient = containerClient.getBlockBlobClient(blobName); // getBlockBlobClient is synchronous
    await blockBlobClient.undelete(); // restore the soft-deleted blob
    console.log(`undeleted blob ${blobName}`);
}
To check if the blob exists, and if it exists whether it's in a soft-deleted state, I found the relevant code, but it's in C#, provided by @Gaurav Mantri. To achieve the same in NodeJS, refer here.

Authorization Error from beginCopyFromURL API from Javascript library (@azure/storage-blob) when executed from minikube

I have an application registered in Azure and it has the Storage Account Contributor role. I am trying to copy content from one account to another in the same subscription by using a SAS token. Below is a code snippet for testing purposes. This code works perfectly fine from standalone Node.js, but it fails with authorization error code 403 when deployed in a minikube pod. Any suggestions/thoughts will be appreciated.
I have verified the start and end dates of the signature.
Permissions are broader than needed, but they seem to be correct.
For testing, I'm keeping the expiry at 24 hrs.
If I copy the SAS URL generated by the failing code, I can download the file from my host machine using the azcopy command line. It looks like the code fails only when executed from the minikube pod.
const { ClientSecretCredential } = require("@azure/identity");
const { BlobServiceClient, UserDelegationKey, ContainerSASPermissions, generateBlobSASQueryParameters } = require("@azure/storage-blob");

module.exports = function () {
    /*
    This function will receive an input that conforms to the schema specified in
    activity.json. The output is a callback function that follows node's error first
    convention. The first parameter is either null or an Error object. The second parameter
    of the output callback should be a JSON object that conforms to the schema specified
    in activity.json
    */
    this.execute = async function (input, output) {
        try {
            if (input.connection) {
                const containerName = input.sourcecontainer.trim();
                const credential = new ClientSecretCredential(input.connection.tenantId, input.connection.clientid, input.connection.clientsecret);
                // Enter your storage account name
                const account = input.sourceaccount.trim();
                const accounturl = 'https://'.concat(account).concat('.blob.core.windows.net');
                const blobServiceClient = new BlobServiceClient(accounturl, credential);
                const keyStart = new Date();
                const keyExpiry = new Date(new Date().valueOf() + 86400 * 1000);
                const userDelegationKey = await blobServiceClient.getUserDelegationKey(keyStart, keyExpiry);
                console.log(userDelegationKey);
                const containerSAS = generateBlobSASQueryParameters({
                    containerName,
                    permissions: ContainerSASPermissions.parse("racwdl"),
                    startsOn: new Date(),
                    expiresOn: new Date(new Date().valueOf() + 86400 * 1000),
                }, userDelegationKey, account).toString();
                const target = '/' + containerName + '/' + input.sourcefolder.trim() + '/' + input.sourcefilename.trim();
                const sastoken = accounturl + target + '?' + containerSAS;
                console.log(sastoken);
                let outputData = {
                    "sourcesas": sastoken
                };
                // Testing second action execution from same action for testing purpose.
                const containerName2 = 'targettestcontainer';
                const credential2 = new ClientSecretCredential(input.connection.tenantId, input.connection.clientid, input.connection.clientsecret);
                // Enter your storage account name
                const blobServiceClient2 = new BlobServiceClient(accounturl, credential2);
                const destContainer = blobServiceClient2.getContainerClient(containerName2);
                const destBlob = destContainer.getBlobClient('testfolder01' + '/' + 'test-code.pdf');
                const copyPoller = await destBlob.beginCopyFromURL(outputData.sourcesas);
                const result = await copyPoller.pollUntilDone();
                return output(null, outputData);
            }
        } catch (e) {
            console.log(e);
            return output(e, null);
        }
    };
};
Thank you EmmaZhu-MSFT for providing the solution. A similar issue was also raised on GitHub. Posting this as an answer to help other community members.
From the service-side log, it seems there's a time skew between the Azure Storage service and the client: the start time used in the source SAS token was later than the server time.
We'd suggest not using a start time in the SAS token, to avoid this kind of failure caused by time skew.
Reference : https://github.com/Azure/azure-sdk-for-js/issues/21977

(Azure Storage - nodeJS) Getting SAS policies that are applied on blob container and queue

I'm trying to get the expiration date of the SAS policies that are applied on blob container and queue.
I'm able to get the information via PowerShell with Get-AzStorageQueueStoredAccessPolicy and Get-AzStorageContainerStoredAccessPolicy, but I cannot find a way to do the same via Node.js.
I went through the MS Node SDK for storage; I was able to find a way to set up a SAS policy, but not to retrieve an existing one.
Do I need to go through MS Graph?
Thank you for your help.
To get the access policies for a blob container, the method you would want to use is getAccessPolicy(), which is in the ContainerClient class.
import { BlobServiceClient } from '@azure/storage-blob';

const connectionString = "your-storage-account-connection-string";
const containerName = "your-container-name";
const blobServiceClient = BlobServiceClient.fromConnectionString(connectionString);
const containerClient = blobServiceClient.getContainerClient(containerName);
const accessPolicyResult = await containerClient.getAccessPolicy();
console.log(accessPolicyResult.signedIdentifiers); // access policies are defined in "signedIdentifiers"
Similarly, to get the access policies for a queue, the method you would want to use is getAccessPolicy(), which is in the QueueClient class.
import { QueueServiceClient } from '@azure/storage-queue';

const connectionString = "your-storage-account-connection-string";
const queueName = "your-queue-name";
const queueServiceClient = QueueServiceClient.fromConnectionString(connectionString);
const queueClient = queueServiceClient.getQueueClient(queueName);
const accessPolicyResult = await queueClient.getAccessPolicy();
console.log(accessPolicyResult.signedIdentifiers); // access policies are defined in "signedIdentifiers"

Azure - create deployment slot node.js - ResourceNotFound

I am trying to create a deployment slot on Azure using the Node.js SDK (npm package @azure/arm-appservice).
This is how I call the method:
const msRest = require('@azure/ms-rest-nodeauth');
const armWebsite = require('@azure/arm-appservice');

async function createDeploymentSlot() {
    const credentials = await msRest.loginWithServicePrincipalSecret("clientId", "secret", "domain");
    const webManagementClient = new armWebsite.WebSiteManagementClient(credentials, "subscriptionId");
    const deployment = {
        location: "westus"
    };
    return webManagementClient.webApps.createDeploymentSlot(
        "test-resource-group-name",
        "test-app-name",
        "", // ID of an existing deployment??
        "test-new-slot-name",
        deployment
    );
}
And I am getting the following error:
Error: The Resource 'Microsoft.Web/sites/test-app-name/slots/test-new-slot-name' under resource group 'test-resource-group-name' was not found.
Which is very weird, because it says it can't find the resource I am trying to create.
What am I doing wrong?
Thanks
If you want to create a deployment slot for your web app, you need to use the createOrUpdateSlot method; note that the location needs to be the same as the web app's.
Sample:
const msRest = require('@azure/ms-rest-nodeauth');
const armWebsite = require('@azure/arm-appservice');

async function main() {
    const credentials = await msRest.loginWithServicePrincipalSecret("clientId", "secret", "domain");
    const webManagementClient = new armWebsite.WebSiteManagementClient(credentials, "subscriptionId");
    const siteEnvelope = '{"location":"centralus","enabled":true}';
    const obj = JSON.parse(siteEnvelope);
    const a = await webManagementClient.webApps.createOrUpdateSlot(
        "groupname",
        "joywebapp",
        obj,
        "slot2"
    );
    console.log(a);
}

main();

StorageSharedKeyCredential - Cannot set property 'accountName' of undefined

I've been following the tutorial in the documentation for the npm package "@azure/storage-blob" for creating the Azure blob service client. I'm using the alternative method of using the storage account's account name and key to generate a StorageSharedKeyCredential object, but I'm getting an error where the package is trying to assign the account name to an undefined object.
I've taken my account name and key from the "Settings" -> "Access keys" tab in the Azure portal but can't see where I'm going wrong. If anyone can point me in the right direction on what I should check or change, it would be greatly appreciated.
Snippet of my code trying to create the StorageSharedKeyCredential object:
const azureAccount = 'some_account_name';
const azureAccountKey = 'some_account_key';
let azureSharedKeyCredential = StorageSharedKeyCredential(azureAccount, azureAccountKey);
Snippet below taken from node_modules/@azure/storage-blob/dist/index.js:
function StorageSharedKeyCredential(accountName, accountKey) {
    var _this = _super.call(this) || this;
    _this.accountName = accountName; // <---- line reporting the undefined object
    _this.accountKey = Buffer.from(accountKey, "base64");
    return _this;
}
I can repro your issue. As @juunas said, you should use new StorageSharedKeyCredential(account, accountKey);
Try the code below to get all blobs in a container as a test; it works perfectly for me:
const { BlobServiceClient, StorageSharedKeyCredential } = require("@azure/storage-blob");

const account = "<your storage name>";
const accountKey = "<storage key>";
const containerName = "<container name>";

const sharedKeyCredential = new StorageSharedKeyCredential(account, accountKey);
const blobServiceClient = new BlobServiceClient(
    `https://${account}.blob.core.windows.net`,
    sharedKeyCredential
);

async function main() {
    const containerClient = blobServiceClient.getContainerClient(containerName);
    let i = 1;
    let iter = containerClient.listBlobsFlat(); // returns an async iterator; no await needed here
    for await (const blob of iter) {
        console.log(`Blob ${i++}: ${blob.name}`);
    }
}

main();
