I've been following the tutorial in the documentation for creating the Azure blob service client with the npm package "@azure/storage-blob". I'm using the alternative method of using the storage account's account name and key to generate a "StorageSharedKeyCredential" object, but I'm getting an error where the package is trying to assign the account name to an undefined object.
I've taken my account name and key from the "Settings" -> "Access keys" tab in the azure portal but can't see where I'm going wrong. If anyone can point me in the right direction on what I should check or change it would be greatly appreciated.
Snippet of my code trying to create the StorageSharedKeyCredential object:
const azureAccount = 'some_account_name';
const azureAccountKey = 'some_account_key';
let azureSharedKeyCredential = StorageSharedKeyCredential(azureAccount, azureAccountKey);
Snippet below taken from node_modules/@azure/storage-blob/dist/index.js:
function StorageSharedKeyCredential(accountName, accountKey) {
    var _this = _super.call(this) || this;
    _this.accountName = accountName; // <---- line reporting the undefined object
    _this.accountKey = Buffer.from(accountKey, "base64");
    return _this;
}
I can repro your issue. As @juunas said, you should use new StorageSharedKeyCredential(account, accountKey);
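Applied to your snippet, the only change needed is adding the new keyword when constructing the credential:
let azureSharedKeyCredential = new StorageSharedKeyCredential(azureAccount, azureAccountKey);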
Try the code below, which lists all blobs in a container, as a test; it works perfectly for me:
const { BlobServiceClient, StorageSharedKeyCredential } = require("@azure/storage-blob");

const account = "<your storage name>";
const accountKey = "<storage key>";
const containerName = "<container name>";

const sharedKeyCredential = new StorageSharedKeyCredential(account, accountKey);
const blobServiceClient = new BlobServiceClient(
    `https://${account}.blob.core.windows.net`,
    sharedKeyCredential
);

async function main() {
    const containerClient = blobServiceClient.getContainerClient(containerName);
    let i = 1;
    // listBlobsFlat() returns an async iterator, so no await is needed here
    let iter = containerClient.listBlobsFlat();
    for await (const blob of iter) {
        console.log(`Blob ${i++}: ${blob.name}`);
    }
}

main();
Result:
I have an application registered in Azure and it has the Storage Account Contributor role. I am trying to copy content from one account to another in the same subscription by using a SAS token. Below is a code snippet for testing purposes. This code works perfectly fine from standalone Node.js, but it fails with Authorization error code 403 when deployed in a minikube pod. Any suggestions/thoughts will be appreciated.
I have verified the start and end dates of the signature.
The permissions are broader than needed, but they seem to be correct.
For testing, I'm keeping the expiry at 24 hours.
If I copy the SAS URL generated by the failing code, I can download the file from my host machine using the azcopy command line. It looks like the code fails only when executed from the minikube pod.
const { ClientSecretCredential } = require("@azure/identity");
const { BlobServiceClient, UserDelegationKey, ContainerSASPermissions, generateBlobSASQueryParameters } = require("@azure/storage-blob");

module.exports = function () {
    /*
    This function will receive an input that conforms to the schema specified in
    activity.json. The output is a callback function that follows node's error-first
    convention. The first parameter is either null or an Error object. The second parameter
    of the output callback should be a JSON object that conforms to the schema specified
    in activity.json
    */
    this.execute = async function (input, output) {
        try {
            if (input.connection) {
                const containerName = input.sourcecontainer.trim();
                const credential = new ClientSecretCredential(input.connection.tenantId, input.connection.clientid, input.connection.clientsecret);
                const { BlobServiceClient } = require("@azure/storage-blob");
                // Enter your storage account name
                const account = input.sourceaccount.trim();
                const accounturl = 'https://'.concat(account).concat('.blob.core.windows.net');
                const blobServiceClient = new BlobServiceClient(
                    accounturl,
                    credential);
                const keyStart = new Date();
                const keyExpiry = new Date(new Date().valueOf() + 86400 * 1000);
                const userDelegationKey = await blobServiceClient.getUserDelegationKey(keyStart, keyExpiry);
                console.log(userDelegationKey);
                const containerSAS = generateBlobSASQueryParameters({
                        containerName,
                        permissions: ContainerSASPermissions.parse("racwdl"),
                        startsOn: new Date(),
                        expiresOn: new Date(new Date().valueOf() + 86400 * 1000),
                    },
                    userDelegationKey, account).toString();
                const target = '/' + containerName + '/' + input.sourcefolder.trim() + '/' + input.sourcefilename.trim();
                const sastoken = accounturl + target + '?' + containerSAS;
                console.log(sastoken);
                let outputData = {
                    "sourcesas": sastoken
                };
                // Testing second action execution from the same action, for testing purposes.
                const containerName2 = 'targettestcontainer';
                const credential2 = new ClientSecretCredential(input.connection.tenantId, input.connection.clientid, input.connection.clientsecret);
                // Enter your storage account name
                const blobServiceClient2 = new BlobServiceClient(
                    accounturl,
                    credential2);
                const destContainer = blobServiceClient2.getContainerClient(containerName2);
                const destBlob = destContainer.getBlobClient('testfolder01' + '/' + 'test-code.pdf');
                const copyPoller = await destBlob.beginCopyFromURL(outputData.sourcesas);
                const result = await copyPoller.pollUntilDone();
                return output(null, outputData);
            }
        } catch (e) {
            console.log(e);
            return output(e, null);
        }
    };
};
Thank you EmmaZhu-MSFT for providing the solution. A similar issue was also raised on GitHub. Posting this as an answer to help other community members.
From the service-side log, it seems there's a time skew between the Azure Storage service and the client; the start time used in the source SAS token was later than the server time.
We'd suggest not using a start time in the SAS token, to avoid this kind of failure caused by time skew.
Reference: https://github.com/Azure/azure-sdk-for-js/issues/21977
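In practice that just means omitting startsOn when building the SAS. A minimal sketch, reusing the containerName, userDelegationKey and account from the code above:

// Same SAS parameters as in the question, but without startsOn, so a clock skew
// between the client and the storage service can no longer make the token "not yet valid".
const containerSAS = generateBlobSASQueryParameters({
        containerName,
        permissions: ContainerSASPermissions.parse("racwdl"),
        expiresOn: new Date(new Date().valueOf() + 86400 * 1000),
    },
    userDelegationKey, account).toString();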
I want to use a Key Vault key to create a JWT token and then validate it.
I'm using this code:
public static async Task<string> SignJwt()
{
    var tokenHandler = new JwtSecurityTokenHandler();
    var signinKey = new SymmetricSecurityKey(Encoding.UTF8.GetBytes("this is my custom Secret key for authentication"));
    var tokenDescriptor = new SecurityTokenDescriptor
    {
        Subject = new ClaimsIdentity(new[] { new Claim("id", "1") }),
        Expires = DateTime.UtcNow.AddDays(7),
        SigningCredentials = new SigningCredentials(signinKey, SecurityAlgorithms.HmacSha256Signature)
    };
    var token = tokenHandler.CreateToken(tokenDescriptor);
    return tokenHandler.WriteToken(token);
}
and it works fine. I googled a lot and found this snippet for SigningCredentials using the Identity extensions NuGet package:
new SigningCredentials(new KeyVaultSecurityKey("https://myvault.vault.azure.net/keys/mykey/keyid", new KeyVaultSecurityKey.AuthenticationCallback(azureServiceTokenProvider.KeyVaultTokenCallback)), "RS256")
{
    CryptoProviderFactory = new CryptoProviderFactory() { CustomCryptoProvider = new KeyVaultCryptoProvider() }
}
But it's not clear to me what AuthenticationCallback really is, how to implement it, and whether I'll be able to use it in Azure in a web app or an Azure Function.
Firstly, a JWT consists of 3 parts (header, payload and signature), and all 3 parts are Base64Url-encoded.
To get the signature, you need to generate the header and payload, then combine them with a dot.
Below is sample code to verify a JWT using Azure Key Vault.
// Requires assumed by this snippet (not shown in the original)
const crypto = require('crypto');
const util = require('util');
const base64 = require('base64url');
const { DefaultAzureCredential } = require("@azure/identity");
const { CryptographyClient } = require("@azure/keyvault-keys");

// this.keyClient is assumed to be a KeyClient pointing at your vault
const key = await this.keyClient.getKey(this.KEY_NAME);
const cryptClient = new CryptographyClient(key, new DefaultAzureCredential());

const JWT = ""; // the token to verify
const jwtHeader = JWT.split('.')[0];
const jwtPayload = JWT.split('.')[1];
const jwtSignature = JWT.split('.')[2];

const signature = base64.toBuffer(jwtSignature);
const data = util.format('%s.%s', jwtHeader, jwtPayload);
const hash = crypto.createHash('sha256');
const digest = hash.update(data).digest();

const verified = await cryptClient.verify("RS256", digest, signature);
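For the signing side that the question also asks about, here is a hedged sketch using the same CryptographyClient and the header.payload scheme described above (the claim values are placeholders; signResult.result is the raw signature returned by Key Vault):

// Build the header and payload, join them with a dot, hash with SHA-256,
// then have Key Vault sign the digest with the RSA key.
const header = base64(JSON.stringify({ alg: 'RS256', typ: 'JWT' }));
const payload = base64(JSON.stringify({ id: '1', exp: Math.floor(Date.now() / 1000) + 7 * 24 * 3600 }));
const signingInput = util.format('%s.%s', header, payload);
const digestToSign = crypto.createHash('sha256').update(signingInput).digest();
const signResult = await cryptClient.sign('RS256', digestToSign);
const jwt = util.format('%s.%s', signingInput, base64(Buffer.from(signResult.result)));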
Here are a few SO threads with related discussions: SO1, SO2 and SO3.
I'm using managed identity to access an Azure database in this manner. The Azure App Registration is used for getting the token, and the token is passed to the connection. In the same manner, how do I connect to a storage account and write to a container? What will the scope be in this case?
AuthenticationResult authenticationResult = null;
var _app = ConfidentialClientApplicationBuilder.Create(Environment.GetEnvironmentVariable("ClientId"))
    .WithAuthority(string.Format(Environment.GetEnvironmentVariable("AADInstance"), Environment.GetEnvironmentVariable("Tenant")))
    .WithClientSecret(Environment.GetEnvironmentVariable("ClientSecret")).Build();
authenticationResult = _app.AcquireTokenForClient(new string[] { "https://database.windows.net/.default" }).ExecuteAsync().Result;
using (SqlConnection conn = new SqlConnection(Environment.GetEnvironmentVariable("DBConnection")))
{
    conn.AccessToken = authenticationResult.AccessToken;
    conn.Open();
    using (SqlCommand cmd = new SqlCommand("SELECT * FROM mytable", conn))
    {
        var result = cmd.ExecuteScalar();
        Console.WriteLine(result);
    }
}
Azure Storage uses this scope:
https://storage.azure.com/.default
That said, with the new Azure Storage SDK and Azure.Identity, you don't actually need to know this.
You can use them like this:
var credential = new ClientSecretCredential(tenantId: "", clientId: "", clientSecret: "");
var blobUrl = "https://accountname.blob.core.windows.net";
var service = new BlobServiceClient(new Uri(blobUrl), credential);
var container = service.GetBlobContainerClient("container");
var blob = container.GetBlobClient("file.txt");
// TODO: Write the file
For Azure Storage, the scope will be https://storage.azure.com/.default.
Please see this link for more details: https://learn.microsoft.com/en-us/azure/storage/common/storage-auth-aad-app?tabs=dotnet#azure-storage-resource-id.
I am using a Node.js application (v12.18.2, not in the browser) to access an Azure blob store. My existing code using @azure/storage-blob v10.5.0 is working, and the authentication code looks like this:
const Azure = require( '@azure/storage-blob' );
let containerUriWithSAS = `${credentials.container_uri}?${credentials.sas_token}`;
let pipeline = Azure.StorageURL.newPipeline( new Azure.AnonymousCredential() );
let ContainerURL = new Azure.ContainerURL( containerUriWithSAS, pipeline );
Using this code to authenticate and then using, for example, ContainerURL.listBlobFlatSegment() to list objects works perfectly. I can create, get, delete, and list objects.
When I upgraded to @azure/storage-blob v12.1.2, there were some breaking changes. Now my code looks like:
//const { DefaultAzureCredential } = require( '@azure/identity' ); // tried this instead of AnonymousCredential
const { BlobServiceClient, AnonymousCredential } = require( '@azure/storage-blob' );
let containerUriWithSAS = `${credentials.container_uri}?${credentials.sas_token}`;
//let defaultAzureCredential = new DefaultAzureCredential();
let anonymousCredential = new AnonymousCredential();
let blobServiceClient = new BlobServiceClient( containerUriWithSAS, anonymousCredential );
const containerName = 'MyContainer';
const containerClient = blobServiceClient.getContainerClient( containerName );
const createContainerResponse = await containerClient.create();
On one (Linux) machine, I cannot connect to the server at all (the create() call times out). On another (Windows), the create() call throws an error which tells me that "The requested URI does not represent any resource on the server".
I've verified that the URI is exactly the same as one used by the working code but obviously I'm missing something in my understanding of the authentication process. How do I make my new code do what my old code did?
Also, it seems that I have to create a container before I can create objects, which I didn't have to do before. Is that part of my confusion?
BlobServiceClient should be created like below (not with the container URI as you are doing). Also, note you don't need AnonymousCredential.
const { BlobServiceClient } = require("@azure/storage-blob");
const account = "<account name>";
const sas = "<service Shared Access Signature Token>"; // should include the leading "?"
const blobServiceClient = new BlobServiceClient(
`https://${account}.blob.core.windows.net${sas}`
);
const containerName = 'MyContainer';
const containerClient = blobServiceClient.getContainerClient(containerName);
// and go on doing your stuff
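For example, listing the blobs in the container (roughly what listBlobFlatSegment() did in v10) could look like the sketch below; it assumes the SAS token grants list permission on the container:

async function listMyBlobs() {
    let i = 1;
    // listBlobsFlat() returns an async iterator over all blobs in the container
    for await (const blob of containerClient.listBlobsFlat()) {
        console.log(`Blob ${i++}: ${blob.name}`);
    }
}

listMyBlobs();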
I am trying to create a deployment slot on Azure using the Node.js SDK (npm package @azure/arm-appservice).
This is how I call the method:
const msRest = require('@azure/ms-rest-nodeauth');
const armWebsite = require('@azure/arm-appservice');

async function createDeploymentSlot(){
    const credentials = await msRest.loginWithServicePrincipalSecret("clientId", "secret", "domain");
    const webManagementClient = armWebsite.WebSiteManagementClient(credentials, "subscriptionId");
    const deployment = {
        location: "westus"
    };
    return webManagementClient.webApps.createDeploymentSlot(
        "test-resource-group-name",
        "test-app-name",
        "", // ID of an existing deployment??
        "test-new-slot-name",
        deployment
    );
}
And I am getting the following error:
Error: The Resource 'Microsoft.Web/sites/test-app-name/slots/test-new-slot-name' under resource group 'test-resource-group-name' was not found.
Which is very weird, because it says it can't find the resource I am trying to create.
What am I doing wrong?
Thanks
If you want to create a deployment slot for your web app, you need to use the createOrUpdateSlot method; note that the location needs to be the same as the web app's.
Sample:
const msRest = require('@azure/ms-rest-nodeauth');
const armWebsite = require('@azure/arm-appservice');

async function main(){
    const credentials = await msRest.loginWithServicePrincipalSecret("clientId", "secret", "domain");
    const webManagementClient = new armWebsite.WebSiteManagementClient(credentials, "subscriptionId");
    const siteEnvelope = '{"location":"centralus","enabled":true}';
    const obj = JSON.parse(siteEnvelope);
    const a = await webManagementClient.webApps.createOrUpdateSlot(
        "groupname",
        "joywebapp",
        obj,
        "slot2"
    );
    console.log(a);
}

main();