Environment variable not being picked up for Azure - node.js

I have defined environment variables for Azure in the terminal as below:
SET AZURE_STORAGE_ACCOUNT=accountname
SET SAS_TOKEN="sr=c&sp=rwl&sig=signatureKey%3D&sv=2017-04-17&se=2018-03-10"
After defining the variables, I read them inside the Azure Blob Storage function like this:
var AZURE_STORAGE_ACCOUNT = process.env.AZURE_STORAGE_ACCOUNT;
var SAS_TOKEN = process.env.SAS_TOKEN;
var blobUri = "http://" + AZURE_STORAGE_ACCOUNT + ".blob.core.windows.net";
var blobService = azureStorage.createBlobServiceWithSas(blobUri, SAS_TOKEN)
    .withFilter(new azureStorage.ExponentialRetryPolicyFilter());
blobService.createBlockBlobFromLocalFile('mycontainer', 'sparks-events-data', fileToWrite, function (error, result, response) {
    if (!error) {
        console.log("upload successful..");
    } else {
        console.log(error);
    }
});
When I run the above file I get an error like:
StorageError: Server failed to authenticate the request. Make sure
the value of Authorization header is formed correctly including the
signature.
But when I put the SAS token directly inside the code it works fine. I am using it like this:
var sasKey = "sr=c&sp=rwl&sig=signatureKey%3D&sv=2017-04-17&se=2018-03-10";
var blobService = azureStorage.createBlobServiceWithSas(blobUri, sasKey ).withFilter(new azureStorage.ExponentialRetryPolicyFilter());
It works fine for me, but I need to set the SAS token as an environment variable. What am I missing? Please suggest a solution.
Thanks in advance,

On Linux/macOS, export those env vars instead. Exporting a variable causes it to be inherited by processes subsequently started from that shell.
export SAS_TOKEN="..."
You may also want to take a look at this npm package before rolling your own env-var secrets handling; this problem has already been solved:
https://github.com/motdotla/dotenv
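A minimal sketch of the dotenv approach (assuming dotenv is installed via npm install dotenv and a .env file sits in the project root):
// .env (never commit this file):
// AZURE_STORAGE_ACCOUNT=accountname
// SAS_TOKEN=sr=c&sp=rwl&sig=signatureKey%3D&sv=2017-04-17&se=2018-03-10

// app.js
require('dotenv').config(); // loads .env into process.env at startup
var sasToken = process.env.SAS_TOKEN; // no shell-quoting pitfalls this way
console.log('SAS token present:', Boolean(sasToken));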
In Windowsland, just use set, but make sure your Node process starts in the same terminal window:
C:\lab> set KEY="secret"
C:\lab> type app.js
let key = process.env.KEY;
console.log('KEY is ' + key);
C:\lab> node app.js
KEY is "secret"
Watch out for those quotes!
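With cmd.exe the quotes become part of the value, which is exactly what breaks the SAS signature above. A workaround (a sketch, assuming cmd.exe rather than PowerShell) is to quote the whole assignment, which protects special characters like & without storing the quotes, or to strip stray quotes defensively in code:
C:\lab> set "SAS_TOKEN=sr=c&sp=rwl&sig=signatureKey%3D&sv=2017-04-17&se=2018-03-10"

// or, in app.js, tolerate quoted values:
var sasToken = (process.env.SAS_TOKEN || '').replace(/^"|"$/g, '');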

Related

Why does my Firebase/Postmark email despatch function fail to deploy?

Firebase deployment of my function fails with the following error:
"Function failed on loading user code. This is likely due to a bug in the user code."
Here's the function code:
const postmark = require("postmark");
const functions = require("firebase-functions");
const admin = require("firebase-admin");
admin.initializeApp();

exports.sendPostmarkEmailFunction = functions.firestore
    .document('/postmarklogs/{documentId}')
    .onCreate(async (snapShot, context) => {
        var serverToken = "_my_client_key_";
        var client = new postmark.ServerClient(serverToken);
        try {
            // await the send so the try/catch can actually observe
            // failures from the promise that sendEmail returns
            await client.sendEmail({
                "From": "_my_despatch_address_",
                "To": "_my_receipt_address_",
                "Subject": snapShot.data().subject,
                "HtmlBody": snapShot.data().message
            });
            return true;
        } catch (error) {
            console.log("Error : " + error.ErrorCode + " : " + error.Message);
            return false;
        }
    });
This code works just fine in the Firebase emulator. As far as I can see, the deployment issue is triggered specifically by the const postmark = require("postmark"); line. If I comment this out, the function deploys - but then of course it doesn't work!
Advice would be greatly appreciated.
Postmark needs to be installed in the project's 'functions' folder. I'd installed it into the body of the project, so Postmark was missing from the 'functions/package.json' file that guides the deployment's build stage. The 'functions' folder created by firebase init functions is like a project within a project and needs to be treated as such.
I got onto the problem from the deployment "build" logs.
Once Postmark had been installed in the 'functions' folder my sendPostmarkEmailFunction function worked perfectly.
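For reference, the fix boils down to installing the dependency inside the functions folder so it lands in functions/package.json (a sketch, assuming npm):
cd functions
npm install postmark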
In passing, unless you already know this: the Postmark API token really needs to be squirreled securely away in the Firebase environment variable store (see the sketch below). Also, while you might be tempted to use an https.onRequest trigger rather than the onCreate() used here, you might like to know that this is likely to land you in endless CORS issues when used with Postmark.
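A minimal sketch of that env-var storage, assuming the legacy functions.config() API and a hypothetical postmark.token key:
// One-time setup from the CLI:
//   firebase functions:config:set postmark.token="YOUR_SERVER_TOKEN"
const functions = require("firebase-functions");

// Read the token from Firebase runtime config instead of hard-coding it
const serverToken = functions.config().postmark.token;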

Is it safe to store public keys/policies in a node.js constant in Lambda

I am writing an AWS Lambda authorizer in Node.js. We are required to call the Azure AD API to fetch the public keys/security policies to validate the incoming access token.
However, to optimize performance, I decided to store the public keys/security policies in Node.js as a constant (this stays alive as long as the Lambda execution environment is warm, or until the keys' TTL expires).
Question: Is it safe from a security perspective? I want to avoid "caching" it in DynamoDB, as calls to DynamoDB would also incur additional milliseconds. Ours is a very high-traffic application and we would like to save every millisecond possible for optimal performance. Also, any best practices are highly appreciated.
Typically, you should not hard-code things like that in your code. Even though it is not a security problem, it makes maintenance harder.
For example: when the key is "rotated" or the policy changes and you had it hard-coded in your Lambda, you would need to update your code and do another deployment. This often causes issues because the developer forgets about it, and suddenly your authorizer does not work anymore. If the Lambda loads the information from an external service like S3, SSM or Azure AD directly, you don't need another deployment. In theory, it should sort itself out, depending on which service you use and how you manage your keys.
I think the best way is to load the key from an external service during the initialisation phase of the Lambda, i.e. when it is "booted" for the first time, and then cache that value for the duration of the Lambda's lifetime (a few minutes to a few hours).
You could for example load the public keys and policies either directly from Azure, from S3 or SSM Parameter Store.
The following code uses v3 of the AWS SDK for JavaScript, which is not bundled with the Lambda runtime. You can use v2 of the SDK as well.
const { SSMClient, GetParameterCommand } = require("@aws-sdk/client-ssm");

// This only happens once, when the Lambda execution environment
// is started for the first time (cold start):
const init = async () => {
    const config = {};
    // use whatever 'paramName' you defined when you created the SSM parameter
    const paramName = "/azure/publickey";
    try {
        const ssm = new SSMClient();
        const command = new GetParameterCommand({ Name: paramName });
        const data = await ssm.send(command);
        config["publickey"] = data.Parameter.Value;
    } catch (error) {
        throw new Error("unable to read SSM parameter '" + paramName + "'.");
    }
    return config;
};

// Kick off initialisation once per execution environment;
// every invocation awaits the same promise.
const initPromise = init();

exports.handler = async (event) => {
    const config = await initPromise;
    console.log("My public key '%s'", config.publickey);
    return "Hello World";
};
The most important part of this code is the init function, which runs only once per execution environment and creates a "config" that should contain your AWS SDK clients and all the configuration your code needs. This way, you don't have to fetch the policy for every request the Lambda processes.

Load credentials from environment variables

I want to call some Fortify APIs using Node.js, but for that I need to supply administrator credentials.
How can I load the credentials from my environment variables?
Example:
SSCURL = xyz
SSCUSERNAME=changeme
SSCPASSWORD=changeme
Node.js exposes environment variables on the process.env object. For example:
const SSCURL = process.env.SSCURL;
const SSCUSERNAME = process.env.SSCUSERNAME;
const SSCPASSWORD = process.env.SSCPASSWORD;
You can use these as normal constants afterwards...
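A minimal sketch of putting them to use (assuming Node 18+ for the built-in fetch, and a hypothetical /api/v1/projects endpoint; substitute whatever Fortify SSC route you actually call):
const SSCURL = process.env.SSCURL;
const SSCUSERNAME = process.env.SSCUSERNAME;
const SSCPASSWORD = process.env.SSCPASSWORD;

async function listProjects() {
    // HTTP Basic auth header built from the env-provided credentials
    const auth = Buffer.from(SSCUSERNAME + ':' + SSCPASSWORD).toString('base64');
    const res = await fetch(SSCURL + '/api/v1/projects', {
        headers: { 'Authorization': 'Basic ' + auth }
    });
    if (!res.ok) throw new Error('Request failed: ' + res.status);
    return res.json();
}

listProjects().then(console.log).catch(console.error);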

Azure Key Vault, request for multiple secrets

I'm making use of the node library azure-keyvault to retrieve stored secrets from Azure Key Vault. I've only found the client.getSecret API exposed to retrieve a secret value. I'm searching for a way to retrieve multiple secret values in one call, but haven't found one yet. Is there a way to do this that I'm missing, or is it simply not supported?
const { SecretClient } = require('@azure/keyvault-secrets');
const { DefaultAzureCredential } = require('@azure/identity');

const client = new SecretClient(
    `https://${KEYVAULT_NAME}.vault.azure.net`,
    new DefaultAzureCredential()
);

// inside an async function: there is no batch-get on the service,
// but Promise.all fetches the secrets concurrently
const [secret1, secret2] = await Promise.all([
    client.getSecret(`secret1`),
    client.getSecret(`secret2`)
]);
Here is the complete code for getting multiple secrets at once (using the older azure-keyvault package):
var KeyVault = require('azure-keyvault');

// 'authenticator', 'vaultUri' and 'options' are assumed to be defined elsewhere
var credentials = new KeyVault.KeyVaultCredentials(authenticator);
var client = new KeyVault.KeyVaultClient(credentials);

client.setSecret(vaultUri, 'mysecret', 'my password', options, function (err, secretBundle) {
    // List all secrets
    var parsedId = KeyVault.parseSecretIdentifier(secretBundle.id);
    client.getSecrets(parsedId.vault, parsedId.name, function (err, result) {
        if (err) throw err;
        // Page through the results via nextLink
        var loop = function (nextLink) {
            if (nextLink !== null && nextLink !== undefined) {
                client.getSecretsNext(nextLink, function (err, res) {
                    console.log(res);
                    loop(res.nextLink);
                });
            }
        };
        console.log(result);
        loop(result.nextLink);
    });
});
You can find the complete reference for Azure Key Vault with Node.js below:
http://azure.github.io/azure-sdk-for-node/azure-keyvault/latest/KeyVaultClient.html#getSecrets
http://azure.github.io/azure-sdk-for-node/azure-keyvault/latest/
Hope it helps.
You can use the read-azure-secrets npm package, which will return all secrets to you.
E.g.
const secretClient = require('read-azure-secrets');

async function loadKeyVaultValues() {
    let applicationID = '';
    let applicationSecret = '';
    let vaultURL = 'https://<your-key-vault-name>.vault.azure.net/';
    let secrets = await secretClient.getSecrets(applicationID, applicationSecret, vaultURL);
    secrets.forEach(secret => {
        console.log(secret);
    });
}
loadKeyVaultValues();
You can try using the client.getSecrets(..) method, which wraps the corresponding REST API. Kindly go through the following useful blog, in which all the methods have been implemented.
LINK: https://www.red-gate.com/simple-talk/cloud/platform-as-a-service/using-azure-keyvault-with-node-js/
You haven't specified what information about the secret you want to fetch, so I am going to assume that you are looking for the secret's value. I am also going to assume you want to minimize network traffic when fetching multiple secrets (either for cost or for performance).
Looking at the Azure REST API documentation, while there is a route to list multiple secrets, it only provides the secret identifier and metadata about each secret (attributes, tags, etc.). So if you want a secret's value (the actual secret) you will need to make individual calls, although the get-secrets route can be used to discover all the secrets stored in the Key Vault.
As for the client library, @azure/keyvault-secrets maps pretty closely to the REST API it supports, so it does not provide a method that fetches multiple secrets. Even if it did, it would just be a facade over multiple network calls, so it would not reduce the number of network trips.
So to answer your question - it does not look possible today unless all you want is metadata about the secret and not the secret value itself.
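If you do need the values for every secret, here is a sketch of the usual pattern (assuming @azure/keyvault-secrets and @azure/identity): list the names via the paged route, then fan out individual getSecret calls concurrently:
const { SecretClient } = require('@azure/keyvault-secrets');
const { DefaultAzureCredential } = require('@azure/identity');

async function getAllSecrets(vaultUrl) {
    const client = new SecretClient(vaultUrl, new DefaultAzureCredential());
    // One paged listing for names/metadata only...
    const names = [];
    for await (const props of client.listPropertiesOfSecrets()) {
        names.push(props.name);
    }
    // ...then one getSecret network call per value, issued concurrently
    return Promise.all(names.map(name => client.getSecret(name)));
}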

DocumentDB Stored Procs: what does the EnableScriptLogging option do?

The DocumentDB APIs for working with stored procedures take an optional RequestOptions argument that has, among others, the property EnableScriptLogging.
The help page for it is useless. The description for it is:
EnableScriptLogging is used to enable/disable logging in JavaScript stored procedures.
Mkay... so how do I log something? (assuming that's console.log(...))
And more importantly, how do I read the logs generated by stored procedures?
I was expecting the response of requests to stored procedures would somehow contain the logs, but can't find anything.
Yes, this is for console.log from the script. It must be enabled by the client (it is turned off by default, so console.log in a script is ignored); essentially this sets an HTTP header on the request. In a script you can do something like this:
function myScript() {
  console.log("This is trace log");
}
The log will be returned in a response header (x-ms-documentdb-script-log-results) and is also accessible via the SDK.
If you use the C# SDK, you can use it like this:
var options = new RequestOptions { EnableScriptLogging = true };
var response = await client.ExecuteStoredProcedureAsync<string>(sprocUri, options);
Console.WriteLine(response.ScriptLog);
If you use the Node.js SDK:
var lib = require("documentdb");
var Client = lib.DocumentClient;
var client = new Client("https://xxxxxxx.documents.azure.com:443/", { masterKey: "xxxxxxxxxxxx" });
var sprocLink = ...;
client.executeStoredProcedure(sprocLink, "input params", { partitionKey: {}, enableScriptLogging: true }, function (err, res, responseHeaders) {
    console.log(responseHeaders[lib.Constants.HttpHeaders.ScriptLogResults]);
});
Current limitations:
only enabled for stored procedures
\n is not supported (should be fixed soon)
not supported when script throws unhandled exception (should be fixed soon)
