AWS Lambda: How to store secret to external API? - node.js

I'm building a monitoring tool based on AWS Lambda. Given a set of metrics, the Lambda functions should be able to send SMS messages using the Twilio API. To use the API, Twilio provides an account SID and an auth token.
How and where should I store these secrets?
I'm currently thinking of using AWS KMS, but there might be other, better solutions.

Here is what I've come up with. I'm using AWS KMS to encrypt my secrets into a file that I upload with the code to AWS Lambda, then decrypt them when I need to use them.
Here are the steps to follow.
First create a KMS key. You can find documentation here: http://docs.aws.amazon.com/kms/latest/developerguide/create-keys.html
Then encrypt your secret and put the result into a file. This can be achieved from the CLI with:
aws kms encrypt --key-id some_key_id --plaintext "This is the secret you want to encrypt" --query CiphertextBlob --output text | base64 -D > ./encrypted-secret
Note that -D is the macOS flag for decoding; on Linux, use base64 -d instead.
You then need to upload this file as part of the Lambda. You can decrypt and use the secret in the Lambda as follows.
var fs = require('fs');
var AWS = require('aws-sdk');

var kms = new AWS.KMS({region: 'eu-west-1'});

// Read the encrypted secret that was bundled with the deployment package
var secretPath = './encrypted-secret';
var encryptedSecret = fs.readFileSync(secretPath);

var params = {
  CiphertextBlob: encryptedSecret
};

// Ask KMS to decrypt the ciphertext; the Lambda execution role needs
// kms:Decrypt permission on the key for this call to succeed
kms.decrypt(params, function(err, data) {
  if (err) console.log(err, err.stack);
  else {
    var decryptedSecret = data['Plaintext'].toString();
    console.log(decryptedSecret);
  }
});
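To tie this back to the question's Twilio use case, here is a minimal sketch of a handler using the decrypted values. The twilio package, the "SID:token" file layout, and the phone numbers are assumptions for illustration, not part of the setup above:
var fs = require('fs');
var AWS = require('aws-sdk');
var kms = new AWS.KMS({region: 'eu-west-1'});

exports.handler = function(event, context, callback) {
  // Assumed layout: the encrypted file holds "ACCOUNT_SID:AUTH_TOKEN"
  kms.decrypt({CiphertextBlob: fs.readFileSync('./encrypted-secret')}, function(err, data) {
    if (err) return callback(err);
    var parts = data.Plaintext.toString().split(':');
    var twilioClient = require('twilio')(parts[0], parts[1]);
    twilioClient.messages.create({
      to: '+15550001111',
      from: '+15552223333',
      body: 'Metric alarm!'
    }, callback);
  });
};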
I hope you'll find this useful.

As of AWS Lambda's support for Node.js 4.3, the correct answer is to use environment variables to store sensitive information. This feature integrates with AWS KMS, so you can use your own master key to encrypt the secrets if the default key is not enough.
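A minimal sketch of reading such a variable inside the handler (the variable name TWILIO_AUTH_TOKEN is a placeholder, not something Lambda defines):
// Lambda decrypts environment variables encrypted with the default
// service key transparently, so the handler just reads process.env
exports.handler = function(event, context, callback) {
  var authToken = process.env.TWILIO_AUTH_TOKEN;
  // ...use the token to call the external API...
  callback(null, 'done');
};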

Well...that's what KMS was made for :) And certainly more secure than storing your tokens in plaintext in the Lambda function or delegating to a third-party service.
If you go down this route, check out this blog post for an existing usage example to get up and running faster. In particular, you will need to add the following to your Lambda execution role policy:
"kms:Decrypt",
"kms:DescribeKey",
"kms:GetKeyPolicy",
The rest of the code for the above example is a bit convoluted; to read a secret you should really only need decrypt() in this case.
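For reference, a minimal policy statement covering those actions might look like this (the key ARN is a placeholder):
{
  "Effect": "Allow",
  "Action": [
    "kms:Decrypt",
    "kms:DescribeKey",
    "kms:GetKeyPolicy"
  ],
  "Resource": "arn:aws:kms:eu-west-1:123456789012:key/your-key-id"
}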

There is a blueprint for a Node.js Lambda function that starts off by decrypting an API key from KMS. It provides an easy way to decrypt using a promise interface, and it also gives you the role permissions that you need to grant the Lambda function in order to access KMS. The blueprint can be found by searching for "algorithmia-blueprint".

Whatever you choose to do, you should use a tool like GitMonkey to monitor your code repositories and make sure your keys aren't committed or pushed to them.

Related

How to set a profile on an aws client

I'm trying to create an AWS client for IoT following this article: How can I publish to a MQTT topic in a Amazon AWS Lambda function?
client = boto3.client('iot-data', region_name='us-east-1')
However I need to set a profile so that boto3 picks the correct credentials from my ~/.aws/credentials file.
The articles that describe how to do this (How to choose an AWS profile when using boto3 to connect to CloudFront) use Session instead of creating a client. However iot-data is not a "resource" that you can get from Session.
boto_session = boto3.Session(profile_name='my-profile')
boto_client = boto_session.resource('iot-data', region_name='us-west-1')
When I try the above I get the error:
Consider using a boto3.client('iot-data') instead of a resource for 'iot-data'
And we've achieved full catch-22 status. How can I get an appropriate IoT client using an AWS profile?
IoTDataPlane does not have a resource interface; you can only use a client with the IoTDataPlane:
boto_session.client('iot-data', region_name='us-west-1')

Authenticating a Google Cloud Function as a service account on other Google APIs

I have an HTTP-triggered function running on Google Cloud Functions, which uses require('googleapis').sheets('v4') to write data into a docs spreadsheet.
For local development I added an account via the Service Accounts section of their developer console. I downloaded the token file (dev-key.json below) and used it to authenticate my requests to the Sheets API as follows:
var google = require('googleapis');
var sheets = google.sheets('v4');

var API_ACCT = require("./dev-key.json");

let apiClient = new google.auth.JWT(
  API_ACCT.client_email, null, API_ACCT.private_key,
  ['https://www.googleapis.com/auth/spreadsheets']
);

exports.myFunc = function (req, res) {
  var newRows = extract_rows_from_my_client_app_request(req);
  sheets.spreadsheets.values.append({
    auth: apiClient,
    // ...
    resource: { values: newRows }
  }, function (e) {
    if (e) res.status(500).json({err: "Sheets API is unhappy"});
    else res.status(201).json({ok: true});
  });
};
After I shared my spreadsheet with my service account's "email address" e.g. local-devserver@foobar-bazbuzz-123456.iam.gserviceaccount.com — it worked!
However, as I go to deploy this to the Google Cloud Functions service, I'm wondering if there's a better way to handle credentials? Can my code authenticate itself automatically without needing to bundle a JWT key file with the deployment?
I noticed that there is a FUNCTION_IDENTITY=foobar-bazbuzz-123456@appspot.gserviceaccount.com environment variable set when my function runs, but I do not know how to use this in the auth value to my googleapis call. The code for google.auth.getApplicationDefault does not use that.
Is it considered okay practice to upload a private JWT token along with my GCF code? Or should I somehow be using the metadata server for that? Or is there a built-in way that Cloud Functions already can authenticate themselves to other Google APIs?
It's common to bundle credentials with a function deployment. Just don't check them into your source control. Cloud Functions for Firebase samples do this where needed. For example, creating a signed URL from Cloud Storage requires admin credentials, and this sample illustrates saving that credential to a file to be deployed with the functions.
I'm wondering if there's a better way to handle credentials? Can my code authenticate itself automatically without needing to bundle a JWT key file with the deployment?
Yes. You can use 'Application Default Credentials' instead of how you've done it; note that you don't use the function getApplicationDefault(), as it has been deprecated since this question was posted.
The link above shows how to make a simple call using the google.auth.getClient API, providing the desired scope, and have it decide the credential type needed automatically. On cloud functions this will be a 'Compute' object, as defined in the google-auth-library.
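A minimal sketch of that approach, using the question's Sheets scope (treat the exact call shape as an assumption about your googleapis version):
const {google} = require('googleapis');

async function getSheetsClient() {
  // On Cloud Functions this resolves to the runtime service account
  // via Application Default Credentials; no key file is bundled
  const auth = await google.auth.getClient({
    scopes: ['https://www.googleapis.com/auth/spreadsheets'],
  });
  return google.sheets({version: 'v4', auth});
}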
These docs say it well here...
After you set up a service account, ADC can implicitly find your credentials without any need to change your code, as described in the section above.
Where ADC is Application Default Credentials.
Note that, for Cloud Functions, you use the App Engine service account:
YOUR_PROJECT_ID@appspot.gserviceaccount.com, as documented here. That is the one you found via the FUNCTION_IDENTITY env var; this rather tripped me up.
The final step is to make sure that the service account has the required access as you did with your spreadsheet.

Google Cloud Function : support for Google Cloud KMS

I am using a Google Cloud Function (GCF) with a Pub/Sub trigger which sends an HTTP request to a third-party API.
The GCF receives notifications from a Pub/Sub topic used by a service which should not be aware of the third-party API.
The third-party API requires authentication using HTTP Basic Authentication.
In order not to have to hardcode the password in my source code, I am using Google Cloud KMS to generate a new encrypted key each time I deploy my function, and I use Cloud KMS to decrypt the secret each time the function is instantiated.
For decrypting with KMS I have to provide a service account's private key to the Node.js Google API client.
My main problem today is that I have to push my private key to the GCloud bucket if I want my GCF to work properly.
Is it possible by using either the Runtime Configurator or the Deployment Manager to configure secrets for a Google Cloud Function?
Thank you.
As of December 2019, the preferred way to store and manage secrets on Google Cloud is Secret Manager:
$ echo -n "user:pass" | gcloud beta secrets create "my-basic-auth" \
    --data-file=- \
    --replication-policy "automatic"
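You can read the stored value back from the CLI (same beta command group as above):
$ gcloud beta secrets versions access latest --secret "my-basic-auth"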
You can also create and manage secrets from the API:
// Import the library
const {SecretManagerServiceClient} = require('@google-cloud/secret-manager');

// Create the client
const client = new SecretManagerServiceClient();

// Create the secret
const [secret] = await client.createSecret({
  parent: "projects/<YOUR-PROJECT-ID>",
  secretId: "my-basic-auth",
  secret: {
    replication: {
      automatic: {},
    },
  },
});

// Add the version with your data
const [version] = await client.addSecretVersion({
  parent: secret.name,
  payload: {
    data: Buffer.from("user:pass", "utf8"),
  },
});
Then, in your Cloud Function:
const [version] = await client.accessSecretVersion({
  name: "projects/<YOUR-PROJECT-ID>/secrets/<MY-SECRET>/versions/1",
});
const auth = version.payload.data.toString('utf-8');
// auth is user:pass
The service account with which you deploy your Cloud Function will need the roles/secretmanager.secretAccessor role.
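Granting that role can look like the following, assuming your function runs as the default App Engine service account:
$ gcloud secrets add-iam-policy-binding "my-basic-auth" \
    --member "serviceAccount:YOUR_PROJECT_ID@appspot.gserviceaccount.com" \
    --role "roles/secretmanager.secretAccessor"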
The other solution to this, which came out only in the last few months, is to use Google Cloud Runtime Configuration with Firebase for Functions:
https://firebase.google.com/docs/functions/config-env
Firebase for Functions seems to provide access to several features that are not yet available via other means.
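A minimal sketch of that flow (the config key thirdparty.auth is a placeholder, not something Firebase defines):
// Set once from the CLI, then redeploy:
//   $ firebase functions:config:set thirdparty.auth="user:pass"
const functions = require('firebase-functions');

// Read the value at runtime inside your function code
const basicAuth = functions.config().thirdparty.auth;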
Runtime Configurator does not charge for use, but enforces the following API limits and quotas:
1200 queries per minute (QPM) for delete, create, and update requests
600 QPM for watch requests
6000 QPM for get and list requests
4 MB of data per user, which consists of all data written to the Runtime Configurator service and accompanying metadata
https://cloud.google.com/deployment-manager/pricing-and-quotas#runtime_configurator
As an aside, I find this contradiction in the Firebase for Functions docs comical:
The Firebase SDK for Cloud Functions offers built-in environment configuration to make it easy to store and retrieve this type of data for your project without having to redeploy your functions.
Then a moment later:
After running functions:config:set, you must redeploy functions to make the new configuration available.
The KMS solution is a viable alternative, however it seems costly for functions. KMS is billed at $0.06 per month per active key, as well as $0.03 per 10,000 operations.
This would then change the cost of your Cloud Function from $0.40 per million invocations, to $3.40 per million invocations. That is quite the jump.
https://cloud.google.com/kms/
https://cloud.google.com/functions/
Is it possible by using either the Runtime Configurator or the Deployment Manager to configure secrets for a Google Cloud Function?
There is no built-in service that will let you configure secrets for direct access by Google Cloud Functions at this time, so the method you are currently using is the proper way to handle secrets on Cloud Functions for the time being. This could change, as the product is still in beta.
If you want you can make a feature request to the Cloud Function team by using the appropriate issue tracker.
There's also a Google Cloud Key Management Service: Node.js Client.
cd functions
npm install @google-cloud/kms
For example:
// Imports the Cloud KMS library
const {KeyManagementServiceClient} = require('@google-cloud/kms');
const functions = require('firebase-functions');

// Instantiates a client
const client = new KeyManagementServiceClient();

// Build the location name from the Firebase project settings
const locationName = client.locationPath(
  functions.config().firebase.projectId,
  functions.config().firebase.locationId
);

async function listKeyRings() {
  const [keyRings] = await client.listKeyRings({
    parent: locationName,
  });
  for (const keyRing of keyRings) {
    console.log(keyRing.name);
  }
  return keyRings;
}

return listKeyRings();
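Since the question is about decrypting a secret rather than listing key rings, a minimal decrypt sketch building on the client above might look like this (the key ring and key names are placeholders):
async function decryptSecret(ciphertextBuffer) {
  const name = client.cryptoKeyPath(
    functions.config().firebase.projectId,
    functions.config().firebase.locationId,
    'my-key-ring',
    'my-key'
  );
  const [result] = await client.decrypt({name, ciphertext: ciphertextBuffer});
  return result.plaintext.toString('utf8');
}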

Google cloud storage object deletion with google cloud API, Node.js, and request.del()

As the title says, I'm currently trying to delete an object using request and the Google Cloud API.
But even though I did what the Google Cloud Platform documentation says, it doesn't work.
What should I do?
It doesn't appear you're providing any sort of authentication token. You are providing an API key, which is important when making anonymous requests, but an API key does not authenticate your identity or grant any permissions. I am guessing that you are getting 403 Forbidden responses.
Since you're using Node.js, I might suggest trying the google-cloud library. It's easy to use, and it deals with the OAuth 2 authorization logic for you. A delete might look like this:
var gcloud = require('google-cloud')({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

var gcs = gcloud.storage();
var myBucket = gcs.bucket('backups');
var myFile = myBucket.file('someFile.png');

// Remember to check err in the callback
myFile.delete(function(err, apiResponse) {});

Lambda function needs to use key for dependency in the code, where to store it?

I have some code such as this which will run inside of Lambda:
var Parse = require('parse').Parse;

Parse.initialize("Your App Id", "Your JavaScript Key");

var query = new Parse.Query(Parse.User);
query.find({
  success: function(users) {
    for (var i = 0; i < users.length; ++i) {
      console.log(users[i].get('username'));
    }
  }
});
The code needs an API key to work. Is it safe to just put the key directly into the code, or should I store it somewhere else, and if so, where? I am concerned that if it needs to be stored externally this will cause overhead, since I would need to make a network call every time to retrieve it.
I wouldn't put it in the code. One cheap and elegant solution is to use the Key Management Service offered by AWS. It takes just a few lines of code to retrieve your key from AWS KMS. It costs $0.03 for every 10,000 requests, and each key costs $1/month to store.
AWS Key Management Service
It is integrated with AWS Lambda too.
Edit: See this SO link on how to use it: AWS Lambda: How to store secret to external API?
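A minimal sketch of that pattern, reusing the KMS approach from the linked answer (the environment variable and region are assumptions):
var AWS = require('aws-sdk');
var Parse = require('parse').Parse;

var kms = new AWS.KMS({region: 'us-east-1'});

// ENCRYPTED_PARSE_KEY is assumed to hold the base64 KMS ciphertext
var params = {
  CiphertextBlob: Buffer.from(process.env.ENCRYPTED_PARSE_KEY, 'base64')
};

kms.decrypt(params, function(err, data) {
  if (err) return console.log(err, err.stack);
  // Initialize Parse once with the decrypted key; caching the result
  // outside the handler avoids a KMS call on every invocation
  Parse.initialize("Your App Id", data.Plaintext.toString());
});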
