How to use async and await with AWS SDK JavaScript - Node.js

I am working with the AWS SDK using the KMS library. I would like to use async and await instead of callbacks.
import AWS, { KMS } from "aws-sdk";
this.kms = new AWS.KMS();
const key = await this.kms.generateDataKey();
However, this does not work, even when wrapped in an async function.
How can I use async and await here?

If you are using aws-sdk v2 (version 2.3.0 or later), you can transform an AWS.Request into a promise by chaining the .promise() method.
For your case:
try {
  let key = await kms.generateDataKey().promise();
} catch (e) {
  console.log(e);
}
The key is a KMS.Types.GenerateDataKeyResponse - the second parameter of the callback (in callback style).
The e is an AWSError - the first parameter of the callback function.
Note: the await expression is only allowed within an async function.
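To see the whole pattern end to end, here is a self-contained sketch. Since running kms.generateDataKey() for real requires AWS credentials, the request is simulated with a plain object exposing the same { promise() } shape that aws-sdk v2 requests have; fakeKms and fakeRequest are illustrative names only.

```javascript
// The fake request mimics the { promise() } shape of an aws-sdk v2 AWS.Request.
const fakeRequest = {
  promise: () => Promise.resolve({ Plaintext: 'fake-key-bytes' }),
};

// await only works inside an async function, so the call is wrapped in one.
async function getKey(kmsLike) {
  try {
    // In real code this would be: await kms.generateDataKey().promise()
    const key = await kmsLike.generateDataKey().promise();
    return key;
  } catch (e) {
    console.log(e);
    throw e;
  }
}

const fakeKms = { generateDataKey: () => fakeRequest };
getKey(fakeKms).then((key) => console.log(key.Plaintext));
```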

await requires a Promise. generateDataKey() returns an AWS.Request, not a Promise. AWS.Request objects are EventEmitters (more or less), but they have a promise() method that you can use.
import AWS, { KMS } from "aws-sdk";

(async function() {
  const kms = new AWS.KMS();
  const keyReq = kms.generateDataKey();
  const key = await keyReq.promise();
  // Or just:
  // const key = await kms.generateDataKey().promise();
}());

As of 2021 I'd suggest using AWS SDK for JavaScript v3. It's a rewrite of v2 with some great new features.
Sample code:
const { KMSClient, GenerateDataKeyCommand } = require('@aws-sdk/client-kms');

const generateDataKey = async () => {
  const client = new KMSClient({ region: 'REGION' });
  const command = new GenerateDataKeyCommand({ KeyId: 'KeyId' });
  const response = await client.send(command);
  return response;
};
AWS SDK for JavaScript v3 new features:
Modular architecture with a separate package for each service
First-class TypeScript support
New middleware stack
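Of the three features, the middleware stack is the least self-explanatory: every command passes through a chain of handlers that each wrap the next, so cross-cutting concerns like logging or retries compose around the final send step. Here is a toy sketch of that idea in plain JavaScript - this is a conceptual illustration, not the real v3 API (which exposes it via client.middlewareStack.add):

```javascript
// Build a handler chain: each middleware wraps the next handler,
// ending at the core "send" step.
const buildStack = (middlewares, coreHandler) =>
  middlewares.reduceRight((next, mw) => mw(next), coreHandler);

// A logging middleware: runs before delegating to the next handler.
const logging = (next) => async (args) => {
  console.log('sending', args.commandName);
  return next(args);
};

// The core handler, standing in for the actual network call.
const send = async (args) => ({ ok: true, command: args.commandName });

const handler = buildStack([logging], send);
handler({ commandName: 'GenerateDataKeyCommand' }).then((res) =>
  console.log(res.ok),
);
```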

Related

Why is my sinon stub not working and trying to make an external call?

Hi, I am working on a unit test for an AWS Lambda written in JS.
To be honest, it is my first time writing a test.
I decided to use chai, mocha, and sinon libraries.
Here is my actual code
// index.js
const AWS = require("aws-sdk");

const getParam = async (path) => {
  const ssm = new AWS.SSM();
  const params = {
    Name: path,
    WithDecryption: false
  };
  const result = await ssm.getParameter(params).promise();
  const value = result['Parameter']['Value'];
  console.log("##", value);
  return value;
};

module.exports = { getParam };
And here is what I have so far from reading other posts and documentation examples.
// index.test.js
const AWS = require("aws-sdk");
AWS.config.update({ region: 'us-east-1' });
const chai = require('chai');
const expect = chai.expect;
const sinon = require("sinon");
const { getParam } = require('./index.js');

describe("test1", () => {
  it('testing', async () => {
    const ssm = new AWS.SSM();
    sinon.stub(ssm, "getParameter").withArgs("testing")
      .resolves({ Parameter: { Value: "TESTING VALUE FROM PARAM STORE" } });
    const res = await getParam("Hello");
    console.log(res);
    expect(res).to.equal("TESTING VALUE FROM PARAM STORE");
  });
});
When I ran the test, it asked for AWS secrets, which made me realize that it was not how I expected this to behave.
If it works correctly, it will not bother connecting to AWS at all, I believe.
And it should call the getParam function and return the value from the stub's resolved response.
May I know what I am missing?
Or am I misusing the stub function?
I read somewhere that the stub function is used when we need to see what happened during the test, like how many times a certain function was called, etc.
However, I have seen posts using the stub function for something similar to what I am doing.
Thank you in advance.
You're creating an instance of SSM inside your test and stubbing that, not the instance of SSM that lives inside your getParam method.
Instead, what you can do is stub against the prototype, so that all instances from then on will use the stub when invoked.
sinon.stub(AWS.SSM.prototype, "getParameter")
  .withArgs("testing")
  .callsFake(() => ({
    promise: () => Promise.resolve({
      Parameter: {
        Value: "TESTING VALUE FROM PARAM STORE"
      }
    })
  }));
You can't use .resolves() on getParameter either, as getParameter doesn't return a promise; that's what your chained .promise() method is responsible for, so we have to replicate it :).
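The shape being replicated can be exercised without sinon or aws-sdk at all. The fake getParameter below returns an object exposing promise(), not a promise itself, which is exactly what a caller doing ssm.getParameter(params).promise() expects; getParam here is a simplified stand-in for the asker's function:

```javascript
// getParameter must return { promise() }, mirroring the aws-sdk v2 request API.
const fakeSsm = {
  getParameter: (params) => ({
    promise: () =>
      Promise.resolve({
        Parameter: { Value: 'TESTING VALUE FROM PARAM STORE' },
      }),
  }),
};

// Simplified stand-in for the asker's getParam, taking the client as a parameter.
async function getParam(ssm, path) {
  const result = await ssm.getParameter({ Name: path }).promise();
  return result.Parameter.Value;
}

getParam(fakeSsm, 'Hello').then((v) => console.log(v));
```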

Handling consecutive DynamoDB calls in a node.js Lambda for AWS

My example is quite simple. I am using AWS Lambda in proxy mode where the index.js looks like this.
const awsServerlessExpress = require('aws-serverless-express');
const app = require('./app');
const server = awsServerlessExpress.createServer(app);

exports.handler = (event, context) => {
  console.log(`EVENT: ${JSON.stringify(event)}`);
  return awsServerlessExpress.proxy(server, event, context, 'PROMISE').promise;
};
I have a separate app.js file which has a POST endpoint. Inside this endpoint I need to make two queries on a DynamoDB table.
I need to first query the table to determine if something exists.
My code looks like this.
// DynamoDB Document Client
const docClient = new AWS.DynamoDB.DocumentClient();

app.post('/profile/:userSub', (req, res) => {
  const userSub = req.params.userSub;
  const accountName = req.body.accountName;
  // First query
  const params = {
    TableName: profileTableName,
    IndexName: profileIndexAccountName,
    KeyConditionExpression: 'accountName = :accountName',
    ExpressionAttributeValues: {
      ':accountName': accountName
    }
  };
  docClient.query(params, (err, data) => {
    // Process the results
  });
});
The problem is that I want to make a follow-up docClient.put call based on the results of the first query.
I don't understand how to chain the queries together so that the first one completes before the second one executes.
Can someone please point me to an example? Alternatively, if there's a better way using async/await, I'm happy to follow it.
TL;DR prefer the SDK's async-await patterns.
Here is an example of consecutive async-await calls using the AWS SDK for JavaScript v3.*. Note the async keyword in the function signature and the await keyword before the promise-returning method calls. The docs have the complete client initialization code.
// myLambda.js
import { DynamoDBDocument } from '@aws-sdk/lib-dynamodb';
// ... configure client
const ddbDocClient = DynamoDBDocument.from(client);

export async function handler(event) {
  // ... other stuff
  const getResult = await ddbDocClient.get({ TableName, Key });
  const putResult = await ddbDocClient.put({
    TableName,
    Item: { id: '2', content: getResult.Item.Name },
  });
}
* The @aws-sdk/lib-dynamodb library's DynamoDBDocument is the v3 equivalent of the v2 client in your code. It exposes convenience methods like .get and .put, and it also helps convert between native JavaScript types and DynamoDB attribute values.
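Applied to the asker's actual scenario (query first, then a conditional put), the same awaited sequencing looks like this. The document client is simulated with plain async functions so the sketch runs without AWS; the table name, key names, and helper are all made up for illustration:

```javascript
// Stand-in for a DynamoDB document client; both methods return promises.
const fakeDocClient = {
  query: async (params) => ({ Count: 0, Items: [] }),
  put: async (params) => ({}),
};

// The second call only starts after the first has resolved, which is
// exactly the "consecutive calls" behavior the question asks about.
async function createProfileIfAbsent(docClient, accountName) {
  const existing = await docClient.query({
    TableName: 'Profiles', // hypothetical table name
    KeyConditionExpression: 'accountName = :a',
    ExpressionAttributeValues: { ':a': accountName },
  });
  if (existing.Count > 0) return { created: false };

  await docClient.put({ TableName: 'Profiles', Item: { accountName } });
  return { created: true };
}

createProfileIfAbsent(fakeDocClient, 'alice').then((r) =>
  console.log(r.created),
);
```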

Sinon.restore not working for stubbing and testing AWS functions

So I'm trying to write a few tests for testing an AWS wrapper library that I have been writing.
The tests are running individually without any issues, but won't all run as one 'describe' block.
const AWS_REGION = 'eu-west-2';
const aws = require('aws-sdk');
const chai = require('chai');
const expect = chai.expect;
const sinon = require('sinon');
const sinonChai = require('sinon-chai');
chai.use(sinonChai);

// These help:
// https://stackoverflow.com/questions/26243647/sinon-stub-in-node-with-aws-sdk
// https://stackoverflow.com/questions/61516053/sinon-stub-for-lambda-using-promises

describe('SQS Utilities Test', () => {
  afterEach(() => {
    sinon.restore();
  });

  it('should add to SQS', async () => {
    sinon.stub(aws.config, 'update');
    const sqs = {
      sendMessage: sinon.stub().returnsThis(),
      promise: sinon.stub()
    };
    sinon.stub(aws, 'SQS').callsFake(() => sqs);
    // these use the above stubbed version of aws
    const AWSUtilities = require('../index').AWSUtilities;
    const awsUtilities = new AWSUtilities(AWS_REGION);
    const response = await awsUtilities.postToSQS('https://example.com', { id: 1 }, 'chicken');
    expect(sqs.sendMessage).to.have.been.calledOnce;
  });

  it('should get from SQS', async () => {
    sinon.stub(aws.config, 'update');
    const sqs = {
      receiveMessage: sinon.stub().returnsThis(),
      promise: sinon.stub()
    };
    sinon.stub(aws, 'SQS').callsFake(() => sqs);
    // these use the above stubbed version of aws
    const AWSUtilities = require('../index').AWSUtilities;
    const awsUtilities = new AWSUtilities(AWS_REGION);
    const response = await awsUtilities.getFromSQS('https://example.com');
    expect(sqs.receiveMessage).to.have.been.calledOnce;
  });
  ...
What I noticed is that in the second test, the error I am getting is sqs.receiveMessage is not a function, which means the second test is using the sqs object from the first test (I can further verify this because the error changes if I add receiveMessage to the first test's sqs object).
Is this a bug in sinon.restore, or have I written something incorrectly? Here is the whole library: https://github.com/unegma/aws-utilities/blob/main/test/SQSTests.spec.js
This is not an issue with Sinon. It's an issue with how you are stubbing the AWS SDK. Let's break down what's happening in the code you have shared.
const sqs = {
  sendMessage: sinon.stub().returnsThis(),
  promise: sinon.stub()
};
sinon.stub(aws, 'SQS').callsFake(() => sqs);
// these use the above stubbed version of aws
const AWSUtilities = require('../index').AWSUtilities;
This code does the following:
Stub SQS on the aws module.
Load AWSUtilities.js (based on the source code on GitHub).
AWSUtilities.js does the following as soon as it's loaded:
const aws = require('aws-sdk');
const sqs = new aws.SQS();
// code removed to demo the concept
The above code creates an internal sqs object, which in this case is made using the stubbed aws module. In Node, once a module is loaded using require, it's cached in memory, i.e. the above code executes only once.
So when the first it() executes, it in turn loads AWSUtilities.js for the first time and the module is cached. Any subsequent calls get the cached version. When you call sinon.restore, it only restores the SQS function of the aws module; it doesn't restore the sqs object that was created within AWSUtilities.js.
I hope that explains the reason for the behavior that you are seeing.
There are multiple ways to fix this issue: dependency injection, using modules like proxyquire or rewire, stubbing aws from a central location before all test cases, etc.
The following is an option that fixes it in just the test cases shown here.
describe('SQS Utilities Test', () => {
  let AWSUtilities, sqsStub;

  before(() => {
    sinon.stub(aws.config, 'update');
    sqsStub = {
      sendMessage: sinon.stub().returnsThis(),
      receiveMessage: sinon.stub().returnsThis(),
      promise: sinon.stub()
    };
    sinon.stub(aws, 'SQS').callsFake(() => sqsStub);
    AWSUtilities = require('../index').AWSUtilities;
  });

  after(() => {
    sinon.restore();
  });

  it('should add to SQS', async () => {
    const awsUtilities = new AWSUtilities(AWS_REGION);
    const response = await awsUtilities.postToSQS('https://example.com', { id: 1 }, 'chicken');
    expect(sqsStub.sendMessage).to.have.been.calledOnce;
  });

  it('should get from SQS', async () => {
    const awsUtilities = new AWSUtilities(AWS_REGION);
    const response = await awsUtilities.getFromSQS('https://example.com');
    expect(sqsStub.receiveMessage).to.have.been.calledOnce;
  });
});
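Of the alternatives mentioned (dependency injection, proxyquire, rewire), the dependency-injection route can be sketched as follows. The class shape here is hypothetical, not the real aws-utilities code: the point is that a client created at require time can't be swapped after the fact, but a client passed in can.

```javascript
// Hypothetical rewrite: AWSUtilities accepts its SQS client instead of
// constructing one at module load, so tests can hand in a stub directly.
class AWSUtilities {
  constructor(sqsClient) {
    this.sqs = sqsClient;
  }
  postToSQS(queueUrl, body) {
    return this.sqs
      .sendMessage({ QueueUrl: queueUrl, MessageBody: JSON.stringify(body) })
      .promise();
  }
}

// In a test, no stubbing of the aws module is needed at all:
const calls = [];
const fakeSqs = {
  sendMessage(params) {
    calls.push(params); // record the call for assertions
    return { promise: () => Promise.resolve({ MessageId: 'fake-id' }) };
  },
};

new AWSUtilities(fakeSqs)
  .postToSQS('https://example.com', { id: 1 })
  .then((res) => console.log(res.MessageId, calls.length));
```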

Mocking using aws-sdk-mock's promise support with DocumentClient

I'm trying to write a unit test using aws-sdk-mock's promise support. I'm using DocumentClient.
My code looks like this:
const docClient = new AWS.DynamoDB.DocumentClient();
const getItemPromise = docClient.get(params).promise();
return getItemPromise.then((data) => {
  console.log('Success');
  return data;
}).catch((err) => {
  console.log(err);
});
My mock and unit test looks like this:
const AWS = require('aws-sdk-mock');
AWS.Promise = Promise.Promise;

AWS.mock('DynamoDB.DocumentClient', 'get', function (params, callback) {
  callback(null, { Item: { Key: 'test value' } });
});

dynamoStore.getItems('tableName', 'idName', 'id').then((actualResponse) => {
  // assertions
  done();
});
Running my unit test does not return my test value; it actually bypasses my mock and calls DynamoDB directly. What am I doing wrong? How can I set up my mock properly?
It's unclear from your code, but aws-sdk-mock has this note:
NB: The AWS Service needs to be initialised inside the function being tested in order for the SDK method to be mocked.
So the following will not mock correctly:
var AWS = require('aws-sdk');
var sns = AWS.SNS();
var dynamoDb = AWS.DynamoDB();

exports.handler = function(event, context) {
  // do something with the services e.g. sns.publish
}
but this will
var AWS = require('aws-sdk');

exports.handler = function(event, context) {
  var sns = AWS.SNS();
  var dynamoDb = AWS.DynamoDB();
  // do something with the services e.g. sns.publish
}
See more here: https://github.com/dwyl/aws-sdk-mock#how-usage
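The reason for that rule can be shown without aws-sdk at all: mocking works by swapping a property on the shared module object, so a client constructed before the swap still holds the real implementation, while one constructed inside the handler picks up the mock. This is a simulation of the mechanism, not aws-sdk-mock itself:

```javascript
// A stand-in for the aws-sdk module object with one service factory.
const awsModule = { SNS: () => ({ real: true }) };

// Client built at module load time: captured BEFORE the "mock" is applied.
const builtEarly = awsModule.SNS();

// The mock swaps the factory on the shared module object.
awsModule.SNS = () => ({ real: false });

// Client built inside the handler, AFTER mocking: gets the fake.
const builtLate = awsModule.SNS();

console.log(builtEarly.real, builtLate.real);
```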
It might be too late for an answer, but I had the same problem and I stumbled upon this question. After a few tries I found a solution that doesn't involve aws-sdk-mock, only plain Sinon, and I hope sharing it helps someone else. Note that the DynamoDB client is created outside the lambda.
The lambda itself looks like this:
const dynamoDB = new DynamoDB.DocumentClient();

exports.get = async event => {
  const params = {
    TableName: 'Tasks',
    Key: {
      id: event.pathParameters.id
    }
  };
  const result = await dynamoDB.get(params).promise();
  if (result.Item) {
    return success(result.Item);
  } else {
    return failure({ error: 'Task not found.' });
  }
};
And the test for this lambda is:
const sandbox = sinon.createSandbox();

describe('Task', () => {
  beforeAll(() => {
    const result = { Item: { id: '1', name: 'Go to gym' } };
    sandbox.stub(DynamoDB.DocumentClient.prototype, 'get').returns({ promise: () => result });
  });

  afterAll(() => {
    sandbox.restore();
  });

  it('gets a task from the DB', async () => {
    // Act
    const response = await task.get(getStub);

    // Assert
    expect(response.statusCode).toEqual(200);
    expect(response.body).toMatchSnapshot();
  });
});
I like to use Sinon's sandbox so I can stub a whole lot of different DynamoDB methods and clean everything up with a single restore().
sinon and proxyquire can be used to mock the DynamoDB client.
This supports both callback-based and async/await-based calls.
See this link for full details:
https://yottabrain.org/nodejs/nodejs-unit-test-dynamodb/
Somewhat related to the question, and expanding on wyu's solution - I too faced a similar issue. For me, the following didn't work with aws-sdk-mock:
const AWS = require('aws-sdk');
AWS.config.update({region: 'us-east-1'});

let call = function (action, params) {
  const dynamoDb = new AWS.DynamoDB.DocumentClient();
  return dynamoDb[action](params).promise();
};
whereas this worked:
let call = function (action, params) {
  const AWS = require('aws-sdk');
  AWS.config.update({region: 'us-east-1'});
  const dynamoDb = new AWS.DynamoDB.DocumentClient();
  return dynamoDb[action](params).promise();
};
I had exactly the same problem of the mock failing, but resolved it after following the suggestion by a user above: moving the following line inside the function rather than defining it outside:
let sns = new AWS.SNS(.....)

How do I promisify the AWS JavaScript SDK?

I want to use the aws-sdk in JavaScript using promises.
Instead of the default callback style:
dynamodb.getItem(params, function(err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else console.log(data);               // successful response
});
I instead want to use a promise style:
dynamoDb.putItemAsync(params).then(function(data) {
  console.log(data); // successful response
}).catch(function(error) {
  console.log(error, error.stack); // an error occurred
});
The 2.3.0 release of the AWS JavaScript SDK added support for promises: http://aws.amazon.com/releasenotes/8589740860839559
I believe calls can now be appended with .promise() to promisify the given method.
You can see it start being introduced in 2.6.12: https://github.com/aws/aws-sdk-js/blob/master/CHANGELOG.md#2612
You can see an example of its use in AWS' blog: https://aws.amazon.com/blogs/compute/node-js-8-10-runtime-now-available-in-aws-lambda/
let AWS = require('aws-sdk');
let lambda = new AWS.Lambda();

exports.handler = async (event) => {
  return await lambda.getAccountSettings().promise();
};
You can use a promise library that does promisification, e.g. Bluebird.
Here is an example of how to promisify DynamoDB.
var Promise = require("bluebird");
var AWS = require('aws-sdk');

var dynamoDbConfig = {
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  region: process.env.AWS_REGION
};
var dynamoDb = new AWS.DynamoDB(dynamoDbConfig);
Promise.promisifyAll(Object.getPrototypeOf(dynamoDb));
Now you can add Async to any method to get the promisified version.
Way overdue, but there is an aws-sdk-promise npm module that simplifies this.
It just adds a promise() function, which can be used like this:
ddb.getItem(params).promise().then(function(req) {
  var x = req.data.Item.someField;
});
EDIT: It's been a few years since I wrote this answer, but since it seems to be getting up-votes lately, I thought I'd update it: aws-sdk-promise is deprecated, and newer (as in, the last couple of years) versions of aws-sdk include built-in promise support. The promise implementation to use can be configured through config.setPromisesDependency().
For example, to have aws-sdk return Q promises, the following configuration can be used:
const AWS = require('aws-sdk')
const Q = require('q')
AWS.config.setPromisesDependency(Q.Promise)
The promise() function will then return Q promises directly (when using aws-sdk-promise, you had to wrap each returned promise manually, e.g. with Q(...) to get Q promises).
With async/await I found the following approach to be pretty clean, and it fixed the same issue for me for DynamoDB. It works with ElastiCache Redis as well, and it doesn't require anything that doesn't come with the default Lambda image.
const { promisify } = require('util');
const AWS = require("aws-sdk");

const dynamoDB = new AWS.DynamoDB.DocumentClient();
const dynamoDBGetAsync = promisify(dynamoDB.get).bind(dynamoDB);

exports.handler = async (event) => {
  let userId = "123";
  let params = {
    TableName: "mytable",
    Key: {
      "PK": "user-" + userId,
      "SK": "user-perms-" + userId
    }
  };

  console.log("Getting user permissions from DynamoDB for " + userId + " with params=" + JSON.stringify(params));
  let result = await dynamoDBGetAsync(params);
  console.log("Got value: " + JSON.stringify(result));
}
Folks,
I've not been able to use Promise.promisifyAll(Object.getPrototypeOf(dynamoDb));
however, the following worked for me:
this.DYNAMO = Promise.promisifyAll(new AWS.DynamoDB());
...
return this.DYNAMO.listTablesAsync().then(function (tables) {
  return tables;
});
or
var AWS = require('aws-sdk');
var S3 = Promise.promisifyAll(new AWS.S3());
return S3.putObjectAsync(params);
CascadeEnergy/aws-promised
We have an always-in-progress npm module, aws-promised, which does the bluebird promisify of each client of the aws-sdk. I'm not sure it's preferable to the aws-sdk-promise module mentioned above, but here it is.
We need contributions; we've only taken the time to promisify the clients we actually use, but there are many more to do, so please do it!
This solution works best for me:
// Create a promise object
var putObjectPromise = s3.putObject({Bucket: 'bucket', Key: 'key'}).promise();

// If successful, do this:
putObjectPromise.then(function(data) {
  console.log('PutObject succeeded');
})
// If the promise failed, catch the error:
.catch(function(err) {
  console.log(err);
});