My problem is the following: I want to test a method that uploads a bunch of data into an AWS S3 bucket. The problem is: I don't want to actually upload data every time I run the tests, and I don't want to have to care about credentials sitting in the environment. So I want to set up Sinon's fake server module to simulate the upload and return the same results that S3 would. Sadly, it seems to be difficult to find a working example with code using async/await.
My test looks like this:
import {skip, test, suite} from "mocha-typescript";
import Chai from "chai";
import {S3Uploader} from "./s3-uploader.class";
import Sinon from "sinon";
@suite
class S3UploaderTest {
public server : Sinon.SinonFakeServer | undefined;
before() {
this.server = Sinon.fakeServer.create();
}
after() {
if (this.server != null) this.server.restore();
}
@test
async "should upload a file to s3 correctly"(){
let spy = Sinon.spy();
const uploader : S3Uploader = new S3Uploader();
const upload = await uploader.send("HalloWelt").toBucket("onetimeupload.test").toFolder("test/hw.txt").upload();
Chai.expect(upload).to.be.a("object");
}
}
Inside the uploader.upload() method, I resolve a promise from a callback. So how can I simulate the upload process?
Edit: Here is the code of the s3-uploader:
import AWS from "aws-sdk";
export class S3Uploader {
private s3 = new AWS.S3({ accessKeyId : process.env.ACCESS_KEY_ID, secretAccessKey : process.env.SECRET_ACCESS_KEY });
private params = {
Body: null || Object,
Bucket: "",
Key: ""
};
public send(stream : any) {
this.params.Body = stream;
return this;
}
public toBucket(bucket : string) {
this.params.Bucket = bucket;
return this;
}
public toFolder(path : string) {
this.params.Key = path;
return this;
}
public upload() {
return new Promise((resolve, reject) => {
if (process.env.ACCESS_KEY_ID == null || process.env.SECRET_ACCESS_KEY == null) {
return reject("ERR_NO_AWS_CREDENTIALS");
}
this.s3.upload(this.params, (error : any, data : any) => {
return error ? reject(error) : resolve(data);
});
});
}
}
Sinon fake servers are something you might use to develop a client that itself makes requests; what you have is a wrapper around an existing client, AWS.S3. In this case, you're better off just stubbing the behavior of AWS.S3 instead of testing the actual requests it makes. That way you avoid testing the implementation details of AWS.S3.
Since you're using TypeScript and you've made your s3 client private, you're going to need to make some changes to expose it to your tests. Otherwise, you won't be able to stub its methods without the TS compiler complaining. You also won't be able to write assertions against the params object, for similar reasons.
Since I don't use TS regularly, I'm not too familiar with its common dependency injection techniques, but one thing you could do is add optional constructor arguments to your S3Uploader class that can overwrite the default s3 and params properties, like so:
constructor(s3?: AWS.S3, params?: any) {
if (s3) this.s3 = s3;
if (params) this.params = params;
}
You can then create a stub instance and pass it to the instance under test like this:
const s3 = sinon.createStubInstance(AWS.S3);
const params = { foo: 'bar' };
const uploader = new S3Uploader(s3, params);
Once you have the stub instance in place, you can write assertions to make sure the upload method was called the way you want it to be:
sinon.assert.calledOnce(s3.upload);
sinon.assert.calledWith(s3.upload, sinon.match.same(params), sinon.match.func);
You can also control the behavior of the upload method using the sinon stub API. For example, to make it fail:
s3.upload.callsArgWith(1, new Error('Test Error'));
Or make it succeed like so:
const data = { whatever: 'data', you: 'want' };
s3.upload.callsArgWith(1, null, data);
You'll probably want a completely separate test for each of these cases, using an instance before hook to avoid duplicating the common setup stuff. Testing for success will involve simply awaiting the promise and checking that its result is the data. Testing for failure will involve a try/catch that ensures the promise was rejected with the proper error.
Also, since you seem to be doing actual unit tests here, I'd recommend testing each S3Uploader method separately instead of calling them all in one big test. This drastically reduces the number of possible cases you need to cover, making your tests a lot more straightforward. Something like this:
@suite
class S3UploaderTest {
params: any; // Not sure the best way to type this.
s3: any; // Same. Sorry, not too experienced with TS.
uploader: S3Uploader | undefined;
before() {
this.params = {};
this.s3 = sinon.createStubInstance(AWS.S3);
this.uploader = new S3Uploader(this.s3, this.params);
}
@test
"send should set Body param and return instance"() {
const stream = "HalloWelt";
const result = this.uploader.send(stream);
Chai.expect(this.params.Body).to.equal(stream);
Chai.expect(result).to.equal(this.uploader);
}
@test
"toBucket should set Bucket param and return instance"() {
const bucket = "onetimeupload.test"
const result = this.uploader.toBucket(bucket);
Chai.expect(this.params.Bucket).to.equal(bucket);
Chai.expect(result).to.equal(this.uploader);
}
@test
"toFolder should set Key param and return instance"() {
const path = "onetimeupload.test"
const result = this.uploader.toFolder(path);
Chai.expect(this.params.Key).to.equal(path);
Chai.expect(result).to.equal(this.uploader);
}
@test
"upload should attempt upload to s3"() {
this.uploader.upload();
sinon.assert.calledOnce(this.s3.upload);
sinon.assert.calledWith(
this.s3.upload,
sinon.match.same(this.params),
sinon.match.func
);
}
@test
async "upload should resolve with response if successful"() {
const data = { foo: 'bar' };
this.s3.upload.callsArgWith(1, null, data);
const result = await this.uploader.upload();
Chai.expect(result).to.equal(data);
}
@test
async "upload should reject with error if not"() {
const error = new Error('Test Error');
this.s3.upload.callsArgWith(1, error, null);
try {
await this.uploader.upload();
throw new Error('Promise should have rejected.');
} catch(err) {
Chai.expect(err).to.equal(error);
}
}
}
If I were doing this with mocha proper, I'd group each method's tests into a nested describe block. I'm not sure whether that's encouraged or even possible with mocha-typescript, but if so, you might consider it.
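For reference, a minimal sketch of that grouping in plain mocha (test bodies elided, names illustrative):
describe("S3Uploader", () => {
    describe("send", () => {
        it("sets the Body param and returns the instance", () => {
            // ...same assertions as in the send test above
        });
    });
    describe("upload", () => {
        it("resolves with the response if successful", async () => {
            // ...same assertions as in the upload tests above
        });
    });
});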
My setup looks like this:
const sandbox = sinon.createSandbox();
test.afterEach.always(() => {
sandbox.restore();
});
test.serial( 'some test', t => {
new awvr(context)
})
test.serial('other test', t => {
new awvr(other_context)
})
export class awvr {
private static isStubConfigured: boolean = false;
private readonly serviceContext: ServiceCtx;
constructor(ctx: ServiceCtx) {
this.serviceContext = ctx;
awvr.setupMockedIntegrationRequestHandler();
}
private static setupMockedIntegrationRequestHandler() {
if (!awvr.isStubConfigured) {
const requestStub = { post: sandbox.stub() };
requestStub.post.resolves({ body: { output: 'test-response-field' } });
const integrationRequestStub = sandbox.stub(requestUtils, 'getIntegrationRequest');
integrationRequestStub.returns((requestStub as unknown));
const mockLock = createMockRedlock();
sandbox.stub(redis, 'lockNode').resolves(mockLock);
awvr.isStubConfigured = true;
}
}
When I run these tests, I get an error saying the below, along with a stack trace:
attempted to wrap x which is already wrapped
checkWrappedMethod
wrapMethod
stub
Sandbox.stub
Function.setupMockedIntegrationRequestHandler
new awvr
processTicksAndRejections
test.ts:601:3
wrapMethod
stub
Sandbox.stub
Function.setupMockedIntegrationRequestHandler
new awvr
processTicksAndRejections
This seems to imply there's some parallelization going on. I tried creating two separate sandboxes, one for each test, and passing them through to be used by awvr, but that didn't work either. I have also tried calling sandbox.restore() in the awvr call, but the issue persists. Is there something Sinon does under the hood that doesn't allow for this setup?
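For what it's worth, here is a minimal sketch of how that error arises, using a made-up utils object in place of requestUtils: Sinon refuses to wrap a method that is still wrapped by an earlier stub, even when the second stub comes from a different sandbox.
import sinon from "sinon";

const utils = { getIntegrationRequest: () => "real" };

const sb1 = sinon.createSandbox();
const sb2 = sinon.createSandbox();

sb1.stub(utils, "getIntegrationRequest");

try {
    // Throws: TypeError: Attempted to wrap getIntegrationRequest which is already wrapped
    sb2.stub(utils, "getIntegrationRequest");
} catch (err) {
    console.log((err as Error).message);
}

// Once the first sandbox is restored, stubbing again succeeds.
sb1.restore();
sb2.stub(utils, "getIntegrationRequest");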
TypeScript newbie here. I am working on an AWS Lambda function, using TypeScript with classes, and exporting an async handler at the end. When I invoke my function from the AWS SAM CLI, I get this error:
{"errorType":"TypeError","errorMessage":"Cannot read property 'test' of undefined","stack":["TypeError: Cannot read property 'test' of undefined"," at Runtime.handler (/var/task/src/lambda/create-cost-lambda.js:12:56)"," at Runtime.handleOnce (/var/runtime/Runtime.js:66:25)"]}
create-cost-lambda.ts
class CreateCostLambda {
private readonly foobarRepository: FoobarRepository;
constructor() {
this.foobarRepository = new FoobarRepository();
}
async handler(event: APIGatewayProxyEventV2) : Promise<APIGatewayProxyResultV2> {
const result = await this.foobarRepository.test();
console.log(result);
return {
body: JSON.stringify(result),
statusCode: 200,
};
}
}
export const { handler } = new CreateCostLambda();
Here is a very basic class represents a repository.
foobar-repository.ts
export class FoobarRepository {
private readonly awesomeValue: string;
constructor() {
this.awesomeValue = 'John Doe';
}
async test(): Promise<string> {
return this.awesomeValue;
}
}
I am almost sure it is because of the way I am exporting the handler and how AWS SAM internally runs the handler. But I might be wrong, and it could be a TypeScript thing I am missing. Please let me know if you need more information, and thanks a lot for the help!
The short version is: if you pass a function from a class, it loses its reference to this.
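A quick illustration with a made-up object (not your Lambda): once a method is detached from its instance, this is no longer bound to that instance.
const repo = {
    name: "FoobarRepository",
    describe() { return this.name; }
};

const detached = repo.describe;
repo.describe(); // "FoobarRepository"
detached();      // TypeError in strict mode: `this` is undefined
Your export const { handler } = new CreateCostLambda(); detaches handler in exactly the same way.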
I would solve this as follows:
const createCostLambda = new CreateCostLambda();
export const handler = createCostLambda.handler.bind(createCostLambda);
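Another option, if you want to keep the class (a sketch, not part of your original code): declare the handler as a class-property arrow function, which captures this when the instance is created, so destructuring it is safe.
class CreateCostLambda {
    private readonly foobarRepository = new FoobarRepository();

    // Arrow-function property: `this` is bound to the instance.
    handler = async (event: APIGatewayProxyEventV2): Promise<APIGatewayProxyResultV2> => {
        const result = await this.foobarRepository.test();
        return { body: JSON.stringify(result), statusCode: 200 };
    };
}

export const { handler } = new CreateCostLambda();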
You can also ask yourself, does this need to be a class? The answer is: probably not. There's nothing gained from this in your sample.
const foobarRepository = new FoobarRepository();
export async function handler(event: APIGatewayProxyEventV2) : Promise<APIGatewayProxyResultV2> {
const result = await foobarRepository.test();
console.log(result);
return {
body: JSON.stringify(result),
statusCode: 200,
};
}
Fewer lines, no unneeded state. JavaScript is not Java =)
I am using DynamoDB streams to do some work on records as they are added to or modified in my table. I am also using Dynamoose models in my application.
The DynamoDB stream event passes an event object to my Node.js lambda handler that includes the objects record.dynamodb.NewImage and record.dynamodb.OldImage. However, these objects are in DynamoDB's AttributeValue format, which includes all of the data types ('S' for string), rather than normal JavaScript objects. So record.id becomes record.id.S.
Dynamoose models allow you to instantiate a model from an object, like so: new Model(object). However, it expects that argument to be a normal object.
I know that Dynamoose has a DynamoDB parser, I think it's Schema.prototype.dynamodbparse(). However, that doesn't work as expected.
import { DYNAMODB_EVENT } from '../../constant';
import _get from 'lodash/get';
import { Entry } from '../../model/entry';
import { applyEntry } from './applyEntry';
async function entryStream(event) {
await Promise.all(
event.Records.map(async record => {
// If this record is being deleted, do nothing
if (DYNAMODB_EVENT.Remove === record.eventName) {
return;
}
// What I've tried:
let entry = new Entry(record.dynamodb.NewImage);
// What I wish I could do
entry = new Entry(Entry.schema.dynamodbparse(record.dynamodb.NewImage));
await applyEntry(entry);
})
);
}
export const handler = entryStream;
So is there a way to instantiate a Dynamoose model from DynamoDB's AttributeValue format? Has anyone else done this?
The alternative, is that I simply extract the key from the record, and then make a trip to the database using Model.get({ id: idFromEvent }); But I think that would be inefficient, since the record was just handed to me from the stream.
I solved it by using AWS.DynamoDB.Converter.unmarshall to convert the record into a plain object before passing it to Dynamoose.
import { DYNAMODB_EVENT } from '../../constant';
import _get from 'lodash/get';
import { Entry } from '../../model/entry';
import { applyEntry } from './applyEntry';
// https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/DynamoDB/Converter.html#unmarshall-property
import AWS from 'aws-sdk';
const parseDynamo = AWS.DynamoDB.Converter.unmarshall;
async function entryStream(event) {
await Promise.all(
event.Records.map(async record => {
// If this record is being deleted, do nothing
if (DYNAMODB_EVENT.Remove === record.eventName) {
return;
}
const entry = new Entry(parseDynamo(record.dynamodb.NewImage));
await applyEntry(entry);
})
);
}
export const handler = entryStream;
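For anyone wondering what the conversion looks like, a quick illustration with made-up values:
const image = {
    id: { S: "abc-123" },
    views: { N: "42" },
    active: { BOOL: true }
};

console.log(parseDynamo(image));
// => { id: 'abc-123', views: 42, active: true }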
While unit testing my Node.js application, I'm trying to create a simple helper class that will translate the Kafka pub-sub semantics into a simpler API suited for unit testing.
My idea is to be able to write mocha unit tests like this:
const testSubscriber = kafkaTestHelper.getTestSubscriber({topic:'test'});
return someKafkaProducer.sendAsync({topic: 'test', message: randomWord})
.then(() =>
testSubscriber.next()
).then(msg => {
msg.should.equal(randomWord);
});
Of course I would also add helper methods such as
testSubscriber.nextUntil(someFilter)
This is inspired by the Akka.NET TestKit, which has a similar approach.
I have two questions:
Is this a reasonable approach, or is there some cleaner way to unit test application logic based on Kafka stream processing in Node.js?
Can anybody post coding examples showing how to make testSubscriber work as I intend?
This might not be the most elegant solution, but it seems to work, at least in my initial testing. The trick is to create an ever-growing list of Promises whose resolver functions are kept by reference in an array called resolvers. When a message comes in, the next resolver is invoked with that message. This way I can return promises to any unit test invoking next(), and it works transparently whether the message has already been delivered or will be delivered in the future.
I still feel I'm reinventing the wheel here, so any comments would still be greatly appreciated.
function TestSubscriber(consumer, initialMessageFilter) {
this.consumer = consumer;
let promiseBuffer = [];
let resolvers = [];
let resolveCounter = 0;
let isStarted = false;
const ensurePromiseBuffer = function() {
if (promiseBuffer.length === 0 || resolveCounter >= resolvers.length) {
const newPromise = new Promise(function(resolve, reject) {
resolvers.push(resolve);
});
promiseBuffer.push(newPromise);
}
}
const that = this;
this.consumer.on('message', function(message) {
if (!isStarted) {
//Determine if we should start now.
isStarted = initialMessageFilter === undefined || initialMessageFilter(message);
}
if (isStarted) {
ensurePromiseBuffer();
const resolver = resolvers[resolveCounter];
resolver(message);
resolveCounter++;
that.consumer.commit(function(err, data) {
if (err) {
//Just log any errors here as we are running inside a unittest
log.warn(err)
}
})
}
});
this.next = function() {
ensurePromiseBuffer();
return promiseBuffer.shift();
};
}
const cache = {};
module.exports = {
getTestSubscriber: function({topic}, initialMessageFilter) {
if (!cache[topic]) {
const consumer = kafka.getConsumer({topic, groupId: GROUP_ID});
cache[topic] = new TestSubscriber(consumer, initialMessageFilter);
}
return cache[topic];
}
}
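The nextUntil helper mentioned in the question isn't implemented above, but here is a possible sketch layered on top of next(), assuming the intended semantics are "skip messages until one matches the filter":
// Hypothetical helper, not part of the original solution.
TestSubscriber.prototype.nextUntil = async function(filter) {
    let message;
    do {
        message = await this.next();
    } while (!filter(message));
    return message;
};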
I'm trying to develop a Node.js app connecting to Firebase. I can connect successfully, but I can't figure out how to manage the scope in the then call.
I'm using Node.js 6.9.2
My test implementation looks like this:
const EventEmitter = require('events');
const fb = require('firebase')
class FireGateway extends EventEmitter {
constructor() {
super();
if ( this.instance ) {
return this.instance;
}
// INIT
var fbConfig = {
apiKey: "xxxxx",
authDomain: "xxxxx.firebaseapp.com",
databaseURL: "https://xxxxx.firebaseio.com/"
};
fb.initializeApp(fbConfig)
this.instance = this;
this.testvar = "aaa";
}
login() {
fb.auth().signInWithEmailAndPassword ("email", "pwd")
.catch(function(error) {
// Handle Errors here.
}).then( function(onresolve, onreject) {
if (onresolve) {
console.log(this.testvar);
// "Cannot read property 'testvar' of undefined"
this.emit('loggedin');
// error as well
}
})
}
}
module.exports = FireGateway;
------
...
var FireGateway = require('./app/fireGateway');
this.fireGW = new FireGateway();
this.fireGW.login();
....
Any idea how can I manage it?
The callback passed to then is called asynchronously, from another context, so this doesn't correspond to the instantiated object.
Using ES6 arrow functions you can keep your object context, since an arrow function does not create its own this context.
By the way, the syntax you are using in the then method is not correct: then accepts two callbacks, each taking a single argument (the resolved value and the rejection reason, respectively).
The catch before the then isn't necessary either, I think; it would make more sense to put it at the end.
It would be something like this:
login() {
fb.auth().signInWithEmailAndPassword("email", "pwd")
.then(
(onResolve) => {
console.log(this.testvar);
this.emit('loggedin');
},
(onReject) => {
// error handling goes here
});
}
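Or, with the catch moved to the end as suggested:
login() {
    fb.auth().signInWithEmailAndPassword("email", "pwd")
        .then(() => {
            console.log(this.testvar);
            this.emit('loggedin');
        })
        .catch((error) => {
            // error handling goes here
        });
}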
On the other hand, the login method is doing an asynchronous operation, so you might want to wait for it to finish in your code. I would make the login method return a Promise, so you can wait for it outside:
login() {
return fb.auth().signInWithEmailAndPassword("email", "pwd")
...
}
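Then the caller can wait for it, for example:
const fireGW = new FireGateway();
fireGW.login().then(() => {
    // logged in; safe to use the gateway here
});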