Does aws-sdk-mock support mocking of AWS SSM (Parameter Store)? - jestjs

I am trying to mock AWS SSM using aws-sdk-mock with the code below, but it is not working. It does not throw an error; it fetches the values from the actual Parameter Store when getParametersByPath is called.
I had a look at the aws-sdk-mock documentation, but it does not seem to have an example for mocking SSM. Is it supported or not?
AWSMock.mock('SSM', 'getParametersByPath', (params, callback) => {
  callback(null, mockResponse);
});

I ran across this when trying to do a similar operation: when mocking SSM functionality, the resources were still making requests to AWS instead of using the mock.
Example:
import { mock } from 'aws-sdk-mock';
import { SSM } from 'aws-sdk';
import { GetParameterRequest, GetParameterResult } from 'aws-sdk/clients/ssm';
import 'mocha';
...
const ssm: SSM = new SSM();
mock('SSM', 'getParameter', async (request: GetParameterRequest) => {
  return { Parameter: { Value: 'value' } } as GetParameterResult;
});
const request: GetParameterRequest = { Name: 'parameter', WithDecryption: true };
const result: GetParameterResult = await ssm.getParameter(request).promise();
expect(result.Parameter.Value).to.equal('value');
...
The error occurred when making the call to getParameter.
It turned out that the reason for our error was that we were instantiating the integration prior to declaring our mock. So the fix was to switch the order of execution and declare the mock before instantiating the integration.
Example:
import { mock } from 'aws-sdk-mock';
import { SSM } from 'aws-sdk';
import { GetParameterRequest, GetParameterResult } from 'aws-sdk/clients/ssm';
import 'mocha';
...
mock('SSM', 'getParameter', async (request: GetParameterRequest) => {
  return { Parameter: { Value: 'value' } } as GetParameterResult;
});
// -> Note the following line was moved below the mock declaration.
const ssm: SSM = new SSM();
const request: GetParameterRequest = { Name: 'parameter', WithDecryption: true };
const result: GetParameterResult = await ssm.getParameter(request).promise();
expect(result.Parameter.Value).to.equal('value');
...
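As an extra safeguard against this kind of ordering problem between tests, it can also help to restore the mock after each test so later suites start from a clean SDK. A minimal sketch, assuming mocha-style hooks (the afterEach placement is illustrative, not from the original answer):
import { mock, restore } from 'aws-sdk-mock';

afterEach(() => {
  // Remove the SSM stub so subsequent tests hit a freshly mocked (or real) client.
  restore('SSM');
});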

Related

Change region of callable Firebase Cloud Functions in v2

Hey, so I am trying to upgrade to v2 of Firebase Cloud Functions, but while changing the code I noticed that my functions no longer have .region like they did in v1.
Here is the v1 version where I could call .region and change it:
import * as functions from "firebase-functions";

exports.helloWorld = functions.region("europe-west1").https.onCall(() => {
  functions.logger.info("Hello logs!", { structuredData: true });
  return { text: "Hello from Firebase!" };
});
Now I upgraded to v2, but I get:
Property 'region' does not exist on type
'typeof import("/.../node_modules/firebase-functions/lib/v2/index")
I am trying to achieve something like this for v2 of Firebase Cloud Functions. Any ideas?
import { https, logger } from "firebase-functions/v2";
import * as functions from "firebase-functions/v2";

// // Start writing Firebase Functions
// // https://firebase.google.com/docs/functions/typescript
//
const regionalFunctions = functions.region("europe-west1");

exports.helloWorld = regionalFunctions.https.onCall(() => {
  logger.info("Hello logs!", { structuredData: true });
  return { text: `Hello ${process.env.PLANET} and ${process.env.AUDIENCE}` };
});
You can specify the region in the function's options as shown below:
import { onCall } from "firebase-functions/v2/https";

export const testFunction = onCall({ region: "..." }, (event) => {
  // ...
});
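If you would rather set the region once for all v2 functions instead of per function, firebase-functions/v2 also exposes setGlobalOptions. A short sketch (the region value is just an example):
import { setGlobalOptions } from "firebase-functions/v2";
import { onCall } from "firebase-functions/v2/https";

// Applies to every v2 function declared after this call.
setGlobalOptions({ region: "europe-west1" });

export const helloWorld = onCall(() => {
  return { text: "Hello from Firebase!" };
});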

How to mock @google-cloud/kms using jest

I'm trying to write unit test cases for decrypt. I have my own implementation for decrypting an encrypted file. While trying to import decrypt.mjs I am facing the following error:
Must use import to load ES Module: /node_modules/bignumber.js/bignumber.mjs
My application has a React frontend and a Node.js backend. I've used ES6 modules for Node.js. Here is my decrypt.mjs file:
import { readFile } from 'fs/promises';
import path from 'path';
import { KeyManagementServiceClient } from '@google-cloud/kms';

const decrypt = async (APP_MODE, __dirname) => {
  if (APP_MODE === 'LOCALHOST') {
    const keys = await readFile(
      new URL(`./stagingfile.json`, import.meta.url)
    ).then((data) => JSON.parse(data));
    return keys;
  }
  const { projectId, locationId, keyRingId, cryptoKeyId, fileName } =
    getKMSDefaults(APP_MODE);
  const ciphertext = await readFile(
    path.join(__dirname, `/${fileName}`)
  );
  // The client must be created before it is used to build the key path.
  const client = new KeyManagementServiceClient();
  const formattedName = client.cryptoKeyPath(
    projectId,
    locationId,
    keyRingId,
    cryptoKeyId
  );
  const request = {
    name: formattedName,
    ciphertext,
  };
  const [result] = await client.decrypt(request);
  return JSON.parse(result.plaintext.toString('utf8'));
};

const getKMSDefaults = (APP_MODE) => {
  // Based on APP_MODE the following object contains different values
  return {
    projectId: PROJECT_ID,
    locationId: LOCATION_ID,
    keyRingId: KEY_RING_ID,
    cryptoKeyId: CRYPTO_KEY_ID,
    fileName: FILE_NAME,
  };
};

export default decrypt;
I tried to mock @google-cloud/kms using a manual mock (jest), but it didn't work. I tried multiple solutions for mocking, but nothing worked and it always ended with the Must use import to load ES Module error.
I've successfully used jest to mock @google-cloud/kms with TypeScript, so hopefully the same process will work for ES modules.
Example working code:
// jest will "hoist" jest.mock to the top of the file on its own anyway
jest.mock("@google-cloud/kms", () => {
  return {
    KeyManagementServiceClient: jest.fn().mockImplementation(() => {
      return {
        encrypt: kmsEncryptMock,
        decrypt: kmsDecryptMock,
        cryptoKeyPath: () => kmsKeyPath,
      };
    }),
  };
});

// give names to mocked functions for easier access in tests
const kmsEncryptMock = jest.fn();
const kmsDecryptMock = jest.fn();
const kmsKeyPath = `project/location/keyring/keyname`;

// import of SUT must be after the variables used in jest.mock() are defined, not before.
import { encrypt } from "../../src/crypto/google-kms";

describe("Google KMS encryption service wrapper", () => {
  const plaintext = "some text to encrypt";
  const plaintextCrc32 = 1897295827;

  it("sends the correct request to kms service and raise error on empty response", async () => {
    // encrypt function is async that throws a "new Error(...)"
    await expect(encrypt(plaintext)).rejects.toMatchObject({
      message: "Encrypt: no response from KMS",
    });
    expect(kmsEncryptMock).toHaveBeenNthCalledWith(1, {
      name: kmsKeyPath,
      plaintext: Buffer.from(plaintext),
      plaintextCrc32c: { value: plaintextCrc32 },
    });
  });
});
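For the decrypt path in the question, the same setup can be reused by giving the decrypt mock a canned KMS response. A minimal sketch, assuming decrypt is imported the same way as encrypt above and that the stubbed plaintext is a made-up JSON payload:
// Hypothetical test for the decrypt path, reusing kmsDecryptMock from above.
it("decrypts the ciphertext returned by KMS", async () => {
  // client.decrypt resolves to an array whose first element carries the plaintext buffer.
  kmsDecryptMock.mockResolvedValue([
    { plaintext: Buffer.from(JSON.stringify({ apiKey: "fake-key" })) },
  ]);

  const keys = await decrypt("STAGING", "/some/dir"); // APP_MODE and dirname are placeholders
  expect(keys).toEqual({ apiKey: "fake-key" });
});
Note that if your test files run as native ES modules (which the Must use import to load ES Module error suggests), plain jest.mock may not intercept the import; Jest's ESM support currently relies on running Node with --experimental-vm-modules and using jest.unstable_mockModule instead, as described in Jest's ECMAScript Modules docs.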

TypeError: metadata_1.Public is not a function (NestJS SetMetaData)

My e2e test is returning TypeError: metadata_1.Public is not a function for a controller that is using the custom decorator @Public().
Some code is omitted for clarity
it(`/GET forks`, async () => {
  const fork: ForksModel = {
    type: 'Full Copy',
  };
  await request(app.getHttpServer())
    .get('/forks')
    .expect(200)
    .expect({ fork: expectedForks });
});

@Public()
public async getAccountForks(@Req() req: Request) {
  const { account } = req;
  const fork = await this.service.getAccountForks(account);
  return { fork, account };
}
public.decorator.ts
import { SetMetadata } from "@nestjs/common";

export const Public = () => SetMetadata("isPublic", true);
I don't know what is happening here; it doesn't complain about this when running Nest.
This is how it is imported:
import { Public } from '@app/utils/metadata';
So I just forgot to export my metadata files from the root utils index.ts!
But Nest didn't complain, and the decorator was still functional on my Guard when testing!
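For completeness, the fix amounts to re-exporting the decorator from the barrel file that the @app/utils/metadata path alias resolves to. A minimal sketch, assuming the decorator lives in public.decorator.ts as shown above (the exact file layout is an assumption):
// utils/metadata/index.ts -- hypothetical barrel file behind the @app/utils/metadata alias
export * from './public.decorator';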

Unit Testing NodeJs Controller with Axios

I have a controller and a request file that look like this. The request file makes the calls with axios (to an external API), and the controller sends its response somewhere else. My question is: how do I unit test my controller function (getInfoById), and how do I mock axiosRequest since it is called inside the controller? I am using Jest and only Jest for testing (I might need something else, but I'm not changing).
file: axiosFile.js
import axios from "axios"
export const axiosRequest = async (name) => {
const { data } = await axios.get("url")
return data
}
file: controllerFile.js
import { axiosRequest } from "./axiosFile"
export const getInfoById = async (name) => {
try {
const response = await axiosRequest(name)
return { status: 200, ...response }
} catch {
return { status: 500, { err: "Internal ServerError" } }
}
}
Thanks in advance.
PS: It's a backend in Node.js.
You can mock the HTTP calls using nock.
This way you will be able to test your method directly by mocking the underlying HTTP call. So in your case, something like:
const nock = require('nock')

const scope = nock(url)
  .get('/somepath')
  .reply(200, {
    data: {
      key: 'value'
    },
  })
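Putting it together with Jest, the nock interceptor has to match the URL that axiosRequest actually calls. A minimal sketch, assuming a placeholder base URL of https://api.example.com and path /info (both are assumptions, since the real URL is elided in the question):
import nock from "nock"
import { getInfoById } from "./controllerFile"

describe("getInfoById", () => {
  afterEach(() => nock.cleanAll())

  it("returns 200 with the data from the external API", async () => {
    // Intercept the request made inside axiosRequest (URL and path are assumptions).
    nock("https://api.example.com")
      .get("/info")
      .reply(200, { key: "value" })

    const result = await getInfoById("some-name")
    expect(result).toEqual({ status: 200, key: "value" })
  })

  it("returns 500 when the external API fails", async () => {
    nock("https://api.example.com")
      .get("/info")
      .reply(503)

    const result = await getInfoById("some-name")
    expect(result).toEqual({ status: 500, err: "Internal Server Error" })
  })
})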

Need to find the error when connecting subscriptions with schema stitching

I am using apollo-server-express for my GraphQL back-end. I am going to process only mutations there, but I want to redirect queries and subscriptions to hasura by means of schema stitching with introspection. Queries through apollo-server to hasura are working fine and returning the expected data.
But subscriptions are not working and I am getting this error: "Expected Iterable, but did not find one for field subscription_root.users".
Besides, the hasura server is receiving the events:
But apollo-server does not accept the answer from hasura. I have been struggling with this for days and cannot understand what the problem is.
In the hasura editor, subscriptions work.
Link to full code
If you need any additional info, I will gladly provide it to you.
import {
  introspectSchema,
  makeExecutableSchema,
  makeRemoteExecutableSchema,
  mergeSchemas,
  transformSchema,
  FilterRootFields
} from 'graphql-tools';
import { HttpLink } from 'apollo-link-http';
import nodeFetch from 'node-fetch';
import { resolvers } from './resolvers';
import { hasRoleResolver } from './directives';
import { typeDefs } from './types';
import { WebSocketLink } from 'apollo-link-ws';
import { split } from 'apollo-link';
import { getMainDefinition } from 'apollo-utilities';
import { SubscriptionClient } from 'subscriptions-transport-ws';
import * as ws from 'ws';
import { OperationTypeNode } from 'graphql';

interface IDefinitionsParams {
  operation?: OperationTypeNode,
  kind: 'OperationDefinition' | 'FragmentDefinition'
}

const wsurl = 'ws://graphql-engine:8080/v1alpha1/graphql';

const getWsClient = function (wsurl: string) {
  const client = new SubscriptionClient(wsurl, {
    reconnect: true,
    lazy: true
  }, ws);
  return client;
};

const wsLink = new WebSocketLink(getWsClient(wsurl));

const createRemoteSchema = async () => {
  const httpLink = new HttpLink({
    uri: 'http://graphql-engine:8080/v1alpha1/graphql',
    fetch: (nodeFetch as any)
  });
  const link = split(
    ({ query }) => {
      const { kind, operation }: IDefinitionsParams = getMainDefinition(query);
      console.log('kind = ', kind, 'operation = ', operation);
      return kind === 'OperationDefinition' && operation === 'subscription';
    },
    wsLink,
    httpLink,
  );
  const remoteSchema = await introspectSchema(link);
  const remoteExecutableSchema = makeRemoteExecutableSchema({
    link,
    schema: remoteSchema
  });
  const renamedSchema = transformSchema(
    remoteExecutableSchema,
    [
      new FilterRootFields((operation, fieldName) => {
        return (operation === 'Mutation') ? false : true; // && fieldName === 'password'
      })
    ]
  );
  return renamedSchema;
};

export const createNewSchema = async () => {
  const hasuraExecutableSchema = await createRemoteSchema();
  const apolloSchema = makeExecutableSchema({
    typeDefs,
    resolvers,
    directiveResolvers: {
      hasRole: hasRoleResolver
    }
  });
  return mergeSchemas({
    schemas: [
      hasuraExecutableSchema,
      apolloSchema
    ]
  });
};
Fixed by installing graphql-tools version 4. It turns out the editor did not even notice that I did not have this dependency and simply took the version from node_modules that had been installed by some other package. The problem was with version 3.x. The pull request is where the bug was fixed.
I had the same problem, with a different cause and solution.
My subscription was working well until I introduced the 'resolve' key in my subscription resolver.
Here is the 'Subscription' part of my resolver:
Subscription: {
  mySubName: {
    resolve: (payload) => {
      console.log('In mySubName resolver, payload:', payload)
      return payload;
    },
    subscribe: () => pubSub.asyncIterator(['requestsIncomplete']),
  },
},
The console.log proved the resolve() function was being called with a well-structured payload (shaped the same as my schema definition, specifically an object with a key named after the GraphQL subscription, pointing to an array, and an array is an iterable):
In mySubName resolver, payload: { mySubName:
  [ { id: 41,
      ...,
    },
    {...},
    {...}
    ...
  ] }
Even though I was returning that same unadulterated object, it caused the error "Expected Iterable, but did not find one for field Subscription.mySubName".
When I commented out that resolve function altogether, the subscription worked, which is further evidence that my payload was well structured, with the right key pointing to an iterable.
I must be misusing the resolve field. From https://www.apollographql.com/docs/graphql-subscriptions/subscriptions-to-schema/:
When using subscribe field, it's also possible to manipulate the event
payload before running it through the GraphQL execution engine.
Add resolve method near your subscribe and change the payload as you wish
So I am not sure how to use that function properly, specifically what shape of object to return from it, but using it as above breaks the subscription in the same manner you describe in your question.
I was already using graphql-tools 4.0.0; I upgraded to 4.0.8 but it made no difference.
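For what it's worth, one more thing to try (an assumption on my part, not something confirmed in this thread): in graphql-js a custom resolve on a subscription field replaces the default field resolver, so it is expected to return the value of the field itself rather than the whole event object. Returning the array directly might satisfy the iterable check:
Subscription: {
  mySubName: {
    // Return the field value (the array), not the wrapping payload object.
    resolve: (payload) => payload.mySubName,
    subscribe: () => pubSub.asyncIterator(['requestsIncomplete']),
  },
},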
