How to call Firebase onRequest Cloud Function? - node.js

I have the following addEventToCalendar cloud function:
const { google } = require("googleapis");
const calendar = google.calendar("v3");
const functions = require("firebase-functions"); // must not be commented out: it is used below
const googleCredentials = require("./credentials.json");
const OAuth2 = google.auth.OAuth2;

const ERROR_RESPONSE = {
  status: "500",
  message: "There was an error adding an event to your Google calendar",
};
const TIME_ZONE = "EST";

const addEvent = (event, auth) => {
  return new Promise(function (resolve, reject) {
    calendar.events.insert(
      {
        auth: auth,
        calendarId: "primary",
        resource: {
          summary: event.eventName,
          description: event.description,
          start: {
            dateTime: event.startTime,
            timeZone: TIME_ZONE,
          },
          end: {
            dateTime: event.endTime,
            timeZone: TIME_ZONE,
          },
        },
      },
      (err, res) => {
        if (err) {
          console.log("Rejecting because of error");
          reject(err);
          return; // without this, resolve() below would also run on error
        }
        console.log("Request successful");
        resolve(res.data);
      }
    );
  });
};
exports.addEventToCalendar = functions.https.onRequest((request, response) => {
  const eventData = {
    eventName: request.body.eventName,
    description: request.body.description,
    startTime: request.body.startTime,
    endTime: request.body.endTime,
  };
  console.log("Event Data: ", eventData);

  const oAuth2Client = new OAuth2(
    googleCredentials.web.client_id,
    googleCredentials.web.client_secret,
    googleCredentials.web.redirect_uris[0]
  );
  oAuth2Client.setCredentials({
    refresh_token: googleCredentials.refresh_token,
  });

  addEvent(eventData, oAuth2Client)
    .then((data) => {
      response.status(200).send(data);
      return;
    })
    .catch((err) => {
      console.error("Error adding event: " + err.message);
      response.status(500).send(ERROR_RESPONSE);
      return;
    });
});
The URL looks like this: http://localhost:5001/prototype-dev-fadd5/us-central1/addEventToCalendar
How do I call this by passing the following data?

{
  "eventName": "Firebase Event",
  "description": "This is a sample description",
  "startTime": "2018-12-01T10:00:00",
  "endTime": "2018-12-01T13:00:00"
}
When I say call, it implies two things:

Call it from a browser, just to test it.
Call it from another Firebase Cloud Function which runs on a trigger.

To call it from a browser, you can use the Fetch API:
const data = {
  eventName: 'Firebase Event',
  description: 'This is a sample description',
  startTime: '2018-12-01T10:00:00',
  endTime: '2018-12-01T13:00:00'
};

fetch('http://localhost:5001/prototype-dev-fadd5/us-central1/addEventToCalendar', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json'
  },
  body: JSON.stringify(data)
}).then(response => {
  // ...
}).catch(error => {
  // ...
});
To call it from another Cloud Function, I'd suggest exporting the addEvent function so you can import and use it elsewhere.
I'd also move the creation of the oAuth2Client object into the addEvent() function, so that you don't have to create an OAuth client before calling addEvent().
If you can't re-use the addEvent() function because the two Cloud Functions live in separate projects and you can't share code between them, I'd call the function over HTTP using a tool like Axios or node-fetch.
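A minimal sketch of that refactor, assuming the helper moves into its own calendar.js module (the file name, the bookings/{bookingId} path, and the onBookingCreated trigger are illustrative, not from the question):

// calendar.js -- a shared helper that builds its own OAuth client
const { google } = require("googleapis");
const googleCredentials = require("./credentials.json");

const calendar = google.calendar("v3");
const TIME_ZONE = "EST";

module.exports.addEvent = (event) => {
  const oAuth2Client = new google.auth.OAuth2(
    googleCredentials.web.client_id,
    googleCredentials.web.client_secret,
    googleCredentials.web.redirect_uris[0]
  );
  oAuth2Client.setCredentials({ refresh_token: googleCredentials.refresh_token });

  return new Promise((resolve, reject) => {
    calendar.events.insert(
      {
        auth: oAuth2Client,
        calendarId: "primary",
        resource: {
          summary: event.eventName,
          description: event.description,
          start: { dateTime: event.startTime, timeZone: TIME_ZONE },
          end: { dateTime: event.endTime, timeZone: TIME_ZONE },
        },
      },
      (err, res) => (err ? reject(err) : resolve(res.data))
    );
  });
};

// index.js -- a triggered function can now reuse the same helper
const functions = require("firebase-functions");
const { addEvent } = require("./calendar");

exports.onBookingCreated = functions.firestore
  .document("bookings/{bookingId}")
  .onCreate((snap) => addEvent(snap.data()));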

I know the tutorial you are using.
There are two good ways to test it:
Google Cloud Function Console, Testing tab
Postman
And I can think of three good ways to actually use it:
fetch from JavaScript on the frontend
callable Firebase Functions from the frontend
export/import to use it from the backend
Testing - Google Cloud Functions Console.
https://console.cloud.google.com/functions/
Testing - Postman
To test you can also use Postman.
Create a POST request with the URL of your function.
Then set the body to raw, with the type set to JSON. Once that is all set up, hit Send and the response will show up if you did everything correctly.
Using - fetch
Fetch provides a global fetch() method that provides an easy, logical way to fetch resources asynchronously across the network.
https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API/Using_Fetch
And luckily, Postman will write the fetch code for you.
Click on the Code tab, and select JavaScript - Fetch from the dropdown.
Using - Callable Functions
Callable Functions are functions you can call directly from your JavaScript on the frontend. They are explained here: https://firebase.google.com/docs/functions/callable
The biggest difference is that instead of using functions.https.onRequest in your function declaration, you use functions.https.onCall.
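A minimal sketch of the callable variant (the client snippet assumes the v8-style namespaced Firebase JS SDK; error handling omitted):

// Backend: the callable version of the function
const functions = require("firebase-functions");

exports.addEventToCalendar = functions.https.onCall((data, context) => {
  // data arrives already parsed: { eventName, description, startTime, endTime }
  // returning a promise sends its resolved value back to the caller
  return addEvent(data, oAuth2Client); // the helper and OAuth client from the question
});

// Frontend: the SDK wraps the payload and handles headers/auth for you
const addEventToCalendar = firebase.functions().httpsCallable("addEventToCalendar");
addEventToCalendar({
  eventName: "Firebase Event",
  description: "This is a sample description",
  startTime: "2018-12-01T10:00:00",
  endTime: "2018-12-01T13:00:00",
}).then((result) => console.log(result.data));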
Using - export to use in Node.js on your server
If you want to call a function directly in Node.js on your server (i.e. in your Firebase Functions code), and the function is in the same file, you call it like any other JavaScript function.
function normalFunction() {
  // do work
  return data;
}

exports.addEventToCalendar = functions.https.onRequest((request, response) => {
  let data = normalFunction();
  response.send(data);
});
You can put whatever you want in normalFunction and call it wherever you want. Just make sure to use a promise, or await, if it is doing something asynchronous, such as talking to a server.
If you want to use normalFunction from a different file, you would need to use export: https://developer.mozilla.org/en-US/docs/web/javascript/reference/statements/export
Firebase has a good doc on how you can organize multiple files here: https://firebase.google.com/docs/functions/organize-functions
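Note that Cloud Functions code conventionally uses CommonJS (module.exports / require) rather than the ES-module export statement; a minimal two-file sketch (the file names are illustrative):

// helpers.js
module.exports.normalFunction = async function () {
  // do work, awaiting anything asynchronous
  return { message: "some data" };
};

// index.js
const functions = require("firebase-functions");
const { normalFunction } = require("./helpers");

exports.addEventToCalendar = functions.https.onRequest(async (request, response) => {
  const data = await normalFunction();
  response.send(data);
});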

Related

Can not invoke the firebase cloud function where I have 3rd party api request

I have a function to get vehicle images from a 3rd party API called evox.
The following is the function I use in my cloud function (edited version):
const axios = require('axios').default;
const functions = require('firebase-functions'); // needed for functions.config() below

/**
 * @param {Number} vifNum vif / evox number of the requested vehicle
 * @returns {Promise} { image: '' }
 */
module.exports.makeImagesRequest = (vifNum) => {
  return new Promise(async (resolve, reject) => {
    try {
      var returnData = { image: "" };
      var productId = 0;
      var productType = 0;
      // Set headers for request
      var imagesReqUrl = 'http://api.evoximages.com/api/v1/vehicles/' + vifNum + '/products/' + productId + '/' + productType;
      const imagesRequestOptions = {
        url: imagesReqUrl,
        method: 'GET',
        headers: {
          'x-api-key': functions.config().evox.key,
          'Accept': 'application/json',
          'Content-Type': 'application/json;charset=UTF-8'
        }
      };
      // Send request and set the returned data
      const response = await axios(imagesRequestOptions);
      returnData.image = response.data.urls[0];
      resolve(returnData);
    }
    catch (error) {
      reject({ error, helperMsg: "Vif number is not valid or connection problem" });
    }
  });
};
I use Postman to send requests to the API in the following format:
{
  "data": {
    "vifNum": 99999
  }
}
Here is where I export the cloud function (edited version):
module.exports.getVehicleImages = functions.https.onCall((data, context) => {
  return new Promise(async (resolve, reject) => {
    try {
      const val = await makeImagesRequest(data.vifNum);
      resolve(val);
    }
    catch (error) {
      reject({ error, msg: "Error while getting vehicle image by evox" });
    }
  });
});
The weird thing is that when I use the emulators to test my functions, it works fine: no errors, and the information is correct and consistent. But when I deployed my functions and called that function with the API URL from Postman, with the exact same data, it gives an error inside this makeImagesRequest function.
IMPORTANT EDIT: I tried using another 3rd party API, VIN, and that API worked in cloud functions. How I built the evox and VIN functions is almost exactly the same. The only notable difference is that evox requires that 'x-api-key' in the header, while VIN does not need any API key. Could the error be related to this? PS: functions.config() is properly set; I checked that by pulling a .runtimeconfig.json file from Firebase.
It seems that the problem was with my functions.config(): even though evox.key showed up when I ran the command firebase functions:config:get > .runtimeconfig.json, in my deployed cloud function it was not valid.
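If anyone runs into the same thing, re-setting the value and redeploying is usually the fix for a stale runtime config (the key name matches the question; the value is a placeholder):

firebase functions:config:set evox.key="YOUR_EVOX_API_KEY"
firebase deploy --only functions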

Error with Lambda Function running on NodeJS netlify-lambda — "TypeError: Expected signal to be an instanceof AbortSignal"

I’ve recently been trying to set up a Lambda Function on a Netlify site. The rest of the site is running on Gatsby. I’m able to get some basic functions working in development and production (e.g. returning a “hello world”), but I’m running into a problem whenever I try something more complex.
My function always seems to return an error in development which says Function invocation failed: TypeError: Expected signal to be an instanceof AbortSignal. I haven’t tried getting it to work in production (although it obviously won’t work as it stands, relying on .env). I'm using netlify-lambda for this.
For instance, here is the code I’m trying to get to work, lifted from this gist. I'm trying to fetch records from a simple Airtable database using their API.
require("dotenv").config({ debug: process.env.DEBUG })
const Airtable = require("airtable")
Airtable.configure({
endpointUrl: "https://api.airtable.com",
apiKey: process.env.AIRTABLE_PASS,
})
var base = Airtable.base(process.env.AIRTABLE_ID)
exports.handler = function(event, context, callback) {
const allRecords = []
base('Main')
.select({
maxRecords: 100,
view: 'all'
})
.eachPage(
function page(records, fetchNextPage) {
records.forEach(function(record) {
allRecords.push(record)
})
fetchNextPage()
},
function done(err) {
if (err) {
callback(err)
} else {
const body = JSON.stringify({ records: allRecords })
const response = {
statusCode: 200,
body: body,
headers: {
'content-type': 'application/json',
'cache-control': 'Cache-Control: max-age=300, public'
}
}
callback(null, response)
}
}
)
}
The error occurs whether I hit the url from my browser, or send a GET or POST request through Postman or a form.
The Netlify site name is affectionate-engelbart-b6885d, and the repo is here.
Happy to post error logs if that’s helpful. Thank you!
Downgrade airtable to version 0.8.1.
I had the same issue, and downgrading fixed it.
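That is, pin the older version:

npm install airtable@0.8.1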

How to connect to Azure SQL database from multiple Azure Functions TypeScript API endpoints

I'm creating an API in Azure Functions using TypeScript, with multiple endpoints connecting to the same Azure SQL Server. Each endpoint was set up using the Azure Functions extension for VS Code, with the HttpTrigger TypeScript template. Each endpoint will eventually make different calls to the database, collecting from, processing and storing data to different tables.
There don't seem to be any default bindings for Azure SQL (only Storage or Cosmos), and while tedious is used in some Microsoft documentation, that documentation tends not to cover Azure Functions, which appear to run asynchronously. What's more, other similar StackOverflow questions tend to be for standard JavaScript, and use the module.exports = async function (context) syntax, rather than the const httpTrigger: AzureFunction = async function (context: Context, req: HttpRequest): Promise<void> syntax used by the TypeScript HttpTrigger templates.
Here's what I've got so far in one of these endpoints, with sample code from the tedious documentation in the default Azure Functions HttpTrigger:
var Connection = require('tedious').Connection;

var config = {
  server: process.env.AZURE_DB_SERVER,
  options: {},
  authentication: {
    type: "default",
    options: {
      userName: process.env.AZURE_DB_USER,
      password: process.env.AZURE_DB_PASSWORD,
    }
  }
};

import { AzureFunction, Context, HttpRequest } from "@azure/functions"

const httpTrigger: AzureFunction = async function (context: Context, req: HttpRequest): Promise<void> {
  context.log('HTTP trigger function processed a request.');
  const name = (req.query.name || (req.body && req.body.name));

  if (name) {
    var connection = new Connection(config);
    connection.on('connect', function(err) {
      if (err) {
        console.log('Error: ', err)
      }
      context.log('Connected to database');
    });
    context.res = {
      // status: 200, /* Defaults to 200 */
      body: "Hello " + (req.query.name || req.body.name)
    };
  }
  else {
    context.res = {
      status: 400,
      body: "Please pass a name on the query string or in the request body"
    };
  }
};

export default httpTrigger;
This ends up with the following message:
Warning: Unexpected call to 'log' on the context object after function execution has completed. Please check for asynchronous calls that are not awaited or calls to 'done' made before function execution completes. Function name: HttpTrigger1. Invocation Id: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx. Learn more: https://go.microsoft.com/fwlink/?linkid=2097909
Again, the async documentation linked to covers just the standard JavaScript module.exports = async function (context) syntax, rather than the syntax used by these TypeScript httpTriggers.
I've also been reading that best practice might be to have a single connection, rather than connecting anew each time these endpoints are called - but again unsure if this should be done in a separate function that all of the endpoints call. Any help would be much appreciated!
I'm glad that using the 'mssql' node package works for you:
const sql = require('mssql');
require('dotenv').config();

module.exports = async function (context, req) {
  try {
    await sql.connect(process.env.AZURE_SQL_CONNECTIONSTRING);
    const result = await sql.query`select Id, Username from Users`;
    context.res.status(200).send(result);
  } catch (error) {
    context.log('error occurred ', error);
    context.res.status(500).send(error);
  }
};
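On the question's last point, about reusing a single connection across endpoints: with mssql you can cache one connection pool per warm function instance in its own module and require it from every endpoint. A minimal sketch, assuming a shared db.js at the project root and the same AZURE_SQL_CONNECTIONSTRING setting:

// db.js -- one pool per warm instance, shared by all endpoints
const sql = require('mssql');

let poolPromise; // cached across invocations while the instance stays warm

module.exports.getPool = () => {
  if (!poolPromise) {
    poolPromise = sql.connect(process.env.AZURE_SQL_CONNECTIONSTRING);
  }
  return poolPromise;
};

// HttpTrigger1/index.js -- any endpoint just awaits the shared pool
const { getPool } = require('../db');

module.exports = async function (context, req) {
  const pool = await getPool();
  const result = await pool.request().query('select Id, Username from Users');
  context.res = { status: 200, body: result.recordset };
};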
Ref: https://stackoverflow.com/questions/62233383/understanding-js-callbacks-w-azure-functions-tedious-sql
Hope this helps.

How to assert that app sends correct data to API server with POST request

I am writing a React.js application talking to an API server. I have read tons of articles on how to mock these calls and send some fake response from the API. I can do testing using @testing-library/react, I can easily mock axios with axios-mock-adapter, and I can test fetch requests using the HTTP GET method. But I cannot find anywhere how to make sure that my app, when it sends some POST request, sends the correct data to the API, i.e. that my app sends a JSON payload with e.g. an "id" field, or a "name" field set to "abc", or something like this.
I am new to React.js. Please advise how to make tests asserting what the app sends to API. Is it possible?
Let's say that I have a function named doSomething, like below, called with onClick of some button.
const doSomething = async (userId, something) => {
  try {
    await REST_API.post('doSomething', {
      user_id: userId,
      something: something
    });
    return true;
  } catch (error) {
    window.alert(error);
    return false;
  }
};
REST_API above is an axios instance.
How can I ensure that I (or some other developer) didn't make a typo and put "userId" instead of "user_id" in the payload of the request?
If you need to be sure you call the API correctly, I'd use Jest as follows:
import axios from 'axios';
import { doSomething } from './doSomething'; // adjust the path to wherever doSomething lives

jest.mock('axios', () => ({
  post: jest.fn(),
}));

describe('test', () => {
  it('doSomething', () => {
    const userId = 123;
    const something = 'abc';

    doSomething(userId, something);

    expect(axios.post).toBeCalledWith(
      'doSomething', {
        user_id: userId,
        something,
      },
    );
  });
});
Or, if you use an instance, define it in another file (axios_instance.js) and use the following test:
import { instance } from './axios_instance';
import { doSomethingInstance } from './doSomethingInstance'; // adjust to your file layout

jest.mock('./axios_instance', () => ({
  instance: {
    post: jest.fn(),
  },
}));

describe('test', () => {
  it('doSomething', () => {
    const userId = 123;
    const something = 'abc';

    doSomethingInstance(userId, something);

    expect(instance.post).toBeCalledWith(
      'doSomething', {
        user_id: userId,
        something,
      },
    );
  });
});
For your need I would use Swagger and its tooling. You would kill three birds with one stone:
Have proper API documentation: https://swagger.io/tools/swagger-ui/
Protect the backend: ensure inputs/outputs are valid, and throw a detailed exception if a client sends bad data: https://github.com/cdimascio/express-openapi-validator-example
Protect the frontend: use client API generation to generate the JS classes used by your clients. That way they won't arbitrarily create objects manually and send them to the server (crossing fingers) but will use a dedicated API with setters: https://github.com/swagger-api/swagger-codegen
That way you have a rock-solid Frontend + Backend + Documentation combo.

How to write unit test for the function which is accessing aws resources?

I have a function which accesses multiple AWS resources and I now need to test this function, but I don't know how to mock these resources.
I have tried following the GitHub page of aws-sdk-mock, but didn't get much there.
function someData(event, configuration, callback) {
  // sts set-up
  var sts = new AWS.STS(configuration.STS_CONFIG);
  sts.assumeRole({
    DurationSeconds: 3600,
    RoleArn: process.env.CROSS_ACCOUNT_ROLE,
    RoleSessionName: configuration.ROLE_NAME
  }, function(err, data) {
    if (err) {
      // an error occurred
      console.log(err, err.stack);
    } else {
      // successful response
      // resolving static credential
      var creds = new AWS.Credentials({
        accessKeyId: data.Credentials.AccessKeyId,
        secretAccessKey: data.Credentials.SecretAccessKey,
        sessionToken: data.Credentials.SessionToken
      });
      // Query function
      var dynamodb = new AWS.DynamoDB({apiVersion: configuration.API_VERSION, credentials: creds, region: configuration.REGION});
      var docClient = new AWS.DynamoDB.DocumentClient({apiVersion: configuration.API_VERSION, region: configuration.REGION, endpoint: configuration.DDB_ENDPOINT, service: dynamodb});
      // extract params
      var ID = event.queryStringParameters.Id;
      console.log('metrics of id ' + ID);
      var params = {
        TableName: configuration.TABLE_NAME,
        ProjectionExpression: configuration.PROJECTION_ATTR,
        KeyConditionExpression: '#ID = :ID',
        ExpressionAttributeNames: {
          '#ID': configuration.ID
        },
        ExpressionAttributeValues: {
          ':ID': ID
        }
      };
      queryDynamoDB(params, docClient).then((response) => {
        console.log('Params: ' + JSON.stringify(params));
        // if the query is Successful
        if (typeof(response[0]) !== 'undefined') {
          response[0]['Steps'] = process.env.STEPS;
          response[0]['PageName'] = process.env.STEPS_NAME;
        }
        console.log('The response you get', response);
        var success = {
          statusCode: HTTP_RESPONSE_CONSTANTS.SUCCESS.statusCode,
          body: JSON.stringify(response),
          headers: {
            'Content-Type': 'application/json'
          },
          isBase64Encoded: false
        };
        return callback(null, success);
      }, (err) => {
        // return internal server error
        return callback(null, HTTP_RESPONSE_CONSTANTS.BAD_REQUEST);
      });
    }
  });
}
This is the Lambda function I need to test; there are also some env variables being used here.
I tried writing a unit test for the above function using aws-sdk-mock, but I still can't figure out how to actually do it. Any help will be appreciated. Below is my test code:
describe('test getMetrics', function() {
  var expectedOnInvalid = HTTP_RESPONSE_CONSTANTS.BAD_REQUEST;

  it('should assume role ', function(done) {
    var event = {
      queryStringParameters: {
        Id: '123456'
      }
    };
    AWS.mock('STS', 'assumeRole', 'roleAssumed');
    AWS.restore('STS');
    AWS.mock('Credentials', 'credentials');
    AWS.restore('Credentials');
    AWS.mock('DynamoDB.DocumentClient', 'get', 'message');
    AWS.mock('DynamoDB', 'describeTable', 'message');
    AWS.restore('DynamoDB');
    AWS.restore('DynamoDB.DocumentClient');
    someData(event, configuration, (err, response) => {
      expect(response).to.deep.equal(expectedOnInvalid);
      done();
    });
  });
});
I am getting the following error:

MultipleValidationErrors: There were 2 validation errors:
* MissingRequiredParameter: Missing required key 'RoleArn' in params
* MissingRequiredParameter: Missing required key 'RoleSessionName' in params
Try setting the aws-sdk module explicitly.
Project structures that don't include the aws-sdk at the top-level node_modules project folder will not be properly mocked. An example of this would be installing the aws-sdk in a nested project directory. You can get around this by explicitly setting the path to a nested aws-sdk module using setSDK(), or by passing the imported module with setSDKInstance():

const AWSMock = require('aws-sdk-mock');
const AWS = require('aws-sdk');

AWSMock.setSDKInstance(AWS);

For more details on this, read the aws-sdk-mock documentation; they have explained it even better.
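For completeness, a minimal sketch of mocking STS.assumeRole with aws-sdk-mock so that the question's callback-style code receives fake credentials (the mocked values are placeholders; note the mock must be registered before the code under test constructs its AWS.STS client):

const AWSMock = require('aws-sdk-mock');
const AWS = require('aws-sdk');

AWSMock.setSDKInstance(AWS);

// Replace assumeRole with a stub that invokes the callback with fake credentials
AWSMock.mock('STS', 'assumeRole', (params, callback) => {
  callback(null, {
    Credentials: {
      AccessKeyId: 'mock-access-key',
      SecretAccessKey: 'mock-secret-key',
      SessionToken: 'mock-session-token',
    },
  });
});

// ... invoke the function under test here, then clean up:
AWSMock.restore('STS');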
I strongly disagree with @ttulka's answer, so I have decided to add my own as well.
Given you received an event in your Lambda function, it's very likely you'll process the event and then invoke some other service. It could be a call to S3, DynamoDB, SQS, SNS, Kinesis...you name it. What is there to be asserted at this point?
Correct arguments!
Consider the following event:
{
  "data": "some-data",
  "user": "some-user",
  "additionalInfo": "additionalInfo"
}
Now imagine you want to invoke documentClient.put and you want to make sure that the arguments you're passing are correct. Let's also say that you DON'T want the additionalInfo attribute to be persisted, so somewhere in your code you'd have this line to get rid of it:

delete event.additionalInfo

right?
You can now create a unit test to assert that the correct arguments were passed into documentClient.put, meaning the final object should look like this:
{
  "data": "some-data",
  "user": "some-user"
}
Your test must assert that documentClient.put was invoked with a JSON which deep equals the JSON above.
If you or any other developer now, for some reason, removes the delete event.additionalInfo line, tests will start failing.
And this is very powerful! If you make sure that your code works the way you expect, you basically don't have to worry about creating integration tests at all.
Now, if a SQS consumer Lambda expects the body of the message to contain some field, the producer Lambda should always take care of it to make sure the right arguments are being persisted in the Queue. I think by now you get the idea, right?
I always tell my colleagues that if we can create proper unit tests, we should be good to go in 95% of the cases, leaving integration tests out. Of course it's better to have both, but given the amount of time spent on creating integration tests (setting up environments, credentials, sometimes even different accounts), it is often not worth it. But that's just MY opinion. Both you and @ttulka are more than welcome to disagree.
Now, back to your question:
You can use Sinon to mock and assert arguments in your Lambda functions. If you need to mock a 3rd-party service (like DynamoDB, SQS, etc), you can create a mock object and replace it in your file under test using Rewire. This usually is the road I ride and it has been great so far.
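A minimal sketch of that combination, assuming the handler lives in handler.js, keeps its DocumentClient in a module-level docClient variable, and exports a processEvent function (all of those names are hypothetical):

const sinon = require('sinon');
const rewire = require('rewire');

const handler = rewire('./handler'); // hypothetical module under test

it('persists the event without additionalInfo', async () => {
  const putStub = sinon.stub().returns({ promise: () => Promise.resolve({}) });
  // swap the real DocumentClient inside the module for a stub
  handler.__set__('docClient', { put: putStub });

  await handler.processEvent({
    data: 'some-data',
    user: 'some-user',
    additionalInfo: 'additionalInfo',
  });

  // strict deep-equal on the arguments: fails if additionalInfo was persisted
  sinon.assert.calledWithExactly(putStub, {
    TableName: 'events', // hypothetical table name
    Item: { data: 'some-data', user: 'some-user' },
  });
});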
I see unit testing as a way to check if your domain (business) rules are met.
As long as your Lambda contains only integration of AWS services, it doesn't make much sense to write a unit test for it.
To mock all the resources means, your test will be testing only communication among those mocks - such a test has no value.
External resources mean input/output, this is what integration testing focuses on.
Write integration tests and run them as a part of your integration pipeline against real deployed resources.
This is how we can mock STS in Node.js:

import { STS } from 'aws-sdk';

export default class GetCredential {
  constructor(public sts: STS) { }

  public async getCredentials(role: string) {
    console.info('Retrieving credential...', { role });
    const apiRole = await this.sts
      .assumeRole({
        RoleArn: role,
        RoleSessionName: 'test-api',
      })
      .promise();

    if (!apiRole?.Credentials) {
      throw new Error(`Credentials for ${role} could not be retrieved`);
    }
    return apiRole.Credentials;
  }
}
A mock for the above function:

import { STS } from 'aws-sdk';
import GetCredential from './GetCredential';

const sts = new STS();
let testService: GetCredential;

beforeEach(() => {
  testService = new GetCredential(sts);
});

describe('Given getCredentials has been called', () => {
  it('The method returns a credential', async () => {
    const credential = {
      AccessKeyId: 'AccessKeyId',
      SecretAccessKey: 'SecretAccessKey',
      SessionToken: 'SessionToken'
    };
    const mockGetCredentials = jest.fn().mockReturnValue({
      promise: () => Promise.resolve({ Credentials: credential }),
    });
    testService.sts.assumeRole = mockGetCredentials;

    const result = await testService.getCredentials('fakeRole');

    expect(result).toEqual(credential);
  });
});
