I have a Firebase Cloud Function in which I am calling my external API endpoint like this:
const functions = require('firebase-functions');
var admin = require("firebase-admin");
admin.initializeApp(functions.config().firebase);
var request = require('request');
var moment = require('moment');
var rp = require('request-promise');
var db = admin.database();
exports.onCheckIn = functions.database.ref('/news/{newsId}/{entryId}')
    .onCreate(event => {
        console.log("Event Triggered");
        var eventSnapshot = event.data.val();
        request.post({
            url: 'http://MyCustomURL/endpoint/',
            form: {
                data: eventSnapshot.data
            }
        }, function(error, response, body){
            console.log(response);
        });
    });
I am using the Blaze plan and this is working completely fine. But the problem is that when I create bulk data (around 50 to 100 entries), the HTTP requests to my custom URL don't all go through. One or two HTTP requests are being skipped.
I have debugged my custom server and found that it never receives the missing requests from Firebase. But I have also checked the Cloud Functions logs and I can see that every event is being triggered correctly.
What could be the problem? Am I doing anything wrong?
You're not returning any value from your function. This means that Cloud Functions assumes that the function is done with its work after the last statement has run. But since you're making an asynchronous HTTP call in the function, that call may not have completed yet. Unfortunately you're not telling Cloud Functions about the fact that you have an outstanding call, so it may kill your function at any time after you return.
The solution is to return a promise that resolves after the HTTP request has completed. Since you're already including request-promise this is simple:
exports.onCheckIn = functions.database.ref('/news/{newsId}/{entryId}')
    .onCreate(event => {
        console.log("Event Triggered");
        var eventSnapshot = event.data.val();
        return rp.post({
            url: 'http://MyCustomURL/endpoint/',
            form: {
                data: eventSnapshot.data
            }
        });
    });
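If you also want failed calls to show up in the Cloud Functions logs (handy when requests seem to be skipped), one option is to chain a catch onto the returned promise. A minimal sketch, reusing the same rp.post call as above:
return rp.post({
    url: 'http://MyCustomURL/endpoint/',
    form: {
        data: eventSnapshot.data
    }
}).catch(err => {
    // Logging the error makes failed outbound requests visible in the Cloud Functions logs;
    // rethrowing marks this execution as failed instead of silently succeeding.
    console.error('POST to endpoint failed:', err);
    throw err;
});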
This is a common problem for developers who are new to JavaScript, or who are new to using JavaScript with Firebase, and is covered in the Firebase documentation, this blog post, and this video.
What are the consequences of making request global (or singleton), so that it is accessible all over the server and does not have to be passed in each function call? For example:
index.js:
const http = require('http');
const { saveReq } = require('./shared');
const {
doSomethingWithReqPassingItAsParameter,
doSomethingWithReqPassingItAsGlobal
} = require('./lib');
const requestListener = function (req, res) {
// approach 1
doSomethingWithReqPassingItAsParameter(req);
// approach 2
saveReq(req);
doSomethingWithReqPassingItAsGlobal();
res.writeHead(200);
res.end('Hello, World!');
}
const server = http.createServer(requestListener);
server.listen(8080);
lib.js:
const { loadReq } = require('./shared');
const doSomethingWithReqPassingItAsParameter = (req) => {
console.log('req as parameter', req.url);
};
const doSomethingWithReqPassingItAsGlobal = () => {
console.log('req as global', loadReq().url);
};
module.exports = {
doSomethingWithReqPassingItAsParameter,
doSomethingWithReqPassingItAsGlobal,
};
shared.js:
var request;
const saveReq = (r) => request = r;
const loadReq = () => request;
module.exports = {
saveReq,
loadReq,
}
This is very convenient for large projects with many levels of function calls, but how will parallel requests be handled? I know that Node.js is single-threaded; does that mean each HTTP request will run from start to finish separately, or can they overlap, so that using a global request object would make a mess?
The consequences are that your server will only work for one request at a time, and as soon as you have more than one request in flight at the same time, data will be mixed up between requests, leading to bugs, crashes, security issues and incorrect results.
Simply put, you cannot program a server that way. Pass the req object or data from it to any functions that need it. That keeps the appropriate req object associated with the right execution to avoid all the problems of trying to store a req object in some sort of global location where multiple requests in flight at the same time will/can conflict.
There is a relatively new thing in nodejs called "async local storage" that could perhaps be used for this. You can read a little about it here, though it's my personal opinion that it's still better to pass your request data to the functions that want to use it rather than the async local storage for this.
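For reference, a minimal sketch of what that could look like with Node's built-in AsyncLocalStorage (from the async_hooks module, available in recent Node versions); the requestContext name is just illustrative:
const http = require('http');
const { AsyncLocalStorage } = require('async_hooks');
const requestContext = new AsyncLocalStorage();
// Any function called while the store is active can read the current request
// without it being passed down as a parameter.
const doSomethingWithReq = () => {
    const req = requestContext.getStore();
    console.log('req from async local storage', req.url);
};
const server = http.createServer((req, res) => {
    // run() scopes the store to this request's async call chain, so concurrent
    // requests do not overwrite each other the way a plain global would.
    requestContext.run(req, () => {
        doSomethingWithReq();
        res.writeHead(200);
        res.end('Hello, World!');
    });
});
server.listen(8080);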
I have a Firebase function called getRequest which makes a simple HTTP call using the request npm package, with a variable named result that should contain the body of the response after the call completes.
However, the output is the "this should be replaced" string, because the HTTP call runs asynchronously.
How do you get the result variable to contain the body response from the http call?
const functions = require('firebase-functions');
const request = require('request');
exports.getRequest = functions.https.onRequest(() => {
    let result = "this should be replaced";
    request('http://worldclockapi.com/api/json/est/now', function(error, response, body){
        if (!error && response.statusCode == 200)
            result = body;
    });
    console.log(result);
});
I've tried using a callback, but I'm confused about where to put the parameter, because this code is actually already inside a callback.
request supports callback interfaces natively but does not return a promise. You must wait until the asynchronous call to the external API has finished before sending back the response, and for this you should use a Promise which resolves when the call to the API returns.
You can use the request-promise library and its rp() method, which "returns a regular Promises/A+ compliant promise", and adapt your code as follows:
const functions = require('firebase-functions');
const rp = require('request-promise');
exports.getRequest = functions.https.onRequest((req, res) => {
    let result = "this should be replaced";
    var options = {
        uri: 'http://worldclockapi.com/api/json/est/now',
        json: true // Automatically parses the JSON response body
    };
    rp(options)
        .then(parsedBody => {
            result = parsedBody.currentDateTime;
            console.log(result);
            res.send({ result });
        })
        .catch(err => {
            // API call failed...
            res.status(500).send({ 'Error': err });
        });
});
I would suggest you watch the official Video Series (https://firebase.google.com/docs/functions/video-series/) which explains the key point about returning Promises and also how to handle errors in an HTTP Cloud Function.
Two extra points to note:
onRequest() arguments
You need to pass two arguments to the onRequest() function: the Request and the Response objects. See https://firebase.google.com/docs/functions/http-events?authuser=0#trigger_a_function_with_an_http_request
exports.date = functions.https.onRequest((req, res) => {
// ...
});
Pricing plan
You need to be on the "Flame" or "Blaze" pricing plan.
As a matter of fact, the free "Spark" plan "allows outbound network requests only to Google-owned services". See https://firebase.google.com/pricing/ (hover your mouse on the question mark situated after the "Cloud Functions" title)
Since the worldclock API is not a Google-owned service, you need to switch to the "Flame" or "Blaze" plan.
The callback is called asynchronously, so console.log(result) runs before the callback does.
If you want to print the result variable with the content from the request, you need to print it from inside the callback:
const functions = require('firebase-functions');
const request = require('request');
exports.getRequest = functions.https.onRequest(() => {
    let result = "this should be replaced";
    request('http://worldclockapi.com/api/json/est/now', function(error, response, body){
        if (!error && response.statusCode == 200)
            result = body;
        console.log(result);
    });
});
However, I recommend using request-promise with async/await syntax:
const functions = require('firebase-functions');
const request = require('request-promise');
exports.getRequest = functions.https.onRequest(async (req, res) => {
    let result = "this should be replaced";
    result = await request('http://worldclockapi.com/api/json/est/now');
    console.log(result);
    // Send the response to terminate the HTTP function
    res.send(result);
});
I am testing a call to a SOAP service using the node-soap library.
This works fine as a standalone Node.js app and the SOAP service responds. However, when I package the same code up as a serverless AWS Lambda function (using the Serverless Framework, but also when executed directly in AWS Lambda), it doesn't appear to create the SOAP client.
Any thoughts on why this might be happening?
export async function main(event, context) {
    var soap = require('soap');
    var url = 'https://service.blah.co.uk/Service/Function.asmx?wsdl';
    var soapOptions = {
        forceSoap12Headers: true
    };
    var soapHeader = {
        'SOAPAction': 'http://www.blah.co.uk/Services/GetToken'
    };
    var params = {
        xmlMessage: message
    };
    console.log(JSON.stringify(params));
    soap.createClient(url, soapOptions, function (err, client) {
        //the serverless AWS lambda function never reaches this point (deployed and invoked locally)
        console.log("In create client")
        if (err) console.log(err);
        client.addSoapHeader(soapHeader);
        client.GetToken(params, function (err, data) {
            console.log(data);
        });
    });
}
I have created a very minimal working example using async/await. Rather than using the old-style callback approach, just invoke soap.createClientAsync(url).
Here's the code:
'use strict';
const soap = require('soap');
module.exports.hello = async (event, context) => {
    const url = 'http://www.thomas-bayer.com/axis2/services/BLZService?wsdl';
    const client = await soap.createClientAsync(url);
    console.log(client);
    return {
        statusCode: 200,
        message: 'It works!'
    };
}
(The partial execution logs and the function's output from the AWS Lambda Console are omitted here.)
EDIT 2 - The REAL problem
Check the async/await example above. The async keyword makes the function return a Promise and lets you await asynchronous work inside it. Since soap.createClient is asynchronous, your Lambda was being terminated before it could actually execute your callback.
Using async/await will make your code simpler and will stop you from beating your brains out over such a small thing as this.
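Applied to the GetToken call from the question, a rough sketch of what the async/await version could look like (recent node-soap versions also generate promise-returning *Async variants of the WSDL operations; where the xmlMessage value comes from is an assumption here):
export async function main(event, context) {
    const soap = require('soap');
    const url = 'https://service.blah.co.uk/Service/Function.asmx?wsdl';
    const soapOptions = { forceSoap12Headers: true };
    const soapHeader = { 'SOAPAction': 'http://www.blah.co.uk/Services/GetToken' };
    // Awaiting keeps the Lambda alive until the client has actually been created.
    const client = await soap.createClientAsync(url, soapOptions);
    client.addSoapHeader(soapHeader);
    // The *Async variants resolve to an array whose first element is the parsed result.
    const [data] = await client.GetTokenAsync({ xmlMessage: event.message }); // event.message is an assumed source for the message
    console.log(data);
    return { statusCode: 200, body: JSON.stringify(data) };
}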
I am trying to deploy a GraphQL server on Node.js using Azure Functions. I have been able to deploy a basic hello world app.
However, I need to get data from a backend API in the resolver. I am not able to get either the fetch or the request package to work in Azure Functions.
Below is my code:
var { graphql, buildSchema } = require('graphql');
var fetch = require('node-fetch');
var request = require('request');
var schema = buildSchema(`
    type Query {
        myObject: MyObject
    }
    type MyObject {
        someId (data: String) : String
    }
`);
var root = {
    myObject: () => {
        return {
            someId: (args) => {
                // Code enters till this point.
                // I can see context.info messages from here.
                // return "hello"; <--- This works perfectly fine.
                return request('http://example.com', function (error, response, body) {
                    // -----> Code never enters here.
                    return body;
                });
            }
        }
    }
};
module.exports = function (context, req) {
    context.log('JavaScript HTTP trigger function processed a request.');
    graphql(schema, req.body, root)
        .then(response => {
            context.res = {
                body: JSON.stringify(response)
            };
            context.done();
        });
};
I have tried using fetch and request modules. But with both of them, I see the same behavior - the response never returns. The request eventually times out after 5 minutes. If instead of fetch or request, I choose to return some dummy value, I see the response getting returned correctly to the query. With fetch, I don't see the then block or the catch block ever executing.
Note: I have tried both http and https URLs in the request URIs but none of them seem to return any data.
Is it an issue with the way I have implemented the fetch/request or is it an issue with Azure functions in general?
Answering my own question:
It seems that request doesn't actually return a promise, so the resolver returns before the API call has completed. Wrapping the request call in a Promise and returning that from the resolver solves the problem. Something similar to this answer.
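For reference, a minimal sketch of what that wrapping could look like in the someId resolver from the question (the example.com URL is the placeholder from the original code):
var root = {
    myObject: () => {
        return {
            someId: (args) => {
                // Returning a Promise lets graphql() wait for the HTTP call
                // to finish before it resolves the field.
                return new Promise((resolve, reject) => {
                    request('http://example.com', function (error, response, body) {
                        if (error) {
                            reject(error);
                        } else {
                            resolve(body);
                        }
                    });
                });
            }
        };
    }
};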
In my app.js
var employees = require('../models/employees');
employees.read(req.params.id, function(body) {
    console.log(body.firstName);
});
in my models/employees
var request = require('request');
var employees = {
    read: function(id, callback) {
        request
            .get('http://api.mysite.com/employees/' + id, function(error, response, body) {
                body = JSON.parse(body);
                return callback(body);
            });
    },
};
module.exports = employees;
This works (it returns the employee name correctly), but I'm not sure if this is the correct (async) way of getting data from an API and displaying it.
thank you!
Node.js is asynchronous by default, so you don't have to 'make' it work in an async manner; the callback pattern you are using here is the standard way to do this.
For future use though, once you have more requests, there may be times when you have to wait for a certain request to finish before you can fire the next one off, i.e. run tasks in series. In that case you can use something like http://caolan.github.io/async/ and queue the function calls in a waterfall/series model.
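For example, a small sketch with the async library, reusing the employees model from the question to read two employees one after another (the ids 1 and 2 are made up):
var async = require('async');
var employees = require('../models/employees');
// Each task calls its callback when done; the next task only starts after that.
async.series([
    function (done) {
        employees.read(1, function (body) {
            console.log(body.firstName);
            done(null, body);
        });
    },
    function (done) {
        employees.read(2, function (body) {
            console.log(body.firstName);
            done(null, body);
        });
    }
], function (err, results) {
    // results contains both employee objects, in the order the tasks ran.
    console.log('fetched', results.length, 'employees');
});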