Removing Firebase Realtime Database node using Cloud Function - node.js

I can successfully get a query param and post it into my realtime database at a path defined by the query param itself, using this code:
'use strict';
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
exports.emptyHat = functions.https.onRequest(async (req, res) => {
  // Grab the group parameter.
  const group = req.query.group;
  // Push the new message into the Realtime Database at the path named by the query param.
  const snapshot = await admin.database().ref(`/${group}`).push({list: group});
  // Redirect with 303 SEE OTHER to the URL of the pushed object in the Firebase console.
  res.redirect(303, snapshot.ref.toString());
});
If the query param was 'test', the result is a new entry at /test/{firebaseID} with value {'list': 'test'}.
When I try to modify it to remove the node named in the query parameter, I get errors.
(I'm trying to remove that top-level /test node.)
'use strict';
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
exports.emptyHat = functions.https.onRequest(async (req, res) => {
  // Grab the group parameter.
  const group = req.query.group;
  // Remove the node at the location named by 'group'.
  functions.database().ref(`/${group}`).remove();
  // Redirect with 303 SEE OTHER to the URL of the pushed object in the Firebase console.
  //res.redirect(303, snapshot.ref.toString());
});
Error message in the logs:
at exports.emptyHat.functions.https.onRequest (/srv/index.js:95:13)
at cloudFunction (/srv/node_modules/firebase-functions/lib/providers/https.js:49:9)
at /worker/worker.js:783:7
at /worker/worker.js:766:11
at _combinedTickCallback (internal/process/next_tick.js:132:7)
at process._tickDomainCallback (internal/process/next_tick.js:219:9)

Your second function is ignoring the promise returned by the Firebase API, which makes it unlikely to work in Cloud Functions. HTTP triggers require that you send a response only after all the asynchronous work is complete. Once your function sends a response, Cloud Functions terminates it and cleans up any async work still in progress, so that work might never complete.
Your second function should look more like your first one: use the promise returned by remove() and wait for it to complete before sending the response:
exports.emptyHat = functions.https.onRequest(async (req, res) => {
  const group = req.query.group;
  await admin.database().ref(`/${group}`).remove();
  // send the response here.
});
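For completeness, here is a minimal sketch of the finished function with the response and basic error handling filled in; the status codes and message text are illustrative additions, not part of the original answer:
exports.emptyHat = functions.https.onRequest(async (req, res) => {
  const group = req.query.group;
  try {
    // Wait for the delete to finish before responding.
    await admin.database().ref(`/${group}`).remove();
    res.status(200).send(`Removed /${group}`); // illustrative success message
  } catch (err) {
    console.error(err);
    res.status(500).send('Failed to remove node'); // illustrative error handling
  }
});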

Related

KoaJS can't handle POST requests on Cloud Functions

I have a Node.js application written with KoaJS.
app.ts
const app = new Koa();
app.use(healthCheck());
app.use(bodyParser());
app.use(errorHandler());
app.use(endpoints);
export default app;
main.ts
const port = process.env.PORT || 3000;
if (!isCloudFunctions()) {
  app
    .listen(port, () => {
      console.info(`Listening at http://localhost:${port}`);
    })
    .on('error', console.error);
}
export const api = (req, res) => {
  app.callback()(req, res);
}
The app works well on Cloud Run.
I can deploy the app on Cloud Functions, but on Functions it can only handle GET requests.
If I try a POST request, I get this error:
InternalServerError: stream is not readable
at getRawBody (/workspace/node_modules/raw-body/index.js:112:10)
at readStream (/workspace/node_modules/raw-body/index.js:178:17)
at AsyncFunction.module.exports [as json] (/workspace/node_modules/co-body/lib/json.js:39:21)
at executor (/workspace/node_modules/raw-body/index.js:113:5)
at parseBody (/workspace/node_modules/koa-bodyparser/index.js:100:26)
at new Promise (<anonymous>)
at bodyParser (/workspace/node_modules/koa-bodyparser/index.js:85:25)
at next (/workspace/node_modules/koa-compose/index.js:42:32)
at /workspace/webpack:/sample-explore/apps/sample-api/src/middlewares/health-check.ts:10:12
at Generator.next (<anonymous>)
I re-created the application in Express, and it works fine on both Cloud Run and Cloud Functions.
However, I really like KoaJS's native async/await and composable routing.
Does anyone know why KoaJS can't handle POST requests on Cloud Functions?
The JSON body is automatically parsed by Google Cloud Functions (see the documentation), and the koa-bodyparser middleware can't handle already-parsed bodies.
More info in this issue: https://github.com/koajs/bodyparser/issues/127
The fixes suggested in the issue thread are to either use ctx.req.body instead of ctx.request.body (you'll need to parse the body yourself when running locally, of course), or to add a middleware that supports already-parsed bodies:
function hybridBodyParser (opts) {
  const bp = bodyParser(opts)
  return async (ctx, next) => {
    // Prefer the body Cloud Functions already parsed; otherwise let koa-bodyparser run.
    ctx.request.body = ctx.request.body || ctx.req.body
    return bp(ctx, next)
  }
}
app.use(hybridBodyParser())
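For the first suggested fix, here is a minimal sketch of reading the already-parsed body directly; the catch-all route and response shape are made up for illustration:
const Koa = require('koa');
const app = new Koa();
// On Cloud Functions the raw Node request (ctx.req) already carries the parsed body.
// Locally ctx.req.body is undefined, so you'd still need a parser for local dev.
app.use(async (ctx) => {
  const body = ctx.req.body || ctx.request.body;
  ctx.body = { received: body };
});
exports.api = app.callback();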

How to fetch data from one API handler in another API handler in Next.js?

Using Next.js, I am struggling to fetch data from one API handler inside another API handler.
The /api/batch.js handler retrieves some data from MongoDB:
import middleware from '../../libs/mongodb';
import nextConnect from 'next-connect';

const handler = nextConnect();
handler.use(middleware);
handler.get(async (req, res) => {
  const {uuid} = req.query;
  const batch_stats = await req.db.collection('stats').find({
    'batch.uuid': uuid
  }).toArray();
  res.json(batch_stats);
});
export default handler;
The /api/stats.js handler retrieves some data from MySQL and, based on that plus the data from Mongo, builds a payload.
A dirty fix in stats.js works as below:
const response = await fetch('http://localhost:3000/api/batch?uuid=1234')
const data = await response.json()
// use data and build another payload
However, this requires making an HTTP call, which does not make sense for an internal endpoint.
I tried calling the batch handler directly, but I get undefined:
const getBatch = require('./batch');

module.exports = async (req, res) => {
  const accounts = await getBatch();
  res.status(200).json({accounts});
};
What am I missing?
You could extract the database-fetching logic into a separate module and import that module in both API handlers. Instead of one API endpoint making a request to another, both handlers would use the same module that fetches data from MongoDB.
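A sketch of that refactor; the module path libs/batch.js and the function name getBatchStats are made-up names, not from the question:
// libs/batch.js -- shared data-access module (hypothetical name)
export async function getBatchStats(db, batchUuid) {
  // The same Mongo query the /api/batch handler runs today.
  return db.collection('stats').find({'batch.uuid': batchUuid}).toArray();
}

// /api/batch.js -- now a thin wrapper around the shared module
import middleware from '../../libs/mongodb';
import nextConnect from 'next-connect';
import { getBatchStats } from '../../libs/batch';

const handler = nextConnect();
handler.use(middleware);
handler.get(async (req, res) => {
  res.json(await getBatchStats(req.db, req.query.uuid));
});
export default handler;
/api/stats.js would then import getBatchStats as well and call it with its own req.db connection, so no HTTP round trip is needed.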

AWS Lambda with MongoDB connection

I'm using AWS Lambda with the Serverless Framework, and I open my MongoDB connection inside my handler function with connection pooling.
When I deploy with sls deploy and call my Lambda for the first time, the connection is established once; subsequent API calls reuse that connection instead of creating a new one. That part works fine.
However, I then ran a separate script (unrelated to my AWS app) to test concurrent Lambda requests: it calls the same Lambda API in a loop using the request npm module. In that case a new connection is created on every call until the loop terminates, instead of reusing the one from the first call.
When I call the same API from Postman, the connection is reused after the first Lambda call, but when I run the script with node app.js, every iteration creates a new connection. Why is a new connection created each time the script runs, when the connection was already established on the first Lambda call?
Please help me out with this.
'use strict'
const bodyParser = require('body-parser')
const express = require('express')
const serverless = require('serverless-http')
const cors = require('cors');
const mongoConnection = require('./connection/mongoDb');

const app = express()
app.use(cors())
app.use(bodyParser.json())

const handler = serverless(app);

let cachedDb = null;

module.exports.handler = async (event, context) => {
  context.callbackWaitsForEmptyEventLoop = false;
  if (cachedDb == null) {
    let Database = await mongoConnection();
    console.log("DB", Database);
    cachedDb = Database
  }
  const baseRouter = require('./routes/index');
  app.use('/api', baseRouter);
  const result = await handler(event, context);
  return result;
};
Here is a Node.js example that shows the connection parameters. Perhaps this will help?
const express = require("express");
const bodyParser = require("body-parser");
const app = express();
const MongoClient = require("mongodb").MongoClient;

MongoClient.connect("mongodb://myusername:mypassword@localhost:27017", (err, client) => {
  if (err) return console.log(err)
  var db = client.db("mydatabase")
  db.collection("mycollection").countDocuments(getCountCallback);
  app.listen(3000, () => {
    console.log("listening on 3000")
  })
})

function getCountCallback(err, data) {
  console.log(data);
}

app.use(bodyParser.urlencoded({extended: true}))

app.get("/", (req, res) => {
  res.sendFile(__dirname + "/index.html")
})

app.post("/quotes", (req, res) => {
  console.log(req.body)
})
Your example code does not show any hostname for your database server, nor does it specify which port to use. Please compare and contrast your code with my example.
I see you defined the cachedDb variable outside the handler scope, which makes it available when the container is reused. However, there is no guarantee that the container will be reused (see my previous link on that), because that's not how Lambda works. If you invoke the same function many times in quick succession, Lambda needs to scale out horizontally to handle the requests; each instance gets its own container and its own connection.
When an invocation finishes, AWS keeps the container around for a while (how long depends on many factors, like function size and RAM limit). If you invoke the function again, those containers can reuse their connections. You can try invoking the function 20 times at 1-second intervals and counting the number of connections that have been opened. It will be lower than 20, but higher than 1.
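If you want to run that experiment, here is a minimal sketch of such a test script; the URL is a placeholder for your deployed endpoint:
const https = require('https');

const url = 'https://example.execute-api.us-east-1.amazonaws.com/dev/api/health'; // placeholder endpoint

let calls = 0;
const timer = setInterval(() => {
  const n = ++calls;
  https.get(url, (res) => {
    console.log(`call ${n}: status ${res.statusCode}`);
    res.resume(); // drain the body so the socket is released
  });
  if (n >= 20) clearInterval(timer);
}, 1000);
Then count the distinct connections in your MongoDB logs; with 1-second spacing, most invocations should land on a warm container and reuse its connection.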

Dialogflow Node.js Client - save data, but don't return a response

I'm using Dialogflow with the official Google Node.js library. I want to use a webhook to save input data to a database, but not return a response.
However, currently the function just waits until it times out, which is slow and writes an error to the logs: Error: No responses defined for platform: FACEBOOK
I've checked the documentation hoping for a way to send a 200 status or similar, but haven't found anything: https://dialogflow.com/docs/reference/fulfillment-library/webhook-client
Is it possible to do what I'd like to do? It seems like a fairly standard requirement.
UPDATE: My code is simple; here it is:
const functions = require('firebase-functions');
const {WebhookClient} = require('dialogflow-fulfillment');
process.env.DEBUG = 'dialogflow:debug'; // enables lib debugging statements
const firebase = require('firebase-admin');
firebase.initializeApp();

exports.dialogflowFirebaseFulfillment = functions.https.onRequest((req, res) => {
  const agent = new WebhookClient({
    request: req,
    response: res
  });
  // Derive the session id from the agent's session path.
  const session_split = agent.session.split('/');
  const session = session_split[session_split.length - 1];
  agent.handleRequest(input);

  function input(agent) {
    return firebase.database().ref('/tests/' + session).update({
      "input": agent.action
    });
  }
});

Firebase Cloud Functions Simply Refuse to Read From Realtime Database

Here is my code:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);
exports.createNewGame = functions.https.onRequest((request, response) => {
  var database = admin.database();
  database.ref().on('value', function(data) {
    console.log('SOMETHING');
  });
  response.end();
});
The problem is, it is not logging "SOMETHING". It is as if it refuses to call the on function: the logs show that the function was called and executed successfully, but nothing inside the on callback actually runs.
Any help is appreciated. I have no idea what else to try; I have literally copy-pasted code from Firebase's documentation to no avail.
Cheers!
You're sending the response before the data has been read from the database. This likely terminates the function before the log statement can execute.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.createNewGame = functions.https.onRequest((request, response) => {
  var database = admin.database();
  database.ref().once('value', function(data) {
    console.log('SOMETHING');
    response.end();
  });
});
Note that I also changed on() to once(), since you're only interested in receiving the current value.
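Since once('value') also returns a promise, an equivalent async/await version of the same fix might look like this (a sketch):
exports.createNewGame = functions.https.onRequest(async (request, response) => {
  // Wait for the read to complete, then log and respond.
  const snapshot = await admin.database().ref().once('value');
  console.log('SOMETHING', snapshot.val());
  response.end();
});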
