Firebase node.js push async

I'm trying to use Firebase as a Node.js module, and I have a problem:
the Firebase docs say to write the same code for Node.js as for the JavaScript Firebase library.
var newChildRef = myRootRef.push({name: "Child1"}, function(error) {
  if (!error) {
    myModel.addChildto(newChildRef); // Here we reference a variable from the enclosing scope inside the callback!
  }
});
In Node.js we need to use async calls to databases, so the standard Node way would be:
myRootRef.push({name: "Child1"}, function(error, newChildRef) {
  if (!error) {
    myModel.addChildto(newChildRef);
  }
});
Please explain how I should code in Node.js using Firebase without spoiling the async style.

This was discussed in: https://groups.google.com/forum/#!topic/firebase-talk/eEsFpjE2mmI
The short answer is that it's better to use an inline function for each operation you do. We'll look into improving the API so it's easier to use the same callback for multiple set and push calls.
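To illustrate the inline-callback pattern, here is a minimal sketch. It uses a stand-in `push` function instead of a live Firebase connection, so the implementation below is illustrative only (the real SDK's `push` returns a reference synchronously and fires its completion callback later, which is the shape mimicked here):

```javascript
// Stand-in for Firebase's push(): returns a reference synchronously and
// invokes the completion callback asynchronously. Illustrative, not the real SDK.
function push(value, onComplete) {
  setImmediate(function () { onComplete(null); }); // null = no error
  return { key: "generated-key", value: value };
}

// One inline callback per operation; each callback closes over the
// reference returned for that specific operation.
const childA = push({ name: "Child1" }, function (error) {
  if (!error) console.log("saved", childA.key);
});
const childB = push({ name: "Child2" }, function (error) {
  if (!error) console.log("saved", childB.key);
});
```

Because the reference is returned synchronously and the inline callback closes over it, there is no need for the callback itself to receive the reference as an argument.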

Related

How to create a Flutter Stream using MongoDB (watch collection?) with Firebase Cloud Function

I've been trying out MongoDB as database for my Flutter project lately, since I want to migrate from pure Firebase database (some limitations in Firebase are an issue for my project, like the "in-array" limit of 10 for queries).
I already made some CRUD operations methods in some Firebase Cloud Functions, using MongoDB. I'm now able to save data and display it as a Future in a Flutter App (a simple ListView of Users in a FutureBuilder).
My question is: how would it be possible to feed a StreamBuilder using MongoDB and Firebase Cloud Functions? I saw some stuff about watch collections and change streams, but nothing clear enough for me (usually I read a lot of examples or tutorials to understand).
Maybe some of you have some clues, or a tutorial that I can read/watch to learn a little bit more about the subject?
For now, I have this as an example (a Node.js Cloud Function deployed to Firebase), which obviously produces a Future in my Flutter app (not realtime):
const { MongoClient } = require("mongodb");

exports.getUsers = functions.https.onCall(async (data, context) => {
  const uri = "mongodb+srv://....";
  const client = new MongoClient(uri);
  await client.connect();
  const results = await client.db("myDB").collection("user").find({}).toArray();
  await client.close();
  return results;
});
What would you advise to obtain a Stream instead of a Future, maybe using MongoDB's watch collections and change streams? Please provide an example if possible!
Thank you very much!
Cloud Functions are meant for short-lived operations, not for long-term listeners. It is not possible to create long-lived connections from Cloud Functions, neither to other services (such as you're trying to do to MongoDB here) nor from Cloud Functions back to the calling client.
Also see:
If I implement onSnapshot real-time listener to Firestore in Cloud Function will it cost more?
Can a Firestore query listener "listen" to a cloud function?
the documentation on Eventarc, which is the platform that allows you to build custom triggers. It'll be a lot more involved, though.

How to push console.log from Nodejs application to Elasticsearch

I am using the package elastic-apm-node for sending APM data to Elastic from a Node.js app. I can do something like apm.captureError(error) to send an error to Elastic.
Is there some way in which I can send a simple console.log() to Elastic?
I basically need functionality to track every function getting called in my app.
function A() {
  console.log("Function A called, send to Elastic");
  doSomething();
}

function doSomething() {
  console.log("Function doSomething called, send to Elastic");
}
I found an example using Elastic Filebeat, but I think this should be achievable using APM, since the connection is already made between the app and Elastic.
APM libraries like that (from any provider; it's the same with New Relic, Datadog, etc.) work by instrumenting (basically, wrapping) functions in common libraries like Express and Koa, so they automatically pick up on function calls like route handlers, but they won't pick up on unrelated code. Also, APM is for performance (and error) monitoring; logs are really a separate concern, which is why there's a separate solution for logs.
But you could do it with custom spans, overriding console.log or writing a wrapper. Example:
const elasticLog = (data) => {
  const span = apm.startSpan(data)
  if (span) span.end() // startSpan() returns null outside a transaction
}

const logger = process.env.NODE_ENV === 'production' ? elasticLog : console.log

function foo () {
  logger('In foo')
  bar()
}

function bar () {
  logger('In bar')
}
As for automatically detecting names of called functions, that's tricky. And running the logger on every function that's called without having to actually call it is pretty much impossible in this sort of context.
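One partial workaround for the function-name part: wrap each function once so every call is logged with the wrapped function's own name. A sketch, under the assumption that you apply the wrapper yourself (`withLogging` and the collecting logger are made-up names, not part of elastic-apm-node):

```javascript
// Wrap a function so each call logs the function's own name
// before delegating to the original.
function withLogging(fn, logger) {
  return function (...args) {
    logger("In " + fn.name);
    return fn.apply(this, args);
  };
}

// Collect log lines instead of printing, so the behavior is observable.
const lines = [];
const bar = withLogging(function bar() { return 42; }, (msg) => lines.push(msg));
const result = bar(); // logs "In bar", returns 42
```

This still requires opting each function in; it does not automatically instrument every function in the process, which is the part that remains impractical.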

When I use firebase.database().goOnline(); I get an error

this is my code
admin.initializeApp({...});
var db = admin.database();
let ref = db.ref(...);
await ref.once("value", function () {...});
firebase.database().goOffline();
firebase.database().goOnline();
ref.on("value", function () {...});
When I use firebase.database().goOnline(), I get an error:
Firebase: No Firebase App '[DEFAULT]' has been created - call Firebase App.initializeApp() (app/no-app).
at app
You're mixing two ways of addressing Firebase.
You can access Firebase through:
Its client-side JavaScript SDK, in which case the entry point is usually called firebase.
Its server-side Node.js SDK, in which case the entry point is usually called admin.
Now you initialize admin, but then try to access firebase.database(). Since you only called admin.initializeApp and never firebase.initializeApp(), that's where the error comes from.
My guess is that you're looking for admin.database().goOnline() or db.goOnline().
Also note that there is no reason to toggle goOffline()/goOnline() in the way you do now, and you're typically better off letting Firebase manage that on its own.

NodeJS stream out of AWS Lambda function

We are trying to migrate our zip microservice from a regular Node.js Express application to AWS API Gateway integrated with AWS Lambda.
Our current application sends a request to our API, gets a list of attachments, then visits those attachments and pipes their content back to the user in the form of a zip archive. It looks something like this:
module.exports = function requestHandler(req, res) {
  //...
  //irrelevant code
  //...
  return getFileList(params, token).then(function(fileList) {
    const filename = `attachments_${params.id}`;
    res.set('Content-Disposition', `attachment; filename=${filename}.zip`);
    streamFiles(fileList, filename).pipe(res); // <-- here the magic happens
  }, function(error) {
    errors[error](req, res);
  });
};
I have managed to do everything except the part where I have to stream content out of the Lambda function.
I think one possible solution is to use aws-serverless-express, but I'd like a more elegant solution.
Does anyone have any ideas? Is it even possible to stream out of Lambda?
Unfortunately, Lambda does not support streams as events or return values. (It's hard to find this mentioned explicitly in the documentation, except by noting how invocations and contexts/callbacks are described.)
In the case of your example, you will have to await streamFiles and then return the completed result.
(aws-serverless-express would not help here, if you check the code they wait for your pipe to finish before returning: https://github.com/awslabs/aws-serverless-express/blob/master/src/index.js#L68)
N.b. there's a nuance here: a lot of the language SDKs support streaming for requests/responses, but this means connecting to the stream transport, e.g. streaming the download of the complete response from the Lambda, not listening to a stream emitted from the Lambda.
Had the same issue; not sure how you can stream/pipe via native Lambda + API Gateway directly... but it's technically possible.
We used Serverless Framework and were able to use XX.pipe(res) using this starter kit (https://github.com/serverless/examples/tree/v3/aws-node-express-dynamodb-api)
What's interesting is that this just wraps over native lambda + API Gateway so, technically it is possible as they have done it.
Good luck

Testing web API using jasmine and node.js

We've written a RESTful web API which responds to GET and PUT requests using Node.js.
We're having some difficulty testing the API.
First, we used Zombie.js, but it's not well documented, so we couldn't get it to make PUT requests:
var zombie = require("zombie");

describe("description", function() {
  it("description", function() {
    zombie.visit("http://localhost:3000/", function (err, browser, status) {
      expect(browser.text).toEqual("A");
    });
  });
});
After that we tried a REST client called restler, which would be fine, since we don't need any advanced browser simulation. This fails because the request is asynchronous, i.e. the test is useless since it finishes before the 'complete' callback is called:
var rest = require('restler');

describe("description", function() {
  it("description", function() {
    rest.get("http://www.google.com").on('complete', function(data, response) {
      // Should fail
      expect(data).toMatch(/apa/i);
    });
  });
});
We'd be grateful for any tips about alternative testing frameworks or synchronous request clients.
For Node, jasmine-node from Misko Hevery has asynchronous support and wraps Jasmine.
https://github.com/mhevery/jasmine-node
You add a 'done' parameter to the test signature, and call that when the asynchronous call completes. You can also customize the timeout (the default is 500ms).
e.g. from the GitHub README (note the timeout is passed to it, not to request):
it("should respond with hello world", function(done) {
  request("http://localhost:3000/hello", function(error, response, body) {
    done();
  });
}, 250); // timeout after 250 ms
Regular Jasmine also supports asynchronous testing with runs and waitsFor, or you can use 'done' with Jasmine.Async.
I was curious about this so I did a little more research. Other than zombie, you have a couple of options...
You could use vows with the http library like this guy.
However, I think a better approach might be to use APIeasy, which is apparently built on vows. There is an awesome article over at nodejitsu that explains how to use it.
Another interesting idea is to use expresso if you are using express.
