How to create a Flutter Stream using MongoDB (watch collection?) with Firebase Cloud Function - node.js

I've been trying out MongoDB as the database for my Flutter project lately, since I want to migrate away from a pure Firebase database (some Firebase limitations are an issue for my project, like the limit of 10 items for "in" array queries).
I have already written some CRUD methods in Firebase Cloud Functions that use MongoDB. I'm now able to save data and display it as a Future in a Flutter app (a simple ListView of users in a FutureBuilder).
My question is: how could I create a StreamBuilder backed by MongoDB and Firebase Cloud Functions? I've seen some material about collection watching and change streams, but nothing clear enough for me (I usually need to read a lot of examples or tutorials to understand).
Maybe some of you have clues, or tutorials that I can read/watch to learn a bit more about the subject?
For now, I have this as an example (a Node.js Cloud Function deployed to Firebase), which obviously produces a Future in my Flutter app (not realtime):
exports.getUsers = functions.https.onCall(async (data, context) => {
  const uri = "mongodb+srv://....";
  const client = new MongoClient(uri);
  await client.connect();
  // One-shot query: fetch all users, then close the connection.
  const results = await client.db("myDB").collection("user").find({}).toArray();
  await client.close();
  return results;
});
What would you advise to obtain a Stream instead of a Future, maybe using MongoDB's collection watch and change streams? Please provide an example if possible!
Thank you very much!

Cloud Functions are meant for short-lived operations, not for long-term listeners. It is not possible to create long-lived connections from Cloud Functions, neither to other services (as you're trying to do with MongoDB here) nor from Cloud Functions back to the calling client.
Also see:
If I implement onSnapshot real-time listener to Firestore in Cloud Function will it cost more?
Can a Firestore query listener "listen" to a cloud function?
the documentation on Eventarc, which is the platform that allows you to build custom triggers. It'll be a lot more involved though.
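For reference, here is a minimal sketch of what a MongoDB change stream looks like in a long-lived Node.js process (a standalone server, not a Cloud Function); the connection string is a placeholder, and change streams require a replica set (such as MongoDB Atlas):
const {MongoClient} = require('mongodb');

async function watchUsers() {
  const client = new MongoClient('mongodb+srv://....'); // placeholder URI
  await client.connect();

  // watch() opens a change stream that emits one event per insert/update/delete.
  const changeStream = client.db('myDB').collection('user').watch();
  changeStream.on('change', (change) => {
    console.log(change);
  });
}

watchUsers();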

Related

When I use firebase.database().goOnline(); I get an error

This is my code:
admin.initializeApp({...});

var db = admin.database();
let ref = db.ref(...);

await ref.once("value", function () {...});

firebase.database().goOffline();
firebase.database().goOnline();

ref.on("value", function () {...});
When I use firebase.database().goOnline(); I get the error:
Firebase: No Firebase App '[DEFAULT]' has been created - call Firebase App.initializeApp() (app/no-app).
at app
You're mixing two ways of addressing Firebase.
You can access Firebase through:
Its client-side JavaScript SDK, in which case the entry point is usually called firebase.
Its server-side Node.js SDK, in which case the entry point is usually called admin.
Now you initialize admin, but then try to access firebase.database(). Since you only called admin.initializeApp and never initialized the client-side firebase app, that's where the error comes from.
My guess is that you're looking for admin.database().goOnline() or db.goOnline().
Also note that there is no reason to toggle goOffline()/goOnline() in the way you do now, and you're typically better off letting Firebase manage that on its own.
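For illustration, a minimal sketch of the corrected code using only the Admin SDK (the database path is a placeholder):
const admin = require("firebase-admin");

admin.initializeApp({...});

const db = admin.database();
const ref = db.ref("some/path"); // placeholder path

// Connection management goes through the Admin SDK's database, not `firebase`:
db.goOffline();
db.goOnline();

ref.on("value", function (snapshot) {
  console.log(snapshot.val());
});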

What is the best way to stream data in real time into Big Query (using Node)?

I want to stream HTTP requests into BigQuery, in real time (or near real time).
Ideally, I would like to use a tool that provides an endpoint to stream HTTP requests to and lets me write simple Node.js such that:
1. I can add the appropriate insertId so BigQuery can dedupe requests if necessary and
2. I can batch the data so I don't send a single row at a time (which would result in unnecessary GCP costs)
I have tried using AWS Lambdas or Google Cloud Functions but the necessary setup for this problem on those platforms far exceeds the needs of the use case here. I assume many developers have this same problem and there must be a better solution.
Since you are looking for a way to stream HTTP requests to BigQuery and also send them in batch to minimize Google Cloud Platform costs, you might want to take a look at the public documentation where this issue is explained.
You can also find a Node.js sample showing how to perform a streaming insert into BigQuery:
// Imports the Google Cloud client library
const {BigQuery} = require('@google-cloud/bigquery');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const projectId = "your-project-id";
// const datasetId = "my_dataset";
// const tableId = "my_table";
// const rows = [{name: "Tom", age: 30}, {name: "Jane", age: 32}];

// Creates a client
const bigquery = new BigQuery({
  projectId: projectId,
});

// Inserts data into a table (await is only valid inside an async function)
async function insertRows() {
  await bigquery
    .dataset(datasetId)
    .table(tableId)
    .insert(rows);
  console.log(`Inserted ${rows.length} rows`);
}
As for the batch part, the recommendation is to use up to 500 rows per request, even though the hard limit is 10,000. More information about the quotas and limits for streaming inserts can be found in the public documentation.
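For illustration, here is a hedged sketch of batching rows into groups of 500 and attaching an insertId so BigQuery can deduplicate; the dataset/table names and the per-row id field are assumptions:
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function insertInBatches(datasetId, tableId, rows) {
  const table = bigquery.dataset(datasetId).table(tableId);
  for (let i = 0; i < rows.length; i += 500) {
    // With raw: true, each row carries its own insertId for deduplication.
    const batch = rows.slice(i, i + 500).map((row) => ({
      insertId: row.id, // assumes each row has a unique id
      json: row,
    }));
    await table.insert(batch, {raw: true});
  }
}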
You can make use of Cloud Functions. With Cloud Functions you can create your own API in Node.js and use it to stream data into BigQuery.
The target architecture for streaming will look like this:
Pubsub Subscriber (PUSH TYPE) -> Google Cloud Function -> Google Big Query
You can make use of this API in batch mode as well with the help of Cloud Composer (i.e. Apache Airflow) or Cloud Scheduler to schedule your API as per your requirements.
The target architecture for batch will look like this:
Cloud Scheduler/Cloud Composer -> Google Cloud Function -> Google Big Query
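As a sketch of the streaming path, assuming a Pub/Sub subscription triggers the function (the dataset and table names are placeholders):
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

// Background Cloud Function triggered by a Pub/Sub message.
exports.streamToBigQuery = async (message, context) => {
  // Pub/Sub message payloads arrive base64-encoded.
  const row = JSON.parse(Buffer.from(message.data, 'base64').toString());
  await bigquery.dataset('my_dataset').table('my_table').insert([row]);
};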

API that will continuously return data

Beginner here. I'm using the Firebase Realtime Database and I need my API to keep returning values as they are added; see my code below.
apiCalls.get('/api/getallusers', function (req, res) {
  userFunc.getAllUsers(function (err, result) {
    if (err) return res.status(500).send('internal server error!');
    res.status(200).write(JSON.stringify(result));
    res.end();
    return res;
  });
});
This returns the error:
Error [ERR_STREAM_WRITE_AFTER_END]: write after end
but if I remove res.end it shows one record and keeps loading until the page times out.
Is what I'm doing possible, or are there different ways to do it?
Also, I'm using Firebase Cloud Functions for this API.
UPDATE:
I uploaded the API but it does not return anything...
Here is the link: https://us-central1-testproject-e6819.cloudfunctions.net/api1/api/getUser
I tried axios and EventSource.
Firebase Functions logs the values but does not return them.
If you're viewing the API response like a web page, your browser is buffering the data it receives until there's enough of it to render a fuller page. Your browser is expecting content that ends, not an endless stream of data.
You should remove .end() if you expect to be able to continue to write to the output stream.
Also, I recommend using the Server-Sent Events (SSE) protocol for this. https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events It provides a nice standards-based abstraction that makes it very easy to handle event streams client-side.
const eventSource = new EventSource('https://api.example.com/someApi');

eventSource.addEventListener('userupdate', (e) => {
  console.log(e.data);
});
Server-side, there are a couple Express-based middlewares to make this even easier than it already is.
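For illustration, a minimal plain-Express sketch of an SSE endpoint, with the event name matching the client snippet above (the interval-based events are just a stand-in for real data):
const express = require('express');
const app = express();

app.get('/someApi', (req, res) => {
  // SSE needs these headers and a response that stays open.
  res.set({
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive',
  });
  res.flushHeaders();

  // Stand-in for real updates: push an event every few seconds.
  const timer = setInterval(() => {
    res.write('event: userupdate\n');
    res.write(`data: ${JSON.stringify({hello: 'world'})}\n\n`);
  }, 3000);

  req.on('close', () => clearInterval(timer));
});

app.listen(3000);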
Operations in Cloud Functions must be relatively short-lived and end deterministically. There is no way to keep a connection open from Cloud Functions to the client.
Typically, consider what triggers the need to send new data. For example, if it is the fact that a new user registered, you can trigger your Cloud Function from Firebase Authentication. The function could then write to the Realtime Database (or Cloud Firestore), and your client/app listens to the database for realtime updates, as sketched below. That way you're using all the pieces of Firebase the way they're designed: Cloud Functions for short-lived updates triggered by events in the system, and the Realtime Database or Cloud Firestore for sending realtime updates.
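A minimal sketch of that pattern, assuming the trigger is a new user registration (the database path is a placeholder):
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

// Auth trigger: runs once per new user and writes to a path clients listen to.
exports.onUserCreated = functions.auth.user().onCreate((user) => {
  return admin.database().ref(`/users/${user.uid}`).set({
    email: user.email || null,
    createdAt: admin.database.ServerValue.TIMESTAMP,
  });
});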
If that doesn't work for your use-case, you'll need a runtime environment that allows you to keep processes alive. Something like App Engine flex, Kubernetes, or many other options come to mind for that.

How to get Google cloud function execution event in node.js using firestore

Below is a Google Cloud Function, deployed properly and working fine.
Path to function: functions/index.js
const functions = require('firebase-functions');
const admin = require("firebase-admin");
admin.initializeApp();

exports.createUser = functions.firestore
  .document('users/{userId}')
  .onCreate((snap, context) => {
    const newValue = snap.data();
    console.log(newValue);
  });
How can I access this function's event on successful invocation in a Node.js app? Something like:
const myFunctions = require("./functions/index");

myFunctions.createUser()
  .then((data) => {
    console.log(data);
  })
  .catch((err) => {
    console.log(err);
  });
As of now I am getting an error.
Your createUser Cloud Function is triggered by a Firestore onCreate() event type and therefore will be "triggered when a document is written for the first time", as per the documentation.
The doc also adds the following:
In a typical lifecycle, a Cloud Firestore function does the following:
Waits for changes to a particular document. (In this case when the document is written for the first time)
Triggers when an event occurs and performs its tasks
Receives a data object that contains a snapshot of the data stored in the specified document.
Therefore, if you want to trigger this Cloud Function from "the outside world", e.g. from a node.js app, you need to create a new Firestore document at the corresponding location, i.e. under the users collection. To this end you would use the Node.js Server SDK, see https://cloud.google.com/nodejs/docs/reference/firestore/0.14.x/
Note that you could also trigger it from a client application (web, android, iOS) by creating a new user doc with the corresponding client SDK.
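For illustration, a minimal sketch of triggering it from a Node.js app by creating a document under users with the Admin SDK (the payload is hypothetical):
const admin = require('firebase-admin');
admin.initializeApp();

// Creating this document fires the createUser trigger above.
admin.firestore()
  .collection('users')
  .add({name: 'Tom'}) // hypothetical payload
  .then((docRef) => console.log(`Created user doc ${docRef.id}`));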
Update following your comments:
You cannot directly "port" and run your code written for Cloud Functions to a Node.js app. You will have to re-develop your solution for Node.js.
In your case you should use the Node.js Server SDK (as mentioned in my comment) and you could use the onSnapshot method of a CollectionReference. See https://cloud.google.com/nodejs/docs/reference/firestore/0.14.x/CollectionReference#onSnapshot
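A hedged sketch of what that could look like in a Node.js app:
const admin = require('firebase-admin');
admin.initializeApp();

// Listen for changes on the users collection; fires on every added document.
admin.firestore().collection('users').onSnapshot((snapshot) => {
  snapshot.docChanges().forEach((change) => {
    if (change.type === 'added') {
      console.log('New user:', change.doc.data());
    }
  });
});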
I will try to answer your question, but it's a bit unclear. You asked:
How to get Google cloud function execution event
Well, the event has started when the function triggers and your code is running, i.e. your line const newValue = snap.data().
Maybe you are looking for a way to do certain tasks when the trigger has run? You simply do that from inside the function and return a promise. If you had, for example, multiple async tasks to run, you could use Promise.all([]).

How to use Firebase Database in Google Actions?

I'm new to programming Actions for Google Home/Assistant.
I have been using the Inline-Editor under Fulfilment lately and it works fine. Now I want to start using the Firebase DB.
Since the first lines of the Inline Editor say const functions = require('firebase-functions');, I am assuming that the database is ready to use?
If so, how do I access it?
Although Dialogflow uses Firebase Functions to let you do inline code editing, it doesn't sound like it is a full-fledged Firebase environment. There may be APIs on the back end that are not set up.
The Dialogflow In-line Fulfillment is meant for simple logic testing and simple operations.
Fortunately, it isn't difficult to take that code and expand it into code that you write yourself... and still host on Firebase Functions! See https://firebase.google.com/docs/functions/get-started for the tools you'll need to install to get started.
For a more extensive tutorial about writing Firebase Functions that work with Dialogflow and getting started with Firebase Functions, you can take a look at the codelab from Google at https://codelabs.developers.google.com/codelabs/assistant-dialogflow-nodejs/index.html
You can use the Firebase Realtime Database through the firebase-admin package:
const admin = require("firebase-admin");
admin.initializeApp(functions.config().firebase);
const db = admin.database();
const ref = db.ref("/");
And to set a value in the database
ref.set({yourKey: 'value'});
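And to read the value back once (a small usage sketch):
ref.once('value').then((snapshot) => {
  console.log(snapshot.val()); // { yourKey: 'value' }
});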
