Firebase and backend logic - node.js

I am a parse.com user, and now I am looking for another service.
How can I write backend logic for Firebase?
Let's say I want to validate all the values on the server side, or trigger things. I thought of one solution, but I want to know the recommended way.
My idea is to:
create a Node.js server that uses Express,
create middlewares to handle the logic,
send a REST request from the app that triggers the middlewares,
use the Node.js SDK of Firebase to update the values according to the params of the HTTP request,
and implement in the app a Firebase handler that listens for changes.
Is there something simpler? In Parse I used Cloud Code; I want the logic to live on the server side, not on the client side.

Update (March 10, 2017): While the architecture I outline below is still valid and can be used to combine Firebase with any existing infrastructure, Firebase just released Cloud Functions for Firebase, which allows you to run JavaScript functions on Google's servers in response to Firebase events (such as database changes, users signing in and much more).
The common architectures of Firebase applications are pretty well-defined in this blog post Where does Firebase fit in your app?.
The architecture you propose is closest to architecture 3, where your client-side code talks both directly to Firebase and directly to your node.js server.
I also highly recommend that you consider option 2, where all interaction between clients and server runs through Firebase. A great example of this type of architecture is the Flashlight search integration. Clients write their search queries into the Firebase database. The server listens for such requests, executes the query and writes the response back to the database. The client waits for that response.
A simple outline for this server could be:
var ref = new Firebase('https://yours.firebaseio.com/searches');
ref.child('requests').on('child_added', function(requestSnapshot) {
  // TODO: execute your operation for the request and produce `result`
  var responseRef = ref.child('responses').child(requestSnapshot.key());
  responseRef.set(result, function(error) {
    if (!error) {
      // remove the request, since we've handled it
      requestSnapshot.ref().remove();
    }
  });
});
With this last approach the client never directly talks to your server, which removes all kinds of potential problems you would otherwise have to worry about. For this reason I sometimes refer to them as "bots" instead of servers.
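For completeness, a minimal client-side counterpart to that bot could look like the sketch below (hypothetical paths, same classic SDK): the client pushes a request and waits for the bot's response under the matching key.

var searches = new Firebase('https://yours.firebaseio.com/searches');
// write the search request; push() generates a unique key
var requestRef = searches.child('requests').push({ query: 'badger' });
// wait for the bot to write the response under the same key
searches.child('responses').child(requestRef.key()).on('value', function(snapshot) {
  if (snapshot.exists()) {
    console.log('results:', snapshot.val());
  }
});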

2017
Today Google announced Cloud Functions for Firebase
https://firebase.google.com/features/functions/
This is a great solution for these architectures and for backend logic in Firebase.

Here's what I would do:
Validate all the inputs with ".validate" rules (see the sketch right after this list). No server needed for that.
If you have tasks to run, use Firebase Queue plus a bot to run the tasks, and you are done.
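As a hedged illustration of such rules (the paths and field names below are made up), ".validate" expressions live in your database's security rules JSON:

{
  "rules": {
    "messages": {
      "$messageId": {
        ".validate": "newData.hasChildren(['text', 'timestamp'])",
        "text": {
          ".validate": "newData.isString() && newData.val().length <= 500"
        },
        "timestamp": {
          ".validate": "newData.isNumber()"
        }
      }
    }
  }
}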
If you don't do the last one, you may have two problems:
If you try to use the diagram you posted, it will be a little tricky to get the auth object at the server (but not impossible). Go ahead if you don't need to validate the user to allow the request.
If you use just a regular firebase app to listen to changes and respond (editing the object, for instance, like Frank van Puffelen's example code), you might have scalability problems. Once your back end scales to two (or more) instances, a firebase edit will trigger the task on all of them: each instance will notice there was a change, run the same task once each, add/replace the response object once each, and try to remove the request object once each.
Using Firebase Queue avoids both of these problems.
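A minimal worker sketch with the firebase-queue npm module (the URL and the task handler are hypothetical): the queue claims each task exactly once, so multiple worker instances don't duplicate work.

var Queue = require('firebase-queue');
var Firebase = require('firebase');

var queueRef = new Firebase('https://yours.firebaseio.com/queue');
var queue = new Queue(queueRef, function(data, progress, resolve, reject) {
  // `data` is the task payload a client wrote under /queue/tasks
  doSomethingWith(data)        // hypothetical async operation
    .then(resolve)             // marks the task finished and removes it
    .catch(reject);            // marks the task errored for retry/inspection
});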

You can combine these two behaviors simultaneously:
Client side communicates directly with the Database
One excellent thing about Firebase Realtime Database & Firestore is that you can listen to database changes in realtime. But it is important to configure the Security Rules so the client can't modify or read data it is not supposed to.
Client communicates with a Node.js server (or other server)
The node.js server gets administrative privileges by using the Firebase Admin SDK; it can perform any change in the database regardless of how the Firebase Security Rules are configured.
The client side should use the Firebase Authentication library to obtain the ID Token and send it to the server on each request (e.g. in the headers). For each received request, the node.js server verifies that the ID Token is valid by using the Firebase Admin SDK.
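A minimal sketch of that verification step with the Firebase Admin SDK and Express (the header handling and route are illustrative, not from the project linked below):

const admin = require('firebase-admin');
const express = require('express');

admin.initializeApp(); // assumes credentials via GOOGLE_APPLICATION_CREDENTIALS

const app = express();

// Verify the ID token the client attached, e.g. "Authorization: Bearer <token>"
app.use(async (req, res, next) => {
  const idToken = (req.headers.authorization || '').replace('Bearer ', '');
  try {
    req.user = await admin.auth().verifyIdToken(idToken); // decoded claims (uid, etc.)
    next();
  } catch (err) {
    res.status(401).send('Unauthorized');
  }
});

app.listen(3000);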
I created a documented GitHub project of a Node.js server that uses Firestore Database and Firebase Authentication, check the example here.

Related

Node.js, Socket.IO, Express: Should app logic be in socket handlers or REST api?

I'm planning a non-trivial realtime chat platform. The app has several types of resources: Users, Groups, Channels, Messages. There are roughly 20 types of realtime events having to do with these resources: for instance, submitting a message, a user connecting or disconnecting, a user joining a group, a moderator kicking a user from a group, etc...
Overall, I see two paths to organizing all this complexity.
The first is to build a REST API to manage the resources. For instance, to send a message, POST to /api/v1/messages. Or, to kick a user from a group, POST to /api/v1/group/:group_id/kick/. Then, from within the Express route handler, call io.emit (made accessible through res.locals) with the updated data to notify all related clients. In this case, clients talk to the server through HTTP and the server notifies clients through socket.io.
The other option is to not have a rest API at all, and handle all events through socket.IO. For instance, to send a message, emit a SEND_MESSAGE event. Or, to kick a user, emit a KICK_USER event. Then, from within the socket.io event handler, call io.emit with the updated data to notify all clients.
Yet another option is to have certain actions handled by a REST API, others by socket.IO. For instance, to get all messages, GET api/v1/channel/:id/messages. But to post a message, emit SEND_MESSAGE to the socket.
Which is the most suitable option? How do I determine which actions need to be sent through an API, and which need to be sent through socket.io? Is it better not to have a REST API for this type of application?
Some of my thoughts so far, nothing conclusive:
Advantages of REST API over the socket.io-only approach:
Easier to organize hierarchically, more modular
Easier to test
More robust and elegant
Simpler auth implementation with middleware
Disadvantages of REST API over the socket.io-only approach:
Slightly less performant (source)
Since a socket connection needs to be open anyways, why not use it for everything?
Slightly harder to manage on the client side.
Thanks for reading!
This could be achieved using sockets.
That's because a chat application will have dozens of actions, like
'STARTS_TYPING', 'STOPS_TYPING', 'SEND_MESSAGE', 'RECEIVE_MESSAGE', ...
Accommodating all these features using REST APIs would produce a complex system that lacks performance.
Also, the concept of rooms in socket.io removes a lot of the headache of implementing group chat.
So it's better to build everything on sockets (socket.io or a web cluster).
Here is the solution I found to solve this problem.
The key mistake in my question was that I assumed a REST API and websockets were mutually exclusive, because I intended to integrate the business and database logic directly into Express routes and socket.io handlers. Thus, choosing between socket.io and HTTP seemed important, because it would influence the core business logic of my app.
Instead, it shouldn't matter which transport to use. The business logic has to be independent from the transport logic, in its own module.
To do this, I developed a service layer that handles CRUD tasks, but also more specific tasks such as authentication. Then, this service layer can be easily consumed from either or both express routes and socket.io handlers.
In the end, this architecture allowed me to switch easily between transport technologies.
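A hedged sketch of that idea (all names are made up): the service layer owns the business logic, and both transports are thin adapters over it.

// services/messages.js — transport-agnostic business logic
async function sendMessage(db, userId, channelId, text) {
  if (!text || text.length > 2000) throw new Error('invalid message');
  return db.insertMessage({ userId, channelId, text }); // hypothetical persistence call
}
module.exports = { sendMessage };

// REST adapter (Express route)
app.post('/api/v1/messages', async (req, res) => {
  const msg = await messages.sendMessage(db, req.user.id, req.body.channelId, req.body.text);
  io.to(msg.channelId).emit('NEW_MESSAGE', msg); // notify connected clients
  res.status(201).json(msg);
});

// Realtime adapter (socket.io handler)
socket.on('SEND_MESSAGE', async ({ channelId, text }) => {
  const msg = await messages.sendMessage(db, socket.userId, channelId, text);
  io.to(channelId).emit('NEW_MESSAGE', msg);
});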

How would one run a Node.js script on Firebase?

I have a Node app/script that needs to constantly be running (it's a discord bot, done with discord.js, but I think that's mostly irrelevant), and I'd like to do it on Firebase.
It has its own client.on('event', ()=>{}) events system, so I don't believe that I could use Firebase's cloud functions. There's also what seems to be a website-hosting based way to have a node.js server, but that seems triggered by HTTP requests.
Is there any other way I could do it?
There is no way to run arbitrary node.js code on Firebase. Unless your script can run within Cloud Functions "triggered execution" mode, you'll need your own app server to run it.
You can of course create a service that maps Discord.js events to Firebase events, such as writes to the Realtime Database, Cloud Firestore, even just direct HTTPS calls to a Cloud Functions endpoint. You could even bypass Firebase there and have your mapping service write to Cloud PubSub and use that to trigger Cloud Functions.
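A rough sketch of such a mapping service (the collection name and event choice are mine, not from any official integration): a long-running process that mirrors Discord.js events into Firestore, where they can trigger Cloud Functions via onCreate.

const Discord = require('discord.js');
const admin = require('firebase-admin');

admin.initializeApp(); // assumes credentials via GOOGLE_APPLICATION_CREDENTIALS
const db = admin.firestore();

const client = new Discord.Client();

client.on('message', (message) => {
  // Mirror each Discord message into Firestore; a Cloud Function
  // listening with onCreate can then react to it.
  db.collection('discord-events').add({
    type: 'message',
    author: message.author.id,
    content: message.content,
    createdAt: admin.firestore.FieldValue.serverTimestamp(),
  });
});

client.login(process.env.DISCORD_TOKEN);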
One thing that looks promising in the Discord.js documentation is their mention of webhooks, which is just another way of describing HTTP endpoints. But from my quick scan I couldn't figure out whether those would allow you to call your HTTP-triggered Cloud Function.

Correct way to update frontend when backend changes

I'm currently setting up the following application:
Node backend with Express
Postgres DB with Knex as an interface
React frontend
Everything is working as intended and I am making good progress, my question is more architectural:
What is the preferred/recommended/best way to notify the frontend when database changes occur?
I saw that Postgres has a LISTEN/NOTIFY feature but that is not currently (ever) supported by Knex (https://github.com/tgriesser/knex/issues/285).
My thoughts:
Polling (every x seconds query the DB). This seems wasteful and antiquated but it would be easy to set up.
Sockets. Rewrite all my Express endpoints to use sockets?
?
I'm interested to see how others handle this.
Thanks!
I've had a similar situation before. I have a front end that connects to the API via web sockets. On a successful database commit, the API emits a message naming the endpoint that was updated. The front-end components listen for these update messages, and if the updated type is relevant to a component, that component re-queries the API endpoint over HTTPS for the new data. Using a web socket only to advertise that an update is available means you won't have to rewrite the entire API.
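A rough sketch of that pattern (the endpoint names and event shape are made up):

// Server: after a successful commit, advertise which endpoint changed
app.put('/api/items/:id', async (req, res) => {
  const [item] = await knex('items')
    .where({ id: req.params.id })
    .update(req.body)
    .returning('*');
  io.emit('resource:updated', { endpoint: `/api/items/${req.params.id}` });
  res.json(item);
});

// Client: re-fetch over HTTPS only when a relevant update is advertised
socket.on('resource:updated', ({ endpoint }) => {
  if (endpoint.startsWith('/api/items/')) {
    fetch(endpoint).then((r) => r.json()).then(render); // render() is your UI update
  }
});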

How are Node.js+Socket.io+MongoDB webapps truly asynchronous?

I have a good old-style LAMP webapp. A week ago I needed to add a push notification mechanism to it.
Therefore, what I did was add node.js + socket.io on the server and poll the MySQL database every 10 seconds using node.js to check whether there were new items: if so, I sent them to the client(s) with socket.io.
I was pretty happy with the result, even if that is not a proper realtime notification (as there is a lag of up to 10 secs).
Now, I am about to build a new webapp which will need push notifications, too. I am wondering whether to go with the same approach as the first one (that I believe is more stable and mature) or to go totally Node.js, without PHP and Apache. As for the database, I have already decided to go for MongoDB.
Finally, my question is: if I go for Node.js+Socket.io+MongoDB will I get a truly near-real-time webapp? I mean, as soon as a new record is inserted into MongoDB, will there be some sort of event triggered that I can catch via node.js, do some checking on it and, if relevant, send the notification to the client? Or will there be anyway some sort of polling on the db server-side and lag, as with my first LAMP webapp?
A related question: can you build a realtime webapp on MySQL without doing any polling as I did with my first app. Or do you need MongoDB (or Redis)?
I hope this question is not too silly - sorry, I am just starting with Node.js and co.
Thanks.
I understand your problem because I switched to node.js from php/apache/mysql too.
Generally node.js is stable; modules and your own scripts are the main sources of errors.
Real-time has nothing to do with the database; it's all about the client and server. You can query as much data as you want in your requests and push it to the other clients.
Choosing node.js is very wise, but it's harder to implement.
When you insert a new record into your db, the event is the request itself; you can emit a push event alongside the database query, something like:
// Please note this is not real code, just an example of the idea
app.get('/query', function(request, response) {
  // Query your database
  db.query('SELECT * FROM users', function(rows) {
    // Push a notification to dan
    socket.emit('database_query_executed', 'to_dan', rows);
    // End the request
    response.end('success');
  });
});
Of course you can use MySQL! And any database you want; as I said, real-time has nothing to do with the database, because the database is in the middle of the process and it's totally optional.
If you want to use node.js for push notifications and php/apache for mysql, then you will need to make two requests, one to each server, something like:
// this is javascript
ajax('http://node.yoursite.com/push', node_options)
ajax('http://php.yoursite.com/mysql_query', php_options)
Or, if you want just one request, or you want to use a form, you can call your PHP, and inside PHP create an HTTP or net request to node.js, something like:
// this is php
new HttpRequest('http://node.yoursite.com/push', HttpRequest::METH_GET);
Using:
A regular MongoDB Collection as the Store,
A MongoDB Capped Collection with Tailable Cursors as the Queue,
A Node worker with Socket.IO watching the Queue as the Worker,
A Node server to serve the page with the Socket.IO client, and to receive POSTed data (or however else the data gets added) as the Server
It goes like:
The new data gets sent to the Server,
The Server puts the data in the Store,
The Server adds the data's ObjectID to the Queue,
The Queue will send the newly arrived ObjectID to the open Tailable Cursor on the Worker,
The Worker goes and gets the actual data in the ObjectID from the Store,
The Worker emits the data through the socket,
The client receives the data from the socket.
This is 'push' from the initial addition of the data all the way to receipt at the client - no polling, so as real-time as you can get given the processing time at each step.
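A hedged sketch of the Worker half, using the Node MongoDB driver and assuming collection names 'queue' (capped) and 'store' (regular):

const { MongoClient } = require('mongodb');

async function startWorker(io) {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const db = client.db('app');
  const queue = db.collection('queue'); // capped collection, written to by the Server
  const store = db.collection('store'); // regular collection holding the actual data

  // A tailable cursor behaves like `tail -f`: it stays open and yields
  // each new document as it arrives in the capped collection.
  const cursor = queue.find({}, { tailable: true, awaitData: true });
  for await (const task of cursor) {
    const doc = await store.findOne({ _id: task.dataId }); // look up the real data
    io.emit('new-data', doc); // push straight out to connected clients
  }
}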
Re: triggers in MongoDB - please see this answer: https://stackoverflow.com/a/12405093/1651408
There are much more convenient triggers in MySQL, but to call Node.js from them would require a bit of work with MySQL UDFs (user-defined functions), for instance pushing data through a Unix socket. Please note that this is necessary only when other applications (besides your Node.js process) are updating the database, and be sure to choose InnoDB as storage in this case (row- vs. table-level locking).
I can see no big problem with your technology choice of socket.io; even if client-side web sockets aren't supported, you'll fall back (gracefully, I hope) to polling.
Finally, your question is not silly at all, since push technology is definitely superior to the flood of polling requests - it scales better. EDIT: However, would not describe either technology as real-time.
Another EDIT: for a quite well-known and successful setup of this kind please read this: http://blog.fogcreek.com/the-trello-tech-stack/
Have you discovered Chole? It works separately from your web server and interfaces with it by using HTTP POSTs. That way you can code your web app any which way you want.
Using push technology like Socket.IO helps you use the server's resources efficiently, and it lets both old and modern browsers make a websocket or websocket-like connection.
Polling every 10 seconds means an HTTP request each time, which is expensive, especially when a lot of users are present.
Unlike polling, push technology is relatively cheap: the user's client opens a dedicated socket (i.e. a websocket) to listen for the server's push notifications.
And usually your client-side JavaScript performs some action when a push notification is received.
Your LAMP stack plus Socket.IO on a different port (other than 80) would be good enough to implement what you need.
But using Node.js + MongoDB + Socket.IO helps you manage your server's resources much more efficiently, because all three are non-blocking by nature.
If you understand the non-blocking concept correctly and implement your app appropriately, an identical app (same features, but different language and database) will be able to handle a lot more requests than a general LAMP stack.
[Figure: a well-known chart comparing the thread-based vs non-blocking approaches to handling concurrency: Apache (threads) vs Nginx (non-blocking).]
MySQL is a great database. I believe you won't need joins and transactions for realtime notifications.
MongoDB does not have those two features unless you implement them yourself.
Because it lacks those two, and because of some characteristics of its own, MongoDB can store and fetch data much faster than traditional SQL databases.
Switching from MySQL to MongoDB will decrease the time taken to insert and fetch data.
With JS you can open a socket to your server (not in old browsers); the server runs an ad-hoc program (on an ad-hoc port, so you need permission to open a port and run a program on your server) that sends data (almost) realtime to and from the client, without the HTTP protocol's overhead. Old browsers will just fall back to a polling mechanism.
I can't see another way to do this (there are probably already "cooked" frameworks that do).

RESTful backend and socket.io to sync

Today I had the idea of the following setup: create a Node.js server with Express and socket.io. With Express, I would create a RESTful API connected to MongoDB. BackboneJS or similar would connect the client to that REST API.
Now every time the mongodb (i.e. the data in it I am interested in) changes, socket.io would fire an event to the client carrying a cursor to the data that has changed. The client would then trigger the appropriate AJAX requests to the REST API to get the new data where it needs it.
So the socket.io connection would act as a synchronization trigger. It would be there for the entire visit and could also manage sessions that way. All the payload would be sent over HTTP.
Pros:
REST API for use with other clients than web
Auth could be done entirely over socket.io, only sending a token along with REST requests.
Use the benefits of REST.
Would also play nicely with pub/sub services like Redis.
Cons:
Greater overhead than using pure socket.io.
What do you think, are there any great disadvantages I did not think of?
I agree with #CharlieKey, you should send the updated data rather than re-requesting.
This is exactly what Tower is doing:
save some data: https://github.com/viatropos/tower/blob/development/src/tower/model/persistence.coffee#L77
insert into mongodb (cursor is a query/persistence abstraction): https://github.com/viatropos/tower/blob/development/src/tower/model/cursor/persistence.coffee#L29
notify sockets: https://github.com/viatropos/tower/blob/development/src/tower/model/cursor/persistence.coffee#L68
emit updated records to client: https://github.com/viatropos/tower/blob/development/src/tower/server/net/connection.coffee#L62
The disadvantage of using sockets as a trigger to re-request with Ajax is that every connected client will have to fetch the data, so if 100 people are on your site there's going to be 100 HTTP requests every time data changes - where you could just reuse the socket connections.
I think that pushing the updated data with the socket.io event would be better than re-requesting the latest. Even better, you could push only the modified pieces of data, decreasing the amount of data sent over the line. Overall, though, an interesting idea.
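A minimal sketch of pushing the changed document itself (assuming Mongoose; the model and event names are illustrative):

// After a successful write, emit the updated document instead of a bare trigger
async function updateItem(id, changes) {
  const item = await Item.findByIdAndUpdate(id, changes, { new: true });
  io.emit('item:updated', item); // clients patch local state; no follow-up Ajax needed
  return item;
}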
I'd look into Now.js since it does pretty much exactly what you need.
It creates a namespace that is shared between the client and server. The server can call functions on the client directly, and vice versa.
That is, if you insist on your current infrastructure decision to use MongoDB and Node.js; otherwise there is CouchDB, which is a full web server and document database with sophisticated replication mechanisms built in.
