derbyjs running code on server from x-bind - node.js

I'm trying to run some server-only code from a client-side event in derby.js.
I'm using x-bind to bind the event in the view, like so:
<a x-bind="click: func">click me</a>
and on the app:
exports.func = function (e, el, next) {
  // I want to run some server code here, but it runs on the client only
};
So:
Can this be done in any way?
If not, is there any way to use sockets in a 'native' way in derby.js?
I simply don't want to fall back to ajax with server routes when everything else is real time.

You can route the request to the server via the model (model.fetch() and model.subscribe()). If it's just retrieving some data from the server, you are basically all set. Keep a reference to the model for when you need it (in the app.ready callback, as pointed out by switz).
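A minimal sketch of that idea, assuming the Derby/Racer model API of the era; the 'messages' collection name is made up:

var model; // keep a reference from app.ready for use in event handlers

app.ready(function (appModel) {
  model = appModel;
});

exports.func = function (e, el, next) {
  // fetch() routes through the model to the server over the existing
  // realtime connection, so no separate ajax route is needed
  model.fetch('messages', function (err) {
    if (err) return next(err);
    // the data is now available locally, e.g. model.get('messages')
  });
};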
To use sockets directly or to extend the model (which uses sockets in the background) see
https://groups.google.com/forum/?pli=1#!topic/derbyjs/60gouek7334

Related

How to use socket.io properly with express app

I'm wondering how to use socket.io properly with my express app.
I have a REST API written in express/node.js, and I want to use socket.io to add real-time features to my app. Suppose I want to do something I can already do just by sending a request to my REST API. What should I do with socket.io? Should I send the request to the REST API and send the socket.io client the result of the process, or handle the whole process within a socket.io handler and then send the result to the socket.io client?
Thanks in advance.
The question is not entirely clear, but what I'm getting from it is that you want to know what you would use websockets for that you can't already do with your current API.
The short answer is: nothing, really. Websockets are just the natural progression of APIs, driven by the need for a more real-time interface between systems.
The old method (still used, and relevant for the right use case) is long polling, where you keep checking back with the server for updated items and grab them if there are any. This works, but it can be expensive in terms of establishing a connection, performing a lookup, then closing the connection.
Websockets keep that connection open, allowing both the client and server to communicate in real time. For example, say you make an update to your backend data and want users to get that update. With long polling you would rely on each client to ping back to the server, check whether there is an update and, if so, grab it. This can cause lag between updates: some users have updated data while others do not, and so on.
Now take the same scenario with websockets: you make an update to the backend data, hit submit, and this emits to your socket server. The socket server takes the call, performs the task (grabs the updated data) and emits it to the users; each connected user instantly gets the update.
Socket servers are typically used for things like real-time chat or polls, where packets are smaller, but they are also used for web games and the like. The size of your payloads will determine how best to send data back and forth, because the larger the payload, the more resources and bandwidth it takes on the socket server, so that is something to consider.
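As a concrete sketch of that flow (the /items route, port, and event name here are made up for illustration): an Express REST endpoint persists the change, then immediately broadcasts it over Socket.IO so every connected client gets the update without polling.

var express = require('express');
var http = require('http');

var app = express();
app.use(express.json());
var server = http.createServer(app);
var io = require('socket.io')(server);

// Existing REST endpoint: after saving the change, push it to clients
app.post('/items', function (req, res) {
  var item = req.body; // assume the handler persists this to the database
  io.emit('item_updated', item); // every connected client gets the update
  res.status(201).json(item);
});

server.listen(3000);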

Changing NodeJS variables at runtime

I've got a simple FeathersJS server. I want to change certain things in its config without restarting it, such as disallowing the creation of new users.
So, basically, I want to change a couple of variables at runtime.
It definitely sounds like something that's got complete solutions written somewhere.
My idea is to expose a callback that changes the variables of interest according to its arguments, then call that callback from something like inquirer, or expose WebSockets that execute the callback accordingly.
Please tell me how to best solve this problem and, if possible, point me to some existing solutions as well.
To change something inside your node.js process from outside the process, you need to expose some external connectivity that can make the change for you. There is an infinite set of choices, but a webSocket/socket.io server or an http server are probably the most common.
If this is typically just a single transaction, not a whole sequence of changes, then a simple http server on a port that is not accessible from the outside world is probably the most appropriate way to do this. You then just create routes on the http server that accept the desired changes, and the code in the routes themselves changes the internal variables accordingly. If you chose some other interface, such as a webSocket or socket.io server, the concept would be the same.
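A minimal sketch of that idea (the route, port, and flag name are hypothetical): an http server bound to loopback only, with one route that flips a runtime setting.

var http = require('http');

var config = { allowSignups: true }; // runtime-mutable settings

http.createServer(function (req, res) {
  if (req.method === 'POST' && req.url === '/admin/toggle-signups') {
    config.allowSignups = !config.allowSignups;
    res.end(JSON.stringify(config)); // report the new state
  } else {
    res.statusCode = 404;
    res.end();
  }
}).listen(9090, '127.0.0.1'); // loopback only, not reachable from outside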

Is there a way to run a node task in a child process?

I have a node server, which needs to:
1. Serve the web pages
2. Keep querying an external REST API, save the data to a database, and send data to clients when there are certain updates from the REST API.
Task 1 is just a normal node task. But I don't know how to implement task 2; it won't expose any interface to the outside. It's more like a background task.
Can anybody suggest an approach? Thanks.
To have a second node.js app that runs at the same time as your first one, you can just create another node.js app and then run it from your first one using child_process.spawn(). It can regularly query the external REST API and update the database as needed.
The part about "send data to clients for certain updates from the REST API" is not so clear.
If you're using socket.io to send data to connected browsers, then the browsers have to be connected to your web server, which I presume is your first node.js process. To have the second node.js process cause data to be sent through the socket.io connections in the first node.js process, you need some interprocess way to communicate. You can use stdout and stdin via child_process.spawn(), you can use some feature in your database, or you can use any of several other IPC methods. A sketch of this variant follows.
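A sketch of the two-process variant (the worker filename, event name, and the io reference are placeholders; it assumes the worker writes one complete JSON message per line):

var spawn = require('child_process').spawn;

var worker = spawn('node', ['worker.js']); // the background REST-polling app

worker.stdout.on('data', function (chunk) {
  var update = JSON.parse(chunk.toString()); // one JSON message per chunk
  io.emit('updates', update); // forward to connected socket.io clients
});

worker.on('exit', function (code) {
  console.log('worker exited with code', code);
});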
Because querying a REST API and updating a database are both asynchronous operations, they don't take much of a node.js process's CPU. As such, you don't really have to do them in another node.js process. You could just have a setInterval() in your main node.js process, query the API every once in a while, update the database when results are received, and then directly access the socket.io connections to send data to clients, without having to use a separate process and some sort of IPC mechanism.
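And a sketch of that single-process approach (the URL, interval, event name, and the io and saveToDatabase references are assumptions):

var https = require('https');

function pollApi() {
  https.get('https://api.example.com/updates', function (res) {
    var body = '';
    res.on('data', function (chunk) { body += chunk; });
    res.on('end', function () {
      var updates = JSON.parse(body);
      saveToDatabase(updates); // your async database write
      io.emit('updates', updates); // push straight to connected clients
    });
  }).on('error', console.error);
}

setInterval(pollApi, 10 * 1000); // query the external REST API every 10s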
Task 1:
Express is a good way to accomplish this task. You can explore:
http://expressjs.com/
Task 2:
Once you are set up with Express, you can write your logic within the Express framework. The background task can then be run with the node module forever. It's a simple tool that runs your scripts continuously, whether they are written in node.js or not.
Have a look:
https://github.com/foreverjs/forever

How are Node.js+Socket.io+MongoDB webapps truly asynchronous?

I have a good old-style LAMP webapp. A week ago I needed to add a push notification mechanism to it.
Therefore, what I did was add node.js + socket.io on the server and poll the MySQL database every 10 seconds using node.js to check whether there were new items; if so, I sent them to the client(s) with socket.io.
I was pretty happy with the result, even if it is not proper realtime notification (as there is a lag of up to 10 seconds).
Now, I am about to build a new webapp which will need push notifications too. I am wondering whether to go with the same approach as the first one (which I believe is more stable and mature) or to go fully Node.js, without PHP and Apache. As for the database, I have already decided to go with MongoDB.
Finally, my question is: if I go with Node.js + Socket.io + MongoDB, will I get a truly near-real-time webapp? I mean, as soon as a new record is inserted into MongoDB, will some sort of event be triggered that I can catch via node.js, do some checking on, and, if relevant, use to send the notification to the client? Or will there still be some sort of polling on the db server side, and lag, as with my first LAMP webapp?
A related question: can you build a realtime webapp on MySQL without doing any polling, as I did with my first app? Or do you need MongoDB (or Redis)?
I hope this question is not too silly; sorry, I am just starting with Node.js and co.
Thanks.
I understand your problem because I switched to node.js from php/apache/mysql too.
Generally node.js is stable; modules and your own scripts are the main sources of errors.
Real-time has nothing to do with the database; it's all about the client and server. You can query as much data as you want in your requests and push it to the other clients.
Choosing node.js is very wise, but it's harder to implement.
When you insert a new record into your db, the event is the request itself; you make a push event along with the database query, something like:
// Please note this is not real code, just an example of the idea
app.get('/query', function (request, response) {
  // Query your database
  db.query('SELECT * FROM users', function (rows) {
    // Push a notification to dan
    socket.emit('database_query_executed', 'to_dan', rows);
    // End the request
    response.end('success');
  });
});
Of course you can use MySQL! Or any database you want; as I said, real-time has nothing to do with databases, because the database is in the middle of the process and it's totally optional.
If you want to use node.js for push notifications and php/apache for mysql, then you will need to create two requests for each action, something like:
// this is javascript
ajax('http://node.yoursite.com/push', node_options)
ajax('http://php.yoursite.com/mysql_query', php_options)
Or, if you want just one request, or you want to use a form, you can call your php, and inside php you can create an http or net request to node.js, something like:
// this is php
new HttpRequest('http://node.yoursite.com/push', HttpRequest::METH_GET);
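For completeness, the node side of that push request might look something like this (the /push route and event name are assumptions; io is the Socket.IO server instance):

// node.js side: a tiny endpoint that turns an HTTP call into a broadcast
app.get('/push', function (request, response) {
  io.emit('new_data', request.query); // broadcast to all connected clients
  response.end('pushed');
});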
Using:
A regular MongoDB Collection as the Store,
A MongoDB Capped Collection with Tailable Cursors as the Queue,
A Node worker with Socket.IO watching the Queue as the Worker,
A Node server to serve the page with the Socket.IO client, and to receive POSTed data (or however else the data gets added) as the Server
It goes like this:
The new data gets sent to the Server,
The Server puts the data in the Store,
The Server adds the data's ObjectID to the Queue,
The Queue will send the newly arrived ObjectID to the open Tailable Cursor on the Worker,
The Worker goes and gets the actual data in the ObjectID from the Store,
The Worker emits the data through the socket,
The client receives the data from the socket.
This is 'push' from the initial addition of the data all the way to receipt at the client - no polling, so it's as real-time as you can get, given the processing time at each step.
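A sketch of the Worker's side of this pipeline (collection names, database wiring, and the event name are assumptions; uses the modern MongoDB Node driver, and io is the Socket.IO server instance):

var MongoClient = require('mongodb').MongoClient;

async function startWorker(io) {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const db = client.db('app');
  const queue = db.collection('queue'); // must be a capped collection
  const store = db.collection('store');

  // Tailable cursor: blocks waiting for new documents instead of polling
  const cursor = queue.find({}, { tailable: true, awaitData: true });
  for await (const msg of cursor) {
    // Look up the full record by the ObjectID the Server queued
    const doc = await store.findOne({ _id: msg.objectId });
    io.emit('update', doc); // push straight out to connected clients
  }
}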
Re: triggers in MongoDB - please see this answer: https://stackoverflow.com/a/12405093/1651408
There are much more convenient triggers in MySQL, but calling Node.js from them would require a bit of work with MySQL UDFs (user-defined functions), for instance pushing data through a Unix socket. Note that this is necessary only when other applications (besides your Node.js process) are updating the database, and be sure to choose InnoDB as the storage engine in that case (row- vs. table-level locking).
I can see no big problem with your technology choice of socket.io; even if client-side web sockets aren't supported, you'll fall back (gracefully, I hope) to polling.
Finally, your question is not silly at all, since push technology is definitely superior to a flood of polling requests - it scales better. EDIT: However, I would not describe either technology as real-time.
Another EDIT: for a quite well-known and successful setup of this kind, please read this: http://blog.fogcreek.com/the-trello-tech-stack/
Have you discovered Chole? It works separately from your web server and interfaces with it using HTTP POSTs. That way you can code your web app any way you want.
Actually, using push technology like Socket.IO helps you use the server's resources efficiently, and it also supports everything from old browsers to modern ones by making a websocket or websocket-like connection.
Polling every 10 seconds means a full HTTP request each time, which is expensive, especially when a lot of users are present.
Unlike polling, push technology is relatively cheap: the user's client opens a dedicated socket (i.e. a websocket) to listen for the server's push notifications.
And usually your client-side JavaScript performs some actions when a push notification is received.
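The client side of that is typically just a few lines (the 'notification' event name is an assumption):

// client-side JavaScript, with the socket.io client script loaded
var socket = io(); // connect back to the host that served the page
socket.on('notification', function (data) {
  // react to the push: update the DOM, show an alert, etc.
  console.log('push received:', data);
});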
Using your LAMP stack and Socket.IO on a different port (other than 80) will be good enough to implement what you need.
But using Node.js + MongoDB + Socket.IO actually helps you manage your server's resources much more efficiently, because all three are non-blocking by nature.
If you understand the non-blocking concept correctly and implement your app appropriately, an identical app - one with the same features but a different language and a different database - will be able to handle a lot more requests than a typical LAMP stack.
[Figure omitted: a well-known chart comparing the thread-based vs. non-blocking ways of handling concurrency - Apache (threads) vs. Nginx (non-blocking).]
MySQL is a great database. I believe you won't need joins and transactions for realtime notifications.
MongoDB does not have those two features unless you implement something similar yourself. Because it lacks those two, and because of some characteristics of its own, MongoDB can store and fetch data much faster than traditional SQL databases. Switching from MySQL to MongoDB will decrease the time it takes to insert and fetch data.
With JS you can open a socket to your server (not in old browsers); the server runs an ad-hoc program (on an ad-hoc port, so you need permission to open a port and run a program on your server) that sends data to and from the client in (almost) real time, without the HTTP protocol's overhead. Old browsers will just fall back to a polling mechanism.
I can't see another way to do this (though there are probably already 'cooked' frameworks that do it).

Using node.js and socket.io with PHP application

I have a working PHP application. It allows users to create private projects and invite others to them. Now, using node.js and socket.io, I want to add real-time comments, posts, etc.
What is the best architecture?
I see two solutions now.
The first is:
User sends an AJAX query to the PHP backend: http://example.com/comment_add.php?text=...
comment_add.php adds the comment to the database and, via AMQP (or something better?), notifies the node.js server, which broadcasts the comment to the channel's subscribers.
The second is:
User sends an AJAX query to the node.js server: http://example.com:3000/comment_add
Node.js sends a request to the PHP backend (but how? and what about authorization?), receives the response, and then broadcasts it to the channel's subscribers.
What is the best way? Is there another methods? How to implement this properly?
When you decide to use node.js + socket.io to make a real-time web app, you don't need to think about PHP anymore, and you can forget Ajax as well... Socket.io becomes the communication channel between client and server.
But yes, you can still use Ajax and PHP for building websites quickly, and for other features that don't need to be real-time.
The second way is the best method. You can use http to communicate with PHP from node.js. Authorization can be done in node.js by passing auth credentials to PHP each time.
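A sketch of node.js calling the PHP backend over http (the host, path, and field names are placeholders):

var http = require('http');
var querystring = require('querystring');

function addComment(text, sessionId, callback) {
  var body = querystring.stringify({ text: text, sid: sessionId });
  var req = http.request({
    host: 'php.yoursite.com',
    path: '/comment_add.php',
    method: 'POST',
    headers: {
      'Content-Type': 'application/x-www-form-urlencoded',
      'Content-Length': Buffer.byteLength(body)
    }
  }, function (res) {
    var data = '';
    res.on('data', function (chunk) { data += chunk; });
    res.on('end', function () { callback(null, data); });
  });
  req.on('error', callback);
  req.end(body); // PHP can check the session id for authorization
}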
In the end, my working solution is #1.
When a user establishes a connection to node.js/socket.io, he just sends a 'subscribe' message to node.js with his PHP session id. Node.js checks authorization with a POST request to the PHP backend and, if all is OK, allows the user to establish the connection.
The frontend sends all requests to PHP as it did before node.js.
PHP modifies some object, checks who can access the modified object, and sends a message (via AMQP or redis pub/sub, etc.) to node.js:
{
  id_object: 125342,
  users: [5, 23, 9882]
}
node.js then checks which of the listed users have active sockets and, for each such user, sends a GET request to PHP:
{
  userId: 5,
  id_object: 125342
}
A special PHP controller receives this request, runs a query to get the object with the access rights of the given user id, and then sends a message back to node.js with the resulting answer. Node.js then sends the answer to the user's frontend via the socket.
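A sketch of the node.js side of this flow, using redis pub/sub as the PHP-to-node channel (the channel and event names, the socketsByUserId map, and the fetchObjectForUser helper are all assumptions; uses the classic node_redis v3 API):

var redis = require('redis');
var sub = redis.createClient();

sub.subscribe('object-updated'); // PHP publishes here after each change
sub.on('message', function (channel, payload) {
  var msg = JSON.parse(payload); // { id_object: ..., users: [...] }
  msg.users.forEach(function (userId) {
    var socket = socketsByUserId[userId]; // map filled at 'subscribe' time
    if (!socket) return; // this user has no active socket
    // Ask PHP for the object as seen with this user's rights, then emit
    fetchObjectForUser(userId, msg.id_object, function (err, object) {
      if (!err) socket.emit('object_update', object);
    });
  });
});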
I faced this same question a year ago when starting my final-year project at university. I realized that my project was much better suited to using Node standalone. Node is very good at dealing with I/O; this can be anything from an HTTP request to a database query. Adding a PHP-based web server behind Node is going to add unneeded complexity. If your application needs to perform CPU-intensive tasks, you can quite easily spawn 'child' node processes which perform the needed operation and return the result to the parent.
However, out of the two methods you have mentioned, I would choose #2. Node.js can communicate with your PHP server in a number of ways; you could look at creating a unix socket connection between your PHP server and Node. If that is unavailable, you could simply communicate between Node and your PHP back end using HTTP. :)
Take a look here, here is a solution to a question very similar to your own:
http://forum.kohanaframework.org/discussion/comment/57607
