Overwriting Backbone.sync for socket.io - node.js

I'm working on a socket.io-based server/client connection instead of ajax. The client uses Backbone, and I've overridden the Backbone.sync function with a half-baked one of my own:
Backbone.sync = function (method, collection, options) {
    // use the window.io variable that was attached on init
    var socket = window.io.connect('http://localhost:3000');
    // emit the collection/model data with standard ajax method names and options
    socket.emit(method, { collection: collection.name, url: collection.url });
    // create a model in the collection for each frame coming in through that connection
    socket.on(collection.url, function (socket_frame) {
        collection.create(socket_frame['model']);
    });
};
Instead of ajax calls I simply emit through the socket attached to the window.io global. The server listens for those emits and acts based on the model URL; I don't want to change that behaviour, and I use the default CRUD method names (read, patch, ...) inside each emitted frame. The logic behind it (it's a bit far-fetched, but who knows) is that if the client doesn't support WebSockets, I can easily fall back to the default jQuery ajax. I attached the original Backbone.sync to a var so I can pass the same arguments to it when no WebSocket is available.
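A minimal sketch of that fallback idea (the feature check and variable names are assumptions, not part of my current code):
var ajaxSync = Backbone.sync; // keep a reference to the original jQuery-based sync before overriding it

Backbone.sync = function (method, collection, options) {
    if (!window.io || !window.WebSocket) {
        // no socket.io / WebSocket support: delegate to the stock ajax sync
        return ajaxSync.call(this, method, collection, options);
    }
    // ...socket.io branch from the snippet above...
};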
All of that works properly and the server answers the client events. The server then emits each model's data as a separate WebSocket frame over the one connection. I see the frames in the Network/WebSocket filter as one (concurrent/established) connection, and things seem to be working.
Currently the function assumes I pass a collection and not a model.
Questions:
Is this approach OK?
How can I use the socket.io callbacks for 'success' and 'failure' etc. in Backbone the right way, so I don't have to call collection.create 'by hand'?
Is it better to establish different concurrent connections for models/collections or use the one already established instead?

Related

Storing custom objects in WebSocket in Node.js

I'm using a WebSocketServer. When a new client connects, I need to store data about that client as a custom object, and I want to be able to store it on the WebSocket itself so that when the server receives a new message from that client I can quickly access my custom object. Is there any way to do this, or do I just have to create a map of all my objects and look up the object from the WebSocket each time based on its ID? Thanks
The Map idea will work to look up each connection's data, but there's also no reason why you can't just add your own custom property to the webSocket object.
wss.on('connection', socket => {
    socket.userInfo = { someProperty: 'someValue' };
    socket.on('message', data => {
        console.log(socket.userInfo);
    });
});
Then, the .userInfo property will be available any time you get a message on that socket object.
Keep in mind that this info is temporary and only good for the lifetime of the socket object. So, if the client disconnects and reconnects (such as when changing web pages), that will generate a new socket object that will not have the previous .userInfo property on it. So, if that information needs to persist longer than the lifetime of one connection, you will have to store it elsewhere and index it by some persistent user property, not just the socket object.
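A minimal sketch of that longer-lived approach, assuming you can derive a persistent user id from the connection request (getUserIdFromRequest is a hypothetical helper, e.g. one that reads a cookie or token):
const userInfoById = new Map();

wss.on('connection', (socket, req) => {
    // hypothetical helper: derive a persistent id from a cookie/token on the upgrade request
    const userId = getUserIdFromRequest(req);
    socket.userId = userId;

    // reuse the stored object if this user has connected before, otherwise create it
    if (!userInfoById.has(userId)) {
        userInfoById.set(userId, { someProperty: 'someValue' });
    }

    socket.on('message', data => {
        console.log(userInfoById.get(socket.userId));
    });
});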

What is the proper way to emit an event with socket.io?

I want to emit an event to the client when a long function comes to an end.
This will show a hidden div with a link - on the client side.
This is the approach I tested:
// server.js
var express = require('express');
var app = express();
var http = require('http').Server(app);
var io = require('socket.io')(http);
require('./app/routes.js')(app, io);

// routes.js
app.post('/pst', function(req, res) {
    var url = req.body.convo;
    res.render('processing.ejs');
    myAsyncFunction(url).then(result => {
        console.log('Async Function completed');
        socket.emit('dlReady', { description: 'Your file is ready!' });
        // do some other stuff here
    }).catch(err => {
        console.log(err);
        res.render('error.ejs');
    });
});
I get this error:
ERROR: ReferenceError: socket is not defined
If I change the socket.emit() line to this:
io.on('connection', function(socket) {
    socket.emit('dlReady', { description: 'Your file is ready!' });
});
Then I don't receive an error, but nothing happens at the client.
This is the client code:
<script>
    document.querySelector('.container2').style.display = "none";
    var socket = io();
    // when the dlReady event arrives from socket.io, show the link part
    socket.on('dlReady', function(data) {
        document.querySelector('.container1').style.display = "none";
        document.querySelector('.container2').style.display = "block";
    });
</script>
This whole concept is likely a bit flawed. Let me state some facts about this environment that you must fully understand before you can follow what needs to happen:
When the browser does a POST, there's an existing page in the browser that issues the post.
If that POST is issued from a form post (not a post from Javascript in the page), then when you send back the response with res.render(), the browser will close down the previous page and render the new page.
Any socket.io connection from the previous page will be closed. If the new page from the res.render() has Javascript in it, when that Javascript runs, it may or may not create a new socket.io connection to your server. In any case, that won't happen until some time AFTER the res.render() is called as the browser has to receive the new page, parse it, then run the Javascript in it which has to then connect socket.io to your server again.
Remember that servers handle lots of clients. They are a one-to-many environment. So, you could easily have hundreds or thousands of clients that all have a socket.io connection to your server. So, your server can never assume there is ONE socket.io connection and sending to that one connection will go to a particular page. The server must keep track of N socket.io connections.
If the server ever wants to emit to a particular page, it has to create a means of figuring out which exact socket.io connect belongs to the page that it is trying to emit to, get that particular socket and call socket.emit() only on that particular socket. The server can never do this by creating some server-wide variable named socket and using that. A multi-user server can never do that.
The usual way to "track" a given client as it returns time after time to a server is by setting a unique cookie when the client first connects to your server. From then on, every connection from that client to your server (until the cookie expires or is somehow deleted by the browser), whether the client is connecting for an http request or making a socket.io connection (which also starts with an http request), will present the cookie, and you can then tell which client it is from that cookie.
So, my understanding of your problem is that you'd like to get a form POST from the client, return back to the client a rendered processing.ejs and then sometime later, you'd like to communicate with that rendered page in the client via socket.io. To do that, the following steps must occur.
Whenever the client makes the POST to your server, you must make sure there is a unique cookie sent back to that client. If the cookie already exists, you can leave it. If it does not exist, you must create a new one. This can be done manually, or you can use express-session to do it for you. I'd suggest using express-session because it will make the following steps easier and I will outline steps assuming you are using express-session.
Your processing.ejs page must have Javascript in it that makes a socket.io connection to your server and registers a message listener for the 'dlReady' message that your server will emit.
You will need a top-level io.on('connection', ...) on your server that puts the socket into the session object. Because the client can connect from multiple tabs, if you don't want that to cause trouble, you probably have to maintain an array of sockets in the session object.
You will need a socket.on('disconnect', ...) handler on your server that can remove a socket from the session object it's been stored in when it disconnects.
In your app.post() handler, when you are ready to send the dlready message, you will have to find the appropriate socket for that browser in the session object for that page and emit to that socket(s). If there are none because the page you rendered has not yet connected, you will have to wait for it to connect (this is tricky to do efficiently).
If the POST request comes in from Javascript in the page rather than from a form post, then things are slightly simpler because the browser won't close the current page and start a new page and thus the current socket.io connection will stay connected. You could still completely change the page visuals using client-side Javascript if you wanted. I would recommend this option.
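For the form-post path described in the numbered steps, a rough sketch might look like the following (assuming express-session and the usual middleware-sharing pattern; names like socketsBySession and myAsyncFunction are placeholders, and the "wait for the page to connect" case is left out):
const session = require('express-session');
const sessionMiddleware = session({ secret: 'keyboard cat', resave: false, saveUninitialized: true });

app.use(sessionMiddleware);
// let socket.io handshakes see the same session as the http requests
io.use((socket, next) => sessionMiddleware(socket.request, socket.request.res || {}, next));

// track which sockets belong to which session (a session may have several tabs open)
const socketsBySession = new Map();

io.on('connection', socket => {
    const sid = socket.request.session.id;
    if (!socketsBySession.has(sid)) socketsBySession.set(sid, new Set());
    socketsBySession.get(sid).add(socket);

    socket.on('disconnect', () => {
        socketsBySession.get(sid).delete(socket);
    });
});

app.post('/pst', (req, res) => {
    res.render('processing.ejs');
    myAsyncFunction(req.body.convo).then(result => {
        // emit only to the sockets that belong to this browser's session
        const sockets = socketsBySession.get(req.session.id) || new Set();
        for (const s of sockets) {
            s.emit('dlReady', { description: 'Your file is ready!' });
        }
    });
});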

Raising socket events from Node.js code

I am using the net library in Node. I want to raise a close event from my code. Is there any way I can do that?
If the socket object is stored, I can call sock.destroy() on it, but the client will not be informed about the closing connection, which results in a half-dropped connection.
Is there any other way to handle this case ?
Emit a custom 'kill_connection' event to that specific socket (client) in order to inform it that the connection is being terminated, for whatever reason you decide.
For example, using socket.io:
var csid = socket.id; //The socket you are going to destroy
io.to(csid).emit('kill_connection');
Do this before performing the destruction of the socket.
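If you want a fallback in case the client never acts on that event, you could also force the close from the server after a short grace period (for example; socket.disconnect(true) is socket.io's server-side force-close, and the 5-second delay is arbitrary):
io.to(csid).emit('kill_connection'); // inform the client first
setTimeout(function () {
    socket.disconnect(true); // force-close if the client never acted on the event
}, 5000);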
Your client (if a webapp) could look something like:
socket.on("kill_connection", killSession);

function killSession() {
    socket.disconnect();
    console.log("socket.disconnect");
    location.reload();
}
The reload is in case you want a single-page webapp to show the login screen.

Adding data to a socket.io socket object

I am trying to add some custom information to my socket object on connect, so that when I disconnect the socket, I can read that custom information.
I.e.:
// (Client)
socket.on('connect', function(data) {
    socket.customInfo = 'customdata';
});

// (server)
socket.on('disconnect', function() {
    console.log(socket.customInfo);
});
Since it is JavaScript, you can freely add attributes to any object (just as you did). However, socket.io does give you a built-in way to do that (so you won't have to worry about naming conflicts):
socket.set('nickname', name, function () {
    socket.emit('ready');
});
socket.get('nickname', function (err, name) {
    console.log('Chat message by ', name);
});
Note that this is only on one side (either client or server). Obviously you can't share data between client and server without communication (that's what your example suggests).
The socket in your browser and the socket on the server won't share the same properties if you set them.
Basically, you have set the data only on the client side (which is in your browser's memory, NOT on the server).
For anyone still looking for an answer, there are a few things you can do to share data from the client to the server without actually sending a message:
1. Add a custom property to the auth property in the client socket.io options. This will be available to the server event handlers in socket.handshake.auth.xxxx (see the sketch after this list).
2. Add a custom header to the transportOptions.polling.extraHeaders property in the client socket.io options. This will ONLY be presented when the socket.io client is connected via polling, not when "upgraded" to websockets (as you can't have custom headers then).
3. Add a custom query property to the client socket.io options. I don't recommend this, since it potentially exposes the data to intermediate proxies.
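A minimal sketch of option 1 (this relies on the auth option of the socket.io v3+ client; the customInfo property name is just an example):
// client
const socket = io('http://localhost:3000', {
    auth: { customInfo: 'customdata' }
});

// server
io.on('connection', socket => {
    console.log(socket.handshake.auth.customInfo); // 'customdata'
    socket.on('disconnect', () => {
        console.log(socket.handshake.auth.customInfo); // still available at disconnect time
    });
});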

Architecting a node.js application around socket.io

I'm writing an application in Node.js/Express based around websockets. I'm using Node's EventEmitter in conjunction with socket.io for a nearly completely event-driven app.
I wonder if this is a good architecture, though. My main socket is managed in app.js right now, and has code like this:
socket.on(Events.InitialFetch, function(battle_id) {
    dispatcher.emit(Events.InitialFetch, battle_id);
});
dispatcher.on(Events.InitialFetched, function(data) {
    socket.emit(Events.InitialFetched, data);
});
... while in my controller, I have code like this:
dispatcher.on('initial-fetch', function(data) {
    Battle.findOne({ _id: data })
        .populate('players')
        .populate('owner')
        .exec(function(err, battle) {
            if (err) {
            }
            else {
                dispatcher.emit(Events.InitialFetched, battle);
            }
        });
});
This replaces the normal RESTful routing. My concern is that it's a little confusing (i.e. 'fetch' and 'fetched' for describing data flow), and the fact that I'm basically passing events from one type of event emitter (socket.io) to another (Event.EventEmitter).
How can this be architected better? Would it be better to have the controllers directly access the socket instead of using EventEmitter as a bus? How can I make the names of my events more clear?
I wouldn't worry about using multiple event emitters. They are a good primitive to build upon in Node.js. As for design, a good question to ask is how deeply I have coupled my components.
By using a non-socket.io event emitter for your controller, socket.io remains a transport that is independent of the controller. This is good.
As a final stage, you should wire the two together using dependency injection. In your server.js file create your dispatcher, then initialize your socket.io module passing the dispatcher as a dependency.
var dispatcher = require('./dispatcher');
var socket_transport = require('./socket_transport');
socket_transport.init_with_dispatcher(dispatcher);
This will let you test your dispatcher independently of the transport. Debugging socket.io can be difficult.
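A possible sketch of what socket_transport.js could look like under that wiring (the module shape, the httpServer argument, and the Events module path are assumptions based on the question's code, not a prescribed implementation):
var Events = require('./events'); // assumed shared event-name constants

module.exports.init_with_dispatcher = function (dispatcher, httpServer) {
    var io = require('socket.io')(httpServer);

    io.on('connection', function (socket) {
        // forward client events onto the internal dispatcher bus
        socket.on(Events.InitialFetch, function (battle_id) {
            dispatcher.emit(Events.InitialFetch, battle_id);
        });

        // forward dispatcher events back to this client,
        // and remove the listener when the client disconnects
        function onFetched(data) {
            socket.emit(Events.InitialFetched, data);
        }
        dispatcher.on(Events.InitialFetched, onFetched);
        socket.on('disconnect', function () {
            dispatcher.removeListener(Events.InitialFetched, onFetched);
        });
    });
};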
