Broadly, I have the following workflow:
1. User asks for an article with a certain title.
2. Client side, Socket.io emits an event and passes the title as data.
3. Server side, Node fires an HTTP request to an API and gathers relevant information about that article.
4. When finished, the server emits that information to the client.
Since step 4 depends on step 3, my understanding is that the emit needs to happen inside a callback so it only runs once the request has finished. That gave me this:
io.on('connection', function(socket) {
  socket.on('need data', function(msg) {
    getLinkBacks(msg, socket);
  });
});
var getLinkBacks = function(title, socket) {
  request.get(/* relevant url */, function(err, res, body) {
    socket.emit("data", body);
  });
};
None of the socket.io documentation talks about async methods and it feels pretty weird to be passing the socket, rather than a callback function, which would be more Node-y. Am I using poor technique or thinking about the problem wrong or is this the standard way to emit the response of an asynchronous method?
Note: I would have put this on Code Review, but they don't have a tag for Socket.IO, which made me think it would fit better here.
I agree with you: Node's style is to pass a callback function, so I would rewrite this code as follows:
io.on('connection', function(socket) {
  socket.on('need data', function(msg) {
    getLinkBacks(msg, function(content) {
      socket.emit("data", content);
    });
  });
});

var getLinkBacks = function(title, fn) {
  request.get(/* relevant url */, function(err, res, body) {
    fn(body);
  });
};
This will keep each part of your app isolated, and the getLinkBacks function will not have to know anything about the socket. This is, of course, good programming practice. Besides, you could reuse getLinkBacks in other parts of your app that have nothing to do with the socket object.
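For example, here is a minimal sketch of reusing getLinkBacks outside the socket handler, in a plain Express route (the express app and the /linkbacks/:title route are assumptions for illustration, not part of your code):

// Hypothetical reuse of getLinkBacks in an ordinary HTTP route.
// The express app and the /linkbacks/:title path are illustrative assumptions.
var express = require('express');
var app = express();

app.get('/linkbacks/:title', function(req, res) {
  getLinkBacks(req.params.title, function(content) {
    res.send(content);
  });
});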
P.S. I would recommend Robert Martin's "Clean Code: A Handbook of Agile Software Craftsmanship". He gives very valuable advice about how to structure your code to make it "clean".
Good luck!
Related
I am building a web app that uses socket.io to trigger remote procedure calls that pass session-specific data back to the client. As my app gets bigger and gains more users, I wanted to check whether my design is reasonable.
The server that receives websocket communications and triggers RPCs looks something like this:
s.on('run', function(input) {
  client.invoke(input.method, input.params, s.id, function(error, res, more) {
    s.emit('output', {
      method: input.method,
      error: error,
      response: res,
      more: more,
      id: s.id
    });
  });
});
However, this means that the client has to first emit the method invocation, and then listen to all method returns and pluck out its correct return value:
socket.on('output', function(res) {
  if (res.id === socket.sessionid) {
    if (!res.error) {
      if (res.method === 'myMethod') {
        var myResponse = res.response;
        // Do more stuff with my response
      }
    }
  }
});
It is starting to seem like a messy design as I add more and more functions... is there a better way to do this?
The traditional AJAX way of attaching a callback to each function would be a lot nicer, but I want to take advantage of the benefits of websockets (e.g. less overhead for rapid communications).
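For reference, Socket.IO's acknowledgement callbacks (if the Socket.IO version in use supports them) give roughly that per-call callback shape over the same websocket; a minimal sketch, reusing the names from the code above:

// Client: pass a callback as the last argument to emit; Socket.IO invokes it
// when the server calls its acknowledgement function. No shared 'output' listener needed.
socket.emit('run', { method: 'myMethod', params: {} }, function(error, res, more) {
  if (!error) {
    var myResponse = res;
    // Do more stuff with my response
  }
});

// Server: the acknowledgement function arrives as the last handler argument.
s.on('run', function(input, ack) {
  client.invoke(input.method, input.params, s.id, function(error, res, more) {
    ack(error, res, more);
  });
});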
I have a Node.js application with a frontend app and a backend app. The backend manages the list and "pushes" an update to the frontend app; that call triggers a list update so that all clients receive the correct list data.
The problem is on the backend side: when I press the button, I perform an AJAX call, and that AJAX call runs the following code (some operations trimmed out):
Lists.findOne({_id: active_settings.active_id}, function(error, lists_result) {
  var song_list = new Array();
  for (i = 0; i < lists_result.songs.length; i++) {
    song_list.push(lists_result.songs[i].ref);
  }

  Song.find({
    '_id': {$in: song_list}
  }, function(error, songs) {
    // DO STUFF WITH THE SONGS
    // UPDATE SETTINGS (code trimmed)
    active_settings.save(function(error, updated_settings) {
      list = {
        settings: updated_settings,
      };

      var io = require('socket.io-client');
      var socket = io.connect(config.app_url);
      socket.on('connect', function () {
        socket.emit('update_list', {key: config.socket_key});
      });

      response.json({
        status: true,
        list: list
      });
      response.end();
    });
  });
});
However, response.end never seems to work and the call keeps hanging. Furthermore, the list doesn't always get refreshed, so there is an issue with the socket.emit code. And the socket connection stays open, I assume because the response isn't ended?
I have never done this server side before, so any help would be much appreciated. (active_settings etc. do exist.)
I see some issues that might or might not be causing your problems:
- list isn't properly scoped, since you don't prefix it with var; essentially, you're creating a global variable which might get overwritten when multiple requests are being handled;
- response.json() calls .end() itself; calling response.end() again yourself doesn't hurt, but it isn't necessary;
- since you're not closing the socket(.io) connection anywhere, it will probably always stay open;
- it seems more appropriate not to set up a new socket.io connection for each request, but to create one once at app startup and re-use it (a sketch follows below).
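A minimal sketch of that last point, assuming socket.io-client and the config values from your code (the shared module itself is just an illustrative assumption):

// e.g. sockets.js -- created once at app startup and re-used by request handlers.
// config.app_url and config.socket_key come from the question; the module layout is an assumption.
var io = require('socket.io-client');
var socket = io.connect(config.app_url);

socket.on('connect', function () {
  // identify/authenticate once here if needed
});

module.exports = socket;

// In the request handler, just re-use it:
//   socket.emit('update_list', { key: config.socket_key });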
I want to do something like:
//client -> notifies server that client is connected.
//server -> begins fetching information from DB (series of both async and synchronous requests).
//as sets of data become available on server -> server pushes updates to client via res.render()
Basically I have a menu item on the client, and I want to update that menu as the data that the server fetches becomes ready. Is there any way to do this? I notice I can't do
res.render('something');
// again
res.render('somethingElse');
Because once render is called, the response is sent, and render cannot be called again:
"Error: Can't set headers after they are sent."
Any suggestions?
You might benefit from using WebSockets:
http://en.wikipedia.org/wiki/WebSocket
This post has a little bit of info:
Which websocket library to use with Node.js?
HTTP works via request/response. Typically once the response is sent, the connection is terminated.
To stream data from the server to client, you can use websockets. There is a very popular node.js module called socket.io, which simplifies using websockets.
Using socket.io, the client code would look like this:
var socket = io.connect('http://yourserver.com');

socket.on('data', function (data) {
  updateMenu(data);
});
And the server code:
var io = require('socket.io').listen(80);

io.sockets.on('connection', function (socket) {
  socket.emit('data', data);
  getMoreDataFromDb(function(data) {
    socket.emit('data', data);
  });
  // etc..
});
Alternatively, if you want a simpler solution, you can just make multiple small ajax requests to the server, until you get all your data:
(function getData(dataId) {
  $.ajax({
    url: "yourserver.com/getdata",
    data: dataId || {},
    success: function(data) {
      updateMenu(data);
      if (data) getData({ lastDataReceived: data.lastId }); // server is still returning data, request more
    }
  });
})();
So I understand that Node.js Connect works like a stack through which every request runs, starting from the top and going to the bottom. The Connect introduction by its author at http://howtonode.org/connect-it shows an example like
var Connect = require('connect');

module.exports = Connect.createServer(
  require('./log-it')(),
  require('./serve-js')()
);
The article reads
Every request enters the onion at the outside and traverses layer by
layer till it hits something that handles it and generates a response.
In Connect terms, these are called filters and providers. Once a layer
provides a response, the path happens in reverse.
I'm particularly curious about "Once a layer provides a response, the path happens in reverse". How does that happen? Does every middleware get called again, but in reverse order?
No, they don't get called again in reverse, but each middleware has a chance to monkey-patch the response methods and hijack them. It's not ideal.
// basic logger example
module.exports = function () {
  return function logger(req, res, next) {
    var writeHead = res.writeHead;
    res.writeHead = function (code, headers) {
      console.log(req.method, req.url, code);
      res.writeHead = writeHead;
      return res.writeHead(code, headers);
    };
    next();
  };
};
Now, this code has issues because writeHead isn't the only way to set the status code, so it won't catch all requests. But that is the basic way that middleware can catch events on the way out.
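As an aside, one way to sidestep the patching (a sketch, assuming a Node version whose responses emit the 'finish' event) is to listen for the response finishing and read res.statusCode there:

// logger sketch without monkey-patching: log once the response has actually been sent
module.exports = function () {
  return function logger(req, res, next) {
    res.on('finish', function () {
      console.log(req.method, req.url, res.statusCode);
    });
    next();
  };
};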
I'm starting a school project in node.js. It's a system that consists of a couple of node.js servers connected via TCP sockets.
I tried some TCP libraries (the net module from Node itself, nssocket from Nodejitsu, and now I'm experimenting with zeromq), but they don't seem to have one important feature: message delivery confirmation.
What I would like to have is some kind of library that allows to do something like this:
client.connect(server, port);
client.send(data, function callback(err, res) { ... });
The callback function would be called after the message is delivered or when something bad (timeout, network failure) happens during sending.
I was thinking I would write my own protocol over TCP, but I would prefer something more robust and tested.
Thanks for any constructive answers :)
It's not really convenient, because what happens when A->B goes fine, but the receive confirmation (B->A) fails? Anyhow, it's pretty easy to roll your own, something like this (pseudo):
Sender
var callbacks = {};

function send(msg, callback) {
  // assign unique request id
  msg.msgId = uuid.v4();
  msg.timestamp = new Date();
  callbacks[msg.msgId] = callback;
  // send over TCP
}

socket.on("confirmation", function (msg) {
  callbacks[msg.msgId](null);
  delete callbacks[msg.msgId];
});

// now you'll need some polling mechanism that checks for messages that take
// more than the timeout and cleans them up
Receiver
socket.on("message", function (msg) {
// send confirmation to sender
socket.emit("confirmation", { msgId: msg.msgId });
});
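The polling/cleanup mentioned in the sender sketch could look roughly like this (still pseudo: the pendingTimestamps map, which send() would also have to populate, and the 5-second timeout are assumptions for illustration):

// Periodically fail and clean up callbacks whose messages were never confirmed.
var TIMEOUT_MS = 5000;
var pendingTimestamps = {}; // send() would set pendingTimestamps[msg.msgId] = Date.now()

setInterval(function () {
  var now = Date.now();
  Object.keys(callbacks).forEach(function (msgId) {
    if (now - pendingTimestamps[msgId] > TIMEOUT_MS) {
      callbacks[msgId](new Error("delivery confirmation timed out"));
      delete callbacks[msgId];
      delete pendingTimestamps[msgId];
    }
  });
}, 1000);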