Click events with Socket IO - node.js

I'm using node and Socket IO to set up click events in one browser and trigger an animation in another. I'm having difficulty getting this to work. For example, clicking a button in one browser will hide a box in the other. The code I have so far is:
Client side:
function hideBox(data) {
  $('.box').hide();
};
$('.btn').on('click', function(event) {
  socket.emit('hideBtn', {id: event.target});
});
socket.on('hideBtn', function(data) {
  $(data.id).hideBox;
});
Server side:
socket.on('hideBtn', function(data) {
  socket.broadcast.emit('hideBtn', data);
});
Any help would be greatly appreciated. Thanks.

socket.on('hideBtn', function(data) {
  $(data.id).hideBox;
});
hideBox is a function, right? Maybe it's due to the forgotten "()" :-)

OK, so after a lot of going back and forth, the answer turned out to be simply a matter of simplifying the code. There's no need to pass event.target around; just emit the event and listen for it.
Code for client side, index.html:
$('.btn').on('click', function() {
  socket.emit('hideBoxRequest');
});
Code for server side:
socket.on('hideBoxRequest', function () {
  socket.broadcast.emit('hideBox');
});
Final code client side, for my other html file:
socket.on('hideBox', function () {
  $('.testBox').fadeOut();
});
So basically, clicking the button in index.html emits 'hideBoxRequest' to the server. The server picks this up and broadcasts 'hideBox' to all the other clients connected to the server.

Related

Socket.io emit not called

I am trying to emit two things on disconnection, and one of them is a redirect.
Server side:
socket.on('disconnect', () => {
  socket.broadcast.emit('user-disconnected', users[socket.id]);
  socket.emit('redirect', destination);
  console.log('disconnect: ' + users[socket.id]);
  delete users[socket.id];
});
Client side:
This one gets called just fine:
socket.on('user-disconnected', username => {
  appendMessage(username + ' est parti.');
  removeUser(username);
});
This does not:
socket.on('redirect', function(destination) { // never gets called…
  window.location.href = destination;
});
Do you know why user-disconnected gets called on the client but my redirect handler never does?
Thank you so much if you help!
You are sending:
socket.emit('redirect', destination);
to a socket that has just disconnected. You're literally trying to send the message from inside the disconnect event handler, and that socket can no longer receive messages. You can do socket.broadcast.emit(...) to send to all the other sockets, but not to this one.
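The point of the answer can be shown with a toy model (no real Socket.IO involved): the server keeps a set of connected sockets, broadcast delivers to everyone except the sender, and once a socket is removed from the set, nothing reaches it any more.

```javascript
// Toy model of a Socket.IO server's connection set.
const connected = new Set();

// A mock socket just records what was delivered to it.
function makeSocket() {
  return {
    received: [],
    deliver(event, data) { this.received.push({ event, data }); }
  };
}

// Like socket.broadcast.emit: deliver to everyone except the sender.
function broadcastFrom(sender, event, data) {
  for (const s of connected) {
    if (s !== sender) s.deliver(event, data);
  }
}

const alice = makeSocket();
const bob = makeSocket();
connected.add(alice);
connected.add(bob);

// alice disconnects: she leaves the set, then the others are notified.
connected.delete(alice);
broadcastFrom(alice, 'user-disconnected', 'alice');

console.log(bob.received.length);   // → 1
console.log(alice.received.length); // → 0, nothing can reach her any more
```

For the redirect itself, it would have to be emitted while the socket is still connected (for example, from whatever event triggers the user leaving), not from the disconnect handler.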

SocketIO - processing data before broadcasting

I am currently having a problem with a small Node.JS application. It is extremely similar to, for example, the Socket.IO sample chat application: whenever an user sends data to the server, I need to process the data and then broadcast it to all users. However, the way data is processed depends on the user that receives it. I tried altering the Socket#emit method (in order to process the data before sending it) for each socket, but the broadcasting doesn't use it. Is there any way I can get this solved?
Thanks in advance.
EDIT
Here is what I tried:
var io = require('socket.io');
io.use(function(socket, next) {
  var old_emit = socket.emit;
  socket.emit = function() {
    // do something with the data (arguments[1] in this case)
    // the processing of the data depends on a value that is
    // unique to each connection, and can't be sent over to the client
    old_emit.apply(socket, arguments);
  }
});
I tried the code above because I thought, initially, that broadcasting would call emit somewhere else for each connection.
I am not sure exactly what you want to do, but there is no io() function like the one you're calling; you are using the API incorrectly.
As for processing the data, here is how the sample chat application would do it:
// when the client emits 'new message', this listens and executes
socket.on('new message', function (data) {
  // call a function which processes the data
  var newData = messageProcessingFunc(data);
  // and then broadcast it:
  // we tell the client to execute 'new message'
  socket.broadcast.emit('new message', {
    username: socket.username,
    message: newData
  });
});
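The snippet above processes the message once before broadcasting, but the question says the processing depends on the user who *receives* it, which a single broadcast payload can't express. A hedged alternative sketch: skip broadcast and loop over the connected sockets, emitting an individually processed payload to each. Mock sockets are used here so the sketch runs standalone; with real Socket.IO you would iterate the sockets you track on connection (the exact API for that differs between versions), and processFor is a hypothetical per-user transform.

```javascript
// Hypothetical per-recipient transform, e.g. based on the recipient's
// locale or permissions.
function processFor(recipient, data) {
  return recipient.name + ' sees: ' + data;
}

// Emit a separately processed payload to every socket except the sender,
// mimicking broadcast semantics but with per-recipient data.
function sendToEach(sockets, sender, event, data) {
  for (const s of sockets) {
    if (s === sender) continue;          // broadcast skips the sender
    s.emit(event, processFor(s, data));  // per-recipient payload
  }
}

// Mock sockets that just record what they were sent.
function mockSocket(name) {
  return { name, inbox: [], emit(event, payload) { this.inbox.push(payload); } };
}

const a = mockSocket('a'), b = mockSocket('b'), c = mockSocket('c');
sendToEach([a, b, c], a, 'new message', 'hello');

console.log(b.inbox[0]);     // → "b sees: hello"
console.log(a.inbox.length); // → 0, the sender is skipped
```

This trades broadcast's single serialization for one emit per recipient, which is the cost of recipient-specific payloads.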

Passing socket as a Socket.io Param?

Broadly, I have the following workflow:
1. User asks for an article with a certain title
2. Client side, Socket.io emits an event and passes the title as data
3. Server side, node fires an http request to an API and gathers relevant information about that article
4. When finished, the server emits that information to the client.
Since 4 depends on 3, my understanding is that it needs to be captured in a callback to effect synchronous behavior. That gave me this:
io.on('connection', function(socket) {
  socket.on('need data', function(msg) {
    getLinkBacks(msg, socket);
  });
});

var getLinkBacks = function(title, socket) {
  request.get(/* relevant url */, function(err, res, body) {
    socket.emit("data", body);
  });
};
None of the socket.io documentation talks about async methods and it feels pretty weird to be passing the socket, rather than a callback function, which would be more Node-y. Am I using poor technique or thinking about the problem wrong or is this the standard way to emit the response of an asynchronous method?
Note: I would have put this on Code Review, but they don't have a tag for Socket.IO, which made me think it would fit better here.
I agree with you, Node's style is passing a callback function, so I would rewrite this code as follows:
io.on('connection', function(socket) {
  socket.on('need data', function(msg) {
    getLinkBacks(msg, function(content) {
      socket.emit("data", content);
    });
  });
});

var getLinkBacks = function(title, fn) {
  request.get(/* relevant url */, function(err, res, body) {
    fn(body);
  });
};
This will keep each part of your app isolated, and the getLinkBacks function will not have to know anything about sockets. This is of course good programming practice. Besides, you could reuse getLinkBacks in other parts of your app that have nothing to do with the socket object.
P.S. I would recommend Robert Martin's "Clean Code: A Handbook of Agile Software Craftsmanship". He gives very valuable advice about how to structure your code to make it "clean".
Good luck!
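One refinement of the callback version above, offered as a sketch rather than the answer's actual code: the inner callback currently ignores err, whereas Node convention is an error-first callback so the caller can react to request failures. request.get is stubbed with a synchronous stand-in here so the sketch runs standalone, and the URL is a made-up placeholder.

```javascript
// Stub standing in for the `request` HTTP library, so this runs offline.
const request = {
  get(url, cb) {
    // a real request would be asynchronous; this stand-in calls back
    // immediately with a fake body
    cb(null, { statusCode: 200 }, 'article body for ' + url);
  }
};

// Error-first callback: fn(err) on failure, fn(null, body) on success.
function getLinkBacks(title, fn) {
  request.get('https://example.com/api?title=' + encodeURIComponent(title),
    function(err, res, body) {
      if (err) return fn(err); // propagate the error instead of swallowing it
      fn(null, body);
    });
}

getLinkBacks('Some Article', function(err, content) {
  if (err) throw err;        // or socket.emit an error event to the client
  console.log(content);      // the fetched (here, stubbed) body
});
```

The socket handler would then become socket.on('need data', ...) calling getLinkBacks and emitting either the data or an error event, keeping the HTTP layer ignorant of sockets either way.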

Node.js server side connection to Socket.io

I have a Node.js application with a frontend app and a backend app. The backend manages the list and "pushes" an update to the frontend app; that call triggers a list update so that all clients receive the correct list data.
The problem is on the backend side: when I press the button, I perform an AJAX call, and that AJAX call runs the following code (some operations trimmed out):
Lists.findOne({_id: active_settings.active_id}, function(error, lists_result) {
  var song_list = new Array();
  for (i = 0; i < lists_result.songs.length; i++) {
    song_list.push(lists_result.songs[i].ref);
  }
  Song.find({
    '_id': {$in: song_list}
  }, function(error, songs) {
    // DO STUFF WITH THE SONGS
    // UPDATE SETTINGS (code trimmed)
    active_settings.save(function(error, updated_settings) {
      list = {
        settings: updated_settings,
      };
      var io = require('socket.io-client');
      var socket = io.connect(config.app_url);
      socket.on('connect', function () {
        socket.emit('update_list', {key: config.socket_key});
      });
      response.json({
        status: true,
        list: list
      });
      response.end();
    });
  });
});
However, the response.end never seems to work and the call keeps hanging. Furthermore, the list doesn't always get refreshed, so there is an issue with the socket.emit code. And the socket connection stays open, I assume because the response isn't ended?
I have never done this server side before so any help would be much appreciated. (the active_settings etc exists)
I see some issues that might or might not be causing your problems:
- list isn't properly scoped, since you don't prefix it with var; essentially, you're creating a global variable which might get overwritten when multiple requests are being handled;
- response.json() calls .end() itself; it doesn't hurt to call response.end() again yourself, but it isn't necessary;
- since you're not closing the socket(.io) connection anywhere, it will probably always stay open;
- it sounds more appropriate to not set up a new socket.io connection for each request, but to create one just once at app startup and re-use it.
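The last point above can be sketched as a lazily created, shared connection. This is only an illustration of the reuse pattern: the connection factory is injected and stubbed so the sketch runs without socket.io-client installed; in the real app it would be require('socket.io-client').connect(config.app_url), created at startup instead of inside every request handler.

```javascript
// Shared, lazily created connection: connect() runs at most once,
// and every subsequent caller gets the same socket back.
let sharedSocket = null;

function getSocket(connect) {
  if (!sharedSocket) sharedSocket = connect(); // only on first use
  return sharedSocket;
}

// Stub "connection factory" that counts how often a connection is made.
let connections = 0;
const fakeConnect = () => { connections++; return { emit() {} }; };

// Two separate "requests" both emit through the same connection.
getSocket(fakeConnect).emit('update_list', { key: 'socket_key' });
getSocket(fakeConnect).emit('update_list', { key: 'socket_key' });

console.log(connections); // → 1, the connection is reused
```

With a single long-lived connection there is also exactly one place to close it on shutdown, instead of one dangling connection per request.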

Node.js Express rendering multiple subsequent views

I want to do something like:
//client -> notifies server that client is connected.
//server -> begins fetching information from DB (series of both async and synchronous requests).
//as sets of data become available on server -> server pushes updates to client via res.render()
Basically I have a menu item on the client, and I want to update that menu as the data the server fetches becomes ready. Is there any way to do this? I notice I can't do
res.render('something');
// again
res.render('somethingElse');
because once render is called, the response is sent, and render cannot be called again:
"Error: Can't set headers after they are sent."
Any suggestions?
You might benefit from using WebSockets:
http://en.wikipedia.org/wiki/WebSocket
This post has a little bit of info:
Which websocket library to use with Node.js?
HTTP works via request/response. Typically once the response is sent, the connection is terminated.
To stream data from the server to client, you can use websockets. There is a very popular node.js module called socket.io, which simplifies using websockets.
Using socket.io, the client code would look like this:
var socket = io.connect('http://yourserver.com');
socket.on('data', function (data) {
  updateMenu(data);
});
And the server code:
var io = require('socket.io').listen(80);
io.sockets.on('connection', function (socket) {
  socket.emit('data', data);
  getMoreDataFromDb(function(data) {
    socket.emit('data', data);
  });
  // etc..
});
Alternatively, if you want a simpler solution, you can just make multiple small ajax requests to the server, until you get all your data:
(function getData(dataId) {
  $.ajax({
    url: "yourserver.com/getdata",
    data: dataId || {},
    success: function(data) {
      updateMenu(data);
      if (data) getData({ lastDataReceived: data.lastId }); // server is still returning data, request more
    }
  });
})();
