I am new to Node.js and Socket.IO. According to the documentation, the client-side code is something like:
<script src="/socket.io/socket.io.js"></script>
<script>
var socket = io('http://localhost:3000');
socket.on('news', function (data) {
console.log(data);
socket.emit('my other event', { my: 'data' });
});
</script>
It's simple and easy, but although it's perfectly fine for local testing, I am not sure whether it's really secure on a live page, since it exposes the server URL and event information directly.
Should I change the client to something else (or somehow hide it?) for the live page, or is it good to go the way it is?
Yes, indeed it is dangerous and not secure!
But there are a few things you can do to enhance the security of your Socket.IO setup:
Use an HTTPS connection; this will prevent anyone from eavesdropping on your socket connection.
Always authenticate your user before establishing the socket connection with your client.
The link below is a good example of how you can authenticate your Socket.IO connection using a JSON Web Token:
https://auth0.com/blog/2014/01/15/auth-with-socket-io/
If you are using session-based authentication, you can implement a similar architecture; a rough sketch of token checking follows below.
You could also obfuscate your client code; a minifier will make it ugly and harder for an attacker to trace. But if you authenticate before establishing the connection, it should be safe enough to expose your URL.
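For illustration, the token check can be done in a Socket.IO middleware so unauthenticated clients are rejected before any events are exchanged. This is only a sketch, not the code from the linked article: jsonwebtoken, the JWT_SECRET environment variable and the way the token is sent (socket.handshake.auth on newer Socket.IO versions, socket.handshake.query on older ones) are assumptions.
// Sketch only -- jsonwebtoken, JWT_SECRET and the token transport are assumptions
var jwt = require('jsonwebtoken');

io.use(function (socket, next) {
  // client connects with io({ auth: { token: '...' } }) on newer versions
  var token = socket.handshake.auth && socket.handshake.auth.token;
  jwt.verify(token, process.env.JWT_SECRET, function (err, decoded) {
    if (err) return next(new Error('Authentication error'));
    socket.user = decoded; // keep the verified identity on this socket
    next();
  });
});

io.on('connection', function (socket) {
  // only authenticated clients reach this point
});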
I want to emit an event to the client when a long function comes to an end.
This will show a hidden div with a link on the client side.
This is the approach I tested:
//server.js
var express = require('express');
var app = express();
var http = require('http').Server(app);
var io = require('socket.io')(http);
require('./app/routes.js')(app, io);
//routes.js
app.post('/pst', function(req, res) {
var url = req.body.convo;
res.render('processing.ejs');
myAsyncFunction(url).then(result => {
console.log('Async Function completed');
socket.emit('dlReady', { description: 'Your file is ready!'});
//do some other stuff here
}).catch(err => {
console.log(err);
res.render('error.ejs');
})
});
I get this:
ERROR: ReferenceError: socket is not defined
If I change the socket.emit() line to this:
io.on('connection', function(socket) {
socket.emit('dlReady', { description: 'Your file is ready!'});
});
Then I don't receive an error, but nothing happens at the client.
This is the client code:
<script>
document.querySelector('.container2').style.display = "none";
var socket = io();
socket.on('dlReady', function(data) { //When you receive dlReady event from socket.io, show the link part
document.querySelector('.container1').style.display = "none";
document.querySelector('.container2').style.display = "block";
});
</script>
This whole concept is likely a bit flawed. Let me state some facts about this environment that you must fully understand before you can follow what needs to happen:
When the browser does a POST, there's an existing page in the browser that issues the post.
If that POST is issued from a form post (not a post from Javascript in the page), then when you send back the response with res.render(), the browser will close down the previous page and render the new page.
Any socket.io connection from the previous page will be closed. If the new page from the res.render() has Javascript in it, when that Javascript runs, it may or may not create a new socket.io connection to your server. In any case, that won't happen until some time AFTER the res.render() is called as the browser has to receive the new page, parse it, then run the Javascript in it which has to then connect socket.io to your server again.
Remember that servers handle lots of clients. They are a one-to-many environment. So, you could easily have hundreds or thousands of clients that all have a socket.io connection to your server. So, your server can never assume there is ONE socket.io connection and sending to that one connection will go to a particular page. The server must keep track of N socket.io connections.
If the server ever wants to emit to a particular page, it has to create a means of figuring out which exact socket.io connection belongs to the page that it is trying to emit to, get that particular socket and call socket.emit() only on that particular socket. The server can never do this by creating some server-wide variable named socket and using that. A multi-user server can never do that.
The usual way to "track" a given client as it returns time after time to a server is by setting a unique cookie when the client first connects to your server. From then on, every connection from that client to your server (until the cookie expires or is somehow deleted by the browser), whether the client is connecting for an http request or is making a socket.io connection (which also starts with an http request), will present the cookie, and you can then tell which client it is from that cookie.
So, my understanding of your problem is that you'd like to get a form POST from the client, return back to the client a rendered processing.ejs and then sometime later, you'd like to communicate with that rendered page in the client via socket.io. To do that, the following steps must occur.
Whenever the client makes the POST to your server, you must make sure there is a unique cookie sent back to that client. If the cookie already exists, you can leave it. If it does not exist, you must create a new one. This can be done manually, or you can use express-session to do it for you. I'd suggest using express-session because it will make the following steps easier and I will outline steps assuming you are using express-session.
Your processing.ejs page must have Javascript in it that makes a socket.io connection to your server and registers a message listener for the dlReady message that your server will emit.
You will need a top-level io.on('connection', ...) on your server that puts the socket into the session object. Because the client can connect from multiple tabs, if you don't want that to cause trouble, you probably have to maintain an array of sockets in the session object.
You will need a socket.on('disconnect', ...) handler on your server that can remove a socket from the session object it's been stored in when it disconnects.
In your app.post() handler, when you are ready to send the dlReady message, you will have to find the appropriate socket for that browser in the session object for that page and emit to that socket (or sockets). If there are none because the page you rendered has not yet connected, you will have to wait for it to connect (this is tricky to do efficiently). A rough sketch of these steps follows below.
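Here is one way that sketch could look, assuming express-session. One variation from the steps above: rather than literally storing live sockets inside the session object (they don't serialize into a session store), it keeps an in-memory map keyed by session ID. myAsyncFunction, processing.ejs and the /pst route come from your question; everything else is assumed.
// Sketch only -- assumes express-session; myAsyncFunction and processing.ejs come from the question
var express = require('express');
var session = require('express-session');
var app = express();
var http = require('http').Server(app);
var io = require('socket.io')(http);

app.use(express.urlencoded({ extended: false })); // parse form posts for req.body.convo

var sessionMiddleware = session({ secret: 'replace-me', resave: false, saveUninitialized: true });
app.use(sessionMiddleware);

// run the same session middleware on the socket.io handshake so both share the cookie
io.use(function (socket, next) {
  sessionMiddleware(socket.request, socket.request.res || {}, next);
});

// track which sockets belong to which browser session
var socketsBySession = new Map(); // sessionID -> Set of sockets

io.on('connection', function (socket) {
  var id = socket.request.sessionID;
  if (!socketsBySession.has(id)) socketsBySession.set(id, new Set());
  socketsBySession.get(id).add(socket);

  socket.on('disconnect', function () {
    socketsBySession.get(id).delete(socket);
  });
});

app.post('/pst', function (req, res) {
  var url = req.body.convo;
  res.render('processing.ejs');

  myAsyncFunction(url).then(function (result) {
    // emit only to the sockets belonging to this browser's session;
    // if the rendered page has not connected yet, nothing is found (see the caveat above)
    var sockets = socketsBySession.get(req.sessionID) || new Set();
    sockets.forEach(function (s) {
      s.emit('dlReady', { description: 'Your file is ready!' });
    });
  }).catch(function (err) {
    console.log(err);
  });
});

http.listen(3000);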
If the POST request comes in from Javascript in the page rather than from a form post, then things are slightly simpler because the browser won't close the current page and start a new page and thus the current socket.io connection will stay connected. You could still completely change the page visuals using client-side Javascript if you wanted. I would recommend this option.
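If you go that route, the client side might look roughly like this. It is only a sketch: the form and the #url input are made up, while the /pst route and the dlReady handler come from your question.
// Sketch of the recommended option: POST from page Javascript so the page
// (and its socket.io connection) stays alive. The form and '#url' input are hypothetical.
var socket = io();

socket.on('dlReady', function (data) {
  document.querySelector('.container1').style.display = 'none';
  document.querySelector('.container2').style.display = 'block';
});

document.querySelector('form').addEventListener('submit', function (e) {
  e.preventDefault(); // stop the normal form navigation so the socket stays connected
  fetch('/pst', {
    method: 'POST',
    body: new URLSearchParams({ convo: document.querySelector('#url').value })
  });
  // show a "processing" state here with client-side Javascript instead of rendering processing.ejs
});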
My goal is to build a chat application, similar to WhatsApp.
To my understanding, Socket.IO is a real-time communication library written in JavaScript, and it is very simple to use.
For example:
// Serverside
io.on('connection', function(socket) {
socket.on('chat', function(msg) {
io.emit('chat', msg);
});
});
// ClientSide (Using jquery)
var socket = io();
$('form').submit(function(){
socket.emit('chat', $('#m').val());
$('#m').val('');
return false;
});
socket.on('chat', function(msg){
$('#messages').append($('<li>').text(msg));
});
1) Do I always need to start an io.on('connection') to use the real-time feature, or could I just start using socket.on instead? For example, I have a route:
app.post('/postSomething', function(req, res) {
// Do i need to start an io.on or socket.on here?
});
because I want the real-time feature to listen only on a specific route.
2) Redis is a data structure store which handles pub/sub; why do we need to use the pub/sub mechanism?
I read a lot of articles but couldn't grasp the concept. Example article: http://ejosh.co/de/2015/01/node-js-socket-io-and-redis-intermediate-tutorial-server-side/
For example, the code below:
// Do i need redis for this, if so why? is it for caching purposes?
// Where does redis fit in this code?
var redis = require("redis");
var client = redis.createClient();
io.on('connection', function(socket) {
socket.on('chat', function(msg) {
io.emit('chat', msg);
});
});
3) Just wondering why I need nginx to scale a Node.js application? I found this Stack Overflow answer:
Strategy to implement a scalable chat server
It says something about load balancing; I read about that online and couldn't grasp the concept either.
So far I have only been dealing with simple Node.js and Mongoose CRUD applications, but I'm willing to work really hard if you could share some of your knowledge and some useful resources so that I could deepen my understanding of all of these technologies.
Cheers!
Q. Socket.on without IO.on
io.on("connection", ...)
is called when the server receives a new connection, and socket.on listens for the emits coming from the other end of that connection. If you want your server to receive events from clients then (in short) yes, io.on is required.
Q. Redis pub/sub vs Socket.IO
Take a look at this SO question/answer, quoting:
Redis pub/sub is great in case all clients have direct access to redis. If you have multiple node servers, one can push a message to the others.
But if you also have clients in the browser, you need something else to push data from a server to a client, and in this case, socket.io is great.
Now, if you use socket.io with the Redis store, socket.io will use Redis pub/sub under the hood to propagate messages between servers, and servers will propagate messages to clients.
So using socket.io rooms with socket.io configured with the Redis store is probably the simplest for you.
Redis can act as a message queue if that is a requirement. Redis is a data store supporting many data types.
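To make it concrete, here is roughly where Redis fits into your snippet once you run more than one Node process. This is only a sketch: it assumes the socket.io-redis adapter and a Redis server on localhost, and the exact adapter package depends on your Socket.IO version. With a single process you do not need it at all.
// Sketch only -- assumes the socket.io-redis adapter and a Redis server on localhost
var io = require('socket.io')(3000);
var redisAdapter = require('socket.io-redis');

// the adapter uses Redis pub/sub under the hood to propagate events between Node processes
io.adapter(redisAdapter({ host: 'localhost', port: 6379 }));

io.on('connection', function (socket) {
  socket.on('chat', function (msg) {
    // with the adapter, this reaches clients connected to ANY of your server instances,
    // not just the one that received the message
    io.emit('chat', msg);
  });
});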
Q. Why Nginx with Node.js
Node.js can work standalone, but nginx is faster at serving static content.
Since nginx is a reverse proxy, servers are configured with nginx to handle all the static work (serving static files, doing redirects, handling SSL certificates and serving error pages), and every other request is sent to Node.js.
Check this Quora post as well: Should I host a node.js project without nginx?
Quoting:
Nginx can be used to remove some load from the Node.js processes, for example, serving static files, doing redirects, handling SSL certificates and serving error pages.
You can do everything without Nginx, but it means you have to code it yourself, so why not use a fast and proven solution for this.
I am currently working on a project with socket.io, and I'm not sure I fully understand the reconnection mechanism.
Since a disconnection could happen client side, I would like to know how to maintain the state of the socket on the server. I already know that socket.io-client will try to reconnect automatically, but I would like to know if it is possible to ensure the state of the socket on the server side.
I was thinking of a cookie-based session, with Express for example, but again I am not sure if I'm going about this the right way. Is there another solution I should consider?
For the record, I successfully configured HAProxy with a cookie-based sticky-sessions mechanism. Would it be possible to combine this mechanism with a cookie session on the socket.io server?
Thanks
William
I think cookie-based sessions are your best option. Look into the session.socket.io module; it looks like it was built specifically for this.
var SessionSockets = require('session.socket.io');
var sessionSockets = new SessionSockets(io, sessionStore, cookieParser);
sessionSockets.on('connection', function (err, socket, session) {
//your regular socket.io code goes here
//and you can still use your io object
session.foo = 'bar';
//at this point the value is not yet saved into the session
session.save();
//now you can read session.foo from your express routes or connect middlewares
});
Alternatively you could implement sessions yourself using express as you mentioned. I don't know of any easy way to integrate with HAProxy.
I am developing a simple application, with a Node.js server and an HTML5 client in the browser. At the moment, I am using socket.io for the communication, because it seems to me that it should work in most cases: proxies, firewalls, etc. On the other hand, I find it hard to know precisely what is going on, as a lot of things are automated, and I did not find comprehensive documentation. One other important point is that I am new to the Javascript/Node.js world.
In this particular question, I am trying to achieve a tight synchronisation between clients and a server, following an SNTP-like scheme. Therefore, I would like to drop any delayed packet. The volatile flag should allow me to do this, and I use it to emit messages from the server, but it does not seem valid from the client side. Is it by design? Is it because I am using the stand-alone version on the client (no require or browserify here)?
index.html
<html>
<body onload="init()">
<script src="/socket.io/socket.io.js"></script>
<script src="calibration.js"></script>
</body>
</html>
calibration.js
var socket = io.connect();
function init() {
socket.emit('test', 'ok');
socket.volatile.emit('test-volatile', 'bad');
}
console log on page load
socket.volatile is undefined
Is volatile pointless from the client side anyway? If not, is there a way to use it? Any pointer to documentation would be appreciated. At the moment, I am considering the engine.io or ws node.js packages...
I think sending volatile messages from the client to the server is not supported yet.
https://github.com/socketio/socket.io-client/issues/283
https://github.com/socketio/socket.io-client/pull/821
I am using socket.io-client to send a volatile emit from client to server. But, as I am sending the message right after the connection, I had to use a setTimeout():
const socket = io( { autoConnect: false });
function init(){
socket.open();
setTimeout(() => {
socket.volatile.emit('hello', 'world');
}, 1000);
}
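A small variation on the above (my own suggestion, not part of the original snippet): instead of guessing a delay, you can wait for the connect event so the volatile emit happens once the connection is actually established.
// Variation (assumption): wait for 'connect' instead of a fixed setTimeout
const socket = io({ autoConnect: false });

function init() {
  socket.open();
  socket.once('connect', () => {
    socket.volatile.emit('hello', 'world');
  });
}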
I am trying to add some custom information to my socket object on connect, so that when I disconnect the socket, I can read that custom information.
I.e.:
// (Client)
socket.on('connect', function(data){
socket.customInfo = 'customdata';
});
// (server)
socket.on('disconnect', function () {
console.log(socket.customInfo);
});
Since it is JavaScript, you can freely add attributes to any object (just as you did). However, socket.io does give you a built-in way to do that (so you won't have to worry about naming conflicts):
socket.set('nickname', name, function () {
socket.emit('ready');
});
socket.get('nickname', function (err, name) {
console.log('Chat message by ', name);
});
Note that this is only on one side (either client or server). Obviously you can't share data between client and server without communication (that's what your example suggests).
The socket in your browser and the socket in the server won't share the same properties if you set them.
Basically you have set the data only at the client side (which is in your browsers memory NOT on the server).
For anyone still looking for an answer, there are a couple of things you can do to share data from the client to the server without actually sending a message.
Add a custom property to the auth property in the client Socket.IO options. This will be available to the server event handlers in socket.handshake.auth.xxxx (see the sketch after this list).
Add a custom header to the transportOptions.polling.extraHeaders property in the client socketIo options. This will ONLY be presented when the socket.io client is connected via polling and not when "upgraded" to websockets (as you can't have custom headers then).
Add a custom query property to the client socketIo options. I don't recommend this since it potentially exposes the data to intermediate proxies.
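A hypothetical sketch of the first option (the userId field is made up for illustration):
// Sketch of option 1 -- the userId field is made up for illustration

// client
const socket = io({
  auth: { userId: 'abc123' }
});

// server
io.on('connection', (socket) => {
  console.log('connected:', socket.handshake.auth.userId);

  socket.on('disconnect', () => {
    // the custom info is still readable when the socket disconnects
    console.log('disconnected:', socket.handshake.auth.userId);
  });
});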