In the code below, I'm assuming there is a chance that my route handler will fire and try to emit to the socket before the io connection has been established:
server.js:
import { Server } from 'socket.io'
....
....
const app = express()
const io = new Server(....)
app.io = io
app.post('/something', (req, res) => {
  req.app.io.emit('something', doSomethingWith(req.body))
  res.sendStatus(200) // res.status(200) alone only sets the status and never sends the response
})
io.on('connection', function (socket) {
  console.log('socket connected')
  socket.on('disconnect', (reason) => {
    console.log('disconnected due to = ', reason)
  })
})
client.js:
socket = io(`http://localhost:${port}`, { transports: ['websocket'] })
socket.on('something', (data) => {
  doSomethingMoreWith(data)
})
fetch('/something', ....)
In that case, is it safer to instead do:
io.on('connection', function (socket) {
  app.post('/something', ....)
  app.get('/something', ....)
  .....
  ...
  socket.on('disconnect', (reason) => {
    console.log('disconnected due to = ', reason)
  })
})
Or is this not recommended, and is there a better option?
Putting app.post() and app.get() inside of io.on('connection', ...) is never the proper design or implementation. This is because io.on('connection', ...) is triggered multiple times and there's no point in adding the same express route handler over and over again as that will just waste memory and do nothing useful. The very first client to connect on socket.io would cause the routes to be registered and they'd be there from then on for all other clients (whether they connected via socket.io or not).
It is unclear why you are trying to do this. You don't install routes for one particular circumstance. Routes are installed once for all clients in all states. So, if you're trying to conditionally install routes, that type of design does not work. If you further explain what you're trying to accomplish, then perhaps we could make some different suggestions for a design.
In the code below, I'm assuming there is a chance that my route handler will fire and try to emit to the socket before the io connection has been established:
app.post('/something', (req, res) => {
  req.app.io.emit('something', doSomethingWith(req.body))
  res.sendStatus(200) // res.status(200) alone never ends the response
});
How exactly this code works depends upon what is doing the POST. If it's Javascript in an existing page, then that page will already be up and initialized, and you control (with your Javascript client code in the page) whether you wait to issue the POST to /something until after the socket.io connection is established.
If this POST is a regular browser-based form submission (no Javascript involved), then you have other problems, because a form submission reloads the current browser page with the response from the POST and, in doing so, kills any existing socket.io connection that page had. Since you're not sending any content back from the POST, this would result in an empty page being displayed in the browser and no socket.io connection.
In looking at your client code, it appears that perhaps the POST is coming from a fetch() in the client code (and thus entirely Javascript-based). If that's the case, I would suggest restructuring your client code so that it waits until the socket.io connection has finished connecting before doing the fetch(). That way, you know you will be able to receive the io.emit() that the server does.
socket = io(`http://localhost:${port}`, { transports: ['websocket'] })
socket.on('something', (data) => {
  doSomethingMoreWith(data)
});
socket.on('connect', () => {
  // only issue fetch after socket.io connection is operational
  fetch('/something', ....)
});
Related
Okay, so I have no prior knowledge of Socket.IO, so please bear with me if my question is really stupid ;-; Any help at all is appreciated.
Basically, what I'm trying to do is have my React front end get a 'signal' from the server. So I have a socket.emit('task-data-changed') server side and a Socket.on('task-data-changed') on the front end, but this does not seem to do anything and I have no idea why.
Here's my server side code:
const app = express();
const http = require('http').createServer(app);
const io = require('socket.io')(http);
io.origins('*:*');
io.on('connection', socket => {
  console.log('a user connected');
  socket.on("task-data", () => {
    console.log('got task-data signal on the backend');
    socket.emit("task-data-changed", 'everyone');
  });
  socket.on('disconnect', () => {
    console.log('user disconnected');
  });
});
Here's my React code:
useEffect(() => {
  Socket.on("task-data-changed", () => {
    console.log('task data was changed on the backend');
  });
  return () => Socket.off("task-data-changed");
}, []);
const submitNewTaskHandler = () => {
  console.log('trying to emit task-data');
  Socket.emit('task-data');
}
Things I know are working, I guess:
The server logs 'a user connected' and 'user disconnected' appropriately when we load localhost and when the page closes.
Inside submitNewTaskHandler, the console.log() runs properly, as does the Socket.emit('task-data').
I know this because the server receives the 'task-data' signal and properly logs 'got task-data signal on the backend'.
The problem is that, after that, the socket.emit('task-data-changed') on the server side and the Socket.on('task-data-changed') on the client don't seem to be working.
Thank you for reading so far, apologies if I've used the wrong terminology.
Your code to set the event listener on your front-end socket seems fine:
useEffect(() => {
  Socket.on("task-data-changed", () => {
    console.log('task data was changed on the backend');
  });
  return () => Socket.off("task-data-changed");
}, []);
But, my question to you is, why is that code inside of useEffect()? You should put it wherever your Socket is originally made. Or, you could put it on the line just before Socket.emit('task-data');
To give some explanation, Socket.on() is a listener. In your case, it listens for 'task-data-changed'. Once you tell it to start listening, it will keep listening forever. So, you should just set the listener up once by using Socket.on(), and you usually do this right when you initialize the Socket. Just do it once and it will stay listening and running your callback function.
I am trying to make a game server with node.js and socket.io.
The basic idea is as follows.
Initialize socket.io instance when the server starts
Store instance in global scope, so controllers can access it
When an API is called, we trigger some socket.io event in the controller or at some other point
Here is the implementation I made ...
First, in server.js - entry point
let GlobalVars = require('./state/GlobalVars');
const apiRouters = require('./router');
...
app.use('/api', apiRouters);
app.get('/', (req, res) => {
  res.sendFile(`${__dirname}/test/simpleClient.html`)
});
const httpServer = http.createServer(app);
let socketIOInstance = socketIO(httpServer);
socketIOInstance.on('connection', (socket) => {
  console.log('SOCKET.IO A USER CONNECTED');
  socket.on('create', (data) => {
    console.log('SOCKET.IO create called', socket);
    socket.join(data.room);
    socketIOInstance.emit('message', 'New people joined');
  });
  socket.on('join', (data) => {
    console.log('SOCKET.IO join called', data);
  })
  socket.emit('message', 'Hi');
});
GlobalVars.socketIO = socketIOInstance;
// Add to global, so the controllers can manage own actions like create, join ...
httpServer.listen(port, () => {
  console.log(`Server Listening on the port ${port}`);
})
...
When I access from a client, I am able to see SOCKET.IO A USER CONNECTED and Hi in the browser console.
Second, in the API controller.
let GlobalVars = require('../state/GlobalVars');
...
router.post('/create', (req, res) => {
  console.log('GenerateGameSokect');
  let game = new Game();
  let gameId = game.gameId;
  // console.log('Global vars ', GlobalVars.socketIO);
  GlobalVars.socketIO.emit('create', {
    room: gameId
  });
  res.json({
    result: 'SUCCESS',
    game: game
  })
});
I imported GlobalVars, which contains the socketIO instance. So what I expected was that the socket 'create' event would be triggered by the statement GlobalVars.socketIO.emit('create', Object), but I could not find the message in the server logs.
I've got no clue what I'm missing.
The final form I pursue is something like...
When the user calls the create API, it creates the socket connection and room
The API is called over HTTP, but inside the API the server publishes some events - pubsub like.
Thanks for reading my question. Here is the full source code so far (bitbucket public)
================== EDIT ====================
I think I understand now (maybe...)
The user-flow I wanted was ...
The client calls the API
(On the server) the API checks validity and, if valid, emits to socket.io
If the event is accepted, send the new status to all clients
However, creating the socket.io connection in the server looks strange to me; the solution is up to the client.
The new user-flow I will change to:
The client calls a validation API
If the result is valid, the client emits the socket.io event. This time the server only does validation and does not emit to socket.io
In the socket event handler, send the new status to all other users
================== EDIT #2 ====================
This is a kind of conclusion. It looks like I was just misunderstanding the concept of socket communication. As the answer and replies say, sockets and HTTP are totally different channels; there is no way to connect the two (at least, not without opening a new connection from the http server to the socket).
If this is wrong, feel free to add a reply. Thanks.
Now I understand you. Or at least I think!
Let's put it this way: there are two (asymmetric) sides to a socket, server and client. What I called, respectively, the "global manager" and the "socket" in my comment on your post.
const server = require('socket.io')(yourHttpServer);
// client is installed as well when `npm i socket.io`
const client = require('socket.io-client')('http://localhost:' + yourServerPort);
server.on('connection', (socket) => {
  // `socket` is the server side of the socket;
  // this will be triggered by client sides emitting 'create'
  socket.on('create', (data) => {
    console.log('a client socket just fired a "create" event!');
  });
});

// this will be triggered by the server side emitting 'create'
client.on('create', (data) => {
  server.emit('create', { content: 'this will result in an infinite loop of "create" events!' });
});
In your /create route, when you GlobalVars.socketIO.emit('create', ...), the server-side socket handler isn't triggered, however if you have clients connected through a browser (or, like I showed above, if you connect a client socket directly from the server) then these will trigger their 'create' listener, if any.
Hope this helps you get on the right tracks!
I'm making a REST API that works with routes and actions like /api/route/action. But I want to add WebSocket functionality, so I want WebSockets to also be addressable by URL.
I have this code:
const socketio = require('socket.io');
//server is a http.createServer()
module.exports = server => {
  const io = socketio(server, { route: '/socketapi/test' });
  io.on('connection', s => {
    s.on('a', () => s.emit('b'));
    s.emit('message', 'You connected to /test.');
  });

  const io2 = socketio(server, { route: '/socketapi/something_else' });
  io2.on('connection', s => {
    s.on('z', () => s.emit('y'));
    s.emit('message', 'Hi');
  });
};
The reason why I want to split them is so I don't have to keep track of event names I've already used, and so I can separate the logic in the connection event.
But it seems this is not possible. If I have two socket.io instances running I can't connect to either.
Is this possible or will I have to use some tricks and perhaps an event that the client can send to let me know what it wants to subscribe to?
You can use a built in feature of socket.io called namespaces to achieve this behaviour.
Here is a basic example:
Server side:
const nsp = io.of('/my-namespace');
nsp.on('connection', function (socket) {
  console.log('someone connected');
});
nsp.emit('hi', 'everyone!');
Client side:
const socket = io('/my-namespace');
Now the client can emit and receive messages which are specific to a namespace. With namespaces, your problem of event-name conflicts is solved.
I have made a React application which relies fully on WebSockets after the initial HTTP upgrade. For security reasons I use a cookie AND a JWT token in my WebSockets connection.
It all works fine, but when opening a new tab, socket.io cookies get reissued, and I want users to stay logged in over multiple tabs. So I want to set a cookie if the client doesn't already have one; if it already has one, then use that cookie.
So I want to handle the first HTTP polling requests and created middleware for that in Node's http server:
// HTTP SERVER
const server = require('http').createServer(function (request, response) {
  console.log('test');
  console.log(request);
  if (!request.headers.cookie) { // cookie pseudo-logic
    response.writeHead(200, {
      'Set-Cookie': 'mycookie=test',
      'Content-Type': 'text/plain'
    });
  }
});
// Socket.IO server instance
const io = require('socket.io')(server, {
  origins: config.allowedOrigins,
  cookie: false, // disable default io cookie
});
server.listen(port, () => console.log(`Listening on port ${port}`));
I use Socket.io as the WebSockets framework. The problem, however, is that this middleware gets ignored when the Socket.io server is registered. When I comment out the Socket.io server, the middleware is active and the request gets logged.
It looks like Socket.io's server is overriding the handler for node http's server. In the Socket.io docs however they provide this example:
var app = require('http').createServer(handler)
var io = require('socket.io')(app);
var fs = require('fs');
app.listen(80);
function handler (req, res) {
  fs.readFile(__dirname + '/index.html',
    function (err, data) {
      if (err) {
        res.writeHead(500);
        return res.end('Error loading index.html');
      }
      res.writeHead(200);
      res.end(data);
    });
}
io.on('connection', function (socket) {
  socket.emit('news', { hello: 'world' });
  socket.on('my other event', function (data) {
    console.log(data);
  });
});
Thus indicating that it should be possible to handle the first http polling requests and also the socket requests. I managed to get it to work with Express, but I don't understand why node's http server can't.
Anybody who knows what's happening?
Thanks in advance,
Mike
Because normal usage of socket.io does not want regular http middleware to see socket.io connection requests (they would normally trigger 404 responses), socket.io places its own request handler first in line before any others, even ones that existed before it was installed.
You can see how it does that here: https://github.com/socketio/engine.io/blob/master/lib/server.js#L437 in the engine.io source.
I can think of the following ways for you to pre-process a request before socket.io sees it:
Use a proxy and do your cookie stuff in a proxy before socket.io even sees the request.
Patch socket.io/engine.io code to add a callback hook for what you want to do.
Copy the technique used by socket.io/engine.io to put your own request handler first in line after socket.io is configured.
Find a way to override the socket.io server object's handleRequest() method which is what gets called when there's an incoming connection request. You can see its code here.
I'm trying to build an application that has two components. There's a public-facing component and an administrative component. Each component will be hosted on a different server, but the two will access the same database. I need to set up the administrative component to be able to send a message to the public-facing component to query the database and send the information to all the public clients.
What I can't figure out is how to set up a connection between the two components. I'm using the standard HTTP server setup provided by Socket.io.
In each server:
var app = require('http').createServer(handler)
, io = require('socket.io').listen(app)
, fs = require('fs')
app.listen(80);
function handler (req, res) {
  fs.readFile(__dirname + '/index.html',
    function (err, data) {
      if (err) {
        res.writeHead(500);
        return res.end('Error loading index.html');
      }
      res.writeHead(200);
      res.end(data);
    });
}
io.sockets.on('connection', function (socket) {
  socket.emit('news', { hello: 'world' });
  socket.on('my other event', function (data) {
    console.log(data);
  });
});
And on each client:
<script src="/socket.io/socket.io.js"></script>
<script>
  var socket = io.connect('http://localhost');
  socket.on('news', function (data) {
    console.log(data);
    socket.emit('my other event', { my: 'data' });
  });
</script>
I've looked at this question but couldn't really follow the answers provided, and I think the situation is somewhat different. I just need one of the servers to be able to send a message to the other server, and still send/receive messages to/from its own set of clients.
I'm brand new to Node (and thus, Socket), so some explanation would be incredibly helpful.
The easiest thing I could find to do is simply create a client connection between the servers using socket.io-client. In my situation, the admin server connects to the client server:
var client = require("socket.io-client");
var socket = client.connect("other_server_hostname");
Actions on the admin side can then send messages to the admin server, and the admin server can use this client connection to forward information to the client server.
On the client server, I created an 'adminMessage' listener and check some other information to verify where the message came from, like so:
io.sockets.on('connection', function (socket) {
  socket.on('adminMessage', function (data) {
    if (data.someIdentifyingData == "data") {
      // DO STUFF
    }
  });
});
I had the same problem, but instead of using socket.io-client I decided on a simpler approach (at least for me) using redis pub/sub; the result is pretty simple. My main problem with socket.io-client is that you need to know the server hosts around you and connect to each one to send messages.
You can take a look at my solution here: https://github.com/alissonperez/scalable-socket-io-server
With this solution you can have as many processes/servers as you want (using an auto-scaling solution); you just use redis as a way to forward your messages between your servers.