I have built a React application that relies fully on WebSockets after the initial HTTP upgrade. For security reasons I use a cookie AND a JWT token in my WebSocket connection.
It all works fine, but when opening a new tab, socket.io cookies get reissued, and I want users to stay logged in across multiple tabs. So I want to set a cookie if the client doesn't already have one; if it already has one, use that cookie.
So I want to handle the first HTTP polling requests, and I created a handler for that in Node's http server:
// HTTP SERVER
const server = require('http').createServer(function (request, response) {
  console.log('test');
  console.log(request);
  if (!request.headers.cookie) { // cookie pseudo-logic
    response.writeHead(200, {
      'Set-Cookie': 'mycookie=test',
      'Content-Type': 'text/plain'
    });
  }
});

// Socket.IO server instance
const io = require('socket.io')(server, {
  origins: config.allowedOrigins,
  cookie: false // disable default io cookie
});

server.listen(port, () => console.log(`Listening on port ${port}`));
I use Socket.io as my WebSockets framework. The problem, however, is that this handler gets ignored once the Socket.io server is registered. When I comment out the Socket.io server, the handler is active and the request gets logged.
It looks like Socket.io's server is overriding the request handler of Node's http server. In the Socket.io docs, however, they provide this example:
var app = require('http').createServer(handler)
var io = require('socket.io')(app);
var fs = require('fs');

app.listen(80);

function handler (req, res) {
  fs.readFile(__dirname + '/index.html',
    function (err, data) {
      if (err) {
        res.writeHead(500);
        return res.end('Error loading index.html');
      }
      res.writeHead(200);
      res.end(data);
    });
}

io.on('connection', function (socket) {
  socket.emit('news', { hello: 'world' });
  socket.on('my other event', function (data) {
    console.log(data);
  });
});
This indicates that it should be possible to handle the first HTTP polling requests as well as the socket requests. I managed to get it to work with Express, but I don't understand why Node's http server can't.
Does anybody know what's happening?
Thanks in advance,
Mike
Because normal usage of socket.io does not want regular http middleware to see socket.io connection requests (which would otherwise trigger 404 responses), socket.io places its own request handler first in line, ahead of any others, even ones that existed before it was installed.
You can see how it does that here: https://github.com/socketio/engine.io/blob/master/lib/server.js#L437 in the engine.io source.
I can think of the following ways for you to pre-process a request before socket.io sees it:
1. Use a proxy and do your cookie logic in the proxy before socket.io even sees the request.
2. Patch the socket.io/engine.io code to add a callback hook for what you want to do.
3. Copy the technique used by socket.io/engine.io to put your own request handler first in line after socket.io is configured.
4. Find a way to override the socket.io server object's handleRequest() method, which is what gets called when there's an incoming connection request. You can see its code here.
Related
In the code below, I'm assuming there is a chance that my route handler will fire and try to emit to the socket before the io connection has been established:
server.js:
import { Server } from 'socket.io'
....
....
const app = express()
const io = new Server(....)
app.io = io

app.post('/something', (req, res) => {
  req.app.io.emit('something', doSomethingWith(req.body))
  res.status(200)
})

io.on('connection', function(socket) {
  console.log('socket connected')
  socket.on('disconnect', (reason) => {
    console.log('disconnected due to = ', reason)
  })
})
client.js:
socket = io(`http://localhost:${port}`, { transports: ['websocket'] })

socket.on('something', (data) => {
  doSomethingMoreWith(data)
})

fetch('/something', ....)
In that case, is it safer to instead do:
io.on('connection', function(socket) {
  app.post('/something', ....)
  app.get('/something', ....)
  .....
  ...
  socket.on('disconnect', (reason) => {
    console.log('disconnected due to = ', reason)
  })
})
Or is this not recommended, and is there a better option?
Putting app.post() and app.get() inside io.on('connection', ...) is never the proper design or implementation. io.on('connection', ...) fires for every connecting client, so the same express route handlers would be registered over and over again, which just wastes memory and does nothing useful. The very first client to connect via socket.io would cause the routes to be registered, and they'd be there from then on for all other clients (whether they connected via socket.io or not).
It's unclear why you are trying to do this. You don't install routes for one particular circumstance; routes are installed once, for all clients in all states. So if you're trying to conditionally install routes, that type of design does not work. If you explain further what you're trying to accomplish, then perhaps we could make some different suggestions for a design.
In the code below, I'm assuming there is a chance that my route handler will fire and try to emit to the socket before the io connection has been established:
app.post('/something', (req, res) => {
  req.app.io.emit('something', doSomethingWith(req.body))
  res.status(200)
});
How exactly this code works depends upon what is doing the POST. If it's Javascript in an existing page, then that page will already be up and initialized, and you control (with your client-side Javascript in the page) whether you wait to issue the POST to /something until after the socket.io connection is established.
If this POST is a regular browser-based form submission (no Javascript involved), then you have other problems because a form submission from a browser reloads the current browser page with the response from the POST and, in the process of reloading the page, kills any existing socket.io connection that page had (since it loads a new page). Since you're not sending any content back from the POST, this would result in an empty page being displayed in the browser and no socket.io connection.
In looking at your client code, it appears that perhaps the POST is coming from a fetch() in the client code (and thus entirely Javascript-based). If that's the case, I would suggest restructuring your client code so that it waits until the socket.io connection has finished connecting before doing the fetch(). That way, you know you will be able to receive the io.emit() that the server does.
socket = io(`http://localhost:${port}`, { transports: ['websocket'] })

socket.on('something', (data) => {
  doSomethingMoreWith(data)
});

socket.on('connect', () => {
  // only issue fetch after socket.io connection is operational
  fetch('/something', ....)
});
I have setup a Primus websocket service as below.
http = require('http');
server = http.createServer();
Primus = require('primus');
primus = new Primus(server, {
  transformer: 'websockets',
  pathname: 'ws'
});

primus.on('connection', function connection(spark) {
  console.log("client has connected");
  spark.write("Herro Client, I am Server");
  spark.on('data', function(data) {
    console.log('PRINTED FROM SERVER:', data);
    spark.write('receive ' + data);
  });
  spark.on('error', function(data) {
    console.log('PRINTED FROM SERVER:', data);
    spark.write('receive ' + data);
  });
});

server.listen(5431);
console.log("Server has started listening");
It works fine. In the above code, I use spark.write to send a response message to users. Now I want to convert it to be used in middleware.
The code becomes:
primus.use('name', function (req, res, next) {
  doStuff();
});
In the doStuff() method, how can I get the spark instance to send a message back to clients?
The readme is slightly vague about this, but middleware only deals with the HTTP request.
Primus has two ways of extending the functionality. We have plugins but also support middleware. And there is an important difference between these. The middleware layers allows you to modify the incoming requests before they are passed in to the transformers. Plugins allow you to modify and interact with the sparks. The middleware layer is only run for the requests that are handled by Primus.
To achieve what you want, you'll have to create a plugin. It's not much more complicated than middleware.
primus.plugin('herro', {
  server: function(primus, options) {
    primus.on('connection', function(spark) {
      spark.write('Herro Client, I am Server')
    })
  },
  client: function(primus, options) {}
})
For more info, see the Plugins section of the readme.
Overview of app
I have a node.js server application implemented with the express.js 4 module and the node.js core http module. At a high level, the app takes incoming client http messages, makes various http calls (using http module) to other external APIs, and lastly sends back a response to the client based on the responses from the aforementioned various external http API responses.
The Issue
My issue is that when the incoming client http request is terminated by the client (e.g. when the client wants to cancel their request), my node.js app continues to proceed in making the aforementioned various external http API calls. I cannot seem to find a way to signal to the rest of my node.js app to terminate its various outgoing http requests to external APIs in such cases.
When the client terminates their request, the express app (i.e. the express http server) receives a "close" event, which I'm listening for. The "close" event listener in my code catches this event; however, I cannot seem to figure out how to then signal to the "downstream" or "subsequent" http requests made by my code to terminate.
My Goal
How can I signal to all the outgoing http requests to external APIs which are associated with a single client incoming request to terminate when the client terminates their incoming request to my service?
I've provided a simplified version of my node.js app below with some inline code comments to help illustrate my issue more clearly. Any help or insight would be very much appreciated. Thanks!
Additional Info
I'm using the Apigee swagger-tools middleware to do my api routing.
I've found a few answered questions out there which are similar but not quite directly applicable to my question:
Handling cancelled request with Express/Node.js and Angular
How to detect user cancels request
Best,
Chris
test-app.js
// test-app.js
"use strict";

var swaggerTools = require("swagger-tools");
var app = require("express")();

// swaggerRouter configuration
// sends incoming http messages to test-controller.js
var options = {
  controllers: './controllers'
};

// The Swagger document (require it, build it programmatically, fetch it from a URL, ...)
// describes the API specification
var apiSpec = require('./test-swagger.json');

// Initialize the Swagger middleware
swaggerTools.initializeMiddleware(apiSpec, function (middleware) {
  "use strict";
  // Interpret Swagger resources and attach metadata to request - must be first in swagger-tools middleware chain
  app.use(middleware.swaggerMetadata());
  // Validate Swagger requests/responses based on test-swagger.json API specification
  app.use(middleware.swaggerValidator());
  // Route validated requests to appropriate controller, test-controller.js
  app.use(middleware.swaggerRouter(options));
});

// Run http server on port 8080
app.listen(8080, function () {
  "use strict";
  console.log("Server running on port %d", this.address().port);
})
.on("connection", function (socket) {
  console.log("a new connection was made by an incoming client request.");
  socket.on("close", function () {
    console.log("socket connection was closed by client");
    // somehow signal to the rest of my node.js app to terminate any
    // http requests being made to external APIs, e.g. twitter api
    socket.destroy();
  });
});
test-controller.js
//test-controller.js
"use strict";

var http = require("https");

// only one function currently; consequently, all incoming http requests are
// routed to this function, i.e. "compile"
module.exports = {
  compile: compile
};

function compile(req, res, next) {
  var options = {
    "method": "GET",
    "hostname": "api.twitter.com",
    "path": "/1.1/statuses/mentions_timeline.json?count=2&since_id=14927799",
    "headers": {"accept": "application/json"}
  };
  // how can I terminate this request when the http.server in test-app.js receives the "close" event?
  http.request(options)
    .on("response", function (response) {
      var apiResponse = [];
      response.on("data", function (chunk) {
        apiResponse.push(chunk);
      });
      response.on("end", function () {
        apiResponse = Buffer.concat(apiResponse).toString();
        res.status(response.statusCode).set(response.headers).send(apiResponse);
      });
    })
    .end(); // without end() the outgoing request is never actually sent
}
In your test controller's compile method you should just be able to do something like this:
var request = http.request(options, function (response) {
  res.writeHead(response.statusCode, response.headers);
  response.pipe(res, { end: true });
});

req.on('close', function () {
  request.abort();
});
I've got a node.js + express + socket.io app.
I want to save the request headers in the socket, to use later.
io.sockets.on('connection', function (socket) {
  socket.headers = {};
  app.get('*', function (req, res) {
    // fetch headers
    socket.headers.ua = req.headers['user-agent'];
    res.sendfile(__dirname + '/index.html');
  });
....etc
but because I am in the app scope, socket isn't defined. I always get confused with what's in and out of scope.
I cannot app.get() it, because if another browser connects, the app will be changed, right?
You're doing it wrong. Every socket has a handshake object with it which contains request headers, domain, host, date etc. If you still want to fetch headers information then do this:
io.sockets.on('connection', function(socket) {
  console.log(socket.handshake); // This would print the handshake object in JSON format
  // to get a request header, do this
  socket.head = socket.handshake.headers['user-agent'];
});
And you can use this property later in some event like:
socket.on('EventName', function(data) {
  console.log(socket.head);
});
I'm trying to build an application that has two components. There's a public-facing component and an administrative component. Each component will be hosted on a different server, but the two will access the same database. I need to set up the administrative component to be able to send a message to the public-facing component to query the database and send the information to all the public clients.
What I can't figure out is how to set up a connection between the two components. I'm using the standard HTTP server setup provided by Socket.io.
In each server:
var app = require('http').createServer(handler)
  , io = require('socket.io').listen(app)
  , fs = require('fs')

app.listen(80);

function handler (req, res) {
  fs.readFile(__dirname + '/index.html',
    function (err, data) {
      if (err) {
        res.writeHead(500);
        return res.end('Error loading index.html');
      }
      res.writeHead(200);
      res.end(data);
    });
}

io.sockets.on('connection', function (socket) {
  socket.emit('news', { hello: 'world' });
  socket.on('my other event', function (data) {
    console.log(data);
  });
});
And on each client:
<script src="/socket.io/socket.io.js"></script>
<script>
  var socket = io.connect('http://localhost');
  socket.on('news', function (data) {
    console.log(data);
    socket.emit('my other event', { my: 'data' });
  });
</script>
I've looked at this question but couldn't really follow the answers provided, and I think the situation is somewhat different. I just need one of the servers to be able to send a message to the other server, and still send/receive messages to/from its own set of clients.
I'm brand new to Node (and thus, Socket), so some explanation would be incredibly helpful.
The easiest thing I could find to do is simply create a client connection between the servers using socket.io-client. In my situation, the admin server connects to the client server:
var client = require("socket.io-client");
var socket = client.connect("other_server_hostname");
Actions on the admin side can then send messages to the admin server, and the admin server can use this client connection to forward information to the client server.
On the client server, I created an on 'adminMessage' function and check for some other information to verify where the message came from like so:
io.sockets.on('connection', function (socket) {
  socket.on('adminMessage', function (data) {
    if (data.someIdentifyingData == "data") {
      // DO STUFF
    }
  });
});
I had the same problem, but instead of using socket.io-client I decided to use a simpler approach (at least for me) based on Redis pub/sub, and the result is pretty simple. My main problem with socket.io-client is that you need to know the hostnames of the servers around you and connect to each one to send messages.
You can take a look at my solution here: https://github.com/alissonperez/scalable-socket-io-server
With this solution you can have as many processes/servers as you want (e.g. behind an auto-scaling setup); you just use Redis to forward your messages between your servers.