I have set up a Primus WebSocket service as below.
var http = require('http');
var server = http.createServer();
var Primus = require('primus');

var primus = new Primus(server, {
  transformer: 'websockets',
  pathname: 'ws'
});

primus.on('connection', function connection(spark) {
  console.log("client has connected");
  spark.write("Herro Client, I am Server");
  spark.on('data', function(data) {
    console.log('PRINTED FROM SERVER:', data);
    spark.write('receive ' + data);
  });
  spark.on('error', function(data) {
    console.log('PRINTED FROM SERVER:', data);
    spark.write('receive ' + data);
  });
});

server.listen(5431);
console.log("Server has started listening");
It works fine. In the above code, I use spark.write to send response messages to users. Now I want to convert it for use in middleware.
The code would become something like this:
primus.use('name', function (req, res, next) {
  doStuff();
});
In the doStuff() method, how can I get the spark instance so that I can send messages back to clients?
The readme is slightly vague about this, but middleware only deals with the HTTP request.
Primus has two ways of extending its functionality: plugins and middleware, and there is an important difference between the two. The middleware layer lets you modify incoming requests before they are passed to the transformers, and it is only run for requests that are handled by Primus. Plugins let you modify and interact with the sparks.
To achieve what you want, you'll have to create a plugin. It's not much more complicated than middleware.
primus.plugin('herro', {
  server: function (primus, options) {
    primus.on('connection', function (spark) {
      spark.write('Herro Client, I am Server');
    });
  },
  client: function (primus, options) {}
});
For more info, see the Plugins section of the readme.
In the code below, I'm assuming there is a chance that my route handler will fire and try to emit to the socket before the io connection has been established:
server.js:
import { Server } from 'socket.io'
....
....
const app = express()
const io = new Server(....)
app.io = io

app.post('/something', (req, res) => {
  req.app.io.emit('something', doSomethingWith(req.body))
  res.status(200)
})

io.on('connection', function (socket) {
  console.log('socket connected')
  socket.on('disconnect', (reason) => {
    console.log('disconnected due to = ', reason)
  })
})
client.js:
socket = io(`http://localhost:${port}`, { transports: ['websocket'] })

socket.on('something', (data) => {
  doSomethingMoreWith(data)
})

fetch('/something', ....)
In that case, is it safer to instead do:
io.on('connection', function (socket) {
  app.post('/something', ....)
  app.get('/something', ....)
  .....
  ...
  socket.on('disconnect', (reason) => {
    console.log('disconnected due to = ', reason)
  })
})
Or is this not recommended, and is there a better option?
Putting app.post() and app.get() inside of io.on('connection', ...) is never the proper design or implementation. This is because io.on('connection', ...) is triggered multiple times and there's no point in adding the same express route handler over and over again as that will just waste memory and do nothing useful. The very first client to connect on socket.io would cause the routes to be registered and they'd be there from then on for all other clients (whether they connected via socket.io or not).
It is unclear why you are trying to do this. You don't install routes for one particular circumstance. Routes are installed once for all clients in all states. So, if you're trying to conditionally install routes, that type of design does not work. If you further explain what you're trying to accomplish, then perhaps we could make some different suggestions for a design.
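If the worry is simply that the route handler might run before everything is wired up, the usual pattern is to register the route once and branch inside the handler. A rough, hypothetical sketch (the readiness check on req.app.io is purely illustrative):
app.post('/something', (req, res) => {
  // hypothetical guard: bail out if io hasn't been attached to the app yet
  if (!req.app.io) {
    return res.status(503).send('socket.io not ready');
  }
  req.app.io.emit('something', doSomethingWith(req.body));
  res.sendStatus(200);
});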
In the code below, I'm assuming there is a chance that my route handler will fire and try to emit to the socket before the io connection has been established:
app.post('/something', (req, res) => {
  req.app.io.emit('something', doSomethingWith(req.body))
  res.status(200)
});
How exactly this code works depends upon what is doing the POST. If it's Javascript in an existing page, then that page will already be up and initialized, and you control (with your Javascript client code in the page) whether you wait to issue the POST to /something until after the socket.io connection is established.
If this POST is a regular browser-based form submission (no Javascript involved), then you have other problems because a form submission from a browser reloads the current browser page with the response from the POST and, in the process of reloading the page, kills any existing socket.io connection that page had (since it loads a new page). Since you're not sending any content back from the POST, this would result in an empty page being displayed in the browser and no socket.io connection.
In looking at your client code, it appears that perhaps the POST is coming from a fetch() in the client code (and thus entirely Javascript-based). If that's the case, I would suggest restructuring your client code so that it waits until the socket.io connection has finished connecting before doing the fetch(). That way, you know you will be able to receive the io.emit() that the server does.
socket = io(`http://localhost:${port}`, { transports: ['websocket'] })

socket.on('something', (data) => {
  doSomethingMoreWith(data)
});

socket.on('connect', () => {
  // only issue fetch after socket.io connection is operational
  fetch('/something', ....)
});
I have a fairly simple express server that is designed to take external client data and publish it via mqtt to a gateway. It works perfectly with a hardcoded variable but I can't figure out how to extract the actual data from the POST request, which is as follows (it prints to the console just fine):
const postData = app.post('/send-data', function (req, res) {
  console.log('connected', req.body);
  res.status(200).json(req.body)
});
I need to get the req.body data out of that and into the following code that publishes the data to the topic:
client.on('connect', function () {
  console.log('connected!');
  client.publish('iot-2/type/wtlType/id/channel100/evt/event/fmt/json', publishData);
  client.end();
});
publishData will just be the stringified JSON response.
This is the createServer code, if that helps:
https.createServer(options, app).listen(30002, () => {
  console.log('Listening')
});
If I understand correctly, your question is about the logic of getting the req.body published by the client. If so, then something like this should work:
// track whether the MQTT client has connected yet
let connected = false;

client.on('connect', function () {
  console.log('connected!');
  connected = true;
});

const postData = app.post('/send-data', function (req, res) {
  console.log('connected', req.body);
  res.status(200).json(req.body);
  // only publish once the MQTT client has connected
  if (connected) {
    client.publish('iot-2/type/wtlType/id/channel100/evt/event/fmt/json', JSON.stringify(req.body));
    client.end(); // are you sure you want this? can there not be more messages to broadcast?
  }
});
I have made a React application which relies fully on WebSockets after the initial HTTP upgrade. For security reasons I use a cookie AND a JWT token in my WebSocket connection.
It all works fine, but when opening a new tab, socket.io cookies get reissued, and I want users to stay logged in over multiple tabs. So I want to set a cookie if the client doesn't already have one; if it already has one, then use that cookie.
So I want to handle the first HTTP polling requests, and I created middleware for that in Node's http server:
// HTTP SERVER
const server = require('http').createServer(function (request, response) {
  console.log('test');
  console.log(request);
  if (!request.headers.cookie) { // cookie pseudo-logic
    response.writeHead(200, {
      'Set-Cookie': 'mycookie=test',
      'Content-Type': 'text/plain'
    });
  }
});

// Socket.IO server instance
const io = require('socket.io')(server, {
  origins: config.allowedOrigins,
  cookie: false, // disable default io cookie
});

server.listen(port, () => console.log(`Listening on port ${port}`));
I use Socket.io as the WebSocket framework. The problem, however, is that this middleware gets ignored when the Socket.io server is registered. When I comment out the Socket.io server, the middleware is active and the request gets logged.
It looks like Socket.io's server is overriding the handler for Node's http server. In the Socket.io docs, however, they provide this example:
var app = require('http').createServer(handler)
var io = require('socket.io')(app);
var fs = require('fs');

app.listen(80);

function handler (req, res) {
  fs.readFile(__dirname + '/index.html',
    function (err, data) {
      if (err) {
        res.writeHead(500);
        return res.end('Error loading index.html');
      }
      res.writeHead(200);
      res.end(data);
    });
}

io.on('connection', function (socket) {
  socket.emit('news', { hello: 'world' });
  socket.on('my other event', function (data) {
    console.log(data);
  });
});
This indicates that it should be possible to handle the first HTTP polling requests as well as the socket requests. I managed to get it to work with Express, but I don't understand why Node's http server can't.
Does anybody know what's happening?
Thanks in advance,
Mike
Because normal usage of socket.io does not want regular http middleware to see socket.io connection requests (they would normally trigger 404 responses), socket.io places its own request handler first in line before any others, even ones that existed before it was installed.
You can see how it does that here: https://github.com/socketio/engine.io/blob/master/lib/server.js#L437 in the engine.io source.
I can think of the following ways for you to pre-process a request before socket.io sees it:
Use a proxy and do your cookie stuff in a proxy before socket.io even sees the request.
Patch socket.io/engine.io code to add a callback hook for what you want to do.
Copy the technique used by socket.io/engine.io to put your own request handler first in line after socket.io is configured (a sketch of this approach follows this list).
Find a way to override the socket.io server object's handleRequest() method which is what gets called when there's an incoming connection request. You can see its code here.
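As a rough, untested sketch of the third approach (assuming a plain Node http server with socket.io attached, and with the port variable and cookie value as placeholders): pull the existing 'request' listeners off the server, install your own handler first, and then delegate to the originals.
const server = require('http').createServer();
const io = require('socket.io')(server);

// snapshot whatever 'request' listeners are already installed (socket.io's included)
const originalListeners = server.listeners('request').slice();
server.removeAllListeners('request');

server.on('request', (request, response) => {
  // your cookie pre-processing goes here
  if (!request.headers.cookie) {
    response.setHeader('Set-Cookie', 'mycookie=test');
  }
  // then hand the request on to the original handlers
  for (const listener of originalListeners) {
    listener.call(server, request, response);
  }
});

server.listen(port, () => console.log(`Listening on port ${port}`));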
I'm trying to build an application that has two components. There's a public-facing component and an administrative component. Each component will be hosted on a different server, but the two will access the same database. I need to set up the administrative component to be able to send a message to the public-facing component to query the database and send the information to all the public clients.
What I can't figure out is how to set up a connection between the two components. I'm using the standard HTTP server setup provided by Socket.io.
In each server:
var app = require('http').createServer(handler)
  , io = require('socket.io').listen(app)
  , fs = require('fs')

app.listen(80);

function handler (req, res) {
  fs.readFile(__dirname + '/index.html',
    function (err, data) {
      if (err) {
        res.writeHead(500);
        return res.end('Error loading index.html');
      }
      res.writeHead(200);
      res.end(data);
    });
}

io.sockets.on('connection', function (socket) {
  socket.emit('news', { hello: 'world' });
  socket.on('my other event', function (data) {
    console.log(data);
  });
});
And on each client:
<script src="/socket.io/socket.io.js"></script>
<script>
  var socket = io.connect('http://localhost');
  socket.on('news', function (data) {
    console.log(data);
    socket.emit('my other event', { my: 'data' });
  });
</script>
I've looked at this question but couldn't really follow the answers provided, and I think the situation is somewhat different. I just need one of the servers to be able to send a message to the other server, and still send/receive messages to/from its own set of clients.
I'm brand new to Node (and thus, Socket), so some explanation would be incredibly helpful.
The easiest thing I could find to do is simply create a client connection between the servers using socket.io-client. In my situation, the admin server connects to the client server:
var client = require("socket.io-client");
var socket = client.connect("other_server_hostname");
Actions on the admin side can then send messages to the admin server, and the admin server can use this client connection to forward information to the client server.
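For example, on the admin server the forwarding could look like this (a hypothetical sketch; the event name and payload shape are just illustrations matching the handler below):
// admin server: forward an admin action to the public-facing server
socket.emit('adminMessage', {
  someIdentifyingData: 'data', // shared identifier checked on the other side
  payload: { refresh: true }
});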
On the client server, I created an 'adminMessage' handler and check some other identifying information to verify where the message came from, like so:
io.sockets.on('connection', function (socket) {
  socket.on('adminMessage', function (data) {
    if (data.someIdentifyingData == "data") {
      // DO STUFF
    }
  });
});
I had the same problem, but instead of using socket.io-client I decided on a simpler approach (at least for me) using Redis pub/sub; the result is pretty straightforward. My main problem with socket.io-client is that you need to know the hostnames of the servers around you and connect to each one to send messages.
You can take a look at my solution here: https://github.com/alissonperez/scalable-socket-io-server
With this solution you can have as many processes/servers as you want (for example with an auto-scaling setup); you just use Redis to forward your messages between your servers.
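A minimal sketch of that idea (assuming the node_redis client with its older callback-style subscribe API; the channel and event names are just placeholders):
var redis = require('redis');
var pub = redis.createClient();
var sub = redis.createClient();

// every server subscribes to the same channel...
sub.subscribe('broadcast');
sub.on('message', function (channel, message) {
  // ...and relays anything published there to its own connected sockets
  io.sockets.emit('news', JSON.parse(message));
});

// any server (the admin one, for instance) can publish without knowing
// the hostnames of the other servers
pub.publish('broadcast', JSON.stringify({ hello: 'world' }));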
Right now I have the following code:
sIo.sockets.on('connection', function (socket) {
  socket.emit('hello', 'world');
});
I would like to be able to emit this when somebody opens a page from my routes, like this:
//app.js
app.get('/send', routes.index);

//routes.js
exports.index = function (req, res) {
  socket.emit('hello', 'world');
};
How can I achieve this? Thanks in advance
To send a socket message to all connected sockets, you can just call io.sockets.emit instead of socket.emit. There are a few ways to send messages using socket.io which I'll outline below.
// Send the message to all connected clients
io.sockets.emit('message', data);
// Send the message to only this client
socket.emit('message', data);
// Send the messages to all clients, except this one.
socket.broadcast.emit('message', data);
There is also a concept of rooms which you can use to segment your clients.
// Have a client join a room.
socket.join('room')
// Send a message to all users in a particular room
io.sockets.in('room').emit('message', data);
All of that covers how to send messages, but it sounds like you're really asking how to access the socket and/or io objects from inside a separate file. One option is simply to pass those dependencies to the file in question. Your require line will end up looking something like this.
var routes = require('./routes')(io);
Where io is the socket.io object created from .listen. To handle that in your route file you'll have to change how you're defining your exports.
module.exports = function (io) {
  return {
    index: function (req, res) {
      io.sockets.emit('hello', 'world');
      res.send('hello world');
    }
  };
}
A cleaner implementation would have your routes expose events that your socket code can bind to. The following is untested, but should be very close.
var util = require("util"),
    events = require("events");

function MyRoute() {
  events.EventEmitter.call(this);
}

util.inherits(MyRoute, events.EventEmitter);

MyRoute.prototype.index = function (req, res) {
  this.emit('index');
  res.send('hello world');
};

module.exports = new MyRoute();
And then in your app.js file, where you're binding Express routes and socket.io:
// bind so that `this` inside routes.index is still the MyRoute instance
app.get('/send', routes.index.bind(routes));

routes.on('index', function () {
  io.sockets.emit('hello', 'world');
});
There are many other ways to accomplish this, but the best one depends on what you're trying to do. As I alluded to before, broadcasting to everyone is going to be far simpler than sending to one particular user.
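If you do need to target a single user, a minimal sketch is to keep your own map from user ids to sockets (the 'register' event and userId lookup here are illustrative assumptions, not part of socket.io itself):
var userSockets = {};

io.sockets.on('connection', function (socket) {
  // hypothetical: the client identifies itself right after connecting
  socket.on('register', function (userId) {
    userSockets[userId] = socket;
  });

  socket.on('disconnect', function () {
    // drop any map entries that point at this socket
    Object.keys(userSockets).forEach(function (id) {
      if (userSockets[id] === socket) delete userSockets[id];
    });
  });
});

// later, e.g. inside a route handler:
// var target = userSockets[someUserId];
// if (target) target.emit('hello', 'just you');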