Exposing websocket data through an HTTP endpoint - node.js

I'm currently subscribing to a websocket channel in order to get data that needs to be consumed by clients via a GET endpoint.
Clients don't need to consume the full data stream, just the last message received.
I thought about storing the data in memory or into a database, and using that data to serve said GET requests, but I suspect that's not the right implementation. Any ideas on how it should be done will be appreciated.
EDIT: I'm not asking for code, just for an idea of what pattern I should follow. BTW, I'm using Express.
Websocket code:
const WebSocket = require('ws')
const connection = new WebSocket('wss://ws.bitso.com')

connection.onopen = function () {
  connection.send(JSON.stringify({ action: 'subscribe', book: 'btc_usd', type: 'orders' }))
}

connection.onmessage = function (message) {
  const data = JSON.parse(message.data)
  if (data.type === 'orders' && data.payload) {
    console.log(data)
  }
}

I just stored the data in memory and used that as the return value for my endpoint.
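A minimal sketch of that approach, assuming Express (the /orders route and variable names are placeholders): keep only the last payload in a module-level variable and serve it from the GET endpoint.

const express = require('express')
const WebSocket = require('ws')

const app = express()

// Holds only the most recent 'orders' payload received from the websocket.
let lastOrders = null

const connection = new WebSocket('wss://ws.bitso.com')

connection.onopen = function () {
  connection.send(JSON.stringify({ action: 'subscribe', book: 'btc_usd', type: 'orders' }))
}

connection.onmessage = function (message) {
  const data = JSON.parse(message.data)
  if (data.type === 'orders' && data.payload) {
    lastOrders = data.payload // overwrite: clients only need the latest message
  }
}

// Clients hit this endpoint and always get the latest snapshot.
app.get('/orders', (req, res) => {
  if (!lastOrders) return res.status(503).json({ error: 'no data received yet' })
  res.json(lastOrders)
})

app.listen(3000)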

Related

How should I go about using Redis for creating notifications with express/nodejs?

Okay so I have a Nodejs/Express app that has an endpoint which allows users to receive notifications by opening up a connection to said endpoint:
// list of all the streams opened by practitioner users to the backend
var practitionerStreams = []

async function notificationEventsHandler(req, res) {
  const headers = {
    'Content-Type': 'text/event-stream',
    'Connection': 'keep-alive',
    'Cache-Control': 'no-cache'
  }
  const practEmail = req.headers.practemail
  console.log("PRACT EMAIL", practEmail)
  const data = await ApptNotificationData.findAll({
    where: {
      practEmail: practEmail
    }
  })
  // console.log("DATA", data)
  res.writeHead(200, headers)
  res.write(`data:${JSON.stringify(data)}\n\n`)
  // create a new stream
  const newPractStream = {
    practEmail: practEmail,
    res
  }
  // add the new stream to list of streams
  practitionerStreams.push(newPractStream)
  req.on('close', () => {
    console.log(`${practEmail} Connection closed`);
    // remove only this practitioner's stream, comparing against the closed-over practEmail
    practitionerStreams = practitionerStreams.filter(stream => stream.practEmail !== practEmail);
  });
  return res
}

async function sendApptNotification(newNotification, practEmail) {
  var updatedPractitionerStream = practitionerStreams.map((stream) => {
    // iterate through the array and find the stream that contains the pract email we want,
    // then write the new notification to that stream
    if (stream["practEmail"] == practEmail) {
      console.log("IF")
      stream.res.write(`data:${JSON.stringify(newNotification)}\n\n`)
      return stream
    } else {
      // if it doesn't contain the stream we want, leave it unchanged
      console.log("ELSE")
      return stream
    }
  })
  practitionerStreams = updatedPractitionerStream
}
async function sendApptNotification(newNotification, practEmail){
var updatedPractitionerStream = practitionerStreams.map((stream) =>
// iterate through the array and find the stream that contains the pract email we want
// then write the new notification to that stream
{
if (stream["practEmail"]==practEmail){
console.log("IF")
stream.res.write(`data:${JSON.stringify(newNotification)}\n\n`)
return stream
}
else {
// if it doesnt contain the stream we want leave it unchanged
console.log("ELSE")
return stream
}
}
)
practitionerStreams = updatedPractitionerStream
}
Basically, when the user connects it takes the response object (which will stay open), puts it in an object along with a unique email, and writes to it later in sendApptNotification.
But obviously this won't scale for a full app. How exactly do I replace this with Redis? Would I still have a response object that I write to, or would that be replaced with a Redis stream that I can subscribe to on the frontend? I also assume I would store all my streams in Redis as well.
Edit: from the examples I've seen, people write events from Redis to the response object.
Thank you in advance
If you want to use Redis Streams as a notification system, you can follow this official guide:
https://redis.com/blog/how-to-create-notification-services-with-redis-websockets-and-vue-js/ .
To get this data in real time you need to create a websocket connection. I'd rather point you to the official guide than write it out for you because of the quality of that guide: it's great for helping anyone understand how to build this, but of course you'll need to adapt it to your own situation.
However, as I said in the comments, I believe it's simpler to have an API endpoint like /api/v1/notifications and poll it from your frontend code with setInterval, making a request every 5 seconds for example. If you prefer a real-time notification system, I think you first need to understand why you need it, so that you can change your system later if necessary. Basically, it's a trade-off you have to make.
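For instance, the polling alternative is just a repeated request from the frontend (the endpoint path and renderNotifications are placeholders):

// Frontend: ask the API for notifications every 5 seconds instead of holding a live connection.
setInterval(async () => {
  const response = await fetch('/api/v1/notifications');
  const notifications = await response.json();
  renderNotifications(notifications); // hypothetical UI update function
}, 5000);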
For my example, imagine two tables in a relational database: one for Users and the other for Notifications.
The tables of this example:
UsersTable
id   name
1    andrew
2    mark

NotificationTable
id   message    userId   isRead
1    message1   1        true
2    message2   1        false
3    message3   2        false
The endpoint in this example returns all cached notifications that haven't been read by the user. If the cache doesn't exist, the endpoint fetches the data from the database, puts it in the cache and returns it to the user; on the next API call you'll get the result from the cache. There are a few parts left for you to complete, for example the database query that fetches the notifications and the cache expiration configuration. Another important point: if you want the cached notifications to stay up to date, you need to create a middleware (or helper) and trigger it wherever your code creates notifications for a user, so that you update both the database and the cache (see the sketch below the endpoint code). But I think you can complete these parts.
const redis = require('redis');
const redisClient = redis.createClient();
redisClient.connect(); // node-redis v4+ needs an explicit connect before use

app.get('/notifications', async (request, response) => {
  const userId = request.user.id;
  const cacheResult = await redisClient.get(`user:${userId}:notifications`);
  if (cacheResult) return response.send(JSON.parse(cacheResult));
  const notifications = await getUserNotificationsFromDatabase(userId);
  await redisClient.set(`user:${userId}:notifications`, JSON.stringify(notifications));
  response.send(notifications);
});
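As a rough illustration of the invalidation step mentioned above (NotificationModel and createNotification are placeholder names, and the 60-second expiry is just an example), the code that creates a notification can refresh both the database and the cache:

// Hypothetical helper: call it wherever your application generates a new notification,
// so the relational database and the Redis cache stay in sync.
async function createNotification(userId, message) {
  // write to the relational database (the NotificationTable from the example above)
  await NotificationModel.create({ userId, message, isRead: false });

  // rebuild the cached list of unread notifications for this user
  const notifications = await getUserNotificationsFromDatabase(userId);
  await redisClient.set(
    `user:${userId}:notifications`,
    JSON.stringify(notifications),
    { EX: 60 } // optional safety net: let the cache expire after 60 seconds
  );
}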
Besides that, there's another way: you can simply use only Redis, or only the database, to manage these notifications. A relational database with the correct indexes will return results as fast as you expect; you'll only have to think about how many notifications you'll accumulate.

How to safely get the current user id in socket.io/Node.JS?

I am developing a simple API for a chat application with Node.js and Express, and the assignment requires that two users be able to communicate using socket.io. The problem I'm facing is that I cannot "safely" pass information about the current user to the socket. Information about the user I'm chatting with can be passed in the socket's parameters when connecting, which I do, but what about the current user (me)?
For example, I can do this:
const {receiverId, myId} = socket.handshake.query;
That is, specify both ids when connecting. But this is very unsafe, because anyone who connects to the socket can specify any id and write anything on behalf of someone else (for example, through Postman WebSockets).
Another option I was considering is making a POST request in which a connection to the socket is created using request.user.id and the request parameter. The POST request would look like this:
router.post('/chat/:receiver', function (req, res) {
  const { receiver } = req.params
  const socket = io.connect('/')
  socket.emit('initMyUserId', {
    myId: req.user,
  });
})
But this option also did not work, because the file where this function is located and the file where the io variable is initialized are different, and I am not sure that it is generally advisable to transfer and use it in this way. Moreover, this approach will not allow me to check the operation of sockets via Postman, because the connection will occur in a post request, and not manually.
Are there working options to safely transfer the current user id with the ability to test it normally in postman? Or at least just safely pass the current user id if it doesn't work with Postman.
Here is the full code snippet for the socket event handlers:
module.exports = function(io) {
  io.on('connection', (socket) => {
    const { id } = socket;
    console.log(`Socket connected: ${id}`);
    const { idUser } = socket.handshake.query;
    console.log(`Socket idUser: ${idUser}`);
    socket.on('message-to-user', (msg) => {
      msg.type = `user: ${idUser}`;
      socket.to(idUser).emit('message-to-user', msg);
      socket.emit('message-to-user', msg);
    });
    socket.on('disconnect', () => {
      console.log(`Socket disconnected: ${id}`);
    });
  });
}
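A common way to avoid trusting ids passed in the handshake query is to authenticate the handshake itself, for example with a socket.io middleware that verifies a signed token issued at login. A rough sketch of that idea, assuming Socket.IO v4 and the jsonwebtoken package (JWT_SECRET and the userId claim are placeholder names):

const jwt = require('jsonwebtoken');

module.exports = function(io) {
  // Verify the token once per connection, before any events are handled.
  io.use((socket, next) => {
    const token = socket.handshake.auth.token; // sent by the client in the `auth` option at connect time
    try {
      const payload = jwt.verify(token, process.env.JWT_SECRET); // placeholder secret
      socket.data.userId = payload.userId; // trusted id, derived from the verified token
      next();
    } catch (err) {
      next(new Error('Authentication error'));
    }
  });

  io.on('connection', (socket) => {
    const myId = socket.data.userId; // cannot be spoofed via query parameters
    const { receiverId } = socket.handshake.query;
    socket.on('message-to-user', (msg) => {
      msg.type = `user: ${myId}`;
      socket.to(receiverId).emit('message-to-user', msg);
      socket.emit('message-to-user', msg);
    });
  });
}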

websocket send to specific user nodejs

I am currently creating a websocket server for a mobile frontend, and in some cases I need to send a JSON payload to only a specific user. After several attempts I still haven't managed to get websockets working so that I can send my JSON to one client at a time.
I'm using this library: github.com/websockets/ws
To explain my problem: I have several products, each containing several variables that need to be refreshed in real time. When a user connects to a product he should receive only the JSON for that product, and the other users should receive the JSON for the products they are currently on. That's why I want to send a specific JSON payload to a specific user.
I would like to know if any of you know how to fix this, as I'm starting to get stuck on it.
Thank you very much.
const opp = new WebSocket.Server({port: 10001});
let user = 0;
let lookup = [];

opp.on('connection', function connection(op) {
  lookup.push(op.id);
  let id = "";
  op.on('message', function incoming(message) {
    console.log('received: %s', message);
    id = message;
    query = {
      text: "SELECT id,state,users_list_name,user_win,timer_stamp FROM products WHERE id = " + parseInt(id) + " AND circle = 1 ORDER BY CASE WHEN state = \'available\' THEN \'1\' WHEN state = \'soon\' THEN \'2\' WHEN state = \'expired\' THEN \'3\' END",
    };
  });
  client.connect();
  const interval = setInterval(function ping() {
    client.query(query, (err, res) => {
      if (err) {
        console.log(err.toString());
        console.log(query);
      } else {
        console.log(lookup);
        for (let i = 0; i < lookup.length; i++) {
          console.log("########################");
          lookup[i].send(JSON.stringify(res.rows));
        }
      }
    });
  }, 300);
});
OK. Still trying to understand the actual spec you're shooting for. But, assuming the following (based on your answers to my prior questions):
A client connects using a webSocket.
When they send a message over that webSocket, that message is an id of something that can be looked up in your database and that they want regular updates for.
Those updates for that particular id should be sent only to that specific client that requested it.
If a different client connects and specifies some id, they should get updates for that id only.
When a client sends a new message that specifies a different id, their updates should now be only for that new id.
Updates for the id that one client requested are sent only to that one client (not to the other clients).
If that's what you really want, here's a way to structure that.
const wss = new WebSocket.Server({port: 10001});

// make database connection that all users share
client.connect();

wss.on('connection', function connection(ws) {
  // these variables are unique to each ws connection
  let interval, query;

  // when webSocket closes, stop any current interval timer associated with this webSocket
  ws.on('close', function() {
    if (interval) {
      clearInterval(interval);
    }
  });

  // when we get an id, start querying for updates on that id
  ws.on('message', function incoming(id) {
    console.log(`received: ${id}`);
    query = {
      text: "SELECT id,state,users_list_name,user_win,timer_stamp FROM products WHERE id = " + parseInt(id) + " AND circle = 1 ORDER BY CASE WHEN state = \'available\' THEN \'1\' WHEN state = \'soon\' THEN \'2\' WHEN state = \'expired\' THEN \'3\' END",
    };
    // if interval is not already going, start it
    if (!interval) {
      interval = setInterval(function() {
        client.query(query, (err, res) => {
          if (err) {
            console.log(err);
            console.log(query);
          } else {
            // send data to just the one client that this timer is for
            ws.send(JSON.stringify(res.rows));
          }
        });
      }, 300);
    }
  });
});
Now, some comments:
Polling the database on a short time interval with a separate polling loop for every single client simply will not scale at all. You will have serious database scale issues. You really need a better design here, but since we don't know the overall requirements and architecture of your application, we don't have enough info to know what to suggest. Probably you want to leverage notifications in a database that tell you when data has changed rather than you polling it on a short interval on behalf of every single client.
I could find no reason for the lookup data structure. Your comments say that you want to send updates to ONE specific client, the one that requested that id. That can be done with ws.send().
This code assumes that the client variable represents a connection to your database that each of the setIntervals for each connected client can all share. That's why that code was moved out of the wss.on('connection', ...) event handler.
I switched to the more common terminology of wss to refer to the server instance and ws to refer to the webSocket for a particular connected client.
ws.send() is how you send to a connected client. I still don't know what you were doing with op.id. Looking at the doc for the ws library, that doesn't appear to be something you can use to send to.
Your code (and this code) creates a separate setInterval() timer for every webSocket client that connects and it uses a very short interval time. This will not scale. At worst, the interval time needs to be lengthened into multiple seconds (depending upon desired target scale). At best, you need to stop polling the database entirely and use some other mechanism in the database for getting notifications when data has been changed.
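To make the last point concrete, here is one possible shape for a notification-driven version. This is only a sketch, assuming the database is PostgreSQL accessed through the pg package and that a trigger on products runs NOTIFY product_changes with the changed row's id:

const { Client } = require('pg');
const WebSocket = require('ws');

const pgClient = new Client(/* connection config */);
const wss = new WebSocket.Server({ port: 10001 });

// productId -> Set of sockets currently watching that product
const watchers = new Map();

wss.on('connection', (ws) => {
  let watchedId = null;
  ws.on('message', (id) => {
    // switch this client to the newly requested product id
    if (watchedId && watchers.has(watchedId)) watchers.get(watchedId).delete(ws);
    watchedId = String(parseInt(id, 10));
    if (!watchers.has(watchedId)) watchers.set(watchedId, new Set());
    watchers.get(watchedId).add(ws);
  });
  ws.on('close', () => {
    if (watchedId && watchers.has(watchedId)) watchers.get(watchedId).delete(ws);
  });
});

(async () => {
  await pgClient.connect();
  // assumes a trigger on `products` that runs: NOTIFY product_changes, '<product id>'
  await pgClient.query('LISTEN product_changes');
  pgClient.on('notification', async (msg) => {
    const productId = msg.payload;
    const sockets = watchers.get(productId);
    if (!sockets || sockets.size === 0) return; // nobody is watching this product
    const res = await pgClient.query(
      'SELECT id, state, users_list_name, user_win, timer_stamp FROM products WHERE id = $1 AND circle = 1',
      [Number(productId)]
    );
    // push the fresh row only to the clients that asked for this product
    for (const ws of sockets) {
      if (ws.readyState === WebSocket.OPEN) ws.send(JSON.stringify(res.rows));
    }
  });
})();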

How to send File through Websocket along with additional info?

I'm developing a Web application to send images, videos, etc. to two monitors from an admin interface. I'm using ws in Node.js for the server side. I've implemented selecting images available on the server and external URLs and sending them to the clients, but I also wanted to be able to directly send images selected from the device with a file input. I managed to do it using base64 but I think it's pretty inefficient.
Currently I send a stringified JSON object containing the client to which the resource has to be sent, the kind of resource and the resource itself, parse it in the server and send it to the appropriate client. I know I can set the Websocket binaryType to blob and just send the File object, but then I'd have no way to tell the server which client it has to send it to. I tried using typeson and BSON to accomplish this, but it didn't work.
Are there any other ways to do it?
You can send raw binary data through the WebSocket.
It's quite easy to manage.
One option is to prepend a "magic byte" (an identifier that marks the message as non-JSON). For example, prepend binary messages with the B character.
All the server has to do is test the first character before collecting the binary data (if the magic byte isn't there, it's probably the normal JSON message).
A more serious implementation will attach a header after the magic byte (i.e., file name, total length, position of the data being sent, etc.).
This allows the upload to be resumed after disconnections (send just the parts that weren't acknowledged as received).
Your server will need to split the data into magic byte, header and binary data before processing, but it's easy enough to accomplish.
Hope this helps someone.
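A rough illustration of that framing (my own example, with a made-up layout: one magic byte, a 2-byte name length, the UTF-8 file name, then the file bytes; the ws v8+ message callback signature is assumed on the server):

// Client side (browser): frame a file as [ 'B' | nameLength (2 bytes) | fileName | fileData ]
async function sendFile(ws, file) {
  const nameBytes = new TextEncoder().encode(file.name);
  const fileBytes = new Uint8Array(await file.arrayBuffer());
  const frame = new Uint8Array(1 + 2 + nameBytes.length + fileBytes.length);
  const view = new DataView(frame.buffer);

  frame[0] = 'B'.charCodeAt(0);               // magic byte: marks a binary frame
  view.setUint16(1, nameBytes.length);        // header: file name length (big-endian)
  frame.set(nameBytes, 3);                    // header: the file name itself
  frame.set(fileBytes, 3 + nameBytes.length); // payload: the file contents

  ws.send(frame); // normal JSON messages are still sent as plain strings
}

// Server side (ws library): check the magic byte before deciding how to parse
wss.on('connection', (ws) => {
  ws.on('message', (data, isBinary) => {
    if (isBinary && data[0] === 'B'.charCodeAt(0)) {
      const nameLength = data.readUInt16BE(1);
      const fileName = data.slice(3, 3 + nameLength).toString('utf8');
      const fileData = data.slice(3 + nameLength);
      // ...store fileData / forward it to the right client
    } else {
      const message = JSON.parse(data.toString());
      // ...handle the normal JSON message
    }
  });
});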
According to the socket.io documentation you can send either a string, a Buffer, or a mix of both.
On the client side:
function uploadFile(e, socket, to) {
  let file = e.target.files[0];
  if (!file) {
    return
  }
  if (file.size > 10000000) {
    alert('File should be smaller than 10MB')
    return
  }
  var reader = new FileReader();
  var rawData = new ArrayBuffer();
  reader.onload = function (e) {
    rawData = e.target.result;
    socket.emit("send_message", {
      type: 'attachment',
      data: rawData
    }, (result) => {
      alert("Server has received the file!")
    });
    alert("The file has been transferred.")
  }
  reader.readAsArrayBuffer(file);
}
On the server side:
socket.on('send_message', async (data, cb) => {
  if (data.type == 'attachment') {
    console.log('Found binary data')
    cb("Received file successfully.")
    return
  }
  // Process other business...
});
I am using a pure WebSocket without socket.io, where you cannot mix content types (it's either string or binary), so my working solution is like this:
CLIENT:
import { serialize } from 'bson';
import { Buffer } from 'buffer';

const reader = new FileReader();
let rawData = new ArrayBuffer();
ws = new WebSocket(...)

reader.onload = (e) => {
  rawData = e.target.result;
  const bufferData = Buffer.from(rawData);
  const bsonData = serialize({ // whatever js Object you need
    file: bufferData,
    route: 'TRANSFER',
    action: 'FILE_UPLOAD',
  });
  ws.send(bsonData);
}
Then on the Node server side, the message is caught and parsed like this:
const { deserialize } = require('bson');
const fs = require('fs');
const path = require('path');

const dataFromClient = deserialize(wsMessage, { promoteBuffers: true }) // edited
fs.writeFile(
  path.join('../server', 'yourfiles', 'yourfile.txt'),
  dataFromClient.file, // edited
  'binary',
  (err) => {
    console.log('ERROR!!!!', err);
  }
);
The killer is the promoteBuffers option in the deserialize function.

NodeJS - Response stream

I built a simple API endpoint with NodeJS using Sails.js.
When someone accesses my API endpoint, the server starts waiting for data, and whenever new data appears it broadcasts it using sockets. Each client should receive his own stream of data based on his user input.
var Cap = require('cap').Cap;

collect: function (req, res) {
  var iface = req.param("ip");
  var c = new Cap(),
      device = Cap.findDevice(iface);
  c.on('data', function(myData) {
    sails.sockets.blast('message', {"host": myData});
  });
}
The response never completes (I never send a res.json()); what actually happens is that the browser keeps loading, but the functionality above works.
2 Problems:
I'm trying to subscribe and unsubscribe to this API endpoint from my client (using RxJS). When I subscribe, I start receiving data via sockets, but I can't unsubscribe from the API endpoint (the browser expects the request to complete).
Each client should subscribe to his own socket room based on the request's IP parameter (see updated code). Currently it blasts the message to everyone.
How I can create a stream/service-like API endpoint with Sails.js that will emit new data to each user based on his input?
My goal is to be able to subscribe / unsubscribe to this API endpoint from each client.
Revised Answer
Let's assume your API endpoint is defined in config/routes.js like this:
...
'get /collect': 'SomeController.collectSubscribe',
'delete /collect': 'SomeController.collectUnsubscribe',
Since each Cap instance is tied to one device, we need one instance for each subscription. Instead of using the sails join/leave methods, we keep track of Cap instances in memory and just broadcast to the request socket's id. This works because Sails sockets are subscribed to their own ids by default.
In api/controllers/SomeController.js:
// In order for the `Cap` instances to persist after `collectSubscribe` finishes,
// we store them all in an Object, keyed by the socket they were created for.
var caps = {/* req.socket.id: <instance of Cap>, */};

module.exports = {
  ...
  collectSubscribe: function(req, res) {
    if (!req.isSocket) return res.badRequest("I need a websocket! Help!");
    if (!!caps[req.socket.id]) return res.badRequest("Dude, you are already subscribed.");
    caps[req.socket.id] = new Cap();
    var c = caps[req.socket.id]; // remember that `c` is a reference to our new `Cap`, not a copy.
    var device = Cap.findDevice(req.param('ip')); // findDevice is a static method on Cap
    c.open(device, ...);
    c.on('data', function(myData) {
      sails.sockets.broadcast(req.socket.id, 'message', {host: myData});
    });
    return res.ok();
  },

  collectUnsubscribe: function(req, res) {
    if (!req.isSocket) return res.badRequest("I need a websocket! Help!");
    if (!caps[req.socket.id]) return res.badRequest("I can't unsubscribe you unless you actually subscribe first.");
    caps[req.socket.id].removeAllListeners('data');
    delete caps[req.socket.id];
    return res.ok();
  }
}
Basically, it goes like this: when a browser request triggers collectSubscribe, a new Cap instance starts listening to the provided IP. When the browser triggers collectUnsubscribe, the server retrieves that Cap instance, tells it to stop listening, and then deletes it.
Production considerations: please be aware that the list of Caps is NOT PERSISTENT (since it is stored in memory, not in a DB)! So if your server is turned off and rebooted (due to a lightning storm, etc.), the list will be cleared; but considering that all websocket connections will be dropped anyway, I don't see any need to worry about this.
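For the client side, a possible usage sketch with the sails.io.js browser SDK (my assumption, not part of the original answer; the ip parameter matches req.param('ip') above and the 'message' event matches the broadcast):

// Browser-side sketch using the sails.io.js client (io.socket is provided by that SDK).
// Subscribe: this virtual GET request goes over the websocket, so req.isSocket is true.
io.socket.get('/collect', { ip: '192.168.0.10' }, function (body, jwr) {
  console.log('Subscribed:', jwr.statusCode);
});

// Receive the per-socket broadcasts sent by collectSubscribe.
io.socket.on('message', function (data) {
  console.log('New capture data:', data.host);
});

// Later, stop receiving updates.
io.socket.delete('/collect', {}, function (body, jwr) {
  console.log('Unsubscribed:', jwr.statusCode);
});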
Old Answer, Kept for Reference
You can use sails.sockets.join(req, room) and sails.sockets.leave(req, room) to manage socket rooms. Essentially you have a room called "collect", and only sockets joined in that room will receive a sails.sockets.broadcast(room, eventName, data).
More info on how to use sails.sockets here.
In api/controllers/SomeController.js:
collectSubscribe: function(req, res) {
  if (!req.isSocket) return res.badRequest();
  sails.sockets.join(req, 'collect');
  return res.ok();
},

collectUnsubscribe: function(req, res) {
  if (!req.isSocket) return res.badRequest();
  sails.sockets.leave(req, 'collect');
  return res.ok();
}
Finally, we need to tell the server to broadcast messages to our 'collect' room.
Note that this only needs to happen once, so you can do it in a file under the config/ directory.
For this example, I'll put it in config/sockets.js:
module.exports = {
  // ...
};

c.on('data', function(myData) {
  var eventName = 'message';
  var data = {host: myData};
  sails.sockets.broadcast('collect', eventName, data);
});
I am assuming that c is accessible here; If not, you could define it as sails.c = ... to make it globally accessible.
