Keeping track of socket.io sessions for logging - node.js

I am working on making a generic logging module for my application and am trying to add session information to each log (requestId/socketId, userId, etc.), but I am running into some issues with logging websockets.
Basically my application has 2 parts: a restAPI (express) and websockets (socket.io).
Both the restAPI and the websockets use some of the same functions (database edits etc.), and these functions should log errors or other useful data.
But passing the session information to the logger module creates a lot of overhead and makes the code quite unreadable, so I am looking for a way to save the session information so that the logger can get it from there.
For the restAPI this was fairly simple using AsyncLocalStorage, and I was hoping to use the same principle for the websockets, but I guess it's not that simple.
My (partially) working code setup is as follows:
Global context creator (logAsyncContext.ts):
import { AsyncLocalStorage } from "async_hooks";

export const context = new AsyncLocalStorage();

export const createContext = (data: any, callBack: () => any) => {
  const store = data;
  return context.run(store, () => callBack());
};
This is then used by the middleware of the restAPI and websockets
RestAPI middleware (apiLogContext.ts):
// Import the required modules
import { v4 } from "uuid";
import { Request, Response, NextFunction } from "express";
// Import custom utilities
import { createContext } from "../../utils/logAsyncContext";
import { logger } from "../../utils/logger";

// Generate a unique ID for incoming requests and store it in the context so the logger can access it
export const apiLogContext = (
  req: Request,
  _res: Response,
  next: NextFunction
) => {
  const logData = {
    api: {
      requestId: v4(),
      originalUrl: req.originalUrl,
    },
  };
  return createContext(logData, () => debugLog(next));
};

const debugLog = (next: NextFunction) => {
  logger.debug("API log context created");
  return next();
};
Websocket middleware (wsLogContext.ts):
// Import the required modules
import { v4 } from "uuid";
import { Socket } from "socket.io";
// Import custom utilities
import { createContext } from "../../utils/logAsyncContext";
import { logger } from "../../utils/logger";

// Generate a unique ID for incoming connections and store it in the context so the logger can access it
export const wsLogContext = (socket: Socket, next: () => void) => {
  const logData = {
    ws: {
      socketId: v4(),
      nameSpace: socket.nsp.name,
    },
  };
  return createContext(logData, () => debugLog(next));
};

const debugLog = (next: () => void) => {
  logger.debug(`WS log context created`);
  return next();
};
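Likewise, this middleware would be registered on the socket.io server. A minimal sketch (the io instance and import path are assumptions):
import { Server } from "socket.io";
import { wsLogContext } from "./middleware/wsLogContext"; // path assumed

const io = new Server(3000);
io.use(wsLogContext); // runs once per connection handshake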
Now the logger can get the context from logAsyncContext.ts:
import { context } from "./logAsyncContext";

const getStore = () => {
  // Get the store from the AsyncLocalStorage
  const store = context.getStore();
  // If the store is not defined, log an error
  if (!store) {
    console.log("Store is not defined");
    return undefined;
  }
  return store;
};

export function debug(message: string) {
  // Get the context
  const store = getStore();
  if (!store) {
    return;
  }
  if (isAPILog(store)) {
    console.debug(
      `DEBUG LOG: ${store.api.requestId} | ${store.api.originalUrl} - ${message}`
    );
  } else {
    console.debug(
      `DEBUG LOG: ${store.ws.socketId} | ${store.ws.nameSpace} - ${message}`
    );
  }
}
This works perfectly for the restAPI, but for the websockets it's a different story: it does log the initial debug message ("WS log context created"), but everything logged after that cannot access the store ("Store is not defined").
Now I am sure there is a logical explanation, but I don't fully understand the structure of data for websocket connections. So I am asking: am I just making a simple mistake, or is this whole setup for logging websockets incorrect? If so, what would be a better way (without needing to pass the session info with every log)?

I faced the same issue.
After a shallow investigation, I can point out the following:
socket.io middlewares are not the same as in express (not 100% sure).
There is a known issue https://github.com/nodejs/node/issues/32330 (closed, but with tricky code).
To get AsyncLocalStorage working with socket.io, I did the following:
// context.js
const uuid = require('uuid').v4;
const { AsyncLocalStorage } = require('async_hooks');

const context = new AsyncLocalStorage();
const enterWith = (data) => context.enterWith({ traceId: uuid(), ...data });

module.exports = { context, enterWith };
// sockets.js
// I have legacy socket.io v2, your code may be different
io.use(contextMiddleware);
io.use(authSocket);
io.on('connection', (socket) => {
  socket.on('USER_CONNECT', async () => {
    socket.emit('Exo', `USER_CONNECT`);
    try {
      // The main solution is here: enter a valid context before the actual controller execution
      await enterWith({ userId: socket.chatuser });
      await userService.createOrUpdateChatUser({ userId: socket.chatuser, customerId });
      socket.emit('Exo', `User created`);
    } catch (e) {
      logger.error(`Create user failure ${e.message}`, { error: e });
      socket.emit('Error', e.message);
    }
  });
});
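Once enterWith has been called, anything that reads the store later in that event's async chain will see it. A minimal sketch of a context-aware logger (the logger module is not shown in this answer, so this is an assumption):
// logger.ts - hypothetical sketch of a logger that reads the store
import { context } from "./context";

export const info = (message: string) => {
  // getStore() returns the object passed to enterWith(), or undefined outside any context
  const store = (context.getStore() as { traceId?: string; userId?: string }) ?? {};
  console.log(`[trace:${store.traceId ?? "none"}] [user:${store.userId ?? "anon"}] ${message}`);
};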

Thanks to @bohdan for reminding me that this issue was still unanswered. While his solution works, I will also explain what I did, for anyone wondering how to do this using middleware.
What I learned is that WebSockets can be very confusing but quite logical. For me the most important thing to realize was that you cannot use the "same" asyncLocalStorage store for a single socket for as long as that socket is connected. So I use a different asyncLocalStorage for each event (I will call them stores).
For me there are 4 different situations in a websocket connection which cannot share the same store:
When a connection is made
When an event is received (frontend --> backend)
When an event is sent (backend --> frontend)
When a connection is closed
For all of these types I (mostly) use the same middleware:
import { AsyncLocalStorage } from "async_hooks";
import { Socket } from "socket.io";
import { v4 } from "uuid";

export const context = new AsyncLocalStorage();

const wsLogStore = (socket: Socket, next: () => void) => {
  const newData: any = {
    // Any data you want to save in the store,
    // for example the socket id
    socketId: socket.id,
    // I also add an eventId which I can later use in my logging
    // to combine all logs belonging to a single event
    eventId: v4(),
  };
  return context.run(newData, () => next());
};

export default wsLogStore;
#1 For the first type (when a connection is made)
You can use the middleware like this:
// Import the middleware we just created
import wsLogStore from "./wsLogStore"
// io = socketIO server instance (io = new Server)
io.use(wsLogStore)
Now a store will be available in everything that runs directly after the connection is made.
#2 When an event is received (frontend --> backend)
io.use((socket, next) => {
  // Wrap every incoming event in its own store
  socket.use((event, eventNext) => {
    wsLogStore(socket, () => {
      eventNext();
    });
  });
  // Don't forget to call next() here, otherwise the connection will hang
  next();
});
Now, everywhere you use socket.on("<any event>"), a store will have been created and will be usable.
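For illustration, a handler registered after this middleware now runs with its own per-event store (a sketch; the logger and the event name are assumptions):
io.on("connection", (socket) => {
  socket.on("chat message", (msg) => {
    // context.getStore() inside the logger now returns the
    // { socketId, eventId } object created by wsLogStore for this event
    logger.debug(`received: ${msg}`);
  });
});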
#3 When an event is sent (backend --> frontend)
Now this one is a little bit different, since depending on your implementation this will not be easy. For example, when you send something to a specific room, is it enough to create a single store for the whole room? Or do you want a separate one for each socket that is receiving the event? And how do we create a store, since we don't have a specific socket available?
For my use case it was absolutely necessary to have a separate store for each socket that is receiving an event.
const sendEventsToSockets = () => {
  // Get the sockets you want to send an event to,
  // for example, you could get the sockets from a room
  const sockets = (Array.from(io.sockets.sockets.values()) as Socket[])
    .filter((socket) => socket.rooms.has("your room"));
  for (const socket of sockets) {
    wsLogStore(socket, () => {
      // Here a separate store for each socket will be available
      socket.emit("your event");
    });
  }
};
#4 When a connection is closed
Sadly, the store we created in step 1 is not available in this case so we would need to create a new one.
io.use((socket, next) => {
  socket.on("disconnect", () => {
    wsLogStore(socket, () => {
      // A separate store will be available here when a connection is closed
    });
  });
  next();
});
Conclusion
While it would be easier if we could create a single store for each socket and use it the whole time, it seems like that is simply not possible.
By saving the socketId in our store, we can however combine all the data that we need afterwards, for example in logging.
Note: If you use namespaces, the socketId will be different for each namespace. You could use the connection id socket.conn.id, which is a unique ID for each underlying connection (no matter which namespace). Why this value is marked as private (if using TS) I have no clue.
All of this will of course be slightly different depending on your use case and implementation. For example, if you use namespaces then you need to make sure the middleware is applied in each namespace.
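For example, applying the middleware per namespace could look like this (a sketch; the "/chat" namespace is an assumption):
// Each namespace has its own middleware chain, so register wsLogStore on each one
io.of("/chat").use(wsLogStore);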
I hope someone finds this helpful and if there are any question about how I do things or how to improve my setup, I would love to hear from you!

Related

Node.js/Vuetify- Is there a way to get data from server based on time?

I have a node.js server set up for a vuetify project. In my server, I am parsing a CSV file that has information in it about scheduling and time. In my vuetify project, is there a way to get data from the CSV based on the time at which the client is being used?
OK, let's go with an example.
From what I understand, you have the following information in your CSV file:
Time,Activity
07:00,Breakfast
08:00,Go to work
12:00,Lunch break
Since you didn't specify, I will use an example parser, which will push all rows, as objects, into an array:
[
  { Time: '07:00', Activity: 'Breakfast' },
  { Time: '08:00', Activity: 'Go to work' },
  { Time: '12:00', Activity: 'Lunch break' }
]
You need to send that information to your clients, so assuming you are using Express, you could go with something along the lines of:
const csv = require('csv-parser');
const fs = require('fs');
const express = require('express');
const app = express();

const timeSchedule = [];

function parseCsv() {
  return new Promise((resolve, reject) => {
    fs.createReadStream('data.csv')
      .pipe(csv())
      .on('data', (data) => timeSchedule.push(data))
      .on('error', (err) => reject(err))
      .on('end', () => resolve());
  });
}
app.get('/scheduled-for/:hour', function (req, res) {
  // You need to come up with the logic for your case
  // As an example, I will assume that if I make a request at any time
  // between 7 and 8, I will get "Breakfast"
  // between 8 and 9 - "Go to work"
  // between 12 and 13 - "Lunch break"
  parseCsv().then(() => {
    res.json(timeSchedule.find(row => row.Time.startsWith(req.params.hour)));
  });
});
Please note, all of the above is happening on the Node.js server.
From the client, you will have to call the scheduled-for GET handler with the hour param. Another option is to let the back-end determine the hour of the request using the Date object, but the above is more flexible for the client. It also avoids timezone issues, given that your client requests may come from different timezones than the one your server is on.
Assuming you are using axios in your Vue application, the simplest way to get the schedule is to call your API in your component:
new Vue({
  el: '#app',
  data () {
    return {
      activity: null
    }
  },
  mounted () {
    axios
      .get('https://urlToYourApi/v1/scheduled-for/' + new Date().getHours())
      .then(response => {
        // axios resolves with a response object; the payload is on .data
        const schedule = response.data;
        if (schedule) {
          this.activity = schedule.Activity;
        }
        else {
          console.log("No activity found for this hour!");
        }
      });
  }
})
This code is NOT for production! You need to handle many cases, such as new Date().getHours() returning single-digit hours, parsing of the CSV, not to mention the domain logic itself, which depends on your specific case. This is just a simple example. I hope it helps to guide you in the right direction!
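To illustrate the single-digit-hours caveat (a sketch, not part of the original answer): the CSV stores zero-padded times like "07:00", so the client-side hour needs the same padding for startsWith to match:
// getHours() returns 7, not "07", so pad before building the URL
const hour = String(new Date().getHours()).padStart(2, "0");
axios.get('https://urlToYourApi/v1/scheduled-for/' + hour);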

How to conditionally thread in a Node.js application

My website's api works fine and runs fast, except for one route. Since Node.js is single threaded, we wanted the api call DATA to be separately threaded so that it doesn't block the rest of the incoming calls, and because it took too long to fork. Basically the code I want is like this:
router.get(req, res){
  if(MasterThread){
    create a thread to run the DATA functions
  }
  else{
    res.json(getDATA())
  }
}
is this possible at all? All the tutorials I found implied that I either had to use cluster, or that my threading had to occur in my main.js, neither of which I want to do.
When I tried to set up the threading across 2 files, the imports for the Node.js threading module were always null, or my imports never worked.
So again, is this a possible thing?
You can use worker_threads for this case. I use it for one of my sites and it works perfectly. Here is a minimal example:
const path = require('path');
const { Worker } = require('worker_threads');

const worker_script = path.join(__dirname, "./worker.js");

router.get('/', function(req, res) {
  const worker = new Worker(worker_script, {
    workerData: JSON.stringify(obj) // obj = whatever data the worker needs
  });
  worker.on("error", (err) => console.log(err));
  worker.on("exit", () => console.log("exit"));
  worker.on("message", (data) => {
    console.log(data);
    res.send(data);
  });
});
And here is the worker:
const { parentPort, workerData, isMainThread } = require('worker_threads');

if (!isMainThread) {
  console.log("workerData: ", workerData);
  // do some heavy work with the data
  const parsed = JSON.parse(workerData);
  parentPort.postMessage(parsed);
}

Using Mocha for integration testing of a callback that should trigger on an AMQP message-received event

I have a Feathers application that is using RabbitMQ and a custom amqplib wrapper to communicate with some other code running elsewhere, and I'm struggling to write a good integration test to show that the callback that runs when a message is received works correctly. The actual callback just takes the body of the received message and calls an internal service to put the data in the database.
I have a RabbitMQ server running in the test environment, and the idea was to write a test that publishes some dummy data to the correct exchange, and then check that the data ends up in the database. The problem is that I can't work out how to tell that the callback has finished before I check the database.
Right now, I just publish the message and then use a timeout to wait a few seconds before checking the database, but I don't like this since there is no guarantee that the callback will have completed.
The code I'm testing looks something like this (not the actual code, just an example):
const app = require('./app');

// handleAMQP is passed as a callback to the consumer;
// it creates a new record in the myService database
const handleAMQP = async (message) => {
  await app.service('users').create(message.content);
};

// subscribe takes an amqp connection, opens a channel, and connects a callback
const subscribe = (conn) => {
  const queue = 'myQueue';
  const exchange = 'myExchange';
  const topic = 'myTopic';
  return conn.createChannel().then(function (ch) {
    var ok = ch.assertExchange(exchange, 'topic', { durable: true });
    ok = ok.then(function () {
      return ch.assertQueue(queue, { exclusive: true });
    });
    ok = ok.then(function (qok) {
      // Pass the queue name along so the next step receives it
      return ch.bindQueue(qok.queue, exchange, topic).then(() => qok.queue);
    });
    ok = ok.then(function (queue) {
      return ch.consume(queue, handleAMQP);
    });
    return ok;
  });
};

module.exports = { subscribe };
And my test looks something like this:
const assert = require('assert');
const amqp = require('amqplib');
const app = require('./app');

describe('AMQP Pub/Sub Tests', () => {
  let exchange = 'myExchange';
  let topic = 'myTopic';
  let dummyData = {
    email: 'example@example.com',
    name: 'Example User'
  };
  it('creates a new db entry when an amqp message is received', async () => {
    // Publish some dummy data
    await amqp.connect('amqp://localhost').then((conn) => {
      return conn.createChannel().then((ch) => {
        return ch.assertExchange(exchange, 'topic', { durable: true }).then(() => {
          // publish() expects a Buffer and returns a boolean, not a promise
          ch.publish(exchange, topic, Buffer.from(JSON.stringify(dummyData)));
          return ch.close();
        });
      });
    });
    await new Promise((resolve) => setTimeout(resolve, 3000)); // Wait three seconds
    // Attempt to find the newly created user
    let result = await app.service('users').find({ email: 'example@example.com' });
    assert.deepEqual(result.email, dummyData.email);
    assert.deepEqual(result.name, dummyData.name);
  });
});
Instead of just waiting an arbitrary amount of time before I check if the record exists, is there a better way to structure this test?
Or is waiting a set time a totally valid approach for testing event-driven functionality?

How to send data to local callable function? (Firebase CLI Shell) [duplicate]

I can't seem to find the solution for this in the Firebase Documentation.
I want to test my functions.https.onCall functions locally. Is it possible using the shell, or can I somehow connect my client (firebase SDK enabled) to the local server?
I want to avoid having to deploy every time just to test a change to my onCall functions.
My code
Function:
exports.myFunction = functions.https.onCall((data, context) => {
  // Do something
});
Client:
const message = { message: 'Hello.' };
firebase.functions().httpsCallable('myFunction')(message)
  .then(result => {
    // Do something //
  })
  .catch(error => {
    // Error handler //
  });
To run against the local emulator, you must call (after firebase.initializeApp):
firebase.functions().useFunctionsEmulator('http://localhost:5000')
Although the official Firebase Cloud Function docs have not yet been updated, you can now use firebase-functions-test with onCall functions.
You can see an example in their repository.
I have managed to test my TypeScript functions using Jest; here is a brief example. There are some peculiarities here, like import order, so make sure to read the docs :-)
/* functions/src/test/index.test.js */
/* dependencies: Jest and ts-jest */
const admin = require("firebase-admin");
jest.mock("firebase-admin");
admin.initializeApp = jest.fn(); // stub the init (see docs)
const fft = require("firebase-functions-test")();
import * as funcs from "../index";

// myFunc is an https.onCall function
describe("test myFunc", () => {
  // helper function so I can easily test different context/auth scenarios
  const getContext = (uid = "test-uid", email_verified = true) => ({
    auth: {
      uid,
      token: {
        firebase: {
          email_verified
        }
      }
    }
  });
  const wrapped = fft.wrap(funcs.myFunc);

  test("returns data on success", async () => {
    const result = await wrapped(null, getContext());
    expect(result).toBeTruthy();
  });

  test("throws when no Auth context", async () => {
    await expect(wrapped(null, { auth: null })).rejects.toThrow(
      "No authentication context."
    );
  });
});
There is a simple trick that simplifies onCall function testing: declare the onCall callback as a standalone function and test that instead:
export const _myFunction = (data, context) => { // <= call this in your unit tests
  // Do something
};
exports.myFunction = functions.https.onCall(_myFunction);
Now you can vary all the cases with a normal function call, using whatever input you define.
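A usage sketch of that pattern (assuming Jest, as in the answer above; the data and context values are made up):
import { _myFunction } from "../index";

test("myFunction handles a message", async () => {
  // Call the callback directly with whatever data/context you want to simulate
  const context = { auth: { uid: "test-uid" } } as any;
  const result = await _myFunction({ message: "Hello." }, context);
  expect(result).toBeDefined();
});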
Callables are just HTTPS functions with a specific format. You can test them just like an HTTPS function, except you have to write code that delivers the protocol as defined in the documentation.
You should first check for the dev environment and then point your functions to the local emulator.
For JS:
// after firebase init
if (window.location.host.includes("localhost") ||
    window.location.host.includes("127.0.0.1")
) {
  firebase
    .app()
    .functions() // add location here also if you're mentioning location while invoking functions()
    .useFunctionsEmulator("http://localhost:5001");
}
or, if you don't create an instance of firebase:
// after firebase init
if (window.location.host.includes("localhost") ||
    window.location.host.includes("127.0.0.1")
) {
  firebase
    .functions()
    .useFunctionsEmulator("http://localhost:5001");
}
or when serving pages from backend (node.js):
// after firebase init
if (process.env.NODE_ENV === 'development') {
  firebase.functions().useFunctionsEmulator('http://localhost:5001');
}
If you are using AngularFire, add this to your app.module:
{
  provide: FirestoreSettingsToken,
  useValue: environment.production
    ? undefined
    : {
        host: "localhost:5002",
        ssl: false
      }
}

Observer creates multiple messages with socket.io in Angular 2

I am using this code to receive messages from a socket in Angular 2. I use it globally in the application, but it creates messages multiple times when routing to other pages. I created a chat-box component which opens globally, like the Facebook chat-box.
getMessages() {
  let observable = new Observable(observer => {
    this.socket = io(this.url);
    this.socket.on('message', (data) => {
      observer.next(data);
    });
    return () => {
      this.socket.disconnect();
    };
  });
  return observable;
}
I am not sure if this will help or not. In Angular 1.x I use:
$scope.$on('$destroy', function(event) {
  // Code to unobserve the socket here...
});
I am sure there is an equivalent in Angular 2.
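The Angular 2+ equivalent is the ngOnDestroy lifecycle hook. A sketch under assumptions (a hypothetical ChatService wrapping the getMessages() method from the question):
import { Component, OnDestroy, OnInit } from "@angular/core";
import { Subscription } from "rxjs";
import { ChatService } from "./chat.service"; // hypothetical service exposing getMessages()

@Component({ selector: "chat-box", templateUrl: "./chat-box.component.html" })
export class ChatBoxComponent implements OnInit, OnDestroy {
  private sub: Subscription;

  constructor(private chat: ChatService) {}

  ngOnInit() {
    // Subscribe once; a fresh subscription on every route change is
    // what creates the duplicate messages
    this.sub = this.chat.getMessages().subscribe((data) => {
      // handle the incoming message
    });
  }

  ngOnDestroy() {
    // Unsubscribing runs the teardown returned inside the Observable,
    // which disconnects the socket
    this.sub.unsubscribe();
  }
}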
