Pass scope in node between different files/scripts - node.js

I am using Mocha to unit test some of the Kue (Redis) functions of my Node server. Of course, Mocha starts its own process, which has its own instance and is scope-protected from the Node server I have running on the same physical machine (my laptop in this case, since I am in a dev environment).
My question is: how can the Mocha process check any variables or events from the main Node server? I realize I could get tricky and have Node push variables to Redis and then poll Redis from Mocha, but that seems arduous.

You can't. Since Mocha launches in a separate Node process, any JavaScript variables, global or local, are scoped to that individual Node process. Your Redis suggestion is correct: either send a message from process to process, or literally launch the Node server inside your Mocha framework.
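If you go the second route, child_process.fork() gives the test an IPC channel to the server process. A minimal sketch, assuming a server.js that has been modified to call process.send() for the events you care about (the "ready" and "job:complete" message shapes below are hypothetical):

    // test/server.ipc.test.js
    const { fork } = require('child_process');
    const assert = require('assert');

    describe('kue job events', function () {
      let server;

      before(function (done) {
        // fork() opens an IPC channel between this test process and the server process.
        server = fork('./server.js');
        server.once('message', (msg) => {
          if (msg.type === 'ready') done(); // hypothetical: server.js calls process.send({ type: 'ready' })
        });
      });

      after(function () {
        server.kill();
      });

      it('reports a completed job', function (done) {
        server.on('message', (msg) => {
          if (msg.type === 'job:complete') { // hypothetical event relayed by server.js
            assert.ok(msg.id);
            done();
          }
        });
        // ...enqueue a Kue job here, e.g. via an HTTP endpoint or a shared Redis instance
      });
    });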

Related

In memory redis server for test

I am writing integration tests for my NodeJS application that connects to a Redis cluster. The test framework I use is Mocha. Is it possible to set up Redis as an in-memory database that I can use only for testing, and which wipes away all my keys when the tests are done?
Check out https://github.com/mhassan1/redis-memory-server : a Redis server for testing. It lets you connect your favorite client library to the Redis server and run parallel integration tests isolated from each other.
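A minimal Mocha sketch of how that can look, assuming the RedisMemoryServer / getHost() / getPort() / stop() API documented in that project's README and the ioredis client (any Redis client works the same way):

    const { RedisMemoryServer } = require('redis-memory-server');
    const Redis = require('ioredis');
    const assert = require('assert');

    describe('integration tests against a throwaway Redis', function () {
      this.timeout(30000); // the first run may download/build a Redis binary
      let redisServer;
      let client;

      before(async function () {
        redisServer = new RedisMemoryServer();
        const host = await redisServer.getHost();
        const port = await redisServer.getPort();
        client = new Redis({ host, port });
      });

      after(async function () {
        if (client) client.disconnect();
        if (redisServer) await redisServer.stop(); // the keys disappear with the server
      });

      it('reads back what it wrote', async function () {
        await client.set('greeting', 'hello');
        assert.strictEqual(await client.get('greeting'), 'hello');
      });
    });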

NodeJS start mongoDB server

I need to start the MongoDB server from my NodeJS application. I managed to do this before, but for some reason I forgot how. I thought I used a child process, but I'm not sure anymore, as I can't get anything to work at the moment.
How would I start the MongoDB server (the mongod command) from within my NodeJS app and execute some other code once the server has started (I'm guessing using a promise...)?
You can use child_process to run mongod from your application, but this may cause the MongoDB server to exit when your app exits. It's generally better to have the DB server running all the time.
https://nodejs.org/api/child_process.html#child_process_child_process_exec_command_options_callback
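If you do want to start it from Node anyway, here is a minimal sketch using child_process.spawn(), assuming mongod is on the PATH and the data directory already exists. The readiness log text varies between MongoDB versions, so a more robust approach is to retry connecting with your driver until it succeeds:

    const { spawn } = require('child_process');

    function startMongo(dbPath = './data/db') {
      return new Promise((resolve, reject) => {
        const mongod = spawn('mongod', ['--dbpath', dbPath]);

        // mongod logs a "waiting for connections" line once it accepts clients.
        mongod.stdout.on('data', (chunk) => {
          if (/waiting for connections/i.test(chunk.toString())) {
            resolve(mongod);
          }
        });

        mongod.on('error', reject); // e.g. mongod is not installed / not on PATH
        mongod.on('exit', (code) => {
          if (code !== 0) reject(new Error(`mongod exited with code ${code}`));
        });
      });
    }

    startMongo()
      .then((proc) => {
        console.log('mongod is up, pid', proc.pid);
        // ...connect your driver and run the rest of your startup code here
      })
      .catch(console.error);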

Websockets inside a PM2 cluster, ok in production?

Before going to production, we want to make sure that this is an "as expected behavior".
I conducted an experiment by launching 4 child processes using a PM2 cluster (I have 4 cores on my machine), which means there were 4 websocket processes running...
Then on the client I created multiple sockets, and sent many messages to the server. One thing I didn't expect was that Node was able to figure out what child process the socket belonged to, meaning that every message sent by the client was console logged by the correct child process.
It seems like the main worker in the cluster keeps track of what sockets belong where.
So is this managed by Nodejs internally by the "cluster" module?
Also is this ok to use in production?
P.S. For websockets we use the "ws" module for Node.js.
I asked the same question on GitHub and got an answer...
Also, please look into using ClusterWS; it's awesome!
https://github.com/ClusterWS/ClusterWS/issues/143
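For illustration, a minimal sketch using Node's built-in cluster module (which PM2's cluster mode builds on) together with ws. It reproduces the behavior described above: each TCP connection is accepted by exactly one worker, so all messages on that socket are handled, and logged, by that same worker.

    const cluster = require('cluster');
    const http = require('http');
    const os = require('os');
    const WebSocket = require('ws');

    if (cluster.isMaster) {
      // With the default round-robin policy, the master accepts connections
      // and hands each one off to a single worker.
      for (let i = 0; i < os.cpus().length; i++) cluster.fork();
    } else {
      const server = http.createServer();
      const wss = new WebSocket.Server({ server });

      wss.on('connection', (socket) => {
        socket.on('message', (msg) => {
          // The worker that accepted the connection sees all of its messages.
          console.log(`worker ${process.pid} got: ${msg}`);
        });
      });

      server.listen(8080);
    }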

Is there a way to run a node task in a child process?

I have a node server, which needs to:
Serve the web pages
Keep querying an external REST API, saving data to the database, and sending data to clients when certain updates come from the REST API.
Task 1 is just a normal Node task, but I don't know how to implement task 2. It won't expose any interface to the outside; it's more like a background task.
Can anybody suggest? Thanks.
To make a second node.js app that runs at the same time as your first one, you can just create another node.js app and then run it from your first one using child_process.spawn(). It can regularly query the external REST API and update the database as needed.
It's not entirely clear what you're trying to do in the part about "send data to clients for certain updates from REST API".
If you're using socket.io to send data to connected browsers, then the browsers have to be connected to your web server which I presume is your first node.js process. To have the second node.js process cause data to be sent through the socket.io connections in the first node.js process, you need some interprocess way to communicate. You can use stdout and stdin via child_process.spawn(), you can use some feature in your database or any of several other IPC methods.
Because querying a REST API and updating a database are both asynchronous operations, they don't take much of the CPU of a node.js process. As such, you don't really have to do these in another node.js process. You could just have a setInterval() in your main node.js process, query the API every once in a while, update the database when results are received and then you can directly access the socket.io connections to send data to clients without having to use a separate process and some sort of IPC mechanism.
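A minimal sketch of that single-process approach, assuming Express and socket.io; the endpoint URL, the polling interval, and the saveToDatabase() step are placeholders:

    const express = require('express');
    const http = require('http');
    const { Server } = require('socket.io');

    const app = express();
    const server = http.createServer(app);
    const io = new Server(server);

    // Task 1: serve the web pages.
    app.use(express.static('public'));

    // Task 2: poll the external REST API, persist, and push updates to clients.
    async function pollExternalApi() {
      try {
        const res = await fetch('https://api.example.com/updates'); // hypothetical endpoint; global fetch needs Node 18+
        const updates = await res.json();
        // await saveToDatabase(updates); // hypothetical persistence step
        io.emit('updates', updates); // push to every connected socket.io client
      } catch (err) {
        console.error('poll failed:', err);
      }
    }

    setInterval(pollExternalApi, 30 * 1000); // every 30 seconds
    server.listen(3000);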
Task 1:
Express is a good way to accomplish this task.
You can explore:
http://expressjs.com/
Task 2:
Once you are done with Express, you can write your logic within the Express framework.
This task can then be done with the node module forever. It's a simple tool that keeps your background scripts running continuously (whether they are written in Node.js or not).
Have a look:
https://github.com/foreverjs/forever

Does Socket.IO fork or spawn a new process when run?

I have a node application that uses Socket.IO for the messaging.
And I run it using
node --expose_gc /path/to/app.js
Now, when I check with the htop utility, I notice that instead of 1, I am getting multiple processes for the same command.
Can someone, in noob terms, explain to me why and what is going on here? I'm also worried that it may consume unexpected amounts of memory/CPU.
socket.io does not fork or spawn any child processes.
Usually, subprocesses that run Node.js are spawned via the cluster module, but socket.io does no such thing.
It just adds a handler on top of an HTTP server.
socket.io is just a library that hooks into a web server and listens for certain incoming requests (those requests that initiate a webSocket/socket.io connection). Once a socket.io connection is initiated, it just uses normal socket programming to send/receive messages.
It does not start up any additional processes by itself.
Your multiple processes are either because you accidentally started your own app multiple times without shutting it down or there is something else in your app that is starting up multiple processes. socket.io does not do that.
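For reference, this is what "hooks into a web server" looks like in practice; a minimal sketch using the current socket.io Server API, all of it running in a single process:

    const http = require('http');
    const { Server } = require('socket.io');

    const httpServer = http.createServer();
    const io = new Server(httpServer); // attaches to the existing server; nothing is forked

    io.on('connection', (socket) => {
      socket.on('message', (msg) => {
        socket.emit('message', `echo: ${msg}`); // plain event handling, same process
      });
    });

    httpServer.listen(3000);
    console.log('single process, pid', process.pid); // only one Node process is created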
