Modules initialization in node.js and persisting their state between requests - node.js

My scenario: I am going to load a small amount of configuration data, plus some rarely changing data, from the database into, say, exports.config. I want to use this instead of a config file so that the app admin (not a sysadmin :) can configure the application via a web interface, and I want to make sure that this data will not be reloaded every time this module is require'd.
Am I right to assume that whatever [initialization] code I have in a node.js module (outside of function definitions) will be executed only once per process lifetime, regardless of how many times I require this module?
Probably a stupid question, but I am struggling to understand some aspects of how node.js functions.

Yes.
The file/module can be required many times per process lifetime, but will be executed only once. At least by default.
This works out nicely for you because you can simply query your config table once at app initialization and the exported values will be constant until the app is restarted.
From the Node.js module caching docs:
Modules are cached after the first time they are loaded. This means (among other things) that every call to require('foo') will get exactly the same object returned, if it would resolve to the same file.
Multiple calls to require('foo') may not cause the module code to be executed multiple times. This is an important feature. With it, "partially done" objects can be returned, thus allowing transitive dependencies to be loaded even when they would cause cycles.
If you want to have a module execute code multiple times, then export a function, and call that function.
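For your config case, a minimal sketch of that pattern might look like the following (the ./db module, its query() method, and the app_config table are hypothetical placeholders): load the config once, cache it at module level, and expose a reload() for the admin UI.

// config.js -- illustrative sketch; ./db and the app_config table are hypothetical
var db = require('./db'); // assumed to expose query() returning a Promise

var cache = null; // module-level state: created once per process, shared by every require('./config')

function load() {
  if (cache) return Promise.resolve(cache);
  return db.query('SELECT key, value FROM app_config').then(function (rows) {
    cache = {};
    rows.forEach(function (row) { cache[row.key] = row.value; });
    return cache;
  });
}

// Call this from the admin UI handler after new settings are saved.
function reload() {
  cache = null;
  return load();
}

module.exports = { load: load, reload: reload };

Every file that calls require('./config') gets this same cached object, so the query runs once per process (or again only when reload() is called).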

Related

In an isomorphic Redux app, is it better practice to keep API calls small, or to send over all information in one go?

I am building a sports data visualization application with server-side rendering in React (ES6)/Redux/React-Router-Redux. At the top there is a class-based App component, and there are two different class-based component routes (everything under those is a stateless functional component), structured as follows:
App
|__ Index (/)
|__ Match (/match/:id)
When a request is made for a given route, one API call is dispatched, containing all information for the given route. This is hosted on a different server, where we're using Restify and Sequelize ORM. The JSON object returned is roughly 12,000 to 30,000 lines long and takes anywhere from 500ms to 8500ms to return.
Our application, therefore, takes a long time to load, and I'm thinking that this is the main bottleneck. I have a couple options in mind.
Separate this huge API call into many smaller API calls. Although, since JS is single-threaded, I'd have to measure the speed of the render to find out if this is viable.
Attempt lazy loading by dispatching a new API call when a new tab is clicked (each match has several games, all in new tabs)
Am I on the right track? Or is there a better option? Thanks in advance, and please let me know if you need any more examples!
This depends on many things including who your target client is. Would mobile devices ever use this or strictly desktop?
From what you have said so far, I would opt for "lazy loading".
Either way, you generally never want any app to force a user to wait at all, especially not for over 8 seconds.
You want your page to be sent and to show something that works as quickly as possible. This means you don't want to have to wait until all the data resolves before your UI can be hydrated. (This is what will have to happen if you are truly server-side rendering, because in many situations your client application would be built and delivered at least a few seconds before the data is resolved and sent over the line.)
If you have mobile devices with spotty network connections, they will likely never see this page due to timeouts.
It looks like paginating and lazy loading based on accessing other pages might be a good solution here.
In this situation you may also want to look into persisting the data and caching. This is a pretty big undertaking and might be more complicated than you would want. I know some colleagues who might use libraries to handle most of this stuff for them.
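As a rough illustration of the lazy-loading idea (assuming redux-thunk middleware; the endpoint, action types, and state shape are invented), an action creator that fetches a single game's data only when its tab is first opened might look like this:

// Hypothetical thunk -- /api/matches/:id/games/:gameId is an illustrative endpoint
export function fetchGame(matchId, gameId) {
  return (dispatch, getState) => {
    // Skip the request if this tab's data is already in the store
    if ((getState().games || {})[gameId]) return Promise.resolve();

    dispatch({ type: 'GAME_REQUEST', gameId });
    return fetch(`/api/matches/${matchId}/games/${gameId}`)
      .then(res => res.json())
      .then(data => dispatch({ type: 'GAME_SUCCESS', gameId, data }));
  };
}

// In the tab component: onClick={() => this.props.dispatch(fetchGame(matchId, gameId))}

That keeps the initial payload down to what the first view actually needs and spreads the rest of the 12,000 to 30,000 lines across user-initiated requests.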

Node.js shared object or singleton to ensure a single source of truth

I'm writing a Node app that will generate an array of JMS queue subscriptions based on configuration pulled from a backend db. Upon initialization, I will retrieve the current state from the backend, but I then need to support inbound API calls to the Node server to add and remove subscriptions on the fly (and I'll then update the list stored in the backend).
There must only ever be one instance of this subscription array available in the entire application, and it must be the source of truth.
Example:
On init, I'd get from the API:
{
  "amysCar": {
    "name": "2016 Lamborghini Gallardo LP 550-2",
    "queue": "/queues/amysracingstats"
  },
  "bobsCar": {
    "name": "1967 Ford Mustang Shelby GT500",
    "queue": "/queues/bobsrestorationproject"
  }
}
Then let's say Bob decides he no longer wants to publish his restoration project; he'll make an API call to the Node server DELETE /projects/bobsCar. I'll remove the queue listener from the queues array and update the backend db.
In practice, I need to be able to access the JMS queues array from multiple controllers because the controller that handles the API calls is separate from the controller that handles queue subscriptions.
I am aware that singletons are notoriously difficult to write and manage in Node, are frowned upon, and are allegedly completely unnecessary because Node.js module caching supposedly only ever loads a single instance of a module. If that's correct, I could simply store the array of queue subscriptions as a private object in a module, and then require it in the API call controller.
Is this true? Can I use this ability to meet my requirement?
If so, how would I overwrite the initialization of the module so that I can make that initial backend call to get the current state?
Am I overthinking this?
Is this true? Can I use this ability to meet my requirement?
Yes, it is true. Just use a module property. Because of module caching, there will only be one such property.
If so, how would I overwrite the initialization of the module so that I can make that initial backend call to get the current state?
I would assume that the module itself would just make the back-end call to initialize the state. It can do that when first loaded or via the first module constructor call.
Am I overthinking this?
You seem to already know about module caching and how module properties can work as singletons so if you're still trying to avoid using that capability, then that would be overthinking it. But, if you're just confirming how things work before relying on it, no problem with that.
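A minimal sketch of that approach (the ./backend client and its method names are placeholders): keep the subscription object private to one module, let that module fetch the initial state, and require it from both controllers.

// subscriptions.js -- illustrative; ./backend and its methods are hypothetical
var backend = require('./backend');

var subscriptions = {};   // module-private; one instance per process thanks to require caching
var initPromise = null;

// The first caller triggers the backend fetch; later callers get the same promise.
function init() {
  if (!initPromise) {
    initPromise = backend.getSubscriptions().then(function (data) {
      subscriptions = data;           // e.g. the amysCar/bobsCar object above
      return subscriptions;
    });
  }
  return initPromise;
}

function getAll() {
  return subscriptions;
}

// e.g. called from the DELETE /projects/bobsCar handler
function remove(key) {
  delete subscriptions[key];
  return backend.saveSubscriptions(subscriptions);
}

module.exports = { init: init, getAll: getAll, remove: remove };

The API controller and the queue controller both require('./subscriptions') and see the same object; call init() once during app startup, or let whichever controller loads first trigger it.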

Output something once per request

I'm creating a module that exports a method that may be called several times by any node.js code that uses it. The method will usually be called from views, and it will output some html/css/js. Some of this html/css/js, however, only needs to be output once per page, so I'd like to output it only the first time the module's method is called per request. I can handle doing it only the first time the method is ever called, but since the method can be called several times across several requests for as long as the server is up, I specifically want to run some specific code only once per page.
Furthermore, I want to do this while requiring the user to pass as little to my method as possible. If they pass the request object when creating the server I figure I can put a variable in there that will tell me if my method was already called or not. Ideally though I'd like to avoid even that. I'm thinking something like the following from within my module:
var http = require('http');
http.Server.on('request', function(request, response){
    console.log('REQUEST EVENT FIRED!');
    // output one-time css
});
However, this doesn't work; I assume it's because I'm not actually pointing to the Server emitter that was/may have been created in the script that was originally called. I'm new to node.js, so any ideas, clues or help is greatly appreciated. Thanks!
Setting a variable on the request is an accepted pattern. Or on the response, if you don't even want to pass the request to your function.
One more thing you can do is, indeed, as you write, have the app add a middleware and have that middleware output that one-time content.
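A minimal sketch of the flag-on-the-response pattern (the flag name is arbitrary; res.locals assumes Express, but any ad-hoc property on the request or response object works the same way):

// widget.js -- your module; emits the shared css/js only once per response
exports.render = function (res) {
  var html = '';
  if (!res.locals._widgetAssetsSent) {                      // arbitrary flag name
    res.locals._widgetAssetsSent = true;
    html += '<link rel="stylesheet" href="/widget.css">';   // one-time css/js
  }
  html += '<div class="widget">...</div>';                  // markup output on every call
  return html;
};

Because the flag lives on the response object, it resets automatically with every new request, which is exactly the once-per-page behaviour you want.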
I'm not sure I completely understand your "problem", but what you are trying to achieve sounds like building a web application using Node.js. I think you should use one of the web frameworks available for Node so you can avoid reinventing the wheel (writing routing, static file serving, etc. yourself).
The Express framework is a nice place to start. You can find tons of tutorials around the internet and it has a strong community: http://expressjs.com/

Node.js best practice for require

I am creating an Express application and using Mongoose to save data.
I created a User model (username & password) that needs to be saved, so I use require('mongoose') in Models/user-model.js. In the route for User I want to be able to get all the users or just find some, so I need to require('mongoose') there as well. Also, in the main js file (app.js) I create a connection to the database, so there is a require('mongoose') there too.
It all works well, but is it the best way to require mongoose in all of these files? Or is there a better way to do this?
Well, "best" is hard to say for certain. But, what you're doing is a common-enough practice and should be fine in most cases.
The 1st time mongoose is required, it'll be cached for subsequent requires:
Modules are cached after the first time they are loaded. This means (among other things) that every call to require('foo') will get exactly the same object returned, if it would resolve to the same file.
Multiple calls to require('foo') may not cause the module code to be executed multiple times. [...]
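So the pattern in the question is fine: every file that requires mongoose gets the same cached instance, and the single connection opened in app.js is shared. Roughly (the three files are shown in one snippet; the connection string and schema are illustrative):

// ---- app.js ----
var mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/myapp');   // illustrative connection string

// ---- Models/user-model.js ----
var mongoose = require('mongoose');              // same cached module object as in app.js
var userSchema = new mongoose.Schema({ username: String, password: String });
module.exports = mongoose.model('User', userSchema);

// ---- routes/users.js ----
var User = require('../Models/user-model');
// The connection opened in app.js is reused here automatically.
module.exports = function listUsers(req, res) {
  User.find({}, function (err, users) {
    if (err) return res.status(500).send(err);
    res.json(users);
  });
};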
With module caching in mind... do you think it is a good idea to create a global APP object, store things like models and libraries on it during bootstrap, and then pass that object to other modules? Or should we just require the module dependencies where needed, even if our modules end up with a lot of require() statements?

Get 2 userscripts to interact with each other?

I have two scripts. I put them in the same namespace (the @namespace field).
I'd like them to interact with one another.
Specifically, I want script A to set RunByDefault to 123, have script B check whether RunByDefault==123 or not, and then have script A use a timeout or anything to call a function in script B.
How do I do this? I'd hate to merge the scripts.
The scripts cannot directly interact with each other, and // @namespace is just there to resolve script-name conflicts. (That is, you can have two different scripts named "Link Remover" only if they have different namespaces.)
Separate scripts can swap information using:
Cookies -- works same-domain only
localStorage -- works same-domain only
Sending and receiving values via AJAX to a server that you control -- works cross-domain.
That's it.
Different running instances, of the same script, can swap information using GM_setValue() and GM_getValue(). This technique has the advantage of being cross-domain, easy, and invisible to the target web page(s).
See this working example of cross-tab communication in Tampermonkey.
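For illustration, a sketch of that value-based handshake between two instances of one script (the @match pattern, key name, and polling interval are arbitrary):

// ==UserScript==
// @name     Cross-tab flag demo (illustrative)
// @match    *://example.com/*
// @grant    GM_setValue
// @grant    GM_getValue
// ==/UserScript==

// One instance (say, the tab where the user clicks) publishes the flag:
document.addEventListener('click', function () {
  GM_setValue('RunByDefault', 123);
});

// Every instance of this script, in every tab, polls the shared value store:
setInterval(function () {
  if (GM_getValue('RunByDefault', 0) === 123) {
    console.log('RunByDefault was set by another instance of this script');
  }
}, 500);

Remember this only works between instances of the same script; two different scripts each get their own value store.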
On Chrome, and only Chrome, you might be able to use the non-standard FileSystem API to store data on a local file. But this would probably require the user to click for every transaction -- if it worked at all.
Another option is to write an extension (add-on) to act as a helper and do the file IO. You would interact with it via postMessage, usually.
In practice, I've never encountered a situation where it wasn't easier and cleaner to just merge any scripts that really need to share data.
Also, scripts cannot share code, but they can inject JS into the target page and both access that.
Finally, AFAICT, scripts always run sequentially, not in parallel. But you can control the execution order from the Manage User Scripts panel.
