Lazy loading modules - BoilerplateJS

In BoilerplateJS it looks like modules are preloaded (refer to the code below):
return [
    require('./baseModule/module'),
    require('./sampleModule2/module'),
    require('./customerModule/module'),
    require('./orderSearchModule/module'),
    require('./orderListModule/module'),
    require('./mainMenuModule/module')
];
What is the impact of this when it comes to large-scale (module-heavy) web applications? Is there any way to lazy load modules in BoilerplateJS?

JavaScript has no reflection-like mechanism to discover code to load. Any module that needs to be loaded has to be registered somewhere. This is why modules (sub-contexts) are loaded as below:
appContext.loadChildContexts(moduleContexts);
(in src/application.js)
But the code you have above is about RequireJS AMD modules. This is basically the import of different JS scripts (each representing the code for a module). With RequireJS AMD, scripts (your source code) are not lazy loaded, but preloaded. This makes sense, since your application source code should be available in the browser for execution. On the other hand, when you do JS optimization, you create a single script containing all your source code anyway. Then there is no meaning in having separate script files, or in lazy loading of the source code.
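For illustration, a minimal r.js build configuration might look like the following (file names here are hypothetical). The optimizer traces every require() call and concatenates the modules into one output script, at which point there are no separate files left to lazy load:
// build.js - run with: node r.js -o build.js
({
    baseUrl: './src',
    name: 'application',       // entry module that pulls in the rest
    out: 'build/app-built.js'  // single optimized output script
})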
But lazy loading should be applied to behaviors (not to loading code). This is why the UI components (ViewTemplate) are created only in the 'activate()' method. They are created only when the user demands them. That is lazy loading of behavior, so that application rendering time stays low.
this.activate = function(parent, params) {
    // if the panel is not created yet, create it and initiate bindings
    if (!panel) {
        panel = new Boiler.ViewTemplate(parent, template, nls);
        ...
    }
    vm.initialize(params.name);
    panel.show();
}

Related

Is it bad practice to have a requirejs package with no main script?

Let's say I have a bunch of JS view scripts defined as AMD modules in a directory 'views'.
Rather than listing them all out in the RequireJS config, I could just do this:
require = {
    baseUrl: 'js',
    packages: [
        { name: 'views', location: 'app/views' }
        ...
    ],
    ...
}
I then require them as ['views/sunset', 'views/ocean'] (or './ocean' if from another view), etc.
This saves me a whopping 20 seconds or so versus listing them all out individually in the require config, and arguably makes my define() calls more expressive (i.e. it's clear which scripts are components, which are utilities, etc.).
Essentially I'm treating the directory as a package, but there is no main script, so require(['views']) would return a 404. Is there any reason why this approach might be considered bad practice? Are there issues with this that I'm not seeing?
In my previous job we did not have a main.js file, because each page had different modules that were loaded dynamically based on the content of the page. I didn't notice any issues with that.
Your solution seems to be similar. If you do not have any errors, it's fine :)
main.js is not required, but it is recommended by RequireJS, from what I remember.
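For illustration, a per-page bootstrap without a main.js could look like this (the data attribute and module ids are hypothetical):
// Hypothetical bootstrap: each page declares its own modules, e.g.
// <body data-modules="views/sunset,views/ocean">, and we require
// whatever the current page lists.
var names = (document.body.getAttribute('data-modules') || '')
    .split(',').filter(Boolean);
require(names, function () {
    Array.prototype.slice.call(arguments).forEach(function (view) {
        view.render(); // assumes each view module exposes render()
    });
});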

How do the core modules in node.js work? (https://github.com/nodejs/node/blob/master/lib)

Does the node interpreter look for core modules (let's say "fs") within the node binary? If yes, are these modules packaged as JS files? Are the core modules that are referenced within our code converted to C/C++ code first and then executed? For example, I see a method in the _tls_common.js (https://github.com/nodejs/node/blob/master/lib/_tls_common.js) file called "loadPKCS12", and the only place I see this method being referenced/defined is within the "node_crypto.cc" file (https://github.com/nodejs/node/blob/master/src/node_crypto.cc). So how does node link a method in JavaScript with one defined in a C/C++ file?
Here is the extract from the _tls_common.js file that makes use of the "loadPKCS12" method:
    if (passphrase) {
      c.context.loadPKCS12(buf, toBuf(passphrase));
    } else {
      c.context.loadPKCS12(buf);
    }
  }
} else {
  const buf = toBuf(options.pfx);
  const passphrase = options.passphrase;
  if (passphrase) {
    c.context.loadPKCS12(buf, toBuf(passphrase));
  } else {
    c.context.loadPKCS12(buf);
There are two different (but seemingly related) questions asked here. The first one is: "How do the core modules work?". The second is: "How does NodeJS let C++ code get referenced and executed in JavaScript?". Let's take them one by one.
How do the core modules work?
The core modules are packaged with the NodeJS binary. And, while they are packaged with the binary, they are not converted to C++ code before packaging. The internal modules are loaded into memory during bootstrap of the node process. When a program executes, let's say, require('fs'), the require function simply returns the already loaded module from the cache. The actual loading of the internal module obviously happens in C++ code.
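You can observe the caching behaviour directly:
// Core modules are loaded during bootstrap; require() simply hands
// back the cached instance, so both references are the same object.
const fs1 = require('fs');
const fs2 = require('fs');
console.log(fs1 === fs2); // true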
How does NodeJS let C++ code get referenced in JS?
This ability comes partly from the V8 engine, which exposes the ability to create and manage JS constructs in C++, and partly from NodeJS / libuv, which create a wrapper on top of V8 to provide the execution environment. The documentation about such node modules can be accessed here. As the documentation states, these C++ modules can be used in a JS file by requiring them, like any other ordinary JS module.
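For example, a compiled addon is consumed from JS like any ordinary module (the path and the exported function are hypothetical; node-gyp places build output under build/Release by default):
// Load the compiled C++ addon and call one of its exports:
const addon = require('./build/Release/addon');
addon.hello(); // implemented in C++, invoked like any JS function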
Your example of the use of a C++ function in JS (loadPKCS12), however, is a more special case: internal C++ functionality of NodeJS. loadPKCS12 is called on an object of SecureContext, imported from the crypto C++ module. If you follow the link to the SecureContext import in _tls_common.js above, you will see that crypto is not loaded using require(); instead a special (global) method, internalBinding, is used to obtain the reference. At the last line of the node_crypto.cc file, the initializer for the internal module crypto is registered. Following the chain of initialization, node::crypto::Initialize calls node::crypto::SecureContext::Initialize, which creates a function template, assigns the appropriate prototype methods, and exports it on the target. Eventually these exported functionalities from the C++ world are imported and used in the JS world using internalBinding.
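The import in _tls_common.js therefore looks roughly like this (simplified sketch; note that internalBinding is available only inside Node's own lib/ files, not to user code):
// Inside Node's lib/_tls_common.js (simplified):
const { SecureContext } = internalBinding('crypto');
// SecureContext is the C++ class registered in node_crypto.cc; its
// prototype methods, including loadPKCS12, are implemented in C++.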

RequireJS Multi Injecting

I am building a modular single-page application which consumes multiple require config files from different sources. I would like a way in my application to consume a list of all modules of a specific type, something like this:
define('module-type/an-implementation',...)
define('module-type/another-implementation',...)
require('module-type/*', function(modules){
    $.each(modules, function(m){ m.doStuff(); });
})
This is similar to the pattern dependency injectors use for multi-injection (e.g. https://github.com/ninject/ninject/wiki/Multi-injection).
Is there a way to do this (or something similar) with require?
RequireJS doesn't know which modules exist until something requires them. Once a module is required / depended upon, RequireJS will figure out where to request the module from, based on the module's name and RequireJS's configuration. Once the module is loaded, it can be examined / executed to find out its dependencies and handle them in turn, until all dependencies are loaded and all module bodies are executed.
In order to be able to "consume a list of all modules of a specific type" something would need to be able to find all such modules. RequireJS doesn't have any means to know which modules exist, so it alone wouldn't be enough to implement "Multi Injection".
Speculation
Some kind of module registry could be created and populated with the help of the build system: e.g. a file (say, module-registry.js) could be generated each time a file in the source directory is added, removed or renamed. Multi-injection could then be possible like:
multiRequire('module-type/*', function(modules){
    $.each(modules, function(m){ m.doStuff(); });
})
which in turn would call
require(findModules(pattern), function() {
    callback(Array.prototype.slice.call(arguments, 0));
});
(where multiRequire and findModules are provided by the module registry).
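A minimal sketch of such a registry, under the assumption that a build step regenerates the names array (all ids and helper names here are hypothetical):
// module-registry.js - hypothetical generated file
define(['require'], function (require) {
    var names = [
        'module-type/an-implementation',
        'module-type/another-implementation'
    ];
    // return registered module ids matching a prefix pattern
    // such as 'module-type/*'
    function findModules(pattern) {
        var prefix = pattern.replace(/\*$/, '');
        return names.filter(function (name) {
            return name.indexOf(prefix) === 0;
        });
    }
    function multiRequire(pattern, callback) {
        require(findModules(pattern), function () {
            callback(Array.prototype.slice.call(arguments, 0));
        });
    }
    return { findModules: findModules, multiRequire: multiRequire };
});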

Pattern for lazy-loading modules on demand

In my application I need to load modules not at the initialization stage (by enumerating them in ./modules/modules), but on demand later, based on some condition (e.g. user authorization results). Imagine I want to provide User A with a calculator module, and User B with a text editor module.
Let's take the boilerplatejs example app for simplicity, and assume that sampleModule1 and sampleModule2 are to be loaded on demand.
So I remove the modules from the initial loading sequence in src\modules\modules.js:
return [
    require('./baseModule/module'),
    /*require('./sampleModule1/module'),
    require('./sampleModule2/module')*/
];
and add a control to the summary page (src\modules\baseModule\landingPage\view.html) to load them on demand:
<div>
    Congratulations! You are one step closer to creating your next large scale Javascript application!
</div>
<div>
    <select id="ModuleLoader">
        <option value="">Select module to load...</option>
        <option value="./modules/sampleModule1/module">sampleModule1</option>
        <option value="./modules/sampleModule2/module">sampleModule2</option>
    </select>
</div>
Then I patch src\modules\baseModule\module.js to pass the context to the LandingPageComponent (for some reason it doesn't in the original source code):
controller.addRoutes({
    "/" : new LandingPageComponent(context)
});
and finally add the loading code to src\modules\baseModule\landingPage\component.js:
$('#ModuleLoader').on('change', function(){
    require([this.value], function(mod){
        moduleContext.loadChildContexts([mod]);
        alert('Module enabled. You can now use it.');
    });
});
This appears to be working, but is this the best way to do it?
Does it handle context loading properly?
How to protect against loading the same module twice?
Smart thinking here. I too was working on improving BoilerplateJS to lazy-load modules as plugins over the last few days. I did something similar, and you can access the POC code at:
https://github.com/hasith/boilerplatejs
In the POC, I'm trying to achieve 'lazy loading' + 'configurable UI' at the same time.
Each screen (called an application) is a collection of components (plugins) placed on a layout. These application definitions are just JSON objects, either returned dynamically from a server API or defined statically as JSON files (as in the POC). In the POC, application definitions are stored at "/server/application".
These applications can now be invoked dynamically by convention. For example, "/##/shipping-app" will look for an application definition with the same name at "/server/application".
Application loading happens via a new component I have at '/modules/baseModule/appShell'. Anything starting with '/##/' is routed to this component, which then tries to load the application definition, by convention, from the "/server/application" folder.
When the appShell receives the application definition (as JSON), it also loads the components (plugins) defined in it dynamically. These pluggable components are placed in 'src/plugins' and are likewise loaded by convention.
A sample application definition looks like this:
{
    "application-id" : "2434",
    "application-name" : "Order Application",
    "application-layout" : "dummy-layout.css",
    "components" : [
        {
            "comp-name" : "OrderDetails",
            "layout-position" : "top-middle"
        },
        {
            "comp-name" : "ShippingDetails",
            "layout-position" : "bottom-middle"
        }
    ]
}
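A minimal sketch of how the appShell side might consume such a definition, assuming components resolve by convention from 'src/plugins' (the function and the activate signature are hypothetical):
// Hypothetical appShell loading logic
function loadApplication(appDef) {
    var paths = appDef.components.map(function (comp) {
        return 'plugins/' + comp['comp-name'] + '/component';
    });
    require(paths, function () {
        Array.prototype.slice.call(arguments).forEach(function (comp, i) {
            // hand each component its configured layout position
            comp.activate(appDef.components[i]['layout-position']);
        });
    });
}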
The benefits of the approach are as follows:
Application interfaces are component-driven and configurable at runtime
Different application definitions can be sent to the user depending on the user's role, etc. (logic in the backend)
Applications can be created on the fly by combining already existing components
No static code changes are needed in the 'framework' to add new applications or components, since loading is based on convention
These are very initial thoughts. I appreciate any feedback!
You can protect against loading the same module twice by using a named function for the change event and unbinding that function after executing it.
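A sketch of that approach applied to the handler above (jQuery's .off() does the unbinding; RequireJS additionally caches loaded modules, so a repeated require() of the same path would not fetch it again):
// Named handler: unbind it after a successful load so the same
// control cannot trigger a second load of the module.
function onModuleSelected() {
    var path = this.value;
    if (!path) { return; }
    $('#ModuleLoader').off('change', onModuleSelected);
    require([path], function (mod) {
        moduleContext.loadChildContexts([mod]);
        alert('Module enabled. You can now use it.');
    });
}
$('#ModuleLoader').on('change', onModuleSelected);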

On-demand require()

Say I create a library in ./libname which contains one main file (main.js) and multiple optional library files which are occasionally used with the main object (a.js and b.js).
I create an index.js file with the following:
exports.MainClass = require('./main.js').MainClass; // shortcut
exports.a = require('./a.js');
exports.b = require('./b.js');
And now I can use the library as follows:
var lib = require('./libname');
lib.MainClass;
lib.a.Something; // Here I need the optional utility object
lib.b.SomeOtherThing;
However, that means I always load 'a.js' and 'b.js', not only when I really need them.
Sure I can manually load the optional modules with require('./libname/a.js'), but then I lose the pretty lib.a dot-notation :)
Is there a way to implement on-demand loading with some kind of index file? Maybe, some package.json magic can play here well?
This may be possible if you have "MainClass" dynamically load the additional modules on demand. But I suspect this will also mean an adjustment in the API used to access the module.
I guess your motivation is to "avoid" the extra processing used by "non-required modules".
But remember Node is single-threaded, so the memory footprint of loading a module is not per-connection, it's per-process. Loading a module is a one-off to get it into memory.
In other words, the modules are only ever loaded when you start your server, not every time someone makes a request.
I think you are looking at this from a client-side programming perspective, where it's advantageous to load scripts only when they are required, to save on both processing and HTTP requests.
On the server, the most you will be saving is a few extra bytes in memory.
It looks like the only way is to use getters. In short, like this:
module.exports = {
    MainClass : require('./main.js').MainClass,
    // each getter defers the require() until first property access
    get a() { return require('./a.js'); },
    get b() { return require('./b.js'); }
};
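With that index.js, an optional file is only read on first property access, and require()'s module cache keeps subsequent accesses cheap. For example (property names as in the question):
var lib = require('./libname');
lib.MainClass;    // main.js was loaded when index.js ran
lib.a.Something;  // a.js is loaded here, on first access
lib.a.Something;  // served from require()'s cache from now on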
