What's the use of 'Buffer.isBuffer' when you could use 'instanceof'? - node.js

I don't understand what the purpose of the Buffer.isBuffer function is when instanceof works like a charm:
var b = new Buffer('blabla')
assert.ok(b instanceof Buffer)

Well, actually these are the same (currently at least):
-- lib/buffer.js:
Buffer.isBuffer = function isBuffer(b) {
  return util.isBuffer(b);
};
-- lib/util.js:
function isBuffer(arg) {
  return arg instanceof Buffer;
}
exports.isBuffer = isBuffer;
... so the only possible reason is readability. Note that before this specific implementation there was a set of macros for type checks, used when building the source, but this was changed in this commit, with the following reasoning:
Adding macros to Node's JS layer increases the barrier to
contributions, and it breaks programs that export Node's js files for
userland modules. (For example, several browserify transforms, my
readable-streams polyfill, the util-debuglog module, etc.) These are
not small problems.
I'd suggest checking the whole discussion in the commit's pull request.
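For illustration, a quick side-by-side of both checks (using Buffer.from, since new Buffer() is deprecated in modern Node):
const buf = Buffer.from('blabla');
console.log(Buffer.isBuffer(buf));      // true
console.log(buf instanceof Buffer);     // true
console.log(Buffer.isBuffer('blabla')); // false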

Related

Node.js Globalize es6 modules to act like ImportScripts

The question is simple: how do we make ES6 modules act like the importScripts function used in web browsers?
Explanation
The main reason is to soften the blow for developers as they change their code from ES5 syntax to ES6, so that the transition does not blow up your code the moment you make the change and you find a thousand errors due to missing inclusions. It also gives people the option to stay as-is indefinitely if they don't want to make the full change at all.
Desired output
ImportScript(a file path, or several) can be applied globally (implicitly) across subsequently required code, and vice versa, inside a main file to avoid explicit inclusion in all files.
ES6 Inclusion
This still does not ignore the fact that all your libraries will depend on the module format as well, so it is inevitable that we will still have to include an export statement in every file we need to require. However, this should not limit our ability to have a main file that interconnects them all, without having to explicitly add includes to every file whenever you need a certain functionality.
DISCLAIMERS (numbered):
(Security) I understand there are many reasons that modules exist, and going around them is not advisable for security reasons and load times. However, I am not sure about the risk (if any) of using a method like eval() to include such scripts if you are only doing it once at the start of an application's life, and on a constant value that does not accept external input. The theory is that if an external entity is able to change the initial state of your program as it launches, then your system has already been compromised. So as it stands, I think the whole argument of globalization vs. modules boils down to the project being done (the security/speed needed) and preference/risk.
(Not for everyone) This is a utility; I am not implying that everyone should use it.
(Already published works) I have searched a lot for this functionality, but I am not infallible. So if a simple usage of this has already been done that follows this specification (or a simpler one), I'd love to know how/where I can attain such code. Then I will promptly mark that as the answer, or just remove this thread entirely.
Example Code
ES5 Way
const fs = require('fs');
let path = require('path');
/* Only accepts scripts with global variables and functions;
   does not work with classes unless they are declared with var. */
function include(f) {
  eval.apply(global, [fs.readFileSync(f).toString()]);
}
Main file Concept example:
ImportScript("filePath1");loaded first
ImportScript("filePath2");loaded second
ImportScript("filePath3");loaded third
ImportScript("filePath4");loaded fourth
ImportScript("filePath5");loaded fifth
ImportScript("someExternalDependency");sixth
/* where "functionNameFromFile4" is a function defined in
file4 , and "variableFromFile2" is a global dynamic
variable that may change over the lifetime of the
application.
*/
functionNameFromFile4(variableFromFile2);
/* current context has access to previous scripts contexts
and those scripts recognize the current global context as
well in short: All scripts should be able to access
variables and functions from other scripts implicitly
through this , even if they are added after the fact
*/
Typical exported file example (Covers all methods of export via modules):
/*where "varFromFile1" is a dynamic variable created in file1
that may change over the lifetime of the application and "var" is a
variable of type(varFromFile4) being concatenated/added together
with "varFromFile4".
*/
functionNameFromFile4(var){
return var+varFromFile1;
}
//Typical export statement
exportAllHere;
/* This is just an example and does not cover all usage cases;
   it merely illustrates the possible functionality. */
CONCLUSION
So you still need to export from the files as required by the ES6 standard; however, you only need to import them once in a main file to globalize their functionality across all scripts.
I'm not personally a fan of globalizing all the exports from a module, but here's a little snippet that shows you how one ESM module's exports can be all assigned to the global object:
Suppose you had a simple module called operators.js:
export function add(a, b) {
  return a + b;
}

export function subtract(a, b) {
  return a - b;
}
You can import that module and then assign all of its exported properties to the global object with this:
import * as m from "./operators.js";
for (const [prop, value] of Object.entries(m)) {
  global[prop] = value;
}
// can now access the exports globally
add(1, 2);
FYI, I think the syntax:
include("filePath1")
is going to be tough in ESM modules because dynamic imports in an ESM module using import() (which is presumably what you would have to use to implement the include() function you show) are asynchronous (they return a promise), not synchronous like require().
I wonder if a bundler or a transpiler would be an option?
There is experimental work in Node.js related to custom loaders here: https://nodejs.org/api/esm.html#hooks.
If you can handle your include() function returning a promise, here's how you put the above code into that function:
async function include(moduleName) {
  const m = await import(moduleName);
  for (const [prop, value] of Object.entries(m)) {
    global[prop] = value;
  }
  return m;
}
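For completeness, a hedged usage sketch of that promise-returning include(), reusing the operators.js example from above (it assumes an ESM entry point where top-level await is allowed):
await include("./operators.js");
console.log(add(1, 2));      // 3, now reachable via the global object
console.log(subtract(5, 2)); // 3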

node.js most performant way to skip a part of code

In compiled languages like C we have a preprocessor that can be used to skip parts of a program without compiling them, effectively just excluding them from the source code:
#ifdef SHOULD_RUN_THIS
/* this code does not always run */
#endif
So that if SHOULD_RUN_THIS is not defined, then the code will never be run.
In node.js we don't have a direct equivalent of this, so the first thing I can imagine is
if (config.SHOULD_RUN_THIS) {
  /* this code does not always run */
}
However, in Node there is no way to guarantee that config.SHOULD_RUN_THIS will never change, so the if (...) check will be performed each time in vain.
What would be the most performant way to rewrite it? I can think of
a) create a separate function to allow V8 optimizations:
function f(args) {
  if (config.SHOULD_RUN_THIS) {
    /* this code does not always run */
  }
}
// ...
f(args);
b) create a variable to store the function and set it to an empty function when not needed:
var f;
if (config.SHOULD_RUN_THIS) {
  f = (args) => {
    /* this code does not always run */
  };
}
else {
  f = function () {}; // does nothing
}
// ...
f(args);
c) do not create a separate function, just leave it in place:
if (config.SHOULD_RUN_THIS) {
  /* this code does not always run */
}
What is the most performant way? Maybe some other way...
I personally would adopt ...
if (config.SHOULD_RUN_THIS) {
  require('/path/for/conditional/module');
}
The module code is only required where needed; otherwise it is not even loaded into memory, let alone executed.
The only downside is that it is not readily clear which modules are being required, since your require statements are not all positioned at the top of the file.
ES6 modules adopt this dynamic module request approach.
PS: using config like this is great since you can, for example, use an environment variable to determine your code path. That is great when spinning up, for example, a bunch of Docker containers that you want to behave differently depending on the env vars passed to the docker run statements.
Apologies for this insight if you are not a Docker fan :) and apologies, I am waffling now!
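To make that concrete, here is a minimal sketch of the idea; the ENABLE_METRICS variable and the ./metrics module are hypothetical names, not something from the question:
// Hypothetical example: the module is only read, parsed and executed when the env var is set.
if (process.env.ENABLE_METRICS === 'true') {
  require('./metrics');
}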
If you're looking for a preprocessor for your JavaScript, why not use a preprocessor for your JavaScript? It's Node-compatible and appears to do what you need. You could also look into writing a plugin for Babel or some other JS mangling tool (or V8 itself!)
If you're looking for a way to do this inside the language itself, I'd avoid any optimizations that target a single engine like V8 unless you're sure that's the only place your code will ever run. Otherwise, as has been mentioned, try breaking out the conditional code into a separate module so it's only loaded if it's necessary for it to run.

Better way to require.extensions with Node.js

I'm testing a bunch of React JSX components. They all need to be transpiled with React, or Babel or whatever, but we have special needs for stubbing requirements, so I'm trying to override requires with a special compiler that's run with Mocha. The solution below works well, but you'll notice that we're using require.extensions[] to capture all the .jsx files. What concerns me is that require.extensions is locked and deprecated. Is there any better way to do this?
// Install the compiler.
require.extensions['.jsx'] = function(module, filename) {
  return module._compile(transform(filename), filename);
};
Here's the whole transpiler for reference:
// Based on https://github.com/Khan/react-components/blob/master/test/compiler.js
var fs = require('fs'),
    ReactTools = require('react-tools');

// A module that exports a single, stubbed-out React Component.
var reactStub = 'module.exports = require("react").createClass({render:function(){return null;}});';

// Should this file be stubbed out for testing?
function shouldStub(filename) {
  if (!global.reactModulesToStub) return false;
  // Check if the file name ends with any stub path.
  var stubs = global.reactModulesToStub;
  for (var i = 0; i < stubs.length; i++) {
    if (filename.substr(-stubs[i].length) == stubs[i]) {
      console.log('should stub', filename);
      return true;
    }
  }
  return false;
}

// Transform a file via JSX/Harmony or stubbing.
function transform(filename) {
  if (shouldStub(filename)) {
    delete require.cache[filename];
    return reactStub;
  } else {
    var content = fs.readFileSync(filename, 'utf8');
    return ReactTools.transform(content, {harmony: true});
  }
}

// Install the compiler.
require.extensions['.jsx'] = function(module, filename) {
  return module._compile(transform(filename), filename);
};
And some links to similar solutions...
https://github.com/danvk/mocha-react/issues/1
https://github.com/Automattic/jsx-require-extension
https://www.npmjs.com/package/node-jsx
https://github.com/olalonde/better-require
http://mochajs.org/#usage
http://nodejs.org/api/globals.html#globals_require_extensions
A solution can be forked from here:
https://github.com/danvk/mocha-react
There are two reasons that API has been deprecated. One, the Node module resolution algorithm is VERY complicated: it has to look at the specified file; if that doesn't exist, it looks for the file with each of the possible extensions in the keys of require.extensions; and if it's a directory, it looks for a package.json or index.js. Oh, and don't forget, if there is no ./ at the beginning, it looks in the node_modules directory, moving up to the parent directory if it can't be found in that node_modules. Ryan Dahl said he regrets making it so complicated in his talk at JSConf 2018, and uses a much simpler module resolution algorithm in his deno project. Two, it needs more filesystem calls if there are more extensions in require.extensions, because it has to match extensionless files.
A solution to the second problem is require-extension. I haven't used it myself, but it abstracts the require.extensions API and makes it much more performant.
There is no other way to do this, and this is how everybody does transpiling (Babel, etc). #uni_nake's answer - to use node-hook - is OK in that it hides this from you, but it essentially uses the same mechanism: a look in its code shows that it uses Module._extensions, which is the same as require.extensions, as shown by a test I wrote: https://github.com/giltayar/playing/blob/1f04f6ddc1a0028974b403b4d1974afa872edba4/javascript/node/test/is-module-extensions-same-as-require-extensions.test.js
So final answer - I would assume that nobody at Node will break Babel, and if they do, they will probably give another solution for the same problem. I would not hesitate to use it!
I use node-hook to stub all .scss calls in my tests.
You'll see from the docs that when a matching file is required, the string you return is executed instead; it is really quite powerful, as it also passes you the original source.
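From memory, node-hook's API is roughly hook(extension, transformFn); treat the exact names as an assumption and check its docs. A rough sketch of stubbing .scss files:
// Rough sketch, assuming node-hook exposes hook(extension, transformFn).
var hook = require('node-hook');
hook.hook('.scss', function(source, filename) {
  // Ignore the original source and execute this string instead.
  return 'module.exports = {};';
});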
Hope that is what you're looking for.
I think you should use pirates.
I think the PR where the use of require.extensions in babel-register was replaced with pirates would be helpful:
https://github.com/babel/babel/pull/3670/files#diff-75a0292ed78043766c2d5564edd84ad2L85-L93
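As a rough sketch of how the .jsx hook from the question might look with pirates - assuming its addHook(fn, { exts }) API and reusing the transform(filename) helper from above:
// Hedged sketch: pirates passes the hook the file contents and filename;
// this sketch ignores the passed-in code and reuses transform(filename) from the question.
const { addHook } = require('pirates');
const revert = addHook(
  function (code, filename) { return transform(filename); },
  { exts: ['.jsx'] }
);
// Calling revert() later removes the hook again.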
Hope that is what you're looking for.

how to pass a shared variable to downstream modules?

I have a top-level Node myapp variable that contains some key application state - loggers, db handlers and some other data. The modules downstream in the directory hierarchy need access to this data. How can I set up a key/value system in Node to do that?
A highly upticked and accepted answer in Express: How to pass app-instance to routes from a different file? suggests using, in a lower level module
//in routes/index.js
var app = require("../app");
But this injects hard-coded knowledge of the directory structure and file names, which should be a bigger no-no, IMHO. Is there some other method, like something native to JavaScript? Nor do I relish the idea of declaring variables without var.
What is the Node way of making a value available to objects created in lower scopes? (I am very much new to Node and all things Node aren't yet obvious to me.)
Thanks a lot.
Since using node global (docs here) seems to be the solution that OP used, thought I'd add it as an official answer to collect my valuable points.
I strongly suggest that you namespace your variables, so something like
global.myApp.logger = { /* info here */ };
global.myApp.db = {
  url: 'mongodb://localhost:27017/test',
  connectOptions: {}
};
If you are in app.js and just want to allow access to it
global.myApp = this;
As always, use globals with care...
This is not really related to node but rather general software architecture decisions.
When you have client and server modules/packages/classes (call them whichever way you like), one way is to define routines on the server module that take as arguments whichever state data your client keeps in the 'global' scope, complete their tasks, and report back to the client with the results.
This way it is perfectly decoupled, and you have strict control over what data goes where.
Hope this helps :)
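A minimal sketch of that idea, with hypothetical report.js/caller file names and a myApp object standing in for the question's top-level state:
// report.js (hypothetical "server" module): receives the state it needs as arguments.
module.exports.runReport = function(logger, db) {
  logger.info('running report against ' + db.url);
  return { ok: true }; // report the result back to the caller
};

// caller (hypothetical "client" code): passes down only what the module needs.
var report = require('./report');
var result = report.runReport(myApp.logger, myApp.db);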
One way to do this is with an anonymous function - i.e. instead of assigning an object to module.exports, assign a function that returns an appropriate value.
So, let's say we want to pass var1 down to our two modules, ./module1.js and ./module2.js. This is how the module code would look:
module.exports = function(var1) {
  return {
    doSomething: function() { return var1; }
  };
};
Then, we can call it like so:
var downstream = require('./module1')('This is var1');
Giving you exactly what you want.
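Extending that, a hedged sketch of wiring the same shared state into both ./module1.js and ./module2.js from the top-level file (the contents of the shared object are made up for illustration):
// Both factories receive (and close over) the same shared object.
var shared = { logger: console, dbUrl: 'mongodb://localhost:27017/test' };
var module1 = require('./module1')(shared);
var module2 = require('./module2')(shared);
console.log(module1.doSomething()); // prints the shared object passed in as var1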
I just created an empty module and installed it under node_modules as appglobals.js
// index.js
module.exports = {};
// package.json too is barebones
{ "name": "appGlobals" }
And then strut it around without fearing refactoring in the future:
var g = require("appglobals");
g.foo = "bar";
I wish it came with built-in setters/getters, but the flexibility has to be admired.
(Now I only need to figure out how to package it for production)
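If you do want setter/getter behaviour on such a shared module, here is a small sketch using Object.defineProperty (the config property is just an example):
var g = require("appglobals");
var _config = null; // backing field kept in this file's closure
Object.defineProperty(g, "config", {
  get: function() { return _config; },
  set: function(value) { _config = value; }
});
g.config = { env: "production" };
console.log(g.config.env); // "production"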

Using a global variable in Node.js for dependency injection

I'm starting out a long term project, based on Node.js, and so I'm looking to build upon a solid dependency injection (DI) system.
Although Node.js at its core implies using simple module require()s for wiring components, I find this approach not best suited for a large project (e.g. requiring modules in each file is not that maintainable, testable or dynamic).
Now, I'd done my bits of research before posting this question and I've found out some interesting DI libraries for Node.js (see wire.js and dependable.js).
However, for maximal simplicity and minimal repetition I've come up with my own proposition of implementing DI:
You have a module, di.js, which acts as the container and is initialized by pointing to a JSON file storing a map of dependency names and their respective .js files.
This already provides a dynamic nature to the DI, as you may easily swap test/development dependencies.
The container can return dependencies by using an inject() function, which finds the dependency mapping and calls require() with it.
For simplicity, the module is assigned to a global variable, i.e. global.$di, so that any file in the project may use the container/injector by calling $di.inject().
Here's the gist of the implementation:
File di.js
module.exports = function(path) {
  var deps = require(path); // keep the map in a closure so inject() can see it
  return {
    inject: function(name) {
      if (!deps[name])
        throw new Error('dependency "' + name + '" isn\'t registered');
      return require(deps[name]);
    }
  };
};
Dependency map JSON file
{
  "vehicle": "lib/jetpack",
  "fuel": "lib/benzine",
  "octane": "lib/octane98"
}
Initialize the $di in the main JavaScript file, according to development/test mode:
var path = 'dep-map-' + process.env.NODE_ENV + '.json';
$di = require('di')(path);
Use it in some file:
var vehicle = $di.inject('vehicle');
vehicle.go();
So far, the only problem I could think of using this approach is the global variable $di.
Supposedly, global variables are a bad practice, but it seems to me like I'm saving a lot of repetition for the cost of a single global variable.
What can be suggested against my proposal?
Overall this approach sounds fine to me.
The way global variables work in Node.js is that when you declare a variable without the var keyword, it gets added to the global object, which is shared between all modules. You can also explicitly use global.varname. Example:
vehicle = "jetpack"
fuel = "benzine"
console.log(vehicle) // "jetpack"
console.log(global.fuel) // "benzine"
Variables declared with var will only be local to the module.
var vehicle = "car"
console.log(vehicle) // "car"
console.log(global.vehicle) // "jetpack"
So in your code if you are doing $di = require('di')(path) (without var), then you should be able to use it in other modules without any issues. Using global.$di might make the code more readable.
Your approach is a clear and simple one, which is good. Whether you have a global variable or require your module every time is not important.
Regarding testability, it allows you to replace your modules with mocks. For unit testing you should add a function that makes it easy to apply different mocks for each test - something that extends your dependency map temporarily.
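For example, a minimal sketch of such a temporary-override helper, reusing the di.js shape from the question (the override/reset names are hypothetical):
module.exports = function(path) {
  var deps = require(path);
  var overrides = {}; // test-only replacements, keyed by dependency name
  return {
    inject: function(name) {
      if (overrides[name]) return overrides[name]; // a registered mock wins during a test
      if (!deps[name])
        throw new Error('dependency "' + name + '" isn\'t registered');
      return require(deps[name]);
    },
    override: function(name, mock) { overrides[name] = mock; },
    reset: function() { overrides = {}; }
  };
};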
For further reading, I can recommend a great blog article on dependency injection in Node.js, as well as a talk on the future dependency injector of angular.js, which is designed by some serious masterminds.
BTW, you might be interested in Fire Up! which is a dependency injection container I implemented.
