When I do for..in, can I dispense with the if (obj.hasOwnProperty(key)) {...} check in Node.js if I am careful never to modify Object.prototype myself?
Or if I require() some third party package that happens to modify Object.prototype, does that screw up the Object.prototype for my module too?
Can I dispense with the if (obj.hasOwnProperty(key)) {...} check in Node.js if I am careful never to modify Object.prototype myself?
No, because Object is global. If anything in any module you use (or any module they use, by proxy) messes with Object.prototype, it's broken for your code regardless of what you do. The property check is damage control: it guards against someone mucking with the prototype you're implicitly inheriting from.
node_modules/breakit.js
Object.prototype.foobar = () => "borked";
module.exports = true;
Then run node:
> require('breakit');
true
> let a = {};
> for ( let k in a ) { console.log(k) }
foobar
> a.foobar();
borked
It's worth noting that while the .hasOwnProperty check is necessary for the for .. in loop, the loop itself is usually unnecessary and better avoided in most cases. The Airbnb ESLint style guide, for instance, bans for .. in over objects altogether in favour of iterating the object's own keys.
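For example, a minimal sketch of that alternative, which only visits the object's own enumerable keys, so anything glued onto Object.prototype is never seen:

const obj = { a: 1, b: 2 };

Object.keys(obj).forEach((key) => {
  console.log(key, obj[key]); // a 1, b 2 - no inherited "foobar"
});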
There is nothing specific in Node that would warrant not having this check. If you are already using lodash, you can use _.forOwn(): https://lodash.com/docs#forOwn
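A rough sketch of what that looks like (the iteratee receives the value first, then the key); _.forOwn only visits own enumerable properties, so an injected Object.prototype.foobar is skipped:

const _ = require('lodash');

const obj = { a: 1, b: 2 };

_.forOwn(obj, (value, key) => {
  console.log(key, value); // a 1, b 2
});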
Another option is:
Object.getOwnPropertyNames(yourObj).forEach(function(keyName) {
  yourObj[keyName]; // do stuff
});
Or continue doing the 'if' check. There are a few options, but I would definitely keep one of these checks in; your choice which.
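For reference, the guarded loop being discussed looks something like this (using the .call form, which also survives objects that shadow or lack their own hasOwnProperty):

const obj = { a: 1, b: 2 };

for (const key in obj) {
  // skip enumerable properties inherited from the prototype chain
  if (Object.prototype.hasOwnProperty.call(obj, key)) {
    console.log(key, obj[key]);
  }
}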
Related
The documentation for --frozen-intrinsics says:
Only the root context is supported. There is no guarantee that globalThis.Array is indeed the default intrinsic reference. Code may break under this flag
I couldn't understand this. Can someone help me understand this in simple words with an example?
Background: I was checking nicolo-ribaudo/jest-light-runner where there is a mention of --frozen-intrinsics.
When you use --frozen-intrinsics, all the built-in JavaScript objects and functions are recursively frozen, except for globalThis.
If you run node --frozen-intrinsics, you can check it:
> Object.isFrozen(Array)
true
> Object.isFrozen(Array.prototype)
true
> Object.isFrozen(globalThis);
false
> Array.isArray = () => true; Array.isArray(2); // you cannot modify built-in properties, this will not return true
false
> globalThis.foo = 3; foo; // you can still define new globals
3
> globalThis.Array = 4; Array; // However, you can also replace existing globals
4
This prevents your code from accidentally modifying globals, so I recommended it in jest-light-runner to prevent tests from accidentally influencing each other (since it doesn't isolate test files like Jest's default runner does).
Note that you still have a communication channel by attaching new properties to the global object, so it's not an isolation mechanism.
Now, let's take a look at the docs you quoted.
Only the root context is supported.
In Node.js, you can create multiple "global contexts" using the vm built-in module. --frozen-intrinsics does not affect those contexts, so if you run node with --frozen-intrinsics:
> Object.isFrozen(Array)
true
> Object.isFrozen(require("vm").runInNewContext("Array"))
false
There is no guarantee that globalThis.Array is indeed the default intrinsic reference.
As mentioned earlier, globalThis.Array could still be replaced. However, if you have a "safe reference" to an object (either using syntax, or by storing the original Array global in a local variable), you can be sure that it's not modified:
let OriginalArray = Array;
require("untrusted-module");
globalThis.Array; // this might have been replaced
[].push; // this is still the original one
OriginalArray.isArray; // this is still the original one
Code may break under this flag
If code needs to modify built-ins, it would obviously stop working. For example, you cannot use polyfills that modify the global objects.
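For instance, a typical polyfill pattern like this (a made-up method name, not any real package) no longer takes effect: in strict-mode code the assignment throws, and in sloppy mode it fails silently.

// polyfill.js - run with: node --frozen-intrinsics polyfill.js
'use strict';

if (!Array.prototype.myLast) {
  // Array.prototype is frozen, so this throws something like:
  // TypeError: Cannot add property myLast, object is not extensible
  Array.prototype.myLast = function () {
    return this[this.length - 1];
  };
}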
When using WebAssembly, there is a callback for onRuntimeInitialized(). You basically can't do anything until it happens.
So if you have a library that is implemented in WebAssembly, you have to say:
var mylib = require('mylib')
mylib.onRuntimeInitialized = function() {
  ...
  // Anything that wants to use *anything* from mylib
  // (doesn't matter if it's synchronous or asynchronous)
  ...
}
On the plus side, you're not making Node wait to do any initialization that might not rely on mylib...so other modules can be doing fetches or whatever they need. On the negative side, it's pretty bad ergonomics--especially if everything you're doing depends on this library.
One possibility might seem to be to fold the initialization waiting into a promise, and then wait on it:
var mylib = require('mylib')
await mylib.Startup()
But people apparently write about how much they don't like the idea of top-level await. And my opinion on it is fairly irrelevant either way, as it's not allowed. :-/
So is there really no way to hold up code at the top level besides wrapping the whole app in a callback?
One thing to note with Node is that require() returns the same object no matter which file it is called from. Order does matter, but it will be the same object in all files.
So in your main index.js you could do something like
var myLib = require('mylib')
myLib.libby = myLib.initialize()
and then in another file, doesStuff.js, you can do:
const libby = require('mylib').libby;
module.exports = function doStuff() {
/* do stuff with initialized libby object */
}
Typically the way this works is that the function in doesStuff.js is not called until everything is initialized, e.g. when a web route is handled. So your server is already running, and libby will be initialized and ready to use by the time it's called.
If you have something that absolutely cannot fail, like a server that must not run unless the DB connection succeeds, then waiting is appropriate. So yes, in that case you'd need to wrap everything (at least the core of your actions) in a callback so your server knows when it's safe to start.
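A minimal sketch of that "wrap the core" approach, assuming an Express server and the onRuntimeInitialized hook from the question (mylib and mylib.compute are placeholders):

const express = require('express');
const mylib = require('mylib'); // hypothetical wasm-backed library

const app = express();

// Routes can be registered up front; they only run per request,
// after the server has started listening.
app.get('/compute', (req, res) => {
  res.json({ result: mylib.compute(42) }); // placeholder API
});

// Only start accepting requests once the wasm runtime is ready.
mylib.onRuntimeInitialized = function () {
  app.listen(3000, () => console.log('listening on 3000'));
};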
I got to the point where I'd like to have a factory to manage all the dependencies for my modules in a single place, instead of having require statements scattered all over my code.
I've looked at some approaches that rely on AMD, but I'd like to know how to do it using the Node.js/Express combination with the out-of-the-box module loader, which I think uses CommonJS.
I've been thinking of doing something like this:
module.exports = {
  lib: {},
  load: function(name) {
    if (this.lib[name] !== undefined && this.lib[name] !== null) {
      return this.lib[name];
    }
    switch (name) {
      case 'express':
        this.lib[name] = require('express');
        break;
      case 'morgan':
        this.lib[name] = require('morgan');
        break;
      case 'body-parser':
        this.lib[name] = require('body-parser');
        break;
    }
    console.log(this.lib);
    return this.lib[name];
  }
};
Some people say that's more than a factory, it's a mediator pattern; either way, I just wanted to illustrate my point.
My basic requirement is to handle all the dependencies from a single place in the system: if I need to change a dependency, I just change it in this file and the change propagates through the whole system automatically.
So is there a better way to handle this? Is there an existing implementation that takes this approach?
Thanks!
Technically this is what require() does internally.
require('foo'); require('foo')
guarantees that it will load and run foo only once. The second call returns a cached copy from its internal module cache.
You can achieve the same naming indirection (and an API adapter, if you ever decide to change the implementation without changing callers) by requiring JS files or node modules of your own that re-export the modules you actually use (e.g. require('./my-express-wrapper') instead of require('express')).
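A minimal sketch of such a wrapper (the file name is made up):

// my-express-wrapper.js - the single place that picks the real dependency
module.exports = require('express');

// callers everywhere else:
// const express = require('./my-express-wrapper');

If you ever swap the implementation, only the wrapper changes; callers keep requiring the same path.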
if I need to change a dependency I just change it on this file and automatically updates through the whole system.
I'd be concerned that it will cause code to be surprising:
require('factory').load('body-parser'); // loads Formidable!?
I see little benefit in having such a layer of indirection:
Even in the best case of a drop-in replacement it saves very little work, because a project-global find'n'replace of require('foo') with require('bar') is an easy task in most text editors.
The hard part of replacing a module (which is unlikely to be 100% compatible) is getting existing code to work correctly with it. This is not avoided by use of the factory pattern. You'll need to write an adapter either way, and sometimes it may even be better to change uses of the module everywhere than to write an emulation layer for an API that probably wasn't good anyway.
I have a node toplevel myapp variable that contains some key application state - loggers, db handlers and some other data. The modules downstream in directory hierarchy need access to these data. How can I set up a key/value system in node to do that?
A highly upvoted and accepted answer in Express: How to pass app-instance to routes from a different file? suggests using, in a lower-level module:
//in routes/index.js
var app = require("../app");
But this injects hard-coded knowledge of the directory structure and file names, which should be a bigger no-no IMHO. Is there some other method, like something native to JavaScript? Nor do I relish the idea of declaring variables without var.
What is the node way of making a value available to objects created in lower scopes? (I am very much new to node and all-things-node aren't yet obvious to me)
Thanks a lot.
Since using Node's global (docs here) seems to be the solution the OP went with, I thought I'd add it as an official answer to collect my valuable points.
I strongly suggest that you namespace your variables, so something like:
global.myApp = global.myApp || {};
global.myApp.logger = { /* your logger instance here */ };
global.myApp.db = {
  url: 'mongodb://localhost:27017/test',
  connectOptions: {}
};
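Any downstream module can then read from that namespace without knowing where app.js lives (a hypothetical route file; assumes the logger object has an info method):

// routes/users.js
const { logger, db } = global.myApp;

module.exports = function listUsers(req, res) {
  logger.info('querying ' + db.url);
  // ... query the database and respond here
};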
If you are in app.js and just want to allow access to it
global.myApp = this;
As always, use globals with care...
This is not really related to Node but rather to general software architecture decisions.
When you have client and server modules/packages/classes (call them whatever you like), one approach is to define routines on the server module that take as arguments whatever state your client keeps in 'global' scope, complete their tasks, and report the results back to the client.
This way, it is perfectly decoupled and you have strict control over what data goes where.
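As a rough sketch of that idea (all names here are made up), the module exposes functions that receive the state explicitly instead of reaching for globals:

// server-module.js - receives whatever state it needs as arguments
exports.createReport = function (db, logger, query) {
  logger.info('building report: ' + query);
  return db.run(query); // hand the result back to the caller
};

// client side - keeps its own state and passes it in explicitly
const server = require('./server-module');
const logger = { info: (msg) => console.log(msg) }; // stand-in logger
const db = { run: (q) => [{ query: q }] };          // stand-in db handle
const results = server.createReport(db, logger, 'all-users');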
Hope this helps :)
One way to do this is to export a function - i.e. instead of assigning an object to module.exports, assign a function that returns an appropriate value.
So, let's say we want to pass var1 down to our two modules, ./module1.js and ./module2.js. This is how the module code would look:
module.exports = function(var1) {
  return {
    doSomething: function() { return var1; }
  };
};
Then, we can call it like so:
var downstream = require('./module1')('This is var1');
Giving you exactly what you want.
I just created an empty module and installed it under node_modules as appglobals:
// index.js
module.exports = {};
// package.json too is barebones
{ "name": "appGlobals" }
And then strut it around without fearing refactoring in future:
var g = require("appglobals");
g.foo = "bar";
I wish it came built in as setter/getter, but the flexibility has to be admired.
(Now I only need to figure out how to package it for production)
I'm using some external libraries intended to be used in a browser and they set global variables implicitly like a='a' (without the var).
It seems like when I require certain scripts that do this, sometimes the variable will be accessible outside its scope just like in a browser, but for other scripts the global variable is not accessible outside its own script.
Anyone know how nodejs handles implicit global variables, and why I'm seeing somewhat random behavior? I found surprisingly little on the internet.
I can go into the scripts and write something like
if (typeof exports !== 'undefined' && this.exports !== exports) {
  var GLOBAL = global;
} else {
  var GLOBAL = window;
}
and then change all implicit references to GLOBAL.reference, but these scripts are not my own, and every time I want to get the latest version of them I would have to do this over again, which is clearly not desirable.
Using module.exports would be cleaner, because then I don't have to change all the references but just add a section at the top of every file that exports the globals; still, my original question about how Node handles implicit globals is relevant.
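To illustrate what I mean by the module.exports route (a sketch with made-up names; the export block has to come after the definitions it refers to, so I've placed it at the end here):

// vendor/somelib.js - third-party browser script, body left untouched
SomeLib = { version: '1.0' }; // implicit global, as shipped

// --- added section: re-export the would-be globals ---
if (typeof module !== 'undefined' && module.exports) {
  module.exports.SomeLib = SomeLib;
}

// my own code would then do:
// const { SomeLib } = require('./vendor/somelib');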
I am not sure if this answer will help you, since it is hard to diagnose what is going on in your code, but maybe some of this reasoning can help you find the actual problem.
The behavior in Node is actually similar to that of the browser: if you declare a variable without the var keyword, the variable will be accessible through the global object.
//module foo.js
a = 'Obi-wan';
//module bar.js
require('./foo');
console.log(global.a); //yields Obi-wan
console.log(a); //yields Obi-wan
It is not clear why you say this behavior is not consistent in your code, but if you think about it, the use of global variables is precisely subject to this kind of problem: since they are global, anything could overwrite them at any time, producing exactly these unexpected conditions.
There is one aspect in which node is different from the browser though and that could be affecting the behavior that you see.
In the browser, if you do something like this directly in a JavaScript file:
console.log(this==window); //yields true
But if you do the same thing in a Node.js module:
console.log(this==global); //yields false
Basically, in the outer scope of a Node.js module the this reference points to the current module.exports object.
console.log(this==exports); //yields true
So, chances are that if you are putting data in the global scope (window) in the browser through the use of this, you may end up with a module scope in Node.js instead.
Interestingly, the code inside a function in Node.js behaves pretty much as in the browser, in terms of the use of the global scope.
(function what(){
  console.log(this==global); //yields true
})();
This does not directly answer your question, but it provides a solution, since I don't think what you're asking for is possible.
I love regexes. They are so powerful:
js = js.replace(/^(\t|\s{4})?(var\s)?(\w+)\s=/gm, function () {
  if (arguments[1] || arguments[2]) return (arguments[1] || '') + (arguments[2] || '') + arguments[3] + ' =';
  return 'exports.' + arguments[3] + ' =';
});
JSFiddle here
How does it work? I will retrace my work:
/(\w+)\s=/g will match any assignment; return 'exports.' + arguments[1] + ' ='; turns it into an export. Not very good.
/(var\s)?(\w+)\s=/g will still match any assignment, but in the callback we examine the first group (var\s). Is it undefined? Then we should export it; otherwise nothing should happen. But what about scopes?
/^(\t|\s{4})?(var\s)?(\w+)\s=/gm now we use indentation to determine the scope :)
You should be able to run this regex on your file. Be careful: the file needs to be properly indented, and be aware that I might have forgotten some cases.
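A quick before/after on a toy file (assuming tab or 4-space indentation, as noted above; indented and var-declared assignments are left alone):

// before
greeting = 'hello';
var counter = 0;
function init() {
    cache = {};
}

// after
exports.greeting = 'hello';
var counter = 0;
function init() {
    cache = {};
}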
Ah, the problem was that the global variable being declared globally in the browser wasn't being declared via a='a', but with var a='a'. In a browser, if the var keyword is used outside a function, it still declares a global variable; it only declares a local variable when used inside a function. Node.js doesn't behave this way: all top-level var declarations are local to the module.
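To make the difference concrete (a minimal sketch):

// foo.js, loaded with require('./foo') in Node
var a = 'a';
b = 'b'; // no var
console.log(global.a); // undefined - var is scoped to the module
console.log(global.b); // 'b' - the implicit assignment still leaks

// the same two assignments at the top level of a browser <script>
// would give window.a === 'a' and window.b === 'b'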
It's a shame Node.js does this; it makes it less compatible with browser scripts, for no real reason (other than sparing people from having to wrap all their scripts in a function).