Cannot browserify exports default statement [duplicate] - node.js

Before upgrading Babel, it would add the line module.exports = exports["default"]. It no longer does this. What this means is that before I could do:
var foo = require('./foo');
// use foo
Now I have to do this:
var foo = require('./foo').default;
// use foo
Not a huge deal (and I'm guessing this is what it should have been all along).
The issue is that I have a lot of code that depended on the way that things used to work (I can convert most of it to ES6 imports, but not all of it). Can anyone give me tips on how to make the old way work without having to go through my project and fix this (or even some instruction on how to write a codemod to do this would be pretty slick).
Thanks!
Example:
Input:
const foo = {}
export default foo
Output with Babel 5:
"use strict";
Object.defineProperty(exports, "__esModule", {
value: true
});
var foo = {};
exports["default"] = foo;
module.exports = exports["default"];
Output with Babel 6 (and es2015 plugin):
"use strict";
Object.defineProperty(exports, "__esModule", {
value: true
});
var foo = {};
exports["default"] = foo;
Notice that the only difference in the output is the module.exports = exports["default"].
Edit
You may be interested in this blogpost I wrote after solving my specific issue: Misunderstanding ES6 Modules, Upgrading Babel, Tears, and a Solution

If you want CommonJS export behavior, you'll need to use CommonJS directly (or use the plugin in the other answer). This behavior was removed because it caused confusion and led to invalid ES6 semantics, which some people had relied on, e.g.
export default {
a: 'foo'
};
and then
import {a} from './foo';
which is invalid ES6 but worked because of the CommonJS interoperability behavior you are describing. Unfortunately supporting both cases isn't possible, and allowing people to write invalid ES6 is a worse issue than making you do .default.
The other issue was that it was unexpected for users if they added a named export in the future, for example
export default 4;
then
require('./mod');
// 4
but
export default 4;
export var foo = 5;
then
require('./mod')
// {'default': 4, foo: 5}

You can also use this plugin to get the old export behavior back.
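If the plugin in question is babel-plugin-add-module-exports (an assumption on my part; the original answer only links to it), the setup is just an extra entry in your .babelrc alongside the es2015 preset:
{
  "presets": ["es2015"],
  "plugins": ["add-module-exports"]
}
With that in place the compiled output should again end with module.exports = exports["default"], as in Babel 5.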

If you are a library author, you may be able to work around this problem.
I usually have an entry point, index.js, which is the file I point to from the main field in package.json. It does nothing other than re-export the actual entry point of the lib:
export { default } from "./components/MyComponent";
To work around the babel issue, I changed this to an import statement and then assigned the default to module.exports:
import MyComponent from "./components/MyComponent";
module.exports = MyComponent;
All my other files stay as pure ES6 modules, with no workarounds. So only the entry point needs a slight change :)
This will work for commonjs requires, and also for ES6 imports because babel doesn't seem to have dropped the reverse interop (commonjs -> es6). Babel injects the following function to patch up commonjs:
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
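So a consumer of the package (a hypothetical package name, just to illustrate) can use either style:
// CommonJS consumer -- no .default needed, thanks to the module.exports assignment
var MyComponent = require('my-lib');
// ES6 consumer -- still works, because babel's _interopRequireDefault wraps
// a plain CommonJS export as { default: ... }
import MyComponent from 'my-lib';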
I've spent hours battling this, so I hope this saves someone else the effort!

I have had the same kind of issue. This is my solution:
//src/arithmetic.js
export var operations = {
  add: function (a, b) {
    return a + b;
  },
  subtract: function (a, b) {
    return a - b;
  }
};
//src/main.js
import { operations } from './arithmetic';
let result = operations.add(1, 1);
console.log(result);

Related

Discard and re-import dynamic import [duplicate]

From the node.js documentation:
Modules are cached after the first time they are loaded. This means (among other things) that every call to require('foo') will get exactly the same object returned, if it would resolve to the same file.
Is there a way to invalidate this cache? i.e. for unit testing, I'd like each test to be working on a fresh object.
You can always safely delete an entry in require.cache, even when there are circular dependencies. When you delete, you only delete a reference to the cached module object, not the module object itself; it will not be GCed, because in the case of circular dependencies there is still an object referencing it.
Suppose you have:
script a.js:
var b=require('./b.js').b;
exports.a='a from a.js';
exports.b=b;
and script b.js:
var a=require('./a.js').a;
exports.b='b from b.js';
exports.a=a;
when you do:
var a=require('./a.js')
var b=require('./b.js')
you will get:
> a
{ a: 'a from a.js', b: 'b from b.js' }
> b
{ b: 'b from b.js', a: undefined }
now if you edit your b.js:
var a=require('./a.js').a;
exports.b='b from b.js. changed value';
exports.a=a;
and do:
delete require.cache[require.resolve('./b.js')]
b=require('./b.js')
you will get:
> a
{ a: 'a from a.js', b: 'b from b.js' }
> b
{ b: 'b from b.js. changed value',
a: 'a from a.js' }
===
The above is valid if you are running node.js directly. However, if you are using tools that have their own module caching system, such as jest, the correct statement would be:
jest.resetModules();
If you always want to reload your module, you could add this function:
function requireUncached(module) {
  delete require.cache[require.resolve(module)];
  return require(module);
}
and then use requireUncached('./myModule') instead of require.
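For example (a quick sketch, reusing the hypothetical ./myModule from above):
// each call re-evaluates myModule.js instead of returning the cached exports
const first = requireUncached('./myModule');
const second = requireUncached('./myModule');
// first and second come from two separate evaluations of the file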
Yes, you can access the cache via require.cache[moduleName] where moduleName is the name of the module you wish to access. Deleting an entry by calling delete require.cache[moduleName] will cause require to load the actual file.
This is how you would remove all cached files associated with the module:
/**
 * Removes a module from the cache
 */
function purgeCache(moduleName) {
    // Traverse the cache looking for the files
    // loaded by the specified module name
    searchCache(moduleName, function (mod) {
        delete require.cache[mod.id];
    });

    // Remove cached paths to the module.
    // Thanks to @bentael for pointing this out.
    Object.keys(module.constructor._pathCache).forEach(function(cacheKey) {
        if (cacheKey.indexOf(moduleName) > 0) {
            delete module.constructor._pathCache[cacheKey];
        }
    });
};
/**
 * Traverses the cache to search for all the cached
 * files of the specified module name
 */
function searchCache(moduleName, callback) {
    // Resolve the module identified by the specified name
    var mod = require.resolve(moduleName);

    // Check if the module has been resolved and found within
    // the cache
    if (mod && ((mod = require.cache[mod]) !== undefined)) {
        // Recursively go over the results
        (function traverse(mod) {
            // Go over each of the module's children and
            // traverse them
            mod.children.forEach(function (child) {
                traverse(child);
            });

            // Call the specified callback providing the
            // found cached module
            callback(mod);
        }(mod));
    }
};
Usage would be:
// Load the package
var mypackage = require('./mypackage');
// Purge the package from cache
purgeCache('./mypackage');
Since this code uses the same resolver require does, just specify whatever you would for require.
"Unix was not designed to stop its users from doing stupid things, as
that would also stop them from doing clever things." – Doug Gwyn
I think that there should have been a way to perform an explicit uncached module load.
There's a Simple Module for that (with tests)
We had this exact issue while testing our code (delete cached modules so they can be re-required in a fresh state) so we reviewed all the suggestions of people on the various StackOverflow Questions & Answers and put together a simple node.js module (with tests):
https://www.npmjs.com/package/decache
As you would expect, it works for both published npm packages and locally defined modules, on Windows, Mac, Linux, etc.
How? (usage)
Usage is pretty simple:
install
Install the module from npm:
npm install decache --save-dev
Use it in your code:
// require the decache module:
const decache = require('decache');
// require a module that you wrote:
let mymod = require('./mymodule.js');
// use your module the way you need to:
console.log(mymod.count()); // 0 (the initial state for our counter is zero)
console.log(mymod.incrementRunCount()); // 1
// delete the cached module:
decache('./mymodule.js');
//
mymod = require('./mymodule.js'); // fresh start
console.log(mymod.count()); // 0 (back to initial state ... zero)
If you have any questions or need more examples, please create a GitHub issue:
https://github.com/dwyl/decache/issues
For anyone coming across this who is using Jest: because Jest does its own module caching, there's a built-in function for this - just make sure jest.resetModules runs, e.g. after each of your tests:
afterEach( function() {
  jest.resetModules();
});
Found this after trying to use decache like another answer suggested. Thanks to Anthony Garvan.
Function documentation here.
The solution is to use:
delete require.cache[require.resolve(<path of your script>)]
Here are some basic explanations for those who, like me, are a bit new to this:
Suppose you have a dummy example.js file in the root of your directory:
exports.message = "hi";
exports.say = function () {
  console.log(exports.message);
}
Then you require() like this:
$ node
> require('./example.js')
{ message: 'hi', say: [Function] }
If you then add a line like this to example.js:
exports.message = "hi";
exports.say = function () {
  console.log(exports.message);
}
exports.farewell = "bye!"; // this line is added later on
And continue in the console, the module is not updated:
> require('./example.js')
{ message: 'hi', say: [Function] }
That's when you can use the delete require.cache[require.resolve()] technique indicated in luff's answer:
> delete require.cache[require.resolve('./example.js')]
true
> require('./example.js')
{ message: 'hi', say: [Function], farewell: 'bye!' }
So the cache is cleaned and the require() captures the content of the file again, loading all the current values.
rewire is great for this use case; you get a new instance with each call. It also gives you easy dependency injection for node.js unit testing.
rewire adds a special setter and getter to modules so you can modify their behaviour for better unit testing. You may
inject mocks for other modules or globals like process
leak private variables
override variables within the module.
rewire does not load the file and eval the contents to emulate node's require mechanism. In fact it uses node's own require to load the module. Thus your module behaves exactly the same in your test environment as under regular circumstances (except your modifications).
Good news to all caffeine-addicts: rewire works also with Coffee-Script. Note that in this case CoffeeScript needs to be listed in your devDependencies.
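For reference, a minimal sketch of how rewire is typically used (the module path, the db variable, and the stub are made up for illustration):
const rewire = require('rewire');

// rewire() loads a fresh instance of the module, so each test gets its own copy
const myModule = rewire('./myModule');

// override a private variable inside the module (here an assumed "db" dependency)
myModule.__set__('db', { query: () => Promise.resolve([]) });

// read ("leak") a private variable out of the module
const internalState = myModule.__get__('internalState');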
I'd add to luff's answer one more line and change the parameter name:
function requireCached(_module){
    var l = module.children.length;
    for (var i = 0; i < l; i++)
    {
        if (module.children[i].id === require.resolve(_module))
        {
            module.children.splice(i, 1);
            break;
        }
    }
    delete require.cache[require.resolve(_module)];
    return require(_module)
}
Yes, you can invalidate the cache.
The cache is stored in an object called require.cache, which you can access directly by filename (e.g. /projects/app/home/index.js, as opposed to the ./home you would use in a require('./home') statement).
delete require.cache['/projects/app/home/index.js'];
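If you don't want to hard-code the absolute path, you can let require.resolve compute the cache key for you (same idea, just a sketch):
// resolves './home' to the absolute filename that require.cache is keyed by
delete require.cache[require.resolve('./home')];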
Our team has found the following module useful for invalidating certain groups of modules:
https://www.npmjs.com/package/node-resource
I am not 100% certain of what you mean by 'invalidate', but you can add the following above the require statements to clear the cache:
Object.keys(require.cache).forEach(function(key) { delete require.cache[key] })
Taken from @Dancrumb's comment here
requireUncached with relative path: 🔥
const requireUncached = require => module => {
  delete require.cache[require.resolve(module)];
  return require(module);
};
module.exports = requireUncached;
Invoke requireUncached with a relative path:
const requireUncached = require('../helpers/require_uncached')(require);
const myModule = requireUncached('./myModule');
I couldn't neatly add code in an answer's comment, but I would use @Ben Barkay's answer and then add this to the require.uncache function.
// see https://github.com/joyent/node/issues/8266
// use it in @Ben Barkay's require.uncache function, or along with it
Object.keys(module.constructor._pathCache).forEach(function(cacheKey) {
    if (cacheKey.indexOf(moduleName) > -1) {
        delete module.constructor._pathCache[cacheKey];
    }
});
Say you've required a module, then uninstalled it, then reinstalled the same module but used a different version that has a different main script in its package.json. The next require will then fail, because the old main script's path is still cached in Module._pathCache and the file it points to no longer exists.
If you want a module to simply never be cached (sometimes useful for development, but remember to remove it when done!) you can just put delete require.cache[module.id]; inside the module.
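For example, a tiny sketch of such a self-evicting module (the file name is made up):
// always-fresh.js -- removes itself from the cache, so every require() re-evaluates it
module.exports = { loadedAt: Date.now() };
delete require.cache[module.id];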
Here's my version of this answer, which keeps the previously cached copy if the file fails to load (for example because of syntax errors):
function reacquire(module) {
    const fullpath = require.resolve(module);
    const backup = require.cache[fullpath];
    delete require.cache[fullpath];

    try {
        const newcopy = require(module);
        console.log("reacquired:", module, typeof newcopy);
        return newcopy;
    } catch (e) {
        console.log("Can't reacquire", module, ":", e.message);
        require.cache[fullpath] = backup;
        return backup;
    }
}
The following two-step procedure works perfectly for me.
After dynamically changing the model file, i.e. 'mymodule.js', you need to delete the precompiled mongoose model first and then reload it using require-reload:
Example:
// Delete mongoose model
delete mongoose.connection.models[thisObject.singular('mymodule')]
// Reload model
var reload = require('require-reload')(require);
var entityModel = reload('./mymodule.js');
The documentation says:
Modules are cached in this object when they are required. By deleting a key value from this object, the next require will reload the module. This does not apply to native addons, for which reloading will result in an error.
If it's for unit tests, another good tool to use is proxyquire. Every time you proxyquire the module, it will invalidate the module cache and cache a new one. It also allows you to modify the modules required by the file that you are testing.
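For completeness, a small sketch of the proxyquire pattern (the paths and the stub are invented for illustration):
const proxyquire = require('proxyquire');

// loads a fresh './thing' (bypassing the cache) with its './db' dependency stubbed out
const thing = proxyquire('./thing', {
    './db': { get: () => 'stubbed value' }
});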
I made a small module that deletes a module from the cache after it has been loaded. This forces re-evaluation of the module the next time it is required. See https://github.com/bahmutov/require-and-forget
// random.js
module.exports = Math.random()
const forget = require('require-and-forget')
const r1 = forget('./random')
const r2 = forget('./random')
// r1 and r2 will be different
// "random.js" will not be stored in the require.cache
PS: you can also put "self-destruct" into the module itself. See https://github.com/bahmutov/unload-me
PPS: more tricks with Node require in my https://glebbahmutov.com/blog/hacking-node-require/

How does jest allow mutation of modules?

In this question that I asked here:
Why does mutating a module update the reference if calling that module from another module, but not if calling from itself?
I'm asking about the nature of module mutation.
However, as it turns out, ES6 modules can't actually be mutated - all of their properties are treated as constants. (See this answer)
But somehow - when Jest tests modules - they can be mutated, and that's how Jest allows for mocking.
How is this happening?
I imagine that it's a babel plugin that's running - transpiling the modules to CommonJS? Is there any documentation about this?
Is there a way to view the transpiled code?
ES6 modules can't actually be mutated - all of their properties are treated as constants.
Interesting. You're right, even something as simple as this:
import * as lib from "./lib"; // import an ES6 module
const spy = jest.spyOn(lib, 'someFunc'); // spy on someFunc
...technically shouldn't be allowed since jest.spyOn replaces the method on the object with a spy and lib.someFunc should be a binding to someFunc in the ES6 module.
But somehow - when Jest tests modules - they can be mutated, and that's how Jest allows for mocking.
How is this happening?
They can only be mutated because Jest isn't actually using ES6 modules.
(I guess for completeness it might be possible to run Jest using actual ES6 modules by using Node's experimental support for ES6 Modules but I haven't tried).
I imagine that it's a babel plugin that's running - transpiling the modules... Is there any documentation about this?
"babel-jest is automatically installed when installing Jest and will automatically transform files if a babel configuration exists in your project. To avoid this behavior, you can explicitly reset the transform configuration option".
So by default Jest will use babel-jest which transpiles the source code using babel (and does a few other things like hoisting calls to jest.mock).
Note that Jest can also be configured using transform, which maps "regular expressions to paths to transformers".
Is there a way to view the transpiled code?
Yes. Transformations are done in jest-runtime here and the output is saved to a cache here.
The easiest way to look at the transpiled code is to view the cache.
You can do that by running Jest with the --showConfig option which will output the config used when running Jest. The cache location can be found by looking at the value of "cacheDirectory".
Then run Jest with the --clearCache option to clear out the cache.
Finally, run Jest normally and the cache directory will contain the transpiled code for your project.
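Concretely, that sequence looks something like this (assuming a local Jest install run through npx):
npx jest --showConfig     # note the "cacheDirectory" value in the output
npx jest --clearCache     # wipe the existing cache
npx jest                  # run normally; the cache directory now holds the transpiled files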
Example
The latest Jest (v24) will transpile this code:
// lib.js
export const someFunc = () => 1;
// code.js
import { someFunc } from './lib';
export const func = () => someFunc() + 1;
// code.test.js
import { func } from './code';
import * as lib from './lib';
test('func', () => {
  const spy = jest.spyOn(lib, 'someFunc');
  func();
  expect(spy).toHaveBeenCalled(); // SUCCESS
});
...to this:
// lib.js
"use strict";
Object.defineProperty(exports, "__esModule", {
value: true
});
exports.someFunc = void 0;
const someFunc = () => 1;
exports.someFunc = someFunc;
// code.js
"use strict";
Object.defineProperty(exports, "__esModule", {
value: true
});
exports.func = void 0;
var _lib = require("./lib");
const func = () => (0, _lib.someFunc)() + 1;
exports.func = func;
// code.test.js
"use strict";
var _code = require("./code");
var lib = _interopRequireWildcard(require("./lib"));
function _interopRequireWildcard(obj) { if (obj && obj.__esModule) { return obj; } else { var newObj = {}; if (obj != null) { for (var key in obj) { if (Object.prototype.hasOwnProperty.call(obj, key)) { var desc = Object.defineProperty && Object.getOwnPropertyDescriptor ? Object.getOwnPropertyDescriptor(obj, key) : {}; if (desc.get || desc.set) { Object.defineProperty(newObj, key, desc); } else { newObj[key] = obj[key]; } } } } newObj.default = obj; return newObj; } }
test('func', () => {
  const spy = jest.spyOn(lib, 'someFunc');
  (0, _code.func)();
  expect(spy).toHaveBeenCalled(); // SUCCESS
});
The import * as lib from 'lib'; line gets handled by _interopRequireWildcard which uses require under the hood.
Every call to require "will get exactly the same object returned, if it would resolve to the same file" so code.js and code.test.js are getting the same object from require('./lib').
someFunc is exported as exports.someFunc which allows it to be reassigned.
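To make that last point concrete, here is a rough sketch (not Jest's internals, just the underlying mechanism) of why the reassignment works on the transpiled output:
// both code.js and code.test.js receive this exact same cached exports object
const lib = require('./lib');

// exports.someFunc was created by a plain assignment, so it is an ordinary
// writable property and can be swapped for a spy
const original = lib.someFunc;
lib.someFunc = (...args) => {
  // a spy would record the call here before delegating
  return original(...args);
};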
So yes, you're exactly right. Spying (or mocking) like this only works because the ES6 modules are getting transpiled by babel into Node modules in a way that allows them to be mutated.

module.exports vs. export default in Node.js and ES6

What is the difference between Node's module.exports and ES6's export default? I'm trying to figure out why I get the "__ is not a constructor" error when I try to export default in Node.js 6.2.2.
What works
'use strict'

class SlimShady {
  constructor(options) {
    this._options = options
  }

  sayName() {
    return 'My name is Slim Shady.'
  }
}

// This works
module.exports = SlimShady
What doesn't work
'use strict'

class SlimShady {
  constructor(options) {
    this._options = options
  }

  sayName() {
    return 'My name is Slim Shady.'
  }
}

// This will cause the "SlimShady is not a constructor" error
// if in another file I try `let marshall = new SlimShady()`
export default SlimShady
The issue is with
how ES6 modules are emulated in CommonJS
how you import the module
ES6 to CommonJS
At the time of writing this, no environment supports ES6 modules natively. When using them in Node.js you need to use something like Babel to convert the modules to CommonJS. But how exactly does that happen?
Many people consider module.exports = ... to be equivalent to export default ... and exports.foo ... to be equivalent to export const foo = .... That's not quite true though, or at least not how Babel does it.
ES6 default exports are actually also named exports, except that default is a "reserved" name and there is special syntax support for it. Let's have a look at how Babel compiles named and default exports:
// input
export const foo = 42;
export default 21;
// output
"use strict";
Object.defineProperty(exports, "__esModule", {
value: true
});
var foo = exports.foo = 42;
exports.default = 21;
Here we can see that the default export becomes a property on the exports object, just like foo.
Import the module
We can import the module in two ways: Either using CommonJS or using ES6 import syntax.
Your issue: I believe you are doing something like:
var bar = require('./input');
new bar();
expecting that bar is assigned the value of the default export. But as we can see in the example above, the default export is assigned to the default property!
So in order to access the default export we actually have to do
var bar = require('./input').default;
If we use ES6 module syntax, namely
import bar from './input';
console.log(bar);
Babel will transform it to
'use strict';
var _input = require('./input');
var _input2 = _interopRequireDefault(_input);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
console.log(_input2.default);
You can see that every access to bar is converted to access .default.
Felix Kling did a great comparison of those two. For anyone wondering how to do a default export alongside named exports with module.exports in Node.js:
module.exports = new DAO()
module.exports.initDAO = initDAO // append other functions as named export
// now you have
let DAO = require('_/helpers/DAO');
// DAO by default is exported class or function
DAO.initDAO()
You need to configure babel correctly in your project to use export default and export const foo:
npm install --save-dev @babel/plugin-proposal-export-default-from
Then add the configuration below to your .babelrc:
"plugins": [
  "@babel/plugin-proposal-export-default-from"
]

How do you write a node module using typescript?

So, the general answer for the other question (How do you import a module using typescript) is:
1) Create a blah.d.ts definition file.
2) Use:
/// <reference path="./defs/foo/foo.d.ts"/>
import foo = require("foo");
Critically, you need both the files foo.d.ts and a foo.js somewhere in your node_modules to load; and the NAME foo must exactly match for both. Now...
The question I would like to have answered is: how do you write a typescript module that you can import that way?
Let's say I have a module like this:
- xq/
- xq/defs/Q.d.ts
- xq/index.ts
- xq/base.ts
- xq/thing.ts
I want to export the module 'xq' with the classes 'Base' from base.ts, and 'Thing' from thing.ts.
If this was a node module in javascript, my index.ts would look something like:
var base = require('./base');
var thing = require('./thing');
module.exports = {
  Base: base.Base,
  Thing: thing.Thing
};
Let's try using a similar typescript file:
import base = require('./base');
export module xq {
  export var base = base.Base;
}
Invoke it:
tsc base.ts index.ts things.ts ... --sourcemap --declaration --target ES3 --module commonjs --outDir dist/xq
What happens? Well, we get our base.d.ts:
export declare class Base<T> {
...
}
and the thrillingly unuseful index.d.ts:
export declare module xq {
  var Base: any; // No type hinting! Great. :(
}
and completely invalid javascript that doesn't even import the module:
(function (xq) {
  xq.base = xq.base.Base;
})(exports.xq || (exports.xq = {}));
var xq = exports.xq;
I've tried a pile of variations on the theme and the only thing I can get to work is:
declare var require;
var base = require('./base');

export module xq {
  export var base = base.Base;
}
...but that obviously completely destroys the type checker.
So.
Typescript is great, but this module stuff completely sucks.
1) Is it possible to do this with the built-in definition generator? (I'm dubious)
2) How do you do it by hand? I've seen import statements in .d.ts files, which I presume means someone has figured out how to do this; how do those work? How do you do the typescript for a module that has a declaration with an import statement in it?
(e.g. I suspect the correct way to do a module declaration is:
/// <reference path="base.d.ts" />
declare module "xq" {
  import base = require('./base');
  module xq {
    // Somehow export the symbol base.Base as Base here
  }
  export = xq;
}
...but I have no idea what the typescript to go along with that would be).
For JavaScript:
var base = require('./base');
var thing = require('./thing');
module.exports = {
  Base: base.Base,
  Thing: thing.Thing
};
TypeScript:
import base = require('./base');
import thing = require('./thing');
var toExport = {
  Base: base.Base,
  Thing: thing.Thing
};
export = toExport;
Or even this typescript:
import base = require('./base');
import thing = require('./thing');
export var Base = base.Base;
export var Thing = thing.Thing;
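A dependent file could then consume the module like this (a sketch, using the same pre-ES6 TypeScript import syntax as the question):
import xq = require('./xq');   // resolves to xq/index.ts
var thing = new xq.Thing();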
Typescript has really improved since this question was asked. In recent versions of Typescript, the language has become a much more strict superset of Javascript.
The right way to import/export modules is now the new ES6 Module syntax:
myLib.ts
export function myFunc() {
return 'test'
}
package.json
{
"name": "myLib",
"main": "myLib.js",
"typings": "myLib.d.ts"
}
Dependents can then import your module using the new ES6 syntax:
dependent.ts
import { myFunc } from 'myLib'
console.log(myFunc())
// => 'test'
For a full example of a node module written in Typescript, please check out this boilerplate:
https://github.com/bitjson/typescript-starter/

Typescript AMD implementation bad with Javascript / RequireJS

If I have this ts module:
export function say(){
console.log("said");
}
and compile it with the amd option, I can use it quite easily from a ts client:
import foo = module("tsmodule")
foo.say();
export var x = 123;
However, if I have the javascript equivalent of the ts module:
define(["require", "exports"], function(require, exports) {
function say() {
console.log("said");
}
exports.say = say;
})
There is no way to use it easily. The simplest possible solution:
// of course you can use .d.ts for requirejs but that is beside the point
declare var require:any;
// will fail with error module has not been loaded yet for context
// http://requirejs.org/docs/errors.html#notloaded
var useme = require("jsmodule")
useme.say();
export var x = 123;
import foo = module("tsmodule")
foo.say();
fails with the error http://requirejs.org/docs/errors.html#notloaded, since "jsmodule" was not passed to the define call in the generated typescript.
The two workarounds I have:
don't use import / export (language features lost)
use require([]) (still can't export something that depends on the require([]) call)
Both have limitations: https://github.com/basarat/typescript-requirejs. Is there another way? If not, can you vote here: https://typescript.codeplex.com/workitem/948 :)
If you want to load in a JavaScript module you could always use the (badly documented) amd-dependency tag:
/// <amd-dependency path="jsmodule" />
This will put jsmodule in the dependency array of your define call.
And then provide a declaration file in which you would simply state
module useme {
function say(): void;
}
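Putting that together with the question's own require workaround, the consuming TypeScript file could look roughly like this (a sketch; the declaration above is assumed to live in a jsmodule.d.ts-style file):
/// <amd-dependency path="jsmodule" />
declare var require: any;

// "jsmodule" now appears in the generated define() dependency array,
// so this synchronous require no longer hits the #notloaded error
var useme = require("jsmodule");
useme.say();
export var x = 123;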
