require is not defined after using plugin-transform-runtime - node.js

I am trying to create a simple UI window where I can quickly test React code.
The idea is that I type React JSX code into one window, and in the other window I get the app rendered from that code.
The input (with the React JSX code) is sent to a Node.js process that converts it to plain JS that can be processed by the in-browser React library. The problem I am having is with the error regeneratorRuntime is not defined.
my current code:
const babel = require("#babel/core");
// body comes from window input
console.log(
  babel.transform(body, {
    "presets": ["@babel/env", "@babel/react"],
    "plugins": ["@babel/plugin-proposal-class-properties"]
  }).code
);
I read a few topics about this error and most seem to recommend adding "@babel/plugin-transform-runtime",
so it becomes:
console.log(
  babel.transform(body, {
    "presets": ["@babel/env", "@babel/react"],
    "plugins": ["@babel/plugin-proposal-class-properties", "@babel/plugin-transform-runtime"]
  }).code
);
However, at this point the code returned by babel.transform starts with these lines:
"use strict";
var _interopRequireDefault = require("#babel/runtime/helpers/interopRequireDefault");
var _regenerator = _interopRequireDefault(require("#babel/runtime/regenerator"));
var _asyncToGenerator2 = _interopRequireDefault(require("#babel/runtime/helpers/asyncToGenerator"));
var _classCallCheck2 = _interopRequireDefault(require("#babel/runtime/helpers/classCallCheck"));
var _createClass2 = _interopRequireDefault(require("#babel/runtime/helpers/createClass"));
var _assertThisInitialized2 = _interopRequireDefault(require("#babel/runtime/helpers/assertThisInitialized"));
var _inherits2 = _interopRequireDefault(require("#babel/runtime/helpers/inherits"));
var _possibleConstructorReturn2 = _interopRequireDefault(require("#babel/runtime/helpers/possibleConstructorReturn"));
var _getPrototypeOf2 = _interopRequireDefault(require("#babel/runtime/helpers/getPrototypeOf"));
var _defineProperty2 = _interopRequireDefault(require("#babel/runtime/helpers/defineProperty"));
But since this code is executed by the browser, it throws Uncaught ReferenceError: require is not defined.
How can this be solved, so that the browser is fed "ready" code that doesn't contain any require calls?

Basically, to use plugin-transform-runtime you will need some kind of bundler that resolves the injected require calls. Have a look at babelify.
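For example, a minimal browserify + babelify setup could look roughly like this (the file names here are assumptions, not from the question):
const browserify = require('browserify');
const babelify = require('babelify');
const fs = require('fs');

browserify('./input.js')  // the file produced from the window input
  .transform(babelify, {
    presets: ['@babel/preset-env', '@babel/preset-react'],
    plugins: ['@babel/plugin-transform-runtime']
  })
  .bundle()  // inlines the @babel/runtime helpers, so no bare require() is left
  .pipe(fs.createWriteStream('bundle.js'));
The resulting bundle.js can then be served to the browser as-is.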

I had this issue while testing a module meant for both web and Node targets (front end and back end). When the bundle was ready, pasting it into the browser console to check the functionality threw up the exact same error.
Things that worked for me when I had this issue:
1. Check the version of Node.js on your system. If you see "type": "module" in your package.json, require usage will throw an error, since Node then expects ES6 import syntax. If you see it, try changing it to "type": "commonjs" or removing it from package.json.
2. @babel/polyfill, whose role is now covered by @babel/plugin-transform-runtime, has the builtins and helpers to support Promise, Symbol, Set and other ES6 builtins. Earlier we relied on Bluebird to back up Promise in browsers. Both @babel/runtime and @babel/plugin-transform-runtime need a check.
3. As an alternative you could try @babel/register to run files through Babel on the fly. Its require hook binds itself to Node's require and automatically compiles files at runtime.
4. If you're using nodeExternals, it will expect Node's require function, so set the target to umd.
5. Add "modules": "commonjs" to your .babelrc file.
6. I had to specify libraryTarget: 'umd' in the webpack.config output; see the sketch after this list. It is needed for nodeExternals as well, if used. The UMD (Universal Module Definition) format allows JavaScript modules to be imported via CommonJS.
Step 6 may or may not be applicable to you, but for me steps 1, 2, 3 and 6 resolved all the issues.
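For steps 5 and 6, a minimal webpack.config.js sketch might look like this (the entry/output names are assumptions):
const nodeExternals = require('webpack-node-externals');

module.exports = {
  entry: './src/index.js',
  output: {
    path: __dirname + '/dist',
    filename: 'bundle.js',
    libraryTarget: 'umd'  // UMD works under both CommonJS and the browser
  },
  // only if you also bundle for Node and want node_modules left out:
  externals: [nodeExternals()]
};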

Related

How does requiring a module at the entry point make it available to other modules in Node.js?

This is probably a duplicate question, but I couldn't find anything.
Since I'm learning Node.js, I think I'm not using the right words to search, so it's hard to find an answer.
Here is the situation:
I'm currently following an online course about Node.js and coding an API.
In the current step we are using the Winston library to log errors. The instructor has configured it in index.js, which is the entry point of the app, like this:
File: index.js
const winston = require('winston');
const errorHandler = require('./middleware/error.js');
//(...) some other imports
app.use(errorHandler);
winston.add(winston.transports.File,{filename:'logFile.log'});
And in another module we created in the course to handle errors, he requires winston and simply calls it to log the error. Something like this:
File: error.js
const winston = require('winston');

function errorHandler(err, req, res, next) {
  winston.error(err.message, err);
  res.status(500).send("something failed");
}

module.exports = errorHandler;
After running a test, the error is correctly written to the file, and my question is: how does it work? How is a setting made on the 'required version' of winston in index.js visible from the other required version in error.js?
From index.js we are importing error.js too, so I can imagine these two modules somehow share this winston object, but again, I don't understand how or where it is shared.
Again, please excuse me if I'm not using the right terms for anything here; I'll accept any advice.
Thanks.
When a module is loaded in node.js, it is cached by the require() sub-system. So, when you require() it again, you get the exact same module object as before.
So... if you initialized the module after you first loaded it, and the module stores some state that represents that initialization, then subsequent uses of that module get the same (already initialized) module.
And in another module we created in the course to handle errors, he requires winston and simply calls it to log the error.
It gets the same instance of the winston module that was already initialized/configured previously.
After running a test, the error is correctly written to the file, and my question is: how does it work? How is a setting made on the 'required version' of winston in index.js visible from the other required version in error.js?
Module caching, as described above. There's only one winston module that everyone shares, so if it's initialized/configured in one place, all uses will see that configuration.
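Here's a minimal sketch of that caching behavior (the file names are made up):
// counter.js — a module with internal state
var count = 0;
module.exports = {
  increment: function() { return ++count; }
};

// index.js
var counter = require('./counter');
counter.increment(); // 1
require('./a');

// a.js — receives the *same* cached instance, state included
var counter = require('./counter');
console.log(counter.increment()); // 2, not 1
winston works the same way: index.js and error.js both receive the one cached winston instance, so configuration done in one file is visible in the other.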

How to compile ReactJS for use on server with command line arguments?

I've decided to try out ReactJS. Along with that, I've decided to use Gulp for compiling .jsx to .js, also for the first time.
I can compile it no problem for client use with browserify. Here's my gulp task:
browserify("./scripts/main.jsx")
.transform(
babelify.configure({
presets: ["react"]
}))
.bundle()
.pipe(source('bundle.js'))
.pipe(gulp.dest('./scripts/'));
But since I use PHP to generate the data, I need to get that data to Node. If I use browserify, it prevents me from using process.argv in Node. I can save the data to a file and read that file in Node, so I wouldn't need to pass the whole state to Node, but I still need to pass the identifying arguments so that Node knows which file to load.
What should I use instead of browserify?
If you need to compile a React module to ES5 for use on the server, use Babel itself.
For reading and writing files, the built-in fs module will help: https://nodejs.org/api/fs.html
Have you considered posting and getting from a database?
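If you go the plain-Babel route, a minimal sketch with @babel/core might look like this (the output path and the current preset names are assumptions):
const fs = require('fs');
const babel = require('@babel/core');

// compile the JSX entry file to plain JS that Node can run directly
const { code } = babel.transformFileSync('./scripts/main.jsx', {
  presets: ['@babel/preset-react']
});
fs.writeFileSync('./scripts/main.server.js', code);
Since the output runs under plain Node, process.argv works as usual.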
Here's how I solved it:
I learned that you can create standalone bundles with browserify, so I compiled all the server code I need (components + rendering) as a standalone bundle. Then I created a small Node script which is responsible only for reading arguments, loading data and sending it to the rendering code.
I'm not sure if this is the proper way to do it, but it works.
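The standalone bundle itself can be produced with a gulp task much like the one in the question, using browserify's standalone option (the entry file name is an assumption):
browserify('./scripts/server.jsx', { standalone: 'Server' })
  .transform(babelify.configure({
    presets: ['react']
  }))
  .bundle()
  .pipe(source('server.js'))
  .pipe(gulp.dest('./scripts/'));
The standalone option wraps the bundle in UMD, so the small Node script below can simply require it.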
Here's code for the "setup" script:
var fs = require('fs');
var Server = require('./server.js');

if (process.argv[2]) {
  // keep only lowercase letters and digits (note the g flag, so all invalid characters are stripped)
  var region = process.argv[2].toLowerCase().replace(/[^a-z0-9]/g, '');
  if (region != '') {
    var data = JSON.parse(fs.readFileSync(__dirname + '/../tmp/' + region + '.json', 'utf8'));
    console.log(Server.render(data.deal, data.region));
  }
}
This way I only need to deploy two files, and I can still easily compile JSX to JS.

Webpack Aliases in Node JS Server code

I'm building an isomorphic React/React-Router/Redux/Webpack application and I'm attempting to implement server side rendering.
My directory looks like:
/client
/actions
/components
/containers
/server
/server.js
In my webpack config, I have aliases set up for all the folders inside client:
var path_base = path.resolve(__dirname, '..');
const resolve = path.resolve;

const base = function() {
  var args = [path_base];
  args.push.apply(args, arguments);
  return resolve.apply(resolve, args);
};

const resolve_alias = base.bind(null, 'src/client');

const aliases = [
  'actions',
  'components',
  'constants',
  'containers',
  'middleware',
  'reducers',
  'routes',
  'store',
  'styles',
  'utils',
  'validation'
];
so that inside the code that gets bundled by webpack, I can do:
import { Widget } from 'components';
and that import gets resolved by webpack.
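(Presumably the array is then turned into webpack's resolve.alias along these lines — a sketch, since that part of the config isn't shown in the question:)
const resolve_aliases = {};
aliases.forEach(function(name) {
  // e.g. 'components' -> <project>/src/client/components
  resolve_aliases[name] = resolve_alias(name);
});

module.exports = {
  resolve: { alias: resolve_aliases }
};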
Now in my server code, in order to do the rendering, I have to import some of my client files, like routes/index.js. The problem I'm running into is that when I import my routes file, it uses a webpack alias to another file, say components or containers, so naturally the Node.js require system can't resolve it.
How do I fix something like that? I looked at this question, and it talks about essentially setting up the same aliases that exist in webpack with mock-require. But then the issue becomes that my routes file imports all my components, which then all import things like stylesheets, images, etc. Should I then be using something like webpack-isomorphic-tools?
The guides I've been looking at (this for example) are all great at showing how server-side rendering is accomplished, but none of them really talk about how to resolve all the requires and whatnot.
After battling with this issue for 2 days I settled on babel-plugin-webpack-alias.
What you need to do to resolve paths with it:
1. $ npm install --save-dev babel-plugin-webpack-alias
2. Add the plugin to your .babelrc (see the sketch after this list)
3. Add the aliases to your webpack.config (make sure you use path.join())
4. Refer to this post if you have problems loading styles
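A .babelrc for step 2 might look roughly like this (the plugin's config option should point at your webpack config; treat the exact shape as an assumption and check the plugin's README):
{
  "plugins": [
    ["babel-plugin-webpack-alias", { "config": "./webpack.config.js" }]
  ]
}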
The other option I tried was universal-webpack but I found it to be a bit verbose. If you want to see roughly how the whole server-side loading works, you can check out this video.
If you really want aliases on the server, run your server-side code through Babel and use this plugin: https://www.npmjs.com/package/babel-plugin-module-alias which will let you do the same thing as webpack.
Edit: This one works a lot better: https://github.com/jagrem/babel-resolve-relative-module it allows multiple paths
Try to use NODE_PATH. Node will always look for modules in this path during require calls. It lets you shorten your relative paths as you want:
// turn this
import {Widget} from '../../components';
// into this
import {Widget} from 'components';
See the Node.js docs for more information.
P.S. This thing is very sensitive, so use it carefully. Your code now depends tightly on the environment and may break somewhere.
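For example (the paths are assumptions based on the directory layout in the question):
$ NODE_PATH=./src/client node server/server.js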
If you use webpack-isomorphic-tools, it will take your webpack config into account on the server side, which makes all your aliases work.
https://www.npmjs.com/package/webpack-isomorphic-tools

require.main.require works but not inside Mocha test

I have written a global function for requiring certain files of my app/framework:
global.coRequireModel = function(name) {
  // CRASH happens here
  return require.main.require('./api/_co' + name + '/_co' + name + '.model');
};
This module is in /components/coGlobalFunctions.
It is required in my main app app.js like this:
require('./components/coGlobalFunctions');
Then in other modules using "something" from the framework I use:
var baseScheme = coRequireModel('Base');
This works, but not in the Mocha tests, which give me "Error: Cannot find module" right before the require.main.require call.
It seems that the test is run from another source folder. But I thought require.main.require would remove the need to link to modules relatively.
EDIT:
An example test file living in api/user:
var should = require('should');
var app = require('../../app');
var User = require('./user.model');
...
require.main points to the module that was run directly by node. So, if you run node app.js, then require.main points to app.js. If, on the other hand, you run the tests with mocha, then require.main points to mocha. This is likely why your tests are failing.
See the Node docs for more details.
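A quick way to see this (a tiny sketch, not from the question):
// whoami.js
if (require.main === module) {
  // true when run as `node whoami.js`
  console.log('I am the main module');
} else {
  // true when loaded by mocha or require()d by another file
  console.log('main is', require.main.filename);
}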
Because require.main was not index.html in my node-webkit app when running mocha tests, it threw errors left and right about not being able to resolve modules. This hack in my test-helper.js (required first thing in all tests) fixed it:
var path = require('path');

require.main.require = function(name) {
  // navigate to the main directory
  var newPath = path.join(__dirname, '../', name);
  return require(newPath);
};
This feels wrong, though it worked. Is there a better way to fix this? It's like combining some of the above solutions with #7 to get mocha testing working, but modifying main's require just to make everything work when testing feels really wrong.
For other avoid-the-".."-mess solutions, see here:
https://gist.github.com/branneman/8048520
This is pretty old, but here is my solution.
I needed a test harness module to be published to a private registry and required by the mocha test suite. I wanted the calling test code to pass the code under test to the harness rather than requiring it directly:
var harness = require('test-harness');
var codeUnderTest = harness('../myCode');
Inside harness (which was found in the project node_modules directory), I used the following code to make require find the correct file:
var path = require('path');

if (!path.isAbsolute(target)) {
  // module.parent.paths[0] is the node_modules dir next to the calling file,
  // so its dirname is the calling file's directory
  target = path.join(path.dirname(module.parent.paths[0]), target);
}
var codeUnderTest = require(target);
...
return codeUnderTest;
This relies on require's path resolution, which always starts by looking for a node_modules subdirectory relative to the calling file. Couple that with module.parent and you get access to that search path. Then just remove the trailing node_modules part and concatenate the relative filename.
For other scenarios not using relative paths, this can be accomplished with the paths option of require.resolve (plain require() takes no options argument):
var codeUnderTest = require(require.resolve(target, { paths: module.parent.paths }));
...
return codeUnderTest;
And the two could be combined as well. I used the first form because I was actually using proxyquire which does not offer the paths option.

How to reference local files in a npm module?

I wrote a simple npm module to precompile my Handlebars templates when using django-compressor to do post-processing for some client-side components, and found that I need to ship the npm module with a few JS files.
Currently I just assume no one installs it with the global flag, because I've "hard coded" the path to these dependencies in the npm module itself.
example layout of my npm module
/
* /bin
* /lib/main.js
* /vendor/ember.js
Now inside main.js I want to use the ember.js file ... currently my hard coded approach looks like this
var emberjs = fs.readFileSync('node_modules/django-ember-precompile/vendor/ember.js', 'utf8');
Again, this only works because I assume you install it locally, but I'd like to think Node.js has a more legitimate way to get locally embedded files.
Does anyone know how I can improve this to be more "global" friendly?
What you can do is get the directory of the current file and make your file paths relative to that.
var path = require('path'),
    fs = require('fs');

// resolve this file's real path (following symlinks), then step up out of /lib into /vendor
var vendor = path.join(path.dirname(fs.realpathSync(__filename)), '../vendor');
var emberjs = fs.readFileSync(vendor + '/ember.js', 'utf8');
Hope that helps!
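Since __dirname already gives you the directory of the current module, the same idea can be written more directly (a sketch, assuming main.js lives in lib/):
var path = require('path');
var fs = require('fs');

var emberjs = fs.readFileSync(path.join(__dirname, '..', 'vendor', 'ember.js'), 'utf8');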
One of the great strengths of Node.js is how quickly you can get up and running. The downside of this approach is that you are forced to fit the design patterns it was built around.
This is an example where your approach differs too much from Node's approach.
Node expects everything in a module to be exposed from the module's exports, including templates.
Move the readFileSync into the django-ember-precompile module, then expose the returned value via a module export in lib/main.js.
Example:
package.json
{
  "name": "django-ember-precompile",
  "main": "lib/main.js"
}
lib/main.js
module.exports.ember = require('fs').readFileSync(__dirname + '/../vendor/ember.js', 'utf8');
vendor/ember.js
You obtain your template via
var template = require('django-ember-precompile').ember
This example can be refined, but the core idea is the same.
