Create and use a Babel plugin without making it an npm module - node.js

In my project, I'm using Babel 6 with the require hook. I need to load a custom babel plugin that I wrote. But do I really need to publish my plugin using npm first, and then include the plugin name in my main project's .babelrc?
Is there any way to just directly load the plugin code? In other words, can I just load the following directly?
export default function({ types: t }) {
  return {
    visitor: {
      ...
    }
  };
}

Where you list your plugins in your .babelrc, provide the path to your plugin instead of your standard published plugin name.
"plugins": ["transform-react-jsx", "./your/plugin/location"]
When exporting your plugin function, you'll probably need to use module.exports = instead of export default, since ES2015 modules haven't been fully implemented in Node yet.
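For illustration, here's a minimal sketch of what such a local plugin file could look like with module.exports (the visitor logic is a made-up example that just renames identifiers named foo to bar):

module.exports = function ({ types: t }) {
  return {
    name: 'my-local-plugin', // optional, but handy in error output
    visitor: {
      Identifier(path) {
        // hypothetical transform: rename every identifier `foo` to `bar`
        if (t.isIdentifier(path.node, { name: 'foo' })) {
          path.node.name = 'bar';
        }
      }
    }
  };
};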

This is my entire babel.config.js file.
module.exports = function (api) {
  api.cache(true);
  const presets = ["@babel/preset-env", "@babel/preset-react"];
  const plugins = [
    ["@babel/plugin-proposal-pipeline-operator", { "proposal": "minimal" }],
    "c:\\projects\\my-babel-plugin"
  ];
  return {
    presets,
    plugins
  };
};
The first item in the plugins array is a plugin with options, provided as an array. The second item is my own local plugin.
Inside the my-babel-plugin folder there needs to be a package.json with a "main" entry, usually "main": "lib/index.js" or "main": "src/index.js".
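For reference, a minimal package.json for that folder might look like this (the name and the main path are placeholders; point main at wherever your plugin's entry file actually lives):

{
  "name": "my-babel-plugin",
  "version": "1.0.0",
  "main": "src/index.js"
}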

Related

How to import a node module inside an angular web worker?

I'm trying to import a node module inside an Angular 8 web worker, but I get a compile error 'Cannot find module'. Does anyone know how to solve this?
I created a new worker inside my Electron project with ng generate web-worker app, as described in the Angular documentation.
Everything works fine until I add an import like path or fs-extra, e.g.:
/// <reference lib="webworker" />
import * as path from 'path';
addEventListener('message', ({ data }) => {
  console.log(path.resolve('/'));
  const response = `worker response to ${data}`;
  postMessage(response);
});
This import works fine in any other TS component, but inside the web worker I get a compile error with this message:
Error: app/app.worker.ts:3:23 - error TS2307: Cannot find module 'path'.
How can I fix this? Maybe I need some additional parameter in the generated tsconfig.worker.json?
To reproduce the error, run:
$ git clone https://github.com/hoefling/stackoverflow-57774039
$ cd stackoverflow-57774039
$ yarn build
Or check out the project's build log on Travis.
Note:
1) I only found this similar problem, but its answer handles only custom modules.
2) I tested the same import with a minimal Electron seed which uses web workers and it worked, but that example uses plain JavaScript without Angular.
1. TypeScript error
As you've noticed, the first error is a TypeScript error. Looking at tsconfig.worker.json, I found that it sets types to an empty array:
{
  "compilerOptions": {
    "types": [],
    // ...
  }
  // ...
}
Specifying types turns off the automatic inclusion of @types packages, which is a problem in this case because path has its type definitions in @types/node.
So let's fix that by explicitly adding node to the types array:
{
  "compilerOptions": {
    "types": [
      "node"
    ],
    // ...
  }
  // ...
}
This fixes the TypeScript error; however, trying to build again, we're greeted with a very similar error, this time from Webpack directly.
2. Webpack error
ERROR in ./src/app/app.worker.ts (./node_modules/worker-plugin/dist/loader.js!./src/app/app.worker.ts)
Module build failed (from ./node_modules/worker-plugin/dist/loader.js):
ModuleNotFoundError: Module not found: Error: Can't resolve 'path' in './src/app'
To figure this one out we need to dig quite a lot deeper...
Why it works everywhere else
First it's important to understand why importing path works in all the other modules. Webpack has the concept of targets (web, node, etc). Webpack uses this target to decide which default options and plugins to use.
Ordinarily, the target of an Angular application using @angular-devkit/build-angular:browser would be web. However, in your case the postinstall:electron script actually patches node_modules to change that:
postinstall.js (parts omitted for brevity)
const f_angular = 'node_modules/@angular-devkit/build-angular/src/angular-cli-files/models/webpack-configs/browser.js';

fs.readFile(f_angular, 'utf8', function (err, data) {
  var result = data.replace(/target: "electron-renderer",/g, '');
  result = result.replace(/target: "web",/g, '');
  result = result.replace(/return \{/g, 'return {target: "electron-renderer",');
  fs.writeFile(f_angular, result, 'utf8');
});
The electron-renderer target is treated by Webpack similarly to node. Especially interesting for us: it adds the NodeTargetPlugin by default.
What does that plugin do, you wonder? It adds all known built-in Node.js modules as externals. When building the application, Webpack will not attempt to bundle externals; instead they are resolved using require at runtime. This is what makes importing path work, even though it isn't installed as a module known to Webpack.
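In effect, NodeTargetPlugin behaves roughly as if you had declared the Node.js built-ins as commonjs externals yourself. A simplified sketch (the real plugin covers the full list of core modules):

// illustrative only: roughly what NodeTargetPlugin configures for Node built-ins
module.exports = {
  externals: {
    path: 'commonjs path',
    fs: 'commonjs fs'
    // ...and so on for every other Node.js core module
  }
};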
Why it doesn't work for the worker
The worker is compiled separately using the WorkerPlugin. In their documentation they state:
By default, WorkerPlugin doesn't run any of your configured Webpack plugins when bundling worker code - this avoids running things like html-webpack-plugin twice. For cases where it's necessary to apply a plugin to Worker code, use the plugins option.
Looking at the usage of WorkerPlugin deep within @angular-devkit we see the following:
@angular-devkit/src/angular-cli-files/models/webpack-configs/worker.js (simplified)
new WorkerPlugin({
  globalObject: false,
  plugins: [
    getTypescriptWorkerPlugin(wco, workerTsConfigPath)
  ],
})
As we can see, it uses the plugins option, but only for a single plugin, which is responsible for the TypeScript compilation. This way the default plugins configured by Webpack, including NodeTargetPlugin, get lost and are not used for the worker.
Solution
To fix this we have to modify the Webpack config. And to do that we'll use @angular-builders/custom-webpack. Go ahead and install that package.
Next, open angular.json and update projects > angular-electron > architect > build:
"build": {
"builder": "#angular-builders/custom-webpack:browser",
"options": {
"customWebpackConfig": {
"path": "./extra-webpack.config.js"
}
// existing options
}
}
Repeat the same for serve.
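Assuming the dev-server builder that ships with @angular-builders/custom-webpack, the serve target would end up looking roughly like this (the browserTarget value is a guess based on the project name above; keep whatever options you already have):

"serve": {
  "builder": "@angular-builders/custom-webpack:dev-server",
  "options": {
    "browserTarget": "angular-electron:build"
    // existing options
  }
}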
Now, create extra-webpack.config.js in the same directory as angular.json:
const WorkerPlugin = require('worker-plugin');
const NodeTargetPlugin = require('webpack/lib/node/NodeTargetPlugin');

module.exports = (config, options) => {
  let workerPlugin = config.plugins.find(p => p instanceof WorkerPlugin);
  if (workerPlugin) {
    workerPlugin.options.plugins.push(new NodeTargetPlugin());
  }
  return config;
};
The file exports a function which will be called by @angular-builders/custom-webpack with the existing Webpack config object. We can then search all plugins for an instance of WorkerPlugin and patch its options, adding the NodeTargetPlugin.

Jest not picking up configs in multi project mode

We recently migrated two different repos into a monorepo. Each uses jest with its own custom configurations, defined in their own package.json files.
I'd like to use the --projects flag to run Jest across both projects from the root of the monorepo. I've added a jest.config.js file to the root of the monorepo:
module.exports = {
  projects: ['<rootDir>/projectA', '<rootDir>/projectB']
};
The runner successfully picks up the tests for both projects, but it doesn't appear to be using each project's custom configuration. For example, in "projectA", I'm using babel-plugin-module-resolver. When I run jest in that project alone, babel-jest successfully picks up that plugin and it works fine, but when I run it from the root in multi-project mode, I get "Cannot find module..." errors that indicate the plugin isn't being used.
Similarly, in "projectB" I'm using a custom setupTestFrameworkScriptFile. Running jest in this project runs that file just fine, but it's ignored when running from the root.
My understanding of the multi-project mode was that each individual project should keep its own settings/configs intact. Did I miss something? Do I need to configure these in the root as well?
I think there are some bugs with Jest's multi-project runner; we need to provide some failing examples so the Jest team can fix them. There are almost no docs about it.
I made this work by providing a custom babel transformer instead of using babel-jest directly.
Check this link: https://twitter.com/sseraphini/status/1061779382669316098
Use this for the transformer inside your packages:
const config = require('../shared/babel.config.js');
const { createTransformer } = require('babel-jest');

module.exports = createTransformer({
  ...config,
});
and use this for your root transformer:
const { join, resolve } = require('path');
const { createTransformer } = require('babel-jest');

const packagePath = resolve('../');
const packageGlob = join(packagePath, '*');

module.exports = createTransformer({
  babelrcRoots: packageGlob,
});
Then reference it like this in jest.config.js:
transform: {
  '^.+\\.(js|ts|tsx)?$': '<rootDir>/test/babel-transformer',
},

Output an executable file with webpack

I'm currently writing a node CLI tool and using webpack to bundle all of my assets. The entry point for this application is the js file where I actually parse process.argv and run a command (For reference, I'm using tj/commander). This way, once the bundling is complete, I can enter ./<outputFile> and it will run my application. The entry file looks like this:
import cli from './cli';

cli.parse(process.argv);

// If nothing was supplied
if (!process.argv.slice(2).length) {
  cli.outputHelp();
}
The bundling works fine but I can't get webpack to output the file as an executable. Once I run chmod +x <outputFile>, everything works perfectly. Is there a way that I can tell webpack what permissions to grant an output file?
I'm surprised no one mentioned webpack's BannerPlugin. I do something similar to @oklas, but use BannerPlugin to add the node shebang:
{
  plugins: [
    new webpack.BannerPlugin({
      banner: '#!/usr/bin/env node',
      raw: true,
    }),
  ],
}
Then I simply add the execution permissions by adding chmod to my package.json scripts:
"scripts": {
"build": "webpack && chmod +x dist/mycommand"
}
Anyway, if you'd like to do it all with webpack you can use WebpackShellPlugin, as suggested by oklas (note that this adds a new dependency, which is why I avoid this approach):
const WebpackShellPlugin = require('webpack-shell-plugin')

{
  // [...]
  plugins: [
    new WebpackShellPlugin({
      onBuildEnd: ['chmod +x dist/mycommand'],
    }),
  ],
}
If you want to avoid including WebpackShellPlugin as a dependency, you can define a custom plugin based on fs, as suggested by @taylorc93.
One simple way is to use npm. Do you have a package.json in your project?
Add "build": "webpack && chmod +x outputFile" to the scripts section of your package.json and build your project by running npm run build.
Another way is to add one of these solutions to your webpack.config.js:
a simple plugin from this answer, which has pre- and post-build handlers
the on-build-webpack plugin, which executes JS code at the end of the webpack build process
Whatever you choose, you'll need to add this piece of code:
var chmod = require('chmod');
chmod("outputFile", 500);
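For example, with the on-build-webpack plugin the wiring could look roughly like this (a sketch only; double-check the plugin's README for the exact API of the version you install):

var WebpackOnBuildPlugin = require('on-build-webpack');
var chmod = require('chmod');

module.exports = {
  // ...existing config...
  plugins: [
    // runs after every build and makes the bundle executable
    new WebpackOnBuildPlugin(function (stats) {
      chmod('outputFile', 500);
    }),
  ],
};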
You'll also need to prepend #!/usr/bin/env node to the top of the output file.
I ended up with this webpack plugin using shelljs
const shell = require('shelljs');

plugins: [
  // ...plugins,
  function () {
    this.plugin('done', () => {
      shell
        .echo('#!/usr/bin/env node\n')
        .cat(`${__dirname}/build/outputfile.js`)
        .to(`${__dirname}/commandname`)
      shell.chmod(755, `${__dirname}/commandname`)
    })
  },
]
Although @oklas's solution worked perfectly for me, I really wanted to try to keep all of this within webpack. I realized after a little more thought that this could all be done with a very simple plugin:
const fs = require('fs');

plugins: [
  // ...plugins,
  function () {
    this.plugin('done', () => {
      fs.chmodSync('bin/program-name.js', '755');
      // When the webpack output doesn't have a .js extension, minification fails :(
      fs.renameSync('bin/program-name.js', 'bin/program-name');
    })
  },
]
Use whichever way suits your needs!
This is how I did it with Webpack 5:
import { promises as fs } from 'fs';

plugins: [
  new webpack.BannerPlugin({
    banner: '#!/usr/bin/env node',
    raw: true,
    entryOnly: true
  }),
  function () {
    this.hooks.done.tapPromise('Make executable', async () => {
      await fs.chmod(`${__dirname}/dist/app.js`, '755');
    });
  }
]

How to replace requireJS config in webpack es6 project

I want to refactor a large requireJS project to use es6 import/export and webpack. In the requireJS requirejs.config call, I use the config section to pass some project specific settings to some views:
requirejs.config({
  baseUrl: 'js/cfe/app',
  paths: { },
  config: {
    'views/test/TestView': {
      isTest: true
    }
  }
})
and in the view:
define(['module'], function (module) {
  var t = module.config().isTest
})
How can I accomplish the same behaviour in my webpack setup?
I'm not quite sure if I understand your question correctly, but maybe you can use my answer anyway.
I think you could extract your configuration object to a JSON file, use a loader (the raw loader works fine) to include it in your bundle, and then simply use an ES6 import when you need it:
import config from 'myconfig.json';
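To mirror the original module.config().isTest lookup, the view could then read its own section from that object. A sketch, assuming myconfig.json simply holds the old config section keyed by module name (the file name and import path are placeholders):

// myconfig.json (hypothetical contents copied from the requirejs config section):
// { "views/test/TestView": { "isTest": true } }

// views/test/TestView.js
import config from '../../myconfig.json';

const isTest = config['views/test/TestView'].isTest;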

How to require a specific file using duojs

I need to include a library that is hosted on GitHub but is not well packaged, using Duo.js.
At the moment of writing I am using the following to achieve what I desire:
bower
gulp
main-bower-files
Bower just downloads the library.
Gulp, with main-bower-files, is used to override the single package options and set up a so-called "main file" that I can build.
Example:
gulp.task('copy-libs', function () {
  return gulp.src(bowerFiles({ env: 'development' }))
    .pipe(gulp.dest('build/libs/'));
});
bower.json file:
"dependencies": {
"cash": "sudo-js/cash",
"bootstrap": "~3.3.2",
"delorean": "~0.8.7",
"react": "~0.12.2"
},
"overrides": {
"cash": {
"main": {
"development": "build/debug/cash.js"
}
}
}
}
How can I achieve this with duojs?
The documentation is quite thin regarding libraries that do not ship with a valid component.json.
You can specify the path to an entry file for your lib. It won't be as clean as just specifying user/repo, but it'll get the job done.
For example, when including Twitter Bootstrap from twbs/bootstrap
require('twbs/bootstrap@v3.3.2:dist/js/bootstrap.js');
// repo: twbs/bootstrap
// version/tag: v3.3.2
// path: dist/js/bootstrap.js
Unfortunately, this doesn't work out-of-the-box since it assumes you have the jQuery global... so you need to add this above the previous line.
jQuery = require('components/jquery'); // leave out `var` so it becomes a global
This includes jQuery from the wonderful components project (they package up popular libs so they can be consumed by various package managers).
Also, it turns out there is a components/bootstrap that is properly packaged with a component.json.
So, you can actually make bootstrap work with the following:
jQuery = require('components/jquery');
require('components/bootstrap');
For the other libraries that aren't as common, you can use the process mentioned first to specify the path to the right JS/CSS file (i.e. user/repo@version:path).
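For instance, a hypothetical library could be pulled in like this (user, repo, tag, and path are all placeholders):

// repo: someuser/somelib, tag: v1.2.3, entry file: dist/somelib.js
var somelib = require('someuser/somelib@v1.2.3:dist/somelib.js');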
