I have code like the following
define("ModuleA", ["InitialDependency"], function (initDep){
return {};
});
define("ModuleB", ["ModuleA", "OtherDependency"], function (moduleA, otherDep){
return {};
});
Each of these modules is defined in a separate file: "ModuleA.js", "ModuleB.js", "InitialDependency.js" and "OtherDependency.js".
These modules are loaded sequentially in my application; ModuleB is always loaded after ModuleA. This means that in the optimization stage I do not want ModuleA's script combined into the built script for ModuleB. I want the following:
ModuleA.built.js includes:
    InitialDependency
    ModuleA
ModuleB.built.js includes:
    OtherDependency
    ModuleB
However, I don't want them all in one file, as ModuleB may never be loaded.
I could write a separate build script for each module, but that would be time-consuming since I have quite a few modules in my project; I'd like a single build script that builds them all at once.
What do I need to know to create a build script for building both of these modules (and more that follow the same dependency pattern)?
To achieve this, you'd have to play with the modules configuration option.
It could look like this:
{
    modules: [
        {
            name: "ModuleA",
            include: [],
            exclude: []
        },
        {
            name: "ModuleB",
            exclude: [
                "ModuleA"
            ]
        }
    ]
}
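For instance, a complete r.js build profile following this pattern might look roughly like the sketch below (baseUrl, dir and the directory names are assumptions; adjust them to your layout). Excluding ModuleA from ModuleB's entry also excludes ModuleA's own dependencies, which is what keeps InitialDependency out of ModuleB's built file:
// build.js - run with: node r.js -o build.js
({
    baseUrl: "js",
    dir: "js-built",            // optimized copies of the whole tree go here
    modules: [
        {
            // pulls in InitialDependency and ModuleA
            name: "ModuleA"
        },
        {
            // ModuleA (and therefore InitialDependency) is left out,
            // so this layer contains only OtherDependency and ModuleB
            name: "ModuleB",
            exclude: ["ModuleA"]
        }
    ]
})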
There's a similar example set up by James here: https://github.com/requirejs/example-multipage
Of course, by building these modules separately you may end up needing to update paths. If so, the best approach is to create a file containing a require.config call with the settings for your built app, and include this configuration instead of your usual one. But if your dependencies are well separated, you'll probably be fine. By "well separated" I mean that if ModuleA is the base script, it shouldn't have dependencies packed with ModuleB - but I guess that's common sense!
Note about shimmed modules: since shim config only works when files are loaded (r.js only uses it to order plugins), be sure you don't include a shimmed module without its dependencies unless you're 100% sure those will be loaded first. More info here: https://github.com/requirejs/example-multipage-shim
Hope this helps!
I have a hybrid CJS/ESM Node package written in TypeScript.
It consists of, let's say, two files - core.ts and extra.ts - and I want to keep them separate. extra.ts imports core.ts (literally import { ... } from './core';), and they are separate entry points of the package. Clients import either only core.ts or both.
Rollup makes it easy to set up build steps for multiple entry points and multiple output formats, and all seemed fine.
But now I ran into an issue:
Let's say I have example.mjs or example.cjs where I import or require the my-package and my-package/extra entry points.
That doesn't work. The error messages are slightly different, but the meaning is the same: the ./core module cannot be found when loading extra.mjs/.cjs.
extra.cjs built by Rollup contains the line var core = require('./core');
extra.mjs built by Rollup contains the line import { ... } from './core';
By default, Node 12 does not guess file extensions (I'm not questioning this).
I have to run it as node --experimental-modules --es-module-specifier-resolution=node ./example.mjs to make it work. This is an unsatisfactory solution; I need example.mjs to be runnable without additional flags.
It appears to me that file extensions can and should be added to the import/require statements in the compiled cjs/mjs files to make them work according to the spec.
However, since I have different build steps in the Rollup config for the two files and external: ['./core'] in the options for extra.ts, as far as Rollup is concerned they are totally unrelated. (And without the external setting, Rollup would just bundle them into a single file, which is not what I need either.)
So the question:
Is there a plugin or a configuration option to make Rollup produce files with correct local imports (file extensions added to local imports according to the output format)?
If there is no existing solution, what would be the best way to add an extra step to the build process that patches the imports?
Or is there a different bundler that might work for this task?
I've got a satisfactory solution.
The output.preserveModules option keeps all files separate while building them in one pass (though it can be limiting in some respects).
The output.entryFileNames option lets me specify the file extensions. (It doesn't apply to modules treated as external, so I can't use it without preserveModules.)
I can build only extra.ts, since it imports every other file. But I have to be mindful of tree shaking when building like this - I need to retain all exports of core.ts.
// rollup.config.js
// (assumed plugin packages: @rollup/plugin-typescript and rollup-plugin-cleanup)
import typescript from '@rollup/plugin-typescript';
import cleanup from 'rollup-plugin-cleanup';

export default [
    {
        external: [],
        input: 'src/extra.ts',
        treeshake: false,
        plugins: [
            typescript(),
            cleanup({ extensions: ['ts'] })
        ],
        output: [
            {
                dir: 'lib',
                format: 'es',
                preserveModules: true,
                entryFileNames: '[name].mjs',
            },
            {
                dir: 'lib',
                format: 'cjs',
                preserveModules: true,
                entryFileNames: '[name].cjs',
            },
        ],
    },
];
One drawback is that I also get a separate output file for any other .ts file imported by core.ts and extra.ts. I think I can live with that.
In my initial configuration, the ideal solution would have required monkey-patching the output files with a sed-like tool after the build.
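For reference, a post-build patching step of that kind could look roughly like the sketch below (the lib/ output directory and the idea that only the relative './core' specifier needs rewriting are assumptions):
// patch-extensions.js - hypothetical post-build step, run with: node patch-extensions.js
const fs = require('fs');

// Append the format-appropriate extension to the bare relative specifier.
const patches = [
    { file: 'lib/extra.mjs', from: "from './core'", to: "from './core.mjs'" },
    { file: 'lib/extra.cjs', from: "require('./core')", to: "require('./core.cjs')" },
];

for (const { file, from, to } of patches) {
    const source = fs.readFileSync(file, 'utf8');
    fs.writeFileSync(file, source.split(from).join(to));
}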
In a simple TypeScript program I require Node FFI with
import * as Electron from 'electron';
import * as ffi from 'ffi';
and then
mylib = ffi.Library('libmoi', {
    'worker': [ 'string', [ 'string' ] ],
    'test':   [ 'string', [] ]
});
Linking that up via webpack yields
WARNING in ./~/bindings/bindings.js
Critical dependencies:
76:22-40 the request of a dependency is an expression
76:43-53 the request of a dependency is an expression
# ./~/bindings/bindings.js 76:22-40 76:43-53
The problem seems to be that FFI has a dynamic require and the fix seems to be to apply webpack.ContextReplacementPlugin in the webpack.config.js file.
This is a bit out of my reach, but an example for an Angular case is:
plugins: [
    new webpack.ContextReplacementPlugin(
        // The (\\|\/) piece accounts for path separators in *nix and Windows
        /angular(\\|\/)core(\\|\/)(esm(\\|\/)src|src)(\\|\/)linker/,
        root('./src') // location of your src
    )
]
Any idea how to do this for FFI?
Here is the answer: a GitHub issue comment on the Johnny-Five repo.
Quoting from brodo's answer, this is what you do to stop webpack from getting snarled up with "bindings" and similar:
... the webpack config looks like this:
const webpack = require('webpack');

module.exports = {
    plugins: [
        // empty the dynamic require context inside the "bindings" package
        new webpack.ContextReplacementPlugin(/bindings$/, /^$/)
    ],
    // and leave require('bindings') to be resolved by Node at runtime
    externals: ["bindings"]
}
I also had a similar issue and somehow managed to resolve it. First I will explain my understanding.
Webpack's main job is to bundle separate code files into one file; by default it bundles all the code that is referenced in its dependency tree.
Generally there are two types of node_modules:
Those to be used on the browser side (angular, rxjs, etc.)
Those to be used on the Node.js side (express, ffi, etc.)
It is safe to bundle browser-side node_modules, but not Node-side ones, because they are not designed for that. So the solution consists of two steps:
Set an appropriate target (node, electron-renderer, etc.) in your webpack.config.js file, e.g. "target": 'electron-renderer'; by default it targets the browser.
Declare Node-side modules as external dependencies in your webpack.config.js file, e.g.
"externals": {
"bindings": "require('bindings')",
"ffi": "require('ffi')"
}
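Putting both steps together, a minimal webpack.config.js for an Electron + ffi setup might look like this sketch (the entry and output values are placeholders, not taken from the question):
// webpack.config.js - sketch combining the target and externals steps
module.exports = {
    entry: './src/main.js',            // placeholder entry point
    output: { filename: 'bundle.js' },
    target: 'electron-renderer',       // or 'node' for a plain Node.js app
    externals: {
        // leave these to be require()'d from node_modules at runtime
        "bindings": "require('bindings')",
        "ffi": "require('ffi')"
    }
};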
Architecture
I would like to share code between the client side and the server side. I have defined aliases in the webpack config:
resolve: {
    // Absolute paths: https://github.com/webpack/webpack/issues/109
    alias: {
        server: absPath('/src/server/'),
        app:    absPath('/src/app/'),
        client: absPath('/src/client/'),
    }
},
Problem
Now on the server side I need to include webpack in order for the correct paths to be recognized when I require a file. For example,
require('app/somefile.js')
will fail in pure Node.js because it can't find the app folder.
What I need (read the What I need updated section)
I need to be able to use the webpack aliases. I was thinking about making a bundle of the entire server part without any files from node_modules. That way, when the server starts it will use modules from the node_modules folder instead of a minified js file. (Why? First: bundling them doesn't work. Second: it's bad, because node_modules are compiled per platform, so I don't want my Windows binaries to end up on a Unix server.)
Desired output:
A compiled server.js file without any node_modules included;
server.js should load node_modules from the node_modules folder.
What I need updated
As I've noticed in https://github.com/webpack/webpack/issues/135, bundling everything into server.js will mess up all the file paths used in I/O operations.
A better idea would be to leave the Node.js server files as they are, but replace the provided require method with a custom webpack require that takes configuration such as aliases (and more?) into account. This could be done the way require.js did it to run on a Node.js server.
What I've tried
By adding this plugin in webpack
new webpack.optimize.CommonsChunkPlugin(/* chunkName= */"ignore", /* filename= */"server.bundle.js")
Entries:
entry: {
    client: "./src/client/index.js",
    server: "./src/server/index.js",
    ignore: ['the_only_node_module'] // But I need to do that for every node_module
},
It creates a file server.js which contains only my server code, and then a server.bundle.js which is not used. The problem is that webpack includes the webpackJsonp function in the server.bundle.js file, so neither the client nor the server will work.
There should be a way to just disable node_modules for a single entry.
What I've tried # 2
I've managed to exclude the path, but the requires don't work because they are already minified: the source looks like require(3) instead of require('my-module'). Each require string has been converted to an integer, so it doesn't work.
To make it work I would also need to patch the require function that webpack exports so that it falls back to Node's native require (easy to do manually, but it should happen automatically).
What I've tried # 3
In the webpack configuration:
{target: "node"}
This only adds an exports variable (judging from a diff of the output, I'm not sure it does anything else).
What I've tried # 4 (almost there)
Using
require.ensure('my_module')
and then replacing all occurrences of r(2).ensure with require. I don't know if the r(2) part is always the same, so this might not be automatable.
Solved
Thanks to ColCh for enlightening me on how to do it here.
require = require('enhanced-require')(module, require('../../webpack.config'));
Changing the require method in Node.js makes Node.js pass all requires through the webpack require function, which lets us use aliases and other gifts! Thanks ColCh!
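In context, that override sits at the top of the server entry file before any aliased requires - a sketch, reusing the hypothetical app/somefile.js from above:
// src/server/index.js - override require before loading aliased modules
require = require('enhanced-require')(module, require('../../webpack.config'));

// subsequent requires go through webpack's resolver, so the aliases work
var somefile = require('app/somefile.js');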
Related
https://www.bountysource.com/issues/1660629-what-s-the-right-way-to-use-webpack-specific-functionality-in-node-js
https://github.com/webpack/webpack/issues/135
http://webpack.github.io/docs/configuration.html#target
https://github.com/webpack/webpack/issues/458
How to simultaneously create both 'web' and 'node' versions of a bundle with Webpack?
http://nerds.airbnb.com/isomorphic-javascript-future-web-apps/
My solution was:
{
    // make sure that webpack will externalize
    // modules using Node's module API (CommonJS 2)
    output: { ...output, libraryTarget: 'commonjs2' },

    // externalize all require() calls to non-relative modules.
    // Unless you do something funky, every time you import a module
    // from node_modules, it should match the regex below
    externals: /^[a-z0-9-]/,

    // Optional: use this if you want to be able to require() the
    // server bundles from Node.js later
    target: 'node'
}
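Assembled into a full server-side config together with the aliases from the question, that might look roughly like the following sketch (absPath and the entry/output paths are placeholders carried over from the question, not a definitive setup):
// webpack.server.config.js - sketch of a server-only build
const path = require('path');
const absPath = (p) => path.join(__dirname, p); // hypothetical helper from the question

module.exports = {
    entry: './src/server/index.js',
    target: 'node',                    // keep Node built-ins out of the bundle
    output: {
        path: absPath('/build'),
        filename: 'server.js',
        libraryTarget: 'commonjs2'     // export through Node's module API
    },
    // leave every bare module specifier (i.e. node_modules) to runtime require()
    externals: /^[a-z0-9-]/,
    resolve: {
        alias: {
            server: absPath('/src/server/'),
            app:    absPath('/src/app/'),
            client: absPath('/src/client/')
        }
    }
};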
I know we can use RequireJS to combine files into one js file, with a config like the following:
module.exports = {
    baseUrl: 'js/',
    mainConfigFile: 'src/js/common.js',
    dir: 'scripts/',
    optimize: 'uglify2',
    modules: [
        {
            name: 'common',
            include: [
                'jquery',
            ]
        }
    ]
};
My result, combined into one file, is:
common.js
----------------
jquery.js
modernizr.js
common.js
My question is: do we still need to put a require.js file in the scripts folder and use the following format,
<script data-main="scripts/common" src="scripts/require.js"></script>
or can we just use
<script src="scripts/common.js"></script>
since the files are combined into one file?
You still need to load require.js in the usual way to actually get the module-loading benefits it provides, especially if you use the asynchronous functionality a lot. However, you can have a look at almond, provided your code uses AMD and (from the README) you:
optimize all the modules into one file -- no dynamic code loading.
have IDs and dependency arrays in all define() calls -- the RequireJS optimizer will take care of this for you.
only have one requirejs.config() or require.config() call.
do not use RequireJS multiversion support/contexts.
do not use require.toUrl() or require.nameToUrl().
do not use packages/packagePaths config. If you need to use packages that have a main property, volo can create an adapter module so that it can work without this config. Use the amdify add command to add the dependency to your project.
Almond is great because it doesn't need require.js at all; it wraps your own code with itself, which is a very minimal AMD loader skeleton and nowhere near as powerful as the main library. You then get a single optimised file that can be linked directly in your HTML:
<script src="scripts/common.js"></script>
The Gruntfile config for almond could look something like this:
compile: {
    options: {
        name: 'path/to/almond',
        baseUrl: 'js',
        include: ['main'],
        insertRequire: ['main'],
        mainConfigFile: 'scripts/config.js',
        out: 'scripts/main.js',
        optimizeAllPluginResources: true,
        wrap: true
    }
}
The above is all standard r.js boilerplate; you can find many more examples on the almond homepage.
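If you're not using Grunt, an equivalent standalone r.js build profile is another option - the paths below simply mirror the Gruntfile above, so treat them as assumptions about your layout:
// build.js - run with: node r.js -o build.js
({
    baseUrl: 'js',
    name: 'path/to/almond',
    include: ['main'],
    insertRequire: ['main'],
    mainConfigFile: 'scripts/config.js',
    out: 'scripts/main.js',
    wrap: true
})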
I'm working on an application built with Brunch. I would like to load some* of the vendor-supplied JavaScript as modules, so that I can require them in my code rather than relying on global variables. Is there some way to do this without copying all the vendor code into my app directory?
I tried creating a vendorlib directory, but Brunch doesn't seem to look anywhere but app and vendor. I also tried making a vendor/modules directory, but Brunch seems not to wrap anything found under vendor (even when I convinced it to combine those files with the module files found under app).
*The "some" I'm working on right now are Chaplin, Backbone and Underscore. If I get those to work, I'll move more over later.
You can override config.modules.wrapper and make it wrap, for example, all files in a vendor/modules directory. Or you can add more directories for Brunch to handle via config.paths.watched.
For those following along at home, this is what my config.coffee eventually looked like:
paths:
  watched: ['app', 'vendor', 'test', 'vendorlib']
files:
  javascripts:
    joinTo:
      'javascripts/app.js': /^app/
      'javascripts/vendor.js': /^vendor/
      'test/javascripts/test.js': /^test[\\/](?!vendor)/
      'test/javascripts/test-vendor.js': /^test[\\/](?=vendor)/
    order:
      # Files in `vendor` directories are compiled before other files
      # even if they aren't specified in order.before.
      before: [
        'vendor/scripts/console-polyfill.js',
      ]
      after: [
        'test/vendor/scripts/test-helper.js'
      ]
  stylesheets:
    joinTo:
      'stylesheets/app.css': /^(app|vendor)/
      'test/stylesheets/test.css': /^test/
    order:
      after: ['vendor/styles/helpers.css']
  templates:
    joinTo: 'javascripts/app.js'
modules:
  nameCleaner: (path) ->
    path.replace(/^(app|vendorlib)\//, '')
This lets me populate a vendorlib directory with modules from vendors that support being loaded as modules. I currently have Chaplin, jQuery, and Backbone in there. I had to rename the files so they don't include version numbers.
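With that in place, app code can require the wrapped vendor files by their cleaned module names - a small sketch, assuming the files are named vendorlib/chaplin.js, vendorlib/backbone.js and so on:
// e.g. in app code - nameCleaner strips the "vendorlib/" prefix,
// so the wrapped files are registered under their bare names
var Chaplin = require('chaplin');
var Backbone = require('backbone');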