I need to include a library that is hosted on GitHub but is not well packaged, using Duo.js.
At the time of writing I am using the following to achieve what I want:
bower
gulp
main-bower-files
Bower just downloads the library.
Gulp, together with main-bower-files, lets me override individual package options and set up a so-called "main file" that I can build.
Example:
var gulp = require('gulp');
var bowerFiles = require('main-bower-files');

gulp.task('copy-libs', function () {
  return gulp.src(bowerFiles({ env: 'development' }))
    .pipe(gulp.dest('build/libs/'));
});
bower.json file:
"dependencies": {
"cash": "sudo-js/cash",
"bootstrap": "~3.3.2",
"delorean": "~0.8.7",
"react": "~0.12.2"
},
"overrides": {
"cash": {
"main": {
"development": "build/debug/cash.js"
}
}
}
}
How can I achieve this with Duo.js?
The documentation is quite thin regarding libraries that do not ship with a valid component.json.
You can specify the path to an entry file for your lib. It won't be as clean as just specifying user/repo, but it'll get the job done.
For example, when including Twitter Bootstrap from twbs/bootstrap:
require('twbs/bootstrap@v3.3.2:dist/js/bootstrap.js');
// repo: twbs/bootstrap
// version/tag: v3.3.2
// path: dist/js/bootstrap.js
Unfortunately, this doesn't work out of the box, since it assumes you have the jQuery global, so you need to add this above the previous line:
jQuery = require('components/jquery'); // leave out `var` so it becomes a global
This includes jQuery from the wonderful components project (they package up popular libs so they can be consumed by various package managers).
Also, it turns out there is a components/bootstrap that is properly packaged with a component.json.
So, you can actually make bootstrap work with the following:
jQuery = require('components/jquery');
require('components/bootstrap');
For the other libraries that aren't as common, you can use the process mentioned first to specify the path to the right JS/CSS file (i.e. user/repo@version:path).
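For instance, applied to the cash library from the question above, a hedged sketch might look like this (the ref and file path are assumptions based on the bower override shown earlier):

// Sketch: require a specific built file from a repo without a component.json.
// 'sudo-js/cash' comes from the question; 'master' and 'build/debug/cash.js' are assumed.
var cash = require('sudo-js/cash@master:build/debug/cash.js');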
TL;DR How can I create an alias for a local yarn workspace dependency?
I've tried yarn workspaces before and never succeeded, and I'm giving it another try.
I've set "workspaces": ["packages/*"] in package.json.
For each package, I decided to use the naming convention @-/package-name to prevent naming conflicts without worrying about having namespaces for internal packages.
When adding packages as dependencies, I've been following a style where I use an interface name for resolution, but point that towards a concrete implementation. This is what I did before using yarn workspaces:
"dependencies": {
"my-interface-name": "file:some/path/to/packages/some-concrete-implementation"
}
This is basically to allow what I like to call compile-time static dependency injection. And it also means each package can individually name their interface dependencies appropriately to their need and to prevent naming conflicts.
However, I can't figure out how to accomplish this with yarn workspaces. How do I create an alias called my-interface-name for my yarn workspaces package @-/some-concrete-implementation?
What I've already tried with no success:
Defining the dependency like "my-interface-name": "@-/some-concrete-implementation" - for some reason this causes yarn to look for @-/some-concrete-implementation on the npm registry instead of in the local workspace.
I've also tried to use the workspace protocol: "my-interface-name": "workspace:@-/some-concrete-implementation" - but it still looks for the package on the npm registry!
What I haven't yet tried and could work but removes benefits of using yarn workspaces in the first place:
"dependencies": {"my-interface-name": "file:../../node_modules/#-/some-concrete-implementation"}"
Have you seen the resolutions package.json key? Is it what you need?
I've used it for aliasing/overriding external packages but the example in the docs shows it working with local packages.
Resolutions
Allows you to override a version of a particular nested dependency. See the Selective Versions Resolutions RFC for the full spec.
{
  "resolutions": {
    "transitive-package-1": "0.0.29",
    "transitive-package-2": "file:./local-forks/transitive-package-2",
    "dependencies-package-1/transitive-package-3": "^2.1.1"
  }
}
From the RFC:
"**/a" denotes all the nested dependencies a of the project.
"a" is an alias for **/a (for retro-compatibility, see below, and because if it wasn't such an alias, it wouldn't mean anything as it would represent one of the non-nested project dependencies, which can't be overridden as explained below).
So, I believe the rule you need is:
"**/my-interface-name": "file:some/path/to/packages/some-concrete-implementation"
// OR equivalent
"my-interface-name": "file:some/path/to/packages/some-concrete-implementation"
I believe it works in the package's package.json. Worst case you can hoist it to the workspace root and make the rule specific to the workspace e.g. "a/b".
The workspace: alias protocol (also available in pnpm) seems to be the direction to take.
I've also tried to use the workspace protocol: "my-interface-name": "workspace:@-/some-concrete-implementation" - but it still looks for the package on the npm registry!
Be sure to have yarn 3 installed, otherwise you'll run into weird issues.
Note that the syntax of "my-interface-name": "workspace:@-/some-concrete-implementation" looks incorrect.
It should be "@xxx/some-concrete-implementation": "workspace:*", assuming the name of the linked package is "name": "@xxx/some-concrete-implementation".
With this in mind you don't even need to create a specific @-/ name. With the workspace protocol, yarn will ensure it's never downloaded from npm. It becomes an internal workspace dependency.
PS:
Yarn 3 installation
Generally a simple yarn set version 3.0.2 && yarn plugin import workspace-tools will work.
To avoid current PnP limitations, check the generated config .yarnrc.yml and ensure nodeLinker is set to 'node-modules':
# Yarn 2+ supports PnP or regular node_modules installs. Use the node-modules one.
nodeLinker: node-modules

plugins:
  - path: .yarn/plugins/@yarnpkg/plugin-workspace-tools.cjs
    spec: "@yarnpkg/plugin-workspace-tools"

yarnPath: .yarn/releases/yarn-3.0.2.cjs
PS: you might want to add this to .gitignore too
.yarn/*
!.yarn/patches
!.yarn/releases
!.yarn/plugins
!.yarn/sdks
!.yarn/versions
.pnp.*
Run a yarn install just after.
About the package.json files
As you did, the root package.json defines the workspace paths:
{
  "name": "monorepo",
  "workspaces": [
    "packages/*" // Enable package discovery in the packages/* directory.
  ],
  // ...
  "devDependencies": {
    "husky": "7.0.2" // Only what's needed for monorepo management
  }
}
In your app packages/app/package.json
{
  "name": "my-app",
  "devDependencies": {
    "@types/node": "16.10.1",
    // ...
  },
  "dependencies": {
    // Assuming the name of packages/shared is "@your-org/some-concrete-implementation",
    // we explicitly declare the dependency on it through the
    // workspace: alias (package-manager perspective).
    "@your-org/some-concrete-implementation": "workspace:*"
  }
}
Your consumed package should declare the same name:
{
  "name": "@your-org/some-concrete-implementation"
}
Bonus: TypeScript aliases
If your project is written in TypeScript, you can even replicate your aliases through TypeScript path mapping. This lets you include the files as-is (no prior compilation needed).
Following your example, just edit ./packages/xxx/tsconfig.json like this:
{
  "compilerOptions": {
    // Here baseUrl is set to ./src (good practice); it can
    // also be set to '.'
    "baseUrl": "./src",
    "paths": {
      // Declare deps here (keep them in sync with what
      // you defined in package.json).
      // PS: paths are relative to baseUrl.
      "@your-org/some-concrete-implementation/*": ["../../some-concrete-implementation/src/*"],
      // If you have a barrel in the lib:
      "@your-org/some-concrete-implementation": ["../../some-concrete-implementation/src/index"]
    }
  }
}
PS: for non-TypeScript projects, babel-plugin-module-resolver can be used in a similar manner.
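For example, a minimal sketch of that approach (the alias and relative path below mirror the tsconfig example above and are assumptions about your layout):

// babel.config.js — mirror the TypeScript path mapping with babel-plugin-module-resolver
module.exports = {
  plugins: [
    ['module-resolver', {
      // Keep this alias in sync with the dependency declared in package.json
      alias: {
        '@your-org/some-concrete-implementation': '../some-concrete-implementation/src'
      }
    }]
  ]
};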
I'm trying to import a Node module inside an Angular 8 web worker, but I get a compile error 'Cannot find module'. Does anyone know how to solve this?
I created a new worker inside my Electron project with ng generate web-worker app, as described in the Angular documentation.
Everything works fine until I add an import like path or fs-extra, e.g.:
/// <reference lib="webworker" />
import * as path from 'path';

addEventListener('message', ({ data }) => {
  console.log(path.resolve('/'));
  const response = `worker response to ${data}`;
  postMessage(response);
});
This import works fine in any other TS component, but inside the web worker I get a compile error with this message:
Error: app/app.worker.ts:3:23 - error TS2307: Cannot find module 'path'.
How can I fix this? Maybe I need some additional parameter in the generated tsconfig.worker.json?
To reproduce the error, run:
$ git clone https://github.com/hoefling/stackoverflow-57774039
$ cd stackoverflow-57774039
$ yarn build
Or check out the project's build log on Travis.
Note:
1) I only found this as a similar problem, but the answer handles only custom modules.
2) I tested the same import with a minimal Electron seed which uses web workers and it worked, but that example uses plain JavaScript without Angular.
1. TypeScript error
As you've noticed the first error is a TypeScript error. Looking at the tsconfig.worker.json I've found that it sets types to an empty array:
{
  "compilerOptions": {
    "types": [],
    // ...
  }
  // ...
}
Specifying types turns off the automatic inclusion of @types packages, which is a problem in this case because path has its type definitions in @types/node.
So let's fix that by explicitly adding node to the types array:
{
  "compilerOptions": {
    "types": [
      "node"
    ],
    // ...
  }
  // ...
}
This fixes the TypeScript error; however, trying to build again, we're greeted with a very similar error, this time from Webpack directly.
2. Webpack error
ERROR in ./src/app/app.worker.ts (./node_modules/worker-plugin/dist/loader.js!./src/app/app.worker.ts)
Module build failed (from ./node_modules/worker-plugin/dist/loader.js):
ModuleNotFoundError: Module not found: Error: Can't resolve 'path' in './src/app'
To figure this one out we need to dig quite a lot deeper...
Why it works everywhere else
First, it's important to understand why importing path works in all the other modules. Webpack has the concept of targets (web, node, etc.). Webpack uses this target to decide which default options and plugins to use.
Ordinarily the target of an Angular application using @angular-devkit/build-angular:browser would be web. However, in your case the postinstall:electron script actually patches node_modules to change that:
postinstall.js (parts omitted for brevity)
const f_angular = 'node_modules/@angular-devkit/build-angular/src/angular-cli-files/models/webpack-configs/browser.js';

fs.readFile(f_angular, 'utf8', function (err, data) {
  var result = data.replace(/target: "electron-renderer",/g, '');
  result = result.replace(/target: "web",/g, '');
  result = result.replace(/return \{/g, 'return {target: "electron-renderer",');
  fs.writeFile(f_angular, result, 'utf8');
});
The target electron-renderer is treated by Webpack similarly to node. Especially interesting for us: it adds the NodeTargetPlugin by default.
What does that plugin do, you wonder? It adds all known built-in Node.js modules as externals. When building the application, Webpack will not attempt to bundle externals. Instead, they are resolved using require at runtime. This is what makes importing path work, even though it's not installed as a module known to Webpack.
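Conceptually (this is an illustrative sketch, not the plugin's actual source), the effect is roughly the same as declaring the built-ins as commonjs externals in the Webpack config:

// Roughly what NodeTargetPlugin amounts to: Node built-ins become externals,
// so Webpack emits require('path') at runtime instead of trying to bundle it.
module.exports = {
  externals: {
    path: 'commonjs path',
    fs: 'commonjs fs',
    os: 'commonjs os'
    // ...and the rest of the Node.js built-in modules
  }
};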
Why it doesn't work for the worker
The worker is compiled separately using the WorkerPlugin. In their documentation they state:
By default, WorkerPlugin doesn't run any of your configured Webpack plugins when bundling worker code - this avoids running things like html-webpack-plugin twice. For cases where it's necessary to apply a plugin to Worker code, use the plugins option.
Looking at the usage of WorkerPlugin deep within @angular-devkit we see the following:
@angular-devkit/src/angular-cli-files/models/webpack-configs/worker.js (simplified)
new WorkerPlugin({
  globalObject: false,
  plugins: [
    getTypescriptWorkerPlugin(wco, workerTsConfigPath)
  ],
})
As we can see, it uses the plugins option, but only for a single plugin, which is responsible for the TypeScript compilation. This way the default plugins configured by Webpack, including NodeTargetPlugin, get lost and are not used for the worker.
Solution
To fix this we have to modify the Webpack config, and to do that we'll use @angular-builders/custom-webpack. Go ahead and install that package.
Next, open angular.json and update projects > angular-electron > architect > build:
"build": {
"builder": "#angular-builders/custom-webpack:browser",
"options": {
"customWebpackConfig": {
"path": "./extra-webpack.config.js"
}
// existing options
}
}
Repeat the same for serve.
Now, create extra-webpack.config.js in the same directory as angular.json:
const WorkerPlugin = require('worker-plugin');
const NodeTargetPlugin = require('webpack/lib/node/NodeTargetPlugin');

module.exports = (config, options) => {
  let workerPlugin = config.plugins.find(p => p instanceof WorkerPlugin);
  if (workerPlugin) {
    workerPlugin.options.plugins.push(new NodeTargetPlugin());
  }
  return config;
};
The file exports a function which will be called by @angular-builders/custom-webpack with the existing Webpack config object. We can then search all plugins for an instance of the WorkerPlugin and patch its options, adding the NodeTargetPlugin.
I'm trying to use the linkurious library (a sigma fork), which provides "main": "dist/sigma.require.js" in its package.json. This allows me to do:
var sigma = require('linkurious');
However, the plugins are not included, so I have to require them separately. The problem is that the plugins rely on the sigma variable being available in the global scope, so I've shimmed things as follows (from the package.json):
"browser": {
"sigma": "./node_modules/linkurious/dist/sigma.js",
"linkurious/plugins": "./node_modules/linkurious/dist/plugins.js"
},
"browserify-shim": {
"sigma": {"exports": "sigma"},
"linkurious/plugins": { "depends": [ "sigma" ] }
},
"browserify": {
"transform": [ "browserify-shim" ]
},
This, when run in a browser, doesn't generate errors during inclusion of the plugins (I gather this means the global variable is available), but references to the plugins fail (as if they failed to attach themselves, or attached themselves to a non-global variable).
I'm using grunt-browserify to run the process where I have it configured like this (from the Gruntfile.js):
grunt.initConfig({
  browserify: {
    libs: {
      files: { 'inc.js': ['index.js'] },
    },
  }
});
I've attached a little project to this issue with the minimal code required to demonstrate the problem, in the hope that someone else can replicate it and figure it out. Unpack it, run npm install; npm start, and point a browser at http://localhost:8002/ to see the issue.
Thanks in advance,
ekkis
sigma.zip
- edit I -
Incidentally, bendrucker at the git repo (see: https://github.com/thlorenz/browserify-shim/issues/215) suggests I need to do a global transform. It's been explained to me that shimming doesn't work on node_modules files and that for those I need a global transform. This doesn't make much sense to me, as the whole point of shimming is that you don't own the code you're shimming. In any case, bendrucker pointed me to this other SO post where the question is posed but no answers are provided.
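For reference, my understanding is that a global transform with grunt-browserify would be configured roughly like this (an untested sketch; the { global: true } flag is what lets the transform reach into node_modules):

// Gruntfile.js (sketch): apply browserify-shim as a global transform
grunt.initConfig({
  browserify: {
    libs: {
      files: { 'inc.js': ['index.js'] },
      options: {
        transform: [['browserify-shim', { global: true }]]
      }
    }
  }
});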
help?
In my project, I'm using Babel 6 with the require hook. I need to load a custom Babel plugin that I wrote. But do I really need to publish my plugin to npm first, and then include the plugin name in my main project's .babelrc?
Is there any way to just load the plugin code directly? In other words, can I just load the following directly?
export default function({ types: t }) {
  return {
    visitor: {
      ...
    }
  };
}
Where you list your plugins in your .babelrc, provide the path to your plugin instead of your standard published plugin name.
"plugins": ["transform-react-jsx", "./your/plugin/location"]
When exporting your plugin function, you'll probably need to use module.exports = instead of export default, since ES2015 modules haven't been fully implemented in Node yet.
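For instance, a minimal local plugin in CommonJS form might look like this (the visitor body is purely illustrative):

// ./your/plugin/location/index.js
module.exports = function ({ types: t }) {
  return {
    visitor: {
      // Illustrative only: log every identifier named `foo`
      Identifier(path) {
        if (path.node.name === 'foo') {
          console.log('found foo');
        }
      }
    }
  };
};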
This is my entire babel.config.js file.
module.exports = function (api) {
  api.cache(true);

  const presets = ["@babel/preset-env", "@babel/preset-react"];
  const plugins = [
    ["@babel/plugin-proposal-pipeline-operator", { "proposal": "minimal" }],
    "c:\\projects\\my-babel-plugin"
  ];

  return {
    presets,
    plugins
  };
}
The first item in the plugins array is a plugin with options, in the form of an array. The second item is my own local plugin.
Inside the my-babel-plugin folder there needs to be a package.json with a "main" entry, usually "main": "lib/index.js" or "main": "src/index.js".
I'm having problems using CanJS together with StealJS. I've cloned the JavaScriptMVC repo (3.3 uses CanJS). Now I have this folder structure:
/js
/can
/documentjs
/funcunit
/plugins
.
.
.
In another part of my application I have a "standalone module", e.g. layout (generated using the scaffolding tool).
I load this module using "js/steal/steal.js?path/to/module/layout" inside my page and it works. If I steal some jQuery plugins (e.g. located in the main js folder) inside my layout.js like so:
steal('plugins/jqueryplugin.js', 'plugins/jqueryplugin.css', function() {
  // my code here
});
it still works, but when I try to add to the list of "dependencies" some component from CanJS (even the fixture.js generated with the tool, because it steals can.fixture), it just stops working and breaks everything. I've also tried using:
steal('that').then('this', function() {});
But I get the same result: it fails! Does anyone have any hints?
OK, I found the problem. There is nothing wrong with StealJS and CanJS, but
CanJS just loads its own version of jQuery,
which breaks my application. Now I need to find a way to load CanJS and jQuery separately (I use Yii, and some extensions need jQuery loaded at a certain time, so they cannot wait for CanJS).
Is the issue the version of jQuery or the order of dependencies?
You can configure steal via the stealconfig.js to use another version of jQuery and manage any dependencies.
An example can be found in the GitHub repo (that example does not show dependencies, so I added one below):
https://github.com/bitovi/steal/blob/master/stealconfig.js
steal.config({
  map: {
    "*": {
      "jquery/jquery.js": "jquery", // Map to path
      "bootstrap/bootstrap.js": "bootstrap",
      "can/util/util.js": "can/util/jquery/jquery.js"
    }
  },
  paths: {
    "jquery": "can/lib/jquery.1.8.3.js", // Path to jQuery
    "bootstrap": "lib/bootstrap.js",
    "yui/yui.js": "can/lib/yui-3.7.3.js"
  },
  shim: {
    jquery: {
      exports: "jQuery"
    },
    bootstrap: { // A dependency example
      'deps': ['jquery']
    }
  },
  ext: {
    js: "js",
    css: "css",
    less: "steal/less/less.js",
    coffee: "steal/coffee/coffee.js",
    ejs: "can/view/ejs/ejs.js",
    mustache: "can/view/mustache/mustache.js"
  }
});
Note: this is an untested example, hope this helps.
I had problems with StealJS too; I know it works well with JavaScriptMVC,
but now I'm using AMD (RequireJS) for dependency management, and it works great with CanJS.
Here is the documentation: http://canjs.com/guides/using-require.html. I hope it helps you!
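For what it's worth, a minimal sketch of that kind of setup (all paths and module names below are assumptions; see the linked guide for the exact layout of the CanJS AMD build):

// main.js — load jQuery yourself and point RequireJS at the CanJS AMD build
require.config({
  paths: {
    'jquery': 'lib/jquery-1.8.3',            // assumed local jQuery, loaded on your schedule
    'can': 'bower_components/canjs/amd/can'  // assumed location of the CanJS AMD modules
  }
});

require(['jquery', 'can'], function ($, can) {
  // jQuery is loaded independently of CanJS here, so Yii extensions can rely on it
});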