How to get snowpack to look inside a package for subpath - node.js

I am building a Snowpack app right now, and I would like to import the socket.io client in the frontend (for IntelliSense and offline dev testing). However, socket.io only exposes the backend materials when using import ... from 'socket.io'.
Normally, I use
import { io } from 'socket.io/client-dist/socket.io.js';
which resolves all the correct files and exports. However, when building with Snowpack I get this error:
Package exports for 'C:\dev\JS\Node+Browser\foo\node_modules\socket.io' do not define a './client-dist/socket.io.js' subpath
Which fails the build, stopping everything.
Right now, my snowpack.config is really bare bones:
module.exports = {
  buildOptions: {
    out: 'dist/client'
  },
  mount: {
    "src/client": "/"
  }
}
All the rest of my modules run fine, because they are all imported with only import ... from 'module-name'. I understand what the error is saying, but I can't find anything online or think of anything to solve it. Does anyone know how to fix this?

NOTE: This is a "hacky" fix that I think is messy and cannot be used for larger projects.
I patched this by editing the package.json of the socket.io package (in node_modules) to add a temporary export alias that is exactly the same as the real file path:
node_modules/socket.io/package.json
"exports": {
".": [
{
"require": "./dist/index.js",
"import": "./wrapper.mjs"
},
"./src/index.js"
],
"./client-dist/socket.io": "./client-dist/socket.io.js",
"path-to-other-modules": "same-path"
},
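With that alias in place, the frontend import has to match the new subpath key exactly, so (if I'm reading the patched exports map correctly; this part is my own inference) it would be written without the .js extension:
import { io } from 'socket.io/client-dist/socket.io';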

Related

eslint rule @nrwl/nx/enforce-module-boundaries fails

Intro
I was very confused by that rule when I recently ported an Ng code base to Nx 12.x. I hope this post helps others who are beginning to migrate from Ng to Nx.
The code base in question is a rather small single repo which is now used in production. When using Nx it's good practice to follow the monorepo recommendations so you can take advantage of the monorepo benefits later as the code base grows (e.g. here I'm avoiding over-exposing the code in the current repo).
I put the code base into my-org/apps/my-small-repo. When linting, I was confused by the failure of the rule @nrwl/nx/enforce-module-boundaries, so I tried different ways of mapping the src/app of my-org/apps/my-small-repo, where either the compiler, the linter, or both just failed.
I figured out the following solutions.
Solution 1
Just put
"compilerOptions": {
"baseUrl": "src"
},
into the root of apps/my-small-repo/tsconfig.json and replace all of your imports inside of apps/my-small-repo with imports beginning with app.
Example for a DashboardComponent:
import { DashboardComponent } from 'app/components/dashboard/dashboard.component';
Probably a better solution
This solution is tested on nx 13.x, but it probably works on previous versions of nx also.
Put
"app/*": ["apps/my-org/src/app/*"]
into paths in compilerOptions of your tsconfig.base.json in the repo root. Then add "allowCircularSelfDependency": true to the rule @nrwl/nx/enforce-module-boundaries in the repo root.
We went with "allowCircularSelfDependency": true to avoid working with ugly relative paths like ../../../../../ in the app. We also want to have library namespaces in tsconfig.base.json only.
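As a rough sketch of both edits (file names are the Nx defaults; adjust the paths to your repo layout, and in a real Nx workspace the rule usually sits inside an overrides entry of the eslint config), this would look something like:
tsconfig.base.json (repo root):
{
  "compilerOptions": {
    "paths": {
      "app/*": ["apps/my-org/src/app/*"]
    }
  }
}
.eslintrc.json (repo root):
{
  "rules": {
    "@nrwl/nx/enforce-module-boundaries": [
      "error",
      {
        "allowCircularSelfDependency": true
      }
    ]
  }
}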
Documentation of the rule
https://github.com/nrwl/nx/blob/master/packages/eslint-plugin-nx/src/rules/enforce-module-boundaries.ts
For those who are coming here without this getting resolved (Nx monorepo usage):
Troubleshooting the two errors (TS error and lint error):
First, the alias error:
Cannot find module '@account/components/something' or its corresponding type declarations.
In your base tsconfig.base.json (not the tsconfig.json under your apps, as that gets overridden), add:
"compilerOptions":{
...
baseUrl:"." // Try "src" as well incase of boiler plates or if your resolved path (on the error) is missing an src.
path: {
"#account/*": ["app/*"],
"#account/components/*": ["app/components/*"]
}
},
The above will resolve:
import { authMiddleware } from '@account/components/something';
from
import { authMiddleware } from '../../../components/something';
For the lint error:
Projects should use relative imports to import from other files within the same project - eslint rule @nrwl/nx/enforce-module-boundaries fails
Add "allowCircularSelfDependency": true.
"#nrwl/nx/enforce-module-boundaries": [
"error",
{
"allowCircularSelfDependency": true, -> This may solve the lint error.
"allow": ["#account/**"], -> // White list the lint error.
...
}
Whitelist the folders: add "allow": ["@foldername"]
"#nrwl/nx/enforce-module-boundaries": [
"error",
{
"allow": ["#account/**"], -> // White list the lint error.
...
}
That should fix it.
To get this working:
In your base tsconfig.base.json or your local tsconfig.json (I suggest doing it in tsconfig.base.json).
Considering your path apps/my-org/src/app/*:
"compilerOptions":{
...
baseUrl:"src"
path: {
"#app/*": ["app/*"] // << Here is the change
}
},
In your code files, change imports from apps/my-org/src/app/* to @app/*.
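Reusing the DashboardComponent example from earlier in this thread (the exact relative path is only an illustration), an import would then change roughly like this:
// before
import { DashboardComponent } from '../../components/dashboard/dashboard.component';
// after
import { DashboardComponent } from '@app/components/dashboard/dashboard.component';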

How to import a node module inside an angular web worker?

I am trying to import a Node module inside an Angular 8 web worker, but I get a compile error 'Cannot find module'. Does anyone know how to solve this?
I created a new worker inside my Electron project with ng generate web-worker app, as described in the ng documentation.
Everything works fine until I add an import like path or fs-extra, e.g.:
/// <reference lib="webworker" />
import * as path from 'path';
addEventListener('message', ({ data }) => {
  console.log(path.resolve('/'));
  const response = `worker response to ${data}`;
  postMessage(response);
});
This import works fine in any other TS component, but inside the web worker I get a compile error with this message:
Error: app/app.worker.ts:3:23 - error TS2307: Cannot find module 'path'.
How can I fix this? Maybe I need some additional parameter in the generated tsconfig.worker.json?
To reproduce the error, run:
$ git clone https://github.com/hoefling/stackoverflow-57774039
$ cd stackoverflow-57774039
$ yarn build
Or check out the project's build log on Travis.
Note:
1) I only found this similar problem, but the answer only handles custom modules.
2) I tested the same import with a minimal Electron seed which uses web workers and it worked, but that example uses plain JavaScript without Angular.
1. TypeScript error
As you've noticed the first error is a TypeScript error. Looking at the tsconfig.worker.json I've found that it sets types to an empty array:
{
  "compilerOptions": {
    "types": [],
    // ...
  }
  // ...
}
Specifying types turns off the automatic inclusion of @types packages, which is a problem in this case because path has its type definitions in @types/node.
So let's fix that by explicitly adding node to the types array:
{
  "compilerOptions": {
    "types": [
      "node"
    ],
    // ...
  }
  // ...
}
This fixes the TypeScript error; however, trying to build again we're greeted with a very similar error, this time from Webpack directly.
2. Webpack error
ERROR in ./src/app/app.worker.ts (./node_modules/worker-plugin/dist/loader.js!./src/app/app.worker.ts)
Module build failed (from ./node_modules/worker-plugin/dist/loader.js):
ModuleNotFoundError: Module not found: Error: Can't resolve 'path' in './src/app'
To figure this one out we need to dig quite a lot deeper...
Why it works everywhere else
First it's important to understand why importing path works in all the other modules. Webpack has the concept of targets (web, node, etc). Webpack uses this target to decide which default options and plugins to use.
Ordinarily the target of an Angular application using @angular-devkit/build-angular:browser would be web. However, in your case the postinstall:electron script actually patches node_modules to change that:
postinstall.js (parts omitted for brevity)
const f_angular = 'node_modules/@angular-devkit/build-angular/src/angular-cli-files/models/webpack-configs/browser.js';
fs.readFile(f_angular, 'utf8', function (err, data) {
  var result = data.replace(/target: "electron-renderer",/g, '');
  var result = result.replace(/target: "web",/g, '');
  var result = result.replace(/return \{/g, 'return {target: "electron-renderer",');
  fs.writeFile(f_angular, result, 'utf8');
});
The target electron-renderer is treated by Webpack similarly to node. Especially interesting for us: it adds the NodeTargetPlugin by default.
What does that plugin do, you wonder? It adds all known built-in Node.js modules as externals. When building the application, Webpack will not attempt to bundle externals. Instead, they are resolved using require at runtime. This is what makes importing path work, even though it's not installed as a module known to Webpack.
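As a rough illustration (a sketch of the idea, not actual Webpack output), an external like path effectively ends up in the bundle as a plain pass-through to the runtime require:
// what the bundle conceptually contains for an external module
module.exports = require('path');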
Why it doesn't work for the worker
The worker is compiled separately using the WorkerPlugin. In their documentation they state:
By default, WorkerPlugin doesn't run any of your configured Webpack plugins when bundling worker code - this avoids running things like html-webpack-plugin twice. For cases where it's necessary to apply a plugin to Worker code, use the plugins option.
Looking at the usage of WorkerPlugin deep within @angular-devkit we see the following:
@angular-devkit/src/angular-cli-files/models/webpack-configs/worker.js (simplified)
new WorkerPlugin({
  globalObject: false,
  plugins: [
    getTypescriptWorkerPlugin(wco, workerTsConfigPath)
  ],
})
As we can see, it uses the plugins option, but only for a single plugin, which is responsible for the TypeScript compilation. This way the default plugins configured by Webpack, including NodeTargetPlugin, get lost and are not used for the worker.
Solution
To fix this we have to modify the Webpack config, and to do that we'll use @angular-builders/custom-webpack. Go ahead and install that package.
Next, open angular.json and update projects > angular-electron > architect > build:
"build": {
"builder": "#angular-builders/custom-webpack:browser",
"options": {
"customWebpackConfig": {
"path": "./extra-webpack.config.js"
}
// existing options
}
}
Repeat the same for serve.
Now, create extra-webpack.config.js in the same directory as angular.json:
const WorkerPlugin = require('worker-plugin');
const NodeTargetPlugin = require('webpack/lib/node/NodeTargetPlugin');
module.exports = (config, options) => {
  let workerPlugin = config.plugins.find(p => p instanceof WorkerPlugin);
  if (workerPlugin) {
    workerPlugin.options.plugins.push(new NodeTargetPlugin());
  }
  return config;
};
The file exports a function which will be called by @angular-builders/custom-webpack with the existing Webpack config object. We can then search all plugins for an instance of the WorkerPlugin and patch its options, adding the NodeTargetPlugin.

Trying to share code from Hyperapp with Bit.dev

I'm trying to share code from my front end (Hyperapp) with my admin (Hyperapp too) to make a "preview" button.
The setup of these projects was done by another dev, so I had to learn the Hyperapp workflow on the job; I'm not an expert.
From what I know, he was inspired by the Facebook React conf.
All my useful code is in the src/ folder, and there are many dependencies, so I have to export everything (api, constants, utils, etc.).
Here is my Bit configuration (which works; it exports the code correctly):
"bit": {
"env": {
"compiler": "bit.envs/compilers/react#1.0.2"
},
"packageManager": "yarn",
"packageManagerArgs": [
"--production",
"--no-optional"
],
"packageManagerProcessOptions": {
"shell": true
},
"resolveModules": {
"modulesDirectories": [
"src"
]
},
"dist": {
"entry": "src",
"target": "dist"
}
}
So the code is "correctly" exported to bit.dev, but when I import it from my admin with
"@bit/adrienbelair.betterise-web.modules": "^0.3.0",
I get the following error after running yarn:
yarn install
ls: Command failed.
Exit code: 1
Command: node .bit.postinstall.js
...
Error: ENOTDIR: not a directory, mkdir 'node_modules/utils/HOA'
Yes, if I look into node_modules, utils is a file and not a directory.
All of this is auto-generated; I don't understand what I am doing wrong.
Second thing, probably stemming from the error above: when I try to import a component (even though there is an error, the vendor files are downloaded and in place), I get:
import { Advice } from '@bit/adrienbelair.betterise-web.modules/dist/modules';
./node_modules/@bit/adrienbelair.betterise-web.api/controlleur.js
Module not found: Can't resolve 'api' in '/Users/prinzivalle/Web/betterise/admin-front/node_modules/@bit/adrienbelair.betterise-web.api'
From this line (if I look into node_modules, where the error is thrown):
import { User, Cardline } from 'api';
I know it's a very specific case, mine, but I can't find any forum post or explicit tutorial, only small component exports without many dependencies.
I wrote my code with little knowledge of Hyperapp/React and without thinking about sharing it one day.
Thanks for reading.

Webpack fails with Node FFI and Typescript - dynamic require error

In a simple Typescript program I require Node FFI with
import * as Electron from 'electron';
import * as ffi from 'ffi';
and then
mylib = ffi.Library('libmoi', {
  'worker': [ 'string', [ 'string' ] ],
  'test': [ 'string', [] ]
});
Linking that up via webpack yields
WARNING in ./~/bindings/bindings.js
Critical dependencies:
76:22-40 the request of a dependency is an expression
76:43-53 the request of a dependency is an expression
# ./~/bindings/bindings.js 76:22-40 76:43-53
The problem seems to be that FFI has a dynamic require and the fix seems to be to apply webpack.ContextReplacementPlugin in the webpack.config.js file.
This is a bit out of my reach, but an example for an Angular case is:
plugins: [
  new webpack.ContextReplacementPlugin(
    // The (\\|\/) piece accounts for path separators in *nix and Windows
    /angular(\\|\/)core(\\|\/)(esm(\\|\/)src|src)(\\|\/)linker/,
    root('./src') // location of your src
  )
]
Any idea how to do this for FFI?
Here is the answer: github issue comment on the Johnny-Five repo
Quoting from brodo's answer, this is what you do to stop webpack getting snarled up with "bindings" and similar:
... the webpack config looks like this:
module.exports = {
  plugins: [
    new webpack.ContextReplacementPlugin(/bindings$/, /^$/)
  ],
  externals: ["bindings"]
}
I also had a similar issue and somehow managed to resolve it. I will first explain my understanding.
The main job of webpack is to bundle separate code files into one file; by default it bundles all the code that is referenced in its tree.
Generally there are two types of node_modules:
Those to be used on the browser side (angular, rxjs, etc.)
Those to be used on the Node.js side (express, ffi, etc.)
It is safe to bundle browser-side modules, but not Node-side modules, because they are not designed for that. So the solution is the following two steps:
Give an appropriate target (node, electron, etc.) in the webpack.config.js file, e.g. "target": 'electron-renderer'; by default it is the browser.
Declare Node-side modules as external dependencies in your webpack.config.js file, e.g.
"externals": {
"bindings": "require('bindings')",
"ffi": "require('ffi')"
}
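Putting those two steps together, a minimal webpack.config.js might look like the sketch below (the electron-renderer target and the exact module names are taken from this answer; adjust them to your own setup):
const webpack = require('webpack');

module.exports = {
  // Step 1: a Node-flavoured target, so built-in modules are not bundled
  target: 'electron-renderer',
  // Step 2: keep native/Node-only packages out of the bundle and
  // resolve them with require() at runtime instead
  externals: {
    "bindings": "require('bindings')",
    "ffi": "require('ffi')"
  },
  plugins: [
    // Stop webpack from trying to statically resolve the dynamic
    // require inside the bindings package
    new webpack.ContextReplacementPlugin(/bindings$/, /^$/)
  ]
};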

shimming linkurious - how to configure?

I'm trying to use the linkurious library (a sigma fork), which provides a "main": "dist/sigma.require.js" (in the package.json). This allows me to do:
var sigma = require('linkurious');
However, the plugins are not included, so I have to require them separately. The problem is that the plugins rely on the sigma variable being available in the global scope, so I've shimmed things as follows (from the package.json):
"browser": {
"sigma": "./node_modules/linkurious/dist/sigma.js",
"linkurious/plugins": "./node_modules/linkurious/dist/plugins.js"
},
"browserify-shim": {
"sigma": {"exports": "sigma"},
"linkurious/plugins": { "depends": [ "sigma" ] }
},
"browserify": {
"transform": [ "browserify-shim" ]
},
When run in a browser, this doesn't generate errors during inclusion of the plugins (I gather this means the global variable is available), but references to the plugins fail (as if they failed to attach themselves, or attached themselves to a non-global variable).
I'm using grunt-browserify to run the process, configured like this (from the Gruntfile.js):
grunt.initConfig({
  browserify: {
    libs: {
      files: { 'inc.js': ['index.js'] },
    },
  }
});
I've attached a little project to this issue with the minimal code required to demonstrate the problem, in the hope that someone else can replicate it and figure it out. Unpack it, type npm install; npm start, and point a browser at http://localhost:8002/ to see the issue.
Thanks in advance,
ekkis
sigma.zip
- edit I -
Incidentally, bendrucker at the git repo (see: https://github.com/thlorenz/browserify-shim/issues/215) suggests I need to do a global transform. It's been explained to me that shimming doesn't work on node_modules files and that for those I need a global transform. This doesn't make much sense to me, as the whole point of shimming is that you don't own the code you're shimming. In any case, bendrucker pointed me to this other SO post where the question is posed but no answers are provided.
help?
