I have a monorepo containing many micro-services. There are some library-type functions / classes that I want to make available to any micro-service that needs them. However, if that library package declares a peer dependency, the peer dependency is not found when running code from within the service that depends on the library.
Consider this repo structure:
lib
  some-library (peerDepends on foo)
    index.js (requires foo)
    node_modules will be empty
services
  some-service (depends on foo, and some-library)
    index.js (requires some-library)
    node_modules will have:
      foo
      some-library (a symlink to ../../lib/some-library)
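For concreteness, lib/some-library/package.json would declare something like this (a sketch; the version range is invented):

{
  "name": "some-library",
  "version": "1.0.0",
  "peerDependencies": {
    "foo": "^1.0.0"
  }
}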
When running node services/some-service/index.js, you'll get the error "Cannot find module 'foo'", emanating from lib/some-library/index.js.
Presumably this happens because node is only looking at lib/some-library/node_modules and any node_modules folder that is in an ancestor directory. But since this code was run from services/some-service (as the working directory), and because of the symlink in services/some-service/node_modules, I would've expected this to work.
Here's a repo you can easily clone to see the problem: https://github.com/jthomerson/example-local-dependency-problem
git clone git@github.com:jthomerson/example-local-dependency-problem.git
cd example-local-dependency-problem
cd services/some-service
npm install
node index.js
I only see two solutions:
Don't use peerDependencies inside the library
Install each peer dependency at the root of the project for the sake of local development and testing.
Neither of those is a really great solution, because it doesn't allow each service to have different versions of its dependencies. It means that if the local version (or the library's version) of a dependency is bumped, all services that use the library have their dependency versions bumped at the same time, which makes them more brittle because they're all tied together.
How about adding the --preserve-symlinks flag?
E.g.:
node --preserve-symlinks index.js
Here's a link to the docs: https://nodejs.org/api/cli.html#--preserve-symlinks
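If you don't want to remember the flag every time, you could bake it into the service's start script (a minimal sketch, assuming a standard scripts setup):

{
  "scripts": {
    "start": "node --preserve-symlinks index.js"
  }
}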
I had a dual-workspace setup in which:
workspace1
  shared-library
  module-library (peer-depends on shared-library)
workspace2
  main-app (depends on module-library and shared-library)
Now dependencies for workspace projects are defined in the tsconfig.base.json file under compilerOptions.paths.
However, for workspace2, which is unrelated to workspace1, I install both packages via file:. When I then build main-app, I get the error that module-library is unable to find shared-library (even though it's installed in workspace2).
I had to add ./../workspace1/dist/shared-library to the compilerOptions.paths in tsconfig.base.json of workspace2 (note the reference to workspace1).
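That entry looks roughly like this (a sketch; the baseUrl is an assumption and the paths follow the layout above):

{
  "compilerOptions": {
    "baseUrl": ".",
    "paths": {
      "shared-library": ["./../workspace1/dist/shared-library"]
    }
  }
}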
This obviously couples the workspaces on my filesystem. But for development purposes this is perfect.
I have a React Native app built using TypeScript, and I would like to also develop a number of CLI tools, also in TypeScript, to help developers and 'back office' folks. I would like these to live in the same monorepo.
Based on advice from colleagues and my own research, I tried doing this by creating a subfolder in the repo with a second package.json (and all the other config files), and running npm install as if it were a completely separate project. It didn't take long for this to become a total mess: the biggest problem was duplicate imports, where one thing mysteriously imports modules from another target's node_modules, but there was also just remembering to re-run npm install in all the different subprojects before each commit, and so on. It gets even more confusing with the TS build folders lying around; they're another place for people to import the wrong thing from. The confusion caused by this approach has been a significant drain on productivity, and it feels like there has to be a better way. So that's the question:
What is the best practice for building multiple TS/Node targets that all share code with one another, from a single monorepo? (By "targets" I don't mean ES6 vs. ESNext; I mean it in the C/C++ sense of multiple output "programs": in this case I want to create both the bundle necessary for my RN app and a CLI executable.)
If it matters, I am also using Expo.
You're essentially describing a monorepo. pnpm has fantastic tooling out of the box for this.
Download the pnpm CLI and install it:
$ npm i -g pnpm # Download pnpm
$ mkdir monorepo # Create a new monorepo folder
$ cd monorepo
$ mkdir packages # This will be the folder where you add your
# apps, libraries, etc.
$ touch package.json # Create a package.json
$ echo "{}" > package.json
Create a pnpm-workspace.yaml file and add the following:
packages:
- 'packages/**'
Congratulations. You now have a monorepo where you can add multiple apps. Here's how it works:
Every folder under packages that has a package.json file is now a Node app that you can control using pnpm from the root of your workspace.
When you run pnpm i from the root, it will install all of the dependencies for all of your apps and libraries.
You can even install libraries that you create locally without needing to run npm link, or deal with adding file:../ to your package.json or any of that mess.
pnpm also supports running scripts across all of your packages at the same time.
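For example, a single command can run a script in every package that defines it:

$ pnpm -r run test   # -r (--recursive) runs the "test" script in each workspace package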
I made an example project for you to look at. In this example, you'll notice I created a library named my-library and a dependent app named awesome-app. When I ran
$ pwd
~/stackoverflow/packages/awesome-app
$ pnpm i my-library
pnpm knew to pick up my workspace library automatically:
{
  "name": "awesome-app",
  "dependencies": {
    "my-library": "workspace:0.1.0"
  }
}
Note the workspace:0.1.0 that matches the version of the package I have in my repo.
I love pnpm. It isn't without its faults, but I've been very productive with it. There are also other alternatives, such as Lerna, and npm and yarn have their own workspaces support. I just find that pnpm works the best and has the least amount of ceremony.
Enjoy.
I am developing a library that I want to release as a node package, and I am building it with webpack. I have a package.json and a package-lock.json that are committed to the repository. The webpack build produces a set of compiled and bundled artifacts and assets in a dist folder, which make up the library that I want to release.
My assumption is that when I release the compiled and bundled library to an npm repository, the developers who consume the package do not want to rebuild the library, and thus do not need to download any of its dependencies or devDependencies, since I am shipping the compiled output in the package that I release.
This means that during the npm publish step, I need a package.json with the dependencies and devDependencies fields removed. Otherwise, developers who depend on my library will pull in all these dependencies when they run npm install in their workspace, resulting in extra overhead.
Is there a best practice for generating a new package.json out of the checked-in version that removes these fields, and placing it into the dist folder before release?
I can think of many ways to do this such as:
Using the webpack build with the copy-webpack-plugin and a transform function to output a new package.json into the dist folder.
By adding a custom step to my build pipeline that generates a modified package.json into the dist folder.
By committing a separate package.json into a subfolder that is used specifically for release, and automatically copying it into the dist folder at release time.
I am wondering if there is some commonly accepted best-practice way to do this, or if the npm tooling already has built-in support for this use case?
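For what it's worth, option 2 could be a Node script as small as this (only a sketch; the file locations are assumptions):

// make-release-package.js
// Reads the checked-in package.json, strips the dependency fields,
// and writes the cleaned copy into dist/ for publishing.
// Assumes dist/ already exists from the webpack build.
const fs = require('fs');

const pkg = JSON.parse(fs.readFileSync('package.json', 'utf8'));
delete pkg.dependencies;
delete pkg.devDependencies;
fs.writeFileSync('dist/package.json', JSON.stringify(pkg, null, 2));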
Answer
I have not seen any standard way. All the ways you propose aim to generate a new, clean package.json, and there is nothing wrong with that. I guess what may seem odd to you is the generation of a new package.json; don't worry, it's quite common, but as far as I know there are no rules about how to generate the modified copy. However, I would like to share some points of view about your worries.
About devDependencies
When your package is installed from an npm repository, its dev dependencies are not installed (or shouldn't be), and this should be enough in most cases.
What about dependencies?
They should be included. If you consider that some dependency is not needed in the final dist and is only used during development, it's a development dependency by definition. Move it to the devDependencies field in package.json.
What about bundled dependencies?
Well, your dependencies should generally not be bundled or packed into the output and excluded from dependencies. You can do it, but it goes against the purpose of npm and similar package managers, and you lose the advantages of dependency modularity, caching, and version control. If you want to force the distribution of some files, because you are working with modified third-party libraries or have other reasons for absolute control over distribution, you should at least take a look at the bundledDependencies key in package.json.
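For completeness, bundledDependencies is just a list of package names whose contents are shipped inside your published tarball (a sketch; the names are invented):

{
  "name": "your-library",
  "dependencies": {
    "patched-lib": "1.2.3"
  },
  "bundledDependencies": ["patched-lib"]
}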
We're starting to adopt a monorepo setup using yarn workspaces and we'd like to have our firebase functions inside it. The repo structure is something like:
repo
  node_modules <- all dependencies
  packages
    core
    common
    functions <- firebase functions
So, I have two problems with this setup:
The dependencies of the functions don't live in the same folder as the functions' entry file.
The functions depend on other packages in the repo, such as core and common, so yarn symlinks them from node_modules to the packages in the repo.
Is there any way I can handle this?
With Yarn 2, node_modules aren't fetched and placed into the respective functions directory (as would be the case when calling npm i in the functions directory). So when calling firebase deploy --project default --only functions, the node_modules folder is missing and firebase will complain about this and abort the deployment process with the following error (or similar):
Error parsing triggers: Cannot find module [...]
Try running "npm install" in your functions directory before deploying.
There are two GitHub issues tracking this at the moment:
Support mono-repos in deployment
Functions deployment fails when firebase-functions has been hoisted by a monorepo manager like yarn/workspaces or lerna
In the two issues above, several clever workarounds are presented by firebase users, e.g. using webpack to create a build that contains all the local packages in the release, or using rsync or other tools to rewire the packages before release.
Another solution is not hoisting your project packages, if that is possible. You can do this by adding the following two directives to your .yarnrc.yml file.
# yarnrc.yml
# disables yarn's plugnplay style and uses node_modules instead
nodeLinker: node-modules
# makes sure the node_modules are not hoisted to the (monorepo) project root
nmHoistingLimits: "dependencies"
The two directives above are explained in the yarnrc configuration docs as follows:
nmHoistingLimits Defines the highest point where packages can be hoisted. One of workspaces (don't hoist packages past the workspace that depends on them), dependencies (packages aren't hoisted past the direct dependencies for each workspace), or none (the default, packages are hoisted as much as possible). This setting can be overridden per-workspace through the installConfig.hoistingLimits field.
nodeLinker Defines what linker should be used for installing Node packages (useful to enable the node-modules plugin), one of: pnp, node-modules.
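As that last sentence hints, the hoisting limit can also be set for a single workspace instead of globally, e.g. in the functions package's own package.json (a sketch):

{
  "name": "functions",
  "installConfig": {
    "hoistingLimits": "dependencies"
  }
}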
The solution I found for this is Yarn's nohoist option in your root package.json file.
By default Yarn hoists dependencies to the root directory so they can be shared between your packages. Unfortunately this will not work with Firebase. This means you need to tell Yarn not to hoist the dependencies used by your Firebase functions.
The documentation for nohoist is less than ideal, but there is an official blog post about it:
https://yarnpkg.com/blog/2018/02/15/nohoist/
You probably want something like this:
{
  "workspaces": {
    "packages": [
      "packages/*"
    ],
    "nohoist": [
      "functions/core",
      "functions/common",
      "functions/**"
    ]
  }
}
Keep in mind that this uses the name field from the package.json of each workspace package. So in this example, it is assumed that the functions directory has a package.json with "functions" as its name.
functions/** tells yarn not to hoist any of the dependencies specified in packages/functions/package.json. This doesn't work for your shared yarn packages though, so functions/core and functions/common need to be specified separately.
You also need to include your workspaces as dependencies in your functions project, so add them to your package.json:
{
  "name": "functions",
  "dependencies": {
    "core": "*",
    "common": "*"
  }
}
Once you have added all that, you should delete your packages/functions/node_modules directory and run yarn install. After doing this, you should see all your dependencies included in packages/functions/node_modules (not symlinks).
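As an optional sanity check that nothing is symlinked anymore (a sketch):

$ find packages/functions/node_modules -maxdepth 1 -type l
# no output means no symlinked packages remain at the top level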
I am not sure I understand the question exactly, but I can give you my two cents on yarn workspaces, based on what I understood from your question and my experience using them.
Yarn workspaces consolidate all your dependencies into the node_modules at the project root, as well as into a single lockfile (yarn.lock), to reduce conflicts and let yarn optimize the installation process, giving you a faster yarn install. Another advantage is that, with a single pass, yarn install can install the dependencies of all packages under the workspace.
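For reference, workspaces are enabled via the root package.json, roughly like this (a sketch):

{
  "private": true,
  "workspaces": ["packages/*"]
}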
Edit: I think for some reason yarn link is not being called and only yarn install is being run, which searches the npm registry and throws the error mentioned in the comment, since it can't find the package there. So as a workaround, try creating an entry in firebase's package.json like:
"dependencies": {
"a": "file:../dependency-package-name/",
}
I am developing a very complex app that is using internally developed, open source NPM modules.
I often need to change one of those modules (extra features, bug fixing, etc.) in order for the main application to work.
At the moment, I have:
A directory called my_modules, containing one git repository per module, for example module1 and module2.
A directory called my_apps, where for example there is app1 which has module1 as a dependency
Under my_apps/app1/node_modules I have module1 and module2, installed via NPM
On the server, I deploy by pulling the git repository, running npm install and npm dedupe, and running the server with forever.
At this stage, if I have to fix something in one of the modules, I:
Fix it within my_apps/app1/node_modules/module1 (not git)
When it's all working, COPY the files over to my_modules/module1 and do a git push and npm publish
The server will pull the latest modules after deploy thanks to npm install
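In shell terms, that round trip looks roughly like this (a sketch using the paths above; the commit message is invented):

# 1. hot-fix in place until it works
$EDITOR my_apps/app1/node_modules/module1/index.js
# 2. copy the files back to the real checkout, skipping nested node_modules
rsync -a --exclude node_modules my_apps/app1/node_modules/module1/ my_modules/module1/
# 3. publish, so the server picks it up on its next npm install
cd my_modules/module1 && git commit -am "fix bug" && git push && npm publish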
This is way, way less than ideal. It's just too error-prone. However:
Having a symbolic link my_apps/app1/node_modules/module1 => my_modules/module1 means that module1 will look for its dependencies in its own path, which often causes problems (for example, I need to make sure that EVERY module uses the same copy of module1, which is imperative).
Having a git repo under my_apps/app1/node_modules/module1 feels dangerous, in case I accidentally overwrite changes by running NPM commands on the module. Also, once the change is fixed in the local git repo, I would still need to pull the changes into my_modules/module1. Yes, a step forward from copying files over...
What's the "recommended" way of dealing with this? Any best practices?
Is it good practice to include my own modules in node_modules to make require lookups easy? If not, why not?
Explanation:
In the node.js CMS calipso (https://github.com/cliftonc/calipso), the modules are not inside node_modules, so they are required with an explicit path:
calipso = require(path.join(rootpath, 'lib/calipso'));
versus, if it were inside node_modules:
calipso = require('calipso');
node_modules is typically ignored in version control (Git, etc.), and most developers assume that this folder contains only the packages listed in package.json. A common approach to updating modules is to remove this folder completely and run npm install. Considering this, I would say that keeping your own modules in node_modules is not consistent with the node.js workflow.
Update: this assumes that "my modules" is actually just a set of files. If your modules are npm modules that can be restored by running npm install, then it is completely fine to keep them in node_modules.