I have a monorepo under Lerna and Yarn Workspaces. The repo has packages which are published to npm and consumed outside the monorepo as well as within it. While developing in the monorepo we would like the main field of package.json for all such packages to point to the src directory, while when a package is used outside the monorepo, we would like the consumer to use the transpiled code in the dist folder.
I want this to be consistent across all uses of the packages. My current solution is to have the main field point to the dist folder. Then, for each of the tools within the monorepo (namely jest, tsc, webpack and parcel), I've had to come up with a different tool-specific solution for aliasing the src directory instead of the dist directory. But I don't like that I've had to do this work for each of these tools; it just doesn't seem scalable.
Has anybody come up with a lower level solution, where a module resolves to a different folder based on the environment?
Thank you.
If your internal code base is always transpiling the sources, why not just import { thing } from "my-package/src/main.js"?
Then you can just leave the main field as dist for the consumers who ideally shouldn't have to keep track of additional paths when importing your packages.
There are a lot of details left out in your question, but I'm assuming you're using a single webpack (or similar) instance to compile all your packages.
Another approach: since you've already coupled all your packages via the same compilation step, why not just use relative paths between the packages? That way you never have to act as an external consumer with slightly different needs.
And finally the third approach, which I think sounds a bit convoluted but should do exactly what you're asking for. Create a script that uses globby or some other npm package to grab all package.json files in your repository (excluding node_modules!). require() / iterate through these package.json manifest files and set the main field to an input value (say "dist"). Then, create two bin js files (hint: bin field) called set-main-dist and set-main-src, and possibly a third called unset-main.
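A rough sketch of what such a script could look like (the file name set-main.js, the index entry paths and the packages/ glob are my assumptions to adapt to your layout; it also assumes an older CommonJS release of globby):

// set-main.js -- sketch: rewrite the "main" field of every package manifest in the repo
const fs = require('fs');
const globby = require('globby');

// "src" or "dist", e.g. invoked as `node set-main.js dist`
const target = process.argv[2] === 'dist' ? 'dist/index.js' : 'src/index.js';

globby(['packages/*/package.json', '!**/node_modules/**']).then((manifests) => {
  for (const file of manifests) {
    const pkg = JSON.parse(fs.readFileSync(file, 'utf8'));
    pkg.main = target;                                   // point "main" at src or dist
    fs.writeFileSync(file, JSON.stringify(pkg, null, 2) + '\n');
  }
});

The set-main-src and set-main-dist bin entries would then just be thin wrappers around a script like this.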
Next, no matter what scripts you run in your package.json files at the root (or using lerna run), make sure the script looks either like this:
"prebuild": "set-main-src"
or like this
"build": "set-main-src && build etc"
Hope one of these options works out for you. Remember that it's rarely worth going against the grain of the usual patterns in tooling and whatnot. Make sure this one is worth it.
I had exactly the same dilemma, but with Yarn 3.
The solution of always importing from source didn't work in my case, as the package itself might get published to npm too.
So after digging around I luckily found the package.json property publishConfig.main: https://yarnpkg.com/configuration/manifest#publishConfig
With it I can change the main field from source to dist on npm publish.
Basically only for publishing we use a modified package.json.
Implemented in my package.json it looks like this:
{
  "main": "./src/index.ts",
  "publishConfig": {
    "main": "./dist/index.js"
  }
}
and when I run yarn npm publish or yarn pack, the main field is temporarily replaced for the generated archive, while all my tooling (jest and TypeScript) can still rely on the main field pointing to the source.
Related
I am working on an npm package that includes an example directory to run/test the actual package. In the example directory, I have included the parent package itself as a dependency using "file:..".
This works fine when developing and making frequent changes to the parent package, but if I want to use the example as a stand-alone app, I would need to point to the actual npm package.
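For context, the example's package.json currently has something along these lines (my-package is just a placeholder for the real parent package name):

{
  "name": "example",
  "dependencies": {
    "my-package": "file:.."
  }
}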
Is there a way to have "2 configs" in the same package.json:
one that points to "file:.." for local development
one that points to the published npm package, for use as a stand-alone app
This would avoid duplicating the example directory.
You could do this with lerna, which is a mono-repository CLI tool.
First of all, you would have to define multiple projects within the same repository. Lerna calls these projects "packages" and stores all of them within a /packages folder.
package.json
/packages
  /my1stPackage
    package.json
  /my2ndPackage
    package.json
Lerna has all kinds of optimizations, which I won't dive into too deeply here. But there are a couple of basics:
To initially install all dependencies of all packages, run lerna bootstrap --hoist.
You can still run npm run ... as before, but those scripts refer to your root package.json file. To run npm scripts for a specific sub-package you should run lerna run <script> --scope=<packageName> (e.g. lerna run build --scope=my1stPackage).
You can add shortcuts for this in the root /package.json script section.
"scripts": {
"setup": "lerna bootstrap --hoist",
"build:my1stPackage": "lerna run build --scope=my1stPackage"
}
What will interest you most, is that sibling packages can just reference each other from their specific package.json to include each other as dependencies.
So, let's assume that my1stPackage uses my2ndPackage. Inside the package.json file of my1stPackage there would be something like
"dependencies": {
...
"my2ndPackage": "^0.0.1"
}
The my2ndPackage could actually be a package which is published on npm. However (!) while developing locally, lerna will add a symbolic link at /packages/my1stPackage/node_modules/my2ndPackage, which points to the folder /packages/my2ndPackage. (And that really does work on all relevant operating systems.)
Your package.json looks the same for local development as it does for people who download your package through npm; it's your lerna setup that makes this work via that symbolic link.
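If you want to convince yourself the link really is there, a quick Node check (just a sketch; adjust the paths to your own package names):

// run from the repo root; prints true while developing locally
const fs = require('fs');
const link = 'packages/my1stPackage/node_modules/my2ndPackage';
console.log(fs.lstatSync(link).isSymbolicLink());
console.log(fs.readlinkSync(link)); // points at the sibling package folder (exact form may vary)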
I found two potential ways to do this:
npm link : https://docs.npmjs.com/cli/v7/commands/npm-link/
npm workspaces : https://docs.npmjs.com/cli/v7/using-npm/workspaces
But in my specific case, there are packages that can conflict between the parent and child (example) packages.
I couldn't find a robust way to make it work, so I decided that the simpler approach would be to create a separate repository containing a stand-alone version of the example directory, plus a script that keeps it up to date with the "master example" in the original repository. This way development stays fast and the "example copy" is easy to keep up to date without duplicating code.
I'm playing around with Yarn 2, and I want to do something like this.
I have a monorepo of the structure:
/
packages/
shared-ts/
package.json
src/
lib*/
app-ts/
package.json
src/
lib*/
app-js/
package.json
src/
lib*/
where lib* denotes that the folder is gitignored, but is where the compiled code will live.
In this example, I have a dependency library shared-ts that is used by two apps, app-ts and app-js.
The conventional approach
The conventional approach to configuring a monorepo like this is that in shared-ts I would have a package.json like:
"main": "lib/index.js",
"scripts": {
  "build": "tsc"
}
Where the build script will build index.js and index.d.ts into the lib folder.
When both app-ts and app-js then resolve the package, they look in the lib folder and find the index.js and in app-ts's case - the index.d.ts.
This works fine, except that the developers need to remember to run the build script if they have made changes to shared-ts in order for the changes to propagate across.
Where this could potentially become problematic is where there are many layers of dependencies.
Attempted workaround 1 - point main to src/index.ts
I can change shared-ts's package.json to
"main": "src/index.ts",
"scripts": {
  "build": "tsc"
}
This generally won't work: a plain Node process won't be able to parse the syntax in the .ts file (e.g. the import keyword or type annotations).
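For example, even a tiny made-up entry file like this already contains syntax a plain node invocation will reject:

// src/index.ts -- made-up sample; the type annotations (and ESM export in a CommonJS context) are what plain node chokes on
export const greet = (name: string): string => `Hello, ${name}!`;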
Potential workaround - publishConfig
So something I'm considering, but haven't tried yet, is using the publishConfig field in the package.json:
This field contains various settings that are only taken into consideration when a package is generated from your local sources (either through yarn pack or one of the publish commands like yarn npm publish).
"main": "src/index.ts",
"publishConfig": {
"main": "lib/index.js"
}
The idea being that:
When you publish a package to npm, lib/index.js will be used as main. 👍 code is ready for consumption, no compilation required.
If being used directly in the monorepo src/index.ts will be used as main. 😕 This kind of works as if you were running app-ts with ts-node for example.
However, where this starts breaking down is:
Running app-js in a development environment (where you don't have any additional syntax parsing set up).
Practical current best solution
My current best solution is to just give up on the 'no compile' aspiration - if a developer makes changes to some code, they need to re-run build for the changes to propagate across.
How about using this?
import someValue from 'some-package/src/index';
I can do this in my monorepo.
I believe using Nx would be a good choice here. While it won't help you run the uncompiled code, it has pretty good features. In particular, you can automatically run affected:apps on certain changes. For example, if you have a start command, it will run the start command for all the affected apps.
I wanted the same thing but had to compromise on the "Automatic compilation on changes" option in my JetBrains IDE.
It allows me to debug with ts-node as well as run the code using the native node binary.
I'm developing various Angular 2 projects and I want to share node_modules folder between multiple projects. I would like to create a structure like this:
MainFolder
- Project1
- Project2
- package.json
so I would have just 1 package.json for all the projects. My question is: is it possible to do this?
If it is possible, do I have to run npm install with -g?
I can't understand how -g works.
Can someone give me instructions how to proceed?
Thanks very much.
I forgot to say that I build the projects with angular-cli.
The way I get around this for small/learning/test projects is with what I call "git projects". Basically I manage the various projects via git, and just "load" the project I want to work on. Of course this doesn't work if you want to have access to multiple projects at the same time.
I like to use a git client for this purpose because it's easier to visualize my existing "projects".
So my workflow is this...
Create my main/base folder. This will contain the git repo, the single node_modules folder, and whatever else should be common to all projects.
I create the basic package.json file (using npm init). No description, no nothing, just the basic skeleton package.json file. (However, if you know you will use certain packages in ALL of your projects, you can npm install them first, so they will be added to package.json as your "base" modules.)
Now I check the bare package.json into the repo (and anything else that you may want to have in all of your projects, but usually it's just the package.json file). This will be the bare-bones starting branch for all projects.
Once this is checked in, I create a branch off of this in the git repo. This will be "Project 1" - or whatever you want to call it. Then build up your project however you want, installing modules, checking in changes, etc, etc.
When I want to start a new project, I simply check out the first bare-bones project (which is just the empty, or almost empty, package.json file) and do another branch off of it. This will be my 2nd project.
And so forth...
So the main thing is that every new "project" will be a new branch in the git repo, and to create a new project, just switch back to the original bare-bones one and do a new branch off of that.
Of course it is possible to create branches within a project, too. It's all about naming conventions. You could, for example, prefix a new project branch with "P_" or "PROJECT_", etc, so you can quickly tell in your git client which branches are projects. And of course use a different naming scheme if you just need a new branch within an existing project. That's basically how I go about it.
You may not like this workflow, but this way I don't need to install packages globally. When I do a backup, I can simply delete the single (possibly huge) node_modules folder. All project related modules can be reinstalled by simply checking out a branch for a particular project and run "npm install" on its package.json. Hope it makes sense.
Here is documentation on the various npm install arguments
In global mode (ie, with -g or --global appended to the command), it
installs the current package context (ie, the current working
directory) as a global package.
The -g install locations based on environment can be found here
One way you can achieve what you want is to have one solution for both projects and have each project route use its own lazy-loaded module.
Unless you have a specific business need to share resources, it's better to keep each project separate with its own resources and configuration.
-g stands for global installation, i.e. the packages you install will be available for all applications.
And why do you want to share the node_modules folder and package.json file?
Keep them separate for each separate project. And if you need to share your project, you may share your package.json instead of sharing the node_modules folder.
Also, to point out: if you manually install packages by listing their names, you can use the -g (global) flag, but if you only run npm install, your packages won't be installed as global packages.
If it really is just for testing simple applications, could renaming the app folder in some way provide a solution? It assumes that all the dependencies are the same, or at least a subset of the dependencies provided.
I've converted to npm for my build system: no gulp etc. Also no webpack, rollup etc.; it's an ES6 system based on modules and no bundling. Sure is simple!
Obviously I don't want to drag around a node_modules hierarchy for my run-time, front-end modules. And I don't want to import foo from './node_modules/god/awful/path.js'. So I'd like to have a top-level directory for the run-time, front-end dependencies.
So how do I copy my "dependencies", not "devDependencies", to a top level directory for deployment?
I've got a run script that can do it but it's pretty messy and the location of the package under node_modules is not always obvious. Maybe there's a package for doing this automatically? Or a nifty npm trick of some sort?
OK, I'm starting to need this even more than before so thought I'd nag and be clearer about what I seem to be forced to do.
First of all, I use no workflow task managers, just npm run scripts in package.json.
My dependencies (npm --save .. not --save-dev) are:
"dependencies": {
"lzma": "^2.3.0",
"pako": "^1.0.0",
"three": "*"
},
.. and my npm script for hoisting the dependencies into a top-level libs/ dir is simply a huge cp:
"build-deps": "cp
node_modules/lzma/src/lzma.js
node_modules/lzma/src/lzma_worker.js
node_modules/pako/dist/pako.min.js
node_modules/three/build/three.js
node_modules/three/build/three.min.js
node_modules/three/examples/js/controls/OrbitControls.js
node_modules/three/examples/js/controls/FlyControls.js
node_modules/three/examples/js/controls/FirstPersonControls.js
node_modules/three/examples/js/libs/stats.min.js
node_modules/three/examples/js/libs/dat.gui.min.js
libs/",
This is pretty primitive: I have to find the dependencies in node_modules (not always obvious) and add them to the list by hand. Sure makes me want fewer dependencies! :)
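For reference, this is roughly the kind of script I'd hope to replace it with (just a sketch: it uses require.resolve to grab each dependency's main entry, so it wouldn't pick up the extra three.js example files):

// copy-deps.js -- sketch: copy the main entry of every runtime dependency into libs/
const fs = require('fs');
const path = require('path');
const pkg = require('./package.json');

fs.mkdirSync('libs', { recursive: true });

for (const dep of Object.keys(pkg.dependencies || {})) {
  const entry = require.resolve(dep);  // resolved path of the package's "main" file
  fs.copyFileSync(entry, path.join('libs', path.basename(entry)));
}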
I know that bower is designed for "front end" dependencies (--save in npm speak). But it seems like npm would be perfect for dependencies, and yet it appears I need to be this primitive.
What am I missing here? What do you do?
This is something I wrestled with in my earliest forays into development with Node packages. I had the exact same idea you had: "I can use npm to pull in whatever I need for my project, great!" and then trying to manage how I reference each of my dependent libraries. I didn't properly grasp at the time that npm wasn't just there to help me fetch my dependencies, it's also there to help me manage them.
Here's the long & short of it: you do NOT want to be copying anything out of your node_modules folder to some other, more human-convenient location. There's a lot of reasons why, but the biggest is that you don't need to copy anything out of node_modules -- everything your project needs is right there.
When you're developing in ECMAScript 2015+ you should only ever have to do the following (apologies for the overly-simplistic code):
/* N.B. These all reside under node_modules, yet I don't
* have to spell out their paths under node_modules: */
import $ from 'jquery';
import _ from 'lodash';
import moment from 'moment';
import NiftyLibrary from 'niftywhatever';
// ... code ...
let $name = $('#name');
let now = moment();
// ... other code ...
In other words, your development environment setup should just handle the module resolution for you. You should never have to specify that the jQuery library you want to use is at "node_modules/jquery/dist/jquery.min.js". If you're doing this you should take a minute to figure out why you're doing this -- it's unnecessary, it's a time- and brain-suck and you'd rather be writing your application code than managing your dependencies ... NOT managing the node_modules tree.
You mentioned you're developing with ES6 modules, but not using webpack, gulp, Grunt, rollup or any other build or bundling tool. Is your project intended to run entirely in Node? I ask because the last I heard most browsers aren't quite ready to run ES6 modules natively. So how are your modules getting transpiled to ES5? Maybe you're approaching this in some novel way I haven't heard of yet, but in my experience a build or bundling tool is necessary. (Also, it's a heck of a lot of fun.)
I've used Grunt with RequireJS in the past, but am now using webpack 3 with Babel and a few additional loaders (depending on the type of project I'm working on). I use npm scripts to handle my top-level tasks (running a development server, building a finished distribution package, running tests, etc.), but I let webpack handle all the business of transpiling ES6 into ES5, translating Sass styles, precompiling Vue components, etc. It's a bit of work to wrap your mind around the webpack approach, but it's well worth the effort.
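For what it's worth, the core of such a setup is small. Here's a minimal sketch (the entry and output names are placeholders, and it assumes babel-loader plus a Babel config are installed separately):

// webpack.config.js -- minimal sketch: transpile ES6+ with Babel and bundle into dist/
const path = require('path');

module.exports = {
  entry: './src/index.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'bundle.js',
  },
  module: {
    rules: [
      { test: /\.js$/, exclude: /node_modules/, use: 'babel-loader' },
    ],
  },
};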
Maybe webpack doesn't fit your style -- fair enough. But there are a number of other tools you could use instead. They all require a little time to get acclimated to their approaches, but they should all be able to take care of module resolution for your dependencies. Once you get a build environment set up correctly it should cease being a visible part of your development workflow; you'll just reference your dependent libraries by their names, map them to a module-local variable and use them.
Is this helpful?
EDIT: This is webpack-specific, but there should be similar options available with other bundlers or build tools.
In webpack you can use the copy-webpack-plugin to copy npm-sourced dependencies to a separate folder. This can be useful within a service worker, for example, where the execution context is a little different.
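A minimal sketch of what that looks like (the source paths are just examples, and it assumes copy-webpack-plugin v6+, which takes a patterns array):

// webpack.config.js (excerpt) -- copy selected files out of node_modules into a libs/ output folder
const CopyPlugin = require('copy-webpack-plugin');

module.exports = {
  // ...entry, output, loaders...
  plugins: [
    new CopyPlugin({
      patterns: [
        { from: 'node_modules/three/build/three.min.js', to: 'libs/' },
        { from: 'node_modules/pako/dist/pako.min.js', to: 'libs/' },
      ],
    }),
  ],
};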
What I would suggest is scoping your dependencies (@frontend/...) and symlinking the scoped folder to your top-level dir in a postinstall script (similar to how bower handled the transition to yarn: see https://github.com/sheerun/bower-away)
Example:
...
"dependencies": {
  "@frontend/jquery": "npm:jquery@~3.4.1"
},
"engines": {
  "npm": ">=6.9.0",
  "node": ">=12.10.0"
},
"scripts": {
  "postinstall": "node -e \"try { require('fs').symlinkSync(require('path').resolve('node_modules/@frontend'), 'static/bower', 'junction') } catch (e) { }\""
}
Requires NPM >= 6.9 for alias support.
FYI: aliases break audits
I recently started working with Gulp and I can't figure out whether it is really necessary to have a copy of node_modules directly in the folder of the current project.
E.g. I have this structure:
mysite
├─ builder
│   └─ node_modules
├─ work
└─ work2
How can I access node_modules in the 'builder' folder from the 'work' or 'work2' folders without copying it? It is quite large, about 100 MB, and it seems to me it makes no sense to have a copy of it for every new project.
I tried the line export NODE_PATH='D:\OpenServer\domains\mysite\build' in the package.json file and then ran the gulp command, but it replied:
[10:24:27] Local gulp not found in d:\OpenServer\domains\mysite\work
[10:24:27] Try running: npm install gulp
Short answer
Don't do it. Let NPM work the way it's designed to. However, to save space, you can delete the node_modules folder on projects that are currently dormant, and recreate it with a single shot of npm install when you switch back to them.
Justification
Even if you share your node_modules, you'll probably have redundancies in it anyway. What will you do about them next?
It is the essence of NPM to replicate modules per project. If you dig into the node_modules folder tree, you may notice that it can even contain several replications of the same library under one given dependency tree. Say you requested two modules explicitly, and both of these modules themselves pulled a dependency that takes care of a lot of things, and is therefore called lib_DADDYMUMMY:
node_modules
+ a_module_you_use v0.5
    + lib_DADDYMUMMY v0.1 (pulled as a dependency of this module)
+ another_module_that_you_requested v0.3
    + lib_DADDYMUMMY v0.1 (again! pulled as a dependency of this other module)
This comes in handy when your two modules start needing different versions of lib_DADDYMUMMY. This comes in handy when you maintain long-lived projects! And hell knows that in the JavaScript world, with fast-changing APIs, you can consider almost any decent project as long-lived. :)
One could imagine having all dependencies shared by everyone, living in a flat structure, with several versions of a library living next to each other and everyone finding what they need there. That repository could be called, say, .m2. But that's just not the way NPM works, unfortunately.
NPM considers that storage space is cheap. That's its price for helping you manage versions in dependencies, dependencies of dependencies, and dependencies of dependencies of dependencies. I consider that it's an affordable price for taking care of the dirty jobs the day when work and work2, as their lives go on, take diverging maintenance paths. I wouldn't try getting in its way by forcing a half-Maven-like folder model.
Maybe you should put your package.json into your root directory (mysite/package.json),
then install node_modules at the root.
In addition, write your gulpfile in the same directory.
e.g.
mysite
├─ package.json
├─ node_modules
├─ gulpfile.js
├─ builder
├─ work
└─ work2
However, I recommend that you write one single gulpfile for each project.
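For example, a per-project gulpfile can stay tiny (a sketch using the classic gulp.task API; the task body is just an example):

// work/gulpfile.js -- minimal sketch of a per-project gulpfile
const gulp = require('gulp');

gulp.task('default', function () {
  return gulp.src('src/**/*.js')
    .pipe(gulp.dest('dist'));
});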
One reason why you shouldn't do this is versioning. If your modules require different versions of the same package, you're going to run into problems. One package is going to win, and it might break another package.
Further, you get into the problem of having to merge the dependency lists in some way - meaning, you'll have to get the dependencies from work/package.json, work2/package.json, etc. and then install all of them at once.
Merging node_modules/ won't solve your problem, either - believe me, don't try.
Paste the node_modules folder inside your mySite directory.
All npm packages such as gulp will work in your work or work2 directory.
But with your current folder structure, the work folders can't find node_modules in their parent directory.