set custom PATH for npm - node.js

Is there a way to add a directory to the PATH that npm uses? I DON'T want to add this directory to the machine PATH, just the one npm uses when running scripts.
I know that npm adds node_modules/.bin to that PATH, in addition to any pre-existing machine PATH.
To give more detail on my specific case: I have a project with nested directories, each with its own package.json. When running a script in a sub-directory that depends on a parent binary, the binary won't be found, because it's not in the local node_modules/.bin but inside a parent node_modules/.bin.
I could spell out the full path to the binary inside each script, but this is cumbersome and makes the scripts less readable.
So, is there a way to tell npm to export a PATH before running every script? It already does something like this to add the local node_modules/.bin.

I can't think of any simple way to accomplish what you are describing.
You can set/modify environment variables right before a script, like:
{
  "scripts": {
    "parent-script": "PATH=$PATH:/path/to/parent/node_modules/.bin parent-script"
  }
}
But, as you mentioned, doing this for every script is cumbersome. At that point you might as well just do what you described:
{
  "scripts": {
    "parent-script": "/path/to/parent/node_modules/.bin/parent-script"
  }
}
A complicated, but possibly more maintainable, approach would be to build yourself a search-script Node module that traverses parent directories looking for a script passed as an argument and then runs it:
{
  "dependencies": {
    "search-script": "^0.0.1"
  },
  "scripts": {
    "parent-script": "search-script parent-script"
  }
}
Unfortunately, npm does not provide a lot of flexibility for things like this.

The only thing I can see is creating a shell script called npm, placing it in a folder on your PATH ahead of the real npm (or removing the real npm from your PATH), and having that script add the parent binary directory to PATH before calling the real npm binary with the rest of the arguments. It's not really worth it and might cause other issues.
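If you did want to try it, the wrapper could be as small as this sketch (paths are assumptions; /usr/local/bin/npm stands in for wherever the real npm lives):
#!/bin/sh
# Hypothetical npm wrapper: prepend the parent project's .bin to PATH,
# then hand every argument through to the real npm binary.
PATH="/path/to/parent/node_modules/.bin:$PATH" exec /usr/local/bin/npm "$@"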
I would just add all the binaries that your nested folders' scripts depend on to their own package.json files, which is in some ways worth it, especially if you want to deploy them independently.
This is a node/npm issue in itself, as it effectively forces you to download the same package multiple times, but at least it's easy to know which version each package uses.

Related

Is there a way to specify different paths for the same dependencies in package.json?

I am working on an npm package that includes an example directory to run/test the actual package. In the example directory, I have included the parent package as a dependency using "file:..".
This works fine when developing and making frequent changes to the parent package, but if I want to use the example as a stand-alone app, I would need to point to the actual npm package.
Is there a way to have "2 configs" in the same package.json:
one that points to "file:.." for local development
one that points to the npm package, to use as a stand-alone app
This would avoid duplicating the example directory.
You could do this with lerna, which is a mono-repository CLI tool.
First of all, you would have to define multiple projects within the same repository. Lerna calls these projects "packages" and stores all of them within a /packages folder.
package.json
/packages
  /my1stPackage
    package.json
  /my2ndPackage
    package.json
Lerna has all kinds of optimizations, which I won't dive into too deeply here. But there are a couple of basics:
To initially install all dependencies of all packages, run lerna bootstrap --hoist.
You can still run npm run ... as before, but those commands refer to your root package.json file. To run an npm script for a specific sub-package, run lerna run <script> --scope=<packageName> (e.g. lerna run build --scope=my1stPackage).
You can add shortcuts for this in the root /package.json script section.
"scripts": {
"setup": "lerna bootstrap --hoist",
"build:my1stPackage": "lerna run build --scope=my1stPackage"
}
What will interest you most is that sibling packages can simply reference each other in their specific package.json files to include each other as dependencies.
So, let's assume that my1stPackage uses my2ndPackage. Inside the package.json file of my1stPackage there would be something like
"dependencies": {
...
"my2ndPackage": "^0.0.1"
}
my2ndPackage could actually be a package which is published on npm. However (!), while developing locally, lerna will add a symbolic link at /packages/my1stPackage/node_modules/my2ndPackage which points to the folder /packages/my2ndPackage. (And that really does work on all relevant operating systems.)
Your package.json looks the same for local development as it does for people who download your package through npm; it's your lerna setup that fixes this up with the symbolic link.
I found two potential ways to do this:
npm link : https://docs.npmjs.com/cli/v7/commands/npm-link/
npm workspaces : https://docs.npmjs.com/cli/v7/using-npm/workspaces
But in my specific case, there are dependencies that can conflict between the parent and the child (example) packages.
I couldn't find a robust way to make it work, so I decided the simpler approach would be to create a separate repository containing a stand-alone version of the example directory, plus a script that keeps it up to date with the "master example" in the original repository. This way development stays fast, and the "example copy" is easy to keep up to date without duplicating code.
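For what it's worth, a rough sketch of such a sync script (paths are hypothetical; fs.cpSync requires Node 16.7+):
#!/usr/bin/env node
// Hypothetical sync script: copy the master example from the original
// repository into this stand-alone repo, skipping node_modules.
const fs = require("fs");

fs.cpSync("../original-repo/example", "./", {
  recursive: true,
  filter: (src) => !src.includes("node_modules"),
});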

Pointing the main field in package.json conditionally

I have a Monorepo under Lerna and Yarn Workspaces. The repo has packages which are published to npm and consumed outside the monorepo as well as within the monorepo. While developing in the monorepo we would like the main field of package.json for all such packages to point to the src directory, while when a package is used outside the monorepo, we would like the consumer to use the transpiled code in the dist folder.
I want this to be consistent across all uses of the packages. My current solution is to have the main field point to the dist folder. Then for each of the tools within the monorepo, namely jest, tsc, webpack, and parcel, I've had to come up with a different tool-specific solution for aliasing the src directory instead of the dist directory. But I don't like the fact that I've had to do this work for each of these tools; it just doesn't seem scalable.
Has anybody come up with a lower level solution, where a module resolves to a different folder based on the environment?
Thank you.
If your internal code base is always transpiling the sources, why not just import { thing } from "my-package/src/main.js"?
Then you can just leave the main field as dist for the consumers, who ideally shouldn't have to keep track of additional paths when importing your packages.
There are a lot of details left out of your question, but I'll assume you're using a single webpack/other instance to compile all your packages.
Another approach: since you've already coupled all your packages via the same compilation step, why not just use relative paths between the packages? That way you never have to act as a consumer of your own packages, just one with slightly different needs.
And finally a third approach, which I think sounds a bit convoluted but should do exactly what you're asking for: create a script that uses globby or some other npm package to grab all package.json files in your repository (excluding node_modules!). require()/iterate through these package.json manifests and set the main field to an input value (say "dist"). Then create two bin js files (hint: bin field) called set-main-dist and set-main-src, and possibly a third called unset-main.
Next, no matter what scripts you run in your package.json files at the root (or using lerna run), make sure each script looks either like this:
"prebuild": "set-main-src"
or like this
"build": "set-main-src && build etc"
Hope one of these options works out for you. Remember that it's rarely worth going against the grain of the usual patterns in tooling. Make sure this one is worth it.
I had exactly the same dilemma, but with Yarn 3.
The solution of always importing from source didn't work for my case, as the package itself might get published to npm too.
So after digging around I luckily found the package.json property publishConfig.main: https://yarnpkg.com/configuration/manifest#publishConfig
With it I can change the main field from source to dist on npm publish.
Basically, a modified package.json is used only for publishing.
Implemented in my package.json it looks like this:
{
  "main": "./src/index.ts",
  "publishConfig": {
    "main": "./dist/index.js"
  }
}
and when I run yarn npm publish or yarn pack, the main field is temporarily replaced in the generated archive, while all my tooling (jest and ts) can still rely on the main field pointing to the source.

Make node script packaged with zeit-pkg aware of full filesystem

I have a node script that takes command-line parameters, parsed using the commander module.
I want to pack it with pkg, but I am running into some trouble.
Normally I would execute my script with:
node index.js --file ./test.csv
but the file argument could point to any folder in the user's filesystem.
I have looked into configuring the assets and scripts attributes for pkg in package.json, but it looks like you need to specify a folder in there, such as:
"pkg": {
"scripts": "build/**/*.js",
"assets": "views/**/*"
}
How can I make a zeit-pkg packaged node script aware of any possible location in the filesystem?
I am simply building with pkg package.json, since in package.json I have the entry:
"bin" : "index.js"
In your pkg-packed source, add this at the beginning:
console.log("process.cwd() = " + process.cwd());
When you run your packaged exe, this will tell you what your executable sees as its working directory. You can then interpret any relative arg-paths of your application (like "./index.csv") relative to that.
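In code, that interpretation could look like this minimal sketch (the commander wiring for --file is omitted; process.argv[2] stands in for the parsed option):
// Resolve a possibly-relative --file argument against the directory the
// user launched the executable from, not the snapshot inside the binary.
const path = require("path");

const fileArg = process.argv[2] || "./test.csv"; // stand-in for commander's parsed value
const absolutePath = path.resolve(process.cwd(), fileArg);
console.log("Reading from " + absolutePath);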
It seems, based on my experiments, that pkg applications have full access to the file-system as long as your program knows the absolute paths of the files you want to read or write, or even "require". The only tricky thing seems to be relative paths.
The reason is that pkg wants you to be able to package resource/asset files into the executable and then access them like you would any file at runtime. That is actually a great feature, but often more than you need.
If you don't need to package any (extra) files into your executable, then file-system access should be simple and work just normally with absolute paths. You just need to know "where you are" if you want to use relative paths.
I'm not affiliated with the pkg project, so my answer is not authoritative in any way. I hope zeit will put more documentation about file-system access on their site, especially cross-platform. After doing some experimentation myself, it just seems that accessing files by their absolute paths works, at least on Windows.

Finding node module from different directory?

Always feel stupid asking here because people are always confused by my questions, or I have a dumb problem, but: I'm working on a program in node.js, and the text editor I'm using (NP++) doesn't seem to like saving files in the system32 directory, which is where my modules are and where my script lives as well (so I have .../.../node_modules/(modules) and .../.../node_modules/script.js). This becomes a pain when I want to edit the script: I have to clone the script to my desktop, edit it, then overwrite the one in the node_modules directory. I tried saving the script to my desktop and running it, but it just gives me a module-not-found error. (In my script I load the modules with var example = require('example.js').) Is there any way I can get it to pull the modules from the node_modules directory while keeping the script file somewhere easily accessible and editable (i.e. the desktop)? (Sorry if this is confusing, I'm not the best at these kinds of things.)
I'm not 100% sure that this is what's happening because I haven't used npm on Windows, but it sounds to me like you're installing your dependencies globally using npm -g. The more proper way to use Node is to install your dependencies locally, using npm without the -g flag. That way your dependencies get installed in your current working directory.
For example, let's say you've saved your project in a directory on your Desktop, and your script uses require("lodash"). If you cd to your directory and run npm install lodash, then the lodash module will be available to your script.
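A minimal sketch of that workflow (the project path is hypothetical):
cd ~/Desktop/my-project   # or the equivalent folder on Windows
npm install lodash        # installs into ./node_modules
node script.js            # require("lodash") now resolves locally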

NPM - Conditional additions to global path

In a Node package.json file, you can map multiple executables onto the PATH environment variable on a global npm install (npm install -g):
"bin": {
"foo": "./bin/foo.js",
"bar": "./bin/bar.js"
},
I have a unique project that requires providing commands that already exist on some operating systems to operating systems that lack them. For example, I want to add a command named grep to PATH, if and only if it is being installed on a Windows computer. If the computer is running any other OS, the npm installation will obviously fail.
Is there any way to run logic that pre-determines what bin options are available in the installation?
Oh snap - I just had an idea!
Would this work:
Parent module has npm (the programmatic version) as a dependency.
On global installation, run a post-install script, as declared in the package.json of the parent module.
The post-install script does a check on the system to see which commands exist. This would be more mature than "Windows or not Windows": it would try to exec a list of commands and see which ones fail.
For every command that doesn't exist, the post-install script programmatically runs npm install -g on the corresponding sub-module (one for each command, such as grep).
This would take a while and the npm module is huge, but it seems like it would work. No?
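For illustration, a rough sketch of that post-install check, shelling out to the npm CLI rather than the programmatic API (the command list and shim package names are hypothetical):
#!/usr/bin/env node
// Hypothetical postinstall script: probe for each command, then globally
// install a shim package for any command that is missing.
const { execSync } = require("child_process");

const commands = ["grep"]; // commands the package wants to guarantee

for (const cmd of commands) {
  try {
    // Probing with --version is a cheap existence check.
    execSync(cmd + " --version", { stdio: "ignore" });
  } catch (err) {
    execSync("npm install -g my-win-" + cmd, { stdio: "inherit" });
  }
}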
There doesn't seem to be a way to do this directly through package.json, but it might be possible (and desirable) to do something like:
Make a separate npm module for each executable you want to register (eg my-win-grep).
In the my-win-grep module, implement the executable code you want to run, and register the PATH/executable value in this module.
In the package.json for my-win-grep, include an os field that limits it to installing on Windows.
In your original module, list my-win-grep as an optionalDependency.
In this way, if someone installs your original module on Windows, it would install my-win-grep, which would register an executable to run under the grep command.
For users on other systems, the my-win-grep module would not install because of the os requirement, but the main module would continue to install because it ignores failures under optionalDependencies. So the original grep executable would remain untouched.
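A minimal sketch of the two manifests involved (names and versions are hypothetical):
my-win-grep/package.json:
{
  "name": "my-win-grep",
  "version": "1.0.0",
  "os": ["win32"],
  "bin": {
    "grep": "./bin/grep.js"
  }
}
your original module's package.json:
{
  "optionalDependencies": {
    "my-win-grep": "^1.0.0"
  }
}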
Updated question
This approach does sound like it should work; as you say, the npm dependency is pretty large, but it does avoid having to perform additional symlinking, and still has the benefit outlined above of having each piece of OS-specific functionality in a separate module.
The only thing to watch for, possibly, in the programmatic npm docs:
You cannot set configs individually for any single npm function at this time. Since npm is a singleton, any call to npm.config.set will change the value for all npm commands in that process.
So this might just mean that you can't specify -g on your installs, but instead would have to set it globally before the first install. This shouldn't be a problem, but you'll probably need to test it out to know for sure.
Lastly...
You might also want to have a look at https://github.com/lastboy/package-script - even if you don't use it, it might give you some inspiration for your implementation.
