How does `CI=true` affect the `npm install` command?

I'm having trouble finding good documentation on how CI=true affects the npm install command.
How is CI=true npm install different from plain npm install?
Does CI=true even affect the install?
I tried it locally and it does NOT cause npm install to behave like npm ci. I tried running it without a package-lock.json or npm-shrinkwrap.json file present, and it still created the lock file.
Also, is there a difference between
CI=true npm install
and
CI=true
npm install
and
export CI=true
npm install

Setting the CI environment variable to true affects how npm gathers usage data.
How it affects npm is explained in the Does npm send any information about me back to the registry? section of the docs (for convenience, I've provided a verbatim copy of that section below). Consider particularly the description of the Npm-In-CI header.
In summary, setting CI=true causes npm to set the Npm-In-CI header to true, so the data gathered (by npm) assumes the package(s) are being installed by a "build farm" (i.e. for Continuous Integration purposes) instead of a "human".
The following is what is stated in the docs at the aforementioned link:
Does npm send any information about me back to the registry?
Yes.
When making requests of the registry npm adds two headers with information about your environment:
Npm-Scope – If your project is scoped, this header will contain its scope. In the future npm hopes to build registry features that use this information to allow you to customize your experience for your organization.
Npm-In-CI – Set to “true” if npm believes this install is running in a continuous integration environment, “false” otherwise. This is detected by looking for the following environment variables: CI, TDDIUM, JENKINS_URL, bamboo.buildKey. If you’d like to learn more you may find the original PR interesting. This is used to gather better metrics on how npm is used by humans, versus build farms.
With regards to the specific part of your question, i.e.
Does CI=true even affect the install?
Generally "no", there is no notable difference in the resultant installation by npm with or without CI=true.
However, one scenario in which the installation of a package could be affected is if a package author defined a postinstall script in package.json that performs different conditional logic when the CI environment variable is set to true.
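For illustration, a minimal sketch of such a script (the file name and messages are hypothetical, not taken from any real package; it would be wired up as "postinstall": "bash scripts/postinstall.sh" in package.json):
#!/bin/bash
# scripts/postinstall.sh - branch on the CI environment variable
if [ "$CI" = "true" ]; then
  # CI/build-farm installs: skip anything interactive or machine-specific
  echo "CI detected: skipping local setup"
else
  echo "Local install: running full setup"
fi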
Regarding the last part of your question, i.e.
Also, is there a difference between ...
The accepted answer to this question addresses that.
In summary:
CI=true npm install sets the variable in the environment of that single npm invocation only; it does not persist in your shell.
CI=true on its own line creates a shell variable that is not exported, so the subsequent npm install will not actually see it (this differs from the first form).
export CI=true sets the environment variable for the current shell and all processes started from the current shell, so the subsequent npm install does see it.
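You can see the difference with a throwaway variable (a quick sketch; FOO stands in for CI):
# inline assignment: visible to that one command only
FOO=1 sh -c 'echo "inline: $FOO"'   # prints "inline: 1"
echo "inline after: $FOO"           # prints nothing - the shell never kept it

# bare assignment: a shell variable, not exported to child processes
FOO=2
sh -c 'echo "bare: $FOO"'           # prints "bare: " - the child doesn't inherit it

# export: inherited by every child process started afterwards
export FOO=3
sh -c 'echo "exported: $FOO"'       # prints "exported: 3"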

Related

How does npm behave differently with ignore-scripts set to true?

I just watched a talk where the speaker recommended running:
npm config set ignore-scripts true
so that post-install scripts and pre-install scripts of a package don't run. That way, you would avoid a virus in a malicious package.
My question is: After running this command, must I do anything differently to npm install packages and get them to work within a project?
If running this command comes with no additional inconvenience when using npm, then running it would have no downside. It would only help you avoid viruses.
If this was the case, why wouldn't this be the default setting?
I ask because I assume that by ignoring package scripts, npm packages would behave differently and one would have to do more things manually.
I agree with @RobC here. It also completely disabled running the custom scripts in my package.json, which obviously is a deal breaker since you can't define and run your own scripts anymore.
Although it's probably useful to think about these security concerns, I don't think running npm config set ignore-scripts true is the right option. I ran it as well and ended up turning it back off to keep running my custom package scripts.
So the advice from the video ended up being not all too sound, I guess...
If you want to be safe, use '--ignore-scripts' or the config setting, but also use can-i-ignore-scripts.
It helps you find out which scripts exist (especially when you install new dependencies), but prevents automatically executing new scripts which appear with a new version of a library you already use.
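To give an idea of the workflow (the exact invocation is my assumption here; check the project's README):
# scans your dependencies for install scripts and reports
# which ones you can safely ignore
npx can-i-ignore-scripts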
I faced a similar problem when some dependencies needed to run scripts to build platform-specific code with node-gyp.
It would be nice to have a per-project option to ignore scripts while still enabling specific ones needed for building.
So far I've decided to stay with ignore-scripts = true globally in .npmrc, plus an extra script in my project that basically does this:
#!/bin/bash
set -e
# re-run the install scripts of just these two packages by hand;
# npm explore drops into each package's directory and runs the command there
npm explore sqlite3 -- yarn run install
npm explore bcrypt -- yarn run install
P.S. yarn does not have an equivalent of npm explore.

Does NPM install run in a sandbox?

Basically, what's to prevent me from publishing an npm module with an arbitrary installation script that steals everything from your computer when you npm install my-malicious-package, if the installation is not running in a sandbox?
In this article they suggest that most attackers would place their malicious script in the pre/post install hooks. That's easy to detect and filter out. I'm mostly concerned with the actual installation of the package, where arbitrary code could be run.
There's nothing preventing those scripts from performing any action as the current user. You need to avoid running install scripts.
Most malicious packages use those, but in rare cases (like lofygang recently) the package may carry malicious code in its actual functionality too.
How to protect a project from malicious packages
make sure you don't run lifecycle (postinstall) scripts unless they're known and necessary (see my talk on this topic)
put third-party code in a compartment, lock down the environment, and decide which powerful APIs to pass to each package.
The second step requires the use of Compartment, a work-in-progress in TC39: https://github.com/tc39/proposal-compartments/
But a shim exists, and some tooling was built on top of that shim.
You could use the SES shim directly and implement your own controls, or use the convenience of LavaMoat.
LavaMoat lets you generate and tweak a per-package policy where you can decide which globals and builtins it should have access to.
LavaMoat also offers a tool to manage install scripts.
Here's my talk on SES and LavaMoat with a demo at the end.
How to set up LavaMoat
See LavaMoat docs for more details
disable/allow dependency lifecycle scripts (e.g. "postinstall") via @lavamoat/allow-scripts
npm i --ignore-scripts -D @lavamoat/allow-scripts
npx --no-install allow-scripts setup
npx --no-install allow-scripts auto
then, edit the allow-list in package.json
after every install/reinstall, run allow-scripts
run your server or build process in lavamoat-node
npm i -D lavamoat
in your package.json add something like:
"scripts": {
"lavamoat-policy": "lavamoat app.js --autopolicy",
"start": "lavamoat app.js"
run lavamoat-policy every time you make changes to your dependency tree and review the policy (see also: policy override)
run npm start to start your app
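Day to day, those last two steps boil down to something like this (a sketch, assuming the script names from the snippet above):
# after any change to the dependency tree: regenerate and review the policy
npm run lavamoat-policy
# then run the app under lavamoat-node
npm start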
Disclaimer: I contribute to LavaMoat and Endo. They are open-source projects under permissive licenses.
The only way that npm itself runs package code is in install hooks.
If you disable install hooks, no untrusted code can run until you actually load it in your application (at which point you're hosed).
I've made node-safe, which allows you to use the native macOS sandbox when using node, npm and yarn:
# Allow reading files, but only in the current folder
node --enable-sandbox --allow-read="./**" myscript.js
When using the sandboxed package managers, rogue dependencies are no longer able to compromise your system through postinstall scripts and other means.

npm update unlinks linked packages

I have a project, which consists of one root node package containing subpackages linked together by npm link - these subpackages depend on each other (listed in package.json dependencies) and the structure basically looks like this:
-rootpackage
--subpackageA
--subpackageB
Let's say subpackageA has a dependency on subpackageB, so I link them to avoid publishing/reinstalling subpackageB in subpackageA after every change to the source of subpackageB.
The link works just fine until I run npm update in subpackageA, which causes the subpackageB to be unlinked.
Now, I see two options:
I can theoretically run the npm link operation after each npm install or npm update to ensure the links are always present. This works via postinstall in the case of installation, but in the case of an update, postinstall is not called, and I don't know of any postupdate hook in npm that runs after an update. (A sketch of this postinstall workaround follows below.)
Maybe there is a cleverer way to do this, perhaps with yarn (which I am also using), that prevents the unlinking or excludes my subpackages from the update, so I don't lose the links between them, but right now I am not aware of such a way.
Is there any way to make one of those options work, or any other way to solve this problem? I need to keep this and other links so we don't have to run npm link after every installation/update. I can't really find information about this issue anywhere. Btw I am using Node 6.4.0 and npm 3.10.3.
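For reference, the postinstall workaround from the first option looks roughly like this in subpackageA's package.json (a sketch; as noted above, it fires on install but not on update):
"scripts": {
  "postinstall": "npm link ../subpackageB"
}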
So the solution is to use Yarn Workspaces or maybe a project like Lerna.
Yarn Workspaces is a utility that expects a structure similar to the one described in the question and maintains the links between the subpackages and the root automatically. It is very easy to set up (just a couple of lines in the root package.json and executing yarn for the first time), and after that you don't have to worry about update or install at all; the links stay in place unless you delete them manually.
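For a layout like the one in the question, the root package.json needs only something like this (a sketch; the folder names mirror the question):
{
  "name": "rootpackage",
  "private": true,
  "workspaces": ["subpackageA", "subpackageB"]
}
Running yarn once at the root then links the subpackages into the root node_modules, and the links survive subsequent installs and updates.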
Lerna expands on that and provides additional tooling for managing multi-package projects. It can use Yarn Workspaces internally for the linking if you use yarn, but that is not a requirement, and it works fine with npm. Just make sure you have Git, because last time I checked Lerna didn't work with SVN or other VCSs.

NPM - Conditional additions to global path

In a Node package.json file, you can map multiple executables to the PATH environment variable on a global npm install (npm install -g):
"bin": {
"foo": "./bin/foo.js",
"bar": "./bin/bar.js"
},
I have a unique project that requires adding commands to the PATH on operating systems that don't already have them. For example, I want to add a command named grep to the PATH, if and only if the package is being installed on a Windows computer. If the computer is running any other OS, the npm installation will obviously fail.
Is there any way to run logic that pre-determines what bin options are available in the installation?
Oh snap - I just had an idea!
Would this work:
Parent module has npm (programmatic version) as a dependency.
On global installation, run a post-install script as declared in the package.json of parent module.
Post-install script does a check on the system to see which commands exist. This would be more mature than "Windows or not Windows" - it would try to exec a list of commands and see which ones fail.
For every command that doesn't exist, post-install script programmatically runs npm install -g on all sub-modules (one for each command, such as grep).
This would take a while and the npm module is huge, but it seems like it would work. No?
There doesn't seem to be a way to do this directly through package.json, but it might be possible (and desirable) to do something like:
Make a separate npm module for each executable you want to register (e.g. my-win-grep).
In the my-win-grep module, implement the executable code you want to run, and register the PATH/executable value in this module.
In the package.json for my-win-grep, include an os field that limits it to installing on windows.
In your original module, list my-win-grep as an optionalDependency.
In this way, if someone installs your original module on Windows, it would install my-win-grep, which would register an executable to run under the grep command.
For users on other systems, the my-win-grep module would not install because of the os requirement, but the main module would continue to install because it ignores failures under optionalDependencies. So the original grep executable would remain untouched.
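Sketched out, the two package.json pieces look like this (my-win-grep and the version number are the hypothetical names from the answer, not a real package). In my-win-grep/package.json, restrict installation to Windows and register the executable:
{
  "name": "my-win-grep",
  "version": "1.0.0",
  "os": ["win32"],
  "bin": { "grep": "./bin/grep.js" }
}
And in the original module, reference it so that a failed install on other platforms is ignored:
"optionalDependencies": {
  "my-win-grep": "^1.0.0"
}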
Updated question
This approach does sound like it should work - as you say, the npm dependency is pretty large, but it does avoid having to perform additional symlinking, and it still has the benefit outlined above of keeping each piece of OS-specific functionality in a separate module.
The only thing to watch for, possibly, in the programmatic npm docs:
You cannot set configs individually for any single npm function at this time. Since npm is a singleton, any call to npm.config.set will change the value for all npm commands in that process.
So this might just mean that you can't specify -g on your installs, but instead would have to set it globally before the first install. This shouldn't be a problem, but you'll probably need to test it out to be sure.
Lastly...
You might also want to have a look at https://github.com/lastboy/package-script - even if you don't use it, it might give you some inspiration for your implementation.

How to best automate deployment of NPM-dependent project?

I'm used to deploying code that depends on Composer (PHP's npm cousin), which sports .json and .lock files. The first one describes the package and your version constraints, and the second one lists exactly what was installed. Whenever there's a lock file and you run composer install, you're sure to receive the same set of packages; running composer update will re-read the json file, install new versions, and update the lock file.
That's awesome for production deployment, since you don't need to checkout your dependencies to your versioning system and you're sure to have the exact same set of dependencies in production as you have in development.
My question is: how to best automate deployment of npm-dependent code? Is it possible to achieve a method similar to Composer's? I've noticed that npm install only installs what's first available in the package.json file. After the first run, i.e. if you change a version constraint, you must manually npm update that package - and that would render automated deployment useless, as there's no way to check in to versioning "update this package here to a new version"...
npm shrinkwrap is the analog of the composer.lock file. It generates an npm-shrinkwrap.json that records all dependencies with their exact versions, so you can use it to deploy to a production environment. You can also try various libs from npm to lock versions or to check for updates without changing package.json.
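The flow then looks much like Composer's (a sketch, assuming an npm version where npm install honors a committed npm-shrinkwrap.json, which it has for a long time):
# in development: resolve dependencies, then pin the entire tree
npm install
npm shrinkwrap                      # writes npm-shrinkwrap.json
git add package.json npm-shrinkwrap.json
git commit -m "Pin dependency tree"

# in production: installs exactly what the shrinkwrap recorded
npm install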
