Installing Meteor packages globally - node.js

Is there a way to install meteor packages globally?
That way, packages installed once globally would be available without an internet connection for projects created later, avoiding repeated downloads, among other benefits one may imagine.
Like in Node.js, where running npm (the Node Package Manager) with the -g flag, npm install -g, installs node packages into a global directory; programs that want to load them are served from there if available, in addition to packages from the project's own node_modules folder.
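For reference, this is the npm behavior I mean; npm root -g prints the global directory (the package name below is just an example):

# Install a package into npm's global directory (example package)
npm install -g eslint
# Show where globally installed packages live on this machine
npm root -g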

Meteor already downloads packages into a global repository that all your local apps benefit from.
So if you meteor add iron:router@1.0.7, it is downloaded and added to your project. The next time another project requires the same version, it is reused from that same spot.
Also, there is a PACKAGES_DIR environment variable which, when set, allows you to keep your own local packages centrally, so that you can share them among projects. In fact, you can keep that directory on a network drive (NFS) which your whole team can mount and consume centrally.
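As a sketch, using the variable named above (the path is illustrative; a shared network mount works the same way):

# Keep shared local packages in one central directory (illustrative path)
export PACKAGES_DIR=/mnt/shared/meteor-packages
meteor run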
Yet, there is an inherent problem. Meteor's version resolver checks for updates unless you pin down your package dependency versions, which is exactly why Meteor seems so desperate to stay connected.
Even if you pin your dependencies, the packages you depend on may not have pinned theirs (which apparently is the case for most packages), so Meteor keeps looking for updates to the whole package tree and downloads whatever it deems satisfies the version constraint resolver.
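For reference, pinning an exact version looks like this in your app's .meteor/packages file; the @= form requests exactly that version rather than "this version or any compatible one":

# .meteor/packages
iron:router@=1.0.7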
The good news is, they are constantly improving their tooling: fewer lookups, faster builds, better search, etc.
All in all, there is not much you can do unless Meteor provides some way of hosting an entire mirror of its package repository for you to consume offline. And I guess that is very unlikely to happen.
Meteor is a tool for the connected world and it does assume your connectivity. Heck, the whole journey begins with curl https://install.meteor.com/ | sh
And yes, it would be great if we could hack away on a remote beach, or the 12 hour flight to that beach.
Until then, happy coding online ;)

Related

How can I restore node modules for multiple platforms?

My Node application needs to be deployed on Windows and Linux. The main deployment package is built on a Linux CI server.
When this package is deployed to Windows, it crashes immediately due to missing native bindings, such as those for sqlite. Only the bindings for the build platform (Linux) are restored.
With a deadline approaching, we just set up a Windows build configuration which outputs a Windows-specific package that contains the appropriate bindings, and we choose the appropriate artifact to bundle in the installer.
This works but feels fragile, as we would need to keep the Node versions in sync between the two otherwise unrelated environments. I would like to be able to do this with a single build configuration.
I couldn't find any guidance on how this is done. I'm imagining a command-line option like --platform=windows to npm ci, or a modification to package.json but I couldn't find any information about this. Presumably this is a reasonably rare requirement, and perhaps there is no tooling around this, which would be a shame.
Another requirement is that the application must be installed without an internet connection. We cannot run npm ci or npm install when we install it as some of our clients do not permit their servers to access the public internet.
Based on your requirements, it sounds like building a package on each required platform would be the safest bet, with the fewest moving parts to go wrong.
As the comments have suggested, most projects rely on an npm install on the required platform, so you are stepping into fairly uncommon territory.
This works but feels fragile, as we would need to keep the Node versions in sync between the two otherwise unrelated environments. I would like to be able to do this with a single build configuration.
Node uses NODE_MODULE_VERSION (displayed on the releases page) to track ABI compatibility for native modules. This only changes with a new major Node release number.
The CI builds would need to create app packages for each major version of Node you run on each platform. Keeping the Node.js major versions in sync for the application is a good thing in any case. Running Node N and N-1 builds until that can be achieved gives good coverage and is probably the best option given the air-gap requirement.
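You can read the ABI version a given Node binary expects directly; if it matches between the build machine and the target, prebuilt native modules should load:

# Prints the NODE_MODULE_VERSION (ABI version) of this Node binary
node -p process.versions.modules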
NPM Cache
If the air-gapped clients are largely on common networks, an NPM cache/proxy (Nexus/Verdaccio) may be of use. The NPM cache will need a process to snapshot the repo after a production npm install on all required platforms, to be pushed out to your endpoints. Unfortunately, binary modules are often distributed out of band from NPM, so they won't be stored in regular NPM caches. Each client instance will then need a complete build environment to build any native modules from source, which can sometimes present its own difficulties on Windows platforms.
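Pointing clients at such a proxy is a one-line .npmrc change; the host below is illustrative, and 4873 is Verdaccio's default port:

; .npmrc (host is illustrative)
registry=http://npm-proxy.internal:4873/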
Alternatives
Node.js is not a great platform for distributing packaged applications to many diverse clients, especially if you need to distribute Node itself. Any language with an external VM requirement presents difficulties. Node's package management choices and reliance on native modules exacerbate this.
I've given up in the past and converted clients (albeit thin ones) to Go, as it lends itself to cross-platform distribution a lot better by removing the external runtime requirement and having fewer variables.

When to add a dependency? Are there cases where I should rather copy the functionality?

I recently helped out on a project, where I added a really small dependency - in fact, it only contained a regular expression (https://www.npmjs.com/package/is-unc-path).
The feedback I got from the project's developer was that he tries to minimize third-party dependencies if they can be implemented easily - whereby he - if I understand it correctly - asked me to just copy the code instead of adding another dependency.
To me, adding a new dependency looks just like putting some lines of code into an extra file in the repo. In addition, the developers get informed via an update if the code needs a change.
Is it just a religious conviction that drives a developer to do this? Are there any costs (performance- or space-wise, etc.) to adding a dependency?
I also once had a dispute with my manager concerning third-party libraries; the problem was even greater: he got to believing that you should version the node_modules folder.
The source of any conflict is usually ignorance.
His arguments were:
you should deliver the client a working product without requiring him to do any other jobs like npm install
if GitHub or npm is down at the moment you run npm install on the server, what will you do?
if a library that you install has a bug, who will be responsible?
My arguments were:
versioning node_modules is not going to work due to how package dependencies work: each library downloads its own node_modules dependencies, and your git repository will rapidly grow to hundreds of MB. Deploys become slower and slower; downloading half a GB of code every time takes time. npm has a module caching mechanism, so if nothing has changed it will not download code needlessly.
the problem with left-pad was painful, but after that npm implemented a lockfile mechanism, and now each package can be pinned to an exact version and integrity hash (see the lockfile excerpt after this list).
and GitHub and npm are not single-instance services; they run in the cloud.
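For illustration, a package-lock.json entry pins each dependency to an exact version, a resolved URL, and an integrity hash (the values here are illustrative and the hash is truncated):

"left-pad": {
  "version": "1.3.0",
  "resolved": "https://registry.npmjs.org/left-pad/-/left-pad-1.3.0.tgz",
  "integrity": "sha512-..."
}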
When installing a dependency you always have some criteria in mind, and there are community best practices; usually they boil down to: 1. Does the repo have unit tests? 2. The download count. 3. When was the latest update?
The Node.js ecosystem is built on modularity; node is not popular because of some luck, but because of how it was designed to create modules and reuse them. Sometimes working in a node.js environment feels like putting Lego pieces together and building your toy. This is the main cause of super-fast development in node.js: people just reuse stuff.
In the end he stuck to his own ideas, and I left the project :D.

Does every git branch of an NPM project have different node_modules dependencies?

I assume that when developing an NPM project, every git branch (or whatever version control system you use) probably points to a different set of node_modules on the filesystem. Is that true? How does that work? Does it pose any problems for disk space etc.?
Or perhaps, since node_modules is most commonly .gitignore'd, then the node_modules files are shared between Git branches? Again, how would/does that work?
*Note that Node.js / NPM is fundamentally different from other platforms/languages, since dependencies are typically stored locally to a project rather than in some central location on a machine.
By convention, one should not add any files, libraries or binaries which can be generated or pulled in from an external source. This includes things like node_modules; since that is made readily available* once you do npm install, there's no reason or incentive** to put it into source control. At worst, it will bloat your repository, filling your diffs with things you simply don't control and don't necessarily want to review.
I would not expect different Git branches of an NPM project to contain different node_modules folders. I'd only expect the one node_modules folder, and if a branch gave me fits about dependencies, I'd look to reinstall the dependencies (and note it down to be sure that something else hadn't gone awry).
As an addendum, any files or folders in .gitignore are simply not indexed or tracked by Git. If the contents of those files or folders change, Git is none the wiser. This also means, when switching between branches, the contents of the files or folders in .gitignore remain the same.
*: Provided that the library you're using isn't suddenly yanked. Or the repository is not impacted by a colossal DDoS.
**: There may be some incentive to do this given that the reliability of certain NPM packages hasn't been 100% this year, but that's a team and architecture-driven decision, and I doubt that placing it into source control is the most ideal and convenient way to deal with it.
There are two schools of thought, and both have merit.
1) Never check in node_modules and rebuild on deploy/install
This approach relies heavily on NPM and the connectivity of your deploy environment. node_modules are downloaded and installed (and/or compiled) each time the deploy runs.
Positives:
Your repository is much smaller.
NPM modules are installed in the environment they will run on.
Concerns:
Tied to 3rd party for sources - Go read about that whole left-pad thing. If one dependency cannot be downloaded, your entire build system is hung out to dry. "Cranky and paranoid old timers" will cite this as the reason to check everything in (or run your own private NPM somewhere).
Branch management - Like you mentioned in the question, some branches might not have the same dependencies. Dev1 adds a new feature and uses a new package. Now Dev2 runs the dev branch or whatever, everything is broken, and they need to know to npm install the new package. More subtle is the case where an npm package's version is changed (now you need npm update, as npm install will say nothing has changed), or where their node_modules is upgraded to work on "new feature 10" but they need to clear everything out to "downgrade" and go fix "prior bug 43". If you are in active development with a team of more than 2-3, watch out for this one.
Build Time - If it is a concern, it takes a little longer to download and install everything. Or a lot longer.
2) Always check in everything you can
This approach includes node_modules as part of the repo.
Positives:
Not dependent on 3rd party sources. You have what you need to run. Your code can live on its own forever, and it does not matter if npm is down or a repo is deleted.
Branches are independent, so new features from Dev1 are automatically included when Dev2 switches to that branch.
Deploy time is shorter because not much needs to be installed.
Concerns:
Repository is much larger. Clones of code take longer as there are many more files.
Pull Requests need extra care. If a package is updated (or installed) along with core code, the PR is a mess and sometimes unintelligible. "500 files changed", but really you updated a package and changed two lines of core code. It can help to break it down into two PRs - one that is a mess (the package update) and one that is actually reviewable (the core code change). Again, be prepared for this one. The packages will not change too often, but your code review takes a little longer (or a little more care) when they do.
OS-dependent packages can break. Basically, anything that is installed/compiled with gyp can be OS-dependent (among others). Most packages are "pure JS" and, being just scripts, run everywhere. But imagine all your devs run and test on OS X while you deploy to Linux: you cannot check in packages that were compiled on a Mac, because they will not run on Linux. An odd workaround for this is to define most packages as dev dependencies (--save-dev) and the ones that need compiling as normal ("production", --save); then you run npm install --production so the dev dependencies are not installed (they are already present in the checked-in node_modules), but the compiled ones are.
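A sketch of that workaround (the package names are just examples):

# Pure-JS packages go under devDependencies and get checked into the repo
npm install --save-dev lodash
# Natively compiled packages stay under regular dependencies
npm install --save node-sass
# On the deploy target, install only the "production" (compiled) packages;
# the pure-JS ones are already present in the checked-in node_modules
npm install --production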
Conclusions
It depends. (Don't you hate hearing that all the time? : )
Depending on your team and your concerns, you might go either approach. Both have their merits, and you will decide which is more beneficial to you. Both have drawbacks as well, so be aware of those before you get bit!
Personally I gitignore node_modules, but I have a different package.json in each branch, and when I switch branches I reinstall the dependencies.
Two branches having different sets of node modules happens in the scenario where one branch is in the development phase and the other is your production branch; in such cases the development branch will have more node modules than production. If I am not wrong, any other scenario might get you in trouble.
Pushing node_modules to a remote version control repository is bad practice, hence just rely on npm install whenever you clone a branch or pull the code, to download any new node modules added to package.json.
Since you don't have node_modules in your actual repository, you need to install the node modules again, and each branch might have its own requirements, as you might update your server.js with a new dependency. You also need to make sure these newly added node dependencies are present on your production server as well.

How to deal with local package dependencies in nodejs with npm

How should we deal with local packages that are a dependency in other local packages?
For simplicity's sake, say we have the following packages
api - express application
people - a package to deal with people
data-access - a package that deals with data access
And then the dependencies are
api depends on people
people depends on data-access
Currently we have these dependencies set up as file dependencies.
I.e. api's package.json would have
"dependencies": {
  "people": "file:../people"
}
Trouble with this is that we're finding it a PITA when we make updates to one package and want those changes in the other packages that depend on it.
The options we have thought of are:
npm install - but this won't overwrite previously installed packages if changes are made, so we have to delete the old one from the node_modules directory and re-run npm install... which can be niggly if the package dependency is deep.
npm link - we're not sold on the idea because it doesn't survive version control... Just thinking about it now, maybe we could have some kind of local build script that would run the npm link commands for us (see the sketch after this list)... this way it could survive version control. Would that be a grunt job?
grunt - we haven't dived too deep into this one yet, but it feels like a good direction. A little bit of googling and we came across this: https://github.com/ahutchings/grunt-install-dependencies
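For what it's worth, a bootstrap script like the one imagined above could be checked in and re-run after every clone; the package names and layout follow the example:

#!/bin/sh
# setup-links.sh - recreate the local npm links after a fresh clone
(cd data-access && npm link)                      # register data-access globally
(cd people && npm link data-access && npm link)   # consume it, then register people
(cd api && npm link people)                       # point api at the local people package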
So, what option would work best for our situation?
Are there other options that we haven't thought of yet?
Ps. we're a .NET shop doing a PoC in node, so assume we know nothing!
Pps. if you strongly believe we're setting up our project incorrectly and we shouldn't have smaller individual packages, let me know in the comments with a link to some reading on the subject.
So, I agree that going with 'many small packages' is usually a good idea. Check out 12factor.net if you haven't already.
That said, in specific answer to your question I'd say your best bet is to consider mainly how you want to maintain them.
If the 'subcomponents' are all just parts of your app (as, for example, data-access implies), then I'd keep them in the same folder structure, not map them in package.json at all, and just require them where you need them. In this case, everything versions together and is part of the same git repository.
If you really want to or need to keep them all in separate git repositories, then you can do npm link, but to be honest I've found it more useful to just use the URL syntax in package.json:
"dependencies": {
  "people": "git://path.to.git:repo#version.number"
}
Then, when you want to explicitly update one of your dependencies, you just have to bump the version number in your package.json and run npm install again.

Provide Node.JS webapp "key in hand"

I am building a simple Node.JS application for a client. The webapp should be easy to deploy on each server instance (which are RedHat EL 6.3), "key in hand".
What is the best way to package a Node.JS app? Basically, I need an "installer" or "package" to:
Install Node.JS
Install the dependencies (npm install)
Populate the application files (CSS, JS, HTML, etc.)
You should deliver a self-contained package. Please check out the great site The Twelve-Factor App, specifically the build, release, run section. There is a lot of hard-won wisdom from experienced operations engineers embodied in that site.
In your app's repo, write a script (shell, node, whatever) that can generate a distributable archive
An RPM or a tar archive are the two most sensible choices for you. tar is more portable and simpler; RPM would integrate nicely with an RPM-based distribution. I would recommend starting with tar if you have not done a lot of software packaging/management work, as RPM is significantly more complex than tar.
The tar archive should embed the node.js files within it. This will make your app easy to install and avoid sharing a system-wide node install, which would create artificial coupling. If you go the RPM route, you can specify node as a dependency in your RPM spec file (but you probably shouldn't - see below).
The archive should embed all of the npm dependencies as well. Don't run npm install at package install time. Consider using the npm shrinkwrap tool to manage your dependencies during development, but at deployment time they should be pre-bundled and ready to run.
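A minimal sketch of such a build script, with illustrative file names and paths:

#!/bin/sh
# build-dist.sh - produce a self-contained tarball: app files, npm deps, and node itself
set -e
VERSION=$(git describe --always)
STAGE="dist/myapp-$VERSION"                        # staging directory (name is illustrative)

mkdir -p "$STAGE"
cp -R server.js lib public package.json "$STAGE"   # the application files
(cd "$STAGE" && npm install --production)          # embed the npm dependencies

# Embed the node binary so the target machine needs no system-wide install
cp "$(command -v node)" "$STAGE/node"

tar -czf "myapp-$VERSION.tar.gz" -C dist "myapp-$VERSION"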
Specifically, these are bad ideas you should avoid:
Do not download anything from the Internet during installation. This is brittle and slow, and can spring bad surprises on you, including security problems.
Do not build artifacts at install time that can be built at build time. So ship pre-built CSS files, requirejs-optimized files, pre-compiled binaries, etc.
As to whether your application RPM should list node.js as a dependency or embed node into the RPM, here are some points to consider.
Embed node.js into your RPM
Single .rpm file to distribute
Allows your application to tightly control the node version it uses. (see below)
Higher reliability. The fact is your app is probably coupled fairly tightly to at least the minor version of node.js you develop on (0.8.x for example) or even a patch release (>= 0.8.12 < 0.9 for example). It's best to allow node.js to decouple your app from the OS, but don't be fooled into thinking your app will work reliably on a different version of node.js without testing and adjustment. Most commonly these days there's just one app running on the OS, and the notion of sharing node between apps incorrectly values conserving disk space over proper decoupling and operational independence of applications. (See the engines sketch after this list.)
It's unclear whether there are any official/reliable pre-built RPMs out there in yumland that will "just work".
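One way to make that version coupling explicit is an engines entry in package.json; the range below simply mirrors the example above:

"engines": {
  "node": ">=0.8.12 <0.9"
}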
Specify node.js as a dependency of your RPM
Follows the general philosophy of OS package management (avoid duplication, conserve disk space, etc)
RPM provides capabilities beyond TAR around inventory management, uninstallation, upgrade, etc. Since you are asking this question, you are probably not ready to address these properly yet, so you might want to start with tar and once you have a solid understanding of that, consider RPM upgrade scripts, etc.
The "single file to distribute" nice point can quickly become untenable once your app starts using a database or 3, supporting daemons for email, log aggregators, etc.
