Set up a private NPM feed and publish packages - node.js

I have set up an Azure DevOps Artifacts feed for NPM.
I followed the instructions at https://learn.microsoft.com/en-us/azure/devops/artifacts/get-started-npm?view=azure-devops&tabs=windows
As the next step, I wanted to publish packages from the “node_modules” directory of a Visual Studio project that got its packages from the public source.
I thought that if I ran “npm publish” next to my custom “.npmrc” and “package.json” files, it would publish all my libraries from the “node_modules” directory. Instead, it published my Visual Studio project, which uses these libraries. It even followed the .gitignore rules and excluded the “node_modules” folder …
What would be the default way to publish the packages I depend on?
Do I have to write a script to do it for every single package manually?
What do I do with packages that require a prebuild step?
After I ran a simple script, a couple of packages failed.
script:
for /d %i in (C:\Path\node_modules\*) do ( pushd "%i" & npm publish & popd )
Error:
…
6 warn prepublish-on-install As of npm@5, `prepublish` scripts are deprecated.
7 warn prepublish-on-install Use `prepare` for build steps and `prepublishOnly` for upload-only.
8 warn prepublish-on-install See the deprecation note in `npm help scripts` for more information.
…
23 error code ELIFECYCLE
24 error errno 1
25 error xml-name-validator@3.0.0 prepublish: `node scripts/generate-grammar.js < lib/grammar.pegjs > lib/generated-parser.js`
25 error Exit status 1
26 error Failed at the xml-name-validator@3.0.0 prepublish script.
…
I saw that these packages have their own sub-packages :/
PS: My DevOps server and workstation do not have direct access to public networks!
Thanks for any help!

What would be the default way to publish the packages I depend on?
You would not re-publish all your project's dependencies (i.e. modules already published by other people); instead, you let users of your module load them automatically by installing your project as a dependency.
Do I have to write a script to do it for every single Package manually?
Again, you would not publish other people's packages.
You might use a bundler like Parcel, Rollup or Webpack to include all your dependencies' build code within your own build artefact, so that it no longer has external dependencies. To tell your module's users about that fact, you would also have to tweak your project's package.json (i.e. dependencies become devDependencies), and you should also take care of licenses (some of them require you to include legal headers inside your artefact, since you are publishing other people's work under your own name).
Furthermore, you break with the modularity of the ecosystem, so don't expect overall efficiency.
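For illustration, here is a minimal package.json sketch of that bundling approach. The package name, the bundler invocation and the dependency are placeholders, not something taken from your project (package.json cannot carry comments, so all hedging has to live here):

{
  "name": "my-bundled-module",
  "version": "1.0.0",
  "main": "dist/bundle.js",
  "scripts": {
    "prepare": "rollup -c"
  },
  "devDependencies": {
    "rollup": "^2.70.0",
    "lodash": "^4.17.21"
  }
}

Because the placeholder dependency (lodash here) is compiled into dist/bundle.js by the prepare step, it can live in devDependencies, and consumers installing the module pull in no external dependencies at all.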
PS: My DevOps server and workstation do not have direct access to public networks!
I don't know if this was already possible at the time of your post:
Within an Azure Artifacts feed you can define upstream sources.
Your feed then provides a proxy to (and a cache for) npmjs.org, where the dependencies of your project are published and hosted. There is no need for direct access to public networks, because you download from npmjs.org through your feed.
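For example, a minimal .npmrc pointing at such a feed could look like this. The organization and feed names are placeholders; the URL scheme is the one from the Microsoft guide linked above, so check that guide for your on-premises DevOps Server variant:

; all installs go through the Azure Artifacts feed ("yourorg" and "yourfeed" are placeholders)
registry=https://pkgs.dev.azure.com/yourorg/_packaging/yourfeed/npm/registry/
always-auth=true

With the npmjs.com upstream enabled in the feed's settings, npm install resolves public packages through the feed and caches them there, so your machines never need to reach npmjs.org directly.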

Related

npm Azure Artifacts feed doesn't install all dependencies from upstream source

Trying to set up a proof of concept for the place I work, using a private npm registry to limit the packages developers can download. I set up a feed on Azure Artifacts and set the official npm registry (https://registry.npmjs.org) as the only upstream source. This feed was set as the registry in the .npmrc file, and the project correctly identifies it as the registry source (per npm config get registry).
When a user (with permission to install from upstream) tries to install a package from the empty feed, the package is installed correctly from the upstream along with all of its dependencies. The package is also saved to the Artifacts feed, but only some of its dependencies are saved with it. There seems to be no rhyme or reason as to which dependencies get saved, as the set changes almost every time I install the exact same package.
When a user that does not have permission to install from an upstream source tries to install that same package, it fails on one of the dependencies that wasn't saved, giving a 404 error for the artifacts feed, saying that the package was not found in the registry.
I've set up quite a few different feeds, both project-scoped and organization-scoped to see if I perhaps fiddled with the wrong settings/set something up wrong, but I get the same behavior with every feed I set up.
Are there certain criteria that determine whether or not a dependency is downloaded, and is there a way that I can make it so all dependencies are saved to the feed when a package is installed from the upstream?
Are there certain criteria that determine whether or not a dependency is downloaded
npm has a local cache. You'll want to run npm cache clean (npm 5 and later require the --force flag) before testing. Otherwise, there's no guarantee that the package will be downloaded; it may be installed from the cache instead.
and is there a way that I can make it so all dependencies are saved to the feed when a package is installed from the upstream?
I suppose you can try disabling the cache, but that will likely greatly inflate installation times for your users, so you may only want to do it while testing. That said, there are various somewhat-hacky ways to do it more permanently. You can use the force config option, but that has other side effects. I imagine you could point the cache at /dev/null or something like that, although I've never tried it. There are other ideas in the answers to the "Disable npm cache" Stack Overflow question.
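A minimal sketch of that testing setup, assuming a Unix-like shell (the package name is a placeholder):

# clear the local cache so the package must come from the feed (npm 5+ requires --force)
npm cache clean --force
# or use a throwaway cache directory so nothing is reused between test runs
npm install some-package --cache "$(mktemp -d)"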

How can I switch between a linked npm dependency (in development) and an installed dependency (in staging/prod)?

I have a custom npm module that I am working on, and it has a GitHub repo. I'm also working on a project that uses the custom module. When working on the larger project, it is nice to use npm link so I can make changes to the module and see them right away in the main project.
To deploy to staging or production, I use shrinkwrap and shrinkpack so I can do an npm install after every deploy (some of the dependencies need binaries, and dev systems aren't the same as production systems, so they do need to be installed and not just kept in source control). Edit: I'm crossing this out, as the answer below technically solves my issue. It doesn't address this particular point, but that wasn't as important as the rest of it.
Of course, since the module is linked to my main project and not listed in package.json, a deploy and install misses it entirely. I can go ahead and list it in package.json and have it point to the appropriate GitHub repo, but then every time I need to test a change in the main project I would have to commit and push those changes, then update the main project, kill and restart the app...that would get tiresome pretty quickly.
I guess I need something like the opposite of "devDependencies"; something where I can have it not install the module on dev, but do install it from GitHub when doing npm install on staging or production. Other than remembering to manually change package.json every time I need to go back and forth, is there a better way to do this?
You can specify a GitHub repository as your package to install in your package.json file:
{
  "dependencies": {
    "my-library": "githubusername/my-library"
  }
}
This will work in your production environment.
In your development environment, use npm link.
From within the my-library folder, run npm link directly. That tells npm on your local box that my-library is available as a link.
Now, in the project that uses my-library, run npm link my-library. This creates a symlink to your local development version of my-library, allowing you to change code in that repository and have it work in the other project that needs it.
Once you are ready to push to production, push my-library to your GitHub repository, and then you can npm install on your servers like normal.
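Put together, the workflow looks something like this (the folder names are placeholders):

# in the module's folder: register it as a global link
cd ~/dev/my-library
npm link

# in the consuming project: point node_modules/my-library at that link
cd ~/dev/my-project
npm link my-library

# on staging/production: the GitHub entry in package.json takes over
npm install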

NPM errors and control in Azure Websites

I want to build my Node.js application in an Azure Website.
It uses various NPM packages via my package.json file.
My problem is that I often receive error messages related to missing NPM files.
Normally I put my files on the server via FTP, or edit them directly there via the Visual Studio 2015 Azure plugin. This may be the reason why NPM isn't triggered as Microsoft intended.
I would prefer a way in which I can just run commands with elevated privileges to have full control over NPM myself.
What ways are there to avoid these problems?
If you're publishing your Node.js application 'manually' via FTP, there are a few concerns to be aware of.
First of all, 'manually' means manually.
Git
If you use continuous deployment via Git, the final deployment step is to call npm install in your current application folder, which installs all the packages listed in the package.json file.
The node_modules folder is excluded by default via the .gitignore file, so all packages are downloaded by the server.
Web deployment
If you're using web deployment from Visual Studio or the command line, all the files contained in your solution are copied to the hosting environment, including the node_modules folder. Because of this, the deployment can take a long time to finish due to the huge number of dependencies and files that the folder contains.
Even worse: this scenario could lead you right back to the problem you're facing now.
FTP deployment
You're copying everything yourself, so the same thing that happens with web deployment happens with the FTP deployment method.
--
The thing is that when you copy all those node_modules folder contents, you're assuming that those dependencies remain the same in the target environment. In most cases that's true, but not always.
Some dependencies are platform dependent: maybe in your dev environment a dependency works fine on an x86 architecture, but what if your target machine or website (or some mix of them) is x64? (A real case; I have already suffered it.)
Other related issues can happen. Maybe your direct dependencies don't have the problem, but the dependencies linked to them do.
So it is always strongly recommended to run npm install in your target environment and to avoid copying the dependencies directly from your dev environment.
That means you copy the folder structure to your target environment excluding the node_modules folder, and once the files are copied, you run npm install on the server.
To achieve that, you can go to
yoursitename.scm.azurewebsites.net
There, go to the "Debug Console" tab, then to the directory D:\home\site\wwwroot, and run
npm install
After that, the packages and dependencies are downloaded for the server/website architecture.
Hope this helps.
Azure tweaks the Kudu output settings; in local Kudu installations the output appears to be normalized.
An imperfect workaround could be this:
npm install --dd
Or even more detailed
npm install --ddd
The most closely related answer from Microsoft itself is this:
Using Node.js Modules with Azure applications
Regarding control via a console with elevated privileges, there is the Kudu console. But the error output is quite weird; it feels like blindly putting commands into the console without much feedback.
Maybe this is a way to go, but I haven't tried it yet.
Regarding deployment, it looks like Azure wants you to prefer continuous deployment.
The suggested way is this here.
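If you do go with continuous deployment, Kudu also lets you take over the deployment steps yourself via a .deployment file in the repository root, which is one way to guarantee that npm install always runs on the server. A rough sketch, not a definitive script: the script name is a placeholder, while DEPLOYMENT_SOURCE and DEPLOYMENT_TARGET are environment variables Kudu provides to custom deployment scripts:

; .deployment -- tells Kudu to run a custom script instead of its default steps
[config]
command = deploy.cmd

:: deploy.cmd -- copy the files, then restore packages on the server
xcopy /s /y /i "%DEPLOYMENT_SOURCE%" "%DEPLOYMENT_TARGET%"
cd /d "%DEPLOYMENT_TARGET%"
call npm install --production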

Best way to set up a node.js web project in a closed environment

We build a web application and our project uses various npm packages for development, testing and run-time.
The project is built as part of a larger project in TFS. TFS runs Ant to build the project. Our build.xml first runs npm install, then transpiles and minifies the TypeScript and Sass files (using Grunt tasks), and then builds the final war file.
This all works OK, but our TFS is not allowed to access the Internet during the build, only our local network. Therefore, we have all the npm libraries we use copied to a file server in our network, and our package.json dependencies point to paths on that file server.
Does this seem like a reasonable solution?
The problem we have is that the npm install takes about 10 minutes to get all of the more than 50 packages we use (which include karma, grunt, sass, tslint, etc.; 170MB in total).
We are now looking for ways to reduce the TFS build time. One option is to put the node_modules folder in our source control and skip the npm install step, but it seems wrong to put third-party code in our source control.
I’d love to hear other ideas to handle this and have shorter build time.
Note that on developer machines the project builds in no time, as all packages are already installed, but TFS builds start by getting a clean environment from source control, so nothing is installed.
Tough problem. You could have TFS check if your package.json checksum has changed in order to determine if a "clean" is necessary. You'd still have a 10 minute build whenever package.json is updated, but package.json changes are usually infrequent.
The lines become blurred when you host your own npm libraries, since this essentially takes a snapshot of only the dependencies you need. If you added a dependency, say colors, you'd have to update your npm repo; that could be viewed as updating the node_modules folder on your npm repo. It's a static list of available dependencies, which essentially defeats the purpose of a package.json (unless, of course, other internal apps use the internal npm repo).
BUT, I digress. I'd argue that the best option is to have a package.json checksum that tells TFS whether it should bother rebuilding node_modules.
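A minimal sketch of that idea as a build step, assuming a Unix-like shell on the build agent (adapt it to a Windows batch step as needed; the marker file name is arbitrary):

# reinstall only when the checksum of package.json has changed since the last build
new=$(md5sum package.json | cut -d' ' -f1)
old=$(cat .package.json.md5 2>/dev/null || true)
if [ "$new" != "$old" ]; then
  npm install
  echo "$new" > .package.json.md5
fi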

How to automate testing the user version of an npm package instead of running the development version on continuous integration?

It happens occasionally that the development version of a module works in my development workspace and passes on Travis CI, but after publishing to npm it turns out that the end-user package is broken.
For example, if you use a submodule that should be in dependencies but have it in devDependencies, then CI will pass (but there are plenty of other possible breakages).
How do you automate testing this? Do you use external rigging? Is there a secret module? Do you have a user acceptance test suite?
I use GitHub with Travis CI, but the standard setup uses the development install.
Once upon a time I discovered that npm would let me publish packages that are uninstallable. So I've added a target to my Gruntfile that does this:
Issue npm pack to create a package from my source.
Into a directory created (automatically by my Gruntfile) just for testing, install the new package using npm install <path to the package created in the previous step>.
I have a target for publishing a new version that will publish only if the steps above are successful.
The steps above would not catch the dependency problem you mentioned in the question, but they could easily be extended to catch it. To do this, I'd add one or more tests that cause the package installed in step 2 above to call require on everything it depends on.
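A sketch of those steps as plain shell commands (the paths, version and package name are placeholders):

# 1. create the tarball npm would publish
npm pack                                   # e.g. produces my-library-1.0.0.tgz
# 2. install that tarball into a scratch directory and smoke-test it
mkdir -p /tmp/install-test && cd /tmp/install-test
npm install /path/to/my-library-1.0.0.tgz
node -e "require('my-library')"            # fails fast if a runtime dependency is missing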
I would suggest setting up your own CI server that does essentially one thing: npm install package ; cd node_modules/package ; npm test. This would ensure that your package is installable, at least on your server.
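Spelled out, that one-liner could look something like this as a build script (a sketch only; the package name is a placeholder, and note that an installed package only has its own tests available if they are shipped in the published tarball):

# install the published package into a clean directory, then run the installed copy's tests
mkdir /tmp/user-install && cd /tmp/user-install
npm install my-package
cd node_modules/my-package
npm test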
I heard that Jenkins is good for this (at least, that's what the node.js core team seems to be using), but I don't have any first-hand experience with it yet. We're just planning to set it up in a couple of weeks.
Also, having some external module that depends on you and testing it helps a bit. :)
