unable to download private github package through aws codedeploy - node.js

We have a code pipeline hosted on AWS. In the CodeDeploy stage, it installs the packages in our Node.js project. One of these packages is a private package hosted on GitHub under a Pro user account. I have set up the integration with GitHub. The source was GitHub version 1, but since it wasn't recommended, I changed it to GitHub version 2 with a proper connection. Now there's an AWS app installed on our repo to manage access. I configured it to allow access to both the Node.js repo and the private package repo. But we're still getting a 404 Not Found when the pipeline runs.

So basically I did two things to solve my problem:
1- Move the library from devDependencies to dependencies in package.json, since our pipeline only installs production packages.
2- Copy the .npmrc file from the home folder into the repo. This way the pipeline knows it has to download from the GitHub registry and not from the npm registry; a sketch of such an .npmrc is shown below.
Hope this saves somebody's time in the future.
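For reference, a minimal sketch of the checked-in .npmrc that routes a private scope to the GitHub Packages registry (the @your-org scope and the GITHUB_TOKEN variable are placeholders, not values from the original setup):

# .npmrc (sketch) - route the private scope to GitHub Packages
@your-org:registry=https://npm.pkg.github.com
//npm.pkg.github.com/:_authToken=${GITHUB_TOKEN}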

Related

Can the node_modules folder downloaded with a local Node.js installation be set up as a JFrog local repo?

Is it possible to create a private repository hosting the packages that are downloaded with a Node.js installation? After downloading Node.js locally, can I zip the default node_modules folder and make it a local npm repo in JFrog, so that when I run npm install, the packages are downloaded from this repo? I want to have the packages of the node_modules folder of Node.js version 14.4.0 as a repository in JFrog instead of having a remote npm repo that pulls packages from www.npmjs.org. The reasons for not wanting a remote npm repo are:
1: It pulls whatever newer version is available on npmjs.org.
2: Third-party API calls from JFrog are something my organization doesn't permit for this project.

Push all project node dependencies to a private package feed

Currently, in a project, we're using some packages from a private registry hosted on Artifactory, along with some packages from npm.
We're trying to migrate all the packages (public and private) to another Artifactory server, which is offline. However, when I run an npm publish command on the project, it only pushes the project itself as a package and not its dependencies.
We'd like to publish all dependencies located in node_modules one by one to the private registry so they can be accessed from any offline project. Is it possible to accomplish this?
I already tried adding the packages to bundledDependencies in package.json, but this doesn't push the dependencies individually.
As a workaround, we created a script that ran npm publish on every package in node_modules, all from the root of the project (which had the corresponding .npmrc pointing to the target repository). This created a copy of every dependency on the new Artifactory.
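A minimal Node.js sketch of such a script (the file name publish-all.js and the handling of node_modules are assumptions, not the original script):

// publish-all.js (sketch) - publish every package found in node_modules
// to whatever registry the project's root .npmrc points at.
const { execSync } = require('child_process');
const { readdirSync, existsSync } = require('fs');
const { join } = require('path');

const modulesDir = join(__dirname, 'node_modules');

for (const entry of readdirSync(modulesDir)) {
  // Scoped packages (@scope/pkg) live one directory deeper.
  const packages = entry.startsWith('@')
    ? readdirSync(join(modulesDir, entry)).map((p) => join(entry, p))
    : [entry];

  for (const pkg of packages) {
    const pkgDir = join(modulesDir, pkg);
    if (!existsSync(join(pkgDir, 'package.json'))) continue; // skip .bin etc.
    try {
      // Run from the project root so the root .npmrc is picked up.
      execSync(`npm publish "${pkgDir}"`, { stdio: 'inherit' });
    } catch (err) {
      // Versions that already exist in the target registry will fail; skip them.
      console.warn(`Skipped ${pkg}: ${err.message}`);
    }
  }
}

Run it from the project root with node publish-all.js.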

How to deploy one app from a large monorepo, with dependencies on packages in the same repo, to Google App Engine?

I have a large Node.js monorepo with several applications and packages and interdependencies between them. It is all managed with yarn workspaces and a little bit of Lerna. Everything works great for me; however, I am having trouble trying to deploy one of the applications in this monorepo to Google App Engine.
The main issue is that App Engine wants to install packages that only exist locally and are not on npm, and it throws an error.
I've scoured the Google Cloud documentation but did not manage to find anything that I could use to specify custom node packages or anything similar.
Is there a way to make such a deployment without publishing the local packages to npm?
The basic structure of the app I want to deploy looks like this:
-root
  -packages
    -packageA
      -package.json
  -apps
    -deployable-app
      -package.json <- contains dependency: "packageA": "0.0.1"
      -app.yaml
Create a minimal Docker image by copying into the image only the deployable-app and the packages from the same monorepo that the deployable-app depends on (in your case, packageA); a rough Dockerfile sketch follows the notes below.
When installing, yarn will do all the work to link them together.
Notes:
Deterministic installation - It is recommended to force a deterministic install in monorepos, because Node.js package managers (pnpm, yarn, npm) do not read the lock files of your app's dependencies. So when you run install and packageA sits in a public/private npm registry, the package manager installs packageA's dependencies however it likes. But you have a yarn.lock file in your monorepo that describes exactly what should be installed and in which version.
Small Docker image, better caching - Copy only the local packages from your monorepo that your deployable-app needs, and create a script (inside the Dockerfile) that removes all devDependencies from all the package.json files in the monorepo. They are not needed in production; keep your image small.
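A rough Dockerfile sketch of this approach, assuming a yarn-workspaces layout like the tree above (the Node base image, the copied paths, and the index.js entry point are placeholders):

# Dockerfile (sketch) - copy only deployable-app and the local packages it needs
FROM node:14-alpine
WORKDIR /app

# Root manifests so yarn can resolve the workspaces deterministically.
COPY package.json yarn.lock ./

# Only the workspaces that deployable-app actually depends on.
COPY packages/packageA ./packages/packageA
COPY apps/deployable-app ./apps/deployable-app

# Production-only install; yarn links packageA into deployable-app.
RUN yarn install --frozen-lockfile --production

WORKDIR /app/apps/deployable-app
CMD ["node", "index.js"]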
I had the same problem with Firebase Cloud Functions, so I decided to publish my packages to a private registry and configure the Cloud Functions environment with a .npmrc that uses my private registry. I suppose you can do the same with App Engine.
For the private registry, I have tried two options: GitHub Package Registry (now in beta) and Verdaccio (a self-hosted option).

How do you go back to an older version of dependencies if the location or the npm registry has migrated to a new location?

How do you go back to an older version if the location of a git dependency or the npm registry itself has migrated to a new location?
Occasionally there is a need to change the location where a git repo is hosted. Or, if maintaining a private npm registry, the URL of the registry may change.
Since their URLs are checked in as part of package.json or yarn.lock (or the npm equivalent), how do you deal with the case where you need to build an older version but the location has changed?
Is there a possibility to overwrite the resolved URL before the fetch occurs?
Thanks!
I recommend using JFrog Artifactory and following these steps:
Create a remote repo for the external repo or public registry that you need (you probably have this URL in your registry config or in the dependencies inside package.json).
Create a virtual npm repo in Artifactory and add the remote repo created in step 1 to this virtual repo.
Replace the default registry with your new virtual repository using this command (or via a project-level .npmrc, sketched after these steps):
npm config set registry http://<ARTIFACTORY_SERVER_DOMAIN>:8081/artifactory/api/npm/your-npm-virtual-repo-name
Remove server URLs from your dependencies and replace them with only the dependency name and version, like:
"dependency-name1": "0.0.1",
"dependency-name2": "0.0.1",
Then publish your projects without server URLs; it's bad practice to have the repo URL inside the dependencies in package.json.
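As a sketch of step 3, the registry can also live in a project-level .npmrc committed next to package.json, so every build picks it up (same placeholder host and repo name as in the command above):

# .npmrc (sketch) - replace host and repo name with your Artifactory instance
registry=http://<ARTIFACTORY_SERVER_DOMAIN>:8081/artifactory/api/npm/your-npm-virtual-repo-name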
More info here: Npm Registry with Jfrog Artifactory

How to install a private Node.js package repository for a company intranet?

What is the recommended way of distributing packages to Node.js servers running inside a company intranet? The problem is that most servers cannot directly access the npm registry. Is it possible to install a private repo, sync it with the official one, and then sync the internal servers from there?
Best practice is to check your node_modules into your git repository (remove node_modules from .gitignore). Then, only your developer machines will need access to npm.org, and the servers will get the packages from your internal git repository.