How to dynamically choose specific branches of dependencies in package.json - node.js

Scenario:
We have a dependency structure such that there are 3 separate private repos:
libA
serviceA
serviceB
Both services require the use of libA (a shared, private library). In addition, serviceA has serviceB in its devDependencies for tests. A visual representation of the tree is shown below.
serviceA ---> libA
|--> serviceB ---> libA
All private repos are listed in the package.json to be cloned over SSH:
git+ssh://git@github.com:{ORG}/{repo}.git
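(For reference, npm allows a branch, tag, or commit to be appended to such a git URL after a # fragment; the branch name below is just a placeholder.)
"dependencies": {
  "libA": "git+ssh://git@github.com:{ORG}/libA.git#my-feature-branch"
},
"devDependencies": {
  "serviceB": "git+ssh://git@github.com:{ORG}/serviceB.git#my-feature-branch"
}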
Question:
We have many scenarios where we want to change code (by creating a branch and PR) in all 3 repos at the same time. In order for the tests to pass, each repo must be cloned from the specified branch if it exists, and from the default branch otherwise. When running tests locally, we symlink the deps manually or use npm link.
When running the tests for serviceA in CI, it would be possible to run a pre-build script that edits the package.json with the new branches (for libA and serviceB). Unfortunately, that doesn't address the fact that serviceB's own deps would also need to be edited so that when it installs libA it uses the correct, new branch.
What would be the best way to go about this?
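As a rough sketch of the pre-build idea above (not a full answer: npm pkg set needs npm 7.24+, and as noted it still leaves serviceB's own dependency on libA untouched), the CI step could rewrite the refs before installing:
# ORG is a placeholder for your GitHub organisation
BRANCH=my-feature-branch
npm pkg set "dependencies.libA=git+ssh://git@github.com:ORG/libA.git#${BRANCH}"
npm pkg set "devDependencies.serviceB=git+ssh://git@github.com:ORG/serviceB.git#${BRANCH}"
npm install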

Related

How to use a GitLab merge request as an npm module

Consider that there are two npm projects, package-a and package-b, which are hosted in a private GitLab instance. package-a depends on package-b via the git+ssh protocol.
Now, as an engineer, I need to modify some code in package-b. Before I merge my change into the trunk branch, I want to test it with package-a to avoid unexpected bugs. How can I have package-a use the version of package-b that contains the un-merged changes?
GitLab provides a ref, refs/merge-requests/$iid/merge, that points at the merge result of each merge request, and it can be used for this purpose.
Let's say your merge request ID in package-b is 106; you can then run the following command in package-a to test against your merge request:
npm install 'git+ssh://git@git.yourcompany.com:products/package-b.git#merge-requests/106/merge'
More reading: https://gitlab.com/gitlab-org/gitlab-foss/-/issues/47110
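Equivalently, running that command records the ref in package-a's package.json, so the dependency entry would look roughly like this (hostname and MR id taken from the example above):
"dependencies": {
  "package-b": "git+ssh://git@git.yourcompany.com:products/package-b.git#merge-requests/106/merge"
}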

How to store node modules between jobs and stages in gitlab with continuous integration

I am fairly new to GitLab CI and I've been trying different approaches to use the node_modules directory in my entire pipeline. From what I've read in the official docs, cache and artifacts seem to be valid approaches to pass on files between jobs:
cache is used to specify a list of files and directories which should
be cached between jobs. You can only use paths that are within the
project workspace.
However, my issue with the caching method is that the node_modules would be persisted between pipelines by default:
cache can be set globally and per-job.
from GitLab 9.0, caching is enabled and shared between pipelines and jobs by default.
I do not want to persist the node_modules between pipelines. What I actually want is to trigger a fresh install with npm in my setup stage and then allow all further jobs in the pipeline to use these modules. Hence, I started using artifacts instead of cache, which is described similarly:
artifacts is used to specify a list of files and directories which
should be attached to the job after success. [...]
The artifacts will be sent to GitLab after the job finishes
successfully and will be available for download in the GitLab UI.
The dependency feature should be used in conjunction with artifacts
and allows you to define the artifacts to pass between different jobs.
The artifact-dependency method seems to be usable in my case. However, both cache and artifacts are extremely inefficient and slow. The node_modules are installed and usable, but the entire directory then gets uploaded somewhere and is re-downloaded between each job. (I would really love to know what happens here... Where do the modules go?)
Is there a better approach to run npm install only once at the beginning of the pipeline and then keep the node_modules in the pipeline during its entire runtime? I do not want to keep the node_modules after all jobs are finished so they don't need to be uploaded or downloaded anywhere.
Sample pipeline configuration file to reproduce the behavior:
image: node:lts

stages:
  - setup
  - build
  - test

node:
  stage: setup
  script:
    - npm install
  artifacts:
    paths:
      - node_modules/

build:
  stage: build
  script:
    - npm run build
  dependencies:
    - node

test:
  stage: test
  script:
    - npm run lint
    - npm run test
  dependencies:
    - node
Where do the modules go?
By default, artifacts are saved on the main GitLab machine:
/var/opt/gitlab/gitlab-rails/shared/artifacts
Is there a better approach to run npm install only once at the beginning of the pipeline and then keep the node_modules in the pipeline during its entire runtime?
There are some options that you can try:
Merge the setup and build stages into one stage.
Use a local npm cache on the builder machines for faster npm install times, or use a private npm proxy registry (for example, Nexus or Artifactory).
Check whether the main GitLab machine and the builders are on the same network, so that uploads and downloads are faster.
Consider packaging your build in Docker. You will get reusable Docker images between your GitLab stages. (Of course, there is an overhead of pushing the images to a Docker registry.)
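As a rough sketch of the first suggestion only (based on the sample pipeline above), merging the setup work into the build job removes one node_modules upload/download; the test job still fetches the artifact, or could run its own npm install:
image: node:lts

stages:
  - build
  - test

build:
  stage: build
  script:
    - npm install
    - npm run build
  artifacts:
    paths:
      - node_modules/

test:
  stage: test
  script:
    - npm run lint
    - npm run test
  dependencies:
    - build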

Share node_modules between different projects

I'm developing various Angular 2 projects and I want to share node_modules folder between multiple projects. I would like to create a structure like this:
MainFolder
- Project1
- Project2
- package.json
so I would have just one package.json for all the projects. My question: is it possible to do this?
If so, do I have to launch npm install with -g?
I can't understand how -g works.
Can someone give me instructions on how to proceed?
Many thanks.
I forgot to say that I build the projects with angular-cli.
The way I get around this for small/learning/test projects is with what I call "git projects". Basically I manage the various projects via git, and just "load" the project I want to work on. Of course this doesn't work if you want to have access to multiple projects at the same time.
I like to use a git client for this purpose because it's easier to visualize my existing "projects".
So my workflow is this...
Create my main/base folder. This will contain the git repo, the single node_modules folder, and whatever else should be common to all projects.
I create the basic package.json file (using npm init). No description, no nothing, just the basic skeleton package.json file. (However, if you know you will use certain packages in ALL of your projects, you can npm install them first, so they will be added to package.json as your "base" modules.)
Now I check the bare package.json into the repo (and anything else that you may want to have in all of your projects, but usually it's just the package.json file). This will be the bare-bones starting branch for all projects.
Once this is checked in, I create a branch off of this in the git repo. This will be "Project 1" - or whatever you want to call it. Then build up your project however you want, installing modules, checking in changes, etc, etc.
When I want to start a new project, I simply check out the first bare-bones project (which is just the empty, or almost empty, package.json file) and do another branch off of it. This will be my 2nd project.
And so forth...
So the main thing is that every new "project" will be a new branch in the git repo, and to create a new project, just switch back to the original bare-bones one and do a new branch off of that.
Of course it is possible to create branches within a project, too. It's all about naming conventions. You could, for example, prefix a new project branch with "P_" or "PROJECT_", etc, so you can quickly tell in your git client which branches are projects. And of course use a different naming scheme if you just need a new branch within an existing project. That's basically how I go about it.
You may not like this workflow, but this way I don't need to install packages globally. When I do a backup, I can simply delete the single (possibly huge) node_modules folder. All project related modules can be reinstalled by simply checking out a branch for a particular project and run "npm install" on its package.json. Hope it makes sense.
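In command form, that workflow looks roughly like this (branch, folder, and package names are only examples):
cd MainFolder
git init
npm init -y                        # bare-bones package.json
git add package.json
git commit -m "bare-bones starting point"
git checkout -b project-1          # new branch = new project
npm install some-package           # build up the project, committing as you go
git checkout master                # back to the bare skeleton (or main, depending on your default branch)
git checkout -b project-2          # start the next project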
Here is documentation on the various npm install arguments
In global mode (ie, with -g or --global appended to the command), it
installs the current package context (ie, the current working
directory) as a global package.
The -g install locations based on environment can be found here
One way you can achieve what you want is to have one solution for both projects, where each project route uses its own lazy-loaded module.
Unless you have a specific business need to share resources, it's better to keep each project separate, with its own resources and configuration.
-g stands for global installation, i.e. the packages you install will be available to all applications.
And why do you want to share the node_modules and package.json file?
Keep them separate for each separate project. And if you need to share your project, you may share your package.json instead of sharing the node_modules folder.
Also, to point out: if you manually install packages by listing their names, then you can use the -g (global) flag, but if you just run npm install on its own, your packages won't be installed as global packages.
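In other words (the package name here is only an example):
npm install -g typescript    # global: installed once, available on the PATH for all projects
npm install typescript       # local: goes into ./node_modules of the current project only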
If it really is just for testing simple applications, could renaming the app folder in some way provide a solution? It assumes that all the dependencies are the same, or at least a subset of the dependencies provided.

Deploying node app with self-maintained NPM modules

I am developing a very complex app that is using internally developed, open source NPM modules.
I often need to change one of those modules (extra features, bug fixing, etc.) in order for the main application to work.
At the moment, I have:
A directory called my_modules, containing one git repository for each module, for example module1 and module2.
A directory called my_apps, where for example there is app1, which has module1 as a dependency.
Under my_apps/app1/node_modules I have module1 and module2, installed via NPM
On the server, I deploy by pulling the git repository, running npm install and npm dedupe, and running the server with forever.
At this stage, if I have to fix something in one of the modules, I:
Fix it within my_apps/app1/node_modules/module1 (not git)
When it's all working, COPY the files over to my_modules/module1 and do a git push and npm publish
The server will pull the latest modules after deploy thanks to npm install
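In command form, the current fix cycle for module1 looks roughly like this (file names are illustrative):
# fix in place until app1 works
$EDITOR my_apps/app1/node_modules/module1/lib/index.js
# copy the working files back into the real repo, then publish
cp -R my_apps/app1/node_modules/module1/. my_modules/module1/
cd my_modules/module1 && git commit -am "fix" && git push && npm publish
# on the server, the next deploy pulls the latest module via npm install / npm dedupe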
This is way, way less than ideal. It's just too error-prone. However:
Having a symbolic link my_apps/app1/node_modules/module1 => my_modules/module1 means that module1 will look for dependencies in its own path, which often causes problems (for example, I need to make sure that EVERY module uses the same copy of module1, which is imperative)
Having a git repo under my_apps/app1/node_modules/module1 feels dangerous, in case I accidentally overwrite changes using npm on the module. Also, once the change is fixed in the local git repo, I would still need to pull the changes into my_modules/module1. Yes, a step forward from copying files over...
What's the "recommended" way of dealing with this? Any best practices?

Reusing code via custom modules in node.js

I have 3 projects in a bitbucket repo: projectA, projectB and projectCommon.
The last one, projectCommon, should be used in projectA and projectB; it is structured as a node module but is not public (not published to the npm registry).
How can I use the projectCommon module in projectA and projectB?
I've tried using npm link, but I'm not very convinced about using it in a production environment.
Is there a better way of doing it? Maybe I should remove projectCommon from the repo and add it to a new repo?
How must the package.json be configured?
Add it as a dependency, e.g.
"dependencies" : {
"Your_Module": "https://bitbucket.org/:username/:projectname/get/master.tar.gz"
}
After which, run npm install.
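If the repository is private, the same idea works with a git+ssh URL plus an optional #branch or #tag suffix (the account and repo names below are placeholders):
"dependencies": {
  "projectCommon": "git+ssh://git@bitbucket.org/USERNAME/projectcommon.git#master"
}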
