Should I use yarn workspaces in production? - node.js

A coworker and I are working on deploying some APIs that share some common libs.
The thing is, he said that workspaces shouldn't be used in production, but he gave no reason to back that up. I came up with the following setup:
Set up SSH keys on the remote machine to pull the repository
Run yarn workspace <api-name> install for a single API
Run yarn build for ES6/Babel
Run the API from the dist folder. And we're done. No need for bash scripts or private npm packages.

I see nothing wrong with your setup. Yarn workspaces are meant to simplify management tasks on your code tree (and I see an install script as code). It does not really matter whether you run yarn workspace <api-name> install or cd <api-name>; ./install.sh - but the former is less error-prone.
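For what it's worth, a minimal sketch of that flow on the production box - the repository URL, the workspace name api-one, and the dist path are placeholders, and the exact install behaviour depends on your Yarn version:
# clone the monorepo over SSH (URL is a placeholder)
git clone git@github.com:acme/monorepo.git && cd monorepo
# install dependencies; with classic (v1) workspaces this resolves the whole tree anyway
yarn workspace api-one install
# transpile with Babel, assuming each API has a "build" script
yarn workspace api-one run build
# run the built API from its dist folder (path is an assumption)
node packages/api-one/dist/index.js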

Related

Is it mandatory to remove node_modules and run npm install for each deployment to production?

I use Vuetify (Vue).
Is it mandatory to remove node_modules and run npm install for each deployment to production? Or is it enough to just run npm run build?
I have two options:
Option 1: on every deployment, run npm run build directly
Option 2 :
Delete the contents of dist folder
Delete node_modules folder
npm install
npm run build
Which is the best option?
npm install
This command installs a package, and any packages that it depends on. If the package has a package-lock or shrinkwrap file, the installation of dependencies will be driven by that, with an npm-shrinkwrap.json taking precedence if both files exist. See package-lock.json and npm-shrinkwrap.
If you did not install or update any package before releasing the project, you do not need to execute npm install; otherwise, you need to execute it to ensure that the dependency versions in the production environment are consistent with your local dependency versions.
If you are using an automated build/deployment tool like Jenkins, you can simply execute the install command before each build for convenience. It's okay.
Imagine more environments, not just a production:
development
testing1
staging
uat
production
Can we commit the npm run build result (compiled/minified JS) or node_modules to our git repository? The answer is no. So if you need a version of your app running in any of these environments, you must execute npm run build, and that command needs the classic npm install. I think this last sentence answers your question.
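To make this concrete, Option 2 from the question essentially boils down to the following sketch (script names assume a standard Vue CLI project); Option 1 just runs the last command against whatever is already installed:
rm -rf dist node_modules   # start from a clean slate
npm install                # restore dependencies from package.json / the lock file
npm run build              # produce the production bundle in dist/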
(ADVICE) Docker to the rescue
Assumption 1: your client-side app (Vue) is not complex (no login, no session, no logout, etc.), so you could publish it using a basic nginx, Apache, or minimal Node.js server.
Assumption 2: you can afford one more server to host a private Docker registry. If you are on Google, Amazon, or Azure, this service is ready to use, though of course payment is required.
In one line: with Docker you execute npm install and npm run build just once. The complete flow is:
the developer pushes some changes to the git repository
a docker build is launched, manually or automatically
inside the Dockerfile, npm install and npm run build are executed, and a minimal Node.js server (for example) is configured to serve your built assets
your new docker image is pushed to your private docker registry
that's all
If your quality assurance team needs to perform some tests on your new app, they just need to download the docker image. If everything is ok, you move on to the next stage (staging or uat) or to production. The steps will be the same: just download the docker image.
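A rough sketch of that flow with the Docker CLI - the registry host registry.example.com and the image name are placeholders for your private registry:
# build the image; npm install and npm run build run inside the Dockerfile
docker build -t registry.example.com/my-vue-app:1.0.0 .
# push it to the private registry once
docker push registry.example.com/my-vue-app:1.0.0
# in testing1, staging, uat or production: just pull and run the same image
docker pull registry.example.com/my-vue-app:1.0.0
docker run -d -p 80:8080 registry.example.com/my-vue-app:1.0.0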
Optimizations
Use Docker multi-stage builds to split the build and start steps
If your app does not have complex flows (no login, no session, no logout, etc.), replace the basic Node server with a simple nginx
I need login and logout
In this case, nginx or Apache does not help you because they are simple static servers.
You could use a minimal nodejs code like this:
https://github.com/jrichardsz/nodejs-static-pages/blob/master/server.js
adding /login, /logout, etc.
Or use my server:
https://github.com/utec/geofrontend-server
which has /login, /logout, and other useful features, for example: how are you planning to pass your backend API URLs to your Vue app in each of your environments?

How to run npm webpack vue project on both Windows and Mac?

I took a Vue 2 online course and it showed (but didn't really explain) how to install node.js, npm, and Vue. I'm currently using vue-cli to set up my project with vue init webpack-simple. The problem is that I have a Windows desktop and a Mac laptop. I'm using Box cloud on both, but I need to keep two separate folders for the same project: basically, project-1-windows and project-1-mac.
I can't run npm run dev on project-1-mac while on Windows 10, and vice versa. The only way I know to run both is to delete the node_modules folder and run npm install. However, it takes a while for the files to download. Is there an easier way to do this?
It looks like you want either GitHub/BitBucket/friends or (for much more complex set-ups) Docker.
I will explain only the first (easier) option. To set up Docker, you'd better go to its docs.
So, for the GitHub/BitBucket/friends way, you need some one-time set-up (you have to do all of this in the terminal of your machine; I won't go too deep into details because you can easily find the corresponding docs for each step by googling it).
Install git if needed on both machines. On Mac, just run git --version in the terminal; it will either show the version of the installed git or ask if you want to install git together with other developer tools.
Install brew on Mac and a package manager of your choice on Windows. These are just package managers; use them to install nvm.
Install nvm (the node version manager, arguably the most convenient way to manage node.js installations) on both machines.
Use nvm to install node.js (npm comes bundled with it) on both machines. That's it! One-time set-up done. Run node -v && npm -v to check that both are installed.
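A sketch of that one-time set-up in the terminal, assuming nvm was installed via your package manager or its install script:
git --version        # confirms git is available (triggers the installer prompt on macOS)
nvm install --lts    # install the latest LTS node.js; npm comes bundled with it
nvm use --lts
node -v && npm -v    # verify both are installed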
Now, to start each project you would do the following (a full command sketch follows this list):
Create a repo (which is like a folder, but on a GitHub/BitBucket server) that you can freely access from any device with an internet connection.
Start project on any of your machines with something like npm init or vue init webpack-simple or whatever you feel comfortable with.
Run git init
When you make changes, commit & push them to your online repo.
Avoid committing files that might be auto-regenerated; they simply aren't worth storing.
You may use any npm commands.
When you want to continue working on another machine, simply git clone your existing repo, run npm install and you are done.
Commit changes if needed.
git pull the changes to the other device if needed
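Put together, the per-project flow looks roughly like this - the repository URL and project name are placeholders:
# machine A: start the project and push it
vue init webpack-simple project-1 && cd project-1
git init
echo "node_modules/" >> .gitignore       # don't commit auto-regenerated files
git add . && git commit -m "initial commit"
git remote add origin git@github.com:me/project-1.git
git push -u origin master

# machine B: continue working
git clone git@github.com:me/project-1.git && cd project-1
npm install     # re-creates node_modules for this OS
npm run dev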

How to install and deploy node.js application?

I'm new to node.js. After creating a modularized project with express, tests, .nvmrc, etc., it's finally time to deploy the app. How should it be done? In Java you bundle your project into a single, self-contained file and put it on a server with some configuration. What about node.js?
Should I just copy the whole directory with sources and node_modules to the production machine and use systemd, pm2, or another process manager to run it? But I heard some dependencies might be system-dependent, so they may work incorrectly.
Or should I copy only the sources and run npm install --production on the production machine? But that way deployment is only possible when the npm registry is online. It also takes time to build the application, and it has to be done on every machine in the cluster. And what about quickly rolling back to the previous version in case there is a bug? Again, time and an online npm registry are needed.
Another option is to build a docker image. But it seems awkward that the only way to easily and safely deploy the app is to use a third-party technology.
How is it done in real-life scenarios?
Sure, don't copy the whole directory, especially node_modules.
All the packages installed for your project should be installed with the --save option, for example: npm install --save express. If you do so, your package.json will list the dependencies required for your project, whether they are dev dependencies or production dependencies.
I don't know what your project structure looks like, but as a node application you have to run npm init in your project to set up the package.json file, and then you can start adding your dependencies with --save.
Usually we use git (a version control system) to deploy to the server: first we push our code to a git repository, then we pull from it on the server.
You have to add a .gitignore to your project and ignore node_modules so it isn't committed to your git repository.
Then you can pull on your server and run npm install there. And of course you need to launch a web server to serve your application, for example nginx.
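A minimal sketch of that flow, assuming the app's entry point is app.js and pm2 is the process manager:
# on your machine
echo "node_modules/" >> .gitignore
git add . && git commit -m "release"
git push origin master

# on the server
git pull origin master
npm install --production          # install only production dependencies
pm2 start app.js --name my-api    # or pm2 restart my-api on later deploys
# nginx then proxies incoming requests to the app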
You can try Heroku for an easy deployment; all you have to do is set up your project with Heroku, and when you push your code, Heroku manages the deployment.

How to test cli (global link) with yarn?

I am trying to test a CLI (feathers-cli) that I am working on. I have cloned its primary dependency (feathers-generator) and made my modifications. This is what I have done:
Gone into feathers-generator (master branch) and run yarn link
Gone into feathers-cli (3.0 branch) and run yarn link "feathers-generator"
Run yarn link
Created a new directory and removed my existing version of feathers-cli
Run yarn link "feathers-cli" then run yarn global add "feathers-cli"
However, running feathers at this point uses the regular version it has pulled from npm. I have looked through the yarn docs and can't find anything about globally linking packages. How do I approach this?
With yarn v1.19 you can do:
yarn global add file:/fullpath/to/myproject
Yarn, unfortunately, does not seem to support this directly in any way. The best thing I've found is to symbolically link the file to the user's yarn bin folder.
On Windows: %LOCALAPPDATA%/Yarn/bin
On Linux: ~/.yarn/bin/
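On Linux/macOS that symlink could look like the following - the project path and the bin file are assumptions, so check the bin field of your package.json:
# make the CLI entry point executable and link it into yarn's bin folder
chmod +x ~/projects/my-yarn-cli/bin/cli.js
ln -s ~/projects/my-yarn-cli/bin/cli.js ~/.yarn/bin/my-yarn-cli
my-yarn-cli --version   # works as long as ~/.yarn/bin is on your PATH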
The easiest way is to test within a node project, even if it's a global CLI tool.
Example
Say I want to test a global tool my-yarn-cli, which has a few commands: new, --help, --version, etc.
In the CLI tool directory, run yarn link; this makes the local version of my-yarn-cli available to yarn.
In an empty project, run yarn link my-yarn-cli; this adds a reference to the local version of my-yarn-cli.
Test out commands using yarn as a proxy, for example:
yarn my-yarn-cli --help
yarn my-yarn-cli --version
yarn my-yarn-cli new TEST_ARGUMENT
This has been a useful workflow for me, and is much better than messing with symlinks or copying directories.

Speeding up the npm install

I am trying to speed up the npm install during the build process phase. My package.json has the list of packages pretty much with locked revisions in it. I've also configured the cache directory using the command
npm config set cache /var/tmp/npm-cache --global
However, when installing with npm install -g --cache, I find that this step isn't reducing the install time by loading the packages from the cache as I would expect. In fact, I doubt it's even using the local cache to look up packages first.
Proposing two more modern approaches:
1) npm ci
Use npm ci, which is available from npm version 5.7.0 (although I recommend 5.7.1 and upwards because of a broken release) - it requires package-lock.json to be present, and it skips building your dependency tree from your package.json file, respecting the already resolved dependency URLs in your lock file.
A very quick boost for your CI/CD environments (our build time was cut down to a quarter of the original!) and/or a way to make sure all your developers are on the same versions of dependencies during development (without having to hard-code strict versions in your package.json file).
Note however that npm ci removes the node_modules/ directory before installing, so it won't benefit from any caching strategies.
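A sketch of what this can look like in a CI script - cache npm's own cache directory (not node_modules/) between builds; the exact cache path depends on your runner:
npm config get cache                  # usually ~/.npm - persist this directory between builds
npm ci --prefer-offline --no-audit    # clean, reproducible install from package-lock.json
npm run build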
2) npm i --prefer-offline
Use the --prefer-offline flag with your regular npm install / npm i. With this approach, you need to make sure you've cached your node_modules/ directory between builds (in a CI/CD environment). If it fails to find packages locally with the specific version, it falls back to the network safely.
You can also add --no-audit --progress=false to reduce pre-install checks and remove the progress bar (the latter is only a very slight improvement).
For a pure npm solution, you may try
npm install --prefer-offline --no-audit --progress=false
--prefer-offline may not be useful for the first run.
As suggested by @Daniel Serodio:
You could also include your node_modules folder inside your repository, but you should probably zip it first and then add it to the repo; while installing, you can unzip it and just run
npm rebuild
(which works cross-platform), and it is quite fast.
This would also give you the benefit of full control over all your dependencies.
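A sketch of that zip-and-rebuild approach (the archive name is arbitrary):
# when committing
tar -czf node_modules.tar.gz node_modules
git add node_modules.tar.gz && git commit -m "vendor dependencies"

# when installing on another machine
tar -xzf node_modules.tar.gz
npm rebuild     # recompiles native add-ons for the current platform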
Also, you can set the progress flag to false to increase your speed by up to 2x.
npm set progress=false
Read source for more info
Update:
You can also use pnpm for this:
npm i -g pnpm
This basically uses locally cached modules (I have heard it's better than Yarn).
It's better to install the pnpm package using the following command:
npm i -g pnpm
pnpm uses hard links and symlinks to save each version of a module only once on disk. With npm or Yarn, for example, if you have 100 projects using the same version of lodash, you will have 100 copies of lodash on disk. With pnpm, lodash is saved in a single place on disk, and hard links put it into each node_modules where it should be installed.
For example, whenever you want to install the dependencies from the package.json file, simply run pnpm i and it handles the rest by itself.
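For illustration, assuming the project already has a package.json (pnpm store path requires a reasonably recent pnpm):
npm i -g pnpm     # install pnpm globally
pnpm i            # install dependencies; packages are hard-linked from pnpm's store
pnpm store path   # shows where the single shared copy of each package lives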
UPDATE: The original answer is from 2014. I wouldn't recommend checking in node_modules, as there are definitely better options for speeding up the install, especially for a CI pipeline, e.g. npm ci --only=production.
You could also include your node_modules folder inside your repository (you are probably using git), and just run npm rebuild (which works cross-platform) in your build/deploy processes; it is pretty fast.
This would also give you the benefit of full control over all your dependencies (I know that's what shrinkwrap is usually meant for).
Edit:
Also, you can set the progress flag to false to increase your speed by at least 20%. This works only with npm@v3.x.x, and hopefully there will be fixes for that soon (see the second link).
npm set progress=false
Tweet about finding
Github Issue Cause identification
As a very modern solution, you can start to use Docker.
Docker allows you to virtualize and pre-define, as an image, the current state of your code, including installed npm modules and other goodies.
Once the docker image for your infrastructure/env is built locally, or retrieved from a remote repository, it will be stored on the host machine, and you can spin up a server in seconds.
Another benefit is that you use the same virtualized code infrastructure on any machine where you deploy your code.
Docker speeds up install/deployment processes and is a widely used technology.
To start using Docker, it is enough to do the following (all the snippets are just mock/example pre-setup and are by no means the most robust/elegant solution):
1) Install docker and docker-compose using the manuals and get some basic understanding of them at docker.com
2) Write a Dockerfile in the root of your application
# base image with node and npm preinstalled
FROM node:6.9.5
RUN mkdir /usr/local/app
WORKDIR /usr/local/app
# copy only package.json first so the npm install layer can be cached
COPY package.json package.json
RUN npm install
3) Create a docker-compose.yml in the root of your project with the following content:
version: "2"
services:
  server:
    hostname: server
    container_name: server
    image: server
    build: .
    command: sh -c 'NODE_ENV=development PORT=8080 node app.js'
    ports:
      - "8080:8080"
    volumes: # folders and files mounted under the /usr/local/app workdir so node app.js can find them
      - ${PWD}/server:/usr/local/app/server
      - ${PWD}/app.js:/usr/local/app/app.js
4) To start the server, run docker-compose up -d. To see the logs, run docker-compose logs -f server. If you restart your server, it will come up in seconds once the image has been built once.
Docker also caches build layers locally, so the next build will take only a few seconds.
I know this might be a bit of a heavyweight solution, but I am sure it has the most potential/flexibility and is widely used in the industry. And while it requires some learning for anyone who has not used Docker before, in my humble opinion, it is the best one for your problem.
Nothing helped me more than disabling the antivirus (Windows Defender in my case); I went from 2:30 down to 1 minute.
With the npm-cache package I got down to ~30 seconds.
I tried to use yarn, which is very fast, but it was randomly failing in my case.
We have been trying to solve this problem to speed up our deployments.
We have settled on using pac, which follows the principles in the other answers. It zips the npm modules and includes them in your repo so you don't have a million files in your commits and code reviews, and you can just unzip/rebuild for the target machine.
https://www.npmjs.com/package/pac
