I'm developing on a Windows machine using a Linux Docker image. I'm using node-sass, which has native bindings stored under node_modules/node-sass/vendor.
Every time I add or update a package using yarn, I have to enter the Docker container and manually re-run npm rebuild node-sass in order to reinstate the linux-x64 binding.
Is there a way to tell yarn not to delete files in this folder?
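For context, the manual workaround described above amounts to roughly this (my-app-container is a placeholder for the actual container name):

    docker exec -it my-app-container sh
    # then, inside the container, from the project directory:
    npm rebuild node-sass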
I'm trying to run a Node application on AWS Linux 2 on Elastic Beanstalk and need to install the dependencies using yarn. (My Node app causes errors if you try to use npm to install dependencies instead of yarn.)
I've already figured out how to set up a script in .platform/hooks/prebuild/ to get it to run yarn, but even though it's running the yarn installation, it still also tries to run npm install, which errors out, causing my deploy to fail.
So I need to figure out how to prevent the default npm install step from running.
(Does anyone know what file that command is run from in the AWS Linux 2 setup process? I was wondering if I could just add another script in .platform/hooks/prebuild/ that would modify that file to prevent the call to npm.)
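For reference, a prebuild hook like the one described is just a small executable shell script, roughly like this (the file name is only an example; on Amazon Linux 2, Elastic Beanstalk stages the app under /var/app/staging during deployment):

    #!/bin/bash
    # .platform/hooks/prebuild/01_yarn.sh -- make it executable (chmod +x)
    cd /var/app/staging
    # install yarn if the platform image doesn't already provide it
    npm install -g yarn
    yarn install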
Yes, you can avoid npm install.
When you deploy a node_modules directory to an Amazon Linux 2 Node.js platform version, Elastic Beanstalk assumes that you're providing your own dependency packages, and avoids installing dependencies specified in a package.json file.
Source: AWS Elastic Beanstalk documentation.
On my local machine, npm run build works just fine. In the Docker container launched via Jenkins, I get errors like
Cannot find module: 'file-saver'. Make sure this package is installed.
You can install this package by running: npm install file-saver.
and
FAIL src/core/App.test.js
● Test suite failed to run
Cannot find module 'surface-nets' from 'vtext.js'
On my local machine, I have cleared the npm cache (npm cache clean -f), removed node_modules/, and reinstalled (npm i). I have even used npm update and the npm-check-updates package to update everything, and I installed all peer dependencies. My local copy should be as wiped clean as the copy inside Docker. In the Jenkinsfile, I added npm list, and it shows surface-nets, file-saver, and all my other packages. I also added ls node_modules/ and can see the package folders are there. I have reduced my Dockerfile down to just one line: FROM node:current.
Why is it saying "Cannot find module" when the modules are installed?
The root cause of this issue is still unknown, but I did find out that the repository was corrupted. I ended up wiping the whole remote repo and git init-ing a new one, then pushing to that. I assume that corruption contributed to the issue at hand here.
I have a Jenkins server that watches a private Git repository for changes and then triggers a pipeline script (the repository contains a Node.js app). In this pipeline script I need to do the following steps:
Install dependencies (npm install)
Build my application (npm run build, which creates a dist folder)
Build a Docker image (docker build) and run the container (which runs a script in the dist folder)
Which of the following two options would be the recommended way to do this, and why?
Option A: Run npm install and npm run build in the Jenkins pipeline and copy the dist folder into the Docker image during docker build. This would allow me to install only runtime dependencies in the Docker image using npm install --only=production, significantly reducing the image size.
Option B: Run npm install and npm run build during docker build (in the Dockerfile). This would allow me to run the Docker container outside the CI server if I have to (I don't have a use case for it now, but it seems cleaner because it is more independent). However, the image size would increase significantly, and I am not sure whether this is the recommended way.
Any suggestions?
I would choose option B.
The reason is that some npm packages run node-gyp, gcc, and other platform-dependent builds.
Look at the popular bcrypt package as an example.
Going with option A would mean that your Docker image and your Jenkins machine would need the same infrastructure for such builds, which is not common, to say the least.
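A minimal sketch of what option B could look like in the Dockerfile (the base image, paths, and the dist/index.js entry point are assumptions to adjust to your project):

    FROM node:current
    WORKDIR /app
    # install all dependencies, including the ones needed for the build
    COPY package.json package-lock.json ./
    RUN npm install
    # copy the sources and build (creates the dist folder)
    COPY . .
    RUN npm run build
    # run the app from the dist folder
    CMD ["node", "dist/index.js"]

If the image size becomes a problem, a multi-stage build (build in one stage, then copy only dist and production dependencies into a slim runtime stage) recovers most of option A's size advantage while keeping the build inside Docker.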
I'm developing some Node.js applications on a Mac. For testing purposes I'm using Parallels to get a virtual Windows machine (Windows 10).
If I run npm install for a project on my Mac, I can't run the project on the Windows machine, as I get an access-denied error for the node_modules folder.
So I deleted the folder and ran npm install on the Windows machine. With that I can run the app, but then on my Mac I get, e.g., sh: /Users/project/node_modules/.bin/nodemon: Permission denied.
What can I do to set the correct access rights on the node_modules directory to get the app running on both operating systems?
That is because the binaries (in .bin) compiled on macOS will not necessarily work on Windows.
For your scenario, use Yarn, because Yarn offers an offline install.
Reference link: https://yarnpkg.com/blog/2016/11/24/offline-mirror/
1. In either of your systems, install Yarn: npm install --global yarn
2. Inside your project folder, in a cmd or a terminal, just type yarn. Yarn will start resolving your packages.
3. Once done, create a .yarnrc file by executing the following commands (same for both Windows and macOS):
       yarn config set yarn-offline-mirror ./npm-packages-offline-cache
       yarn config set yarn-offline-mirror-pruning true
   A .yarnrc file will be created in your home directory (macOS: ~/.yarnrc, Windows: C:\Users\<username>\.yarnrc).
4. Move that file into your project to make it specific only to your project (see the example contents after these steps).
5. Now do a yarn install. This results in a node_modules folder and a yarn.lock file. Also note that all the dependencies are saved in tarball format in the npm-packages-offline-cache folder (created alongside the .yarnrc).
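For reference, after the two config commands in step 3, the .yarnrc should contain roughly the following:

    yarn-offline-mirror "./npm-packages-offline-cache"
    yarn-offline-mirror-pruning true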
All you have to do is commit this tarball directory and the yarn.lock file to a repository shared by both environments, and set up Yarn in the other environment by repeating the same steps 1-5.
Finally, run yarn install --offline and the dependencies will be loaded from the offline mirror.
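Put together, the flow in the second environment is roughly this (the repository URL is a placeholder):

    git clone https://example.com/your-repo.git
    cd your-repo
    yarn install --offline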
Long story short: you can't just copy-paste node_modules between Windows and Unix and expect it to work.
Hope it helps you.
I'm very new to Node Package Manager and also Vue, and I'm trying to understand what exactly is going on with using the Vue CLI.
The vue.js website has this as instructions for running the official Vue CLI:
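(The referenced instructions are roughly the vue-cli quick-start commands below; my-project is just a placeholder name:)

    npm install --global vue-cli
    vue init webpack my-project
    cd my-project
    npm install
    npm run dev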
I have a few questions about this:
Does npm install --global vue-cli need to be executed only once on a machine, or once per directory, or once per new project you're starting? In other words, once it's on your computer, is that the last time you need to run that command, or do you need to execute it for every new project you start?
Once a new project is initiated, are local copies of the newest version of vue (and vue-router, if selected) installed?
If I finish this project and want to deploy it, how do I then port this over to a production server?
Once per machine, except for the rare cases where you're isolating your npm installation (such as with nodeenv or inside a container); that's what the --global option is for.
After running npm install, yes.
Running npm run build and copying the contents of the resulting dist directory to the production machine (often within a /var/www directory or similar). This can be automated further in many ways.
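A minimal sketch of that manual flow (the server address and target directory are placeholders):

    npm run build
    rsync -avz dist/ deploy@your-server:/var/www/my-app/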