Meteor build with shrinkwrap - node.js

I am trying to build and deploy a Meteor project. My constraint is that I have to deploy the app on a remote computer of my local network without internet, so the npm install should not need to connect anywhere.
My idea is to use shrinkwrap and shrinkpack as described here:
https://guide.meteor.com/using-npm-packages.html#npm-shrinkpack
Then build as described here:
https://docs.meteor.com/commandline.html#meteorbuild
What I do on the dev box:
meteor npm shrinkwrap // this properly updates the npm-shrinkwrap.json
meteor npm shrinkpack // this properly populates the \node_shrinkwrap folder with the tars of the dependencies
meteor --production
//terminate meteor
meteor build ..\build // I cannot find the tars for the dependencies in the output
Then I copy the tarball to the remote computer:
tar -zxf coolMeteorAngular2App.tar.gz
cd bundle\programs\server
npm install // still connects to the internet to get the dependencies instead of getting them from the local bundle
node main.js
Does anyone know what I am doing wrong? How do I include the dependency tarballs in the build, and how do I tell npm install to use them?
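One workaround consistent with the answers below (a sketch only, and it sidesteps shrinkpack entirely): run the bundle's npm install while the dev box still has internet, then copy the already-installed bundle to the remote computer so it never needs to install anything. This assumes the dev box and the remote computer share the same OS and architecture, because the server bundle contains native modules.
# on the dev box, after meteor build, extract the tarball locally
tar -zxf coolMeteorAngular2App.tar.gz
cd bundle/programs/server
npm install                          # resolves everything while the network is still available
cd ../../..
tar -czf bundle-installed.tar.gz bundle
# transfer bundle-installed.tar.gz to the remote computer, then there:
tar -xzf bundle-installed.tar.gz
cd bundle
node main.js                         # no npm install needed offline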

Related

How to create an NPM environment for an offline build

I am trying to do an NPM build, install and bundle on a server which does not have a network connection
I can run the build successfully on an online server, and I can copy the directories which are required over to the offline server
How can I reproduce the NPM environment on the offline server so that an NPM build, install and bundle will be successful?
I assume I should be copying the node_modules and package-lock.json and running an npm install --offline.
I can run the build successfully on an online server, and I can copy the directories which are required over to the offline server
Knowing that, on a system that has access to the internet, run npm install to create and populate the node_modules directory with all the dependencies.
Copy over the entire application directory, including the node_modules directory, so that all dependencies are available on the offline server.
This is the approach shared in comments: How to install npm package while offline?
If the dependencies need to be updated, these steps would need to be repeated.
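A minimal command sketch of that approach, assuming a Unix-like server, a build script defined in package.json, and matching Node/npm versions on both machines (paths are illustrative):
# on the online server
cd /path/to/app
npm install                          # populates node_modules from package.json / package-lock.json
cd .. && tar -czf app-with-deps.tar.gz app   # includes node_modules and package-lock.json
# move app-with-deps.tar.gz to the offline server (scp, USB, etc.), then there:
tar -xzf app-with-deps.tar.gz
cd app
npm run build                        # assumes a "build" script; no further npm install should be needed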

How to "upload" my nodejs source code to production server?

I am not asking about deploying a Node.js app or how to deal with node_modules packages; I am asking about how to "upload" my own Node.js code to the production server.
So far I have tried 3 ways, each with its own pros & cons
git clone. To reduce what gets cloned I use git clone --depth 1 -b release_branch. But I still get files I do not need for deployment, e.g. besides .git I also get documents (because I keep documents in my git repo).
npm install. Use npm install git+https://gitusername:gitpassword@myserver/path/to/repo.git. With the proper package.json settings I get my source code only, which is what I want. But the problem is the directory structure. After running npm install, the directory looks like this:
.
├── node_modules
└── package-lock.json
My package ends up inside node_modules alongside its own dependencies.
npm pack. Then upload (scp in my case) the tgz file to the server, then npm install tgz_file.tgz --production. But it has the same problem as npm install git+https. It is probably still better than npm install git+https because I can control my releases.
So are there other (simple) ways to get a proper directory structure and my source code only?
BTW, I know about the "nodejs express app deploying to production" question, but their discussion is not the same as mine.
----- update -----
Now I am fairly sure npm install with a tarball has some bug, so I just tar xf and npm i. I believe this is the simplest solution.
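As a short sketch of that update (package name, version and server path are hypothetical):
# on the dev machine
npm pack                             # produces my-app-1.0.0.tgz
scp my-app-1.0.0.tgz user@server:/srv/app/
# on the server
cd /srv/app
tar -xzf my-app-1.0.0.tgz            # extracts to ./package with the original layout
cd package
npm install --production             # install runtime dependencies only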

Npm Install straight from package.json

I have a simple question that I can't seem to find the answer to. I cloned a git repo to my local machine.
When attempting to start node, I receive an error because I don't have the required npm dependencies installed. However, they are listed in the package.json file that was cloned.
I was wondering if there is a simple way to install the dependencies listed in that file, without having to run npm install for every individual package.
Within the directory containing the package.json file, just run npm install. It will read package.json and install all the dependencies. If you want to limit it to non-dev dependencies only, use npm install --only=production.
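A minimal illustration, assuming the clone lives in ./my-clone and has an index.js entry point (both names are hypothetical):
cd my-clone                          # the directory that contains package.json
npm install                          # installs everything under dependencies and devDependencies
npm install --only=production        # alternative: skip devDependencies
node index.js                        # the missing-module errors should now be gone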

How to install npm package while offline?

I'm working on an offline network and want to install angular-cli using npm.
I have a zip file of angular-cli and using the latest node and npm version.
I'm using the command: npm install ./angular-cli-master to install angular-cli from the folder.
But I keep getting this error telling me I don't have an internet connection (which is ok).
So how can I install this angular-cli while offline using the zip I downloaded from Github?
Thanks for your help.
You simply copy the package and all its dependencies into your node_modules folder: inside the project for a local installation, or in the global folder (npm config get prefix to see where it is located) for a global installation.
The behavior of npm install is to check for the dependencies, and install them first. When it doesn't find them installed, nor the local file containing them, it tries to download them.
Since all of those steps fail (you don't have the dependency installed, it isn't available on the expected location, and it can't download it), the installation fails.
You can find the dependency list in the package.json of each module, but since it is recursive, it can take a long time to get everything set up correctly if you do it manually; npm does it recursively for you.
For you, the easiest way would be to create a new folder on the connected PC, and inside it npm install angular-cli, zip the folder and transfer it on the offline machine.
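A sketch of that transfer, assuming a Unix-like connected PC and a throwaway folder name:
# on the connected PC
mkdir cli-offline && cd cli-offline
npm install angular-cli              # pulls angular-cli and its whole dependency tree into node_modules
cd .. && zip -r cli-offline.zip cli-offline
# move cli-offline.zip to the offline machine, then there:
unzip cli-offline.zip
./cli-offline/node_modules/.bin/ng --help   # run the CLI straight from the transferred node_modules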
Jan 2016 - check out Addy Osmani's recommendations for offline installation of npm packages
May 2017 - as of npm 5, you can pass the --prefer-offline flag to npm install
yarn does this out of the box.
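For illustration of those options (the yarn one assumes an offline mirror or warm cache is already in place):
npm install --prefer-offline         # npm 5+: use the local cache first, network only as a fallback
npm install --offline                # npm: never touch the network; fails if something is not cached
yarn install --offline               # yarn classic: installs from its offline mirror / cache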
In 2019, I found that none of the recommended approaches were applicable to an "air gapped" server with no internet access.
On Windows, using artillery.io as an example, the only solution I found was to:
Install the package on a machine with internet access, e.g. the local dev machine: npm install -g artillery
Browse to C:\Users\{username}\npm
Zip up \node_modules\artillery (e.g. artillery.7z)
Copy the zip and the files artillery and artillery.cmd (at the root of the npm folder) to the server
Paste the two files artillery and artillery.cmd at the root of the server's npm folder (C:\Users\{serverusername}\npm)
Extract the zip to C:\Users\{serverusername}\npm\node_modules
This is the complicated version for just one tool. If your local machine's npm folder is relatively light on tools, you could always just zip the whole npm folder and copy + extract it on the server.
I still think it's odd that npm insists on trying to connect to the registry even when using npm pack and npm install -g <tarfile>
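The same steps as a rough command sketch (cmd syntax; the 7-Zip CLI and the npm folder location are taken from the steps above, usernames are placeholders):
:: on the machine with internet access
npm install -g artillery
cd C:\Users\%USERNAME%\npm
7z a artillery.7z node_modules\artillery
:: copy artillery.7z plus the artillery and artillery.cmd shims to the server, then on the server:
cd C:\Users\%USERNAME%\npm
7z x artillery.7z
:: this recreates node_modules\artillery next to the pasted artillery / artillery.cmd shims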
Problem: I'd been in a similar situation where I couldn't install express.js and all the other dependencies specified by package.json on my local machine (offline) using npm, due to the lack of internet connectivity.
Solution: I have a solution that works on Windows (not so sure about other platforms) through which I installed the Express framework with all the dependencies I required for my project, including cookie-parser, jade, morgan etc.
Steps :
Install all the package(s) on a remote machine which has internet access.
In my case I'm using Windows on both the remote and the local machine, and my requirement was to install express.js on the local machine. So I ran the command below on my remote machine to install express.js:
C:\Users>npm install -g express-generator
After installation of express.js I created an app on my remote machine using:
C:\Users\Name\Desktop>express Project
C:\Users\Name\Desktop\Project>npm install -g // to install all other dependencies globally
Now browse to the location where npm's global modules are stored; you can view the location with:
C:\Users>npm config get prefix
Generally on Windows it is
C:\Users\{Username}\AppData\Roaming\
Simply copy the npm and npm-cache folders from your remote machine.
Place both copied folders, npm and npm-cache, in the same location on your local machine, that is
C:\Users\{Username}\AppData\Roaming\
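As a rough sketch of those copy steps (cmd syntax; %APPDATA% expands to C:\Users\{Username}\AppData\Roaming, and D:\transfer is a hypothetical removable drive):
:: on the remote (online) machine
npm install -g express-generator
npm config get prefix
xcopy /E /I "%APPDATA%\npm" D:\transfer\npm
xcopy /E /I "%APPDATA%\npm-cache" D:\transfer\npm-cache
:: on the offline local machine, put both folders back in the same place
xcopy /E /I D:\transfer\npm "%APPDATA%\npm"
xcopy /E /I D:\transfer\npm-cache "%APPDATA%\npm-cache"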
The short answer: you can't. Most npm packages such as @angular/cli need other dependencies, and those need child dependencies, which get installed when you run npm install.
You can, however, install the cli when on the network and use it when offline.
You can find the npm install command documentation here: https://docs.npmjs.com/cli/install
I am not quite sure, and unfortunately I do not have the chance to test it myself right now, but I would try to either unzip the folder and remove the dot, like this:
npm install /angular-cli-master
(= installing a folder not a zip file)
or just add the zip file ending like that:
npm install ./angular-cli-master.tgz
(= installing a zip-file not a folder, file ending may be .zip or something else, though)
This was tested successfully with Node 18.x.x.
The following steps show how to install the http-server package.
On Online PC:
npm install -g http-server
After the install finishes, copy the http-server folder. (Usually located at: C:\Users\[UserName]\AppData\Roaming\npm\node_modules)
On offline PC:
Paste the http-server folder, e.g. D:\http-server
npm install -g D:\http-server
Online computer:
npm install -g offline-npm
copy the npm module to the offline computer and that's it!

npm installs different package on server vs localhost

I'm running npm on my local environment. If I run npm install on my local environment, it will install a few packages into node_modules/ and everything works fine.
I then pull the latest changes on the remote server and try to run gulp, and get some errors. So I run the following:
rm -rf node_modules
npm install
gulp
and everything works correctly. But of course a whole bunch of items in node_modules have changed. So, I do a git push from the server, and then pull locally. But now my local build will not run gulp and I need to repeat the above process (remove node_modules, npm install).
Basically, it seems that npm install installs slightly different packages on my local environment vs server environment, despite the fact that they are both Ubuntu 14.04. The nodejs version for both server and local is also the same at v5.3.0.
As suggested, we don't check in node_modules. So we run npm install and gulp on the server.
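One hedged way to make both environments install identical dependency trees (not from this thread, just npm's standard mechanism from that era): generate a shrinkwrap file on the environment that works, commit it, and let the other environment install from it.
# on the local environment, once node_modules is in the state that works
npm shrinkwrap                       # writes npm-shrinkwrap.json pinning exact versions
git add npm-shrinkwrap.json && git commit -m "pin npm dependency tree" && git push
# on the server
git pull
rm -rf node_modules
npm install                          # now resolves versions from npm-shrinkwrap.json
gulp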
