I need to install Socket.io on a machine without internet access.
I've downloaded Node.js and Socket.IO on another box, but when I copy them over and try to install them on the isolated machine, Node.js installs fine, but Socket.IO insists on connecting to GitHub.
How can I install Socket.IO without an internet connection? Should I install all dependencies offline? If so, what are the dependencies of Socket.IO?
It turns out that npm supports package caching. Basically, you create a cache on a development machine that does have internet access, copy that cache onto your target machine along with your Node.js application, and then install the packages from the cache. I assume from your question that the target machine already has Node.js and npm installed.
Step 1. Use npm to create a cache directory on your development machine
First, create a cache directory and configure npm to use it. Then install each of your packages.
mkdir ../my-cache
npm config set cache ../my-cache
npm install --save async@0.9.0
npm install --save restify@2.8.3
etc.
If you look in the my-cache directory you'll see sub-directories for each installed package.
Step 2. Copy the cache to your target machine along with your node application
No rocket science here: make sure you copy the my-cache directory to your target machine.
Step 3. Use npm to install the packages from the cache
Configure npm to use the cache directory. Be aware that npm will still try to fetch the package files from the internet, and it will retry after a failure. I found a couple of recommendations for forcing npm to use the cache, but those options did not work. I did, however, find a way to significantly reduce the amount of time npm spends trying to fetch before falling back to the cache.
npm config set cache ../my-cache
npm config set fetch-retries 1
npm config set fetch-retry-maxtimeout 1
npm config set fetch-retry-mintimeout 1
npm install async@0.9.0
npm install restify@2.8.3
Be aware that you cannot just type npm install, because then npm will not use the cache. This is a bit of a pain. If you want a robust install, you can write a tiny Node.js app that parses out the dependencies and calls child_process.exec to install each one.
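As an illustration, here is a minimal sketch of such a script (the file name install-from-cache.js is my own choice; it simply repeats the per-package installs from Step 3 for everything listed in package.json):

// install-from-cache.js - hypothetical helper; run with: node install-from-cache.js
// Reads the dependencies from package.json and installs each one individually,
// so every package is resolved through the cache configured above.
var execSync = require('child_process').execSync;
var pkg = require('./package.json');
var deps = Object.assign({}, pkg.dependencies, pkg.devDependencies);

Object.keys(deps).forEach(function (name) {
  var spec = name + '@' + deps[name];
  console.log('installing ' + spec);
  // run "npm install name@version" for each dependency, showing npm's output
  execSync('npm install "' + spec + '"', { stdio: 'inherit' });
});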
(*) I should mention that there is a package called npm-cache (https://www.npmjs.com/package/npm-cache). In my case npm-cache did not suit my needs. But you may be able to make it work for you.
Related
When I perform a basic npm install in an application I am attempting to set up a dev environment for, corrupted packages end up being pulled from my Verdaccio proxy instance.
To Reproduce
Steps to reproduce the behavior:
1. Set up package.json for my project, including adding my dependencies, etc.
2. Set my local system to use my Verdaccio instance: npm set registry [[Verdaccio Server URL]]:4873
3. Perform npm install within the directory that contains the package.json.
Results
I get a ton of output similar to the following:
npm http fetch GET 200 [[Proxy NPM Site]]/@angular%2fplatform-browser-dynamic/-/platform-browser-dynamic-5.2.11.tgz 6430ms
npm WARN tarball tarball data for @angular/router@5.2.11 (sha512-NT8xYl7Vr3qPygisek3PlXqNROEjg48GXOEsDEc7c8lDBo3EB9Tf328fWJD0GbLtXZNhmmNNxwIe+qqPFFhFAA==) seems to be corrupted. Trying one more time.
npm WARN tarball tarball data for jquery@3.3.1 (sha512-Ubldcmxp5np52/ENotGxlLe6aGMvmF4R8S6tZjsP6Knsaxd/xp3Zrh50cG93lR6nPXyUFwzN3ZSOQI0wRJNdGg==) seems to be corrupted. Trying one more time.
npm WARN tarball tarball data for ng-bootstrap@1.6.3 (sha1-1B/UIVTAWTQiy4PEc6OCiqdSW/U=) seems to be corrupted. Trying one more time.
Note the URL-encoded name of the package, in this case @angular%2fplatform-browser-dynamic. I do not get this when I set my registry to https://registry.npmjs.org/.
Of course, what I would like to do is be able to perform an npm install just as if I were connected to the official registry.
Configuration and Log Files
verdaccio-log.txt
npm-verbose-log.txt
config.yaml.txt
Additional Information
NPM Version: latest (6.1.0), but it happens with older versions as well.
Node Version that Verdaccio is running on: 10.4.0
Node Version that Client is running on: 10.4.0
Environment: Windows Server 2012 (SP2)
The server that Verdaccio is on is not behind a proxy.
So after performing the following everything worked as intended:
rmdir /S /Q node_modules (or rm -rf node_modules on a *nix derivative OS)
del package-lock.json (or rm package-lock.json on a *nix derivative OS)
npm set registry [[My Verdaccio Instance's IP]]:4873
npm cache clean --force
npm install --force --verbose --no-bin-links
If you roll back to npm version 3, it will start working again. Publishing to Verdaccio with npm@3 and then pulling with npm@>=5 was causing this problem for me.
I solved my issue with the following steps:
remove package-lock.json
run npm cache clear --force
run npm cache verify
run npm install
This seems like a problem with your node/npm and not with Angular CLI.
I suggest you try to use
npm cache clear --force
npm install
Thanks.
Had to turn off Verdaccio's cache:
uplinks:
  npmjs:
    url: https://registry.npmjs.org/
    cache: false
Removing the package-lock.json solved my problem.
I have an internet connection at home and can install the latest version of TypeScript with this command: npm install -g typescript. Unfortunately, there is no internet at my workplace (in fact, we are not allowed to use the internet).
Besides that, I googled, but it seems there is no offline installer for TypeScript. My question is: how can I handle this problem?
I am totally new to npm, and a step-by-step workaround would be appreciated.
There is an ugly solution: run npm install at home and copy the contents of your globally installed packages folder to work.
If you want to be able to do npm install without access to the internet you will need to configure your own npm registry in your local network.
I've used Sinopia in the past when working offline. It works as a cache for npm, allowing you to work offline provided you installed the required packages while you still had an internet connection.
As per https://www.npmjs.com/package/sinopia#installation you can install and configure Sinopia with the following steps:
# installation and starting (application will create default
# config in config.yaml you can edit later)
$ npm install -g sinopia
$ sinopia
# npm configuration
$ npm set registry http://localhost:4873/
# if you use HTTPS, add an appropriate CA information
# ("null" means get CA list from OS)
$ npm set ca null
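For the TypeScript question above, the workflow would then look roughly like this (a sketch; typescript is simply the package asked about, and Sinopia can only serve offline what has been installed through it at least once while online):

# while you still have internet access, install through Sinopia so it caches the package
$ npm install -g typescript

# later, with no internet access, the same command is answered from Sinopia's cache
$ npm install -g typescript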
I'm working on an offline network and want to install angular-cli using npm.
I have a zip file of angular-cli and using the latest node and npm version.
I'm using the command: npm install ./angular-cli-master to install angular-cli from the folder.
But I keep getting an error telling me I don't have an internet connection (which is expected, since I'm offline).
So how can I install angular-cli while offline, using the zip I downloaded from GitHub?
Thanks for your help.
You simply copy the package and all its dependencies into your node_modules folder: inside the project for a local installation, or into the global folder (run npm config get prefix to see where it is located) for a global installation.
The behavior of npm install is to check for the dependencies and install them first. When it finds them neither installed nor available as local files, it tries to download them.
Since all of those steps fail (you don't have the dependency installed, it isn't available at the expected location, and it can't be downloaded), the installation fails.
You can find the dependency list in the package.json of each module, but since dependencies are themselves recursive, it can take a long time to get everything set up right by hand; npm handles the recursion for you.
For you, the easiest way would be to create a new folder on the connected PC, run npm install angular-cli inside it, zip the folder, and transfer it to the offline machine (see the sketch below).
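A rough sketch of that workflow (the folder name cli-offline is arbitrary, and how you zip and transfer the folder depends on what is available to you):

# on the connected PC
mkdir cli-offline
cd cli-offline
npm install angular-cli
# zip the whole cli-offline folder (including node_modules), move the archive
# to the offline machine, and unzip it there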
Jan 2016 - check out Addy Osmani's recommendations for offline installation of npm packages
May 2017 - as of npm 5, you can pass the --prefer-offline flag to npm install (see the example after this list)
yarn does this out of the box.
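For reference, the npm 5+ flag mentioned in the list above is used like this (a minimal example; --prefer-offline falls back to the network only for packages missing from the local cache, and recent npm versions also have a stricter --offline flag that makes no network requests at all):

npm install --prefer-offline
npm install --offline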
In 2019, I found that none of the recommended approaches were applicable to an "air-gapped" server with no internet access.
On Windows, using artillery.io as an example, the only solution I found was the following:
Install the package on a machine with internet access, e.g. a local dev machine: npm install -g artillery
Browse to C:\Users\{username}\npm
Zip up the \node_modules\artillery folder (e.g. artillery.7z)
Copy the zip and the files artillery and artillery.cmd (at the root of the npm folder) to the server
Paste the two files artillery and artillery.cmd into the root of the server's npm folder (C:\Users\{serverusername}\npm)
Extract the zip to C:\Users\{serverusername}\npm\node_modules
This is the complicated version for just one tool. If your local machine's npm folder is relatively light on tools, you could always just zip the whole npm folder and copy + extract it on the server.
I still think it's odd that npm insists on trying to connect to the registry even when using npm pack and npm install -g <tarfile>
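For completeness, that npm pack route looks roughly like this (a sketch: the tarball name depends on the version npm pack produces, and, as noted above, installing the tarball can still trigger registry lookups for its dependencies):

# on the connected machine
npm pack artillery
# this writes something like artillery-x.y.z.tgz to the current directory;
# copy that file to the offline machine, then:
npm install -g artillery-x.y.z.tgz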
Problem: I was in a similar situation where I couldn't install Express and all the other dependencies specified by package.json on my local (offline) machine using npm, due to the lack of internet connectivity.
Solution: I have a solution that works on Windows (not so sure about other platforms), through which I installed the Express framework with all the dependencies I required for my project, including cookie-parser, jade, morgan, etc.
Steps:
Install all the package(s) on a remote machine that has internet access.
In my case, I'm using Windows on both the remote and the local machines, and my requirement was to install Express on the local machine. So I ran the command below on my remote machine to install Express:
C:\Users>npm install -g express-generator
After installing Express, I created an app on my remote machine using:
C:\Users\Name\Desktop>express Project
C:\Users\Name\Desktop\Project>npm install -g    (to install all the other dependencies globally)
Now browse to the location where npm's global modules are stored; you can view the location with:
C:\Users>npm config get prefix
Generally, on Windows it's
C:\Users\{Username}\AppData\Roaming\
Simply copy the npm and npm-cache folders from your remote machine.
And place both copied folders (npm and npm-cache) into the same location on your local machine, that is:
C:\Users\{Username}\AppData\Roaming\
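As a quick sanity check (not part of the original steps), you can then list the globally installed packages on the local machine and confirm that the copied modules show up:

C:\Users>npm ls -g --depth=0

express-generator (and anything else that was installed globally on the remote machine) should appear in that list.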
The short answer: you can't. Most npm packages, such as @angular/cli, need other dependencies, and those need child dependencies, which get installed when you run npm install.
You can, however, install the cli when on the network and use it when offline.
You can find the npm install command documentation here: https://docs.npmjs.com/cli/install
I am not quite sure, and unfortunately I don't have the chance to test it myself right now, but I would try either unzipping the folder and removing the dot, like this:
npm install /angular-cli-master
(= installing a folder, not a zip file)
or just adding the zip file ending, like this:
npm install ./angular-cli-master.tgz
(= installing a zip file, not a folder; the file ending may be .zip or something else, though)
This was tested successfully with Node 18.x.x.
The following steps show how to install the http-server package.
On the online PC:
npm install -g http-server
After the install finishes, copy the http-server folder. (It is usually located at: C:\Users\[UserName]\AppData\Roaming\npm\node_modules)
On the offline PC:
Paste the http-server folder, e.g. to D:\http-server
npm install -g D:\http-server
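If the install worked, the command should now be usable without any network access, for example (port 8080 is just an arbitrary choice):

http-server -p 8080

This serves the current directory on port 8080.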
Online computer:
npm install -g offline-npm
Copy the npm module to the offline computer, and that's it!
I want to install tfx-cli on a machine which does not have access to Internet.
I have installed nodejs with node-v6.4.0-x64.msi.
Per the MSDN guide, after installing Node.js, install tfx-cli with the command below:
npm install -g tfx-cli
But the above requires an internet connection.
How can I install tfx-cli on a machine which has no internet access?
Since you are using Node.js v6.4.0, I'm afraid there is no way to install it without internet access.
Certainly, there are some ways to use npm for offline package installation, such as offline-npm. However, it's not suitable for your case:
npm >= v3.x bundled with node >= v5 has broken this project. preinstall script is since then called after requests to npm registry are made. This makes it impossible for offline-npm to start as a registry server.
You could also use local-npm for offline npm package installation; however, you would still need to install many other things from the internet, which doesn't make sense: if you have internet access, you could just install tfx-cli directly.
Recently, I started using Yeoman to create static site projects built with Jekyll. yo jekyllrb runs fine but, in terms of security, I'm concerned about the part that comes after it prints:
I'm all done. Running bower install & npm install for you to install the required dependencies. If this fails, try running the command yourself.
If I'm not connected to the internet, bower install still runs fine, but npm install gives the following error:
npm ERR! git fetch -a origin (git://github.com/gruntjs/grunt-contrib-watch.git) fatal: unable to connect to github.com:
npm ERR! git fetch -a origin (git://github.com/gruntjs/grunt-contrib-watch.git) github.com: Name or service not known
npm ERR! git fetch -a origin (git://github.com/dannygarcia/grunt-jekyll.git) fatal: unable to connect to github.com:
npm ERR! git fetch -a origin (git://github.com/dannygarcia/grunt-jekyll.git) github.com: Name or service not known
From what I understood by reading the responses to this question about sudo, npm, and chroot, running an npm install potentially executes code from the internet, and it is therefore recommended to prefix such a command with sudo in order to allow npm to downgrade privileges. Doing an npm config set unsafe-perm=false is supposed to force one to follow this recommendation. This, however, does not have any effect on npm install, I guess because it's a local install. That makes sense to me, since (especially in a dev environment) the code that is installed locally is most likely meant to be executed by one's own user.
What does not make sense to me is that, in the case of Yeoman, code has to be fetched from the web and executed as often as I start a new (simple) project, especially since npm does not check signatures. I imagine that a solution to this would be to disable npm for my user and copy the double-checked node_modules dir from another similar project. In the case of projects with more diverse needs, I would consider using a sandbox (perhaps with chroot).
What is the standard way of dealing with this issue? Does Yeoman provide any options that would allow one to work around this problem?
I think Bower is a little smarter about handling packages than npm: most of the time it will use a cached copy of a package, so if you have jQuery version X installed in one project and then use that same version in another, Bower doesn't need to connect to the internet to resolve that dependency; it will just pull from the cache instead. That being said, you still would have had to download that package in the first place in order to have the cached copy.
On the sudo thing, I've since learned that you shouldn't use sudo with package managers. Just going to quote from this answer here: "sudo npm install -g grunt-cli" gives me an error
According to the maintainer of npm, installing packages with sudo is considered bad practice because you are allowing that package to have complete control of your system, and you can't and SHOULDN'T trust these packages with root access. Think of Debian's long release cycles as an extreme example of protecting end users from community-maintained packages for this exact reason.
http://howtonode.org/introduction-to-npm
You should do what Isaacs suggests and chown your /usr/local folder so you have RW permissions.
So the issue you're having is that you don't want to run things with sudo. So don't. But if npm install throws an error, it may be that you need to chown -R your /usr/local directory, or it may be another issue. Node modules are installed per project, unless you install them globally with the -g flag, and even then you can have multiple projects with different versions of packages.
If you already have a project that uses version Y of grunt, yeoman, etc., you might just want to copy over the relevant modules from another project. This will still work. However, it's a lot simpler to do npm install <package, ...> --save or --save-dev to persist these dependencies in your package.json file, which makes cloning the project to another machine a lot easier (git clone <project> && npm i).
TLDR: Both Bower and npm install packages from the Internet, whether run under Yeoman or not; Bower is different in that sometimes it can make use of a cache.