Okay, so I realize I might be asking for some rather unusual behavior here.
My current company has nothing for me to do for a few weeks, so I would like to start a side project. The problem is, the firewall here is really strict, so I cannot download anything with Git or npm. I am also not allowed to make any requests to IT support, since I am not in my own company's office but in offices owned by a client (which applies said strict policy). In short, I am stuck with firewall and proxy policies that I cannot modify.
I can download a module's zipped archive through the browser and install it from there. However, it has multiple dependencies, which themselves have dependencies, and so forth. And since I cannot run npm install to retrieve those dependencies, I'm stuck.
I can see several possible ways to solve this dependency issue:
Make npm run all its requests and downloads through the browser, since browsers are allowed to access the network. I haven't found any option for that so far.
Download all required dependencies as tarballs and install each of them step by step. Since the number of dependencies could be huge, I am looking for a single bulk download of commonly used modules that I could fetch once.
Most solutions I find assume that I can run npm install normally, which my proxy doesn't allow.
I'd rather not spend days on Chrome's built-in game. Any ideas?
Maybe you can create the project somewhere else and then "import" it onto your local machine at the office:
npm init
edit package.json with your dependencies
npm install
put everything on a USB stick and copy it onto your local computer at work.
or
send an archive of the code via email and download the tarball from your email at the office.
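For example, the whole round trip might look like this (folder and archive names are just placeholders; note that natively compiled modules may still need a rebuild on the target machine):

# on the machine with internet access
mkdir my-project && cd my-project
npm init -y
# edit package.json to add your dependencies, then:
npm install
# bundle everything, including node_modules
cd .. && tar -czf my-project.tar.gz my-project

# at the office, after moving the archive over (USB stick or email)
tar -xzf my-project.tar.gz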
I have a Node.js API on a Windows computer without internet access, and I need to access data from a Microsoft SQL Server database. I have the module "mssql"; however, I need to use Windows authentication. I found online that this method requires the module "msnodesqlv8," but when I install the module on an internet-connected device and then move the files onto the non-internet device, the module won't run. The error I receive when I run the API indicates that the module needs to be rebuilt. Whenever I run "npm rebuild," the build attempts to install the dependencies - which, of course, is not possible without internet.
To try to circumvent this issue, I removed the dependency list prior to the rebuild, but the rebuild still snags trying to install "safe-buffer." I don't know where this install is getting triggered; when I search for "safe-buffer" in the project, no references to it are found. My assumption is that a dependency is trying to use it, though I have not skimmed through every dependency yet.
I also tried packing the "msnodesqlv8" module, then installing it from the .tgz file, but this produced the same errors.
One avenue I have not yet tried is packing the module on the internet device, then transferring the .tgz file.
Any recommendations would be awesome - I am blocked until I can solve this issue.
I was able to solve this problem by downloading a prebuild from the GitHub releases. I noticed in the stack trace of the rebuild that the process was trying to access a specific build from a link, so I moved it into the "prebuilds" folder where it was expected, and the module powered up on the next rebuild.
This process has to be replicated on every device, from workstations to servers, but it works. Just make sure you download the build that matches your system.
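As a rough sketch of that process (shown for a Unix-style shell; the archive name is a placeholder, since the exact file depends on your platform and Node ABI, and the expected folder layout is whatever the rebuild's stack trace reports):

# download the matching prebuild from the module's GitHub releases page,
# copy it to the offline machine, then place it where the rebuild looks for it:
mkdir -p node_modules/msnodesqlv8/prebuilds
cp msnodesqlv8-vX.Y.Z-node-vNN-win32-x64.tar.gz node_modules/msnodesqlv8/prebuilds/
npm rebuild msnodesqlv8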
I recently helped out on a project where I added a really small dependency - in fact, it only contained a regular expression (https://www.npmjs.com/package/is-unc-path).
The feedback I got from the developer of the project was that he tries to minimize third-party dependencies if they can be implemented easily - whereby he - if I understand it correctly - was asking me to just copy the code instead of adding another dependency.
To me, adding a new dependency looks just like putting some lines of code into an extra file in the repo. In addition, the developers get notified by an update if the code needs to change.
Is it just a religious conviction that drives a developer to do this? Or are there actual costs (performance- or space-wise, etc.) to adding a dependency?
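For context, inlining such a micro-package would amount to a handful of lines; a hypothetical version might look like this (the regex is paraphrased from memory, not copied verbatim from the package):

// util/is-unc-path.js - rough inline equivalent of the is-unc-path package
// matches UNC-style paths such as \\server\share or //server/share
function isUncPath(filepath) {
  return typeof filepath === 'string'
    && /^[\\\/]{2,}[^\\\/]+[\\\/]+[^\\\/]+/.test(filepath);
}

module.exports = isUncPath;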
I once had a dispute with my manager concerning third-party libraries; the problem was even greater because he came to believe that you should version the node_modules folder.
The source of any conflict is usually ignorance.
His arguments were:
you should deliver a working product to the client without requiring them to do extra work like running npm install
if GitHub or npm is down at the moment you run npm install on the server, what will you do?
if the library you install has a bug, who will be responsible?
My arguments were:
versioning node_modules is not going to work because of how package dependencies work: each library pulls in its own node_modules dependencies, and your Git repository will quickly grow to hundreds of MB. Deploys will become slower and slower; downloading half a GB of code every time takes time. npm uses a caching mechanism, so if nothing has changed it won't download code needlessly.
the problem with left-pad was painful, but after that npm implemented a locking system, and now every package is pinned in a lockfile to an exact version and integrity hash (see the lockfile snippet after this list).
And GitHub and npm are not single-instance services; they run in the cloud.
When installing a dependency you always have some criteria in mind, and there are community best practices; usually they boil down to: 1. Does the repo have unit tests? 2. How many downloads does it have? 3. When was it last updated?
The Node.js ecosystem is built on modularity; it is not that Node is popular because of luck, but because of how it was designed for creating and reusing modules. Sometimes working in the Node.js environment feels like putting Lego pieces together to build your toy. This is the main reason development in Node.js is so fast: people just reuse stuff.
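To illustrate the locking point above: a package-lock.json entry pins an exact version, the resolved tarball URL, and an integrity hash, so a later npm install reproduces the same tree (snippet abridged, newer lockfile format; the hash is a placeholder):

"node_modules/left-pad": {
  "version": "1.3.0",
  "resolved": "https://registry.npmjs.org/left-pad/-/left-pad-1.3.0.tgz",
  "integrity": "sha512-..."
}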
Finally he stayed on his own ideas, and I left the project :D.
In my case, I have one project with some packages and dependencies (Node.js). I use npm as the package manager.
My client created a web server VM for me without network access; there is just one rule on the firewall so that I can call my API.
I need some way for my project to work perfectly at my client's site, even though the VM there doesn't have network access.
I searched for this on Google, but every solution I found requires installing other dependencies, and I need something like copy-and-paste so that my application will run perfectly.
I saw some tools for this, but I'm not sure whether I need to install anything at the client's site to get my packages from node_modules and make my application work well.
I saw: npmbox, yarn, npm-offline.
If someone has used these and knows how to do that, please help me.
If there is another way to solve it, I would like to hear it.
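For reference, npmbox's typical flow (as I understand its documentation; exact command names and flags may differ between versions, so treat this as a sketch) looks like this:

# on a machine with internet access
npmbox express            # bundles express and all its dependencies into express.npmbox
# copy the .npmbox file to the offline VM, then:
npmunbox express.npmbox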
Is there a way to install meteor packages globally?
The goal would be to have once-installed packages available without an internet connection for projects created later, avoiding repetitive downloading, and whatever other benefits one may imagine.
Like in Node.js, where using the npm command (the Node Package Manager) with the -g flag, npm install -g, installs packages into a global directory; JavaScript programs then load them from there if available, as well as from the project's node_modules folder.
Meteor already downloads packages into a global repository that all your local apps benefit from.
So if you meteor add iron:router@1.0.7, it is downloaded and added to your project. The next time another project requires the same version, it is reused from that same spot.
Also, there is a PACKAGE_DIRS environment variable which, when set, allows you to keep your own local packages centrally so that you can share them among projects. In fact, you can keep that directory on a network drive (NFS) which your whole team can mount and consume centrally.
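A minimal sketch of that setup (the mount path is made up; note that newer Meteor releases read METEOR_PACKAGE_DIRS instead of PACKAGE_DIRS, so check which one your version expects):

# point Meteor at a shared directory of local packages
export PACKAGE_DIRS=/mnt/team-nfs/meteor-packages
# packages living there can now be added by name in any project
meteor add mycompany:mypackage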
Yet there is an inherent problem: Meteor's version resolver looks for updates unless you pin down your package dependency versions, which is exactly why Meteor seems so desperate to be connected.
Even if you pin your dependencies, the packages you depend on may not have pinned theirs (which apparently is the case for most packages), so Meteor keeps looking for updates to the whole package tree and downloads whatever it deems to satisfy the version constraint resolver.
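For what it's worth, pinning an exact version in Meteor uses the @= syntax in .meteor/packages (the version here is just the one from the example above):

iron:router@=1.0.7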
The good news is that they are constantly improving their tooling: fewer lookups, faster builds, better search, etc.
All in all, there is not much you can do unless Meteor provides some way of hosting an entire mirror of its package repository for you to consume offline. And I guess that is very unlikely to happen.
Meteor is a tool for the connected world and it does assume your connectivity. Heck, the whole journey begins with a curl https://install.meteor.com/ | sh
And yes, it would be great if we could hack away on a remote beach, or on the 12-hour flight to that beach.
Until then, happy coding online ;)
How should we deal with local packages that are a dependency in other local packages?
For simplicity's sake, say we have the following packages:
api - express application
people - a package to deal with people
data-access - a package that deals with data access
And then the dependencies are
api depends on people
people depends on data-access
Currently we have these dependencies set up as file dependencies.
I.e., the api package.json would have:
"dependencies": {
"people": "file:../people"
}
The trouble with this is that it's a PITA when we make updates to one package and want those changes in the other packages that depend on it.
The options we have thought of are:
npm install - but this won't overwrite previously installed packages if changes are made, so we have to delete the old one from the node_modules directory and re-run npm install... which can be niggly if the package dependency is deep.
npm link - we're not sold on the idea because it doesn't survive version control... Just thinking about it now, maybe we could have some kind of local build script that would run the npm link commands for us - that way it could survive version control (see the sketch after this list). Would that be a grunt job?
grunt - we haven't dived too deep into this one yet, but it feels like a good direction. A little bit of googling and we came across this: https://github.com/ahutchings/grunt-install-dependencies
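For what it's worth, such a link script could be as simple as this (a shell sketch, checked into the repo so it survives version control; it assumes you run it from a root folder containing the three package folders from the example):

#!/bin/sh
set -e
# register the leaf package globally, then wire it into its consumers
(cd data-access && npm link)
(cd people && npm link data-access && npm link)
(cd api && npm link people)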
So, what option would work best for our situation?
Are there other options that we haven't thought of yet?
Ps. we're a .NET shop doing a PoC in node, so assume we know nothing!
Pps. if you strongly believe we're setting up our project incorrectly and we shouldn't have smaller individual packages, let me know in the comments with a link to some reading on the subject.
So, I agree that going with 'many small packages' is usually a good idea. Check out 12factor.net if you haven't already.
That said, in specific answer to your question I'd say your best bet is to consider mainly how you want to maintain them.
If the 'subcomponents' are all just parts of your app (as, for example, data-access implies), then I'd keep them in the same folder structure, not map them in package.json at all, and just require them where you need them. In this case, everything versions together and is part of the same git repository.
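That is, instead of a package.json entry, something like this (the relative path is an assumption about your folder layout):

// inside api, pull in the sibling component directly
const people = require('../people');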
If you really want to or need to keep them all in separate git repositories, then you can do npm link, but to be honest I've found it more useful to just use the URL syntax in package.json:
"dependencies": {
  "people": "git://path.to.git:repo#version.number"
}
Then, when you want to explicitly update one of your dependencies, you just have to bump the version number in your package.json and run npm install again.
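For example (the org, repo, and tag names here are hypothetical; npm also accepts the git+ssh:// and git+https:// URL forms):

"dependencies": {
  "people": "git+ssh://git@github.com/yourorg/people.git#v1.0.8"
}

After the people repo tags a new version, you'd change the #v1.0.8 ref and re-run npm install.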