I was working on a Node module where I have to check whether the user has installed a specific package (that's the logic I have to follow) and, if not, install it using child_process.exec.
That's where the problem appears: even though my script installed the package (node_modules contains it, and so does package.json), if I call require.resolve again, it says the module does not exist.
From what I found (in Node GitHub issues), Node has a sort of "cache": on the first call, if the package is not installed, the cache entry for that package is set to a value that is not its path. Then, when you install the package and try again, that stale cache entry is used.
Is there any other way to check whether packages are installed, or is my local Node installation broken?
Take a look at node.js require() cache - possible to invalidate?
It's possible to clear the require cache with something like this:
delete require.cache[require.resolve('./module.js')]
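A small sketch built on that line, if you want to force a fresh copy of a module after it has changed on disk (the './module.js' path is just a placeholder):
// Drop the cached copy so the next require() re-reads the file from disk.
function uncache(name) {
  delete require.cache[require.resolve(name)];
}

uncache('./module.js');
const fresh = require('./module.js'); // loaded again instead of served from the cache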
One way to check whether a package is installed is to try to require it. The code below detects when a package is not installed; just add your install logic after the now install the missing package comment.
try {
require('non-existent-package');
} catch (error) {
// return early if the error is not what we're looking for
if (error.code !== 'MODULE_NOT_FOUND') {
throw error;
}
// now install the missing package
}
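If you want something concrete for that comment, one option (assuming npm is on the PATH, and using child_process as in the question) is a synchronous install:
// now install the missing package
require('child_process').execSync('npm install non-existent-package', { stdio: 'inherit' });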
Let me know if this works.
Is there a library, or is it perhaps built into npm itself, to manage packages and install them from within a script? I'm writing a process that checks whether a local package exists and installs it if it doesn't. Then I'd like to be able to dynamically require it in the same process.
This is definitely possible, but probably inadvisable.
I found the npm module npm-programmatic, which lets you run npm install from code. Once you have that, all you need to do is wrap your require in a try/catch so that you can handle the case where a require fails.
const npm = require('npm-programmatic');

let myPackage;
try {
  myPackage = require('my-package');
} catch (err) {
  // Not installed yet: install it, then require it once the install finishes.
  npm.install(['my-package']).then(function () {
    myPackage = require('my-package');
    console.log(myPackage);
  });
}
The biggest problem you're probably going to face is that the script will need to run with more than standard privileges. You will likely need to run it with sudo, which is very inadvisable.
No, I don't think there is a way to accomplish your aim, and in fact I think it is a bad idea.
Under Node.js we always use package.json to manage all dependencies, and when we want to deploy them we only have to run
$ npm install
which is very easy and effective. With your approach we would need to write a replacement for the require function, for example require2, which checks whether the module exists every time it loads one; that is not efficient, I think.
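For illustration, such a require2 might look like the sketch below (a hypothetical helper; the synchronous npm call is an assumption). It is exactly the extra work argued against above:
const { execSync } = require('child_process');

// Hypothetical require2: check for the module and install it if missing before loading.
function require2(name) {
  try {
    return require(name);
  } catch (err) {
    if (err.code !== 'MODULE_NOT_FOUND') throw err;
    execSync('npm install ' + name, { stdio: 'inherit' });
    return require(name);
  }
}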
My code is:
http://pastebin.com/rCy4wSUK
As soon as this function is called by the router, it prints
"done copying contents of clean base into temp" and then the error, which is here:
http://pastebin.com/UxEu4PaS
So at least it is not giving an error while copying, but what is causing it to throw this error?
Sounds like the fs.extra module has not been installed completely and is missing a dependency. Your code runs fine for me with a fresh npm install fs.extra q.
Remove your node_modules folder and re-run npm install (if your dependencies are listed in package.json) or npm install fs.extra q (if they are not).
I've seen this issue on case-insensitive filesystems in projects that require different versions of walk, which depend on forEachAsync#2.x and foreachasync#3.x. Because the casing, but not the name, of foreachasync changed across versions, it seems to confuse npm into not properly installing the right versions.
I was able to fix this in our project by explicitly depending on foreachasync#^3.0.0.
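In package.json, that explicit dependency would look something like this (version range taken from the line above):
"dependencies": {
  "foreachasync": "^3.0.0"
}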
I am attempting to install localForage into a node.js application (with Angular) and Browserify.
Here is a link to the localForage documentation
It appears that using localForage and angular-localForage causes a problem with Browserify based around the use of require.
If I require the localforage.js file in the src file I get the following error:
Warning: module "promise" not found from "/Users/mgayhart/Sites/epson- receipts/bower_components/localforage/src/localforage.js" Use --force to continue.
If I require the localforage.js file in the dist file, I get the following error:
Warning: module "./drivers/indexeddb" not found from "/Users/mgayhart/Sites/epson- receipts/bower_components/localforage/dist/localforage.js" Use --force to continue.
Anyone know of a workaround to be able to move forward with these libraries?
There is an issue on GitHub about this problem: https://github.com/ocombe/angular-localForage/issues/26
I'll be working on it soon; you can subscribe to the GitHub notifications on this issue to know when it's fixed!
For me, installing it through Bower and using it with browserify-shim worked. So in package.json:
"browser": {
"localforage":"./src/lib/vendor/localforage/dist/localforage.min.js"
},
"browserify-shim": {
"localforage":"localforage"
}
And to expose it as an Angular service (if you don't want to use angular-localForage):
app.factory "localforage", -> require 'localforage'
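The same factory in plain JavaScript (assuming app is your Angular module) would be roughly:
app.factory('localforage', function () {
  return require('localforage');
});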
I've just been having this issue myself tonight, but I think I found a fix.
Instead of trying to get the Bower modules to work with Browserify, why not just use npm like it's made for?
npm install localforage
and then when you use require you don't have to give it the path.
But it still didn't work for me until I copied the folder:
localforage/src/drivers TO localforage/dist/drivers
Then it found all the dependencies and worked like a champ!
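For reference, with the package installed from npm the require is simply (standard package name assumed):
var localforage = require('localforage');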
Alternatively, if you must use Bower, you could try the debowerify transform with gulp:
https://github.com/eugeneware/debowerify
I don't know if I've worded the question properly, so I apologize if it isn't clear from the title what I mean.
Say I have an NPM package which installs an executable. Presumably I want users to install this package with the -g flag so that they can run it whenever.
In this case, when I call require() from within the executable, it will look for packages installed globally.
But suppose this package provides generic functionality for Node projects. I might want to know which packages the current project has installed locally. Should I just assume:
path.join(process.cwd(), 'node_modules')
Or is there a more "correct" way to set the NODE_PATH in this case? (Or rather than set NODE_PATH, should I just require(absolute_path_to_file)?)
require will not only look up the package inside the current working directory's node_modules but also inside the node_modules of its parent, grandparent, and so on. So you can use the resolve package from npm to solve this problem.
FILE: your_global_command.js
// npm install resolve
var resolve = require('resolve').sync;
// Lookup for local module at current working dir
function require_cwd(name) {
var absolute_path = resolve(name, { basedir: process.cwd() });
return require(absolute_path);
}
// Load local express
// this will throw an error when express is not found as local module
var express = require_cwd('express');
I also created a package that requires a package from the current working directory (instead of the module's __dirname):
https://npmjs.org/package/require-cwd
Other than grabbing the package.json file at the project root, is there a way to determine the list of dependencies of a running Node.js application? Does Node keep this meta information available as some variable in the global namespace?
If you are just looking for the currently installed npm packages in the application directory, you can install the npm package (npm install -g npm) and programmatically invoke ls to list the installed packages and their dependency trees.
Obviously, this has no bearing on whether the installed packages are actually require'd in the application or not.
Usage is not that well documented but this should get you started.
var npm = require('npm');
npm.load(function(err, npm) {
npm.commands.ls([], true, function(err, data, lite) {
console.log(data); //or lite for simplified output
});
});
e.g.:
{ dependencies:
{ npm: { version: '1.1.18', dependencies: [Object] },
request: { version: '2.9.202' } } }
Otherwise, I believe the only other option is to introspect the module module to get information about the currently loaded/cached module paths. However, this definitely does not look like it was developed as a public API, and I'm not sure if there are any alternatives, so I would be keen to hear if there are. For example:
var req = require('request'); // require some module for demo purposes
var m = require('module');
// properties of m contain current loaded module info, e.g. m._cache
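For example, to dump the resolved paths of everything loaded so far (this relies on the internal _cache field, so treat it as a sketch):
// Every key in m._cache is the absolute path of a module that has been loaded.
Object.keys(m._cache).forEach(function (modulePath) {
  console.log(modulePath);
});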
I believe you could use require-analyzer, which sort of works according to Isaacs (it could miss some). You can hear this in NodeUp's first podcast from 11:55.
Or you could try James's node-detective, which will probably find your dependencies better (but not by running code), though it can still miss some because of JavaScript's dynamic nature (12:46).
detective
Find all calls to require() no matter how crazily nested using a
proper walk of the AST.
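A minimal sketch of using detective to scan a file's source for require() calls (here it scans its own source):
var detective = require('detective');
var fs = require('fs');

var src = fs.readFileSync(__filename, 'utf8');
var requires = detective(src); // e.g. [ 'detective', 'fs' ]
console.log(requires);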
P.S.: to expose those package.json variables to Node.js you could use node-pkginfo.